November 2 2024 • Episode 021

Sunita Verma - H&M - Mindset Over Metrics: Using Experimentation To Develop A Data-Driven Culture

“Starting small and getting quick wins has helped us gain more resources and funding. We were able to expand our toolset and scale further. Once we were able to establish small use cases for experimentation, business teams gained confidence. We then had a bigger voice and were able to impact more significant strategic decisions.”


Sunita Verma is currently Test & Learn Specialist (East Asia) at global fashion mega brand H&M. She is an experimentation and growth professional with 5+ years of experience in fostering a culture of experimentation within leading organisations. Her work at Zalora and FairPrice Group has been pivotal in advancing data-driven decision-making and promoting a more agile approach to business operations.

At Zalora, an online fashion retailer, Sunita integrated experimentation techniques to enhance user experience and optimise operational processes. Her focus on data-driven strategies and testing methodologies provided valuable insights into user behaviour and preferences.

At FairPrice Group, one of Singapore’s largest supermarket chains, Sunita drove the adoption of an agile operational model. Her efforts in data analytics and iterative testing enabled FairPrice Group to make informed decisions more swiftly and respond effectively to evolving customer needs.

Sunita is passionate about leveraging experimentation and agile practices to drive continuous improvement and innovation within organisations. Her professional journey reflects a commitment to enhancing organisational effectiveness and adaptability through thoughtful application of these principles.

 

 

Get the transcript

Episode 021 - Sunita Verma - H&M - Mindset Over Metrics: Using Experimentation To Develop A Data-Driven Culture

Gavin Bryant 00:04

Hello and welcome to the Experimentation Masters Podcast. Today, I would like to welcome Sunita Verma to the show. Sunita is currently working as the Test & Learn Specialist (East Asia) at global fashion mega brand H&M. In her role, she is responsible for building and scaling the experimentation program across six Asian markets. Prior to H&M, Sunita worked as Manager, Product Growth and Experimentation at FairPrice Group, and was also Senior Executive, Growth Experimentation at Zalora Group. Welcome to the show, Sunita.


Sunita Verma 00:43

Thank you so much, Gavin. What a nice introduction.

 

Gavin Bryant 00:47

I'm really excited for our conversation today, Sunita. For our audience and our listeners, Sunita has just taken on the most amazing role, starting at H&M two to three months ago, so I'm interested to dive in a little to learn more about her early few weeks at H&M. Sunita is also one of our experimentation and growth leaders who will be presenting at the upcoming 2024 Asia Pacific Experimentation Summit, so we're interested to learn a little bit more about the focus of Sunita's presentation for the summit. But let's start by giving our audience a little bit of an overview and background about you and your experience, Sunita.

 

Sunita Verma  01:36

Yes, for sure, I think you did a really great job already. I've been in the CRO space for approximately three years now, but that wasn't really where my career or my education started. I started out with a software engineering, information systems sort of background, and then I moved into the coding and analytics space for around two years, and that's where I thought, at least when I graduated, okay, this is where I'm going to be. This is what I'm going to do. But while working in this space, something felt missing in the role: that closeness to the customer, to customer data and customer behavior. That was when I got a wonderful opportunity at Zalora, and I guess since then, being in that CRO and experimentation space for the first time, it was so interesting that I've never looked back. All of my later roles, at FairPrice Group, as you've mentioned, and at H&M currently, follow the same thread. And I've always been trying to explore something different. With FairPrice Group, I was really trying a different industry, seeing how grocery and supermarket retail works and how it works together with CRO. And now with H&M, it's a new, wonderful challenge: how do we work in a huge MNC that has markets all over the world and a global office based in Sweden? How does that entire collaboration work? Again, another space that is just a nice challenge. I don't know what's next, though.

 

Gavin Bryant 03:30

You mentioned closeness to customer. What are you enjoying so much about your roles in experimentation and the closeness to customer?

 

Sunita Verma 03:43

I think sometimes it's really just those moments where you find yourself looking into certain dashboards or certain data points, and you kind of get a little bit lost and absorbed in it, and you spend an hour in there. When you come out of it, you feel even more energized. You don't feel tired out. You don't feel like it was grinding work. And that was what I liked about it. It was, I guess, that very interesting psychology factor within the work itself. 

 

Gavin Bryant 04:17

So the customer behaviors and the behavioral psychology component of it are what you're really passionate about and enjoy?

 

Sunita Verma 04:27

Yes, for sure, I think that is something that has at least kept me continuing in this role.

 

Gavin Bryant 04:39

Okay, one of the things that I'm really excited to learn more about and talk to you about is your new role at H&M. So talk us through your last two to three months. The role at H&M, it's a new function, new capability in the business. It is one of the world's largest, most well-known brands. Where did you start, and what has been your focus and your strategy and plan over the last two to three months at H&M?

 

Sunita Verma 05:10

Yes, I mean, again, it has been such an honor being in such a huge organization. It is the first time I'm working in a true MNC; my previous companies weren't really of that scale. So what I was very unsure about was how it is going to work when there are so many different stakeholders to align with, especially now that you have global counterparts in place, but also several markets within our East Asia region as well. The first thing we looked at was to really understand how global is functioning today and to build that relationship. We do have our global counterparts, and we definitely set up sessions where we talk to each other on a personal level as well as, of course, on the business level, just to keep that safe space where we can share any doubts or uncertainties between the regions. Beyond that, it is to really see, okay, they do have an experimentation framework and a strategy today, but how relevant is it for East Asia? Would it work if we really just copied and pasted the framework that they are working with into the region, simply like that? After looking through the frameworks, we realized that may not be the best approach. So what we've been working on in the last few months is: how can we change it? How can we adapt the process that they have and make it relevant for the East Asia region? The reason we need to adapt it is really because we have colleagues who are from Asian backgrounds and in very different environments from our European counterparts, for example. So how do we make sure we match their way of working, their styles of communication and also their comfort levels? Testing is something absolutely new to the region, or at least doing it in a structured way is. And how can we make it something where we can get buy-in very early on from our own region's top management as well? Those few things were top of mind and top of action for us in the last few weeks.

 

Gavin Bryant 07:55

When we were chatting offline before we started our conversation, you mentioned you're just at the point now of starting to test some of those early use cases, which will be really exciting. One of the things that I've noticed working in the Asia Pacific region, and we were also talking about this, is that each of the areas and countries are really their own local economies and markets; they all have their own customs, principles and idiosyncrasies. So each country has to be catered for somewhat individually, rather than taking a broad-brush approach.

 

Sunita Verma 08:38

That's right, and at the start there will always be these sorts of questions: if we test it in one market, it may not be usable in another market, so should we even run this test? Is it something that's worth doing? However, right now, since it's such an early stage of the process, our main focus is, let's get something running, let's get something out there, and then we can fine-tune the process. We can make it personalized and we can make it better, but we can only do that once we've got a foundational base going. But you're absolutely right, the different markets that we've got have, like I said, their own personalities. We've seen very different problems within each market and very different benefits as well. So I'm interested to see where experimentation heads, especially with all of these nuances within the different markets.

 

Gavin Bryant 09:46

One of the things I thought was very smart and very clever is what you're doing in testing some of the more broad-based assumptions that exist within the organization. We know that those early tests and quick wins on the board are so, so important, so I thought that was a really smart strategy, to start off the program by testing some strongly held beliefs and assumptions. I'm really excited to hear where the program heads over the coming months and, no doubt, all the successes that you have.

 

Sunita Verma 10:27

You know what? I hold the same excitement. I hope everything goes great, honestly, but yes, thank you. 

 

Gavin Bryant 10:36

Let's just shift the focus a little bit, moving on from H&M. You've worked in three different programs now, H&M being the newest of those. From your experience, what's required in the earlier stages of a program to really get it moving and give it a nice kick-start?

 

Sunita Verma 10:59

First and foremost, and this is felt in literally every organization that I've been with so far in a CRO capacity, is buy-in from the bigger bosses. That is an absolute must. If that doesn't exist, if higher management doesn't see value in testing, in wanting to be proven wrong, in wanting to see that they hold assumptions as well, that the organization holds assumptions about things, about their customers, then it's going to be a really tough journey, and one that I don't know should even be started if we don't have that buy-in. But I think, more importantly, it's also bosses who value bad news over having rosy pictures painted for them. A lot of times we do see people picking out analyses or metrics in a very specific way so that it gives a very nice story and a very nice narrative. Those are the sort of organizations or departments where we know they're not ready for this sort of experimentation program. Apart from that, we just need the basic tools to enable A/B testing if there's already buy-in, and, as we mentioned before, to really start developing some confidence in this program, to start getting those first wins, or losses, which can be framed as losses avoided, and start building a case to eventually get greater resources and funding. But buy-in from the big bosses is a big one for me.

 

Gavin Bryant 12:57

So thinking of the big bosses in the organizations that you've worked in previously, did you notice a shift in their mindset and attitude over time? That maybe at the start there were sacred cows that they were protecting, and over time they became more humble and more open to being proven wrong and having their ideas tested?

 

Sunita Verma 13:24

For sure. I guess I didn't see that in my immediate big bosses, because the reason they even started this team was because they really believed that they were wrong at times. They were willing to be proven wrong, and they were really open to the process from the start. But we have seen a shift in the teams that we've worked with, external teams who were possibly not really working in an agile style to start off. They might have been very used to a BAU approach: we'll look at these few metrics, and we'll do one of these few actions to combat what we are seeing, and that's it, it stops there. Customer behavior and those sorts of things were not really on their mind when they were working. These teams are the ones we've worked with a lot, just because we saw so much potential, but at the same time it was a very stretched-out relationship that we had to build over time as well. But it truly paid off, at least in my eyes, because eventually we did see even the heads of those departments starting off their questions with, what does the data say? Or, can we look at what this customer data point is telling us? That was something that we always said, but now they are the ones saying it. And it felt great. It felt amazing.

 

Gavin Bryant 15:07

That's a fantastic example. Thinking about the three programs that you've worked with at FairPrice, Zalora and now at H&M, what would be your top three learnings from your time in the experimentation game working across those three programs to date?

 

Sunita Verma 15:32

I think first and foremost, and something that I've alluded to a little earlier, is relationship building. Starting out, I didn't realize how much of a key role it would play, but I've learned that relationship building is a key part of the job, arguably one of the most important parts of the job as well. I would also say that what I've learned, from what I've seen, is that testing for the sake of testing is very common, and it needs to be one of the first things to tackle when starting out an experimentation program, once you've got buy-in from bosses. I don't know if it's just something that I'm observing in APAC, since that's where I've been working all this while, but that's where a lot of cherry-picking of results and metrics does happen. It's much more common, even with an established experimentation program. And beyond that, in the same line of thought, it is really easy for bias to creep into test design and into the results as well. We've seen many examples of it. And it is really because there seems to be a lot of personal attachment to the work that is done, and a lot of old knowledge and gut feel, because you've been doing this for a long time. It's a very natural thing, but also something that we realize we really need to pay a lot of attention to.

 

Gavin Bryant 17:27

Now that's an interesting point, that experimentation can be used as a validation tool rather than a learning tool. From your experience, what are the best ways to get around that, to ensure that you're experimenting on those opportunities that are high impact and high value for the organisation, rather than using experimentation as a tool to confirm existing beliefs?

 

Sunita Verma 17:59

The easiest way, I would say, or at least what we have done, is to very much be the one vetting the process throughout, especially at the start. As someone external coming into the company, it might be a little bit daunting to tell, I don't know, product managers who have been working with their product for so long, hey, I'm questioning this, this, this and this in your hypothesis, in the way you're thinking about this. However, it's also really important to constantly do that in a way where you frame yourself as just trying to make sure you are helping them achieve their goal at the end of the day, and our very first point, relationship building, plays a hugely important role in this. It's much easier to do when I'm friends with the product manager, when I'm friends with the stakeholder. So it all comes together, but at the very base of it, just be very present in the process and don't be afraid to point things out. Just do it in a very strategic-communication sort of way.

 

Gavin Bryant 19:22

Yes, I think that's a great point that you make there, that part of the role of an experimentation professional is not to accept the status quo all the time; it is to challenge it in a constructive and non-threatening way. And you made a really good point there where you suggest that your role is to support other cross-functional teams to be more successful and effective in their role, which I thought was a really good way of framing it.

 

Sunita Verma 19:57

Cool, yes, and just to add on a little bit. It's not all the time that your suggestions, or the things you point out, will be accepted, even if you try your best to frame them in a very correct way, or in a very approachable and acceptable way. But I think the key beyond that is to just keep going at this process with that same heart and not be too disheartened by it. It's not personal, it's just work.

 

Gavin Bryant 20:33

Exactly, we're all driving for the same thing, and that's customer value and organisational growth, isn't it?

 

Sunita Verma 20:41

That's right, just whether it takes a shorter time or a longer time.

 

Gavin Bryant 20:49

Let's shift our discussion a little onto more of a practical note. The focus for your presentation at the APAC Experimentation Summit is around a theme of mindset over metrics. And I can tell already in our discussion today that relationships are really important to you. And I just wanted to get your perspective and your inspiration around the theme for your presentation, mindset over metrics.

 

Sunita Verma 21:22

I think that inspiration came about just because, in the companies I've been in, data has more or less been a driving force for the company itself. There's already extensive data being collected across the business, across the point of sale, across the customer journey and their behavior. But a lot of times it does feel like it ends there. Beyond that, there may be a whole series of issues that cause these data points to be turned into, I don't know, periodic reports rather than actionable insights. Or it could even be the case that those who turn the data into insights don't have much of their insights turned into action, just because there's a lack of motivation from other teams to prioritize them. Or, actually even worse, when data is cherry-picked, like we've talked about before: data is present in abundance, but it is cherry-picked to frame a nice picture, which is extremely easy to do. All of these devalue the data greatly, and yet all of these are common things we still see within companies, at least where I am today. So it is evident to me that there is a gap somewhere in this process. The data is there and it has a lot of potential; however, there is that chasm, that space where people may not be prepared to really be honest with it and be open with it. Hence that focus on framing the mindset before getting down to the metrics and the data.

 

Gavin Bryant 23:18

Let me just probe a little … one of the points that you mentioned there was around insights to action, and that even when we can generate high-quality insights, sometimes they may not eventuate into business actions. One of the things that you highlighted there was maybe motivation from other teams. Are there any other instances or scenarios you can think of over your journey so far where insights may not reach action? Are there any other blockers or factors that people need to consider there?

 

Sunita Verma 23:59

Typically, it really depends on the kind of team that you're working with, and maybe even their history of how they have always been working, what is important to them and their priorities. Some teams may really think your idea is a great idea; however, the way they function is that they get directions from another business team and they follow them. So it's really not the data analyst's fault for not coming up with great insights, but something else amiss within the ways of working that prevents it from happening. And it becomes an area that the CRO team and the experimentation team really have to look at: how do we unblock this? How do we make sure that this team does not just receive instructions from someone and deliver them? How can we change that?

 

Gavin Bryant 25:07

Yes, that's a really interesting point: the role of experimentation is not only to perform experiments and provide technical expertise around experimentation design and execution; there is also a very strong component around change management and stakeholder relationships, supporting and driving the change that you want to see through the organisation.

 

Sunita Verma 25:35

For sure, hence why it's just a very fun, dynamic role.

 

Gavin Bryant 25:42

There was an interesting realization earlier in my career, when I had the opportunity to build and scale a program from scratch. After I'd worked through that experience, I came to the conclusion that it was probably 50% managing change and stakeholder engagement, and 50% performing experiments. I've probably shifted my perspective on that over the years, and it now feels more like 70% communicating, influencing and change management, and maybe 30% performing and executing experiments, which seems like a better balance of the scales.

 

Sunita Verma 26:24

I feel you. I agree with you, yes. Coming into this role as someone super new a couple of years back, my belief on that has shifted similarly to yours.

 

Gavin Bryant 26:40

Now let's talk about a data-driven culture. You were suggesting earlier that business leaders, and particularly teams and team managers, are shifting their focus and actively seeking out experimentation results and experimentation data, using data as a first port of call, as a default, to inform decisions. What are some of the other characteristics that you've seen in the organisations you've worked in that really start to embrace and make a shift to being more data driven?

 

Sunita Verma 27:20

I think one of them is pretty basic: investing more in data and in the cleaning and structuring of it. It sounds very basic; however, there are places or instances where the data is really littered with issues, and we've seen cases like that, and it usually just stops you from being able to test anything. There's literally no point if you can't trust your insights, if you can't trust the data that is backing up your hypothesis, and you can't trust the results of your tests eventually. Everything just stops in the pipeline if issues like that keep cropping up. But assuming that there is this sort of investment in having proper people look after the data and ensure data issues don't crop up, beyond that there will really need to be data knowledge and understanding among colleagues across all departments and at different levels.

Over at FairPrice Group, it was also really great. In all the organizations I've been in so far, that has been something that's top of mind for the leaders, where there are data training sessions and investment in platforms, drag-and-drop sort of data platforms, so that you can visualize data and answer your curious questions more easily if you can't literally type out an SQL query, which is really difficult for someone who doesn't have experience in that. So these efforts into making data accessible, not just in terms of literal access, but making it easy for people as well, and making it something that's not scary. That also has a little bit of a culture play to it, because you also have to make sure that the people hired into these departments are naturally curious and naturally looking to improve, otherwise the data is not going to be used anyway. So that's, I guess, a base requirement.

Beyond that, I think it goes back to a previous point: the organizational direction of being open, honest and customer driven. It has to be top down, and people will have to really see leaders walk the talk on all of these values. It can be in terms of their focuses, or even the kind of questions they ask in meetings, the sort of things they say, the actions they take. They really have to live and breathe these values. Otherwise, if they just say it and don't act on it, all employees can tell that no one believes in this, and they won't want to be open with data or be honest; they'll just play it safe and share what leaders want to hear. So I feel like once all these three points are secured in an organization, then building up a data-driven culture is only natural, it will happen for sure, but these are the foundational basics.

 

Gavin Bryant 31:10

Yes, I think that was a really good summary of the key pillars of a data-driven culture. One of the things that I wanted to pull out particularly was those foundational data pipes, systems and platforms; they are the ground floor, the foundation of your experimentation efforts. And to your point, if in the first instance the data within the organization is not trustworthy and reliable, it will forever come into question, which makes the job really, really hard.

 

Sunita Verma 31:46

Yes, so do kind of fix that before trying to set out on a CRO program. But yes.

 

Gavin Bryant 31:55

Good point. So thinking about injecting experimentation insights into business decision making, product decision making and product development processes, what's been your experience there and what are some of the successes that you've seen in that space?

 

Sunita Verma 32:23

I think, alluding to one of the examples shared before, we've worked extensively with teams outside of our own team, and it takes a long time, but we do see success in changing mindsets eventually. I think there were a few keys behind that. One is repetition. It sounds so lame, repetition, but testing and these sorts of optimizations will never really be part of a company's DNA just because there is an experimentation team or an experimentation tool. So it's really about making sure that your team, or a representative from the CRO team, is continuously there in the forums or the meetings that matter: ones where reviews of product features are being discussed, where product managers may have those big discussions, or meetings where they are talking about, say, a quarterly review and looking at how revenue has been shifting between different channels. It might not be a specific product. These sorts of conversations honestly span every department possible; the opportunities are endless. So it's about focusing on which ones would be the lower-hanging fruit, the ones where they would be a bit more receptive. In our experience so far, we have tried targeting the retail e-commerce team and the product managers, just because these folks have been a little bit more technical. They already have some access to data insights and are a bit more familiar with data as well, so that was something we used to kick-start our own team and build that confidence in other stakeholders. But really, repetition is key. Just keep being in these meetings, keep bringing up, oh, let's test this, let's test that, even if only 10% of the ideas are followed up on.

Apart from that, I think starting small, for sure. Getting those small wins has personally helped us gain more resources and funding. That was how we were able to get certain tools and scale further as well. Once we were able to build these small use cases, and once people see these small use cases, they gain that confidence and will let your voice matter a little more in bigger and bigger decisions. So, no choice, we'll have to start small, and it's a great place to start injecting experimentation and that flavor into some of the decision making in the company.

And I think, in the same thread, one key thing to even start injecting experimentation into the DNA of the company is patience, really. Change takes time, and I guess to some it can be very frustrating. It can be very difficult to accept that people are not open to this. Like, why? I'm trying to do things to make things better for you. But at the end of the day, I think it comes down to really understanding the stakeholder you are working with and why they might need more time to accept this, and to also step out of your own shoes. When I started off, I was very much in my own world, I think. After talking to a lot of stakeholders, you realize why they may need more time and why this is an actual adjustment for them to make. So being a bit more patient and a bit more understanding of people will really help in terms of injecting your own opinions and ways of working, like experimentation, into their decision-making process.
But of course, with everything I mentioned before, I don't believe we can do the same thing for all teams. Different teams are at varying degrees of openness and acceptance; some of them may already be testing things out on their own, maybe just not in a structured way, and some teams may not even be looking at data today. So we'll have to evaluate where each team stands and build a strategy or an approach for it. Those three things are usually the main things I think about before deciding, okay, how am I going to approach this team or that team, or which do I approach first?

 

Gavin Bryant 37:52

Yes, that's something that I've done myself previously: devise a personalized, individualized communication strategy for all key stakeholders and teams, to really understand the who, what, when, where, why and how, and have a really targeted engagement plan with those key stakeholders. That helps ensure the communication and engagement process is really structured, but also targeted to individual needs and requirements. One of the things that I really loved that you mentioned before was that just because an experimentation program exists, or an experimentation capability is present, it doesn't necessitate that it will succeed. And I really love that metaphor of going to the gym: it's about the reps, doing lots and lots of sets and reps, and keeping on doing them over time. With communications as well, it's about being really proactive in demanding a seat at the table and injecting yourself into the relevant business forums, review sessions and meetings, to increase the presence of the program and to add value to those review and decision-making processes. So yes, I think a really smart strategy.

 

Sunita Verma 39:28

Yes, and that being said, it's also not a guarantee that you will know what forums and processes are in place. A lot of times there's no transparency on it, or little transparency on it. People don't usually think, oh yes, let me include you in everything that we've got. So it really becomes the onus of that CRO person to reach out in different ways to different people and figure out how they are structured, what their ways of working currently are, what forums are in place, what cadences are in place, and see where they can fit themselves in.

 

Gavin Bryant 40:10

Yes, and being very much value and benefits led: how can you help and support them to be better and more successful with their work? Let's round out with one final question around our mindset over metrics focus. What would be your key pieces of advice for organizations that are maybe just getting started on this journey, or maybe at a lower level of maturity? What are your key tips?

 

Sunita Verma 40:48

Key tips, I guess, are to really evaluate the organization and the leaders within the organization. If needed, really just look them up, look up how they are on social media, their presence, everything, just to really get a feel for the individuals who are leading the company, or leading the organization, or even leading the teams. Because, like I said, a very, very basic thing that we'll need in a CRO program is buy-in from those higher up, the bigger bosses, and without that I see it as next to impossible to start such a program. That being said, I do also recognize that a lot of times a CRO consultant or an experimentation team is hired with the expectation to change or shape the culture of the company, to be experiment-led, or to be agile when it wasn't agile before. But sometimes it's really not realistic to do that at all, or to approach it in that way, just because a consultant or experimentation expert literally has no authority or influence to change the company's culture. It starts from the top, and the consultant is not at the top. So this goes back to my first point earlier, buy-in from higher-ups. Otherwise this experimentation-led culture is just not going to happen, well, unless the CRO consultant really has that much influence and power, which is highly unlikely.

 

Gavin Bryant 42:51

Yes, that's a really good point. Hey Sunita, I'm going to put you in the hot seat now, and we're going to close out with our fast four questions. So four fun, quick questions to finish out our conversation today. Number one, what's your biggest lesson learned from experimenting?

 

Sunita Verma 43:12

I would say, most of the time, everyone is really trying to do their best, or what they think is the best. No one is trying to hinder you on purpose. So if someone disagrees with you, figure out why. Don't take it personally, don't take it as, oh, they hate me. Figure out why. There's probably a good reason that would help open up your own perspective as well.

 

Gavin Bryant 43:37

Yes, you made a good point earlier about empathizing with your users. So try to understand their motivations, their desires and their needs. Put yourself in their shoes. Walk a mile in their shoes.

 

Sunita Verma 43:49

Exactly.

 

Gavin Bryant 43:51

Number two, what's a common misconception people have about experimentation?

 

Sunita Verma 43:58

So I've heard this from a few folks: that running experiments will grow the company. I'm dead against that statement, just because running experiments will not, by itself, bring in more revenue. Of course, everything behind that, having an open mind that is willing to listen to the data, take in the feedback and change or improve our ways of thinking, is what makes the organization grow. Experiments are really just a medium, a risk mitigation tool. Everything else depends on the people and their mindsets, essentially.

 

Gavin Bryant 44:39

That's a good point. So experimentation, it's not the magic genie in the bottle if it's not used in the correct way, or if it's misused. 

 

Sunita Verma 44:50

That's right. It's not the growth hacker. 

 

Gavin Bryant 44:55

Number three, what's a strongly held belief around experimentation that you've since changed your mind on?

 

Sunita Verma 45:03

I think we alluded to this earlier, but I really did believe that running an experimentation program was really about hard skills, like understanding of statistics, experiment design, experiment analysis, those sort of things. And that was what was initially on the job description as well when I first started out. But yes, I mean, the reality of it, as we both felt, was that a significant part of our time and of our brain work goes into how do we approach this situation, how do we approach this stakeholder, and how do we make sure they feel they can be comfortable with testing and experimenting and be a part of the process.

 

Gavin Bryant 45:52

Finally, what's the one key takeaway our audience should take from our discussion today? 

 

Sunita Verma 46:01

I think the entire summary of it would be that there has essentially been a great shift in ways of working in the last decade alone, where data capabilities in most companies are getting better and better, really quickly. However, the shift in mindset is always the one lagging behind, and understandably so. People's mindsets are the hardest thing to change; technology is much easier. So to make the most out of this wonderful data that we have today, the one key thing is to really start making that difficult push to open our minds. A lot of times it's very difficult, because it means proving ourselves incorrect, proving ourselves wrong, and it also means speaking up against something, of course in a politically correct way, but still speaking against something. Not to say that I'm great at it or have excelled in it, but I bring it up because I've grown up in a very Asian culture where it's really ingrained in you that there is a certain way of doing things, and we don't question or talk back to those older than us or higher up than us. I think it's a very cultural thing, so it will definitely be harder to move away from this mindset, hence why we should focus our energies on this a lot more.

 

Gavin Bryant 47:37

What a fantastic way to end our discussion today. Sunita, if people want to reach out to you and get in contact, where's the best place to find you?

 

Sunita Verma 47:47

LinkedIn. LinkedIn is the absolute best place. You can find me at Sunita Verma.

 

Gavin Bryant 47:53

Perfect. Thank you so much for your time today, Sunita. For those who are interested in heading along to the 2024 APAC Experimentation Summit on November 28, visit www.apecexperimentation.com, and hopefully we'll see you there. Thanks Sunita. Chat soon.

 

Sunita Verma  48:16

Thank you so much, Gavin, thank you.

 

“ When I was starting out on my experimentation journey, I didn’t realise how much of a key role relationship building would play. Over time, I’ve learned that relationship building with key stakeholders is a key part of the experimentation role, arguably one of the most important parts of the job.”


Highlights

  • The H&M Experimentation Program based out of Singapore is responsible for Test & Learn activities across six Asian markets. The experimentation program in Asia plugs into the global H&M experimentation program headquartered in Sweden

  • Managing a Global Experimentation Program - 1). Understand global strategy, systems and processes 2). Establish and build personal relationships with global counterparts 3). Share doubts, fears and uncertainties 4). Get clarity on global experimentation frameworks and strategies 5). Determine relevancy of global processes and frameworks for local markets in East Asia 6). Adapt processes and systems for East Asian markets

  • Copying and pasting global frameworks and models into Asian markets is typically a flawed strategy. All Asian countries have unique customs, cultures, ways of working and communication styles that need to be considered individually

  • Often experimentation learnings and insights from one Asian market do not have generalizability across all Asian markets. Each different Asian market is highly nuanced with unique Customer Needs, Problems and Benefits in each respective market

  • The focus of the H&M Asia experimentation team is to start small, making a fast start - establish a foundational base of core experimentation capabilities and start performing experiments, adjusting and fine tuning processes over time

  • Strategy for producing early experimentation program Quick Wins - test contentious, strongly held beliefs and assumptions from H&M Global in local markets

  • Top-Down Executive support is critical - if Executives don’t see value in testing their assumptions and being open to being proven wrong, it’s going to be a tough journey for experimentation to succeed in the organisation

  • How do you know when a business unit may be ready for experimentation? When leaders value bad news over a positive story. Teams that are constantly cherry-picking data to produce a good news story, or positive narrative, may not be ready for experimentation as they don’t value the truth

  • When starting an experimentation program relationship building is critical. Relationship building, stakeholder engagement, influencing and organisational change are often underestimated elements of a successful experimentation program

  • Your primary job as an experimentation professional is to help other people be more successful in their role (i.e., Product Managers). Be present in the product development and release process. Don’t be afraid to ask questions and challenge the status quo. Contribute in a strategic manner

  • Before thinking about starting an experimentation program ensure that the organisation has reliable and trustworthy data. Data must be easily accessible, clean and well-structured. Layering an experimentation program on top of poor-quality data foundations will lead to untrustworthy and unreliable experimentation insights. Increase organisational data literacy through capability building and training sessions

  • Hire right - new hires need to be naturally curious and inquisitive around data, with a strong bias for continuous improvement. Leaders need to “walk the walk and talk the talk” and be open, honest and customer-driven. Nothing undermines an experimentation culture faster than hypocritical leadership behaviours

  • Inject the Experimentation Team into as many business forums as possible to share insights, assist discussions and support decision-making - Product Reviews, Quarterly Meetings, Revenue Discussions, Town Halls etc. Repetition with business teams and stakeholders is key. As stakeholders gain more confidence in experimentation, you increase the reach and strength of your voice, impacting bigger, more strategic business decisions

  • Think about the Value - Investment Flywheel - start small, generate benefits and develop proof points to increase investment and resourcing in the experimentation program. The virtuous cycle continues. The more business and customer value the experimentation program delivers over time, the more resources and funding you receive. Rinse and repeat

  • To meaningfully affect organisational culture change, Top-Down and Bottom-Up support is required. Experimentation Consultants or an Experimentation Team are often hired with the expectation of changing or shaping the culture of the company. In isolation this is very unrealistic and difficult to do. Consultants in particular have little agency or authority to influence a company's culture at large. Cultural change must be elicited Top-Down

  • Shifting organisational mindsets is always lagging and difficult to do. People’s mindsets are the hardest thing to change

In this episode we discuss:

  • How to interface with a global experimentation program

  • Why global experimentation strategies need to be adapted for local markets

  • Gaining quick wins by testing contentious beliefs

  • Relationship building is the most critical part of experimentation

  • Experimentation is a tough journey without Top-Down support

  • How to identify nascent teams for experimentation onboarding

  • Your job is to support other teams to be more successful and effective

  • Hiring for curiosity and inquisitiveness around data

  • Building an experimentation program on solid data foundations

  • Increasing experimentation reach to impact more strategic decisions

  • Why behavioural change must be modelled by senior leaders

 

Episode Resources

Sunita Verma - LinkedIn

Success starts now.

Beat The Odds newsletter is jam packed with 100% practical lessons, strategies, and tips from world-leading experts in Experimentation, Innovation and Product Design.

So, join the hundreds of people who Beat The Odds every month with our help.

Spread the word


Review the show

 
 
 
 

Connect with Gavin