January 27, 2022 • Episode 004
Jonny Longden - Scaling Experimentation at Sky
“You’re putting the scientific method at the heart of how you make decisions, using data and research to come up with hypotheses, and finding ways to validate your hypotheses with real customers, so you can understand how things really work, instead of how you think they work.”
Jonny Longden is Conversion Director at Journey Further, a performance marketing agency that works with the world’s leading brands, including Liberty London, Virgin Money, and Lick Home.
Based in the United Kingdom, Journey Further helps businesses become data-driven and build experimentation into their programs.
During his 15-year career of improving websites with strategy, experimentation, and data, he built and led the conversion team at Sky, and has worked with the likes of Nike, Visa, O2, Nokia, Principal Hotels and Manchester United.
Episode 004 - Jonny Longden - Scaling Experimentation at Sky
Gavin Bryant
Hello, and welcome to the Experimentation Masters Podcast. Today I would like to welcome Jonny Longden to the show. Jonny is Conversion Director at Journey Further, a performance marketing agency that works with some of the world's leading brands. In this episode, we're going to discuss how Jonny implemented and scaled experimentation at Sky. Welcome to the show, Jonny.
Jonny Longden
Thank you very much. Hello! Thank you for having me on.
Gavin Bryant
Okay, let's get started. Jonny, you've been working around experimentation for over a decade now. How did you get started?
Jonny Longden
Yeah, so before I got into experimentation, I was in what people today would call data science, but back then it was just called analytics. I was never really a data scientist, I was more of a consultant, working with CRM, database marketing, and things like that, helping companies build propensity modelling and segmentation. Then I had a sense that digital was the way things were going to go, which obviously it was, and I wanted to transition into that world. So I got a job with a digital agency, a web dev and design all-round agency, initially in a sort of data planning role, and I very quickly morphed that into more of an experimentation role. This was about 14 years ago, right around the time Google Website Optimizer first launched, which coincided with me starting in that role. And because of my background, a lot of what I'd done before was things like experimentation in direct marketing campaigns.
So it was instantly just a way to bridge the two worlds, and to bring the learning I had from my previous career into a digital environment. I was instantly hooked, and started really pursuing ways to do more of that. I've always had an analytics edge to my career as well, so I've always done digital analytics alongside this, as well as strategy and all that sort of stuff. So, that's how it came about.
Gavin Bryant
Okay, so you have a really strong foundation in analytics and data, which is effectively the cornerstone of good experimentation practice.
Jonny Longden
Yeah, exactly. I think the other thing as well is that at the time, I noticed there were quite a lot of people just starting to do digital analytics who didn't actually have any experience in analytics. You had agencies and companies with people in them using Google Analytics, and the proto versions of it around at the time, and they would generally come from a planning background, or they'd be account managers or something like that. So I saw a bit of a gap in the market: if you had some actual experience of analytics, and coupled that with digital analytics, that was a fairly rare thing at the time. It still is, in a way, because one of the issues is that Google Analytics and tools like it sort of do the analytics for you, in that they just present you with all these reports. But the critical thinking that goes with analytics is very important, and it's something that is trained through doing proper analytics. There is still a bit of a gap there, where what people call analytics is not what analytics really is.
Gavin Bryant
So thinking about your guiding principles, or your experimentation thesis, how would you describe that?
Jonny Longden
Yeah, I very often say to people that after 14 years of running tests on websites, it would be really easy for me to go around saying, 'I know what you should do to your website, I've got this list of best practices, I'm an expert in all this.' And it's simply not true. I have no idea. Because really, the only true thing you can say after 14 years of running tests is that you just have to test everything. You can't really second-guess what's going to work on websites. You can to an extent, but not in the grand scheme of things. No matter how much of a no-brainer something seems, no matter how obvious it seems to you, there's a very good chance it's not going to work. So you have to be humble. You must realize that our opinions and our rational ways of thinking about what you should do to your site or your business are completely, inherently flawed, and the only way to really learn is to test pretty much everything. That is really the starting point I have behind it. And any company that we're working with, or that asks for advice, that is what I would say to them: if you're not testing, there's a really, really good chance that 90% of what you're doing is a complete and utter waste of time and money, or worse, is actually having a detrimental impact on your business.
And I guess, really, stepping outside of that on a wider level: what we do has become complicated by the terminology that people use, the way it's described, and the way it's perceived. It gets seen as a channel alongside PPC and things like that. At the end of the day, when you forget about all that, what we're actually doing is putting the scientific method at the heart of how you make decisions around your business. It's about using research and data to come up with hypotheses, and finding ways to validate those hypotheses in real situations with real customers, so that you can understand how things really work instead of how you think they work. That, at the end of the day, is what you're trying to do. And you think, why would you not do that? All of the great things in the world have been brought about through the careful application of the scientific method: spaceflight, medicine, everything. So apply that to your business; why wouldn't you? If you can get people to understand that that's broadly what it is, they can see beyond all of the slight misperceptions and strangeness that come with it.
Gavin Bryant
From your experience, why do you think there's some reluctance to more readily apply the scientific method to problem solving in business?
Jonny Longden
Yeah, it's a really interesting area. Personally, I think a lot of it comes down to the nature of corporate hierarchy, and the way people are hired, valued, and monetized. If you think about it, when businesses hire people, they're really buying their past experience. You get paid more for having more relevant, and literally more years of, background experience. Someone who can come into a business and say, 'I've worked here, and I've done this, and I've got 10 years' experience,' gets paid more for that. That's how we work in the business world; that's the commodity of people in the business world.
And so when somebody gets into a company, they have to justify what they're being paid, and what they're being paid is related to their past experience, their brain, the things they've done, and how they do things. So they have to justify their value using their opinion, their past experience, and the knowledge that they have, rather than their ability to exercise a process. It really comes down to that. And that is why you have a lot of ego in business. People start in new companies and go, 'Well, I worked here, and this is how we did it, and this is how it worked there.' Everybody wants to bring their own opinion and their own ego to the table, and that's fundamentally what underpins business culture. That is why experimentation is so challenging a lot of the time, and why it's so challenging to get people to think like that. Even though they might rationally, consciously think, 'Yeah, we should do this testing,' at the end of the day, subconsciously and emotionally, they're living according to the other model, because that's how the business world works, and they can't really not do that. So, yeah, that's huge, right? How do you change that? We chip away.
Gavin Bryant
Yeah, I think that's a really good point. The notion of 'I don't know' is lost in business. 'I don't know' is something that, even as children in preschool and in high school, is viewed as negative. But we should be starting from a position of 'I don't know', asking questions and forming hypotheses as a foundational element, rather than calling the shots. It's a much better position to come from, and then we build up our reasoning and understanding sequentially, a little bit like climbing a ladder.
Jonny Longden
Exactly, yeah. And it's the same with failure. There are companies that are built on a culture of fear of failure. People think that if they own and are responsible for an initiative that fails, they will get fired, or at least won't get promoted. So that causes people either to avoid doing things that they think might fail, or to spin things that they have done into things that look successful. And that's very, very common. You get businesses that have never, ever failed at anything; obviously they have, but they very carefully cover it up. I've been in roles in my life where my job was effectively to spin numbers to make things look successful when they weren't. That happens all the time. So it's about trying to instill that growth mindset, which is what you do with children: you want children to realize that trying something and failing is literally how you learn. If you can't get on a bike because you're terrified of falling off, you'll never learn to ride a bike. Falling off, and then working out how not to fall off, is how you learn to ride a bike. And it's exactly the same in anything. But it's amazing that a lot of businesses don't work like that at all. In a lot of ways, a business can be more juvenile than a child.
Gavin Bryant
Yeah, I get asked that question a lot, and you probably do too in your role. People want to know how they can position and talk about failure more readily. I think the way to reframe that is always around learning. If you're framing failure as failure, no executive or senior leader, as you pointed out, wants to be associated with it. And as much as failure is beneficial, and can stop an initiative early in the piece, and be a positive in some respects, there's this stigma around failure in business. Really, it's like a bad smell; no one wants to be attached to it.
Jonny Longden
Exactly, yeah. There are quite a few things around terminology and the language that we use. I've had this conversation quite a lot recently, but it's quite common in experimentation, and in CRO, and we'll get back to that in a minute. People talk about winning and losing in experiments, and that in itself is very, very charged language. Who wants to lose? It's sports language, isn't it? Why would you want to lose when you can win? But if you went and asked a scientist who was trying to develop cancer cures what the outcome of an experiment was, there's no way they would talk about winning or losing. It just doesn't make any sense. It's nonsense. You can't win or lose in experimentation; you're finding something out, that's the point.
So yeah, all that kind of language of winning and losing, failing and succeeding, doesn't do anybody any favors, and it's hard to get out of. The other one, which is slightly unrelated: I said CRO and experimentation in the same sentence. It's a weird thing that a lot of people call this conversion optimization, and it just has that name, so you can't really get out of it. A lot of people go, 'Don't be so picky about the language, it's just a name, it doesn't matter.' But it kind of does, for exactly the same reason that people going around talking about failure and losing has a really big impact on their work, on their likelihood to take seriously what it is we're trying to show them. The words 'conversion optimization' have so much meaning in them that they railroad what you're trying to do. For a start, 'optimization' really implies tweaking something that's already been done: we're going to build this stuff, we're going to get this working, and then we'll optimize it. So you have people going, 'There's no point optimizing this website, because we're going to rebuild it.' Whereas there's every point in doing experimentation if you're going to rebuild it, but they can't see beyond that, because the words give them this view of what it is. There's no real answer to this, but the language is quite important, really.
Gavin Bryant
Let's jump forward to Sky. Thinking about those early days at Sky, what was the experimentation culture like before the implementation and scaling of the program?
Jonny Longden
Yeah, so a bit of context first. Just before I joined, Sky had a very fragmented engineering setup, where different divisions, and even departments within those divisions, had their own completely separate product teams and developers, completely siloed within vertical business units. So, you know, the publishing side, Sky Sports and Sky News, might have had separate ones; there would have been one in the sales area and one in the customer service area. What happened prior to me joining was a big company-wide initiative to consolidate all of that into a single, central Center of Excellence for digital.
The location moved drastically as well. We built a whole new office in Leeds that today, I believe, houses somewhere around 600 software developers. In a sense, it's almost like a central agency, an internal agency as it were, for any engineering and dev work that happens. And I was hired right at the very inception of that, just before we started building the office. My role was initially incredibly ambiguous. I was hired on the basis of: we're hiring all these developers, we're putting a load of investment into the development of our different products and digital stuff, how do we make sure it's commercially viable? That was really the job description; it was a question rather than any kind of solution. And it was because of my background that my answer to that question was experimentation. So I very quickly morphed my own role into being almost solely about experimentation in that product environment. Prior to that, there had been some experimentation going on in individual areas, but it was incredibly varied in terms of what people were doing. There was some reasonably advanced stuff going on in the sales area, although interestingly, that area works to daily trading targets, so they were trying to do very, very fast testing, almost daily testing, which, as you can imagine, is not a good thing to do. So yeah, there was no cohesive plan for it, really, and certainly no culture of it. That said, Sky doesn't suffer too badly from what I was talking about earlier, the real aversion to failure. It's not Google or Amazon in that respect, but it's not the worst place you could work for that either. So there was some receptiveness in the culture right from the outset.
Gavin Bryant
Okay, so there was some experimentation happening, but it was fragmented. There was no standardization of processes, procedures, or communication, and quite different levels of maturity depending on the internal department. So, based on that landscape you were facing into, what happened next?
Jonny Longden
Yeah, I quickly managed to secure funding for resource, for people. The real beauty of it was that we were building this whole environment from scratch. There were no processes; we invented it all from scratch. And I was fairly closely involved in how the agile ways of working were being designed. So it was a really beneficial place to be, in terms of being able to take a step back and go, 'This is the way that experimentation should work in these environments.' Otherwise, you're dealing with a lot of legacy things that already exist and trying to work around them. We didn't have that; we were building everything new, and I was working side by side with the senior people developing the engineering practices. So it was a really interesting time, and we managed to develop what I still think are pretty pioneering ways of integrating experimentation with product development.
There were some interesting hurdles along the way, though, which are good stories, I guess, for people to hear about.
The first thing that happened was I managed to hire a really good team of people, skilled and capable of running experimentation. What we did was align them to different squads within the business. There were squads that would look after Sky Sports, Sky News, sales, service, product, all that sort of stuff, and we aligned these resources with those areas so that they were effectively part of the scrum team. Not from a direct reporting point of view, but from a cultural point of view, they were sitting with the scrum teams and working closely with the product owners. The theory behind that was that product teams don't really want somebody over there throwing things over the wall at them; it works better if everybody's working together, involved, and discussing things together. And that worked really well, to a point. The interesting thing was that, eventually, we learned that we were coming up with successful outcomes of experiments, where we would say, 'This needs to be pushed into production, it needs to be released on the production site,' and those things would just end up sitting in a backlog and never actually getting done.
And so that was the first kind of hurdle, where you kind of realized, why is that happening?
And the answer to it became really interesting, because what I learned was that there were two things. One is that if a squad has a single roadmap, then it tends to get prioritized in particular ways. What you end up having is critical bugs, where everyone goes, 'We can't possibly have that, it's brand damage,' which just get solved. Plus, developers love solving bugs; it's almost like a little competition to see who can solve the bug fastest.
So those things all get done, and any new bug that's found gets done. Then at the opposite end of the scale, you've got bigger projects that tend to come from the business side: new functionality that's required, new products that have been developed, and things like that. Because of the external demand for those things from other areas of the business, there's a reactive nature to their prioritization; it's kind of who shouts the loudest. Plus, those projects tend to be more interesting for developers to work on. And that's a really key thing: developers' careers are based on how much exposure they get to new technologies and new code bases.
So they will gravitate towards the more complex and interesting projects, and by nature, what comes out of experimentation tends to be simpler to do. That's why you end up in a situation where that kind of work often never gets done. The answer we came up with was, one, we split that roadmap into three different roadmaps, so that each one could be tracked differently, and if resource was tight, each one could be squeezed but not killed completely.
The other thing was that developers were rotated around those roadmaps. That's a really important thing that people probably don't often think about: developers don't really like working on basic projects, and if that was all they were doing, it might feel relatively unfulfilling, so we rotated them around. The other part was to have a specific roadmap for developers supporting the experimentation program. That all hadn't 100% come to fruition by the time I left, so I don't really know where it went, but that was the theory of it, at least.
Gavin Bryant
So once the team started running experiments and effectively interrogating the roadmaps, and the roadmaps are effectively hypotheses about what may work in the future, how did the business take on the data from experiments and change the approach, if warranted, by way of roadmapping?
Jonny Longden
Yeah, it depends on the area, I guess; we developed different little micro-cultures in different areas of the business. But, just to slightly change the question, I think one of the most important things we were able to do in terms of developing the culture was to surface and socialize the bigger and more high-profile tests and their outcomes. As you can imagine, Sky is an enormous company in a complex matrix environment, and it's very political, as businesses like that are.
One of my roles really ended up being almost constant stakeholder management and socializing what we were doing, and that's a constantly changing playing field, because restructures happen and things like that, so you're constantly having to do it. But what always got traction was demonstrating: here's something that everybody thought was going to work, that we invested in, and it didn't; or here's something that seemed like a great idea, and actually it didn't do anything. And obviously there's a commercial implication to those, which is what ultimately piqued people's interest. That's really what ended up being the glue between what we were doing in the product environment and the rest of the business and the decisions that were made: being able to show, quite clearly, the commercial impact of what we were doing. As you'll know yourself, there is no perfect way of showing the revenue or commercial impact of an experimentation program, but you have to do it, you have to try, if you're going to get traction. So that's what we did, and we were able to go around and say we have both generated and saved this amount of money through these initiatives. That's what gets people interested, and what gets people understanding how they can make decisions around things.
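There's no canonical formula for the kind of commercial claim Jonny describes, but a back-of-envelope annualisation is one common approach. The sketch below uses entirely made-up inputs, and the 50% persistence haircut is a common rule of thumb rather than anything Sky used:

```python
# Back-of-envelope annualisation of one winning test. All inputs are
# hypothetical illustrations, not figures from the Sky program.
monthly_visitors = 400_000   # traffic through the tested journey
baseline_cr = 0.030          # control conversion rate
relative_lift = 0.04         # 4% relative lift measured by the experiment
avg_order_value = 250.00     # average order value in GBP
persistence_haircut = 0.5    # discount, since measured lifts rarely persist in full

extra_orders = monthly_visitors * 12 * baseline_cr * relative_lift * persistence_haircut
print(f"Estimated incremental revenue: £{extra_orders * avg_order_value:,.0f} per year")
# -> Estimated incremental revenue: £720,000 per year
```

The point isn't precision; it's that a rough, consistently applied figure like this gives stakeholders something commercial to rally around.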
Gavin Bryant
Were there any other communication strategies, forums, or other modes that you used, that you found really effective and would advise other teams or people to use?
Jonny Longden
Other than that, I mean, communicating and socializing experiments, and, if possible, the most contentious experiments you can use as examples. Things that are a bit divisive in terms of what people think of the control versus the variant are always good, because somebody might not be happy with the answer, but it's not your fault: it's the data, it's what customers are doing. That is really, really powerful. The other one is just to try and instill in everybody a sense of saying, 'You should test that.' Like Chris Goward's book, You Should Test That!, which I read years and years ago, and it stuck with me. That is really what you're trying to instill: to get people to think, 'Yeah, that's a good idea, we should test that.'
He didn't really go into it in the book, that I remember, but loaded within that statement is the fact that all ideas are completely welcome, and that's really important. What you don't want is people thinking that the process and the ownership of idea generation is one person's responsibility, that they're going to use data and research, and everybody else's ideas are worthless. All ideas, no matter how stupid they seem, are completely and utterly valid, providing you test them.
So what I'm ultimately saying is: invite ideas from everybody. And that is a communication in itself. If you can firmly communicate that all ideas are welcome, and give people the means and the forum to submit ideas, then they become involved in the process, they become involved in the program, because everybody loves to see what happened to their idea and what the outcome was. So yeah, I think that's the other really important thing.
Gavin Bryant
You touched a little while ago on organizational politics. What were some of the stakeholder management and political issues that you had to clear a path through for the program to succeed?
Jonny Longden
Yeah, I mean, business politics is a funny kind of thing, and my answer won't be specific to Sky; I've worked in and with a lot of big corporations. It ultimately comes down to survival, I guess. Big companies restructure constantly, a lot more frequently than others.
So everything's moving and changing, and ultimately, at a very senior level, people are both protecting and making a claim for reports and pots of funding. As a senior person in a business like that, you are constantly under fire from other people going, 'Well, I don't think that's really worth us doing, I think we should do this instead.'
So senior people are really trying to make a claim for things and justify their existence. And experimentation is actually a really good way to support that, if you understand what it is people are trying to achieve, because it allows somebody to prove or disprove something. At the end of the day, these things always end up in complex, spin-type discussions, because obviously somebody wants an outcome, and you don't really have the option of being open to one outcome or another. But even so, what I've always tried to get across is that just being wedded to the process of experimentation is, in and of itself, really positive. As a very senior person, if you can say, 'We are using a rational process to further the commercial interests of this area of the business, and we're turning away from biases that are not going to deliver,' then that in itself can be a very strong position politically.
So yeah, that's the short answer. Business politics is a super complex and pretty boring area, and there's not a lot you can do about it; it goes back to what I was saying at the beginning, that's just how hierarchical businesses work. There are some interesting theories about ways you could actually make business non-hierarchical, which I won't go into, but it's something I got quite into reading about a couple of years ago. We're a long way off general business being like that, though.
Gavin Bryant
Completely. So, what were some of the major benefits that you saw from the program? You've touched on commercial benefits, which were an excellent way to frame the efficacy of the program. What were some of the other benefits you observed within the business?
Jonny Longden
I think one of the things I was always quite proud of was being able to shift the culture of product development towards being more open to the idea of experimentation. And that's another thing that was really interesting; it's not really a Sky thing, it goes way beyond Sky. When you've got product teams working agile, or allegedly agile, there's a particular kind of culture that goes with that, which tends to come from the fact that most people working in that environment, either at that point or throughout their career up until that point, have been very much focused on delivery.
The actual concept of Agile is meant to be that you've got a fairly autonomous unit of engineers responding directly to customer needs, behavior, and requirements, iteratively developing a product via data on what actual customers are feeding back and how they're using the product. That's what Agile is supposed to be, to me, in a utopian sense. But it never really is that; what it is, is being faster at delivering what other people ask for. So a lot of people in that world are very much focused on just delivering: 'We've got this list of stuff to do, that's been given to us by somebody else, and we're going to get through it as fast as possible.' That's the delivery mindset. What you want to do is start to build some questioning and challenging into that: should you actually do that? That's a good idea, but could we do it in a slightly different way? To try and put experimentation at the heart of that is really the goal. I won't pretend we completely changed the culture of product development at Sky, but we did get some of the way. So that was an interesting and good outcome for me: there were places where we started to really show that product development, and that engineering mindset, was happening in a slightly better way.
Gavin Bryant
Adding the validation step into the product development lifecycle, rather than skipping straight from idea to build.
Jonny Longden
Yeah, exactly.
Gavin Bryant
So if you had your time again, what's one thing that you would potentially look to do differently?
Jonny Longden
In general, I learned a huge amount along the way; if you could magically go back and know all of that right from the beginning, I would have done things quite differently. And actually, right now I'm in an agency environment (I've worked in agency environments before, but quite a long time ago), working with businesses and needing to sell to them, and you need to constantly sell and resell the idea of what you're doing and show the benefit.
There's a lot I've learned now, in terms of the selling side of it, and had I had that experience when joining Sky, I think I probably would have been able to do a lot more convincing of wider areas of the business. So, whilst that is what I did, thinking back, there are techniques and ways of communicating and explaining things that I have now that would have let me give people a simpler vision.
Gavin Bryant
Yep, I completely agree. That's something I've grappled with as well in implementing and establishing an experimentation program. There's the doing of the experimentation and the generating of the results, but it's really what you do with the results that counts. The magic really happens in how you communicate those results: simplifying the results and outcomes, saying what they mean for the relevant business departments, and putting it in very meaningful terms that people can understand and rally around.
Jonny Longden
Yeah, I think the other thing is that different people you talk to have very, very different perceptions of what experimentation is, before you even start saying anything to them. Some people have almost no knowledge of it; other people have a very different perception of what it is. And you have to be able to tailor what you say according to that background experience, because there's one way of explaining what you're doing that would seem incredibly complicated to some people and not to others. So you almost have to tailor exactly what you're saying, in quite a dramatic way, for different people based on their past experience and their perception of what it is. Understanding who you're talking to, and questioning them first about what they think it is and what they think the outcome of it will be, allows you to talk to them in a very different way.
Gavin Bryant
Good point. So, for businesses that are looking to commence an experimentation journey, what would your key pieces of advice be for those just getting started with experimentation?
Jonny Longden
Yeah, I think number one would be: don't underestimate the scale involved. It's a real problem; I see it all the time, because it is ostensibly very easy to run A/B tests. If you're going to get into A/B testing on your website, you can get Google Optimize, you can set it up through Google Tag Manager, and you can run an A/B test. That's really easy, but it does not constitute experimentation; it doesn't come anywhere near running a good experimentation program.
So however you go about it, you need to make sure you are sourcing skills and experience in running experimentation. Because whilst it's very easy to do, it's very, very easy to do it wrong, to the extent that you might as well not bother. We come across businesses who say they're already doing experimentation, and what they're literally doing is running a test for an hour, then stopping it and going, 'Oh, that's the winner.' And that's no joke; that sort of stuff happens all the time, because it seems like such an easy thing to do. Somebody in a marketing team going, 'Yeah, I've done that, I'll run a couple of experiments,' is a million miles off actually doing it properly and generating value from it; there's so much that goes into doing it properly. I'm not saying you have to go and invest hundreds of thousands of dollars or pounds, but whatever way you're doing it, you have to be sure you're getting it right, and that you've got the right skills, especially around statistics, but also around how you learn from the tests and how you come up with ideas. It's not an easy thing to do. That would be my main one.
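To put rough numbers on why an hour-long test can't work, here is a minimal sketch of the standard two-proportion sample-size calculation. The baseline rate, lift, and thresholds below are illustrative assumptions, not figures from the episode:

```python
import math
from scipy.stats import norm

def sample_size_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Normal-approximation sample size per variant for a two-proportion z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold (~1.96)
    z_beta = norm.ppf(power)           # power requirement (~0.84)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

baseline = 0.03               # hypothetical 3% conversion rate
target = baseline * 1.10      # aiming to detect a 10% relative lift
n = sample_size_per_arm(baseline, target)
print(f"Visitors needed per variant: {n:,}")  # ~53,000 per arm, over 100,000 total
```

On those assumptions, an hour of traffic on all but the very largest sites falls short by orders of magnitude, which is Jonny's point about declaring a "winner" after an hour.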
Gavin Bryant
The way I think about that: poorly designed experiments produce poor data and insights, which lead to poor decisions, which lead to poor business investments, which impact business strategy. So there's a flow-on effect that stems from whether or not an experiment is a trustworthy, rigorous, and disciplined process.
Jonny Longden
Exactly. And like I say, you're better off not bothering, because if you think you're running a testing program and making decisions about what to do, and the vast majority of those test results are false positives, which is entirely feasible if you're not running the tests properly, then you would be better off just guessing. You probably would be better off guessing, because there would be a bit more thought behind it, rather than what's effectively rolling a die, or flipping a coin five times. So it's easy to get wrong.
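As an illustration of where those false positives come from, here is a small simulation sketch of "peeking": checking an A/A test (where there is no real difference between variants) after every batch of traffic and stopping at the first "significant" result. All parameters are hypothetical, and this is not a description of any specific program mentioned in the episode:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def peeked_aa_test(p=0.03, batch=500, checks=20, alpha=0.05):
    """Simulate one A/A test (no real difference) with a peek after every batch.

    Returns True if any interim look crosses the nominal significance
    threshold, i.e. the test would have been stopped and called a 'winner'
    even though the variants are identical: a false positive."""
    conv_a = conv_b = n = 0
    z_crit = norm.ppf(1 - alpha / 2)
    for _ in range(checks):
        conv_a += rng.binomial(batch, p)  # conversions in variant A this batch
        conv_b += rng.binomial(batch, p)  # conversions in variant B this batch
        n += batch
        pooled = (conv_a + conv_b) / (2 * n)
        se = (2 * pooled * (1 - pooled) / n) ** 0.5  # pooled standard error
        if se > 0 and abs(conv_a - conv_b) / n / se > z_crit:
            return True
    return False

trials = 2000
fp = sum(peeked_aa_test() for _ in range(trials))
print(f"False-positive rate with peeking: {fp / trials:.1%}")
# Typically lands well above the nominal 5% of a single fixed-horizon analysis.
```

With 20 interim looks the false-positive rate is inflated several-fold above the nominal 5%; this is the statistical mechanism behind the hour-long "winner".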
Gavin Bryant
Okay, just closing with three quick questions now. The signature question: an experiment that you've performed that reframed organizational perspective?
Jonny Longden
Yeah. Interestingly, this is a series of experiments, rather than one experiment, that we're running for a client right now.
I have a client who sells furniture, an eCommerce furniture retailer, and they have two different sets of products: 1) products they buy in and warehouse themselves, which you buy direct from them; and 2) products that are almost drop-shipped, so when you order, it's ordered on demand from a manufacturer who actually builds it and sends it.
The first type of product you can get within two days; the second type will take maybe 16 weeks to arrive. So we've done a huge amount of testing, initially just some small tests on limiting people's choices, limiting what they see to just stuff that's in stock, to ask: does showing them only express-delivery products increase conversion? Which it hugely did.
We have built that up into bigger and bolder tests. Where this is eventually going, which for me is a really interesting aspect of how experimentation works, is a big strategic question: should they even sell the products that they don't stock?
Another potential option would be to split the brand, so you've got two different brands: one where you focus on the express-delivery aspect, and another where you focus on the quality.
We haven't answered the question yet, but these are big strategic questions. And that is a really good example I often give people of how experimentation should work: bottom-up strategy development. You learn something from running small tests, and that creates a really big question that you need to answer in a big way, by running the business differently. That's way more important than going after winning tests to hack a bit of extra money out of the website. If you really want to get the value out of experimentation, that's how you do it: you learn, and you pivot your business on the back of what you learn.
Gavin Bryant
Yeah, I think that's a really good example, because it really highlights how strongly experimentation should be anchored back into strategy. In this particular example you've provided, it raises a big question around strategy, and whether a strategic pivot is required.
Jonny Longden
Exactly. And once you've realized that that's a question you need to existentially solve, you can start designing a load of other experiments around it, to explore even further and to gather the data that you need in order to make a decision.
Gavin Bryant
Question number two, your top three resources that you would recommend to the audience?
Jonny Longden
As in tools or...?
Gavin Bryant
Books, blogs?
Jonny Longden
Okay, yeah. At the moment, I'm a really big fan of reading books that are not directly about experimentation, because I think you can learn a lot from outside the direct industry.
So, some interesting stuff that I've read recently: "How to Think Like a Rocket Scientist" was quite a good book, I think. I'm going to struggle to remember the names of things... Oh yeah, another book, which is about continuous improvement.
Continuous improvement is sort of an extension of Kaizen, the manufacturing thing. I can't exactly remember the name of the book; I think it's called "How to Succeed with Continuous Improvement". That's right. So that was quite interesting. The other area I've been focusing on, as I mentioned before, is organizational hierarchy. There is a movement called Holacracy, which is a system of non-hierarchical business management, and because of the things I was talking about, I got quite interested in that. There are a few things like that in the area, thinking about how you might run a business non-hierarchically, and how that might impact the ability to do experimentation.
So, those are a few themes I've been looking into around reading and learning.
Gavin Bryant
Final question! If people want to get in touch with you, what's the best place to find you?
Jonny Longden
LinkedIn. So yeah, connect with me on LinkedIn. I'll connect with anybody on LinkedIn, and I'm very, very open to having conversations with anybody about anything, really. I'll always try and make time for that. So if anybody wants to talk about anything, I'm very happy to do that.
Gavin Bryant
Fantastic. Thanks so much for your time today, Jonny. Great to chat.
Jonny Longden
Thank you very much.
“After 14 years of running tests on websites, you have to experiment on everything. No matter how much of a no-brainer, no matter how obvious something may seem, there’s a very good chance that it’s not going to work.”
Highlights
Test everything. No matter how much of a no-brainer, or how obvious it may seem to you, there’s a very good chance that it’s not going to work
Be humble - our opinions and assumptions on what we think we should be doing to improve our product or service are inherently flawed
If you’re not experimenting regularly, there’s a good chance that 90% of what you’re doing is an utter waste of time and money, or is having negative effects on your business
Experimentation is not another channel that sits alongside SEO or Paid Advertising. Experimentation is the heartbeat of your business - it’s the operating model of your business. Experimentation should not be an addendum, an add-on tool or method. Experimentation is how your business develops new products, makes decisions and makes business investments
Corporate cultures can make experimentation challenging. People are hired and monetised based on their past experiences. Business leaders have to justify their value using their opinions and past experiences, rather than exercising a disciplined thought process. Culture clash can occur. Experiments don’t care for opinions. Experiments clash with egos
Failure has a negative stigma. Many companies have a fear of failure. People don’t want to fail for fear of reprisal - being fired or missing out on a promotion. People avoid working on projects which may fail, choosing to work on successful projects. Experimentation isn’t about winning or losing. Failure needs to be reframed as organisational learning
Just because an experiment tests positively, there’s no guarantee that it will be implemented. Lobby hard for implementation, otherwise business value will sit in a product backlog
Constant communication and stakeholder engagement is critical. Surface the commercial outcomes of high-profile experiments or contentious experiments to drive interest and engagement from key stakeholders. Focus on constantly selling and reselling experimentation and the benefits of your work
It can be challenging to quantify the commercial impact of an experimentation program. Use cost avoidance and revenue generated to influence business decision-making
Invite ideas from right across the organisation. All ideas should be welcome. Ideas from all teams are valid as long as you test them
Performing experiments and A/B tests has never been easier. While experimentation is easy to do, it’s easy to do experimentation very badly. Don’t underestimate the scale and expertise required to perform reliable experiments. Otherwise, you’re better off guessing
Experimentation works to promote bottom-up strategy development. Learnings from performing small experiments can create a big strategic question that needs to be answered. Answering the question may result in a different strategic direction for the business
In this episode we discuss:
How Jonny got started with experimentation
Jonny’s guiding principles for experimentation
Why there’s no such thing as a “no brainer” with experimentation
Reluctance for use of the scientific method in business
Why the language of experimentation can be problematic
The culture of experimentation at Sky
How experimentation started at Sky
Effective communication strategies for experimentation
Key benefits of the Sky experimentation program
Advice for businesses getting started with experimentation