Episode #151 | November 9, 2020 | Season 2, Bonus Episode 45

How to embed experimentation into Agile product development

With

Jonny Longden

(

Journey Further

)

We learn how to embed an experimentation program into your existing Agile product development process.


Episode guest

Jonny Longden

Co-Founder & Conversion Director
at
Journey Further

Episode host

Guido X Jansen

Cognitive psychologist, CRO specialist, podcast host

Shownotes

Book(s) recommended in this episode

Experimentation Works

Transcript

Please note that the transcript below is generated automatically and hasn't been checked for accuracy. As a result, it may not reflect the exact words spoken in the interview.

Guido: [00:00:00] To kick things off, I'd love to know a bit more about your background and how you actually got started in CRO.

Jonny: [00:01:44] Sure. So I started my career in, well, I guess what would now be called data science, but we didn't call it that then — it was just called analytics. I worked in sort of direct marketing and CRM.

And then I became a kind of consultant, really helping people with things like segmentation and propensity modeling and spatial analytics. Then I was working for an agency that did a bit of both — that sort of offline stuff and digital. This was in about 2007 or so.

And I just got exposed to this world of digital analytics and conversion optimization — which wasn't really a thing then; A/B testing tools were just about emerging. I became fascinated by it, but in particular I saw that most of the people doing that sort of stuff had no actual experience in analytics.

So I thought it must be quite beneficial to take that real analytics experience and put it with that. That's why I crossed over into digital, and it was a successful move — it got me known, given that most people doing it at the time didn't have that background. I then ran analytics and conversion agency teams in a couple of agencies for about seven or eight years. Most of that time was with a big global agency called R/GA, based largely out of New York — they're kind of an innovation consultancy, really. So whilst running a team, I ended up becoming more of a management consultant.

I would fly around the world and help big organizations with their internal setup for analytics and conversion — people like Nike, a big Russian retailer called M.Video, Manchester United, a few companies like that. And then I had a baby, so I just couldn't do the travel anymore.

So I wanted to go client side. I did a stint at Visa, working on their version of PayPal, called Visa Checkout. And then I went to work for Sky — and Sky is probably the most interesting part of my story, I guess. Before I worked there, Sky had really sort of fragmented digital teams in different parts of the business, and for efficiency reasons they wanted to put it all together into a centre of excellence in Leeds.

And I was hired right at the start of that process, with quite a broad remit: how do you make that sort of thing commercially beneficial, and how do you make sure it has good commercial impact? Because of my background, my answer to that question was, you know, experimentation.

So I built a big experimentation function there. The really interesting part was that because we were building this centre of excellence, there was nothing legacy to deal with. We were literally building everything from scratch, including all the ways of working for developers and the agile methodologies.

So it was a great opportunity to just go: right, what's the perfect way this should work? — and set it up right from the beginning. And that's what we did, and it's still running a lot like that now, even though I'm not there. That's, incidentally, where I learned a hell of a lot about what we're going to talk about today: how do you properly integrate experimentation with agile?

And I learned that the hard way, in some ways.

Guido: [00:05:14] And Sky keeps popping up as a positive example of how to do CRO. I see a lot of people from Sky popping up and talking about it at conferences.

Jonny: [00:05:27] Yeah. I bought Optimizely whilst I was at Sky, and I'm a big advocate of what we built there and how it works — from a process point of view, how it works with agile and everything — particularly because most of our testing was server-side testing as well. So we had the developers really heavily involved in that, using Full Stack and things.

Guido: [00:05:47] Yeah. Optimizely now being bought by Episerver — what was your first reaction to that?

Jonny: [00:05:56] I mean, I'm a bit removed from it, because now I run a consultancy and we work with smaller clients — we don't actually have any clients on Optimizely, because it's so expensive. So I don't know; I thought it was a bit odd. It's not the most obvious connection you would make. But other than that, I don't know much about the acquisition and what it will mean.

I have heard some grumblings from people who believe it will change things, but I don't know much about it, really.

Guido: [00:06:34] Yeah. I've spoken to a couple of people, and almost no one saw this coming — that's basically the main trend. We'll see what this brings: whether it removes Optimizely from the open market as a standalone service, or whether it's only going to be connected to Episerver. I actually have an episode scheduled with the CEO, Alex Atzberger, so I'm going to ask him.

So — that was the introduction, how you got started with CRO. Why are you still in CRO?

Jonny: [00:07:13] Yeah. Well, I just generally have this real passion for the whole thing, and always have. Part of that is fueled by the fact that I honestly think not very many people do it right at all. There's a whole range of what you can mean when you say CRO — a really big spectrum. At the bottom end of the scale you have companies who literally have somebody in the marketing team who very occasionally runs an A/B test in Google Optimize or something like that.

And they can genuinely believe that they do CRO by doing that. At the opposite end of the scale you've got Sky, all the way up to Booking.com and all the big guys people talk about when they talk about experimentation. There's a whole range in between, but the vast, vast majority of people are at the bottom end of the scale — and don't realize it.

And that's really interesting to me — it's something I want to try and change. I want to open people's eyes to the fact that there's a better way to do it, and that it has such enormous benefit as well. When you do it right and you get the right process behind it, it just can't not have a really, really massive impact, and a really beneficial one. That's actually what drove me to start the conversion practice at Journey Further: I just want to go out there and open people's eyes to it and get them to understand that. But at the end of the day, to really answer your question: I just love it.

I'm sure a lot of people in CRO are the same — there's just something really, really exciting about running tests, running experiments, proving things, and particularly disproving things as well. One of the things I always tell people is that the only really true thing about CRO is that you can't really guess what's going to work.

You can't really second-guess things and have "best practices", because at the end of the day, things that seem really obvious will fail, and things that seem really stupid will win. That is really fascinating — trying to understand why those things are like that, and also exposing your own cognitive bias, is just really interesting and exciting.

Guido: [00:10:13] When you encounter a client or prospect that comes to you for help, and they do just what you said — a marketing department that runs an A/B test a couple of times a year — and they think, okay, we already do CRO: what do you tell them? What is CRO really about?

Jonny: [00:10:31] Yeah. So the really interesting stat that I very often start with is from Optimizely. They did this study — you may know of it — where they gave virtually all of their historical data from the whole platform to Stefan Thomke at Harvard Business School, and he ran

research on all sorts of different nuances of the statistics. But the overarching finding, which he mentions in his book, is that only one in ten of all the tests ever run on the platform came out with a positive result.

And another way to look at that: if you're an e-commerce director, or any kind of website owner, everything you do to your website, no matter how big or small — making content changes, putting things through your dev agency —

if you're not testing any of those things, there's a really good chance that nine out of ten of them are a waste of money, a waste of time, or even worse, are damaging your revenue. That's what I try to get people to realize first of all — just by showing examples of tests where

it seemed like a no-brainer, something you couldn't possibly not do, that actually turned out to be detrimental. Then they start to get it, and they start to realize: actually, yeah — everything you do to your outward-facing website is a risk.

So CRO — which, as we all know, is not a great term for what we actually do — is really a kind of risk management. That is what it is: you are basically making sure that what you invest in web development is commercially beneficial, commercially impactful, and not a waste of money.

There's a whole way of articulating the economics behind it: number one, you're not doing things that don't work, so you're saving the damage to your conversion rate and revenue and profit and everything that goes along with that.

You're also not spending the money in production web development to develop those things. So there's a whole virtuous circle there. And even if you spend money on CRO — done properly — that should basically just displace what you would have spent in production web development on things that aren't going to work.

So it's not really a new cost. It's just a more efficient version of web development, and a less risky version of web development, in a way.
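The economics described here can be put into rough numbers. The sketch below uses entirely hypothetical figures for average uplift, average damage, and build cost — only the roughly one-in-ten win rate echoes the Optimizely study mentioned earlier:

```python
# Illustrative only: hypothetical figures comparing a site that ships all 50
# roadmap ideas blind vs. one that experiments first and ships only winners.

def expected_value(n_changes, win_rate, avg_win, avg_loss, cost_per_change):
    """Expected revenue impact of shipping n_changes, minus build cost."""
    wins = n_changes * win_rate
    # Assume half of the non-winning changes actively damage revenue.
    losses = n_changes * (1 - win_rate) * 0.5
    return wins * avg_win - losses * avg_loss - n_changes * cost_per_change

# Ship everything blind: build all 50 ideas in production web development.
blind = expected_value(n_changes=50, win_rate=0.1, avg_win=100_000,
                       avg_loss=40_000, cost_per_change=10_000)

# Experiment first: only the ~5 validated winners reach production; the
# experiments themselves cost something, but the losers never get built.
test_cost = 50 * 2_000  # hypothetical cost per experiment
tested = expected_value(n_changes=5, win_rate=1.0, avg_win=100_000,
                        avg_loss=0, cost_per_change=10_000) - test_cost

print(f"ship blind:  {blind:+,.0f}")   # → -900,000
print(f"test first:  {tested:+,.0f}")  # → +350,000
```

Under these made-up numbers, the experimentation budget doesn't add cost — it displaces wasted production development spend, which is exactly the "virtuous circle" argument above.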

Guido: [00:13:23] And it's probably easier to follow those things through with CRO, because you don't necessarily have to develop the whole product or functionality first.

Jonny: [00:13:32] Yeah. And often an objection will be: well, there are just some big things we know we need to do, and we can't really test them because they're too big — like, we need to implement PayPal, or we need to redesign the checkout, or whatever. And from there you can say: well, why do you need to do that?

What makes you think you need to do that? How do you know? Often, with something like redesigning a checkout, it's: well, we haven't done it for a long time, and nobody likes it. And it's like: well, who doesn't like it? You don't like it. Well, maybe your customers might.

So you've got to go back to it: you start with research, then you start testing small things. A lot of people will test small things and then just implement them or not, and move on — but the whole point should be that you're building up a sort of ladder of learning.

You test small things. If a small thing works, that gives you the license to do a slightly bigger thing along the same theme, because you've proven there's a point in investing a little more time and effort. Eventually you might get to a fairly big project, which you do in production web development, and then you feature-test it before actually releasing it, just to make sure — or to refine it. But by the time you get there, you should have proven there's a point in doing that piece of work in the first place.

And that, to me, is what experimentation is — as opposed to "CRO", which a lot of people use as: let's hack these few things on the front end just to try and extract a bit more revenue out of the website. Really excellent experimentation, done properly, is a sort of continuum, going from research, to learning, to the development of that learning, to ultimately doing bigger and bolder things — real bits of innovation — which ultimately cost money.

But that's the point: you have de-risked the money that you're spending.

Guido: [00:15:34] Yeah, framing it as risk management makes a lot of sense, I think. We did an episode specifically about that with Stephen Buffalo — I'll link to it in the show notes. And it wouldn't be the first time I got to a client where the term "experimentation" had the connotation of something very risky — while actually it's the opposite, at least the way we do it with CRO.

And I think the one in ten that you mentioned from Stefan Thomke's research might actually still have a positive bias in it, because those are experiments that people consciously designed and ran. So one in ten might even be on the positive side for all the changes companies make overall.

Jonny: [00:16:26] Because you've got to think as well: all those companies had bought Optimizely. So not only are they doing testing, they must be reasonably serious about it, because they've paid for it, you know? And that means they're probably putting a reasonable amount of research

and data behind those tests. So absolutely, that could be positive — or too positive, even. But the thing is, ultimately you're trying to persuade somebody like an e-commerce director, somebody relatively senior, to consider the transformation that goes with doing it properly.

And I find the challenge is often that those kinds of people have, at some point, boxed off CRO by the fact that they've got somebody over there who does it, or an agency doing a bit of it, or something like that. They might have a balance sheet with annual budgets: PPC, SEO, CRO.

And by that fact they think they're doing CRO. Back to what I was saying originally: there are so many different things CRO can mean, and part of the challenge is that they'll say, well, we've got a CRO person — we've even got a little CRO team — so we've got it covered.

"That's absolutely fine, we're doing it." And yet what you're trying to get them to realize is that it's not just a tactic that sits alongside PPC and SEO. It's a whole way of life around how you do web development — or even marketing as well, in a way. That's the difference you're trying to get people to see. So you say: okay, you've got your CRO.

What's in your roadmap for web development, and where did that stuff come from? Because that's the thing: they'll find, well, there's a web development roadmap with a whole load of stuff in it. We did a strategy presentation at the beginning of the year, and we've got "redesign the checkout", and new payment methods, and tech debt, and all this kind of stuff in the roadmap.

And then there are also a few little things that came out of the CRO team. And you're like: well, if you take away the term, what CRO really is, is simply using research and data to understand what you should do to your website, and experimenting to validate whether you really should or not.

That's all it is. So why would you do anything other than that? Why have you got some stuff in this roadmap that isn't that, and then a little bit of stuff that is? That's what you're trying to get people to understand. With somebody at that level, you have to go back to the economics and say: you are effectively investing in web development, either with an internal team or an agency, and you have no real idea whether that stuff is going to work.

So you're putting a massive risk on that investment by guessing. And even though it might not feel like guessing — because, you know, we did all this work. So, do we need a new checkout? Well, you know —

Guido: [00:19:39] The competitors are doing it — so it's still guessing.

Jonny: [00:19:41] Yeah, yeah. It's like: our checkout should look just like John Lewis's checkout, so we need it too, you know.

Guido: [00:19:47] Exactly.

Well, you just mentioned that you can apply this to product, but also to marketing. How do you see that? Because today we're talking about how to embed this experimentation way of working into agile product development. How do you see experimentation applied to marketing versus product?

Jonny: [00:20:09] Yeah. I think that all comes down to the process that you have — what we, and I know a lot of other people, talk about as a sort of operating system for experimentation. That's really the systemisation of how you go from

research and data that surfaces customer problems and opportunities, to ideas, to experimentation — and in that sense, "experimentation" is really a word for validation. So if you take it to a slightly higher level, where you're simply managing the process of research, data and insights, then managing the process of ideation and prioritization, and then doing some kind of validation to ensure those ideas really are the right ideas before you push them out into the real world — then that doesn't have to

relate just to website content. That can be marketing; it could be PPC messaging testing, or all sorts of things. In a lot of those channels you can do experimentation in the same way, so I don't really see that you need to limit it just to website content.

Although of course that's where it traditionally sits for most teams. Out of interest: after I worked at Sky, I went on to be the e-commerce director of a fairly large hotel company in the UK. What attracted me to that was that it was really like an experiment — an experimental job.

What happens if I take this stuff, this way of working that I built at Sky, and apply it to everything? Because I ran the P&L not just for the website, but for all of the marketing and distribution as well. So I used the same process and the same system that I'd developed — for everything — and it worked really, really well.

And that's now what we effectively sell and help other people with at Journey Further. But yeah: I think if you get hung up on the idea of an A/B test, and statistical significance, and things like that — which absolutely you should use where you can — then it precludes you from doing other kinds of validation. Because that's really what you're doing:

you're validating ideas.

Guido: [00:22:41] I think it makes sense that you start doing this on your own marketing channels. That's where you probably have the most traffic, and in terms of CRO, if that includes A/B testing, that's where you have a lot of data. But of course you can apply those principles to

email, SEO, SEA, product — all your marketing channels. It's not exclusive to digital, or to your own website.

Jonny: [00:23:07] Yeah. And also, if you have a really well-articulated strategy for what performance means — how you chunk performance out into the different levers you can pull to drive profitability for your business —

then that will naturally cascade down into the different things you're doing: the different traffic drivers, the different parts of conversion, and, if you're an e-commerce business, even things like the distribution chain or returns.

If you can articulate things that well, you've got a really good holistic view of what the business looks like in terms of what you can do to affect it — and optimization, in the pure sense of the word, means you're going to try and optimize all of those different things.

And then, again, going back to the idea that optimization or experimentation is just using research and data to understand where the opportunities are, plus some sort of validation to make sure you're taking your own cognitive bias out of the equation — then you can, and should, do that for everything.

Guido: [00:24:20] Yeah. And the validation doesn't need to be an A/B test, right? That's often something people use as a counterargument: oh, we cannot do CRO because we don't have the traffic. Like, we're business-to-business, we don't have the traffic to do A/B testing, and thus we cannot do CRO.

Jonny: [00:24:38] Yeah — and of course that's not true. It's rarely true. You absolutely can do A/B testing; you just have to understand the statistics of it well enough to know that there is a place where you can accept much lower levels of significance than 95%,

as long as you're using sample sizes in the right way, testing on upstream metrics rather than eventual conversion, and all that sort of stuff. You absolutely can do it. And even if it gets to the level where it's really, really imperfect,

it's still better than not doing it — always slightly better than not bothering and just guessing. Again, the whole point is that you start learning: you might run tests that aren't perfect statistically, but you will learn from the data you get off the back of them, and the post-test research, and all that sort of stuff.
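A minimal sketch of what "accepting a lower significance level" can look like in practice — a standard two-proportion z-test on hypothetical low-traffic numbers. The visitor counts, the upstream metric, and the relaxed 80% threshold are all assumptions for illustration, not figures from the episode:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided pooled z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical B2B-scale traffic: 400 visitors per arm, measured on an
# upstream metric (e.g. lead-form starts) rather than eventual conversion.
z, p = two_proportion_z(conv_a=40, n_a=400, conv_b=58, n_b=400)

for alpha in (0.05, 0.20):  # strict 95% confidence vs. a deliberately relaxed 80%
    verdict = "significant" if p < alpha else "not significant"
    print(f"p={p:.3f} at alpha={alpha}: {verdict}")
```

With these numbers the test misses the conventional 95% bar but clears the relaxed one — the trade Jonny describes: a deliberately higher false-positive risk in exchange for being able to learn at all on low traffic.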

Guido: [00:25:32] And assuming your company grows and your marketing channels grow, it will be very good preparation for when the time comes and you can run those tests — or run more tests, anyway. So, we're here to talk specifically about your article on embedding experimentation.

First off, let's ask the question: embedding it into your organization, or into your agile product development, is presumably in contrast to having it as a standalone team. What are the differences there? Why should you embed it in your whole process?

Jonny: [00:26:14] Well, just before we get onto that, there are a couple of things, really. Sometimes what I mean when I talk about embedding it is, at a simpler level than agile and organizational structures, simply having a properly defined process for how you get something from experimentation into production development. Where that comes from is that we see this a lot with clients: there may be an internal CRO person, or an agency, or something like that,

and they're running tests, and some of those win, and they're effectively saying: this won, you should push this live. What you find a lot of the time is that that stuff can just linger and never actually get produced. That's why you get people who have tons and tons of tests running at a hundred percent in their A/B testing platforms — because they can't get those tests pushed live.

And you've got to think: why is that? Why can you not do that? It's because of some disconnect between the way production web development — the operation, and ultimately the queue and prioritization — is run, and what those tests are. For example, a lot of the time the output of an A/B test is quite a simple change, and

developers often don't really like to work on small things. You can have an internal dev team whose developers naturally gravitate towards bigger, more interesting projects — particularly ones that involve working in new and different code bases, like React and things like that. Or —

Guido: [00:28:17] Give me a new framework and all that.

Jonny: [00:28:19] Yeah, exactly. So there are things like that which mean a simple test — just a content change to a page that needs pushing live — can just linger. Similarly, there's the HiPPO effect, where you might have senior people saying: I need this done now.

Those are the kinds of bigger things that jump to the top of the dev queue. So the CRO team doesn't really get any traction, because they're not seen as having the authority to dictate what goes through the production web development queue. And there's a ton of other things like that which are very easy to overlook,

and a CRO team or CRO person just ends up, as I say, leaving everything at a hundred percent because they can't really do anything else. So the simple thing is: people just need to think about that — think about what the process is.

Guido: [00:29:22] I saw this with a company — very recognizable. All the changes that we suggested, which came from research and A/B tests, were added to the bottom of the development queue

and never really got to the top. After a couple of months of implementing nothing, conversion suddenly went down — because they had changed the whole pricing model of their service, which obviously has nothing to do with the usability of the website. But that's where they looked: hey, CRO guys, conversion is going down — what should we do? Well, we have this whole list of currently fifty improvements to the website that you haven't implemented yet; let's do those.

So they picked up all these items in one or two two-week sprints, and conversion jumped. They were a price comparison website: I think conversion before was around 32-ish percent click-outs, and after implementing those fifty changes they were up to 43. That's a massive difference — and all from small changes.

Jonny: [00:30:33] Yeah, I know — a lot, sometimes, isn't it? But with agile particularly, on a bigger level than what I was just talking about:

the thing with agile is that in a company like Sky you've got a very, very big team of developers, split into squads and tribes and all that kind of stuff. One of the original concepts of agile, based on the manifesto, is that it's about those little groups of people working as an autonomous unit — being responsible for a product or area of the website, and having a level of autonomous control over it.

And if it's going to work the way it's really supposed to, that autonomous unit should be feeding directly from customer insight and customer data, and adapting the product accordingly. That's how it's meant to work. It doesn't always work like that — when people say they've "got agile", it really often isn't like that. It's often just a way for seemingly more senior people to get things done faster than they otherwise would.

However, that is how it's supposed to work. So agile should inherently be experimental, because the idea is you've got this product squad in an agile environment,

And the part of the manifesto that talks about customer collaboration means that you should be listening to customers and constantly feeding from customer data to understand what the improvements are that should be made. Well, that is research and analytics; that is customer research and analytics.

And then the other parts of agile, which are about MVPs and all that sort of stuff, are ultimately about de-risking what's happening by gradually developing features in a way that they can be extensively tested and developed in an iterative manner. And that's experimentation.

So really, if you look purely at the agile manifesto, then agile and experimentation are effectively the same thing. That is what it's supposed to be about. So for that reason, if you look at it in a very, very pure way, you shouldn't really need to embed experimentation into it, because it already is that, or it should be.

Guido: [00:33:20] So it's mainly a branding problem that we have.

Jonny: [00:33:24] Yeah, a hundred percent it's a branding problem. But it's more that agile isn't, and hardly ever is, implemented in the way that it was probably originally intended. It's very often just simply seen as a way of working for developers.

And if you think about that pure way it's meant to work, where you've got this autonomous team that are constantly feeding off customer data and developing things, well, that's not really how it works. You've got a little squad of developers, and they get given stuff

by somebody, say a product owner. And that stuff gets prioritized according to who shouts the loudest and things like that in the organization. And none of it would ever really get tested, because somebody said "do it."

Guido: [00:34:27] Whether it works or not, yeah.

Jonny: [00:34:29] Exactly. And in that respect, the agile team will be disassociated from the responsibility for whatever it is that they're producing.

So it's an agile team in an engineering department that is delivering stuff that they're told to by the marketing team, or by the sales team, or whatever. And so they're almost like a little internal agency that is just funded by somebody else to do stuff.

Guido: [00:35:07] Very demotivating, right? For those people.

Jonny: [00:35:09] Slightly, yeah. Especially when they are, I think, acutely aware of the way agile is meant to work. So you've got this situation that's a little bit like a company using an external dev agency, where that dev agency will not typically challenge what it is that they're asked to do.

They'll just do it. And also, why would you, as that engineering team, want to test those things when you're not responsible for the commercial output of them? So that's where it all kind of goes astray, really. Then, just back to your very original question:

when people actually talk about embedding experimentation into agile, that comes from that place where most people have probably got a CRO team over here, and then agile developers, or even an external agency, over there. Just back to what I was saying before: here's this big list of stuff that they're working on, and then there's CRO over here. And really, what it should be about is this:

there's a stage of stripping away the concept of CRO temporarily and saying, right, what is going on here is that you've got research and analysis and data that should be telling us the kinds of stuff we should be doing, and a process of testing and learning that basically de-risks what we're actually doing.

And so when I talk about embedding it at that level, it's just bringing those things together. If you've got a CRO team over there and you've got engineers over there, you've got what you need; you just haven't put them together in the right way. So you've got the expertise in data analytics and user testing and research and things like that.

You've then got the expertise of how to form that into ideas and prioritize those ideas and all that sort of stuff. And you've got the developers to help execute it. You just haven't brought them together in the right way.

Guido: [00:37:10] So when you encounter that at a prospect or a new client, where do you start?

Jonny: [00:37:17] I generally tend to start with what I was saying at the beginning, around the economics of it. You try to get people to understand the economics of it, and the fact that they are doing things to the website that, even though they seem like no-brainers, could be having a detrimental effect.

The other thing I really try to get across to people is that it's not actually that difficult. You have conversations around this stuff with senior people in businesses, and in their minds it's "oh, a transformation project,"

and it sounds really complicated: business transformation, "we're going to have to get McKinsey in" or something. And it isn't really that hard. At the end of the day, when you really, really strip it down, what it is, is better prioritizing the queue of stuff going into web development.

That's literally it. You've got a queue of stuff; you've got Jira with a big list of stuff that the engineers are going to work through. What is the best way to (a) put stuff into that list and (b) organize that list so that they are working on it in the right way and it's giving the best commercial output?

And that is ultimately all it is. That's the absolute output of experimentation. So getting them to see that, it all becomes a little bit simpler. And then the idea of just reorganizing people and things like that gets a bit easier. But then you've got questions like: what's the right model for it?

Do you try to democratize it and get product teams actually running experiments? Or do you have a small group that is running experimentation for them? That's where I come back to this idea of properly embedding experimentation, which is, in a way, what we managed to do at Sky to an extent.

Really going back to the idea of what agile is: you should have a conversion optimization, or experimentation, consultant, or whatever you want to call them, aligned to each different product area. And their responsibility is ultimately feeding research, analysis and ideas into the product team,

and, to an extent alongside the product owner, managing the process of experimentation: what gets built, how it gets experimented on, how you go from small tests to big tests and stuff like that. That is their role within that team. And if all of those different consultants are either literally together as a team under some sort of general line management, or they're just a virtual team,

depending on what their reporting structures are, then they can all work together to ensure the development of that function, of that concept, in the business. Now, I think that's really important, because again, going back to the idea of agile being autonomous units, I don't think people working in a proper agile way generally like outside input if they can help it, because that is really what agile is supposed to be.

It's supposed to be that they're responsible for that. And if you're able to put that within the squad, to make it a part of the squad and involve everybody in it, so that literally in the daily stand-ups they're discussing the experimentation process and discussing what gets tested and all this kind of stuff, then

that, to me, works much better and gets much, much better output. I think if you don't do that, you'll find a fair amount of resistance. If it's just somebody over in the BI team who does experimentation, kind of telling us what we should test and stuff like that, then you get a lot more resistance.

Guido: [00:41:42] I usually see the CRO specialist, or the CRO team, as the body that advises the product manager on what to do. But it's still the product manager's choice whether they're going to implement it or not, because as a CRO team we cannot necessarily see the whole impact of implementing something, and all the business impacts of that.

I mean the development or the logistics of creating something. We can say, okay, this is the impact it will have on your users, but then it's still up to the product owner to decide: does it actually make business sense to do it?

Jonny: [00:42:15] Yeah, absolutely. I think the other interesting part is that, on one end of the scale,

if you have a CRO team that has too much responsibility for actually executing every single part of the process, then they become a real bottleneck. If every product manager or product owner has to go and say, "right, we want to test this now,

can you develop this test?" and it's all going through them, then it becomes a bottleneck. But the absolute opposite of that is to just give testing capabilities to every different product team and allow everybody to test everything. And that's completely anarchic.

If you've got people running tests at that kind of scale, things like how to measure them all become a problem. So the sweet spot is somewhere in the middle: you might have a little center of excellence, or somebody at least, who is responsible for developing things like ways of measuring and processes that can be shared across the teams.

And then you've got, as I say, that sort of federated view, where you've got individual people aligned to those squads who are helping that process and helping each squad run their tests. Then you've kind of got the best of both worlds. I mean, that's highly dependent on the organization itself, the way it's structured and what's right for it.

But I think that's ultimately the goal with that sort of embedded view: you've got that balance between having real experts, who are maybe simply championing the whole thing, through to expertise in each squad that's actually helping deliver it.

And then ultimately you are really empowering the agile squads and engineers to really deliver against it.

Guido: [00:44:58] Besides the mindset that people have, and of course that they're used to a way of working that might not be completely agile, what do you see as the biggest bottlenecks for companies to achieving this?

Jonny: [00:45:13] Well, like I say, I think just a fear of the complexity of change.

In big organizations, changing things and restructuring things can seem really, really complicated. But if you really break it down, it doesn't have to be that difficult. I think the other one is just that it's an unknown unknown. Like I was saying earlier on, I kind of have this mission to try to open people's eyes to the fact that this can be a way, way better way of working that can drive a massive amount of innovation.

But there are a lot of cases where, just like I say, people don't know that that's even a possibility. It just wouldn't cross their mind. It's not even a case of "oh, I wish we were like Booking.com" and all that kind of stuff.

They're just not really aware of it as a thing. So you've got to show people that it is a possibility before they can even start to think about how the change might happen. And then the other big thing is just different bits of cultural resistance to it.

And like I say, that can ultimately boil down to different political ways of working within the organization, different things about responsibilities and stuff like that: who feels like they own what, and how do you then introduce something new into that?

You've got to think about all that. It gets harder as the organization gets bigger, but the way I often advise people to start out with this is just to start testing things, and particularly to start trying to test really contentious things.

So if you can find two senior people arguing about the design of a page or something like that, and test it, then, you know...

Guido: [00:47:27] You already know they care about it.

Jonny: [00:47:29] Yeah. And also, one of them will be annoyed by the result, but it just massively raises the profile of experimentation.

The more you can get people involved in things, the better. And even if one of them doesn't win, you can then show: look, here's all this data that we got off the back of it, and there is something in your idea if you do it this slightly different way.

And so you kind of show the process of how that learning aspect of it works. And that, I think, is how you just start to chip away and get people interested.

Guido: [00:48:11] And you need to stay away from making it a personal thing. It shouldn't be a personal attack on someone: "you got it wrong,

you have the wrong ideas about this." You should probably stay away from that.

Jonny: [00:48:23] Yeah, absolutely. I mean, that's the other interesting thing about it: that can only be a problem if people have that sort of cultural fear of failure, which often is the case.

Guido: [00:48:42] If you say to, I don't know, a marketeer that their campaign was unsuccessful, or to a designer that their design didn't work, that often feels like a personal attack, and people might choke on that.

Jonny: [00:48:55] Yeah, absolutely. And in a lot of businesses, the whole hierarchical nature of business itself really leads people to think: well, I've been hired into this really senior position, I've been hired because of my expertise,

and I've got to demonstrate my expertise and my ability to come up with ideas. And if those ideas are proven not to work through a test, then that doesn't look good. But again, that's why, as with any kind of experimentation, the most important thing is the research that you do afterwards, because what you can learn from even something that loses can have such a massive, dramatic impact on something going forward.

So the most important thing to get across is that a test that doesn't win is not a failure. It's a very, very important piece of data that will lead to something successful. So, in a way, that idea was a good idea. It always is a good idea, because you're going to learn something from it.

Guido: [00:50:05] You just mentioned resources. Assuming an average mid-sized company, what kind of resources do they need? What would the team look like if you had a free hand in assembling one? What skill set needs to be there to do this in a proper agile way, in a way that can really work?

Jonny: [00:50:34] Yeah. I usually present it in three chunks, really. Well, maybe four, but I kind of think of it as three broad areas: research and analytics, design and creative, and development, basically. And you can map out a whole set of different skill sets in each of those three broad categories that you need for

any CRO function to develop, but that doesn't necessarily say where those skills should sit. In a really small company, you could try your best to find one person who can do all of them. But in a bigger company, there might be skills distributed across different teams that all need to come together in different ways.

So in research and analytics, you've got digital analytics; you might need more traditional offline data analytics for CRM data and stuff like that; there's user psychology, user research... you can go through all these different sorts of things.

In design and creative, the most important thing, really, is probably copywriting. You can get quite far without what you might call UX design or visual design, but eventually you are going to need those skills. You've then got things like broader design thinking. And then in development:

again, you can get reasonably far without much, if you're using a WYSIWYG editor and you've got basic stuff, but then very quickly you're going to need front-end development. Eventually you might need back-end development of some description. And alongside that there's always the whole business of targeting and tool implementation and stuff like that.

So there's a whole range of skills like that. And any organization thinking seriously about CRO just needs to think: what is our plan for where we get all this, in the short, medium and long term? For a large organization, all those things probably exist somewhere.

It's just the process, and the ability, and the license for those things to be utilized where they're needed. You may well have skilled digital analysts who sit within some sort of central BI team associated with Adobe Analytics or something like that, and they may not be structurally part of anything to do with the website development at all. So if this CRO function, whatever it looks like, needs to use advanced digital analytics, then how is it getting that?

How is that almost cross-charged and paid for? Is that structure right? So you kind of have to map out: these are the resources needed, and this is where they come from. And again, with that sort of embedded agile model, agile teams are always going to lean on resources from other areas.

You might have a specific user experience design and visual design team. So how does that work? What are the internal commercials, almost? How do they get time, and how do they secure time from them? You have to think of it almost like designing a little business within the business: these are the resources we need.

And what I always advise people is: you can get quite far just using Fiverr and things like that, freelancers and stuff. So again, thinking about that map, you've got to think: where can we just start to get bits of this externally, until such point as we know we need to hire it internally?

Guido: [00:55:00] And you can just hire those skills, right? I mean, it's important to have some knowledge build-up inside your company, but execution, especially in the beginning, you can just outsource.

Jonny: [00:55:10] Yeah.

Guido: [00:55:11] Yeah. Thanks so much; our hour together has flown past.

Jonny: [00:55:17] Yeah,

Guido: [00:55:19] I think there's a lot of interesting information.

I think it can help a lot of people, if they come to a company where agile is already implemented, to reframe this: maybe it's not a scary new thing that we're trying to do; it's just trying to do agile properly. I think that might help a lot of people in achieving their goals of embedding CRO and experimentation in agile.

Yep. So, you've been working in this field for quite some time now. So my next question for you is: what's the insight that you think you might have that others in the industry don't?

Jonny: [00:55:59] So I think you see that a hell of a lot of people working in CRO, as we call it, have never really had the client-side

experience. And I think traditionally CRO has been a kind of black-box thing where, as a consultant or an agency, you'd go to a company, go away and do this testing for them, and come back and tell them what works. And I think we're at a tipping point where that is just

not going to be how people see CRO in the future. It ultimately is something that should be in-house and controlled within the business, because what it really is, is a way of working around web development and digital innovation. But making that work in a business, and the things you need to change in order to do that, is a very specific thing.

And having done that, having built that client side, and with all of the consulting work that I used to do as well, I think that's really what I can bring to the table: how do you adapt the structure internally in an organization to make it work? And that also just gives you a hell of a lot of insight about how to do the day job as well, in different ways, and how to ultimately make it work.

So that's really a big part of what we say is our USP as an agency, and every single person who works for me is ex-client-side in exactly the same way. And that's part of how we go out and sell what we do.

Guido: [00:57:52] Yeah, nice. And my final question for you: who should I invite on this podcast?

Who inspires you, or would be a great guest to share their experience here?

Jonny: [00:58:02] John Crowder. He works for me and is absolutely brilliant in this sort of field, but he's also just a really great personality. So yeah, he would be a really interesting person to have on.

Guido: [00:58:16] And what should we talk about?

Jonny: [00:58:18] I think you'd have to ask him. But he is, on the side, in a couple of rock bands and stuff like that. So he just has a lot of...

Guido: [00:58:35] So he's a proper CRO rockstar.

Jonny: [00:58:39] I can't imagine he would ever want to call himself that, but

yeah, something around that.

Guido: [00:58:49] Cool. So if you can introduce us, we'll connect and see if we can work something out for a future episode. Thank you so much, thank you for your time, and talk to you again soon.
