Episode #133 | July 22, 2020 | Season 2, Bonus Episode

Things that are always broken in Analytics

With Charles Meaden (Digital Nation) and Arnout Hellemans (Online Markethink)

We learn about those pesky things in (Google) Analytics that always seem to be broken... Live recorded pre-COVID-19 at Conversion Hotel 2019.


Transcript

Please note that the transcript below is generated automatically and isn't checked on accuracy. As a result, the transcript might not reflect the exact words spoken by the people in the interview.

Guido X Jansen: [00:00:00] Welcome to a bonus episode, recorded live with an audience, pre COVID-19, at the Conversion Hotel conference in November 2019 on the island of Texel in the Netherlands. This session is with Arnout Hellemans, freelance SEO and analytics specialist from Online Markethink, and Charles Meaden from Digital Nation in the UK. The topic of this session was things that are always broken in analytics, and my first question asked Arnout what the broken things are that he usually encounters when starting with a new client.

Arnout Hellemans: [00:00:35] Anything except for Google Ads tracking. The rest is shot.

Guido X Jansen: [00:00:40] because that's what they

Arnout Hellemans: [00:00:43] it's, that it's just, it's a tool.

In my opinion, it's a Google analytics is a way to sell ads. If it's the only one that gives you proper ROI reports and everything that one works, the rest you have to set up manually using UTM, tagging. yeah, pretty much.

Guido X Jansen: [00:01:02] Is it usually that your clients come to you to fix those things, or do they want something else to happen? And then...

Arnout Hellemans: [00:01:08] You probably know the answer. They come to me because they want to grow. And then I look at the data and I say: 60% of your traffic is direct. We might first want to try and fix that.

Guido X Jansen: [00:01:18] Yeah. Or you can already fix those KPIs by just fixing that.

Arnout Hellemans: [00:01:22] Exactly. And then the

Guido X Jansen: [00:01:23] already

Arnout Hellemans: [00:01:24] reach. I think by default there's hardly anything set up correctly, and even in my own projects I regularly find stuff that is broken that I still need to fix.

Guido X Jansen: [00:01:36] So why is that? Why is the default so hard to set up correctly?

Arnout Hellemans: [00:01:42] I think there's a huge misunderstanding about things like direct traffic. Just ask anybody in your organization what direct is, and they go: hey, people typing in the URL. Really? Because it isn't, and we know that. So I think, yeah...

Guido X Jansen: [00:01:59] So we're here to learn. Arnout, tell us.

Arnout Hellemans: [00:02:01] Well, there's plenty of ways. My biggest eye-opener was just last year, when I really started understanding that it's either the HTTP referrer that defines the channel, or it's a UTM tag. If neither of those is filled, it basically ends up being direct. So with more apps being there, for instance for publishers, where you basically publish feeds of your content: if those are native apps and they're not UTM tagged, they become direct. So it feeds the other

Guido X Jansen: [00:02:32] category.

Arnout Hellemans: [00:02:33] Okay. Yeah, but they name it direct. So it's just.

Yeah.
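The rule Arnout describes (a hit with neither an HTTP referrer nor a UTM parameter ends up as direct) can be sketched as a tiny classifier. This is a simplified model for illustration only, not Google's actual channel logic, and the function and field names are assumptions:

```python
from urllib.parse import urlparse, parse_qs

def classify_channel(page_url, referrer=None):
    """Rough sketch of how a hit ends up as 'direct': no utm_* parameter
    on the landing URL and no HTTP referrer leaves the tool with nothing
    to attribute, so it falls back to the direct bucket."""
    params = parse_qs(urlparse(page_url).query)
    if "utm_source" in params:
        return params["utm_source"][0]       # explicit campaign tagging wins
    if referrer:
        return urlparse(referrer).netloc     # otherwise the referrer defines it
    return "(direct)"                        # neither filled: the catch-all

# A native app opening a publisher's article usually sends no referrer:
print(classify_channel("https://example.com/article"))                       # (direct)
print(classify_channel("https://example.com/?utm_source=newsletter"))        # newsletter
print(classify_channel("https://example.com/a", "https://www.google.com/"))  # www.google.com
```

This is why untagged app traffic lands in direct: the app opens the URL with no referrer and no UTM tag, exactly the first case above.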

Guido X Jansen: [00:02:38] Yeah. Charles, welcome. Anything you want to add on what's usually most

Charles Meaden: [00:02:42] broken? I have a list of 145 common issues, and these are things that keep on adding and adding. Now, a couple of years ago we had a client with 60% direct traffic. Straight away you go: that's wrong. We looked at the usual stuff, and the bounce rate was all over the place, and it wasn't a single page. What it turned out to be was a programmatic bot built to randomly simulate user journeys. Now, the client had not noticed this for six months; they had reported these figures back to the board. So the biggest problem is that the devs who set it up just assume. They've been told to put Google Analytics on, and they think everything should work.

And then all the clever stuff. So the classic thing is: let's not have a thank you page on a form. And then somebody says, I've spent a million on marketing, where are my thank you pages? The biggest thing that is broken is a lack of education: a lack of marketing people saying what they actually want rather than just "put Google Analytics on", and a lack of devs who've been told to do anything else. Somebody gives them a code and goes: there's the code, it's done.

Guido X Jansen: [00:03:42] We should stop doing things with bots then? Even after this discussion, we shouldn't be messing with the bots?

Charles Meaden: [00:03:50] You could do stuff with bots.

Arnout Hellemans: [00:03:52] Alright, I love bots. Crawlers are bots, and if you want to test a journey it makes perfect sense. But there's plenty of things. I think the really interesting one was where people basically had CPA bidding on their Google Ads based on a goal in Analytics, and somebody changed the goal. So instead of measuring, say, 10 or 20 conversions, there were suddenly like 300, and they had set the target CPA to 50 euros without a budget cap. So literally Google went all out, because the signal was "you're doing great", so it kept on spending more. They ended up spending about 10K a day instead of just a few hundred. And

Charles Meaden: [00:04:37] that's a classic example of set it and forget it. People set it up, somebody moves on, the next person takes over the analytics. And sometimes we look at the goals and go: why have you got a goal for summer 15? Do you need it? They go: oh, somebody else set it up and I don't want to touch it. We did one for an exhibition company, and they had GA accounts on every page. And I said: what are the others? I've only got access to one. He said: yeah, those other three, we don't know what they do, but if we turn them off, somebody from America is going to shout at us. Is it causing any damage? And we were like, no, because it wasn't disrupting the data. But in that case, this is a big billion dollar company, and the Americans were just not talking to the team because they didn't see the need.

Guido X Jansen: [00:05:18] That's... I'd give it a try and turn it off, see what happens.

Charles Meaden: [00:05:21] They really didn't want to. There was a real fear in that organization.

Arnout Hellemans: [00:05:25] The other thing that happens is when you start measuring the right goals, and their KPI is to reach certain goals, and it suddenly becomes less. They go: nah, don't do that, because I might lose my bonus.

Charles Meaden: [00:05:38] Oh, we came across an organization that was bonused on the bounce rate. And we came in and said: but the bounce rate is fundamentally wrong. And they said: do you have to tell our boss about this? He's paying us. And the boss went: okay, can we change it after Christmas? Like, I can't afford to lose my team. It's a bad thing to bonus on. It's a really stupid thing.

Guido X Jansen: [00:06:01] So is there any KPI in Google Analytics that we can use for bonuses?

Arnout Hellemans: [00:06:07] Yeah, of course. So the one thing that I started doing about nine years ago: I don't like the revenue field in Google Analytics. So in e-commerce (and I tend to use e-commerce tracking for any client, instead of goals, the reason being that in any report you can just select e-commerce and it will tell you how much you make on the sessions from a certain device, from a certain location, whatever), instead of sending revenue, I basically send profit. Now, a lot of objections are like: yeah, but they don't really want to tell me the profit. I've been there, that's confidential, and blah, blah, blah. And then I go: give me a ballpark figure. Is it 0.2, 0.3? What's your average profit per sale? And then just do revenue times 0.3 and send that. And then suddenly it becomes really clear where you're making the money.
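Arnout's trick, sending a ballpark profit (revenue times an estimated margin) instead of raw revenue as the e-commerce value, is a one-line transformation before the hit goes out. A minimal sketch, with a made-up 0.3 margin:

```python
def ecommerce_value(revenue, margin=0.3):
    """Send estimated profit instead of revenue as the e-commerce value,
    so any report that segments by value shows where money is made.
    The 0.3 default stands in for the client's ballpark margin."""
    return round(revenue * margin, 2)

# Two orders with the same revenue but different margins now tell apart:
print(ecommerce_value(100.0))        # 30.0
print(ecommerce_value(100.0, 0.1))   # 10.0
```

The point of the design is that once the value field holds profit, every built-in breakdown (device, location, channel) becomes a profit report for free.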

Guido X Jansen: [00:07:01] Yeah. Do we already have some questions from the audience,

Arnout Hellemans: [00:07:05] Nick?

Guido X Jansen: [00:07:09] You had a question, right? No? Not anymore. Think about those questions, people. So, you spoke about doing this for larger clients. Does it help if you're now working for smaller clients, thinking: oh, all this crap, I don't want to deal with it, let's move up-market, let's work for larger clients, they will have this all figured out?

Charles Meaden: [00:07:26] No, they don't. They seriously don't, because the same issues apply. The shortest time we ever implemented enhanced e-commerce was four days, and that was for a big client, because their devs were so switched on. I'm working for a very large UK retailer, and we are now a year into the project, because the marketing team is on a completely different team to the team that runs the website, and it's politics and it's silos. And, how should we say: if we get enhanced e-commerce in, we can make the ad spend a lot more effective. But the e-commerce team sees this as a hassle, and there's no amount of education you can do. Smaller clients often see the benefits quicker, especially if it's a founder who owns it and you can say: I'm delivering more revenue to you. That's it.

Guido X Jansen: [00:08:18] Yeah,

Arnout Hellemans: [00:08:19] So I love working for smaller companies. However, the problem is they don't have the money, time or resources to implement loads of this stuff. So it's always that sweet spot; I'd love to work for scale-ups. But the larger the company (this is my personal view), and especially old-school companies that started to do e-business, basically, it's fundamentally fucked. Yeah, it's really that bad.

Guido X Jansen: [00:08:50] So how do you approach it when a client comes to you and you need to do your work? How do you start out? How do you give them an estimate: okay, we can start, but we need X days, weeks, months to fix stuff? And how do you sell that?

Charles Meaden: [00:09:06] Personally speaking, we demonstrate what we've done for all the other clients. Then we say: out of our 500 checks, these are the issues we found, and that wakes people up. Yeah. And we'll say: I can't tell you what it's going to take to fix it until I know what the baseline is. So we come in and we say: these are the issues, this is the impact they're having, and this is the time it will take to fix them. The classics: small organizations don't block their own IP addresses, and somebody internal has been on the site, or they've been testing. Or, as we discovered the other week, people were filling out sales forms on the website themselves: they were taking phone calls and then filling them in on the website, so of course their website activity looked fantastic. Yeah. And I think the best we saw was the company where we said: are you selling to the Philippines? And they were like: no, we don't sell to the Philippines. And 90% of all their traffic was from the Philippines. It turned out all their staff are based in the Philippines and they'd put the GA tag on the external site. It was just a nightmare: the conversion rate looks really good, but they would say, our cost per acquisition has gone through the roof. Yeah, because you keep advertising to the wrong people. So it's demonstrating to people, and then once you've demonstrated you know what you're doing and there's a route out of there, then you can say how long it's

Arnout Hellemans: [00:10:27] going to be. Yep. I often do a little bit of a quiz. I basically ask them: what is direct traffic? And literally I've only had one client that was properly able to explain it to me. I ask the same for a bounce. Does everybody know what a bounce is here?

Guido X Jansen: [00:10:47] Yes, we all know, but please explain.

Arnout Hellemans: [00:10:50] I thought I knew, up until the moment I was at Conversion Hotel number one, where she shut Muller basically opened my eyes to what an actual bounce is. An actual bounce is that only one interaction is measured in GA. So for instance (I often use this example) you write a blog post on how to fix your phone when it's got wet: put it in a bowl of rice, a complete how-to. Some people just go to their computer, they Google it, they go to that page, they scroll all the way down, and they won't go to any other place on that blog, because they've achieved their task. They wanted to know how to fix their wet phone, so they put it in a bowl of rice. They close the browser. That page has a 98% bounce rate. But does that mean it's a shitty page?

Guido X Jansen: [00:11:55] Depends. Why? Are you checking how many people have fixed their phones?

Arnout Hellemans: [00:11:59] But what I'm saying is, you can't really track that, can you? So the way to do this, in these kinds of cases, what I tend to do is send an event when people scroll to three quarters of the page, which equals an intent: that page was as interesting as I wanted it to be.
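Since a bounce is simply a session with only one measured interaction, firing an extra interaction event at 75% scroll changes which sessions count as bounces. The effect can be sketched on the reporting side like this (the session structure here is hypothetical, purely to show the arithmetic):

```python
def bounce_rate(sessions):
    """A session 'bounces' when it contains exactly one interaction hit.
    A scroll-75% event counts as a second interaction, so an engaged
    reader who never views a second page stops inflating the rate."""
    bounced = sum(1 for hits in sessions if len(hits) == 1)
    return 100 * bounced / len(sessions)

# Same three visitors, with and without the scroll event in place:
without_scroll = [["pageview"], ["pageview"], ["pageview", "pageview"]]
with_scroll    = [["pageview", "scroll75"], ["pageview"], ["pageview", "pageview"]]
print(round(bounce_rate(without_scroll), 1))  # 66.7
print(round(bounce_rate(with_scroll), 1))     # 33.3
```

The content didn't change; the measurement did, which is exactly the wet-phone blog post scenario.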

Charles Meaden: [00:12:20] And we also do that. Then we put a timer tag on, so every 30 seconds. We had a client, a charity, and we took them from 3,000 to 90,000 a year, because the last website was so bad, plus a couple of good links from the BBC and the Daily Mail. And they said: oh, this is brilliant, but 90% of all our traffic leaves after a page, and in their mindset this was terrible. We went: well, we'll put a timer on there. And then we put some scroll tracking in, and lo and behold, they were there for three and a half minutes. But again, it's an education point of view.

Guido X Jansen: [00:12:52] Yeah.

So I hear a lot about, misaligns goals and, bonuses. Is that something that improves in larger organizations or

Arnout Hellemans: [00:13:00] Yeah, but you have to go high up. It also depends on where you get hired. If you get approached by a marketing team, they are probably in the silo of marketing, so they just want to make the analytics work for their business case. Whereas if you can get higher, say C-level at a company, or a founder or whatever, then you can define what your actual business goals are: what are you trying to do? Then start measuring that, and flow it down to those individual goals if needed. But otherwise it's really hard. And try to get in when bonuses get set, not when they're almost at the payout side, because then you won't get any changes.

Charles Meaden: [00:13:45] Wherever we go, we will say to people, if we're doing training or whatever: what information do you need to get yourself a pay rise? Don't tell me in terms of Google Analytics or Adobe; what do you need to get your job done? And then they'll describe it to you, because otherwise you're constraining them by going: oh, I need this and I need that. And then we say: if we do X amount of work, we can get you there. One of our clients is an old-school catalog company, and they've always lived off numbers. There's a guy who said old-school catalog was hard, because you had to send stuff in the post and it cost you money; you had to print and stuff. And for the last two years they have tracked every single click on their website, every single interaction. They've just done a big machine learning project, and they now know exactly what combination of clicks will get a sale, and that is then fed back. But they couldn't have got there if they didn't have that culture, that maturity, inside the organization. And they said: it's risky, we don't know if it's going to work. And you want to be moving people all the time towards that level.

Arnout Hellemans: [00:14:45] Yeah. And the other thing I'm seeing is with numbers. We've all had this question where people want numbers: what is a good conversion rate? What is a good conversion rate, anyone? However...

Yeah, exactly: it depends. But what I've noticed is that when we talk in terms of profit and actual hard dollars, euros or pounds, people really understand it. And I use a lot of analogies where I say: this is a store, this is somebody going in, these kinds of things. You've got a hundred people walking in and 99 of them just walk out, but they actually wanted those products. So you need to visualize it for people. They're looking at numbers, and I often tell people: they're actually people, not numbers. That frustrates me.

Charles Meaden: [00:15:39] We had that with a UK retailer; it's got 200 stores. And I said: do people act the same way in your stores? He went: no, completely different. The store up in Newcastle is different from London. I said: so why do people all act exactly the same way on the website? He went: oh, I see what you're getting at now. Because in his mind it was just a website visitor. So for him it was just a waste of money: I don't care about the website. Mate, you're doing a hundred million a year and you really don't want to spend any money on optimization? Because in his head it was still a waste of money.

Guido X Jansen: [00:16:08] Anyone in the audience that just wants to rant about their clients?

Arnout Hellemans: [00:16:14] Or rant about providers.

Guido X Jansen: [00:16:15] Yeah, go ahead.

Arnout Hellemans: [00:16:16] Yeah, I have a question regarding the process. If you look at finance, if you look at operations, they use a lot of data as well. And the first thing they do when they do an analysis is check the data, check the quality of the data. And then we marketing people come in, and we draw a conclusion first, and when we don't agree with the conclusion, then we start checking the data quality. It's more like a discussion on how we use data in our marketing field: we need to check for quality first, instead of just running around with insights and analyses, and paying bonuses based on that data analysis.

I completely agree. Couldn't agree more.

Charles Meaden: [00:16:57] So one of the first things we'll always do is look at their analytics and say: how close are you to your accounting system or your sales system? If you're more than two to three percent out, you've got a problem; if it's within two to three percent, that happens. Because one of the first things we find, as we go through all these accounts, is people saying: I don't trust the analytics system; the accounting system is what gets the wages paid. And some people say: I report on it, but I don't really believe in it. And that's normally why we've been brought in, because somebody's gone: this is ridiculous, we need to make business decisions based on valid data.
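Charles's two-to-three-percent reconciliation against the accounting system is straightforward to automate. A sketch, with the threshold and figures purely illustrative:

```python
def revenue_gap(analytics_revenue, accounting_revenue):
    """Percentage difference between analytics-reported revenue and the
    accounting (source-of-truth) revenue for the same period."""
    return abs(analytics_revenue - accounting_revenue) / accounting_revenue * 100

def trustworthy(analytics_revenue, accounting_revenue, threshold_pct=3.0):
    """Within the rough 2-3% band the tracking is usable; beyond it,
    fix the implementation before making decisions on the data."""
    return revenue_gap(analytics_revenue, accounting_revenue) <= threshold_pct

print(trustworthy(97_500, 100_000))  # True  (2.5% out)
print(trustworthy(88_000, 100_000))  # False (12% out)
```

Run over the same calendar period on both sides; mismatched date ranges are themselves a common source of false alarms here.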

Arnout Hellemans: [00:17:30] Yeah. So I just got a proposal for one of my clients, an internal proposal to do enhanced attribution. And the girl who wrote it was actually really smart and really good, but my reply to her was: let's first get rid of that 50% direct traffic that is currently in there. Because how, how in God's name are we going to do enhanced attribution if the data is not correct or not to be trusted? Otherwise we'll just get into discussions afterwards, as you mentioned, where they go: yeah, but Facebook is reporting that this happened, and you can't follow it. It's like: now we're trusting Facebook over what happened?

Charles Meaden: [00:18:17] We'll trust whoever makes us look good.

Arnout Hellemans: [00:18:20] Exactly, true. That's true.

Guido X Jansen: [00:18:22] next

Arnout Hellemans: [00:18:22] question. Oh, come on guys, I know you have a question. Cool. Alright. Marsha, how did you check the data quality of the implementation you were actually

Charles Meaden: [00:18:33] applying?

Arnout Hellemans: [00:18:34] You see that the data quality when you arrive is rubbish, and then a developer or

Charles Meaden: [00:18:40] yourself needs to implement additional tags.

Arnout Hellemans: [00:18:43] How do you make sure that the tags are good?

Charles Meaden: [00:18:48] Then you walk the customer journey. The very first thing I do is fire up Chrome with GA tools like the GA Debugger. I'll take 20 minutes looking at GA on one of the most common pages, and then I will walk through and attempt to make a purchase, and I'll do it on mobile. What are the most popular devices? I'll do that first.

Arnout Hellemans: [00:19:10] So in most product teams there are testers, but in this case, from a

Charles Meaden: [00:19:15] GA perspective? Most of the time you test yourself. Yeah, we always test ourselves, because we've often been brought in to audit an organization, so we're starting from scratch.

Arnout Hellemans: [00:19:23] So I tried to educate my tester to also look at GA. She did it the first time, but the second and third time she was like: yeah, that UA code was filled in. Yeah, but we were missing all these steps. And she was like: oh, but how can I test this? So one of the things that I'm looking into doing, because of a MeasureCamp session, is using Ghost Inspector together with GA Debug, to run it automated and then look at whether all the events fired.

So that could be a way to automate these kinds of tests, but I haven't built it yet. It's something that is in my mind to do. And secondly, ah, there's another one I see so often: they use a different GA code on acceptance, or they use the same one and they don't exclude the hostname. And then suddenly it flatlines. So another good one is to set up alerts for flatlining in GA, or huge spikes.
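The flatline alert Arnout recommends (a GA custom alert, or your own check on an API export) boils down to comparing the latest day against a trailing average. A minimal sketch; the 50% drop threshold and seven-day window are assumptions you would tune:

```python
def flatline_alert(daily_sessions, window=7, drop_ratio=0.5):
    """Flag a day whose sessions fall below half the trailing average:
    the typical signature of a tag removed in a deploy, or a filter
    suddenly swallowing production traffic."""
    if len(daily_sessions) <= window:
        return False                                  # not enough history yet
    baseline = sum(daily_sessions[-window - 1:-1]) / window
    return daily_sessions[-1] < baseline * drop_ratio

print(flatline_alert([1000, 980, 1020, 990, 1010, 1005, 995, 30]))    # True
print(flatline_alert([1000, 980, 1020, 990, 1010, 1005, 995, 1002]))  # False
```

The same comparison with the inequality reversed (and a ratio above 1) would catch the "huge spikes" case.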

Charles Meaden: [00:20:32] So, things like payment gateways: somebody in finance has put a new payment gateway in, and all of a sudden your top referring campaign is PayPal, or whichever one you've got, because nobody in finance thought it was their job.

Arnout Hellemans: [00:20:46] Yeah, exactly. So there's loads of these things that you really want to test yourself, primarily. Or I let it fail once or twice, and when I get the question about it, I go to the founder and say: we're missing this data, what happened? Because I want them to feel what it is to be missing this data.

Charles Meaden: [00:21:10] Our working assumption is that things are going to break. So we have early warning systems set up, which basically say: give us the top browsers, or whatever it is, then look back at this week, last week, and a running average. And then you'll know if someone's broken something. So a couple of years ago we did some work for a HeartCore company on a brand new checkout, and the conversion rate on day one was down 25%. Everybody went: this must be wrong. These were the days when Firefox was 25% of the market, and you couldn't check out in Firefox. Turns out they'd only tested on one browser.

Guido X Jansen: [00:21:43] Do you guys have any experience with outsourcing this, and automating this? There are, I think, services out there that you can set up.

Charles Meaden: [00:21:49] We spend a lot of time looking at how much we can automate this, but part of it is experience. So I use a lot of the tools around the Google Analytics API to get me information quicker. But at the end of the day, I've still got to look at the data and go: these guys have got 60% direct. No tool is going to pick that up; it's going to give me a hint, and I can look at that one. So one of the things we did recently: we worked out that through the API you can get the viewport and you can get the browser window. And anything that's got basically zero difference between the viewport and the browser window must be a headless Chrome, and therefore not a human being. And that was 5% of the client's traffic. And he went straight off to have a word with the IT department: somebody had set up their own server monitoring tool with the best intentions, but they hadn't thought about the impact. So it's quite nice. There's an SEO tool called Sitebulb, the actual Sitebulb. Yep, Sitebulb automatically doesn't fire the GA tag. That's it.

Arnout Hellemans: [00:22:45] But then, I would say... so I worked with the developers of Sitebulb, and they did the same thing with Adobe Analytics, because it was literally killing it, as it was trying to render every page. So there you can exclude anything. But even with tools like Screaming Frog, you can basically see in Chrome whether the GA tag is firing on every page. So these are all relatively basic checks you want to put in.
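The viewport-versus-browser-window trick Charles describes can be reproduced on an API export: flag sessions where the two sizes match exactly. A sketch on hypothetical exported rows; the field names here are assumptions, and the exact GA dimension names may differ:

```python
def headless_share(rows):
    """Fraction of sessions where the viewport equals the browser window.
    In a normal browser, toolbars and scrollbars make the two differ,
    so an exact match suggests headless Chrome, i.e. a bot."""
    suspect = [r for r in rows if r["viewport"] == r["window"]]
    return len(suspect) / len(rows)

rows = [
    {"viewport": "1903x955",  "window": "1920x1080"},  # real desktop browser
    {"viewport": "1920x1080", "window": "1920x1080"},  # likely headless
    {"viewport": "375x635",   "window": "375x667"},    # mobile, real
    {"viewport": "800x600",   "window": "800x600"},    # likely headless
]
print(headless_share(rows))  # 0.5
```

It's a heuristic, not proof: some kiosk or full-screen setups can also match, so treat the flagged segment as a candidate list to investigate, as Charles did with the IT department.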

Guido X Jansen: [00:23:13] Yup. And you mentioned Adobe Analytics. Most people, I think, use Google Analytics. Is there any inherent difference between those tools in how they handle this? Is one better than the other, or is it all the same?

Arnout Hellemans: [00:23:28] No. I only got to work with Adobe, say, in January. It took me a while to get used to it. It has some really cool features, but in general I still prefer GA. That's probably because 90% of the time I spend in GA.

Charles Meaden: [00:23:43] And it's still down to implementation. Yeah, you can't roll Adobe out of the box and expect it to just work.

Yeah.

Arnout Hellemans: [00:23:48] Yeah.

Guido X Jansen: [00:23:49] Hello? Yes, you have a question. Name and registration please.

Charles Meaden: [00:23:53] Marino Tai.

Guido X Jansen: [00:23:53] Hey,

Charles Meaden: [00:23:54] What's the moment you can trust the data?

Arnout Hellemans: [00:23:56] and maybe okay.

Charles Meaden: [00:23:59] Stop it.

Guido X Jansen: [00:24:01] That's

Arnout Hellemans: [00:24:07] that's an excellent question

Charles Meaden: [00:24:09] For the 30 seconds after you've run every test possible and nobody else has messed with the code.

Arnout Hellemans: [00:24:14] Yeah. So to me, it's when it feels reasonable. I know, a lot of gut feeling, but I double check everything. So for instance, a great check for me always is: you take direct traffic, and you take the landing page. If the landing page is not the homepage, it's probably a tagging problem somewhere. So these kinds of checks you can do. But if I've fully customized everything and it feels like it's the right way, I check it now and then. But there is no health score. Oh, I wish there was, but

Charles Meaden: [00:24:51] there isn't when my analytics data sings correctly with other data sources in the company.

So what I'm seeing the same revenue and the same products that I'm seeing, the sales order system, because that's the stuff going out of

Arnout Hellemans: [00:25:01] the warehouse. Yep. And the other thing: it is not trustworthy. One of the people here built a tool, and I've spoken about it, like two years ago, here. It's using the Measurement Protocol to spoof hits on websites, and there is hardly any way for them to prevent that from happening. There is, but it's quite complicated. So I think: do the best you can, and then stay with that.
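Arnout's earlier sanity check (direct sessions that land deep in the site usually hide untagged links) is easy to run against an export of channel and landing-page pairs. A minimal sketch on hypothetical exported rows:

```python
def suspicious_direct(rows, home_pages=("/", "/index.html")):
    """Direct sessions whose landing page isn't the homepage: people
    rarely type deep URLs by hand, so these usually point at an
    untagged email, native app, or dark-social link."""
    return [r["landing"] for r in rows
            if r["channel"] == "(direct)" and r["landing"] not in home_pages]

rows = [
    {"channel": "(direct)", "landing": "/"},
    {"channel": "(direct)", "landing": "/blog/fix-wet-phone"},
    {"channel": "organic",  "landing": "/blog/fix-wet-phone"},
]
print(suspicious_direct(rows))  # ['/blog/fix-wet-phone']
```

Sorting the flagged pages by session count gives a priority list of which untagged sources to hunt down first.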

Guido X Jansen: [00:25:28] We've been trying all our lives to get away from all those gut-feeling things and be data-driven, but...

Arnout Hellemans: [00:25:34] And Arnout basically says: just trust your gut, mate.

Guido X Jansen: [00:25:37] question one B or.

Charles Meaden: [00:25:40] Oh, it's more... I think you've both experienced a migration of an implementation from hard-coded to a tag manager,

Arnout Hellemans: [00:25:47] and it's the

Charles Meaden: [00:25:47] same implementation, and different numbers.

Arnout Hellemans: [00:25:51] And it's all...

Charles Meaden: [00:25:52] Coming back to

Arnout Hellemans: [00:25:52] the feeling again? No, that is different, because depending on how much JavaScript is being loaded, and when, that can have quite a significant impact.

Yeah.

Charles Meaden: [00:26:05] It has an impact, but you have to live with it.

Arnout Hellemans: [00:26:09] So I prefer not to use a tag manager for my analytics, but that's my personal opinion. Whereas

Charles Meaden: [00:26:15] See, I prefer it, I do, in the right hands. It's like anything: in the wrong hands...

Arnout Hellemans: [00:26:26] It affects us quite a bit. Quite a lot.

Charles Meaden: [00:26:30] Yeah.

Guido X Jansen: [00:26:32] Yeah. One final question.

Arnout Hellemans: [00:26:33] What is the most interesting technique you have experienced in decreasing your direct traffic?

Charles Meaden: [00:26:41] Getting people to put UTM tags on everything that moves. I know, that's the obvious answer, but yeah.

Arnout Hellemans: [00:26:47] Okay, so I have a deck on my SlideShare with six practical fixes you can do. It goes into search sources that are not being seen as organic and end up as direct. So there's a few things. The first one is adding the different Google app sources, so com.google.android.googlequicksearchbox as a referrer; you can basically filter on that. And the second one I've implemented is UTM-tagging your Open Graph URL for organic sharing. It's a pity Facebook itself doesn't pick it up, but tools like Slack and WhatsApp do, so you can basically see what people call dark social. Then you can make it a channel, as organic social. So that's another trick. But just look up the deck; it's on my SlideShare.
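The Open Graph trick Arnout mentions amounts to putting a UTM-tagged URL in the og:url meta tag, so that clients which honour it (Slack and WhatsApp, per Arnout; Facebook ignores it) carry the tags when a link is shared. A sketch of building such a URL; the utm_source and utm_medium values are just example choices:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def og_share_url(page_url, source="social-share", medium="organic_social"):
    """Build the URL to put in <meta property="og:url" ...>, so shares
    picked up via Open Graph land in an organic-social channel instead
    of swelling the direct bucket ('dark social')."""
    parts = urlparse(page_url)
    query = urlencode({"utm_source": source, "utm_medium": medium})
    return urlunparse(parts._replace(query=query))

print(og_share_url("https://example.com/blog/fix-wet-phone"))
# https://example.com/blog/fix-wet-phone?utm_source=social-share&utm_medium=organic_social
```

In GA you would then group that utm_medium into its own channel definition so the traffic stops reporting as direct.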

Guido X Jansen: [00:27:43] And for a further deep dive, Arnout will be at the bar tonight. Thank you so much, Arnout. Thank you so much, Charles. And thank you for the questions. We will have a small 10-minute break. Again, we will probably publish this session and the other sessions. If you want to be notified when we do, go to cro.cafe/subscribe, or subscribe in your podcast app. The next session will be with two fairly familiar faces, at least they should be: Emily Robinson and Lukas Vermeer. We talk about data science. The one after that: experimentation culture, with Kevin Anderson from ING and Denise Fisher from bol.com. And the last one, we're going to talk with Roger Dooley and Bart Schutz about psychology. So that's going to be interesting. See you in a bit. Thanks.


Get your cup of CROppuccino

Through our mailing list we periodically inform you when a special episode goes live, when there are professional events you might be interested in, or when our partners have an exclusive offer for you. You'll receive an e-mail roughly once a month, and you can unsubscribe at any time.


Some other recordings from The Conference Formerly Known as Conversion Hotel:

Some other episodes you'll (probably) also like:
