CRO Live Hour
#62 – A/B Testing Prioritization (ft. Ayat Shukairy)
Most people have a ton of A/B testing ideas – but if we're being honest, some test ideas are not worth it.
If I were you, I'd focus on tests with the most potential and biggest payoff.
But how, then, do you determine which hypotheses you should do first (or ignore)?
This is exactly what we're talking about in today's episode. Tune in...
QUESTION — Have a question about e-commerce Conversion Optimization, A/B Testing, User Research, Marketing, or Anything Else? Post in the comments section of this video!
Visit Our Website, And Learn More About Our Services:
invespcro.com
Follow Invesp on Social Media:
Linkedin
Facebook
Twitter/X
Read more about conversion rate optimization on our blog.
[00:00:00] Ayat: It's hard to necessarily decide, when I'm prioritizing, which ones are going to be the short term and which are going to have the long-term impact. As you work with the client a little bit more, you'll have a little bit more sense of what type of an impact it has. We do a lot of analysis also in terms of, for example, those types of experiments: hey, what was the LOE for this one?
[00:00:22] What type of an impact did it have on the bottom line? And what does that mean for the client overall in terms of the strategy that I'm going to put forth?
[00:00:31] Khalid: [00:01:00] Another week, another CRO live hour. Yes, we do have a guest today. I convinced Ayat, although I didn't really do any dragging. I'm like, hey, you want to join us? We have a very interesting topic to discuss. Simba, Ayat, how are you guys doing?
[00:01:48] Simbar: I've been good. I thought, like, maybe for the podcast listeners, before you say Ayat's name, we're going to have some drum roll. We can always have a drum roll, you know.
[00:01:57] Khalid: But I'm so horrible with these drum [00:02:00] rolls. So I'm not going to even put myself on the spot over here.
[00:02:03] Ayat: Yeah, I'm glad to be here.
[00:02:04] Thank you for having me. I'm always from afar looking at the CRO live hour. And finally I've been invited to join.
[00:02:12] Khalid: Yes, it took a while. It took us only about, I don't know how many episodes. Simba, every month I ask him that question. I'm like, how many episodes do we have? And we don't have an exact number.
[00:02:21] Because we've been doing it now for how long? More than two years. More than two years.
[00:02:26] Ayat: Wow. So two years later, Khalid finally decides, let me tell my co-founder to join.
[00:02:33] Khalid: Not that I ran out of things to talk about, because I can always talk about anything CRO related. But I thought for this episode, you probably have more insights. I have to keep up the tradition of CRO live hour before we get into CRO live hour.
[00:02:46] I have to tell our audience: I'm not sure if Simba is going to be joining us next Thursday for CRO live hour or not, because next week Liverpool is playing Man United, and one [00:03:00] of us is going to be very upset.
[00:03:01] Ayat: What I've seen in the past is that you guys completely cancel CRO live hour when there's a Liverpool game happening.
[00:03:08] Khalid: No, we've always been on, especially when Liverpool wins. I make sure to gloat. I'm like, Simba, come on, man, the whole company depends on this.
[00:03:17] And then we start and I'm just talking soccer. He's like, okay, Liverpool looks like Liverpool, but okay. Away from Liverpool. Simba, how's your week?
[00:03:24] Simbar: My week has been good. Busy as usual. Learning some new things and trying to implement like those things that I'm learning to our marketing. So yeah, it's been good.
[00:03:36] How about you?
[00:03:37] Khalid: You always keep it big, like learning new things, implementing them into marketing. I'm like, okay, tell us more, man, tell us more.
[00:03:44] Simbar: Like I was saying, I joined Invesp about five years ago now, maybe five years in the next two weeks.
[00:03:50] So that's interesting. The marketing that we used to do back then is different from the marketing that we are doing now. So for me, it's always a new learning process. Right now we're focusing [00:04:00] on an ABM campaign that we've been working on for a couple of weeks now; it's something that we didn't do before.
[00:04:06] So it's a learning curve for me. At the same time, I can see in some of the initiatives within that ABM program some of the parts that we used to do. We didn't really call it ABM, but right now that's the new thing.
[00:04:19] Khalid: That is very cool. Yeah, it's just fascinating to me, by the way. Every once in a while, we pick up something new. We're always improving our marketing. And sometimes I'm like, we really need to do this. I'll give an example of FigPi. For a long time, I knew we needed to do content for FigPi. But I'm like, it's not a priority right now.
[00:04:35] I mean, it's funny coming from the fact that we built Invesp based on using content. But I'm like, oh, we don't have the energy. And then eventually we decided; it took me like three years to decide. And then I'm like, oh man, I should have done it three years ago. Same thing with ABM. At some point you start with something, and that initial start is really rough and tough, and you're not going to get everything perfect.
[00:04:56] This is learning, and it's exciting as well. Very cool. And our [00:05:00] dear guest, Ayat, how is your week?
[00:05:02] Ayat: Week has been all right. For me, it's always more on like managing the team and, we have onboarding some new people, there's always excitement there because, obviously you're getting to know new members, growing all that good stuff.
[00:05:14] Yeah, my week has been pretty okay and it's the first week of Ramadan. So just also navigating that.
[00:05:21] Khalid: Yeah. Me and Ayat usually do Bootstrapped, which we did not record this week, but hopefully we'll record it. That's our episode where we give an update about our journey, on the Invesp side and on the FigPi side of things.
[00:05:34] So you can always catch Ayat there, and she appears on so many different podcasts. But by the way, I do hear through the grapevine that you might be having your own podcast or something. I don't know. So we'll find out, you know.
[00:05:45] Ayat: An interesting grapevine. I'd like to be tapped into the same grapevine that you're tapped into.
[00:05:49] Khalid: Many grapevines.
[00:05:50] So Simba, what are we talking about today?
[00:05:56] Simbar: Okay, cool. So today we have about five questions, [00:06:00] about A/B testing prioritization. The first one is: are there any specific frameworks or methodologies that you find particularly effective for prioritizing A/B tests?
[00:06:11] Maybe, before we get into this, let's just give an idea of what A/B testing prioritization is.
[00:06:18] Ayat: I'll jump in here.
[00:06:19] Khalid: She takes the easy question. Okay.
[00:06:21] Ayat: You know, again, for any type of A/B testing program to be successful, you need to be able to conduct research, right? Any type of research.
[00:06:32] This includes analytics research, user research, getting a little bit more information about what the problem areas are throughout the site that I can really enhance to create a better experience overall. And I always like to segment those. There are the bug fixes, which aren't really A/B tests.
[00:06:51] They just need to be fixed in order for the site to be more functional and better. Then there are more usability aspects. Again, you want a user-friendly site. So those [00:07:00] are following the well-known NN/g 10 usability heuristics and ensuring that, hey, we're addressing all of these on all the various pages of our site.
[00:07:08] And then finally the last tier, which is the most difficult, most complex experiments, where the CRO really needs to think a little bit more: the CRO fixes, where it's more persuasive, getting into the psychology of the visitor, trying to enhance the experience and elevate it even more than what it currently is, taking it to the next level, from a standard website to an even better functioning site that really enhances what your value is all about, what the company's all about, all that good stuff.
[00:07:37] You're collecting all this data and you have 150 different ideas. How do you decide what to do first? There are so many frameworks out there. Every single CRO company has developed their own. We, of course, at Invesp have developed our own. And of course I'm biased, but it's the best framework out there.
[00:07:57] What we did actually was look at some of the other [00:08:00] frameworks that competitors and other companies follow. There are some that are very simplistic, some that are super complex, and we've tried to find a middle ground in terms of ensuring that we're really addressing the issues.
[00:08:12] What is this particularly targeting? You fill out this form in order for us to figure out which one will actually go first. The one that scores the highest, essentially, is the one that should be targeted. So that's an overview of prioritization and how we've approached it.
[00:08:29] Khalid: I'll ask Ayat the question here, and it's just fascinating to me, because I've spent some time on this now. We have access to so much data at Invesp. Sometimes I'm like, oh, it's there, we just need to look through it. But just based on your experience, and I almost know exactly what I'm walking into, but nonetheless I will ask the question:
[00:08:46] How many conversion opportunities, between UX issues and conversion issues, do you usually uncover on a website? Is 30 normal? Is 50 normal? Is 150 normal? Before we even get into the [00:09:00] prioritization, what's a normal number?
[00:09:02] Ayat: I mean, it depends. As they say, all marketers... Simba, I don't know if you feel this way as well. I forgot who I talked to.
[00:09:09] I think there was somebody from Tenuity, this is what I remember, but it could be completely wrong. I forgot who it was exactly, but it was a digital marketer that literally has a shirt that says "it depends." They printed it out on shirts, because it always depends on the site, and it always depends on the company, and it always depends on so many different factors.
[00:09:26] But yeah, in my experience at least, we would be able to uncover anywhere between 70 and 150 different opportunities. And this is also considering we have something called the conversion framework, which is something where we're digging in a little bit deeper to understand where those trust factors are, the value proposition, continuity.
[00:09:46] So those take into account a whole other angle of A/B testing that not all companies necessarily follow. So yeah, typically when we finish any type of prioritization, that's the number that we usually end up [00:10:00] with. And then of course, as you A/B test, you're uncovering more opportunities.
[00:10:04] So you're adding a lot of those opportunities to your prioritization sheet and rescoring everything to ensure that you're still targeting the highest priority first.
[00:10:13] Simbar: I like that. Okay.
You mentioned something about the scoring of the tests that you have. Are there any certain factors that you look at when you're doing the scoring?
As in, when you're prioritizing a test and scoring each hypothesis, what elements of that prioritization framework do you use to do the scoring?
[00:10:39] Ayat: So there's a multitude of elements that we look at. For example, how the particular experiment was discovered: whether it was discovered simply through heuristics, which is usually a lower score, or through user research or analytics,
[00:10:54] which could be a higher score as a result. We also look at the complexity of the experiment or [00:11:00] the concept, because complexity can vary. Meaning, is it going to be difficult to get the higher-ups at the client to agree on it as well?
[00:11:11] A lot of times there's a little bit of a political battle that you have to consider. And then also the implementation, like how many hours it's going to take. If this is going to take twenty hours to implement, I really need to take a step back and figure out: does it make sense to put this as a higher priority at this point in time?
[00:11:26] So there's just a slew of different things that we'll look at, including the page value. Depending on the page value, the experiment is going to fluctuate in terms of score. If it's a higher-value page, it'll probably be a higher priority.
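The factors Ayat lists (how the idea was discovered, complexity and political friction, implementation hours, page value) can be sketched as a simple weighted score. This is a toy illustration in Python: the factor names, weights, and example ideas are invented for this sketch, not Invesp's actual framework.

```python
# Toy prioritization scorer. Weights and factor names are illustrative
# assumptions, not any agency's real framework.
DISCOVERY_SCORES = {"heuristic": 1, "analytics": 3, "user_research": 3}

def priority_score(idea):
    """Score a test idea; a higher score means run it sooner."""
    score = 0.0
    score += DISCOVERY_SCORES[idea["discovered_via"]] * 2  # evidence quality
    score += idea["page_value"] * 3                        # business value of the page
    score -= idea["dev_hours"] / 5                         # level-of-effort penalty
    score -= idea["political_friction"]                    # stakeholder buy-in difficulty
    return score

# Hypothetical ideas pulled from a research backlog:
ideas = [
    {"name": "Cart trust badges", "discovered_via": "user_research",
     "page_value": 5, "dev_hours": 10, "political_friction": 1},
    {"name": "Homepage hero swap", "discovered_via": "heuristic",
     "page_value": 3, "dev_hours": 40, "political_friction": 3},
]

# Highest-scoring idea goes first:
ranked = sorted(ideas, key=priority_score, reverse=True)
```

Tuning the weights, and checking whether several scorers produce similar rankings with them, is where the real framework work happens.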
[00:11:40] Khalid: Let me pick up from here and just talk about something.
[00:11:42] Because at least for Invesp, the prioritization framework has evolved over the years. At some point we used to score an idea against 18 different metrics, including how much traffic the page gets and where it sits in the funnel. And then there's [00:12:00] how we uncovered the issue.
[00:12:01] Is it based on the conversion framework? One of the elements of the conversion framework is what kind of research actually helped us discover the issue. So those are all things that help us score an element. But then what we discovered is, let's say you have a page that has high value. Say it's the cart page: it has a really high value, it's bottom of the funnel, it's getting a ton of traffic, and you end up with the first ten ideas that you want to work on all being on the cart.
[00:12:28] Guess what? I'm not going to be able to run 10 experiments at the same time. Unless you're eBay or Target, but even with companies like those that we've worked with, we did not do 10 experiments at the same time. You can run something called exclusivity groups, but we like to run one experiment per device, and sometimes it's responsive.
[00:12:43] So it made more sense, in the next iteration of the conversion framework, which has not been made public because we're still testing it with several clients, to say: you know what? Let's calculate the page value separately. And for that, we look at several metrics. How many [00:13:00] visitors come to the page?
[00:13:01] What kind of conversion rate does the page have? How long will it take us, and what kind of MDE would we get, if we want to run an experiment for three weeks with a certain number of variations? And then based on that, we do some revenue calculation that helps us understand each page's worth.
[00:13:19] If we have a winning experiment that generates a 10 percent lift on the homepage, it's probably worth, you know, $15,000. But if we have a 10 percent lift on the cart page, it's worth $6,000. So we start looking at those numbers to really prioritize the pages. And then we say, okay, the cart page is the top priority.
[00:13:37] And then maybe it's the checkout, and maybe it's the homepage, and then the category pages. Now, within the realm of the cart page or the realm of the checkout, what are the different priorities? So this has evolved over time.
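The page-value arithmetic Khalid walks through (visitors, conversion rate, a revenue estimate per lift, and the MDE a three-week test can detect) can be sketched roughly as follows. The traffic, conversion-rate, and order-value numbers are invented to reproduce the $15,000 versus $6,000 example, and the MDE formula is a generic two-proportion approximation, not Invesp's exact calculation.

```python
import math

def monthly_lift_value(visitors, conv_rate, avg_order_value, relative_lift):
    """Extra monthly revenue if the page's conversion rate improves by `relative_lift`."""
    baseline_revenue = visitors * conv_rate * avg_order_value
    return baseline_revenue * relative_lift

def rough_mde(visitors_per_arm, conv_rate):
    """Rough absolute MDE for a 50/50 two-proportion test (alpha=0.05, power=0.8)."""
    z_alpha, z_power = 1.96, 0.84
    se = math.sqrt(2 * conv_rate * (1 - conv_rate) / visitors_per_arm)
    return (z_alpha + z_power) * se

# Hypothetical numbers: a 10% lift is worth more on this homepage than this cart page...
home_value = monthly_lift_value(50_000, 0.02, 150, 0.10)  # ~$15,000/month
cart_value = monthly_lift_value(8_000, 0.25, 30, 0.10)    # ~$6,000/month

# ...but traffic and baseline rate also decide what a three-week, two-arm test
# can even detect (3 weeks of monthly traffic, split across two arms):
home_mde = rough_mde(visitors_per_arm=50_000 * 3 // 4 // 2, conv_rate=0.02)
cart_mde = rough_mde(visitors_per_arm=8_000 * 3 // 4 // 2, conv_rate=0.25)
```

Ranking pages by the revenue a plausible lift is worth, constrained by what the traffic lets you detect, is the idea behind scoring page value separately from individual test ideas.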
[00:13:51] Ayat: Yeah, absolutely. Again, when you're doing the prioritization, you might end up with two or three cart page experiments that are [00:14:00] consecutive.
[00:14:00] That's where I always say that the CRO plays a really critical role within prioritizing things, because you cannot be pulling three experiments for the cart page and just running those for your client. You need to be saying, no, I want to run on multiple different pages.
[00:14:14] So I'm going to look at what the top ones for cart are, but then, next tier, what's next after that in terms of pages? That way it gives you an idea of, okay, this is the structure that I'm going to run with for this particular month. Again, depending on our clients, sometimes we'll run four, six, or eight experiments, depending on how large our contracts are, right?
[00:14:32] Depending on that, we're going to look at the prioritization and determine which experiments we're going to be running during that month. But having that overview and deciding, okay, these are the next experiments for the next eight-week period, really helps, first of all, to ensure alignment with the client's business goals, because that's a huge consideration for us: are these experiments aligned with the objectives and business goals of this company?
[00:15:00] And then additionally, ensuring that we're really targeting all the various pages, because we don't want to be optimizing only one page. We want to be looking at that entire funnel and improving it across the board.
[00:15:11] Khalid: That was a lengthy answer for question one. Simba is like, hey, let's start here.
[00:15:14] And it's, God, man, guys, we have like five questions. Sorry.
[00:15:16] Simbar: It was a detailed one, and it was good. So the next question is: how do you balance short-term wins versus long-term impact when prioritizing A/B testing initiatives?
[00:15:27] Khalid: Ayat, since you've jumped on the first question, jump on this one.
[00:15:29] Let's see.
[00:15:30] Ayat: Well, yeah, it's kind of an estimation, right? Because at that point in time, you don't necessarily know what type of an impact this is going to have. We always say, for example, whenever we start a project, and we were actually discussing this just yesterday, that we're going to be targeting those low-hanging fruits.
[00:15:45] But even in that particular case, it really is highly dependent on the solution. So even if it's a low-hanging fruit that you found and uncovered, perhaps the solution that you put forth isn't one that actually impacted the visitors positively, and it caused a decrease in [00:16:00] conversion rates.
[00:16:00] So again, it's hard to necessarily decide, when I'm prioritizing, which ones are going to be the short term and which are going to have the long-term impact. As you work with the client a little bit more, you'll have a little bit more sense of what type of an impact it has. We do a lot of analysis also in terms of, for example, those types of experiments: hey, what was the LOE for this one?
[00:16:24] What type of an impact did it have on the bottom line? And what does that mean for the client overall in terms of the strategy that I'm going to put forth?
[00:16:33] Khalid: I think one of the areas where we still can do better is having those strategic conversations with clients.
[00:16:42] Every client comes to us and says, hey, I want to improve my conversion rates. Okay, Mr. or Ms. Client, we will help you improve your conversion rates. But really, we're not just helping them improve their conversion rates. Flat out: I'm helping you make more money online. That's exactly what I'm trying to do.
[00:16:59] And [00:17:00] hopefully the money that you make online is tons more than we charge you for the service. It's just very straightforward. We'd love to see you win. But okay, move away from trying to improve conversion rates into: what are your business goals for this quarter? What are you trying to accomplish?
[00:17:17] I think I've seen this with some of the testing that Ayat's team does. Sometimes you're running an experiment because there is a business objective that you are trying to accomplish. And sometimes, even though that experiment loses, the client and the team would recommend going ahead and implementing the experiment because there is a direction.
[00:17:36] It's okay. So this is not perfect. We did not have the win that we were expecting. But we're still going to implement it because there's a strategic business interest in a certain implementation done a certain way. Now, as we're making that decision of deploying that experiment, I'll go back to the losing experiment.
[00:17:51] Guess what? Now our decision is informed by data. Okay, yeah, this experiment might cost us a 3, 4, 5 percent loss in revenue. We've run into [00:18:00] issues like that. But guess what? Strategically, this is important. So there will be a second, a third, a fourth iteration, trying to fix some of the losses that we've had, because strategically this is exactly where we want to head.
[00:18:14] It reminds me of the story of Netflix one day deciding: you know what, we're going to drop DVDs, the physical DVDs, completely, and we're going to change to all digital. When 99 percent of your business comes through the DVDs, that's a little risky, correct?
[00:18:34] Now, of course, they had money in the bank, and they had investors, and they had connections, so yeah, there was a mitigation strategy. But sometimes you say, you know what, the business is heading in that direction, and we need to test this out, we need to make it work. So what does it take to make it work?
[00:18:48] I think that's really the fun side of conversion optimization.
[00:18:52] Ayat: I would also add to that. When you're making that strategic decision, a lot of times the client will say it really depends on what type of [00:19:00] negative impact this has. So if it does have a negative impact on our business, we want to see what the extent of it is.
[00:19:05] And whether we can actually still operate and function knowing that we're going to take this hit. Those are the discussions that need to be had with the client, to ensure that, again, there's that communication piece and everyone's on the same page about what the impact is and what those
[00:19:25] guardrails are for the client: if it exceeds this, we're not going to be implementing it, and you have to move into a different strategy. We've had some situations where, again, it may have been us looking at some sub-goals of the experiment in order to determine what it was that didn't resonate with this particular solution,
[00:19:42] updating accordingly, and relaunching with a better win, or maybe not as much of a loss.
[00:19:47] Simbar: Yeah, those are good answers. I was getting a little bit worried because we had gone this far without telling a story.
[00:19:57] Khalid: I was going to say: a story. Here we go. Somebody [00:20:00] asked for a story.
[00:20:01] Two things. First, since we're talking about stories, not that I'm plugging this: Storyteller Tactics. I actually bought it, so I will tell you guys whether it's good or not. I've been impressed, actually, with some of the stuff. I've just watched the tutorials on how to use it, and I'm like, oh, my stories can be so much better. So we shall see whether the stories improve or not. That's number one. Number two: one thing that's always interesting, talking about testing and making a data-informed decision, and I hate using those jargon words, but okay, it is what it is. Every time a website owner, a CEO, a president, or a co-founder comes and talks to us and says, hey, we're upgrading, we're migrating from this platform to this platform.
[00:20:46] We've learned, and when I say we've learned, it's through very hard experience, that usually you think you're getting a trip around the world and instead you're getting an atlas or something like that. So you've got to [00:21:00] be careful what you expect. So I've learned, every time a client tells us that, to ask: hey, so what is the cutoff point where you say, you know what,
[00:21:06] okay, this is too painful, we need to revert back to the old site? Now, every time I say this to somebody, they're like, nah, we invested too much, and we're not going to do that. And I'm like, I understand, I understand you've invested too much. However, what if you launch the new website and it causes you a 20 percent drop-off in conversions?
[00:21:24] They're like, no way that's going to happen. I'm like, okay, theoretically, because they're like, oh, it's not going to happen, I'm like, what if it causes a 50 percent drop-off in conversions? Oh, then I'm going to roll it back. I'm like, okay. So we agree that there is a threshold. We just need to talk about this threshold.
[00:21:36] Correct? Because 50 percent is not acceptable to you. And they're like, yeah, okay, we'll decide when we go live. I'm like, no. When you go live you have about a million things going on, and it's going to be too stressful. Let's have the conversation right now. Let's draw the line in the sand, and let's say maybe it's ten percent.
[00:21:54] I remember we were having the same conversation with ZGallery. They were migrating from a custom [00:22:00] build; they attempted to migrate to another custom build and went bankrupt because of a migration, which is a whole other story. The amount of money that they were spending left and right, unfortunately. And then they got acquired by one of our clients, and we knew the client really well because we've worked with them for a long time, on and off, actually, since 2011.
[00:22:16] So we were having a conversation with Jordan, and I don't think he was convinced. But then somebody on his board says, you know what? I am the poster child for failed migration, where I almost lost my business. Let's put that cutoff line. And we went back and forth until they agreed: okay, anything more than a 15 percent drop-off in conversions, we're going to roll it back.
[00:22:38] It's just reality, correct? So we had the discussion. Now, setting up a sitewide test, old version versus new version, is horrendous.
[00:22:46] Ayat: Server side?
[00:22:48] Khalid: Server side, Lambda. And this is many years ago, where, oh my God, it took us almost a month and a half to implement. And then server-side testing, especially sitewide, is never perfect.
[00:22:59] And then they [00:23:00] decided, you know what, we're just going to go live and we're just going to watch the data. So we went from doing a sitewide A/B test to monitoring analytics and saying, okay, yeah, there's a little bit of a drop in conversions, not huge, not horrendous. They said, we're going to go with the new site. But just as a reminder: New Balance.
[00:23:16] In 2009, I think, a week before Black Friday, they went live with the new website. The VP of e-commerce was super excited about it. They launched it, and their conversion rate dropped by 70 to 80 percent. They rolled it back; it took them about a month because they didn't have contingency plans, and many people lost their jobs. So there you go, two stories for the price of one.
[00:23:40] Simbar: Yeah, yeah.
[00:23:41] Those are good ones. The next question is: what are the most common pitfalls or challenges encountered when prioritizing A/B tests, and how do you overcome them?
[00:23:50] Khalid: I'll jump in over here. I think one of the biggest challenges that I see is this: there are frameworks out there, correct?
[00:23:57] And each framework, I think, to some [00:24:00] extent has a weak point. The biggest challenge and pitfall that I see is how subjective a framework is. I've uncovered 150 issues; I take those 150 issues, I give them to three of my CROs, and I tell them, use the same framework. What is the match between those three different CROs?
[00:24:17] Are they matching 85 percent, with the categories and the prioritization very similar? Or is this CRO telling me, here's the top five, while for the second CRO those top five are at the bottom of the list or in the middle of the list? Then you have a problem. Subjectivity is an issue, and I see that as a major issue with some of the frameworks.
[00:24:37] Actually, one of the more popular frameworks, ICE, uses so little data that you end up with very subjective prioritization. Maybe even worse than that, which is very funny, is people who try to use ICE and they're like, oh, it's so inconsistent, we're not going to use any prioritization.
[00:24:54] And who cares about prioritization? There's somebody actually who made a post about that. And I just thought, I'm [00:25:00] like, oh my God, he's like, oh, there's no place for this. And I'm like, yes, that's just like me trying to build a system. I look at my engineers, and I only have four engineers and enough work for a year.
[00:25:10] And I'm like, you know what? Why don't I just say: here's a list of 300 different tasks that we need to finish, just go ahead and randomly pick something, and then let's see. The fact that you've used a prioritization framework and it did not work is a problem of either the prioritization framework itself or how you've used it, correct?
[00:25:28] If you think that there's no problem with the prioritization framework, then there's probably an issue with how you use it. So you want to fix that before you say, I'm going to throw, what's the baby with the water? What's the expression? I forgot. Okay. No one knows it.
[00:25:39] Ayat: Never heard of that expression, Khalid. You're coming up with a new expression.
[00:25:42] Khalid: Oh, the baby with the bathwater. You know, where you say, oh, the water is dirty, so you just throw out both. No. You need to prioritize. Conversion optimization is just not going to exist by itself.
[00:25:52] It needs project management. We're not going to reinvent marketing. We're not going to reinvent project management. They are there. They've been there [00:26:00] forever. So figure out the issue. That's the second one. The third issue that people might not notice, and I think we had that with our previous framework, is when it gets bloated.
[00:26:11] There are just many things, and many things keep getting added. It's like, oh, let's use this as a ranking factor, and this as a ranking factor. So the first iteration of our prioritization framework was almost 24 different elements. We had to step back and say, hey, when people are filling out the prioritization framework, and you can do this after the fact: pull your prioritizations from the last six or twelve months.
[00:26:30] So hopefully you've done, I don't know, 20 or 30 different prioritizations. You can see: are there fields within the prioritization that are always getting filled in "yes" a hundred percent of the time? That's probably a useless field in that case, correct? Everybody's filling it in "yes," you know, constantly, and I'm like, I don't know.
[00:26:46] Okay, this is really not very helpful. But again, always question is the problem of the framework? Is it a problem of the people filling out the particular field in a particular way? Because the field might be perfect, but people are just lazy. And they're saying, [00:27:00] yeah yeah, we discovered this through analytics.
[00:27:01] Yes, we discovered through this analytics. I'm like, show me the analytics. Oh, I don't have analytics. Okay. It's not the framework problem. It's a Joe Schmo problem. So let's fix that. And then the final thing that I would say is after you prioritize, like I had said, a CRO or a digital marketer or a project manager, and you sit back and say, okay, so here are the top priority items.
[00:27:23] Do they actually make sense? it's not going to be on auto drive. You have to sit back, make sure that it all makes sense. Are all the items getting very close score? Everything is getting 75. I'm like, okay, either you have a problem with the framework or how it's getting filled out.
[00:27:38] So those are just kind of things that I look at as pitfalls that people don't pay close attention to.
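The audit Khalid describes, pulling six to twelve months of past prioritizations and flagging always-yes fields and suspiciously tight score clusters, could be sketched roughly like this. The field names, records, and thresholds below are invented for illustration, not Invesp's actual framework:

```python
from collections import Counter

# Hypothetical past prioritization records; field names and scores are invented.
past_prioritizations = [
    {"found_in_analytics": "yes", "affects_checkout": "yes", "score": 74},
    {"found_in_analytics": "yes", "affects_checkout": "no",  "score": 76},
    {"found_in_analytics": "yes", "affects_checkout": "yes", "score": 75},
    {"found_in_analytics": "yes", "affects_checkout": "no",  "score": 73},
]

def constant_fields(records, threshold=1.0):
    """Return fields whose most common value appears in >= threshold of records."""
    flagged = []
    for field in records[0]:
        if field == "score":
            continue
        top_count = Counter(r[field] for r in records).most_common(1)[0][1]
        if top_count / len(records) >= threshold:
            flagged.append(field)
    return flagged

def scores_clustered(records, spread=5):
    """Flag the framework if every item lands within a narrow score band."""
    scores = [r["score"] for r in records]
    return max(scores) - min(scores) <= spread

print(constant_fields(past_prioritizations))   # ['found_in_analytics'] - always yes, likely useless
print(scores_clustered(past_prioritizations))  # True - everything near 75, review the framework
```

A field that never discriminates between ideas adds work without adding signal; whether the fix is dropping the field or coaching the people filling it out is the judgment call Khalid describes.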
[00:27:43] Ayat: I'll add a couple of things. In my experience, and again, I think it always goes back to human error, or the ability to add things and be really proactive.
[00:27:54] I think that's the thing with frameworks and CRO in general: you really do need to be very proactive. [00:28:00] And if you don't add, for example, the wins and even the losses, the information and the learnings from those, back into the framework, there are so many missed opportunities there. And you're not going to be able to prioritize accordingly.
[00:28:15] And you're really basing a lot on, again, just the intuition of the CRO, which sometimes is just not enough. So I think that's a key thing: making sure, if you're conducting a CRO project, that research is always included. And part of that research is the A/B tests that you're conducting.
[00:28:34] That's part of your research. So adding those opportunities back into the framework and then rescoring it all is a really good way to ensure that you can build on the wins and keep that prioritization rich with ideas. The second thing, and Khalid mentioned this one, is just the CRO, when putting together the eight-week plan, really looking at it and making sure everything does make [00:29:00] sense. It is still something you're scoring manually, and you have to make sure it makes sense for the client, aligning it with the business objectives that the client has.
[00:29:10] The last one I'll say, where we see some pitfalls, is that very often we have our clients recommending certain experiments. The HiPPO in the room is deciding, I want to do this on the site. So make sure those are also added to the prioritization, and then show the client: hey, the opportunity that you actually mentioned is really low.
[00:29:29] It's a low priority. We have some other priorities that we want to move forward with before we do that. Because making sure that you're not just taking something the client says and implementing it, that you're actually feeding it through this prioritization, again, makes the whole prioritization concept a lot more meaningful.
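As a rough sketch of what Ayat describes, feeding test learnings back in, rescoring the backlog, and running client/HiPPO requests through the same framework rather than around it, might look like this. The fields and scoring weights are hypothetical, purely for illustration:

```python
# Hypothetical backlog; evidence/effort scales and weights are invented.
backlog = [
    {"idea": "Simplify checkout form", "evidence": 8, "effort": 3, "requested_by_client": False},
    {"idea": "Autoplay hero video",    "evidence": 2, "effort": 6, "requested_by_client": True},
]

def rescore(item):
    # Evidence (research, past test learnings) pushes the score up; effort pulls it down.
    # A client request earns a small bump but does not bypass the framework.
    return item["evidence"] * 10 - item["effort"] * 5 + (5 if item["requested_by_client"] else 0)

def record_learning(item, evidence_delta):
    """After a test win or loss, adjust the evidence behind related ideas."""
    item["evidence"] += evidence_delta

record_learning(backlog[0], +2)  # a related test won, so the hypothesis gained evidence
ranked = sorted(backlog, key=rescore, reverse=True)
for item in ranked:
    print(f"{rescore(item):>4}  {item['idea']}")  # the client-requested idea still ranks last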
[00:29:46] Khalid: I give you a thumbs up, like in a LinkedIn live. I'm like, I like it.
[00:29:49] Simbar: Okay, that's good. So the last question that we have is: when working with a team, how do you ensure alignment on the prioritization of A/B testing ideas?
[00:29:59] Ayat: [00:30:00] Well, I think it's just about building the right processes within your organization, ensuring everybody's on the same page, ensuring there's buy-in from the entire team. We've recently been doing a lot of work categorizing all the different experiments that we're conducting into certain categories, just to ensure that there is a streamlined process, a streamlined way that we're looking at problems, a streamlined way for us to actually pull data.
[00:30:29] Again, I think it's just about making sure that you're always optimizing the processes and your discussions with the team, to ensure that everybody's on board with whatever it is that you're doing.
[00:30:38] Khalid: Let me add something there, because you create the prioritization plan, and, think about this, if you're prioritizing 160 items and you're saying, in a month I might do four experiments.
[00:30:55] Four experiments, 160 items. That's 40 months. That is more than three [00:31:00] years' worth of work. It's way, way ahead. You look at the business objectives; that helps you. I think one of the criteria you want to look at is whether there's a strong business-objective driver. If the client wants a feature, that gives an item a priority, but still, it's ranked within the framework.
[00:31:14] Okay. So I have this one-year, two-year, three-year plan. And, as Ayat had just mentioned, we like to take the plan and say, okay, what am I going to be working on in the next eight to twelve weeks? I have my long-term plan. I know where I'm heading. I have the map, but in order for me to get there, okay, what are the next three moves that I need to make?
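The capacity arithmetic Khalid walks through can be written out in a few lines. The backlog size and monthly velocity are the numbers from the discussion; the four-weeks-per-month simplification for the planning slice is my own:

```python
# Back-of-the-envelope capacity math: a 160-item backlog at ~4 experiments
# per month is years of work, hence planning in 8-to-12-week slices.
backlog_size = 160
experiments_per_month = 4

months_of_work = backlog_size / experiments_per_month
print(f"{months_of_work:.0f} months (~{months_of_work / 12:.1f} years) of backlog")  # 40 months (~3.3 years)

# The next 8-week plan only covers the top slice of the ranked backlog
# (approximating a month as 4 weeks).
weeks = 8
next_slice = experiments_per_month * weeks // 4
print(f"The next {weeks} weeks cover roughly the top {next_slice} items")  # top 8 items
```

This is why the long-term plan can stay loose while the top-of-backlog slice gets the scrutiny: at this velocity, everything below roughly the top ten items won't be touched for months anyway.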
[00:31:38] Okay, so that's eight to twelve weeks. And now I'm having both internal discussions and discussions with the stakeholders, whether it's a client or you're an internal resource. Here's my plan. Now, although it's an eight-week plan, it's subject to change, correct? I've run some experiments, things change. Guess what? This is flexible.
[00:31:56] What I like to say is, if you're [00:32:00] holding a weekly call with the client team, at the end of the call you say: hey guys, here's our next eight-week plan. We're going to work on that. That's what we're going to be changing. That's what we're going to be testing.
[00:32:11] Any changes, any recommendations? So it's always about making sure that, really, there's visibility, and then there's alignment, correct? Everybody knows. Nobody's asking, hey, what happened to this plan? This experiment, I thought we talked about it. No, we've changed. We've shifted priorities.
[00:32:25] Everything needs to be very well communicated, so we're all on the same page. And I think, by the way, that's the very first step of getting people excited about experimentation. That's one of the things that a CRO needs to do. You need to get everybody excited about the research, the experiments, and the results.
[00:32:42] And then finally, if things go well, how we help the business actually achieve its goals.
[00:32:50] Ayat: That's really full circle to what you mentioned at the beginning of the podcast, Khalid, about storytelling. Because I believe that CROs need to be [00:33:00] really good storytellers, and they need to be able to walk the clients
[00:33:04] through the story of why this is a really great experiment, why it's going to have a positive impact, what type of an impact it had. All of it is about how you deliver that information, and it's so critical. And to what you were saying about the framework, I always tell people:
[00:33:20] Your prioritization sheet is a living document. We've mentioned this throughout the podcast: you're constantly updating it, constantly reviewing it. Even after that eight-week plan, I'm feeding in a lot of information. I've run experiments, I've gathered some more data.
[00:33:37] I've looked at analytics, heat maps, session recordings, whatever it is, and I'm feeding more information back into my document. So it is a living document, and it is subject to change. And of course, making sure that you're in alignment with the client is the key thing.
[00:33:50] Simbar: Yep. That makes sense.
[00:33:52] So those are the questions that we have for today. Unless maybe you want to share more stuff; I'm here for it.
[00:33:59] Khalid: I [00:34:00] have nothing to share, man. I am fasting, operating on little sleep, but the discussion was good, so I enjoyed it. You know, talking prioritization, it's always good.
[00:34:08] Thank you, Ayat, for joining us. What do you think, isn't CRO Live Hour fun? We're always talking about interesting, fun things. Thank you everyone for listening, and until next time, happy testing.