CRO Live Hour

#49 – How To Design Winning Holiday Experiments

October 11, 2023 · Khalid & Simbar · Season 1, Episode 49

In the latest episode of the CRO Live Hour, hosts Khalid and Simbar delve into the intricacies of optimizing e-commerce websites during the holiday season. They tackle essential questions about prioritizing elements for experimentation, emphasizing the significance of choosing the right metrics tailored for holiday experiments. Tune in to discover key insights on tracking metrics vital for long-term success and understand the critical role of user feedback and customer behavior analysis in shaping the design of holiday experiments. Join Khalid and Simbar as they navigate the complexities of e-commerce strategy, providing valuable tips and strategies to enhance your online business during the festive season.

QUESTION: Have a question about e-commerce Conversion Optimization, A/B Testing, User Research, Marketing, or anything else? Post it in the comments section of this video!

Visit Our Website And Learn More About Our Services:

invespcro.com

Follow Invesp on Social Media:
Linkedin
Facebook
Twitter/X

Read more about conversion rate optimization on our blog.

Transcript

[00:00:00] Khalid: Okay. Another episode of the CRO Live Hour. We are back.
[00:00:04] Simbar: It's been about three weeks without the CRO Live Hour? Oh, wow, I didn't know that. Three weeks without publishing any episode. Yeah, it's been three weeks. I think it's about time we do a recording.
[00:00:15] Khalid: It's funny, I remember why three weeks ago.
[00:00:18] That's when Invesp and FigPii were super busy. I think it was one of my most stressful weeks in 10 years at Invesp, or something like that, and at FigPii. So I'm like, Simba, I can't do it. And then you were out. Do you want to share any news with our audience?
[00:00:35] Simbar: Yeah, of course. I took a vacation for two weeks. I came back a different person. I became a dad for the first time. I now have a son. I'm now a dad. So it's been interesting being a first-time father. There is no template for how to be a father. You just have to try and take advice from other people.
[00:00:52] Who are already dads. They have also been giving me tips on how to be a father, or how to do certain things, which has been helpful.
[00:01:00] Khalid: Yeah, it's like everybody has to figure it out on their own. It's the most important thing that you will do for your kid, but you're just trying to figure it out on your own.
[00:01:09] But, by the way... The first kid is when you're just confused and you're asking for advice and, what do I hear? The second kid is, oh yeah, I figured it out. By the third kid it's, hey, first kid, can you take care of the third kid? Leave me alone.
[00:01:23] Simbar: Yeah, it's been interesting so far. I think he's three weeks old today, so we've been getting less and less sleep every day, waking up about every two hours, but I'm getting used to it bit by bit. So it's been cool so far.
[00:01:40] Khalid: And thus, we are back at the CRO Live Hour.
[00:01:44] You took a vacation for two weeks, but I don't know if it counts as a vacation, because I think it's just exhausting when you have a newborn. But, such is life.
[00:01:52] Simbar: Mentally, I rested. But physically, I didn't really rest.
[00:01:56] Khalid: Can you actually mentally rest when you're physically tired? I don't know.
[00:02:00] But anyway, we are back. Let's see, what are we talking about today?
[00:02:04] Simbar: Today we are talking about winning holiday experiments. Like, how do you come up with the different winning holiday experiments?
[00:02:12] You also had a webinar sometime this week. Was it like two days ago? We're going to send out the recorded replay. I also had you talk about some of the experiments that we launched here at Invesp for companies that we work with during the holidays.
[00:02:28] We're going to be talking about that today. I have about three questions that I will ask you. Just let me know when you're ready.
[00:02:35] Khalid: I'm ready, man. Just fire away.
[00:02:37] Simbar: Okay, cool. So the first question is: how do you determine which elements of an e-commerce website to prioritize for experimentation during the holiday season?
[00:02:47] Khalid: How do you determine which elements you need to prioritize during the holiday season? I think we cannot answer that question without talking about the strategy, the experimentation and A/B testing strategy, or the CRO strategy, during the holidays. And the strategy that I'm going to talk about we've learned the hard way, probably over the last 10 or 11 years; that's when we started focusing on the holidays and what kind of testing we can do during them.
[00:03:14] And I have to tell you, as I reflect back on the first experiments that we launched during the holidays, we knew that we had to have offers, and we were just testing different offers. But at the same time, we said, since people are highly motivated, they're coming, they're looking for a deal,
[00:03:30] we wanted to test simple site functionality just to see which one worked better. Some of that was, okay, hey, dropdowns versus radio buttons: with somebody who's highly motivated, would testing that functionality have an impact? At some point we tried to test branding plus discounts and offers, versus just offers and discounts, versus just branding itself. Because we had one client who refused to do any discounts during the holidays. And they're like, hey, I don't care, Black Friday, Cyber Monday, we don't give discounts.
[00:04:02] And over the years, I think what we've learned is that the strategy needs to be simple, if I may say. Don't overcomplicate your holiday testing and experimentation strategy. Remember, during the holidays, during Black Friday, Cyber Monday, probably until the end of the year, our goal is to maximize the money coming in, the ROI that we're getting, all the sales that we are generating.
[00:04:30] I'm not necessarily trying to learn a lot about my customers. Maybe if I'm learning anything, I'm just trying to learn which offers work best for them. Now, having said that, that really dictates what kind of experimentation we're going to do during the holidays. So usually, whenever we're sitting with a company that we work with, we say, okay, what are the offers that we're going to be focused on this year? And some companies go overboard with the amount of offers.
[00:04:56] I'm like, okay, don't complicate it. Let's just stick with two to three different offers. We'll test them out and see which one works best. And for each one of those offers, we will create different creatives. Now you have the offers, different offers from different angles, three different creatives.
[00:05:12] In terms of testing, we always have a banner at the top of the site; that just works very well all the time, especially during the holidays. You combine that with making sure that on the collection pages or the product pages for e-commerce you also mention the offer. We like to use scarcity and urgency as well, but they need to be authentic.
[00:05:37] So you're not inventing things, trying to trick people. And in the cart, we just repeat the offer. Again, we apply any discounts automatically and we run those experiments. Usually our experiments launch at midnight, 12 a.m. Eastern, that Thursday night. That's basically the start of Black Friday, and they will continue.
[00:05:57] We're constantly monitoring, and I must add that if the site has enough traffic, which most of the companies we work with have, the traffic and the conversions, we are using a multi-armed bandit algorithm. For those who don't know what a multi-armed bandit is, basically it's an algorithm that reallocates traffic between the different variations you have in an A/B test based on which variation is generating more conversions.
[00:06:21] So the goal of a multi-armed bandit is to maximize your revenue during the testing period. We run all of that testing usually until Monday morning; we look at our winners and we use those. Now, we started with three campaigns. We usually take the winning campaign, and that's going to be our Cyber Monday campaign that we're going to launch with.
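For readers who want to see the reallocation idea concretely, here is a minimal sketch with hypothetical offer names and conversion counts. It uses Thompson sampling, one common multi-armed bandit approach; the episode does not say which algorithm FigPii uses, so treat this as an illustration rather than the platform's actual method.

    import random

    # Hypothetical conversion counts observed so far for three holiday offers.
    stats = {
        "offer_a": {"conversions": 120, "visitors": 4000},
        "offer_b": {"conversions": 95,  "visitors": 4000},
        "offer_c": {"conversions": 150, "visitors": 4000},
    }

    def pick_variation(stats):
        """Thompson sampling: draw a plausible conversion rate for each
        variation from a Beta posterior and route the next visitor to the
        highest draw. Better-performing offers gradually receive more
        traffic, which is the revenue-maximizing behavior described above."""
        best, best_sample = None, -1.0
        for name, s in stats.items():
            sample = random.betavariate(1 + s["conversions"],
                                        1 + s["visitors"] - s["conversions"])
            if sample > best_sample:
                best, best_sample = name, sample
        return best

    # Route the next 10 visitors (offer_c should dominate over time).
    print([pick_variation(stats) for _ in range(10)])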
[00:06:40] Now, we have different creatives that are already ready, based on all of this being prepped two months before the holidays. So for Cyber Monday it's, okay, here's our winning campaign. And then we continue running that, probably Monday, Tuesday, sometimes until Wednesday. So what am I testing?
[00:06:55] I'm not complicating the testing. I am going to be testing my offers. I'm going to be testing something globally across the website. I am going to be testing a few elements: the placements of my offers on the collection pages, and the placements and the copy of my offers on the PDPs and on the cart page.
[00:07:15] Outside of that, don't test. Don't get too fancy. People are highly motivated. Just tell them, here's what I have, and give them what they want. And the webinar that we had with Adam was just really fascinating, because Adam focuses on email marketing and the holidays.
[00:07:32] He handles the first portion, correct? He drives that traffic to the website and we try to convert that traffic. There are some strategies that he mentioned that I found fascinating. And then, of course, I talked to Ayat, and she's, oh, we've been doing this for a while. I'm like, okay, great. Sometimes I'm disconnected, I guess, because, by the way, I usually have conversations with our team members regularly; starting in November, no one talks to me. Everybody's just heads down working with their clients. They're just too busy to actually have a conversation. But one of the strategies is, let's say you're working with a brand: an email will come out from the CEO of that brand telling people that, hey, the holidays are around the corner, we are going to be running offers.
[00:08:10] And then they give a calendar: hey, Friday we're going to run this promotion on this category, Saturday this promotion on this category, and Sunday this promotion on this category. So they're testing different promotions on different categories, and maybe in the last three to four hours they might even increase the discounts that they're giving.
[00:08:27] I thought that was fascinating. That was interesting. And of course, it goes back to the other piece that he said that I thought, oh, this is really cool: segmenting the email lists and knowing what offers to make, at what time, to which segments.
[00:08:40] Simbar: That's a good point.
[00:08:42] I just want us to dig a little deeper into your explanation. How many tests do you usually run within that period, let's say from Thursday night or Thursday morning going into Black Friday? How many tests do you usually launch? I'm just asking this because maybe there's someone listening to us who also has an e-commerce site and doesn't really know what exactly to test and how many tests to launch.
[00:09:07] Khalid: So, to answer that question, let's start with the assumption that you're going to launch a test on both mobile and desktop, so one test that's going to address all traffic, and then your A/B testing software will allow you to segment the results between mobile and desktop. So we're going to launch what we call a responsive test, just to simplify life.
[00:09:26] If you have the traffic, you might actually run desktop separate from mobile. Let's say it's one test. So we will launch a test site-wide, that's in the header, site-wide. I'll launch a test probably on my top landing pages and/or the homepage. That becomes important: if the homepage doesn't get enough traffic but the landing pages get lots of traffic, then I'll focus on the landing pages.
[00:09:49] If, no, the homepage gets a ton of traffic and the landing pages are different, that might be one or two tests. So that's up to three tests so far. Then we will launch a test on the category pages, a test on the product pages, and a test in the cart and checkout. And usually the cart and checkout is one test that we run.
[00:10:06] So up to six experiments, if you're running what we call a responsive test, so one test that's going to run for both mobile and desktop. If you say, you know what, no, my mobile is going to be different than my desktop, then you're launching six experiments for mobile and six experiments for desktop.
[00:10:21] But of course, that starts with the assumption that you have enough traffic, enough conversions. Sometimes I just like to combine them to make life easier: one responsive test for both devices. You just need to make sure that your A/B testing software allows you to segment your results based on the device, correct?
[00:10:38] It's, hey, we've run the test, but I want to see what the conversion rate is for mobile and what the conversion rate is for desktop, and I want to slice and dice the data. And I have to do a shameless plug over here for FigPii, because we provide that out of the box.
[00:10:50] You run a responsive test, and we allow you to segment not only by device, but by the type of traffic people are coming from, or a specific campaign.
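To make the slice-and-dice idea concrete, here is a small sketch of segmenting a responsive test's results by device, assuming a hypothetical raw export with variation, device, and converted columns; FigPii, GA4, and other tools expose this through their own interfaces rather than requiring code.

    import pandas as pd

    # Hypothetical export of a responsive test: one row per visitor.
    results = pd.DataFrame({
        "variation": ["control", "offer_a", "control", "offer_a", "offer_a", "control"],
        "device":    ["mobile",  "mobile",  "desktop", "desktop", "mobile",  "desktop"],
        "converted": [0, 1, 1, 0, 1, 0],
    })

    # Conversion rate per variation, segmented by device, to check whether
    # mobile and desktop visitors respond to the same offer differently.
    summary = (
        results.groupby(["device", "variation"])["converted"]
               .agg(visitors="count", conversions="sum")
    )
    summary["conversion_rate"] = summary["conversions"] / summary["visitors"]
    print(summary)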
[00:10:58] Simbar: You also answered that very well, and I'm just thinking, when you run an A/B test on desktop and mobile devices, maybe at the same time, doesn't that mean more development and QA work for your team, given that you have to be moving fast during that Black Friday period?
[00:11:13] Khalid: Yeah, so this is interesting, because this is always a discussion with our dev team. Ultimately, I think it's about the same amount of time. If you're going to do separate mobile and desktop tests, unless you're going for a completely different experience, usually the first platform that you program the A/B test for will take you the longest.
[00:11:33] So, let's say you have an experiment and it's going to take eight hours, and you're developing the test first on mobile. When you go to develop the test for desktop, you already have most of the code; you need to make a few changes, so it might take a couple of hours. So in total you've done 10 hours.
[00:11:47] Now, if you're going to do responsive, it is probably going to be close to 10 hours. So it's about the same, by the way, whether you're doing it as responsive or doing them separately. Sometimes doing it as responsive actually cuts down the time by 10 to 20%, which is fascinating. It's not exactly equal.
[00:12:04] You're not launching two tests; you're launching one test, and the time is not the combination of both, it's about 20 percent less. But remember, we're talking about not-too-complicated experiments. We're talking about banners. We're talking about special offers. I don't anticipate that the development time for any of those experiments will be long.
[00:12:23] Most of these experiments are what we call low-effort experiments. Low-effort experiments are experiments that take less than four hours to develop. That's what I want. And this is something that we do: whenever we have an experiment, we classify it as low, medium, or high effort.
[00:12:38] It's important for us because it really helps us understand how long it's going to take to implement the experiment. A low-effort experiment is less than four hours. A medium-effort experiment is anywhere between four and eight hours. And a high-effort experiment is more than eight hours,
[00:12:54] sometimes around 16 to 24 hours. If I see that there are medium or high-effort experiments in my Black Friday plan, I'm stopping. I'm like, hey, what's going on here? Why do we have high-effort experiments, or even medium? Because our experiments should be fairly straightforward. We're not trying to learn a whole lot about our customers.
[00:13:14] We're trying to maximize our revenue this season.
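As a small illustration, the effort buckets Khalid lists translate directly into a simple classification; the backlog items and hour estimates below are hypothetical.

    def effort_level(estimated_hours: float) -> str:
        """Classify an experiment by development effort using the thresholds
        from the episode: low < 4h, medium 4-8h, high > 8h (often 16-24h)."""
        if estimated_hours < 4:
            return "low"
        if estimated_hours <= 8:
            return "medium"
        return "high"

    # Hypothetical Black Friday backlog: anything medium or high is a flag
    # to stop and simplify.
    backlog = {"header banner": 3, "PDP offer copy": 2, "new checkout flow": 18}
    print({name: effort_level(hours) for name, hours in backlog.items()})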
[00:13:17] Simbar: That's a good point. Just one more time, let me try, not really pushing back, but I just want to hear what you would say.
[00:13:25] Khalid: Go for it. Push back, my friend.
[00:13:27] Simbar: Some of the web elements that help increase conversions on desktop may actually affect conversions on mobile devices. Maybe what I'm trying to say is that what works on desktop might not necessarily work on mobile. So if you launch the same test on both devices at the same time, don't you think that might end up affecting one device more than the other?
[00:13:51] Khalid: So I agree with the fundamental premise that the visitor who comes to your website on mobile is in a different mindset than the visitor who comes to your site on desktop.
[00:14:04] I completely agree with that. I think that's a correct assumption, and we see that all the time in our testing. And that's the reason why, even when we launch a single responsive test for mobile and desktop, we have to segment, just to really understand how mobile visitors are behaving compared to desktop visitors.
[00:14:22] But there's always a but, correct? Here's the thing. Holidays are a little bit different, because we're focused on offers, and the visitors are coming in highly motivated and searching for discounts. Let's be very frank and honest about it. So it's all about how do I show them the best offer, in the best way, on mobile and desktop.
[00:14:44] So while people are a little bit different throughout the year, I think during the holidays it's a little bit less so, especially if we're just focused on offer testing. So that is something that we want to keep in mind. Now, there's a whole other complexity. Let's say you launch mobile and desktop tests, or you launch a responsive test.
[00:15:03] You launch mobile versus desktop, and offer one wins on mobile and offer two wins on desktop. What do we do in that case? That's a question. Or even with responsive: you launch a responsive test, but then you segment the data and you see the same thing, offer one wins on mobile and offer two wins on desktop.
[00:15:22] What do we do? The holidays are very special because we are trying to maximize our revenue. So I would actually say, you know what, for mobile, this is the offer that we'll have; for desktop, this is the offer that we'll have. This is because we have very limited time. So we create what I call a split in the path, for mobile and for desktop, because really the data is showing us that they are different.
[00:15:43] Outside of the holidays, it's a completely different discussion, because outside of the holidays you're not testing offers, correct? You might be testing functionality, messaging. And the minute you see that mobile versus desktop are a little bit different, you actually have to stop and say, okay, how many visitors are getting to us on mobile?
[00:16:00] How many visitors are getting to our site on desktop? We usually like to go with one message, even if we end up saying, you know what, this version is really winning on desktop, but desktop doesn't matter much to me, most of my visitors are coming on mobile, so I am going to go with the mobile version, and that's going to be the default on desktop as well.
[00:16:16] Why do we do that? Because splitting the code base into here's mobile, here's desktop sounds great for marketers, hey, we want a different experience, but maintaining that is not so easy for developers. And remember, we talked about one test, right? Oh, here's the experience for mobile, here's the experience for desktop; 300 experiments later,
[00:16:37] maintaining the website could be horrendous. It's just asking for trouble, unless the website is fundamentally architected to handle those two different experiences. Again, as you notice, there's a but, an unless, a caveat. So there's a lot of thinking that needs to go into that.
[00:16:51] Simbar: I agree with your explanation, and it also makes a lot of sense, because I'm also thinking that some people might visit the website using their mobile phones and then purchase on their desktop. But what if they took that path and the offers they see on desktop are different from what they saw on the mobile device?
[00:17:16] I'm just thinking, it might be far-fetched.
[00:17:18] Khalid: Happens all the time. Happens all the time, by the way. Typically we say it is what it is, offer here, offer there. It is what it is, just because of the amount of effort it takes. It's the same visitor, but for the A/B testing software it's a completely different visitor, unless you have very intelligent A/B testing software that says, oh, you came from here,
[00:17:38] I can look at your signature, I can compare it, I can match it. This is a lot easier if you have what we call a logged-in user. So I'm like, oh, it's the same user ID, okay. I showed you this offer on mobile; now you're logging in and coming back to the website, and I'm going to show you the same offer.
[00:17:52] Not everybody is logged in on their devices. That's number one. This becomes a much bigger problem, by the way, the different offers or different experiment variations on mobile versus desktop, when you're testing pricing: oh, this item on mobile is $80 and on desktop it's a hundred bucks. By the way, this happened to me just a couple of days ago.
[00:18:11] So I joined, not somehow, I was actually looking forward to it, a book club. A whole bunch of friends reading a book together. And I'm just busy all the time, and also I visited my doctor and he's like, you're not moving enough. And I'm like, really? I need to move? I guess I am getting older. Yes, you just need to start moving.
[00:18:27] So I'm like, okay, let me get an Audible subscription. It's like $12, $14 a month, and I'm like, okay, I can do it, and I get basically one book per month. I go and click on their ad from Google Ads, and I land and I see a flicker: initially it starts by saying start your subscription at $4 a month, but it's an A/B test, and I see the flicker, and then the version that I was getting served showed me, oh, start your subscription at $14.
[00:18:56] I'm like, no, I want the $4 offer. So I'm like, okay, go back, and I knew exactly what was happening. They're running an A/B test, and I got stuck with the variation where I'm not getting the offer: I'm paying $14, I'm not getting the $4, and I didn't know any other details. So I'm like, okay, let me go back to search in Chrome.
[00:19:14] And this was on my mobile device, incognito mode. I come in, and again it flashes and I'm stuck with the $14 offer. And I'm like, look, really guys, I want to see it. So now I want to see. So I open my mobile device, I go to Safari, I search. So, by the way, because they're running an A/B test, I clicked on their ad three times.
[00:19:31] I made them pay multiple times. And then I got the offer, and it's $4, or $4.99, for four months. So I'm like, ooh, I'll take that. So I subscribed. But again, yeah, I am aware of what's happening. I wonder about another visitor who's not aware, who will see the price flash in front of them.
[00:19:49] That's horrible. Usually I don't care much about the flicker effect; I just say it's a fact of life. But in this case the $4.99 versus $14.99 was so clear that I'm like, oh no, that's a difference, correct, of $40.
[00:20:03] Simbar: I also use Audible, but I don't pay 12 bucks. I think I got a deal where I paid a dollar for the first three months, so that's like a dollar per credit.
[00:20:12] Okay, so I think it's also based on the location, that's all.
[00:20:17] Khalid: Hold on, hold on now. I feel like I need to do the four dollars, then I'll figure out the location. It's funny, because Spotify in Turkey, when we were living in Turkey, was so cheap. I think it was like $3 a month or $4 a month.
[00:20:29] And then when we came to the U.S., it was like $15. And initially I'm like, oh, let's continue using the Turkish plan. But then they look at our IP and they're like, oh, you are in the U.S., we have to switch you and you have to pay more. And I'm like, okay. So, by the way, the next time, what I've learned
[00:20:44] is, if you're in Turkey, you get the Spotify subscription and you pay for a full year, so even when you come to the U.S. it's still Turkish money that you paid. Anyways.
[00:20:51] Simbar: Okay. Moving on to the next question.
[00:20:54] Are there any key metrics that e-commerce businesses should track specifically during holiday experiments, and how do they tie into long-term success?
[00:21:05] Khalid: I've read multiple times where people argued that you want to look at conversion rate, or you want to look at revenue per visit, revenue per visitor.
[00:21:14] I think, by the way, in GA4 now they have a different fancy name for revenue per visitor; I don't even remember what it is. Initially when I saw it, I'm like, what is that? And then I looked at the formula and I'm like, dude, why would you change the name of a metric that we've used for the last 15 years?
[00:21:31] It makes no sense to me. The same thing, by the way, when they removed bounce rate and replaced it with engagement rate. I'm like, really? Why? But again, I guess it's their product, so they get to do stuff like that. But I digress. Some people like to look at conversion rate.
[00:21:47] I actually look at something else: profitability. And that's a very honest and open conversation with every client, every company that we work with. You have to have a full strategy when it comes to the holidays.
[00:22:00] You might be losing a little bit, because you have loss leaders; you want to bring people in.
[00:22:06] Knowing that, I know that if somebody places an order with me, they will place another order within 30 or 25 days. So you need to understand how frequently customers come back. So, yeah, I might be losing money on the first order, but I know that, on average, somebody will place another order with me within 45 days, and within a year maybe they'll place six orders.
[00:22:29] You have to have enough data to do that analysis before you run a specific offer or discount. For me and for my team, as CROs, hey, the higher the discount, the easier it is for us to increase conversion rates. But that is a stupid business strategy. I always tell every company I work with, I can increase your conversion rate 100%:
[00:22:49] offer your products for free, everybody's going to buy, we're done. Stupid. We don't want to do that. We want to think about profitability, the lifetime value of a customer, the frequency and recency of our customers and how often they place an order. That is the framework that we need to look at. Yeah, I am tracking all the other metrics, but the work of a conversion rate optimization specialist
[00:23:13] is a bit more complicated: you pull the data from analytics to look at the recency, frequency, and lifetime value of a customer, to say, okay, here's the data that we have, here are the offers that make sense. Even in instances where we talk about, okay, we're going to offer free shipping, an experiment that we launched a while back.
[00:23:32] I don't know if I would test it again, but what is the threshold at which we would offer the free shipping, right? I want to think about that, but that requires you to dig through the data. You want to look at your cart and checkout abandonment rate, because if somebody clicked and added an item to the cart, they're highly motivated.
motivated.
[00:23:51] Okay, but then they abandon. What is that abandonment rate based on, an average or a cart value? My abandonment rate is 50%. Okay, no, you think it's 50%. Let's look at the cart value and let's look at the abandonment rate. What is the abandonment rate when the cart value is around $40? $40, $45, $50, $55, $60? You might find out that the abandonment rate at $40 is only 10 percent and the abandonment rate at $90 is actually maybe 80%.
[00:24:21] So, thinking about that, then saying, you know what, listen, maybe if we offer free shipping at higher-value orders, then I'm going to reduce that abandonment rate from 80 percent maybe to 60%. So again, there's a lot of data and lots of insights that you can pull, from just quoting a generic number to digging through it, doing that analysis, and then basing your business decision on those metrics.
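A small sketch of that cart-value breakdown, assuming hypothetical session data with cart_value and abandoned columns; the bucket edges are illustrative, and the 10% and 80% figures in the episode are just examples of what such an analysis might surface.

    import pandas as pd

    # Hypothetical cart sessions: cart value and whether the visitor abandoned.
    carts = pd.DataFrame({
        "cart_value": [38, 42, 55, 61, 74, 89, 92, 105, 47, 83],
        "abandoned":  [0,  0,  1,  0,  1,  1,  1,  1,   0,  1],
    })

    # Bucket carts by value and compute the abandonment rate per bucket;
    # high-abandonment buckets are where a free-shipping threshold might help.
    buckets = pd.cut(carts["cart_value"], bins=[0, 40, 60, 80, 100, float("inf")])
    print(carts.groupby(buckets, observed=True)["abandoned"].mean())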
[00:24:46] Simbar: That makes sense. Could you discuss the role of user feedback and customer behavior analysis in shaping the design of holiday experiments?
[00:24:55] Khalid: I don't know, this is going to sound a little horrible, but I don't know if user feedback plays a huge role during the holidays, in the big scheme of things, correct? Because again, users are highly motivated during the holidays. I start with that premise. They're coming because they are looking for an offer.
[00:25:10] Now, user feedback matters to me a lot in figuring out which offers work best, and this works more for established brands. Look at the messaging. Look at the offers that worked really well last year and the year before. Sometimes, most of the time, you don't have to reinvent the wheel. Let's look at the creatives that worked really well.
[00:25:31] Let's look at the offers that worked really well. Let's look at the offers that lost as well; maybe we want to test one of them and avoid the rest. The feedback that you're getting through the data is what matters to me during the holidays. Actual customer feedback,
[00:25:48] I don't know how much of that you will do. Now, I might stand corrected, by the way. Somebody might say, actually, we do quite a bit of user feedback during the holidays. I would be interested if one of our listeners says, actually, you need to consider this angle. I'm like, okay, that's something that I'd be interested in learning about.
[00:26:05] What do you think, Simba? Do you think user behavior and user feedback matter during the holidays?
[00:26:10] Simbar: I think when it comes to direct feedback from customers, they can highlight their preferences, their pain points, and their expectations during the holiday season. You can reach out to them, maybe when you do those kinds of customer interviews, jobs-to-be-done customer interviews.
[00:26:26] And when you hear much more about their preferences, pain points, and expectations, that's how you can take that feedback and try to inject it into your holiday season offers. I guess maybe that's also something that you can do even when it's not the holidays.
[00:26:42] Khalid: Yeah. That's the challenge that I see with this, because I should be doing that all the time. I don't know if there's anything special about the holidays where I'm like, okay, yeah, this is particular to the holidays; I think you'd do that throughout the year. And, by the way, the other thing is it depends, I guess, on how large the organization is.
[00:26:57] The holidays are always a mad rush. I don't think many people want to do user feedback then, but it would be interesting if we learn something different.
[00:27:03] Simbar: Those are the questions that I had for you, unless maybe you want to add something.
[00:27:07] Khalid: No, you always ask me that question, and usually by the end of the episode I'm like, I'm tired. But that was an excellent discussion. I want to ask our listeners: wherever you listen to this podcast, leave us a review.
[00:27:20] By the way, we should put some effort into getting more reviews for the podcast. Maybe that should be one of our goals.
[00:27:25] Simbar: Considering our listenership, we have a lot of people who listen to us, and I also had some people reaching out to me on Valentine's Day saying that they enjoyed listening to the CRO Live Hour. It is at that moment that I should say, oh, okay, why don't you just drop a review?
[00:27:42] Khalid: Yeah, no, I agree. I agree. Thank you, everyone. And until next time, happy testing.