Ecom with Jon - May 12, 2024

What I learned this week

I’m going to say something absolutely crazy.

I’m not sure email collection matters before the sale.

I think that being able to route select traffic to specific offers might be the better play.

I’m going to try to explain why in this email.

If we’re thinking in pure CRO terms, I actually see a way for data collection to work hand in hand with conversion rate optimization for first-time purchasers.

This is going to borrow from another framework we were using prior, so I know it’s possible to set up.

This is an A/B/C/D/E/F test, and the results are fascinating.

Full admission I need more data, but it’s interesting nonetheless.

The following was done targeting only mobile devices on non-product pages.
The offer was exactly the same; what changed was when the offer was made apparent.

My long-term goal is to automate this kind of advanced testing, because results will absolutely depend on multiple elements, as we’ve covered in the past. But I’m starting to think there are a few little tweaks that will yield the majority of the results for this one, and it’s not worth overcomplicating.

We’re testing multiple elements here:

  1. Offer stated v. no offer stated until the end

  2. Micro opt-in v. no micro opt-in

  3. Full page v. popup

  4. Time delay v. immediate

Let’s go through an overview of each, then look more deeply at the results.

V1

- 8-second delay
- Popup
- Micro opt-in
- Email collection
- 5 questions
- Coupon code

V2
- 8-second delay
- Popup
- Micro opt-in
- 5 questions
- Email collection
- Coupon code

V3
- 8-second delay
- Popup
- 5 questions
- Email collection
- Coupon code

V4
- No delay
- Full page
- Micro opt-in
- 5 questions
- Email collection
- Coupon code

V5
- 8-second delay
- 5 questions
- Code displayed with required opt-in to activate
- Email collection
- Coupon code activation

V6
- No delay
- 5 questions
- Code displayed with required opt-in to activate
- Email collection
- Coupon code activation

Ok let’s dive in a bit.

The only form that required the email first was the first one, so every other form had 100% data completion by asking questions first (if you collect data first, it will always be complete, because people drop off at the email step).

Even without this, it was completed over 90% of the time.

First we need to normalize the numbers. When people run tests, they rarely normalize; but when you’re dealing with multiple variables, you need to normalize the numbers to get the best estimates.

The way you do this is to keep your percentages and AOV, standardize the views based on those, then extrapolate the subscription, revenue, and order counts.
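As a rough sketch of that normalization (the variant names, rates, and AOV below are made up for illustration, not the actual test data):

```python
# Sketch of normalizing A/B/n results to a common view count.
# Keep each variant's observed rates and AOV, standardize views,
# then extrapolate subscriptions, orders, and revenue.

def normalize(variants, standard_views=1000):
    out = {}
    for name, v in variants.items():
        subs = standard_views * v["sub_rate"]       # extrapolated subscribers
        orders = subs * v["sub_to_order_rate"]      # extrapolated orders
        revenue = orders * v["aov"]                 # extrapolated revenue
        out[name] = {"subs": round(subs, 1),
                     "orders": round(orders, 1),
                     "revenue": round(revenue, 2)}
    return out

# Hypothetical numbers, purely for illustration
variants = {
    "V1": {"sub_rate": 0.10, "sub_to_order_rate": 0.20, "aov": 60.0},
    "V6": {"sub_rate": 0.08, "sub_to_order_rate": 0.25, "aov": 55.0},
}
print(normalize(variants))
```

Once every variant is expressed against the same view count, subscription, order, and revenue numbers become directly comparable.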

Then we look at the variables that we’re testing for.

All the subscription rates are very close to one another, with slight dips on versions 3 and 4.

The highest subscription rate goes to the first form; it also drove the most revenue per subscriber based on the subscription-to-conversion rate.

But it had fewer views because of the delay.

Forms that showed up immediately received about 33% more views.

So now we have to factor this in. On the forms that showed up immediately, telling people the offer with a micro opt-in drove the most revenue but not the most subscriptions; the sign-up-to-conversion rate was pretty high as well.

The big surprise for me was V6 and here’s why.

501 people answered the first question of 1209 total views, which means it’s a free data point from 41.4% of visitors.

That’s high.

354 people answered the second question, or 70.7% of those who answered the first.

307 people answered the third question, or 86.7% of those who answered the second.

282 people answered the fourth question, or 91.9% of those who answered the third.

Ultimately, up until the question asking for an email address, we saw 282 completions against 1209 views: 23.3% of the people that saw this provided us with full data without knowing about any offer.
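Those step-by-step rates are just a cascading funnel; here’s a quick sketch of the math, using the V6 counts above:

```python
# Cascading funnel math for V6: each step's completion rate is
# measured against the previous step; full data against all views.
views = 1209
answers = [501, 354, 307, 282]  # questions 1 through 4

step_rates = []
prev = views
for count in answers:
    step_rates.append(round(count / prev * 100, 1))
    prev = count

full_data_rate = round(answers[-1] / views * 100, 1)
print(step_rates)       # [41.4, 70.7, 86.7, 91.9]
print(full_data_rate)   # 23.3
```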

Now here’s where the big dip happened.

Now of those that saw the step for email, here’s what it looks like.

I probably have a few more tweaks to this one.

27% of people signed up when prompted to provide an email after being shown a discount, which means 73% of people dropped off.

So let’s recap because there’s a lot going on here.

Full page, shown right away, netted complete data from 23.3% of all visitors that saw this form.

That’s down from the 41.4% that provided at least one data point.

But remember this is from all non-subscribed visitors.

So remember how I said at the beginning of this email that I wasn’t sure collecting emails mattered?

This is why.

Comparing to V5

I wanted to see if this was a fluke, so I also looked at V5, since it had an 8-second delay and we want to see if the delay had an impact. It was also not full page, though on mobile it covers the majority of the page.

270 people answered the first question of 639 total views, which means it’s a free data point from 42.3% of visitors. Views dropped by almost half from waiting 8 seconds, without a huge loss on the first answer.

That’s high.

205 people answered the second question, or 75.9% of those who answered the first.

180 people answered the third question, or 87.8% of those who answered the second.

163 people answered the fourth question, or 90.6% of those who answered the third.

Ultimately, up until the question asking for an email address, we saw 163 completions against 639 views: 25.5% of the people that saw this provided us with full data without knowing about any offer.

So the same format with a delay yielded higher percentages but lower absolute counts. But as we can see from the chart above, it had a higher subscription-to-conversion rate, which brings us to V4.

Comparing to V4

Full disclosure: I generally hate full-screen popups on mobile that show up immediately, but for the sake of the experiment, we went with it.

This one had the micro opt-in on it, and that killed 93.3% of the people that saw the form.

66 people answered the first question of 978 total views, which is only 6.7%.

Essentially this format completely killed data collection even showing right away and full screen.

It’s safe to say that micro opt-in with data collection just isn’t the most efficient from a pure data collection perspective.

63 people answered the second question, or 95.5% of those who answered the first.

61 people answered the third question, or 96.8% of those who answered the second.

60 people answered the fourth question, or 98.4% of those who answered the third.

59 people answered the fifth question, or 98.3% of those who answered the fourth.

Ultimately, up until the question asking for an email address, we saw 43 completions against the 66 that got past the micro opt-in: 65% of the people that saw this provided us with full data, knowing about the offer.

Jon what are you trying to tell me?!?

There’s a lot of data.

I’m thinking this is a layered, two-step form situation.

I feel like this is better as a video, but essentially the fall-off at the email step is too big.

So instead, what you do is combine the data from the first form with a higher subscription rate from another point in the journey tied to intent.

At a place where you can just provide an offer in exchange for that email, no data needed, we should be able to get around 20% subscription rates tied to the existing data, based on adding to the cart.

So the way we set this up is for that cart option to show only if the first form has been completed.

Essentially, we can look to game the ~25% data rate we’re collecting from everyone and tie it to profiles after the fact.
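A minimal sketch of that gating logic, assuming a hypothetical setup keyed on an anonymous session id (the names and storage here are mine, not from any specific tool):

```python
# Hypothetical two-step flow: data form first (no email asked),
# email-for-coupon offer later at add-to-cart, joined by an anonymous id.
sessions = {}  # anon_id -> answers collected from the first form

def record_form(anon_id, answers):
    """Step 1: store question answers keyed by an anonymous id."""
    sessions[anon_id] = answers

def should_show_cart_offer(anon_id):
    """Step 2: only show the email offer at add-to-cart
    if the visitor already completed the data form."""
    return anon_id in sessions

def attach_email(anon_id, email):
    """Step 3: once the email comes in, tie the earlier data to a profile."""
    if anon_id not in sessions:
        return None
    return {"email": email, "answers": sessions[anon_id]}

record_form("abc123", {"q1": "skincare", "q2": "weekly"})
print(should_show_cart_offer("abc123"))   # True
print(should_show_cart_offer("zzz999"))   # False
print(attach_email("abc123", "a@b.com"))
```

The point of the gate is that the cart-stage offer only fires for visitors who already gave us data, so every email we collect there comes with a full profile attached.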

But Jon!! What if they just hate giving emails?

This is actually what we’re seeing.

People, given the choice, would prefer not to provide an email.

Data? No problem. An email? That’s where the drop-off happens.

More secondary tests are needed on this along with more data.

Also, things are going to get tricky to keep track of properly: how many people are signing up for a second time (it happens more than you think), how many people are ordering, what’s the value of those orders, should those orders be routed differently, and so on.

At the end of the day, though, a data point from over 40% of all people that see a form is ridiculously high. The cascading drop-off rate is also tolerable.

But it means that this point of intention can be leveraged.

I talked to some smart people last week

There are some people doing some pretty cool things, and I’m excited to see where things land on solving some real problems around data.

The Takeaway

More tests to come.

I continue to have my mind changed daily.

Have a great week!

-Jon

Catch up on past posts: https://ecomwithjon.beehiiv.com/

You can learn from me: jonivanco.com