Conversion Rate Optimization (CRO) Research Review
I received a "Growth Marketing Mini-degree" scholarship from CXL Institute, and I will be writing a weekly review of what I learn along the journey.
Here's my review for week 2.
This week, Peep Laja covers the research you should do before starting "Conversion Rate Optimization" (CRO).
Let's start with some key points:
Where is CRO in Growth Marketing?
- In general, you should treat two aspects of business growth separately: acquisition & activation on one side, and conversion & retention on the other. Start by focusing on acquisition; after that, shift your focus to conversion.
- Sure, you can work on both areas at some point, but even then you should have a dedicated team for each area.
When should I use CRO?
- If your website has a minimum of 1,000 conversions/month, then you can use CRO. If you have less than that, CRO is not right for you yet. But why?
- With fewer than 1,000 conversions you don't have enough data to work with; the sample size is too small to test on, so you won't be able to determine with confidence which variant is the winner. Is it A or B?
- So if you have fewer than 1,000 conversions, don't do CRO. Do user testing instead, do your research to understand your audience, implement, take risks, and find your way to 1,000 conversions.
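To see why low conversion volume makes A/B testing impractical, here is a rough sketch (my own illustration, not from the course) using the common rule-of-thumb sample-size formula n ≈ 16·p(1−p)/δ² for a two-variant test at roughly 95% confidence and 80% power:

```python
import math

def required_sample_per_variant(baseline_rate, relative_lift):
    """Rule-of-thumb visitors needed per variant for an A/B test
    (alpha = 0.05 two-sided, power = 0.8): n ~= 16 * p * (1 - p) / delta^2,
    where delta is the absolute lift you want to detect."""
    delta = baseline_rate * relative_lift
    return math.ceil(16 * baseline_rate * (1 - baseline_rate) / delta ** 2)

# With a 3% baseline conversion rate, detecting a 10% relative lift
# takes tens of thousands of visitors per variant - far more traffic
# than a site with under 1,000 conversions/month can gather quickly.
print(required_sample_per_variant(0.03, 0.10))
```

Note how the required sample shrinks when you aim for a bigger detectable lift, which is why low-traffic sites can only reliably detect drastic changes.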
I am qualified to do CRO, now what?
- Now that you meet the minimum requirements to start CRO, how should you go about it? Find an idea you think will double your conversions and start implementing it? Brainstorm optimization ideas with your team and test them one by one? Here is the answer:
"If you can’t describe what you are doing as a process, you don’t know what you’re doing"
– W. Edwards Deming
- If we break down the success formula for optimization, it looks like this:
- The first point is easy: keep testing all the time. But what about the last 2 points? How do I know if an idea is the right one to test? How do I choose the problems I want to solve? That's what makes the difference between an expert and a novice.
- Peep Laja introduces the "ResearchXL framework": an epic framework to help you come up with the "problems to test" that are most likely to get you wins, and to classify and prioritize those problems, starting with the most promising ones.
ResearchXL framework:
- The "ResearchXL framework" has 7 steps:
1. Heuristic Analysis
2. Technical Analysis
3. Qualitative Surveys
4. User Testing
5. Web Analytics Analysis
6. Mouse Tracking Analysis
7. Copy Testing
1. Heuristic Analysis:
- This is an experience-based assessment, where a group of different experts (optimizers, designers, usability specialists, copywriters, and janitors) sit together to assess each page of the website against a set of criteria in a very organized, structured manner.
- The main advantage of heuristic analysis is speed. Its outcome is not guaranteed to be optimal (it's a kind of educated opinion), but it's still good enough, as it is done by experts in a structured manner to avoid bias.
- The outcome of this analysis is a list of "areas of interest", and the next steps of the framework will validate or invalidate these findings.
- The criteria used to assess the website in the heuristic analysis are:
* Relevancy: Does the page meet user expectations (content and design)?
* Clarity: Is the content/offer on the page clear?
* Value: Is the page communicating value to the user? Does it give motivation?
* Friction: Is anything causing doubts, hesitations, or uncertainties?
* Distraction: Is anything on the page not helping the user take action, or drawing attention away from it?
2. Technical Analysis:
- Technical problems are your main conversion killer.
- Here you conduct cross-browser and cross-device testing and check whether conversions suffer on any particular browser or device.
3. Qualitative Surveys:
- Qualitative surveys can be on-page surveys that ask users what is holding them back from taking the action intended for each page, whatever that action is: making a purchase, signing up for a newsletter, filling in a form, etc. Choose carefully when the question shows up; it can be before the user quits or after they have spent a certain number of seconds on the page. You can also look up the average pages/session for converting users, then show the survey to those who exceed that average without converting.
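That last targeting rule can be sketched as a simple predicate (a hypothetical helper; it assumes you already know the converters' average pages/session from your analytics):

```python
def should_show_survey(pages_viewed: int,
                       avg_pages_for_converters: float,
                       converted: bool) -> bool:
    """Target the on-page survey at visitors who browse more than the
    average converting user does, yet still haven't converted."""
    return (not converted) and pages_viewed > avg_pages_for_converters

# A visitor on their 6th page (converters average 4.2) with no conversion:
print(should_show_survey(6, 4.2, False))  # True
```

Visitors matching this rule are clearly interested but stuck, which makes their answers about friction especially valuable.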
- Email surveys are surveys you send to users after they leave the website, with the main purpose of discovering sources of friction on it. This kind of survey should be sent quite soon after the purchase (1-7 days), or the user will forget how things went.
- The sample size for surveys should be no fewer than 200 respondents, or the findings won't be meaningful.
4. User Testing:
- User testing is the process of observing how actual people use and interact with your website while they comment on their thought process out loud. You bring them to the testing area with your website open on a PC and ask them to complete a certain task (find an item and add it to the cart, make a purchase, etc.) while describing what they do out loud. Then you observe how they interact with your website and how they describe their actions.
- Google Analytics tells you which pages have issues and where conversions leak, but it does not tell you why. That's the power of user testing.
- The main benefit of user testing is identifying "bottlenecks", so make sure the people you test with are not actual users who are already familiar with your website; familiarity hides the friction a first-time visitor would face.
- When conducting user testing, be aware of the biases your testers might have. An important one is the tendency to please you, since you are paying them to participate: a user might be having a horrible experience while describing it as a great one. So always pay more attention to the experience than to the words, and keep in mind that "a user who is taking a survey is not like the user who is gambling with his hard-earned money".
5. Web Analytics Analysis:
- The goal of this analysis is to make sure you are collecting all the data you need, and that it's collected from sources and by methods you can trust.
- Nearly all analytics configurations are broken. As Peep put it: "90% of setups I’ve come across either have insufficient tracking or broken configurations".
- Your analytics checklist should include: AdWords setup, Google Search Console linking, internal traffic filtering, URL setup, subdomain and TLD setup, goal configurations, etc.
6. Mouse Tracking Analysis:
- Mouse tracking has different types: heat maps/hover maps, click maps, attention maps, scroll maps, and user session replays; it can also include form analytics.
- Heat maps do not tell you a lot, as users do not necessarily look where they hover, but they are good for reporting and getting buy-in from executives.
- Click map data is also tracked in analytics, but the best thing about click maps is that they show clicks on unclickable areas, which may suggest improving the communicated value or moving the buttons.
- Attention maps determine which parts of your webpage people pay the most attention to. They indicate which content your visitors read the most and which they just skip over, so you can improve your content and design.
- Scroll maps determine how far users scroll, so you can prioritize your most important content to appear before users stop scrolling. They also help you make adjustments that trigger users to scroll further.
- User session replays are simply recordings of actual user sessions. They can give you more insight into why some pages are leaking traffic. The difference from user testing is that a replay shows an actual user interacting freely, who doesn't know they are being tracked and so is not biased, whereas in user testing the user performs a task you assigned and knows they are being observed.
- You need a sample size of around 2,000-3,000 pageviews per screen design for the mouse tracking results to be reliable.
7. Copy Testing:
- In copy testing, you put your copy in front of your audience and ask them questions about it.
- Copy is twice as important as design when it comes to conversion.
- 20-50% of users fail to successfully complete a purchase because of incomplete or unclear product information.
Now you have a list of all the possible problems you can work on. The next step is to organize them by putting each one into one of 5 buckets, as follows:
- Hypothesize: You have a problem but don't know the exact reason behind it, so you need to develop different hypotheses and test them to find the right one, then start testing possible solutions.
- Test: You have a valid hypothesis for a problem but still need to test different solutions to solve it.
- Just Do It: The fix is easy to identify; a no-brainer.
- Investigate: You need more information, as the problem you identified may not actually be a problem.
- Instrument: You need to fix your configuration, e.g. your analytics reporting.
After sorting the problems, give each one a priority score from 1-5 according to:
- Its potential lift or the revenue it would generate.
- The number of users who are exposed to the issue.
- Ease of implementation.
Now, work on them one by one starting with the one with the highest score.
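This scoring step can be sketched in a few lines (the issues and their 1-5 ratings below are made up for illustration):

```python
# Hypothetical issues with made-up 1-5 ratings on the three criteria:
# potential lift, users exposed, and ease of implementation.
issues = [
    {"name": "Unclear value proposition", "lift": 4, "reach": 5, "ease": 2},
    {"name": "Checkout broken on Safari", "lift": 5, "reach": 3, "ease": 4},
    {"name": "Missing goal tracking",     "lift": 2, "reach": 3, "ease": 5},
]

def priority_score(issue):
    """Total the three 1-5 ratings into one priority score."""
    return issue["lift"] + issue["reach"] + issue["ease"]

# Work through the list starting with the highest total score.
for issue in sorted(issues, key=priority_score, reverse=True):
    print(priority_score(issue), issue["name"])
```

Summing the ratings is the simplest possible weighting; you could also weight lift and reach more heavily than ease if quick wins keep crowding out high-impact fixes.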
Notice: all you have done so far is answer the question
"What problem should I solve first?"
But you still need to prioritize the different solutions for each problem. You need to answer the question
"For the problem X that I am trying to solve, which solution should I try first? A, B, C, D, etc.?"
The "PXL test prioritization framework" will help you answer that last question, and it's part of what will be covered next week.
Stay tuned!
A final thought about this week:
The frameworks presented here are not the only ones out there; they are Peep's own frameworks that have worked for him. Still, the logic behind them makes total sense, and they help a lot in structuring the process, leaving little room for subjectivity.
#growthmarketing #growthhacking