Trusting Your Gut in CRO: Yay or Nay?
Many CRO experts will warn you against trusting your gut when it comes to experimentation. After all, conversion rate optimisation is a practice heavily focused on data, numbers, and statistics. However, there is an argument for following your gut instincts: research has shown that our brains subconsciously notice changes in patterns, and these instincts have proven to be valid and sometimes lifesaving.
True, utilising your innate intuition when running experiments is unlikely to save lives, but I believe there’s a place for listening to what your gut is trying to tell you in the world of conversion rate optimisation.
How many hours do you spend on the internet a day? Up to 6 hours? 6–10 hours? 10+? Be honest: we spend almost all our waking day online browsing websites, and so do many of your colleagues and users.
All this browsing means our brains have learnt to recognise patterns on websites — think big clickable CTAs, ‘Contact Us’ in footers, and supplying address information before payment. When information isn’t where you expect it to be, it’s frustrating, and that emotional reaction is your gut responding to the unexpected pattern. Harnessing that emotion to spot broken patterns like these is a marketable skill: it is the foundation of ‘heuristic evaluations’.
You’re not the only one with that skill; every regular internet user is an expert on web patterns. Those colleagues who come to you with a hypothesis? They must have experienced something to trigger that idea: maybe they saw a competitor do something differently, or found an experience frustrating. It’s then your job to help gather evidence to support and prioritise that idea.
But following an individual’s gut instinct alone is not enough. Evidence is still required to determine whether an idea is worth testing at all. This is where data steps in: for example, user testing, where users put voice to their own gut instincts, or session recordings of customers getting lost on the same webpage. When multiple users struggle at the same point, you can be confident there’s a problem.
Many CRO specialists fall into the trap of simply sharing numbers when an experiment concludes: this change resulted in a 2% downlift, or we saw an increase of £25,000 over the course of the experiment.
For some stakeholders, data like this is all they require to make decisions, but in my experience, stakeholders like to know why an experiment won or lost. Understanding their customers is vital to having a successful business.
To be a great CRO specialist we must learn to interpret those results, not only to give recommendations, but also to tell a story. That interpretation requires us to trust our gut because we do not always have the complete picture when we do our analysis.
For example, how many of your experiments have failed when everyone in the business expected them to succeed? I’ve seen this countless times. The pre-experiment data suggested the variation would win, but when tested it did not — everyone wants to know why.
Using whatever data is available from that experiment, you must then piece together a story. You may have the numbers and a handful of recordings, but you cannot ask every individual why they did what they did, so you have to use intuition to fill in the blanks. Including the story of an experiment’s success or failure is important because it humanises the results and can help generate the next iteration of the experiment.
In the world of experimentation, the conventional wisdom is that data is king and gut instincts are the enemy. But in the right context, intuition is a key part of the CRO process.
Just as in real life, where our subconscious brain sends hormones coursing through our body in reaction to an unusual situation, those same hormones fire when we encounter broken patterns on websites. Trust your gut if something seems off about a user flow, then do the research to verify the problem.
Finally, statistics cannot tell us why users behave a certain way. User testing highlights problems, but often, especially with failed experiments, we must use our intuition to piece together bits of data to tell a story so we can improve next time.