UX teams must use data: to identify a problem, to find a solution, and finally, to quantify and tweak their efforts.
Not so long ago I received a heads-up about an A/B test a UX team wanted to run for a new page they had designed. My first reaction was “great, they don’t want to just roll out a new page, they want to make sure it has the desired outcome.”
So what was the desired outcome? By digging a bit deeper into this planned test, I discovered that... erm, there wasn’t one. The “KPIs” had been plucked out of thin air and were not relevant to the page in question. For example, one of the success metrics for the test was bounce rate. But this page is rarely an entry page, so why redesign it to reduce bounce rate? That assumes there was a high bounce rate to begin with, which in turn prompted the redesign. This was not the case: clearly, no one had looked at any data to gather insights.
It occurred to me that the team had no idea why they had created this new page and had no clear objectives for the redesign. They were trying to solve a problem that didn’t exist...or did it?
The page in question is the top of a funnel. By digging into the web analytics data, I discovered that the drop-off rate from the top of the funnel to the bottom was actually fairly significant. OK, so now we have a problem to solve. And what a great place to start: at the top. By chipping away at each step of the funnel, we can help drive the overall conversion rate. So the clear KPI here has to be the step 1 to step 2 conversion rate. But this isn’t why the new page was created. So why were certain decisions taken on the layout and design of the page?
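That step-to-step drop-off is easy to quantify once you export the numbers. Here is a minimal sketch, assuming hypothetical page-view counts per funnel step (the step names and figures are invented for illustration, not taken from the case above):

```python
# Hypothetical page-view counts per funnel step, e.g. exported
# from your web analytics tool. Names and numbers are invented.
funnel = {
    "step 1: landing": 10_000,
    "step 2: details": 4_200,
    "step 3: payment": 3_100,
    "step 4: confirmation": 2_900,
}

# Conversion rate between each consecutive pair of steps.
steps = list(funnel.items())
for (name_a, views_a), (name_b, views_b) in zip(steps, steps[1:]):
    rate = views_b / views_a
    print(f"{name_a} -> {name_b}: {rate:.1%}")
```

With numbers like these, the step 1 to step 2 conversion rate (42% here) is the obvious KPI to target first, because that is where the biggest share of users is lost.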
This might seem like fairly obvious stuff to some of us working in the world of digital, but I think it is really easy for areas of the business to lose sight of why they are creating something, especially when it involves jumping into a web analytics tool they are not necessarily familiar with, and trying to make sense of the data.
That’s why a smart web analytics implementation and a dedicated analytics resource are so important, and why they need to be used across the business, especially by teams responsible for designing new pages.
I should clarify that there is nothing wrong with testing a new design just because your team feels a particular page needs a refresh. In fact, I would actively encourage this. However, it is important to know why you are changing or updating certain elements of the page. If you launch a test where you have changed multiple elements on a page, and you see positive results from the new version, how do you know which changes drove the uplift? It’s a great headline to share with the business: “We tested this shiny new page and saw a 12% increase in sign-ins.” High fives all round... BUT which element of the new design drove this uplift? Was it the new title, the new CTA on the button, the larger fields...? These are all valuable questions you could and should have answers to.
Why does it matter?
To get the most out of your testing program, it is useful to view it as another analytics tool. Your web analytics data gives you insight into what your users are doing on your site, and from this you can make assumptions about why they are behaving in a certain way. You then test these assumptions (otherwise known as hypotheses) to find answers.
So if your team wishes to refresh a page, break down the various elements you want to redesign and test them individually. Slowly you can begin to gather valuable insights that will feed into your redesign as well as future changes. The alternative is to run a multivariate test, where you test multiple elements at the same time, but I’m not going to cover that here.
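Testing one element at a time also keeps the stats simple. As a sketch, a standard two-proportion z-test (using only Python's standard library) tells you whether a variant that changes a single element, say just the CTA copy, converted differently from the control. All the numbers below are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert
    differently from control A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: control vs a variant changing only the CTA copy.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}")  # roughly, |z| > 1.96 means significant at the 95% level
```

Because only one element changed, a significant result tells you exactly which change drove the uplift, which is the answer the multi-element redesign above could not give.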
I’m always slightly conscious that the idea of data feeding into design can be a sensitive topic amongst some teams. Some design and UX folks may feel that it hinders their creativity. In fact, I believe the exact opposite is true: in the world of digital, data and design complement one another and actually help everyone do their job. When fully embraced, design and data can produce truly rewarding results for everyone.
- Know why you are redesigning a page - what problem are you trying to solve?
- Have data to back up your new design
- Test whatever you want to test, but know what you are testing and why
- To gather insights, break the elements of the page down into bite-size pieces
- Back up your assumptions with a sound hypothesis
- Treat your testing program like any other analytics tool. Use it in conjunction with your web analytics, to find answers to important questions. Optimizely, Maxymiser and Adobe Target are effectively all web analytics tools