Many are talking about growth and product analytics, but how many well-established companies are leaving behind their old ways of design-by-committee? In this post I want to highlight a common scenario and promote a more modern approach to product development.
Once upon a time…
I received a heads-up from a UX team about an A/B test they wanted to run for a new page they had designed. My first reaction was “great, you don’t want to just roll out a new page, you want to make sure it has the desired outcome.”
So what was the desired outcome? By digging a bit deeper, I discovered that…erm, there wasn’t one. The “KPIs” had been plucked out of thin air and were not relevant to the page in question.
For example, one of the success metrics for the test was bounce rate. This suggests there was a high bounce rate to begin with, which in turn prompted the redesign. But the page was rarely an entry page, so why was the team redesigning it to reduce bounce rate?
It occurred to me that the team had no idea why they had created this new page and had no clear objectives for the redesign. They were trying to solve a problem that didn’t exist…or did it?
The page in question was the top of a funnel. By digging into the data, we discovered that the drop-off rate from the top of the funnel to the bottom was significant.
Ok, so now we have a problem to solve. And what a great place to start optimising — at the top!
So why were certain decisions made for the redesign?
Hopefully if you’re reading this you’ve already heard of Amplitude. For those of you that haven’t, it is a relatively new mobile and web analytics tool.
On first hearing about Amplitude I was very intrigued. I was desperate to find out what they were doing so differently to their well-established rivals that made them think they could compete. What could they possibly have that the others don’t? In hindsight, the same question could have been asked of other tech companies like Airbnb and Slack (holiday lettings and instant chat? Yeah, because that’s never been done before!).
At a time when ‘growth’ is the buzzword of the tech start-up scene, Amplitude have cleverly positioned their solution as a growth tool and moved away from being simply a marketing/reporting tool, which is where many of their competitors come from.
With regards to the use of analytics, companies tend to fall into one of two buckets.

The first bucket:

Reporting vanity metrics (sessions, page views, avg. time on site) back to the business, with minimal actionable insights, superficially measuring growth

Creating loads of dashboards that no one looks at

The second bucket:

Proactively analysing and identifying problems and opportunities

Understanding user behaviour — how do users fall in love with the app/site, and what keeps them coming back again and again?
I’ve seen vanity analytics a lot and it’s a difficult trend to break. What do you do when you have an exec asking you “how many page views has page x had?”
There is no doubt that vanity analytics is a cultural issue for a lot of organisations, but it does feel like many analytics tools on the market haven’t helped this mindset. Perhaps it’s legacy features that have left some vendors behind, or maybe the analytics vendors are simply giving their customers what they want. Either way, Amplitude have taken the decision to do things differently.
Amplitude is a product insights tool first and a reporting tool second. It gives product teams easy access to insightful, actionable data, without the need to run any complex SQL queries. The straightforward event tracking gives lots of flexibility for manipulating the data, with plenty of helpful out-of-the-box charts. The most impressive feature is a predictive tool called ‘Compass’, which analyses behaviours that are closely correlated with retention.
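To make the idea of event-based tracking concrete, here is a minimal sketch of what an analytics event typically looks like under the hood. The event name, properties, and the helper function are purely illustrative assumptions, not Amplitude’s actual API; in a real integration you would pass a name and properties object to the vendor SDK’s log-event call.

```javascript
// Illustrative sketch of an analytics event payload (hypothetical shape,
// not Amplitude's real API): a named action plus arbitrary properties.
function buildEvent(eventType, properties) {
  return {
    event_type: eventType,         // what the user did
    event_properties: properties,  // context about the action
    time: Date.now(),              // when it happened (ms epoch)
  };
}

// Example: a media app recording a playback action.
const event = buildEvent('Song Played', { genre: 'jazz', source: 'search' });
```

Because every event carries its own properties, charts can slice behaviour by any dimension without predefined page-view reports, which is what makes this model so flexible.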
Amplitude are in the privileged position of having been able to build their product from scratch and make it relevant for the twenty-tens, without any noisy legacy features. Many of their competitors, by contrast, have been around for a comparatively long time and were designed for the web. When mobile came along they adapted their tech for native apps with what appears to be little innovation, using the same tracking methods and reports as web.
Amplitude is a mobile-first tool, built to answer the increasingly difficult question of ‘how the hell do I retain my native app users?’
It is clear that Amplitude have created a product that its customers have been (knowingly or unknowingly) crying out for (helped by the story behind the birth of the company), and they have reached this point in their journey by listening intently to their users. They have put into practice the very culture they promote to their prospects and existing customers: making smarter, data-led product development decisions.
Amplitude certainly know their industry and audience. This is also demonstrated by their content marketing — it is some of the best content I’ve seen. As an example, one of my early reads of their blog was Behavioral Cohorts: find your most engaged users. It’s heavily focused on the capabilities of their product, but it did the job of getting me thinking about the holes in our existing tool.
I also encourage you to read their Product Analytics Playbook. This is a masterclass in content marketing. It’s not just hot air; it gives a framework for developing a retention strategy with data. It demonstrates how in-tune they are with their market.
A little while ago I met their Co-founder and CEO, Spencer Skates. In the meeting he presented some key elements from the playbook and demonstrated how his team is applying them to their own retention strategy. I enjoyed Spencer’s transparency, and appreciated his understanding of what companies like mine are looking for.
Amplitude is an extremely exciting company. Of course the product is not perfect, but their appetite for customer feedback and pace of development can only benefit their product iteration, and in turn their customers. I personally want a partner that lets us lead them, not the other way round. My hope is that as they grow in size, and eventually move away from ‘start-up’ status, they can maintain their way of working.
If you are in the market for a new web analytics tool and Amplitude are on your shortlist, please feel free to contact me for some impartial feedback.
(My company has recently migrated to Amplitude. Amplitude have had no involvement in the writing of this post).
Implementing a web analytics tool is time-consuming and expensive. In three years my company used three (yes, three!) third-party web analytics tools.
Can you imagine the amount of time that was wasted on implementing the tools, and what little time was spent on effectively using them!?
Just to set the scene: when I began working at the company, I inherited a third-party web analytics tool which I had successfully used in my previous job. Unfortunately, the inherited tool was badly implemented, with no supporting documentation. No one in the company knew how to use it and we had received minimal support from the vendor. I cannot stress enough how much my colleagues HATED the tool — from graduates right up to execs, it was despised.
How Not to Choose a Web Analytics Tool
Shortly after joining the company we signed a contract with another third party web analytics tool to replace the despised one. It was cheaper and deemed ‘better’ by senior management. Unfortunately, no due diligence was carried out to assess the requirements of the business, to find a suitable fit. It was seen as a quick fix for the other (failing) tool.
One full custom implementation and two years of struggling to fulfil our data requirements later, the unanimous conclusion was that we couldn’t carry on as we were — the ‘better’ tool was also failing.
So I led the search to find a new web analytics tool, and this time I wanted things done properly.
As a result of my rollercoaster experience I feel well placed to offer some tips on how to choose the right analytics tool. There are a number of web analytics tools out there to choose from, so here goes…
Implementing an A/B testing tool is the painless part of embracing Conversion Rate Optimisation (CRO). Perhaps you have a shiny new A/B testing tool implemented on your site and you don’t know where to start. Or maybe you’ve heard everyone else is doing it (not to mention the awesome benefits), and you want to kick off the conversation with your team/boss.
Whatever tool you have chosen or you are exploring, it is important to understand that it is just an enabler. Without people and process, the tool is useless. So here is a quick guide to Conversion Rate Optimisation (CRO).
App Store Optimization is the art of getting your app visible in the app store. Much like SEO, it is about getting your app ranking high for certain search terms, and there are various factors which impact your ranking. I’m not going to attempt to cover all of these factors, but I wanted to share one element which I recently had some success with – store ratings and reviews.
Beyond the fact that apps with higher ratings rank higher for competitive keywords, it is clearly good for raising your profile and could potentially help drive your download conversion rate.
App Store Optimization Campaign – ask your users to leave you a review
As consumers we are most likely to leave a review or give feedback for a service or product that’s been really bad. When we are mad and/or frustrated we often take to the internet to voice our frustration — it can be therapeutic! On the flip side, we might also feel inclined to leave a review when we’ve had an amazing experience. However, the majority of positive experiences go unmentioned.
Not every single user is going to love your app (shocking I know). If you have a decent amount of repeat traffic, it’s safe to assume you have a pool of users who are getting value out of your app and therefore might be willing to spare you a couple of minutes of their time to leave you a review. For the sake of this exercise, these are the people you want to identify and target.
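One way to make this targeting concrete is to gate the review prompt behind simple engagement thresholds, so only users who demonstrably get value from the app ever see it. The sketch below is illustrative only: the threshold values and the shape of the `user` object are my own assumptions, not any particular analytics tool’s API.

```javascript
// Sketch of engagement-gated review prompting. The thresholds and the
// `user` fields are hypothetical assumptions for illustration.
function shouldAskForReview(user) {
  return (
    user.sessionCount >= 5 &&       // a repeat visitor, not a first-timer
    user.daysSinceInstall >= 14 &&  // past the novelty period
    !user.hasBeenPrompted           // never nag the same user twice
  );
}

// An engaged long-term user passes the gate; a brand-new one does not.
const engagedUser = { sessionCount: 8, daysSinceInstall: 30, hasBeenPrompted: false };
const newUser = { sessionCount: 1, daysSinceInstall: 2, hasBeenPrompted: false };
```

In practice you would feed these fields from your behavioural analytics data (session counts, install date) and record the prompt itself as an event, so you can measure how the targeted prompt affects your rating over time.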
Out of the box, third party analytics tools give you lots of data about your audience, but you will quickly reach a point where you don’t have all the data you need, specific to your business. I’ve worked with various analytics providers, and I have learnt the hard way what can easily go wrong with custom implementations, especially when it comes to making sense of the data. Page analytics tracking is one area where I have run into issues in the past.
(My definition of ‘page analytics’ here is the analysis and reporting of traffic sources and behaviour flows, e.g. which site referred a user to a specific page and which pages the user visited next.)
The purpose of this post is to give some tips for customising page tracking on your website or app, to make the reports as useful as possible and to avoid having to re-implement tracking.
I focus specifically on Google Analytics, but the principles can be applied to other tools, so please don’t be put off if you’re using one of Google’s competitors – it’s still relevant.
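As one concrete example of the kind of customisation I mean: dynamic URLs (order numbers, usernames) can fragment your page reports into thousands of one-hit rows. A common fix is to normalise the path into a stable ‘virtual’ page path before sending it to your analytics tool. The helper below is a sketch; the URL patterns belong to a hypothetical site, and the normalised result would be what you pass as the page path in your tool’s page-view call.

```javascript
// Sketch: collapse dynamic URL segments into stable virtual page paths
// so page reports aggregate properly. The patterns are hypothetical.
function normalizePagePath(path) {
  return path
    .replace(/\/order\/\d+/, '/order/:id')      // /order/48291 -> /order/:id
    .replace(/\/users\/[^/?]+/, '/users/:name') // /users/jane  -> /users/:name
    .split('?')[0];                             // drop query strings
}

// normalizePagePath('/order/48291?ref=email') -> '/order/:id'
```

With this in place, your reports show one row per page template rather than one row per order or per user, which makes behaviour flows far easier to read.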
Creating a testing culture in an organisation that is unfamiliar with the concept is not easy, especially in large companies. However, when you do eventually make an impact (which can be extremely rewarding), it can begin to snowball: lots of people getting wind of the benefits of testing and wanting a piece of the pie. It is fantastic to reach this point, but it can come with its own problems.
Depending on how you’re structured, very quickly you can have a queue of business owners wanting to run tests on their areas of the product. If you and your team are in charge of testing, then it is your responsibility to maintain law and order. It’s all well and good implementing a shiny new testing tool, but your job should not just be about becoming the gatekeeper for this tool — it is so much more than that.
It is very easy to get carried away and launch a new test the moment another one finishes, because you have a Business Owner bugging you, asking “when will my test be prioritised?” Before you know it, you have run a multitude of tests without any time to breathe before or after each one. This may not be the most effective way to run your testing program. Here are a few points to be aware of: