How we measured change aversion with our new home screen


My name is Katherine and I’m the Data Scientist in the App Evolution squad, working on our home screen. As you’ll most likely have noticed (and probably had feelings about!), we’ve had a bit of a makeover.

A lot has changed since the last redesign - we’ve launched Monzo Plus, Monzo Premium, Monzo Flex and Instant Access Savings Pots, we’ve expanded our budgeting tools with Trends, and now more and more customers are managing their non-Monzo accounts in the app. 

That is by no means everything, but it was enough to make us realise our features had outgrown the app. It was way too hard for customers to get the most out of Monzo because they couldn’t discover things or easily see their full financial picture. We redesigned the Home screen to make it easier for you to keep track of all the things you have going on in Monzo today, and to make space for the future. You can read more about our new home in a previous blog post.

It was one of the biggest design changes we’ve ever shipped and we had to be confident in this new home we were bringing to customers.

I’m here to tell you how we did that.

For the last year or so, my primary focus has been on experiments to test out different iterations of our new home screen. We have a strong ‘test and learn’ culture at Monzo but this experiment was a bit different. 

Why are you so special?

We’re constantly shipping changes to our app to improve the experience for our customers, whether it’s making small improvements to an existing flow or launching a new feature. Most of these changes are, intentionally, not disruptive. Maybe we change a colour, or maybe we make it easier to sign up for a given product, but in most cases you - the customer - are unaffected and can do what you came to do in the app.

But with this change… it would be pretty difficult not to notice!

As I mentioned in our How we built the new home screen blog, new and existing customers have different needs and should be measured with different metrics. But for everyone, the primary goal of the redesign was to do no harm. So it was important to set up experiments that addressed this from the perspective of Monzo as a business (how many people find Flex? How many upgrade to a Paid account?) as well as the perspective of our customers and how they use the app to manage their finances.

In other words, we didn’t want to make it harder for you to complete the core banking jobs you open the app to do. That’s why we measured everything from the number of times a certain action happened to the number of seconds it took to complete, e.g. paying somebody.
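To make that concrete, here’s a minimal sketch of the kind of measurement involved, using pandas on a hypothetical event log (the event names, columns and data are illustrative, not our actual analytics schema):

```python
import pandas as pd

# Hypothetical event log: one row per analytics event.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2],
    "event": ["start_payment", "payment_sent", "start_payment", "payment_sent"],
    "timestamp": pd.to_datetime([
        "2023-05-01 09:00:10", "2023-05-01 09:00:42",
        "2023-05-01 12:30:00", "2023-05-01 12:31:05",
    ]),
})

# How many times did the action happen?
n_payments = (events["event"] == "payment_sent").sum()

# How many seconds did it take? Pair each start with its completion
# (simplified: assumes one clean start/finish pair per user).
starts = events.loc[events["event"] == "start_payment"].set_index("user_id")
ends = events.loc[events["event"] == "payment_sent"].set_index("user_id")
seconds_to_pay = (ends["timestamp"] - starts["timestamp"]).dt.total_seconds()

print(n_payments, seconds_to_pay.mean())  # 2 payments, 48.5s on average
```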

Okay we get it, you’re special - but why does that matter?

If you haven’t heard of change aversion, you’ve almost certainly experienced it (Opal Fruits, anyone?). Change aversion is defined as “negative short-term reaction to changes in a product or service”. Jakob Nielsen put it succinctly in his talk: Users Hate Change.

[Image: TikTok creator Brittany Broski’s meme - disgusted face on the left, pondering face on the right]

Change aversion is not something we’d typically need to consider in our experiments because, as mentioned, most of the changes we ship are minor enough to provoke negligible reactions. But this was a big change where we’d introduced a whole new design and layout, in which many typical customer tasks would have changed.

Without factoring a period of aversion into this change, we’d be at risk of assuming the impact we were seeing was representative of the longer term when in fact it wasn’t.

There are lots of brilliant resources, such as this article, that explain the concept in detail, so I won’t cover the theory here. I will, however, focus on how we identified aversion in our data and how we built it into our experiments. Since new customers have no prior experience of the app, we only needed to worry about change aversion for our existing users.

But how do you measure change aversion?

Let’s go back to our definition: “negative short-term reaction to changes in a product or service”.

Hmm. But how long is ‘short-term’? How would we identify a reaction? How would we know if we were right?

To answer these questions, we needed a metric to track customer reactions and we considered two options:

  • Transaction activity

    • Was everyone so frustrated by their new app layout that they stopped using Monzo immediately? 

    • This seemed unlikely. People typically budget a certain amount of effort for a task, and if the effort required climbs above that expectation, motivation is lost and the task is abandoned. Also, banks are a core part of our financial lives and, since most spending can happen without users even needing to open the app, it’s not a metric we’d expect to see an immediate impact on. Transaction activity remained one of our core experiment metrics, but it was unsuitable for measuring change aversion.

  • Session time

    • Was everyone so confused by their new home screen that it took them a lot longer to do whatever they opened the app for? 

    • This seemed more relevant to our change. Our existing customers have established behavioural patterns and default pathways for completing core jobs like topping up a pot, paying somebody or finding their card details. As we make changes to the app we interrupt these behaviours and we have to give users time to establish new ones. 

So we chose session time as the metric by which to observe change aversion. But at the time of our first experiment we still didn’t know how long ‘short-term’ was, so we made a conservative estimate of 30 days and added this to our timelines. The downside to this was that our experiment ran for a loooong time, but the upside was that we had enough time to observe the metric and validate that change aversion could be identified via a change in app session time.
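As a rough illustration of what we tracked, here’s a minimal pandas sketch of the difference-from-control curve. The data, variant labels and column names are invented for the example (chosen to echo the shape described below); the real pipeline is more involved:

```python
import pandas as pd

# Hypothetical per-session records: experiment variant, days since assignment
# (negative = before the change), and session length in seconds.
sessions = pd.DataFrame({
    "variant": ["new_home", "control"] * 3,
    "day":     [-1, -1, 2, 2, 14, 14],
    "seconds": [40.0, 40.5, 46.6, 42.0, 43.0, 42.5],
})

# Average session time per variant per day, then the difference from control -
# the curve we watch for a change-aversion spike and its recovery.
daily = sessions.pivot_table(index="day", columns="variant",
                             values="seconds", aggfunc="mean")
daily["diff_from_control"] = daily["new_home"] - daily["control"]
print(daily["diff_from_control"])  # spike after day 0, settling by ~day 14
```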

And did it?

[Line graph: difference from control in average session time (seconds) vs. days before/after the change]

It sure did! Above, you can see the difference in average session time between our test (New homepage) and control (Original homepage) variants. Before assignment, both variants were within 1 second of each other (that’s good, it means our variants weren’t biased). Immediately after the change we observed an increase in average session duration (+11% versus the control by day 2); the curve then decreased, recovering by day 14 and remaining slightly above the control.

Now, it’s worth saying that this could be either change aversion (I don’t like this! What is it! My eyes!) or change attraction (ooh, shiny and new, what do we have here?). But for our experiment, differentiating between the two wasn’t important: either one could have led us to the wrong conclusion had we not waited for it to play out.

We also had an in-app survey for users in our experiment, which we segmented by whether the feedback was 😊 or 😡.

Regardless of feedback type, all customers experienced a change to their session time (as sketched after the chart below):

  • Users who provided negative feedback had a more extreme reaction, with a higher peak.

  • Users who provided positive feedback ended up with a shorter average session time than pre-experiment.

  • All users followed the same behavioural curve, beginning to level out around day 14.

[Line graph: difference from control in average session time (seconds) vs. days before/after the change, split by customer feedback]
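Here’s a minimal sketch of that segmentation, again with invented data and column names, assuming survey responses can be joined to session records by user:

```python
import pandas as pd

# Hypothetical survey responses (😊/😡) and per-day session times.
feedback = pd.DataFrame({
    "user_id":   [1, 2, 3],
    "sentiment": ["negative", "positive", "positive"],
})
sessions = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3],
    "day":     [2, 14, 2, 14, 2, 14],
    "seconds": [70.0, 52.0, 50.0, 38.0, 48.0, 40.0],
})

# Tag each session with that user's survey sentiment, then compare the curves.
curves = (sessions.merge(feedback, on="user_id")
                  .groupby(["sentiment", "day"])["seconds"]
                  .mean()
                  .unstack("day"))
print(curves)  # negative feedback -> higher peak; both level out by ~day 14
```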

We didn’t just rely on data from our experiments though. One of my favourite things about working at Monzo is the collaboration that happens across different disciplines, and that was more true than ever with this work. 

We ran user research alongside our experiments to give us a comprehensive understanding of the immediate and 30 day behaviour changes. In addition to the in-app survey mentioned above, we also ran a 30 day diary study with a range of existing customers. This included vulnerable customers and those we identified as most at risk from the changes we had made. The research mirrored the real experience all customers would have when enabling the new home screen, meaning those taking part were able to give us in-the-moment feedback over the 30 days, from initial reactions to their experience on payday and beyond.

Combined with other qualitative and quantitative data, we were able to understand the risk of rolling the change out and how we could help customers smoothly transition, reducing any sense of fear or anxiety.

So what did we learn?

Change aversion doesn’t always need to be factored into experiments. It can be difficult to identify, it can slow down timelines and it’s not always relevant. But if you’re introducing a change your users will notice, it’s worth considering how they will adapt to the disruption before measuring the impact on your metrics. In order for us to be confident that our results reflected the longer-term new experience and not just the short-term reactions to it, we measured the impact on our metrics after day 14.
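In practice, that can be as simple as excluding the adjustment window before computing your results. A minimal sketch, assuming a 14-day settling period and an invented metrics table:

```python
import pandas as pd

AVERSION_WINDOW_DAYS = 14  # settling point observed in the session-time curve

# Hypothetical experiment readings keyed by days since assignment.
metrics = pd.DataFrame({
    "variant": ["new_home", "control"] * 2,
    "day":     [3, 3, 20, 20],
    "value":   [0.91, 1.00, 1.02, 1.00],
})

# Only measure impact once short-term reactions have played out.
steady_state = metrics[metrics["day"] >= AVERSION_WINDOW_DAYS]
impact = steady_state.groupby("variant")["value"].mean()
print(impact["new_home"] - impact["control"])  # +0.02, free of aversion noise
```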

Across User Research and Data Science we leveraged multiple methods to test the impact of the new home screen on our customers, but the process involved so many more people. You can read more about how the rest of our squad scoped out and built the design for us to test in another blog post.

Opportunities at Monzo

At Monzo, we offer a dynamic and collaborative work environment, competitive salaries, and opportunities for career growth and development. If you're interested in joining our team, visit our careers page for more information on our current job openings.