Measurement, Metrics, Analysis – How To Use Analytics To Increase App Downloads

Simon Kendall

Simon Kendall is a product manager at adjust (formerly adeven), the business intelligence platform for apps. Simon spoke at App Promotion Summit in Berlin on the subject of “How to Use Analytics To Increase App Downloads”. This was one of the best talks of the day and covered the following topics:

  •  What To Measure In Your App?
  •  How You Can Use Key Metrics To Drive User Growth
  •  How Analytics Helps You To Succeed

We’re now able to share this presentation with you in a number of formats, including video, audio/podcast and the full transcript.

How To Use Analytics To Increase App Downloads Video

How To Use Analytics To Increase App Downloads Audio/Podcast 

How To Use Analytics To Increase App Downloads Presentation 

‘How we used analytics to increase downloads’ – Simon Kendall at App Promotion Summit Berlin

How We Used Analytics To Increase Downloads Transcript

Okay. Hi, everyone. Like she just mentioned, I am the product manager at Adeven. Christian unfortunately could not make it today, but he sends his warm regards to all the friends in the room. There we are. Now, I’ve been at Adeven for about a year and a half, doing product management, partner coordination, and so on and so forth.

While we’re on the topic of nationalities, I’m actually from Sweden. So if you have any IKEA jokes, please hold them back because I’ve heard every single one. Now, Adeven develops what we call a business intelligence platform for mobile apps. So we do user acquisition, performance tracking, analysis, and so on and so forth.

We’ve been around since May or April last year, and here are just a couple of metrics on what we’re doing in a given month. This is from September, and we’re working with some rather large corporate brands, a lot of IT companies, and a lot of start-ups. So you get a sense of where we’re getting our data from and what experience we have.

So in my job as product manager, I spend a lot of time analyzing individual complaints. I advise new clients on how to set up their apps and how to figure out what metrics to measure, and those are a few of the perspectives that I would like to talk about today.

And so I have really three quick points that I would like to go through. On the one hand, what are the metrics to measure in your app? And how do you make sure that these metrics are actually meaningful to you? On the second part, once you have found meaningful KPIs, how do you understand their definitions? And how do you make sure that you can compare them over time and across your other apps?

And finally, I would like to talk about something that we’ve been working on very hard in the last couple of months, which is cohort analysis. It’s been mentioned a couple of times today and I would like to go through a little bit of exactly what it means and what you can do with it.

So to start with: what metrics to measure. We’ve hit this topic a little bit and I think that especially Stefanie was touching on this. This has to come from you because it’s very, very different in different verticals. Lele, I’m sorry I used dating and not social discovery, which seems to be a nicer word but we’ll live with that.

Generally speaking, for different kinds of apps (these are a couple of examples of verticals we work with a lot), the kind of behavior that you’re attempting to inspire, the kind of behavior that you’re trying to track and trying to create, which will lead to success, is quite different depending on what you’re doing.

For example, in a publishing app like the ones we work with, basically you’re charging users indirectly for their engagement. So what you’re attempting to do is make them open the app, click on an article or a page, and stare at it for a really long time. Whereas with a social discovery app, for example, if you were to see the same behavior, that would pretty much be mostly creepy.

Similarly with ecommerce apps, you’re trying to push people to browse around a little bit, take a look at different products. And that’s a wholly different behavior as well. So of course your metrics will vary a little bit. You might be looking at engagement metrics slightly differently and the custom events that you’ll be tracking are of course very different as well.

There’s also the aspect of the app life cycle, which I’m sure a lot of you have a fairly decent sense of. Apps do have a more or less distinctive life cycle and you have very different metrics for this. This is an example that we made with a gaming app in mind. A lot of my team is from a gaming background or has otherwise done app development, and this is sort of a premium model.

Of course this is a longer span, but you see that they’re in the acquisition phase. Of course you’re looking at retention, you’re looking at whether you’re getting the right kind of users. In a way you want to kind of ask yourself who are my paying users? And where are they coming from? And how do I get more of those? So that’s also a different aspect to keep in mind.

So we go through this process and we consult our clients to look at the business model. Where are you getting your money from? What sort of behavior should your users demonstrate? And then you get there and they say they have about 300 different KPIs that they would like to look at. We always tell them that’s a really bad idea. It’s a really bad idea, and a few previous speakers have also touched on this: if you’re doing what you do with a solution like ADX, you’re trying to optimize campaigns, you’re trying to optimize different features, and you need to know whether a recent campaign has actually improved your results or not.

Now if you’re looking at, let’s say, fifteen different registration metrics, the problem is that you won’t really have an easy answer to that question. You’ll look at it and you’ll go, well, it’s good in some ways and it’s really s**t in some ways. And you have this kind of mental algorithm where you’re trying to figure out what’s the most important thing here. And you might wind up making a decision that you kind of like, but that isn’t necessarily the best for your app.

So we really recommend to keep it down, find two or three metrics that are meaningful that really capture the behavior that you’re looking at. And then when you get a campaign done, you see okay well they are actually behaving exactly like I need them to behave. And there you go.

And so I’m not actually going to push too much on what metrics to measure because it’s a really general question and it really has to come from you. That’s kind of what I’m trying to push here. So I would like to work a little bit more and demonstrate some of the things that we’ve done with how we define KPIs and how we look at KPIs because that’s ultimately what our business is.

So one major question is, when you’re looking at KPIs, what does it mean in comparison to what I’ve done before? I had a similar campaign a year ago and now I’m seeing completely different metrics. What does that mean? Or I have an iOS app and an Android app and I’m looking at the same metric, so why are they different? This is what we mean when we say comparable KPIs. We have attempted to build a solution and we’re working a lot with our clients to really figure out how these KPIs come together. As a little bit of a lesson learned, I’d like to take an example.

We worked initially with the session metric. Session metrics I’m sure you’re all very familiar with. It’s very important; it’s a sort of basic engagement metric. A lot of the subsequent metrics that you’ll be looking at down the chain will be based on, let’s say, the number of sessions or the revenue per session, and so on. The problem is that initially, the easiest definition of this is saying a session is when my app is in the foreground. That makes a lot of sense. That seems reasonable. It’s easy to program, it’s easy to understand if you’re explaining it to somebody. This also made sense when OSes were still pretty primitive and still pretty new.

What you’re seeing immediately is that we’re starting to get to a point where your app is not necessarily in the foreground, but somebody’s still using it. For example, if I open your app and I then pull down the notification center, you’re suddenly out of foreground. If I pull up Spotify and change the track and then go back to your app, I’ve also moved you out of the foreground. Or if you ask me for push notifications, you’ve also moved me out of the foreground.

A sort of easy, tracking-101 solution to this problem would definitely be to just say the number of context switches is the number of sessions. We had a bit of a problem with this because you’ll have one app that has a push notification request and another that doesn’t. One app will have twice as many sessions as the other even though there’s no actual difference in user behavior.

This is one of those things that we started working on. So we went back to the drawing board and we figured that there’s a very excellent web definition, and we actually stole this pretty much from web. That is that a session is a continuous stream or a burst of activity from an app which is separated by a 30 minute break from any other such stream. We went through and we looked at these numbers and the thing about that is that when you start thinking in this way instead, you’ll find that 30 minutes is pretty much always 30 minutes. The only exception is if you’re on a space ship.

So it’s the same on iOS, it’s the same on Android, it’s the same in different apps, and it’s the same in different campaigns. In sum, you have a metric where you know exactly what it means and you know exactly how many there are. When we started rolling this out, we built a feature in our SDK to understand this difference natively and not even talk to our server unless there had been a new session.
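To make that 30-minute definition concrete, here is a minimal sketch of how session counting might work under it. This is an illustrative assumption rather than adjust’s actual SDK code: it takes the activity timestamps for a single user and starts a new session whenever the gap since the last activity exceeds 30 minutes, so foreground/background switches on their own don’t inflate the count.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the "30-minute break" session definition described
# above; not adjust's SDK code. A new session starts whenever the gap since
# the previous activity exceeds the timeout, regardless of how many times
# the app moved in and out of the foreground in between.
SESSION_TIMEOUT = timedelta(minutes=30)

def count_sessions(activity_times):
    """Count sessions in a list of activity timestamps for one user/device."""
    sessions = 0
    last_activity = None
    for t in sorted(activity_times):
        if last_activity is None or t - last_activity > SESSION_TIMEOUT:
            sessions += 1  # more than 30 minutes of silence => new session
        last_activity = t
    return sessions

# Three bursts of activity separated by long breaks count as three sessions,
# no matter how often the notification center or Spotify interrupted them.
events = [
    datetime(2013, 9, 1, 9, 0), datetime(2013, 9, 1, 9, 10),    # burst 1
    datetime(2013, 9, 1, 12, 0),                                 # burst 2
    datetime(2013, 9, 2, 18, 30), datetime(2013, 9, 2, 18, 45),  # burst 3
]
print(count_sessions(events))  # 3
```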

We started rolling this out and we said to people, last week you had 600 sessions per user in the first couple of days, and this week you have five. That might sound very bad, but it’s actually a much more tangible number. You start thinking about it and, well, actually five sessions makes sense. I can look at that and say someone downloaded our app and used it five times. That’s pretty easy to see.

That’s the sort of thing where I think it’s important to have an understanding of the kind of metrics that you’re looking at, how they’re defined, and how you can compare them across each other. I’m sure any feasible, any real tracking solution handles this, and we heard before that you shouldn’t just pick the others. This is unfortunately for this reason: a lot of these trickier kinds of KPIs, these trickier kinds of definitions, are really hard to do if tracking isn’t your main focus.

So with that over with, I would also like to talk a little bit about cohort analysis. We’ve heard a lot about this today and I think that’s great because it’s a really important aspect. Cohort analysis is something that we have been working on really hard in the last couple of weeks with some selected partners and we’ve been doing it internally just to figure out exactly how it’s going to work.

In its strictest definition, what you do is group users together on a common criterion. So earlier, we saw a cohort based on the people who had browsed, we saw cohorts based on the people who had just read something or downloaded, and we saw cohorts based on generation. In a marketing context, you’ll often be looking at cohorts based on the time of install.

So you lump together people who downloaded or installed at a certain point in time. That’s pretty much the foundation.

How I like to describe it, if I can get the next slide, there we go. How I would like to describe it is let’s say you’re making wine. I know absolutely nothing about making wine, so my example is going to be a little bit lackluster but let’s say you’re making wine and you change some part of your process. So you take out your new bottle and your old bottle and you taste the two. And you think holy s**t, I am really incompetent. My new bottle of wine is in no way better than my old bottles.

Of course we all realize that most bottles of wine get slightly better as they age. They sort of mature. They have a certain defined lifespan. The wine goes from being bottled to becoming a little bit different over time.

This is really the same with users. They download your app, they don’t really know how it works, and they work through it, and they start maybe buying something in the first week, and then they kind of move on. As long as you’re retaining them, you start kind of understanding how things work out.

Of course, with Lovoo for example, you might very well in the first week just look at the radar, and then you start discovering more of the features, you start sending more messages, you start getting more messages, so on and so forth. So that’s what you do with cohort analysis. You take a glass from each wine bottle at the same point after the day it was bottled, so to speak.

So for example, let’s say we’re looking at revenue by week. We take out the first cohort. The first cohort, let’s say, is everyone who installed your app in week 31. And then you look at their exact behavior over the next, let’s say, six weeks of their life cycle. And you say, that makes sense, I have one week of users. They’re all at the same point in the time span.

And then in order to compare that to another group, you’ll take out another cohort with the same method. You’ll pull out the numbers for everybody who installed in week 41, and you pull out their first six weeks. They’ll have the same flavor, so to speak. They’ll be at the same stage in the lifespan. So we can compare them on a much more like-for-like basis.

And you’re erasing the interference that is their lifespan in itself. That’s kind of the bird’s-eye view of cohort analysis. We have a couple of case studies, but I think time is a little bit too short today to go through them. So we’re actually publishing a couple of case studies next week, possibly, and then you might be able to take a look at those.
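As a rough illustration of the install-week cohorts described here, below is a minimal sketch that buckets revenue by install week and by the number of weeks elapsed since install, so two cohorts can be lined up at the same point in their lifespan. The field names and input shapes are assumptions for the example, not adjust’s actual data model.

```python
from collections import defaultdict
from datetime import date

# Hypothetical sketch of install-time cohort analysis; not adjust's data model.
# Users are grouped by the ISO week they installed, and revenue is bucketed by
# how many weeks into their lifecycle the purchase happened (week 0, 1, 2, ...).
# For simplicity this assumes all dates fall within one calendar year.
def build_revenue_cohorts(installs, purchases):
    """installs: {user_id: install_date}
    purchases: iterable of (user_id, purchase_date, revenue)
    Returns {install_week: {weeks_since_install: total_revenue}}."""
    cohorts = defaultdict(lambda: defaultdict(float))
    for user_id, when, revenue in purchases:
        install_date = installs.get(user_id)
        if install_date is None or when < install_date:
            continue  # skip purchases we cannot attribute to a known install
        install_week = install_date.isocalendar()[1]           # e.g. week 31
        weeks_since_install = (when - install_date).days // 7   # lifecycle age
        cohorts[install_week][weeks_since_install] += revenue
    return cohorts

# Example: compare the week-31 cohort with the week-41 cohort at the same
# lifecycle stage, i.e. cohorts[31][0..5] against cohorts[41][0..5].
installs = {"a": date(2013, 7, 29), "b": date(2013, 10, 7)}   # weeks 31 and 41
purchases = [("a", date(2013, 8, 5), 4.99), ("b", date(2013, 10, 14), 2.99)]
cohorts = build_revenue_cohorts(installs, purchases)
print(cohorts[31][1], cohorts[41][1])  # revenue in each cohort's second week
```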

Like I said, today was short, so I’m going to summarize it there and say that what I’m seeing when I’m helping people set up campaigns, what I’m seeing when I’m troubleshooting campaigns, and when I’m talking to partners is that there are really three things that you need to consider.

First of all, make sure that your metrics are meaningful in that they mirror what you want your users to do, whether you want them to send messages or whether you want them to read articles. You need to really look at that.

On the second point, you need to make sure that you understand what the metric does, and you need to make sure that it makes the same sort of sense the first time you use it as the next time you use it. You need to make sure that it works across all your different apps, whether you’re on Android, Blackberry (who uses Blackberry, I guess), iOS, and so on and so forth.

Finally, we really want to push cohort analysis because we’re seeing immense effects here. You also heard it today, and I think that makes a lot of sense because that’s really the way that you can compare your users without interference from the lifespan. If there are any questions, feel free to drop me a line. We’ll be publishing more on this topic later on as well, and I can give you some further hints. I have to say the HasOffers blog is also great. They do some great work over there too, so I’d like to recommend that as well. So that’s it for me.

Thanks to Simon for doing such an interesting talk. You can find out more about App Promotion Summit here.