Data-Driven
How to track user behavior in a mobile app
If you’re a mobile PM, you face a constant stream of questions about your investments and the impact of your mobile initiatives. You likely also know how important it is to understand the interplay between web and mobile, and to get accurate web and mobile product event data.
This article lays out some best practices for mobile PMs to leverage their data effectively. It’s designed to help you track mobile usage from the start. If you are already tracking mobile usage, feel free to jump to the sections on strategy and governance.
Step 1 | Goals, metrics & KPIs
Before implementing any tracking solution, it's important to quickly map your customer journey. This will help you identify how usage drives key performance indicators (KPIs).
Goals by product lifecycle stages
What you choose to focus on in your tracking, as well as the scope of your tracking plan, may change across stages of a product or feature’s lifecycle.
For example: if the product or feature you want to analyze is new, you might want to focus on user engagement and pinpointing friction. As time goes on, you will likely collect more data about downstream events from activation. At this point, you may want to look into trends in adoption, retention, and churn.
For mobile application developers who want to analyze their users’ behavior with the same rigor as their web counterparts, the lifecycle-stage goals below can help.
Grow a user base and compare iOS vs Android:
Utilize advanced analytics to identify the most effective user acquisition channels for each platform (iOS and Android).
Analyze user demographics, behaviors, and preferences specific to iOS and Android users to tailor marketing campaigns accordingly.
Conduct A/B testing on both platforms to optimize app store listings and improve conversion rates from app store visitors to downloads.
Reduce churn and analyze app version differences:
Implement comprehensive analytics to identify common churn points and understand differences between app versions (e.g., iOS vs Android versions).
Analyze user behavior patterns, such as feature usage, session length, and engagement levels, specific to different app versions to address areas of dissatisfaction and improve retention.
Use targeted notifications or personalized offers based on app version differences to re-engage and retain users.
Personalize experiences and track web-to-mobile/mobile-to-web journey:
Leverage advanced analytics techniques to segment users based on their preferences, behavior, and platform (web or mobile).
Analyze user journeys across web and mobile platforms to understand how customers transition between the two and identify opportunities for seamless integration.
Deliver personalized content, recommendations, or notifications that are tailored to users' preferences and platform usage.
Improve self-serve user experience and increase conversion rates:
Analyze user flows and interactions within the mobile app, focusing on areas where users encounter friction or difficulties in the self-serve process.
Optimize the mobile app's user interface, navigation, and overall usability to ensure a seamless and intuitive self-serve experience for both iOS and Android users.
Leverage analytics to identify drop-off points specific to each platform and implement targeted improvements to enhance conversion rates.
By tracking how customers transition between the web and mobile platforms, developers can uncover valuable insights and identify opportunities for better integrating the two.
Step 2 | Define your tracking strategy
Now that you’ve set your main KPIs, you’re ready to define your tracking strategy. Your ideal tracking strategy should support both your current and future goals.
Here’s how to do that:
1. Identify the customer journey surfaces you need to track
Begin by identifying critical surfaces to monitor and analyze. Then map the technology or framework used for each surface. Finally, choose the best tracking option for each, such as integrations, SDKs, or track APIs.
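One lightweight way to keep this mapping honest is to write it down as a reviewable config. The sketch below is purely illustrative; the surface names, technologies, and tracking options are hypothetical placeholders, not a prescribed structure.

```python
# Illustrative sketch: map each customer-journey surface to its technology
# and the tracking option chosen for it. All names here are hypothetical.

SURFACES = {
    "marketing-site": {"tech": "Next.js", "tracking": "web SDK (autocapture)"},
    "ios-app":        {"tech": "Swift",   "tracking": "mobile SDK"},
    "android-app":    {"tech": "Kotlin",  "tracking": "mobile SDK"},
    "billing":        {"tech": "backend", "tracking": "server-side track API"},
}

def tracking_plan_summary(surfaces: dict) -> list[str]:
    """Render a simple review list so coverage gaps are easy to spot."""
    return [f"{name}: {cfg['tech']} -> {cfg['tracking']}"
            for name, cfg in sorted(surfaces.items())]
```

Keeping this in version control lets the team review tracking coverage the same way they review code.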
2. Plan for manual tracking and autocapture
In general, a hybrid tracking approach is your best solution for saving time and surfacing hidden insights. That means you’ll be using a combination of automatic and manual tracking.
Things you’ll want to use manual tracking to capture:
Important user journey milestones that rarely change, like account setup
Critical actions that drive business outcomes (ex: invite user, start trial, purchase)
Usage of a new feature (ex: entrypoint, actions that indicate feature use)
Things you can uncover by tracking with autocapture:
Unexpected friction in user journeys. For example: users clicking on an element that’s not meant to be clicked (also known as dead clicks)
How unexpected user behavior or alternate user journeys are converting
Review our Lean Analytics Plan example for more information on how to blend these two approaches.
3. Identify users throughout their journey
It can be a challenge to connect user interactions across web and mobile platforms. The usual solution is to adopt an identity resolution approach, which helps you create cohesive views of your users across devices, browsers, and domains.
Set up a unique user identifier to understand how users move between different products or web and mobile surfaces. For experiences that occur after login, use a username, email, or other unique information. Most analytics tools offer "identity resolution" options.
When dealing with events that occur before a user logs in, there are alternative strategies you can use. Some platforms allow you to retroactively identify a user by matching an identified user with an unidentified user based on a cookie or device ID. This way, you can stitch together the two identities and run insightful analyses, such as identifying the most effective marketing pages in driving user trials.
Remember: it is essential for all teams to use the same user identifier. This ensures that all events are accurately attributed to a specific user.
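The retroactive stitching described above can be sketched in a few lines. Real analytics platforms do this internally; the code below only demonstrates the idea, and every identifier in it is hypothetical.

```python
# Illustrative identity-resolution sketch: stitch pre-login (device ID)
# events to a user once they log in. All IDs and events are hypothetical.

events = [
    {"actor": "device_abc", "event": "Viewed Pricing Page"},
    {"actor": "device_abc", "event": "Signed Up"},
    {"actor": "user_42",    "event": "Started Trial"},
]

aliases: dict[str, str] = {}

def identify(device_id: str, user_id: str) -> None:
    """Link an anonymous device ID to a known user ID."""
    aliases[device_id] = user_id

def resolved(events: list[dict]) -> list[dict]:
    """Rewrite anonymous actors to their resolved user IDs."""
    return [{**e, "actor": aliases.get(e["actor"], e["actor"])} for e in events]

# Once the user logs in, link their device ID to their user ID.
identify("device_abc", "user_42")
```

After resolution, all three events belong to the same user, so the journey from marketing page to trial can be analyzed end to end.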
Step 3 | Bring together critical context and metadata
User behavior is the core of understanding a user’s journey within your product. That said, critical metadata can level up your analysis and help answer more questions.
For example: you can enrich user profiles with data about their role, vertical, or company size. With this enrichment, your team could tell if a feature is better adopted by users in large companies or if those kinds of users are having trouble. There are two main approaches to bringing together this data: enriching your analytics dataset inside the analytics tool, or enriching that data downstream in your data warehouse.
There are advantages to each:
Small teams, or teams new to analytics, may lack the resources and expertise to manage separate data warehouses and BI tools, and may instead enrich their data within the analytics platform. This can also facilitate more targeted and personalized messaging by combining enriched data with customer engagement tools.
Advanced teams may already utilize a data warehouse to store, manage, and model financial, business, and user behavior data. Sending the behavioral data from their analytics tool to the warehouse enables them to synthesize various data types and conduct more sophisticated analyses, while also facilitating cross-team reporting.
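Either way, the enrichment itself amounts to joining behavioral events with profile attributes. The sketch below illustrates that join; the profile fields (role, company size) come from the example above, while the event names and IDs are hypothetical.

```python
# Hedged sketch of enriching behavioral events with user metadata so
# adoption can be segmented. All IDs and field values are hypothetical.

user_profiles = {
    "user_1": {"role": "admin",  "company_size": "enterprise"},
    "user_2": {"role": "viewer", "company_size": "smb"},
}

events = [
    {"user_id": "user_1", "event": "Report Exported"},
    {"user_id": "user_2", "event": "Report Exported"},
]

def enrich(events: list[dict], profiles: dict) -> list[dict]:
    """Attach profile attributes to each event for downstream analysis."""
    return [{**e, **profiles.get(e["user_id"], {})} for e in events]

def adoption_by(events: list[dict], attribute: str) -> dict:
    """Count events per segment, e.g. by company_size."""
    counts: dict[str, int] = {}
    for e in events:
        counts[e[attribute]] = counts.get(e[attribute], 0) + 1
    return counts
```

With the events enriched, a question like "do enterprise users export reports more often?" becomes a one-line group-by instead of a cross-tool investigation.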
Step 4 | Post-implementation QA
Use your analytics tool to generate your initial reports and set up dashboards that map the customer journey and focus on the priority areas you identified in Step 1. As you create these charts, identify any issues you have with implementation. You also want to keep a running list of untracked items.
Dealing with missing data
Label or ‘tag’ missing events as you go. For most events, you won’t need to work with the engineering team to instrument further. Instead, use the autocaptured data and label the events you currently need. The best auto-tracking tools even offer retroactive labeling so you can analyze the new events from the time you started auto-capturing, even if you only recently wanted to analyze this data.
First, complete an initial review of your analytics plan, addressing the questions you aim to answer. Following the autocaptured scenario approach, begin mapping the journey and identifying key KPIs, and compile a comprehensive list of missing or malfunctioning tracking elements. It's best to create this list directly in the engineering team's ticketing system to facilitate prioritization and minimize communication gaps when transitioning from spreadsheets to Jira tickets.
By consolidating tracking changes for each iteration, your team can more efficiently implement new tracking elements (reducing back-and-forth communication) and more easily identify if an entire category of tracking elements is absent. Before prioritizing these tickets, analyze the list for any patterns or discrepancies in tracking that you can learn from, and avoid in the future. Add any necessary tickets to ensure comprehensive tracking of the entire workflow.
Fixing incorrect or inconsistent data
The best analytics platforms will allow you to update the event definitions separately from the track calls. This gives you options for repairing or iterating on the event definition without changing the actual tracking code.
By changing the event definition rather than the raw tracks you save developer time and automatically update any charts and dashboards that depend on that event. This saves time for everyone in your organization; analysts who would have to update the charts, stakeholders who have to find the latest versions of dashboards, etc. It also helps reduce human error and eliminates inconsistencies with track calls for iOS vs. Android that are typically done by different developers and teams.
It’s best to batch these changes so your engineering team can make the small tracking-code updates in one pass. Create an engineering ticket for each change so anyone analyzing the data can check its status. One approach I love is to add a label in Jira to any ticket that includes tracking changes, then build a unified view of all tickets that change tracks. This allows anyone who relies on the data to understand upcoming changes and see their status.
Step 5 | Continuously iterate and govern
Products are constantly improving and evolving, which means user behavior and raw events from the product will be changing too! As your team adds features, deprecates workflows, and iterates on the product, you’ll want to make sure the dataset stays trusted and up to date.
Dynamic governance for agile product teams
It can be helpful to use tools that will allow your team to follow best governance practices:
Event Verification. Have the analytics admin ‘verify’ events on a regular basis. By following this process, your team can run their own analysis more successfully, confident that they are using the most up-to-date events.
Event Categories. You can also categorize events by team, product, or workflow to narrow down the dataset that the team uses. This makes events, charts, and dashboards easier to discover and manage.
Dataset Ownership. It can be helpful to create a single ‘owner’ for a data set who is responsible for ensuring that events stay up-to-date and accurate. They can help resolve issues and verify charts and events.
Event Lifecycle Maintenance. To avoid recreating shared charts and dashboards when features and events change, use virtual events or tables. Doing so will ensure accuracy over time and prevent confusion from outdated charts.
There are two common ways to create virtual events:
At the data level, use virtual tables, add extra properties, tags, or names to existing events, and keep lists of raw events for analysis. This approach usually involves complex SQL queries, which can be difficult to maintain and requires significant analysis resources.
Choose a tool that offers low-code or no-code support for virtual events, enabling the dataset to be defined separately from raw events, simplifying the process and achieving the same result.
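The core idea behind a virtual event, whichever approach you take, is a definition layer that maps several raw track calls to one logical event. The sketch below illustrates this; the event names are hypothetical, and real low-code tools manage this mapping for you.

```python
# Sketch of a "virtual event": a definition layer mapping raw track calls
# (which may differ across iOS/Android or app versions) to one logical
# event, without changing tracking code. All event names are hypothetical.

VIRTUAL_EVENTS = {
    "Checkout Completed": {"purchase_ios", "purchase_android", "buy_v2"},
}

def to_virtual(raw_event: str) -> str:
    """Resolve a raw event name to its virtual definition, if any."""
    for virtual, raw_names in VIRTUAL_EVENTS.items():
        if raw_event in raw_names:
            return virtual
    return raw_event

raw_stream = ["purchase_ios", "buy_v2", "app_opened"]
normalized = [to_virtual(e) for e in raw_stream]
```

Because charts and dashboards reference the virtual name, renaming or consolidating raw events only requires editing the mapping, not the instrumentation.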
Recap
Product analytics investment is an ongoing process. The most successful teams are those that rely on data for most of their decision-making, and this requires trustworthy data that is accessible to everyone.
To maintain a comprehensive and up-to-date dataset, develop an analytics and analysis plan for instrumenting your applications effectively. Select a flexible tech stack that can evolve with your product. Ensure your tracking tools cover all aspects of the user journey, and make analysis and reporting accessible to the entire team.