To understand your customer and their unmet needs, use the analyst mindset. You’ll gather and interpret quantitative and qualitative data from a broad variety of sources, ensuring decision making is grounded in sound data. You will explore new hypotheses – looking for unexpected trends, new insights, or discoveries. You must be objective and intellectually curious, open to surprises that contradict your current thinking.
Quantitative data is gathered by tracking user behavior or by surveying a large customer base, and is used to spot statistically significant trends. However, quantitative data alone might not provide the clear, actionable insights that tell you exactly what is needed.
Qualitative data helps explain the underlying reasons behind an issue or can reveal unknowns and, while not statistically significant, provides insight into underlying customer motivations. Combine quantitative and qualitative data to understand the full picture.
Adopt an analyst mindset by starting with some of these activities:
Set and measure performance metrics for your product – Determine and gain agreement among stakeholders on your key performance indicators before you launch your product. If possible, benchmark against competitors with similar business models to set reasonable targets. Do this to make sure everyone has realistic expectations. Be sure to add tracking and reporting requirements into the project plan.
Explore data to look for unexpected trends – Intellectual curiosity is a required Product Manager trait. Market and usage data can reveal interesting trends or anomalies; delve into them to find new insights. Do not limit yourself to tracking just your KPIs or the standard “out-of-the-box” reports delivered by many analytics tools. Drill into and segment raw data to find insights otherwise hidden in the “averages”. Read customer service emails or reviews about your product, scanning them for common themes: recurring issues, concerns, or unmet needs. Find external or internal market reports to read. Write down questions as they come to you; these can help identify analysis you should be doing or form future hypotheses to test. Do not take results at face value – get to the “why”. Answering a question that’s been on your mind can be a fun “downtime” or late-week activity.
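Segmenting raw data can surface effects that the overall “average” hides. As a minimal sketch (the segments and numbers here are invented for illustration), a blended click-through rate can look unremarkable while one segment is quietly underperforming:

```python
from collections import defaultdict

# Hypothetical daily log rows: (segment, impressions, clicks)
rows = [
    ("desktop", 1000, 50),
    ("desktop", 1200, 66),
    ("mobile",  4000, 20),
    ("mobile",  3800, 19),
]

# The overall click-through rate blends all segments together
total_impr = sum(r[1] for r in rows)
total_clicks = sum(r[2] for r in rows)
print(f"Overall CTR: {total_clicks / total_impr:.2%}")

# Segment-level CTR reveals where the problem actually lives
by_segment = defaultdict(lambda: [0, 0])
for segment, impressions, clicks in rows:
    by_segment[segment][0] += impressions
    by_segment[segment][1] += clicks

for segment, (impressions, clicks) in sorted(by_segment.items()):
    print(f"{segment}: CTR {clicks / impressions:.2%}")
```

Here the blended rate masks a mobile CTR an order of magnitude below desktop – exactly the kind of finding a canned summary report would never show.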
Observe and interview customers – “Get out of the building” on a regular basis, at least monthly, and you’ll build a rich qualitative perspective and empathy for your customer. Go on Sales calls. Get involved with any User Experience or Customer Insights activities your company is undertaking, particularly in-person interviews. Don’t just test your existing product or ideas; try to learn more about the customer, their environment, and how they use your product.
Become your own analyst – While many modern organizations are blessed with a myriad of data and reports, a Product Manager should not rely entirely on others for their analysis needs:
1. Unless you cut and slice the underlying data yourself, you may miss anomalies or surprising findings, or fail to see that the data has been misinterpreted. You’re unlikely to get the full story from canned reports and second-hand information.
2. Your ability to learn and iterate quickly may be slowed if you depend on others for all your analysis. Whether it’s Marketing, Sales, or a Business Analytics team, each typically has many other internal customers to serve.
3. If you can self-serve your own reporting needs, you can reframe and tweak your questions without delay. Asking the right question up front is hard; more than likely you’ll have to go back to re-analyze or augment your data, which would otherwise require additional requests.
Practice the skills that allow you to gather raw data from trusted sources. As needed, perform your own quick-and-dirty analyses, so you can learn and iterate quickly.
Techniques to equip yourself for self-service analytics
Depending on the maturity of your organization’s reporting systems and its data-access policies, consider the following options:
⇒ Become expert in your company’s reporting tools
⇒ Learn advanced Excel – especially pivot tables, charts, look-up functions, filtering/sorting, and VBA
⇒ Learn basic SQL so you can manipulate your own data
⇒ Negotiate read-only access to non-production databases or ask for a daily (or weekly) data dump for select data
⇒ Negotiate direct access to customers and users for interviews
⇒ Subscribe to 1-2 key market research companies serving your industry
⇒ Petition for a dedicated Analyst assigned to support the Product team
⇒ Make a close friend in Data Analytics
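Even basic SQL goes a long way for quick-and-dirty, self-service analysis. A minimal sketch using Python’s built-in sqlite3 module (the table, columns, and figures are invented for illustration – in practice you would query your own read-only replica or data dump):

```python
import sqlite3

# Build a toy in-memory events table (names are hypothetical)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ad_events (device TEXT, impressions INT, clicks INT)")
conn.executemany(
    "INSERT INTO ad_events VALUES (?, ?, ?)",
    [("iphone", 5000, 120), ("android_x", 3000, 0), ("desktop", 8000, 400)],
)

# A typical self-service question: click-through rate per device, worst first
query = """
    SELECT device,
           SUM(clicks) * 1.0 / SUM(impressions) AS ctr
    FROM ad_events
    GROUP BY device
    ORDER BY ctr ASC
"""
for device, ctr in conn.execute(query):
    print(f"{device}: {ctr:.2%}")
```

The GROUP BY / ORDER BY pattern above covers a surprising share of day-to-day product questions, and being able to tweak the query yourself means no waiting on another team’s backlog.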
Analyst Mindset Action Checklist
o Establish KPIs (Chp 12)
o Detail tracking and reporting requirements in the project plan
o Schedule regular exploratory reviews of product and market data
o Review customer service emails
o Engage frequently in customer interviews
o Sign up for and scan regular company reports
o Get account access to user data
o Self-train on your company’s analytics tools
o Practice techniques for self-service analytics
War story – Discovering Insights
At one digital media company I worked at, we noted that our ads, when shown on mobile devices, had much lower click-through rates than when shown on desktop computers.
This was 2009, and we were convinced that the disparity largely came from screen size and load times. Add to that, mobile ads were simplified versions of their desktop counterparts and were less relevant (at the time, targeting technologies were still in their infancy). And because mobile phones are more personal, intimate devices, we suspected that mobile advertising just wasn’t something people were yet comfortable with.
Interested in exploring this hypothesis, I looked at performance by specific device and OS to see if there were any patterns (such as lower engagement rates on smaller, less advanced phones, or demographic differences between their user bases). The latest iPhones were performing much better than Android devices overall, but that did not seem to fully explain the gap. The key insight came from noticing that several popular Android phones were getting a high number of ad impressions but no clicks at all. (It was impossible to test every combination.)
We had assumed demographic and form-factor differences fully explained the gap – but a bug in our ad serving technology affecting some Android phones was the greatest culprit. Now we knew how we could improve our results with something well within our control.
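The anomaly in this story – segments with plenty of impressions but literally zero clicks – is the kind of pattern that is easy to scan for programmatically once you have the raw data. A hedged sketch with invented device names and numbers:

```python
# Hypothetical per-device stats: device -> (impressions, clicks)
stats = {
    "iphone_3gs": (120_000, 2_400),
    "droid_a":    (45_000, 0),   # suspicious: real traffic, no clicks
    "droid_b":    (30_000, 310),
    "droid_c":    (22_000, 0),   # suspicious
    "feature_x":  (800, 0),      # too little traffic to judge
}

MIN_IMPRESSIONS = 5_000  # ignore low-traffic devices where zero clicks is plausible

suspects = [
    device
    for device, (impressions, clicks) in stats.items()
    if impressions >= MIN_IMPRESSIONS and clicks == 0
]
print(suspects)  # devices worth investigating for serving bugs
```

A zero in a high-traffic segment is rarely a taste difference; as in the story above, it usually points at something mechanical – a rendering or serving bug – that is well within your control to fix.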