A Summary of the Human-Centered Design Community of Practice’s April Program

How are product analytics used to quantify behavior and impact along the customer journey? This question was the topic of discussion during the Human-Centered Design (HCD) Center of Excellence’s Community of Practice program on April 24. The program began with a presentation by Noble Ackerson, a Senior Product Manager at Ventera and a Senior Product Owner on the iQIES program, followed by a discussion facilitated by Stephanie Ray of CMS, the ISG strategy owner for the HCD Center of Excellence.

The presentation was divided into three sections—Definitions, Lean Analytics, and Lessons Learned—as they relate to the iQIES program, and began with a definition of analytics and why it is important.

Definitions

Ackerson stated that product analytics is used to understand your users’ behavior and consists of three components: data, information, and insights.

  • Data is gathered through tools and code snippets within an application, as well as through conversations with users.
  • Information is produced by analyzing that data with dashboards and business intelligence tools, such as Google Analytics, Heap, and Pendo.
  • Insights are drawn from the information gathered and the questions to be answered, providing actionable, realistic, and informed direction to improve users’ lives.
  • By learning to use data and insights effectively, the team can start predicting the future. According to Ackerson, “what separates a mature product team from an underperforming one is that great product leaders don’t sell what they can make, we make what we can sell.”

Why is analytics important? Because you cannot improve what you cannot measure. An actionable metric must be simple to understand, comparative, a ratio or rate, supported by user feedback (qualitative insights), and ultimately behavior-changing.
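The "ratio or rate" criterion above can be illustrated with a short sketch. The metric name and numbers here are hypothetical, not drawn from the iQIES program; the point is only that a rate can be compared week over week while a raw count cannot.

```python
# Hypothetical illustration: expressing a metric as a comparative
# ratio rather than a raw count, per the actionable-metric criteria.

def task_success_rate(completed: int, attempted: int) -> float:
    """Return the share of attempted tasks that users completed."""
    if attempted == 0:
        return 0.0
    return completed / attempted

# A raw count (e.g. 480 completed tasks) says little on its own;
# a rate can be compared across periods to see if behavior is changing.
last_week = task_success_rate(completed=480, attempted=800)
this_week = task_success_rate(completed=540, attempted=750)
print(f"Task success moved from {last_week:.0%} to {this_week:.0%}")
```

A rate like this is simple to understand, comparative across periods, and (if it moves) prompts a change in behavior.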

To emphasize the importance of user feedback or qualitative insights, Ackerson told the story of the city of Boston’s efforts to solve a real problem concerning potholes. The city planners approved an innovative way to tackle potholes using a phone app, called Street Bump, that sensed when a user drove over a pothole. The solution solved the pothole problem in affluent neighborhoods, but failed in poorer neighborhoods where many residents did not have a phone, did not want to pay the cost of lost phone minutes, or did not have a phone that could run the app. The city planners needed both quantitative and qualitative data to solve their problem. Vanity metrics, which Ackerson described as quantitative data, are important, but they must be blended with qualitative data to learn how your users’ behavior is trending toward their goals. Analysts measure to understand, guide, inspire, prove, predict, and succeed.


Lean Analytics

Ackerson stressed the importance of starting an analytics program with clear objectives or program goals. Goals must be measurable, actionable, and realistic. In addition to identifying what success looks like, failure should also be identified. Analysts can then take one objective at a time and see whether the objective is being exceeded or missed. The next step is a roadmap to guide the team.

A roadmap is a prototype for your product strategy. An example roadmap was presented to answer the question of how to make provider management more streamlined, with multiple columns showing objectives or strategic themes and ending with goals, which can be tracked with an analytics tool or through research, diaries, or one-on-one sessions with your users. From this roadmap, the lean analytics cycle begins. Ackerson noted that you must focus on one metric at a time: pick a key performance indicator (KPI), then draw a line in the sand. From that line, find a potential improvement, both without data (by making a good guess) and with data (by finding a commonality).

The next step is to form a hypothesis. From the hypothesis, make changes in production, then design a test. Measure the test results and determine whether the change moved the needle. If the KPI was met, there was success; if not, pivot or quit.
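The cycle described above can be sketched as a single decision step. Everything here is hypothetical: the KPI name, the baseline and target values, and the success threshold are invented for illustration and are not taken from the iQIES program.

```python
# Minimal sketch of the lean analytics cycle described above:
# draw a line in the sand, test a change, then persist, pivot, or quit.
# All names, numbers, and thresholds are hypothetical.

def lean_analytics_cycle(kpi_name: str, baseline: float, target: float,
                         measure_after_change) -> str:
    """Run one pass of the cycle for a single KPI."""
    result = measure_after_change()       # measure after the change ships
    if result >= target:                  # did we move the needle past the line?
        return f"{kpi_name}: success ({baseline} -> {result}), keep going"
    return f"{kpi_name}: missed target {target} (got {result}), pivot or quit"

# Hypothetical hypothesis: a shorter signup form will raise the
# weekly activation rate from a 40% baseline to at least 50%.
outcome = lean_analytics_cycle("activation_rate", baseline=0.40, target=0.50,
                               measure_after_change=lambda: 0.53)
print(outcome)
```

The point of the sketch is the shape of the loop, one metric at a time with an explicit pass/fail line, rather than any particular measurement mechanism.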

A product analytics funnel shows that different analytics tools are used to analyze both quantitative and qualitative data using the HEART framework—happiness, engagement, adoption, retention, and task success—which was popularized by Google.
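The five HEART categories can each be paired with a concrete signal. The example metrics below are illustrative assumptions, not metrics the presentation attributed to iQIES or to Google.

```python
# Hypothetical mapping of the HEART framework categories to example
# signals; the metrics listed are invented for illustration.
heart = {
    "happiness":    "satisfaction survey score (qualitative)",
    "engagement":   "sessions per user per week",
    "adoption":     "new users completing setup",
    "retention":    "users returning after 30 days",
    "task success": "completion rate for a key task",
}

for category, example_metric in heart.items():
    print(f"{category}: {example_metric}")
```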


Lessons Learned

First, start early and start planning. Do not assume that you are inventing everything from scratch, as another Line of Business (LOB) might have gone through this before. Second, involve your security team to ensure that governance is in place, protected health information is truly protected, and personally identifiable information is properly governed. Third, configure your tool in a way that limits code changes, as the more code that is added, the greater the likelihood of defects. Finally, ensure your strategy is anchored by metrics that are simple to understand, comparative, and a ratio or rate. As Ackerson stated, “don’t be a feature factory, be a value engine with good behavior analysis.”

Ackerson closed his presentation by thanking the security team for its assistance in weaving Google Analytics into the fabric of iQIES, the Ventera SMC Research and Global Team, and Solution Management for its strict alignment.


The remainder of the program consisted of Ackerson’s response to several questions asked by Ray concerning product analytics, lean analytics, and resources, as well as questions posed on Slack by program participants.

“Beyond the basics of trust, what are the ethical considerations when it comes to what we measure and how we use it?” Ackerson responded that users care about three things: context, control, and choice. Why are you collecting that information? Do I have the option to opt in or opt out? Do I have a real choice? In addition, a privacy page should clearly list what data you are collecting, and that data should be collected if and only if you have a justifiable reason to collect it. That practice builds trust.


“How do you know it’s the right metric, or can you talk about a hypothesis and a test that you performed and what surprised you the most?” In response, Ackerson recommended Alistair Croll’s book, “Lean Analytics,” as a resource. With lean analytics you start with a key focus area and then sequence the work with one metric that matters. Ackerson also discussed a framework coined by Silicon Valley wonks, called Pirate Metrics, because its stages of acquisition, activation, retention, referral, and revenue—AARRR—sound like the noise a pirate would make.
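The AARRR funnel mentioned above is usually read as step-to-step conversion rates, which make the biggest drop-off (and so a candidate for the one metric that matters) easy to spot. The counts below are invented for illustration.

```python
# Hypothetical sketch of the AARRR ("Pirate Metrics") funnel:
# acquisition, activation, retention, referral, revenue.
# All counts are invented for illustration.

funnel = {
    "acquisition": 10_000,  # visitors reached
    "activation": 3_000,    # had a good first experience
    "retention": 1_200,     # came back
    "referral": 300,        # told someone else
    "revenue": 150,         # paid
}

# Step-to-step conversion rates show where users drop out.
stages = list(funnel)
for prev, cur in zip(stages, stages[1:]):
    rate = funnel[cur] / funnel[prev]
    print(f"{prev} -> {cur}: {rate:.0%}")
```

With these invented numbers, the acquisition-to-activation step loses the most users, so activation would be the natural metric to focus on first.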


“Can there really be one metric that matters, or how do you go about selecting that, and from your perspective, how does that come into play for iQIES specifically?” Ackerson agreed that the one metric that matters is easier said than done. Prioritizing one metric brings more value to users and to the business than anything else, but it is difficult in practice because every Program Increment planning event produces a handful of objectives, and they are all weighted the same. To inform what you do next, ask whether those features are bound to the program objectives as a whole, whether they drive movement toward the ultimate product vision, and whether they drive toward CMS’s goals as well. Ackerson stated that he has seen a lot of positive traction in starting to think of one CMS, one iQIES, one SMC, and that this is inspiring other efforts to focus in a lean way when it comes to analytics and product.

Ray agreed and mentioned that in reading “Measure What Matters,” by John Doerr, she is finding that we have learned things through trial and error, and sometimes that is the only way to master the process.


Two questions from Slack were posed: one asking about best practices for establishing baseline metrics so that improvements can be measured over time, and one asking, in terms of process, what methods are successful for encouraging adoption of lean analytics. In response to the first question, Ackerson suggested researching SMART [specific, measurable, achievable, realistic, and timely (or time-bound)] or ARC objectives. To simplify the approach, ask yourself whether your metrics are comparative, whether they are ratios or rates, as with some of the imperatives discussed earlier, and whether they are also realistic. An example was given on improving the speed of a website: set a KPI or OKR for improving the speed of the site. That is the baseline, which draws the line in the sand; then follow the lean analytics cycle to determine whether you achieve success or failure. Is it achievable? Is it realistic? Ackerson would also set the baseline over a period of time to track progress in increments.

In response to the second Slack question, Ackerson suggested starting small and stated that it does not have to be perfect. It starts with understanding, as an individual, how you are tracking toward your team’s goal. You cannot solve the problem in isolation; ask questions and document responses, which will start inspiring a more data-centric and data-informed approach. Ackerson repeated the adage that we cannot improve what we do not measure.

The last question from Slack asked how to address return on investment for projects; in other words, how has Ackerson helped teams define what success looks like from the onset? From the onset, you must ask both what success looks like and what failure looks like. Remember clear communication, and remember known knowns, known unknowns, and unknown unknowns.

Finally, “How do you stay up to date with the latest trends and best practices? Can you talk about the blogs, books, or other resources that you read?” Ackerson reads lots of blogs and listens to lots of podcasts, especially Intercom’s podcasts, which are product focused. Alistair Croll’s “Lean Analytics” was also highlighted as a book that speaks to Ackerson’s heart.

If you were unable to attend the program, visit the HCD Community of Practice archive on Confluence for this and other programs that might be of interest. Be sure to join the Community of Practice on May 29, which will feature creativity strategist and writer Natalie Nixon, who will discuss brainstorming.
