
Top 14 User Engagement Metrics for 2021 (... According to Experts)

Reading time: about 9 min

Look no further! Here’s what a group of experts had to say about how they track user engagement.

User Activity (DAU, WAU, MAU)

User activity is typically measured within some temporal window: Daily Active Users (DAU), Weekly Active Users (WAU), or Monthly Active Users (MAU). Measuring user activity by using at least one of these metrics can show you how many unique users are actively using your product during a particular interval.

When calculating user activity, it’s important to determine what your team defines as an “active” user. Is it when they log in to your product? Is it when they click on a specific attribute or open a specific page? Once your team has defined what an active user is for your specific product, then you can find how many daily, weekly, or monthly users you have.
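
To make that concrete, here is a minimal Python sketch of counting unique active users from an event log. It assumes each logged event is a (timestamp, user_id) pair and that "active" simply means "generated at least one event in the window"; swap in whatever definition your team lands on.

from datetime import datetime, timedelta

def active_users(events, start, window_days):
    # events: iterable of (timestamp, user_id) pairs, a stand-in for
    # however your product actually logs user actions
    end = start + timedelta(days=window_days)
    return {user for ts, user in events if start <= ts < end}

events = [
    (datetime(2021, 3, 1, 9, 30), "alice"),
    (datetime(2021, 3, 1, 14, 0), "bob"),
    (datetime(2021, 3, 12, 8, 15), "alice"),
]
print(len(active_users(events, datetime(2021, 3, 1), 1)))   # DAU: 2
print(len(active_users(events, datetime(2021, 3, 1), 30)))  # MAU: 2 unique users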

Stickiness 

It can be helpful for your team to know how “sticky” your product is. That is to say, how often are your users returning to use your product? You can find this by dividing your Daily Active Users by your Monthly Active Users.

Stickiness = DAU / MAU

So, if you have fifty Daily Active Users and a hundred Monthly Active Users, then your stickiness is 50%. Your average user is using your product about half the time, or 15 days out of a 30-day month.
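
In code, the same calculation with the hypothetical numbers from the example above:

def stickiness(dau, mau):
    # Stickiness = DAU / MAU, expressed as a percentage
    return dau / mau * 100

print(stickiness(50, 100))  # 50.0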

Week #1 Engagement 

Similar to DAU or MAU, you can measure engagement during a user’s first week in your product. If a user isn’t directly engaging with your product within the first week that they gain access to it, it’s highly likely they won’t be coming back. Tracking week one engagement can also reveal any kinks in your onboarding process, or show how intuitive your product really is to a new user.

Number of logins

Likely, you define an active user as someone who does more than just log in to your product; however, tracking the number of logins can still be helpful. I mean, they can’t be using your product if they aren’t logging in, right?

Deciding whether to track logins on a daily, weekly, or monthly basis will depend on your product and how often your average user should be logging in. After deciding on what works best for you and your team, tracking logins can help you spot if a user is trending towards less engagement with your product or isn’t using it as frequently.

(Segmented) User Retention 

Segmenting your user retention rates can be very helpful for seeing when the drop-off point in your specific product occurs. Looking at user retention after day one, day seven, and day thirty can help you identify when users are leaving your product. A typical user retention curve looks like this:

[Image removed: a typical user retention curve]

The goal of looking at user retention in this way is to improve user engagement at all points of their experience within your product and move that curve upwards.
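
Here is one way you might compute day-N retention, sketched in Python with made-up signup dates and an activity log; the exact shape of your data will differ.

from datetime import date, timedelta

def day_n_retention(signups, activity, n):
    # signups: {user_id: signup_date}; activity: set of (user_id, date) pairs.
    # Fraction of the cohort active again exactly n days after signing up.
    retained = sum(1 for user, d in signups.items()
                   if (user, d + timedelta(days=n)) in activity)
    return retained / len(signups)

signups = {"alice": date(2021, 3, 1), "bob": date(2021, 3, 1)}
activity = {("alice", date(2021, 3, 2)), ("alice", date(2021, 3, 8))}
for n in (1, 7, 30):
    print(n, day_n_retention(signups, activity, n))  # 1 0.5 / 7 0.5 / 30 0.0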

Time In-App

Another useful metric is looking at how long your users are staying in your app. A user can open your product a hundred times, but if they’re only staying in it for a few seconds or even minutes, they likely aren’t getting enough value out of it. It is important to note here, however, that this metric is totally dependent on your product and how it’s designed to be used. If your product is a weather app, then a user staying in it for longer than a minute or so might actually show that it’s too confusing, since it shouldn’t take them that long to check the weather.

This is why it’s important to measure the average amount of time a user stays in your app. That way, you have a benchmark to compare a single user to, and you can more easily see if a user is spending too much or too little time on your app.
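
Here is a minimal sketch of that comparison, using made-up session lengths in minutes:

def average_session_minutes(sessions):
    # sessions: list of session lengths in minutes
    return sum(sessions) / len(sessions)

benchmark = average_session_minutes([12, 8, 15, 3, 10, 9])  # product-wide
user_avg = average_session_minutes([2, 3])                  # one user's sessions
print(benchmark, user_avg)  # 9.5 2.5, so this user may not be getting value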

Number of returning users

New customers are great to have, and increasing the top of your funnel with new users can be crucial to growing your business. But if users are trying your product once and never coming back, then they’re not seeing the value in your product or not fully understanding it. This can be a huge problem, as it will increase your churn rate and can stunt your company’s growth.

Tracking the number of users returning to your product shows you how many users are getting value from your product versus those who aren’t. You can then target those users who aren’t getting as much value out of your product with additional messaging or educational materials. If you have a low percentage of returning users, you may want to reach out to those users and see why they haven’t returned or send along some helpful tips and tricks to get them to come back.

Feature Usage

Tracking which features are used, how frequently they’re being used, and who is actually using them can all be super helpful to your team. If you have a feature that is never being used, it may indicate that users just don’t find value in it and it may be time to get rid of that functionality, which will save you and your team a ton of time and money. On the flip side, if there is a certain feature that is getting a ton of usage, it may be helpful to look into why your users find it so valuable and if you can expand on it in a certain way.

It’s also important to look at who is using which features. If you have a less popular feature, but the users who actually use it are your highest-paying customers, you are likely not going to get rid of it any time soon.

Net Promoter Score (NPS) 

NPS typically asks your users to rate on a scale from 0-10 how likely they are to recommend your product to someone else. Typically, those who rate 0-6 are considered “Detractors”, those who rate 7-8 are considered “Neutrals”, and those who rate 9-10 are considered “Promoters”. You can then find your NPS by subtracting the percentage of “Detractors” from the percentage of “Promoters”.

NPS = % Promoters – % Detractors

So, for example, if you have 68% “Promoters” and 10% “Detractors”, then you have an NPS of 58. NPS can be a helpful metric; however, it is highly speculative in nature: “How likely are you to recommend the product?” Instead, we suggest using aNPS, or Actual NPS. aNPS asks the user, “Have you already recommended the product? Why or why not?” This will give you much more actionable insight and data.
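
Here is a quick sketch of the standard NPS calculation, with a hypothetical set of survey responses that reproduces the 68%/10% example above:

def nps(scores):
    # scores: 0-10 answers to "How likely are you to recommend the product?"
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

scores = [10] * 68 + [8] * 22 + [3] * 10  # 68% promoters, 10% detractors
print(nps(scores))  # 58.0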

Feedback response rates 

One of the most important things you can do when trying to validate your ideas for potential new features or gather information is to ask your users for feedback. After all, who knows better what your users want than your users? Having a high feedback response rate shows that your users care enough about your product to put in the time and effort to give you feedback, and can be a great indicator of a motivated user population. 

Bounce rate 

Your bounce rate is the percentage of people who click away from your site after viewing only one page. A high bounce rate can mean that your users aren’t finding value in your product, aren’t understanding your product, or aren’t engaging well with your product.

However, bounce rates can be deceiving. A user may just have a question about pricing, or need a quick refresher on something, so visiting just one page in your product could satisfy their need. It’s important to keep this in mind when analyzing your bounce rate.
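
The calculation itself is simple; here is a sketch with hypothetical session counts:

def bounce_rate(single_page_sessions, total_sessions):
    # Percentage of sessions that viewed only one page
    return single_page_sessions / total_sessions * 100

print(bounce_rate(40, 160))  # 25.0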

Customer Satisfaction Score (CSAT)

We’ve all received customer satisfaction surveys; essentially, CSAT shows a user’s overall contentment or discontent with your product. You ask your users to rank your product on a scale (1-3, 1-5, 1-10, etc.) and then take the sum of the scores divided by the number of respondents. While NPS measures overall satisfaction, CSAT typically measures a user’s satisfaction with a certain feature inside your product. This metric is most effective when you use it at the exact moment a user has just finished using that feature during their journey through your product.
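
Here is a minimal sketch, assuming a 1-5 scale and a handful of hypothetical responses:

def csat(scores):
    # Mean of the survey scores; we assume a 1-5 scale here
    return sum(scores) / len(scores)

print(csat([5, 4, 5, 3, 4]))  # 4.2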

Abandonment rate 

Abandonment rate is the percentage of users who abandon their virtual shopping carts when checking out online. You can calculate it by taking the number of abandoned carts and dividing it by the total number of carts created.

Abandonment rate = # of abandoned carts / total # of carts created
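
In code, assuming you can count both carts created and sales completed (the numbers are hypothetical):

def abandonment_rate(carts_created, sales_completed):
    # Carts that were started but never checked out, as a share of all carts
    abandoned = carts_created - sales_completed
    return abandoned / carts_created * 100

print(abandonment_rate(200, 150))  # 25.0 (50 of 200 carts were abandoned)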

Conversion rate

There are certain actions you want your users to take within your product, whether that be opening a certain page or using a certain feature. These actions can be tracked as conversions, and you can calculate your conversion rate by taking the number of users who complete those actions and dividing it by your total number of users.

Conversion rate = # of users who completed action / total # of users

So, if you have ten users who used a new page in your product and a hundred total users, then you have a conversion rate of 10% for that new page. This metric can be super helpful in seeing how quickly your user population adopts a new feature or how much of your product they are actually using and exploring.
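
And the same calculation in code, matching the example above:

def conversion_rate(converted_users, total_users):
    # Conversion rate = # of users who completed the action / total # of users
    return converted_users / total_users * 100

print(conversion_rate(10, 100))  # 10.0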

 

A Day to Explore Important Product Issues like Trust, Ethics, and Integrity

Amy Castellani

Reading time: about 3 min

Wednesday, November 10, 2021 wasn’t just another Wednesday — it was a celebration for the CCSQ and QualityNet community. Produced by the Human-Centered Design Center of Excellence (HCD CoE), CCSQ’s 3rd Annual World Usability Day was a virtual, open-house-style conference held to honor and explore good design and the usability of products. This year’s theme — Design of Our Online World: Trust, Ethics, and Integrity — set the stage for a day full of engaging and thought-provoking presentations, case studies, and a panel discussion delivered by a diverse group of design and technology professionals.

Digging into the Design of Our Online World

Carol J. Smith, Sr. Research Scientist in Human-Machine Interaction at Carnegie Mellon University, Software Engineering Institute, AI Division, led us in an exploration of trust and how UX practitioners can define and measure it during her keynote presentation. After Carol’s presentation, we welcomed Morgan Taylor, Technical Advisor of the Customer-Focused Research Group (CFRG) in the Office of Burden Reduction & Health Informatics with CMS. Morgan shared some of the CFRG’s HCD work and how the cross-agency customer engagements help build customer trust and inform policymaking. Later in the day, Rob Fay, Lead Design Strategist with Tantus Technologies, moderated a panel discussion on both bad and empathy-driven design.

You can learn more about all of the speakers and access session materials like recordings, presentation slides, and a link to the Collaborative Note Taking Mural Board now on the event site.

An Engaging Event

A Collaborative Note Taking Mural Board allowed attendees to interact with one another during the sessions while synthesizing and deepening their understanding of the information being presented. Breakout rooms were opened after each session to allow attendees to continue the conversation with the speakers, and at lunchtime to give attendees the chance to network with one another. The breaks between sessions were filled with thematic and fun tunes from an original trust-inspired playlist. Attendees eagerly watched in between sessions as the Wheel of Names was spun and winners of Zoom Virtual Backgrounds, Avatars, and HCD Lunch n’ Learns were announced.

Testimonials

“I loved all of it! The speakers were great, and you could tell they are all experts. The topics were interesting and engaging and inspiring. The mural board with everyone's notes and comments was an awesome idea and I enjoyed taking notes and reading others’ thoughts. THANK YOU! I am truly impressed with the event.”

“It was great to see a bigger view of how CMS sees UX and its importance.”

“The presenters were very knowledgeable, and the material was presented in an easily relatable fashion, with terms and situations non-HCD people could understand.”

“Very informative - great speakers. Highly organized. All the extras - music, prizes, chats.”

Looking Forward to Next Year

Look for more information next year about World Usability Day 2022, its theme, and CCSQ events.

Visit the CCSQ World Usability Day event site now to access session materials, and visit the HCD Confluence site to learn more about the HCD CoE.

 


Parlor.io
It’s crucial nowadays for companies to understand the individual users, champions, and decision makers behind their accounts. Parlor’s URM tracks every individual human’s unique relationship to your product and business to unite your entire organization around the needs of your users.

Amy Castellani
Amy is a Communications Specialist supporting the CCSQ Human-Centered Design Center of Excellence (HCD CoE). Amy combines her communications skills and growing knowledge of HCD to help the team promote the usage of HCD best practices throughout the CCSQ community. In 2018, Amy earned her Bachelor of Arts degree in Business Management from Goucher College, where she concentrated on marketing and communications.

