QPP Customer Satisfaction Survey Case Study

Video: https://www.youtube.com/watch?v=wQHyNDyyylg

Background

What system does your team support? What are its core functions? Who are its users?

Our team supports the Quality Payment Program (QPP). We are a centralized Human-Centered Design (HCD) team with researchers, designers, a content strategist, and a web developer who manages the QPP Design System. We use an Agile project management approach with a Product Owner and a Scrum Master. We support multiple development (“Dev”) teams responsible for the public-facing components of the program’s digital experience. We also partner with the two communication teams that work on behalf of different parts of the program.

QPP is a regulatory compliance program that requires participating providers to collect and submit data on an annual cycle. Providers receive scores, and future Medicare reimbursements are adjusted based on performance: high performance may increase payments, while poor performance or non-participation decreases them. QPP participants include clinicians, practice managers, administrative staff, third-party intermediary vendors, and professional societies. Practices that participate in QPP vary in size, specialty, and organization, ranging from small, independent primary care clinicians and specialists to the country’s largest health systems.

Goals

What were your goals for launching a customer satisfaction (CSAT) site survey? How do these goals differ from those of other customer research methods you use?

Historically, qualitative interviews have been the primary method of user engagement and research within QPP. While this remains an ongoing tenet of our work, a qualitative-only approach limited our view of customer satisfaction and reduced our knowledge to anecdotes from the squeakiest wheels. The CSAT Site Intercept Survey is one of two strategies we adopted to expand our data collection and capture perspectives from a broader, more representative pool of users. For the Site Intercept Survey, we wanted to know:

  • Who is using the website?
  • Why are they visiting the website?
  • Are they able to complete their desired tasks?

In addition to the Site Intercept Survey, we also distribute three annual Milestone Surveys built around key points in the program cycle. We are in our first year of implementation and have distributed two of the three surveys so far; the third will go out at the end of the 2022 calendar year. These two quantitative approaches blend with our past and ongoing qualitative approaches to give us a fuller picture of the QPP customer experience.

Stakeholders

What stakeholders did you collaborate with to accomplish this work?
  • QPP Frontend (FE) Dev Team – Manages the public-facing, unauthenticated website where the Site Intercept Survey is implemented
  • ISG Center for Excellence – Provided CFE burden hours and supported creation of the OMB documents

Approach

What steps did you take to implement your approach? What was the level of effort?
  • Reviewed potential tools for hosting the survey. We considered Hotjar and Touchpoints. We chose Hotjar because the tool includes other resources beyond intercept surveys that offer value to the HCD team, including heatmap tracking that we have used for other work.
  • Obtained Hotjar with the cooperation of our CMS PM. Because we implemented the data collection tool on a public site, additional documentation was required, including an updated privacy agreement.
  • Submitted the survey to OMB for approval of the instrument and burden hours. The Center for Excellence helped us identify the number of responses to request to ensure a useful sample size.
  • Created the survey and programmed it in Hotjar. We collaborated with the team’s UI Designer to skin the survey to match QPP branding.
  • The HCD team’s developer collaborated with the FE Dev team to implement the survey within the application. Hotjar allows the survey to be turned on and off, and modified, from the browser, so no additional support is required from the dev team.
  • A researcher monitors results as they come in and sets milestones for when to review them. We chose to complete our first review after 300 responses.
What questions are included in the survey?
  • Which of the following best describes your level of satisfaction with the QPP website? (Likert)
  • Why did you visit the QPP website today? (Open text)
  • How easy or difficult was it for you to accomplish your goal on the QPP website today? (Likert)
  • How frequently do you visit the QPP website? (Select Best)
  • Which of the following best describes your current role? (Select Best)
  • Please share any additional comments or feedback regarding the QPP website in the space provided below. (Open text)
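The question set above can be sketched as a simple data structure. This is an illustrative Python sketch (the `id` values and type names are hypothetical), not the actual Hotjar configuration, which is managed through Hotjar's dashboard rather than code:

```python
# Hypothetical representation of the six survey questions.
SURVEY = [
    {"id": "satisfaction", "type": "likert",
     "text": "Which of the following best describes your level of satisfaction with the QPP website?"},
    {"id": "visit_reason", "type": "open_text",
     "text": "Why did you visit the QPP website today?"},
    {"id": "ease", "type": "likert",
     "text": "How easy or difficult was it for you to accomplish your goal on the QPP website today?"},
    {"id": "frequency", "type": "select_best",
     "text": "How frequently do you visit the QPP website?"},
    {"id": "role", "type": "select_best",
     "text": "Which of the following best describes your current role?"},
    {"id": "comments", "type": "open_text",
     "text": "Please share any additional comments or feedback regarding the QPP website in the space provided below."},
]

def questions_of_type(survey, qtype):
    """Return the ids of all questions of a given type."""
    return [q["id"] for q in survey if q["type"] == qtype]

print(questions_of_type(SURVEY, "likert"))  # → ['satisfaction', 'ease']
```

Keeping the question order explicit in one place like this makes it easy to reason about reordering, which (as noted under Learnings) mattered a great deal for completion rates.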
What was the process like for: PRA, tools, custom development, data analysis/reporting?
  • It took about six weeks to receive approval from OMB, after some back-and-forth regarding proper formatting and documentation of the request.
  • We have been very happy with Hotjar. It is flexible for data collection, can be styled to fit our site’s visual standards, and its results can be viewed within the browser or exported to a spreadsheet for easy analysis.
  • The CSAT intercept survey is short, so data analysis was efficient and could be integrated into larger CX readouts.
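Once results are exported to a spreadsheet, a basic satisfaction metric takes only a few lines. A minimal sketch, assuming a 5-point Likert scale and a conventional top-2-box score; the ratings below are illustrative, not actual survey data:

```python
# Hypothetical sketch: top-2-box CSAT score from exported Likert
# ratings (1 = very dissatisfied ... 5 = very satisfied).
def csat_score(ratings, satisfied=(4, 5)):
    """Percent of respondents choosing a 'satisfied' Likert point."""
    hits = sum(1 for r in ratings if r in satisfied)
    return round(100 * hits / len(ratings), 1)

print(csat_score([5, 4, 3, 2, 5, 4, 1, 4]))  # → 62.5
```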
What factors made you choose this approach?

This was the first time many of us had conducted this type of data collection, so we opted to keep it simple and iterate as needed.

Learnings / Challenges

What are your key learnings from the project? What recommendations would you make for other teams who will begin this work soon? Is there anything you would have done differently? Why?
  • Start earlier than you think you’ll need to. OMB can take time to respond and cannot give exact dates. They may respond with questions after a month, further extending the time before the survey can be implemented.
  • Collect the most important information first. Initially we placed demographic questions at the start, but found that most respondents dropped off after the first or second question; during the first two weeks, we received very few responses to the most important CSAT questions. Moving those questions up front and the demographic questions to the end drastically improved our data collection.
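The drop-off problem described above is easy to quantify from exported responses. A sketch of per-question drop-off rates, assuming each respondent's record shows how many questions they answered; the counts below are illustrative, not the team's real data:

```python
# Hypothetical sketch: per-question drop-off from a list of
# answered-question counts (one count per respondent).
def dropoff_by_question(answer_counts, n_questions):
    """Fraction of respondents who stopped before reaching each question."""
    total = len(answer_counts)
    reached = [sum(1 for c in answer_counts if c >= q)
               for q in range(1, n_questions + 1)]
    return [1 - r / total for r in reached]

# 10 illustrative respondents: most answer 1-2 questions, few finish.
counts = [1, 1, 2, 2, 2, 3, 4, 6, 6, 1]
rates = dropoff_by_question(counts, 6)
print([round(r, 2) for r in rates])  # → [0.0, 0.3, 0.6, 0.7, 0.8, 0.8]
```

A curve like this makes the case for reordering concrete: if drop-off climbs steeply after question two, anything placed later will be badly under-answered.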

Results / Metrics

What has the outcome of your work been to date?

Beginning in 2022, following our PRA approval and allocation of burden hours, we incorporated quantitative data collection approaches and now aim to produce three annual Customer Experience (CX) Reports; we have delivered one to date. Each CX Report focuses on a specific window of the QPP program cycle and incorporates findings from our Milestone Surveys, the intercept survey, and supporting qualitative interviews, along with relevant information from Google Analytics, heat maps, and Help Desk ticket analysis as needed.

The CSAT intercept survey results were incorporated into that initial CX Report, which focused on the QPP Submissions Window, and they will be part of the next report, which focuses on the QPP Final Feedback posting.

Do you have metrics from the CSAT survey? If so, how has your team used these to improve the customer experience?

We have conducted one intercept survey analysis to date, reflecting the first ~300 responses received this spring. The analysis included a qualitative review of open-text entries to identify themes. At this point the sample is small enough that it will be reviewed against continued data collection throughout the year and considered in future feature enhancements.
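A first automated pass over open-text entries can help surface candidate themes before the manual qualitative review. A sketch using simple keyword matching; the themes, keywords, and responses here are hypothetical, not the ones from the team's analysis:

```python
# Hypothetical first-pass theme tagging for open-text responses.
from collections import Counter

THEMES = {
    "reporting": ["submit", "submission", "report"],
    "scores": ["score", "feedback", "performance"],
    "login": ["login", "log in", "password"],
}

def tag_themes(responses):
    """Count how many responses mention each theme's keywords."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(k in lowered for k in keywords):
                counts[theme] += 1
    return counts

sample = [
    "I came to submit my quality data.",
    "Checking my performance score.",
    "Could not log in to my account.",
]
print(tag_themes(sample))
```

Keyword tagging only proposes candidates; a human reviewer still decides what the themes actually are, as in the qualitative review described above.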

Next Steps for QPP

What are your next steps for this work?

Continued data collection.


The HCD Center of Excellence is Here to Help!

The HCD Center of Excellence is here to support you as you participate in the CCSQ-wide goal of increasing customer satisfaction research. We can partner with you through key tasks, such as: 

  • Establish and test a site intercept survey tool
  • Determine questions for satisfaction, ease of use, and efficiency
  • Define a statistically relevant sample for continuous data collection
  • Resolve PRA coverage and Information Collection Request (ICR) for site intercept survey
  • Complete and submit Third Party Website and Applications (TPWA) form for the site intercept tool
  • Determine approach for survey display 
  • Establish a plan for leveraging ongoing customer data
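Defining a statistically relevant sample usually reduces to the standard margin-of-error formula for a proportion. A sketch, assuming a 95% confidence level (z = 1.96) and the conservative p = 0.5; your program's targets may differ:

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum responses for a proportion estimate within ±margin_of_error.

    Uses n = z^2 * p * (1 - p) / e^2 with the conservative p = 0.5;
    z = 1.96 corresponds to a 95% confidence level.
    """
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

print(sample_size(0.05))  # → 385 responses for a ±5% margin of error
```

A target in this range is consistent with review milestones of a few hundred responses, like the 300-response checkpoint described in the case study above.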

Use the buttons below to email us or submit a project intake through Confluence, and our team will follow up with you shortly. You can also learn more about the Customer Service Initiative.


Resources


What is a Customer Satisfaction Survey?

While customer satisfaction is a general idea, a customer satisfaction survey is a more defined term that refers to a particular type of customer feedback survey. CSAT surveys allow customers to assess and provide feedback, both quantitative and qualitative, on customer service and product quality.

These surveys are an ‘always on’ method of continuous data collection that allow us to measure aspects of the CX. These surveys typically provide insights related to: 

  • Overall customer satisfaction 
  • Ease of use 
  • Efficiency 
  • Open-ended, qualitative feedback 


Community of Practice Presentation Slides

Attachment: Community of Practice Slides_ QPP.pdf

