Consortium for Service Innovation

Service Engagement Measures Template

There are three areas in which our measures are focused: Traffic & Success, Length & Cost, and Customer Experience. You don't have to tackle these all at once! 

For each measurement, the goal is to use the most sophisticated approach you can from the good, better, and best guidance below - but don't let that stop you from using what you have available now to get started. Remember to get clear on your definitions first.

Guidance on the approaches is mostly offered for self-service and community engagements. Use your existing measures for agent-assisted engagements.

Enter data in the yellow cells and the spreadsheet will calculate for you. Copy or download the Google Sheets template.

Prerequisites

Definitions

It is important to define, for your environment, what will be included in your Service Engagement Measures Spreadsheet. We have provided good, better, and best formulas for calculations in the spreadsheet and an extensive glossary of terms, but judgment is required in terms of what makes sense for your environment! You may start out with whatever set of data you can get your hands on. Define it, and trend it over time, but the intent should be widening your scope to more thoroughly and accurately reflect the full customer experience.

For example, what you do or do not count as a self-service engagement will depend on your business and may change over time.

Data

Recommended formulas in the Service Engagement Measures Template include multiple data sources. While some of these things can be mined from tools directly, you may need to do some detective work to find out where they live. Some Members discovered that they needed to enable data capture first - and all Members recommend using your organization's existing data models wherever possible. Don't reinvent the wheel!

Engagements

  • Self-Service: user sessions for self-service mechanisms (usually in a digital experience)
  • Community: user sessions and threads
  • Assisted: cases, tickets, or service requests

Cost Per Engagement

  • Self-Service: Total associated costs (may include salary, systems, overhead - based on how your company organizes its cost structure) / number of successful self-service engagements
  • Community: Total associated costs (may include salary, systems, overhead - based on how your company organizes its cost structure) / number of successful community engagements
  • Assisted: Time spent on tasks and the volume of demand (again, based on how your company calculates cost per case)
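
The per-channel formulas above can be sketched in code. This is a minimal illustration: the cost components and the monthly figures are placeholder assumptions, and which costs to include depends on how your company organizes its cost structure.

```python
def cost_per_engagement(salary, systems, overhead, successful_engagements):
    """Total associated costs divided by successful engagements in the channel.

    The three cost components here are illustrative; use whatever
    breakdown your organization's cost structure provides.
    """
    total_cost = salary + systems + overhead
    return total_cost / successful_engagements

# Hypothetical monthly figures for a self-service channel
print(cost_per_engagement(40_000, 10_000, 5_000, 110_000))  # 0.5
```

The same function applies to the community channel; only the inputs (community costs and successful community engagements) change.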

Survey Data

  • Session surveys: self-service success
  • Generated answer feedback
  • Customer Satisfaction
  • Customer Effort
  • Net Promoter Score (NPS)

Time

Due to data limitations, the most useful strategy is to benchmark against yourself. Filling out the measures template one time will provide limited insight. Track the data over time (weekly, monthly, quarterly) against optimization efforts to assess which changes lead to meaningful improvement. We propose measuring monthly and presenting quarterly. Year-over-year analysis can account for seasonal behavior.

Filling In the Template

Number of Engagements

The volume of issues for which requestors pursue a resolution that we have visibility into. Often measured through attempts, sessions, sign-ons, searches, and content views.

  • Good = number of sessions or views per month
  • Better = number of sessions per month that include at least one content view
  • Best = same as better, plus criteria for a meaningful content view (e.g., time on page)
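
The good/better/best tiers above can be sketched against session logs. The session structure, field names, and the 30-second threshold for a "meaningful" view are all illustrative assumptions, not part of the template.

```python
# Hypothetical session logs with per-view time on page (seconds)
sessions = [
    {"views": []},                                           # no content views
    {"views": [{"time_on_page": 12}]},                       # brief view only
    {"views": [{"time_on_page": 45}, {"time_on_page": 8}]},  # one meaningful view
]

good = len(sessions)                             # all sessions
better = sum(1 for s in sessions if s["views"])  # at least one content view

MEANINGFUL_SECONDS = 30  # assumed threshold; tune for your environment
best = sum(
    1 for s in sessions
    if any(v["time_on_page"] >= MEANINGFUL_SECONDS for v in s["views"])
)
print(good, better, best)  # 3 2 1
```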

% of total engagement demand (volume)

Calculated percent of all the demand through all channels that we can assess.
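
As a sketch, the calculation is simply channel volume over visible total demand; the volumes below are hypothetical.

```python
def percent_of_total_demand(channel_engagements, total_engagements):
    """Channel volume as a percent of all demand we can assess."""
    return 100 * channel_engagements / total_engagements

# Hypothetical month: 120,000 self-service sessions out of
# 150,000 engagements across all channels
print(percent_of_total_demand(120_000, 150_000))  # 80.0
```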

% channel success

An approximation of issues resolved in channel. Often measured by surveys (which have low response rates and response bias), by an estimated percent of sessions/visits that do not lead to a case within some period of time (24 hours to 7 days), or by the number of views divided by an average number of views per issue. The intent is to measure how often a requestor's issue is resolved via self-service (the requestor is done, with an indication of success) or assisted service (a closed case with a resolution offered).

  • Good = percent of sessions/visits without a request for assistance, OR number of views divided by an assumed average number of views per issue. Note: this does not account for the significant percent of issues that are abandoned (without resolution or an assist request). Some apply a % abandon rate.
  • Better = survey "Were you successful?" percent "yes"; apply a confidence interval to ensure you have a large enough sample size. Correlate your results with "good".
  • Best = sophisticated clickstream analysis (percent of patterns that represent success or failure). This is an area of exploration and we are looking for examples of what this might look like!
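
The "good" approximation, including the optional abandon-rate adjustment from the note above, might look like this. The volumes and the 25% abandon rate are illustrative assumptions to be calibrated for your environment.

```python
def channel_success_good(sessions, assist_requests, abandon_rate=0.0):
    """'Good' approximation: percent of sessions without a request for
    assistance, optionally discounted by an assumed abandon rate.

    abandon_rate is the assumed fraction of no-assist sessions that
    were actually abandoned without resolution (an assumption).
    """
    no_assist = sessions - assist_requests
    resolved = no_assist * (1 - abandon_rate)
    return 100 * resolved / sessions

# Hypothetical month: 10,000 sessions, 2,000 led to a case;
# assume 25% of the remaining sessions were abandoned
print(channel_success_good(10_000, 2_000, abandon_rate=0.25))  # 60.0
```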

Number of successful engagements

Calculated volume of completed engagements based on number of total engagements multiplied by % channel success.

% of total successful engagements

Calculated ratio of successful engagements in channel as a percent of total demand.

This calculation is a reflection of Customer Experience and indicates opportunities to improve content and mechanisms.
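
The two derived measures above chain together directly; the figures below are hypothetical and carry forward the earlier illustrative volumes.

```python
def successful_engagements(total_engagements, channel_success_pct):
    """Total engagements multiplied by % channel success."""
    return total_engagements * channel_success_pct / 100

def percent_of_total_successful(channel_successes, total_demand):
    """Successful engagements in channel as a percent of total demand."""
    return 100 * channel_successes / total_demand

# Hypothetical self-service channel: 120,000 engagements at 60% success,
# against 150,000 total engagements across all channels
successes = successful_engagements(120_000, 60)
print(successes)                                        # 72000.0
print(percent_of_total_successful(successes, 150_000))  # 48.0
```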

Average Length of Successful Engagement

Customer time to resolve; requires assumptions based on your products and customers. Authentication can help with measuring length of engagement.

  • Average time of a sample of successful sessions (using clickstream analysis). Note: as you improve the accuracy of identifying successful sessions, this will improve the accuracy of your measures.
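
Once clickstream analysis has labeled a sample of sessions as successful, the calculation is a simple average. The durations below are illustrative placeholders.

```python
# Hypothetical durations (minutes) for a sample of sessions that
# clickstream analysis has labeled as successful
successful_session_minutes = [4.5, 7.0, 3.5, 9.0, 6.0]

average_length = sum(successful_session_minutes) / len(successful_session_minutes)
print(average_length)  # 6.0
```

As the note above says, improving how accurately you identify successful sessions improves the accuracy of this average.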

Cost Per Successful Engagement

Total costs (may include salary, systems, overhead) associated with self-service - not easy to isolate - allocated based on time spent on tasks and the volume being served (cost/min and cost/engagement).

This is a great spot to use your organization's existing calculation for costs.  The Consortium's position is that all KCS-related costs should be counted under Assisted Engagements since even without fueling self-service, KCS provides ongoing benefit to the organization.
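
One way to sketch the time-based allocation above (cost/min and cost/engagement): divide total channel costs by minutes served, then multiply by the average minutes per successful engagement. All figures here are hypothetical placeholders; your organization's existing cost calculation should take precedence.

```python
# Hypothetical monthly figures for a self-service channel
total_cost = 60_000.0     # salary + systems + overhead allocated to the channel
minutes_served = 400_000  # total engagement minutes in the month
successes = 100_000       # successful engagements

cost_per_min = total_cost / minutes_served            # cost per minute served
avg_minutes_per_success = minutes_served / successes  # average engagement length
cost_per_success = cost_per_min * avg_minutes_per_success
print(round(cost_per_success, 2))  # 0.6
```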

Customer Satisfaction

Existing standards: survey. How satisfied were you with your experience?

Customer Effort

Survey results depend largely on customer expectations.

  • Good = Survey
  • Better = Sophisticated click stream analysis
  • Best = Journey mapping and tracking improvements in the journey over time

Customer Loyalty

Use Net Promoter Score (NPS) or other existing standards.
