Our goal is to provide a view of the health and value of knowledge offerings across all relevant channels, or put another way: How well are we doing at connecting customers to content?
Why Measure Success With Self-Service?
We generally refer to the benefits of KCS in three stages:
- Operational efficiency
- Customer success with self-service
- Improved products & services
Operational efficiency is relatively easy to measure. After implementing KCS, it doesn't take long to see improved resolution times, improved first call resolution, and/or reduced escalations, for example. But once those gains in efficiency are achieved, efficient operation simply becomes how you run your business; the incremental value has already been realized.
Success with self-service is a mid-term benefit of KCS. It takes time to publish and improve articles that are findable and usable by an external audience. For years, Consortium members have struggled to quantify the benefits of improved success with self-service. We have lots of anecdotal evidence: customers are happier when they find answers without opening a case, and knowledge workers are happier when they get to solve new problems instead of repeating known answers. However, anecdotes alone rarely justify an investment in knowledge and self-service.
Justify Your Investment in Knowledge
The Service Engagement Measures Spreadsheet is intended to help justify your transformation investment. It demonstrates ROI for your KCS program, your staff, and your tools. It also offers a way to communicate to the company at large the health and engagement of your install base. This may require an investment in analytics, or at least connecting with the people who have the data you need.
Delivering knowledge through self-service is about multiplying reach! We can leverage knowledge captured during assisted interactions at a VERY low cost. How do we know we're getting full value out of our knowledge implementation?
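The reach-multiplier idea above can be sketched as a back-of-the-envelope calculation. Everything in this sketch is a hypothetical illustration: the cost figures, the `deflection_rate`, and the `self_service_value` helper are invented for the example, not values from the Consortium work. Judgment is required in choosing inputs that fit your environment.

```python
# Hypothetical sketch: compare the fully loaded cost of assisted support
# with the value of answers found through self-service. All numbers are
# invented placeholders; substitute figures from your own environment.

COST_PER_ASSISTED_CASE = 50.00       # assumed cost of one assisted interaction
MONTHLY_SELF_SERVICE_COST = 8000.00  # assumed platform + curation cost per month

def self_service_value(successful_sessions: int, deflection_rate: float) -> float:
    """Estimate monthly net value of self-service as avoided assisted cost.

    successful_sessions: self-service visits where the user found an answer
    deflection_rate: assumed fraction of those visits that would otherwise
                     have become assisted cases (judgment required!)
    """
    avoided_cases = successful_sessions * deflection_rate
    avoided_cost = avoided_cases * COST_PER_ASSISTED_CASE
    return avoided_cost - MONTHLY_SELF_SERVICE_COST

# Example: 10,000 successful sessions, 40% assumed deflection
print(self_service_value(10_000, 0.40))  # 10000 * 0.4 * 50 - 8000 = 192000.0
```

Even with conservative assumptions for the deflection rate, this kind of estimate makes the "multiplying reach" claim concrete: each article answers a question once in the assisted model, then many more times at near-zero marginal cost.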
Benchmark Against Yourself
Measuring success with self-service becomes very complex, very quickly. Because of the variables involved, the only meaningful baseline is against your own performance. This means you need to define your scope in a way that makes sense in your environment, and you cannot report out on it without context.
Judgment is required!
This work was developed and tested over the course of more than 20 meetings, with a group of Consortium members representing 18 widely varied companies - small, medium, and large enterprises, both publicly and privately held, focused on both business and consumer audiences, and at multiple stages of KCS maturity. The one constant in our initial findings was that these numbers are only meaningful in the context of an organization's larger environment; there is very little value in attempting to benchmark against other organizations. Our recommendation is to gather data monthly and report out quarterly. Once you have a completed spreadsheet - once you start to get a sense of the total demand for support outside the assisted model - ask yourself: can you imagine what would happen if you turned off self-service?
Please see the Measurement Matters v6 paper for more benefits offered by a mature KCS implementation.