
Technique 6.5: KCS Process Integration Indicators

Integrating use of the knowledge base into the knowledge workers' workflow is a Core Concept of KCS. The frequency and quality of our Solve Loop activities dictate the level of benefit we will realize. The Process Integration Indicators (PII) enable us to assess how often and how well we follow the workflow. PII is most valuable as a tool to promote learning and growth: to identify coaching moments. It provides insight into people's behaviors and the degree to which the Solve Loop activities have become a habit for knowledge workers. Most of the Process Integration Indicators are activities, and we do not want to put goals on them.

 

The activities or behaviors that create value require judgment. The Solve Loop activities must be done in the context of the bigger picture: an understanding of the long-term potential value of what we are doing. Goals on activities drive behavior in the absence of judgment and will corrupt the knowledge base. Do not put goals on activities!

 

In order to promote the behaviors that create value, we need to understand how the activities relate to the outcomes. This is a primary focus of the KCS Coach. While observation is an important element of effective coaching, there are a number of indicators that can help the coach identify areas for learning and growth. The combination of the Article Quality Index (AQI) and the Process Integration Indicators (PII) provides coaches with the perspective they need to help people improve the value they create.

 

The Process Integration Indicators need to be tailored to the workflow the organization has defined. To provide an example of these indicators, we will use the generic workflow defined in Technique 6.1.

Search Indicators

Search early, search often. It is helpful to know how often and when we are searching the knowledge base, especially early in the adoption of KCS as people are learning the KCS workflow. Here again, a coach's observation of knowledge workers doing their work and how they are using the knowledge base is important. A helpful complement to observation is reporting on search activity. The timing of the first search relative to case open, and the frequency of searches done while responding to the request, can inform the coaching activity. Searching is an activity, so don't put goals on it! The reports should be used to identify coaching opportunities about the knowledge worker's behavior; the conversation should be about the behaviors, not the search indicators.
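As a minimal sketch of the reporting described above, the snippet below computes the delay from case open to first search from per-case timestamps. The record shape and field names are illustrative assumptions, not a defined KCS schema.

```python
from datetime import datetime

def first_search_delay_minutes(opened, searches):
    """Minutes from case open to the first search, or None if no search
    was recorded (a possible coaching moment)."""
    if not searches:
        return None
    return int((min(searches) - opened).total_seconds() // 60)

# Hypothetical case data: opened at 9:00, searched at 9:02 and 9:20.
opened = datetime(2016, 6, 1, 9, 0)
searches = [datetime(2016, 6, 1, 9, 2), datetime(2016, 6, 1, 9, 20)]
print(first_search_delay_minutes(opened, searches))  # minutes to first search
print(len(searches))                                 # search frequency on the case
```

A report like this only flags conversations to have; the coaching itself is about the behavior, not the number.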

Contribution Indicators

Knowledge workers contribute to the health and value of the knowledge base when they:

  • Reuse and link, accurately 
  • Modify, when appropriate
  • Create, if an article doesn't exist 

 

We want to know how often we are capturing the experience and learning from our interactions. Our contribution shows up in a couple of different ways.  If we reuse an existing article, we want to record the use of the article by linking.  If we learned additional information about a known issue, we want to add that to the existing article: modify. And, if no article exists that reflects the request and resolution, we want to create one.  Reuse (linking), modify, and create all represent the primary ways we contribute.

 

Reuse and link: How we record reuse will vary from organization to organization.  In environments that have cases or incidents that reflect the interaction, we want to link the resolution article to the case or incident. In environments that do not use a case or incident, we need to provide a way to record the reuse of articles. Analyzing the patterns of article reuse helps us identify opportunities for high impact business improvements in the Evolve Loop. Link rate is the percentage of closed cases with an article linked. Link rate is an activity: do not put a goal on link rate!  Link accuracy is the percent of articles linked that are actionable, specific, and relevant to the case. Link accuracy is an outcome (not an activity), so it should have a goal.  
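The two definitions above can be sketched as simple ratios. The counts here are hypothetical; in practice, link accuracy is judged on a sample of linked articles.

```python
def link_rate(closed_cases, cases_with_link):
    """Percent of closed cases with an article linked.
    An activity: do not put a goal on it."""
    return 100.0 * cases_with_link / closed_cases

def link_accuracy(sampled_links, accurate_links):
    """Percent of sampled linked articles judged actionable, specific,
    and relevant to the case. An outcome: a goal is appropriate."""
    return 100.0 * accurate_links / sampled_links

print(link_rate(500, 350))    # 350 of 500 closed cases had a link: 70.0
print(link_accuracy(50, 46))  # 46 of 50 sampled links were accurate: 92.0
```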

 

Link rate and accuracy are both helpful indicators. While linking is a good habit to develop across the organization, the reason we care about link rate is that the pattern of article reuse can drive business improvements in the Evolve Loop. A link rate of 60%-80% is sufficient. A link rate of 60% or greater will allow us to determine the pattern; driving the link rate to 90% will not change it. Once the sample of linked cases is large enough and representative of the overall caseload, more data will not change the pattern, and for most organizations 60% is well above that threshold. The same is not true for link accuracy. If our link accuracy is below 90%, the pattern of reuse of articles is not likely to tell us anything useful. Some member organizations give zero credit for link rate if a knowledge worker's link accuracy is not 90% or higher. Maintaining high link accuracy is far more important than maintaining a high link rate.

 

Modify rate: As we reuse articles, we often have additional information that could improve what already exists. It may be information that improves the clarity or usefulness of an article. More often, it is additional context we can add, based on how the requestor experienced or described the issue; adding this to an existing article will improve its findability. The modify rate reflects the percent of the time we modified an existing article when appropriate: divide the number of articles that have been modified by the number of articles reused where a modification was warranted. Here again, calculating the total number of articles reused where a modification was appropriate can be difficult, and a sampling technique can be used to assess the modify rate.
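The modify rate calculation above, using hypothetical counts from a sample (the denominator is the sampled reuses where a change was warranted):

```python
def modify_rate(modified, modify_candidates):
    """Percent of reused articles that were modified when a modification
    was warranted. The denominator is typically estimated by sampling."""
    return 100.0 * modified / modify_candidates

# Of 40 sampled reuses where a change was warranted, 28 were modified.
print(modify_rate(28, 40))  # 70.0
```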

 

Create rate: What percentage of the time are we creating new articles when appropriate? Similar to the above, the math is to divide the number of times an article was created (subtracting any duplicates) by the opportunity to create.
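The create rate math, with the duplicate subtraction made explicit; all counts are illustrative.

```python
def create_rate(created, duplicates, create_opportunities):
    """Percent of create opportunities where a non-duplicate article
    was actually created. Duplicates do not count as contributions."""
    return 100.0 * (created - duplicates) / create_opportunities

# 120 articles created, 10 of which were duplicates, against 200 opportunities.
print(create_rate(120, 10, 200))  # 55.0
```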

Contribution Index

The contribution index tells us: of all the opportunities where linking, modifying, or creating was appropriate, what percentage of the time did we link, modify, or create? It is the ratio of the number of times we contributed to our total opportunity to contribute. Or, more simply, how often did we do the right thing? The math for the contribution index is the number of times we linked, modified, and created divided by the total opportunity where a link, modify, or create was appropriate. One important thing to consider in this calculation is duplicates: we want to subtract any duplicate articles from the number of articles created. If we can easily find a preexisting article that should have been reused, then creating an article was not appropriate and should not be included in the article creation count. The links to a duplicate article should not be counted in the relevant link count either. Assessing the total opportunity to contribute is difficult; however, just as with the AQI, we can assess it with a sampling technique.
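Putting the pieces together, a sketch of the contribution index under the definition above (which counts links, modifies, and creates in the numerator). All counts are hypothetical, and `linked` is assumed to already exclude links to duplicate articles, per the paragraph above.

```python
def contribution_index(linked, modified, created, duplicates, opportunities):
    """Contributions (links + modifies + creates, net of duplicate
    articles) as a percent of the total opportunity to contribute.
    `opportunities` is typically estimated by sampling, as with AQI."""
    return 100.0 * (linked + modified + (created - duplicates)) / opportunities

print(contribution_index(linked=300, modified=40, created=60,
                         duplicates=5, opportunities=500))  # 79.0
```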

 

In defining the contribution index, we have described the ultimate model for assessing contribution. Members are at various stages of maturity with this model.

 

These concepts were pioneered by David Kay, KCS Certified Trainer and Consortium Innovator, in his work on the Resolution Quality Index (RQI), developing an indicator for "how often did we do the right thing."

Sample PII
