
Technique 3: The Balanced Scorecard

Measures for Individuals and Teams

Now that we understand the roles, we are ready to build goals and metrics for our team. We have found an approach based on The Balanced Scorecard by Kaplan and Norton to be a very helpful methodology. It proposes a number of critical concepts:

  • Link individual goals to department and organizational goals to help teams see how their performance is related to the higher-level performance of the company.

  • Look at performance from multiple points of view. The typical scorecard considers the key stakeholders: customers, employees, and the business.

  • Distinguish leading indicators (activities) from lagging indicators (outcomes).

 

In this section, we show examples of how we apply the scorecard methodology to establish and maintain appropriate measures for both individuals and teams:

  • Leading and lagging indicators

  • Triangulation—looking at things from at least three perspectives to see who is creating value

  • Radar charts—a value footprint

  • Sample scorecards for Analysts and managers

  • Activities vs. Outcomes

 

Goals placed on activities will corrupt the knowledge base.

Our first important concept distinguishes between activities and outcomes. If we put goals on leading indicators (activities), we will get what we ask for. Unfortunately, the activity by itself is not an indicator of value. For example, if we set a goal for each Support Analyst to create 10 KCS articles per month, we will get 10 KCS articles a month. However, if we are paying attention, we will notice these KCS articles are often created in the last few days of the month and contain little or no valuable information (things like "fixed the customer problem"). Goals on activities do not generate the desired outcome. In fact, in a knowledge management environment, goals based on activity will corrupt the knowledge base.

 

In our example, the outcome we want is a quality knowledge base or, perhaps, customer success on the web. The outcome needs to be the focus, not the activity.

 

Putting goals on activities will:

  • Create unwanted results 

  • Destroy the value of the indicator

  • Distract people from the real objective

  • Relieve people of the responsibility to use their own judgment

  • Make leadership look less competent

  • Disenfranchise people

 

A very helpful concept from The Balanced Scorecard distinguishes among performance drivers (motivators—we cover these in the Leadership section), leading indicators (activities), and lagging indicators (the results or outcomes). While each of these three elements is important, the role each plays in the measurement system is different. Making a distinction between them is crucial.

 

We need to pay attention to the trends in the activities and their ongoing correlation to the outcomes:

  • Are the activity measures heading in the right direction?

  • How rapidly are they changing?

  • Do the Support Analysts have timely visibility into their performance indicators?

[Figure: Slide26]

While the distinction between activity and outcome measures is critical, we find people struggle with identifying which indicators are activities (leading indicators) and which are outcomes (lagging). Here are some helpful ways to test an indicator:

  • Easy to measure—probably an activity

  • Hard to measure—probably an outcome

  • Easy to "game"—probably an activity

  • Hard to "game"—probably an outcome

  • Only measurable after the fact (when the incident is closed or at the end of the month or quarter)—probably an outcome

 

See the Metrics Matrix section for examples of activity and outcome metrics.

Triangulation—Who is Creating Value?

The distinction between activities and outcomes is only part of the picture. Effective performance assessment in KCS comes from the integration of three different perspectives: trends in activities (performance over time), key outcomes (measured against goals), and the KCS Article Quality Index (discussed in Content Health). These three perspectives consider measures that are both objective (quantifiable) and subjective (qualitative) to generate a complete view of performance.

[Figure: Slide27]

The concept of triangulation reflects the idea that the creation of value cannot be directly measured or counted—value is intangible. We believe the best way to assess the creation of value is through a process of triangulation. As with GPS (global positioning system) devices that calculate our location on the earth based on input from multiple satellites, an effective performance assessment model incorporates multiple views to assess the creation of value.

 

We offer, as an example, a collection of measures to create an initial assessment model. Every organization must be thoughtful about developing its own set of metrics, aligned with its organizational-level goals (documented in the strategic framework).

 

The choice of measures for KCS must focus on the attributes that create value for the organization. The integration of the following dimensions creates a comprehensive view of performance, which in turn gives us confidence in assessing who is creating value and who might need attention from a Coach.

 

An integrated view of measures includes:

  • Trends in activities and attainment of goals in outcomes

  • Qualitative and quantitative

  • Team and individual

Aligning to Business Objectives—Balanced Scorecard Examples

The balanced scorecard format helps ensure that we have encompassed the full range of objectives. We recommend referring to Kaplan and Norton's book for guidance on the scorecard creation process.

 

To help apply the balanced scorecard process to KCS, below are two examples of scorecards based on different roles. Notice how the business objectives—Customer Loyalty, Collaboration and Teamwork, Process and Operations—are reflected and measured differently for the individual and the manager roles, and some goals—Knowledge Contribution, Employee Loyalty, and Strategic Initiatives—are role-specific.


The Support Analyst's scorecard reflects the level where the work really is done:

[Image: image066.gif — sample Support Analyst scorecard]

 

 

The manager's scorecard translates between organizational objectives and Analyst objectives:

[Image: image067.gif — sample manager scorecard]

 

Regardless of the leading indicators an organization chooses, two imperatives have emerged for the KCS performance assessment system:

  • The KCS leading indicator trends must be visible to the people doing the work.

  • Goals should be set for outcomes, not for leading indicators.

Make Trends Visible to the Analysts

Consider a driving analogy: We want to go from San Francisco to Yosemite National Park. We could reasonably expect to make the 180-mile drive in three to four hours at an average speed of 55 miles per hour (mph). Our desired outcome is to reach the destination of Yosemite in a reasonable period, but we will not know if we have been successful until we arrive. What would we need for the trip? We need a car, a driver's license, and some gas, but a successful trip requires that we also pay attention to many other factors (leading indicators) along the way. Because we would like to average 55 mph, we want to pay attention to how fast we are going. Because we have determined three to four hours is the acceptable period, we want to be aware of the passage of time at different speeds and how much gas we have in order to avoid refueling delays.

 

The dashboard in the car is very helpful in informing us about the enabling factors for a successful trip. In KCS, the trends in the leading indicators are the dashboard that lets the Analysts and the organization know the status of the enabling factors. They must be visible to the people who are driving the KCS system: the Support Analysts.

 

We emphasize this visibility because we have seen multiple organizations implement KCS without providing the Support Analysts the feedback they need to adjust their behavior and create optimal results.

Goals for Outcomes, Not Activities!

Because leading indicators are quantifiable activities, they are often easier to measure than outcomes. This creates an almost irresistible urge to put specific goals on the leading indicators. This is counter-productive.

 

Consider the trip to Yosemite again. If the stated goal were solely maintaining an average of 55 mph, it could be done. But without understanding the objective, the driver will choose roads that allow them to maintain the average speed regardless of destination. We might end up in Chico! Not that Chico is a bad place; it just is not where we wanted to go.

 

During the KCS adoption process, we have seen organizations put goals on KCS article creation (everyone should create five KCS articles a week) or KCS article reuse (Analysts will be measured on how often they reuse KCS articles). The goals for these leading indicators may have been met, but the quality of the knowledge base has been seriously compromised. Invalid and duplicate KCS articles are created because the focus is on the activity, not the outcome. Worse, emphasis can shift to gaming the system rather than generating real value. Inevitably, quality and morale suffer, management looks less competent, and the value of the knowledge is diminished.

 

However, the trends in the leading indicators are a great basis for insight into how effective the organization's leadership has been in describing the purpose and benefits of KCS. If people understand why they are doing it and what is in it for them (WIIFM), the likelihood that they will participate appropriately is greatly increased. Refer to the Leadership & Communication practice for more details on motivation.

Trend Lines Matter

The second crucial idea for leading indicators is the importance of managing their trends. Performance is about results, not activities. Activities are necessary for results, but the performance insight lies in the trends of the leading indicators, not in their absolute values.

 

Reports should be available to Support Analysts on a timely basis. They must understand how they are doing and how their performance contributes to the group's performance. Asking Support Analysts to drive the KCS system without this information is like asking someone to drive a car that has no dashboard.

A Scenario—Examples of KCS Reports

The example below is for the first six months of an organization's adoption of KCS. 

[Figure: Slide28]

KCS Article Creation and Reuse

KCS article creation naturally leads KCS article reuse. As an organization approaches maturity, it will have already captured a high percentage of the known issues as KCS articles, so the creation rate should drop off while the reuse rate continues to climb. Because of its link to product life cycles, this pattern will repeat itself with each new product or application introduced.

KCS Article Life Cycle Trend

The KCS article life cycle gives us a sense of the speed with which KCS articles are moving from a draft state to an approved state. Because the value of the knowledge increases as the potential audience or visibility increases, we want to make sure that there is no bottleneck in the system. KCS articles in the approved state are generally visible to a much larger audience than draft KCS articles, while published KCS articles are generally available to users or customers outside the support organization.

 

In the chart below, we see good movement of KCS articles from "draft" to "approved." Of the 9,000 KCS articles created in the knowledge base, about 7,800 are approved or published. There does, however, seem to be a bottleneck in getting KCS articles published—a lag of at least three months—which would be out of place in a mature KCS implementation. An organization in this situation would need to make sure that it has eliminated all possible barriers to external publication.

[Figure: Slide29 — KCS article life cycle trend]
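
The life-cycle numbers above are straightforward to compute. Here is a minimal sketch in Python, assuming article records that carry a state and a couple of timestamps. The Article fields, state names, and sample dates are illustrative assumptions, not the guide's actual data model:

    # Hedged sketch: tally article states and the average lag from creation
    # to publication. Field names and states are assumptions for illustration.
    from collections import Counter
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Article:
        state: str                   # "draft", "approved", or "published"
        created: date
        published: date | None = None

    def lifecycle_summary(articles: list[Article]) -> None:
        counts = Counter(a.state for a in articles)
        print({s: counts.get(s, 0) for s in ("draft", "approved", "published")})
        # Average days from creation to publication for published articles;
        # a large number here is the publication bottleneck discussed above.
        lags = [(a.published - a.created).days for a in articles if a.published]
        if lags:
            print(f"Average days to publish: {sum(lags) / len(lags):.0f}")

    lifecycle_summary([
        Article("published", date(2014, 1, 5), date(2014, 4, 20)),
        Article("approved", date(2014, 2, 1)),
        Article("draft", date(2014, 5, 10)),
    ])  # -> one article per state, and a roughly 3.5-month publication lag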

The Power of Participation

Participation is a leading indicator (an activity). Participation is defined as the percentage of time the Support Analysts use the knowledge base to solve an incident for which knowledge is appropriate. We divide the number of incidents that have a resolution identified in the knowledge base by the total number of applicable incidents closed. Participation rate is also referred to as the "linking rate" as it reflects the number of incidents closed with links to content. 

 

Participation is an important trend to watch as the organization adopts KCS. In general, a healthy Participation rate for an organization is in the range of 60-80%, although specific numbers vary based on which incidents are included in this calculation. Participation indicates how often the knowledge base is being used as part of the problem-solving process. The Participation rate is incremented by creating a new KCS article as well as by using an existing KCS article. For example, if we closed ten incidents this week, and we reused six KCS articles and created two new KCS articles, our participation rate would be 80%.

 

Not all incidents benefit from knowledge reuse, so it is sometimes appropriate to exclude them from the Participation rate calculation. If there are incident closure codes or incident categories to which knowledge simply doesn't apply (for example, updating customer contact information, moving a license key to a new system, or rerouting a misqueued call), these codes or categories can be excluded from the set of incidents on which the Participation rate is calculated. This has the effect of moving Participation closer to 100%, as more of the remaining incidents will benefit from knowledge.

 

In some organizations, there isn't an "incident" per se, so there is no ratio of knowledge use to incidents, and thus no Participation "rate."  Even without the rate, calculating Participation as the total number of KCS article reuses and KCS article creations provides some insight into the organization's engagement with Solve Loop practices.
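
As a concrete companion to the definition above, here is a minimal sketch of the Participation rate calculation in Python. The incident fields and the excluded closure codes are hypothetical; substitute whatever your incident management system actually records:

    # Hedged sketch of the Participation (linking) rate described above.
    from dataclasses import dataclass

    # Closure codes assumed not to benefit from knowledge (see examples above).
    EXCLUDED_CODES = {"contact-update", "license-move", "misroute"}

    @dataclass
    class Incident:
        closure_code: str
        linked_article_id: str | None   # article reused or created, if any

    def participation_rate(incidents: list[Incident]) -> float:
        """Percent of applicable closed incidents with a resolution linked."""
        applicable = [i for i in incidents if i.closure_code not in EXCLUDED_CODES]
        if not applicable:
            return 0.0
        linked = sum(1 for i in applicable if i.linked_article_id is not None)
        return 100.0 * linked / len(applicable)

    # The worked example from the text: ten incidents, six reuses plus two
    # newly created articles (eight links), and two closed without a link.
    incidents = [Incident("resolved", f"KB-{n}") for n in range(8)]
    incidents += [Incident("resolved", None)] * 2
    print(f"{participation_rate(incidents):.0f}%")   # -> 80%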

This organization appears to be doing well, although perhaps it has taken too long to get there. What about the individual contribution? Can we tell who is creating value?

[Figure: Slide30]

We must look at participation for the group as well as for the individual.

[Figure: Slide31 — participation by individual]

Here we can see there is a wide range of participation across the group. A conversation with Ed and Joe about their problem-solving process would be a good idea. Because the participation rate is the ratio of incidents closed with a KCS article linked to total incidents closed, this view makes it hard to come up with a scenario in which Joe or Ed are "doing well." Participation rate is a powerful indicator of anyone who is not playing. Again, we raise the caution that the conversation with the Support Analysts needs to be about their understanding of KCS, their problem-solving process, and their use of the knowledge base, not about their participation number.

 

Let us consider Kim and Hector—are they the new heroes of the organization? We don't know enough about them to say whether they are creating value in the knowledge base or are just busy creating KCS articles that might be duplicates or incomplete. We need more information.

Profiles of the Players

It is interesting to look at a KCS indicators profile by individual. Below is an example of Hector's profile. While it contains a tremendous amount of data, the combination of factors gives us a sense of Hector's contribution. We have averaged many of the factors over a week's time. Incidents closed, KCS articles linked (used), KCS articles created, KCS articles modified (improved), and citations (others using KCS articles Hector has created) are all represented on a per-week basis. Time to resolve and first contact resolution are monthly averages.

 

On the participation chart above, Hector and Kim both appear to be star performers. With the profile view, we see something different:

[Figure: Slide32 — Hector's profile]

Kim, on the other hand...

[Figure: Slide33 — Kim's profile]

Here is a great example of why a profile with multiple indicators is preferable to one with only a single measure. If we looked only at participation, both Kim and Hector would appear to be doing very well. However, upon reviewing Kim's profile, we see that the KCS article creation rate represents most of her activity. She does not often modify others' KCS articles and, in fact, does not reuse others' KCS articles very often. The difference between KCS articles linked (used) and KCS articles created represents KCS article reuse.

 

In Kim's case, we see KCS article creation makes up most of the KCS articles linked (used), which indicates relatively low reuse. Based on her citation levels, we can also see that others are not using the KCS articles that Kim creates. We might infer from all this that Kim is not searching for KCS articles before creating a new one, and that the KCS articles she creates are not very useful to others. A conversation with Kim is definitely in order. It may be that she does not understand the KCS processes. However, Kim may also be working on a new release or supporting a beta product, in which case her profile might represent a good contribution.
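
The arithmetic behind this inference is simple enough to automate as a first-pass screen. A hedged sketch, using invented weekly numbers that merely echo the pattern in the profiles (not Hector's or Kim's actual data):

    # Reuse is the gap between articles linked (used) and articles created.
    def reuse_share(linked_per_week: float, created_per_week: float) -> float:
        """Fraction of linked articles that were reuses of existing articles."""
        if not linked_per_week:
            return 0.0
        return (linked_per_week - created_per_week) / linked_per_week

    # Hypothetical weekly averages for two profiles:
    for name, linked, created, citations in [
        ("Hector", 20, 5, 12),   # mostly reuse; peers cite his articles
        ("Kim", 20, 17, 1),      # mostly her own new articles; few citations
    ]:
        print(f"{name}: reuse share {reuse_share(linked, created):.0%}, "
              f"citations/week {citations}")

A low reuse share combined with low citations is the pattern that warrants the conversation described above; it is a screen, not a verdict.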

 

Even with all this data, we still do not have enough information to determine who is creating value. We are missing a qualitative view (KCS article quality) to balance the quantitative view (activity).

 

If we refer back to the KCS Article Quality Index we discussed earlier, we can get an additional perspective on Hector and Kim.

 

Sample: Detailed KCS Article Quality Checklist

 

Hector's quality index is 99.2%; he consistently creates KCS articles that adhere to the content standard. In contrast, Kim's quality index is 87.5% and her frequency of duplicates is very high. This number reinforces the idea that Kim is not searching before solving and creating. While her activity level is excellent, that activity is corrupting the knowledge base because of the level of duplicate KCS articles introduced.
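
For readers who want to see how such an index might be computed, here is a minimal sketch, assuming a weighted checklist like the sample above. The criteria and weights are hypothetical; your content standard defines the real ones:

    # Hedged sketch of an Article Quality Index (AQI) calculation.
    CHECKLIST = {                        # criterion -> weight (assumed)
        "complies with template": 3,
        "searchable title": 2,
        "resolution is complete": 3,
        "not a duplicate": 4,            # duplicates weighted heavily
    }

    def article_score(results: dict[str, bool]) -> float:
        """Weighted pass rate for one sampled article, 0..1."""
        possible = sum(CHECKLIST.values())
        earned = sum(w for c, w in CHECKLIST.items() if results.get(c, False))
        return earned / possible

    def quality_index(sample: list[dict[str, bool]]) -> float:
        """AQI: average weighted score across a random sample, as a percent."""
        return 100.0 * sum(map(article_score, sample)) / len(sample)

    perfect = {c: True for c in CHECKLIST}
    duplicate = {**perfect, "not a duplicate": False}
    print(f"{quality_index([perfect, duplicate]):.1f}%")   # -> 83.3%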

Radar Charts - Creating a Value Footprint

The scenario with Hector and Kim requires a great deal of data and analysis with multiple charts for multiple people. A leader with a team of 15 Support Analysts is not likely to have the time to routinely do that level of analysis. Can we make it easier to quickly identify who is creating value and who needs help? For rapid assessment, we use a tool called the Radar Chart.

 

Creating a Radar Chart requires some thought. First, we want to be sure we get a balanced view. Our key metrics should reflect a balance of:

  • Leading and lagging indicators (activities and outcomes)

  • Quantity and quality

 

The leading indicators (activities) are compared to the team average (not a goal) and lagging indicators (outcomes) are compared to the goal.

 

Second, we have to normalize the values to a common scale. For this example we will normalize to 1, so anything less than 1 is not meeting the team average or the goal, and anything greater than 1 is better than the team average or exceeds the goal. For the leading indicators (activities), we normalize to the team average (do not put goals on activities). For the lagging indicators (outcomes), we normalize the goal to 1.

 

We have to decide what measures to use in the radar chart.  Organizations that use radar charts each have their own set of measures, usually defined by the KCS Adoption team. 

 

For our example we will use Hector and Kim's data from the scenario and we will use the following measures:

  • Customer satisfaction index (assuming this is captured at the individual level)
    • Based on post-incident closure surveys; the goal is normalized to 1
  • Knowledge contribution
    • Article quality index (AQI); based on sampling and scoring of articles, the goal is normalized to 1
    • Citations (peers' use of articles, # per month; 1 = team average)
  • Process and Operations
    • Incidents handled; the number of incidents handled per month, the team average is normalized to 1
    • Avg TTR: average time to relief (average minutes to provide relief/answer), the team average is normalized to 1 (note that the individual values for Avg TTR have to be inverted: a TTR shorter than the team average gets a value greater than 1, and a longer TTR gets a value less than 1; see the sketch after this list)
    • Participation rate; the % of cases closed with a resolution linked, the team average is normalized to 1
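
The normalization itself is easy to automate once the measures and references are chosen. This is a sketch under stated assumptions: the values and references below are invented (not Hector's or Kim's data), and the inversion for Avg TTR is the one subtlety:

    # Hedged sketch of normalizing radar-chart measures to 1.
    def normalize(value: float, reference: float, invert: bool = False) -> float:
        """Scale a measure so the team average or goal equals 1.
        invert=True is for measures where lower is better (e.g., Avg TTR)."""
        return reference / value if invert else value / reference

    # Hypothetical values for one Analyst: (value, reference, invert, meaning of 1.0)
    measures = {
        "CSAT index": (0.92, 0.90, False, "goal"),          # lagging
        "AQI": (0.95, 0.90, False, "goal"),                 # lagging
        "Citations/month": (14, 10, False, "team avg"),     # leading
        "Incidents/month": (85, 100, False, "team avg"),    # leading
        "Avg TTR (min)": (32, 40, True, "team avg"),        # leading, inverted
        "Participation": (0.75, 0.70, False, "team avg"),   # leading
    }

    for name, (value, ref, invert, kind) in measures.items():
        print(f"{name:16s} {normalize(value, ref, invert):5.2f}  (1.0 = {kind})")

Each normalized value becomes one spoke of the radar chart.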

 

Once we have decided on the measures we want to include in the radar chart, and on the calculations for normalizing them to 1, we can plot the chart. Following are Hector and Kim's value footprints. These charts are much easier to read than the array of graphs we used in the scenario.

[Figures: Slide34 and Slide35 — Hector's and Kim's value footprints]

By comparing an individual's performance to the team averages for leading indicators and to the goals for the lagging indicators, we can quickly see that Hector is creating value and Kim needs some help. This is a helpful way to view measures, so long as we have a balanced view of leading and lagging indicators as well as qualitative and quantitative measures. However, no measure or collection of measures can be meaningful without an understanding of the context in which the individual works and the role of the individual. Assessing the data in the context of the environment is a key responsibility of the team manager. We find that the assessment of value creation in a KCS environment is so different from the transaction- and activity-based measures we have conditioned first- and second-line managers to use that managers need training on how to interpret and use value-based measures.

Is What We Talk About Important?

The conversation about performance improvement is about behavior, process, and understanding, not about the numbers.

Note that the eventual conversation with Kim should be about the behaviors and her process for problem solving, not about her participation numbers or the quality index. The numbers are the indicators. If the conversation is about the numbers, then the numbers become the focus. We want Kim to adjust her behavior; her problem-solving process might not align with the KCS practices. If we coach her on the structured problem solving process and the Solve Loop practices, the indicator should reflect the change. However, if we talk with Kim about fixing "her numbers," she can do that, but now the indicator becomes useless.

 

 

 

The moral of the story here is threefold:

  • We cannot depend on one measure or indicator to determine the health of the KCS system or the contribution of the players.

  • The indicators must be used along with an understanding of the nature of the environment. Assessing the creation of value requires that we have a holistic view of performance.

  • Trends in activities (leading indicators) can be very valuable, especially participation rates. But the value of the indicator will be lost if we put a goal on the activity or we focus on the number during conversations with the employees.

Team Performance - Management Effectiveness

We can use radar charts for team performance and as a way to assess the effectiveness of the leadership in creating an environment for KCS success. For the team radar chart, the same rules for balance apply, but the measures we use would be different. The measures for a team will depend on the size of the team and the size of the organization. We offer the following as an example where the team is the support organization and can influence the measures listed.

 

For the team's radar chart we might use the following measures:

  • Customer loyalty - unlike customer satisfaction, which typically measures the transaction, loyalty measures the customer experience over time and their emotional connection to the company (1 = the loyalty goal)

  • Employee loyalty - loyal employees are a prerequisite to loyal customers (1 = the employee loyalty goal)

  • Collaboration health index - the team's ability and willingness to collaborate; key indicators are trust and a sense of connectedness to the team

  • Support cost as a percent of revenue

  • Incidents closed

  • Avg TTR - Average time to relief for the team

  • Customer success on the web (index = % of customers using the web first × % success)

 

 

[Figure: Slide36]

It is important to reiterate that numbers never tell the whole story. As with many things in the KCS methodology, judgment is required. This is true for Support Analysts as well as managers.

 

While radar charts are good at showing a collection of data or measures at a point in time, they are not great at showing trends. Trends are especially important for the leading indicators (activities) like article creation and linking rates as well as participation rate. 

 

An organization can have the best measurement system in the world, but it is only effective if managers know how to interpret the measures and how to have effective conversations that influence employee behavior. Performance assessment and the creation of value are fundamentally about behavior and decision making, not about the numbers.

Focus Shift—Phase II to Phase III

During the adoption, we want to focus on indicators for individual development, adoption of the Solve Loop practices, and adherence to the content standard:

  • Learning
    • KCS competency levels across the organization (% of the organization in each of the levels: KCS Candidate, KCS Contributor, KCS Publisher)
    • Time to KCS proficiency (number of days to reach each competency level)
  • Knowledge contribution
    • KCS article creation rate (people are creating KCS articles as they solve problems)
    • KCS article modify rate (people are improving KCS articles as they use them)
    • KCS article reuse rate (people are using KCS articles they find in the knowledge base to solve problems)
    • Knowledge base participation (% of incidents handled using the knowledge base)
    • KCS article rework rate (KCS articles flagged as needing attention because they could not be understood or fixed by the person who found them)
    • KCS article cycle time (rate at which KCS articles move through their life cycle)
    • KCS article quality index (AQI; random sampling of articles)
  • Process and Operations
    • Incidents handled, individual (# of incidents handled/month, 1 = team average)
    • Average time to relief, individual (average minutes to provide relief/answer, 1 = team average)

 

As the organization matures and KCS becomes second nature for the Support Analysts, we shift our focus from individual measures to a balance of individual and team or collaboration measures:

  • Collaboration and teamwork

    • Reputation and peer feedback

    • Invitation rate (number of times invited to collaborate)

    • Opt-in rate (number of times the invitation is accepted)

    • Knowledge contribution—reuse by others (citations)

    • KCS article quality index for the team

    • Citations or feedback from customers

    • Customer success on the web (index = % of customers using the web first × % success)

  • Sample KCS lagging indicators:

    • Loyalty/satisfaction index, team (based on post-incident and periodic surveys)

    • Retention rate/renewal rate

 

For a complete list of all the KCS measures the Consortium has considered, please see the Appendix: Metrics Matrix.

Summary: Performance Assessment

Performance Assessment for KCS represents a departure from traditional management practices. It focuses on collaboration, not competition, and assesses the creation of value, not activity. Job descriptions and expectations must shift to include the capture and maintenance of knowledge in the workflow (the Solve Loop). The measures must reflect the concept of collective ownership of the knowledge base. 

 

Here are the key points to remember:

  • Use the license metaphor (KCS Candidate, KCS Contributor, KCS Publisher) to manage and encourage proficiency

  • Align individual and department goals to the higher-level company goals (strategic framework)

  • Distinguish indicators for activities from measures for outcomes

    • Look at trends for the activities and create goals for the outcomes

  • Create a comprehensive view of performance by integrating

    • Objective and subjective measures

    • Individual and team measures

    • Trends in activities and attainment of goals for outcomes

  • Enable timely feedback to the people doing the work

  • Conversations with Support Analysts must focus on behavior, process, and understanding, not on the numbers (otherwise the numbers become meaningless)
