Without a good understanding of return on investment or value being added, KM initiatives will be difficult to sustain. Beyond ensuring accountability for KM activities, measurement also provides important information to the KM lead about what is working well, what isn’t, and what adjustments need to be made.
Measurement for an agency-wide KM effort seeks to answer four questions, one for each of the measurement categories described below: What does the KM effort cost? What is it producing? Who is using it? What difference is it making?
Agencies can approach measurement at different levels of detail. Since measurement can be a time-consuming activity, it is best to begin by identifying a few key measures and then add new ones as needed as the KM effort evolves. It is important to strike a balance between the level of effort needed to track KM results and the value of information being produced.
Costs: Costs of KM implementation can be estimated based on the time spent by the KM lead (and others) to plan, facilitate and support KM activities. IT development and content management cost elements may be included as well.
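As a minimal sketch of the cost-estimation approach described above, the following example sums labor costs (staff hours times a loaded hourly rate per role) and adds IT/content-management line items. The roles, rates, and hours are illustrative assumptions, not figures from the source.

```python
# Hypothetical sketch: estimating KM implementation costs from staff time.
# All roles, rates, and hours below are illustrative assumptions.

def km_cost_estimate(time_entries, hourly_rates, other_costs=0.0):
    """Labor cost (hours x loaded hourly rate per role) plus IT
    development / content management line items."""
    labor = sum(hours * hourly_rates[role] for role, hours in time_entries)
    return labor + other_costs

entries = [
    ("km_lead", 120),      # planning and facilitation
    ("facilitator", 40),   # community meeting support
    ("it_developer", 60),  # portal configuration
]
rates = {"km_lead": 85.0, "facilitator": 60.0, "it_developer": 95.0}

total = km_cost_estimate(entries, rates, other_costs=5000.0)
print(f"Estimated quarterly KM cost: ${total:,.0f}")  # $23,300
```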
Outputs: The KM lead can maintain a simple activities log to track products and services provided to support KM implementation. Products may include guidance documents or model policies; services may include briefings, trainings, or meeting facilitation.
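An activities log of this kind can be as simple as a list of dated records that is periodically tallied by type. The sketch below assumes hypothetical field names and categories ("product"/"service") for illustration.

```python
# Hypothetical sketch of a KM activities log; field names and the
# product/service categories are assumptions for illustration.
from datetime import date

def log_activity(log, activity_type, description, when=None):
    """Append one output (product or service) to the activities log."""
    log.append({"date": (when or date.today()).isoformat(),
                "type": activity_type,
                "description": description})

def summarize(log):
    """Count logged outputs by type -- a basic output metric."""
    counts = {}
    for row in log:
        counts[row["type"]] = counts.get(row["type"], 0) + 1
    return counts

log = []
log_activity(log, "service", "CoP kickoff briefing", date(2024, 3, 1))
log_activity(log, "product", "Model lessons-learned policy", date(2024, 3, 8))
log_activity(log, "service", "AAR facilitation", date(2024, 3, 15))
print(summarize(log))  # {'service': 2, 'product': 1}
```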
Exposure/Use: Employee participation in KM activities and use of knowledge resources can be tracked via manual methods, such as meeting sign-in sheets, or automated methods built into systems (e.g., tracking of web page hits or document downloads).
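Automated tracking of page hits or document downloads can often be derived from existing web server access logs. The sketch below tallies requests for KM resources from log lines; the log format (Common Log Format) and the `/km/` path prefix are assumptions for illustration.

```python
# Hypothetical sketch: counting KM page hits and document downloads from
# web server access logs. Common Log Format and the /km/ path prefix
# are illustrative assumptions.
from collections import Counter

def count_resource_use(log_lines, watch_prefixes=("/km/",)):
    """Tally requests for KM resources by path."""
    hits = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()  # e.g. ['GET', '/km/lessons.pdf', 'HTTP/1.1']
        if len(request) >= 2 and request[1].startswith(watch_prefixes):
            hits[request[1]] += 1
    return hits

sample = [
    '10.0.0.1 - - [01/Mar/2024] "GET /km/lessons.pdf HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Mar/2024] "GET /km/cop/index.html HTTP/1.1" 200 204',
    '10.0.0.3 - - [02/Mar/2024] "GET /km/lessons.pdf HTTP/1.1" 200 512',
]
print(count_resource_use(sample).most_common())
# [('/km/lessons.pdf', 2), ('/km/cop/index.html', 1)]
```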
Outcomes: Tracking outcomes is challenging because of the intangible nature of the results being sought – e.g., employee knowledge, organizational effectiveness, consistency in the application of policies and procedures, and resilience. Typical approaches to measuring outcomes involve surveys or interviews with employees, or studies of efficiency changes in specific business processes. While time or cost savings should be quantified wherever possible, it is helpful to supplement these figures with qualitative information that helps people understand why and how a KM technique was beneficial.
In some cases, employee stories about how they benefited from a KM activity can be translated into estimates of cost savings for the agency. For example, the VDOT KM office facilitated a business process streamlining effort that resulted in an estimated savings of $300,000 annually. Table 4 provides examples of output, exposure/use and outcome metrics for the four categories of KM techniques discussed above.
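Translating an employee's story into a savings estimate is usually back-of-the-envelope arithmetic: time saved per event, times event frequency, times a loaded labor rate. The figures below are illustrative assumptions, not the actual inputs behind the VDOT $300,000 example.

```python
# Hypothetical sketch: annualizing a reported time saving. All figures
# are illustrative assumptions, not the actual VDOT numbers.

def annual_savings(hours_saved_per_event, events_per_year, loaded_hourly_rate):
    """Annualized savings = time saved per event x frequency x labor rate."""
    return hours_saved_per_event * events_per_year * loaded_hourly_rate

# e.g., a streamlined review step saving 5 staff-hours on each of
# 800 reviews per year, at a $75/hour loaded rate:
print(f"${annual_savings(5, 800, 75):,.0f} per year")  # $300,000 per year
```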
The US Navy has developed an excellent "Metrics Guide to Knowledge Management Initiatives" (see reference 27), which defines a step-by-step process for measuring and monitoring KM results. The Navy guide is a useful reference for designing an approach to KM measurement, and includes an appendix with sample metrics.
Table 4. Sample KM Metrics

| Agency-Wide KM Elements | KM Outputs | Exposure/Use | Impacts/Outcomes |
| --- | --- | --- | --- |
| KM Leadership & Direction | | | |
| Strategic Planning and Policy Development | | | |
| Social Learning & Communities | | | |
| Communities of Practice (CoPs) | | | |
| Knowledge Codification & Dissemination | | | |
| Lessons Learned Repository; Organizational Narratives/Storytelling | | | |
| Succession & Talent Management | | | |
| Talent Tracking | | | |
Before outcome measures can be defined, it is helpful to define expectations for each KM activity or initiative. Key types of outcomes for KM implementation include: adaptability/agility, creativity, institutional memory building, and internal and external organizational effectiveness. Once expectations are set, specific surveys, interviews, or studies can be designed to assess results. Selected KM activities for which expected results can be defined include:
- Expertise Locator System
- After-Action Reviews (AARs)
- Lessons Learned (LL) Repository
- Collaboration Platforms
- Organizational Narratives/Continuity Books/Knowledge Books
- Communities of Practice
Table 5 and Figure 6 illustrate how the Virginia DOT has characterized outcomes from its KM initiatives.
Table 5. Sample Outcomes from the Virginia DOT KM Initiatives

| Initiative | Outcome |
| --- | --- |
| Lessons Learned Database | Increased knowledge base on successful construction practices; strong level of support and utilization from construction managers and inspectors; national recognition for the agency. |
| After Action Review: Winter Maintenance | Led to statewide implementation of an anti-icing program based on techniques developed by frontline managers; findings will be incorporated into staff training programs. |
| Standard Operating Procedure Development: Emergency Response Task Force | Definition and common understanding of VDOT's response to different incident types: crashes, terrorist attacks, HAZMAT spills, and weather-related events. |
| Facilitation: Interagency Coordination on Incident Management | Improved working relationships, leading to shortened incident clearance times. |
| Organizational Network Analysis: Successful Construction Project | Evaluation of how the team's communication network contributed to project delivery efficiency and effectiveness; lessons can be applied to future construction projects. |
| Process Mapping: Environmental Review Process | Annual cost savings of $300,000 from process streamlining; improved understanding of the process and its intersections with other processes; a model for other groups. |
| CoP Support: Construction | Statewide vertical and horizontal integration of construction expertise, used to inform the state strategic plan. |
A balanced approach to monitoring KM value and effectiveness should include tracking of costs, outputs, exposure or use, and outcomes.
Documenting return on investment is important, but it is also valuable to capture success stories and qualitative descriptions of outcomes.
To mitigate the risk of measuring the wrong things, KM outcome metrics should map back to the agency's strategy, goals, and objectives.
“Of the initiatives we’ve undertaken at Chevron during the 1990s, few have been as important or as rewarding as our efforts to build a learning organization by sharing knowledge. In fact, I believe this priority was one of the keys to reducing our operating costs by more than $2 billion per year—from about $9.4 billion to $7.4 billion—over the last seven years”
– Derr, K.T. (1999); see reference 28
Development of an evaluation approach should be an integral part of the planning for each new initiative.