IT Metrics Best Practices

Project Background

The client, one of the largest pharmaceutical companies in the world, had begun implementing global IT service level metrics in the previous year. Efforts were made to assemble a comprehensive set of performance indicators to monitor all IT infrastructure service areas. A metrics dashboard application was created in Notes to document the performance measures, report on performance trends, and provide related information on current services, service improvement objectives, and the status of improvement projects. The dashboard application also enabled the IT group to share information about challenges in implementing the initial set of metrics and about planned changes.

Problem Summary

The company wanted an outside best-practice evaluation of its IT metrics dashboard. TBI was engaged to provide this evaluation based on its expertise in performance measurement.

Objectives of the project were:

  • To assess the comprehensiveness of the current metrics portfolio and its alignment with IT business objectives
  • To obtain recommendations for technical improvement in metrics’ reliability, validity, and utility
  • To determine the extent to which current metrics were “benchmarkable” and recommend changes that would facilitate periodic comparison of IT performance levels to industry “best practices”

The client also wished to pilot test a cost and service level benchmarking analysis in one area of its IT infrastructure services.

TBI’s Approach

To gain an understanding of the client's IT strategy, critical success factors and key performance indicators, and its existing IT service level management structure, TBI reviewed the IT metrics dashboard and services information for each of the eight IT infrastructure services in scope: Desktop Support; E-Business Infrastructure; Messaging and Collaborative Tools; Remote Access; Application Infrastructure Directory Services; Data Centers; Network Support; and Virus Management.

Metrics in use were then mapped against a “measure type” framework, which considered the critical success factors for achieving IT goals in each IT product/service area. In this part of the metrics portfolio review, the focus was on the adequacy of the existing metrics for monitoring the critical success factors of each service in scope.
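
This mapping exercise can be thought of as a coverage check. The sketch below is purely illustrative: the service areas, critical success factors, and metric names are hypothetical examples, not the client's actual framework.

```python
# Hypothetical coverage check: map each service area's critical success
# factors to the dashboard metrics that monitor them, then flag gaps.
# Factor and metric names are illustrative, not the client's framework.
coverage = {
    "Desktop Support": {
        "timely incident resolution": ["mean time to resolve", "% resolved within SLA"],
        "cost efficiency": ["cost per seat"],
        "user satisfaction": [],  # gap: no dashboard metric covers this factor
    },
    "Messaging and Collaborative Tools": {
        "service availability": ["% uptime"],
        "capacity headroom": [],  # gap
    },
}

for service, factors in coverage.items():
    for factor, metrics in factors.items():
        if not metrics:
            print(f"GAP: {service} has no metric covering '{factor}'")
```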

TBI then undertook a technical review of each metric in use, examining the following:

  • Reliability – The repeatability of the measure; extent to which the method of data capture is systematic and free from error.
  • Validity – Extent to which the measure faithfully captures and reports on the service issue of concern; that is, whether it is a “true” measure of the service issue the client intended to measure.

Two “usability” issues were also included in this technical review:

  • Use of “Metrics” vs. “Measures” – While these terms are often used as synonyms, in this review a “measure” meant a data element representing a single characteristic of service (e.g., cost), whereas a “metric” meant a combination of data elements that allows comparison and interpretation over time and across different organizations (e.g., cost/seat); a brief illustrative sketch follows this list.
  • Benchmarkability – Extent to which the metric is designed in a manner consistent with how other organizations measure the same service issue, so that an industry performance comparison is possible.

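As a minimal sketch of the measure/metric distinction, consider the example cited above (cost vs. cost/seat); the figures below are hypothetical and are not taken from the client's dashboard.

```python
# Illustrative only: hypothetical figures, not the client's data.
# A "measure" is a single raw data element; a "metric" combines measures
# so the result can be compared over time and across organizations.
monthly_desktop_support_cost = 412_500.0  # measure: cost (USD per month)
seats_supported = 5_500                   # measure: organizational size

# Metric: cost per seat, comparable across periods and organizations.
cost_per_seat = monthly_desktop_support_cost / seats_supported
print(f"Cost per seat: ${cost_per_seat:,.2f}")  # -> Cost per seat: $75.00
```
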
The review also considered the meaningfulness of the metrics' content, manner of expression, and reporting. This included a review of the level of detail presented in the dashboard report, with a focus on the usefulness of each metric for 1) high-level performance monitoring and 2) performance analysis and problem determination.

Finally, TBI undertook a benchmarking analysis of the client’s Desktop Support services. The key cost and performance drivers that were used in drawing comparison cases from TBI’s Desktop Support database were: organizational size (# of seats); support workload (level of break/fix and deployment activity); environmental complexity (# of business locations and # of different software/hardware builds supported); and the specific services provided. Benchmarking results provided comparison values for each of these cost and performance drivers as well as for monthly cost of services and service levels.
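
The case study does not describe TBI's database schema or matching method, so the following is only a hedged sketch of how comparison cases might be screened on the drivers listed above; the field names and tolerances are assumptions for illustration.

```python
# Hypothetical sketch of screening benchmark comparison cases on the
# drivers named in the text; field names and tolerances are assumptions.
from dataclasses import dataclass

@dataclass
class DesktopSupportCase:
    org: str
    seats: int                      # organizational size (# of seats)
    tickets_per_seat_month: float   # support workload (break/fix + deployment)
    locations: int                  # environmental complexity: # of business locations
    builds_supported: int           # environmental complexity: # of builds supported
    services: frozenset             # specific services provided
    cost_per_seat_month: float      # monthly cost of services

def is_comparable(case: DesktopSupportCase, client: DesktopSupportCase) -> bool:
    """Keep cases whose size, workload, and complexity are close to the
    client's and whose service scope covers the client's services."""
    return (
        abs(case.seats - client.seats) <= 0.5 * client.seats
        and abs(case.tickets_per_seat_month - client.tickets_per_seat_month) <= 0.5
        and abs(case.builds_supported - client.builds_supported) <= 5
        and client.services <= case.services
    )
```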

Successful Business Solutions

The client received:

  • A gap analysis of the coverage of the metrics dashboard, identifying additional metrics needed to cover critical success factors for each IT service area, as well as existing metrics that could be eliminated because they were not particularly meaningful.
  • Recommendations for improving the framework under which the dashboard metrics were organized, and for reclassifying existing metrics to improve the clarity and meaningfulness of the dashboard.
  • An evaluation of the strengths and weaknesses of the metrics in the dashboard, with specific, detailed recommendations for improving measurement reliability, validity, and usability.
  • A comparison of current service cost and service levels to those of similar services provided by other organizations.