Using Metrics to Quantify, Guide, and Monitor Analytic Projects

All too often, organizations embark on the formation of an “Analytics Center of Excellence” (COE) without understanding how analytics will change their business processes or how they will define and measure success. In many cases, the deployment of analytic software is confused with the efficacy of an analytic project.

The two are not the same. Defining, collecting, and reporting meaningful analytic performance measures is critical to implementing analytics successfully in any organization. Having a COE is not a check mark indicating that the software was deployed to the production environment on time. Unfortunately, to some organizations this is success, and then the software sits, and sits, and sits, with no measurable impact on business operations. Enterprise deployment of analytic software is a necessary but not a sufficient condition for applying analytics to solve a specific business problem.

Organizations typically fail to ask far more basic questions about analytic projects:

  • Which business units will use the analytic software tools?
  • What is the analytic maturity of the user community to which we will deploy the software?
  • How do we prioritize the set of analytic projects? Which model will be estimated first, and by which group? The software will not do this by itself.
  • What analytic models will be developed (e.g., logistic regressions for audit selection, NLP for case notes to optimize workstreams, cluster models for collection portfolio optimization)?
  • What data will be necessary for these models and do the analytic tools have the proper connections to these data? Are there security or privacy concerns?
  • How will we measure each model’s ROI?
  • Do we already license software that can do this?

Metrics provide visibility into status and progress, quantify the effectiveness of campaigns, and guide analytic direction. Using metrics to define and track program progress ensures accountability and demonstrates value. Metrics also help prioritize analytic projects based on business value, urgency of need, ease of implementation, and likelihood of success. Meaningful metrics should encompass at least three dimensions:

  1. Human Resources. Monitor and evaluate the human resources expended on analytic project activities. Business stakeholders should draw on historical data, comparing against previous years and similar projects, to estimate the human resources a project will require.
  2. Business Value. The benefits of an analytic project to the organization should be clear and easy to articulate. Examples include dollars saved, revenue generated, or call volumes reduced.
  3. Statistical Efficacy. Without an understanding of why an analytic model is producing unexpected results, it is difficult to identify areas for improvement or to know when to sunset a specific model or set of models. Examples include monitoring a false-positive rate or performing outlier analysis, as in the sketch following this list.
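
To make the third dimension concrete, here is a minimal sketch of one way to monitor a model’s false-positive rate across reporting periods. The data, column names, and tolerance threshold are all hypothetical placeholders, not drawn from any specific deployment.

```python
import pandas as pd

# Hypothetical scored-case history: each row is a case the model flagged
# for review, with the eventual audit outcome (column names are assumed).
cases = pd.DataFrame({
    "month":   ["2023-01", "2023-01", "2023-01", "2023-02", "2023-02", "2023-02"],
    "outcome": ["no_issue", "issue", "no_issue", "no_issue", "no_issue", "issue"],
})

# False-positive rate per reporting period: flagged cases that turned
# out to be clean, divided by all flagged cases in that period.
fpr = (cases["outcome"] == "no_issue").groupby(cases["month"]).mean()

# A simple review trigger: alert when the rate exceeds a tolerance
# agreed on with the business (0.60 here is purely illustrative).
TOLERANCE = 0.60
for month, rate in fpr.items():
    status = "REVIEW MODEL" if rate > TOLERANCE else "ok"
    print(f"{month}: false-positive rate = {rate:.2f} ({status})")
```

The tolerance itself should be set with business stakeholders, which ties the statistical dimension back to the business-value dimension above.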

The procedures for obtaining or compiling metrics on a periodic basis should not be complex or time-consuming. Define the structure and contents of the metrics reported, make the process repeatable, and establish roles and responsibilities for generating and acting on the measures. Focusing too much effort on a dashboard's look and feel misses the point if the dashboard metrics themselves are not used to take specific actions. An effective dashboard provides metrics that broaden visibility, improve risk identification, and align with organizational efforts to support quicker data-driven decisions that improve business performance. But that is another topic altogether...