The CDAO Playbook for Value Realization: A Framework for Measuring and Benchmarking Data Initiatives

Part I: The Strategic Imperative: From Cost Center to Value Engine

Chapter 1: The Case for Measurement: Beyond Justifying Existence

In the contemporary enterprise, the role of the Chief Data Officer (CDO) or Chief Data & Analytics Officer (CDAO) stands at a critical juncture. The mandate has evolved far beyond the mere stewardship of data assets; it now demands the demonstrable creation of business value. Measurement is the fundamental language of business, and for the data function to ascend from a perceived cost center to a recognized strategic value engine, a rigorous and transparent measurement program is not merely advantageous—it is non-negotiable.1 A CDO who cannot articulate the value of data initiatives in quantifiable business terms risks being confined to an operational or compliance-focused role, perpetually struggling for resources and strategic influence. This playbook provides a comprehensive framework for establishing such a program, ensuring that data investments are not only justified but are also systematically optimized to drive tangible outcomes.

The imperative for a formal measurement program is rooted in three core strategic necessities: aligning with business goals, securing stakeholder buy-in, and driving a culture of accountability.

First and foremost, measurement ensures profound and continuous alignment with overarching business strategy.2 Data initiatives, particularly those involving advanced analytics and Artificial Intelligence (AI), must be directly tethered to specific, high-priority business objectives, such as increasing revenue, reducing operational costs, enhancing customer satisfaction, or mitigating risk.4 Without this explicit linkage, even the most technically sophisticated project becomes an exercise in “analytics for the sake of doing analytics,” disconnected from the enterprise’s strategic priorities and ultimately failing to deliver meaningful impact.7 A measurement framework forces this alignment from the outset, demanding that every proposed initiative answers the fundamental question: “How will this move the needle on a key business outcome?”

Second, a clear and defensible measurement framework is the primary tool for winning over and maintaining the confidence of key stakeholders, including the C-suite and board of directors.7 By communicating the value proposition of data in the language of business—Return on Investment (ROI), market share growth, customer lifetime value—the CDO can transform the conversation from one of cost to one of strategic investment.10 This builds essential trust and provides a robust, evidence-based foundation for securing the necessary budget and resources to scale successful programs.12 It provides the clarity needed to make data-driven decisions about which projects to continue, expand, or terminate, ensuring that resources are allocated to initiatives with the highest potential for value creation.5

Finally, and perhaps most transformatively, a robust measurement program serves as a powerful catalyst for cultural change. The journey to becoming a truly data-driven organization is fraught with challenges, including resistance to change and a reliance on intuition-based decision-making.13 A transparent measurement program directly addresses this by creating a feedback loop that validates and reinforces data-informed behaviors. When teams across the organization can see a direct, causal link between their data-driven actions and a positive, measured outcome—for example, a marketing team observing a quantifiable reduction in customer churn after implementing a predictive model—it makes the value of data tangible and visible.14 This validation is the most effective mechanism for driving the adoption of new tools and processes. Consequently, the measurement program is not merely a reporting function; it is a core pillar of the CDO’s cultural transformation agenda, systematically embedding accountability and an evidence-based mindset into the fabric of the organization.15

 

Chapter 2: Defining “Value” in the Data Economy: A Multifaceted View

 

To effectively measure the contribution of data and analytics, a CDO must champion a sophisticated and multifaceted definition of “value” that extends beyond traditional, easily quantifiable financial returns. While direct financial impact is the ultimate goal, a narrow focus on immediate ROI can obscure the broader, more strategic contributions of data initiatives, leading to the underestimation of their true worth and the risk of premature termination of long-term, high-potential projects.4 A comprehensive value framework acknowledges a spectrum of returns, encompassing direct, indirect, and strategic impacts, thereby providing the C-suite with a holistic understanding of how data investments create value across the enterprise.

This spectrum of value can be categorized into three distinct but interconnected tiers:

  1. Direct Impact: This is the most tangible and immediately measurable category of value. It includes direct contributions to the organization’s top and bottom lines. Examples are plentiful and often serve as the cornerstone of any business case. They include cost savings achieved through the automation of manual processes, such as using AI to handle customer service inquiries or optimize supply chain logistics, and direct revenue growth generated from new data-driven products, services, or enhanced marketing campaigns.4 These metrics are critical for demonstrating near-term ROI and building initial credibility for the data program.
  2. Indirect Impact: This tier encompasses qualitative or semi-quantitative benefits that are vital to operational excellence but are often more challenging to assign a precise dollar value. These benefits include enhanced worker productivity, where data tools and insights enable employees to complete tasks faster or make better decisions; improvements in customer satisfaction and engagement, driven by personalization and faster service; and accelerated decision-making cycles, allowing the organization to respond more quickly to market shifts.4 While not always directly convertible to a financial figure, these indirect impacts are powerful leading indicators of future financial performance.
  3. Strategic and Long-Term Impact: This represents the most forward-looking and potentially most valuable category of returns. It relates to the ability of data and analytics to foster innovation, create sustainable competitive advantage, and secure the organization’s future growth.4 Examples include the identification of entirely new market opportunities or customer segments through advanced analytics, the development of a foundational data ecosystem that enables future AI and machine learning capabilities, and the filing of new patents based on AI-driven innovations.7 These impacts are often long-term and require a strategic investment mindset, as their value may not be fully realized for several quarters or even years.

A highly effective framework for articulating this multifaceted view of value to executive leadership is Gartner’s AI Value Pyramid.23 This model provides a C-suite-friendly lexicon for discussing a balanced portfolio of returns, structured around three key pillars:

  • Return on Investment (ROI): This is the traditional financial pillar, focusing on the direct, bottom-line impact. It answers the question, “How is this initiative contributing to profitability?” Metrics here are familiar to any CFO and include project-specific ROI, revenue uplift, and cost reduction.23
  • Return on Employee (ROE): This pillar centers on the impact of data initiatives on the workforce. It answers the question, “How is this making our employees more effective and engaged?” Key metrics include improvements in employee productivity (e.g., tasks completed per hour), time saved by automating manual work, and increases in employee satisfaction scores related to data tools and access.23
  • Return on Future (ROF): This is the strategic pillar, focused on positioning the organization for long-term success. It answers the question, “How is this initiative preparing us for the future?” Metrics in this category are designed to track innovation and competitive positioning, such as the velocity of the innovation pipeline (e.g., speed of proof-of-concept development) and the number of new market opportunities identified through data analysis.22

By adopting a comprehensive value definition that incorporates direct, indirect, and strategic impacts, and by using a clear communication framework like the AI Value Pyramid, the CDO can present a compelling and holistic narrative. This approach ensures that the full spectrum of contributions from data and analytics is recognized, fostering informed, strategic investment decisions that drive both immediate performance and long-term, sustainable growth.

 

Part II: A Compendium of Value Measurement Frameworks

 

To move from the strategic “why” of measurement to the practical “how,” the CDO must be equipped with a portfolio of established, defensible frameworks. No single framework is universally perfect; the optimal choice depends on the organization’s maturity, culture, and the specific goals of the measurement program. This section provides a compendium of leading methodologies, from holistic business management systems adapted for data to specialized models designed explicitly for valuing data and analytics. By understanding these frameworks, the CDO can select the most appropriate approach or, more likely, construct a hybrid model that leverages the strengths of each.

 

Chapter 3: The Balanced Scorecard (BSC) for Data & Analytics

 

The Balanced Scorecard (BSC), originally developed by Drs. Robert Kaplan and David Norton, is a strategic planning and management system designed to give leaders a comprehensive view of organizational performance.24 Its core premise is that relying solely on financial metrics provides a rearview-mirror perspective of past performance, which is insufficient for navigating the future. The BSC rectifies this by balancing traditional lagging financial indicators with leading indicators of future performance across three additional perspectives: customer, internal processes, and learning and growth.25 This holistic approach makes the BSC an exceptionally powerful framework for a CDO to translate the data strategy into a concrete set of measurable objectives and demonstrate its balanced contribution to the entire enterprise.27

The power of the BSC lies in its adaptability. The four perspectives can be tailored specifically to the context of a data and analytics function, creating a comprehensive dashboard that communicates value to all stakeholders.

  • Financial Perspective: This perspective directly answers the C-suite’s fundamental question: “How are our data investments contributing to the bottom line?”.28 It connects data initiatives to tangible financial outcomes.
     • Objectives: Increase revenue through data-driven products, reduce operational costs through automation, improve profitability of marketing campaigns.
     • KPIs: Return on Investment (ROI) of data projects, revenue generated from data monetization, cost savings attributed to process automation, increase in customer lifetime value (CLV).29
  • Customer/Stakeholder Perspective: This perspective focuses on the value delivered to the “customers” of the data function, who can be both external (the company’s clients) and internal (business units consuming data and analytics).27 It answers the question: “How are our data initiatives perceived by those who use them?”
     • Objectives: Improve business stakeholder satisfaction with data services, enhance external customer experience through personalization, increase trust in data across the organization.
     • KPIs: Stakeholder Satisfaction Scores (e.g., via surveys), Net Promoter Score (NPS) improvements on products influenced by data insights, reduction in customer churn rate, user adoption rates for new analytics tools.26
  • Internal Process Perspective: This perspective examines the operational excellence of the data function itself. It answers the question: “To satisfy our stakeholders and achieve our financial goals, which data processes must we excel at?”.28 These are often leading indicators of stakeholder satisfaction and financial success.
     • Objectives: Accelerate the delivery of insights, improve the quality and reliability of data assets, enhance the performance of data platforms.
     • KPIs: Time-to-Insight (from business question to actionable answer), Data Quality Score (a composite of accuracy, completeness, timeliness), dashboard load times, data pipeline latency, reduction in data-related errors.29
  • Learning & Growth (Organizational Capacity) Perspective: This perspective focuses on the foundational capabilities required to sustain innovation and long-term improvement. It answers the question: “How must our organization learn and improve to achieve our vision?”.25 These are the most forward-looking indicators.
     • Objectives: Foster a data-literate culture, accelerate innovation through experimentation, improve employee skills and access to tools.
     • KPIs: Data Literacy Assessment Scores, number of employees trained on data tools, speed of proof-of-concept (PoC) development, number of active AI experiments, employee satisfaction with data tools and support.22

A critical component of the BSC methodology is the Strategy Map, a visual diagram that illustrates the cause-and-effect relationships between the objectives across the four perspectives.24 For example, a strategy map can show how investing in Data Literacy Training (Learning & Growth) leads to Improved Data Quality (Internal Process), which in turn enables Higher Stakeholder Satisfaction (Customer/Stakeholder), ultimately resulting in Increased ROI on Data Projects (Financial). This visual narrative is incredibly effective for communicating the data strategy to the entire organization.24

Beyond its function as a reporting tool, the BSC serves as a proactive governance and strategic alignment mechanism. The process of developing the BSC—collaborating with business and IT leaders to define shared objectives and KPIs—is often as valuable as the final artifact itself. It forces critical conversations that bridge the common gap between business needs and data capabilities, creating a shared language and understanding of what success looks like.7 The resulting Balanced Scorecard acts as a strategic “contract” between the data organization and the business, making governance decisions about project prioritization and resource allocation more transparent, objective, and aligned with agreed-upon goals.26 It transforms the abstract data strategy into a tangible, manageable plan that can be integrated into the organization’s regular performance review cadence.26

 

Chapter 4: The Data-as-a-Product (DaaP) Model: Measuring Value at the Source

 

A paradigm shift is occurring in how leading organizations manage their data assets: moving away from a traditional, project-based mindset toward treating data as a product. This “Data-as-a-Product” (DaaP) approach, championed by consultancies like McKinsey, fundamentally reframes how data initiatives are conceived, developed, funded, and measured.7 By managing data with the same rigor and customer-centric focus as a consumer product, organizations can escape the common pitfalls of bespoke, siloed data projects and unlock a more scalable, reusable, and value-driven data ecosystem. This model provides an inherent and powerful framework for measurement, as the success of a “product” is intrinsically tied to its adoption, user satisfaction, and the value it creates.32

The DaaP model stands in stark contrast to more traditional methods. In a “grassroots” approach, individual teams build their own solutions, leading to massive duplication of effort and a tangled, costly architecture. In a “big-bang” strategy, organizations attempt to build a single, monolithic platform, which is often slow, expensive, and fails to meet diverse user needs.32 The DaaP model offers a more agile and effective alternative built on four core principles:

  1. Dedicated Management and Funding: Each data product (e.g., a “Customer 360 View,” a “Product Catalog,” or a “Digital Twin”) has a dedicated product manager and a cross-functional team of engineers, architects, and modelers. This team is funded to not only build but also to continuously improve and maintain the product, much like a software development team.32
  2. Standards and Best Practices: A central body, such as a data center of excellence, establishes organization-wide standards for how data products are built. This includes defining protocols for documenting data provenance, auditing usage, measuring quality, and ensuring technological consistency.32
  3. Rigorous Quality Assurance: Data product teams are directly responsible for the quality of their product. They manage data definitions, availability, and access controls, working closely with data stewards in source systems to ensure the integrity and trustworthiness of the data they provide.32
  4. Performance Tracking: Crucially, each data product team is accountable for measuring the value of their work. This moves measurement from a centralized, top-down function to an embedded, operational responsibility.

The metrics used to evaluate a data product are directly analogous to those used for any digital product, focusing on user-centric outcomes rather than purely technical outputs. This provides a clear and intuitive way to track value. The key metric categories include:

  • Adoption and Usage Metrics: These KPIs measure the reach and relevance of the data product. They answer the question, “Are people actually using this?”
     • KPIs: Number of monthly active users, number of distinct business use cases powered by the product, and, critically, the number of times the product is reused across different parts of the business. High reuse is a powerful indicator of value and efficiency.32
  • User Satisfaction Metrics: These KPIs gauge the quality of the user experience and the trust users have in the product. They answer, “Do users find this product valuable and easy to use?”
     • KPIs: User satisfaction scores gathered from regular surveys, Net Promoter Score (NPS) for the specific data product, and qualitative feedback from user interviews.32
  • Business Impact Metrics: This is the ultimate measure of value, connecting the data product directly to business outcomes. It answers, “What tangible business results has this product enabled?”
     • KPIs: The Return on Investment (ROI) of specific business initiatives that were enabled by the data product. For instance, a “Customer 360” data product might enable a targeted marketing campaign. The success of that campaign (e.g., a 5% reduction in churn, translating to an $8M annual revenue uplift) can be directly attributed back to the data product that made it possible, as shown in the sketch below.14
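To make this attribution concrete, the sketch below estimates the ROI of a use case enabled by a data product. The figures and the simple cost-allocation approach are illustrative assumptions, not prescribed values:

```python
# Minimal sketch: attributing the financial impact of a campaign back to
# the data product that enabled it. All figures are hypothetical.

def enabled_use_case_roi(revenue_uplift: float,
                         campaign_cost: float,
                         data_product_cost_share: float) -> float:
    """ROI (as a %) of a use case enabled by a data product.

    Treats the campaign cost plus an allocated share of the data
    product's cost as the total investment.
    """
    total_cost = campaign_cost + data_product_cost_share
    return (revenue_uplift - total_cost) / total_cost * 100


# Example: a Customer 360 product enables a retention campaign yielding
# an $8M annual uplift (per the churn example above), against a $1.5M
# campaign cost and a $2M allocated share of the data product's cost.
print(f"{enabled_use_case_roi(8_000_000, 1_500_000, 2_000_000):.0f}% ROI")
# -> ($8.0M - $3.5M) / $3.5M * 100 ≈ 129% ROI
```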

Adopting a DaaP model yields significant, quantifiable benefits. Organizations that successfully make this shift have been shown to deliver new business use cases up to 90 percent faster, as teams can leverage existing, high-quality data products instead of starting from scratch. Furthermore, this approach can reduce the total cost of ownership for data—including technology, development, and maintenance—by as much as 30 percent, while simultaneously reducing risk and the burden of governance through standardized, reusable components.32 By embedding measurement at the product level, the DaaP model creates a direct and undeniable link between data management activities and business value creation.

 

Chapter 5: The Data Value Chain Analysis: Tracing Value from Inception to Impact

 

The Value Chain Analysis, a concept popularized by Michael Porter, provides a powerful strategic framework for deconstructing a business into its core value-creating activities.33 This logic can be adapted to create a Data Value Chain, a model that maps the sequential stages through which raw data is transformed into actionable, high-value business outcomes.35 For a CDO, the Data Value Chain is an invaluable tool. It provides a systematic way to visualize the entire data lifecycle, identify where value is added (or lost) at each step, and apply specific metrics to diagnose bottlenecks and optimize the end-to-end process of generating business impact from data.33

Finding high-value uses for data and creating a process to transform it into actionable information is the essence of the Data Value Chain.36 While specific models may vary, a comprehensive Data Value Chain generally consists of four major stages, which can be broken down into more granular steps:36

  1. Stage 1: Collection & Creation
     • Description: This initial stage involves identifying the business need for data and the subsequent activities of acquiring or generating it. It is the foundation upon which all subsequent value is built.
     • Key Activities: Identifying business questions or problems to solve, discovering and inventorying relevant internal and external data sources (e.g., CRM systems, IoT devices, social media), and capturing the raw data.36
     • Value Proposition: The value added here is the potential for insight. Raw, unorganized data has low intrinsic value, but its collection represents the first step toward unlocking future value.
  2. Stage 2: Processing & Publication
     • Description: Raw data is rarely usable in its initial state. This stage involves the crucial technical processes of refining, organizing, and preparing data to make it reliable, consistent, and accessible for analysis.
     • Key Activities: Data cleaning (removing errors, handling missing values), data standardization and normalization (ensuring consistent formats), data integration and deduplication (linking datasets from different silos), and storing the prepared data in an accessible repository like a data warehouse or data lake.36
     • Value Proposition: Value is added by increasing the quality, trustworthiness, and usability of the data. This stage transforms chaotic inputs into a governed, reliable asset.
  3. Stage 3: Analysis & Uptake
     • Description: This is the stage where prepared data is transformed into information and insight. It involves applying analytical techniques and ensuring that the resulting insights reach the intended business users.
     • Key Activities: Data mining, machine learning, segmentation, predictive analytics, and root cause analysis to detect patterns and forecast outcomes. It also includes the dissemination of these findings through reports, dashboards, and other visualizations, and connecting users with the insights to influence their thinking.36
     • Value Proposition: Value is created by converting data into knowledge. This stage answers the “So what?” question, providing the context and understanding needed for decision-making.
  4. Stage 4: Action & Impact
     • Description: This final and most critical stage is where insights are translated into concrete business actions and measurable outcomes. It represents the ultimate realization of data’s value.
     • Key Activities: Using insights to make a specific business decision (e.g., adjusting a marketing campaign), changing or optimizing a business process (e.g., refining a workflow), developing a new product feature, and ultimately, measuring the change in a core business metric (e.g., revenue, cost, churn).14
     • Value Proposition: This is where the potential value of data is converted into actual, realized business value. The impact here is the tangible improvement in organizational performance.

By mapping key performance indicators to each stage of this value chain, a CDO can create a powerful diagnostic and measurement system. For example:

  • Collection Metrics: Data Acquisition Cost, Data Source Coverage.
  • Processing Metrics: Data Quality Score (completeness, accuracy, timeliness), Data Pipeline Latency, Cost per Data Job.31
  • Analysis Metrics: Time-to-Insight, Model Accuracy, Dashboard Adoption Rate.31
  • Impact Metrics: ROI of Data-Driven Decisions, Revenue Growth, Cost Savings, Customer Satisfaction Improvement.19

This approach also provides a structured way to calculate a holistic ROI for data analytics. The formula proposed by Domo, $\text{Data ROI} = \frac{\text{Data product value} - \text{Data downtime}}{\text{Data investment}}$, is a direct application of value chain thinking.20 Here, “Data product value” represents the final, realized benefit at the end of the chain (Stage 4). “Data downtime” (e.g., broken dashboards, inaccurate data) represents value lost primarily in the Processing and Analysis stages. “Data investment” represents the total costs incurred across the entire chain, from collection to analysis.20 By analyzing the costs and value-add at each step, the CDO can identify inefficiencies (e.g., high processing costs for low-value data) and optimize the entire system for maximum impact.
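As an illustration, the following sketch applies the Domo formula above to hypothetical figures; the dollar amounts and parameter names are assumptions for demonstration only:

```python
# Minimal sketch of the Domo-style Data ROI formula cited above.
# All dollar figures are hypothetical assumptions for illustration.

def data_roi(data_product_value: float,
             data_downtime_cost: float,
             data_investment: float) -> float:
    """Data ROI = (data product value - data downtime) / data investment."""
    return (data_product_value - data_downtime_cost) / data_investment


# Example: $5M of realized value (Stage 4), $0.4M of value lost to broken
# dashboards and inaccurate data (Stages 2-3), and $2M of total spend
# across the chain from collection through analysis.
print(f"Data ROI: {data_roi(5_000_000, 400_000, 2_000_000):.2f}x")
# -> ($5.0M - $0.4M) / $2.0M = 2.30x
```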

 

Chapter 6: Insights from Premier Consulting Frameworks: A Comparative Review

 

Leading management consulting firms have developed proprietary frameworks to help their clients navigate the complexities of data and analytics value realization. These frameworks, born from extensive cross-industry experience, offer structured approaches that a CDO can adapt and integrate into their own measurement strategy. A review of the models from McKinsey & Company, Boston Consulting Group (BCG), and Gartner reveals distinct but complementary philosophies on how to measure and maximize the impact of data initiatives.23

McKinsey & Company: A Focus on Bottom-Line Impact and Productization

McKinsey’s approach is heavily oriented toward demonstrating a direct, causal link between data and analytics activities and bottom-line financial performance.23 Their framework consistently forces the question, “How much tangible financial value is this initiative adding?” This is exemplified by their focus on metrics like the percentage of EBIT (Earnings Before Interest and Taxes) attributable to AI, a KPI that directly connects analytics to core profitability.23 A key insight from their research is that many companies capture only a fraction of the potential value from their data, with sectors like manufacturing capturing as little as 20-30%.8

To bridge this gap, McKinsey advocates for two primary strategies. First, they emphasize the importance of analyzing the entire business value chain to pinpoint the highest-value use cases where data and analytics can have the most significant impact.8 Second, as detailed previously, they champion the “data-as-a-product” (DaaP) model. This approach treats data assets as products with dedicated managers, clear user bases, and performance metrics tied to adoption, satisfaction, and the ROI of the use cases they enable.32 This product-centric view shifts the focus from building technical capabilities to delivering reusable, value-generating assets.

Boston Consulting Group (BCG): An Emphasis on Maturity, Scale, and Quick Wins

BCG’s framework, often referred to as “AI@Scale,” takes a holistic, maturity-based view of an organization’s capabilities. It assesses performance across four key pillars: Strategy & Vision, Talent & Culture, Technology & Data, and Use Case Scaling.23 The effectiveness of a data program is measured by a maturity index that scores the organization’s proficiency in each of these dimensions. The goal is to move beyond isolated projects to a state where the organization can deliver large-scale business value from AI continuously.23

A central tenet of the BCG approach is the prioritization of use cases that deliver “significant business value” and can generate quick wins.23 Their experience shows that successful initiatives often generate a positive ROI in less than six months, creating crucial momentum for broader transformation.41 However, their recent research also provides a dose of reality: a survey of finance leaders found that the median ROI for AI and GenAI initiatives is a modest 10%, far below the typical target of 20%. This highlights a significant execution gap, which BCG attributes to a failure to focus relentlessly on value, integrate AI into broader transformation, collaborate effectively, and execute in targeted, scalable steps.42

Gartner: A Balanced View of Value with the AI Value Pyramid

Gartner’s framework, the “AI Value Pyramid,” offers a powerful and balanced model for communicating the multifaceted value of data and analytics. It is designed to move the conversation beyond a singular focus on financial returns by incorporating employee-centric and future-oriented benefits.23 The pyramid consists of three pillars:

  • Return on Investment (ROI): The traditional measure of financial return, encompassing revenue uplift and cost savings. This forms the base of the pyramid.
  • Return on Employee (ROE): This measures the impact on employee productivity, efficiency, and satisfaction. KPIs include tasks completed per employee per hour, hours of manual work eliminated, and employee satisfaction survey results related to data tools. Studies cited by Gartner show that AI tools can improve user task throughput by an average of 66%.23
  • Return on Future (ROF): This measures the strategic, long-term value created by data initiatives. It captures the organization’s enhanced ability to innovate and compete. KPIs include innovation pipeline velocity, the number of new market opportunities identified, and the speed of proof-of-concept development.22

This three-pronged approach allows a CDO to present a more complete and strategic value story to the C-suite, acknowledging that not all value can be immediately captured in a traditional ROI calculation.

ISO/IEC 42001: The Emergence of Standardized Measurement for AI Management

While not a consultancy framework, the emerging ISO/IEC 42001 standard for AI Management Systems is a critical development for CDOs. This standard explicitly requires that organizations monitor, measure, analyze, and evaluate the performance of their AI systems and the management system itself.23 This codifies the need for a formal measurement program, linking it directly to governance, risk management, and compliance. It signals a future where measuring the performance and impact of AI will not just be a best practice but a requirement for certification and regulatory adherence.

These frameworks, while different in their primary focus, share a common thread: the need to move beyond purely technical metrics and connect data initiatives to tangible, measurable business outcomes. The most effective CDOs will likely create a hybrid approach, using the BSC or the AI Value Pyramid as a high-level strategic communication tool, while adopting the DaaP model for operational management and the BCG focus on quick wins for project prioritization.

 

Table: Comparative Analysis of Value Measurement Frameworks

 

The following table provides a comparative overview of the primary measurement frameworks discussed, designed to help a CDO select or combine approaches based on their organization’s specific context, maturity, and strategic priorities.

 

| Framework | Primary Focus | Key Constructs | Representative Metrics | Strengths | Potential Challenges |
| --- | --- | --- | --- | --- | --- |
| Balanced Scorecard (BSC) | Holistic Strategy Execution & Communication | Financial, Customer, Internal Process, Learning & Growth Perspectives | ROI, Customer Satisfaction (CSAT), Time-to-Insight, Data Literacy Scores 24 | Provides a comprehensive, 360-degree view of performance. Excellent for communicating strategy and showing causal links between activities and outcomes.26 | Can be complex to design and implement. Requires strong leadership buy-in to cascade through the organization effectively.24 |
| Data-as-a-Product (DaaP) | Operational Value Realization & Scalability | Data Products, Product Teams, User-Centric Design | Product Usage/Adoption Rate, User Satisfaction Scores, ROI of Enabled Use Cases 32 | Tightly integrates value measurement with development. Fosters reusability and reduces TCO. Drives accountability at the team level.32 | Requires a significant organizational and cultural shift from a project to a product mindset. Can be difficult to retrofit into legacy structures.7 |
| Data Value Chain Analysis | Process Optimization & Bottleneck Identification | Stages: Collection, Processing, Analysis, Action/Impact | Data Quality Score, Pipeline Latency, Time-to-Insight, Business Impact of Decisions 36 | Excellent for diagnosing inefficiencies in the end-to-end data lifecycle. Clearly maps technical activities to business outcomes.33 | Can be perceived as overly linear. The final “Impact” stage can be difficult to attribute directly to earlier stages without careful design.35 |
| McKinsey Framework | Direct Financial & Bottom-Line Impact | High-Value Use Cases, EBIT Contribution | % of EBIT Attributable to AI, Revenue Uplift, Cost Savings from specific initiatives 8 | Uncompromising focus on linking data to profit and loss, which resonates strongly with CFOs and CEOs. Forces prioritization of impactful projects.23 | May undervalue long-term, strategic, or qualitative benefits that are not immediately reflected in EBIT.23 |
| BCG AI@Scale Framework | Capability Maturity & Scaled Transformation | Pillars: Strategy, Talent, Technology, Use Cases; Maturity Index | Use Case ROI, Time-to-Value, Maturity Index Score 23 | Focuses on building the organizational capability for sustained value creation. Emphasis on quick wins builds momentum.41 | Maturity models can be subjective. A focus on quick wins might deprioritize foundational, long-term investments if not balanced properly.42 |
| Gartner AI Value Pyramid | Balanced Portfolio of Value Communication | ROI (Investment), ROE (Employee), ROF (Future) | Project ROI, Employee Productivity Gains, Innovation Pipeline Velocity 23 | Provides a simple, powerful lexicon for communicating a balanced value story to executives. Acknowledges and legitimizes non-financial returns.23 | ROE and ROF can be more difficult to quantify in financial terms, requiring the use of proxy metrics and qualitative evidence.23 |

 

Chapter 7: Economic Models of Data Valuation: An Advanced Perspective

 

For the most analytically mature organizations, the conversation about data’s value can evolve beyond measuring the impact of initiatives to valuing the data itself as a distinct economic asset. This advanced perspective, grounded in the field of information economics, provides a theoretical foundation for assigning an absolute, quantifiable financial value to an organization’s data assets, placing them on par with traditional physical or financial capital on the corporate balance sheet.43 Understanding these models allows a CDO to engage in sophisticated discussions with the CFO about data as a source of enterprise value.

Data possesses unique economic characteristics that differentiate it from traditional assets. It is non-rivalrous, meaning multiple people can use the same data simultaneously without depleting it.43 Its creation often involves high up-front costs for collection and infrastructure, but very low marginal costs for replication and distribution.43 Data also creates externalities; for example, combining two datasets can create new insights that increase the value of both.43 Finally, data has a significant option value, as its potential future uses are often unknown at the time of collection, making it valuable to store even if its present use is not clear.43 These characteristics necessitate specialized valuation models.

Infonomics pioneer Doug Laney provides a useful categorization of data valuation methods into two main types: Foundational Models and Financial Models.43

Foundational Models (Relative Value): These methods assess the informational or utility value of data without assigning a specific monetary figure. They are crucial for assessing data quality and fitness for purpose.

  • Intrinsic Value of Information (IVI): Measures the quality and integrity of a data asset based on its core characteristics. Key drivers include its correctness (accuracy), completeness, and exclusivity. A dataset that is highly accurate and exclusively held by the organization is intrinsically more valuable than a public, incomplete dataset.43
  • Business Value of Information (BVI): Measures the suitability of a data asset for a specific business task or purpose. It assesses how well the data meets the requirements of an initiative, such as “This marketing campaign requires customer data that is at least 95% complete and updated weekly”.43
  • Performance Value of Information (PVI): Measures the impact that using the data has on key business performance indicators (KPIs). This is often determined through controlled experiments, such as A/B testing, to see how a business process performs with and without access to a particular dataset.43

Financial Models (Absolute Value): These methods aim to assign a specific, absolute economic value to data assets, making them comparable to other assets on a financial statement.

  • Cost Value of Information (CVI): Values data based on the cost to acquire or create it, the cost to replace it if lost, or the financial impact (e.g., lost cash flows) that would occur if the data were rendered unusable.43
  • Market Value of Information (MVI): Values data based on what it could be sold for in an open data marketplace. This is most applicable to data assets that have commercial potential for licensing or sale to other organizations.43
  • Economic Value of Information (EVI): This is arguably the most powerful and strategically relevant financial model for a CDO. The EVI measures the value of data based on its contribution to generating revenue or reducing costs through its use in business processes.43 It quantifies the expected cash flows, returns, or savings derived from leveraging the data asset.

The EVI provides a direct link between a data asset and its financial contribution. For example, consider a dataset of customer transaction histories. By itself, its value is latent. However, when this data is used to build a predictive churn model (the use case), and that model is used by the marketing team to launch a retention campaign that reduces customer churn by 5%, the EVI of that data can be calculated. The 5% churn reduction translates into a quantifiable financial uplift based on the Customer Lifetime Value (CLV) of the retained customers.14 This calculated uplift, minus the cost of the initiative, represents the EVI of the customer data for that specific application.
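A minimal sketch of this EVI logic follows; the customer counts, average CLV, and initiative cost are illustrative assumptions, not figures from the text:

```python
# Minimal sketch of an Economic Value of Information (EVI) estimate for
# the churn example above. All inputs are hypothetical assumptions.

def evi_from_churn_reduction(customers_at_risk: int,
                             churn_reduction_pct: float,
                             avg_clv: float,
                             initiative_cost: float) -> float:
    """EVI = financial uplift from using the data, minus initiative cost."""
    customers_retained = customers_at_risk * churn_reduction_pct
    uplift = customers_retained * avg_clv
    return uplift - initiative_cost


# 100,000 at-risk customers, a 5% churn reduction, an $1,800 average CLV,
# and $2M to build and run the model and retention campaign.
evi = evi_from_churn_reduction(100_000, 0.05, 1_800, 2_000_000)
print(f"EVI of the transaction data for this use case: ${evi:,.0f}")
# -> 5,000 retained customers * $1,800 - $2,000,000 = $7,000,000
```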

This approach can be complemented by other economic theories, such as the Subjective Theory of Value, where value is determined by the specific use case (e.g., a data product that saves a department 0.5 FTE is valued at 50% of that employee’s annual salary), and the Cost-of-Production Theory, which calculates value based on the total labor and operational expenditure required to create and maintain the data product.45 By combining these lenses, a CDO can construct a robust and defensible economic valuation of the organization’s most critical data assets.

 

Part III: The Universal KPI Catalog for Data & Analytics Initiatives

 

A successful measurement program relies on the selection of clear, relevant, and actionable Key Performance Indicators (KPIs). KPIs are the quantifiable metrics that track progress against strategic objectives, providing the data-driven evidence needed to assess performance, justify investments, and guide decision-making.15 This section serves as a practical, universal catalog of KPIs tailored for data and analytics initiatives. It is organized by strategic value category, allowing a CDO to select a balanced portfolio of metrics that reflects the full spectrum of value creation—from direct financial returns to foundational platform health and long-term innovation. Each KPI should be specific, measurable, achievable, relevant, and time-bound (SMART) to be effective.5

 

Chapter 8: Financial Impact & ROI Metrics

 

These are the ultimate lagging indicators of success, measuring the direct contribution of data and analytics to the organization’s bottom line. They are the most critical metrics for communicating with the CFO and CEO and for justifying the overall data program budget.

  • Return on Investment (ROI): The quintessential measure of profitability for a specific initiative or the entire data program. It calculates the financial gain relative to the cost of the investment. A positive ROI indicates that the initiative generated more value than it cost. The formula is: $ROI = \frac{\text{Financial Gain} - \text{Total Cost}}{\text{Total Cost}} \times 100\%$.4 (Minimal calculation sketches for several of the formulas in this list follow below.)
  • Revenue Growth Rate: Measures the percentage increase in revenue attributable to data-driven initiatives over a specific period. This can be tracked for the entire company or for specific product lines or campaigns enhanced by analytics.30
  • Net and Gross Profit Margin: These metrics measure the profitability of core business operations. Data initiatives can impact these by identifying cost efficiencies (improving gross margin) or by driving overall profitability after all expenses (net margin).30
  • Customer Lifetime Value (CLV): Calculates the total revenue a business can expect from a single customer account throughout the business relationship. Data-driven personalization and retention strategies directly aim to increase CLV.48
  • Customer Acquisition Cost (CAC): Measures the total cost to acquire a new customer. Analytics can lower CAC by optimizing marketing spend and improving lead targeting.51
  • EBIT Attributable to AI/Analytics: A sophisticated metric, advocated by McKinsey, that isolates the portion of a company’s Earnings Before Interest and Taxes that can be directly credited to the impact of AI and analytics programs.23
  • Return on Ad Spend (ROAS): For marketing analytics, this measures the gross revenue generated for every dollar spent on advertising. It is a direct measure of campaign profitability and effectiveness.53
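The following sketch implements several of the formulas above as described in this catalog; the parameter names and sample figures are illustrative assumptions:

```python
# Hedged sketches of the financial KPI formulas listed above, following
# the definitions given in the text; sample inputs are hypothetical.

def roi_pct(financial_gain: float, total_cost: float) -> float:
    """ROI = (gain - cost) / cost, expressed as a percentage."""
    return (financial_gain - total_cost) / total_cost * 100

def clv(avg_purchase_value: float, purchases_per_year: float,
        customer_lifespan_years: float) -> float:
    """Customer Lifetime Value per the simple model in the KPI table."""
    return avg_purchase_value * purchases_per_year * customer_lifespan_years

def cac(sales_and_marketing_cost: float, new_customers: int) -> float:
    """Customer Acquisition Cost."""
    return sales_and_marketing_cost / new_customers

def roas(ad_revenue: float, ad_spend: float) -> float:
    """Return on Ad Spend: gross revenue per dollar of advertising."""
    return ad_revenue / ad_spend

print(f"ROI:  {roi_pct(1_200_000, 1_000_000):.0f}%")   # 20%
print(f"CLV:  ${clv(120.0, 6, 5):,.0f}")               # $3,600
print(f"CAC:  ${cac(500_000, 2_000):,.0f}")            # $250
print(f"ROAS: {roas(400_000, 100_000):.1f}x")          # 4.0x
```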

 

Chapter 9: Operational Efficiency & Productivity Metrics

 

These KPIs measure the impact of data and analytics on improving the speed, quality, and cost-effectiveness of internal business processes. They are often leading indicators of financial impact, as efficiency gains typically translate into cost savings and improved capacity.

  • Time-to-Insight / Time-to-Value: Measures the time it takes to go from a business question to an actionable insight or a deployed solution. A reduction in this metric signifies increased agility and faster decision-making.31
  • Process Cycle Time Reduction: Quantifies the decrease in time required to complete a specific business process (e.g., order fulfillment, customer onboarding) due to data-driven optimization.40
  • Error Rate Reduction: Tracks the decrease in errors or defects in a process (e.g., manufacturing defects, billing errors) after the implementation of analytics-based monitoring or quality control.18
  • Cost Savings from Automation: Directly quantifies the reduction in operational costs (e.g., labor, materials) resulting from the automation of tasks using AI or analytics.18
  • Increased Throughput: Measures the increase in the volume of work processed or output produced by a system or team, such as the number of customer interactions handled by an AI-powered chatbot per hour.4
  • Employee Productivity: Can be measured in various ways, including revenue per employee or tasks completed per employee per hour. This KPI demonstrates how data tools and insights are empowering the workforce to be more effective.23
  • Report Production Cycle Time: Measures the average time it takes to fulfill a management request for a new report or analysis. Reducing this time frees up the data team for more strategic work and provides faster insights to the business.54

 

Chapter 10: Customer & Market Impact Metrics

 

These KPIs focus on how data initiatives influence external stakeholders, particularly customers, and the organization’s position within its market. They are crucial for demonstrating value in customer-centric organizations.

  • Customer Satisfaction (CSAT): A measure, typically from surveys, of how satisfied customers are with a specific product, service, or interaction. Data-driven improvements can directly lift CSAT scores.5
  • Net Promoter Score (NPS): A metric that gauges customer loyalty by asking how likely a customer is to recommend the company’s product or service. It is a strong indicator of long-term brand health and growth potential.4
  • Customer Churn / Retention Rate: Measures the percentage of customers who stop using a service or the percentage who continue, respectively. Predictive analytics are often used specifically to identify at-risk customers and reduce churn, making this a direct measure of a data project’s success (see the calculation sketch after this list).19
  • New Customer Acquisition Rate: Tracks the rate at which the company is gaining new customers. This can be improved through data-driven marketing and sales strategies.22
  • Market Share Growth: Measures the company’s portion of the total market sales. Gaining market share can be a direct result of the competitive advantages conferred by superior data and analytics.18
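A brief sketch of the standard NPS and churn-rate calculations referenced above; the sample survey scores and customer counts are illustrative:

```python
# Minimal sketches of the NPS and churn-rate calculations described
# above, using the standard 0-10 NPS survey scale.

def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

def churn_rate(customers_lost: int, customers_at_start: int) -> float:
    """Churn rate over a period, as a percentage."""
    return customers_lost / customers_at_start * 100

print(f"NPS: {nps([10, 9, 9, 8, 7, 6, 3, 10]):.0f}")    # (4 - 2) / 8 = 25
print(f"Monthly churn: {churn_rate(180, 10_000):.1f}%")  # 1.8%
```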

 

Chapter 11: Data Platform & Team Performance Metrics

 

These KPIs measure the health, efficiency, and effectiveness of the underlying data infrastructure, platforms, and the teams that manage them. While often technical, they are critical leading indicators; poor performance here will inevitably hinder the ability to achieve business-level outcomes.

  • Data Quality & Health:
     • Data Quality Score: A composite metric that aggregates multiple dimensions of data quality, such as Accuracy (% of data without errors), Completeness (% of fields with required values), Consistency (lack of contradictions), Timeliness (data freshness), Uniqueness (% of non-duplicate records), and Validity (% of data conforming to format rules); a weighted-average sketch follows this list.31
     • Data Availability / Uptime: The percentage of time that data systems and platforms are accessible and operational for users.31
     • Data Freshness Delta: The time lag between when an event occurs in the real world and when the data representing that event is available for analysis in the data platform.39
  • Analytics Performance:
     • Model Accuracy / Precision / Recall: For machine learning and predictive models, these statistical measures quantify how well the model performs its predictive task. Higher accuracy leads to more reliable, trustworthy insights.31
     • Query Performance / Dashboard Load Time: The speed at which users can retrieve data or load analytical dashboards. Slow performance is a major barrier to adoption and user satisfaction.31
     • Data Pipeline Latency: The time it takes for data to move from its source system through the processing pipeline to its destination where it can be analyzed.31
  • Team & Project Efficiency:
     • Time-to-Model Deployment: The time from the start of a data science project to the deployment of the resulting model into a production environment.31
     • Number of Data Sources Integrated: A measure of the data team’s progress in breaking down data silos and creating a more unified data landscape.31
     • Cost per Data Job: The total cost (compute, storage, personnel) associated with running a specific data processing job, used for resource optimization and budgeting.31
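The weighted-average Data Quality Score mentioned above can be sketched as follows; the dimension weights are assumptions that each organization would calibrate to its own priorities:

```python
# Minimal sketch of a composite Data Quality Score as a weighted average
# of the dimensions listed above. The weights are assumptions, not a
# standard; they must sum to 1.0.

DQ_WEIGHTS = {
    "accuracy": 0.30,
    "completeness": 0.25,
    "timeliness": 0.15,
    "consistency": 0.10,
    "uniqueness": 0.10,
    "validity": 0.10,
}

def data_quality_score(dimension_scores: dict[str, float]) -> float:
    """Weighted composite of per-dimension scores (each 0-100)."""
    return sum(DQ_WEIGHTS[d] * dimension_scores[d] for d in DQ_WEIGHTS)

scores = {"accuracy": 98.0, "completeness": 92.0, "timeliness": 95.0,
          "consistency": 99.0, "uniqueness": 97.0, "validity": 96.0}
print(f"Data Quality Score: {data_quality_score(scores):.1f}")  # 95.9
```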

 

Chapter 12: Governance, Risk, and Compliance (GRC) Metrics

 

These KPIs measure the effectiveness of the data governance program in ensuring data is managed as a secure, compliant, and trusted asset. They are crucial for mitigating risk and are of high interest to legal, compliance, and audit functions.

  • Percentage of Data Cataloged and Governed: Measures the proportion of the organization’s critical data assets that are documented in a data catalog and are under a formal governance policy. This indicates the maturity and reach of the governance program.39
  • Number of Data Breaches / Security Incidents: A direct measure of the effectiveness of data security controls. The goal is always zero, and this metric is a critical component of any risk report.39
  • Rate of Compliance Violations: Tracks the number of instances where data handling has violated regulations like GDPR or CCPA. This can be tied to the financial impact of associated fines or penalties.39
  • Data Access Compliance Rate: The percentage of data access events that comply with established security and privacy policies, ensuring that only authorized users access sensitive data.58
  • Issue Resolution Time: The average time it takes to resolve data-related issues, such as a data quality problem or a security alert, from the time they are reported.58
  • Risk Reduction: A financial metric that quantifies the value of avoided losses. This can be calculated by estimating the potential cost of a risk (e.g., a regulatory fine) and multiplying it by the reduction in probability achieved through a data governance control.19
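The Risk Reduction calculation described in the last item can be sketched as follows; the fine amount and probabilities are hypothetical:

```python
# Minimal sketch of the risk-reduction value calculation described above:
# avoided expected loss = potential cost * reduction in probability.
# All figures are hypothetical assumptions.

def risk_reduction_value(potential_loss: float,
                         prob_before: float,
                         prob_after: float) -> float:
    """Value of avoided losses attributable to a governance control."""
    return potential_loss * (prob_before - prob_after)

# Example: a possible $10M regulatory fine, with the probability of a
# qualifying breach cut from 4% to 1% by new access controls.
value = risk_reduction_value(10_000_000, 0.04, 0.01)
print(f"Annual risk reduction: ${value:,.0f}")  # -> $10M * 0.03 = $300,000
```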

 

Chapter 13: Innovation & Growth Metrics

 

These KPIs are designed to measure Gartner’s “Return on Future” (ROF). They are the most forward-looking indicators, tracking the organization’s capacity to innovate and create future growth streams through the strategic application of data and analytics.

  • Number of AI/Data Experiments in Progress: A measure of the organization’s exploratory activity and commitment to testing new ideas. A healthy portfolio of experiments is a leading indicator of future innovation.22
  • Speed of Proof-of-Concept (PoC) Development: The average time it takes to move an idea from conception to a working PoC. A reduction in this time indicates growing organizational agility and technical capability.22
  • Innovation Pipeline Velocity: Measures the percentage of new data-driven ideas that successfully progress through the innovation pipeline to the prototype or pilot stage. It reflects the efficiency of the innovation process.22
  • Number of New Products/Services Launched: Tracks the number of new offerings that incorporate significant data-driven features or were created based on analytical insights.22
  • Revenue from New Markets: Quantifies the revenue generated from new markets or customer segments that were identified and targeted using data analytics.22
  • Number of Patents Filed: For organizations in R&D-intensive fields, this tracks the number of new patents based on proprietary AI or data analysis techniques, representing a tangible intellectual property asset.22

 

Table: The Master KPI Catalog

 

The following table serves as a central, actionable repository of the KPIs discussed. It is designed to be a reference tool for a CDO and their team to select, define, and customize a comprehensive measurement program tailored to their organization’s strategic objectives.

 

| KPI Category | KPI Name | Description (What it measures) | Formula / Calculation Method | Strategic Relevance (Business question it answers) | Typical Data Sources | Benchmark Target (Example) |
| --- | --- | --- | --- | --- | --- | --- |
| Financial Impact | Return on Investment (ROI) | The financial gain or loss from a data initiative relative to its cost. | ((Financial Gain – Investment Cost) / Investment Cost) * 100% | Are our data investments profitable? | Financial Systems, Project Cost Tracking | > 20% 42 |
| Financial Impact | Customer Lifetime Value (CLV) | The total net profit a company can expect from a single customer over their entire relationship. | (Avg. Purchase Value * Avg. Purchase Frequency) * Avg. Customer Lifespan | Are we increasing the long-term value of our customers? | CRM, Sales Data, Financials | Increase by 15% YoY |
| Financial Impact | Customer Acquisition Cost (CAC) | The total cost of sales and marketing efforts needed to acquire a new customer. | (Total Sales & Marketing Cost) / (Number of New Customers Acquired) | Are we acquiring new customers efficiently? | Marketing Analytics, CRM, Financials | Decrease by 10% YoY |
| Operational Efficiency | Time-to-Insight | The time elapsed from when a business question is posed to when an actionable insight is delivered. | Timestamp(Insight Delivered) – Timestamp(Question Asked) | How quickly can we answer critical business questions? | Project Management Tools, BI Tools | < 48 hours for standard requests |
| Operational Efficiency | Error Rate Reduction | The percentage decrease in errors or defects in a specific business process. | ((Initial Error Rate – Post-Implementation Error Rate) / Initial Error Rate) * 100% | Are our data initiatives improving process quality? | Operational Systems, Quality Control Logs | Reduce billing errors by 50% |
| Customer & Market | Net Promoter Score (NPS) | A measure of customer loyalty and willingness to recommend the company’s products or services. | % Promoters – % Detractors | How do our customers perceive our brand and products? | Customer Surveys | > 50 (varies by industry) |
| Customer & Market | Customer Churn Rate | The percentage of customers who stop doing business with the company over a specific period. | (Customers Lost / Total Customers at Start of Period) * 100% | Are we retaining our valuable customers? | CRM, Subscription Management | < 2% monthly churn |
| Platform & Team | Data Quality Score | A composite score measuring the health of data across dimensions like accuracy, completeness, and timeliness. | Weighted average of individual quality metrics (e.g., % complete, % valid) | Can we trust our data for decision-making? | Data Profiling Tools, Data Catalogs | > 95% for critical data elements |
| Platform & Team | Data Availability (Uptime) | The percentage of time data systems and analytics platforms are operational and accessible to users. | ((Total Time – Downtime) / Total Time) * 100% | Is our data platform reliable for the business? | System Monitoring Tools | 99.9% uptime |
| Platform & Team | Model Accuracy | The percentage of correct predictions made by a machine learning model. | (Number of Correct Predictions / Total Number of Predictions) * 100% | How reliable are our predictive models? | ML Model Logs, Validation Datasets | > 90% for fraud detection model |
| Governance & Risk | % of Data Cataloged | The proportion of critical data assets that are documented and discoverable in the enterprise data catalog. | (Number of Cataloged Critical Assets / Total Critical Assets) * 100% | Do we have visibility and control over our critical data? | Data Catalog, Data Governance Tools | 80% of critical assets cataloged |
| Governance & Risk | Issue Resolution Time | The average time taken to resolve a reported data issue (e.g., quality error, access problem). | Avg(Timestamp(Issue Resolved) – Timestamp(Issue Reported)) | How responsive is our data governance process? | Ticketing Systems (e.g., Jira, ServiceNow) | < 24 hours for high-priority issues |
| Innovation & Growth | Innovation Pipeline Velocity | The percentage of data-driven ideas that successfully advance from concept to the prototype stage. | (Number of Ideas Reaching Prototype / Total Ideas Generated) * 100% | Are we effectively turning innovative ideas into tangible projects? | Innovation Management Software, Project Portfolio | 40% increase in prototype rate 22 |
| Innovation & Growth | Speed of PoC Development | The average time required to develop a proof-of-concept for a new data or AI initiative. | Avg(Timestamp(PoC Complete) – Timestamp(PoC Start)) | How agile are we in testing new data-driven concepts? | Project Management Tools | Reduce from 6 months to 6 weeks 22 |

 

Part IV: The CDO’s Implementation Blueprint

 

With a firm grasp of the strategic frameworks and a comprehensive catalog of KPIs, the CDO can now move to execution. This section provides a practical, step-by-step blueprint for designing, launching, and managing a value realization program. It translates theory into an actionable plan, guiding the CDO through the critical phases of establishing a baseline, benchmarking performance, designing the program, and creating a roadmap for incremental value delivery.

 

Chapter 14: Step 1 – Establishing the Performance Baseline

 

The axiom “you can’t manage what you don’t measure” has a crucial corollary: you cannot measure improvement without first knowing your starting point. Establishing a performance baseline is the foundational, non-negotiable first step in any credible measurement program.4 A baseline is a documented, quantitative snapshot of current performance that serves as the reference point against which all future progress, and the impact of all data initiatives, will be judged. Without it, claims of value creation remain anecdotal and indefensible.

The process of establishing a baseline is a structured project in itself, requiring clear objectives, data collection, and analysis.60 The key steps are as follows:

  1. Define Baseline Objectives and Scope: The first action is to clearly articulate the purpose of the baseline. Is it to measure the current efficiency of a specific business process before an automation initiative? Is it to understand the current state of data quality across the enterprise? This clarity of purpose guides the entire effort.59 The scope must be well-defined, breaking down the project into manageable components using a Work Breakdown Structure (WBS) to detail all deliverables and tasks.61
  2. Conduct a Capability and Data Audit: Before measuring performance, it is essential to understand the current state of the data landscape. This involves a thorough inventory of existing data assets (both structured and unstructured), data pipelines, and analytical tools.14 This audit should also assess the current skills and capabilities of the data team and the broader organization’s data literacy level. This process helps identify critical data sources, surface existing pain points like quality gaps or redundant processes, and understand the technological and human foundation upon which the measurement program will be built.14
  3. Collect and Validate Historical Data: For each KPI selected for the measurement program, the team must collect comprehensive historical data to create a clear picture of past and current performance.59 This may involve extracting data from various operational systems, financial records, and project management tools. It is critical to define the time period for the baseline—for example, the last 12 months of performance. This data must then be validated for quality and accuracy to ensure the baseline itself is trustworthy.64
  4. Establish the Baseline Values: Once the data is collected and validated, the team can calculate the initial value for each KPI. For metrics that exhibit natural variability, a single data point is often insufficient. Instead, statistical methods, such as calculating the average performance over a set period (e.g., the last 5 to 10 data points), should be used to establish a stable central line of performance.65 This analysis reveals trends, outliers, and the natural variation in the process, creating a much richer understanding of the “as-is” state (a minimal calculation sketch follows this list).59 For major initiatives, it is crucial to establish the “baseline trifecta”:
  • Scope Baseline: A clear definition of what work will be done and what will be delivered.61
  • Schedule Baseline: A detailed project timeline with milestones and dependencies.61
  • Cost Baseline: A time-phased budget outlining the expected cost of the project.61
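As a concrete illustration of step 4, the sketch below derives a baseline as the mean of the most recent observations, with rough natural-variation limits so that outliers stay visible rather than being absorbed silently. The function name, window size, and cycle-time figures are illustrative assumptions.

```python
import statistics

def establish_baseline(history: list[float], window: int = 10) -> dict:
    """Derive a stable baseline from the most recent `window` observations.

    The mean serves as the central line; +/- 3 standard deviations give
    rough natural-variation limits for spotting outliers and trends.
    """
    recent = history[-window:]
    center = statistics.mean(recent)
    spread = statistics.stdev(recent)
    return {
        "baseline": center,
        "upper_limit": center + 3 * spread,
        "lower_limit": center - 3 * spread,
        "observations": len(recent),
    }

# Example: twelve months of report-generation cycle time, in days.
cycle_times = [9.5, 10.2, 8.8, 9.9, 10.5, 9.1, 9.7, 10.0, 8.9, 9.4, 10.1, 9.6]
print(establish_baseline(cycle_times))
```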

In situations where reliable historical data is unavailable—a common challenge when implementing a new system or measuring a new process—the CDO must employ alternative strategies. These can include conducting time studies of current manual processes, deploying surveys to capture qualitative stakeholder perceptions, using external industry benchmarks as a proxy starting point, or, most practically, implementing a rapid data collection period of 30-60 days to generate an initial dataset from which a preliminary baseline can be established and later refined.64

 

Chapter 15: Step 2 – The Art and Science of Benchmarking

 

Once a performance baseline is established, it provides an internal measure of progress. However, to truly understand performance, context is required. Benchmarking is the process of systematically comparing an organization’s performance, processes, and practices against a chosen standard, thereby transforming raw metrics into meaningful, comparative insights.18 For a CDO, benchmarking answers critical questions: “Are we performing well compared to our peers?” “What does ‘good’ look like in our industry?” and “Where are our biggest opportunities for improvement?”.68 It provides an objective, external reference point that validates internal goals and identifies best practices that can be adopted to accelerate performance improvement.69

There are three primary types of benchmarking, each serving a different strategic purpose:

  1. Internal Benchmarking: This involves comparing processes and performance metrics across different departments, teams, business units, or locations within the same organization.71 For example, if one regional sales office has a significantly higher data-driven lead conversion rate, internal benchmarking would analyze its processes to identify best practices that can be standardized and rolled out to other offices. This is often the most accessible and resource-efficient starting point, as data is readily available and it avoids issues of data confidentiality.73
  2. Competitive Benchmarking: This is the direct comparison of performance metrics and strategies against an organization’s direct competitors.71 While strategically valuable for understanding market position, it can be challenging to execute due to the difficulty of obtaining reliable, confidential competitor data.66 Data is often sourced from public financial reports, market research, and industry analysis.
  3. Functional/Industry Benchmarking: This approach compares a specific function or process (e.g., data governance, customer service, supply chain management) against recognized leaders in that function, regardless of their industry.71 This is an excellent way to identify innovative, world-class practices. For instance, a bank looking to improve its data analytics workflow might benchmark itself against a leading tech company known for its data-driven culture.68

A systematic benchmarking process ensures that the insights generated are reliable and actionable. The key steps include:66

  1. Identify What to Benchmark: Prioritize critical processes or KPIs that are most important to stakeholders and aligned with strategic goals.
  2. Select Comparison Partners: Identify relevant and high-performing organizations to benchmark against.
  3. Collect Data: Gather quantitative and qualitative data through a mix of primary research (surveys, interviews) and secondary sources (reports, public data).
  4. Analyze Performance Gaps: Compare your baseline performance against the benchmark data to identify the magnitude and root causes of any performance gaps (a minimal gap calculation is sketched after this list).
  5. Develop and Implement an Action Plan: Create a formal plan to adopt the best practices identified and close the performance gaps.
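To make step 4 concrete, here is a minimal sketch of a gap calculation against external benchmarks. The KPI names, figures, and the higher_is_better flag are illustrative assumptions rather than a standard method.

```python
# Each KPI carries a direction flag so the gap is always expressed as
# "distance from benchmark in the unfavorable direction".
KPIS = {
    "data_quality_score": {"current": 91.0, "benchmark": 95.0, "higher_is_better": True},
    "monthly_churn_pct":  {"current": 2.8,  "benchmark": 2.0,  "higher_is_better": False},
    "uptime_pct":         {"current": 99.5, "benchmark": 99.9, "higher_is_better": True},
}

def gap(kpi: dict) -> float:
    """Positive gap = we trail the benchmark; zero or negative = at or ahead."""
    raw = kpi["benchmark"] - kpi["current"]
    return raw if kpi["higher_is_better"] else -raw

for name, k in KPIS.items():
    g = gap(k)
    status = "behind benchmark" if g > 0 else "at/above benchmark"
    print(f"{name}: {g:+.2f} ({status})")
```

Root-cause analysis then focuses on the KPIs with the largest unfavorable gaps.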

A significant challenge in benchmarking is sourcing reliable external data. CDOs can leverage several resources:

  • Industry Associations and Consulting Firms: Many trade associations and management consulting firms publish annual reports with industry-specific benchmark data.76
  • Government Agencies: Public bodies like the U.S. Census Bureau or the Bureau of Labor Statistics provide a wealth of economic and industry data.75
  • Specialized Benchmarking Services: Several organizations offer robust, validated benchmarking data and tools. APQC (American Productivity & Quality Center) provides the Open Standards Benchmarking database, the world’s largest repository of process and performance metrics, with over 5 million data points validated through a rigorous multi-step process.78 Gartner offers its IT Score for Data & Analytics, which allows organizations to benchmark their D&A capability maturity against peers across various industries and objectives.81 Google Analytics also provides a built-in benchmarking feature that allows websites to compare their user engagement and acquisition metrics against anonymized industry data.84

Finally, Data and Analytics Maturity Models from bodies like DAMA, CMMI, and BARC serve as a powerful form of benchmarking.86 These models allow an organization to assess its capabilities across dimensions like strategy, governance, technology, and culture against a standardized scale of maturity (e.g., from Level 1: Ad Hoc to Level 5: Optimized). This provides a clear roadmap for improvement and allows for comparison against the established characteristics of high-performing, data-mature organizations.88

 

Chapter 16: Step 3 – Designing the Measurement Program

 

With baselines established and benchmarks identified, the next critical step is to design the formal measurement program. This involves synthesizing the strategic frameworks, selected KPIs, and governance protocols into a cohesive and operational system for tracking and reporting value. A well-designed program is not a static list of metrics; it is a dynamic system tailored to the organization’s specific needs, designed for clarity, and built to evolve.

The design process should be collaborative and structured, following several key activities:

  1. Reconfirm Alignment with Business Objectives: The design phase must begin, once again, with the organization’s strategic goals. The CDO and their team should work directly with business leaders to translate high-level enterprise Objectives and Key Results (OKRs) or strategic priorities into specific data initiatives that will be measured.2 For example, if a corporate objective is to “Increase market share in the SMB segment by 10%,” a corresponding data initiative might be to “Develop a predictive lead scoring model to identify high-potential SMB prospects.” This direct line of sight ensures that the measurement program is focused on what truly matters to the business.6
  2. Select a Balanced Portfolio of Frameworks and KPIs: It is impractical and counterproductive to measure everything. The goal is to select the “vital few” KPIs that provide the most insight with the least overhead.89 The CDO should choose a primary strategic framework to structure the program—for instance, using the Balanced Scorecard (BSC) for a holistic view or Gartner’s AI Value Pyramid to frame the value story.23 From there, they should select a balanced portfolio of 10-15 core KPIs from the catalog in Part III, ensuring representation across financial, operational, customer, and platform health categories. For specific initiatives, more granular frameworks like the Data-as-a-Product (DaaP) model can be applied, with its own set of usage and satisfaction metrics.32
  3. Define Data Collection, Governance, and Cadence: For each selected KPI, the program design must explicitly document the “who, what, when, where, and how” of its measurement.71 This includes:
  • Data Source: Where will the raw data for the KPI be sourced? (e.g., CRM, ERP, web analytics).
  • Calculation Formula: The precise, agreed-upon formula for the KPI.
  • Owner: The individual or team responsible for collecting, calculating, and validating the KPI.
  • Cadence: How often the KPI will be measured and reported (e.g., daily, weekly, monthly, quarterly).
  • Validation Protocol: The process for ensuring the accuracy and quality of the KPI data before it is reported.
    This level of definition is crucial for building trust. If stakeholders question the validity of the data behind the metrics, the entire program’s credibility is undermined.51 A minimal sketch of such a KPI definition record follows this list.
  4. Engage Stakeholders in the Design Process: The design of the measurement program should not happen in a vacuum. The CDO must facilitate a collaborative process, engaging leaders from Finance, IT, and key business units.59 Finance should be involved to validate ROI calculations and financial metrics. IT should be consulted on the feasibility of data collection and platform metrics. Business leaders must confirm that the selected KPIs are relevant to their operational goals and decision-making needs. This collaborative approach ensures broad buy-in, fosters a sense of shared ownership, and dramatically increases the likelihood that the measurement program will be actively used and valued by the organization.90
  5. Design the Reporting and Communication Mechanisms: The final element of the design is to plan how the results will be communicated. This involves designing the dashboards, reports, and presentation formats that will be used. The design should be audience-centric. For example, the C-suite may receive a high-level, one-page dashboard summarizing top-tier BSC metrics quarterly, while an operational team might receive a detailed daily report on data pipeline performance. The goal is to provide the right information to the right people at the right time to support decision-making.91
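As one possible shape for the “who, what, when, where, and how” documentation in step 3, the sketch below models a KPI definition as a small Python record. All field names and example values are illustrative assumptions.

```python
# Hedged sketch of a KPI definition record for a measurement-program registry.
from dataclasses import dataclass

@dataclass
class KpiDefinition:
    name: str
    data_source: str   # where the raw data for the KPI is sourced
    formula: str       # the precise, agreed-upon calculation
    owner: str         # individual or team accountable for the figure
    cadence: str       # how often it is measured and reported
    validation: str    # how accuracy is checked before publication

churn = KpiDefinition(
    name="Customer Churn Rate",
    data_source="CRM / subscription management system",
    formula="(Customers Lost / Customers at Start of Period) * 100%",
    owner="Customer Analytics team",
    cadence="monthly",
    validation="Reconciled against Finance's active-customer count before release",
)
print(churn)
```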

By following these steps, the CDO can move beyond an ad-hoc collection of metrics to a purposefully designed, governed, and stakeholder-aligned measurement program that serves as the central nervous system for the data organization.

 

Chapter 17: Step 4 – Creating the Value Realization Roadmap

 

A well-designed measurement program defines what will be measured, but a Value Realization Roadmap defines how and when that value will be delivered and demonstrated. This roadmap is a strategic, time-bound action plan that translates the measurement strategy into a sequence of prioritized initiatives, projects, and milestones.92 It is a critical tool for managing stakeholder expectations, ensuring that resources are focused on the most impactful activities, and building momentum by delivering value incrementally rather than waiting for a single, large-scale “big bang” delivery.14

Creating an effective Value Realization Roadmap involves several key activities:

  1. Prioritize Data Initiatives: Not all data projects are created equal. The first step in building the roadmap is to rigorously prioritize the portfolio of potential initiatives based on business value and feasibility.94 This prevents the organization from spreading resources too thin or investing in low-impact projects. Effective prioritization frameworks include:
  • Impact/Effort Matrix: A simple 2×2 grid that plots initiatives based on their potential business impact (high/low) and the level of effort required to implement them (high/low). The highest priority should be given to high-impact, low-effort “quick wins”.18
  • MoSCoW Method: This framework categorizes initiatives into four groups: Must-have (critical for success), Should-have (important but not vital), Could-have (desirable but not necessary), and Won’t-have (out of scope for now). This helps focus on the essentials.2
  • ICE Scoring: A quantitative method that scores each initiative on three criteria: Impact (How much will this move the needle?), Confidence (How confident are we that this will succeed?), and Ease (How easy is it to implement?). The three scores are multiplied to create a final priority ranking (a minimal scoring sketch appears after this list).
    The goal is to front-load the roadmap with initiatives that can deliver demonstrable value quickly, thereby building credibility and securing buy-in for more complex, long-term projects.1
  2. Structure a Phased Rollout: The roadmap should be structured into logical, manageable phases, often aligned with quarterly planning cycles or 90-day sprints.14 This phased approach allows for incremental investment, regular value checkpoints, and the ability to learn and adapt. A typical phased roadmap might look like this:
  • Phase 1: Foundation & Quick Wins (Months 1-3):
    • Establish the core data governance council and framework.
    • Conduct the baseline assessment for critical KPIs.
    • Deliver one high-visibility “quick win” project (e.g., a sales dashboard that provides immediate, actionable insights) to demonstrate value.
  • Phase 2: Expansion & Scaling (Months 4-9):
    • Roll out the Data-as-a-Product (DaaP) model for one or two critical data domains (e.g., Customer, Product).
    • Expand the executive KPI dashboard with more metrics from the BSC.
    • Launch a formal data literacy training program for a pilot business unit.
  • Phase 3: Optimization & Advanced Analytics (Months 10-18):
    • Introduce the first predictive analytics or machine learning models into production.
    • Refine economic value (EVI) models for key data assets.
    • Embed the measurement program into the annual strategic planning and budgeting process.
  3. Define Milestones and Resource Allocation: For each phase, the roadmap must clearly define specific, measurable milestones, the resources required (people, budget, technology), and the dependencies between initiatives.92 This level of detail transforms the roadmap from a high-level wish list into a concrete, executable plan. It provides clarity on what will be delivered by when and what investment is needed, which is essential for managing executive expectations.90
  4. Plan for Continuous Review and Adaptation: A data strategy roadmap is not a static document to be created once and filed away. The business environment, technological landscape, and organizational priorities are constantly evolving.94 The roadmap must be a living document, subject to regular review and revision, typically on a quarterly basis.92 These review sessions, involving key stakeholders, provide an opportunity to assess progress against milestones, evaluate the value delivered, and adjust priorities and timelines as needed. This agility ensures the data strategy remains relevant and continuously aligned with the organization’s most pressing needs.92
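To illustrate the ICE method referenced in the prioritization list above, the following sketch scores and ranks a handful of hypothetical initiatives. The names and 1-10 scores are invented for illustration.

```python
# Hedged sketch of ICE scoring: Impact x Confidence x Ease, higher = sooner.
initiatives = [
    {"name": "Predictive lead scoring model", "impact": 8, "confidence": 6, "ease": 5},
    {"name": "Sales performance dashboard",   "impact": 6, "confidence": 9, "ease": 9},
    {"name": "Customer data platform build",  "impact": 9, "confidence": 5, "ease": 2},
]

for item in initiatives:
    item["ice"] = item["impact"] * item["confidence"] * item["ease"]

# Rank from highest to lowest ICE score to sequence the roadmap.
for item in sorted(initiatives, key=lambda i: i["ice"], reverse=True):
    print(f"{item['ice']:>4}  {item['name']}")
```

On these invented scores, the dashboard “quick win” ranks first, which mirrors the advice to front-load the roadmap with fast, visible value.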

By creating and actively managing a Value Realization Roadmap, the CDO provides a clear, transparent, and compelling narrative of how the data organization will systematically build capabilities and deliver increasing levels of business value over time.

 

Part V: Communicating Value and Cultivating a Data-Centric Culture

 

The technical and strategic components of a measurement program are necessary but not sufficient for success. The ultimate impact of a CDO’s work hinges on the human and political dimensions of the role: the ability to effectively communicate value, secure and maintain executive support, and drive cultural change. This section focuses on these critical “soft skills,” providing a blueprint for translating measurement results into influence and transforming the organization’s relationship with data.

 

Chapter 18: Gaining and Maintaining Executive Buy-In

 

Securing executive buy-in for data initiatives is not a one-time sales pitch; it is an ongoing process of building trust, demonstrating relevance, and consistently communicating value in a language that resonates with leadership.10 The measurement program is the CDO’s most powerful tool in this process. A successful strategy for gaining and maintaining buy-in is built on four pillars.

First, know your audience and tailor the message. Executives are focused on business outcomes, not technical minutiae.96 The CDO must understand the specific priorities, challenges, and success metrics of each member of the C-suite and frame the value proposition accordingly.10 For the CFO, the conversation should center on ROI, cost savings, and risk mitigation. For the CMO, it should focus on customer lifetime value, market share, and campaign effectiveness. For the COO, the emphasis should be on operational efficiency and productivity gains.97 This tailored communication demonstrates that the data strategy is not an isolated IT project but a direct enabler of each executive’s strategic goals.11

Second, build a strong, data-supported business case. Every proposal for a new data initiative should be framed as a solution to a specific business problem.11 The case should lead with the business challenge, articulate how the proposed data or analytics solution will address it, and, most importantly, quantify the expected outcome.96 This involves providing a detailed ROI analysis that includes both tangible benefits (e.g., projected revenue increase, cost avoidance) and, where possible, quantified intangible benefits (e.g., value of time saved, impact of faster decision-making).10 Using real-world examples or success stories from similar organizations can add significant credibility.10
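As a simple illustration of this arithmetic, the sketch below combines tangible benefits with a monetized time-savings proxy for an intangible benefit. Every figure is an invented assumption; a real business case should use inputs validated with Finance.

```python
# Hedged sketch of a business-case ROI calculation. All figures invented.
tangible_benefits = 1_200_000   # e.g., projected revenue lift + cost avoidance
hours_saved = 4_000             # analyst hours freed by the initiative
loaded_hourly_rate = 85         # proxy used to monetize time saved
intangible_proxy = hours_saved * loaded_hourly_rate

total_cost = 650_000            # build + run cost over the same horizon

# ROI = ((total benefits - cost) / cost) * 100
roi_pct = (tangible_benefits + intangible_proxy - total_cost) / total_cost * 100
print(f"Projected ROI: {roi_pct:.0f}%")
```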

Third, demonstrate quick wins to build momentum. Executive sponsors are more likely to fund large, complex initiatives if they have seen tangible results from smaller, initial investments.9 The Value Realization Roadmap should be intentionally designed to deliver high-impact, visible successes early in the process. These “quick wins” serve as powerful proof points, validating the data team’s capabilities and the potential of the broader data strategy. Celebrating these early successes and communicating their impact widely helps build the political capital needed for long-term, ambitious programs.10

Fourth, propose a phased approach to reduce perceived risk. Large, monolithic project proposals with multi-year timelines and high upfront costs can be daunting for executives. A more effective strategy is to break down large initiatives into smaller, manageable phases, each with its own clear deliverables, timeline, and value checkpoint.10 This phased implementation spreads out costs, allows for incremental investment, and provides regular opportunities to demonstrate progress and ROI. This approach lowers the barrier to initial approval and builds confidence as each phase successfully delivers on its promises.10 By consistently applying these principles, the CDO can transform executive engagement from a periodic hurdle into a continuous, collaborative partnership for value creation.

 

Chapter 19: The Art of Executive Reporting

 

The manner in which results are presented to leadership is as critical as the results themselves. An executive’s time is their most limited resource, and their attention is finite. Therefore, reports on data initiatives must be meticulously designed to be concise, visual, contextual, and laser-focused on actionable insights, not exhaustive data dumps.98 An overly complex or poorly communicated report can obscure real value and undermine the credibility of the entire measurement program.

The first principle of effective executive reporting is to focus on impact, not activity. Executives are concerned with business outcomes, not the technical processes used to achieve them.96 Reports should highlight progress against strategic goals, the ROI of key initiatives, and the impact on core business KPIs. Details about data pipeline architecture or model-building techniques belong in technical documentation, not in a C-suite presentation. The distinction between “vanity metrics” (e.g., number of dashboards built) and “value metrics” (e.g., reduction in operational costs due to insights from those dashboards) must be rigorously maintained.96

Second, lead with a story, not just data. Numbers alone are dry and often lack impact. The most effective reports frame the data within a compelling narrative that connects the numbers to a business challenge and its resolution.96 A powerful structure is to begin with the key takeaway or the most critical insight—the “headline”—and then provide the essential data points that support it. This respects the executive’s time by delivering the most important message upfront. For example, instead of building up to a conclusion, start with: “Our new predictive maintenance model has reduced equipment downtime by 15%, saving an estimated $2.5M this quarter”.96

Third, provide context to create meaning. A raw number in isolation is meaningless. A 5% increase in conversion rate is only impressive if the context is understood. Reports must always present data alongside relevant reference points to allow for proper interpretation.98 This includes:

  • Benchmarks: How does our performance compare to the industry average or best-in-class competitors?
  • Historical Trends: Is this performance an improvement or a decline compared to last quarter or last year?
  • Targets: How close are we to achieving the goal we set for this KPI?
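As a small illustration, the sketch below renders a KPI alongside all three reference points just listed, so the number never travels alone. The function name and figures are illustrative assumptions.

```python
# Hedged sketch: a KPI headline that carries its own context.
def kpi_headline(name, current, prior, benchmark, target, unit="%"):
    trend = current - prior
    return (f"{name}: {current}{unit} "
            f"(vs. last quarter {trend:+.1f}{unit}, "
            f"industry benchmark {benchmark}{unit}, "
            f"target {target}{unit})")

print(kpi_headline("Conversion rate", 5.3, 5.0, 4.8, 6.0))
```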

Fourth, visualize for clarity and impact. The human brain processes visual information far more efficiently than text or tables of numbers. Executive reports should leverage simple, clear, and well-designed charts and graphs to make complex information digestible at a glance.102 Avoid “chart junk”—unnecessary visual elements that clutter the visualization—and choose the right chart type for the data story being told (e.g., line charts for trends, bar charts for comparisons).104 A well-designed executive dashboard is infinitely more effective than a dense spreadsheet or a multi-page document.103

Finally, keep it concise. The goal for a C-suite audience should be a one-page summary or a dashboard that communicates the most critical information with no more than three key takeaways.98 This discipline forces the CDO to distill the message down to its essential core, ensuring that the report is not just delivered, but actually absorbed and acted upon.

 

Chapter 20: Fostering a Culture of Accountability and Continuous Improvement

 

The ultimate objective of a measurement program extends beyond reporting and justification; it is to fundamentally embed data-driven decision-making into the organization’s DNA, creating a culture of accountability and continuous improvement.17 This cultural transformation is perhaps the CDO’s most challenging yet most valuable long-term goal. It requires a deliberate, multi-pronged strategy that addresses leadership behavior, data accessibility, employee skills, and organizational incentives.105

The cornerstone of this cultural shift is leadership by example. A data-driven culture must be championed from the top. When senior leaders visibly and consistently use data, dashboards, and analytical insights in their own meetings, reviews, and strategic decisions, it sends a powerful and unambiguous message to the entire organization about what is valued.106 The CDO must act as a coach to the C-suite, ensuring they are comfortable with the tools and fluent in the language of the measurement program. Leaders must also foster an environment of psychological safety and curiosity, where employees are encouraged to question the status quo, test hypotheses with data, and learn from experiments, even those that fail.107

Next is the principle of democratizing data responsibly. For a data-driven culture to flourish, employees at all levels must have access to the data they need to perform their jobs effectively.13 This involves breaking down data silos and investing in user-friendly, self-service analytics and BI tools that empower business users to explore data and find answers to their own questions without being entirely dependent on a central data team.106 However, this democratization must be balanced with robust data governance to ensure data quality, security, and privacy. Role-based access controls and clear data definitions are essential to prevent a data “free-for-all”.106

Accessibility to data is meaningless without the skills to use it. Therefore, a critical component of the cultural strategy is investing in data literacy. The CDO must champion and resource ongoing training programs, workshops, and learning materials designed to improve the data skills of the entire workforce, not just the data specialists.105 This training should focus on practical skills, such as how to interpret data visualizations, understand key metrics, and use analytics tools to support daily decisions. A data-literate workforce is an empowered workforce, capable of contributing to and benefiting from the data strategy.17

Incentives and recognition play a vital role in reinforcing desired behaviors. The organization must celebrate data-driven wins. When a team or individual uses data to achieve a significant business outcome—launching a successful product, identifying a major cost saving, or dramatically improving a customer experience—that success should be publicly recognized and celebrated.17 Furthermore, aligning performance management and incentive systems with data-driven behaviors, for example by including data-related KPIs in employee evaluations, can powerfully motivate change.106

Finally, a culture of continuous improvement requires robust feedback loops. The CDO must establish clear channels for business users to provide feedback on data products, dashboards, and services.63 This allows the data team to iteratively improve its offerings based on user needs. Simultaneously, the data team must proactively share insights and success stories back to the business, demonstrating the value of their work and inspiring new use cases. This two-way communication creates a virtuous cycle of collaboration and improvement, ensuring the data strategy evolves in lockstep with the needs of the business.105

 

Part VI: Navigating Challenges and Ensuring Long-Term Success

 

The path to establishing a mature and impactful data measurement program is rarely linear or without obstacles. The CDO must be a pragmatic leader, anticipating common challenges and proactively designing strategies to mitigate them. This final section provides a clear-eyed view of the most frequent pitfalls encountered in data measurement and concludes by summarizing the critical success factors that underpin sustainable, long-term value realization.

 

Chapter 21: Common Pitfalls and How to Avoid Them

 

Successfully implementing a data value measurement program requires navigating a landscape of technical, organizational, and analytical challenges. Foreknowledge of these common pitfalls allows a CDO to address them proactively rather than reactively.

  1. Poor Data Quality and Data Silos: This is the most fundamental and pervasive technical barrier. If the underlying data is inaccurate, incomplete, inconsistent, or trapped in disconnected silos, any metrics derived from it will be untrustworthy.4 Decisions based on flawed data can be worse than those based on intuition.
  • Solution: There is no shortcut. The solution lies in establishing a robust data governance framework from the outset. This includes implementing data quality monitoring and profiling, creating a centralized data catalog to break down silos, and enforcing data standards at the point of entry.108 Investing in a modern data architecture that facilitates integration is also key.102
  2. Difficulty in Measuring Intangible Benefits: Many of the most significant benefits of data initiatives, such as “improved decision-making,” “enhanced collaboration,” or “increased innovation,” are inherently difficult to quantify in direct financial terms.5 An exclusive focus on hard ROI can lead to the undervaluation of these critical contributions.
  • Solution: Employ a balanced measurement approach. Use frameworks like the Balanced Scorecard or Gartner’s AI Value Pyramid that explicitly include non-financial perspectives.23 For intangible benefits, use proxy metrics that are quantifiable. For example, “improved decision-making” can be proxied by measuring the reduction in decision cycle time or the increase in the percentage of decisions supported by data.20 Time saved can be translated into labor cost avoidance.
  3. Lack of Clear Goals and Objectives: A frequent cause of failure is initiating data analysis or building dashboards without a clearly defined business question or objective.109 This leads to analyses that are “interesting but not actionable” and a waste of resources.
  • Solution: Institute a strict policy of aligning every data initiative with a specific, documented business goal before any work begins. The “Why?” must be answered before the “What?” or “How?”. The prioritization process in the Value Realization Roadmap should enforce this discipline.101
  4. Confusing Correlation with Causation: This is a classic analytical error. Observing that two variables move together (correlation) does not prove that one causes the other.100 Acting on a spurious correlation can lead to ineffective or even detrimental business strategies.
  • Solution: Foster a culture of analytical rigor and critical thinking. Promote the use of more sophisticated analytical techniques like A/B testing or controlled experiments to establish causality where possible. Encourage data scientists and analysts to challenge assumptions and explore potential confounding factors before presenting conclusions.100 A minimal significance-test sketch follows this list.
  5. Organizational Resistance to Change: A new measurement program often introduces a higher level of transparency and accountability, which can be met with cultural or political resistance from individuals or departments accustomed to operating on intuition or in silos.109
  • Solution: This is primarily a change management challenge. The solution requires a combination of strong, visible executive sponsorship to signal the importance of the initiative, clear and continuous communication about the “why” and the benefits, and the strategic use of quick wins to demonstrate value and build a coalition of supporters.9
  6. Using the Wrong Benchmarks: Comparing performance against an inappropriate benchmark can lead to dangerously misleading conclusions. For example, a small e-commerce startup comparing its marketing spend to that of Amazon would derive no useful insight.100
  • Solution: Be deliberate in selecting benchmarking partners and data. Ensure the comparison group is relevant in terms of industry, size, and business model. Clearly document the source and context of any benchmark data used in reports to ensure transparent interpretation.68
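To illustrate the experimental approach recommended for pitfall 4, here is a minimal sketch of a two-proportion z-test on A/B conversion counts, using only the standard library. The counts are invented; a production analysis would typically use a statistics library rather than hand-rolled math.

```python
# Hedged sketch: randomized A/B test evaluated with a two-proportion z-test.
from math import sqrt, erf

control_conv, control_n = 480, 10_000   # existing experience
variant_conv, variant_n = 585, 10_000   # data-driven change

p1, p2 = control_conv / control_n, variant_conv / variant_n
p_pool = (control_conv + variant_conv) / (control_n + variant_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variant_n))
z = (p2 - p1) / se
# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"lift: {(p2 - p1) * 100:.2f} pts, z = {z:.2f}, p = {p_value:.4f}")
# A small p-value (e.g., < 0.05) supports a causal reading of the lift,
# because randomized assignment rules out most confounders.
```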

By anticipating these pitfalls, the CDO can build mitigation strategies directly into the design of the measurement program, significantly increasing its chances of success and long-term adoption.

 

Chapter 22: Critical Success Factors (CSFs) for Sustainable Value Realization

 

The journey from establishing a measurement program to achieving sustainable value realization is complex. It requires more than just the right frameworks and technologies. Success depends on a set of foundational, non-negotiable conditions—Critical Success Factors (CSFs)—that must be cultivated and maintained. These CSFs represent the synthesis of the principles outlined in this playbook and serve as a final checklist for the CDO focused on long-term impact.

  1. Strong and Visible Executive Sponsorship: This is the single most important CSF.9 Without unwavering and visible support from the CEO, CFO, and other C-suite leaders, any data initiative, particularly one that involves cultural change and accountability, is likely to fail. The executive sponsor must champion the program, communicate its strategic importance, and provide the political cover needed to overcome inevitable resistance.16
  2. Clear and Inextricable Alignment with Business Strategy: The measurement program cannot exist as a separate, technical function. Every metric, every dashboard, and every report must be directly and clearly linked to the organization’s primary strategic objectives.16 This alignment ensures that the data function is focused on solving the most important business problems and that its value is understood in the context of what the business is trying to achieve.
  3. Robust and Proactive Data Governance: Trust is the currency of data. If stakeholders do not trust the data, they will not trust the metrics derived from it, rendering the entire measurement program useless. A mature data governance program that ensures data quality, security, consistency, and clear ownership is the bedrock upon which any successful measurement initiative is built.9
  4. Active and Continuous Stakeholder Engagement: Value realization is a team sport. The CDO must foster a culture of continuous collaboration with leaders and teams across business, IT, and finance.112 Involving stakeholders in the design of the measurement program, regularly communicating progress, and actively seeking their feedback ensures that the program remains relevant, addresses real-world needs, and has broad organizational buy-in.
  5. A Relentless Focus on Actionable Insights: The purpose of measurement is not to generate interesting reports; it is to drive better decisions and actions that improve performance.16 The program must be designed to move beyond data reporting to insight generation. Every metric should be tied to a potential action, and the program’s success should ultimately be judged by the quality and impact of the decisions it enables.
  6. An Iterative Approach and a Culture of Continuous Improvement: The most successful data programs are not built in a single “big bang.” They start small, focus on delivering tangible value through quick wins, and then iterate and expand based on lessons learned.9 This agile, iterative approach reduces risk, builds momentum, and allows the measurement program to evolve and adapt in lockstep with the changing needs of the business.

By focusing on these six Critical Success Factors, a CDO can navigate the complexities of their role and build a data and analytics function that is not just a center of excellence, but a proven and indispensable engine of enterprise value.

 

Table: Data & Analytics Maturity Model Comparison

 

To effectively apply the frameworks and KPIs in this playbook, a CDO must first understand their organization’s starting point. Data and analytics maturity models provide a structured way to self-assess current capabilities and set realistic goals for improvement. The following table synthesizes common characteristics from leading models like those from Gartner, DAMA, and BARC to provide a comparative overview of maturity levels.

 

Maturity Level | Strategy & Vision | People & Culture | Process & Governance | Technology & Architecture
Level 1: Initial / Ad Hoc | No formal data strategy. Analytics efforts are isolated, reactive, and driven by individual initiatives. Business objectives are not linked to data.116 | Data literacy is very low. Strong resistance to change. Decision-making is based on intuition and anecdote. Data is seen as an IT responsibility.116 | Processes are undocumented and inconsistent. No formal data governance exists. Data quality is poor and unmanaged. Data access is chaotic or highly restricted.116 | Fragmented data silos are prevalent. Technology consists of basic tools like spreadsheets. No centralized data platform or integration.116
Level 2: Developing / Repeatable | Awareness of data’s potential is growing. Some business units begin to explore data use cases. Strategy is still fragmented and project-based.87 | Pockets of data expertise emerge (“local heroes”). A desire for data-driven decisions begins to form, but skills are limited. Early data literacy efforts are initiated.87 | Some data-related processes are repeatable but not standardized or integrated. Basic data quality rules may be applied to specific projects. Data ownership is unclear.87 | Some data consolidation begins (e.g., departmental data marts). Basic BI and reporting tools are introduced. Architecture remains largely siloed.87
Level 3: Defined / Solid Foundation | An enterprise-wide data strategy is defined and aligned with business goals. Executive sponsorship for data is established. A roadmap for data initiatives exists.88 | Data roles (e.g., data stewards, analysts) are formally defined. Data literacy programs are in place. A data-aware culture is actively being cultivated.88 | Standardized data governance policies and procedures are documented and implemented. A formal data governance council is active. Data quality is actively monitored.87 | A centralized data platform (e.g., data warehouse or lake) is established. A standard portfolio of BI and analytics tools is available. Data integration processes are in place.87
Level 4: Managed / Excellent | Data strategy is fully integrated into the business strategy. The value of data is quantitatively measured and reported using KPIs and a framework like the BSC.88 | Data-driven decision-making is the norm in most departments. Cross-functional data teams collaborate effectively. Data skills are considered a core competency.87 | Governance processes are actively managed and optimized. Data quality is measured and managed as a business asset. Data access is governed but democratized via self-service.88 | The data platform is scalable and supports a wide range of analytics, including predictive models. Data pipelines are automated and monitored. Metadata management is mature.87
Level 5: Optimized / Leading | Data and analytics are a source of competitive advantage and innovation. The organization uses prescriptive and cognitive analytics to shape future strategy.88 | A culture of continuous improvement and data-driven experimentation is deeply embedded. All employees are data literate and empowered to use data for innovation.87 | Governance is automated, proactive, and adaptive. Processes are continuously optimized based on performance metrics. AI is used to improve governance itself.88 | A unified, flexible data fabric or data mesh architecture is in place. AI/ML is integrated into core business processes. Real-time analytics capabilities are widespread.87