The 3 Reasons Why Data Projects Fail

THIS WEEK: And what to do to address them as a Data Leader

Dear Reader…

We look into the systemic failures plaguing enterprise data projects, and what you can do to address them.

The data paints a grim picture across the entire spectrum of enterprise data initiatives. Whether it's a simple customer segmentation project, predictive analytics for supply chain optimisation, or a cutting-edge AI implementation, the failure rates are consistently high. MIT's research on GenAI pilots - showing that more than 90% fail to create measurable value - merely represents the latest chapter in a decades-long story of data project disappointments.

For experienced data professionals who've witnessed this firsthand, these figures confirm what many have suspected: despite billions in investment and countless hours of technical brilliance, enterprise data projects are plagued by failure across all domains. The problem isn't confined to AI; it's endemic across almost every type of data initiative.

The Anatomy of Data Project Failure

After examining the failure-rate landscape across analytics, machine learning, and AI, a clear pattern emerges. The issues are not technical; they are systemic. Companies are falling into predictable traps that affect everything from basic reporting dashboards to sophisticated neural networks.

The Hype Trap: Executives declare, "We need better analytics!" or "We need AI!" without defining what problem they're solving. Data teams scramble to build impressive demonstrations - from executive dashboards to sophisticated recommendation engines. But when executives ask about ROI six months later, the silence can be deafening. Organisations build sophisticated customer churn prediction models, demand forecasting systems, or sentiment analysis tools with impressive technical metrics, but nobody can demonstrate whether they actually improved business outcomes.

The Foundation Fallacy: Organisations rush toward advanced analytics whilst neglecting basic data quality and governance. Business units get excited about machine learning outcomes or real-time dashboards but forget the foundation. Poor data quality equals expensive project failure, whether you're building a simple sales report or a complex deep learning model.

The Adoption Abyss: Perhaps most heartbreaking is the "brilliant insights, zero usage" syndrome that affects all types of data projects. Technical teams deliver sophisticated analyses, from basic customer segmentation to advanced AI-powered recommendations, that sit unused in dashboards. The insights exist, but they're not integrated into daily business processes or decision-making workflows.

The RAPPID Response: A Methodology Born from Crisis

The RAPPID Value Lifecycle, developed by Karl Dinkelmann and Zjaén Coetzee, emerged as a direct response to these systemic failures across all types of data initiatives. Unlike traditional project management approaches that focus on delivery, RAPPID is obsessed with one thing: measurable commercial value from data investments.

The methodology establishes the clear principle that analytics, reporting, predictive modelling, and AI are only relevant if they create measurable outcomes. This isn't just philosophical positioning; it's a fundamental restructuring of how data projects are conceived, executed, and evaluated.

Breaking the Hype Cycle

RAPPID's first milestone - Approved Funding - acts as a firewall against technology-driven initiatives across the entire data spectrum. No code gets written for any data project until leadership secures investment based on a clear business case aligned to strategic goals. This "business-first approach" prevents the launch of projects that lack financial and strategic sponsorship, whether they involve simple analytics dashboards or complex AI implementations.

Consider the implications: instead of data scientists building impressive demos and hoping they create value, RAPPID mandates that value definition comes first for every data initiative. The methodology forces leaders to answer uncomfortable questions (captured in a simple form in the sketch after this list):

  • What specific commercial outcome are we seeking from this customer analytics project?

  • How will we measure success of our demand forecasting initiative?

  • Who will act on insights from our recommendation engine?
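RAPPID itself doesn't prescribe a particular artefact for recording these answers, so the following is only a minimal sketch of the idea behind the Approved Funding gate: a structured value definition that must be complete before any delivery work begins. The field names and the funding_gate helper are illustrative assumptions, not part of the methodology.

```python
from dataclasses import dataclass, fields


@dataclass
class ValueDefinition:
    """Illustrative intake record: delivery does not start until every field is answered."""
    initiative: str          # e.g. "Customer churn reduction"
    commercial_outcome: str  # the specific business result being sought
    success_metric: str      # how success will be measured
    baseline: float          # current value of the metric
    target: float            # value that would justify the investment
    insight_owner: str       # who will act on the insights
    executive_sponsor: str   # who approved the funding


def funding_gate(definition: ValueDefinition) -> bool:
    """Return True only if every field has a non-empty answer."""
    return all(getattr(definition, f.name) not in (None, "") for f in fields(definition))


churn_project = ValueDefinition(
    initiative="Customer churn reduction",
    commercial_outcome="Reduce voluntary churn in the retail segment",
    success_metric="Monthly voluntary churn rate",
    baseline=0.042,
    target=0.035,
    insight_owner="Head of Retention",
    executive_sponsor="CFO",
)

assert funding_gate(churn_project), "No code gets written until the business case is complete"
```

The point of the sketch is that the gate is explicit: a missing insight owner or an unmeasured baseline blocks the project just as surely as a missing budget.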

Addressing the Foundational Problem

The RAPPID framework explicitly incorporates the proprietary BREATH Framework, dedicated entirely to ensuring "Data Trust." This embeds data quality, lineage, and compliance requirements directly into the delivery phase, minimising the risk of project failure due to unreliable data - regardless of whether you're building basic reports or advanced AI systems.

This represents a fundamental shift in thinking across all data initiatives. Rather than treating data governance as an overhead, RAPPID positions it as a mandatory prerequisite for achieving subsequent milestones. The principle is clear: You wouldn't build a house on quicksand, so don't build analytics on bad data.
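The BREATH Framework is proprietary and its contents aren't reproduced here, so the sketch below only illustrates the general principle it embodies: automated data-trust checks that run inside the delivery phase and block downstream work when they fail. The checks, column names, and thresholds are assumptions for illustration, written with plain pandas.

```python
import pandas as pd


def data_trust_checks(df: pd.DataFrame) -> dict:
    """Generic data-quality gate run before any report or model is built."""
    return {
        "no_duplicate_customers": df["customer_id"].is_unique,
        "no_missing_revenue": bool(df["monthly_revenue"].notna().all()),
        "revenue_non_negative": bool((df["monthly_revenue"] >= 0).all()),
        "data_is_recent": df["snapshot_date"].max() >= pd.Timestamp.today() - pd.Timedelta(days=35),
    }


df = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "monthly_revenue": [120.0, 89.5, 240.0],
    "snapshot_date": [pd.Timestamp.today().normalize()] * 3,
})

results = data_trust_checks(df)
if not all(results.values()):
    failed = [name for name, ok in results.items() if not ok]
    raise RuntimeError(f"Data trust gate failed: {failed}")  # block delivery on unreliable data
```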

Solving the Adoption Challenge

Perhaps RAPPID's most innovative element is Milestone 3: Embedded Insights, which applies to all types of data projects. Whether it's a simple sales dashboard or a complex AI-powered pricing optimisation system, the project isn't considered "done" until insights are integrated into daily business processes and decision-making workflows. This phase includes cultural adoption managed via the STACKER+DVC Framework and technical integration through the AC3ROS Framework.

This addresses what many consider the most critical failure point across all data projects. Building insights is only half the battle - getting people to actually use them is where the real value lives, and it is what demonstrates ROI to those most impacted by the initiative. The goal is to turn passive consumers into data-confident advocates.
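STACKER+DVC and AC3ROS aren't publicly documented, so the sketch below is only a generic illustration of what "embedded" means in practice: predictions are pushed into an operational queue that someone already works from, rather than left in a dashboard. The function names, threshold, and CRM hand-off are hypothetical.

```python
from typing import Iterable


def create_retention_task(customer_id: int, churn_risk: float) -> None:
    """Hypothetical integration point: in a real system this would call the CRM or
    ticketing API so the insight lands in the retention team's daily workflow."""
    print(f"Task created: contact customer {customer_id} (churn risk {churn_risk:.0%})")


def embed_insights(scores: Iterable[tuple[int, float]], threshold: float = 0.7) -> int:
    """Route high-risk predictions into the business workflow instead of a dashboard."""
    actioned = 0
    for customer_id, churn_risk in scores:
        if churn_risk >= threshold:
            create_retention_task(customer_id, churn_risk)
            actioned += 1
    return actioned


# Example model output: (customer_id, predicted churn probability)
weekly_scores = [(101, 0.82), (102, 0.35), (103, 0.91)]
print(f"{embed_insights(weekly_scores)} insights embedded into the retention workflow")
```

The design choice worth noticing is that the integration point (here, create_retention_task) belongs to the business workflow, not to the data team's tooling.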

The Continuous Value Loop

What distinguishes RAPPID from other methodologies is its closed-loop structure that applies to all data initiatives. The systematic measurement and publication of realised value in Phase IV (Recognise Value) directly feeds the credibility and financial justification required to secure the next iteration of investment—whether for expanding existing analytics capabilities or venturing into more advanced AI applications.

This creates what the methodology calls a "continuous capital investment engine", transforming data analytics from episodic projects into a self-justifying capability for growth. When done correctly, successful data initiatives fund their own expansion across the analytics maturity spectrum.
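RAPPID doesn't mandate a particular value calculation, but a standard realised-ROI figure is the kind of number Phase IV would measure and publish to justify the next round of investment. The figures below are purely illustrative.

```python
def realised_roi(annual_benefit: float, total_cost: float) -> float:
    """Simple realised ROI: (benefit - cost) / cost, published at the Recognise Value milestone."""
    return (annual_benefit - total_cost) / total_cost


# Illustrative numbers only: a churn project that cost £250k and retained £400k of revenue
roi = realised_roi(annual_benefit=400_000, total_cost=250_000)
print(f"Realised ROI: {roi:.0%}")  # 60% - the evidence that funds the next iteration
```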

Cross-Functional Accountability: Breaking Down Silos

One of the most significant contributors to data project failure across all domains is organisational silos. Technical teams optimise for accuracy whilst business teams care about outcomes, and financial teams want ROI justification. This dynamic affects everything from basic reporting projects to advanced AI implementations.

RAPPID mandates cross-functional accountability across all four milestones for every type of data project. Technical success requires specialised data teams, but operational success requires mandatory business buy-in. The ultimate goal of “Measurable Value” validates the initial capital allocation, requiring consistent collaboration between financial and technical leadership regardless of project complexity.

The framework operates on the principle that when everyone's success depends on everyone else's success, silos naturally disappear across all data initiatives.

The Pain Points RAPPID Addresses

The methodology directly confronts the seven critical pain points that plague enterprise data initiatives across the entire spectrum. Do any of these sound like your business?

The Data-Value Gap: Whether building basic dashboards or AI systems, RAPPID's Milestone 4 forces organisations to measure and publish actual business impact, transforming data from cost centre to profit engine.

Technology Hype Chasing: The business-first approach mandates clear value definition before technical execution begins, whether for simple analytics or complex AI projects.

Technical vs Commercial Disconnect: Continuous collaboration across all milestones ensures technical delivery integrates with operational workflows, from basic reporting to advanced machine learning.

Foundation Neglect: The BREATH Framework ensures data quality and governance are built into the delivery phase from day one, regardless of project complexity.

Leadership Decision Paralysis: Simple, proven methodologies provide leaders with data confidence and clarity across all types of data investments.

Organisational Silos: Cross-functional accountability requirements break down barriers between technical and business teams for all data initiatives.

Adoption Failure: The Embedded Insights milestone ensures projects aren't complete until insights drive actual business decisions, whether from basic analytics or sophisticated AI.

The Path Forward

As enterprise data project failure rates remain stubbornly high across all domains, the need for systematic approaches becomes undeniable. The RAPPID Value Lifecycle offers a structured alternative to the current chaos, but it requires leadership willing to prioritise measurable outcomes over technological novelty, whether pursuing basic analytics or advanced AI.

For data professionals struggling with these pain points across their project portfolio, RAPPID presents both an opportunity and a challenge. The opportunity lies in its potential to transform how organisations approach all data investments. The challenge is convincing leadership to embrace a methodology that demands accountability at every stage, regardless of project complexity.

The question isn't whether data will transform business—it's whether organisations can transform their approach to data projects quickly enough to avoid becoming another failure statistic. In a landscape where failure rates remain consistently high across all types of data initiatives, methodologies like RAPPID aren't just helpful—they're essential for survival.

The framework forces a fundamental choice: continue building impressive technology that nobody uses, or start building value that nobody can ignore. The evidence suggests that organisations embracing the latter approach through structured methodologies like RAPPID are far more likely to succeed across their entire data project portfolio.

The choice, ultimately, is ours.

That’s a wrap for this week.
Happy Engineering, Data Pros