Inadequate data infrastructure as a barrier to generative AI adoption

Generative AI is critical. But most organizations still don’t have the data infrastructure ready to handle it. The issue isn’t whether your teams are excited about AI; it’s whether your systems can move data fast enough, cleanly enough, and at sufficient scale. Right now, for many businesses, the answer is no.

Legacy pipelines slow things down. Data is spread across platforms and departments: isolated, often outdated, and impossible to coordinate effectively. That’s a nonstarter for generative AI, which relies on high-quality, unified, and accessible data to produce useful output. Without real-time integrations and data governance baked into the foundation, scaling generative AI from concept to real deployment becomes more of a science experiment than a core business strategy.

Executives are often surprised by the magnitude of the challenge. Their teams may have AI prototypes running in silos, but moving those prototypes into production stalls. Why? The data isn’t ready, and the architecture doesn’t support iterative AI workflows. You can’t build intelligent systems on fragmented foundations.

Now here’s the eye-opener: according to a February 2024 Gartner report, nearly two-thirds of enterprise data management professionals admitted their companies weren’t ready for generative AI. And keep this in mind: by 2026, Gartner projects that 60% of enterprise AI projects will fail due to poor data foundations. This isn’t a guess. It’s a sharp signal that companies still underestimate how vital modern data infrastructure is.

If you lead a business that’s already investing in AI, ask the hard questions. Is your pipeline fast and flexible? Is data consistently accessible across the enterprise? If not, you may not feel behind yet, but you are at risk. This isn’t about chasing trends. It’s about keeping your AI initiatives alive and ensuring they generate value. That starts with fixing the data at the root.

Strategic cloud partnerships and platform innovations as solutions to data challenges

AI doesn’t run on promises. It runs on clean, connected, high-volume data. The companies that understand this aren’t just talking about AI; they’re building real systems that work. The most practical move happening right now: strong partnerships and aggressive platform innovation across the cloud ecosystem. That’s where the real progress is.

SAP made a smart call with the launch of its Business Data Cloud. It’s more than a product; it’s a centralized structure that unifies SAP and non-SAP data, both structured and unstructured. AI doesn’t care where your data lives; it just needs it to be available, governed, and integrated. That’s exactly what SAP is enabling here.

Christian Klein, CEO of SAP, described Business Data Cloud as “the new center of gravity for business data” during the company’s Q1 2025 earnings call. He’s not overstating it. The platform turns fragmented enterprise data into something functional: logical layers of insight that support AI, dashboards, and analytics in a streamlined way. SAP’s partnership with Databricks shows they’re serious about merging enterprise-grade systems with high-performance data capabilities.

On another front, Oracle took a significant step by integrating its cloud database services directly inside AWS, Azure, and Google Cloud data centers. This completely changes the dynamics between traditional database solutions and hyperscale platforms. It cuts down the friction in multicloud environments, which is one of the top blockers for enterprises deploying AI across dispersed infrastructure.

Amazon CEO Andy Jassy was straightforward about the situation. He said during Amazon’s Q1 2025 earnings call, “If a company is to realize the full potential of AI, they are going to need their infrastructure and data in the cloud.” He’s right. You can’t scale bias detection, model training, or live interactions when your data is stuck in outdated, on-prem systems.

For executives, here’s the takeaway: making progress in AI is not just about having good ideas. It’s about your ability to connect platforms, upgrade storage and pipelines, and control data flows across multicloud environments. The partnerships happening right now aren’t about vendor opportunism. They’re about direct, quantifiable functionality, something your tech teams can work with and turn into real value. If your infrastructure is still closed off or built for a different era, now is the time to realign. You need fluid systems, strong integrations, and strategic vendors who aren’t trying to sell you software but to solve your bottlenecks.

Accelerated adoption of cloud analytics platforms driving data value extraction

Data is only useful if it moves: fast, securely, and at scale. That’s why we’re seeing rapid uptake of cloud-native analytics platforms. Enterprises want real-time visibility, connected systems, and intelligence delivered through interfaces people actually use. The backend needs to disappear, and insight needs to become immediate. That’s what these platforms are starting to deliver.

Microsoft is pushing hard on this front. Consumption of Microsoft Fabric, its end-to-end cloud analytics platform, jumped 80% in just the first three months of 2025. That’s not surface-level interest; it signals deep integration into business processes across sectors. Microsoft CEO Satya Nadella also confirmed that OneLake, the company’s multicloud data lake, grew more than sixfold year-over-year. That level of scale suggests businesses aren’t just storing data in the cloud; they’re using it.

Why does this matter for executives? Because it shows where things are moving: away from fragmented systems and toward consolidated services that combine storage, analytics, and AI pipelines in one environment. Platform growth at that scale happens when pain points are being solved, not because of marketing.

Enterprises want faster insight loops. They want models that learn from current data, not stale snapshots. Cloud analytics platforms solve part of the problem by reducing time-to-analysis and increasing access across functional teams. Microsoft Fabric is far from the only tool doing this, but numbers like these indicate it’s resonating broadly.

This signals something important for C-level decision makers: enterprise software investment needs to consider time to value. Not just what the platform claims to do, but how operational teams actually extract and apply insights. Strong growth in a platform like Fabric means teams are clearing bottlenecks, automating data prep, and moving toward integrated decision-making workflows.

If your teams are still exporting dashboards once a week or relying on siloed tools, you’re not keeping up with competitors who’ve already activated this new layer of intelligence. It’s not about hype anymore; the infrastructure is already delivering.

Measuring data engagement to ensure return on cloud analytics investments

There’s growing pressure inside companies to make sure data isn’t just stored, but used. Cloud costs continue to climb, and boards aren’t asking how much you’re storing; they care about what value you’re creating from that data. And that means measuring actual data engagement, not just infrastructure performance.

Domo, for example, tracks a metric it calls “cards views.” It’s simple: every time a user views a visualization (on desktop, mobile, embedded, or via a scheduled report), it’s counted. That tells you whether the data is being accessed, interpreted, and turned into insight. If no one is using the data, it’s static. And static data doesn’t move your business forward.
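To make that concrete, here’s a minimal Python sketch of the same idea: counting view events per visualization, across channels, within a recent window. The event log, field names, and dates are hypothetical; this is not Domo’s implementation or API, just an illustration of what an engagement signal can look like.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical view-event log. In practice these records would come from
# your BI platform's audit or usage API; fields here are illustrative only.
view_events = [
    {"card_id": "revenue-by-region", "user": "cfo@example.com",
     "channel": "desktop", "viewed_at": datetime(2025, 5, 1, 9, 15)},
    {"card_id": "revenue-by-region", "user": "ops-lead@example.com",
     "channel": "mobile", "viewed_at": datetime(2025, 5, 1, 9, 40)},
    {"card_id": "churn-forecast", "user": "cfo@example.com",
     "channel": "scheduled_report", "viewed_at": datetime(2025, 4, 2, 8, 0)},
]

def engagement_summary(events, window_days=30, now=None):
    """Count views per card in the window, plus distinct viewers."""
    now = now or datetime(2025, 5, 15)
    cutoff = now - timedelta(days=window_days)
    recent = [e for e in events if e["viewed_at"] >= cutoff]
    views_per_card = Counter(e["card_id"] for e in recent)
    distinct_viewers = len({e["user"] for e in recent})
    return views_per_card, distinct_viewers

views, viewers = engagement_summary(view_events)
print(views)    # Counter({'revenue-by-region': 2})
print(viewers)  # 2
```

The point isn’t the code; it’s that an engagement signal can be this simple: who looked at what, through which channel, and how recently.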

This isn’t just about dashboards or surface-level BI outputs. It’s about proving return on investment from cloud analytics platforms. If teams aren’t actively using data to make decisions or automate processes, then the tools themselves are pointless, regardless of scale or vendor.

For executives, it’s critical to push for measurement systems that reflect engagement and impact, not just uptime or storage volume. Data usage doesn’t need complex KPIs; it needs simple signals that clearly show usage rates and insight-generation paths. Are critical datasets reaching the right hands? Are decision-makers working off live, trusted information? Are KPIs being updated automatically by systems, not pulled manually?
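As an illustration of how simple such a signal can be, the sketch below flags KPIs whose last automated refresh has drifted past an agreed freshness window. The KPI names, timestamps, and the 24-hour threshold are assumptions made up for the example; in a real system the refresh times would come from your pipeline scheduler or warehouse metadata.

```python
from datetime import datetime, timedelta

# Hypothetical KPI refresh log; names and timestamps are illustrative.
kpi_last_refresh = {
    "daily_revenue":   datetime(2025, 5, 15, 6, 0),
    "pipeline_health": datetime(2025, 5, 14, 6, 0),
    "nps_rollup":      datetime(2025, 4, 30, 6, 0),
}

def stale_kpis(last_refresh, max_age_hours=24, now=None):
    """Flag KPIs whose last automated refresh is older than the SLA."""
    now = now or datetime(2025, 5, 15, 12, 0)
    limit = timedelta(hours=max_age_hours)
    return [kpi for kpi, ts in last_refresh.items() if now - ts > limit]

print(stale_kpis(kpi_last_refresh))  # ['pipeline_health', 'nps_rollup']
```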

According to Schein from Domo, data that isn’t viewed or used might as well not exist. He makes the point clearly: loading data into the cloud without meaningful engagement doesn’t count. That engagement, shown through metrics like “cards views”, is the difference between a technology investment and a business outcome.

If you’re in the C-suite, you should be asking: What percentage of our cloud data is actively used in decision-making? What’s the engagement rate of our analytics tools across departments? You don’t need 100% adoption; you need purposeful usage. And that starts with tracking real interaction.
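One way to approximate that first question is to compare last-access timestamps from your warehouse query logs or data catalog against a usage window. The sketch below is illustrative only: the catalog contents and the 90-day window are assumptions, and a real implementation would read these timestamps from your platform’s metadata rather than a hard-coded dictionary.

```python
from datetime import datetime, timedelta

# Hypothetical catalog of governed datasets and their last recorded access.
catalog = {
    "sales_orders":      datetime(2025, 5, 10),
    "customer_churn":    datetime(2025, 5, 12),
    "legacy_inventory":  datetime(2024, 11, 3),
    "marketing_touches": datetime(2025, 2, 20),
}

def active_data_share(last_access, window_days=90, now=None):
    """Fraction of cataloged datasets accessed within the window."""
    now = now or datetime(2025, 5, 15)
    cutoff = now - timedelta(days=window_days)
    active = [name for name, ts in last_access.items() if ts >= cutoff]
    return len(active) / len(last_access), active

share, active = active_data_share(catalog)
print(f"{share:.0%} of datasets actively used")  # 75% of datasets actively used
```

A number like this won’t settle the ROI question on its own, but it turns “are we using our data?” from a debate into a trackable baseline.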

Key takeaways for leaders

  • Inadequate infrastructure limits AI success: Most AI initiatives stall because legacy systems and siloed data prevent the scalability and performance generative AI needs. Leaders should invest in unified, modern data pipelines to avoid wasted AI spend and failed deployments.
  • Strategic partnerships fix data fragmentation: Vendors like SAP and Oracle are closing cloud integration gaps through bold partnerships and platform redesigns. Executives should align with ecosystem-ready vendors that can streamline data movement across multicloud environments.
  • Cloud analytics adoption is accelerating fast: Microsoft Fabric and other cloud-native platforms are seeing record usage as enterprises chase faster insights and more connected intelligence. Leaders should assess whether existing analytics tools can scale with real-time business needs.
  • Data engagement must be measured: Usage metrics like Domo’s “cards views” reveal whether data is actually driving decisions or just sitting idle. Executives should track data engagement to ensure cloud investments deliver tangible business outcomes.

Alexander Procter

September 4, 2025

8 Min