Fragmented data ecosystem undermines AI reliability
Companies are placing big bets on AI. The expectation is simple: faster decisions, sharper insights, and scalable performance. But there’s a bottleneck, and it’s bigger than most realize. It’s not the algorithm. It’s not the compute power. It’s fragmented data. Businesses define key terms, the foundation of business logic, differently across systems. One team says a customer is anyone who bought something in the last 90 days. Another calls anyone who clicked a marketing email a customer. Feed both systems into a single AI model, and you’re not easing complexity, you’re multiplying it.
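To make the conflict concrete, here is a minimal sketch of what those two competing definitions might look like as semantic-layer entries. The YAML structure and field names are purely illustrative, not any vendor’s actual schema:

```yaml
# Illustrative only: field names are hypothetical, not a real vendor schema.
# Two platforms define the same term, "customer", in incompatible ways.

# Team A's warehouse model: a customer is a recent buyer.
- name: customer
  source: orders
  definition_sql: |
    SELECT DISTINCT buyer_id
    FROM orders
    WHERE order_date >= CURRENT_DATE - INTERVAL '90 days'

# Team B's marketing platform: a customer is anyone who clicked an email.
- name: customer
  source: email_events
  definition_sql: |
    SELECT DISTINCT recipient_id
    FROM email_events
    WHERE event_type = 'click'
```

An AI model that joins outputs from both systems inherits both definitions at once, which is exactly the ambiguity described above.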
Christian Kleinerman, EVP of Product at Snowflake, says it directly: “The biggest barrier our customers face when it comes to ROI from AI isn’t a competitor, it’s data fragmentation.” When your AI doesn’t know which definition to trust, its output loses accuracy and relevance. This hurts confidence in what should be one of your most powerful strategic advantages.
Enterprises dealing with this are dumping hours, sometimes weeks, into reconciling conflicting data definitions before they can even start deploying models. That delays deployments, inflates costs, and burns through valuable engineering time. Worse, it stalls momentum. If your leadership team is investing heavily in AI, and you’re still fighting over what a “churned subscriber” means, you’re not operating at full velocity.
There’s a reason Snowflake crossed $1 billion in quarterly revenue in May. AI features drove the demand: over 6,100 customers use its AI functionality every week. But even Snowflake says the real obstacle is semantic chaos. If AI is going to scale, data definitions must be aligned first.
Open Semantic Interchange (OSI) initiative unifies business data definitions
More than a dozen serious tech players have teamed up to solve this. The aim is not another product, but a fix for a foundational flaw. Snowflake, Salesforce, Tableau, BlackRock, dbt Labs, and others have launched the Open Semantic Interchange (OSI) initiative. The goal is clear: create the first universal, vendor-neutral language for business data semantics.
Why? Because no standard exists. Most data tools speak in proprietary formats, each carrying its own interpretation of business logic that doesn’t connect to the others. That’s why AI models break when definitions shift across platforms. OSI aims to fix this with a single, open model that every platform can adopt. It’s not about ownership. It’s about agreement. Agreement on what data means.
Southard Jones, CPO at Tableau, puts it simply: “The future of AI depends on trust, and trust starts with consistent, reliable data.” Tableau is offering its decades of experience defining business logic to help drive the OSI blueprint forward. The company isn’t doing this to build walls; it’s connecting systems the right way.
BlackRock’s Aladdin platform, which runs on integrated data across global markets, is already looking to OSI for financial applications. “We are excited to be part of the Open Semantic Interchange,” said Diwakar Goel, Global Head of Aladdin Data, “to help establish a common, vendor-neutral specification that will streamline data exchange and accelerate AI adoption.”
This is core infrastructure for digital operations. By acting now, decision-makers can improve time-to-insight, cut down on manual fixes, and make their AI stack trustworthy at scale. AI only delivers when the definitions behind the data are aligned. That alignment starts here.
OSI’s open, vendor-neutral standard accelerates AI scalability
Most metadata standards were built for a different era. They weren’t designed to support AI context, real-time usage, or business-specific logic. That’s why they’ve fallen short. Today’s AI agents need clarity, not just structure. They need standardized definitions that capture how businesses describe customers, revenue, churn, and hundreds of other terms, natively and without retranslation.
That’s what OSI is solving. It’s not built for platforms, it’s built for outcomes. At its core, OSI supports SQL-based analytical models, but it also includes AI-specific instructions like synonyms and custom metadata. This means AI agents can finally work from a unified source of truth and apply consistent logic across tools, teams, and environments. This removes ambiguity and accelerates deployment.
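The OSI specification itself isn’t reproduced here, so the sketch below is a hypothetical rendering of the capabilities described: a SQL-backed metric enriched with synonyms and custom metadata that an AI agent can resolve without guessing. All field names are assumptions for illustration, not the published spec:

```yaml
# Hypothetical sketch of an OSI-style definition; field names are
# assumptions based on the capabilities described, not the published spec.
metrics:
  - name: churned_subscribers
    description: Paid subscribers whose plan lapsed and was not renewed within 30 days
    expression_sql: |
      SELECT COUNT(DISTINCT subscriber_id)
      FROM subscriptions
      WHERE status = 'lapsed'
        AND lapsed_at <= CURRENT_DATE - INTERVAL '30 days'
    synonyms:               # alternate phrasings an AI agent can match on
      - cancelled subscribers
      - lost subscribers
    meta:                   # custom metadata carried with the definition
      owner: revenue-analytics
      certified: true
```

With the synonyms and metadata attached to the definition itself, an agent asked about “lost subscribers” can resolve the question to the same certified SQL logic every time, instead of inferring a meaning from raw metadata.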
Christian Kleinerman at Snowflake highlighted what’s different about OSI: it’s the first to address AI-specific needs while still plugging into established analytical foundations like SQL. That’s a clear step forward because it brings clarity and operating efficiency to a space that has been fragmented and overloaded by mismatched definitions.
Francois Lopitaux, SVP of Product Management at ThoughtSpot, explained the impact clearly: “Today, AI models are often forced to infer relationships from raw metadata, which can lead to misinterpretations and inaccurate outputs.” OSI ends that guesswork. By codifying true business context in a consistent standard, it gives AI systems a foundation they can depend on.
For enterprise leaders, this is about control and scale. Moving from proprietary semantics to an open, neutral specification doesn’t reduce performance, it enhances agility. AI models run faster, provisioning becomes smoother, and decision pipelines stabilize. OSI positions companies to build durable AI infrastructure without getting locked into outdated metadata definitions.
Immediate compatibility with existing analytics tools enhances productivity
One of the strongest points in OSI’s favor is speed to implementation. OSI was designed to run on YAML definitions, which are already widely used by modern analytics stacks. This allows instant compatibility with tools like dbt’s Semantic Layer, enabling teams to adopt the standard immediately without redoing foundational models or work processes.
That’s a key win for engineering and data teams. You’re not starting from scratch. You’re extending what’s already built.
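As a rough illustration, here is a simplified semantic model in dbt’s MetricFlow-style YAML (abbreviated; see dbt’s documentation for the full schema). The promise is that a definition like this, which many teams already maintain, could travel to other OSI-compliant tools instead of being re-authored per platform:

```yaml
# Simplified dbt-style semantic model (MetricFlow YAML, abbreviated).
semantic_models:
  - name: orders
    model: ref('orders')
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum

metrics:
  - name: revenue
    label: Revenue
    description: Sum of order totals
    type: simple
    type_params:
      measure: order_total
```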
Ryan Segar, Chief Product Officer at dbt Labs, summarized it well: “Data and analytics engineers will now be able to work with the confidence that their work will be leverageable across the data ecosystem. Re-work and double work will be a thing of the past.” That statement hits on what matters most: velocity, efficiency, and reuse.
For enterprises balancing dozens of dashboards, data pipelines, and business apps, this is a clear productivity win. By enabling semantic layers to communicate across tools and platforms, OSI cuts down cycle times and removes setup bottlenecks. It also protects prior investments, which avoids unnecessary tool churn and retraining.
Executives should focus here: OSI isn’t just a better technical approach, it reduces friction across the operating model. It’s ready to boost workflows today, not two quarters from now. That matters when timing is a competitive advantage. Deploy faster, reuse more, and remove redundancies from your AI pipeline using infrastructure that’s already in place.
Data standardization shifts competition to value-added innovation
In enterprise tech, differentiation traditionally came from how platforms structured and defined data. That’s changing. OSI is removing this layer as a competitive variable and shifting the focus to where it should be: product innovation, AI performance, and user experience. When data semantics are standardized across platforms, the strategic edge moves to who can do more with the data, not who controls the definition.
This is good for business. Vendors no longer have to invest resources in proprietary data interpretation. Instead, they’re free to channel effort into building highly capable AI agents, smarter analytics experiences, and better automation across workflows. It raises the bar for value-added outcomes, not just infrastructure capability.
Southard Jones, Chief Product Officer at Tableau, explained this well: “Standardization isn’t a commoditizer, it’s a catalyst… Our focus is on being the most intelligent, intuitive, and powerful appliance you can connect to your data.” This reinforces a broader shift: the ability to deliver impact faster now becomes the real differentiator.
Francois Lopitaux at ThoughtSpot echoed the same sentiment, emphasizing that even within a common standard, vendors will set themselves apart through their user experiences and how quickly they can deliver insight at scale.
For C-level leaders, this evolution matters. When standards mature, the cost of switching technologies decreases, but the opportunity for better outcomes increases. That makes innovation, not technical lock-in, your primary growth lever. When your teams are working off aligned data definitions, their output becomes clearer, faster, and more actionable across your operations.
Collaborative governance and expanding enterprise demand propel OSI adoption
The speed and interest around OSI aren’t happening by chance. Major enterprises are pushing for solutions, and the companies behind OSI are responding with a governance model built on shared responsibility. No single player controls the standard. Every company is responsible for maintaining its integrations and adapting its platform to meet the specification.
That’s a smart move. It builds trust across the ecosystem and avoids the typical barriers that come with one company dominating a standard. For businesses investing heavily in AI, this neutrality is non-negotiable. It ensures you can adopt OSI without having to restructure or surrender control of your model logic.
Christian Kleinerman, EVP at Snowflake, stated it simply: “The whole point of OSI is that no single vendor controls it.” By maintaining this position, OSI encourages innovation while keeping the playing field open, driving more companies to join and contribute, further strengthening the specification.
Ryan Segar of dbt Labs added, “When semantics are available everywhere, from anywhere, the place where they ‘live’ becomes less relevant. Built anywhere, leveraged everywhere.” That’s an operational reality most C-suites should pay attention to. Data teams can now work more flexibly and collaboratively across the stack.
Diwakar Goel at BlackRock confirmed that OSI is already being positioned within Aladdin, its end-to-end investment management platform, to unify data across public and private markets. That’s not theory, that’s immediate utility at the enterprise level.
For executive leaders, this is a clear signal that OSI isn’t just another option, it’s becoming foundational. The companies backing OSI are betting that shared standards, not isolated control, will unlock AI’s full value. Aligning with that momentum now positions your organization for a more agile and scalable AI future.
Key highlights
- Address semantic fragmentation to unlock AI ROI: Inconsistent business definitions across systems severely limit AI reliability and scalability. Leaders should prioritize unifying data semantics to reduce inefficiencies and build trust in AI outcomes.
- Align ecosystems around open data standards: Snowflake, Tableau, BlackRock, and others have launched OSI to standardize business data semantics. Executives should support industry-wide alignment to increase AI output quality and reduce data prep overhead.
- Shift from proprietary models to open, AI-ready structures: Traditional metadata lacks the context modern AI tools require. Organizations should adopt OSI’s open, SQL-based semantic model to improve accuracy and eliminate costly rework.
- Prioritize immediate integration with existing tools: OSI’s YAML-based structure works with current analytics stacks like dbt’s Semantic Layer. Leaders can improve productivity and speed AI implementation without ripping out existing infrastructure.
- Focus competition on capability, not control: Standardized data definitions shift innovation to user experience, AI intelligence, and value delivery. Executives should concentrate resources on building differentiated products, not guarding proprietary logic.
- Support neutral, collaborative governance to scale adoption: OSI is governed collectively, enabling flexibility and neutrality across platforms. Organizations should engage early to influence direction, accelerate deployment, and future-proof their AI strategies.


