Microsoft is transforming Fabric

Fabric isn’t just a data platform anymore; it’s evolving into something more useful. Microsoft is reshaping Fabric into a central layer for decision-making inside the enterprise. These aren’t just technical upgrades; they amount to a functional shift that changes how companies operate and compete in real time.

The new version of Fabric is designed to break down data silos and integrate structured and semi-structured data sources directly into unified workflows. This includes tight integration with systems like SQL Server and Cosmos DB. It takes data that was usually scattered and fragmented, brings it into one place, and then runs real-time analytics on top of it. That means insights come faster, cleaner, and closer to the source of truth.

For executives, this lowers barriers to making critical decisions. You don’t need to wait for batch reports or jump across systems; Fabric brings operational, telemetry, and analytical data into a shared infrastructure. Whether you’re working on product development, logistics, or customer engagement, the impact is immediate. Faster access to insights leads to faster decisions. Faster decisions mean tighter optimization. That’s the game.

Michael Ni, Principal Analyst at Constellation Research, described it well. He said Fabric has moved beyond being just a data platform; it’s now a “decision infrastructure layer.” That matters. Enterprises no longer need to treat data, AI, and strategy separately. They work together now.

From a platform efficiency perspective, Microsoft is also focusing on simplified integration, a long-standing blocker for enterprise IT. Fabric’s architecture is designed to reduce complexity and cost while giving data teams more reach. The shift lets a business move toward more accurate, customizable agentic applications: software agents designed to make decisions and act autonomously, powered by real-time data.

Integration of Cosmos DB within Fabric

Adding Cosmos DB to Fabric isn’t just a product feature; it’s a structural improvement for enterprises building agentic AI applications. These applications rely on reliable, responsive data. In many companies, that data is fragmented across systems, making it difficult to train accurate models or generate meaningful insights in real time. Microsoft’s move to integrate Cosmos DB into the Fabric platform eliminates a major obstacle: scattered semi-structured data.

Cosmos DB handles high-volume, semi-structured workloads: think event logs, customer interactions, or IoT data. By plugging Cosmos DB into Fabric, Microsoft is creating a unified data layer that processes structured and semi-structured data together. This isn’t cosmetic; it’s about performance and precision. When an intelligent application pulls from a single, integrated source of current data, the decisions it makes will be sharper and more relevant.
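To make the idea of analyzing semi-structured data alongside structured records concrete, here is a minimal, hypothetical sketch in Python using pandas. The customer table stands in for a structured source (such as SQL Server) and the JSON-like event documents stand in for Cosmos DB items; none of this is the Fabric or Cosmos DB API, and the data is invented.

```python
import pandas as pd

# Structured data: a customer table (hypothetical stand-in for a SQL source)
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "tier": ["standard", "premium"],
})

# Semi-structured data: nested event documents (stand-in for Cosmos DB items)
events = [
    {"customer_id": 1, "event": "login", "meta": {"device": "mobile"}},
    {"customer_id": 2, "event": "purchase", "meta": {"device": "web", "amount": 42.0}},
    {"customer_id": 2, "event": "login", "meta": {"device": "web"}},
]

# Flatten the nested documents into columns, then join with the structured table
flat = pd.json_normalize(events)          # "meta.device", "meta.amount" become columns
joined = flat.merge(customers, on="customer_id")

# A simple "unified" analysis across both sources: events per customer tier
counts = joined.groupby("tier")["event"].count()
print(counts.to_dict())  # {'premium': 2, 'standard': 1}
```

The point of the sketch is the join itself: once semi-structured documents are flattened next to structured rows, a single query can answer questions that would otherwise span two systems.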

Arun Ulag, Corporate Vice President of Azure Data, explained the implication clearly. He said this integration allows enterprises to pull semi-structured data into Fabric, where it can be analyzed alongside structured datasets. The result is an operational foundation that supports the development of intelligent agents, AI systems that evaluate data, draw conclusions, and take action on their own.

Michael Ni of Constellation Research took it even further. He pointed out that fragmented data slows enterprises down and makes real-time intelligence harder to achieve. According to him, combining Fabric’s unified operations with Cosmos DB provides the foundation needed to run AI agents that deliver insights as they happen, not after the fact.

For executives, the message is simple. With these upgrades, it becomes easier to deploy AI across lines of business without rebuilding infrastructure. You can extract more value from the data you already have. The system scales, performs on demand, and helps close the gap between signal and action.

This isn’t future-facing; it’s operational now. Cosmos DB’s integration is in preview, which means Microsoft is fine-tuning it for wide release. But for forward-leaning companies building agentic applications, this is a signal: the infrastructure is ready.

The digital twin builder within Fabric

Microsoft has introduced a digital twin builder into Fabric’s Real-Time Intelligence stack. It allows enterprises to model both physical systems (machines, devices, infrastructure) and logical entities (customers, processes, workflows). What sets this apart is its integration directly into a low-code environment, making it more accessible to engineering teams and business-facing developers who want to scale automation across their operations.

Traditionally, building digital twins required specialized platforms, disconnected data pipelines, and significant manual coordination. With Fabric, those constraints are removed. Telemetry, streaming data, logs, and workflows can now be connected directly to a virtual model, enabling real-time monitoring, simulation, and control. This creates a continuous feedback loop between physical activity and digital oversight.
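The feedback loop described above can be sketched conceptually: telemetry updates a virtual model, and the model emits actions back toward the physical asset. The Python below is an illustrative sketch only, not Fabric’s digital twin builder API; the class, threshold, and readings are all invented.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MachineTwin:
    """A minimal digital twin: mirrors a machine's state from telemetry and
    closes the loop by emitting control actions. Purely illustrative."""
    machine_id: str
    max_temp_c: float = 80.0                    # hypothetical safety threshold
    history: list = field(default_factory=list)  # the twin's mirrored state

    def ingest(self, reading: dict) -> Optional[str]:
        """Update the virtual model with one telemetry reading; return an
        action if the live signal crosses the threshold."""
        self.history.append(reading)
        if reading["temp_c"] > self.max_temp_c:
            return f"throttle {self.machine_id}"  # feedback to the physical asset
        return None

twin = MachineTwin("press-01")
actions = [twin.ingest(r) for r in (
    {"temp_c": 62.0}, {"temp_c": 85.5}, {"temp_c": 70.1},
)]
print([a for a in actions if a])  # ['throttle press-01']
```

Even in this toy form, the structure shows why a twin is more than a dashboard: the model holds state (history for simulation and pattern analysis) and participates in control, rather than only displaying readings.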

Arun Ulag, Corporate Vice President of Azure Data at Microsoft, made the point clear: process automation is fully achievable through this digital twin builder. It goes beyond tracking behavior; it allows companies to act on live signals, model results, and adjust in real time using AI-infused workflows.

Michael Ni of Constellation Research emphasized that Microsoft’s approach is different. While standard digital twins offer static models, Fabric’s twin builder operates as a foundational intelligence layer. It incorporates historical patterns, business rules, sensor input, and live data, then refines these inputs into graphs. These graphs guide machine learning feature engineering at scale and generate decision models that downstream agents can apply autonomously.

For executives, this unlocks automation of high-value functions (predictive maintenance, customer behavior modeling, dynamic routing, and operational optimization) without building flows from scratch. This is strategic infrastructure. It takes real-time data and turns it into something actionable that scales across systems, roles, and outcomes. The functionality is still in preview, but the direction is clear: automation is being built into the data layer, not bolted on top.

Enhancements such as the Power BI Copilot and OneLake

Microsoft continues to refine Fabric not only as a technical platform but as a unified business layer where intelligence, automation, and insight are embedded directly within daily workflows. One of the most meaningful updates here is the enhanced Copilot experience in Power BI. This tool allows developers, and increasingly business users, to embed analytics into day-to-day operations with minimal friction.

Copilot is designed to interpret prompts and automatically generate visuals, reports, or metrics in context. It reduces the manual effort and technical barrier traditionally required to derive insights from large datasets. This supports faster performance reviews, operational insights, and strategic planning, all without waiting on dedicated analysts or reporting cycles. The result is compounding speed gains across decision loops.

At the same time, Microsoft has expanded OneLake capabilities within Fabric. OneLake acts as a unified file system within the platform, and the new features focus on easier file storage, data conversion, and compatibility across data formats. This matters more than it seems. Disjointed data storage and inconsistent formatting continue to be major overhead challenges for large organizations. Streamlining those functions makes the platform both faster and cheaper to maintain.
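What “compatibility across data formats” means in practice can be shown with a tiny, hypothetical conversion, independent of OneLake itself: the same records moving from CSV (common for exports) to JSON (common for applications) without a separate pipeline. This uses only the Python standard library and invented data.

```python
import csv
import io
import json

# Hypothetical: a small CSV payload that another consumer needs as JSON.
csv_text = "region,revenue\nEMEA,120\nAPAC,95\n"

# Parse the CSV into records, then re-serialize in the target format.
rows = list(csv.DictReader(io.StringIO(csv_text)))
as_json = json.dumps(rows)
print(as_json)  # [{"region": "EMEA", "revenue": "120"}, {"region": "APAC", "revenue": "95"}]
```

The conversion itself is trivial; the overhead the article refers to comes from doing this across thousands of files, teams, and formats, which is the drag a unified file layer is meant to remove.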

Taken together, updates to Copilot and OneLake are less about individual tools and more about reducing operational drag. Fabric now enables decision-making, reporting, and data utilization without extensive rewrites or added architectural complexity. It brings automation and insight closer to the point of execution.

For executives, this means less time waiting on intelligence and fewer gaps between data and action. Teams can move faster, build with less, and operate with more precision across departments. The platform is maturing into a system that doesn’t just store and analyze, it acts, adapts, and scales.

Key highlights

  • Fabric becomes decision infrastructure: Microsoft is repositioning Fabric as a central decision-making layer, integrating real-time data, AI, and enterprise systems. Leaders should evaluate Fabric as a foundational platform to reduce operational complexity and accelerate insight-driven action.
  • Cosmos DB enables agentic applications: By incorporating Cosmos DB, Fabric now supports both structured and semi-structured data for next-gen intelligent agents. Organizations aiming to deploy AI at scale should prioritize unified data architecture to eliminate fragmentation and increase decision velocity.
  • Digital twins unlock full-process automation: Fabric’s new digital twin builder allows enterprises to create live, AI-powered models of physical assets and business systems. Executives should assess this capability to automate complex workflows and drive adaptive, real-time operations.
  • Power BI Copilot and OneLake streamline action: Enhanced automation in Power BI and improved data handling through OneLake reduce time-to-insight across teams. Leaders should leverage these tools to embed intelligence into workflows while minimizing reporting delays and overhead.

Alexander Procter

June 20, 2025

7 Min