Rapid technological and regulatory change challenges data systems

We’re in a moment where the pace of change is faster than most systems can handle. Data stacks are breaking under pressure from evolving privacy laws and the rapid acceleration of AI. According to Shiv Delval, the challenge isn’t just technical; it’s systemic. The tools we’ve relied on in the past weren’t built for this speed.

Most companies keep stacking tools on top of each other. That’s not the answer. It makes the system slower, more fragile, and harder to govern. To succeed now, you need structural change. Reorganize your data. Get your integrations right. Connect systems in a way that removes friction and actually supports velocity. Real velocity, not the illusion of it.

This kind of reengineering isn’t about perfection. It’s about building infrastructure that scales with complexity. Executives who understand that will move faster than their markets and gain a sustainable edge. It’s not easy, but speed without stability isn’t useful. You need both.

For leaders already investing in AI and customer experience, the implication is clear: if your current stack wasn’t designed for dynamic change, you’re already behind. Rebuilding isn’t optional, it’s strategic. You’ll also need to audit your existing tech from the ground up. If the system can’t adapt quickly to policy changes or enable near real-time decision-making with AI, it’s time to redesign it. The next generation of platforms prioritizes responsiveness over legacy, integration over expansion.

Data quality remains a widespread, unresolved issue

Most data is a mess. And that’s not a criticism, it’s reality. During a session poll, the majority of participants admitted their data was still a “work in progress.” And when data is shaky, your output can’t be trusted. Michelle Jackson from CBIZ shared how she received a “sorry you couldn’t make it” email just after she’d actually attended the event. That kind of disconnect is more common than most organizations want to admit.

Those kinds of failures damage trust. Soften the message when your inputs are unreliable. Shift from “certainty” to “context-rich” interactions. Say “Thanks for your interest” with a recap instead of doubling down on bad assumptions. It’s small, but it removes friction between your brand and your audience.

Delval makes a solid point here: chasing perfect data isn’t efficient. Beyond a certain point, the cost won’t improve business results. That’s a trap. Instead, lean into where AI can help: unstructured data. Conversations, reviews, and text inputs are gold mines if you know how to mine them. Vega agrees, pointing out that people tell you what they want in their own words. You just have to listen.

Executives need to shift their KPIs. Instead of focusing on total data accuracy, focus on decision-ready data. That means data structured enough for systems to act on it, but also agile enough to adapt to dynamic signals. AI doesn’t require immaculate datasets, it requires usable, intent-rich data. Set thresholds that match your use case. Aim to push insight forward, not just polish back-end flaws.

The structure and “center” of the modern marketing stack are evolving

For years, companies have debated where the true “center” of the marketing stack sits. Many default to CRM. And yes, CRMs are still widespread, especially in mid-size organizations. But things have changed. Data warehousing has caught up. Now it’s not just analytics, it’s becoming operational.

In a live poll at the panel session, CRM led as the central stack component. But warehouses weren’t far behind. Delval made an important observation: cloud-native platforms are now capable of running live workloads, moving beyond passive data storage. So, more teams are dropping the idea of transferring data across systems and instead, running tools right on top of the warehouse.

There’s also the matter of identity resolution. Michelle Jackson pointed out an obvious gap in B2B marketing: CRMs tie client profiles to existing work emails. But those identities are temporary. When a buyer changes jobs, they often look like a new lead when they’re actually a warm prospect. Warehouses and customer data platforms (CDPs) can help retain identity consistency across systems and job changes, an advantage for re-engagement and lifecycle marketing.
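As a rough sketch of what that identity continuity can look like, the hypothetical matcher below tries the volatile work email first and then falls back to stable person-level identifiers such as a LinkedIn URL or mobile number. All field names are illustrative, not from the panel.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    person_id: str                    # stable internal key, independent of employer
    work_email: str
    linkedin_url: Optional[str] = None
    mobile: Optional[str] = None

def resolve(lead: dict, profiles: list[Profile]) -> Optional[Profile]:
    """Match an inbound lead to an existing profile.

    Work email is checked first, then stable person-level identifiers,
    so a job changer resolves to a warm prospect instead of a new lead.
    """
    for p in profiles:
        if lead.get("work_email") == p.work_email:
            return p
    for p in profiles:
        if lead.get("linkedin_url") and lead["linkedin_url"] == p.linkedin_url:
            return p
        if lead.get("mobile") and lead["mobile"] == p.mobile:
            return p
    return None
```

The design choice worth noting is the fallback order: match on the identifier most likely to be current, but never let its absence end the search.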

Melissa Vega took a broader view, stating that there really is no single “center.” Instead, think of the stack as connected, modular units working in sync. You don’t need everything centralized in one tool. But you do need consistency, canonical data structures, shared ownership, and clear domain definitions. That’s what enables scale, not just stacking more features.

Executives should note that centralization is less about tools and more about governance. You may use multiple platforms (CRM, CDP, warehouse), but if they operate in silos, you’re not leveraging their full capability. Focus on unify-by-design strategies: consistent identity keys, real-time pipelines, shared definitions. Flexibility in tooling is fine, as long as governance holds it together.

Identity resolution, duplication, and consent are foundational to data utility and trust

You don’t get useful data without identity and consent being locked down. Duplicates are a daily reality, and platforms like CRMs and MAPs (marketing automation platforms) alone don’t fix that. Michelle Jackson called duplication “a very real problem.” Without common keys (email domains, logins, address data), records don’t match, and your systems won’t connect experiences or metrics properly.
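A minimal illustration of key-based matching, with hypothetical field names: records are grouped on a normalized common key, and records without any usable key are kept as singletons rather than merged on a guess.

```python
from collections import defaultdict

def dedupe(records: list[dict]) -> list[list[dict]]:
    """Group records that share a normalized common key.

    Key precedence here: login id, then lowercased, trimmed email.
    Records with no usable key are left alone, not force-merged.
    """
    groups = defaultdict(list)
    singletons = []
    for rec in records:
        key = rec.get("login_id") or (rec.get("email") or "").strip().lower()
        if key:
            groups[key].append(rec)
        else:
            singletons.append([rec])
    return list(groups.values()) + singletons
```

The normalization step is the whole point: `A@x.com` and `a@x.com ` only collide once both are reduced to the same canonical form.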

Expedia’s Melissa Vega shared the B2C version of the identity problem. Even after rolling out unified logins across brands, data fragmentation still shows up. App users versus browser users. Logged-in versus anonymous. Partial profiles can create irrelevant messaging, even if your targeting engine is smart.

Consent is the other half of this. Shiv Delval laid it out clearly: collect consent alongside the data point. Get clarity on what each user has agreed to, and track that over time. Granular, persistent consent is now the expectation, especially in regulated markets. It’s also just good business. Jackson, speaking as both a marketer and consumer, was blunt: “Don’t text my cell while I’m at soccer practice.” To earn access, you have to offer value, immediately and clearly.
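One way to “collect consent alongside the data point,” sketched with illustrative field names: each stored value carries the channel it came from, the purposes the user agreed to, and a timestamp, so “what are we allowed to do with this?” is answerable at any moment.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """A collected value that carries its own consent record."""
    value: str
    channel: str                                  # e.g. "email", "sms"
    purposes: set = field(default_factory=set)    # what the user agreed to
    consented_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def may_use(dp: DataPoint, purpose: str) -> bool:
    """Check a specific purpose against the consent stored with the data."""
    return purpose in dp.purposes
```

Because consent travels with the value rather than living in a separate system, a downstream tool can enforce it without a cross-system lookup.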

None of this works without cross-functional agreement. Data teams, marketers, legal, and product all need to align on what counts as a customer identity and how consent is captured and used. AI, personalization, segmentation: none of it lands unless identity and permissions are stable.

For executives, the takeaway here is strategic risk mitigation. Inconsistent identity resolution not only leads to inefficiency, it damages credibility with customers and can turn into regulatory liabilities. Consent isn’t a checkbox, it’s infrastructure. You can treat consent as a compliance requirement, or as a built-in value layer tied to trust and long-term brand equity. The companies that get this right will have a measurable advantage as data regulations continue to expand globally.

AI can deliver near-term value when properly scoped and supported by reliable data foundations

There’s a lot of buzz around AI right now. Most of it is noise. But if you step back and focus on where it can deliver value today, the returns are real. The panel took a grounded view: AI isn’t magic, it’s a tool. And like any tool, what matters is how you use it and what you give it to work with.

Melissa Vega from Expedia Group pointed to semantic layers as one of the most practical use cases. Instead of hard-coding every possible rule, you structure your data with context. This allows AI models to interpret intent, what the customer is trying to do, not just what button they clicked. That’s immediately useful for dynamic experiences, content matching, and responsive systems.

Michelle Jackson shared an important guardrail: AI should support marketers, not replace them. When AI generates output that’s misaligned or untrue, it’s not helpful. That’s why scope matters. Define the problem, control context, and place trained humans in the loop. Otherwise, you’re scaling bad decisions faster.

Shiv Delval pointed out that unstructured data (calls, reviews, open text) was traditionally underutilized. AI changes that. These untapped sources now hold operational value, if teams have the structure and permissions to act on it. But none of this works if you haven’t resolved identity, consent, and measurement first. Foundations still win.

For executives weighing AI investments, this is the filter: don’t pursue AI until your data is dependable enough to earn back the cost. AI doesn’t correct for disorganized systems, it amplifies what’s already there. Focus on use cases tied to measurable outputs: lower time-to-insight, improved offer matching, real-time surfacing of demand signals. And be wary of friction between teams; AI systems that rely on mixed ownership fail quietly. Establish governance now so AI efforts stand up later.

Strategic stack additions can solve specialized business problems

When it comes to your data stack, more isn’t better. Targeted additions are what matter. Use cases come first, then you choose the right component to solve the gap. The panel shared real-world examples of where this worked.

Michelle Jackson cited account-based marketing (ABM) software as a transformative upgrade for B2B. Her team could focus on the right accounts, personalize outreach, and impact pipeline quality. That’s precision application, not throw-everything-at-it strategy.

Melissa Vega talked about combining Salesforce Data Cloud with Marketing Cloud. For Expedia, the integration was key in surfacing data fast and syncing it across systems without duplication or lag. That’s not just about technology, it’s also about alignment across sales, customer service, and marketing teams using the same signal set.

Shiv Delval mentioned Snowflake’s Cortex/Intelligence, a platform feature that enables natural-language access to data. People don’t need to be data engineers to run powerful queries anymore. That’s meaningful: increasing access speeds up decision-making and reduces dependency bottlenecks.

None of these additions was about “more tools.” Each solved a specific friction point. Track where execution gets stuck, and then install precision-grade tools that remove that block. That’s when your stack stops being overhead and starts being a competitive edge.

For the C-suite, the message is simple: focus capital spending on capabilities, not categories. Too many companies buy platforms and use 10% of the functionality. That’s waste. Instead, align spend with impact. Where in the customer journey are you losing margin or dropping experience quality? Where is the cycle time too long? Use those answers to purchase and implement stack upgrades that work. And validate them with business metrics, not plugin counts.

Organizations should immediately act on five foundational data strategies

The panel didn’t leave theory on the table. They delivered five clear moves every organization can make now to bring structure, clarity, and momentum to their data environments.

The first is defining what “good enough” data means. Stop aiming for unreachable perfection. Instead, agree on what level of data quality is acceptable for each use case. Write it down. Tie it to business outcomes. This gives teams permission to move fast while maintaining confidence.
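Writing “good enough” down can be as simple as a per-use-case threshold table. The sketch below is illustrative; the use cases, metric names, and numbers are placeholders, not panel recommendations.

```python
# Per-use-case data quality thresholds (illustrative values only).
THRESHOLDS = {
    "billing":         {"completeness": 0.99, "dedup_rate": 0.99},
    "personalization": {"completeness": 0.90, "dedup_rate": 0.95},
    "trend_reporting": {"completeness": 0.80, "dedup_rate": 0.85},
}

def good_enough(use_case: str, metrics: dict) -> bool:
    """A dataset is 'good enough' when it clears every written-down
    threshold for that specific use case, and no more than that."""
    required = THRESHOLDS[use_case]
    return all(metrics.get(k, 0.0) >= v for k, v in required.items())
```

The same dataset can pass for trend reporting while failing for billing, which is exactly the permission structure the panel describes: move fast where the bar is lower, hold the line where it matters.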

Second: capture consent at the source. Store permissions along with every data point. You should know, at any moment, not just what data you have, but what you’re allowed to do with it.

Third: establish a canonical identity. Don’t wait for perfect profiles. Decide now which identifiers matter (email, login ID, loyalty number) and align internal systems to use those as consistent reference points. Jackson, Vega, and Delval all returned to the same theme: fragmented identity makes marketing, personalization, and AI all harder than they need to be.

Fourth: productize metadata. Create a semantic layer that works for both humans and machines. When content, events, and behaviors are described and tagged in structured language, it’s easier to generate insight and easier for AI to process. This is infrastructure work, it doesn’t generate headlines, but it powers everything.
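A toy version of such a semantic layer: raw event names mapped to a shared, structured vocabulary that both humans and models can read. The intent and stage values are invented for illustration.

```python
# Minimal semantic layer: canonical metadata for raw event names.
EVENT_SCHEMA = {
    "page_view":    {"intent": "browse",   "stage": "awareness"},
    "add_to_cart":  {"intent": "purchase", "stage": "consideration"},
    "support_chat": {"intent": "resolve",  "stage": "retention"},
}

def describe(event_name: str) -> dict:
    """Attach canonical intent/stage metadata to a raw event name,
    with an explicit 'unknown' fallback instead of a guess."""
    meta = EVENT_SCHEMA.get(event_name,
                            {"intent": "unknown", "stage": "unknown"})
    return {"event": event_name, **meta}
```

Once every event carries the same two fields, a downstream model can reason about intent without hard-coded per-event rules.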

Fifth: act on unstructured data. Take transcripts, reviews, or open-text support logs and extract themes. Do this weekly, not quarterly. Route insights to the right teams and make a habit of operationalizing what customers are telling you in their own words.
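A deliberately simple sketch of that weekly theme extraction, using keyword sets as a stand-in for whatever NLP model a real team would use; the themes and keywords are invented for illustration.

```python
from collections import Counter
import re

# Hypothetical theme keywords; a production pipeline would use a model.
THEMES = {
    "pricing":  {"price", "expensive", "cost", "refund"},
    "shipping": {"shipping", "delivery", "late", "arrived"},
    "support":  {"support", "agent", "wait", "help"},
}

def weekly_themes(texts: list[str]) -> list[tuple[str, int]]:
    """Count how many open-text inputs (reviews, transcripts, support
    logs) touch each theme, most frequent first."""
    counts = Counter()
    for text in texts:
        words = set(re.findall(r"[a-z']+", text.lower()))
        for theme, keywords in THEMES.items():
            if words & keywords:
                counts[theme] += 1
    return counts.most_common()
```

Run weekly over the latest batch and route the top themes to the owning teams; the habit matters more than the sophistication of the extractor.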

These five steps aren’t just checkboxes. They build a more durable system. One that can support customization at scale, without breaking. One that can adapt to new technology, without pausing for rework.

C-level leaders should standardize these five actions across business units. A fragmented approach, where each department improvises metadata standards or identity rules, will only increase system friction over time. Treat these strategies as tier-one operational must-haves, not optional tech initiatives. And bake them into onboarding, process design, and tech selection criteria from the beginning.

Effective stacks are flexible and balance centralized governance with modular innovation

The panel closed with a sharp view of what a modern data stack should be: modular, governed, and responsive. Nothing lasts forever in tech; what matters is your ability to evolve without destabilizing your foundation.

Rigid systems collapse under scale and product complexity. But a stack that’s completely decentralized falls into chaos. Success is in the balance: centralize your governance standards (identity keys, consent frameworks, data schemas) while keeping your tooling modular and purpose-specific.

Delval got to the point: centralize your governance, not your tools. Vega emphasized clarity over centralization. Modular systems work, as long as ownership is defined, pipelines are clean, and system boundaries are known. Without that, complexity invites breakdown. Jackson grounded it in B2B reality: if your data breaks when a contact switches jobs, you’re not ready for scale.

The message here isn’t about what platforms to buy. It’s about architectural philosophy. Do your systems allow for agility without sacrificing trust? Can you swap components, onboard new tools, interpret new behaviors, without redefining your toolkit every time?

For executives, stop measuring stack maturity by the number of integrated tools or the size of your platform suite. Measure it by how well your systems flex under change. When AI models evolve, when privacy laws shift, or when your GTM strategy pivots, you need to respond without rerouting your entire stack. Governance gives you that stability. Modular design gives you that freedom.

Recap

The landscape has shifted. AI is no longer optional, and your stack either keeps up, or holds you back. That doesn’t mean rebuilding everything. It means designing aggressively around what works: modular components, clear data ownership, real-time consent, and usable identity frameworks. Ignore the noise. Focus on what scales with precision and respect.

Strong governance isn’t restrictive, it’s your enabler. It gives your teams room to move fast without breaking trust. And the right stack investments don’t chase trends. They remove friction. They unlock value. They help your people make decisions faster, with data that actually means something.

This isn’t a future conversation. It’s an execution priority. Control the structure. Prioritize clarity. Build infrastructure that matches the pace of today’s AI, not yesterday’s marketing plans. The companies that get this right will move faster, serve better, and adapt without hesitation. That’s the advantage.

Alexander Procter

October 23, 2025
