Underutilized data sources hold substantial marketing potential

What most companies overlook is that they already possess most of the data they need. The issue isn’t about access. It’s about awareness and activation.

Ali Phelan cut straight to the point: everyone's fixated on first-party data, but third-party data is where the differentiation lies. It gives you access to dimensions you can't see in your own ecosystem: demographics, interests, and behavior outside your platform. It's a broader lens on your customer. When used properly, third-party data sharpens segmentation and targeting like few other tools.

Melanie Harris pointed out that direct traffic, users arriving on your platform without a clear referral, often gets ignored. It’s dismissed because it doesn’t track neatly from ads or campaigns. But these folks are high-intent users. They typed your URL, bookmarked your brand, or circled back because it stuck with them. That’s real behavior. And that’s data gold.

First-party engagement data, the signals from your apps and websites, is wrongly considered ordinary. But John Heywood framed it well. When aggregated and unified, it becomes one of the most precise signals of user preference. It’s real-time, it’s contextual, and it’s owned outright.

Many companies are sitting on these resources already. They just need to organize them, integrate them, and actually use them.

Having this data is not a strategy. Making it usable is. C-level leaders need to move from passively storing data to deploying it with precision.

Bring first-party insights, third-party attributes, and direct traffic activity into a central platform. Tag them. Clean them. Link them. Viewed in isolation, each is incomplete. Together, they create a reliable signal that supports better decision-making across product, sales, and service functions.
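To make "tag, clean, link" concrete, here is a minimal Python sketch that merges records from hypothetical first-party, third-party, and direct-traffic sources into one profile per user, tagging each attribute with its origin. All names and fields are illustrative, not a reference to any specific platform.

```python
from collections import defaultdict

# Hypothetical records from three separate sources, keyed by a shared user ID.
first_party = {"u1": {"pages_viewed": 14, "last_app_session": "2025-10-01"}}
third_party = {"u1": {"age_band": "35-44", "interest": "autos"}}
direct_traffic = {"u1": {"direct_visits": 3}}

def unify(*sources: dict) -> dict:
    """Merge per-user attributes from each source into one tagged profile."""
    profiles: dict = defaultdict(dict)
    for name, source in zip(("first_party", "third_party", "direct"), sources):
        for user_id, attrs in source.items():
            profiles[user_id][name] = attrs  # keep provenance via the source tag
    return dict(profiles)

unified = unify(first_party, third_party, direct_traffic)
print(unified["u1"]["third_party"]["age_band"])  # "35-44"
```

Keeping the source tag on each attribute preserves provenance, which matters later when one source turns out to be stale or unreliable.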

Companies already have the fuel. What they’re missing is the engine to make it work.

Integration gaps and data silos obstruct effective utilization of available marketing data

Here’s the situation most executives don’t see clearly: the problem isn’t data quantity. You’re surrounded by data. The problem is silos, mismatches, and misalignment.

Integration is where everything breaks. Systems aren’t connected. Data flows half-complete. CRM doesn’t talk to the email platform. App data never meets site analytics. You lose context, and without it, segmentation, targeting, and decision-making collapse.

John Heywood, VP of Product at Braze, was direct: access isn't the issue anymore. "It's data quality." You need clean, well-structured, up-to-date data to extract value. Without it, even the best tech stack won't help.

Ali Phelan added that integration issues hit hardest during platform migrations. When email service provider (ESP) data isn't properly synced with CRM or analytics tools, segmentation efforts grind to a halt. It's a systemic failure, not a tool problem.

Melanie Harris took the broader view. Integration failure and siloed data systems reinforce each other. You can’t fix one without fixing both.

The tools are already available. The architecture exists. What's missing is disciplined execution: teams working across departments to build stable integrations so data can actually flow and be used.

C-suite leaders often think that buying new tools will fix integration problems. It won’t. Tooling is just the surface.

Real progress begins when departments commit to a shared data model, align on platform architecture, and enforce data hygiene across systems. This touches customer success, product analytics, even finance.

Executives should push for shared ownership over foundational data systems. Without integration and governance, all your “data-driven” initiatives are noise.

Traditional attribution models are unreliable in modern, fragmented customer journeys

Attribution used to be simple. Now it’s not. Today’s customer doesn’t move through your funnel the way you hoped they would. They move in all directions, across dozens of touchpoints, across devices, channels, and timeframes.

Melanie Harris explained it clearly. Before, a car lead might get submitted, a call would follow, and the deal would close in days. Now there are often more than 65 separate interactions before a buying decision happens; in the automotive industry, that includes more than 20 distinct touchpoints per sale.

The concept of perfect attribution (knowing exactly what drove each decision) is now an illusion. Marketers need to stop waiting for perfect data and move forward with what they already have. Directional insights matter more than exhaustive tracking.
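One way to act on directional insight rather than chase perfect attribution is a deliberately simple credit model. The sketch below is illustrative only (it is not a model any of the speakers endorsed): it spreads conversion credit evenly across every observed touchpoint, trading precision for transparency.

```python
def linear_credit(touchpoints: list[str]) -> dict[str, float]:
    """Spread one conversion's credit evenly across all observed touchpoints.

    A deliberately simple, directional model: easy to explain, easy to audit,
    and honest about not knowing which touch 'really' drove the decision.
    """
    share = 1.0 / len(touchpoints)
    credit: dict[str, float] = {}
    for channel in touchpoints:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# Hypothetical journey: the buyer touched email twice and search once.
credit = linear_credit(["email", "search", "email"])
print(credit)  # email gets two thirds of the credit, search one third
```

The point is not that linear credit is right; it is that a transparent approximation teams actually use beats a "perfect" model that never ships.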

John Heywood pointed out that it's not impossible to operate in this new ecosystem, but it does require centralized, clean, and connected data. He mentioned Braze's work with zero-copy integrations: essentially, keeping data usable across multiple tools without introducing latency or redundancy. That kind of structure helps teams manage complexity without reprocessing data every time they change their stack.

Ali Phelan added an important point on behavior. Some customers follow a long, researched path before converting. Others make split-second decisions. Predictability is gone, and platforms must now be ready to capture intent wherever and whenever it emerges.

C-suite leadership should stop allocating time and budget toward modeling perfect attribution paths. Reality doesn't support that anymore. Instead, focus on readiness: ensuring your systems can collect, clean, and activate data across fragmented journeys.

Abandoning rigid attribution frameworks doesn’t mean abandoning accountability. You still need visibility, but it must be based on adaptable assumptions, supported by real-time data streaming and a working understanding of behavioral diversity across your customer base.

Your teams will make better decisions if they’re empowered to act on clear trends, even if they’re incomplete, rather than wait for systems to provide answers they can’t.

Email metrics are disrupted and can’t be relied on alone to gauge engagement or intent

Email used to give you clean performance metrics. That’s changed. Privacy updates, like Apple’s Mail Privacy Protection and Gmail’s security protocols, have made open rates unreliable. Clicks are slightly better, but even they’re compromised now, routinely polluted by bots or pre-rendering systems.
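To illustrate why raw clicks mislead, a team might filter out likely machine activity before reporting. The heuristics below (a two-second reaction floor and a few user-agent hints) are assumptions invented for this sketch, not established industry thresholds:

```python
from datetime import datetime, timedelta

# Assumed heuristics for the sketch: clicks faster than a person could react,
# or from user agents that look like link scanners, count as machine activity.
BOT_UA_HINTS = ("bot", "scanner", "preview")
MIN_HUMAN_DELAY = timedelta(seconds=2)

def is_human_click(sent_at: datetime, clicked_at: datetime, user_agent: str) -> bool:
    """Return True only for clicks that pass both plausibility checks."""
    if clicked_at - sent_at < MIN_HUMAN_DELAY:
        return False  # faster than a person could plausibly open and click
    return not any(hint in user_agent.lower() for hint in BOT_UA_HINTS)

sent = datetime(2025, 10, 1, 9, 0, 0)
print(is_human_click(sent, sent + timedelta(seconds=1), "Mozilla/5.0"))          # False
print(is_human_click(sent, sent + timedelta(minutes=5), "LinkScanner-Bot/1.0"))  # False
print(is_human_click(sent, sent + timedelta(minutes=5), "Mozilla/5.0"))          # True
```

Even with filtering, a "clean" click still only says someone touched a link; it says nothing about intent, which is why the conversion-based framing below matters more.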

Ali Phelan was blunt about it: email opens are “now meaningless.” Clicks, while better than opens, offer incomplete context. The only real measure of engagement is conversion, whether that’s a purchase, a signup, or another action that ties back to intent.

John Heywood brought another layer into the conversation: customers are scattered. They shift between email, apps, messaging platforms like WhatsApp and WeChat, and social media. When marketers rely only on email data, they lose the broader picture. Most customers don't care which channel they use; they care about clarity, value, and relevance. Brands don't control the journey anymore. Consumers do.

That means you need multi-channel measurement. Not just open and click rates, but channel performance tied back to behavioral patterns across platforms. Emails don’t exist in a vacuum; they’re part of a loop that includes web visits, app engagement, customer support chats, and offline actions. Only when these signals are stitched together can you understand what actually matters.
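A minimal sketch of what conversion-tied, multi-channel measurement can look like once events are stitched under one user ID. All users, channels, and events here are hypothetical:

```python
# Hypothetical unified event stream: (user_id, channel, event_type)
events = [
    ("u1", "email", "click"), ("u1", "web", "visit"), ("u1", "web", "purchase"),
    ("u2", "email", "click"), ("u2", "app", "visit"),
    ("u3", "app", "visit"), ("u3", "app", "purchase"),
]

def conversion_rate_by_channel(events):
    """Share of users touched by each channel who later convert anywhere."""
    touched: dict[str, set] = {}
    converted: set = set()
    for user, channel, event in events:
        touched.setdefault(channel, set()).add(user)
        if event == "purchase":
            converted.add(user)
    return {ch: len(users & converted) / len(users) for ch, users in touched.items()}

rates = conversion_rate_by_channel(events)
print(rates["email"])  # 0.5: one of the two email-touched users converted
```

Notice that the metric credits a channel for conversions that happen anywhere in the loop, which is exactly the shift away from judging email by its own opens and clicks.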

C-level executives need to shift how they measure campaign success. Fixating on legacy metrics introduces risk. Instead, ask your teams: what does the customer actually do after receiving this message? Did they take action? Engage with content? Visit the site? Move closer to purchase?

This also changes how you staff and structure teams. Email is no longer a standalone specialization; it must be tied to omnichannel thinking and measured against broader customer behaviors.

If your marketing playbook still leans on open rates for anything strategic, it’s time to replace it.

Zero-party and first-party data remain undervalued

Marketers constantly underestimate the value of the data they're explicitly given: zero-party and first-party data. This is the data customers actively provide, or the data generated by their actual behavior on your owned platforms. It's clean, high-intent, and permission-based. Yet it often goes unused or misapplied.

John Heywood made the key point: if you want people to willingly give you information, you need to offer them value in return. That could be relevance, personalization, or visible improvements in service. Customers aren’t passive here, they expect an exchange. The companies that communicate what users get in return will get better data, and more of it.

Melanie Harris took aim at outdated CRM logic. Many organizations are still operating on frameworks built over a decade ago. These old systems compress complex, multi-month customer learning cycles into single data points, like flagging a five-year vehicle research cycle as a "walk-in." That doesn't just miss context; it creates misleading signals.

Ali Phelan pointed out a deeper risk: bad inputs. Sometimes customers submit aspirational answers or outright misinformation. But most are willing to give you real, usable feedback if you make the purpose of the exchange clear and show that you’re using what they shared to improve their experience.

Properly leveraged, zero- and first-party data form the foundation for AI personalization, journey orchestration, and modern CX platforms. But without meaningful governance and clear systems of reciprocity, it just turns into more noise.

Business leaders need to ensure their organizations treat zero- and first-party data as premium assets, not just marketing inputs. They require structured stewardship, constant refinement, and mechanisms for transparency. Let customers know why you’re collecting specific information, and most importantly, show them how it benefits them.

Also recognize that data decay affects even permission-based information. Customer interests shift. Buying timelines evolve. That means your systems must be agile enough to reevaluate, relearn, and capitalize on updated inputs from the same user over time.

Fragmented customer journeys demand centralized data and enforced cross-functional collaboration

Customer journeys are no longer structured, and they're not predictable. They vary from person to person, and change based on channel, timing, and context. As customer-led interactions multiply, the obvious move is to centralize data into one hub where all teams (marketing, operations, product) can act in sync.

John Heywood explained that in the short term, teams need to focus on bringing all digital signals (email, web, app) into a unified platform. This gives you the baseline needed to build journey experiences that reflect each customer's real front-to-back touchpoints, regardless of channel boundaries.

Ali Phelan took it further. She called out the political aspect. Many integration issues are not technical; they're about data ownership, team silos, and misaligned priorities. Until executive leadership mandates collaboration, fragmented data systems will persist. You can't innovate on customer journeys when your teams don't even share the same customer view.

Melanie Harris reminded everyone of a basic truth: before deploying AI tools or next-gen orchestration software, start with clarity. Ask one question: what exactly are you trying to learn or solve? Without that foundation, teams risk drowning in dashboards.

Centralized data isn't a buzzword. It's the only strategy that can support multi-touch, behaviorally accurate customer engagement. But it only works when leadership aligns every function around shared systems and shared responsibility for outcomes.

Executives often think centralization is just a tooling problem. It’s not. It’s structural. The ability to engage customers seamlessly across their own journey, however inconsistent, requires end-to-end interoperability within your organization.

C-suite leaders need to lead this shift by redefining team KPIs, restructuring workflows for shared accountability, and breaking down legacy incentive systems that prioritize channel-specific success over overall customer outcomes.

This should be a top-down mandate: get the data together, align around a single customer truth, and empower cross-team planning with technology that scales and adapts.

Proactive data hygiene is essential and must be addressed upstream to optimize downstream performance

If you’re not taking data hygiene seriously, you’re undermining every function that depends on that data. Bad inputs corrupt segmentation, personalization, automation, and reporting. The only way to fix it is at the source, upstream, before it hits your systems and spreads through your engagement channels.

Melanie Harris made it clear: your tools must align with your end-state architecture. If your organization is moving toward Adobe, Tealium, or a cloud-based warehouse, your hygiene efforts need to start with that target system in mind. Retrofitting won’t work. You need alignment from the beginning.

Ali Phelan brought up a real concern: marketplaces filled with stale or commoditized data. Many vendors resell old lists or outdated information. When that data gets introduced into your CRM or CDP, it damages trust quickly and can lead to irrelevant outreach, compliance issues, or wasted spend.

John Heywood added that solving hygiene downstream (in platforms like ESPs or campaign tools) is inefficient. Once polluted data flows into marketing ops, cleaning it becomes reactive and expensive. Instead, apply rules, validation, and deduplication processes inside the warehouse or customer data platform (CDP) early in the pipeline. That way, every downstream system benefits from better structure and accuracy.
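A minimal sketch of upstream validation and deduplication applied before records flow into downstream tools. The regex and field names are illustrative, not production-grade (real email validation and identity resolution are considerably more involved):

```python
import re

# Hypothetical raw feed, e.g. a purchased list landing in the warehouse.
RAW = [
    {"email": "ana@example.com", "name": "Ana"},
    {"email": "ANA@EXAMPLE.COM", "name": "Ana M."},  # duplicate after normalization
    {"email": "not-an-email", "name": "Bad Row"},    # fails validation
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple check

def clean(records):
    """Validate and deduplicate records before any downstream system sees them."""
    seen, out = set(), []
    for rec in records:
        key = rec["email"].strip().lower()
        if not EMAIL_RE.match(key) or key in seen:
            continue  # drop invalid rows and normalized duplicates
        seen.add(key)
        out.append({**rec, "email": key})
    return out

print(len(clean(RAW)))  # 1: one valid, unique record survives
```

Running this once in the warehouse or CDP means every downstream tool inherits the same clean view, instead of each team re-fixing the same rows reactively.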

Good data governance isn't about buying the latest tool. It starts with discipline: auditing your sources, managing standards, and having a shared understanding of what "clean" means for your business.

For C-suite leaders, the key takeaway here is that data quality isn't a departmental issue; it's a strategic one. High-performing marketing, sales, and customer experience teams all depend on solid data foundations.

Make sure your teams aren’t just investing in automation or segmentation without strategies to maintain data accuracy across time. Push for regular auditing cycles, track data lineage, and understand the full flow of how data moves, gets enriched, and is used across systems.

Clean data multiplies ROI. Dirty data compounds risk.

Generative engine optimization (GEO) and other AI search trends still rely on core SEO principles

There's a new wave of buzz across marketing: GEO, AEO (answer engine optimization), and the influence of LLMs on how customers find and consume information. But despite the rapid evolution of platforms, the fundamentals of content discovery haven't changed.

Melanie Harris reflected on her background in search, going back to the Yahoo era. She dismissed the hype around new acronyms, stating plainly: the goal has always been to answer user questions, ensure pages are crawlable, and deliver relevant content. If those basics aren’t in place, no optimization, AI-driven or otherwise, will matter.

John Heywood took the conversation further: the definition of authority is shifting. Community platforms, like Reddit, are emerging as trusted content sources in consumer journeys. These are not owned spaces, but they shape perception and influence buying decisions. Brands that ignore these ecosystems lose context, and possibly relevance.

Modern search is multi-source. AI is rethinking how results are delivered, but users still expect accurate answers and credible voices. Engaging in the right ecosystems and keeping content structured and useful continues to matter today just as it did a decade ago.

Executives tempted to chase the latest AI-enhanced discovery trends should assess capability and value before committing. Investing in GEO without aligning your web architecture and content protocols is premature. You don't need to react to every acronym. You need to focus on the outcomes: visibility, credibility, and relevance.

Ensure your teams maintain search visibility across owned content and earned authority sources. If your content doesn’t answer real questions and meet structural expectations, adoption of AI search tactics will offer limited improvement.

AI personas offer promise for behavioral testing but lack emotional depth and human variability

AI-generated personas are entering the market as research tools. They can simulate audience reactions, validate decisions, and accelerate early testing. But they’re not complete replacements for real audience insight. The limitations need to be clearly understood before they’re integrated into key strategy workflows.

John Heywood described how he's used synthetic research tools (AI personas) for scenario testing. They provide directional feedback and can validate broad assumptions. But, as he explained, AI often reflects the biases of its input. It performs best in validation, not exploration. You frequently get back what you put in.

Ali Phelan brought attention to adoption hesitancy in larger organizations. Privacy and data usage reputations matter. Many enterprise clients are pausing implementation until there’s more clarity on legal and ethical safeguards surrounding generative input modeling.

Melanie Harris shared another caution: AI personas can sound reasonable but lack human subtlety. They often fail to capture emotion, unpredictability, and intuition, especially in purchases involving trust, risk, or personal identity. What's missing isn't coherence; it's credibility. In categories like automotive or finance especially, synthetic responses can't fully replicate customer behavior.

AI personas are useful, but with boundaries. They are a tool, not the data source. They’re scalable, but still require human oversight and qualitative validation.

Executives considering AI personas have to distinguish between augmentation and substitution. These models are effective when handled the right way, supporting ideation and validation tasks, but they should not replace voice-of-customer programs, ethnographic research, or qualitative methodologies.

Before deploying personas system-wide, ensure your team documents their use, limits, and data pipeline validations. Treat AI personas as controlled experiments, not user truths.

Legal, privacy, and ethical practices around synthetic profiling must come from the top of the organization. Waiting for regulatory clarity is not an excuse to avoid governance planning.

Asking the right questions is more valuable than chasing new tools or complete data sets

In the current data-heavy environment, marketers often reach for new platforms or emerging technologies before understanding the outcome they’re driving toward. That’s a risk. Without clarity on the question being asked, more technology just adds confusion.

Melanie Harris brought this into focus. She emphasized that before integrating AI tools or investing in new systems, marketers need to define their objective. What’s the decision they need to make? What behavior are they trying to understand? She was direct: “Before AI, write down: What question am I trying to answer?” This helps create restraint, something many marketing teams currently lack.

The obsession with better tools, perfect attribution, and full visibility ignores operational reality. Complete datasets are rare. And that’s fine. Decision-making improves when teams accept that data is sometimes directional, not definitive.

The real advantage comes when leaders focus resources on insights aligned with business goals. Having fewer tools, better questions, and a precise method for feedback creates forward momentum.

C-suite executives shouldn’t expect their teams to deliver perfect analytics pipelines. Instead, they should ensure teams are starting with strong hypotheses and business priorities. Great marketing starts with clarity of purpose, not just access to data.

Investments in AI, machine learning, and customer intelligence software need to follow, not lead, strategic goals. When use cases drive adoption, outcomes improve. When tools come first, fragmentation increases.

Encourage rigor in questioning before funding answers.

The greatest opportunity isn't chasing new tools but refining and activating the data you already have

Many organizations are over-invested in the next wave of martech solutions and underperforming on basic execution. The most valuable assets aren’t hidden in emerging platforms or untested AI workflows. They’re already in your stack, undeployed, undermaintained, or fragmented.

Melanie Harris stated it clearly: “Not all the data will always be there, and that’s okay.” Waiting on perfect connections, chasing new trends, or depending on the next vendor pitch won’t drive results. What will drive results is building discipline around what’s already operating within your infrastructure. That means improving what you do with existing tools, fixing integrations, maintaining hygiene, and acting on available signals.

Too many teams spend their time implementing features they haven’t benchmarked and collecting data they haven’t applied. This results in noise, not clarity. Executives should focus squarely on system performance, data quality, accessibility, cross-team availability, and meaningful use.

Your advantage doesn't come from the complexity of your tech; it comes from how coherently and consistently you use it across teams.

For C-suite leaders, there’s strategic urgency in shifting from accumulation to activation. Make sure your team is improving interoperability across the platforms you’ve already paid for before onboarding new ones. Redundancy and fragmentation not only dilute insights but increase operational risk.

Push for audits that focus on activation: Is collected data being used? Can it be used across tools without transformation? Are the metrics informing decisions, or just dashboards?

Real growth comes from applying discipline and structure around existing assets, not expanding your tech footprint without fully integrating what’s already in play.

The bottom line

If you're in the C-suite and leading marketing, product, or growth organizations, the signal is clear: your next competitive advantage isn't about chasing the next acronym, tool, or AI layer. It's about execution.

You already have enough data. What you need is integration across teams, platform alignment, upstream data hygiene, and real clarity on what business outcomes you’re actually driving. Skip the obsession with completeness. Prioritize usability, access, and action.

Discipline will outperform noise. Leaders who focus their teams on making data usable, fragmented or not, will move faster, adapt better, and make smarter decisions, even under imperfect conditions.

Forget perfect. Focus on precision. Clean what you have, connect what you own, and make every dataset accountable to business results. That’s where real value lives right now.

Alexander Procter

October 20, 2025
