Marketing and sales are leading AI adoption, but training hasn’t kept up

Marketing and sales teams are moving fast with AI. That's where much of the early innovation is happening: automating content, speeding up analytics, managing customer data at scale. If you're in the C-suite, chances are your sales or marketing lead is already using AI to drive campaigns or streamline operations. The question isn't whether they're using AI; it's how well.

Unfortunately, that speed hasn't been matched by skill development. In a recent General Assembly survey of more than 300 professionals in the U.S. and UK, 68% said they're already using AI at work. That's widespread adoption. But only 17% have received AI training specific to their job. The rest? Generic sessions, YouTube videos, trial and error.

That doesn’t scale.

Tools are evolving fast. AI agents are no longer just suggestion engines; they operate in the background, executing multi-step tasks with little human input. Give them vague instructions or the wrong parameters, and the outcomes become unpredictable. That's bad for your funnel and your brand.

Jourdan Hathaway, Chief Business Officer at General Assembly, put it clearly: “Sales and marketing teams have been early and avid adopters of AI, but a persistent skills gap prevents them from reaching their full potential.” That's exactly right.

If you’re a CEO or a CRO, you need to know how much value you’re leaving on the table. Teams without tailored training are winging it. That’s manageable at low scale, but today’s tools go far beyond that. Wherever AI is handling customer-facing work, bad inputs can become public fast.

So, what's next? Replace one-size-fits-all AI training with role-specific tracks. Focus on repeatable processes that teach people how to work with AI, not just understand it. Build competence in high-impact areas like campaign planning, sales automation, and customer segmentation. The tech is accelerating; your team needs to as well.

Teams use AI for core revenue tasks, but the tools aren’t always approved

AI today is becoming a foundational part of how people get work done. The data tells the story. According to that same General Assembly survey, 57% of professionals use AI for content generation. Another 49% use it to dig into market insights. Then there's sales ops (47%), customer relationship management (42%), and ad optimization (41%). These are not fringe jobs; they're central to top-line growth.

But the execution still has issues. Almost half of all people using AI report doing so with unapproved tools. In finance, 56% admit this. That should raise an eyebrow.

Here's what's happening: the tools approved by IT often don't match the speed or flexibility the teams need. So they turn to public platforms or third-party tools paid for on a discretionary budget. That creates a fragmented ecosystem. You've got entire teams making AI decisions on the fly without any guardrails. Some are doing genius-level work. Others are exposing you to risk: data leakage, poor brand outputs, or bad decision logic that scales.

Only 47% use officially sanctioned tools. The rest are taking a “best tool for the job” approach, with 21% on public free tools and another 21% on employer-funded, non-approved platforms. And maybe that’s fine in R&D. But for marketing and sales, where public content and customer messages are involved? Not good enough.

You don't have to lock everything down. But you do need to standardize it. Get input from your marketing and sales teams directly; find out what's actually helping them move faster. If your approval process is too slow, rework it. Because right now, your teams are acting faster than your policies. That's unsustainable.

Training quality directly impacts AI ROI, and right now, results are all over the place

Let’s talk about return on effort. A lot of companies are seeing AI help with productivity, speed, and strategic focus. That’s expected. Two-thirds of professionals in the General Assembly study say AI is freeing them up to work on more valuable tasks. More than half report productivity gains. And 90% say it helps them and their teams make faster decisions. These are good signals.

But they’re not universal.

A significant share of professionals, 22%, say AI hasn't improved productivity at all. Another 18% say it's added to their workload. That should tell you something. The tools work, but only when people know how to use them properly.

AI can streamline operations, but if someone is spending more time correcting AI outputs or learning through trial and error, the net benefit disappears. These teams aren’t failing, they’re unprepared. That’s on leadership. Companies have poured money into AI tools but haven’t done enough to build job-relevant capability around them.

You don’t need theory-based modules that skim the surface. You need function-specific training tied to real workflows: creating better campaigns, closing more deals with less back and forth, answering key market questions without five internal follow-ups. High-performing teams are customizing AI to their output needs. Everyone else is guessing.

The takeaway here is simple: AI training can’t be one-size-fits-all anymore. If the goal is to drive efficiency and real leverage from these tools, then delivering targeted, practical education is non-negotiable. Otherwise, you’ll keep seeing mixed results and wondering why some teams accelerate while others stall.

AI’s revenue impact is still unclear, but customer experience is gaining ground

There’s enthusiasm around AI, but not enough clarity on outcomes that matter most, like revenue. Based on the General Assembly survey, just 39% of professionals say AI is clearly boosting the bottom line. That’s not where it should be.

On the other hand, 54% believe AI has improved customer experience. That's more consistent. Better targeting, quicker responses, smarter personalization: these are areas where AI is already showing value.

The revenue gap might exist because many companies haven't yet integrated AI deeply enough into measurable revenue-driving activities. Or they're not tracking the right AI-related metrics. Either way, it's time to bring better structure to how we evaluate performance. If the goal of adopting AI is to generate ROI, we need to define how that's measured: automated processes, faster pipelines, improved conversions. Then tie AI implementation directly to those outcomes.

C-suite leaders need to step in here. Set an expectation that every deployment of AI, especially in marketing and sales, must tie to a measurable result. That could be revenue, cost savings, NPS, or efficiency within a process. The tools are improving rapidly. But if you’re not tying their use to clear metrics, you’re not going to get the visibility you need to make good decisions about scale or reinvestment.

So yes, AI is helping with customer experience. That’s worth protecting and building on. But the next phase is making sure it’s also driving measurable revenue growth, or, at the very least, not diverting resources without a serious return.

Clear guidelines and role-specific training are now essential to scale AI safely and effectively

AI adoption is accelerating, but internal governance hasn't caught up. Most professionals aren't required to use AI at all; only 11% say it's mandated. The rest are either encouraged or left to figure it out on their own. That's a structural problem. Decentralized adoption without guidance increases inconsistency and risk.

According to General Assembly’s data, 21% of professionals are using free, public AI tools. Another 21% use employer-funded tools they’ve personally selected, outside any formal approval process. This tells us companies are missing an opportunity to shape adoption in a way that aligns with business objectives and risk standards. These tools operate on real data, influence customer-facing outputs, and can affect how decisions are made across the organization.

The training gap makes this worse. Many users are piecing together their knowledge from scattered sources while engaging with tools capable of operating autonomously. When people lack role-specific understanding, they lean on trial and error. And when you scale decisions made this way, small errors quickly become expensive ones.

Your teams don't need more generic theory. They need access to practical, modular training that reflects real-world workflows: automating lead follow-up sequences, building audience segments, improving pitch content, or developing scalable ad strategies. They also need clarity on which tools are sanctioned, how data should be handled, and what the compliance boundaries are.

Peer learning, updated modules, and contextual examples will accelerate skills acquisition faster than a classroom session or outdated corporate video. But even the best training won’t help much if companies don’t define how AI should be used in the first place.

Leadership must take ownership of this. Without structured training and clear policies, AI usage is inconsistent at best, and at worst, results in legal or reputational damage. If you’re trying to drive performance at scale, uncontrolled experimentation isn’t efficient. It’s fragmented, and it’s not aligned with your strategic goals. The fix is simple: structured frameworks, supported by dynamic, role-based training that evolves as the tech does.

Key takeaways for decision-makers

  • AI adoption in sales and marketing outpaces training: Most professionals are using AI without job-specific training, creating a gap in capability. Leaders should invest in tailored learning programs to boost effectiveness and reduce risks tied to misuse.
  • Teams are using AI tools without oversight: Nearly half of AI users operate outside approved systems, exposing companies to compliance and brand risks. Executives should enforce clear governance and standardize tool usage across teams.
  • Inconsistent training drives mixed performance results: While AI is improving productivity and decision speed for many, poorly trained users see little to no benefit. Leaders must align training strategies with real workflows to ensure consistent impact.
  • Revenue impact remains unclear despite CX gains: Only 39% of professionals link AI to revenue, but over half see improved customer experience. Companies should define clear KPIs that tie AI usage to financial outcomes and customer-facing performance.
  • Lack of guidance increases organizational risk: With only 11% of professionals required to use AI and many using unsanctioned tools, AI adoption is fragmented. Executives should combine role-specific training with structured usage policies to scale AI safely and effectively.

Alexander Procter

October 13, 2025

8 Min