Banks are investing in cybersecurity, cloud, and data infrastructure
There’s a clear shift happening in banking. Executives have realized that generative AI can’t scale on old, fragmented systems. That kind of foundation slows everything down: data stays trapped in silos, models can’t talk to each other, and the operation becomes a patchwork of disconnected efforts. That’s why banks are prioritizing investments in cybersecurity, cloud solutions, and modern data architecture.
It’s not just about throwing more compute at the problem. Without secure access to clean, structured data and the ability to move it fast across functions, AI hits a wall quickly. Most large banks are still carrying the weight of outdated infrastructure, and breaking through that limitation means rethinking the stack from the ground up: streamlining data pipelines, cleaning up quality issues, and building secure connectivity across the entire enterprise.
From a leadership perspective, this is about long-term relevance. The banks that move fastest to eliminate bottlenecks across their systems will gain a massive execution advantage. Back-end agility directly translates to front-end capability. It gives you the ability to launch AI features across product lines and react to customer needs without being slowed down internally.
Generative AI is boosting operational efficiency among developers
The findings are real and repeatable. Bank of America rolled out AI-powered coding tools and reported productivity gains of over 20% across its development teams. Citi deployed similar systems to 30,000 developers. Developers are building faster, writing cleaner code, and spending less time on repetitive work.
This is the kind of impact that doesn’t just improve workflows, it changes how teams operate. Developers can push products to market faster, improve system reliability, and spend more time on creative problem-solving. If you manage a large tech team or oversee digital transformation in a bank, this kind of efficiency translates into higher output without hiring more people.
Right now, many firms are still testing the waters. That’s natural. But we’re at a stage where early success stories are maturing into full-scale implementation. Soto Sanchez noted that companies are shifting from experimental AI initiatives to asking sharper questions: What impact does this actually have on performance? Is it moving the needle financially? That’s what matters, and we’re starting to get those answers, with real, quantifiable results.
There’s a bigger upside too. Once AI tools are integrated properly, they keep improving. Models learn, developers adapt, and processes evolve. Smart deployment early on generates compounding returns without increasing complexity.
Banks are expanding AI applications
Generative AI is showing strong value in use cases that are easy to scale: document processing, summarizing customer calls, and research automation. These were all previously slow, manual tasks that consumed analyst time without directly contributing to decision-making. Now, those processes are faster, more structured, and consistently accurate. That’s a gain in both efficiency and insight delivery.
When over two-thirds of financial professionals say they personally use generative AI tools for investment or market research, as reflected in Broadridge’s survey, you know the adoption is meaningful, not experimental. These tools are already part of everyday workflows. Analysts can parse through large data sets, extract what matters, and act quicker. Senior teams now receive condensed, relevant insights without waiting for days of manual work.
This speed doesn’t reduce quality. If implemented properly, generative AI helps analysts check broader datasets without missing crucial patterns. For leadership, this means smaller teams can cover more ground, cross-checking markets, flagging early signals, and delivering better outcomes across risk, investments, and compliance.
But scope matters. These are strong vertical wins, and they’re easier to prove in individual teams than across an enterprise. Michael Abbott put it clearly: this kind of functionality (document summarization, automated transcription) is widely implemented and works. It’s one of the core ways generative AI adds value now.
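Under the hood, document summarization at this scale is commonly built as a map-reduce pipeline: split a long document into chunks, summarize each chunk, then summarize the combined summaries. The sketch below illustrates that shape only; the chunk size is arbitrary, and `summarize_chunk` is a stand-in for a real model call (here it just truncates text so the example stays self-contained).

```python
# Minimal sketch of a map-reduce summarization pipeline for long documents.
# summarize_chunk is a placeholder for an LLM call, NOT a real API; it
# truncates text so this example runs without any external service.

def chunk_text(text: str, max_chars: int = 2000) -> list[str]:
    """Split a document into roughly fixed-size chunks on paragraph breaks."""
    paragraphs = text.split("\n\n")
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) > max_chars:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks

def summarize_chunk(chunk: str) -> str:
    """Placeholder: a real system would invoke a model here."""
    return chunk[:100]

def summarize_document(text: str) -> str:
    """Map: summarize each chunk. Reduce: summarize the joined summaries."""
    partial = [summarize_chunk(c) for c in chunk_text(text)]
    return summarize_chunk("\n".join(partial))
```

Chunking on paragraph boundaries rather than fixed offsets keeps each model call coherent, which is one reason these vertical use cases are comparatively easy to productionize.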
AI won’t scale without a solid platform
When banks start using generative AI beyond isolated teams, the complexity rises fast. AI can’t scale effectively unless it lives on top of a platform that connects to the right data, supports multiple model types, and includes built-in safeguards. That’s the next priority for leaders, moving from scattered tools to unified, secure systems that support organization-wide use.
Scaling horizontally, across functions like compliance, customer service, lending, and IT, requires discipline in governance, platform architecture, and user control. Access has to be role-based. Guardrails must be enforced upfront. Data pipelines need to be monitored for integrity. You can scale fast or scale smart; doing both at once is what creates lasting value.
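The role-based access and upfront guardrails described above can be sketched as a simple policy check that sits in front of every model call. The roles, data sources, and policy table below are hypothetical examples, not any bank’s real access model.

```python
# Illustrative sketch of role-based guardrails in front of an AI platform.
# Roles, data sources, and the policy table are hypothetical placeholders.

ROLE_POLICIES = {
    "compliance_analyst": {"case_files", "policy_docs"},
    "customer_service":   {"call_transcripts"},
    "lending_officer":    {"loan_applications", "credit_reports"},
}

def can_access(role: str, data_source: str) -> bool:
    """Grant access only when the role's policy explicitly lists the source."""
    return data_source in ROLE_POLICIES.get(role, set())

def query_model(role: str, data_source: str, prompt: str) -> str:
    """Enforce the guardrail BEFORE any data reaches the model."""
    if not can_access(role, data_source):
        raise PermissionError(f"{role} may not query {data_source}")
    # A real system would route the prompt plus approved data to a model here.
    return f"[model response for {role} on {data_source}]"
```

The key design choice is deny-by-default: an unknown role or unlisted data source fails the check, which is what makes this kind of guardrail enforceable "upfront" rather than audited after the fact.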
Michael Abbott explained the necessity of this clearly: you need a connected platform, not a patchwork of tools. Models must operate within preset limits, accessing data through secure layers and delivering consistent results. With infrastructure costs decreasing and model efficiency increasing, the opportunity to shift from isolated experiments to enterprise-wide systems is real, and it’s happening now.
This kind of scaling isn’t just technical. It demands retraining teams, aligning data strategies under a shared vision, and building internal systems around AI-ready architecture. It’s about alignment before expansion, something leaders must own directly.
Achieving full ROI from AI initiatives
There’s no real AI ROI without structural readiness. That’s the hard truth leaders are starting to confront. Generative AI can produce fast wins in small, contained use cases, but to get sustained, measurable impact across an organization, the systems, data, and processes behind it all have to be upgraded. Otherwise, the return stays limited to novelty instead of scale.
Many banks underestimated how much friction existed inside their tech landscape. Legacy systems are still common, data is fragmented, and quality issues interrupt even the best model workflows. If the data isn’t clean or accessible, AI just doesn’t work well, no matter how advanced the model is. Executives who want short-term ROI must confront these limitations early or risk stalling out midway.
This divide in expectations is reflected in recent data. Broadridge reports that over one-third of banking leaders expect returns on AI investments in six months or less. Nearly the same number say ROI may take three years. Both are valid, depending on how far along the institution is in modernizing. For companies still navigating legacy environments, the longer timeline is more realistic. Leaders need to match their expectations to their readiness.
Jason Birmingham at Broadridge framed it as a realization many firms now face: you can’t rewrite every legacy system just to make AI work. It’s too time-consuming, too expensive. The smarter path is solving the practical obstacles, cleaning up data, rationalizing access layers, modernizing where impact is high.
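The "cleaning up data" step Birmingham describes typically starts with automated profiling: scanning records for missing or malformed fields before any model ever sees them. A minimal sketch, with field names and validation rules chosen purely for illustration:

```python
# Minimal sketch of automated data-quality checks on customer records.
# Field names and validation rules are illustrative placeholders.
from collections import Counter

def validate_record(record: dict) -> list[str]:
    """Return a list of quality issues found in one record."""
    issues = []
    for field in ("customer_id", "account_type", "balance"):
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    balance = record.get("balance")
    if balance not in (None, "") and not isinstance(balance, (int, float)):
        issues.append("non-numeric balance")
    return issues

def profile_dataset(records: list[dict]) -> dict:
    """Count how often each issue appears across the whole dataset."""
    counts = Counter()
    for rec in records:
        counts.update(validate_record(rec))
    return dict(counts)
```

Profiling like this is cheap to run and gives leaders a concrete, quantified picture of the data-quality gap before committing to a larger modernization effort.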
The upside is that once those basics are fixed, momentum builds fast. AI tools stop being experiments and start acting as force-multipliers. But it all starts with infrastructure, not hype.
Relevant Data or Research: Broadridge found that over one-third of banking executives expect AI ROI in six months, while a similar portion anticipate a three-year timeline. Nearly 50% report struggles with data silos, and 40% cite data quality issues.
Mentioned Individuals: Jason Birmingham, Global Head of Engineering at Broadridge, noted that banks are recognizing the limits of rewriting legacy systems and instead focusing on overcoming practical modernization barriers to capture long-term AI value.
Main highlights
- Prioritize infrastructure to unlock AI scale: Banks must invest in secure cloud platforms, unified data architecture, and modernized systems to enable scalable generative AI adoption and remove operational bottlenecks.
- Use AI to accelerate developer output: Generative AI coding tools are driving over 20% gains in developer productivity; leaders should deploy these across technical teams to shorten delivery timelines and strengthen digital execution.
- Target high-leverage AI use cases first: Functions like document summarization and investment research are delivering quick wins; focus AI rollouts here to immediately boost analyst productivity and streamline decision-making.
- Build platform-level AI strategy: Scaling AI across the organization requires a centralized platform with governed access, multiple model support, and embedded guardrails; tech leaders should avoid fragmented tool deployment.
- Align AI expectations with technical readiness: Real ROI depends on addressing legacy system constraints, data quality, and integration gaps; leaders must calibrate AI investments based on current infrastructure maturity.