
Most legacy enterprise data centres were designed to handle fewer than 10 kilowatts of electricity delivered per rack. The latest AI servers demand closer to 100 kilowatts. That tenfold gap captures the central challenge facing financial services leaders trying to scale AI in 2026. Modernising data centre infrastructure thus becomes an imperative.
At a panel discussion, sponsored by Supermicro and Nvidia and held at LSEG Studios in December 2025, practitioners from HSBC, Schroders and AI platform provider Articul8 gathered to examine why so many financial services AI projects fail between pilot and production, and what the industry must do differently in 2026. The conversation revealed a consistent theme: technical capability alone guarantees nothing.
The prize for getting it right is substantial. Nvidia’s State of AI in Financial Services report found that nearly 70% of surveyed firms report revenue increases of 5% or more from their AI initiatives, while more than 60% report cost reductions of 5% or more. The same research, published in February 2025, also found that only 36% of firms were launching pilot systems for AI or machine learning governance frameworks, up from 21% the previous year. The gap between leaders and laggards is widening.
Financial services organisations are investing heavily in their technology stacks, but they also need to focus on developing and nurturing talent
Georgios Kolovos, Nvidia’s EMEA payments and fintech lead, described the scaling challenge using a “full-stack approach” built around a “three plus one” strategy. The three foundational elements of applications, data and compute infrastructure form the core technology stack, while the “plus one” refers to people and talent, the crucial human component that drives success. “Organisations taking a holistic, platform-based approach, rather than a purely tactical one, are the ones pulling ahead,” Kolovos said. “Financial services organisations are investing heavily in their technology stacks, but they also need to focus on developing and nurturing talent.”
The infrastructure piece, at least, is advancing. Shahzada Sufyan is the senior director of solution management at Supermicro, which designs and manufactures AI-optimised servers and data centre infrastructure. He described how liquid cooling now extracts up to 98% of the heat from servers, dramatically reducing energy costs across the entire data centre while enabling greater density. “If you walk into a liquid-cooled data centre, you don’t need earplugs,” he joked. It is a small detail hinting at the scale of engineering progress. Supermicro’s team recently built part of an AI data centre housing 100,000 GPUs in 122 days, a pace that would have seemed implausible two years ago.
Yet infrastructure without a strategy results in expensive hardware sitting idle. Jeff Valane, group head of AI management and strategy at HSBC, likened the challenge to building highways: organisations must establish converged infrastructure so everyone can move fast, but highways alone don’t guarantee valuable journeys. “Use cases built in a silo really struggle for scale,” he observed. “What distinguishes successful deployments is a common architecture, shared observability and integrated risk management.”
Productivity gains – scale challenges
Consider the wins the financial services sector has already secured. HSBC has deployed a productivity tool to more than 170,000 colleagues, while over 30,000 developers use coding assistants, delivering a 15% efficiency gain in software release frequency.
At Schroders, an AI system accelerates the creation of investment memos by extracting information from source documents. “We then created an investment committee agent with access to our historical deals to review that memo and suggest edits for the analyst,” Charlotte Wood, global head of AI and innovation at Schroders, explained. “This frees up time for the analyst to assess the investment case and means that we can evaluate more targets”.
Wood was candid: organisations will struggle to scale impactful use cases across business areas without more of a platform in place. Kolovos observed the same pattern across clients, with fraud and customer experience teams building separate AI tools that never connect, duplicating effort rather than creating what he called “a single view of the customer”.
McKinsey’s November 2025 research offers a clue to a winning approach. Organisations reporting significant AI returns are twice as likely to have redesigned workflows before selecting technology. Most have done it the other way round.
Beyond organisational challenges lies a more basic problem: the economics of scaling AI behave nothing like traditional software. Token-based pricing that seems manageable during pilots becomes punishing at enterprise scale.
Jim Berridge, strategic accounts and partnerships manager at Articul8, put it bluntly: “The token-based pricing crushes solutions once you go beyond pilot.” Unlike traditional software-as-a-service, where the marginal cost of scaling is negligible, AI deployments incur costs that compound as usage grows. Valane described “a compounding effect” as organisations add risk-monitoring layers, such as groundedness checks and similarity scoring, each of which consumes further tokens.
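The compounding the panellists describe can be made concrete with back-of-envelope arithmetic. The figures below (per-token price, tokens per query, number of guardrail passes) are illustrative assumptions, not numbers cited by the panel:

```python
# Illustrative token-cost model: each risk-monitoring layer (groundedness
# checks, similarity scoring, etc.) re-processes the output, multiplying
# token consumption. All figures are hypothetical.

def monthly_token_cost(queries_per_month, tokens_per_query,
                       price_per_million_tokens, guardrail_passes):
    # Base generation plus one extra pass per guardrail layer.
    effective_tokens = tokens_per_query * (1 + guardrail_passes)
    total_tokens = queries_per_month * effective_tokens
    return total_tokens / 1_000_000 * price_per_million_tokens

# A pilot: 10,000 queries a month, no guardrail layers.
pilot = monthly_token_cost(10_000, 2_000, 10.0, 0)
# Enterprise scale: 5 million queries a month, three guardrail layers.
production = monthly_token_cost(5_000_000, 2_000, 10.0, 3)

print(f"pilot: ${pilot:,.0f}/month")           # $200/month
print(f"production: ${production:,.0f}/month")  # $400,000/month
```

Even with these toy numbers, a 500-fold increase in query volume produces a 2,000-fold increase in cost once guardrail layers are added, which is why pricing that looks manageable in a pilot can become punishing in production.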
Berridge’s company built a financial analyst co-pilot for a US asset manager, ingesting more than 2 million documents. They reached production in roughly six weeks, but only by working within existing data structures. “Our success came from working within the customer’s existing data structures and security perimeter, rather than requiring any re-architecture or movement of data,” he said. Speed came from restraint, not ambition.
Rethink validation, not just headcount
Conventional wisdom holds that talent shortages constrain AI adoption. Berridge, though, offered a dissenting view: “Most large financial institutions have deep engineering talent. What slows adoption is not capability but the cultural shift from deterministic validation to probabilistic evaluation. Engineers need frameworks, clarity and enterprise-level guardrails to experiment responsibly.”
Wood’s experience supported this interpretation. She described a solution that gave “ridiculous” answers and yet passed every traditional software test. The team needed to understand that AI systems require different questions: not “does it work?” but “is it right?” Training existing engineers to work with LLMs, she suggested, matters more than hiring specialists. This makes Kolovos’s “plus one” talent strategy tangible: the human element that infrastructure investment cannot replace.
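The shift from deterministic validation to probabilistic evaluation can be sketched in a few lines. The word-overlap score and pass thresholds below are hypothetical stand-ins for whatever groundedness or similarity metric a team actually adopts:

```python
# Deterministic tests assert output == expected. Probabilistic evaluation
# instead samples a non-deterministic model several times, scores each
# answer against a reference, and requires the pass rate to clear a
# threshold. The scoring here is a toy word-overlap measure.

def overlap_score(answer: str, reference: str) -> float:
    # Fraction of reference words that appear in the answer.
    a, r = set(answer.lower().split()), set(reference.lower().split())
    return len(a & r) / len(r) if r else 0.0

def evaluate(model, prompt, reference, samples=20,
             score_threshold=0.6, pass_rate=0.9):
    # model is any callable returning a string; it may vary between calls.
    scores = [overlap_score(model(prompt), reference) for _ in range(samples)]
    passed = sum(s >= score_threshold for s in scores)
    return passed / samples >= pass_rate

# Usage: a stub model standing in for an LLM call.
stub = lambda prompt: "the committee approved the memo"
print(evaluate(stub, "Summarise the decision",
               "committee approved the memo"))  # True
```

The design point is the one Wood makes: the gate asks “is it right often enough?”, not “does it return exactly this string?”, so an occasional imperfect answer does not fail the build while a systematically “ridiculous” one does.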
The panel’s practical advice centred on three principles. First, demonstrate rather than document: proofs of concept are cheap to build and more persuasive than slide decks. Second, connect AI initiatives explicitly to company strategy, not technological novelty. Third, budget realistically: for every pound spent on AI technology, allocate a considerably bigger sum for the engineering capability to build, scale and maintain it in production.
The infrastructure will keep improving: liquid cooling, faster chips, denser racks. Firms that treat these as sufficient will watch competitors pull ahead. The highways are being built. Now the industry needs drivers to realise the benefits of AI.
For more information about how to scale the adoption of AI in financial services, visit supermicro.com