
The supply chains that put food on our plates, petrol in our tanks and chips in everything are stretched tight around the world, and in recent years we’ve seen exactly what happens when they snap.
Whether it’s a container ship wedged in the Suez Canal, a fire at a previously obscure factory manufacturing a critical semiconductor component, or export restrictions on key commodities, the ramifications can be both instant and long-lasting.
But while some of these problems may be hard to anticipate or fix – shifting that container ship is a physical problem – it’s reasonable to assume that automation and AI can help solve or mitigate others.
Supply chain analytics could uncover and quantify the risk and impact of a chokepoint around a key commodity, or help manufacturers develop an alternative. Likewise, AI can enable far more precise inventory management, reducing the chance of under- or over-stocking.
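To make the inventory point concrete, here is a minimal sketch of the classic reorder-point calculation that stock-management systems build on. The function name, parameters and figures are illustrative, not taken from the article or any specific product.

```python
import math

def reorder_point(avg_daily_demand, lead_time_days, demand_std, service_z=1.65):
    """Return the stock level at which a new order should be placed.

    Uses the textbook formula: expected demand over the lead time plus a
    safety-stock buffer of z * sigma * sqrt(lead time). A service_z of 1.65
    corresponds to roughly a 95% service level.
    """
    safety_stock = service_z * demand_std * math.sqrt(lead_time_days)
    return avg_daily_demand * lead_time_days + safety_stock

# Hypothetical part: 120 units/day average demand, 9-day lead time,
# daily demand standard deviation of 30 units.
rop = reorder_point(avg_daily_demand=120, lead_time_days=9, demand_std=30)
```

An AI-driven system would go further, forecasting demand and lead-time variability per SKU rather than assuming fixed averages, but the underlying trade-off between stock-outs and excess inventory is the same.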
On the factory floor, automated quality control can prevent faulty components leaving the factory or reaching downstream customers. And predictive maintenance powered by AI should reduce the likelihood of production lines grinding to a halt, sending shockwaves up and down the supply chain.
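Predictive maintenance of this kind typically starts with anomaly detection on sensor streams. The sketch below uses a simple trailing-window z-score check; a real deployment would use more sophisticated models, and the signal here is simulated.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a sensor reading deviates more than `threshold`
    standard deviations from the trailing window of readings."""
    flags = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# A stable simulated vibration signal with one spike at index 30:
signal = [10.0 + 0.1 * (i % 5) for i in range(40)]
signal[30] = 25.0
```

Flagging the spike before a bearing actually fails is what lets a line be serviced during planned downtime rather than grinding to a halt mid-shift.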
According to research by BCG, the real-world benefits of AI and digital implementations by manufacturers include reductions of up to 10% in the cost of goods sold and a 10-25% improvement in yield.
So, it’s no surprise that BCG found nine out of ten companies plan to implement AI in their production networks, while over two thirds have begun to implement AI solutions.
The challenge is that the digital systems underpinning global supply chains are often fragmented and inconsistent. This means there are distinct problems manufacturers need to solve before they can build truly AI-enabled supply chains.
Some of the reasons for this are easy to spot. Supply chains are, by their nature, distributed, involving multiple organisations and locations. Raw materials may be mined in one country, shipped to another region for refining, then to the factories of multiple manufacturers, before finished goods move into global distribution networks. This makes it challenging to collect the vast and often disparate data that informs decisions about automation.
When data can’t keep up with production
These chains have developed over years, even centuries, to ease the flow of raw materials and finished goods, more or less smoothly. However, the one thing these fragmented architectures have not evolved to move smoothly is data.
Even within a single organisation, a factory might have its own dedicated compute tied to the production line. This might be largely disconnected from the core business network, whether for security or latency reasons. Operations spanning multiple jurisdictions can face data sovereignty or data movement challenges.
That’s before the challenges of legacy software, incompatible systems and separate operational technology platforms are taken into account.
This all contributes to the data gravity challenge for manufacturers, Martin Davies, senior solution architect at Digital Realty, explains. “Historically, you’ve got a high density compute at the core, you pull the data back there, do your work on it and send it back.”
This is particularly problematic because of the amount of data generated at the edge: in factories, throughout the supply chain and across distribution. The knock-on effect touches the entire ecosystem, so careful, business-led decisions are needed to ensure data is processed, stored and transmitted in ways that enhance operations rather than hinder them.
Addressing the data gravity challenge, he explains, opens the door to faster, more informed decision-making, which helps streamline operations. By moving towards predictive maintenance, organisations can prevent issues before they arise and improve overall outcomes. Enhanced visibility into supply chains makes it easier to anticipate potential disruptions, while better stock tracking reduces the risk of unexpected shortages and prevents excess inventory from accumulating unnoticed.
The answer, Davies continues, is to leverage architecture that moves the application, or the compute, closer to the data.
That doesn’t necessarily mean boosting the compute resources at a given extraction or manufacturing location, but it does mean putting the right tools in place with optimal networks. These systems are designed to handle the enormous processing demands of AI workloads, but their complexity, power requirements and cooling needs can make deployment in remote or industrial locations challenging.
Instead, manufacturers need to strike the right balance across on-premise solutions, edge locations, and cloud or co-located infrastructure. This approach ensures data can be processed close to where it’s generated while still leveraging scalable compute resources elsewhere, enabling smoother AI operations across the entire supply chain.
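One common pattern behind "processing data close to where it’s generated" is aggregating raw sensor streams at the edge so that only compact summaries cross the network to central compute. This is a generic illustration of that idea, not Digital Realty’s architecture; the function and data are hypothetical.

```python
from statistics import mean

def summarise_for_core(raw_readings, batch=100):
    """Aggregate raw edge readings into compact per-batch summaries,
    so only a fraction of the raw volume crosses the network to the core."""
    summaries = []
    for i in range(0, len(raw_readings), batch):
        chunk = raw_readings[i:i + batch]
        summaries.append({
            "n": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "mean": round(mean(chunk), 3),
        })
    return summaries

# 1,000 simulated raw readings collapse to 10 summary records:
raw = [20.0 + (i % 7) * 0.5 for i in range(1000)]
compact = summarise_for_core(raw)
```

The balance the article describes is exactly this trade-off: raw data stays and gets processed at the edge, while the smaller, aggregated view travels to cloud or co-located infrastructure for fleet-wide analytics.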
This is where interconnected, carrier-neutral data centres and their networks come into play. In the case of Digital Realty, says Davies, its worldwide network of data centres operates on a single standardised model and are connected by a single service fabric.
Beyond its own network, Digital Realty is carrier neutral, so customers can connect to its compute facilities via whichever available network provider they prefer. Digital Realty also works with partners to establish where manufacturers stand in terms of AI and data maturity. IGXGlobal, for example, brings key expertise and products used to develop and deploy AI infrastructure. Together, the two companies ensure that deployments encompass robust security and optimised infrastructure from the endpoint, across the network and back to the core.
The sort of infrastructure and interconnectivity Digital Realty offers provides much more flexibility when dealing with the broader data challenges manufacturers face, Davies says. “We can build sovereignty into the overall solution and the architecture, then not worry so much about the latency and the pipes, because we’ve got a global backbone which takes that into account.”
This has the potential to massively reduce complexity for manufacturers and their broader supply chains, when it comes to collecting data, moving it and putting it to work.
Boosting sovereignty, cutting complexity
Specifically, it means manufacturers can work towards a much more mature data strategy and take advantage of upcoming AI technologies to manage and streamline their supply chains. Meanwhile, IGXGlobal can contribute expert guidance on both AI and cloud transformation, as well as sector-specific expertise around warehousing and global logistics.
This combination of Digital Realty’s data centre knowledge and industry-specific expertise from IGXGlobal means manufacturers can build the scalable compute and comms infrastructure needed to run those next-generation platforms, and ensure they are tuned to the specific architectural and business challenges manufacturing supply chains face.
This in turn means that previously manual or semi-automated processes can be fully automated at speed. “And I think that’s key,” says Davies.
Implementing AI can also help businesses achieve their sustainability goals by increasing efficiency and reducing waste across their manufacturing and business operations, he adds. Meanwhile, data centre operators like Digital Realty are making rapid strides towards more sustainable operations as they work to accommodate the power demands of highly dense, GPU-based compute infrastructure, benefits that are passed on to their customers.
Manufacturing supply chains have always been complex. In a world that is simultaneously more connected and more volatile, this has made them increasingly vulnerable to shocks.
Artificial intelligence offers the promise of more efficient, more resilient supply chains, if manufacturers can develop the right data strategy. This means they need network and compute infrastructure that is as distributed as their own operations – but which is far more interconnected and resilient.
None of this guarantees that there won’t be unexpected disruptions or geopolitical shocks that reverberate up your supply chains. But with the visibility and flexibility an AI-powered supply chain brings, you’re far less likely to be the one stuck up the Nile without a paddle.
Five questions every manufacturing leader should consider before investing in AI
A clear data strategy is an absolute prerequisite for an effective AI strategy.
But that is just one element that must be in place before investing in AI, says Davies. As technology leaders lay the groundwork for their AI-powered supply chain strategy, they should ask themselves the following questions:
1. Do we have the data centre capacity and power to scale?

AI has changed the face of the data centre. Dense GPU-based compute must be allied with fast interconnect and mass storage, and all of this demands far more power than traditional facilities, as well as sophisticated cooling technologies. These demands can quickly outstrip existing onsite data centre capacity, while planning and power constraints can stymie new developments, crippling manufacturers’ ability to scale up their AI infrastructure.
2. How can we reduce complexity, and with it cost?

This is critical, says Davies. Ultimately it comes down to this: “If I reduce complexity, I’m reducing cost.” Building out AI-capable infrastructure is a challenge in itself. Standardised, interconnected data centres can bring together data and applications more efficiently, while leaving optimisation and operations in the hands of experts bolsters resilience.
3. Can we move and analyse data fast enough?

This is key, says Davies. If AI is integrated into the manufacturing process, any delay in transferring data across regions and analysing it could contribute to downtime or even cause a safety issue. Supply chain analytics and other business systems also rely on large amounts of data. Incomplete data can mean delayed decision-making, or leave leaders with a fragmented picture of their operations.
4. Where is our data allowed to go?

Just because you can move data across regions doesn’t mean you always should. Data sovereignty and compliance regimes must be considered whether you’re a global or regional player, and partners and suppliers must be able to accommodate this.
5. What does AI mean for our sustainability goals?

Sustainability is always a concern for manufacturers, and AI brings its own challenges. GPUs and associated infrastructure consume much more power than traditional kit, and this is only going to increase. The vast amounts of heat they emit must be captured and removed, meaning liquid cooling is now standard for AI-capable data centres. Choosing the right data centre partners can bring benefits such as cutting-edge cooling technology, renewable energy commitments and broader economies of scale. At the same time, the improved efficiency AI supply chains enable can improve manufacturers’ overall sustainability.