Why the metaverse will depend on advances in edge computing

Web 4.0 businesses will, in theory, offer highly immersive customer experiences. The high-speed processing these demand will oblige many to move beyond the cloud and adopt edge computing

Web 2.0 brought us user-generated content and interactivity – think Twitter, Facebook, Slack and Zoom. It enabled online startup Dollar Shave Club to build global recognition in 2012 with a YouTube video that cost a mere $4,500 (£3,300) to produce. The company rattled the previously bomb-proof incumbents in its market to such an extent that Unilever reportedly paid $1bn to acquire it four years later.

Web 3.0 has brought us even more disruption, in the shape of technologies such as big data, machine learning and blockchain. Companies are understandably keen, therefore, to get ahead of Web 4.0. Definitions of it vary, but this iteration promises immersive and highly personalised online services, blurring the physical and the digital. 

What will that look like? The metaverse is both the biggest Web 4.0 promise and the biggest threat. It’s a threat to incumbents such as Facebook (now Meta), largely because of its decentralised nature. It’s a promise to startups because they may be able to harness the tech in ways that could give them a competitive edge. 

All sectors are keen to get on the front foot. Microsoft recently spent $68.7bn on gaming company Activision Blizzard. JPMorgan, meanwhile, has created its own metaverse lounge – in which visitors are greeted by, of all things, a virtual tiger.

A less headline-grabbing matter is what CIOs need to do to make such technology work well. 

Firms that offer augmented reality (AR) and virtual reality (VR) in the metaverse will almost certainly have to reconfigure their computing power for the experiences to feel properly immersive. Centralised cloud computing provision, however punchy, won’t be enough. The reason for that is latency – the time lag on the network. 

Swedish telecoms equipment maker Ericsson has pointed out that time-critical video games, such as first-person shooters, need no more than 30 milliseconds of end-to-end network latency to ensure a high-quality experience. The further the data centre is from the end device, the greater that latency. Even on the fastest fibre links, data incurs a latency of 5 microseconds (0.005 milliseconds) for every 1km of cable it travels, according to research by Infinera.
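Those two figures imply a simple back-of-envelope calculation. The constant below comes from the Infinera figure quoted above; the rest is arithmetic, ignoring processing and queuing delays that would only make matters worse.

```python
# How far away can a data centre be before fibre propagation alone
# eats a 30 ms round-trip budget? One-way latency per km is the
# Infinera figure cited above: 5 microseconds = 0.005 ms.

FIBRE_LATENCY_MS_PER_KM = 0.005  # one-way, per km of cable

def round_trip_ms(distance_km: float) -> float:
    """Propagation delay there and back, ignoring all other overheads."""
    return 2 * distance_km * FIBRE_LATENCY_MS_PER_KM

for km in (100, 1000, 3000):
    print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip")
```

On these numbers, a data centre 3,000km away consumes the entire 30ms budget on propagation alone, before a single frame has been rendered, which is why proximity matters.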

That is why serious gamers use expensive hardware that can do the processing then and there. The problem for firms is that consumers are unlikely to want to spend much on special hardware to access metaverse services. What to do?

Gaining a competitive edge

Edge computing, where the processing muscle is placed closer to the data being crunched, is the next step on from the cloud. 

“People have been talking about the edge for two decades, but it has been limited to niche use cases,” says Ishu Verma, emerging technology evangelist at Red Hat, a provider of open-source enterprise software. “Now the idea of placing computing and storage closer to the data sources is being adopted more widely across industry and consumer applications.” 

One important reason for this is that data systems have become much more capable, cost-effective and energy-efficient, so deploying them at the edge on a large scale is far more feasible than it was. 

“In the cloud, you scale up capacity. At the edge, you scale it out to millions of sites,” says Verma, who adds that there is demand across all sectors that need low-latency services or simply want to avoid batch processing. 

Some industries are already far ahead of the pack. Manufacturing companies, for example, use edge computing to construct digital twins – intelligent virtual replicas of physical infrastructure. 

Rolls-Royce, for instance, can offer clients a virtual aircraft engine in flight that responds just as the physical machine does. Sensors on the engine send back some data via a satellite link, although most information is collected after the plane has landed. Machine-learning models feed the digital twin and so serve the end goal: better physical engine design and maintenance. 
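The digital-twin pattern described above can be caricatured in a few lines: telemetry from the physical machine updates a virtual replica, which can then be run through scenarios the real engine never flew. Every name and number here is hypothetical; this is an illustration of the pattern, not Rolls-Royce's system.

```python
# Toy digital twin: ingest sensor readings, then run "what if" scenarios.
# All field names and the load model are invented for illustration.

from dataclasses import dataclass

@dataclass
class EngineTwin:
    turbine_temp_c: float = 0.0
    vibration_mm_s: float = 0.0

    def ingest(self, reading: dict) -> None:
        """Update the twin's state from a telemetry reading
        (e.g. one downloaded after the plane has landed)."""
        self.turbine_temp_c = reading.get("turbine_temp_c", self.turbine_temp_c)
        self.vibration_mm_s = reading.get("vibration_mm_s", self.vibration_mm_s)

    def simulate_scenario(self, extra_load_pct: float) -> float:
        """Crude what-if: project turbine temperature under extra load."""
        return self.turbine_temp_c * (1 + extra_load_pct / 100)

twin = EngineTwin()
twin.ingest({"turbine_temp_c": 900.0, "vibration_mm_s": 2.1})
print(twin.simulate_scenario(extra_load_pct=50))  # 1350.0
```

The point of the pattern is the last line: the twin can be stressed in ways the physical engine never safely could be.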

Rolls-Royce’s chief information and digital officer, Stuart Hughes, says that this simulation facility is important to the firm, because some of the things it does are “really on the edge of physics”. Having a virtual replica of an engine enables the testers to put it through many more scenarios than they could do physically. 

The company’s longer-term aim is to have an engine that is “increasingly connected, contextually aware and comprehending”. 

As things stand, most firms have their main computing capacity in the cloud, although there are everyday instances of edge computing. Each ATM is a tiny data centre, for instance. And the user interface on Amazon’s voice-activated virtual assistant, Alexa, doesn’t rely on round-tripping every piece of data to the cloud. Some of it is analysed on the machine. 


Amazon Web Services (AWS) presents its offerings as a continuum, extending all the way down to internet-of-things sensors on users’ premises. So says AWS’s director of product development, George Elissaios, who adds: “Edge computing is cloud computing.” 

Firms that aren’t cloud hyperscalers still tend to make a clear distinction between the cloud and the edge. The rule of thumb is that the cloud offers economies of scale, more control over processing and greater computing capacity. 

“Training and developing machine-learning models happens in the cloud while inferencing with real-time data happens at the edge,” Verma explains. 
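The split Verma describes can be sketched minimally: fit a model on a historical batch “in the cloud”, ship only its parameters to edge sites, and score live readings there with no round trip. A plain least-squares fit stands in for a real training pipeline; the function names are invented for illustration.

```python
# Cloud/edge split: heavy training happens centrally, and only the
# fitted parameters travel to the edge, where inference is cheap.

def train_in_cloud(xs, ys):
    """Ordinary least-squares fit of y = a*x + b over a historical batch."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b  # the only artefact shipped to edge sites

def infer_at_edge(params, x):
    """Low-latency scoring of a live reading; no network round trip."""
    a, b = params
    return a * x + b

params = train_in_cloud([1, 2, 3, 4], [2, 4, 6, 8])  # learns y = 2x
print(infer_at_edge(params, 10))  # 20.0
```

The parameters are small and cheap to distribute; the training data, which may be enormous, never leaves the cloud.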

Companies also have more than mere latency to consider when they look at the edge. Data sovereignty, for instance, is a thorny issue. As Facebook is finding out in the EU, some jurisdictions don’t like to see data being moved to servers beyond their borders. 

Security is another key consideration. “The approach to take is ‘trust nobody’,” Verma advises. “Any data from a remote device is suspect.” 

That is partly because remote devices can be tampered with far more easily than those in the cloud or on company premises. 

Will edge use displace cloud use? The consensus among experts is that it won’t. 

“I don’t think the edge replaces the cloud. I don’t think the edge competes with the cloud,” says Matt George, director of segment marketing and enterprise transformation for Equinix in EMEA. “As you move along the path [towards more real-time services], what you want is the most agile and flexible IT set-up you can have.”