High-performance computing (HPC) is the engine room of scientific research in the UK. What started 30 years ago as a couple of researchers putting together some computers to accelerate their capabilities evolved into the establishment of numerous pioneering research institutions that focus on specific areas of science, such as the Earlham Institute and the Wellcome Sanger Institute for genomics and genetics research.
The importance of supercomputers in life sciences has been highlighted by a growing number of use cases where massive amounts of data must be processed. Public Health England, for example, recently upgraded its HPC system to enable faster and more effective analysis of genome sequences, with the eventual goal of facilitating more personalised medicine in the NHS.
HPC has also spread beyond scientific research into the commercial sector, particularly among large car manufacturers that utilise supercomputers in design and production. BMW has used HPC for the production of its i3 and i8 hybrid cars, while Formula 1 teams have also deployed the technology for wind-tunnel modelling, saving on the significant cost of physically putting their vehicles in a wind tunnel.
While the use of HPC has evolved significantly in this country, its true benefits are not being realised due to the way in which funding councils finance and manage access to ARCHER, the UK’s national HPC system. All publicly funded scientists in the UK can use the system for free, but limited capacity means they must wait in a queue.
“The research community is at a pivotal point where if they carry on doing what they’re doing, they’re going to struggle to keep up with research entities in other countries,” says Spencer Lamb, director of research at Verne Global. “The UK government needs to look at providing a far more flexible approach to financing the use of HPC for the research scientists in this country to ensure we keep ahead of the game.
“Ultimately our skills within this country are down to the research scientists and their ability, intelligence and creativity. What they need to hand is the most advanced, bleeding-edge tool and I don’t believe they’re being given that today.”
In recent years, rapid developments in cloud technology have enabled mass access to powerful computing capabilities through nimble, on-demand services. ARCHER, however, remains built on inflexible, on-premise architecture that is complex and expensive to run. Every three to five years, the system is redesigned and rebuilt at a large cost to the taxpayer, and it is then unable to take advantage of further technological advances until the next refresh occurs.
“These are very power-hungry systems, yet often they have to reside in a below-standard datacentre, located somewhere within a university or research institution and often without uninterruptible power supplies. Such facilities aren’t really built or fit for purpose,” says Mr Lamb.
“Technology is evolving every day and UK researchers look across to their equivalents in the US and Germany who have access to far greater supercomputers on a far more regular basis. They want to pick and choose what they’d like to have, but they don’t have that option.
“If you talk to the scientists, they just want to access something that works without getting into the day-to-day nitty gritty of making it work, which seems to happen an awful lot in this country. In some cases, it’s almost like we are providing a garden shed HPC for our scientists rather than something that’s slick, efficient and of the latest technology.”
Forward-thinking heads of scientific HPC are now looking to the cloud to fulfil their need for supercomputers that are flexible and easy to access. Hyperscale cloud platforms such as Microsoft Azure and Amazon Web Services, however, don’t provide the full control of an on-premise system, or an HPC-optimised environment, which has prevented the UK research community from being able to embrace high-performance cloud computing.
UK scientists are also often limited in their ability to buy an HPC instance, or additional cores, through these vendors’ websites, and when they can, it is typically expensive.
The solution is a hybrid option. hpcDIRECT, an offering from Iceland-based Verne Global, provides researchers with the benefits they enjoy from on-premise systems, but without the inflexible, ailing architecture. They can choose what they want, when they want it, and ensure they’re getting the best value with the most appropriate HPC-attuned technology. According to Verne Global, this allows the research community to access “TrueHPC”.
The ARCHER supercomputer’s annual power bill is £1.9 million. If a supercomputer of the same size were provided through hpcDIRECT, using 100% sustainable Icelandic geothermal and hydroelectric power, the bill for powering it would be just over £500,000. Savings like these will also enable smaller research programmes, with limited funding, to reach greater heights.
“Smaller universities with ambitions to use HPC to further their research are limited at the moment because they can’t afford it or because they’re queuing for the ARCHER supercomputer and they’re not as important as the big research institutions, which dominate that platform,” says Mr Lamb. “With hpcDIRECT, they can finally access those capabilities and the impact of that is huge.”
On-demand services that offer TrueHPC are set to accelerate the research process across the country, with potentially massive results for UK science. Just as the creation of HPC did 30 years ago, the hybrid architecture model, powered by Iceland, will now signal a new generation of research pioneers and cement the UK’s leadership position in this critical area.
For more information about hpcDIRECT, which is available via the G-Cloud 10 framework, please visit verneglobal.com/solutions/hpcdirect
UK ice-cloud research powered by low-carbon energy
A UK-based company specialising in digital environmental intelligence for the aviation industry will utilise hpcDIRECT to extend its compute-intensive research into the environmental factors that disrupt the efficiency of aircraft.
Cambridge-based Satavia intends to map the prevalence of high-altitude ice clouds, which can cause engines to flame out during flight. Organisations around the world will rely on this mission-critical information to develop and implement innovative solutions to boost efficiency, improve safety, and reduce costs and aircraft emissions.
“Not only will it allow us to create greater volumes of environmental data at scale and therefore provide more valuable intelligence to the aviation industry, it offers us a low-carbon source of HPC powered by geothermal energy, which helps to reduce our impact on the environment,” says Adam Durant, chief executive at Satavia.