At Swansea-based Calon Cardio-Technology, work is underway to develop the next generation of smaller and more efficient blood pumps – surgically implantable artificial hearts – used as an alternative to heart transplants in patients with chronic cardiac failure.
In designing such blood pumps, it’s obviously crucial to ensure that the pump operates as efficiently as possible. But it’s also vital to avoid damaging red blood cells as they circulate through the pump: the rupture of red blood cells, a condition called haemolysis, can lead to kidney failure.
Accordingly, Calon carries out highly complex computer simulations of the flow of blood through a pump design, using a mathematical technique known as computational fluid dynamics. Working with Swansea University’s Advanced Sustainable Manufacturing Technologies centre, the operation of each new design is simulated on a supercomputer cluster operated by HPC Wales, a high performance computing facility part-funded by the European Union.
On a conventional high-powered desktop computer, each 3D simulation, involving a “mesh” of around two million elements, would take two to three days, explains Calon’s chief technology officer Graham Foster. The use of a supercomputer slashes that time significantly.
“We need to model about a dozen scenarios for each pump design, after which we tweak and refine it,” he says. “With a supercomputer, a scenario takes two to three hours, not two to three days, and we can send the dozen scenarios as a single batch. We’re saving time and also cost.”
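The batching pattern Foster describes – running a dozen independent scenarios concurrently rather than one after another – can be sketched in a few lines of Python. The `simulate_scenario` function below is a hypothetical stand-in for a CFD run, not Calon’s actual solver, and the pool is a stand-in for a cluster scheduler farming jobs out to nodes:

```python
import concurrent.futures
import time

def simulate_scenario(scenario_id):
    """Stand-in for one CFD run: a real solver would load a mesh,
    iterate the flow field to convergence and write out results.
    Here we just pause briefly and return a mock status."""
    time.sleep(0.1)  # placeholder for hours of solver time
    return scenario_id, "converged"

# Submit all twelve scenarios as a single batch. Because the runs are
# independent, total wall time is roughly that of one run, not twelve.
with concurrent.futures.ThreadPoolExecutor(max_workers=12) as pool:
    results = dict(pool.map(simulate_scenario, range(12)))

print(results)
```

The same independence is what makes the real workload a good fit for a cluster: each scenario can occupy its own set of nodes with no communication between jobs.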
While supercomputers aren’t mandatory for simulating the workings of products under development, there’s no doubt that the combination of high-powered computers and advanced visualisation technology is rapidly transforming the use of 3D product simulation and visualisation within product lifecycle management (PLM) and product design generally.
How? Not only by making it easier to apply ever more refined mathematical techniques, but also by making the results accessible to non-specialists, such as designers who are engineers rather than physicists.
“It’s about using real physics, real mathematics and real chemistry, in order to see real outcomes on screen, in a virtual world,” says Stephen Chadwick, managing director for northern Europe at simulation software firm Dassault Systèmes. “This makes it much easier for people who haven’t got a PhD to visualise the outcome of a design decision.”
Better still, such simulations deliver answers faster and more cheaply than would be possible with physical models, says Barry Christensen, director of product management at simulation software specialists ANSYS. Moreover, he notes, simulation can highlight issues that might otherwise not become apparent until products are on the market.
And building such digital prototyping technology into commonly used computer-aided design tools is helping simulation become much more widespread, adds Wasim Younis, simulation solutions manager at Symetri, a firm specialising in digital design training and consultancy using Autodesk design software.
In the process, he points out, the dividing line between design and product simulation is blurring, bringing advanced simulation techniques into the realms of the everyday.
“Should I be using 2mm thick plate or 3mm thick plate? Designers often face choices like that and with simpler user interfaces built into everyday tools, finding the answer is easier than ever,” he says. “Design engineers are designing the product and working with finite element analysis at the same time.”
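The 2mm-versus-3mm question even has a back-of-the-envelope answer before any finite element run: the bending stiffness of a flat plate grows with the cube of its thickness. A small illustrative calculation (the material values below are generic mild steel, not taken from the article):

```python
def flexural_rigidity(E, t, nu):
    """Bending stiffness of a flat plate: D = E*t^3 / (12*(1 - nu^2))."""
    return E * t**3 / (12 * (1 - nu**2))

E = 210e9   # Young's modulus of mild steel, Pa (illustrative value)
nu = 0.3    # Poisson's ratio (illustrative value)

D2 = flexural_rigidity(E, 2e-3, nu)  # 2 mm plate
D3 = flexural_rigidity(E, 3e-3, nu)  # 3 mm plate

# Stiffness scales as t^3, so the ratio is (3/2)^3 = 3.375,
# independent of the material chosen.
print(f"3 mm plate is {D3 / D2:.2f}x stiffer in bending than 2 mm")
```

The full FEA still matters, of course – real parts have holes, ribs and supports that a hand formula can’t capture – but cheap checks like this are exactly the kind of reasoning the embedded tools put at a design engineer’s fingertips.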
Even so, some simulation problems remain out of reach for ordinary businesses with ordinary computers. But perhaps not for much longer.
“There’s a class of computational fluid dynamics problems that are currently intractable at an industrial level,” says Peter Vincent, lecturer in aeronautics at Imperial College London. “They can be solved by supercomputers in national laboratories, but not at an industrial level.”
However, in partnership with graphics computing firm Nvidia, manufacturer of the graphics processing units (GPUs) that power many desktop computers, Imperial College is investigating whether Nvidia’s latest-generation Tesla chips – each with more than 2,500 compute “cores”, compared with the two to four in a typical desktop processor – can be harnessed to the task. Emerald, the UK’s most powerful GPU-powered supercomputer, housed at the Rutherford Appleton Laboratory, contains 372 such Tesla GPUs.
“With the right computer algorithm and the right kind of simulation problem, there can be a five-to-tenfold increase in speed for a given expenditure on computing power,” says Dr Vincent. “That can bring many more problems within reach.”
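The problems that benefit most are data-parallel ones, where the same arithmetic is applied independently at millions of mesh points per time step. A toy example of the pattern – a vectorised Jacobi sweep on a 2D Laplace problem, the kind of structured-grid update that maps naturally onto thousands of GPU cores (NumPy here stands in for an actual GPU kernel):

```python
import numpy as np

# Toy structured-grid problem: a steady potential field on a square,
# with the top edge held at 1.0 and the other edges at 0. Each Jacobi
# sweep replaces every interior point with the average of its four
# neighbours -- identical arithmetic at every point, which is exactly
# the workload GPU cores execute in parallel.
n = 64
u = np.zeros((n, n))
u[0, :] = 1.0  # boundary condition on the top edge

for _ in range(2000):
    u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])

# Interior values settle smoothly between the boundary values,
# highest near the heated top edge.
print(u[1:-1, 1:-1].min(), u[1:-1, 1:-1].max())
```

Because every point’s update depends only on the previous sweep, the whole grid can be updated at once; the algorithmic challenge Dr Vincent refers to is designing schemes that keep this structure while delivering the accuracy industrial CFD demands.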
Increasingly, too, virtual reality and advanced visualisation technology are adding to the armoury of simulation technologies. At the Ford Motor Company’s product development centre in Dearborn, Michigan, for instance, engineers don head-mounted computer displays, not unlike binoculars, to explore high-definition visual renderings of complete virtual vehicles, “walking around” them, exploring their interiors and even going for a virtual drive.
The use of such high-definition, photo-realistic immersive environments allows engineers to examine aspects of vehicle design that are difficult to explore in the physical world.
“We’re getting a better picture of what our designers are producing for our customers, under different lighting conditions, and can iteratively look at all the build combinations that can be produced, all in a very short time,” says Elizabeth Baron, virtual reality and advanced visualisation technical specialist at Ford. “Without it, we just couldn’t check all the issues that we check now.”
TOP FIVE FUTURE PLM TECHNOLOGIES
1. High-definition, photo-realistic immersive virtual reality
2. Cloud-based supercomputing
3. Graphics processing unit-powered desktop simulation
4. Parallel-processing algorithms
5. Web-enabled collaborative simulation and analysis