Industrial companies face a difficult call when deciding the best model for handling, analysing and acting on the plethora of data in the industrial internet of things.
For many people, getting stuck in a lift is their worst nightmare. But thanks to the internet of things (IoT), it could be a thing of the past.
German manufacturer ThyssenKrupp, which runs more than a million elevators around the world, is using intelligent IoT algorithms to predict when a lift is about to break down and then prevent it from doing so.
The IoT is driven by the ability to make any object a device by attaching sensors to it and connecting it to the internet. The devices are then able to communicate with each other to improve processes.
“Businesses need to find a way to keep up with the rapid pace of change that 21st-century life brings,” says Andreas Schierenbeck, chief executive at ThyssenKrupp Elevator. “IoT systems that are integrated into industry in an intelligent and practical way can provide the solution to this challenge.”
Connecting and carrying out analytics for more than a million elevators requires an enormous amount of computational power. ThyssenKrupp uses cloud, where computing resources are delivered over the internet on a pay-per-use basis.
But cloud is just one of three prominent models, the others being edge computing and fog computing. Choosing the right one can define the success or failure of any IoT project.
Edge computing is the opposite of cloud, processing and storing data at the data sources themselves. It uses local processing power and storage to carry out low-level, low-value tasks based on the data it is collecting, such as switching things on and off or sending alerts based on trigger events.
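The kind of low-level, trigger-based processing described above can be sketched in a few lines. This is an illustrative example only; the threshold value, the function names and the simulated sensor feed are all assumptions, not drawn from any real device.

```python
# Minimal sketch of edge-style processing: a hypothetical device inspects
# each sensor reading locally and only raises an alert on a trigger event,
# so raw data never has to leave the device.

THRESHOLD = 70.0  # hypothetical vibration limit that triggers an alert


def process_reading(value, alerts):
    """Handle one sensor reading entirely on the device."""
    if value > THRESHOLD:
        alerts.append(f"ALERT: vibration {value} exceeds limit {THRESHOLD}")
        return "alert"
    return "ok"


alerts = []
for reading in [12.5, 45.0, 88.2, 30.1]:  # stand-in for a live sensor feed
    process_reading(reading, alerts)

print(alerts)  # only the single trigger event would be sent upstream
```

The point of the pattern is in the last line: of four readings, only one produces anything worth transmitting.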
Analyst firm IDC predicts that by 2019, 45 per cent of IoT-created data will be stored, processed, analysed and acted upon close to, or at the edge of, the network. “If we want to capture the opportunities of the industrial IoT, it’s not enough to rely on today’s big central data centres and clouds,” says Colin I’Anson, chief technologist for IoT at HPE.
Fog is the middle ground. It computes at the edge but includes elements of aggregation with local resource pools in close proximity to end-users. Devices act as gateways by using distributed nodes linked to the cloud, sending and receiving data and additional compute power when needed.
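A fog gateway's aggregation role can be illustrated with a short sketch. The device names and summary format below are invented for illustration; the idea is simply that the gateway pools raw readings from nearby devices and forwards only a compact summary to the cloud.

```python
from statistics import mean

# Hedged sketch of a fog gateway: it pools raw readings from nearby
# devices and collapses them into one summary per device, so far less
# data crosses the network to the cloud.


def aggregate(readings_by_device):
    """Collapse raw per-device readings into one cloud-bound summary."""
    return {
        device: {
            "count": len(vals),
            "avg": round(mean(vals), 2),
            "max": max(vals),
        }
        for device, vals in readings_by_device.items()
    }


raw = {
    "device-01": [0, 0, 1, 0],  # stand-in occupancy signals
    "device-02": [1, 1, 1, 0],
}
summary = aggregate(raw)
print(summary)  # a handful of numbers instead of the full raw feed
```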
Rentokil Initial, a pest control company, is harnessing fog computing to connect its rodent containment devices to gateways that collect information and trigger alerts to technicians when the devices need to be emptied or serviced.
These gateways are also connected to a cloud-based command centre where employees and customers can analyse data relevant to them. The company works with software firm Qlik to visualise that data, making it easier to act on.
Another example of fog is the use of blockchain as a decentralised distributed system for device and data provenance. “This is not edge computing as it refers to the whole system state and not cloud as it is not held on one server,” says analyst Ian Hughes of 451 Research.
Cloud, edge and fog each come with their own advantages and disadvantages. For all the scalability and flexibility benefits of cloud, security is an enduring concern when data is handed over to a third party. Edge ensures only useful data is sent over the network, but it can get costly when more powerful devices are required to cope with extra processing. And while fog optimises the amount of data that is sent across the network, deploying more intermediate processing increases the burden of managing it.
However, industrial organisations shouldn't see these models as mutually exclusive; each is appropriate to different deployment scenarios.
A smart city lighting project, for example, requires a more centralised system, while an oil refinery will have lots of edge processing for parts of the process, but an aggregated, cloud-based digital twin representation of the entire refinery.
A train may have an on-board edge processor to optimise fuel usage as it travels, while a cloud-based system aggregates its data with that of other trains to apply predictive maintenance to the tracks.
“For simple localised monitoring, you could just send the data to a local device to process it,” says Gary Barnett, head of enterprise advisory at analyst firm GlobalData. “In more complex environments, like facilities management, the data may be sent to an intermediate server on your site so you can manage it locally, with alerts or aggregated data sent to the cloud.”
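The tiered routing Barnett describes can be sketched as a simple dispatch rule: raw data stays on the local device, aggregates go to an on-site server, and alerts travel to the cloud. The tier names and message format here are assumptions made for illustration, not part of any real system.

```python
# Illustrative sketch of tiered routing: each message is handled at the
# lowest tier that can deal with it, and only alerts or aggregates move
# further up the hierarchy.


def route(message):
    """Pick the processing tier for one message based on its kind."""
    if message["kind"] == "raw":
        return "local-device"   # simple localised monitoring
    if message["kind"] == "aggregate":
        return "site-server"    # intermediate, managed on site
    return "cloud"              # alerts and anything needing global context


print(route({"kind": "raw"}))        # local-device
print(route({"kind": "aggregate"}))  # site-server
print(route({"kind": "alert"}))      # cloud
```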
There is no one size fits all. “The best solution may involve a combination of all three approaches,” says Graeme Wright, IoT director at Fujitsu UK and Ireland.
American conglomerate General Electric is a good example of a company deploying all three models. Its industrial internet platform syncs with every physical device to create a complete continuum from assets at the edge to gateways in the factory and all the way to the cloud.
Applying this approach has driven productivity increases of up to 20 per cent in GE’s factories, resulting in $730 million in productivity gains in 2016 alone. The company is aiming for an additional $700 million in 2017.
“The cloud plays an essential role in the industrial internet of things,” says Deborah Sherry, general manager at GE Digital Europe, “but it’s not enough on its own for industrial companies looking to optimise their assets and operations.”