Whether public or private cloud, senior leaders must keep in mind the ultimate goal: to optimise or innovate business IT applications through the strategic use of cloud infrastructure and services. Poorly thought out strategies can quickly leave a business blimp-like, lagging behind Concorde competitors. In an age of cloud ascendance, ignorance is no longer an option.
In my experience, senior leadership has been so drenched in a torrent of cloud marketing spiel that they often have an inflated sense of the powers cloud possesses and an underappreciation of pitfalls to avoid. Add to the mix a legacy mindset and a firm might soon find itself staring into the void.
Businesses must consider both public and private cloud systems
Many businesses believe it’s simply a matter of lift and shift: plonking legacy applications in virtual machines, moving them from on-premise facilities into the public or private cloud, optimising them, then driving efficiencies by only paying for resources on demand. While this holds true for some applications, significant efficiencies can often be gained simply by right-sizing workloads on existing on-premise infrastructure.
By not considering the long-term implications of the public cloud service delivery model, firms that initially experience savings may quickly see their costs spiral out of control. For many applications, private cloud, with its flexible licensing arrangements, is more cost efficient, especially when coupled with its automation and capacity planning capabilities.
And because private cloud affords full ownership and control, businesses that rely on applications carrying out highly regulated activities should keep them there to speed up compliance processes.
To unlock the true benefits of cloud services – there are more than 150 available on Amazon Web Services (AWS) – businesses must adopt a developer-like approach to their applications, daunting as that can be, and avoid the temptation of the virtual machine life raft.
Starting from scratch and reimagining the way applications are run not only vastly improves efficiencies compared with lifting and shifting, but also allows for experimentation and innovation. The latest artificial intelligence services, such as Google TensorFlow or AWS Lex, turn the public cloud into a laboratory for building applications that leave the competition behind. Don’t be fooled: if you’re not exploiting cloud services in this manner, you can bet your competitors are.
Security in either the public or private cloud depends on end-users
A common misconception is that public cloud is inherently less secure than its private variant, resulting in many organisations keeping their sensitive data on-premise or in a private cloud by default. In reality, security outcomes depend on the diligence of end-users. Too many incorrectly assume it’s the responsibility of service providers to secure applications, and they end up vulnerable to threats.
But equally, many are unaware that a granular approach to application reconstruction can leverage microservices to create sophisticated architectures more secure than a trusty in-out firewall. Being unaware of these possibilities and keeping workloads on-premise as a result could actually leave a firm more exposed.
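To make the contrast with a perimeter firewall concrete, here is a toy sketch, purely illustrative and not any vendor’s API, of the default-deny posture a microservice architecture enables: every service-to-service call must be explicitly allowlisted, whereas a traditional in-out firewall trusts anything already "inside" the perimeter. The service names are hypothetical.

```python
# Toy illustration of default-deny, per-service access control.
# A perimeter firewall admits any request once it is past the boundary;
# here, a call succeeds only if the (caller, callee) pair is allowlisted.

ALLOWED_CALLS = {
    ("web-frontend", "order-service"),     # hypothetical service names
    ("order-service", "payment-service"),
}

def may_call(caller: str, callee: str) -> bool:
    """Default-deny: a call is permitted only if explicitly allowlisted."""
    return (caller, callee) in ALLOWED_CALLS

# The frontend may reach the order service, but it cannot jump straight
# to the payment service, even though both sit "inside" the same network.
print(may_call("web-frontend", "order-service"))
print(may_call("web-frontend", "payment-service"))
```

In practice this policy would be enforced by the platform – for example via cloud security groups or a service mesh – rather than in application code; the sketch only shows the shape of the rule set.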
To succeed in a cloud world, these decisions must be made continually on an app-by-app basis, which essentially comes down to how much an organisation is prepared to change its approach to IT. An app-centric approach to enterprise IT moves away from the traditional periodic release and review of IT apps and infrastructure to one of continuous delivery and optimisation. Gone are the two-year life cycle and the business-first, IT-second mindset. IT and business are two sides of the same coin.