By Ben Ferguson
Shamrock Consulting Group
Cloud computing may be the flavor of the month when it comes to super-scale information technology, but it’s not the only model that has been put forward as a viable answer to businesses’ needs for scalability and economy.
Fortunately, autonomic computing could potentially remove one of the enduring flaws that prevents cloud computing from delivering on its full potential.
The Weak Link
Despite the potential cost and efficiency savings available through cloud computing, there are ongoing vulnerabilities that can’t be ignored. From cyberattacks to server crashes to VM sprawl, the cloud has suffered its fair share of mishaps. So it’s no wonder that some businesses remain skeptical, convinced that they could lose a big chunk, if not all, of their profits if something goes wrong.
If you scratch beneath the surface of any of the above incidents, you’ll find that there is one element present in each of them: us.
What lies behind most cybersecurity attacks? Human failure to follow security best practices. Why did Amazon S3 go down for a few hours in February 2017? An Amazon employee entered a wrong command and took more servers offline than intended. Why did British Airways’ systems crash that same year? Again, human error: an IT engineer switched over to emergency servers too quickly.
A recent survey on the cloud revealed that cost reduction was the number one priority for experienced cloud-based businesses. The reason this is such an issue is not any inherent failing in the utility computing model per se; it is that humans can be really bad at managing provisioned resources efficiently.
All of the above could potentially be mitigated by an autonomic computing system.
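The resource-waste problem is a good illustration of the kind of routine housekeeping an autonomic system would take over from us. As a minimal sketch (the fleet data, function name and 5% threshold are all illustrative assumptions, not any vendor’s API):

```python
# Hypothetical sketch of a check humans routinely forget to run:
# flagging provisioned instances whose utilization suggests waste.

def flag_underused(instances, cpu_threshold=0.05):
    """Return IDs of instances averaging below the CPU threshold (likely idle)."""
    return [
        inst["id"]
        for inst in instances
        if sum(inst["cpu_samples"]) / len(inst["cpu_samples"]) < cpu_threshold
    ]

fleet = [
    {"id": "vm-1", "cpu_samples": [0.62, 0.58, 0.71]},  # doing real work
    {"id": "vm-2", "cpu_samples": [0.01, 0.02, 0.01]},  # forgotten test box
]
print(flag_underused(fleet))  # ['vm-2']
```

A human has to remember to run a check like this and act on it; an autonomic system would run it continuously and deprovision the waste itself.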
Modeling the Autonomic Nervous System
When we think about smart machines, it is natural to focus on cerebral functions like planning and decision-making. In biological terms, these functions are part of the central nervous system, with nervous impulses passing to and from the brain via the spinal cord.
However, many of our life-preserving functions, including our breathing, heartbeat and digestion, are controlled by the autonomic nervous system. In 2001, IBM rolled out the concept of autonomic computing, modeled on this principle. Such a system would use sensors to monitor both its own state and that of its environment. A management unit would then compare the data against a set of human-designed policies and, if necessary, make adjustments through a network of effectors.
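The sensor–policy–effector cycle described above can be sketched in a few lines. This is only an illustration of the control-loop shape, with invented function names and CPU thresholds; it is not IBM’s actual reference architecture:

```python
# Minimal sketch of an autonomic control loop: a sensor reads state,
# a management unit compares it against a human-designed policy, and
# an effector adjusts the system. All names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Policy:
    max_cpu: float = 0.80   # scale up above 80% utilization
    min_cpu: float = 0.20   # scale down below 20% utilization

def monitor(metrics: dict) -> float:
    """Sensor: read current CPU utilization from collected metrics."""
    return metrics["cpu_utilization"]

def plan(cpu: float, policy: Policy) -> str:
    """Management unit: compare observed state against the policy."""
    if cpu > policy.max_cpu:
        return "scale_up"
    if cpu < policy.min_cpu:
        return "scale_down"
    return "no_op"

def execute(action: str, replicas: int) -> int:
    """Effector: adjust the number of running instances."""
    if action == "scale_up":
        return replicas + 1
    if action == "scale_down":
        return max(1, replicas - 1)
    return replicas

# One pass of the loop: 90% CPU on 3 replicas triggers a scale-up to 4.
replicas = execute(plan(monitor({"cpu_utilization": 0.9}), Policy()), 3)
```

The human contribution is reduced to writing the policy; the loop itself runs without us, just as breathing does.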
In the same way our conscious mind has been freed from managing the bodily functions of breathing and digesting, humans would also be liberated from the need to maintain and optimize the performance of an autonomic computer system.
Despite ongoing research into autonomic systems, the idea didn’t really catch on and was brushed aside with the advent of the cloud computing phenomenon.
The Slow March Towards Automation
Autonomic computing may have been more of a research project over the past two decades, but the technology has still emerged in several high-profile areas. Perhaps the most exciting of these is the development of driverless cars by Tesla and Google. These vehicles use a combination of sensors to gather the rich data they need to navigate safely along the public highway.
Companies have also developed smart central heating systems that use GPS data from smartphones to turn the heating on or off as we approach or leave the house. This is just the beginning of a progression that will see the Internet of Things become a reality. With the IoT will come even more data, and a further scaling up of the cloud as more resources are needed to keep up.
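The heating trigger above is essentially a geofence check: is the phone within some radius of home? A minimal sketch, where the home coordinates, radius and function names are all illustrative assumptions rather than any product’s real API:

```python
# Sketch of the geofence logic a smart thermostat might apply to
# smartphone GPS data. Coordinates and radius are illustrative.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points via the haversine formula."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

HOME = (51.5074, -0.1278)   # illustrative home location
RADIUS_KM = 2.0             # heating turns on inside this radius

def heating_should_be_on(phone_lat, phone_lon):
    return distance_km(phone_lat, phone_lon, *HOME) <= RADIUS_KM
```

No one has to remember to touch the thermostat: the system senses location, applies a simple policy, and acts — the same autonomic pattern in miniature.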
With the demand for IT skills already outstripping supply, wage bills will rise, and so will the cost of mistakes. Forward-thinking companies will move beyond their existing cloud orchestration software and ask what else they could automate.
Autonomic Components Set to Boom
Fortunately, there are already services out there whose makers have realized that the cloud provides an ideal platform for the development of autonomic components. Splunk’s IT Service Intelligence (ITSI) is one example. It goes beyond simple real-time monitoring by using machine learning (ML) to spot patterns, optimize performance and proactively address problems before they affect customers. By tying alerts to KPIs, ITSI automatically prioritizes business-critical functions.
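At its simplest, the pattern-spotting these tools perform means learning a baseline for each metric and flagging readings that stray too far from it. The sketch below shows that idea with a trailing-window statistical test; it is an illustrative toy, not Splunk’s actual algorithm:

```python
# Illustrative sketch of the kind of statistical baseline an AIOps tool
# might use to flag anomalous KPI readings (not any vendor's real method).
import statistics

def find_anomalies(series, window=5, k=3.0):
    """Flag points more than k standard deviations from the trailing mean."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev and abs(series[i] - mean) > k * stdev:
            anomalies.append(i)
    return anomalies

latency_ms = [100, 102, 99, 101, 100, 103, 350, 101, 100]
print(find_anomalies(latency_ms))  # [6] -- the 350 ms spike is flagged
```

Production systems replace this crude window with trained ML models, but the autonomic shape is the same: monitor, compare against a learned norm, then act before a customer notices.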
Other examples already on the market are Insight Engines’ Cyber Security Investigator, Instana’s Dynamic Application Performance Management and Moogsoft AIOps for incident management.
As cloud-native applications continue to offer powerful, reliable ways to automate security, incident management and resource provisioning, small IT teams will become increasingly able to manage huge cloud deployments, and incidents of human failure will become rarer as a result. The big public clouds will want to offer the best automation solutions to their customers, and this will boost the industry leaders even further.
As time goes on, could we see autonomic computing gradually become the industry standard across the cloud? The answer may be a lot closer than you think.
About the Author
Ben Ferguson is the senior network architect and vice president of Shamrock Consulting Group, a leader in technical procurement for telecommunications, data communications, data center, dark fiber and cloud services.
Since his departure from biochemical research in 2004, he has built core competencies around enterprise wide area network architecture, high density data center deployments, public and private cloud deployments and voice over IP telephony.
Ben has designed hundreds of wide area networks for some of the largest companies in the world. When he takes the occasional break from designing networks, he enjoys surfing, golf, working out, trying new restaurants and spending time with his wife, Linsey, and his dog, Hamilton.