In a previous guest blog for SystemsUp I looked at how the pace of change was being accelerated by cloud and how the development of applications was at the heart of that change.

Today I’m going to look at the growth of the multi-cloud environment and how that will affect management and service delivery in the future.

Multiple Clouds (the new Hybrid?)

Traditionally, hybrid cloud referred to the combination of a private cloud and a public one. However, we are now starting to see the use of multiple public clouds, with applications and services distributed liberally across them, and several strategies at play.

Where enterprises have gone down the Platform as a Service (PaaS) route, we are starting to see a strategy emerge that abstracts application developers away from the cloud provider, using enterprise PaaS technology to deliver a level playing field for application deployment. Once an enterprise has developed a template for its enterprise PaaS – OpenShift or Cloud Foundry, for example – it simply repeats that template on whichever public cloud provider it wants to use, meaning its cloud-native applications behave consistently regardless of where they are deployed. At the same time, the enterprise avoids coding itself into the cloud provider's own PaaS.
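To make this concrete, here is a minimal sketch of the "define the template once, repeat it on each cloud" idea. The template fields, provider names and deploy() helper are hypothetical Python stand-ins for whatever the enterprise PaaS tooling (OpenShift or Cloud Foundry, say) actually exposes:

    # Hypothetical sketch: the template is defined once and the target cloud
    # is just a parameter; nothing here maps to a real vendor SDK.
    APP_TEMPLATE = {
        "name": "orders-service",
        "runtime": "python-3.11",
        "replicas": 3,
    }

    TARGET_CLOUDS = ["aws", "azure", "gcp"]  # wherever the enterprise PaaS runs

    def deploy(template: dict, cloud: str) -> None:
        """Hand the same template to the PaaS layer running on the chosen cloud."""
        # In practice this would call the enterprise PaaS API pointed at that
        # cloud's cluster; here we just log the intent.
        print(f"Deploying {template['name']} to the PaaS cluster on {cloud}")

    for cloud in TARGET_CLOUDS:
        deploy(APP_TEMPLATE, cloud)

The application definition never mentions the underlying cloud, so adding or dropping a provider does not touch the application itself.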

One further area to consider when adopting a multi-cloud strategy is management. Having a single view across every cloud, while also being able to deploy applications and move them between clouds, will be increasingly important. What you want are management tools that fit the clouds you use today but are flexible enough to accommodate the clouds you might want to use in the future. Likewise, having automation and DevOps technologies that can deploy infrastructure and applications regardless of who owns the target cloud will also be important.
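The same thinking applies to the management layer itself. The sketch below, again with invented names, shows the shape of a single interface that gives one view across clouds and lets a workload be redeployed from one to another; real tooling such as Terraform, Ansible or the vendors' own CLIs would sit behind these methods:

    from typing import Protocol

    class CloudProvider(Protocol):
        """Minimal, hypothetical interface every cloud adapter must satisfy."""
        name: str
        def list_workloads(self) -> list[str]: ...
        def deploy(self, workload: str) -> None: ...

    class CloudA:
        name = "cloud-a"
        def list_workloads(self) -> list[str]:
            return ["billing-api"]
        def deploy(self, workload: str) -> None:
            print(f"[{self.name}] deploying {workload}")

    class CloudB:
        name = "cloud-b"
        def list_workloads(self) -> list[str]:
            return ["reporting-job"]
        def deploy(self, workload: str) -> None:
            print(f"[{self.name}] deploying {workload}")

    def single_view(providers: list[CloudProvider]) -> dict[str, list[str]]:
        """One view of what is running where, regardless of the underlying cloud."""
        return {p.name: p.list_workloads() for p in providers}

    def move(workload: str, source: CloudProvider, target: CloudProvider) -> None:
        """Redeploy a workload from one cloud to another through the same interface."""
        print(f"Moving {workload} from {source.name} to {target.name}")
        target.deploy(workload)

    clouds = [CloudA(), CloudB()]
    print(single_view(clouds))
    move("billing-api", clouds[0], clouds[1])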

Serverless, APIs and SOA

As cloud platforms, services and SaaS solutions mature and evolve, so too do software development practices. We are finding new ways to build applications by stringing together functions that are built and managed by other providers, and then integrating those applications and functions with APIs.

When a large global software vendor described their PaaS to me, what they were actually describing was a set of functions that could be called from software and APIs – a serverless computing environment rather than a pure-play PaaS. The top cloud providers are adding more and more functionality to their clouds, but look closely and you will see that most of it is software functions that have been added: some form of machine learning, graph database or IoT capability, for example.
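Consumed from the application side, those functions look something like the sketch below. The endpoint URL and payload are invented for illustration; the point is that the application simply POSTs to a provider-hosted function over an API and never sees the servers behind it:

    import json
    import urllib.request

    def call_cloud_function(endpoint: str, payload: dict) -> dict:
        """POST a JSON payload to a provider-hosted function and return its JSON result."""
        req = urllib.request.Request(
            endpoint,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # For example, a hosted machine-learning function for sentiment scoring
    # (hypothetical URL, so the call is left commented out):
    # result = call_cloud_function(
    #     "https://functions.example-cloud.com/sentiment",
    #     {"text": "Great service, would recommend"},
    # )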

The more applications become cloud native, the more legacy applications gain APIs and the more microservices are developed, the more this approach becomes the norm. We are shifting towards an API economy where we pay for transactions rather than for compute, and the solutions that evolve look more akin to what would once have been called a Service Oriented Architecture (SOA).

The more solutions we build by stitching individual services together, the more we need to manage the APIs and the interconnections between them.
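As a rough illustration of that stitching, the sketch below composes three hypothetical services – a customer lookup, a pay-per-transaction scoring function and a quoting service – into a single workflow. In a real estate each of these would be a call across an API that needs to be secured, versioned and monitored, which is exactly the management burden described above:

    def lookup_customer(customer_id: str) -> dict:
        # Would call a customer service API, e.g. GET /customers/{id}; stubbed here.
        return {"id": customer_id, "segment": "smb"}

    def score_risk(customer: dict) -> float:
        # Would call a provider-hosted scoring function, paid per transaction.
        return 0.12 if customer["segment"] == "smb" else 0.05

    def create_quote(customer: dict, risk: float) -> dict:
        # Would call a quoting service owned by another team or vendor.
        return {"customer": customer["id"], "premium": round(100 * (1 + risk), 2)}

    def quote_workflow(customer_id: str) -> dict:
        """Stitch three separately managed services into one business outcome."""
        customer = lookup_customer(customer_id)
        risk = score_risk(customer)
        return create_quote(customer, risk)

    print(quote_workflow("c-1001"))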

Enterprise SaaS

There are huge, and very exciting, changes afoot in how software is delivered across the enterprise. However, there is one area where this poses something of a dilemma.

The number one concern for any enterprise delivering software and data in a new form is data privacy and the protection of IP. This creates a tension between what can drive transformation and what can hold it back.

Enterprises have spent many years building their IP, their customer base and their reputation, and believe they must take every possible measure to protect and maintain them. However, these are the very elements under attack from digital disruptors, who will innovate with new IP, win those customers and destroy established reputations.

Even banks are now being pushed towards open banking, which will change their customer landscape. And while their regulators are becoming more comfortable with the use of cloud, new privacy laws such as GDPR are coming into play, with potentially crippling fines as penalties for non-compliance.

Enterprises are grappling with these questions:

  • Will they be able to use functions in a serverless way?
  • Will they have to retain their data within the corporate firewall?
  • Will they be disrupted by new entrants unencumbered by legacy?
  • Will they use their vast resources to change pace and innovate?

Every industry, not just banking, is being affected by the fast pace of change in the tech sector. The ability to use cloud technology will enable businesses to deliver better services to customers. And as hardware becomes more ubiquitous, it is software that will drive the changes. From a plumber running their business more efficiently with a scheduling app, to a bank doubling the number of transactions it can handle, the cloud will play an increasingly large role in the technology supply chain.

The multi-cloud environment is here to stay, so the sooner we learn to manage it, the sooner we can use it to foster change for the better in the enterprise environment.

This is a guest blog written by Rhys Sharp, Chief Technology Officer for Fedr8, the machine learning application analysis company. SystemsUp is a partner for Fedr8’s Green Rain programme.

If you want to know more about how SystemsUp works with Fedr8 to analyse and empower your application estate, please get in touch.