Over the past decade, no IT trend has been more transformative than the widespread availability of the public cloud. As hyperscalers promise near-infinite scalability and flexibility for workloads while alleviating the need for organizations to spend on internal infrastructure, tools and staff, organizations have rushed into a new era.
But more recently, as enterprise cloud strategies have continued to mature, there has been a growing realization not only that the expected financial return from public cloud investments can be elusive, but also that organizations may risk sacrificing flexibility, security and control when they bet everything on the public cloud. As a result, we've seen a growing number of companies begin to rethink their cloud strategies and make more sensible decisions about where their most critical workloads should reside. This reconsideration has led to a gradual migration of workloads from the public cloud to private cloud environments – "repatriation" – and reflects a growing recognition of an undeniable truth: the public cloud is simply not the optimal choice for every type of workload.
So how should organizations think strategically about the types of workloads that could benefit from repatriation? Deciding where workloads actually belong depends on a deep understanding of their nature and the specific needs of the organization. Regardless of a company's specific IT architecture, successful repatriation requires a nuanced approach and an understanding of how you want to access your data, what you need to protect, and how much you are willing to spend.
In this first part of a two-part series, we will look at two of the four key factors driving the current wave of repatriation: edge computing and data privacy/sovereignty.
Vice President of Private Cloud at Rackspace.
'Living on the edge': bringing workloads home
According to research from Virtana, the majority of organizations currently employ some form of hybrid cloud strategy: more than 80% operate on multiple clouds and around 75% use some form of private cloud. More recently we have seen a shift, particularly in sectors such as retail, manufacturing, transportation and healthcare, toward edge computing, driven by the need for greater flexibility and control over computing resources. The growth of the Internet of Things (IoT) has been fundamental in this regard, as it has enabled the collection of a wide range of data at the edge of the network.
When the number of connected IoT devices at the edge was relatively insubstantial, it made sense for organizations to send the data they produced to the public cloud. But as these devices have continued to proliferate, significant efficiencies can be gained by collecting and analyzing data at the edge, including near real-time response and increased reliability of critical infrastructure such as point-of-sale systems and assembly lines.
Especially in industries where uninterrupted operations are paramount, minimizing downtime is crucial to maintaining profitability and competitiveness. This shift toward edge computing reflects a strategic reevaluation of IT infrastructure deployment, prioritizing localized solutions over traditional public cloud services, and has led many organizations to move workloads away from the public cloud.
Data sovereignty and privacy
As companies face growing concerns around data privacy and ownership, there has been growing recognition of the need to maintain greater control over sensitive data and establish parameters and policies governing its use.
In industries such as healthcare and financial services, where large amounts of sensitive, critical data are generated and exchanged, maintaining trust and control over this information is of utmost importance. Ensuring this data resides in highly secure environments allows organizations to effectively safeguard their assets and mitigate the risk of unauthorized access or breaches.
Additionally, increased scrutiny by key stakeholders such as CIOs, CTOs, and boards of directors has elevated the importance of data sovereignty and privacy, resulting in a notable increase in scrutiny of third-party cloud solutions. While public clouds may be suitable for workloads that are not subject to data sovereignty laws, a private solution is often required to meet compliance thresholds. Key factors to consider when deciding whether a public or private cloud solution might be more appropriate include how much control, monitoring, portability, and customization the workload requires.
Of course, trust and privacy are not the only factors driving repatriation. Ancillary operational and strategic benefits can be gained from keeping data within trusted environments, such as greater control over how information is accessed, used, and shared.
In the second part of this series, we will look at two other key factors influencing repatriation: the rise of Kubernetes and the flexibility of containers.
This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in today's tech industry. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here.