Decentralization Becomes a Question of Demand
The world of IT architecture has seesawed between centralization and decentralization.
In the past, centralization was preferred as it promised economies of scale, control, and cost-efficiency. Today, companies are decentralizing in various ways to allow for infrastructure agility and workload portability.
A key driver is multi-cloud adoption. Companies see the value in cloud redundancies, resilience, and better cost efficiency as different cloud platforms have different cost models.
Kevin Ji, a senior director analyst at Gartner, agreed. He observed that the swing between centralization and decentralization is driven by demand for compute density and low network latency.
“And as many companies look at multi-cloud strategies, it will accelerate decentralization. They now have to consider where they put the data as well as the apps and the workloads. Essentially, you can say that the IT environment is getting more localized,” he said.
Ji noted that decentralization is not new. He pointed to global banks that used data centers in different regions “to follow the sun.” “For banks, this is a standard architecture,” he said.
While decentralization as a concept has been around for a while, Ji argued that COVID-19 accelerated it. Part of the reason is data sovereignty. “I think countries and governments are becoming sensitive about data.”
This mindset has resulted in a shift toward “de-globalization” and keeping customer data onshore. “This requires IT architects to rethink how they meet different data compliance needs,” said Ji.
Of course, cost remains a major motivating factor. With cloud platforms competing on more attractive pricing models, companies are now exploring different platforms for their various workloads.
Decentralization supports such an approach.
For larger enterprises, decentralization can help normalize data gravity. Data gravity occurs when a large data store attracts ever more apps and workloads to its location, as companies deploy them nearby to avoid network and data latency issues.
While “the fact of the matter is that there will always be data gravity,” Ji added that decentralization does help IT architects to address this challenge to some extent.
So, what is stopping decentralization? Again, costs. Specifically, cloud ingress and egress fees: the charges for moving data into and out of a cloud platform.
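To see why egress fees discourage moving data between platforms, a rough back-of-the-envelope calculation helps. The sketch below uses a hypothetical per-GB rate and free-tier allowance; these are illustrative placeholders, not real published prices for any provider.

```python
# Hypothetical illustration of how egress fees add up when data moves
# between cloud platforms. The rates below are made-up placeholders.

def egress_cost(gb_moved: float, rate_per_gb: float, free_tier_gb: float = 0.0) -> float:
    """Return the data-transfer charge for moving `gb_moved` GB out of a
    cloud platform, after subtracting any free-tier allowance."""
    billable = max(gb_moved - free_tier_gb, 0.0)
    return billable * rate_per_gb

# Syncing a 50 TB data store to another platform each month adds up quickly.
monthly_gb = 50_000  # 50 TB expressed in GB
cost = egress_cost(monthly_gb, rate_per_gb=0.09, free_tier_gb=100)
print(f"Monthly egress: ${cost:,.2f}")  # $4,491.00 at these assumed rates
```

At these assumed rates, repeatedly shuttling a large data store between platforms costs thousands of dollars a month, which is why most workloads stay put once the data lands.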
Ji noted that this is one reason we see only limited decentralization today. Most deployments simply run the same cloud platform across different locations.
“We normally see that 70% or 80% of the costs are spent on one cloud platform. This is the reality,” said Ji.
Another issue is cloud-native features. When you move from one cloud platform to another, “there’s a sacrifice. And that sacrifice is cloud-native features,” Ji added.
Containers help. However, Ji believes it will take some time before container-driven portability catches on.
So, what should an IT architect do if they want to decentralize? “Whenever we go into cloud migration or design for workload agility, you need four capabilities,” said Ji. According to Gartner, these are standardization, virtualization, automation, and instrumentation.
You need all four capabilities. “The key to infrastructure agility and portability is automation. But to build a foundation on automation, you need standardization and virtualization,” said Ji.
Source: Winston Thomas