CLOUD COMPUTING
The first decade of enterprise cloud adoption was largely defined by "lift and shift" – a tactical migration where legacy workloads were moved to virtualised environments to reduce hardware overhead.
Now, a new age is here: Cloud Native 2.0. This phase is no longer about where applications live, but how they are architected to exploit the fluid, distributed nature of modern infrastructure. Being cloud-native is essential for scaling Gen AI and autonomous operations; it cannot be overstated that Gen AI is incapable of scaling effectively on monolithic architectures. Legacy systems simply lack the inherent elasticity required to handle the massive, fluctuating compute demands of LLMs and autonomous agents.
Cloud-native principles provide the granular control necessary to deploy AI at the edge and in the core simultaneously. It is this modularity that allows organisations to treat their infrastructure as a dynamic fabric rather than a static foundation.
“By transitioning from cloud-native to AI-native, Huawei Cloud leads the industry with nonstop innovation”
Jacqueline Shi, President of Huawei Cloud Global Marketing & Sales Service, Huawei
84 April 2026