Edge computing and the importance of open infrastructure
Source: Ildiko Vancsa


The "edge" is diverse, dispersed, often independently owned and operated, and comes with a set of constraints not addressed in the average data center.

Old sci-fi films painted a picture of how computers would permeate every facet of life in the future. It has come to pass, and it happened almost without us noticing: having PCs at home became commonplace, our mobile phones turned into small smart devices, and our cars began making decisions for us, controlled by thousands of sensors and controllers.

Self-driving cars, augmented and virtual reality, smart homes and more all underscore our rapidly emerging dependence on distributed computing infrastructure. Personally, I experience "connection anxiety" when my devices can't access the compute, storage and network resources they need. And we're not just craving to be connected to the network; we also want more bandwidth, faster access, and more functionality. This demand for more, more, more from our network systems is a key driver of edge computing, the technology that brings us closer to this new future, parts of which seemed like an unrealistic vision not that long ago.

The next milestone on this journey is 5G. As telecommunication companies deploy 5G networks, they will give us higher bandwidth, ultra-low latency, and support for hundreds of additional connected devices, with better coverage than ever before. This introduces new and stricter requirements for computer networks than anything we have had to meet in previous generations of networks.

To meet these requirements, we need to move some of the computing capacity out of data centers and closer to the network edge, closer to where the action is. Open infrastructure will play a key role in making this happen, as multiple access layers in an onion-like architecture are ideally suited to an open, standards-based edge design approach. When we step outside the large, well-air-conditioned, well-equipped data center buildings, we are suddenly faced with many new challenges. Things like small footprint, low energy consumption, maintainability, and security become extremely important.

The "edge" is diverse, dispersed, often independently owned and operated, and comes with a set of constraints not addressed in the average data center. Let's face it: when you're operating on the edge, there likely won't be a guy wandering into your office/store/basement with a set of replacement parts and an operator's manual to fix anything that has gone wrong.

Also, in a centralized data center it can make sense to build or buy proprietary solutions and rely on public clouds, but in an edge computing scenario, leveraging an open infrastructure makes far more sense. And just as Linux has set baseline norms for how compute devices operate, we need to define similar norms for how edge systems are operated on open infrastructure.

The telco edge, retail edge, enterprise edge, and public edge all make up a collection of viable operating environments for edge workloads. And all of these applications running at the edge need to be able to run independently of the site itself, regardless of the architecture, the processor capacity, or the CPU manufacturer. An open infrastructure layer that creates a common abstraction, enables common deployment methodologies, and provides common interfaces and stable behavior is critical for the next stage of edge adoption.
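As a purely illustrative sketch (the class and method names below are assumptions for this article, not part of any real project's API), the kind of common abstraction described above can be pictured as a thin interface that an edge application codes against, while site-specific details such as CPU architecture or local deployment tooling stay behind the implementation:

    # Hypothetical sketch of a common abstraction layer for edge sites.
    # Class and method names are illustrative assumptions only.
    import platform
    from abc import ABC, abstractmethod


    class EdgeSite(ABC):
        """Common interface an edge workload codes against, regardless of site type."""

        @abstractmethod
        def deploy(self, image: str) -> None:
            """Deploy a workload image using the site's own tooling."""

        def describe(self) -> dict:
            # Same call on every site; the answer differs per site (x86, Arm, ...).
            return {"arch": platform.machine(), "os": platform.system()}


    class RetailEdgeSite(EdgeSite):
        def deploy(self, image: str) -> None:
            # A real site would call its local container or VM tooling here.
            print(f"deploying {image} on {self.describe()['arch']}")


    if __name__ == "__main__":
        site: EdgeSite = RetailEdgeSite()
        site.deploy("inventory-service:1.0")  # identical call on any site type

The application issues the same deploy() call whether the target is a telco, retail, or enterprise site; only the implementation behind the interface changes.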

Open infrastructure is not so much about packaging and deployment as it is about creating a consistent paradigm and environment for running workloads in whatever form best suits those applications. Many edge workloads today run on Linux or in VMs; they may evolve toward simplified lifecycle management, or they may be superseded by a next generation of applications.

The container ecosystem of products and technologies is critical to meeting the simplification and scale requirements of many edge applications, and open infrastructure is a provider and enabler of these types of ecosystems. Further, serverless is seen as a future method for simplifying how edge functions are executed; here, too, open infrastructure provides a common paradigm for the various serverless environments that will be required across our diverse IT ecosystem.
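To make the serverless idea a little more concrete, here is a minimal, hypothetical sketch of an edge function in Python. The handler signature and the event fields are assumptions made for illustration, since each serverless environment defines its own conventions; the point is simply that the function reacts to an event close to where it was produced and carries no knowledge of the site it runs on:

    # Hypothetical edge function; the handler signature and event fields are
    # illustrative assumptions, not a specific serverless framework's API.
    import json


    def handle(event: dict) -> dict:
        """React to a sensor reading near where it was produced."""
        reading = float(event.get("temperature_c", 0.0))
        action = "throttle" if reading > 75.0 else "ok"
        return {"statusCode": 200, "body": json.dumps({"action": action})}


    if __name__ == "__main__":
        # Local simulation of an event the platform would deliver at the edge.
        print(handle({"temperature_c": 82.5}))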

Does edge even matter to most developers, to the container ecosystem, or to the groundbreakers in serverless computing? Ideally, it only matters as much as they need it to. Edge should not be a massive disruption to how we go about writing software. It should provide some new opportunities for savvy engineers, but we don't want to re-engineer everything to accommodate a new deployment location.

No one technology will emerge alone to provide a complete solution for the edge. To solve this challenge, we as a technology industry need to collaborate and innovate across borders, leveraging the success and capability of each technology community to drive the simplification and evolution of computing toward the edge.
Learn more about edge and open infrastructure

Open infrastructure proponents from around the world will be gathering May 21-24 at the OpenStack Open Infrastructure Summit in Vancouver to share case studies and best practices and work together to solve integration challenges. The agenda includes notable speakers from ARM, AT&T, China Mobile, China Unicom, Google, Heptio, Hyper HQ, Oath Inc., Progressive Insurance, Target, Verizon, Walmart, and many more.

The summit is focused on helping users compose, integrate, and operate open infrastructure technologies to solve real problems at scale. In addition to sessions on OpenStack, there will be featured sessions on the newest project at the OpenStack Foundation—Kata Containers—and a strong focus on other open source technologies relevant to infrastructure operators, including Kubernetes, Docker, Ansible, Ceph, Istio, Envoy, Spinnaker, Tungsten Fabric (formerly OpenContrail), ONAP, OPNFV, and many more.

The full event agenda is organized by use cases, including edge computing; AI and machine learning; NFV; CI/CD; container infrastructure; and public, private, and multi-cloud strategies.

The edge computing track will feature a set of keynotes, presentations, panel discussions and working sessions to explore the challenges and find solutions in an open, collaborative environment. Join us and learn more about the use cases, requirements and design ideas in areas like telecom, retail, and manufacturing.


