Edge computing towards full autonomy





Edge computing is quickly losing its reputation as a fringe concept, and both users and vendors are now setting their sights on the technology’s next goal: fully autonomous implementation and operation.

The edge deployment experience is getting closer to the simplicity of unboxing a new mobile phone, says Teresa Tung, cloud first chief technologist at IT consulting firm Accenture. “We see automated technology that simplifies dealing with the unique complexity of the edge for application, network and security deployments.”

The ability to create and manage containerized applications enables seamless cloud development and deployment, with the edge simply becoming a specialized location with tighter resource constraints, Tung says. “Self-organizing and self-healing wireless mesh communication protocols, such as Zigbee, Z-Wave, ISA100.11a or WirelessHART, can create networks where devices can be deployed ad hoc and self-configured.”
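
To make that model concrete, the following is a minimal sketch using the official Kubernetes Python client to deploy a containerized application onto resource-constrained edge nodes. The container image and the "node-role.kubernetes.io/edge" node label are assumptions for illustration, not references to any product Tung mentions.

  # Minimal sketch: deploying a containerized app to resource-constrained edge
  # nodes with the official Kubernetes Python client. The image name and the
  # edge node label are illustrative assumptions.
  from kubernetes import client, config

  config.load_kube_config()  # or config.load_incluster_config() when run inside a cluster

  container = client.V1Container(
      name="edge-analytics",
      image="registry.example.com/edge-analytics:1.0",  # hypothetical image
      resources=client.V1ResourceRequirements(
          requests={"cpu": "100m", "memory": "128Mi"},   # modest requests for constrained hardware
          limits={"cpu": "250m", "memory": "256Mi"},     # hard ceiling so the app cannot starve the device
      ),
  )

  template = client.V1PodTemplateSpec(
      metadata=client.V1ObjectMeta(labels={"app": "edge-analytics"}),
      spec=client.V1PodSpec(
          containers=[container],
          node_selector={"node-role.kubernetes.io/edge": ""},  # schedule only onto nodes labeled as edge hosts
      ),
  )

  deployment = client.V1Deployment(
      api_version="apps/v1",
      kind="Deployment",
      metadata=client.V1ObjectMeta(name="edge-analytics"),
      spec=client.V1DeploymentSpec(
          replicas=1,
          selector=client.V1LabelSelector(match_labels={"app": "edge-analytics"}),
          template=template,
      ),
  )

  client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

Nothing in this workflow is edge-specific except the node label and the tighter resource limits, which is precisely the point of treating the edge as just another, more constrained deployment location.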

The decentralization of IT environments to include edge systems poses specific challenges, says Matteo Gallina, principal consultant at global technology research and consulting firm ISG. “Management of devices and services needs to be done outside the traditional realm of management, which includes managing physically inaccessible devices, a wide variety of solutions and operating systems, different security requirements, and more,” he says. “The larger and more widespread the systems become, the more important the role automation plays in ensuring effectiveness and reliability.”

Automation technology innovation led by open source communities

The trend towards automating edge deployments resembles the journey to AI, where innovations are led by open source groups, infrastructure manufacturers and cloud service providers, Tung says. She notes that open source communities, such as LF Edge, are leading the way in innovation and developing critical standard definitions in areas such as communications, security, and resource management.

“Infrastructure providers are creating solutions that allow computers to run anywhere and be embedded in anything,” Tung says. “It includes new hardware capabilities that are ultra-low power, ultra-fast, connected everywhere, and ultra-secure and private.” She adds, “5G opens up new opportunities for network equipment providers and telecom operators to innovate with both private and public networks with embedded edge compute capabilities.”

At the same time, innovations from cloud providers make it easier to extend centralized cloud DevOps and management practices to the edge. “Like [the] central cloud makes it easy for any developer to access services, we are now seeing the same thing happening for technologies such as 5G, robotics, digital twin and IoT,” says Tung.

Software-defined integration of multiple network services has emerged as the leading technology approach to automating edge deployments, says Ron Howell, managing network architect at Capgemini Americas. Network security built on Zero Trust deployment methods and SASE edge features can greatly enhance automation and simplify what it takes to deploy and monitor an edge computing solution. In addition, full-stack observability tools and methods that incorporate AIOps will, once implemented, help proactively keep data and edge computing resources available and reliable.

AI applied to the network edge is now widely seen as the leading way forward for keeping edge networks available. “AIOps, when used in the form of full-stack observability, is a significant improvement,” Howell says.
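
In practice, the observability piece often starts with something as simple as a statistical baseline over streaming telemetry. The following is a minimal, illustrative sketch of that idea, assuming a feed of latency samples from an edge site; the window size, threshold, and metric are assumptions rather than anything Howell or a specific vendor prescribes.

  # Minimal AIOps-style anomaly detection sketch: flag latency samples that
  # drift far from a rolling baseline. Window size and z-score threshold are
  # illustrative assumptions.
  from collections import deque
  from statistics import mean, stdev

  WINDOW = 60          # number of recent samples that form the baseline
  THRESHOLD = 3.0      # z-score beyond which a sample is flagged

  window = deque(maxlen=WINDOW)

  def check_latency(sample_ms: float) -> bool:
      """Return True if the sample looks anomalous against the rolling baseline."""
      anomalous = False
      if len(window) >= 10:                      # wait for a minimal baseline
          baseline = mean(window)
          spread = stdev(window) or 1e-6         # avoid division by zero on flat data
          anomalous = abs(sample_ms - baseline) / spread > THRESHOLD
      window.append(sample_ms)
      return anomalous

  # Example usage with a synthetic spike at the end
  for value in [20, 22, 19, 21, 20, 23, 22, 21, 20, 22, 250]:
      if check_latency(value):
          print(f"latency anomaly: {value} ms")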

Several options are already available to help organizations seeking edge autonomy. “These begin with the onboarding and management of physical and functional assets, and include automated software and security updates and automated device testing,” explains Gallina. If a device includes some form of ML or AI functionality, then AIOps is needed at two levels: at the device, to keep the local ML model up to date and ensure that the right decisions are made in every situation, and within any ML/AI backbone that may reside on premises or in centralized edge systems.
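
The device-level half of that loop can be sketched roughly as follows: the device periodically asks a model registry whether a newer model version exists and swaps it in only after a successful download, so it keeps serving the previous model on any failure. The registry URL and its response format are hypothetical placeholders for whatever ML/AI backbone an organization actually runs.

  # Illustrative sketch of keeping a local ML model up to date on an edge device.
  # The registry URL and its JSON response shape are hypothetical assumptions.
  import json
  import urllib.request
  from pathlib import Path

  REGISTRY_URL = "https://models.example.com/edge-vision/latest"  # hypothetical endpoint
  MODEL_DIR = Path("/var/lib/edge-app/models")
  CURRENT = MODEL_DIR / "current.onnx"
  VERSION_FILE = MODEL_DIR / "version.txt"

  def update_model() -> None:
      """Fetch model metadata and swap in a newer model only after a successful download."""
      with urllib.request.urlopen(REGISTRY_URL, timeout=10) as resp:
          meta = json.load(resp)                        # assumed shape: {"version": "...", "url": "..."}

      local_version = VERSION_FILE.read_text().strip() if VERSION_FILE.exists() else ""
      if meta["version"] == local_version:
          return                                        # already up to date

      staging = MODEL_DIR / f"model-{meta['version']}.onnx"
      urllib.request.urlretrieve(meta["url"], staging)  # download to a staging path first
      staging.replace(CURRENT)                          # atomic swap; old model stays if anything failed earlier
      VERSION_FILE.write_text(meta["version"])

  if __name__ == "__main__":
      MODEL_DIR.mkdir(parents=True, exist_ok=True)
      try:
          update_model()
      except Exception as exc:                          # keep serving the existing model on any failure
          print(f"model update skipped: {exc}")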

Physical and digital experiences converge on the edge

Tung uses the term “phygital” to describe the outcome when digital practices are applied to physical experiences, such as in the case of autonomous edge data center management. “We see creating highly personalized and adaptive phygital experiences as the ultimate goal,” she notes. “In a phygital world, anyone can imagine, build, and scale an experience.”

In an edge computing environment that integrates digital processes and physical devices, hands-on network management is significantly reduced or even eliminated: network failures and downtime are automatically detected and resolved, and configurations are applied consistently across the infrastructure, which makes scaling easier and faster.

Automatic data quality control is another potential benefit. “This involves a combination of sensor data, edge analytics, or natural language processing (NLP) to control the system and deliver data on site,” Gallina says. Yet another way in which an autonomous edge environment can benefit enterprises is with “zero touch” remote hardware provisioning at large scale, where the operating system and system software are automatically downloaded from the cloud.
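
A minimal illustration of such an on-site data quality gate might look like the sketch below, which checks sensor readings for missing fields, implausible values, and staleness before forwarding them. The field names and limits are assumptions chosen for the example.

  # Minimal sketch of automatic data quality control at the edge: validate sensor
  # readings before forwarding them. Field names and limits are illustrative.
  import time

  RANGES = {"temperature_c": (-40.0, 85.0), "humidity_pct": (0.0, 100.0)}
  MAX_AGE_S = 30  # readings older than this are considered stale

  def validate(reading: dict) -> list[str]:
      """Return a list of quality problems; an empty list means the reading can be forwarded."""
      problems = []
      for field, (low, high) in RANGES.items():
          value = reading.get(field)
          if value is None:
              problems.append(f"missing {field}")
          elif not (low <= value <= high):
              problems.append(f"{field} out of range: {value}")
      if time.time() - reading.get("timestamp", 0) > MAX_AGE_S:
          problems.append("stale reading")
      return problems

  reading = {"temperature_c": 21.5, "humidity_pct": 47.0, "timestamp": time.time()}
  issues = validate(reading)
  print("forward" if not issues else f"quarantine: {issues}")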

Gallina notes that a growing number of edge devices now come with dedicated operating systems and various other types of support tools. “Off-the-shelf edge applications and marketplaces are starting to become available, as are an increasing number of open source projects,” he says.

Providers are working on solutions to seamlessly manage edge assets of almost any type and with any underlying technology. Edge-oriented, open-source software projects, such as those hosted by the Linux Foundation, can further drive adoption at scale, Gallina says.

AI-optimized hardware is an emerging edge computing technology, Gallina says, with many products offering interoperability and resiliency. “Solutions and services for edge data collection — quality control, management, and analytics — are likely to expand dramatically in the coming years, just as cloud-native applications have done,” he adds.

Leaders in AI on Edge automation include IBM, ClearBlade, Verizon, hyperscalers

Numerous technologies are already available to companies considering edge automation, including offerings from hyperscalers and other specialist providers. One example is KubeEdge, an open source system that extends Kubernetes, the platform for automating the deployment, scaling, and management of containerized applications, to hosts at the edge.

Gallina notes that in 2021 ISG ranked system integrators Atos, Capgemini, Cognizant, Harman, IBM and Siemens as global leaders in AI on edge technology. Leading edge computing providers include the hyperscalers (AWS, Azure, Google), as well as edge platform providers ClearBlade and IBM. Verizon stands out in the telecom market.

Edge-specific features ensure autonomy and reliability

Vendors are building both digital and physical availability features into their offerings in an effort to make edge technology more autonomous and reliable. Providers generally use two methods to provide autonomy and reliability: internal sensors and redundant hardware components, Gallina says.

For example, embedded sensors can monitor the on-site environment, detecting and reporting anomalies, and can be combined with failover components to deliver the required level of redundancy.
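
A simplified sketch of that pattern, with a hypothetical temperature sensor read and a stand-in failover call, might look like this:

  # Simplified sketch of sensor-driven monitoring with failover. The sensor read
  # and the failover call are hypothetical placeholders for device-specific APIs.
  import random
  import time

  TEMP_LIMIT_C = 70.0   # illustrative threshold for an overheating edge enclosure

  def read_enclosure_temp() -> float:
      # Placeholder for a real sensor driver; here we fake a reading.
      return 45.0 + random.random() * 40.0

  def report_anomaly(message: str) -> None:
      print(f"ALERT: {message}")          # stand-in for a real alerting channel

  def fail_over_to_standby() -> None:
      print("promoting standby unit")     # stand-in for a vendor-specific failover call

  for _ in range(5):
      temp = read_enclosure_temp()
      if temp > TEMP_LIMIT_C:
          report_anomaly(f"enclosure temperature {temp:.1f} C exceeds limit")
          fail_over_to_standby()
          break
      time.sleep(1)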

Tung lists some other approaches:

  • Physical tamper-resistant features designed to protect devices from unauthorized access.
  • Secure identifiers built into chipsets that allow easy and reliable authentication of the devices.
  • Self-configuring network protocols, based on ad-hoc and mesh networks, to ensure connectivity where possible.
  • Partitioned boot configurations so that updates can be applied without the risk of blocking devices if the installation fails.
  • Hardware watchdog capabilities to ensure that devices automatically reboot if they become unresponsive (see the sketch after this list).
  • Boot-time integrity checking from a secure root of trust, protecting devices from malicious software installation.
  • Trusted compute and secure execution environments to ensure approved compute runs on protected and private data.
  • Anomaly detection firewalls that pick up unusual behavior, indicative of emerging errors or unauthorized access.
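
To make the hardware watchdog item above concrete, here is a minimal sketch against the standard Linux /dev/watchdog interface: the application keeps writing to the device only while its own health check passes, so a hung or failing process leads to an automatic hardware reset once the timer expires. The health check itself is a hypothetical placeholder.

  # Minimal sketch of the hardware watchdog pattern using the standard Linux
  # /dev/watchdog interface: keep writing to the device while healthy, and stop
  # if the health check fails so the hardware resets the device on its own.
  import time

  WATCHDOG_DEVICE = "/dev/watchdog"
  PET_INTERVAL_S = 10   # must be shorter than the watchdog's hardware timeout

  def application_is_healthy() -> bool:
      # Placeholder: in practice, check queues, heartbeats, sensor loops, etc.
      return True

  with open(WATCHDOG_DEVICE, "wb", buffering=0) as wd:
      while application_is_healthy():
          wd.write(b"\n")          # "pet" the watchdog to postpone the reset
          time.sleep(PET_INTERVAL_S)
      # Exiting without writing the magic close character ('V') leaves the timer
      # running, so the hardware reboots the device once the timeout expires.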

Self-optimization and AI

Networks require an almost endless number of configuration settings and fine-tuning to function efficiently. “Wi-Fi networks need to be adjusted for signal strength, firewalls need to be constantly updated to support new threat vectors, and edge routers need to constantly change configuration to enforce Service Level Agreements (SLAs),” says Patrick MeLampy, a Juniper Fellow at Juniper Networks. “Almost everything can be automated, saving human labor and human error.”

Self-optimization and AI are needed to operate at the edge and determine how to deal with change, Tung says. For example, what should happen if the network goes down, the power goes out or a camera is misaligned? And what should be done when the problem is solved? “The edge won’t scale if these situations require manual intervention every time,” she warns. Much of this troubleshooting can be handled by implementing rules that detect such conditions and prioritize application deployment accordingly.
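
One way to read that advice is as a small rule engine: each rule pairs a condition check with a remediation action, and the first matching rule is applied to a snapshot of the site’s status. The status fields and actions in the sketch below are illustrative assumptions, not a description of any specific product.

  # Minimal sketch of rule-based remediation for edge sites: each rule pairs a
  # condition check with an action. Status fields and actions are illustrative.

  def restart_uplink(site: dict) -> None:
      print(f"{site['name']}: restarting uplink")

  def recalibrate_camera(site: dict) -> None:
      print(f"{site['name']}: scheduling camera recalibration")

  def shed_low_priority_apps(site: dict) -> None:
      print(f"{site['name']}: redeploying only high-priority applications")

  RULES = [
      (lambda s: not s["network_up"], restart_uplink),
      (lambda s: s["camera_misaligned"], recalibrate_camera),
      (lambda s: s["on_battery"], shed_low_priority_apps),
  ]

  def remediate(site_status: dict) -> None:
      """Apply the first matching rule; fall back to human escalation if none match."""
      for condition, action in RULES:
          if condition(site_status):
              action(site_status)
              return
      print(f"{site_status['name']}: no rule matched, escalating to operations")

  remediate({"name": "store-042", "network_up": False,
             "camera_misaligned": False, "on_battery": False})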

Key takeaways

The edge is not a single technology, but a collection of technologies that work together to support an entirely new topology that can effortlessly connect data, AI and actions, Tung says. “The biggest innovations are yet to come,” she adds.

Meanwhile, the pendulum is swinging toward more but smaller network edge centers located closer to customer needs, complemented by larger cloud services that can handle additional workloads that are less time-sensitive, less mission-critical and less latency-sensitive, Howell says. He notes that the one factor that does not change is that information must be highly available at all times. “This first rule of data centers has not changed: high-quality services that are always available.”


Copyright © 2022 IDG Communications, Inc.



