
Edge AI: what, why and how with open source

Edge AI is transforming the way that devices interact with data centres, challenging organisations to stay up to speed with the latest innovations. From AI-powered healthcare instruments to autonomous vehicles, there are plenty of use cases that benefit from artificial intelligence on edge computing. This blog will dive into the topic, capturing key considerations when starting an edge AI project, main benefits, challenges and how open source fits into the picture.

What is Edge AI?

AI at the edge, or Edge AI, refers to the combination of artificial intelligence and edge computing: machine learning models are executed on interconnected edge devices, enabling them to make smarter decisions without always connecting to the cloud to process data. It is called edge because the machine learning model runs near the user rather than in a data centre.

Edge AI is growing in popularity as industries identify new use cases and opportunities to optimise their workflows, automate business processes or unlock new chances to innovate. Self-driving cars, wearable devices, security cameras, and smart home appliances are among the technologies that take advantage of edge AI capabilities to deliver information to users in real time, when it is most essential.

Benefits of edge AI

Nowadays, algorithms are capable of understanding different tasks such as text, sound or images. They are particularly useful in places occupied by end users with real-world problems. These AI applications would be impractical or even impossible to deploy in a centralised cloud or enterprise data centre due to issues related to latency, bandwidth and privacy.

Some of the most important benefits of edge AI are:

  • Real-time insights: Since data is analysed in real time, close to the user, edge AI enables real-time processing and reduces the time needed to complete activities and derive insights.
  • Cost savings: Depending on the use case, some data can often be processed at the edge where it is collected, so it doesn’t all have to be sent to the data centre for training the machine learning algorithms. This reduces the cost of storing the data, as well as training the model. At the same time, organisations often utilise edge AI to reduce the power consumption of the edge devices, by optimising the time they are on and off, which again leads to cost reduction.
  • High availability: Having a decentralised way of training and running the model enables organisations to ensure that their edge devices benefit from the model even if there is a problem within the data centre.
  • Privacy: Edge AI can analyse data in real time without exposing it to humans, protecting the appearance, voice or identity of the people and objects involved. For example, surveillance cameras do not need someone to watch them; instead, machine learning models send alerts depending on the use case or need.
  • Sustainability: Using edge AI to reduce the power consumption of edge devices doesn’t just minimise costs, it also enables organisations to become more sustainable. With edge AI, enterprises can avoid utilising their devices unless they are needed.
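
The bandwidth- and cost-saving points above come down to processing data where it is collected and uploading only what matters. The toy Python sketch below is purely illustrative; the function names and thresholds are hypothetical, not part of any product described here.

```python
# Illustrative edge-side filtering: analyse sensor readings locally and
# keep only the anomalous samples for upload to the data centre.
# Thresholds and names are hypothetical.

def filter_readings(readings, lower=10.0, upper=90.0):
    """Return only the readings outside the expected operating range."""
    return [r for r in readings if r < lower or r > upper]

readings = [42.0, 55.3, 97.1, 8.4, 60.0]
to_upload = filter_readings(readings)
print(to_upload)  # only the out-of-range samples leave the device
```

Here most samples never leave the device, which is where the storage, bandwidth and training-cost savings come from.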

Use cases in the industrial sector

Across verticals, enterprises are quickly developing and deploying edge AI models to address a wide variety of use cases. To get a better sense of the value that edge AI can deliver, let’s take a closer look at how it is being used in the industrial sector.

Industrial manufacturers operate large facilities that often use a significant number of devices. A survey fielded in the spring of 2023 by Arm found that edge computing and machine learning were among the top five technologies expected to have the most impact on manufacturing in the coming years. Edge AI use cases are often tied to the modernisation of existing manufacturing factories. They include production scheduling, quality inspection and asset maintenance, but applications go beyond that. Their main objective is to improve the efficiency and speed of automation tasks such as product assembly and quality control.

Some of the most prominent use cases of Edge AI in manufacturing include:

  • Real-time detection of defects as part of quality inspection processes that use deep neural networks for analysing product images. Often, this also enables predictive maintenance, helping manufacturers minimise the need to reactively fix their components by instead addressing potential issues preemptively. 
  • Execution of real-time production assembly tasks based on low-latency operations of industrial robots. 
  • Remote support of technicians on field tasks based on augmented reality (AR) and mixed reality (MR) devices.
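
As a rough illustration of the first use case, the sketch below flags a defective product image using a simple pixel statistic. A real quality-inspection system would run a deep neural network on the device; this stand-in, with hypothetical names and thresholds, only shows the shape of the on-device decision.

```python
# Illustrative on-device defect flagging: a crude statistic stands in
# for the deep neural network a real deployment would use.

def defect_score(pixels):
    """Fraction of pixels deviating strongly from the image mean."""
    mean = sum(pixels) / len(pixels)
    outliers = sum(1 for p in pixels if abs(p - mean) > 50)
    return outliers / len(pixels)

def is_defective(pixels, threshold=0.1):
    """Flag the part if too many pixels look anomalous."""
    return defect_score(pixels) > threshold

good = [128] * 95 + [130] * 5        # uniform surface
scratched = [128] * 80 + [250] * 20  # bright scratch region
print(is_defective(good), is_defective(scratched))
```

Because the decision happens next to the camera, a faulty part can be rejected from the line immediately instead of waiting on a round trip to a data centre.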

Low latency is the primary driver of edge AI in the industrial sector, but some use cases also benefit from improved security and privacy. For example, 3D printers can use edge AI to protect intellectual property, since designs are processed on the device rather than passing through a centralised cloud infrastructure.

Best practices for edge AI

Compared to other kinds of AI projects, running AI at the edge comes with a unique set of challenges. To maximise the value of edge AI and avoid common pitfalls, we recommend following these best practices:

  • Edge device: At the heart of Edge AI are the devices that run the models. They all have different architectures, features and dependencies. Ensure that the capabilities of your hardware align with the requirements of your AI model, and that the software – such as the operating system – is certified on the edge device.
  • Security: Both in the data centre and on the edge devices, there are artefacts that could compromise an organisation’s security. Whether it is the data used for training, the ML infrastructure used to develop and deploy the model, or the operating system of the edge device, organisations need to protect all of these artefacts. Take advantage of appropriate security capabilities to safeguard them, such as secure packages, secure boot of the OS on the edge device, or full-disk encryption on the device.
  • Machine learning model size: The size of the machine learning model varies with the use case, but it needs to fit on the device it is intended to run on. Developers therefore need to optimise the model size, as it dictates the chances of deploying it successfully.
  • Network connection: The machine learning lifecycle is an iterative process, so models need to be periodically updated. The network connection therefore influences both the data collection process and the model deployment capabilities. Organisations need to ensure there is a reliable network connection before deploying models or building an AI strategy.
  • Latency: Organisations often use edge AI for real-time processing, so the latency needs to be minimal. For example, retailers need instant alerts when fraud is detected and cannot ask customers to wait at the cashiers for minutes before confirming payment. Depending on the use case, latency needs to be assessed and considered when choosing the tooling and model update cadence.
  • Scalability: Scale is often limited by the bandwidth required to move and process information in the cloud, which leads to high costs. To enable broader scalability, data collection and part of the data processing should happen at the edge.
  • Remote management: Organisations often have multiple devices or multiple remote locations, so scaling to all of them brings new challenges related to their management. To address these challenges, ensure that you have mechanisms in place for easy, remote provisioning and automated updates.
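
On the model size point, post-training quantisation is one common way to shrink a model so it fits on a constrained device: 32-bit float weights become 8-bit integers plus a shared scale factor. The sketch below is purely illustrative and not tied to any particular framework; real toolchains apply the same idea per layer or per channel.

```python
# Illustrative symmetric post-training quantisation of model weights:
# float32 values are mapped to int8 plus one shared scale factor.

def quantise(weights):
    """Map float weights to int8 range [-127, 127] with a single scale."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantise(quantised, scale):
    """Approximate the original floats from the int8 values."""
    return [q * scale for q in quantised]

weights = [0.52, -1.27, 0.03, 0.98]
quantised, scale = quantise(weights)
restored = dequantise(quantised, scale)
# 4 bytes per float32 weight vs 1 byte per int8 weight (plus the scale)
print(len(weights) * 4, "bytes ->", len(quantised), "bytes")
```

The trade-off is a small loss of precision in exchange for roughly a 4x reduction in weight storage, which is often what makes deployment on the end device feasible.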

Edge AI with open source

Open source is at the centre of the artificial intelligence revolution, and open source solutions can provide an effective path to addressing many of the best practices described above. When it comes to edge devices, open source technology can be used to ensure the security, robustness and reliability of both the device and machine learning model. It gives organisations the flexibility to choose from a wide spectrum of tools and technologies, benefit from community support and quickly get started without a huge investment. Open source tooling is available across all layers of the stack, from the operating system that runs on the edge device, to the MLOps platform used for training, to the frameworks used to deploy the machine learning model.

Edge AI with Canonical

Canonical delivers a comprehensive AI stack with all the open source software organisations need for their edge AI projects.

Canonical offers an end-to-end MLOps solution that enables you to train your models. Charmed Kubeflow is the foundation of the solution, and it is seamlessly integrated with leading open source tooling such as MLflow for model registry or Spark for data streaming. It gives organisations the flexibility to develop their models on any cloud platform and any Kubernetes distribution, offering capabilities such as user management, security maintenance of the packages used, and managed services.

The operating system that the device runs plays an important role. Ubuntu Core is the distribution of the open source Ubuntu operating system dedicated to IoT devices. It has capabilities such as secure boot and full disk encryption to ensure the security of the device. For certain use cases, running a small cloud such as MicroCloud enables unattended edge clusters to leverage machine learning.

Packaging models as snaps makes them easy to maintain and update in production. Snaps offer a variety of benefits, including OTA updates, automatic rollback in case of failure and no-touch deployment. At the same time, brand stores are an ideal solution for managing the machine learning model’s lifecycle and for remote device management.
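
As a hedged illustration of what such packaging might look like, here is a minimal snapcraft.yaml sketch. The snap name, part and command are hypothetical, not a Canonical-provided recipe:

```yaml
# Hypothetical snapcraft.yaml sketch for packaging an inference
# service as a snap; names and paths are illustrative only.
name: edge-inference
version: '0.1'
summary: Example edge inference service
description: Runs a packaged ML model as a confined service.
base: core22
confinement: strict
grade: stable

parts:
  inference:
    plugin: python
    source: .

apps:
  inference:
    command: bin/serve-model
    daemon: simple
```

Declaring the app as a daemon with strict confinement is what gives the fleet the update, rollback and isolation behaviour described above.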

To learn more about Canonical’s edge AI solutions, get in touch.

Further reading

5 Edge Computing Examples You Should Know

How a real-time kernel reduces latency in telco edge clouds

MLOps Toolkit Explained

5 Edge Computing Examples You Should Know

Edge Computing Examples

In the fast-paced world of technology, innovation is the key to staying ahead of the curve. As businesses strive for efficiency, speed, and real-time data processing, the spotlight is increasingly turning towards edge computing. 

Edge computing represents a paradigm shift in the way data is processed and analysed. Unlike traditional cloud computing, which centralises data processing in distant data centres, edge computing brings the processing power closer to the source of data. This not only reduces latency but also opens up a world of possibilities for industries across the board.

In this blog, we’re excited to explore examples of this cutting-edge technology, with its diverse applications and use cases, with a special focus on how Canonical’s MicroCloud fits seamlessly into this transformative landscape.

Edge computing examples across industries

  • Smart cities and urban planning

Edge computing plays a pivotal role in the development of smart cities. By deploying edge devices such as sensors and cameras throughout urban environments, data can be processed locally to optimise traffic management, enhance public safety, and improve overall city infrastructure. Real-time analytics at the edge enable swift decision-making, leading to more efficient and responsive urban systems.

  • Healthcare and remote patient monitoring

The healthcare sector is leveraging edge computing to enhance patient care and streamline medical processes. Edge devices in healthcare facilities enable real-time monitoring of patients, ensuring timely intervention and reducing the need for extensive data transfer to centralised servers. In remote areas, edge computing facilitates telemedicine, providing access to healthcare services for those in underserved communities.

  • Industrial IoT for predictive maintenance

Edge computing is revolutionising industrial operations by enabling predictive maintenance through the Internet of Things (IoT). In manufacturing environments, sensors attached to machinery collect and analyse data locally. This allows for early detection of potential issues, minimising downtime and optimising maintenance schedules. The result is increased efficiency, reduced costs, and improved overall equipment effectiveness.
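
The local analysis described above can be as simple as comparing each new reading against a recent rolling average. The sketch below is a minimal, purely illustrative version; the window size, tolerance and names are hypothetical, not taken from any real deployment.

```python
# Illustrative local predictive-maintenance check: flag a machine when
# a vibration reading drifts well beyond its recent rolling average.
from collections import deque

def make_monitor(window=5, tolerance=3.0):
    """Return a checker that remembers the last `window` readings."""
    history = deque(maxlen=window)

    def check(reading):
        alert = (
            len(history) == history.maxlen
            and abs(reading - sum(history) / len(history)) > tolerance
        )
        history.append(reading)
        return alert

    return check

check = make_monitor()
readings = [1.0, 1.1, 0.9, 1.0, 1.2, 1.1, 6.5]  # final spike: bearing wear?
print([check(r) for r in readings])
```

Running this next to the sensor means the alert fires immediately, and only the alert, not the full stream of raw readings, needs to travel to a central system.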

  • Autonomous vehicles and enhanced safety

The automotive industry is embracing edge computing to power autonomous vehicles and enhance road safety. Edge devices onboard vehicles process data from numerous sensors, cameras, and lidar in real-time, enabling quick decision-making without relying on distant cloud servers. This low-latency approach is critical for the success and safety of autonomous driving systems.

  • Retail and personalised customer experiences

Edge computing transforms the retail experience by enabling personalised services and improving customer engagement. In-store cameras and sensors analyse customer behaviour, allowing retailers to offer targeted promotions and optimise inventory management. This real-time data processing at the edge enhances customer satisfaction and creates a more seamless shopping experience.

MicroCloud: a tailored solution for edge computing

In the dynamic landscape of edge computing, choosing the right solution is paramount. Canonical’s MicroCloud emerges as an ideal edge cloud solution, seamlessly aligning with the diverse edge computing examples presented. Offering a compact and efficient cloud infrastructure, MicroCloud is designed to deliver edge computing capabilities with a focus on simplicity, scalability, and reliability.

Key Features of MicroCloud

Compact Form Factor: MicroCloud’s compact form factor makes it suitable for deployment in diverse environments, from industrial settings to urban landscapes, ensuring that edge computing resources are readily available where they are needed the most.

Scalability: MicroCloud allows for easy scalability, accommodating the varying demands of edge computing applications. Whether it’s in a smart city deployment or an industrial automation setting, MicroCloud can scale to meet the evolving needs of the edge.

Reliability and Security: With a robust architecture, MicroCloud ensures the reliability and security of edge computing operations. Its design aligns with the stringent data security requirements of industries such as healthcare and telecommunications, providing a trustworthy foundation for critical applications.

A consolidated snapshot of key edge computing examples and trends

To delve deeper into the world of edge computing and its dynamic use cases, read more in our whitepaper, “Edge computing use cases across industries”. This whitepaper explores real-world examples, industry-specific applications, and the potential impact of edge computing on businesses and society.

As we navigate the ever-evolving technological landscape, understanding the practical applications of edge computing is crucial for businesses aiming to stay ahead. This whitepaper serves as a valuable resource for those seeking to harness the power of edge computing and unlock new possibilities in their respective industries.

Download the whitepaper.

Further reading

Adopting open-source Industrial IoT software

How a real-time kernel reduces latency in telco edge clouds

CTO’s guide to MicroCloud: Learn how to build your Edge strategy with MicroCloud

Meet Canonical at Mobile World Congress Barcelona 2024

The world’s largest and most influential telecommunications exhibition event, Mobile World Congress (MWC), is taking place in Barcelona on 26-29 February 2024. Canonical is excited to join this important annual event once again and meet the telecom industry. 

Telecommunications is a key sector for Canonical. We offer solutions for private, public and hybrid/multi cloud environments, with a consistent experience across the entire telecom spectrum, from core clouds to the edge, with a single set of tooling. Built with the same philosophy as Ubuntu – secure, trusted and production-grade open source backed by full operations support – our solutions are fully upstream and integrate the latest technology advancements that telco leaders require to deliver best-in-class services to their customers. 

We are looking forward to meeting you at MWC 2024. Come and speak with our experts to learn how we can help you in your journey to cost-effective, secure and trusted open source telecom solutions for your infrastructure.

Hot topics in telco

To meet today’s customer expectations, telecom operators require flexible, scalable and agile operations across the many service types that make up a modern mobile network.

At this year’s MWC event in Barcelona, Canonical’s team will explain how you can elevate your telecom infrastructure with the latest innovations in cloud-native technologies and modernise your telco clouds with open source. These strategies will empower you to meet and exceed customer expectations with repeatable and reliable deployments.

Automation at scale for telco edge clouds with CNEP

We have been listening to our telco customers to understand their needs in delivering cost-effective modern edge clouds for their infrastructure that they can rely on. Canonical is proud to offer a new holistic solution, Cloud Native Execution Platform (CNEP) to meet these needs precisely at telco edge clouds.

With CNEP, we deliver the ideal software stack for telco edge clouds with automation in place, based on fully upstream and CNCF certified Kubernetes running on bare metal hardware for best performance. It brings all essential open source components together, with the aim of achieving high performance in data processing and delivery, whilst ensuring platform security and efficiency with Ubuntu Pro.

At MWC, our team will explain how operators can achieve scalable and repeatable deployment of edge clouds with CNEP. For Open Radio Access Network (RAN) readiness, CNEP is the ideal RAN platform, bringing all the technology features that cloud-native Open RAN components require. CNEP is also tailored for best performance and security assurance for distributed compute and multi-access edge computing (MEC) applications, enabling businesses to run their telco workloads on 5G edge networks.

Real-time Ubuntu for ultra-reliable and low-latency communications

Canonical has been working with all major silicon hardware vendors, such as Intel, to deliver the highest levels of performance and security to telco networks and applications. 

We have been running advanced engineering programs with the aim of enabling the latest innovations in silicon hardware in telco software infrastructure at a rapid pace, with quick software release cycles. As part of our collaboration with Intel, we have integrated Intel FlexRAN in Ubuntu real-time kernel for telco applications and networking software, which has enabled real-time processing at both operating system and silicon levels.

At this year’s MWC, we will explain how Ubuntu Pro brings real-time data processing capabilities to the telco edge for mission-critical operations, and also ensures confidential computing for the most sensitive telco workloads.

Sustainable telco edge infrastructure with an energy-efficient system stack

Telecom networks will increasingly deploy edge cloud sites in the journey to distributed and flexible cloud-native operations. This requires support for several features across the hardware and software stack to make sure that platforms are energy and cost efficient. From container images to bare metal hardware automation, Canonical’s edge cloud stack is equipped with features that ensure sustainable operations.

In Barcelona, we will explain how our open source software stack can deliver optimal deployments on telco edge clouds and help operators meet their sustainability goals.

Demos

At MWC 2024, you will get the chance to see our technical team demonstrate Canonical’s CNEP solution. This is a great opportunity for all players in the telco ecosystem to see how we meet sector requirements on cloud-native operations at the telco edge with automation. In our demo, the Canonical team will run CNEP on Intel’s 4th Generation Xeon Scalable Processor, bringing the acceleration capabilities provided by Xeon to large-scale edge network rollout for cost-efficient Open RAN deployments.

CNEP’s open and upstream APIs, along with Canonical’s observability stack and telemetry solutions, enable machine learning algorithms to assist edge cloud operations. The Canonical team will demonstrate how our AI/ML platform solutions can be used to boost the effectiveness of distributed computing applications running on telco edge clouds. We will show how a multi-cloud data platform can be formed for various data types collected from a telecom network. We will also show ML-based anomaly detection and an LLM that summarises and explains data collected from the network.

Come and meet us at MWC 2024

If you are interested in building your own modern telecom infrastructure and migrating to open source with cost-effective, secure and trusted solutions, Canonical can help you. We provide a full stack for your telecom infrastructure, enabling secure, trusted, flexible, optimised, automated and low-touch operations.

To learn more about our telco solutions, meet us to discuss your telecom needs at MWC Barcelona 2024, and visit our webpage at ubuntu.com/telco.

If you’re unable to find a suitable time, please reach out to Mariam Tawakol <mariam.tawakol@canonical.com> or Jacob Boe <jacob.boe@canonical.com>. Let them know your availability and what you’re interested in, and they will set up a meeting for you.

Further reading

Canonical joins Open Networking Foundation

Fast and reliable telco edge clouds with Intel FlexRAN and Real-time Ubuntu

Bringing automation to telco edge clouds at scale

How telcos are building carrier-grade infrastructure using open source
