
Edge computing: a CTO’s guide to MicroClouds

Practical considerations for mapping a successful path to edge computing

February 2023
Table of contents

Key takeaways from this whitepaper
Understanding edge computing
Key advantages: business needs
Key challenges: constraints at scale
Introducing MicroClouds
Making MicroClouds work for you
What are MicroClouds?
How do MicroClouds solve the edge constraints and challenges?
Which use cases do MicroClouds enable?
Checklist for a winning edge strategy
1. Delivering sustainable economics
2. Addressing constraints
3. Future-proof the organisation
Adopting open source MicroClouds
The added value of open source MicroClouds
The Canonical MicroCloud
Conclusion
Resources

Key takeaways from this whitepaper
• Edge computing will transform almost any industry, from manufacturing to telco to in-shop customer experiences.

• Ecosystems provide better advantages than a single-vendor approach.

• The edge is the center of everything - the real world - closer to the consumer.

• Cutting costs, taking control and creating higher value are the main motivations.

• Edge computing is a topology. Edge technologies will evolve often, and quickly.

• A MicroCloud is a new class of infrastructure for on-demand computing at the edge. Edge computing splits into MicroClouds and IoT devices.

• MicroClouds are small clusters of compute nodes with local storage and secure
networking, optimised for repeatable remote deployments.

• MicroCloud architectures are designed to solve edge computing constraints:

– A stack of modular components providing the flexibility and scalability required to meet the elastic needs of edge applications;
– Low-touch, with few to no on-site operations and plug-and-play experiences, to match the lack of physical access at edge locations;
– Open and standard APIs provided by a modular stack, preferably based on non-proprietary solutions, to fit organisations’ long-term strategy;
– Abstract layers, agnostic to the execution substrate, to future-proof edge cloud architectures and deployments.

• MicroClouds are the common denominator of 5G in telecommunications, AI/ML in manufacturing, personalised digital experiences in retail, data-driven healthcare, predictive maintenance and distributed autonomous facilities.

• A winning edge strategy ensures sustainable economics, solves the technical constraints of the edge topology, and future-proofs the organisation.

• Open source MicroClouds are a powerful edge strategy.

Understanding edge computing
Gartner estimates that by 2025, 75% of data will be processed outside of the
traditional data center or cloud. This shift means that, to stay ahead of the
competition, technical business leaders need to define what their company’s
edge computing strategy is going to be.

This paper will explore the key considerations for enterprises and mobile service
providers looking to easily deploy and manage the lifecycle of distributed
MicroClouds - bare metal compute clusters of 3 to 100 off-the-shelf servers -
so as to meet not only your initial use-case requirements, but also to easily
cope with future use cases without having to retool or redeploy. Additionally,
this paper will guide you in creating a winning strategy to help your
organisation harness the power of edge computing.

What is your edge?

Terms like ‘edge’ have become unnecessarily overloaded within the infrastructure
space. ‘Edge’ is used to describe everything from embedded single-board
computers and internet-of-things (IoT) devices all the way to large
point-of-presence (PoP) clusters of data center class equipment. To find out
what your edge computing strategy will be, you first need to clearly define
your use cases.

Edge computing

Edge computing is essentially geographically distributed compute and
infrastructure resource pools, whether that means thousands of small clusters
at the back of retail shops, gas stations and micro branches, or bigger
20-server sites serving localised computing applications near their consumers.
Edge computing will transform almost any industry, from telco to manufacturing
to in-shop customer experiences. Each edge is unique, and there are as many
strategies as there are different use cases.

As a guiding principle, in their 2021 “Gartner Predicts” report, the expert analysts
advise enterprises to rely on “ecosystems over a single-vendor approach”. This
paper will explore a number of relevant aspects to building an ecosystem of
MicroClouds, tailored to your needs.

MicroClouds

MicroClouds trade the exponential scalability of public and private clouds for
the security, privacy, governance and low latency of decentralised edge
computing environments. They reproduce the APIs and primitives of the big
clouds at the scale of the edge: tiny clusters replicated over thousands of
edge sites. A MicroCloud strategy helps build an ecosystem of many
interconnected systems over a complex network.

Key advantages: business needs
The edge might sound far away, unrelated to everyday operations, yet in reality,
“the edge is the center of everything” [Wenjing Chu, LF Edge, March 2021]. The
edge of the network is where businesses operate, create value, and where users
consume data and services. In their 2021 Market Analysis Report on edge
computing, GVR analysts list industries that would hugely benefit from innovation
in edge computing: “healthcare, transportation, defense, energy, aviation,
manufacturing, mining, oil and gas, natural resources, telecommunications, and
utilities“. In short, any industry that generates and processes lots of data locally
would heavily benefit from edge computing innovations.

In telecoms, edge computing allows mobile network operators to get much closer
to their customers through localised content, targeted services and partner
promotions. Network services could compete with central clouds by leveraging an
ecosystem of localised MicroClouds. Facing an already increasing data deluge,
edge computing also saves bandwidth: the vast amounts of data generated by
connected devices can be stored and analysed locally rather than transferred
centrally.

The low latency and scalability of edge computing also suit environments
where demand may fluctuate – such as retail stores or medical facilities – and
where avoiding disruption to the end-user experience is critical.

The economic motivations behind using edge computing are clear:

• Cutting costs: Edge computing reduces the cost of maintaining high bandwidth
for transferring data to a central cloud. It also cuts down on central cloud
storage costs.

• Taking full control: Decentralised interconnected ecosystems provide
organisational resilience to cloud outages and control over data governance
and compliance.

• Creating high-value services: Low-latency edge sites provide enterprises with
the potential to support localised services of higher value, customising the
end-user experience and making the most of local data (e.g., sensors).
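To make the cost motivation concrete, the back-of-the-envelope sketch below compares shipping all raw data centrally against an edge-first approach that keeps most data local. All figures (site count, data volumes, per-GB prices) are invented for illustration; they are not taken from this paper or from any vendor price list.

```python
# Back-of-the-envelope comparison: centralised vs edge-first data handling.
# Every figure below is an illustrative assumption, not real pricing.

SITES = 200                          # number of edge locations
RAW_GB_PER_SITE_PER_DAY = 50.0       # raw data generated at each site
EDGE_REDUCTION = 0.95                # fraction summarised/kept locally
EGRESS_PER_GB = 0.08                 # assumed transfer cost, $/GB
CENTRAL_STORAGE_PER_GB_MONTH = 0.02  # assumed central storage cost, $/GB-month

def monthly_cost(fraction_sent: float) -> float:
    """Transfer plus central storage cost for one 30-day month."""
    gb_sent = SITES * RAW_GB_PER_SITE_PER_DAY * 30 * fraction_sent
    return gb_sent * (EGRESS_PER_GB + CENTRAL_STORAGE_PER_GB_MONTH)

central = monthly_cost(1.0)                # ship everything to the cloud
edge = monthly_cost(1.0 - EDGE_REDUCTION)  # ship only local summaries
print(f"central: ${central:,.0f}/month, edge-first: ${edge:,.0f}/month")
```

Note that this only models transfer and central storage; a full business case would also weigh edge hardware and operations costs against the savings.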

In manufacturing, different types of autonomous machines converge onto a single
framework. Data from sensors on these machines and other areas of production
influences what the robots do next. It also tells operators which machines will
soon need maintenance and analyses dangers in real time, protecting human
employees as they move around the site. As technical leads, we must consider
what happens if the site loses connection with the central hub. Downtime in
these production environments is not just an inconvenience: it can cost
millions and undermine the entire business. The latency and network costs
introduced by centralised compute also advocate for edge computing adoption.

Privacy, security and governance are key issues for many organisations – more so
for those in regulated industries. The European Gaia-X cloud project gives a good
example of such administrative concerns. Edge computing offers an opportunity
to gain independence from the largest tech providers while maintaining
governance over data.

Finally, edge computing architectures provide the local processing power
required to launch competitive cloud applications that are fully automated and
API-driven. For example, with an edge computing architecture in place, computer
vision quality control on production lines or remote telecommunications centers
can run without a technical expert on-site.

Key challenges: constraints at scale

Adopting an edge computing topology comes with key advantages. For many
industries, it will be decisive in their digital transformation as they seek to
transform the end-user experience. However, it needs to be executed against a
future-proof strategy. Edge computing is more than one piece of technology: it
is going to evolve quickly and often.

Here are the main challenges to consider when adopting an edge computing
strategy, at scale:

#1 Elastic needs

Fast evolution in technology and services, as well as a highly competitive
landscape, makes adapting to continuously changing needs a key challenge with
edge computing. Your needs today are likely to be very different from those of
tomorrow. Applications will evolve, and use cases and consumer needs will
change. You need an edge computing solution that stays flexible to your needs
over time.

One particular aspect of continuously changing needs is scale. You’re likely to
start with a few sites, maybe only a 4-node PoC site. The challenge arises when
there’s a need to replicate and scale to 100s or 1,000s of sites. You need a
solution that is easily scalable.

• Will your edge sites need to run different applications over time?

• Do you have a fixed use case or do you have elastic workloads that need scalability in
the underlying hardware as well as the software apps?

#2 Physical access

Since edge locations are by definition distributed, scattered, and sometimes
located in remote places, they can be hard or costly to access. If replacing a
component is too hard or too costly, failures will go unrepaired and the system
will become less redundant over time. You need a solution that is highly or
fully automated, with low to no onsite maintenance.

• Where will your edge nodes be physically located? In an air-cooled unit, a redundant
network data center, at the foot of an antenna on the top of a mountain...? Will
there be onsite IT technicians?

• Do you need high availability for all your workloads and data, or is keeping
hardware, maintenance and energy costs down a priority?

#3 Organisational fit

Edge deployments won’t replace all the past investments in cloud and private
data centers. An edge computing architecture needs to interface with this
existing infrastructure – whether that is public and private clouds or bare metal.
Because existing infrastructure, practices and policies are mostly centralised,
decentralised compute solutions with centralised management are much easier
to integrate. Similarly, open and standard APIs are required to match the
existing infrastructure.

• What is your deployment strategy? What rules will decide what workload runs where
and when? What services do you need to connect?

• What are your edge interfaces and types of connection (e.g. SCADA, EtherCAT,
Bluetooth and WiFi) and are they secure?

#4 Future-proofing

Being reactive to change in workloads or sizing needs is one thing (read #1),
future-proofing for evolutions in technology is another. Software and hardware
are constantly changing, and companies need to account for that. It is important
to avoid getting locked into a specific technology or vendor. An abstraction layer
can help mitigate the risks linked to re-tooling or migrating technology. As
opposed to IoT applications, a MicroCloud needs to be hardware agnostic, and
each layer with standard APIs protects your applications from external changes.

• What happens if the tooling you put in place today does not match the business need
in the future?

• Does the technology you have committed to have a solid engineering team behind it
and is it backed by long-term investment?

Introducing MicroClouds

The full compute continuum

Adopting a localised variant of data center technologies, or cloud
technologies, is a powerful strategy to overcome these challenges today. Not
only does it reuse technologies and skills organisations have already
internalised, but it also provides the modular and standard platform for
building an ecosystem of decentralised sites.

A MicroCloud is a new class of infrastructure for on-demand computing at the
edge. MicroClouds differ from IoT, which uses thousands of single machines or
sensors to gather data but does not perform computing tasks. Instead, a
MicroCloud reuses proven cloud primitives with the unattended, autonomous and
clustering features that resolve typical edge computing challenges.

What are the technical features of MicroClouds? How do they address the
constraints of edge computing? What kind of business use cases do they enable?

Making MicroClouds work for you


Put simply, a MicroCloud is a small cluster of compute nodes with local storage
and networking. It is a stack made of resilient, self-healing and opinionated
technologies that allow you to create repeatable deployments at the edge.

What are MicroClouds?


In essence, a MicroCloud is cloud-ish but small enough to be localised. Standard
and open APIs provide the same functionality you would expect from larger
clouds, and an abstraction layer separates the hardware from the software.
MicroClouds often live in remote locations on a variety of devices, and therefore
need small footprints, full automation, and remote management. MicroClouds
interconnect in an ecosystem, integrating with your existing public and private
cloud infrastructure.

Cloud-ish but localised:

• Automation
• API access
• Diverse substrates
• Economics
• 1000s of remote sites
• Small footprint
• No humans
• Remote deploy and manage
What are the technical features of a MicroCloud?

A typical MicroCloud technical stack looks something like this:

• Containers layer: standard with existing clusters, high availability, lightweight, automated updates
• Networking layer: underlay (physical), overlay (logical)
• Storage layer: reliable, scalable, easy to configure and use
• Virtualisation layer: scalable, clustering, lightweight, APIs
• Device orchestration / provisioning layer: fully automated, minimal overheads, API-driven

How do MicroClouds solve the edge constraints and challenges?
MicroClouds sound miniature but they can extend vertically to be as large as you
need (although, specialised private cloud solutions are usually considered above
100 nodes per site). In the form of an edge data center, a MicroCloud is closer to a
small private cloud. MicroClouds provide decentralised compute that runs cloud-
native software with centralised management. They can be at the back of a
supermarket, a store, a stadium, a branch office, at the base of a 5G antenna site,
or any other location near the consumers.

How do MicroClouds, these tiny localised clouds, solve the edge challenges?

#1 Flexibility and scalability

Needs evolve quickly and applications change. The MicroCloud stack is made for
flexibility and scalability. It’s a stack of modular components, each providing a
specialised function: networking, storage, virtual machines, containers and
orchestration. Swapping, adding or removing a component from the stack is an
easily automated day-2 operation.

The layered approach transforms the scaling issue into a highly customisable
parameter. Scaling can happen at every level: bare metal, virtual machines and
containers. On one hand, the middle layer of virtualisation, using virtual
machines and/or Linux containers, makes it easy to scale small Kubernetes
clusters up and down without having to worry about the bare metal
configuration. On the other hand, it enables you to seamlessly add or remove
physical nodes in the MicroCloud site without impacting container workloads.
All operations can be fully automated using the right software.
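As a toy illustration of this modularity, the sketch below models the stack as named layer slots whose implementations can be swapped independently. The layer slots follow the stack described earlier; the component names are invented placeholders, not actual products.

```python
# Toy model of a modular MicroCloud stack: each layer is a named slot whose
# implementation can be swapped without touching the others (a day-2 operation).
STACK_LAYERS = ["provisioning", "virtualisation", "storage",
                "networking", "containers"]

class MicroCloudStack:
    def __init__(self, components: dict[str, str]):
        missing = set(STACK_LAYERS) - components.keys()
        if missing:
            raise ValueError(f"unfilled layers: {sorted(missing)}")
        self.components = dict(components)

    def swap(self, layer: str, new_component: str) -> None:
        # Swapping one layer never touches the other layers.
        if layer not in STACK_LAYERS:
            raise KeyError(layer)
        self.components[layer] = new_component

site = MicroCloudStack({
    "provisioning": "bare-metal-provisioner",
    "virtualisation": "vm-and-container-host",
    "storage": "distributed-block-store",
    "networking": "software-defined-network",
    "containers": "lightweight-kubernetes",
})
site.swap("containers", "alternative-kubernetes")  # day-2 operation
```

Swapping the container layer leaves every other slot untouched, which is exactly the property that makes component swaps easy to automate.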

#2 Full automation

When designing your MicroCloud strategy and selecting the software for each
component of the stack, you must keep two goals in mind:

• Minimising on-site operations to the strict minimum required;

• Transforming remaining on-site operations into plug-and-play experiences.

A key constraint with edge computing is the often-restricted physical access to
MicroCloud locations and the high cost of having technical staff on-site.
Without automated provisioning and updates, the promise of edge computing
cannot become a viable and scalable reality. The MicroCloud approach is built
to enable full automation with low-touch components and a composable
architecture.

#3 Open and standard APIs

Since edge computing is the computing of the real - physical - world, the ‘edge’
will mean different things to different businesses. For this reason, the right
solution needs to be flexible, work with common APIs you are already using and
scale on demand. From the public clouds to remote devices located at the edge,
there is a wide spectrum of sizes and solutions. To be successful, an edge
deployment needs to be ‘composable’ — designed as a modular stack, preferably
based on non-proprietary solutions.

The MicroCloud approach was developed with open and standard APIs in mind,
aiming to fit existing enterprise infrastructure, practices and policies.

#4 Abstraction layer

The abstraction layer is crucial. It separates the day-to-day device
interactions from the main substrate (host environment). Your applications
become agnostic to the execution substrate, and your developers only have to
target one platform: a standard cloud environment. You can then choose to
deploy your applications on whichever substrate makes the most sense: public
clouds, private clouds or MicroClouds. That way your business apps are not
dependent on your substrate, and you can switch infrastructure without
affecting day-to-day operations.
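The idea can be sketched in a few lines of code: applications program against one substrate interface, and the concrete backend (public cloud, private cloud or MicroCloud) is chosen at deployment time. The class and method names below are invented for illustration and do not correspond to any real SDK.

```python
# Sketch of a substrate-agnostic deployment interface: applications target one
# API, and the substrate behind it is interchangeable. Names are illustrative.
from abc import ABC, abstractmethod

class Substrate(ABC):
    @abstractmethod
    def launch(self, image: str) -> str: ...

class PublicCloud(Substrate):
    def launch(self, image: str) -> str:
        return f"public-cloud instance running {image}"

class MicroCloud(Substrate):
    def launch(self, image: str) -> str:
        return f"edge-site instance running {image}"

def deploy(app_image: str, substrate: Substrate) -> str:
    # The application code never sees which substrate it runs on.
    return substrate.launch(app_image)

# The same application deploys unchanged to either substrate:
print(deploy("quality-control:v2", PublicCloud()))
print(deploy("quality-control:v2", MicroCloud()))
```

Because only the Substrate implementation changes, switching infrastructure is a deployment decision rather than a rewrite.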

Efficient abstraction layers future-proof your edge deployments and
infrastructures, reducing costs and risks. Edge technologies will improve,
better and cheaper hardware will replace faulty servers, and you need to take
that into account when building your edge infrastructure. With a MicroCloud
architecture, a disruption in edge technologies won’t mean having to restart
from scratch: any change will at most consist of replacing or upgrading pieces
of your infrastructure - not the whole.

Which use cases do MicroClouds enable?
Instead of developing applications for every possible technology platform and
device, with an open source abstraction layer you can develop an application just
once and make it available everywhere. The MicroCloud stack is a powerful
abstraction layer to superset any edge hardware with cloud-native APIs. Any
cloud application can then be scaled to the edge.

As shown in the previous section, the technical advantages are clear. However,
building the business case for MicroClouds will depend on your use cases and
the value of committing to the edge - particularly in areas where innovation is
desperately needed to improve business operations or customer experiences.
There are plenty of use cases out there, but remote locations, or places where
IT is harder to manage, will clearly benefit the most from MicroClouds.

Telecoms

The mobile sector is possibly one of the most mature sectors for MicroClouds.
The vast amount of information generated every day runs into billions of data
points. Even with highly sophisticated centralised systems, it can take time to
develop and launch new customer services. Those that cannot respond quickly
enough concede the competitive advantage to others. With MicroClouds, telcos
can use the data gathered locally to offer customers new services and promotions
in near real-time. Combined with 5G, there is an opportunity to enable a whole
set of edge applications that would not have been possible without hyper-low
latency and high bandwidth.

Manufacturing

Manufacturing is undergoing a huge digital transformation as part of the Fourth
Industrial Revolution. From a commercial perspective, data is either secret or
highly sensitive, so companies need to control governance at individual
production sites. MicroClouds support the private networks that enable IoT,
robots and sensors in these shared operating spaces, and even ‘dark factories’
that are fully automated. With the rise of machine learning applications and
digital twin models - simulating operations and systems in real time - complex
computing tasks will continuously happen at the core of factories. MicroClouds
not only run these workloads today, but are flexible enough to host those of
tomorrow, leveraging GPUs or specialised hardware.

Retail

The industry is faced with the ‘innovate or die’ inflection point. While some
functions, like R&D, may continue to take place centrally, much of the innovation
required needs to be close to the customer. Amazon Go is a good example. This
retail concept relies on sensors, computer vision and deep learning to track
people and goods in stores. As a result, customers can simply walk in, pick the
products they want and walk out. No tills, no cards, no queues. This requires
the low latency and 100% uptime of MicroClouds, as well as privacy and
governance.

Healthcare

Hospitals depend on accurate and timely patient data to make potentially
life-saving decisions. If a central site goes down, or the connection to a
public cloud is lost due to a power outage, serious issues can follow.
MicroClouds with hyper-low latency and high availability ensure that there is
no disruption to clinical workflows. Not only is data quickly and easily
available to clinicians, but individual sites can also perform their own
advanced population health analytics on-site to shape their community services.
Being able to process medical data on-site and carefully select what is sent
over the network will guarantee privacy.

Utilities

Utility companies depend on accurate forecasting of energy demand and
production. With facilities located remotely, it is difficult to monitor
real-time data to manage peak electricity demand and allocate resources.
MicroClouds, working in cohesion with on-site IoT devices, ensure that the
compute is done in real time, even offline, at hyper-low latency. Resource
allocation can be automated depending on the energy demand and forecast
renewable energy production. Similar to the IT and OT convergence happening in
manufacturing, advanced data analysis will help reduce costs and optimise
resources. With facilities monitored in real time, data-driven prediction of
maintenance issues becomes possible.
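The pattern shared by the healthcare and utilities examples - analyse readings on-site and forward only what matters - can be sketched as a simple local summariser. The threshold and readings below are invented for illustration.

```python
# Edge-side filtering sketch: analyse sensor readings locally and forward only
# aggregates and anomalies centrally, saving bandwidth and keeping raw data
# on-site. The threshold and data are illustrative assumptions.
from statistics import mean

ANOMALY_THRESHOLD = 90.0  # e.g. a grid-load percentage considered abnormal

def summarise_on_site(readings: list[float]) -> dict:
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "anomalies": anomalies,  # only these details leave the site
    }

raw = [71.2, 68.9, 93.5, 70.1, 95.0, 69.4]  # raw data stays local
payload = summarise_on_site(raw)            # this is what gets forwarded
print(payload)
```

Only the small summary payload crosses the network; the raw stream never leaves the site, which is what delivers both the bandwidth savings and the privacy guarantee.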

The list of industries that could benefit from MicroClouds at the edge is endless
— from Agriculture to Education to Gaming. While examples are useful to
understand the principles of the MicroCloud, each use case will be unique. Once
the use case gets clearer, it becomes critical to ask the right questions and to start
creating a winning edge strategy.

MicroClouds for the edge

MicroClouds are cloud-ish but localised:
• Reusing proven cloud primitives
• Modular MicroClouds, API integration with existing solutions
• Fully automated, centrally managed, with low to no onsite operations

MicroCloud for telco near a 5G antenna site:
• Lightweight to run on small servers constrained by real estate
• API-driven for providing on-demand compute and services
• Distributed workloads, serving compute near the consumer

MicroCloud for industrial use inside a factory:
• Scalable and low latency HA cluster for critical industrial applications
• Local data processing for more governance and less network costs
• Compute power at the factory core for AI and digital twins applications

MicroCloud for retail at the back of a supermarket:
• Reliable, resilient to cloud outages to prevent downtimes
• Predictions with data analytics and inferences at the edge
• Running on low cost small devices

MicroCloud for businesses in branch office locations:
• Low-touch, self-healing, few to no human interactions
• Central remote management for corporate policies
• On-demand K8s clusters, virtual machines, containers...

MicroCloud for utilities at power grids:
• Real time data analysis for automated resource allocation
• Fully automated and resilient for remote site locations
• Decentralised data-driven predictive maintenance
Checklist for a winning edge strategy
The decisions you make at the beginning of your edge computing journey will
influence the outcome for your organisation for years to come. When drafting
your own edge strategy, there are three core areas to consider.

1. Delivering sustainable economics

Does automation make your operations lower-touch and lower-cost?
Do you make sure to reuse internalised skills and technologies?
Are you applying automation to infrastructure and app management?
Could you accommodate a variety of app requirements without increasing the local footprint?
Are you optimising resources for app usage to overcome scarcity?
Are your deployments repeatable to scale across hundreds or thousands of sites?

2. Addressing constraints

Do you have enough resiliency and self-healing capabilities so it is easy to recover a device with no impact on the overlay?
Are you minimising overhead and management costs through automation?
Do you balance centralised management with decentralised compute power via device clustering and microservices?
Do you have tooling in place to manage distributed applications at scale?

3. Future-proof the organisation

Have you selected an open source pathway to avoid vendor lock-in, increase choice and select different tech when needs change?
Do your deployments provide cloud capabilities – from on-demand virtual machines to containers and Kubernetes clusters?
Did you make sure to get all your open source technology from a trusted vendor invested in the long term?
Could you benefit from the guidance of experts in supporting the transition?

Following these strategic principles will help ensure that the decisions you take
now about edge computing actually add value to your organisation. Instead of
overcommitting to a single form of technology, establishing MicroClouds with an
open source abstraction layer will give you more freedom in the future.

Adopting open source MicroClouds

The added value of open source MicroClouds

MicroClouds running on open source technologies give you technical and business
advantages over proprietary edge computing solutions.

• Reusability of known and proven technology
– No need to waste time and money training people for new skills
– Open source technologies are already the core of cloud computing
– Free software makes training more accessible and cost-effective

• Same features and benefits of central clouds
– Open technologies and standards remove vendor lock-in
– CNCF open source projects are a guarantee of quality and sustainability
– Central clouds are based on open source technologies

• Future-proof and secure
– Open source blends components based on preferred tech and in-house skills
– Security can be improved with reliance on a recognised open source vendor
– Control over data privacy and governance is retained by your organisation

Open source MicroClouds backed by commercial support enable enterprises to be
confident in their edge strategy in the long run.

The Canonical MicroCloud

[Stack diagram: Kubernetes container workloads and VM workloads running on an
LXD cluster with MicroCeph and MicroOVN, on top of Canonical Ubuntu Server or
Ubuntu Core on each node]

Canonical’s approach to MicroClouds for the edge is 100% based on open source
software, with the ability to provide up to 10-year security maintenance on the
entire stack with long-term supported (LTS) versions, and commercial 24/7
enterprise support with competitive response times. Additionally, as a trusted
open source partner, Canonical will guide you through your first site
installation and the steps to fully automate further deployments.

Canonical’s MicroClouds are - by default - made of three open source
components: LXD, MicroCeph and MicroOVN. These components have been selected
for their lightweight, low-touch and self-healing capabilities. MicroClouds can
be deployed on both standard Ubuntu Server and, for an even more lightweight
solution, Ubuntu Core. The Ubuntu operating system ensures a unified experience
and consistent quality throughout the entire stack, from Ubuntu hosts to Ubuntu
LTS containers.
Default MicroCloud components

LXD

[Diagram: the LXD CLI, Juju, Ansible, OpenNebula and other RESTful apps drive
the LXD REST API, which manages LXD hosts (LXC and QEMU on each host’s kernel)
across Hosts A, B, C, D, ...]

LXD is a modern infrastructure tool that has everything you need to run your
virtualised workloads. In addition to regular VMs, users can run their
workloads using system containers that behave similarly but consume fewer
resources while providing bare-metal performance. LXD is image-based, with
pre-made images available for a wide number of Linux distributions, and is
built around a very powerful, yet pretty simple, REST API.

MicroClouds require API-driven components that can be fully controlled and
automated remotely. The core of LXD is a privileged daemon which exposes a
REST API over a local unix socket as well as over the network (if enabled).

LXD clusters are:

• secure by design (unprivileged containers, resource restrictions)
• scalable (with built-in high availability from three nodes)
• intuitive (simple, clear API and crisp command line experience)

LXD also features advanced resource control (CPU, memory, network I/O, block
I/O, disk usage and kernel resources), device passthrough (USB, GPU, unix
character and block devices, NICs, disks and paths), and network and storage
management with support for many configurations and backends.

LXD’s flexibility, security and reliability make it an ideal choice for the
virtualisation layer of a MicroCloud cluster: abstracting the physical layer
with powerful APIs.

Ceph

Clouds of all sizes require some amount of persistent storage, be it for
machine images, scratch space for edge region processing, as a buffer before
forwarding to a centralised location, or just for localised assets.

Canonical Ceph can be deployed in clouds of any scale, and can be easily
configured to meet data durability needs, ensuring that inevitable hardware
failure does not lead to downtime or, worse, data loss.

Ceph’s flexible nature allows it to solve multiple access requirements. Block,
file and object use cases can be met by a single cluster and, if required,
Ceph supports replication between geographically dispersed clusters.

To conform with the simplicity and automation aspects of MicroClouds, Canonical
utilises MicroCeph - a snap-based, lightweight way of deploying a Ceph cluster.
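Because components such as LXD expose their REST API over a local unix socket, central automation can drive them with nothing more than a standard-library HTTP client. The sketch below assumes the usual snap socket path, which may differ on other installs, and only attempts a live query when that socket actually exists.

```python
# Sketch: querying LXD's REST API over its local unix socket using only the
# Python standard library. The socket path is the usual snap location and is
# an assumption; adjust it for other installation methods.
import http.client
import json
import os
import socket

LXD_SOCKET = "/var/snap/lxd/common/lxd/unix.socket"

class UnixHTTPConnection(http.client.HTTPConnection):
    """An HTTPConnection that dials a unix socket instead of TCP."""

    def __init__(self, socket_path: str):
        super().__init__("localhost")
        self.socket_path = socket_path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.socket_path)

def lxd_get(endpoint: str, socket_path: str = LXD_SOCKET) -> dict:
    """Perform a GET against the API and decode the JSON envelope."""
    conn = UnixHTTPConnection(socket_path)
    try:
        conn.request("GET", endpoint)
        return json.loads(conn.getresponse().read())
    finally:
        conn.close()

if os.path.exists(LXD_SOCKET):
    # /1.0 is the API root; its metadata describes the server.
    print(lxd_get("/1.0").get("metadata", {}))
else:
    print("no local LXD socket found; sketch only")
```

The same pattern works for any API-driven component in the stack: once the daemon is reachable, every operation a human would perform on-site can be scripted remotely.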
OVN

Networking is the third main ingredient of a cloud of any size. MicroCloud networking can be configured in several ways, but in order to take advantage of SDN (software-defined networking), OVN is included by default. In general, SDN is a network architecture model that adds a level of abstraction to the functions of network nodes (switches, routers, bare metal servers, etc.), to manage them globally and coherently.

OVN is a trusted open-source SDN project that provides virtual network abstractions, including many virtual network features ranging from layer 2 to higher-level network services such as DHCP and DNS. With OVN, MicroCloud users have sufficient flexibility to configure their network in the way that best suits their use cases, and to connect their edge sites seamlessly with the rest of their infrastructure.

Automated deployment

To simplify the deployment process, while keeping it secure and replicable, Canonical uses the power of snaps: app packages that are easy to install, secure, cross-platform and dependency-free.

The deployment is driven by a MicroCloud snap that automatically configures LXD, Ceph and OVN across a set of servers. When initialised, MicroCloud will detect all the servers in the network and drive the LXD, MicroCeph and MicroOVN snaps to configure a cluster. It will then prompt to add disks to Ceph, and users will have a working MicroCloud ready to use.

For an even easier approach, MicroCloud is available as an Ubuntu Core appliance.
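The snap-based deployment flow described above can be sketched as a small automation outline. The snap names and the `microcloud init` command come from this document, but a real deployment is interactive and version-dependent, so treat this as an illustration rather than a drop-in installer; the `runner` parameter is an assumption introduced here so the flow can be dry-run without snapd.

```python
import subprocess

# Component snaps named in the deployment flow above.
MICROCLOUD_SNAPS = ["lxd", "microceph", "microovn", "microcloud"]


def install_snaps(runner=subprocess.run):
    """Install the MicroCloud component snaps on a node.

    `runner` is injectable so the flow can be recorded or dry-run
    instead of actually invoking snapd."""
    commands = [["snap", "install", name] for name in MICROCLOUD_SNAPS]
    for command in commands:
        runner(command, check=True)
    return commands


def initialise_cluster(runner=subprocess.run):
    """Run 'microcloud init' on the bootstrap node.

    The command discovers the other servers on the network, drives LXD,
    MicroCeph and MicroOVN into a cluster, and prompts for the disks to
    hand over to Ceph."""
    return runner(["microcloud", "init"], check=True)


# Dry run that only records the commands instead of executing them:
recorded = []
install_snaps(runner=lambda cmd, check: recorded.append(cmd))
```

After `microcloud init` completes on real hardware, the result is the working MicroCloud described above, with no per-component manual configuration.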

Optional MicroCloud components

MAAS

Depending on the use case, MicroClouds can require several or hundreds of physical servers, and different MicroClouds may be spread over various locations. Upgrades, replacements, inventory and installations are all very real problems to solve, and they quickly become expensive operationally.

Metal as a Service (MAAS) solves these problems by providing a way to flexibly deploy, manage and maintain operating system loads on physical servers. It keeps track of all servers and their configurations and verifies that the server and operating system substrate are healthy. MAAS also maximises the available resources (server real estate) by providing maximum reuse and flexibility.

MAAS is not included by default, but it can easily be integrated with the solution when needed. MAAS will then handle the bare metal provisioning of the servers, while the automation tooling of choice can be used to deploy the MicroCloud on top of the provisioned servers.

MicroK8s

MicroK8s is a low-ops, minimal production Kubernetes for devs, cloud, clusters, workstations, edge and IoT.

Although MicroK8s is small, it features the K8s APIs, none added or removed, with sensible defaults that ‘just work’. A quick install, easy upgrades and great security make it perfect for such deployments.

Additionally, MicroK8s features self-healing high-availability clusters. MicroK8s automatically chooses the best nodes for the Kubernetes datastore, and when you lose a cluster database node, another node is promoted. No admin needed for your bulletproof edge.

On a daily basis, we see how every industry is different. Every potential use case is different. Every technical scenario is different. We believe the answer to what a strong and sustainable edge strategy should look like is MicroClouds. Our approach is to design a simple, lightweight, automated solution built on the best of open source, that can be easily expanded and integrated with other products.

[Figure: “Cloud of all sizes: edge computing vs cloud computing”. A spectrum from public clouds and private clouds, through small and tiny private clouds (MicroClouds), to IoT, with device size shrinking and the number of devices growing towards the edge.]

Conclusion
MicroClouds are the culmination of advances in cloud computing technologies
and hardware improvements in terms of cost, size, and reliability. They are
localised, tiny versions of cloud platforms, easily replicable at scale. An ecosystem
of MicroClouds is a powerful strategy to trade some of the public and private
clouds’ elasticity for the security, privacy, governance, and low latency needed by
edge applications.

A wide range of sectors and activities will benefit from MicroClouds, whether by optimising existing facilities, creating new customer experiences, or even disrupting an entire industry. The journey runs from R&D, which can draw on the near-infinite flexibility and advanced services of public clouds, to production applications running on MicroClouds next to the consumer, making their experience seamless. The advantage across all of these cases is that MicroClouds can reuse known APIs to enable the standardisation of applications while maintaining flexibility and industry-specific optimisations.

For many organisations, edge computing (and MicroClouds) will be new. Open source can be a concern if no one is monitoring it for vulnerabilities. Canonical’s MicroCloud stack comes with long-term enterprise support, enabling you to innovate at the edge while improving security and governance through a trusted provider.

To find out how, visit ubuntu.com/edge or get in touch.

Resources
• microcloud.is

• ubuntu.com/edge

• ubuntu.com/lxd

• ubuntu.com/ceph

• maas.io/

• microk8s.io/

• ubuntu.com/telco

• ubuntu.com/contact-us

• ubuntu.com/engage/introduction-to-micro-clouds

• ubuntu.com/engage/edge-infrastructure

© Canonical Limited 2023. Ubuntu, Kubuntu, Canonical and their associated logos are the registered trademarks of
Canonical Ltd. All other trademarks are the properties of their respective owners. Any information referred to in
this document may change without notice and Canonical will not be held responsible for any such changes.
Canonical Limited, Registered in Isle of Man, Company number 110334C, Registered Office: 2nd Floor, Clarendon
House, Victoria Street, Douglas IM1 2LN, Isle of Man, VAT Registration: GB 003 2322 47
