Technology Trend of Edge AI

Yen-Lin Lee, Pei-Kuei Tsung, and Max Wu
MediaTek Inc.
Abstract — Artificial intelligence (AI), defined as intelligence exhibited by machines, has many applications in today's society, including robotics, mobile devices, smart transportation, healthcare services, and more. Recently, many AI investments have been launched by both large companies and startups. Besides cloud-based solutions, AI on edge devices (Edge AI) offers the advantages of rapid response with low latency, high privacy, greater robustness, and more efficient use of network bandwidth. To enable Edge AI, new embedded system technologies are required, including machine learning, neural network acceleration and reduction, and heterogeneous run-time mechanisms. This paper introduces the challenges and technology trends of Edge AI. In addition, it illustrates Edge AI solutions from MediaTek, including the dedicated AI processing unit (APU) and NeuroPilot technology, which provide superior Edge AI capability in a wide range of applications.

Fig. 1. Edge AI opportunities different from cloud-based AI. (Key requirements: latency, efficiency, availability, privacy. Example applications: ADAS, drone, smart glasses, AR/VR, smart camera, home assistant, sensors.)

Fig. 2. Comparison of processors for AI processing. (CPU: control, serial computing; GPU: graphics, parallel computing; DSP (VPU): signal processing; deep learning accelerator (DLA): special purpose. Flexibility decreases and efficiency increases from CPU to DLA.)
I. INTRODUCTION

In recent years, artificial intelligence (AI) has appeared in every technology field. From home electronics to complex simulation experiments for protein structure, AI or machine learning has been deployed to enhance the quality of computation and to create possibilities for new applications, such as face unlock on mobile phones or autonomous driving. However, high-performance machine learning or deep learning requires huge computational capability to deal with complex training and inference methodologies and large datasets [1]. That is, in order to satisfy this computational demand, cloud servers have to provide very powerful computational capabilities. Hence, more and more non-traditional alternative solutions have appeared in recent years to execute AI computation tasks effectively. For example, Google provides the tensor processing unit (TPU) as a specialized computing unit for AI processing tasks [2]. NVIDIA has also introduced new GPU server architectures tailored to AI workloads [3].
The cloud-based ecosystem has demonstrated itself as a practical platform for serving some AI applications. However, the cloud-based solution has many limitations that might prevent its adoption for all AI applications. Taking autonomous driving as an example, the robustness of the connection and the latency from the server seriously impact the safety of the vehicle because of the time-to-collision constraint. In addition, uploading personal information or recorded street-view video to the cloud raises privacy issues. Furthermore, internet connectivity is not available everywhere at all times. These issues lead to the requirement that AI computation must run on the edge devices (Edge AI). In this paper, the design challenges and technology trends of Edge AI are discussed, and how MediaTek overcomes the challenges stated above is also introduced. By developing the dedicated AI processing unit (APU) and the NeuroPilot technology, MediaTek provides a production-ready solution for Edge AI.
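To make the time-to-collision argument concrete, consider a rough back-of-the-envelope calculation; the numbers below are illustrative assumptions, not figures from this paper:

\[
v = 100~\text{km/h} \approx 27.8~\text{m/s}, \qquad
d = v \cdot t_{\text{RTT}} \approx 27.8~\text{m/s} \times 0.1~\text{s} \approx 2.8~\text{m}
\]

Under this assumption, a 100 ms round trip to a cloud server costs almost three meters of travel before the vehicle can even begin to react, which is why on-device inference is attractive for safety-critical, latency-bounded applications.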
The rest of the paper is organized as follows. First, the design challenges and current technology progress for Edge AI are illustrated in Section II. Then, MediaTek's approaches for Edge AI are discussed in Section III. Finally, Section IV concludes this paper.

II. EDGE AI DESIGN CHALLENGES AND TECHNOLOGIES

Figure 1 describes the opportunities and key requirements of Edge AI computing compared with the challenges of cloud-based AI frameworks. Different applications have different critical requirements, including latency, efficiency, availability, and privacy. For vehicles or drones, the moving speed limits the latency tolerance of the response; otherwise, a crash or accident will happen. Home assistant systems or devices always bring privacy concerns because the processed content touches on personal information. Furthermore, power efficiency is extremely important for Edge AI devices, especially wearables, in order to achieve longer usage time. All these needs make Edge AI computing necessary and have brought it to the forefront. However, Edge AI also has its own design challenges that need to be addressed:
A. Power Efficiency and Different Types of AI Processors

The first and most important challenge is that edge devices have to provide enough computational capacity within specific limitations, such as thermal budget or form-factor size. Because of these limitations, Edge AI prefers to focus on the inference part and to leave the training stage in the cloud, as is common practice.
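As a minimal sketch of this split workflow (training in the cloud, then deploying a compact model for on-device inference), the example below uses the public TensorFlow Lite tooling purely for illustration; the model path, file names, and the choice of TensorFlow Lite rather than any vendor-specific SDK such as NeuroPilot are assumptions, not details from this paper.

# Illustrative sketch: convert a cloud-trained model for on-device inference.
# TensorFlow Lite is used here only as a generic example of edge deployment;
# "saved_model_dir" and "model.tflite" are hypothetical names.
import numpy as np
import tensorflow as tf

# 1) Offline, in the cloud: convert the trained model to a compact format,
#    letting the converter apply its default size/latency optimizations
#    (e.g., post-training quantization).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# 2) On the edge device: load the compact model and run inference only.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Hypothetical all-zero input tensor matching the model's expected shape.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])

In a real deployment the interpreter step would typically run on whichever on-device processor the runtime selects, which is exactly where the processor trade-offs summarized in Fig. 2 come into play.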
The inference computation in Edge AI can be handled by various computation units inside the device. Figure 2 shows the various embedded processors for AI computing and their