G-2020-23
Proceedings of the Edge Intelligence Workshop, Montréal, Canada, March 2–3, 2020
Artificial Intelligence (AI) is the next driver of societal transformation. AI-based applications are deployed at massive scale on cloud servers, cell phones, cars, and pandemic management systems, among others. AI, along with machine learning (ML) and deep learning (DL), beats human performance in various tasks, ranging from image classification, facial recognition, and medical imaging to machine translation, speech recognition, and many more. Existing DL-based applications are computationally intensive and require large amounts of resources: CPU, GPU, memory, and network bandwidth. These constraints restrict the deployment of AI applications in practice. Embedded devices typically offer memory on the order of kilobytes, computing power on the order of 50 MHz, and power budgets on the order of milliwatts. Many such edge devices still cannot support the deployment of large deep learning models.
Most voice assistants, such as Apple Siri, Google Voice, Microsoft's Cortana, or Huawei Celia, are based on cloud computing. Such services do not function if their network connection is slow or interrupted. Because existing intelligent applications rely on centralized data, users send their data to the cloud center, where the computation is fully processed. A large volume of data is collected by billions of mobile end users and Internet of Things (IoT) devices distributed at the network edge, and this volume will explode with the move towards smart cities and 5G connectivity. According to recent forecasts, data generation by edge devices will reach 850 ZB by 2021. Transferring such a volume of data to the cloud requires large bandwidth resources and may violate users' privacy and the General Data Protection Regulation (GDPR), imposed by the European Union and adopted in many other countries as a data privacy standard.
Edge devices are the key enabler for modern AI applications. In recent years, we have seen a trend towards homogenizing edge and IoT devices, in which billions of mobile and IoT devices are connected to the Internet and generate a huge amount of data. AI for edge and IoT devices has recently received significant attention, leading to various synonyms such as Edge AI, Edge Intelligence, Low Resource Computing, Energy Efficient Deep Learning, and Embedded AI, among others. Industry and users both have considerable interest in keeping computation on the edge: industry saves computing resources by outsourcing computation to the user, and users gain privacy. However, this privacy preservation is not free, as AI consumers pay for their edge hardware. Edge computing pushes data storage, computation, analysis, and control closer to the network edge. Edge intelligence is also viewed as the ultimate solution to meet requirements on latency, memory, scalability, and energy efficiency while saving network bandwidth. In some applications, edge computing is simply inevitable for robustness and safety reasons. For instance, in autonomous driving, constant high-quality network connectivity is an assumption rather than a guarantee. (...)
Published in April 2020, 97 pages
Document
G2023_total.pdf (5.2 MB)