FETA: a Flexible Low-Power AI/ML Accelerator for Time Series Signals
Abstract
Enabling time-series AI/ML functionalities with ultra-low power consumption
The large variety of data acquired by sensors on mobile, wearable, and IoT devices has enabled numerous new applications, such as long-term medical monitoring, fitness tracking, predictive maintenance, and speech processing. ML algorithms such as neural networks (NNs) are often used to process the time-dependent data (time series) these sensors produce. However, deployment on edge devices remains limited because processing such vast amounts of sensor data on-device is inefficient.
Today, ultra-low-power devices rarely offer ML functionalities. The numerous operations required by ML algorithms are typically offloaded to the cloud, at the cost of power-hungry radio communication, long latency, and privacy risks. The design of low-power NN accelerators is therefore key to enabling ML features in any battery-powered device.
The development of optimized, yet flexible, NN accelerators can unlock significant power savings. Such circuits make the execution of computing-intensive algorithms possible on any portable device, creating unprecedented use cases for edge devices.
Publication Reference
CSEM White Paper, February 2025
Year
2025-02