EFLOP: A Sparsity-Aware Metric for Evaluating Computational Cost in Spiking and Non-Spiking Neural Networks

Authors
Narduzzi, Simon
Zenke, Friedemann
Liu, Shih-Chii
Dunbar, L. Andrea
DOI
10.1088/2634-4386/addee8
Abstract
Deploying energy-efficient deep neural networks on energy-constrained edge devices is an important research topic in both the machine learning and circuit design communities. Both artificial neural networks (ANNs) and spiking neural networks (SNNs) have been proposed as candidates for these tasks. In particular, SNNs are considered energy-efficient because they leverage temporal sparsity in their outputs. However, existing computational frameworks fail to accurately estimate the cost of running sparse networks on modern time-stepped hardware, which exploits sparsity by skipping zero-valued operations. Meanwhile, weight sparsity-aware training remains underexplored for SNNs and lacks systematic benchmarking against optimized ANNs, making fair comparisons between the two paradigms difficult. To bridge this gap, we introduce the effective floating-point operation (EFLOP), a metric that accounts for the sparse operations during pre-activation updates of both ANNs and SNNs. Applying weight sparsity-aware training to both SNNs and ANNs, we achieve up to an 8.9× reduction in EFLOPs for GRU models and 3.6× for LIF models by sparsifying weights by 80%, without sacrificing accuracy on the Spiking Heidelberg Digits (SHD) and Spiking Speech Command (SSC) datasets. These findings highlight the critical role of network sparsity in designing energy-efficient neural networks and establish EFLOPs as a robust framework for cross-paradigm comparisons.
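The abstract describes EFLOPs as counting only the operations actually executed on sparsity-aware hardware, i.e. skipping work whenever an input activation (spike) or a weight is zero. The paper's exact definition is not reproduced on this page; the sketch below is an illustrative assumption of how such a count could be made for a single linear layer, where the function name `eflops_linear` is hypothetical.

```python
import numpy as np

def eflops_linear(x, W):
    """Effective multiply-accumulates for y = W @ x on hardware that
    skips zero-valued operands (illustrative sketch, not the paper's
    official implementation).

    One operation is counted per (nonzero input, nonzero weight) pair:
    a zero activation skips its whole fan-out, and a pruned (zero)
    weight is never fetched.
    """
    nz_x = x != 0          # active inputs, e.g. spikes at this time step
    nz_W = W != 0          # unpruned weights
    # Select only the columns of W that receive a nonzero input,
    # then count the nonzero weights among them.
    return int(nz_W[:, nz_x].sum())

# Dense baseline vs. effective cost under sparsity
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 10))
W[rng.random(W.shape) < 0.8] = 0.0        # ~80% weight sparsity, as in the paper
x = (rng.random(10) < 0.3).astype(float)  # sparse binary input (spike vector)

dense_macs = W.size                        # 40 MACs if nothing is skipped
print(dense_macs, eflops_linear(x, W))     # effective count is much smaller
```

Summing such per-layer counts over all layers and time steps would give a hardware-agnostic cost that can be compared directly between an ANN (dense activations, sparse weights) and an SNN (sparse spikes and sparse weights), which is the cross-paradigm comparison the metric is designed for.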
Publication Reference
Neuromorphic Computing and Engineering, Volume 5, Number 3
Date
2025-08-14
Sponsors
This work received funding from the Swiss State Secretariat for Education, Research, and Innovation (SERI) under the SwissChips initiative. This work was partially funded by the EU's Horizon Europe Research and Innovation Programme (Grant Agreement No. 101070374, CONVOLVE) funded through SERI (Ref. 1131-52302), by the Swiss National Science Foundation (grant numbers CA-DNNEdge_208227 and PCEFP3_202981), and by the Novartis Research Foundation.