Research Internship Project
Dynamic and Adaptive Spiking Neural Network Solutions for Energy-Autonomous IoT Nodes
Context
As the Internet of Things (IoT) continues to evolve, the integration of Artificial Intelligence (AI) with edge computing, i.e., Edge AI, emerges as a powerful synergy. This combination leverages Machine Learning (ML) algorithms to process sensor data locally, enabling real-time intelligent decision-making. However, at this level of integration, the need to process large amounts of heterogeneous data imposes challenging constraints in terms of efficiency, accuracy and resource utilization. Moreover, in the context of energy-autonomous wireless sensor nodes, each computation has to cope with a limited and time-varying harvested energy budget. As a consequence, the requirement for energy frugality severely hampers the performance of these ML algorithms, highlighting the need for dynamic and adaptive solutions.

The next generation of Artificial Neural Networks (ANN), known as Spiking Neural Networks (SNN) [1], emerges as a promising candidate for autonomous nodes. SNNs significantly improve energy and computing efficiency thanks to their intrinsic sparsity and event-based representation [2]. Their natural dynamic behavior can be exploited to match the variation of energy available on autonomous nodes. The objective of this work is therefore to propose new energy managers that dynamically adjust the energy consumed by the SNN with respect to its Quality of Service (QoS). Weight sharing and early stopping [3] methods will be leveraged for this purpose.
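For illustration, the early-stopping idea can be sketched as follows: run the SNN timestep by timestep, accumulate the output spikes, and terminate the inference as soon as the prediction is confident enough, thereby spending fewer timesteps (and hence less energy) on easy inputs. The snippet below is a minimal, illustrative sketch in PyTorch, not the project's actual model: the single-layer LIF network, the confidence threshold, and the names (LIFLayer, infer_with_early_stopping) are hypothetical placeholders.

```python
import torch
import torch.nn as nn

class LIFLayer(nn.Module):
    """Minimal leaky integrate-and-fire layer (illustrative only)."""
    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta = beta
        self.threshold = threshold

    def forward(self, x, mem):
        mem = self.beta * mem + self.fc(x)       # leaky integration of the input current
        spk = (mem >= self.threshold).float()    # emit a spike when the threshold is crossed
        mem = mem - spk * self.threshold         # soft reset of the membrane potential
        return spk, mem

def infer_with_early_stopping(layer, x, n_classes=10, max_timesteps=32, confidence=0.9):
    """Run the SNN timestep by timestep and stop as soon as the accumulated
    output spikes yield a sufficiently confident prediction."""
    mem = torch.zeros(x.shape[0], n_classes)
    counts = torch.zeros(x.shape[0], n_classes)
    for t in range(max_timesteps):
        spk, mem = layer(x, mem)                 # one timestep of the (single-layer) SNN
        counts += spk                            # accumulate output spikes per class
        probs = torch.softmax(counts, dim=1)
        if probs.max(dim=1).values.min() >= confidence:
            return counts.argmax(dim=1), t + 1   # early exit: fewer timesteps, less energy
    return counts.argmax(dim=1), max_timesteps

# Usage: a random batch of 8 rate-coded inputs with 64 features each
layer = LIFLayer(in_features=64, out_features=10)
preds, used_steps = infer_with_early_stopping(layer, torch.rand(8, 64))
print(preds, used_steps)
```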
Project mission
The project will be organized in several phases:
- Bibliographic study and introduction to SNNs, SNN training, energy-autonomous nodes, energy managers, etc.
- Explore different methods for adaptive SNNs: 1) weights shared across timesteps, 2) a variable number of timesteps enabling early stopping
- Develop new energy managers [3] that dynamically adjust the energy consumed by the SNN by adapting its QoS and architecture (a toy sketch is given after this list)
- Optimize the energy manager parameters and establish criteria for early inference stopping, i.e., determine the required number of timesteps and the conditions under which the inference should terminate
- Publication in an international conference
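To make the energy-manager idea concrete, the toy sketch below adapts the number of SNN timesteps, and hence the QoS, to the energy currently stored and to a harvested-energy forecast. It is only a heuristic illustration under assumed numbers, not the reinforcement-learning approach of [3]: the per-timestep cost, the reserve, and the names (choose_timesteps, ENERGY_PER_TIMESTEP_MJ) are made-up placeholders.

```python
import random

# Hypothetical per-timestep energy cost and node parameters; real values would come
# from measurements or an analytical model such as the one in [2].
ENERGY_PER_TIMESTEP_MJ = 0.05
MIN_TIMESTEPS, MAX_TIMESTEPS = 4, 32

def choose_timesteps(stored_energy_mj, expected_harvest_mj, reserve_mj=1.0):
    """Toy energy manager: spend only the budget above a safety reserve,
    trading timesteps (and hence QoS) against the harvested-energy forecast."""
    budget = stored_energy_mj + expected_harvest_mj - reserve_mj
    affordable = int(budget // ENERGY_PER_TIMESTEP_MJ)
    return max(MIN_TIMESTEPS, min(MAX_TIMESTEPS, affordable))

# Usage: simulate a few inference rounds with a fluctuating harvested-energy forecast
stored = 2.0
for round_id in range(5):
    harvest = random.uniform(0.0, 1.0)            # crude stand-in for a solar/RF forecast
    t = choose_timesteps(stored, harvest)
    stored = stored + harvest - t * ENERGY_PER_TIMESTEP_MJ
    print(f"round {round_id}: harvest={harvest:.2f} mJ -> {t} timesteps, stored={stored:.2f} mJ")
```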
References
[1] M. Isik, "A Survey of Spiking Neural Network Accelerator on FPGA," arXiv, Jul. 2023.
[2] E. Lemaire et al., "An Analytical Estimation of Spiking Neural Networks Energy Efficiency," in Neural Information Processing, Cham: Springer International Publishing, 2023.
[3] F. A. Aoudia et al., "RLMan: An Energy Manager Based on Reinforcement Learning for Energy Harvesting Wireless Sensor Networks," IEEE Transactions on Green Communications and Networking, vol. 2, no. 2, 2018.
Practical information
Location: LEAT Lab / SophiaTech Campus, Sophia Antipolis
Duration: 6 months, starting March 2025
Profile: Machine learning, IoT, embedded systems
Research keywords: SNN, signal processing, energy managers, IoT, Edge AI
Application: CV, Motivation letter, Grades
Contact and supervision
Ghattas Akkad, Benoît Miramond, Alain Pegatoquet
LEAT Lab – Université Côte d'Azur / CNRS
Polytech Nice Sophia
04.89.15.44.39. / ghattas.akkad@univ-cotedazur.fr, Alain.PEGATOQUET@univ-cotedazur.fr