Research Internship Project
Development of a prototype for embedded object detection with bio-inspired retinas on robotic
platforms
Context
The LEAT lab leads the national ANR project DeepSee, in collaboration with Renault, Prophesee
and two other labs in neuroscience (CERCO) and computer science (I3S). This project aims to explore a
bio-inspired approach to develop energy-efficient solutions for image processing in automotive
applications (ADAS), as explored in [3]. The main mechanisms used to follow this approach are
event-based cameras (EBCs, often considered artificial retinas) and spiking neural networks (SNNs).
The former is a type of sensor that detects changes in luminosity with very high temporal resolution
and low power consumption; the latter is a type of artificial neural network that mimics the way
information is encoded in the brain. LEAT has developed the first SNN model able to perform
object detection on event-based data [1], together with the related hardware accelerator on FPGA [2].
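As a purely illustrative aside (not project code): an event from such a sensor is commonly represented as a tuple (x, y, timestamp, polarity), and a spiking neuron integrates incoming events until a threshold is crossed. A minimal C sketch, with hypothetical types and parameter values:

```c
#include <stdbool.h>
#include <stdint.h>

/* One event from an event-based camera: pixel position, time, polarity. */
typedef struct {
    uint16_t x, y;     /* pixel coordinates */
    uint64_t t_us;     /* timestamp in microseconds */
    int8_t   polarity; /* +1 luminosity increase, -1 decrease */
} event_t;

/* Minimal leaky integrate-and-fire neuron (illustrative parameters). */
typedef struct {
    float v;         /* membrane potential */
    float leak;      /* multiplicative leak applied per event */
    float threshold; /* firing threshold */
} lif_neuron_t;

/* Integrate one event; return true if the neuron spikes (and resets). */
static bool lif_step(lif_neuron_t *n, const event_t *ev) {
    n->v = n->v * n->leak + (float)ev->polarity;
    if (n->v >= n->threshold) {
        n->v = 0.0f; /* reset after spike */
        return true;
    }
    return false;
}
```

With a leak of 1.0 and a threshold of 3.0, three consecutive positive events drive the neuron to its threshold and produce one spike.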
The goal of this internship project is to deploy this spike-based AI solution onto an embedded smart
camera provided by the Prophesee company [4]. The camera is composed of an event-based sensor
and an FPGA. The work will mainly consist in deploying the existing software code (in C) on the
embedded CPU, integrating the hardware accelerator (VHDL) onto the FPGA, and connecting the two
through an AXI-Stream bus. The last part of the project will consist in running experiments with the
resulting smart camera to evaluate its real-time performance and energy consumption, before
validation on a robotic platform or even a driving vehicle.
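By way of illustration, moving events between the CPU and the FPGA accelerator over an AXI-Stream bus typically means packing each event into a fixed-width stream word. The field layout below is hypothetical, not the format actually used by Prophesee or LEAT; a minimal C sketch:

```c
#include <stdint.h>

/* Pack an event into a hypothetical 32-bit AXI-Stream word:
 * bits 31..18 = x (14 bits), bits 17..4 = y (14 bits), bit 0 = polarity. */
static uint32_t pack_event(uint16_t x, uint16_t y, uint8_t polarity) {
    return ((uint32_t)(x & 0x3FFFu) << 18) |
           ((uint32_t)(y & 0x3FFFu) << 4)  |
           (uint32_t)(polarity & 0x1u);
}

/* Recover the fields on the other side of the stream. */
static void unpack_event(uint32_t word, uint16_t *x, uint16_t *y,
                         uint8_t *polarity) {
    *x = (word >> 18) & 0x3FFFu;
    *y = (word >> 4) & 0x3FFFu;
    *polarity = word & 0x1u;
}
```

The same bit layout would be mirrored in the VHDL side of the accelerator, so that software and hardware agree on how each stream word is decoded.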
Project mission
The project mission will be organized in several periods:
- Bibliographic study on event-based processing
- Introduction to the existing software and hardware solutions at LEAT, and to the development kit from Prophesee
- Deployment of the software part on the CPU and the hardware part on the FPGA
- Experimentation and validation
- Publication at an international conference.
References
[1] L. Cordone, B. Miramond, P. Thierion, Object Detection with Spiking Neural Networks on Automotive Event Data, IEEE International Joint Conference on Neural Networks (IJCNN), 2022
[2] N. Abderrahmane, B. Miramond, E. Kervennic, A. Girard, SPLEAT: SPiking Low-power Event-based ArchiTecture for in-orbit processing of satellite imagery, IEEE International Joint Conference on Neural Networks (IJCNN), pp. 1-10, 2022
[3] G. Chen et al., Event-based neuromorphic vision for autonomous driving: A paradigm shift for bio-inspired visual sensing and perception, IEEE Signal Processing Magazine, 2020
[4] P. De Tournemire et al., A Large Scale Event-based Detection Dataset for Automotive, 2020
Practical information
Location: LEAT Lab / SophiaTech Campus, Sophia Antipolis
Duration: 6 months from February 2025
Grant: from ANR project DeepSee
Profile: VHDL programming, FPGA design, C programming, signal/image processing
Research keywords: Embedded systems, Event-based camera, artificial neural network, Edge AI