
Research Internship Project
Unsupervised learning of robotic multimodal data

Context

The LEAT lab has been working for several years on the design of bio-inspired neural models. One of
them, named ReSOM, is inspired by the self-organization of the biological brain. This model has
previously been applied to the classification of multimodal data, such as representations of digits
from visual, auditory and motor data [1]. Thanks to a collaboration with a robotics research lab, we
have jointly developed a dataset collected on the iCub humanoid robotic platform (arm, head and eye
positions, and images from the camera embedded on the robot).
The goal of this internship is to carry out unsupervised learning on this dataset using the software
framework developed at LEAT, which integrates the ReSOM neural model.
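To give a flavor of the kind of unsupervised learning involved, the sketch below shows a minimal Kohonen self-organizing map (the family of models ReSOM builds on). This is an illustration only, not the LEAT ReSOM implementation: the grid size, learning rate, neighborhood radius and the 4-D random inputs (standing in for sensor readings such as joint angles) are all arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

grid_h, grid_w, dim = 8, 8, 4            # 8x8 map of units, 4-D inputs (assumed)
weights = rng.random((grid_h, grid_w, dim))

def train_step(x, lr=0.1, sigma=1.5):
    """One Kohonen update: find the best-matching unit (BMU),
    then pull it and its grid neighbors toward the input x."""
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    rows, cols = np.indices((grid_h, grid_w))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))   # Gaussian neighborhood
    weights[:] = weights + lr * h[..., None] * (x - weights)
    return bmu

data = rng.random((500, dim))            # stand-in for recorded sensor samples
for x in data:
    train_step(x)

# After training, each sample maps to a grid coordinate (its BMU),
# giving an unsupervised, topology-preserving clustering of the data.
```

After training, nearby inputs land on nearby map units, which is the property multimodal SOM approaches exploit to associate modalities without labels.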

Project mission

The project mission will be organized in several phases:

  • Bibliographic study on multimodal processing with machine learning [3]
  • Introduction to the existing software (SW) and hardware (HW) solutions [2] developed at LEAT for Edge AI applications
  • Software development for data preparation
  • Machine learning based on the ReSOM neural architecture
  • Analysis of the results on a computer
  • Experiments and validation of the system embedded on the iCub robotic platform
  • Publication at an international conference
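The data-preparation phase typically involves aligning sensor streams recorded at different rates into paired multimodal samples. The sketch below shows one common approach (nearest-timestamp matching) on synthetic data; the stream names, rates and feature dimensions are assumptions for illustration, not the actual iCub dataset format.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two fake streams recorded at different rates (timestamps in seconds):
arm_t = np.arange(0.0, 10.0, 0.05)       # 20 Hz proprioception (assumed)
arm_x = rng.random((len(arm_t), 7))      # 7 arm joint angles (assumed)
cam_t = np.arange(0.0, 10.0, 0.1)        # 10 Hz camera frames (assumed)
cam_x = rng.random((len(cam_t), 32))     # 32-D image features (assumed)

def align(ref_t, other_t, other_x):
    """For each reference timestamp, pick the temporally closest
    sample from the other (sorted) stream."""
    idx = np.clip(np.searchsorted(other_t, ref_t), 0, len(other_t) - 1)
    prev = np.clip(idx - 1, 0, len(other_t) - 1)
    use_prev = np.abs(other_t[prev] - ref_t) < np.abs(other_t[idx] - ref_t)
    return other_x[np.where(use_prev, prev, idx)]

# Pair each camera frame with the closest arm reading:
arm_aligned = align(cam_t, arm_t, arm_x)
pairs = np.hstack([arm_aligned, cam_x])  # one multimodal sample per row
print(pairs.shape)                       # prints (100, 39)
```

Each row of `pairs` then serves as one multimodal training sample for the unsupervised model.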

References

[1] L. Khacef, L. Rodriguez, B. Miramond, "Brain-inspired self-organization with cellular neuromorphic
computing for multimodal unsupervised learning", MDPI Electronics, 2020
[2] A. R. Muliukov, L. Rodriguez, B. Miramond, L. Khacef, J. Schmidt, Q. Berthet, "A unified
software/hardware scalable architecture for brain-inspired computing based on self-organizing neural
models", Frontiers in Neuroscience 16, 825879, 2022
[3] J. G. Martin, M. A. Meredith, K. Ahmad, "Modeling multisensory enhancement with self-organizing
maps", Frontiers in Computational Neuroscience, 2009

Practical information

Location: LEAT Lab / SophiaTech Campus, Sophia Antipolis
Duration: 6 months, starting February 2025
Profile: embedded programming, microcontrollers, sensors
Research keywords: Embedded systems, signal processing, Edge AI

Contact and supervision

Benoît Miramond, Laurent Rodriguez
LEAT Lab – Université Côte d'Azur / CNRS
Polytech Nice Sophia
04.89.15.44.39. / benoit.miramond@univ-cotedazur.fr
