Predictive coding of tactile information
ESR7
Objectives
This project will develop a multisensory search engine. It will first design and fabricate a robotic interface integrating visual, haptic and audio feedback. It will then use a predictive coding approach to analyse, interpret and generate tactile, audio and visual cues that give users a virtual experience of various objects. Finally, it will carry out systematic experiments to test and refine the system's capability to improve the speed and accuracy of object recognition.
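As a rough illustration of the predictive coding approach mentioned above, the sketch below shows the core inference loop: a latent estimate of an object property is updated to reduce the error between predicted and observed sensory signals. The generative mapping, noise level and learning rate are hypothetical placeholders, not the project's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(mu):
    """Hypothetical generative mapping from latent state to sensory signal."""
    return np.tanh(mu)

def g_prime(mu):
    """Derivative of the generative mapping."""
    return 1.0 - np.tanh(mu) ** 2

mu = 0.0        # latent estimate, e.g. perceived stiffness of an object
sigma = 0.1     # assumed sensory noise variance
lr = 0.05       # inference rate

true_state = 0.8
for _ in range(500):
    obs = g(true_state) + rng.normal(0.0, np.sqrt(sigma))   # noisy observation
    err = obs - g(mu)                                       # prediction error
    mu += lr * (err / sigma) * g_prime(mu)                  # error-driven update

print(f"inferred latent state: {mu:.2f} (true value {true_state})")
```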
Expected Results
Predictive coding modelling of haptic sensing integrating tactile and proprioceptive information.
Placement
Host institution: Imperial College London
Enrolments (in Doctoral degree): Imperial College London
Supervisors
Etienne Burdet, Vincent Hayward
Presentation of ESR7
PhD defence: To be announced
My name is Alexis Devillard. After a master's in robotics engineering (Polytech Sorbonne) and a master's in artificial intelligence and multi-agent systems (Sorbonne University), I worked at the ISIR lab as a research engineer. In 2020 I started my PhD at Imperial College London. I am passionate about all the different interactions between robotic systems and living beings.
Abstract of PhD goals
This project introduces a multi-sensory search engine concept, exploring the potential of augmenting traditional visual search interfaces, such as Google Images, with haptic feedback. The core hypothesis is that integrating tactile sensations into search processes can enhance search efficiency by exploiting the human brain's ability to combine information across the senses. To investigate this, I created a multi-modal sensory dataset coupled with user questionnaires. These tools were used to identify the critical sensory features that users rely on to distinguish materials and textures: compliance and roughness. To address these key features, I developed sensory feedback and recording systems, including two haptic feedback devices: DeepScreen, which provides compliance and stiffness feedback, and NaptX, which offers roughness feedback via vibrotactile stimulation of the nail. For sensory recording, I developed an electronic skin (eSkin) that captures force and vibrotactile data.
This research could lead to two primary applications: tele-perception, where sensory recordings directly control sensory feedback devices, and automated sensory feedback generation based on the user’s virtual interaction with a selected material.
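As a hedged sketch of the tele-perception idea, assuming an eSkin recording sampled at 1 kHz and a generic vibrotactile actuator with a normalised drive input (both assumptions, not specifics from the project), vibration data could be band-pass filtered to the texture-relevant band and rescaled into drive commands:

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 1000.0   # assumed eSkin sampling rate (Hz); illustrative only
# keep roughly the 40-400 Hz band, where texture-induced vibrations dominate
sos = butter(2, [40, 400], btype="bandpass", fs=FS, output="sos")

def to_drive_commands(eskin_samples: np.ndarray, max_drive: float = 1.0) -> np.ndarray:
    """Map raw eSkin vibration samples to normalised actuator commands."""
    band = sosfilt(sos, eskin_samples)
    peak = max(np.max(np.abs(band)), 1e-9)   # avoid division by zero
    return np.clip(band / peak, -1.0, 1.0) * max_drive

# Synthetic stand-in for a real recording: a 120 Hz texture vibration plus noise
t = np.arange(0, 1.0, 1.0 / FS)
recording = 0.2 * np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.randn(t.size)
commands = to_drive_commands(recording)   # would be streamed to the actuator
```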
I am conducting evaluations to assess the realism and efficiency of the proposed feedback systems within the context of a multi-sensory search engine.
Results
Deliverable 3.3 Predictive coding model of haptic sensing
An abstract model of haptic sensing integrating tactile and proprioceptive information; predictive coding modelling of tactile sensing in rodents; and a human-like computational model of haptic sensing for robots.
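One way to read "integrating tactile and proprioceptive information" in predictive coding terms is precision-weighted cue fusion, the optimal linear combination such models converge to under Gaussian noise. The sketch below is illustrative only; the function name and all numbers are invented for the example.

```python
def precision_weighted_fusion(mu_t, var_t, mu_p, var_p):
    """Fuse a tactile cue (mu_t, var_t) with a proprioceptive cue (mu_p, var_p).

    Each cue is weighted by its precision (inverse variance), so the less
    noisy modality dominates the fused estimate.
    """
    w_t, w_p = 1.0 / var_t, 1.0 / var_p
    mu = (w_t * mu_t + w_p * mu_p) / (w_t + w_p)
    var = 1.0 / (w_t + w_p)
    return mu, var

# e.g. touch locates an edge at 10.2 mm (low noise), proprioception at 9.5 mm
mu, var = precision_weighted_fusion(10.2, 0.4, 9.5, 1.6)
print(f"fused estimate: {mu:.2f} mm, variance {var:.2f}")  # 10.06 mm, 0.32
```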
Conference Article
Devillard, A.; Ramasamy, A.; Faux, D.; Hayward, V.; Burdet, E.
Concurrent Haptic, Audio, and Visual Data Set During Bare Finger Interaction with Textured Surfaces
IEEE World Haptics Conference (WHC), 2023
DOI: 10.1109/WHC56415.2023.10224372