Self-calibrating smooth pursuit through active efficient coding

C. Teulière, S. Forestier, L. Lonini, C. Zhang, … - Robotics and Autonomous Systems, 2015 - Elsevier
Abstract
This paper presents a model for the autonomous learning of smooth pursuit eye movements based on an efficient coding criterion for active perception. This model accounts for the joint development of visual encoding and eye control. Sparse coding models encode the incoming data at two different spatial resolutions and capture the statistics of the input in spatio-temporal basis functions. A reinforcement learner controls eye velocity so as to maximize a reward signal based on the efficiency of the encoding. We consider the embodiment of the approach in the iCub simulator and on the real iCub robot. Motion perception and smooth pursuit control are not explicitly expressed as tasks for the robot to achieve but emerge as the result of the system’s active attempt to efficiently encode its sensory inputs. Experiments demonstrate that the proposed approach is self-calibrating and robust to strong perturbations of the perception–action link.
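To make the coupling between coding efficiency and eye control concrete, here is a minimal sketch of the idea, not the authors' implementation: a hand-crafted frame-repeating cosine dictionary stands in for the learned spatio-temporal basis functions (the paper learns these by sparse coding at two spatial resolutions), a 1-D "retina" replaces the iCub cameras, and a simple bandit-style value learner replaces the actual reinforcement-learning algorithm. All names, sizes, and parameters below are illustrative assumptions.

```python
# Toy sketch (not the paper's code): eye velocity is chosen to maximize a
# reward defined as the (negative) residual of a sparse code of two frames.
import numpy as np

rng = np.random.default_rng(0)
patch_len = 16
n = np.arange(patch_len)

# Hand-crafted "spatio-temporal" dictionary: each atom repeats a spatial cosine
# across the two frames, so zero retinal slip is the easiest input to encode.
# (In the paper this dictionary is learned, not fixed.)
spatial = np.stack([np.cos(np.pi * (n + 0.5) * j / patch_len) for j in range(patch_len)])
D = np.concatenate([spatial, spatial], axis=1).T      # shape (2*patch_len, patch_len)
D /= np.linalg.norm(D, axis=0)                        # unit-norm atoms

def sparse_code(x, k=6):
    """Greedy matching pursuit with k atoms; returns the reconstruction of x."""
    residual, recon = x.copy(), np.zeros_like(x)
    for _ in range(k):
        idx = int(np.argmax(np.abs(D.T @ residual)))
        coef = D[:, idx] @ residual
        recon += coef * D[:, idx]
        residual -= coef * D[:, idx]
    return recon

def observe(target_vel, eye_vel):
    """Two retinal frames of a drifting pattern; retinal slip = target - eye velocity."""
    coefs = rng.standard_normal(3)
    freqs = rng.choice(np.arange(1, 6), size=3, replace=False)
    base = sum(c * np.cos(np.pi * (n + 0.5) * f / patch_len) for c, f in zip(coefs, freqs))
    slip = int(target_vel - eye_vel)
    return np.concatenate([base, np.roll(base, slip)])

# Bandit-style learner over discrete eye velocities, rewarded by coding efficiency.
actions = np.arange(-3, 4)            # candidate eye velocities (pixels per frame)
values = np.zeros(len(actions))
target_vel, eps, lr = 2, 0.1, 0.1

for t in range(2000):
    a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(values))
    x = observe(target_vel, actions[a])
    reward = -np.sum((x - sparse_code(x)) ** 2)       # low residual = efficient encoding
    values[a] += lr * (reward - values[a])

print("preferred eye velocity:", actions[int(np.argmax(values))], "target velocity:", target_vel)
```

Because the sketched dictionary encodes unchanging (frame-repeating) patterns best, the learner's preferred velocity matches the target's, i.e. pursuit emerges from the coding-efficiency reward alone; in the paper this redundancy preference is not built in but arises from jointly learning the basis functions and the policy.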