Integration of Visual and Somatosensory Feedback through Learning


The proposed project will develop a non-standard control-theoretic formulation of human-like reaching behaviour based on the integration of multi-sensory information, overcoming the limitations of classical control theory, which is bound to the kinematics of rigid bodies. To this end, the transformation matrices will be examined intensively, as they directly link the sensory spaces to the work space in which the arm performs an action at each moment. From a more cognitive perspective, these transformation matrices represent the embodiment of living creatures, since the sensors are embedded within the body. The iterative loop between a continuous stream of sensory input and instantaneous action must adjust dynamically to a changing environment; even sudden changes of embodiment should be accommodated within the control theory so that a goal-oriented task can still be performed.
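As a minimal sketch of the idea, the snippet below illustrates how a transformation matrix (here the Jacobian of a hypothetical two-link planar arm, with unit link lengths assumed for illustration) links the work space, where a visual error is sensed, to the joint space, where the action is taken. The Jacobian-transpose update shown is one standard way to close the sensory-motor loop without planning an explicit trajectory; it is offered as an illustration, not as the project's proposed control law.

```python
import math

def forward_kinematics(q, l1=1.0, l2=1.0):
    # Hand position in the work space for joint angles q = (q1, q2).
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return (x, y)

def jacobian(q, l1=1.0, l2=1.0):
    # Transformation matrix linking joint velocities to hand velocities.
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def reach_step(q, target, gain=0.1):
    # One iteration of the sensory-motor loop: the visually sensed error
    # in the work space is mapped into a joint-space update through the
    # transpose of the Jacobian.
    x, y = forward_kinematics(q)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(q)
    dq1 = gain * (J[0][0] * ex + J[1][0] * ey)
    dq2 = gain * (J[0][1] * ex + J[1][1] * ey)
    return (q[0] + dq1, q[1] + dq2)
```

Iterating `reach_step` drives the hand toward the target while the transformation matrix is re-evaluated at every step, so the same loop keeps working as the arm's configuration changes.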

Although theoretical frameworks such as the minimum jerk model explain aspects of human reaching behaviour, the minimum jerk model is an analytical expression used to constrain the trajectory of the hand in the work space, and a control law must be implemented separately to generate the desired trajectory. This separation of trajectory planning from control poses a fundamental problem: it demands a rigorous kinematic model of the body, and as a result the transformation matrices between sensory input and action space become highly nonlinear. What is needed is a general control theory that can transform the multiple sensory spaces into a single actuation space within a continuous sensory-motor coupling.
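For concreteness, the analytical expression behind the minimum jerk model can be stated directly: for a straight point-to-point reach of duration T, the trajectory minimising the integrated squared jerk is a fifth-order polynomial in normalised time. The sketch below implements that standard closed-form solution; note that it only prescribes the desired hand path, and a separate controller would still be needed to track it, which is precisely the separation criticised above.

```python
def minimum_jerk(x0, xf, T, t):
    """Minimum-jerk hand position at time t for a reach from x0 to xf.

    Closed-form solution minimising the integrated squared jerk:
        x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5),  s = t / T.
    Velocity and acceleration vanish at both endpoints, producing the
    smooth, bell-shaped speed profile seen in human reaching.
    """
    s = t / T
    return x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
```

Because the expression depends only on the start point, end point, and duration, the trajectory is fixed in advance and cannot by itself adapt to mid-movement changes in the environment or the body.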

Project Research Group

Dr. Yoshikatsu Hayashi

Lecturer in Robotics, University of Reading

Prof. Slawomir Nasuto

Professor of Cybernetics, University of Reading

Prof. Sadao Kawamura

Professor of Robotics, Ritsumeikan University

Henry Eberle

PhD Student, University of Reading

Funding Body