Abstract

We describe a neural framework that learns a distributed body schema, enabling simultaneous control of multiple limbs and tools. Forward simulations inform real-time motor plans, and kinematic redundancy is resolved through well-posed field computations rather than explicit inverse kinematics.

Methodology

Self-organizing maps encode joint and tool configurations in a topology-preserving fashion (see the sketch below). A hub layer integrates the resulting modalities into a single body schema. Task constraints are injected as attractor force fields, and motor commands emerge from minimizing prediction error via the Passive Motion Paradigm, sketched at the end of this section. The framework is validated on the iCub humanoid (53 DoF) and on industrial arms.
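
As a rough illustration of the encoding stage, the following sketch trains a minimal one-dimensional self-organizing map on joint configurations. The map size, learning rate, neighborhood width, and the 7-DoF arm are placeholder assumptions for illustration, not the settings used in our experiments.

    import numpy as np

    # Minimal 1-D self-organizing map over joint configurations.
    # All sizes and rates are hypothetical placeholders.
    rng = np.random.default_rng(0)
    n_nodes, n_joints = 64, 7                  # assumed map size, assumed 7-DoF arm
    weights = rng.uniform(-np.pi, np.pi, size=(n_nodes, n_joints))

    def train_step(weights, q, lr=0.1, sigma=3.0):
        """One SOM update for a joint-configuration sample q (radians)."""
        winner = np.argmin(np.linalg.norm(weights - q, axis=1))  # best-matching unit
        grid_dist = np.abs(np.arange(len(weights)) - winner)     # distance on the map
        h = np.exp(-grid_dist**2 / (2 * sigma**2))               # neighborhood kernel
        weights += lr * h[:, None] * (q - weights)               # pull nodes toward q

    # Motor babbling: train on random joint samples.
    for _ in range(5000):
        train_step(weights, rng.uniform(-np.pi, np.pi, size=n_joints))

Neighboring map nodes come to encode similar configurations, which is what allows a tool, once encoded, to extend the schema without retraining the rest of the network.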

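To make the relaxation step concrete, here is a minimal sketch of Passive Motion Paradigm relaxation for a hypothetical planar 3-link arm: a virtual spring pulls the end effector toward the goal, the Jacobian transpose maps the resulting force to joint space, and a joint admittance integrates the motion, so redundancy resolves by relaxation with no matrix inversion. Link lengths, gains, time step, and goal are illustrative values, not our experimental parameters.

    import numpy as np

    L = np.array([0.3, 0.25, 0.15])        # assumed link lengths (m)

    def fk(q):
        """End-effector position of a planar 3-link arm."""
        angles = np.cumsum(q)
        return np.array([np.sum(L * np.cos(angles)),
                         np.sum(L * np.sin(angles))])

    def jacobian(q):
        """Geometric Jacobian of the end-effector position."""
        angles = np.cumsum(q)
        J = np.zeros((2, 3))
        for i in range(3):
            J[0, i] = -np.sum(L[i:] * np.sin(angles[i:]))
            J[1, i] = np.sum(L[i:] * np.cos(angles[i:]))
        return J

    q = np.array([0.4, 0.3, 0.2])          # initial joint angles (rad)
    goal = np.array([0.35, 0.40])          # task-space target (m)
    K = 5.0                                # virtual stiffness of the attractor
    A = np.diag([1.0, 1.0, 1.0])           # joint admittance
    dt = 0.01                              # integration step (s)

    for _ in range(2000):
        F = K * (goal - fk(q))             # attractor force field in task space
        q += dt * (A @ jacobian(q).T @ F)  # relaxation: qdot = A J^T F

    print("final task-space error:", np.linalg.norm(goal - fk(q)))

Because the update uses only the Jacobian transpose, it remains well posed near singularities and for redundant chains, where explicit inverse kinematics would require regularized matrix inversion.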