Abstract
We present a bio-inspired neural architecture where two robots learn internal models of their bodies and peripersonal space. Coupled simulations allow anticipation of partner actions and dynamic task allocation, enabling cooperative assembly of fuse-boxes in unstructured environments.
Methodology
Robots learn forward models via sensorimotor exploration. A reward-driven planner uses internal simulation to evaluate feasibility and sequence complementary actions. Real-world trials with two UR5 arms demonstrate joint key insertions and collision avoidance.
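The exploration-then-simulation loop can be sketched as follows. This is an illustrative toy, not the paper's neural architecture: it assumes a hypothetical 2-DoF plant with linear dynamics, fits a linear forward model from random motor babbling, and uses it to roll out and score a candidate action sequence before execution, in the spirit of the internal-simulation planner described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def plant(state, action):
    # Ground-truth dynamics, hidden from the learner (assumed toy plant).
    return state + 0.1 * action

# 1) Sensorimotor exploration: execute random actions, log transitions.
states, actions, next_states = [], [], []
s = np.zeros(2)
for _ in range(500):
    a = rng.uniform(-1.0, 1.0, size=2)
    s_next = plant(s, a)
    states.append(s)
    actions.append(a)
    next_states.append(s_next)
    s = s_next

# 2) Fit a forward model x' ~= [x, a] @ W by least squares.
X = np.hstack([np.array(states), np.array(actions)])  # (N, 4)
Y = np.array(next_states)                             # (N, 2)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict(state, action):
    """Learned one-step forward model."""
    return np.hstack([state, action]) @ W

def simulate(state, action_seq):
    """Internal simulation: roll the learned model over a candidate plan."""
    for a in action_seq:
        state = predict(state, a)
    return state

# 3) Score a candidate plan by predicted distance to a goal pose before
#    committing the real arm to it.
goal = np.array([0.5, -0.3])
plan = [np.array([0.5, -0.3])] * 10
cost = np.linalg.norm(simulate(np.zeros(2), plan) - goal)
```

In a two-robot setting, each agent would additionally run its partner's learned model in the same `simulate` loop to anticipate the partner's next state when allocating subtasks.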