Evolving Modular Soft Robots without Explicit Inter-Module Communication using Local Self-Attention

In this work, we focus on Voxel-based Soft Robots, aggregations of mechanically identical elastic blocks. We use the same neural controller inside each voxel, but without any inter-voxel communication, hence enabling ideal conditions for modularity: modules are all equal and interchangeable. We optimize the parameters of the neural controller, shared among the voxels, by evolutionary computation. Crucially, we use a self-attention mechanism inside the controller to overcome the absence of inter-module communication channels, thus enabling our robots to truly be driven by the collective intelligence of their modules.

This is the Supplementary Material page with videos of our experimental results. Please see the paper for details.
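
To make the setup concrete, below is a minimal Python sketch of such a per-voxel controller. It is an illustration under our own assumptions: the sensor set, the single-head attention over the voxel's own sensor readings treated as tokens, the layer sizes, the parameter layout, and the names (LocalSelfAttentionController, act, n_params) are all hypothetical, not the implementation used in the paper. The sketch only preserves the properties stated above: parameters are shared across voxels, each voxel acts on its local sensors alone, and no information is exchanged between voxels.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class LocalSelfAttentionController:
    """One controller instance per voxel; every instance shares the same parameters.

    Hypothetical sketch: each voxel reads n_sensors local values, treats them as a
    sequence of 1-d tokens, applies one self-attention layer, and feeds the attended
    values to a small MLP that outputs a single actuation value in [-1, 1].
    No voxel communicates with any other.
    """

    def __init__(self, params, n_sensors, d_k=4, d_hidden=8):
        self.d_k = d_k
        # Layout of the flat, evolvable parameter vector (shapes are illustrative).
        shapes = [(1, d_k), (1, d_k), (1, 1),          # W_q, W_k, W_v token projections
                  (n_sensors, d_hidden), (d_hidden,),  # MLP layer 1
                  (d_hidden, 1), (1,)]                 # MLP layer 2
        self.weights, i = [], 0
        for shape in shapes:
            size = int(np.prod(shape))
            self.weights.append(np.asarray(params[i:i + size]).reshape(shape))
            i += size

    @staticmethod
    def n_params(n_sensors, d_k=4, d_hidden=8):
        return 2 * d_k + 1 + n_sensors * d_hidden + 2 * d_hidden + 1

    def act(self, sensors):
        W_q, W_k, W_v, W1, b1, W2, b2 = self.weights
        x = np.asarray(sensors, dtype=float).reshape(-1, 1)  # (n_sensors, 1) tokens
        q, k, v = x @ W_q, x @ W_k, x @ W_v                  # queries, keys, values
        attn = softmax(q @ k.T / np.sqrt(self.d_k))          # (n, n) attention map
        z = (attn @ v).ravel()                               # attended sensor features
        h = np.tanh(z @ W1 + b1)                             # shared MLP hidden layer
        return np.tanh(h @ W2 + b2).item()                   # actuation in [-1, 1]


# Hypothetical usage: one shared parameter vector drives every voxel independently.
rng = np.random.default_rng(0)
n_sensors = 5
params = rng.normal(size=LocalSelfAttentionController.n_params(n_sensors))
voxels = [LocalSelfAttentionController(params, n_sensors) for _ in range(10)]
actions = [v.act(rng.uniform(-1.0, 1.0, n_sensors)) for v in voxels]
```

In this sketch, params plays the role of the genotype: an evolutionary algorithm would perturb and select it according to the resulting robot's locomotion performance, as described in the paper.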


1) Evolution of the self-attention controller of a biped-shaped Voxel-based Soft Robot.

2) Evolution of the self-attention controller of a comb-shaped Voxel-based Soft Robot.

3) Fine-tuning, on a small biped, of the self-attention controller evolved on a large one.

4) Fine-tuning, on a large biped, of the self-attention controller evolved on a small one.

5) Fine-tuning, on a small comb, of the self-attention controller evolved on a large one.

6) Fine-tuning, on a large comb, of the self-attention controller evolved on a small one.

This article was prepared using the Distill template.