Neuromechanical Autoencoders: Learning to Couple Elastic and Neural Network Nonlinearity
Intelligent biological systems are characterized by their embodiment in a complex environment and the intimate interplay between their nervous systems and the nonlinear mechanical properties of their bodies. This coordination, in which the dynamics of the motor system co-evolved to reduce the computational burden on the brain, is referred to as “mechanical intelligence” or “morphological computation”. In this work, we seek to develop machine learning analogs of this process, in which we jointly learn the morphology of complex nonlinear elastic solids along with a deep neural network to control them. By using a specialized differentiable simulator of elastic mechanics coupled to conventional deep learning architectures – which we refer to as neuromechanical autoencoders – we are able to learn to perform morphological computation via gradient descent. Key to our approach is the use of mechanical metamaterials – cellular solids, in particular – as the morphological substrate. Just as deep neural networks provide flexible and massively parametric function approximators for perceptual and control tasks, cellular solid metamaterials are promising as a rich and learnable space for approximating a variety of actuation tasks. In this work we take advantage of these complementary computational concepts to co-design materials and neural network controls to achieve nonintuitive mechanical behavior. We demonstrate in simulation how it is possible to achieve translation, rotation, and shape matching, as well as a “digital MNIST” task. We additionally manufacture and evaluate one of the designs to verify its real-world behavior.
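To make the joint optimization concrete, the sketch below shows how morphology parameters and a neural controller might be trained together by differentiating through a simulator, in the spirit of the approach described in the abstract. This is a minimal illustration only: simulate is a toy smooth surrogate standing in for the paper's differentiable elasticity simulator, and all names, shapes, and the plain gradient-descent loop are assumptions rather than the authors' actual implementation.

    import jax
    import jax.numpy as jnp

    # Hypothetical stand-in for a differentiable elasticity simulator:
    # maps (morphology parameters, actuation inputs) -> achieved displacement.
    def simulate(morphology, actuation):
        # Toy smooth surrogate for a nonlinear elastic response.
        return jnp.tanh(morphology @ actuation)

    def controller(params, goal):
        # Small neural network mapping a desired displacement to actuation inputs.
        h = jnp.tanh(params["W1"] @ goal + params["b1"])
        return params["W2"] @ h + params["b2"]

    def loss(morphology, params, goal):
        actuation = controller(params, goal)
        achieved = simulate(morphology, actuation)
        return jnp.sum((achieved - goal) ** 2)

    key = jax.random.PRNGKey(0)
    k1, k2, k3 = jax.random.split(key, 3)
    morphology = 0.1 * jax.random.normal(k1, (2, 4))     # learnable "material" parameters
    params = {
        "W1": 0.1 * jax.random.normal(k2, (8, 2)),
        "b1": jnp.zeros(8),
        "W2": 0.1 * jax.random.normal(k3, (4, 8)),
        "b2": jnp.zeros(4),
    }

    # Gradients flow through both the controller and the simulator,
    # so morphology and control policy are co-designed.
    grad_fn = jax.jit(jax.grad(loss, argnums=(0, 1)))
    lr = 1e-2
    for step in range(100):
        goal = jax.random.normal(jax.random.PRNGKey(step), (2,))
        g_morph, g_params = grad_fn(morphology, params, goal)
        morphology = morphology - lr * g_morph
        params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, g_params)

In the actual system described by the paper, the simulator would be a differentiable solve of the elastic mechanics of a cellular metamaterial geometry, and the controller would map task specifications (e.g., desired translations, rotations, or digit targets) to actuation inputs; the sketch only mirrors that structure at a toy scale.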