Individual ants are relatively simple creatures, yet a colony of ants can perform remarkably complex tasks such as intricate construction, foraging, and defense. Recently, Harvard researchers took inspiration from ants to design a team of comparably simple robots that work collectively to perform complex tasks governed by only a few basic parameters.
The research was published in eLife.
“This project continued along an abiding interest in understanding the collective dynamics of social insects such as termites and bees, especially how these insects can manipulate the environment to create complex functional architectures,” said L Mahadevan, the Lola England de Valpine Professor of Applied Mathematics, of Organismic and Evolutionary Biology, and of Physics, and senior author of the paper.
The research team began by studying how black carpenter ants work together to excavate out of and escape from a soft corral.
“At first, the ants inside the corral moved around randomly, communicating via their antennae before they started working together to escape the corral,” said S Ganga Prasath, a postdoctoral fellow at the Harvard John A. Paulson School of Engineering and Applied Sciences and one of the lead authors of the paper.
Ants primarily rely on their antennae to interact with the environment and with other ants, a process termed antennation. The researchers observed that the ants would spontaneously congregate around areas where they interacted more often. Once a few ants started tunneling into the corral, others quickly joined in. Over time, excavation at one such location proceeded faster than at the others, and the ants eventually tunneled out of the corral.
From these observations, Mahadevan and his team identified two parameters that govern the ants’ excavation task: the strength of collective cooperation and the rate of excavation. Numerical simulations of mathematical models encoding these parameters showed that the ants can successfully excavate their way out only when they cooperate strongly enough while also excavating efficiently enough.
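As a rough illustration of how these two quantities can enter an agent-based model, the toy simulation below has ants mark the wall sites they visit, get recruited to the most strongly marked site with a probability set by the cooperation strength, and remove wall material at the excavation rate; the wall slowly relaxes back, so diffuse, uncoordinated digging makes no headway. All of the specifics here (the recruitment rule, the healing wall, the parameter values) are assumptions of this sketch, not the paper’s model.

```python
import random

def simulate(cooperation, dig_rate, n_ants=30, n_sites=40,
             wall=60.0, heal=0.15, steps=5000, seed=0):
    """Toy agent model (an illustrative sketch, not the paper's model).

    cooperation : probability an ant is recruited to the most strongly
                  marked wall site instead of wandering at random
    dig_rate    : wall material an ant removes from its site per step
    The soft wall partially recovers by `heal` per step (a simplifying
    assumption), so scattered digging never breaks through.
    Returns the escape time in steps, or None if the wall is never breached.
    """
    rng = random.Random(seed)
    signal = [0.0] * n_sites      # pheromone-like marking per wall site
    material = [wall] * n_sites   # remaining wall thickness per site

    for t in range(steps):
        hotspot = max(range(n_sites), key=lambda s: signal[s])
        for _ in range(n_ants):
            # Cooperate (join the hotspot) or wander to a random site.
            site = hotspot if rng.random() < cooperation else rng.randrange(n_sites)
            signal[site] += 1.0          # reinforce the marking here
            material[site] -= dig_rate   # excavate
            if material[site] <= 0:
                return t                 # wall breached: the ants escape
        signal = [0.99 * s for s in signal]                  # markings fade
        material = [min(wall, m + heal) for m in material]   # wall relaxes back

    return None

# Small sweep: escape requires both enough cooperation and a high enough dig rate.
for coop in (0.1, 0.5, 0.9):
    for rate in (0.005, 0.02, 0.08):
        print(f"cooperation={coop:.1f}  dig_rate={rate:.3f}  "
              f"escape step: {simulate(coop, rate)}")
```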
Driven by this understanding and building upon the models, the researchers built robotic ants, nicknamed RAnts, to see if they could work together to escape a similar corral. Instead of chemical pheromones, the RAnts used “photormones,” fields of light left behind by the roving RAnts that mimic pheromone fields or antennation.
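For concreteness, one simple way to realize such a field is as a grid of light intensity that robots add to wherever they sit and that slowly spreads and fades. The sketch below does exactly that; the deposit, blur, and decay values are chosen arbitrarily for illustration and are not taken from the paper.

```python
import numpy as np

def update_photormone(field, robot_cells, deposit=1.0, decay=0.02, blur=0.1):
    """One update of a grid 'photormone' light field (illustrative sketch only)."""
    for (i, j) in robot_cells:
        field[i, j] += deposit                 # light left where RAnts currently sit
    # Crude spreading: mix each cell with the average of its four neighbours.
    up, down = np.roll(field, 1, axis=0), np.roll(field, -1, axis=0)
    left, right = np.roll(field, 1, axis=1), np.roll(field, -1, axis=1)
    field = (1 - blur) * field + blur * 0.25 * (up + down + left + right)
    return (1 - decay) * field                 # and the whole field slowly fades

field = np.zeros((20, 20))
for _ in range(50):                            # two stationary RAnts marking the field
    field = update_photormone(field, [(10, 10), (10, 11)])
print(field[10, 10].round(2))
```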
The RAnts were programmed with only a few simple local rules: follow the gradient of the photormone field; avoid other robots where the photormone density was high; and pick up obstacles where the photormone density was high and drop them where it was low. These three rules enabled the RAnts to quickly escape their confinement and, just as importantly, allowed the researchers to explore regimes of behavior that would have been hard to probe with real ants.
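A per-robot controller implementing these three rules might look like the sketch below. The observation fields and the HIGH and LOW thresholds are hypothetical stand-ins for whatever sensing the RAnts actually use, not the paper’s implementation.

```python
from dataclasses import dataclass

# Assumed thresholds on the normalised photormone intensity (illustrative values).
HIGH, LOW = 0.7, 0.2

@dataclass
class Observation:
    density: float        # local photormone (light-field) intensity, 0..1
    gradient: tuple       # local gradient of the field, (dx, dy)
    robot_nearby: bool    # another RAnt within avoidance range
    obstacle_here: bool   # an obstacle within gripping range
    carrying: bool        # currently carrying an obstacle

def rant_policy(obs: Observation) -> str:
    """One control step applying the three local rules described above."""
    # Rule 3: pick up obstacles where the field is high, drop them where it is low.
    if not obs.carrying and obs.obstacle_here and obs.density > HIGH:
        return "pick_up_obstacle"
    if obs.carrying and obs.density < LOW:
        return "drop_obstacle"
    # Rule 2: avoid other robots in strongly marked (crowded) regions.
    if obs.robot_nearby and obs.density > HIGH:
        return "move_away_from_robot"
    # Rule 1: otherwise climb the photormone gradient.
    dx, dy = obs.gradient
    return f"move_along_gradient({dx:+.2f}, {dy:+.2f})"

# Example: a free RAnt in a strongly marked region next to an obstacle.
print(rant_policy(Observation(density=0.8, gradient=(0.1, -0.3),
                              robot_nearby=False, obstacle_here=True,
                              carrying=False)))   # -> pick_up_obstacle
```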
“We showed how the cooperative completion of tasks can arise from simple rules, and similar behavioral rules can be applied to solve other complex problems such as construction, search and rescue, and defense,” said Prasath.
This approach is highly flexible and robust to errors in sensing and control. It could be scaled up and applied to teams of dozens or hundreds of robots using a range of different types of communication fields. It’s also more resilient than other approaches to collaborative problem solving — even if a few individual robotic units fail, the rest of the team can complete the task.
“Our work, combining lab experiments, theory and robotic mimicry, highlights the role of a malleable environment as a communication channel, whereby self-reinforcing signals lead to the emergence of cooperation and thereby the solution of complex problems. Even without global representation, planning or optimization, the interplay between simple local rules at the individual level and the embodied physics of the collective leads to intelligent behavior and is thus likely to be relevant more broadly,” said Mahadevan.