Origami-inspired robotic plants grow with their environment
How do you deploy an environmental sensor to collect climate change readings over a prolonged period on an uninhabited island without failing? How do you power a seismic detector to operate for months in an underwater cave?
In environments that are hazardous or hard for humans to reach, a device that behaves like a native plant could be the answer.
This is the approach taken by Suyi Li, associate professor in mechanical engineering at Virginia Tech, and Clemson professor and collaborator Ian Walker. Their work is being advanced thanks to a four-year, $840,000 grant from the National Science Foundation.
Li says: “When I started to venture into robotics a few years ago, I was surprised to see that almost all robots are inspired by humans and animals to some degree.
“However, I believe the vast plant kingdom can offer us many unique lessons on approaching the design, actuation, and operation of robots. This is how Ian and I started working on this topic together.”
Li has established a research group that deploys the principles of origami to create novel forms of soft robotics with unique structures. Walker, a professor of electrical and computer engineering at Clemson, brings a rich background in biologically inspired robotics spanning two decades.
Their proposal aims to lay a broad foundation for new designs: robots capable of surviving wild conditions over the long haul.
Robots in the wild
Bringing cutting-edge electronics together with the unpredictability of nature is typically a clash of worlds. Technology is a strength when electricity is available and the environment is predictable or controlled. Those advantages break down when batteries die and parts break.
This has not prevented technology from making its way into the outdoors, but challenges have followed. Other researchers recently deployed sensors for wildfire detection in remote locations of California and Oregon, but have faced issues such as navigating rocky areas at high altitudes and working within the restrictions of protected lands.
Devices that need to be always on and programmed to detect things such as airborne particles or rare bird species generally face two big obstacles: time and environment.
These are the main challenges Li’s team is tackling. The aim isn’t to fight against the flow of nature, but to channel the very approach used by nature to produce more adaptive robotics.
Walker says: “As humans, we naturally tend to think of change on the time scale of our attention span, like seconds and minutes.
“However, long-term and continuous deployments outdoors pose alternative and unique challenges. Over weeks and months, outdoor natural environments are highly dynamic places.
“Vegetation grows up and debris comes down in storms. Robotic operation in these conditions needs to become more like the ambient environment in novel ways to maintain monitoring.”
Robots that grow and adapt
Li and Walker will not be creating robot plants or making sensors that grow from seeds. Instead, their work will draw on insights from nature that have proven durable over the long term, translating those natural mechanisms into robotic designs that adapt and respond to their environments.
Which characteristics of plants are on their radar? They have targeted the ability to track the sun, shown in the behavior of sunflowers. Also of interest are floral organs that open and close, like a Venus flytrap.
They also have taken notes from plants that attach to an object adaptively, like the winding of vines around a tree. All of these actions are the result of a plant adapting to its surroundings, and each has a set of mechanisms that make the action possible. Some of the actions are faster, like the flytrap. Some are slower, like the steady coil of vines.
To bring these characteristics into robotics, Li and Walker have combined their expertise to mimic those plant behaviors and package them into a set of innovative robotic designs.
Main image: (From left) Graduate researcher Vishrut Deshpande and Associate Professor Suyi Li inspect a plant-inspired robot prototype. Photo by Alex Parrish for Virginia Tech.