Computer science professors Xia Zhou and Alberto Quattrini Li, along with researchers from the HealthX and Reality and Robotics Labs, have created an alternative system for locating robots underwater. The system, called Sunflower, uses a drone to beam laser light through the water’s surface and track the robots. On June 28, the researchers presented their findings at the 20th annual International Conference on Mobile Systems, Applications and Services.
According to Zhou, what sets Sunflower apart from other underwater detection systems is that it uses light instead of acoustic signals, making it the first system to do so from an airborne drone.
“We are not aware of other systems that can do 3D localization of the robot in the water from the air – at least no real demonstration that we have seen before,” Zhou said.
Zhou said she and the other researchers chose to use light because they believed it would be a “better medium for both communication and sensing because of the physical properties of light.” She said that not only can light travel farther than sound, but it also has a higher “communication bandwidth.” Once they decided to use a drone, the researchers deemed light the only medium that could travel effectively from the air into the water, leading to its use for both communication and sensing, she said.
Charles Carver GR ’22, a co-author of the paper that discussed the researchers’ findings, said that Sunflower could help people explore unknown regions of the ocean and work toward combating climate change.
“Climate change is a pretty important thing, especially relating to underwater,” he said. “You can explore reefs and other underwater ecosystems with better granularity [and] knowing where they are in relation to other ecosystems, and it could show the impacts of climate change.”
In a written statement to The Dartmouth, Carver also explained that Sunflower operates with two main parts — the queen, mounted on the aerial drone, and the worker, mounted on the underwater robot. The queen steers a laser beam that transmits information between the two.
“The queen steers its laser beam through the air-water boundary, hits the worker – which senses the laser’s angle of incidence – [and] retroreflects the light back to the queen and encodes this angle information,” he wrote.
Carver wrote that the queen then senses the weak retroreflections of light and converts them to a digital signal by decoding the data received from the worker. He also wrote that he designed and built the queen’s optical circuits and hardware, while the algorithm that handles the queen’s “angle of arrival sensing” was created by Qijia Shao, another author of the paper, who received a master’s degree from the College this year.
Everyone was involved in the design, which included “combining the whole system and computing the final location [of the robot],” Carver said.
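The article stops short of Sunflower’s actual localization math, but the geometry Carver describes, a laser bending as it crosses the air-water boundary, can be sketched with Snell’s law. The Python below is a minimal, hypothetical illustration rather than the paper’s algorithm: it assumes a flat water surface at z = 0, that the queen knows its own position and beam direction, and that the worker’s depth has already been recovered from the angle information it encodes back. The function names are invented for this sketch.

```python
import numpy as np

N_AIR, N_WATER = 1.0, 1.33  # refractive indices of air and water

def refract(direction, normal, n1=N_AIR, n2=N_WATER):
    """Bend a ray crossing a boundary via Snell's law.
    `normal` must point toward the side the ray arrives from."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    cos_i = -np.dot(d, n)
    ratio = n1 / n2
    sin2_t = ratio**2 * (1.0 - cos_i**2)
    # Going from air into water, total internal reflection cannot occur.
    cos_t = np.sqrt(1.0 - sin2_t)
    return ratio * d + (ratio * cos_i - cos_t) * n

def locate_worker(queen_pos, beam_dir, worker_depth):
    """Hypothetical localization step: trace the beam from the queen
    to the water surface (z = 0), refract it, then follow the refracted
    ray down to the worker's depth (z = -worker_depth)."""
    d = beam_dir / np.linalg.norm(beam_dir)
    t_surface = -queen_pos[2] / d[2]       # beam meets the surface plane z = 0
    entry = queen_pos + t_surface * d
    d_water = refract(d, np.array([0.0, 0.0, 1.0]))
    t_depth = -worker_depth / d_water[2]   # continue down to the worker's depth
    return entry + t_depth * d_water

# Example: drone hovering 10 m above the surface, worker 5 m deep.
queen = np.array([0.0, 0.0, 10.0])
beam = np.array([0.3, 0.1, -1.0])
print(locate_worker(queen, beam, worker_depth=5.0))
```

Under these assumptions, the angle the worker senses and encodes back is what lets the queen resolve how far along the refracted ray the robot sits; the sketch simply hard-codes that result as a known depth.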
Zhou explained that the inspiration for Sunflower came from Li’s previous work as an underwater roboticist. She added that Li has focused on programming robots underwater for a variety of tasks and understands the challenges of robot-to-system communication.
“One challenge I learned from [Li] is the difficulty of communicating with the robots and also knowing where they are,” Zhou said. “The mainstream method now is mostly based on acoustics…but we thought of doing something different.”
Li said that prior to developing Sunflower, he and Zhou collaborated on a project called AmphiLight, which enables wireless communication between an aerial drone and an underwater robot using a laser beam; Carver was also a lead author on that work. Li said they expanded upon AmphiLight to create Sunflower, which uses laser light for localization purposes.
“The [previous] paper showed that you could do wireless communication with laser light,” Carver said. “And this paper [‘Sunflower: Locating Underwater Robots From the Air’] was the immediate follow-up.”