A team of researchers at Stanford has designed a 4D camera which could improve vision for applications in robotics and virtual and augmented reality technologies.
The new vision technique could also be used in autonomous vehicles, say the researchers – Donald Dansereau, a postdoctoral fellow in electrical engineering, Gordon Wetzstein, assistant professor of electrical engineering, and others.
Universities are doing a lot of interesting work in robotics, and the Massachusetts Institute of Technology and Stanford are two of the most active in the field.
Both universities showed new robots this week: MIT’s is a bottle-shaped device that inspects pipes, while Stanford’s is a machine said to grow like a vine.
According to DigitalTrends.com, MIT’s PipeGuard robot, which swims through pipes to detect any problems, recently won its team $10,000 in the university’s competition.
The YouTube video for the device (above) describes it as a “leak detection robot for city water distribution systems”.
Stanford, which also made a video (below) of its strange plant- or worm-like robot, said its invention “could be useful in search and rescue and medical applications”.
Commercial customers for the robots may already be lined up, and perhaps the two teams could spin out into startup companies.
When Stanford’s autonomous car Shelley nears speeds of 120 mph as it tears around a racetrack without a driver, observers’ natural inclination is to exchange high-fives or simply mouth, “wow”.
Chris Gerdes and his students, however, flip open laptops and begin dissecting the car’s performance. How many g-forces did Shelley pull through turns 14 and 15? How did it navigate the twisty chicane? What did the braking forces look like through the tight turn 5?