
MIT’s Leading the Pack With This Cool New Autonomous Drone Tech

Peter Rejcek


5-31-18

Any Star Wars fan knows that the chances of successfully navigating an asteroid field are approximately 3,720 to 1. The odds would probably be even steeper against today’s autonomous drones, which fly quite a bit slower than sublight speed and without the mad skills of Han Solo.

Researchers at MIT believe they have hit upon a solution—more than one, actually—to train drones to move quickly through a crowded, complex environment, though we’re probably light years away from navigating through hostile star systems.

One solution, dubbed “Flight Goggles,” involves streaming a virtual reality environment to the drone as it flies through empty space.

“The system is at the intersection of motion capture equipment, drone technology, and high-bandwidth communications,” Sertac Karaman, associate professor of aeronautics and astronautics at MIT, told Singularity Hub. “We had to develop a number of hardware and software components to make it work.”

Generally, today’s autonomous drones are capable of mapping their surroundings as long as they cruise at low speeds. Karaman eventually wants to challenge human pilots in the emerging sport of drone racing, where aircraft zip through obstacle courses at speeds of 120 miles per hour, but a faster pace requires a different approach. That’s because even the smartest drones can’t process visual information fast enough; the slightest change in the environment can send them crashing.

Karaman explains that the VR environment is useful for developing many types of robotic vehicles, especially when the physics of the vehicle are very complex.

“When you have drones go very, very fast, their aerodynamics dominate their behavior,” he said. “In this VR environment, the drone senses the real aerodynamics of fast flight. We believe this is key to developing the drones themselves. Aerodynamics for fast vehicles is very hard to simulate.”
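
To make the concept concrete, here is a minimal sketch of how a VR-in-the-loop training cycle like this could be structured. Everything in it (the mocap, renderer, and drone objects and their methods) is a hypothetical placeholder, not MIT’s actual software: the drone flies through a real, empty room while its vision system is fed synthetic imagery rendered at its true pose.

```python
def vr_in_the_loop_step(mocap, renderer, drone):
    """One cycle of a hypothetical 'Flight Goggles'-style training loop."""
    pose = mocap.get_pose()             # true position/attitude from motion capture
    frame = renderer.render(pose)       # photorealistic view of the virtual scene at that pose
    drone.process_camera_frame(frame)   # the vision pipeline runs on the synthetic image
    command = drone.plan_and_control()  # the drone plans around virtual obstacles...
    drone.apply(command)                # ...while real aerodynamics act on the real vehicle
```

The payoff is that a misjudged maneuver breaks a virtual window rather than the aircraft, while the flight dynamics stay entirely real.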

The video below demonstrates the VR training concept using a 3D-printed nylon and carbon-fiber-reinforced drone built around custom circuit boards that integrate a powerful embedded supercomputer. It also sports an inertial measurement unit and a camera.

 

Over the course of 10 flights, the drone, flying at about five miles per hour, successfully flew through a virtual window 361 times. It “crashed” into the virtual window only three times. Karaman and his team presented the results of their experiment at last week’s IEEE International Conference on Robotics and Automation in Brisbane, Australia.

Don’t Know Where You’re Going Until You’re Gone

It’s no secret that MIT is at the forefront of robotics engineering and the development of artificial intelligence systems. The institute’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is the largest research center at MIT, with over 60 research groups representing more than 900 researchers, professors, postdocs, graduate students, and undergrads.

That expertise extends to drones. A team at CSAIL announced earlier this year that it had developed a navigation system, called NanoMap, that allows drones to fly at a speed of 20 mph through dense environments like forests.

The researchers, who included MIT professor Russ Tedrake, director of CSAIL’s Center for Robotics and leader of the institute’s DARPA Robotics Challenge team, came up with their own version of the Heisenberg uncertainty principle, which roughly states that certain pairs of measurements of a subatomic particle, such as where it is and where it’s going, cannot both be known precisely at the same time.

In the case of a drone moving through unfamiliar territory, NanoMap considers the drone’s position over time to be uncertain and accounts for that uncertainty. It uses a depth-sensing system to stitch together a series of measurements about the drone’s immediate surroundings. This allows it to not only make motion plans for its current field of view, but also anticipate how it should move around in the hidden fields of view that it has already seen, according to the team.

“It’s kind of like saving all of the images you’ve seen of the world as a big tape in your head,” said Peter Florence, a graduate student in Tedrake’s lab, in a press release. “For the drone to plan motions, it essentially goes back into time to think individually of all the different places that it was in.”

In effect, the drone takes a very existential view of its flight path, meaning that it continues to move forward doggedly even though it doesn’t perfectly know its position and orientation as it moves through the world.
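
A rough sketch of the idea, under simplifying assumptions of my own rather than the authors’ published code: store each depth measurement alongside the pose at which it was taken, let the pose uncertainty grow with age, and inflate remembered obstacles by that uncertainty when checking whether a candidate motion is safe.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DepthFrame:
    points: np.ndarray   # obstacle points, shape (N, 3), in this frame's local coordinates
    pose: np.ndarray     # 4x4 transform from this frame to the current body frame
    pose_sigma: float    # accumulated pose uncertainty since the frame was captured

def min_clearance(history: list, candidate: np.ndarray) -> float:
    """Worst-case distance from a candidate waypoint to any remembered obstacle,
    treating each frame's obstacles as inflated by that frame's pose uncertainty."""
    clearance = np.inf
    for f in history:
        # Express the candidate point in the old frame's coordinates.
        local = np.linalg.inv(f.pose) @ np.append(candidate, 1.0)
        nearest = np.min(np.linalg.norm(f.points - local[:3], axis=1))
        clearance = min(clearance, nearest - f.pose_sigma)
    return clearance
```

Subtracting the uncertainty keeps the planner conservative about measurements taken long ago, which is the heart of the uncertainty-aware behavior described above.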

Florence was the lead author of the paper describing the NanoMap navigation system, which was also presented at the Brisbane conference.

Driving Innovation for Autonomous Vehicles

Both navigation systems have potential applications beyond fast-moving drones as the technology matures, from mundane tasks like delivering packages to search-and-rescue operations in disaster zones.

In the case of Flight Goggles, for the vehicle to reach drone-racing speeds, the machine will need to process camera images and other sensory data extremely fast, according to Karaman. So fast, he explained, that the cameras would need to capture hundreds or even thousands of frames every second, with massively powerful onboard computers crunching that data in real time.

“In some sense, the drone would be seeing everything evolve in slow motion and making decisions to safely navigate at very high speed,” Karaman said. “If we can build this technology, it may have applications throughout robotics.”
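
Some back-of-envelope arithmetic (mine, not from the article) shows what that slow-motion view buys at racing speeds:

```python
# Distance a drone travels between camera frames at 120 mph (~53.6 m/s).
speed_mps = 120 * 1609.344 / 3600      # miles per hour -> meters per second
for fps in (30, 240, 1000):
    gap_cm = speed_mps / fps * 100
    print(f"{fps:>5} fps -> {gap_cm:6.1f} cm of blind travel between frames")
# At 30 fps the drone covers roughly 1.8 m between frames; at 1,000 fps
# that shrinks to about 5 cm, leaving far more frames per obstacle.
```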

For instance, imagine a self-driving car that switches on very fast cameras and very powerful computers when it is heading toward a crash. The system could try to understand what the driver is attempting to do and avoid the collision, nudging the car slightly in the direction commanded by the driver without running off the road.

“While these technologies may sound far-fetched, they may be the technologies we rely on to reduce and perhaps eliminate car accidents a couple decades from now,” noted Karaman, who has also helped develop computer hardware that can run sophisticated algorithms on drones the size of a bottle cap.

A Selfie in the Sky

These revolutionary drone technologies could also be used to capture the ultimate selfie.

Skydio, a startup founded by three MIT graduates, has released a drone called R1 that is equipped with 13 cameras capturing omnidirectional video. It’s being marketed to extreme athletes and anyone else who wants to film themselves in action with a personal flying videographer.

The company has raised $70 million to bring the autonomous camera to market, attracting money from major VC firms like Accel Partners and Andreessen Horowitz, as well as leading AI chipmaker Nvidia.

Skydio claims the drone operates at level-four autonomy, which is the second-highest degree of self-driving artificial intelligence, just short of complete autonomy. The system combines computer vision and a deep neural network to identify and track a subject. It can actually predict a subject’s next move so it will be in the optimal location for filming.
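
Skydio hasn’t published how its tracker works, so the following is purely illustrative: the “predict a subject’s next move” step can be approximated with something as simple as a constant-velocity extrapolation of the tracked position.

```python
import numpy as np

def predict_subject(positions: np.ndarray, dt: float, horizon: float) -> np.ndarray:
    """Purely illustrative stand-in for a motion predictor: positions is a
    (T, 3) array of recent subject locations sampled every dt seconds."""
    velocity = (positions[-1] - positions[-2]) / dt   # latest velocity estimate
    return positions[-1] + velocity * horizon         # expected location `horizon` seconds out
```

A real system would presumably fuse many frames through a learned motion model, but even this toy version shows how a predicted position lets the drone move its camera into place ahead of time.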

 

The founders of Skydio also believe their drone has a future outside of being a selfie camera, with applications including inspecting commercial real estate, power lines, and energy infrastructure for damage.

“People have talked about using drones for these things, but they have to be manually flown and it’s not scalable or reliable,” said Adam Bry, co-founder and CEO of Skydio, in a press release. “We’re going in the direction of sleek, birdlike devices that are quiet, reliable, and intelligent, and that people are comfortable using on a daily basis.”

It’s a concept that might just take off thanks to the ongoing research at MIT.

https://singularityhub.com/2018/05/31/mit-researchers-drive-new-advances-in-drone-technology/#sm.0001ge5tabu6yflksqo22u7yw3bp4