Kitware Develops Novel System for Autonomous Robot Navigation

The developed technology will provide autonomous robots with advanced vision capabilities for use in military and search-and-rescue operations.

Clifton Park, NY (PRWEB) December 05, 2012

Kitware, a leading provider of custom R&D software solutions, today announces the award of $100,000 in Phase I STTR funding from the United States Army to develop a novel robot navigation system based on high-level landmarks for use in military and search-and-rescue applications. The project is a collaboration of Kitware’s computer vision expertise and Texas A&M University’s renowned robotics capabilities.

The use of autonomous robots can improve safety and increase situational awareness in a broad range of military and commercial intelligence applications. This technology can be used to explore an unknown combat environment to locate enemy forces, retrieve wounded allies, or assess enemy artillery strength. In disaster recovery or search-and-rescue efforts, autonomous robots can be deployed in areas that would be inaccessible or extremely dangerous for human responders.

Low communication bandwidth due to interference typically results in delays of many seconds between updates, making it impossible for a human operator to directly control a robot from outside the scene. Instead, a supervisory control system must be used to allow the operator to issue high-level commands that a robot can follow fairly autonomously for minutes or longer.

In this effort, the team will develop a new robot navigation paradigm that incorporates a multilayered feature graph (MFG) based on high-level visual landmarks into the supervisory control system. The MFG-derived results will be combined with segmentation techniques to extract salient landmarks and build a 3D scene model. Operators will then issue high-level commands such as “follow this wall” or “go around that object,” which the robot will autonomously execute in real time. This new navigation paradigm will provide the operator with an increased sense of telepresence and situational awareness, leading to more accurate completion of tasks.

“Effective navigation in an unknown environment with limited feedback is a tough challenge,” said Amitha Perera, Technical Leader at Kitware and Principal Investigator on this project. “We’ve assembled a great team for this effort, and we are all inspired by the potential future applications of autonomous robot navigation.”

For more information on Kitware’s computer vision modeling and reconstruction capabilities, and how they can be leveraged for your projects, please contact kitware(at)kitware(dot)com.

This material is based upon work supported by the United States Army under Contract No. W56HZV-12-C-0408. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Army.

About Kitware
Kitware is an open-source solutions provider for research facilities, government institutions, and corporations worldwide. Founded in 1998, Kitware specializes in research and development in the areas of visualization, medical imaging, computer vision, quality software process, data management, and informatics. Kitware is headquartered in Clifton Park, NY, with offices in Carrboro, NC, Santa Fe, NM, and Lyon, France. More information can be found at http://www.kitware.com.

