
Board # 143 : MAKER: Urban Search and Rescue Robot: Visual Localization and Navigation



2017 ASEE Annual Conference & Exposition


Columbus, Ohio

Publication Date

June 24, 2017

Start Date

June 24, 2017

End Date

June 28, 2017

Conference Session

Make It!


Paper Authors


Cristal Monet Johnson Carl Wunsche Sr. High School


Sheng-Jen Hsieh Texas A&M University


Dr. Sheng-Jen (“Tony”) Hsieh is a Professor in the Dwight Look College of Engineering at Texas A&M University. He holds a joint appointment with the Department of Engineering Technology and the Department of Mechanical Engineering. His research interests include engineering education, cognitive task analysis, automation, robotics and control, intelligent manufacturing system design, and micro/nano manufacturing. He is also the Director of the Rockwell Automation laboratory at Texas A&M University, a state-of-the-art facility for education and research in the areas of automation, control, and automated system integration.




The Tetrix Urban Search and Rescue (S&R) robot is a mobile robot equipped with a vision system and a real-time image processing engine. Both the vision system and the processor can be accessed over a wireless connection, allowing users to navigate the robot through unknown terrain using streaming video from the vision system for image mapping and obstacle avoidance. An instructional module for high school students was created based on this robot. Students planned, reconfigured, and controlled a Tetrix robot for a given search-and-rescue scenario. During lectures, the instructor demonstrated how to control the robot using various approaches, such as a remote panel or a computer. Students then familiarized themselves with a pre-built Tetrix demo robot's structure, control system, and sensors, focusing particularly on the vision subsystem. A special activity ("Ranger, Help") was conducted. The class was divided into three-person teams, and each team selected a Control Operator (CO), a Data Specialist (DS), and a Field Technician (FT). The CO controlled the Ranger robot's movements through all simulations. Each team simulated a search for survivors in a town ravaged by two tornadoes. The DS and the CO watched a monitor to analyze and document what the Ranger found. The team's first priority was to locate survivors and communicate their locations to the FT so that field teams could go in and rescue them. Any non-surviving locals were documented so that recovery teams could return for them. The FT informed and coordinated the lifesaving efforts of the field teams; once the FT dispatched a site team, they moved on to the next location reported by the DS. Through teamwork, students learned how vision systems work, how to process images into meaningful knowledge, and how to collaborate to accomplish an S&R task using a robot platform. Evaluation results suggested that students were highly motivated by the activity. Learning of image processing tasks was moderate, but students were proficient in controlling an S&R robot for a given task.
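The abstract mentions processing streaming video for obstacle avoidance but does not describe the pipeline used. As a hedged illustration only (this sketch is not from the paper; the function name, threshold, and frame representation are assumptions for teaching purposes), a minimal threshold-based obstacle check on a grayscale frame might look like:

```python
# Hypothetical sketch: a simple obstacle check on a grayscale frame,
# represented as a 2D list of 0-255 intensity values. Dark regions in
# the lower half of the frame (nearest the robot) are treated as
# potential obstacles in the robot's path.

def detect_obstacle(frame, threshold=60, min_dark_fraction=0.2):
    """Return True if the lower half of the frame contains enough
    dark pixels to suggest an obstacle ahead."""
    lower_half = frame[len(frame) // 2:]
    pixels = [p for row in lower_half for p in row]
    dark = sum(1 for p in pixels if p < threshold)
    return dark / len(pixels) >= min_dark_fraction

# Example: a 4x4 frame that is uniformly bright (clear path) versus one
# whose bottom rows are mostly dark (possible obstacle).
clear_frame = [[200] * 4 for _ in range(4)]
blocked_frame = [[200] * 4, [200] * 4, [30] * 4, [30] * 4]

print(detect_obstacle(clear_frame))    # → False
print(detect_obstacle(blocked_frame))  # → True
```

In a classroom setting, a check like this could run on each frame of the streamed video, with the threshold tuned to the lighting of the simulated disaster site.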

Johnson, C. M., & Hsieh, S. (2017, June), Board # 143 : MAKER: Urban Search and Rescue Robot: Visual Localization and Navigation Paper presented at 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. 10.18260/1-2--27759

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015