Background
There are many emerging applications for swarm robotics. One example is smart city transportation with autonomous cars (https://www.transportation.gov/smartcity). By sharing information among vehicles, a distributed system can compute optimized routes for the vehicles and improve transportation efficiency across the city.
Another application is emergency rescue. In an emergency, it is risky for rescue workers to enter the accident building, since the original structure may already have been altered by the accident. By deploying several rescue robots instead, the changed structure can be fully explored and trapped people or property located without putting human workers at risk. By stitching the discovered information together to reconstruct a full map of the building, responders can prepare better and make more informed decisions about the situation.
Statistics show how urgently this technology is needed. According to the U.S. Fire Administration, fires in the United States caused 3,275 deaths and over $11 billion in property loss in 2014, and from 2006 to 2015 more than 1,000 firefighters were lost in the line of duty. Smart emergency countermeasures therefore have great potential for saving lives.
Testbed Setup
For software simulation, high-school student John Song has developed a web-based simulator for multi-micromouse coordinated maze discovery (available here), which can be used to develop and test new coordinated maze navigation algorithms. We have also used the CORE network emulator to simulate coordinated micromouse maze discovery over TCP/IP communications. A demo video of the simulation is shown below. Each micromouse is represented as an Android icon in a different color, and a Python visualization is also provided for viewing the reconstruction of the original maze.
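To illustrate how such a Python visualization can reconstruct the original maze from what the robots report, below is a minimal sketch that merges per-robot wall observations into one shared grid and draws it with matplotlib. The message format, cell coordinates, and function names are assumptions for illustration, not the simulator's actual code.

```python
import matplotlib.pyplot as plt

# Each micromouse reports the cells it has visited as (x, y, walls) tuples,
# where walls is a subset of {"N", "E", "S", "W"}. This layout is an
# assumption for the sketch; the real message format may differ.
reports = {
    "mouse_red":  [(0, 0, {"S", "W"}), (0, 1, {"W"}), (1, 1, {"N"})],
    "mouse_blue": [(3, 0, {"S", "E"}), (3, 1, {"E"}), (2, 1, set())],
}

def merge_reports(reports):
    """Union all per-robot observations into one global maze map."""
    maze = {}
    for cells in reports.values():
        for x, y, walls in cells:
            maze.setdefault((x, y), set()).update(walls)
    return maze

def draw_maze(maze):
    """Draw every discovered cell and the walls known around it."""
    fig, ax = plt.subplots()
    for (x, y), walls in maze.items():
        ax.plot(x + 0.5, y + 0.5, "k.")  # mark the discovered cell
        if "N" in walls: ax.plot([x, x + 1], [y + 1, y + 1], "b-")
        if "S" in walls: ax.plot([x, x + 1], [y, y], "b-")
        if "E" in walls: ax.plot([x + 1, x + 1], [y, y + 1], "b-")
        if "W" in walls: ax.plot([x, x], [y, y + 1], "b-")
    ax.set_aspect("equal")
    plt.show()

draw_maze(merge_reports(reports))
```

In the actual simulation the reports arrive over the network rather than from a hard-coded dictionary, but the merge step is the same idea: take the union of every robot's observations.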
In our lab, we built a micromouse maze and several micromouse robots based on LEGO Mindstorms EV3. The robots must coordinate to discover the whole maze in a decentralized way in the shortest possible time. In our project, each robot shares information through asynchronous broadcasts and makes its own decisions without waiting for the others, as sketched below.
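The sketch below shows one way such asynchronous broadcasting could be done over UDP in Python: each robot announces newly discovered cells to all peers and applies whatever updates have already arrived before planning its next move, never blocking on the others. The port number, JSON message layout, and function names are assumptions for illustration rather than the project's actual protocol.

```python
import json
import socket

BROADCAST_ADDR = ("255.255.255.255", 9999)  # port is an assumed value

def make_socket():
    """UDP socket that can both broadcast and receive without blocking."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", BROADCAST_ADDR[1]))
    sock.setblocking(False)
    return sock

def broadcast_discovery(sock, robot_id, cell, walls):
    """Announce a newly explored cell; no acknowledgement is expected."""
    msg = {"robot": robot_id, "cell": cell, "walls": sorted(walls)}
    sock.sendto(json.dumps(msg).encode(), BROADCAST_ADDR)

def drain_updates(sock, maze):
    """Apply any updates that have already arrived, then return immediately."""
    while True:
        try:
            data, _ = sock.recvfrom(4096)
        except BlockingIOError:
            return maze
        msg = json.loads(data)
        # Receiving our own broadcast is harmless: the union is idempotent.
        maze.setdefault(tuple(msg["cell"]), set()).update(msg["walls"])
```

A robot would call broadcast_discovery right after sensing a new cell and drain_updates at the top of its decision loop, so route planning proceeds even if some peers are slow or silent.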
Our micromouse hardware is the LEGO Mindstorms EV3. Each robot carries three ultrasonic sensors, mounted on the front and on the two sides, and one gyro sensor mounted at the center of the robot controller and fixed in place to avoid unnecessary vibration.
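As a rough sketch of how these sensors might be read with the ev3dev2 Python bindings, the snippet below polls the three ultrasonic sensors for nearby walls and the gyro for the current heading. The port assignments and the 20 cm wall threshold are assumptions for illustration; the actual wiring and calibration on our robots may differ.

```python
from ev3dev2.sensor import INPUT_1, INPUT_2, INPUT_3, INPUT_4
from ev3dev2.sensor.lego import GyroSensor, UltrasonicSensor

WALL_THRESHOLD_CM = 20  # assumed distance below which we treat a reading as a wall

# Port assignments are assumptions for this sketch.
left_us = UltrasonicSensor(INPUT_1)
front_us = UltrasonicSensor(INPUT_2)
right_us = UltrasonicSensor(INPUT_3)
gyro = GyroSensor(INPUT_4)

def sense_walls():
    """Return which adjacent cell boundaries look like walls, plus the heading."""
    return {
        "left": left_us.distance_centimeters < WALL_THRESHOLD_CM,
        "front": front_us.distance_centimeters < WALL_THRESHOLD_CM,
        "right": right_us.distance_centimeters < WALL_THRESHOLD_CM,
        "heading_deg": gyro.angle,  # accumulated rotation since the gyro was reset
    }
```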
The video below gives a short preview of how a micromouse moves through the maze to complete the discovery.
Faculty
- Dr. WenZhan Song
Students
- Zhiwei Luo
- Yang Shi
- John Song