Winning Entries of Demo Program Contest 2007

1st Place: Karugamo (Spot-billed Duck) Troop


A tracking system is created to realize luggage transportation using multiple Beego robots and laser range sensors. Because no preset model of the tracking target is provided to the system, the Karugamo troop tracks and follows not only human legs but also other objects. This demonstration presents a robust tracking system based on laser range sensor data together with the Beego's full mobility performance.
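Model-free following of this kind can be done by clustering each laser scan and re-associating the cluster nearest to the previously tracked target. The sketch below illustrates the idea only; the function names and the gap threshold are assumptions, not the actual Karugamo implementation.

```python
import math

def cluster_scan(points, gap=0.15):
    """Split consecutive (x, y) scan points into clusters at range gaps."""
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) > gap:
            clusters.append(current)
            current = []
        current.append(q)
    clusters.append(current)
    return clusters

def centroid(cluster):
    xs, ys = zip(*cluster)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def track(prev_target, points, gap=0.15):
    """Follow whatever object is there: pick the cluster whose centroid
    is nearest to the target position from the previous scan."""
    cents = [centroid(c) for c in cluster_scan(points, gap)]
    return min(cents, key=lambda c: math.dist(c, prev_target))
```

Because the association is purely by proximity, the same loop tracks legs, luggage, or any other object, which matches the model-free behavior described above.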

2nd Place: Coke Delivery Robot


The robot autonomously collects delivered PET bottles by knocking them down and rolling them away. When each Coke is delivered, the robot memorizes the environment data of the location, obtained with the laser range sensor. During recollection of the Coke bottles, the robot recalls the environment data of the delivery spots, plans a path from them, and tips over the bottles in order of shortest distance from the robot. The robot switches among the map-making, delivery, and recollection states autonomously, so it can be operated easily without any keyboard interaction.
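Visiting the memorized spots "in the order of the shortest distance from the robot" reads like a greedy nearest-neighbor ordering. A minimal sketch under that assumption (the function name and the exact planner used by the robot are hypothetical):

```python
import math

def collection_order(robot_pos, spots):
    """Greedy nearest-neighbor ordering over memorized delivery spots:
    repeatedly visit the closest remaining spot. This is a
    shortest-distance heuristic, not an optimal tour."""
    order, pos, remaining = [], robot_pos, list(spots)
    while remaining:
        nxt = min(remaining, key=lambda s: math.dist(pos, s))
        remaining.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order
```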

3rd Place: Catch Me 2

Morales Saiki Luis Yoichi

In this demo, two Yamabico robots were used simultaneously, communicating over sockets. Two landmarks were previously placed in the environment and shown to the robot. First, one robot travels on a square path, correcting its position and angle by scanning the landmarks with a URG sokuiki sensor. The mode is then changed: the first robot is moved by a person and sends the trajectory path to the second robot through wireless LAN. The second robot then moves through the shown trajectory, correcting its position with the two landmarks.

4th Place: Amidakuji Lottery

KAWATA Hirohiko

The Beego robot is the God of Amidakuji Lottery.

The focus of this demonstration is that it exploits a weak point of a laser range sensor. The front laser range sensor of the robot is tilted 45 degrees towards the floor. Aluminum and black tapes are pasted on the floor in the form of an amidakuji. When the laser emitted by the range sensor hits the aluminum surface, it is reflected away in the opposite direction; the black surface, on the other hand, absorbs it. In either case the laser is not reflected back to the sensor, so an error is returned for the laser data on these surfaces. Meanwhile the floor is a scattering surface and returns a normal distance value. Thus the robot detects the errors returned in the direction of the tape and moves along the tape.

So the Beego robot moves from the start location of the amidakuji and follows the narrow band of the tape. When the vertical tape intersects a horizontal tape, returning a wide range of errors, the robot turns 90 degrees clockwise and once again follows the tape until the next intersection, where it turns 90 degrees counterclockwise. This is repeated until the tape is cut off, which results in no detected error in the laser range data, so the robot stops at the end of the line.
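The behavior above reduces to classifying each tilted scan by the width of its error band. A minimal sketch of that decision step (threshold value and names are assumptions, not the demo's actual code):

```python
def steer(error_flags, wide_threshold=20):
    """Classify one tilted scan of the floor: error_flags[i] is True where
    the beam returned an error (tape), False where the floor scattered the
    beam back normally. Returns 'follow', 'turn', or 'stop'."""
    width = sum(error_flags)
    if width == 0:
        return 'stop'    # tape cut off: no error anywhere in the scan
    if width >= wide_threshold:
        return 'turn'    # crossing a horizontal tape: wide error band
    return 'follow'      # narrow error band: keep tracking the tape
```

A surrounding loop would alternate the 90-degree turn direction (clockwise, then counterclockwise) each time 'turn' is reported, as described above.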

The God of Amidakuji smiles at the chosen person at the location where it stops.

5th Place: Catch Me If You Can


The robot defies a person's commands and runs away from the person. Two laser range sensors are mounted, one on the front and one on the back of the robot, obtaining scans in all directions to detect human presence. In an enclosed area, the robot runs away from people without colliding with the wall, at a speed proportional to the person's proximity. The wall data, mainly straight lines, can be distinguished from a person by the variance about a least-squares line fit, from which an escape vector can also be determined. If a person chases the robot slowly, it is able to outrun them and escape, but it fails and instead runs into the wall when two people chase after it.
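The wall/person test amounts to fitting a line to a scan segment and thresholding the residual variance: wall segments fit a line tightly, leg clusters do not. A sketch of that check (threshold and names are illustrative assumptions):

```python
def line_fit_variance(points):
    """Fit y = a*x + b by least squares and return the residual variance.
    Low variance: points lie on a line (wall). High: non-linear (person)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx * sx
    if abs(denom) < 1e-12:        # all x equal: a vertical wall segment
        return 0.0
    a = (n * sxy - sx * sy) / denom
    b = (sy - a * sx) / n
    return sum((y - (a * x + b)) ** 2 for x, y in points) / n

def is_wall(points, threshold=0.01):
    return line_fit_variance(points) < threshold
```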

6th Place: Robot Actor


With voice recognition, the Beego robot moves and speaks, using speech synthesis, according to the phrase detected from a person. A laser range sensor is used to detect the audience so that the robot does not collide with them.

7th Place: Face Recognition Rocket Launcher

Aji Prasetyo

X and Z axes are created using the Beego robot's frame, and two servo motors are attached at the bottom of the X and Z axes, transforming the Beego robot into a manipulator with two degrees of freedom. Face recognition is achieved using a USB camera and the Haar face-detection method in OpenCV. The location of the face recognized in the image is registered, and the servos move the camera so that the face is positioned in the middle of the image.
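OpenCV's Haar detector (`cv2.CascadeClassifier.detectMultiScale`) returns face boxes as (x, y, w, h); centering the face then needs only a proportional command on each servo. The sketch below shows that centering step with hypothetical gain, deadband, and sign conventions (the demo's actual servo code is not given):

```python
def servo_update(face_box, frame_w=640, frame_h=480, gain=0.05, deadband=10):
    """Proportional command to center a detected face in the image.
    face_box is (x, y, w, h) as returned by a Haar face detector;
    returns (pan_delta, tilt_delta) in degrees (sign convention assumed)."""
    x, y, w, h = face_box
    err_x = (x + w / 2) - frame_w / 2    # +: face is right of center
    err_y = (y + h / 2) - frame_h / 2    # +: face is below center
    pan = -gain * err_x if abs(err_x) > deadband else 0.0
    tilt = -gain * err_y if abs(err_y) > deadband else 0.0
    return pan, tilt
```

The deadband keeps the servos from jittering when the face is already near the image center.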