Winning Entries of the Demo Program Contest 2019

1st Place: Robot for people wearing glasses

ENDO Ryoto

I made a robot that can be moved only by a person wearing glasses. A webcam attached to the end of a long frame recognizes whether the person standing in front of it is wearing glasses. The robot follows the person, using LiDAR (URG-04LX), only while the person is wearing glasses. I used a Haar-like feature classifier for face detection and a CNN (convolutional neural network) to recognize whether glasses are present. The CNN was trained on about 10,000 images each of people wearing and not wearing glasses, collected from the web.
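A rough sketch of this kind of pipeline (the Haar cascade is OpenCV's stock frontal-face model; the CNN file name, input size, and sigmoid output are assumptions, not the contestant's actual model):

```python
# Sketch: Haar-like face detection followed by CNN glasses classification.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
glasses_cnn = load_model("glasses_cnn.h5")  # hypothetical trained model

def person_wears_glasses(frame):
    """Return True if a detected face is classified as wearing glasses."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Crop the face, scale to the CNN's assumed input size, normalize.
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64)) / 255.0
        prob = glasses_cnn.predict(crop[np.newaxis])[0][0]
        if prob > 0.5:  # assumed sigmoid output: 1 = glasses
            return True
    return False
```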

2nd Place: Real Robot Shooting - A Shooting Game in the Real World

NAGARA Keita

I developed a shooting game that is played by operating robots in the real world. The player fights by driving a Yamabico robot with a joystick. Another Yamabico robot runs as the enemy and is detected by LiDAR. I did not use a game engine; I programmed the game from scratch on ROS and designed it so that enemy detection, bullet generation, collision detection, and image drawing are all processed at high speed in real time. The game screen is drawn with OpenCV and the image is displayed in rviz.
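A minimal sketch of the drawing-and-display step (topic name and game objects are placeholders; this is not the contestant's code):

```python
# Sketch: draw a game frame with OpenCV and publish it as a ROS image
# topic, which rviz can then display with its Image panel.
import cv2
import numpy as np
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

def main():
    rospy.init_node("game_screen")
    pub = rospy.Publisher("game_screen/image", Image, queue_size=1)
    bridge = CvBridge()
    rate = rospy.Rate(30)  # redraw at 30 Hz for smooth play
    t = 0
    while not rospy.is_shutdown():
        frame = np.zeros((480, 640, 3), dtype=np.uint8)
        # Placeholder game objects: player, enemy, and a moving bullet.
        cv2.rectangle(frame, (300, 440), (340, 470), (0, 255, 0), -1)  # player
        cv2.circle(frame, (320, 60), 15, (0, 0, 255), -1)              # enemy
        cv2.circle(frame, (320, 440 - (t * 8) % 400), 4, (255, 255, 255), -1)
        t += 1
        pub.publish(bridge.cv2_to_imgmsg(frame, encoding="bgr8"))
        rate.sleep()

if __name__ == "__main__":
    main()
```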

3rd Place: Help me

TAKESAKI Daisuke

The chickens mounted on this robot scream in response to another chicken's call. Sound is captured by the PC's built-in microphone and an FFT is applied; if the spectrum peaks in a specific frequency band continuously for a certain period of time, the robot judges that a chicken is screaming. When a scream is detected, a servomotor turns and presses the chickens' bellies.
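A minimal sketch of this detection logic (sample rate, frequency band, and frame counts are hypothetical values, not those used by the entry):

```python
# Sketch: decide a chicken is screaming when the FFT spectral peak of the
# audio stays inside a target band for several consecutive frames.
import numpy as np

RATE = 44100          # sample rate (Hz), assumed
FRAME = 2048          # samples per analysis frame
BAND = (2000, 4000)   # hypothetical scream band (Hz)
HOLD = 10             # consecutive in-band frames required to trigger

def peak_in_band(samples):
    """Return True if this frame's spectral peak lies inside BAND."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / RATE)
    peak = freqs[np.argmax(spectrum)]
    return BAND[0] <= peak <= BAND[1]

def detect_scream(frames):
    """Yield True once the peak has stayed in-band for HOLD frames."""
    streak = 0
    for samples in frames:
        streak = streak + 1 if peak_in_band(samples) else 0
        yield streak >= HOLD
```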

4th Place: Yamabico the Calligrapher

ANDO Hidem

I attached an arm to Yamabico (a mobile robot) and had it write "レーワ" ("Reiwa" in katakana) on a wall, because the era name changed from Heisei to Reiwa. The robot is PD-controlled using URG sensor readings to keep its distance from the wall constant. Only two servos were used for the arm this time, so the robot controls only the position of the brush tip and does not consider its posture. As a result, even though the robot itself keeps a constant distance from the wall, the pen tip, whose coordinates are computed by inverse kinematics, moves away from the wall as it travels up and down from its initial position. The correction for this was done in software this time, but it was not accurate, so the written characters came out unnaturally distorted, as "レLヶ". Possible improvements are to make the software correction accurate or to add one more servo. Since a brush should stand against the writing surface, the latter seems better: the posture could then be controlled so that the brush is always perpendicular to the wall. Doing so might also make it possible to write more beautiful characters by taking brush pressure and the like into account.
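For illustration, a standard two-link inverse kinematics solution of the kind a two-servo arm relies on (link lengths are assumed; note that with only two joints the brush orientation follows from the position solution and cannot be commanded separately, which is the posture limitation described above):

```python
# Sketch: planar two-link inverse kinematics for the brush-tip position.
import math

L1, L2 = 0.20, 0.15  # link lengths in meters (assumed values)

def two_link_ik(x, y):
    """Return (theta1, theta2) in radians, one of the two IK solutions."""
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2
```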

5th Place: Vision-Based Object Tracking Robot

YANG Chenjie

A robot system that tracks an arbitrary target selected from an RGB-D camera image is presented. The vision-only tracker is realized with the open-source tracking algorithm called Kernelized Correlation Filter (KCF), and it runs at about 100 FPS without a GPU. The movement of the robot is then controlled stably by a velocity feedback controller designed through a dynamic inversion method.
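A minimal usage sketch of KCF through OpenCV's implementation (assumes an opencv-contrib build; the camera source and display loop are placeholders, and depth from the RGB-D camera is omitted):

```python
# Sketch: select a target in the first frame, then track it with KCF.
import cv2

cap = cv2.VideoCapture(0)          # RGB stream; depth handling omitted
ok, frame = cap.read()
bbox = cv2.selectROI("select target", frame, showCrosshair=False)
tracker = cv2.TrackerKCF_create()  # available in opencv-contrib builds
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = (int(v) for v in bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:       # ESC quits
        break
```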