Winning Entries of Demo Program Contest 2018

1st Place: Yamabico Sequencer

YAMADA,Tomohumi

I made a robot that performs music according to its sensor outputs. First, the robot performs self-localization and moves back and forth along a fixed path. After that, when we place cylinders along the path, the robot recognizes their positions and plays the corresponding sounds. For the demonstration, I tried to perform "Auld Lang Syne."
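
The sketch below illustrates the sequencer idea in Python, assuming each detected cylinder is mapped to a note by its distance along the robot's back-and-forth path. The scale, the slot spacing, and the example positions are illustrative assumptions, not the mapping actually used in the entry.

```python
# Minimal sketch: map cylinder positions along the path to notes of a scale.
# Scale choice and slot width are assumptions for illustration only.

C_MAJOR = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]
SLOT_WIDTH_M = 0.3  # hypothetical spacing of "note slots" along the path

def cylinder_to_note(distance_m: float) -> str:
    """Map a cylinder's distance along the path to a note of the scale."""
    slot = int(distance_m // SLOT_WIDTH_M) % len(C_MAJOR)
    return C_MAJOR[slot]

if __name__ == "__main__":
    # Hypothetical cylinder positions detected during one pass of the robot.
    detected_positions = [0.15, 0.75, 1.30, 1.95]
    melody = [cylinder_to_note(d) for d in detected_positions]
    print("Playing:", melody)  # ['C4', 'E4', 'G4', 'B4']
```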

2nd Place: Watchdog in the classroom

JOMURA,So

It is a clever robotic dog that patrols in front of the desks during a lecture and wakes sleeping students. I use two sensors to judge whether a student is sleeping. First, the robot checks whether a person is present using a laser sensor. Then it detects the face seen by the camera and checks whether the eyes are open or closed. To detect faces, I use a machine learning library called "dlib". The key point is to judge by combining the two sensors, which allows the robot to respond to students' various sleeping styles.
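
A minimal sketch of the camera-side check is shown below, assuming the eye open/closed decision uses dlib's 68-point facial landmarks and the common eye-aspect-ratio (EAR) heuristic. The entry only states that dlib is used for face detection, so the EAR criterion and the threshold value here are illustrative assumptions.

```python
# Minimal sketch: detect a face with dlib and judge eyes open/closed via EAR.
import math
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
# Standard dlib landmark model; it must be downloaded separately.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

RIGHT_EYE = range(36, 42)  # landmark indices of the right eye
LEFT_EYE = range(42, 48)   # landmark indices of the left eye
EAR_THRESHOLD = 0.2        # assumed threshold: below this, eyes count as closed

def eye_aspect_ratio(pts):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|) over the 6 eye landmarks."""
    d = lambda a, b: math.hypot(a.x - b.x, a.y - b.y)
    return (d(pts[1], pts[5]) + d(pts[2], pts[4])) / (2.0 * d(pts[0], pts[3]))

def eyes_closed(gray_frame) -> bool:
    """True if a face is found and both eyes look closed in this frame."""
    for face in detector(gray_frame, 0):
        shape = predictor(gray_frame, face)
        left = [shape.part(i) for i in LEFT_EYE]
        right = [shape.part(i) for i in RIGHT_EYE]
        ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
        return ear < EAR_THRESHOLD
    return False  # no face found; the laser sensor handles "is anyone there?"

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        print("Eyes closed:", eyes_closed(gray))
    cap.release()
```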

3rd Place: Happy kindergarten

TAKESAKI,Daisuke

This is Happy kindergarten! Let's enjoy tea time! But... the teacher forgot to prepare enough snacks!! NOOOO!!! Priority goes to the youngest. This robot estimates the ages of children sitting side by side and delivers a snack to the youngest child. It takes pictures of each face using the PC's built-in camera, extracts the face area using the library "dlib", and estimates age using the deep learning library "keras". After estimating the children's ages, it moves to the youngest and pops the balloon that contains a snack.
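
The sketch below shows the "find the youngest" step under the same pipeline, assuming faces are cropped with dlib and passed to a pre-trained Keras age-regression model. The model file name, its 64x64 input size, and its single scalar output are hypothetical; the entry does not specify which network was used.

```python
# Minimal sketch: crop faces with dlib, estimate ages with a Keras model,
# and pick the youngest face. Model file and input size are assumptions.
import cv2
import dlib
import numpy as np
from tensorflow.keras.models import load_model

detector = dlib.get_frontal_face_detector()
age_model = load_model("age_model.h5")  # hypothetical pre-trained model

def estimate_ages(frame_bgr):
    """Return (estimated_age, face_rectangle) for every detected face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    results = []
    for rect in detector(gray, 0):
        x1, y1 = max(rect.left(), 0), max(rect.top(), 0)
        x2, y2 = min(rect.right(), w), min(rect.bottom(), h)
        face = cv2.resize(frame_bgr[y1:y2, x1:x2], (64, 64)) / 255.0
        # Assumes the model outputs a single scalar age per face.
        age = float(age_model.predict(face[np.newaxis], verbose=0)[0][0])
        results.append((age, rect))
    return results

def youngest_face(frame_bgr):
    """Pick the face with the smallest estimated age, if any."""
    ages = estimate_ages(frame_bgr)
    return min(ages, key=lambda a: a[0]) if ages else None
```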

4th Place: Guide dog robot

KATO,Mikiya

This robot acts like a guide dog: it pulls a lead gripped by the user and takes them to an empty chair. It detects chairs and people in front of the robot with general object detection using YOLO and identifies empty chairs from the positional relationship between the people and the chairs. It then estimates the real-world position of an empty chair from its position in the image and the angle of view of the web camera, and guides the user to that point. Object detection and robot control both run on the limited computing resources of a single laptop without a GPU.
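
The sketch below shows one way to turn the image position of an empty-chair bounding box into a heading for the robot, using a pinhole-style approximation of the camera's horizontal angle of view. The resolution, field-of-view value, and the fact that only the bearing (not the distance) is computed here are assumptions; the entry's full geometry is not described in detail.

```python
# Minimal sketch: bearing to a detected box from its horizontal pixel position
# and the camera's assumed horizontal field of view.
import math

IMAGE_WIDTH_PX = 640       # assumed camera resolution
HORIZONTAL_FOV_DEG = 60.0  # assumed angle of view of the web camera

def bearing_to_box(box_center_x_px: float) -> float:
    """Angle (rad) from the camera axis to the box center; left is positive."""
    # Offset of the box center from the image center, in pixels.
    offset_px = (IMAGE_WIDTH_PX / 2.0) - box_center_x_px
    # Focal length in pixels implied by the horizontal field of view.
    focal_px = (IMAGE_WIDTH_PX / 2.0) / math.tan(math.radians(HORIZONTAL_FOV_DEG) / 2.0)
    return math.atan2(offset_px, focal_px)

if __name__ == "__main__":
    # A chair box centered at x = 480 px lies right of the image center,
    # so the bearing comes out negative (turn right), about -16 degrees.
    print(math.degrees(bearing_to_box(480.0)))
```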

5th Place: Scavenger Robot

HATORI,Satoshi

I developed a robot that can pick up a piece of trash and put it into a trash bin. A small cube with an AR tag attached was used in place of the trash, and its position was obtained with a web camera. To pick up and throw away the trash, a 5-axis manipulator was newly developed and mounted on the YAMABICO robot. The series of operations (e.g., searching and picking) was implemented with ROS.
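
The sketch below shows the perception side of such a pipeline in ROS 1, assuming the AR tag on the cube is tracked with the ar_track_alvar package. The topic name, marker ID, and node name are assumptions, since the entry does not name the tag-tracking library, and the manipulator control itself is omitted.

```python
#!/usr/bin/env python
# Minimal sketch: read the AR-tagged cube's pose from ar_track_alvar in ROS 1.
import rospy
from ar_track_alvar_msgs.msg import AlvarMarkers

TARGET_MARKER_ID = 0  # hypothetical ID of the tag on the trash cube

def marker_callback(msg):
    for marker in msg.markers:
        if marker.id == TARGET_MARKER_ID:
            p = marker.pose.pose.position
            rospy.loginfo("Trash cube at x=%.2f y=%.2f z=%.2f (camera frame)",
                          p.x, p.y, p.z)
            # Here the pose would be handed to the 5-axis manipulator's
            # pick routine (not shown).

if __name__ == "__main__":
    rospy.init_node("trash_cube_tracker")
    rospy.Subscriber("/ar_pose_marker", AlvarMarkers, marker_callback)
    rospy.spin()
```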