To bridge the physical and digital worlds, a computer must be able to communicate with the real world. This motivated our team's interest in how a computer can perceive information about its surroundings. We decided to build a robot system activated and controlled by speech recognition and text image recognition.

Our team implemented this system using a MacBook Pro (late 2011), an iRobot Roomba, and an Arduino board connected to three ultrasonic sensors mounted on the front and both sides of the robot's body.
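The Arduino can stream the three sensor readings to the laptop over the USB serial link. The sketch below shows how the laptop side might parse such telemetry in Python; the wire format (comma-separated centimeter values), port name, and baud rate are illustrative assumptions, not the original implementation.

```python
def parse_distances(line: str):
    """Parse one telemetry line from the Arduino.

    Assumed (hypothetical) wire format: "left,front,right" in
    centimeters, e.g. "34,102,27". Returns a dict, or None if the
    line is malformed.
    """
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        left, front, right = (int(p) for p in parts)
    except ValueError:
        return None
    return {"left": left, "front": front, "right": right}

if __name__ == "__main__":
    import serial  # pyserial; only needed when talking to real hardware

    # Port name and baud rate are assumptions; adjust for your setup.
    with serial.Serial("/dev/tty.usbmodem1421", 9600, timeout=1) as port:
        reading = parse_distances(port.readline().decode("ascii", "ignore"))
        if reading and reading["front"] < 20:
            print("wall ahead: stop and read the signpost")
```

Keeping the parser separate from the serial I/O makes it easy to test without the robot attached.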

After completing the implementation, we built a cover for the robot from acrylic panels and ran it in a more complex maze with signposts at every corner.

HAL9000 consists of four parts: speech recognition, text recognition, distance detection, and robot control. The voice signal is transmitted over a Bluetooth microphone; the robot recognizes text through the laptop's webcam, detects distances with the ultrasonic sensors, and moves based on the integrated information via USB serial communication. When the user says "Hello, robot!", HAL9000 gets ready to start. When the user then says "Find gate N" (where N is a digit from 1 to 4), it sets out to find that gate. Whenever it faces a wall, it stops, reads the signpost, and turns in the direction the signpost indicates. It finishes when it arrives at the destination.
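The command flow above can be sketched as a small state machine: idle until the wake phrase, ready until a gate command, then navigating signpost by signpost until the goal gate is reached. This is a minimal Python sketch under stated assumptions; the state names, phrase normalization, and return values are illustrative, not the original code.

```python
import re

class HalController:
    """Hypothetical sketch of HAL9000's command flow.

    States: IDLE -> READY (after the wake phrase) -> NAVIGATING
    (after a "find gate N" command) -> DONE on arrival.
    """

    WAKE_PHRASE = "hello robot"

    def __init__(self):
        self.state = "IDLE"
        self.goal = None

    def handle_speech(self, text: str) -> str:
        # Normalize: lowercase and drop punctuation, so that
        # "Hello, robot!" matches the wake phrase.
        norm = re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()
        if self.state == "IDLE" and norm == self.WAKE_PHRASE:
            self.state = "READY"
        elif self.state == "READY" and norm.startswith("find gate "):
            gate = norm.rsplit(" ", 1)[-1]
            if gate in {"1", "2", "3", "4"}:
                self.goal = int(gate)
                self.state = "NAVIGATING"
        return self.state

    def handle_signpost(self, gate_seen: int, direction: str) -> str:
        """Called when the webcam reads a signpost at a wall.

        Returns the action to take: the signpost's direction while
        still searching, or "stop" once the goal gate is reached.
        """
        if self.state != "NAVIGATING":
            return "ignore"
        if gate_seen == self.goal:
            self.state = "DONE"
            return "stop"
        return direction  # e.g. "left" or "right"
```

For example, `handle_speech("Hello, robot!")` moves the controller to READY, `handle_speech("Find gate 3")` starts navigation, and each `handle_signpost` call either forwards the signpost's direction or stops at gate 3.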

Our team's project was selected as a representative project of the Yonsei EE Department, and we had the opportunity to demonstrate our work at E2FESTA (Capstone Project Expo) 2014 in Korea.

Designed and Implemented by Minkyu Choi and Youngwook Do