
Autonomous Tour Guide Robot Using Ultrasonic Range Sensors and QR Code Recognition in Indoor Environments

SeokJu Lee, Jongil Lim, Girma Tewolde, Jaerock Kwon
Electrical and Computer Engineering, Kettering University, Flint, MI, USA
{lee7704, lim7722, gtewolde, jkwon}@kettering.edu

Abstract— This paper addresses the challenge of mobile robot navigation in indoor environments. There is a critical need for cost-effective, reliable, and reasonably accurate solutions to meet the demands of indoor robotic applications, and researchers are exploring various approaches to this problem. The approach presented in this paper uses QR (Quick Response) codes to provide location references for mobile robots. The mobile robot is equipped with a Smartphone that is programmed to detect and read information on QR codes strategically placed in the robot's operating environment. The robot performs an autonomous run along the guide route using real-time QR code recognition. Lab information stored in the QR codes is played to visitors using the Text-to-Speech capability of the Android device. Ultrasonic range sensors, which can detect objects and measure distances with high accuracy, are used to implement the wall-following and obstacle-avoidance behaviors. The sonar range information collected by the ultrasonic range sensors is processed by a microcontroller that autonomously controls the tour guide robot. An algorithm based on a proportional-integral-derivative (PID) controller is applied to achieve more accurate motion control. Bluetooth is used to wirelessly send the information stored in the QR codes from the Smartphone to the tour guide robot. The experimental setup of the tour guide robot, along with the successful implementation of the proposed navigation technique, is presented.

Index Terms— Mobile robot, Tour guide robot, Robot navigation, QR code, Landmark, Ultrasonic range sensor, PID control, Bluetooth technology

I. INTRODUCTION

Today, robots are no longer a thing of science fiction films. From cleaning robots to medical robots, they are ubiquitous and are becoming a significant part of people's lives. As robots assume more and more roles in people's daily lives, their influence on society continues to grow. Robotic vacuum cleaners and security and surveillance systems are examples of successful indoor robot applications. Another important real-world application of indoor service robots is the use of autonomous mobile robots as tour guides in museums or exhibitions. The work presented in this paper is focused on the development of an indoor autonomous mobile robot that can be used as a tour guide for campus tours, for example during University Open Houses. The goal is to provide visitors with an automated, efficient, and improved tour experience.

During the development of the tour guide robot, the project had to deal with a number of core mobile robotics challenges, including navigation, sensor integration, and control. The proposed tour guide robot combines a Smartphone application with a mobile robot platform. The Smartphone and four ultrasonic range sensors, two at the front and two on the right side, are mounted on the robot. The robot is programmed to follow the wall using the ultrasonic range sensors while keeping a distance of about 30 cm, and to avoid collisions with obstacles such as doors by sensing and measuring distances with the front ultrasonic range sensors. A Smartphone application is programmed to perform QR code and circle shape recognition. QR codes and circle shape landmarks are strategically attached to the wall along a given route. It was observed that the recognition rate of the QR code reader falls remarkably as the speed of the robot increases. To address this issue, circle shape landmarks are attached to the walls just before the QR codes. The circle shapes can be recognized easily even while the robot is moving; when the robot detects a circle shape landmark, it reduces its speed to allow reliable recognition of the upcoming QR code.

To show the feasibility and effectiveness of our work, this paper presents the design and implementation of the proposed tour guide robot, experimental results of the navigation, and some of the practical challenges encountered. The rest of this paper is organized as follows: Section II presents a brief summary of related work in the literature; Section III presents our system implementation details; Section IV discusses the experimental results, followed by the conclusion in Section V.

II. RELATED WORKS

A number of related works on building tour guide robots have been identified in the literature. Each of them uses different sensors and has a unique method of navigation and localization. For example, Jinny [1] is a tour guide robot that has been tested in actual environments, such as an office building at KIST and an exhibit hall. It used two laser range finders and two infrared sensors for navigation, and a Monte Carlo based probabilistic map matching scheme for localization. In another related work, an interactive tour guide robot called Urbano [2] was designed to serve as a tour guide at exhibitions. The Urbano robot is equipped with a four-wheeled synchro-drive locomotion system and with two sonar sensors and one infrared sensor for detecting obstacles. The platform also carries two onboard PCs and a touch screen.

A tour guide robot from Central Michigan University called CATE (Central Automated Tour Experience) [3], [4] uses radio frequency identification (RFID) and sonar to navigate. It has a simple design and a localization method that relies on RFID technology. However, due to the limited range of the passive RFID reader, if the robot fails to read a reference tag it can easily become lost. Kulyukin et al. [5] proposed a robot guide based on RFID technology to assist visually impaired people in reaching their destinations; ultrasonic range sensors give that robot the capability to detect objects. There are also efficient methods in the literature for localization and navigation of mobile robots using passive RFID [6]-[9]. Our navigation method using QR codes is more cost-effective than navigation methods based on RFID tags because QR codes are substantially cheaper and easier to produce. Furthermore, QR codes are easy to install as navigational landmarks and can be easily changed.

The University of Deusto [10] proposed an indoor navigation and product recognition technique to provide a shopping support system for blind people. For navigation and guidance, the system uses RFID and Smartphone technology to enable accessible shopping in supermarkets. Smartphones have several features that can be used in robotics [11]. One of them is the rotation sensor, which allows a Smartphone to measure its orientation; when the phone is mounted on a robot, this sensor can be used to determine the robot's orientation. Suriyon et al. [12] proposed the design and implementation of a guide robot that uses QR codes as landmarks and implements a navigation system that can perform an autonomous run along the guide route. That guide robot carries a laptop PC to recognize the QR codes and to control the robot. Instead of a laptop, our work uses a microcontroller and a Smartphone for robot control and QR code recognition, respectively, resulting in a low-cost and light-weight solution. The use of QR codes has the following benefits: they are inexpensive, their installation is simple, and they can be easily modified to change the guide route when necessary. However, one of the main challenges with this technology is that the recognition rate of a QR code reader falls as the distance between the reader and the QR code increases. This problem can be addressed by adjusting the distance between the QR code and the camera using an adjustable mounting structure.

Belgorod State Technological University [13] developed a machine vision system (MVS) for mobile robot navigation. The MVS identifies artificial landmarks in images from a video camera with a pan-tilt mechanism and allows the robot to calculate its deviation from the desired course. The MVS includes one camera and an onboard computer running software that searches for artificial landmarks in the environment and controls the robot drive to eliminate the course deviation calculated from the landmark information. QR codes can be used in place of such artificial landmarks for mobile robot localization and navigation in indoor environments, because QR codes can encode richer information, such as directional commands and (x, y) coordinates.

III. IMPLEMENTATION DETAILS

The development of the microcontroller-based tour guide robot required both hardware and software design. This section presents the design of the major components of the tour guide robot. Fig. 1 shows the proposed tour guide robot platform used as the test-bed for the experiments, along with the arrangement of its hardware components.

Fig. 1. The proposed tour guide robot platform. The robot is composed of a microcontroller, a motor shield, four ultrasonic range sensors, and a Smartphone.


A. Hardware Design of the Tour Guide Robot

The Atmega328-based Arduino UNO R3 microcontroller is used as the brain of the system. A Bluetooth module and multiple sensors are controlled by the microcontroller. The body of the tour guide robot is built on the DFRobot four-wheel-drive (4WD) platform, an inexpensive platform well suited for research projects. A motor shield allows the microcontroller to drive the motors; it is based on the DFRobot L298 [14] shown in Fig. 2(A), and in our design it controls the driving speed and direction of the mobile robot. As shown in Fig. 2(B), the HC-06 Bluetooth module [15] is used for communication between the microcontroller and a Smartphone. As shown in Fig. 2(C), four ultrasonic sensors are used to implement the wall-following and obstacle-avoidance behaviors. Ref. [16] offers details about the ultrasonic range sensor units used in this design; they can detect objects at distances from 2 cm to 500 cm. A Smartphone, a Nexus 4, runs a mobile application (app) called the Tour Guide Robot Application.
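To illustrate how the microcontroller obtains range readings from an HC-SR04 sensor, the following minimal Arduino-style sketch triggers one sensor and converts the echo pulse width into a distance in centimeters. The pin assignments are our own assumptions for illustration, not the wiring of the actual robot.

// Minimal Arduino sketch for reading one HC-SR04 ultrasonic range sensor.
// Pin numbers are illustrative assumptions, not the authors' wiring.
const int TRIG_PIN = 2;
const int ECHO_PIN = 3;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

// Returns the measured distance in centimeters, or -1 on timeout.
float readDistanceCm() {
  // Trigger a measurement with a 10-microsecond pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // The echo pulse width is proportional to the round-trip time of flight.
  // A 30 ms timeout covers the sensor's ~500 cm maximum range.
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);
  if (duration == 0) return -1.0f;

  // Speed of sound ~343 m/s -> 0.0343 cm/us; divide by 2 for the round trip.
  return duration * 0.0343f / 2.0f;
}

void loop() {
  Serial.println(readDistanceCm());
  delay(60);  // let echoes die out between pings
}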

Fig. 2. From left to right: (A) DFRobot L298 DC motor driver shield, (B) HC-06 Bluetooth module, and (C) HC-SR04 ultrasonic sensor.

Fig. 3. Screenshot of the Smartphone application used to control the tour guide robot and to speak the information stored in the QR codes.

B. Software Design and Implementation

As shown in Fig. 3, a mobile application was developed to demonstrate a simple control scenario for the tour guide robot. The application uses libraries from the Zebra Crossing (ZXing) [17] and Open Source Computer Vision (OpenCV) [18] projects to implement QR code and circle shape recognition, respectively. The Android application was then developed to meet the requirements of the tour guide robot. Android offers a Text-To-Speech (TTS) function, which is used to speak the information on the QR codes to the user.

The first step in Fig. 3, (1), is to establish a Bluetooth connection to the tour guide robot by pressing the "Connect to device" button in the mobile application. After the connection is made, the second step, Fig. 3, (2), is to start the circle shape reader by pressing the "OpenCV" button. The third step, Fig. 3, (3), shows the circle shape reader detecting circle shape landmarks. If the reader detects more than three circle shape landmarks, the Smartphone sends a command over Bluetooth instructing the tour guide robot to reduce its wheel speed in preparation for reliable QR code recognition. The fourth step, Fig. 3, (4), shows the QR code reader detecting QR codes. After a QR code is recognized, the Smartphone sends a command instructing the tour guide robot to run in autonomous navigation mode. As a final step, if the QR code contains lab information, the text-to-speech function is performed automatically through the "Speak out" function, as shown in Fig. 3, (5), to explain the lab. After the "OpenCV" button is pressed, the mobile application executes steps (3) to (5) automatically and repeatedly.
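The circle shape reader can be sketched with OpenCV's Hough circle transform. The actual application uses OpenCV's Android (Java) bindings; the following C++ fragment is an illustrative equivalent of that detection step, and the blur and Hough parameters are assumed values, not those of the deployed app.

// Sketch of the circle-landmark detector using OpenCV's Hough transform.
// Parameter values below are assumptions chosen for illustration.
#include <opencv2/opencv.hpp>
#include <vector>

// Returns true when at least minCircles circular landmarks are visible,
// which is the app's cue to send the slow-down command over Bluetooth.
bool circleLandmarksVisible(const cv::Mat& frameBgr, int minCircles = 3) {
    cv::Mat gray;
    cv::cvtColor(frameBgr, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, gray, cv::Size(9, 9), 2);  // suppress noise

    std::vector<cv::Vec3f> circles;
    // HOUGH_GRADIENT with dp = 1, minDist = rows/8, Canny high threshold
    // 100, accumulator threshold 30, and radius limits of 5-100 pixels.
    cv::HoughCircles(gray, circles, cv::HOUGH_GRADIENT, 1,
                     gray.rows / 8.0, 100, 30, 5, 100);
    return static_cast<int>(circles.size()) >= minCircles;
}

When this function returns true for a camera frame, the app would send the slow-down command to the robot, as described above.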

C. Robot Motion

1. Wall-following and obstacle avoidance using ultrasonic range sensors

The tour guide robot applies an algorithm based on a proportional-integral-derivative (PID) controller to perform the wall-following behavior. The collected sonar range information is used as feedback for the PID algorithm, which controls the robot's wheel speeds so as to maintain the set point of 30 cm distance parallel to the wall.
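A minimal sketch of such a PID wall-following loop, in Arduino-style C++, is given below. The gains, base speed, and helper functions (readRightFrontCm, readRightRearCm, setWheelSpeeds) are our own illustrative assumptions, not the authors' tuned values or API.

// PID wall-following: keep the robot about 30 cm from the wall on its right.
// Hypothetical helpers, assumed to be implemented elsewhere:
float readRightFrontCm();                   // right-front sonar reading
float readRightRearCm();                    // right-rear sonar reading
void  setWheelSpeeds(int left, int right);  // motor-shield PWM, 0-255

const float SETPOINT_CM = 30.0f;            // desired wall distance
const float KP = 4.0f, KI = 0.05f, KD = 1.5f;  // illustrative gains
const int   BASE_SPEED  = 150;              // base PWM duty

float integral = 0.0f, prevError = 0.0f;

void wallFollowStep(float dtSeconds) {
  // Average the two right-side sensors for a smoother distance estimate.
  float dist = 0.5f * (readRightFrontCm() + readRightRearCm());

  float error = SETPOINT_CM - dist;         // positive: too close to the wall
  integral += error * dtSeconds;
  float derivative = (error - prevError) / dtSeconds;
  prevError = error;

  float correction = KP * error + KI * integral + KD * derivative;

  // A positive correction slows the left wheel and speeds up the right
  // wheel, steering the robot left, away from the wall on its right.
  int left  = constrain((int)(BASE_SPEED - correction), 0, 255);
  int right = constrain((int)(BASE_SPEED + correction), 0, 255);
  setWheelSpeeds(left, right);
}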

The tour guide robot also incorporates an obstacle-avoidance behavior using the two front ultrasonic range sensors. We assume that the only obstacles in the experimental environment are open doors. In Fig. 4, Dist_sensor denotes the horizontal distance between the two ultrasonic sensors, and DL and DR are the distances from the left sensor and the right sensor to the wall, respectively. The distance D is the length of the gap between DL and DR. The angle θ1 can then be obtained by

θ1 = arctan(D / Dist_sensor).    (1)

In order to perform the obstacle-avoidance behavior, the tour guide robot requires three rotation angles, calculated using the following equations:

θ2 = 90° − θ1    (2)

θ3 = 180° − θ5    (3)

θ4 = θ5 = (180° − θ2) / 2    (4)
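As a hypothetical numerical check (values assumed purely for illustration): with D = 10 cm and Dist_sensor = 14 cm, Eq. (1) gives θ1 = arctan(10/14) ≈ 35.5°; Eq. (2) then gives θ2 = 54.5°, Eq. (4) gives θ4 = θ5 ≈ 62.8°, and Eq. (3) gives θ3 ≈ 117.2°.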

In Fig. 5, the door forms the shape of an isosceles triangle with the wall, because the lengths of its two sides are the same; therefore, the angle θ4 is equal to the angle θ5. The first rotation angle is computed by (2): the tour guide robot rotates by θ2 to align itself parallel to the door, moves straight ahead until the right-side sensor no longer detects the door, and then continues straight ahead by the length of its own body to avoid colliding with the door. The second rotation angle is given by (3): the robot rotates by θ3 and approaches the wall until the front-left sensor measures a distance of 30 cm. The robot then rotates by the third rotation angle, θ4, calculated by (4). After avoiding the door, the robot resumes its wall-following behavior.
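The maneuver can be summarized in an Arduino-style sketch. All helper functions and constants below are our own assumptions for illustration (the rotation signs in particular depend on the door's orientation); this is not the authors' implementation.

// Sketch of the door-avoidance maneuver, following Eqs. (1)-(4).
// Hypothetical helpers, assumed to be implemented elsewhere:
void  rotateDegrees(float deg);    // in-place rotation by deg degrees
void  driveForward();              // small forward step
void  driveDistanceCm(float cm);   // drive straight a given distance
float readRightCm();               // right-side sonar distance
float readFrontLeftCm();           // front-left sonar distance

const float ROBOT_LENGTH_CM = 25.0f;  // assumed body length
const float DOOR_CLEAR_CM   = 60.0f;  // assumed "door no longer seen" threshold

void avoidDoor(float theta1Deg) {
  float theta2 = 90.0f - theta1Deg;         // Eq. (2)
  float theta5 = (180.0f - theta2) / 2.0f;  // Eq. (4); theta4 == theta5
  float theta3 = 180.0f - theta5;           // Eq. (3)

  rotateDegrees(theta2);                    // align parallel to the door
  while (readRightCm() < DOOR_CLEAR_CM) {
    driveForward();                         // pass along the door
  }
  driveDistanceCm(ROBOT_LENGTH_CM);         // clear the door edge

  rotateDegrees(theta3);                    // turn back toward the wall
  while (readFrontLeftCm() > 30.0f) {
    driveForward();                         // approach until 30 cm
  }
  rotateDegrees(theta5);                    // third rotation: resume wall-following
}

2. Autonomous Robot Navigation using QR code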

Fig. 4. Arrangement of the two ultrasonic range sensors. Dist_sensor and the distance D form the base and the height of a right-angled triangle, respectively.

Fig. 6. Information obtained from QR code recognition aids autonomous robot motion. The information stored in the QR codes is as follows: (A) Turn Right, (B) Go Straight, (C) Turn Left, (D) Lab Information, and (E) Avoid Obstacle.

Fig. 5. Obstacle-avoidance behavior using two ultrasonic range sensors. The robot avoids a door as an obstacle in the experimental environment.

Fig. 6 shows the autonomous navigation method for the tour guide robot using real-time QR code recognition. Each QR code stores information to aid the robot navigation, as shown in Fig. 6. The actual QR codes measure 3 cm x 3 cm in the experiment, but the size could be adjusted based on the desired recognition distance. The QR codes and circle shape landmarks are printed on regular white sheets of paper. When the circle shape reader detects more than three circles, the Smartphone recognizes that the tour guide robot is approaching a QR code, so it sends a command to the tour guide robot to reduce its wheel speed and allow reliable operation of the QR code reader.
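On the robot side, the microcontroller must act on the commands relayed by the Smartphone. The paper does not specify the Bluetooth wire format, so the single-character command codes and handler functions in the following Arduino-style sketch are purely illustrative assumptions.

#include <SoftwareSerial.h>

// Hypothetical handlers for the behaviors listed in Fig. 6:
void  turnRight();
void  goStraight();
void  turnLeft();
void  reduceSpeed();
void  avoidDoor(float theta1Deg);
float measureDoorAngle();          // estimate theta1 from the front sensors

SoftwareSerial bt(10, 11);         // RX, TX pins for the HC-06 (assumed wiring)

void setup() {
  bt.begin(9600);                  // HC-06 factory-default baud rate
}

void loop() {
  if (bt.available()) {
    char cmd = bt.read();          // one assumed command byte per event
    switch (cmd) {
      case 'R': turnRight();   break;  // QR code (A): Turn Right
      case 'S': goStraight();  break;  // QR code (B): Go Straight
      case 'L': turnLeft();    break;  // QR code (C): Turn Left
      case 'D': reduceSpeed(); break;  // circle landmarks seen: slow down
      case 'A': avoidDoor(measureDoorAngle()); break;  // QR code (E)
      // QR code (D), Lab Information, is spoken on the phone via TTS,
      // so no robot-side action is needed here.
    }
  }
}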


IV. RESULTS

A. Map of Working Area in Indoor Environment

The testing environment for the tour guide robot is shown in Fig. 7. It consists of the main hallway in the Electrical and Computer Engineering lab area at Kettering University. Eight landmarks were placed next to the doors of the laboratory rooms along a given route. To demonstrate the robot's obstacle-avoidance capabilities, two of the lab room doors were left open during testing. In the experiment conducted, the tour guide robot traveled from the start point to the destination point following the guidance received from the QR code landmarks. Fig. 8 shows a photograph of the experimental environment.

B. Uncertainties of Ultrasonic Range Sensors

There are uncertainties when using ultrasonic range sensors to recognize objects near the tour guide robot. If the angle θ1 of the door is large, accurate distance information cannot be obtained, as shown in Fig. 9. In our experiments, the accuracy of the distance information decreased when the angle θ1 was larger than 35°. For more robust obstacle detection and avoidance, we plan to add more ultrasonic range sensors at the front in future work.

Fig. 7. Map of the experimental environment. The arrow indicates a door acting as an obstacle, the green line indicates the ideal route from the start point to the destination, and the black dots show the locations of the landmarks.

Fig. 9. Uncertainty of the ultrasonic range sensor. The uncertainty is mostly affected by the angle at which the obstacle is tilted.

Fig. 8. Photographic image of the main Electrical and Computer Engineering laboratory corridor.

C. Demonstration of the Navigation Method Using Landmarks

Preliminary experiments with the tour guide robot demonstrated promising results. The robot successfully recognized the QR codes, which provided directional guidance for navigation as well as information about the labs encountered. In addition to the QR codes, the circle shapes were used successfully to provide an additional layer of control that allowed the robot to slow down as it approached the QR codes. The robot's wheel speed is limited by the sensing ability of the circle shape recognition program: if the robot moves too quickly, some circle shapes fail to be read.

V. CONCLUSION

We have developed and tested an autonomous tour guide robot that uses QR code recognition as a navigational aid. This paper has described the design and implementation of the robot, which is equipped with four ultrasonic range sensors and a Smartphone programmed to detect QR codes and circle shape landmarks. The experimental environment is made up of several QR codes and circle shape landmarks. The sonar range information collected by the ultrasonic range sensors controls the movement of the robot, and Bluetooth is used to deliver the information read from the QR codes to the robot. The QR code reader had difficulty recognizing QR codes while the robot was moving fast; to address this challenge, we devised a mechanism that uses a circle-detection program to tell the robot to slow down as it approaches a QR code. The proposed tour guide robot experimentally demonstrated effective performance in guiding visitors to destination places. Furthermore, our implementation is low-cost, as it is based on easily accessible off-the-shelf electronic modules.

In future work, we would like to study ways to achieve more robust turning control of the robot; the rotation sensor in the Smartphone is a promising source of feedback for this purpose. Another aspect of the tour guide robot we would like to continue working on is the user interface, including the development of multimedia content that would allow the robot to play videos introducing each lab on the Smartphone.

REFERENCES

[1] G. Kim, W. Chung, K.-R. Kim, and M. Kim, "The autonomous tour-guide robot Jinny," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), vol. 4, pp. 3450-3455, Oct. 2004.
[2] D. Rodríguez-Losada, F. Matia, R. Galán, M. Hernando, J. M. Montero, and J. M. Lucas, "Urbano, an interactive mobile tour-guide robot," in Advances in Service Robotics, H. Seok, Ed. In-Teh, 2008.
[3] J. Beckwith, R. Lefief, S. Sherbrook, M. Williams, and K. Yelamarthi, "CATE: Central Automated Tour Experience," in Proc. 2012 ASEE North-Central Section Conference.
[4] K. Yelamarthi, S. Sherbrook, J. Beckwith, M. Williams, and R. Lefief, "An RFID based autonomous indoor tour guide robot," in Proc. IEEE 55th Int. Midwest Symp. on Circuits and Systems (MWSCAS), 2012.
[5] V. Kulyukin, C. Gharpure, J. Nicholson, and S. Pavithran, "RFID in robot-assisted indoor navigation for the visually impaired," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2004, pp. 1979-1984.
[6] B.-S. Choi, J. M. Yun, and J. M. Lee, "An efficient localization scheme for an indoor mobile robot," in Proc. SICE Annual Conference, Okayama University, Japan, Aug. 8-10, 2005.
[7] B.-S. Choi and J.-W. Lee, "A hierarchical algorithm for indoor mobile robot localization using RFID sensor fusion," IEEE Trans. Industrial Electronics, vol. 58, no. 6, June 2011.
[8] S. Park and S. Hashimoto, "Autonomous mobile robot navigation using RFID in indoor environment," IEEE Trans. Industrial Electronics, vol. 56, no. 7, June 2009.
[9] T. Tsukiyama, "RFID based navigation system for indoor mobile robots," in Preprints of the 18th IFAC World Congress, Milan, Italy, Aug. 28-Sept. 2, 2011.
[10] D. López-de-Ipiña, T. Lorido, and U. López, "Indoor navigation and product recognition for blind people assisted shopping," in J. Bravo, R. Hervás, and V. Villarreal (Eds.): IWAAL 2011, LNCS 6693, pp. 33-40, 2011.
[11] R. V. Aroca, A. P. B. S. de Oliveira, and L. M. G. Gonçalves, "Towards smarter robots with smartphones," in Proc. 5th Workshop in Applied Robotics and Automation (Robocontrol), 2012.
[12] T. Suriyon, H. Keisuke, and B. Choompol, "Development of guide robot by using QR code recognition," in Proc. 2nd TSME Int. Conf. on Mechanical Engineering, Krabi, Thailand, Oct. 19-21, 2011.
[13] D. A. Yudin, G. G. Postolsky, A. S. Kizhuk, and V. Z. Magergut, "Mobile robot navigation based on artificial landmarks with machine vision system," World Applied Sciences Journal, vol. 24, no. 11, pp. 1467-1472, 2013.
[14] "DFRobot L298 DC motor driver shield," http://elesson.tc.edu.tw/md221/pluginfile.php/4241/mod_page/content/10/L298_Motor_Shield_Manual.pdf
[15] "HC-06 manual," http://www.exp-tech.de/service/datasheet/HC-SerialBluetooth-Products.pdf
[16] "HC-SR04 ultrasonic sensor," http://letsmakerobots.com/node/30209
[17] ZXing, https://code.google.com/p/zxing/
[18] OpenCV Android SDK, http://docs.opencv.org/index.html