2D obstacle avoidance algorithm for mobile robots

Tharindu Weerakoon, Kyushu Institute of Technology, [email protected]
Kazuo Ishii, Kyushu Institute of Technology

This paper presents a 2-dimensional obstacle avoidance algorithm for mobile robots that sense obstacles in the environment with a laser range finder (LRF). The algorithm finds the most obstacle-free path, i.e., the direction in which the robot can travel the furthest distance, and the width of the robot is taken into account when determining that path. A 2D laser range finder scans the horizontal plane of the environment, and the algorithm finds the maximum travelable direction within a span of 180° in front of the robot. The effectiveness of the algorithm is validated in a MATLAB simulation framework using real environmental data captured by the laser range finder.

Key Words: LRF, obstacle avoidance, beam index, path area, potential direction.

1. Introduction

In the field of mobile robots, the development of fast and efficient motion planning algorithms plays a major role. Some of these algorithms are developed with complete knowledge of an environment containing stationary obstacles, while other motion planning algorithms deal with the navigation of a robot in a completely unknown environment [1]. To address the robot's lack of knowledge about the environment, different kinds of sensors, such as ultrasonic sensors, odometers, vision cameras, and laser range finders, have been employed to extract information about the environment. This information is used in motion planning algorithms that can then be applied in different application areas.

A widely used technique in the motion planning of mobile robots is the potential field method. Its basic concept is to define the working space of the robot with an artificial potential field that attracts the robot towards its goal position while keeping it away from obstacles [1]. The potential field method is particularly popular because of its mathematical formulation, and it has been studied extensively in the past decades [1], [2], [3], [4], [5], [6]. Even though these methods have their own merits, the common problem is that most of them are suitable only for motion planning in a completely known environment; developing a motion planning algorithm that uses sensor information in an unknown environment is therefore one of the requirements.

For simplicity, the robot is treated as a point in most of the research on potential field methods. In reality, however, the robot cannot be treated as a point, so this is not a precise assumption when developing an efficient motion planning algorithm. In particular, when determining an obstacle-free path for the robot, it is essential to consider the robot's width. With these needs in mind, this paper introduces an obstacle avoidance algorithm that explicitly considers the width of the robot. A laser range finder is employed to extract knowledge about the environment.


2. Extracting knowledge of the environment

The Hokuyo URG-04LX 2D laser range finder (LRF) provides 2D information in a horizontal plane ahead of the scanner and can detect objects within its field of view. This environmental information is very useful for the robot to avoid obstacles along its motion path. The LRF is used to capture information about the objects around the robot: it scans over 240° in steps of 0.35°. The scanned data are used, together with the 600 mm width of the robot, to find the direction with the maximum obstacle-free distance. The algorithm considers travelable distances up to 4000 mm, and both the LRF data and the path area of the robot are converted into a polar representation for use in the algorithm.
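As a rough illustration of the quantities involved, the following MATLAB sketch sets up the scan and robot parameters described above. The variable names and the driver call getLrfScan() are illustrative assumptions, not part of the original implementation.

% Scan and robot parameters used throughout this paper (names are illustrative).
numBeams = 681;      % data points per 240-degree scan of the URG-04LX
stepDeg  = 0.35;     % angular resolution (deg)
d        = 600;      % robot width (mm)
L        = 4000;     % maximum travelable distance considered (mm)

ranges = getLrfScan();                          % hypothetical driver call, 1 x 681 range vector (mm)
beamAngleDeg = ((1:numBeams) - 85) * stepDeg;   % beam angle measured from the 0-degree (right) side,
                                                % following the index convention of Sec. 3.1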


2.1 Knowledge of the working environment

Fig. 1 shows the map of the working space; the dimensions are given in cm. All of the buildings and house blocks are taller than the horizontal plane of the LRF. The LRF is mounted 30 cm above ground level and captures the distance to the objects around the robot's current position, measuring a distance value every 0.35°. Within its 240° scan window, the LRF captures 681 data points. The Cartesian representation of the captured data and its polar representation are shown in Fig. 2 and Fig. 3, respectively. The Cartesian map is useful for explaining where the objects are and what the path of the robot should look like, but it is essential to store this information in an array of laser beam index numbers and corresponding distances, as in Fig. 3.
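For reference, a minimal MATLAB sketch of the Cartesian view in Fig. 2 is given below; it simply maps each beam index and range value of the ranges vector from the previous sketch to an (x, y) point around the LRF. The variable names, and the assumption that beam index 85 corresponds to the 0° (right) side as in Sec. 3.1, are illustrative.

% Map each (beam index, range) pair of one scan to Cartesian coordinates (Fig. 2 style).
stepDeg  = 0.35;                          % URG-04LX angular step (deg)
i        = 1:numel(ranges);               % beam indices of one 240-degree scan
thetaDeg = (i - 85) * stepDeg;            % 0 deg = right side, 180 deg = left side (assumed convention)
x = ranges .* cosd(thetaDeg);             % mm, towards the robot's right
y = ranges .* sind(thetaDeg);             % mm, ahead of the robot

plot(x, y, '.'); axis equal;              % Cartesian map of the scanned objects
xlabel('x (mm)'); ylabel('y (mm)');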


Fig. 1 Map of the environment (B: building, H: house; dimensions in cm)

Fig. 2 Cartesian representation of LRF data

Fig. 5 Definition of the cut-off region

Since the algorithm uses the polar-coordinate information of the LRF, the path area of the robot also needs to be converted into polar form. Fig. 4(a) shows the path area of the robot, where the width of the robot (d) and the maximum path distance (L) are user-defined values; in this research d = 600 mm and L = 4000 mm. The letter l denotes the corresponding path-area distance in polar form. The total path area has to be represented as an array that is used together with the LRF data array in the algorithm, so the same sampling angle of 0.35° is used for the angle α. The path-area distance l is computed for each value of α according to Eq. (1).

Fig. 3 Polar representation of LRF data (distance from LRF in mm versus laser beam index i)

Fig. 4 (a) Path area of the robot (d: width of the robot, L: maximum distance of the path, l: path-area distance at angle α), (b) Polar representation of the path area of the robot (distance of the robot path area in mm versus angle index j)

l = (d / 2) tan(α)    (1)

After sampling, the total path area of the robot is represented by 519 data points. The path-area distance l for each angle index is shown in Fig. 4(b) as the polar representation of the robot's path. These two polar-form arrays, the LRF data and the path area of the robot, are used directly in the algorithm to check whether obstacles lie within the path area for each direction. The use of these two arrays is explained in Section 3.
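A minimal MATLAB sketch of how the path-area array of Fig. 4(b) can be built from Eq. (1) is shown below, assuming 519 samples at the same 0.35° step. The mirroring about 90° and the cap at L reflect the symmetric, bounded path area; the variable names are illustrative.

% Build the polar path-area array of Fig. 4(b) from Eq. (1).
d        = 600;                              % robot width (mm)
L        = 4000;                             % maximum path distance (mm)
stepDeg  = 0.35;                             % sampling angle (deg)
nPath    = 519;                              % samples covering roughly 0 to 180 degrees
alphaDeg = (0:nPath-1) * stepDeg;            % angle alpha measured from the 0-degree (right) axis

pathDist = min((d / 2) * abs(tand(alphaDeg)), L);   % Eq. (1), symmetric about 90 deg, never beyond L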

2.2 Defining the cut-off region

The algorithm checks for obstacles within the path area of the robot for each potential direction. Since the scan window of the HOKUYO URG-04LX LRF is 240°, there is a lack of information about objects very close to the robot. Fig. 5 shows how the cut-off region is defined for a given robot width and the scan window of the LRF. The purpose of defining a cut-off region is to avoid running the algorithm where obstacle information near the robot is missing; as a further advantage, it reduces the run time of the algorithm. As the width of the robot increases, the radius of the cut-off region also increases. The cut-off range is defined so that the obstacle-free path direction can be found from 0° to 180°, right to left, in front of the robot. The radius r of the cut-off region can be calculated from Eq. (2):

sin(β) = d / (2r)    (2)

Here β is the angle from the 0° axis to the initial scanning position, as shown in Fig. 5. Since β = 30° and d = 600 mm, Eq. (2) gives

r = 600 / (2 sin 30°) = 600 mm,

so the radius of the cut-off region equals the width of the robot, 600 mm.

For the potential direction selected by the algorithm, there is a distance along that direction over which the algorithm does not check for obstacles. This distance x depends on the radius of the cut-off range; from this distance onwards, the algorithm checks for objects within the path-area region. It can be computed as in Eq. (3):

x = d / (2 tan β) = 600 / (2 tan 30°) = 300√3 ≈ 520 mm    (3)
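As a quick numerical check of Eqs. (2) and (3), the following MATLAB lines reproduce the values used above (β = 30°, d = 600 mm); the variable names are illustrative.

% Cut-off radius and unchecked distance for the values used in this paper.
d    = 600;                    % robot width (mm)
beta = 30;                     % angle from the 0-degree axis to the first scanned beam (deg)

r = d / (2 * sind(beta));      % Eq. (2): cut-off radius, 600 mm here
x = d / (2 * tand(beta));      % Eq. (3): unchecked distance along a potential direction, about 520 mm
fprintf('r = %.0f mm, x = %.0f mm\n', r, x);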

3. Obstacle-free path algorithm

The obstacle-free path algorithm treats each beam index as a potential direction and finds the distance from the LRF at which the path in that direction is obstructed by obstacles. The distance obtained for each beam index is the maximum distance that the robot can travel in that potential direction. Fig. 6 shows the cut-off region on the polar representation of the path area of the robot; the centre of the path area corresponds to angle index 258. The algorithm checks for obstacle information above the cut-off line and within the path area. To do this, the defined path area is directed towards each potential direction in turn; if there is any object information within the path area, the corresponding distance is the maximum travelable distance for that potential direction. Fig. 7 illustrates how the algorithm determines the maximum obstacle-free distance and its direction using three example potential paths.

Fig. 6 Cut-off region and relevant angle indices (distance of the robot's path area in mm versus angle index j)


Fig. 7 Illustration of unobstructed-distance examples for the three potential directions Direction-1, Direction-2, and Direction-3 (distance from LRF in mm versus laser beam index i)

3.1 Working process of the algorithm

This section explains in detail how the algorithm operates for the sample directions shown in Fig. 7. LRF beam index 85 corresponds to the far right side (0°) of the robot, while beam index 598 corresponds to the far left side (180°). The algorithm runs from beam index 85 to beam index 598 and finds the maximum travelable distance for each beam index. It starts from the initial LRF beam index of 85, and the range of beam indices considered for each direction is based on the cut-off-region angle indices. The range of scanned obstacle data that is considered can be written as

{ [beam index (i) - (86 + step)] , [beam index (i) + (86 + step)] }

Beam index i corresponds to the selected potential direction, and step represents the increment of the travelable distance. The number 86 is half of the path-area width of the robot above the cut-off region, expressed in angle indices. The algorithm first selects a direction as beam index i and starts with step set to 0. When step is 0, the corresponding angle index j is (172 + step) and its path-area distance is 520 mm. From these values of i and step, the LRF data range above is computed, and the algorithm checks whether any LRF value in that range is smaller than the path-area distance corresponding to j. If no LRF value is smaller, step is increased by 1, a new LRF data range is computed, the path-area distance for the new j is read, and the check is repeated. If any LRF value satisfies the checking condition, the corresponding distance is the maximum travelable distance for the direction denoted by beam index i. Otherwise, the checking process repeats until the angle index reaches 244, which represents the maximum considered distance of the path area. After the selected potential direction i has been checked, the potential direction is incremented by one beam index to (i + 1), and the process is repeated until the beam index reaches 598.

For example, for Direction-1 in Fig. 7, the algorithm searches for obstacle information within the path area of the robot and finds an obstacle at a distance of 800 mm; this is therefore the maximum travelable distance for Direction-1, and if the robot moved beyond this distance in this direction it would hit the obstacle. For Direction-2, no obstacle is found within the robot's path area until the path-area distance reaches about 2550 mm, at which point an obstacle is identified; this is the maximum travelable distance for Direction-2. Similarly, for Direction-3, the robot can travel up to about 2000 mm and cannot travel further in that direction.
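The checking loop described above can be sketched in MATLAB as follows. This is only one possible reading of the procedure: ranges is the 681-element LRF scan and pathDist the 519-element path-area array from the earlier sketches, the index values 85, 598, 172 and 244 follow the paper, and the choice of what to store when a direction is found clear or obstructed is an assumption.

% Sketch of the per-direction obstacle check of Sec. 3.1 (variable names are illustrative).
nBeams   = numel(ranges);
freeDist = zeros(1, nBeams);                 % obstacle-free distance per potential direction (mm)

for i = 85:598                               % potential directions across the front 180 degrees
    freeDist(i) = pathDist(244);             % assume clear up to the maximum considered distance
    for step = 0:(244 - 172)
        j  = 172 + step;                     % angle index of the current path-area distance
        lo = max(i - (86 + step), 1);        % LRF window for this step, clamped to the scan
        hi = min(i + (86 + step), nBeams);
        if any(ranges(lo:hi) < pathDist(j))  % an obstacle lies inside the path area
            freeDist(i) = pathDist(j);       % treat this as the maximum travelable distance
            break;
        end
    end
end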

Fig. 8 Operational flowchart of the algorithm (START → define the path area → LRF scanning → find the obstacle-free distances → find the maximum obstacle-free direction → send output to the motion control algorithm → STOP)

Finally, the travelable distances for every direction from 0° to 180° are stored in an array, and the algorithm finds the maximum value among them together with the corresponding beam index i. This beam index can then be converted into an angle measured from the front of the robot, which the motion algorithm uses to turn the robot.

Fig. 8 shows the complete flowchart of the algorithm. First, the path width of the robot and the maximum path-area distance to be considered are specified, and the algorithm defines the path area accordingly; it also defines the cut-off region based on the scan window of the LRF. The LRF then scans the object information in the environment within its 240° span at the present position of the robot. The scanned data, together with the path-area information of the robot, are processed to find the maximum travelable distance for each direction over the 180° span. Finally, the algorithm finds the maximum travelable direction for the robot and its turning angle from the front of the robot. This angle is sent to the motion control algorithm, and the LRF captures a new set of data to find a new direction for the current position.
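A short MATLAB sketch of this final selection step is given below; it picks the direction with the largest stored obstacle-free distance and converts its beam index to a turning angle measured from the front of the robot. The freeDist array is taken from the earlier sketch, and the sign convention of the angle conversion is only an assumption.

% Select the best direction and convert its beam index to a turning angle (illustrative).
stepDeg = 0.35;
[maxDist, iRel] = max(freeDist(85:598));     % search only the front 180-degree span
iBest = iRel + 84;                           % back to the absolute beam index

turnDeg = (iBest - 85) * stepDeg - 90;       % 0 deg = straight ahead; positive = counter-clockwise (assumed)
fprintf('Best beam index %d: %.0f mm free, turn %.1f deg\n', iBest, maxDist, turnDeg);
% turnDeg would then be handed to the motion control algorithm.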

4. Simulation results of the algorithm

The algorithm finds the obstacle-free distance, the "obstacle margin", for each potential direction from 0° to 180°, i.e., from the far right to the far left of the robot. The beam index corresponding to 0° (the right side of the robot) is 598, and the beam index for 180° (the left side) is 85. The algorithm determines the obstacle-free distance for each beam index from index 85 to index 598 and stores it for each potential direction. Fig. 9 shows the travelable distance for each direction. Finally, the maximum obstacle-free distance is determined and the corresponding direction is calculated; this is used to control the motion of the robot.

Fig. 9 Simulation result of the algorithm in MATLAB (obstacle margin and obstacle distance information, in mm from the LRF, versus LRF beam index; the maximum obstacle-free direction is marked)

Fig. 9 shows the obstacle-free distance, the "obstacle margin", encountered for each beam index. The beam index corresponding to the maximum obstacle-free distance determines the direction towards which the robot should turn. In this simulation example, the maximum obstacle-free direction corresponds to beam index 169 and the maximum travelable distance is 4289 mm. The robot therefore has to turn 75.6° counter-clockwise from its front to face the maximum travelable direction.

5. Conclusion

In this research, an efficient 2D obstacle avoidance algorithm has been introduced. A detailed explanation of the algorithm has been given, and the MATLAB simulation has shown the expected results. The algorithm is applicable to robots of any size because the width of the robot is an explicit input, which is an advantage over algorithms that treat the robot as a point; the risk of the robot colliding with obstacles in the environment is thereby reduced. Furthermore, a safety margin can be added to the width of the path area by applying a safety factor to the width of the robot.

References

[1] Koren, Y., and Borenstein, J., 1991, "Potential Field Methods and Their Inherent Limitations for Mobile Robot Navigation," Proceedings of the IEEE International Conference on Robotics and Automation, Vol. 2, pp. 1398-1404.
[2] Hwang, Y. K., and Ahuja, N., 1988, "Path Planning Using a Potential Field Representation," Proceedings of the IEEE International Conference on Robotics and Automation, Vol. 1, pp. 648-649.
[3] Borenstein, J., and Koren, Y., 1989, "Real-Time Obstacle Avoidance for Fast Mobile Robots," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 19, No. 5, pp. 1179-1187.
[4] Rimon, E., and Koditschek, D. E., 1992, "Exact Robot Navigation Using Artificial Potential Functions," IEEE Transactions on Robotics and Automation, Vol. 8, No. 5, pp. 501-518.
[5] Chuang, J. H., and Ahuja, N., 1998, "An Analytically Tractable Potential Field Model of Free Space and Its Application in Obstacle Avoidance," IEEE Transactions on Systems, Man, and Cybernetics-Part B: Cybernetics, Vol. 28, No. 5, pp. 729-736.
[6] Ge, S. S., and Cui, Y. J., 2000, "New Potential Functions for Mobile Robot Path Planning," IEEE Transactions on Robotics and Automation, Vol. 16, No. 5, pp. 615-620.