Mobile Robot Navigation by Wall Following Using Polar Coordinate Image from Omnidirectional Image Sensor

Tanai JOOCHIM, Kosin CHAMNONGTHAI

Publication
IEICE TRANSACTIONS on Information and Systems   Vol.E85-D   No.1   pp.264-274
Publication Date: 2002/01/01
Online ISSN: 
DOI: 
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Image Processing, Image Pattern Recognition
Keyword: navigation, mobile robot, wall following, omnidirectional image sensor





Summary: 
In order to navigate a mobile robot or an autonomous vehicle in an indoor environment containing several kinds of obstacles, such as walls, furniture, and humans, the distance between the mobile robot and the obstacles has to be determined. These obstacles can be treated as walls with complicated edges. This paper proposes a mobile-robot navigation method based on a polar coordinate transformation of an omnidirectional image. The omnidirectional image is obtained from a hyperboloidal mirror, whose prominent feature is that it senses the entire surroundings at the same time. When the camera image of a wall is transformed into polar coordinates, the straight boundary lines between the wall and the floor appear as curves, and the peak point of each curve represents the distance and direction between the robot and the wall. In addition, the wall type can be classified by the pattern and number of peak points into one-side wall, corridor, and corner. To navigate the mobile robot, the system compares the peak point obtained from the real image with a reference point determined by the desired distance and direction. If the two points differ, the system computes an appropriate wheel angle that adjusts the robot's distance and direction with respect to the wall, keeping the peak point at the same position as the reference point. Experiments were performed on a prototype mobile robot. The results show that, for distances from the robot to the wall between 70 and 290 cm, the average distance-estimation error is 6.23 percent. For the three wall types, the method correctly classifies 86.67 percent of 15 image samples. During movement alongside the wall at a motion speed of 10 cm/s, the system processes approximately 3 frames per second, and the mobile robot maintains its motion alongside the wall with an average error of 12 cm from the reference distance.
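The central image-processing step summarized above, resampling the omnidirectional camera image into polar coordinates around the mirror's projection centre, can be sketched as follows. This is a minimal illustration under assumptions of our own, not the paper's implementation: the image is taken as a 2D list of pixel values, the angular and radial resolutions (`n_theta`, `n_r`) are arbitrary, and nearest-neighbour sampling is used in place of any interpolation the authors may have applied.

```python
import math

def to_polar(img, center, n_theta=360, n_r=None):
    """Resample an omnidirectional image into polar coordinates.

    img    : 2D list of pixel values (rows of equal length)
    center : (cx, cy), assumed projection centre of the hyperboloidal mirror
    Returns a 2D list indexed [theta][r]; straight wall-floor boundaries in
    the source image show up as curves in this representation.
    """
    h, w = len(img), len(img[0])
    cx, cy = center
    if n_r is None:
        # Largest radius that stays inside the image from the centre.
        n_r = min(cx, cy, w - 1 - cx, h - 1 - cy)
    polar = []
    for t in range(n_theta):
        theta = 2.0 * math.pi * t / n_theta
        row = []
        for r in range(n_r):
            # Nearest-neighbour sample along the ray at angle theta.
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            row.append(img[y][x] if 0 <= x < w and 0 <= y < h else 0)
        polar.append(row)
    return polar
```

In the transformed array, each column index corresponds to a direction around the robot and each row index to a radial distance, so locating the peak of the wall-floor curve directly yields the distance and direction the paper uses for navigation.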
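The wall-following control described above, comparing the detected peak point with a reference point and steering to cancel the difference, could be realized as a simple proportional correction. The sketch below is an assumption on our part: the gains `k_d` and `k_a`, the wheel-angle limit, and the linear control law are all hypothetical, since the abstract does not specify how the wheel angle is computed.

```python
def steering_angle(peak_dist, peak_dir, ref_dist, ref_dir,
                   k_d=0.5, k_a=1.0, max_angle=30.0):
    """Proportional wheel-angle correction (degrees), hypothetical gains.

    peak_dist, peak_dir : distance (cm) and direction (deg) of the peak
                          point extracted from the polar image
    ref_dist, ref_dir   : reference point for the desired wall-following
                          distance and direction
    Steers so the peak point is driven back onto the reference point.
    """
    err = k_d * (ref_dist - peak_dist) + k_a * (ref_dir - peak_dir)
    # Clamp to the mechanical wheel-angle limit.
    return max(-max_angle, min(max_angle, err))
```

When the peak and reference coincide the correction is zero and the robot holds its course; large discrepancies saturate at the wheel-angle limit rather than commanding an infeasible turn.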