
Position localization for mobile robots using a colour image of equipment at nuclear plants

Published online by Cambridge University Press:  09 March 2009

Kenichi Ebihara, Takayuki Otani and Etsuo Kume
Affiliation:
Computing and Information Systems Center, Japan Atomic Energy Research Institute (JAERI), Tokai, Naka, Ibaraki, 319-11 (Japan)
E-mail: [ebihara, otani, kume]@c3007.tokai.jaeri.go.jp

Extract

Position localization is a necessary and important task for an intelligent robot that moves and works in place of human workers. Various methods for position localization have been proposed and developed by many researchers; for example, active methods using ultrasonic waves or lasers, and passive methods using images of bar codes or marks placed in the robot's workspace. A passive method that uses images of objects already present in the workspace is particularly desirable, as it does not affect the workspace itself. Thus, no special arrangement of the workspace is required, and many robots can work simultaneously.
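The extract does not reproduce the authors' algorithm, so the following is only a minimal sketch, in Python, of the general passive idea described above: segment an object of known colour and size in a single camera image and recover its approximate range and bearing with the pinhole-camera model. The colour thresholds, camera parameters, and landmark dimensions below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' method): locate a landmark of assumed colour
# and known width in one image, then estimate its range and bearing from the
# pinhole-camera model. All numeric values are hypothetical.

import numpy as np

# Assumed camera parameters (hypothetical values).
FOCAL_LENGTH_PX = 800.0     # focal length expressed in pixels
IMAGE_CENTER_X = 320.0      # principal point x-coordinate in pixels
LANDMARK_WIDTH_M = 0.50     # known physical width of the landmark in metres


def segment_landmark(rgb: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels matching an assumed 'red equipment' colour."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # Simple illustrative thresholds; a real system would calibrate these.
    return (r > 150) & (r - g > 60) & (r - b > 60)


def estimate_range_and_bearing(mask: np.ndarray) -> tuple[float, float]:
    """Estimate distance (m) and bearing (rad) to the landmark from its image blob."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("landmark not found in image")
    width_px = xs.max() - xs.min() + 1                           # apparent width in pixels
    center_px = xs.mean()                                        # horizontal centre of the blob
    distance = FOCAL_LENGTH_PX * LANDMARK_WIDTH_M / width_px     # pinhole model
    bearing = np.arctan2(center_px - IMAGE_CENTER_X, FOCAL_LENGTH_PX)
    return distance, bearing


if __name__ == "__main__":
    # Synthetic 480x640 image with a red rectangle standing in for plant equipment.
    img = np.zeros((480, 640, 3), dtype=np.uint8)
    img[200:280, 300:380, 0] = 200    # red block, 80 pixels wide
    mask = segment_landmark(img)
    d, theta = estimate_range_and_bearing(mask)
    print(f"distance ~ {d:.2f} m, bearing ~ {np.degrees(theta):.1f} deg")
```

Because the sensing is purely passive, several robots could run the same procedure at once without interfering with one another, which is the advantage the extract highlights.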

Type
Research Article
Copyright
Copyright © Cambridge University Press 1996

