
Sinusoidal input-based visual control for nonholonomic vehicles

Published online by Cambridge University Press:  13 February 2013

M. Aranda*, G. López-Nicolás and C. Sagüés
Affiliation: Instituto de Investigación en Ingeniería de Aragón, Universidad de Zaragoza, Spain
*Corresponding author. E-mail: marandac@unizar.es

Summary

This paper proposes a new visual control approach for nonholonomic robots based on sinusoidal inputs. The method, developed for a unicycle kinematic model, employs sinusoids in such a way that the generated vehicle trajectories are feasible, smooth and versatile, improving on previous sinusoidal-based control work in terms of efficiency and flexibility. As further contributions, we derive analytical expressions for the evolution of the robot's state and propose a new state-feedback control law based on these expressions. All the information used in the control scheme is obtained from omnidirectional vision by means of the one-dimensional trifocal tensor. A stability analysis of the proposed approach is presented, and its performance is illustrated through experiments.
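The sinusoidal-input idea summarized above can be illustrated with a minimal simulation of the unicycle kinematic model, where the linear and angular velocities are sinusoids of a common period. This is only a generic sketch of sinusoid-based steering, not the paper's controller; the amplitudes and period below are arbitrary assumptions.

```python
import math

def simulate_unicycle(v_fn, w_fn, duration, dt=1e-3, state=(0.0, 0.0, 0.0)):
    """Euler-integrate the unicycle model:
    x' = v cos(theta), y' = v sin(theta), theta' = w."""
    x, y, theta = state
    t = 0.0
    while t < duration:
        v, w = v_fn(t), w_fn(t)
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += w * dt
        t += dt
    return x, y, theta

# Sinusoidal inputs sharing one period (amplitudes a, b chosen arbitrarily).
a, b, period = 0.5, 1.0, 2.0
v = lambda t: a * math.sin(2.0 * math.pi * t / period)
w = lambda t: b * math.cos(2.0 * math.pi * t / period)

xf, yf, thf = simulate_unicycle(v, w, duration=period)
```

Because the angular-velocity sinusoid integrates to zero over a full period, the heading returns (up to integration error) to its initial value while the position undergoes a net displacement, which is the basic mechanism that makes sinusoidal inputs useful for steering nonholonomic systems.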

Type: Articles
Copyright: © Cambridge University Press 2013

