
Robot Visual Control

  • Living reference work entry
  • First Online: 04 December 2019
Encyclopedia of Systems and Control

Abstract

This entry presents the basic concepts of vision-based control, that is, the use of visual data to control the motion of a robotic system. It details the modeling steps required to achieve a visual task, the possible visual features that can be used as inputs to the control scheme, the design of basic kinematic control schemes, and hints toward more advanced approaches. Applications are also described.
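
As a concrete illustration of the basic kinematic control schemes mentioned above, the sketch below implements the classical image-based visual servoing law v = -λ L⁺ (s − s*) for point features. It is a minimal, hypothetical example rather than code from the entry: it assumes normalized image-plane coordinates, known point depths Z, and an eye-in-hand camera whose 6-DOF velocity twist can be commanded directly; NumPy is used only for the linear algebra.

    # Minimal IBVS sketch (illustrative assumptions, not the entry's own implementation):
    # camera velocity v = -lambda * pinv(L) @ (s - s*), point features with known depths.
    import numpy as np

    def interaction_matrix(x, y, Z):
        """Interaction (image Jacobian) matrix of one normalized image point (x, y) at depth Z."""
        return np.array([
            [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
            [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
        ])

    def ibvs_velocity(points, desired_points, depths, gain=0.5):
        """Camera velocity twist (vx, vy, vz, wx, wy, wz) driving s toward s*."""
        L = np.vstack([interaction_matrix(x, y, Z)
                       for (x, y), Z in zip(points, depths)])
        error = (np.asarray(points) - np.asarray(desired_points)).ravel()
        return -gain * np.linalg.pinv(L) @ error

    # Example: four points observed slightly away from their desired positions.
    current = [(0.12, 0.11), (-0.10, 0.12), (-0.11, -0.09), (0.10, -0.10)]
    desired = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
    v = ibvs_velocity(current, desired, depths=[1.0] * 4)  # sent to the robot's velocity controller

In practice the depths Z are usually only approximated (e.g., taken at the desired pose), which such schemes tolerate as long as the estimated interaction matrix keeps the closed loop stable.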



Author information


Corresponding author

Correspondence to François Chaumette.


Copyright information

© 2020 Springer-Verlag London Ltd., part of Springer Nature

About this entry


Cite this entry

Chaumette, F. (2020). Robot Visual Control. In: Baillieul, J., Samad, T. (eds) Encyclopedia of Systems and Control. Springer, London. https://doi.org/10.1007/978-1-4471-5102-9_170-2

  • DOI: https://doi.org/10.1007/978-1-4471-5102-9_170-2

  • Published: 04 December 2019

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-5102-9

  • Online ISBN: 978-1-4471-5102-9

  • eBook Packages: Springer Reference Engineering, Reference Module Computer Science and Engineering


Chapter history

  1. Latest

    Robot Visual Control
    Published:
    04 December 2019

    DOI: https://doi.org/10.1007/978-1-4471-5102-9_170-2

  2. Original

    Robot Visual Control
    Published:
    03 April 2014

    DOI: https://doi.org/10.1007/978-1-4471-5102-9_170-1