
Improving RGB-D Scene Reconstruction Using Rolling Shutter Rectification

Chapter in: New Development in Robot Vision

Part of the book series: Cognitive Systems Monographs (COSMOS, volume 23)

Abstract

Scene reconstruction, i.e. the process of creating a 3D representation (mesh) of some real world scene, has recently become easier with the advent of cheap RGB-D sensors (e.g. the Microsoft Kinect).

Many such sensors use rolling shutter cameras, which produce geometrically distorted images when the sensor is moving. To mitigate these rolling shutter distortions we propose a method that uses an attached gyroscope to rectify the depth scans. We also present a simple scheme to calibrate the relative pose and time synchronization between the gyroscope and a rolling shutter RGB-D sensor.
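The kind of correction involved can be sketched as follows. This is a hedged illustration, not the authors' implementation: the helper names, the per-row timing model, and the choice of reference row are assumptions. Each image row of a rolling shutter sensor has its own capture time, so a gyro-derived rotation can move the back-projected 3D points of each row into a common reference frame:

```python
import numpy as np

def so3_exp(w, dt):
    """Rodrigues formula: rotation from angular velocity w (rad/s) over dt seconds."""
    theta = np.linalg.norm(w) * dt
    if theta < 1e-12:
        return np.eye(3)
    k = w / np.linalg.norm(w)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def rectify_row(points, R_row, R_ref):
    """Rotate 3D points captured at one row's time into the reference row's frame.

    points: (N, 3) back-projected depth points; R_row, R_ref: camera orientations
    at the row's capture time and at the chosen reference time (e.g. mid-frame).
    """
    return points @ (R_ref @ R_row.T).T
```

In an actual pipeline, `R_row` would come from integrating gyroscope samples up to each row's timestamp, with inter-sample orientations interpolated (e.g. by SLERP [17]).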

For scene reconstruction we use the Kinect Fusion algorithm to produce meshes. We create meshes from both raw and rectified depth scans, and compare these to a ground truth mesh. The types of motion we investigate are pan, tilt, and wobble (shaking).
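A comparison against a ground truth mesh can be sketched as a point-set error measure. This is a minimal brute-force illustration under the assumption that the two vertex sets have already been aligned (e.g. with ICP [2]); for real mesh sizes a KD-tree would replace the pairwise distance matrix:

```python
import numpy as np

def mean_nn_distance(recon, truth):
    """Mean distance from each reconstructed vertex to its nearest
    ground-truth vertex; both arrays are (N, 3) and assumed pre-aligned."""
    d = np.linalg.norm(recon[:, None, :] - truth[None, :, :], axis=2)
    return float(d.min(axis=1).mean())
```

A lower mean nearest-neighbor distance for the mesh built from rectified scans than for the raw-scan mesh would indicate that the rectification improved the reconstruction.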

As our method relies on gyroscope readings, the amount of computation required is negligible compared to the cost of running Kinect Fusion.

This chapter is an extension of a paper presented at the IEEE Workshop on Robot Vision [10]. Compared to that paper, we have improved the rectification to also correct for lens distortion, and we use a coarse-to-fine search to find the time shift more quickly. We have extended our experiments to also investigate the effects of lens distortion, and to use more accurate ground truth. The experiments demonstrate that correction of rolling shutter effects yields a larger improvement of the 3D model than correction for lens distortion.
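The coarse-to-fine time-shift search can be sketched as a repeatedly refined 1D grid search. This is a generic illustration; the real cost function, which would compare gyro-predicted and image-measured motion, is replaced here by a hypothetical `cost` callable:

```python
import numpy as np

def coarse_to_fine(cost, lo, hi, levels=4, samples=11):
    """Minimize a 1D cost over [lo, hi] by evaluating a coarse grid and
    repeatedly zooming in around the best sample."""
    best = lo
    for _ in range(levels):
        ts = np.linspace(lo, hi, samples)
        best = ts[int(np.argmin([cost(t) for t in ts]))]
        step = ts[1] - ts[0]          # zoom in to one grid cell on each side
        lo, hi = best - step, best + step
    return float(best)
```

Compared with a single fine grid over the whole interval, this needs far fewer cost evaluations (here 4 × 11 = 44), which is what makes the time-shift calibration fast.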


References

1. Baker, S., Bennett, E., Kang, S.B., Szeliski, R.: Removing rolling shutter wobble. In: IEEE Conference on Computer Vision and Pattern Recognition. IEEE Computer Society, San Francisco (2010)

2. Besl, P., McKay, H.: A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence 14(2), 239–256 (1992)

3. Geyer, C., Meingast, M., Sastry, S.: Geometric models of rolling-shutter cameras. In: 6th OmniVis WS (2005)

4. Golub, G.H., van Loan, C.F.: Matrix Computations. Johns Hopkins University Press, Baltimore (1983)

5. Hanning, G., Forslöw, N., Forssén, P.E., Ringaby, E., Törnqvist, D., Callmer, J.: Stabilizing cell phone video using inertial measurement sensors. In: The Second IEEE International Workshop on Mobile Vision. IEEE, Barcelona (2011)

6. Hartley, R.I., Zisserman, A.: Multiple View Geometry in Computer Vision. Cambridge University Press (2004)

7. Hol, J.D., Schön, T.B., Gustafsson, F.: Modeling and calibration of inertial and vision sensors. International Journal of Robotics Research 29(2), 231–244 (2010)

8. Karpenko, A., Jacobs, D., Baek, J., Levoy, M.: Digital video stabilization and rolling shutter correction using gyroscopes. Tech. Rep. CSTR 2011-03, Stanford University Computer Science (2011)

9. Newcombe, R.A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A.J., Kohli, P., Shotton, J., Hodges, S., Fitzgibbon, A.: KinectFusion: Real-time dense surface mapping and tracking. In: IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2011), Basel, Switzerland (2011)

10. Ovrén, H., Forssén, P.E., Törnqvist, D.: Why would I want a gyroscope on my RGB-D sensor? In: Proceedings of IEEE Winter Vision Meetings, Workshop on Robot Vision (WoRV 2013). IEEE, Clearwater (2013)

11. Press, W.H., Teukolsky, S.A., Vetterling, W.T., Flannery, B.P.: Numerical Recipes in C: The Art of Scientific Computing, 2nd edn. Cambridge University Press, New York (1992)

12. Ringaby, E., Forssén, P.E.: Scan rectification for structured light range sensors with rolling shutters. In: IEEE International Conference on Computer Vision. IEEE Computer Society Press, Barcelona (2011)

13. Ringaby, E., Forssén, P.E.: Efficient video rectification and stabilisation for cell-phones. International Journal of Computer Vision 96(3), 335–352 (2012)

14. Roth, H., Vona, M.: Moving volume KinectFusion. In: British Machine Vision Conference (BMVC 2012). BMVA, University of Surrey, UK (2012), http://dx.doi.org/10.5244/C.26.112

15. Rusu, R.B., Cousins, S.: 3D is here: Point Cloud Library (PCL). In: IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China (2011)

16. Schönemann, P.: A generalized solution of the orthogonal Procrustes problem. Psychometrika 31(1), 1–10 (1966)

17. Shoemake, K.: Animating rotation with quaternion curves. In: Int. Conf. on CGIT, pp. 245–254 (1985)

18. Sturm, J., Engelhard, N., Endres, F., Burgard, W., Cremers, D.: A benchmark for the evaluation of RGB-D SLAM systems. In: Proc. of the International Conference on Intelligent Robot Systems, IROS (2012)

19. Whelan, T., McDonald, J., Kaess, M., Fallon, M., Johannsson, H., Leonard, J.J.: Kintinuous: Spatially extended KinectFusion. In: RSS 2012 Workshop on RGB-D Cameras, Sydney (2012)

20. Zhang, Z.: A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330–1334 (2000)


Author information

Correspondence to Hannes Ovrén.


Copyright information

© 2015 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Ovrén, H., Forssén, PE., Törnqvist, D. (2015). Improving RGB-D Scene Reconstruction Using Rolling Shutter Rectification. In: Sun, Y., Behal, A., Chung, CK. (eds) New Development in Robot Vision. Cognitive Systems Monographs, vol 23. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-43859-6_4


  • DOI: https://doi.org/10.1007/978-3-662-43859-6_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-43858-9

  • Online ISBN: 978-3-662-43859-6

  • eBook Packages: Engineering (R0)
