An Empirical Evaluation of the Performance of Real-Time Illumination Approaches: Realistic Scenes in Augmented Reality

  • Conference paper
  • In: Augmented Reality, Virtual Reality, and Computer Graphics (AVR 2019)

Abstract

Although Augmented, Virtual, and Mixed Reality (AR/VR/MR) systems have been widely developed and many applications have achieved significant results, rendering a virtual object under the appropriate illumination model of the real environment is still under investigation. The entertainment industry has produced astounding results in several media forms, although the rendering process has mostly been performed offline. The physical scene contains illumination information that can be sampled and then used to render virtual objects in real time for a realistic scene. In this paper, we evaluate the accuracy of our previously and currently developed systems, which provide real-time dynamic illumination for coherent interactive augmented reality, based on the virtual object’s appearance in association with the real world and related criteria. The system achieves this through three simultaneous aspects: (1) estimating the incident light angle in the real environment using a live-feed \(360^\circ \) camera mounted on an AR device; (2) simulating the reflected light through two routes, (a) global cube map construction and (b) local sampling; and (3) defining the shading properties of the virtual object to depict the correct lighting and suitable shadowing. Finally, the performance efficiency of both routes of the system is examined to reduce the overall cost, and the results are evaluated through shadow observation and a user study.
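
As an illustration of aspect (1), the sketch below shows one plausible way to estimate a dominant incident-light direction from a single equirectangular frame of a live 360-degree feed, using OpenCV and NumPy. This is a minimal, hypothetical example under simplifying assumptions (a single dominant light source, an equirectangular projection, y-up world coordinates); the function name, parameters, and camera index are illustrative and do not describe the authors' actual implementation.

    # Hypothetical sketch of the incident-light estimation step (aspect 1).
    # Not the authors' implementation; names and assumptions are illustrative.
    import numpy as np
    import cv2

    def estimate_incident_light_direction(equirect_frame: np.ndarray) -> np.ndarray:
        """Return a unit vector pointing toward the dominant light source.

        equirect_frame: HxWx3 BGR frame captured by the 360-degree camera,
        assumed to be in equirectangular projection.
        """
        gray = cv2.cvtColor(equirect_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        # Blur so a single hot pixel does not dominate the estimate.
        gray = cv2.GaussianBlur(gray, (0, 0), sigmaX=5)
        h, w = gray.shape
        # Treat the brightest smoothed pixel as the light centroid.
        v_idx, u_idx = np.unravel_index(np.argmax(gray), gray.shape)
        # Equirectangular mapping: u -> azimuth, v -> polar angle (0 = straight up).
        azimuth = (u_idx / w) * 2.0 * np.pi - np.pi
        polar = (v_idx / h) * np.pi
        # Spherical to Cartesian, y-up convention.
        direction = np.array([
            np.sin(polar) * np.cos(azimuth),
            np.cos(polar),
            np.sin(polar) * np.sin(azimuth),
        ])
        return direction / np.linalg.norm(direction)

    # Example usage with a live feed (camera index 0 is an assumption):
    # cap = cv2.VideoCapture(0)
    # ok, frame = cap.read()
    # if ok:
    #     light_dir = estimate_incident_light_direction(frame)

In a full pipeline, such an estimate could drive the shadow direction and feed the cube map construction and shading steps described in aspects (2) and (3).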

Author information

Correspondence to A’aeshah Alhakamy or Mihran Tuceryan.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Alhakamy, A., Tuceryan, M. (2019). An Empirical Evaluation of the Performance of Real-Time Illumination Approaches: Realistic Scenes in Augmented Reality. In: De Paolis, L., Bourdot, P. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2019. Lecture Notes in Computer Science, vol 11614. Springer, Cham. https://doi.org/10.1007/978-3-030-25999-0_16

  • DOI: https://doi.org/10.1007/978-3-030-25999-0_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-25998-3

  • Online ISBN: 978-3-030-25999-0

  • eBook Packages: Computer Science; Computer Science (R0)
