Abstract
Responding to the demands of the times, recent stage performances have attempted to introduce projected imagery as a means of communicating with the audience and of free expression beyond the limits of time and space. By combining the performers' motion with the space onto which images are projected, this approach visualizes the flow of space beyond the physical limits of the stage, and it turns static space into dynamic space by giving the audience the illusion that objects are moving. To implement a realistic stage, this study proposes a method that uses the entire space surrounding the performers as a movable screen rather than a fixed one. The viewers' location information, obtained from a camera, serves as an interaction element, and the result of the interaction controls both the position of the screen on which content is played and changes in the content itself. The proposed method can yield temporal, spatial, and economic benefits by reducing the amount of playback equipment required to create a 3D space, and it can improve the quality of performances by immersing the audience.
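The core control loop the abstract describes, in which a position tracked by a camera drives the pan position of a movable projection screen, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the binary-mask input (standing in for a segmented camera frame), and the linear pixel-to-angle mapping are all assumptions.

```python
def centroid(mask):
    """Centroid (x, y) of nonzero pixels in a binary mask given as a list of rows.

    In the paper's setting the mask would come from segmenting a camera frame
    (e.g. by color or depth); here it is a plain nested list for illustration.
    Returns None if the mask is empty.
    """
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))


def pan_angle(cx, frame_width, max_angle=90.0):
    """Map a horizontal pixel position to a screen pan angle in [-max_angle, +max_angle].

    The leftmost column maps to -max_angle, the rightmost to +max_angle;
    a subject centered in the frame yields 0 degrees.
    """
    return (cx / (frame_width - 1) - 0.5) * 2.0 * max_angle


# A tiny synthetic "frame": the tracked subject occupies column 1 of a 4-wide mask.
mask = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
]
cx, cy = centroid(mask)
angle = pan_angle(cx, frame_width=4)
print(round(angle, 3))  # subject left of center, so the screen pans to a negative angle
```

In a real system the angle would be sent to a motorized screen mount each frame, and the same tracked position could also select or transform the content being projected; the mapping here is deliberately the simplest linear choice.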
Copyright information
© 2015 Springer-Verlag Berlin Heidelberg
Cite this paper
Lim, S., Lyu, J. (2015). Stage Image Control System Using Visual Tracking. In: Park, J., Chao, HC., Arabnia, H., Yen, N. (eds) Advanced Multimedia and Ubiquitous Engineering. Lecture Notes in Electrical Engineering, vol 352. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-47487-7_34
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-47486-0
Online ISBN: 978-3-662-47487-7
eBook Packages: Engineering (R0)