Retroaction Between Music and Physiology: An Approach from the Point of View of Emotions

  • Chapter
Guide to Brain-Computer Music Interfacing

Abstract

It is well known that listening to music produces particular physiological reactions in the listener, yet the study of these relationships remains a largely unexplored field. When analyzing physiological signals measured on a person listening to music, one must first define models that specify what information can be extracted from these signals. Conversely, generating music from physiological data amounts to constructing the inverse of the relationship that occurs naturally; this also requires models that can drive all the parameters of a generative music system, in a coherent way, from the few physiological measurements available. The notion of emotion, besides seeming particularly appropriate in this context, turns out to be a central concept that articulates the musical and physiological models. In this article we propose an experimental real-time system, based on the paradigm of emotions, for studying the interactions and retroactions between music and physiology.
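The abstract's central idea, that an emotion model mediates between the few available physiological measurements and the many parameters of a generative music system, can be sketched in code. The following is a minimal, hypothetical illustration only: it assumes a Russell-style valence/arousal plane with both coordinates in [-1, 1], and the parameter names and ranges (`tempo_bpm`, `mode`, `loudness`) are illustrative assumptions, not the authors' actual mapping.

```python
def emotion_to_music(valence: float, arousal: float) -> dict:
    """Map a point on a valence/arousal plane (both in [-1, 1]) to
    coarse musical parameters. Hypothetical mapping for illustration:
    arousal drives tempo and loudness, valence selects the mode."""
    # Rescale arousal from [-1, 1] to [0, 1], then to a 60-140 BPM range.
    energy = (arousal + 1.0) / 2.0
    tempo = 60.0 + 80.0 * energy
    # Positive valence -> major mode, negative valence -> minor mode.
    mode = "major" if valence >= 0.0 else "minor"
    # Loudness as a normalized gain in [0.3, 1.0], also driven by arousal.
    loudness = 0.3 + 0.7 * energy
    return {"tempo_bpm": round(tempo), "mode": mode, "loudness": round(loudness, 2)}
```

In a real-time system of the kind the abstract describes, the physiological analysis stage would continuously estimate such a (valence, arousal) pair and the generative stage would read the resulting parameters on every update.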



Acknowledgments

This research was carried out in the context of the SCRIME project (Studio de Création et de Recherche en Informatique et Musique Électroacoustique, scrime.labri.fr), which is funded by the DGCA of the French Ministry of Culture and the Aquitaine Regional Council. SCRIME is the result of a cooperation agreement between the Conservatoire of Bordeaux, ENSEIRB-Matmeca (a school of electronics and computer engineering) and the University of Bordeaux. It brings together electroacoustic music composers and scientific researchers, and is managed by LaBRI (the computer science research laboratory of the University of Bordeaux, www.labri.fr). Its main missions are research, creation, diffusion and pedagogy.

We would like to thank Pierre Héricourt, system engineer at LaBRI, for developing the drivers for the EEG headsets, which allowed us to interface them with the rest of the software and to actually run our experiments.

Corresponding author

Correspondence to Pierre-Henri Vulliard.


Copyright information

© 2014 Springer-Verlag London

About this chapter

Cite this chapter

Vulliard, PH., Larralde, J., Desainte-Catherine, M. (2014). Retroaction Between Music and Physiology: An Approach from the Point of View of Emotions. In: Miranda, E., Castet, J. (eds) Guide to Brain-Computer Music Interfacing. Springer, London. https://doi.org/10.1007/978-1-4471-6584-2_11

  • DOI: https://doi.org/10.1007/978-1-4471-6584-2_11

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-6583-5

  • Online ISBN: 978-1-4471-6584-2

  • eBook Packages: Computer Science, Computer Science (R0)
