
Automatic Analysis of Affective States: Visual Attention Based Approach

  • Conference paper
Communication Technologies, Information Security and Sustainable Development (IMTIC 2013)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 414)


Abstract

Computing environments are moving from computer-centered designs to human-centered designs. Humans tend to communicate a wealth of information through affective states or expressions. Thus, automatic analysis of user affective states has become essential for the computer vision community. In this paper, we first focus on understanding the human visual system (HVS) as it decodes or recognizes facial expressions. To understand the HVS, we conducted a psycho-visual experimental study with an eye-tracker to find which facial regions are perceptually more attractive, or salient, for a particular expression. Secondly, based on the results of this psycho-visual study, we propose a novel framework for automatic analysis of affective states. The framework creates a discriminative feature space by processing only the salient facial regions to extract Pyramid Histogram of Orientation Gradients (PHOG) features. The proposed framework achieved an automatic expression recognition accuracy of 95.3% on the extended Cohn-Kanade (CK+) facial expression database for the six universal facial expressions. We also discuss the generalization capabilities of the proposed framework on unseen data. Finally, the paper discusses the effectiveness of the proposed framework on low-resolution image sequences.
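The PHOG descriptor concatenates histograms of gradient orientations computed over a spatial pyramid of cells. The sketch below (Python with OpenCV and NumPy, not the authors' implementation) illustrates PHOG-style extraction restricted to cropped salient facial regions such as the eyes and mouth; the crop coordinates, pyramid depth, and bin count are illustrative assumptions, and in practice a face or landmark detector would supply the regions.

```python
# Minimal PHOG-style feature extraction sketch for salient facial regions.
# Assumes OpenCV (cv2) and NumPy; parameters and crop boxes are illustrative.
import cv2
import numpy as np

def phog(region, levels=3, bins=8):
    """Pyramid Histogram of Orientation Gradients for one grayscale patch."""
    gx = cv2.Sobel(region, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(region, cv2.CV_32F, 0, 1)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    orientation = np.mod(np.arctan2(gy, gx), np.pi)   # unsigned orientation, 0..pi
    edges = cv2.Canny(region, 50, 150) > 0            # keep contributions from edge pixels only
    magnitude = magnitude * edges

    features = []
    h, w = region.shape
    for level in range(levels + 1):
        cells = 2 ** level                             # cells per side at this pyramid level
        for i in range(cells):
            for j in range(cells):
                ys, ye = i * h // cells, (i + 1) * h // cells
                xs, xe = j * w // cells, (j + 1) * w // cells
                hist, _ = np.histogram(
                    orientation[ys:ye, xs:xe],
                    bins=bins, range=(0, np.pi),
                    weights=magnitude[ys:ye, xs:xe])
                features.append(hist)
    vec = np.concatenate(features)
    return vec / (np.linalg.norm(vec) + 1e-8)          # L2-normalised descriptor

# Example: build one feature vector from salient regions (eyes, mouth).
# The crop boxes are hypothetical; a face/landmark detector would provide them.
face = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)
eyes_region = face[60:130, 40:200]
mouth_region = face[180:250, 70:170]
feature_vector = np.concatenate([phog(eyes_region), phog(mouth_region)])
```

The resulting descriptor could then be fed to a standard classifier (e.g., an SVM) trained on the six universal expressions; restricting extraction to the salient regions keeps the feature space compact and discriminative, which is the idea the abstract describes.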



Acknowledgment

This project is supported by the Région Rhône-Alpes, France.

Author information

Correspondence to Rizwan Ahmed Khan.


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Khan, R.A., Meyer, A., Konik, H., Bouakaz, S. (2014). Automatic Analysis of Affective States: Visual Attention Based Approach. In: Shaikh, F., Chowdhry, B., Zeadally, S., Hussain, D., Memon, A., Uqaili, M. (eds) Communication Technologies, Information Security and Sustainable Development. IMTIC 2013. Communications in Computer and Information Science, vol 414. Springer, Cham. https://doi.org/10.1007/978-3-319-10987-9_10


  • DOI: https://doi.org/10.1007/978-3-319-10987-9_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-10986-2

  • Online ISBN: 978-3-319-10987-9

  • eBook Packages: Computer Science (R0)
