Abstract
Many people fear that emotion-oriented technologies (EOT) – capable of registering, modelling, influencing and responding to emotions – can easily affect their decisions and lives in ways that effectively undermine their autonomy. In this chapter, we explain why these worries are at least partly well-founded: EOT lend themselves particularly easily to abuses of autonomy, and there are ways of respecting the autonomy of persons that EOT are unable to accomplish. We draw some general ethical conclusions concerning the design and further development of EOT, contrasting our approach with the “interactional design approach”, which is often thought to avoid infringements of user autonomy. We argue, however, that it unduly restricts possible uses of EOT that are unproblematic from the perspective of autonomy, while at the same time allowing for uses of EOT that tend to compromise the autonomy of persons.
Notes
- 1.
- 2. This characterization is supposed to be compatible with a variety of views on autonomy. More elaborate characterizations and helpful introductions to the notion of autonomy can be found in Christman (1989, 2003), Dworkin (1988, Chap. 1), Friedman (2003, Chap. 1), Feinberg (1989) and Oshana (2006, Chap. 1).
- 3.
- 4.
- 5. Joseph Raz (1997) claims, e.g., that we are active rather than passive if we are responsive to reasons, i.e. if we can rationally make sense of our beliefs and desires (feelings, etc.).
- 6.
- 7. Some philosophers have distinguished this mode of self-reflection from the other two modes, because at this point the notions of ‘identification’ and ‘authenticity’ come into play. In self-evaluation a person is said to ‘identify’ with certain desires and to ‘make them her own’. The most prominent defender of this idea is Harry Frankfurt (1971). According to his ‘hierarchical model’, a person acts autonomously if she is moved to action by desires she identifies with, i.e. desires she wants to be effective in action. It is important to note that, in order to be autonomous, a person must not only satisfy this so-called ‘authenticity condition’ but also have the general capacities or competencies we describe. Cf. Christman (2003).
- 8.
- 9. Cf. Ekstrom (1993).
- 10. We want to avoid an ‘external’ or ‘substantive’ account of rationality, according to which a person must be able to understand the correct reasons and/or have true beliefs in order to count as autonomous (cf. Benson, 1987). For a discussion of ‘internal’ vs. ‘external’ rationality, see Christman (1991, 13 ff.).
- 11.
- 12.
- 13.
- 14.
- 15. One might argue that, on our account, the autonomy of Laura and her husband is also compromised, because John does not inform them about his feelings. But we can easily explain why our account of (respect for) autonomy does not yield this result. Most importantly, one reason why John keeps his distance might be precisely that he does not want to interact with Laura and her husband in ways that would compromise their autonomy. John’s autonomy is undermined because Paul compromises his ability to resolve the situation on his own, while Laura and her husband do not face a problem and thus do not have to resolve anything (yet).
- 16. Cf., e.g., the work of Aaron Ben-Ze’ev, Ronald de Sousa, Sabine A. Döring, Peter Goldie, Patricia Greenspan, Bennett Helm, Martha Nussbaum, David Pugmire, Amelie Rorty, Robert Solomon, Holmer Steinfath, Michael Stocker, Christine Tappolet, Gabriele Taylor, Bernard Williams and Richard Wollheim.
- 17. This formulation might strike many as far too strong. But note that we are always speaking solely from the perspective of autonomy.
- 18. Cf. Darwall (2006).
- 19. This condition is meant to exclude overly far-fetched worst-case scenarios.
- 20. As should be clear, these problems relate to the two general duties we have set out in Sect. 3.
- 21.
- 22. In passing, we want to mention an important qualification: whether some action undermines a person’s procedural independence cannot always be answered without reference to the person’s actual capacities. For example, a father does not respect his child’s procedural independence if he repeatedly tells her that the best thing to do in life is to become a check-out girl. The child is not in a position to evaluate and critically assess these claims against the background of a stable self-conception, and thus her father interferes with her procedural independence. By contrast, a father does not disrespect autonomy if he tells his well-educated and self-reflective daughter that the best thing to do in life is to become a check-out girl. She will most probably laugh at him. This suggests a principle that could, somewhat provocatively, be labelled the ‘low autonomy, high respect’ principle: the less autonomous a person actually is, the more other persons should respect her autonomy, since they are constantly in danger of undermining her procedural independence. This idea fits well with intuitions concerning, for example, the treatment of children. In discussing the ethicality of persuasive systems, one needs to make use of something like ‘normality conditions’ and assume that a person fulfils to some degree the conditions of self-reflection and rationality.
- 23. For further discussion of persuasive systems from the perspective of autonomy, see Baumann and Döring (2006). More on persuasive systems and the role of emotions can be found in section WP8; for the ethicality of persuasive systems, see Guerini and Stock (2006); see also the discussion in Döring and Goldie (2005a) (CyberDoc).
- 24.
- 25. See also the discussion of ‘Semi-Intelligent Information Filters’ (SIIF) in chapter “The Ethical Distinctiveness of Emotion-Oriented Technology” by Sabine Döring et al.
- 26. This system has been developed by Picard et al., http://affect.media.mit.edu/projects.php?id=2145 (accessed January 28, 2008). The other two systems – ‘iNerve’ and ‘iPanic’ – have been invented purely for purposes of discussion.
- 27. Cf. Döring and Goldie (2005b).
- 28.
- 29.
- 30. A different approach to dealing with problems of privacy can be found in Picard (2004).
- 31. An overview of the fields of application is given by Schröder et al. (2006).
- 32. It is important to note that some might want to distinguish between emotions and mere affective states, the former being more complex intentional states. Although we welcome such attempts, this move is not helpful in discussions of EOT, because ‘emotion’ and ‘affect’ are generally used as umbrella terms in this connection.
- 33. Such worries are also mentioned in Picard and Klein (2002).
- 34. See also chapter “Principalism: A Method for the Ethics of Emotion-Oriented Machines” by Sheelagh McGuinness.
References
Baumann H, Döring S (2006) Emotion-oriented systems – threats to user autonomy. http://emotion-research.net/ws/wp10/presentation-materials/HolgerBaumann-SabineDoering-wp10ws-EmotionOrientedSystems-ThreatsToUserAutonomy-final.pdf. Accessed 17 May 2010
Beauchamp TL, Childress JF (1994) Principles of biomedical ethics, 4th edn. OUP, Oxford
Benson P (1987) Freedom and value. J Philos 84:465–486
Boehner K et al (2005) Affect: from information to interaction. In: Proceedings of the 4th decennial conference on critical computing: between sense and sensibility, Aarhus, Denmark
Childress JF (1990) The place of autonomy in bioethics. Hastings Cent Rep 20:12–17
Christman J (1989) Introduction. In: Christman J (ed) The inner citadel. OUP, New York, NY, pp 3–23
Christman J (1991) Autonomy and personal history. Can J Philos 21:1–24
Christman J (2003) Autonomy in moral and political philosophy. In: Stanford encyclopedia of philosophy. http://plato.stanford.edu/entries/autonomy-moral. Accessed 17 May 2010
Darwall S (2006) The value of autonomy and autonomy of the will. Ethics 116:263–284
DeCew JW (2006) Privacy. In: Stanford encyclopedia of philosophy. http://plato.stanford.edu/entries/privacy/
Döring S, Goldie P (2005a) Interim report to plenary meeting on ethical frameworks for emotion-oriented systems. HUMAINE deliverable D10b in: http://emotion-research.net/deliverables/D10b.pdf. Accessed 17 May 2010
Döring S, Goldie P (2005b) Categories of emotion: everyday psychology and HUMAINE. http://emotion-research.net/ws/wp3/ExtraMaterial/HUMAINE-Goldie.pdf. Accessed 7 November 2010
Dworkin G (1976) Autonomy and behavior control. Hastings Cent Rep 6(1):23–28
Dworkin G (1988) The theory and practice of autonomy. CUP, Cambridge, MA
Ekstrom L (1993) A coherence theory of autonomy. Philos Phenomenol Res 53:599–616
Feinberg J (1989) Autonomy. In: Christman J (ed) The inner citadel. OUP, New York, NY, pp 27–53
Frankfurt H (1971) Freedom of the will and the concept of a person. J Philos 68:5–20
Friedman M (2003) Autonomy, gender, politics. OUP, Oxford
Guerini M, Stock O (2006) Ethical guidelines for persuasive systems. In: Proceedings of the HUMAINE WP10 workshop, Nov 2006, Vienna, Austria (EU). http://emotion-research.net/ws/wp10/presentation-materials/MarcoGuerini-OlivieroStock-wp10ws-EthicalGuidelinesForPersuasiveSystems-final.pdf/. Accessed 17 May 2010
Haworth L (1986) Autonomy. An essay in philosophical psychology and ethics. Yale UP, New Haven, CT
Höök K, Laaksolahti J (2008) Empowerment: a strategy to dealing with human values in affective interactive systems. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.96.5891&rep=rep1&type=pdf. Accessed 7 November 2010
Lindström M et al (2006) Affective diary – designing for bodily expressiveness and self-reflection. http://www.sics.se/~petra/affd.pdf. Accessed 17 May 2010
Meyers D (1989) Self, society, and personal choice. Columbia UP, New York, NY
Oshana M (2006) Personal autonomy in society. Ashgate, London
Picard RW, Klein J (2002) Computers that recognise and respond to user emotion: theoretical and practical implications. Interact Comput 14:141–169
Raz J (1997) When we are ourselves: the active and the passive. Reprinted in: Raz J (1999) Engaging reason: on the theory of value and action. OUP, Oxford, pp 5–21
Reynolds C, Picard RW (2004) Affective sensors, privacy, and ethical contracts. http://affect.media.mit.edu/pdfs/04.reynolds-picard-chi.pdf. Accessed 17 May 2010
Rössler B (2001) Der Wert des Privaten. Suhrkamp, Frankfurt
Schröder M, Cowie R, Kollias S (2006) The future of emotion-oriented computing. HUMAINE Plenary presentation, Paris, Jun 2007. http://emotion-research.net/ws/plenary-2007/2007-FutureOfEmotionOrientedComputing.pdf. Accessed 17 May 2010
Velleman D (1989) Practical reflection. Princeton UP, Princeton, NJ
Young R (1980) Autonomy and socialization. Mind 89:565–576
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this chapter
Baumann, H., Döring, S. (2011). Emotion-Oriented Systems and the Autonomy of Persons. In: Cowie, R., Pelachaud, C., Petta, P. (eds) Emotion-Oriented Systems. Cognitive Technologies. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15184-2_40
DOI: https://doi.org/10.1007/978-3-642-15184-2_40
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-15183-5
Online ISBN: 978-3-642-15184-2
eBook Packages: Computer Science (R0)