Affective Computing is also the title of a textbook on the subject by Rosalind Picard.

Affective computing is the study and development of systems and devices that can recognize, interpret, and process human emotions. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.[1] While the origins of the field may be traced as far back as early philosophical enquiries into emotion,[2] the modern branch of computer science originated with Rosalind Picard's 1995 paper[3] on affective computing.[4][5] A motivation for the research is the ability to simulate empathy: the machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.

Areas of affective computing

Detecting and recognizing emotional information

Detecting emotional information begins with passive sensors which capture data about the user's physical state or behavior without interpreting the input. The data gathered is analogous to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture and gestures, while a microphone might capture speech. Other sensors detect emotional cues by directly measuring physiological data, such as skin temperature and galvanic resistance.[6]
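
As a minimal illustration of this first stage, the sketch below computes a few coarse statistics from a hypothetical skin-conductance trace. The synthetic signal, sampling rate, and choice of features are assumptions for illustration only, not a description of any particular sensor pipeline.

```python
import numpy as np

def gsr_features(signal, sample_rate_hz):
    """Summarize a skin-conductance trace with a few coarse statistics."""
    signal = np.asarray(signal, dtype=float)
    diffs = np.diff(signal)
    # A local maximum is where the slope changes from positive to non-positive.
    n_peaks = int(np.sum((diffs[:-1] > 0) & (diffs[1:] <= 0)))
    duration_s = len(signal) / sample_rate_hz
    return {
        "mean_level": float(signal.mean()),   # tonic conductance level
        "std_level": float(signal.std()),     # overall variability
        "peaks_per_s": n_peaks / duration_s,  # rough phasic-response rate
    }

# Example: a synthetic 4-second trace sampled at 10 Hz.
t = np.linspace(0, 4, 40, endpoint=False)
trace = 2.0 + 0.3 * np.sin(2 * np.pi * t) + 0.02 * np.random.randn(40)
print(gsr_features(trace, sample_rate_hz=10))
```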

Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. This is done by parsing the data through processes such as speech recognition, natural language processing, or facial expression detection, all of which depend on human judgment built into their design and training.[citation needed]
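
A minimal sketch of this recognition stage, assuming feature vectors have already been extracted: a hand-rolled nearest-centroid classifier assigns each new vector the emotion label of the closest class mean. The labels and toy data are invented placeholders, not a real dataset.

```python
import numpy as np

def train_centroids(X, y):
    """Store one mean feature vector (centroid) per emotion label."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(centroids, x):
    """Assign the label whose centroid is nearest in feature space."""
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

# Toy training data: 2-D feature vectors (e.g. pitch variability, speech rate).
X = np.array([[0.9, 1.2], [1.0, 1.1], [0.2, 0.4], [0.1, 0.5]])
y = np.array(["aroused", "aroused", "calm", "calm"])
centroids = train_centroids(X, y)
print(classify(centroids, np.array([0.8, 1.0])))  # -> "aroused"
```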

Emotion in machines

Another area within affective computing is the design of computational devices that either exhibit innate emotional capabilities or are capable of convincingly simulating emotions. A more practical approach, given current technological capabilities, is the simulation of emotions in conversational agents in order to enrich and facilitate interactivity between human and machine.[7] While human emotions are often associated with surges in hormones and other neuropeptides, emotions in machines might be associated with abstract states associated with progress (or lack of progress) in autonomous learning systems.[citation needed] In this view, affective emotional states would correspond to time-derivatives (perturbations) in the learning curve of an arbitrary learning system.[citation needed]
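
A minimal sketch of this learning-curve view, under the assumption (not an established algorithm) that the local slope of a performance curve can be read as a crude valence signal:

```python
import numpy as np

def affect_from_learning_curve(performance, threshold=0.01):
    """Label each step by the local slope of the performance curve."""
    slopes = np.gradient(np.asarray(performance, dtype=float))
    labels = []
    for s in slopes:
        if s > threshold:
            labels.append("positive")   # progress -> positive affect
        elif s < -threshold:
            labels.append("negative")   # regression -> negative affect
        else:
            labels.append("neutral")    # plateau -> neutral
    return labels

curve = [0.2, 0.35, 0.5, 0.55, 0.54, 0.53, 0.6]   # toy accuracy over time
print(affect_from_learning_curve(curve))
```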

Marvin Minsky, one of the pioneering computer scientists in artificial intelligence, relates emotions to the broader issues of machine intelligence stating in The Emotion Machine that emotion is "not especially different from the processes that we call 'thinking.'"[8]

Technologies of affective computing

Emotional speech

Emotional speech processing recognizes the user's emotional state by analyzing speech patterns. Vocal parameters and prosodic features such as pitch variability and speech rate are analyzed through pattern recognition.[9][10]
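
As an illustration of one such prosodic cue, the sketch below estimates the fundamental frequency (pitch) of a single speech frame by autocorrelation. Real emotion-from-speech systems use far more robust pitch trackers, voice-activity detection, and many additional features; this is a toy example on a synthetic tone.

```python
import numpy as np

def pitch_autocorr(frame, sample_rate_hz, fmin=80.0, fmax=400.0):
    """Estimate the fundamental frequency of one frame via autocorrelation."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = int(sample_rate_hz / fmax)          # shortest allowed pitch period
    hi = int(sample_rate_hz / fmin)          # longest allowed pitch period
    lag = lo + np.argmax(ac[lo:hi])          # strongest periodicity in range
    return sample_rate_hz / lag

sr = 8000
t = np.arange(sr) / sr                       # one second of "audio"
tone = np.sin(2 * np.pi * 220.0 * t)         # synthetic 220 Hz voiced signal
print(round(pitch_autocorr(tone[:400], sr), 1))   # roughly 220 Hz
```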

Emotional inflection and modulation in synthesized speech, whether through phrasing or acoustic features, is useful in human-computer interaction; such capability makes speech sound natural and expressive. For example, a dialog system might adopt a more child-like speaking style if it deems the emotional model of its current user to be that of a child.[citation needed]
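
A minimal sketch of such modulation, assuming a synthesizer that accepts W3C SSML markup: a coarse user model selects prosody settings for the output. The parameter names follow the SSML <prosody> element; the specific pitch and rate values per style are invented for illustration.

```python
# Hypothetical style-to-prosody table; values are illustrative assumptions.
PROSODY = {
    "child":   {"pitch": "+15%", "rate": "90%"},   # higher pitch, slower
    "neutral": {"pitch": "+0%",  "rate": "100%"},
    "urgent":  {"pitch": "+5%",  "rate": "115%"},  # faster to convey urgency
}

def to_ssml(text, style):
    """Wrap text in an SSML prosody element chosen by the user model."""
    p = PROSODY.get(style, PROSODY["neutral"])
    return f'<prosody pitch="{p["pitch"]}" rate="{p["rate"]}">{text}</prosody>'

print(to_ssml("Let's try that again!", "child"))
```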

Facial expression

The detection and processing of facial expression is achieved through various methods such as optical flow, hidden Markov models, neural network processing, or active appearance models. More than one modality can be combined or fused (multimodal recognition, e.g. facial expressions and speech prosody,[11] or facial expressions and hand gestures[12]) to provide a more robust estimate of the subject's emotional state.
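
A minimal sketch of decision-level ("late") fusion, one common way to combine modalities: each modality's classifier emits a probability distribution over emotion labels, and a weighted average combines them. The weights, labels, and scores here are assumptions, not the fusion scheme of the cited work.

```python
import numpy as np

LABELS = ["anger", "joy", "sadness", "neutral"]

def fuse(face_probs, speech_probs, w_face=0.6, w_speech=0.4):
    """Weighted average of two modality-specific probability vectors."""
    combined = w_face * np.asarray(face_probs) + w_speech * np.asarray(speech_probs)
    combined /= combined.sum()               # renormalize to a distribution
    return LABELS[int(np.argmax(combined))], combined

face = [0.1, 0.6, 0.1, 0.2]      # toy facial-expression classifier output
speech = [0.2, 0.4, 0.1, 0.3]    # toy prosody classifier output
print(fuse(face, speech))        # -> ('joy', fused distribution)
```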

Body gesture

Body gesture refers to the position and movement of the body. Many methods have been proposed[13] for detecting body gesture, with hand gestures a common focus; appearance-based methods[14] and 3-D modeling methods are traditionally used.

Visual aesthetics

Aesthetics, in the world of art and photography, refers to principles of the nature and appreciation of beauty. Judging beauty and other aesthetic qualities is a highly subjective task. Computer scientists at Penn State treat the challenge of automatically inferring the aesthetic quality of pictures from their visual content as a machine learning problem, using a peer-rated online photo-sharing website as a data source.[15] They extract visual features based on the intuition that these features can discriminate between aesthetically pleasing and displeasing images. The work is demonstrated in the ACQUINE system[16] on the Web.
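
A minimal sketch of this framing, with invented features and ratings standing in for the real data: regress peer ratings onto global image features by ordinary least squares, then predict a score for a new photo. The cited work (ref. 15) uses many more features and real photo-sharing ratings.

```python
import numpy as np

# Each row: [mean brightness, colorfulness proxy]; target: mean peer rating.
X = np.array([[0.7, 0.8], [0.4, 0.3], [0.6, 0.9], [0.3, 0.2]])
ratings = np.array([6.1, 4.0, 6.5, 3.5])

# Ordinary least squares with a bias term appended to the features.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)

new_photo = np.array([0.5, 0.6, 1.0])        # features + bias term
print(round(float(new_photo @ coef), 2))     # predicted aesthetic rating
```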

Potential applications

In e-learning applications, affective computing can be used to adjust the presentation style of a computerized tutor when a learner is bored, interested, frustrated, or pleased.[17][18] Psychological health services, such as counseling, could benefit from affective computing applications that help determine a client's emotional state.[citation needed] Affective computing can also send a message, via color or sound, to express a user's emotional state to others.[citation needed]

Robotic systems capable of processing affective information exhibit greater flexibility when working in uncertain or complex environments. Companion devices, such as digital pets, use affective computing abilities to enhance realism and provide a higher degree of autonomy.[citation needed]

Other potential applications are centered on social monitoring. For example, a car could monitor the emotions of all occupants and engage additional safety measures, such as alerting other vehicles if it detects the driver to be angry.[citation needed] Affective computing also has potential applications in human-computer interaction, such as affective mirrors that allow users to see how they perform; emotion-monitoring agents that send a warning before one sends an angry email; or even music players that select tracks based on mood.[citation needed]

Affective computing is also being applied to the development of communicative technologies for use by people with autism.[19]

References

  1. Tao, Jianhua; Tan, Tieniu (2005). "Affective Computing: A Review". Affective Computing and Intelligent Interaction, LNCS 3784: 981–995, Springer. doi:10.1007/11573548.
  2. James, William (1884). "What is an Emotion?". Mind 9: 188–205. Cited by Tao and Tan.
  3. "Affective Computing" MIT Technical Report #321 (Abstract), 1995
  4. Kleine-Cosack, Christian (2006). Recognition and Simulation of Emotions. (PDF) URL accessed on May 13, 2008.
  5. Diamond, David (2003). The Love Machine; Building computers that care.. Wired. URL accessed on May 13, 2008.
  6. Garay, Nestor, Idoia Cearreta, Juan Miguel López, Inmaculada Fajardo (April 2006). Assistive Technology and Affective Mediation. Human Technology: an Interdisciplinary Journal on Humans in ICT Environments 2 (1): 55–83.
  7. Heise, David (2004), "Enculturating agents with expressive role behavior", Agent Culture: Human-Agent Interaction in a Mutlicultural World, Lawrence Erlbaum Associates, pp. 127–142 
  8. Restak, Richard (2006-12-17). "Mind Over Matter". The Washington Post. Retrieved 2008-05-13.
  9. Dellaert, F.; Polzin, T.; Waibel, A. (1996). "Recognizing Emotion in Speech". In Proc. of ICSLP 1996, Philadelphia, PA, pp. 1970–1973.
  10. Lee, C.M.; Narayanan, S.; Pieraccini, R., Recognition of Negative Emotion in the Human Speech Signals, Workshop on Auto. Speech Recognition and Understanding, Dec 2001
  11. G. Caridakis, L. Malatesta, L. Kessous, N. Amir, A. Raouzaiou, K. Karpouzis, Modeling naturalistic affective states via facial and vocal expressions recognition, International Conference on Multimodal Interfaces (ICMI’06), Banff, Alberta, Canada, November 2-4, 2006
  12. T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis, S. Kollias, Emotion Analysis in Man-Machine Interaction Systems, in Samy Bengio, Hervé Bourlard (Eds.), Machine Learning for Multimodal Interaction, Lecture Notes in Computer Science, Vol. 3361, Springer-Verlag, 2004, pp. 318–328.
  13. J. K. Aggarwal, Q. Cai, Human Motion Analysis: A Review, Computer Vision and Image Understanding, Vol. 73, No. 3, 1999
  14. Vladimir I. Pavlovic, Rajeev Sharma, Thomas S. Huang, Visual Interpretation of Hand Gestures for Human-Computer Interaction; A Review, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997
  15. Ritendra Datta, Dhiraj Joshi, Jia Li and James Z. Wang, Studying Aesthetics in Photographic Images Using a Computational Approach, Lecture Notes in Computer Science, vol. 3953, Proceedings of the European Conference on Computer Vision, Part III, pp. 288-301, Graz, Austria, May 2006.
  16. http://acquine.alipr.com
  17. AutoTutor
  18. S. Asteriadis, P. Tzouveli, K. Karpouzis, S. Kollias, Estimation of behavioral user state based on eye gaze and head pose—application in an e-learning environment, Multimedia Tools and Applications, Springer, Volume 41, Number 3 / February, 2009, pp. 469-493.
  19. Projects in Affective Computing

This page uses Creative Commons Licensed content from Wikipedia.