Emotion Recognition for Human Machine Interaction
Published in Puneet Kumar, Vinod Kumar Jain, Dharminder Kumar, Artificial Intelligence and Global Society, 2021
Evidence of the importance of emotions in human-to-human interaction motivates equipping computers to recognize emotional expressions, advancing the goal of natural human-computer interaction. Because emotions can be classified into distinct states, pattern recognition approaches can be applied to recognize them, with different modalities serving as inputs to the emotion recognition models [16]. Several modalities exist for recognizing human emotions and thereby endowing machines with emotional ability. These modalities help computers identify the user's current emotional state and adapt their service accordingly. Each modality has its own advantages and disadvantages for specific application domains. The modalities for emotion recognition include the face, speech, gestures, physiological signals, and brain signals, and are described as follows:
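The pattern-recognition framing above can be sketched minimally: feature vectors extracted from some modality are mapped to discrete emotional states by a trained classifier. The sketch below uses a simple nearest-centroid classifier on synthetic two-dimensional features and hypothetical "happy"/"sad" labels; a real system would use far richer features and models.

```python
# Minimal sketch of emotion recognition as pattern recognition.
# Feature vectors (in practice extracted from face, speech, or
# physiological signals) are assigned to discrete emotional states
# by a nearest-centroid classifier. All data here is synthetic.
from statistics import mean

def train_centroids(samples):
    """samples: list of (feature_vector, emotion_label) pairs."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    # One centroid (mean feature vector) per emotional state.
    return {label: [mean(dim) for dim in zip(*vecs)]
            for label, vecs in by_label.items()}

def classify(centroids, features):
    """Return the emotion whose centroid is nearest to `features`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], features))

# Hypothetical training data: two features per sample, two emotion states.
training = [
    ([0.9, 0.1], "happy"), ([0.8, 0.2], "happy"),
    ([0.1, 0.9], "sad"),   ([0.2, 0.8], "sad"),
]
model = train_centroids(training)
print(classify(model, [0.85, 0.15]))  # nearest to the "happy" centroid
```

The same structure applies regardless of modality; only the feature-extraction front end changes.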
Consciousness and Ethics: Artificially Conscious Moral Agents
Published in Wendell Wallach, Peter Asaro, Machine Ethics and Robot Ethics, 2020
Wendell Wallach, Colin Allen, Stan Franklin
Our focus is not on whether an artificial agent should be granted moral status, though we agree that the organic view is worthy of serious consideration. However, the possibility of developing synthetic emotions, including pleasure and pain, for artificial (non-biological) agents has been of interest to scientists working in the fields of affective computing and machine consciousness [Picard, 1997; Franklin and Patterson, 2006; Vallverdú and Casacuberta, 2008; Haikonen, 2009]. It is too early to evaluate whether existing or future implementations of synthetic emotions will lead to the kind of rich emotional intelligence that might be expected of moral agents. Artificial agents incapable of feeling pain or pleasure, or lacking empathy, may fail to respond adequately to certain classes of moral challenges. But the capacity to empathize is not a prerequisite for responding appropriately to every moral challenge.
Flexible and Stretchable Devices for Human-Machine Interfaces
Published in Muhammad Mustafa Hussain, Nazek El-Atab, Handbook of Flexible and Stretchable Electronics, 2019
Irmandy Wicaksono, Canan Dagdeviren
Emotions play a vital role in our daily lives, enabling us to express and understand each other's feelings. They are manifested through external physical expressions and internal mental processes that may be imperceptible to others. The ability to recognize human emotions and simulate empathy has become an important aspect of human-machine interaction systems, giving rise to the field of affective computing, or artificial emotional intelligence (emotion AI) (Picard 1997). Recognizing emotions enables machines to adapt and react to the user's behavior, allowing a more natural and effective relationship between humans and computers. Multiple methods have been explored in recent years to monitor and classify human emotions. The most widely used approaches involve detecting facial expressions, speech, body gestures, and physiological signals (Castellano et al. 2008). Except for physiological monitoring, which uses wearable sensors, current approaches to emotion recognition mainly rely on an external camera to recognize facial gestures or a microphone to process voice signals. Having covered recent developments in flexible and stretchable devices for body gesture, speech, and facial expression recognition (Sections 3 through 5), in this section we mainly discuss the development of these devices for physiological sensing. Because individuals cannot easily manipulate their physiological signals to hide their emotions, sensing these signals is especially useful.
Information acquisition, emotion experience and behaviour intention during online shopping: an eye-tracking study
Published in Behaviour & Information Technology, 2021
Emotion can influence thinking, judgment, and decision-making. The emotional experience of using e-commerce websites has attracted the attention of many researchers (Guo et al. 2015; Ha and Im 2012; Ha and Lennon 2010; Kim, Kim, and Lennon 2009; Porat and Tractinsky 2012). Kim, Kim, and Lennon (2009) suggested that product presentation has a significant effect on consumers' emotional responses. Porat and Tractinsky (2012) indicated that a web store's salient design characteristics influence the emotions of its visitors. Ha and Im (2012) found that website design quality has positive direct effects on pleasure, arousal, and perceived information quality, and indirect effects on satisfaction. Guo et al. (2015) proposed a multimodal measurement method combining questionnaires, eye-tracking, and physiological measures to interpret users' emotional experience while they interact with websites. Ha and Lennon (2010) showed that the pleasure and arousal induced by various online visual merchandising cues are positively related to consumers' satisfaction.
Feel the image: The role of emotions in the image-seeking process
Published in Human–Computer Interaction, 2019
Lev Poretski, Joel Lanir, Ofer Arazy
Scholars have recognized that emotions play a pivotal role in the process of human–machine interaction (Brave, Nass, & Hutchinson, 2005). Picard (1995) defined the term “affective computing” as “an approach to computing that relates to, originates in, or purposely influences emotions” (p. 1). Research in the area of affective computing has aimed at providing computers with the ability to recognize and intelligently respond to human emotions (Hudlicka, 2003; Picard, 2003; Picard & Klein, 2002). Related to the current study, emotions have been examined in the context of Web surfing and information retrieval (Arapakis, Jose, & Gray, 2008; Deng & Poole, 2010; Hudlicka, 2003; Kalbach, 2006). For example, Kalbach (2006), studying emotions in information-seeking processes, pointed out that people cannot be regarded as strictly goal-driven, task-solving agents: affective considerations lie behind their choices and seeking behavior. In particular, when seeking images as part of a creative communication task, users' image-selection decisions often involve deep emotional considerations: the desire to emotionally move people (Chew et al., 2010).
Multi-branch feature learning based speech emotion recognition using SCAR-NET
Published in Connection Science, 2023
Keji Mao, Yuxiang Wang, Ligang Ren, Jinhong Zhang, Jiefan Qiu, Guanglin Dai
Emotions are vehicles for personal feelings and feedback. Understanding emotions plays a crucial role in human-human interaction and is likewise an effective means of improving human-computer interaction. The area of affective computing has therefore gained great attention and made significant progress in the last decade (Zeng et al., 2009). The ultimate goal of affective computing is the automatic understanding and recognition of human emotions, and many difficulties remain in achieving this. People express their emotions through various mediums, such as body movements, facial expressions, speech, and physiological changes. In human-computer interaction, computers can capture these key mediums to recognise the emotional state of the user. Accordingly, researchers have applied techniques from psychology, signal processing, and deep learning to affective computing, allowing computers to learn from body movements (Noroozi et al., 2021; Pławiak et al., 2016), facial expressions (Kulkarni et al., 2018; Shojaeilangari et al., 2016), speech signals (Kamińska & Pelikant, 2012; Kamińska et al., 2017; Noroozi et al., 2017), and physiological signals (Greco et al., 2016; Jenke et al., 2014), ultimately enabling emotion recognition. In earlier studies, facial expressions received the most attention because of their rich emotional content; about 95% of the affective computing literature is based on facial expressions (De Gelder, 2009). However, because speech signals are easy to acquire and carry relatively little private information, affective computing based on speech is becoming the mainstream of current research.
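As an illustration of the speech-based pipeline this excerpt describes (not SCAR-NET itself), a speech emotion recognizer typically segments the waveform into short frames and extracts acoustic features from each frame before classification. The sketch below computes two classic low-level features, short-time energy and zero-crossing rate, on a synthetic tone standing in for a speech segment; a real system would use richer features (e.g. spectrograms) fed to a neural network.

```python
# Hedged sketch: frame-level acoustic feature extraction, the front end
# of a typical speech emotion recognition pipeline. The signal here is a
# synthetic 200 Hz tone, a stand-in for a real speech recording.
import math

def frames(signal, size, hop):
    """Split the signal into overlapping frames of `size` samples."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, hop)]

def short_time_energy(frame):
    """Mean squared amplitude: correlates with loudness/arousal cues."""
    return sum(x * x for x in frame) / len(frame)

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs that change sign."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / (len(frame) - 1)

sr = 8000  # assumed sample rate (Hz)
signal = [math.sin(2 * math.pi * 200 * n / sr) for n in range(sr // 10)]

# One (energy, ZCR) feature pair per frame; a classifier would map these
# per-frame or aggregated features to emotion labels.
feats = [(short_time_energy(f), zero_crossing_rate(f))
         for f in frames(signal, size=256, hop=128)]
print(len(feats), feats[0])
```

Deep models such as the multi-branch network named in the title learn their features directly from spectro-temporal representations rather than from hand-crafted descriptors like these, but the frame-then-classify structure is the same.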