A Comparative Study of Intrapersonal and Interpersonal Influencing Factors on the Academic Performance of Technical and Nontechnical Students
Published in Mukesh Kumar Awasthi, Ravi Tomar, Maanak Gupta, Mathematical Modeling for Intelligent Systems, 2023
Deepti Sharma, Rishi Asthana, Vaishali Sharma
Besides verbal communication, a teacher’s nonverbal communication leaves a remarkable impression on students’ attentiveness. Nonverbal communication includes kinesics, proxemics, chronemics, and paralinguistic features. Kinesics includes correct posture, appropriate gesticulation, positive body language, suitable facial expressions, and proper eye contact. Proxemics concerns how we use space while communicating with others. Bambaeeroo and Shokrpour (2017) found that nonverbal cues should complement verbal communication, and that balanced communication enhances students’ learning and improves academic performance. Sutiyatno (2018) likewise states that verbal and nonverbal communication have a significant positive impact on a student’s learning process. Zeki (2009) indicated that a teacher’s nonverbal communication creates a comfortable environment for learning and motivates students to participate in group discussions.
Patient–Professional Communication
Published in Richard J. Holden, Rupa S. Valdez, The Patient Factor, 2021
Onur Asan, Bradley H. Crotty, Avishek Choudhury
Many studies report the importance of nonverbal communication between patients and clinicians and its link to several outcomes. Nonverbal communication consists of eye gaze, facial expression, gesturing, body posture, and positioning. It helps to communicate care, concern, fear, respect, happiness, sadness, anger, surprise, and disgust, which directly contribute to forming trust or mistrust. These messages do not stop when there is no verbal communication: even when people are silent, they are still communicating nonverbally.
Conducting and Using the Interview Effectively
Published in Karen L. McGraw, Karan Harbison, User-Centered Requirements: The Scenario-Based Engineering Process, 2020
Karen L. McGraw, Karan Harbison
Some of the difficulties that arise during verbal communication can be clarified if the analyst is aware of, and works to interpret, the supporting nonverbal communication. Nonverbal communication conveys meanings that may enhance, substitute, or even contradict the accompanying verbal communication. We use many tools to communicate nonverbally. Figure 7.5 depicts some of these.
Examining the Use of Nonverbal Communication in Virtual Agents
Published in International Journal of Human–Computer Interaction, 2021
One of the advantages of a data-driven method is that real data can represent the full range of nuance present in real human NVC (which a “distilled” model of behavior fails to capture). This approach is useful when there are no existing studies or models for the specific behaviors that an agent is trying to employ. For example, the smoking cessation coach by Grolleman et al. (2006) required the researchers to first analyze the behaviors of a real coach. Although prior work had modeled similar situations involving interviewing and coaching, the researchers needed to understand the specific nonverbal strategies that a smoking coach employs. Similarly, Gratch et al. (2013) needed to understand the nonverbal signs associated with distress during clinical interviews. As part of this process, they first collected a large dataset of interview recordings. They then analyzed the data to find correlations between distress and the nonverbal behaviors of gaze, facial expression, and gesture, which later fed their recognition models.
Automatic voice emotion recognition of child-parent conversations in natural settings
Published in Behaviour & Information Technology, 2021
Effie Lai-Chong Law, Samaneh Soleimani, Dawn Watkins, Joanna Barwick
Indeed, a number of studies have examined how parents shape the cognitive and emotional development of very young children, mostly infants and pre-schoolers, through verbal behaviours (e.g. Sabbagh and Callanan 1998; Kochanska and Kim 2013; Luce, Callanan, and Smilovic 2013; Fivush 2014; Sigel, McGillicuddy-DeLisi, and Goodnow 2014; Lagattuta, Elrod, and Kramer 2016). However, these studies relied on conventional manual methods of data collection and analysis. Such methods can be prohibitively time-consuming and costly when applied to a vast body of conversational data, and the prolonged gap between collecting data and yielding results may cause researchers to miss the opportunity to maximize the impact of the work. This hurdle can be alleviated by automating the analysis process, leveraging progressively more sophisticated signal processing and machine learning models (for reviews of sentiment analysis tools, see Cambria et al. 2013; Serrano-Guerrero et al. 2015; Soleimani and Law 2017). Both verbal and nonverbal data (i.e. words, voice, facial expressions, and psychophysiological responses) are relevant to automatic emotion analysis. Each data type involves a nontrivial body of related work; hence, given our main research question, we focus on voice only in the ensuing discussion.
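To make the automated-analysis idea above concrete, the following is a minimal, purely illustrative sketch (not the authors' pipeline) of voice-based emotion classification: extract coarse prosodic features (RMS energy and zero-crossing rate) from a waveform, then assign the label of the nearest emotion centroid. All signal values, feature choices, and labels here are hypothetical toy data.

```python
# Hypothetical sketch of voice emotion classification: two coarse prosodic
# features plus a nearest-centroid rule. Real systems use far richer features
# (MFCCs, pitch contours) and learned classifiers.
import numpy as np

def prosodic_features(y):
    """Return [RMS energy, zero-crossing rate per sample] for a waveform."""
    rms = np.sqrt(np.mean(y ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(y)))) / 2.0
    return np.array([rms, zcr])

def nearest_centroid(x, centroids):
    """Label = emotion whose feature centroid is closest to x."""
    return min(centroids, key=lambda k: np.linalg.norm(x - centroids[k]))

# Toy reference signals: a loud, fast-varying voice vs. a quiet, slow one.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
excited = 0.8 * np.sin(2 * np.pi * 300 * t)  # high energy, high ZCR
calm = 0.1 * np.sin(2 * np.pi * 80 * t)      # low energy, low ZCR

centroids = {
    "excited": prosodic_features(excited),
    "calm": prosodic_features(calm),
}

# An unseen utterance closer in energy/ZCR to the "excited" profile.
test_signal = 0.7 * np.sin(2 * np.pi * 280 * t)
print(nearest_centroid(prosodic_features(test_signal), centroids))  # excited
```

The design mirrors the pipeline described in the text at its smallest scale: feature extraction replaces manual annotation, and the classifier replaces manual labeling, which is what makes the approach scale to large bodies of conversational data.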
Deep TMS H7 Coil: Features, Applications & Future
Published in Expert Review of Medical Devices, 2021
Tal Harmelech, Yiftach Roth, Aron Tendler
Autism spectrum disorder (ASD) is a complex developmental condition that involves persistent challenges in social interaction, speech, and nonverbal communication, along with restricted/repetitive behaviors. The mPFC is an important node within the functional networks underlying mentalizing abilities crucial to social interaction, which have been found to be impaired in adults with ASD. Two recent studies of Deep TMS in ASD investigated its effect on different common ASD deficits. The first study examined changes in emotional and cognitive processing following daily HF (5 Hz) Deep TMS H7 Coil sessions over 5 weeks in two high-functioning ASD patients [46]. Assessments included a computerized cognitive battery, emotional recognition tasks, and clinical questionnaires. Both patients improved in a variety of cognitive functions, with a global improvement of 20% (for P1) and 30% (for P2) in the neuropsychological battery. Emotional recognition tasks also showed that recognizing emotions in others became easier. Patients reported being more aware of emotional and social cues already during the first 2 weeks of treatment, as well as experiencing increased attention and decisiveness throughout the course of treatment. The self-reported questionnaires showed slight improvement in autistic symptoms (AQ) and empathy (IRI). The most noticeable effect, however, was the decrease in OCD-like symptoms at the end of treatment, as measured by the Y-BOCS scale. Patients reported this decrease as early as the first 2 weeks of treatment, and follow-up assessment indicated that it remained significant even 2 months after treatment ended.