Sonic Interaction With Physical Computing
Published in Michael Filimowicz, Foundations in Sound Design for Embedded Media, 2019
Interfaces that let users interact through modalities such as touch, gesture, or voice are often referred to as natural user interfaces (NUIs). NUIs allow for fun and easy ways for users to interact with technology. For music specifically, NUIs open the door for performers to control musical parameters in new ways while conforming to behaviors we are already used to, thereby reducing the cognitive load of learning something new (Mortensen 2018). Imagine modifying several controls of a synthesizer simply by waving your hand. Using NUIs, we can create interfaces that feel less technological and more experiential. For example, suppose you are given the task of creating an interactive meditative experience for a client. What if merely touching water could produce sound? Wouldn’t that impress them! It is entirely possible using the appropriate technological tools. One caveat is that people tend to make interfaces overly complicated and unintuitive, for example by having multiple gestures control the same thing; after a while, the interface may no longer feel natural. It is therefore essential to create a clear user experience by mapping out the points of interaction and what they control. Here are some suggestions for creating a meaningful NUI:

- Take advantage of a user’s existing skill set. Reusing common human skills expedites the learning process. For example, tapping is an easy way to control the rhythm or BPM of a song.
- If you are designing a musical instrument, take your user into account. Is it for a novice or a skilled musician?
- Make the behavior easy. Breaking a behavior down into tiny steps can reduce the learning curve if your user is unfamiliar with the interface.
- Imitative interaction can make feedback easy for the user to understand. For example, raising your hand automatically raises the volume or pitch of a sound.

Let’s learn about some of the existing devices that allow for the creation of NUIs.
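The tap-to-tempo and raised-hand-to-volume mappings mentioned above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not code from the chapter; the function names `estimate_bpm` and `hand_height_to_volume` and the normalized 0.0–1.0 hand-height range are hypothetical.

```python
# Illustrative sketches of two simple NUI mappings (not from the chapter).

def estimate_bpm(tap_times):
    """Estimate tempo in BPM from an ascending list of tap timestamps (seconds)."""
    if len(tap_times) < 2:
        raise ValueError("need at least two taps")
    # Average the gaps between consecutive taps, then convert seconds/beat to BPM.
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def hand_height_to_volume(height, low=0.0, high=1.0):
    """Map a normalized hand height (0.0 at rest, 1.0 fully raised) to a volume level."""
    height = max(0.0, min(1.0, height))  # clamp to the sensor's expected range
    return low + height * (high - low)

print(estimate_bpm([0.0, 0.5, 1.0, 1.5]))  # taps 0.5 s apart -> 120.0 BPM
print(hand_height_to_volume(0.5))          # hand halfway raised -> 0.5
```

In a real installation the timestamps would come from a touch or capacitive sensor and the height from a camera or depth sensor; the point is that each point of interaction maps to exactly one clearly defined control.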
Usability of Cross-Device Interaction Interfaces for Augmented Reality in Physical Tasks
Published in International Journal of Human–Computer Interaction, 2022
Xiaotian Zhang, Weiping He, Mark Billinghurst, Daisong Liu, Lingxiao Yang, Shuo Feng, Yizhe Liu
Many current AR HMDs use “natural user interfaces” driven by interaction techniques such as gaze, voice, and mid-air hand gestures (Lee & Chu, 2018; Unlu & Xiao, 2021). Natural user interfaces are intuitive and do not require specialized hardware (Unlu & Xiao, 2021). However, they have some limitations. For example, gaze interfaces suffer from the “Midas Touch” problem (Chen & Shi, 2019), and frequent head movements during head-gaze interaction increase motion sickness (Darbar et al., 2021). Voice commands can be unreliable in excessively loud environments (Zhao & Madhavan, 2005) and may not be socially acceptable in many places (Darbar et al., 2021). Mid-air gestures suffer from tremors, fatigue, a lack of physical feedback, and lower precision (Normand & McGuffin, 2018). Therefore, many of these techniques may not be suitable for long-term use (Lee & Chu, 2018) or for applications requiring precise input. In contrast, smartphones and smartwatches carry a range of complex and precise sensors and provide comfortable, expansive touch input (Brasier et al., 2021; Mayer & Sörös, 2014; Unlu & Xiao, 2021). Combined, these devices point to a future where users interact with Mobile Multi-Device Environments (Grubert et al., 2015).
A comprehensive tool for developing new human-centred and social inclusion-oriented design strategies and guidelines
Published in Theoretical Issues in Ergonomics Science, 2019
A recent topic in Interaction Design concerns the development of Natural User Interfaces (NUIs) (e.g. Blake 2011; Wigdor and Wixon 2011), new types of user interfaces whose interaction model is largely based on gesture-based communication (e.g. gesture recognition, air gestures, touches on a screen, etc.; Harper et al. 2008). But NUIs are still designed around the idea of standard end-users, and their solutions are not conceived to meet real end-users’ needs. For example, gestures do not reflect the real desires of people, and visual communication does not meet the needs of people with cognitive and visual disabilities (Rossi 2014). In order to meet the needs of all possible real end-users, the study developed an early concept of an Inclusive Natural User Interface (INUI). The HSDT tool was therefore used to improve a set of previously developed design guidelines, with the aim of creating a more holistic set of inclusion-oriented design guidelines (Figure 10).