Ubiquity
Published in Digital Transformation of Enterprise Architecture, 2019
Vivek Kale
Ensembles of MEMS (micro-electro-mechanical systems) can be formed into arbitrary 3D shapes, creating artifacts that resemble different kinds of physical objects. Such an ensemble is called a tangible user interface (TUI), through which a person interacts with digital information in the physical environment. Such an interface can be part of e-textiles, a new class of fiber materials in which sensing and communication are integrated into a woven structure to monitor signals and variables in the area of interest.
Perspectives on the Nature of Intuitive Interaction
Published in Intuitive Interaction, 2018
Alethea Blackler, Shital Desai, Mitchell McEwan, Vesna Popovic, Sarah Diefenbach
TEIs and natural user interfaces (NUIs) have long been claimed to be intuitive (Hurtienne & Israel, 2007; Jacob et al., 2008). This intuitiveness is attributed to tactile or haptic interaction, described in terms of static system properties such as directness (Dix, 2011), ease of learning and naturalness (Müller-Tomfelde & Fjeld, 2012), and speed, simplicity, and effectiveness (Jacoby et al., 2009). TEIs are interfaces that accept physical interactions, in the form of gestures, touch, and body movements, as inputs to the system. They comprise a mix of physical and virtual elements. Depending on the configuration of these elements, TEIs span a broad spectrum that includes tangible user interfaces (TUIs) (Ishii, 2008), mixed-reality systems (Milgram & Kishino, 1994), ubiquitous systems (Vallgårda, 2014), and gestural and whole-body systems (Aslan, Primessnig, Murer, Moser, & Tscheligi, 2013). TEIs based on gestural interaction and body movements were popularized by futuristic films such as Star Trek and Minority Report, and they have been successfully implemented in gaming platforms such as Microsoft's Xbox Kinect and Nintendo's Wii Remote. Other TEIs, such as Siftables (Merrill, Kalanithi, & Fitzgerald, 2011) and reacTable (Jordà, Geiger, Alonso, & Kaltenbrunner, 2007), embed digital information in physical artifacts so that the information can be directly manipulated and accessed (Ishii, 2008). Similarly, physical and virtual elements coexist in mixed-reality systems (Milgram & Colquhoun, 1999) such as Osmo (Tangible Play, 2014). Although TUIs and mixed-reality systems are referenced separately in the literature, there is no clear way to differentiate between them. What they share is that both integrate physical and virtual elements, and both have two main categories of implementation: (1) distinct and separate physical and virtual spaces, and (2) overlapping physical and virtual spaces.
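To make the idea of directly manipulating digital information through a physical artifact more concrete, the minimal Python sketch below maps the tracked pose of a reacTable-style puck to sound-synthesis parameters. It is not drawn from the cited systems: the TangibleObject fields, the to_synth_parameters function, and the parameter ranges are illustrative assumptions only.

```python
from dataclasses import dataclass

# Minimal sketch (not the actual reacTable implementation): a tracked tangible
# object, identified by a fiducial marker, whose physical pose is mapped
# directly onto digital parameters.
@dataclass
class TangibleObject:
    marker_id: int    # fiducial marker printed on the physical puck
    x: float          # normalised table position, 0.0-1.0
    y: float
    angle_deg: float  # rotation of the puck on the table surface

def to_synth_parameters(obj: TangibleObject) -> dict:
    """Map the puck's pose to hypothetical synthesiser parameters."""
    return {
        "cutoff_hz": 200 + (obj.angle_deg % 360) / 360 * 4800,  # rotation -> filter cutoff
        "volume": max(0.0, min(1.0, 1.0 - obj.y)),              # nearer the top edge -> louder
        "pan": obj.x * 2 - 1,                                   # left/right position -> stereo pan
    }

if __name__ == "__main__":
    puck = TangibleObject(marker_id=7, x=0.25, y=0.4, angle_deg=90.0)
    # Turning or sliding the physical puck changes the digital parameters directly.
    print(to_synth_parameters(puck))
```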
Haptic Feedback Helps Me? A VR-SAR Remote Collaborative System with Tangible Interaction
Published in International Journal of Human–Computer Interaction, 2020
Peng Wang, Xiaoliang Bai, Mark Billinghurst, Shusheng Zhang, Dechuan Han, Mengmeng Sun, Zhuo Wang, Hao Lv, Shu Han
A tangible user interface (TUI) uses suitable physical objects to provide haptic interaction in human-computer interfaces (Ishii, 2008; Leithinger, Follmer, Olwal, & Ishii, 2014). Recently, some researchers have demonstrated that passive feedback, like that provided by a TUI, can improve user performance in hand-drawn 3D sketching (Arora et al., 2017; Mohanty et al., 2018): users sketch on a real physical surface rather than drawing in mid-air. Arora et al. (2017) conducted a series of studies exploring the factors that affect freehand sketching in VR and showed that the passive feedback provided by a physical surface improves the precision, controllability, and overall aesthetic quality of hand-drawn strokes. Furthermore, Mohanty et al. (2018) introduced a new haptic-enabled mid-air metaphor to study the spatiality, tangibility, and kinesthetics of curve modeling. Knopp, Lorenz, Pelliccia, and Klimant (2018) developed a robot-powered VR training application that provides haptic feedback while making full use of VR's advantages (e.g., immersion, interaction, imagination). Piumsomboon et al. (2019b) presented a multi-scale MR collaborative platform that represented the remote partner using a TUI; their results showed that the passive haptic TUI improved the mobility and flexibility of controlling a remote collaborator's viewpoint.
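As a rough illustration of how a physical surface can anchor VR sketching, the short Python sketch below projects a tracked pen-tip sample onto the plane of a tracked tabletop, so that the virtual stroke lies on the surface the user actually feels. The project_onto_plane helper, the plane coordinates, and the use of NumPy are assumptions for illustration and do not reproduce the methods of the cited studies.

```python
import numpy as np

# Illustrative sketch only (not the method of Arora et al. or Mohanty et al.):
# when a physical surface provides passive haptic feedback, stroke samples from
# a tracked pen or controller can be projected onto the tracked surface plane
# so the drawn stroke coincides with the surface the user touches.

def project_onto_plane(point, plane_point, plane_normal):
    """Orthogonally project a 3D point onto the plane defined by a point and a normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return point - np.dot(point - plane_point, n) * n

if __name__ == "__main__":
    surface_origin = np.array([0.0, 0.75, 0.0])  # assumed tracked tabletop height (metres)
    surface_normal = np.array([0.0, 1.0, 0.0])   # tabletop faces upwards
    raw_tip = np.array([0.10, 0.76, -0.30])      # noisy tracked pen-tip sample, slightly above the table
    snapped = project_onto_plane(raw_tip, surface_origin, surface_normal)
    print(snapped)  # y is clamped to 0.75, i.e. the stroke point lies on the physical surface
```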
TANGAEON: Tangible Interaction to Support People in a Mindfulness Practice
Published in International Journal of Human–Computer Interaction, 2019
Andrea Vianello, Luca Chittaro, Assunta Matassa
One of the early examples of TEI systems is provided by Tangible User Interfaces (TUIs) (Ishii & Ullmer, 1997), i.e., interfaces that can take different forms and employ different materials, and that can be used as input and/or output devices to represent and/or manipulate digital information, or to give users feedback on the completion of their physical or digitally computed actions (Ullmer & Ishii, 2000). More recently, extending TUIs to sense and interact with the surrounding environment has been made possible by the inclusion of low-cost microcontrollers together with sensors and actuators, giving rise to the Physical Computing paradigm (O'Sullivan & Igoe, 2004).
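To illustrate the Physical Computing paradigm mentioned above, here is a minimal sense-compute-actuate loop in Python. The read_pressure and set_vibration functions are hypothetical placeholders for the sensor and actuator a real tangible device would embed (typically accessed through a microcontroller); the sketch is an assumed illustration, not an implementation from the cited text.

```python
import time

# Hypothetical sketch in the spirit of Physical Computing (O'Sullivan & Igoe, 2004):
# a tangible object senses how hard it is squeezed and responds with vibration.

def read_pressure() -> float:
    """Placeholder for reading a pressure sensor (0.0-1.0), e.g. via a microcontroller's ADC."""
    return 0.0  # replace with real hardware I/O (serial, GPIO, etc.)

def set_vibration(strength: float) -> None:
    """Placeholder for driving a vibration motor (0.0-1.0), e.g. via PWM."""
    print(f"vibration -> {strength:.2f}")

def control_loop(threshold: float = 0.3, period_s: float = 0.05) -> None:
    """Sense-compute-actuate loop: squeezing the object harder makes it vibrate more strongly."""
    while True:
        pressure = read_pressure()                            # sense
        strength = pressure if pressure > threshold else 0.0  # compute
        set_vibration(strength)                               # actuate
        time.sleep(period_s)

if __name__ == "__main__":
    control_loop()  # runs until interrupted
```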
Auditory and haptic feedback to train basic mathematical skills of children with visual impairments
Published in Behaviour & Information Technology, 2023
Sebastián Marichal, Andrea Rosales, Fernando González Perilli, Ana Cristina Pires, Josep Blat
Haptic interaction is a natural strategy to compensate for visual impairments (VIs), and it has been exploited in Tangible User Interfaces (TUIs), in which physical objects and environments augmented with digital information become interaction devices (Ishii and Ullmer 1997).