Explore chapters and articles related to this topic
Teamwork in Space Exploration
Published in Lauren Blackwell Landon, Kelley J. Slack, Eduardo Salas, Psychology and Human Performance in Space Programs, 2020
Jensine Paoletti, Molly P. Kilcullen, Eduardo Salas
Teams with poor coordination may also experience the so-called ‘process loss’ associated with working in a team, which was historically considered a major concern for teamwork (Bowers, Baker, & Salas, 1994; Shiflett, 1979). At times, organizations refer to this process loss by speaking of ‘communication breakdowns’ or being ‘out of sync’ (Marks et al., 2001). Indeed, communication was previously one of the seven behavioral outcomes for team coordination (Bowers, Salas, Prince, & Brannick, 1992). Team communication is no longer considered an outcome but a key coordination process, and it is now frequently called information exchange; robust evidence delineates information exchange as vitally important for team performance, cohesion, decision satisfaction, and knowledge integration (Mesmer-Magnus & DeChurch, 2009). Additionally, meta-analytic evidence demonstrates that team communication quality is more positively related to team performance than team communication frequency (Marlow, Lacerenza, Paoletti, Burke, & Salas, 2018).
Parallel Computing Architecture Basics
Published in Vivek Kale, Parallel Computing Architectures and APIs, 2019
The term thread is associated with shared memory, while the term process is associated with distributed memory. A coarse classification of memory organization distinguishes between shared memory machines and distributed memory machines.

Shared memory machines: a global shared memory stores the data of an application and can be accessed by all processors or cores of the hardware system. Information exchange between threads is done through shared variables written by one thread and read by another. Correct behavior of the entire program has to be achieved by synchronization between threads, so that access to shared data is coordinated; that is, a thread must not read a data element before the write operation of the thread storing that element has completed. Depending on the programming language or environment, synchronization is performed by the runtime system or by the programmer.

Distributed memory machines: each processor has a private memory that can only be accessed by that processor, so no synchronization of memory accesses is needed. Information exchange is done by sending data from one processor to another via an interconnection network, using explicit communication operations.
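The contrast can be made concrete with a short sketch. The following Python example is not from the book and its names are illustrative: threads coordinate access to a shared variable through an explicit synchronization step, while separate processes hold private memory and exchange information only through an explicit send/receive operation over a channel.

```python
import threading
import multiprocessing as mp

# --- Shared memory: threads exchange information via a shared variable. ---
# The event provides the synchronization described above: the reader must not
# access the value before the writer has finished storing it.
shared = {"value": None}
ready = threading.Event()

def writer():
    shared["value"] = 42             # write to shared memory
    ready.set()                      # signal that the write has completed

def reader(results):
    ready.wait()                     # synchronize: wait for the writer
    results.append(shared["value"])  # read the shared variable

# --- Distributed memory: processes have private memory and exchange ---
# information only by explicit communication (here, a pipe).
def sender(conn):
    conn.send(42)                    # explicit send operation
    conn.close()

if __name__ == "__main__":
    results = []
    t_w = threading.Thread(target=writer)
    t_r = threading.Thread(target=reader, args=(results,))
    t_r.start(); t_w.start(); t_w.join(); t_r.join()
    print("shared-memory exchange:", results[0])

    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=sender, args=(child_conn,))
    p.start()
    print("message-passing exchange:", parent_conn.recv())  # explicit receive
    p.join()
```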
Sustainability principles in water management
Published in Nick F. Gray, Water Science and Technology: An Introduction, 2017
In the United States, the Homeland Security Presidential Directives and the Public Health Security and Bioterrorism Preparedness and Response Act (the Bioterrorism Act) of 2002 require that specific actions be taken in relation to water supplies. These include (i) assessing vulnerabilities of water utilities, (ii) developing strategies for responding to and preparing for emergencies and incidents, (iii) promoting information exchange among stakeholders, and (iv) developing and using technological advances in water security. The Bioterrorism Act requires those providing water supplies serving more than 3300 persons to carry out vulnerability assessments and to develop emergency response plans. The USEPA has been appointed as the agency responsible for identifying, prioritizing and coordinating infrastructure protection activities for the nation's drinking water and water treatment. The agency is also required to provide (i) information on potential threats to water systems; (ii) strategies for responding to potential incidents; (iii) information on protection protocols for vulnerability assessments; and (iv) research into water security (USEPA, 2003).
A cyber-physical system architecture based on lean principles for managing industry 4.0 setups
Published in International Journal of Computer Integrated Manufacturing, 2022
Amr Nounou, Hadi Jaber, Ridvan Aydin
It allows a precise quantitative analysis of information flows and works in both weighted and directed networks. There are different forms of information exchange: formal and informal exchanges, direct and indirect exchanges, and unilateral disclosures of information and signalling. Information exchanges can be observed in horizontal and vertical relationships and in different organizational settings. Information sharing is critical to an organization’s competitiveness and requires a free flow of information among members. For increased performance to occur, new information needs to be disseminated continually to individuals within organizations. Information exchange efficiency (or simply information efficiency) is defined based on the average length of the shortest paths between all nodes in the network of actors. As the average shortest path decreases, information efficiency increases; that is, direct communication channels between two actors maximize information exchange and hence maximize the efficiency indicator. Information exchange efficiency is calculated as follows. Let D be the distance matrix, which contains the lengths of the shortest paths between all pairs of nodes, so that an entry D(i,j) = d_ij represents the shortest path length from element i to element j; lengths between disconnected elements are set to infinity. The average efficiency E of a network is calculated using Equation (1), adopted from Latora and Marchiori (2001).
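The equation itself is not shown in the excerpt; the standard global-efficiency expression of Latora and Marchiori (2001), consistent with the definition above (the average of the reciprocal shortest path lengths), is

```latex
E = \frac{1}{N(N-1)} \sum_{i \neq j} \frac{1}{d_{ij}} \qquad (1)
```

where N is the number of nodes in the network and d_ij is the shortest path length from node i to node j; disconnected pairs have d_ij equal to infinity and therefore contribute zero to the sum. A minimal Python sketch (illustrative only; the function and variable names are assumptions, not the authors' implementation) computes this average efficiency directly from a distance matrix:

```python
import math

def average_efficiency(D):
    """Average (global) efficiency of a network from its distance matrix D.

    D[i][j] is the shortest-path length from node i to node j;
    disconnected pairs are represented by math.inf and contribute 0.
    """
    n = len(D)
    total = sum(1.0 / D[i][j]
                for i in range(n) for j in range(n)
                if i != j and D[i][j] != math.inf and D[i][j] > 0)
    return total / (n * (n - 1))

# Example: a 3-node path graph a-b-c with unit-length links
D = [
    [0, 1, 2],
    [1, 0, 1],
    [2, 1, 0],
]
print(average_efficiency(D))  # (1 + 0.5 + 1 + 1 + 0.5 + 1) / 6 = 0.8333...
```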
Examining freight performance of third-party logistics providers within the automotive industry in India: an environmental sustainability perspective
Published in International Journal of Production Research, 2020
Mohit Goswami, Arijit De, Muhammad Khoirul Khakim Habibi, Yash Daultani
Our study contributes to the extant research literature in several ways. It enriches the literature on auto components’ freight transportation by examining the interrelations of externalities, firms’ intrinsic capabilities, and information sharing in the context of achieving environmental sustainability. While most studies have examined the impact of such interrelationships in the context of achieving operational excellence, including GHG reduction and low-carbon growth as measures of environmental sustainability augments the extant literature. Apart from contributing to the literature on environmental sustainability, our research also contributes to information exchange theory, in that techno-commercial considerations are identified that, when mediated through information sharing, further environmental sustainability outcomes.
Knowledge integration via the fusion of the data models used in automotive production systems
Published in Enterprise Information Systems, 2019
Rafal Cupek, Adam Ziebinski, Marek Drewniak, Marcin Fojcik
The traditional design of the IT architecture used by industry does not always fit the new requirements. Different information exchange standards in the production environment make data integration difficult. Many different requirements need information from many different sensors and common processing methods. Combining all of the data is called data fusion (Liggins, Hall, and Llinas 2017) and can be defined as the process of combining sensed data so that the resulting data are more accurate than each source individually. Data fusion is a multilevel, multifaceted process that deals with the automatic detection, association, correlation, estimation and combination of data and information from single and multiple sources in order to achieve refined position and identity estimates as well as complete and timely assessments of situations and threats and their significance (White 1991). Data fusion is used to combine the information made available by different measurement sensors, information sources and decision makers by creating data models. The model defines the relationships between the sources of data and the types of processing that might be carried out to extract the maximum possible information from them. Developing or choosing the appropriate model is crucial for the success of a data fusion system. When preparing the model, the effects of the observing environment, the end user, the software and hardware platform, and the communication and processing possibilities must all be considered.
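As a concrete illustration of the claim that fused data can be more accurate than any individual source, the following minimal Python sketch is not from the paper and uses a simple inverse-variance weighting rule rather than the authors' fusion model: it combines two noisy sensor readings of the same quantity, and the variance of the fused estimate is smaller than that of either sensor alone.

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent sensor readings.

    measurements: list of (value, variance) pairs from different sensors.
    Returns the fused estimate and its variance; the fused variance is
    never larger than the smallest individual variance, so the combined
    result is at least as accurate as the best single sensor.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused_value, fused_var

# Two sensors observing the same quantity with different noise levels
print(fuse([(10.2, 0.04), (9.8, 0.09)]))  # fused variance < 0.04
```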