Systems View of the World
Published in Adedeji B. Badiru, Systems Engineering Using the DEJI Systems Model®, 2023
Since this handbook deals with two associated fields—industrial and systems engineering—there is a strong need to define both professions in order to gain a clear perspective on them and to appreciate their interrelationships. Throughout this chapter, the two fields are used together, and the discussions that follow apply to either. Perhaps the first classic and widely accepted definition of industrial engineering (IE) was offered by the then American Institute of Industrial Engineers (AIIE) in 1948. Others have since extended the definition: “Industrial Engineering is uniquely concerned with the analysis, design, installation, control, evaluation, and improvement of sociotechnical systems in a manner that protects the integrity and health of human, social, and natural ecologies. A sociotechnical system can be viewed as any organization in which people, materials, information, equipment, procedures, and energy interact in an integrated fashion throughout the life cycles of its associated products, services, or programs. Through a global systems perspective of such organizations, industrial engineering draws upon specialized knowledge and skills in the mathematical, physical, and social sciences, together with the principles and methods of engineering analysis and design, to specify, predict, and evaluate the results to be obtained from such systems, thereby assuring such objectives as performance, reliability, maintainability, schedule adherence, and cost control.”
Looking Behind Patient Safety Culture: Organisational Dynamics, Job Characteristics and the Work Domain
Published in Patrick Waterson, Patient Safety Culture, 2018
Denham L. Phipps, Darren M. Ashcroft
One way to view healthcare is as an open sociotechnical system (e.g. Buckle et al. 2003; Waterson 2009). In simple terms, a sociotechnical system is a collection of technical and social elements that come together in a dynamic manner to create work processes and products. Such a system is ‘open’ when it interacts with the wider environment; indeed, many organisations are shaped by (and possibly influence) broader social, political, economic and technological developments (Geysen and Verbruggen 2003; Rasmussen 1997). Hence, safety is seen as a product of the system, and safety culture as the capacity of the system to ‘create’ safety (cf. Carrillo 2011). Figure 5.1 illustrates this metaphor in the form of a feedback control loop. Feedback control loops are used to describe how variables within a dynamic system influence each other (e.g. Kontogiannis 2012; Marais, Saleh and Leveson 2006). In each loop, the arrows indicate which variables are influencing and which are being influenced. A plus or minus sign is used to indicate whether the influenced variable is being enhanced or diminished. A detailed but readable introduction to the use of feedback control loops can be found in Senge (1990).
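To make the loop notation concrete, the following minimal sketch simulates a balancing feedback loop of the kind the excerpt describes: incidents raise management attention (a ‘+’ link), attention builds the system’s capacity to ‘create’ safety (‘+’), and that capacity in turn suppresses incidents (‘−’). The variables, coefficients and initial values are illustrative assumptions, not the model from Figure 5.1.

```python
# Toy balancing feedback loop (illustrative only, not the Figure 5.1 model):
# incidents -> (+) attention -> (+) safety capacity -> (-) incidents.

def simulate(steps: int = 50, dt: float = 1.0) -> list[tuple[float, float, float]]:
    incidents = 10.0   # incident rate per period (hypothetical units)
    attention = 1.0    # management attention to safety
    capacity = 1.0     # the system's capacity to 'create' safety

    history = []
    for _ in range(steps):
        # '+' link: more incidents raise attention; attention decays to a baseline of 1.0
        d_attention = 0.1 * incidents - 0.2 * (attention - 1.0)
        # '+' link: attention builds capacity; capacity erodes without upkeep
        d_capacity = 0.3 * attention - 0.1 * capacity
        # '-' link: capacity suppresses incidents; hazards add 2.0 new incidents/period
        d_incidents = 2.0 - 0.5 * capacity * incidents / (incidents + 1.0)

        attention += dt * d_attention
        capacity += dt * d_capacity
        incidents = max(0.0, incidents + dt * d_incidents)
        history.append((incidents, attention, capacity))
    return history

if __name__ == "__main__":
    for t, (inc, att, cap) in enumerate(simulate()):
        if t % 10 == 0:
            print(f"t={t:2d}  incidents={inc:6.2f}  attention={att:5.2f}  capacity={cap:5.2f}")
```

Running the sketch shows the balancing behaviour such loops are meant to capture: incidents initially climb, attention and safety capacity respond, and the loop eventually pulls the incident rate back down.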
Reconstructing Medical Practice
Published in Christine Jorm, Reconstructing Medical Practice, 2016
Doctors only ‘see’ a fraction of the care they give. As has been discussed, sometimes this is literally so, but there is a broader system, not so easily videotaped, in which doctors’ action and inaction, particularly their disengagement from system improvement and management, result in patients suffering poor care and adverse events. Human factors experts talk of ‘socio-technical systems’; that is, the interaction of technology, task, complexity, work processes, environment, and human skills and cognitive limitations with the organizational culture and social systems. If doctors learned to see this broader system, they would have the potential, as the most powerful and influential healthcare providers, to forge it into better shape and to make changes others cannot.
Application of Artificial Intelligence in Detection and Mitigation of Human Factor Errors in Nuclear Power Plants: A Review
Published in Nuclear Technology, 2023
Meenu Sethu, Bhavya Kotla, Darrell Russell, Mahboubeh Madadi, Nesar Ahmed Titu, Jamie Baalis Coble, Ronald L. Boring, Klaus Blache, Vivek Agarwal, Vaibhav Yadav, Anahita Khojandi
Human reliability analysis (HRA) is the study of human contributions to risk and the quantification of human error (e.g., rates, probabilities) for use in overall risk models. Historically, human factors and human reliability have diverged: human factors engineering has tended to focus on understanding the factors that influence human performance in sociotechnical systems in order to improve the design of those systems, while HRA has tended to focus on safety and risk factors, including in as-built systems such as existing nuclear facilities.8 These two fields converge with the introduction of new analytical tools such as AI. Traditional HRA uses worksheets and subject matter experts to predict the factors that will contribute to human errors. This approach depends on the subjective judgement of analysts and can result in considerable inter-analyst variability. The advent of AI techniques in HRA promises to minimize such subjectivity and to provide a more consistent tool for HRA. Additionally, where the causes of human error are identified, AI can provide technology to assist operators in preventing and mitigating errors.
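To give a concrete sense of the worksheet-style quantification being contrasted with AI here, the sketch below follows the general shape of the SPAR-H method (NUREG/CR-6883): a nominal human error probability (HEP) is multiplied by performance shaping factor (PSF) multipliers chosen by the analyst, with an adjustment applied when three or more PSFs are negative so the result remains a valid probability. The task data and multiplier values are illustrative assumptions; a real analysis would use the method’s published PSF tables.

```python
# Minimal sketch of SPAR-H-style quantification (per NUREG/CR-6883):
# HEP = nominal HEP x product of PSF multipliers, with an adjustment
# when 3+ PSFs are negative (multiplier > 1). Values are illustrative.

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}

def spar_h_hep(task_type: str, psf_multipliers: dict[str, float]) -> float:
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    for multiplier in psf_multipliers.values():
        composite *= multiplier
    negative_psfs = sum(1 for m in psf_multipliers.values() if m > 1.0)
    if negative_psfs >= 3:
        # Adjustment keeps the result a probability when many negative
        # PSFs would otherwise push nhep * composite above 1.
        return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)
    return min(1.0, nhep * composite)

if __name__ == "__main__":
    # Hypothetical diagnosis task: high stress, high complexity, and
    # incomplete procedures; the remaining PSFs are nominal (1.0).
    psfs = {"available_time": 1.0, "stress": 2.0, "complexity": 2.0,
            "experience": 1.0, "procedures": 5.0, "ergonomics": 1.0,
            "fitness_for_duty": 1.0, "work_processes": 1.0}
    print(f"Estimated HEP: {spar_h_hep('diagnosis', psfs):.4f}")
```

The PSF multipliers are exactly the judgement calls that vary between analysts; an AI-assisted approach would aim to estimate them, or the HEP itself, more consistently from operational data.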
Clarifying the nature of failure in sociotechnical systems: ambiguity-based failure and expectation-based failure
Published in Theoretical Issues in Ergonomics Science, 2022
Sociotechnical systems are a class of systems that consist of people, and their associated activities, technologies, rules and regulations, acting together as a concerted whole. Sociotechnical systems have been addressed by a number of disciplines, ranging from engineering to the history of technology (Bijker, Hughes, and Pinch 2012; Carayon 2006; Cook 2000; Hughes 1988; 1998; 2004; Sommerville and Baxter 2010; Trist 1981; Trist, Murray, and Trist 1993; Vermaas et al. 2011; Waterson, Older Gray, and Clegg 2002). No single, widely shared formulation of ‘sociotechnical systems’ exists; rather, there is a variety of formulations, each with utility in different contexts. Similarly, there is no widely accepted, cross-disciplinary consensus on the nature of systems and systemic concepts. Some disciplinary communities use ‘systems thinking’ for epistemic clarification, formulation, and methodology; others, e.g. systems engineering, consider systems as ontological constructs. What characterizes sociotechnical systems is that they have a set of functions that, in some sense, they ‘ought’ to be able to perform, and that they fail when they cannot perform these functions.
Have we reached the organisational ceiling? a review of applied accident causation models, methods and contributing factors in construction
Published in Theoretical Issues in Ergonomics Science, 2019
Matthew James Woolley, Natassia Goode, Gemma J. M. Read, Paul M. Salmon
Despite the dominance of the systems approach in safety science and the broad application of RMF, Accimap, STAMP and FRAM across a diverse set of domains, the extent to which construction has adopted this approach is uncertain. Notwithstanding this uncertainty, it is clear that construction is a complex sociotechnical system (Loosemore and Cheung 2015; Mitropoulos, Abdelhamid, and Howell 2005; Love et al. 2002). A sociotechnical system has been defined as any work system in which people and technology interact to achieve task goals (Klein 2014). Construction involves high-energy plant, services and equipment requiring frequent interaction with construction workers. Construction incidents range from falls from height and collisions between plant and workers to geotechnical failure, contact with high-energy services, and fire and explosion. The complexity of construction work emerges through the often simultaneous interaction of remote and inclement geographical and environmental conditions (Suraji, Duff, and Peckitt 2001; Bondy et al. 2005; Gibb et al. 2006); transient workforces (Haslam et al. 2005; Esref 2013; Mitropoulos et al. 2005); task unpredictability (Mitropoulos et al. 2005); and dynamic, fast-paced scheduling and activities (Haslam et al. 2005; Mitropoulos et al. 2005; Loughborough 2003; Manu et al. 2010). These factors, underpinned by the consistency and coordinability of the system (Cowlagi and Saleh 2013), mirror those in many safety-critical domains and highlight the importance of applying systems thinking to all incidents occurring within the system (Hovden, Albrechtsen, and Herrera 2010).