Conclusion
Published in Joanna Woronkowicz, D. Carroll Joynes, Norman M. Bradburn, Building Better Arts Facilities, 2014
Psychological research also demonstrates that our memories are both fallible and malleable. At the time of our interviews with project representatives, which in some cases occurred decades after the initial idea for the building project, recollections about project decision making may have been subtly altered (and not necessarily with any conscious intention to do so). Memory researchers refer to this as “hindsight” bias. We reconstruct and amend our memories of the past in the light of what we know now.2 Sometimes this serves to make the memories of what happened in the past look better than the actual circumstances and events would support, and sometimes it makes them look worse than they actually were. The direction and degree of distortion depend, at least in part, on whether things turned out better or worse than we, and others, expected. Take, for example, the Durham Convention and Visitors Bureau (DCVB) and its early opposition to the construction of a new performing arts center in downtown Durham. Now that the Durham Performing Arts Center (DPAC) has consistently contributed revenue to the city’s budget, the DCVB’s narrative has changed: “Now you fast forward 14 years, and they were one of the early supporters of the theater,” said Bill Kalkhof, one of DPAC’s planners.
Human Factors in Unmanned Aerial Systems
Published in R. Kurt Barnhart, Douglas M. Marshall, Eric J. Shappee, Introduction to Unmanned Aircraft Systems, 2021
Hindsight bias is defined as the “belief that an event is more predictable after it becomes known than before it became known” (Roese and Vohs, 2012). In other words, once an error occurs, the reviewer can incorrectly surmise that the event should have been easy to identify and prevent, when in reality that impression arises only because the outcomes of the actions are known. For example, a pilot who lands an aircraft with the landing gear retracted will often face the question “What were you thinking?” or the statement “I would have never done that!” In a field that carefully examines mishaps for the purpose of shaping new training, designs, or strategies, hindsight bias among personnel can impede progress toward safer and more efficient operations.
The role of domain expertise in trusting and following explainable AI decision support systems
Published in Journal of Decision Systems, 2022
Sarah Bayer, Henner Gimpel, Moritz Markgraf
If this is done ‘before’, multiple biases can affect the decisions, such as the default effect: the tendency to favour the default option (Anaraky et al., 2020) and thus the suggestion proposed by the DSS. Another potential bias is hindsight bias, the technical term for the human tendency to perceive an event as ‘more predictable after it becomes known than it was before it became known’ (Roese and Vohs, 2012, p. 411). Accordingly, the human user is likely to approve the suggestion of the AI-enabled DSS even though it might contradict their own opinion. In contrast, if the AI decision support is given ‘after’ and the system agrees with one’s opinion, one can feel acknowledged, which can feel like a compliment. This can foster a positive atmosphere which, in turn, can foster greater trust (McKnight et al., 1998). In that case, the user is more likely to accept a later suggestion of the AI even though it may conflict with their intuition. When the system disagrees with one’s opinion, however, it creates a notably different decision-making scenario that is affected by other biases, such as escalation of commitment: people stick to a choice they have made despite understanding the logical implication that doing so might lead to undesirable consequences (Staw, 1996).
The influence of context and perception when designing out risks associated with non-potable urban water reuse
Published in Urban Water Journal, 2018
Pierre Mukheibir, Cynthia Mitchell
We may fall into the overconfidence trap when we are overly confident of our own judgements and overestimate the accuracy of our forecasts and assumptions (Kahneman and Lovallo 1993). This trap can arise when the certainty of our judgement rests on myopic attention to a limited causal understanding of a past event; in other words, it is based on hindsight bias (Roese and Vohs 2012). In contrast, when faced with a high-risk decision, we may fall into the prudence trap and be over-cautious, adjusting our estimates closer to the ‘worst-case’ scenario even though the chances of it happening are very low. In such cases, we may hedge our bets by favouring a technology or practice with which we are more familiar, to the detriment of the less familiar approach. The relatively unknown or untested assumptions about reuse or decentralised schemes have often counted against such projects, whereas business-as-usual centralised potable schemes are more familiar to planners and decision makers and hence are scored more favourably in risk and multi-criteria assessments (Watson, Mitchell, and Fane 2012).