AUTHOR VERSION OF A. RE & L. MACCHI / Cogn Tech Work (2010) 12:79–85, DOI 10.1007/s10111-010-0148-1

This is an author version of the contribution published in: Cognition, Technology & Work, 2010, 12(2):79–85. DOI: 10.1007/s10111-010-0148-1. The final publication is available at Springer via: http://dx.doi.org/10.1007/s10111-010-0148-1

From cognitive reliability to competence? An evolving approach to human factors and safety

Alessandra Re (1), Luigi Macchi (2)

1 Interdepartmental Laboratory of Applied Ergonomics (LIDEA), Department of Psychology, University of Torino, Via Verdi, 10, 10124 Torino, Italy
2 Centre de Recherche sur les Risques et les Crises, Mines-ParisTech, Rue Claude Daunesse, B.P. 207, Sophia Antipolis Cedex, France
e-mail: luigi.macchi@crc.ensmp.fr

Abstract

When reviewing the research path of an author, we are inevitably influenced by our own background and approach. Tracing back the converging and diverging assumptions of the authors with respect to Erik Hollnagel's research path, the paper focuses on the evolution of cognitive psychology as resulting from an original distinction between two models of human cognition: the first more in line with the behaviourist tradition, the second with the cybernetic and ecological approach. The former, which became dominant in the development of cognitive psychology, marginalised some aspects that prove crucial in the latter. The concepts of anticipation and of intentional behaviour, together with the notion of variability of normal performance, are traditionally part of the cybernetic and ecological approach to cognitive psychology. These concepts have also been central in the development of the ergonomic analysis of work activities. Through the Resilience Engineering perspective, the two models of human cognition are brought closer, while the concept of competence is sketched as a possible mediator towards a "positive" approach to Human Factors and Safety.

1 Introduction

There are researchers who develop a vast number of studies; fewer who develop a research path; and even fewer whose research path develops alongside a discipline and, to a certain extent, influences its development. Erik Hollnagel belongs to this last group. His research on Human Factors and Safety has influenced Cognitive Psychology as well as Safety studies, where psychological and non-psychological disciplines are deeply interrelated.

The interdisciplinary character of Human Factors and Safety is discussed at the beginning of this paper to draw the context for the psychological contribution to the discipline. Both engineering and sociology required an understanding of human behaviour to ensure safety for industrial systems. Back in the 1960s, two different branches of Cognitive Psychology were available to understand and describe human behaviour. The first branch of research goes under the label of Information Processing, where the computer was the most exploited analogy for human cognition. In this trend it is possible to find most of the contributions Hollnagel made from the end of the 1970s until the end of the 1990s. Cognitive engineering, cognitive reliability and human error were the key terms and the main focus of interest. While deeply involved in this research context (Hollnagel et al. 1986; Hollnagel 1993, 1998), Hollnagel constantly questioned this approach and hinted at its limits (Hollnagel 1978, 2005).
In parallel, particularly in the Francophone and Italian tradition, a different branch of research was developed in cognitive and organisational ergonomics (de Terssac 1992; de Montmollin 1984, 1990; Leplat 1980, 1993; Oddone et al. 1981). Recognising the central role of the cognition–action linkage, of intentional behaviour and of the mental modelling of everyday interactions, this approach looked at humans as the flexible, adaptable component of industrial systems. Humans were seen as the essential component for the safe functioning of the system.

In the evolution of his personal approach, Hollnagel, among others, with Resilience Engineering (Hollnagel et al. 2006; Hollnagel et al. 2008a, b), clearly breaks with the Information Processing tradition. The pivotal concept of that tradition is somehow rejected: humans are no longer seen as the fallible components of industrial systems; they are rather considered as the key component of socio-technical systems. It is thanks to their ability to locally adjust their behaviour that safe functioning is ensured. It is probably in the ETTO book (Hollnagel 2009) that the original two psychological branches of Human Factors and Safety studies come into contact after more than 30 years of parallel development.

The positive effect of performance variability is highlighted towards the conclusion of this paper. The concept of competence is also suggested as a potential bridge between the two branches of research. Taking the role of competence into consideration can allow the development of a discipline open to asserting the crucial and positive role of humans in system safety.

Underlining Hollnagel's contribution to the Safety debate, this paper wishes to pay tribute to a person whose questioning attitude has contributed, and will continue to contribute, to improving the understanding of the nature of risk and safety.

2 Human factors and safety: an interdisciplinary domain

Work and research in Human Factors and Safety is a matter of multiple disciplines cooperating towards a common objective. Engineering and occupational sociology, together with cognitive psychology, are the three main research domains dealing with Safety and Human Factors. To effectively tackle industrial safety, contributions from the three disciplines must be brought together to understand and prevent risks.

After the Three Mile Island accident in 1979, the Human Factors discipline welcomed a reliability-based safety model from the engineering tradition. In the engineering domain, the predominant safety model was, and still is, based on technical reliability and functional stability, in the absence of deviations and disturbances. From this perspective, humans have traditionally been considered a highly variable and therefore low-reliability component, whose operational activities must comply as closely as possible with norms and procedures to ensure correct and satisfactory performance. In this framework, humans are considered as mere task executors, who do not generate any additional value or knowledge. For the engineering approach, risks due to the unreliability of human behaviour have to be controlled and constrained. Unfortunately for this approach, "Complete control […] does not exclude that an action can be incorrectly performed. There is an underlying (or residual) variability of human performance that cannot be eliminated." (Hollnagel 1998, p. 152).
From occupational sociology, strongly marked by Charles Perrow's theory of Normal Accidents (1984), the Safety debate inherits the awareness that risks cannot be entirely predicted and eliminated in complex and tightly coupled socio-technical systems. Residual risk is due to the emergence of undesired and unplanned interactions between system components. While shifting the potential for failure from humans to the system, Perrow highlights the direct role of management "in preventing failures—or causing them" (1984, p. 10). The nature of risk, in Perrow's words, goes far beyond the limits and boundaries of human cognition: "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognising the essentially political nature of risk assessment. Ultimately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few" (1984, p. 306).

For both the engineering and the sociological approach, uncertainty (human and/or organisational) cannot be excluded from the technical core of a system, and therefore risks cannot be completely eliminated. Systems where uncertainty is present are always exposed to failures (Thompson 1967). Cognitive psychology was therefore asked to focus on the characteristics and limits of the human factor. Cognitive psychologists contributed to the safety debate either by designing technologies consistent with human resources and limits, or by introducing barriers to reduce discretion and to mitigate risks due to unusual combinations of events (Cacciabue et al. 2000). Barriers aim to protect systems from the fallibility of humans and, conversely, to protect humans from the less-than-perfect predictability of system functioning.

To better understand the evolution of Erik Hollnagel's approach and to relate it to the approach of the present authors, it is worth recalling the beginnings of cognitive psychology. In our opinion, in the early sixties two research branches outlined different models of human cognition that resulted in different approaches to Human Factors and Safety. The first refers to studies, mainly developed by the Anglophone Human Factors community, that model humans as Information Processing Systems (Hollnagel and Woods 2005). The second branch, developed mainly in the Francophone and Italian ergonomics community, considered humans as adaptive systems that actively explore the environment and proactively adapt to it (de Terssac 1992; de Montmollin 1984, 1990; Leplat 1980, 1993; Oddone et al. 1981).

3 Two branches of research for cognitive psychology in human factors and safety

As is well known, in the Human Factors and Safety debate the Information Processing Systems (IPS) approach became dominant. The computer provided a powerful analogy, being the first technology capable of simulating the complexity of human cognition through a series of linear processes. The development of the IPS paradigm went along with the theoretical definition of structural limitations of human cognition. The paradigm assumed that fallibility is part of human cognition and is therefore unchangeable. The notion of Cognitive Reliability (Hollnagel 1998) and the underlying Contextual Control Model (Hollnagel 1993) were consistent with this approach. Human cognition was considered as a resource interacting with contextual factors, but defined, first of all, by a set of predetermined limitations and capabilities.
Consistently, the context was defined as a set of factors that influence, modify and often degrade normal cognitive performance.

Despite being involved in the development of this approach, Hollnagel questioned its assumptions very early:

One prominent feature of this linear paradigm is that it is a rather passive system, which only processes the information reaching it, another that information only passes through the system in one direction. Although there is experimental evidence for several components of the system, e.g. the neural detectors, the different types of memory, etc. it is incorrectly regarded as a whole. It is incorrect precisely because no system could survive in even a slightly complicated environment, if it was only allowed to respond passively to the information which reached it. (Hollnagel 1978, p. 198)

Hollnagel's scepticism about the linearity of human cognition is entirely shared by the cybernetic branch. The most appropriate technological analogy for this branch, parallel to the computer for IPS, could be the thermostat, being the first and simplest TOTE (Test-Operate-Test-Exit) unit capable of adaptive behaviour. To be adaptable, the thermostat requires an Image of the world. In Boulding's (1956) classification, the thermostat is the simplest system cyclically querying its colourless and soundless world to know the state of the single variable (i.e. the temperature) it is looking for.

The research path of the present authors develops from the cybernetic branch, with all its implications, which go far beyond the difference between linear and circular information processing: Miller's (1960) analysis of self-directed intentional behaviour and goal-oriented action; Neisser's (1976) priority of the schema with respect to perceptual exploration; Gibson's (1979) concept of vista as prospective perception from a personal point of view. On the basis of an anthropomorphic model of humans, to quote the social psychologists Harré and Secord (1972), the cybernetic Image of cognition describes humans as active explorers of the world, seeking some kinds of information while neglecting others, as if they were questioning their environment.

Explaining the concept of understanding through an analysis-by-synthesis model, Hollnagel (1978) notes that the hermeneutic circle represents a specific case of this general model, which is quite different from the Information Processing approach:

According to this model the process of understanding takes place by a reciprocal interaction between two processes, which are both normally unconscious. One process produces a guess or an expectation of what the meaning of the message could be. The other process tests this guess against the message in order to establish whether the guess and the message coincide to the extent that one can say the message is understood. All in all the process […] is identical to the structure of a TOTE, as described by Miller, Galanter and Pribram (1960). (1978, p. 202)

Further differences between the IPS and the cybernetic branches of cognitive psychology should be mentioned here. The first: IPS focused on information, the cybernetic branch on the concept of difference: "Information consists in differences producing a difference" (Bateson 1979, p. 135). In this view, perception is defined as checking a gap from an expected value rather than receiving information.
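This view of cognition as cyclically checking a gap against an expectation, the TOTE structure underlying both the thermostat analogy and the analysis-by-synthesis model above, can be rendered as a minimal feedback loop. The sketch below is purely illustrative and not part of the original paper or of Miller, Galanter and Pribram's formulation: the function names, the target value, the tolerance and the simulated room are assumptions introduced here.

```python
# Illustrative sketch of a TOTE (Test-Operate-Test-Exit) unit: a thermostat
# cyclically checking the gap between its internal "Image" (the target
# temperature) and the single variable it queries in the world.

def tote_thermostat(read_temperature, switch_heater, target=20.0, tolerance=0.5):
    """Run the Test-Operate-Test-Exit cycle until the discrepancy disappears."""
    while True:
        discrepancy = target - read_temperature()   # Test: is there a difference that makes a difference?
        if abs(discrepancy) <= tolerance:
            switch_heater(on=False)
            return                                   # Exit: desired and actual state coincide
        switch_heater(on=discrepancy > 0)            # Operate: act on the world, then Test again


class Room:
    """Minimal simulated environment: warms while the heater is on, cools otherwise."""
    def __init__(self, temperature=15.0):
        self.temperature = temperature
        self.heater_on = False

    def read(self):
        self.temperature += 0.3 if self.heater_on else -0.1
        return self.temperature

    def switch(self, on):
        self.heater_on = on


room = Room()
tote_thermostat(room.read, room.switch)
print(f"Settled near {room.temperature:.1f} degrees")
```

The only point of the sketch is that adaptive behaviour requires an expectation to test the world against; the particular technology, a thermostat here, is incidental.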
Perception is active exploration (Neisser 1976), anticipation and detection of incongruities between the desired and the actual state (Miller 1960). The second difference: as highlighted by Hollnagel (op. cit.), IPS is a passive system processing only the information reaching it at a given moment in time; for the cybernetic approach, self-regulation is a temporal process developing through the interaction of the organism with the external world: "The discrepancy between the target and the actual response decides the next response in much the same way as a golfer's error on the putting green decides his next putt" (Annett 1972, p. 10). This temporality is directed by the expectations and goals of a subject challenging an external world that offers resistance to his action: "Perception and cognitive activity usually are not simple transactions at the mental level, but transactions with the external world" (Neisser 1976, p. 35). The third difference: IPS was concerned with human reliability and human error, and variability was seen as a deviation growing larger and larger from the core of the system; the cybernetic approach sees variability as a constitutive feature of dynamic systems (Ashby 1960, p. 79).

In our opinion, by becoming dominant in the Human Factors and Safety debate, IPS marginalised several pivotal aspects that were independently developed in the same years by many Italian and Francophone ergonomists: anticipation and goal-directed action. The concept of anticipation (de Terssac 1992) refers to the cognitive activation humans experience while expecting an event to happen. Such activation takes place before any sensory information becomes available. This concept is hardly compatible with the Information Processing paradigm, which does not account for the active search for information or for the expectations used to control action, while it is quite consistent with Neisser's (1976) "picking up information" concept. In the ergonomic analysis of work, the concept of goal-directed behaviour is fundamental (Leplat 1997), as is that of humans as action-oriented systems. Events in the everyday world are considered as favourable or unfavourable, as positive occasions or obstacles. When an obstacle prevents a goal-directed system from achieving its objective, the system tries to remove the obstacle, to move around it, or to make the most of it (Minsky 1986).

Since at least 2004, Hollnagel's unease with the basic concepts of IPS has become more formalised. In the Barriers and Accident Prevention book (Hollnagel 2004), the notions of human error and human reliability are replaced by the concept of performance variability. The book presented the Functional Resonance Analysis Method as a non-linear method to perform accident analysis and safety assessment. The focus of the method is the variability of normal performance due to approximate adjustments.

4 Acknowledging positive effects of performance variability

The traditional approach of IPS, as previously mentioned, consisted in ensuring safety by designing and enforcing barriers to reduce the number of human errors and to mitigate their consequences, or, in other terms, to reduce discretion and variability.
Hollnagel somehow breaks with this approach when he recognises that discretion and variability are indeed sources of incidents and accidents, but that, under normal conditions, they are also the source of successes and of everyday functioning:

[…] failures represent the flip side of the adaptations necessary to cope with the real world complexity rather than a failure of normal system functions. Success depends on the ability of organisations, groups and individuals to anticipate risks and critical situations, to recognise them in time, and to take appropriate action; failure is due to the temporary or permanent absence of that ability […] (Hollnagel et al. 2008a, b)

Diverging from the Engineering approach and the Information Processing paradigm, Hollnagel's thinking, at this stage, conceptually converges with the ergonomic research based on the analysis of work activities, where variability has always been recognised as a work domain characteristic rather than as a human limit. The assumption is that standardised/linear