Integrating Cultural and Regulatory Factors in the Bowtie: Moving from Hand-Waving to Rigor

Chapter in: Ontology Modeling in Physical Asset Integrity Management

Abstract

Recent analyses of major incidents, such as BP’s Texas City and Macondo disasters and the loss of the space shuttle Columbia, have moved from considering immediate factors and basic organizational failings to including cultural issues. Culture is, however, even more difficult to incorporate into incident investigations and analyses than are organizational factors. This chapter provides a structured approach to analyzing individual, organizational and cultural/regulatory factors based upon the bowtie methodology, using well-defined rules to distinguish three levels of causation.

Level 1 (L1) analysis describes barriers implemented at the individual, team and immediate hardware level; Level 2 (L2) describes the organizational factors that support L1 barriers; Level 3 (L3) describes the cultural and regulatory environment that ensures the organization implements L2 factors, thereby securing the integrity of L1 barriers. Incident investigation and analysis can then be performed as a structured search for failed barriers at any or all of these three levels. Both organizational failures (L2) and problems with the safety culture and regulatory environment (L3) can be identified reliably and rigorously, rather than left to the intuitions of investigators.
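
To make the level structure concrete, the sketch below models barriers as nodes supported by barriers one level up, so that the "structured search for failed barriers" becomes a mechanical walk up the support chain. This is our illustration, not code from the chapter; all class, field and barrier names are hypothetical.

```python
# Minimal sketch of the three-level barrier model (illustrative names only).
from dataclasses import dataclass, field


@dataclass
class Barrier:
    name: str
    level: int            # 1 = individual/team/hardware, 2 = organizational, 3 = cultural/regulatory
    failed: bool = False
    # Barriers one level up that sustain this one (hypothetical structure).
    supported_by: list["Barrier"] = field(default_factory=list)


def failed_barriers(barrier: Barrier) -> list["Barrier"]:
    """Structured search: collect failures from an L1 barrier upward
    through the L2 and L3 barriers that support it."""
    found = [barrier] if barrier.failed else []
    for support in barrier.supported_by:
        found.extend(failed_barriers(support))
    return found


# Hypothetical example: a failed frontline check traced to its supports.
audits = Barrier("Regulator audits competence assurance", level=3, failed=True)
training = Barrier("Competence assurance programme", level=2, failed=True,
                   supported_by=[audits])
check = Barrier("Operator verifies permit to work", level=1, failed=True,
                supported_by=[training])

for b in failed_barriers(check):
    print(f"L{b.level}: {b.name}")
```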

This approach supports safety managers in several ways. First, it enables them to identify and rank safety-critical controls as those that manage the largest number of threats to integrity, which allows a risk-based approach to auditing. Second, it permits a rigorous definition of common mode failure, in terms of the number of shared barriers at a higher level of analysis. Third, the detailed bowtie supports the investigation process by providing hypotheses about which controls have failed, including organizational, cultural and regulatory factors.
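
The first two uses just listed reduce to simple counting over the bowtie, as the sketch below shows: controls are ranked by how many threats they manage, and common mode failure is flagged wherever distinct controls share a supporting barrier at a higher level. The bowtie fragment and all names are hypothetical illustrations, not data from the chapter.

```python
# Minimal sketch: risk-based audit ranking and common mode detection
# over a hypothetical bowtie fragment.
from collections import defaultdict

# Threat -> the L1 controls that manage it (illustrative data).
threats = {
    "corrosion": ["inspection programme", "coating system"],
    "overpressure": ["relief valve", "inspection programme"],
    "impact damage": ["exclusion zone", "inspection programme"],
}

# L1 control -> the L2 organizational barriers that support it.
supports = {
    "inspection programme": ["maintenance planning"],
    "coating system": ["maintenance planning"],
    "relief valve": ["maintenance planning"],
    "exclusion zone": ["site access procedure"],
}

# Rank controls by the number of threats they manage: the top entries
# are the most safety-critical, so they are audited first.
managed = defaultdict(int)
for controls in threats.values():
    for control in controls:
        managed[control] += 1
for control, n in sorted(managed.items(), key=lambda kv: -kv[1]):
    print(f"{control}: manages {n} threat(s)")

# Common mode failure: distinct L1 controls that share an L2 support
# can all be defeated by a single failure of that support.
shared = defaultdict(set)
for control, l2_barriers in supports.items():
    for l2 in l2_barriers:
        shared[l2].add(control)
for l2, controls in shared.items():
    if len(controls) > 1:
        print(f"Common mode via '{l2}': {sorted(controls)}")
```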

Notes

  1.

    We are indebted to our colleague Dr. Rob Lee (formerly Director of the Australian Bureau of Air Safety Investigation and later Director of Human Factors, Systems Safety and Communications at the Australian Transport Safety Bureau) for originally pointing out that it is not training that causes accidents but what training was supposed to deliver; it is not the procedure, but what the procedure was meant to ensure happened.

  2.

    Financial specialists were supposed to possess expertise in risk, but recent events have cast some doubt on this; they are also poorly versed in the types of operational risk on which this chapter concentrates.

  3.

    With current data it is one of the world’s safest airliners, along with the Boeing 787 and the Airbus A350, but this performance can only be seen as an estimate, not a fact. Concorde had this reputation until the fatal crash at Paris, but it flew so few hours that any assessment was always going to be a rough estimate.

  4.

    The term Top Event can confuse users. A term we prefer is Undesired System State (USS), which may capture the situation of heading towards disaster more effectively, without requiring that there be an event as such.

  5.

    In our view it is inexcusable to reject risk analyses that capture the realities of the risk on the grounds that the analysis looks complicated. Managers who demand a simple bulleted PowerPoint slide are often abdicating their responsibility, a point we return to when considering accountabilities and the requirements placed on frontline individuals. Such abdication could also look negligent, possibly even grossly negligent, before a court, if what they accepted was markedly simpler than the complexity of the risks placed before them.

  6.

    This has happened several times recently; in one case engineers omitted the oil seals on both engines of a Boeing 737, resulting in an emergency landing just before both engines failed from lack of oil.

  7.

    We are grateful to Andrew Hopkins for reminding us of the similarity to Rasmussen’s Accimap methodology (Rasmussen and Svedung 2000). Accimaps are reactive, being constructed after an incident, but this chapter implies that a related proactive approach to Accimaps could be attempted.

  8.

    The classic old-fashioned auditor’s trick was to inspect the toilets first and then continue on the basis of what was found there. Experienced auditees soon learned to clean the toilets rather than fix the real issues.

References

  • Baker J et al (2007) The report of the BP U.S. refineries independent safety review panel. http://www.safetyreviewpanel.com/. 30 Jan 2007

  • Chief Counsel (2011) Chief Counsel for the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling. Report, Washington, DC

  • Chemical Safety Board (2007) Investigation report: refinery explosion and fire. U.S. Chemical Safety and Hazard Investigation Board, Report No. 2005-04-I-TX, Mar 2007

  • Cullen D The Lord (1990) The public inquiry into the Piper Alpha disaster. Her Majesty’s Stationery Office, London

  • De Crespigny R (2012) QF32. Macmillan, Sydney

  • Dekker S (2011) Drift into failure: from hunting broken components to understanding complex systems. Ashgate, Farnham

  • Haddon-Cave C (2009) The Nimrod review. Her Majesty’s Stationery Office, London

  • Hopkins A (2000) Lessons from Longford. CCH Australia, Sydney

  • Hopkins A (2008) Failure to learn: the BP Texas City refinery disaster. CCH Australia, Sydney

  • Hopkins A (2012) Disastrous decisions: the human and organizational causes of the Gulf of Mexico blowout. CCH Australia, Sydney

  • Hudson PTW (1991) Prevention of accidents involving hazardous substances: the role of the human factor in plant operation. Discussion document prepared for the OECD Workshop, Tokyo, 22–26 April

  • Hudson PTW (2010) Safety science: it’s not rocket science, it’s much harder. Inaugural address, Delft University of Technology, 24 Sept 2010

  • Hudson PTW (2014) Accident causation models, management and the law. J Risk Res 17:749–764

  • Hudson PTW, Parker D (2000) Profiling safety culture: the OGP interview study. Report for International Association of Oil and Gas Producers (OGP), London

  • Mogford J et al (2005) Fatal accident investigation report: isomerization unit explosion, final report, Texas City, Texas, USA, 9 Dec 2005

  • NASA (2003) Columbia Accident Investigation Board report. NASA, Houston, Aug 2003

  • Newton L et al (2008) The Buncefield incident 11 December 2005: the final report of the Major Incident Investigation Board. Crown Publication, London

  • Rasmussen J, Svedung I (2000) Proactive risk management in a dynamic society. Swedish Rescue Services Agency, Karlstad

  • Reason JT (1990) Human error. Cambridge University Press, Cambridge

  • Reason JT (1997) Managing the risks of organizational accidents. Ashgate, Farnham

  • Rogers (1986) Report of the Presidential Commission on the space shuttle Challenger accident. NASA, Houston

  • Shappell S, Wiegmann D (1997) A reliability analysis of the taxonomy of unsafe operations. Aviat Space Environ Med 68:620

  • Shappell SA, Wiegmann DA (2001) Applying reason: the human factors analysis and classification system. Hum Factors Aerosp Saf 1:59–86

  • Slovic P (1999) Trust, emotion, sex, politics and science: surveying the risk-assessment battlefield. Risk Anal 19(4):689–701

  • Swain AD, Guttman HE (1983) Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278, Washington, DC

  • Vaughan D (1996) The Challenger launch decision: risky technology, culture, and deviance at NASA. University of Chicago Press, Chicago

  • Westrum R (1988) Organizational and inter-organizational thought. In: World Bank workshop on safety control and risk management, Washington, DC, 16–18 October

  • Westrum R (1991) Cultures with requisite imagination. In: Wise J, Stager P, Hopkin J (eds) Verification and validation in complex man–machine systems. Springer, New York

  • Westrum R (1996) Human factors experts beginning to focus on organizational factors in safety. ICAO J 51:6–8

  • Wiegmann DA, Shappell SA (2003) A human error approach to aviation accident analysis: the human factors analysis and classification system. Ashgate, Aldershot

  • Woods DD, Dekker S, Cook R, Johannesen L, Sarter N (2010) Behind human error, 2nd edn. Ashgate, Farnham

Author information

Correspondence to Patrick Hudson.

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Hudson, P., Hudson, T. (2015). Integrating Cultural and Regulatory Factors in the Bowtie: Moving from Hand-Waving to Rigor. In: Ebrahimipour, V., Yacout, S. (eds) Ontology Modeling in Physical Asset Integrity Management. Springer, Cham. https://doi.org/10.1007/978-3-319-15326-1_6

  • DOI: https://doi.org/10.1007/978-3-319-15326-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-15325-4

  • Online ISBN: 978-3-319-15326-1

  • eBook Packages: Engineering, Engineering (R0)
