Risk Assessment

Blown Away: Risk Management & Public Safety

As an avgeek, I love pictures and videos of aircraft coming into land low over beaches, roads and anything else that happens to be near the end of runways. But should we continue to accept injuries to and the death of people who congregate in these areas during aircraft operations?

Image credit: Richie Diesterheft

The $200K Kangaroo

I'm not a big fan of safety tropes. They are often repeated without much thought and, eventually, this repetition becomes detached from the concept the trope is trying to convey. For many tropes, there are few events serious enough to reinforce the point without being outright catastrophes.

The saying on my mind today is “if you think safety is expensive, try having an accident”. The “accident” I often think about is something big, something catastrophic and something that happens to other people. I rarely uttered this trope because I, personally, didn’t feel the power of it.

Now, thanks to a court case in Australia, I feel the power has been returned to this saying. We now have a non-catastrophic event with quantifiable costs associated with both the "safety" part and the "accident" part. Plus, I think nearly every airport safety professional out there can empathise with the operator in the case.

Seussian Safety Management

All three of my children have been brought into the world of reading partially through the works of Dr Seuss. I can't count the number of times I have read his books. As my kids have grown older, they have turned into the readers, reading these amazing books back to me.

The Bike Lesson is one of my favourites for the very nerdy reason that towards the end of the book The Berenstains provide us with a short & succinct definition of safety. It's three simple stanzas that I think encapsulate modern safety management perfectly.

Unnecessary Segregation or Pragmatic Isolation?

I've been out in the "real" world for the past six months or so and in that time, my thinking on risk management has changed a little bit. So here it comes, a confession... I have been using a PIG (a probability-impact graph, i.e. a risk matrix) recently and I feel its use has probably helped with effective management of overall risk.

BTIII: Assessing Uncertainty

I can't lie to you. I have been turning myself inside out trying to get a handle on risk evaluation in the aviation safety sphere for close to five years now and I still don't feel any closer to an answer. And I say "an" answer and not "the" answer. Since you are always assessing risk in terms of your objectives, there can and will be multiple approaches to assessing the risk of the same scenario depending on whether you are considering your safety, financial or legal objectives.

BTII: Control-freak*

As a follow-on to my first post on the Bow-Tie risk assessment method, I thought I'd concentrate on controls (or barriers or whatever else you would like to call them). This is, after all, where all the action happens. Risk controls are how we spend most of our time - they are the practical aspect of managing risk.

Lessons from Taleb's Black Swan

Having just finished reading Nassim Taleb's The Black Swan, I initially thought about writing a not-so-in-depth assessment of the book's positive and negative points - but I'm not much of a book reviewer and a comprehensive critique is probably beyond my capabilities (at this stage). So, instead I thought I would focus on just a couple of the book's significant concepts and explore how they may apply in the aviation context.

Background

The crux of the book, if it can be boiled down to a single paragraph, is that in this modern, complex world we are unable to predict the future when that future involves Black Swan events. Black Swans are those events previously thought extremely rare, if not impossible. The term comes from the standard assertion that all swans are white made prior to the discovery of black swans in Australia.

Taleb's specific definition of a Black Swan has three attributes: it lies outside of regular expectations, it carries an extreme impact and it is subject to post-hoc explanation, making it appear predictable.

This third attribute is the first significant talking point that I'd like to address.

Retrodiction

When humans look back at a past event, the tendency to create a narrative is strong. It helps us make sense of the world and assists with recall. But in doing so, especially in a complex world, we are likely to introduce a few errors and fall into a few bear-traps.

The big one is over-simplification. The complexity of the typical human's operating environment is growing. Even aviation, which was pretty complex to begin with, has become a close-coupled, global transport system practically unfathomable to the individual. In dealing with this complexity, people tend to identify a limited number of factors and over-attribute their causal influence. Often, this over-emphasis comes at the cost of environmental influences which are outside the control of the event's main players.

Taleb, coming from the world of finance, cites examples from that world but I couldn't help thinking of accident investigation while reading this. Generally, I felt rather positive about the aviation industry's approach to post-hoc analysis of aircraft accidents - a type of black swan event.

While the development of a narrative is typical, most accident investigation bodies do go beyond the basic "what happened in the cockpit" and look at the latent conditions which contributed to the operational environment. We have the widespread use of the Reason model to thank for this. Some accident investigation bodies, like the ATSB, shy away from the use of the word "cause" and instead opt for "contributory factor" or something similar. This is in recognition of the fact that direct causal relationships between identified precursors and the accident can rarely, if ever, be proven in a post-hoc investigation.

Prediction, Shmidiction

Taleb has a real problem with prediction and he puts up quite a few arguments against it. One of my favourites is the "nth billiard ball" - so let me butcher it for you.

The level of accuracy required to make predictions increases significantly with only small increases in system complexity.

For example, let's say you want to calculate the movement of billiard balls. The first couple of collisions aren't too much of a problem but it gets really complicated, very quickly. I won't profess to understand the maths behind these calculations but Michael Berry has apparently shown that:

  • in order to calculate the ninth collision, you need to include the gravitational pull of the man standing at the next table, and
  • in order to calculate the fifty-sixth collision, you need to consider every single particle in the universe in your calculation.

And this is a simple problem! Now consider the dynamic and socio-technical aspects of aviation to really make your head hurt.
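To make the point concrete, here is a minimal numerical sketch (in Python, with made-up figures) of how quickly the precision required for prediction runs away. The doubling-per-collision assumption is my own crude stand-in for Berry's analysis, not his actual result.

```python
# A crude numerical sketch: assume each collision roughly doubles any error in
# our knowledge of a ball's position (an illustrative assumption only).

initial_error_m = 1e-15   # hypothetical uncertainty in the cue ball's position (metres)
table_length_m = 2.0      # once the error exceeds the table, prediction is meaningless

error = initial_error_m
for collision in range(1, 100):
    error *= 2            # assumed error growth per collision
    if error > table_length_m:
        print(f"Prediction breaks down after about {collision} collisions")
        break
```

Even starting from an absurdly precise initial measurement, the useful prediction horizon is only a few dozen collisions - and that is before anyone walks past the table.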

Scalability

The third significant concept I wanted to touch on was scalability. I'll probably also murder this nuanced concept like those above, but here goes.

Something is scalable when the scale of the outcome is not limited by the nature of the act.

The concept was introduced to Taleb in terms of employment so let's start there. A non-scalable job is one where you are paid by the hour or according to some other unit of work. For example, a barber gets paid per haircut. There is no way for him or her to be paid more than the physical limits of performing the required service allow. A scalable job is one where pay is not directly linked to the unit of work performed. In this case, consider an author: he or she writes a book and may receive $1 in return, or may make $1,000,000.
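A tiny sketch of that contrast, using entirely invented numbers - the point is the shape of the payoff, not the values:

```python
import random

def barber_income(haircuts_per_day: int, rate_per_cut: float = 30.0) -> float:
    """Non-scalable: pay is tied to units of work, so it is capped by the working day."""
    return haircuts_per_day * rate_per_cut

def author_income() -> float:
    """Scalable: the return is decoupled from the unit of work (one book, wildly
    different outcomes). A made-up heavy-tailed draw, not real royalty data."""
    return random.choice([1.0] * 95 + [10_000.0] * 4 + [1_000_000.0])

print(barber_income(10))                      # always 300.0
print([author_income() for _ in range(5)])    # mostly $1, occasionally a best-seller
```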

It took me a while but I started to see aviation accident contributory factors in the same light. Some acts, errors, mistakes, etc. will only impact on the single activity being undertaken at the time - a pilot forgetting to put the landing gear down will only contribute to his or her own accident. But others may have a scalable impact and could contribute to many - a poor policy decision relating to training may result in all crew carrying the same deficient knowledge, which in the right circumstances, could contribute to many accidents.

Pulling it Together

Taleb brings together these and numerous other concepts and outlines his approach to financial investment - he calls it the Barbell Strategy. In recognising the problems with predicting outcomes in complex, dynamic socio-technical systems, he takes both a hyper-conservative and a hyper-aggressive approach. He invests significantly in low-risk investments and then places numerous small bets on extremely speculative opportunities that carry a significant pay-off - he tries to catch as many positive black swan events as possible while minimising his exposure to negative ones.
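As a rough illustration only (the split and the figures are my assumptions, not Taleb's), the shape of a barbell allocation looks something like this:

```python
def barbell_allocation(budget: float, safe_fraction: float = 0.9, n_bets: int = 20):
    """Put most of the budget somewhere very safe, then spread the rest as many
    small, speculative bets with large potential pay-offs. Hypothetical numbers."""
    safe = budget * safe_fraction
    per_bet = budget * (1.0 - safe_fraction) / n_bets
    return safe, [per_bet] * n_bets

safe, bets = barbell_allocation(100_000)
print(safe)                # 90000.0 in the hyper-conservative bucket
print(len(bets), bets[0])  # 20 small speculative bets of 500.0 each
```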

So what's our Barbell Strategy for aviation safety?

We need to invest in things that we know are closely related to bad stuff happening - say, runway safety, CFIT, etc. - and we need to invest in things that can have a scalable impact on safety - e.g. poor training standards, inappropriate regulations, etc.

How much we should invest in each is an open question but the basic concept sounded pretty good to me. Actually, it almost sounded familiar...

Confirmation Bias? You Betcha!

The more I thought about Taleb's strategy in the aviation safety context, the more I thought it sounded a lot like scoring risk according to proximity and pathways. My still-incomplete concept of risk evaluation sought to identify more critical risk conditions according to either their proximity to the ultimate outcome of death and destruction or the number of pathways by which the risk condition could result in catastrophe.

Proximity applies to those non-scalable conditions that contribute to accident occurrence and ranks them higher the closer they are to that ultimate outcome. This avoids those nasty prediction problems Taleb keeps talking about. Pathways considers the scalable conditions that may contribute to accident occurrence but where prediction of direct causal relationships is impossible. Instead, you simply consider the number of potential contributory pathways as a measure of criticality.
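To show what I mean, here is a very rough sketch of that scoring idea. The field names, the weights and the way the two scores are combined are all illustrative assumptions of mine, not a settled method.

```python
from dataclasses import dataclass

@dataclass
class RiskCondition:
    name: str
    steps_from_accident: int   # proximity: how far removed from the ultimate outcome
    pathways: int              # scalability: how many distinct accident pathways it can feed

def criticality(rc: RiskCondition) -> float:
    proximity_score = 1.0 / (1 + rc.steps_from_accident)   # closer -> higher
    return proximity_score * rc.pathways                   # assumed way of combining the two

conditions = [
    RiskCondition("gear-up landing error", steps_from_accident=1, pathways=1),
    RiskCondition("deficient type-rating syllabus", steps_from_accident=4, pathways=30),
]
for rc in sorted(conditions, key=criticality, reverse=True):
    print(f"{rc.name}: {criticality(rc):.2f}")
```

Under these made-up numbers the scalable, systemic condition outranks the proximate one-off error, which is roughly the behaviour I'm after.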

I have a few threads of thought coming together at the moment in this area. I'm excited to find out how they all tie together and whether I can get them out of my head and on to this blog.

BTI: Dressing up for Risk Assessments

I've been doing a lot of pondering on the Bow-Tie method of risk assessment for a project at work. Bow-Tie is a tool used by many, especially in the oil & gas industry, to create a picture of risk surrounding a central event. It's got a few positives and a few negatives but these can be overcome if you understand the limitations of the model being used.
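For readers new to the method, a minimal sketch of the usual building blocks (threats and preventive controls on one side, consequences and recovery controls on the other, a top event in the middle) might look like this - the names and the example content are mine, not from any particular Bow-Tie tool:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Control:
    name: str

@dataclass
class Threat:
    name: str
    preventive_controls: List[Control] = field(default_factory=list)

@dataclass
class Consequence:
    name: str
    recovery_controls: List[Control] = field(default_factory=list)

@dataclass
class BowTie:
    top_event: str
    threats: List[Threat] = field(default_factory=list)
    consequences: List[Consequence] = field(default_factory=list)

# A made-up example for illustration only.
runway_excursion = BowTie(
    top_event="Aircraft departs the runway surface",
    threats=[Threat("Unstable approach", [Control("Stabilised approach criteria")])],
    consequences=[Consequence("Aircraft damage", [Control("Runway end safety area")])],
)
print(runway_excursion.top_event)
```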