Crowd-sourced Certifications

I've just been mucking around with a new Internet service called Smarterer. That's not a typo; it really is Smarter-er. I guess, in a nutshell, it's an online quiz creator which is meant to help you quantify and showcase your skills. The twist in this implementation is that the quizzes are crowd-sourced. Anyone can write questions for the quiz, and so, over time, the group interested in the topic defines the content and the grading of the quiz.

There's a whole pile of things going on under the hood that I haven't gotten into but it does look interesting.

The fruit of my tinkering was that I kicked off an Aviation Safety Management System quiz. It has 20 questions to begin with and is based on ICAO's Safety Management Manual. There's nothing too obscure in the questions but I would love to see the test grow - the only downside is that I can't take the test!

Anyway, check it out at http://smarterer.com/test/aviation-safety-management-systems and let me know what you think.

SMS Considered

While in Bali talking Runway Safety with a wide range of industry personalities, I found myself at the hotel bar talking SMS with Bill Voss from Flight Safety Foundation. The topic was obviously on Bill's mind because upon my return, I found his latest president's piece in FSF's AeroSafety World to be a good overview of his main SMS points. Some of these points have been on my mind too. Since I'm not one to recreate the wheel (providing it works and is fit for purpose), I'll use some of Bill's well-formed words to kick this off.

Guidance Material

Back when the international standards for SMS were signed out at ICAO, we all knew we were going to launch a new industry full of consultants. We also knew that all these consultants couldn’t possibly know much about the subject and would be forced to regurgitate the ICAO guidance material that was being put out.

The title of the piece is SMS Reconsidered but I'm a little bit more critical of how SMS has been implemented in some places and would argue it was never really considered in the first place. The "regurgitation" of guidance material has been a big problem.

ICAO guidance material touting the "four pillars" was, as I saw it anyway, what the title suggested - guidance material. Operators were meant to consider the material and apply it within their operational context, corporate structure and organisational culture. The level of complexity within the operator, the existing systems in place and the attitudes of everyone involved were/are meant to be considered and a tailored SMS developed.

The reasons behind the current state of SMS are many, varied and probably not worth going over. It is more important to get the concept back on track. That's a big task and bigger than this little blog post. Instead, I wanted to discuss Bill's "four audit questions".

Levels Revisited

Bill's piece outlines four seemingly simple questions designed to test the operation of an SMS:

1. What is most likely to be the cause of your next accident or serious incident?
2. How do you know that?
3. What are you doing about it?
4. Is it working?

When posted on the FSF discussion forum on LinkedIn1, a fifth question (taken from the text) was added:

5. Can you show in the budget process where resources have been re-allocated to manage risk?

Interestingly, it was initially assumed that these were questions posed to the safety manager or some other safety professional as part of a discussion between like-minded professionals. However, later comments did swing around to my initial understanding that they could be asked of anyone within the organisation.

In fact, they should be asked of multiple people at different levels of the organisation.

A couple of weeks ago, I discussed the need to find the right solution at the right level and noted that the same tools may not be appropriate at every level.

When thinking about SMS as a whole, there are countless ways to implement one, but every implementation must permeate all levels of the organisation, with systems, processes and tools suited to the needs of each level and communication channels between the levels.

Bill's five questions, being agnostic to any specific SMS approach, can be applied to every level of the organisation. They should be asked of the safety manager, the operations manager, the training manager, the maintenance manager, the line supervisor and, probably most importantly, the CEO.

They aren't the only questions which need to be asked, but they are a good starting and ending point. Having all the "bits" of an SMS is required from a regulatory point of view but system effectiveness is vital to maintaining an ongoing level of assurance in an operator's ability to manage safety.

Pearls

I've audited or reviewed quite a few SMSs - only a few showed any real consideration of the SMS concept and were tailored to suit the operator's needs. These were often the better-performing systems, and they bore little resemblance to the "four pillars".

At the Bali conference, I spied the completely different approach taken by Bombardier. It was mentioned a number of times that the material is copyrighted, so I haven't included a picture here, but you can find a presentation outlining their approach on the Transport Canada website. I can't comment on the effectiveness of the system but it is definitely food for thought and a ray of hope that the SMS concept is being considered, digested, pondered, manipulated, tailored, and so on.

1. It's a closed group, so I'm not sure who is able to see the discussion.

Logical Fallacies in the Safety Sphere

Sometimes I feel like I really missed out by not receiving a "classical" education. While I can probably live without the Latin and Greek philosophy, one area I've been keen to pick up is formal logic. The forming of a coherent and valid argument is a key skill which is, in my opinion, overlooked in safety management. That's disappointing, since making such an argument is at the heart of making a safety case.

I'm not going to tackle the subject of logic today. To be honest, I don't know enough about the overall concept. Instead, I'm going to focus on the typical failings present in a logical argument - the logical fallacies.

A logical fallacy is essentially an error in reasoning leading to an invalid argument.

Firstly, it is funny that most definitions I saw on the web described fallacies as "errors" - a term which, in aviation safety circles, carries a particular meaning regarding intent. I just want to be clear that fallacies are not restricted to unintentional errors - they can be made deliberately.

More importantly, I should define a valid argument.

A valid argument is one in which the truth of the conclusion flows from the truth of the premises.
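To make that definition a touch more concrete: for simple propositional arguments, validity can be checked mechanically - if no combination of truth values makes every premise true and the conclusion false, the argument is valid. Here's a minimal, purely illustrative sketch (my own toy example, not taken from any of the material discussed here) comparing a valid form, modus ponens, with the classic fallacy of affirming the consequent:

    from itertools import product

    def implies(p, q):
        """Material implication: 'p implies q' is false only when p is true and q is false."""
        return (not p) or q

    def is_valid(premises, conclusion):
        """Valid means: in every case where all premises are true, the conclusion is also true."""
        for p, q in product([True, False], repeat=2):
            if all(prem(p, q) for prem in premises) and not conclusion(p, q):
                return False  # counterexample found: premises true, conclusion false
        return True

    # Modus ponens: from P and P -> Q, infer Q (valid)
    print(is_valid([lambda p, q: p, lambda p, q: implies(p, q)], lambda p, q: q))  # True

    # Affirming the consequent: from Q and P -> Q, infer P (a fallacy)
    print(is_valid([lambda p, q: q, lambda p, q: implies(p, q)], lambda p, q: p))  # False

The second argument fails because Q can be true by some other route entirely - which is exactly the trap a persuasive but invalid safety argument sets.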

Now, there are a lot of specific types of fallacies. So many, in fact, that people have even developed taxonomies of them. Recently, I found a good primer in this area thanks to a team from Virginia.

But I've got a bit of a problem with one aspect of this paper. The authors seem to have a higher opinion of safety professionals than I do. These are some of the offending sentences:

We assumed that safety arguments do not contain emotional appeals for their acceptance or willful attempts at deception.

For example, wishful thinking was excluded because it concerns arguments in which a claim is asserted to be true on the basis of a personal desire or vested interest in it being true. Such an argument is unlikely to appear explicitly in a safety argument.

That second one really grates on my nerves. Safety tends to cost money and money is the most basic "vested interest".

I have sat through quite a few presentations on aviation safety that have deliberately pulled on the heart-strings to promote their agenda. This is a type of fallacy known as an emotional appeal.

Under the emotional appeal category, there are a few different types. Each is based on a different emotion - fear, envy, hatred, etc. But it is probably the appeal to pity (or the argumentum ad misericordiam) that I've seen the most. Here is a run-through of the most vivid of my encounters - de-identified, of course.

This presentation was on a certain type of approach to operational safety. I'll at least say that it wasn't SMS, but let's leave it at that. The majority of the presentation was, I assume, a fairly accurate outline of this approach and how it was to be applied in the presenter's operational environment.

What I had a problem with was the introduction and the regular references back to what I considered a grossly inappropriate emotional appeal made at the start. The commentary came on top of a series of personal photos, backed by a lamenting ballad, and outlined the heart-wrenching plight of "Jane".

Jane was happily married for a few short years...was the centre of her husband's world...had recently welcomed her first child into the world...until one day her world was torn apart by an aviation tragedy which claimed the life of her husband...

I'm a generally emotional guy and this story got to me. I'm passionate about safety and on some level, I want to minimise the number of "Janes" out there.

But her story, and the thousands like it, had absolutely no bearing on the case put forward in the rest of the presentation. In fact, I felt it detracted from the substance of the information presented. After overcoming my tears and quivering chin, I probably bounced back into a super-critical stance as a reaction to the manipulation which had just occurred.

It is very tempting to employ cheap tricks such as these in an effort to increase the impact of one's safety case. But in the long run, it will only hurt it, either by casting doubt on the truth of your conclusion or by turning people against the argument regardless of its overall validity.

I might be getting a little bit more philosophical in the coming months as Mr Dekker and Mr Taleb continue to blow my mind with just culture, complexity, randomness and the black swan - more to come.

Levels. Levels? Yeah...

Seinfeld fans may remember this short exchange. Kramer might have been on to something, and it had nothing to do with interior design. In my research and work, I've been butting up against a few theoretical roadblocks, but I am starting to think that these roadblocks are actually different levels. Internet guru1 Merlin Mann often observes that people need to solve the right problem at the right level - and that, I suspect, is exactly what I need to do.

Identifying the different levels has been my task of late, and it is a task in need of completion.

This is where I'm at so far...

I was initially running with a military-style strategic/operational/tactical taxonomy - strategic being the highest level, involving long-term, executive-level decisions, through to tactical at the other end, covering frontline, troop-level decisions.

But these terms come loaded, so I've been looking elsewhere, although I don't think there are any terms left which don't carry some form of baggage.

So I've started down this road:

  • Executive - the highest level; involving the executive oversight or governance of the organisation; typically strategic although may be concerned with lower level issues from time to time.
  • Management - obviously, somewhere between the executive and the shopfront; probably characterised best as the level where enabling work gets done - things like personnel management, information management or hardware management.2
  • Operations - the real do-ers; practical actions taken in the extremely dynamic, real world.

I've been visualising this arrangement as something like this:

Different Levels

So what does this mean?

I believe the point of recognising the existence of the different levels is to accept that within each level, different objectives exist. As such, different tools and techniques may be required.

In thinking about this problem, I realised I had posted something related to this before. In that post, I used different risk evaluation techniques at the different levels. While the overall risk management process should be consistent across all levels, the details differ because the objectives, contexts and decisions differ.

At the highest, executive level, the context was more about assurance, with the decision being whether to accept the determined level of risk or to do more. As the risk picture changed, the executive decided to do more and directed the management level to produce a plan. At that level, the risk evaluation methodology was quite different and quite tailored to the wildlife management context and the set of decisions required there - what to do about the various bird species.

Different Levels of Risk Assessments

I hinted at a third level of risk management but, to be honest, I haven't really seen that level employed in the real world in this context. OHS practitioners would be familiar with Job Safety Analyses (JSAs), a very operations-level activity similar to what I was thinking of here.

I guess the moral of this rather rambling post is that I am becoming more and more convinced that an all-encompassing "enterprise risk management system" is not a simple case of having the same small set of tools for all levels. Instead, you need a framework that recognises the different levels (the different contexts, objectives and decisions) and creates linkages between these levels. My immature thoughts at this stage centre around the decisions and their resulting actions being those connections.

For example, the risk management being carried out at the lowest level may itself be a risk control measure for the next level up and so on. This becomes a bit circular but we might as well accept that it's turtles all the way down, people!
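To make that linkage idea a little more tangible, here's a rough, hypothetical sketch - all names and structure are mine, not any established framework - in which the risk management activity at one level gets recorded as a control at the level above:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class LevelAssessment:
        """A risk assessment belonging to one organisational level (illustrative only)."""
        level: str                                      # e.g. "Operations", "Management", "Executive"
        decision_context: str                           # the kind of decision made at this level
        controls: List[str] = field(default_factory=list)
        reports_to: Optional["LevelAssessment"] = None  # the assessment one level up

        def escalate(self, summary: str) -> None:
            """Record this level's risk management activity as a control one level up."""
            if self.reports_to is not None:
                self.reports_to.controls.append(f"{self.level}: {summary}")

    # Hypothetical wildlife-hazard example
    executive = LevelAssessment("Executive", "accept residual risk or direct further action")
    management = LevelAssessment("Management", "plan and resource species-specific measures",
                                 reports_to=executive)
    operations = LevelAssessment("Operations", "day-to-day inspection and dispersal",
                                 reports_to=management)

    operations.escalate("daily runway patrols and dispersal in place")
    management.escalate("wildlife hazard management plan, reviewed quarterly")
    print(executive.controls)  # ['Management: wildlife hazard management plan, reviewed quarterly']

The point isn't the code, of course - it's that each level holds its own context and decisions, while what flows upward is a summary that the level above can treat as one of its controls.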

There may be more to come on this one, but right now, it's bedtime!

1. He would so hate that title ;)

2. Safety management? I'm not too sure. I've been pondering this lately as well and when that thought is half-finished, I'll post it here too.

Safety Hero: Roger Boisjoly

It's a slice of history every safety professional should know - the night before the Challenger disaster, engineers at NASA contractor Morton Thiokol recommended that the launch not proceed. They believed that at the low temperatures being experienced at and forecast for the launch site, booster rocket O-ring performance would be severely degraded, and they argued that this could (and did) lead to disaster. Dissecting what happened is important from many perspectives. As the scenario played out, there were groupthink, political influences, confirmation biases and inappropriate interpretation of a lack of data (the absence of evidence, etc.). The list goes on.

The lesson today? Courage.

This morning I wouldn't have been able to name any of the engineers who tried to stop the Challenger launch. I knew of them and have seen, a number of times, their part played out in this reconstruction.

One of them was Roger Boisjoly and yesterday he passed away.

Thanks to Twitter and the blogosphere, I've had a chance to read up on him and remind myself of the man who tried to make a difference. Reading about the impact the event had on him was, frankly, depressing. Being able to say "I told you so" isn't a reward; it is not even a solace.

However, I hope that if I am ever in that kind of situation, I show the same courage he did.

Unhappy? No, but...

After a short hiatus and a new job, I've decided to start blogging again, this time on topics related to that new role. My old blog was slanted towards airport safety but, with my shift into a more general and strategic role, I thought I'd shift the blogging to a new home and recommence putting my thoughts out there. The name of this blog comes from a chapter in James Reason's 1997 book, Managing the Risks of Organisational Accidents. In it, Professor Reason provided a number of reasons why the regulator's lot is an unhappy one.

There are quite a few issues brought up in the chapter, one of the main negatives being that the regulator is unlikely to receive any accolades for "bringing about a non-event" but is sure to "be judged by those with 20:20 hindsight as making significant contributions to a major disaster".

There is a positive take-home message for those of us silly enough to be a regulator - Professor Reason thinks that we "are potentially one of the most important defences against organisational accidents".

That got me thinking about "that" graph you see in most safety-related presentations. You know, this one1:

"that" graph

In an effort to get ahead of the curve, I often turn my mind to trying to identify the next paradigm shift in accident prevention. And being fairly egotistical, I'm convinced that the next step resides with the regulator at the industry level. Before you judge me too harshly, there's a trend behind my conceit.

At first, attention was focussed on the fundamental unit of aviation - the aircraft. Then, as the ROI on technological advances slowed, attention shifted to the pilot, or the human factor. Then we looked into interactions between pilots with the advent of cockpit resource management, which morphed into crew resource management when it started looking at cabin crew and then ground crew. At the moment, a lot of work is going into company-level interactions - safety management systems and culture.

But it won't stop there; it can't. Traffic will continue to increase and, with it, the number of accidents2. The public, travelling or not, will demand safety be better and tickets be cheaper.

So, I think the next frontier will be at that regulatory level. It will involve national and international authorities, and it's going to involve a lot of reform and a lot of change. It's going to be a hard slog, but we've asked those in the industry to rethink safety on a number of occasions. It's not too much to ask ourselves to question the impact of our actions on safety at the frontline and to make changes where appropriate.

I'll be posting my thoughts on topics relating to aviation safety regulation here, when I can. Topics I have in the hopper include risk assessment matrices, compliance v safety and safety management systems at the smaller end of the industry. Here we go...

1. I've got quite a few gripes with this graph, I hope to address them in a future post.
2. Actually, some believe we're overdue for an increase in the accident rate.