Sir Charles Haddon-Cave addresses attendees at the Australian Petroleum Production and Exploration Association Conference. Image courtesy APPEA.

FOURTEEN British service personnel died following a mid-air fire on board military aircraft RAF Nimrod XV230 on 2 September 2006, in what British High Court Judge Sir Charles Haddon-Cave said was “an accident waiting to happen.”

Speaking at the Australian Petroleum Production and Exploration Association Conference in April, Sir Charles said the aviation and oil and gas industries had much to learn from each other in regards to safety.

“The problems are common – the principles are the same,” he said.

“There are no new accidents – there are just lessons to be learned from the ones we have had.”

The accident was the biggest single loss of life of British service personnel in one incident since the Falklands War – but an independent review into the incident led by Sir Charles found the cause was not enemy fire.

Instead, the fire was caused by leaking fuel being ignited by an exposed hot cross-feed pipe in the lower fuselage of the aircraft – unreachable by personnel and not covered by the aircraft’s automatic fire suppression system.

It was, in Sir Charles’ words, “an engineering failure.”

A combination of poor initial design of the aircraft and modifications from the 1960s onwards, a history of fuel leaks from the 1970s and 1980s which did not raise concerns and increasing maintenance issues had all contributed to the disaster.

Major organisational change and cuts in funding at the Ministry of Defence, the outsourcing of the Nimrod Safety Case to civilian contractors and increasingly heavy use of the Nimrod aircraft in Kosovo, Afghanistan and Iraq in the late 1990s and early 2000s also contributed to the accident, he said.

“These sorts of major catastrophic accidents with a long gestation are, mercifully, rare; but they are a golden, once-in-a-generation, opportunity to learn deep and important lessons, if organisations are prepared to submit themselves to rigorous, objective examination and a real measure of soul-searching,” Sir Charles said.

“It is easy to blame the guy with the screwdriver or the joystick or the clipboard in his hand, but it is vitally important to examine the fundamental ‘organisational causes’ of accidents,” he added.

He drew parallels to NASA, which lost the space shuttle Columbia in 2003 – saying both had a “‘can do’ attitude and ‘perfect place’ culture”, while experiencing organisational turmoil, the imposition of ‘business’ principles and cuts in resources and manpower.

Companies could not take comfort in a strong safety record or the presence of a complex safety system, he said, adding that dissent in boardroom meetings was a valuable commodity.

While advocating a just and flexible culture that encouraged reporting and learning from information, Sir Charles said a culture of questioning when developing safety cases was also important.

“It is vital to ask questions such as ‘What if?’, ‘Why?’, ‘Can you explain?’, ‘Can you show me?’, ‘Can you prove it?’,” he said.

Sir Charles said a culture of ‘paper safety’ had developed.

“The safety case regime had developed severe shortcomings which included: bureaucratic length; obscure language; a failure to see the wood for the trees; archaeological documentary exercises; routine outsourcing to Industry; lack of vital operator input; disproportionality; ignoring of age issues; compliance-only exercises; audits of process only; and prior assumptions of safety and ‘shelf-ware’.”

Safety cases should be succinct, home-grown, accessible, proportionate, easy to understand and light on documents, he said.

In addition, he said, it was important that organisations not become too reliant on outsourcing but remain intelligent customers.

The key to any properly run organisation, he said, was accountability – having clearly identified and defined duty holders who knew what their responsibilities were and had the resources and support to perform them.

“Remember: ‘Accountability’ is the reciprocal of ‘Responsibility’. There can be no real or meaningful responsibility if it is not accompanied by the knowledge that that person will ultimately be held responsible,” he said.

Good, regular data collection and analysis were vital to safety, he added, as were highly valued engineers and strong engineering and technical skills within the companies involved.

“Only in this way can you analyse trends, patterns and hidden dangers. Hazard management should be pro-active not merely reactive,” he said.