Human factors training courses – May 2013

March 25, 2013

The EI is hosting two human factors training courses in May 2013, to be delivered by Bill Gall.

Bill is a member of the EI’s Human and Organisational Factors Committee and a Chartered Psychologist.  He is the author of the EI’s new Human factors briefing notes and Guidance on investigating and analysing human and organisational factors aspects of incidents and accidents.  By his own admission, Bill’s personal goal is ‘to eradicate the terms ergonomics and human factors and convince management that the principles involved are part of good management practice.’  Attendees of the courses should therefore leave equipped with good management processes to share within the workplace.

The courses:

Basic human factors – 7 May 2013
This one day training course provides an essential overview of the role of human and organisational factors in the energy sector and allied industries.  The course will examine the environmental, organisational and job factors, and human and individual characteristics which influence behaviour at work in a way which can affect health and safety outcomes.  Delegates will learn how they can improve procedures, conditions and performance in their own workplace.

Accident and incident investigation – 8-9 May 2013
This two day training course will focus on the analysis of incidents and accidents and will clarify the process of identifying root causes using practical examples.  The course will provide an overview of available analysis methods and the application of these to identify the underlying management and organisational deficiencies responsible.

If you are interested in attending either or both of these courses, please contact Will Sadler e:

Quantitative over quality?

March 5, 2013

In January, EI Netherlands Branch member Arend van Campen posted a thought-provoking piece in response to the publication of EI Guidance on quantified human reliability analysis (QHRA) (see “Can humans be quantified?”).

Author of Guidance on quantified human reliability analysis (QHRA), Jamie Henderson, has written a response clarifying the purpose of the guide:

“Arend van Campen’s response raises some important points – issues for which we have a good deal of sympathy.  One of the reasons for writing the guidance was that human reliability analyses (HRAs) are often undertaken without a proper understanding of the context in which people work, and of the limitations of the available techniques and data.  There are several potential dangers with this, not least complacency that human factors issues are being adequately managed when they are not.

Thirty years ago, when these techniques (e.g. THERP, HEART) were first being developed, the prevailing approach to engineering risk analysis was, and to some extent still is, primarily deterministic.  When people were considered, if at all, it would typically be as components in a system, an approach that reduces a person “…to the same level as if he were a valve or pump that can be tested on reliability and sent back to the manufacturer if it does not work properly”.  Now, for many reasons, some of which are set out in Arend’s response, this approach is lacking.  However, at the time, despite an increasing recognition of the role that people play in ensuring safety, human factors was still finding its feet as a discipline and needed systematic ways of ensuring that human factors issues were considered.  Understandably, these attempts focused on developing tools and techniques that could be integrated with existing approaches to engineering risk management.

In the intervening years, individuals working in this area have made many criticisms of these techniques (for example, the basic concept of human error has been challenged by numerous authors) and sought to develop new approaches to understanding why systems succeed or fail.  One well-known example is research into high reliability organisations (HROs), which, instead of focusing on failure, seeks to identify the characteristics of organisations that appear to manage safety particularly well in high-hazard environments.  Resilience engineering, another relatively recent development, seeks to create flexible, robust processes in the face of real world complexity (e.g. responding to resource issues, revising risk models as situations change).  The issues raised in Arend van Campen’s response (e.g. HSE goals, trust, motivation) are also important factors in the ability of an organisation to manage safety and risk.

However, despite the known issues with their application, and the development of new approaches to understanding why systems succeed or fail, these HRA techniques are still used, often by people without a background in human factors, and without a realistic understanding of the operating context in which tasks are performed.  Until new approaches are developed, operationalised and tested, the existing techniques, which are after all designed to work within the context of an engineering risk analysis, will continue to be used.  Therefore, the aim of the guidance, and the supporting article, was not to endorse these techniques, but to ensure that anyone considering using them, in particular individuals without a background in human factors, understands their limitations.”
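As background for readers unfamiliar with how such techniques quantify error: HEART, mentioned above, adjusts a nominal human error probability (HEP) for a generic task by a multiplier for each error-producing condition (EPC), weighted by the assessor’s judgement of how strongly that condition applies. A minimal sketch of the arithmetic, using hypothetical values rather than figures from the EI guidance, might look like this:

```python
def heart_hep(nominal_hep, epcs):
    """HEART-style calculation (illustrative only).

    nominal_hep: nominal human error probability for the generic task type.
    epcs: list of (max_multiplier, assessed_proportion) pairs, one per
          error-producing condition, where assessed_proportion (0-1) is the
          analyst's judgement of how strongly the condition applies.
    """
    hep = nominal_hep
    for max_effect, proportion in epcs:
        # Assessed effect of each EPC: (max_multiplier - 1) * proportion + 1
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Hypothetical task: nominal HEP of 0.003, with two EPCs partially in
# effect (e.g. time shortage, inexperience). Values chosen only to show
# the arithmetic, not taken from any published data table.
print(heart_hep(0.003, [(11, 0.4), (3, 0.2)]))  # 0.003 x 5.0 x 1.4 = 0.021
```

The point of the debate above is visible in this sketch: the numbers look precise, but each `assessed_proportion` is a subjective judgement, and the result is only as sound as the analyst’s understanding of the operating context.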

We thank both Arend and Jamie for their contributions.

Perhaps something to add is the need to consider not just how QHRA should be done (if it is done at all), but why.

Is QHRA being used simply to justify the safety measures we have put in place? Or is it being used to better understand the tasks people are expected to perform, in order to improve the measures in place?