How do I?


Berman Aviation Associates is ready to help you…


Apply the lessons of aviation safety to other high-risk fields

Our team has experience in Aviation… Health Care… Finance… Development…


Aviation safety is an increasingly mature application of system safety management.  While not every aspect of the aviation domain translates directly to other fields, Berman Aviation Associates can apply aviation’s experience to saving lives, property, and money in your high-risk field.


Here is a memorandum to the U.S. Congress that Ben Berman co-authored about applying aviation system safety concepts to financial system reliability and regulation:

Using Regulation to Create a Reliable National Market System


Go from imperfect humans and machines in the system, to a highly reliable system

Effective system design begins with recognizing that humans are imperfect and errors are inevitable.  We can make an error less likely, but we cannot eliminate it.  Humans are also particularly limited at performing some tasks with high reliability.  For example, when tasked with continually watching over a highly automated machine that almost never fails, our monitoring subtly but inevitably relaxes, and our attention may wander until we are no longer likely to detect the machine’s rare failure when it does happen.  Yet despite our inherent limitations in performing it, monitoring is an essential human role in the system, and one that grows in importance as automation takes over many of the other tasks.  Humans are excellent (though also imperfect) at recognizing patterns and trends and at thinking outside the box to find creative solutions when an unusual situation demands it, so we cannot be automated out of the system altogether.

An effective set of procedures acknowledges that human and machine errors will occur.  It incorporates error-trapping features, such as monitoring and crosschecks, to catch the inevitable errors.  It also recognizes that the monitoring functions we assign to humans are of limited reliability, so to protect against adverse and catastrophic events it establishes networks of independent, redundant procedures and error traps.
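As a rough illustration of why independence and redundancy matter, here is a minimal sketch of the underlying arithmetic.  The detection probabilities are hypothetical numbers chosen for illustration, not figures from any study: if each trap fails independently, the chance an error evades every trap is the product of the individual miss probabilities.

```python
# Sketch: how layering independent, redundant error traps shrinks the
# probability that an error slips through undetected.
# All detection probabilities below are hypothetical illustrations.

def undetected_probability(detection_probs):
    """Probability an error evades every trap, assuming each trap
    fails independently of the others."""
    p_miss = 1.0
    for p_detect in detection_probs:
        p_miss *= (1.0 - p_detect)
    return p_miss

# A single human monitor that catches 90% of errors:
single = undetected_probability([0.90])            # 10% slip through

# Add an independent crosscheck (85%) and an automated alert (95%):
layered = undetected_probability([0.90, 0.85, 0.95])

print("single trap:  ", single)    # 0.10 * ... = about 1 in 10
print("layered traps:", layered)   # 0.10 * 0.15 * 0.05 = about 1 in 1,300
```

The multiplication only holds if the traps truly fail independently; traps that share a common failure mode (the same tired crew, the same sensor, the same assumption) deliver far less protection than the arithmetic suggests, which is why the procedures above emphasize independence as much as redundancy.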
We can take any high-risk domain and build you a system that integrates imperfect human and machine elements to achieve highly reliable system safety and effectiveness.  Please call on us to tell you more.


Here are two NASA studies Ben Berman co-authored that take a systems approach to human reliability:

Maximize system safety by setting realistic expectations for flight crew performance and reliability

We know what people can realistically and reliably deliver in the complex and high-risk systems in which they operate.


Drawing on nineteen accidents (most of which he had investigated at the National Transportation Safety Board), and working with two NASA colleagues who are cognitive psychologists, Ben Berman co-authored The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents.


This book lays out what we can expect highly trained and experienced humans such as airline pilots to do when faced with situations ranging from the normal and routine to emergencies requiring split-second decisions and reactions.  It traces the performance of the pilots during these accidents, in many cases, to the ways in which all humans process information, the limitations we all share in our cognitive processing, and the characteristic biases and errors that can result.


We can help you by applying the analysis of Berman and his colleagues to accident investigation, procedure design, and equipment design in any domain.


