Putting The Human in Human Factors

Dean Sigler

Part of the Green Flight Challenge is the provision of adequate human factors design in the cockpit of the projected airplane. One can only design in so much ergonomic and safety-minded concern for the pilot and passengers. The ultimate human factor is indeed human, a topic Dr. Key Dismukes handled quite ably at the fourth annual Electric Aircraft Symposium.

Dr. Key Dismukes - NASA human factors expert

As noted in his NASA resume, “Dr. Dismukes is Chief Scientist for Human Factors in the Human Factors Research & Technology Division at NASA Ames Research Center. His current research addresses cognitive issues involved in the skilled performance of pilots, their ability to manage challenging situations, and their vulnerability to error. Among the topics investigated by his research group are prospective memory (remembering to perform deferred intentions), management of attention in concurrent task performance, and training crews to analyze their own performance.” His book, The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents (Ashgate Studies in Human Factors for Flight Operations), co-authored with fellow NASA researchers Benjamin A. Berman and Loukia D. Loukopoulos, looks into 19 recent aircraft accidents and their causes.

Noting that some reports indicate general aviation may be 100 times more prone to accidents than airline travel, Dr. Dismukes divides risk into three types: generic risks that are always present, risks specific to today’s flight, and human error. He suggests six ways to prevent or mitigate these risks.

Risk management, the first tactic the pilot can apply, comes from planning alternatives to the original plan and being willing to play the “what if” game. Several things interfere with this clear-headed approach to assessing risk. People tend to have an expectation bias, counting on their expectations to be fulfilled. Related to this, a plan continuation bias causes people to proceed with the original plan even as impediments to it pile up. Prospective memory lapses may cause people to forget a crucial step later on, and rushing increases the incidence of every type of error. All of these can combine to overwhelm the pilot’s perceptions and judgement. It’s better to have the “what ifs” sorted out, and contingencies evaluated, before getting in the cockpit.

Second, more recurrent training and practice lead to improved pilot skills. Dr. Dismukes suggests flying periodically on instruments (with an instructor or trained safety pilot) to maintain proficiency for the worst situations that may occur. As technologically advanced aircraft come more into play, such repeated practice becomes ever more relevant and necessary to staying safe.

Third, pilots should practice judgement and decision making. Since 80 percent of GA accidents are attributable to pilot error, learn from accident reports, analyze your own decision-making capabilities, and avoid potential traps in planning or executing a flight.

Fourth, and this requires assistance from manufacturers, make technology that can reduce cockpit workloads and provide safer environments more affordable. (One wonders about the capabilities of Droids and iPads, for instance.)

Fifth, introduce human-centered automation into the equation. Things such as stick shakers and stall warning indicators are relatively crude versions of this growing field. Collision and Accident Avoidance Systems (CAAS) are a more involved form, and all of these require a careful analysis of when decisions should rest with the pilot versus when a situation calls for an automated response.

Sixth, an extension of human-centered automation, intelligent cockpit agents (Dr. Seeley’s electronic Certified Flight Instructor, for instance) are sometimes seen as individual components with limited functions, usually to advise the pilot of other traffic, conformance with airspace requirements, and adherence to or departure from other operating parameters. The pilot is an essential part of the decision-making process, and design of such systems will probably involve sophisticated analysis of the human-“machine” interaction for best outcomes.

Whatever the level of cockpit and supporting technology, from the simplest hang glider to the most advanced very light jet, the final arbiter of the fate of the flight is usually the pilot in command. Dr. Dismukes provided significant insight into what the aids to that command may look like in the near future, but still reminded pilots that they remain the responsible agents.
