In the world of aviation, clear communication is critical. Pilots are trained to follow checklists, adhere to procedures, and respond to commands swiftly and accurately. Yet a subtle but profound quirk of human cognition can have significant implications: the brain's tendency to fixate on negatives. Safety Management Systems should account for this human-factors bias when assessing operational risk.

Consider the classic example: "Don't think of an elephant." The moment you read that phrase, your mind conjures the image of an elephant. This happens because the human brain struggles to process negative instructions: it must first picture the very thing it is being told to avoid. In everyday life this cognitive glitch is harmless, but in the high-stakes environment of aviation it can be dangerous.

Understanding the Cognitive Mechanism

At the heart of this issue is how our brains process language and imagery. When we hear or read a phrase, the brain first forms a mental picture to understand it. If the instruction is negative, such as "don't," the brain must first visualize the action or object before it can attempt to dismiss it. This means that pilots given negative commands may inadvertently focus on what they should be avoiding rather than on the correct action.

The Impact of Negativity Bias

This phenomenon is closely related to what psychologists call negativity bias. Research shows that our brains are more attuned to negative information than to positive information: we remember negative experiences more vividly and dwell on negative instructions and warnings. This bias evolved as a survival mechanism that helped humans avoid danger, but it can be a liability in modern settings like aviation. For example, when a pilot is told, "Don't descend below 5,000 feet," the brain may latch onto the idea of descending, increasing the risk of error during critical moments.
Real-Life Aviation Incidents

The aviation industry has seen numerous cases in which negative instructions, or fixation on avoiding a specific outcome, contributed to serious incidents. Here are a few real-life examples:

1. Eastern Air Lines Flight 401 (1972)

Eastern Air Lines Flight 401, a scheduled flight from New York to Miami, crashed into the Florida Everglades on December 29, 1972. The crew had become preoccupied with a malfunctioning landing gear indicator light. The captain had instructed the crew to "not let the airplane descend" while troubleshooting the problem, but, distracted and fixated on the faulty light, they failed to notice that the autopilot had been unintentionally disengaged. A gradual descent went unnoticed until it was too late, and the crash killed 101 of the 176 people on board.

2. American Airlines Flight 965 (1995)

On December 20, 1995, American Airlines Flight 965, en route from Miami to Cali, Colombia, crashed into a mountain during approach, killing 159 of the 163 people on board. The accident was partly due to confusion over the aircraft's navigation system, and the first officer's command played a role in the outcome: "Don't descend below the MDA [Minimum Descent Altitude] until we are clear of the mountains." The crew, already disoriented by unfamiliar terrain and stressed by time constraints, may have fixated on the descent, misjudging their altitude and proximity to the mountains.

3. Northwest Airlines Flight 255 (1987)

Northwest Airlines Flight 255 crashed shortly after takeoff from Detroit Metropolitan Airport on August 16, 1987, killing 156 people. The probable cause was the crew's failure to deploy the wing flaps and slats necessary for takeoff; they were likely distracted by a series of preflight instructions.
The cockpit voice recorder revealed that just before takeoff, the first officer received the checklist item, "Don't forget flaps." In the rushed, stressful context, this phrasing may have contributed to the crew's oversight. The tragic outcome underscores how a negative command, especially in a high-pressure environment, can lead to catastrophic errors.

4. Tenerife Airport Disaster (1977)

The Tenerife Airport disaster of March 27, 1977, remains the deadliest accident in aviation history, with 583 fatalities. Two Boeing 747s collided on the runway at Los Rodeos Airport in the Canary Islands. A contributing factor was the communication between the KLM flight crew and air traffic control (ATC). The KLM captain received an ambiguous ATC instruction that included a negative command: "Stand by for takeoff; do not take off." The captain misunderstood this as clearance to take off, a fatal misunderstanding: the emphasis on the word "takeoff" rather than on the negation may have triggered the incorrect action.

The Role of Clear Communication

Given all this, the importance of clear, positive communication in aviation becomes even more apparent. Instead of issuing a negative command, it is safer and more effective to phrase instructions positively. For example, rather than "Don't descend below 5,000 feet," a better command is "Maintain altitude at or above 5,000 feet." The pilot's brain then processes the positive action directly, reducing the risk of focusing on the wrong one.

Training and Procedure Design

Understanding this cognitive limitation has implications for pilot training and procedure design. Training programs should emphasize the use of positive language in communication, both among the cockpit crew and between air traffic control and pilots. Procedures and checklists should also be reviewed.
They should be framed in a way that minimizes the potential for negative fixation.

Conclusion

The human brain is an incredible tool, capable of processing vast amounts of information and making split-second decisions, but it is not without flaws. The inability to simply ignore a negative instruction is a subtle but significant factor that can affect pilot performance. By fostering awareness of this issue and emphasizing clear, positive communication, the aviation industry can take another step toward enhancing safety and reducing human error. In aviation, where the margin for error is razor-thin, understanding and mitigating the impact of human cognitive biases is crucial. After all, in the cockpit, every word counts.
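The kind of checklist and procedure review described above can be partially automated. Below is a minimal sketch of that idea, assuming a simple keyword-based approach; the phrase patterns, the `find_negative_instructions` function, and the example checklist are illustrative inventions for this article, not any official aviation standard, and a real review would still need a human to judge context and propose positive rephrasings.

```python
import re

# Illustrative (hypothetical) patterns for negatively framed instructions.
# A real procedure-review tool would need a far richer, context-aware list.
NEGATIVE_PATTERNS = [
    r"\bdon'?t\b",
    r"\bdo not\b",
    r"\bnever\b",
    r"\bavoid\b",
]

def find_negative_instructions(instructions):
    """Return the instructions that contain negative phrasing,
    so a reviewer can consider rewording them as positive commands."""
    flagged = []
    for text in instructions:
        if any(re.search(p, text, re.IGNORECASE) for p in NEGATIVE_PATTERNS):
            flagged.append(text)
    return flagged

# Example checklist mixing negative and positive phrasings.
checklist = [
    "Don't descend below 5,000 feet.",
    "Maintain altitude at or above 5,000 feet.",
    "Do not take off.",
    "Hold position and stand by.",
]

for item in find_negative_instructions(checklist):
    print("Review wording:", item)
```

Run on the example checklist, the sketch flags the two negatively framed items while leaving the positively framed equivalents ("Maintain altitude...", "Hold position...") untouched, mirroring the rewording advice in the section above.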