Aviation Safety Newsletter - Volume 2, No. 3

Submitted on Saturday, December 1, 2007

It Can Happen to Anyone

How a series of fuel management errors ended in an engine failure for an instructor and a private pilot on final approach at Rockcliffe

by Roger M. Delisle

“Errare humanum est,” as was said nearly 2,000 years ago: to err is human.  In this edition we look at how this timeless observation does not discriminate among pilots. A recent incident at our Flying Club shows how an insidious chain of events can creep up to catch even a well-intentioned, cautious pilot and an instructor off guard.  Thankfully no one was hurt and no damage was done in what will surely be an incident the two pilots will not soon forget.  In this first of two articles inspired by it, we focus on the interesting series of coincidences that led to the incident and on the safety defences that prevented a potentially serious accident. Our look at the chain of events will use Reason’s Model, a safety tool that helps us understand the human factors behind incidents and accidents, and thereby helps prevent them.

What Happened?

On Friday, September 7, 2007, at approximately 11:00 EDT, a Piper Cherokee 140 with a flight instructor and a private pilot on board was on short final to runway 27 at Rockcliffe Airport (CYRO) when the engine failed.  The aircraft glided to a safe landing on the runway, with no injuries or damage. The aircraft had drifted slightly to the right on final, but the instructor corrected its path and touched down less than 100 ft short of the runway numbers.  On the rollout, the instructor used the aircraft’s remaining momentum to coast off into the grass, clearing the active runway.  Visual inspection of the left tank after the flight confirmed that it was empty; fuel starvation was inferred as the cause of the engine failure. Weather that day was reported as 4 SM in haze with light winds and a high-level overcast.

Incident Details:

The flight was the first of three check-out flights on type that the instructor was to give to the private-rated pilot, who had recently purchased a Piper Cherokee with five other pilots.

The instructor guided the student through the pre-flight inspection of the aircraft, during which a fuel sample from the right fuel tank was drained. As is common practice at the Club, for environmental considerations, the pilot returned the sample to the tank while visually inspecting the fuel level at the same time.  Fuel quantity in the right tank was then known to be 18 gallons.

Upon sampling the left tank, the pilot noticed small particulates in the fuel cup, so the instructor advised him to discard the sample rather than risk contaminating the fuel if he was concerned about the cleanliness of the cup.  Because the sample was discarded instead of being returned, the left tank was never opened for a visual fuel check.

During the entire one-hour training flight, which included upper air exercises such as stalls and steep turns, the fuel selector remained on the left tank and was never switched to the right [1].  Twice during the flight the instructor intended to have the pilot switch tanks, but was distracted by unusually frequent radio calls caused by the reduced visibility in the practice area.  The instructor did not think of the fuel tanks again until the flight was almost over, when he subconsciously assumed that, since he had seen the right tank filled to the ¾ level during the pre-flight inspection, a similar amount remained in the left tank from the previous flight’s refuelling, and that there was therefore plenty of fuel to finish the flight without switching tanks [2].

The instructor was flying the approach and landing and, out of fortunate habit, flew his circuit tight enough to remain within gliding distance of the runway.  This turned out to be the crucial safety defence that avoided an off-airport emergency landing. The instructor first noticed the engine failure on final approach, approximately 1,000 feet from the runway threshold, when the engine did not respond to throttle inputs and the propeller windmilled at 1,000 RPM.  Because landing was imminent, power-recovery procedures, including a fuel tank selection change, were not attempted.

After the aircraft was refuelled, the instructor and the pilot flew together again without incident.  The first of these flights took place the same day, which helped prevent the incident from leaving an undue lasting impression on the pilot.

Other Noteworthy Facts:

It’s important to note here how events from previous days also lined up to contribute to the incident:

  • Two days prior to the incident flight, another pilot from the aircraft’s group of owners conducted a night flight lasting 1.7 hours. The engine drew all of its fuel from the left tank because the selector was never switched during the flight.
  • Because FBO service at Rockcliffe was not available that night, the aircraft was parked without refuelling upon its return.
  • The night before the incident flight, the Piper’s group of owners held a meeting that the night-flight pilot could not attend.  He was therefore unable to advise the group that the aircraft had not been refuelled after his flight.  Nor did he learn of a new owner-group policy adopted at the meeting, whereby pilots are to placard the dash of the aircraft with a paper note whenever it requires refuelling before the next flight.
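The arithmetic behind this chain of events is worth making explicit. The sketch below uses hypothetical round numbers, not figures from any report: roughly 25 US gallons per tank is typical for a PA-28-140, a cruise burn of about 8 gal/h is assumed, and the left tank is assumed to have started near the ¾ level (about 18 gallons, mirroring what was observed in the right tank). Under those assumptions, two consecutive flights on the same tank are enough to drain it.

```python
# Illustrative only: all figures below are assumptions, not data
# from the incident investigation.

BURN_RATE_GPH = 8.0    # assumed average fuel burn, US gal per hour
left_tank_gal = 18.0   # assumed starting quantity in the left tank (~3/4 full)

# Both flights drew exclusively from the left tank.
flights_on_left_tank = [
    ("night flight (two days prior)", 1.7),  # hours flown
    ("incident training flight", 1.0),       # hours flown
]

for name, hours in flights_on_left_tank:
    burned = hours * BURN_RATE_GPH
    left_tank_gal = max(0.0, left_tank_gal - burned)
    print(f"{name}: burned ~{burned:.1f} gal, "
          f"left tank now ~{left_tank_gal:.1f} gal")

# Total demand: (1.7 + 1.0) h * 8 gph = 21.6 gal, which exceeds the
# assumed 18 gal on board -- the left tank runs dry before landing.
```

With these assumed numbers, the night flight alone leaves only a few gallons in the left tank, so exhaustion late in the next one-hour flight is exactly what simple arithmetic predicts.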


The causes of this incident are clear, but the cascading chain of events leading to it makes it an excellent example of how to apply a safety analysis tool you may already have heard of: Reason’s Model.  James Reason designed the famous “Swiss-cheese” depiction of the human factors behind aviation accidents and incidents. His model has become a common template for understanding aviation safety and is also used in other fields, such as hospital safety.  The model, depicted in Figure 1 below, shows how “holes” at each decision-making level create vulnerabilities that can line up and let a mishap slip through every operational layer to cause an accident or incident.  The layers in Reason’s Model are as follows:

  • The Fallible Decision layer groups all corporate decisions regarding flight operations that might cause incidents and accidents.  This is less relevant in private operations, but might still include general decisions, especially those made by a group of owners who must meet and discuss operations.
  • The Line Management layer includes all operations not involving the flight crew. Typical commercial operations would see all their dispatch and line personnel activities here.  In the case of private aircraft operations, the line management might include the owners’ personal involvement in maintaining the aircraft, including fuelling, interacting with mechanics for repairs and other maintenance, etc.
  • The Psychological Precursors to Unsafe Acts layer covers the personal situations, attitudes and mental states experienced by those involved, before and during a flight operation, that might lead to unsafe acts.
  • The Unsafe Acts layer classifies the operational deficiencies, by action or omission of the flight crew, that are causal or factors to an accident or incident.
  • The Defences layer contains all operations whose purpose is to prevent incidents or accidents from occurring, in other words, to catch earlier deficiencies.
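The logic of the model can be made concrete with a toy sketch (purely illustrative, not part of any formal safety tool): each layer holds a list of active “holes”, and a mishap reaches the bottom only if every layer has at least one hole lined up. The layer names and hole descriptions below are paraphrased from this incident.

```python
# Toy illustration of Reason's "Swiss-cheese" idea: an accident
# occurs only when every layer has at least one active hole.

layers = {
    "Fallible Decisions": ["owner absent from group meeting"],
    "Line Management": ["aircraft not refuelled after night flight"],
    "Psychological Precursors": ["unfamiliarity with type", "complacency"],
    "Unsafe Acts": ["left tank never visually checked"],
    "Defences": [],  # empty list = this layer held: no hole lined up
}

def trajectory_passes(layers):
    """True only if every layer has at least one hole lined up."""
    return all(len(holes) > 0 for holes in layers.values())

print(trajectory_passes(layers))  # False: the Defences layer blocked it
```

Here the Defences layer has no hole, so the trajectory is blocked: this mirrors the incident, where the tight circuit caught the engine failure before it could become an accident.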

If any of these layers stops a mishap from progressing “through the holes”, an accident is prevented. Therefore, the objective in a case study is to examine ways of improving each layer to prevent the situation from arising again.

Figure 1: Reason’s Model.    Source: Transport Canada.

For illustrative purposes, we will first identify the causes and factors, then fit them into Reason’s Model as an exercise, to illustrate the chain of events leading to our engine failure.  Note that Reason’s Model uses terminology suited to commercial operations, so our private flight is mapped onto it as closely as possible.

The Causes:  These are the direct actions or omissions of people, or the failure of any component of flight that directly resulted in the incident/accident.  Without these, it is unlikely that the incident would have occurred.  Causal actions or omissions of anyone involved usually entail an incorrect departure from known operating procedures.

The Contributing Factors:  A contributing factor could not, by itself, cause an incident.  Its presence, however, in conjunction with one or more causes, increases the likelihood of the incident or contributes to it in a significant way.  A contributing factor does not imply that it should not otherwise exist or occur, simply that circumstances have implicated it in the cause(s) of the incident.  In our case study, the contributing factors were numerous and linked together. Each is identified as a “Factor” in the Reason’s Model table below.

Fallible Decisions

  • Factor: The absence of the previous flight’s pilot at the owners’ meeting.  This led to a lack of communication about the group’s procedures for refuelling after night flights and about the state of the aircraft’s fuel at the time.
  • Factor: The tendency of pilots to combine the visual inspection of the fuel in a tank with returning the drained sample to it, so that the tank is opened only once for both actions.  Because the left sample was discarded, the usual routine was broken and the tank was never opened.  Some will argue that the practice of returning fuel samples to the tank is itself a contributing factor to fuel mismanagement, but this point is open for debate.

Line Management Deficiencies

  • Factor: The particulate material in the fuel cup.  This is what prompted the pilot to discard the sample and omit the visual fuel check of the left tank.
  • Factor: The aircraft not being refuelled after the previous flight because of night operations.

Psychological Precursors to Unsafe Acts

  • Factor: The pilot’s unfamiliarity with the aircraft.  This was his very first flight in type, and the new, unfamiliar surroundings certainly contributed to the disruption of the usual flight operation, i.e. the pre-flight inspection.
  • Factor: The pilot’s low total time and inexperience with in-flight fuel tank switching (previous time in Cessnas only). This made it unlikely that he would remember to switch tanks.
  • Factor: The instructor’s possible complacency at the time.  No pilot is immune to this.
  • Factor: Low visibility and higher practice-area traffic, distractions that diverted the crew’s attention away from managing the fuel.

Unsafe Acts

  • Cause: Incorrect in-flight fuel management, causing fuel starvation.
  • Cause: Failure of the pilot to visually inspect the fuel level of the left tank during the pre-flight inspection.
  • Cause: Failure of the instructor to notice that the pilot had not visually inspected the left tank.

Defences

  • The instructor’s habitual procedure of remaining within gliding distance of the runway while in the circuit.
  • The fuel in the other tank.  The instructor knew it was there, and it would have enabled a restart during engine-failure emergency procedures.

The Defences:  These are what caught the mishap and prevented it from becoming an accident.  Only one defence, remaining within gliding distance of the runway while in the circuit, was actually called upon during the flight.  The other was available and might have been crucial in restarting the engine had it failed at any time before the aircraft reached the circuit.

Closing Remarks

This incident’s causes are obvious, but the many factors involved make it a good candidate for a case study in accident analysis and in the application of Reason’s Model.  Understanding each layer helps identify causes, and helps apply safety management methods to remedy problems at the correct point in flight operations.

Reason’s Model can shed light on the mechanisms that multiply into a chain of events, in operations ranging from large airlines and corporate flight departments all the way to the private Sunday flyer.  In this article we combined a traditional causes-and-contributing-factors analysis with Reason’s Model to categorise, and better understand, the areas of flight operations where deficiencies were identified.  From there, a more systematic and conscientious review of our own operations can take place. This analysis also makes it easier to see how a chain of human-related failures can be insidious and can happen at any time, to anyone.

In the second Safety Newsletter inspired by this event, we will take a comprehensive look at fuel management in all its aspects, including tips that go beyond the normal training curriculum.


The Safety Committee wishes to thank the parties involved in this investigation for their help and unconditional cooperation towards aviation safety education.

The Committee recognises and thanks one of its members, RFC instructor Jean René de Cotret for his instrumental help in conducting the investigation.

As usual, I welcome all comments and questions.  Staff members at Rockcliffe are also available to answer your questions, or at least direct you to someone who can.

Roger Delisle

[1] We should note, for unfamiliar readers, that unlike most Cessna models, for example, the fuel selector valve in the Piper Cherokee only allows fuel to be drawn from one tank at a time.  By design, there is no “BOTH” position; the pilot must switch tanks in flight at regular intervals.

[2] A factor in this decision is the practice of avoiding tank switches at circuit altitude, which minimises the possibility of encountering a fuel problem with little time left to react to it.

Disclaimer:  The sole purpose of this Safety Newsletter article is to educate pilots and other persons involved in aircraft operations, strictly in the interest of promoting flight safety.  It is not intended as a substitute for any official investigative report which may or may not exist relating to any incident mentioned. In no way should it be interpreted as apportioning civil or criminal responsibility on any party, including, but not limited to, manufacturers, suppliers, operators, pilots, air crew, maintenance personnel or governmental authorities.