The Mechanical Pilot
The pros and cons of automation
The “Mechanical Turk” was an early example of an automaton, a machine which looked like a human figure but was driven internally, or so its exhibitors proclaimed, by an intricate clockwork mechanism which enabled it to mimic certain human actions and activities. Its main talent was to play a decent game of chess, sometimes beating skilled players, including such notables as Napoleon Bonaparte. Exhibited around Europe from the 1770s for half a century or so, the Turk made its various promoters substantial amounts of money. At the same time its notoriety fed a public imagination which, while marvelling at the intellectual capacities of a mere machine, was also nursing disturbing anxieties at the dawn of the Industrial Age. This was a period of turmoil when manufacturers of weaving machines were threatened with violence and vandalism by rebellious weavers, who feared that the manual skills of their profession, and their very livelihoods, were endangered by such new technologies as the punched-card loom. In the loom introduced in the early 1800s by Joseph Marie Jacquard, the information about which threads to weave was encoded as holes in a loop of stiff cards. It was a method which later inspired the makers of the first digital computers, who entered data on cards reminiscent of those from the first automated looms.
In aviation, as in every activity where humans interact in a complex and purposeful way with machines, many advances have been driven by the adaptation of existing technologies or the (often previously unforeseen) availability of new ones. This evolutionary approach to solving aeronautical problems and challenges was quickly adopted by early aviators and engineers. In 1914, barely a decade after the Wright Brothers had inaugurated the aviation age at Kitty Hawk, the first effective autopilot was successfully demonstrated by the American inventor Lawrence Sperry. Adapting a gyrocompass originally designed for marine use by his family’s company, Sperry showed how, together with a feedback mechanism and hydraulically operated controls, the system could greatly reduce the pilot’s workload. Even at this early stage in the history of powered flight, it was recognised that the requirement for the pilot’s constant attention simply to keep the aircraft straight and level for lengthy periods was seriously fatiguing. It also risked distraction from other essential tasks in the cockpit. An accomplished pilot himself, Sperry flew numerous “hands-off” demonstration flights to impress potential military and civil customers, and for much of the twentieth century the company’s name was synonymous with automated flight systems. His autopilot was also credited with enabling him, handsome and charismatic airman that he was, to become the first member of the “Mile High” club, speculation having been fuelled by his frequent aerial excursions with a particular lady companion. Sperry maintained a gentlemanly silence on the question, although he never actively denied the rumours.
Such social benefits aside, the important contribution the autopilot, and later automated systems, have made is to free the pilot from individually having to perform all of the myriad tasks required, in a confined envelope of time and space, when flying any but the most basic aircraft. At its simplest level the autopilot is an aid to what have been described (in the taxonomy of researchers Jens Rasmussen and James Reason) as “skill-based” activities. In the aviation context, it enhances - or often for long periods completely replaces - the basic stick-and-rudder flying skills required to keep the aircraft upright and moving forward along its planned trajectory. Errors at this level can involve lapses of “muscle memory”, permitting or causing the aircraft to depart from specified limits of speed or attitude. The autopilot is there to correct such errors, or to prevent the pilot from making them inadvertently, and to relieve the pilot of what would otherwise be exhausting long-term hands-on control of the flight.
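To make the skill-based level a little more concrete, here is a minimal sketch in Python of the kind of closed-loop correction involved. It is purely illustrative: the function names, gain and response figures are invented for this example and do not represent any real autopilot control law.

```python
# Toy illustration of skill-level automation: a proportional feedback loop
# that nudges the elevator to hold a target pitch attitude.
# All names and values are invented; no real autopilot law is implied.

def attitude_hold(current_pitch_deg: float, target_pitch_deg: float,
                  gain: float = 0.5, max_deflection_deg: float = 10.0) -> float:
    """Return an elevator command (degrees) that reduces the pitch error."""
    error = target_pitch_deg - current_pitch_deg   # how far we are from the target attitude
    command = gain * error                         # simple proportional correction
    # Respect control-surface limits, as any real system must.
    return max(-max_deflection_deg, min(max_deflection_deg, command))

# A crude simulation: the aircraft starts nose-high and is nudged back towards level.
pitch = 3.0  # degrees nose-up
for step in range(5):
    elevator = attitude_hold(pitch, target_pitch_deg=0.0)
    pitch += elevator * 0.4  # invented, much-simplified aircraft response
    print(f"step {step}: pitch {pitch:+.2f} deg, elevator {elevator:+.2f} deg")
```

The point of the sketch is only that the machine makes the small, continuous corrections a pilot would otherwise have to make by hand, cycle after cycle, for hours.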
At the second level, “rule-based” actions include planning, in-flight procedures and routine flight management, which are subject to standard operating procedures and previously considered courses of action. Errors involve departure from agreed rules, and responding correctly requires applying the appropriate rule from a predetermined set laid down in manuals or regulations. Automation at this level appears in flight management and navigational systems which augment, or replace, human decision-making with machine decision-making. The development of such systems was in turn facilitated by the rapid progress since the mid-twentieth century of microelectronic and computer technologies, accompanied by advances in human factors fields such as ergonomics and cockpit design. Little by little, it has sometimes seemed, the pilot is being “designed out” of the business of flying, to be replaced by automatic systems, and there is no doubt that accident statistics over this period demonstrate a steady increase in safety levels, to the point where commercial aviation has become the world’s safest mode of transport.
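By way of a loose illustration of the rule-based level, the following sketch shows a predetermined condition-to-action table being applied automatically. The rules and figures are invented and grossly simplified; no real operator’s procedures or flight management logic are implied.

```python
# Toy rule-based decision table: the first rule whose condition matches supplies the action.
# Conditions, thresholds and actions are invented for illustration only.
RULES = [
    ("cabin altitude high", lambda s: s["cabin_alt_ft"] > 10_000,
     "don oxygen masks and initiate descent"),
    ("fuel below reserve", lambda s: s["fuel_kg"] < s["reserve_kg"],
     "divert to the nearest suitable airport"),
    ("normal operation", lambda s: True,
     "continue as planned"),
]

def apply_rules(state: dict) -> str:
    """Return the action prescribed by the first matching rule."""
    for name, condition, action in RULES:
        if condition(state):
            return f"{name}: {action}"
    return "no rule matched"

print(apply_rules({"cabin_alt_ft": 8_000, "fuel_kg": 900, "reserve_kg": 1_200}))
# -> fuel below reserve: divert to the nearest suitable airport
```

Real systems are vastly more elaborate, but the essence is the same: a previously agreed rule is selected and applied, with no judgement required beyond recognising which condition holds.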
At the same time, however, automation has brought its own challenges. Basic flying skills, the essential techniques in which a pilot is first trained to control her or his aircraft, can decline through lack of use, the more so when working for operators who discourage hand-flying and insist on maximum use of “the automatics”. The rationale for this policy is understandable - automatic systems have been shown to enhance safety greatly in routine operations. However, it opens a path to potential danger when pilots have to intervene and revert to basic manual and cognitive skills. This can happen when the automatics fail, or when a positive decision is made to hand-fly the aircraft in the face of ambiguous or confusing information or commands emanating from the systems themselves. The overall level of aviation accidents is now at a historical low, yet in recent years the root cause of many of those events has increasingly evolved (or perhaps reverted) to loss of control. These events are exacerbated in many cases by the diminution of fundamental flying skills, an increase in “automation dependency”, distraction from priority tasks by alarms and messages from the systems, and the cognitive demands of accurately programming navigational and flight management computers, tasks often tackled under extreme workloads. Regulators, operators and manufacturers are responding by specifying a mandatory element of periodic manual flight refresher training, and by encouraging the maximum use of hand-flying where operational demands permit, although this guidance is by no means observed by the majority of operators.
These issues are largely academic for students and pilots of basic recreational and training aircraft. If they don’t progress beyond relatively simple flying activities, they will rarely be faced with such challenges. For intending commercial pilots, however, coming to terms with automatic techniques typically begins during instrument rating training, with their first exposure to autopilots and to programmable navigation systems. This is probably the right moment to start becoming familiar with the operation of the actual systems: earlier training should focus on embedding fundamental handling skills and old-school navigational and flight management techniques. Even so, it is good policy for trainers to familiarise learner pilots with the philosophical and technical challenges posed by automation.
In addition to the skill- and rule-based demands on the pilot are the knowledge-based activities which lie beyond the scope of automatic systems. These arise from new or unexpected situations, and are, in the words of one author in this field, “why we still need humans in the cockpit”. In these events learned skills or rules may no longer be applicable or effective, and the pilot is forced to draw on their knowledge of the entire aviation environment in order to analyse the situation and devise solutions. Errors emerge from psychological influences such as confirmation bias and task saturation. Successful outcomes in dealing with these events will depend greatly on the pilot’s mindset and on their having effectively internalised both the lessons of previous experience, be it over mere months or many years, and the key insights of human factors research and training. For at the heart of these complex machines and systems is the human being, with all of her or his flaws and amazing capabilities. The maintenance, care and training of this “soft machine” to make the decisions the automatics cannot is one of the most interesting parts of the whole business of aviation. Pilots shouldn’t worry about being designed out just yet, but they must stake their claim to ultimate control of their flights in all aspects. They must be prepared, technically, emotionally and psychologically, for those situations that can only be managed by the human being, not by the machine.
Even the Mechanical Turk, having delivered decades of substantial income to its owners, was finally “outed”. After half a century of astonishing a credulous public, it was discovered that the machine had been operated all along by a human chess player concealed inside the cabinet, making the moves on the automaton’s behalf.
--
A version of this article was first published in Flight Training News (UK)