The controversy over how U.S. air traffic controllers are recruited shows no signs of abating. Since the story broke, the Transportation Department’s internal watchdog has launched an investigation into hiring practices at the Federal Aviation Administration (FAA), and several high-profile “resignations” have followed. Some are now calling for hearings that would force FAA leaders to answer before the American people.

Training is an essential part of any successful business. A recent survey found that 40 percent of workers who receive poor training leave their job within the first year. Depending on the profession, however, training can also be a costly affair. The FAA spends $93,000 per air traffic control trainee annually; by comparison, most businesses spend just $1,208. Such a wide fiscal discrepancy reflects the uniqueness of a job where mistakes can be fatal and close calls make headlines. Pre-selecting the candidates most likely to succeed can help minimize these costs. This approach strikes a balance between fiscal and operational concerns, and it inextricably ties the employer’s fate to that of the trainee: trainee success means employer success.


The FAA has historically used the Air Traffic Selection and Training (AT-SAT) test to screen potential recruits. Since its introduction in 2002, more than 22,000 applicants have taken the test and more than 6,800 controllers have been hired as a result. In 2014, however, the FAA introduced a new Biographical Assessment (BA). The agency argues that the BA “measures qualities known to predict air traffic controller success” and has been “validated based on years of extensive research.” But do these claims stand up to scrutiny?

AT-SAT consists of eight sub-tests. These measure, among other things, an applicant’s ability to scan and interpret instrument readings, detect targets that change over time, and determine the angles of intersecting lines: in other words, abilities that play a vital role in air traffic control. The predictive power of AT-SAT is well documented. A 2013 FAA study found a positive relationship between test performance and training outcome: higher AT-SAT scores meant a greater likelihood of being certified as a controller. Researchers concluded that the available evidence “supports the validity of AT-SAT as a personnel selection procedure for the (ATC) occupation.” Multiple investigations over the last fifteen years have come to the same conclusion.

The BA, on the other hand, is something of a different beast. It asks questions such as “How would you describe your ideal job?”; “What has been the major cause of your failures?”; and, perhaps most notably, “The number of different high school sports I participated in was: A) 4 or more; B) 3; C) 2; D) 1; E) Didn’t play sports.” How such questions help the FAA pre-select the best and brightest candidates is anyone’s guess, as the agency has resisted calls to release any data on the issue. The only publicly available study concluded that certain BA-related questions “did little” to improve the FAA’s ability to preselect applicants and that the evidence for using such questions was “weak.”

These findings are not lost on lawmakers like Rep. Randy Hultgren (R-Ill.), who is now sponsoring legislation that would force the FAA to abandon the BA altogether.

When this story broke last May, Fox Business interviewed Matthew Douglas, a 26-year-old Washingtonian who completed the BA. Reviewing the questions posed by the new test, he asked a simple yet important one of his own: “How does this relate to the job?” The answer lies somewhere within the walls of the FAA. The agency has a long history of making taxpayer-funded research data publicly available. So why not do so now?

Nunes is a visiting researcher at L'Universite Paris Decartes and a guest contributing writer to Aviation Week and Space Technology. He earned his PhD in Engineering Psychology from the University of Illinois at Urbana-Champaign.