What to Do When Algorithms Rule
The first American astronauts were recruited from the ranks of test pilots, largely due to convenience. As Tom Wolfe describes in his incredible book The Right Stuff, radar operators might have been better suited to the passive observation required in the largely automated Mercury space capsules. But the test pilots were readily available, had the required security clearances, and could be ordered to report to duty.
Test pilot Alan Shepard, the first American in space, did little during his 15-minute flight beyond being observed by cameras and a rectal thermometer (more on the “little” he did do later). Pilots rejected by Project Mercury dubbed Shepard “spam in a can.”
Other pilots were quick to note that “a monkey’s gonna make the first flight.” Well, not quite a monkey. Before Shepard, the first to fly in the Mercury space capsule was a chimpanzee named Ham, only 18 months removed from his West African home. Ham performed with aplomb.
But test pilots are not the type to relinquish control willingly. The seven Mercury astronauts felt uncomfortable filling a role that could be performed by a chimp (or spam). Thus began the astronauts’ quest to gain more control over the flight and to make their function more akin to that of a pilot. A battle for decision-making authority—man versus automated decision aid—had begun.
From the wish to control a space capsule’s angle of attack on re-entry, to the unwillingness to step into a lift without an operator, the reluctance to have our decisions and actions replaced by automated systems extends across a wide range of human activity. It took nearly 50 years for people to accept automated lifts. Today, over three quarters of Americans say they would be afraid to ride in a self-driving vehicle.
Continue reading the post by visiting Behavioral Scientist.
Jason Collins is data science lead with Australia's corporate, markets, and financial services regulator. He specializes in economics, evolution, and behavioral science.