Tech Interviews are Turing Tests
Are software engineers secretly anti-software?
Two recent interactions between humans and bots have thrown the development of artificial intelligence into striking contrast: the triumph of AlphaGo over Lee Sedol in a match of Go, and the introduction and subsequent shutdown of Tay on Twitter. AlphaGo shows us how far AI has advanced, while Tay shows us how far it has left to go. But perhaps more striking is that AlphaGo was lauded for its inhumanity – its moves not only surprised Sedol, but also impressed experts. Yet Tay was not. Her tweets were profane and offensive – in other words, inhumane.
Racism, sexism, nationalism, and other discriminatory beliefs start by putting other groups down, considering them subhuman, or in extreme cases, not human at all. Bots sit somewhere in the middle of that spectrum, receiving much respect for acceptable, “bot-like” accomplishments, and little respect as conversation partners or Twitter users. In some cases, bots are trusted more than humans, because they (seem to) lack capabilities beyond their job. But when algorithms make a mistake, we lose trust in them completely. Bots are modern slaves.
Bots are very good slaves – or, in more positive terms, very good servants. Unsurprisingly, many humans are afraid of automation precisely because bots can do a better job than humans, especially as many tasks blur the line between what a bot can do and what a human can do. Bots are stereotyped as only being good at repetitive, menial tasks, but as the aforementioned developments in artificial intelligence have shown, bots, like humans, are capable of much more. A bot just needs to know what it can do and what it needs to do – and, of course, whether or not it has accomplished that goal. That’s what a job is.
Much has been written about “the gig economy”, but no place puts as much emphasis on the “gig” as Amazon Mechanical Turk. How does it work? First, a “Requester” puts a “human intelligence task” (HIT) onto the marketplace, then “Workers” complete these tasks for payment. Requesters never meet the Workers, who are identified only by a number. As a result, Workers are judged solely by their work, making it a nearly perfectly indiscriminate market. Not only does it not discriminate between humans – allowing workers discriminated against in other marketplaces to find work – it also would not discriminate against bots, if they weren’t explicitly banned. Too bad Mechanical Turk is becoming less popular; perhaps humans are no longer needed to train the bots.
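To make the mechanics concrete: the Requester side of that marketplace is itself an API, so a HIT can be posted and its results approved without anyone ever looking at who did the work. Below is a minimal sketch using boto3’s MTurk client; the sandbox endpoint, task URL, reward, and timings are illustrative assumptions, not a working task.

```python
import boto3

# Connect as a Requester. The sandbox endpoint lets this sketch run without
# spending real money; drop endpoint_url to post to the live marketplace.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A HIT is just a question plus a price. The ExternalURL below is a
# hypothetical placeholder for wherever the task form is actually hosted.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/label-image</ExternalURL>
  <FrameHeight>400</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Label an image",
    Description="Choose the category that best describes the image.",
    Keywords="image, labeling, categorization",
    Reward="0.05",                    # payment per assignment, in USD
    MaxAssignments=3,                 # how many Workers may complete it
    LifetimeInSeconds=24 * 60 * 60,   # how long the HIT stays on the market
    AssignmentDurationInSeconds=600,  # time a Worker has once they accept
    Question=question_xml,
)
hit_id = hit["HIT"]["HITId"]

# Later, the Requester reviews submissions. Workers appear only as opaque
# WorkerIds; the Requester judges the work, not the worker.
submitted = mturk.list_assignments_for_hit(
    HITId=hit_id, AssignmentStatuses=["Submitted"]
)
for assignment in submitted["Assignments"]:
    mturk.approve_assignment(AssignmentId=assignment["AssignmentId"])
```

Nothing in that exchange asks whether the account behind a WorkerId is a person, which is exactly why the terms of service have to ban bots explicitly.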
There’s another market that doesn’t discriminate between humans and bots: the stock market. Algorithmic trading makes up a large portion of the volume on the various exchanges, not only playing the role of arbitrageur, but also making markets based on current news and proprietary models. Although it is still contentious whether or not bots are a positive development in trading, one generous explanation is that bots add diversity to the pool of market players, hedging the market against the irrational behavior of humans – though bot behavior can also be irrational, in its own way. Whether or not it is dampened in the future, algorithmic trading is here to stay, diversifying beyond high-frequency trading. It serves as an example of how bots and humans can interact seamlessly.
Silicon Valley has been seeking to replicate the exchange model in other industries, divvying up ownership and time into smaller and smaller chunks that can be exchanged for payment. These startups are converting full-time jobs into short-term gigs, resulting in small tasks that can be easily graded, perfect for a bot. Eventually, Uber will replace human drivers with self-driving cars. But why hasn’t software development itself become a gig? Topcoder hackathons harness the ideas and execution of small groups, while freelance developers provide engineering effort in short-term contracts. Yet neither has the liquidity of a full “gig” economy. Perhaps gigs are a fad.
Instead, the tech industry primarily sticks with full-time employees. To ensure they don’t treat their jobs like gigs, full-time software developers are hazed in tough interviews and handcuffed to equity cliffs. This keeps the labor market illiquid, which is bad for bots, who have the potential to be efficient players in it. It should be no surprise that the bot writers don’t want to be replaced by bots that can accomplish small, easily digestible bits of work. The very idea of building an app without engineers is anathema to the tech industry, but we can already run factories without workers. If software is to eat the world, it may eventually have to eat itself.
The tech hiring process is infamously inefficient. It is a process designed to keep bots out – a Turing test. The process is opaque: interviewers ask questions that, at first, seem to have well-defined measures of accomplishment – does the code work? Could the code be better? No, interviews must be more than “human intelligence tasks”; they must be capable of filtering out bots. But since human capabilities are so mercurial, interviewers often use simple heuristics to determine whether or not a candidate is capable. Oftentimes, humans can’t pass the interview. Some humans just aren’t human enough.
In case a bot does submit an acceptable piece of code to a coding challenge, an in-person interview will keep the impending robot apocalypse at bay for a few more decades. At worst, a replicant applicant can be rejected as a poor “culture fit.” Similarly, remote work and outsourcing are unacceptable, lest bots hide behind a mask of grainy video or slow internet. Perhaps we are afraid of chatbots precisely because they are too human. Too bad we can no longer assume only humans are good at Go.
Presently, software primarily assists humans. Cruise control and optimizing compilers have humans sitting in the driver’s seat, ready to take control in case something goes awry. We are slow to trust bots. A train line can operate without humans, while planes must have pilots. Advanced driver-assistance systems are marketed as safety features, yet driver’s exams are conducted without them. The ultimate test of driving talent, the World Rally Championship, has banned nearly all electronic assistance. And since “real programmers” flip bits with butterflies, technical interviews are conducted on whiteboards, no electronic assistance available. We just can’t let software taint our judgement of humans who write software.
Yet on Wall Street, the line between human and bot has been smeared by the torrent of financial metrics that guide investment decisions. In some cases, trading software is relegated to technical analysis, crunching numbers for humans to dig through and interpret. In other cases, algorithms control the humans, flashing potentially lucrative trades on screen, waiting for a human to execute them. Perhaps we’re harking back to the iconic frenzy of the trading floor, where brokers and floor traders juggled the orders of mysterious clients hidden behind landlines. But those days are gone: electronic signals have replaced hand signals; bots have replaced humans. The prophet of profit is searching for the most efficient blend of human and machine; money does not discriminate on humanness.
Hiring is discrimination. Interviewers discriminate between candidates to hire and candidates not to hire, basing their decisions on a very brief time with them. A hiring process with few rigid requirements is, in a sense, very libertarian. But what has happened? The requirements are now implicit rather than explicit. The discrimination is implicit rather than explicit. Though we don’t want the bots to take our place, we should call into doubt our own ability to assess the humanity of others. If we can trust bots to discriminate between good and bad links on the web, we could trust bots to discriminate between good and bad hires. If we can trust our compilers and our development environments to help us be better programmers, we can trust software to help us hire more effectively. At worst, their discriminations would reflect human discriminations. At best, bots will treat bots and humans as equals.
Perhaps the path to diversity in tech requires no humanity at all.