When you’re a first-time passenger in a Google driverless car rolling down busy Highway 101 near San Francisco, you go through three phases, said HR Technology Conference keynote speaker Andrew McAfee: the first is “raw, abject terror,” quickly followed by the second, active fascination. Twenty minutes in, that tends to give way to the third and final phase: mild boredom.
“These cars are programmed to drive the way we were taught to in driver’s ed classes and then promptly forgot,” said McAfee, principal research scientist at the Massachusetts Institute of Technology and co-author of the recent bestseller The Second Machine Age. “There’s no speeding or abrupt lane changes — it feels like riding an airport monorail.”
McAfee’s driverless car adventure was the indirect result of a book by two professor colleagues, who had argued that, despite the advances of computers and robotics, machines would probably never match human beings’ ability to master and rapidly adapt to quickly changing patterns. Thus, they wrote, a computer could never do something like navigate a car through heavy traffic. The book in question, The New Division of Labor, was published in 2004. Six years later, Google announced that its engineers had been riding in computer-guided cars for a number of years.
“As soon as I heard that, I knew I had to experience it,” said McAfee.
The premise of McAfee’s talk, “Making the Right Choices in the Second Machine Age,” was that we’re now living in the greatest era of transformation since the Industrial Revolution. Driverless cars, supercomputers that handily beat long-time Jeopardy! champions and cheap, flexible robots mean that many jobs long thought to be “automation-proof” because they could only be done by humans will likely be taken over by artificial intelligence. This will, of course, have huge implications for the workplace and for enterprises, he said.
“I don’t think all this will result in enormous factories staffed by only two employees, one of whom is a dog whose job is to bite the human if he tries to touch anything important,” said McAfee. “But I do think we need to re-examine the boundaries between technology and humans and rethink our business models.”
Ideally, human intelligence and artificial intelligence can complement each other, he said. He cited organizations that opened themselves to input from outsiders and used data algorithms to greatly improve service, accuracy and productivity. Such organizations stand in sharp contrast to those that continue to rely on “HiPPOs,” or “the highest-paid person’s opinion,” whether it be the CEO or highly paid outside consultants.
“Some HiPPOs will take data in, but it’s their gut that ultimately makes the decision,” said McAfee.
Geeks, by comparison, are willing to ignore their gut and follow the data wherever it leads. (McAfee considers “geek” a compliment, describing geeks as people who are “fascinated and driven by data.”)
This pays notable dividends, he said: Companies that adopt “data-driven decision-making” achieve a level of productivity that’s 5 percent to 6 percent higher than those that don’t. Digital intelligence is remaking occupations from dairy farming to pathology — indeed, a digital pathologist created by Stanford scientists has proven to be better, on average, at cancer detection than highly trained human pathologists, said McAfee.
For organizations, he said, the upshot of all this must be that they make themselves more open to data-driven approaches and to outsiders who promise to bring in different ways of thinking and doing things, rather than continuing to rely on HiPPOs. He cited the Obama 2012 campaign, which not only brought in “data geeks” to find new ways of identifying and motivating voters but put them in charge of key operations — a sharp departure from most political campaigns, he said.
“A lot of companies will not be open to this approach, and that will lead to a lot of disruption,” he said. “But if we can find new ways to combine human and digital intelligence, then the sky’s the limit.”