Excerpt from the article:
Hong Kong-based investment firm Deep Knowledge Ventures made headlines in 2014 by appointing a computer algorithm to its corporate board. The firm, which has about 100 million euros under management, wanted a way to enforce a data-driven approach to investing, rather than relying on human intuition and personal interactions with founders. Managing partner Dmitry Kaminskiy says the algorithm served mostly as a veto mechanism—if it spotted red flags, Deep Knowledge wouldn’t invest.
In the five years since Deep Knowledge’s A.I. got its board seat, there hasn’t exactly been a stampede of companies following suit. In fact, Deep Knowledge itself shifted focus and no longer uses the algorithm. “Today, big strategy decisions are based on intuition”—that is to say, by humans—“because we have a data shortage,” says Brian Uzzi, a professor at Northwestern University’s Kellogg School of Management. Firms simply don’t make enough of these major decisions to train an algorithm effectively.
However, as more data is gathered, or as models that can cope with scarce data gain commercial traction, it is only a matter of time before A.I. takes on more strategic roles: providing insights on which M&A deals to pursue, which geographies to enter, or whether to match a competitor’s product offering. This is creating a backlash of sorts. Some management gurus and theorists have rushed to get ahead of the A.I. trend by counterintuitively slowing it down: flatly telling corporate boards not to abandon human intuition and common sense.
Earlier this year, Dirk Lindebaum, from Cardiff University in the U.K., and Mikko Vesa and Frank den Hond, both from the Hanken School of Economics in Helsinki, penned a provocative essay warning corporate directors against becoming too infatuated with A.I. To make their point, the trio drew parallels with the classic E.M. Forster science fiction story “The Machine Stops,” in which humans become so dependent on an all-powerful machine that they lose the ability to think and act independently. “We give away more and more autonomy,” Lindebaum tells Fortune. “Eventually, you come to a situation where you effectively hit the end of choice. You just follow the algorithm blindly.”