NO TECHNOLOGY TREND today comes with more hype than artificial intelligence. Yet AI researchers often joke that most people don’t quite see how it’s changing the world—because as soon as something like Siri or Nest weaves itself into our lives, people forget it’s “artificial.”
That sort of quiet takeover by AI has already begun in an industry we humans tend to hold sacred: news. Last year, the Associated Press announced that the majority of its earnings reports will eventually be written by AI-enabled software. While the agency argues this will free up reporters to do more analytical work, it raises an unsettling question: Should we simply leave it to machines to interpret what they see in piles of Big Data and write up the results? Shouldn't they at least have some human editors?
Like it or not, AI tech is “entering the era of the narrative,” explains Kris Hammond, chief scientist at Narrative Science, one of several “natural language generation” startups emerging as vendors in this nascent field. “These are narratives generated by systems that understand data, that give us information to support the decisions we need to make about tomorrow.”
New apps like Banjo claim to take the world's pulse by mining social media, search trends, geolocation data, and other digital signals, producing new forms of breaking news instead of waiting for humans to notice that something is important and share it on Facebook or Twitter. Other media ventures are tapping AI to build stories when events turn on simple facts and human interpretation isn't required. Two years ago, the Los Angeles Times became the first major outlet to report on an earthquake, almost instantaneously, with a bot. Today, companies like Automated Insights and Narrative Science are powering the production of millions of auto-generated "articles," such as personalized recaps for fantasy sports fans. A similar metrics-based formula can recap a customer's stock portfolio performance. Here's a snippet of auto-prose from one of Narrative Science's investment reports:
“The energy sector was the main contributor to relative performance, led by stock selection in energy equipment and services companies. In terms of individual contributors, a position in energy equipment and services company Oceaneering International was the largest contributor to returns.”
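The metrics-to-prose formula behind recaps like the one above can be sketched as a simple template fill over structured data. This is a toy illustration, not Narrative Science's actual system; the field names and the sample metrics are hypothetical.

```python
# Toy sketch of data-to-text generation: render a one-sentence
# performance recap from structured portfolio metrics.
# Field names ("sector", "industry", "relative_return") are
# hypothetical, not any vendor's real schema.

def recap_sector(data: dict) -> str:
    """Turn raw sector metrics into a recap sentence."""
    # Choose phrasing based on the sign of the metric.
    direction = "contributor to" if data["relative_return"] > 0 else "detractor from"
    return (
        f"The {data['sector']} sector was the main {direction} "
        f"relative performance, led by stock selection in "
        f"{data['industry']} companies."
    )

metrics = {
    "sector": "energy",
    "industry": "energy equipment and services",
    "relative_return": 0.012,
}
print(recap_sector(metrics))
```

Real systems layer far more on top of this (document planning, varied phrasing, significance tests to decide which facts are worth mentioning), but the core move is the same: map numbers to canned linguistic choices.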
That said, the reason many believe (and AI companies promise) that a writer's job is still safe is that these stories are purely factual, essentially converting raw data into language. Human writers could eventually focus on more complex work: analysis, opinion, or humor, the layers of news that attract readers. A robot probably can't offer a good explanation for why Tom Brady seemed distracted in the third quarter. But this is where AI researchers hold up a finger and say, "Yet!"
Over the last five years, growing processing power and huge corpora of training data have given computers the ability to detect emotions and moods. Soon, perhaps, they will be able to recognize a sideline scuffle or a player's shift in attitude. Combine that with sensors gathering crowd reactions, the movement and velocity of players and passes, and historical statistics that provide context for the game and a player's performance, and AI starts to encroach on analysis as well.
As the range of data AI can reliably interpret continues to expand, so too will its applications. Here are a few AI-oriented ventures making inroads with content and media:
· Arria — Natural language generation platform that parses complex data sets in any vertical, from finance to meteorology, and writes expert-level reports formerly assigned to human analysts.
· Banjo — Ingests all types of digital signals from Twitter, Facebook, and around the Web to surface the biggest stories faster than humans can discover them.
· Knotch — AI-based social discovery app that monitors social behavior to capture audience feeling, emotion, and mood.
· Narrative Science — Processes data, from weather to stock prices, to write news stories and insights.
These examples just scratch the surface of what businesses and researchers think AI can do for media. If those ambitions come through, the changes will raise complicated questions. What becomes of the media jobs AI eliminates? Who's on the hook when machine intelligence gets the facts wrong or commits libel? These questions will likely require answers even more intelligent than the technology.