AI Future Outlook Articles and Reports
Explore future perspectives on artificial intelligence trends - including products and applications in marketing, finance, and other sectors.
China entered 2016 with a struggling stock market that led many analysts to question the strength of one of the world's largest economies. Despite this economic omen, China closed 2015 with a stellar year in artificial intelligence and robotics, sparking what may be the beginning of a revolution.
Even folks without a remote interest in artificial intelligence understand that it's starting to surround them. The easy examples can be conjured by just about anyone walking the street: Siri, Amazon's recommendations, Pandora's playlists, Facebook's face-tagging and newsfeed, and Google's search results.
"Design fiction," as defined by MIT Media Lab, is "sparking imagination and discussion about the social, cultural, and ethical implications of new technologies through design and storytelling." Storytelling, more specifically science fiction (and even popular nonfiction science) seems to be a natural gateway to this relatively new concept. A series of recent email interviews with science fiction writers inspired us to think more about the avenue of storytelling as an influence in shaping technology and human history in the making.
One of the world's oldest and most prestigious universities is offering a new study focus that sheds light on its progressive approach to academia. With a grant from the non-profit Leverhulme Trust, academics at England's Cambridge University will be able to study artificial intelligence ethics over the next ten years.
Machines like IBM's Deep Blue and Watson are already capable of beating chess and Jeopardy! champions, respectively, proving that strategy and trivia can be conquered by a machine. But this prowess doesn't necessarily transfer to everyday use.
"Machine learning" is a term that's heard more often in startup and big data circles than "artificial intelligence", and interestingly enough, Google Trends confirms what's already heard through the technological grapevine: