At the start of last year, Pietro Michelucci and Janis L. Dickinson published an article in Science urging researchers to look more deeply into human computation, the practice of pairing human intelligence with machines to accomplish tasks that neither could manage alone. Without hyperbole, they discussed the possibility of using this approach to tackle problems such as climate change and geopolitical conflict in the near future. Such verve pervaded research into artificial intelligence and machine learning throughout 2016, and it promises to keep colouring the field over the new year.
Artificial intelligence has become indispensable in everyday life through applications in computer vision and translation, but the next step, turning AI into a universal assistant, remains challenging. However quickly the field has racked up milestones, the real world is still too complex for current-day AI algorithms. That is why researchers have been turning to the virtual worlds of video games to approximate real-world challenges, in preparation for a time when artificial intelligence can start roaming the streets.
Google’s DeepMind lab made headlines last year after its AlphaGo program beat one of the highest-ranked Go players in the world at a game that has been notoriously difficult for artificial intelligence to master. While this was a phenomenal achievement, it was merely meant to test the prowess of deep learning, one of the key components of modern AI. The real focus of the company since its inception has been to develop programs that can learn to beat a wide variety of video games with little more input than what a human being playing them would be able to glean. After showing much promise with a variety of classic Atari games in 2015, the company set its sights on conquering StarCraft this year. “StarCraft is an interesting testing environment for current AI research because it provides a useful bridge to the messiness of the real world,” said Oriol Vinyals in a blog post announcing the project. The real-time gameplay of StarCraft, combined with the need for players to make decisions from incomplete information and guesswork, promises to make this a much harder nut to crack for DeepMind than the turn-based, fully visible board of Go.
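DeepMind’s Atari agents learned the way a human newcomer would: by trial, error, and a score signal, with no built-in knowledge of each game’s rules. As a loose illustration of that idea (not DeepMind’s actual system, which uses deep neural networks reading raw screen pixels), here is a minimal tabular Q-learning sketch on a made-up corridor “game”:

```python
import random
from collections import defaultdict

class ChainEnv:
    """Toy stand-in for a game: walk right along a 3-cell corridor
    to reach the goal and collect a reward of 1."""
    actions = [+1, -1]  # move right or left

    def reset(self):
        self.pos = 0
        return self.pos

    def step(self, action):
        self.pos = max(0, min(2, self.pos + action))
        done = self.pos == 2
        return self.pos, (1.0 if done else 0.0), done

def q_learning(env, episodes=500, alpha=0.1, gamma=0.9, epsilon=0.1):
    """Learn action values from observed rewards alone, with no
    model of the game's rules -- the core idea behind deep Q-learning."""
    q = defaultdict(float)  # (state, action) -> estimated value
    for _ in range(episodes):
        state = env.reset()
        for _ in range(200):  # cap episode length
            # Epsilon-greedy: mostly exploit current estimates, sometimes explore.
            if random.random() < epsilon:
                action = random.choice(env.actions)
            else:
                action = max(env.actions, key=lambda a: q[(state, a)])
            next_state, reward, done = env.step(action)
            # Nudge the estimate toward reward plus discounted best future value.
            best_next = 0.0 if done else max(q[(next_state, a)] for a in env.actions)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
            if done:
                break
    return q

q = q_learning(ChainEnv())
# After training, the learned values favour moving right from every cell.
```

DeepMind’s deep Q-network replaces the lookup table with a neural network, which lets the same update rule scale from this three-cell corridor to the pixel observations of an Atari screen.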
Soon after Microsoft acquired the open-world sandbox game Minecraft in 2014, its research arm was granted access to a modified version tailored for AI experimentation. Since then the company has made the code of this project, termed Malmo, publicly available to anyone wishing to use Minecraft for “advanced artificial intelligence research.” Because players can reshape their surroundings and because the game is so open-ended, Minecraft approximates the real world far better than the virtual environments artificial intelligence has been tested in before.
OpenAI, a non-profit research company founded in part by Elon Musk, recently opened a platform designed to pool these efforts at building versatile AI algorithms by giving researchers a common benchmark for testing them across many different environments, including a wide range of video games. This platform, called Universe, aims to help tackle one of the biggest challenges in modern AI: building algorithms able to apply old knowledge to radically new environments.
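What makes a common benchmark like this possible is a very small agent–environment contract, the one popularised by OpenAI’s earlier Gym toolkit: the agent calls reset() to start an episode and step(action) to act and receive an observation, a reward, and a done flag. As a rough, self-contained sketch of that contract (using a toy guessing game rather than the real library):

```python
import random

class ToyEnv:
    """Stand-in for a Gym/Universe-style environment: guess a hidden digit.
    Real environments expose the same reset()/step() interface."""
    def reset(self):
        # Begin a new episode; return the initial observation.
        self.target = random.randint(0, 9)
        return 0

    def step(self, action):
        # Apply an action; return (observation, reward, done, info).
        done = action == self.target
        return action, (1.0 if done else 0.0), done, {}

def run_episode(env, policy, max_steps=100):
    """Generic agent loop: any agent speaking this interface can be
    benchmarked on any environment that implements it."""
    obs, total = env.reset(), 0.0
    for _ in range(max_steps):
        obs, reward, done, _ = env.step(policy(obs))
        total += reward
        if done:
            break
    return total
```

Because the loop never touches the environment’s internals, swapping ToyEnv for a browser game or a flight simulator leaves the agent code untouched, which is exactly the portability across environments that Universe is after.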
With such fertile ground for improving AI laid down in 2016, this year seems poised to bring even more innovation to the field, a prospect made all the more exciting by tech giant Apple’s promise to start sharing its famously secretive research with the public.
(featured image courtesy of Ordnance Survey on Flickr)