Last year, Google's AI subsidiary DeepMind said it would work with StarCraft creator Blizzard to turn the strategy game into a proper research environment for AI engineers. Today, they're opening the doors to that environment with new tools, including a machine learning API, a large dataset of game replays, an open source DeepMind toolset, and more. TechCrunch reports: The new release of the StarCraft II API on the Blizzard side includes a Linux package built to run in the cloud, as well as support for Windows and Mac. It also supports offline AI vs. AI matches, along with anonymized game replays from actual human players for training agents, starting at 65,000 complete matches and growing to more than 500,000 over the next few weeks. StarCraft II is a useful environment for AI research largely because of how complex and varied its games can be, with multiple open routes to victory in each match. Players must also do many different things simultaneously, including managing and generating resources, commanding military units, and deploying defensive structures. Plus, not all information about the game board is available at once, so players have to make assumptions and predictions about what the opposition is up to. The task is so big, in fact, that DeepMind and Blizzard are including "mini-games" in the release, which break different subtasks into "manageable chunks," such as teaching agents to build specific units, gather resources, or move around the map. The hope is that compartmentalizing these areas of play will let researchers test, compare, and refine techniques on each subtask before eventually combining them into complex agents that attempt to master the whole game.