If you’ve ever played any of Blizzard’s RTS games like Warcraft or StarCraft, you know that the AI Blizzard employs isn’t exactly the most devious or challenging. This is especially true after you’ve played against it for a while, at which point you’ll start to notice that it follows a script or pattern you can exploit.
We suppose developing killer AI was never really Blizzard’s intention to begin with, but it looks like the company is curious about how a more advanced AI would fare against players. So much so that during BlizzCon 2016, the company announced that it will be teaming up with Google’s DeepMind to see if the AI is capable of learning and conquering StarCraft 2.
For those unfamiliar with DeepMind, it is the company behind AlphaGo, an AI that learned to play the ancient game of Go at an expert level, even managing to win matches against the world champion. That in itself was already pretty fascinating, as was the time IBM’s Deep Blue computer went up against world chess champion Garry Kasparov, but seeing DeepMind adapted to a competitive video game will be something else entirely.
We’re not sure when we’ll see DeepMind and StarCraft 2 in action, but the company did say it is working on modifications to the game that will allow researchers to build systems to help AI learn how to play.