DeepMind A.I. learns to play video games
DeepMind, an artificial intelligence company that works with simulated neural networks, was recently acquired by Google.
You can see one of those networks here, as it learns from its environment how to play video games. A simulated neural network is a simulated brain that learns from its environment with little, if any, explicit programming. It learns through experience, like a child does.
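DeepMind's Atari agent was based on Q-learning combined with a deep neural network. As a toy illustration of that "learn through experience" loop, here is a minimal tabular Q-learning sketch on a hypothetical 5-cell corridor; the environment, names, and parameters are all illustrative, not DeepMind's actual code.

```python
import random

# Toy illustration of trial-and-error learning: tabular Q-learning on a
# 5-cell corridor where the agent earns a reward only at the right end.
# (DeepMind's agent paired Q-learning with a deep network; this tiny
# tabular version just shows the learn-from-experience loop.)

N_STATES = 5          # cells 0..4; reaching cell 4 ends the episode
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1   # learning rate, discount, exploration

Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action index]

random.seed(0)
for _ in range(500):
    s = random.randrange(N_STATES - 1)     # start each episode in a random cell
    while s != N_STATES - 1:
        # epsilon-greedy: mostly exploit the best known action, sometimes explore
        a = random.randrange(2) if random.random() < EPS else max((0, 1), key=lambda i: Q[s][i])
        s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # Q-learning update: nudge the estimate toward reward + discounted future value
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

# After training, the greedy choice in every cell should be "move right" (index 1)
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES - 1)]
print(policy)  # expected: [1, 1, 1, 1]
```

The agent is never told the rules; it only sees states, actions, and rewards, and the right behavior emerges from repeated play, which is the same basic idea scaled up in the Atari videos.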
Check out this video - not only does the computer master the first game, but it adds a dash of creativity after hundreds of hours of training. Also check out the games "River Raid" and "Battle Zone," which involve controlling combat vehicles.
DeepMind plans to move its simulator to 3D games like Quake.
"DeepMind Technologies is a British artificial intelligence company. It was acquired by Google in 2014. The company's latest achievement is the creation of a neural network that learns how to play video games in a similar fashion to humans." - Wikipedia
What is the limit to how complex a video game DeepMind could learn? Could it learn Minecraft? Video games are not that far a step from interacting with reality, either. In fact, I bet you could train one of these neural networks on video games to prepare it for interacting with reality. Think of one operating a drone, for example.
https://www.youtube.com/watch?v=EfGD2qveGdQ