
Deep Learning Allows Video Game Characters To Move Like Real People

Researchers from Adobe and the University of Edinburgh in Scotland are developing a system that helps video game characters move more like real humans. As the teams note, the motion capture technology commonly used in video games does not account for all the ways a digital character can interact with its world.

With that in mind, the researchers use a deep neural network to create more realistic representations of movement. For starters, the teams studied and gathered various actions such as picking up items, climbing, and sitting down, all performed by an actor on a motion capture stage. From there, the neural network can take what it has learned and adapt it to almost any situation and environment.
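To make the idea concrete, here is a minimal, hypothetical sketch of the kind of model involved: a small PyTorch network that predicts a character's next pose from its current pose and an interaction goal, trained on motion-capture frames. The architecture, tensor sizes, and training data below are illustrative assumptions, not the researchers' actual system.

```python
# Illustrative sketch only: a tiny pose-prediction network trained on
# stand-in data. Dimensions, names, and training setup are assumptions.
import torch
import torch.nn as nn

POSE_DIM = 69   # assumed: 23 joints x 3 rotation values per frame
GOAL_DIM = 3    # assumed: target position (e.g., a chair or object to pick up)

class NextPosePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(POSE_DIM + GOAL_DIM, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, POSE_DIM),  # predicted next-frame pose
        )

    def forward(self, pose, goal):
        return self.net(torch.cat([pose, goal], dim=-1))

# Toy training loop on random placeholder data; real training would use
# captured pose sequences of actions such as sitting down or picking up items.
model = NextPosePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    pose = torch.randn(32, POSE_DIM)       # current-frame poses (batch of 32)
    goal = torch.randn(32, GOAL_DIM)       # interaction targets
    next_pose = torch.randn(32, POSE_DIM)  # ground-truth next-frame poses
    loss = loss_fn(model(pose, goal), next_pose)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In practice, a trained model of this sort would be queried every frame at runtime, so the character's motion is generated on the fly rather than played back from fixed animation clips.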


In addition to producing more realistic animation, the team notes that this network could help reduce file sizes. That could be particularly useful as games move toward streaming, such as on Google's upcoming Stadia platform.
