Researchers from Adobe and the University of Edinburgh are developing systems that help video game characters move as naturally as humans. As the teams note, the motion-capture technology commonly used in video games cannot cover every interaction a digital character has with its world.
With this in mind, the researchers use a deep neural network to produce more realistic representations of movement. To start, the teams recorded and studied various actions, such as picking up objects, climbing, and sitting down, all performed by an actor on a stage. From there, the neural network can take what it has learned and adapt it to almost any new situation and environment.
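At a high level, such a system learns a mapping from a character's situation (for instance, its distance to a target object) to the motion it should produce, trained on captured examples and then queried in situations it never saw. The toy sketch below is purely illustrative and is not the researchers' actual model: it fits a one-dimensional linear motion predictor to a few hand-made "captured" samples by gradient descent, then generalizes to an unseen input.

```python
# Toy illustration (NOT the actual research model): learn a step length
# from a distance-to-target, using a few hand-made "motion capture" samples.

# Hypothetical training data: (distance to target, step length taken).
samples = [(0.5, 0.25), (1.0, 0.5), (2.0, 1.0), (4.0, 2.0)]

w, b = 0.0, 0.0   # parameters of a 1-D linear predictor
lr = 0.01         # learning rate

# Plain gradient descent on mean squared error.
for _ in range(5000):
    gw = gb = 0.0
    for x, y in samples:
        err = (w * x + b) - y
        gw += 2 * err * x
        gb += 2 * err
    w -= lr * gw / len(samples)
    b -= lr * gb / len(samples)

# Query a distance never seen during "capture": the model generalizes.
print(round(w * 3.0 + b, 2))
```

The real system works with full-body poses and far richer inputs, but the principle is the same: once trained on recorded examples, the model answers queries for situations that were never explicitly captured.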
Beyond producing more realistic animation, the network can also help reduce file sizes, the team notes. This could prove particularly useful as games move toward streaming, for example on Google's upcoming Stadia platform.