METATRON is the AI/deep-reinforcement-learning agent at the center of every experience: a highly intelligent AI spanning VR, AR, and MR that immerses players and visitors completely in all of our worlds.
If you have ever played a video game, you have interacted with artificial intelligence (AI). Regardless of whether you prefer race-car games like Need for Speed, strategy games like Civilization, or shooting games like Counter Strike, you will always find elements controlled by AI. AIs are often behind the characters you typically don’t pay much attention to, such as enemy creeps, neutral merchants, or even animals.
Granted, this isn’t the dynamic AI found in things like self-driving cars, since game AI is largely programmed in advance. But how does the AI found in gaming relate to the AI that tech giants talk about every day?
Recently, Elon Musk warned that the rapid development of self-learning AI by companies such as Google and Facebook could put humanity in danger, an argument that has drawn considerable public attention to the topic of AI.
The flashy vision of AI described by these tech giants seems to be a program that can teach itself and grow stronger and stronger as it is fed more data.
This is true to some extent for AI like AlphaGo, which is famous for beating the best human Go players. AlphaGo was trained by observing millions of historical Go matches and is still learning from playing with human players online. However, the term “AI” in video game context is not limited to this self-teaching AI.
An obvious drawback of the Finite State Machine (FSM) design, in which each non-player character (NPC) switches among a fixed set of pre-programmed states, is its predictability. All NPC behaviors are decided at design time, so after playing an FSM-based game a few times, a player may lose interest.
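To make the predictability concrete, here is a minimal sketch of an FSM-driven NPC. The guard character, its states, and its transition rules are all hypothetical examples, not taken from any particular game; the point is that every transition is hard-coded in advance.

```python
from enum import Enum, auto

# Hypothetical guard NPC with three pre-programmed states.
class State(Enum):
    PATROL = auto()
    CHASE = auto()
    ATTACK = auto()

class GuardNPC:
    def __init__(self):
        self.state = State.PATROL

    def update(self, player_visible: bool, player_in_range: bool) -> State:
        # Every transition rule is fixed at design time -- hence the
        # predictability the text describes.
        if self.state == State.PATROL and player_visible:
            self.state = State.CHASE
        elif self.state == State.CHASE:
            if player_in_range:
                self.state = State.ATTACK
            elif not player_visible:
                self.state = State.PATROL
        elif self.state == State.ATTACK and not player_in_range:
            self.state = State.CHASE
        return self.state

npc = GuardNPC()
print(npc.update(player_visible=True, player_in_range=False))  # State.CHASE
print(npc.update(player_visible=True, player_in_range=True))   # State.ATTACK
```

Because the transition table never changes, a player who has seen each state once can anticipate the guard’s every reaction.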
A more advanced method used to enhance the personalized gaming experience is Monte Carlo Tree Search (MCTS), which embodies the strategy of using random trials to evaluate moves. The same game-tree idea powered Deep Blue, the first computer program to defeat a reigning human chess champion, in 1997 (Deep Blue exhaustively searched the tree with minimax and pruning, whereas MCTS samples it with random playouts). At each point in the game, such a program first considers all the possible moves it could make, then all the possible human responses, then all of its own possible replies, and so on. You can imagine the possible moves branching out like limbs from a trunk, which is why we call it a “search tree”.
After simulating many of these playouts, the AI calculates the expected payoff of each branch and follows the best one. After making a real move, it builds a new search tree from the positions that are still possible. In video games, an AI with an MCTS design can evaluate thousands of possible moves and choose the ones with the best payoff (such as more gold).
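The random-trial idea at the heart of MCTS can be sketched in a few lines. The example below is a simplified, flat Monte Carlo evaluation (random playouts from each candidate move, no tree statistics) on a toy Nim game I’ve made up for illustration: 21 stones, each player removes 1–3, and whoever takes the last stone wins.

```python
import random

random.seed(0)

def legal_moves(stones):
    return [m for m in (1, 2, 3) if m <= stones]

def rollout(stones, my_turn):
    # Play random moves to the end; return 1 if "we" win, else 0.
    while stones > 0:
        stones -= random.choice(legal_moves(stones))
        my_turn = not my_turn
    # Whoever moved last took the final stone and won.
    return 0 if my_turn else 1

def best_move(stones, trials=2000):
    # Score each candidate move by the average outcome of random playouts.
    scores = {}
    for m in legal_moves(stones):
        wins = sum(rollout(stones - m, my_turn=False) for _ in range(trials))
        scores[m] = wins / trials
    return max(scores, key=scores.get)

print(best_move(21))  # move with the best average payoff under random play
```

Full MCTS refines this by growing an actual tree, reusing statistics across moves, and balancing exploration against exploitation, but the core loop of simulate, score, and pick the best branch is the same.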
NATURAL RECORDS STUDIOS’ AI development in video games focuses on applying neural networks to complex signal-processing and pattern-recognition problems within the game, with the goal of delivering a better and more unique user experience.
In the V3 Drones Over New York (Manhattan level), we use Q-learning to build a Q-table of states and actions that gives our creatures (ranging from birds to mutated tarantulas) realistic animal behavior. They switch among attack, exploit, and explore states to mimic their real-world counterparts, choosing whichever state best fits the situation at hand: if a player is on the loose, a creature will exploit its environment to reach its enemy as quickly as possible.
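The mechanics of a Q-table can be shown with a toy example. The sketch below is illustrative only, not the studio’s actual code: a creature on a one-dimensional track of five cells learns, by tabular Q-learning with an epsilon-greedy policy, to move toward a player fixed at the last cell. The environment, rewards, and hyperparameters are all assumptions chosen for the demo.

```python
import random

random.seed(1)
N = 5                    # positions 0..4; the player sits at cell 4
ACTIONS = [-1, +1]       # step left / step right
alpha, gamma, eps = 0.5, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}

def step(s, a):
    s2 = max(0, min(N - 1, s + a))
    reward = 10.0 if s2 == N - 1 else -1.0   # payoff for reaching the player
    return s2, reward, s2 == N - 1

for episode in range(500):
    s, done = 0, False
    while not done:
        # Epsilon-greedy: explore occasionally, otherwise exploit the Q-table.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r, done = step(s, a)
        # Standard Q-learning update rule.
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N - 1)]
print(policy)  # greedy action per non-terminal cell; all +1 after training
```

A real creature’s Q-table works the same way, only with richer states (player distance, health, terrain) and actions (attack, exploit, explore), so the balance between trying new behaviors and repeating rewarding ones is learned rather than scripted.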
As Virtual Reality (VR, which provides an immersive viewing experience by means of a display) and Augmented Reality (AR, which combines a human’s physical view of the world with virtual elements) technologies continue to expand, the boundary between the virtual and real world is beginning to blur.
Pokémon Go, the most famous AR game, demonstrated for the first time the compelling power of combining the real world with the video-game world. NATURAL RECORDS STUDIOS’ VR- and AR-based open-world video games provide players with a “real world” experience, perhaps similar to the one imagined by the TV series “Westworld,” in which human players can do whatever they want with AI-controlled robots and the experience feels exactly like the real world.
With the increasing capability of natural language processing, one day human players may not be able to tell whether a character in a video game is controlled by an AI or by another human player.
Andrew Wilson, the CEO of Electronic Arts, famously predicted that “your life will be a video game.” As AI-VR/AR technology matures and immerses us in an increasingly virtual world, his vision may actually come true. In that case, do you think you would prefer playing with an AI or with a real person? That will become an increasingly pertinent question.
This page was based on the study and article written by Harbing Lou. You can visit his website at https://projects.iq.harvard.edu/harbing.