Artificial intelligence in video games

Arthur Samuel's checkers program, developed in the mid-1950s and early 1960s, eventually achieved sufficient skill to challenge a respectable amateur.

It was during the golden age of arcade video games that the idea of AI opponents was largely popularized, thanks to the success of Space Invaders (1978), which sported an increasing difficulty level, distinct movement patterns, and in-game events dependent on hash functions of the player's input.

Later sports titles allowed users to "tune" variables in the AI to produce a player-defined managerial or coaching strategy.

Herzog Zwei (1989), for example, had nearly broken pathfinding and only very basic three-state state machines for unit control, while Dune II (1992) sent attackers toward the player's base in a beeline and relied on numerous cheats.
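A three-state unit controller of the kind described above can be sketched as a minimal finite-state machine; the state names and transition rules below are illustrative assumptions, not taken from Herzog Zwei's actual code.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    MOVE = auto()
    ATTACK = auto()

class UnitAI:
    """Minimal three-state machine for unit control (illustrative)."""
    def __init__(self):
        self.state = State.IDLE

    def update(self, enemy_in_range: bool, has_destination: bool) -> State:
        if enemy_in_range:
            self.state = State.ATTACK   # combat takes priority
        elif has_destination:
            self.state = State.MOVE     # head toward the current goal
        else:
            self.state = State.IDLE     # nothing to do
        return self.state

unit = UnitAI()
print(unit.update(enemy_in_range=False, has_destination=True))   # State.MOVE
print(unit.update(enemy_in_range=True, has_destination=True))    # State.ATTACK
```

Early RTS unit AI rarely went beyond this kind of priority-ordered transition table, which is one reason its behavior was easy to exploit.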

In recent years, the idea of "hunting" has been introduced; in this 'hunting' state the AI will look for realistic markers, such as sounds made by the character or footprints they may have left behind.
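A "hunting" state of this kind needs a way to decide which marker to investigate first. The sketch below scores hypothetical sound and footprint markers by distance, age, and strength; the tuple layout and weighting are invented for illustration, not drawn from any particular game.

```python
import math

def pick_search_point(ai_pos, markers):
    """Choose which sensory marker to investigate next: prefer recent,
    nearby, strong ones. `markers` is a list of (x, y, age_seconds,
    strength) tuples; the scoring formula is illustrative."""
    def score(m):
        x, y, age, strength = m
        dist = math.hypot(x - ai_pos[0], y - ai_pos[1])
        return strength / (1.0 + dist + age)   # fresher, closer, louder wins
    return max(markers, key=score) if markers else None

# A footstep heard just now beats a stale footprint far away.
markers = [(10, 0, 1.0, 5.0),    # recent sound nearby
           (50, 50, 30.0, 2.0)]  # old footprint far off
print(pick_search_point((0, 0), markers))  # → (10, 0, 1.0, 5.0)
```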

Another side-effect of combat AI occurs when two AI-controlled characters encounter each other; first popularized in the id Software game Doom, so-called 'monster infighting' can break out in certain situations.

In the case of Doom, published gameplay manuals even suggest taking advantage of monster infighting in order to survive certain levels and difficulty settings.

Developers used a pathfinding algorithm trained with a data set of real maps to create road networks that would weave through handcrafted villages within the game world.

An example is the 2013 adventure game Proteus, in which an algorithm dynamically adapts the music based on the angle from which the player is viewing the in-game landscape.
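An angle-driven music parameter of the sort described above can be sketched as a simple mapping from camera heading to intensity; the "focus" direction and linear falloff below are assumptions for illustration, not Proteus's actual algorithm.

```python
def music_intensity(view_angle_deg, focus_angle_deg=90.0):
    """Map a camera heading (degrees) to a 0..1 music-intensity value:
    the closer the player looks toward the focus direction, the higher
    the intensity. Purely illustrative."""
    # Smallest angular difference, wrapped into [0, 180]
    diff = abs((view_angle_deg - focus_angle_deg + 180.0) % 360.0 - 180.0)
    return 1.0 - diff / 180.0

print(music_intensity(90.0))    # looking straight at the focus → 1.0
print(music_intensity(270.0))   # looking directly away → 0.0
```

In practice a value like this would be fed to the audio engine each frame as a crossfade or filter parameter.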

Recent breakthroughs in AI have resulted in the creation of advanced tools that are capable of creating music and sound based on evolving factors with minimal developer input.

MetaComposure is an evolutionary algorithm designed to generate original music compositions during real-time gameplay to match the current mood of the environment.

Research indicates a significant positive correlation between player-rated game engagement and dynamically generated musical compositions that accurately match the player's current emotions.

The Monte Carlo tree search method[38] provides a more engaging game experience by creating additional obstacles for the player to overcome.
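Monte Carlo tree search itself follows four standard phases (selection, expansion, simulation, backpropagation). The sketch below applies them to a toy Nim-style game (take 1 or 2 stones; whoever takes the last stone wins) rather than any commercial title; constants such as the exploration weight are illustrative.

```python
import math, random

class Node:
    def __init__(self, stones, player, parent=None, move=None):
        self.stones, self.player = stones, player   # stones left; player to move
        self.parent, self.move = parent, move
        self.children, self.visits, self.wins = [], 0, 0.0

    def untried_moves(self):
        tried = {c.move for c in self.children}
        return [m for m in (1, 2) if m <= self.stones and m not in tried]

def ucb1(child, parent_visits, c=1.4):
    """Upper Confidence Bound: balance exploitation and exploration."""
    return child.wins / child.visits + c * math.sqrt(math.log(parent_visits) / child.visits)

def mcts(stones, player, iterations=2000):
    """Monte Carlo tree search for toy Nim (take 1 or 2; last take wins)."""
    root = Node(stones, player)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend while fully expanded
        while not node.untried_moves() and node.children:
            node = max(node.children, key=lambda c: ucb1(c, node.visits))
        # 2. Expansion: try one new move, if any remain
        moves = node.untried_moves()
        if moves:
            m = random.choice(moves)
            child = Node(node.stones - m, 1 - node.player, node, m)
            node.children.append(child)
            node = child
        # 3. Simulation: random playout to the end of the game
        stones_left, to_move = node.stones, node.player
        while stones_left > 0:
            take = random.choice([m for m in (1, 2) if m <= stones_left])
            stones_left -= take
            to_move = 1 - to_move
        winner = 1 - to_move   # the player who took the last stone
        # 4. Backpropagation: credit wins to the player who moved into each node
        while node:
            node.visits += 1
            if winner != node.player:
                node.wins += 1
            node = node.parent
    return max(root.children, key=lambda c: c.visits).move

random.seed(0)
print(mcts(5, player=0, iterations=3000))  # optimal move is 2 (leave 3 stones)
```

The same loop scales to far larger game trees, which is what makes MCTS attractive for adaptive opponents: the search budget, not hand-written rules, determines how challenging the AI is.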

For instance, an NPC can provide critical information, offer quests, or simply populate the world to add a sense of realism to the game.

Additionally, their role as quest-givers or merchants makes them integral to the gameplay loop, giving players access to resources, missions, or services that enable further progression.

NPCs in modern video games can now react to player actions with increased sophistication, such as adjusting their tactics in combat or changing their dialogue based on past interactions.

By using deep learning algorithms these systems emulate human-like decision-making, thus making NPCs feel more like real people rather than static game elements.

This development has increased the depth and immersion of player-NPC interactions, as players can now engage in more complex dialogues that affect the storyline and gameplay outcomes.

Deep learning allows NPCs to process large amounts of data and adapt to player strategies, making interactions with them less predictable and more varied.

Designing NPCs capable of adapting to such variability requires complex AI models that can account for numerous possible interactions, which can be resource-intensive and time-consuming for developers.

Believing that the Atari 8-bit could not compete against a human player, Chris Crawford did not fix a bug in Eastern Front (1941) that benefited the computer-controlled Russian side.

Common variations include giving AIs higher speeds in racing games to catch up to the player or spawning them in advantageous positions in first-person shooters.
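Speed-based "rubber banding" of this kind can be sketched as a boost on the AI's top speed proportional to how far it trails the player, clamped to a maximum; the gain and cap constants below are illustrative assumptions.

```python
def rubber_band_speed(base_speed, ai_pos, player_pos, gain=0.02, cap=1.25):
    """Racing-game rubber band: scale the AI's top speed by how far it is
    behind the player, never exceeding `cap` times base speed. Constants
    are illustrative, not from any particular game."""
    gap = player_pos - ai_pos            # positive when the AI trails
    boost = 1.0 + max(0.0, gap) * gain   # never slow the AI below base here
    return min(base_speed * boost, base_speed * cap)

print(rubber_band_speed(100.0, ai_pos=40.0, player_pos=50.0))   # 120.0
print(rubber_band_speed(100.0, ai_pos=0.0, player_pos=100.0))   # capped at 125.0
```

A symmetrical version would also slow the AI when it leads, keeping the race artificially close in both directions.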

The Replicas are capable of utilizing the game environment to their advantage, such as overturning tables and shelves to create cover, opening doors, crashing through windows, or even noticing (and alerting the rest of their comrades to) the player's flashlight.

In addition, the AI is also capable of performing flanking maneuvers, using suppressing fire, throwing grenades to flush the player out of cover, and even playing dead.

The various encountered enemies (if the difficulty level is set to its highest) use combat tactics and behaviors such as healing wounded allies, giving orders, out-flanking the player and using weapons with pinpoint accuracy.

The 2010 real-time strategy game StarCraft II: Wings of Liberty gives the player control of one of three factions in a 1v1, 2v2, or 3v3 battle arena.

NPCs in the game display complex and varied behaviors based on a wide range of factors including their environment, player interactions, and time of day.

For instance, AI systems now utilize sophisticated techniques such as decision trees and state machines to enhance NPC interactions and realism, as discussed in "Artificial Intelligence in Games".
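A hand-authored decision tree for NPC behavior might look like the following; the thresholds, inputs, and action names are invented for illustration, not taken from any specific title.

```python
def npc_action(health, enemy_visible, enemy_distance, has_ammo):
    """Simple decision tree for an NPC: each branch tests one condition
    and either returns an action or descends further. Thresholds are
    illustrative."""
    if health < 0.25:
        return "retreat"                 # survival outranks everything
    if not enemy_visible:
        return "patrol"
    if not has_ammo:
        return "melee" if enemy_distance < 2.0 else "find_ammo"
    return "shoot" if enemy_distance < 30.0 else "advance"

print(npc_action(0.9, True, 10.0, True))    # shoot
print(npc_action(0.1, True, 10.0, True))    # retreat
```

Trees like this are cheap to evaluate every frame, which is why they remain common even alongside heavier learned models.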

For example, recent research has explored the use of complex neural networks to enable NPCs to learn and adapt their behavior based on player actions, enhancing the overall gaming experience.

Light cycle characters compete to be the last one riding in GLtron.
A robot goes for the ball and competes in RoboCup.