Project Report: FPS Game Development Using Unity
BHARATI VIDYAPEETH (DEEMED TO BE) UNIVERSITY,
COLLEGE OF ENGINEERING, PUNE- 43
CERTIFICATE
This is to certify that the project Stage 1 report titled FPS GAME DEVELOPMENT USING
UNITY WITH AI AND SIMULATION has been carried out by the following students in partial
fulfilment of the degree of BACHELOR OF TECHNOLOGY in Computer Engineering of
Bharati Vidyapeeth (Deemed to be) University, Pune, during the academic year 2018-2019.
Team:
1. AAGAT DIWAN (1914110454)
2. MOHIT MEHTA (1914110062)
3. PRATIK PANDEY (1914110073)
4. HIMANSHU SINGH (1914110045)
Place: Pune
Date:
ACKNOWLEDGEMENT
Success comes not only from hard work and innovation but also from inspiration and motivation.
Completing this project was never a one-person effort; it is the result of the invaluable
contributions of a number of individuals.
It gives me great pleasure to submit this project report on FPS GAME DEVELOPMENT
USING UNITY WITH AI AND SIMULATION. I wish to express my deep gratitude towards
my project guide, whose untiring efforts brought out the best in me. Her timely, invaluable
guidance and encouragement enabled me to complete this project with good results, and her
skilful mentoring has helped shape me as a professional, which will be very beneficial for my future.
Thanks also to my father, mother and friends and, last but not least, to my younger sister, who
always supported me with ideas and suggestions to make this project successful.
ABSTRACT
The project was carried out to develop a 3D game that could be used both for fun and as an efficient
teaching aid. The project took the form of a First-Person Shooting game in which the player has to
search for enemies on a terrain map and kill them by shooting. There are a few enemies on the terrain,
and they possess four kinds of Artificial Intelligence (AI) behaviour. The enemies discover the player
if the player enters their patrol range or shoots them, and they deal long-range damage to the player.
The player has to slay all the enemies to win the game; if the player is slain, the game is over.
Health Points (HP) are used to determine whether the player or an enemy has died, and the player
can restore HP by touching a Heal Box or completing a mission.
The game contains two main scenes: the Graphical User Interface (GUI) scene and the game scene
itself. From the GUI scene the player can turn the background music on or off, view information about
the controls and the author, and start or quit the game. Skybox, Rigidbody and Terrain features were
applied in the game, along with an interesting method of damage calculation.
The game engine used for the project was Unity 4, developed by Unity Technologies. It is a
general-purpose game engine that designers can use to develop 3D video games, architectural
visualisations and real-time 3D animations. Unity is also a cross-platform engine, supporting builds
for Windows, Mac, iOS, Android, the Web, Xbox 360, PS3, Wii, Flash Player and Google Native Client.
The project demonstrates the basic features of a First-Person Shooting game and the process of
3D game design with the Unity 4 game engine. All assets and scripts remain flexible for future
development.
TITLE PAGE I
CERTIFICATE II
ACKNOWLEDGEMENT III
ABSTRACT IV
LIST OF TABLES V
LIST OF FIGURES VI
CONCLUSION VII
REFERENCES VIII
CHAPTER 1 INTRODUCTION 1-4
1.1 Introduction 1
1.2 Unity 4 1
1.3 History of Video Games 2
LIST OF FIGURES
FIG NO. TITLE PAGE NO.
CHAPTER 1 - INTRODUCTION
1.1 Introduction
The project was carried out to develop a 3D game that could be used both for fun and as an
efficient teaching aid. The project took the form of a First-Person Shooting game in which the
player has to search for enemies on a terrain map and kill them by shooting. There are a few
enemies on the terrain, and they possess four kinds of Artificial Intelligence (AI) behaviour. The
enemies discover the player if the player enters their patrol range or shoots them, and they deal
long-range damage to the player. The player has to slay all the enemies to win the game; if the
player is slain, the game is over. Health Points (HP) are used to determine whether the player or
an enemy has died, and the player can restore HP by touching a Heal Box or completing a mission.
The game contains two main scenes: the Graphical User Interface (GUI) scene and the game scene
itself. From the GUI scene the player can turn the background music on or off, view information
about the controls and the author, and start or quit the game. Skybox, Rigidbody and Terrain
features were applied in the game, along with an interesting method of damage calculation.
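The HP rules described above can be sketched in a few lines. The numbers below (a maximum of 100 HP, the damage and heal amounts) are illustrative assumptions, not the game's actual values, and the sketch is in Python for illustration (the game itself is scripted in C#):

```python
class Health:
    """Minimal sketch of the Health Point (HP) rules: damage reduces HP,
    a Heal Box or completed mission restores it, and death occurs at zero."""

    def __init__(self, max_hp=100):
        self.max_hp = max_hp
        self.hp = max_hp

    def take_damage(self, amount):
        # HP never goes below zero; report whether this hit was fatal
        self.hp = max(0, self.hp - amount)
        return self.is_dead()

    def restore(self, amount):
        # Used when touching a Heal Box or finishing a mission;
        # HP never exceeds the maximum
        self.hp = min(self.max_hp, self.hp + amount)

    def is_dead(self):
        return self.hp <= 0


player = Health()
player.take_damage(70)   # hit by an enemy
player.restore(25)       # touched a Heal Box
print(player.hp)         # 55
```

The same clamp-to-range pattern applies to both the player and the enemies, so one component can serve both.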
1.2 Unity 4
The game engine used for the project was Unity 4, developed by Unity Technologies. It is a
general-purpose game engine that designers can use to develop 3D video games, architectural
visualisations and real-time 3D animations. Unity is also a cross-platform engine, supporting builds
for Windows, Mac, iOS, Android, the Web, Xbox 360, PS3, Wii, Flash Player and Google Native Client.
The project demonstrates the basic features of a First-Person Shooting game and the process of
3D game design with the Unity 4 game engine. All assets and scripts remain flexible for future
development.
For instance, Unity provides a built-in NavMesh system for pathfinding. Pathfinding is the process
of determining the best travel route between two points, ideally the one that takes the least time
to traverse. The Unity Asset Store is another case in point: it contains many reactive-system tools
such as FSMs and behaviour trees. For example, Candice AI provides components for handling player
detection and combat for game AI, and Breadcrumbs AI is another example. There are also
non-character-based AI tools, such as the Procedural Level Generator, which creates the level
around the player.
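Unity's NavMesh computes such routes over a mesh of walkable surfaces; the same idea can be sketched on a simple grid with the classic A* algorithm. The sketch below is illustrative Python (the game itself is scripted in C#), and the level layout is made up for the example:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 2D grid; 0 = walkable, 1 = wall.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(heuristic(start, goal), 0, start, [start])]
    seen = set()
    while open_set:
        _, cost, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                if nxt not in seen:
                    # f = cost so far + estimated remaining distance
                    heapq.heappush(open_set, (cost + 1 + heuristic(nxt, goal),
                                              cost + 1, nxt, path + [nxt]))
    return None

level = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
route = astar(level, (0, 0), (3, 3))
print(route)
```

A real NavMesh works on polygons rather than grid cells, but the cost-plus-heuristic search is the same idea.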
Our goal is to create a game in which the AI can match the experience of multiplayer. The methods
currently available in the game include the NavMesh system and the waypoint system. First, the
NavMesh system allows for pathfinding inside the generated level, which prevents the player from
walking straight through walls and objects in the environment. Second, the waypoint system allows
the escaper bot to take an unpredictable route to the end goal, imitating a real player's style of
play: full of randomness and not conforming to a fixed way of playing the game. The waypoint system
generates different routes for the escaper bot to take to reach the end goal; when the game starts,
it randomly chooses one path out of the many the developers encoded into the system. We therefore
believe that, using these simple tools, we can at least come close to emulating the multiplayer
experience in a single-player environment. Our system is a synthesis of the NavMesh system and the
waypoint system, allowing them to work together for pathfinding while other tools keep them separate.
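The random route selection described above can be sketched as follows. The route names and the route set are hypothetical, purely to illustrate choosing one pre-authored path at match start (Python for illustration):

```python
import random

# Hypothetical set of pre-authored routes; each is an ordered list of
# waypoint names ending at the goal (names are illustrative, not from
# the actual game).
ROUTES = [
    ["spawn", "courtyard", "tunnel", "goal"],
    ["spawn", "rooftop", "bridge", "goal"],
    ["spawn", "sewer", "back_alley", "goal"],
]

def pick_route(routes, rng=random):
    """At match start, choose one of the encoded routes at random,
    so the escaper bot does not follow a fixed, predictable path."""
    return list(rng.choice(routes))

def next_waypoint(route, current_index):
    """Advance along the chosen route; return None once the goal is reached."""
    if current_index + 1 < len(route):
        return route[current_index + 1]
    return None

route = pick_route(ROUTES)
print(route[0], "->", route[-1])
```

In the game, each waypoint would be a position handed to the NavMesh agent as its next destination, so the two systems cooperate: the waypoint system decides *where* to go, the NavMesh decides *how* to get there.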
In two application scenarios, we demonstrate how the above combination of techniques increases the
accuracy of the imitation, i.e. the ability of the AI to mimic the multiplayer experience.
First, we demonstrate the usefulness of our approach through a comprehensive case study comparing a
competitive AI with a competitive human player, in which the player plays against one opponent,
either the AI or another player, in a 1v1 Maze Escape game mode where one acts as the "Defender" and
the other as the "Escaper". Second, we analyse the result of our 1v1 competitive AI versus a
competitive player alongside a cooperative AI compared with a cooperative player.
• AI in FPS Games:
The incorporation of AI in FPS games has been a critical aspect of creating engaging and
challenging gameplay experiences. AI opponents are expected to exhibit intelligent behavior,
adapt to player actions, and provide a level of challenge comparable to human players.
Researchers have explored various approaches to achieve this, including reactive systems, finite
state machines (FSMs), behavior trees, and machine learning algorithms.
• Reactive Systems:
Reactive systems allow AI opponents to interact with player decisions and take
different actions to create diverse experiences. By leveraging reactive systems, developers
can design AI opponents that respond dynamically to changing circumstances and provide a
more immersive gameplay experience. These systems often utilize FSMs or behavior trees
to govern AI decision-making processes.
• Finite State Machines (FSMs):
FSMs have been widely used in game development to model AI behavior. FSMs
consist of different states representing various scenarios or actions, enabling AI opponents
to transition between states based on predefined conditions. This approach provides
developers with a structured framework to create AI behaviors that align with specific
gameplay situations, such as combat, exploration, or evasion.
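A minimal FSM of this kind can be sketched as a transition table. The three states and the transition conditions below are illustrative assumptions, not the exact rules used in any particular game (Python for illustration):

```python
# (current state, event) -> next state; any pair not listed leaves the
# state unchanged.
TRANSITIONS = {
    ("patrol", "player_spotted"): "chase",
    ("chase", "player_in_range"): "attack",
    ("chase", "player_lost"): "patrol",
    ("attack", "player_out_of_range"): "chase",
}

class EnemyFSM:
    """Finite state machine for an AI opponent with patrol/chase/attack states."""

    def __init__(self, initial="patrol"):
        self.state = initial

    def handle(self, event):
        # Transition only if the (state, event) pair is defined;
        # otherwise stay in the current state.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state


enemy = EnemyFSM()
enemy.handle("player_spotted")   # patrol -> chase
enemy.handle("player_in_range")  # chase -> attack
print(enemy.state)               # attack
```

Keeping the transitions in a table rather than in nested if-statements makes it easy for a designer to add states (e.g. evasion) without touching the update logic.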
FPS (First-Person Shooter) games are highly popular in the gaming industry, offering players
immersive and thrilling experiences. However, one common challenge faced by developers is
creating AI opponents that can match the complexity and enjoyment provided by multiplayer
gameplay. While multiplayer modes allow for dynamic interactions between human players, the
AI opponents in single-player modes often lack the same level of adaptability and unpredictability.
The existing AI systems used in FPS games, such as reactive systems, finite state machines
(FSMs), and behavior trees, have significantly improved the intelligence and decision-making
capabilities of AI opponents. These systems enable AI opponents to interact with player decisions
and take different actions, creating diverse experiences. However, the effectiveness of these AI
systems varies, and in some cases, they can take away from the overall enjoyment of the game
rather than enhancing it. In addition, the current tools and techniques available for AI development
in Unity, a widely used game development platform, often require significant manual effort and
expertise. While Unity provides a built-in nav-mesh system for pathfinding and the Unity Asset
Store offers various AI packages and tools, developers still face challenges in creating AI
opponents that can mimic the multiplayer experience in single-player environments.
Therefore, the problem addressed in this study is the need for improved AI systems and tools in
FPS game development using Unity to enable AI opponents that match the experience and
complexity of multiplayer gameplay. This includes the need for AI opponents that exhibit adaptive
and unpredictable behaviors, enhance player engagement, and provide challenging encounters in
single-player modes. Additionally, there is a need for more efficient and user-friendly tools within
Unity to facilitate the development of intelligent and dynamic AI opponents, reducing the manual
effort and expertise required. Addressing this problem will not only enhance the gameplay
experience for single-player FPS games but also provide developers with better tools and
techniques to create immersive and engaging AI opponents. By bridging the gap between
single-player and multiplayer experiences, players can enjoy challenging encounters and dynamic
gameplay, even in the absence of human opponents.
To address the challenges outlined in the problem statement, we propose the development of an
enhanced AI system for FPS games using Unity. The proposed system aims to provide AI
opponents in single-player modes that can closely match the experience and complexity of
multiplayer gameplay. The system will leverage existing AI techniques and tools, while also
introducing novel approaches to enhance the adaptability, unpredictability, and engagement of AI
opponents.
4.1 Unity
Unity is a cross-platform game engine first released by Unity Technologies in 2005. The focus
of Unity lies in the development of both 2D and 3D games and interactive content. Unity now
supports over 20 target platforms for deployment; its most popular platforms are PC, Android
and iOS. Unity features a complete toolkit for designing and building games, including
interfaces for graphics, audio and level-building tools, requiring minimal use of external
programs to work on projects. To create content with Unity, the main requirement is to
download the Unity engine and development environment. Along with the core engine, you may
also download optional modules for deploying to various platforms, as well as tools for
integrating Unity scripting into Visual Studio. Unity is equally suited to both 2D and 3D games.
All games made in Unity start out as Projects from the Startup Screen.
4.2 PREFABS
"Prefab" is short for "prefabricated", which means "made beforehand". Prefabricated things are
made in sections that can be easily shipped and put together to form a finished product: some
buildings and houses are prefab, shelters have been made from converted shipping containers, and
flat-pack furniture such as a bookshelf sold in sections works the same way. In Unity, a Prefab
follows the same idea: it is a reusable GameObject asset, configured once with its components and
property values, from which any number of identical instances can be created in a scene.
5.1 UNITY
Unity is a cross-platform game engine first released by Unity Technologies in 2005. The focus
of Unity lies in the development of both 2D and 3D games and interactive content. Unity now
supports over 20 target platforms for deployment; its most popular platforms are PC, Android
and iOS.
5.1.2 PREFABS
"Prefab" is short for "prefabricated", which means "made beforehand": prefab things are made in
sections that can be easily shipped and put together to form a finished product. In Unity, a
Prefab is such a reusable, pre-configured GameObject asset.
5.1.4 SCRIPTING
Scripting is an essential ingredient in all applications you make in Unity. Most applications
need scripts to respond to input from the player and to arrange for events in the gameplay
to happen when they should. Beyond that, scripts can be used to create graphical effects,
control the physical behaviour of objects or even implement a custom AI system for
characters in the game. Scripting tells our GameObjects how to behave; it is the scripts and
components attached to the GameObjects, and how they interact with each other, that
create your gameplay. Scripting in Unity is different from pure programming: if you have done
some pure programming, e.g. created a running app, you should realise that in Unity you do not
need to write the code that runs the application, because Unity does it for you. Instead, you
focus on the gameplay in your scripts. Unity runs in a big loop: it reads all of the data in a
game scene, for example the lights, the meshes and the behaviours, and processes all of this
information for you.
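The "big loop" described above can be modelled in a few lines. The class and method names below echo Unity's conventions (a GameObject carrying behaviours whose update method is called every frame), but this is an illustrative Python model of the idea, not Unity's actual implementation:

```python
class GameObject:
    """An object in the scene; it carries a list of attached behaviours."""

    def __init__(self, name):
        self.name = name
        self.behaviours = []

    def add(self, behaviour):
        behaviour.game_object = self
        self.behaviours.append(behaviour)
        return behaviour


class Scene:
    """The engine, not your code, owns the loop."""

    def __init__(self):
        self.objects = []

    def run(self, frames):
        # Each frame, the engine reads every object in the scene and
        # calls each attached behaviour's update() for you.
        for _ in range(frames):
            for obj in self.objects:
                for behaviour in obj.behaviours:
                    behaviour.update()


class Mover:
    """A 'script': it only reacts to the frame tick instead of running
    its own loop."""

    def __init__(self, speed):
        self.speed = speed
        self.x = 0.0

    def update(self):
        self.x += self.speed


scene = Scene()
player = GameObject("Player")
mover = player.add(Mover(speed=1.5))
scene.objects.append(player)
scene.run(frames=4)
print(mover.x)  # 6.0
```

This inversion of control is why a Unity script contains methods like Update() rather than a main() function: the engine calls into your code, not the other way round.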
5.1.5 ANIMATION
Unity's animation features include retargetable animations, full control of animation weights
at runtime, event calling from within the animation playback, sophisticated state machine
hierarchies and transitions, blend shapes for facial animations, and much more. Animation is
one of the two components of a game that bring it to life (the other being audio).
Unity's animation system is called Mecanim, and its power lies in bringing humanoid models
to life. Earlier versions used a separate component called "Animation", but it has been
deprecated in recent versions of Unity.
5.2 FPS
The first-person shooter genre has been traced back to Wolfenstein 3D (1992), which has been
credited with creating the genre's basic archetype upon which subsequent titles were based. One
such title, and the progenitor of the genre's wider mainstream acceptance and popularity, was
Doom (1993), often considered the most influential game in this genre; for some years, the term
Doom clone was used to designate this genre due to Doom's influence. Corridor shooter was
another common name for the genre in its early years, since processing limitations of the era's
hardware meant that most of the action had to take place in enclosed areas, such as cramped
corridors and tunnels. 1998's Half-Life—along with its 2004 sequel Half-Life 2—enhanced the
narrative and puzzle elements. In 1999, the Half-Life mod Counter-Strike was released and,
together with Doom, is perhaps one of the most influential first-person shooters.
GoldenEye 007, released in 1997, was a landmark first-person shooter for home consoles, while
the Halo series heightened the console's commercial and critical appeal as a platform for
first-person shooter titles. In the 21st century, the first-person shooter has become the most
commercially viable video game genre, and in 2016, shooters accounted for over 27% of all video
game sales. The earliest FPS games were developed in the early 1970s; among the first were
Maze War and Spasim. FPS games are unlike third-person shooters, in which the player can see
(usually from behind) the character they are controlling.
The most challenging task of all was figuring out how to make the AI behave with objectives similar
to those of a real player. Despite the random waypoints that were intended to prevent fixed AI
behaviour, the escaper bots sometimes failed to head toward the ultimate objective and kept picking
the same course. Furthermore, we had trouble adding a "wandering" feature to make the defender bots
move around the level: they were unresponsive and remained stationary for the entire game. A related
problem was that the escapers did not attempt to flee from the defender when it widened its field of
vision, and simply stood there.
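The intended flee behaviour that failed to trigger can be sketched as steering directly away from the defender: take the direction from the defender to the escaper and pick a destination further along it, which would then be handed to the navigation system. This is an illustrative 2D Python sketch, and the flee distance is an assumption:

```python
import math

def flee_destination(escaper, defender, flee_distance=8.0):
    """Pick a point 'flee_distance' away from the escaper, along the
    direction pointing from the defender towards the escaper.
    Positions are 2D (x, y) tuples."""
    dx = escaper[0] - defender[0]
    dy = escaper[1] - defender[1]
    # Normalise the away-from-defender direction; guard against the
    # degenerate case where the two positions coincide.
    length = math.hypot(dx, dy) or 1.0
    return (escaper[0] + dx / length * flee_distance,
            escaper[1] + dy / length * flee_distance)

dest = flee_destination((3.0, 0.0), (0.0, 0.0))
print(dest)  # (11.0, 0.0)
```

In the actual game this destination would be clamped to the nearest walkable point on the NavMesh before being assigned to the agent.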
Code (an excerpt from the game's EnemyController script; elided portions are marked in comments,
and some fields, events and methods referenced below, such as DetectionModule, PatrolPath,
onLostTarget, DeathVfx and the weapon list, are declared in the elided sections):

using System.Collections.Generic;
using Unity.FPS.Game;
using UnityEngine;
using UnityEngine.AI;
using UnityEngine.Events;

namespace Unity.FPS.AI
{
    [RequireComponent(typeof(Health), typeof(Actor), typeof(NavMeshAgent))]
    public class EnemyController : MonoBehaviour
    {
        [System.Serializable]
        public struct RendererIndexData
        {
            public Renderer Renderer;
            public int MaterialIndex;

            public RendererIndexData(Renderer renderer, int index)
            {
                Renderer = renderer;
                MaterialIndex = index;
            }
        }

        [Header("Parameters")]
        [Tooltip("The Y height at which the enemy will be automatically killed (if it falls off of the level)")]
        public float SelfDestructYHeight = -20f;

        [Tooltip("The distance at which the enemy considers that it has reached its current path destination point")]
        public float PathReachingRadius = 2f;

        [Tooltip("Delay after death where the GameObject is destroyed (to allow for animation)")]
        public float DeathDuration = 0f;

        [Header("Flash on hit")]
        [Tooltip("The material used for the body of the hoverbot")]
        public Material BodyMaterial;

        [Header("Debug Display")]
        [Tooltip("Color of the sphere gizmo representing the path reaching range")]
        public Color PathReachingRangeColor = Color.yellow;

        public NavMeshAgent NavMeshAgent { get; private set; }

        List<RendererIndexData> m_BodyRenderers = new List<RendererIndexData>();
        RendererIndexData m_EyeRendererData;
        MaterialPropertyBlock m_EyeColorMaterialPropertyBlock;
        int m_PathDestinationNodeIndex;
        EnemyManager m_EnemyManager;
        ActorsManager m_ActorsManager;
        Health m_Health;
        Actor m_Actor;
        Collider[] m_SelfColliders;
        GameFlowManager m_GameFlowManager;
        bool m_WasDamagedThisFrame;
        float m_LastTimeWeaponSwapped = Mathf.NegativeInfinity;
        int m_CurrentWeaponIndex;
        WeaponController m_CurrentWeapon;
        WeaponController[] m_Weapons;
        NavigationModule m_NavigationModule;

        void Start()
        {
            m_EnemyManager = FindObjectOfType<EnemyManager>();
            DebugUtility.HandleErrorIfNullFindObject<EnemyManager, EnemyController>(m_EnemyManager, this);

            m_ActorsManager = FindObjectOfType<ActorsManager>();
            DebugUtility.HandleErrorIfNullFindObject<ActorsManager, EnemyController>(m_ActorsManager, this);

            m_EnemyManager.RegisterEnemy(this);

            m_Health = GetComponent<Health>();
            DebugUtility.HandleErrorIfNullGetComponent<Health, EnemyController>(m_Health, this, gameObject);

            m_Actor = GetComponent<Actor>();
            DebugUtility.HandleErrorIfNullGetComponent<Actor, EnemyController>(m_Actor, this, gameObject);

            NavMeshAgent = GetComponent<NavMeshAgent>();
            m_SelfColliders = GetComponentsInChildren<Collider>();

            m_GameFlowManager = FindObjectOfType<GameFlowManager>();
            DebugUtility.HandleErrorIfNullFindObject<GameFlowManager, EnemyController>(m_GameFlowManager, this);

            // Take the movement parameters from an optional NavigationModule child, if present
            var navigationModules = GetComponentsInChildren<NavigationModule>();
            if (navigationModules.Length > 0)
            {
                m_NavigationModule = navigationModules[0];
                NavMeshAgent.speed = m_NavigationModule.MoveSpeed;
                NavMeshAgent.angularSpeed = m_NavigationModule.AngularSpeed;
                NavMeshAgent.acceleration = m_NavigationModule.Acceleration;
            }

            // Remember which renderers use the body material, so they can be flashed on hit
            foreach (var renderer in GetComponentsInChildren<Renderer>(true))
            {
                for (int i = 0; i < renderer.sharedMaterials.Length; i++)
                {
                    if (renderer.sharedMaterials[i] == BodyMaterial)
                    {
                        m_BodyRenderers.Add(new RendererIndexData(renderer, i));
                    }
                }
            }
        }

        void Update()
        {
            EnsureIsWithinLevelBounds();
            DetectionModule.HandleTargetDetection(m_Actor, m_SelfColliders);
            m_WasDamagedThisFrame = false;
        }

        // ... EnsureIsWithinLevelBounds() and related members elided ...

        void OnLostTarget()
        {
            onLostTarget.Invoke();

            // Set the eye back to its default color if the eye renderer is set
            if (m_EyeRendererData.Renderer != null)
            {
                m_EyeColorMaterialPropertyBlock.SetColor("_EmissionColor", DefaultEyeColor);
                m_EyeRendererData.Renderer.SetPropertyBlock(m_EyeColorMaterialPropertyBlock,
                    m_EyeRendererData.MaterialIndex);
            }
        }

        void OnDetectedTarget()
        {
            onDetectedTarget.Invoke();

            // Set the eye attack color if the eye renderer is set
            if (m_EyeRendererData.Renderer != null)
            {
                m_EyeColorMaterialPropertyBlock.SetColor("_EmissionColor", AttackEyeColor);
                m_EyeRendererData.Renderer.SetPropertyBlock(m_EyeColorMaterialPropertyBlock,
                    m_EyeRendererData.MaterialIndex);
            }
        }

        bool IsPathValid()
        {
            return PatrolPath && PatrolPath.PathNodes.Count > 0;
        }

        // ... patrol-path following (m_PathDestinationNodeIndex) and damage
        //     handling (onDamaged, m_LastTimeDamaged, m_WasDamagedThisFrame)
        //     elided ...

        void OnDie()
        {
            // Spawn a particle system when dying
            var vfx = Instantiate(DeathVfx, DeathVfxSpawnPoint.position, Quaternion.identity);
            Destroy(vfx, 5f);

            // Drop a loot object
            if (TryDropItem())
            {
                Instantiate(LootPrefab, transform.position, Quaternion.identity);
            }
        }

        void OnDrawGizmosSelected()
        {
            // Path reaching range
            Gizmos.color = PathReachingRangeColor;
            Gizmos.DrawWireSphere(transform.position, PathReachingRadius);

            if (DetectionModule != null)
            {
                // Detection range
                Gizmos.color = DetectionRangeColor;
                Gizmos.DrawWireSphere(transform.position, DetectionModule.DetectionRange);

                // Attack range
                Gizmos.color = AttackRangeColor;
                Gizmos.DrawWireSphere(transform.position, DetectionModule.AttackRange);
            }
        }

        // ... attack logic (OrientWeaponsTowards, didFire), loot drop chance
        //     (Random.value <= DropRate) and weapon management
        //     (FindAndInitializeAllWeapons, GetCurrentWeapon, SetCurrentWeapon,
        //     m_LastTimeWeaponSwapped) elided ...
    }
}
SCREENSHOTS
Chapter 8 – TESTING
Testing of the FPS shooting game involved a comprehensive evaluation of its functionality, performance,
and user experience. The testing process aimed to identify and address any bugs, glitches, or design flaws
that could affect the game's stability and gameplay.
1. Functionality Testing:
• Game Mechanics: Testers ensured that all game mechanics, such as player movement, enemy
AI behaviour, shooting accuracy, and health point calculation, worked as intended.
• Object Interactions: Interactions between the player and various in-game objects, such as heal
boxes and mission triggers, were tested to ensure they functioned correctly.
• Enemy AI: The behaviour of enemies in different scenarios, such as patrolling, detecting the
player, and engaging in combat, was tested to verify the AI's effectiveness and responsiveness.
• Game Progression: Testers verified that the game progressed appropriately based on the player's
actions, such as defeating enemies and completing missions.
2. Performance Testing:
• Frame Rate: The game's frame rate was measured to ensure smooth and consistent gameplay
across different hardware configurations.
• Memory Usage: Testers monitored the game's memory usage to identify any excessive memory
leaks or inefficient resource management.
• Loading Times: Loading times for different game scenes and assets were evaluated to ensure
optimal performance and minimise player frustration.
4. Compatibility Testing:
• Platforms: The game was tested on various target platforms (e.g., Windows, Mac, iOS, Android)
to ensure compatibility and functionality across different devices and operating systems.
• Hardware Configurations: Testers validated the game's performance on different hardware
setups to identify any issues related to specific graphics cards, processors, or other hardware
components.
Test Cases:

TC003 – Shooting Accuracy
  Steps: Shoot enemies from varying distances and angles
  Expected Result: Bullets hit the target accurately
  Actual Result: Bullets hit the target accurately
  Status: Pass

TC004 – Health Point Calculation
  Steps: Player takes damage from enemy attacks
  Expected Result: Player health decreases based on the damage received
  Actual Result: Player health points decrease as expected
  Status: Pass

TC007 – Game Progression
  Steps: Defeat all enemies required to win the game
  Expected Result: Game ends with victory screen
  Actual Result: Game ends with victory screen
  Status: Pass

TC008 – Game Mechanics (GUI)
  Steps: Test various GUI elements and options
  Expected Result: GUI elements function correctly and provide the desired options
  Actual Result: GUI elements function correctly and provide the desired options
  Status: Pass

TC009 – Frame Rate
  Steps: Monitor and record the game's frame rate
  Expected Result: Game runs smoothly with a consistent frame rate
  Actual Result: Game maintains a consistent frame rate
  Status: Pass

TC010 – Loading Times
  Steps: Measure loading times for different game scenes
  Expected Result: Reasonable loading times for smooth transitions
  Actual Result: Loading times within acceptable range
  Status: Pass

TC011 – Controls and Input
  Steps: Test game controls with different input devices
  Expected Result: Controls are responsive and intuitive
  Actual Result: Controls are responsive and intuitive
  Status: Pass

TC012 – Graphics and Visuals
  Steps: Evaluate the quality and visual appeal of the game graphics
  Expected Result: Graphics are visually appealing and enhance immersion
  Actual Result: Graphics are visually appealing and enhance immersion
  Status: Pass

TC013 – Audio
  Steps: Test background music, sound effects and voiceovers
  Expected Result: Audio elements are synchronised and contribute to the game atmosphere
  Actual Result: Audio elements are synchronised and contribute to the game atmosphere
  Status: Pass

TC014 – User Interface
  Steps: Navigate the GUI, access settings and view information
  Expected Result: GUI is user-friendly and provides the necessary information
  Actual Result: GUI is user-friendly and provides the necessary information
  Status: Pass

TC015 – Platform Compatibility
  Steps: Test game functionality on different platforms
  Expected Result: Game runs smoothly and functions correctly on all tested platforms
  Actual Result: Game runs smoothly and functions correctly on all tested platforms
  Status: Pass

TC016 – Multiplayer Stability and Experience (if applicable)
  Steps: Test multiplayer functionality and observe network stability
  Expected Result: Multiplayer features work well with minimal latency
  Actual Result: Multiplayer features work well with minimal latency
  Status: Pass
CONCLUSION
In this report we describe features, such as artificial intelligence, that give our game the upper
hand over older video games, where such options were hard to find or, even where they were used,
the older game engines did not give the freedom that comes along with them. Pawn sensing, specific
death animations and collider adjustment were not found in older games. Sound plays a great role in
shooting games: the user can recognise certain actions with the help of the sounds used in our
game, which was not the case in older titles. 3D animation effects are what generally catch a
gamer's attention; in this report we describe the different animations given to the characters in
our game, which choose appropriate actions according to the events taking place. Today's games show
great advancement in graphics, physics and so on, and the gameplay perspective is also changing
according to the player, which is why we created an FPS version of the game that gives the gamer a
proper experience of the virtual world of a video game.
There are still many upgrades that can be made to the FPS game, because some restraints still
suppress its potential. For example, the emerging concept of full-dive immersion would truly define
FPS in its actual sense. Physics in game animation is still not smooth enough, not just to give the
player a proper game experience but also to make them understand and feel the game's concept. The
use of AI in real-time tasks and first-person shooter games concerns proposals of concepts that can
be used to improve and further develop artificial intelligence in computer games. The pace of
improvement of AI in games is perhaps low because the enemy or NPC characters in video games are
not challenging enough compared with online multiplayer games. Either way, game AI is an area that
is open for new developments.