AWS CLOUD VIRTUAL
An internship report submitted in partial fulfillment of the requirements for the award of the degree of
BACHELOR OF TECHNOLOGY
in
ELECTRICAL AND ELECTRONICS ENGINEERING
by
Y.SATISH
(20501A0298)
Under the Esteemed guidance of
Ms. V. Sai Geetha Lakshmi
Assistant Professor
PRASAD V. POTLURI SIDDHARTHA INSTITUTE OF TECHNOLOGY
AICTE Approved, NBA & NAAC accredited and ISO 9001:2015 certified Institution
KANURU, VIJAYAWADA-520007
2022 – 2023
CERTIFICATE
This is to certify that the internship report titled "AWS CLOUD VIRTUAL", submitted by Y. SATISH (Reg. No: 20501A0298), is work done by him and submitted during the 2022 – 2023 academic year, in partial fulfillment of the requirements for the award of the degree of BACHELOR OF TECHNOLOGY in ELECTRICAL AND ELECTRONICS ENGINEERING. This internship was done through the AICTE EduSkills AWS Cloud Virtual internship program.
EXTERNAL EXAMINER
ACKNOWLEDGEMENT
It is my privilege to thank Dr. K. Pavan Kumar, M.Tech, Ph.D., for his immeasurable help, timely suggestions, invaluable guidance, and constant encouragement throughout this internship. I am very much indebted to him for the valuable suggestions and inspiration he offered throughout the course of my internship.
I am highly grateful for the most cooperative attitude of Dr. Ch. Padmanabha Raju, Head of the Electrical and Electronics Engineering Department, in preparing this internship report.
I am thankful to our beloved Principal, Dr. K. Sivaji Babu, for his encouragement and the facilities provided to us.
I am also thankful to all the staff members for their valuable support toward the completion of the internship.
Finally, I express my thanks to everyone who directly or indirectly helped me in completing this internship.
Y.SATISH
(Reg.No: 20501A0298)
ABSTRACT
Cloud computing has become an important tool not only in the business world but also in our day-to-day activities. Most businesses have opted for cloud computing because it is considered safer and more reliable, especially for inventory tracking. Cloud computing is the on-demand provision of services through which data and projects can be stored and accessed easily. Amazon is at the forefront of providing cloud-computing services globally through its platform, Amazon Web Services (AWS), which allows customers to store data and run workloads in the cloud. AWS is a prevalent platform that increases efficiency while supporting many business practices. In the early 2000s, organizations depended entirely on purchased, on-premises servers. Such servers offered limited functionality at steep prices, and keeping a server running required considerable administration. The more a business grows, the more servers and optimization it needs, and acquiring such hardware often proved unproductive and excessively costly. The benefits of Amazon Web Services have been the answer to many of these problems: organizations that use AWS have instantly available servers, and AWS also offers a variety of improved storage options, workload management, and enhanced security measures.
CONTENTS
1. INTRODUCTION
   1.1 AWS
   1.2 History of AWS
2. CLOUD COMPUTING
3. AMAZON WEB SERVICES (AWS)
4. CLOUD ARCHITECTURE
5. CREATING AND CONNECTING NETWORKS
6. CAPSTONE PROJECT
7. CONCLUSION
CHAPTER 1
INTRODUCTION
1.1 : AWS
AWS stands for Amazon Web Services, a cloud provider used by millions. It is a secure cloud services platform that offers almost everything a business requires to develop sophisticated applications with reliability, scalability, and flexibility. Its billing model is generally referred to as "pay-as-you-go," with no upfront or capital cost. Amazon offers almost 100 on-demand services, and the list keeps growing. Operation is almost immediate, and services are accessible with minimal setup. Mastering AWS is not just about building websites online: the platform gives developers access to an interconnected set of services offering compute power, database storage, content delivery, and a growing portfolio of connected functionality. Organizations around the globe use AWS to develop and scale. Cloud computing is here to stay, and the solutions available from AWS are fast-tracking its development.
1.2 : History of AWS
Amazon Web Services was launched in 2002. The company intended to sell its unused infrastructure as a service, and the offering was met enthusiastically. Amazon launched its first AWS product in 2006. In 2012, Amazon held a major event to gather customer input concerning AWS; to date, the organization continues to hold similar events, such as re:Invent, that let customers share feedback on AWS. In 2015, Amazon announced that AWS revenue had reached $7.8 billion. Between then and 2016, AWS launched measures to help customers migrate their services to AWS. Such actions, together with the growing and well-regarded features of AWS, drove further growth, and AWS revenue rose to $12.2 billion in 2016. Presently, AWS provides customers with over 160 products and services.
CHAPTER 2
CLOUD COMPUTING
2.1: Introduction
Cloud computing is the on-demand delivery of compute power, database, storage, applications,
and other IT resources via the internet with pay-as-you-go pricing. These resources run on
server computers that are located in large data centers in different locations around the world.
When you use a cloud service provider like AWS, that service provider owns the computers
that you are using. These resources can be used together like building blocks to build solutions that help meet business goals and satisfy technology requirements (as shown in fig:2.1).
For example, if you wanted to provision a new website, you would need to buy the hardware,
rack and stack it, put it in a data center, and then manage it or have someone else manage it.
This approach is expensive and time-consuming.
Cloud computing enables you to stop thinking of your infrastructure as hardware, and instead
think of (and use) it as software.
Cloud computing enables you to think of your infrastructure as software. Software solutions
are flexible. You can select the cloud services that best match your needs, provision and
terminate those resources on-demand, and pay for what you use. You can elastically scale
resources up and down in an automated fashion. With the cloud computing model, you can treat resources as temporary and disposable. The flexibility that cloud computing offers enables businesses to implement new solutions quickly and with low upfront costs (as shown in fig:2.3).
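The cost advantage of elastic scaling is easiest to see with a little arithmetic. The sketch below uses a made-up hourly rate, not real AWS pricing, to compare provisioning fixed capacity for peak demand against scaling capacity to match each hour's demand:

```python
# Illustrative sketch (hypothetical rate, in cents): fixed peak-sized
# capacity versus elastic pay-as-you-go capacity.
HOURLY_RATE_CENTS = 10  # invented cost of one server for one hour

def fixed_cost(demand_per_hour):
    """Provision enough servers for the peak hour and keep them all running."""
    peak = max(demand_per_hour)
    return peak * len(demand_per_hour) * HOURLY_RATE_CENTS

def elastic_cost(demand_per_hour):
    """Scale the number of servers to match each hour's actual demand."""
    return sum(demand_per_hour) * HOURLY_RATE_CENTS

# Servers needed across a 6-hour window: quiet traffic with a short spike.
demand = [2, 2, 10, 10, 2, 2]
print(fixed_cost(demand))    # 10 servers x 6 hours x 10 cents = 600
print(elastic_cost(demand))  # 28 server-hours x 10 cents = 280
```

The spikier the demand, the larger the gap between the two numbers, which is why treating resources as temporary and disposable pays off.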
2.4 : Features of Cloud Computing (as shown in fig:2.4)
1. Resources Pooling
The cloud provider pools its computing resources to serve multiple customers through a multi-tenant model, with different physical and virtual resources assigned and reassigned according to customer demand. The customer generally has no control over or knowledge of the exact location of the provided resources but may be able to specify location at a higher level of abstraction.
2. On-Demand Self-Service
This is one of the most valuable features of cloud computing: the user can provision computing capabilities as needed and can continuously monitor server uptime, capabilities, and allotted network storage without requiring human interaction with the provider.
3. Easy Maintenance
The servers are easily maintained and downtime is very low; in some cases there is no downtime at all. Cloud computing is updated regularly, gradually making it better: updates are more compatible with devices, perform faster than older versions, and fix known bugs.
4. Large Network Access
The user can access data in the cloud, or upload data to the cloud, from anywhere with just a device and an internet connection. These capabilities are available all over the network and are accessed with the help of the internet.
5. Availability
The capabilities of the cloud can be modified to match usage and extended considerably. The platform analyzes storage usage and allows the user to buy extra cloud storage, if needed, for a small fee.
6. Automatic System
Cloud computing automatically analyzes the resources needed and supports metering at some level of service. Usage can be monitored, controlled, and reported, providing transparency for both the host and the customer.
7. Economical
For the host company, cloud infrastructure is largely a one-time investment: it buys the storage and provides portions of it to many companies, saving those companies monthly or yearly hardware costs. Beyond that, spending is limited to basic maintenance and a few other minor expenses.
8. Security
Cloud security is one of the best features of cloud computing. Providers create snapshots of stored data so that data is not lost even if one of the servers is damaged. The data is held in storage devices that are difficult for any other person to access or misuse, and the storage service is quick and reliable.
9. Pay as You Go
In cloud computing, the user pays only for the service or the space actually used. There are no hidden or extra charges, the service is economical, and some space is often allotted for free.
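As a rough illustration of metered, pay-as-you-go billing, the sketch below totals a bill from usage records; the service names and per-unit rates are invented for the example, and anything not consumed simply costs nothing:

```python
# Hypothetical metering sketch: rates in cents per unit, invented for
# illustration only (real cloud pricing differs per service and Region).
RATES = {
    "compute_hours": 5,     # cents per instance-hour
    "storage_gb_month": 2,  # cents per GB-month
    "requests_10k": 1,      # cents per 10,000 requests
}

def monthly_bill(usage):
    """Sum metered usage times the per-unit rate; unused services cost 0."""
    return sum(RATES[service] * amount for service, amount in usage.items())

usage = {"compute_hours": 100, "storage_gb_month": 50}
print(monthly_bill(usage))  # 100*5 + 50*2 = 600 cents
print(monthly_bill({}))     # no usage, no charge: 0
```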
CHAPTER 3
AMAZON WEB SERVICES (AWS)
3.1 : Introduction
Amazon Web Services (AWS) is a secure cloud platform that offers a broad set of global cloud-based products. Because these products are delivered over the internet, you have on-demand access to the compute, storage, network, database, and other IT resources that you might need for your projects, and the tools to manage them. You can immediately provision and launch AWS resources, and the resources are ready for you to use in minutes.
AWS offers flexibility. Your AWS environment can be reconfigured and updated on demand,
scaled up or down automatically to meet usage patterns and optimize spending, or shut down
temporarily or permanently. The billing for AWS services becomes an operational expense
instead of a capital expense.
AWS services are designed to work together to support virtually any type of application or
workload. Think of these services like building blocks, which you can assemble quickly to
build sophisticated, scalable solutions, and then adjust them as your needs change.
3.2 : Essential services offered by AWS
Since its inception, AWS has developed into a vital cloud computing technology. Below are some essential services offered by AWS:
1. Amazon S3
It is an object storage service commonly used for backups over the internet, and it is one of the less costly options in the object-storage category. A central benefit is that stored data can be retrieved from virtually anywhere it is needed.
2. Amazon EC2
It provides resizable and secure compute capacity, depending on your requirements. The service is designed to make web-scale cloud computing more accessible.
3. Amazon SNS
It is a tool for delivering notification messages to a large number of subscribers via SMS or email. Alarms can be sent, including service notifications and other messages intended to call attention to important information.
4. AWS Lambda
It runs code in response to particular events and automatically manages the underlying resources. You do not need to provision or operate servers, and you pay based on the time it takes to execute your code. This makes it cost-effective compared with services that charge at hourly rates.
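The event-driven model can be pictured with a minimal Python handler of the kind Lambda invokes. The event shape and field names below are hypothetical; locally we simply call the function ourselves, whereas on Lambda the platform would invoke it for us:

```python
# Minimal Lambda-style function sketch: AWS invokes a handler with an
# event and a context object; you write only the code, not the server.
import json

def lambda_handler(event, context):
    # "name" is an illustrative field, not a required Lambda key.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, call the handler directly to see what Lambda would return.
print(lambda_handler({"name": "AWS"}, None))
```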
5. Amazon Route 53
It is a cloud DNS service that does not require you to maintain a separate DNS account. Its aim is to give businesses a cost-effective and reliable way to route end users to internet applications.
6. Amazon EFS
Amazon EFS can be used with AWS Cloud services and resources as well as with on-premises resources. It provides simple, scalable, and flexible file storage. Its intuitive interface lets users build and configure file systems quickly, and storage grows and shrinks automatically as files are added or removed, without disrupting application growth.
7. Amazon RDS
Easing the process of setting up, operating, and scaling a relational database in the cloud, Amazon RDS provides cost-efficient, resizable capacity while automating time-consuming administrative tasks such as hardware provisioning, patching, and backups. The service is tuned for memory, performance, and input/output. Amazon RDS gives you the freedom to use the relational database of your choice, including the most popular open source and commercial engines, as well as Amazon Aurora, a relational database built for the cloud that offers the performance and availability of traditional commercial databases at a fraction of the cost. RDS lets you scale across a global footprint with enterprise-grade high availability and disaster recovery regardless of size, and it automates many previously cumbersome tasks: automatic failover, point-in-time backup and restore, disaster recovery, access management, encryption, secure networking, monitoring, and performance optimization. All of these and more can be enabled with a few clicks or API calls. Even highly regulated industries can leverage RDS, which holds a broad range of compliance certifications.
3.3 : Three ways to interact with AWS
There are three ways to create and manage resources on the AWS Cloud: the AWS Management Console (a web-based interface), the AWS Command Line Interface (AWS CLI), and the software development kits (SDKs) for programmatic access.
3.4 : AWS Cloud Adoption Framework (AWS CAF)
The AWS Cloud Adoption Framework (AWS CAF) provides guidance and best practices to help organizations identify gaps in skills and processes. It also helps organizations build a comprehensive approach to cloud computing, both across the organization and throughout the IT lifecycle, to accelerate successful cloud adoption.
The AWS CAF organizes guidance into six areas of focus, called perspectives. Perspectives consist of sets of business or technology capabilities that are the responsibility of key stakeholders (as shown in fig:3.4).
3.5 : The Future of AWS
As businesses and technologies such as artificial intelligence and the Internet of Things (IoT) continue to evolve, the need for data storage, cloud computing, and security will grow to new levels. Additional services can be developed in the cloud for industries such as financial markets and healthcare, which will become more reliant on these technologies. Thankfully, AWS continues to develop scalable and easy-to-use solutions for deploying and managing web applications in the cloud. It is evident that the future is bright and that this cloud has a silver lining. If you're ready to be part of the future of AWS, there is a certification course from Simplilearn that can prepare you to be an industry-ready, in-demand AWS solutions architect, with the privilege of firsthand experience in managing AWS. You will study how cloud computing redefines the rules of IT architecture and how to design and scale Amazon Web Services cloud operations using Amazon's recommended best practices.
CHAPTER 4
CLOUD ARCHITECTURE
Cloud architecture is the practice of applying cloud characteristics to a solution that uses cloud services and features to meet an organization's technical needs and business use cases. In this sense, an architecture is similar to a blueprint for a building. Software systems require architects to manage their size and complexity.
Cloud architects:
1. Engage with decision makers to identify the business goals and the capabilities that need improvement.
2. Ensure alignment between the technology deliverables of a solution and the business goals.
3. Work with delivery teams that are implementing the solution to ensure that the technology features are appropriate.
4.2 : Pillars of the AWS Well-Architected Framework
1. The Security pillar addresses the ability to protect information, systems, and assets while delivering business value through risk assessments and mitigation strategies.
2. The Operational Excellence pillar addresses the ability to run systems and gain insight into
their operations to deliver business value. It also addresses the ability to continuously improve
supporting processes and procedures.
3. The Reliability pillar addresses the ability of a system to recover from infrastructure or
service disruptions and dynamically acquire computing resources to meet demand.
4. The Performance Efficiency pillar addresses the ability to use computing resources efficiently to maximize performance, and to maintain that efficiency as demand changes.
5. The Cost Optimization pillar addresses cost management, an ongoing requirement of any good architectural design. The process is iterative and should be refined and improved throughout production.
4.3 : Storing Data in Amazon S3
Amazon S3 offers a range of storage classes:
1. S3 Standard
S3 Standard offers high-durability, high-availability, and performant object storage for frequently accessed data. Because it delivers low latency and high throughput, S3 Standard is appropriate for a wide variety of use cases, including cloud applications, dynamic websites, content distribution, mobile and gaming applications, and big data analytics. It provides durability across at least three Availability Zones.
2. S3 Standard-Infrequent Access (S3 Standard-IA)
S3 Standard-IA offers all the benefits of S3 Standard, but it uses a different cost model to store infrequently accessed data, such as older digital images or older log files. A 30-day minimum storage fee is applied to any data placed in it, and retrieving data from S3 Standard-IA costs more than retrieving it from S3 Standard storage.
3. S3 One Zone-IA
S3 One Zone-IA stores data in a single Availability Zone. It is ideal for customers who want a lower-cost option and who do not need the availability and resilience of S3 Standard or S3 Standard-IA. It's a good choice for storing secondary backup copies of on-premises data or easily re-creatable data. You can also use it as cost-effective storage for data that is replicated from another AWS Region.
4. S3 Intelligent-Tiering
S3 Intelligent-Tiering is designed to optimize costs by automatically moving data to the most cost-effective access tier, without performance impact or operational overhead. For a small monthly monitoring and automation fee per object, Amazon S3 monitors the access patterns of objects in S3 Intelligent-Tiering and moves objects that have not been accessed for 30 consecutive days to the infrequent access tier. If an object in the infrequent access tier is accessed, it is automatically moved back to the frequent access tier. There are no retrieval fees in S3 Intelligent-Tiering and no additional tiering fees when objects move between tiers.
5. S3 Glacier Deep Archive
It is the lowest-cost storage class for Amazon S3. It supports long-term retention and digital preservation for data that might be accessed only once or twice in a year.
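A back-of-the-envelope calculation shows why these classes suit different access patterns. The prices below are invented for illustration (real S3 pricing varies by Region); only the structure, cheaper storage plus a per-GB retrieval fee and a 30-day minimum for Standard-IA, mirrors the description above:

```python
# Hypothetical monthly cost sketch, in cents (NOT real AWS prices).

def standard_cost(gb, retrievals_gb):
    return gb * 2.3  # flat per-GB-month rate, retrieval is free

def standard_ia_cost(gb, retrievals_gb, days_stored=30):
    days = max(days_stored, 30)        # 30-day minimum storage fee
    storage = gb * 1.25 * (days / 30)  # cheaper per GB-month than Standard
    retrieval = retrievals_gb * 1.0    # but every retrieved GB costs extra
    return storage + retrieval

# 100 GB read in full ten times a month: retrieval fees dominate,
# so Standard is cheaper than Standard-IA.
print(standard_cost(100, 1000), standard_ia_cost(100, 1000))

# 100 GB read once a month: Standard-IA's cheaper storage wins.
print(standard_cost(100, 100), standard_ia_cost(100, 100))
```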
4.4 : Adding a Compute Layer in EC2
Amazon EC2 instances have four main storage options:
1. Instance store
2. Amazon EBS
3. Amazon EFS
4. Amazon S3
All four options can be used to store a data volume. However, only an instance store or an SSD-backed EBS volume can be used to store a root volume. In addition, an instance store or EBS volume can be used by only a single instance at a time. In the case of an instance store volume, only the instance that the volume is attached to can use it.
4.5 : Adding a Database Layer
Database types typically fall into one of two broad categories: relational or non-relational.
1. Relational databases are the most familiar type of database to most people. Traditional examples include Microsoft SQL Server, Oracle Database, and MySQL.
2. Non-relational databases were developed more recently, but have been around for a
few decades. They play an essential role in the modern computing landscape.
Examples include MongoDB, Cassandra, and Redis.
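The contrast between the two categories can be sketched with Python's built-in SQLite module standing in for a relational engine and a plain dictionary standing in for a document store; the table and document contents are made up for the example:

```python
# Relational vs. non-relational sketch: a relational table enforces a
# schema and supports joins; a document store keeps free-form records.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.execute("INSERT INTO customers VALUES (1, 'Satish')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.5)")

# Relational: strictly structured rows combined with a join.
row = conn.execute(
    "SELECT c.name, o.total FROM customers c "
    "JOIN orders o ON o.customer_id = c.id"
).fetchone()
print(row)  # ('Satish', 99.5)

# Non-relational: one self-contained document, no fixed schema or join.
document = {"name": "Satish", "orders": [{"id": 10, "total": 99.5}]}
print(document["orders"][0]["total"])  # 99.5
```

The join in the relational half is exactly the kind of workload the text below says suits Amazon RDS; the nested document is closer to what DynamoDB or MongoDB stores.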
Amazon RDS
Amazon RDS is a fully managed relational database service that creates and operates a
relational database in the cloud. However, before you learn more details about Amazon RDS,
you will first review the advantages of Amazon RDS as a managed database service. As a
relational database offering, Amazon RDS is a good choice for applications that have complex,
well-structured data. Amazon RDS is a good choice if your workloads must frequently combine
and join datasets, and must have syntax rules that are strictly enforced. For example, Amazon
RDS is frequently used to back traditional applications, enterprise resource planning (ERP)
applications, customer relationship management (CRM) applications, and ecommerce
applications. Amazon RDS is available with six database engines to choose from: Microsoft SQL Server, Oracle, MySQL, PostgreSQL, Amazon Aurora, and MariaDB. All Amazon RDS database types run on a server. The exception is Aurora, which can also run as a serverless option.
Amazon RDS security
Security is a shared responsibility between you and AWS. AWS is responsible for security of
the cloud, which means that AWS protects the infrastructure that runs Amazon RDS.
Meanwhile, you are responsible for security in the cloud. One security recommendation for
Amazon RDS is to run your RDS instances in a virtual private cloud (VPC). A VPC enables
you to place your instance in a private subnet, which secures it from public routes on the
internet. The VPC also provides IP firewall protection and enables you to securely control the
applicable network configuration.
CHAPTER 5
CREATING AND CONNECTING NETWORKS
Amazon Virtual Private Cloud (Amazon VPC) is a service that enables you to provision a logically isolated section of the AWS Cloud (called a virtual private cloud, or VPC) where you can launch your AWS resources. Amazon VPC gives you control over your virtual networking resources. For example, you can select your own IP address range, create subnets, and configure route tables and network gateways. You can use both IPv4 and IPv6 in your VPC for secure access to resources and applications. You can also customize the network configuration for your VPC. For example, you can create a public subnet for your web servers that can access the public internet, and place your backend systems (such as databases or application servers) in a private subnet with no public internet access. Finally, you can use multiple layers of security to help control access to the Amazon Elastic Compute Cloud (Amazon EC2) instances in each subnet. These security layers include security groups and network access control lists (network ACLs).
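Subnet planning of the kind described above can be sketched with Python's standard ipaddress module; the CIDR blocks are illustrative, and "public" versus "private" here is only a label we assign (in a real VPC the route tables make that distinction):

```python
# VPC-style address planning sketch: carve a /16 VPC CIDR into /24
# subnets and assign public and private roles to two of them.
import ipaddress

vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=24))  # 256 possible /24 subnets

public_subnet = subnets[0]   # e.g. web servers with an internet route
private_subnet = subnets[1]  # e.g. databases with no public route

print(public_subnet)   # 10.0.0.0/24
print(private_subnet)  # 10.0.1.0/24

# Check which subnet a database server's address falls in.
print(ipaddress.ip_address("10.0.1.25") in private_subnet)  # True
```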
An internet gateway allows communication between instances in your VPC and the internet.
Route tables control traffic from your subnet or gateway.
Elastic IP addresses are static, public IPv4 addresses that can be associated with an instance
or elastic network interface. They can be remapped to another instance in your account.
NAT gateways enable instances in the private subnet to initiate outbound traffic to the internet
or other AWS services.
A bastion host is a server whose purpose is to provide access to a private network from an
external network, such as the internet.
Security groups
Security groups are stateful firewalls that act at the level of instance or network interface.
Security group rules control inbound and outbound traffic to your AWS resources. You should
tightly configure these rules to restrict traffic and allow access only as needed. Traffic can be
restricted by any internet protocol, service port, and source or destination IP address (individual
IP address or CIDR block).
When you create a security group, it has no inbound rules. This means that you must add inbound rules to the security group to allow inbound traffic that originates from another host to reach your instance. By default, a security group includes an outbound rule that allows all outbound traffic. You can remove that rule and add outbound rules that allow only specific outbound traffic. If your security group has no outbound rules, no outbound traffic that originates from your instance is allowed (as shown in fig:5.3).
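This default behavior can be modeled in a few lines. The sketch covers only inbound rule matching (real security groups are also stateful and filter outbound traffic), and the ports and CIDR blocks are examples:

```python
# Security-group sketch: inbound traffic is denied until a rule
# explicitly allows it; a new group starts with no inbound rules.
import ipaddress

inbound_rules = []  # a freshly created security group: nothing open

def inbound_allowed(port, source_ip):
    """Allow inbound traffic only if some rule matches port and source."""
    return any(
        rule["port"] == port
        and ipaddress.ip_address(source_ip) in ipaddress.ip_network(rule["cidr"])
        for rule in inbound_rules
    )

print(inbound_allowed(80, "203.0.113.7"))  # False: no rules yet

# Open HTTP (port 80) to the world, like the capstone fix for the webpage.
inbound_rules.append({"port": 80, "cidr": "0.0.0.0/0"})
print(inbound_allowed(80, "203.0.113.7"))  # True: HTTP now reachable
print(inbound_allowed(22, "203.0.113.7"))  # False: SSH still blocked
```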
A network access control list (network ACL) is an optional layer of security for your VPC. It
acts as a firewall for controlling traffic in and out of one or more subnets. To add another layer
of security to your VPC, you can set up network ACLs with rules that are similar to your
security groups.
5.4 : Connecting to your remote network with AWS Site-to-Site VPN.
You can use AWS Site-to-Site Virtual Private Network (AWS Site-to-Site VPN) to securely
connect your on-premises network or branch office site to your VPC. Each AWS Site-to-Site
VPN connection uses internet protocol security (IPSec) communications to create encrypted
VPN tunnels between two locations. A VPN tunnel is an encrypted link where data can pass
from the customer network to or from AWS. The AWS side of the connection is the virtual
private gateway. (Note that instead of a virtual private gateway, you can also create a Site-to-
Site VPN connection as an attachment on a transit gateway. You will learn more about AWS
Transit Gateway later in this module.) The on-premises side of the connection is the customer
gateway. AWS Site-to-Site VPN provides two VPN tunnels across multiple Availability Zones that you can use simultaneously for high availability. You can stream primary traffic through the first tunnel and use the second tunnel for redundancy. If one tunnel goes down, traffic will still be delivered to your VPC.
5.5 : Connecting with AWS Direct Connect
AWS Direct Connect (or DX) is another solution that goes beyond simple connectivity over
the internet. DX uses open standard 802.1q virtual local area networks (VLANs) so you can
establish a dedicated, private network connection from your premises to AWS. This private
connection can reduce network costs, increase bandwidth throughput, and provide a more
consistent network experience than internet-based connections.
One common DX use case is hybrid environments, which combine on-premises infrastructure with cloud resources.
5.6 : Connecting your VPC to supported AWS Services
A VPC endpoint enables you to privately connect your VPC to supported AWS services and to
VPC endpoint services that are powered by AWS PrivateLink. VPC endpoint services that are
powered by AWS PrivateLink include some AWS services, services hosted by other AWS
customers and AWS Partner Network (APN) Partners in their own VPCs (which are referred
to as endpoint services), and supported AWS Marketplace Partner services. VPC endpoints do
not require an internet gateway, NAT device, VPN connection, or DX connection. Instances in
your VPC do not require public IP addresses to communicate with resources in the service.
Traffic between your VPC and the other service does not leave the Amazon network. Endpoints
are virtual devices. They are horizontally scaled, redundant, and highly available VPC
components. Endpoints allow communication between instances in your VPC and services
without imposing availability risks or bandwidth constraints on your network traffic.
1. Interface endpoint
It is an elastic network interface with a private IP address. This IP address serves as an
entry point for traffic that is destined to a supported service.
2. Gateway endpoint
It is a gateway that you specify as a target for a route in your route table. The route is
for traffic that is destined to a supported AWS service. Amazon S3 and Amazon
DynamoDB are supported by gateway endpoints.
CHAPTER 6
CAPSTONE PROJECT
The capstone project is one way to practice applying the knowledge that you have developed
in the course to a real-world scenario. By completing and documenting the project, you will
have an asset that you can add to your portfolio of work for future opportunities.
Among the project requirements:
4. Run the website on a t2.small EC2 instance, and provide Secure Shell (SSH) access to administrators.
6. Store database connection information in the AWS Systems Manager Parameter Store.
Name- Example-DB-subnet
Description- Example-DB-subnet
Add Subnet:
Create database:
Credentials Settings:
Master username-admin
Masterpassword-password
Confirmpassword-password
Instance Configuration:
Connectivity:
Database options
Backup- uncheck it
Navigate to the Cloud9 service
Create cloud environment
Step 1:
Name environment
Step 2: Download the project assets (copy the link from the capstone project):
wget https://aws-tc-largeobjects.s3-us-west-2.amazonaws.com/ILT-TF-200-ACACAD-20-EN/capstone-project/Example.zip
Then come back to the Cloud9 service and unzip the downloaded PHP file, then list the files:
ls
mkdir Example
Check the public IP of the Cloud9 EC2 instance and paste it into a new tab (at this point the webpage does not load).
Solution: Choose Instances and select your instance (the Cloud9-created instance name starts with aws-cloud9). On the Security tab, view the inbound rules. Add the HTTP protocol with source 0.0.0.0/0, then refresh your webpage; it will now show the page.
Again go to the RDS dashboard; your database instance is now created (available). Copy the endpoint of the DB, then go to the Cloud9 service to access the machine. The database file downloads successfully.
Important: Go to Security Groups, select Example-DB, and add an inbound rule for MYSQL/Aurora with the source set to the Cloud9 instance; only then can we import the database.
Step 1: Navigate to AWS Systems Manager and create parameters for the following values:
/example/endpoint <rds-endpoint>
/example/username admin
Select the Cloud9 instance --> Actions --> Images and templates --> Create image
Description - AMI for CapstoneProject, then create it.
Network mapping
Mappings - Select both us-east-1a (under Subnet, choose Public subnet1) and us-east-1b (under Subnet, choose Public subnet2)
Security groups
Come back to the load balancer and refresh Listener and routing; now we can see the created target group name, and select it.
In the EC2 management console, under Auto Scaling, choose Auto Scaling Groups in a new tab.
Name
Launch template
Scroll down and, under Launch template contents, choose our CapstoneProject AMI ID, then create it.
Network
Availability Zones and subnets-Select Private subnet1 & Private subnet2 then click next
Existing load balancer target groups-Select CapstoneProject-LB
Step 4 :
Desired capacity 2
/example/password password
/example/database exampledb
Import the file: mysql -u admin -p exampledb --host <rds-endpoint> < Countrydatadump.sql
When it asks for a password, enter it (copy the master password) and hit Enter.
mysql -u admin -p --host <endpoint>
Final step 1: Edit the subnet groups of the Auto Scaling group to Public subnet1 and Public subnet2.
Step 2: Then stop the instances named Nminds. Wait for 5-10 minutes, refreshing several times.
Go to the load balancer, copy its DNS name, and paste it into a new tab.
CHAPTER 7
CONCLUSION
In today's marketplace, with its rise in on-demand services, AWS has developed a workable solution for business organizations searching for inexpensive, reliable, and scalable cloud computing services. Operating across 22 geographical regions, Amazon Web Services enables firms to manage a wide range of services covering development, data processing, game development, warehousing, and much more. A distinguishing benefit of AWS is that your business can have access to EC2, which offers a virtual cluster of computers via the internet, so the job of on-premises hardware is taken over by these helpful server farms located across the globe. Whether you are just starting out or are an already established enterprise, AWS is a solution that can provide extensive uptime, cost savings, and continuous support, which is undeniably a good return on investment.