Cloud Computing Tutorial
Our Cloud Computing tutorial provides basic and advanced concepts of Cloud Computing. It is designed for both beginners and professionals.
There are the following operations that we can do using cloud computing:
In that server room, there should be a database server, mail server, networking, firewalls, routers, modems, switches, QPS (Queries Per Second, i.e., how many queries or how much load the server can handle), configurable systems, high network speed, and maintenance engineers.
To establish such IT infrastructure, we need to spend a lot of money. To overcome all these problems and to reduce IT infrastructure costs, Cloud Computing came into existence.
1) Agility
2) High Availability and Reliability
The availability of servers is high and more reliable because the chances of infrastructure failure are minimal.
3) High Scalability
4) Multi-Sharing
With the help of cloud computing, multiple users and applications can work more
efficiently with cost reductions by sharing common infrastructure.
5) Device and Location Independence
Cloud computing enables users to access systems using a web browser regardless of their location or the device they use, e.g., a PC or a mobile phone. As the infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.
6) Maintenance
Maintenance of cloud computing applications is easier, since they do not need to be installed on each user's computer and can be accessed from different places. This also reduces costs.
7) Low Cost
By using cloud computing, costs are reduced because an IT company does not need to set up its own infrastructure and pays only as per its usage of resources.
Application Programming Interfaces (APIs) are provided to users so that they can access services on the cloud through these APIs and pay charges according to their usage of the services, as the sketch below illustrates.
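As a minimal illustration (not part of the original tutorial's examples), the Python sketch below uses the AWS SDK for Python (boto3) to call one such API. It assumes boto3 is installed and AWS credentials are already configured; any provider's SDK could be used in the same way.

import boto3

# Create a client for the S3 storage service; the SDK signs the API calls for us.
s3 = boto3.client("s3")

# List the buckets owned by this account. Calls like this are what the provider
# meters and bills for under the pay-per-usage model.
response = s3.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"])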
Prerequisite
Before learning cloud computing, you must have the basic knowledge of computer
fundamentals.
Audience
Our cloud computing tutorial is designed to help beginners and professionals.
Problem
We assure you that you will not find any difficulty while learning this cloud computing tutorial. However, if you find any mistake in this tutorial, kindly report the problem or error via the contact form.
Advantages of Cloud Computing
1) Back-up and restore data
Once data is stored in the cloud, it is easier to back up and restore that data using the cloud.
2) Improved collaboration
Cloud applications improve collaboration by allowing groups of people to quickly and easily
share information in the cloud via shared storage.
3) Excellent accessibility
The cloud allows us to quickly and easily access stored information anytime, anywhere in the world, using an internet connection. An internet cloud infrastructure increases an organization's productivity and efficiency by ensuring that our data is always accessible.
4) Low maintenance cost
Cloud computing reduces both hardware and software maintenance costs for organizations.
5) Mobility
Cloud computing allows us to easily access all our cloud data via mobile devices.
6) Services in the pay-per-use model
Cloud computing offers Application Programming Interfaces (APIs) to users so that they can access services on the cloud and pay charges as per their usage of the service.
7) Unlimited storage capacity
The cloud offers us a huge amount of storage capacity for storing our important data, such as documents, images, audio, and video, in one place.
8) Data security
Data security is one of the biggest advantages of cloud computing. Cloud offers many
advanced features related to security and ensures that data is securely stored and handled.
Disadvantages of Cloud Computing
1) Internet Connectivity
As you know, in cloud computing, all data (images, audio, video, etc.) is stored in the cloud, and we access this data over an internet connection. If you do not have good internet connectivity, you cannot access this data, and there is no other way to access data from the cloud.
2) Vendor lock-in
Vendor lock-in is the biggest disadvantage of cloud computing. Organizations may face
problems when transferring their services from one vendor to another. As different vendors
provide different platforms, that can cause difficulty moving from one cloud to another.
3) Limited Control
As we know, cloud infrastructure is completely owned, managed, and monitored by the service
provider, so the cloud users have less control over the function and execution of services within
a cloud infrastructure.
4) Security
Although cloud service providers implement the best security standards to store important information, before adopting cloud technology you should be aware that you will be sending all of your organization's sensitive information to a third party, i.e., a cloud computing service provider. While sending data to the cloud, there is a chance that your organization's information could be hacked.
After that, distributed computing came into the picture, where all the computers are networked together and share their resources when needed.
On the basis of this computing model, the concept of cloud computing emerged and was later implemented.
Around 1961, John McCarthy suggested in a speech at MIT that computing could be sold like a utility, just like water or electricity. It was a brilliant idea, but like all brilliant ideas, it was ahead of its time; for the next few decades, despite interest in the model, the technology simply was not ready for it.
But of course, time passed, the technology caught up with the idea, and a few years later the following milestones occurred:
In 1999, Salesforce.com started delivering applications to users through a simple website. The applications were delivered to enterprises over the Internet, and in this way the dream of computing sold as a utility came true.
In 2002, Amazon started Amazon Web Services, providing services like storage, computation, and even human intelligence. However, only with the launch of the Elastic Compute Cloud in 2006 did a truly commercial service open to everybody exist.
In 2009, Google Apps also started to provide cloud computing enterprise applications.
Of course, all the big players are present in the cloud computing evolution, some were earlier,
some were later. In 2009, Microsoft launched Windows Azure, and companies like Oracle and
HP have all joined the game. This proves that today, cloud computing has become mainstream.
Cloud computing architecture is divided into the following two parts -
o Front End
o Back End
The diagram below shows the architecture of cloud computing -
Front End
The front end is used by the client. It contains the client-side interfaces and applications that are required to access the cloud computing platforms. The front end includes web browsers (such as Chrome, Firefox, Internet Explorer, etc.), thin and fat clients, tablets, and mobile devices.
Back End
The back end is used by the service provider. It manages all the resources that are required to provide cloud computing services. It includes a huge amount of data storage, security mechanisms, virtual machines, deployment models, servers, traffic control mechanisms, etc.
Note: Both the front end and the back end are connected to each other through a network, generally using an internet connection.
1. Client Infrastructure
Client Infrastructure is a front-end component. It provides a GUI (Graphical User Interface) for interacting with the cloud.
2. Application
The application may be any software or platform that a client wants to access.
3. Service
The Service component manages which type of service you access, according to the client's requirement.
i. Software as a Service (SaaS) – It is also known as cloud application services. Mostly, SaaS applications run directly through the web browser, which means we do not need to download and install these applications. Some important examples of SaaS are given below –
ii. Platform as a Service (PaaS) – It is also known as cloud platform services. It is quite similar to SaaS, but the difference is that PaaS provides a platform for software creation, whereas with SaaS we can access software over the internet without the need for any platform.
iii. Infrastructure as a Service (IaaS) – It is also known as cloud infrastructure services. It provides fundamental computing resources such as virtual machines, storage, and networking over the internet; in this model, the client remains responsible for managing applications, data, middleware, and runtime environments.
Example: Amazon Web Services (AWS) EC2, Google Compute Engine (GCE), Cisco Metapod.
4. Runtime Cloud
Runtime Cloud provides the execution and runtime environment to the virtual machines.
5. Storage
Storage is one of the most important components of cloud computing. It provides a huge
amount of storage capacity in the cloud to store and manage data.
6. Infrastructure
7. Management
8. Security
Security is an in-built back end component of cloud computing. It implements a security
mechanism in the back end.
9. Internet
The Internet is the medium through which the front end and back end interact and communicate with each other.
Cloud computing has become a very popular option for organizations by providing various advantages, including cost savings, increased productivity, efficiency, performance, data back-ups, disaster recovery, and security.
Grid Computing
Grid computing is also called "distributed computing." It links multiple computing resources (PCs, workstations, servers, and storage elements) together and provides a mechanism to access them.
The main advantages of grid computing are that it increases user productivity by providing transparent access to resources, and work can be completed more quickly.
Cloud computing vs. grid computing:
o Cloud computing follows a client-server computing architecture, whereas grid computing follows a distributed computing architecture.
o Cloud computing is more flexible than grid computing, whereas grid computing is less flexible than cloud computing.
o In cloud computing, cloud servers are owned by infrastructure providers, whereas in grid computing, grids are owned and managed by the organization.
o Cloud computing uses services like IaaS, PaaS, and SaaS, whereas grid computing uses systems like distributed computing, distributed information, and distributed pervasive systems.
But there may be an alternative for executives like you. Instead of installing a suite of software on each computer, you just need to load one application. That application allows employees to log in to a Web-based service which hosts all the programs the user requires for his or her job. Remote servers owned by another company run everything from e-mail to word processing to complex data analysis programs. It is called cloud computing, and it could change the entire computer industry.
In a cloud computing system, there is a significant workload shift. Local computers no longer have to do all the heavy lifting when it comes to running applications, because the cloud can handle that heavy load easily and automatically. Hardware and software demands on the user's side decrease. The only thing the user's computer needs to be able to run is the system's cloud computing interface software, which can be as simple as a Web browser; the cloud's network takes care of the rest.
1. Art Applications
Cloud computing offers various art applications for quickly and easily designing attractive cards, booklets, and images. Some of the most commonly used cloud art applications are given below:
i. Moo
Moo is one of the best cloud art applications. It is used for designing and printing business
cards, postcards, and mini cards.
ii. Vistaprint
Vistaprint allows us to easily design various printed marketing products such as business cards, postcards, booklets, and wedding invitation cards.
iii. Adobe Creative Cloud
Adobe Creative Cloud is made for designers, artists, filmmakers, and other creative professionals. It is a suite of apps that includes Photoshop for image editing, Illustrator, InDesign, TypeKit, Dreamweaver, XD, and Audition.
2. Business Applications
Business applications are based on cloud service providers. Today, every organization requires cloud business applications to grow its business. The cloud also ensures that business applications are available to users 24*7.
i. MailChimp
MailChimp is an email publishing platform which provides various options to design,
send, and save templates for emails.
iii. Salesforce
Salesforce platform provides tools for sales, service, marketing, e-commerce, and more. It also
provides a cloud development platform.
iv. Chatter
v. Bitrix24
vi. Paypal
Paypal offers the simplest and easiest online payment mode using a secure internet account.
Paypal accepts the payment through debit cards, credit cards, and also from Paypal account
holders.
vii. Slack
Slack stands for Searchable Log of all Conversation and Knowledge. It provides a user-
friendly interface that helps us to create public and private channels for communication.
viii. Quickbooks
Quickbooks works on the motto "Run Enterprise anytime, anywhere, on any device." It provides online accounting solutions for businesses and allows more than 20 users to work simultaneously on the same system.
3. Data Storage and Backup Applications
A list of data storage and backup applications in the cloud is given below -
i. Box.com
ii. Mozy
Mozy provides powerful online backup solutions for our personal and business data. It automatically schedules backups for each day at a specific time.
iii. Joukuu
Joukuu provides the simplest way to share and track cloud-based backup files. Many users use Joukuu to search files and folders and to collaborate on documents.
4. Education Applications
Cloud computing in the education sector has become very popular. It offers various online distance learning platforms and student information portals to students. The advantage of using the cloud in the field of education is that it offers strong virtual classroom environments, ease of accessibility, secure data storage, scalability, greater reach for students, and minimal hardware requirements for the applications.
Google Apps for Education is the most widely used platform for free web-based email,
calendar, documents, and collaborative study.
Chromebook for Education is one of Google's most important projects. It is designed to enhance education innovation.
It allows educators to quickly implement the latest technology solutions into the classroom and
make it available to their students.
5. Entertainment Applications
Entertainment industries use a multi-cloud strategy to interact with the target audience.
Cloud computing offers various entertainment applications such as online games and video
conferencing.
i. Online games
Today, cloud gaming has become one of the most important entertainment media. It offers various online games that run remotely from the cloud. The best cloud gaming services are Shadow, GeForce Now, Vortex, Project xCloud, and PlayStation Now.
ii. Video Conferencing Apps
Video conferencing apps provide a simple and instant connected experience. They allow us to communicate with our business partners, friends, and relatives using cloud-based video conferencing. The benefits of using video conferencing are that it reduces costs, increases efficiency, and removes interoperability issues.
6. Management Applications
Cloud computing offers various cloud management tools which help admins to manage all
types of cloud activities, such as resource deployment, data integration, and disaster recovery.
These management tools also provide administrative control over the platforms, applications,
and infrastructure.
i. Toggl
Toggl helps users track the time allocated to a particular project.
ii. Evernote
Evernote allows you to sync and save your recorded notes, typed notes, and other notes in one convenient place. It is available in both a free and a paid version.
It supports platforms such as Windows, macOS, Android, iOS, web browsers, and Unix.
iii. Outright
Outright is used by management users for accounting purposes. It helps to track income, expenses, profits, and losses in a real-time environment.
iv. GoToMeeting
GoToMeeting provides video conferencing and online meeting apps, which allow you to start a meeting with your business partners anytime, anywhere using mobile phones or tablets. Using the GoToMeeting app, you can perform management-related tasks such as joining meetings in seconds, viewing presentations on the shared screen, and getting alerts for upcoming meetings.
7. Social Applications
Social cloud applications allow a large number of users to connect with each other using social networking applications such as Facebook, Twitter, LinkedIn, etc.
i. Facebook
Facebook is a social networking website which allows active users to share files, photos, videos, statuses, and more with their friends, relatives, and business partners using the cloud storage system. On Facebook, we always get notifications when our friends like or comment on our posts.
ii. Twitter
iii. Yammer
Yammer is the best team collaboration tool that allows a team of employees to chat, share
images, documents, and videos.
iv. LinkedIn
Some of the most common security risks of cloud computing are given below -
Data Loss
Data loss is the most common security risk of cloud computing. It is also known as data leakage. Data loss is the process in which data is deleted, corrupted, or made unreadable by a user, software, or application. In a cloud computing environment, data loss occurs when our sensitive data is in somebody else's hands, when one or more data elements cannot be utilized by the data owner, when a hard disk is not working properly, or when software is not updated.
Data Breach
A data breach is the process in which confidential data is viewed, accessed, or stolen by a third party without any authorization, i.e., the organization's data is hacked by attackers.
Vendor lock-in
Vendor lock-in is one of the biggest security risks in cloud computing. Organizations may face problems when transferring their services from one vendor to another. As different vendors provide different platforms, this can cause difficulty in moving from one cloud to another.
Account hijacking
Account hijacking is a serious security risk in cloud computing. It is the process in which an individual user's or organization's cloud account (bank account, e-mail account, or social media account) is stolen by hackers. The hackers use the stolen account to perform unauthorized activities.
Types of Cloud
There are the following 4 types of cloud that you can deploy according to the organization's
needs-
o Public Cloud
o Private Cloud
o Hybrid Cloud
o Community Cloud
Public Cloud
Public cloud is open to all to store and access information via the Internet using the pay-per-
usage method.
In public cloud, computing resources are managed and operated by the Cloud Service Provider
(CSP).
Example: Amazon elastic compute cloud (EC2), IBM SmartCloud Enterprise, Microsoft, Google
App Engine, Windows Azure Services Platform.
o Public cloud is available at a lower cost than private and hybrid clouds.
o Public cloud is maintained by the cloud service provider, so you do not need to worry about maintenance.
o Public cloud is easier to integrate. Hence, it offers better flexibility to consumers.
o Public cloud is location independent because its services are delivered through the internet.
o Public cloud is highly scalable as per the requirement of computing resources.
o It is accessible by the general public, so there is no limit to the number of users.
Private Cloud
Private cloud is also known as an internal cloud or corporate cloud. It is used by organizations to build and manage their own data centers, either internally or through a third party. It can be deployed using open-source tools such as OpenStack and Eucalyptus.
Based on the location and management, the National Institute of Standards and Technology (NIST) divides private cloud into the following two parts -
o Private cloud provides a high level of security and privacy to the users.
o Private cloud offers better performance with improved speed and space capacity.
o It allows the IT team to quickly allocate and deliver on-demand IT resources.
o The organization has full control over the cloud because it is managed by the organization itself. So, there is no need for the organization to depend on anybody else.
o It is suitable for organizations that require a separate cloud for their personal use and for which data security is the first priority.
Hybrid Cloud
Hybrid cloud is a combination of the public cloud and the private cloud. In other words, we can say:
Hybrid cloud = public cloud + private cloud
Hybrid cloud is partially secure because the services which are running on the public cloud can
be accessed by anyone, while the services which are running on a private cloud can be
accessed only by the organization's users.
Example: Google Application Suite (Gmail, Google Apps, and Google Drive), Office 365 (MS
Office on the Web and One Drive), Amazon Web Services.
Advantages of Hybrid Cloud
There are the following advantages of Hybrid Cloud -
o Hybrid cloud is suitable for organizations that require more security than the public cloud.
o Hybrid cloud helps you to deliver new products and services more quickly.
o Hybrid cloud provides an excellent way to reduce the risk.
o Hybrid cloud offers flexible resources because of the public cloud and secure resources because
of the private cloud.
Community Cloud
Community cloud allows systems and services to be accessible by a group of several organizations to share information between the organizations and a specific community. It
is owned, managed, and operated by one or more organizations in the community, a third
party, or a combination of them.
o Community cloud is cost-effective because the whole cloud is being shared by several
organizations or communities.
o Community cloud is suitable for organizations that want to have a collaborative cloud with more
security features than the public cloud.
o It provides better security than the public cloud.
o It provides a collaborative and distributed environment.
o Community cloud allows us to share cloud resources, infrastructure, and other capabilities
among various organizations.
Host: Public cloud - service provider; Private cloud - enterprise (third party); Hybrid cloud - enterprise (third party); Community cloud - community (third party).
Public Cloud
o Public Cloud provides a shared platform that is accessible to the general public through an
Internet connection.
o Public cloud operates on the pay-as-per-use model and is administered by a third party, i.e., the cloud service provider.
o In the Public cloud, the same storage is being used by multiple users at the same time.
o Public cloud is owned, managed, and operated by businesses, universities, government
organizations, or a combination of them.
o Amazon Elastic Compute Cloud (EC2), Microsoft Azure, IBM's Blue Cloud, Sun Cloud, and Google
Cloud are examples of the public cloud.
1) Low Cost
Public cloud has a lower cost than private or hybrid cloud, as it shares the same resources among a large number of consumers.
2) Location Independent
Public cloud is location independent because its services are offered through the internet.
3) Save Time
In the public cloud, the cloud service provider is responsible for managing and maintaining the data centers in which data is stored, so cloud users save the time they would otherwise spend establishing connectivity, deploying new products, releasing product updates, and configuring and assembling servers.
4) Quick and easy setup
Organizations can easily purchase the public cloud on the internet and have it deployed and configured remotely through the cloud service provider within a few hours.
5) Business Agility
Public cloud provides the ability to elastically resize computing resources based on the organization's requirements.
6) Scalability and reliability
Public cloud offers scalable (easy to add and remove) and reliable (available 24*7) services to users at an affordable cost.
Disadvantages of Public Cloud
2) Performance
In the public cloud, performance depends upon the speed of internet connectivity.
3) Less customizable
Private Cloud
o Private cloud is also known as an internal cloud or corporate cloud.
o Private cloud provides computing services to a private internal network (within the
organization) and selected users instead of the general public.
o Private cloud provides a high level of security and privacy to data through firewalls and
internal hosting. It also ensures that operational and sensitive data are not accessible to third-
party providers.
o HP Data Centers, Microsoft, Elastra private cloud, and Ubuntu are examples of private clouds.
Advantages of Private cloud
There are the following advantages of Private Cloud -
1) More Control
Private cloud provides more control over its resources and hardware than public cloud because it is accessed only by selected users.
Security & privacy are one of the big advantages of cloud computing. Private cloud improved
the security level as compared to the public cloud.
3) Improved performance
Private cloud offers better performance with improved speed and space capacity.
Disadvantages of Private Cloud
1) High cost
The cost is higher than that of a public cloud because setting up and maintaining hardware resources is costly.
As we know, private cloud is accessible within the organization, so the area of operations is
limited.
3) Limited scalability
Private clouds are scaled only within the capacity of internal hosted resources.
4) Skilled people
Hybrid Cloud
o Hybrid cloud is a combination of public and private clouds.
Hybrid cloud = public cloud + private cloud
o The main aim of combining these clouds (public and private) is to create a unified, automated, and well-managed computing environment.
o In the Hybrid cloud, non-critical activities are performed by the public cloud and critical
activities are performed by the private cloud.
o Mainly, a hybrid cloud is used in finance, healthcare, and Universities.
o The best hybrid cloud provider companies are Amazon, Microsoft, Google,
Cisco, and NetApp.
Advantages of Hybrid Cloud
1) Flexible and secure
It provides flexible resources because of the public cloud and secure resources because of the private cloud.
2) Cost effective
Hybrid cloud costs less than the private cloud. It helps organizations to save costs for both
infrastructure and application support.
3) Flexibility
It offers the features of both the public and the private cloud. A hybrid cloud is capable of adapting to the demands that each company has for space, memory, and systems.
4) Security
Hybrid cloud is secure because critical activities are performed by the private cloud.
5) Risk Management
Hybrid cloud provides an excellent way for companies to manage the risk.
Disadvantages of Hybrid Cloud
1) Networking issues
In the hybrid cloud, networking becomes complex because of the presence of both the private and the public cloud.
2) Infrastructure Compatibility
3) Reliability
Community Cloud
Community cloud is a cloud infrastructure that allows systems and services to be accessible by
a group of several organizations to share the information. It is owned, managed, and operated
by one or more organizations in the community, a third party, or a combination of them.
Example: Government organizations within India may share computing infrastructure in the cloud to manage data.
Advantages of Community Cloud
Cost effective
Community cloud is cost-effective because the whole cloud is shared among several organizations or a community.
Flexible and scalable
The community cloud is flexible and scalable because it is compatible with every user. It allows users to modify documents as per their needs and requirements.
Security
Community cloud is more secure than the public cloud but less secure than the private cloud.
Sharing infrastructure
Community cloud allows us to share cloud resources, infrastructure, and other capabilities
among various organizations.
Characteristics of PaaS
There are the following characteristics of PaaS -
Example: AWS Elastic Beanstalk, Windows Azure, Heroku, Force.com, Google App Engine,
Apache Stratos, Magento Commerce Cloud, and OpenShift.
Characteristics of SaaS
There are the following characteristics of SaaS -
o IaaS provides a virtual data center to store information and create platforms for app development, testing, and deployment; PaaS provides virtual platforms and tools to create, test, and deploy apps; SaaS provides web software and apps to complete business tasks.
o IaaS provides access to resources such as virtual machines and virtual storage; PaaS provides runtime environments and deployment tools for applications; SaaS provides software as a service to the end users.
In traditional hosting services, IT infrastructure was rented out for a specific period of time, with
pre-determined hardware configuration. The client paid for the configuration and time,
regardless of the actual use. With the help of the IaaS cloud computing platform layer, clients
can dynamically scale the configuration to meet changing requirements and are billed only for
the services actually used.
IaaS cloud computing platform layer eliminates the need for every organization to maintain the
IT infrastructure.
IaaS is offered in three models: public, private, and hybrid cloud. The private cloud implies that the infrastructure resides at the customer's premises. In the case of the public cloud, it is located at the cloud computing platform vendor's data center, and the hybrid cloud is a combination of the two, in which the customer selects the best of both public and private clouds.
1. Compute: Computing as a Service includes virtual central processing units and virtual main memory for the VMs that are provisioned to end users (see the sketch after this list).
2. Storage: IaaS provider provides back-end storage for storing files.
3. Network: Network as a Service (NaaS) provides networking components such as routers, switches, and bridges for the VMs.
4. Load balancers: It provides load balancing capability at the infrastructure layer.
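A hedged sketch of provisioning the first two components above (compute and storage) with the AWS SDK for Python is shown below; the AMI ID is a placeholder, and the region, credentials, and sizes are assumptions for illustration only.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Compute: request one small virtual machine (the AMI ID below is a placeholder).
vm = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

# Storage: create a 10 GiB block-storage volume that the VM could later attach.
volume = ec2.create_volume(AvailabilityZone="us-east-1a", Size=10)

print(vm["Instances"][0]["InstanceId"], volume["VolumeId"])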
Advantages of IaaS cloud computing layer
There are the following advantages of IaaS computing layer -
1. Shared infrastructure
3. Pay-as-per-use model
IaaS providers offer services on a pay-as-per-use basis. Users are required to pay only for what they have used.
4. Focus on the core business
IaaS allows organizations to focus on their core business rather than on IT infrastructure.
5. On-demand scalability
On-demand scalability is one of the biggest advantages of IaaS. Using IaaS, users do not have to worry about upgrading software or troubleshooting issues related to hardware components.
Disadvantages of IaaS cloud computing layer
1. Security
Security is one of the biggest issues in IaaS. Most IaaS providers are not able to provide 100% security.
3. Interoperability issues
It is difficult to migrate VMs from one IaaS provider to another, so customers might face problems related to vendor lock-in.
The IaaS cloud computing platform may not eliminate the need for an in-house IT department, which will be needed to monitor and control the IaaS setup. IT salary expenditure might not be reduced significantly, but other IT expenses can be reduced.
Breakdowns at the IaaS cloud computing platform vendor can bring your business to a halt. Assess the vendor's stability and finances. Make sure that SLAs (Service Level Agreements) provide backups for data, hardware, network, and application failures. Image portability and third-party support are a plus.
The IaaS cloud computing platform vendor can get access to your sensitive data. So, engage
with credible companies or organizations. Study their security policies and precautions.
Top IaaS providers offering the IaaS cloud computing platform
o Amazon Web Services - Elastic Compute Cloud (EC2), Elastic MapReduce, Route 53, Virtual Private Cloud, etc. The cloud computing platform pioneer, Amazon offers auto scaling, cloud monitoring, and load balancing features as part of its portfolio.
o Netmagic Solutions - Netmagic IaaS Cloud. Netmagic runs from data centers in Mumbai, Chennai, and Bangalore, and a virtual data center in the United States. Plans are underway to extend services to West Asia.
o Rackspace - Cloud servers, cloud files, cloud sites, etc. The cloud computing platform vendor focuses primarily on enterprise-level hosting services.
o Reliance Communications - Reliance Internet Data Center (RIDC). RIDC supports both traditional hosting and cloud services, with data centers in Mumbai, Bangalore, Hyderabad, and Chennai. The cloud services offered by RIDC include IaaS and SaaS.
PaaS includes infrastructure (servers, storage, and networking) and platform (middleware,
development tools, database management systems, business intelligence, and more) to
support the web application life cycle.
PaaS providers provide the Programming languages, Application frameworks, Databases, and
Other tools:
2. Application frameworks
PaaS providers provide application frameworks to simplify application development. Some popular application frameworks provided by PaaS providers are Node.js, Drupal, Joomla, WordPress, Spring, Play, Rack, and Zend.
3. Databases
PaaS providers provide various databases such as ClearDB, PostgreSQL, MongoDB, and Redis
to communicate with the applications.
4. Other tools
PaaS providers provide various other tools that are required to develop, test, and deploy the
applications.
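To make the division of responsibility concrete, here is a minimal sketch of the kind of application a developer hands to a PaaS. It uses Flask purely as an example framework (an assumption, not a tool named by this tutorial); the platform supplies the runtime, servers, and scaling around it.

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # The developer writes only the application logic; the PaaS runs it.
    return "Hello from a PaaS-hosted app!"

if __name__ == "__main__":
    # Locally this starts a development server; on a PaaS the platform's own
    # runtime usually starts the app instead (e.g., via a process file).
    app.run(host="0.0.0.0", port=8080)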
Advantages of PaaS
There are the following advantages of PaaS -
1) Simplified Development
PaaS allows developers to focus on development and innovation without worrying about
infrastructure management.
2) Lower risk
No need for up-front investment in hardware and software. Developers only need a PC and an
internet connection to start building applications.
3) Prebuilt business functionality
Some PaaS vendors also provide predefined business functionality so that users can avoid building everything from scratch and can start their projects directly.
4) Instant community
PaaS vendors frequently provide online communities where developers can get ideas, share experiences, and seek advice from others.
5) Scalability
Applications deployed can scale from one to thousands of users without any changes to the
applications.
Disadvantages of PaaS cloud computing layer
1) Vendor lock-in
One has to write the applications according to the platform provided by the PaaS vendor, so migrating an application to another PaaS vendor can be a problem.
2) Data Privacy
Corporate data, whether critical or not, will be private, so if it is not located within the walls of the company, there can be a risk in terms of data privacy.
It may happen that some applications are local and some are in the cloud, so there will be increased complexity when we want to use data in the cloud together with local data.
Popular PaaS providers and their services:
o Google App Engine (GAE) - App Identity, URL Fetch, Cloud storage client library, Logservice.
o Salesforce.com - Faster implementation, Rapid scalability, CRM Services, Sales cloud, Mobile connectivity, Chatter.
Business Services - SaaS providers provide various business services to start up a business. The SaaS business services include ERP (Enterprise Resource Planning), CRM (Customer Relationship Management), billing, and sales.
Social Networks - As we all know, social networking sites are used by the general public, so
social networking service providers use SaaS for their convenience and handle the general
public's information.
Mail Services - To handle the unpredictable number of users and the load on e-mail services, many e-mail providers offer their services using SaaS.
1. SaaS pricing
SaaS pricing is based on a monthly or annual subscription fee, so it allows organizations to access business functionality at a low cost, which is less than that of licensed applications.
Unlike traditional software, which is sold as a license with an up-front cost (and often an optional ongoing support fee), SaaS providers generally price their applications using a subscription fee, most commonly a monthly or annual fee.
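As a rough illustration of the two pricing models, the short Python sketch below compares cumulative costs over time; all figures are hypothetical and only show the shape of the comparison.

# Hypothetical figures, for illustration only.
license_upfront = 12000          # one-time cost of a licensed application
license_support_per_year = 2000  # optional ongoing support fee
saas_fee_per_month = 400         # SaaS subscription fee

for years in (1, 3, 5):
    licensed = license_upfront + license_support_per_year * years
    saas = saas_fee_per_month * 12 * years
    print(f"After {years} year(s): licensed = {licensed}, SaaS = {saas}")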
2. One to Many
SaaS services are offered on a one-to-many model, which means a single instance of the application is shared by multiple users.
The software is hosted remotely, so organizations do not need to invest in additional hardware.
Software as a service removes the need for installation, set-up, and daily maintenance for organizations. The initial set-up cost for SaaS is typically less than for enterprise software. SaaS vendors price their applications based on usage parameters, such as the number of users using the application, so SaaS is easy to monitor, and updates are automatic.
All users will have the same version of the software and typically access it through the web browser. SaaS reduces IT support costs by outsourcing hardware and software maintenance and support to the SaaS provider.
6. Multidevice support
SaaS services can be accessed from any device such as desktops, laptops, tablets, phones, and
thin clients.
7. API Integration
SaaS services easily integrate with other software or services through standard APIs (a sketch follows this list).
8. No client-side installation
SaaS services are accessed directly from the service provider over an internet connection, so they do not require any client-side software installation.
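As promised above, here is a minimal sketch of integrating with a SaaS service through a standard REST API using Python's requests library; the endpoint URL and token are hypothetical placeholders, since every SaaS provider documents its own API.

import requests

API_URL = "https://api.example-saas.com/v1/contacts"  # hypothetical endpoint
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}   # placeholder token

# Fetch records from the SaaS application so another system can reuse them.
response = requests.get(API_URL, headers=headers, timeout=10)
response.raise_for_status()
for contact in response.json():
    print(contact)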
Disadvantages of SaaS cloud computing layer
1) Security
Actually, data is stored in the cloud, so security may be an issue for some users. However, cloud computing is not more secure than in-house deployment.
2) Latency issue
Since data and applications are stored in the cloud at a variable distance from the end user, there is a possibility of greater latency when interacting with the application compared to local deployment. Therefore, the SaaS model is not suitable for applications that demand response times in milliseconds.
Switching SaaS vendors involves the difficult and slow task of transferring very large data files over the internet and then converting and importing them into another SaaS application.
The below table shows some popular SaaS providers and services that are provided by them -
In other words, virtualization is a technique that allows sharing a single physical instance of a resource or an application among multiple customers and organizations. It does this by assigning a logical name to a physical resource and providing a pointer to that physical resource when demanded.
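A toy Python sketch of the idea just described: a logical name is mapped to a physical resource, and a pointer to that resource is handed out only when it is demanded. The names and device paths are made up for illustration.

class VirtualStorage:
    def __init__(self):
        self._mapping = {}  # logical name -> physical location

    def register(self, logical_name, physical_location):
        self._mapping[logical_name] = physical_location

    def resolve(self, logical_name):
        # Return a "pointer" to the physical resource on demand.
        return self._mapping[logical_name]

storage = VirtualStorage()
storage.register("customer-data", "/dev/sdb1")  # one tenant's logical volume
storage.register("billing-data", "/dev/sdc2")   # another tenant's logical volume
print(storage.resolve("customer-data"))         # -> /dev/sdb1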
The machine on which the virtual machine is created is known as the Host Machine, and that virtual machine is referred to as the Guest Machine.
Types of Virtualization:
1. Hardware Virtualization.
2. Operating system Virtualization.
3. Server Virtualization.
4. Storage Virtualization.
1) Hardware Virtualization:
When the virtual machine software or virtual machine manager (VMM) is directly installed on the hardware system, it is known as hardware virtualization.
The main job of the hypervisor is to control and monitor the processor, memory, and other hardware resources.
After virtualization of the hardware system, we can install different operating systems on it and run different applications on those operating systems.
Usage:
Hardware virtualization is mainly done for the server platforms, because controlling virtual
machines is much easier than controlling a physical server.
Usage:
Operating System Virtualization is mainly used for testing the applications on different
platforms of OS.
3) Server Virtualization:
When the virtual machine software or virtual machine manager (VMM) is directly installed on the server system, it is known as server virtualization.
Usage:
Server virtualization is done because a single physical server can be divided into multiple servers on demand and for load balancing.
4) Storage Virtualization:
Storage virtualization is the process of grouping the physical storage from multiple network
storage devices so that it looks like a single storage device.
Usage:
To overcome this problem, we basically use virtualization technology. By using virtualization, all the servers and software applications required by other cloud providers are maintained by third-party people, and the cloud providers pay for them on a monthly or annual basis.
Conclusion
Mainly, virtualization means running multiple operating systems on a single machine while sharing all the hardware resources. It helps us provide a pool of IT resources that we can share in order to gain benefits in the business.
Data Virtualization
Data virtualization is the process of retrieving data from various resources without knowing its type or the physical location where it is stored. It collects heterogeneous data from different resources and allows data users across the organization to access this data according to their work requirements. This heterogeneous data can be accessed using any application, such as web portals, web services, e-commerce, Software as a Service (SaaS), and mobile applications.
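A toy sketch of this idea in Python: one access layer pulls records from heterogeneous sources (two in-memory stand-ins for a database and a web service here) without the caller knowing where the data physically lives. The sample records are invented for illustration.

# Stand-ins for two heterogeneous data sources.
crm_database = [{"id": 1, "name": "Asha", "source": "CRM database"}]
orders_service = [{"id": 2, "name": "Ravi", "source": "Orders API"}]

def virtual_customer_view():
    # Combine the sources into a single unified view for the data user.
    return crm_database + orders_service

for record in virtual_customer_view():
    print(record["name"], "from", record["source"])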
o It allows users to access the data without worrying about where it resides on the memory.
o It offers better customer satisfaction, retention, and revenue growth.
o It provides various security mechanisms that allow users to safely store their personal and professional information.
o It reduces costs by removing data replication.
o It provides a user-friendly interface to develop customized views.
o It provides various simple and fast deployment resources.
o It increases business user efficiency by providing data in real-time.
o It is used to perform tasks such as data integration, business integration, Service-Oriented
Architecture (SOA) data services, and enterprise search.
1. Analyze performance
Data Virtualization (DV) provides a mechanism to easily search for data that is similar and internally related.
It is one of the most common uses of Data Virtualization. It is used in agile reporting and real-time dashboards that require timely aggregation, analysis, and presentation of relevant data from multiple resources. Both individuals and managers use this to monitor performance, which helps with daily operational decision processes in areas such as sales, support, finance, logistics, legal, and compliance.
4. Data Management
Data virtualization provides a secure centralized layer to search, discover, and govern the
unified data and its relationships.
Red Hat virtualization is the best choice for developers and for those who are using microservices and containers. It is written in Java.
TIBCO helps administrators and users create a data virtualization platform for accessing multiple data sources and data sets. It provides a built-in transformation engine to combine non-relational and unstructured data sources.
It is a very popular and powerful data integration tool which mainly works with Oracle products. It allows organizations to quickly develop and manage data services to access a single view of data.
SAS Federation Server provides various technologies such as scalable, multi-user, and
standards-based data access to access data from multiple data services. It mainly focuses on
securing data.
5. Denodo
Denodo is one of the best data virtualization tools which allows organizations to minimize the
network traffic load and improve response time for large data sets. It is suitable for both small
as well as large organizations.
Hardware Virtualization
Previously, there was a "one-to-one relationship" between physical servers and operating systems. Only low-capacity CPU, memory, and networking were available. So, by using this model, the costs of doing business increased. The physical space, amount of power, and hardware required meant that costs kept adding up.
The hypervisor manages the shared physical resources of the hardware between the guest operating systems and the host operating system. The physical resources become abstracted versions in standard formats regardless of the hardware platform. The abstracted hardware is represented as if it were actual hardware, and the virtualized operating system sees these resources as if they were physical entities.
When the virtual machine software, virtual machine manager (VMM), or hypervisor software is directly installed on the hardware system, it is known as hardware virtualization.
The main job of the hypervisor is to control and monitor the processor, memory, and other hardware resources.
After virtualization of the hardware system, we can install different operating systems on it and run different applications on those operating systems.
Usage of Hardware Virtualization
Hardware virtualization is mainly done for the server platforms, because controlling virtual
machines is much easier than controlling a physical server.
Physical resources can be shared among virtual machines. Unused resources allocated to one virtual machine can be used by other virtual machines if the need exists.
Now it is possible for multiple operating systems to co-exist on a single hardware platform, so the number of servers, rack space, and power consumption drop significantly.
Modern hypervisors provide highly orchestrated operations that maximize the abstraction of the hardware and help ensure maximum uptime. These functions help to migrate a running virtual machine from one host to another dynamically, as well as maintain a running copy of a virtual machine on another physical host in case the primary host fails.
4) Increased IT Flexibility:
Hardware virtualization enables quick deployment of server resources in a managed and consistent way. This results in IT being able to adapt quickly and provide the business with the resources needed in good time.
Software Virtualization
Managing applications and their distribution is a demanding task for IT departments. The installation mechanism differs from application to application. Some programs require certain helper applications or frameworks, and these may conflict with existing applications.
Software virtualization is just like virtualization, but it abstracts the software installation procedure and creates virtual software installations.
Virtualized software is an application that is "installed" into its own self-contained unit. Examples of software virtualization are VMware software, VirtualBox, etc. In the next pages, we are going to see how to install the Linux OS and Windows OS on the VMware application.
1) Client deployments become easier:
Virtual software can be installed easily by copying a file to a workstation or by linking a file in a network.
2) Easy to manage:
Managing updates becomes a simpler task. You need to update in one place and deploy the updated virtual application to all clients.
3) Software Migration:
Without software virtualization, moving from one software platform to another takes a lot of deployment time and impacts end-user systems. With the help of a virtualized software environment, the migration becomes easier.
Server Virtualization
Server Virtualization is the process of dividing a physical server into several virtual servers,
called virtual private servers. Each virtual private server can run independently.
The concept of server virtualization is widely used in IT infrastructure to minimize costs by increasing the utilization of existing resources.
The hypervisor is mainly used to perform various tasks such as allocating physical hardware resources (CPU, RAM, etc.) to several smaller independent virtual machines, called "guests", on the host machine.
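A toy Python sketch of that allocation role: a "hypervisor" object carves a host's CPU and RAM into guest virtual machines and refuses requests that exceed what is physically available. This illustrates the concept only and is not how a real hypervisor is implemented.

class Hypervisor:
    def __init__(self, total_cpus, total_ram_gb):
        self.free_cpus = total_cpus
        self.free_ram_gb = total_ram_gb
        self.guests = {}

    def create_guest(self, name, cpus, ram_gb):
        # Refuse the request if the host lacks free resources.
        if cpus > self.free_cpus or ram_gb > self.free_ram_gb:
            raise ValueError(f"not enough free resources for {name}")
        self.free_cpus -= cpus
        self.free_ram_gb -= ram_gb
        self.guests[name] = {"cpus": cpus, "ram_gb": ram_gb}

host = Hypervisor(total_cpus=16, total_ram_gb=64)
host.create_guest("web-vps", cpus=4, ram_gb=8)
host.create_guest("db-vps", cpus=8, ram_gb=32)
print(host.guests, "| free:", host.free_cpus, "CPUs,", host.free_ram_gb, "GB RAM")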
Full Virtualization uses a hypervisor to directly communicate with the CPU and physical server.
It provides the best isolation and security mechanism to the virtual machines.
The biggest disadvantage of using hypervisor in full virtualization is that a hypervisor has its
own processing needs, so it can slow down the application and server performance.
3. Para Virtualization
Para virtualization is quite similar to full virtualization. The advantages of using this virtualization are that it is easier to use, offers enhanced performance, and does not require emulation overhead. Xen and UML primarily use para virtualization.
The difference between full and para virtualization is that, in para virtualization, the hypervisor does not need much processing power to manage the OS.
Hardware-assisted virtualization was introduced by AMD and Intel. It is also known as hardware virtualization, AMD virtualization, and Intel virtualization. It is designed to increase the performance of the processor. The advantage of using hardware-assisted virtualization is that it requires less hypervisor overhead.
6. Kernel-Level Virtualization
Advantages of Server Virtualization
1. Independent Restart
In server virtualization, each server can be restarted independently without affecting the working of the other virtual servers.
2. Low Cost
Server Virtualization can divide a single server into multiple virtual private servers, so it reduces
the cost of hardware components.
3. Disaster Recovery<
Disaster Recovery is one of the best advantages of Server Virtualization. In Server Virtualization,
data can easily and quickly move from one server to another and these data can be stored and
retrieved from anywhere.
Server virtualization allows us to deploy our resources in a simpler and faster way.
5. Security
It allows users to store their sensitive data inside the data centers.
Disadvantages of Server Virtualization
1. The biggest disadvantage of server virtualization is that when the server goes offline, all the websites hosted by the server will also go down.
2. There is no way to measure the performance of virtualized environments.
3. It requires a huge amount of RAM consumption.
4. It is difficult to set up and maintain.
5. Some core applications and databases do not support virtualization.
6. It requires extra hardware resources.
Storage Virtualization
As we know, there has traditionally been a strong link between the physical host and the locally installed storage devices. However, that paradigm has been changing drastically; local storage is almost no longer needed. As technology progresses, more advanced storage devices are coming to the market that provide more functionality and make local storage obsolete.
Storage virtualization is a major component of storage servers, in the form of functional RAID levels and controllers. Operating systems and applications with device access can access the disks directly by themselves for writing. The controllers configure the local storage in RAID groups and present the storage to the operating system depending upon the configuration. However, the storage is abstracted, and the controller determines how to write the data or retrieve the requested data for the operating system.
Storage virtualization is becoming more and more important in various other forms:
File servers: The operating system writes the data to a remote location with no need to
understand how to write to the physical media.
WAN Accelerators: Instead of sending multiple copies of the same data over the WAN
environment, WAN accelerators will cache the data locally and present the re-requested blocks
at LAN speed, while not impacting the WAN performance.
SAN and NAS: Storage is presented over the Ethernet network to the operating system. NAS presents the storage as file operations (like NFS). SAN technologies present the storage as block-level storage (like Fibre Channel). SAN technologies receive the operating instructions as if the storage were a locally attached device.
Storage Tiering: Utilizing the storage pool concept as a stepping stone, storage tiering analyzes the most commonly used data and places it on the highest-performing storage pool. The least-used data is placed on the weakest-performing storage pool.
This operation is done automatically without any interruption of service to the data consumer.
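A toy Python sketch of the tiering idea: blocks with the highest access counts are placed on the fastest pool and the least-used on the slowest. The block names, counts, and pool names are invented for illustration.

# Hypothetical access statistics and storage pools (fastest to slowest).
access_counts = {"block-a": 980, "block-b": 12, "block-c": 450, "block-d": 3}
tiers = ["ssd-pool", "sas-pool", "sata-pool", "archive-pool"]

# Rank blocks from most to least used and assign them to tiers in that order.
ranked = sorted(access_counts, key=access_counts.get, reverse=True)
placement = {block: tiers[min(i, len(tiers) - 1)] for i, block in enumerate(ranked)}
print(placement)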
OS Virtualization
With the help of OS virtualization, nothing is pre-installed or permanently loaded on the local device, and no hard disk is needed. Everything runs from the network using a kind of virtual disk. This virtual disk is actually a disk image file stored on a remote server, SAN (Storage Area Network), or NAS (Network Attached Storage). The client is connected over the network to this virtual disk and boots with the operating system installed on the virtual disk.
The first component is the OS Virtualization server. This server is the central point in the OS Virtualization infrastructure. The server manages the streaming of the information on the virtual disks for the client and also determines which client will be connected to which virtual disk (this information is stored in a database). Also, the server can host the storage for the virtual disks locally, or the server can be connected to the virtual disks via a SAN (Storage Area Network). In high-availability environments, there can be several OS Virtualization servers to provide redundancy and load balancing. The server also ensures that the client will be unique within the infrastructure.
Secondly, there is a client which contacts the server to get connected to the virtual disk and asks for the components stored on the virtual disk to run the operating system.
The available supporting components are a database for storing the configuration and settings of the server, a streaming service for the virtual disk content, an (optional) TFTP service, and an (also optional) PXE boot service for connecting the client to the OS Virtualization servers.
As already mentioned, the virtual disk contains an image of a physical disk from the system, reflecting the configuration and settings of the systems that will use the virtual disk. When the virtual disk is created, that disk needs to be assigned to the client that will use it for starting. The connection between the client and the disk is made through the administrative tool and saved within the database. When a client has an assigned disk, the machine can be started with the virtual disk using the following process:
First, we start the machine and set up the connection with the OS Virtualization server. Most of the products offer several possible methods to connect with the server. One of the most popular methods is using a PXE service, but a bootstrap is also used a lot (because of the disadvantages of the PXE service). Either way, each method initializes the network interface card (NIC), receives a (DHCP-based) IP address, and establishes a connection to the server.
When the connection is established between the client and the server, the server will look into its database to check whether the client is known or unknown and which virtual disk is assigned to the client. When more than one virtual disk is assigned, a boot menu will be displayed on the client side. If only one disk is assigned, that disk will be connected to the client, as mentioned in step number 3.
After the desired virtual disk is selected by the client, that virtual disk is connected through the OS Virtualization server. At the back end, the OS Virtualization server makes sure that the client is unique (for example, computer name and identifier) within the infrastructure.
As soon as the disk is connected, the server starts streaming the content of the virtual disk. The software knows which parts are necessary for starting the operating system smoothly, so these parts are streamed first. The information streamed into the system should be stored somewhere (i.e., cached). Most products offer several ways to cache that information, for example on the client hard disk or on the disk of the OS Virtualization server.
5) Additional Streaming:
After the first part is streamed, the operating system starts to run as expected. Additional virtual disk data is streamed when required for running or starting a function called by the user (for example, starting an application available within the virtual disk).
Linux OS Virtualization
VMware Workstation software is used to virtualize the operating system. To install any operating system virtually, you need to install the VMware software. We are using VMware Workstation 10.
Before installing the Linux OS, you need to have the ISO image file of the Linux OS. Let's see the steps to install the Linux OS virtually.
4) In the Guest operating system window, choose the ISO image file from the disk or any drive. I have put the ISO file of Ubuntu in the E: drive. So browse to your ISO image and click on the Next button.
5) In the Easy Install Information window, provide the full name, username, password, and confirm password, then click on the Next button.
You can see the given information.
6) In the processor configuration window, you can select the number of processors and the number of cores per processor. If you don't want to change the default settings, just click on Next.
7) In the memory for the virtual machine window, you can set the memory limit. Click on the Next button.
8) In the specify disk capacity window, you can set the disk size. Click on the Next button.
9) In the specify disk file window, you can specify the disk file, then click on the Next button.
10) In the ready to create virtual machine window, click on the Finish button.
11) Now you will see the VMware screen and then the Ubuntu screen.
Windows OS Virtualization
To install the Windows OS virtually, you need to install VMware first. After installing the virtualization software, you will get a window to install the new operating system.
Features of AWS
AWS provides various powerful features for building scalable, cost-effective enterprise applications. Some important features of AWS are given below -
2. Microsoft Azure
Microsoft Azure is also known as Windows Azure. It supports various operating systems, databases, programming languages, and frameworks that allow IT professionals to easily build, deploy, and manage applications through a worldwide network. It also allows users to create different groups for related utilities.
5. VMware Cloud
VMware Cloud is a unified Software-Defined Data Center (SDDC) platform for the hybrid cloud.
It allows cloud providers to build agile, flexible, efficient, and robust cloud services.
Features of VMware
o VMware Cloud works on a pay-as-you-go model and monthly subscriptions.
o It provides better customer satisfaction by protecting the user's data.
o It can easily create a new VMware Software-Defined Data Center (SDDC) cluster on AWS
cloud by utilizing a RESTful API.
o It provides flexible storage options. We can manage our application storage on a per-
application basis.
o It provides a dedicated high-performance network for managing the application traffic and also
supports multicast networking.
o It eliminates time and cost complexity.
6. Oracle cloud
The Oracle Cloud platform is offered by Oracle Corporation. It combines Platform as a Service,
Infrastructure as a Service, Software as a Service, and Data as a Service with cloud
infrastructure. It is used to perform tasks such as moving applications to the cloud, managing
development environments in the cloud, and optimizing connection performance.
7. Red Hat
Red Hat Virtualization is an open-source server and desktop virtualization platform produced by
Red Hat. It is very popular in Linux environments for providing infrastructure solutions for
virtualized servers as well as technical workstations. Many small and medium-sized
organizations use Red Hat to run their operations smoothly. It offers higher density, better
performance, agility, and security for resources. It also improves an organization's economics
by providing cheaper and easier management capabilities.
8. DigitalOcean
DigitalOcean is a cloud provider that offers computing services to organizations. It was founded
in 2011 by Moisey Uretsky and Ben Uretsky. It is one of the best cloud providers for deploying
and managing web applications.
Features of DigitalOcean
It uses the KVM hypervisor to allocate physical resources to the virtual servers.
It provides high-quality performance.
It offers a community platform that helps answer queries and gather feedback.
It allows developers to use cloud servers to quickly create new virtual machines for their
projects (see the sketch after this list).
It offers one-click apps for droplets. These apps include MySQL, Docker, MongoDB,
WordPress, phpMyAdmin, LAMP stack, Ghost, and Machine Learning.
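As a brief illustration of the developer-focused API mentioned above, here is a hedged sketch that creates a droplet through DigitalOcean's public v2 REST API. The token, droplet name, region, size, and image values are placeholders you would replace with your own.

# Sketch only: creating a DigitalOcean droplet via the public v2 REST API.
# The token and droplet parameters below are placeholders, not working values.
import requests

API_TOKEN = "your-digitalocean-api-token"  # assumption: a personal access token
headers = {"Authorization": f"Bearer {API_TOKEN}", "Content-Type": "application/json"}

droplet = {
    "name": "example-droplet",      # hypothetical droplet name
    "region": "nyc3",               # example region slug
    "size": "s-1vcpu-1gb",          # example size slug
    "image": "ubuntu-22-04-x64",    # example image slug
}

resp = requests.post("https://api.digitalocean.com/v2/droplets",
                     json=droplet, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.json()["droplet"]["id"])  # the new droplet's ID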
9. Rackspace
Rackspace offers cloud computing services such as hosting web applications, Cloud Backup,
Cloud Block Storage, Databases, and Cloud Servers. The main aim of Rackspace is to make
private and public cloud deployments easy to manage. Its data centers operate in the USA, UK,
Hong Kong, and Australia.
Features of Rackspace
Rackspace provides various tools that help organizations to collaborate and communicate
more efficiently.
We can access files stored on the Rackspace cloud drive anywhere, anytime, using any device.
It offers six data centers globally.
It can manage both virtual servers and dedicated physical servers on the same network.
It provides better performance at a lower cost.
4. The Choose an Amazon Machine Image (AMI) page displays a list of basic configurations, called
Amazon Machine Images (AMIs), that serve as templates for your instance. Select the 64-bit
version of Microsoft Windows Server 2008 R2. Notice that this configuration is marked as Free
tier eligible.
5. On the Choose an Instance Type page, you can select the hardware configuration for your instance.
The t1.micro instance is selected by default. Click Review and Launch to let the wizard
complete the other configuration settings for you, so you can get started quickly.
6. On the Review Instance Launch page, check the settings for your instance.
Under Security Groups, you will see that the wizard has created and selected a
security group for you. The security group includes basic firewall rules that enable
you to connect to your instance. For a Windows instance, you connect through Remote
Desktop Protocol (RDP) on port 3389.
If you have an existing security group you would rather use, click Edit security
groups and select your group on the Configure Security Group page. When done,
click Review and Launch to return to the Review Instance Launch page.
7. Click on Launch.
8. In the Select an existing key pair or create a new key pair dialog box, you can select Choose
an existing key pair to use a key pair you already created.
Alternatively, you can create a new key pair. Select Create a new key pair, enter a name
for the key pair, and then click Download Key Pair.
This is the only chance for you to save the private key file, so be sure to download it.
Save the private key file in a safe place. You'll need to provide the name of your key pair
when you launch an instance and the corresponding private key each time you connect
to the instance.
The key is downloaded as a .pem file; save it for future use.
Attention
Don't select the Proceed without a key pair option. If you launch your instance without
a key pair, then you can't connect to it.
When you are ready, select the acknowledgement check box, and then click Launch
Instances.
9. A confirmation page opens to let you know that your instance is launching. Click View Instances to
close the confirmation page and return to the console.
10. On the Instances page, you can view the status of the launch. It takes a short time for an
instance to launch. When you launch an instance, its initial state is pending. After the instance
starts, its state changes to running and it receives a public DNS name.
11. Record the public DNS name for your instance because you'll need it for the next step.
12. (Optional) After your instance is launched, you can view its security group rules. From the
Instances page, select the instance. In the Description tab, find Security groups and click view
rules.
As you can see, if you used the security group the wizard created for you, it contains
one rule that allows RDP traffic from any IP source to port 3389. If you launch a
Windows instance running IIS and SQL Server, the wizard creates a security group that
contains additional rules to allow traffic on port 80 for HTTP (for IIS) and port 1433 for MS SQL.
Note: Windows instances are limited to two simultaneous remote connections. If you attempt a
third connection, an error will occur. For more information, see Configure the Number of
Simultaneous Remote Connections Allowed for a Connection.
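The console walkthrough above can also be scripted. Below is a minimal sketch using boto3 that launches an instance with an existing key pair and security group; the region, AMI ID, key pair name, and security group name are placeholder assumptions, not values taken from this tutorial.

# Sketch only: launching an EC2 instance programmatically, mirroring the wizard steps above.
# The region, AMI ID, key pair name, and security group name are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder Windows Server AMI ID
    InstanceType="t1.micro",           # the free-tier type selected in the wizard
    KeyName="my-key-pair",             # the key pair you downloaded as a .pem file
    SecurityGroups=["my-rdp-group"],   # a group that allows RDP on port 3389
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)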
1. In the Amazon EC2 console, select the instance, and then click Connect.
2. In the Connect To Your Instance dialog box, click Get Password (it will take a few minutes after
the instance is launched before the password is available).
3. Click Browse and navigate to the private key file you created when you launched the instance.
Select the file and click Open to copy the entire contents of the file into the Contents box.
4. Click Decrypt Password. The console displays the default administrator password for the
instance in the Connect To Your Instance dialog box, replacing the link to Get
Password shown previously with the actual password.
5. Click Download Remote Desktop File. Your browser prompts you to either open or save
the .rdp file. Either option is fine. When you have finished, you can click Close to dismiss
the Connect To Your Instance dialog box.
6. If you opened the .rdp file, you will see the Remote Desktop Connection dialog box. If you
saved the .rdp file then navigate to your downloads directory, and double-click the .rdp file to
display the dialog box. You will get a warning that the publisher of the remote connection is
unknown. Click Connect to connect to your instance. You may get a warning that the security
certificate could not be authenticated. Click Yes to continue.
7. Log in to the instance as prompted, using the default Administrator account and the default
administrator password that you recorded or copied previously.
After you connect, we recommend that you do the following:
o Change the Administrator password from the default value. You change the password while
logged on to the instance itself, just as you would on any other Windows Server.
o Create another user account with administrator privileges on the instance. Another account with
administrator privileges is a safeguard if you forget the Administrator password or have a
problem with the Administrator account.
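The Get Password and Decrypt Password steps above can also be performed from code. The following is a hedged sketch using boto3 and the cryptography library; the instance ID and .pem file name are placeholders, and it assumes the password data is encrypted with the key pair's RSA public key using PKCS#1 v1.5 padding, which is how AWS client tools typically decrypt it.

# Sketch only: retrieving and decrypting the Windows administrator password.
# The instance ID and key file name are placeholders, not real values.
import base64
import boto3
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import padding

ec2 = boto3.client("ec2", region_name="ap-southeast-1")  # assumed region
resp = ec2.get_password_data(InstanceId="i-0123456789abcdef0")  # placeholder ID
encrypted = base64.b64decode(resp["PasswordData"].strip())  # empty until the password is ready

with open("my-key-pair.pem", "rb") as f:  # the private key file saved at launch time
    private_key = serialization.load_pem_private_key(f.read(), password=None)

# Assumption: PKCS#1 v1.5 padding, the scheme AWS tooling uses for this data.
password = private_key.decrypt(encrypted, padding.PKCS1v15())
print(password.decode("utf-8"))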
Now, how can you share your local drives with the Singapore (or any other) data center?
Now start the Remote Desktop Connection (RDP) client from the Windows machine as shown below. Run
the mstsc command from the Run menu.
In the Remote audio area, select Settings to configure the audio settings of your instance.
In the Local Resources tab, in the Local devices and resources area, click More. All the plug-
and-play devices that can be made available over the network to the AWS EC2 server instance are
listed, as well as the disk drives.
Click Connect.
Now we can install anything in the Singapore datacenter from our local drives.
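For completeness, the same connection can be started from a script on the local Windows machine. This is a small sketch; the public DNS name is a placeholder for the one recorded earlier.

# Sketch only: launching Remote Desktop Connection (mstsc) against the instance from Windows.
# The DNS name below is a placeholder for the public DNS name recorded earlier.
import subprocess

public_dns = "ec2-xx-xx-xx-xx.ap-southeast-1.compute.amazonaws.com"  # placeholder

# /v: tells mstsc which remote computer to connect to; drive and audio sharing
# can then be chosen in the Local Resources tab as described above.
subprocess.run(["mstsc", f"/v:{public_dns}"], check=False)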
AWS provides the largest community, with millions of active customers and thousands of
partners globally. Many organizations use AWS to expand their business by moving their
IT management to AWS.
Flexibility, security, scalability, and better performance are some important features of AWS.
Microsoft Azure
Microsoft Azure is also called Windows Azure. It is a worldwide cloud platform used for
building, deploying, and managing services. It supports multiple programming languages such
as Java, Node.js, C, and C#. The advantage of using Microsoft Azure is that it gives us access to
a wide variety of services without arranging and purchasing additional hardware components.
Microsoft Azure provides several computing services, including servers, storage, databases,
software, networking, and analytics over the Internet.
Google Cloud Platform (GCP) provides various cloud computing services, including computing,
data analytics, data storage, and machine learning.
App Testing: AWS uses Device Farm, Azure uses DevTest Labs, and Google Cloud uses Cloud Test Lab.
Cloud Server
Google updated its algorithm in July 2018 to include page load speed as a ranking metric. If
customers leave a page because of its load time, the page's rankings suffer.
Load time is just one example of the significance of hosting services and their effect on the
overall profitability of a company.
Now, to understand the significance of web hosting servers, let's break down the distinction
between the two key kinds of services provided: cloud hosting and dedicated servers.
Each has certain benefits and drawbacks that can become especially significant to an
organization on a budget, facing time restrictions, or looking to grow. The definitions and
differences you need to know are discussed here.
Cloud Ecosystem
A cloud ecosystem is a dynamic system of interrelated components, all of which come together to
make cloud services possible. The cloud infrastructure behind these services is made up of software
and hardware components, as well as cloud clients, cloud experts, vendors, integrators, and partners.
The cloud is a technique that makes a virtually limitless pool of servers function as a single
entity. When data is stored "in the cloud," it is kept in a virtual environment that can draw
resources from numerous physical platforms placed in different geographic locations across the
world. The hubs of this environment are individual servers, mostly housed in data center
facilities, that are linked so they can exchange services in virtual space. Together, they form the cloud.
To distribute computing resources, cloud servers rely on pooled storage such as Ceph or a large
Storage Area Network (SAN). Because hosting and virtual server data are decoupled in this way,
a server's state can easily be migrated out of its environment in the event of a malfunction.
A hypervisor is typically installed to manage the cloud servers of various sizes that the physical
hardware is split into. It also controls the assignment of hardware resources, such as processor
cores, RAM, and storage space, to each cloud server.
The word "dedicated" derives from the fact that the hardware is isolated from any other physical
environment around it. The equipment is deliberately designed to offer industry-leading
efficiency, power, longevity and, most importantly, durability.
The core-level support for cloud storage is provided by devices known as bare metal servers. A
public cloud is mainly composed of various bare metal nodes, typically housed in protected
colocation network infrastructure. Each of these physical servers hosts multiple virtual servers.
A virtual machine can be built in a couple of seconds, and when it is no longer required, it can be
discarded just as quickly. It is also easy to move data to a virtual server without in-depth
hardware upgrades. Versatility is another main benefit of cloud infrastructure, and it is a quality
that is central to the cloud service concept.
Within such a cloud, there are several web servers providing services from the same physical
environment. And although each underlying device is a bare metal server, what consumers pay
for and eventually use is the virtual environment.
With a dedicated server, all of the server's resources are offered to the single client who leases or
purchases the hardware. Services are tailored to the customer's requirements, such as storage,
RAM, bandwidth load, and processor type. Dedicated hosting servers are among the most
powerful computers on the market and most often include several processors.
A dedicated setup may also require a cluster of servers, with several dedicated servers connected
to one virtual network location. Even then, only one customer has access to the resources in that
virtual environment.
Using dedicated servers for back-end operations is one of the most common hybrid cloud
architectures: the dedicated servers' power provides the most stable storage and communication
environment, while the front end is hosted on cloud servers. This architecture works well for
Software as a Service (SaaS) applications, which need flexibility and scalability depending on
customer load.
Cloud hosting and physical (dedicated) hosting should also be distinguished from shared hosting
and virtual private server (VPS) services.
Modern cloud-based systems and dedicated servers are both capable of handling almost any
service or application. They can be managed with similar back-end tools, so both approaches can
run similar workloads. The difference lies in the results they deliver.
Matching the right approach to a workload saves organizations money, increases flexibility and
agility, and helps optimize the use of resources.
1. Scalability
Dedicated hosting scales differently from cloud-based servers. Its storage is constrained by the
number of drive bays of direct-attached storage (DAS) present on the server. With an existing
logical volume manager (LVM) setup, a RAID controller, and an available bay, a dedicated
server may be able to add another disk, but hot swapping is more complicated for DAS arrays.
Cloud server space, in contrast, is readily expandable (and contractible). The cloud server does
not even have to be part of the change when more storage capacity is added, since the SAN sits
away from the host. In the cloud world, extending capacity does not cause any slowdown.
Upgrading processors on a dedicated server also requires more money and resources, not to
mention operational downtime. A web server on a single device that needs additional processing
capacity requires a complete migration to, or connection of, another server.
2. Performance
For a business looking for fast deployment and information retrieval, dedicated servers are
typically the preferred option. Because they process data locally, they do not experience
significant delays when carrying out operations.
This speed is particularly essential for industries such as e-commerce, in which every tenth of a
second counts. To manage data, cloud servers have to go through the SAN, which routes the
operation through the back end of the architecture.
The request must also pass through the hypervisor. This additional processing imposes a certain
amount of latency that cannot be reduced.
The processors on a dedicated server are devoted exclusively to the hosted website or
application. They do not need to queue requests unless all of the computing capacity is in use
at once (which is highly unlikely). This makes dedicated servers an excellent option for
businesses with CPU-intensive load-balancing operations. CPUs in a cloud system need
management to prevent performance from degrading, and the existing set of hosts cannot
accommodate requests without an additional amount of lag.
A dedicated server's network is wholly committed to the hosted site or application, which keeps
the environment from being throttled. Especially in comparison to the cloud world, this degree
of commitment makes networking a simple matter.
Sharing the physical network in a cloud system poses a real risk of bandwidth being throttled. If
more than one tenant is using the same channel concurrently, both tenants can experience a
variety of adverse effects.
Scaling, updates, and repairs are a collaborative effort between customers and providers and
should be strategically planned to keep downtime to a minimum. Cloud servers are more
convenient to handle in this respect: changes are quicker and have much less effect on
operations.
4. Cost Comparison
Normally, cloud servers have a lower initial cost than dedicated servers. However, when a
business scales and needs additional capacity, cloud servers start losing this advantage.
There are also some features that can raise the price of both cloud and dedicated servers. For
example, running a cloud server over a dedicated network interface can be very costly.
Usually, cloud servers are paid for on a recurring OpEx (operational expenditure) model, while
physical servers are generally a CapEx (capital expenditure) alternative. CapEx lets you write off
the assets at no extra cost, and the capital investment can typically be depreciated over a
period of three years.
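As a rough, purely illustrative comparison of the two payment models, the sketch below amortizes a hypothetical dedicated-server purchase over three years and compares it with a hypothetical monthly cloud bill; every figure is an invented assumption, not a real price.

# Illustrative only: comparing a recurring OpEx cloud bill with an amortized CapEx purchase.
# Every figure below is a made-up assumption used purely for the arithmetic.
cloud_monthly_cost = 450.0          # hypothetical OpEx: monthly cloud server bill
dedicated_purchase = 9000.0         # hypothetical CapEx: upfront dedicated server price
dedicated_monthly_upkeep = 150.0    # hypothetical power, rack space, and maintenance per month
months = 36                         # the roughly three-year write-off period mentioned above

cloud_total = cloud_monthly_cost * months
dedicated_total = dedicated_purchase + dedicated_monthly_upkeep * months

print(f"Cloud (OpEx) over {months} months:     ${cloud_total:,.2f}")       # $16,200.00
print(f"Dedicated (CapEx) over {months} months: ${dedicated_total:,.2f}")  # $14,400.00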
5. Migration
A streamlined migration can be achieved with both dedicated and cloud hosting services, but
migration requires more preparation in a dedicated environment. To execute a smooth migration,
the new setup should take both current and future needs into account, and the plan should be
made at full scale.
In most instances, the old and new implementations can run simultaneously until the new server
is fully ready to take over. Keeping the existing systems as a backup is also recommended until
the new approach has been sufficiently tested.
3) What is a cloud?
A cloud is a combination of networks, hardware, services, storage, and interfaces that helps in
delivering computing as a service. It has three types of users:
1. End users
2. Business management users
3. Cloud service providers
4) What are the different data types used in cloud computing?
There are different data types in cloud computing, such as emails, contracts, images, blogs, etc. As
data increases day by day, new data types are needed to store this new data. For example, if you
want to store video, you need a new data type.
o Apache Hadoop
o MapReduce
Platform as a Service (PaaS): It provides a cloud application platform for developers.
Software as a Service (SaaS): It provides cloud applications to users directly, without
installing anything on the system. These applications remain in the cloud.
11) What are the platforms used for large scale cloud computing?
Apache Hadoop and MapReduce are the platforms used for large-scale cloud computing.
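To illustrate the MapReduce model this answer refers to, here is a tiny word-count sketch in plain Python. It only imitates the map, shuffle, and reduce phases in memory; it is not Hadoop code.

# Toy illustration of the MapReduce idea (word count), executed entirely in memory.
# This is not Hadoop or any real framework, just the map/shuffle/reduce pattern.
from collections import defaultdict

documents = ["the cloud stores data", "the cloud scales"]  # sample input

# Map phase: emit (word, 1) pairs for every word in every document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the emitted values by key (the word).
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: sum the counts for each word.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts)  # e.g. {'the': 2, 'cloud': 2, 'stores': 1, 'data': 1, 'scales': 1}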
Private cloud
Public cloud
Hybrid cloud
Community cloud
It provides permissions to users so that they can control the access of other users entering the
cloud environment.
Amazon SimpleDB
Fine-grained multi-tenancy: In this mode, the resources are shared by many users, but the
functionality remains the same.
It provides an alternative approach so that you don't need to write a fully fledged program.
It creates applications and links the cloud services with other systems.
27) What are the advantages of cloud services?
Following are the main advantages of cloud services:
Cost saving: It helps in better utilization of investment in the corporate sector, so it saves cost.
Scalable and robust: It helps in developing scalable and robust applications. Previously,
scaling took months, but now it takes much less time.
Time saving: It helps in saving time in terms of deployment and maintenance.
1. Professional cloud
2. Personal cloud
3. Performance cloud
32) What are the most essential things that must be considered
before going for a cloud computing platform?
Compliance
Loss of data
Data storage
Business continuity
Uptime
Data integrity in cloud computing
33) Which services are provided by the Windows Azure operating
system?
There are three core services provided by the Windows Azure operating system:
Compute
Storage
Management
MongoDB
CouchDB
LucidDB
Google Bigtable
Amazon SimpleDB
Cloud-based SQL