Cloud Computing Security Techniques
Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. With these new computing and communications paradigms come new data security challenges. Existing data protection mechanisms such as encryption have failed to prevent data theft attacks, especially those perpetrated by an insider at the cloud provider. We propose a different approach for securing data in the cloud using offensive decoy technology. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a Cloud environment.
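As a rough illustration of this monitor-detect-deceive flow, the sketch below flags an abnormal access rate, falls back to a challenge check, and serves decoy records instead of real data. The threshold, the access-rate profile, and the decoy generator are hypothetical placeholders, not the detection model used in the experiments.

# Illustrative sketch only: flag abnormal access rates, confirm with a
# challenge question, and return decoys instead of the user's real data.
import time
import random

NORMAL_MAX_READS_PER_MIN = 30          # assumed threshold for "normal" behaviour
access_log = {}                        # user -> list of access timestamps

def record_access(user):
    access_log.setdefault(user, []).append(time.time())

def is_abnormal(user):
    now = time.time()
    recent = [t for t in access_log.get(user, []) if now - t < 60]
    return len(recent) > NORMAL_MAX_READS_PER_MIN

def make_decoys(n):
    # Plausible-looking but worthless records served to a suspected attacker.
    return [{"account": random.randint(10**9, 10**10 - 1),
             "balance": round(random.uniform(0, 9999), 2)} for _ in range(n)]

def read_data(user, real_reader, challenge_passed):
    record_access(user)
    if is_abnormal(user) and not challenge_passed:
        return make_decoys(1000)       # disinformation: flood with decoy records
    return real_reader()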
Cisco recently delivered the vision of fog computing to enable applications on billions of connected devices, already connected in the Internet of Things (IoT), to run directly at the network edge.
Customers can develop, manage, and run software applications on the Cisco IOx framework across networked devices, including hardened routers, switches, and IP video cameras. Cisco brings open source Linux and the Cisco IOS network operating system together in a single networked device (initially in routers).
The open application environment encourages more developers to bring their own applications and connectivity interfaces to the edge of the network. Setting Cisco's practices aside, we first answer the questions of what Fog computing is and how it differs from the Cloud. In Fog computing, services can be hosted at end devices such as set-top boxes or access points. The infrastructure of this new distributed computing model allows applications to run as close as possible to the massive volumes of sensed, actionable data coming from people, processes, and things. This Fog computing concept, essentially Cloud computing brought close to the 'ground', creates the automated responses that drive value. Both Cloud and Fog provide data, computation, storage, and application services to end users. However, Fog can be distinguished from the Cloud by its proximity to end users, its dense geographical distribution, and its support for mobility.
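One way to make the proximity argument concrete is a latency-driven placement rule: latency-sensitive or data-heavy tasks stay at a fog node near the user, while everything else goes to the distant cloud. The node names, round-trip times, and thresholds below are assumptions chosen only for illustration.

# Illustrative placement rule, assuming a nearby fog node and a remote cloud.
def place_task(latency_budget_ms, payload_mb):
    FOG_RTT_MS, CLOUD_RTT_MS = 5, 80   # assumed round-trip times to each tier
    if latency_budget_ms < CLOUD_RTT_MS or payload_mb > 100:
        return "fog node (set-top box / access point)"
    return "cloud datacenter"

print(place_task(latency_budget_ms=20, payload_mb=2))    # -> fog node
print(place_task(latency_budget_ms=500, payload_mb=1))   # -> cloud datacenter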
SCOPE OF THE PROJECT
The Cloud is growing, and by 2020 nearly 75% of businesses will be on the Cloud. At the same time, challenges do exist: cost, security, management, operations and automation, performance, and the biggest of them all, the "skill gap". The future of cloud security will rely on intelligent automation. Security is everyone's job and requires continuous monitoring to ensure that all best practices are followed and the business stays compliant in the cloud. For those on the AWS Cloud it becomes a little easier, as security comes first in the AWS Cloud. A number of security threats are associated with cloud data services: not only traditional security threats, such as network eavesdropping, illegal invasion, and denial-of-service attacks, but also specific cloud computing threats, such as side-channel attacks, virtualization vulnerabilities, and abuse of cloud services.
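As one small, hedged example of the continuous monitoring mentioned above, the sketch below uses the boto3 SDK to list AWS security groups that allow inbound traffic from anywhere (0.0.0.0/0); the credentials, region, and the idea of treating such groups as findings are assumptions for this illustration, not a complete compliance check.

# Hedged sketch of one continuous-monitoring check on AWS using boto3:
# report security groups that permit inbound traffic from any address.
import boto3

def world_open_security_groups(region="us-east-1"):
    ec2 = boto3.client("ec2", region_name=region)
    findings = []
    for sg in ec2.describe_security_groups()["SecurityGroups"]:
        for perm in sg.get("IpPermissions", []):
            for ip_range in perm.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    findings.append(sg["GroupId"])
    return sorted(set(findings))

if __name__ == "__main__":
    print("World-open security groups:", world_open_security_groups())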
Businesses, especially startups and small and medium businesses (SMBs), are increasingly opting to outsource data and computation to the Cloud. This obviously supports better operational efficiency, but it comes with greater risks, perhaps the most serious of which are data theft attacks. Data theft attacks are amplified if the attacker is a malicious insider; this is considered one of the top threats to cloud computing by the Cloud Security Alliance [1]. While most Cloud computing customers are well aware of this threat, they are left with nothing but trust in the service provider when it comes to protecting their data. The lack of transparency into, let alone control over, the Cloud provider's authentication, authorization, and audit controls only exacerbates this threat. The Twitter incident is one example of a data theft attack from the Cloud.
OBJECTIVES:
The major security objectives for cloud computing are the following:
a. Protect Postal Service data from unauthorized access, disclosure, modification, and
monitoring. This includes supporting identity management such that the Postal Service
has the capability to enforce identity and access control policies on authorized users
accessing cloud services. This also includes the ability of the Postal Service to make its data selectively available to other users.
b. Protect information resources from supply chain threats. This includes verifying and
maintaining the trustworthiness and reliability of the CP, as well as the security assur-
ances associated with the hardware and software used.
c. Prevent unauthorized access to cloud computing infrastructure resources. This includes
implementing security domains that have a logical separation between computing re-
sources (e.g., logical separation of Postal Service workloads running on the same phys-
ical server by virtual machine (VM) monitors [hypervisors] in a multitenant environ-
ment) and using default no-access configurations.
d. Design Web applications deployed in a cloud for an Internet threat model [such as that defined by the National Institute of Standards and Technology (NIST)] and embed security into the software development process.
e. Protect Internet browsers from attacks to mitigate end-user security vulnerabilities.
This includes taking measures to protect Internet-connected personal computing de-
vices by applying security software, personal firewalls, and patches on a regular mainte-
nance schedule.
f. Deploy access control and intrusion-detection technologies at the CP and conduct an independent assessment to verify that they are in place. This includes, but does not rely solely on, traditional perimeter security measures in combination with the domain security model. Traditional perimeter security includes: restricting physical access to networks and devices; protecting individual components from exploitation through security patch deployment; setting the most secure configurations as the default; disabling all unused ports and services; using role-based access control; monitoring audit trails; minimizing the use of privilege; using antivirus software; and encrypting communications. (A minimal default-deny, role-based access check is sketched after this list of objectives.)
g. Define trust boundaries between CPs and consumers to clearly establish and promul-
gate boundaries of responsibility for providing security.
h. Support portability such that the Postal Service can take action to change CPs when
needed to satisfy availability, confidentiality, and integrity requirements. This includes
the ability to close an account on a particular date and time and to copy data from one
CP to another.
i. Provide physical separation between Postal Service payment card industry (PCI) and
non-PCI applications. Postal Service PCI applications cannot share processing and
memory storage with non-PCI applications.
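The sketch below ties together the default no-access configuration of objective (c) and the role-based access control of objective (f). The roles, resources, and permissions are invented for the illustration; they are not Postal Service policy.

# Minimal default-deny, role-based access check; roles and resources are hypothetical.
ROLE_PERMISSIONS = {
    "auditor":  {("billing-records", "read")},
    "operator": {("mail-tracking", "read"), ("mail-tracking", "write")},
}

def is_allowed(role, resource, action):
    # Default to no access: anything not explicitly granted is denied.
    return (resource, action) in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("auditor", "billing-records", "read")
assert not is_allowed("auditor", "billing-records", "write")   # denied by default
assert not is_allowed("guest", "mail-tracking", "read")        # unknown role, denied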
KEYWORDS:
1. Cloud Computing
2. Security
3. Trusted Computing
4. Data Integrity
5. Confidentiality
6. Data Protection
Cloud computing began to gain both awareness and popularity in the early 2000s. When
the concept of cloud computing originally came to prominence most people did not fully un-
derstand what role it fulfilled or how it helped an organization. In some cases people still do
not fully understand the concept of cloud computing. Cloud computing can refer to business
intelligence (BI), complex event processing (CEP), service-oriented architecture (SOA), Soft-
ware as a Service (SaaS), Web-oriented architecture (WOA), and even Enterprise 2.0. With the
advent and growing acceptance of cloud-based applications like Gmail, Google Calendar,
Flickr, Google Docs, and Delicious, more and more individuals are now open to using a cloud
computing environment than ever before. As this need has continued to grow, so has the surrounding infrastructure needed to support it.
To meet those needs companies like Google, Microsoft, and Amazon have started
growing server farms in order to provide companies with the ability to store, process, and re-
trieve data while generating income for themselves. To this end, Google has brought on-
line more than a million servers in over 30 data centers across its global network. Microsoft is
also investing billions to grow its own cloud infrastructure. Microsoft is currently adding an
estimated 20,000 servers a month. With this amount of processing, storage, and computing power coming online, the concept of cloud computing is more of a reality than ever before. The growth of cloud computing has had the net effect of businesses migrating to a new way of managing their data infrastructure. This growth in cloud computing capability has been described as driving massive centralization of computing resources in order to take advantage of economies of scale in computing power, energy consumption, cooling, and administration.
CLOUD ARCHITECTURE:
Infrastructure-as-a-Service (IaaS):
Infrastructure-as-a-Service (IaaS) is offered in the bottom layer, where resources are
aggregated and managed physically (e.g., Emulab) or virtually (e.g., Amazon EC2), and ser-
vices are delivered in the form of storage (e.g., GoogleFS), network (e.g., Openflow), or compu-
tational capability (e.g., Hadoop MapReduce).
Platform-as-a-Service (PaaS):
The middle layer delivers Platform-as-a-Service (PaaS), in which services are provided
as an environment for programming (e.g., Django) or software execution (e.g., Google App
Engine).
Software-as-a-Service (SaaS):
Software-as-a-Service (SaaS) occupies the top layer, in which a cloud provider further confines client flexibility by offering only software applications as a service. Apart from the
service provisioning, the cloud provider maintains a suite of management tools and facilities
(e.g., service instance life-cycle management, metering and billing, dynamic configuration) in
order to manage a large cloud system.
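To illustrate one of the management facilities named above, the sketch below models service instance life-cycle management as a small state machine; the state names and allowed transitions are assumptions made for the example, not any particular provider's tooling.

# Hypothetical life-cycle states for a managed service instance.
VALID_TRANSITIONS = {
    "requested":    {"provisioning"},
    "provisioning": {"running", "failed"},
    "running":      {"suspended", "terminated"},
    "suspended":    {"running", "terminated"},
}

class ServiceInstance:
    def __init__(self, instance_id):
        self.instance_id = instance_id
        self.state = "requested"

    def transition(self, new_state):
        # Reject any transition not explicitly allowed above.
        if new_state not in VALID_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"{self.state} -> {new_state} is not allowed")
        self.state = new_state

instance = ServiceInstance("app-001")
instance.transition("provisioning")
instance.transition("running")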
Cloud deployment models include public, private, community, and hybrid clouds, as shown in Figure 3.2. Public clouds are external or publicly available cloud environments that are accessible to multiple tenants, whereas private clouds are typically tailored environments with dedicated virtualized resources for particular organizations. Similarly, community clouds are tailored for particular groups of customers.
ORGANIZATION OF CLOUD COMPUTING SECURITY:
Cloud services exhibit five essential characteristics that demonstrate their relation to,
and differences from, traditional computing approaches:
Broad network access - Capabilities are available over the network and accessed
through standard mechanisms that promote use by heterogeneous thin or thick client
platforms (e.g., mobile phones, laptops, and PDAs) as well as other traditional or cloud-based software services.
Resource pooling - The provider's computing resources are pooled to serve multiple
consumers using a multi-tenant model, with different physical and virtual resources dy-
namically assigned and reassigned according to consumer demand. There is a degree
of location independence in that the customer generally has no control or knowledge
over the exact location of the provided resources, but may be able to specify location at
a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources
include storage, processing, memory, network bandwidth, and virtual machines. Even
private clouds tend to pool resources between different parts of the same organization.
Rapid elasticity - Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out, and rapidly released to quickly scale in. To the con-
sumer, the capabilities available for provisioning often appear to be unlimited and can
be purchased in any quantity at any time.
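A toy scale-out / scale-in rule can make this characteristic concrete; the target utilisation, bounds, and proportional formula below are assumptions for illustration rather than any provider's actual autoscaling algorithm.

# Illustrative elasticity rule: size the fleet in proportion to observed load.
import math

def desired_instances(current, avg_cpu_percent, target=60, min_n=1, max_n=20):
    wanted = math.ceil(current * avg_cpu_percent / target)
    return max(min_n, min(max_n, wanted))

print(desired_instances(current=4, avg_cpu_percent=90))   # high load -> scale out
print(desired_instances(current=4, avg_cpu_percent=20))   # low load  -> scale in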
Measured service - Cloud systems automatically control and optimize resource usage
by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, or active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the service.
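A minimal sketch of such a metering capability is shown below: per-tenant counters that can be rolled up into a usage report. The metric names and billing rates are made up for the example.

# Per-tenant usage metering with a simple billing roll-up; rates are illustrative.
from collections import defaultdict

usage = defaultdict(lambda: defaultdict(float))   # tenant -> metric -> amount

def meter(tenant, metric, amount):
    usage[tenant][metric] += amount

def report(tenant, rates):
    # Convert raw usage into charges using the supplied per-unit rates.
    return {metric: round(value * rates.get(metric, 0.0), 4)
            for metric, value in usage[tenant].items()}

meter("tenant-a", "storage_gb_hours", 120.0)
meter("tenant-a", "egress_gb", 3.5)
print(report("tenant-a", {"storage_gb_hours": 0.0001, "egress_gb": 0.09}))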