Research Finalpaperv3
Romualdo B. Antonio
Not all information is stored in our memory. Most information is translated into digital form that we know as data. For most people, data are files, documents, images, pictures, or any other format that contains important information and ideas. This information must be available whenever we need it and must be provided in a timely manner. We save our files on our computers and on other media, trusting that they will always be there to retrieve.
Our files, however, are not always there for us. Catastrophic events and malicious users can disrupt an organization's day-to-day operations and damage our files. Even government agencies are not exempt from natural and man-made disasters. Government files may be lost if not properly protected, so government agencies should understand the importance of backing up their important data. No matter how efficient a computer system appears to be, there is always the possibility of a malfunction wiping out valuable data. There are both onsite and offsite backup options.
The Philippine Government launched its national cloud service, the GovCloud. The GovCloud can provide cloud solutions to all government agencies, including the Department of Education. The creation of a private in-country data center will ensure data security and make online information available. The Philippine government adopted a "cloud first" approach, considering cloud computing as part of its primary infrastructure.
Government agencies should understand that the Cloud as data storage is not the only solution for protecting information. Although Cloud services have advantages, supplementing them with a good backup strategy on local premises gives agencies a stronger overall backup posture.
The study may provide information on employees' awareness of cloud computing as good data storage, given its technological advantages and its availability at any time through its distributed storage system. However, without proper data handling and classification, the benefits offered by Cloud computing may not be maximized. In this case, the research can provide practical suggestions to assist the IT group of each government department aiming to formulate a strategic approach and understand the requirements for effective data backup and recovery. Lastly, it can support more informed decision making.
Colegio de San Juan de Letran Calamba
The research can contribute to DepEd in its aim to achieve a data backup and recovery procedure by utilizing Cloud services with proper data handling and classification, supplemented with a local onsite backup strategy for a more holistic data backup approach.
A government agency like the Department of Education (DepEd) Sto. Rosa branch saves its wide range of information and data files on the local network and in Cloud storage.
General Objective:
The main objective of this study is to analyze and develop a data backup and recovery strategy for the DepEd Sto. Rosa branch.
Research Objectives
i. To determine the current types of data and existing policies implemented in the DepEd office
of Sto. Rosa.
ii. To assess the data classification and current data backup practices of DepEd Sto. Rosa.
iii. To recognize data risks and problems encountered by DepEd Sto. Rosa.
iv. To develop a cloud-based backup and recovery strategy for DepEd Sto. Rosa.
Research Questions
i. What is the current state of data and existing policies implemented in the DepEd office of
Sto. Rosa?
ii. What are the data classification and current Data Backup practices of DepEd Sto. Rosa?
iii. What are the data risks and problems encountered by DepEd Sto. Rosa?
Significance of the Study
• To IT Professionals:
IT professionals specializing in data protection may find the results useful for understanding DepEd's current data backup strategy. This can give them insights on how to formulate their own strategies, applicable not only to government agencies but also to private organizations planning to avail of cloud services for their data storage.
• To Future Researchers:
The study will serve as a guide in conducting similar studies. It can give them the necessary information to refer to in their own research.
• To Government Employees:
They can benefit from the study by identifying a data backup approach to store and protect their data beyond their current Cloud storage service.
The study focuses on the data backup and recovery strategy of DepEd Sto. Rosa, including the Cloud service of any private provider. The data protection strategy of DepEd Sto. Rosa involves performing its data backup and recovery strategy as secondary to its Cloud storage, along with its security awareness in organizing data into categories of particular importance.
Data protection involves different techniques and technologies to provide the availability and security of data. The data protection discussed in this study is limited to local and Cloud data backup and recovery, and DepEd's data classification practices.
This study does not cover other security technologies and software, DepEd's data disposal practices, implementation of authentication, role-based access, administrative controls, technology controls such as Data Loss Prevention (DLP) devices and encryption gateways, or risk mitigation techniques outside the research objectives.
Definition of Terms
• Platform as a Service (PaaS): This model provides a platform as a service, allowing clients to develop their own applications using the provided tools and programming languages.
• Infrastructure as a Service (IaaS): This model provides shared resource services. It provides computing infrastructure such as storage, virtual machines, network connections, bandwidth, and IP addresses. IaaS is a complete package for computing.
• Storage-as-a-Service Used for Data Protection (STaaS/dp): a cloud service offering cloud storage for data protection.
• Virtual storage: concentrating multiple storage modules (such as disks and disk arrays) into a storage pool by certain means and managing them in a unified way.
• Hot standby: refers to a standby configuration based on two servers in a high-availability system.
• Data: all information generated or owned by the DepEd (including, but not limited to, information generated or developed by the DepEd's employees, contractors, and volunteers while performing their duties and responsibilities to the DepEd, unless the DepEd has waived its ownership rights to the Data) and information not generated or owned by the DepEd but which the DepEd has the duty to manage. This information can exist in any form including, but not limited to, print, electronic, and digital.
• Data backup strategy: Data backup strategy refers to the determination of steps and actions to achieve data recovery and reconstruction objectives. It can be divided into specific data backup and operating system backup.
• Data disposal. As discussed in this paper, the policies, timeframes, and methods for
secure disposal of data.
• Data Owner - the designated person at the DepEd assigned as the owner and decision maker on the respective set of Data. The Data Owner sets the appropriate data classification and determines the impact of the Data to the organization.
• Data retention - As discussed in this paper, the policies, timeframes, and methods for
storing, archiving, and retrieving data.
• Data retention policy- should reflect the data classification model and data retention
rules that apply to the data that is being retained.
• Data recovery - As discussed in this paper, the long-term storage of data and its retrieval when it needs to be returned to service.
• Private Cloud – cloud infrastructure provisioned for the exclusive use of a single organization, in contrast to a public cloud, which is available to the general public over the internet
• RTO (Recovery Time Objective) - refers to the maximum tolerable time the user is willing to wait to recover data after a data loss.
• RPO (Recovery Point Objective) - refers to the maximum tolerable amount of data the user is willing to lose, typically measured as the time between the last backup and the loss event.
• Drive Encryption - also referred to as full disk encryption or whole disk encryption. Drive encryption solutions encrypt the entire hard drive, including the operating system, applications, drivers, and user data.
II. REVIEW OF RELATED LITERATURE AND STUDIES
International Studies
This chapter serves as the foundation for the development of the study. It discusses the relevant literature relating to the classification of data, Cloud computing services, and data backup strategies, covering both operational backup and recovery of data.
Cloud Services
According to Praveen S. Challagidad (2017), cloud computing provides access to any kind of service dynamically over the Internet on an on-demand basis. One of the most significant services being provided is storage as a service. Cloud customers can store any amount of data in cloud storage, resulting in huge amounts of data at the datacenter.
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources. It deals with data storage, applications, and infrastructure using service-oriented technology.
Software as a Service (SaaS): SaaS is a collection of applications and software; it allows clients to subscribe to the software instead of purchasing it. A software application is presented as a service to the customer based on their demand.
Platform as a Service (PaaS): This model provides a platform as a service, allowing clients to develop their own applications using the provided tools and programming languages.
Infrastructure as a Service (IaaS): This model provides shared resource services. It provides computing infrastructure such as storage, virtual machines, network connections, bandwidth, and IP addresses. IaaS is a complete package for computing.
Microsoft (2017). Cloud providers must have operational practices in place to prevent
unauthorized access to customer data; it’s also important to note that any compliance
requirements a customer organization has must also be supported by the provider. Although
cloud providers can help manage risks, customers need to ensure that data classification
management and enforcement is properly implemented to provide the appropriate level of
data management services.
Although customers are responsible for classifying their data, cloud providers should make
written commitments to customers about how they will secure and maintain the privacy of the
customer data stored within their cloud. These commitments should include information about
• privacy and security practices,
• data use limitations,
• and regulatory compliance
Praveen S. Challagidad (2017). Data storage is one of the most significant services provided by cloud computing technology. However, recovering lost data is one of the challenging issues in cloud computing. The original data can be recovered using data recovery techniques, but existing recovery techniques are not efficient and reliable; hence, a technique is needed that meets both efficiency and reliability to recover the lost original data.
Here are a few data backup and recovery techniques in cloud computing:
• Cloud mirroring technique. It uses a mirroring algorithm. The method provides high availability, integrity of the data, and recovery of the data, and minimizes data loss. This method can be applied to any kind of cloud. The cost to recover the data is also low.
• Data backup and recovery technique. This technique provides data protection from service failure and also decreases the cost of the solution. By using this technique, the process of migration becomes simple, and cloud vendor dependency is removed. The authors proposed an effective data backup technique to recover data from the server in case of data loss. For every business, it is essential to back up data to avoid data loss.
• According to the study of Somesh P. Badhel, a Seed Block Algorithm (SBA) architecture with a remote backup server was proposed. The remote backup server is a replica of the original cloud server physically situated at a remote location. The method is based on the concept of the Exclusive-OR (XOR) operation of digital computing. The SBA uses a random number and a unique client ID associated with each client.
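Because XOR is its own inverse, the seed-block idea can be sketched in a few lines. The sketch below is illustrative only: the function names, the 16-byte seed, and the keystream repetition are simplifying assumptions, not details from the cited study.

```python
import os

def make_seed_block(random_number: bytes, client_id: bytes) -> bytes:
    """Seed block = XOR of a random number and the client's unique ID."""
    return bytes(a ^ b for a, b in zip(random_number, client_id))

def backup_file(data: bytes, seed_block: bytes) -> bytes:
    """Store data XOR seed block on the remote backup server."""
    keystream = (seed_block * (len(data) // len(seed_block) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, keystream))

def recover_file(backup: bytes, seed_block: bytes) -> bytes:
    """XOR is its own inverse, so applying the seed block again recovers the data."""
    return backup_file(backup, seed_block)

seed = make_seed_block(os.urandom(16), b"client-0001-dpd\x00")
original = b"DepEd enrollment records 2017"
stored = backup_file(original, seed)      # replica held at the remote site
assert recover_file(stored, seed) == original
```

Note that the XOR here provides recoverability of lost data, not confidentiality; encryption is a separate concern discussed later in this chapter.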
1. File-level disaster recovery is backing up data to a disaster recovery system by means such as backup software. It needs to restore data and operating systems in order to complete the restoration of the system when a disaster occurs.
o RPO is at the level of minutes or higher, RTO is at the level of hours or higher, and data lost through misuse cannot be restored by file-level disaster recovery.
File-level disaster recovery strategies include:
• Remote mount. Remote mount directly attaches a disk of the disaster recovery center to the production host at the system level, using software shipped with the operating system.
• Virtual tape library. A virtual tape library uses hard disks to imitate tape library functions once hard disk storage cost drops to a certain extent.
• Backup software. Data is backed up directly to the disaster recovery center using third-party software.
• Cloud storage. Cloud storage is a system that collects a large number of different types of storage devices in the network, which work together through application software to provide data storage and service access functions in common. Cloud storage can serve very well as a disaster recovery strategy for a cloud disaster center when combined with other strategies.
According to Rongzhi Wang (2017), there are some problems when storing data in cloud services. In cloud storage, users store their data on the cloud servers and storage devices, and these devices are under the management domain of the service provider. Users then have no control in the event of equipment failure or misconfiguration, which may lead to loss, damage, or leakage of users' sensitive data.
Throughout the entire storage life cycle of the data, the user cannot monitor the behavior of the cloud service provider. This raises three questions:
1. How to ensure strong confidentiality of data?
2. How to ensure that when an accident causes the loss of data, the user can still complete data recovery?
3. How to protect the data when encountering malicious tampering?
Users are concerned that the confidentiality of their data cannot be assured. To achieve the required security, the data needs to be encrypted locally before it is stored in the cloud; that is, local encryption is needed before uploading to cloud storage.
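The encrypt-locally-then-upload idea can be sketched as follows. This is a toy illustration only: a real deployment would use a vetted cipher such as AES-GCM from a cryptography library, not this SHA-256-based keystream, and the key and record contents are invented examples.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key with SHA-256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_locally(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt before upload so the provider only ever sees ciphertext."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    return encrypt_locally(ciphertext, key)  # an XOR stream cipher is symmetric

key = b"locally-held secret key"           # never leaves the local premises
record = b"Learner records, SY 2016-2017"
uploaded = encrypt_locally(record, key)    # this is what the cloud stores
assert decrypt_after_download(uploaded, key) == record
```

The design point is that the key stays on local premises, so equipment failure or misconfiguration on the provider's side can leak only ciphertext.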
In a distributed storage system, data redundancy technology is the most basic method to ensure system reliability and improve data availability and persistence. By storing multiple instances of the same data (per file or per block) on different storage nodes, data availability is strengthened, ensuring that even if some nodes become unavailable, the remaining storage nodes can still recover the original data intact.
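A minimal sketch of this replication-based redundancy is shown below. The node names and the replication factor of 3 are illustrative assumptions, not details from the cited literature.

```python
REPLICATION_FACTOR = 3

def store(nodes: dict, filename: str, data: bytes) -> None:
    """Write the same file to the first REPLICATION_FACTOR available nodes."""
    for node in list(nodes)[:REPLICATION_FACTOR]:
        nodes[node][filename] = data

def read(nodes: dict, filename: str) -> bytes:
    """Return the file from any surviving node that still holds a copy."""
    for contents in nodes.values():
        if filename in contents:
            return contents[filename]
    raise FileNotFoundError(filename)

cluster = {"node-a": {}, "node-b": {}, "node-c": {}}
store(cluster, "form137.pdf", b"scanned permanent record")
del cluster["node-a"]            # simulate a node failure
assert read(cluster, "form137.pdf") == b"scanned permanent record"
```

Real systems such as distributed file stores add placement policies and consistency checks, but the principle is the same: copies on independent nodes survive individual node loss.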
According to Praveen S. Challagidad (2017), the data recovery process presents some issues during recovery. These issues are discussed below.
1. Data storage: Enterprises store large amounts of data in the cloud. To provide security to the data, computing is distributed but storage is centralized. Therefore, single points of failure and data loss are critical challenges when storing data in the cloud.
2. Data security: Users store huge amounts of data in the cloud. The stored data may be confidential and sensitive, so providing security for these data is important.
3. Lack of redundancy: If the cloud gets destroyed for any reason, a secondary site is activated to provide the data to the user when the primary storage fails to do so.
4. Dependency: The customer does not have control over their system and data. Backup services are provided to overcome this drawback.
A Survey of IT Professionals
Druva is a leader in cloud data protection and information management, leveraging the public cloud to offer a single pane of glass to protect, preserve, and discover information – dramatically increasing the availability and visibility of business-critical information while reducing the risk, cost, and complexity of managing and protecting it.
Research sponsored by Druva (2015) surveyed 214 IT professionals with responsibility for corporate data. The goal of the survey was to understand attitudes, approaches, and challenges in ensuring the privacy of corporate data. Their key findings were:
Data privacy is important, but don't depend on employees as the solution to address it
• 99% have sensitive data
• 84% report data privacy is increasing in importance in 2015
• 82% have employees who don’t follow data privacy policies
International requirements are making data privacy even more challenging to manage
• 93% face challenges ensuring data privacy
• 91% have data privacy controls, but those controls are incomplete
• 77% find it challenging to keep up with regional requirements for data privacy
Privacy isn’t viewed as a separate priority, and most resources are on external threats
• Only 20% separate data privacy and data security
• 72% put more effort into coping with threats from external sources than internal
sources
Enterprise Strategy Group (ESG) also surveyed organizations on the common uses of cloud infrastructure services in their organizations. Enterprise Strategy Group is an integrated IT research, analyst, strategy, and validation firm renowned for providing actionable insight and intelligence to the global IT community.
According to the ESG survey by Jason Buffington (2016), backup and disaster recovery are two of the most frequent uses for cloud storage today, serving as a secure offsite data repository, as shown in Figure 1. Other respondents said their organizations are considering cloud services to support remote data survivability, save money, and enhance their recovery ability.
According to ESG, over the five-year span from 2012 to 2017, tape use for primary backup appears to decrease from 56% to 45% while cloud use increased. Essentially, these findings indicate that although cloud usage for data protection is increasing significantly, tape usage is not declining at a corresponding rate. Organizations that saved their data in traditional tape systems intend to retain their data for six years or more (often much more). However, most of those organizations also say they would store data in the cloud for three years or less.
Many organizations should consider the cloud as an addition to any existing disk-plus-tape backup strategy, not as a substitute for tape backup in long-term retention. Enterprise Strategy Group (ESG) provided two solution types that are widespread in the industry today:
Amazon Web Services (AWS) is a subsidiary of Amazon.com that provides on-demand cloud computing platforms to individuals, companies, and governments on a paid subscription basis. In terms of controlling the overall IT environment, AWS and its customers share responsibility for managing the IT environment: AWS is responsible for providing services on a highly secure and controlled platform, while customers are responsible for configuring their IT environments in a secure and controlled manner for their purposes. While customers don't communicate their use and configurations to AWS, AWS does communicate its security and control environment as relevant to customers.
The customer shoulders responsibility for and management of the guest operating system, including system updates and security patches. This responsibility also covers other associated application software as well as the configuration of the AWS-provided security group firewall. Customers can enhance their security to meet their compliance requirements by using technologies such as host-based firewalls, host-based intrusion detection/prevention, encryption, and key management.
AWS customers are required to continue to maintain adequate governance over the entire IT control environment. Leading practices include an understanding of required compliance objectives and requirements (from relevant sources), establishment of a control environment that meets those objectives and requirements, an understanding of the validation required based on the organization's risk tolerance, and verification of the operating effectiveness of their control environment.
Evaluating and Integrating AWS Controls.
AWS provides a wide range of information regarding its IT control environment to customers
through white papers, reports, certifications, and other third-party attestations. This
documentation assists customers in understanding the controls in place relevant to the AWS
services they use and how those controls have been validated.
Control Environment
AWS manages a comprehensive control environment that includes policies, processes and
control activities that leverage various aspects of Amazon’s overall control environment. This
control environment is in place for the secure delivery of AWS’ service offerings. The collective
control environment encompasses the people, processes, and technology necessary to
establish and maintain an environment that supports the operating effectiveness of AWS’
control framework.
Data Classification
Data classification is one of the most fundamental ingredients of an effective and efficient information security strategy. It directly impacts decisions, procedures, and practices on what kind of data is collected and how it is stored, used, protected, shared, or disclosed. In very simple terms, it is an exercise carried out to understand the nature, type, criticality, sensitivity, and other attributes of the data in order to distinguish between good and bad data, and important vs. non-important data.
WHERE DO YOU START? BEFORE YOU CAN DEFEND, YOU MUST DISCOVER
Before you know what protection your data requires, you need to know what you've got, where it's stored, why you have it, and who has access to it. Once you've got to grips with that, you can identify what is of true value to the organization – what's business-critical and what's sensitive – and then how best to treat it. This valuable data might include intellectual property such as product designs and formulas, strategic plans, personal details, contracts and agreements, regulated documents, and plans for investment. Think about what the impact would be if a piece of information was leaked or lost. If it was made public:
• Would it harm the business, or your customers, partners or suppliers?
• Would it put an individual’s security or privacy at risk?
• Would you lose advantage if a competitor got hold of it?
• Is it subject to any privacy or data laws, or compliance regulations?
• Would its loss breach a contract or agreement?
• Would it incur a cost?
• Would it damage the brand?
• Would you lose your job?
Data classification involves the user attaching an appropriate identifier or label to a message,
document or file, to give the data a value and let other users know how it should be handled or
shared. By classifying data according to its value to the business, organizations can develop
more effective data-centric security approaches that safeguard against accidents and reduce
risk. Using classification tools to implement the approach allows data security controls, rules
and policies to be more consistently enforced. These tools apply clear, consistent electronic
markings to any type of file and message – for instance ‘commercial in confidence’, ‘internal
only’, ‘public’ – and then allow it to be saved or sent only in accordance with the rules that
correspond to that marking.
Most importantly, as well as attaching the label in a visual form, classification tools apply it as
metadata, embedding a tag into document or file properties that stays with it wherever it goes.
This helps shield the business against accidental data loss – for example, a diligent employee
emailing a sensitive document to their home PC to work on at the weekend or someone saving
a confidential file in a public folder with the slip of a key. Attaching the label in two different
forms means the value of the data is clearly displayed to the user, while the metadata can be
used to direct other security and data management solutions downstream.
The most effective approach involves the user in the process. The employee themselves places
the identifier on the information at the point of sending or saving it, deciding which
classification to apply within a context – something a computer just can’t do with any real
accuracy.
For user-driven data classification to work, you need to set a clear policy that enables users to
make fast and intuitive decisions about how each document, message or file should be marked.
Use clear labels and terminology that will be instantly recognizable and meaningful in the business context, and keep the number of different identifiers to a minimum.
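The dual mechanism described above – a visual marking for the user plus persistent metadata for downstream tools – can be sketched as follows. The label names, rule table, and usernames are hypothetical examples, not DepEd or vendor policy.

```python
LABELS = {"public", "internal only", "commercial in confidence"}

# Rules keyed by label: may the file leave the organization?
MAY_SEND_EXTERNALLY = {
    "public": True,
    "internal only": False,
    "commercial in confidence": False,
}

def classify(document: dict, label: str, user: str) -> dict:
    """The user applies the label at the point of saving; it is stored twice:
    as a visual marking and as persistent metadata that travels with the file."""
    if label not in LABELS:
        raise ValueError(f"unknown label: {label}")
    document["visual_marking"] = label.upper()
    document["metadata"] = {"classification": label, "classified_by": user}
    return document

def allowed_to_email_externally(document: dict) -> bool:
    """Downstream tools read the metadata tag, not the visual marking."""
    label = document.get("metadata", {}).get("classification")
    return MAY_SEND_EXTERNALLY.get(label, False)  # unlabeled files: deny by default

doc = classify({"name": "budget-2017.xlsx"}, "internal only", "jdelacruz")
assert not allowed_to_email_externally(doc)
```

Keeping the rule table small mirrors the advice above: a minimal set of identifiers makes the user's classification decision fast and intuitive.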
Information Classification - Who, Why and How
(SANS Institute, Susan Fowler - February 2003)
Companies need to protect their information today more than ever. The increasing need for companies to protect their customer and financial information is obvious; signs are prevalent in the news, publications, and the turn of recent business and world events. The most compelling reason to classify information is to satisfy regulatory mandates. Although information classification is not specified as a required protection measure, it is implied by special handling requirements for sensitive, medical, and financial information.
Approach for classifying information. There are many ways to implement an information classification system. The key is to facilitate employee compliance with company-endorsed information protection measures. To successfully implement information classification, a company must transition from recognizing that it should classify its data to recognizing that it can.
Step 1. Identify all information sources that need to be protected. If information sources haven't been compiled for other initiatives, the best sources might be developers, operating system and database administrators, business champions, and departmental and senior managers.
Step 3. Identify information classes. Information class labels should convey the protection goals
being addressed.
Step 4. Map information protection measures to information classes.
Step 5. Classify information
Step 6. Repeat as needed
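Step 4 above – mapping protection measures to information classes – can be sketched as a simple lookup. The class names reuse the labels defined later in this chapter, but the specific controls listed are illustrative assumptions only.

```python
PROTECTION_MEASURES = {
    "confidential (restricted)":     ["encryption at rest", "access logging", "DLP scan"],
    "internal use only (sensitive)": ["access control", "routine backup"],
    "public (unrestricted)":         ["routine backup"],
}

def controls_for(information_class: str) -> list:
    """Look up the protection measures mapped to an information class."""
    try:
        return PROTECTION_MEASURES[information_class]
    except KeyError:
        # Unmapped data defaults to the most restrictive handling.
        return PROTECTION_MEASURES["confidential (restricted)"]

assert "encryption at rest" in controls_for("confidential (restricted)")
assert controls_for("unknown class") == controls_for("confidential (restricted)")
```

Defaulting unmapped classes to the most restrictive handling is one reasonable design choice; it fails safe while Step 6 ("repeat as needed") gradually fills in the map.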
1. Data Security
While technology has the potential to help enforce data security policies, without a pervasive culture of data security users fail to use the technology properly. Whether due to a lack of training or the complexity of the security tools, studies continue to show that employees frequently violate information security protocols.
To secure data, senior executives must set the foundation for a culture of information
protection, which includes executive support and involvement, user training and guidance, easy
to use technology, and data classification.
Classification is foundational to securing your information as it allows users to quickly and easily
indicate the value of the data to the organization. The classification is applied as visual markings
(to alert end users), and persistent metadata (to inform security technology systems).
2. The Data Security Imperative
The propagation of data sharing tools such as email, social media, mobile device access, and cloud storage media is making it harder for IT and data security departments to keep sensitive information from moving outside the network perimeter. The reality is that the data security perimeter is forever changed as data is accessed and stored in multiple locations.
It is important to note that the insider threat is not just a malicious user or disgruntled
employee, but could also be trustworthy employees who are just trying to work more
efficiently. When workers are unfamiliar with correct policy procedures and there are no
systems in place to train, inform, and remind them, they engage in risky information handling. If
your users don’t understand the value of the data they are using they are likely to see the
technology as an impediment to their workflow, and actively seek methods to circumvent
security.
When the CEO communicates to her employees the importance of security for their jobs as well as for the organization, employees are much more likely to comply. Once users are on side in principle, it is important to follow up with tools that are easy to use and provide immediate feedback with corrective suggestions when there is a violation.
Classification can also aid where compliance legislation regulates the protection and retention
of company records. By providing structure to otherwise unstructured information,
classification empowers organizations to control the distribution of their confidential
information in accordance with mandated regulations.
Data classification has been used for decades to help large organizations such as Microsoft, governments, and military entities manage the integrity of their data. Risk assessments are sometimes used by organizations as a starting point for data classification efforts.
Customers can verify the effectiveness of their cloud provider's practices. Having this information helps customers understand whether the cloud provider supports the data protection requirements mandated by their data classification. However, to achieve compliance, such organizations need to remain aware of their classification obligations and must manage the classification of data that they store in the cloud.
For internal use only (sensitive). Information that is classified as being of medium sensitivity
includes files and data that would not have a severe impact on an individual and/or
organization if lost or destroyed.
• Email, most of which can be deleted or distributed without causing a crisis (excluding
mailboxes or email from individuals who are identified in the confidential classification).
• Documents and files that do not include confidential data.
Public (unrestricted). Information that is classified as public includes data and files that are not
critical to business needs or operations.
• The data asset owner is the original creator of the data, who can delegate ownership
and assign a custodian. When a file is created, the owner should be able to assign a
classification, which means that they have a responsibility to understand what needs to
be classified as confidential based on their organization’s policies. All of a data asset
owner’s data can be auto-classified as for internal use only (sensitive) unless they are
responsible for owning or creating confidential (restricted) data types. Frequently, the
owner’s role will change after the data is classified. For example, the owner might
create a database of classified information and relinquish their rights to the data
custodian.
After data is classified, finding and implementing ways to protect confidential data becomes an
integral part of any data protection deployment strategy. Protecting confidential data requires
additional attention to how data is stored and transmitted in conventional architectures as well
as in the cloud.
The data asset custodian is assigned by the asset owner (or their delegate) to manage the asset
according to agreements with the asset owner or in accordance with applicable policy
requirements. Ideally, the custodian role can be implemented in an automated system. An
asset custodian ensures that necessary access controls are provided and is responsible for
managing and protecting assets delegated to their care. The responsibilities of the asset
custodian could include:
• Protecting the asset in accordance with the asset owner’s direction or in agreement
with the asset owner
• Ensuring that classification policies are complied with
• Informing asset owners of any changes to agreed-upon controls and/or protection
procedures prior to those changes taking effect
• Reporting to the asset owner about changes to or removal of the asset custodian’s
responsibilities
Encryption gateways can provide a means to manage and secure data that has been classified
as confidential by encrypting data in transit as well as data at rest.
• This approach should not be confused with that of a virtual private network (VPN);
encryption gateways are designed to provide a transparent layer to cloud-based
solutions.
• Encryption gateways are placed into the data flow between user devices and application
data centers to provide encryption/decryption services.
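The role an encryption gateway plays in the data flow can be illustrated with a toy symmetric transform. This is a deliberately simplified sketch: the hash-based keystream below is for illustration only, and a real gateway would use a vetted cipher such as AES-GCM, typically layered over TLS.

```python
import hashlib

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Expand key + nonce into n pseudo-random bytes by hashing counter
    # blocks. Illustrative only -- not a vetted cipher construction.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def gateway_transform(data: bytes, key: bytes, nonce: bytes) -> bytes:
    # XOR with the keystream; applying the same transform twice decrypts,
    # which is what lets the gateway sit transparently in the data flow.
    ks = _keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

ciphertext = gateway_transform(b"student records", b"gw-key", b"nonce-01")
roundtrip = gateway_transform(ciphertext, b"gw-key", b"nonce-01")
```

Because encryption and decryption are the same operation here, the gateway can encrypt on the way to the cloud data center and decrypt on the way back without the application being aware of it.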
Data loss prevention (DLP) technologies can help ensure that solutions such as email services
do not transmit data that has been classified as confidential. Organizations can take advantage
of DLP features in existing products to help prevent data loss.
DLP technologies can perform deep content analysis through keyword matches, dictionary
matches, regular expression evaluation, and other content examination to detect content that
violates organizational DLP policies. For example, DLP can help prevent the loss of the following
types of data:
• Social Security and national identification numbers
• Banking information
• Credit card numbers
• IP addresses
Some DLP technologies also provide the ability to override the DLP configuration (for example,
if an organization needs to transmit Social Security number information to a payroll processor).
In addition, DLP can be configured to notify users when they attempt to send sensitive
information that should not be transmitted.
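The pattern-based detection described above can be sketched with a few regular expressions; the patterns and policy names here are simplified illustrations, and real DLP products add dictionaries, checksum validation (such as the Luhn check for card numbers), and contextual analysis.

```python
import re

# Simplified DLP content patterns (illustrative only).
DLP_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def scan_outbound(text):
    """Return the names of the DLP policies that the text violates."""
    return [name for name, pattern in DLP_PATTERNS.items()
            if pattern.search(text)]

hits = scan_outbound("Card 4111 1111 1111 1111, server at 10.0.0.5")
```

A mail gateway would call something like `scan_outbound` on each outgoing message and block, quarantine, or warn depending on which policies are hit.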
Best Practices for Leveraging the Public Cloud for Backup and Disaster Recovery
To leverage the public cloud to supplement current backup and disaster recovery solutions, the
following general backup best practices are suggested.
1. Think of local protection as the first line of defense. When it comes to performing
backup and recovery, the best performance—99 percent of the time—will be delivered
by using resources local (on premise) to the systems and data being protected.
2. Identify the systems, and the dependencies of those systems, that are critical to the
business. If local protection is the first line of defense, public cloud-based
protection would be the second line of defense.
3. Don’t think only of traditional backup for disasters. Consider the ability to use
replication technologies to provide continuous data protection locally and in the cloud.
But remember that replication, although a great complement, is never a replacement
for backups.
4. Think about how you want to restore data and back up to meet that goal. Backing up
the system and all the storage will protect everything on that OS instance, which is
ideal when restoring the entire environment in bare-metal recovery scenarios.
5. Backup at the hypervisor level may not always be enough for the best restoration
experience—consider running backup agents within the VM OS instead of just on the
virtualization host.
6. Long-term backup storage in the cloud. Data is stored for many reasons, the most
important of which is long-term archiving for corporate needs and to meet regulatory
requirements.
7. Ensuring security of the public cloud data. Verify the security used in the cloud
solution: the physical security of the public cloud locations, encryption of data at rest
on the storage, logical separation of your organization’s data from other organizations
using the same public cloud backup provider, and the encryption used to protect the
data during transmission.
8. Running the recovery directly in the cloud. Look at options to run your systems in
virtual environments in public cloud virtual machine hosting solutions using the systems
and data backed up in the public cloud. This approach allows your operations to be up
and running again even without your own datacenter.
9. Unified backup and management. Consider leveraging a single solution that supports a
hybrid model and enables a single management approach.
10. Test the processes periodically and any time a significant change occurs in the
infrastructure. Perform regular tests to ensure that restore processes work and that the
protected data is valid.
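Practice 1 (local protection as the first line of defense) and practice 10 (regular testing) can be illustrated with a minimal local backup routine; the function and its retention scheme are a hypothetical sketch, not a substitute for a full backup product.

```python
import tarfile
import time
from pathlib import Path

def run_backup(source: Path, dest: Path, keep: int = 7) -> Path:
    """Compress `source` into a timestamped archive under `dest` and
    prune old copies -- a minimal local first line of defense."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{source.name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source, arcname=source.name)
    # Simple retention policy: keep only the newest `keep` archives.
    for old in sorted(dest.glob(f"{source.name}-*.tar.gz"))[:-keep]:
        old.unlink()
    return archive
```

A second copy of each archive would then be replicated offsite (for example, to cloud object storage) as the second line of defense, and restores from both copies should be rehearsed on a schedule.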
Traditional Data Protection Solutions deploy resource-intensive backup agents in the physical
server, which copy and move data from production storage to a back-end disk or tape
environment. This worked well for physical environments with limited storage capacity but is
not sufficient for a virtualized environment with high utilization rates. As such, a modern data
protection strategy is necessary.
It is vital for a school to have its own backup system and to back up its files regularly. In the
event of theft, fire, virus infection, system failure, or file corruption, their data can be
restored. According to the ICT services of Coventry City, there are some recommendations for
backup software for their schools. The software must be automated, with online backup and
data recovery, and simplified management for school backup. The backup software for schools
automatically protects data residing on servers; according to retention policies and schedules
set by the user, the data is compressed and encrypted. When data recovery is required, users
simply select the data to restore using an intuitive interface. They recommend Redstor Backup
software for Schools, which currently protects data for over 60 schools within Coventry.
Coventry is a city and metropolitan borough in the West Midlands, England.
A Guide to Choosing the Right Data Backup Solution for your School
(Microsoft Authorized Education Reseller, Our ICT - March 2015)
Our ICT is the dedicated educational division of Our IT Department Ltd., a market-leading IT
services provider with offices located in Central London and the East London/Essex border.
According to Our ICT (2015), school primary data storage refers to data that is stored on their
computers, while secondary data storage is the process of backing up data and storing critical
data in a secondary location such as cloud backup and storage services. Using a cloud service is
a means of securing data while working within a school's limited budget. Some other schools
may still deploy their data storage on their own infrastructure or use a combination of both
onsite and offsite storage.
It is important for a school to maintain safe data storage, but doing so is an enormous task
when hosted on school premises. They must implement network infrastructure, fund their IT
staff, and pay for security technologies. Using cloud storage, they can still keep their data safe,
meet compliance and security requirements, and have an effective disaster recovery strategy at
a fraction of the cost of maintaining data backup and storage infrastructure on the premises.
Here are the things to consider when choosing a Cloud Service Provider
1. Select a service provider that has a long-established track record
2. Factors to consider include
o Availability
▪ most service providers use redundancy and the reassurance of a Service
Level Agreement (SLA)
▪ Most quality service providers guarantee an uptime of at least 99.9
percent
o Scalability.
▪ One of the advantages of working with a data backup and storage
provider is that the solution is scalable; you are not locked into the
amount of storage space you designated when you first subscribed to
the service
▪ The storage provider will allow you to scale up or down as necessary and
as school district requirements change
o Security.
▪ use the same security technologies as the military.
▪ any data that is transmitted to their servers is fully encrypted
▪ invest the time to learn all you can about the security technologies they
use and the compliance certifications they have been awarded
o Recovery Time.
▪ an easy and straightforward process for performing single action data
recovery
o Automated Backup.
▪ Should be able to use an automated process to back up and store all
school data.
▪ Should not require any of your staff to be near their device when the
data is backed up
Here are a few ways to provide safe data storage using cloud storage:
1. Data Security:
• Cloud service provider maintains the infrastructure including keeping it secure
o service provider undergoes periodic audits to ensure their infrastructure
stays compliant
o service provider meets strict compliance standards
• The data is safe, and it can be accessed or restored at any time using a secure
password.
• In addition to the use of passwords, all data is encrypted prior to being sent to the
servers maintained by the cloud data storage provider
• Use the encryption password to unscramble the data and download it from the
server
• Because the data is not being stored on an external device such as a flash drive or
CD, you never have to be concerned about device malfunctions or stolen data
2. User Access.
• School districts never have to worry about compatibility and end user issues
• School is not required to upgrade devices, software, or media on their end since you
access the data storage service using a secure Internet connection and a web
browser
• No additional expenses beyond the monthly or annual subscription.
• Easy user access is important in the event data must be recovered quickly and
efficiently
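The password-based protection described above usually rests on a key-derivation function rather than on the raw password; the following sketch shows one common approach (PBKDF2-HMAC-SHA256, available in Python's standard library), with the password and iteration count chosen purely for illustration.

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches the user's password into a 256-bit
    # encryption key; the random salt is stored alongside the data.
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                               salt, 200_000)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
```

Because the same password and salt always reproduce the same key, the user can later download their data from the provider and "unscramble" it locally, while the provider itself never holds the password.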
The Data Classification policy provides a structured and consistent classification framework for
defining the university’s data security levels, which establishes the foundation for appropriate
access control policies and procedures. This policy is applicable to all Data, as defined in the
policy. It does not apply to information that is the personal property of individuals covered by
the policy.
• General information and marketing materials about the university such as press
releases, campus maps, athletic results, information about academic program offerings.
• St. Thomas e-mail addresses.
• University reports filed with federal or state governments and generally available to the
public.
• Copyrighted materials that are publicly available.
• Student information covered as “Directory information” under FERPA if not restricted by
individual student action.
• Published research.
Category II – Yellow
• Data required by law not to be disclosed without consent of the subject of the
information, that is not covered under Category III - Red.
The Philippine Government launched its national cloud service program. An initiative of the
Philippine Government under the iGovPhil program, the Government Cloud (GovCloud) allows
government agencies to take full advantage of the benefits of cloud computing. GovCloud uses
a hybrid cloud strategy, ensuring data security while enabling on-demand availability of storage
and computing resources.
GovCloud provides the benefits of Security, Scalability and on-demand availability of storage
and computing resources. The cloud infrastructure serves as a centralized data repository, and
it will allow sharing and integration of resources among the government agencies.
The implementation of GovCloud is pursuant to DICT Circular No. 2017-002, the adoption of
the Cloud First Policy, under which agencies provide better services to citizens with scalable
and on-demand cloud computing.
The Department Circular, signed 18 January 2017, prescribes the Philippine Government’s Cloud
First Policy, which aims to promote cloud computing as the preferred ICT deployment strategy
and a means to reduce costs.
5.1 Cloud computing has brought a new and more efficient means of managing government
information technology resources. It is hereby declared the policy of the government to adopt a
“cloud first” approach and for government departments and agencies to consider cloud
computing solutions as a primary part of their infostructure planning and procurement.
5.2 All government agencies shall adopt cloud computing as the preferred ICT deployment
strategy for their own administrative use and delivery of government online services, except:
5.2.1 When it can be shown that an alternative ICT deployment strategy meets special
requirements of a government agency; and
5.2.2 When it can be shown that an alternative ICT deployment strategy is more cost
effective from a Total Cost of Ownership (TCO) perspective and demonstrates at least
the same level of security assurance that a cloud computing deployment offers.
GovCloud Service Catalogue
Currently, we have the following offerings:
Operating System
READILY AVAILABLE
• Red Hat Enterprise Linux 6.5 (64-bit)
• CentOS 6.5 (64-Bit)
Data Privacy: Encryption
Why Encryption?
Regulatory compliance, data privacy concerns, and brand reputation often become powerful
motivating factors for organizations to take advantage of encryption technologies. In addition,
most states in the U.S. have enacted safe harbor laws that protect organizations if they use
strong encryption: if data is encrypted, it is still protected in the event of a breach.
Obstacles to Encryption
• The misperception that encryption is too expensive
• The perception that encryption solutions are difficult to deploy and manage
Data protection has different approaches to provide a more reliable strategy for securing data.
Data can be structured or unstructured, important or not. Data classification is the primary
strategy for categorizing information and data within organizations, including government
agencies. Classifying data according to its value to the business can help organizations develop
a more effective data-centric security approach. Implementing data classification allows
effective security controls, rules, and policies to be applied to the data.
By properly identifying data according to its impact on the organization, its treatment can vary
with its importance. A data backup that assures availability in case of disaster can treat data
differently based on its importance, with corresponding requirements for disk storage space,
data retention, Recovery Point Objective (RPO), and Recovery Time Objective (RTO). Data
classification can be user-driven, where the data owner defines the classification of their data,
while some software can categorize data based on configured parameters.
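A classification-driven backup policy of the kind described here can be expressed as a simple lookup; the tiers and figures below are illustrative placeholders, not values from any cited policy.

```python
# Hypothetical per-class backup parameters; actual values would come
# from the organization's own data protection policy.
BACKUP_POLICY = {
    "confidential": {"rpo_hours": 1,   "rto_hours": 4,  "retention_days": 2555},
    "internal":     {"rpo_hours": 24,  "rto_hours": 24, "retention_days": 365},
    "public":       {"rpo_hours": 168, "rto_hours": 72, "retention_days": 90},
}

def policy_for(classification: str) -> dict:
    # Unlabelled data falls back to the internal-use tier, mirroring the
    # auto-classification default discussed earlier in this section.
    return BACKUP_POLICY.get(classification, BACKUP_POLICY["internal"])
```

The backup scheduler would then consult `policy_for` to decide how often to copy each data set and how long to retain the copies.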
Data protection also covers data security; encryption is the basic strategy to protect data in use
or data in transit, and organizations can implement encryption before saving their data.
Organizations can implement a local backup strategy to protect the data and make it available
at any time. Data recovery is easier in case of data loss or file corruption when a local
restoration procedure exists. However, an onsite backup strategy should be supplemented by a
cloud service.
Cloud services can provide reliable storage for organizations because their technological
advantages protect data from man-made and natural disasters. Cloud services use distributed
cloud technology within a region, where data can be automatically replicated to another cloud
region. Cloud storage can serve as the primary data storage, with onsite backup as the
secondary data storage. Many organizations should consider the cloud as a supplement to any
existing disk-plus-tape backup strategy and not as an alternative to tape backup for long-term
retention. Yet adding cloud storage to an ordinary backup strategy will not, by itself, make that
strategy outstanding.
Philippine Government launched its Cloud services. Government declared the policy to adopt a
“cloud first” approach and for government departments and agencies to consider cloud
computing solutions as a primary part of their infrastructure planning and procurement. As the
public sector adopts a cloud-first policy, the Philippine GovCloud will continue to support
agencies’ efforts to adopt cloud solutions according to their requirements.
Research Gap
Government Cloud services (GovCloud) remain largely unexplored. GovCloud has no established
track record that can be compared with private cloud service providers. The government
implemented the "Cloud First" policy to encourage government agencies to consider the
Government Cloud in their infrastructure.
However, there are no published articles or reviews about the cloud services offered by the
government that would help us understand what distinguishes its services, or whether they
offer the same level of service as private providers. We must also understand the differences in
availability, security, recovery time, and service-level assurance for government agencies or
non-government organizations considering the cloud service as primary data storage.
Theoretical Perspective
The theoretical foundation of this study is built on process theory, and this part describes
control theory. Control theory offers a good framework through which to understand the
information security process, and a way to reconcile theory with practical information
protection.
Process control offers a useful model for thinking about control in general. Consider Figure 2,
which shows a typical control system. In this system, a sensor determines actual conditions, and
a comparison device compares the standard with what exists. If the difference between reality
and the standard is too great, the comparison device sends a signal to act. The action taken in
turn affects the sensor and standard, and the cycle continues until the comparison device finds
agreement between sensor and standard, and stops signaling for action.
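The feedback cycle in Figure 2 can be sketched as a generic loop; the file-restoration example below is a hypothetical illustration of a deviation being corrected until sensor and standard agree.

```python
def control_loop(sensor, standard, act, tolerance=0, max_cycles=1000):
    """Compare the sensed value with the standard and keep signaling
    for corrective action until the two agree (within tolerance)."""
    for _ in range(max_cycles):
        deviation = standard - sensor()
        if abs(deviation) <= tolerance:
            return True      # sensor and standard agree; stop signaling
        act(deviation)       # the action feeds back into the sensor
    return False

# Toy example: restore files until the intact count meets the standard.
state = {"files_ok": 90}
restored = control_loop(sensor=lambda: state["files_ok"],
                        standard=95,
                        act=lambda d: state.update(files_ok=state["files_ok"] + 1))
```

Here the "sensor" is an integrity check over the file store, the "standard" is the expected number of intact files, and the "action" is a restore from backup; the loop stops once reality matches the standard.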
Creating a backup strategy is an important part of a data protection plan. Managers have a
concept of data integrity, and they must be aware of any deviations such as data corruption or
data loss. Given an indicator that deviates from data integrity and availability, management
must act accordingly to bring the data back, making it accessible and reliable. Deviations can be
technical, man-made, or natural disasters; yet with an efficient backup strategy, a good
restoration procedure, and the necessary action, there is the possibility of restoring the data to
a good state.
Conceptual Framework
The government agency wants to ensure that all users of the IT structure within the
organization’s domain abide by the prescriptions regarding the security of data stored digitally
within the boundaries over which the organization stretches its authority.
This section describes how management and the researcher analyze and view the data security
strategy to be developed, as the study intends to recommend a data protection strategy in the
cloud. After careful analysis of the existing literature, the proposed framework focuses on type
of data, data storage, backup and recovery, and data classification. The conceptual framework
is depicted in Figure 3 below.
Philosophical Underpinnings/Lenses
Information technology research has usually been examined in terms of the positivism
paradigm (Jakobsen, 2013). “Positivism in general refers to philosophical positions that
emphasize empirical data and scientific methods.” Positivism prefers quantitative approaches
and methods such as social surveys, structured questionnaires, and official statistics. It is a
framework using practical investigation as its basis, and it seeks to identify the regularities and
interrelationships between
components within the setting under review. When the research setting includes other
variables over which the researcher has no control, namely social factors, this paradigm may
result in unsatisfactory or incomplete explanations of the setting in which social factors play a
part. Positivism is concerned with research that can be replicated to confirm its validity.
• Cloud service is not a "one-size-fits-all" approach to protecting our data against data
corruption and loss.
• Government agencies are committed to their data backup and recovery strategy.
• When it comes to data backup and recovery, the IT group implements risk assessments
on both the local network and cloud storage.
• We may assume that data deduplication, encryption, and replication are not available
or implemented in DepEd Sto. Rosa.
Research Design
The researcher used descriptive research in this study. The study aims to describe the present
condition of the DepEd Sto. Rosa branch in relation to its practices in performing data backup,
which will be the foundation of our analysis and recommendations for safeguarding important
information. The data will be gathered from each individual employee of DepEd Sto. Rosa, and
each response will be treated as an individual primary data source.
Ethical Considerations
The researcher will ask full consent from the DepEd Office of the Sto. Rosa branch, including
the respondents who will participate in the research to be conducted. The researcher will
ensure that the data collected will not contaminate the validity of the results, will protect the
privacy of the respondents, and will ensure the anonymity of individuals. The researcher will
avoid any deception or exaggeration about the objectives of the research.
Research Locale
The study will be conducted at the DepEd Sto. Rosa branch, located at the 2nd Floor, Leon C.
Arcillas Bldg., Barangay Market Area, City of Santa Rosa, Laguna. DepEd Sto. Rosa is one of the
branches of the Department of Education, the executive department of the Philippine
government responsible for ensuring access to, promoting equity in, and improving the quality
of basic education.
The DepEd Sto. Rosa branch is composed of three divisions: the Office of the Schools Division
Superintendent, the Curriculum Implementation Division, and the School Governance and
Operations Division. Each division has specific goals and functions to address different areas of
concern. There are a total of 95 computer users across the three divisions; each employee is a
regular computer user who uses, transforms, transfers, and saves data.
Sampling
The respondents from the DepEd Sto. Rosa branch come from three divisions: the Office of the
Schools Division Superintendent, the Curriculum Implementation Division, and the School
Governance and Operations Division. The total population of this office is 95 rank-and-file
employees, composed of 44 computer users from the Office of the Schools Division
Superintendent, 21 from the Curriculum Implementation Division, and 27 from the School
Governance and Operations Division.
The researcher will use total enumeration, wherein all the respondents are part of the survey.
The respondents of the study are enumerated as follows:
To answer the questions and attain the objectives of this study, a structured survey
questionnaire will be used. Through this questionnaire, the researcher can identify the present
data backup and recovery practices in the DepEd Office Sto. Rosa.
The questionnaire will be disseminated to all the respondents of DepEd Sto. Rosa. The
instruments will be distributed manually, and the researcher will be present while the
respondents answer the survey questionnaire. In this way, the researcher can explain any
questions to the respondents when needed. The researcher will conduct the survey while the
respondents are in their natural working environment.
The researcher will distribute his own questionnaire and will improve the reliability of the
instrument by performing a pretest and validation through respondent debriefing. Respondent
debriefing is the approach of running the survey on a small number of respondents prior to
sending it out to the entire population. The questionnaire should be well understood by the
respondents, and any comments and suggestions from them are entertained.
Data Analysis
Descriptive analysis is the method that will be used to describe the current practices of the
DepEd Office Sto. Rosa. Descriptive statistics such as frequencies, percentages, and means will
be used.
In analyzing and summarizing the data collected, the researcher will use the weighted mean.
Formula as follows:
Weighted Mean:
    x̄ = Σ(wX) / Σw
Where:
    x̄ = the weighted mean
    X = the value of a given rating (scale point)
    w = the frequency of responses for that rating (weighting factor)
    Σ = summation
The computed mean scores will be interpreted using the Likert scale. This tool will measure and
interpret the positive and negative responses from the respondents.
The following tables show the range of scale, rating, and interpretation needed to assess the
responses.
Scale Interpretation
5 Always
4 Often
3 Sometimes
2 Rarely
1 Never
Scale Interpretation
3 Agree
2 Undecided
1 Disagree
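The weighted-mean computation and its interpretation on the 5-point frequency scale can be sketched as follows; the response tally is hypothetical, not actual study data.

```python
def weighted_mean(freqs):
    """x-bar = (sum of w*X) / (sum of w), where X is the scale value
    and w is the number of respondents who chose it."""
    return sum(x * w for x, w in freqs.items()) / sum(freqs.values())

def interpret(mean):
    # Round to the nearest point on the 5-point frequency scale.
    labels = {5: "Always", 4: "Often", 3: "Sometimes", 2: "Rarely", 1: "Never"}
    return labels[min(5, max(1, round(mean)))]

# Hypothetical tally for one survey item (95 respondents).
responses = {5: 10, 4: 25, 3: 40, 2: 15, 1: 5}
mean = weighted_mean(responses)
```

For this tally the weighted mean is 305/95 ≈ 3.21, which the scale above interprets as "Sometimes".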
Appendix A
Survey Questionnaire
Dear Respondent,
We are conducting research on the analysis of data backup and recovery. Your response in this regard
will help us to complete this research in an efficient way. We assure you that the data collected will be
kept confidential.
Literature Cited
Praveen S. Challagidad (April 2017). Efficient and Reliable Data Recovery Technique in Cloud
Computing. Basaveshwar Engineering College, Bagalkot, India.
Frank Simorjay (April 2017). Microsoft. Shared Responsibilities for Cloud Computing
Retrieved from https://gallery.technet.microsoft.com/Shared-Responsibilities-81d0ff91
Rongzhi Wang (2017). Research on data security technology based on cloud storage.
Institute of Computer, Hulunbuir College, Mongolia, China.
Thiruvadinathan (2017). Happiest Minds Technologies. Data Classification – Taking control of your data.
Retrieved from https://www.happiestminds.com/whitepapers/Data-Classification-Taking-control-of-
your-data.pdf
James Boldon (2017). QinetiQ Company. Whitepaper - The First Step to Protecting Unstructured Data.
Retrieved from https://www.boldonjames.com/resources/data-classification/
Boldon James (August 2017). QinetiQ Company. Best Practice: User-Driven Data Classification.
Retrieved from https://www.boldonjames.com/tag/user-driven-classification/
Susan Fowler (February 2003). SANS Institute. Information Classification - Who, Why and How.
Retrieved from https://www.sans.org/reading-room/whitepapers/auditing/information-classification-
who-846
CA ARCserve (2017). Best Practices on Leverage the Public Cloud for Backup and Disaster Recovery.
Retrieved from http://www.arcserve.com/us/solutions/~/media/Files/SolutionBriefs/ca-arcserve-top-
ten-practices-public-cloud.pdf
Our ICT (March 2015). Microsoft Authorized Education Reseller. A Guide to Choosing the Right Data
Backup Solution for your School
Retrieved from http://www.ourict.co.uk/best-school-data-backup
Department of Information and Communications Technology (DICT) (2017). Government Cloud Service.
Retrieved from http://www.dict.gov.ph/prescribing-the-philippine-governments-cloud-first-policy/
Department of Information and Communications Technology (DICT) (2017). GovCloud Service Catalogue.
Retrieved from http://i.gov.ph/govcloud/service-catalogue/