    Brajendra Panda

    Welcome to the 17th Annual ACM Symposium on Applied Computing (SAC 2002), hosted by Universidad Carlos III de Madrid in Leganes, Spain, strategically located south of Madrid! Thank you for attending this international forum for computer scientists, engineers, and practitioners, which features many innovative computational ideas and a wide spectrum of applications.

    SAC is a conference devoted to the study of real-world applications using many varieties of computational algorithms. As such, it provides an avenue for discussion and exchange of new ideas, associated algorithms, and interesting complex applications. The Symposium is sponsored by the ACM Special Interest Group on Applications (SIGAPP), whose mission is to further the interests of computing professionals engaged in the development of new computing applications, interdisciplinary application areas, and applied research. Accordingly, the spectrum of applications and tutorials ranges from databases and computational finance to evolutionary algorithms, software engineering, and parallel and distributed computing, among other topics reflected in the SAC 2002 program. Note that Biomedical Computing is also a special element of the Symposium, with an innovative bioinformatics track and an associated tutorial and plenary session directed by Warren Jones.

    Welcome to the 17th Symposium on Applied Computing (SAC 2002). During the past 16 years, the Symposium has provided an opportunity for researchers and practitioners to present their findings and research results in the areas of computer applications and technology. This year, the three-day technical program offers a wide range of tracks covering major areas of computer applications. Highly qualified referees with strong expertise and special interest in their respective research areas carefully reviewed the submitted papers.
In addition, the technical program includes a tutorial program offering 3 full-day and 4 half-day tutorials. The tutorials are described later in this program and are posted on the SAC 2002 website (http://www.acm.org/conferences/sac/sac2002/TutorialCall.htm).

    This year, SAC embarked on a radical modification of its established procedure for compiling the list of tracks to which authors would subsequently submit their papers. Specifically, an open call for track proposals was introduced, inviting all parties interested in holding a track to respond by submitting to the Program Chairs a short description of the proposed track, along with a preliminary dissemination plan for the proposed track's call for papers and a short CV of the potential track chairs. In response to this call, 34 track proposals were submitted, which were evaluated thoroughly by SAC 2002's Organizing Committee. Some proposals were rejected on the grounds of either not being appropriate for the areas that SAC traditionally covers or being of a rather narrow and specialized nature. Others were merged to form single tracks on the grounds of substantial overlap with each other. Eventually, 21 tracks were established, which then went on to produce their own calls for papers. In response to these calls, 457 papers were submitted, from which 194 papers were strongly recommended by the referees for acceptance and inclusion in the Conference Proceedings. This gives SAC 2002 an acceptance rate of 42% across all submissions and an average acceptance rate of 40% over all tracks. It also makes SAC 2002 not only the most successful conference in the history of SAC so far, but also one of the most popular and competitive conferences in the international field of applied computing.
    Welcome to the 18th Annual ACM Symposium on Applied Computing (SAC 2003), hosted by the Florida Institute of Technology in Melbourne, Florida! Thank you for attending this international forum for computer scientists, engineers, and practitioners, which features many innovative computational ideas and a wide spectrum of applications.

    SAC is a conference devoted to the study of real-world applications using many varieties of computational algorithms. As such, it provides an avenue for discussion and exchange of new ideas, associated algorithms, and interesting complex applications. The symposium is sponsored by the ACM Special Interest Group on Applications (SIGAPP), whose mission is to further the interests of computing professionals engaged in the development of new computing applications, interdisciplinary application areas, and applied research. Accordingly, the spectrum of applications and tutorials ranges from data mining, mobile computing, and computational finance to evolutionary algorithms, software engineering, and parallel and distributed computing, among other topics reflected in the SAC 2003 program. Note that Biomedical Computing continues to be a special element of the symposium, with an innovative bioinformatics track and associated tutorials and plenary session directed by Warren Jones.

    Again, welcome to SAC 2003 and Melbourne, Florida. We hope that you will leave enriched with new friends and new ideas, having enjoyed the distinctive ambiance of Florida. Next year, we encourage you and your colleagues to submit papers and attend SAC 2004.

    Welcome to the 18th Annual ACM Symposium on Applied Computing (SAC 2003). Over the past 17 years, SAC has been an international forum for researchers and practitioners to present their findings and research results in the areas of computer applications and technology.
The SAC 2003 Technical Program offers a wide range of tracks covering major areas of computer applications. Highly qualified referees with strong expertise and special interest in their respective research areas carefully reviewed the submitted papers. As part of the Technical Program, this year the Tutorial Program offers 4 half-day tutorials that were carefully selected from 18 proposals. The Tutorial Program and abstracts are available at http://www.acm.org/conferences/sac/sac2003/Tutorials.htm.

    SAC's open call for track proposals resulted in the submission of 30 track proposals. These proposals were carefully evaluated by the conference Executive Committee. Some proposals were rejected on the grounds of either not being appropriate for the areas that SAC traditionally covers or being of a rather narrow and specialized nature. Others were merged to form single tracks on the grounds of substantial overlap with each other. Eventually, 21 tracks were established, which then went on to produce their own calls for papers. In response to these calls, 525 papers were submitted, from which 200 papers were strongly recommended by the referees for acceptance and inclusion in the Conference Proceedings. This gives SAC an acceptance rate of 38% across all tracks. Furthermore, it makes SAC 2003 not only the most successful conference in the history of SAC so far, but also one of the most popular and competitive conferences in the international field of applied computing.

    We hope you will enjoy the meeting and have the opportunity to exchange your ideas and make new friends. We also hope you will enjoy your stay in Melbourne and take pleasure in the many entertainments and activities that the city (and neighboring cities such as Orlando) has to offer. We look forward to your active participation in SAC 2003, and we encourage you and your colleagues to submit your research findings to next year's technical program. Thank you for being part of SAC 2003!
    During the last decades, not only has the number of cyberattacks increased significantly, but they have also become more sophisticated. Hence, designing a cyber-resilient approach is of paramount importance. Traditional security methods are not adequate to prevent data breaches in the case of cyberattacks. Cybercriminals have learned how to use new techniques and robust tools to hack, attack, and breach data. Fortunately, Artificial Intelligence (AI) technologies have been introduced into cyberspace to construct smart models for defending systems from attacks. Since AI technologies can rapidly evolve to address complex situations, they can serve as fundamental tools in the field of cybersecurity. AI-based techniques can provide efficient and powerful cyber defense tools to recognize malware attacks, network intrusions, phishing and spam emails, and data breaches, to name a few, and to raise alerts when security incidents occur. In this paper, we review the impact of AI in cybersecurity and summarize existing research in terms of the benefits of AI in cybersecurity.
    The advancement of information technology in coming years will bring significant changes to the way healthcare data is processed. Technologies such as cloud computing, fog computing, and the Internet of things (IoT) will offer healthcare providers and consumers opportunities to obtain effective and efficient services via real-time data exchange. However, as with any computer system, these services are not without risks. There is the possibility that systems might be infiltrated by malicious users and, as a result, data could be corrupted, which is a cause for concern. Once an attacker damages a set of data items, the damage can spread through the database. When valid transactions read corrupted data, they can update other data items based on the value read. Given the sensitive nature of healthcare data and the critical need to provide real-time access for decision-making, it is vital that any damage done by a malicious transaction and spread by valid transactions must be corrected i...
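The damage-spread idea described above can be illustrated with a minimal sketch: any transaction that reads a corrupted data item taints every item it subsequently writes. The log format, transaction IDs, and item names below are hypothetical illustrations, not the paper's actual model.

```python
def assess_damage(log, initially_damaged):
    """log: list of (txn_id, reads, writes) in commit order."""
    damaged = set(initially_damaged)
    affected_txns = []
    for txn_id, reads, writes in log:
        if damaged & set(reads):          # txn read a corrupted value...
            damaged |= set(writes)        # ...so everything it wrote is suspect
            affected_txns.append(txn_id)
    return damaged, affected_txns

log = [
    ("T1", {"x"}, {"y"}),   # T1 reads x (corrupted by the attacker), writes y
    ("T2", {"z"}, {"w"}),   # T2 touches only clean data
    ("T3", {"y"}, {"v"}),   # T3 reads y, which is now also corrupted
]
damaged, txns = assess_damage(log, {"x"})
print(sorted(damaged))  # ['v', 'x', 'y']
print(txns)             # ['T1', 'T3']
```

A single pass over the committed log in commit order suffices here because a transaction can only be tainted by writes that committed before it.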
    Cloud computing has brought many advantages to organizations and computer users. It allows different service providers to distribute many applications as services in an economical way. Therefore, many users and companies have begun using cloud computing. However, they are concerned about their data when they store it on a third-party server, the cloud. The private data of individual users and companies is stored and managed by the service providers on the cloud, which offers services from the other side of the Internet from its users' perspective, and this consequently results in privacy concerns [1]. In this paper, a technique is explored to encrypt data on the cloud and to execute SQL queries on the cloud over the encrypted data. The strategy is to process the query at the service provider's site without having to decrypt the data. Also, to achieve efficiency, no more than the exact set of requested data is returned to the client. Data decryption is performed at the client site to prevent any leakage at the cloud or during transmission. Two techniques are provided to effectively store the encrypted data, along with an experimental evaluation comparing the two techniques.
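One common way to let a server filter without decrypting, sketched below under stated assumptions, is to store each value as a deterministic search token plus a ciphertext: the client sends only the token, the server matches tokens, and decryption happens client-side. The `toy_encrypt` function is a placeholder standing in for a real cipher (e.g., AES); the token scheme, key, and table layout are illustrative assumptions, not the paper's actual technique.

```python
import hashlib
import hmac

KEY = b"client-secret-key"   # hypothetical client-held key

def search_token(value: str) -> str:
    """Deterministic keyed token the server can compare without decrypting."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

def toy_encrypt(value: str) -> bytes:          # placeholder, NOT a secure cipher
    pad = hashlib.sha256(KEY + b"enc").digest()
    return bytes(b ^ pad[i % 32] for i, b in enumerate(value.encode()))

def toy_decrypt(ct: bytes) -> str:
    pad = hashlib.sha256(KEY + b"enc").digest()
    return bytes(b ^ pad[i % 32] for i, b in enumerate(ct)).decode()

# "cloud" table: the server sees only tokens and ciphertexts
table = [(search_token(name), toy_encrypt(name)) for name in ("alice", "bob")]

# SELECT * WHERE name = 'bob'  ->  client sends only the token
wanted = search_token("bob")
matches = [ct for tok, ct in table if tok == wanted]   # server-side filter
print([toy_decrypt(ct) for ct in matches])  # ['bob']
```

Only the exact matching ciphertexts travel back to the client, matching the efficiency goal stated above, and the plaintext never leaves the client.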
    Information sharing is crucial for various organizations operating in a global environment. A variety of existing virtual organizations support some form of information sharing. Since the scope of a virtual organization can span multiple administrative domains, information assurance is challenging. While trust plays a key role in eliminating the scalability restrictions of traditional security mechanisms and provides more than mere security, existing trust models focus on subject trust management. But studying a subject's trustworthiness alone offers only part of the solution to ensuring the quality and security of the information the subject produces. Furthermore, most current research on information assurance and security for a virtual organization focuses on information confidentiality and the protection of information from unauthorized modifications. Very little work has been done on ensuring the quality and security features of external information. Taking these issues into conside...
    While the user base of cloud computing is growing rapidly, data owners worry about the security of the data they store on clouds. Lack of appropriate control over the data might cause security violations. Therefore, all sensitive data stored in cloud databases must be protected at all times. This paper outlines how data owners can keep their data secure and trustworthy, and how they can verify the integrity of data in a cloud computing environment. The proposed model uses data partitioning to reach this goal. We have carried out performance analyses of the model through simulation, and the results demonstrate its effectiveness.
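A minimal sketch of how partitioning can support integrity verification, assuming the owner keeps one digest per partition locally: re-hashing the fetched partitions pinpoints which partition was tampered with, so only that partition needs closer inspection. The record format and round-robin partitioning are illustrative assumptions, not the paper's actual model.

```python
import hashlib

def partition_hashes(records, num_parts):
    """Round-robin the records into partitions and return one digest per partition."""
    parts = [[] for _ in range(num_parts)]
    for i, rec in enumerate(records):
        parts[i % num_parts].append(rec)
    return [hashlib.sha256("|".join(p).encode()).hexdigest() for p in parts]

records = ["r1:alice", "r2:bob", "r3:carol", "r4:dave"]
owner_digests = partition_hashes(records, 2)   # kept by the data owner

# later: fetch the partitions back from the cloud and re-hash to verify
cloud_records = records.copy()
cloud_records[2] = "r3:mallory"                # simulated tampering on the cloud
cloud_digests = partition_hashes(cloud_records, 2)

tampered = [i for i, (a, b) in enumerate(zip(owner_digests, cloud_digests)) if a != b]
print(tampered)  # [0] -- record r3 (index 2) landed in partition 0
```

The owner stores only `num_parts` digests rather than a copy of the data, which keeps local verification state small.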
    The web of trust is the foundation of reputation systems, recommendation systems, and the semantic Web. Most existing research on the web of trust concentrates on aggregating the trust ratings of subjects and objects in the web of trust. The problem with this approach is that an adversarial subject can accumulate reputation gradually and become highly trusted by many other subjects. If this subject later deliberately releases deceptive data, the effect may be disastrous. Not only can this deceptive data greatly affect people who directly trust this individual, but its consequences may also reach many other subjects in the network. Our model illustrates how structural analysis of the Web can help evaluate the deleterious result of the deceptive data.
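A minimal sketch of the structural idea, under simplifying assumptions: model the web of trust as a directed graph where an edge u -> v means "u trusts v", and estimate the reach of deceptive data released by one subject as the set of subjects that directly or transitively trust it. The trust edges and node names below are hypothetical, and real structural analysis would also weigh trust strength.

```python
from collections import deque

trusts = {            # hypothetical trust edges: key trusts each listed value
    "A": ["B"], "B": ["C"], "D": ["C"], "E": ["A"], "C": [],
}

def affected_by(source):
    """Subjects who directly or transitively trust `source` (reverse BFS)."""
    reverse = {n: [] for n in trusts}
    for u, vs in trusts.items():
        for v in vs:
            reverse[v].append(u)
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for truster in reverse[node]:
            if truster not in seen:
                seen.add(truster)
                queue.append(truster)
    return seen

print(sorted(affected_by("C")))  # ['A', 'B', 'D', 'E']
```

Here a highly trusted node like C can mislead almost the whole network, which is exactly the disastrous cascade the abstract warns about.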
    For the past few years, research in multilevel secure database systems has received a great deal of attention. Such systems are quite essential in military as well as many commercial applications where data are classified according to their sensitivity and where each user has a clearance level. Users access the data as per the system's security policy. A system is most secure if it guards against an unauthorized flow of information either directly or indirectly. In this research, the issue of query processing that takes place among the various base relations in a kernelized multilevel secure database system was analyzed. Specifically, the SeaView model, a research prototype developed as a joint effort by SRI International and Gemini Computer, was followed since it is the only model that uses element level (i.e., the finest granularity level) classification of data. Although the SeaView model aims at achieving class A1 system classification, it has two major drawbacks. First, the...
    In this paper, we present an insider attack detection model that profiles traceability links based on document dependencies and calendar-based file usage patterns for detecting insider threats. This model is utilized to detect insiders' malicious activities targeted at tampering with the contents of files for various purposes. We apply the concept of traceability links from the software engineering field to this research. Our approach mainly employs document-dependency traceability links for constructing the insider attack detection model.
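A hedged sketch of the traceability-link idea: if document B is known to depend on document A, then an edit to B by a user who has not recently read A deviates from the profiled pattern and is flagged. The dependency map, event format, and time window below are illustrative assumptions, not the paper's actual model.

```python
DEPENDS_ON = {"report.doc": ["data.xls"], "summary.doc": ["report.doc"]}
WINDOW = 60  # minutes: a dependency should have been read at least this recently

def suspicious_edits(events):
    """events: list of (minute, user, file, action) sorted by time."""
    last_read = {}
    alerts = []
    for t, user, f, action in events:
        if action == "read":
            last_read[(user, f)] = t
        elif action == "write":
            for dep in DEPENDS_ON.get(f, []):
                # alert if this user never read the dependency, or read it too long ago
                if t - last_read.get((user, dep), -10**9) > WINDOW:
                    alerts.append((t, user, f, dep))
    return alerts

events = [
    (10, "ann", "data.xls", "read"),
    (20, "ann", "report.doc", "write"),    # fine: ann read data.xls at t=10
    (500, "bob", "report.doc", "write"),   # bob never read data.xls -> flag
]
print(suspicious_edits(events))  # [(500, 'bob', 'report.doc', 'data.xls')]
```

A fuller model would learn the dependency map and the window from historical usage rather than hard-coding them.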
    The world has experienced a huge advancement in computing technology. People prefer outsourcing their confidential data for storage and processing in cloud computing because of the auspicious services provided by cloud service providers. As promising as this paradigm is, it creates issues ranging from data security to time latency in data computation and delivery to end-users. In response to these challenges, the fog computing paradigm was proposed as an extension of cloud computing to overcome the time latency and communication overhead and to bring computing and storage resources close to both the ground and the end-users. However, fog computing inherits the same security and privacy challenges encountered by traditional cloud computing. This paper proposes a fine-grained data access control approach that integrates the ciphertext policy attribute-based encryption (CP-ABE) algorithm and blockchain technology to secure end-users' data against rogue fog nodes...
    Fuzzy dependency in a database delineates a loose dependency relationship between two sets of attributes. It describes logical relationships among attributes in a database relation that cannot be fully specified by functional dependencies, which focus on database schema and data organization. This characteristic of the database schema can be used to perform damage assessment and also to build a fuzzy recovery model. In this paper, we formally define the concept of fuzzy dependency and introduce several inference rules. Then we focus on recovery from information attacks. An architecture for fuzzy value generation during recovery, based on fuzzy dependency relationships, is also presented. Fuzzy dependency can accelerate the post-attack recovery process because it can generate acceptable values for damaged data more quickly than traditional recovery schemes. Although the generated fuzzy values may not offer absolute accuracy, they are acceptabl...
    The volume of data generated worldwide is rapidly growing. Cloud computing, fog computing, and the Internet of Things (IoT) technologies have been adopted to compute and process this high data volume. In coming years, information technology will enable extensive developments in the field of healthcare and offer healthcare providers and patients broadened opportunities to enhance their healthcare experiences and services, owing to heightened availability and enriched services through real-time data exchange. As promising as these technological innovations are, security issues such as data integrity and data consistency remain widely unaddressed. Therefore, it is important to engineer a solution to these issues. Developing a damage assessment and recovery control model for fog computing is critical. This paper proposes two models for using fog computing in healthcare: one for private fog computing distribution and one for public fog computing distribution. For each model, we propose a ...
    The integrity of files stored on the cloud is crucial for many applications, specifically applications that manage online social networks. Nowadays, social networking sites have become primary locations for sharing documents. Due to security lapses at these sites, shared documents could be stolen or changed by cyber attackers. We have developed a model to protect shared data and resources from unauthorized accesses. This method, called policy-based attribute access control (PBAAC) [20], enables resource owners to define policies that govern access to their resources. However, policies are saved as plain text in a file and could be compromised by unauthorized users, violating the integrity of those policies. In this paper, we propose a method to protect policies saved in text files. We developed an algorithm to extract critical information from each policy and create a hash value using a cryptographic hash algorithm.
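The policy-integrity idea above can be sketched minimally: store a digest of each policy's critical fields in protected storage and recompute it before the policy is enforced; a mismatch means the plain-text policy file was altered. The policy format, field names, and choice of SHA-256 are hypothetical illustrations.

```python
import hashlib

def policy_digest(policy: dict) -> str:
    """Hash only the critical fields, in a canonical order."""
    critical = "|".join(
        f"{k}={policy[k]}" for k in sorted(("owner", "resource", "action"))
    )
    return hashlib.sha256(critical.encode()).hexdigest()

policy = {"owner": "alice", "resource": "album1", "action": "read", "note": "x"}
stored = policy_digest(policy)          # digest kept in protected storage

policy["resource"] = "album2"           # tampering in the plain-text file
print(policy_digest(policy) == stored)  # False -> reject the tampered policy
```

Sorting the field names before hashing makes the digest independent of the order the fields happen to appear in the file.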
    Recovery of lost or damaged data in a post-intrusion-detection scenario is a difficult task, since database management systems are not designed to deal with committed malicious transactions. The few existing methods developed for this purpose rely heavily on logs and require that the log not be purged. This causes the log to grow tremendously and, since scanning the huge log takes an enormous amount of time, recovery becomes a complex and prolonged process. In this research, we have used a data dependency approach to divide a log into multiple segments, each containing only related operations. During damage assessment and recovery, we identify and skip the parts of the log that contain unaffected operations. This accelerates the task. Through simulation we have validated the performance of our method.
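A hedged sketch of dependency-based log segmentation: operations touching overlapping data items are grouped into one segment (here via union-find), so damage assessment can skip whole segments the attack never touched. The operation format and item names are illustrative assumptions, not the paper's actual log model.

```python
def segment_log(ops):
    """ops: list of (op_id, items_touched). Returns segments as lists of op ids."""
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    item_rep = {}                            # last op that touched each item
    for op_id, items in ops:
        parent.setdefault(op_id, op_id)
        for it in items:
            if it in item_rep:
                union(op_id, item_rep[it])   # shared item => same segment
            item_rep[it] = op_id

    segments = {}
    for op_id, _ in ops:
        segments.setdefault(find(op_id), []).append(op_id)
    return list(segments.values())

ops = [("o1", {"x", "y"}), ("o2", {"z"}), ("o3", {"y"}), ("o4", {"w", "z"})]
print(segment_log(ops))  # [['o1', 'o3'], ['o2', 'o4']] -- two independent segments
```

If the attack is known to have touched only item z, the segment containing o1 and o3 can be skipped entirely during assessment.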
    In recent years, millions of people have become connected on social networks. Social networks have become an excellent platform for sharing information and communication that reflects real-world relationships. This global use of social networks has made them an excellent candidate for using the cloud. As a result, numerous access control methods have been proposed to maximize available resources. In addition, to protect these resources, access control policies have been devised. In this paper, we propose a method to evaluate requests for accessing shared resources and to respond to those requests. Before evaluating a request, we provide a well-formed notation for specifying the request, the cloud-based social network, and the associated access control policies. We show that our notation enables users to define the inquiries and the policies in a simpler and more efficient way, and that it helps users protect their assets under tight security.
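The benefit of a well-formed notation can be sketched minimally: once both the request and the policies are structured tuples, evaluation reduces to a simple match. The field names, the role model, and the default-deny choice below are illustrative assumptions, not the paper's actual notation.

```python
# hypothetical policies over a shared photo album
policies = [
    {"subject": "friend", "action": "read",  "resource": "photos", "effect": "allow"},
    {"subject": "anyone", "action": "write", "resource": "photos", "effect": "deny"},
]

def evaluate(request, subject_role):
    """Return the effect of the first policy matching the structured request."""
    for p in policies:
        if (p["subject"] in (subject_role, "anyone")
                and p["action"] == request["action"]
                and p["resource"] == request["resource"]):
            return p["effect"]
    return "deny"   # default-deny when no policy matches

print(evaluate({"action": "read", "resource": "photos"}, "friend"))    # allow
print(evaluate({"action": "write", "resource": "photos"}, "friend"))   # deny
```

First-match semantics and default-deny are common design choices for such evaluators, since they make the outcome of every request unambiguous.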
    Insider threats pose one of the most significant risks to the confidentiality, integrity, and availability of organizational data assets that are critical to the operation of the business. While considerable infrastructure is generally in place to protect critical data from attacks originating from outside sources, far fewer resources are in place focused on mitigating the threat of malicious insiders. Because insiders are trusted and have required access to these critical data, the insider threat is a particularly pernicious and vexing problem. The risk of attacks originating from insiders is exacerbated by the lack of tools available to counteract this risk, and this work helps solve that problem. It utilizes proactive means of automatically identifying mission-critical data and deploying honeytokens in known areas where critical data resides, as well as reactive means of identifying potential insider threats to critical data once certain suspicious actions have occurred. The work performed is primarily focused on database and RFID-based systems, though the techniques are applicable to a wide range of domains. Several new algorithms are proposed, designed, tested, and deployed to show the merits of these methods. This work serves as an additional tool for the computer security industry and for security engineers in the field who wish to focus limited resources on one of the largest security threats: trusted insiders abusing mission-critical data. The results show that, with the proper foresight to plan for insider threats before incidents occur, critical data assets can be identified and protected effectively with the novel methods defined in this work.
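The honeytoken idea mentioned above can be sketched minimally: plant decoy records that no legitimate query should ever return, then raise an alert whenever one appears in a result set, as typically happens during a bulk dump by a malicious insider. The table schema, token values, and alerting mechanism below are illustrative assumptions.

```python
# decoy values that should never appear in a legitimate result set
HONEYTOKENS = {"cc-0000-HONEY", "ssn-000-00-0000"}

customers = [
    {"name": "real user", "card": "cc-1111-2222"},
    {"name": "decoy",     "card": "cc-0000-HONEY"},   # planted honeytoken row
]

def run_query(predicate):
    """Evaluate a query and alert if its result touches any honeytoken."""
    result = [row for row in customers if predicate(row)]
    touched = [row for row in result if row["card"] in HONEYTOKENS]
    if touched:
        print(f"ALERT: honeytoken accessed ({len(touched)} row(s))")
    return result

run_query(lambda r: r["name"] == "real user")   # targeted query: no alert
rows = run_query(lambda r: True)                # bulk dump -> triggers the alert
print(len(rows))  # 2
```

The decoys cost nothing in normal operation because well-formed application queries never select them; only indiscriminate access trips the alarm.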
    The evolution of the utilization of technologies in nearly all aspects of life has produced an enormous amount of data essential in a smart city. Therefore, maximizing the benefits of technologies such as cloud computing, fog computing, and the Internet of things is important to manage and manipulate data in smart cities. However, certain types of data are sensitive and risky and may be infiltrated by malicious attacks. As a result, such data may be corrupted, thereby causing concern. The damage inflicted by an attacker on a set of data can spread through an entire database. Valid transactions that have read corrupted data can update other data items based on the values read. In this study, we introduce a unique model that uses fog computing in smart cities to manage utility service companies and consumer data. We also propose a novel technique to assess damage to data caused by an attack. Thus, original data can be recovered, and a database can be returned to its consistent state a...
    Fog computing emerged as mini-clouds deployed close to the ground to reduce communication overhead and time latency between the cloud and end-users’ devices. Because fog computing is an extension of cloud computing, it inherits the security and privacy issues cloud computing has faced. If a Fog Node (FN) serving end-devices goes rogue or becomes maliciously compromised, this would hinder individuals’ and organizations’ data security (e.g., confidentiality, integrity, and availability). This paper presents a novel scheme based on Ciphertext-Policy Attribute-Based Encryption (CP-ABE) and hashing cryptographic primitives to minimize the amount of data in danger of breach by rogue fog nodes while maintaining the fog computing services provided to end-users’ devices. This scheme manages to oust rogue Fog Nodes (FNs) and to prevent them from violating end-users’ data security while guaranteeing the features provided by the fog computing paradigm. We demonstrate our scheme’s applicability and efficiency by carrying out a performance analysis and analyzing its security and communication overhead.
    Recently, database users have begun to use cloud database services to outsource their databases, owing to the high computation speed and huge storage capacity that cloud owners provide at low prices. Despite cloud computing being an attractive environment for database users, privacy issues are a cause for concern for database owners, since data access will be out of their control. Encryption is the only way of assuaging users’ fears surrounding data privacy. However, executing Structured Query Language (SQL) queries over encrypted data is a challenging task, especially if the data are encrypted by a randomized encryption algorithm. Many researchers have addressed these privacy issues by using deterministic encryption, onion-layer encryption, or homomorphic encryption. But even with these systems, the encrypted data can still be subject to attack. In this research, we propose a model to execute SQL queries over encrypted data, where the data are encrypted by a single randomized encryption algorithm, namely the Advanced Encryption Standard in CBC mode (AES-CBC). We move most of the computations to the cloud and leave users with no crypto computation. Our model narrows the range of encrypted records retrieved from the cloud to a small set of candidates for the user’s query. We implement and evaluate our model and find that it is both practical and efficient. Our experiments show that our model reduces the decryption work to less than 30% of what decrypting the whole set of encrypted records in a table would require.
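The candidate-narrowing idea can be sketched under simplifying assumptions: each record is stored with a coarse keyed bucket tag derived from its plaintext, so the cloud returns only the records in one bucket instead of the whole table, and the client decrypts just those few candidates. Here `toy_encrypt` is a placeholder standing in for AES-CBC, and the bucket count, key, and tagging scheme are illustrative assumptions, not the paper's actual indexing model.

```python
import hashlib

BUCKETS = 4
KEY = b"owner-key"   # hypothetical owner-held key

def bucket(value: str) -> int:
    """Coarse keyed tag: many plaintexts share a bucket, so the tag leaks little."""
    return hashlib.sha256(KEY + value.encode()).digest()[0] % BUCKETS

def toy_encrypt(value: str) -> bytes:          # placeholder, NOT real AES-CBC
    pad = hashlib.sha256(KEY).digest()
    return bytes(b ^ pad[i % 32] for i, b in enumerate(value.encode()))

def toy_decrypt(ct: bytes) -> str:
    pad = hashlib.sha256(KEY).digest()
    return bytes(b ^ pad[i % 32] for i, b in enumerate(ct)).decode()

names = ["alice", "bob", "carol", "dave", "erin"]
cloud = [(bucket(n), toy_encrypt(n)) for n in names]   # what the cloud stores

# query: name = 'carol'  ->  the cloud filters by bucket tag only
candidates = [ct for tag, ct in cloud if tag == bucket("carol")]
hits = [v for v in map(toy_decrypt, candidates) if v == "carol"]
print(hits)  # ['carol']
```

The client still filters false positives after decryption, but it decrypts only one bucket's worth of records rather than the whole table, which is the source of the savings the experiments above measure.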
    The cloud computing paradigm has revolutionized the concept of computing and has gained huge attention, since it provides computing resources as a service over the Internet. As auspicious as this model is, it brings forth many challenges: everything from data security to time latency issues with data computation and delivery to end users. To manage these challenges, the fog computing paradigm has emerged. In this context, fog computing involves placing mini-clouds close to end users to solve time latency problems. However, as fog computing is an extension of cloud computing, it inherits the same security and privacy challenges encountered by traditional cloud computing. These challenges have accelerated the research community’s efforts to find effective solutions. In this paper, we propose a secure and fine-grained data access control scheme based on the ciphertext-policy attribute-based encryption (CP-ABE) algorithm to prevent fog nodes from violating end users’ confidentiality in situations where a compromised fog node has been ousted. In addition, to decrease time latency and communication overhead between the Cloud Service Provider (CSP) and the fog nodes (FNs), our scheme classifies the fog nodes into fog federations (FF) by location and services provided, and each fog node divides the plaintext to be encrypted into multiple blocks to accelerate the retrieval of ciphertext from the CSP. We demonstrate our scheme’s efficiency by carrying out a simulation and analyzing its security and performance.
    Phenomenal technological developments have an immense and broad impact on the future of adaptive and evolving government systems. Technologies, such as cloud computing, fog computing, and the Internet of Things (IoT), offer governments and their citizens opportunities to enhance their experiences and services; increased availability and higher-quality services via real-time data processing augment the potential for technology to add value to everyday experiences. This paper presents a novel model for an intelligent government system that uses fog computing technology to control and manage the data of the entire system. Guaranteeing trusted and reliable communication between entities is the primary intent of this model. Unique algorithms, focused on sustaining the system’s data integrity in the event of a system attack, are proposed and evaluated.
    The dissertation concentrates on addressing the factors and capabilities that enable insiders to violate systems security. It focuses on modeling the cumulative knowledge that insiders gain through legitimate accesses, and it concentrates on analyzing the dependencies and constraints among data items, representing them using graph-based methods. The dissertation proposes new types of Knowledge Graphs (KGs) to represent insiders' knowledgebases. Furthermore, it introduces the Neural Dependency and Inference Graph (NDIG) and the Constraints and Dependencies Graph (CDG) to demonstrate the dependencies and constraints among data items. The dissertation discusses in detail how insiders use knowledgebases, dependencies, and constraints to obtain unauthorized knowledge. It suggests new approaches to predict and prevent the aforementioned threat. The proposed models use KGs, NDIG and CDG in analyzing the threat status, and leverage the effect of updates on the lifetimes of data items in insi...
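The core inference threat, an insider deriving unauthorized items from legitimately learned ones, can be sketched as a closure computation over a dependency graph. The data items and dependency map below are hypothetical, and this is only a schematic reading of the dissertation's graph models:

```python
def inferable_closure(known, dependencies):
    """Return everything an insider can eventually derive.

    dependencies maps item -> set of items from which it can be inferred.
    Repeatedly add any item whose full prerequisite set is already known.
    """
    known = set(known)
    changed = True
    while changed:
        changed = False
        for item, prereqs in dependencies.items():
            if item not in known and prereqs <= known:
                known.add(item)
                changed = True
    return known

# Hypothetical dependencies: salary is derivable from grade and pay scale,
# and grade is derivable from job title.
deps = {"salary": {"grade", "pay_scale"}, "grade": {"title"}}
```

A defense built on such a graph would flag an access request whenever granting it would bring a protected item into the requester's closure.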
    Cloud computing offers a considerable number of advantages to clients and organizations that use several capabilities to store sensitive data, interact with applications, or use technology infrastructure to perform daily activities. The development of new models in cloud computing brings with it a series of elements that must be considered by companies, particularly when sensitive data need to be protected. In this research, a model that uses a trusted third party (TTP) to enforce database security in the cloud is proposed. First, the TTP partitions the data using an index built from one of the attributes in the table, and sends the records to the cloud in encrypted form along with the index. Second, the TTP analyzes the client query and retrieves a segment of the data from the cloud based on the query conditions. The final result is submitted to the client, which then executes only a minimal workload. Some simulations were performed to evaluate the efficiency o...
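The TTP's partition-then-route step can be sketched as follows. The split points, the indexed attribute, and the `ENC(...)` placeholders standing in for encrypted records are all assumptions made here; the paper's actual partitioning index may differ.

```python
import bisect

# Hypothetical: the TTP picks split points over an indexed numeric attribute,
# giving partitions (-inf, 20), [20, 40), [40, 60), [60, inf).
SPLITS = [20, 40, 60]

def partition_of(value):
    return bisect.bisect_right(SPLITS, value)

# Outsourced store: a partition id kept with each (opaque) encrypted record.
records = [(partition_of(a), f"ENC({a})") for a in [15, 25, 33, 47, 62, 70]]

def segment_for_equality(value):
    # The TTP rewrites "WHERE attr = value" into a fetch of one partition;
    # only that segment travels back for decryption and final filtering.
    pid = partition_of(value)
    return [ct for p, ct in records if p == pid]
```

The client then decrypts and filters just the returned segment, which is the "minimal workload" the abstract refers to.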
    Cloud computing is an attractive environment for both organizations and individual users, as it provides scalable computing and storage services at an affordable price. However, privacy and confidentiality are two challenges that trouble most users. Data encryption, using a powerful encryption algorithm such as the Advanced Encryption Standard (AES), is one solution that can allay users' concerns, but it raises new challenges around searching over encrypted data. Researchers have proposed many different schemes to execute Structured Query Language (SQL) queries over encrypted data by encrypting the data with more than one encryption algorithm, while other researchers have proposed systems based on the fragmentation of encrypted data. In this paper, we propose the bit vector-based model (BVM), a secure database system that works as an intermediary between users and the cloud provider. In BVM, before the encryption and outsourcing processes, the query manager (QM) takes each record...
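A bit vector-based filter of the kind BVM describes can be sketched in a few lines. The bucket boundaries and values here are hypothetical, and the `ENC(...)` strings stand in for records encrypted before outsourcing:

```python
# Hypothetical bucket boundaries for one numeric attribute; a record's bit
# vector sets exactly the bit of the bucket its plaintext value falls in.
BUCKETS = [(0, 100), (100, 500), (500, 1000)]

def bit_vector(value):
    bv = 0
    for i, (lo, hi) in enumerate(BUCKETS):
        if lo <= value < hi:
            bv |= 1 << i
    return bv

def query_mask(lo, hi):
    # Set the bit of every bucket that overlaps the query range [lo, hi).
    mask = 0
    for i, (blo, bhi) in enumerate(BUCKETS):
        if lo < bhi and blo < hi:
            mask |= 1 << i
    return mask

# Each outsourced record carries its cleartext bit vector plus the ciphertext.
store = [(bit_vector(v), f"ENC({v})") for v in [50, 120, 700, 480]]

def candidates(lo, hi):
    # A bitwise AND filters out records that cannot possibly match,
    # so only the surviving candidates need to be fetched and decrypted.
    m = query_mask(lo, hi)
    return [ct for bv, ct in store if bv & m]
```

The cheap bitwise test runs entirely on encrypted-record metadata, which is what lets an intermediary prune the result set without decrypting anything.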
    Cloud computing has brought many features and advantages to organizations and computer users, and Cloud providers have distributed many applications and services in an economical way. Even though companies and clients have started using Cloud computing, they are still concerned about their data's security because the data are stored and controlled by the Cloud providers [9]. In this paper, a technique is explored to improve query processing performance while protecting database tables on a Cloud by keeping them encrypted. In addition, four techniques have been designed to index and partition the data, and the indexed data are stored together with the encrypted table on the Cloud or server. The indexes and partitions are used to select only the required part of the data from the Cloud or outsourced store, which increases performance when data are requested from the encrypted table. To compare the efficiency of our proposed methods, results are presented in the form of graphs. In addition, the paper explains how to improve performance by combining two of the methods to partition the data.
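The combination idea mentioned at the end can be sketched by layering two partitioning methods, for example a range partition on one attribute with a hash partition on another. The attributes, split points, and `ENC(...)` placeholders below are assumptions made for illustration, not the paper's four specific techniques:

```python
import hashlib

RANGE_SPLITS = [1000, 5000]   # hypothetical range partition on salary
HASH_BUCKETS = 4              # hypothetical hash partition on department

def range_part(salary):
    return sum(salary >= s for s in RANGE_SPLITS)

def hash_part(dept):
    return int(hashlib.md5(dept.encode()).hexdigest(), 16) % HASH_BUCKETS

rows = [(900, "hr"), (1200, "it"), (4800, "it"), (7000, "hr")]
store = [((range_part(s), hash_part(d)), f"ENC({s},{d})") for s, d in rows]

def fetch(salary_lo, salary_hi, dept):
    # Combining both indexes narrows the fetched set more than either alone:
    # the range index prunes on salary, the hash index prunes on department.
    endpoints = {range_part(salary_lo), range_part(salary_hi)}
    wanted_ranges = set(range(min(endpoints), max(endpoints) + 1))
    hp = hash_part(dept)
    return [ct for (rp, h), ct in store if rp in wanted_ranges and h == hp]
```

Each index alone would return extra records here; together they restrict the transfer to exactly the rows that can satisfy both conditions.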
    Over the past several decades, the number of people using online social networks for public relations or sharing documents has increased significantly. This worldwide use of social networks has made them an excellent candidate for using the cloud. As a result, numerous access control methods have been proposed to maximize available resources. In addition, to protect these resources, access control policies have been devised. This research provides a model to protect shared data from unauthorized access. This model enforces finer-grained access control in a social network, and enables users to define policies to govern their resources and to input inquiries in order to access a resource in the most effective way. Policies and inquiries are converted to an easy-to-use format, which enhances the time and efficiency of evaluating inputs. In this paper, we propose a well-formed notation for specifying a cloud-based social network and the associated access control policies. We show that our notation enables users to define policies in a simpler and more efficient way and that it helps users to protect their assets under tight security. The main contribution of this work is that resource owners are able to protect their resources from unauthorized access in an easy and efficient way, and requestors are able to feed their access requests in a simple, clear, and effortless way as well.
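The "easy-to-use format" step can be sketched as normalizing policies into flat triples so that each inquiry becomes a constant-time lookup. The roles, actions, and resources below are hypothetical examples, and the paper's actual notation is richer than this:

```python
# Hypothetical owner-defined policies for a cloud-based social network.
raw_policies = [
    {"roles": ["friend", "family"], "actions": ["read"], "resource": "photos"},
    {"roles": ["family"], "actions": ["read", "comment"], "resource": "posts"},
]

def normalize(policies):
    # Expand every policy into (role, action, resource) triples once,
    # so that evaluating an inquiry is a single set-membership test.
    allowed = set()
    for p in policies:
        for role in p["roles"]:
            for action in p["actions"]:
                allowed.add((role, action, p["resource"]))
    return allowed

ALLOWED = normalize(raw_policies)

def check(role, action, resource):
    return (role, action, resource) in ALLOWED
```

Paying the expansion cost once at policy-definition time is what makes per-request evaluation cheap, which matches the efficiency claim in the abstract.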
    Much attention is being directed toward the development of secure database systems. Such systems are critical for both military and sensitive commercial applications. The majority of research in security and multilevel secure database management systems (MLS/DBMS) is focused on relational systems. However, with the emergence of new and complex applications in the 1990s, research in object oriented security is gaining more prominence [Thur90], [Keef89]. In this paper, we describe a secure algorithm for transaction management in Multilevel Secure Object-Oriented
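The abstract does not give the algorithm itself, but the multilevel-secure setting it builds on is conventionally grounded in Bell-LaPadula style access checks, which a short sketch can recall. The level names are illustrative only, and a real MLS/DBMS scheduler must also prevent covert timing channels, which this check alone does not address:

```python
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def may_read(subject_level, object_level):
    # Simple security property: no read up.
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(subject_level, object_level):
    # *-property: no write down.
    return LEVELS[subject_level] <= LEVELS[object_level]
```

Transaction managers in this setting must schedule operations so that these rules hold while high-level transactions never delay low-level ones in an observable way.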
    ... Note that it would have been more disadvantageous to send a copy of the entire relation of which only a few attributes and a few records will be required at the other site. The DVs are used first to our advantage to determine the records that will be participating in the result. ...

    And 108 more