Securing data is a challenging issue at present. Data security refers to the protective digital privacy measures applied to prevent unauthorized access to important information. Cryptography protects users by providing functionality for the encryption of data and the authentication of other users. Data compression is the art of reducing the number of bits needed to store or transmit data.
Data de-duplication is an emerging technology that offers the benefit of storing a single instance, or single copy, of duplicated data on a storage disk. As the number of users in a network increases, there will be explosive growth in data, so a mechanism is needed that deals with this situation efficiently. To protect the confidentiality of messages in the network while sustaining data de-duplication, the convergent encryption algorithm is applied to data before it is uploaded into the network. However, several issues remain to be addressed, some of which are tackled in our proposal. In this paper, to further enhance security, we derive the private (secret) key from the user's password, which resolves one such issue, namely data origin authentication. In this construction, the integrity of the information is ensured by deriving the convergent key (secret key + symmetric key, where the symmetric key is derived from the data chunks). We also address the issue of keyword search over encrypted data. Our analysis reveals that our proposal is secure with respect to the definitions specified in the proposed secure data de-duplication model.
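The core idea of convergent encryption is that the encryption key is derived deterministically from the data itself, so identical plaintext chunks always produce identical ciphertexts and can be de-duplicated by the storage server without it seeing the plaintext. A minimal sketch in Python, using only the standard library (SHA-256 as the key-derivation hash, and a SHA-256 counter-mode keystream standing in for the symmetric cipher); this illustrates the general technique, not the paper's exact password-keyed construction:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from key via SHA-256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(chunk: bytes) -> tuple[bytes, bytes]:
    """Key = H(chunk): identical chunks yield identical ciphertexts."""
    key = hashlib.sha256(chunk).digest()
    cipher = bytes(a ^ b for a, b in zip(chunk, keystream(key, len(chunk))))
    return key, cipher

def convergent_decrypt(key: bytes, cipher: bytes) -> bytes:
    """XOR with the same keystream inverts the encryption."""
    return bytes(a ^ b for a, b in zip(cipher, keystream(key, len(cipher))))

# Identical chunks from different users produce identical ciphertexts,
# so the storage server can de-duplicate them without seeing the plaintext.
k1, c1 = convergent_encrypt(b"shared document chunk")
k2, c2 = convergent_encrypt(b"shared document chunk")
assert c1 == c2  # de-duplicable
assert convergent_decrypt(k1, c1) == b"shared document chunk"
```

Because the key depends only on the chunk's content, any party holding the same chunk derives the same key and ciphertext, which is exactly what makes de-duplication over encrypted data possible.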
Cloud computing provides many benefits to users, such as accessibility and availability. Because the data is available over the cloud, it can be accessed by different users, and it may include an organization's sensitive data. One issue is therefore to provide access to authenticated users only. However, the data can still be accessed by the owner of the cloud, so to prevent the data from being accessed by the cloud owner, we use an intrusion detection system to secure the data. The other issue is to save a backup of the data in another cloud in encrypted form so that load balancing can be performed. This helps the user with data availability in case one cloud fails.
Cloud computing offers useful services such as virtualization, which provides virtually unlimited computational resources, and it delivers a robust design at low cost. Different security constraints on outsourcing are satisfied through the implementation of new encryption standards, and together these services offer a reliable solution for secure transmission. In previous cloud environments, the computational resources available when accessing outsourced resources were limited, even though resource utilization is billed on a pay-per-use basis. Earlier servers had limited processing power, storage, and memory, so it was not possible to encrypt the complete content: some content was transmitted as plaintext and the remainder as ciphertext. When transmission starts in this mixed format, attackers can enter automatically and data-leakage problems arise; such a network is therefore insecure, may deliver incorrect data at the destination, and leaves users unsatisfied with these services. Security is the primary obstacle that prevents the wide adoption of this promising computing model, especially for customers whose confidential data are consumed and produced during the computation. The above limitations are overcome using linear programming in cloud computing; such techniques provide a secure network and an optimized solution. When a user is ready to transfer a large file to another user, the large file is treated as a large problem. Using linear programming, the large file is divided into sub-parts by decomposition, and transformation techniques allocate the decomposed parts to different servers. These servers provide a suitable infrastructure for encryption, with sufficient computational resources. Before outsourcing begins, the entire content is encrypted on the client side; after the content is delivered, a proof is verified. If it matches the server's proof, decryption is performed and the data is delivered correctly.
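The decompose-outsource-verify flow described above can be sketched in Python. This is an illustrative sketch under simplifying assumptions (fixed-size chunking as the decomposition, and a SHA-256 digest over the chunk hashes standing in for the "proof"); it is not the paper's linear-programming transformation itself:

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks for illustration; real systems use KB/MB chunks

def decompose(data: bytes, size: int = CHUNK_SIZE) -> list[bytes]:
    """Split the large input (the 'large problem') into sub-parts."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def proof(chunks: list[bytes]) -> str:
    """Client-side proof: a hash over all chunk hashes, computed before outsourcing."""
    h = hashlib.sha256()
    for c in chunks:
        h.update(hashlib.sha256(c).digest())
    return h.hexdigest()

data = b"a large file to outsource"
chunks = decompose(data)
client_proof = proof(chunks)

# ... chunks would be encrypted client-side and distributed to different servers ...
# On delivery, the proof is recomputed and compared before decryption proceeds.
server_proof = proof(decompose(data))
assert server_proof == client_proof  # proofs match: safe to decrypt
assert b"".join(chunks) == data      # decomposition is lossless
```

Only when the recomputed proof matches the client's proof does decryption proceed, which is what guarantees that correct data is delivered at the destination.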
This presentation, provided by Mr. Edwards for Capitol College's IAE 684 Complementary Security class, offers unique insights into methods for data encryption.
Cryptographic applications are becoming increasingly important in today's world of data exchange, where large volumes of data need to be transferred safely from one location to another at high speed. In this paper, a parallel implementation of the Blowfish cryptographic algorithm is evaluated and compared in terms of running time, speedup, and parallel efficiency. The parallel implementation of Blowfish is built using the Message Passing Interface (MPI) library, and the results were obtained on the IMAN1 supercomputer. The experimental results show that the runtime of the Blowfish algorithm decreases as the number of processors increases. Moreover, with 2, 4, and 8 processors, parallel efficiency reaches up to 99%, 98%, and 66%, respectively.
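Speedup and parallel efficiency are the standard metrics used here: speedup S(p) = T(1) / T(p), and efficiency E(p) = S(p) / p. A small Python sketch of the computation (the runtimes below are hypothetical placeholders chosen to be consistent with the reported efficiencies, not the paper's measured values):

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """S(p) = T(1) / T(p): how many times faster the parallel run is."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, p: int) -> float:
    """E(p) = S(p) / p: fraction of ideal linear speedup achieved."""
    return speedup(t_serial, t_parallel) / p

# Hypothetical runtimes (seconds) for encrypting the same workload.
t1 = 100.0
for p, tp in [(2, 50.5), (4, 25.5), (8, 19.0)]:
    print(f"p={p}: speedup={speedup(t1, tp):.2f}, "
          f"efficiency={efficiency(t1, tp, p):.0%}")
```

Efficiency below 100% reflects communication and synchronization overhead, which is why it drops as more processors are added even though the absolute runtime keeps falling.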