We are honored to welcome you to the 12th IEEE/ACM International Conference on Utility and Cloud Computing (UCC 2019) on behalf of the technical program committee. As a well-established and selective conference, UCC attracts many high-quality submissions. This year, we received a total of 96 full-paper submissions, of which 84 advanced to the review phase after prescreening. We are sincerely thankful to the TPC members and external reviewers, who worked hard under a very tight schedule to produce a total of 316 review reports that culminated in the selection of 28 full papers, resulting in a 29.2% acceptance rate for the main track of the UCC conference. The commitment, technical skill, and professionalism of the 82 TPC members and 51 sub-reviewers, spanning 24 countries, are crucial to the high-quality program we are accustomed to seeing at UCC. Once again, we sincerely thank all our TPC members for their hard work and for the feedback they provided to the authors.
We would like to give special thanks to all authors of submitted papers, who come from 25 different countries spanning 6 continents, reflecting the great community brought together by research on cloud and utility computing. We received 39% of submissions from Europe, 21% from Asia, 15% from North America, 14% from Australia, 9% from South America, and 2% from Africa. The resulting high-quality program covers a wide range of topics: from infrastructure and networking to software design, through resource management and scheduling, to important concerns such as security, as well as emerging hot topics such as edge and fog computing and the Internet of Things. The quality of the program you will experience over the next 4 days of the conference mirrors the dedication and enthusiasm of these researchers in preparing their contributions for submission to UCC. We deeply acknowledge their efforts!
Proceeding Downloads
An Evaluation of FaaS Platforms as a Foundation for Serverless Big Data Processing
Function-as-a-Service (FaaS) offers a new alternative to operate cloud-based applications. FaaS platforms enable developers to define their application only through a set of service functions, relieving them of infrastructure management tasks, which ...
ATLAS: A Distributed File System for Spatiotemporal Data
A majority of the data generated in several domains is geotagged. These data also have a chronological component associated with them. Pervasive data generation and collection efforts have led to an increase in data volumes. These data hold the ...
Exploring the Cost-benefit of AWS EC2 GPU Instances for Deep Learning Applications
Deep Learning is a subfield of machine learning methods based on artificial neural networks. Thanks to increased data availability and computational power, such as Graphics Processing Units (GPUs), training deep networks - a time-consuming process - ...
Aperture: Fast Visualizations Over Spatiotemporal Datasets
One of the most powerful ways to explore data is to visualize it. Visualizations underpin data wrangling, feature space explorations, and understanding the dynamics of phenomena. Here, we explore interactive visualizations of voluminous, spatiotemporal ...
Fog Horizons -- A Theoretical Concept to Enable Dynamic Fog Architectures
With the Internet of Things, more and more devices, and therefore data, need to be handled by current architectures. The flood of provided information leads to bottlenecks within regular client-server architectures. In order to address this challenge ...
FogDocker: Start Container Now, Fetch Image Later
Slow software deployment is an important issue in environments such as fog computing where this operation lies in the critical path of providing online services to the end users. The problem is even worse when the virtualized resources are made of ...
Edge Affinity-based Management of Applications in Fog Computing Environments
Fog computing overcomes the limitations of executing Internet of Things (IoT) applications in remote Cloud datacentres by extending the computation facilities closer to data sources. Since most of the Fog nodes are resource constrained, accommodation of ...
Microservices-based IoT Application Placement within Heterogeneous and Resource Constrained Fog Computing Environments
The Fog computing paradigm has created innovation opportunities within the Internet of Things (IoT) domain by extending cloud services to the edge of the network. Due to the distributed, heterogeneous and resource constrained nature of Fog computing nodes, ...
Tensor-Based Resource Utilization Characterization in a Large-Scale Cloud Infrastructure
The introduction of virtualization and cloud computing has enabled a large number of containers/virtual machines to share computing resources. Nevertheless, the number and size of data centres are still on the rise, partly on account of an ever ...
Modelling and Prediction of Resource Utilization of Hadoop Clusters: A Machine Learning Approach
Hadoop is a distributed computing framework that has a large number of configurable parameters. These parameters have impact on system resources and execution time. Optimizing the performance of a Hadoop cluster by tuning such a large number of ...
Privacy-by-Design Distributed Offloading for Vehicular Edge Computing
Vehicular Edge Computing (VEC) is a distributed computing paradigm that utilizes smart vehicles (SVs) as computational cloudlets (edge nodes) by virtue of their inherent attributes such as mobility, low operating costs, flexible deployment, and wireless ...
SecHadoop: A Privacy Preserving Hadoop
With the generation of vast amounts of data, there has been a tremendous need to process it economically. The MapReduce paradigm provides economical and effective processing of huge datasets. Hadoop is a framework for managing ...
A General Framework for Privacy-preserving Computation on Cloud Environments
While privacy and security concerns dominate public cloud services, Homomorphic Encryption (HE) is seen as an emerging solution that can potentially assure secure processing of sensitive data by third-party cloud vendors. It relies on the fact that ...
Developing GDPR Compliant User Data Policies for Internet of Things
With recent adoption of Internet of Things (IoT) technologies and their use in industry, user data privacy concerns remain a major preoccupation of regulation bodies. The European General Data Protection Regulation (GDPR) enables users to control their ...
A Proactive, Cost-aware, Optimized Data Replication Strategy in Geo-distributed Cloud Datastores
Geo-replicated cloud datastores adopt the replication methodology by placing multiple data replicas at suitable storage zones. This can provide reliable services to customers with high availability, low access latency, low system cost, and decreased ...
Selecting Efficient Cloud Resources for HPC Workloads
Constant advances in CPU, storage, and network virtualization are enabling high-performance computing (HPC) applications to be efficiently executed on cloud computing systems. In this computing model, users pay only for what they use, with no need to ...
EFPO: Energy Efficient and Failure Predictive Edge Offloading
Many researchers focus on offloading issues and challenges to improve energy efficiency and reduce application response time by employing multi-objective offloading frameworks but without considering offloading failures. Edge Computing, due to ...
Energy and Profit-Aware Proof-of-Stake Offloading in Blockchain-based VANETs
In Vehicular Ad-hoc NETworks (VANETs) users do not necessarily trust each other, and in some cases they may introduce dubious information into the network. Centralized approaches to improve the credibility of information do not easily scale, require trusting ...
LESS: A Matrix Split and Balance Algorithm for Parallel Circuit (Optical) or Hybrid Data Center Switching and More
The research problem of how to use a high-speed circuit switch, typically an optical switch, to most effectively boost the switching capacity of a datacenter network, has been extensively studied. In this work, we focus on a different but related ...
J-OPT: A Joint Host and Network Optimization Algorithm for Energy-Efficient Workflow Scheduling in Cloud Data Centers
Workflows are a popular application model used for representing scientific as well as commercial applications, and cloud data centers are increasingly used in the execution of workflow applications. Existing approaches to energy-efficient workflow ...
The Overhead of Confidentiality and Client-side Encryption in Cloud Storage Systems
Client-side encryption (CSE) is important to ensure that only the intended users have access to information stored in public cloud services. However, CSE complicates file synchronization methods such as deduplication and delta encoding, important to ...
Container-based Sandboxes for Malware Analysis: A Compromise Worth Considering
- Ayrat Khalimov,
- Sofiane Benahmed,
- Rasheed Hussain,
- S.M. Ahsan Kazmi,
- Alma Oracevic,
- Fatima Hussain,
- Farhan Ahmad,
- Chaker Abdelaziz Kerrache
Malware analysis relies on monitoring the behavior of a suspected application within a confined, controlled and secure environment. These environments are commonly referred to as "Sandboxes" and are often virtualized replicas of a regular system. ...
A Systematic Mapping Study on Engineering Function-as-a-Service Platforms and Tools
Function-as-a-Service (FaaS) is a novel cloud service model that allows the development of fine-grained, provider-managed cloud applications. In this work, we investigate which challenges motivate researchers to introduce or enhance FaaS platforms and tools. We ...
SLO-ML: A Language for Service Level Objective Modelling in Multi-cloud Applications
Cloud modelling languages (CMLs) are designed to assist customers in tackling the diversity of services in the current cloud market. While many CMLs have been proposed in the literature, they lack practical support for automating the selection of ...
IoTNetSim: A Modelling and Simulation Platform for End-to-End IoT Services and Networking
Internet-of-Things (IoT) systems are becoming increasingly complex, heterogeneous and pervasive, integrating a variety of physical devices and virtual services that are spread across architecture layers (cloud, fog, edge) using different connection ...
High Performance Dynamic Graph Model for Consistent Data Integration
In a distributed environment, data from heterogeneous sources are brought together in a unified and consistent manner for analytics and insights. Inconsistencies arising from the dynamic nature of sources, such as the addition/deletion of columns or the merging ...
Facing the Unplanned Migration of Serverless Applications: A Study on Portability Problems, Solutions, and Dead Ends
Serverless computing focuses on developing cloud applications that comprise components fully managed by providers. The Function-as-a-Service (FaaS) service model is often associated with the term serverless, as it allows developing entire applications by ...
Performance Study of Mixed Reality for Edge Computing
Edge computing is a recent paradigm where computing resources are placed close to the user, at the edge of the network. This is a promising enabler for applications that are too resource-intensive to be run on an end device, but at the same time require ...