Human decision making is central in many functions across a broad spectrum of fields such as marketing, investment, smart contract formulation, political campaigns, and organizational strategic management. Behavioral economics seeks to study the psychological, cultural, and social factors contributing to decision making alongside reasoning. It should be highlighted here that behavioral economics does not negate classical economic theory but rather extends it in two distinct directions. First, a finer granularity can be obtained by studying the decision making process not of massive populations but of individuals and groups, using signal estimation or deep learning techniques based on a wide array of attributes ranging from social media posts to physiological signals. Second, time becomes a critical parameter, and changes in the disposition towards alternative decisions can be tracked with input-output or state space models. The primary findings so far are concepts like bounded rationality and perceived risk, while results include optimal strategies for various levels of information awareness and action strategies based on perceived loss aversion principles. From the above it follows that behavioral economics relies on deep learning, signal processing, control theory, social media analysis, affective computing, natural language processing, and gamification, to name only a few fields. Therefore, it is directly tied to computer science in many ways. THECOG will be a central meeting point for researchers of various backgrounds in order to generate new interdisciplinary and groundbreaking results.
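The second direction above, tracking disposition over time with a state space model, can be illustrated with a minimal sketch. Everything below (the matrices, the stimulus sequence, the three alternatives) is invented for demonstration and is not a model from the workshop itself.

```python
import numpy as np

# Hypothetical illustration: tracking one individual's disposition towards
# three alternative decisions with a linear state space update
# x[t+1] = A @ x[t] + B @ u[t], where u[t] are observed stimuli.
rng = np.random.default_rng(0)

A = 0.9 * np.eye(3)            # mild persistence of current dispositions
B = 0.1 * np.eye(3)            # influence of external stimuli (e.g. posts)
x = np.zeros(3)                # latent disposition state

for _ in range(50):
    u = rng.normal(size=3)     # stimuli observed at this time step
    x = A @ x + B @ u          # state update

# Map the latent state to a probability distribution over alternatives.
p = np.exp(x) / np.exp(x).sum()
print(p)                       # probabilities sum to one
```

The softmax readout at the end is one arbitrary choice for turning a latent state into a disposition; any monotone normalization would serve the illustration equally well.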
In this paper we introduce a novel methodology to achieve information diffusion within a social graph that activates a realistic number of users. Our approach combines the predicted diffusion patterns of each node with propagation heuristics in order to achieve an effective cover of the graph. The novelty of our methodology lies in the use of historical information to predict users' diffusion patterns and in our proposed PBD heuristics for achieving a realistic information spread. Moreover, we use a methodology for calculating the actual diffusion of a message in a social media graph. To validate our approach we present a set of experimental results. Our methodology is useful to marketers who are interested in using social influence to run effective marketing campaigns.
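The PBD heuristics themselves are not reproduced here; as a point of reference, the kind of spread such a methodology measures can be sketched with a generic independent-cascade simulation over a small hand-made graph. Graph, seed, and probability are all invented for illustration.

```python
import random

# Generic independent-cascade sketch: each newly activated user gets one
# independent chance (probability p) to activate each of its neighbours.
graph = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5], 5: []}

def simulate_diffusion(graph, seeds, p=0.5, rng=None):
    """Return the set of activated users starting from `seeds`."""
    rng = rng or random.Random(42)
    active, frontier = set(seeds), list(seeds)
    while frontier:
        node = frontier.pop()
        for neigh in graph[node]:
            # one independent activation attempt per edge
            if neigh not in active and rng.random() < p:
                active.add(neigh)
                frontier.append(neigh)
    return active

activated = simulate_diffusion(graph, seeds=[0])
print(len(activated), "users activated")
```

Counting the activated set over many runs is one simple way to estimate the "realistic number of users" an information campaign reaches.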
In recent years, machine learning has been used for data processing and analysis, providing insights to businesses and policymakers. Deep learning technology promises to further revolutionize this processing, leading to better and more accurate results. Current trends in information and communication technology are accelerating the widespread use of web services in supporting a service-oriented architecture (SOA) consisting of services, their compositions, interactions, and management. Deep learning approaches can be applied to support the development of SOA-based solutions, leveraging the vast amount of data on web services currently available. On the other hand, SOA has mechanisms that can support the development of distributed, flexible, and reusable infrastructures for the use of deep learning. This paper presents a literature survey and discusses how SOA can be enabled by, as well as facilitate, the use of deep learning approaches in different types of environments for different levels of users.
With recent advances in mobile technologies and e-commerce infrastructures, there have been increasing demands for the expansion of collaboration services within and across systems. In particular, human collaboration requirements should be considered together with those for systems and their components. Agent technologies have been deployed in order to model and implement e-commerce activities as multi-agent systems (MAS). Agents are able to provide assistance on behalf of their users or systems in collaboration services. As such, we advocate the engineering of e-collaboration support by means of MAS in the following three key dimensions: (i) across multiple platforms, (ii) across organization boundaries, and (iii) agent-based intelligent support. To achieve this, we present a MAS infrastructure to facilitate systems and human collaboration (or e-collaboration) activities based on the belief-desire-intention (BDI) agent architecture, constraint technology, and contemporary Web Services. Further, the MAS infrastructure also provides users with different options of agent support on different platforms. Motivated by the requirements of mobile professional workforces in large enterprises, we present our development and adaptation methodology for e-collaboration services with a case study of a constraint-based collaboration protocol from a three-tier implementation architecture aspect. We evaluate our approach from the perspective of the three main stakeholders of e-collaboration, namely users, management, and systems developers.
There is an increasing demand for sharing documents for process integration among organizations. Web services technology has recently been widely proposed and gradually adopted as a platform for supporting such an integration. There are no holistic solutions thus far that are able to tackle the various protection issues, specifically the security and privacy protection requirements of cross-organizational process integration. This paper proposes the exchange of documents through a Document / Image Exchange Platform (DIEP), replacing traditional ad-hoc and manual exchange practices. The authors show how the contemporary technologies of Web services under a Service-Oriented Architecture (SOA), together with watermarking, can help protect document exchanges with a layered implementation architecture. Furthermore, to facilitate governance and regulation compliance against protection policy violation attempts, the management and the affected parties are notified with alerts for warning and possible handling. The authors discuss the applicability of the proposed platform, using a physician scenario, against the security and privacy protection requirements of the Health Insurance Portability and Accountability Act (HIPAA) in the United States, which imposes national regulations to protect individuals' healthcare information. The proposed approach aims at facilitating the whole governance process from the technical to the management level with a single unified platform.
Urgent requests and critical messages in healthcare applications must be delivered and handled in a timely manner, rather than ad hoc as in most current systems. Therefore, we extend a sophisticated alert management system (AMS) to handle process and data integration in healthcare chain workflow management under urgency constraints. Alerts are associated with healthcare tasks to capture the parameters for their routing and urgency requirements, in order to match them with the specialties of healthcare personnel or the functionalities of Web Services providers. Monitoring is essential to ensure the timeliness and availability of services as well as the identification of exceptions. We outline our implementation framework with Web Services for the communications among healthcare service providers, together with mobile devices for medical professionals. We demonstrate the applicability of our approach with a prototype medical house-call system (MHCS) and evaluate our approach with medical professionals and various stakeholders.
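The routing idea, matching each alert's specialty while serving the highest urgency first, can be sketched with a priority queue. The provider names, specialties, and urgency levels below are invented and not part of the actual AMS.

```python
import heapq

# Hypothetical alert routing sketch: alerts carry an urgency level and a
# required specialty; a max-priority queue (min-heap on negated urgency)
# dispatches the most urgent alert to the matching provider first.
providers = {"cardiology": "Dr. Lee", "radiology": "Dr. Ng"}

alerts = []
for urgency, specialty, message in [
    (2, "radiology", "review scan"),
    (5, "cardiology", "chest pain, house call"),
]:
    heapq.heappush(alerts, (-urgency, specialty, message))

dispatched = []
while alerts:
    neg_urgency, specialty, message = heapq.heappop(alerts)
    dispatched.append((providers[specialty], -neg_urgency, message))

print(dispatched[0])   # the most urgent alert is routed first
```

A production AMS would of course also track acknowledgement deadlines and escalate unanswered alerts, which this sketch omits.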
Social media has provided promotion and communication channels for organizations to reach their target audience efficiently and cost-effectively. Despite wide social media analytics applications in different areas, scant studies focus on NGOs. This research analyzes the use of social media by a well-known NGO, the Hong Kong Generation Next Arts Limited (HKGNA), examining its presence and user sentiment on Twitter and Facebook to evaluate its brand image, engagement conditions, and promotion effectiveness. The findings show a low usage frequency of the Twitter channel and, despite the overall positive sentiment, some deviations from the charity's mission. The Facebook channel, in contrast, serves well for publicity, and the interaction there is relatively active. Thus, more measures are necessary to improve the digital relationship between HKGNA and its targeted audience. Moreover, this data analysis methodology serves as an example for other NGOs seeking to improve their communication.
Blockchain is a prime example of disruptive technology at multiple levels. With the advent of blockchains, the need for a mutually trusted third party acting as intermediary between agents which do not necessarily trust each other becomes obsolete in transactions of any kind, including political or shareholder voting, crowdfunding, financial deals, logistics and supply chain management, and contract formulation. An integral part of the blockchain stack is the proof system, namely the mechanism efficiently verifying the claims of the various blockchain stakeholders. Thus, trust is effectively established in a literally trustless environment with purely computational means. This is especially critical in the digital formulation of smart contracts, where clauses are to be strictly upheld by intelligent agents. The most prominent proof systems recently proposed in the scientific literature are reviewed. Additionally, the applications of blockchain technology to smart contracts are discussed. The latter allow clause re-negotiation, thus increasing the flexibility of transactions. As a concrete example, a simple smart contract written in Solidity, a high-level language for the Ethereum Virtual Machine, is presented.
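The Solidity contract itself is not reproduced here; instead, the core claim, that trust can be established with purely computational means, can be sketched in Python with a minimal hash chain: each block commits to its predecessor via a hash, so any party can verify the chain without a trusted intermediary. The block contents are invented.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 digest of a block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(prev_hash, data):
    return {"prev": prev_hash, "data": data}

def verify_chain(chain):
    """Check that every block references the hash of its predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev"] != block_hash(prev):
            return False
    return True

genesis = make_block("0" * 64, "genesis")
b1 = make_block(block_hash(genesis), "transfer A -> B")
b2 = make_block(block_hash(b1), "transfer B -> C")

assert verify_chain([genesis, b1, b2])
b1["data"] = "transfer A -> Mallory"   # tampering breaks verification
assert not verify_chain([genesis, b1, b2])
```

Real proof systems add far stronger machinery (consensus, succinct proofs), but the verification-by-recomputation pattern above is the common kernel.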
Human decision making is central in many functions across a broad spectrum of fields including marketing, investments and smart contracts, digital health, political campaigns, logistics, and strategic management, to name only a few. Computational behavioral science, the focus of the second consecutive iteration of the international workshop on transforms in behavioral and affective computing (THECOG), which was held in conjunction with CIKM 2022, not only studies the various psychological, cultural, and social factors contributing to decision making besides reasoning, but also seeks to construct robust, scalable, and efficient computational models imitating or extending decision making processes. This year the keynote speech focused on affective robotics and their expected advantages in substantially improving the quality of human life. Moreover, the accepted papers had a considerable topical variety covering, among others, smart cities, speech emotion recognition, deepfake discovery...
Startups arguably contribute to the current business landscape by developing innovative products and services. The discovery of business partners and employees with a specific, verifiable background stands out repeatedly as a prime obstacle. LinkedIn is a popular platform where professional milestones, endorsements, recommendations, and skills are posted. A graph search algorithm with a BFS and a DFS strategy for seeking trusted candidates in LinkedIn is proposed. Both strategies rely on a metric for assessing the trustworthiness of an account according to LinkedIn attributes. Also, a stochastic vertex selection mechanism reminiscent of preferential attachment guides the search. Both strategies were verified against a large segment of the vivid startup ecosystem of Patras, Hellas. A higher order probabilistic analysis suggests that BFS is more suitable. Findings also imply that emphasis should be given to local networking events, peer interaction, and tasks allowing verifiable credit for the respective work.
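The paper's actual trust metric is not reproduced here; a minimal sketch of the BFS strategy with stochastic, trust-weighted neighbour selection might look as follows. The toy graph and the per-profile trust scores are invented for illustration.

```python
import random
from collections import deque

# Hypothetical sketch: profiles are explored breadth-first, but among the
# unvisited neighbours of a node the next one is drawn with probability
# proportional to its trust score, echoing preferential attachment.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": [], "e": []}
trust = {"a": 0.9, "b": 0.4, "c": 0.8, "d": 0.6, "e": 0.7}

def trusted_bfs(graph, start, threshold=0.5, rng=None):
    """Return trusted profiles reachable from `start`, in visit order."""
    rng = rng or random.Random(7)
    visited, queue, found = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        if trust[node] >= threshold:
            found.append(node)
        fresh = [n for n in graph[node] if n not in visited]
        while fresh:
            # stochastic selection biased towards trustworthy accounts
            weights = [trust[n] for n in fresh]
            nxt = rng.choices(fresh, weights=weights)[0]
            fresh.remove(nxt)
            visited.add(nxt)
            queue.append(nxt)
    return found

print(trusted_bfs(graph, "a"))
```

Swapping the deque for a stack would yield the corresponding DFS strategy with the same selection mechanism.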
Social graphs abound with information which can be harnessed for numerous behavioral purposes, including online political campaigns, digital marketing operations such as brand loyalty assessment and opinion mining, and determining public sentiment regarding an event. In such scenarios the efficiency of the deployed methods depends critically on three factors, namely the account behavioral model, the social graph topology, and the nature of the information collected. A prime example is Twitter, which is especially known for its lively activity and intense conversations. Here an extensible computational methodology is proposed based on a graph neural network operating on an edge fuzzy graph constructed by a combination of structural, functional, and emotional Twitter attributes. These graphs constitute a strong algorithmic cornerstone for engineering cases where a properly formulated potential or uncertainty functional is linked to each edge. Starting from the ground truth in each i...
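How the three attribute families are actually fused is not specified in this excerpt; one common way to build an edge fuzzy membership from several [0, 1] attributes is a t-norm, sketched below with the product t-norm. The attribute values and the fusion rule are assumptions for illustration only.

```python
# Hypothetical sketch of edge fuzzy graph construction: structural,
# functional, and emotional attributes of a Twitter edge, each already
# scaled to [0, 1], are combined into one fuzzy edge membership.
def fuzzy_edge(structural, functional, emotional):
    """Combine three [0, 1] attributes with the product t-norm."""
    for v in (structural, functional, emotional):
        assert 0.0 <= v <= 1.0
    return structural * functional * emotional

print(round(fuzzy_edge(0.8, 0.5, 0.9), 3))   # -> 0.36
```

The minimum t-norm (`min(...)`) is an equally standard alternative that penalizes a single weak attribute less severely.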
At the dawn of the Internet era, graph analytics play an important role in high- and low-level network policymaking across a wide array of fields as diverse as transportation network design, supply chain engineering and logistics, social media analysis, and computer communication networks, to name just a few. This can be attributed not only to the size of the original graph but also to the nature of the problem parameters. For instance, algorithmic solutions depend heavily on the choice of approximation criterion. Moreover, iterative or heuristic solutions are often sought, as this is a high dimensional problem given the large number of vertices and edges involved as well as their complex interaction. Replacing, under constraints, a directed graph with an undirected one having the same vertex set is often sought in applications such as data visualization, community structure discovery, and connection-based vertex centrality metrics. Polar decomposition is a key matrix factorization which represents a matrix as the product of a symmetric positive (semi)definite factor and an orthogonal one. The former can serve as an undirected approximation of the original adjacency matrix. The proposed graph approximation has been tested with three Twitter graphs, with encouraging results with respect to density, Fiedler number, and certain vertex centrality metrics based on matrix power series. The dataset was hosted in an online MongoDB instance.
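The factorization step can be sketched directly with SciPy's polar decomposition. The 4-vertex directed graph below is invented for illustration; the symmetric positive semidefinite factor H is the candidate undirected surrogate described above.

```python
import numpy as np
from scipy.linalg import polar

# Adjacency matrix of a small, invented directed graph.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 0]], dtype=float)

# Polar decomposition A = U @ H with U orthogonal and H symmetric PSD.
U, H = polar(A)

assert np.allclose(A, U @ H)
assert np.allclose(H, H.T)                       # symmetric: undirected weights
assert np.all(np.linalg.eigvalsh(H) >= -1e-10)   # positive semidefinite
print(np.round(H, 3))
```

Note that H is generally weighted and dense even when A is a sparse 0/1 matrix, so a thresholding step would typically follow before treating H as an undirected adjacency matrix.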
BACKGROUND: Ghrelin is a 28-amino acid peptide that is predominantly produced by the stomach. Strong evidence indicates the effects of ghrelin in the regulation of metabolic functions and its potential role in the aetiology of obesity. AIM: The aim of this study was to investigate the relationship of ghrelin levels with obesity, insulin resistance, and glucose in normal and obese subjects. METHODS: Thirteen (n = 13) normal-weight and seven (n = 7) obese subjects aged 20-22 participated in the study. Fasting plasma ghrelin, insulin, and glucose levels were measured after overnight fasting. HOMA-IR was calculated to evaluate insulin resistance. RESULTS: Ghrelin and insulin levels were found to be statistically significantly lower and higher, respectively, in obese subjects (P < 0.001). Glucose levels were clinically higher in obese subjects but not statistically significant. Fasting plasma ghrelin was negatively correlated with BMI (r = -0.77, P < 0.001), fasting insulin levels (r = -0.5...
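The HOMA-IR index mentioned in the methods is a standard formula: fasting insulin (µU/mL) times fasting glucose (mmol/L), divided by 22.5. The sketch below encodes it with illustrative input values that are not taken from the study's dataset.

```python
def homa_ir(fasting_insulin_uU_mL, fasting_glucose_mmol_L):
    """HOMA-IR insulin resistance index:
    insulin (µU/mL) * glucose (mmol/L) / 22.5."""
    return fasting_insulin_uU_mL * fasting_glucose_mmol_L / 22.5

# Illustrative values only (not from the study):
print(round(homa_ir(10.0, 4.5), 2))   # -> 2.0
```

If glucose is measured in mg/dL rather than mmol/L, the conventional divisor is 405 instead of 22.5.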
Trust is a fundamental sociotechnological mainstay of the Web today. There is substantial evidence for this, since netizens implicitly or explicitly agree to trust virtually every Web service they use, ranging from Web-based mail to e-commerce portals. Moreover, the methodological framework for trusting individual netizens, primarily their identity and communications, has considerably progressed. Nevertheless, the core of fact checking for human-generated content is still far from being substantially automated, as most proposed smart algorithms inadequately capture fundamental human traits. One such case is the evaluation of the profile trustworthiness of LinkedIn members based on attributes publicly available from the platform itself. A trusted profile may indirectly indicate a more suitable candidate, since its contents can be easily verified. In this article a first order graph search mechanism for discovering LinkedIn trusted profiles based on a random walker is extended ...
In several applications like healthcare, time in workflow execution is critical. Several control and data dependencies arise that must be specified, validated as conflict free, and maintained during workflow execution. The author models these kinds of dependencies as constraints that impose temporal restrictions on the relative order of execution of the activities, thereby introducing a finer granularity of activity execution with respect to time. A subset of interval algebra is incorporated into the workflow specification model, and the T-WfMc specification model is proposed. The consistency issues that arise are examined, and different correctness criteria are proposed.
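The "validated as conflict free" step can be illustrated for the simplest fragment of interval algebra, plain precedence ("A before B"): such a constraint set is consistent exactly when its precedence graph is acyclic, which a depth-first search can check. The activity names and constraints below are invented; full interval algebra involves thirteen relations and richer reasoning than this sketch.

```python
from collections import defaultdict

constraints = [("admit", "triage"), ("triage", "treat"), ("treat", "discharge")]

def consistent(constraints):
    """Return True iff the 'before' constraints contain no cycle."""
    graph = defaultdict(list)
    for before, after in constraints:
        graph[before].append(after)
    WHITE, GREY, BLACK = 0, 1, 2
    colour = defaultdict(int)

    def dfs(node):
        colour[node] = GREY
        for nxt in graph[node]:
            if colour[nxt] == GREY:          # back edge: cycle found
                return False
            if colour[nxt] == WHITE and not dfs(nxt):
                return False
        colour[node] = BLACK
        return True

    return all(colour[n] != WHITE or dfs(n) for n in list(graph))

assert consistent(constraints)
assert not consistent(constraints + [("discharge", "admit")])
```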
This article describes IoT enhancements in a smart house so as to provide a calm computing environment. Experimental studies using a portable wireless EEG headset indicate that the inhabitants' calmness can increase with simple, yet effective reinforcements of the existing in-house infrastructure. A ground robot assists by providing services and spherical video feedback to the humans while remaining out of visual contact with them. Wireless sensors measuring the sound level, luminosity, humidity, CO, temperature, and pressure contribute to the calmness. At the same time, the persons' heart rate and their interaction with their smartphones are monitored, resulting in appropriate actions. A Google-hub enabled device issues actions to the robot, a loudspeaker, and an air-cooling device, and powers up smart power outlets.
Nowadays, millions of people are active on social media throughout the day, while hundreds of new accounts are created daily. Thousands of short-length posts, or tweets, are posted on Twitter, a popular micro-blogging platform, by a vast variety of authors, creating widely diverse social content. This diversity not only indicates a remarkable strength, but also reveals a certain difficulty when attempting to find Twitter's authoritative and influential authors. This work introduces a two-step algorithmic approach for discovering these authors. First, a set of metrics and features is extracted from the social network (e.g., friends and followers) and from the content of the tweets written by each author. Then, Twitter's most authoritative authors are discovered by employing two distinct approaches, one relying on probabilistic clustering and the other on fuzzy clustering. In particular, the former initially employs the Gaussian mixture model to identify the most authoritative authors and then introduces a novel ranking technique that relies on computing the cumulative Gaussian distribution of the extracted metrics and features. The latter combines the Gaussian mixture model with fuzzy c-means, and the derived authors are subsequently ranked via the Borda count technique. The results indicate that the second scheme was able to find more authoritative authors in the benchmark dataset. Both approaches were designed, implemented, and executed on a local cluster of the Apache Storm framework, a cloud-based platform which supports streaming data and real-time scenarios.
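The Borda count step used in the second scheme is a classical rank aggregation rule: each ranking awards n-1 points to its top candidate, n-2 to the next, and so on, and candidates are ordered by total points. The two toy rankings below are invented stand-ins for the GMM and fuzzy c-means outputs.

```python
def borda(rankings):
    """Aggregate several rankings of the same candidates by Borda count."""
    n = len(rankings[0])
    points = {}
    for ranking in rankings:
        for position, candidate in enumerate(ranking):
            points[candidate] = points.get(candidate, 0) + (n - 1 - position)
    # highest total points first
    return sorted(points, key=points.get, reverse=True)

gmm_rank = ["alice", "bob", "carol"]   # hypothetical GMM-based ranking
fcm_rank = ["bob", "alice", "carol"]   # hypothetical fuzzy c-means ranking
print(borda([gmm_rank, fcm_rank]))     # alice and bob tie ahead of carol
```

Ties (as between "alice" and "bob" here) are broken arbitrarily; a real pipeline would add a deterministic tie-breaking rule.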
While there are several metrics to measure business process performance, recently there is an additional requirement from businesses to evaluate business processes based on their impact on users. In this work, we evaluate business process performance using social media analytics. We view a marketing campaign as a business process and we evaluate its performance based on its impact on Twitter. We propose a new way to calculate the "follow" relationship in Twitter based on the users' reactions to the marketing campaign process activities, and we use time series and sentiment analysis for defining and measuring performance. We rebuild the Twitter graph based on users' reactions to the marketing activities in time, and we use community detection algorithms to identify the size of the "follow" community, thus defining metrics to calculate the impact of the marketing/campaign process. We evaluate our approach using a dataset for a given politician. We reconstruct the campaign process as a set of activities on specific topics (promotions) in time using LDA. Our results show that social media analytics can be used as a valid metric for assessing business process performance.
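The graph-rebuilding idea can be sketched in a deliberately simplified form: connect two users whenever both reacted to the same campaign activity, then take the size of the largest connected component as the impact score. The reaction log, the edge rule, and the score are all assumptions for illustration, not the paper's exact community detection pipeline.

```python
from collections import defaultdict, deque

reactions = {
    "activity1": ["u1", "u2", "u3"],
    "activity2": ["u3", "u4"],
    "activity3": ["u5"],
}

def largest_community(reactions):
    """Size of the largest co-reaction component."""
    graph = defaultdict(set)
    for users in reactions.values():
        for a in users:
            graph[a]                      # ensure every reacting user appears
            for b in users:
                if a != b:
                    graph[a].add(b)
    best, seen = 0, set()
    for start in graph:
        if start in seen:
            continue
        queue, comp = deque([start]), {start}
        while queue:                      # BFS over one component
            node = queue.popleft()
            for neigh in graph[node]:
                if neigh not in comp:
                    comp.add(neigh)
                    queue.append(neigh)
        seen |= comp
        best = max(best, len(comp))
    return best

print(largest_community(reactions))       # -> 4 (u1..u4 linked via u3)
```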
There has been a long discussion on knowledge creation in the health care environment. Recently, the action research approach has been attracting considerable attention. Action research supports a learning process in which healthcare stakeholders cooperate to produce knowledge that will influence their practice. Usually physicians are involved in case study research where information is produced but not used to offer insights back to the community. In this paper we propose a healthcare learning platform (HLP) that enables members of multidisciplinary health communities to collaborate, share up-to-date information, and harvest useful evidence. In this e-health platform, knowledge is created based on patient feedback, the dynamic creation of communities involving the participation of several stakeholders, and the creation of an action learning environment where problem identification, investigation and planning, action, and reflection form a cycle that enables ...
Graph signal processing has recently emerged as a field with applications across a broad spectrum of domains including brain connectivity networks, logistics and supply chains, social media, computational aesthetics, and transportation networks. In this paradigm, signal processing methodologies are applied to the adjacency matrix, seen as a two-dimensional signal. Fundamental operations of this type include graph sampling, the graph Laplace transform, and graph spectrum estimation. In this context, topology similarity metrics allow meaningful and efficient comparisons between pairs of graphs or along evolving graph sequences. In turn, such metrics can be the algorithmic cornerstone of graph clustering schemes. Major advantages of relying on existing signal processing kernels include parallelism, scalability, and numerical stability. This work presents a scheme for training a tensor stack network to estimate the topological correlation coefficient between two graph adjacency matrices compressed with the two-dimensional discrete cosine transform, thus augmenting the indirect decompression with knowledge stored in the network. The results from three benchmark graph sequences are encouraging in terms of mean square error and complexity. An additional key point is the independence of the proposed method from the underlying domain semantics, achieved primarily by focusing on higher-order structural graph patterns.
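The compression step this abstract relies on — a two-dimensional DCT of the adjacency matrix, truncated to its low-frequency block — can be illustrated with a small sketch. The tensor stack network itself is not reproduced here; as a stand-in for its learned estimate, the sketch computes a plain Pearson correlation between the compressed spectra. The DCT implementation, block size, and correlation choice are assumptions for the example.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis (rows are basis vectors).
    k = np.arange(n).reshape(-1, 1)
    i = np.arange(n).reshape(1, -1)
    m = np.cos(np.pi * k * (2 * i + 1) / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] = np.sqrt(1.0 / n)
    return m

def compress(adj, keep):
    # 2-D DCT of the adjacency matrix, truncated to a keep x keep
    # low-frequency block: the compressed graph representation.
    d = dct_matrix(adj.shape[0])
    coeffs = d @ adj @ d.T
    return coeffs[:keep, :keep]

def topo_correlation(a, b, keep=4):
    # Pearson correlation between compressed spectra; a simple
    # stand-in for the network's learned correlation estimate.
    ca, cb = compress(a, keep).ravel(), compress(b, keep).ravel()
    return float(np.corrcoef(ca, cb)[0, 1])

# Two small graphs: identical 6-node rings should correlate perfectly.
ring = np.roll(np.eye(6), 1, axis=1) + np.roll(np.eye(6), -1, axis=1)
print(topo_correlation(ring, ring))  # 1.0 up to floating point
```

Working in the truncated DCT domain is what makes the comparison cheap: the correlation is computed on a small coefficient block rather than on the full adjacency matrices.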
Graph signal processing is increasingly becoming important for discovering latent patterns, primarily higher-order ones, in massive, linked, and possibly semistructured data. The latter may well include social networks, digital images, music spectrograms, brain connectivity maps, protein-to-protein interaction graphs, and even dependency graphs between events from standard probability spaces. Computing the cross-correlation between two graph signals efficiently yields a versatile similarity metric, paving the way for distance metrics in graph clustering or graph classification tasks in numerous domains. In this conference paper, we examine methods from a broad class of cross-correlation methodologies that convert deterministic graphs to one-dimensional signals. This analysis is then extended to a class of random graphs, with the latter treated as stochastic signals. As a concrete application, these approaches are applied to benchmark data consisting of Twitter connectivity graphs and synthetic stochastic graph instances, respectively.
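The deterministic-graph case can be sketched concretely: map each graph to a one-dimensional signal, then take the peak of the normalized cross-correlation as the similarity score. The choice of the sorted degree sequence as the graph-to-signal map is one simple member of the class of methods the paper examines, picked here for brevity; the graphs and all names are illustrative.

```python
import numpy as np

def graph_to_signal(adj):
    # One simple deterministic graph-to-signal map: the sorted degree
    # sequence (other maps in this family use walks or spectra).
    return np.sort(adj.sum(axis=1))[::-1].astype(float)

def normalized_xcorr(a, b):
    # Peak of the normalized cross-correlation of the two graph
    # signals, usable as a similarity score bounded by 1.
    sa, sb = graph_to_signal(a), graph_to_signal(b)
    sa = (sa - sa.mean()) / (sa.std() + 1e-12)
    sb = (sb - sb.mean()) / (sb.std() + 1e-12)
    return float(np.max(np.correlate(sa, sb, mode="full")) / len(sa))

# A 5-node star versus a 5-node path graph.
star = np.zeros((5, 5))
star[0, 1:] = star[1:, 0] = 1
path = np.diag(np.ones(4), 1)
path = path + path.T

print(normalized_xcorr(star, star))  # identical graphs: ≈ 1.0
print(normalized_xcorr(star, path))  # different topologies: lower score
```

For the stochastic case, the same machinery applies with the signals treated as realizations of a random process, so the correlation is estimated over an ensemble of instances rather than a single pair.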
Cloud Computing Track Gagan Agrawal, Ohio State University, USA Rajkumar Buyya, University of Melbourne, Australia Jiannong Cao, Hong Kong Polytechnic University, China Chung Shiuan Chen, Academia Sinica, Taiwan Angus F.M. Huang, Academia Sinica, Taiwan Hai Jin, Huazhong University of Science and Technology, China Thilo Kielmann, Vrije Universiteit, Netherlands Wenjun Li, Sun Yat-Sen University, China Roie Melamed, IBM Haifa Research Lab, Israel Gero Muhl, Technical University of Berlin, Germany Dimitris Nikolopoulos, University of Crete, Greece Addison Y.S. Su, National Central University, Taiwan Anthony Sulistio, Hochschule Furtwangen, Germany Sandeep Tata, IBM Almaden Research Center, USA Qing Bo Wang, IBM China Research Lab, China Yin Wang, HP Labs, USA
... Spain Pascal Lorenz, University of Haute Alsace, France Carlos Abalde, University of A Coruña, Spain Marco Aiello, University of ... University of São Paulo, Brazil Julian Eckert, TU-Darmstadt, Germany Matthias Ehmann, University of... more
Blockchain technology provides a decentralized and secure platform for executing transactions. Smart contracts in Ethereum have been proposed as a mechanism to automate legal contracts securely without the involvement of third parties. Yet several issues remain to be resolved, especially regarding the updating of smart contracts on the blockchain and the use of blockchain as part of a legal smart contract system. In this work we propose a methodology and an architecture for building and deploying legal contracts on the blockchain. Because the blockchain is immutable, the code of smart legal contracts cannot be updated, yet in real-life applications contract updating is a requirement that cannot be ignored. We address this problem by introducing a versioning system that keeps track of changes and links the different versions in a linked list. Moreover, we propose a system architecture in which the user interface, the application logic, and the blockchain are smoothly integrated so that each part of the system contributes to a flexible and transparent execution. We show the applicability of our approach by implementing a system for the case of a rental agreement.
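The linked-list versioning idea can be modeled off-chain in a few lines: each new contract version is appended at the head and points back to its predecessor, so old versions remain untouched, mirroring blockchain immutability. This is an illustrative sketch only; the class names, fields, and addresses are hypothetical and not the paper's actual on-chain implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContractVersion:
    address: str    # deployment address of this version (illustrative)
    terms_hash: str # hash of the legal terms this version encodes
    previous: Optional["ContractVersion"] = None

class VersionRegistry:
    """Tracks the head of the linked list of contract versions."""

    def __init__(self, first: ContractVersion):
        self.head = first

    def update(self, new: ContractVersion) -> None:
        # Versions can only be appended; earlier versions stay
        # immutable, mirroring on-chain semantics.
        new.previous = self.head
        self.head = new

    def history(self):
        # Walk the list from newest to oldest.
        v, out = self.head, []
        while v is not None:
            out.append(v.address)
            v = v.previous
        return out

reg = VersionRegistry(ContractVersion("0xA1", "hash-v1"))
reg.update(ContractVersion("0xB2", "hash-v2"))
print(reg.history())  # newest first
```

On chain, the "previous" pointer would hold the address of the prior contract deployment, giving any party a verifiable audit trail of every amendment to the agreement.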
Relationship management has been of strategic importance for businesses that are interested in evaluating the state of their relationship with each customer and, where possible, migrating customers to better and more binding states. This work addresses the problem of estimating the relationship state of a customer and examining the customer's migration policy using social media analytics. We propose an innovative framework in which clustering, linguistic, and emotional analytics are used to automatically assign users to relationship states. Our research is multidisciplinary in nature: we use existing results from surveys on users' behavior when migrating between states to verify the semantics of our metrics, showing that they exhibit similar behavior. Our results show that clustering users based on communication, emotions, and perceived product mix can produce an automated assignment of users to states. Furthermore, trust, commitment, and homophily are defined, and our results show that users migrate between states influenced by these values. Our work provides data analytics metrics for businesses that identify and address the problem of relationship management, thus improving overall user satisfaction through a data analytics approach.
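The clustering step — grouping users by communication, emotion, and product-mix features and then naming the resulting clusters as relationship states — can be sketched with a minimal k-means loop. The feature values, the two-state labeling, and the seeding choice are all assumptions made for this illustration, not the framework's actual configuration.

```python
import numpy as np

# Toy feature vectors per user: (message frequency, mean sentiment,
# product-mix breadth). Users, values, and state names are illustrative.
users = {
    "u1": [0.9, 0.8, 0.7],   # frequent, positive, broad mix
    "u2": [0.8, 0.7, 0.9],
    "u3": [0.1, -0.5, 0.1],  # rare, negative, narrow mix
    "u4": [0.2, -0.4, 0.2],
}

X = np.array(list(users.values()))

# Two relationship states, seeded from two extreme users (minimal k-means).
centroids = X[[0, 2]].astype(float)
for _ in range(10):
    # Assign each user to the nearest centroid, then re-estimate centroids.
    dists = ((X[:, None] - centroids[None]) ** 2).sum(axis=-1)
    labels = np.argmin(dists, axis=1)
    centroids = np.array([X[labels == k].mean(axis=0) for k in (0, 1)])

states = {0: "bonded", 1: "at-risk"}  # hypothetical state names
assignment = {u: states[int(l)] for u, l in zip(users, labels)}
print(assignment)
```

In the framework itself, the cluster semantics are not assigned by hand as here but verified against survey results on state-migration behavior, and the trust, commitment, and homophily metrics are computed on top of the resulting states.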
When a disaster occurs, timely actions in response to urgent requests conveyed by critical messages (known as alerts) constitute a vital key to effectiveness. These actions include notifying potentially affected parties so that they can take precautionary measures, gathering additional information, and requesting remedial actions and resource allocation. However, there are different types of disasters such as epidemic outbreaks, natural
ABSTRACT The unprecedented advancement of the internet is complementing and even replacing the traditional commerce transactions. While face-to-face merchant agreements and paper-based contracts have always been the mechanism for facilitating business ...
