Agile software development methodologies have introduced best practices into software development. However, those practices need to be adopted and monitored continuously to maximize their benefits. Our research has focused on adaptability, suitability, and a software maturity model called the Agile Maturity Model (AMM) for agile software development environments. This paper introduces a process of adaptability assessment, suitability assessment, and an improvement framework for assessing and improving agile best practices. We have also developed web-based automated tool support to assess adaptability, suitability, and improvement for agile practices.
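As an illustration of what such tool support might compute, here is a minimal sketch of a weighted adaptability score over survey ratings of agile practices; the practice names, weights, and acceptance threshold are hypothetical assumptions, not the AMM's actual assessment instrument.

```python
# Hypothetical sketch of an agile-practice adaptability score.
# Practice names, weights, and the 0.6 threshold are illustrative
# assumptions, not the AMM's actual instrument.

PRACTICES = {
    "pair_programming": 0.8,        # relative importance weights (assumed)
    "test_driven_development": 1.0,
    "continuous_integration": 0.9,
    "onsite_customer": 0.7,
}

def adaptability_score(ratings: dict[str, int]) -> float:
    """Weighted average of 1-5 survey ratings, normalised to [0, 1]."""
    total = sum(PRACTICES[p] * r for p, r in ratings.items())
    return total / (5 * sum(PRACTICES[p] for p in ratings))

ratings = {"pair_programming": 4, "test_driven_development": 3,
           "continuous_integration": 5, "onsite_customer": 2}
score = adaptability_score(ratings)
print(f"adaptability index: {score:.2f}",
      "-> suitable" if score >= 0.6 else "-> needs improvement")
```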
Various factors affect the impact of agile practices on the continuous delivery of software projects. This is a major reason why projects perform differently (some failing and some succeeding) when they implement agile practices in various environments. Nor does it help that many projects work within a limited budget while project plans also change, putting them under pressure to meet deadlines when they fall behind in their planned work. This study investigates the impact of pair programming (PP), customer involvement, QA ability, pair testing, and test-driven development (TDD) on the pre-release and post-release quality of software projects, using system dynamics within a schedule-pressure-blighted environment. The model is validated using results from a completed medium-sized software project. Statistical results suggest that the impact of PP is insignificant on the pre-release quality of the software, while TDD and customer involvement both have significant effects on the...
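The abstract does not reproduce the system-dynamics model itself, so the sketch below is only a toy illustration of the general mechanism it describes: schedule pressure increases the defect-injection rate, while TDD and customer involvement increase defect removal. Every coefficient here is an invented assumption.

```python
# Toy system-dynamics sketch (Euler integration) of the mechanism studied:
# schedule pressure raises defect injection; TDD and customer involvement
# raise defect removal. All coefficients are invented for illustration.

def simulate(weeks=20, dt=1.0, tdd=0.6, customer_involvement=0.5):
    planned_rate = 100.0                   # planned tasks per week (assumed)
    planned = completed = defects = 0.0
    for _ in range(weeks):
        planned += planned_rate * dt
        # Pressure grows as actual progress falls behind the plan.
        pressure = max(0.0, (planned - completed) / planned)
        productivity = planned_rate * (0.7 + 0.3 * (1.0 - pressure))
        completed += productivity * dt
        injection = 0.05 * productivity * (1.0 + 2.0 * pressure)
        removal = defects * (0.2 * tdd + 0.15 * customer_involvement)
        defects += (injection - removal) * dt
    return defects

print(f"open defects at release:  {simulate():.1f}")
print(f"same project without TDD: {simulate(tdd=0.0):.1f}")
```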
Software engineering activities in industry have come a long way, with improvements introduced at various stages of the software development life cycle. The complexity of modern software, commercial constraints, and the expectation of high-quality products demand accurate fault prediction based on OO design metrics at the class level in the early stages of software development. Object-oriented class metrics are used as quality predictors throughout the OO software development life cycle, even when a highly iterative, incremental model or agile software process is employed. Recent research has shown that some OO design metrics are useful for predicting the fault-proneness of classes. In this paper, an empirical validation of a set of metrics proposed by Chidamber and Kemerer is performed to assess their ability to predict software quality in terms of fault proneness and degradation. We have also proposed the design complexity of object-oriented software...
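The abstract does not detail the validation procedure, but the sketch below shows one common way CK metrics are used as fault-proneness predictors: a logistic-regression classifier over class-level metric vectors. scikit-learn is assumed, and random placeholder data stands in for metrics mined from a real codebase.

```python
# Sketch: fault-proneness prediction from Chidamber-Kemerer metrics via
# logistic regression. The dataset is random placeholder data; a real
# study would use metrics mined from an actual OO codebase.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: WMC, DIT, NOC, CBO, RFC, LCOM (the six CK metrics).
X = rng.integers(0, 50, size=(200, 6)).astype(float)
# Placeholder label: classes with high complexity/coupling tend to be faulty.
y = ((X[:, 0] + X[:, 3] + X[:, 4]) > 75).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"holdout accuracy: {model.score(X_te, y_te):.2f}")
```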
Effective, accurate, and timely application integration is fundamental to the successful operation of today's organizations. The success of every business initiative relies heavily on integration between existing heterogeneous applications and databases. For this reason, when companies look to improve productivity, reduce overall costs, or streamline business processes, integration should be at the heart of their plans. Integration improves exposure and, by extension, the value and quality of information, facilitating workflow and reducing business risk. It is an important element of the way an organization's business processes operate. Data integration technology is the key to pulling organizational data together and delivering an information infrastructure that will meet strategic business intelligence initiatives. This information infrastructure consists of data warehouses, interfaces, workflows, and data access tools. Integration solutions should utilize metadata to move or...
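As a small illustration of metadata-driven integration, the sketch below uses a mapping table (the metadata) to move records between two hypothetical schemas; all field names and conversions are invented.

```python
# Sketch: metadata-driven record transformation between two hypothetical
# schemas. The mapping table is the "metadata" driving the move; all field
# names and converters are invented for illustration.

FIELD_MAP = {                      # target field -> (source field, converter)
    "customer_id": ("CUST_NO",   int),
    "full_name":   ("CUST_NAME", str.strip),
    "balance_usd": ("BAL",       float),
}

def transform(source_record: dict) -> dict:
    return {tgt: conv(source_record[src])
            for tgt, (src, conv) in FIELD_MAP.items()}

legacy = {"CUST_NO": "1042", "CUST_NAME": "  Ada Lovelace ", "BAL": "99.50"}
print(transform(legacy))
# {'customer_id': 1042, 'full_name': 'Ada Lovelace', 'balance_usd': 99.5}
```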
Organizational Advancements through Enterprise Information Systems
ERP software standardizes an enterprise’s business processes and data. The software converts transactional data into useful information and collates it so that it can be analyzed. Requirements engineering is an important component of ERP projects. In this paper, we propose (1) an ERP Maturity Model (EMM) for assessing ERP maturity within the organization and (2) a Requirements Engineering Method (REM) for capturing ERP system requirements from the different types of users of an ERP system and for verifying and validating them. The EMM consists of three levels, and each level has a focus and a key process area. Key indicators of ERP functionality identified by a major ERP vendor have been used to apply the EMM to an enterprise; this identifies the level of the EMM to which the enterprise belongs. The REM is then used to enable the enterprise to assess its ERP system requirements and refine them using a knowledge database to reach a higher level in the EMM than the...
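A minimal sketch of how a three-level assessment like the EMM could be automated is shown below; since the abstract does not enumerate the key process areas or indicator thresholds, the ones used here are assumptions.

```python
# Sketch: assigning an enterprise to a maturity level from key-process-area
# indicator coverage. KPA names and the 80% threshold are assumptions; the
# abstract does not enumerate the EMM's actual indicators.

LEVELS = [  # (level, key process area, coverage required)
    (1, "core transactional modules", 0.8),
    (2, "cross-functional integration", 0.8),
    (3, "analytics and optimisation", 0.8),
]

def emm_level(coverage: dict[str, float]) -> int:
    """Highest consecutive level whose KPA coverage meets its threshold."""
    level = 0
    for lvl, kpa, threshold in LEVELS:
        if coverage.get(kpa, 0.0) >= threshold:
            level = lvl
        else:
            break
    return level

print(emm_level({"core transactional modules": 0.9,
                 "cross-functional integration": 0.85,
                 "analytics and optimisation": 0.4}))   # -> 2
```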
Service and cloud computing have revolutionized the way we develop software. The emergence of cloud computing has had a huge impact on the economics of developing services. However, it remains challenging to understand the concept of a service, which is still quite new to the computing sciences. This talk presents a systematic approach to understanding service and cloud computing and provides Service Development Life Cycle approaches along with market-niche techniques and tools.
This report discusses the big data industry and its derived product, the Internet of Things. On this basis, it analyzes the ethical challenges encountered by smart connected toys. It also offers some recommendations concerning regulation and parental control, and examines what stakeholders in the smart connected toy industry should do under the framework principles of the IoT.
Fragment allocation design is an essential issue for improving the performance of application processing in distributed database systems (DDBs). Database queries access applications on the distributed database sites and should be executed efficiently. Therefore, the fragments accessed by queries need to be allocated to DDB sites so as to reduce communication costs during application execution and to handle their operational processing. We present a method for grouping the sites of a DDB according to their communication costs, in order to allocate fragments to a group of sites instead of allocating them site by site. Optimizing the fragment allocation cost functions to reduce query processing time, and determining which fragments to allocate to which DDB sites, are also main objectives of our research.
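A simplified sketch of the grouping idea follows: sites whose pairwise communication cost falls below a threshold are merged into one group, and each fragment is then allocated to the group that minimises the total access cost of the sites querying it. The cost matrix, threshold, and greedy merge are illustrative assumptions, not the paper's exact method.

```python
# Sketch of the site-grouping idea: merge sites whose pairwise communication
# cost is below a threshold, then allocate each fragment to the cheapest
# group. Cost matrix and threshold are invented for illustration.

COST = [            # symmetric site-to-site communication costs (assumed)
    [0, 2, 9, 8],
    [2, 0, 8, 9],
    [9, 8, 0, 3],
    [8, 9, 3, 0],
]
THRESHOLD = 5

def group_sites(cost, threshold):
    """Greedily merge sites whose links to a whole group are cheap enough."""
    groups = []
    for site in range(len(cost)):
        for g in groups:
            if all(cost[site][other] <= threshold for other in g):
                g.append(site)
                break
        else:
            groups.append([site])
    return groups

def allocate(fragment_access, groups, cost):
    """Pick the group minimising total cost for the sites accessing a fragment."""
    def total(g):
        return sum(min(cost[s][m] for m in g) for s in fragment_access)
    return min(groups, key=total)

groups = group_sites(COST, THRESHOLD)            # -> [[0, 1], [2, 3]]
print(groups, allocate({0, 1, 2}, groups, COST)) # fragment goes to [0, 1]
```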
"Fourth International Workshop on Adoption-Centric Software Engineering (ACSE 2004)" W6S Workshop - 26th International Conference on Software Engineering
Cloud computing has emerged to address the needs of businesses and to improve the quantity and quality of data that we can collect and analyse from multiple sources and devices. Cloud computing has also revolutionised the software paradigm by shifting to a service-oriented paradigm in which cloud resources and software are offered as a service. This service archetype has changed the way we think when producing a cloud service. This chapter provides an outline of the underpinning definitions, principles, and concepts that are currently lacking in the literature. It also outlines the foundations of cloud computing and then endeavours to chart the emerging trends and the evolution of cloud applications. The emerging trends include new services, federated cloud paradigms, smart cities, big data, IoT, and mobile cloud.
The presence of bugs in a software release has become inevitable, and the loss a company incurs due to bugs in a release is phenomenal. Modern methods of testing and debugging have shifted focus from ‘detecting’ to ‘predicting’ bugs in the code. Existing bug prediction models have not been optimized for commercial use, and their scalability has not yet been discussed in depth. Taking into account the varying cost of fixing a bug, depending on the stage of the software development cycle at which it is detected, this chapter uses two approaches: one model that can be employed when the ‘cost of changing code’ curve is exponential, and another that can be used otherwise. The cases where each model is best suited are discussed. This chapter proposes a model that can be deployed on a cloud platform for software development companies to use. The model aims to predict the presence or absence of a bug in the code using machine learning classification models. Using Microsoft Azure’s machine learning platform, this model can be distributed as a web service worldwide, thus providing bug prediction as a service (BPaaS).
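The chapter's Azure ML deployment is not reproduced here; the sketch below illustrates only the underlying idea of BPaaS: a classifier trained over code metrics, wrapped in a prediction function that a web service could expose. The features and training data are placeholders, and scikit-learn stands in for the Azure platform.

```python
# Sketch of bug prediction as a service: a classifier over code metrics
# wrapped in a predict function a web service could expose. Features and
# training data are placeholders, not a real defect dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Assumed feature columns: lines changed, cyclomatic complexity, churn.
X = rng.random((500, 3)) * [400, 30, 10]
y = (X[:, 0] / 400 + X[:, 1] / 30 + rng.random(500) > 1.4).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)

def predict_bug(lines_changed: float, complexity: float, churn: float) -> dict:
    """The JSON-style payload a BPaaS endpoint might return."""
    prob = clf.predict_proba([[lines_changed, complexity, churn]])[0, 1]
    return {"bug_likely": bool(prob >= 0.5),
            "probability": round(float(prob), 2)}

print(predict_bug(lines_changed=250, complexity=22, churn=4))
```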
As smartphone use proliferates and human interaction through social media intensifies around the globe, the amount of data available to process is greater than ever before. As a consequence, the design and implementation of systems capable of handling such vast amounts of data in acceptable timescales has moved to the forefront of academic and industry-based research. This research represents a unique contribution to the field of software engineering for big data in the form of an investigation of the big data architectures of three well-known real-world companies: Facebook, Twitter, and Netflix. The purpose of this investigation is to gather significant non-functional requirements for real-world big data systems, with the aim of addressing these requirements in the design of our own reference architecture for big data processing in the cloud: MC-BDP (Multi-Cloud Big Data Processing). MC-BDP represents an evolution of the PaaS-BDP (Platform as a Service for Big Data Processing) architectural pattern previously developed by the authors, although its presentation is not within the scope of this study. The scope of this comparative study is limited to the examination of academic papers, technical blogs, presentations, source code, and documentation officially published by the companies under investigation. Ten non-functional requirements are identified and discussed in the context of these companies’ architectures: batch data, stream data, late and out-of-order data, processing guarantees, integration and extensibility, distribution and scalability, cloud support and elasticity, fault tolerance, flow control and flexibility, and technology agnosticism. These are followed by the conclusion and considerations for future work.
In today's world, we live in busy metropolitan cities and want our homes to be ambient-intelligent enough to support our cognitive requirements for assisted living in a smart-space environment; an excellent smart home control system should not rely on the users' instructions (Wanglei, 2015). Ambient intelligence is a sensational new information technology paradigm in which people are empowered for assisted living through an environment of multiple IoT sensors that is aware of the inhabitants' presence and context and is highly sensitive, adaptive, and responsive to their needs. A notable ambient intelligent environment is characterized by its ubiquity, transparency, and intelligence: it is seamlessly integrated into the background and invisible to the surrounding users/inhabitants. Cognitive IoE (Internet of Everything) is a new type of pervasive computing. As the ambient smart home has been a research topic for only a couple of years, many research outcomes lack potential in ambient int...
Service computing and cloud computing have emerged to address the need for more flexible and cost-efficient computing systems in which software is delivered as a service. To make such systems more resilient and reliable, we need to adopt the software engineering (SE) principles and best practices that have existed for the last 40 years or so. This chapter therefore proposes a Software Engineering Framework for Service and Cloud Computing (SEF-SCC) to address the need for a systematic approach to designing and developing robust, resilient, and reusable services. The chapter presents SEF-SCC methods, techniques, and a systematic engineering process supporting the development of service-oriented software systems and the software-as-a-service paradigm. SEF-SCC has been successfully validated over the past 10 years through a large-scale case study on British Energy Power and Energy Trading (BEPET). The ideas and concepts suggested in this chapter are equally applicable to all distributed computing environments, including the Fog and Edge Computing paradigms.
Web 2.0 and Cloud Technologies for Implementing Connected Government
Cloud computing technologies are being used highly successfully in large-scale businesses. It is therefore useful for governments to adopt cloud-driven, multi-channel, multi-device delivery of their services, such as e-tax, e-vote, and e-health. Since these applications require an open, flexible, interoperable, collaborative, and integrated architecture, a service-oriented architecture (SOA) approach can usefully be adopted to achieve flexibility together with multi-platform and multi-channel integration. However, its adoption needs to be systematic, secure, and privacy-driven. In this context, microservices architecture (MSA), a direct offshoot of SOA, is also a highly attractive mechanism for building and deploying enterprise-scale applications. This chapter proposes a systematic framework for cloud e-government services based on the cloud software engineering approach and suggests a cloud adoption model for e-government, leveraging the benefits of MSA patterns. The proposed model is based on a...