Joe Essien
    Making incorrect choices when selecting crops can result in substantial financial losses for farmers, primarily because of a limited understanding of the unique needs of each crop. Each farm possesses unique characteristics, influencing the effectiveness of modern agricultural solutions. Challenges persist in optimizing farming methods to maximize yield. This study aims to mitigate these issues by developing a data-driven crop classification and cultivation advisory system that leverages machine learning algorithms and agricultural data. By analysing variables such as soil nutrient levels, temperature, humidity, pH, and rainfall, the system offers tailored recommendations for crop selection and cultivation practices. This approach optimizes resource utilization, enhances crop productivity, and promotes sustainable agriculture. The study emphasizes the importance of pre-processing the data, such as handling missing values and normalizing features, to ensure reliable model training. Various machine learning models, including Random Forest, Bagging, and AdaBoost classifiers, were employed and demonstrated high accuracy in crop classification tasks. The integration of real-time weather data, market prices, and profitability analysis further refines decision-making, while a mobile application gives farmers convenient access. By incorporating user feedback and ongoing data collection, the system's performance can be continuously improved, offering precise and economically viable agricultural advice.
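    A minimal sketch of the classification pipeline this abstract describes, using scikit-learn's RandomForestClassifier. The dataset path, column names and hyperparameters are assumptions made for illustration; the features mirror the variables listed above (soil nutrients, temperature, humidity, pH, rainfall).

```python
# Hypothetical crop-recommendation sketch; "crop_data.csv" and its column
# names are assumed, not taken from the study's actual dataset.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

df = pd.read_csv("crop_data.csv").dropna()        # handle missing values
X = df[["N", "P", "K", "temperature", "humidity", "ph", "rainfall"]]
y = df["label"]                                   # recommended crop

X_scaled = MinMaxScaler().fit_transform(X)        # normalize features
X_train, X_test, y_train, y_test = train_test_split(
    X_scaled, y, test_size=0.2, stratify=y, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

    A Bagging or AdaBoost classifier, the other ensembles the study mentions, can be swapped in via sklearn.ensemble with the same fit/predict interface.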
    As the practice of Enterprise Architecture (EA) diversifies, the schematic management of its objects, semantics and relationships continues to be complex. Given that EA supports the IT environment by simulating alignment between dynamic business architectures, heterogeneous application systems and incongruent technologies, the need to validate conceptualized EA models has also become critical. As EA is a relatively new discipline, its disparate and widespread methodologies make it challenging to adopt a generic approach in which models can be verified, given the unavailability of unified EA modelling languages able to describe a wide range of Information Technology domains. This paper presents an approach for addressing this challenge through the use of ontologies and queries based on constraints specified in the model's motivation taxonomy. The paper is experimental research and grounds its hypothesis on an initial model created using the ArchiMate modelling language. By transforming the conceptual metamodel into a model instance, a process achievable irrespective of the modelling language used to design the EA, it represents extracted triples in Resource Description Framework Schema (RDFS) by mapping the model artefacts directly into classes and slots using a conventional web ontology language. The generated RDF is then queried using the SPARQL Protocol and RDF Query Language, adopting the Behavior Driven Development (BDD) concept. A case study of the Student Internship Program (SIP) is deployed to translate information from business needs to IT solutions, encapsulating an abstracted view of the EA. The paper also proposes an implementation of the approach on an open-source platform that allows construction of domain models and knowledge-based applications with ontologies, and contributes towards effective validation of EA through taxonomy decomposition, systematic generation of testable EA artefacts, creation of structural triples of model elements and alignment of motivational goals with business behavior specifications.
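    The triple-extraction-and-query step can be illustrated with rdflib. The classes and properties below (ex:BusinessActor, ex:realizes) are invented stand-ins, not the paper's actual SIP artefacts, and rdflib itself is an assumption about tooling.

```python
# Sketch: map model artefacts to RDFS classes/instances, then query with SPARQL.
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/ea#")
g = Graph()
g.bind("ex", EX)

# Structural triples extracted from the (hypothetical) model instance.
g.add((EX.BusinessActor, RDF.type, RDFS.Class))
g.add((EX.Student, RDF.type, EX.BusinessActor))
g.add((EX.Student, EX.realizes, EX.InternshipApplication))

# A BDD-style check: every business actor must realize some service.
results = g.query("""
    PREFIX ex: <http://example.org/ea#>
    SELECT ?actor ?service
    WHERE { ?actor a ex:BusinessActor ; ex:realizes ?service . }
""")
for actor, service in results:
    print(actor, "realizes", service)
```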
    The advancement of Enterprise Architecture modelling techniques, and the need to incorporate these models into other Information Systems collaborations, requires a methodology that ensures the consistency and compatibility of diverse models with shared business objectives. Fulfilling this requirement is impeded by a notable obstacle: the growing intricacy of Enterprise Architecture Framework modelling tools and methodologies. Many organisations that have adopted a variety of methodologies face difficulty integrating models because of inconsistent and diverse modelling artefacts within an unreliable framework that requires validation. The primary aim of this article is to conduct a comprehensive analysis of existing methodologies for validating enterprise architecture and to propose a new approach that focuses specifically on validating models. The research combines the Systematic Review approach with the meta-analysis method to provide a systematic and comprehensive review of multiple studies, combining their findings to synthesize evidence and draw definitive conclusions. The comparative analysis explores validation semantics and heterogeneous model frameworks and delves into the different approaches used to validate models in the context of heterogeneous model integration. The study examines the strengths and weaknesses of various validation methods and frameworks for heterogeneous models, considering factors such as accuracy, efficiency and scalability. By comparing different validation semantics and heterogeneous model frameworks, the analysis provides valuable insights into selecting appropriate techniques for effectively integrating and validating diverse models in complex systems.
    The problem of traffic congestion is a significant phenomenon that has had a substantial impact on the transportation system within the country. It has given rise to numerous complications, particularly where emergencies occur at traffic light intersections that are consistently congested with a high volume of vehicles. The traffic light controller system implemented here is designed to address this problem. The system facilitates the operation of a 3-way traffic control light and gives priority to emergency vehicles using a Radio Frequency Identification (RFID) sensor and a Reduced Instruction Set Computing (RISC) architecture-based microcontroller. The work involved designing a system to mitigate the accidents commonly observed at traffic light intersections, where vehicles often need to maneuver to make way for emergency vehicles following a designated route. The research achieved the analysis, simulation and implementation of wireless communication devices for traffic light control. The implemented prototype uses RFID transmission, operates in conjunction with the sequential mode of the traffic lights to alter the light sequence accordingly, and reverts the lights to their normal sequence after the emergency vehicle has passed.
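    For illustration only, the preemption behaviour reads as a small state machine. The sketch below is in Python, whereas the actual prototype runs on a RISC microcontroller; the phase names, timings and polling function are all invented.

```python
# Normal 3-way cycle interrupted by an RFID detection, then resumed.
import itertools
import time

NORMAL_SEQUENCE = ["A_GREEN", "B_GREEN", "C_GREEN"]   # 3-way junction phases
PHASE_SECONDS = 0.1                                   # shortened for the demo

def rfid_detected():
    """Stand-in for polling the RFID receiver; returns a route or None."""
    return None

def run_controller(cycles=3):
    normal = itertools.cycle(NORMAL_SEQUENCE)
    for _ in range(cycles * len(NORMAL_SEQUENCE)):
        route = rfid_detected()
        if route is not None:
            # Hold green for the emergency route until the vehicle passes,
            # then fall back to the normal sequence on the next step.
            print(f"preempt: {route}_GREEN")
            time.sleep(PHASE_SECONDS)
            continue
        print("phase:", next(normal))
        time.sleep(PHASE_SECONDS)

run_controller()
```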
    Ontology-Driven Analytic Models for Pension Management are sophisticated approaches that integrate the principles of ontology and analytics to optimize the management and decision-making processes within pension systems. While such models offer significant benefits for pension management, there are also challenges in implementing and utilizing them. Developing a comprehensive and accurate ontology for pension management requires a deep understanding of the domain, including regulatory frameworks, investment strategies, retirement planning, and the integration of data from heterogeneous sources; integrating these data into a cohesive ontology can be challenging. This research leverages semantic ontology as an approach to the structured representation of knowledge about concepts and their relationships, and applies it to analyze and optimize decision support for pension management. The proposed ontology presents a formal and explicit specification of concepts (classes), their attributes, and the relationships between them, and provides a shared, standardized understanding of the domain, enabling precise communication and knowledge representation for decision support. The ontology deploys computational frameworks and analytic models to assess and evaluate data, generate insights, predict future pension fund performance and assess risk exposure. The research adopts a reasoner, SPARQL queries and an OWL visualizer executed in a Java IDE for modelling the ontology-driven analytics. The approach encapsulates and integrates semantic ontologies with analytical models to enhance the accuracy, contextuality, and comprehensiveness of analyses and decisions within pension systems.
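    A fragment of such an ontology can be sketched with owlready2 (an assumption about tooling; the paper reports a reasoner, SPARQL and an OWL visualizer in a Java IDE). The class and property names below are illustrative, not the study's schema.

```python
# Toy pension ontology: classes, an object property and a data property.
from owlready2 import Thing, get_ontology

onto = get_ontology("http://example.org/pension.owl")

with onto:
    class PensionFund(Thing): pass
    class RiskProfile(Thing): pass
    class hasRiskProfile(PensionFund >> RiskProfile): pass   # object property
    class annualReturn(PensionFund >> float): pass           # data property

fund = PensionFund("StatePensionFund")
fund.hasRiskProfile = [RiskProfile("Moderate")]
fund.annualReturn = [0.062]

# A reasoner (e.g. owlready2's sync_reasoner) and SPARQL queries would run
# over this model to derive the decision-support insights described above.
print(fund.hasRiskProfile, fund.annualReturn)
```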
    This research involved an exploratory evaluation of the dynamics of vehicular traffic on a road network across two traffic light-controlled junctions. The study uses the case of a one-kilometer road system modelled in AnyLogic version 8.8.4. AnyLogic is a multi-paradigm simulation tool that supports three main simulation methodologies: discrete event simulation, agent-based modelling, and system dynamics. The model is used to evaluate the implications of stochastic time-based vehicle variables for the general efficiency of road use. Road use efficiency in this model is measured as the percentage of entering vehicles that exit the model within a one-hour simulation period. The study deduced that, for the model under review, an increase in entry point time delay has a dominant influence on road use efficiency, far beyond any other consideration. The study therefore presents a novel approach that leverages Discrete Event Simulation to facilitate efficient road management with a focus on optimum road use efficiency. It also determined that including appropriate random parameters to reflect road use activities at critical event points in a simulation can help in the effective representation of authentic traffic models. The AnyLogic simulation software leverages the Classic DEVS and Parallel DEVS formalisms to achieve these objectives.
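    The efficiency metric can be reproduced in miniature with a discrete-event sketch; SimPy stands in here for AnyLogic, which the study actually used, and the delay parameters are invented.

```python
# Share of vehicles that both enter and exit within a one-hour run.
import random
import simpy

ENTRY_DELAY_MEAN = 4.0    # s between arrivals (the sensitive variable)
TRAVEL_TIME_MEAN = 90.0   # s to traverse the modelled 1 km section
SIM_HORIZON = 3600        # one-hour simulation period

entered, exited = 0, 0

def vehicle(env):
    global exited
    yield env.timeout(random.expovariate(1.0 / TRAVEL_TIME_MEAN))
    exited += 1

def source(env):
    global entered
    while True:
        yield env.timeout(random.expovariate(1.0 / ENTRY_DELAY_MEAN))
        entered += 1
        env.process(vehicle(env))

env = simpy.Environment()
env.process(source(env))
env.run(until=SIM_HORIZON)
print(f"road-use efficiency: {exited / entered:.1%}")
```

    Raising ENTRY_DELAY_MEAN thins the arrivals; in the study's fuller model with junction queues, entry point delay was the factor that dominated the efficiency outcome.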
    Enterprise Information System management has become an increasingly vital factor for many firms. Several organizations have encountered problems when attempting to evaluate organizational performance, and measurement of performance metrics is a key challenge for many of them. In order to preserve relevance and adaptability in competitive markets, it has become essential to respond proactively to complex events through informed, technology-supported decision-making. The objective of this study was therefore to apply neural networks to the modelling, simulation, and forecasting of the effects of Enterprise Information System performance indicators on the achievement of corporate objectives and value creation. A set of quantifiable conditionally independent associations was derived using a simplified joint probability distribution technique. Bayesian Neural Networks were utilized to describe the links between random variables (features) and to specify the joint probability distribution concisely. The research demonstrated that Bayesian networks can effectively explore complex logical linkages by employing probability to represent uncertainty, and by applying impact models from Bayesian taxonomies to achieve learning and reasoning processes.
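    The joint-probability factorization the abstract relies on can be illustrated with a small Bayesian network; pgmpy, the variable names and the probability tables below are all assumptions made for the sketch.

```python
# P(align, uptime, goal) = P(align) * P(uptime) * P(goal | align, uptime)
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("ITAlignment", "GoalAttainment"),
                         ("SystemUptime", "GoalAttainment")])

cpd_align = TabularCPD("ITAlignment", 2, [[0.7], [0.3]])
cpd_uptime = TabularCPD("SystemUptime", 2, [[0.9], [0.1]])
cpd_goal = TabularCPD(
    "GoalAttainment", 2,
    [[0.95, 0.70, 0.60, 0.20],   # P(not attained | parent states)
     [0.05, 0.30, 0.40, 0.80]],  # P(attained | parent states)
    evidence=["ITAlignment", "SystemUptime"], evidence_card=[2, 2])

model.add_cpds(cpd_align, cpd_uptime, cpd_goal)
infer = VariableElimination(model)
print(infer.query(["GoalAttainment"], evidence={"ITAlignment": 1}))
```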
    Fire alarm sensors require highly predictive variables to ensure accurate detection, injury prevention, and loss prevention. Bayesian networks can aid in enhancing early fire detection capabilities and reducing the frequency of erroneous fire alerts, thereby enhancing the effectiveness of many safety monitoring systems. This research explores the development of optimized probabilistic graphical models for the discretization thresholds of alarm system predictor variables. The study presents a statistical model framework that increases the efficacy of fire detection by predicting the discretization thresholds of the predictor variable fluctuations used to detect the onset of fire. The work applies Bayesian networks and probabilistic graphical models to reveal the specific characteristics required to cope with fire detection strategies and patterns. The adopted methodology utilizes a combination of prior knowledge and statistical data to draw conclusions from observations. Utilizing domain knowledge to compute conditional dependencies between network variables enabled predictions to be made through specialized analytical and simulation techniques.
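    The core idea, discretizing a continuous reading at a threshold and updating the fire probability, can be shown with Bayes' rule; every number below is an invented illustration, not a fitted value from the study.

```python
# Posterior P(fire) after discretizing a smoke reading at a threshold.
import numpy as np

SMOKE_THRESHOLD = 0.35        # candidate discretization threshold
P_FIRE = 0.01                 # prior P(fire)
P_HIGH_GIVEN_FIRE = 0.95      # P(reading > threshold | fire)
P_HIGH_GIVEN_NO_FIRE = 0.08   # false-trigger rate

def posterior_fire(reading: float) -> float:
    high = reading > SMOKE_THRESHOLD
    like_fire = P_HIGH_GIVEN_FIRE if high else 1 - P_HIGH_GIVEN_FIRE
    like_none = P_HIGH_GIVEN_NO_FIRE if high else 1 - P_HIGH_GIVEN_NO_FIRE
    evidence = like_fire * P_FIRE + like_none * (1 - P_FIRE)
    return like_fire * P_FIRE / evidence

for reading in np.array([0.10, 0.40, 0.80]):
    print(f"reading={reading:.2f} -> P(fire)={posterior_fire(reading):.3f}")
```

    Sweeping SMOKE_THRESHOLD against labelled observations is one way to pick a discretization point that trades missed fires against false alerts.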
    Over time, a range of control mechanisms have been devised to deter unauthorised individuals from gaining entry to restricted facilities. The principal objective of implementing an automated access control system is to guarantee the protection of individuals and assets, and electronic devices that offer security measures have gained significant traction in recent times. The automated door access control system is a highly sophisticated method of identification that utilises both biometric and non-biometric techniques. Non-biometric methods of identification rely on passwords and access cards, while biometric methods utilise techniques such as fingerprint, iris, or facial recognition to establish a person's identity. Nevertheless, the majority of these techniques do not incorporate role-based algorithms that assess access requests against a predetermined set of roles. The study introduces a novel design and simulation of a role-based access control system augmented with facial recognition (RBAC-EFR), utilising the ESP32-CAM. The RBAC-EFR system has been developed to facilitate domain-specific expert systems; it leverages roles and facial recognition technologies to support deductions and decisions about the user access it actively monitors. The Arduino Integrated Development Environment (IDE) serves as the design tool for writing and uploading programs to the Arduino hardware, while Proteus is utilised as a circuit simulation and virtual system modelling application to simulate the interaction between the microcontroller scripts and digital electronic components.
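    A hypothetical sketch of the role-based decision layered on a face match follows: the roles, the embedding comparison and the threshold are illustrative stand-ins for the ESP32-CAM pipeline, not the system's actual implementation.

```python
# Grant access only when a recognized face's role permits the requested door.
import numpy as np

ROLE_PERMISSIONS = {"admin": {"server_room", "lobby"},
                    "staff": {"lobby"}}
ENROLLED = {"alice": ("admin", np.array([0.12, 0.88, 0.45])),
            "bob":   ("staff", np.array([0.77, 0.10, 0.33]))}
MATCH_THRESHOLD = 0.15   # max embedding distance accepted as a match

def request_access(embedding: np.ndarray, door: str) -> bool:
    for name, (role, enrolled_vec) in ENROLLED.items():
        if np.linalg.norm(embedding - enrolled_vec) < MATCH_THRESHOLD:
            granted = door in ROLE_PERMISSIONS[role]
            print(f"{name} ({role}) -> {door}: "
                  f"{'granted' if granted else 'denied'}")
            return granted
    print("unknown face -> denied")
    return False

request_access(np.array([0.13, 0.87, 0.44]), "server_room")  # near alice
```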
    Integration of ultrasonic range finder technology with the Internet of Things (IoT) for smart automated door control systems is an innovative strategy that combines two potent technologies to create intelligent and effective door control systems. The integration enables the automation and real-time monitoring of door operations, thereby augmenting security, convenience, and energy efficiency. By integrating ultrasonic range finder technology, which utilizes sound waves to measure distances, with IoT, automated door control systems can be enhanced with advanced capabilities. This paper presents the integration of ultrasonic range finder technology with smart automated door control systems. The work is grounded in the principles of sound wave propagation and the measurement of the time it takes sound waves to travel and reflect back from objects. It leverages the benefits of enhanced security, efficiency, data-driven insights, and integration with other IoT devices, using an Arduino Uno microcontroller, an ultrasonic range finder, ultrasonic sensors, a motor driver and integrated circuits. Ultrasonic sensors are deployed as the main medium for sensing users of the system. The methodology includes a user interface prototype that can be simulated to allow monitoring and control of the automated door system. The study contributes towards the development of intelligent and responsive door control systems, thereby augmenting the overall smart building ecosystem, and demonstrates the versatility and wide-ranging impact of ultrasonic range finder technology across various domains, enhancing efficiency, safety, and functionality in numerous applications.
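    The underlying time-of-flight arithmetic is compact enough to show directly; the trigger distance is an assumed parameter, and on the real device this logic lives in the Arduino sketch rather than Python.

```python
# Distance from echo time: the pulse travels out and back, so halve the trip.
SPEED_OF_SOUND = 343.0      # m/s in air at ~20 °C
TRIGGER_DISTANCE_M = 0.75   # open the door for users closer than this

def echo_to_distance(echo_seconds: float) -> float:
    return SPEED_OF_SOUND * echo_seconds / 2

def door_command(echo_seconds: float) -> str:
    if echo_to_distance(echo_seconds) < TRIGGER_DISTANCE_M:
        return "OPEN"
    return "CLOSE"

print(door_command(0.003))   # ~0.51 m away -> OPEN
print(door_command(0.010))   # ~1.72 m away -> CLOSE
```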
    Due to the rapid advancement of electronic commerce technologies, the use of credit cards has increased significantly, and since credit cards are the most common form of payment, the incidence of credit card fraud has also risen. With the rise of online money transfers in the cashless economy and the migration of businesses to the internet, fraud detection has become a crucial aspect of transaction security. With technological advancement and the emergence of new e-service payment options, such as e-commerce and mobile payments, credit card transactions have become more common. To prevent credit card fraud, a robust and reliable fraud detection system is necessary. Several approaches, including predictive approaches and algorithms, have been proposed to detect credit card fraud. These algorithms establish a set of logically sound principles that permit the classification of data as either normal or suspicious. However, credit card fraud has persisted despite the adoption of more sophisticated techniques. This study presents an approach for detecting credit card fraud using random forests, in which both the historical dataset and the user's current transactions are analysed. The method improves the precision of the outcome before analysing a subset of the given data to detect fraud. In addition, a comprehensive comparison and analysis of current and prospective fraud detection measures is presented. Random Forest-based classification models are applied to the dataset, and model performance is evaluated using graphical representations of precision and classification accuracy.
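    The evaluation step can be sketched on synthetic, deliberately imbalanced data; the class ratio, feature count and hyperparameters are assumptions, since the abstract does not specify the dataset.

```python
# Random forest scored on an imbalanced "fraud" problem (~3% positives).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=12, weights=[0.97],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=300, class_weight="balanced",
                             random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["normal", "fraud"]))
```

    On skewed data like this, precision and recall on the fraud class matter far more than raw accuracy, which is consistent with the study's choice to report precision alongside classification accuracy.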
    Dynamic control and performance evaluation of a microcontroller-based smart industrial heat extractor involves the implementation of control strategies and the assessment of performance under dynamic operating conditions. Performance evaluation aims to assess the effectiveness and efficiency of the microcontroller-based smart industrial heat extractor under dynamic conditions. Industrial heat extraction systems are often complex, involving multiple components, sensors, actuators, and control algorithms. Understanding and modelling the dynamic behavior of these systems can be challenging, especially when considering factors such as heat transfer rates, thermal delays, and interactions between different system elements. The effectiveness of dynamic control depends significantly on precise and dependable sensor measurements, and sensors deployed in industrial settings may encounter severe environmental conditions, resulting in possible inaccuracies, drift, or even malfunctions. The objective of this research is to propose a simulated methodology for verifying the efficacy of a microcontroller-driven intelligent heat extractor utilized in industrial settings. Executing experiments or tests within industrial environments can be costly, time-intensive and potentially hazardous; simulating the evaluation process allows the efficacy of a smart industrial heat extractor in practical industrial settings to be ascertained while mitigating those risks. The model designed and simulated in this work utilizes an integrated temperature sensor to determine the ambient temperature and transmits a signal to the Arduino UNO microcontroller when the sensor detects varying temperature ranges. The evaluation is performed by comparing the behavior and performance of the simulated system with predefined performance metrics.
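    A much-simplified version of such a simulation is sketched below; all constants are assumptions, and the thermostat-style model stands in for the richer thermal dynamics the abstract mentions.

```python
# Extractor switches on above a setpoint; the metric is time spent overheated.
AMBIENT = 24.0      # °C outside the enclosure
SETPOINT = 60.0     # extractor turns on above this temperature
HEAT_IN = 1.8       # °C gained per step from the industrial process
COOL_RATE = 3.0     # °C removed per step while the extractor runs
STEPS = 500

temperature, overheated_steps = 40.0, 0
for _ in range(STEPS):
    extractor_on = temperature > SETPOINT           # sensor -> control decision
    temperature += HEAT_IN - (COOL_RATE if extractor_on else 0.0)
    temperature = max(temperature, AMBIENT)
    overheated_steps += temperature > SETPOINT + 5  # predefined metric

print(f"time above setpoint + 5 °C: {overheated_steps / STEPS:.1%}")
```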
    The implementation of inventory optimization tools enables businesses to make precise inventory allocation decisions, such as categorizing distribution levels based on demand and constraints and improving demand forecasting. Inventory optimization is a vital concern for retail establishments. Supply chain management forecasts have frequently been unreliable and imprecise due to insufficient optimization strategies and inconsistent alignment of information technology with business strategy. This article presents a technique for formulating constraints and justifying demand forecasting optimization, implemented with a Branch and Bound and Dynamic Programming approach that is compatible with Expert System extensions. Effective administration of business processes entails fostering coordination, presenting a variety of stakeholder perspectives, and facilitating participation and collaboration; these factors are essential for achieving successful results. Using algorithmic models and rapid prototyping approaches, the research identifies the fundamental factors underlying demand fluctuations and forecasts the most beneficial outcomes for demand and supply chain management. The study presents a comprehensive inventory optimization strategy, built on the Branch and Bound technique and Dynamic Programming, that aims to improve inventory control throughout the entire supply chain.
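    The dynamic-programming half of the approach can be shown with a toy allocation: distribute a limited stock across stores to maximize expected profit. The profit table is invented; a branch-and-bound variant would add pruning bounds on top of the same recursion.

```python
# DP over (store, remaining stock); PROFIT[s][q] = profit of sending q units.
from functools import lru_cache

STOCK = 6
PROFIT = [[0, 5, 8, 10, 11, 11, 11],
          [0, 4, 9, 12, 13, 13, 13],
          [0, 6, 7, 8, 9, 10, 10]]

@lru_cache(maxsize=None)
def best(store: int, remaining: int) -> int:
    if store == len(PROFIT):
        return 0
    return max(PROFIT[store][q] + best(store + 1, remaining - q)
               for q in range(remaining + 1))

print("max expected profit:", best(0, STOCK))
```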
    As the endorsement of Enterprise Architecture (EA) modelling continues to grow in diversity and complexity, management of its schema, artefacts, semantics and relationships has become an important business concern. To maintain agility and flexibility within competitive markets, organizations have been compelled to explore ways of adjusting proactively to innovations, changes and complex events, including by using EA concepts to model business processes and strategies. The need to ensure appropriate validation of EA taxonomies has thus repeatedly been considered an essential requirement for these processes, in order to express business motivation and relate information systems to technological infrastructure. However, since many taxonomies deployed today use widespread and disparate modelling methodologies, adopting a generic validation approach remains a challenge. The proliferation of EA methodologies and perspectives has also led to intricacies in the formalization and...
    Enterprise Architecture (EA) has been defined as the organization of a system embodied in its components, their relationships to each other and to the environment, and the principles guiding its design and evolution (IEEE, 2000). An important characteristic of EA is thus to provide a holistic view of the enterprise, visualizing the relevant aspects of the business for specific stakeholders. However, one of the many concerns of this interest has been how to deal with the complex challenges of implementing the models while being able to validate their integrated components to ensure conformity with individual stakeholders' motivations. To achieve this, methodologies have been postulated that describe components in relation to their behavioral attributes, their impact on other elements in the domain, and their dependencies. Albeit, studies show that these taxonomies do not adequately address validation requirements (Lankhorst, 2013). This article analyzes the EA concepts of ArchiMate, focusing on the business and app...
    The proliferation of numerous and widespread import agents inhibits initiatives to engage in mass mechanization of agricultural enterprise. With a synergised import and mechanization policy, a competitive relationship can emerge that allows an import-dependent country to hedge strategically against price manipulation and accession. For these reasons, among other enabling factors, a synergy of food import and agricultural mechanization policies is recommended both as a flexible measure for remodelling changes in food security and as a means of sustainability in developing economies. This article presents ways in which this can be achieved and is structured as follows: section one places current food importation in Nigeria in perspective by discussing the extent and dominance of national corporations in global food markets; section two presents the root causes of food insecurity and the effects of trade and food aid policies as the underpinning legislation for food importation; s...
    Ontologies are an essential resource for knowledge exemplification. The ontology of a complex system can be intricate due to role relations between several concepts, diverse attributes and incongruent instances. In this paper, visual analytics clarification, based on different coordinated viewpoints for exploring diverse ontology facets, and an innovative deployment of queries to streamline traceability of abstractions are used to moderate the complexity of ontology visual representation. Though much research has delved into the transformation of Enterprise Architecture models to ontologies, visualization of ontology abstractions has not been exhaustively exemplified, especially in the context of system development. One major reason why this is important is the need to effectively align business processes with application frameworks within organizations, especially those with numerous disparate systems. A method for ensuring the synchronization of the models and ontologi...
    This article draws on multiple theoretical concepts and exploits abstractions from knowledge repositories on the rural and social network literatures to investigate rural agricultural practices with intelligent systems and technological trends. A qualitative research methodology is adopted to gain an understanding of underlying reasons, opinions, and motivations. The methodology also provides insights into the challenges of automating rural agricultural practice and helps to develop ideas and solutions for transforming the rural community. Findings from this work indicate that international food policies and agencies play a more significant role than governmental institutions and networks in driving innovation for rural agriculturists. Knowledge management and exchange among producers is limited and does not provide the bridge to formal implementation of intelligent systems. The findings also demonstrate how territorial contingent factors profile automation in the agricultural sector, and how they...
    Changing the way people work is difficult. The habits and culture of systems development teams are typically ingrained in the way they conceive and approach software solutions. Agile development is a software development process that leverages adaptive planning, rapid response to changing requirements, early delivery and continuous product improvement. The purpose of agile development is to minimize project failure through collaboration and bilateral interaction with users. However, agile projects come with a set of challenges and problems different from those faced by projects that follow traditional methodologies. For organizations and projects where experience can be used to plan a course of action with a good degree of certainty of a positive outcome, a traditional methodology may be more appropriate than an agile one. Agile methodologies are effective when the product details cannot be defined or agreed in advance with a realistic degree of accuracy; in that case, the situation calls for a collaborative environment between user and developer. This paper presents an implementation of the agile methodology in the complex and dynamic environment of human resource management. A tripartite case study of online recruitment, involving a governmental job pooling center, organizations seeking applicants and the job applicants themselves, is used to demonstrate the effectiveness of the agile methodology. The importance of this particular case is that it goes far beyond an employer or recruiter running a job recruitment portal: it involves government collaboration in a way that can provide a supportive environment that creates solutions to unemployment and stimulates economic growth and manpower development, using accurate and reliable statistics obtained from this type of project. Unemployment can induce precarious social ills such as the crime, corruption and underdevelopment prevalent in many societies. The lightweight approach adopted in this work defines its own processes for realizing the core principles of agile methods. The outcome exemplifies an effective client-centric approach to feature-driven development and knowledge sharing, fostering the principles of high-quality development, testing and collaboration within the enterprises.
    Many problems within the software industry today stem from a failure to recognize that system development for decision support belongs to a category of problems classified as complex or "wicked". Procedural paradigms that address this phenomenon therefore constitute areas of much exploration. Distinct from conventional problems, complex problems in information systems lack clarity in both their aims and solutions, lack definitive constructs, and are deficient in any inherent logic that distinctively indicates when desired goals have been achieved. This paper provides a discourse on complex problems in insurance categorization. Ontology and knowledge classification in insurance systems fall under this class of problems, as insurance policies commonly describe problems that are exacting to unravel due to incomplete, contradictory and persistently changing requirements. Because people live in very unpredictable cultures with extensive categories of risk trajectories, individuals at some point in life are commonly expected to be challenged by situations which hamper their privileges. Today, insurance businesses evolve either adversely or optimistically contingent on the kinds of solutions they proffer for complex problems, which may range from competition and management to pecuniary circumstances. This paper decomposes key functionalities of complex insurance processes to model a knowledge-based ontology, adopting standard Enterprise Architecture archetypes that address the intricacies of insurance information systems management. The relationships between classifications, perspectives and knowledge are examined to promote the organization of homogeneous entities intended for grouping procedures in decision support systems. The strengths and constraints of imperative categorization approaches are described in terms of their ability to ruminate on, discover, and apply contemporary expositions through modelling, so as to harmonize business goals with application infrastructures.
    As modelling of the enterprise continues to influence the way many organisations represent their business strategies and technologies, there is a commensurate growth in the knowledge base, with resolute lessons gained. This is despite criticism that Enterprise Architecture development should have started with the elaboration of an agreed architecture representation language in order to avoid today's perilous proposals. Many practitioners argue that since the advent of Enterprise Architecture (EA), much prominence has been placed distinctly on business process modelling and information technology infrastructure, with less emphasis on alignment and formalisation. Approaches such as top-down and bottom-up have been proffered as best practice for EA development. Without an extensible and comprehensive alignment strategy, it is difficult for an enterprise to evaluate the usefulness of architectures, as complex architectures are intricate and difficult for stakeholders to understand. This paper explores the adoption of EA models as an effective strategy for aligning them with business constraints and goals. It presents an approach for categorization and modelling of EA artefacts, focusing on motivation abstraction for the alignment of business processes with organizational goals. The paper proposes and exemplifies methods that segregate domain knowledge from operational knowledge and facilitate analysis of domain structures through formalised decomposition and systematic integration using the standard modelling notations of ArchiMate. This approach contrasts significantly with other common perspectives such as maturity matrices, the balanced scorecard and reference models, and espouses the use of interrogative constructs of EA models to confirm that a model meets the intrinsic goals defined by its motivation.

    And 13 more