Web Intelligence (WI) is an official journal of the Web Intelligence Consortium (WIC), an international organization dedicated to promoting collaborative scientific research and industrial development in the era of Web intelligence. WI seeks to collaborate with major societies and international conferences in the field. WI is a peer-reviewed journal, which publishes four issues a year, in both online and print form.
WI aims to achieve a multi-disciplinary balance between research advances in theories and methods usually associated with Collective Intelligence, Data Science, Human-Centric Computing, Knowledge Management, and Network Science. It is committed to publishing research that both deepens the understanding of the computational, logical, cognitive, physical, and social foundations of the future Web, and enables the development and application of technologies based on Web intelligence. The journal features high-quality, original research papers (including state-of-the-art reviews), brief papers, and letters in all theoretical and technology areas that make up the field of WI.
Abstract: Liver cancer is one of the leading causes of death worldwide. Manually identifying cancerous tissue is currently a challenging and time-consuming task. Segmentation of liver lesions in Computed Tomography (CT) scans can be used to assess tumor load, plan therapies, make predictions, and track the clinical response. In this paper, we propose a new technique for liver cancer classification from CT images. The method consists of four stages: pre-processing, segmentation, feature extraction and classification. In the initial stage, the input image is pre-processed for quality enhancement. The pre-processed output is then subjected to the segmentation phase, where an improved deep fuzzy clustering technique is applied for image segmentation. Subsequently, the segmented image is the input to the feature extraction phase, where the extracted features are the Improved Gabor Transitional Pattern, Grey-Level Co-occurrence Matrix (GLCM) features, statistical features and Convolutional Neural Network (CNN)-based features. Finally, the extracted features are subjected to the classification stage, where two classifiers, Bi-GRU and Deep Maxout, are used. In this phase, the Crossover Mutated COOT Optimization (CMCO) is applied to tune the weights, improving classification quality. The proposed technique achieves the best disease-identification accuracy: CMCO attains 95.58%, which is preferable to AO (92.16%), COA (89.38%), TSA (88.05%), AOA (92.05%) and COOT (91.95%).
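As a minimal sketch of one of the texture features named above, grey-level co-occurrence counts can be computed by tallying pairs of grey levels at a fixed pixel offset; the offset and grey-level count below are assumptions, not the paper's settings:

```python
from collections import Counter

def glcm(image, dx=1, dy=0):
    """Normalised co-occurrence probabilities of grey-level pairs at offset (dx, dy)."""
    counts = Counter()
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[(image[r][c], image[r2][c2])] += 1
    total = sum(counts.values())
    return {pair: n / total for pair, n in counts.items()}

def contrast(p):
    """GLCM contrast: (i - j)^2 weighted by co-occurrence probability."""
    return sum(((i - j) ** 2) * v for (i, j), v in p.items())

img = [[0, 0, 1],
       [0, 1, 2],
       [1, 2, 3]]
p = glcm(img)
print(round(contrast(p), 3))  # -> 0.833
```

Statistics such as contrast, energy, and homogeneity derived from this matrix are the usual GLCM texture descriptors.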
Abstract: The prevalence of violence against women and children is concerning, and the initial step is to raise awareness of the issue. Certain detection-based techniques are often not regarded as socially or culturally permissible. Designing and implementing effective approaches to secondary and supplementary prevention depends on characterization and assessment. Given the rising incidence of cases and the resulting mortalities, developing an early detection system is essential; violence against women and children is a human-health problem of pandemic proportions. Consequently, the focus of this survey is to analyze the existing methods used to identify violence in photos or films. Here, 50 research papers are reviewed, and the techniques employed, datasets, evaluation metrics, and publication years are analyzed. The study also identifies potential future research areas by examining the difficulties in identifying violence against women and children reported in the literature, for researchers to overcome in order to produce better results.
Keywords: Violence detection (VD), Deep learning (DL), Machine Learning (ML), Deep Neural Networks (DNNs), Support vector machine (SVM)
Abstract: Stock market forecasting remains a difficult problem in economics due to its highly stochastic nature. An expert system of this kind aids investors in making investment decisions about a particular company. Due to the complexity of the stock market, a single data source is insufficient to reflect all of the variables that influence stock fluctuations. Moreover, predicting stock market movement is a challenging undertaking that requires extensive data analysis, particularly from a big data perspective. To address these problems and produce a feasible solution, appropriate statistical models and artificially intelligent algorithms are needed. This paper proposes a novel stock market prediction approach with four stages: preprocessing, feature extraction, improved feature-level fusion, and prediction. The input data, considered under the big data perspective, first passes through a preprocessing step in which stock, news, and Twitter data (related to the COVID-19 epidemic) are processed. The pre-processed data then undergo feature extraction: improved aspect-based lexicon, PMI, and n-gram-based features are derived from the news and Twitter data, while technical-indicator-based features are derived from the stock data. The improved feature-level fusion phase is then applied to the extracted features. Ensemble classifiers, comprising DBN, CNN, and DRN, are proposed for the prediction phase. Additionally, a SI-MRFO model is suggested to enhance the efficiency of the prediction model by adjusting the best classifier weights. Finally, the SI-MRFO model's effectiveness is compared with existing models with regard to MAE, MAPE, MSE and MSLE. SI-MRFO achieves the minimum MAE at the 90th learning percentage, approximately 0.015, while the other models record higher values.
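One of the text features mentioned above, pointwise mutual information (PMI), measures how much more often two words co-occur than chance would predict. A minimal sketch over a toy document collection (the documents and word pair are illustrative, not from the paper's corpus):

```python
import math

def pmi(docs, w1, w2):
    """PMI of two words over a collection of documents (each a set of words)."""
    n = len(docs)
    p1 = sum(1 for d in docs if w1 in d) / n
    p2 = sum(1 for d in docs if w2 in d) / n
    p12 = sum(1 for d in docs if w1 in d and w2 in d) / n
    if p12 == 0:
        return float("-inf")  # never co-occur
    return math.log2(p12 / (p1 * p2))

docs = [{"market", "rally"}, {"market", "drop"},
        {"virus", "drop"}, {"market", "rally"}]
print(round(pmi(docs, "market", "rally"), 3))  # -> 0.415
```

A positive score indicates the pair co-occurs more often than independence would suggest, which is why PMI is a common sentiment/association feature for news and tweet text.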
Abstract: Cancers are genetically diversified, so anticancer treatments have different levels of efficacy on different people due to genetic differences. The main objective of this work is to predict anticancer drug efficacy for colorectal cancer patients, to reduce mortality rates and support patients' immune response. This paper proposes a novel anticancer drug efficacy system for colorectal cancer patients. The input gene data is normalized with the Min-Max normalization technique, which rescales data measured on distinct scales. Subsequently, an improved entropy-based feature is proposed to evaluate the uncertainty distribution of the data, inducing a weight to overcome the issue of computational complexity. Along with this feature, a correlation-based feature and statistical features are also retrieved. A Recursive Feature Elimination with Hybrid Machine Learning (RFEHML) mechanism is then proposed for selecting the appropriate feature set, eliminating features recursively with the aid of hybrid machine learning strategies that combine decision trees and logistic regression. The Gini impurity is employed for ranking features, retaining those with the highest importance scores and eliminating those with the lowest. Further, a hybrid model is proposed for predicting drug efficacy with the trained feature set. The hybrid model comprises Long Short-Term Memory (LSTM) and Updated Rectified Linear Unit-Deep Convolutional Neural Network (UReLU-DCNN) models, in which the DCNN is modified by updating the activation function at the fully connected layer. Consequently, the learned features predict anticancer drug efficacy in colorectal cancer patients by determining whether the patient is a responder or non-responder to the drug. Finally, the performance of the proposed RFEHML model is compared with other traditional approaches. The developed method attains higher accuracy at each learning percentage: 60LP = 92.48%, 70LP = 94.28%, 80LP = 95.24%, and 90LP = 96.86%.
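Two of the building blocks named above are standard and easy to sketch; the toy values below are illustrative, not the paper's data:

```python
def min_max(values, lo=0.0, hi=1.0):
    """Min-max normalisation: rescale values linearly into [lo, hi]."""
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

def gini(labels):
    """Gini impurity of a label list; lower means purer, used to rank features."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

print(min_max([2.0, 4.0, 6.0]))              # -> [0.0, 0.5, 1.0]
print(round(gini(["R", "R", "N", "N"]), 2))  # -> 0.5 (responder/non-responder split)
```

In recursive feature elimination, a model is repeatedly refit and the feature with the lowest importance (here, the smallest Gini-based importance) is dropped until the desired feature count remains.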
Keywords: Drug efficacy prediction, Recursive Feature Elimination with Hybrid Machine Learning mechanism, Gini impurity, Updated Rectified Linear Unit-Deep Convolutional Neural Network, Colorectal Cancer
Abstract: Nowadays, the Internet of Things (IoT) plays a vital role in every industry, including agriculture, due to its widespread and easy integration. Agricultural methods are incorporated with IoT technologies for significant growth in agricultural fields. IoT is utilized to support farmers in using their resources effectively and to support decision-making systems with better field-monitoring techniques. The data collected from IoT-based agricultural systems is highly vulnerable to attack; hence, to address this issue, it is necessary to employ an authentication scheme. In this paper, the Auth Key_Deep Convolutional Neural Network (Auth Key_DCNN) is designed to promote secure data sharing in IoT-enabled agriculture systems. The different entities, namely sensors, a Private Key Generator (PKG), a controller, and a data user, are initially considered, and their parameters are randomly initialized. The entities are registered, and a secret key is generated in the PKG using the DCNN. Transmitted data is encrypted in the data protection phase, which protects data exchanged between the controller and the user. Additionally, the performance of the designed model is estimated; the experimental results reveal that the Auth Key_DCNN model records superior performance, with a minimal computational cost of 142.56, memory usage of 49.5 MB, and computational time of 1.34 sec.
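The abstract does not specify how the DCNN derives keys, so the sketch below substitutes a conventional HMAC-based derivation (an HKDF-like extract-and-expand) purely to illustrate the role the PKG plays: turning a master secret plus an entity identity into a distinct per-entity key. All names and inputs here are hypothetical.

```python
import hashlib
import hmac
import os

def derive_key(entity_id: bytes, master_secret: bytes, salt: bytes) -> bytes:
    """Per-entity key derivation; a stand-in for the paper's DCNN-based key
    generation, which is not specified in the abstract."""
    # Extract: concentrate the master secret into a pseudorandom key
    prk = hmac.new(salt, master_secret, hashlib.sha256).digest()
    # Expand: bind the key to the registering entity's identity
    return hmac.new(prk, entity_id, hashlib.sha256).digest()

salt = os.urandom(16)                     # public, fresh per registration round
k_sensor = derive_key(b"sensor-01", b"pkg-master-secret", salt)
k_ctrl = derive_key(b"controller-01", b"pkg-master-secret", salt)
print(len(k_sensor), k_sensor != k_ctrl)  # 32-byte keys, distinct per entity
```

The design point is that each entity ends up with its own symmetric key without the master secret ever leaving the PKG.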
Keywords: Hashing function, data encryption, key generation, authentication, deep convolutional neural network
Abstract: Supply chain management (SCM) is a significant focus in various corporate circumstances. SCM designs and monitors numerous tasks across phases such as allocation, creation, product sourcing, and warehousing. From this perspective, the privacy of the data flowing among producers, suppliers, and customers is important for ensuring market accountability. This work aims to develop a novel Improved Digital Navigator Assessment (DNA)-based Self Improved Pelican Optimization Algorithm (IDNA-based SIPOA) model for secured data transmission in SCM via blockchain. An improved DNA cryptosystem is used for data preservation. The original message is encrypted with the Improved Advanced Encryption Standard (IAES), and optimal key generation is performed by the proposed SIPOA algorithm. The efficiency of the adopted model is analyzed against conventional methods with regard to the security of data exchange in SCM. The proposed IDNA-based SIPOA obtains the lowest value of 0.71 for 40% ciphertext, while BWO yields 0.79, DOA 0.77, TWOA 0.84, BOA 0.83, POA 0.86, SDSM 0.88, DNASF 0.82 and FSA-SLnO 0.78.
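DNA cryptosystems commonly represent binary data as nucleotide strands by mapping two bits to one base. The abstract does not give the paper's mapping, so the A/C/G/T assignment below is an assumption; the sketch only shows the encoding layer that would wrap the IAES ciphertext:

```python
BASES = "ACGT"  # 2 bits per nucleotide; this particular assignment is assumed

def dna_encode(data: bytes) -> str:
    """Encode bytes as a DNA strand, high-order bit pair first."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):
            out.append(BASES[(b >> shift) & 0b11])
    return "".join(out)

def dna_decode(strand: str) -> bytes:
    """Invert dna_encode: four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        b = 0
        for ch in strand[i:i + 4]:
            b = (b << 2) | BASES.index(ch)
        out.append(b)
    return bytes(out)

msg = b"SCM"  # in the paper's pipeline this would be IAES ciphertext
strand = dna_encode(msg)
print(strand, dna_decode(strand) == msg)  # CCATCAATCATC True
```

The encoding itself adds no secrecy; security rests on the underlying cipher and key generation, with the DNA layer providing the representation the cryptosystem operates on.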
Keywords: Blockchain, supply chain management, DNA cryptosystem, data preservation, IAES encryption
Abstract: In this competitive world, companies should sustain good relationships with their consumers. A customer relationship management (CRM) program can improve a company's customer satisfaction; to satisfy customer needs, different processes and techniques are established to make CRM more effective. This research is proposed to determine the relationship between customer loyalty and retention, and it also examines the impact of CRM on customer satisfaction. The target population of this study is customers of the tourism industry in India (n = 300). Regression analysis is then carried out in order to discover the links between the variables. The results show that service quality and employee behavior have a significant positive relationship with customer needs and satisfaction. To keep customers satisfied and retain them, CRM should be strong and reliable with consumers. CRM plays a vital role in increasing market share and productivity, improving in-depth customer knowledge and customer satisfaction, and increasing consumer loyalty, giving the company a clear view of who its customers are, what they need, and how to satisfy their needs and wants.
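The regression analysis referred to above fits a line relating a predictor to an outcome. A minimal ordinary-least-squares sketch with hypothetical survey scores (the data below is invented for illustration, not the study's n = 300 sample):

```python
def linreg(x, y):
    """Ordinary least squares for y = a + b*x with one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical 1-5 Likert scores: service quality vs. customer satisfaction
quality = [2, 3, 3, 4, 5]
satisfaction = [2, 3, 4, 4, 5]
a, b = linreg(quality, satisfaction)
print(round(a, 2), round(b, 2))  # intercept 0.46, slope 0.92
```

A positive slope is the sense in which "service quality has a positive relationship with satisfaction"; a full analysis would also report significance tests, which this sketch omits.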
Abstract: Underground crop leaf disease classification is a significant area in the agriculture sector, as these crops are a major source of carbohydrates for human food. A disease-ridden plant can threaten the availability of food for millions of people. Researchers have tried to use computer vision (CV) to develop image classification algorithms that warn farmers, from photographs of a plant's leaves, whether the crop is diseased. This work develops a new DHCLDC model for underground crop leaf disease classification that considers plants like cassava, potato and groundnut. Here, preprocessing is done by employing a median filter, followed by segmentation using an Improved U-Net (U-Net with nested convolutional blocks). The extracted features comprise color features, shape features and improved multi-texton (MT) features. Finally, a hybrid classifier (HC) model is developed for DHCLDC, comprising CNN and LSTM models. The outputs from the HC (CNN + LSTM) are then given to improved score-level fusion (SLF), from which the final detection results are attained. Finally, simulations are done with 3 datasets to show the superiority of the HC (CNN + LSTM)-based DHCLDC model. The specificity of HC (CNN + LSTM) is high, at 95.41, compared with DBN, NN, RF, KNN, CNN, LSTM, DCNN, and SVM.
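The median-filter preprocessing step replaces each pixel with the median of its neighbourhood, which suppresses salt-and-pepper noise while preserving edges. A minimal 3x3 sketch with edge clamping (window size and border handling are assumptions):

```python
from statistics import median

def median_filter(img, k=3):
    """k x k median filter over a 2-D grid, clamping indices at the borders."""
    r = k // 2
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = [img[min(max(i + di, 0), rows - 1)]
                         [min(max(j + dj, 0), cols - 1)]
                      for di in range(-r, r + 1) for dj in range(-r, r + 1)]
            out[i][j] = median(window)
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],  # isolated salt-noise pixel
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # -> 10, the outlier is removed
```

Because the median ignores extreme values, the isolated 255 disappears while uniform regions pass through unchanged, which is why this filter is a common first stage before segmentation.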
Abstract: Conversational agents (CAs) have been widely used in many domains, such as healthcare, education, and business. One main category of CAs is task-oriented CAs, which aim to help users complete a set of specific tasks. However, task-oriented CAs can fail to answer the user's question, which can lead to a breakdown in the dialogue (when it is not possible to complete a conversation with a CA). Breakdown detection is an essential task for developing better CAs. Several related studies have focused on breakdown detection using different sets of features, for example, topic transition, word-based similarity, and clustering; however, existing studies develop features mainly from the system's outputs or the user's inputs, whereas features can be extracted from both sides, as well as from the interaction between them. Therefore, in this work, we developed a new supervised fusion machine learning (ML) model that combines the predictions of two ML algorithms for breakdown detection in CA service systems. We developed features from different groups focusing on both the user input and the system response, and then selected the optimal combined features. The features are based on sentence similarity, sentiment features, and count-based features. The fusion model is mainly based on the two best-performing single classifiers (SVM and RF). We explore several single ML algorithms using different sets of features and the combined features. To verify the effectiveness of the proposed fusion model, we compared it against baseline methods using four sets of data. We conclude that the proposed fusion model with the combined features outperforms the baselines and all other models in terms of prediction accuracy and F-score.
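One simple way to combine two classifiers' predictions, as the fusion model above does with SVM and RF, is weighted probability averaging (soft voting). The equal weights and threshold below are assumptions, not the paper's tuned values:

```python
def fuse(p_svm: float, p_rf: float, w_svm: float = 0.5, w_rf: float = 0.5) -> float:
    """Weighted average of the two classifiers' breakdown probabilities."""
    return w_svm * p_svm + w_rf * p_rf

def predict(p_svm: float, p_rf: float, threshold: float = 0.5) -> str:
    """Label a dialogue turn from the fused probability."""
    return "breakdown" if fuse(p_svm, p_rf) >= threshold else "ok"

# p_svm / p_rf would come from models trained on sentence-similarity,
# sentiment, and count-based features for a given turn
print(predict(0.7, 0.4))  # fused 0.55 -> breakdown
print(predict(0.3, 0.4))  # fused 0.35 -> ok
```

Soft voting lets a confident classifier outvote a hesitant one on a given turn, which is often where fusion gains over either single model come from; a stacked meta-classifier is the usual heavier-weight alternative.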