Search Results (120)

Search Parameters:
Keywords = Stack4Things

21 pages, 1339 KiB  
Article
Stacking Ensemble Deep Learning for Real-Time Intrusion Detection in IoMT Environments
by Easa Alalwany, Bader Alsharif, Yazeed Alotaibi, Abdullah Alfahaid, Imad Mahgoub and Mohammad Ilyas
Sensors 2025, 25(3), 624; https://doi.org/10.3390/s25030624 - 22 Jan 2025
Viewed by 337
Abstract
The Internet of Medical Things (IoMT) is revolutionizing healthcare by enabling advanced patient care through interconnected medical devices and systems. However, its critical role and sensitive data make it a prime target for cyber threats, requiring the implementation of effective security solutions. This paper presents a novel intrusion detection system (IDS) specifically designed for IoMT networks. The proposed IDS leverages machine learning (ML) and deep learning (DL) techniques, employing a stacking ensemble method to enhance detection accuracy by integrating the strengths of multiple classifiers. To ensure real-time performance, the IDS is implemented within a Kappa Architecture framework, enabling continuous processing of IoMT data streams. The system effectively detects and classifies a wide range of cyberattacks, including ARP spoofing, DoS, Smurf, and Port Scan, achieving an outstanding detection accuracy of 0.991 in binary classification and 0.993 in multi-class classification. This research highlights the potential of combining advanced ML and DL methods with ensemble learning to address the unique cybersecurity challenges of IoMT systems, providing a reliable and scalable solution for safeguarding healthcare services.
(This article belongs to the Special Issue Sensors in mHealth Applications)
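For readers who want to experiment with the core idea, the sketch below shows a generic two-level stacking ensemble for binary intrusion detection in scikit-learn. The base learners, meta-learner, and synthetic data are illustrative assumptions; the paper's actual classifier set and its Kappa Architecture streaming layer are not reproduced here.

```python
# Minimal stacking-ensemble sketch for binary intrusion detection.
# Not the authors' implementation: base learners, meta-learner, and
# the synthetic dataset below are placeholder assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for IoMT traffic features (benign = 0, attack = 1).
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Level-0 learners feed out-of-fold predictions to a level-1 meta-learner;
# that second level is what distinguishes stacking from voting/bagging.
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,  # cross-validated predictions avoid leaking training labels
)
stack.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, stack.predict(X_te)))
```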

19 pages, 2816 KiB  
Article
An LDDoS Attack Detection Method Based on Behavioral Characteristics and Stacking Mechanism
by Junwei Ye, Zhixuan Wang, Jichen Yang, Chunan Wang and Chunyu Zhang
IoT 2025, 6(1), 7; https://doi.org/10.3390/iot6010007 - 21 Jan 2025
Viewed by 281
Abstract
The Internet of Things has grown rapidly, and the number of IoT devices has reached the order of tens of billions. Most IoT devices are vulnerable to attacks, especially DDoS (Distributed Denial of Service) attacks. DDoS attacks can easily damage IoT devices, and an LDDoS attack targets hardware resources through small streams of very slow traffic. Compared with traditional large-scale DDoS attacks, LDDoS attacks require less bandwidth and generate traffic similar to that of normal users, making them difficult to distinguish. This article uses the CICIoT2023 dataset, extracting information from the behavior of low-rate attacks as features and applying a stacking mechanism to improve recognition. A detection method based on behavioral characteristics and a stacking mechanism is proposed, which can accurately detect LDDoS. Experimental results show that the scheme's recognition rate for low-rate attacks reaches 0.99, and other indicators such as accuracy, recall, and F1 score are all better than those of other LDDoS detection methods. While DDoS research is relatively mature, with many related results, there is little research on LDDoS detection alone. This paper focuses on the investigation and analysis of LDDoS attacks within DDoS attacks and derives feasible detection methods.

25 pages, 2548 KiB  
Article
Efficient Real-Time Anomaly Detection in IoT Networks Using One-Class Autoencoder and Deep Neural Network
by Aya G. Ayad, Mostafa M. El-Gayar, Noha A. Hikal and Nehal A. Sakr
Electronics 2025, 14(1), 104; https://doi.org/10.3390/electronics14010104 - 30 Dec 2024
Viewed by 587
Abstract
In the face of growing Internet of Things (IoT) security challenges, traditional Intrusion Detection Systems (IDSs) fall short due to IoT devices’ unique characteristics and constraints. This paper presents an effective, lightweight detection model that strengthens IoT security by addressing the high dimensionality of IoT data. This model merges an asymmetric stacked autoencoder with a Deep Neural Network (DNN), applying one-class learning. It achieves a high detection rate with minimal false positives in a short time. Compared with state-of-the-art approaches based on the BoT-IoT dataset, it shows a higher detection rate of up to 96.27% in 0.27 s. Also, the model achieves an accuracy of 99.99%, a precision of 99.21%, and an F1-score of 97.69%. These results demonstrate the effectiveness and significance of the proposed model, confirming its potential for reliable deployment in real IoT security problems.
(This article belongs to the Special Issue AI in Cybersecurity, 2nd Edition)
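The one-class idea at the heart of this model can be prototyped in a few lines: train an autoencoder only on benign traffic, then flag samples whose reconstruction error is high. The sketch below is a rough approximation using a scikit-learn MLP as the autoencoder; the paper's asymmetric stacked autoencoder and DNN classifier are not reproduced, and the architecture, data, and threshold rule are assumptions.

```python
# One-class anomaly detection sketch: an autoencoder trained only on
# benign samples; high reconstruction error signals an anomaly.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)
benign = rng.normal(0.0, 1.0, size=(2000, 20))  # stand-in for normal traffic
attack = rng.normal(3.0, 1.0, size=(200, 20))   # stand-in for attack traffic

scaler = MinMaxScaler().fit(benign)
benign_s, attack_s = scaler.transform(benign), scaler.transform(attack)

# An MLP trained to reproduce its own input acts as a simple autoencoder;
# the narrow middle layer forces a compressed representation.
ae = MLPRegressor(hidden_layer_sizes=(16, 4, 16), max_iter=500, random_state=0)
ae.fit(benign_s, benign_s)

def recon_error(X):
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

# One-class decision: threshold taken from benign errors only.
tau = np.percentile(recon_error(benign_s), 99)
print("attack detection rate:", np.mean(recon_error(attack_s) > tau))
```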

21 pages, 5660 KiB  
Article
EWAIS: An Ensemble Learning and Explainable AI Approach for Water Quality Classification Toward IoT-Enabled Systems
by Nermeen Gamal Rezk, Samah Alshathri, Amged Sayed and Ezz El-Din Hemdan
Processes 2024, 12(12), 2771; https://doi.org/10.3390/pr12122771 - 5 Dec 2024
Viewed by 780
Abstract
In the context of smart cities with advanced Internet of Things (IoT) systems, ensuring the sustainability and safety of freshwater resources is pivotal for public health and urban resilience. This study introduces EWAIS (Ensemble Learning and Explainable AI System), a novel framework designed for the smart monitoring and assessment of water quality. Leveraging the strengths of Ensemble Learning models and Explainable Artificial Intelligence (XAI), EWAIS not only enhances the prediction accuracy of water quality but also provides transparent insights into the factors influencing these predictions. EWAIS integrates multiple Ensemble Learning models—Extra Trees Classifier (ETC), K-Nearest Neighbors (KNN), AdaBoost Classifier, decision tree (DT), Stacked Ensemble, and Voting Ensemble Learning (VEL)—to classify water as drinkable or non-drinkable. The system incorporates advanced techniques for handling missing data and statistical analysis, ensuring robust performance even in complex urban datasets. To address the opacity of traditional Machine Learning models, EWAIS employs XAI methods such as SHAP and LIME, generating intuitive visual explanations like force plots, summary plots, dependency plots, and decision plots. The system achieves high predictive performance, with the VEL model reaching an accuracy of 0.89 and an F1-Score of 0.85, alongside precision and recall scores of 0.85 and 0.86, respectively. These results demonstrate the proposed framework’s capability to deliver both accurate water quality predictions and actionable insights for decision-makers. By providing a transparent and interpretable monitoring system, EWAIS supports informed water management strategies, contributing to the sustainability and well-being of urban populations. This framework has been validated using controlled datasets, with IoT implementation suggested to enhance water quality monitoring in smart city environments.
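As a rough illustration of the XAI step, the sketch below attaches SHAP explanations to a tree-based potability classifier. The feature names, data, and single random-forest model are hypothetical placeholders standing in for the paper's ensemble members and real dataset.

```python
# SHAP explainability sketch for a water-potability classifier.
# Features and data are synthetic placeholders, not the paper's dataset.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
features = ["ph", "hardness", "turbidity", "sulfate"]  # hypothetical names
X = pd.DataFrame(rng.normal(size=(500, 4)), columns=features)
y = (X["ph"].abs() + 0.5 * X["turbidity"] > 1.2).astype(int)  # toy label

model = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)

# TreeExplainer computes per-feature contributions for each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Older SHAP versions return a list (one array per class); newer versions
# return one (n_samples, n_features, n_classes) array.
vals = shap_values[1] if isinstance(shap_values, list) else shap_values[..., 1]

# Global view: which features push predictions toward "non-drinkable".
shap.summary_plot(vals, X, show=False)
```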

22 pages, 4876 KiB  
Article
Innovative Ghost Channel Spatial Attention Network with Adaptive Activation for Efficient Rice Disease Identification
by Yang Zhou, Yang Yang, Dongze Wang, Yuting Zhai, Haoxu Li and Yanlei Xu
Agronomy 2024, 14(12), 2869; https://doi.org/10.3390/agronomy14122869 - 1 Dec 2024
Viewed by 692
Abstract
To address the computational complexity and deployment challenges of traditional convolutional neural networks in rice disease identification, this paper proposes an efficient and lightweight model: Ghost Channel Spatial Attention ShuffleNet with Mish-ReLU Adaptive Activation Function (GCA-MiRaNet). Based on ShuffleNet V2, we effectively reduced the model’s parameter count by streamlining convolutional layers, decreasing stacking depth, and optimizing output channels. Additionally, the model incorporates the Ghost Module as a replacement for traditional 1 × 1 convolutions, further reducing computational overhead. Innovatively, we introduce a Channel Spatial Attention Mechanism (CSAM) that significantly enhances feature extraction and generalization for rice disease detection. By combining the advantages of Mish and ReLU, we designed the Mish-ReLU Adaptive Activation Function (MAAF), enhancing the model’s generalization capacity and convergence speed. Through transfer learning and ElasticNet regularization, the model’s accuracy has notably improved while effectively avoiding overfitting. Extensive experimental results indicate that GCA-MiRaNet attains a precision of 94.76% on the rice disease dataset, with a 95.38% reduction in model parameters and a compact size of only 0.4 MB. Compared to traditional models such as ResNet50 and EfficientNet V2, GCA-MiRaNet demonstrates significant advantages in overall performance, especially on embedded devices. This model not only enables efficient and accurate real-time disease monitoring but also provides a viable solution for rice field protection drones and Internet of Things management systems, advancing the process of contemporary agricultural smart management.
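The Ghost Module the paper adopts originates in GhostNet: a full convolution produces only a subset of the output channels, and cheap depthwise operations generate the rest. Below is a generic PyTorch sketch of that building block, assuming a ratio of 2; it is not the paper's exact GCA-MiRaNet configuration.

```python
# Generic GhostNet-style Ghost module (illustrative, not GCA-MiRaNet's
# exact block). Half the output channels come from a 1x1 conv, the other
# half from a cheap depthwise conv applied to that output.
import torch
import torch.nn as nn

class GhostModule(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, ratio: int = 2):
        super().__init__()
        primary_ch = out_ch // ratio      # "intrinsic" maps via full conv
        cheap_ch = out_ch - primary_ch    # "ghost" maps via depthwise conv
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, kernel_size=1, bias=False),
            nn.BatchNorm2d(primary_ch),
            nn.ReLU(inplace=True),
        )
        self.cheap = nn.Sequential(
            nn.Conv2d(primary_ch, cheap_ch, kernel_size=3, padding=1,
                      groups=primary_ch, bias=False),  # depthwise
            nn.BatchNorm2d(cheap_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)

# Drop-in replacement for a 1x1 conv expanding 64 -> 128 channels.
x = torch.randn(1, 64, 56, 56)
print(GhostModule(64, 128)(x).shape)  # torch.Size([1, 128, 56, 56])
```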

31 pages, 4839 KiB  
Article
Earthquake Prediction and Alert System Using IoT Infrastructure and Cloud-Based Environmental Data Analysis
by Cosmina-Mihaela Rosca and Adrian Stancu
Appl. Sci. 2024, 14(22), 10169; https://doi.org/10.3390/app142210169 - 6 Nov 2024
Viewed by 1878
Abstract
Earthquakes are one of the most life-threatening natural phenomena, and their prediction is of constant concern among scientists. The study proposes that abrupt fluctuations in weather parameter values may influence the occurrence of shallow seismic events, and it develops an innovative concept that combines historical meteorological and seismic data collection to predict potential earthquakes. A machine learning (ML) model utilizing the ML.NET framework was designed and implemented. An analysis was undertaken to identify which modeling approach, value prediction or data classification, performs better in forecasting seismic events. The model was trained on a dataset of 8766 records corresponding to the period from 1 January 2001 to 5 October 2024. The achieved accuracy of the model was 95.65% for earthquake prediction based on weather conditions in the Vrancea region, Romania. The authors proposed a unique alerting algorithm and conducted a case study that evaluates multiple predictive models, varying parameters, and methods to identify the most effective model for seismic event prediction in specific meteorological conditions. The findings demonstrate the potential of combining Internet of Things (IoT)-based environmental monitoring with AI to improve earthquake prediction accuracy and preparedness. An IoT-based application was developed using C# with the ASP.NET framework to enhance earthquake prediction and public warning capabilities, leveraging Azure cloud infrastructure. The authors also created a hardware prototype for real-time earthquake alerting, integrating the M5Stack platform with ESP32 and MPU-6050 sensors for validation. The testing phase and its results illustrate the proposed methodology under various scenarios.
(This article belongs to the Special Issue Machine Learning Applications in Seismology: 2nd Edition)

14 pages, 2641 KiB  
Article
Vacuum Filtration-Coated Silver Electrodes Coupled with Stacked Conductive Multi-Walled Carbon Nanotubes/Mulberry Paper Sensing Layers for a Highly Sensitive and Wide-Range Flexible Pressure Sensor
by Guanhai Yan, Dongrui Dang, Sheng Chang, Xuefeng Zhang, Jinhua Zhang and Zhengdong Wang
Micromachines 2024, 15(11), 1306; https://doi.org/10.3390/mi15111306 - 28 Oct 2024
Viewed by 978
Abstract
Flexible pressure sensors based on paper have attracted considerable attention owing to their good performance, low cost, and environmental friendliness. However, effectively expanding the detection range of paper-based sensors with high sensitivities is still a challenge. Herein, we present a paper-based resistive pressure sensor with a sandwich structure consisting of two electrodes and three sensing layers. Silver nanowires were dispersed and deposited on a filter paper substrate using the vacuum filtration coating method to prepare the electrodes. The sensing layers were fabricated by coating carbon nanotubes onto a mulberry paper substrate. Waterborne polyurethane was introduced in the process of preparing the sensing layers to enhance the strength of the interface between the carbon nanotubes and the mulberry paper substrate. Therefore, the designed sensor exhibits a good sensing performance by virtue of the rational structure design and proper material selection. Specifically, the rough surfaces of the sensing layers, the porous conductive network of silver nanowires on the electrodes, and the multilayer stacked structure of the sensor collaboratively increase the change in the surface contact area under a pressure load, which improves the sensitivity and extends the sensing range simultaneously. Consequently, the designed sensor exhibits a high sensitivity (up to 6.26 kPa−1), a wide measurement range (1000 kPa), a low detection limit (~1 Pa), and excellent stability (1000 cycles). All these advantages guarantee that the sensor has potential for applications in smart wearable devices and the Internet of Things.

27 pages, 1948 KiB  
Article
A Cross-Layer Approach to Analyzing Energy Consumption and Lifetime of a Wireless Sensor Node
by Fernando Ojeda, Diego Mendez, Arturo Fajardo, Maximilian Gottfried Becker and Frank Ellinger
J. Sens. Actuator Netw. 2024, 13(5), 56; https://doi.org/10.3390/jsan13050056 - 19 Sep 2024
Viewed by 3440
Abstract
Several wireless communication technologies, including Wireless Sensor Networks (WSNs), are essential for Internet of Things (IoT) applications. WSNs employ a layered framework to govern data exchanges between sender and recipient, which facilitates the establishment of rules and standards. However, in this conventional framework, data sharing is limited to directly adjacent layers; this allows manufacturers to develop proprietary protocols but impedes WSN optimization, such as minimizing energy consumption, because non-adjacent layers also affect network performance. A Cross-Layer (CL) framework addresses implementation, modeling, and design challenges in IoT systems by allowing unrestricted data and parameter sharing between non-stacked layers. This holistic approach captures system dynamics, enabling network design optimization to address IoT network challenges. This paper introduces a novel CL modeling methodology for wireless communication systems, which is applied in two case studies to develop models for estimating energy consumption metrics, including node and network lifetime. Each case study validates the resulting model through experimental tests, demonstrating high accuracy with less than 3% error.
(This article belongs to the Section Communications and Networking)
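A cross-layer energy model ultimately reduces to estimates like the one sketched below: average current as a duty-cycle-weighted mix of active and sleep states, then node lifetime from battery capacity. The current draws and duty cycle are illustrative assumptions, not values from the paper's case studies.

```python
# Back-of-the-envelope node lifetime estimate of the kind a cross-layer
# energy model produces. All numbers below are illustrative assumptions.

battery_mAh = 2400.0   # e.g., two AA cells
i_active_mA = 20.0     # radio on (tx/rx)
i_sleep_mA = 0.005     # deep sleep
duty_cycle = 0.01      # fraction of time the radio is active

# Average current is the duty-cycle-weighted mix of the two states.
i_avg_mA = duty_cycle * i_active_mA + (1 - duty_cycle) * i_sleep_mA

lifetime_h = battery_mAh / i_avg_mA
print(f"avg current: {i_avg_mA:.3f} mA -> lifetime ~ {lifetime_h / 24:.0f} days")
```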

23 pages, 1151 KiB  
Article
Enhancing Cybersecurity in Healthcare: Evaluating Ensemble Learning Models for Intrusion Detection in the Internet of Medical Things
by Theyab Alsolami, Bader Alsharif and Mohammad Ilyas
Sensors 2024, 24(18), 5937; https://doi.org/10.3390/s24185937 - 13 Sep 2024
Cited by 2 | Viewed by 2357
Abstract
This study investigates the efficacy of machine learning models for intrusion detection in the Internet of Medical Things, aiming to enhance cybersecurity defenses and protect sensitive healthcare data. The analysis focuses on evaluating the performance of ensemble learning algorithms, specifically Stacking, Bagging, and Boosting, using Random Forest and Support Vector Machines as base models on the WUSTL-EHMS-2020 dataset. Through a comprehensive examination of performance metrics such as accuracy, precision, recall, and F1-score, Stacking demonstrates exceptional accuracy and reliability in detecting and classifying cyber attack incidents with an accuracy rate of 98.88%. Bagging is ranked second, with an accuracy rate of 97.83%, while Boosting yielded the lowest accuracy rate of 88.68%.
(This article belongs to the Special Issue Recent Trends and Advances in Sensors Cybersecurity)

37 pages, 18482 KiB  
Article
Active Queue Management in L4S with Asynchronous Advantage Actor-Critic: A FreeBSD Networking Stack Perspective
by Deol Satish, Jonathan Kua and Shiva Raj Pokhrel
Future Internet 2024, 16(8), 265; https://doi.org/10.3390/fi16080265 - 25 Jul 2024
Cited by 1 | Viewed by 1283
Abstract
Bufferbloat is one of the leading causes of high data transmission latency and jitter on the Internet, which severely impacts the performance of low-latency interactive applications such as online streaming, cloud-based gaming/applications, Internet of Things (IoT) applications, voice over IP (VoIP), real-time video conferencing, and so forth. There is currently a pressing need for developing Transmission Control Protocol (TCP) congestion control algorithms and bottleneck queue management schemes that can collaboratively control/reduce end-to-end latency, thus ensuring optimal quality of service (QoS) and quality of experience (QoE) for users. This paper introduces a novel solution by experimentally integrating the low latency, low loss, and scalable throughput (L4S) architecture (specified by the IETF in RFC 9330) into the FreeBSD networking stack with the asynchronous advantage actor-critic (A3C) reinforcement learning algorithm. The first phase involves incorporating a modified dual-queue coupled active queue management (AQM) system for L4S into the FreeBSD networking stack, enhancing queue management and mitigating latency and packet loss. The second phase employs A3C to adjust and fine-tune the system performance dynamically. Finally, we evaluate the proposed solution’s effectiveness through comprehensive experiments, comparing it with traditional AQM-based systems. This paper contributes to the advancement of machine learning (ML) for transport protocol research. The experimental implementation and results presented in this paper are made available through our GitHub repositories.
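The dual-queue coupled AQM the paper builds on ties its two queues together through a single base probability p' (in the style of RFC 9332): the Classic queue drops with roughly p'^2 while the L4S queue marks with k·p', which balances throughput between classic (rate ∝ 1/√p) and scalable (rate ∝ 1/p) congestion controllers. The sketch below illustrates only that coupling law under those assumptions; it is not the paper's FreeBSD implementation or its A3C-based tuning.

```python
# Simplified sketch of the L4S dual-queue probability coupling
# (RFC 9330/9332 style). Illustrative only — not the paper's FreeBSD code.

K = 2.0  # coupling factor; RFC 9332 recommends k = 2

def couple(p_prime: float, p_l4s_native: float) -> tuple[float, float]:
    """Derive per-queue probabilities from the base AQM output p'."""
    p_classic = p_prime ** 2               # squared: classic rate ~ 1/sqrt(p)
    p_coupled = K * p_prime                # linear: scalable rate ~ 1/p
    p_l4s = max(p_l4s_native, p_coupled)   # native L4S ramp can dominate
    return min(p_classic, 1.0), min(p_l4s, 1.0)

# Example: base probability of 5% from the PI controller.
print(couple(0.05, 0.0))  # -> (0.0025, 0.1)
```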

16 pages, 2459 KiB  
Article
Software-Bus-Toolchain (SBT): Introducing a Versatile Method for Quickly Implementing (I)IoT-Scenarios
by Simon D. Duque Anton
Future Internet 2024, 16(7), 237; https://doi.org/10.3390/fi16070237 - 3 Jul 2024
Viewed by 921
Abstract
The Internet of Things (IoT) has become ubiquitous. IoT devices are applied in a multitude of applications, e.g., in smart home scenarios, building automation, smart energy and smart cities, healthcare, and industrial environments. Fast and efficient implementation and roll-out are critical factors for the success and acceptance of IoT devices. At the same time, the variety of hardware platforms that can be used for IoT applications, as well as the number of IoT orchestration platforms, is increasing. Finding the right combination of tooling and hardware is not trivial, but essential for building applications that provide value. In this work, a Software-Bus-Toolchain (SBT) is introduced that encapsulates firmware design, data point definition, and communication protocol usage. Furthermore, an IoT control platform is provided to control and evaluate the IoT modules. Thus, using the SBT, only the business logic has to be designed, while the hardware design is automated to a high degree. Usage of the Zephyr framework allows the interchange of hardware modules, while interfaces provide easy adaptation of data points and communication capabilities. The implementation of interfaces to the IoT platform as well as to the communication layer enables universal usage of logic and data elements. The SBT is evaluated in two application scenarios, where its flexible nature is shown.

19 pages, 964 KiB  
Article
Optimizing IoT Intrusion Detection Using Balanced Class Distribution, Feature Selection, and Ensemble Machine Learning Techniques
by Muhammad Bisri Musthafa, Samsul Huda, Yuta Kodera, Md. Arshad Ali, Shunsuke Araki, Jedidah Mwaura and Yasuyuki Nogami
Sensors 2024, 24(13), 4293; https://doi.org/10.3390/s24134293 - 1 Jul 2024
Cited by 3 | Viewed by 1927
Abstract
Internet of Things (IoT) devices are leading to advancements in innovation, efficiency, and sustainability across various industries. However, as the number of connected IoT devices increases, the risk of intrusion becomes a major concern in IoT security. To prevent intrusions, it is crucial to implement intrusion detection systems (IDSs) that can detect and prevent such attacks. IDSs are a critical component of cybersecurity infrastructure, designed to detect and respond to malicious activities within a network or system. Traditional IDS methods rely on predefined signatures or rules to identify known threats, but these techniques may struggle to detect novel or sophisticated attacks. The implementation of IDSs with machine learning (ML) and deep learning (DL) techniques has been proposed to improve IDSs’ ability to detect attacks, enhancing overall cybersecurity posture and resilience. However, ML and DL techniques face several issues that may impact the models’ performance and effectiveness, such as overfitting and the effects of unimportant features on finding meaningful patterns. To ensure better performance and reliability of machine learning models in IDSs when dealing with new and unseen threats, the models need to be optimized by addressing overfitting and implementing feature selection. In this paper, we propose a scheme to optimize IoT intrusion detection by using class balancing and feature selection for preprocessing. We evaluated the experiment on the UNSW-NB15 and NSL-KDD datasets by implementing two different ensemble models: one using a support vector machine (SVM) with bagging and another using long short-term memory (LSTM) with stacking. The results of the performance and the confusion matrix show that the LSTM stacking with analysis of variance (ANOVA) feature selection model is a superior model for classifying network attacks. It has remarkable accuracies of 96.92% and 99.77% and overfitting values of 0.33% and 0.04% on the two datasets, respectively. The model’s ROC curve also shows a sharp bend, with AUC values of 0.9665 and 0.9971 for the UNSW-NB15 and NSL-KDD datasets, respectively.
(This article belongs to the Special Issue Internet of Things, Sensing and Cloud Computing—2nd Edition)
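The preprocessing stage described here (class balancing followed by feature selection) can be sketched as below. SMOTE is an assumed balancing method, since the abstract does not name one, and synthetic data stands in for UNSW-NB15/NSL-KDD.

```python
# Preprocessing sketch: balance classes, then keep the features ranked
# highest by the ANOVA F-test. SMOTE and the synthetic data are assumptions.
from collections import Counter

from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Imbalanced stand-in for network traffic: 10% attack samples.
X, y = make_classification(n_samples=3000, n_features=40,
                           weights=[0.9, 0.1], random_state=7)
print("before balancing:", Counter(y))

# Oversample the minority (attack) class to a 1:1 ratio.
X_bal, y_bal = SMOTE(random_state=7).fit_resample(X, y)
print("after balancing: ", Counter(y_bal))

# ANOVA F-test scores each feature's separation between classes;
# keep the top 20 before handing off to the LSTM/SVM ensembles.
selector = SelectKBest(score_func=f_classif, k=20)
X_sel = selector.fit_transform(X_bal, y_bal)
print("selected feature shape:", X_sel.shape)
```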

12 pages, 4731 KiB  
Article
High-Capacity Multiple-Input Multiple-Output Communication for Internet-of-Things Applications Using 3D Steering Nolen Beamforming Array
by Hanxiang Zhang, Hao Yan, Powei Liu, Saeed Zolfaghary Pour and Bayaner Arigong
Electronics 2024, 13(13), 2452; https://doi.org/10.3390/electronics13132452 - 22 Jun 2024
Cited by 1 | Viewed by 874
Abstract
In this paper, a novel 2D Nolen beamforming phased array with 3D scanning capability to achieve high channel capacity is presented for multiple-input multiple-output (MIMO) Internet-of-Things (IoT) applications. The proposed 2D beamforming phased array is designed by stacking a fundamental building block consisting of a 3 × 3 tunable Nolen matrix, which applies a small number of phase shifters with a small tuning range and reduces the complexity of the beam-steering control mechanism. Each 3 × 3 tunable Nolen matrix can achieve a full 360° range of progressive phase delay by exciting all three input ports, and nine individual radiation beams can be generated and continuously steered on the azimuth and elevation planes by stacking three tunable Nolen matrices horizontally and three vertically to maximize the signal-to-noise ratio (SNR) in the corresponding spatial directions. To validate the proposed design, simulations have been conducted on the circuit network and assessed in a fading channel environment. The simulation results agree well with the theoretical analysis, which demonstrates the capability of the proposed 2D Nolen beamforming phased array to realize high channel capacity in MIMO-enabled IoT communications.
(This article belongs to the Special Issue Advances in Wireless Communication for IoT)
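The steering principle behind a Nolen-matrix beamformer is the progressive phase delay it imposes across the array elements. The NumPy sketch below computes the array factor of a small uniform linear array under assumed element spacing and excitation phase; it is a textbook illustration, not the paper's 3 × 3 matrix design.

```python
# Array-factor sketch: a progressive phase across a uniform linear array
# steers the main beam. Element count, spacing, and phase are assumptions.
import numpy as np

N = 3                    # elements, e.g., fed by one 3x3 Nolen matrix
d = 0.5                  # element spacing in wavelengths
beta = np.deg2rad(-90)   # progressive phase between adjacent elements

theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
n = np.arange(N)[:, None]
# Array factor: sum of element phasors (geometric + excitation phase).
af = np.abs(np.sum(np.exp(1j * n * (2 * np.pi * d * np.sin(theta) + beta)),
                   axis=0)) / N

steer = np.rad2deg(theta[np.argmax(af)])
print(f"main beam steered to ~{steer:.1f} deg")  # sin(theta) = -beta/(2*pi*d)
```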

13 pages, 1866 KiB  
Article
IMTIBOT: An Intelligent Mitigation Technique for IoT Botnets
by Umang Garg, Santosh Kumar and Aniket Mahanti
Future Internet 2024, 16(6), 212; https://doi.org/10.3390/fi16060212 - 17 Jun 2024
Viewed by 1102
Abstract
The tremendous growth of the Internet of Things (IoT) has gained a lot of attention in the global market. However, the massive deployment of IoT also brings various security vulnerabilities, which become easy targets for hackers. IoT botnets are one type of critical malware that degrades the performance of the IoT network and is difficult for end-users to detect. Although there are several traditional IoT botnet mitigation techniques, such as access control, data encryption, and secured device configuration, these are difficult to apply because botnet traffic mimics normal traffic behavior, uses similar packet transmission, and exploits the repetitive nature of IoT network traffic. Motivated by botnet obfuscation, this article proposes an intelligent mitigation technique for IoT botnets, named IMTIBoT. Using this technique, we harnessed the stacking of ensemble classifiers to build an intelligent system. This stacking classifier technique was tested using an experimental testbed of IoT nodes and sensors. This system achieved an accuracy of 0.984, with low latency.
(This article belongs to the Special Issue Internet of Things and Cyber-Physical Systems II)

22 pages, 14584 KiB  
Article
An Integrated Smart Pond Water Quality Monitoring and Fish Farming Recommendation Aquabot System
by Md. Moniruzzaman Hemal, Atiqur Rahman, Nurjahan, Farhana Islam, Samsuddin Ahmed, M. Shamim Kaiser and Muhammad Raisuddin Ahmed
Sensors 2024, 24(11), 3682; https://doi.org/10.3390/s24113682 - 6 Jun 2024
Cited by 5 | Viewed by 6302
Abstract
The integration of cutting-edge technologies such as the Internet of Things (IoT), robotics, and machine learning (ML) has the potential to significantly enhance the productivity and profitability of traditional fish farming. Farmers using traditional fish farming methods incur enormous economic costs owing to labor-intensive schedule monitoring and care, illnesses, and sudden fish deaths. Another ongoing issue is automated fish species recommendation based on water quality. On the one hand, effective monitoring of abrupt changes in water quality may minimize daily operating costs and boost fish productivity; on the other, an accurate automatic fish recommender may aid the farmer in selecting profitable fish species for farming. In this paper, we present AquaBot, an IoT-based system that can automatically collect, monitor, and evaluate water quality and recommend appropriate fish to farm depending on the values of various water quality indicators. A mobile robot has been designed to collect parameter values such as the pH, temperature, and turbidity from all around the pond. To facilitate monitoring, we have developed web and mobile interfaces. For the analysis and recommendation of suitable fish based on water quality, we have trained and tested several ML algorithms, such as the proposed custom ensemble model, random forest (RF), support vector machine (SVM), decision tree (DT), K-nearest neighbor (KNN), logistic regression (LR), bagging, boosting, and stacking, on a real-time pond water dataset. The dataset has been preprocessed with feature scaling and dataset balancing. We have evaluated the algorithms based on several performance metrics. In our experiment, our proposed ensemble model has delivered the best result, with 94% accuracy, 94% precision, 94% recall, a 94% F1-score, 93% MCC, and the best AUC score for multi-class classification. Finally, we have deployed the best-performing model in a web interface to provide cultivators with recommendations for suitable fish farming. Our proposed system is projected to not only boost production and save money but also reduce the time and intensity of the producer’s manual labor.
(This article belongs to the Special Issue AI, IoT and Smart Sensors for Precision Agriculture)
