Article

Volume-Based Occupancy Detection for In-Cabin Applications by Millimeter Wave Radar

1 Mechanical and Mechatronics Engineering Department, University of Waterloo, Waterloo, ON N2L 3G1, Canada
2 The Kilby Labs, Texas Instruments Inc., Dallas, TX 75243, USA
3 Electrical and Computer Engineering Department, University of Waterloo, Waterloo, ON N2L 3G1, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(16), 3068; https://doi.org/10.3390/rs16163068
Submission received: 27 June 2024 / Revised: 16 August 2024 / Accepted: 20 August 2024 / Published: 21 August 2024

Abstract

In-cabin occupancy detection has become increasingly important due to incidents involving children left in vehicles under extreme temperature conditions. Frequency modulated continuous wave (FMCW) radars are widely used for non-contact monitoring and sensing applications, particularly for occupancy detection. However, the confined and metallic environment inside vehicle cabins presents significant challenges due to multipath reflections. This paper introduces a novel approach that detects the occupied space in each seat to determine occupancy, using the variance of detected points as an indicator of volume occupancy. In an experimental study involving 70 different scenarios with single and multiple subjects, we classify occupants in each seat into one of three categories: adult, baby, or empty. The proposed method achieves an overall accuracy of 96.7% using an Adaboost classifier and a miss-detection rate of 1.8% for detecting babies. This approach demonstrates superior robustness to multipath interference compared to traditional energy-based methods, offering a significant advancement in in-cabin occupancy detection technology.

1. Introduction

Occupancy detection is a widely explored in-vehicle sensing application. It is crucial for safety purposes, such as enabling passenger-side airbags. Various sensors are employed for detecting occupancy, including mechanical sensors, vision-based sensors, and radars. Mechanical sensors are commonly used for occupancy detection by measuring weight, force, acceleration, or pressure [1,2,3]. However, they cannot distinguish between humans and inanimate objects.
On the other hand, vision-based sensors such as cameras and infrared sensors are promising because of their high accuracy [4,5]. However, these sensors can leak private information since they capture all or large portions of the human body. In addition, infrared sensors and depth cameras are sensitive to the illumination level and can be obstructed by sunlight [6,7,8]. Therefore, these sensors are not advisable inside a vehicle, where the illumination level varies throughout the day.
Radars are non-contact sensors that preserve privacy and are independent of the illumination level. Moreover, their electromagnetic waves can penetrate obstacles and detect individuals in dead spots [9,10]. Therefore, radars are well suited for use inside private vehicles. These sensors also find extensive use inside the vehicle for applications other than occupancy detection, such as driver status monitoring [11,12,13,14,15] and gesture recognition for human-vehicle interfaces [16,17,18,19,20,21,22,23].
Three kinds of radars are commonly employed for in-cabin applications: continuous wave (CW) radars, ultra-wideband (UWB) radars, and FMCW radars. CW radars cannot provide sufficient range resolution for in-vehicle applications. UWB radars generate a short-duration pulse, which makes it possible to distinguish between signals reflected off nearby targets [15,24,25,26,27,28,29,30,31,32,33]. However, these radar systems operate based on the amplitude of a signal and can therefore be affected by clutter reflections, especially inside a vehicle. FMCW radars sweep from a low frequency to a high frequency to cover a wide frequency band and provide a high-resolution system [34]. Multiple input multiple output (MIMO) FMCW radars represent the state-of-the-art technology for in-cabin applications, as they are capable of range detection and angle differentiation of targets [15,35,36,37].
Different signal processing techniques exist for detecting occupancy. The most straightforward approach is to use the reflected energy. This technique is commonly used for detecting left-behind children and pets, especially in cold or hot conditions [15,38,39,40,41,42,43,44,45,46,47,48]. The most common approach is based on features extracted from the micro-Doppler signature [15,48,49,50,51,52,53,54,55,56,57,58,59] produced by human body motions. Artificial intelligence (AI) is then employed to detect individuals. AI approaches have been used extensively in a variety of applications [60,61,62,63,64,65,66,67].
Recent studies have typically extracted features from a time–frequency map [15,48,50,51,57,59], a range–azimuth map [15,52,55], or a range–doppler map [15,68]. Therefore, these features are energy-based. However, energy-based features have notable limitations. These features are highly dependent on the range of targets from the radar, as dictated by the radar equation [27]. Consequently, they require separate databases for each seat, increasing the complexity of the classification process. Additionally, multipath reflections can significantly impact the performance of energy-based features, particularly when the front seats are occupied by adults, leading to unreliable detection results.
To address these limitations, this paper proposes a novel volume-based occupancy detection approach utilizing the detected point cloud data from the radar. The main contributions of this paper are as follows:
  • Volume-based occupancy detection: We introduce a method that leverages the variance of detected points within a point cloud to determine the volume of occupancy. This approach is less dependent on the range of targets from the radar, thereby reducing the need for separate databases for each seat.
  • Robustness to multipath reflections: Our proposed method demonstrates greater robustness to multipath reflections compared to traditional energy-based methods. This improvement is particularly evident when the front seats are occupied by adults, where energy-based methods typically struggle.
  • Simplified classification labels: We utilize only three classification labels—adult, baby, and empty—to streamline the detection process. This simplification enhances the efficiency and accuracy of the classification.
  • Feature selection optimization: We explore various feature selection methods to enhance the classification performance further, ensuring the most relevant features are utilized for accurate occupancy detection.
The remainder of this paper is organized as follows. Section 2 introduces the signal and system design used to generate a point cloud. Section 3 discusses the experimental setup and results. Finally, Section 4 concludes the paper.

2. Methodology

This section provides an overview of the radar system utilized, delving into its specifications and components. It further explores the fundamental aspects of radar signal processing, focusing on its role in generating a point cloud. It also addresses the various approaches employed for occupancy detection within the scope of this paper.

2.1. FMCW Radar Fundamentals

Active radar systems transmit electromagnetic waves into the environment and receive signals reflected off targets and clutter. The received signals are then processed to detect targets and suppress clutter. For in-cabin applications, clutter usually has zero velocity and high amplitude. As a result, the non-zero-velocity reflections can be used to detect human presence.
Range and velocity resolution, both of which can be designed into radar sensors, are the key parameters for detecting occupancy within the defined zones. An FMCW radar sweeps linearly in time, increasing or decreasing from a start frequency to an end frequency, to generate a chirp signal. Two parameters can be designed through the chirp configuration: range resolution and velocity resolution. The range resolution of a chirp signal is inversely proportional to the swept frequency bandwidth. Therefore, a higher frequency bandwidth provides better range resolution, finer range discretization, and enhanced detail [69].
Another key parameter that can improve the accuracy of radar systems is velocity resolution, which allows stationary clutter to be distinguished from humans. As a result, a radar system with higher velocity resolution can detect occupancy more accurately. For a constant chirp length, the velocity resolution is inversely proportional to the number of chirps in a frame.
The position of a target inside a vehicle can be determined from its range and angle relative to the radar system. An FMCW radar determines the range of the target from the peak of the beat signal in the frequency domain. Owing to the high frequency bandwidth of off-the-shelf FMCW radars, they can provide accurate range resolution. However, a single-receiver radar system with a wide antenna beamwidth cannot detect the angle of the target accurately. In contrast, a radar system with more than one receiver can steer the antenna beam to detect the angle: when the target is at a non-zero angle, the delays to the individual receivers differ, and at a fixed range, the larger the angle, the longer the delay. The Capon algorithm is well known for angle detection in radar systems. The 3D localization of the target can be achieved from its range, azimuth angle, and elevation angle, which can be obtained with a 2D antenna array.
The angle resolution of single-input multiple-output (SIMO) radars is determined by the number of receivers. Each receiver requires its own processing chain on the device, including a low-noise amplifier, mixer, analog-to-digital converter, and so on. Consequently, adding a receiver increases the overall cost. Since SIMO pulse radars are active for only a portion of the time, extra transmitters can be added to the system to exploit the inactive time and improve the angle resolution. Multiple transmitters can thus be employed by dividing the active time of the system between them. This technique is called time-division multiplexing.

2.2. Signal Design

In this paper, the AWR6843AOP sensor from Texas Instruments (TI), Dallas, TX, USA, operating at 60 GHz, is used for overhead-mount occupancy detection. Some parameters are fixed by the sensor's properties, including the start frequency, the number of transmitters, the number of receivers, and the field of view. The other parameters are selected based on the problem description. The maximum range is chosen to cover the typical dimensions of a vehicle cabin, ensuring that the radar can detect occupants from the front to the rear seats and thus encompass the entire cabin space. The maximum velocity is determined from typical movements within a vehicle cabin. High range and velocity resolutions are critical for distinguishing between closely spaced objects or occupants; the device supports them only up to specific limits, as with the frame periodicity. Table 1 lists the key parameters of the radar system used in this paper. Additionally, the radar is equipped with an internal processor for radio-frequency calibration, which enhances the radar system's functionality, especially under extreme temperature conditions [70].
Figure 1 shows the placement and relative spacing of the transmitter and receiver antennas, and the equivalent antenna array, in the AWR6843AOP. The angle resolution (θres) of a MIMO radar for targets directly in front of the radar can be determined as follows:
θres = 2 / (NTx · NRx)
where NTx and NRx represent the number of transmitters and receivers, respectively. Using all the transmitters, the azimuth and elevation angle resolutions equal 28.6° when the targets are in front of the system.

2.3. Signal Processing Fundamentals for the Point Cloud Detection

This subsection presents the fundamentals of radar signal processing used to generate a point cloud and the approaches employed for occupancy detection in this paper.

2.3.1. Beat Signal in FMCW Radar

Assume that the transmitted signal is as follows:
xTx(t) = ATx cos(2π fc t + π (B/T) t²)
where ATx is the amplitude of the transmitted signal, fc is the start frequency, B is frequency bandwidth, and T is the duration of every chirp. The chirp slope is as follows:
k = B/T.
The reflected signal off a target with a delay of td is as follows:
xRx(t) = ARx cos(2π fc (t − td) + π k (t − td)²)
where ARx is the amplitude of the received signal. An attenuated version of the transmitted signal is mixed with the received signal, and a low-pass filter is then applied to produce the beat signal, as follows:
y(t) = α · xTx(t) · xRx(t) = α ATx cos(2π fc t + π k t²) · ARx cos(2π fc (t − td) + π k (t − td)²) → (low-pass filter) → α ATx ARx cos(2π fc td + 2π k (t·td − td²/2))
where α is the attenuation factor. Assuming that td << t, y(t) can be simplified as follows:
y(t) ≈ α ATx ARx cos(2π fc td + 2π k t·td).
The time delay in the beat signal depends on the range of the target. If the target is a moving object, the delay is as follows:
td = 2R(t)/c
where R(t) is the range of the target. Hence, the beat signal based on a time-varying range of the target is as follows:
y(t) ≈ α ATx ARx cos(4π fc R(t)/c + 4π k t R(t)/c).
By applying fast Fourier transform (FFT) to the beat signal, the range can be estimated from the frequency of the estimated peak [72].
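As a brief illustrative sketch of this range estimation step (the bandwidth, chirp duration, and sampling rate below are assumed values, not the configuration in Table 1), the beat frequency fb = 2kR/c can be recovered from the FFT peak and converted back to range:

```python
import numpy as np

c = 3e8                      # speed of light (m/s)
B, T = 4e9, 50e-6            # assumed sweep bandwidth (Hz) and chirp duration (s)
k = B / T                    # chirp slope
fs, N = 5e6, 256             # assumed ADC sampling rate (Hz) and samples per chirp

R_true = 1.2                 # hypothetical target range (m)
f_beat = 2 * k * R_true / c  # beat frequency implied by that range
t = np.arange(N) / fs
beat = np.cos(2 * np.pi * f_beat * t)  # idealized, noise-free beat signal

spectrum = np.abs(np.fft.rfft(beat))
f_hat = np.argmax(spectrum) * fs / N   # frequency of the spectral peak
R_hat = f_hat * c / (2 * k)            # estimated range
```

The estimate is accurate to within one range bin, i.e., the range resolution c/(2B).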

2.3.2. Clutter Removal

The clutter inside a vehicle does not change over a short period of time such as a frame. Therefore, the signal averaged across all the chirps of a frame in a receiver can be used for static clutter removal: this average is subtracted from every chirp in that receiver's frame. The average signal used for static clutter removal is calculated as follows:
Xr = (1/Nc) Σ(i=1..Nc) xr,i
where Nc is the number of chirps in a frame, xr,i is the ith chirp in the rth receiver, and Xr is the average of the chirp signals in the rth receiver over a frame.
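This per-receiver averaging can be sketched in a few lines (the frame dimensions here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Nc, Ns = 64, 256                          # chirps per frame, samples per chirp
static = rng.standard_normal(Ns)          # stationary clutter, identical in every chirp
motion = 0.1 * np.sin(2 * np.pi * 0.05 * np.arange(Nc))[:, None] \
         * rng.standard_normal(Ns)        # small chirp-to-chirp variation (occupant)
frame = static[None, :] + motion          # one receiver's frame: shape (Nc, Ns)

X_r = frame.mean(axis=0)                  # average chirp across the frame
cleaned = frame - X_r                     # static clutter removed from every chirp
```

Subtracting the frame mean removes any component that is identical across chirps, which is exactly the static clutter.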

2.3.3. Capon Beamforming

Capon Beamforming, also known as minimum variance distortionless response (MVDR) beamforming, is a technique that enhances the signal-to-noise ratio (SNR) by coherently summing the received signals from an array of sensors [73]. Below, we describe the mathematical formulation and implementation details of the Capon Beamforming method.
The fundamental idea behind Capon Beamforming is to construct a beamformer that adapts to the incoming signal environment. Unlike conventional beamformers, which might use fixed weights, Capon Beamforming dynamically adjusts the weights to minimize the output power subject to the constraint that the gain in the direction of the desired signal remains constant. This results in improved interference suppression and noise reduction, leading to a clearer detection of the target signal.
If we suppose that the phase reference is at (x, y, z) = (0, 0, 0), then the phase of the signal received by the element at (m·d, n·d, 0) is as follows:
Θm,n = (2π/λ) (m·d·sin θ·cos φ + n·d·cos θ)
where θ is the elevation angle, φ is the azimuth angle, λ is the wavelength, and d = λ/2. Therefore, Θm,n can be simplified as follows:
Θm,n = π (m·sin θ·cos φ + n·cos θ).
The steering vector based on Figure 1 is as follows:
v = [e^(jΘ0,0), e^(jΘ0,1), e^(jΘ0,2), e^(jΘ0,3), e^(jΘ1,0), e^(jΘ1,1), e^(jΘ1,2), e^(jΘ1,3), e^(jΘ2,0), e^(jΘ2,1), e^(jΘ3,0), e^(jΘ3,1)]^T.
Then, the weight vector based on the steering vector can be as follows:
w = R⁻¹ v / (v^H R⁻¹ v)
where H denotes the Hermitian (conjugate) transpose of a matrix and R is as follows:
R = E[X · X^H]
where X is the vector of received signals across the receivers. Finally, the beat signal obtained from all the receivers is as follows:
Y = w^H X.
The SNR of this signal is enhanced since all the received signals are summed coherently [74].
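A minimal numerical sketch of these weights, with a hypothetical 12-element steering vector and a sample covariance standing in for the expectation E[X X^H]:

```python
import numpy as np

def capon_weights(X, v):
    """X: (num_rx, num_snapshots) received snapshots; v: steering vector."""
    R = X @ X.conj().T / X.shape[1]        # sample covariance, approximates E[X X^H]
    R += 1e-6 * np.eye(R.shape[0])         # diagonal loading to keep R invertible
    Rinv_v = np.linalg.solve(R, v)         # R^-1 v without forming an explicit inverse
    return Rinv_v / (v.conj() @ Rinv_v)    # w = R^-1 v / (v^H R^-1 v)

rng = np.random.default_rng(1)
M = 12                                     # 12 virtual channels, as in Figure 1
v = np.exp(1j * np.pi * np.arange(M) * 0.3)   # hypothetical steering vector
X = rng.standard_normal((M, 200)) + 1j * rng.standard_normal((M, 200))
w = capon_weights(X, v)
gain = w.conj() @ v                        # distortionless constraint: gain == 1
```

By construction, w^H v = 1, so the look direction is passed with unit gain while the output power, and hence interference from other directions, is minimized.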

2.3.4. Constant False Alarm Rate (CFAR)

CFAR is a robust approach for detecting targets in a noisy signal. Cell-averaging CFAR (CA-CFAR) is the most widely used CFAR variant for noise suppression. In this approach, a threshold based on the noise distribution is determined to distinguish target signals from noise. A sliding window with the same dimensionality as the data estimates noise parameters such as the mean and variance. The window selects noise samples around the cell under test so that the threshold adapts to the local noise level.
The detection threshold in this approach is calculated as follows:
T = α Pn
where α is the scaling factor and Pn is the noise power, which can be estimated from the samples inside the window around the cell under test as follows:
Pn = (1/N) Σ(m=1..N) xm
where N is the number of samples in the window, and xm is the sample in each cell. The scaling factor in CA-CFAR is as follows:
α = N (Pfa^(−1/N) − 1)
where Pfa is the desired probability of false alarm in the radar system.
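A one-dimensional CA-CFAR sketch using these formulas (the window sizes and Pfa are illustrative choices):

```python
import numpy as np

def ca_cfar(x, num_train=16, num_guard=2, pfa=1e-3):
    """1D cell-averaging CFAR; returns indices of detected cells."""
    N = 2 * num_train                        # training cells around each cell under test
    alpha = N * (pfa ** (-1.0 / N) - 1.0)    # scaling factor from the desired Pfa
    detections = []
    for i in range(num_train + num_guard, len(x) - num_train - num_guard):
        lead = x[i - num_guard - num_train : i - num_guard]
        lag = x[i + num_guard + 1 : i + num_guard + num_train + 1]
        Pn = np.mean(np.concatenate([lead, lag]))  # local noise-power estimate
        if x[i] > alpha * Pn:
            detections.append(i)
    return detections

rng = np.random.default_rng(2)
power = rng.exponential(1.0, 512)            # square-law-detected noise samples
power[200] += 50.0                           # inject a strong target
hits = ca_cfar(power)
```

Guard cells around the cell under test keep the target's own energy out of the noise estimate, so a strong return is compared against the surrounding noise floor only.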
Figure 2 shows the algorithm employed in this paper for occupancy detection. After removing the DC component from the raw signals, an FFT is applied to the beat signals to obtain range information. The radar cube is then reconstructed from the received chirp signals over time. Since occupants inside the vehicle are non-stationary, zero-velocity clutter removal is performed, followed by 2D Capon beamforming. CFAR is then used to distinguish targets from noise and generate a point cloud. The produced point cloud contains a range, azimuth angle, elevation angle, and SNR for each point. Finally, the detected points are assigned to each seat based on the vehicle's dimensions.

2.4. Occupancy Detection Approaches

Three different occupancy detection approaches are compared in this paper. The first approach uses the number of detected points for each seat. Given that the dimensions of the vehicle and its seats are known, the detected points can be associated with specific seats. For instance, consider a typical car with a length of 4.5 m, a width of 1.8 m, and a height of 1.6 m. Suppose this car has five seats arranged in two rows: two seats in the front and three in the rear. The front seats are located 1.2 m from the front of the car, with a width of 0.5 m each and separated by a 0.4 m console. The rear seats are positioned 2.4 m from the front, spanning a width of 1.4 m in total. When the radar detects points within these defined regions, such as a cluster of points 1.2 m from the front and 0.4 m from the centerline, these can be associated with the driver's seat. Similarly, points detected 2.4 m from the front and within 0.7 m of the centerline can be attributed to the rear middle seat. By correlating detected points with these predefined spatial regions, it is possible to determine the occupancy of each specific seat within the car. The second approach uses the mean SNR of all the detected points for each seat. The third, proposed approach uses the location variance of the detected points for each seat. All the detected points are divided between the seats based on the zones defined on the x-z plane.
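The seat-zone assignment described above can be sketched as follows; the zone boundaries are hypothetical numbers loosely matching the example cabin layout, not measured values:

```python
import numpy as np

# Hypothetical seat zones on the x (lateral) / z (longitudinal) plane, in meters:
# (x_min, x_max, z_min, z_max) per seat.
SEAT_ZONES = {
    "driver":      (-0.65, -0.15, 1.0, 1.7),
    "front_pass":  ( 0.15,  0.65, 1.0, 1.7),
    "rear_left":   (-0.70, -0.23, 2.2, 2.9),
    "rear_middle": (-0.23,  0.23, 2.2, 2.9),
    "rear_right":  ( 0.23,  0.70, 2.2, 2.9),
}

def assign_points(points):
    """points: (N, 2) array of (x, z); returns dict seat -> list of point indices."""
    out = {seat: [] for seat in SEAT_ZONES}
    for i, (x, z) in enumerate(points):
        for seat, (x0, x1, z0, z1) in SEAT_ZONES.items():
            if x0 <= x <= x1 and z0 <= z <= z1:
                out[seat].append(i)
                break                      # each point belongs to at most one seat
    return out
```

Under these assumed boundaries, a point at (0.4, 1.3) lands in the front passenger zone and a point at (0.0, 2.5) in the rear middle zone.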
The proposed approach is more robust against false detections caused by multipath signals. The SNR-based approach may determine occupancy correctly if the false points have low SNR. In-cabin radar applications, however, often suffer from falsely detected points with high amplitude because the targets are close to the radar. Consequently, in such scenarios, the SNR-based approach misclassifies empty seats as occupied. The proposed variance-based approach remains accurate because those falsely detected points have low location variance.
Figure 3 shows the detected points for an adult in seat 2 in a single-target scenario. The majority of the detected points are located inside the zone designated for seat 2, and the other seats are shown to be empty. In contrast, Figure 4 shows some falsely detected points in seat 5 due to multipath. The reflections from seat 2 appear at longer ranges, and some of them coincide with the range of seat 5 because the radar was mounted closer to the first row. The number of detected multipath points is significantly lower than that of the first reflections in seat 2, owing to the irregular distribution of scattering points on the human body and random reflections from the car body, which cause reflections at random ranges. Figure 4 also shows that the mean SNR of the detected points for this empty seat, even over time, resembles that of an occupied seat. However, the variance-based approach correctly indicates that this seat is empty.
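The three per-seat indicators compared in this section can be computed as in the sketch below; the exact definitions are assumptions consistent with the text (point count, mean SNR, and total location variance summed over the coordinate axes):

```python
import numpy as np

def seat_indicators(points, snr):
    """points: (N, 3) xyz of detected points assigned to one seat; snr: (N,) per-point SNR.
    Returns (number of points, mean SNR, location variance) for that seat."""
    if len(points) == 0:
        return 0, 0.0, 0.0
    num_points = len(points)
    mean_snr = float(np.mean(snr))
    loc_var = float(np.var(points, axis=0).sum())  # total variance over x, y, z
    return num_points, mean_snr, loc_var
```

A tight cluster of multipath ghosts yields a near-zero location variance even when its mean SNR is high, which is why the variance indicator stays reliable where the SNR indicator fails.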

2.5. Classification

To address the issue of class imbalance in our dataset, we employ the synthetic minority over-sampling technique (SMOTE) before training and testing our model. Class imbalance occurs when the number of instances of one class significantly outnumbers the instances of other classes, which can lead to biased classification results. SMOTE helps mitigate this bias by generating synthetic data for the minority class, thus balancing the dataset.
SMOTE operates by creating synthetic samples rather than simply duplicating existing ones. The process involves the following steps [75]:
  • Identify minority class instances: SMOTE identifies the minority class instances in the dataset. These are the instances that are under-represented compared to the majority class.
  • Select k-nearest neighbors: For each minority class instance, SMOTE selects k-nearest neighbors from the same class. The value of k is typically set to 5, but it can be adjusted based on the dataset.
  • Generate synthetic samples: SMOTE generates synthetic samples by interpolating between the selected minority instance and its k-nearest neighbors. The synthetic sample is created by randomly choosing a point along the line segment connecting the minority instance and one of its neighbors.
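The three steps above can be condensed into a small, self-contained sketch (a minimal SMOTE-style interpolator written for illustration, not the reference implementation):

```python
import numpy as np

def smote_sample(X_min, k=5, n_new=10, seed=0):
    """X_min: (n, d) minority-class samples; returns n_new synthetic samples."""
    rng = np.random.default_rng(seed)
    new = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))                   # pick a minority instance
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(d)[1 : k + 1]           # its k nearest neighbors (excluding itself)
        j = rng.choice(neighbors)
        lam = rng.random()                             # random point along the segment
        new.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(new)
```

Each synthetic sample lies on a segment between two real minority samples, so it never leaves their convex hull.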
SMOTE offers several advantages in addressing class imbalance. By generating synthetic samples, SMOTE reduces the risk of overfitting that can occur when simply duplicating minority class instances. This helps improve the performance of machine learning models in terms of accuracy, precision, recall, and F1-score by balancing the dataset. Additionally, SMOTE enhances the model’s ability to generalize to unseen data by providing a more balanced and representative training set. Moreover, SMOTE is flexible and can be used with various machine learning algorithms, making it applicable to a wide range of model types [75].
In our study, we apply SMOTE to balance the dataset before training and testing our occupancy detection model. This process ensures that the classifier is not biased towards the majority class and can effectively detect both the minority and majority class instances. By generating synthetic data for the under-represented labels, we improve the robustness and accuracy of our model.
Additionally, a systematic approach to feature extraction and selection has been employed prior to the classification process. Nine features, namely the minimum, maximum, mean, skewness, kurtosis, median, entropy, shape factor, and impulse factor, are extracted from each measurement. However, some of these features may have the same effect on the machine learning approach. Subsequently, feature selection methods can be applied to refine the extracted features by eliminating redundant ones, thus simplifying the classification approach.

2.5.1. Feature Explanation and Superiority

  • Minimum and maximum: These features capture the range of values in the dataset, providing insights into the extremities of the detected points.
  • Mean: The average value helps in understanding the central tendency of the data.
  • Skewness and kurtosis: These statistical measures describe the distribution shape, revealing asymmetry and the presence of outliers.
  • Median: As a robust measure of central tendency, the median is less affected by outliers.
  • Entropy: This measure indicates the randomness or unpredictability in the dataset, providing a sense of the data complexity.
  • Shape factor and impulse factor: These features are specifically chosen to capture the geometric and structural characteristics of the detected point cloud, which are crucial for accurate volume-based occupancy detection.
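The nine features can be computed from a window of indicator values as sketched below; the entropy (histogram-based), shape factor (RMS over mean absolute value), and impulse factor (peak over mean absolute value) use common definitions that we assume here, since the text does not spell them out:

```python
import numpy as np

def extract_features(x):
    """Compute the nine statistical features from a 1D measurement window."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    std = x.std()
    centered = x - mean
    skewness = (centered**3).mean() / std**3
    kurt = (centered**4).mean() / std**4
    hist, _ = np.histogram(x, bins=16)          # histogram-based entropy estimate
    p = hist / hist.sum()
    p = p[p > 0]
    ent = -(p * np.log2(p)).sum()
    abs_mean = np.abs(x).mean()
    rms = np.sqrt((x**2).mean())
    return {
        "min": x.min(), "max": x.max(), "mean": mean,
        "skewness": skewness, "kurtosis": kurt, "median": np.median(x),
        "entropy": ent, "shape_factor": rms / abs_mean,
        "impulse_factor": np.abs(x).max() / abs_mean,
    }
```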

2.5.2. Feature Selection Methods

There are three different feature selection methods, including filter method, wrapper method, and embedded method. In this paper, we have used the filter method for two reasons [76]:
  • Suitability for low-dimensional data: this approach is computationally faster than the wrapper and embedded methods.
  • Algorithm independence: It is almost independent of the learning algorithm, allowing its use with various learning algorithms.
In the context of data classification, three distinct machine learning algorithms, namely Random Forest (RF) [77], Gaussian Naïve Bayes (GNB) [78], and Adaboost (ADB) [79], are employed. These algorithms are efficient when applied to small databases. However, each algorithm has its own drawbacks: RF may incur computational expense, GNB is less efficient when features are dependent, and ADB is sensitive to noisy and imbalanced datasets [80,81]. Since our research places greater importance on detecting babies than adults, the comparison of these classification methods focuses on the missed-detection and false-detection rates associated with identifying babies.

3. Experimental Studies

This section is divided into two main parts: experimental measurements, and results and discussion. First, the instruments used to collect data and a description of the test vehicle and scenarios are presented. Second, experimental findings using both the conventional approaches and our proposed approach are discussed. Because distinguishing between empty and occupied seats is a less complex problem than distinguishing between adults and babies, the driver's seat is excluded from the classification task.

3.1. Experimental Setup

Figure 5 shows the AWR6843AOP radar sensor mounted above the headrest of the first-row seat and tilted slightly toward the second row. It is capable of detecting across two rows of a vehicle with approximately a 120-degree azimuth and 120-degree elevation field of view. Seventy different scenarios, including single-subject and multi-subject scenarios, were implemented. Four subjects aged 12 to 50, with weights from 35 kg to 90 kg and heights from 130 cm to 180 cm, were involved in the tests. The Bella Rose baby doll from Ashton Drake, Niles, IL, USA, was used to mimic a baby [82]. Additionally, this experimental study was performed inside a parked vehicle. Although vehicle vibrations can add a new source of noise, the amplitude of this noise is almost negligible for the occupancy detection application [51]. As shown in Figure 6, we investigated the impact of car vibrations on the reflected power after clutter cancellation for seat 2. The noise level of an unoccupied seat while the vehicle is moving is approximately 4 dB higher than that of the same seat when the vehicle is stationary. Notably, the reflected power from an occupied seat in a moving vehicle, following clutter cancellation, is approximately 10 dB higher than that from an empty seat.

3.2. Results and Discussion

Figure 7 shows the number of detected points in all seats except the driver's seat for both adults and babies over a duration of 28 s (140 frames). In general, adults exhibit a higher number of detected points than babies across the different seats. Therefore, this indicator can identify the occupancy type of each seat individually. However, seat 3 shows fewer detected points for both an adult and a baby. Therefore, data collection needs to be conducted for each seat separately, resulting in the use of 17 labels for this indicator.
Figure 8 shows the mean SNR of the detected points in all seats except the driver's seat for both adults and babies over a duration of 28 s (140 frames). The SNR can distinguish between adults and babies in different seats. However, seat 4 exhibits a higher SNR for both an adult and a baby than the other seats, and the SNR of the baby in seat 4 is almost the same as the SNR of the adults in the other seats. This indicator is inversely proportional to the fourth power of the range, as specified in the radar equation [26]. As a result, the SNR indicator also requires separate data collection for each seat, resulting in the use of 17 labels.
Figure 9 shows the location variance of the detected points in all seats except the driver's seat for both adults and babies over a duration of 28 s (140 frames). The location variance of the detected points can differentiate between adults and babies in all seats. Unlike the SNR, the location variance indicator is far less dependent on range; all seats therefore exhibit similar location variance for a given occupancy type. Consequently, separate data collection for each seat is unnecessary.
Figure 7, Figure 8 and Figure 9 depict fluctuations over time. First, because of the low activity level of the human body at the turning points between inhalation and exhalation, the human body reflections are suppressed by clutter cancellation. The occupancy type should therefore be decided over time rather than from a single frame. In addition, all the indicators can accurately distinguish between an empty seat and an occupied seat. This capability allows the radar to differentiate a left-behind-baby case from an empty seat.
In the final step, a machine learning approach is employed to accomplish the classification task. Various filter-based feature selection techniques with different scoring functions and modes are applied, as detailed in Table 2. The table presents results for different models using these techniques, including RF, GNB, and ADB. The results indicate that different feature combinations impact the performance of the classification models. Specifically, using mutual information with SelectKBest and a feature set size of 2 yields the best performance with AdaBoost as the classifier. AdaBoost achieves the lowest missed detection rate for babies at 1.8% and the highest overall detection accuracy with a precision of 98.8% and an F1-score of 98.4%. This performance underscores AdaBoost’s effectiveness in minimizing both missed and false detections, making it the optimal choice among the evaluated methods.
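A sketch of this final step with scikit-learn, run on synthetic stand-in data rather than the paper's measurements (the real pipeline uses the nine extracted features and the SMOTE-balanced dataset):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split

# Stand-in data: 9 features per measurement, 3 labels (adult, baby, empty).
X, y = make_classification(n_samples=300, n_features=9, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

selector = SelectKBest(mutual_info_classif, k=2)   # keep the 2 highest-scoring features
X_sel = selector.fit_transform(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=0)
clf = AdaBoostClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Mutual information with SelectKBest is a filter-method selector, so it scores features independently of the downstream classifier, matching the rationale given in Section 2.5.2.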

4. Conclusions

Radars, as non-contact sensors, offer various applications inside the vehicle while preserving privacy. MIMO FMCW radar represents the state-of-the-art technology, capable of detecting occupancy even in dead spots. Recent studies have proposed different approaches based on analyzing the reflected energy across range, angle, or frequency. These approaches are highly dependent on the range of the occupants to the radar, requiring a separate database for each seat for occupancy detection. In addition, energy-based methods are susceptible to interference from multipath reflections. In contrast, the proposed volume-based occupancy detection approach is seat agnostic. Consequently, the classification task uses only three labels: adult, baby, and empty. After extracting nine statistical features from the dataset, we applied filter-based feature selection methods. Subsequent removal of redundant features resulted in the ADB model achieving a minimum missed-detection rate of 1.8% for babies while maintaining an overall accuracy of 96.7%.
While the approach demonstrated strong proficiency in classifying adults and babies, distinguishing children over 12 years old from adults remains a significant challenge. This limitation highlights the need for further research to refine the classification algorithms so that they can reliably separate these two groups. Addressing this challenge is crucial in scenarios where age-specific detection matters. Future work should explore advanced techniques, such as incorporating additional sensor modalities or training machine learning models on more diverse datasets, to improve classification performance. These enhancements will be vital for safety-critical automotive applications, where accurate identification of occupants’ age groups directly affects the effectiveness of safety features and personalized services.

Author Contributions

A.G. conceptualized the idea, developed the algorithms, and took the lead in drafting the manuscript. A.G.D. and Z.Y. contributed to building the measurement setup and conducting experiments. A.K. contributed to conceptualizing the idea. G.S. was the technical supervisor, contributed to the writing process, and supervised all aspects of the project. All authors have read and agreed to the published version of the manuscript.

Funding

This work was conducted as part of a collaborative project between the University of Waterloo and Texas Instruments.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

We extend our gratitude to Texas Instruments for their invaluable assistance during the data collection phase of this research. We also thank the editor and reviewers for their helpful and professional comments on the manuscript.

Conflicts of Interest

Authors Anand G. Dabak and Zigang Yang were employed by the company Texas Instruments Inc. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

1. Lichtinger, H.; Curtis, B.M.; Graf, R.; Reich, D.; Morrell, S.; Kremer, M. Sensor Assembly for Seat Occupant Weight Classification System. US Patent 7503417B2, 17 March 2009.
2. Breed, D.S.; DuVall, W.E.; Johnson, W.C. Dynamic Weight Sensing and Classification of Vehicular Occupants. US Patent 7620521B2, 17 November 2009.
3. Yang, J.; Santamouris, M.; Lee, S.E. Review of Occupancy Sensing Systems and Occupancy Modeling Methodologies for the Application in Institutional Buildings. Energy Build. 2016, 121, 344–349.
4. Hou, Y.-L.; Pang, G.K.H. People Counting and Human Detection in a Challenging Situation. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2011, 41, 24–33.
5. Tang, N.C.; Lin, Y.-Y.; Weng, M.-F.; Liao, H.-Y.M. Cross-Camera Knowledge Transfer for Multiview People Counting. IEEE Trans. Image Process. 2015, 24, 80–93.
6. Siddiqui, H.U.R.; Saleem, A.A.; Brown, R.; Bademci, B.; Lee, E.; Rustam, F.; Dudley, S. Non-Invasive Driver Drowsiness Detection System. Sensors 2021, 21, 4833.
7. Staszek, K.; Wincza, K.; Gruszczynski, S. Driver’s Drowsiness Monitoring System Utilizing Microwave Doppler Sensor. In Proceedings of the 2012 19th International Conference on Microwaves, Radar & Wireless Communications, Warsaw, Poland, 21–23 May 2012; Volume 2, pp. 623–626.
8. Ciattaglia, G.; Spinsante, S.; Gambi, E. Slow-Time MmWave Radar Vibrometry for Drowsiness Detection. In Proceedings of the 2021 IEEE International Workshop on Metrology for Automotive (MetroAutomotive), Virtual Conference, 1–2 July 2021; pp. 141–146.
9. Hyundai Inc. Available online: https://www.tu-auto.com/hyundais-radar-to-protect-forgotten-kids/ (accessed on 28 October 2022).
10. Toyota Inc. Available online: https://blog.vayyar.com/vayyar-sensor-for-toyota-cabin-awareness (accessed on 6 November 2022).
11. Yang, Z.; Bocca, M.; Jain, V.; Mohapatra, P. Contactless Breathing Rate Monitoring in Vehicle Using UWB Radar. In Proceedings of the 7th International Workshop on Real-World Embedded Wireless Systems and Networks, Shenzhen, China, 4 November 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 13–18.
12. Vinci, G.; Lenhard, T.; Will, C.; Koelpin, A. Microwave Interferometer Radar-Based Vital Sign Detection for Driver Monitoring Systems. In Proceedings of the 2015 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Heidelberg, Germany, 27–29 April 2015; pp. 1–4.
13. Wang, F.; Zeng, X.; Wu, C.; Wang, B.; Liu, K.J.R. Driver Vital Signs Monitoring Using Millimeter Wave Radio. IEEE Internet Things J. 2022, 9, 11283–11298.
14. Gharamohammadi, A.; Pirani, M.; Khajepour, A.; Shaker, G. Multibin Breathing Pattern Estimation by Radar Fusion for Enhanced Driver Monitoring. IEEE Trans. Instrum. Meas. 2024, 73, 1–12.
15. Gharamohammadi, A.; Khajepour, A.; Shaker, G. In-Vehicle Monitoring by Radar: A Review. IEEE Sens. J. 2023, 23, 25650–25672.
16. Zhang, X.; Wu, Q.; Zhao, D. Dynamic Hand Gesture Recognition Using FMCW Radar Sensor for Driving Assistance. In Proceedings of the 2018 10th International Conference on Wireless Communications and Signal Processing (WCSP), Hangzhou, China, 18–20 October 2018; pp. 1–6.
17. Smith, K.A.; Csech, C.; Murdoch, D.; Shaker, G. Gesture Recognition Using Mm-Wave Sensor for Human-Car Interface. IEEE Sens. Lett. 2018, 2, 1–4.
18. Wang, X.; Bai, J.; Zhu, X.; Huang, L.; Xiong, M. Research on Gesture Recognition Algorithm Based on Millimeter-Wave Radar in Vehicle Scene; SAE Technical Paper; SAE: Warrendale, PA, USA, 2022.
19. Li, G.; Zhang, S.; Fioranelli, F.; Griffiths, H. Effect of Sparsity-Aware Time–Frequency Analysis on Dynamic Hand Gesture Classification with Radar Micro-Doppler Signatures. IET Radar Sonar Navig. 2018, 12, 815–820.
20. Molchanov, P.; Gupta, S.; Kim, K.; Pulli, K. Multi-Sensor System for Driver’s Hand-Gesture Recognition. In Proceedings of the 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Ljubljana, Slovenia, 4–8 May 2015; Volume 1, pp. 1–8.
21. Molchanov, P.; Gupta, S.; Kim, K.; Pulli, K. Short-Range FMCW Monopulse Radar for Hand-Gesture Sensing. In Proceedings of the 2015 IEEE Radar Conference (RadarCon), Arlington, VA, USA, 10–15 May 2015; pp. 1491–1496.
22. Khan, F.; Leem, S.K.; Cho, S.H. Hand-Based Gesture Recognition for Vehicular Applications Using IR-UWB Radar. Sensors 2017, 17, 833.
23. Khan, F.; Cho, S.H. Hand Based Gesture Recognition inside a Car through IR-UWB Radar. Korean Soc. Electron. Eng. 2017, 154–157.
24. Gharamohammadi, A.; Shaker, G. A Novel Back-Projection Algorithm Improved by Antenna Pattern Automatized by 2-D CFAR. In Proceedings of the 2022 IEEE International Symposium on Antennas and Propagation and USNC-URSI Radio Science Meeting (AP-S/URSI), Denver, CO, USA, 10–15 July 2022; pp. 1158–1159.
25. Gharamohammadi, A.; Shokouhmand, A. A Robust Whitening Algorithm to Identify Buried Objects with Similar Attributes in Correlation-Based Detection. J. Appl. Geophy. 2020, 172, 103917.
26. Gharamohammadi, A.; Norouzi, Y.; Aghaeinia, H. Optimized UWB Signal to Shallow Buried Object Imaging. Prog. Electromagn. Res. Lett. 2018, 72, 7–10.
27. Gharamohammadi, A.; Behnia, F.; Amiri, R. Imaging Based on Correlation Function for Buried Objects Identification. IEEE Sens. J. 2018, 18, 7407–7413.
28. Gharamohammadi, A.; Behnia, F.; Shokouhmand, A.; Shaker, G. Robust Wiener Filter-Based Time Gating Method for Detection of Shallowly Buried Objects. IET Signal Process. 2021, 15, 28–39.
29. Gharamohammadi, A.; Behnia, F.; Shokouhmand, A. Imaging Based on a Fast Back-Projection Algorithm Considering Antenna Beamwidth. In Proceedings of the 2019 6th Iranian Conference on Radar and Surveillance Systems, ICRSS 2019, Isfahan, Iran, 4–6 December 2019.
30. Gharamohammadi, A.; Behnia, F.; Shokouhmand, A. Machine Learning Based Identification of Buried Objects Using Sparse Whitened NMF. arXiv 2019, arXiv:1910.07180.
31. Möderl, J.; Posch, S.; Pernkopf, F.; Witrisal, K. “UWBCarGraz” Dataset for Car Occupancy Detection Using Ultra-Wideband Radar. arXiv 2023, arXiv:2311.10478.
32. Choi, J.W.; Yim, D.H.; Cho, S.H. People Counting Based on an IR-UWB Radar Sensor. IEEE Sens. J. 2017, 17, 5717–5727.
33. Nezirovic, A.; Yarovoy, A.G.; Ligthart, L.P. Signal Processing for Improved Detection of Trapped Victims Using UWB Radar. IEEE Trans. Geosci. Remote Sens. 2010, 48, 2005–2014.
34. Abu-Sardanah, S.; Gharamohammadi, A.; Ramahi, O.M.; Shaker, G. A Wearable Mm-Wave Radar Platform for Cardiorespiratory Monitoring. IEEE Sens. Lett. 2023, 7, 1–4.
35. Maaref, N.; Millot, P.; Pichot, C.; Picon, O. A Study of UWB FM-CW Radar for the Detection of Human Beings in Motion Inside a Building. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1297–1300.
36. Hunt, A.R. Use of a Frequency-Hopping Radar for Imaging and Motion Detection Through Walls. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1402–1408.
37. Hamidi, S.; Naeini, S.S.; Shaker, G. An Overview of Vital Signs Monitoring Based on RADAR Technologies. In Sensing Technology; Suryadevara, N.K., George, B., Jayasundera, K.P., Roy, J.K., Mukhopadhyay, S.C., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 113–124.
38. Hashim, N.M.Z.; Basri, H.; Jaafar, A.; Aziz, M.A.; Salleh, A.; Ja’afar, A.S. Child in Car Alarm System Using Various Sensors. ARPN J. Eng. Appl. Sci. 2014, 9.
39. Sterner, H.; Aichholzer, W.; Haselberger, M. Development of an Antenna Sensor for Occupant Detection in Passenger Transportation. Procedia Eng. 2012, 47, 178–183.
40. Mousel, T.; Larsen, P.; Lorenz, H. Unattended Children in Cars: Radiofrequency-Based Detection to Reduce Heat Stroke Fatalities. In Proceedings of the 25th International Technical Conference on the Enhanced Safety of Vehicles (ESV): Innovations in Vehicle Safety: Opportunities and Challenges, Detroit, MI, USA, 5–8 June 2017.
41. Diewald, A.R.; Landwehr, J.; Tatarinov, D.; di Mario Cola, P.; Watgen, C.; Mica, C.; Lu-Dac, M.; Larsen, P.; Gomez, O.; Goniva, T. RF-Based Child Occupation Detection in the Vehicle Interior. In Proceedings of the 2016 17th International Radar Symposium (IRS), Krakow, Poland, 10–12 May 2016; pp. 1–4.
42. Caddemi, A.; Cardillo, E. Automotive Anti-Abandon Systems: A Millimeter-Wave Radar Sensor for the Detection of Child Presence. In Proceedings of the 2019 14th International Conference on Advanced Technologies, Systems and Services in Telecommunications (TELSIKS), Nis, Serbia, 23–25 October 2019; pp. 94–97.
43. Diewald, A.R.; Fox, A.; Tatarinov, D. Thorough Analysis of Multipath Propagation Effects for Radar Applications in the Vehicle Interior. In Proceedings of the 2018 11th German Microwave Conference (GeMiC), Freiburg, Germany, 12–14 March 2018; pp. 63–66.
44. Abedi, H.; Magnier, C.; Mazumdar, V.; Shaker, G. Improving Passenger Safety in Cars Using Novel Radar Signal Processing. Eng. Rep. 2021, 3, e12413.
45. Diewald, A.R.; Tatarinov, D. Non-Broadside Patch Antenna for Car-Interior Passenger Detection. In Proceedings of the 2017 18th International Radar Symposium (IRS), Prague, Czech Republic, 28–30 June 2017; pp. 1–10.
46. Peng, W.; Li, H.; Tang, J.; Lin, L.; Teng, Z.; Luo, C.; Zhu, L.; Zhu, Z. Safety Protection System Computer Aided Design in Enclosed Vehicle Using Film Pressure Sensor and Microwave Radar. In Proceedings of the 2021 IEEE International Conference on Data Science and Computer Application (ICDSCA), Dalian, China, 28–30 October 2021; pp. 553–555.
47. Liao, J.; Xiang, G.; Cao, L.; Xia, J.; Yue, L. The Left-behind Human Detection and Tracking System Based on Vision with Multi-Model Fusion and Microwave Radar inside the Bus. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2020, 234, 2342–2354.
48. Abedi, H.; Magnier, C.; Shaker, G. Passenger Monitoring Using AI-Powered Radar. In Proceedings of the 2021 IEEE 19th International Symposium on Antenna Technology and Applied Electromagnetics (ANTEM), Winnipeg, MB, Canada, 8–11 August 2021; pp. 1–2.
49. Lim, S.; Jung, J.; Kim, S.-C.; Lee, S. Deep Neural Network-Based In-Vehicle People Localization Using Ultra-Wideband Radar. IEEE Access 2020, 8, 96606–96612.
50. Alizadeh, M.; Abedi, H.; Shaker, G. Low-Cost Low-Power in-Vehicle Occupant Detection with Mm-Wave FMCW Radar. In Proceedings of the 2019 IEEE Sensors, Montreal, QC, Canada, 27–30 October 2019; pp. 1–4.
51. Hyun, E.; Jin, Y.-S.; Park, J.-H.; Yang, J.-R. Machine Learning-Based Human Recognition Scheme Using a Doppler Radar Sensor for In-Vehicle Applications. Sensors 2020, 20, 202.
52. Abedi, H.; Luo, S.; Shaker, G. On the Use of Low-Cost Radars and Machine Learning for In-Vehicle Passenger Monitoring. In Proceedings of the 2020 IEEE 20th Topical Meeting on Silicon Monolithic Integrated Circuits in RF Systems (SiRF), San Antonio, TX, USA, 26–29 January 2020; pp. 63–65.
53. Chen, Y.; Luo, Y.; Qi, A.; Miao, M.; Qi, Y. In-Cabin Monitoring Based on Millimeter Wave FMCW Radar. In Proceedings of the 2021 13th International Symposium on Antennas, Propagation and EM Theory (ISAPE), Zhuhai, China, 1–4 December 2021; Volume 1, pp. 1–3.
54. Ma, Y.; Zeng, Y.; Jain, V. CarOSense: Car Occupancy Sensing with the Ultra-Wideband Keyless Infrastructure. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–28.
55. Abedi, H.; Luo, S.; Mazumdar, V.; Riad, M.M.Y.R.; Shaker, G. AI-Powered In-Vehicle Passenger Monitoring Using Low-Cost Mm-Wave Radar. IEEE Access 2022, 10, 18998–19012.
56. Song, H.; Shin, H.-C. Single-Channel FMCW-Radar-Based Multi-Passenger Occupancy Detection Inside Vehicle. Entropy 2021, 23, 1472.
57. Yang, X.; Ding, Y.; Zhang, X.; Zhang, L. Spatial-Temporal-Circulated GLCM and Physiological Features for In-Vehicle People Sensing Based on IR-UWB Radar. IEEE Trans. Instrum. Meas. 2022, 71, 1–13.
58. Lim, S.; Lee, S.; Jung, J.; Kim, S.-C. Detection and Localization of People Inside Vehicle Using Impulse Radio Ultra-Wideband Radar Sensor. IEEE Sens. J. 2020, 20, 3892–3901.
59. Song, H.; Yoo, Y.; Shin, H.-C. In-Vehicle Passenger Detection Using FMCW Radar. In Proceedings of the International Conference on Information Networking (ICOIN), Jeju Island, Republic of Korea, 13–16 January 2021; pp. 644–647.
60. Preethi, P.; Mamatha, H.R. Region-Based Convolutional Neural Network for Segmenting Text in Epigraphical Images. Artif. Intell. Appl. 2022, 1, 119–127.
61. Deng, W.; Cai, X.; Wu, D.; Song, Y.; Chen, H.; Ran, X.; Zhou, X.; Zhao, H. MOQEA/D: Multi-Objective QEA With Decomposition Mechanism and Excellent Global Search and Its Application. IEEE Trans. Intell. Transp. Syst. 2024.
62. Bhosle, K.; Musande, V. Evaluation of Deep Learning CNN Model for Recognition of Devanagari Digit. Artif. Intell. Appl. 2023, 1, 114–118.
63. Akande, T.; Alabi, O.A.; Ajagbe, S.A. A Deep Learning-Based CAE Approach for Simulating 3D Vehicle Wheels Under Real-World Conditions. Artif. Intell. Appl. 2024.
64. Song, Y.; Han, L.; Zhang, B.; Deng, W. A Dual-Time Dual-Population Multi-Objective Evolutionary Algorithm with Application to the Portfolio Optimization Problem. Eng. Appl. Artif. Intell. 2024, 133, 108638.
65. Li, F.; Chen, J.; Zhou, L.; Kujala, P. Investigation of Ice Wedge Bearing Capacity Based on an Anisotropic Beam Analogy. Ocean Eng. 2024, 302, 117611.
66. Wu, H.; Prasad, S. Semi-Supervised Deep Learning Using Pseudo Labels for Hyperspectral Image Classification. IEEE Trans. Image Process. 2018, 27, 1259–1270.
67. Li, M.; Wang, Y.; Yang, C.; Lu, Z.; Chen, J. Automatic Diagnosis of Depression Based on Facial Expression Information and Deep Convolutional Neural Network. IEEE Trans. Comput. Soc. Syst. 2024, 1–12.
68. Servadei, L.; Sun, H.; Ott, J.; Stephan, M.; Hazra, S.; Stadelmayer, T.; Lopera, D.S.; Wille, R.; Santra, A. Label-Aware Ranked Loss for Robust People Counting Using Automotive In-Cabin Radar. In Proceedings of the ICASSP 2022, 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 22–27 May 2022; pp. 3883–3887.
69. Alizadeh, M.; Shaker, G.; De Almeida, J.C.M.; Morita, P.P.; Safavi-Naeini, S. Remote Monitoring of Human Vital Signs Using Mm-Wave FMCW Radar. IEEE Access 2019, 7, 54958–54968.
70. Texas Instruments Inc. Self-Calibration in TI’s MmWave Radar Devices. Available online: https://dev.ti.com/tirex/explore/node?node=A__ALN7lP07MD0wMIRRMa1bwA__RADAR-ACADEMY__GwxShWe__LATEST&search=calibration (accessed on 7 January 2024).
71. Texas Instruments Inc. Vehicle Occupant Detection Reference Design. Available online: https://www.ti.com/lit/ug/tidue95a/tidue95a.pdf (accessed on 11 August 2022).
72. Bagheri, M.O.; Gharamohammadi, A.; Abu-Sardanah, S.; Ramahi, O.M.; Shaker, G. Radar Near-Field Sensing Using Metasurface for Biomedical Applications. Commun. Eng. 2024, 3, 51.
73. Núñez-Ortuño, J.M.; González-Coma, J.P.; Nocelo López, R.; Troncoso-Pastoriza, F.; Álvarez-Hernández, M. Beamforming Techniques for Passive Radar: An Overview. Sensors 2023, 23, 3435.
74. Shokouhmand, A.; Eckstrom, S.; Gholami, B.; Tavassolian, N. Camera-Augmented Non-Contact Vital Sign Monitoring in Real Time. IEEE Sens. J. 2022, 22, 11965–11978.
75. Chawla, N.V.; Bowyer, K.W.; Hall, L.O.; Kegelmeyer, W.P. SMOTE: Synthetic Minority over-Sampling Technique. J. Artif. Intell. Res. 2002, 16, 321–357.
76. Venkatesh, B.; Anuradha, J. A Review of Feature Selection and Its Methods. Cybern. Inf. Technol. 2019, 19, 3.
77. Liu, Y.; Wang, Y.; Zhang, J. New Machine Learning Algorithm: Random Forest. In Information Computing and Applications; Liu, B., Ma, M., Chang, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 246–252.
78. Kamel, H.; Abdulah, D.; Al-Tuwaijari, J.M. Cancer Classification Using Gaussian Naive Bayes Algorithm. In Proceedings of the 2019 International Engineering Conference (IEC), Erbil, Iraq, 23–25 June 2019; pp. 165–170.
79. An, T.-K.; Kim, M.-H. A New Diverse AdaBoost Classifier. In Proceedings of the 2010 International Conference on Artificial Intelligence and Computational Intelligence, Sanya, China, 23–24 October 2010; Volume 1, pp. 359–363.
80. Hatwell, J.; Gaber, M.M.; Atif Azad, R.M. Ada-WHIPS: Explaining AdaBoost Classification with Applications in the Health Sciences. BMC Med. Inform. Decis. Mak. 2020, 20, 250.
81. Sen, P.C.; Hajra, M.; Ghosh, M. Supervised Classification Algorithms in Machine Learning: A Survey and Review. In Emerging Technology in Modelling and Graphics; Mandal, J.K., Bhattacharya, D., Eds.; Springer Singapore: Singapore, 2020; pp. 99–111.
82. Bella Rose Baby Doll. Available online: https://www.bradfordexchange.ca/products/117759001_lifelike-breathing-baby-doll.html (accessed on 28 September 2023).
Figure 1. MIMO antenna array: (a) antenna positions and (b) equivalent virtual array. λ represents the wavelength, which is equal to 5 mm at 60 GHz [71].
Figure 2. The employed algorithm for occupancy detection.
Figure 3. The detected points for an adult in seat 2 in a single-target scenario. (a) Without multipath issue. (b) With multipath issue.
Figure 4. The detected points for seat 5, which are influenced by multiple reflections from seat 2, are shown in terms of (a) mean SNR and (b) location variance.
Figure 5. Radar setup inside the vehicle.
Figure 6. The effect of car vibration on reflected power after clutter cancellation in seat 2.
Figure 7. Number of detected points over 140 frames (28 s).
Figure 8. Mean SNR of the detected points over 140 frames (28 s).
Figure 9. Location variance of detected points over 140 frames (28 s).
Table 1. Key parameters of the used radar system.

| Parameter | Configuration |
|---|---|
| Device | AWR6843AOP |
| Number of transmitters | 3 |
| Number of receivers | 4 |
| Field of view | 120° horizontal, 120° vertical |
| Maximum range | 2.7 m |
| Range resolution | 5.3 cm |
| Maximum velocity | 1.7 m/s |
| Velocity resolution | 1.5 cm/s |
| Frame periodicity | 200 ms |
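The range and velocity entries in Table 1 follow the standard FMCW design equations. The sketch below derives the implied chirp bandwidth and chirp repetition period from the tabulated range resolution and maximum velocity; these are textbook relations, not values read from the device configuration.

```python
# Standard FMCW relations, used to sanity-check the Table 1 parameters.
C = 3e8                # speed of light, m/s
FC = 60e9              # carrier frequency, Hz (the AWR6843AOP operates in the 60 GHz band)
WAVELENGTH = C / FC    # = 5 mm, as stated for Figure 1

def bandwidth_for_range_resolution(delta_r):
    """Chirp bandwidth B implied by the range resolution delta_r = c / (2B)."""
    return C / (2 * delta_r)

def chirp_period_for_max_velocity(v_max):
    """Chirp repetition period Tc implied by v_max = lambda / (4 Tc)."""
    return WAVELENGTH / (4 * v_max)

B = bandwidth_for_range_resolution(0.053)   # 5.3 cm -> about 2.83 GHz of sweep
Tc = chirp_period_for_max_velocity(1.7)     # 1.7 m/s -> about 0.74 ms per chirp
```

The implied ~2.83 GHz sweep sits comfortably within the device's 60–64 GHz band, which is consistent with the tabulated 5.3 cm range resolution.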
Table 2. Different filter-based feature selection techniques with different score functions and different modes.

| Technique | Score Function | Mode | Number of Features | Model | Missed Detection of Baby (%) | False Detection of Baby (%) | Precision (%) | Recall (%) | F1-Score (%) |
|---|---|---|---|---|---|---|---|---|---|
| Generic univariate feature selector | ANOVA F-value | False positive rate | 8 | RF | 4 | 1.1 | 98 | 96 | 97 |
| | | | | GNB | 4.9 | 1.1 | 97.5 | 95.5 | 96.5 |
| | | | | ADB | 4.4 | 1.3 | 97.3 | 96 | 96.7 |
| Generic univariate feature selector | Chi-squared | False positive rate | 7 | RF | 4.4 | 0.2 | 97.8 | 95.6 | 96.7 |
| | | | | GNB | 4.9 | 2.4 | 96.5 | 94.8 | 95.6 |
| | | | | ADB | 5.3 | 0.2 | 97.2 | 94.7 | 95.9 |
| Rank2D | algorithm = ‘covariance’ | | 4 | RF | 1.8 | 1.5 | 98.5 | 98 | 98.2 |
| | | | | GNB | 1.8 | 1.5 | 98.5 | 98 | 98.2 |
| | | | | ADB | 1.8 | 0.2 | 98.8 | 98 | 98.4 |
| SelectKBest | Mutual information | Number of features = 2 | 2 | RF | 2.7 | 2 | 98.2 | 97 | 97.6 |
| | | | | GNB | 1.8 | 4.2 | 97 | 98 | 97.5 |
| | | | | ADB | 2.7 | 1.8 | 98.2 | 97 | 97.6 |
| Generic univariate feature selector | ANOVA F-value | Percentile | 1 | RF | 6.2 | 2.2 | 97 | 94 | 95.5 |
| | | | | GNB | 4.9 | 5.6 | 94 | 95 | 94.5 |
| | | | | ADB | 6.2 | 2.2 | 97 | 94 | 95.5 |

Gharamohammadi, A.; Dabak, A.G.; Yang, Z.; Khajepour, A.; Shaker, G. Volume-Based Occupancy Detection for In-Cabin Applications by Millimeter Wave Radar. Remote Sens. 2024, 16, 3068. https://doi.org/10.3390/rs16163068