Article

Periodic Transformer Encoder for Multi-Horizon Travel Time Prediction

by Hui-Ting Christine Lin 1 and Vincent S. Tseng 1,2,*

1 Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 30010, Taiwan
2 Department of Management Information Systems, National Chung Hsing University, Taichung 402, Taiwan
* Author to whom correspondence should be addressed.
Electronics 2024, 13(11), 2094; https://doi.org/10.3390/electronics13112094
Submission received: 9 May 2024 / Revised: 24 May 2024 / Accepted: 25 May 2024 / Published: 28 May 2024
(This article belongs to the Special Issue Data-Centric Artificial Intelligence: New Methods for Data Processing)

Abstract

In the domain of Intelligent Transportation Systems (ITS), ensuring reliable travel time predictions is crucial for enhancing the efficiency of transportation management systems and supporting long-term planning. Recent advancements in deep learning have demonstrated the ability to effectively leverage large datasets for accurate travel time predictions. These innovations are particularly vital as they address both short-term and long-term travel demands, which are essential for effective traffic management and scheduled route planning. Despite advances in deep learning applications for traffic analysis, the dynamic nature of traffic patterns frequently challenges the forecasting capabilities of existing models, especially when forecasting both immediate and future traffic conditions across various time horizons. Moreover, long-term travel time forecasting remains underexplored in current research because of these complexities. In response to these challenges, this study introduces the Periodic Transformer Encoder (PTE), a Transformer-based model designed to enhance travel time predictions by effectively capturing temporal dependencies across various horizons. Utilizing attention mechanisms, PTE learns from long-range periodic traffic data to handle both short-term and long-term fluctuations. Furthermore, PTE employs a streamlined encoder-only architecture that eliminates the need for a traditional decoder, significantly simplifying the model's structure and reducing its computational demands. This architecture improves both training efficiency and the performance of direct travel time predictions. With these enhancements, PTE effectively tackles the challenges presented by dynamic traffic patterns, significantly improving prediction performance across multiple time horizons. Comprehensive evaluations on an extensive real-world traffic dataset demonstrate PTE's superior performance in predicting travel times over multiple horizons compared with existing methods. PTE is notably effective in adapting to high-variability road segments and peak traffic hours. These results confirm PTE's effectiveness and robustness across diverse traffic environments, indicating its significant contribution to advancing traffic prediction capabilities within ITS.

1. Introduction

Intelligent Transportation Systems (ITS) have experienced significant growth, driven by increasing urbanization and growing demands for efficient transportation management. These systems leverage advanced data analytics and real-time control mechanisms to enhance the reliability, safety, and efficiency of transportation networks. As urban populations grow and transportation becomes more complex, ITS are required not only to handle immediate fluctuations in traffic but also to anticipate long-term changes and challenges. Consequently, the evolution of these systems demands the development of advanced predictive technologies capable of operating across multiple time horizons. This capability is crucial for effectively managing day-to-day operations while also supporting strategic decisions that ensure sustainability and resilience in transportation planning. By predicting traffic patterns minutes to hours or even days in advance, ITS can improve traffic flow, reduce congestion, and enhance urban mobility in both the short and long term.
Travel time prediction is a fundamental component of ITS that significantly contributes to the efficiency and reliability of transportation systems. Accurate predictions of travel times are essential for route planning, congestion management, and overall traffic enhancement. For example, they enable commuters to make informed decisions, reduce waiting times, and enhance their commuting experience. As transportation networks grow in complexity, the ability to predict travel times accurately over multiple horizons—from a few minutes ahead to several hours—becomes critical for operational efficiency and strategic planning.
Traffic prediction has often employed traditional statistical methods such as Historical Averages (HA) and the Autoregressive Integrated Moving Average (ARIMA) because of their straightforward computational frameworks and strong theoretical foundations [1,2,3]. These methods, particularly effective for simple, short-term forecasts, rely heavily on stable historical data patterns; HA, for instance, simply averages past observations to produce predictions. However, such methods face limitations when dealing with multivariate time series or dynamic traffic conditions that require adaptation to rapid changes. These limitations highlight the need for more advanced forecasting techniques that can handle the complexity and variability of long-term traffic predictions.
The advent of data-driven approaches, particularly those leveraging machine learning and deep learning, has transformed traffic prediction methodologies. These methods utilize vast amounts of data to learn complex patterns and dependencies, which are often invisible to traditional statistical methods. In traffic prediction, deep learning techniques such as recurrent neural networks (RNNs) and long short-term memory (LSTM) networks have initially shown promise in handling temporal dependencies and variable traffic conditions. Early studies highlighted the efficacy of LSTM networks in predicting traffic flow dynamics [4]. Further research by [5] expanded the use of LSTM models to encompass travel time prediction, validating their effectiveness in practical settings. Subsequent studies have innovated by integrating Convolutional Neural Networks (CNNs) with LSTMs, enabling the simultaneous processing of spatial and temporal traffic data to enhance prediction precision [6]. Moreover, recent advancements have included weather data integration into deep learning frameworks to enhance the accuracy of travel time predictions [7]. Despite these successes, many current models are primarily designed for short-term predictions and face challenges in long-term forecasting due to their limited ability to effectively handle extended-range dependencies.
In recent years, transformer-based methods have revolutionized deep learning, especially in the analysis of sequence data. By utilizing attention mechanisms, these models can interpret complex relationships across extended time periods, establishing their superiority over traditional models in scenarios that demand an understanding of long-term dependencies. Transformers have shown versatility and efficacy in traffic prediction within various aspects of ITS. They excel in applications ranging from traffic speed analysis [8] to travel time prediction [9], and traffic flow forecasting [10,11,12,13]. These methods show their strengths in handling diverse data types, demonstrating their capability to meet the complex demands of modern traffic systems. Models such as the Informer [14] and PDFormer [11] address significant challenges, including high computational demands and dynamic traffic conditions, respectively. The Informer efficiently processes long data sequences through a ProbSparse self-attention mechanism, improving both processing time and resource usage, which is beneficial for accurate long-term predictions. Meanwhile, the PDFormer introduces innovative approaches to accommodate propagation delays in dynamic scenarios, enhancing the practicality of forecasting models in ITS.
The adaptability of transformer models is further demonstrated by their capability to manage complex spatial–temporal relationships and integrate multimodal data [15]. The Spatial–Temporal Transformer Networks (STTNs) [16] particularly demonstrate the ability of transformers to explore the nonlinear and dynamic dependencies typical in traffic data, making them suitable for complex forecasting tasks. Additionally, transformers adeptly incorporate external factors, such as weather conditions and special events, into their predictions. This integration significantly improves model accuracy across various traffic scenarios, as evidenced by models specifically designed to include such external influences [12,13]. Overall, these transformer models have proven their capability to handle complex, multifaceted data, significantly enhancing both performance and efficiency in long-term time series analysis. Their potential within the ITS domain is significant, demonstrating their ability to improve long-term traffic prediction by effectively managing the intricate and varied data typical of these systems.
In advancing travel time prediction within ITS, this study introduces the Periodic Transformer Encoder (PTE). The PTE leverages the strengths of transformer models to overcome the limitations of existing traffic prediction methods, which are primarily designed for short-term forecasting and remain insufficient in handling the intricacies of long-term predictions. The PTE is specifically designed for multi-horizon prediction, adeptly handling both immediate and extended forecasting challenges by effectively utilizing periodic data, an aspect that existing methods have not fully explored. The contributions of this study are detailed as follows:
  • The proposed PTE framework represents a significant advancement in travel time prediction. By learning from both short-term and long-term traffic patterns and effectively capturing temporal dependencies across various horizons, the PTE enhances prediction performance for all time scales, consistently outperforming existing methods.
  • The PTE introduces a streamlined encoder-only architecture that eliminates the need for a traditional decoder, thereby reducing model complexity and resource requirements. This design simplifies the training and inference processes while reducing computational resources.
  • A series of evaluations conducted on a comprehensive real-world traffic dataset demonstrates the superior performance of the PTE. These results show that the PTE outperforms existing methods in predicting travel times across multiple horizons. Notably, the PTE proves its robustness in handling diverse traffic conditions and varying degrees of complexity, affirming its capability to deliver more accurate and reliable travel time predictions.
The rest of this paper is structured as follows. Section 2 provides an overview of related work. Section 3 describes the proposed framework in detail. Section 4 presents the experimental setup, and Section 5 discusses the results and implications of the experiments. Section 6 provides a summary of the study.

2. Related Work

2.1. Short-Term Traffic Prediction

Early research in traffic prediction primarily utilized traditional machine learning techniques. For instance, support vector regression (SVR) was employed in early works to tackle traffic forecasting challenges [17]. The Autoregressive Integrated Moving Average (ARIMA) model was also used to predict travel times based on historical data, achieving effective one-step-ahead predictions [2]. Additionally, enhancements to conventional models were proposed through the introduction of traffic trend adjustments into the K-nearest neighbors (KNN) model, known as KNN-T [18]. Another approach combined random vector functional link networks with empirical mode decomposition, specifically addressing short-term traffic predictions [19].
In recent years, deep learning has significantly expanded across various fields, enhancing its utilization in traffic forecasting. Long Short-Term Memory (LSTM) networks have demonstrated effectiveness in traffic flow prediction due to their superior capacity for capturing long-term dependencies [4]. Similarly, the deployment of Long Short-Term Memory networks combined with deep neural network layers (LSTM-DNN) has proven effective for analyzing travel time data on highways [20]. Convolutional neural networks (CNNs) have been adapted to handle spatio-temporal traffic data, with local receptive fields designed to improve predictive performance [21]. Furthermore, the integration of Gated Recurrent Units (GRU) and XGBoost has been employed to extract hidden patterns in traffic data, thereby enhancing the precision of predictions [22]. Significant advancements include the development of the bidirectional spatial–temporal adaptive transformer (Bi-STAT), which features an encoder–decoder architecture. This architecture includes spatial-adaptive and temporal-adaptive transformer components for accurate traffic forecasting [23]. Moreover, to capture the inherent periodicity and continuity in traffic data, models have integrated graph convolutional networks (GCNs) for spatial dependency mapping and transformers for temporal analysis [24]. These approaches highlight the diversity of techniques available for short-term forecasting. However, they typically face challenges when extended to longer durations or iterative multi-step forecasts, which can lead to error accumulation. The challenge of modeling long-term dependencies with deep learning models remains significant [12].

2.2. Long-Term Traffic Prediction

Long-term traffic prediction is a critical requirement for various real-world applications, leading to the development of models adept at managing complex datasets. To address both short-term and long-term traffic flow predictions, hybrid forecasting algorithms were introduced [25]. Following this, a DNN model was utilized to improve daily traffic flow predictions in Seattle by leveraging both contextual variables and raw traffic data [26]. Moreover, RNNs were employed to predict long-term traffic flows in urban areas, integrating meteorological and contextual data to refine predictions [27]. Subsequent advancements led to the creation of a hybrid model that merges wavelet decomposition with CNN and LSTM technologies [28]. This model preprocesses traffic data using wavelet technology to effectively extract temporal features. Furthermore, a gradient boosting model, augmented by Fourier filtering to mitigate noise and amalgamate diverse data sources, has been adopted to tackle the challenges of long-term traffic prediction [29]. Additionally, the deep ensemble stacked LSTM model (DE-SLSTM) incorporates weather data to adjust for biases in forecasts that span several hours into the future [7]. These works demonstrate the evolving scope of traffic prediction technologies, illustrating significant advancements in addressing long-term forecasting challenges. Each work has contributed to enhancing predictive accuracy and adapting to the complexities of traffic data in various applications.
Recent advancements have shown the effectiveness of attention-based methods in capturing long-term dependencies. A transformer model with multi-head attention has been introduced for long-term traffic flow forecasting [30]. The multisize patched spatial–temporal transformer network (MSP-STTN) utilizes self-attention for enriched context modeling and cross-attention for global memory learning, aiming at both short- and long-term grid-based crowd flow predictions [12]. Furthermore, the Temporal Fusion Transformer (TFT) merges short-term and long-term temporal patterns and uses various input types to effectively manage prediction horizons ranging from 5 to 150 min [31]. Although these models work to forecast traffic conditions over extended periods, the complexity of traffic dynamics and the need for long-duration predictions present significant challenges. Particularly, error accumulation in multi-step forecasting remains a critical issue. To tackle this, advanced models integrating attention mechanisms within an encoder–decoder architecture have been proposed [32]. This work uses periodic data to enhance forecasting accuracy, particularly for long-term scenarios. By focusing on relevant segments of data through attention mechanisms, these models aim to reduce the common problem of error accumulation observed in multi-step long-term forecasting. However, effectively bridging the gap between short-term and long-term prediction capabilities in traffic management remains an ongoing challenge.

3. Proposed Methodology

3.1. Problem Formulation

This study proposes the Periodic Transformer Encoder (PTE) as a solution for estimating future travel times from past traffic data along specific road segments. The proposed architecture is illustrated in Figure 1. Fundamental to our approach is the construction of a transformation function f ( · ) for each road segment r. This function systematically processes historical traffic data to generate forecasts of future travel times, represented as:
$$X_{r,\tau} \xrightarrow{\;f_{r,\Delta}\;} Y_{r,\tau,\Delta}$$
where the historical traffic data $X_{r,\tau} = \{x^{r}_{\tau-T+1}, \ldots, x^{r}_{\tau}\}$ comprise observations from the last $T$ time steps up to the current point $\tau$, and the predicted future data $Y_{r,\tau,\Delta} = \{y^{r}_{\tau+\Delta+1}, \ldots, y^{r}_{\tau+\Delta+T}\}$ represent the travel times for the next $T$ steps beginning $\Delta$ time steps after $\tau$.
To enhance clarity and ease of understanding, the subsequent sections of this work omit the time and road segment indicators ($r$, $\tau$, and $\Delta$) from the notations for $X$ and $Y$.
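To make this formulation concrete, the sketch below builds (X, Y) training pairs for a single road segment. It is an illustration under our own assumptions (a one-dimensional NumPy series of travel times and T = 12 five-minute steps, matching the windows used later), not the authors' code.

```python
import numpy as np

def make_pairs(series: np.ndarray, T: int = 12, delta: int = 0):
    """Build (X, Y) pairs for one road segment r, following the mapping
    X_{r,tau} -> Y_{r,tau,Delta}: X holds the last T observations up to
    tau, Y the T values starting delta steps after tau."""
    X, Y = [], []
    for tau in range(T - 1, len(series) - delta - T):
        X.append(series[tau - T + 1:tau + 1])                  # x_{tau-T+1}, ..., x_tau
        Y.append(series[tau + delta + 1:tau + delta + T + 1])  # y_{tau+delta+1}, ..., y_{tau+delta+T}
    return np.stack(X), np.stack(Y)
```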

3.2. Data Preprocessing and Segmentation

The initial data preprocessing stages are visualized in Figure 1, highlighting the flow from raw historical data through the transformations necessary for effective model input preparation, following the established procedures of [32]. Initially, missing values in the traffic data, such as travel time and speed, are filled by linear interpolation, which maintains the temporal integrity of the data and is crucial for the reliability of subsequent predictions. To standardize the data and mitigate the influence of outliers, each data point is normalized using the Z-score method, adjusting each value based on the mean (μ) and standard deviation (σ) of the dataset. In addition, the traffic dataset is complemented by a variety of temporal attributes, including day types (weekends, weekdays, national holidays), day of the week, months, times of day, and peak periods. Each data instance is transformed into a 316-dimensional vector, where the first two dimensions capture the critical traffic metrics (travel time and speed) and the remaining 314 dimensions are dedicated to binary encodings of these temporal attributes, as detailed in Table 1.
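As a concrete illustration of this pipeline, the sketch below interpolates gaps, applies Z-score normalization, and assembles the 316-dimensional vectors of Table 1. It is a minimal reconstruction under assumptions of our own: the column names ("travel_time", "speed"), the 5 min DatetimeIndex, and the peak-hour boundaries are assumed, and the national-holiday flag would need a holiday calendar that is omitted here.

```python
import numpy as np
import pandas as pd

def preprocess(df: pd.DataFrame):
    """df: 5 min resolution DataFrame with a DatetimeIndex and columns
    'travel_time' and 'speed'. Returns (features, mu, sigma), where
    features has shape (len(df), 316): 2 traffic metrics plus 314 binary
    temporal attributes (Table 1)."""
    df = df.interpolate(method="linear")                       # fill missing values
    mu, sigma = df["travel_time"].mean(), df["travel_time"].std()
    z_tt = (df["travel_time"] - mu) / sigma                    # Z-score normalization
    z_sp = (df["speed"] - df["speed"].mean()) / df["speed"].std()

    idx = df.index
    onehot = lambda codes, n: np.eye(n)[np.asarray(codes)]
    # Day type (3): 0 = weekday, 1 = weekend; class 2 (holiday) needs a calendar.
    day_type = onehot(np.where(idx.dayofweek >= 5, 1, 0), 3)
    dow   = onehot(idx.dayofweek, 7)                           # day of the week (7)
    month = onehot(idx.month - 1, 12)                          # month (12)
    hours = idx.hour
    peak_codes = np.select(                                    # peak hours (4); bounds assumed
        [(hours >= 7) & (hours < 9),
         (hours >= 11) & (hours < 13),
         (hours >= 16) & (hours < 20)],
        [0, 1, 2], default=3)
    peak = onehot(peak_codes, 4)
    slot = onehot(idx.hour * 12 + idx.minute // 5, 288)        # time slot (288)

    features = np.column_stack([z_tt, z_sp, day_type, dow, month, peak, slot])
    return features, mu, sigma                                 # features.shape[1] == 316
```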
Periodic segments $S_d$ are extracted from the preprocessed historical data, where $d$ denotes the number of days prior to the current time, ranging from $d = 0$ to $d = 7$. The set $S$ is thus composed of these segments, $S = \{S_0, S_1, \ldots, S_7\}$, which comprise one short-term segment and seven long-term segments drawn from the preceding week. For instance, if the current time is 8:00 A.M. on May 8 and the goal is to predict travel times for the next hour, from 8:00 to 8:55 A.M., $S_0$ covers the one-hour period immediately preceding the current time. Specifically, $S_0 = \{s^0_0, s^0_1, \ldots, s^0_{11}\}$ includes traffic data from 7:00 to 7:55 A.M., sampled at 5 min intervals. Conversely, the long-term segments $S_1$ to $S_7$ each represent the same one-hour period, containing twelve time points from 8:00 to 8:55 A.M. and corresponding to the same hour on the days leading up to the prediction day, stretching back from May 7 to May 1. For a visual explanation of how these segments are structured, refer to Figure 2. These segments, $\{S_0, S_1, \ldots, S_7\}$, are aggregated into a periodic tensor $S$ with dimensions $96 \times 316$. The first dimension, 96, is the total number of time points across the eight segments, each $S_d$ contributing 12 points that capture one hour of traffic data. The second dimension, 316, corresponds to the vector of traffic-related metrics and temporal attributes for each time point. The aggregated periodic tensor is embedded with temporal positional information, resulting in a component termed the temporal encoded periodic segment. This step supplies the model with information about sequence order, employing the Positional Encoding techniques described in [33]. Positional encoding is applied to the input embeddings at each position using sine and cosine functions of different frequencies:
$$\mathrm{PE}_{(pos,\,2i)} = \sin\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right),$$
$$\mathrm{PE}_{(pos,\,2i+1)} = \cos\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right).$$
Here, $pos$ represents the position within the sequence, and $i$ denotes the dimension index. Each position in the input sequence receives a unique positional encoding. The model dimension, $d_{\mathrm{model}}$, is set to 316, which also corresponds to the dimension of each embedding. The output dimensions remain identical to those of the input. These segments, termed temporal encoded periodic segments, are subsequently fed into the Periodic Transformer Encoder.
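The two equations translate directly into code; a minimal sketch for the 96 × 316 periodic tensor follows. Adding the result elementwise to the aggregated tensor yields the temporal encoded periodic segments.

```python
import numpy as np

def positional_encoding(seq_len: int = 96, d_model: int = 316) -> np.ndarray:
    """Sinusoidal positional encoding PE(pos, 2i) / PE(pos, 2i+1) as above."""
    pos = np.arange(seq_len)[:, None]                 # positions 0..95
    i2 = 2 * (np.arange(d_model) // 2)                # 0, 0, 2, 2, 4, 4, ...
    angles = pos / np.power(10000.0, i2 / d_model)    # pos / 10000^(2i/d_model)
    pe = np.empty((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])             # even dimensions
    pe[:, 1::2] = np.cos(angles[:, 1::2])             # odd dimensions
    return pe

# S_temporal = S + positional_encoding()   # S: the 96 x 316 periodic tensor
```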

3.3. Periodic Transformer Encoder

As illustrated in Figure 3, the proposed Periodic Transformer Encoder (PTE) employs a Transformer Encoder to process the aggregated periodic tensor to capture dependencies across various time horizons. This process enhances interactions between different periodic elements, leading to a unified and robust representation. The temporal encoded periodic segments, S, are input into the Transformer Encoder with dimensions of 96 × 316 , which represents 8 periodic cycles, each comprising 12 segments that correspond to one-hour time points with 316 features each, covering both short-term and long-term dependencies. The multi-head self-attention mechanism of the encoder is particularly effective for capturing the long-range dependencies typical of week-long traffic data. It uses three sets of weights, Query (Q), Key (K), and Value (V), to transform the input tensor:
$$Q = SW^{Q}, \quad K = SW^{K}, \quad V = SW^{V}$$
These transformations project the input tensor into spaces that facilitate the computation of attention scores, calculated as follows:
$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left(\frac{QK^{T}}{\sqrt{d_k}}\right)V$$
where $d_k$ is set to the input dimension. This mechanism enables the encoder to weigh and prioritize different segments of the input based on their relevance to the output prediction. Following the attention mechanism, the process continues as described in [33], involving the addition and normalization steps and passing through feed-forward networks (FFNs) within each encoder layer.
Finally, the output from the Transformer Encoder, $S_{\mathrm{enc}}$, which retains the dimensions $96 \times 316$, is further processed through fully connected layers to predict travel times:
$$Y = \mathrm{ReLU}(S_{\mathrm{enc}} W_1 + b_1) W_2 + b_2$$
The dimensions transform from 96 × 316 to 12 × 1 , where each output corresponds to a predicted travel time at 5 min intervals over one hour.
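A compact PyTorch sketch of this subsection's pipeline is given below: one Transformer encoder layer over the 96 × 316 input, followed by the two fully connected layers of the equation above. The layer counts and the 4 attention heads follow Table 3, while the hidden width of the first FC layer is our assumption, since the paper does not state it.

```python
import torch
import torch.nn as nn

class PTE(nn.Module):
    """Encoder-only sketch of the Periodic Transformer Encoder (Figure 3)."""
    def __init__(self, d_model: int = 316, seq_len: int = 96,
                 n_heads: int = 4, horizon: int = 12, hidden: int = 128):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)  # 1 encoder layer (Table 3)
        self.fc1 = nn.Linear(seq_len * d_model, hidden)            # W1, b1 (hidden width assumed)
        self.fc2 = nn.Linear(hidden, horizon)                      # W2, b2 -> 12 outputs

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        s_enc = self.encoder(s)                 # (batch, 96, 316), i.e., S_enc
        y = self.fc2(torch.relu(self.fc1(s_enc.flatten(1))))
        return y                                # (batch, 12) normalized 5 min predictions
```

A prediction on the original scale is then recovered with the reverse Z-score transformation described next.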
Before generating the final predictions, a reverse Z-score transformation is applied. This step is for converting normalized prediction values back to their original scale. The reverse Z-score transformation is performed using the formula:
$$X = Z \times \sigma + \mu$$
where Z is the normalized prediction, σ is the standard deviation used during the initial normalization, and μ is the mean of the original traffic data.

4. Experimental Setup

4.1. Datasets

To evaluate the effectiveness of the proposed PTE, comprehensive experiments were conducted using the Taiwan Expressway dataset from the Freeway Bureau of Taiwan [34]. This dataset encompasses traffic data for 322 road segments. Following the procedures outlined in [32], we selected the same 15 road segments representing diverse regions of Taiwan: Northern, Central, and Southern. These segments were chosen for their varying travel time statistics, such as mean, standard deviation, and coefficient of variation. This selection aims to capture a broad range of traffic conditions throughout Taiwan, thereby enhancing the representativeness of our dataset. Such diversity is expected to strengthen the validity of our findings across different traffic environments. Details regarding these segments and their corresponding travel time statistics are given in Table 2. The dataset covers the period from 1 October 2019 to 31 January 2021. To ensure a fair experimental design, the data from the first year were used for training, the subsequent month for validation, and the final three months for testing.
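Assuming the data carry a timestamp index (as in the preprocessing sketch above), this chronological split can be expressed as follows; the day-level boundaries are inferred from the stated period.

```python
# Chronological split of the 1 Oct 2019 - 31 Jan 2021 data (df: DatetimeIndex).
train = df.loc["2019-10-01":"2020-09-30"]   # first year for training
val   = df.loc["2020-10-01":"2020-10-31"]   # following month for validation
test  = df.loc["2020-11-01":"2021-01-31"]   # final three months for testing
```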

4.2. Segmentation of Traffic Conditions

To further illustrate performance under varying traffic densities, the experimental results are segmented into two time categories: peak and off-peak. Peak hours, defined as the intervals from 7:00 A.M. to 9:00 A.M. and 3:30 P.M. to 7:30 P.M., capture the typical rush hours and are contrasted against the less congested off-peak periods. This segmentation addresses the greater variability expected during peak periods, which arises from recurring rush-hour demand and random events such as accidents, as illustrated in Figure 4.
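A small sketch of this segmentation, assuming a DatetimeIndex on the data and treating the stated intervals as half-open:

```python
# Peak hours: 7:00-9:00 A.M. and 3:30-7:30 P.M. (Section 4.2).
minutes = df.index.hour * 60 + df.index.minute
is_peak = ((minutes >= 7 * 60) & (minutes < 9 * 60)) | \
          ((minutes >= 15 * 60 + 30) & (minutes < 19 * 60 + 30))
peak_df, off_peak_df = df[is_peak], df[~is_peak]
```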

4.3. Road Segment Complexity Analysis

Empirical evaluation results are categorized based on the complexity levels of road segments, assessing PTE on road segments with High, Moderate, and Low Variability, determined by the standard deviation of travel times. Road segments are classified as High Variability (standard deviation > 100), Moderate Variability (standard deviation between 50 and 100), and Low Variability (standard deviation < 50), indicating varying levels of forecasting challenge. The coefficient of variation (CV), defined as the ratio of the standard deviation to the mean travel time, $CV = \sigma / \mu$, is used to quantify variability relative to the mean, enhancing our understanding of each segment's predictability.
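Expressed in code, the classification rule and the CV computation look as follows (how boundary values of exactly 50 or 100 are assigned is our assumption, since the stated ranges leave them ambiguous):

```python
import pandas as pd

def classify_segment(travel_times: pd.Series):
    """Return the Section 4.3 variability label and CV = sigma / mu."""
    sigma, mu = travel_times.std(), travel_times.mean()
    if sigma > 100:
        label = "High Variability"
    elif sigma >= 50:
        label = "Moderate Variability"
    else:
        label = "Low Variability"
    return label, sigma / mu
```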

4.4. Competitive Methods

To effectively assess the capabilities of the proposed PTE, it is compared against a range of established and contemporary models in the field. The following methods are included in the comparative analysis:
  • HA [1] employs historical averages for traffic forecasting by applying straightforward time series regression techniques based on past data.
  • LSTM [35] represents a type of recurrent neural network known for its capability to manage sequence-based prediction challenges, including travel time forecasts.
  • DNN [26] is a multi-layer deep neural network structured to address various predictive tasks related to traffic, such as estimating travel durations and analyzing flow dynamics.
  • DE-SLSTM [7] augments conventional LSTM models by incorporating both short-term and extensive historical traffic data to enhance travel time prediction precision.
  • MTSMFF [36] is an approach designed for multivariate time series forecasting that combines BiLSTM units with attention mechanisms to extract and analyze underlying data intricacies.
  • DHM [22] combines Gated Recurrent Units (GRU) with the XGBoost algorithm to forecast freeway travel times and integrate these predictions using linear regression techniques.
  • TFT [31] utilizes Temporal Fusion Transformers to effectively merge different types of inputs, demonstrating flexibility in forecasting speeds across varying freeway conditions.
  • PASS2S [32] integrates an attention mechanism into a sequence-to-sequence LSTM model, focusing specifically on addressing the complexities of long-term travel time predictions.
To ensure a fair comparison across all competitive methods, each model utilizes both short-term and historical data from one week prior, aligning with the experimental settings described in [32]. Furthermore, consistency in evaluation is maintained by utilizing the test results from [32], thereby aligning our experimental data with the outcomes of their research.

4.5. Parameter Settings and Evaluation Metrics

The configuration of our model’s hyperparameters is outlined in Table 3. Additionally, three standard metrics commonly used in travel time forecasting are employed, defined mathematically as follows:
$$\mathrm{MAE} = \frac{1}{N \times l} \sum_{i=1}^{N} \sum_{j=1}^{l} \left| y_{i,j} - \hat{y}_{i,j} \right|,$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{N \times l} \sum_{i=1}^{N} \sum_{j=1}^{l} \left( y_{i,j} - \hat{y}_{i,j} \right)^{2}},$$
$$\mathrm{SMAPE} = \frac{100\%}{N \times l} \sum_{i=1}^{N} \sum_{j=1}^{l} \frac{\left| y_{i,j} - \hat{y}_{i,j} \right|}{\left( \left| y_{i,j} \right| + \left| \hat{y}_{i,j} \right| \right) / 2},$$
where:
  • $y_{i,j}$ denotes the actual travel time recorded for the i-th sample at time point j.
  • $\hat{y}_{i,j}$ represents the predicted travel time for the i-th sample at time point j.
  • N is the total count of samples included in the dataset.
  • l indicates the total number of time points within the forecast period.
These metrics provide a comprehensive view of the model’s performance, accounting for different aspects of forecasting performance.
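The three metrics translate directly into a minimal NumPy implementation over prediction arrays of shape (N, l):

```python
import numpy as np

def evaluate(y: np.ndarray, y_hat: np.ndarray):
    """MAE, RMSE, and SMAPE as defined above; y, y_hat: shape (N, l)."""
    err = y - y_hat
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    smape = 100.0 * (np.abs(err) / ((np.abs(y) + np.abs(y_hat)) / 2)).mean()
    return mae, rmse, smape
```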
Table 3. Experimental settings.

| Setting | Value |
|---|---|
| Learning rate | 0.0001 |
| Batch size | 128 |
| Attention heads | 4 per encoder block |
| Number of layers (Encoder, FC) | 1, 2 |
| Output dimension | 12 |
| Training epochs | 15 |
| Loss function | Mean Square Error (MSE) |
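Combined with the PTE sketch from Section 3.3, Table 3 maps onto a standard training loop. The optimizer choice (Adam) and the placeholder S_train/Y_train tensors feeding the loader are our assumptions; the paper specifies only the values listed in the table.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data; real tensors come from the Section 3.2 pipeline.
S_train = torch.randn(1024, 96, 316)
Y_train = torch.randn(1024, 12)

model = PTE()                                              # sketch from Section 3.3
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # learning rate 0.0001 (optimizer assumed)
criterion = nn.MSELoss()                                   # MSE loss (Table 3)
train_loader = DataLoader(TensorDataset(S_train, Y_train),
                          batch_size=128, shuffle=True)    # batch size 128

for epoch in range(15):                                    # 15 training epochs
    for s, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(s), y)
        loss.backward()
        optimizer.step()
```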

5. Results and Discussions

This section conducts a comparative analysis of the proposed PTE against established competitive methods, focusing on two primary prediction types: short-term (within the next hour) and long-term (from 1 day to a week ahead). For detailed experimental conditions regarding peak and off-peak hours, please refer to Section 4.2. The impact of road segment variability on prediction performance is also evaluated to provide insights into PTE's robustness across fluctuating traffic conditions; for more information, see Section 4.3. In the results tables, the top-performing results are highlighted in bold, and the second-best outcomes are underlined.

5.1. Short-Term Travel Time Prediction

For short-term forecasts throughout the entire day, Table 4 demonstrates PTE’s exceptional capability in short-term forecasting, achieving the lowest error rates across all evaluation metrics for predictions up to one hour. PTE significantly outperforms its nearest competitors, with improvements of 10.16% in MAE, 9.91% in RMSE, and 9.16% in SMAPE, reflecting its superior forecasting abilities for imminent travel times. Compared to traditional statistical models like HA, which typically underperform, these results indicate the importance of advanced machine learning techniques in enhancing predictive performance. While LSTM and DNN models are also competent at handling sequence predictions, PTE excels particularly in managing dependencies in historical traffic patterns. Among the evaluated methods, DHM ranks highly, benefiting from the combined effects of GRU and XGBoost algorithms, and excelling in short-term forecasts by effectively capturing traffic dynamics. However, PTE’s periodic encoding and direct prediction strategies lead to the most notable performance enhancements.

5.1.1. Performance during Peak and Off-Peak Hours

As shown in Table 4, all evaluated methods face greater challenges during peak hours, as reflected by increased error rates across all evaluation metrics. PTE consistently outperforms all competing models, offering substantial improvements during peak hours, with performance gains of 14.92% in MAE, 11.76% in RMSE, and 15.86% in SMAPE compared to other leading methods. These results demonstrate PTE's robustness, as it maintains consistent performance despite traffic variability. In off-peak hours, where traffic patterns are generally more stable, PTE maintains the lowest error rates across all metrics, demonstrating its ability to adapt to different traffic densities.

5.1.2. Impact of Road Segment Complexity

In Table 5, PTE demonstrates its robustness by consistently achieving the lowest MAE, RMSE, and SMAPE values across all types of road segments, including High, Moderate, and Low Variability in short-term prediction scenarios. These results highlight PTE’s ability to handle immediate traffic conditions effectively, regardless of the road segment’s underlying complexity. While error rates increase with segment variability for all models, PTE excels particularly in higher-variability segments, demonstrating superior performance in managing dynamic and unpredictable traffic patterns.

5.2. Long-Term Travel Time Prediction

Table 6 displays PTE’s performance across various time segments for long-term predictions. Across the board, PTE shows superior performance, achieving the lowest MAE, RMSE, and SMAPE values, particularly in the comprehensive all-day category. It notably outperforms specialized models like PASS2S and DE-SLSTM, which are designed for long-term dependencies but fall short of PTE’s advanced data handling capabilities.

5.2.1. Performance during Peak and Off-Peak Hours

During peak hours, PTE effectively handles increased prediction complexity due to irregular congestion events, as evidenced by its outstanding metrics compared to other models, detailed in Table 6. This ability is vital for dynamic routing and congestion management applications. In less challenging off-peak hours, PTE maintains superior performance, essential for reliable planning and operational decisions, demonstrating its consistent predictive strength across different traffic conditions.

5.2.2. Impact of Road Segment Complexity

Table 7 demonstrates PTE’s robust performance across road segments of varying complexity. Consistently recording the lowest error rates in MAE, RMSE, and SMAPE, PTE proves highly effective in complex traffic networks, adeptly managing the inherent uncertainties of diverse traffic patterns. This demonstrates its capacity to deliver reliable long-term predictions across a spectrum of traffic scenarios.

5.2.3. Day-by-Day Performance

Figure 5 details PTE’s ability to deliver accurate long-term forecasts over a seven-day period, showcasing how well it manages daily traffic fluctuations. This aspect is crucial for long-term planning within Intelligent Transportation Systems (ITS), where understanding multi-horizon prediction capabilities is vital.
For the 1-day-ahead prediction, PTE leads with robust performance, effectively utilizing prior data to achieve the lowest MAE and RMSE scores, demonstrating its precise grasp of immediate traffic conditions. From 2-day to 7-day forecasts, PTE maintains lower error rates, highlighting its consistency in adapting to daily changes, including weekday and weekend variations. While models such as DE-SLSTM and PASS2S are designed to handle long-term dependencies, PTE consistently outperforms them and other advanced models such as MTSMFF, DHM, and TFT, illustrating its superior capability in long-term forecasting. PTE's performance remains notably reliable across the entire week, demonstrating its strong predictive performance and adaptability in varying traffic conditions.

5.2.4. Individual Segment Analysis

The analysis presented in Table 8 details PTE’s MAE performance across selected road segments with varying traffic complexities. This day-by-day examination provides a deeper insight into PTE’s performance trends over a week.
PTE consistently excels on the high-variability segment NFB0370, demonstrating the lowest MAE each day. This demonstrates its adept handling of the unpredictable and complex traffic patterns typical of such roads. Its robustness in these conditions likely stems from the effective use of attention mechanisms that selectively focus on crucial temporal features for the forecast period. Notably, its performance remains stable across the week, highlighting PTE’s ability to manage daily traffic fluctuations effectively.
On the moderately variable road segment NFB0425, PTE also maintains superior performance throughout the week. Although the traffic patterns here are less complex than those of high-variability roads, PTE’s capacity to discern and analyze key temporal dynamics ensures consistent and reliable predictions, which is vital for long-term planning in environments with moderate traffic variability.
For the low-variability segment NFB0247, PTE continues to lead, showing the best performance throughout the week. Traffic on such roads is generally more predictable and stable. Despite slight day-to-day variations in MAE, PTE consistently outperforms other models, proving its efficacy in managing both dynamic and stable traffic conditions.

6. Conclusions

This study introduces the Periodic Transformer Encoder (PTE), designed to enhance travel time predictions across multiple horizons within Intelligent Transportation Systems (ITS). By leveraging the strengths of transformer models and focusing on periodic data, the PTE successfully addresses the limitations of existing methods that primarily target short-term forecasting and often struggle with complex long-term prediction scenarios. The PTE’s encoder-only architecture eliminates the need for a traditional decoder, thereby simplifying the model and significantly reducing computational demands. This design not only streamlines the training and inference processes but also enhances the model’s operational efficiency. Empirical evaluations on a comprehensive real-world traffic dataset have demonstrated that the PTE significantly outperforms existing methods across both short-term and long-term prediction intervals. Its ability to handle complex traffic patterns across diverse road conditions also proves the model’s versatility and robustness. The significant improvements demonstrated by the PTE across multiple prediction horizons indicate its potential to substantially advance the field of traffic time prediction by offering more reliable and precise solutions.
Despite these significant advancements, opportunities for further exploration remain. The model’s integration of multi-horizon data has yet to fully reveal the distinct impacts of different temporal scales on prediction performance, pointing to a potential area for more focused studies. Future work will delve into these topics more thoroughly by conducting analyses to understand how various types of temporal data influence prediction performance across different horizons. Investigating the interpretability of the attention mechanism is a promising direction for this research. These efforts are expected to lead to further improvements in how we manage and interpret these data interactions.

Author Contributions

Conceptualization, H.-T.C.L. and V.S.T.; methodology, H.-T.C.L. and V.S.T.; software, H.-T.C.L.; validation, H.-T.C.L.; formal analysis, H.-T.C.L. and V.S.T.; investigation, H.-T.C.L.; resources, V.S.T.; data curation, H.-T.C.L.; writing—original draft preparation, H.-T.C.L. and V.S.T.; writing—review and editing, H.-T.C.L. and V.S.T.; visualization, H.-T.C.L.; supervision, V.S.T.; project administration, V.S.T.; funding acquisition, V.S.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported in part by National Science and Technology Council Taiwan under grant nos. 111-2221-E-A49-124-MY3 and 112-2634-F-A49-005.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Smith, B.L.; Demetsky, M.J. Traffic Flow Forecasting: Comparison of Modeling Approaches. J. Transp. Eng. 1997, 123, 261–266. [Google Scholar] [CrossRef]
  2. Billings, D.; Yang, J.S. Application of the ARIMA Models to Urban Roadway Travel Time Prediction—A Case Study. In Proceedings of the 2006 IEEE International Conference on Systems, Man and Cybernetics, Taipei, Taiwan, 8–11 October 2006. [Google Scholar]
  3. Shaygan, M.; Meese, C.; Li, W.; Zhao, X.G.; Nejad, M. Traffic prediction using artificial intelligence: Review of recent advances and emerging opportunities. Transp. Res. Part C Emerg. Technol. 2022, 145, 103921. [Google Scholar] [CrossRef]
  4. Tian, Y.; Pan, L. Predicting Short-Term Traffic Flow by Long Short-Term Memory Recurrent Neural Network. In Proceedings of the 2015 IEEE International Conference on Smart City/SocialCom/SustainCom (SmartCity), Chengdu, China, 19–21 October 2015. [Google Scholar]
  5. Duan, Y.; Lv, Y.; Wang, F.Y. Travel time prediction with LSTM neural network. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016. [Google Scholar]
  6. Wang, D.; Zhang, J.; Cao, W.; Li, J.; Zheng, Y. When Will You Arrive? Estimating Travel Time Based on Deep Neural Networks. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018. [Google Scholar]
  7. Chou, C.H.; Huang, Y.; Huang, C.Y.; Tseng, V.S. Long-term traffic time prediction using deep learning with integration of weather effect. In Proceedings of the 23rd Pacific-Asia Conference on Knowledge Discovery and Data Mining, Macau, China, 14–17 April 2019. [Google Scholar]
  8. Wu, L.; Wang, Y.; Liu, J.; Shan, D. Developing a time-series speed prediction model using Transformer networks for freeway interchange areas. Comput. Electr. Eng. 2023, 110, 108860. [Google Scholar] [CrossRef]
  9. Mashurov, V.; Chopurian, V.; Porvatov, V.; Ivanov, A.; Semenova, N. GCT-TTE: Graph Convolutional Transformer for Travel Time Estimation. arXiv 2024, arXiv:2301.07945. [Google Scholar] [CrossRef]
  10. Zong, X.; Chen, Z.; Yu, F.; Wei, S. Local-Global Spatial-Temporal Graph Convolutional Network for Traffic Flow Forecasting. Electronics 2024, 13, 636. [Google Scholar] [CrossRef]
  11. Jiang, J.; Han, C.; Zhao, W.X.; Wang, J. PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 20–27 February 2023. [Google Scholar]
  12. Xie, Y.; Niu, J.; Zhang, Y.; Ren, F. Multisize Patched Spatial-Temporal Transformer Network for Short- and Long-Term Crowd Flow Prediction. IEEE Trans. Intell. Transp. Syst. 2022, 23, 21548–21568. [Google Scholar] [CrossRef]
  13. Qi, X.; Mei, G.; Tu, J.; Xi, N.; Piccialli, F. A Deep Learning Approach for Long-Term Traffic Flow Prediction With Multifactor Fusion Using Spatiotemporal Graph Convolutional Network. IEEE Trans. Intell. Transp. Syst. 2023, 24, 8687–8700. [Google Scholar] [CrossRef]
  14. Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Proc. AAAI Conf. Artif. Intell. 2021, 35, 11106–11115. [Google Scholar] [CrossRef]
  15. Zhang, Y.; Liu, S.; Zhang, P.; Li, B. GRU- and Transformer-Based Periodicity Fusion Network for Traffic Forecasting. Electronics 2023, 12, 4988. [Google Scholar] [CrossRef]
  16. Xu, M.; Dai, W.; Liu, C.; Gao, X.; Lin, W.; Qi, G.J.; Xiong, H. Spatial-Temporal Transformer Networks for Traffic Flow Forecasting. arXiv 2020, arXiv:2001.02908. [Google Scholar]
  17. Wu, C.H.; Ho, J.M.; Lee, D. Travel-time prediction with support vector regression. IEEE Trans. Intell. Transp. Syst. 2004, 5, 276–281. [Google Scholar] [CrossRef]
  18. Qiao, W.; Haghani, A.; Hamedi, M. Short-Term Travel Time Prediction Considering the Effects of Weather. Transp. Res. Rec. 2012, 2308, 61–72. [Google Scholar] [CrossRef]
  19. Li, L.; Qu, X.; Zhang, J.; Li, H.; Ran, B. Travel time prediction for highway network based on the ensemble empirical mode decomposition and random vector functional link network. Appl. Soft. Comput. 2018, 73, 921–932. [Google Scholar] [CrossRef]
  20. Liu, Y.; Wang, Y.; Yang, X.; Zhang, L. Short-term travel time prediction by deep learning: A comparison of different LSTM-DNN models. In Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan, 16–19 October 2017. [Google Scholar]
  21. Ran, X.; Shan, Z.; Shi, Y.; Lin, C. Short-Term Travel Time Prediction: A SpatioTemporal Deep Learning Approach. Int. J. Inf. Technol. Decis. Mak. 2019, 18, 1087–1111. [Google Scholar] [CrossRef]
  22. Ting, P.Y.; Wada, T.; Chiu, Y.L.; Sun, M.T.; Sakai, K.; Ku, W.S.; Jeng, A.A.K.; Hwu, J.S. Freeway Travel Time Prediction Using Deep Hybrid Model—Taking Sun Yat-Sen Freeway as an Example. IEEE Trans. Veh. Technol. 2020, 69, 8257–8266. [Google Scholar] [CrossRef]
  23. Chen, C.; Liu, Y.; Chen, L.; Zhang, C. Bidirectional Spatial-Temporal Adaptive Transformer for Urban Traffic Flow Forecasting. IEEE Trans. Neural Netw. Learn. Syst. 2022, 34, 6913–6925. [Google Scholar] [CrossRef]
  24. Cai, L.; Janowicz, K.; Mai, G.; Yan, B.; Zhu, R. Traffic transformer: Capturing the continuity and periodicity of time series for traffic forecasting. Trans. GIS 2020, 24, 736–755. [Google Scholar] [CrossRef]
  25. Hou, Z.; Li, X. Repeatability and Similarity of Freeway Traffic Flow and Long-Term Prediction Under Big Data. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1786–1796. [Google Scholar] [CrossRef]
  26. Qu, L.; Li, W.; Li, W.; Ma, D.; Wang, Y. Daily long-term traffic flow forecasting based on a deep neural network. Expert Syst. Appl. 2019, 121, 304–312. [Google Scholar] [CrossRef]
  27. Belhadi, A.; Djenouri, Y.; Djenouri, D.; Lin, J. A recurrent neural network for urban long-term traffic flow forecasting. Appl. Intell. 2020, 50, 3252–3265. [Google Scholar] [CrossRef]
  28. Li, Y.; Chai, S.; Ma, Z.; Wang, G. A Hybrid Deep Learning Framework for Long-Term Traffic Flow Prediction. IEEE Access 2021, 9, 11264–11271. [Google Scholar] [CrossRef]
  29. Chen, C.-M.; Liang, C.-C.; Chu, C.-P. Long-term travel time prediction using gradient boosting. J. Intell. Transp. Syst. 2020, 24, 109–124. [Google Scholar] [CrossRef]
  30. Reza, S.; Ferreira, M.; Machado, J.; Tavares, J. A Multi-head Attention-based Transformer Model for Traffic Flow Forecasting with a Comparative Analysis to Recurrent Neural Networks. Expert Syst. Appl. 2022, 202, 117275. [Google Scholar] [CrossRef]
  31. Zhang, H.; Zou, Y.; Yang, X.; Yang, H. A temporal fusion transformer for short-term freeway traffic speed multistep prediction. Neurocomputing 2022, 500, 329–340. [Google Scholar] [CrossRef]
  32. Huang, Y.; Dai, H.; Tseng, V.S. Periodic Attention-based Stacked Sequence to Sequence framework for long-term travel time prediction. Knowl. Based Syst. 2022, 258, 109976. [Google Scholar] [CrossRef]
  33. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention Is All You Need. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017. [Google Scholar]
  34. Freeway Bureau Taiwan R.O.C. Taiwan Expressway Dataset. Available online: https://tisvcloud.freeway.gov.tw (accessed on 10 March 2021).
  35. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  36. Du, S.; Li, T.; Yang, Y.; Horng, S.J. Multivariate time series forecasting via attention-based encoder–decoder framework. Neurocomputing 2020, 388, 269–279. [Google Scholar] [CrossRef]
Figure 1. Overview of the proposed framework. FC: fully connected layers; μ: mean; σ: standard deviation of the travel time.
Figure 2. Visual representation of periodic segments extracted from traffic data, showing both the short-term and week-long long-term segments based on the current time of 8:00 A.M. on May 8, with predictions aimed one hour ahead starting from 9:00 A.M.
Figure 3. Architecture of the proposed Periodic Transformer Encoder (PTE).
Figure 4. The standard deviation of travel times during peak (solid red area) and off-peak (dashed blue line) hours for each road segment, visually depicted using a step line chart.
Figure 5. Comparison with competitive methods in long-term prediction over a 7-day horizon.
Table 1. Temporal attributes.

| Attribute | Dimension | Description |
|---|---|---|
| Day type | 3 | Includes all calendar days categorized into Weekends, Weekdays, and National holidays |
| Day of the week | 7 | Organizes days from Monday to Sunday |
| Month | 12 | Enumerates months from January to December |
| Peak hours | 4 | Identifies four distinct traffic peaks: Morning-peak, Noon-peak, Night-peak, and Off-peak |
| Time slot | 288 | Represents the number of 5 min intervals in a day, accounting for each day's full 24 h, totaling 288 time slots |
Table 2. Statistical summary of travel time across selected road segments.

| Area | Segment ID | Mean | Standard Deviation | Coefficient of Variation |
|---|---|---|---|---|
| North | NFB0370 | 886.61 | 337.96 | 0.381 |
|  | NFB0431 | 519.68 | 128.97 | 0.248 |
|  | NFB0019 | 357.34 | 198.74 | 0.556 |
|  | NFB0033 | 313.64 | 62.90 | 0.201 |
|  | NFB0425 | 239.40 | 71.69 | 0.299 |
| Central | NFB0064 | 562.01 | 92.96 | 0.165 |
|  | NFB0063 | 554.10 | 29.72 | 0.054 |
|  | NFB0247 | 428.68 | 34.12 | 0.079 |
|  | NFB0248 | 418.68 | 37.22 | 0.089 |
|  | NFB0061 | 402.96 | 45.75 | 0.114 |
| South | NFB0327 | 498.39 | 61.51 | 0.123 |
|  | NFB0328 | 489.99 | 54.32 | 0.111 |
|  | NFB0117 | 398.29 | 16.64 | 0.042 |
|  | NFB0124 | 394.05 | 15.32 | 0.039 |
|  | NFB0123 | 392.02 | 26.46 | 0.067 |
Table 4. Comparison with competitive methods in short-term prediction.

| Method | All MAE | All RMSE | All SMAPE (%) | Peak MAE | Peak RMSE | Peak SMAPE (%) | Off-Peak MAE | Off-Peak RMSE | Off-Peak SMAPE (%) |
|---|---|---|---|---|---|---|---|---|---|
| HA [1] | 35.036 | 75.516 | 6.408 | 43.326 | 87.989 | 7.076 | 32.654 | 70.332 | 6.161 |
| LSTM [35] | 23.626 | 62.972 | 4.210 | 36.152 | 81.328 | 5.541 | 21.001 | 57.223 | 3.931 |
| DNN [26] | 23.695 | 57.216 | 4.389 | 32.605 | 68.975 | 5.342 | 21.827 | 53.581 | 4.189 |
| DE-SLSTM [7] | 20.994 | 51.870 | 3.934 | 29.003 | 62.941 | 4.861 | 19.315 | 48.498 | 3.740 |
| MTSMFF [36] | 26.048 | 59.649 | 5.092 | 32.49 | 70.536 | 5.837 | 23.742 | 54.207 | 4.825 |
| DHM [22] | 19.591 | 52.872 | 3.712 | 26.035 | 64.071 | 4.449 | 17.284 | 47.132 | 3.448 |
| TFT [31] | 31.964 | 70.441 | 6.118 | 44.852 | 89.494 | 8.029 | 28.362 | 62.695 | 5.620 |
| PASS2S [32] | 20.381 | 50.250 | 3.860 | 28.222 | 61.011 | 4.778 | 18.738 | 46.972 | 3.668 |
| PTE | 17.600 | 45.264 | 3.372 | 22.148 | 53.835 | 3.743 | 16.647 | 42.530 | 3.294 |
| Improvement ratio (%) | 10.165 | 9.922 | 9.160 | 14.928 | 11.761 | 15.868 | 3.687 | 9.456 | 4.464 |
Table 5. Comparison of short-term prediction results on different types of road segments against competitive methods.

| Variability | Metric | DE-SLSTM | MTSMFF | DHM | TFT | PASS2S | PTE | Improvement Ratio (%) |
|---|---|---|---|---|---|---|---|---|
| High | MAE | 38.638 | 44.152 | 34.706 | 57.547 | 37.125 | 31.326 | 9.738 |
|  | RMSE | 87.794 | 97.656 | 88.262 | 124.223 | 82.990 | 73.597 | 11.318 |
|  | SMAPE (%) | 5.849 | 6.939 | 5.254 | 8.868 | 5.697 | 4.834 | 7.990 |
| Moderate | MAE | 18.308 | 22.695 | 17.805 | 25.550 | 17.847 | 15.547 | 12.524 |
|  | RMSE | 57.005 | 62.921 | 57.428 | 68.055 | 55.799 | 51.370 | 7.937 |
|  | SMAPE (%) | 4.776 | 6.051 | 4.608 | 6.912 | 4.648 | 4.066 | 11.753 |
| Low | MAE | 8.081 | 13.198 | 8.185 | 14.919 | 8.118 | 7.529 | 6.829 |
|  | RMSE | 18.510 | 25.795 | 20.343 | 27.214 | 19.268 | 17.583 | 5.006 |
|  | SMAPE (%) | 1.777 | 2.912 | 1.830 | 3.297 | 1.806 | 1.690 | 4.872 |
Table 6. Comparison with competitive methods in long-term prediction.

| Method | All MAE | All RMSE | All SMAPE (%) | Peak MAE | Peak RMSE | Peak SMAPE (%) | Off-Peak MAE | Off-Peak RMSE | Off-Peak SMAPE (%) |
|---|---|---|---|---|---|---|---|---|---|
| HA [1] | 34.516 | 74 | 6.304 | 43.335 | 87.007 | 7.081 | 32.659 | 70.134 | 6.141 |
| LSTM [35] | 31.520 | 74.062 | 5.743 | 43.440 | 89.895 | 6.958 | 29.011 | 69.080 | 5.488 |
| DNN [26] | 31.279 | 70.295 | 5.886 | 41.065 | 84 | 6.861 | 29.219 | 66.087 | 5.681 |
| DE-SLSTM [7] | 31.948 | 72.549 | 5.842 | 41.68 | 84.723 | 6.841 | 29.899 | 68.765 | 5.632 |
| MTSMFF [36] | 31.639 | 69.964 | 5.91 | 41.23 | 83.999 | 7.026 | 28.193 | 63.141 | 5.509 |
| DHM [22] | 33.281 | 71.896 | 6.245 | 43.506 | 86.560 | 7.459 | 29.609 | 64.803 | 5.808 |
| TFT [31] | 32.316 | 70.084 | 6.089 | 46.439 | 91.406 | 7.965 | 29.023 | 62.462 | 5.705 |
| PASS2S [32] | 28.929 | 66.695 | 5.373 | 37.596 | 78.879 | 6.211 | 27.105 | 62.947 | 5.197 |
| PTE | 27.380 | 65.764 | 5.063 | 35.689 | 78.137 | 5.783 | 25.632 | 62.016 | 4.912 |
| Improvement ratio (%) | 5.353 | 1.396 | 5.764 | 5.071 | 0.941 | 6.886 | 5.435 | 0.713 | 5.482 |
Table 7. Comparison of long-term prediction results on different types of road segments against competitive methods.

| Variability | Metric | DE-SLSTM | MTSMFF | DHM | TFT | PASS2S | PTE | Improvement Ratio (%) |
|---|---|---|---|---|---|---|---|---|
| High | MAE | 61.200 | 58.257 | 61.833 | 59.128 | 54.747 | 51.990 | 5.035 |
|  | RMSE | 135.985 | 126.716 | 130.079 | 126.357 | 121.140 | 120.380 | 0.628 |
|  | SMAPE (%) | 8.970 | 8.675 | 9.316 | 8.880 | 8.217 | 7.744 | 5.757 |
| Moderate | MAE | 23.805 | 24.828 | 25.607 | 25.252 | 22.404 | 21.569 | 3.726 |
|  | RMSE | 63.383 | 64.113 | 64.545 | 63.877 | 61.597 | 59.899 | 2.756 |
|  | SMAPE (%) | 6.395 | 6.706 | 6.985 | 6.858 | 5.996 | 5.754 | 4.050 |
| Low | MAE | 12.999 | 13.997 | 14.604 | 14.683 | 11.764 | 10.746 | 8.650 |
|  | RMSE | 25.796 | 26.571 | 28.311 | 27.328 | 24.723 | 24.161 | 2.276 |
|  | SMAPE (%) | 2.867 | 3.074 | 3.191 | 3.250 | 2.587 | 2.369 | 8.429 |
Table 8. Performance comparison on MAE for particular road segments.

| Segment ID | Method | 1 Day | 2 Days | 3 Days | 4 Days | 5 Days | 6 Days | 7 Days |
|---|---|---|---|---|---|---|---|---|
| NFB0370 | DE-SLSTM [7] | 112.711 | 124.196 | 116.160 | 113.254 | 113.503 | 126.214 | 111.231 |
|  | MTSMFF [36] | 112.253 | 112.012 | 104.217 | 100.242 | 107.015 | 103.390 | 104.751 |
|  | DHM [22] | 105.368 | 112.994 | 118.828 | 115.879 | 115.242 | 118.125 | 103.395 |
|  | TFT [31] | 110.994 | 110.249 | 108.464 | 111.004 | 110.218 | 110.996 | 114.244 |
|  | PASS2S [32] | 100.001 | 102.632 | 99.829 | 101.235 | 99.102 | 101.020 | 99.723 |
|  | PTE | 94.864 | 98.443 | 96.953 | 94.781 | 97.978 | 99.904 | 98.822 |
| NFB0425 | DE-SLSTM [7] | 30.622 | 31.292 | 32.338 | 31.655 | 31.893 | 33.589 | 34.009 |
|  | MTSMFF [36] | 30.998 | 31.596 | 32.223 | 31.638 | 31.705 | 31.800 | 30.499 |
|  | DHM [22] | 32.830 | 33.745 | 34.410 | 35.336 | 33.328 | 33.466 | 31.666 |
|  | TFT [31] | 29.216 | 31.592 | 30.776 | 30.884 | 31.463 | 31.534 | 31.898 |
|  | PASS2S [32] | 28.946 | 29.879 | 30.632 | 30.910 | 29.713 | 30.301 | 30.690 |
|  | PTE | 27.959 | 28.784 | 28.210 | 29.053 | 28.475 | 28.616 | 28.474 |
| NFB0247 | DE-SLSTM [7] | 15.791 | 15.885 | 16.359 | 18.616 | 16.650 | 18.372 | 16.187 |
|  | MTSMFF [36] | 18.159 | 17.782 | 17.777 | 17.807 | 17.323 | 17.596 | 17.639 |
|  | DHM [22] | 29.039 | 20.302 | 17.884 | 19.488 | 18.607 | 18.829 | 16.430 |
|  | TFT [31] | 19.083 | 18.063 | 20.052 | 19.229 | 20.460 | 18.786 | 18.373 |
|  | PASS2S [32] | 13.838 | 14.505 | 15.057 | 15.831 | 15.213 | 15.119 | 15.167 |
|  | PTE | 13.421 | 14.341 | 14.642 | 14.746 | 14.750 | 15.046 | 15.001 |