
Weekly WiMAX Traffic Forecasting Using Trainable Cascade-Forward Backpropagation Network in Wavelet Domain

Mankhush Singh¹ and Simarpreet Kaur²

¹M.Tech Student, Baba Banda Singh Bahadur Engineering College, Fatehgarh Sahib
²Department of E.C.E, Baba Banda Singh Bahadur Engineering College, Fatehgarh Sahib

International Journal of Application or Innovation in Engineering & Management (IJAIEM), Volume 3, Issue 7, July 2014, ISSN 2319-4847




ABSTRACT
In this paper, weekly WiMAX traffic forecasting is performed. The traffic time series is decomposed with the Stationary Wavelet Transform (SWT). The resulting coefficients are then trained and predicted with a Trainable Cascade-Forward Backpropagation neural network. The quality of the forecasting obtained is reported in terms of four parameters.
Keywords: SWT, WiMAX, Neural network, SMAPE, RSQ, RMSE, MAE.
1. INTRODUCTION
Worldwide Interoperability for Microwave Access (WiMAX) technology is a modern solution for wireless networks. One of the most difficult problems that appears in a WiMAX network is the non-uniformity of the traffic developed by different base stations. This behaviour is induced by the ad hoc nature of wireless networks and concerns the service providers who administer the network. The amount of traffic through a base station (BS) should not be higher than the capacity of that BS. If the amount of traffic approaches the capacity of the BS, then the BS saturates. Due to the traffic non-uniformity, different BSs will saturate at different future moments. These moments can be predicted using traffic forecasting methodologies.
2. TRADITIONAL APPROACHES
The traditional approaches to time series forecasting assume that the time series is generated by a linear process, which may be totally inappropriate if the underlying mechanism is nonlinear [5]. One class of models is based on the Box-Jenkins methodology, which builds the time series model in a sequence of steps that are repeated until the optimum model is achieved. A second class of models uses structural state-space methods, which are used to predict stationary, trend, seasonal and cyclical data. These methods describe the observations as a sum of separate components (such as trend and seasonality). Among all of the above forecasting models, artificial neural networks (ANNs) have been shown to produce better results [3], [4] and [7]. In [10], the performance and the computational complexity of ANNs are compared with those obtained using ARIMA and fractional ARIMA (FARIMA) predictors and wavelet-based predictors. The results of this study show significant advantages for the ANN technique. In [6], the advantage of ANNs over traditional rule-based systems is proved. The authors of [8], [11] and [9] propose a time-delayed neural network (TDNN). The forecasting accuracy obtained by using the Wavelet Transform is described in [2]. That paper presents a forecasting technique for forward energy prices, one day ahead, and its results demonstrate that the use of the Wavelet Transform as a pre-processing step improves the performance of prediction techniques.
3. FORECASTING PROCEDURE
The WiMAX traffic prediction method based on the wavelet transform decomposes the data (the WiMAX traffic, also referred to as the time-series signal) into a range of frequency scales and then applies the forecasting method to the individual approximation and detail components of this data. The steps of this procedure are presented in Fig. 1:


Figure 1: Steps for Forecasting
1) Decompose the input data and the test data using the Stationary Wavelet Transform.
2) Arrange the approximation and detail coefficients obtained from each of the four levels.
3) Create a trainable cascade-forward backpropagation network for each decomposition level obtained from the data.
4) Keep the "tansig" (hyperbolic tangent sigmoid) transfer function for calculating each layer's output from its net input, and train these networks using the "trainscg" (scaled conjugate gradient backpropagation) function of Matlab.
5) Predict each decomposition level of the forecasted signal using the decomposed signal and the obtained model.
6) Apply the Inverse Stationary Wavelet Transform to obtain the final predicted signal (a MATLAB sketch of these steps follows).
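The following minimal MATLAB sketch illustrates steps 1)-6). It assumes the Wavelet Toolbox (swt, iswt) and the Neural Network Toolbox (cascadeforwardnet, train); the hidden-layer size, the placeholder traffic vector and the week-wise data layout are illustrative assumptions, not details taken from the paper.

% Minimal sketch of the forecasting procedure (assumptions noted above).
level = 4;  wname = 'haar';              % four SWT levels, Haar mother wavelet
traffic = rand(1, 5376);                 % placeholder for the 8 weeks of Opnet data
wk = 96 * 7;                             % 672 samples per week

% 1) SWT decomposition (signal length 5376 is divisible by 2^4).
[swa, swd] = swt(traffic, level, wname); % 4 approximation + 4 detail rows

swaPred = zeros(level, wk);  swdPred = zeros(level, wk);
for k = 1:level
    % 2) Arrange coefficients week by week: weeks 1-6 as inputs, week 7 as target.
    A = reshape(swa(k,:), wk, 8);  D = reshape(swd(k,:), wk, 8);

    % 3)-4) Cascade-forward nets with tansig neurons, trained with trainscg.
    netA = cascadeforwardnet(10, 'trainscg');
    netA.layers{1}.transferFcn = 'tansig';
    netA = train(netA, A(:,1:6)', A(:,7)');

    netD = cascadeforwardnet(10, 'trainscg');
    netD.layers{1}.transferFcn = 'tansig';
    netD = train(netD, D(:,1:6)', D(:,7)');

    % 5) Predict the week-8 coefficients from weeks 2-7.
    swaPred(k,:) = netA(A(:,2:7)');
    swdPred(k,:) = netD(D(:,2:7)');
end

% 6) Inverse SWT recombines the predicted coefficients into the week-8 forecast.
week8 = iswt(swaPred, swdPred, wname);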
4. THE WAVELET TRANSFORM
Wavelets divide the data into several frequency components and then process them at different scales or resolutions. Multi-resolution analysis (MRA) is a signal processing technique that considers the signal's representation at multiple time resolutions. At each temporal resolution two categories of coefficients are obtained: approximation and detail coefficients. Generally the MRA is implemented with the algorithm proposed by Stephane Mallat [12], which computes the Discrete Wavelet Transform (DWT). The disadvantage of this algorithm is that the length of the coefficient sequences decreases as the iteration index increases, because decimators are used. Another way to implement an MRA is to use Shensa's algorithm [13], which corresponds to the computation of the Stationary Wavelet Transform (SWT). In this case the use of decimators is avoided, but at each iteration different low-pass and high-pass filters are used. In this paper we used the SWT for the following purposes:
- to extract the overall trend of the temporal series that describes the traffic under analysis, with the aid of the approximation coefficients;
- to extract the variability around the overall trend, with the aid of the detail coefficients.
The reconstruction is done through the Inverse Stationary Wavelet Transform (ISWT). In [1], the mother wavelet giving the best prediction accuracy for the traffic variability is the Haar wavelet, so Haar is used as the decomposition wavelet for the SWT in this paper.
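For reference, the SWT coefficients can be computed with the a trous recursion, in which the filters are dilated instead of the signal being decimated (standard material, not reproduced from the paper; the sign convention of the high-pass filter may differ between implementations):

a_{j+1}[n] = \sum_{k} h[k]\, a_j\!\left[n + 2^{j} k\right], \qquad d_{j+1}[n] = \sum_{k} g[k]\, a_j\!\left[n + 2^{j} k\right], \qquad a_0[n] = x[n]

where, for the Haar wavelet,

h = \tfrac{1}{\sqrt{2}}\,[\,1,\ 1\,], \qquad g = \tfrac{1}{\sqrt{2}}\,[\,1,\ -1\,].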
5. ARTIFICIAL NEURAL NETWORKS AND DATA CONFIGURATION
We used a Trainable Cascade-Forward Backpropagation network in our forecasting process. An Artificial Neural Network (ANN) is a nonlinear mathematical model composed of interconnected simple elements called artificial neurons.
An ANN has three characteristics:
1. The architecture of interconnected neural units.
2. The learning or training algorithm for determining the weights of the connections. The training function used in our approach is trainscg, a network training function that updates weight and bias values according to the scaled conjugate gradient method.
3. The activation function that produces the output based on the input values. The transfer function used here is tansig, a neural transfer function that calculates a layer's output from its net input (a configuration sketch is given below).
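A short MATLAB sketch of this configuration is given here. The hidden-layer size, epoch count and the placeholder input and target matrices P and T are illustrative assumptions, not values from the paper.

% Cascade-forward backpropagation network with the settings named above.
net = cascadeforwardnet(10, 'trainscg');   % architecture + scaled conjugate gradient
net.layers{1}.transferFcn = 'tansig';      % hyperbolic tangent sigmoid activation
net.trainParam.epochs = 500;               % illustrative stopping criterion
P = rand(6, 672);  T = rand(1, 672);       % placeholders for real coefficient data
net = train(net, P, T);                    % update weights and biases
y = net(P);                                % network response after training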


Figure 2: Block Diagram of the Trainable cascade-forward backpropagation neural network modeling

Now the first step is to split the data into training and testing data sets. The next step is the MRA pre-processing of both the training and testing data sets. The level of decomposition n depends upon the length of the input data: in Matlab, to apply the SWT at level n, the length of the discrete signal must be divisible by 2^n. The data used were obtained through the Opnet software. Having the data of eight weeks, we train the ANN for each of the four approximation and four detail coefficient sequences obtained from the four decomposition levels. Samples were collected at the rate of 96 samples per day, giving 672 samples for a single week and 5376 samples for the eight weeks. For example, to predict the traffic of week 8, we take the data from weeks 1-6 as the ANN's input data and the data from week 7 as the target data in the training process.
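One plausible arrangement of the weekly samples is sketched below. It assumes that corresponding time slots of consecutive weeks form one training sample; the paper does not spell out the exact matrix layout, and in the paper this arrangement is applied to each SWT coefficient sequence rather than to the raw traffic.

% Arrange 8 weeks of samples into input, target and query matrices (assumed layout).
wk     = 96 * 7;                      % 672 samples per week
series = rand(1, 8 * wk);             % placeholder for the 5376 Opnet samples
weeks  = reshape(series, wk, 8);      % column w holds week w

P = weeks(:, 1:6)';                   % ANN inputs: weeks 1-6 (6 x 672)
T = weeks(:, 7)';                     % training target: week 7 (1 x 672)
Q = weeks(:, 2:7)';                   % query used to forecast week 8, via net(Q)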
6. RESULT PARAMETERS
The forecasting ability of our model is evaluated in terms of the following well-known evaluation parameters:
Symmetric Mean Absolute Percent Error (SMAPE):
It calculates the symmetric absolute error in percent between the actual traffic X and the forecasted traffic F across
all the observations t of the test set of size n for each time series s.
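In its commonly used form (an assumption on our part, since several SMAPE variants exist):

\mathrm{SMAPE}_s = \frac{1}{n} \sum_{t=1}^{n} \frac{\lvert X_t - F_t \rvert}{(X_t + F_t)/2} \times 100\%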


The ideal value of SMAPE is 0.

Mean absolute error (MAE):
It represents the average absolute error value and is given by:
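In its standard form:

\mathrm{MAE} = \frac{1}{n} \sum_{t=1}^{n} \lvert F_t - X_t \rvert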

where F_t is the prediction and X_t the true data value.
R-Square (RSQ):
The coefficient of determination R^2, in statistics, is the proportion of variability in a data set that is accounted for by a statistical model. In this definition, the term variability is defined as the sum of squares. A version for its calculation is:
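One common version, consistent with the description of SS_T and SS_R given below (assumed here):

\mathrm{RSQ} = R^2 = \frac{SS_R}{SS_T}, \qquad SS_T = \sum_{t} \left( X_t - \bar{X} \right)^2, \qquad SS_R = \sum_{t} \left( F_t - \bar{X} \right)^2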


where X_t and F_t are the original data values and the modeled (predicted) values, respectively, \bar{X} and \bar{F} are the means of the observed and predicted values, SS_T is the total sum of squares and SS_R is the regression sum of squares. The ideal value of RSQ is 1.
Root Mean Square Error (RMSE):
It measures the differences between the values predicted by the model and the values actually observed from the time-
series being modeled or estimated.
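In its standard form:

\mathrm{RMSE} = \sqrt{ \frac{1}{n} \sum_{t=1}^{n} \left( F_t - X_t \right)^2 }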

where F_t is the prediction and X_t the true data value.
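A small MATLAB helper computing the four parameters as defined above (the SMAPE and RSQ variants are the assumed forms noted earlier):

function [smape, mae, rsq, rmse] = forecastMetrics(X, F)
% X: observed traffic, F: forecasted traffic (vectors of equal length).
smape = 100 * mean( abs(X - F) ./ ((X + F) / 2) );   % symmetric MAPE, ideal value 0
mae   = mean( abs(F - X) );                          % mean absolute error
rmse  = sqrt( mean( (F - X).^2 ) );                  % root mean square error
ssT   = sum( (X - mean(X)).^2 );                     % total sum of squares
ssR   = sum( (F - mean(X)).^2 );                     % regression sum of squares
rsq   = ssR / ssT;                                   % ideal value 1
end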
Table 1: Parameter values for weekly forecasting

Week     4        5        6        7        8
SMAPE    0.3376   0.3324   0.3462   0.3498   0.3509
MAE      0.3334   0.3282   0.3351   0.3432   0.3439
RSQ      0.9422   1.0024   0.9849   0.9989   1.0230
RMSE     1.4100   1.4020   1.4081   1.4538   1.4547
CHART 1: SMAPE Values for the Weekly Prediction of the Traffic

CHART 2: MAE Values for the Weekly Prediction of the Traffic

CHART 3: RSQ Values for the Weekly Prediction of the Traffic

CHART 4: RMSE Values for the Weekly Prediction of the Traffic

Figure 3: Trends of the Original Data and the Predicted Data (Weekly)
7. CONCLUSION
In this paper the wavelet decomposition using the Stationary Wavelet Transform gives us the approximation and detail coefficients for each decomposition level. These coefficients were then trained and predicted with the Trainable Cascade-Forward Backpropagation neural network. It is observed that the SMAPE, MAE, RSQ and RMSE values have improved because of the use of the Matlab training function "trainscg". The neurons of the layers used the "tansig" transfer function to obtain the output of the neural network, which contributed to the further improvement. The forecasting technique presented in this paper can also be used to build prediction models for the time series that appear in day-to-day business and exchange-rate processes such as stock exchanges. It would also be better to have more data for analysis, in order to obtain higher performance and to reduce the prediction errors.
REFERENCES
[1.] Cristina Stolojescu, Ion Railean, Sorin Moga, Alexandru Isar, Comparison of Wavelet Families with Application to WiMAX Traffic Forecasting, 12th International Conference on Optimization of Electrical and Electronic Equipment, OPTIM 2010.
[2.] H. T. Nguyen, I. T. Nabney, Combining the Wavelet Transform and Forecasting Models to Predict Gas Forward Prices, ICMLA '08: Proceedings of the 2008 Seventh International Conference on Machine Learning and Applications, IEEE Computer Society, pp. 311-317, 2008.
[3.] S. Armstrong, M. Adya, An application of rule-based forecasting to a situation lacking domain knowledge, International Journal of Forecasting, Vol. 16, pp. 477-484, 2000.
[4.] A. Mitra, S. Mitra, Modeling exchange rates using wavelet decomposed genetic neural networks, Statistical Methodology, vol. 3, issue 2, pp. 103-124, 2006.
[5.] G. Zhang, B. E. Patuwo, M. Y. Hu, Forecasting with artificial neural networks: The state of the art, International Journal of Forecasting 14, pp. 35-62, 1998.
[6.] Clarence N. W. Tan, Incorporating Artificial Neural Networks into a Rule-based Financial Trading System, The First New Zealand International Two Stream Conference on Artificial Neural Networks and Expert Systems (ANNES), University of Otago, Dunedin, New Zealand, November 24-26, 1993.
[7.] G. Ibarra-Berastegi, A. Elias, R. Arias, A. Barona, Artificial Neural Networks vs Linear Regression in a Fluid Mechanics and Chemical Modelling Problem: Elimination of Hydrogen Sulphide in a Lab-Scale Biofilter, IEEE/ACS International Conference on Computer Systems and Applications, pp. 584-587, 2007.
[8.] T. Taskaya-Temizel, M. C. Casey, Configuration of Neural Networks for the Analysis of Seasonal Time Series, Proceedings of the 3rd International Conference on Advances in Pattern Recognition (ICAPR 2005), Lecture Notes in Computer Science 3686, vol. I, pp. 297-304, 2005.
[9.] G. Peter Zhang, Min Qi, Neural network forecasting for seasonal and trend time series, European Journal of Operational Research 160, pp. 501-514, 2005.
[10.] H. Feng, Y. Shu, Study on Network Traffic Prediction Techniques, Proceedings of the International Conference on Wireless Communications, Networking and Mobile Computing, Vol. 2, pp. 1041-1044, 2005.
[11.] Daniel S. Clouse, C. Lee Giles, Bill G. Horne, Garrison W. Cottrell, Time-Delay Neural Networks: Representation and Induction of Finite-State Machines, IEEE Transactions on Neural Networks, vol. 8, no. 5, September 1997.
[12.] S. Mallat, A Wavelet Tour of Signal Processing, Second Edition, 1999.
[13.] M. J. Shensa, The Discrete Wavelet Transform: Wedding the A Trous and Mallat Algorithms, IEEE Transactions on Signal Processing, 40, pp. 2464-2482, 1992.
