Annals of Biomedical Engineering, Vol. 37, No. 12, December 2009, pp. 2626–2630. DOI: 10.1007/s10439-009-9795-x
0090-6964/09/1200-2626/0 © 2009 Biomedical Engineering Society

Log Energy Entropy-Based EEG Classification with Multilayer Neural Networks in Seizure

SERAP AYDIN,1 HAMDI MELIH SARAOĞLU,2 and SADIK KARA3

1 Electrical and Electronics Engineering Department, Faculty of Engineering, The University of Ondokuz Mayıs, Kurupelit, 55139 Samsun, Turkey; 2 Faculty of Engineering, The University of Dumlupınar, Kutahya, Turkey; and 3 Biomedical Engineering Institute, The University of Fatih, Istanbul, Turkey

(Received 29 December 2008; accepted 1 September 2009; published online 11 September 2009)

Address correspondence to Serap Aydın, Electrical and Electronics Engineering Department, Faculty of Engineering, The University of Ondokuz Mayıs, Kurupelit, 55139 Samsun, Turkey. Electronic mail: drserapaydin@hotmail.com, saydin@omu.edu.tr, saraoglu@dpu.edu.tr, skara@fatih.edu.tr. URL: http://www2.omu.edu.tr/akademikper.asp?id=1038, http://www2.omu.edu.tr/docs/english/1038.htm

Abstract—In this study, normal EEG series recorded from healthy volunteers and epileptic EEG series recorded from patients during and between seizures are classified by using Multilayer Neural Network (MLNN) architectures with respect to several time-domain entropy measures: Shannon Entropy (ShanEn), Log Energy Entropy (LogEn), and Sample Entropy (SamEn). In the tests, the MLNN is run with several numbers of neurons for both one- and two-hidden-layer configurations. The results show that segments recorded in seizure have significantly lower entropy values than normal EEG series, indicating an important increase of EEG regularity in epilepsy patients. The LogEn approach, which has not previously been applied to EEG classification, provides the most reliable features for EEG classification, with an absolute error as low as 0.01. In particular, the MLNN can be proposed to distinguish seizure activity from seizure-free epileptic series when the LogEn values are used as signal features that characterize the degree of EEG complexity. The highest classification accuracy is obtained with a one-hidden-layer architecture.

Keywords—EEG classification, Log Energy Entropy, Neural network, Seizure.

INTRODUCTION

Recordings of the electrical potentials generated by the brain are called electroencephalography (EEG). Cortical EEG series are shaped by several neurophysiological and neuropsychological factors. They are noninvasive, reliable, and corroborative tools that support clinical diagnostic tests. EEG series produced by different brain functions and by pathologies of neural structures are collected at the scalp or by using intracranially implanted electrodes.11 In general, one or more numeric indicators are computed from EEG measurements by using several signal processing tools.2 In this study, normal and epileptic EEG series are classified, with the features computed by using the Log Energy (LogEn) entropy, which has not previously been applied to seizure analysis.

The study of epilepsy is one of the important applications of EEG. Epilepsy is a neurological disorder characterized by seizures involving abnormal, rhythmic discharges of cortical neurons. These abnormal neuronal electrical activities occur synchronously, creating a large-amplitude signal and leading to uncontrollable oscillations in ictal periods.6 However, seizure-like synchronized discharges may also be observed in some paroxysmal attacks of psychological or physiological origin, such as migraine, head injury, and cardiac arrhythmia.16 Therefore, several nonlinear chaotic measures, such as the correlation dimension, Lyapunov exponent, Hurst exponent, Kolmogorov entropy (KolEn),9 approximate entropy (ApEn),20 and spectral entropy (SpecEn),10 have been proposed to characterize the seizure.

In these studies, epileptic EEG segments are found to be less complex with respect to statistical analysis (t-tests) for both cortical and intracortical records, and those three entropy measures are found to be useful quantifiers for seizure detection. In fact, the basic concepts of these approaches differ from each other: the KolEn is closely related to the Lyapunov exponent of a time series, whereas the ApEn is a regularity statistic that quantifies the unpredictability of fluctuations in a time series. Therefore, each measure captures a different level of statistical correlation. Epileptic EEG series have been classified by using two different types of neural network (NN) architectures, an Elman Network and a Probabilistic NN, with the ApEn values assigned as characteristic features, achieving accuracies of 84% and 77%, respectively.20 The present study shows that higher classification accuracy can be obtained by using LogEn even when the available number of measurements is small.

The ApEn is capable of recognizing changes in the complexity of a complex system, or chaos, in many applications.12 Earlier studies report that ApEn is a better complexity measure than both Fourier Entropy13 and SpecEn3 owing to its ability to specify a noise threshold in EEG analysis. However, SpecEn is reported to be a better measure than the ApEn for classifying conscious and unconscious states in anesthesia, since the ApEn is probably sensitive to artifacts in the conscious state.22 Besides, the SpecEn represents the relative flatness of the power spectral distribution of the time series.
In particular, the SpecEn based on the Shannon Entropy (ShanEn) is called wavelet entropy when it is computed by using the wavelet transform.17 In this study, intracortical EEG series recorded from patients diagnosed with epilepsy and cortical normal EEG series recorded from healthy volunteers are classified by using an ANN. We use the sample entropy (SamEn), the Shannon entropy (ShanEn), and the LogEn to extract the features. The three entropy approaches, which have not previously been applied to EEG classification in seizure, are compared with each other in terms of the classification errors obtained with the extracted features. All methods are briefly introduced in the following sections.

METHODS

Data Collection

The publicly available data described by Andrzejak and Lehnertz1 were used as the experimental dataset for entropy-based EEG analysis. The complete dataset consists of five sets, each containing 100 single-channel EEG signals of 23.6 s duration (about 4097 samples at the 173.61 Hz sampling rate given below). Each segment was selected and cut out from continuous multichannel EEG recordings after visual inspection for artifacts originated by muscle activity or eye movements. Among them, three sets, denoted Set-A, Set-D, and Set-E, are analyzed in this study. Set-A was taken from the surface EEG recordings of five healthy subjects relaxed in an awake state with eyes open, using the standardized electrode placement technique. Set-D was measured from five patients in the epileptogenic zone during seizure-free intervals. Set-E consists of epileptic EEG signals collected from five different epileptic patients, recorded during the occurrence of epileptic seizures from intracranial electrodes. Set-A was recorded extracranially, whereas both Set-D and Set-E were recorded intracranially. The depth electrodes were implanted symmetrically into the hippocampal formations, and strip electrodes were implanted onto the lateral and basal (middle and bottom) regions of the neocortex.
The epileptic EEG segments in Set-E were selected from all recording sites exhibiting ictal activity. These EEG signals were recorded with a 128-channel amplifier system using an average common reference. After 12-bit analog-to-digital conversion, the data were written continuously onto the disk of a data acquisition computer at a sampling rate of 173.61 Hz, with bandpass filter settings of 0.53–40 Hz (12 dB/octave).

Entropy Approaches

The entropy-based wavelet packet decomposition presented by Coifman and Wickerhauser4 is used to compute the ShanEn and LogEn. The SamEn, presented by Richman and Moorman,15 is also used to measure the degree of EEG complexity. Conceptually, entropy measures how much information is carried by a signal; in other words, it quantifies how much randomness the signal contains. In general, the entropy of a finite-length discrete random variable x = [x(0) x(1) ... x(N−1)] with probability distribution function p(x) is defined by

H(x) = −Σ_{i=0}^{N−1} p_i(x) log₂(p_i(x))    (1)

where i indicates one of the discrete states. This entropy is larger when each discrete state has about the same probability of occurrence. The ShanEn introduced by Shannon19 is a functional of p(x) in the sense of an expectation, in the form

H_ShanEn(x) = E{−log₂(p(x))}    (2)

This expression means that the ShanEn is the average information content of x. The ShanEn comes from the field of Information Theory, where it measures the degree of uncertainty that exists in the system; here it is computed as

H_ShanEn(x) = −Σ_{i=0}^{N−1} (p_i(x))² (log₂(p_i(x)))²    (3)

and the LogEn entropy of x is given by

H_LogEn(x) = −Σ_{i=0}^{N−1} (log₂(p_i(x)))²    (4)

The SamEn is an extension of the ApEn, the oldest complexity measure used to characterize chaotic systems.12 To compute the SamEn for x, m-length vectors are defined by

x_m(i) = [x(i) x(i+1) ... x(i+m−1)]

where 0 ≤ i ≤ N−m.
Then the distance between x_m(i) and x_m(j) is defined as

d(x_m(i), x_m(j)) = max_{k=0,...,m−1} |x(i+k) − x(j+k)|    (5)

where i ≠ j. Two probabilities are then formulated such that B^m(r) is the probability that two sequences match for m points and A^m(r) is the probability that two sequences match for m+1 points:

A^m(r) = (1/(N−m−1)) Σ_{i=0}^{N−m−1} A_i    (6)

B^m(r) = (1/(N−m−1)) Σ_{i=0}^{N−m−1} B_i    (7)

where A_i and B_i count the vectors x_{m+1}(j) and x_m(j), respectively, within the tolerance r of the corresponding template. Then the SamEn is given by

H_SamEn(x) = lim_{N→∞} [−ln(A^m(r)/B^m(r))]    (8)

There is no firm guideline for selecting the optimum values of m and r. In the present analysis, the parameter values are m = 1 and r = 0.25 times the standard deviation of the original data sequence. The more regular the EEG is, the smaller the SamEn will be. All the entropy measures (ShanEn, LogEn, and SamEn) are calculated with MATLAB routines.

The Levenberg–Marquardt algorithm (LMA) is selected for training. The LMA uses second-derivative information about the mean squared error (MSE), so better convergence behavior is observed. The LMA weight update is computed according to

W(k) = W(k−1) − (H(k) + μI)^{−1} g(k)    (9)

where H(k) is the Hessian, g(k) is the gradient of the MSE, μ is a scalar, and I is the identity matrix. More details about this training algorithm and its parameters can be found in Richard14 and Temurtas et al.21 Global statistical performance criteria such as the MSE alone do not provide useful information about the distribution of the errors.21 Therefore, the mean relative absolute error (RAE) is frequently used to test both the robustness and the appropriateness of NN architectures.5,18,21 In this study, 75% of the data is used for training and 25% for testing.
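As a concrete illustration of Eqs. (3), (4), and (8), the three entropy measures can be sketched in Python. This is a minimal sketch, not the paper's implementation (the authors used MATLAB wavelet-packet routines); in particular, estimating p_i as the normalized energy of each sample, and the toy signals, are assumptions made here for illustration.

```python
import numpy as np

def shannon_entropy(x):
    # Eq. (3): -sum p_i^2 (log2 p_i)^2, with p_i estimated here as the
    # normalized energy of each sample (an illustrative assumption).
    p = np.asarray(x, dtype=float) ** 2
    p = p / p.sum()
    p = p[p > 0]                      # drop zero-probability states
    return -np.sum(p ** 2 * np.log2(p) ** 2)

def log_energy_entropy(x):
    # Eq. (4): -sum (log2 p_i)^2 over the same distribution.
    p = np.asarray(x, dtype=float) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(np.log2(p) ** 2)

def sample_entropy(x, m=1, r_factor=0.25):
    # Eqs. (5)-(8) with the paper's settings m = 1, r = 0.25 * std(x);
    # self-matches (i == j) are excluded from the counts.
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def matches(mm):
        # Count template pairs whose Chebyshev distance (Eq. 5) is <= r.
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        total = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.sum(d <= r) - 1   # drop the self-match
        return total

    return -np.log(matches(m + 1) / matches(m))

regular = np.sin(np.linspace(0, 20 * np.pi, 500))     # highly regular toy signal
noise = np.random.default_rng(0).standard_normal(500) # irregular toy signal
print(sample_entropy(regular) < sample_entropy(noise))  # regular -> smaller SamEn
```

Consistent with the text, the more regular sinusoid yields a smaller SamEn than white noise.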
Classification Approaches

In this study, the clinical records are classified by using an Artificial Neural Network (ANN) with respect to the three entropy approaches. For this purpose, a multilayer ANN (input layer, one or two hidden layers, and output layer) is determined. An ANN is a nonlinear mapping between input and output vectors realized by a system of simple interconnected neurons. Each neuron is fully connected to every node in the next and previous layers. The output of a neuron is scaled by the connecting weight and fed forward, through a nonlinear activation function, as an input to the neurons in the next layer of the network.7,8,14 In the multilayer ANN used here, the hidden layer neurons and the output layer neurons use nonlinear sigmoid activation functions. The number of outputs is three, and the inputs are entropy values computed from single records by ShanEn, LogEn, and SamEn. Set-A, Set-D, and Set-E are given the binary target values (0,0,1), (0,1,0), and (1,0,0), respectively.

Four steps are followed in classification: (1) normalization of the entropy values and the three clinical groups to obtain better training performance (normalization is performed between 0.3 and 0.7), (2) training of the ANN, (3) denormalization of the three groups, and (4) classification of the three groups. Six different architectures are determined as follows:

• ANN-1: Five inputs, five neurons, and one hidden layer,
• ANN-2: Five inputs, five neurons, and two hidden layers,
• ANN-3: Ten inputs, ten neurons, and one hidden layer,
• ANN-4: Ten inputs, ten neurons, and two hidden layers,
• ANN-5: Twenty inputs, twenty neurons, and one hidden layer,
• ANN-6: Twenty inputs, twenty neurons, and two hidden layers.

That is, X-X-3 and X-X-X-3 architectures are used, where X is 5, 10, or 20.
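Steps (1) and (3) above, the normalization into [0.3, 0.7] and its inverse, can be sketched as a min-max rescaling; this is a plausible reading of the paper's normalization, not its stated formula, and the feature values below are hypothetical.

```python
import numpy as np

def normalize(v, lo=0.3, hi=0.7):
    """Min-max scale feature values into [lo, hi] before training (step 1).
    Returns the scaled values and the original bounds for later inversion."""
    v = np.asarray(v, dtype=float)
    vmin, vmax = v.min(), v.max()
    return lo + (hi - lo) * (v - vmin) / (vmax - vmin), (vmin, vmax)

def denormalize(s, bounds, lo=0.3, hi=0.7):
    """Invert the scaling to recover original-scale values (step 3)."""
    vmin, vmax = bounds
    return vmin + (np.asarray(s, dtype=float) - lo) * (vmax - vmin) / (hi - lo)

entropies = np.array([-210.4, -180.2, -55.7, -120.9])  # hypothetical features
scaled, bounds = normalize(entropies)
print(scaled.min(), scaled.max())      # approximately 0.3 and 0.7
restored = denormalize(scaled, bounds) # recovers the original values
```

Keeping the scaled range strictly inside (0, 1) suits sigmoid output neurons, which is presumably why a band such as [0.3, 0.7] is used rather than [0, 1].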
In other words, the entropy values of the first five, first ten, and first twenty records are assigned as the inputs of the corresponding ANNs.

RESULTS

In each dataset, all entropies are computed for five subjects, with 20 convenient segments recorded from each volunteer. First, the three entropy quantities over the 100 trials collected from the five patients in each group are computed. The ranges of the resulting ShanEn, LogEn, and SamEn values are shown with bars in Figs. 1–3, respectively.

FIGURE 1. Shannon entropies of datasets (Set-A, Set-D, Set-E).
FIGURE 2. LogEn values of datasets (Set-A, Set-D, Set-E).
FIGURE 3. Sample entropies of datasets (Set-A, Set-D, Set-E).

TABLE 1. The classification errors with respect to RAE (%).

                             LogEn    SamEn    ShanEn
ANN-1 (one hidden layer)      0.53     1.01    17.08
ANN-2 (two hidden layers)     0.43     0.88     0.97
ANN-3 (one hidden layer)      0.01    33.70    50.01
ANN-4 (two hidden layers)     0.60     0.15    38.66
ANN-5 (one hidden layer)      2.57    41.81    33.48
ANN-6 (two hidden layers)     0.80    34.59    17.02
Average error                 0.82    18.69    26.20

All figures show that Set-E produces significantly lower entropy values. However, ShanEn cannot characterize the seizure, since the range of the ShanEn results is much larger. According to Figs. 2 and 3, both LogEn and SamEn produce narrower ranges. However, the bars in the figures alone cannot establish the discrimination capabilities of these entropy measures. Therefore, we apply the MLNN with different architectures for EEG classification to compare the three entropy approaches with respect to their characterization capabilities. In the NN applications, the best network architecture, i.e., the one with minimum error, is determined.
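The average-error row of Table 1, and the best architecture per feature type, can be checked with a short script (the RAE values are copied from the table above):

```python
# RAE (%) values from Table 1, listed per architecture ANN-1..ANN-6.
rae = {
    "LogEn":  [0.53, 0.43, 0.01, 0.60, 2.57, 0.80],
    "SamEn":  [1.01, 0.88, 33.70, 0.15, 41.81, 34.59],
    "ShanEn": [17.08, 0.97, 50.01, 38.66, 33.48, 17.02],
}

for feature, errors in rae.items():
    avg = sum(errors) / len(errors)                      # "Average error" row
    best = min(range(len(errors)), key=errors.__getitem__)
    print(f"{feature}: average RAE = {avg:.2f}%, best = ANN-{best + 1} "
          f"({errors[best]:.2f}%)")
# LogEn: average RAE = 0.82%, best = ANN-3 (0.01%)
```

The computed averages reproduce the table's "Average error" row, and the minimum confirms that ANN-3 with LogEn features gives the lowest error.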
Six NN architectures having five, ten, and twenty input parameters are trained and tested. The performances of the MLNN with different architectures for classification using LogEn, SamEn, and ShanEn features are given in Table 1. To determine the optimum number of hidden neurons, training is performed for the X-X-3 and X-X-X-3 architectures, where X is five, ten, or twenty. The calculated errors for these architectures are given in Table 1. According to this table, the ANN-3 architecture achieves the maximum classification accuracy when the LogEn values are used as features. Combining the figures and the table, the normal and epileptic EEG time series can be successfully classified by a MLNN whose input and hidden layers both contain ten neurons, using the LogEn values. In particular, the LogEn enables us to distinguish seizure activity from seizure-free epileptic activity with high accuracy even when the number of features is as small as five.

CONCLUSION

In this study, three datasets consisting of normal records, seizure-free epileptic records, and ictal series are classified by using several MLNN architectures. The inputs of the NNs, i.e., the signal features, are computed with the ShanEn, LogEn, and SamEn. The results show that the epileptic records (Set-D and Set-E) have lower entropies than the healthy records (Set-A). In particular, seizure activity produces significantly lower entropies. This means that the electrophysiological behavior of epileptogenic regions is less complex than that of the healthy brain; lower entropy can thus be taken to indicate the severity of epilepsy. Since the LogEn values yield the most reliable features for analyzing the nonlinear dynamics of both cortical and intracortical neuronal interactions, we propose the LogEn values as inputs of the MLNN for discriminating the seizure.
Entropy is a useful quantity for recognizing certain features of EEG series: a small number of dominating processes yields low entropy. The results show that the LogEn is highly sensitive to the degree of EEG complexity. Since seizure is a synchronous neuronal activity producing spike discharges, the LogEn can be used for seizure detection. The LogEn can also be proposed for critical analysis of the EEG in many diagnostic applications, such as discriminating sleep disorders and monitoring anesthetic depth.

REFERENCES

1 Andrzejak, R. G., K. Lehnertz, et al. Indications of nonlinear deterministic and finite dimensional structures in time series of brain electrical activity: dependence on recording region and brain state. Phys. Rev. E 64:061907, 2001.
2 Bronzino, J. D. The Biomedical Engineering Handbook, 3rd ed. CRC Press, Boca Raton, Sec. III, pp. 26.1–26.5, 2006.
3 Bruhn, J., L. E. Lehmann, et al. Shannon entropy applied to the measurement of the electroencephalographic effects of desflurane. Anesthesiology 95(1):30–35, 2001.
4 Coifman, R. R., and M. V. Wickerhauser. Entropy-based algorithms for best basis selection. IEEE Trans. Inf. Theory 38(2):1241–1243, 1992.
5 Gulbag, A., F. Temurtas, and I. Yusubov. Quantitative discrimination of the binary gas mixtures using a combinational structure of the probabilistic and multilayer neural networks. Sens. Actuat. B Chem. 131:196–204, 2008.
6 Guyton, A. C. Textbook of Medical Physiology. Saunders, Philadelphia, Sec. 2, 968 pp, 1986.
7 Hagan, M. T., H. B. Demuth, and M. H. Beale. Neural Network Design. PWS Publishing, Boston, 1996.
8 Jang, J. S. R., C. T. Sun, and E. Mizutani. Neuro-Fuzzy and Soft Computing. Prentice Hall, Englewood Cliffs, pp. 335–345, 1997.
9 Kannathal, N., U. R. Acharya, et al. Characterization of EEG—a comparative study. Comp. Meth. Prog. Biomed. 80:17–23, 2005. doi:10.1016/j.cmpb.2005.06.005.
10 Kannathal, N., M. L. Choo, et al. Entropies for detection of epilepsy in EEG. Comp. Meth. Prog.
Biomed. 80:187–194, 2005. doi:10.1016/j.cmpb.2005.06.012.
11 Niedermeyer, E., and F. H. L. Silva (eds). Electroencephalography: Basic Principles, Clinical Applications and Related Fields, 3rd ed. Williams and Wilkins, Baltimore, pp. 661–677, 1993.
12 Pincus, S. M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA 88:2297–2301, 1991.
13 Rezek, I., and S. J. Roberts. Stochastic complexity measures for physiological signal analysis. IEEE Trans. Biomed. Eng. 44(9):1186–1191, 1998.
14 Richard, P. B. Fast training algorithms for multi-layer neural nets. IEEE Trans. Neural Netw. 2:346–354, 1991.
15 Richman, J. S., and J. R. Moorman. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol. 278(6):H2039–H2049, 2000.
16 Ricker, J. H. Differential Diagnosis in Adult Neuropsychological Assessment. Springer Publishing Company, New York, 109 pp, 2003.
17 Rosso, O. A., S. Blanco, J. Yordanova, et al. Wavelet entropy: a new tool for analysis of short duration brain electrical signals. J. Neurosci. Meth. 105:65–75, 2001.
18 Saraoglu, H. M., and B. Edin. E-Nose system for anesthetic dose level detection using artificial neural network. J. Med. Syst. 6:475–482, 2007.
19 Shannon, C. E. A mathematical theory of communication. Bell Syst. Tech. J. 27:379–423, 623–656, 1948.
20 Srinivasan, V., C. Eswaran, et al. Approximate entropy-based epileptic EEG detection using artificial neural networks. IEEE Trans. Inform. Technol. Biomed. 11:288–295, 2007.
21 Temurtas, F., C. Tasaltin, H. Temurtas, et al. Fuzzy logic and neural network applications on the gas sensor data: concentration estimation. Lecture Notes in Computer Science, vol. 2869, pp. 179–186, 2003.
22 Vierti, H., P. Merilainen, et al. Spectral entropy, approximate entropy, complexity, fractal spectrum and bispectrum of EEG during anesthesia. In: The 5th International Conference on Memory, Awareness and Consciousness, New York, June 2001.