Model-Based Clustering and Classification for Data Science
Editorial Board
Z. Ghahramani (Department of Engineering, University of Cambridge)
R. Gill (Mathematical Institute, Leiden University)
F. P. Kelly (Department of Pure Mathematics and Mathematical Statistics,
University of Cambridge)
B. D. Ripley (Department of Statistics, University of Oxford)
S. Ross (Department of Industrial and Systems Engineering,
University of Southern California)
M. Stein (Department of Statistics, University of Chicago)
This series of high-quality upper-division textbooks and expository monographs covers all
aspects of stochastic applicable mathematics. The topics range from pure and applied statistics
to probability theory, operations research, optimization and mathematical programming. The
books contain clear presentations of new developments in the field and also of the state of
the art in classical methods. While emphasizing rigorous treatment of theoretical methods, the
books also contain applications and discussions of new techniques made possible by advances
in computational practice.
A complete list of books in the series can be found at www.cambridge.org/statistics.
Charles Bouveyron
Université Côte d’Azur
Gilles Celeux
Inria Saclay Île-de-France
T. Brendan Murphy
University College Dublin
Adrian E. Raftery
University of Washington
University Printing House, Cambridge CB2 8BS, United Kingdom
314–321, 3rd Floor, Plot 3, Splendor Forum, Jasola District Centre, New Delhi – 110025, India
www.cambridge.org
Information on this title: www.cambridge.org/9781108494205
DOI: 10.1017/9781108644181
© Charles Bouveyron, Gilles Celeux, T. Brendan Murphy and Adrian E. Raftery 2019
A catalogue record for this publication is available from the British Library.
Contents
Preface xv
1 Introduction 1
1.1 Cluster Analysis 1
1.1.1 From Grouping to Clustering 1
1.1.2 Model-based Clustering 3
1.2 Classification 4
1.2.1 From Taxonomy to Machine Learning 4
1.2.2 Model-based Discriminant Analysis 6
1.3 Examples 7
1.4 Software 12
1.5 Organization of the Book 13
1.6 Bibliographic Notes 14
Acknowledgements
This book is a truly collaborative effort, and the four authors have contributed
equally. Each of us has contributed to each of the chapters.
We would like to thank Chris Fraley for initially developing the mclust software
and later R package, starting in 1991. This software was of extraordinary quality
from the beginning, and without it this whole field would never have developed as
it did. Luca Scrucca took over the package in 2007, and has enhanced it in many
ways, so we also owe a lot to his work. We would also like to thank the developers
and maintainers of Rmixmod software: Florent Langrognet, Rémi Lebret, Christian
Poli, Serge Iovleff, Anwuli Echenim and Benjamin Auder.
The authors would also like to thank the participants in the Working Group on
Model-based Clustering, which has been gathering every year in the third week of
July since 1994, first in Seattle and then since 2007 in different venues around
Europe and North America. This is an extraordinary group of people from many
countries, whose energy, interactions and intellectual generosity have inspired
us every year and driven the field forward. The book owes a great deal to their
insights.
Charles Bouveyron would like to thank in particular Stéphane Girard, Julien
Jacques and Pierre Latouche, for very fruitful and friendly collaborations. Charles
Bouveyron also thanks his coauthors on this topic for all the enjoyable collabora-
tions: Laurent Bergé, Camille Brunet-Saumard, Etienne Côme, Marco Corneli,
Julie Delon, Mathieu Fauvel, Antoine Houdard, Pierre-Alexandre Mattei, Cordelia
Schmid, Amandine Schmutz and Rawya Zreik. He would like also to warmly thank
his family, Nathalie, Alexis, Romain and Nathan, for their love and everyday
support in the writing of this book.
Gilles Celeux would like to thank his old and dear friends Jean Diebolt and
Gérard Govaert for the long and intensive collaborations. He also thanks his
coauthors in the area Jean-Patrick Baudry, Halima Bensmail, Christophe Biernacki,
Guillaume Bouchard, Vincent Brault, Stéphane Chrétien, Florence Forbes, Raphaël
Gottardo, Christine Keribin, Jean-Michel Marin, Marie-Laure Martin-Magniette,
Cathy Maugis-Rabusseau, Abdallah Mkhadri, Nathalie Peyrard, Andrea Rau,
Christian P. Robert, Gilda Soromenho and Vincent Vandewalle for nice and
fruitful collaborations. Finally, he would like to thank Maïlys and Maya for their
love.
Brendan Murphy would like to thank John Hartigan for introducing him to
clustering. He would like to thank Claire Gormley, Paul McNicholas, Luca Scrucca
and Michael Fop with whom he has collaborated extensively on model-based
clustering projects over a number of years. He also would like to thank his
students and coauthors for enjoyable collaborations on a wide range of model-
based clustering and classification projects: Marco Alfò, Francesco Bartolucci,
Nema Dean, Silvia D’Angelo, Gerard Downey, Bailey Fosdick, Nial Friel, Marie
Galligan, Isabella Gollini, Sen Hu, Neil Hurley, Dimitris Karlis, Donal Martin,
Tyler McCormick, Aaron McDaid, Damien McParland, Keefe Murphy, Tin Lok
James Ng, Adrian O’Hagan, Niamh Russell, Michael Salter-Townshend, Lucy
Small, Deirdre Toher, Ted Westling, Arthur White and Jason Wyse. Finally, he
would like to thank his family, Trish, Áine and Emer for their love and support.
Adrian Raftery thanks Fionn Murtagh, with whom he first encountered model-
based clustering and wrote his first paper in the area in 1984, Chris Fraley for a
long and very fruitful collaboration, and Luca Scrucca for another very successful
collaboration. He warmly thanks his Ph.D. students who have worked with him
on model-based clustering, namely Jeff Banfield, Russ Steele, Raphael Gottardo,
Nema Dean, Derek Stanford and William Chad Young for their collaboration
and all that he learned from them. He also thanks his other coauthors in the
area, Jogesh Babu, Jean-Patrick Baudry, Halima Bensmail, Roger Bumgarner,
Simon Byers, Jon Campbell, Abhijit Dasgupta, Mary Emond, Eric Feigelson,
Florence Forbes, Diane Georgian-Smith, Ken Lo, Alejandro Murua, Nathalie
Peyrard, Christian Robert, Larry Ruzzo, Jean-Luc Starck, Ka Yee Yeung, Naisyin
Wang and Ron Wehrens for excellent collaborations.
Raftery would like to thank the Office of Naval Research and the Eunice
Kennedy Shriver National Institute of Child Health and Human Development
(NICHD grants R01 HD054511 and R01 HD070936) for sustained research support
without which this work could not have been carried out. He wrote part of the
book during a fellowship year at the Center for Advanced Study in the Behavioral
Sciences (CASBS) at Stanford University in 2017–2018, which provided an ideal
environment for the sustained thinking needed to complete a project of this kind.
Finally he would like to thank his wife, Hana Ševčíková, for her love and support
through this project.
1 Introduction
Cluster analysis and classification are two important tasks that arise constantly
in everyday life. As humans, we naturally cluster and classify animals,
objects or even ideas thousands of times a day, without fatigue. The emergence
of science has led to many data sets with clustering structure that cannot be
easily detected by the human brain, and so require automated algorithms. Also,
with the advent of the “Data Age,” clustering and classification tasks are often
repeated large numbers of times, and so need to be automated even if the human
brain could carry them out.
This has led to a range of clustering and classification algorithms over the
past century. Initially these were mostly heuristic, and developed without much
reference to the statistical theory that was emerging in parallel. In the 1960s, it
was realized that cluster analysis could be put on a principled statistical basis by
framing the clustering task as one of inference for a finite mixture model. This
has allowed cluster analysis to benefit from the inferential framework of statistics,
and provide principled and reproducible answers to questions such as: how many
clusters are there? what is the best clustering algorithm? how should we deal with
outliers?
In this book, we describe and review the model-based approach to cluster
analysis which has emerged in the past half-century, and is now an active research
field. We describe the basic ideas, and aim to show the advantages of thinking in
this way, as well as to review recent developments, particularly for newer types of
data such as high-dimensional data, network data, textual data and image data.
Greene (1909) remarked, “naming is classifying.” Plato was among the first to
formalize this with his Theory of Forms, defining a Form as an abstract unchanging
object or idea, of which there may be many instances in practice. For example, in
Plato’s Cratylus dialogue, he has Socrates giving the example of a blacksmith’s
tool, such as a hammer. There are many hammers in the world, but just one
Platonic Form of “hammerness” which is the essence of all of them.
Aristotle, in his History of Animals, classified animals into groups based on
their characteristics. Unlike Plato, he drew heavily on empirical observations. His
student Theophrastus did something similar for plants in his Enquiry Into Plants.
there was a further explosion of interest fueled by new types of data and questions,
often involving much larger data sets than before. These include finding groups
of genes or people using genetic microarray data, finding groups and patterns in
retail barcode data, finding groups of users and websites from Internet use data,
and automatic document clustering for technical documents and websites.
Another major area of application has been image analysis. This includes medical
image segmentation, for example for finding tumors in digital medical images such
as X-rays, CAT scans, MRI scans and PET scans. In these applications, a cluster
is typically a set of pixels in the image. Another application is image compression,
using methods such as color image quantization, where a cluster would correspond
to a set of color levels. For a history of cluster analysis to 1988, see Blashfield and
Aldenderfer (1988).
which is striking since he did so ten years before the article of Dempster et al.
(1977) that popularized the EM algorithm. This remains the most used estimation
approach in model-based clustering. We outline the early history of model-based
clustering in Section 2.9, after we have introduced the main ideas.
Basing cluster analysis on a probability model has several advantages. In essence,
this brings cluster analysis within the range of standard statistical methodology
and makes it possible to carry out inference in a principled way. It turns out that
many of the previous heuristic methods correspond approximately to particular
clustering models, and so model-based clustering can provide a way of choosing
between clustering methods, and encompasses many of them in its framework. In
our experience, when a clustering method does not correspond to any probability
model, it tends not to work very well. Conversely, understanding what probability
model a clustering method corresponds to can give one an idea of when and why
it will work well or badly.
It also provides a principled way to choose the number of clusters. In fact, the
choice of clustering model and of number of clusters can be reduced to a single
model selection problem. It turns out that there is a trade-off between these
choices. Often, if a simpler clustering model is chosen, more clusters are needed
to represent the data adequately.
Basing cluster analysis on a probability model also leads to a way of assessing
uncertainty about the clustering. In addition, it provides a systematic way of
dealing with outliers by expanding the model to account for them.
1.2 Classification
The problem of classification (also called discriminant analysis) involves classifying
objects into classes when there is already information about the nature of the
classes. This information often comes from a data set of objects that have already
been classified by experts or by other means. Classification aims to determine
which class new objects belong to, and develops automatic algorithms for doing
so. Typically this involves assigning new observations to the class whose objects
they most closely resemble in some sense.
Classification is said to be a “supervised” problem in the sense that it requires
the supervision of experts to provide some examples of the classes. Clustering, in
contrast, aims to divide a set of objects into groups without any examples of the
“true” classes, and so is said to be an “unsupervised” problem.
The first statistical method for classification is due to Ronald Fisher in his
famous work on discriminant analysis (Fisher, 1936). Fisher asked what linear
combination of several features best discriminates between two or more populations.
He applied his methodology, known nowadays as Fisher’s discriminant analysis or
linear discriminant analysis, to a data set on irises that he had obtained from the
botanist Edgar Anderson (Anderson, 1935).
In a subsequent article (Fisher, 1938), he established the links between his
discriminant analysis method and several existing methods, in particular analysis
of variance (ANOVA), Hotelling’s T-squared distribution (Hotelling, 1931) and
the Mahalanobis generalized distance (Mahalanobis, 1930). In his 1936 paper,
Fisher also acknowledged the use of a similar approach, without formalization, in
craniometry for quantifying sex differences in measurements of the mandible.
Discriminant analysis rapidly expanded to other application fields, including
medical diagnosis, fault detection, fraud detection, handwriting recognition, spam
detection and computer vision. Fisher’s linear discriminant analysis provided good
solutions for many applications, but other applications required the development
of specific methods.
Among the key methods for classification, logistic regression (Cox, 1958) ex-
tended the usual linear regression model to the case of a categorical dependent
variable and thus made it possible to do binary classification. Logistic regression
had a great success in medicine, marketing, political science and economics. It
remains a routine method in many companies, for instance for mortgage de-
fault prediction within banks or for click-through rate prediction in marketing
companies.
Another key early classification method was the perceptron (Rosenblatt, 1958).
Originally designed as a machine for image recognition, the perceptron algorithm
is supposed to mimic the behavior of neurons for making a decision. Although
the first attempts were promising, the perceptron appeared not to be able to
recognize many classes without adding several layers. The perceptron is recognized
as one of the first artificial neural networks which recently revolutionized the
classification field, partly because of the massive increase in computing capabilities.
In particular, convolutional neural networks (LeCun et al., 1998) use a variation
of multilayer perceptrons and display impressive results in specific cases.
Before the emergence of convolutional neural networks and deep learning,
support vector machines also pushed forward the performances of classification
at the end of the 1990s. The original support vector machine (SVM) algorithm
(Cortes and Vapnik, 1995) was invented in 1963, but did not see its first practical
implementation until 1992, thanks to the “kernel trick” (Boser et al., 1992). SVMs
are a family of classifiers, defined by the choice of a kernel, which transform the
original data into a high-dimensional space, through a nonlinear mapping, in which
the classes become linearly separable by a hyperplane. One of the reasons for the
popularity of SVMs was their ability to handle data of various types thanks to the
notion of kernel.
As we will see in this book, statistical methods have been able to keep pace with
the successive revolutions in the performance of supervised classification. In
addition, some of the older methods remain reference methods because they
perform well with low complexity.
1.3 Examples
We now briefly describe some examples of cluster analysis and discriminant
analysis.
Figure 1.1 Diabetes data pairs plot: three measurements on 145 patients.
Source: Reaven and Miller (1979).
was apparent visually from the plots of the data. Here it is hard to discern
clustering in Figure 1.2. However, we will see in Chapter 2 that in this higher
dimensional setting, model-based clustering can detect clusters that agree well
with clinical criteria.
[Figure 1.2: pairs plot of the variables concavity1, points1 and texture2.]
Thus we can assess how well various clustering methods perform. We will see in
Chapter 2 that model-based clustering is successful at this task.
Figure 1.3 Minefield data. Left: observed data. Right: true classification
into mines and clutter.
or noise, such as other metal objects or rocks. The objects are small and can be
represented by points without losing much information. The analyst’s task is to
determine whether or not minefields are present, and where they are. A typical
data set is shown in Figure 1.3. The true classification of the data between mines
and clutter is shown in the right panel of Figure 1.3. These data are available as
the chevron data set in the mclust R package.
This problem is challenging because the clutter makes up over two-thirds of the
data points and is not separated from the mines spatially, but rather by density.
Figure 1.4 Map of the Vélib stations in Paris (left panel) and loading
profiles of some Vélib stations (right panel). The red dots correspond to the
stations for which the loading profiles are displayed on the right panel.
Figure 1.5 Some textile samples of the three-class NIR data set.
supporters resigned from the karate club and established a new club, headed by
Mr. Hi. The data set exhibits many of the phenomena observed in social networks,
in particular clustering, or community structure. The data are shown in Figure 1.6
where the friendship network is shown and the locations of Mr. Hi and John A.
within the network are highlighted.
1.4 Software
We will give examples of software code to implement our analyses throughout
the book. Fortunately, model-based clustering is well served by good software,
mostly in the form of R packages. We will primarily use the R packages mclust
(Scrucca et al., 2016) and Rmixmod (Langrognet et al., 2016), each of which carries
out general model-based clustering and classification and has a rich array of
capabilities. The capabilities of these two packages overlap to some extent, but
not completely. An advantage of using R is that it allows one to easily use several
different packages in the same analysis.
We will also give examples using several other R packages that provide additional
model-based clustering and classification capabilities. These include FlexMix
(Leisch, 2004), fpc (Hennig, 2015a), prabclus (Hennig and Hausdorf, 2015), pgmm
(McNicholas et al., 2018), tclust (Iscar et al., 2017), clustMD (McParland and
Gormley, 2017) and HDclassif (Bergé et al., 2016).
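As a quick orientation (not one of the book's listings), the two main packages can be installed and loaded in the usual way:

install.packages(c("mclust", "Rmixmod"))
library(mclust)
library(Rmixmod)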
Figure 1.6 The friendship network of Zachary’s karate club. The two key
members in the dispute within the club, Mr. Hi and John A., are labeled
and colored differently.
[Figure: the fitted mixture density and its two component densities for the waiting times, plotted over the range 50–90.]
# Plot densities:
x <- seq(from = min(waiting), to = max(waiting), length = 1000)
den1 <- dnorm(x, mean = waiting.Mclust$parameters$mean[1],
              sd = sqrt(waiting.Mclust$parameters$variance$sigmasq[1]))
den2 <- dnorm(x, mean = waiting.Mclust$parameters$mean[2],
              sd = sqrt(waiting.Mclust$parameters$variance$sigmasq[2]))
tau1 <- waiting.Mclust$parameters$pro[1]
tau2 <- waiting.Mclust$parameters$pro[2]
dens <- tau1 * den1 + tau2 * den2
plot(x, dens, type = "l", xlab = "y", ylab = "Probability Density",
     ylim = c(-max(dens) / 10, 1.05 * max(dens)),
     main = "Density for 1-dim 2-component normal mixture model", lwd = 2)
lines(x, tau1 * den1, col = "red")
lines(x, tau2 * den2, col = "blue")
legend(x = min(x), y = max(dens),
       legend = c("Mixture", "Component 1", "Component 2"),
       col = c("black", "red", "blue"), lty = c(1, 1, 1), lwd = c(2, 1, 1))
However, the separation is not total. Also, note that there is some overlap between
the points from the two mixture components. In particular, one of the blue points
is to the left of many of the red points. So even in this fairly clear situation we
would be uncertain about which components the points in the middle belong to,
if they were not conveniently colored. Assessing this kind of uncertainty is one of
the things that model-based clustering allows us to do.
When the data are multivariate, fg is often the multivariate normal or Gaussian
density φg , parameterized by its mean vector, μg and by its covariance matrix Σg ,
and has the form
\varphi_g(y_i \mid \mu_g, \Sigma_g) = |2\pi\Sigma_g|^{-1/2} \exp\left\{ -\tfrac{1}{2} (y_i - \mu_g)^T \Sigma_g^{-1} (y_i - \mu_g) \right\}.    (2.2)
Data generated by mixtures of multivariate normal densities are characterized
by groups or clusters centered at the means μg , with increased density for points
nearer the mean. The corresponding surfaces of constant density are ellipsoidal.
Figure 2.2 shows the density contours for a two-dimensional finite bivariate
normal mixture model with two mixture components. The parameters were those
estimated from the two-dimensional Old Faithful data (Example 1), namely
\mu_1 = (4.29, 79.97), \quad \mu_2 = (2.04, 54.48), \quad \tau_1 = 0.644, \quad \tau_2 = 0.356,
\Sigma_1 = \begin{pmatrix} 0.170 & 0.938 \\ 0.938 & 36.017 \end{pmatrix}, \quad \Sigma_2 = \begin{pmatrix} 0.069 & 0.437 \\ 0.437 & 33.708 \end{pmatrix}.
We can see the ellipsoidal nature of the contours. The values simulated from
the two mixture components do not overlap in this bivariate setting. The six
contours shown were chosen so as to contain 5%, 25%, 50%, 75%, 95% and 99%
of the probability, respectively. The R code used to produce the plot is shown in
Listing 2.2.
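Listing 2.2 is not reproduced here, but the following is a minimal sketch of how contours like those in Figure 2.2 could be computed from the parameter values above. It assumes the mvtnorm package for the bivariate normal density; the grid ranges are illustrative choices.

library(mvtnorm)
mu1 <- c(4.29, 79.97); mu2 <- c(2.04, 54.48)
Sigma1 <- matrix(c(0.170, 0.938, 0.938, 36.017), 2, 2)
Sigma2 <- matrix(c(0.069, 0.437, 0.437, 33.708), 2, 2)
tau <- c(0.644, 0.356)
x <- seq(1.5, 5.5, length = 200)
y <- seq(40, 100, length = 200)
grid <- as.matrix(expand.grid(x, y))
# Evaluate the two-component mixture density on the grid
dens <- tau[1] * dmvnorm(grid, mu1, Sigma1) + tau[2] * dmvnorm(grid, mu2, Sigma2)
contour(x, y, matrix(dens, nrow = length(x)),
        xlab = "Dimension 1", ylab = "Dimension 2")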
Footnote: In earlier work (Banfield and Raftery, 1993), λ_g was taken to be equal to the first eigenvalue of Σ_g.
Figure 2.3 shows examples of contours of the component densities for the various
models in the two-dimensional case with two mixture components.
Sometimes these constrained models can have far fewer parameters that need
to be independently estimated than the unconstrained model, while fitting the
sample data almost as well. When this is the case, they can yield more precise
estimates of model parameters, better out-of-sample predictions, and more easily
interpretable parameter estimates. All the models have Gd parameters for the
component means μ_g, and (G − 1) parameters for the mixture proportions τ_g.
Table 2.2 shows the numbers of parameters needed to specify the covariance
matrix for each model in the two-dimensional two-component case, d = 2, G = 2,
and the 27-dimensional three-component case, d = 27, G = 3. These results are
obtained by noting that for one mixture component, the volume is specified by 1
parameter, the shape by (d − 1) parameters, and the orientation by d(d − 1)/2
parameters.

Table 2.2 Numbers of parameters needed to specify the covariance matrix for models used in
model-based clustering.

Model   Covariance parameters   d = 2, G = 2   d = 27, G = 3
EEI     d                       2              27
VEI     G + (d − 1)             3              29
EVI     1 + G(d − 1)            3              79
VVI     Gd                      4              81
It is clear that the potential gain in parsimony as measured by number of
parameters is small for the two-dimensional case. But for higher-dimensional cases,
the gain can be large. In the most extreme case in Table 2.2, in the 27-dimensional
case with 3 mixture components, the VVV model requires 1,134 parameters to
represent the covariance matrices, whereas the EII model requires only one.
Also, more parameters are required to specify the shape than the volume, and
far more again to specify the orientation. Thus big gains in parsimony are achieved
by the models that require the orientations to be equal across mixture components,
and the largest gains come from requiring the component densities to be diagonal.
However, as we will see, these most parsimonious models do not always fit the
data adequately.
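As a quick check of these counts (not from the book; it assumes the helper nVarParams exported by the mclust package), the covariance-parameter numbers in Table 2.2 can be reproduced directly:

library(mclust)
# Number of covariance parameters for several models (compare with Table 2.2)
sapply(c("EII", "EEI", "VEI", "EVI", "VVI", "VVV"), nVarParams, d = 2, G = 2)
sapply(c("EII", "EEI", "VEI", "EVI", "VVI", "VVV"), nVarParams, d = 27, G = 3)

The VVV entry in the second call gives the 1,134 covariance parameters mentioned above, against a single parameter for EII.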
The likelihood for the multivariate normal mixture model takes the form

\prod_{i=1}^{n} \sum_{g=1}^{G} \tau_g\, \varphi_g(y_i \mid \mu_g, \Sigma_g).    (2.6)

The unobserved cluster memberships are encoded by indicator variables

z_{i,g} = \begin{cases} 1 & \text{if } y_i \text{ belongs to group } g, \\ 0 & \text{otherwise.} \end{cases}    (2.7)
[Figure: graphical model of the mixture, with nodes τ, Z, μ, Y and Σ.]
We assume that the z_i are independent and identically distributed, each according
to a multinomial distribution of one draw from G categories with probabilities
τ_1, . . . , τ_G. We also assume that the density of an observation y_i given z_i is given
by ∏_{g=1}^{G} f_g(y_i | θ_g)^{z_{i,g}}. Then the resulting complete-data log-likelihood is

\ell_C(\theta_g, \tau_g, z_{i,g} \mid y) = \sum_{i=1}^{n} \sum_{g=1}^{G} z_{i,g} \log\left[ \tau_g f_g(y_i \mid \theta_g) \right].    (2.8)
where τ̂_g^(s) is the value of τ_g after the sth EM iteration. The M-step involves
maximizing (2.8) in terms of τ_g and θ_g with z_{i,g} fixed at the values computed in
the E-step, namely ẑ_{i,g}.
The quantity ẑ_{i,g}^(s) = E[z_{i,g} | y_i, θ_1, . . . , θ_G] for the model (2.1) is the conditional
expectation of z_{i,g} given the parameter values at the (s − 1)th iteration and the
observed data y. The value ẑ_{i,g} of z_{i,g} at a maximum of (2.6) is the estimated
conditional probability that observation i belongs to group g. The maximum
likelihood classification of observation i is {h | ẑ_{i,h} = max_g ẑ_{i,g}}. As a result,
(1 − max_g ẑ_{i,g}) is a measure of the uncertainty in the classification of observation
i (Bensmail et al., 1997).
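The following toy sketch (not from the book) shows how the MAP classification and the associated uncertainties can be extracted from a matrix of estimated membership probabilities; the values in z are made up for illustration.

# z: an n x G matrix of estimated membership probabilities (illustrative values)
z <- matrix(c(0.90, 0.10,
              0.40, 0.60,
              0.55, 0.45), ncol = 2, byrow = TRUE)
classification <- apply(z, 1, which.max)  # MAP classification of each observation
uncertainty <- 1 - apply(z, 1, max)       # 1 minus the largest membership probability
# mclust provides map(z) as a shortcut for the classification step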
For multivariate normal mixtures, the E-step is given by (2.9) with f_g replaced by φ_g.
Example 1 (ctd.)
We illustrate the EM algorithm for multivariate normal mixture models using the
Old Faithful data. We used the EM algorithm to carry out maximum likelihood
estimation for the model with two mixture components and unconstrained covari-
ance matrices (model “VVV”). We initialized the EM algorithm using a random
partition (other initialization strategies will be discussed in Section 2.4).
A trace of the values of the component means, μ1 and μ2 , at each iteration
of the EM algorithm is shown in Figure 2.5. The algorithm converged within 50
iterations, and the converged estimates are shown by solid circles. Because the
algorithm was initialized with a random partition, the starting estimates of μ1
and μ2 were very similar. R code to produce Figure 2.5 is shown in Listings 2.3
and 2.4; the other figures in this example can be produced in a similar way.
Figure 2.6 (left) shows the log-likelihood values for each iteration of the EM
algorithm. As the theory predicts, the log-likelihood increases at each iteration.
For the first 30 iterations, the log-likelihood remains relatively constant as the
algorithm struggles to escape from the initial random partition, which gives little
information about the two groups in the data. Then it detects the separation
between the groups, and the log-likelihood increases rapidly between iteration 30
and iteration 40 as the algorithm refines the partition. From iteration 40 onwards
# EM algorithm step-by-step
# Initialize EM algorithm
itmax <- 50
G <- 2
zmat <- matrix(rep(0, n * G), ncol = G)
for (g in 1:G) zmat[, g] <- z.init == g
mstep.out <- vector("list", itmax)
estep.out <- vector("list", itmax)
# EM iterations
for (iter in 1:itmax) {
  # M-step
  mstep.tmp <- mstep(modelName = "VVV", data = faithful, z = zmat)
  mstep.out[[iter]] <- mstep.tmp
  # E-step
  estep.tmp <- estep(modelName = "VVV", data = faithful,
                     parameters = mstep.tmp$parameters)
  estep.out[[iter]] <- estep.tmp
  zmat <- estep.tmp$z
}
# Extract classification
EMclass <- rep(NA, n)
for (i in 1:n) {
  zmati <- zmat[i, ]
  zmat.max <- zmati == max(zmati)
  zmat.argmax <- (1:G)[zmat.max]
  EMclass[i] <- min(zmat.argmax)
}
when the algorithm has started to detect the separation between the two clusters.
It is apparent that the classification is still inadequate. The bottom right panel
shows the result of the 50th iteration, at which stage the algorithm has converged.
The two clusters have been satisfactorily separated.
[Figure: traces of the component variances and covariances at each iteration.]
The probability density function of the data evaluated at the parameter es-
timates is a finite mixture distribution, and this is depicted in several ways in
Figure 2.9, using the R code in Listing 2.5.
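Listing 2.5 is not shown here; one way to obtain similar density plots from a fitted model is sketched below, using mclust's densityMclust function (argument names assumed from the package documentation).

library(mclust)
dens <- densityMclust(faithful, G = 2, modelNames = "VVV")
plot(dens, what = "density", type = "contour")  # contour view of the fitted mixture density
plot(dens, what = "density", type = "persp")    # perspective view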
The uncertainty associated with the allocation of the ith observation’s cluster
membership is measured by

\mathrm{Uncer}_i = 1 - \max_{g=1,\ldots,G} \hat{z}_{i,g}.    (2.12)

The uncertainty Uncer_i will be largest for data points i for which the
ẑ_{i,1}, . . . , ẑ_{i,G} are all equal to one another, and thus Uncer_i = 1 − 1/G for those data
points. It will be smallest when one of the ẑ_{i,1}, . . . , ẑ_{i,G} is close to 1, in which case
it will be close to zero. The uncertainty values for the Old Faithful data are shown
in Figure 2.10, produced using the R code in Listing 2.6.
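Listing 2.6 is not reproduced here; a rough equivalent using mclust's built-in plotting is sketched below (illustrative, not the book's exact code).

library(mclust)
fit <- Mclust(faithful, G = 2, modelNames = "VVV")
uncer <- 1 - apply(fit$z, 1, max)  # same values as fit$uncertainty
plot(fit, what = "uncertainty")    # observations displayed according to their uncertainty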
In the left panel of Figure 2.10, the 5% of the observations with the greatest
uncertainty are shown by large black circles; these are the ones that are roughly
equidistant between the two clusters. The next 20% of the observations in terms
of uncertainty are shown by smaller gray circles, and these are closer to the
cluster centers than the most uncertain points. The least uncertain 75% of the
observations are shown by small gray dots, and these are the ones that are closest
to the cluster centers. In this data set the clusters are well separated, and so there
is little uncertainty about the allocation of most of the data points.
The right panel of Figure 2.10 shows contours of uncertainty level. These are
slightly nonlinear, reflecting the fact that the two clusters have different covariance
matrices. They also fall away quickly from the contour of highest uncertainty,
reflecting the fact that most of the data points are assigned to clusters with little
uncertainty.
For normal mixture models things are even trickier, as there are typically
multiple paths in parameter space along which the likelihood can tend to infinity
at parameter values on the edge of the parameter space (Titterington et al.,
Figure 2.10 Uncertainty values for the Old Faithful data with two clusters.
Left: uncertainty values for individual observations. The large black circles
indicate the top 5% of the observations in terms of uncertainty, the smaller
gray circles indicate the next 20%, and the light gray dots indicate the
lowest 75%. Right: contours of uncertainty level.
we saw in Section 2.3 that the EM algorithm gave satisfactory performance when
initialized with a random partition. This often does not work well, however.
Here we will describe two methods for selecting initial values that have performed
well in a variety of settings: hierarchical model-based clustering, and the so-called
smallEM method.
1963). These criteria are all special cases of a more general, computationally
efficient algorithm proposed by Lance and Williams (1967).
In hierarchical model-based clustering (Banfield and Raftery, 1993), the criterion
used for deciding which clusters to merge at each stage is no longer based on
dissimilarities. Instead the criterion used is the classification likelihood,
L_{CL}(\theta, z \mid y) = \prod_{i=1}^{n} f_{z_i}(y_i \mid \theta_{z_i}).    (2.13)
Note that this is different from the observed data (or mixture) likelihood given
by Equation (2.5) in that the set of cluster memberships z is an argument of the
classification likelihood, whereas these are integrated out in the observed data
likelihood. It is also different from the complete-data likelihood given by Equation
(2.4), in that it does not include the mixing probabilities τg corresponding to the
cluster memberships zi .
Methods that attempt to maximize the classification likelihood (2.13) attempt
to estimate the cluster memberships and the model parameters simultaneously
and are not in general asymptotically consistent in the sense of being guaranteed
to give estimates that tend to the true values as sample size increases (Marriott,
1975). However, they are often much more computationally efficient and so can
work well for initialization. Clustering methods using the classification and mixture
likelihoods were compared by Celeux and Govaert (1993).
Another method aimed at maximizing the classification likelihood is the Classification
EM (CEM) algorithm (Celeux and Govaert, 1992). This CEM algorithm
incorporates a classification step (C-step) between the E-step and the
M-step of the EM algorithm. The C-step consists of replacing the unobserved
labels h_i, i = 1, . . . , n, defining a partition of the observations into G clusters,
by {h_i | ẑ_{i,h} = max_g ẑ_{i,g}}. Thus, the M-step reduces to an estimation of the
clustering can involve fitting large numbers of different models, so obtaining initial
partitions for all of them from a single run is appealing.
By default, in the mclust R package, the EM algorithm is initialized by running
the hierarchical model-based clustering with the VVV model. This gives a partition
for each number of clusters. These partitions are used as the initial points for the
algorithm.
Example 1 (ctd.)
In the Old Faithful data, a hierarchical model-based clustering of the data is
carried out using the R commands shown in Listing 2.7.
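Listing 2.7 is not reproduced here; a sketch of hierarchical model-based clustering used both on its own and as an EM initialization is given below (function names from the mclust package).

library(mclust)
hcTree <- hc(faithful, modelName = "VVV")  # hierarchical model-based clustering
cl2 <- hclass(hcTree, G = 2)               # cut the tree into a two-cluster partition
fit <- Mclust(faithful, G = 2, modelNames = "VVV",
              initialization = list(hcPairs = hcTree))  # use it to initialize EM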
The results are shown in Figure 2.11. This is close, but not identical, to the
converged maximum likelihood solution shown in the bottom left panel of Figure
2.8. It is close enough to provide a good starting value for the EM algorithm. This
is the default initialization in the mclust package.
A potential disadvantage of hierarchical model-based clustering is that, in its
most basic form, it can take a long time to converge when the data set is large.
This is because it involves computing and maintaining a matrix of the classification
likelihood increase for all pairs of clusters that could potentially be merged. Since
initially there is one cluster per observation, this matrix is of size O(n2 ). If n is
large enough, this matrix does not fit in memory, and manipulating it involves
swapping in and out of memory. When that threshold is passed, computing time
can increase hugely.
A simple fix is to base the initialization by hierarchical model-based clustering
on a random subset of the data. Experience suggests that there are few gains to
be made by increasing the size of the subset beyond about 2,000 (Wehrens et al.,
2004). Code to do this for the Old Faithful data with a subset size of 100 is shown
in Listing 2.8.
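Listing 2.8 is not reproduced; the subset-based initialization can be requested through Mclust's initialization argument, as sketched below (the subset size of 100 follows the text; the seed is an illustrative choice).

library(mclust)
set.seed(1)
sub <- sample(1:nrow(faithful), size = 100)  # random subset for the hierarchical initialization
fit <- Mclust(faithful, G = 2, modelNames = "VVV",
              initialization = list(subset = sub))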
Figure 2.11 Partition of the Old Faithful data into two clusters using
hierarchical model-based clustering with the unconstrained variance (VVV)
model.
3 Record the mixture log-likelihood at the stopping point of the short EM run.
4 Among the stopping points of the short EM runs, choose the one with the
highest mixture likelihood. Using it as a starting point, run the EM algorithm
to convergence.
Example 1 (ctd.)
Results from the smallEM initialization method for the Old Faithful data are
shown in Figures 2.12 and 2.13. They were produced using the R commands in
Listing 2.9.
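Listing 2.9 is not reproduced; the following is a rough sketch of the smallEM idea using mclust's me() function (the number of random starts and the short-run length of two iterations are illustrative choices).

library(mclust)
best <- NULL
for (r in 1:6) {
  # Random partition, converted to an indicator matrix
  z0 <- unmap(sample(1:2, nrow(faithful), replace = TRUE))
  # Short EM run of two iterations
  short <- me(data = faithful, modelName = "VVV", z = z0,
              control = emControl(itmax = 2))
  if (is.null(best) || short$loglik > best$loglik) best <- short
}
# Run EM to convergence from the best short run
final <- me(data = faithful, modelName = "VVV", z = best$z)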
Figure 2.12 Initialization with smallEM on the Old Faithful Geyser data:
six runs of two-iteration EM to select the best initialization. Here, the best
initialization is the top-left panel with log-likelihood equal to −1,174.
“Likelihood” refers to the maximized mixture log-likelihood.
The smallEM solutions shown in Figure 2.12 are quite similar, even though
each one is the result of only two EM iterations. However, the differences in
log-likelihood between the solutions are quite substantial.
The final estimate from smallEM shown in Figure 2.13 is the same classification
as that from the random partition initial value shown in Figure 2.5. This is in
spite of the fact that the initial values are dramatically different.
Finally, especially in the clustering context, the CEM algorithm could be a
good alternative to the smallEM strategy since the CEM algorithm is expected to
converge rapidly and can be repeated many times at small cost (Biernacki et al.,
2003).
We also illustrate the possible differences between the partitions derived from
the EM and the CEM algorithms with the Old Faithful geyser data. If the number
of clusters is fixed at G = 2, the CEM algorithm with the VVV model from a
random initial solution gives exactly the same partition, displayed in Figure 2.13,
that we get from EM initialized with the smallEM strategy. This is not surprising
since with G = 2 the two mixture components are well separated. But with G = 3
and the VVV model, the EM and the CEM algorithms yield different partitions;
see Figure 2.14.
Figure 2.13 Initialization with smallEM on Old Faithful geyser data: final
result of the EM algorithm from the best two-iteration run. “Likelihood”
refers to the maximized mixture log-likelihood.
Figure 2.14 Classification plots for the Old Faithful data with the VVV
model and G = 3. Left: EM results. Right: CEM results.
[Figure: pairs plot of the diabetes data variables glucose, insulin and sspg.]
The Rand index (Rand, 1971) of the similarity of two partitions of a data set is the proportion of pairs of
data points that are in the same group in both partitions. The adjusted Rand
index (ARI) (Hubert and Arabie, 1985) is the chance-adjusted value of the Rand
index. When the two partitions are statistically independent, its expected value is
zero, and when the two partitions are identical its value is 1. The higher the ARI,
the better.
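As a small illustration (not from the book), the ARI between two partitions can be computed with mclust's adjustedRandIndex function:

library(mclust)
a <- c(1, 1, 1, 2, 2, 2)   # one partition (illustrative labels)
b <- c(1, 1, 2, 2, 2, 2)   # another partition of the same objects
adjustedRandIndex(a, b)     # chance-adjusted agreement between the two partitions
adjustedRandIndex(a, a)     # identical partitions give an ARI of 1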
The classification error rate for model-based clustering for this data set was
11.7%, and the ARI was 0.719. Single link clustering performed poorly for this
data set, putting all the data points except two into the same cluster. Average
link clustering, complete link clustering and k-means performed similarly, as one
might expect from Figure 2.17, with classification error rates between 27.6% and
29.7%, and ARI values between 0.318 and 0.375. These are all substantially worse
than model-based clustering, as they fail to capture the strongly non-spherical
nature of the clusters.
Model-based clustering also provides a quantification of the uncertainty about
the cluster allocation of each data point, Uncer_i, defined by Equation (2.12). These
Figure 2.17 Clustering results for diabetes data, shown for two of the
three variables, insulin and SSPG. (a) Clinical classification; (b) model-based
clustering; (c) single link; (d) average link; (e) complete link; (f) k-means.
Table 2.3 Performance of different clustering methods on the diabetes data: classification error
rate (CER: smaller is better) and Adjusted Rand Index (ARI: larger is better).
are summarized in the uncertainty plot in Figure 2.18, which plots the uncertainty
values in increasing order, and shows the classification errors by vertical lines.
Most of the classification errors correspond to higher levels of uncertainty.
The uncertainty values are shown in a different way in Figure 2.19, where the
top 5% of points by uncertainty are shown by solid black circles, and the next
20% by solid gray circles. The points with the largest uncertainty are between
clusters; the points close to cluster centers have low uncertainty.
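In mclust, this uncertainty (one minus the largest posterior membership probability for each observation) is returned with the fitted model; a minimal sketch, assuming `fit` is an Mclust object and `truth` the known classification:

# Uncer_i: one minus the largest posterior membership probability for observation i
uncer <- 1 - apply(fit$z, 1, max)
# mclust stores the same quantity as fit$uncertainty
uncerPlot(z = fit$z, truth = truth)   # uncertainty plot of the kind shown in Figure 2.18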
R commands to analyze the diabetes diagnosis data using model-based clustering
are shown in Listing 2.10. R commands to analyze the same data using the non-
model-based clustering methods are shown in Listing 2.11.
The k-means method also classified too few subjects into the Malignant category,
but more than average link and complete link.
Table 2.4 shows a quantitative comparison between the different clustering
methods in terms of classification error rate (CER) and adjusted Rand index
(ARI). Single link clustering had an ARI that was essentially zero, indicating
that its classification was not better than random chance, while average link and
complete link clustering had slightly better performances. The k-means method
performed better, but it had a classification error rate that was three times greater
than model-based clustering, which performed best for this data set.
The uncertainty plot is shown in Figure 2.22. It can be seen that the majority
of the errors corresponded to the subjects with the highest uncertainty, so in this
case the assessment of uncertainty captured the likelihood of error fairly well, at
least in a qualitative sense.
Listing 2.10: R code for model-based cluster analysis of the diabetes data
# Initial setup
library(mclust)
data(diabetes)
diabetes.data <- diabetes[2:4]
diabetes.class <- unlist(diabetes[1])
# Model fitting with G = 3 clusters (this step is reconstructed here;
# it is needed by the uncertainty plot below)
diabetes.Mclust <- Mclust(diabetes.data, G = 3)
# Uncertainty plot
uncerPlot(z = diabetes.Mclust$z, truth = diabetes.class)
Listing 2.11: R code for cluster analysis of diabetes data using non-model-based
methods
# Loading libraries and data
library(MBCbook)
data(diabetes)
diabetes.data = diabetes[, -1]
diabetes.class = diabetes$class
# k-means clustering
diabetes.kmeans <- kmeans(diabetes.data, centers = 3, nstart = 20)
coordProj(diabetes.data, dimens = c(2, 3), what = "classification",
          classification = diabetes.kmeans$cluster,
          col = c("green3", "red2", "dodgerblue2"), symbols = c(17, 0, 16),
          sub = "(f) k means")
classError(diabetes.kmeans$cluster, truth = diabetes.class)
adjustedRandIndex(diabetes.kmeans$cluster, diabetes.class)
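The linkage-based clusterings compared above can be obtained with base R's hclust(); a brief sketch, with no scaling applied and three clusters requested as in the text:

# hierarchical clusterings with single, average and complete linkage
d <- dist(diabetes.data)
single.cl   <- cutree(hclust(d, method = "single"),   k = 3)
average.cl  <- cutree(hclust(d, method = "average"),  k = 3)
complete.cl <- cutree(hclust(d, method = "complete"), k = 3)
# error rate and ARI against the clinical classification
classError(single.cl, diabetes.class)$errorRate
adjustedRandIndex(complete.cl, diabetes.class)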
Figure 2.21 Clustering results for breast cancer diagnosis data, shown for
the first two principal components of the data. (a) Clinical classification
(Benign = red squares, Malignant = blue circles); (b) model-based
clustering; (c) single link; (d) average link; (e) complete link; (f) k-means.
Table 2.4 Performance of different clustering methods on the breast cancer diagnosis data:
classification error rate (CER: smaller is better) and Adjusted Rand Index (ARI: larger is
better). All methods take the number of clusters (G = 2) as known.
In the Bayesian approach to model selection, each candidate model Mk is evaluated
through the integral

p(D|Mk) = ∫ p(D|θMk, Mk) p(θMk|Mk) dθMk,

where p(θMk|Mk) is the prior distribution of θMk, the parameter vector for model
Mk. The quantity p(D|Mk) is known as the integrated likelihood, or marginal
likelihood, of model Mk.
A natural Bayesian approach to model selection is then to choose the model that
is most likely a posteriori. If the prior model probabilities, p(Mk ), are the same,
this amounts to choosing the model with the highest integrated likelihood. For
comparing two models, M1 and M2 , the Bayes factor is defined as the ratio of the
two integrated likelihoods, B12 = p(D|M1 )/p(D|M2 ), with the comparison favoring
M1 if B12 > 1, and conventionally being viewed as providing very strong evidence
for M1 if B12 > 100 (Jeffreys, 1961). Often, values of 2 log(B12) rather than B12
are reported; on this scale, after rounding, very strong evidence corresponds to a
threshold of 10 (Kass and Raftery, 1995).
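The two conventions agree, since 2 log(100) ≈ 9.2, which rounds to the threshold of 10 on the 2 log(B12) scale.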
This approach is appropriate in the present context because it applies when
there are more than two models, and can be used for comparing non-nested
models. In addition to being a Bayesian solution to the problem, it has some
desirable frequentist properties. For example, if one has just two models and they
are nested, then basing model choice on the Bayes factor minimizes the total error
rate, which is the sum of the Type I and Type II error rates (Jeffreys, 1961).
The main difficulty in the use of Bayes factors is the evaluation of the integral
that defines the integrated likelihood. For regular models, the integrated likelihood
can be approximated simply by the Bayesian Information Criterion or BIC:
2 log p(D|Mk ) ≈ 2 log p(D|θ̂Mk , Mk ) − νMk log(n) = BICMk , (2.16)
where νMk is the number of independent parameters to be estimated in model
Mk (Schwarz, 1978; Haughton, 1988). This approximation is particularly good
when a unit information prior is used for the parameters, that is, a multivariate
normal prior that contains the amount of information provided on average by one
observation (Kass and Wasserman, 1995; Raftery, 1995).
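In mclust, the BIC of a fitted model can be recovered from the stored quantities exactly as in (2.16); a small sketch, assuming `fit` is an Mclust object:

# BIC = 2 * maximized log-likelihood - (number of estimated parameters) * log(n)
2 * fit$loglik - fit$df * log(fit$n)
fit$bic       # the same value, as stored by mclust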
but may well have information that would allow them to refine the prior. It thus
seems a reasonable candidate to be used as a default prior.
Note that an informed analyst may well have a more precise prior or external
information, leading to a more concentrated prior distribution. More concentrated
prior distributions often tend to lead to Bayes factors that favor larger models
(and the alternative hypothesis in two-hypothesis testing situations). Thus the
unit information prior can be viewed as conservative, in the sense that informative
priors are likely to favor the alternative hypothesis (or larger models) more. If
external information tends to provide evidence against the null hypothesis, then its
use is legitimate, but the fact that the conclusion is based on external information
as well as the data at hand should be revealed to readers and users.
Finite mixture models do not satisfy the regularity conditions that underlie the
published proofs of (2.16), but several results suggest its appropriateness and good
performance in the model-based clustering context. Leroux (1992) showed that
basing model selection on a comparison of BIC values will not underestimate the
number of groups asymptotically. Keribin (1998) showed that BIC is consistent for
the number of groups, subject to the assumption that the likelihood is bounded.
The likelihood is not in fact bounded in general for Gaussian mixture models
because of the degenerate paths of the likelihood to infinity as the variance matrix
becomes singular. However, a very mild restriction, such as a very small lower
bound on the smallest eigenvalue of the covariance matrices, is enough to ensure
validity of the assumption. The main software used in model-based clustering
incorporates such restrictions, so the assumption is valid in practice for all the
models used.
Roeder and Wasserman (1997) showed that if a mixture of (univariate) normal
distributions is used for one-dimensional nonparametric density estimation, then
using BIC to choose the number of components yields a consistent estimator of
the density. Finally, in a range of applications of model-based clustering, model
choice based on BIC has given good results (Campbell et al., 1997, 1999; Dasgupta
and Raftery, 1998; Fraley and Raftery, 1998; Stanford and Raftery, 2000).
For multivariate normal mixture models, the models to be compared correspond
to different numbers of clusters and different sets of constraints on the covariance
matrices, or clustering models. Each combination of a number of clusters and
a clustering model defines a distinct model to be compared. Typically, models
with G = 1, . . . , Gmax are considered, for some reasonable choice of the maximum
number of clusters, Gmax . In the mclust R package, the default value of Gmax is
9, but in specific applications the appropriate number might be much greater. For
example, in the craniometric data of Example 5, there are known to be at least
28 populations represented, so in examples of this kind it would be wise to take
Gmax to be 30 or greater. By default, the mclust software considers 14 covariance
models, leading to a default number of Gmax × 14 = 126 models considered. The
Rmixmod software considers 28 covariance models, leading to a larger total number
of models considered.
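In practice this grid of models is fitted in one call; the sketch below raises Gmax to 30, as suggested for data like the craniometric example (the data frame name crania.data is only a placeholder):

library(mclust)
# all 14 covariance models, 1 to 30 clusters: 14 x 30 = 420 candidate models
BIC <- mclustBIC(crania.data, G = 1:30)
summary(BIC)                          # top models according to BIC
fit <- Mclust(crania.data, x = BIC)   # extract/refit the best of them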
For a finite mixture model, the usual likelihood function is also called the
mixture likelihood or, particularly in the context of the EM algorithm, the
observed-data likelihood. The complete-data likelihood, in contrast, is the joint
density of the observations y and the cluster labels z,

p(y, z | θMk, Mk) = ∏_{i=1}^{n} ∏_{g=1}^{GMk} {τg fg(yi | θg)}^{zi,g} .   (2.19)

The ICL criterion is based on the integrated complete-data likelihood,
p(y, z | Mk) = ∫ p(y, z | θMk, Mk) p(θMk | Mk) dθMk.
We use a BIC-like approximation to the integral and take twice the logarithm of
it, yielding
2 log p(y, z|Mk ) ≈ 2 log p(y, z|θ̂Mk , Mk ) − νMk log(n). (2.21)
Thus, replacing the unknown labels z by the MAP classification z∗ obtained from
the fitted model,

ICL = 2 log p(y, z∗ | θ̂Mk, Mk) − νMk log(n). (2.23)
It turns out that
ICL = BIC − E(Mk ), (2.24)
where

E(Mk) = − ∑_{i=1}^{n} ∑_{g=1}^{GMk} ẑi,g log ẑi,g

is the expected entropy of the classification from model Mk (Biernacki et al., 2000).
Thus ICL is equal to the BIC penalized by the expected entropy of the classification.
Entropy is high when there is large uncertainty about the classification, and it
is highest when all the ẑi,g are equal (i.e. all equal to 1/GMk ), at which point it
attains the value n log(GMk ). It is lowest when all the ẑi,g are equal to either 0 or
1, at which point it is equal to zero. As a result, ICL tends to favor models that
produce more clearly separated clusters more than BIC does, and so in practice
ICL usually chooses the same or smaller numbers of clusters than BIC.
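Both criteria are available in mclust; a minimal sketch for the Old Faithful data, as summarized in Figure 2.23:

library(mclust)
data(faithful)
BIC <- mclustBIC(faithful)   # BIC for the 14 models and G = 1,...,9
ICL <- mclustICL(faithful)   # ICL for the same models
summary(BIC)
summary(ICL)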
Figure 2.23 Model selection for the Old Faithful data: Left: BIC plot.
Right: ICL plot.
VVE covariance model. The model selected by BIC uses 11 free parameters to
model the covariances, while the model selected by ICL has one fewer, at 10.
Figure 2.25 shows the classifications and the component density ellipses for
the two selected models. Figure 2.26 shows the density contours for the overall
estimated mixture density from the two selected models. It seems that the model
selected by BIC may give a better estimate of the overall density of the data, but
ICL yields a clustering that agrees better with the visual impression.
One possible conclusion from this analysis is that there are likely two clusters
in the data, but that the cluster at the top right in the right panel of Figure 2.25
may well be non-Gaussian and could be better represented as a mixture of two
Gaussian components than as by a single multivariate normal distribution. This
suggests using BIC to select the number of mixture components, but then merging
the two top right components into a single non-Gaussian cluster. We will discuss
how to do this in Section 3.3. For the Old Faithful data, it can be seen from
Figure 2.24 Zoomed version of Figure 2.23: BIC and ICL plots for Old
Faithful data.
Figure 2.25 Classification plots for the Old Faithful data using selected
models. Left: model selected by BIC with G = 3, model EEE. Right: model
selected by ICL with G = 2, model VVV.
Figure 2.25 that this approach gives the same classification as the two-cluster
Gaussian model selected by ICL.
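mclust provides one implementation of this component-merging idea in clustCombi(); a brief sketch for the Old Faithful data (the merging approach itself is discussed in Section 3.3):

library(mclust)
data(faithful)
fit <- Mclust(faithful)               # BIC solution (G = 3, model EEE)
comb <- clustCombi(object = fit)      # hierarchy of solutions obtained by merging components
plot(comb, what = "classification")   # classifications for each number of merged clusters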
Figure 2.26 Density contour plots for the Old Faithful data using selected
models. Left: Model selected by BIC with G = 3, model EEE. Right: Model
selected by ICL with G = 2, model VVV.
Figure 2.27 Model selection for the diabetes diagnosis data. Left: BIC.
Right: ICL.
Figure 2.28 Classification plots for selected models for the diabetes
diagnosis data. Left: three-component model with unconstrained covariance
selected by BIC and ICL. Right: ten-component equal volume spherical
model (EII).
very different directions. Thus, the cluster volume, shape and orientation all vary
between clusters, so the VVV model is the most appropriate one, and it is the one
selected by our two criteria.
One interesting aspect of model choice in model-based clustering is that there
is a trade-off between complexity of clustering model and number of clusters. For
example, a single elongated cluster could also be represented approximately by a
larger number of spherical clusters. If we think of a pea plant, the single elongated
cluster could be like a pod, and the approximating spherical clusters could be like
the peas, so the problem could be viewed as one of distinguishing between the
peas and the pod.
The best constant-variance spherical model for these data, EII, has ten mixture
components. The classification from this model is shown in the right
panel of Figure 2.28. The “peas versus pod” dynamic is apparent. For example, the
long thin cluster in the left panel of Figure 2.28 is replaced in the ten-component
spherical clustering by three small components in the right panel. In this case,
BIC and ICL both choose the “pod” model over the “peas” model, selecting the
three-component model.
For some purposes, such as grouping data into compact groups for data compression,
a model such as EII with a large number of clusters might be better suited than a
selected, more complex covariance model. In such a case, the
spherical model may well be preferred, even if its fit is not as good according to
model selection criteria. It is always important to bear the purpose of the data
analysis in mind when making modeling decisions.
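The "peas versus pod" comparison can be reproduced directly; a sketch, using diabetes.data as set up in Listing 2.10:

# compare the spherical EII fits with the unconstrained VVV fits
BIC <- mclustBIC(diabetes.data, modelNames = c("EII", "VVV"), G = 1:12)
summary(BIC)                            # VVV with G = 3 comes out on top
fit.EII <- Mclust(diabetes.data, modelNames = "EII", G = 10)
plot(fit.EII, what = "classification")  # the ten "peas", cf. Figure 2.28 (right)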
Figure 2.33 (left) shows the correct classification and the confusion matrix
is shown in Table 2.6. It can be seen that the red group (Grignolino) is for
the most part fairly compact, but also has about nine samples that are quite
spread out and are separated from the core Grignolino samples. The right panel
of Figure 2.33 shows the same information, but with classification errors from
model-based clustering shown by filled black symbols. It is clear that all but one
Figure 2.29 Pairs plot for five of the 27 physical and chemical measurements of
wine samples (Flavanoids, Color Intensity, Alkalinity of Ash, pH, Magnesium).
of the ten misclassified samples are among these Grignolino samples whose values
are separated from the main Grignolino group, and so are classified as being of
one of the other types. It would be hard for any clustering method to classify
these samples correctly.
Table 2.7 shows the performance of different clustering methods for the wine
data, when all are given the “correct” number of clusters, G = 3. The model-based
method performs best by a wide margin, while the single link method again
performs very poorly.
Figure 2.30 The same pairs plot for five of the 27 physical and chemical
measurements of wine samples, with true wine types shown by color.
The fact remains that with BIC, although not with ICL, model-based clustering
has chosen seven clusters, even though there are in fact only three wine varieties.
The confusion matrix for comparing the seven-cluster solution with the partition
into three wine types is shown in Table 2.8. It can be seen that each of the three
wine types has itself been divided into either two or three clusters by the clustering
algorithm. If the clusters were merged correctly, the number of errors would be
very similar to the three-cluster solution (11 versus 10). Not surprisingly, though,
the classification error rate (52%) and the Adjusted Rand Index (0.418) are much
worse than for the three-cluster solution. Note, however, that the ARI is still
Figure 2.31 BIC (left) and ICL (right) for the wine data.
Figure 2.32 Zoomed version of the BIC (left) and ICL (right) plots for the
wine data.
better than for the three heuristic methods, even though the latter were “told”
the correct number of clusters.
There are 14 wine type/year combinations in the data set. The confusion matrix
for the seven-cluster solution against those 14 type/year categories is shown in
Table 2.9. Within each type, several of the clusters tend to separate out particular
years. For example, cluster 2 corresponds to Barolo wines that are not from 1973,
cluster 5 corresponds to later Grignolino wines, particularly from 1976, while
cluster 7 is mostly made up of Barbera wines from 1978. Thus it seems that the
seven-cluster solution is, at least in part, separating vintage years within wine types.
Table 2.5 Wine data: top models by BIC and ICL. The number of clusters is denoted by G. The
best model by each criterion is shown in bold font. 23,950 has been added to all BIC and ICL
values for readability.
Figure 2.33 Wine data: true classification (left) and classification errors
from model-based clustering (right), in clustering samples by wine type. The
errors are shown as solid black symbols, with the shape corresponding to the
true classification. Two of the 27 measurements, Flavanoids and Color
Intensity, are shown.
Table 2.6 Wine data: confusion matrix for three-cluster model and wine types. For each cluster,
the most represented wine type is marked by a box.

Cluster   Barolo   Grignolino   Barbera
   1        58          7          0
   2         1         62          0
   3         0          2         48
Table 2.7 Wine data: performance of different clustering methods for classifying wine types:
classification error rate (CER – smaller is better) and Adjusted Rand Index (ARI – larger is
better). All methods are based on G = 3 clusters.
Table 2.8 Wine data: confusion matrix for seven-cluster model chosen by BIC and wine types.
For each cluster, the most represented wine type is marked by a box.

Cluster   Barolo   Grignolino   Barbera
   1        31          0          0
   2        25          0          0
   3         3         17          0
   4         0         28          0
   5         0         18          0
   6         0          8         21
   7         0          0         27
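Confusion matrices such as Tables 2.6 and 2.8 are simple cross-tabulations of the estimated and known partitions; a one-line sketch, in which the object names wine.Mclust and wine.type are only placeholders:

# rows: clusters from model-based clustering; columns: known wine types
table(wine.Mclust$classification, wine.type)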
(Figure: BIC against the number of mixture components, 1–29, for the 14 covariance models.)
a cluster-specific scale factor. The BIC values for different numbers of clusters
for the VEE model are shown in Figure 2.35. The maximum is attained at G = 15
clusters.
The correspondence between the 15 clusters inferred by model-based clustering
and the 30 populations is shown in Table 2.10. The number of clusters is smaller
than the number of true populations, so there are several clusters that include
the majority of skulls from more than one population. Apart from this, however,
the correspondence between the clusters and the populations is close.
Seven of the 15 clusters correspond closely to a single population: Andaman,
Berg, Buriat, Bushman, Eskimo, Santa Cruz and Tolai. Cluster 8 groups the Ainu
(aboriginal Japanese) and Guam skulls. Cluster 9 groups most of the Arikara and
Figure 2.35 BIC plot for the VEE model only for the Howells
craniometric data.
Peruvian skulls; these are both Native American populations. Cluster 10 groups
(aboriginal) Australian and Tasmanian skulls; these are geographically proximate.
Cluster 11 groups Easter Island and Mokapu skulls; again these are relatively
geographically proximate. Cluster 12 groups Dogon from Mali, Teita from Kenya
and Zulu from South Africa; these are all sub-Saharan African populations.
Cluster 13 groups (medieval) Norse, (Hungarian) Zalavar and ancient Egyptian
skulls; these are all viewed as Caucasoid populations. Cluster 14 groups most of
the North and South Maori and Moriori skulls; these are all from New Zealand.
Finally, cluster 15 groups the Anyang and Hainan (both from China), Atayal
Table 2.10 Craniometric data: confusion matrix for the 15 clusters inferred by model-based
clustering and the 30 populations from which the skulls come.
Cluster:      1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
ANDAMAN 59 0 0 1 0 0 0 1 0 0 0 6 2 0 1
BERG 0 98 0 1 0 1 0 0 1 0 0 0 8 0 0
BURIAT 0 2 105 0 0 0 0 1 0 0 0 0 0 0 1
BUSHMAN 0 0 0 80 0 0 0 0 0 1 0 8 0 0 1
ESKIMO 0 0 0 0 108 0 0 0 0 0 0 0 0 0 0
SANTA CRUZ 0 1 0 0 0 95 0 0 3 0 0 0 3 0 0
TOLAI 0 0 0 0 0 0 100 1 0 3 2 2 0 1 1
AINU 0 1 0 0 0 0 0 71 0 0 2 3 3 0 6
GUAM 0 0 0 0 0 0 0 45 1 0 0 0 0 0 11
ARIKARA 0 5 0 0 0 4 0 3 43 0 1 1 1 4 7
PERU 1 0 0 0 0 3 0 3 102 0 0 0 0 0 1
AUSTRALIA 0 0 0 0 0 0 2 0 0 98 0 0 1 0 0
TASMANIA 0 1 0 0 0 0 10 0 0 70 3 2 1 0 0
EASTER I 0 0 0 0 0 0 0 1 0 0 84 1 0 0 0
MOKAPU 0 0 0 0 0 0 0 4 0 0 92 0 0 3 1
DOGON 0 0 0 0 0 0 0 0 0 0 0 99 0 0 0
TEITA 0 0 0 0 0 0 0 0 0 1 0 80 2 0 0
ZULU 0 0 0 3 0 0 1 0 0 1 0 91 2 0 3
EGYPT 0 0 0 0 0 0 0 0 0 0 0 7 104 0 0
NORSE 0 15 0 0 0 2 0 5 0 0 0 1 85 1 1
ZALAVAR 0 32 0 0 1 0 0 4 1 1 0 1 57 0 1
MORIORI 0 0 0 0 0 0 0 2 0 0 5 0 0 100 1
N MAORI 0 0 0 0 0 0 0 0 0 0 4 0 0 6 0
S MAORI 0 0 0 0 0 0 0 2 0 0 3 0 0 5 0
ANYANG 0 0 0 0 0 0 0 5 1 0 0 0 0 0 36
ATAYAL 0 0 0 0 0 0 1 2 0 0 0 2 0 0 42
HAINAN 0 1 1 0 1 0 0 9 1 0 0 1 0 0 69
N JAPAN 1 0 0 0 0 0 0 12 1 0 0 3 0 0 70
S JAPAN 0 0 0 0 0 0 0 5 1 0 0 0 0 0 85
PHILIPPINES 1 1 0 0 0 0 0 6 1 0 1 4 2 0 34
(Taiwan aboriginal), North and South Japanese, and Philippine populations. These
are all East Asian populations.
If we match each cluster with the population most represented within it, we
calculate a misclassification rate of 44.7%, so that about 56% of the skulls were
correctly classified. However, it seems more reasonable to match each cluster with
the set of geographically proximate populations most represented within it, as
discussed above. If we do this, the misclassification rate is only 12.3%. Overall,
it seems that the clustering has largely kept skulls from the same populations
together, but in some cases has been unable to distinguish between populations
that are relatively close geographically or historically.
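The cluster-to-population matching behind these rates can be computed by assigning each cluster to its modal population; a sketch, where crania.Mclust and crania.pop are placeholder names:

conf <- table(crania.Mclust$classification, crania.pop)   # 15 x 30 confusion matrix
modal.pop <- colnames(conf)[apply(conf, 1, which.max)]    # most represented population per cluster
pred.pop <- modal.pop[crania.Mclust$classification]       # predicted population for each skull
mean(pred.pop != crania.pop)                              # misclassification rate (44.7% above)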
Since it is known that there are in fact 30 populations, it is of interest to run
the analysis with the true number of groups, G = 30. This time the preferred
model for the covariance structure is the equal-covariance model, EEE, which is
closely related to the VEE model found best for G = 15 clusters. The confusion
matrix is shown in Table 2.11.
When the algorithm is told to use the correct number of clusters, 30, the
Table 2.11 Craniometric data: Confusion matrix for model-based clustering when the true number of groups, 30, is assumed.
Cluster:    1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30
AINU 69 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 1 1 2 3 2 1 3 0
ANDAMAN 0 63 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 1 0 0 0 0 1 1 0 2
BERG 0 0 73 0 1 0 2 0 0 0 1 0 1 0 0 0 0 0 2 0 28 0 0 0 0 0 0 0 1 0
BURIAT 0 0 0 106 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 2 0 0 0 0 0
BUSHMAN 0 0 0 0 81 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 5 0 0 0 1 0 0
ESKIMO 0 0 0 0 0 108 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
SANTA CRUZ 0 0 0 0 0 0 93 0 0 0 3 1 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 2
TOLAI 0 0 0 0 0 0 0 82 19 0 0 0 0 3 0 1 0 1 0 0 0 0 0 1 0 0 1 2 0 0
ARIKARA 0 0 1 0 0 0 5 0 0 45 4 0 4 0 0 0 2 1 0 0 2 0 0 0 2 3 0 0 0 0
PERU 0 1 0 0 0 0 3 0 0 1 37 27 37 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 2
AUSTRALIA 0 0 0 0 0 0 0 1 0 0 0 0 0 100 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
TASMANIA 0 0 0 0 0 0 0 8 4 0 0 0 0 3 65 2 1 0 0 0 2 1 0 0 0 0 1 0 0 0
EASTER I 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 84 2 0 0 0 0 0 0 0 0 0 0 0 0 0
MOKAPU 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 94 2 0 0 0 0 0 0 0 0 1 0 0 0
MORIORI 0 0 0 0 0 0 1 0 0 2 0 0 0 0 0 1 5 98 0 0 0 0 0 0 1 0 0 0 0 0
N MAORI 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 1 5 0 0 0 0 0 0 0 0 0 0 0 0
S MAORI 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 2 7 0 0 0 0 0 0 0 0 0 0 0 0
EGYPT 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 68 38 2 0 0 1 0 0 0 1 0 0
NORSE 0 0 1 0 0 0 2 0 0 0 0 0 0 0 0 0 0 1 67 4 34 0 1 0 0 0 0 0 0 0
ZALAVAR 1 0 4 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 12 5 71 0 0 1 0 1 0 0 0 0
DOGON 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 87 1 6 0 1 0 1 0 1
TEITA 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 3 66 9 0 0 0 0 0 1
ZULU 0 0 0 0 3 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 5 1 90 0 0 0 0 0 0
ANYANG 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 27 11 2 1 0 0
ATAYAL 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 28 0 16 2 0
GUAM 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 1 51 1 0 0
HAINAN 1 0 0 1 0 0 0 0 0 1 0 0 2 0 0 0 1 0 0 0 0 0 0 0 30 29 2 15 1 0
N JAPAN 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 26 15 2 1 40 0
PHILIPPINES 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 1 1 0 1 0 1 5 7 3 28 0 1
S JAPAN 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 55 25 1 3 6 0
Table 2.12 Performance of different clustering methods on the craniometric data: classification
error rate (CER – smaller is better) and Adjusted Rand Index (ARI – larger is better). Except
for model-based clustering, the other methods are run with the true number of clusters, G = 30.
Figure 2.37 Cover page of Wolfe (1965), the paper in which model-based
clustering for continuous data was invented.
What to do with large samples also emerged as a concern, and Wolfe’s advice
prefigured later findings remarkably accurately.
Fourthly, he recognized that with model-based clustering the choice of the
number of clusters (or “types”) is reduced to one of statistical hypothesis testing.
He proposed a likelihood-ratio test of the hypothesis that there are r types against
the alternative that there are r − 1 types. He wrote that the test statistic has a
χ2 distribution, which we now know not to be the case, but this was a key insight
for subsequent work.
Finally, he evaluated his methods using a simulated example, which became a
preferred approach in this literature.
On the basis of this one paper, we can conclude that John Wolfe was the inventor
of model-based clustering for continuous data. He developed the ideas further in
Wolfe (1967), and in Wolfe (1970) produced the first journal article describing his
results, but the key insights were already present in his 1965 Technical Bulletin.
Lavine and West (1992) and Bensmail et al. (1997) proposed methods for
Bayesian estimation of the multivariate normal mixture model for clustering using
Markov chain Monte Carlo (MCMC). They each did this for a different subset of
the Banfield–Raftery family of models. Cheeseman and Stutz (1995) introduced
the AUTOCLASS software, which carried out model-based clustering in a partly
Bayesian way.
Alternative parsimonious covariance matrix structures have been proposed for
model-based clustering. McNicholas and Murphy (2010a) developed a model-based
clustering approach for longitudinal data based on the Cholesky decomposition of
the covariance and Fop et al. (2018) proposed a sparse covariance structure for
model-based clustering.
Celeux and Soromenho (1996) proposed a normalized entropy criterion (NEC)
to select the number of clusters and the clustering model. The NEC criterion
has been improved by Biernacki et al. (1999) to decide between one cluster and
more than one cluster. NEC is available in the MIXMOD software. There have been
many comparisons of the various available model selection criteria in the context
of model-based clustering, including, more recently, Steele and Raftery (2010).
Alternatives to model selection have also been explored. These include Bayesian
model averaging, in which, instead of choosing a single model, predictive distribu-
tions are averaged over all the models considered, with weights that reflect their
predictive performance over the data (Russell et al., 2015). They also include
regularization methods (Witten and Tibshirani, 2010), which have been compared
with model selection methods by Celeux et al. (2014).
Finding the best way to initialize the EM algorithm remains an open ques-
tion. We have described the hierarchical model-based clustering and smallEM
approaches, but improvements are possible. Improved versions of the initialization
strategy have been described by Scrucca and Raftery (2015) and are implemented
in the mclust R package. O’Hagan et al. (2012) developed a variant of the smallEM
approach that discarded unpromising starting values incrementally. Michael and
Melnykov (2016) and O’Hagan and White (2018) proposed an alternative approach
to initialization based on model averaging, and reported good results.
Most of the heuristic clustering methods that predated model-based clustering
were based on a matrix of similarities between objects, often derived from a
matrix of measured characteristics of the objects, as discussed in Section 1.1.
Model-based clustering, on the other hand, has for the most part modeled the full
data on the measured characteristics directly. Sometimes, however, the full set of
measured characteristics isn’t available, or the similarities are what are measured
directly, or it is desirable to model the similarities rather than the characteristics
for computational reasons. For these situations, Oh and Raftery (2007) developed
a model-based clustering method based on the similarity matrix rather than
the measured characteristics, building on the Bayesian multidimensional scaling
method of Oh and Raftery (2001).
Model-based clustering has been used in a wide range of application areas,
several of which we illustrate in this book. Others include astronomy (Mukherjee
et al., 1998), chemistry (Fraley and Raftery, 2006), political science (Ahlquist and
78 Model-based Clustering: Basic Ideas
Breunig, 2012), education (Howard et al., 2018) and actuarial science (O’Hagan
and Ferrari, 2017). It has been used in a variety of ways in the analysis of gene
microarray data. For example, Yeung et al. (2001) used it to cluster genes according
to their expression levels in a range of experiments, while Young et al. (2017) used
it to remove artifacts caused by incorrect swapping of genes in paired experiments.
McLachlan and Peel (2000) and Fraley and Raftery (1998, 2002) gave overviews
of the area to that point. More recent overviews have been provided by Ahlquist
and Breunig (2012) and McNicholas (2016b,a).
3
Dealing with Difficulties
In this chapter, we will discuss some difficulties that the basic modeling strategy
can have in specific contexts, and describe some relatively simple strategies for
overcoming them, which are readily available and implemented in current software.
More elaborate strategies will be described in later chapters.
One difficulty is that data sets often contain outliers, that is, data points that
do not belong to any cluster. We will discuss some ways of dealing with this issue
in Section 3.1.
Another issue is that maximum likelihood estimates of the model parameters
can be degenerate. Bayesian estimation can be useful for regularizing the inference
in such situations, and avoiding these degeneracies. We discuss one Bayesian
approach to this problem in Section 3.2. This is a simple approach designed
to solve the degeneracy problem; when degeneracy is not an issue, it produces
solutions similar to those produced by maximum likelihood.
It often happens that some clusters are not Gaussian, so the most common
probability model in model-based clustering, the multivariate Gaussian, does not
apply. Using it tends to lead to the representation of one cluster by a mixture of
Gaussian components rather than just one. This still gives good estimates of the
underlying overall probability density function, but can lead to overestimation
of the number of clusters. In Section 3.3 we describe one way to overcome this
difficulty, by merging components after an initial fit.
Finally, we briefly mention some other approaches to these issues in Section 3.4.
Other approaches will be described in more detail later in the book.
3.1 Outliers
3.1.1 Outliers in Model-based Clustering
In general, in statistical analysis, an outlier is defined as one of a relatively small
number of points that do not follow the same pattern as the main bulk of the data.
In model-based clustering, outliers are points that do not belong to any of the
clusters. In many statistical analyses, outliers are often data points that lie outside
the main bulk of the data, and this can be true in cluster analysis too. However,
they may also be “inliers” in the sense that they may be between clusters, and
thus in the interior of the data, particularly if clusters are well-separated. Outliers
can increase the estimated number of clusters beyond what is present in the data.
79
Figure 3.1 Simulated data set with two well-separated clusters and
outliers.
identify all the simulated outliers correctly, except for the one close to the left
cluster, which is hard to identify visually anyway.
We will now describe two relatively straightforward ways to deal with outliers.
One is to add an additional component to the mixture model to represent the out-
liers, typically a uniform component. A second approach is to remove observations
identified as outliers at the estimation stage.
With the first approach, a uniform component on the data region is added to the
mixture, so that each observation yi has density

p(yi) = τ0 / V + ∑_{g=1}^{G} τg φg(yi | μg, Σg),   (3.1)
Figure 3.3 Simulated data with outliers: BIC plot for standard
model-based clustering.
where τ0 is the expected proportion of outliers in the data and V is the volume of
the data region. Estimation and model selection proceed as before.
Operationally, V can be defined as the volume of the smallest axis-aligned
hypercube containing the data. Another definition is the volume of the smallest
hypercube aligned with the principal components of the data set that contains the
data. The mclust software uses the minimum of these two values as its default
value.
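This default data-region volume can be computed with mclust's hypvol() helper; a small sketch (treating hypvol as the mechanism behind the default is an assumption here):

library(mclust)
# approximate volume of the data region: the smaller of the axis-aligned
# and principal-component-aligned bounding-box volumes
V <- hypvol(simdata)
Vinv <- hypvol(simdata, reciprocal = TRUE)   # its reciprocal, 1/V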
It might seem at first sight that the result would be sensitive to the value of
the volume of the data region. However, in practice the sensitivity typically does
not seem to be great. This is probably because with the model (3.1), a point
is classified as an outlier approximately if the uniform density is greater than
the density under the nearest mixture component, and so the cutoff point will
usually be in the tail of one of the mixture densities. Because the Gaussian density
declines fast as one moves away from its mode, the precise location of this cutoff
point is not too sensitive to the height of the uniform density.
The estimation results can be quite sensitive to the initialization of the EM
algorithm, however, particularly the initial assignment of noise points. One auto-
matic way to initialize the outlier assignment is by the nearest neighbor cleaning
method of Byers and Raftery (1998). This essentially designates points as outliers
3.1 Outliers 83
if they are far from other points. It does this by approximating the non-outlier
distribution of the distance of a point to its Kth nearest neighbor by a gamma
distribution with shape parameter K (this is exact if the data are generated by
a homogeneous Poisson process). It then represents the distribution of Kth
nearest neighbor distances by a mixture of two components, one corresponding
to non-outliers and one to outliers, estimating the resulting mixture model, and
designating as outliers the points in the cluster with larger values of the Kth
nearest neighbor distance. The method is implemented in the NNclean function
in the prabclus R library (Hennig and Hausdorf, 2015).
If the variables are measured on the same scale, such as spatial coordinates or
scores on similar tests, it is not necessary to scale the data before applying the
nearest neighbor cleaning method. Otherwise it is usually a good idea to scale the
data first. This can be done, for example, by dividing each variable by its standard
deviation, or by a robust measure of scale. Otherwise, the result is not invariant
to rescaling of particular variables. A variable measured on a scale that gives it a
large variance may dominate the calculations, which would be undesirable.
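A minimal sketch of such scaling, applied to a generic data matrix X before nearest neighbor cleaning (either the standard deviation or a robust alternative such as the MAD can be used):

# divide each column by its standard deviation ...
X.scaled <- sweep(X, 2, apply(X, 2, sd), "/")
# ... or by a robust measure of scale
X.scaled <- sweep(X, 2, apply(X, 2, mad), "/")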
For our simulated data, this analysis proceeds as shown in Listing 3.1, with
the results shown in Figure 3.4. The upper left panel shows the initialization
using nearest neighbor cleaning. This slightly overestimates the number of outliers,
identifying 25 outliers instead of the 20 that were simulated. However, this does
not matter much, as this is only the initial assignment, which is then refined by
the model-based clustering.
Figure 3.4(b) shows the BIC plot. This allows for the possibility of there being
no clusters, i.e. that the data set consists only of Poisson noise, shown in the plot
as “0 components.” This possibility is strongly rejected by BIC for the simulated
data. BIC chooses the equal-variance spherical model, EII, with two clusters and
noise. This is indeed the model from which the data were simulated. Figure 3.4(c)
shows the resulting classification. This correctly identifies the 20 outliers that were
simulated, including the one that was visually ambiguous, with no false positives.
Finally Figure 3.4(d) shows the uncertainty plot. Uncertainty was relatively
high on the periphery of the clusters, but 17 of the 20 simulated outliers were
identified with little uncertainty.
The nearest neighbor cleaning method tends to identify more points as outliers
than are actually present. This happened in the simulated data, where nearest
neighbor cleaning identified 25 outliers compared with 20 that were actually
simulated; the overestimation is often more severe. This usually does not matter
too much because it is being used only for initialization; the mixture model
estimation typically corrects things.
A modification of the nearest neighbor cleaning method that tends to identify
fewer false positives is the nearest neighbor variance estimation method (NNVE).
This adds simulated noise to the data before applying the nearest neighbor
cleaning method; perhaps surprisingly this tends to reduce the number of false
positives identified in the original data. It is implemented in the covRobust R
package, and its application to our simulated data is shown in Listing 3.2, with
the results shown in Figure 3.5.
Listing 3.1: R code for outlier modeling of the simulated data using nearest
neighbor cleaning
library(prabclus)   # for NNclean
# Initial outlier assignment (this step is reconstructed; K = 12 neighbors assumed)
NNclean.out <- NNclean(simdata, k = 12)
# Model-based clustering with a noise component, initialized from the NNclean labels
MclustN <- Mclust(simdata, initialization = list(noise = (NNclean.out$z == 0)))
# Plotting results
plot(simdata, col = 1 + NNclean.out$z)
mclust2Dplot(simdata, parameters = MclustN$parameters, z = MclustN$z,
             what = "classification", PCH = 20)
plot(MclustN, what = "uncertainty")
NNVE identifies fewer outliers than nearest neighbor cleaning, in this case 23
compared with 25. While this does not seem like a big difference, the number of
false positives is reduced from 5 to 3, i.e. by 40%. The final classification from the
mixture model is the same as when it is initialized with nearest neighbor cleaning.
The initial sets of outliers identified by nearest neighbor cleaning and NNVE both
include all the simulated outliers.
(Figure 3.4: (a) initialization from nearest neighbor cleaning; (b) BIC plot;
(c) classification; (d) uncertainty plot.)
two-dimensional space. Also, if one thinks of the clutter as outliers, their number
is large, consisting of over two-thirds of the points.
When we apply model-based clustering with no noise component to the data,
the best model according to BIC is the varying covariance model VVV with eight
clusters. The BIC plot is shown in Figure 3.6, with the BIC values for all models
in the left panel, and the BIC values for the VVV model with different numbers of
clusters in the right panel.
The resulting classification is shown in Figure 3.7. While the minefield appears
Listing 3.2: Outlier modeling for simulated data using nearest neighbor variance
estimation
# Loading libraries
library(covRobust)
# Outlier assignment by NNVE, then model-based clustering with noise
# (these two fitting steps are reconstructed; cov.nnve defaults assumed)
nnve.out <- cov.nnve(simdata)
simdata.MclustN.NNVE <- Mclust(simdata,
    initialization = list(noise = (nnve.out$classification == 0)))
# Plotting results
plot(simdata, col = 1 + nnve.out$classification)
mclust2Dplot(simdata, parameters = simdata.MclustN.NNVE$parameters,
             z = simdata.MclustN.NNVE$z,
             what = "classification", PCH = 20)
Figure 3.5 Outlier modeling for simulated data with nearest neighbor
variance estimation (NNVE): Left: initialization of assignment of outliers via
NNVE. Right: classification by model-based clustering initialized with
NNVE.
as two clusters, the outliers are represented by six clusters that do not have any
reality. Thus, without a noise component, model-based clustering does not give
an interpretable result for these data.
The identification of outliers by nearest neighbor cleaning is shown in Figure
3.8. This does identify most of the outliers correctly, but it erroneously identifies
some clumps of outliers as part of the minefield. This is because, by chance, their
12th nearest neighbor distances are relatively small. Visually, it is fairly clear from
looking at the data as a whole that they do not belong to the main minefield, but
nearest neighbor cleaning takes a local rather than a global approach to the data
Figure 3.6 BIC values for minefield data for model-based clustering with
no noise component. Left: all 14 models. Right: VVV model only.
and so cannot make this kind of inference. It provides a good initialization for
model-based clustering, which does take a global view of the data.
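In R this initialization can be carried out along the following lines (a minimal sketch; the object name minefield for the data is ours, NNclean is in the prabclus package, and k = 12 matches the 12th nearest neighbor distances mentioned above):

library(prabclus)
library(mclust)
nnc <- NNclean(as.matrix(minefield), k = 12)   # nearest neighbor cleaning; z = 0 flags noise
fit <- Mclust(minefield,                       # model-based clustering with a noise component
              initialization = list(noise = (nnc$z == 0)))
summary(fit)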
The BIC plot for model-based clustering with noise for the minefield data is
shown in the left panel of Figure 3.9. The model chosen is the variable-orientation
model, EEV, with two clusters, in which the volumes and shapes of the clusters
are the same. This means that there are two Gaussian clusters in addition to the
noise. The resulting classification is shown in the right panel of Figure 3.9.
The minefield here has a chevron shape, being concentrated around a piecewise
linear curve consisting of two lines that connect. Thus it cannot be well represented
by any one bivariate Gaussian distribution. This is because the bivariate Gaussian
distribution is in general concentrated around a line in two-dimensional space.
(When the covariance matrix is proportional to the identity matrix, the distribution
is distributed in a circular manner around a point in two-dimensional space.) The
model approximates the minefield by two clusters rather than one, each of which
is concentrated about a line. This illustrates the fact that a cluster concentrated
about a nonlinear curve can be approximated by a piecewise linear curve, which
can in turn be represented by several mixture components rather than one. The
model was extended to the situation where a single cluster is concentrated around
a smooth curve, represented by a principal curve, by Stanford and Raftery (2000).
The noise model gives 63 errors out of 1,104 data points, an error rate of 5.7%.
The errors are shown in the left panel of Figure 3.10, and it can be seen that
they are concentrated on the edge of the true minefield, which is where one would
expect any algorithm to have difficulties. The uncertainties are plotted in the
right panel of Figure 3.10, which indicates that the actual errors occurred where
the model said that the uncertainty was greatest.
It is possible to compare the best mixture models without and with noise using
BIC to provide an approximate Bayesian test of the presence of uniform noise.
For the minefield data, the BIC value for the best model without noise (the
unconstrained covariance VVV model with eight clusters) is −21,538.79, while for
the best model with noise (the varying-orientation model EEV with two clusters
and noise) it is −21,127.80. Thus the model with noise has a BIC value that
improves on the model without noise by 411 points, a rather decisive difference.
This analysis also illustrates the fact that with the noise model, one can have a
high proportion of outliers, more than 50%, without the analysis breaking down.
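In mclust this comparison can be read off directly from the fitted objects; a minimal sketch, reusing the noise initialization sketched above (object names are ours):

fit.nonoise <- Mclust(minefield)        # best model without a noise component
fit.noise   <- Mclust(minefield,
                      initialization = list(noise = (nnc$z == 0)))
fit.noise$bic - fit.nonoise$bic         # a large positive difference favors the noise model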
[Figure residue: a panel titled "NNclean classification", a BIC-versus-number-of-components plot with the covariance-model legend (EII–VVV), and a panel titled "Classification Errors"; only these titles and the axis labels were recoverable.]
where the densities fi are not necessarily Gaussian probability density functions.
Under these assumptions, Gallegos and Ritter (2005) and Garcı́a-Escudero et al.
(2008) show that mixture parameters and group memberships can be inferred by
just maximizing the “trimmed” log-likelihood:
\[
\ell(Y;\Theta,S) \;=\; \sum_{y_i \in S} \log\Big( \sum_{g=1}^{G} \tau_g\, \phi(y_i;\mu_g,\Sigma_g) \Big).
\]
Figure 3.11 tclust for robust clustering. The left panel shows the simulated
data with the actual group label (outliers are the black circles). The right
panel displays the tclust solution, which gives results similar to the actual
group labels. Notice that the true values of G and α were provided to tclust
here.
Listing 3.3 shows a simple use of tclust on simulated data, using the tclust
package for R. Here the data are simulated from a mixture of two Gaussian
distributions and a uniform component. The percentage of data in the uniform
component (assumed to be the outlier group) is α = 0.1. The results are shown in
Figure 3.11.
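The tclust call itself is not reproduced above; a minimal sketch of what it might look like, assuming the simulated data are in a matrix X (the trimming level matches the text; the restriction factor is our choice):

library(tclust)
out <- tclust(X, k = 2, alpha = 0.1, restr.fact = 50)   # trim 10% of the points, G = 2 groups
plot(out)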
As always, the actual number of groups and the percentage of spurious data are
not known in practical situations. It is therefore desirable to be able to automati-
cally choose appropriate values for G and α for the data at hand. Unfortunately,
as discussed by Garcı́a-Escudero et al. (2008), it is not valid to use classical model
selection criteria in this context. Instead, Garcı́a-Escudero et al. (2008) proposed
a heuristic which consists of examining the trimmed classification log-likelihood
curves and searching for the value of α for which the curves G and G+1 are similar
for larger values of α. Listing 3.4 presents code which allows the simultaneous
choice of G and α with tclust. The plot produced by the last commands is shown
in Figure 3.12.
The left panel of Figure 3.12 shows the curves of the trimmed classification
log-likelihood for G = 2, 3 and 4 against the value of α. The curve for G = 2
quickly joins the curves associated with larger numbers of groups. Following the
heuristic of Garcı́a-Escudero et al. (2008), one can conclude that an appropriate
value for G is 2, and that α should be between 0.07 and 0.1. The right panel gives
a better view of the relative differences between the curves for G = 2 and 3. It
# Plotting results
par(mfrow = c(1, 2))
plot(X, col = cls, pch = cls, xlab = '', ylab = '',
     main = "Data with actual labels")
plot(out, xlab = '', ylab = '', main = 'tclust clustering')
# Plotting results
par(mfrow = c(1, 2))
plot(out)
plot(seq(0, 0.2, len = 21), out$obj[1, ] - out$obj[2, ], type = 'b',
     xlab = 'alpha', ylab = '', main = 'Difference between G=2 and G=3')
abline(h = 0, lty = 2)
appears that the curves join at around α = 0.09, which is a reasonable estimate
of the actual percentage of outliers (the actual value of α is 0.1).
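The curves in Figure 3.12 can be computed with the ctlcurves function of the tclust package; a minimal sketch (our call, assuming the simulated data are in X) producing an object like the out used in the code above is:

library(tclust)
out <- ctlcurves(X, k = 2:4, alpha = seq(0, 0.2, len = 21))
plot(out)   # trimmed classification log-likelihood against alpha for each number of groups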
Figure 3.12 Choice of G and α with tclust. The left panel shows the curves
of the trimmed classification log-likelihood for G = 2, 3 and 4 against the
value of α. On the right panel, the difference between the curves for G = 2
and 3 is displayed. Following the heuristic of Garcı́a-Escudero et al. (2008),
one can conclude that appropriate values for G and α are respectively 2 and
0.09.
interesting in general, and are often spurious or degenerate solutions. They do not have
good properties as estimators. Redner and Walker (1984) showed that there is in
general a sequence of interior maximum likelihood estimates that is consistent, i.e.
that approaches the true values in the limit with high probability as the sample
size increases. But the spurious estimators do not have this good property.
In practice for estimation, this is not a big problem, because if the model is well
specified and the starting values are reasonable, it is unusual for the EM algorithm
to get trapped at a spurious solution. However, for model selection it is a bigger
problem. This is because model selection inevitably involves fitting models that
are bigger than the best fitting model, and so include mixture components that
are not well supported by the data. This is precisely the situation where spurious
solutions are most likely to be found.
Several solutions have been proposed. Hathaway (1985, 1986b) proposed maximizing the mixture likelihood (2.6) subject to the constraint that there exists a constant c such that all the eigenvalues of $\Sigma_g \Sigma_h^{-1}$ are greater than or equal to c for all pairs of mixture components g, h. However, the resulting constrained maximization is difficult.
The geometric constraints described in Section 2.2 provide a partial solution,
in that models that impose equal-volume constraints, such as EEE or EVV, do not
suffer from this problem. This is not a complete solution, however, because one
Listing 3.5: BIC for diabetes data with spurious singular solutions and with
prior
# Loading libraries and data
library(MBCbook)
data(diabetes)
X = diabetes[, -1]
# Including prior:
Mclust(X, prior = priorControl(), modelNames = "VVV")
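The corresponding fit without a prior, with the relaxed convergence tolerance described below, might be obtained along these lines (a sketch, not part of Listing 3.5):

library(mclust)
# Without a prior, relaxing the EM tolerance so that the spurious solutions are reported
Mclust(X, modelNames = "VVV", G = 1:9, control = emControl(tol = 1.e-3))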
spurious singular solutions are reported. The tolerance for convergence of the EM
algorithm, specified by the control parameter tol, was relaxed to $10^{-3}$ instead of the default $10^{-5}$.
The result is shown in Figure 3.13. The log-likelihoods found by the EM
algorithm for five or more components diverged to infinity without regularization,
but finite values were found here because the algorithm stopped. These BIC
values are shown by dashed red lines and open circles in Figure 3.13; by default
they would not be shown. Non-spurious results are found only for four or fewer
components, and the corresponding results are shown by solid red circles. Using
the full red BIC curve in Figure 3.13 would lead to the erroneous conclusion that
there are at least nine components.
When the default prior and posterior mode are used, the result is shown in Figure
3.13 by the solid blue circles and lines. The BIC values are very close to those
for the default solution for four or fewer components, when the default solution
is non-spurious, as we would wish. For five or more components, the spurious
singular solutions are no longer found because of the Bayesian regularization, and
the BIC continues to decline in a smooth way. Thus using the prior confirms our
earlier conclusion that using three mixture components is best.
To illustrate the issues, we now consider the five-component solution. This is the
smallest number of components for which the EM algorithm yields a degenerate
solution, shown in the left panel of Figure 3.14. Note that the data do not support
five components, so this would not be a good scientific solution, and is shown
only for illustration. Cluster 4 has only three points, shown in black in Figure
3.14. The dimension of the data is d = 3, and so Cluster 4 has the same number
of points as there are dimensions, not enough to estimate the covariance matrix.
Thus the estimated covariance matrix for Cluster 4 is singular, with the ratio of
the smallest to the largest eigenvalue equal to zero, and determinant equal to zero
also. There is zero uncertainty about membership of cluster 4. This is unrealistic,
and is another consequence of the degeneracy. In the default mclust solution, this
result would not be reported.
The right panel of Figure 3.14 shows the classification using the maximum
Figure 3.13 Diabetes diagnosis data: BIC values for the unconstrained
covariance VVV model calculated in different ways. The red filled circles show
the BIC values from the usual EM algorithm for 1 to 4 mixture components;
these correspond to non-spurious interior solutions. The red open circles and
dashed lines show the spurious computed BIC values for 5 through 9
components; these were obtained by relaxing the convergence tolerance. The
blue circles show the BIC values when a prior is used.
a posteriori estimates of the model parameters. Cluster 4 now has only two
points (again shown in black). However, the solution is no longer degenerate: the
covariance matrix for Cluster 4 has the ratio of smallest to largest eigenvalue
equal to $1.6 \times 10^{-4}$. While small, this is much larger than the reciprocal machine
infinity computed for the unregularized solution ($2 \times 10^{-16}$). Note that even with
the Bayesian regularization, the BIC strongly prefers three components to five, so
the five-component solution would not be used in practice.
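The eigenvalue ratios quoted here can be inspected directly from a fitted Mclust object; a minimal sketch (our object names), assuming fit holds a VVV fit:

Sig <- fit$parameters$variance$sigma      # d x d x G array of estimated covariance matrices
ratios <- apply(Sig, 3, function(S) {
  ev <- eigen(S, symmetric = TRUE, only.values = TRUE)$values
  min(ev) / max(ev)                       # values near zero flag (near-)singular clusters
})
round(ratios, 8)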
[Figure 3.14 residue: left panel "Diabetes data: Degenerate 5-cluster VVV solution" with Cluster 4 singular; right panel "Diabetes data: 5-cluster VVV solution with prior" with Cluster 4 containing 2 points; clusters 1–5 shown in the legend, axes insulin (horizontal) and sspg (vertical).]
This will give a good estimate of the density, but will overestimate the number of
clusters.
On the other hand, ICL will tend to select a number of mixture components
that corresponds to the number of clusters. However, in so doing it typically
represents a non-Gaussian cluster by a single Gaussian mixture component, which
may not be a good fit to the data. The difficulty is illustrated by the Old Faithful
data, as shown in Figure 3.15.
As we saw in Section 2.6, BIC chooses three mixture components, as illustrated
in the top panels of Figure 3.15. The top left panel suggests that, while the mixture
component in red at the bottom left of the plot is cohesive and well-separated
from the rest of the data, the two components in blue and green at the top right
part of the plot are not very separate from one another, and may in fact make
up a single cluster. The density estimate in the top right panel reinforces this. It
looks as if there are really two clusters, not three, but the one at the top right
has a non-Gaussian distribution and is itself being modeled by a mixture of two
Gaussian distributions.
As seen in the bottom left panel of Figure 3.15, ICL chooses two mixture
components. This is both good and bad. Good because there do indeed seem to
be two clusters, and hence the number of components chosen corresponds to the
number of clusters. But bad because the cluster at the top right of the plot is
represented by a single Gaussian in the ICL solution, which may not be a very
good fit to the data.
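This comparison is easy to reproduce; a minimal sketch using the faithful data shipped with R (not one of the book's listings):

library(mclust)
data(faithful)
bic <- mclustBIC(faithful)   # BIC for all covariance models and numbers of components
icl <- mclustICL(faithful)   # ICL for the same models
summary(bic)
summary(icl)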
Figure 3.15 Old Faithful data: Top left: classification by the BIC-best
model with three components. Top right: corresponding density estimate.
Bottom left: classification by the ICL-best model with two components.
Bottom right: corresponding density estimate.
A widely used solution to this dilemma is to keep the BIC solution as a model for
the density of the data, but to merge mixture components that are close together
before clustering. This has the potential to give the best of both worlds: the right
number of clusters, but a flexible model for each of them that fits the data well.
As summarized by Hennig (2010), methods for merging Gaussian components
proceed as follows:
1 Start with all components of the initially estimated Gaussian mixture as current
clusters.
2 Find the pair of current clusters most promising to merge.
The entropy of the G-component classification is $\mathrm{Ent}(G) = -\sum_{i=1}^{n}\sum_{g=1}^{G} \hat{z}_{i,g}^{[G]} \log\big(\hat{z}_{i,g}^{[G]}\big)$, where the $\hat{z}_{i,g}^{[G]}$ are the conditional probabilities of cluster membership. For a classification with no uncertainty, i.e. if $\hat{z}_{i,g}^{[G]}$ is equal to 0 or 1 for all i and g, Ent(G) = 0. The entropy is maximized when $\hat{z}_{i,g}^{[G]} = 1/G$ for all i and g, i.e. when the classification is effectively uninformative.
The basic idea is to choose the pair of clusters to merge at each stage that
minimize the increase in entropy over all pairs. The calculations are facilitated
by the fact that there is a simple relationship between the posterior cluster
membership probabilities of a point for any two clusters, and the corresponding
value for the merged cluster: the latter is just the sum of the two former. Merging
continues as long as the resulting increase in entropy is small, and stops when
a merge leads to a large increase in entropy. Using an elbow or change point in
the plot of the entropy against the number of observations has worked well for
deciding when to stop. This approach is implemented in the clustCombi function
in the mclust package.
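A minimal sketch of its use (our object names, assuming the data are in a matrix X):

library(mclust)
combi <- clustCombi(data = X)     # starts from the BIC-best Gaussian mixture fit
summary(combi)                    # which components are merged at each step
plot(combi, what = "entropy")     # entropy plot used to decide when to stop merging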
If clusters g and g′ from the G-cluster solution are combined, the values of $\hat{z}_{i,h}^{[G]}$ remain the same for every group h except g and g′. The new cluster g ∪ g′ then has the following conditional probability:
\[
\hat{z}_{i,g \cup g'}^{[G]} \;=\; \hat{z}_{i,g}^{[G]} + \hat{z}_{i,g'}^{[G]}. \qquad (3.7)
\]
Then the resulting entropy is
\[
-\sum_{i=1}^{n} \Big[ \big(\hat{z}_{i,g}^{[G]} + \hat{z}_{i,g'}^{[G]}\big) \log\big(\hat{z}_{i,g}^{[G]} + \hat{z}_{i,g'}^{[G]}\big) \;+\; \sum_{h \neq g, g'} \hat{z}_{i,h}^{[G]} \log\big(\hat{z}_{i,h}^{[G]}\big) \Big]. \qquad (3.8)
\]
Thus, the two clusters g and g′ to be combined are those that maximize the criterion
\[
-\sum_{i=1}^{n} \Big[ \hat{z}_{i,g}^{[G]} \log\big(\hat{z}_{i,g}^{[G]}\big) + \hat{z}_{i,g'}^{[G]} \log\big(\hat{z}_{i,g'}^{[G]}\big) \Big] \;+\; \sum_{i=1}^{n} \hat{z}_{i,g \cup g'}^{[G]} \log\big(\hat{z}_{i,g \cup g'}^{[G]}\big) \qquad (3.9)
\]
among all pairs of clusters (g, g′). Then the conditional classification probabilities $\hat{z}_{i,g}^{[G-1]}$, i = 1, . . . , n; g = 1, . . . , G − 1, can be updated.
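To make the criterion concrete, here is a small R sketch (ours, not one of the book's listings) that, given the n x G matrix z of conditional probabilities, finds the pair of clusters maximizing criterion (3.9) and merges it:

merge_step <- function(z) {
  xlogx <- function(p) ifelse(p > 0, p * log(p), 0)   # 0 * log(0) taken as 0
  G <- ncol(z)
  best <- NULL
  best_crit <- -Inf
  for (g in 1:(G - 1)) for (h in (g + 1):G) {
    crit <- -sum(xlogx(z[, g]) + xlogx(z[, h])) +     # criterion (3.9): entropy change
             sum(xlogx(z[, g] + z[, h]))              # from merging g and h (equation 3.7)
    if (crit > best_crit) { best_crit <- crit; best <- c(g, h) }
  }
  cbind(z[, -best, drop = FALSE], z[, best[1]] + z[, best[2]])  # updated n x (G-1) matrix
}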
Figure 3.16 Simulated data (left) and BIC-best solution with six mixture
components (right).
The increase in entropy is small in going from one to two clusters, from two to three clusters
and from three to four clusters. However, going from four to five clusters involves
a big increase in entropy, again suggesting that merging down to four clusters
improves the entropy substantially, but merging beyond that does not bring big
improvements. Overall, this suggests that four clusters is the best choice.
Figure 3.18 shows the results of merging mixture components, as one goes from
the BIC solution of six clusters down to two clusters. The selected four-cluster
solution combines the components that make up the cross-like clusters, as desired.
The number of clusters is the same as that selected by ICL, but the resulting
densities are different. In the ICL solution, each of the two cross-like clusters is
represented by a single, nearly spherical Gaussian distribution, which is clearly not
fully satisfactory. In the merged four-cluster solution, each of the two cross-like
clusters is itself represented by a mixture of two Gaussian distributions, retaining
the true, cross-like nature of the resulting clusters. Figures 3.17 and 3.18 were
produced by the code in Listing 3.6.
Figure 3.17 Simulated data. Left: entropy values for the G-cluster
combined solution, using the entropy merging method of Baudry et al.
(2010), plotted against the cumulative sum of the number of observations
merged at each step. The dashed line shows the best piecewise linear fit,
with a breakpoint at G = 4 clusters. Right: rescaled differences between successive entropy values, equal to (Ent(G + 1) − Ent(G)) / (number of merged observations).
We now apply the method of Hennig (2010) where clusters are merged until the
Bhattacharyya distance between all remaining pairs of clusters is greater than a
user-specified threshold. It is implemented using the code in Listing 3.7 and gives
the results shown there. It also selects four clusters, as do the DEMP method, and
the dip test method. The ridgeline methods, however, select five clusters, which
seems less satisfactory.
We now return to the Old Faithful data, where BIC selected three mixture
components but visual inspection suggests two clusters. All the merging methods
we have discussed here point to merging the two mixture components at the top
right of the top left panel of Figure 3.15, as desired. ICL selects two clusters,
each represented by a single Gaussian, as shown in the bottom right panel of
Figure 3.15, while the merging solution has the density shown in the top right
Figure 3.18 Simulated data: results from merging for 5, 4, 3 and 2 clusters.
panel. Thus the merging solution seems to provide a better representation of the
apparently non-Gaussian cluster at the top right of the data.
We have found that, even though ICL may select a model that is not fully satisfactory because it represents each cluster by a single Gaussian distribution even when the cluster is non-Gaussian, it tends to do well in selecting the right number of clusters (as opposed to the right number of mixture components). Thus a hybrid
approach in the case where BIC selects more mixture components than ICL would
be as follows: use ICL to select the number of clusters, use BIC to select the
number of mixture components and the covariance model, and select the final
clustering model by merging the components as described here, until the number
of clusters selected by ICL is reached.
# Displaying results
summary(ex4.1.bhat)
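The summary call above refers to an object produced by Hennig's Bhattacharyya-distance merging, presumably via the mergenormals function of the fpc package; a minimal sketch of how such an object might be created (our argument choices, which may differ from the call in Listing 3.7):

library(fpc)
library(mclust)
fit <- Mclust(X)                                  # initial Gaussian mixture fit
ex4.1.bhat <- mergenormals(X,
    clustering = fit$classification,
    probs      = fit$parameters$pro,
    muarray    = fit$parameters$mean,
    Sigmaarray = fit$parameters$variance$sigma,
    z          = fit$z,
    method     = "bhat")                          # Bhattacharyya-distance merging
summary(ex4.1.bhat)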
The nearest neighbor cleaning method, which can be used to initialize model-based clustering with noise, was introduced by Byers and Raftery
(1998). It was extended to robust covariance estimation by Wang and Raftery
(2002), introducing simulated noise points to reduce bias.
Several alternatives to the uniform noise model for outliers have been proposed.
Hennig (2004) proposed replacing the uniform distribution here by an improper
uniform component, an idea further developed by Hennig and Coretto (2008);
Coretto and Hennig (2010, 2011). An approach based on iteratively trimming
potential outliers was proposed by Garcı́a-Escudero et al. (2008, 2010, 2011); it
is implemented in the tclust R package (Fritz et al., 2012). Evans et al. (2015)
have proposed a method for outlier detection in model-based clustering based on
leave-one-out re-estimation of cluster parameters.
The normal-uniform mixture model for outliers has also been used in other
contexts. For example, Dean and Raftery (2005) used it as the basis for a method for
detecting differentially expressed genes in microarray data. There the differentially
expressed genes were analogous to the outliers and were modeled with the uniform
distribution, while the normal distribution was used to model the non-differentially
expressed genes.
An alternative approach is based on replacing the Gaussian mixture components
by t-distributions (Peel and McLachlan, 2000). This does not aim to identify
and remove outliers and then identify compact clusters, but rather to model
mixture components that may include outlying data points, and so its goal is
rather different from the other methods discussed in this chapter. We discuss it in
more detail along with more recent extensions in Chapter 9.
Degeneracy
The fact that the likelihood for normal mixture models has multiple infinite modes
that are not interesting was first pointed out by Day (1969) in the one-dimensional
case and further elucidated by Titterington et al. (1985). Redner and Walker
(1984) showed that interior local maxima of the likelihood can define estimators
that are consistent, but that the spurious infinite maxima on the edge of the
parameter space do not.
Methods for dealing with this problem can be classified as constraint methods,
Bayesian methods, penalty methods, and others. Hathaway (1985) was the first
to propose a constraint method. He proposed maximizing the likelihood subject
to the constraints that the eigenvalues of Σg Σ−1
h be greater than or equal to some
minimum value c > 0, and developed an algorithm for the univariate case (Hath-
away, 1986b). Ingrassia and Rocci (2007) showed how the Hathaway constraints
can be implemented directly at each iteration of the EM algorithm. Gallegos and
Ritter (2009b,a) proposed different, but conceptually similar, constraints based
on the Löwner matrix ordering, essentially that no group covariance matrix differs
from any other by more than a specified factor.
Fraley and Raftery (2003) proposed computing the condition number for each
estimated covariance matrix and not reporting the results for any covariance
model and number of components for which any of the condition numbers is zero
(or below relative machine precision); this is the default approach in mclust.
Garcı́a-Escudero et al. (2015) proposed a constraint on the ratio of the maximum
to the minimum of all the eigenvalues of all the mixture component covariance
matrices. This controls differences between groups and singularity of individual
components at the same time. They described an algorithm for implementing
it. Cerioli et al. (2018) proposed a refinement of this method that modifies the
likelihood penalty for model complexity to take account of the higher model
complexity that a larger constraint entails.
In addition to Fraley and Raftery (2007a), several other Bayesian approaches
have used the EM algorithm to estimate the posterior mode for mixture models.
Roberts et al. (1998) used a Dirichlet prior on the mixing proportions and a non-
informative prior on the elements of the means and covariances, while Figueiredo
and Jain (2002) used non-informative priors for all the parameters, and Brand
(1999) proposed an entropic prior on the mixing proportions. These methods work
by starting with more components than necessary, and then pruning those for
which the mixing proportions are considered negligible.
Chi and Lange (2014) proposed a prior on the covariance matrices that dis-
courages their nuclear norm (sum of their eigenvalues) from being too large or
too small. It penalizes large values of a weighted average of the nuclear norm of
$\Sigma_g$ and the nuclear norm of $\Sigma_g^{-1}$, leading to a method for covariance-regularized
model-based clustering. Zhao et al. (2015) proposed a hierarchical BIC using the
posterior mode based on an inverse Wishart prior for the covariance matrices and
non-informative priors for the component means and mixing proportions.
Bayesian estimation for mixture models can also be done via Markov chain
Monte Carlo (MCMC) simulation, typically using conjugate priors on the means
and covariances similar to those of Fraley and Raftery (2007a) (Lavine and West,
1992; Bensmail et al., 1997; Dellaportas, 1998; Bensmail and Meulman, 2003; Zhang
et al., 2004; Bensmail et al., 2005). These can suffer from label-switching problems
(Celeux et al., 2000; Stephens, 2000b) and can be computationally demanding.
In our experience, use of the posterior mode is simpler and less computationally
expensive, provides most of the benefits of the Bayesian approach, and avoids
most of the difficulties.
Ciuperca et al. (2003) and Chen and Tan (2009) proposed maximum penalized
likelihood methods to overcome the unboundedness of the likelihood, with purpose-
built penalty functions. Ruan et al. (2011) and Bhattacharya and McNicholas
(2014) proposed using 1 penality functions. Halbe et al. (2013) proposed an EM
algorithm with Ledoit–Wolf-type shrinkage estimation of the covariance matrices
(Ledoit and Wolf, 2003, 2004, 2012).
Rather than dealing with the problem algorithmically, McLachlan and Peel
(2000) proposed monitoring the local maximizers of the likelihood function and
carefully evaluating them. Seo and Kim (2012) and Kim and Seo (2014) proposed
a systematic algorithmic approach to selection of the best local maximizer to
avoid spurious solutions.
[Figure residue: graphical model linking the mixing proportions τ, the labels Z, the data Y and the parameters μ, Σ.]
assume any model for the joint distribution p(yi , zi ) (Vapnik, 1998; Bishop, 2006;
Hastie et al., 2009). New methods such as deep learning for neural networks are
taking advantage of rapidly increasing computer power to address increasingly
complex problems involving larger data sets and more complex models.
In this landscape, model-based classification methods provide often slightly
higher misclassification error rates than the best predictive methods due to
differences between the models and the true distribution of the data. But model-
based classification has some advantages: interpretability and simplicity combined
with good if not optimal performance, and the ability to deal with partially
labeled data. Often, it provides almost optimal misclassification error rates with
parsimonious and understandable models. The decision rules of model-based
classifiers are also easier to interpret. Hybrid approaches aiming to achieve the
best of both worlds have also been proposed (Bouchard and Triggs, 2004; Lasserre
et al., 2006).
Moreover, a model-based approach is useful in the semi-supervised classification
context where many data are unlabeled (see Chapter 5). By using the information
provided by cheap and widespread unlabeled data, semi-supervised classification
aims to improve the classifiers’ performance. Most predictive approaches are
unable to take unlabeled data into account without additional assumptions on
the marginal distribution p(y) of the predictors.
In a semi-supervised setting it is natural to adopt the generative point of view,
since it leads to modeling the joint distribution
\[
p(y_i, z_i) = p(z_i)\, p(y_i \mid z_i).
\]
This in turn leads to writing the marginal distribution $p(y_i)$ as a mixture model
\[
p(y_i) = \sum_{z_i} p(z_i)\, p(y_i \mid z_i).
\]
Thus, the maximum likelihood estimate of the model parameters can be simply
computed using the EM algorithm. The straightforward formulae of EM in the
semi-supervised context are given in Chapter 5 and details can be found in Miller
and Browning (2003).
Figure 4.2 The panel on the top left displays the training data set for the
haemophilia data set (non-carriers in blue and carriers in red); the panel on
the top right displays the boundary of the 1NN decision rule on the training
data set; the bottom left panel displays the test data set and the bottom
right panel its assignment.
Figure 4.3 displays the classification with the QDA method. In contrast to the 1NN classifier, there is no noticeable difference between the misclassification error rates on the training and test data sets with QDA. On the test data set, QDA misclassifies 4 observations (3 of the 8 observations in the non-carrier class and 1 of the 16 observations in the carrier class).
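A QDA fit of this kind is available in R via the qda function of the MASS package; a minimal sketch, assuming the training and test sets are data frames train and test with a factor column gr for the carrier status and the two measured variables (the object and column names are ours):

library(MASS)
qfit <- qda(gr ~ AHFactivity + AHFantigen, data = train)
pred <- predict(qfit, newdata = test)$class
table(pred, test$gr)    # confusion matrix on the test set
mean(pred != test$gr)   # test misclassification error rate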
Figure 4.3 The panel on the top left displays the training data set for the
haemophilia data set (non-carriers in blue and carriers in red); the panel on
the top right displays the boundary of the QDA decision rule on the training
data set; the bottom left panel displays the test data set and the bottom
right panel its assignment.
\[
p(y_i, z_i \mid \tau, \theta) \;=\; \prod_{g=1}^{G} \tau_g^{z_{ig}}\, \big[f_g(y_i \mid \theta_g)\big]^{z_{ig}}. \qquad (4.2)
\]
The way the whole vector parameter (τ, θ) is estimated depends on the way the
training data set is sampled.
Sampling Schemes
In the mixture sampling, the training data set arises from a random sample from
the whole population under study, while in retrospective sampling, the training
data set is the concatenation of G independent sub-samples with fixed sizes ng
and distribution $f_g(y_i \mid \theta_g)$ for g = 1, . . . , G.
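As a simple illustration of the difference between the two schemes (purely illustrative numbers, univariate Gaussian classes):

set.seed(1)
tau <- c(0.7, 0.3); mu <- c(0, 3)
# Mixture sampling: labels drawn at random with probabilities tau
z_mix <- sample(1:2, 200, replace = TRUE, prob = tau)
y_mix <- rnorm(200, mean = mu[z_mix])
# Retrospective sampling: fixed sub-sample sizes n_g chosen by design
n_g <- c(100, 100)
z_ret <- rep(1:2, times = n_g)
y_ret <- rnorm(sum(n_g), mean = mu[z_ret])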
The MAP procedure consists of assigning $y_{n+1}$ to the class maximizing this conditional probability, namely
where
\[
\hat{\Sigma}_g(\lambda) \;=\; \frac{(1-\lambda)\,(n_g - 1)\,\hat{\Sigma}_g + \lambda\,(n - G)\,\hat{\Sigma}}{(1-\lambda)\,(n_g - 1) + \lambda\,(n - G)},
\]
$\hat{\Sigma}_g$ are the estimates of the covariance matrices for the QDA model for g = 1, . . . , G, that is
\[
\hat{\Sigma}_g \;=\; \frac{\sum_{i=1}^{n} z_{ig}\,(y_i - \hat{\mu}_g)(y_i - \hat{\mu}_g)^T}{n_g},
\qquad\text{where}\qquad
\hat{\mu}_g \;=\; \frac{\sum_{i=1}^{n} z_{ig}\, y_i}{n_g},
\]
and $\hat{\Sigma}$ is the common covariance matrix estimate for the LDA model, namely
\[
\hat{\Sigma} \;=\; \frac{\sum_{g=1}^{G}\sum_{i=1}^{n} z_{ig}\,(y_i - \hat{\mu}_g)(y_i - \hat{\mu}_g)^T}{n}.
\]
The complexity parameter λ (0 ≤ λ ≤ 1) controls the contribution of QDA and
LDA, while the regularization parameter γ (0 ≤ γ ≤ 1) controls the amount of
[Figure residue: classification-region plots over the data; only axis tick values and the axis label sspg were recoverable.]
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●
●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●
●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●
●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●
●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●
● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●
●
200
200
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●●
●●●●●●●●●●●● ●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●●
●●●●●●●●●●●● ●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●
●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●
●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●● ●●●●●●●●●●●●●●●● ●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ● ●● ●●● ●● ● ●●●●●●●●●●●●●●●●●●●
●●●●●
● ●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ● ●● ●●● ●● ● ●●●●●●●●●●●●●●●●●●●
●●●●●
● ●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●● ●●●●● ●●●●● ●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●● ●●●●● ●●●●● ●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
0
0
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
insulin insulin
800
600
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
400
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
sspg
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
200
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
0
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
insulin
The parameter γ controls the shrinkage of the eigenvalues towards equality, since
tr(Σ̂g(λ))/d is the mean of the eigenvalues of Σ̂g(λ).
Thus, with fixed γ = 0, varying λ leads to an intermediate method between
LDA and QDA. Now, with fixed λ = 0, increasing γ leads to more biased estimates
of the eigenvalues of the covariance matrices. For λ = 1 and γ = 1 we get a quite
simple classifier which consists of assigning any vector to the class whose center is
the nearest.
Figure 4.6 illustrates the role of the parameters λ and γ on simulated data.
Finally, note that it is also possible to use the Moore–Penrose pseudo-inverse of
Σ̂ instead of the usual inverse Σ̂−1 . The reader can also refer to Mkhadri et al.
(1997), who provide a comprehensive overview of regularization techniques in
discriminant analysis.
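To make the two levels of regularization concrete, the following is a minimal base-R sketch of the regularized class covariance matrices in the spirit of Friedman-style RDA. The precise weighting of the class and pooled covariances below is one common convention and is an assumption, not necessarily the exact formula used earlier in the text.

# Regularized class covariance matrices for RDA (sketch).
# X: n x d numeric matrix, z: class labels,
# lambda: complexity parameter, gamma: shrinkage parameter.
rda_cov <- function(X, z, lambda, gamma) {
  X <- as.matrix(X); n <- nrow(X); d <- ncol(X)
  classes <- sort(unique(z))
  # maximum likelihood within-class covariances and their pooled version
  Sg <- lapply(classes, function(g) {
    Xg <- X[z == g, , drop = FALSE]
    crossprod(scale(Xg, scale = FALSE)) / nrow(Xg)
  })
  ng <- sapply(classes, function(g) sum(z == g))
  Spool <- Reduce(`+`, Map(function(S, m) m * S, Sg, ng)) / n
  lapply(seq_along(classes), function(k) {
    # lambda moves the class covariance towards the pooled one (QDA towards LDA)
    Slam <- ((1 - lambda) * ng[k] * Sg[[k]] + lambda * n * Spool) /
            ((1 - lambda) * ng[k] + lambda * n)
    # gamma shrinks the eigenvalues of Slam towards their mean tr(Slam)/d
    (1 - gamma) * Slam + gamma * (sum(diag(Slam)) / d) * diag(d)
  })
}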
The solution based on regularization does not have the same drawbacks as
dimension reduction and can be used more widely (see Chapter 8).
The tuning parameters λ and γ are computed by minimizing the cross-validated
error rate. In practice, this method performs well compared with LDA and QDA
for small sample sizes. The drawbacks of RDA are that it provides classifiers
that can be difficult to interpret and the method is not very sensitive to the
parameters λ and γ; Figure 4.7 provides an illustration of the similar performance
for large ranges of values of the parameters.
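As a sketch of how this cross-validated choice of (λ, γ) could be organized, the code below uses the rda_cov() helper sketched above together with Gaussian class densities. The grid, the number of folds and the use of the mvtnorm package are arbitrary illustration choices, not those of the text.

# Predict classes with quadratic scores built from the regularized covariances.
rda_predict <- function(Xtr, ztr, Xte, lambda, gamma) {
  classes <- sort(unique(ztr))
  covs  <- rda_cov(Xtr, ztr, lambda, gamma)
  means <- lapply(classes, function(g) colMeans(Xtr[ztr == g, , drop = FALSE]))
  prop  <- as.numeric(table(ztr)[as.character(classes)]) / length(ztr)
  score <- sapply(seq_along(classes), function(k)
    mvtnorm::dmvnorm(Xte, mean = means[[k]], sigma = covs[[k]], log = TRUE) +
      log(prop[k]))
  score <- matrix(score, ncol = length(classes))  # guard for single-row test sets
  classes[max.col(score)]
}

# K-fold cross-validated error rate for a given (lambda, gamma).
cv_error <- function(X, z, lambda, gamma, K = 10) {
  X <- as.matrix(X)
  fold <- sample(rep_len(1:K, nrow(X)))
  mean(sapply(1:K, function(k) {
    pred <- rda_predict(X[fold != k, , drop = FALSE], z[fold != k],
                        X[fold == k, , drop = FALSE], lambda, gamma)
    mean(pred != z[fold == k])
  }))
}

# Grid search over the two tuning parameters (X and z to be supplied):
# grid <- expand.grid(lambda = seq(0, 1, 0.1), gamma = seq(0, 1, 0.1))
# grid$err <- mapply(cv_error, grid$lambda, grid$gamma,
#                    MoreArgs = list(X = X, z = z))
# grid[which.min(grid$err), ]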
The next illustration allows us to compare the EDDA and RDA behaviors on the
haemophilia data. The data set presented in Section 4.1.2 consists of a population
of 75 women (45 were haemophilia A carriers and 30 were not) described by two
variables y1 = 100 log(AHF activity) and y2 = 100 log(AHF like antigen). We
use this data set to illustrate the ability of EDDA to suggest a plausible geometric
model. We applied RDA (see Listing 4.3), LDA, QDA and EDDA to this data set.
The prior probabilities of the two classes were assumed to be equal.
The cross-validated misclassification risks were .14, .16, .17 and .13 for the four
methods. For RDA, the complexity parameter was λ = 0.15 and the shrinkage
parameter was γ = 0.8. Thus, RDA proposed a shrunk version of a quadratic
classifier. But, as shown in Figure 4.7, the two selected tuning parameters vary
greatly with the cross-validated samples and so are difficult to interpret. Conversely,
the selected model with EDDA was [λg DAg DT ] and the boundaries of this classifier
are depicted in Figure 4.8.
[Figure 4.5 Model selection with BIC and CV on the diabetes data: −BIC and CV values for the fourteen models pk_L_I, pk_Lk_I, pk_L_B, pk_Lk_B, pk_L_Bk, pk_Lk_Bk, pk_L_C, pk_Lk_C, pk_L_D_Ak_D, pk_Lk_D_Ak_D, pk_L_Dk_A_Dk, pk_Lk_Dk_A_Dk, pk_L_Ck, pk_Lk_Ck.]
Thus EDDA suggests that the best model for this data set assumes different
volumes and shapes and the same orientations for the two classes. Its boundaries
are depicted in Figure 4.8 and show that this proposal seems quite reasonable.
EDDA performs well and provides a clear representation of the class distributions.
For this example, we do not claim that EDDA performs better than RDA or LDA.
The .632 bootstrap (Efron and Tibshirani, 1997), with 100 bootstrap replications
gives the same estimated misclassification risk (.08) for RDA, LDA and EDDA.
But, we think that, for this data set, EDDA provides a clearer and more relevant
representation of the class distributions than either RDA or LDA.
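For readers who want to reproduce an EDDA fit of this kind, here is a hedged sketch using the mclust package. The data frame haemo and its column names are hypothetical stand-ins for the haemophilia data, and the covariance model selected by mclust may differ from the [λg DAg DT ] model reported above.

library(mclust)

# hypothetical data frame with the two log-transformed variables and the class
X  <- haemo[, c("AHFactivity", "AHFantigen")]
cl <- haemo$carrier

# EDDA: one Gaussian component per class, a common parsimonious covariance
# structure selected by BIC among the available mclust models
edda_fit <- MclustDA(X, class = cl, modelType = "EDDA")
summary(edda_fit)

# resubstitution predictions (not the cross-validated risk quoted in the text)
pred <- predict(edda_fit, newdata = X)
mean(pred$classification != cl)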
[Figure 4.6 The role of the parameters λ and γ on simulated data; panels for λ = 0, γ = 0; λ = 0, γ = 0.5; λ = 0, γ = 1; λ = 1, γ = 0; λ = 1, γ = 0.5; λ = 1, γ = 1.]

[Figure 4.7 Boxplot of the tuning parameters γ and λ of RDA found using repeated cross-validated samples.]

For categorical data, the level of variable $j$ for observation $i$ is recoded with the binary indicators
$$y_i^{jh} = 1 \ \text{if } y_i^j = h, \qquad y_i^{jh} = 0 \ \text{otherwise}.$$
Under the conditional independence model, the class conditional densities are, for $g = 1, \ldots, G$,
$$f_g(y_i \mid \alpha_g) = \prod_{j,h} \left(\alpha_g^{jh}\right)^{y_i^{jh}}, \qquad (4.8)$$
[Figure 4.8 Classification boundaries for the EDDA model [λg DAg DT ] (left) and for the RDA method with tuning parameters λ = 0.15 and γ = 0.80 on the haemophilia data set (right).]
where $\alpha_g^{jh}$ denotes the probability that variable $j$ has level $h$ in class $g$, and
$\alpha_g = (\alpha_g^{jh};\ j = 1, \ldots, d;\ h = 1, \ldots, m_j)$. The maximum likelihood estimates of the
$\alpha_g$ parameters are
$$\hat{\alpha}_g^{jh} = \frac{u_g^{jh}}{n_g}, \qquad j = 1, \ldots, d;\ h = 1, \ldots, m_j;\ g = 1, \ldots, G,$$
where $n_g = \sum_{i=1}^n z_{ig}$ and $u_g^{jh} = \sum_{i=1}^n z_{ig}\, y_i^{jh}$.
But maximum likelihood estimation with multivariate categorical data remains
problematic because of the curse of dimensionality. This conditional independence
model requires $(G - 1) + G \sum_j (m_j - 1)$ parameters to be estimated. The model
is more parsimonious than the empirical model, which requires $\prod_j m_j$ parameters,
but it can still be too complex with regard to the sample size $n$ and it can even
involve numerical difficulties (“divide by zero” occurrences or parameters $\alpha_g^{jh}$
estimated to be zero).
One way of avoiding such drawbacks is to use the regularized maximum likelihood estimator
$$\hat{\alpha}_g^{jh} = \frac{u_g^{jh} + c - 1}{n_g + m_j(c - 1)},$$
where $c$ is a fixed positive number, for updating the estimation of the $\alpha_g^{jh}$. The
estimates can be sensitive to the value of $c$, and considering this regularization
issue in a Bayesian setting could be beneficial (see Section 6.2.7).
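As a concrete illustration, here is a minimal base-R sketch computing both the maximum likelihood and the regularized estimates from a data frame of factors y and known class labels z; the object names are illustrative only, and setting c = 1 recovers the maximum likelihood estimate.

# Class-conditional multinomial probabilities: ML and regularized estimates.
alpha_hat <- function(y, z, c = 1) {
  lapply(split(y, z), function(yg) {        # one data frame per class g
    ng <- nrow(yg)                          # n_g
    lapply(yg, function(col) {              # one set of estimates per variable j
      u  <- table(col)                      # u_g^{jh}: level counts in class g
      mj <- length(u)                       # m_j: number of levels of variable j
      list(ml  = as.numeric(u) / ng,                            # u_g^{jh} / n_g
           reg = (as.numeric(u) + c - 1) / (ng + mj * (c - 1))) # regularized
    })
  })
}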
With this in mind, the parsimonious models presented in Chapter 6 in the
clustering context could be thought of as desirable. These models use a
reparameterization of $\alpha_g$ in two parts. For any variable $j$, let $a_g^j$ denote the most
frequent response level in class $g$; the parameter $\alpha_g$ can then be replaced by $(a_g, \varepsilon_g)$ with
$$a_g^j = \arg\max_h \alpha_g^{jh} \qquad \text{and} \qquad \varepsilon_g^{jh} = \begin{cases} 1 - \alpha_g^{jh} & \text{if } h = a_g^j,\\ \alpha_g^{jh} & \text{otherwise.} \end{cases}$$
The vector $a_g$ provides the modal levels in class $g$ for the variables and the elements
of the vector $\varepsilon_g$ can be regarded as scattering values.
Using this form, it is possible to impose various constraints on the scattering
parameters $\varepsilon_g^{jh}$. The models we consider are the following.
• the standard conditional independence model $[\varepsilon_g^{jh}]$: the scattering depends upon
classes, variables and levels.
• $[\varepsilon_g^{j}]$: the scattering depends upon classes and variables but not upon levels.
• $[\varepsilon_g]$: the scattering depends upon classes, but not upon variables.
• $[\varepsilon^{j}]$: the scattering depends upon variables, but not upon classes and levels.
• $[\varepsilon]$: the scattering is constant over variables and classes.
We do not detail here the maximum likelihood estimation of these models in
the classification context. This can be straightforwardly derived from the M-step
formulae given in Chapter 6.
As an example, the maximum likelihood formulae for the model $[\varepsilon_g^{j}]$ are
$$\hat{a}_g^j = \arg\max_h u_g^{jh}$$
and
$$\hat{\varepsilon}_g^j = \frac{n_g - v_g^j}{n_g},$$
for $g = 1, \ldots, G$ and $j = 1, \ldots, d$, where $v_g^j = u_g^{jh^*}$, $h^*$ being the modal level $a_g^j$
(Aitchison and Aitken, 1976). Notice that $\varepsilon_g^j$ is simply the percentage of times the
level of an object in class $g$ is different from its modal level for variable $j$.
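The corresponding computations are equally short in code; here is a base-R sketch for the $[\varepsilon_g^{j}]$ model, again with illustrative object names (y a data frame of factors, z the class labels).

# Modal level a_g^j and scattering eps_g^j = (n_g - v_g^j) / n_g per class.
eps_gj <- function(y, z) {
  lapply(split(y, z), function(yg) {
    ng    <- nrow(yg)
    modal <- sapply(yg, function(col) names(which.max(table(col))))  # a_g^j
    eps   <- sapply(yg, function(col) (ng - max(table(col))) / ng)   # eps_g^j
    list(modal = modal, eps = eps)
  })
}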
By considering this particular representation of the multivariate multinomial
distribution, it is possible to get parsimonious and simple classifiers. The pa-
rameters of these classifiers are estimated using the formulae presented in the
latent class model context. In particular, it could be beneficial to consider regular-
ized maximum likelihood or Bayesian estimates to avoid the numerical problems
occurring when dealing with many categorical variables.
4.4.2 An Illustration
To illustrate this family of models for supervised classification with categorical
data, a data set describing the morphology of birds (puffins) is used. This data
set was first used by Bretagnolle (2007). Each bird is described by five categorical
variables. One variable codes the gender and four describe the morphology of the
birds: the eyebrow stripe (4 levels), the collar description (4
levels), the sub-caudal description (4 levels) and the border description (3 levels).
There are 69 puffins from two species: lherminieri (34) and subalaris (35). The
purpose is to recover the puffin species with the five categorical variables. Assuming
a mixture sampling scheme, the model minimizing the cross-validated error rate
(leave-one-out) is the standard model $[\varepsilon_g^{jh}]$ and the estimated error rate is 1.5%.
Listing 4.4 presents the code for learning the classifier on the puffin data set
with Rmixmod.
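Since Listing 4.4 itself is not reproduced here, the following is a hedged sketch of the kind of Rmixmod call it refers to; the object names (puffin, species) are hypothetical, and the exact argument values are assumptions that may differ from the listing actually used in the text.

library(Rmixmod)

# five categorical descriptors and the known species labels (hypothetical names)
x <- puffin[, c("gender", "eyebrow", "collar", "subcaudal", "border")]
z <- puffin$species

# supervised learning of a multinomial classifier, model chosen by CV
fit <- mixmodLearn(x, knownLabels = z,
                   dataType  = "qualitative",
                   models    = mixmodMultinomialModel(),
                   criterion = "CV")
summary(fit)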
**************************************************************
* Number of samples = 69
* Problem dimension = 5
**************************************************************
* Number of cluster = 2
* Model Type = Binary_pk_Ekjh
* Criterion = CV(0.9855)
* Parameters = list by cluster
* Cluster 1 :
Proportion = 0.4928
Center = 1.0000 2.0000 3.0000 1.0000 1.0000
Scatter = | 0.4714 0.4714 |
| 0.1786 0.3929 0.2071 0.0071 |
| 0.1500 0.3786 0.5929 0.0643 |
| 0.4500 0.3214 0.0929 0.0357 |
| 0.0762 0.0667 0.0095 |
* Cluster 2 :
Proportion = 0.5072
Center = 2.0000 3.0000 1.0000 1.0000 1.0000
Scatter = | 0.4306 0.4306 |
| 0.0069 0.0069 0.1319 0.1181 |
| 0.0208 0.0069 0.0069 0.0069 |
| 0.0486 0.0069 0.0069 0.0347 |
| 0.0741 0.0370 0.0370 |
* Log-likelihood = -237.3621
**************************************************************
Figure 4.9 displays the distribution of the two classes for the five descriptors.
The discriminative variables are eyebrow stripe, collar description and sub-caudal
description, while gender is (not surprisingly) useless, as is the border description.
[Figure 4.9 The distribution of the two puffin species on the five descriptors: gender, eyebrow stripe, collar description, sub-caudal description and border description (from left to right and top to bottom).]
[Figure 4.10 Choice of the common number of components per class for MDA with the diabetes data set: cross-validated classification error plotted against the number of components G = 2, . . . , 10.]
The choice of the number of components per class governs the flexibility of MDA. An illustration of MDA with the most general Gaussian
mixture with rg = r for g = 1, . . . , G is given in Figures 4.10 and 4.11 for the
diabetes data set. The best cross-validated error rate is achieved when r = 3. The
more and more complex boundaries of the classifications with r = 1, 3 and 10 are
displayed in Figure 4.11.
An alternative solution is to consider covariance matrices of the form $\Sigma_g = \sigma_g^2 I$,
I being the identity matrix. Acting in such a way, the class conditional densities
fg (yi ) are modeled by a union of Gaussian balls with free volumes, where a Gaussian
ball is a multivariate Gaussian distribution with covariance matrix proportional to
the identity matrix. It is a flexible model since (numerous) Gaussian balls could be
expected to fit the cloud of the ng points of class g, and it remains parsimonious
since each covariance matrix is parameterized with a single parameter.
As an illustration, we consider in Listing 4.5 MDA with Gaussian balls for
learning a supervised classifier on the diabetes data set. For each class, the number
of Gaussian balls was chosen by the BIC criterion (3, 5 and 3 components for
the Type 1 diabetic, Type 2 diabetic and non-diabetic classes, respectively). The
boundaries of the classification rule are displayed in Figure 4.13. It can be seen
that they are somewhat less complex than the boundaries displayed in Figure 4.11
but remain flexible.
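In the same spirit as Listing 4.5 (not reproduced here), a hedged sketch with the mclust package follows: spherical components with free volumes correspond to the VII covariance model in mclust's nomenclature, and the data objects X and cl are hypothetical stand-ins for the diabetes data; the candidate range of component numbers is an arbitrary choice.

library(mclust)

# MDA with "Gaussian balls": each class density is a mixture of spherical
# Gaussians with free volumes; the number of components per class is chosen
# by BIC over the candidate range G.
fit_balls <- MclustDA(X, class = cl,
                      G = 1:6,
                      modelNames = "VII",
                      modelType  = "MclustDA")
summary(fit_balls)

# confusion table of the resubstitution predictions
pred <- predict(fit_balls, newdata = X)
table(pred$classification, cl)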
[Figures: MDA classification boundaries on the diabetes data (variables insulin and sspg), as referred to in Figures 4.11 and 4.13 above.]
●●●
●●
●●●●
●●
●●●
●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
● ●
● ● ● ● ●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
● ●●
●●
●●●●●
●● ●
●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●
●●●●
●●●●
●●
●●●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
● ●● ● ● ●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●●
●●
●●
●●
●●●● ●●●●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●
●●
● ●
●
●●
●
● ●●
● ●●
●●●●●
●●●●
●●●●
●● ●●
●●●●
● ●●
●●●●●●
●●●●●●●
●●●●
●●●●●●
●●●●
●●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●● ●●
●● ●
●●●●●●
●●
●●●●
●● ● ●
●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
200
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●●
● ● ●●●
●
● ●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●
● ●
●●
●●●
●●
●
● ●
●
●●
● ●
●●
● ●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
● ●
●●
●●
●●
● ●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●
●●●●●
●
● ●●
● ●●
●●
● ●●
●●●
● ●●
●●●●●
●●●●
●●
●●●●●●●●●
●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●
● ● ●● ●
●● ●● ●● ●●●●●
● ● ● ●● ● ●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●● ●
●●● ●
●●
●●●●●●●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●● ●●
●● ●●●●
●●●●
●●●●
●●●● ●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●● ●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
● ●● ● ●
● ●
● ●●● ●●
●●● ● ● ● ●●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●
●●
●●
●●
●● ●●
●●
● ●●
● ●●
●
●●
●
●
●
●●●●●●
●●
● ●●
● ●●● ●●
●●●●●●●●
●●● ●
●●●●●●
●● ●●
●●●●●●●●●●●●●●●●●● ●●●● ●●●●●
●●● ●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●● ●●
●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●● ● ●●● ● ●● ● ●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●● ●● ● ●●●
● ●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●●●●
●●
●●
●●●●
●●
●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●● ●●●●●●●●● ● ●● ●● ●● ●● ●●● ●● ● ●● ● ●●●● ●●●●●●●●●●●●●●●●● ● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●
● ●●
●●●●
●●●●
●●
● ●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●●
●●
●●
●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●●●
●●●●●
●●
●●
●●
●●
●●
●●●●
●●●●●
●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●●●
●●●
●●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●
● ●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●
● ●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●●●
●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
● ●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●
●●●●
●●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
●●●●
●●
●● ●●
●●
●●
●●
●●
●●●
●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●● ●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●● ●●
●●●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●● ●
●●
●●●●
●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●
●●●●
●●●
●●
●●
●●●●
●●●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●
●●
●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
●●
● ●
●
●●
●
●
●
● ●
●●
● ●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●
● ●
●●
●●
●●
●●
●●
●
●
●
● ●
●●
●●
●●
●●
●●
●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●
●●
●●
●
0
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●● ●●
●●
● ●●
●●●●
●●
●●●●
●●●
●●●●
●●
●●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
● ●
●● ●●
●●●●●
●●
●●●●
● ● ●● ● ●●●● ●●●●●●●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●●
●●●
●●
●●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●
●●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●
●●
●● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●●●●
●●
●●●
●●●●
●● ●
●●
● ●
●●
● ●
●●
● ●●
●●●
● ●
●●
● ●
● ●
●●
● ●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
● ●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●● ●● ● ●●
●●●●
●●
●●●●
●●
●●●●
●●
●● ●●
●●
● ●
●●
● ●
● ●
●●
● ●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●
●●
●●●● ● ●● ●●● ●● ● ●● ● ●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●
●●
●●
●●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●●●
●●●
●●
●●
●●●
●●
●●●
●●
●● ●●
●●●
●●
●●●
●●
●●●●
●●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●
●●●
●●
●●
●
●●
●●
●
●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●● ●
●●
●●●
●●●
●●
●●●
●●
●●●
●●
●●●
●●
●●●
●●●
●●
●● ●
●●
●●●●●
●●
●●●
●● ●
●●
●●●
●●
●● ●
●●
●●●
●●
●●●
●●
●●●
●●
●●●
●●
●●●
●●
●●●
●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●● ● ●●●● ●●●●●●●●●●●●●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●
●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●● ●
● ●
●●●
●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●● ●●● ●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●●
●●●●
●●●●
●●
●●
●●●
●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
● ●●
●●
●●●●
●●●●
●●
●●
●●●
●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
● ●●
● ●●
●●●
● ●●
●●●
● ●●
●●●
● ●●
●●●
● ●●
●●●
●●●
● ●●
●●●
● ●●
● ●●
●●●
● ●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●
●
●●●
●●●
●●●
●●●
●●
●●
●●
●●●
●●●
●●●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●● ●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●
●●
●● ●●
●●
●●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●●
● ●
●●●●
● ●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●●
● ●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●●
●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●●● ●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
600
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●
●●●
●●●●
● ●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
● ●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●
●●
●●
●
●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●●●●●●
●●●●
●●●●
●● ●
●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
● ●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●
●●
● ●●●
● ● ●
●●
● ●
●●
● ●
●●
● ●
●● ●●● ●● ●●●●
● ●●●●● ●●●●●●●●●●●●●●●●●
●●●
●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●● ●●
●●
● ●●
●●●●
●●
●●●●
●●●●●●
●●
●●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●● ●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●●●●●● ●● ● ●● ●● ● ●● ●●● ●● ● ●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●
●●
● ●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●●
● ●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●●●
●●●●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●●
● ●
●●
● ●●
●●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●● ●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●● ●●
●●
●●●●
●● ●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●
●
●●
●●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●● ●● ●● ●● ●●● ● ● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●●
●●●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●
●●●
400
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●●●●● ● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
● ●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
sspg
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
● ●
●●
● ●
●●
●●
● ●●
●●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●● ●● ●● ●● ●● ● ●● ●●● ●● ● ●● ●●●●●
●● ●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●●
●●●
● ●●
● ●●
●●●
● ●●
●●●
● ●
●●
●●
●●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●
●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●● ●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
● ●● ● ●● ●● ●● ●● ●●● ●●●●●●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●● ●
●●
●●●●
●●
●●
●●
●● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●
●●
● ●
●●
● ●●
●●●●●
●●
●●
● ●●
●●● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●
●●
●●
●●●●
●●●● ●●
●● ●●
●● ●●
●● ●●●●●●
●●●●●● ●●●●●●
●●●
●●●●●●
●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●
●●
●●
●●
●●
● ●
●●
● ●●
● ●●
● ●
●●
●●●●●
●●
●
●
●●
●●●
● ●●●●
●●
● ●●
●●
● ●●
● ●●
●●
● ●●●●
●●
● ●●●
●●
● ●●
●●
● ●●
●
●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●●●●●●●
●●
●●●●
●●
●● ●●
●●●●●●
●●
● ●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
● ● ●
●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●
●●
●●●●
●●●●●● ●●
●●●●
●●●● ● ●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ● ●● ● ●●●●
● ●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
200
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●
●●
●●
● ●●●
● ● ●●
● ●●●
● ●●
●●●●
●
●
●●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●●●
● ●●
●●●●
● ●
●●
●●●● ●●
●●
●●●●
●●●●●●
●●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●●
●
●
● ●
● ●●●
●●
● ●
●
●
●
●●
● ●●
●●● ●
●●
●●●
●●
●●
●●
● ●●
● ●
● ●
●
●●●
●
● ●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●● ●
●●
●●●●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●
●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●● ●●
●● ●●●●
●●●●
●●●●
●●●●
● ●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●● ●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
● ● ●
● ●
● ●● ●●● ● ● ● ●●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●● ●●
●
●●●
●●●●
●●
●● ●●●
●● ●●
●●●●
●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●
●●
●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●
●● ●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
● ●
●● ●●●●● ●●
● ● ●● ● ● ●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●● ● ● ●
●● ● ● ●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●● ●●
●● ●●●
●●
●●●
●●
●●●●●
●●
●●●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●● ●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●
●● ● ●●●
● ●● ●●
● ●●
● ●●
●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●
● ●● ●●●●●●●●●● ●
●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●●
●●●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●
●●
●●
●●●●
●●● ●●
●●●●
●●●●
●●●●
●● ●● ●● ● ●●●●●
●●
●●
●●
●● ●●●●●●●●●●●●●●●●●●
● ●●●●●●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●●
●●
●●●●
●● ●
●●
●●●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●
●●
●●●●
●●●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●●
●●●●
●●●
●●●
●●●
●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●●
●
●●●
● ●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
● ●
●●
● ●
●●●
● ●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●● ●●
●●●
●●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●
●●
●●
●●
●●
●●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●
●● ●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●●
●
●●
● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●
● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●
●●
●●
● ●
●●
● ●
● ●
●●
● ●
●●
●●
●●
●●
● ●
●●
●●
●● ● ● ●● ●
●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
0
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●●●
● ●●
●●●
● ●●
●●●
●●●
● ●●
●●●
● ●●
● ●●
●●●
● ●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●
●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●
●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●
●●●●
●●●
●●
●●●●
●●
●●●
●●●●
●●
●●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●● ●● ●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
● ●
●●
● ●
●●
● ●
●●
●●●
● ●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
● ●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●●
●●●
●●
● ●● ●● ●● ●●● ●● ● ●● ●●●●● ●●●●●●●●●
●●●●●●●●●●●● ●●●● ●●●●●● ●●●●●●●●●● ●●●●●●●●●●●●●●●●●● ●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●● ●● ● ●●●
● ●●
●●●
● ●●
●●●
● ●●
●●●
● ●
●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
● ●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●
●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●●●
●●●●
●●●●
●●●●
●●●●
●●
●●●●
●●●●●●
●●●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
sspg
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●
●●
●●●●●●●●●●●● ●● ●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●
●●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●
● ●●
●●●●
●●●
●
●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●
●●
●●
●●
●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●
●●●●
●●●●
●●
●●
●●
●●
●●
●●●●
●●●●
●●
●●
●●●●
●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●●
●●●
●●●
●●●●
●●
●●●
●●●
●●●
●●●
●
●
●●
●
●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●●
●
●
●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●●●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●
●●
●●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●
●●●
●
200
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●●●●●●
●●
●●
●●
●●
●●
●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●●● ●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●
●
●●
●
●
●●
●●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●
●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●
●●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●●
●●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●●
●●
●
●●●
●●
●● ●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●●
●●
●
●●
●
●●
●●●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●●
●
0
●
Figure 4.12 Choice of the number of components per class for MDA with the diabetes data set. [Plot residue: BIC value (y-axis, ticks −28 to −21) against the number of components (x-axis, 2 to 10) for Class 1, Class 2 and Class 3; the plotted curves are not recoverable from the text extraction.]
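Figure 4.12 illustrates selecting the number of mixture components for each class in mixture discriminant analysis (MDA) by BIC. The following is a minimal sketch of such an analysis, assuming the mclust R package and the diabetes data distributed with it (variables glucose, insulin and sspg plus a class label); it is not the book's exact script, and the reported BIC values will depend on the mclust version.

```r
## Minimal sketch of the analysis behind Figure 4.12 (assumed setup:
## mclust's built-in 'diabetes' data with glucose, insulin, sspg).
library(mclust)

data(diabetes)
X     <- diabetes[, c("glucose", "insulin", "sspg")]
class <- diabetes$class

## MclustDA fits a Gaussian mixture to each class; with G = 1:10 the
## number of components per class is chosen by BIC, as in the figure.
mda <- MclustDA(X, class, modelType = "MclustDA", G = 1:10)

## Per-class models, selected numbers of components and BIC values.
summary(mda)

## Pairwise scatter plots of the training data coloured by class.
plot(mda, what = "scatterplot")
```

Here summary(mda) reports, for each class, the covariance model and the number of components with the best BIC, which is the comparison Figure 4.12 shows graphically.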
[Plot residue: pairwise scatter plots of the diabetes data; only the sspg axis label and its tick marks (0 to 800) are recoverable from the text extraction.]