Canonical correlation
In statistics, canonical-correlation analysis (CCA) is a way of making sense of cross-covariance matrices. If we have two vectors $X = (X_1, \dots, X_n)$ and $Y = (Y_1, \dots, Y_m)$ of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of the $X_i$ and $Y_j$ which have maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables." The method was first introduced by Harold Hotelling in 1936.
Definition
Given two column vectors $X = (X_1, \dots, X_n)^T$ and $Y = (Y_1, \dots, Y_m)^T$ of random variables with finite second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose $(i, j)$ entry is the covariance $\operatorname{cov}(X_i, Y_j)$. In practice, we would estimate the covariance matrix based on sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).
Canonical-correlation analysis seeks vectors $a$ and $b$ such that the random variables $a^T X$ and $b^T Y$ maximize the correlation $\rho = \operatorname{corr}(a^T X, b^T Y)$. The random variables $U = a^T X$ and $V = b^T Y$ are the first pair of canonical variables. Then one seeks vectors maximizing the same correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables; this gives the second pair of canonical variables. This procedure may be continued up to $\min\{m, n\}$ times.
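As a minimal sketch of the estimation step mentioned above, the covariance blocks can be obtained from a pair of data matrices with observations in rows; the data and variable names below are purely illustrative, not part of the original text:

```python
import numpy as np

# Hypothetical data: 500 observations of a 3-dimensional X and a 2-dimensional Y.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Y = X[:, :2] + 0.5 * rng.standard_normal((500, 2))

# Stack the variables and estimate the joint sample covariance; the off-diagonal
# block is the cross-covariance Sigma_XY used in the definition above.
Z = np.hstack([X, Y])                   # shape (500, n + m)
S = np.cov(Z, rowvar=False)             # (n + m) x (n + m) sample covariance
n, m = X.shape[1], Y.shape[1]
Sxx, Syy, Sxy = S[:n, :n], S[n:, n:], S[:n, n:]
```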
Computation
Derivation
Let $\Sigma_{XX} = \operatorname{cov}(X, X)$ and $\Sigma_{YY} = \operatorname{cov}(Y, Y)$. The parameter to maximize is
$$\rho = \frac{a^T \Sigma_{XY} b}{\sqrt{a^T \Sigma_{XX} a}\,\sqrt{b^T \Sigma_{YY} b}}.$$
The first step is to define a change of basis and define
$$c = \Sigma_{XX}^{1/2} a, \qquad d = \Sigma_{YY}^{1/2} b.$$
And thus we have
$$\rho = \frac{c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d}{\sqrt{c^T c}\,\sqrt{d^T d}}.$$
By the Cauchy–Schwarz inequality, we have
$$\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}\right) d \leq \left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} \Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2} \left(d^T d\right)^{1/2},$$
$$\rho \leq \frac{\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2}}{\left(c^T c\right)^{1/2}}.$$
There is equality if the vectors $d$ and $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$ are collinear. In addition, the maximum of correlation is attained if $c$ is the eigenvector with the maximum eigenvalue for the matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$ (see Rayleigh quotient). The subsequent pairs are found by using eigenvalues of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.
Solution
The solution is therefore:
• $c$ is an eigenvector of $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$
• $d$ is proportional to $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$
Reciprocally, there is also:
• $d$ is an eigenvector of $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1/2}$
• $c$ is proportional to $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d$
Reversing the change of coordinates, we have that
• $a$ is an eigenvector of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$
• $b$ is an eigenvector of $\Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY}$
• $a$ is proportional to $\Sigma_{XX}^{-1} \Sigma_{XY} b$
• $b$ is proportional to $\Sigma_{YY}^{-1} \Sigma_{YX} a$
The canonical variables are defined by:
$$U = c^T \Sigma_{XX}^{-1/2} X = a^T X, \qquad V = d^T \Sigma_{YY}^{-1/2} Y = b^T Y.$$
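A minimal NumPy sketch of this solution, assuming the sample covariance blocks Sxx, Sxy, Syy estimated in the earlier example (the symmetric inverse square root via eigendecomposition is one of several equivalent implementation choices, and the helper names are illustrative):

```python
import numpy as np

def inv_sqrt(S):
    """Symmetric inverse square root of a positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def first_canonical_pair(Sxx, Sxy, Syy):
    """First canonical pair (a, b) and correlation rho from covariance blocks,
    following the whitened eigenproblem above. Illustrative, not optimized."""
    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)     # Sigma_XX^{-1/2}, Sigma_YY^{-1/2}
    M = Kx @ Sxy @ Ky                         # whitened cross-covariance
    w, V = np.linalg.eigh(M @ M.T)            # c = top eigenvector, rho^2 = top eigenvalue
    c, rho = V[:, -1], float(np.sqrt(max(w[-1], 0.0)))
    d = M.T @ c / rho                         # collinearity condition from the derivation
    a, b = Kx @ c, Ky @ d                     # reverse the change of basis
    return a, b, rho
```

Subsequent pairs would correspond to the remaining eigenvectors, taken in order of decreasing eigenvalue, exactly as described above.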
Implementation
CCA can be computed using singular value decomposition on a correlation matrix. It is available as a function in
• MATLAB as canoncorr [1]
• R as cancor [2] or in the FactoMineR package [3]
• SAS as proc cancorr [4]
• Python, in scikit-learn, as the cross decomposition module [5]
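For instance, a brief usage sketch with scikit-learn's cross decomposition module, applied to the hypothetical data matrices from the earlier covariance example (the scaling of the variates differs from the eigenvector sketch above, but the correlations between paired variates are the canonical correlations):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

cca = CCA(n_components=2)           # ask for the first two canonical pairs
U, V = cca.fit_transform(X, Y)      # canonical variates for each view

# Canonical correlations: correlations between paired columns of U and V.
rhos = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(U.shape[1])]
print(rhos)
```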
Hypothesis testing
Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row $i$ is zero implies all further correlations are also zero. Suppose we have $p$ independent observations in a sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m, n\}$. For the $i$th row, the test statistic is:
$$\chi^2 = - \left( p - 1 - \tfrac{1}{2}(m + n + 1) \right) \ln \prod_{j = i}^{\min\{m, n\}} \left(1 - \widehat{\rho}_j^{\,2}\right),$$
which is asymptotically distributed as a chi-squared with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$. Since all the correlations from $\min\{m, n\}$ to $p$ are logically zero (and estimated that way also), the product for the terms after this point is irrelevant.
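A hedged sketch of this test, assuming a vector of estimated canonical correlations rhos (e.g. from the scikit-learn example above), p observations, and dimensions n and m; the function name is hypothetical:

```python
import numpy as np
from scipy.stats import chi2

def cca_significance(rhos, p, n, m):
    """Chi-squared test for each row i: H0 is that correlations i, i+1, ... are all zero.
    Returns a list of (statistic, degrees_of_freedom, p_value) tuples."""
    rhos = np.asarray(rhos)
    out = []
    for i in range(len(rhos)):                   # 0-based here; the text's i is i + 1
        stat = -(p - 1 - 0.5 * (m + n + 1)) * np.log(np.prod(1 - rhos[i:] ** 2))
        dof = (m - i) * (n - i)                  # equals (m - i + 1)(n - i + 1) with 1-based i
        out.append((stat, dof, chi2.sf(stat, dof)))
    return out
```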
Practical uses
A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common between the two sets. For example, in psychological testing, you could take two well-established multidimensional personality tests such as the Minnesota Multiphasic Personality Inventory (MMPI-2) and the NEO. By seeing how the MMPI-2 factors relate to the NEO factors, you could gain insight into what dimensions were common between the tests and how much variance was shared. For example, you might find that an extraversion or neuroticism dimension accounted for a substantial amount of shared variance between the two tests.
One can also use canonical-correlation analysis to produce a model equation which relates two sets of variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs and a set of inputs.
Constraint restrictions can be imposed on such a model to ensure it reflects theoretical requirements or intuitively
obvious conditions. This type of model is known as a maximum correlation model.
Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest that they are best visualized by plotting them as heliographs, a circular format with ray-like bars, with each half representing the two sets of variables.
Examples
Let $X$ and $Y$ be one-dimensional with zero expected value, i.e., $\operatorname{E}(X) = \operatorname{E}(Y) = 0$. If $X = Y$, i.e., $X$ and $Y$ are perfectly correlated, then, e.g., $a = 1$ and $b = 1$, so that the first (and only, in this example) pair of canonical variables is $U = X$ and $V = Y$. If $X = -Y$, i.e., $X$ and $Y$ are perfectly anticorrelated, then, e.g., $a = 1$ and $b = -1$, so that the first (and only, in this example) pair of canonical variables is $U = X$ and $V = -Y = X$. We notice that in both cases $U = V$, which illustrates that canonical-correlation analysis treats correlated and anticorrelated variables similarly.
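A quick numerical illustration of this point, reusing the hypothetical first_canonical_pair sketch from the Solution section on scalar data (the sample itself is made up):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)

for y in (x, -x):                   # perfectly correlated, then perfectly anticorrelated
    S = np.cov(np.column_stack([x, y]), rowvar=False)
    Sxx, Syy, Sxy = S[:1, :1], S[1:, 1:], S[:1, 1:]
    a, b, rho = first_canonical_pair(Sxx, Sxy, Syy)
    print(rho)                      # prints 1.0 in both cases
```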
Connection to principal angles
Assuming that $X = (X_1, \dots, X_n)^T$ and $Y = (Y_1, \dots, Y_m)^T$ have zero expected values, i.e., $\operatorname{E}(X) = \operatorname{E}(Y) = 0$, their covariance matrices $\Sigma_{XX} = \operatorname{Cov}(X, X) = \operatorname{E}[X X^T]$ and $\Sigma_{YY} = \operatorname{Cov}(Y, Y) = \operatorname{E}[Y Y^T]$ can be viewed as Gram matrices in an inner product for the entries of $X$ and $Y$, correspondingly. In this interpretation, the random variables, entries $X_i$ of $X$ and $Y_j$ of $Y$, are treated as elements of a vector space with an inner product given by the covariance $\operatorname{cov}(X_i, Y_j)$; see Covariance#Relationship_to_inner_products.
The definition of the canonical variables $U$ and $V$ is then equivalent to the definition of principal vectors for the pair of subspaces spanned by the entries of $X$ and $Y$ with respect to this inner product. The canonical correlations $\operatorname{corr}(U, V)$ are equal to the cosines of the principal angles.
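A hedged sketch of the same relationship on sample data: orthonormalize the column spaces of the centered data matrices and take the singular values of the product of the bases, which coincide with the sample canonical correlations (function name illustrative):

```python
import numpy as np

def canonical_correlations_via_angles(X, Y):
    """Sample canonical correlations as cosines of the principal angles between
    the column spaces of the centered data matrices X and Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)            # orthonormal basis for the span of X's columns
    Qy, _ = np.linalg.qr(Yc)            # orthonormal basis for the span of Y's columns
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)   # cosines of principal angles
```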
References
[1] http://www.mathworks.co.uk/help/stats/canoncorr.html
[2] http://stat.ethz.ch/R-manual/R-devel/library/stats/html/cancor.html
[3] http://factominer.free.fr/
[4] http://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_cancorr_sect005.htm
[5] http://scikit-learn.org/stable/modules/cross_decomposition.html
External links
• Understanding canonical correlation analysis (http://www.qmrg.org.uk/files/2008/12/3-understanding-canonical-correlation-analysis1.pdf) (Concepts and Techniques in Modern Geography)
• Hardoon, D. R.; Szedmak, S.; Shawe-Taylor, J. (2004). "Canonical Correlation Analysis: An Overview with Application to Learning Methods". Neural Computation 16 (12): 2639–2664. doi:10.1162/0899766042321814 (http://dx.doi.org/10.1162/0899766042321814). PMID 15516276 (http://www.ncbi.nlm.nih.gov/pubmed/15516276).
• A note on the ordinal canonical-correlation analysis of two sets of ranking scores (http://mpra.ub.uni-muenchen.de/12796/) (also provides a FORTRAN program), in Journal of Quantitative Economics 7(2), 2009, pp. 173–199.
• Representation-Constrained Canonical Correlation Analysis: A Hybridization of Canonical Correlation and Principal Component Analyses (http://ssrn.com/abstract=1331886) (also provides a FORTRAN program), in Journal of Applied Economic Sciences 4(1), 2009, pp. 115–124.
