Okay, here's the problem:

- We have a set of n variables, say **R**_{1}, **R**_{2}, ... **R**_{n} (such as the monthly returns for n stocks).
- The covariance matrix **Θ** is known (for these R-variables).
- Construct the linear combination: **W** = **x**_{1}**R**_{1} + **x**_{2}**R**_{2} + ... + **x**_{n}**R**_{n}.
- Choose the weights **x**_{k} so as to maximize the **Variance** of the variable **W**.
>Huh?
>Then just use the S&P500 Index, right?
>And end up with just 30 instead of 300? You're talkin' the DOW, eh?
Then here's the problem, in math-speak: maximize **X**^{T}**Θ****X**, where **X** = (x_{1}, x_{2}, ... x_{n}) is the vector of weights, subject to **X**^{T}**X** = 1.
>That's the problem? What happened to the R-variables?
>And can you solve that problem?
- The matrix **Θ** is symmetric.
- The eigenvalues λ and eigenvectors **u** satisfy **Θ****u** = λ**u**.
- Symmetric matrices have real eigenvalues and, if the eigenvalues are all distinct, the eigenvectors are orthogonal. That is: if **u**_{j} and **u**_{k} are two eigenvectors, **u**_{j}^{T}**u**_{k} = 0.
- If that is the case, and we write our **X** as a linear combination of the __unit__ eigenvectors, **X** = a_{1}**u**_{1} + a_{2}**u**_{2} + ... + a_{n}**u**_{n}, then **X**^{T}**Θ****X** = {a_{1}**u**_{1}^{T} + a_{2}**u**_{2}^{T} + ...} **Θ** {a_{1}**u**_{1} + a_{2}**u**_{2} + ...} = SUM of terms like: a_{j}**u**_{j}^{T}**Θ** a_{k}**u**_{k}.
- But **Θ****u**_{k} = λ_{k}**u**_{k} (since **u**_{k} is an eigenvector) and **u**_{j}^{T}**u**_{k} = 0 for j ≠ k (since eigenvectors are orthogonal), so only the j = k terms survive.
- Hence: **X**^{T}**Θ****X** = SUM of terms like: a_{j}^{2} λ_{j} **u**_{j}^{T}**u**_{j} for j = 1, 2, ... n.
- Since we picked __unit__ eigenvectors, **u**_{j}^{T}**u**_{j} = 1, so **X**^{T}**Θ****X** = **Σ** a_{j}^{2} λ_{j}.
- Further, we required that **X**^{T}**X** = 1, so that **Σ** a_{j}^{2} = 1 for j = 1, 2, ... n.
- To maximize **Σ** a_{j}^{2} λ_{j} subject to **Σ** a_{j}^{2} = 1, put all the weight on the largest eigenvalue. The maximizing **X** is the unit eigenvector associated with the largest λ ... and the maximum Variance is that largest eigenvalue.
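The algebra above can be checked numerically. Here's a short sketch (using NumPy, with made-up random returns standing in for three hypothetical stocks): no unit-length **X** beats the eigenvector belonging to the largest eigenvalue.

```python
# A numerical check: among unit-length weight vectors, none beats the
# eigenvector of the largest eigenvalue. (Made-up returns, 3 "stocks".)
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_normal((60, 3))     # 60 "months" of returns, 3 "stocks"
Theta = np.cov(returns, rowvar=False)      # the symmetric covariance matrix

lam, U = np.linalg.eigh(Theta)             # eigenvalues ascending, unit eigenvectors
x_best = U[:, -1]                          # eigenvector for the largest eigenvalue
var_best = x_best @ Theta @ x_best         # Variance of W for these weights

# Try a thousand random unit vectors X (so X^T X = 1):
trials = rng.standard_normal((1000, 3))
trials /= np.linalg.norm(trials, axis=1, keepdims=True)
var_trials = np.einsum('ij,jk,ik->i', trials, Theta, trials)

print(np.isclose(var_best, lam[-1]))         # prints True: max Variance = largest eigenvalue
print(var_trials.max() <= var_best + 1e-12)  # prints True: no random X beats it
```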
We're almost there!
Did you get that?
>And just how do you find that eigenvector associated with the largest eigenvalue?
- Pick any vector **v**.
- Multiply by your matrix **A**, giving **Av**.
- If **v** = a_{1}**u**_{1} + a_{2}**u**_{2} + ..., then **Av** = a_{1}λ_{1}**u**_{1} + a_{2}λ_{2}**u**_{2} + ...
- Repeating umpteen times, we get: **A**^{N}**v** = a_{1}λ_{1}^{N}**u**_{1} + a_{2}λ_{2}^{N}**u**_{2} + ...
- After each premultiplication by **A**, normalize the resultant vector so it's of unit length ... then continue multiplying by **A**.
- Eventually, the largest eigenvalue dominates and you're left with a_{1}λ_{1}^{N}**u**_{1}, normalized to unit length ... you've identified the appropriate eigenvector **u**_{1}.
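That recipe is the classic power method. Here's a minimal NumPy sketch of it (the symmetric matrix is made up purely for illustration):

```python
# Power method: repeatedly premultiply by A and normalize to unit length.
import numpy as np

def power_iteration(A, iterations=200):
    """Return (eigenvalue, unit eigenvector) for A's largest eigenvalue."""
    v = np.ones(A.shape[0])        # "pick any vector"
    for _ in range(iterations):
        v = A @ v                  # premultiply by A ...
        v /= np.linalg.norm(v)     # ... then normalize to unit length
    lam = v @ A @ v                # Rayleigh quotient recovers the eigenvalue
    return lam, v

Theta = np.array([[1.0, 0.6, 0.3],
                  [0.6, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])    # a made-up symmetric matrix

lam, u = power_iteration(Theta)
print(np.allclose(Theta @ u, lam * u))   # prints True: Theta u = lambda u
```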
>Huh?
>But what's that maximum Variance? I thought we ...
>But have you reduced the number of components? I thought you were talking about retaining the "principal" components and ...
- The coVariance matrix **Θ** is real and symmetric and is (usually) obtained from historical data.
- If **Θ** is n x n, then we can expect n eigenvectors and n associated eigenvalues.
- The eigenvectors, satisfying **Θ****u** = λ**u**, have a particular direction, but can be any length.
- If the n eigenvalues are λ_{1}, λ_{2}, ... λ_{n}, then the contribution to the variance provided by the eigenvector **u**_{k} is λ_{k}/**Σ**λ_{j}.
- Although one can use the raw historical returns data to calculate **Θ**, it is better to "normalize" the returns by subtracting the Mean Return then dividing by the Standard Deviation.
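That normalization step can be sketched like so (random numbers stand in for the historical returns). Once each stock's returns have mean 0 and standard deviation 1, the coVariance matrix is just the correlation matrix, with 1s down the diagonal.

```python
# Normalize returns (subtract the Mean, divide by the Standard Deviation),
# then build the coVariance matrix. Random stand-ins, not real stock data.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_normal((96, 4))    # 8 years x 12 months, 4 "stocks"

z = (returns - returns.mean(axis=0)) / returns.std(axis=0)
Theta = (z.T @ z) / len(z)                # coVariance of the normalized returns

print(np.allclose(np.diag(Theta), 1.0))   # prints True: 1s on the diagonal
```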
>Huh?
>But have you reduced the number of components? I thought ...
>That's confusing. Do you have an example?

There's a spreadsheet. It'll download 8 years of historical monthly returns for four stocks and construct a coVariance matrix.

>So, in the example, the allocation 30% GE + 28% MSFT + 18% XOM + 24% GM ... does what?

Those weights are the components of the principal eigenvector, so that allocation is supposed to give the maximum Variance.
>And does it?
>And where are all the other eigenvectors and eigenvalues?
The spreadsheet does a bunch of iterations to find the others, one at a time, starting with the next largest eigenvalue.

>The next largest? How do you find that?
You stare at the chart, select four Guesses, then click a button to get better and better estimates.
The spreadsheet will do that, somewhere ... like this.
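One standard way to turn an eigenvalue Guess into "better and better estimates" is shifted inverse iteration. This is a sketch of that general technique, not a claim about what the spreadsheet's button actually does:

```python
# Shifted inverse iteration: from a guess, converge to the nearest eigenvalue.
import numpy as np

def refine_eigen(A, guess, steps=50):
    """Refine an eigenvalue guess into the nearest true (eigenvalue, eigenvector)."""
    n = A.shape[0]
    v = np.ones(n)
    for _ in range(steps):
        # Solving (A - guess*I) w = v amplifies the eigenvector whose
        # eigenvalue lies closest to the guess.
        w = np.linalg.solve(A - guess * np.eye(n), v)
        v = w / np.linalg.norm(w)
    return v @ A @ v, v               # refined eigenvalue and unit eigenvector

Theta = np.array([[1.0, 0.6, 0.3],
                  [0.6, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])   # made-up coVariance matrix

lam, u = refine_eigen(Theta, guess=0.5)   # rough guess at a smaller eigenvalue
print(np.allclose(Theta @ u, lam * u))    # prints True
```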
>Is that 13%+19%+25%+43% good ... or what?
Actually, it's neither good nor bad. The sum of the eigenvalues is always the trace of the coVariance matrix = sum of the diagonal elements
(which each have the value "1" for normalized returns) ... so for "4" stocks it'd be "4", and the percentages always add up to 100%.
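A quick check of that claim, with a made-up 4x4 coVariance matrix of normalized returns:

```python
# The eigenvalues of an n x n coVariance matrix of normalized returns
# always sum to n (the trace), so the percentage shares total 100%.
import numpy as np

Theta = np.array([[1.0, 0.6, 0.3, 0.2],
                  [0.6, 1.0, 0.4, 0.1],
                  [0.3, 0.4, 1.0, 0.5],
                  [0.2, 0.1, 0.5, 1.0]])   # 4 "stocks", 1s on the diagonal

lam = np.linalg.eigvalsh(Theta)
print(np.isclose(lam.sum(), np.trace(Theta)))   # prints True: sum = trace = 4
fractions = lam / lam.sum()                     # each eigenvalue's share of variance
print(np.isclose(fractions.sum(), 1.0))         # prints True: shares total 100%
```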
>And the principal eigenvector?
>Wait a minute! A negative percentage? Are you kidding?
The pictures and spreadsheet may change without notice.