Eigenvectors, eigenvalues and orthogonality
This is a quick write-up on eigenvectors, eigenvalues, orthogonality and the like. These topics are not covered well in the handbook, but they are important from an examination point of view.
Before we go on to matrices, consider what a vector is. A vector is a matrix with a single column. The easiest way to think about a vector is as a data point. For example, if (x, y) is a vector, consider it a point on a 2-dimensional Cartesian plane. If there are three elements, consider it a point in a 3-dimensional Cartesian system, with the elements representing the x, y and z coordinates. This data point, when joined to the origin, is the vector. It has a length (given by √(x² + y² + z²) for a 3-element column vector), and a direction, which you could consider to be determined by its angle to the x-axis (or any other reference line).
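The length formula above can be checked numerically; here is a small sketch using numpy, with the vector (1, 2, 2) chosen purely as an illustration:

```python
import numpy as np

# A vector is just a single-column matrix; here the point (x, y, z) = (1, 2, 2).
v = np.array([1.0, 2.0, 2.0])

# Its length is sqrt(x^2 + y^2 + z^2) = sqrt(1 + 4 + 4) = 3.
length = np.sqrt(np.sum(v ** 2))
print(length)             # 3.0

# numpy provides this directly as the Euclidean norm.
print(np.linalg.norm(v))  # 3.0
```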
Just to keep things simple, I will take an example from a two dimensional plane. These are easier to visualize in the head and draw on a graph. For vectors with higher dimensions, the same analogy applies.
Consider the points (2, 1) and (4, 2) on a Cartesian plane. These are plotted below. The vectors that these represent are also plotted – the vector (2, 1) is the thinner black line, and the vector (4, 2) is the thick green line.
One of the things to note about the two vectors above is that the longer vector appears to be a mere extension of the other: as if someone had stretched the first line out, changing its length but not its direction. Equally, we could say that the smaller line is merely a contraction of the larger one, i.e., the two are some sort of 'multiples' of each other (the larger being double the smaller, and the smaller being half the larger). We can take one of the two lines, multiply it by something, and get the other line. That something is a 2 x 2 matrix. In other words, there is a matrix out there that, when multiplied by (2, 1), gives us (4, 2). Let us call that matrix A. There are in fact many such matrices; one simple choice, used here purely for illustration, is A = [[3, −2], [2, −2]], since A(2, 1)ᵀ = (3×2 − 2×1, 2×2 − 2×1)ᵀ = (4, 2)ᵀ.
That is really what eigenvalues and eigenvectors are about: Aw = λw, where A is a square matrix, w is a (non-zero) vector called the eigenvector, and λ is a constant called the eigenvalue. Here, w = (2, 1) is an eigenvector of A, and λ = 2 is the corresponding eigenvalue, since multiplying by A doubles the vector's length without changing its direction.
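The relation Aw = λw can be verified numerically. The sketch below assumes one illustrative matrix A that maps (2, 1) to (4, 2); any other matrix with the same property would do:

```python
import numpy as np

# One of many matrices that maps (2, 1) to (4, 2); chosen purely for illustration.
A = np.array([[3.0, -2.0],
              [2.0, -2.0]])
w = np.array([2.0, 1.0])

# A w = 2 w, so w is an eigenvector of A with eigenvalue lambda = 2.
print(A @ w)  # [4. 2.]

# numpy recovers the eigenvalues and eigenvectors directly
# (the order in which they are returned is not guaranteed).
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)
```

Note that `np.linalg.eig` returns all eigenvalues of A; for this illustrative matrix, 2 is one of them.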
Calculating the angle between vectors: What is a 'dot product'? The dot product of two vectors is the sum of the products of corresponding elements – for example, if X = (a, b) and Y = (c, d) are two vectors, their dot product X·Y is ac + bd. The angle θ between two vectors satisfies cos θ = (X·Y) / (|X| |Y|), where |X| and |Y| are the lengths of the vectors; in particular, a zero dot product means cos θ = 0.
Example: Orthogonality. Consider the vectors (2, 1) and (−1, 2). Their dot product is 2×(−1) + 1×2 = 0. If θ is the angle between these two vectors, this means cos θ = 0. Cos θ is zero when θ is 90 degrees, therefore these vectors are perpendicular (orthogonal). And you can see this in the graph below.
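The orthogonality check above can be sketched in numpy; the vectors (2, 1) and (−1, 2) are the ones from the example:

```python
import numpy as np

x = np.array([2.0, 1.0])
y = np.array([-1.0, 2.0])

# Dot product: 2*(-1) + 1*2 = 0, so the vectors are orthogonal.
dot = np.dot(x, y)

# cos(theta) = (X . Y) / (|X| |Y|); zero dot product => cos(theta) = 0 => theta = 90 degrees.
cos_theta = dot / (np.linalg.norm(x) * np.linalg.norm(y))
print(dot, cos_theta)  # 0.0 0.0
```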
For the exam, note the following common values of cos θ: cos 0° = 1 (the vectors point the same way), cos 90° = 0 (the vectors are orthogonal), and cos 180° = −1 (the vectors point in opposite directions).
Why is all of this important for risk management?
