What is an Eigenvalue and an Eigenvector?

In: Mathematics

Trying to keep it ELI5 without being patronizingly simple. Vectors are a clever way to communicate magnitude and direction. The classic example is a force vector. You can draw an arrow where the direction the arrow is pointing indicates the direction of the force, and the length of the arrow indicates the size or magnitude of the force. Matrices are a good way of organizing the numbers needed to describe one.

Eigenvalues and eigenvectors are concepts from a branch of mathematics called linear algebra. Essentially, an eigenvector of a matrix is a vector that doesn't change direction when you multiply the matrix by it. The multiplication only changes its magnitude. And the eigenvalue is the scalar factor of that change in magnitude.

They’re neat principles of mathematics that have applications all over the place. I use them in my career when it comes to random vibration analysis of aerospace components. When we find the natural frequencies of a vibrating object, those frequencies are actually eigenvalues of the system of equations we’re solving.

They have lots of different uses in all sorts of complicated things, but the general idea is quite simple. An eigenvector is a vector that, when a given matrix multiplies it, only changes in magnitude, and the scale of that magnitude change is the eigenvalue.

Here’s an actual ELI5.

A matrix is a fancy way of describing the change in shape of something.

Now, imagine you take some blob of Play-Doh, and you squish it with your flat hands. Most parts of the Play-Doh go off in all kinds of weird directions, left, right, up, down, you name it. However, there will be one special direction along which the particles are simply pushed further out along that same line.

That special direction is the eigenvector, and the amount (i.e. the factor) by which the stretch happens along that direction is the eigenvalue. Both of them together uniquely describe the way you squished the Play-Doh, and that’s why they’re interesting to know.

Not really ELI5, since if you know what vectors are then that’s already above ELI5 level haha

Eigenvalues and eigenvectors describe the behavior of linear transformations—i.e. transformations of space that affect the space in the same way everywhere (e.g. stretch/squish space by the same amount everywhere) and also that keep the zero vector in place. (Perhaps another way of thinking about linear transformations is that they transform straight lines into straight lines, parallelograms into parallelograms, triangles into triangles, etc.)

For some random linear transformation, most vectors will be transformed by a combination of rotation and stretching (so they’ll be rotated by some amount and elongated/shortened by some factor). Note that the rotation amount and scaling amount is not necessarily constant for a given linear transformation—it depends on the specific vector being transformed. Interestingly enough, there are often some vectors that the transformation will transform only by scaling, with no rotation at all; these are the eigenvectors of the transformation. For each eigenvector, the amount by which the transformation will stretch/shrink the eigenvector is called the eigenvalue of that eigenvector.

For instance, if I take the transformation of the plane that “flattens” all vectors onto the x-axis (the orthogonal projection onto the x-axis), then the eigenvectors of this transformation are (1,0) and (0,1). (1,0) lies along the x-axis, so it isn’t changed at all, so its eigenvalue is 1. (0,1) lies along the y-axis, so it gets collapsed down into the zero vector, so its associated eigenvalue is 0.

(If you’re paying very close attention, you might notice that (2,0) and (0,2) are also eigenvectors, as are (3,0) and (0,3), etc. In fact, any multiple of an eigenvector is also an eigenvector. So when we list out the eigenvectors of a transformation, we usually just list out one eigenvector from each “family” of eigenvectors, with the understanding that any multiple of the given eigenvectors will also be an eigenvector.)
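The projection example above can be checked numerically. The sketch below (using numpy, an assumption; the thread itself uses no particular library) asks for the eigenvalues and eigenvectors of the matrix that projects onto the x-axis:

```python
import numpy as np

# Orthogonal projection onto the x-axis: (x, y) -> (x, 0).
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# np.linalg.eig returns the eigenvalues and, as columns, the eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(P)

print(eigenvalues)    # [1. 0.]
print(eigenvectors)   # columns are (1, 0) and (0, 1)
```

This matches the text: (1,0) is left alone (eigenvalue 1) and (0,1) is collapsed to zero (eigenvalue 0).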

Eigenvectors and eigenvalues are useful because it’s really simple to describe what the transformation does to them—it just scales them by some factor! So if we apply the transformation many times to an eigenvector, it’ll just get scaled by the same amount over and over again. In contrast, it can be difficult to predict what applying a transformation to an arbitrary vector many times will do to that vector. Moreover, if a transformation happens to have enough eigenvectors, then we can actually completely characterize the transformation by taking note of how it transforms all of its eigenvectors—i.e. by noting all of its eigenvectors and eigenvalues. This can make repeatedly applying the transformation very easy, since we can just reduce a more complicated transformation into simple scaling of certain vectors. (This is the principle of diagonalization.)
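The diagonalization idea can be made concrete with a small sketch. Assuming a matrix M with a full set of eigenvectors (the symmetric matrix below is my own example, not from the thread), writing M = V D V⁻¹ turns repeated application of M into repeated scaling of the eigenvalues:

```python
import numpy as np

# Example symmetric matrix (assumption): it has two independent eigenvectors.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, V = np.linalg.eig(M)   # eigenvalues, and eigenvectors as columns of V

# Applying M five times, two ways:
direct = np.linalg.matrix_power(M, 5)            # multiply M out five times
via_eig = V @ np.diag(vals**5) @ np.linalg.inv(V)  # just raise eigenvalues to the 5th

print(np.allclose(direct, via_eig))  # True
```

The point is that `vals**5` is trivial to compute, while multiplying a big matrix by itself many times is not.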

In the language of vectors and matrices: a geometric vector is basically something with a length and a direction. You can transform vectors by multiplying matrices by them; for example, a rotation of 180 degrees in 2D can be represented by the matrix

|-1|0|
|:-|:-|
|0|-1|

If you multiply any vector by this matrix then it will rotate it around 180 degrees.
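A quick check of that rotation, with an arbitrary example vector of my own choosing:

```python
import numpy as np

# The 180-degree rotation matrix from the text.
R = np.array([[-1.0,  0.0],
              [ 0.0, -1.0]])

v = np.array([3.0, 4.0])  # arbitrary example vector (assumption)
print(R @ v)              # [-3. -4.]: same line, opposite direction
```

Note that this particular matrix keeps every vector on its original line, so every nonzero vector is an eigenvector of it, with eigenvalue -1.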

Now most matrices aren’t that simple, and they will do a combination of operations, changing the length and direction of vectors in complex ways. For a given matrix, an eigenvector is one whose direction isn’t changed, i.e. only its length is scaled when you multiply it by that matrix. The eigenvalue is how much that length is scaled.

For example, suppose we have some matrix M and a vector V = (3,5). Say we multiply M*V and get a new vector (6,10), which equals 2*(3,5). This means V is an eigenvector of M with an eigenvalue of 2: applying M to V has just scaled it by a factor of 2 without changing its direction.
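The thread leaves M unspecified, but here is one concrete matrix with that property (my own construction, chosen so that (3,5) is an eigenvector with eigenvalue 2):

```python
import numpy as np

# Hypothetical M built so that M @ (3, 5) = 2 * (3, 5):
# row 1: -3*3 + 3*5 = 6, row 2: 0*3 + 2*5 = 10.
M = np.array([[-3.0, 3.0],
              [ 0.0, 2.0]])
v = np.array([3.0, 5.0])

print(M @ v)  # [ 6. 10.], i.e. 2 * v
```

Most other vectors fed through this M do change direction; (3,5) is special to it, which is exactly what makes it an eigenvector.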