What is an Eigenvalue and an Eigenvector?

In: Mathematics

4 Answers

Anonymous

Not really ELI5, since if you know what vectors are then that’s already above ELI5 level haha

Eigenvalues and eigenvectors describe the behavior of linear transformations—i.e. transformations of space that affect the space in the same way everywhere (e.g. stretch/squish space by the same amount everywhere) and also that keep the zero vector in place. (Perhaps another way of thinking about linear transformations is that they transform straight lines into straight lines, parallelograms into parallelograms, triangles into triangles, etc.)

For some random linear transformation, most vectors will be transformed by a combination of rotation and stretching (so they’ll be rotated by some amount and elongated/shortened by some factor). Note that the rotation amount and scaling amount are not necessarily constant for a given linear transformation—they depend on the specific vector being transformed. Interestingly enough, there are often some vectors that the transformation will transform only by scaling, with no rotation at all; these are the eigenvectors of the transformation. For each eigenvector, the factor by which the transformation stretches/shrinks that eigenvector is called its eigenvalue.
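To make this concrete, here's a tiny sketch in Python. The shear matrix below is my own illustrative choice (it's not from the answer above): it tilts most vectors, but leaves vectors along the x-axis alone, so (1, 0) is an eigenvector with eigenvalue 1 while (0, 1) is not an eigenvector at all.

```python
def apply(matrix, v):
    """Multiply a 2x2 matrix (pair of rows) by a 2D vector."""
    (a, b), (c, d) = matrix
    x, y = v
    return (a * x + b * y, c * x + d * y)

shear = ((1, 1), (0, 1))  # shears the plane horizontally

# (1, 0) is an eigenvector: the shear leaves it unchanged (eigenvalue 1).
print(apply(shear, (1, 0)))  # -> (1, 0)

# (0, 1) is NOT an eigenvector: it gets tilted, not merely scaled.
print(apply(shear, (0, 1)))  # -> (1, 1)
```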

For instance, if I take the transformation of the plane that “flattens” all vectors onto the x-axis (the orthogonal projection onto the x-axis), then the eigenvectors of this transformation are (1,0) and (0,1). (1,0) lies along the x-axis, so it isn’t changed at all, so its eigenvalue is 1. (0,1) lies along the y-axis, so it gets collapsed down into the zero vector, so its associated eigenvalue is 0.
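That projection example can be written out directly (a minimal sketch of the transformation described above):

```python
def project_x(v):
    """Orthogonal projection onto the x-axis: flatten a 2D vector down."""
    x, y = v
    return (x, 0)

# (1, 0) lies along the x-axis and is unchanged: eigenvalue 1.
print(project_x((1, 0)))  # -> (1, 0)

# (0, 1) collapses to the zero vector: eigenvalue 0.
print(project_x((0, 1)))  # -> (0, 0)
```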

(If you’re paying very close attention, you might notice that (2,0) and (0,2) are also eigenvectors, as are (3,0) and (0,3), etc. In fact, any nonzero multiple of an eigenvector is also an eigenvector, with the same eigenvalue. (The zero vector itself is excluded by convention, since it's trivially scaled by every factor.) So when we list out the eigenvectors of a transformation, we usually just list one eigenvector from each “family” of eigenvectors, with the understanding that any nonzero multiple of a given eigenvector is also an eigenvector.)

Eigenvectors and eigenvalues are useful because it’s really simple to describe what the transformation does to them—it just scales them by some factor! So if we apply the transformation many times to an eigenvector, it’ll just get scaled by the same amount over and over again. In contrast, it can be difficult to predict what applying a transformation to an arbitrary vector many times will do to that vector. Moreover, if a transformation happens to have enough eigenvectors, then we can actually completely characterize the transformation by taking note of how it transforms all of its eigenvectors—i.e. by noting all of its eigenvectors and eigenvalues. This can make repeatedly applying the transformation very easy, since we can just reduce a more complicated transformation into simple scaling of certain vectors. (This is the principle of diagonalization.)
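The repeated-application point can be sketched like this. The stretch matrix and starting vector are my own illustrative choices: applying the transformation n times to an eigenvector just multiplies it by the eigenvalue n times.

```python
def apply(matrix, v):
    """Multiply a 2x2 matrix (pair of rows) by a 2D vector."""
    (a, b), (c, d) = matrix
    x, y = v
    return (a * x + b * y, c * x + d * y)

stretch = ((2, 0), (0, 3))  # doubles x-components, triples y-components
v = (1, 0)                  # eigenvector with eigenvalue 2

# Apply the transformation five times: each pass just doubles the vector.
for _ in range(5):
    v = apply(stretch, v)

print(v)  # -> (32, 0), i.e. 2**5 times the original (1, 0)
```

Predicting five applications took no matrix arithmetic at all, just computing 2**5; that's the simplification diagonalization exploits.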
