ELI5: What are eigenvalues and eigenvectors, and how are they useful?

7 Answers

Anonymous 0 Comments

Eigenvalues and eigenvectors are mathematical concepts that are used to study linear transformations. A linear transformation is a function that maps one vector to another in a way that preserves vector addition and scalar multiplication (geometrically: lines through the origin stay straight, and the origin stays put).

An eigenvector is a special vector that, when transformed by a linear transformation, only changes in magnitude (i.e., it gets stretched or shrunk) but not in direction. An eigenvalue is the scalar value that represents how much the eigenvector gets stretched or shrunk by the linear transformation.
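
As a concrete check of that definition, here is a minimal sketch assuming Python with NumPy (the matrix A is just an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # an arbitrary example matrix

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    # Transforming an eigenvector only rescales it by lam:
    print(lam, np.allclose(A @ v, lam * v))  # prints True for each pair
```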

Eigenvalues and eigenvectors are useful because they help us understand how a linear transformation affects vectors in space. They can be used to find important features of a matrix, such as its determinant (the product of its eigenvalues) and trace (their sum), which are important in many areas of mathematics and science.

For example, in physics, eigenvalues and eigenvectors are used to study the behavior of quantum mechanical systems. In engineering, they are used to analyze the stability of structures under different loads. In computer science, they are used in machine learning algorithms to reduce the dimensionality of data and extract important features.

Anonymous 0 Comments

Okay, imagine you have a magic mirror. When you look at yourself in that mirror, you see something special. An eigenvalue is like a special number, and an eigenvector is like a special direction in that mirror.

The magic part is that when you change into different clothes, the special number (eigenvalue) stays the same, and the special direction (eigenvector) still points in the same way.

Now, why is this useful? It’s like having a favorite toy, and you want to understand how it works. Eigenvalues and eigenvectors help scientists and smart grown-ups understand and solve big problems, like making cool things in computers and science. They’re like the magic tools to understand and solve special problems.

Anonymous 0 Comments

Say you have a shape. It can be a circle, a square, a triangle, whatever.

Draw an arrow on that shape.

Now change the shape in some way. You can rotate it, flip it over, stretch bits, squash bits, whatever. All the things you do to it collectively make up the “transformation” you’ve applied to the shape.

If that arrow (*vector*) is still pointing the same direction* after the transformation, then that arrow is an *eigenvector* of the transformation. Transformations can have multiple eigenvectors. For example, imagine *s t r e t c h i n g* your shape out to the left and right. Well, that doesn’t change what direction a left- or right-pointing arrow is pointing, so those are both eigenvectors.

Now, the arrow might have gotten stretched or squished in the process, but as long as it’s still pointing the same direction*, it’s an eigenvector. The *eigenvalue* is the factor by which the eigenvector got stretched or squished – if it got stretched out and now it’s twice as long as it was before, then that eigenvector has a corresponding eigenvalue of 2.
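
To make the stretch example concrete, here is a quick sketch (Python/NumPy; the stretch matrix is a made-up illustration): stretching everything horizontally by 2 leaves left/right arrows pointing the same way, just twice as long, while diagonal arrows change direction.

```python
import numpy as np

# Stretch the plane horizontally by a factor of 2; leave vertical alone.
stretch = np.array([[2.0, 0.0],
                    [0.0, 1.0]])

right = np.array([1.0, 0.0])  # a right-pointing arrow
up = np.array([0.0, 1.0])     # an up-pointing arrow
diag = np.array([1.0, 1.0])   # a diagonal arrow

print(stretch @ right)  # [2. 0.] -- same direction, twice as long: eigenvalue 2
print(stretch @ up)     # [0. 1.] -- unchanged: eigenvalue 1
print(stretch @ diag)   # [2. 1.] -- direction changed: not an eigenvector
```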

Eigenvectors are an example of an *invariant* – something that doesn’t change amidst a bunch of other things that do change. These often hint at some sort of fundamental property of the underlying system you can exploit to deduce other information from. It’s a bit like, say, the law of conservation of mass – it’s useful because it lets you *derive other facts*, like being able to balance a chemical equation by ruling out the possibility that atoms can just appear or disappear at random.

Eigenvectors are basically the conservation laws of – not chemistry, but the math field of *linear algebra*. If you have a really complicated math or physics thing that can be turned into a linear algebra problem, then it’s useful just in general to keep in mind the law of conservation of eigenvectors(tm).

(* *Technically*, it’s an eigenvector if it *lies on the same line* as before, not necessarily facing the same direction. It can flip to the exact opposite direction and that still counts as an eigenvector; the corresponding eigenvalue will just be negative in that case. Put another way, a vector is an eigenvector if the transformation just manages to scale it by some constant factor (the eigenvalue).)
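
The flip case in that footnote is easy to see with a reflection (again a sketch in NumPy; the matrix is just an illustrative choice):

```python
import numpy as np

# Reflect across the y-axis: x flips sign, y stays put.
reflect = np.array([[-1.0, 0.0],
                    [ 0.0, 1.0]])

right = np.array([1.0, 0.0])
print(reflect @ right)  # [-1.  0.] -- flipped to the exact opposite
                        # direction: still an eigenvector, eigenvalue -1
```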

Anonymous 0 Comments

A very intuitive explanation of one application of eigenvalues and eigenvectors:

The eigenvectors (of your data’s covariance matrix) “point” in the directions your data is spread out, and the eigenvalues tell you how strongly the data is spread in each of those directions.

For example let’s say I have a point cloud in the shape of a book like the green image here: https://www.semanticscholar.org/paper/Tutorial%3A-Point-Cloud-Library%3A-Three-Dimensional-6-Aldoma-M%C3%A1rton/53f85aa476673f50c7a4c255c46cde461d3ba949/figure/0

The eigenvectors of this data (generally) will point in the same directions as the faces of the book, as shown in the picture. The largest eigenvalue will correspond to the up direction because the book is longest in that direction. The second-largest eigenvalue will correspond to the left/right direction, and the smallest eigenvalue to the direction coming out of the book’s face, since the book is thinnest that way.

This is helpful because it works even if I don’t know the original orientation of the book.
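
A rough sketch of that idea in Python/NumPy (the synthetic “book-shaped” cloud and its proportions are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A box-shaped point cloud: longest along z, thinnest along x,
# then rotated so we "don't know" its original orientation.
points = rng.normal(size=(1000, 3)) * [0.1, 1.0, 3.0]
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
points = points @ rot.T

# Eigen-decompose the covariance matrix of the cloud.
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(points, rowvar=False))

# eigh sorts eigenvalues in ascending order, so the last column is the
# direction of largest spread -- the "longest side of the book".
print(eigenvalues)          # smallest spread ... largest spread
print(eigenvectors[:, -1])  # roughly [0, 0, 1]: the long axis, recovered
```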

See more at the page on Principal Component Analysis: https://en.wikipedia.org/wiki/Principal_component_analysis?wprov=sfla1

Anonymous 0 Comments

There are good answers here already that explain how useful these concepts are and how they relate to linear algebra, physics, engineering etc. I will try to add some pure geometric intuition that hopefully fundamentally addresses why eigenvectors are so important. **The key concept is orthogonality.**

Imagine the standard two-dimensional mathematical graph with an x and a y axis. We always draw the y axis perpendicular (at 90 degrees) to the x axis. Think about why we do this. Why not draw the axes at 45 degrees to each other, or some other angle? We could still cover the entire 2D plane with two lines at a non-zero angle and form a coordinate system, so what is special about 90 degrees? The answer is that, if the axes are perpendicular, the system has the useful property that we can vary the x value without affecting the y value (and vice versa). They are independent dimensions. Another word for this concept is orthogonality. **Geometrically, if two lines (or vectors) are orthogonal, then they are at 90 degrees to each other in space.**
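
In vector terms, “orthogonal” just means the dot product is zero; a tiny sketch (Python/NumPy) of the x and y axes:

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(np.dot(x, y))  # 0.0 -- zero dot product = perpendicular = orthogonal
```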

If we think about the x and y axes as vectors, they are **x** = [1, 0] and **y** = [0, 1] respectively. Let’s put these together into a matrix: **A** = [[1, 0], [0, 1]]. Now look at the standard equation used to solve for the eigenvectors and eigenvalues: **Ax** = λ**x**. This equation is asking: “For the matrix **A**, what vector(s) **x** can we matrix-multiply by **A** such that it is equivalent to just multiplying **x** by a single number λ?”

Every vector is an eigenvector of the identity matrix, but if we have a more complicated matrix like **B** = [[a, b, …], [c, d, …]], then it’s more interesting. There may be no (real) solutions, or there may be a finite number of eigenvalues, each with its own line of eigenvectors. If so, we can find those vectors and use them to simplify calculations involving **B**.

**Eigenvectors are useful because they turn bad-guy matrix multiplication into good-guy scalar multiplication.**
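
One sketch of what that buys you (Python/NumPy; the matrix A is an arbitrary example): applying A over and over to an eigenvector is just raising its eigenvalue to a power.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
lam, v = eigenvalues[0], eigenvectors[:, 0]

# Ten matrix multiplications, the bad-guy way...
x = v.copy()
for _ in range(10):
    x = A @ x

# ...equal one good-guy scalar multiplication by lam**10.
print(np.allclose(x, lam**10 * v))  # True
```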

If a *symmetric* matrix has multiple eigenvectors with distinct eigenvalues, they will all be orthogonal to each other (for general matrices this isn’t guaranteed). If we have some arbitrary matrix **C** made up of N vectors with several dimensions, many of which are not orthogonal to each other, then it would be useful if we could find a smaller number of vectors M < N that can be put together to cover the entire space more efficiently. In other words, a set of orthogonal vectors that can form a new, simplified coordinate system that looks like the nice and comforting x-y graph.

If any system (mathematical, physical, economic, social, whatever) can be described with a mathematical model, and eigenvectors exist in that model, then they can be used as fundamental building blocks that make understanding the behaviour of the system much simpler. Eigenvalues then tell you about the relationship between the eigenvectors, e.g. which one is the biggest (is there dominant behaviour?), whether they have polarity (positivity or negativity), and whether they oscillate at different rates (complex eigenvalues).
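
For the oscillation point, a sketch (with a pure rotation picked as the example): a rotation has no real eigen-directions, and NumPy reports a complex-conjugate pair of eigenvalues whose angle encodes the rotation rate.

```python
import numpy as np

theta = np.pi / 6  # rotate by 30 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigenvalues, _ = np.linalg.eig(R)
print(eigenvalues)            # cos(theta) +/- i*sin(theta)
print(np.angle(eigenvalues))  # +/- 0.5236 rad, i.e. +/- 30 degrees
```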

**TL;DR: If you want to simplify a nasty problem into something you can explain to a 5-year-old who can do multiplication, you need eigenvectors.**

Anonymous 0 Comments

Take some linear operator on a finite-dimensional space, like good old 3D Euclidean geometry, where an operator is some transformation. The 3D space is a vector space where each point is described by a vector pointing to it from the origin.

The operator transforms these vectors: rotates, mirrors, stretches, squishes, projects, etc., or any combination of these. For every transformation there are special sets of vectors that only get stretched or squished into a number multiple of themselves. This number multiple is the eigenvalue.

Av=sv

where A is an operator whose whole effect on the vector v is to take v into s×v. Such vectors are eigenvectors.

There is always a whole subspace that transforms like this, so for an eigenvalue s the eigenvectors form a line: one vector and any number multiple of it is an eigenvector. There are so-called degenerate cases where multiple eigenvalues are equal. The 3D identity, which does nothing, has 3 eigenvalues, all equal to 1. In these cases the subspace of eigenvectors has dimension greater than 1: A takes a full plane of vectors (or all of space) to the eigenvalue multiple of themselves.

An operator can be represented as a matrix given some coordinate system (or rather, a basis). In this case we can subtract the eigenvalue from the main diagonal of the matrix. What this does is simple: the main diagonal tells you how the operator stretches the vectors, so if you take an eigenvalue out of it, one direction gets squished to 0. This means that your, let’s say, 3D space gets flattened onto a 2D plane.

The determinant of the operator tells you how much it morphs a volume. Look at a cube described by 3 perpendicular unit vectors: the cube has unit volume. A determinant of 2 means the operator transforms a unit volume into a 2-unit volume, and the same goes for any shape. If we zero out a direction, the determinant will be 0, as the operator flattens any volume. So for an eigenvalue s we get the equation det(A - sI) = 0, called the characteristic polynomial, since for an N×N matrix det(A - sI) is an N-th order polynomial in s.

So for the 3D case we will have 3 solutions for s: by the fundamental theorem of algebra, any N-th order polynomial has N (possibly repeated) solutions, and when some of them coincide, that’s the degenerate case. So we have 3 eigenvalues, and now we just have to actually subtract each one from the diagonal and solve the (A - sI)v = 0 equation. The operator A with s subtracted takes some vector v to 0. This is a system of N equations of which only N-1 are independent, since we have one degree of freedom: any number multiple of v is also a solution. (A dependent equation just holds no additional information.) And now we have the eigenvectors.
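
Here is a sketch of that algorithm done “by hand” in Python/NumPy, next to the library routine (the 3×3 matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

# Step 1: the eigenvalues are the roots of the characteristic polynomial
# det(A - s*I) = 0; np.poly extracts its coefficients from A directly.
s_values = np.roots(np.poly(A)).real  # real here, since A is symmetric
print(np.sort(s_values))              # [ 1.  2. 11.]

# Step 2: for each eigenvalue s, solve (A - s*I) v = 0, i.e. find the
# null space of A - s*I; the SVD's last right-singular vector spans it.
for s in s_values:
    _, _, Vt = np.linalg.svd(A - s * np.eye(3))
    v = Vt[-1]
    print(s, np.allclose(A @ v, s * v))  # True: v is an eigenvector

# The library call does both steps at once:
print(np.linalg.eig(A)[0])
```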

So yeah, given a matrix, the solution is algorithmic. But the problem comes up all the time. If you want to apply functions to operators, you have to do a projector decomposition, which requires you to solve the eigenvalue problem. In physics you can solve a system of coupled springs using a matrix-vector formalism, which again requires solving an eigenvalue problem. Schrödinger’s equation is the eigenvalue problem of the Hamiltonian operator, Hv = Ev, where v is the wavefunction and the energies E are the eigenvalues of H. These problems also show up frequently in data analysis.
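
As a sketch of the coupled-springs case (unit masses and unit spring constants chosen purely for illustration): for two masses between two walls, Newton’s law gives x'' = -Kx, and the eigenvalues of K are the squared normal-mode frequencies.

```python
import numpy as np

# Two unit masses joined by three unit springs (wall-mass-mass-wall).
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

omega_squared, modes = np.linalg.eigh(K)
print(np.sqrt(omega_squared))  # mode frequencies: 1.0 and sqrt(3)
print(modes)  # columns: in-phase mode ~[1, 1], out-of-phase mode ~[1, -1]
```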

You can think of eigenvectors as directions that are natural for that operator. So the way to look at its applicability is this: if you have an operator, its eigenvalue problem will be of interest to you more often than not.

Anonymous 0 Comments

Imagine a spinning top. The direction of its axis is an eigenvector of the spinning motion: it’s the one direction the rotation doesn’t move. And because the axis isn’t stretched or squished at all, its eigenvalue is 1.
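
In matrix terms (a sketch assuming NumPy): a 3D rotation about the z-axis leaves the z direction alone, so the spin axis really is an eigenvector with eigenvalue 1.

```python
import numpy as np

theta = 0.3  # any rotation angle about the z-axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

axis = np.array([0.0, 0.0, 1.0])  # the top's spin axis
print(R @ axis)  # [0. 0. 1.] -- unchanged: eigenvector with eigenvalue 1
```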