In math, tensors are, like matrices, a way to write down lots of large, similar equations that differ only in their coefficients. Tensors are a generalization of matrices and vectors: a vector is a 1-D array (a row or column) of numbers, a matrix is a 2-D array (several rows), and a tensor can be 3-D or more.
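The "array of arrays" idea above can be sketched with plain nested Python lists (a minimal illustration; real numerical work would use a library like NumPy):

```python
# A vector, a matrix, and a tensor as nested Python lists.
vector = [1, 2, 3]                      # 1-D: a single row of numbers
matrix = [[1, 2], [3, 4]]               # 2-D: several rows
tensor = [[[1, 2], [3, 4]],
          [[5, 6], [7, 8]]]             # 3-D: a stack of matrices

def depth(x):
    """Nesting depth of a list = number of dimensions."""
    return 1 + depth(x[0]) if isinstance(x, list) else 0

print(depth(vector), depth(matrix), depth(tensor))  # 1 2 3
```

Each extra level of nesting adds one dimension, which is all "3-D or more" means here.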
You can use a vector to write down one equation, like a sum of several variables equaling zero. You can use a matrix to write down a system of equations (several equations that all have to be satisfied at the same time) as a single equation, where a vector multiplied by a matrix yields another vector. Likewise, a tensor can be used to write down a system of matrix equations as a single tensor equation.
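Here is a small made-up example of that "matrix times vector" shorthand in plain Python (the numbers are hypothetical, chosen just for illustration):

```python
# The system of two equations
#   2x + 1y = 5
#   1x + 3y = 10
# can be written as one matrix equation: A times (x, y) equals (5, 10).
A = [[2, 1],
     [1, 3]]
v = [1, 3]   # the solution: x = 1, y = 3

def matvec(M, x):
    """Multiply matrix M by vector x: each row of M becomes one equation."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

print(matvec(A, v))  # [5, 10]: both equations satisfied at once
```

One multiplication checks every equation in the system simultaneously, which is exactly the compression the answer describes.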
The core of linear algebra (the part of math that deals with vectors, matrices, and tensors) is writing down extremely large equations in fewer symbols, and then operating on those equations. Linear algebra also sometimes provides a "geometric interpretation" of vectors and matrices, and a "statistical interpretation" about whether variables are related or not, but those are just applications of matrices.
Tl;dr: think of tensors as bigger matrices.