In Math what are Tensors, as in TensorFlow?

3 Answers

Anonymous

In linear algebra, tensors are generalizations of scalars and vectors.

Scalars are just numbers, like 1, 2.5, Pi and so on.

Vectors can be thought of as arrows with a given length and direction. They are written down relative to a certain reference system called a basis. For example, say our basis has two axes, one pointing north and one pointing east, and each axis is 1 mile long from the origin. If I want to give you directions to go somewhere, I can tell you to go 1 mile north and 1 mile east. Relative to our axes, we can write that down as (1, 1), i.e. go 1 times the north axis and 1 times the east axis.
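In code terms, the pair (1, 1) only means something together with the basis arrows. A minimal plain-Python sketch (the helper names are made up for illustration):

```python
# Components + basis -> the actual arrow. Hypothetical helper names.
north = (0.0, 1.0)   # 1-mile axis pointing north
east = (1.0, 0.0)    # 1-mile axis pointing east

def combine(components, basis):
    """Sum of component * basis vector, coordinate by coordinate."""
    x = sum(c * b[0] for c, b in zip(components, basis))
    y = sum(c * b[1] for c, b in zip(components, basis))
    return (x, y)

# "1 times the north axis and 1 times the east axis":
arrow = combine((1.0, 1.0), (north, east))
print(arrow)  # (1.0, 1.0): 1 mile east, 1 mile north
```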

We might now ask ourselves what happens when we change our reference system. For example, if I made each axis twice as long, the representation of "go 1 mile north and 1 mile east" would have to become (1/2, 1/2), because the 1 mile we want to go is now only half as long as each of our reference axes. If I rotate the reference system 90 degrees clockwise, you can imagine we have to rotate the representation 90 degrees counterclockwise so it still points the same way. The representations (1, 1), (1/2, 1/2), … are vectors, and however we transform the reference system, we have to transform the vector in exactly the opposite way for it to keep the same direction and length. Vectors are therefore said to be contravariant: they change according to the inverse of the transformation applied to the reference system, or basis.
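Both cases above can be checked numerically. A sketch in plain Python, using generic x/y axes and a made-up `combine` helper, showing that components transform by the inverse of whatever we do to the basis while the physical arrow stays fixed:

```python
def combine(components, basis):
    """Reconstruct the actual arrow from components and basis vectors."""
    x = sum(c * b[0] for c, b in zip(components, basis))
    y = sum(c * b[1] for c, b in zip(components, basis))
    return (x, y)

basis = ((1.0, 0.0), (0.0, 1.0))      # two 1-mile axes
arrow = combine((1.0, 1.0), basis)    # the fixed, physical arrow

# 1) Double each axis -> components halve.
doubled = ((2.0, 0.0), (0.0, 2.0))
assert combine((0.5, 0.5), doubled) == arrow

# 2) Rotate the basis 90 degrees clockwise -> components rotate 90
#    degrees counterclockwise: (1, 1) becomes (-1, 1).
rotated = ((0.0, -1.0), (1.0, 0.0))
assert combine((-1.0, 1.0), rotated) == arrow
```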

I don’t want to go into too much detail here, but there are also objects in linear algebra that transform in other ways. Covectors, for example, transform covariantly: when you change the reference system a certain way, they change along with it, in the same way. Scalars are independent of the reference system and don’t change at all when it is transformed; they are said to be invariant.
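One way to see why covectors must transform the same way as the basis: a covector takes a vector's components and returns a number (a measurement), and that number must not depend on the reference system. A sketch under that assumption, reusing the doubled-basis example from above:

```python
# A covector eats a vector's components and returns a number.
def pair(covector, components):
    return sum(a * c for a, c in zip(covector, components))

# In the original basis, the covector (1, 0) reads off the first component.
assert pair((1.0, 0.0), (1.0, 1.0)) == 1.0

# Double the basis: the vector's components HALVE (contravariant),
# so the covector's components must DOUBLE (covariant) for the
# measurement to stay invariant.
assert pair((2.0, 0.0), (0.5, 0.5)) == 1.0
```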

Tensors are a generalization of all of these that lets us describe precisely how objects transform. A tensor has a type that tells us how contravariant and how covariant it is. A scalar is a (0,0)-tensor: neither contravariant nor covariant, because it is invariant. A vector is contravariant and thus a (1,0)-tensor; a covector is covariant and thus a (0,1)-tensor. But tensors can be more complex. They can be co- or contravariant in more than one way, so a (2,1)-tensor would, in a sense, transform twice contravariantly and once covariantly.
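A concrete middle case is a (1,1)-tensor, which can be written as a matrix (a linear map): under a basis change, its contravariant slot picks up one factor of the inverse transformation and its covariant slot one factor of the transformation itself. A sketch under the simplifying assumption of a diagonal basis change (axis i scaled by s[i]); `transform_1_1` is a made-up name:

```python
# For a diagonal basis change (axis i scaled by s[i]), the matrix M of a
# (1,1)-tensor transforms as M'[i][j] = M[i][j] * s[j] / s[i]:
# one factor of the inverse scaling (contravariant slot, index i),
# one factor of the scaling itself (covariant slot, index j).

def transform_1_1(M, s):
    n = len(M)
    return [[M[i][j] * s[j] / s[i] for j in range(n)] for i in range(n)]

M = [[1.0, 2.0],
     [3.0, 4.0]]

# Uniform scaling cancels (s[j]/s[i] == 1): M is unchanged.
assert transform_1_1(M, [2.0, 2.0]) == M

# Unequal scaling does not cancel: off-diagonal entries change.
print(transform_1_1(M, [2.0, 1.0]))  # [[1.0, 1.0], [6.0, 4.0]]
```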

Systems of tensors form interesting structures that are studied in mathematics. Also, since you specifically asked about TensorFlow: the tensors in TensorFlow have basically nothing to do with this transformation behavior; they are just multidimensional arrays.
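That is, when TensorFlow calls `tf.constant([[1, 2], [3, 4]])` a "tensor", it means an n-dimensional grid of numbers with a shape like (2, 2), with no transformation law attached. A rough stdlib-only sketch of that notion of shape (a hypothetical helper, not TensorFlow's actual implementation):

```python
def shape(nested):
    """Shape of a nested-list 'tensor': the length along each dimension."""
    dims = []
    while isinstance(nested, list):
        dims.append(len(nested))
        nested = nested[0]
    return tuple(dims)

scalar = 5                    # rank 0, shape ()
vector = [1, 2, 3]            # rank 1, shape (3,)
matrix = [[1, 2], [3, 4]]     # rank 2, shape (2, 2)

assert shape(scalar) == ()
assert shape(vector) == (3,)
assert shape(matrix) == (2, 2)
```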
