In math, what is a tensor? How is it related to tensorflow from machine learning?

7 Answers

Anonymous 0 Comments

A single number is called a scalar.

A set of numbers that belong together is a vector. You arrange them as a one-dimensional list.

If you add a second dimension by grouping multiple vectors you get a matrix, which, for example, allows mapping vectors to vectors.

A tensor has 3 or more dimensions, so it’s basically a generalized matrix. Tensors are useful in a bunch of calculations for 3D effects like tension (hence the name): they give you a single mathematical construct to represent forces that can be direct, shearing, or twisting.

TensorFlow is a platform that uses tensor arithmetic to make the calculations you need to train your AI easier to represent.

In computing terms: a tensor is basically a multidimensional array.
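
To make the “multidimensional array” picture concrete, here is a minimal sketch in Python (NumPy is just one possible library choice, assumed here for illustration):

```python
import numpy as np

scalar = np.array(5.0)               # rank 0: a single number
vector = np.array([1.0, 2.0, 3.0])   # rank 1: a one-dimensional list
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])      # rank 2: vectors grouped into rows
tensor = np.zeros((2, 3, 4))         # rank 3: a "generalized matrix"

for t in (scalar, vector, matrix, tensor):
    print(t.ndim, t.shape)           # ndim is the rank, shape is the size per dimension
```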

Anonymous 0 Comments

I am unable to articulate it briefly, but a video I watched was extremely helpful.

In short: tensors express relationships between basis vectors, the vectors that define a space. Depending on the rank of the tensor, you can describe forces, pressures, shear, or even scalars like temperature.

Their power comes from the fact that no matter what coordinate system (basis) you choose, everyone will agree on the results (after applying a change of basis to translate between coordinate systems).
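
Here is a small numerical sketch of that coordinate independence (Python with NumPy assumed; the rotation is an arbitrary change of basis picked for illustration):

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])

theta = 0.7                                      # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthonormal change of basis

v2, w2 = R @ v, R @ w                # same geometric vectors, new components
print(np.dot(v, w))                  # 1.0
print(np.dot(v2, w2))                # 1.0 again: the result is basis-independent
```

The components of v and w change under the rotation, but the number you compute from them tensorially comes out the same in both bases.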

Anonymous 0 Comments

A tensor is a collection of numbers. If you have a single number, the tensor has rank 0. If you have a list (an array or vector) of numbers, the tensor has rank 1. If you have a matrix (a 2D array), you have a tensor of rank 2. So a tensor can be thought of as a multidimensional array of numbers. The way tensors relate to ML is that lots of ML algorithms involve multiplying tensors. If I have a batch of RGB images as inputs, each image is a tensor of rank 3: height × width × depth, with depth 3 for the color channels. But the batch is a tensor of rank 4, because there are k images of rank 3; the tensor now has an index that specifies which image in the batch is being considered.
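
A quick sketch of those shapes (NumPy assumed; the sizes k = 32 and 64 × 48 pixels are hypothetical):

```python
import numpy as np

image = np.zeros((64, 48, 3))        # rank 3: height x width x color channels
batch = np.zeros((32, 64, 48, 3))    # rank 4: batch index first, then the image axes

print(image.ndim)                    # 3
print(batch.ndim)                    # 4
print(batch[7].shape)                # (64, 48, 3): the 8th image in the batch
```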

Anonymous 0 Comments

If you want the musical version:


https://en.wikipedia.org/wiki/User:Nobi/Finite_Simple_Group_of_Order_Two

Performed by The Klein Four.

The path of love is never smooth
But mine’s continuous for you
You’re the upper bound in the chains of my heart
You’re my Axiom of Choice, you know it’s true

But lately our relation’s not so well-defined
And I just can’t function without you
I’ll prove my proposition and I’m sure you’ll find
We’re a finite simple group of order two

I’m losing my identity
I’m getting tensor every day
And without loss of generality
I will assume that you feel the same way

Since everytime I see you, you just quotient out
The faithful image that I map into
But when we’re one-to-one you’ll see what I’m about
‘Cause we’re a finite simple group of order two

Our equivalence was stable,
A principal love bundle sitting deep inside
But then you drove a wedge between our two-forms
Now everything is so complexified

When we first met, we simply connected
My heart was open but too dense
Our system was already directed
To have a finite limit, in some sense

I’m living in the kernel of a rank-one map
From my domain, its image looks so blue,
‘Cause all I see are zeroes, it’s a cruel trap
But we’re a finite simple group of order two

I’m not the smoothest operator in my class,
But we’re a mirror pair, me and you,
So let’s apply forgetful functors to the past
And be a finite simple group, be a finite simple group,
Let’s be a finite simple group of order two
(“Why not three?”)

I’ve proved my proposition now, as you can see,
So let’s both be associative and free
And by corollary, this shows you and I to be
Purely inseparable. Q.E.D.

Anonymous 0 Comments

Math at this level isn’t really for 5-year-olds, but in math, a tensor is an element of a special bilinear product ⊗ of vector spaces V and W, built so that for any bilinear map *h* from the direct product of V and W to a different vector space Z, *h* factors through the bilinear product via a unique linear map *h’* from V⊗W to Z.

So for vectors v in V and w in W, and h bilinear: h(v,w) = h’(v⊗w) always has a unique linear solution h’ (a universal property). Here, v⊗w is called a *tensor*, and the space of all such tensors is denoted V⊗W. It requires the axiom of choice to assume the tensor product exists over infinite-dimensional vector spaces. Tensor products over finite-dimensional vector spaces give you n-dimensional arrays.

In machine learning, you’re messing with values in n-dimensional arrays to try to find an n-linear map X’ from the input space I to the outputs X(I) that approximates an unknown nonlinear map X. The process of updating a tensor with small changes over time to find X’ can be called a *flow*.
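
In finite dimensions you can check the universal property numerically. A toy sketch (Python with NumPy assumed; the matrix A and the vectors are arbitrary choices for illustration): the bilinear map h(v, w) = v·(Aw) factors through the outer product v⊗w as the linear map h’(T) = Σᵢⱼ Aᵢⱼ Tᵢⱼ.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, -1.0]])          # an arbitrary matrix fixing a bilinear map h

def h(v, w):
    return v @ A @ w                 # bilinear in v and w

def h_prime(T):
    return np.sum(A * T)             # linear in the tensor T

v = np.array([2.0, 3.0])
w = np.array([-1.0, 4.0])
T = np.outer(v, w)                   # the tensor v⊗w, here just a 2-D array

print(h(v, w))                       # 2.0
print(h_prime(T))                    # 2.0 as well: h factors through ⊗
```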

Anonymous 0 Comments

In math, tensors are, like matrices, a way to write down lots of large, similar equations that differ only in their coefficients. Tensors are a generalization of matrices and vectors: a vector is a 1D array (a row or column) of numbers, a matrix is a 2D array (several rows), and a tensor can be 3D or more.

You can use a vector to write down one equation, like a sum of several variables being zero. You can use a matrix to write down a system of equations (several equations that have to be satisfied at the same time) as a single equation where a vector is multiplied by a matrix to get a different vector. Likewise, a tensor can be used to write down a system of matrix equations as a single tensor equation.
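
For example, the two equations x + 2y = 5 and 3x - y = 1 become a single matrix equation. A minimal sketch (Python with NumPy assumed):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, -1.0]])          # coefficients, one row per equation
b = np.array([5.0, 1.0])             # right-hand sides

s = np.linalg.solve(A, b)            # solves both equations at once
print(s)                             # [1. 2.], i.e. x = 1, y = 2
```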

The core of linear algebra, the part of math that deals with vectors, matrices and tensors, is how to write down extremely large equations in fewer symbols, and how to operate on those equations. Linear algebra also sometimes provides a “geometric interpretation” of vectors and matrices and a “statistical interpretation” about whether variables are related or not, but that’s just an application of matrices.

Tl;dr: think of tensors as bigger matrices.