What is a “weight vector” / “weighted vector”, especially as it applies to artificial neural networks?


I’m reading Richard Powers’ *Galatea 2.2*, >!which tells the story of an engineer and a writer developing a computer-based neural network capable of taking a piece of English literature (e.g. a sonnet) as input and outputting analysis equivalent to a 22-year-old’s level!<. **In the book,** ***weight/ed vectors*** **get multiple mentions and the concept has me stumped.**


3 Answers

Anonymous

Machine learning these days is often understood as a function that takes an input and produces an output. For example, f(x, y) = 5x + 2y is a function that takes in inputs and produces an output. So if it took in the inputs x = 1, y = 3, it’d produce 5(1) + 2(3) = 11. Machine learning functions often happen in stages, with lots of values at each stage. Finding a value at each stage involves executing some rule. For example, it might be something like “each value of this stage is equal to (weight 1)\*first input + (weight 2)\*second input + (weight 3)\*third input”, and so on. This happens a few times until there is some final output, which could be interpreted as words, image pixels, or anything else. Machine “learning” happens by picking random values for all of those “weights,” then tuning them by comparing the function’s outputs against known “correct” answers until the function reliably produces matching answers.
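
To make the “stages of weighted sums” idea concrete, here’s a minimal Python sketch (my own illustration, not anything from the book): each output value of a stage is just the inputs multiplied by their weights and added up.

```python
# A minimal sketch of one "stage": each output value is a weighted sum of the inputs.
def stage(inputs, weights):
    """`weights` has one row per output value; each row holds one weight per input."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

# The example function f(x, y) = 5x + 2y is a single-output stage with weights [5, 2].
print(stage([1, 3], [[5, 2]]))  # -> [11]
```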

So, in short, the form of the machine learning function is given, but the result of training is a collection of numbers that makes the function produce “good” answers instead of random junk. Those numbers are the “weights.” For the example function above, you could think of the 5 and the 2 as the weights. Weights are usually grouped together and handled as lists of numbers called vectors, hence “weight vector.”
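
And here’s a toy illustration of the “tuning” part (again my own sketch, using a simple nudge-the-weights rule, not whatever the book’s characters use): start from random weights, compare the function’s output to known correct answers, and adjust the weights until the outputs match. The data points and learning rate below are made up for the example.

```python
import random

# Inputs paired with "correct" answers, generated from f(x, y) = 5x + 2y.
data = [((1, 3), 11), ((2, 0), 10), ((0, 4), 8), ((1, 1), 7)]

# Start with a random weight vector -- at this point the function produces junk.
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]

for _ in range(2000):
    (x, y), target = random.choice(data)
    prediction = weights[0] * x + weights[1] * y
    error = prediction - target
    # Nudge each weight slightly in the direction that shrinks the error.
    weights[0] -= 0.01 * error * x
    weights[1] -= 0.01 * error * y

print(weights)  # ends up close to [5, 2] -- the weight vector the answer describes
```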
