What is a “weight vector” / “weighted vector”, but especially as goes artificial neural networks?


I’m reading Richard Powers’ *Galatea 2.2*, >!which tells the story of an engineer and a writer developing a computer-based neural network capable of taking a piece of English literature (e.g. a sonnet) as input and outputting analysis equivalent to that of a 22-year-old!<. **In the book,** ***weight/ed vectors*** **get multiple mentions and the concept has me stumped.**


3 Answers

Anonymous

In math, a “weight” is usually just a multiplying factor of some sort. For example, if you and I decide to mow lawns, we might decide that your “weight” is 0.6 and mine is 0.4, which means you get 60% of the money and I get 40% of the money we earn as a mowing team.
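That lawn-mowing split can be written as a couple of lines of Python (the earnings figure here is made up for illustration):

```python
# Hypothetical example: weights decide each person's share of the money.
earnings = 100.0
weights = {"you": 0.6, "me": 0.4}  # weights sum to 1.0

# Each share is just earnings multiplied by that person's weight.
shares = {person: w * earnings for person, w in weights.items()}
print(shares)  # {'you': 60.0, 'me': 40.0}
```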

Weights in artificial neural networks are just values that represent the “knowledge” of the neural network.

[Here’s a comment](https://www.reddit.com/r/askscience/comments/10aqkjj/what_exactly_is_the_process_when_someone_trains/j47s2cb/?context=3) I wrote recently about NN learning.

Mathematically, a neural network takes a bunch of input data, passes that data through some mathematical functions, multiplies the results of those functions by weights, and that gives you your result.
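As a minimal sketch of that idea, here’s a single artificial “neuron” in Python: inputs get multiplied by weights, summed, and passed through a squashing function. The specific input and weight values are made up for illustration:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum: each input is multiplied by its weight, then everything
    # (plus a bias term) is added up.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid squashes the result into the range (0, 1).
    return 1 / (1 + math.exp(-z))

# Arbitrary example values -- in a real network, training would pick the weights.
output = neuron([0.5, -1.0, 2.0], [0.8, 0.1, -0.3], bias=0.2)
```

Training a network is the process of nudging those weight values until the outputs come out right.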

And these weights represent part of the “knowledge” of a neural network. They’re simply a bunch of numbers determined during the training phase, when the network is set up to answer the question “*what kind of math should I do on this set of input data so I get that set of output data?*”
