ELI5: What is machine learning?

Like. What is it?

Anonymous 0 Comments

Basically, you have specialised systems with specialised software, and then you throw _mountains of data_ at them. They use that data to “learn” patterns, and can then recreate those patterns and make predictions when given new data.

For example, if you let the machine study literally millions of conversations, it’ll get pretty good at predicting what words get used in response to any kind of prompt. And that’s how you get a chat program that seems very realistic.
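
To make the “predicting what words get used” idea concrete, here’s a minimal toy sketch (my own illustration, nothing like a real chatbot’s scale or architecture): it counts which word tends to follow which in a few example sentences, then predicts the most commonly observed follower.

```python
from collections import Counter, defaultdict

# Toy "conversations" standing in for the mountains of real training data.
conversations = [
    "how are you doing today",
    "how are you feeling now",
    "what are you doing today",
]

# Count how often each word follows each other word (a bigram model).
followers = defaultdict(Counter)
for sentence in conversations:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1

def predict_next(word):
    """Predict the most frequently observed word after `word`."""
    if word not in followers:
        return None
    return followers[word].most_common(1)[0][0]

print(predict_next("are"))  # -> 'you' (always followed "are" in the data)
print(predict_next("you"))  # -> 'doing' (seen twice, vs 'feeling' once)
```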

Anonymous 0 Comments

Simplest way I can phrase it:

“I say 5, you say 3”, “I say 100, you say 98”, and we repeat this millions of times with different numbers. Your assignment is to figure out the relation/formula (how much to add, subtract, multiply, or divide) between the number I tell you and the number you should answer (in this case: my number minus 2 is your number).

Now, instead of numbers, we use dog images, and the answers are dog breeds.
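
For the curious, here’s that guessing game as a minimal code sketch (my own illustration, assuming the rule has the simple form y = w·x + b): the program starts with random guesses for w and b and nudges them after every example until its answers match “my number minus 2”.

```python
import random

# Training examples: "I say x, you say x - 2".
pairs = [(x, x - 2) for x in range(-50, 50)]

# Start with arbitrary guesses for the rule y = w * x + b.
w, b = random.random(), random.random()
learning_rate = 0.0003

for step in range(30000):
    x, y_true = random.choice(pairs)
    y_pred = w * x + b
    error = y_pred - y_true
    # Nudge w and b to shrink the squared error (gradient descent).
    w -= learning_rate * error * x
    b -= learning_rate * error

print(f"learned rule: y = {w:.3f} * x + ({b:.3f})")  # approx. y = 1.000 * x + (-2.000)
```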

Anonymous 0 Comments

In a sentence, it’s the study of taking data and automatically, with computers, learning attributes, facts, and distributions about it, usually for the purposes of classification, prediction, compression, etc. Accomplishing those goals well can lead to a variety of applications, including facial recognition, content recommendation, automated content generation, chat bots, etc.

Classical machine learning is closely related to statistics in that there’s usually some underlying human-curated model which encodes some understanding of, or belief about, the data. These days the focus/hype is (deservedly so) around deep learning, where a one-line sales pitch might be “throw enough data and hardware at the problem and, even with minimal modeling assumptions, the technology will still learn properly”. This is great because it allows for: a common vocabulary and end-to-end platform for performing a variety of real-world tasks; a tool that rounds off the sharp simplifying assumptions that human-curated modeling makes; and a system built from relatively simple compute units that is massively scalable (think GPUs instead of CPUs).

An example of the above: ChatGPT has a design that is relatively simple in terms of architectural complexity, but its creators trained it on tons of text, conversations, etc., and now it can behave and converse much like a real human.
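
Here’s a minimal sketch of the contrast above (my own toy example, using plain numpy): a hand-picked straight-line model bakes in a sharp assumption about the data’s shape, while a tiny one-hidden-layer network with minimal assumptions learns the curve itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data whose true shape is curved: y = sin(x) plus a little noise.
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x) + 0.05 * rng.standard_normal(x.shape)

# Classical mindset: *assume* a straight line y = a*x + b, solve by least squares.
A = np.hstack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
line_error = np.mean((A @ np.vstack([a, b]) - y) ** 2)

# Deep-learning mindset: a tiny one-hidden-layer network, minimal shape assumptions.
W1 = rng.standard_normal((1, 16)); b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)        # hidden layer
    pred = h @ W2 + b2              # output
    err = pred - y
    # Backpropagate the squared error and take a gradient step.
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

net_error = np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)
print(f"straight-line error: {line_error:.4f}, tiny-net error: {net_error:.4f}")
```

The straight line can’t bend, so it plateaus at a high error; the network, given the same data, discovers the curve on its own.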

Anonymous 0 Comments

It’s a method for creating programs that can learn to identify rules / consistencies / common features.

You create a program and show it many pictures of a mug, and after a while it learns that having edges arranged in a certain way indicates the thing in the picture is probably a mug.

Of course it’s a little more nuanced, but in general, with pictures, it’s a complicated graph of relationships between the pixels.
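
As a toy illustration of the “edges in a certain way” idea (my own sketch, not any particular system’s method): a small filter slides across the pixels and responds strongly wherever brightness changes sharply, i.e. at an edge. A trained image classifier learns many filters like this from data instead of having them hand-written.

```python
import numpy as np

# A tiny 6x6 grayscale "picture": dark background (0) with a bright block (1),
# standing in for the outline of a mug.
image = np.zeros((6, 6))
image[1:5, 1:4] = 1.0

# A hand-written vertical-edge filter. A learned classifier would discover
# filters like this (and far more complex ones) on its own from many pictures.
edge_filter = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]])

# Slide the filter over every 3x3 patch of pixels (a convolution).
h, w = image.shape
response = np.zeros((h - 2, w - 2))
for i in range(h - 2):
    for j in range(w - 2):
        patch = image[i:i + 3, j:j + 3]
        response[i, j] = np.sum(patch * edge_filter)

# Large absolute values mark vertical edges: the left and right sides of the block.
print(np.round(response, 1))
```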
