Eli5: What is the real difference between analog and digital?

1.62K views

I tried looking it up, and my brain just wasn’t making sense of it, but I haven’t had coffee yet so.


25 Answers

Anonymous 0 Comments

No real difference in principle: both can carry the same information. In practice, though, digital is noise-resistant and can be made arbitrarily precise, while analog is limited by the quality of its components and by electromagnetic interference.

Anonymous 0 Comments

An analog circuit uses the physical properties of electricity to do complicated stuff. For example, a capacitor takes time to charge up, and you can use this to create a timer. It’s like a Rube Goldberg machine but with electricity.

With a digital circuit, it’s all about maths, logic and calculations. Things don’t happen because of the cool properties of electricity; rather, they happen because of the cool properties of transistors and how they can do maths. That way, your circuit is controlled by a program, which needs to be written.

Basically digital uses logic, maths, and programs to get stuff done. Analog uses the physical properties of electricity, magnetism or even chemistry (like film photography).
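To make the capacitor-timer idea concrete, here is a small sketch of the standard RC charging formula, V(t) = V_supply · (1 − e^(−t/RC)), solved for the time the voltage crosses a trigger threshold. The component values (10 kΩ, 100 µF, 5 V supply, 3 V threshold) are made-up example numbers, not anything from the answer above:

```python
import math

# Analog-style timing: a capacitor charging through a resistor.
# All component values below are hypothetical example numbers.
R = 10_000         # resistance in ohms
C = 100e-6         # capacitance in farads
V_SUPPLY = 5.0     # supply voltage in volts
V_THRESHOLD = 3.0  # voltage at which the timer "fires"

# Capacitor voltage over time: V(t) = V_SUPPLY * (1 - e^(-t / (R*C)))
# Solving for t when V(t) == V_THRESHOLD:
t = -R * C * math.log(1 - V_THRESHOLD / V_SUPPLY)
print(f"Timer fires after about {t:.2f} s")  # about 0.92 s
```

Nothing in the circuit "computes" this; the physics of the capacitor does it for free, which is exactly the Rube Goldberg point above.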

Anonymous 0 Comments

The term you’re looking for is quantization into discrete states. A digital value has discrete states, like a flight of stairs: you are on the 9th step of a flight of 50 stairs, and if you want to increase your value there is an exact minimum amount you must go up, one step, to step 10. Analog, or continuous, values are like a wheelchair ramp: you are a fifth of the way up the ramp, and you could go up by any amount at all, a hundredth of an inch or ten feet.
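The stairs-vs-ramp idea can be sketched in a few lines: a continuous position on the ramp gets snapped to the nearest of a fixed number of steps. The 50-step count matches the analogy above; the function name is just for illustration:

```python
STEPS = 50  # the flight of stairs from the analogy

def to_step(fraction_up: float) -> int:
    """Snap a continuous position (0.0 to 1.0 up the ramp) to a stair step."""
    return round(fraction_up * STEPS)

print(to_step(0.2))     # a fifth of the way up lands on step 10
print(to_step(0.2037))  # a slightly higher ramp position snaps to the same step
```

Two different ramp positions can land on the same step; that information loss is what quantization means.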

Anonymous 0 Comments

The difference between analog and digital is how far removed you are in terms of level of abstraction from physical reality when modeling some problem. In particular, it typically refers to some kind of quantization or discretization being done.

Digital signals, for example, are artificially discretized – “chunked” in a way such that a corresponding digital system, built to a specification we also lay out, can process them.

The difference between some natural discretization of an analog signal (e.g. the raw number of photons in a timeframe) and the artificial discretization of some other, digital signal, is the intention, the control. We impose artificial rules on top of physical reality, and manage to stay within those rules. We operate on representations, instead of representing a given problem in physical reality (with actual “analogues” – hence the name).

Anonymous 0 Comments

Draw a circle with a pen on paper. This is like analogue. It is almost certainly not perfectly round, but the line is a real curve.

Now draw a circle with the circle tool in MS Paint. It will probably look more rounded, but if you zoom in far enough it is not round at all. It is made up of square pixels; the line is not curved, it is a zig-zag along the edges of squares. This is digital.

Analogue signals can be any of an infinite range of values.
Digital signals can only be one of a fixed set of values.
The size and accuracy of the digital range is called its resolution. (1-bit resolution is either 0 V or full volts.)

The higher the resolution, the more accurately it can mimic an analogue signal, but it is still a series of voltage steps, the same way a circle on a screen is a series of pixels.
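A quick sketch of what bit resolution does to a signal: the helper below (a made-up function for illustration, not any library API) snaps a value in [-1, 1] to the nearest of 2^bits levels, so 1 bit leaves only "off or full on" while 8 bits gets close to the original:

```python
import math

def quantize(x: float, bits: int) -> float:
    """Snap a value in [-1, 1] to the nearest of 2**bits evenly spaced levels."""
    levels = 2 ** bits - 1
    return round((x + 1) / 2 * levels) / levels * 2 - 1

x = math.sin(1.0)        # one sample of an "analogue" signal, about 0.84
print(quantize(x, 1))    # 1-bit: snaps all the way to 1.0 (full volts)
print(quantize(x, 8))    # 8-bit: lands within about 0.002 of the original
```

More bits means smaller steps, but steps all the same, just like more pixels make a smoother-looking but still pixelated circle.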

Analogue signals generally come from something physical: radio waves, or electricity generation, where a wire passes through a changing magnetic field. But most semiconductors can’t handle analogue signals; they are either on or off. Since modern computers are made up of semiconductors the same way a screen is made up of pixels, we have to be able to deal with both kinds of signal and convert between the two.