What’s the difference between analog and digital?


I’m pretty sure that analog signals are just a continuous stream of input, versus digital, which provides signals at discrete time steps. Why have we shifted from analog to digital for so many things? Wouldn’t a steady stream of information be of better use?

In: Technology

7 Answers

Anonymous 0 Comments

> I’m pretty sure that analog signals are just a continuous stream of input, versus digital, which provides signals at discrete time steps.

Exactly right. Analog is like drawing something, and digital is like drawing that same thing via a game of connect-the-dots.

> Why have we shifted from analog to digital for so many things?

The trouble with analog is that every time you make a copy, that copy is slightly different from the original. Over multiple generations of copying, or with poorly made copying equipment, these differences can be easily seen/heard.

Digital is written entirely using on/off states to represent 1s and 0s, which are then grouped to represent larger numbers.

The great thing about bits is that you can either read the bit or you can’t, so there aren’t generational losses in copying.
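This difference can be shown with a toy sketch (the signal values and the noise level here are made up for illustration): every analog copy accumulates a little error, while every digital copy snaps each value back to 0 or 1, so the bits survive any number of generations.

```python
import random

def noisy_copy(signal, noise=0.02):
    # Each analog copy adds a small random error to every sample.
    return [s + random.uniform(-noise, noise) for s in signal]

def digital_copy(bits, noise=0.02):
    # The same noise is added, but each value is snapped back to 0 or 1,
    # so the copy comes out identical to the original.
    return [round(b + random.uniform(-noise, noise)) for b in bits]

analog = [0.0, 0.5, 1.0, 0.5]
digital = [0, 1, 1, 0]

for _ in range(100):          # 100 generations of copying
    analog = noisy_copy(analog)
    digital = digital_copy(digital)

print(analog)    # has drifted away from the original values
print(digital)   # still exactly [0, 1, 1, 0]
```

After 100 generations the analog values have wandered noticeably, while the digital bits are bit-for-bit identical to the original.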

Digital media can also be scrambled and can include “checksum” bits. Together, these tools allow electronics to detect and correct bits that were read incorrectly.

> Wouldn’t a steady stream of information be of better use?

The discrete steps in modern digital signals are so small that they’re imperceptible. For example, the human ear can hear sound frequencies from about 20 Hz to 20 kHz, and the Nyquist theorem says that a digital audio format must sample at more than double the highest frequency it needs to reproduce, so we’ve been using 44.1 kHz and 48 kHz audio for years… and on high-end products and in studios, they’re even doing 96 kHz and 192 kHz now.
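To get a feel for how fine those steps are, here is a small sketch (the 1 kHz test tone is an arbitrary choice) that samples a sine wave at the CD rate of 44.1 kHz:

```python
import math

SAMPLE_RATE = 44_100    # CD-quality audio: samples per second
FREQ = 1_000            # a 1 kHz test tone, well inside 20 Hz - 20 kHz

# One second of the tone, captured at discrete time steps.
samples = [math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

print(len(samples))           # 44100 discrete steps in one second
print(SAMPLE_RATE / FREQ)     # 44.1 samples per cycle of the tone
```

With roughly 44 samples per cycle of a mid-range tone, the “connect-the-dots” reconstruction is far finer than the ear can resolve, which is why the steps are inaudible.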
