Computers are based on electricity and chemistry. The smallest meaningful unit inside a computer uses chemistry (semiconductor materials) to direct electricity. This is a logic gate, which uses high or low voltage to represent two states (binary) and, when chained with other gates, can express rich logical concepts. This is also called “digital,” in that the states are discrete rather than continuous (high/low instead of 0%–100%); the advantage is much cleaner signal reproduction (if your signal reads “high,” there’s a pretty low chance it was intended to be “low”). Similarly, “memory,” or stateful information, is stored digitally, for example by charging capacitors into a high or low state.
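To make “chained together to express rich logical concepts” concrete, here is a toy sketch in Python. The gate definitions are standard, and the half-adder built from them is a classic textbook construction (not something from the text above): chaining a few simple high/low gates yields a circuit that adds two one-bit numbers.

```python
# A toy simulation of digital logic: each gate maps high/low (1/0)
# inputs to a single high/low output.

def nand(a, b):
    """The basic gate: output low only when both inputs are high."""
    return 0 if (a and b) else 1

# Every other gate can be built by chaining NAND gates alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# Chaining gates expresses a richer concept: adding two 1-bit numbers.
def half_adder(a, b):
    return xor(a, b), and_(a, b)  # (sum bit, carry bit)

print(half_adder(1, 1))  # → (0, 1), i.e. 1 + 1 = binary 10
```

Real hardware does the same thing with voltages instead of Python integers, and with billions of gates instead of a handful.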
Brains also use electricity and chemistry. The primary mechanism is the neuron, a type of cell that can receive and send electrical signals. The neuron, modulated by a variety of information – its input signals, neighboring cells, temperature, and chemicals – functions like a gate in deciding whether to propagate an electrical signal (by allowing ions to pass through its cell membrane). Memory, or stateful information, is thought to be stored mechanistically by reconfiguring synapses, the connections between neurons. The “whether to fire” decision can be considered digital, but because a signal’s strength can also vary as it propagates, the brain is considered partially analog as well.
In truth, the emergent behavior of computers is much easier to study. First, because we designed them from scratch, we know exactly how they work to begin with. Second, their digital nature makes analysis dramatically cleaner and simpler. So although I’ve described simple mechanisms for how the brain works – i.e., ways in which it can feasibly represent logic and encode state – it’s not yet clear how complete that picture is.
Bonus info – you’ve probably heard of artificial intelligence in the news lately. The core technology behind the latest advances can be traced to the artificial neuron, which is built from the same parts as the rest of a computer, but which originally mimicked what we thought, decades ago, was how brains worked. It turns out the thing we mimicked is not how brains actually work, but that feature still ended up being arguably the top reason modern AIs are so powerful today.
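For the curious, an artificial neuron is simple enough to sketch in a few lines. This is the classic threshold-unit idea (the weights and threshold below are hand-picked for illustration, not from any trained model): weighted inputs are summed, and the neuron “fires” only if the total crosses a threshold – a loose imitation of the “whether to fire” decision described above.

```python
# A minimal artificial neuron: sum weighted inputs, fire if the
# total crosses a threshold. Weights/threshold are illustrative.

def neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0  # fire (1) or stay silent (0)

# With these hand-picked weights the neuron acts like an AND gate:
print(neuron([1, 1], [0.6, 0.6], 1.0))  # → 1 (fires)
print(neuron([1, 0], [0.6, 0.6], 1.0))  # → 0 (stays silent)
```

Modern AI systems connect millions of these units and *learn* the weights from data, rather than picking them by hand.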