First, by estimating how close different components can safely be, and then avoiding putting them any closer than that.
Second, they might lower the voltage so that components can be placed closer together.
Third, lots and lots of testing. If it works, it works. If it doesn’t work, then maybe they were a bit too close together, and separating them a bit more will fix it.
Specialized design, simulation, modeling, testing, and careful material design. They make sure that the traces are the right shape and length so as to not act like antennas, using ground layers and traces to shield, and more design requirements. Things like the encapsulation materials are specially designed to restrict the propagation of fields.
There are several strategies used.
The first is modeling. The materials and their characteristics are actually pretty well defined, so you can apply models to calculate this. A little iteration can produce a chip module with a lot of built-in protections for all users.
A lot of this has actually already been done and is pretty well known. You will find the chip itself has these ‘distances’ already factored in through its packaging and shielding. There are also comprehensive datasheets for the devices most susceptible to this interference.
There is a lot of ‘reusing’ the old data here too. Reusing old layouts and geometries on new products.
The second set of methods is more of a strategy for operation.
Digital systems run on analog signals. The old TTL (Transistor-Transistor Logic) standard is defined on a nominal 5V scale: it is literally a signal of up to 5V being sent. The standard says any input from 0V to 0.8V is a logic low, and anything from 2.0V to 5V is a logic high. The middle, 0.8-2.0V, is undefined.
This standard allowed a fair bit of ‘noise’ to be injected without impacting the data being sent. There are different logic standards out there.
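As a rough illustration of why those voltage bands tolerate noise, the idea can be sketched like this (using the classic 5V TTL input thresholds; real parts have more nuance):

```python
def ttl_input_level(voltage):
    """Classify a voltage against classic 5V TTL input thresholds.

    Below 0.8 V reads as logic low, above 2.0 V as logic high;
    anything in between is undefined and must not be relied on.
    """
    if voltage <= 0.8:
        return "low"
    if voltage >= 2.0:
        return "high"
    return "undefined"

# A clean 0 V signal with 0.5 V of injected noise still reads as low,
# and a 5 V signal sagging by half a volt still reads as high:
print(ttl_input_level(0.0 + 0.5))  # low
print(ttl_input_level(5.0 - 0.5))  # high
print(ttl_input_level(1.5))        # undefined
```

The gap between the bands is the noise margin: interference smaller than that margin simply doesn't change the logic value.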
In that same line of thought, for data transmission between devices a ‘checksum’ can be included. This checksum is merely a quick mathematical operation on the data packet. It is calculated at both the sender and the receiver; the receiver compares its calculation to the sender’s to see whether data corruption occurred.
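A minimal sketch of the checksum idea (using a simple sum-of-bytes checksum for illustration; real links typically use CRCs, but the principle is the same):

```python
def checksum(data: bytes) -> int:
    # Simple additive checksum: sum of all bytes, modulo 256.
    return sum(data) % 256

packet = b"hello"
sent_checksum = checksum(packet)  # computed at the sender

# ... packet travels over a noisy link ...
received = b"hellp"               # one byte corrupted in transit

# The receiver recomputes the checksum and compares:
if checksum(received) != sent_checksum:
    print("corruption detected, request retransmission")
```

If the checksums disagree, the receiver knows interference corrupted the data and can ask for it to be sent again.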
In short, the electronics maker uses simulation and modeling, coupled with manufacturers’ specifications and past use cases, to do layouts. Manufacturers also employ data transmission standards designed to tolerate some level of interference, which is bound to occur.
Every conductor through which a current passes has an electromagnetic field. This is law. And, by extension, integrated circuits have electromagnetic fields, and those fields can interfere with the circuit’s components.
The engineers are careful to protect the components through different methods that I find impossible to ELI5.
The EM fields can be calculated, and where engineers consider there is a need for EMI protection they implement one of those measures. That, for example, can be done with capacitors that interrupt the circuit (there is no direct current passing through a capacitor, because it is basically two metal plates separated by an insulator. One plate is charged by the current, and through energy transfer the second plate is charged, letting current flow through the rest of the circuit.)
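The reason a capacitor can shunt high-frequency noise while blocking DC can be seen from its impedance, which falls as frequency rises. A quick sketch (the 100 nF value is just a typical decoupling-capacitor size, chosen for illustration):

```python
import math

def capacitor_impedance(capacitance_f, frequency_hz):
    # Magnitude of an ideal capacitor's impedance: |Z| = 1 / (2*pi*f*C).
    # DC (f -> 0) sees infinite impedance; high frequencies see almost none.
    return 1.0 / (2.0 * math.pi * frequency_hz * capacitance_f)

c = 100e-9  # 100 nF decoupling capacitor (illustrative value)
for f in (50, 1e6, 100e6):  # mains hum, 1 MHz noise, 100 MHz noise
    print(f"{f:>12.0f} Hz -> {capacitor_impedance(c, f):12.3f} ohm")
```

High-frequency interference sees a very low impedance through the capacitor and gets diverted away from the sensitive signal path, while the DC supply is unaffected.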
Big power consumers are kept further apart to avoid EMI. For example, you don’t want your microcontroller too close to the power supply. Or if you have two chips in your integrated circuit, you want to give them space.
There are more considerations than just electromagnetic compatibility (EMC) and electromagnetic interference (EMI).
Heat is also considered. Dissipated heat can also be calculated, and of course the easiest way to protect against it is by keeping big power consumers further apart. There also exist special layers of the integrated circuit that are good heat conductors, so they dissipate energy. Microprocessors, for example, have a thermal paste and a heat-dissipating material under them, and if the chip is not soldered perfectly it can actually deteriorate because the heat is not spread properly.
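A back-of-the-envelope sketch of the kind of thermal calculation involved, using the common steady-state model T_junction = T_ambient + P × θ_ja (all the numbers below are illustrative, not from any specific part):

```python
def junction_temperature(ambient_c, power_w, theta_ja_c_per_w):
    # Steady-state junction temperature: ambient temperature plus the
    # rise caused by dissipated power flowing through the total thermal
    # resistance from junction to ambient (theta_ja, in deg C per watt).
    return ambient_c + power_w * theta_ja_c_per_w

# Illustrative numbers: 2 W dissipated, 25 deg C ambient.
# With a good thermal path (theta_ja = 20 C/W):
print(junction_temperature(25.0, 2.0, 20.0))   # 65.0 -> comfortably safe
# With a poor solder joint / no heat spreading (theta_ja = 80 C/W):
print(junction_temperature(25.0, 2.0, 80.0))   # 185.0 -> far above typical limits
```

This is why a badly soldered heat path can kill a chip: the same power dissipation produces a much larger temperature rise.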
Even more, the connecting lines between components can also cause problems like parasitic capacitance and signal reflection. Parasitic capacitance can happen if two lines are too close, and even a right angle in a single line risks creating one. As I said, a capacitor is two metal plates separated by an insulator. Line reflections happen mostly when there is a power surge in the circuit, like when turning it on: when the supply switches on, there will be a power spike in the circuit before it stabilises. Power surges, or even just normal current flow, can reflect on the line, meaning that when a signal goes from component A to component B, a very small portion comes back. The reasons behind this phenomenon are unknown to me (although I did learn about them in college), but what I do remember is that the longer the line, the worse the reflection problem can be. So it’s a challenge to get rid of EMI, heat, and reflections together.
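For the curious: reflections happen where the line’s impedance changes abruptly, and the reflected fraction is given by the reflection coefficient Γ = (Z_load − Z_line) / (Z_load + Z_line). A sketch with illustrative impedances:

```python
def reflection_coefficient(z_line, z_load):
    # Fraction of the incident signal reflected back where a
    # transmission line of impedance z_line meets a load z_load.
    return (z_load - z_line) / (z_load + z_line)

z0 = 50.0  # a common trace impedance in ohms (illustrative)

print(reflection_coefficient(z0, 50.0))   # 0.0 -> matched, no reflection
print(reflection_coefficient(z0, 75.0))   # 0.2 -> 20% of the signal reflects
print(reflection_coefficient(z0, 1e9))    # ~1.0 -> open end, full reflection
```

Matching the load impedance to the line impedance (termination) is the standard way designers suppress these reflections.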
Cost reduction matters too, so designers try to reduce the amount of material used.
There are design tools for integrated circuits that automatically take care of many of these problems, so you can solve them at the design level.
Of course, integrated circuits are EMC tested and if they are within accepted parameters they are considered good for sale.
In the EU, electronics must meet mandatory EMC compliance, and the proof of it is the small stylized “CE” mark printed on them.
This is all what I can remember from my college classes. I welcome anyone to correct me if I described something wrong.
Simulation and testing. There are tools to simulate the fields and their effects. Once you have your results, you consider your options, which are: shielding, distance, limiting currents, alternative components, and variations on the design. After this you compare it to your tolerance for errors and your capacity to correct those errors (in the signals).
Computer modeling. Chip designers use specialized software that can model electromagnetic field strength and interaction. Today, pretty much all complex products are designed on a computer before they go into production. In addition to the general design software, specialized software exists for a wide variety of simulation and modeling scenarios.
For example, have a look at the [electronics related product listing from Ansys](https://www.ansys.com/products#t=ProductsTab&sort=relevancy&layout=card&f:@productcollections=[Electronics]). You’ll see software for everything from electric motor design (which relies entirely on EM field modeling) to high-frequency signaling simulation. Ansys even makes a [specific piece of software for designing PCBs (circuit boards) and chip packages (when you put multiple chips in a single package)](https://www.ansys.com/products/electronics/ansys-siwave). Ansys isn’t the only vendor for this type of software, of course.
Hi, I design specialized high frequency chips for a living where this is an especially major challenge. I’ll try to stay true to ELI5 explanations.
In my line of work, we use full 3D simulation software on high-power computers to test chips virtually before we send designs out to be built. There are lots of different approaches and software packages that do this, each with their own advantages and disadvantages that I could talk about for hours, but that wouldn’t be a conversation for ELI5. My company spends tens of thousands of dollars every year on these software packages and devotes extremely high performance computers exclusively to this task.
In most applications at lower frequencies, they do something simpler and faster, where coupling factors are extracted between critical nodes and placed into a circuit diagram using a circuit component called a capacitor. There is then a whole array of capacitors interconnecting between the critical nodes which can then be used to loosely simulate performance to a lesser degree of accuracy that works just fine below a few GHz of operating frequency.
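A toy sketch of where those extracted coupling capacitors come from: the coupling between two facing conductors can be roughly estimated with the parallel-plate formula C = ε0 × εr × A / d (a real extractor is far more sophisticated; every number below is illustrative):

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, in farads per metre

def parallel_plate_capacitance(area_m2, distance_m, rel_permittivity):
    # Parallel-plate approximation for the coupling capacitance
    # between two facing conductors separated by a dielectric.
    return EPSILON_0 * rel_permittivity * area_m2 / distance_m

# Two 10 mm x 0.2 mm trace faces, 0.2 mm apart, in a dielectric
# with relative permittivity ~4.4 (FR-4-like) -- illustrative numbers.
area = 10e-3 * 0.2e-3
c = parallel_plate_capacitance(area, 0.2e-3, 4.4)
print(f"coupling capacitance ~ {c * 1e15:.0f} fF")  # a few hundred fF
```

Capacitances like this, placed between the critical nodes of a circuit diagram, let a fast circuit simulator approximate the coupling that a full 3D field solver would compute exactly.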
Engineers use models and simulations to predict if there will be a conflict between small circuitry placed close together. Over time, they’ve refined these models to build computer software that not only can check for conflicts, but can also help place structures.
This is fairly complicated, and is part of why computers have a “clock speed” and can adjust the voltage of different components by tiny fractions of a volt. As you can imagine, just like cars on a road, there’s a certain “speed” that things are designed to operate at – too fast and you can start to have electrons jumping (tunneling) between lanes and causing all kinds of traffic accidents.