At the time nuclear physics was in its infancy. Scientists understood that certain large, unstable atoms could split apart when struck by a neutron, releasing smaller byproducts, massive amounts of energy, and more neutrons that could go on to split other atoms in an ongoing chain reaction. This splitting process was called **fission**. Similarly, it was known that light atoms could be made to fuse together under enormous temperatures and pressures to form larger byproducts along with obscene amounts of energy. This process was called **fusion**.
Fission could be made to occur simply by piling enough of the right radioactive atoms close together. In fact, that is what the first nuclear reactors were called: "nuclear piles". It has even occurred naturally in certain geological formations on Earth! Fusion, though, required such extreme temperatures and pressures that it only happened in stars, not on Earth.
The problem was that by building and detonating a nuclear bomb, the scientists would be creating conditions on Earth that had never occurred before. The explosion would produce a blast wave hotter and more compressed than anything Earth's atmosphere had ever encountered. And our atmosphere contains significant amounts of relatively light atoms, such as nitrogen.
What some people wondered was whether that hot, super-compressed shock wave could reach the conditions required to fuse nitrogen or similar light nuclei together, releasing more energy that would make the shock wave even hotter and more compressed. If it could, it might create a self-sustaining fusion reaction that would spread across the entire atmosphere!
Scientists familiar with the energies involved could dismiss the idea without even doing any math, but there were not many such scientists, and considering the stakes, they actually did do the math. It turned out there was no real danger: the shock wave would lose energy, mostly by radiating it away, faster than nitrogen fusion could replenish it, so the reaction could never become self-sustaining.
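To get a feel for why the math was reassuring, here is a rough back-of-envelope sketch, not the actual wartime calculation: it compares the Coulomb barrier two nitrogen nuclei must overcome to fuse against the typical thermal energy of particles at a fireball temperature of roughly a hundred million kelvin (an illustrative assumption, as is the standard nuclear-radius formula used).

```python
# Back-of-envelope comparison: Coulomb barrier for N-14 + N-14 fusion
# versus thermal energy in a nuclear fireball. Illustrative numbers only.

K_E2 = 1.44        # Coulomb constant times e^2, in MeV*fm
Z = 7              # protons in a nitrogen nucleus
A = 14             # mass number of N-14
R0 = 1.2           # empirical nuclear radius constant, in fm

# Distance at which two N-14 nuclei "touch" (r = r0 * A^(1/3) each)
r = 2 * R0 * A ** (1.0 / 3.0)          # in fm, about 5.8 fm

# Electrostatic potential energy at that separation
barrier_mev = K_E2 * Z * Z / r         # about 12 MeV

K_BOLTZ = 8.617e-11                    # Boltzmann constant, MeV per kelvin
T_FIREBALL = 1e8                       # assumed fireball temperature, kelvin
thermal_mev = K_BOLTZ * T_FIREBALL     # typical thermal energy kT, ~9 keV

print(f"Coulomb barrier:   {barrier_mev:.1f} MeV")
print(f"Thermal energy kT: {thermal_mev * 1000:.1f} keV")
print(f"Barrier / kT:      {barrier_mev / thermal_mev:.0f}x")
```

Even at a hundred million kelvin, the typical particle energy falls short of the barrier by a factor of over a thousand. Quantum tunneling and the fastest particles in the distribution let a little fusion happen at sub-barrier energies, which is why a careful calculation comparing energy gains against radiation losses was still worth doing, but the gap hints at why nitrogen is so hard to ignite.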