A lot of the radiation from a nuclear bomb is released immediately in the initial "flash" of the explosion. Running a test underground or underwater blocks direct exposure to that flash, because it gets absorbed by the rock or water surrounding the bomb.
Another radiation concern is the radioactive byproducts of the blast. In a properly conducted underground test, the explosion is contained within a layer of rock, so all the radioactive material is locked below ground where it can't harm people. As long as you don't dig down to the blast site, and groundwater doesn't seep through it, the radioactive material stays trapped very well.

For an underwater blast, the radioactive material isn't nearly as well contained. For one thing, it can contaminate the water and the sea floor, where it gets absorbed by sea life. Underwater tests also tend to throw up a huge column of water and spray at the surface, blowing radioactive water and dust into the air. In one of the more infamous Pacific tests, Castle Bravo, a hydrogen bomb detonated on a reef at Bikini Atoll vaporized the coral and blew its radioactive calcium into the sky, where it rained down like snow over hundreds of square miles. An unlucky Japanese fishing boat, the Lucky Dragon 5, was caught within the fallout zone, and the fishermen were coated in the strange dust. It was highly radioactive and made them very sick within days.
To my knowledge, underwater tests were never performed to be "safer" than above-ground tests. They were done to see how well naval ships could stand up to nuclear blasts, and because it's a lot easier to find an unoccupied stretch of water than an unoccupied stretch of land. When the US switched to "safer" testing under the 1963 Limited Test Ban Treaty, it stopped doing underwater, above-ground, and high-altitude tests and tested only underground. The US stopped testing nuclear bombs entirely in 1992.