During WW2, the U.S. dropped not one, but two nuclear bombs on cities within Japan. These bombs, despite being relatively weak compared to modern nuclear weapons, did considerable damage to the cities of Nagasaki and Hiroshima. This inspired one reader to ask, “Are Nagasaki and Hiroshima still radioactive?”
The short answer to this question is “no”: Nagasaki and Hiroshima are not radioactive. The long answer requires a bit of explanation, which we’ve provided below.
A nuclear bomb’s energy & destructive power come from the splitting of atoms (usually uranium or plutonium). The split halves are known as fission products, and they’re what’s left over after a nuclear chain reaction. These waste products are highly radioactive, and their lethality varies from “will kill you within a matter of hours” to “will slightly raise your risk of getting cancer over a few decades”. Luckily, the more intensely radioactive something is, the less time it sticks around. This means the extremely lethal variety will only be radioactive for a few hours to a couple of weeks.
The less energetic variety can stick around for years. Some isotopes, like plutonium-239, have a half-life of roughly 24,000 years. While exposure to this type of radiation won’t immediately kill you, chronic long-term exposure can raise your risk of cancer and cause other health-related problems.
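This tradeoff between intensity and longevity falls straight out of the radioactive decay law, N(t) = N₀ · (1/2)^(t / half-life). Here is a minimal sketch of that formula; the half-life values used are real figures for iodine-131 and plutonium-239, but the comparison itself is just an illustration:

```python
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Fraction of a radioactive isotope left after t_years, per N(t) = N0 * (1/2)**(t / T)."""
    return 0.5 ** (t_years / half_life_years)

# Iodine-131 (half-life ~8 days): intensely radioactive, mostly gone within a month.
short_lived = fraction_remaining(30 / 365, 8 / 365)

# Plutonium-239 (half-life ~24,000 years): weakly radioactive, lingers for millennia.
long_lived = fraction_remaining(30 / 365, 24_000)

print(f"iodine-131 left after 30 days:    {short_lived:.1%}")   # roughly 7%
print(f"plutonium-239 left after 30 days: {long_lived:.5%}")    # essentially all of it
```

The same total number of decays has to happen either way; a short half-life just crams them into a brief, intense burst, while a long half-life spreads them thinly across millennia.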
The radiation risks faced by civilians living in the two Japanese cities were twofold. The biggest threat came from what is known as prompt radiation: the high-energy neutrons and gamma rays released at the instant a nuclear device detonates. This radiation is extremely short-lived, occurring only during the initial bright flash of light that accompanies a nuclear explosion.
To receive a deadly dose of this type of radiation, a person would have to be extremely close, within about 2 miles of the bomb’s detonation point. At that distance, however, flying debris, collapsing buildings and the explosion itself are bigger threats to life. It is estimated that 20% of the deaths at Nagasaki & Hiroshima were the result of prompt radiation.
When most people think about nuclear explosions and radiation, they’re most likely thinking about the secondary residual radiation, which can occur in two ways. The first is caused by the high-energy neutrons emitted during the initial blast. Neutrons are known for their ability to make other elements radioactive: during the prompt radiation phase, a high-energy neutron can slam into the nucleus of another atom and leave it radioactive for a short while, a process known as neutron activation. Fortunately, this type of radioactivity is extremely brief and only poses a risk near the epicenter of the blast.
The second way residual radiation happens is through fallout.
When a nuclear bomb explodes, a fireball is created which contains most of the radioactive fission products and unreacted nuclear fuel. This fireball is what forms the head of the iconic mushroom cloud. Where this fireball forms is the determining factor for fallout risk.
A bomb which detonates near or on the ground has a greater chance of producing radioactive fallout than one which is detonated high in the air.
If a bomb is detonated in the air, like the two dropped on Japan, the hot, radioactive ball of fire travels high into the stratosphere. It does this quickly, usually within minutes. The cloud then cools down and begins to look like a regular (albeit irregularly shaped) cloud. Don’t let this fool you, though: it is still hot and radioactive. Prevailing winds will blow the cloud over a huge area. The residual heat and the lightness of the particles keep it in the atmosphere for a few weeks, after which the particles begin to “fall out” and come back down to earth. By this time, the radioactive particles have been dispersed and diluted over thousands of square miles, with the most dangerous radioactive elements already rendered inert by decay.
Health risks posed by this type of fallout are negligible and are generally indistinguishable from the standard low-level background radiation everyone receives simply by living.
A nuclear bomb detonating on or near the ground creates a vastly different scenario. The fireball created by the explosion draws a large amount of debris and soil up into its mushroom cloud. The dirt mixes with the radioactive elements, becoming radioactive in the process. Instead of being finely dispersed in the air, the radioactive material sticks to the dirt in particles that remain quite large: you could see them with a microscope, or even with the naked eye. Because these particles are heavy, they can “fall out” of the cloud within hours.
These kinds of explosions create the fallout plumes we associate with nuclear testing: large areas of land made radioactive, posing short-term hazards for people in the area and creating long-term contamination issues.
What About Japan?
The bombs dropped on Japan were detonated high in the air, so the radioactive fireball never touched the ground. This dramatically reduced the radioactive fallout. The U.S. didn’t do this out of consideration, however; that altitude simply happened to be the one that maximized destruction of the structures within the cities. Today, Hiroshima and Nagasaki’s radiation levels match the world average background radiation of 0.87 mSv per year.
Bonus Fact: A Japanese man by the name of Tsutomu Yamaguchi was in Hiroshima on a business trip the day the first atomic bomb was dropped. He was injured, but managed to survive the blast. He returned to his home in Nagasaki only to survive another nuclear explosion 3 days later.