Copyright © 2015 Bert N. Langford (Images may be subject to copyright. Please send feedback)
Welcome to Our Generation USA!
This Page Covers
All Uses of Nuclear Energy
Whether for Peaceful Uses (energy sources) or
Military Uses (WMD as Weapons of Mass Destruction)
Nuclear Technology
- YouTube Video: Nuclear Reactors vs. Nuclear Weapons
- YouTube Video: Nuclear Reactor - Understanding how it works | Physics elearning
- YouTube Video: The risk of nuclear war in Ukraine | Russia sends nukes to Belarus | This World
- Top: Power plant as a peaceful example of nuclear technology.
- Bottom: Future examples of space-based nuclear weapons.
Nuclear technology is technology that involves the nuclear reactions of atomic nuclei. Among the notable nuclear technologies are nuclear reactors, nuclear medicine and nuclear weapons. It is also used, among other things, in smoke detectors and gun sights.
History and scientific background:
Discovery:
Main article: Nuclear physics
The vast majority of common, natural phenomena on Earth only involve gravity and electromagnetism, and not nuclear reactions. This is because atomic nuclei are generally kept apart because they contain positive electrical charges and therefore repel each other.
In 1896, Henri Becquerel was investigating phosphorescence in uranium salts when he discovered a new phenomenon which came to be called radioactivity. He, Pierre Curie and Marie Curie began investigating the phenomenon. In the process, they isolated the element radium, which is highly radioactive.
They discovered that radioactive materials produce intense, penetrating rays of three distinct sorts, which they labeled alpha, beta, and gamma after the first three Greek letters. Some of these kinds of radiation could pass through ordinary matter, and all of them could be harmful in large amounts. All of the early researchers received various radiation burns, much like sunburn, and thought little of it.
The new phenomenon of radioactivity was seized upon by the manufacturers of quack medicine (as had the discoveries of electricity and magnetism, earlier), and a number of patent medicines and treatments involving radioactivity were put forward.
Gradually it was realized that the radiation produced by radioactive decay was ionizing radiation, and that even quantities too small to burn could pose a severe long-term hazard.
Many of the scientists working on radioactivity died of cancer as a result of their exposure. Radioactive patent medicines mostly disappeared, but other applications of radioactive materials persisted, such as the use of radium salts to produce glowing dials on meters.
As the atom came to be better understood, the nature of radioactivity became clearer. Some larger atomic nuclei are unstable, and so decay (release matter or energy) after a random interval.
The three forms of radiation that Becquerel and the Curies discovered are also more fully understood:
- Alpha decay is when a nucleus releases an alpha particle, which is two protons and two neutrons, equivalent to a helium nucleus.
- Beta decay is the release of a beta particle, a high-energy electron.
- Gamma decay releases gamma rays, which unlike alpha and beta radiation are not matter but electromagnetic radiation of very high frequency, and therefore energy. This type of radiation is the most dangerous and most difficult to block.
All three types of radiation occur naturally in certain elements.
It has also become clear that the ultimate source of most terrestrial energy is nuclear, either through radiation from the Sun caused by stellar thermonuclear reactions or by radioactive decay of uranium within the Earth, the principal source of geothermal energy.
Nuclear fission:
Main article: Nuclear fission
In natural nuclear radiation, the byproducts are very small compared to the nuclei from which they originate. Nuclear fission is the process of splitting a nucleus into roughly equal parts, and releasing energy and neutrons in the process. If these neutrons are captured by another unstable nucleus, they can fission as well, leading to a chain reaction.
The average number of neutrons released per nucleus that go on to fission another nucleus is referred to as k. Values of k larger than 1 mean that the fission reaction is releasing more neutrons than it absorbs, and therefore is referred to as a self-sustaining chain reaction.
A mass of fissile material large enough (and in a suitable configuration) to induce a self-sustaining chain reaction is called a critical mass.
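To make the role of k concrete, here is a minimal sketch (an illustration only, not reactor physics) that steps a neutron population through successive fission generations for assumed subcritical, critical, and supercritical values of k; the numbers are purely illustrative.

```python
# Illustrative sketch: growth or decay of a neutron population over
# successive fission generations for different multiplication factors k.
# The values of k and the starting population are arbitrary examples.

def neutron_population(k: float, start: int = 1000, generations: int = 10) -> list[int]:
    """Return the neutron count after each generation, N_(i+1) = k * N_i."""
    counts = [start]
    for _ in range(generations):
        counts.append(round(counts[-1] * k))
    return counts

for k in (0.95, 1.0, 1.05):  # subcritical, critical, supercritical
    print(f"k = {k}: {neutron_population(k)}")
# k < 1: the chain reaction dies out; k = 1: steady; k > 1: self-sustaining growth.
```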
When a neutron is captured by a suitable nucleus, fission may occur immediately, or the nucleus may persist in an unstable state for a short time. If there are enough immediate decays to carry on the chain reaction, the mass is said to be prompt critical, and the energy release will grow rapidly and uncontrollably, usually leading to an explosion.
When discovered on the eve of World War II, this insight led multiple countries to begin programs investigating the possibility of constructing an atomic bomb — a weapon which utilized fission reactions to generate far more energy than could be created with chemical explosives.
The Manhattan Project, run by the United States with the help of the United Kingdom and Canada, developed multiple fission weapons which were used against Japan in 1945 at Hiroshima and Nagasaki. During the project, the first fission reactors were developed as well, though they were primarily for weapons manufacture and did not generate electricity.
In 1951, the Experimental Breeder Reactor No. 1 (EBR-1), near Arco, Idaho, became the first nuclear reactor to produce electricity, ushering in the "Atomic Age" of more intensive human energy use.
However, if the mass is critical only when the delayed neutrons are included, then the reaction can be controlled, for example by the introduction or removal of neutron absorbers.
This is what allows nuclear reactors to be built. Fast neutrons are not easily captured by nuclei; they must be slowed (slow neutrons), generally by collision with the nuclei of a neutron moderator, before they can be easily captured. Today, this type of fission is commonly used to generate electricity.
Nuclear fusion:
Main article: Nuclear fusion
See also: Timeline of nuclear fusion
If nuclei are forced to collide, they can undergo nuclear fusion. This process may release or absorb energy. When the resulting nucleus is lighter than that of iron, energy is normally released; when the nucleus is heavier than that of iron, energy is generally absorbed. This process of fusion occurs in stars, which derive their energy from hydrogen and helium.
They form, through stellar nucleosynthesis, the light elements (lithium to calcium) as well as some of the heavy elements (beyond iron and nickel, via the S-process). The remaining abundance of heavy elements, from nickel to uranium and beyond, is due to supernova nucleosynthesis, the R-process.
Of course, these natural processes of astrophysics are not examples of nuclear "technology". Because of the very strong repulsion of nuclei, fusion is difficult to achieve in a controlled fashion. Hydrogen bombs obtain their enormous destructive power from fusion, but their energy cannot be controlled.
Controlled fusion is achieved in particle accelerators; this is how many synthetic elements are produced. A fusor can also produce controlled fusion and is a useful neutron source. However, both of these devices operate at a net energy loss. Controlled, viable fusion power has proven elusive, despite the occasional hoax.
Technical and theoretical difficulties have hindered the development of working civilian fusion technology, though research continues to this day around the world.
Nuclear fusion was initially pursued only in theoretical stages during World War II, when scientists on the Manhattan Project (led by Edward Teller) investigated it as a method to build a bomb. The project abandoned fusion after concluding that it would require a fission reaction to detonate.
It took until 1952 for the first full hydrogen bomb to be detonated, so-called because it used reactions between deuterium and tritium. Fusion reactions are much more energetic per unit mass of fuel than fission reactions, but starting the fusion chain reaction is much more difficult.
Nuclear weapons:
Main article: Nuclear weapon
A nuclear weapon is an explosive device that derives its destructive force from nuclear reactions, either fission or a combination of fission and fusion. Both reactions release vast quantities of energy from relatively small amounts of matter. Even small nuclear devices can devastate a city by blast, fire and radiation.
Nuclear weapons are considered weapons of mass destruction, and their use and control has been a major aspect of international policy since their debut.
The design of a nuclear weapon is more complicated than it might seem. Such a weapon must hold one or more subcritical fissile masses stable for deployment, then induce criticality (create a critical mass) for detonation. It also is quite difficult to ensure that such a chain reaction consumes a significant fraction of the fuel before the device flies apart.
The procurement of a nuclear fuel is also more difficult than it might seem, since sufficiently unstable substances for this process do not currently occur naturally on Earth in suitable amounts.
One isotope of uranium, namely uranium-235, is naturally occurring and sufficiently unstable, but it is always found mixed with the more stable isotope uranium-238. The latter accounts for more than 99% of the weight of natural uranium. Therefore, some method of isotope separation based on the weight of three neutrons must be performed to enrich (isolate) uranium-235.
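As a rough illustration of why enrichment is hard, the short calculation below compares the approximate atomic masses of the two isotopes (values rounded from standard isotope tables); the relative difference that any separation process must exploit is only about 1%.

```python
# Rough comparison of the masses of the two uranium isotopes
# (approximate values in atomic mass units, rounded from standard tables).
m_u235 = 235.044
m_u238 = 238.051

relative_difference = (m_u238 - m_u235) / m_u238
print(f"Relative mass difference: {relative_difference:.2%}")   # about 1.3%
# Enrichment methods such as gas centrifuges must exploit this ~1% difference,
# which is why isolating uranium-235 takes many separation stages.
```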
Alternatively, the element plutonium possesses an isotope that is sufficiently unstable for this process to be usable. Terrestrial plutonium does not currently occur naturally in sufficient quantities for such use, so it must be manufactured in a nuclear reactor.
Ultimately, the Manhattan Project manufactured nuclear weapons based on each of these elements. They detonated the first nuclear weapon in a test code-named "Trinity", near Alamogordo, New Mexico, on July 16, 1945. The test was conducted to ensure that the implosion method of detonation would work, which it did.
A uranium bomb, Little Boy, was dropped on the Japanese city Hiroshima on August 6, 1945, followed three days later by the plutonium-based Fat Man on Nagasaki. In the wake of unprecedented devastation and casualties from a single weapon, the Japanese government soon surrendered, ending World War II.
Since these bombings, no nuclear weapons have been deployed offensively. Nevertheless, they prompted an arms race to develop increasingly destructive bombs to provide a nuclear deterrent.
Just over four years later, on August 29, 1949, the Soviet Union detonated its first fission weapon. The United Kingdom followed on October 2, 1952; France, on February 13, 1960; and China, on October 16, 1964.
Approximately half of those killed at Hiroshima and Nagasaki died two to five years afterward from radiation exposure. A radiological weapon is a type of nuclear weapon designed to distribute hazardous nuclear material in enemy areas. Such a weapon would not have the explosive capability of a fission or fusion bomb, but it would kill many people and contaminate a large area.
A radiological weapon has never been deployed. While considered of little use to a conventional military, such a weapon raises concerns over nuclear terrorism.
There have been over 2,000 nuclear tests conducted since 1945. In 1963, all nuclear and many non-nuclear states signed the Limited Test Ban Treaty, pledging to refrain from testing nuclear weapons in the atmosphere, underwater, or in outer space. The treaty permitted underground nuclear testing.
France continued atmospheric testing until 1974, and China until 1980. The last underground test by the United States was in 1992, by the Soviet Union in 1990, and by the United Kingdom in 1991; France and China continued testing until 1996.
By signing the Comprehensive Nuclear-Test-Ban Treaty in 1996 (which as of 2011 had not entered into force), all of these states pledged to discontinue nuclear testing. Non-signatories India and Pakistan last tested nuclear weapons in 1998.
Nuclear weapons are the most destructive weapons known - the archetypal weapons of mass destruction. Throughout the Cold War, the opposing powers had huge nuclear arsenals, sufficient to kill hundreds of millions of people. Generations of people grew up under the shadow of nuclear devastation, portrayed in films such as Dr. Strangelove and The Atomic Cafe.
However, the tremendous energy release in the detonation of a nuclear weapon also suggested the possibility of a new energy source.
Civilian uses:
Nuclear power:
Further information: Nuclear power and Nuclear reactor technology
Nuclear power is a type of nuclear technology involving the controlled use of nuclear fission to release energy for work including propulsion, heat, and the generation of electricity.
Nuclear energy is produced by a controlled nuclear chain reaction which creates heat—and which is used to boil water, produce steam, and drive a steam turbine. The turbine is used to generate electricity and/or to do mechanical work.
As of 2004, nuclear power provided approximately 15.7% of the world's electricity, and it is used to propel aircraft carriers, icebreakers and submarines (so far, economics and fears in some ports have prevented the use of nuclear power in transport ships).
All nuclear power plants use fission. No man-made fusion reaction has resulted in a viable source of electricity.
Medical applications:
Further information: Nuclear medicine
The medical applications of nuclear technology are divided into diagnostics and radiation treatment.
Imaging - The largest use of ionizing radiation in medicine is in medical radiography to make images of the inside of the human body using x-rays. This is the largest artificial source of radiation exposure for humans. Medical and dental x-ray imagers use cobalt-60 or other x-ray sources.
A number of radiopharmaceuticals are used, sometimes attached to organic molecules, to act as radioactive tracers or contrast agents in the human body. Positron-emitting radionuclides are used for high-resolution, short-time-span imaging in a technique known as positron emission tomography (PET).
Radiation is also used to treat diseases in radiation therapy.
Industrial applications:
Since ionizing radiation can penetrate matter, it is used for a variety of measuring methods. X-rays and gamma rays are used in industrial radiography to make images of the inside of solid products, as a means of nondestructive testing and inspection. The piece to be radiographed is placed between the source and a photographic film in a cassette. After a certain exposure time, the film is developed and it shows any internal defects of the material.
Gauges - Gauges use the exponential absorption law of gamma rays (a numerical sketch of this law follows the list below).
- Level indicators: Source and detector are placed at opposite sides of a container, indicating the presence or absence of material in the horizontal radiation path. Beta or gamma sources are used, depending on the thickness and the density of the material to be measured. The method is used for containers of liquids or of grainy substances.
- Thickness gauges: If the material is of constant density, the signal measured by the radiation detector depends on the thickness of the material. This is useful for continuous production of, for example, paper, rubber, etc.
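A minimal numerical sketch of the exponential absorption law these gauges rely on is shown below; the attenuation coefficient and the thicknesses are made-up illustrative values, not calibration data for any real instrument.

```python
import math

# Beer-Lambert-type attenuation: I = I0 * exp(-mu * x).
# mu (per cm) and the thicknesses are illustrative values only.
def transmitted_intensity(i0: float, mu: float, x_cm: float) -> float:
    """Intensity reaching the detector after passing through x_cm of material."""
    return i0 * math.exp(-mu * x_cm)

i0 = 1000.0   # counts/s with no material in the beam
mu = 0.5      # assumed linear attenuation coefficient, 1/cm
for x in (0.0, 1.0, 2.0, 4.0):
    print(f"{x:.1f} cm -> {transmitted_intensity(i0, mu, x):7.1f} counts/s")
# A thickness gauge inverts this relation: x = ln(I0 / I) / mu.
```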
Electrostatic control - To avoid the build-up of static electricity in production of paper, plastics, synthetic textiles, etc., a ribbon-shaped source of the alpha emitter 241Am can be placed close to the material at the end of the production line. The source ionizes the air to remove electric charges on the material.
Radioactive tracers - Since radioactive isotopes behave, chemically, mostly like the inactive element, the behavior of a certain chemical substance can be followed by tracing the radioactivity.
- Examples:
- Adding a gamma tracer to a gas or liquid in a closed system makes it possible to find a hole in a tube.
- Adding a tracer to the surface of the component of a motor makes it possible to measure wear by measuring the activity of the lubricating oil.
Oil and Gas Exploration - Nuclear well logging is used to help predict the commercial viability of new or existing wells. The technology involves the use of a neutron or gamma-ray source and a radiation detector which are lowered into boreholes to determine the properties of the surrounding rock, such as porosity and lithology.
Road Construction - Nuclear moisture/density gauges are used to determine the density of soils, asphalt, and concrete. Typically a cesium-137 source is used.
Commercial applications:
- radioluminescence
- tritium illumination: Tritium is used with phosphor in rifle sights to increase nighttime firing accuracy. Some runway markers and building exit signs use the same technology, to remain illuminated during blackouts.
- Betavoltaics.
- Smoke detector: An ionization smoke detector includes a tiny mass of radioactive americium-241, which is a source of alpha radiation. Two ionisation chambers are placed next to each other. Both contain a small source of 241Am that gives rise to a small constant current. One is closed and serves for comparison, the other is open to ambient air; it has a gridded electrode. When smoke enters the open chamber, the current is disrupted as the smoke particles attach to the charged ions and restore them to a neutral electrical state. This reduces the current in the open chamber. When the current drops below a certain threshold, the alarm is triggered.
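As a toy illustration of the comparison logic described above (the current values and threshold are invented for the example, not taken from any real detector), the sketch compares the open chamber's current against the sealed reference chamber and trips the alarm when the drop is large enough.

```python
# Toy model of an ionization smoke detector's decision logic.
# Currents are arbitrary relative units; the threshold is an invented example.
REFERENCE_CURRENT = 100.0   # sealed comparison chamber
ALARM_THRESHOLD = 0.80      # alarm if open-chamber current falls below 80% of reference

def alarm_triggered(open_chamber_current: float) -> bool:
    """Smoke particles capture ions and reduce the open chamber's current."""
    return open_chamber_current < ALARM_THRESHOLD * REFERENCE_CURRENT

print(alarm_triggered(99.0))   # False: clean air, the two currents are nearly equal
print(alarm_triggered(70.0))   # True: smoke has reduced the ionization current
```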
Food processing and agriculture:
In biology and agriculture, radiation is used to induce mutations to produce new or improved species, such as in atomic gardening. Another use in insect control is the sterile insect technique, where male insects are sterilized by radiation and released, so they have no offspring, to reduce the population.
In industrial and food applications, radiation is used for sterilization of tools and equipment.
An advantage is that the object may be sealed in plastic before sterilization. An emerging use in food production is the sterilization of food using food irradiation.
Food irradiation is the process of exposing food to ionizing radiation in order to destroy microorganisms, bacteria, viruses, or insects that might be present in the food. The radiation sources used include radioisotope gamma ray sources, X-ray generators and electron accelerators.
Further applications include sprout inhibition, delay of ripening, increase of juice yield, and improvement of re-hydration. Irradiation is a more general term for the deliberate exposure of materials to radiation to achieve a technical goal (in this context 'ionizing radiation' is implied). As such it is also used on non-food items, such as medical hardware, plastics, tubes for gas pipelines, hoses for floor heating, shrink foils for food packaging, automobile parts, wires and cables (insulation), tires, and even gemstones.
Compared to the amount of food irradiated, the volume of these everyday applications is huge, but it goes unnoticed by the consumer.
The genuine effect of processing food by ionizing radiation is damage to DNA, the basic genetic information for life. Microorganisms can no longer proliferate and continue their malignant or pathogenic activities.
Spoilage-causing micro-organisms cannot continue their activities. Insects do not survive or become incapable of procreation. Plants cannot continue the natural ripening or aging process. All these effects are beneficial to the consumer and the food industry alike.
The amount of energy imparted for effective food irradiation is low compared to cooking the same food; even at a typical dose of 10 kGy most food, which (with regard to warming) is physically equivalent to water, would warm by only about 2.5 °C (4.5 °F).
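That figure can be checked with a one-line calculation, treating the food as water (1 gray is 1 joule absorbed per kilogram; the specific heat value used is the standard one for water).

```python
# Temperature rise from an absorbed radiation dose, treating food as water.
dose_gray = 10_000   # 10 kGy, i.e. 10,000 J absorbed per kg
c_water = 4186       # specific heat of water, J/(kg*K)

delta_t = dose_gray / c_water
print(f"Temperature rise: {delta_t:.1f} °C")   # about 2.4 °C, consistent with the text
```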
What is special about processing food by ionizing radiation is that the energy deposited per atomic transition is very high; it can cleave molecules and induce ionization (hence the name), which cannot be achieved by mere heating. This is the reason for new beneficial effects, but at the same time for new concerns.
The treatment of solid food by ionizing radiation can provide an effect similar to heat pasteurization of liquids, such as milk. However, the use of the term, cold pasteurization, to describe irradiated foods is controversial, because pasteurization and irradiation are fundamentally different processes, although the intended end results can in some cases be similar.
Detractors of food irradiation have concerns about the health hazards of induced radioactivity.
A report for the industry advocacy group American Council on Science and Health entitled "Irradiated Foods" states: "The types of radiation sources approved for the treatment of foods have specific energy levels well below that which would cause any element in food to become radioactive. Food undergoing irradiation does not become any more radioactive than luggage passing through an airport X-ray scanner or teeth that have been X-rayed."
Food irradiation is currently permitted by over 40 countries and volumes are estimated to exceed 500,000 metric tons (490,000 long tons; 550,000 short tons) annually worldwide.
Food irradiation is essentially a non-nuclear technology; it relies on ionizing radiation, which may be generated by electron accelerators and conversion into bremsstrahlung, but which may also use gamma rays from nuclear decay.
There is a worldwide industry for processing by ionizing radiation, the majority by number and by processing power using accelerators. Food irradiation is only a niche application compared to medical supplies, plastic materials, raw materials, gemstones, cables and wires, etc.
Accidents:
Main articles: Nuclear and radiation accidents and Nuclear safety
Nuclear accidents, because of the powerful forces involved, are often very dangerous.
Historically, the first incidents involved fatal radiation exposure. Marie Curie died from aplastic anemia which resulted from her high levels of exposure.
Two scientists, Harry Daghlian (an American) and Louis Slotin (a Canadian), died after mishandling the same mass of plutonium. Unlike conventional weapons, the intense light, heat, and explosive force are not the only deadly components of a nuclear weapon.
Approximately half of those killed at Hiroshima and Nagasaki died two to five years afterward from radiation exposure.
Civilian nuclear and radiological accidents primarily involve nuclear power plants. Most common are nuclear leaks that expose workers to hazardous material. A nuclear meltdown refers to the more serious hazard of releasing nuclear material into the surrounding environment.
The most significant meltdowns occurred at Three Mile Island in Pennsylvania and Chernobyl in Soviet Ukraine. In addition, the earthquake and tsunami of March 11, 2011, caused serious damage to three nuclear reactors and a spent-fuel storage pond at the Fukushima Daiichi nuclear power plant in Japan.
Military reactors that experienced similar accidents were Windscale in the United Kingdom and SL-1 in the United States.
Military accidents usually involve the loss or unexpected detonation of nuclear weapons.
The Castle Bravo test in 1954 produced a larger yield than expected, which contaminated nearby islands and a Japanese fishing boat (with one fatality) and raised concerns about contaminated fish in Japan. From the 1950s through the 1970s, several nuclear bombs were lost from submarines and aircraft, some of which have never been recovered.
The last twenty years have seen a marked decline in such accidents.
Examples of environmental benefits:
Proponents of nuclear energy note that, annually, nuclear-generated electricity avoids about 470 million metric tons of carbon dioxide emissions that would otherwise come from fossil fuels.
Additionally, the comparatively small amount of waste that nuclear energy does create is safely disposed of by large-scale nuclear energy production facilities or is repurposed/recycled for other energy uses. Proponents of nuclear energy also draw attention to the opportunity cost of utilizing other forms of electricity.
For example, the Environmental Protection Agency estimates that coal kills 30,000 people a year, as a result of its environmental impact, while 60 people died in the Chernobyl disaster.
A real world example of impact provided by proponents of nuclear energy is the 650,000 ton increase in carbon emissions in the two months following the closure of the Vermont Yankee nuclear plant.
See also
- Nuclear power debate
- Outline of nuclear technology
- Radiology
- Nuclear Energy Institute – Beneficial Uses of Radiation
- Nuclear Technology
- National Isotope Development Center – U.S. Government source of isotopes for basic and applied nuclear science and nuclear technology – production, research, development, distribution, and information
What you need to know about the U.S. fusion energy breakthrough* including Nuclear Fusion and Nuclear Power
* -- Published in Washington Post 12/13/2022
- YouTube Video: Nuclear Fusion: Inside the breakthrough that could change our world | 60 Minutes
- YouTube Video: How Developments In Nuclear Fusion Change Everything | Neil deGrasse Tyson Explains...
- YouTube Video: How This Fusion Reactor Will Make Electricity by 2024
The Washington Post 12/13/2022: On Tuesday, the Energy Department announced a long-awaited milestone in the development of nuclear fusion energy: net energy gain. The news could galvanize the fusion community, which has long hyped the technology as a possible clean energy tool to combat climate change.
But how big of a deal is the “net energy gain” anyway — and what does it mean for the fusion power plants of the future? Here’s what you need to know.
What is fusion energy?
Existing nuclear power plants work through fission — splitting apart heavy atoms to create energy. In fission, a neutron collides with a heavy uranium atom, splitting it into lighter atoms and releasing a lot of heat and energy at the same time.
Fusion, on the other hand, works in the opposite way — it involves smushing two atoms (often two hydrogen atoms) together to create a new element (often helium), in the same way that stars create energy. In that process, the two hydrogen atoms lose a small amount of mass, which is converted to energy according to Einstein’s famous equation, E=mc². Because the speed of light is very, very fast — 300,000,000 meters per second — even a tiny amount of mass lost can result in a ton of energy.
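To see why a tiny mass defect matters, the snippet below applies E = mc² to one gram of converted mass; the one-gram figure is chosen purely for illustration.

```python
# Energy released when one gram of mass is converted to energy, E = m * c^2.
c = 3.0e8    # speed of light, m/s (rounded)
m = 1.0e-3   # 1 gram expressed in kilograms (illustrative choice)

energy_joules = m * c**2
print(f"{energy_joules:.1e} J")             # ~9e13 J
print(f"{energy_joules / 3.6e9:.0f} MWh")   # roughly 25,000 MWh (1 MWh = 3.6e9 J)
```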
What is ‘net energy gain,’ and how did the researchers achieve it?
Up to this point, researchers have been able to fuse two hydrogen atoms together successfully, but it has always taken more energy to do the reaction than they get back. Net energy gain — where they get more energy back than they put in to create the reaction — has been the elusive holy grail of fusion research.
On Tuesday, researchers at the National Ignition Facility at the Lawrence Livermore National Laboratory in California announced that they attained net energy gain by shooting lasers at hydrogen atoms.
The lasers delivered 2.05 megajoules of energy and created 3.15 megajoules of fusion energy, a gain of about 1.5 times. The 192 laser beams compressed the hydrogen atoms down to about 100 times the density of lead and heated them to approximately 100 million degrees Celsius. The high density and temperature caused the atoms to merge into helium.
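Restating the reported numbers, the target gain is simply the ratio of fusion output to the laser energy delivered to the target:

```python
# Target gain from the figures reported for the December 2022 NIF shot.
laser_energy_mj = 2.05    # megajoules delivered by the 192 beams
fusion_yield_mj = 3.15    # megajoules of fusion energy produced

target_gain = fusion_yield_mj / laser_energy_mj
print(f"Target gain: {target_gain:.2f}")   # about 1.54, the "gain of about 1.5 times" in the text
```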
Other methods being researched involve using magnets to confine superhot plasma.
“It’s like the Kitty Hawk moment for the Wright brothers,” said Melanie Windridge, a plasma physicist and the CEO of Fusion Energy Insights. “It’s like the plane taking off.”
Does this mean fusion energy is ready for prime time?
No. Scientists refer to the current breakthrough as “scientific net energy gain” — meaning that more energy has come out of the reaction than was inputted by the laser. That’s a huge milestone that has never before been achieved.
But it’s only a net energy gain at the micro level. The lasers used at the Livermore lab are only about 1 percent efficient, according to Troy Carter, a plasma physicist at the University of California at Los Angeles. That means that it takes about 100 times more energy to run the lasers than they are ultimately able to deliver to the hydrogen atoms.
So researchers will still have to reach “engineering net energy gain,” or the point at which the entire process takes less energy than is outputted by the reaction. They will also have to figure out how to turn the outputted energy — currently in the form of kinetic energy from the helium nucleus and the neutron — into a form that is usable for electricity. They could do that by converting it to heat, then heating steam to turn a turbine and run a generator. That process also has efficiency limitations.
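Under the article's stated assumption that the lasers are only about 1 percent efficient, a back-of-the-envelope facility-level balance looks like this; the efficiency figure is the article's rough number, not a precise specification.

```python
# Back-of-the-envelope facility-level energy balance for the NIF shot,
# using the article's rough ~1% wall-plug efficiency for the lasers.
laser_energy_mj = 2.05     # energy delivered to the target
fusion_yield_mj = 3.15     # fusion energy released
laser_efficiency = 0.01    # article's approximate figure

wall_plug_energy_mj = laser_energy_mj / laser_efficiency   # ~205 MJ drawn from the grid
engineering_gain = fusion_yield_mj / wall_plug_energy_mj

print(f"Electricity consumed by the lasers: ~{wall_plug_energy_mj:.0f} MJ")
print(f"Facility-level gain: ~{engineering_gain:.3f}")      # ~0.015, far below 1
```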
All that means that the energy gain will probably need to be pushed much, much higher for fusion to actually be commercially viable.
At the moment, researchers can also only do the fusion reaction about once a day. In between, they have to allow the lasers to cool and replace the fusion fuel target. A commercially viable plant would need to be able to do it several times per second, said Dennis Whyte, director of the Plasma Science and Fusion Center at MIT. “Once you’ve got scientific viability,” he said, “you’ve got to figure out engineering viability.”
What are the benefits of fusion?
Fusion’s possibilities are huge. The technology is much, much safer than nuclear fission, since fusion can’t create runaway reactions. It also doesn’t produce radioactive byproducts that need to be stored, or harmful carbon emissions; it simply produces inert helium and a neutron. And we’re not likely to run out of fuel: The fuel for fusion is just heavy hydrogen atoms, which can be found in seawater.
When could fusion actually power our homes?
That’s the trillion-dollar question. For decades, scientists have joked that fusion is always 30 or 40 years away. Over the years, researchers have variously predicted that fusion plants will be operational in the 1990s, the 2000s, the 2010s and the 2020s.
Current fusion experts argue that it’s not a matter of time, but a matter of will — if governments and private donors finance fusion aggressively, they say, a prototype fusion power plant could be available in the 2030s.
“The timeline is not really a question of time,” Carter said. “It’s a question of innovating and putting the effort in.”
___________________________________________________________________________
Fusion power (Wikipedia):
Fusion power is a proposed form of power generation that would generate electricity by using heat from nuclear fusion reactions. In a fusion process, two lighter atomic nuclei combine to form a heavier nucleus, while releasing energy. Devices designed to harness this energy are known as fusion reactors.
Research into fusion reactors began in the 1940s, but as of 2022, only one design, an inertial confinement laser-driven fusion machine at the US National Ignition Facility, has conclusively produced a positive fusion energy gain factor, i.e. more power output than input.
Fusion processes require fuel and a confined environment with sufficient temperature, pressure, and confinement time to create a plasma in which fusion can occur. The combination of these figures that results in a power-producing system is known as the Lawson criterion. In stars, the most common fuel is hydrogen, and gravity provides extremely long confinement times that reach the conditions needed for fusion energy production.
Proposed fusion reactors generally use heavy hydrogen isotopes such as deuterium and tritium (and especially a mixture of the two), which react more easily than protium (the most common hydrogen isotope), to allow them to reach the Lawson criterion requirements with less extreme conditions. Most designs aim to heat their fuel to around 100 million degrees, which presents a major challenge in producing a successful design.
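As a rough numerical illustration of the Lawson criterion in its "triple product" form, the sketch below checks an assumed set of plasma parameters against the commonly quoted approximate deuterium-tritium threshold of about 3×10²¹ keV·s/m³; both the threshold and the example parameters are approximate, illustrative figures rather than data for any specific machine.

```python
# Triple-product form of the Lawson criterion: n * T * tau_E must exceed a
# threshold (~3e21 keV*s/m^3 is the value commonly quoted for D-T fuel).
# The plasma parameters below are illustrative, not from any specific device.
THRESHOLD = 3e21   # keV * s / m^3, approximate D-T threshold

def meets_lawson(density_m3: float, temperature_kev: float, confinement_s: float) -> bool:
    triple_product = density_m3 * temperature_kev * confinement_s
    print(f"n*T*tau = {triple_product:.2e} keV*s/m^3")
    return triple_product >= THRESHOLD

# Example: density 1e20 m^-3, temperature 15 keV, confinement time 3 s.
print(meets_lawson(1e20, 15.0, 3.0))   # 4.5e21 -> True under these assumed numbers
```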
As a source of power, nuclear fusion is expected to have many advantages over fission. These include reduced radioactivity in operation and little high-level nuclear waste, ample fuel supplies, and increased safety.
However, the necessary combination of temperature, pressure, and duration has proven to be difficult to produce in a practical and economical manner. A second issue that affects common reactions is managing neutrons that are released during the reaction, which over time degrade many common materials used within the reaction chamber.
Fusion researchers have investigated various confinement concepts. The early emphasis was on three main systems: z-pinch, stellarator, and magnetic mirror. The current leading designs are the tokamak and inertial confinement (ICF) by laser. Both designs are under research at very large scales, most notably the ITER tokamak in France, and the National Ignition Facility (NIF) laser in the United States.
Researchers are also studying other designs that may offer cheaper approaches. Among these alternatives, there is increasing interest in magnetized target fusion and inertial electrostatic confinement, and new variations of the stellarator.
Nuclear fusion:
Nuclear fusion is a reaction in which two or more atomic nuclei are combined to form one or more different atomic nuclei and subatomic particles (neutrons or protons).
The difference in mass between the reactants and products is manifested as either the release or absorption of energy. This difference in mass arises due to the difference in nuclear binding energy between the atomic nuclei before and after the reaction. Nuclear fusion is the process that powers active or main-sequence stars and other high-magnitude stars, where large amounts of energy are released.
A nuclear fusion process that produces atomic nuclei lighter than iron-56 or nickel-62 will generally release energy. These elements have a relatively small mass and a relatively large binding energy per nucleon.
Fusion of nuclei lighter than these releases energy (an exothermic process), while the fusion of heavier nuclei results in energy retained by the product nucleons, and the resulting reaction is endothermic. The opposite is true for the reverse process, called nuclear fission. Nuclear fusion uses lighter elements, such as hydrogen and helium, which are in general more fusible; while the heavier elements, such as uranium, thorium and plutonium, are more fissionable.
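As a concrete worked example of this mass-to-energy bookkeeping, the snippet below computes the energy released by the deuterium-tritium reaction from atomic masses rounded from standard isotope tables; the result is the well-known figure of roughly 17.6 MeV per reaction.

```python
# Energy released by D + T -> He-4 + n, computed from the mass defect.
# Masses in atomic mass units, rounded from standard isotope tables.
m_deuterium = 2.014102
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665
U_TO_MEV    = 931.494   # energy equivalent of 1 u, in MeV

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV
print(f"Mass defect: {mass_defect:.6f} u")
print(f"Energy released per reaction: {energy_mev:.1f} MeV")   # ~17.6 MeV
```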
The extreme astrophysical event of a supernova can produce enough energy to fuse nuclei into elements heavier than iron.
Click on any of the following blue hyperlinks for more about Nuclear Fusion:
10 steps you can take to lower your carbon footprint:
But how big of a deal is the “net energy gain” anyway — and what does it mean for the fusion power plants of the future? Here’s what you need to know.
What is fusion energy?
Existing nuclear power plants work through fission — splitting apart heavy atoms to create energy. In fission, a neutron collides with a heavy uranium atom, splitting it into lighter atoms and releasing a lot of heat and energy at the same time.
Fusion, on the other hand, works in the opposite way — it involves smushing two atoms (often two hydrogen atoms) together to create a new element (often helium), in the same way that stars create energy. In that process, the two hydrogen atoms lose a small amount of mass, which is converted to energy according to Einstein’s famous equation, E=mc². Because the speed of light is very, very fast (300,000,000 meters per second), even a tiny amount of mass lost can result in a ton of energy.
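To put a rough number on that, here is a minimal back-of-the-envelope sketch (not from the article itself) that plugs a hypothetical mass loss of one milligram into E=mc²:

# Back-of-the-envelope illustration of E = m * c^2 (illustrative numbers, not from the article).
c = 3.0e8             # speed of light in meters per second (approximate)
mass_lost_kg = 1e-6   # ASSUMED mass loss of one milligram, in kilograms

energy_joules = mass_lost_kg * c**2
print(f"Energy from 1 mg of mass: {energy_joules:.2e} J")
# ~9e10 J, roughly the energy of about 20 tonnes of TNT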
What is ‘net energy gain,’ and how did the researchers achieve it?
Up to this point, researchers have been able to fuse two hydrogen atoms together successfully, but it has always taken more energy to do the reaction than they get back. Net energy gain — where they get more energy back than they put in to create the reaction — has been the elusive holy grail of fusion research.
On Tuesday, researchers at the National Ignition Facility at the Lawrence Livermore National Laboratory in California announced that they attained net energy gain by shooting lasers at hydrogen atoms.
The lasers delivered 2.05 megajoules of energy and created 3.15 megajoules of fusion energy, a gain of about 1.5 times. The 192 laser beams compressed the hydrogen atoms down to about 100 times the density of lead and heated them to approximately 100 million degrees Celsius. The high density and temperature caused the atoms to merge into helium.
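Using only the figures quoted above, the "scientific" gain factor is simply the ratio of fusion energy out to laser energy delivered; a minimal sketch:

# Scientific gain factor Q from the figures quoted in the paragraph above.
laser_energy_mj = 2.05   # megajoules delivered by the lasers to the target
fusion_energy_mj = 3.15  # megajoules of fusion energy produced

q_scientific = fusion_energy_mj / laser_energy_mj
print(f"Scientific gain Q = {q_scientific:.2f}")  # ~1.54, the "about 1.5 times" above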
Other methods being researched involve using magnets to confine superhot plasma.
“It’s like the Kitty Hawk moment for the Wright brothers,” said Melanie Windridge, a plasma physicist and the CEO of Fusion Energy Insights. “It’s like the plane taking off.”
Does this mean fusion energy is ready for prime time?
No. Scientists refer to the current breakthrough as “scientific net energy gain” — meaning that more energy came out of the reaction than the lasers put in. That’s a huge milestone that had never before been achieved.
But it’s only a net energy gain at the micro level. The lasers used at the Livermore lab are only about 1 percent efficient, according to Troy Carter, a plasma physicist at the University of California at Los Angeles. That means that it takes about 100 times more energy to run the lasers than they are ultimately able to deliver to the hydrogen atoms.
So researchers will still have to reach “engineering net energy gain,” or the point at which the entire process consumes less energy than the reaction puts out. They will also have to figure out how to turn the energy released — currently in the form of kinetic energy carried by the helium nucleus and the neutron — into a form that is usable for electricity. They could do that by converting it to heat, then using that heat to raise steam that turns a turbine and runs a generator. That process also has efficiency limitations.
All that means that the energy gain will probably need to be pushed much, much higher for fusion to actually be commercially viable.
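To see why the gain must rise so much, one can fold in the roughly 1 percent laser efficiency mentioned above together with an assumed thermal-to-electric conversion efficiency. This is a rough sketch only; the 40 percent conversion figure is an illustrative assumption, not a number from the article:

# Rough "engineering" energy balance, using the article's figures plus one assumption.
laser_energy_mj = 2.05        # energy delivered to the target (from the article)
fusion_energy_mj = 3.15       # fusion energy released (from the article)
laser_efficiency = 0.01       # ~1% wall-plug efficiency of the lasers (from the article)
thermal_to_electric = 0.40    # ASSUMED efficiency of converting heat to electricity

wall_plug_energy_mj = laser_energy_mj / laser_efficiency      # ~205 MJ drawn from the grid
electricity_out_mj = fusion_energy_mj * thermal_to_electric   # ~1.3 MJ of usable electricity

q_engineering = electricity_out_mj / wall_plug_energy_mj
print(f"Wall-plug energy in: {wall_plug_energy_mj:.0f} MJ")
print(f"Electricity out:     {electricity_out_mj:.2f} MJ")
print(f"Engineering gain:    {q_engineering:.3%}")  # well under 1 percent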
At the moment, researchers can also only do the fusion reaction about once a day. In between, they have to allow the lasers to cool and replace the fusion fuel target. A commercially viable plant would need to be able to do it several times per second, said Dennis Whyte, director of the Plasma Science and Fusion Center at MIT. “Once you’ve got scientific viability,” he said, “you’ve got to figure out engineering viability.”
What are the benefits of fusion?
Fusion’s possibilities are huge. The technology is much, much safer than nuclear fission, since fusion can’t create runaway reactions. It also doesn’t produce radioactive byproducts that need to be stored, or harmful carbon emissions; it simply produces inert helium and a neutron. And we’re not likely to run out of fuel: The fuel for fusion is just heavy hydrogen atoms, which can be found in seawater.
When could fusion actually power our homes?
That’s the trillion-dollar question. For decades, scientists have joked that fusion is always 30 or 40 years away. Over the years, researchers have variously predicted that fusion plants will be operational in the 1990s, the 2000s, the 2010s and the 2020s.
Current fusion experts argue that it’s not a matter of time, but a matter of will — if governments and private donors finance fusion aggressively, they say, a prototype fusion power plant could be available in the 2030s.
“The timeline is not really a question of time,” Carter said. “It’s a question of innovating and putting the effort in.”
___________________________________________________________________________
Fusion power (Wikipedia):
Fusion power is a proposed form of power generation that would generate electricity by using heat from nuclear fusion reactions. In a fusion process, two lighter atomic nuclei combine to form a heavier nucleus, while releasing energy. Devices designed to harness this energy are known as fusion reactors.
Research into fusion reactors began in the 1940s, but as of 2022, only one design, an inertial confinement laser-driven fusion machine at the US National Ignition Facility, has conclusively produced a positive fusion energy gain factor, i.e. more power output than input.
Fusion processes require fuel and a confined environment with sufficient temperature, pressure, and confinement time to create a plasma in which fusion can occur. The combination of these figures that results in a power-producing system is known as the Lawson criterion. In stars, the most common fuel is hydrogen, and gravity provides extremely long confinement times that reach the conditions needed for fusion energy production.
Proposed fusion reactors generally use heavy hydrogen isotopes such as deuterium and tritium (and especially a mixture of the two), which react more easily than protium (the most common hydrogen isotope), to allow them to reach the Lawson criterion requirements with less extreme conditions. Most designs aim to heat their fuel to around 100 million degrees, which presents a major challenge in producing a successful design.
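As a rough illustration of how the Lawson criterion is applied, the sketch below checks a plasma's density, temperature, and confinement time against a commonly quoted order-of-magnitude "triple product" threshold for deuterium–tritium fuel. Both the threshold value and the sample plasma parameters are illustrative assumptions, not figures from this article:

# Illustrative check of the Lawson "triple product" for D-T fusion.
# The ~3e21 keV*s/m^3 threshold is a commonly quoted order-of-magnitude value;
# the sample plasma parameters below are made-up illustrative numbers.
TRIPLE_PRODUCT_THRESHOLD = 3e21  # keV * s / m^3, approximate D-T requirement

def meets_lawson(density_m3: float, temperature_kev: float, confinement_s: float) -> bool:
    """Return True if n*T*tau exceeds the assumed D-T triple-product threshold."""
    triple_product = density_m3 * temperature_kev * confinement_s
    print(f"n*T*tau = {triple_product:.2e} keV*s/m^3")
    return triple_product >= TRIPLE_PRODUCT_THRESHOLD

# Example: a tokamak-like plasma (illustrative values only).
meets_lawson(density_m3=1e20, temperature_kev=10.0, confinement_s=1.0)  # 1e21 -> falls short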
As a source of power, nuclear fusion is expected to have many advantages over fission. These include reduced radioactivity in operation and little high-level nuclear waste, ample fuel supplies, and increased safety.
However, the necessary combination of temperature, pressure, and duration has proven to be difficult to produce in a practical and economical manner. A second issue that affects common reactions is managing neutrons that are released during the reaction, which over time degrade many common materials used within the reaction chamber.
Fusion researchers have investigated various confinement concepts. The early emphasis was on three main systems: z-pinch, stellarator, and magnetic mirror. The current leading designs are the tokamak and inertial confinement (ICF) by laser. Both designs are under research at very large scales, most notably the ITER tokamak in France, and the National Ignition Facility (NIF) laser in the United States.
Researchers are also studying other designs that may offer cheaper approaches. Among these alternatives, there is increasing interest in magnetized target fusion and inertial electrostatic confinement, and new variations of the stellarator.
Click on any of the blue hyperlinks for more about Fusion Power:
- Background
- Methods
- Common tools
- Fuels
- Material selection
- Safety and the environment
- Economics
- Regulation
- Geopolitics
- Advantages
- History
- Records
- See also:
- COLEX process, for production of Li-6
- Fusion ignition
- High beta fusion reactor
- Inertial electrostatic confinement
- Levitated dipole
- List of fusion experiments
- Magnetic mirror
- Fusion Device Information System
- Fusion Energy Base
- Fusion Industry Association
- Princeton Satellite Systems News
- U.S. Fusion Energy Science Program
- Ball, Philip. "The chase for fusion energy". Nature. Retrieved 2021-11-22.
Nuclear fusion:
Nuclear fusion is a reaction in which two or more atomic nuclei are combined to form one or more different atomic nuclei and subatomic particles (neutrons or protons).
The difference in mass between the reactants and products is manifested as either the release or absorption of energy. This difference in mass arises due to the difference in nuclear binding energy between the atomic nuclei before and after the reaction. Nuclear fusion is the process that powers active or main-sequence stars and other high-magnitude stars, where large amounts of energy are released.
A nuclear fusion process that produces atomic nuclei lighter than iron-56 or nickel-62 will generally release energy. These elements have a relatively small mass and a relatively large binding energy per nucleon.
Fusion of nuclei lighter than these releases energy (an exothermic process), while the fusion of heavier nuclei results in energy retained by the product nucleons, and the resulting reaction is endothermic. The opposite is true for the reverse process, called nuclear fission. Nuclear fusion uses lighter elements, such as hydrogen and helium, which are in general more fusible; while the heavier elements, such as uranium, thorium and plutonium, are more fissionable.
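To make the mass-to-energy bookkeeping concrete, the sketch below computes the energy released by the deuterium–tritium reaction from standard reference atomic masses (the mass values are standard figures, not taken from this article; electron masses cancel for this particular reaction, so atomic masses can be used directly):

# Energy released by D + T -> He-4 + n, computed from the mass difference.
# Atomic mass values (in unified atomic mass units, u) are standard reference figures.
U_TO_MEV = 931.494          # energy equivalent of 1 u, in MeV

m_deuterium = 2.014102      # u
m_tritium   = 3.016049      # u
m_helium4   = 4.002602      # u
m_neutron   = 1.008665      # u

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV
print(f"Mass defect: {mass_defect:.6f} u")
print(f"Energy released: {energy_mev:.1f} MeV")  # ~17.6 MeV per reaction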
The extreme astrophysical event of a supernova can produce enough energy to fuse nuclei into elements heavier than iron.
Click on any of the following blue hyperlinks for more about Nuclear Fusion:
- History
- Process
- Nuclear fusion in stars
- Requirements
- Artificial fusion
- Important reactions
- Mathematical description of cross section
- See also:
- China Fusion Engineering Test Reactor
- Cold fusion
- Focus fusion
- Fusenet
- Fusion rocket
- Impulse generator
- Joint European Torus
- List of fusion experiments
- List of Fusor examples
- List of plasma (physics) articles
- Neutron source
- Nuclear energy
- Nuclear fusion–fission hybrid
- Nuclear physics
- Nuclear reactor
- Nucleosynthesis
- Periodic table
- Pulsed power
- Pure fusion weapon
- Teller–Ulam design
- Thermonuclear fusion
- Timeline of nuclear fusion
- Triple-alpha process
- NuclearFiles.org – A repository of documents related to nuclear power.
- Annotated bibliography for nuclear fusion from the Alsos Digital Library for Nuclear Issues
J. Robert Oppenheimer and the Manhattan Project
Courtesy of the New York Times: OPINION--GUEST ESSAY
The Tragedy of J. Robert Oppenheimer
July 17, 2023
By Kai Bird
Mr. Bird is the director of the Leon Levy Center for Biography and co-author with the late Martin J. Sherwin of “American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer.”
One day in the spring of 1954, J. Robert Oppenheimer ran into Albert Einstein outside their offices at the Institute for Advanced Study in Princeton, N.J. Oppenheimer had been the director of the institute since 1947 and Einstein a faculty member since he fled Germany in 1933. The two men might argue about quantum physics — Einstein grumbled that he just didn’t think that God played dice with the universe — but they were good friends.
Oppenheimer took the occasion to explain to Einstein that he was going to be absent from the institute for some weeks. He was being forced to defend himself in Washington, D.C., during a secret hearing against charges that he was a security risk, and perhaps even disloyal.
Einstein argued that Oppenheimer “had no obligation to subject himself to the witch hunt, that he had served his country well, and that if this was the reward she [America] offered he should turn his back on her.” Oppenheimer demurred, saying he could not turn his back on America. “He loved America,” said Verna Hobson, his secretary who was a witness to the conversation, “and this love was as deep as his love of science.”
“Einstein doesn’t understand,” Oppenheimer told Ms. Hobson. But as Einstein walked back into his office he told his own assistant, nodding in the direction of Oppenheimer, “There goes a narr,” or fool.
Einstein was right. Oppenheimer was foolishly subjecting himself to a kangaroo court in which he was soon stripped of his security clearance and publicly humiliated. The charges were flimsy, but by a vote of 2 to 1 the security panel of the Atomic Energy Commission deemed Oppenheimer a loyal citizen who was nevertheless a security risk: “We find that Dr. Oppenheimer’s continuing conduct and association have reflected a serious disregard for the requirements of the security system.”
The scientist would no longer be trusted with the nation’s secrets. Celebrated in 1945 as the “father of the atomic bomb,” nine years later he would become the chief celebrity victim of the McCarthyite maelstrom.
Oppenheimer may have been naïve, but he was right to fight the charges — and right to use his influence as one of the country’s pre-eminent scientists to speak out against a nuclear arms race. In the months and years leading up to the security hearing, Oppenheimer had criticized the decision to build a “super” hydrogen bomb. Astonishingly, he had gone so far as to say that the Hiroshima bomb was used “against an essentially defeated enemy.”
The atomic bomb, he warned, “is a weapon for aggressors, and the elements of surprise and terror are as intrinsic to it as are the fissionable nuclei.” These forthright dissents against the prevailing view of Washington’s national security establishment earned him powerful political enemies. That was precisely why he was being charged with disloyalty.
It is my hope that Christopher Nolan’s stunning new film on Oppenheimer’s complicated legacy will initiate a national conversation not only about our existential relationship to weapons of mass destruction, but also the need in our society for scientists as public intellectuals. Mr. Nolan’s three-hour film is a riveting thriller and mystery story that delves deeply into what this country did to its most famous scientist.
Sadly, Oppenheimer’s life story is relevant to our current political predicaments.
Oppenheimer was destroyed by a political movement characterized by rank know-nothing, anti-intellectual, xenophobic demagogues. The witch-hunters of that season are the direct ancestors of our current political actors of a certain paranoid style.
I’m thinking of Roy Cohn, Senator Joseph McCarthy’s chief counsel, who tried to subpoena Oppenheimer in 1954, only to be warned that this could interfere with the impending security hearing against Oppenheimer.
Yes, that Roy Cohn, who taught former President Donald Trump his brash, wholly deranged style of politics. Just recall the former president’s fact-challenged comments on the pandemic or climate change. This is a worldview proudly scornful of science.
After America’s most celebrated scientist was falsely accused and publicly humiliated, the Oppenheimer case sent a warning to all scientists not to stand up in the political arena as public intellectuals. This was the real tragedy of Oppenheimer. What happened to him also damaged our ability as a society to debate honestly about scientific theory — the very foundation of our modern world.
Quantum physics has utterly transformed our understanding of the universe. And this science has also given us a revolution in computing power and incredible biomedical innovations to prolong human life. Yet, too many of our citizens still distrust scientists and fail to understand the scientific quest, the trial and error inherent in testing any theory against facts by experimenting.
Just look at what happened to our public health civil servants during the recent pandemic.
We stand on the cusp of another technological revolution in which artificial intelligence will transform how we live and work, and yet we are not yet having the kind of informed civil discourse with its innovators that could help us to make wise policy decisions on its regulation.
Our politicians need to listen more to technology innovators like Sam Altman and quantum physicists like Kip Thorne and Michio Kaku.
Oppenheimer was trying desperately to have that kind of conversation about nuclear weapons. He was trying to warn our generals that these are not battlefield weapons, but weapons of pure terror. But our politicians chose to silence him; the result was that we spent the Cold War engaged in a costly and dangerous arms race.
Today, Vladimir Putin’s not-so-veiled threats to deploy tactical nuclear weapons in the war in Ukraine are a stark reminder that we can never be complacent about living with nuclear weapons. Oppenheimer did not regret what he did at Los Alamos; he understood that you cannot stop curious human beings from discovering the physical world around them.
One cannot halt the scientific quest, nor can one un-invent the atomic bomb.
But Oppenheimer always believed that human beings could learn to regulate these technologies and integrate them into a sustainable and humane civilization. We can only hope he was right.
[End of Opinion Piece]
___________________________________________________________________________
Robert Oppenheimer -- Wikipedia:
Julius Robert Oppenheimer (April 22, 1904 – February 18, 1967) was an American theoretical physicist and director of the Los Alamos Laboratory during World War II. He is often credited as the "father of the atomic bomb" for his role in organizing the Manhattan Project, the research and development undertaking that created the first nuclear weapons (see further below).
Born to German Jewish immigrants in New York City, Oppenheimer earned a bachelor's degree in chemistry from Harvard University in 1925 and a PhD in physics from the University of Göttingen in Germany in 1927.
After research at other institutions, he joined the physics department at the University of California, Berkeley, where he became a full professor in 1936. He made significant contributions to theoretical physics, including:
- achievements in quantum mechanics and nuclear physics such as the Born–Oppenheimer approximation for molecular wave functions,
- work on the theory of electrons and positrons,
- the Oppenheimer–Phillips process in nuclear fusion,
- and the first prediction of quantum tunneling.
With his students, he also made contributions to the theory of neutron stars and black holes, quantum field theory, and the interactions of cosmic rays.
In 1942, Oppenheimer was recruited to work on the Manhattan Project, and in 1943 was appointed director of the project's Los Alamos Laboratory in New Mexico, tasked with developing the first nuclear weapons; his leadership and scientific expertise were instrumental in the success of the project.
On July 16, 1945, he was present at the Trinity test of the first atomic bomb. In August 1945, the weapons were used in the bombings of Hiroshima and Nagasaki in Japan, which remain the only use of nuclear weapons in an armed conflict.
In 1947, Oppenheimer became the director of the Institute for Advanced Study in Princeton, New Jersey, and chaired the influential General Advisory Committee of the newly created United States Atomic Energy Commission. He lobbied for international control of nuclear power, to avert nuclear proliferation and a nuclear arms race with the Soviet Union.
He opposed the development of the hydrogen bomb during a 1949–1950 governmental debate on the question. During the Second Red Scare, Oppenheimer's stances, together with past associations he had with people and organizations affiliated with the Communist Party USA, led to the revocation of his security clearance following a 1954 security hearing.
In 1963, President John F. Kennedy awarded him (and Lyndon B. Johnson presented him with) the Enrico Fermi Award as a gesture of political rehabilitation. In 2022, the U.S. government vacated its 1954 decision, saying that the process had been flawed.
Click on any of the following blue hyperlinks for more about J. Robert Oppenheimer:
- Early life
- Early career
- Private and political life
- Manhattan Project
- Postwar activities
- Final years and death
- Legacy
- See also:
- Biography and online exhibit created for the centennial of his birth
- 1965 Audio Interview with J. Robert Oppenheimer by Stephane Groueff, Voices of the Manhattan Project
- Was Oppenheimer a member of the Communist Party? documents on the question
- On Atomic Energy, Problems to Civilization audio file of UC Berkeley talk, November 1946
- Oppenheimer talking about the experience of the first bomb test (video file, "Now I am become death, destroyer of worlds.")
- "Freedom and Necessity in the Sciences" audio and documents from a lecture at Dartmouth College, April 1959
[Your WebHost: the article on the Manhattan Project below is lengthy and highly technical: to be taken instead to the Topic Outline for a selective reading, Click Here.]
The Manhattan Project:
The Manhattan Project was a research and development undertaking during World War II that produced the first nuclear weapons. It was led by the United States with the support of the United Kingdom and Canada.
From 1942 to 1946, the project was under the direction of Major General Leslie Groves of the U.S. Army Corps of Engineers. The nuclear physicist Robert Oppenheimer was the director of the Los Alamos Laboratory that designed the bombs.
The Army component was designated the Manhattan District, as its first headquarters were in Manhattan; the name gradually superseded the official codename, Development of Substitute Materials, for the entire project.
The project absorbed its earlier British counterpart, Tube Alloys. The Manhattan Project began modestly in 1939, but employed nearly 130,000 people at its peak and cost nearly US$2 billion (equivalent to about $24 billion in 2021).
Over 90 percent of the cost was for building factories and producing fissile material, with less than 10 percent for development and production of the weapons. Research and production took place at more than 30 sites across the United States, the United Kingdom, and Canada.
The project led to the development of two types of atomic bombs, both developed concurrently, during the war: a relatively simple gun-type fission weapon and a more complex implosion-type nuclear weapon.
The Thin Man gun-type design proved impractical to use with plutonium, so a simpler gun-type design called Little Boy was developed that used uranium-235, an isotope that makes up only 0.7 percent of natural uranium. Because it is chemically identical to the most common isotope, uranium-238, and has almost the same mass, separating the two proved difficult.
Three methods were employed for uranium enrichment:
- electromagnetic,
- gaseous
- and thermal.
Most of this work was carried out at the Clinton Engineer Works at Oak Ridge, Tennessee.
In parallel with the work on uranium was an effort to produce plutonium, which researchers at the University of California, Berkeley, discovered in 1940.
After the feasibility of the world's first artificial nuclear reactor, the Chicago Pile-1, was demonstrated in 1942 at the Metallurgical Laboratory in the University of Chicago, the project designed the X-10 Graphite Reactor at Oak Ridge and the production reactors at the Hanford Site in Washington state, in which uranium was irradiated and transmuted into plutonium. The plutonium was then chemically separated from the uranium, using the bismuth phosphate process.
The Fat Man plutonium implosion-type weapon was developed in a concerted design and development effort by the Los Alamos Laboratory.
The project was also charged with gathering intelligence on the German nuclear weapon project. Through Operation Alsos, Manhattan Project personnel served in Europe, sometimes behind enemy lines, where they gathered nuclear materials and documents, and rounded up German scientists. Despite the Manhattan Project's tight security, Soviet atomic spies successfully penetrated the program.
The first nuclear device ever detonated was an implosion-type bomb during the Trinity test, conducted at New Mexico's Alamogordo Bombing and Gunnery Range on 16 July 1945. Little Boy and Fat Man bombs were used a month later in the atomic bombings of Hiroshima and Nagasaki, respectively, with Manhattan Project personnel serving as bomb assembly technicians and weaponeers on the attack aircraft.
In the immediate postwar years, the Manhattan Project conducted weapons testing at Bikini Atoll as part of Operation Crossroads, developed new weapons, promoted the development of the network of national laboratories, supported medical research into radiology and laid the foundations for the nuclear navy.
It maintained control over American atomic weapons research and production until the formation of the United States Atomic Energy Commission in January 1947.
Origins:
For a chronological guide, see Timeline of the Manhattan Project.
The discovery of nuclear fission by German chemists Otto Hahn and Fritz Strassmann in 1938, and its theoretical explanation by Lise Meitner and Otto Frisch, made the development of an atomic bomb a theoretical possibility.
There were fears that a German atomic bomb project would develop one first, especially among scientists who were refugees from Nazi Germany and other fascist countries.
In August 1939, Hungarian-born physicists Leo Szilard and Eugene Wigner drafted the Einstein–Szilard letter, which warned of the potential development of "extremely powerful bombs of a new type". It urged the United States to take steps to acquire stockpiles of uranium ore and accelerate the research of Enrico Fermi and others into nuclear chain reactions.
They had it signed by Albert Einstein and delivered to President Franklin D. Roosevelt. Roosevelt called on Lyman Briggs of the National Bureau of Standards to head the Advisory Committee on Uranium to investigate the issues raised by the letter. Briggs held a meeting on 21 October 1939, which was attended by Szilárd, Wigner and Edward Teller.
The committee reported back to Roosevelt in November that uranium "would provide a possible source of bombs with a destructiveness vastly greater than anything now known."
In February 1940, the U.S. Navy awarded Columbia University $6,000 in funding, most of which Enrico Fermi and Szilard spent on purchasing graphite. A team of Columbia professors including Fermi, Szilard, Eugene T. Booth and John Dunning created the first nuclear fission reaction in the Americas, verifying the work of Hahn and Strassmann.
The same team subsequently built a series of prototype nuclear reactors (or "piles" as Fermi called them) in Pupin Hall at Columbia, but were not yet able to achieve a chain reaction.
The Advisory Committee on Uranium became the National Defense Research Committee (NDRC) on Uranium when that organization was formed on 27 June 1940.
Briggs proposed spending $167,000 on research into uranium, particularly the uranium-235 isotope, and plutonium, which was discovered in 1940 at the University of California.
On 28 June 1941, Roosevelt signed Executive Order 8807, which created the Office of Scientific Research and Development (OSRD), with Vannevar Bush as its director. The office was empowered to engage in large engineering projects in addition to research. The NDRC Committee on Uranium became the S-1 Section of the OSRD; the word "uranium" was dropped for security reasons.
In Britain, Frisch and Rudolf Peierls at the University of Birmingham had made a breakthrough investigating the critical mass of uranium-235 in June 1939. Their calculations indicated that it was within an order of magnitude of 10 kilograms (22 lb), which was small enough to be carried by a bomber of the day. Their March 1940 Frisch–Peierls memorandum initiated the British atomic bomb project and its MAUD Committee, which unanimously recommended pursuing the development of an atomic bomb.
In July 1940, Britain had offered to give the United States access to its scientific research, and the Tizard Mission's John Cockcroft briefed American scientists on British developments. He discovered that the American project was smaller than the British, and not as far advanced.
As part of the scientific exchange, the MAUD Committee's findings were conveyed to the United States. One of its members, the Australian physicist Mark Oliphant, flew to the United States in late August 1941 and discovered that data provided by the MAUD Committee had not reached key American physicists.
Oliphant then set out to find out why the committee's findings were apparently being ignored. He met with the Uranium Committee and visited Berkeley, California, where he spoke persuasively to Ernest O. Lawrence. Lawrence was sufficiently impressed to commence his own research into uranium. He in turn spoke to James B. Conant, Arthur H. Compton and George B. Pegram.
Oliphant's mission was therefore a success; key American physicists were now aware of the potential power of an atomic bomb.
On 9 October 1941, President Roosevelt approved the atomic program after he convened a meeting with Vannevar Bush and Vice President Henry A. Wallace. To control the program, he created a Top Policy Group consisting of himself—although he never attended a meeting—Wallace, Bush, Conant, Secretary of War Henry L. Stimson, and the Chief of Staff of the Army, General George C. Marshall. Roosevelt chose the Army to run the project rather than the Navy, because the Army had more experience with management of large-scale construction projects.
Roosevelt also agreed to coordinate the effort with that of the British, and on 11 October he sent a message to Prime Minister Winston Churchill, suggesting that they correspond on atomic matters.
Feasibility:
Proposals:
The S-1 Committee held its meeting on 18 December 1941 "pervaded by an atmosphere of enthusiasm and urgency" in the wake of the attack on Pearl Harbor and the subsequent United States declaration of war upon Japan and then on Germany. Work was proceeding on three different techniques for isotope separation to separate uranium-235 from the more abundant uranium-238.
Lawrence and his team at the University of California investigated electromagnetic separation, while Eger Murphree and Jesse Wakefield Beams's team looked into gaseous diffusion at Columbia University, and Philip Abelson directed research into thermal diffusion at the Carnegie Institution of Washington and later the Naval Research Laboratory. Murphree was also the head of an unsuccessful separation project using gas centrifuges.
Meanwhile, there were two lines of research into nuclear reactor technology, with Harold Urey continuing research into heavy water at Columbia, while Arthur Compton brought the scientists working under his supervision from Columbia, California and Princeton University to join his team at the University of Chicago, where he organized the Metallurgical Laboratory in early 1942 to study plutonium and reactors using graphite as a neutron moderator.
Briggs, Compton, Lawrence, Murphree, and Urey met on 23 May 1942, to finalize the S-1 Committee recommendations, which called for all five technologies to be pursued. This was approved by Bush, Conant, and Brigadier General Wilhelm D. Styer, the chief of staff of Major General Brehon B. Somervell's Services of Supply, who had been designated the Army's representative on nuclear matters.
Bush and Conant then took the recommendation to the Top Policy Group with a budget proposal for $54 million for construction by the United States Army Corps of Engineers, $31 million for research and development by OSRD and $5 million for contingencies in fiscal year 1943.
The Top Policy Group sent it on 17 June 1942, to the President, who approved it by writing "OK FDR" on the document.
Bomb design concepts:
Compton asked theoretical physicist J. Robert Oppenheimer of the University of California to take over research into fast neutron calculations—the key to calculations of critical mass and weapon detonation—from Gregory Breit, who had quit on 18 May 1942 because of concerns over lax operational security.
John H. Manley, a physicist at the Metallurgical Laboratory, was assigned to assist Oppenheimer by contacting and coordinating experimental physics groups scattered across the country. Oppenheimer and Robert Serber of the University of Illinois examined the problems of neutron diffusion—how neutrons moved in a nuclear chain reaction—and hydrodynamics—how the explosion produced by a chain reaction might behave.
To review this work and the general theory of fission reactions, Oppenheimer and Fermi convened meetings with theoretical physicists at the University of Chicago in June and at the University of California in July 1942.
They tentatively confirmed that a fission bomb was theoretically possible.
There were still many unknown factors. The properties of pure uranium-235 were relatively unknown, as were those of plutonium, an element that had only been discovered in February 1941 by Glenn Seaborg and his team. The scientists at the (July 1942) Berkeley conference envisioned creating plutonium in nuclear reactors where uranium-238 atoms absorbed neutrons that had been emitted from fissioning uranium-235 atoms.
At this point no reactor had been built, and only tiny quantities of plutonium were available from cyclotrons at institutions such as Washington University in St. Louis.
Even by December 1943, only two milligrams had been produced. There were many ways of arranging the fissile material into a critical mass. The simplest was shooting a "cylindrical plug" into a sphere of "active material" with a "tamper"—dense material that would focus neutrons inward and keep the reacting mass together to increase its efficiency.
They also explored designs involving spheroids, a primitive form of "implosion" suggested by Richard C. Tolman, and the possibility of autocatalytic methods, which would increase the efficiency of the bomb as it exploded.
Considering the idea of the fission bomb theoretically settled—at least until more experimental data was available—the 1942 Berkeley conference then turned in a different direction. Edward Teller pushed for discussion of a more powerful bomb: the "super", now usually referred to as a "hydrogen bomb", which would use the explosive force of a detonating fission bomb to ignite a nuclear fusion reaction in deuterium and tritium.
Teller proposed scheme after scheme, but Bethe refused each one. The fusion idea was put aside to concentrate on producing fission bombs. Teller also raised the speculative possibility that an atomic bomb might "ignite" the atmosphere because of a hypothetical fusion reaction of nitrogen nuclei.
Bethe calculated that it could not happen, and a report co-authored by Teller showed that "no self-propagating chain of nuclear reactions is likely to be started." In Serber's account, Oppenheimer mentioned the possibility of this scenario to Arthur Compton, who "didn't have enough sense to shut up about it. It somehow got into a document that went to Washington" and was "never laid to rest".
Organization:
Manhattan District:
The Chief of Engineers, Major General Eugene Reybold, selected Colonel James C. Marshall to head the Army's part of the project in June 1942. Marshall created a liaison office in Washington, D.C., but established his temporary headquarters on the 18th floor of 270 Broadway in New York, where he could draw on administrative support from the Corps of Engineers' North Atlantic Division.
It was close to the Manhattan office of Stone & Webster, the principal project contractor, and to Columbia University. He had permission to draw on his former command, the Syracuse District, for staff, and he started with Lieutenant Colonel Kenneth Nichols, who became his deputy.
Because most of his task involved construction, Marshall worked in cooperation with the head of the Corps of Engineers Construction Division, Major General Thomas M. Robbins, and his deputy, Colonel Leslie Groves. Reybold, Somervell, and Styer decided to call the project "Development of Substitute Materials", but Groves felt that this would draw attention.
Since engineer districts normally carried the name of the city where they were located, Marshall and Groves agreed to name the Army's component of the project the Manhattan District. This became official on 13 August when Reybold issued the order creating the new district. Informally, it was known as the Manhattan Engineer District, or MED.
Unlike other districts, it had no geographic boundaries, and Marshall had the authority of a division engineer. Development of Substitute Materials remained as the official codename of the project as a whole, but was supplanted over time by "Manhattan".
Marshall later conceded that "I had never heard of atomic fission but I did know that you could not build much of a plant, much less four of them, for $90 million." A single TNT plant that Nichols had recently built in Pennsylvania had cost $128 million.
Nor were Marshall and Nichols impressed with estimates to the nearest order of magnitude, which Groves compared with telling a caterer to prepare for between ten and a thousand guests.
A survey team from Stone & Webster had already scouted a site for the production plants. The War Production Board recommended sites around Knoxville, Tennessee, an isolated area where the Tennessee Valley Authority could supply ample electric power and the rivers could provide cooling water for the reactors.
After examining several sites, the survey team selected one near Elza, Tennessee. Conant advised that it be acquired at once and Styer agreed but Marshall temporized, awaiting the results of Conant's reactor experiments before taking action. Of the prospective processes, only Lawrence's electromagnetic separation appeared sufficiently advanced for construction to commence.
Marshall and Nichols began assembling the resources they would need. The first step was to obtain a high priority rating for the project. The top ratings were AA-1 through AA-4 in descending order, although there was also a special AAA rating reserved for emergencies.
Ratings AA-1 and AA-2 were for essential weapons and equipment, so Colonel Lucius D. Clay, the deputy chief of staff at Services and Supply for requirements and resources, felt that the highest rating he could assign was AA-3, although he was willing to provide a AAA rating on request for critical materials if the need arose.
Nichols and Marshall were disappointed; AA-3 was the same priority as Nichols' TNT plant in Pennsylvania.
Military Policy Committee:
Vannevar Bush became dissatisfied with Colonel Marshall's failure to get the project moving forward expeditiously, specifically the failure to acquire the Tennessee site, the low priority allocated to the project by the Army and the location of his headquarters in New York City.
Bush felt that more aggressive leadership was required, and spoke to Harvey Bundy and Generals Marshall, Somervell, and Styer about his concerns. He wanted the project placed under a senior policy committee, with a prestigious officer, preferably Styer, as overall director.
Somervell and Styer selected Groves for the post, informing him on 17 September of this decision, and that General Marshall ordered that he be promoted to brigadier general, as it was felt that the title "general" would hold more sway with the academic scientists working on the Manhattan Project. Groves' orders placed him directly under Somervell rather than Reybold, with Colonel Marshall now answerable to Groves.
Groves established his headquarters in Washington, D.C., on the fifth floor of the New War Department Building, where Colonel Marshall had his liaison office. He assumed command of the Manhattan Project on 23 September 1942.
Later that day, he attended a meeting called by Stimson, which established a Military Policy Committee, responsible to the Top Policy Group, consisting of Bush (with Conant as an alternate), Styer and Rear Admiral William R. Purnell. Tolman and Conant were later appointed as Groves' scientific advisers.
On 19 September, Groves went to Donald Nelson, the chairman of the War Production Board, and asked for broad authority to issue a AAA rating whenever it was required. Nelson initially balked but quickly caved in when Groves threatened to go to the President.
Groves promised not to use the AAA rating unless it was necessary. It soon transpired that for the routine requirements of the project the AAA rating was too high but the AA-3 rating was too low. After a long campaign, Groves finally received AA-1 authority on 1 July 1944.
According to Groves, "In Washington you became aware of the importance of top priority. Most everything proposed in the Roosevelt administration would have top priority. That would last for about a week or two and then something else would get top priority".
One of Groves' early problems was to find a director for Project Y, the group that would design and build the bomb. The obvious choice was one of the three laboratory heads, Urey, Lawrence, or Compton, but they could not be spared. Compton recommended Oppenheimer, who was already intimately familiar with the bomb design concepts. However, Oppenheimer had little administrative experience, and, unlike Urey, Lawrence, and Compton, had not won a Nobel Prize, which many scientists felt that the head of such an important laboratory should have.
There were also concerns about Oppenheimer's security status, as many of his associates were communists, including his wife, Kitty (Katherine Oppenheimer); his girlfriend, Jean Tatlock; and his brother, Frank Oppenheimer. A long conversation on a train in October 1942 convinced Groves and Nichols that Oppenheimer thoroughly understood the issues involved in setting up a laboratory in a remote area and should be appointed as its director.
Groves personally waived the security requirements and issued Oppenheimer a clearance on 20 July 1943.
Collaboration with the United Kingdom:
Main article: British contribution to the Manhattan Project
The British and Americans exchanged nuclear information but did not initially combine their efforts. Britain rebuffed attempts by Bush and Conant in 1941 to strengthen cooperation with its own project, codenamed Tube Alloys, because it was reluctant to share its technological lead and help the United States develop its own atomic bomb.
But the British, who had made significant contributions early in the war, did not have the resources to carry through such a research program while fighting for their survival, and Tube Alloys soon fell behind its American counterpart.
The roles of the two countries were reversed, and in January 1943 Conant notified the British that they would no longer receive atomic information except in certain areas.
The British investigated the possibility of an independent nuclear program, but determined that it could not be ready in time to affect the outcome of the war in Europe.
By March 1943 Conant decided that British help would benefit some areas of the project. James Chadwick and one or two other British scientists were important enough that the bomb design team at Los Alamos needed them, despite the risk of revealing weapon design secrets.
In August 1943 Churchill and Roosevelt negotiated the Quebec Agreement, which resulted in a resumption of cooperation. The subsequent Hyde Park Agreement in September 1944 extended this cooperation to the postwar period.
The Quebec Agreement established the Combined Policy Committee to coordinate the efforts of the United States, United Kingdom and Canada. Stimson, Bush and Conant served as the American members of the Combined Policy Committee, Field Marshal Sir John Dill and Colonel J. J. Llewellin were the British members, and C. D. Howe was the Canadian member.
Llewellin returned to the United Kingdom at the end of 1943 and was replaced on the committee by Sir Ronald Ian Campbell, who in turn was replaced by the British Ambassador to the United States, Lord Halifax, in early 1945. Sir John Dill died in Washington, D.C., in November 1944 and was replaced both as Chief of the British Joint Staff Mission and as a member of the Combined Policy Committee by Field Marshal Sir Henry Maitland Wilson.
When cooperation resumed after the Quebec agreement, the Americans' progress and expenditures amazed the British. Chadwick pressed for British involvement in the Manhattan Project to the fullest extent and abandoned any hopes of an independent British project during the war.
With Churchill's backing, he attempted to ensure that every request from Groves for assistance was honored. The British Mission that arrived in the United States in December 1943 included Niels Bohr, Otto Frisch, Klaus Fuchs, Rudolf Peierls, and Ernest Titterton.
More scientists arrived in early 1944. While those assigned to gaseous diffusion left by the fall of 1944, the thirty-five working under Oliphant with Lawrence at Berkeley were assigned to existing laboratory groups and most stayed until the end of the war. The nineteen sent to Los Alamos also joined existing groups, primarily related to implosion and bomb assembly, but not the plutonium-related ones.
The Quebec Agreement specified that nuclear weapons would not be used against another country without the mutual consent of the US and UK. In June 1945, Wilson agreed that the use of nuclear weapons against Japan would be recorded as a decision of the Combined Policy Committee.
The Combined Policy Committee created the Combined Development Trust in June 1944, with Groves as its chairman, to procure uranium and thorium ores on international markets.
The Belgian Congo and Canada held much of the world's uranium outside Eastern Europe, and the Belgian government in exile was in London. Britain agreed to give the United States most of the Belgian ore, as it could not use most of the supply without restricted American research.
In 1944, the Trust purchased 3,440,000 pounds (1,560,000 kg) of uranium oxide ore from companies operating mines in the Belgian Congo. In order to avoid briefing US Secretary of the Treasury Henry Morgenthau Jr. on the project, a special account not subject to the usual auditing and controls was used to hold Trust monies.
Between 1944 and the time he resigned from the Trust in 1947, Groves deposited a total of $37.5 million into the Trust's account.
Groves appreciated the early British atomic research and the British scientists' contributions to the Manhattan Project, but stated that the United States would have succeeded without them. The British wartime participation was crucial to the success of the United Kingdom's independent nuclear weapons program after the war when the McMahon Act of 1946 temporarily ended American nuclear cooperation.
Project sites:
The Tragedy of J. Robert Oppenheimer
July 17, 2023
By Kai Bird
Mr. Bird is the director of the Leon Levy Center for Biography and co-author with the late Martin J. Sherwin of “American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer.”
One day in the spring of 1954, J. Robert Oppenheimer ran into Albert Einstein outside their offices at the Institute for Advanced Study in Princeton, N.J. Oppenheimer had been the director of the institute since 1947 and Einstein a faculty member since he fled Germany in 1933. The two men might argue about quantum physics — Einstein grumbled that he just didn’t think that God played dice with the universe — but they were good friends.
Oppenheimer took the occasion to explain to Einstein that he was going to be absent from the institute for some weeks. He was being forced to defend himself in Washington, D.C., during a secret hearing against charges that he was a security risk, and perhaps even disloyal.
Einstein argued that Oppenheimer “had no obligation to subject himself to the witch hunt, that he had served his country well, and that if this was the reward she [America] offered he should turn his back on her.” Oppenheimer demurred, saying he could not turn his back on America. “He loved America,” said Verna Hobson, his secretary who was a witness to the conversation, “and this love was as deep as his love of science.”
“Einstein doesn’t understand,” Oppenheimer told Ms. Hobson. But as Einstein walked back into his office he told his own assistant, nodding in the direction of Oppenheimer, “There goes a narr,” or fool.
Einstein was right. Oppenheimer was foolishly subjecting himself to a kangaroo court in which he was soon stripped of his security clearance and publicly humiliated. The charges were flimsy, but by a vote of 2 to 1 the security panel of the Atomic Energy Commission deemed Oppenheimer a loyal citizen who was nevertheless a security risk: “We find that Dr. Oppenheimer’s continuing conduct and association have reflected a serious disregard for the requirements of the security system.”
The scientist would no longer be trusted with the nation’s secrets. Celebrated in 1945 as the “father of the atomic bomb,” nine years later he would become the chief celebrity victim of the McCarthyite maelstrom.
Oppenheimer may have been naïve, but he was right to fight the charges — and right to use his influence as one of the country’s pre-eminent scientists to speak out against a nuclear arms race. In the months and years leading up to the security hearing, Oppenheimer had criticized the decision to build a “super” hydrogen bomb. Astonishingly, he had gone so far as to say that the Hiroshima bomb was used “against an essentially defeated enemy.”
The atomic bomb, he warned, “is a weapon for aggressors, and the elements of surprise and terror are as intrinsic to it as are the fissionable nuclei.” These forthright dissents against the prevailing view of Washington’s national security establishment earned him powerful political enemies. That was precisely why he was being charged with disloyalty.
It is my hope that Christopher Nolan’s stunning new film on Oppenheimer’s complicated legacy will initiate a national conversation not only about our existential relationship to weapons of mass destruction, but also the need in our society for scientists as public intellectuals. Mr. Nolan’s three-hour film is a riveting thriller and mystery story that delves deeply into what this country did to its most famous scientist.
Sadly, Oppenheimer’s life story is relevant to our current political predicaments.
Oppenheimer was destroyed by a political movement characterized by rank know-nothing, anti-intellectual, xenophobic demagogues. The witch-hunters of that season are the direct ancestors of our current political actors of a certain paranoid style.
I’m thinking of Roy Cohn, Senator Joseph McCarthy’s chief counsel, who tried to subpoena Oppenheimer in 1954, only to be warned that this could interfere with the impending security hearing against Oppenheimer.
Yes, that Roy Cohn, who taught former President Donald Trump his brash, wholly deranged style of politics. Just recall the former president’s fact-challenged comments on the pandemic or climate change. This is a worldview proudly scornful of science.
After America’s most celebrated scientist was falsely accused and publicly humiliated, the Oppenheimer case sent a warning to all scientists not to stand up in the political arena as public intellectuals. This was the real tragedy of Oppenheimer. What happened to him also damaged our ability as a society to debate honestly about scientific theory — the very foundation of our modern world.
Quantum physics has utterly transformed our understanding of the universe. And this science has also given us a revolution in computing power and incredible biomedical innovations to prolong human life. Yet, too many of our citizens still distrust scientists and fail to understand the scientific quest, the trial and error inherent in testing any theory against facts by experimenting.
Just look at what happened to our public health civil servants during the recent pandemic.
Editors’ Picks:
- It’s Marathon Training Season.
- Here’s How to Build a Foundation.
- A Perfect Weekend in Telluride
- Furby, Is That You?
We stand on the cusp of another technological revolution in which artificial intelligence will transform how we live and work, and yet we are not yet having the kind of informed civil discourse with its innovators that could help us to make wise policy decisions on its regulation.
Our politicians need to listen more to technology innovators like Sam Altman and quantum physicists like Kip Thorne and Michio Kaku.
Oppenheimer was trying desperately to have that kind of conversation about nuclear weapons. He was trying to warn our generals that these are not battlefield weapons, but weapons of pure terror. But our politicians chose to silence him; the result was that we spent the Cold War engaged in a costly and dangerous arms race.
Today, Vladimir Putin’s not-so-veiled threats to deploy tactical nuclear weapons in the war in Ukraine are a stark reminder that we can never be complacent about living with nuclear weapons. Oppenheimer did not regret what he did at Los Alamos; he understood that you cannot stop curious human beings from discovering the physical world around them.
One cannot halt the scientific quest, nor can one un-invent the atomic bomb.
But Oppenheimer always believed that human beings could learn to regulate these technologies and integrate them into a sustainable and humane civilization. We can only hope he was right.
[End of Opinion Piece]
___________________________________________________________________________
Robert Oppenheimer -- Wikipedia:
Julius Robert Oppenheimer(April 22, 1904 – February 18, 1967) was an American theoretical physicist and director of the Los Alamos Laboratory during World War II. He is often credited as the "father of the atomic bomb" for his role in organizing the Manhattan Project, the research and development undertaking that created the first nuclear weapons (see further below).
Born to German Jewish immigrants in New York City, Oppenheimer earned a bachelor's degree in chemistry from Harvard University in 1925 and a PhD in physics from the University of Göttingen in Germany in 1927.
After research at other institutions, he joined the physics department at the University of California, Berkeley, where he became a full professor in 1936. He made significant contributions to theoretical physics, including:
- achievements in quantum mechanics and nuclear physics such as the Born–Oppenheimer approximation for molecular wave functions,
- work on the theory of electrons and positrons,
- the Oppenheimer–Phillips process in nuclear fusion,
- and the first prediction of quantum tunneling.
With his students, he also made contributions to the theory of neutron stars and black holes, quantum field theory, and the interactions of cosmic rays.
In 1942, Oppenheimer was recruited to work on the Manhattan Project, and in 1943 was appointed director of the project's Los Alamos Laboratory in New Mexico, tasked with developing the first nuclear weapons; his leadership and scientific expertise were instrumental in the success of the project.
On July 16, 1945, he was present at the Trinity test of the first atomic bomb. In August 1945, the weapons were used in the bombings of Hiroshima and Nagasaki in Japan, which remain the only use of nuclear weapons in an armed conflict.
In 1947, Oppenheimer became the director of the Institute for Advanced Study in Princeton, New Jersey, and chaired the influential General Advisory Committee of the newly created United States Atomic Energy Commission. He lobbied for international control of nuclear power, to avert nuclear proliferation and a nuclear arms race with the Soviet Union.
He opposed the development of the hydrogen bomb during a 1949–1950 governmental debate on the question. During the Second Red Scare, Oppenheimer's stances, together with past associations he had with people and organizations affiliated with the Communist Party USA, led to the revocation of his security clearance following a 1954 security hearing.
In 1963, President John F. Kennedy awarded him (and Lyndon B. Johnson presented him with) the Enrico Fermi Award as a gesture of political rehabilitation. In 2022, the U.S. government vacated its 1954 decision, saying that the process had been flawed.
Click on any of the following blue hyperlinks for more about J. Robert Oppenheimer:
- Early life
- Early career
- Private and political life
- Manhattan Project
- Postwar activities
- Final years and death
- Legacy
- See also:
- Biography and online exhibit created for the centennial of his birth
- 1965 Audio Interview with J. Robert Oppenheimer by Stephane Groueff Voices of the Manhattan Project
- Was Oppenheimer a member of the Communist Party? documents on the question
- On Atomic Energy, Problems to Civilization audio file of UC Berkeley talk, November 1946
- Oppenheimer talking about the experience of the first bomb test (video file, "Now I am become death, destroyer of worlds.")
- "Freedom and Necessity in the Sciences" audio and documents from a lecture at Dartmouth College, April 1959
[Your WebHost: the article on the Manhattan Project below is lengthy and highly technical: to instead be taken the Topic Outline for a selective reading, Click Here.]
The Manhattan Project:
The Manhattan Project was a research and development undertaking during World War II that produced the first nuclear weapons. It was led by the United States with the support of the United Kingdom and Canada.
From 1942 to 1946, the project was under the direction of Major General Leslie Groves of the U.S. Army Corps of Engineers. The nuclear physicist Robert Oppenheimer was the director of the Los Alamos Laboratory that designed the bombs.
The Army component was designated the Manhattan District, as its first headquarters were in Manhattan; the name gradually superseded the official codename, Development of Substitute Materials, for the entire project.
The project absorbed its earlier British counterpart, Tube Alloys. The Manhattan Project began modestly in 1939, but employed nearly 130,000 people at its peak and cost nearly US$2 billion (equivalent to about $24 billion in 2021).
Over 90 percent of the cost was for building factories and producing fissile material, with less than 10 percent for the development and production of the weapons. Research and production took place at more than 30 sites across the United States, the United Kingdom, and Canada.
The project led to the development of two types of atomic bombs, both developed concurrently, during the war: a relatively simple gun-type fission weapon and a more complex implosion-type nuclear weapon.
The Thin Man gun-type design proved impractical to use with plutonium, so a simpler gun-type design called Little Boy was developed that used uranium-235, an isotope that makes up only 0.7 percent of natural uranium. Because it is chemically identical to the most common isotope, uranium-238, and has almost the same mass, separating the two proved difficult.
Three methods were employed for uranium enrichment:
- electromagnetic separation,
- gaseous diffusion,
- and thermal diffusion.
Most of this work was carried out at the Clinton Engineer Works at Oak Ridge, Tennessee.
In parallel with the work on uranium was an effort to produce plutonium, which researchers at the University of California, Berkeley, discovered in 1940.
After the feasibility of the world's first artificial nuclear reactor, the Chicago Pile-1, was demonstrated in 1942 at the Metallurgical Laboratory in the University of Chicago, the project designed the X-10 Graphite Reactor at Oak Ridge and the production reactors at the Hanford Site in Washington state, in which uranium was irradiated and transmuted into plutonium. The plutonium was then chemically separated from the uranium, using the bismuth phosphate process.
The Fat Man plutonium implosion-type weapon was developed in a concerted design and development effort by the Los Alamos Laboratory.
The project was also charged with gathering intelligence on the German nuclear weapon project. Through Operation Alsos, Manhattan Project personnel served in Europe, sometimes behind enemy lines, where they gathered nuclear materials and documents, and rounded up German scientists. Despite the Manhattan Project's tight security, Soviet atomic spies successfully penetrated the program.
The first nuclear device ever detonated was an implosion-type bomb during the Trinity test, conducted at New Mexico's Alamogordo Bombing and Gunnery Range on 16 July 1945. Little Boy and Fat Man bombs were used a month later in the atomic bombings of Hiroshima and Nagasaki, respectively, with Manhattan Project personnel serving as bomb assembly technicians and weaponeers on the attack aircraft.
In the immediate postwar years, the Manhattan Project conducted weapons testing at Bikini Atoll as part of Operation Crossroads, developed new weapons, promoted the development of the network of national laboratories, supported medical research into radiology and laid the foundations for the nuclear navy.
It maintained control over American atomic weapons research and production until the formation of the United States Atomic Energy Commission in January 1947.
Origins:
For a chronological guide, see Timeline of the Manhattan Project.
The discovery of nuclear fission by German chemists Otto Hahn and Fritz Strassmann in 1938, and its theoretical explanation by Lise Meitner and Otto Frisch, made the development of an atomic bomb a theoretical possibility.
There were fears that a German atomic bomb project would develop one first, especially among scientists who were refugees from Nazi Germany and other fascist countries.
In August 1939, Hungarian-born physicists Leo Szilard and Eugene Wigner drafted the Einstein–Szilard letter, which warned of the potential development of "extremely powerful bombs of a new type". It urged the United States to take steps to acquire stockpiles of uranium ore and accelerate the research of Enrico Fermi and others into nuclear chain reactions.
They had it signed by Albert Einstein and delivered to President Franklin D. Roosevelt. Roosevelt called on Lyman Briggs of the National Bureau of Standards to head the Advisory Committee on Uranium to investigate the issues raised by the letter. Briggs held a meeting on 21 October 1939, which was attended by Szilard, Wigner and Edward Teller.
The committee reported back to Roosevelt in November that uranium "would provide a possible source of bombs with a destructiveness vastly greater than anything now known."
In February 1940, the U.S. Navy awarded Columbia University $6,000 in funding, most of which Enrico Fermi and Szilard spent on purchasing graphite. A team of Columbia professors including Fermi, Szilard, Eugene T. Booth and John Dunning created the first nuclear fission reaction in the Americas, verifying the work of Hahn and Strassmann.
The same team subsequently built a series of prototype nuclear reactors (or "piles" as Fermi called them) in Pupin Hall at Columbia, but were not yet able to achieve a chain reaction.
The Advisory Committee on Uranium became the National Defense Research Committee (NDRC) on Uranium when that organization was formed on 27 June 1940.
Briggs proposed spending $167,000 on research into uranium, particularly the uranium-235 isotope, and plutonium, which was discovered in 1940 at the University of California.
On 28 June 1941, Roosevelt signed Executive Order 8807, which created the Office of Scientific Research and Development (OSRD), with Vannevar Bush as its director. The office was empowered to engage in large engineering projects in addition to research. The NDRC Committee on Uranium became the S-1 Section of the OSRD; the word "uranium" was dropped for security reasons.
In Britain, Frisch and Rudolf Peierls at the University of Birmingham had made a breakthrough investigating the critical mass of uranium-235 in June 1939. Their calculations indicated that it was within an order of magnitude of 10 kilograms (22 lb), which was small enough to be carried by a bomber of the day. Their March 1940 Frisch–Peierls memorandum initiated the British atomic bomb project and its MAUD Committee, which unanimously recommended pursuing the development of an atomic bomb.
In July 1940, Britain had offered to give the United States access to its scientific research, and the Tizard Mission's John Cockcroft briefed American scientists on British developments. He discovered that the American project was smaller than the British, and not as far advanced.
As part of the scientific exchange, the MAUD Committee's findings were conveyed to the United States. One of its members, the Australian physicist Mark Oliphant, flew to the United States in late August 1941 and discovered that data provided by the MAUD Committee had not reached key American physicists.
Oliphant then set out to find out why the committee's findings were apparently being ignored. He met with the Uranium Committee and visited Berkeley, California, where he spoke persuasively to Ernest O. Lawrence. Lawrence was sufficiently impressed to commence his own research into uranium. He in turn spoke to James B. Conant, Arthur H. Compton and George B. Pegram.
Oliphant's mission was therefore a success; key American physicists were now aware of the potential power of an atomic bomb.
On 9 October 1941, President Roosevelt approved the atomic program after he convened a meeting with Vannevar Bush and Vice President Henry A. Wallace. To control the program, he created a Top Policy Group consisting of himself—although he never attended a meeting—Wallace, Bush, Conant, Secretary of War Henry L. Stimson, and the Chief of Staff of the Army, General George C. Marshall. Roosevelt chose the Army to run the project rather than the Navy, because the Army had more experience with management of large-scale construction projects.
Roosevelt also agreed to coordinate the effort with that of the British, and on 11 October he sent a message to Prime Minister Winston Churchill, suggesting that they correspond on atomic matters.
Feasibility:
Proposals:
The S-1 Committee held its meeting on 18 December 1941 "pervaded by an atmosphere of enthusiasm and urgency" in the wake of the attack on Pearl Harbor and the subsequent United States declaration of war upon Japan and then on Germany. Work was proceeding on three different techniques for isotope separation to separate uranium-235 from the more abundant uranium-238.
Lawrence and his team at the University of California investigated electromagnetic separation, while Eger Murphree and Jesse Wakefield Beams's team looked into gaseous diffusion at Columbia University, and Philip Abelson directed research into thermal diffusion at the Carnegie Institution of Washington and later the Naval Research Laboratory. Murphree was also the head of an unsuccessful separation project using gas centrifuges.
Meanwhile, there were two lines of research into nuclear reactor technology, with Harold Urey continuing research into heavy water at Columbia, while Arthur Compton brought the scientists working under his supervision from Columbia, California and Princeton University to join his team at the University of Chicago, where he organized the Metallurgical Laboratory in early 1942 to study plutonium and reactors using graphite as a neutron moderator.
Briggs, Compton, Lawrence, Murphree, and Urey met on 23 May 1942, to finalize the S-1 Committee recommendations, which called for all five technologies to be pursued. This was approved by Bush, Conant, and Brigadier General Wilhelm D. Styer, the chief of staff of Major General Brehon B. Somervell's Services of Supply, who had been designated the Army's representative on nuclear matters.
Bush and Conant then took the recommendation to the Top Policy Group with a budget proposal for $54 million for construction by the United States Army Corps of Engineers, $31 million for research and development by OSRD and $5 million for contingencies in fiscal year 1943.
The Top Policy Group sent it on 17 June 1942, to the President, who approved it by writing "OK FDR" on the document.
Bomb design concepts:
Compton asked theoretical physicist J. Robert Oppenheimer of the University of California to take over research into fast neutron calculations—the key to calculations of critical mass and weapon detonation—from Gregory Breit, who had quit on 18 May 1942 because of concerns over lax operational security.
John H. Manley, a physicist at the Metallurgical Laboratory, was assigned to assist Oppenheimer by contacting and coordinating experimental physics groups scattered across the country. Oppenheimer and Robert Serber of the University of Illinois examined the problems of neutron diffusion—how neutrons moved in a nuclear chain reaction—and hydrodynamics—how the explosion produced by a chain reaction might behave.
To review this work and the general theory of fission reactions, Oppenheimer and Fermi convened meetings at the University of Chicago in June and at the University of California in July 1942 with theoretical physicists:
- Hans Bethe,
- John Van Vleck,
- Edward Teller,
- Emil Konopinski,
- Robert Serber,
- Stan Frankel,
- and Eldred C. (Carlyle) Nelson,
- (the last three were former students of Oppenheimer),
- as well as several experimental physicists.
They tentatively confirmed that a fission bomb was theoretically possible.
There were still many unknown factors. The properties of pure uranium-235 were relatively unknown, as were those of plutonium, an element that had only been discovered in February 1941 by Glenn Seaborg and his team. The scientists at the (July 1942) Berkeley conference envisioned creating plutonium in nuclear reactors where uranium-238 atoms absorbed neutrons that had been emitted from fissioning uranium-235 atoms.
At this point no reactor had been built, and only tiny quantities of plutonium were available from cyclotrons at institutions such as Washington University in St. Louis.
Even by December 1943, only two milligrams had been produced. There were many ways of arranging the fissile material into a critical mass. The simplest was shooting a "cylindrical plug" into a sphere of "active material" with a "tamper"—dense material that would focus neutrons inward and keep the reacting mass together to increase its efficiency.
They also explored designs involving spheroids, a primitive form of "implosion" suggested by Richard C. Tolman, and the possibility of autocatalytic methods, which would increase the efficiency of the bomb as it exploded.
Considering the idea of the fission bomb theoretically settled—at least until more experimental data was available—the 1942 Berkeley conference then turned in a different direction. Edward Teller pushed for discussion of a more powerful bomb: the "super", now usually referred to as a "hydrogen bomb", which would use the explosive force of a detonating fission bomb to ignite a nuclear fusion reaction in deuterium and tritium.
Teller proposed scheme after scheme, but Bethe refused each one. The fusion idea was put aside to concentrate on producing fission bombs. Teller also raised the speculative possibility that an atomic bomb might "ignite" the atmosphere because of a hypothetical fusion reaction of nitrogen nuclei.
Bethe calculated that it could not happen, and a report co-authored by Teller showed that "no self-propagating chain of nuclear reactions is likely to be started." In Serber's account, Oppenheimer mentioned the possibility of this scenario to Arthur Compton, who "didn't have enough sense to shut up about it. It somehow got into a document that went to Washington" and was "never laid to rest".
Organization:
Manhattan District:
The Chief of Engineers, Major General Eugene Reybold, selected Colonel James C. Marshall to head the Army's part of the project in June 1942. Marshall created a liaison office in Washington, D.C., but established his temporary headquarters on the 18th floor of 270 Broadway in New York, where he could draw on administrative support from the Corps of Engineers' North Atlantic Division.
It was close to the Manhattan office of Stone & Webster, the principal project contractor, and to Columbia University. He had permission to draw on his former command, the Syracuse District, for staff, and he started with Lieutenant Colonel Kenneth Nichols, who became his deputy.
Because most of his task involved construction, Marshall worked in cooperation with the head of the Corps of Engineers Construction Division, Major General Thomas M. Robbins, and his deputy, Colonel Leslie Groves. Reybold, Somervell, and Styer decided to call the project "Development of Substitute Materials", but Groves felt that this would draw attention.
Since engineer districts normally carried the name of the city where they were located, Marshall and Groves agreed to name the Army's component of the project the Manhattan District. This became official on 13 August when Reybold issued the order creating the new district. Informally, it was known as the Manhattan Engineer District, or MED.
Unlike other districts, it had no geographic boundaries, and Marshall had the authority of a division engineer. Development of Substitute Materials remained as the official codename of the project as a whole, but was supplanted over time by "Manhattan".
Marshall later conceded that, "I had never heard of atomic fission but I did know that you could not build much of a plant, much less four of them for $90 million." A single TNT plant that Nichols had recently built in Pennsylvania had cost $128 million.
Nor were they impressed with estimates to the nearest order of magnitude, which Groves compared with telling a caterer to prepare for between ten and a thousand guests.
A survey team from Stone & Webster had already scouted a site for the production plants. The War Production Board recommended sites around Knoxville, Tennessee, an isolated area where the Tennessee Valley Authority could supply ample electric power and the rivers could provide cooling water for the reactors.
After examining several sites, the survey team selected one near Elza, Tennessee. Conant advised that it be acquired at once and Styer agreed but Marshall temporized, awaiting the results of Conant's reactor experiments before taking action. Of the prospective processes, only Lawrence's electromagnetic separation appeared sufficiently advanced for construction to commence.
Marshall and Nichols began assembling the resources they would need. The first step was to obtain a high priority rating for the project. The top ratings were AA-1 through AA-4 in descending order, although there was also a special AAA rating reserved for emergencies.
Ratings AA-1 and AA-2 were for essential weapons and equipment, so Colonel Lucius D. Clay, the deputy chief of staff at Services of Supply for requirements and resources, felt that the highest rating he could assign was AA-3, although he was willing to provide a AAA rating on request for critical materials if the need arose.
Nichols and Marshall were disappointed; AA-3 was the same priority as Nichols' TNT plant in Pennsylvania.
Military Policy Committee:
Vannevar Bush became dissatisfied with Colonel Marshall's failure to get the project moving forward expeditiously, specifically the failure to acquire the Tennessee site, the low priority allocated to the project by the Army and the location of his headquarters in New York City.
Bush felt that more aggressive leadership was required, and spoke to Harvey Bundy and Generals Marshall, Somervell, and Styer about his concerns. He wanted the project placed under a senior policy committee, with a prestigious officer, preferably Styer, as overall director.
Somervell and Styer selected Groves for the post, informing him on 17 September of this decision, and that General Marshall ordered that he be promoted to brigadier general, as it was felt that the title "general" would hold more sway with the academic scientists working on the Manhattan Project. Groves' orders placed him directly under Somervell rather than Reybold, with Colonel Marshall now answerable to Groves.
Groves established his headquarters in Washington, D.C., on the fifth floor of the New War Department Building, where Colonel Marshall had his liaison office. He assumed command of the Manhattan Project on 23 September 1942.
Later that day, he attended a meeting called by Stimson, which established a Military Policy Committee, responsible to the Top Policy Group, consisting of Bush (with Conant as an alternate), Styer and Rear Admiral William R. Purnell. Tolman and Conant were later appointed as Groves' scientific advisers.
On 19 September, Groves went to Donald Nelson, the chairman of the War Production Board, and asked for broad authority to issue a AAA rating whenever it was required. Nelson initially balked but quickly caved in when Groves threatened to go to the President.
Groves promised not to use the AAA rating unless it was necessary. It soon transpired that for the routine requirements of the project the AAA rating was too high but the AA-3 rating was too low. After a long campaign, Groves finally received AA-1 authority on 1 July 1944.
According to Groves, "In Washington you became aware of the importance of top priority. Most everything proposed in the Roosevelt administration would have top priority. That would last for about a week or two and then something else would get top priority".
One of Groves' early problems was to find a director for Project Y, the group that would design and build the bomb. The obvious choice was one of the three laboratory heads, Urey, Lawrence, or Compton, but they could not be spared. Compton recommended Oppenheimer, who was already intimately familiar with the bomb design concepts. However, Oppenheimer had little administrative experience, and, unlike Urey, Lawrence, and Compton, had not won a Nobel Prize, which many scientists felt that the head of such an important laboratory should have.
There were also concerns about Oppenheimer's security status, as many of his associates were communists, including his wife, Kitty (Katherine Oppenheimer); his girlfriend, Jean Tatlock; and his brother, Frank Oppenheimer. A long conversation on a train in October 1942 convinced Groves and Nichols that Oppenheimer thoroughly understood the issues involved in setting up a laboratory in a remote area and should be appointed as its director.
Groves personally waived the security requirements and issued Oppenheimer a clearance on 20 July 1943.
Collaboration with the United Kingdom:
Main article: British contribution to the Manhattan Project
The British and Americans exchanged nuclear information but did not initially combine their efforts. Britain rebuffed attempts by Bush and Conant in 1941 to strengthen cooperation with its own project, codenamed Tube Alloys, because it was reluctant to share its technological lead and help the United States develop its own atomic bomb.
But the British, who had made significant contributions early in the war, did not have the resources to carry through such a research program while fighting for their survival, and Tube Alloys soon fell behind its American counterpart.
The roles of the two countries were reversed, and in January 1943 Conant notified the British that they would no longer receive atomic information except in certain areas. The British investigated the possibility of an independent nuclear program, but determined that it could not be ready in time to affect the outcome of the war in Europe.
By March 1943 Conant decided that British help would benefit some areas of the project. James Chadwick and one or two other British scientists were important enough that the bomb design team at Los Alamos needed them, despite the risk of revealing weapon design secrets.
In August 1943 Churchill and Roosevelt negotiated the Quebec Agreement, which resulted in a resumption of cooperation. The subsequent Hyde Park Agreement in September 1944 extended this cooperation to the postwar period.
The Quebec Agreement established the Combined Policy Committee to coordinate the efforts of the United States, United Kingdom and Canada. Stimson, Bush and Conant served as the American members of the Combined Policy Committee, Field Marshal Sir John Dill and Colonel J. J. Llewellin were the British members, and C. D. Howe was the Canadian member.
Llewellin returned to the United Kingdom at the end of 1943 and was replaced on the committee by Sir Ronald Ian Campbell, who in turn was replaced by the British Ambassador to the United States, Lord Halifax, in early 1945. Sir John Dill died in Washington, D.C., in November 1944 and was replaced both as Chief of the British Joint Staff Mission and as a member of the Combined Policy Committee by Field Marshal Sir Henry Maitland Wilson.
When cooperation resumed after the Quebec agreement, the Americans' progress and expenditures amazed the British. Chadwick pressed for British involvement in the Manhattan Project to the fullest extent and abandoned any hopes of an independent British project during the war.
With Churchill's backing, he attempted to ensure that every request from Groves for assistance was honored. The British Mission that arrived in the United States in December 1943 included Niels Bohr, Otto Frisch, Klaus Fuchs, Rudolf Peierls, and Ernest Titterton.
More scientists arrived in early 1944. While those assigned to gaseous diffusion left by the fall of 1944, the thirty-five working under Oliphant with Lawrence at Berkeley were assigned to existing laboratory groups and most stayed until the end of the war. The nineteen sent to Los Alamos also joined existing groups, primarily related to implosion and bomb assembly, but not the plutonium-related ones.
The Quebec Agreement specified that nuclear weapons would not be used against another country without the mutual consent of the US and UK. In June 1945, Wilson agreed that the use of nuclear weapons against Japan would be recorded as a decision of the Combined Policy Committee.
The Combined Policy Committee created the Combined Development Trust in June 1944, with Groves as its chairman, to procure uranium and thorium ores on international markets.
The Belgian Congo and Canada held much of the world's uranium outside Eastern Europe, and the Belgian government in exile was in London. Britain agreed to give the United States most of the Belgian ore, as it could not use most of the supply without restricted American research.
In 1944, the Trust purchased 3,440,000 pounds (1,560,000 kg) of uranium oxide ore from companies operating mines in the Belgian Congo. In order to avoid briefing US Secretary of the Treasury Henry Morgenthau Jr. on the project, a special account not subject to the usual auditing and controls was used to hold Trust monies.
Between 1944 and the time he resigned from the Trust in 1947, Groves deposited a total of $37.5 million into the Trust's account.
Groves appreciated the early British atomic research and the British scientists' contributions to the Manhattan Project, but stated that the United States would have succeeded without them. The British wartime participation was crucial to the success of the United Kingdom's independent nuclear weapons program after the war when the McMahon Act of 1946 temporarily ended American nuclear cooperation.
Project sites:
Oak Ridge:
Main article: Clinton Engineer Works
The day after he took over the project, Groves took a train to Tennessee with Colonel Marshall to inspect the proposed site there, and Groves was impressed. On 29 September 1942, United States Under Secretary of War Robert P. Patterson authorized the Corps of Engineers to acquire 56,000 acres (23,000 ha) of land by eminent domain at a cost of $3.5 million. An additional 3,000 acres (1,200 ha) was subsequently acquired. About 1,000 families were affected by the condemnation order, which came into effect on 7 October.
Protests, legal appeals, and a 1943 Congressional inquiry were to no avail. By mid-November U.S. Marshals were tacking notices to vacate on farmhouse doors, and construction contractors were moving in.
Some families were given two weeks' notice to vacate farms that had been their homes for generations; others had settled there after being evicted to make way for the Great Smoky Mountains National Park in the 1920s or the Norris Dam in the 1930s.
The ultimate cost of land acquisition in the area, which was not completed until March 1945, was only about $2.6 million, which worked out to around $47 an acre. When presented with Public Proclamation Number Two, which declared Oak Ridge a total exclusion area that no one could enter without military permission, the Governor of Tennessee, Prentice Cooper, angrily tore it up.
Initially known as the Kingston Demolition Range, the site was officially renamed the Clinton Engineer Works (CEW) in early 1943. While Stone & Webster concentrated on the production facilities, the architectural and engineering firm Skidmore, Owings & Merrill designed and built a residential community for 13,000. The community was located on the slopes of Black Oak Ridge, from which the new town of Oak Ridge got its name.
The Army presence at Oak Ridge increased in August 1943 when Nichols replaced Marshall as head of the Manhattan Engineer District. One of his first tasks was to move the district headquarters to Oak Ridge although the name of the district did not change.
In September 1943 the administration of community facilities was outsourced to Turner Construction Company through a subsidiary, the Roane-Anderson Company (for Roane and Anderson Counties, in which Oak Ridge was located).
Chemical engineers, including William J. (Jenkins) Wilcox Jr. and Warren Fuchs, were part of "frantic efforts" to make 10% to 12% enriched uranium-235, known by the code name "tuballoy tetroxide", with tight security and fast approvals for supplies and materials.
The population of Oak Ridge soon expanded well beyond the initial plans, and peaked at 75,000 in May 1945, by which time 82,000 people were employed at the Clinton Engineer Works, and 10,000 by Roane-Anderson.
Fine-arts photographer Josephine Herrick and her colleague Mary Steers helped document the work at Oak Ridge.
Los Alamos:
Main article: Project Y
The idea of locating Project Y at Oak Ridge was considered, but in the end it was decided that it should be in a remote location. On Oppenheimer's recommendation, the search for a suitable site was narrowed to the vicinity of Albuquerque, New Mexico, where Oppenheimer owned a ranch.
In October 1942, Major John H. Dudley of the Manhattan District was sent to survey the area. He recommended a site near Jemez Springs, New Mexico. On 16 November, Oppenheimer, Groves, Dudley and others toured the site. Oppenheimer feared that the high cliffs surrounding the site would make his people feel claustrophobic, while the engineers were concerned with the possibility of flooding.
The party then moved on to the vicinity of the Los Alamos Ranch School. Oppenheimer was impressed and expressed a strong preference for the site, citing its natural beauty and views of the Sangre de Cristo Mountains, which, it was hoped, would inspire those who would work on the project.
The engineers were concerned about the poor access road, and whether the water supply would be adequate, but otherwise felt that it was ideal.
Patterson approved the acquisition of the site on 25 November 1942, authorizing $440,000 for the purchase of the site of 54,000 acres (22,000 ha), all but 8,900 acres (3,600 ha) of which were already owned by the Federal Government.
Secretary of Agriculture Claude R. Wickard granted use of some 45,100 acres (18,300 ha) of United States Forest Service land to the War Department "for so long as the military necessity continues". The need for land, for a new road, and later for a right of way for a 25-mile (40 km) power line, eventually brought wartime land purchases to 45,737 acres (18,509.1 ha), but only $414,971 was spent.
Construction was contracted to the M. M. Sundt Company of Tucson, Arizona, with Willard C. Kruger and Associates of Santa Fe, New Mexico, as architect and engineer. Work commenced in December 1942. Groves initially allocated $300,000 for construction, three times Oppenheimer's estimate, with a planned completion date of 15 March 1943. It soon became clear that the scope of Project Y was greater than expected, and by the time Sundt finished on 30 November 1943, over $7 million had been spent.
Because it was secret, Los Alamos was referred to as "Site Y" or "the Hill". Birth certificates of babies born in Los Alamos during the war listed their place of birth as PO Box 1663 in Santa Fe.
Initially Los Alamos was to have been a military laboratory with Oppenheimer and other researchers commissioned into the Army. Oppenheimer went so far as to order himself a lieutenant colonel's uniform, but two key physicists, Robert Bacher and Isidor Rabi, balked at the idea. Conant, Groves and Oppenheimer then devised a compromise whereby the laboratory was operated by the University of California under contract to the War Department.
Chicago:
Main article: Metallurgical Laboratory
An Army-OSRD council on 25 June 1942 decided to build a pilot plant for plutonium production in Red Gate Woods southwest of Chicago. In July, Nichols arranged for a lease of 1,025 acres (415 ha) from the Cook County Forest Preserve District, and Captain James F. Grafton (1908-1969) was appointed Chicago area engineer. It soon became apparent that the scale of operations was too great for the area, and it was decided to build the plant at Oak Ridge, and keep a research and testing facility in Chicago.
Delays in establishing the plant in Red Gate Woods led Compton to authorize the Metallurgical Laboratory to construct the first nuclear reactor beneath the bleachers of Stagg Field at the University of Chicago. The reactor required an enormous amount of graphite blocks and uranium pellets.
At the time, sources of pure uranium were limited. Frank Spedding of Iowa State University was able to produce only two short tons of pure uranium. An additional three short tons of uranium metal were supplied by the Westinghouse Lamp Plant, produced in a rush using a makeshift process.
A large square balloon was constructed by Goodyear Tire to encase the reactor. On 2 December 1942, a team led by Enrico Fermi initiated the first artificial self-sustaining nuclear chain reaction in an experimental reactor known as Chicago Pile-1.
The point at which a reaction becomes self-sustaining became known as "going critical". Compton reported the success to Conant in Washington, D.C., by a coded phone call, saying, "The Italian navigator [Fermi] has just landed in the new world."
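The notion of "going critical" can be made concrete with the effective neutron multiplication factor k: the average number of neutrons from one fission that go on to cause another fission. The short Python sketch below is purely illustrative; the values of k and the starting neutron population are arbitrary assumptions, not figures from Chicago Pile-1.

```python
# Illustrative only: k < 1 dies out (subcritical), k = 1 is self-sustaining
# (critical), k > 1 grows each generation (supercritical).

def neutron_population(k: float, start: int = 1000, generations: int = 10):
    """Return the neutron count after each generation for a multiplication factor k."""
    counts = [start]
    for _ in range(generations):
        counts.append(int(counts[-1] * k))
    return counts

if __name__ == "__main__":
    for k in (0.95, 1.00, 1.05):
        print(f"k = {k:.2f}: {neutron_population(k)}")
```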
In January 1943, Grafton's successor, Major Arthur V. Peterson, ordered Chicago Pile-1 dismantled and reassembled at Red Gate Woods, as he regarded the operation of a reactor as too hazardous for a densely populated area.
At the Argonne site, Chicago Pile-3, the first heavy water reactor, went critical on 15 May 1944. After the war, the operations that remained at Red Gate moved to the new site of the Argonne National Laboratory about 6 miles (9.7 km) away.
Hanford:
Main article: Hanford Engineer Works
By December 1942 there were concerns that even Oak Ridge was too close to a major population center (Knoxville) in the unlikely event of a major nuclear accident. Groves recruited DuPont in November 1942 to be the prime contractor for the construction of the plutonium production complex.
DuPont was offered a standard cost plus fixed-fee contract, but the President of the company, Walter S. Carpenter, Jr., wanted no profit of any kind, and asked for the proposed contract to be amended to explicitly exclude the company from acquiring any patent rights.
This was accepted, but for legal reasons a nominal fee of one dollar was agreed upon. After the war, DuPont asked to be released from the contract early, and had to return 33 cents.
DuPont recommended that the site be located far from the existing uranium production facility at Oak Ridge. In December 1942, Groves dispatched Colonel Franklin Matthias and DuPont engineers to scout potential sites. Matthias reported that Hanford Site near Richland, Washington, was "ideal in virtually all respects". It was isolated and near the Columbia River, which could supply sufficient water to cool the reactors that would produce the plutonium.
Groves visited the site in January and established the Hanford Engineer Works (HEW), codenamed "Site W".
Under Secretary Patterson gave his approval on 9 February, allocating $5 million for the acquisition of 430,000 acres (170,000 ha) of land in the area. The federal government relocated some 1,500 residents of White Bluffs and Hanford, and nearby settlements, as well as the Wanapum and other tribes using the area. A dispute arose with farmers over compensation for crops, which had already been planted before the land was acquired.
Where schedules allowed, the Army allowed the crops to be harvested, but this was not always possible. The land acquisition process dragged on and was not completed before the end of the Manhattan Project in December 1946.
The dispute did not delay work. Although progress on the reactor design at Metallurgical Laboratory and DuPont was not sufficiently advanced to accurately predict the scope of the project, a start was made in April 1943 on facilities for an estimated 25,000 workers, half of whom were expected to live on-site.
By July 1944, some 1,200 buildings had been erected and nearly 51,000 people were living in the construction camp. As area engineer, Matthias exercised overall control of the site.
At its peak, the construction camp was the third most populous town in Washington state. Hanford operated a fleet of over 900 buses, more than the city of Chicago. Like Los Alamos and Oak Ridge, Richland was a gated community with restricted access, but it looked more like a typical wartime American boomtown: the military profile was lower, and physical security elements like high fences, towers, and guard dogs were less evident.
Canadian sites:
Main article: Montreal Laboratory
British Columbia:
Cominco had produced electrolytic hydrogen at Trail, British Columbia, since 1930. Urey suggested in 1941 that it could produce heavy water. To the existing $10 million plant consisting of 3,215 cells consuming 75 MW of hydroelectric power, secondary electrolysis cells were added to increase the deuterium concentration in the water from 2.3% to 99.8%.
For this process, Hugh Taylor of Princeton developed a platinum-on-carbon catalyst for the first three stages while Urey developed a nickel-chromia one for the fourth stage tower. The final cost was $2.8 million. The Canadian Government did not officially learn of the project until August 1942.
Trail's heavy water production started in January 1944 and continued until 1956. Heavy water from Trail was used for Chicago Pile 3, the first reactor using heavy water and natural uranium, which went critical on 15 May 1944.
Ontario:
The Chalk River, Ontario, site was established to rehouse the Allied effort at the Montreal Laboratory away from an urban area. A new community was built at Deep River, Ontario, to provide residences and facilities for the team members. The site was chosen for its proximity to the industrial manufacturing area of Ontario and Quebec, and proximity to a rail head adjacent to a large military base, Camp Petawawa.
Located on the Ottawa River, it had access to abundant water. The first director of the new laboratory was Hans von Halban. He was replaced by John Cockcroft in May 1944, who in turn was succeeded by Bennett Lewis in September 1946.
A pilot reactor known as ZEEP (zero-energy experimental pile) became the first Canadian reactor, and the first to be completed outside the United States, when it went critical in September 1945. ZEEP remained in use by researchers until 1970.
A larger 10 MW NRX reactor, which was designed during the war, was completed and went critical in July 1947.
Northwest Territories:
The Eldorado Mine at Port Radium was a source of uranium ore.
Heavy water sites:
Main article: P-9 Project
Although DuPont's preferred designs for the nuclear reactors were helium cooled and used graphite as a moderator, DuPont still expressed an interest in using heavy water as a backup, in case the graphite reactor design proved infeasible for some reason. For this purpose, it was estimated that 3 short tons (2.7 t) of heavy water would be required per month. The P-9 Project was the government's code name for the heavy water production program.
As the plant at Trail, which was then under construction, could produce 0.5 short tons (0.45 t) per month, additional capacity was required. Groves therefore authorized DuPont to establish heavy water production facilities at several Army ordnance works.
Although known as Ordnance Works and paid for under Ordnance Department contracts, they were built and operated by the Army Corps of Engineers. The American plants used a process different from Trail's; heavy water was extracted by distillation, taking advantage of the slightly higher boiling point of heavy water.
Uranium:
Ore:
The key raw material for the project was uranium, which was used as fuel for the reactors, as feed that was transformed into plutonium, and, in its enriched form, in the atomic bomb itself. There were four known major deposits of uranium in 1940: in Colorado, in northern Canada, in Joachimsthal in Czechoslovakia, and in the Belgian Congo.
All but Joachimsthal were in Allied hands. A November 1942 survey determined that sufficient quantities of uranium were available to satisfy the project's requirements.
Nichols arranged with the State Department for export controls to be placed on uranium oxide and negotiated for the purchase of 1,200 short tons (1,100 t) of uranium ore from the Belgian Congo that was being stored in a warehouse on Staten Island and the remaining stocks of mined ore stored in the Congo. He negotiated with Eldorado Gold Mines for the purchase of ore from its refinery in Port Hope, Ontario, and its shipment in 100-ton lots. The Canadian government subsequently bought up the company's stock until it acquired a controlling interest.
While these purchases assured a sufficient supply to meet wartime needs, the American and British leaders concluded that it was in their countries' interest to gain control of as much of the world's uranium deposits as possible. The richest source of ore was the Shinkolobwe mine in the Belgian Congo, but it was flooded and closed.
Nichols unsuccessfully attempted to negotiate its reopening and the sale of the entire future output to the United States with Edgar Sengier, the director of the company that owned the mine, the Union Minière du Haut-Katanga. The matter was then taken up by the Combined Policy Committee. As 30 percent of Union Minière's stock was controlled by British interests, the British took the lead in negotiations.
Sir John Anderson and Ambassador John Winant hammered out a deal with Sengier and the Belgian government in May 1944 for the mine to be reopened and 1,720 short tons (1,560 t) of ore to be purchased at $1.45 a pound. To avoid dependence on the British and Canadians for ore, Groves also arranged for the purchase of US Vanadium Corporation's stockpile in Uravan, Colorado. Uranium mining in Colorado yielded about 800 short tons (730 t) of ore.
Mallinckrodt Incorporated in St. Louis, Missouri, took the raw ore and dissolved it in nitric acid to produce uranyl nitrate. Ether was then added in a liquid–liquid extraction process to separate the impurities from the uranyl nitrate. This was then heated to form uranium trioxide, which was reduced to highly pure uranium dioxide.
By July 1942, Mallinckrodt was producing a ton of highly pure oxide a day, but turning this into uranium metal initially proved more difficult for contractors Westinghouse and Metal Hydrides. Production was too slow and quality was unacceptably low. A special branch of the Metallurgical Laboratory was established at Iowa State College in Ames, Iowa, under Frank Spedding to investigate alternatives. This became known as the Ames Project, and its Ames process became available in 1943.
Isotope separation:
Natural uranium consists of 99.3% uranium-238 and 0.7% uranium-235, but only the latter is fissile. The chemically identical uranium-235 has to be physically separated from the more plentiful isotope. Various methods were considered for uranium enrichment; most of this work was carried out at Oak Ridge.
The most obvious technology, the centrifuge, failed, but electromagnetic separation, gaseous diffusion, and thermal diffusion technologies were all successful and contributed to the project. In February 1943, Groves came up with the idea of using the output of some plants as the input for others.
Centrifuges:
The centrifuge process was regarded as the only promising separation method in April 1942. Jesse Beams had developed such a process at the University of Virginia during the 1930s, but had encountered technical difficulties. The process required high rotational speeds, but at certain speeds harmonic vibrations developed that threatened to tear the machinery apart. It was therefore necessary to accelerate quickly through these speeds.
In 1941 he began working with uranium hexafluoride, the only known gaseous compound of uranium, and was able to separate uranium-235. At Columbia, Urey had Karl P. Cohen investigate the process, and he produced a body of mathematical theory making it possible to design a centrifugal separation unit, which Westinghouse undertook to construct.
Scaling this up to a production plant presented a formidable technical challenge. Urey and Cohen estimated that producing a kilogram (2.2 lb) of uranium-235 per day would require up to 50,000 centrifuges with 1-meter (3 ft 3 in) rotors, or 10,000 centrifuges with 4-meter (13 ft) rotors, assuming that 4-meter rotors could be built.
The prospect of keeping so many rotors operating continuously at high speed appeared daunting, and when Beams ran his experimental apparatus, he obtained only 60% of the predicted yield, indicating that more centrifuges would be required. Beams, Urey and Cohen then began work on a series of improvements which promised to increase the efficiency of the process.
However, frequent failures of motors, shafts and bearings at high speeds delayed work on the pilot plant. In November 1942 the centrifuge process was abandoned by the Military Policy Committee following a recommendation by Conant, Nichols and August C. Klein of Stone & Webster.
Although the centrifuge method was abandoned by the Manhattan Project, research into it advanced significantly after the war with the introduction of the Zippe-type centrifuge, which was developed in the Soviet Union by Soviet and captured German engineers.
It eventually became the preferred method of uranium isotope separation, being far more economical than the other separation methods used during World War II.
Electromagnetic separation:
Main article: Y-12 Project
Electromagnetic isotope separation was developed by Lawrence at the University of California Radiation Laboratory. This method employed devices known as calutrons, a hybrid of the standard laboratory mass spectrometer and the cyclotron magnet. The name was derived from the words California, university and cyclotron.
In the electromagnetic process, a magnetic field deflected charged particles according to mass. The process was neither scientifically elegant nor industrially efficient.
Compared with a gaseous diffusion plant or a nuclear reactor, an electromagnetic separation plant would consume more scarce materials, require more manpower to operate, and cost more to build.
Nonetheless, the process was approved because it was based on proven technology and therefore represented less risk. Moreover, it could be built in stages, and rapidly reach industrial capacity.
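As a rough physical sketch of why this works: an ion accelerated through a potential V and bent by a uniform magnetic field B follows a circle of radius r = sqrt(2mV/q)/B, so the radius grows with the square root of the ion's mass. The Python snippet below is a hedged illustration; the field strength, accelerating voltage, and singly charged ions are assumptions chosen only to show the roughly 0.6% difference in orbit radius that a calutron had to resolve, not calutron design data.

```python
import math

ELEMENTARY_CHARGE = 1.602e-19   # coulombs
ATOMIC_MASS_UNIT = 1.661e-27    # kilograms

def orbit_radius(mass_amu: float, volts: float, tesla: float, charge: int = 1) -> float:
    """Radius of a charged ion's circular path after acceleration through `volts`."""
    m = mass_amu * ATOMIC_MASS_UNIT
    q = charge * ELEMENTARY_CHARGE
    return math.sqrt(2.0 * m * volts / q) / tesla

# Assumed operating point (illustrative only):
r235 = orbit_radius(235.0, volts=35_000, tesla=0.34)
r238 = orbit_radius(238.0, volts=35_000, tesla=0.34)
print(f"U-235 orbit radius: {r235:.3f} m")
print(f"U-238 orbit radius: {r238:.3f} m")
print(f"radius ratio: {r238 / r235:.4f}")   # ~ sqrt(238/235) ≈ 1.006
```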
Marshall and Nichols discovered that the electromagnetic isotope separation process would require 5,000 short tons (4,500 tonnes) of copper, which was in desperately short supply.
However, silver could be substituted, in an 11:10 ratio. On 3 August 1942, Nichols met with Under Secretary of the Treasury Daniel W. Bell and asked for the transfer of 6,000 tons of silver bullion from the West Point Bullion Depository. "Young man," Bell told him, "you may think of silver in tons but the Treasury will always think of silver in troy ounces!"
Ultimately 14,700 short tons (13,300 tonnes; 430,000,000 troy ounces) were used.
The 1,000-troy-ounce (31 kg) silver bars were cast into cylindrical billets and taken to Phelps Dodge in Bayway, New Jersey, where they were extruded into strips 0.625 inches (15.9 mm) thick, 3 inches (76 mm) wide and 40 feet (12 m) long. These were wound onto magnetic coils by Allis-Chalmers in Milwaukee, Wisconsin.
After the war, all the machinery was dismantled and cleaned and the floorboards beneath the machinery were ripped up and burned to recover minute amounts of silver. In the end, only 1/3,600,000th was lost. The last silver was returned in May 1970.
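For readers checking the figures above, the short-ton and troy-ounce numbers quoted are mutually consistent; the snippet below only applies standard unit conversions to the tonnage given in the text.

```python
SHORT_TON_LB = 2000        # avoirdupois pounds per short ton
TROY_OZ_PER_LB = 14.5833   # troy ounces per avoirdupois pound (approximate)

short_tons = 14_700
troy_ounces = short_tons * SHORT_TON_LB * TROY_OZ_PER_LB
print(f"{short_tons:,} short tons ≈ {troy_ounces:,.0f} troy ounces")  # ≈ 429 million
```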
Responsibility for the design and construction of the electromagnetic separation plant, which came to be called Y-12, was assigned to Stone & Webster by the S-1 Committee in June 1942. The design called for five first-stage processing units, known as Alpha racetracks, and two units for final processing, known as Beta racetracks. In September 1943 Groves authorized construction of four more racetracks, known as Alpha II. Construction began in February 1943.
When the plant was started up for testing on schedule in October, the 14-ton vacuum tanks crept out of alignment because of the power of the magnets, and had to be fastened more securely. A more serious problem arose when the magnetic coils started shorting out.
In December Groves ordered a magnet to be broken open, and handfuls of rust were found inside. Groves then ordered the racetracks to be torn down and the magnets sent back to the factory to be cleaned. A pickling plant was established on-site to clean the pipes and fittings.
The second Alpha I was not operational until the end of January 1944, the first Beta and first and third Alpha I's came online in March, and the fourth Alpha I was operational in April.
The four Alpha II racetracks were completed between July and October 1944.
Tennessee Eastman was contracted to manage Y-12 on the usual cost plus fixed-fee basis, with a fee of $22,500 per month plus $7,500 per racetrack for the first seven racetracks and $4,000 per additional racetrack.
The calutrons were initially operated by scientists from Berkeley to remove bugs and achieve a reasonable operating rate. They were then turned over to trained Tennessee Eastman operators who had only a high school education. Nichols compared unit production data, and pointed out to Lawrence that the young "hillbilly" girl operators, known as Calutron Girls, were outperforming his PhDs.
They agreed to a production race and Lawrence lost, a morale boost for the Tennessee Eastman workers and supervisors. The girls were "trained like soldiers not to reason why", while "the scientists could not refrain from time-consuming investigation of the cause of even minor fluctuations of the dials."
Y-12 initially enriched the uranium-235 content to between 13% and 15%, and shipped the first few hundred grams of this to Los Alamos in March 1944. Only 1 part in 5,825 of the uranium feed emerged as final product. Much of the rest was splattered over equipment in the process.
Strenuous recovery efforts helped raise production to 10% of the uranium-235 feed by January 1945. In February the Alpha racetracks began receiving slightly enriched (1.4%) feed from the new S-50 thermal diffusion plant. The next month they received enhanced (5%) feed from the K-25 gaseous diffusion plant.
By August, K-25 was producing uranium sufficiently enriched to feed directly into the Beta tracks.
Gaseous diffusion:
Main article: K-25
The most promising but also the most challenging method of isotope separation was gaseous diffusion. Graham's law states that the rate of effusion of a gas is inversely proportional to the square root of its molecular mass, so in a box containing a semi-permeable membrane and a mixture of two gases, the lighter molecules will pass out of the container more rapidly than the heavier molecules.
The gas leaving the container is somewhat enriched in the lighter molecules, while the residual gas is somewhat depleted. The idea was that such boxes could be formed into a cascade of pumps and membranes, with each successive stage containing a slightly more enriched mixture.
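A minimal numerical sketch of Graham's law for uranium hexafluoride follows. The integer molecular masses and the ideal-cascade stage count are simplifying assumptions, so the stage figure is a theoretical lower bound rather than anything resembling K-25's actual design.

```python
import math

M_HEAVY = 238 + 6 * 19   # 238UF6, approximately 352
M_LIGHT = 235 + 6 * 19   # 235UF6, approximately 349

# Single-stage separation factor from Graham's law: square root of the mass ratio.
alpha = math.sqrt(M_HEAVY / M_LIGHT)
print(f"single-stage separation factor: {alpha:.5f}")   # ~1.0043

def ideal_enriching_stages(x_feed: float, x_product: float, a: float) -> float:
    """Stages needed in an idealized cascade to raise the U-235 assay
    from x_feed to x_product, with separation factor a per stage."""
    ratio = (x_product / (1 - x_product)) / (x_feed / (1 - x_feed))
    return math.log(ratio) / math.log(a)

print(f"ideal stages, 0.7% -> ~23%: {ideal_enriching_stages(0.007, 0.23, alpha):.0f}")
```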
Research into the process was carried out at Columbia University by a group that included Harold Urey, Karl P. Cohen, and John R. Dunning.
In November 1942 the Military Policy Committee approved the construction of a 600-stage gaseous diffusion plant. On 14 December, M. W. Kellogg accepted an offer to construct the plant, which was codenamed K-25. A cost plus fixed-fee contract was negotiated, eventually totaling $2.5 million.
A separate corporate entity called Kellex was created for the project, headed by Percival C. Keith, one of Kellogg's vice presidents. The process faced formidable technical difficulties. The highly corrosive gas uranium hexafluoride would have to be used, as no substitute could be found, and the motors and pumps would have to be vacuum tight and enclosed in inert gas.
The biggest problem was the design of the barrier, which would have to be strong, porous and resistant to corrosion by uranium hexafluoride. The best choice for this seemed to be nickel. Edward Adler and Edward Norris created a mesh barrier from electroplated nickel. A six-stage pilot plant was built at Columbia to test the process, but the Norris-Adler prototype proved to be too brittle.
A rival barrier was developed from powdered nickel by Kellex, the Bell Telephone Laboratories and the Bakelite Corporation. In January 1944, Groves ordered the Kellex barrier into production.
Kellex's design for K-25 called for a four-story 0.5-mile (0.80 km) long U-shaped structure containing 54 contiguous buildings. These were divided into nine sections. Within these were cells of six stages. The cells could be operated independently, or consecutively within a section. Similarly, the sections could be operated separately or as part of a single cascade. A survey party began construction by marking out the 500-acre (2.0 km2) site in May 1943.
Work on the main building began in October 1943, and the six-stage pilot plant was ready for operation on 17 April 1944. In 1945 Groves canceled the upper stages of the plant, directing Kellex to instead design and build a 540-stage side feed unit, which became known as K-27. Kellex transferred the last unit to the operating contractor, Union Carbide and Carbon, on 11 September 1945.
The total cost, including the K-27 plant completed after the war, came to $480 million.
The production plant commenced operation in February 1945, and as cascade after cascade came online, the quality of the product increased. By April 1945, K-25 had attained a 1.1% enrichment and the output of the S-50 thermal diffusion plant began being used as feed.
Some product produced the next month reached nearly 7% enrichment. In August, the last of the 2,892 stages commenced operation. K-25 and K-27 achieved their full potential in the early postwar period, when they eclipsed the other production plants and became the prototypes for a new generation of plants.
Thermal diffusion:
Main article: S-50 Project
The thermal diffusion process was based on Sydney Chapman and David Enskog's theory, which explained that when a mixed gas passes through a temperature gradient, the heavier molecules tend to concentrate at the cold end and the lighter ones at the warm end.
Since hot gases tend to rise and cool ones tend to fall, this can be used as a means of isotope separation. This process was first demonstrated by Klaus Clusius and Gerhard Dickel in Germany in 1938. It was developed by US Navy scientists, but was not one of the enrichment technologies initially selected for use in the Manhattan Project. This was primarily due to doubts about its technical feasibility, but the inter-service rivalry between the Army and Navy also played a part.
The Naval Research Laboratory continued the research under Philip Abelson's direction, but there was little contact with the Manhattan Project until April 1944, when Captain William S. Parsons, the naval officer in charge of ordnance development at Los Alamos, brought Oppenheimer news of encouraging progress in the Navy's experiments on thermal diffusion.
Oppenheimer wrote to Groves suggesting that the output of a thermal diffusion plant could be fed into Y-12. Groves set up a committee consisting of Warren K. Lewis, Eger Murphree and Richard Tolman to investigate the idea, and they estimated that a thermal diffusion plant costing $3.5 million could enrich 50 kilograms (110 lb) of uranium per week to nearly 0.9% uranium-235. Groves approved its construction on 24 June 1944.
Groves contracted with the H. K. Ferguson Company of Cleveland, Ohio, to build the thermal diffusion plant, which was designated S-50. Groves's advisers, Karl Cohen and W. I. Thompson from Standard Oil, estimated that it would take six months to build.
Groves gave Ferguson just four. Plans called for the installation of 2,142 48-foot-tall (15 m) diffusion columns arranged in 21 racks. Inside each column were three concentric tubes. Steam, obtained from the nearby K-25 powerhouse at a pressure of 100 pounds per square inch (690 kPa) and temperature of 545 °F (285 °C), flowed downward through the innermost 1.25-inch (32 mm) nickel pipe, while water at 155 °F (68 °C) flowed upward through the outermost iron pipe. The uranium hexafluoride flowed in the middle copper pipe, and isotope separation of the uranium occurred between the nickel and copper pipes.
Work commenced on 9 July 1944, and S-50 began partial operation in September. Ferguson operated the plant through a subsidiary known as Fercleve. The plant produced just 10.5 pounds (4.8 kg) of 0.852% uranium-235 in October. Leaks limited production and forced shutdowns over the next few months, but in June 1945 it produced 12,730 pounds (5,770 kg).
By March 1945, all 21 production racks were operating. Initially the output of S-50 was fed into Y-12, but starting in March 1945 all three enrichment processes were run in series. S-50 became the first stage, enriching from 0.71% to 0.89%. This material was fed into the gaseous diffusion process in the K-25 plant, which produced a product enriched to about 23%. This was, in turn, fed into Y-12, which boosted it to about 89%, sufficient for nuclear weapons. About 50 kilograms (110 lb) of uranium enriched to 89% uranium-235 was delivered to Los Alamos by July 1945.
The entire 50 kg, along with additional material enriched to about 50%, averaging out to about 85% enrichment, was used in the Little Boy bomb.
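As a back-of-the-envelope check on these quantities (an illustrative sketch, not a figure from the project records), the uranium-235 contained in roughly 50 kg of 89%-enriched product already implies several tonnes of natural uranium feed, even under the impossible assumption of perfect recovery:

```python
# Back-of-the-envelope sketch, not project data: the minimum natural-uranium
# feed implied by ~50 kg of ~89% enriched product, assuming every atom of
# U-235 in the feed ends up in the product (perfect recovery).
product_kg = 50.0        # enriched uranium delivered to Los Alamos
product_assay = 0.89     # U-235 fraction of the product
natural_assay = 0.0071   # U-235 fraction of natural uranium

u235_in_product_kg = product_kg * product_assay
minimum_feed_kg = u235_in_product_kg / natural_assay

print(f"U-235 contained in the product: {u235_in_product_kg:.1f} kg")
print(f"Minimum natural uranium feed:   {minimum_feed_kg:,.0f} kg")
# Real plants leave U-235 behind in depleted tails, so the actual feed was far larger.
```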
Plutonium:
The second line of development pursued by the Manhattan Project used the fissile element plutonium. Although small amounts of plutonium exist in nature, the best way to obtain large quantities of the element is in a nuclear reactor, in which natural uranium is bombarded by neutrons.
The uranium-238 is transmuted into uranium-239, which rapidly decays, first into neptunium-239 and then into plutonium-239. Only a small amount of the uranium-238 will be transformed, so the plutonium must be chemically separated from the remaining uranium, from any initial impurities, and from fission products.
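Written out as a chain, the process is a single neutron capture followed by two beta decays; the half-lives shown are standard reference values added here for illustration and do not appear in the text above.

```latex
% Breeding of plutonium-239 from uranium-238: one neutron capture, two beta decays.
% Half-lives are approximate standard reference values, added for illustration.
\[
^{238}\mathrm{U} + n \longrightarrow {}^{239}\mathrm{U}
  \xrightarrow[\,t_{1/2}\approx 23.5\ \text{min}\,]{\beta^-} {}^{239}\mathrm{Np}
  \xrightarrow[\,t_{1/2}\approx 2.36\ \text{d}\,]{\beta^-} {}^{239}\mathrm{Pu}
\]
```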
X-10 Graphite Reactor:
Main article: X-10 Graphite Reactor
In March 1943, DuPont began construction of a plutonium plant on a 112-acre (0.5 km2) site at Oak Ridge. Intended as a pilot plant for the larger production facilities at Hanford, it included the air-cooled X-10 Graphite Reactor, a chemical separation plant, and support facilities.
Because of the subsequent decision to construct water-cooled reactors at Hanford, only the chemical separation plant operated as a true pilot. The X-10 Graphite Reactor consisted of a huge block of graphite, 24 feet (7.3 m) long on each side, weighing around 1,500 short tons (1,400 t), surrounded by 7 feet (2.1 m) of high-density concrete as a radiation shield.
The greatest difficulty was encountered with the uranium slugs produced by Mallinckrodt and Metal Hydrides. These somehow had to be coated in aluminum to avoid corrosion and the escape of fission products into the cooling system. The Grasselli Chemical Company attempted to develop a hot dipping process without success. Meanwhile, Alcoa tried canning.
A new process for flux-less welding was developed, and 97% of the cans passed a standard vacuum test, but high temperature tests indicated a failure rate of more than 50%.
Nonetheless, production began in June 1943. The Metallurgical Laboratory eventually developed an improved welding technique with the help of General Electric, which was incorporated into the production process in October 1943.
Watched by Fermi and Compton, the X-10 Graphite Reactor went critical on 4 November 1943 with about 30 short tons (27 t) of uranium. A week later the load was increased to 36 short tons (33 t), raising its power generation to 500 kW, and by the end of the month the first 500 mg of plutonium was created.
Modifications over time raised the power to 4,000 kW in July 1944. X-10 operated as a production plant until January 1945, when it was turned over to research activities.
Hanford reactors:
Main article: Hanford Engineer Works
Although an air-cooled design was chosen for the reactor at Oak Ridge to facilitate rapid construction, it was recognized that this would be impractical for the much larger production reactors. Initial designs by the Metallurgical Laboratory and DuPont used helium for cooling, before they determined that a water-cooled reactor would be simpler, cheaper and quicker to build.
The design did not become available until 4 October 1943; in the meantime, Matthias concentrated on improving the Hanford Site by erecting accommodations, improving the roads, building a railway switch line, and upgrading the electricity, water and telephone lines.
As at Oak Ridge, the most difficulty was encountered while canning the uranium slugs, which commenced at Hanford in March 1944. They were pickled to remove dirt and impurities, dipped in molten bronze, tin, and aluminum-silicon alloy, canned using hydraulic presses, and then capped using arc welding under an argon atmosphere. Finally, they were subjected to a series of tests to detect holes or faulty welds.
Disappointingly, most canned slugs initially failed the tests, resulting in an output of only a handful of canned slugs per day. But steady progress was made and by June 1944 production increased to the point where it appeared that enough canned slugs would be available to start Reactor B on schedule in August 1944.
Work began on Reactor B, the first of six planned 250 MW reactors, on 10 October 1943. The reactor complexes were given letter designations A through F, with B, D and F sites chosen to be developed first, as this maximized the distance between the reactors. They would be the only ones constructed during the Manhattan Project.
Some 390 short tons (350 t) of steel, 17,400 cubic yards (13,300 m3) of concrete, 50,000 concrete blocks and 71,000 concrete bricks were used to construct the 120-foot (37 m) high building.
Construction of the reactor itself commenced in February 1944. Watched by Compton, Matthias, DuPont's Crawford Greenewalt, Leona Woods and Fermi, who inserted the first slug, the reactor was powered up beginning on 13 September 1944. Over the next few days, 838 tubes were loaded and the reactor went critical. Shortly after midnight on 27 September, the operators began to withdraw the control rods to initiate production.
At first all appeared well but around 03:00 the power level started to drop and by 06:30 the reactor had shut down completely. The cooling water was investigated to see if there was a leak or contamination.
The next day the reactor started up again, only to shut down once more.
Fermi contacted Chien-Shiung Wu, who identified the cause of the problem as neutron poisoning from xenon-135, which has a half-life of 9.2 hours. Fermi, Woods, Donald J. Hughes and John Archibald Wheeler then calculated the nuclear cross section of xenon-135, which turned out to be 30,000 times that of uranium. DuPont engineer George Graves had deviated from the Metallurgical Laboratory's original design in which the reactor had 1,500 tubes arranged in a circle, and had added an additional 504 tubes to fill in the corners.
The scientists had originally considered this overengineering a waste of time and money, but Fermi realized that by loading all 2,004 tubes, the reactor could reach the required power level and efficiently produce plutonium. Reactor D was started on 17 December 1944 and Reactor F on 25 February 1945.
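The on-again, off-again behavior is what the 9.2-hour half-life quoted above predicts. The minimal sketch below simply lets the xenon-135 decay after shutdown; it ignores continued production from xenon's iodine-135 precursor, which in a real core keeps the poison level up for several hours longer.

```python
import math

# Minimal sketch: exponential decay of the xenon-135 poison after a shutdown,
# using the 9.2-hour half-life quoted in the text. Continued production from
# the iodine-135 precursor is deliberately ignored; in a real reactor it makes
# the xenon level rise for several hours before it falls.
HALF_LIFE_HOURS = 9.2
decay_constant = math.log(2) / HALF_LIFE_HOURS

for hours in (0, 6, 12, 24):
    remaining = math.exp(-decay_constant * hours)
    print(f"{hours:>2} h after shutdown: {remaining:6.1%} of the xenon-135 remains")
```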
Separation process:
Meanwhile, the chemists considered the problem of how plutonium could be separated from uranium when its chemical properties were not known. Working with the minute quantities of plutonium available at the Metallurgical Laboratory in 1942, a team under Charles M. Cooper developed a lanthanum fluoride process for separating uranium and plutonium, which was chosen for the pilot separation plant.
A second separation process, the bismuth phosphate process, was subsequently developed by Seaborg and Stanley G. Thompson. This process worked by toggling plutonium between its +4 and +6 oxidation states in solutions of bismuth phosphate. In the former state, the plutonium was precipitated; in the latter, it stayed in solution and the other products were precipitated.
Greenewalt favored the bismuth phosphate process due to the corrosive nature of lanthanum fluoride, and it was selected for the Hanford separation plants. Once X-10 began producing plutonium, the pilot separation plant was put to the test. The first batch was processed at 40% efficiency but over the next few months this was raised to 90%.
At Hanford, top priority was initially given to the installations in the 300 area. This contained buildings for testing materials, preparing uranium, and assembling and calibrating instrumentation. One of the buildings housed the canning equipment for the uranium slugs, while another contained a small test reactor.
Notwithstanding the high priority allocated to it, work on the 300 area fell behind schedule due to the unique and complex nature of its facilities and to wartime shortages of labor and materials.
Early plans called for the construction of two separation plants in each of the areas known as 200-West and 200-East. This was subsequently reduced to two plants, T and U, in 200-West and one, the B plant, in 200-East. Each separation plant consisted of four buildings, the largest being the processing "canyon" (the 221 building), supported by the smaller concentration (224) and purification (231) buildings described below.
The canyons were each 800 feet (240 m) long and 65 feet (20 m) wide, and each consisted of forty 17.7-by-13-by-20-foot (5.4 by 4.0 by 6.1 m) cells.
Work began on 221-T and 221-U in January 1944, with the former completed in September and the latter in December. The 221-B building followed in March 1945. Because of the high levels of radioactivity involved, all work in the separation plants had to be conducted by remote control using closed-circuit television, something unheard of in 1943.
Maintenance was carried out with the aid of an overhead crane and specially designed tools. The 224 buildings were smaller because they had less material to process, and it was less radioactive. The 224-T and 224-U buildings were completed on 8 October 1944, and 224-B followed on 10 February 1945.
The purification methods that were eventually used in 231-W were still unknown when construction commenced on 8 April 1944, but the plant was complete and the methods were selected by the end of the year. On 5 February 1945, Matthias hand-delivered the first shipment of 80 g of 95%-pure plutonium nitrate to a Los Alamos courier in Los Angeles.
Weapon design:
Main article: Project Y
In 1943, development efforts were directed to a gun-type fission weapon with plutonium called Thin Man. Initial research on the properties of plutonium was done using cyclotron-generated plutonium-239, which was extremely pure, but could only be created in very small amounts.
The idea behind the Thin Man design was to fire one subcritical mass of plutonium at another and the collision would create a nuclear explosion. Los Alamos received the first sample of plutonium from the Clinton X-10 reactor in April 1944 and within days Emilio Segrè discovered a problem: the reactor-bred plutonium had a higher concentration of plutonium-240, resulting in up to five times the spontaneous fission rate of cyclotron plutonium. Seaborg had correctly predicted in March 1943 that some of the plutonium-239 would absorb a neutron and become plutonium-240.
This made reactor plutonium unsuitable for use in a gun-type weapon. The plutonium-240 would start the chain reaction too quickly, causing a predetonation that would release enough energy to disperse the critical mass with a minimal amount of plutonium reacted (a fizzle). A faster gun was suggested but found to be impractical. The possibility of separating the isotopes was considered and rejected, as plutonium-240 is even harder to separate from plutonium-239 than uranium-235 from uranium-238.
Work on an alternative method of bomb design, known as implosion, had begun earlier under the direction of the physicist Seth Neddermeyer. Implosion used explosives to crush a subcritical sphere of fissile material into a smaller and denser form. When the fissile atoms are packed closer together, the rate of neutron capture increases, and the mass becomes a critical mass.
The metal needs to travel only a very short distance, so the critical mass is assembled in much less time than it would take with the gun method. Neddermeyer's 1943 and early 1944 investigations into implosion showed promise, but also made it clear that the problem would be much more difficult from a theoretical and engineering perspective than the gun design.
In September 1943, John von Neumann, who had experience with shaped charges used in armor-piercing shells, argued that not only would implosion reduce the danger of pre-detonation and fizzle, but would make more efficient use of the fissionable material. He proposed using a spherical configuration instead of the cylindrical one that Neddermeyer was working on.
By July 1944, Oppenheimer had concluded plutonium could not be used in a gun design, and opted for implosion. The accelerated effort on an implosion design, codenamed Fat Man, began in August 1944 when Oppenheimer implemented a sweeping reorganization of the Los Alamos laboratory to focus on implosion.
Two new groups were created at Los Alamos to develop the implosion weapon, X (for explosives) Division headed by explosives expert George Kistiakowsky and G (for gadget) Division under Robert Bacher. The new design that von Neumann and T (for theoretical) Division, most notably Rudolf Peierls, had devised used explosive lenses to focus the explosion onto a spherical shape using a combination of both slow and fast high explosives.
The design of lenses that detonated with the proper shape and velocity turned out to be slow, difficult and frustrating. Various explosives were tested before settling on composition B as the fast explosive and baratol as the slow explosive. The final design resembled a soccer ball, with 20 hexagonal and 12 pentagonal lenses, each weighing about 80 pounds (36 kg).
Getting the detonation just right required fast, reliable and safe electrical detonators, of which there were two for each lens for reliability. It was therefore decided to use exploding-bridgewire detonators, a new invention developed at Los Alamos by a group led by Luis Alvarez. A contract for their manufacture was given to Raytheon.
To study the behavior of converging shock waves, Robert Serber devised the RaLa Experiment, which used the short-lived radioisotope lanthanum-140, a potent source of gamma radiation. The gamma ray source was placed in the center of a metal sphere surrounded by the explosive lenses, which in turn were inside an ionization chamber. This allowed the taking of an X-ray movie of the implosion. The lenses were designed primarily using this series of tests.
In his history of the Los Alamos project, David Hawkins wrote: "RaLa became the most important single experiment affecting the final bomb design".
Within the explosives was the 4.5-inch (110 mm) thick aluminum pusher, which provided a smooth transition from the relatively low density explosive to the next layer, the 3-inch (76 mm) thick tamper of natural uranium. Its main job was to hold the critical mass together as long as possible, but it would also reflect neutrons back into the core. Some part of it might fission as well.
To prevent predetonation by an external neutron, the tamper was coated in a thin layer of boron. A polonium-beryllium modulated neutron initiator, known as an "urchin" because its shape resembled a sea urchin, was developed to start the chain reaction at precisely the right moment.
This work with the chemistry and metallurgy of radioactive polonium was directed by Charles Allen Thomas of the Monsanto Company and became known as the Dayton Project. Testing required up to 500 curies per month of polonium, which Monsanto was able to deliver.
The whole assembly was encased in a duralumin bomb casing to protect it from bullets and flak.
The ultimate task of the metallurgists was to determine how to cast plutonium into a sphere. The difficulties became apparent when attempts to measure the density of plutonium gave inconsistent results. At first contamination was believed to be the cause, but it was soon determined that there were multiple allotropes of plutonium. The brittle α phase that exists at room temperature changes to the plastic β phase at higher temperatures.
Attention then shifted to the even more malleable δ phase that normally exists in the 300 °C to 450 °C range. It was found that this was stable at room temperature when alloyed with aluminum, but aluminum emits neutrons when bombarded with alpha particles, which would exacerbate the pre-ignition problem. The metallurgists then hit upon a plutonium-gallium alloy, which stabilized the δ phase and could be hot pressed into the desired spherical shape.
As plutonium was found to corrode readily, the sphere was coated with nickel.
The work proved dangerous. By the end of the war, half the experienced chemists and metallurgists had to be removed from work with plutonium when unacceptably high levels of the element appeared in their urine. A minor fire at Los Alamos in January 1945 led to a fear that a fire in the plutonium laboratory might contaminate the whole town, and Groves authorized the construction of a new facility for plutonium chemistry and metallurgy, which became known as the DP-site.
The hemispheres for the first plutonium pit (or core) were produced and delivered on 2 July 1945. Three more hemispheres followed on 23 July and were delivered three days later.
In contrast to the plutonium Fat Man, the uranium gun-type Little Boy weapon was straightforward if not trivial to design. Overall responsibility for it was assigned to Parsons's Ordnance (O) Division, with the design, development, and technical work at Los Alamos consolidated under Lieutenant Commander Francis Birch's group.
The gun-type design now had to work with enriched uranium only, and this allowed the design to be greatly simplified. A high-velocity gun was no longer required, and a simpler weapon was substituted.
Trinity:
Main article: Trinity (nuclear test)
Because of the complexity of an implosion-style weapon, it was decided that, despite the waste of fissile material, an initial test would be required. Groves approved the test, subject to the active material being recovered. Consideration was therefore given to a controlled fizzle, but Oppenheimer opted instead for a full-scale nuclear test, codenamed "Trinity".
In March 1944, planning for the test was assigned to Kenneth Bainbridge, a professor of physics at Harvard, working under Kistiakowsky. Bainbridge selected the bombing range near Alamogordo Army Airfield as the site for the test.
Bainbridge worked with Captain Samuel P. Davalos on the construction of the Trinity Base Camp and its facilities, which included barracks, warehouses, workshops, an explosive magazine and a commissary.
Groves did not relish the prospect of explaining to a Senate committee the loss of a billion dollars' worth of plutonium, so a cylindrical containment vessel codenamed "Jumbo" was constructed to recover the active material in the event of a failure. Measuring 25 feet (7.6 m) long and 12 feet (3.7 m) wide, it was fabricated at great expense from 214 short tons (194 t) of iron and steel by Babcock & Wilcox in Barberton, Ohio.
Brought in a special railroad car to a siding in Pope, New Mexico, it was transported the last 25 miles (40 km) to the test site on a trailer pulled by two tractors. By the time it arrived, however, confidence in the implosion method was high enough, and the availability of plutonium was sufficient, that Oppenheimer decided not to use it. Instead, it was placed atop a steel tower 800 yards (730 m) from the weapon as a rough measure of how powerful the explosion would be.
In the end, Jumbo survived, although its tower did not, adding credence to the belief that Jumbo would have successfully contained a fizzled explosion.
A pre-test explosion was conducted on 7 May 1945 to calibrate the instruments. A wooden test platform was erected 800 yards (730 m) from Ground Zero and piled with 100 short tons (91 t) of TNT spiked with nuclear fission products in the form of an irradiated uranium slug from Hanford, which was dissolved and poured into tubing inside the explosive.
This explosion was observed by Oppenheimer and Groves's new deputy commander, Brigadier General Thomas Farrell. The pre-test produced data that proved vital for the Trinity test.
For the actual test, the weapon, nicknamed "the gadget", was hoisted to the top of a 100-foot (30 m) steel tower, as detonation at that height would give a better indication of how the weapon would behave when dropped from a bomber. Detonation in the air maximized the energy applied directly to the target, and generated less nuclear fallout.
The gadget was assembled under the supervision of Norris Bradbury at the nearby McDonald Ranch House on 13 July, and precariously winched up the tower the following day.
Observers included Bush, Chadwick, Conant, Farrell, Fermi, Groves, Lawrence, Oppenheimer and Tolman. At 05:30 on 16 July 1945 the gadget exploded with an energy equivalent of around 20 kilotons of TNT, leaving a crater of Trinitite (radioactive glass) in the desert 250 feet (76 m) wide.
The shock wave was felt over 100 miles (160 km) away, and the mushroom cloud reached 7.5 miles (12.1 km) in height. It was heard as far away as El Paso, Texas, so Groves issued a cover story about an ammunition magazine explosion at Alamogordo Field.
Oppenheimer later recalled that, while witnessing the explosion, he thought of a verse from the Hindu holy book, the Bhagavad Gita (XI,12): "If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one ..."
Years later he would explain that another verse had also entered his head at that time:
"We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, 'Now I am become Death, the destroyer of worlds.' I suppose we all thought that, one way or another.
Personnel:
At its peak in June 1944, the Manhattan Project employed about 129,000 workers, of whom 84,500 were construction workers, 40,500 were plant operators and 1,800 were military personnel. As construction activity fell off, the workforce declined to 100,000 a year later, but the number of military personnel increased to 5,600.
Procuring the required numbers of workers, especially highly skilled workers, in competition with other vital wartime programs proved very difficult. Due to high turnover, over 500,000 people worked on the project. In 1943, Groves obtained a special temporary priority for labor from the War Manpower Commission.
In March 1944, both the War Production Board and the War Manpower Commission gave the project their highest priority. The Kansas commission director stated that from April to July 1944 every qualified applicant in the state who visited a United States Employment Service office was urged to work at the Hanford Site. No other job was offered until the applicant definitively rejected the offer.
Tolman and Conant, in their role as the project's scientific advisers, drew up a list of candidate scientists and had them rated by scientists already working on the project. Groves then sent a personal letter to the head of their university or company asking for them to be released for essential war work.
One source of skilled personnel was the Army itself, particularly the Army Specialized Training Program. In 1943, the MED created the Special Engineer Detachment (SED), with an authorized strength of 675. Technicians and skilled workers drafted into the Army were assigned to the SED.
Another source was the Women's Army Corps (WAC). Initially intended for clerical tasks handling classified material, the WACs were soon tapped for technical and scientific tasks as well. On 1 February 1945, all military personnel assigned to the MED, including all SED detachments, were assigned to the 9812th Technical Service Unit, except at Los Alamos, where military personnel other than SED, including the WACs and Military Police, were assigned to the 4817th Service Command Unit.
An Associate Professor of Radiology at the University of Rochester School of Medicine, Stafford L. Warren, was commissioned as a colonel in the United States Army Medical Corps, and appointed as chief of the MED's Medical Section and Groves' medical advisor. Warren's initial task was to staff hospitals at Oak Ridge, Richland and Los Alamos.
The Medical Section was responsible for medical research, but also for the MED's health and safety programs. This presented an enormous challenge, because workers were handling a variety of toxic chemicals, using hazardous liquids and gases under high pressures, working with high voltages, and performing experiments involving explosives, not to mention the largely unknown dangers presented by radioactivity and handling fissile materials.
Yet in December 1945, the National Safety Council presented the Manhattan Project with the Award of Honor for Distinguished Service to Safety in recognition of its safety record. Between January 1943 and June 1945, there were 62 fatalities and 3,879 disabling injuries, which was about 62 percent below the rate of private industry.
Secrecy:
Byron Price, head of the government's Office of Censorship, called the Manhattan Project the best-kept secret of the war. A 1945 Life article estimated that before the Hiroshima and Nagasaki bombings "probably no more than a few dozen men in the entire country knew the full meaning of the Manhattan Project, and perhaps only a thousand others even were aware that work on atoms was involved."
The magazine wrote that the more than 100,000 others employed with the project "worked like moles in the dark". Warned that disclosing the project's secrets was punishable by 10 years in prison or a fine of US$10,000 (equivalent to $163,000 in 2022), they saw enormous quantities of raw materials enter factories with nothing coming out and monitored "dials and switches while behind thick concrete walls mysterious reactions took place" without knowing the purpose of their jobs.
In December 1945 the United States Army published a secret report analyzing and assessing the security apparatus surrounding the Manhattan Project. The report stated that the Manhattan Project was "more drastically guarded than any other highly secret war development."
The security infrastructure surrounding the Manhattan Project was so vast and thorough that in the early days of the project in 1943, security investigators vetted 400,000 potential employees and 600 companies that would be involved in all aspects of the project for potential security risks. Although at times it was the most important employer in the nation, the government bureaucrats assigning manpower knew only of the "Pasco secret project"; one said that until Hiroshima "we had no idea what was being made".
Oak Ridge security personnel considered any private party with more than seven people as suspicious, and residents—who believed that US government agents were secretly among them—avoided repeatedly inviting the same guests. Although original residents of the area could be buried in existing cemeteries, every coffin was reportedly opened for inspection.
Everyone, including top military officials, and their automobiles were searched when entering and exiting project facilities. One Oak Ridge worker stated that "if you got inquisitive, you were called on the carpet within two hours by government secret agents. Usually those summoned to explain were then escorted bag and baggage to the gate and ordered to keep going".
Despite being told that their work would help end the war and perhaps all future wars, workers could not see or understand the results of their often tedious duties, or even typical side effects of factory work such as smoke from smokestacks; this, along with the war in Europe ending without their work having been used, caused serious morale problems and led many rumors to spread.
One manager stated after the war: "Well it wasn't that the job was tough ... it was confusing. You see, no one knew what was being made in Oak Ridge, not even me, and a lot of the people thought they were wasting their time here. It was up to me to explain to the dissatisfied workers that they were doing a very important job. When they asked me what, I'd have to tell them it was a secret. But I almost went crazy myself trying to figure out what was going on."
Another worker told of how, working in a laundry, she every day held "a special instrument" to uniforms and listened for "a clicking noise". She learned only after the war that she had been performing the important task of checking for radiation with a Geiger counter.
To improve morale among such workers Oak Ridge created an extensive system of intramural sports leagues, including 10 baseball teams, 81 softball teams, and 26 football teams.
Censorship:
Voluntary censorship of atomic information began before the Manhattan Project. After the start of the European war in 1939 American scientists began avoiding publishing military-related research, and in 1940 scientific journals began asking the National Academy of Sciences to clear articles. William L. Laurence of The New York Times, who wrote an article on atomic fission in The Saturday Evening Post of 7 September 1940, later learned that government officials asked librarians nationwide in 1943 to withdraw the issue.
The Soviets noticed the silence, however. In April 1942 nuclear physicist Georgy Flyorov wrote to Josef Stalin on the absence of articles on nuclear fission in American journals; this resulted in the Soviet Union establishing its own atomic bomb project.
The Manhattan Project operated under tight security lest its discovery induce Axis powers, especially Germany, to accelerate their own nuclear projects or undertake covert operations against the project.
The Office of Censorship, by contrast, relied on the press to comply with a voluntary code of conduct it published, and the project at first avoided notifying the office. By early 1943 newspapers began publishing reports of large construction in Tennessee and Washington based on public records, and the office began discussing with the project how to maintain secrecy.
In June the Office of Censorship asked newspapers and broadcasters to avoid discussing "atom smashing, atomic energy, atomic fission, atomic splitting, or any of their equivalents. The use for military purposes of radium or radioactive materials, heavy water, high voltage discharge equipment, cyclotrons." The office also asked to avoid discussion of "polonium, uranium, ytterbium, hafnium, protactinium, radium, rhenium, thorium, deuterium"; only uranium was sensitive, but it was listed with other elements to hide its importance.
Soviet spies:
Main article: Atomic spies
The prospect of sabotage was always present, and sometimes suspected when there were equipment failures. While there were some problems believed to be the result of careless or disgruntled employees, there were no confirmed instances of Axis-instigated sabotage.
However, on 10 March 1945, a Japanese fire balloon struck a power line, and the resulting power surge caused the three reactors at Hanford to be temporarily shut down.
With so many people involved, security was a difficult task. A special Counter Intelligence Corps detachment was formed to handle the project's security issues. By 1943, it was clear that the Soviet Union was attempting to penetrate the project. Lieutenant Colonel Boris T. Pash, the head of the Counter Intelligence Branch of the Western Defense Command, investigated suspected Soviet espionage at the Radiation Laboratory in Berkeley.
Oppenheimer informed Pash that he had been approached by a fellow professor at Berkeley, Haakon Chevalier, about passing information to the Soviet Union.
The most successful Soviet spy was Klaus Fuchs, a member of the British Mission who played an important part at Los Alamos. The 1950 revelation of his espionage activities damaged the United States' nuclear cooperation with Britain and Canada.
Subsequently, other instances of espionage were uncovered, leading to the arrest of Harry Gold, David Greenglass, and Julius and Ethel Rosenberg. Other spies like George Koval and Theodore Hall remained unknown for decades. The value of the espionage is difficult to quantify, as the principal constraint on the Soviet atomic bomb project was a shortage of uranium ore. The consensus is that espionage saved the Soviets one or two years of effort.
Foreign intelligence:
Main articles: Alsos Mission and Operation Epsilon
In addition to developing the atomic bomb, the Manhattan Project was charged with gathering intelligence on the German nuclear energy project. It was believed that the Japanese nuclear weapons program was not far advanced because Japan had little access to uranium ore, but it was initially feared that Germany was very close to developing its own weapons.
At the instigation of the Manhattan Project, a bombing and sabotage campaign was carried out against heavy water plants in German-occupied Norway. A small mission was created, jointly staffed by the Office of Naval Intelligence, OSRD, the Manhattan Project, and Army Intelligence (G-2), to investigate enemy scientific developments.
It was not restricted to those involving nuclear weapons. The Chief of Army Intelligence, Major General George V. Strong, appointed Boris Pash to command the unit, which was codenamed "Alsos", a Greek word meaning "grove".
The Alsos Mission to Italy questioned staff of the physics laboratory at the University of Rome following the capture of the city in June 1944. Meanwhile, Pash formed a combined British and American Alsos mission in London under the command of Captain Horace K. Calvert to participate in Operation Overlord.
Groves considered the risk that the Germans might attempt to disrupt the Normandy landings with radioactive poisons sufficient to warrant warning General Dwight D. Eisenhower and sending an officer to brief his chief of staff, Lieutenant General Walter Bedell Smith.
Under the codename Operation Peppermint, special equipment was prepared and Chemical Warfare Service teams were trained in its use.
Following in the wake of the advancing Allied armies, Pash and Calvert interviewed Frédéric Joliot-Curie about the activities of German scientists. They spoke to officials at Union Minière du Haut Katanga about uranium shipments to Germany. They tracked down 68 tons of ore in Belgium and 30 tons in France. The interrogation of German prisoners indicated that uranium and thorium were being processed in Oranienburg, 20 miles north of Berlin, so Groves arranged for it to be bombed on 15 March 1945.
An Alsos team went to Stassfurt in the Soviet Occupation Zone and retrieved 11 tons of ore from WIFO. In April 1945, Pash, in command of a composite force known as T-Force, conducted Operation Harborage, a sweep behind enemy lines of the cities of Hechingen, Bisingen, and Haigerloch that were the heart of the German nuclear effort.
T-Force captured the nuclear laboratories, documents, equipment and supplies, including heavy water and 1.5 tons of metallic uranium.
Alsos teams rounded up German nuclear scientists, who were taken to England and interned at Farm Hall, a bugged house in Godmanchester.
After the bombs were detonated in Japan, the Germans were forced to confront the fact that the Allies had done what they could not.
Atomic bombings of Hiroshima and Nagasaki:
Main article: Atomic bombings of Hiroshima and Nagasaki
Preparations:
The only Allied aircraft capable of carrying the 17-foot (5.2 m) long Thin Man or the 59-inch (150 cm) wide Fat Man was the British Avro Lancaster, but using a British aircraft would have caused difficulties with maintenance. Groves hoped that the American Boeing B-29 Superfortress could be modified to carry Thin Man by joining its two bomb bays together.
This became unnecessary after Thin Man was abandoned, as Little Boy was short enough to fit into a B-29 bomb bay, but modifications were still required. The Chief of United States Army Air Forces (USAAF), General Henry H. Arnold, assured Groves that no effort would be spared to modify B-29s to do the job, and he designated Major General Oliver P. Echols as the USAAF liaison to the Manhattan Project.
In turn, Echols named Colonel Roscoe C. Wilson as his alternate, and Wilson became Manhattan Project's main USAAF contact.
Commencing in November 1943, the Army Air Forces Materiel Command at Wright Field, Ohio, began Silverplate, the codename for the modification of the B-29 to carry atomic bombs. Test drops were carried out at Muroc Army Air Field and the Naval Ordnance Test Station in California with Thin Man and Fat Man pumpkin bombs to test their ballistic, fuzing and stability characteristics.
The 509th Composite Group was activated on 17 December 1944 at Wendover Army Air Field, Utah, under the command of Colonel Paul W. Tibbets. This base, close to the border with Nevada, was codenamed "Kingman" or "W-47". Training was conducted at Wendover and at Batista Army Airfield, Cuba, where the 393rd Bombardment Squadron practiced long-distance flights over water and dropped pumpkin bombs.
Roosevelt instructed Groves that if the atomic bombs were ready before the war with Germany ended, he should be ready to drop them on Germany, but Japan was regarded as the most likely target.
A special unit known as Project Alberta was formed at Los Alamos under Parsons's command to assist in preparing and delivering the bombs. Commander Frederick L. Ashworth from Alberta met with Fleet Admiral Chester W. Nimitz on Guam in February 1945 to inform him of the Manhattan Project.
While he was there, Ashworth selected North Field on Tinian as a base for the 509th Composite Group, and he reserved space for the group and its buildings. The group deployed there in July 1945. Farrell arrived at Tinian on 30 July as the Manhattan Project representative. Purnell went to Tinian as the representative of the Military Policy Committee.
Most of the components for Little Boy left San Francisco on the cruiser USS Indianapolis on 16 July and arrived on Tinian on 26 July. Four days later the ship was sunk by a Japanese submarine. The remaining components, which included six highly enriched uranium rings, were delivered by three Douglas C-54 Skymasters of the 509th Group's 320th Troop Carrier Squadron.
Two Fat Man assemblies travelled to Tinian in specially modified 509th Composite Group B-29s. The first plutonium core went in a special C-54.
In late April, a joint targeting committee of the Manhattan District and USAAF was established to determine which cities in Japan should be targets, and recommended Kokura, Hiroshima, Niigata, and Kyoto.
At this point, Secretary of War Henry L. Stimson intervened, announcing that he would be making the targeting decision, and that he would not authorize the bombing of Kyoto on the grounds of its historical and religious significance. Groves therefore asked Arnold to remove Kyoto not just from the list of nuclear targets, but from targets for conventional bombing as well. Nagasaki was substituted.
Bombings:
In May 1945, the Interim Committee was created to advise on wartime and postwar use of nuclear energy. The committee was chaired by Stimson.
The Interim Committee in turn established a scientific panel consisting of Arthur Compton, Fermi, Lawrence and Oppenheimer to advise it on scientific issues. In its presentation to the Interim Committee, the scientific panel offered its opinion not just on the likely physical effects of an atomic bomb, but on its probable military and political impact.
At the Potsdam Conference in Germany, Truman was informed that the Trinity test had been successful. He told Stalin, the leader of the Soviet Union, that the US had a new superweapon, without giving any details. This was the first official communication to the Soviet Union about the bomb, but Stalin already knew about it from spies. With the authorization to use the bomb against Japan already given, no alternatives were considered after the Japanese rejection of the Potsdam Declaration.
Picture below: Little Boy explodes over Hiroshima, Japan, 6 August 1945 (left);
Fat Man explodes over Nagasaki, Japan, 9 August 1945 (right).
Oak Ridge:
Main article: Clinton Engineer Works
The day after he took over the project, Groves took a train to Tennessee with Colonel Marshall to inspect the proposed site there, and Groves was impressed. On 29 September 1942, United States Under Secretary of War Robert P. Patterson authorized the Corps of Engineers to acquire 56,000 acres (23,000 ha) of land by eminent domain at a cost of $3.5 million. An additional 3,000 acres (1,200 ha) was subsequently acquired. About 1,000 families were affected by the condemnation order, which came into effect on 7 October.
Protests, legal appeals, and a 1943 Congressional inquiry were to no avail. By mid-November U.S. Marshals were tacking notices to vacate on farmhouse doors, and construction contractors were moving in.
Some families were given two weeks' notice to vacate farms that had been their homes for generations; others had settled there after being evicted to make way for the Great Smoky Mountains National Park in the 1920s or the Norris Dam in the 1930s.
The ultimate cost of land acquisition in the area, which was not completed until March 1945, was only about $2.6 million, which worked out to around $47 an acre. When presented with Public Proclamation Number Two, which declared Oak Ridge a total exclusion area that no one could enter without military permission, the Governor of Tennessee, Prentice Cooper, angrily tore it up.
Initially known as the Kingston Demolition Range, the site was officially renamed the Clinton Engineer Works (CEW) in early 1943. While Stone & Webster concentrated on the production facilities, the architectural and engineering firm Skidmore, Owings & Merrill designed and built a residential community for 13,000. The community was located on the slopes of Black Oak Ridge, from which the new town of Oak Ridge got its name.
The Army presence at Oak Ridge increased in August 1943 when Nichols replaced Marshall as head of the Manhattan Engineer District. One of his first tasks was to move the district headquarters to Oak Ridge although the name of the district did not change.
In September 1943 the administration of community facilities was outsourced to Turner Construction Company through a subsidiary, the Roane-Anderson Company (for Roane and Anderson Counties, in which Oak Ridge was located).
Chemical engineers, including William J. (Jenkins) Wilcox Jr. and Warren Fuchs, were part of "frantic efforts" to make 10% to 12% enriched uranium-235, known by the code name "tuballoy tetroxide", with tight security and fast approvals for supplies and materials.
The population of Oak Ridge soon expanded well beyond the initial plans, and peaked at 75,000 in May 1945, by which time 82,000 people were employed at the Clinton Engineer Works, and 10,000 by Roane-Anderson.
Fine-arts photographer Josephine Herrick and her colleague Mary Steers helped document the work at Oak Ridge.
Los Alamos:
Main article: Project Y
The idea of locating Project Y at Oak Ridge was considered, but in the end it was decided that it should be in a remote location. On Oppenheimer's recommendation, the search for a suitable site was narrowed to the vicinity of Albuquerque, New Mexico, where Oppenheimer owned a ranch.
In October 1942, Major John H. Dudley of the Manhattan District was sent to survey the area. He recommended a site near Jemez Springs, New Mexico. On 16 November, Oppenheimer, Groves, Dudley and others toured the site. Oppenheimer feared that the high cliffs surrounding the site would make his people feel claustrophobic, while the engineers were concerned with the possibility of flooding.
The party then moved on to the vicinity of the Los Alamos Ranch School. Oppenheimer was impressed and expressed a strong preference for the site, citing its natural beauty and views of the Sangre de Cristo Mountains, which, it was hoped, would inspire those who would work on the project.
The engineers were concerned about the poor access road, and whether the water supply would be adequate, but otherwise felt that it was ideal.
Patterson approved the acquisition of the site on 25 November 1942, authorizing $440,000 for the purchase of the site of 54,000 acres (22,000 ha), all but 8,900 acres (3,600 ha) of which were already owned by the Federal Government.
Secretary of Agriculture Claude R. Wickard granted use of some 45,100 acres (18,300 ha) of United States Forest Service land to the War Department "for so long as the military necessity continues". The need for land, for a new road, and later for a right of way for a 25-mile (40 km) power line, eventually brought wartime land purchases to 45,737 acres (18,509.1 ha), but only $414,971 was spent.
Construction was contracted to the M. M. Sundt Company of Tucson, Arizona, with Willard C. Kruger and Associates of Santa Fe, New Mexico, as architect and engineer. Work commenced in December 1942. Groves initially allocated $300,000 for construction, three times Oppenheimer's estimate, with a planned completion date of 15 March 1943. It soon became clear that the scope of Project Y was greater than expected, and by the time Sundt finished on 30 November 1943, over $7 million had been spent.
Because it was secret, Los Alamos was referred to as "Site Y" or "the Hill". Birth certificates of babies born in Los Alamos during the war listed their place of birth as PO Box 1663 in Santa Fe.
Initially Los Alamos was to have been a military laboratory with Oppenheimer and other researchers commissioned into the Army. Oppenheimer went so far as to order himself a lieutenant colonel's uniform, but two key physicists, Robert Bacher and Isidor Rabi, balked at the idea. Conant, Groves and Oppenheimer then devised a compromise whereby the laboratory was operated by the University of California under contract to the War Department.
Chicago:
Main article: Metallurgical Laboratory
An Army-OSRD council on 25 June 1942 decided to build a pilot plant for plutonium production in Red Gate Woods southwest of Chicago. In July, Nichols arranged for a lease of 1,025 acres (415 ha) from the Cook County Forest Preserve District, and Captain James F. Grafton (1908-1969) was appointed Chicago area engineer. It soon became apparent that the scale of operations was too great for the area, and it was decided to build the plant at Oak Ridge, and keep a research and testing facility in Chicago.
Delays in establishing the plant in Red Gate Woods led Compton to authorize the Metallurgical Laboratory to construct the first nuclear reactor beneath the bleachers of Stagg Field at the University of Chicago. The reactor required an enormous amount of graphite blocks and uranium pellets.
At the time, sources of pure uranium were limited. Frank Spedding of Iowa State University was able to produce only two short tons of pure uranium. An additional three short tons of uranium metal were supplied by the Westinghouse Lamp Plant, produced in a rush using a makeshift process.
A large square balloon was constructed by Goodyear Tire to encase the reactor. On 2 December 1942, a team led by Enrico Fermi initiated the first artificial self-sustaining nuclear chain reaction in an experimental reactor known as Chicago Pile-1.
The point at which a reaction becomes self-sustaining became known as "going critical". Compton reported the success to Conant in Washington, D.C., by a coded phone call, saying, "The Italian navigator [Fermi] has just landed in the new world."
In January 1943, Grafton's successor, Major Arthur V. Peterson, ordered Chicago Pile-1 dismantled and reassembled at Red Gate Woods, as he regarded the operation of a reactor as too hazardous for a densely populated area.
At the Argonne site, Chicago Pile-3, the first heavy water reactor, went critical on 15 May 1944. After the war, the operations that remained at Red Gate moved to the new site of the Argonne National Laboratory about 6 miles (9.7 km) away.
Hanford:
Main article: Hanford Engineer Works
By December 1942 there were concerns that even Oak Ridge was too close to a major population center (Knoxville) in the unlikely event of a major nuclear accident. Groves recruited DuPont in November 1942 to be the prime contractor for the construction of the plutonium production complex.
DuPont was offered a standard cost plus fixed-fee contract, but the President of the company, Walter S. Carpenter, Jr., wanted no profit of any kind, and asked for the proposed contract to be amended to explicitly exclude the company from acquiring any patent rights.
This was accepted, but for legal reasons a nominal fee of one dollar was agreed upon. After the war, DuPont asked to be released from the contract early, and had to return 33 cents.
DuPont recommended that the site be located far from the existing uranium production facility at Oak Ridge. In December 1942, Groves dispatched Colonel Franklin Matthias and DuPont engineers to scout potential sites. Matthias reported that Hanford Site near Richland, Washington, was "ideal in virtually all respects". It was isolated and near the Columbia River, which could supply sufficient water to cool the reactors that would produce the plutonium.
Groves visited the site in January and established the Hanford Engineer Works (HEW), codenamed "Site W".
Under Secretary Patterson gave his approval on 9 February, allocating $5 million for the acquisition of 430,000 acres (170,000 ha) of land in the area. The federal government relocated some 1,500 residents of White Bluffs and Hanford, and nearby settlements, as well as the Wanapum and other tribes using the area. A dispute arose with farmers over compensation for crops, which had already been planted before the land was acquired.
Where schedules allowed, the Army allowed the crops to be harvested, but this was not always possible. The land acquisition process dragged on and was not completed before the end of the Manhattan Project in December 1946.
The dispute did not delay work. Although progress on the reactor design at Metallurgical Laboratory and DuPont was not sufficiently advanced to accurately predict the scope of the project, a start was made in April 1943 on facilities for an estimated 25,000 workers, half of whom were expected to live on-site.
By July 1944, some 1,200 buildings had been erected and nearly 51,000 people were living in the construction camp. As area engineer, Matthias exercised overall control of the site.
At its peak, the construction camp was the third most populous town in Washington state. Hanford operated a fleet of over 900 buses, more than the city of Chicago. Like Los Alamos and Oak Ridge, Richland was a gated community with restricted access, but it looked more like a typical wartime American boomtown: the military profile was lower, and physical security elements like high fences, towers, and guard dogs were less evident.
Canadian sites:
Main article: Montreal Laboratory
British Columbia:
Cominco had produced electrolytic hydrogen at Trail, British Columbia, since 1930. Urey suggested in 1941 that it could produce heavy water. To the existing $10 million plant consisting of 3,215 cells consuming 75 MW of hydroelectric power, secondary electrolysis cells were added to increase the deuterium concentration in the water from 2.3% to 99.8%.
For this process, Hugh Taylor of Princeton developed a platinum-on-carbon catalyst for the first three stages while Urey developed a nickel-chromia one for the fourth stage tower. The final cost was $2.8 million. The Canadian Government did not officially learn of the project until August 1942.
Trail's heavy water production started in January 1944 and continued until 1956. Heavy water from Trail was used for Chicago Pile 3, the first reactor using heavy water and natural uranium, which went critical on 15 May 1944.
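A minimal sketch of why electrolysis concentrates deuterium (illustrative only; the separation factor of about 6 is an assumed, typical order-of-magnitude value, not a figure from the Trail plant): hydrogen leaves the cell as gas preferentially to deuterium, so the water left behind grows steadily richer in heavy water, and many stages in cascade are needed to climb from 2.3% to 99.8%.

```python
# Illustrative sketch of electrolytic deuterium enrichment, not Trail plant data.
# Assumption: the evolved gas is depleted in deuterium by a factor ALPHA relative
# to the liquid it came from (ALPHA ~ 6 is an assumed, typical order of magnitude).
ALPHA = 6.0

def residual_deuterium_fraction(d_fraction, volume_reduction, step=1e-4):
    """Electrolyze water in small increments until 1/volume_reduction of it remains.

    Returns the deuterium fraction of the residual liquid (simple Rayleigh-style model).
    """
    moles = 1.0
    d_moles = d_fraction
    target = 1.0 / volume_reduction
    while moles > target:
        liquid_fraction = d_moles / moles
        gas_fraction = liquid_fraction / ALPHA   # gas is depleted in deuterium
        d_moles -= step * gas_fraction
        moles -= step
    return d_moles / moles

# Starting from 2.3% feed (the assay the text quotes for the secondary cells),
# electrolyzing away 90% of the water raises the concentration several-fold;
# reaching 99.8% therefore takes a cascade of such stages.
print(f"{residual_deuterium_fraction(0.023, 10):.1%}")
```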
Ontario:
The Chalk River, Ontario, site was established to rehouse the Allied effort at the Montreal Laboratory away from an urban area. A new community was built at Deep River, Ontario, to provide residences and facilities for the team members. The site was chosen for its proximity to the industrial manufacturing areas of Ontario and Quebec, and to a railhead adjacent to a large military base, Camp Petawawa.
Located on the Ottawa River, it had access to abundant water. The first director of the new laboratory was Hans von Halban. He was replaced by John Cockcroft in May 1944, who in turn was succeeded by Bennett Lewis in September 1946.
A pilot reactor known as ZEEP (zero-energy experimental pile) became the first Canadian reactor, and the first to be completed outside the United States, when it went critical in September 1945. ZEEP remained in use by researchers until 1970.
A larger 10 MW NRX reactor, which was designed during the war, was completed and went critical in July 1947.
Northwest Territories:
The Eldorado Mine at Port Radium was a source of uranium ore.
Heavy water sites:
Main article: P-9 Project
Although DuPont's preferred designs for the nuclear reactors were helium cooled and used graphite as a moderator, DuPont still expressed an interest in using heavy water as a backup, in case the graphite reactor design proved infeasible for some reason. For this purpose, it was estimated that 3 short tons (2.7 t) of heavy water would be required per month. The P-9 Project was the government's code name for the heavy water production program.
As the plant at Trail, which was then under construction, could produce 0.5 short tons (0.45 t) per month, additional capacity was required. Groves therefore authorized DuPont to establish heavy water facilities at:
- the Morgantown Ordnance Works, near Morgantown, West Virginia;
- the Wabash River Ordnance Works, near Dana and Newport, Indiana;
- the Alabama Ordnance Works, near Childersburg and Sylacauga, Alabama.
Although known as Ordnance Works and paid for under Ordnance Department contracts, they were built and operated by the Army Corps of Engineers. The American plants used a process different from Trail's; heavy water was extracted by distillation, taking advantage of the slightly higher boiling point of heavy water.
Uranium:
Ore:
The key raw material for the project was uranium, which was used as fuel for the reactors, as feed that was transformed into plutonium, and, in its enriched form, in the atomic bomb itself. There were four known major deposits of uranium in 1940: in Colorado, in northern Canada, in Joachimsthal in Czechoslovakia, and in the Belgian Congo.
All but Joachimsthal were in Allied hands. A November 1942 survey determined that sufficient quantities of uranium were available to satisfy the project's requirements.
Nichols arranged with the State Department for export controls to be placed on uranium oxide and negotiated for the purchase of 1,200 short tons (1,100 t) of uranium ore from the Belgian Congo that was being stored in a warehouse on Staten Island and the remaining stocks of mined ore stored in the Congo. He negotiated with Eldorado Gold Mines for the purchase of ore from its refinery in Port Hope, Ontario, and its shipment in 100-ton lots. The Canadian government subsequently bought up the company's stock until it acquired a controlling interest.
While these purchases assured a sufficient supply to meet wartime needs, the American and British leaders concluded that it was in their countries' interest to gain control of as much of the world's uranium deposits as possible. The richest source of ore was the Shinkolobwe mine in the Belgian Congo, but it was flooded and closed.
Nichols unsuccessfully attempted to negotiate its reopening and the sale of the entire future output to the United States with Edgar Sengier, the director of the company that owned the mine, the Union Minière du Haut-Katanga. The matter was then taken up by the Combined Policy Committee. As 30 percent of Union Minière's stock was controlled by British interests, the British took the lead in negotiations.
Sir John Anderson and Ambassador John Winant hammered out a deal with Sengier and the Belgian government in May 1944 for the mine to be reopened and 1,720 short tons (1,560 t) of ore to be purchased at $1.45 a pound. To avoid dependence on the British and Canadians for ore, Groves also arranged for the purchase of US Vanadium Corporation's stockpile in Uravan, Colorado. Uranium mining in Colorado yielded about 800 short tons (730 t) of ore.
Mallinckrodt Incorporated in St. Louis, Missouri, took the raw ore and dissolved it in nitric acid to produce uranyl nitrate. Ether was then added in a liquid–liquid extraction process to separate the impurities from the uranyl nitrate. This was then heated to form uranium trioxide, which was reduced to highly pure uranium dioxide.
By July 1942, Mallinckrodt was producing a ton of highly pure oxide a day, but turning this into uranium metal initially proved more difficult for contractors Westinghouse and Metal Hydrides. Production was too slow and quality was unacceptably low. A special branch of the Metallurgical Laboratory was established at Iowa State College in Ames, Iowa, under Frank Spedding to investigate alternatives. This became known as the Ames Project, and its Ames process became available in 1943.
Isotope separation:
Natural uranium consists of 99.3% uranium-238 and 0.7% uranium-235, but only the latter is fissile. The chemically identical uranium-235 has to be physically separated from the more plentiful isotope. Various methods of uranium enrichment were considered; most of the enrichment work was carried out at Oak Ridge.
The most obvious technology, the centrifuge, failed, but electromagnetic separation, gaseous diffusion, and thermal diffusion technologies were all successful and contributed to the project. In February 1943, Groves came up with the idea of using the output of some plants as the input for others.
Centrifuges:
The centrifuge process was regarded as the only promising separation method in April 1942. Jesse Beams had developed such a process at the University of Virginia during the 1930s, but had encountered technical difficulties. The process required high rotational speeds, but at certain speeds harmonic vibrations developed that threatened to tear the machinery apart. It was therefore necessary to accelerate quickly through these speeds.
In 1941 Beams began working with uranium hexafluoride, the only known gaseous compound of uranium, and was able to separate uranium-235. At Columbia, Urey had Karl P. Cohen investigate the process, and he produced a body of mathematical theory making it possible to design a centrifugal separation unit, which Westinghouse undertook to construct.
Scaling this up to a production plant presented a formidable technical challenge. Urey and Cohen estimated that producing a kilogram (2.2 lb) of uranium-235 per day would require up to 50,000 centrifuges with 1-meter (3 ft 3 in) rotors, or 10,000 centrifuges with 4-meter (13 ft) rotors, assuming that 4-meter rotors could be built.
The prospect of keeping so many rotors operating continuously at high speed appeared daunting, and when Beams ran his experimental apparatus, he obtained only 60% of the predicted yield, indicating that more centrifuges would be required. Beams, Urey and Cohen then began work on a series of improvements which promised to increase the efficiency of the process.
However, frequent failures of motors, shafts and bearings at high speeds delayed work on the pilot plant. In November 1942 the centrifuge process was abandoned by the Military Policy Committee following a recommendation by Conant, Nichols and August C. Klein of Stone & Webster.
Although the centrifuge method was abandoned by the Manhattan Project, research into it advanced significantly after the war with the introduction of the Zippe-type centrifuge, which was developed in the Soviet Union by Soviet and captured German engineers.
It eventually became the preferred method of uranium isotope separation, being far more economical than the other separation methods used during World War II.
Electromagnetic separation:
Main article: Y-12 Project
Electromagnetic isotope separation was developed by Lawrence at the University of California Radiation Laboratory. This method employed devices known as calutrons, a hybrid of the standard laboratory mass spectrometer and the cyclotron magnet. The name was derived from the words California, university and cyclotron.
In the electromagnetic process, a magnetic field deflected charged particles according to mass. The process was neither scientifically elegant nor industrially efficient.
Compared with a gaseous diffusion plant or a nuclear reactor, an electromagnetic separation plant would consume more scarce materials, require more manpower to operate, and cost more to build.
Nonetheless, the process was approved because it was based on proven technology and therefore represented less risk. Moreover, it could be built in stages, and rapidly reach industrial capacity.
Marshall and Nichols discovered that the electromagnetic isotope separation process would require 5,000 short tons (4,500 tonnes) of copper, which was in desperately short supply.
However, silver could be substituted, in an 11:10 ratio. On 3 August 1942, Nichols met with Under Secretary of the Treasury Daniel W. Bell and asked for the transfer of 6,000 tons of silver bullion from the West Point Bullion Depository. "Young man," Bell told him, "you may think of silver in tons but the Treasury will always think of silver in troy ounces!"
Ultimately 14,700 short tons (13,300 tonnes; 430,000,000 troy ounces) were used.
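As a rough sanity check on those figures (an illustration added here, not part of the historical record), the conversion from short tons to troy ounces can be worked through with standard conversion factors. A minimal Python sketch:

# Illustrative unit-conversion check, using standard conversion factors
# (these constants are assumptions, not taken from the source document).
SHORT_TON_LB = 2000.0            # avoirdupois pounds per short ton
GRAMS_PER_LB = 453.59237         # grams per avoirdupois pound
GRAMS_PER_TROY_OZ = 31.1034768   # grams per troy ounce

short_tons = 14_700
troy_oz = short_tons * SHORT_TON_LB * GRAMS_PER_LB / GRAMS_PER_TROY_OZ
print(f"{troy_oz:,.0f} troy ounces")   # about 429 million, consistent with ~430,000,000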
The 1,000-troy-ounce (31 kg) silver bars were cast into cylindrical billets and taken to Phelps Dodge in Bayway, New Jersey, where they were extruded into strips 0.625 inches (15.9 mm) thick, 3 inches (76 mm) wide and 40 feet (12 m) long. These were wound onto magnetic coils by Allis-Chalmers in Milwaukee, Wisconsin.
After the war, all the machinery was dismantled and cleaned and the floorboards beneath the machinery were ripped up and burned to recover minute amounts of silver. In the end, only 1/3,600,000th was lost. The last silver was returned in May 1970.
Responsibility for the design and construction of the electromagnetic separation plant, which came to be called Y-12, was assigned to Stone & Webster by the S-1 Committee in June 1942. The design called for five first-stage processing units, known as Alpha racetracks, and two units for final processing, known as Beta racetracks. In September 1943 Groves authorized construction of four more racetracks, known as Alpha II. Construction began in February 1943.
When the plant was started up for testing on schedule in October, the 14-ton vacuum tanks crept out of alignment because of the power of the magnets, and had to be fastened more securely. A more serious problem arose when the magnetic coils started shorting out.
In December Groves ordered a magnet to be broken open, and handfuls of rust were found inside. Groves then ordered the racetracks to be torn down and the magnets sent back to the factory to be cleaned. A pickling plant was established on-site to clean the pipes and fittings.
The second Alpha I was not operational until the end of January 1944, the first Beta and first and third Alpha I's came online in March, and the fourth Alpha I was operational in April.
The four Alpha II racetracks were completed between July and October 1944.
Tennessee Eastman was contracted to manage Y-12 on the usual cost plus fixed-fee basis, with a fee of $22,500 per month plus $7,500 per racetrack for the first seven racetracks and $4,000 per additional racetrack.
The calutrons were initially operated by scientists from Berkeley to remove bugs and achieve a reasonable operating rate. They were then turned over to trained Tennessee Eastman operators who had only a high school education. Nichols compared unit production data, and pointed out to Lawrence that the young "hillbilly" girl operators, known as Calutron Girls, were outperforming his PhDs.
They agreed to a production race and Lawrence lost, a morale boost for the Tennessee Eastman workers and supervisors. The girls were "trained like soldiers not to reason why", while "the scientists could not refrain from time-consuming investigation of the cause of even minor fluctuations of the dials."
Y-12 initially enriched the uranium-235 content to between 13% and 15%, and shipped the first few hundred grams of this to Los Alamos in March 1944. Only 1 part in 5,825 of the uranium feed emerged as final product. Much of the rest was splattered over equipment in the process.
Strenuous recovery efforts helped raise production to 10% of the uranium-235 feed by January 1945. In February the Alpha racetracks began receiving slightly enriched (1.4%) feed from the new S-50 thermal diffusion plant. The next month it received enhanced (5%) feed from the K-25 gaseous diffusion plant.
By August, K-25 was producing uranium sufficiently enriched to feed directly into the Beta tracks.
Gaseous diffusion:
Main article: K-25
The most promising but also the most challenging method of isotope separation was gaseous diffusion. Graham's law states that the rate of effusion of a gas is inversely proportional to the square root of its molecular mass, so in a box containing a semi-permeable membrane and a mixture of two gases, the lighter molecules will pass out of the container more rapidly than the heavier molecules.
The gas leaving the container is somewhat enriched in the lighter molecules, while the residual gas is somewhat depleted. The idea was that such boxes could be formed into a cascade of pumps and membranes, with each successive stage containing a slightly more enriched mixture.
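To give a sense of scale (an illustrative calculation, not taken from the source), Graham's law implies that the ideal separation factor per stage for uranium hexafluoride is only about 1.0043, which is why thousands of stages had to be cascaded. A minimal Python sketch, using standard atomic masses as assumptions:

import math

# Ideal single-stage separation factor for UF6 under Graham's law.
# Molecular masses are standard reference values (assumptions, not from the source).
m_light = 235.04 + 6 * 18.998    # 235-UF6
m_heavy = 238.05 + 6 * 18.998    # 238-UF6
alpha = math.sqrt(m_heavy / m_light)
print(f"ideal stage factor: {alpha:.5f}")    # about 1.00430

# Ideal number of stages to raise the U-235/U-238 abundance ratio from
# natural (about 0.72%) to 90%, assuming each stage achieves the full factor.
r_feed = 0.0072 / 0.9928
r_product = 0.90 / 0.10
stages = math.log(r_product / r_feed) / math.log(alpha)
print(f"ideal stages: {stages:.0f}")         # on the order of 1,700

Because real diffusion barriers achieve only a fraction of the ideal factor and stages must also handle recycled material, actual plants such as K-25 required even more stages than this idealized count suggests.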
Research into the process was carried out at Columbia University by a group that included Harold Urey, Karl P. Cohen, and John R. Dunning.
In November 1942 the Military Policy Committee approved the construction of a 600-stage gaseous diffusion plant. On 14 December, M. W. Kellogg accepted an offer to construct the plant, which was codenamed K-25. A cost plus fixed-fee contract was negotiated, eventually totaling $2.5 million.
A separate corporate entity called Kellex was created for the project, headed by Percival C. Keith, one of Kellogg's vice presidents. The process faced formidable technical difficulties. The highly corrosive gas uranium hexafluoride would have to be used, as no substitute could be found, and the motors and pumps would have to be vacuum tight and enclosed in inert gas.
The biggest problem was the design of the barrier, which would have to be strong, porous and resistant to corrosion by uranium hexafluoride. The best choice for this seemed to be nickel. Edward Adler and Edward Norris created a mesh barrier from electroplated nickel. A six-stage pilot plant was built at Columbia to test the process, but the Norris-Adler prototype proved to be too brittle.
A rival barrier was developed from powdered nickel by Kellex, the Bell Telephone Laboratories and the Bakelite Corporation. In January 1944, Groves ordered the Kellex barrier into production.
Kellex's design for K-25 called for a four-story 0.5-mile (0.80 km) long U-shaped structure containing 54 contiguous buildings. These were divided into nine sections. Within these were cells of six stages. The cells could be operated independently, or consecutively within a section. Similarly, the sections could be operated separately or as part of a single cascade. A survey party began construction by marking out the 500-acre (2.0 km2) site in May 1943.
Work on the main building began in October 1943, and the six-stage pilot plant was ready for operation on 17 April 1944. In 1945 Groves canceled the upper stages of the plant, directing Kellex to instead design and build a 540-stage side feed unit, which became known as K-27. Kellex transferred the last unit to the operating contractor, Union Carbide and Carbon, on 11 September 1945.
The total cost, including the K-27 plant completed after the war, came to $480 million.
The production plant commenced operation in February 1945, and as cascade after cascade came online, the quality of the product increased. By April 1945, K-25 had attained a 1.1% enrichment and the output of the S-50 thermal diffusion plant began being used as feed.
Some product produced the next month reached nearly 7% enrichment. In August, the last of the 2,892 stages commenced operation. K-25 and K-27 achieved their full potential in the early postwar period, when they eclipsed the other production plants and became the prototypes for a new generation of plants.
Thermal diffusion:
Main article: S-50 Project
The thermal diffusion process was based on Sydney Chapman and David Enskog's theory, which explained that when a mixture of two gases passes through a temperature gradient, the heavier gas tends to concentrate at the cold end and the lighter one at the warm end.
Since hot gases tend to rise and cool ones tend to fall, this can be used as a means of isotope separation. This process was first demonstrated by Klaus Clusius and Gerhard Dickel in Germany in 1938. It was developed by US Navy scientists, but was not one of the enrichment technologies initially selected for use in the Manhattan Project. This was primarily due to doubts about its technical feasibility, but the inter-service rivalry between the Army and Navy also played a part.
The Naval Research Laboratory continued the research under Philip Abelson's direction, but there was little contact with the Manhattan Project until April 1944, when Captain William S. Parsons, the naval officer in charge of ordnance development at Los Alamos, brought Oppenheimer news of encouraging progress in the Navy's experiments on thermal diffusion.
Oppenheimer wrote to Groves suggesting that the output of a thermal diffusion plant could be fed into Y-12. Groves set up a committee consisting of Warren K. Lewis, Eger Murphree and Richard Tolman to investigate the idea, and they estimated that a thermal diffusion plant costing $3.5 million could enrich 50 kilograms (110 lb) of uranium per week to nearly 0.9% uranium-235. Groves approved its construction on 24 June 1944.
Groves contracted with the H. K. Ferguson Company of Cleveland, Ohio, to build the thermal diffusion plant, which was designated S-50. Groves's advisers, Karl Cohen and W. I. Thompson from Standard Oil, estimated that it would take six months to build.
Groves gave Ferguson just four. Plans called for the installation of 2,142 48-foot-tall (15 m) diffusion columns arranged in 21 racks. Inside each column were three concentric tubes. Steam, obtained from the nearby K-25 powerhouse at a pressure of 100 pounds per square inch (690 kPa) and temperature of 545 °F (285 °C), flowed downward through the innermost 1.25-inch (32 mm) nickel pipe, while water at 155 °F (68 °C) flowed upward through the outermost iron pipe. The uranium hexafluoride flowed in the middle copper pipe, and isotope separation of the uranium occurred between the nickel and copper pipes.
Work commenced on 9 July 1944, and S-50 began partial operation in September. Ferguson operated the plant through a subsidiary known as Fercleve. The plant produced just 10.5 pounds (4.8 kg) of 0.852% uranium-235 in October. Leaks limited production and forced shutdowns over the next few months, but in June 1945 it produced 12,730 pounds (5,770 kg).
By March 1945, all 21 production racks were operating. Initially the output of S-50 was fed into Y-12, but starting in March 1945 all three enrichment processes were run in series. S-50 became the first stage, enriching from 0.71% to 0.89%. This material was fed into the gaseous diffusion process in the K-25 plant, which produced a product enriched to about 23%. This was, in turn, fed into Y-12, which boosted it to about 89%, sufficient for nuclear weapons. About 50 kilograms (110 lb) of uranium enriched to 89% uranium-235 was delivered to Los Alamos by July 1945.
The entire 50 kilograms, together with additional material enriched to about 50%, averaging about 85% enrichment overall, was used in Little Boy.
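For illustration only, a simple uranium-235 mass balance shows roughly how much natural-uranium feed a cascade of this kind consumes per unit of highly enriched product. The 0.3% tails assay in the sketch below is an assumption, since the source gives no figure:

# Rough material balance for an enrichment cascade (illustration only; the
# tails assay is an assumed value, not a figure from the source).
def feed_required(product_kg, x_product, x_feed, x_tails):
    """Feed mass needed for a given product mass, by conservation of U-235."""
    return product_kg * (x_product - x_tails) / (x_feed - x_tails)

feed = feed_required(product_kg=50, x_product=0.89, x_feed=0.0071, x_tails=0.003)
print(f"about {feed:,.0f} kg of natural uranium feed")   # roughly 11,000 kg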
Plutonium:
The second line of development pursued by the Manhattan Project used the fissile element plutonium. Although small amounts of plutonium exist in nature, the best way to obtain large quantities of the element is in a nuclear reactor, in which natural uranium is bombarded by neutrons.
The uranium-238 is transmuted into uranium-239, which rapidly decays, first into neptunium-239 and then into plutonium-239. Only a small amount of the uranium-238 will be transformed, so the plutonium must be chemically separated from the remaining uranium, from any initial impurities, and from fission products.
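The breeding sequence can be written compactly; the half-lives shown are standard reference values and are not given in the source:

\[
{}^{238}\mathrm{U}\,(n,\gamma)\,{}^{239}\mathrm{U}
\;\xrightarrow{\;\beta^{-},\; t_{1/2}\approx 23.5\ \text{min}\;}\;
{}^{239}\mathrm{Np}
\;\xrightarrow{\;\beta^{-},\; t_{1/2}\approx 2.36\ \text{d}\;}\;
{}^{239}\mathrm{Pu}
\]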
X-10 Graphite Reactor:
Main article: X-10 Graphite Reactor
In March 1943, DuPont began construction of a plutonium plant on a 112-acre (0.5 km2) site at Oak Ridge. Intended as a pilot plant for the larger production facilities at Hanford, it included the air-cooled X-10 Graphite Reactor, a chemical separation plant, and support facilities.
Because of the subsequent decision to construct water-cooled reactors at Hanford, only the chemical separation plant operated as a true pilot. The X-10 Graphite Reactor consisted of a huge block of graphite, 24 feet (7.3 m) long on each side, weighing around 1,500 short tons (1,400 t), surrounded by 7 feet (2.1 m) of high-density concrete as a radiation shield.
The greatest difficulty was encountered with the uranium slugs produced by Mallinckrodt and Metal Hydrides. These somehow had to be coated in aluminum to avoid corrosion and the escape of fission products into the cooling system. The Grasselli Chemical Company attempted to develop a hot dipping process without success. Meanwhile, Alcoa tried canning.
A new process for flux-less welding was developed, and 97% of the cans passed a standard vacuum test, but high temperature tests indicated a failure rate of more than 50%.
Nonetheless, production began in June 1943. The Metallurgical Laboratory eventually developed an improved welding technique with the help of General Electric, which was incorporated into the production process in October 1943.
Watched by Fermi and Compton, the X-10 Graphite Reactor went critical on 4 November 1943 with about 30 short tons (27 t) of uranium. A week later the load was increased to 36 short tons (33 t), raising its power generation to 500 kW, and by the end of the month the first 500 mg of plutonium was created.
Modifications over time raised the power to 4,000 kW in July 1944. X-10 operated as a production plant until January 1945, when it was turned over to research activities.
Hanford reactors:
Main article: Hanford Engineer Works
Although an air-cooled design was chosen for the reactor at Oak Ridge to facilitate rapid construction, it was recognized that this would be impractical for the much larger production reactors. Initial designs by the Metallurgical Laboratory and DuPont used helium for cooling, before they determined that a water-cooled reactor would be simpler, cheaper and quicker to build.
The design did not become available until 4 October 1943; in the meantime, Matthias concentrated on improving the Hanford Site by erecting accommodations, improving the roads, building a railway switch line, and upgrading the electricity, water and telephone lines.
As at Oak Ridge, the most difficulty was encountered while canning the uranium slugs, which commenced at Hanford in March 1944. They were pickled to remove dirt and impurities, dipped in molten bronze, tin, and aluminum-silicon alloy, canned using hydraulic presses, and then capped using arc welding under an argon atmosphere. Finally, they were subjected to a series of tests to detect holes or faulty welds.
Disappointingly, most canned slugs initially failed the tests, resulting in an output of only a handful of canned slugs per day. But steady progress was made and by June 1944 production increased to the point where it appeared that enough canned slugs would be available to start Reactor B on schedule in August 1944.
Work began on Reactor B, the first of six planned 250 MW reactors, on 10 October 1943. The reactor complexes were given letter designations A through F, with the B, D and F sites chosen to be developed first, as this maximized the distance between the reactors. They would be the only ones constructed during the Manhattan Project.
Some 390 short tons (350 t) of steel, 17,400 cubic yards (13,300 m3) of concrete, 50,000 concrete blocks and 71,000 concrete bricks were used to construct the 120-foot (37 m) high building.
Construction of the reactor itself commenced in February 1944. Watched by Compton, Matthias, DuPont's Crawford Greenewalt, Leona Woods and Fermi, who inserted the first slug, the reactor was powered up beginning on 13 September 1944. Over the next few days, 838 tubes were loaded and the reactor went critical. Shortly after midnight on 27 September, the operators began to withdraw the control rods to initiate production.
At first all appeared well but around 03:00 the power level started to drop and by 06:30 the reactor had shut down completely. The cooling water was investigated to see if there was a leak or contamination.
The next day the reactor started up again, only to shut down once more.
Fermi contacted Chien-Shiung Wu, who identified the cause of the problem as neutron poisoning from xenon-135, which has a half-life of 9.2 hours. Fermi, Woods, Donald J. Hughes and John Archibald Wheeler then calculated the nuclear cross section of xenon-135, which turned out to be 30,000 times that of uranium. DuPont engineer George Graves had deviated from the Metallurgical Laboratory's original design in which the reactor had 1,500 tubes arranged in a circle, and had added an additional 504 tubes to fill in the corners.
The scientists had originally considered this overengineering a waste of time and money, but Fermi realized that by loading all 2,004 tubes, the reactor could reach the required power level and efficiently produce plutonium. Reactor D was started on 17 December 1944 and Reactor F on 25 February 1945.
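As a rough illustration of the xenon behavior described above (a sketch added here, not drawn from the source), the poison can be modeled as two coupled radioactive decays: iodine-135, with a half-life of about 6.6 hours, decays into xenon-135, which in turn decays with its 9.2-hour half-life once the neutron flux that would otherwise burn it off is gone. A minimal Python sketch, with assumed starting inventories:

import math

# Minimal sketch (illustration only): relative Xe-135 poison inventory after
# a reactor shutdown. Half-lives are standard reference values.
LAMBDA_I = math.log(2) / 6.6    # per hour, I-135 decay constant
LAMBDA_XE = math.log(2) / 9.2   # per hour, Xe-135 decay constant

def xenon_after_shutdown(t_hours, i0=1.0, xe0=0.2):
    """Relative Xe-135 inventory t hours after shutdown.

    i0 and xe0 are arbitrary assumed iodine and xenon inventories at the
    moment of shutdown."""
    from_iodine = (LAMBDA_I * i0 / (LAMBDA_XE - LAMBDA_I)) * (
        math.exp(-LAMBDA_I * t_hours) - math.exp(-LAMBDA_XE * t_hours))
    return xe0 * math.exp(-LAMBDA_XE * t_hours) + from_iodine

for t in (0, 6, 12, 24, 48):
    print(t, round(xenon_after_shutdown(t), 3))
# With these assumptions the poison peaks a few hours after shutdown and has
# largely decayed after a day or two, consistent with the reactor restarting
# and then stalling again as xenon built back up under power.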
Separation process:
Meanwhile, the chemists considered the problem of how plutonium could be separated from uranium when its chemical properties were not known. Working with the minute quantities of plutonium available at the Metallurgical Laboratory in 1942, a team under Charles M. Cooper developed a lanthanum fluoride process for separating uranium and plutonium, which was chosen for the pilot separation plant.
A second separation process, the bismuth phosphate process, was subsequently developed by Seaborg and Stanley G. Thompson. This process worked by toggling plutonium between its +4 and +6 oxidation states in solutions of bismuth phosphate. In the former state, the plutonium was precipitated; in the latter, it stayed in solution and the other products were precipitated.
Greenewalt favored the bismuth phosphate process due to the corrosive nature of lanthanum fluoride, and it was selected for the Hanford separation plants. Once X-10 began producing plutonium, the pilot separation plant was put to the test. The first batch was processed at 40% efficiency but over the next few months this was raised to 90%.
At Hanford, top priority was initially given to the installations in the 300 area. This contained buildings for testing materials, preparing uranium, and assembling and calibrating instrumentation. One of the buildings housed the canning equipment for the uranium slugs, while another contained a small test reactor.
Notwithstanding the high priority allocated to it, work on the 300 area fell behind schedule due to the unique and complex nature of the 300 area facilities, and wartime shortages of labor and materials.
Early plans called for the construction of two separation plants in each of the areas known as 200-West and 200-East. This was subsequently reduced to two, the T and U plants, in 200-West and one, the B plant, at 200-East. Each separation plant consisted of four buildings:
- a process cell building or "canyon" (known as 221),
- a concentration building (224),
- a purification building (231)
- and a magazine store (213).
The canyons were each 800 feet (240 m) long and 65 feet (20 m) wide. Each consisted of forty 17.7-by-13-by-20-foot (5.4 by 4.0 by 6.1 m) cells.
Work began on 221-T and 221-U in January 1944, with the former completed in September and the latter in December. The 221-B building followed in March 1945. Because of the high levels of radioactivity involved, all work in the separation plants had to be conducted by remote control using closed-circuit television, something unheard of in 1943.
Maintenance was carried out with the aid of an overhead crane and specially designed tools. The 224 buildings were smaller because they had less material to process, and it was less radioactive. The 224-T and 224-U buildings were completed on 8 October 1944, and 224-B followed on 10 February 1945.
The purification methods that were eventually used in 231-W were still unknown when construction commenced on 8 April 1944, but the plant was complete and the methods were selected by the end of the year. On 5 February 1945, Matthias hand-delivered the first shipment of 80 g of 95%-pure plutonium nitrate to a Los Alamos courier in Los Angeles.
Weapon design:
Main article: Project Y
In 1943, development efforts were directed to a gun-type fission weapon with plutonium called Thin Man. Initial research on the properties of plutonium was done using cyclotron-generated plutonium-239, which was extremely pure, but could only be created in very small amounts.
The idea behind the Thin Man design was to fire one subcritical mass of plutonium at another and the collision would create a nuclear explosion. Los Alamos received the first sample of plutonium from the Clinton X-10 reactor in April 1944 and within days Emilio Segrè discovered a problem: the reactor-bred plutonium had a higher concentration of plutonium-240, resulting in up to five times the spontaneous fission rate of cyclotron plutonium. Seaborg had correctly predicted in March 1943 that some of the plutonium-239 would absorb a neutron and become plutonium-240.
This made reactor plutonium unsuitable for use in a gun-type weapon. The plutonium-240 would start the chain reaction too quickly, causing a predetonation that would release enough energy to disperse the critical mass with a minimal amount of plutonium reacted (a fizzle). A faster gun was suggested but found to be impractical. The possibility of separating the isotopes was considered and rejected, as plutonium-240 is even harder to separate from plutonium-239 than uranium-235 from uranium-238.
Work on an alternative method of bomb design, known as implosion, had begun earlier under the direction of the physicist Seth Neddermeyer. Implosion used explosives to crush a subcritical sphere of fissile material into a smaller and denser form. When the fissile atoms are packed closer together, the rate of neutron capture increases, and the mass becomes a critical mass.
The metal needs to travel only a very short distance, so the critical mass is assembled in much less time than it would take with the gun method. Neddermeyer's 1943 and early 1944 investigations into implosion showed promise, but also made it clear that the problem would be much more difficult from a theoretical and engineering perspective than the gun design.
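The advantage of compression can be made quantitative with a standard scaling argument (a textbook result, not stated in the source): the neutron mean free path in the fissile metal varies inversely with density, so the critical radius shrinks as the material is compressed and the critical mass falls off as the square of the density:

\[
r_c \propto \lambda \propto \frac{1}{\rho}, \qquad
m_c \propto \rho\, r_c^{3} \propto \frac{1}{\rho^{2}} .
\]

Doubling the density of a subcritical sphere therefore reduces its critical mass by roughly a factor of four, which is why crushing the core with explosives can drive it supercritical.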
In September 1943, John von Neumann, who had experience with shaped charges used in armor-piercing shells, argued that not only would implosion reduce the danger of pre-detonation and fizzle, but would make more efficient use of the fissionable material. He proposed using a spherical configuration instead of the cylindrical one that Neddermeyer was working on.
By July 1944, Oppenheimer had concluded plutonium could not be used in a gun design, and opted for implosion. The accelerated effort on an implosion design, codenamed Fat Man, began in August 1944 when Oppenheimer implemented a sweeping reorganization of the Los Alamos laboratory to focus on implosion.
Two new groups were created at Los Alamos to develop the implosion weapon, X (for explosives) Division headed by explosives expert George Kistiakowsky and G (for gadget) Division under Robert Bacher. The new design that von Neumann and T (for theoretical) Division, most notably Rudolf Peierls, had devised used explosive lenses to focus the explosion onto a spherical shape using a combination of both slow and fast high explosives.
The design of lenses that detonated with the proper shape and velocity turned out to be slow, difficult and frustrating. Various explosives were tested before settling on composition B as the fast explosive and baratol as the slow explosive. The final design resembled a soccer ball, with 20 hexagonal and 12 pentagonal lenses, each weighing about 80 pounds (36 kg).
Getting the detonation just right required fast, reliable and safe electrical detonators, of which there were two for each lens for reliability. It was therefore decided to use exploding-bridgewire detonators, a new invention developed at Los Alamos by a group led by Luis Alvarez. A contract for their manufacture was given to Raytheon.
To study the behavior of converging shock waves, Robert Serber devised the RaLa Experiment, which used the short-lived radioisotope lanthanum-140, a potent source of gamma radiation. The gamma ray source was placed in the center of a metal sphere surrounded by the explosive lenses, which in turn were inside an ionization chamber. This allowed the taking of an X-ray movie of the implosion. The lenses were designed primarily using this series of tests.
In his history of the Los Alamos project, David Hawkins wrote: "RaLa became the most important single experiment affecting the final bomb design".
Within the explosives was the 4.5-inch (110 mm) thick aluminum pusher, which provided a smooth transition from the relatively low density explosive to the next layer, the 3-inch (76 mm) thick tamper of natural uranium. Its main job was to hold the critical mass together as long as possible, but it would also reflect neutrons back into the core. Some part of it might fission as well.
To prevent predetonation by an external neutron, the tamper was coated in a thin layer of boron. A polonium-beryllium modulated neutron initiator, known as an "urchin" because its shape resembled a sea urchin, was developed to start the chain reaction at precisely the right moment.
This work with the chemistry and metallurgy of radioactive polonium was directed by Charles Allen Thomas of the Monsanto Company and became known as the Dayton Project. Testing required up to 500 curies per month of polonium, which Monsanto was able to deliver.
The whole assembly was encased in a duralumin bomb casing to protect it from bullets and flak.
The ultimate task of the metallurgists was to determine how to cast plutonium into a sphere. The difficulties became apparent when attempts to measure the density of plutonium gave inconsistent results. At first contamination was believed to be the cause, but it was soon determined that there were multiple allotropes of plutonium. The brittle α phase that exists at room temperature changes to the plastic β phase at higher temperatures.
Attention then shifted to the even more malleable δ phase that normally exists in the 300 °C to 450 °C range. It was found that this was stable at room temperature when alloyed with aluminum, but aluminum emits neutrons when bombarded with alpha particles, which would exacerbate the pre-ignition problem. The metallurgists then hit upon a plutonium-gallium alloy, which stabilized the δ phase and could be hot pressed into the desired spherical shape.
As plutonium was found to corrode readily, the sphere was coated with nickel.
The work proved dangerous. By the end of the war, half the experienced chemists and metallurgists had to be removed from work with plutonium when unacceptably high levels of the element appeared in their urine. A minor fire at Los Alamos in January 1945 led to a fear that a fire in the plutonium laboratory might contaminate the whole town, and Groves authorized the construction of a new facility for plutonium chemistry and metallurgy, which became known as the DP-site.
The hemispheres for the first plutonium pit (or core) were produced and delivered on 2 July 1945. Three more hemispheres followed on 23 July and were delivered three days later.
In contrast to the plutonium Fat Man, the uranium gun-type Little Boy weapon was straightforward if not trivial to design. Overall responsibility for it was assigned to Parsons's Ordnance (O) Division, with the design, development, and technical work at Los Alamos consolidated under Lieutenant Commander Francis Birch's group.
The gun-type design now had to work with enriched uranium only, and this allowed the design to be greatly simplified. A high-velocity gun was no longer required, and a simpler weapon was substituted.
Trinity:
Main article: Trinity (nuclear test)
Because of the complexity of an implosion-style weapon, it was decided that, despite the waste of fissile material, an initial test would be required. Groves approved the test, subject to the active material being recovered. Consideration was therefore given to a controlled fizzle, but Oppenheimer opted instead for a full-scale nuclear test, codenamed "Trinity".
In March 1944, planning for the test was assigned to Kenneth Bainbridge, a professor of physics at Harvard, working under Kistiakowsky. Bainbridge selected the bombing range near Alamogordo Army Airfield as the site for the test.
Bainbridge worked with Captain Samuel P. Davalos on the construction of the Trinity Base Camp and its facilities, which included barracks, warehouses, workshops, an explosive magazine and a commissary.
Groves did not relish the prospect of explaining to a Senate committee the loss of a billion dollars' worth of plutonium, so a cylindrical containment vessel codenamed "Jumbo" was constructed to recover the active material in the event of a failure. Measuring 25 feet (7.6 m) long and 12 feet (3.7 m) wide, it was fabricated at great expense from 214 short tons (194 t) of iron and steel by Babcock & Wilcox in Barberton, Ohio.
Brought in a special railroad car to a siding in Pope, New Mexico, it was transported the last 25 miles (40 km) to the test site on a trailer pulled by two tractors. By the time it arrived, however, confidence in the implosion method was high enough, and the availability of plutonium was sufficient, that Oppenheimer decided not to use it. Instead, it was placed atop a steel tower 800 yards (730 m) from the weapon as a rough measure of how powerful the explosion would be.
In the end, Jumbo survived, although its tower did not, adding credence to the belief that Jumbo would have successfully contained a fizzled explosion.
A pre-test explosion was conducted on 7 May 1945 to calibrate the instruments. A wooden test platform was erected 800 yards (730 m) from Ground Zero and piled with 100 short tons (91 t) of TNT spiked with nuclear fission products in the form of an irradiated uranium slug from Hanford, which was dissolved and poured into tubing inside the explosive.
This explosion was observed by Oppenheimer and Groves's new deputy commander, Brigadier General Thomas Farrell. The pre-test produced data that proved vital for the Trinity test.
For the actual test, the weapon, nicknamed "the gadget", was hoisted to the top of a 100-foot (30 m) steel tower, as detonation at that height would give a better indication of how the weapon would behave when dropped from a bomber. Detonation in the air maximized the energy applied directly to the target, and generated less nuclear fallout.
The gadget was assembled under the supervision of Norris Bradbury at the nearby McDonald Ranch House on 13 July, and precariously winched up the tower the following day.
Observers included Bush, Chadwick, Conant, Farrell, Fermi, Groves, Lawrence, Oppenheimer and Tolman. At 05:30 on 16 July 1945 the gadget exploded with an energy equivalent of around 20 kilotons of TNT, leaving a crater of Trinitite (radioactive glass) in the desert 250 feet (76 m) wide.
The shock wave was felt over 100 miles (160 km) away, and the mushroom cloud reached 7.5 miles (12.1 km) in height. It was heard as far away as El Paso, Texas, so Groves issued a cover story about an ammunition magazine explosion at Alamogordo Field.
Oppenheimer later recalled that, while witnessing the explosion, he thought of a verse from the Hindu holy book, the Bhagavad Gita (XI,12): "If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one ..."
Years later he would explain that another verse had also entered his head at that time:
"We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, 'Now I am become Death, the destroyer of worlds.' I suppose we all thought that, one way or another.
Personnel:
At its peak in June 1944, the Manhattan Project employed about 129,000 workers, of whom 84,500 were construction workers, 40,500 were plant operators and 1,800 were military personnel. As construction activity fell off, the workforce declined to 100,000 a year later, but the number of military personnel increased to 5,600.
Procuring the required numbers of workers, especially highly skilled workers, in competition with other vital wartime programs proved very difficult. Due to high turnover, over 500,000 people worked on the project. In 1943, Groves obtained a special temporary priority for labor from the War Manpower Commission.
In March 1944, both the War Production Board and the War Manpower Commission gave the project their highest priority. The Kansas commission director stated that from April to July 1944 every qualified applicant in the state who visited a United States Employment Service office was urged to work at the Hanford Site. No other job was offered until the applicant definitively rejected the offer.
Tolman and Conant, in their role as the project's scientific advisers, drew up a list of candidate scientists and had them rated by scientists already working on the project. Groves then sent a personal letter to the head of their university or company asking for them to be released for essential war work.
One source of skilled personnel was the Army itself, particularly the Army Specialized Training Program. In 1943, the MED created the Special Engineer Detachment (SED), with an authorized strength of 675. Technicians and skilled workers drafted into the Army were assigned to the SED.
Another source was the Women's Army Corps (WAC). Initially intended for clerical tasks handling classified material, the WACs were soon tapped for technical and scientific tasks as well. On 1 February 1945, all military personnel assigned to the MED, including all SED detachments, were assigned to the 9812th Technical Service Unit, except at Los Alamos, where military personnel other than SED, including the WACs and Military Police, were assigned to the 4817th Service Command Unit.
An Associate Professor of Radiology at the University of Rochester School of Medicine, Stafford L. Warren, was commissioned as a colonel in the United States Army Medical Corps, and appointed as chief of the MED's Medical Section and Groves' medical advisor. Warren's initial task was to staff hospitals at Oak Ridge, Richland and Los Alamos.
The Medical Section was responsible for medical research, but also for the MED's health and safety programs. This presented an enormous challenge, because workers were handling a variety of toxic chemicals, using hazardous liquids and gases under high pressures, working with high voltages, and performing experiments involving explosives, not to mention the largely unknown dangers presented by radioactivity and handling fissile materials.
Yet in December 1945, the National Safety Council presented the Manhattan Project with the Award of Honor for Distinguished Service to Safety in recognition of its safety record. Between January 1943 and June 1945, there were 62 fatalities and 3,879 disabling injuries, which was about 62 percent below the rate of private industry.
Secrecy:
Byron Price, head of the government's Office of Censorship, called the Manhattan Project the best-kept secret of the war. A 1945 Life article estimated that before the Hiroshima and Nagasaki bombings "probably no more than a few dozen men in the entire country knew the full meaning of the Manhattan Project, and perhaps only a thousand others even were aware that work on atoms was involved."
The magazine wrote that the more than 100,000 others employed with the project "worked like moles in the dark". Warned that disclosing the project's secrets was punishable by 10 years in prison or a fine of US$10,000 (equivalent to $163,000 in 2022), they saw enormous quantities of raw materials enter factories with nothing coming out and monitored "dials and switches while behind thick concrete walls mysterious reactions took place" without knowing the purpose of their jobs.
In December 1945 the United States Army published a secret report analysing and assessing the security apparatus surrounding the Manhattan Project. The report states that the Manhattan Project was "more drastically guarded than any other highly secret war development."
The security infrastructure surrounding the Manhattan Project was so vast and thorough that in the early days of the project in 1943, security investigators vetted 400,000 potential employees and 600 companies that would be involved in all aspects of the project for potential security risks. Although the project was at times the most important employer in the nation, the government bureaucrats who assigned manpower knew it only as the "Pasco secret project"; one said that until Hiroshima "we had no idea what was being made".
Oak Ridge security personnel considered any private party with more than seven people as suspicious, and residents—who believed that US government agents were secretly among them—avoided repeatedly inviting the same guests. Although original residents of the area could be buried in existing cemeteries, every coffin was reportedly opened for inspection.
Everyone, including top military officials, and their automobiles were searched when entering and exiting project facilities. One Oak Ridge worker stated that "if you got inquisitive, you were called on the carpet within two hours by government secret agents. Usually those summoned to explain were then escorted bag and baggage to the gate and ordered to keep going".
Workers had been told that their work would help end the war and perhaps all future wars, yet they could not see or understand the results of their often tedious duties, saw none of the typical side effects of factory work such as smoke from smokestacks, and watched the war in Europe end without their work ever being used. This caused serious morale problems and allowed many rumors to spread.
One manager stated after the war: "Well it wasn't that the job was tough ... it was confusing. You see, no one knew what was being made in Oak Ridge, not even me, and a lot of the people thought they were wasting their time here. It was up to me to explain to the dissatisfied workers that they were doing a very important job. When they asked me what, I'd have to tell them it was a secret. But I almost went crazy myself trying to figure out what was going on."
Another worker told of how, working in a laundry, she held "a special instrument" to uniforms every day and listened for "a clicking noise". She learned only after the war that she had been performing the important task of checking for radiation with a Geiger counter.
To improve morale among such workers Oak Ridge created an extensive system of intramural sports leagues, including 10 baseball teams, 81 softball teams, and 26 football teams.
Censorship:
Voluntary censorship of atomic information began before the Manhattan Project. After the start of the European war in 1939 American scientists began avoiding publishing military-related research, and in 1940 scientific journals began asking the National Academy of Sciences to clear articles. William L. Laurence of The New York Times, who wrote an article on atomic fission in The Saturday Evening Post of 7 September 1940, later learned that government officials asked librarians nationwide in 1943 to withdraw the issue.
The Soviets noticed the silence, however. In April 1942 nuclear physicist Georgy Flyorov wrote to Josef Stalin on the absence of articles on nuclear fission in American journals; this resulted in the Soviet Union establishing its own atomic bomb project.
The Manhattan Project operated under tight security lest its discovery induce Axis powers, especially Germany, to accelerate their own nuclear projects or undertake covert operations against the project.
The Office of Censorship, by contrast, relied on the press to comply with a voluntary code of conduct it published, and the project at first avoided notifying the office. By early 1943 newspapers began publishing reports of large construction in Tennessee and Washington based on public records, and the office began discussing with the project how to maintain secrecy.
In June the Office of Censorship asked newspapers and broadcasters to avoid discussing "atom smashing, atomic energy, atomic fission, atomic splitting, or any of their equivalents. The use for military purposes of radium or radioactive materials, heavy water, high voltage discharge equipment, cyclotrons." The office also asked to avoid discussion of "polonium, uranium, ytterbium, hafnium, protactinium, radium, rhenium, thorium, deuterium"; only uranium was sensitive, but it was listed with other elements to hide its importance.
Soviet spies:
Main article: Atomic spies
The prospect of sabotage was always present, and sometimes suspected when there were equipment failures. While there were some problems believed to be the result of careless or disgruntled employees, there were no confirmed instances of Axis-instigated sabotage.
However, on 10 March 1945, a Japanese fire balloon struck a power line, and the resulting power surge caused the three reactors at Hanford to be temporarily shut down.
With so many people involved, security was a difficult task. A special Counter Intelligence Corps detachment was formed to handle the project's security issues. By 1943, it was clear that the Soviet Union was attempting to penetrate the project. Lieutenant Colonel Boris T. Pash, the head of the Counter Intelligence Branch of the Western Defense Command, investigated suspected Soviet espionage at the Radiation Laboratory in Berkeley.
Oppenheimer informed Pash that he had been approached by a fellow professor at Berkeley, Haakon Chevalier, about passing information to the Soviet Union.
The most successful Soviet spy was Klaus Fuchs, a member of the British Mission who played an important part at Los Alamos. The 1950 revelation of his espionage activities damaged the United States' nuclear cooperation with Britain and Canada.
Subsequently, other instances of espionage were uncovered, leading to the arrest of Harry Gold, David Greenglass, and Julius and Ethel Rosenberg. Other spies like George Koval and Theodore Hall remained unknown for decades. The value of the espionage is difficult to quantify, as the principal constraint on the Soviet atomic bomb project was a shortage of uranium ore. The consensus is that espionage saved the Soviets one or two years of effort.
Foreign intelligence:
Main articles: Alsos Mission and Operation Epsilon
In addition to developing the atomic bomb, the Manhattan Project was charged with gathering intelligence on the German nuclear energy project. It was believed that the Japanese nuclear weapons program was not far advanced because Japan had little access to uranium ore, but it was initially feared that Germany was very close to developing its own weapons.
At the instigation of the Manhattan Project, a bombing and sabotage campaign was carried out against heavy water plants in German-occupied Norway. A small mission was created, jointly staffed by the Office of Naval Intelligence, OSRD, the Manhattan Project, and Army Intelligence (G-2), to investigate enemy scientific developments.
It was not restricted to those involving nuclear weapons. The Chief of Army Intelligence, Major General George V. Strong, appointed Boris Pash to command the unit, which was codenamed "Alsos", a Greek word meaning "grove".
The Alsos Mission to Italy questioned staff of the physics laboratory at the University of Rome following the capture of the city in June 1944. Meanwhile, Pash formed a combined British and American Alsos mission in London under the command of Captain Horace K. Calvert to participate in Operation Overlord.
Groves considered the risk that the Germans might attempt to disrupt the Normandy landings with radioactive poisons was sufficient to warn General Dwight D. Eisenhower and send an officer to brief his chief of staff, Lieutenant General Walter Bedell Smith.
Under the codename Operation Peppermint, special equipment was prepared and Chemical Warfare Service teams were trained in its use.
Following in the wake of the advancing Allied armies, Pash and Calvert interviewed Frédéric Joliot-Curie about the activities of German scientists. They spoke to officials at Union Minière du Haut Katanga about uranium shipments to Germany. They tracked down 68 tons of ore in Belgium and 30 tons in France. The interrogation of German prisoners indicated that uranium and thorium were being processed in Oranienburg, 20 miles north of Berlin, so Groves arranged for it to be bombed on 15 March 1945.
An Alsos team went to Stassfurt in the Soviet Occupation Zone and retrieved 11 tons of ore from WIFO. In April 1945, Pash, in command of a composite force known as T-Force, conducted Operation Harborage, a sweep behind enemy lines of the cities of Hechingen, Bisingen, and Haigerloch that were the heart of the German nuclear effort.
T-Force captured the nuclear laboratories, documents, equipment and supplies, including heavy water and 1.5 tons of metallic uranium.
Alsos teams rounded up a number of German scientists involved in the nuclear program, among them Werner Heisenberg and Otto Hahn. These German scientists were taken to England, where they were interned at Farm Hall, a bugged house in Godmanchester.
After the bombs were detonated in Japan, the Germans were forced to confront the fact that the Allies had done what they could not.
Atomic bombings of Hiroshima and Nagasaki:
Main article: Atomic bombings of Hiroshima and Nagasaki
Preparations:
The only Allied aircraft capable of carrying the 17-foot (5.2 m) long Thin Man or the 59-inch (150 cm) wide Fat Man was the British Avro Lancaster, but using a British aircraft would have caused difficulties with maintenance. Groves hoped that the American Boeing B-29 Superfortress could be modified to carry Thin Man by joining its two bomb bays together.
This became unnecessary after Thin Man was abandoned, as Little Boy was short enough to fit into a B-29 bomb bay, but modifications were still required. The Chief of United States Army Air Forces (USAAF), General Henry H. Arnold, assured Groves that no effort would be spared to modify B-29s to do the job, and he designated Major General Oliver P. Echols as the USAAF liaison to the Manhattan Project.
In turn, Echols named Colonel Roscoe C. Wilson as his alternate, and Wilson became Manhattan Project's main USAAF contact.
Commencing in November 1943, the Army Air Forces Materiel Command at Wright Field, Ohio, began Silverplate, the codename for the modification of the B-29 to carry atomic bombs. Test drops were carried out at Muroc Army Air Field and the Naval Ordnance Test Station in California with Thin Man and Fat Man pumpkin bombs to test their ballistic, fuzing and stability characteristics.
The 509th Composite Group was activated on 17 December 1944 at Wendover Army Air Field, Utah, under the command of Colonel Paul W. Tibbets. This base, close to the border with Nevada, was codenamed "Kingman" or "W-47". Training was conducted at Wendover and at Batista Army Airfield, Cuba, where the 393rd Bombardment Squadron practiced long-distance flights over water and dropped pumpkin bombs.
Roosevelt instructed Groves that if the atomic bombs were ready before the war with Germany ended, he should be ready to drop them on Germany, but Japan was regarded as the most likely target.
A special unit known as Project Alberta was formed at Los Alamos under Parsons's command to assist in preparing and delivering the bombs. Commander Frederick L. Ashworth from Alberta met with Fleet Admiral Chester W. Nimitz on Guam in February 1945 to inform him of the Manhattan Project.
While he was there, Ashworth selected North Field on Tinian as a base for the 509th Composite Group, and he reserved space for the group and its buildings. The group deployed there in July 1945. Farrell arrived at Tinian on 30 July as the Manhattan Project representative. Purnell went to Tinian as the representative of the Military Policy Committee.
Most of the components for Little Boy left San Francisco on the cruiser USS Indianapolis on 16 July and arrived on Tinian on 26 July. Four days later the ship was sunk by a Japanese submarine. The remaining components, which included six highly enriched uranium rings, were delivered by three Douglas C-54 Skymasters of the 509th Group's 320th Troop Carrier Squadron.
Two Fat Man assemblies travelled to Tinian in specially modified 509th Composite Group B-29s. The first plutonium core went in a special C-54.
In late April, a joint targeting committee of the Manhattan District and USAAF was established to determine which cities in Japan should be targets, and recommended Kokura, Hiroshima, Niigata, and Kyoto.
At this point, Secretary of War Henry L. Stimson intervened, announcing that he would be making the targeting decision, and that he would not authorize the bombing of Kyoto on the grounds of its historical and religious significance. Groves therefore asked Arnold to remove Kyoto not just from the list of nuclear targets, but from targets for conventional bombing as well. Nagasaki was substituted.
Bombings:
In May 1945, the Interim Committee was created to advise on wartime and postwar use of nuclear energy. The committee was chaired by Stimson, with:
- James F. Byrnes, a former US Senator soon to be Secretary of State, as President Harry S. Truman's personal representative;
- Ralph A. Bard, the Under Secretary of the Navy;
- William L. Clayton, the Assistant Secretary of State;
- Vannevar Bush;
- Karl T. Compton;
- James B. Conant;
- and George L. Harrison, an assistant to Stimson and president of New York Life Insurance Company.
The Interim Committee in turn established a scientific panel consisting of Arthur Compton, Fermi, Lawrence and Oppenheimer to advise it on scientific issues. In its presentation to the Interim Committee, the scientific panel offered its opinion not just on the likely physical effects of an atomic bomb, but on its probable military and political impact.
At the Potsdam Conference in Germany, Truman was informed that the Trinity test had been successful. He told Stalin, the leader of the Soviet Union, that the US had a new superweapon, without giving any details. This was the first official communication to the Soviet Union about the bomb, but Stalin already knew about it from spies. With the authorization to use the bomb against Japan already given, no alternatives were considered after the Japanese rejection of the Potsdam Declaration.
Picture below: Little Boy explodes over Hiroshima, Japan, 6 August 1945 (left);
Fat Man explodes over Nagasaki, Japan, 9 August 1945 (right).
On 6 August 1945, a Boeing B-29 Superfortress (Enola Gay) of the 393d Bombardment Squadron, piloted by Tibbets, lifted off from North Field with a Little Boy in its bomb bay. Hiroshima, the headquarters of the 2nd General Army and Fifth Division and a port of embarkation, was the primary target of the mission, with Kokura and Nagasaki as alternatives.
With Farrell's permission, Parsons, the weaponeer in charge of the mission, completed the bomb assembly in the air to minimize the risks of a nuclear explosion in the event of a crash during takeoff. The bomb detonated at an altitude of 1,750 feet (530 m) with a blast that was later estimated to be the equivalent of 13 kilotons of TNT.
An area of approximately 4.7 square miles (12 km2) was destroyed. Japanese officials determined that 69% of Hiroshima's buildings were destroyed and another 6–7% damaged.
About 70,000 to 80,000 people, of whom 20,000 were Japanese combatants and 20,000 were Korean slave laborers, or some 30% of the population of Hiroshima, were killed immediately, and another 70,000 injured.
On the morning of 9 August 1945, a second B-29 (Bockscar), piloted by the 393d Bombardment Squadron's commander, Major Charles W. Sweeney, lifted off with a Fat Man on board. This time, Ashworth served as weaponeer and Kokura was the primary target.
Sweeney took off with the weapon already armed but with the electrical safety plugs still engaged. When they reached Kokura, they found cloud cover had obscured the city, prohibiting the visual attack required by orders. After three runs over the city, and with fuel running low, they headed for the secondary target, Nagasaki.
Ashworth decided that a radar approach would be used if the target was obscured, but a last-minute break in the clouds over Nagasaki allowed a visual approach as ordered. The Fat Man was dropped over the city's industrial valley midway between the Mitsubishi Steel and Arms Works in the south and the Mitsubishi-Urakami Ordnance Works in the north.
The resulting explosion had a blast yield equivalent to 21 kilotons of TNT, roughly the same as the Trinity blast, but was confined to the Urakami Valley, and a major portion of the city was protected by the intervening hills, resulting in the destruction of about 44% of the city.
The bombing also crippled the city's industrial production extensively and killed 23,200–28,200 Japanese industrial workers and 150 Japanese soldiers. Overall, an estimated 35,000–40,000 people were killed and 60,000 injured.
Groves expected to have another atomic bomb ready for use on 19 August, with three more in September and a further three in October. Two more Fat Man assemblies were readied, and scheduled to leave Kirtland Field for Tinian on 11 and 14 August.
At Los Alamos, technicians worked 24 hours straight to cast another plutonium core. Although cast, it still needed to be pressed and coated, which would take until 16 August. It could therefore have been ready for use on 19 August.
On 10 August, Truman secretly requested that additional atomic bombs not be dropped on Japan without his express authority. Groves suspended the third core's shipment on his own authority on 13 August.
On 11 August, Groves phoned Warren with orders to organize a survey team to report on the damage and radioactivity at Hiroshima and Nagasaki.
A party equipped with portable Geiger counters arrived in Hiroshima on 8 September headed by Farrell and Warren, with Japanese Rear Admiral Masao Tsuzuki, who acted as a translator.
They remained in Hiroshima until 14 September and then surveyed Nagasaki from 19 September to 8 October. This and other scientific missions to Japan provided valuable scientific and historical data.
The necessity of the bombings of Hiroshima and Nagasaki later became a subject of controversy among historians. Some questioned whether "atomic diplomacy" could have attained the same goals, and disputed whether the bombings or the Soviet declaration of war on Japan was decisive in ending the war. The Franck Report was the most notable effort to push for a demonstration of the bomb instead, but it was turned down by the Interim Committee's scientific panel.
The Szilárd petition, drafted in July 1945 and signed by dozens of scientists working on the Manhattan Project, was a late attempt at warning President Harry S. Truman about his responsibility in using such weapons.
After the war:
Seeing the work they had not understood produce the Hiroshima and Nagasaki bombs amazed the workers of the Manhattan Project as much as the rest of the world; newspapers in Oak Ridge announcing the Hiroshima bomb sold for $1 ($12 today).[252][250]
Although the bombs' existence was public, secrecy continued, and many workers remained ignorant of their jobs; one stated in 1946, "I don't know what the hell I'm doing besides looking into a ——— and turning a ——— alongside a ———. I don't know anything about it, and there's nothing to say".
Many residents continued to avoid discussion of "the stuff" in ordinary conversation despite it being the reason for their town's existence.
In anticipation of the bombings, Groves had Henry DeWolf Smyth prepare a history for public consumption. Atomic Energy for Military Purposes, better known as the "Smyth Report", was released to the public on 12 August 1945.
Groves and Nichols presented Army–Navy "E" Awards to key contractors, whose involvement had hitherto been secret. Over 20 awards of the Presidential Medal for Merit were made to key contractors and scientists, including Bush and Oppenheimer.
Military personnel received the Legion of Merit, including the commander of the Women's Army Corps detachment, Captain Arlene G. Scheidenhelm.
At Hanford, plutonium production fell off as Reactors B, D and F wore out, poisoned by fission products and swelling of the graphite moderator known as the Wigner effect. The swelling damaged the charging tubes where the uranium was irradiated to produce plutonium, rendering them unusable.
In order to maintain the supply of polonium for the urchin initiators, production was curtailed and the oldest unit, B pile, was closed down so at least one reactor would be available in the future.
Research continued, with DuPont and the Metallurgical Laboratory developing a redox solvent extraction process as an alternative plutonium extraction technique to the bismuth phosphate process, which left unspent uranium in a state from which it could not easily be recovered.
Bomb engineering was carried out by the Z Division, named for its director, Dr. Jerrold R. Zacharias from Los Alamos. Z Division was initially located at Wendover Field but moved to Oxnard Field, New Mexico, in September 1945 to be closer to Los Alamos. This marked the beginning of Sandia Base. Nearby Kirtland Field was used as a B-29 base for aircraft compatibility and drop tests.
By October, all the staff and facilities at Wendover had been transferred to Sandia. As reservist officers were demobilized, they were replaced by about fifty hand-picked regular officers.
Nichols recommended that S-50 and the Alpha tracks at Y-12 be closed down. This was done in September. Although performing better than ever, the Alpha tracks could not compete with K-25 and the new K-27, which had commenced operation in January 1946.
In December, the Y-12 plant was closed, thereby cutting the Tennessee Eastman payroll from 8,600 to 1,500 and saving $2 million a month.
Legacy:
See also: Nuclear weapons in popular culture
The political and cultural impacts of the development of nuclear weapons were profound and far-reaching. William Laurence of The New York Times, the first to use the phrase "Atomic Age", became the official correspondent for the Manhattan Project in spring 1945. In 1943 and 1944 he unsuccessfully attempted to persuade the Office of Censorship to permit writing about the explosive potential of uranium, and government officials felt that he had earned the right to report on the biggest secret of the war.
Laurence witnessed both the Trinity test and the bombing of Nagasaki and wrote the official press releases prepared for them. He went on to write a series of articles extolling the virtues of the new weapon. His reporting before and after the bombings helped to spur public awareness of the potential of nuclear technology and motivated its development in the United States and the Soviet Union.
The wartime Manhattan Project left a legacy in the form of the network of national laboratories:
- the Lawrence Berkeley National Laboratory,
- Los Alamos National Laboratory,
- Oak Ridge National Laboratory,
- Argonne National Laboratory,
- and Ames Laboratory.
Two more were established by Groves soon after the war:
- the Brookhaven National Laboratory at Upton, New York,
- and the Sandia National Laboratories at Albuquerque, New Mexico.
Groves allocated $72 million to them for research activities in fiscal year 1946–1947. They would be in the vanguard of the kind of large-scale research that Alvin Weinberg, the director of the Oak Ridge National Laboratory, would call Big Science.
The Naval Research Laboratory had long been interested in the prospect of using nuclear power for warship propulsion, and sought to create its own nuclear project. In May 1946, Nimitz, now Chief of Naval Operations, decided that the Navy should instead work with the Manhattan Project.
A group of naval officers were assigned to Oak Ridge, the most senior of whom was Captain Hyman G. Rickover, who became assistant director there. They immersed themselves in the study of nuclear energy, laying the foundations for a nuclear-powered navy.
A similar group of Air Force personnel arrived at Oak Ridge in September 1946 with the aim of developing nuclear aircraft. Their Nuclear Energy for the Propulsion of Aircraft (NEPA) project ran into formidable technical difficulties, and was ultimately cancelled.
The ability of the new reactors to create radioactive isotopes in previously unheard-of quantities sparked a revolution in nuclear medicine in the immediate postwar years. Starting in mid-1946, Oak Ridge began distributing radioisotopes to hospitals and universities. Most of the orders were for iodine-131 and phosphorus-32, which were used in the diagnosis and treatment of cancer.
In addition to medicine, isotopes were also used in biological, industrial and agricultural research.
On handing over control to the Atomic Energy Commission, Groves bid farewell to the people who had worked on the Manhattan Project: "Five years ago, the idea of Atomic Power was only a dream. You have made that dream a reality. You have seized upon the most nebulous of ideas and translated them into actualities. You have built cities where none were known before. You have constructed industrial plants of a magnitude and to a precision heretofore deemed impossible. You built the weapon which ended the War and thereby saved countless American lives. With regard to peacetime applications, you have raised the curtain on vistas of a new world."
In 2014, the United States Congress passed a law providing for a national park dedicated to the history of the Manhattan Project. The Manhattan Project National Historical Park was established on 10 November 2015.
See also:
- "The Atomic Bomb and the End of World War II, A Collection of Primary Sources". George Washington University. Retrieved 27 July 2011.
- "Atomic Heritage Foundation". Atomic Heritage Foundation. Retrieved 27 July 2011.
- "Voices of the Manhattan Project". Atomic Heritage Foundation. Retrieved 10 February 2015. Features hundreds of audio/visual interviews with Manhattan Project veterans.
- "History Center: Los Alamos National Laboratory". Los Alamos National Laboratory. Retrieved 27 July 2011.
- "ORNL: The first 50 Years: History of ORNL". ORNL Review. 25 (3). Archived from the original on 2 June 2016. Retrieved 13 October 2015.
- Manhattan Project and Allied Scientists Collections at the University of Chicago Special Collections Research Center
- Chadwick, Mark B. (3 December 2021). "Nuclear Science for the Manhattan Project and Comparison to Today's ENDF Data". Nuclear Technology. 207 (sup1): S24–S61
Nuclear Power as an Energy Source
- YouTube Video: How does a nuclear power plant work?
- YouTube Video: Inside a nuclear reactor core - Bang Goes The Theory - BBC
- YouTube Video: Tour of Nuclear Power plant
- NUCLEAR 101: How Does a Nuclear Reactor Work?* (Department of Energy)
* -- Continuation of article for above Illustration: "How does a Nuclear Power Plant Work?" (by DOE)
Nuclear reactors are the heart of a nuclear power plant.
They contain and control nuclear chain reactions that produce heat through a physical process called fission. That heat is used to make steam that spins a turbine to create electricity.
With more than 440 commercial reactors worldwide, including 92 in the United States, nuclear power continues to be one of the largest sources of reliable carbon-free electricity available.
Nuclear Fission Creates Heat:
The main job of a reactor is to house and control nuclear fission—a process where atoms split and release energy.
Reactors use uranium for nuclear fuel. The uranium is processed into small ceramic pellets and stacked together into sealed metal tubes called fuel rods. Typically, more than 200 of these rods are bundled together to form a fuel assembly. A reactor core is typically made up of a couple hundred assemblies, depending on power level.
Inside the reactor vessel, the fuel rods are immersed in water which acts as both a coolant and moderator. The moderator helps slow down the neutrons produced by fission to sustain the chain reaction.
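The practical effect of the moderator can be illustrated with a short back-of-the-envelope sketch. The collision counts below follow from the standard neutron-slowing formula with assumed fission and thermal energies; they are illustrative, not figures from the article:

```python
# Rough, illustrative estimate of why water is an effective neutron moderator:
# average number of elastic collisions needed to slow a 2 MeV fission neutron
# down to thermal energy (0.025 eV). Assumed textbook-style values.
import math

def log_energy_decrement(A):
    """Average logarithmic energy loss per elastic collision
    with a nucleus of mass number A (standard textbook formula)."""
    if A == 1:
        return 1.0
    return 1 + (A - 1) ** 2 / (2 * A) * math.log((A - 1) / (A + 1))

E_fission, E_thermal = 2.0e6, 0.025            # neutron energies in eV (assumed)
total_decrement = math.log(E_fission / E_thermal)

for name, A in [("hydrogen (light water)", 1), ("carbon (graphite)", 12)]:
    xi = log_energy_decrement(A)
    print(f"{name}: ~{total_decrement / xi:.0f} collisions to thermalize")
# Light hydrogen nuclei slow neutrons in roughly 18 collisions,
# versus on the order of 100 for graphite.
```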
Control rods can then be inserted into the reactor core to reduce the reaction rate or withdrawn to increase it.
The heat created by fission turns the water into steam, which spins a turbine to produce carbon-free electricity.
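To give a rough sense of scale, the sketch below estimates the fission rate and daily fuel consumption behind about one gigawatt of electrical output. The 200 MeV per fission and one-third thermal efficiency are assumed typical values, not figures from the DOE article:

```python
# Illustrative back-of-the-envelope numbers (assumed typical values): fission
# rate and U-235 consumption for a reactor producing ~1 GW of electricity at
# roughly 33% thermal efficiency.
MEV_TO_J = 1.602e-13
energy_per_fission = 200 * MEV_TO_J        # ~200 MeV per U-235 fission, in joules
thermal_power = 3.0e9                      # W of heat for ~1 GW(e) at ~33% efficiency

fissions_per_second = thermal_power / energy_per_fission
atoms_per_gram = 6.022e23 / 235            # Avogadro's number / molar mass of U-235
grams_per_day = fissions_per_second * 86400 / atoms_per_gram

print(f"fissions per second:      {fissions_per_second:.2e}")   # ~1e20
print(f"U-235 fissioned per day:  {grams_per_day / 1000:.1f} kg")  # ~3 kg/day
```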
Types of Light-water Reactors in the United States:
All commercial nuclear reactors in the United States are light-water reactors. This means they use normal water as both a coolant and neutron moderator.
There are two types of light-water reactors operating in America:
1) PRESSURIZED WATER REACTORS: see above illustration
More than 65% of the commercial reactors in the United States are pressurized-water reactors or PWRs. These reactors pump water into the reactor core under high pressure to prevent the water from boiling.
The water in the core is heated by nuclear fission and then pumped into tubes inside a heat exchanger. Those tubes heat a separate water source to create steam. The steam then turns an electric generator to produce electricity.
The core water cycles back to the reactor to be reheated and the process is repeated.
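As a rough illustration of why only part of the reactor's heat ends up as electricity, the sketch below compares an idealized Carnot limit for the steam cycle with the roughly one-third efficiency typical of real plants. The temperatures are assumed typical values, not figures from the article:

```python
# Illustrative only (assumed typical temperatures): ideal Carnot limit for a
# steam cycle fed by a PWR's secondary loop, versus real-plant efficiency.
T_steam = 285 + 273.15      # K, assumed secondary-side steam temperature
T_condenser = 30 + 273.15   # K, assumed condenser temperature

carnot_limit = 1 - T_condenser / T_steam
print(f"Carnot limit: {carnot_limit:.0%}")   # roughly 45-46%
# Real PWRs convert roughly a third of the reactor's heat into electricity;
# the remainder is rejected to the environment as waste heat.
```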
2) BOILING WATER REACTORS:
Roughly a third of the reactors operating in the United States are boiling water reactors (BWRs).
BWRs heat water and produce steam directly inside the reactor vessel. Water is pumped up through the reactor core and heated by fission. Pipes then feed the steam directly to a turbine to produce electricity.
The unused steam is then condensed back to water and reused in the heating process.
[End of DOE Article]
___________________________________________________________________________
Nuclear Power as an Energy Source (Wikipedia)
Nuclear power is the use of nuclear reactions to produce electricity. Nuclear power can be obtained from nuclear fission, nuclear decay and nuclear fusion reactions. Presently, the vast majority of electricity from nuclear power is produced by nuclear fission of uranium and plutonium in nuclear power plants.
Nuclear decay processes are used in niche applications such as radioisotope thermoelectric generators in some space probes such as Voyager 2. Generating electricity from fusion power remains the focus of international research.
Most nuclear power plants use thermal reactors with enriched uranium in a once-through fuel cycle. Fuel is removed when the percentage of neutron absorbing atoms becomes so large that a chain reaction can no longer be sustained, typically three years. It is then cooled for several years in on-site spent fuel pools before being transferred to long term storage.
The spent fuel, though low in volume, is high-level radioactive waste. While its radioactivity decreases exponentially it must be isolated from the biosphere for hundreds of thousands of years, though newer technologies (like fast reactors) have the potential to reduce this significantly.
Because the spent fuel is still mostly fissionable material, some countries (e.g. France and Russia) reprocess their spent fuel by extracting fissile and fertile elements for fabrication in new fuel, although this process is more expensive than producing new fuel from mined uranium.
All reactors breed some plutonium-239, which is found in the spent fuel, and because Pu-239 is the preferred material for nuclear weapons, reprocessing is seen as a weapon proliferation risk.
The first nuclear power plant was built in the 1950s. The global installed nuclear capacity grew to 100 GW in the late 1970s, and then expanded rapidly during the 1980s, reaching 300 GW by 1990.
The 1979 Three Mile Island accident in the United States and the 1986 Chernobyl disaster in the Soviet Union resulted in increased regulation and public opposition to nuclear plants.
These factors, along with high cost of construction, resulted in the global installed capacity only increasing to 390 GW by 2022. These plants supplied 2,586 terawatt hours (TWh) of electricity in 2019, equivalent to about 10% of global electricity generation, and were the second-largest low-carbon power source after hydroelectricity.
As of September 2022, there are 437 civilian fission reactors in the world, with overall capacity of 393 GW, 57 under construction and 102 planned, with a combined capacity of 62 GW and 96 GW, respectively.
The United States has the largest fleet of nuclear reactors, generating over 800 TWh of zero-emissions electricity per year with an average capacity factor of 92%. Average global capacity factor is 89%. Most new reactors under construction are generation III reactors in Asia.
Nuclear power generation causes one of the lowest levels of fatalities per unit of energy generated compared to other energy sources. Coal, petroleum, natural gas and hydroelectricity each have caused more fatalities per unit of energy due to air pollution and accidents.
Nuclear power plants emit no greenhouse gases. One of the dangers of nuclear power is the potential for accidents like the Fukushima nuclear disaster in Japan in 2011.
There is a debate about nuclear power. Proponents contend that nuclear power is a safe, sustainable energy source that reduces carbon emissions. The anti-nuclear movement contends that nuclear power poses many threats to people and the environment and is too expensive and slow to deploy when compared to alternative sustainable energy sources.
History:
Main article: History of nuclear power
Origins:
The discovery of nuclear fission occurred in 1938 following over four decades of work on the science of radioactivity and the elaboration of new nuclear physics that described the components of atoms. Soon after the discovery of the fission process, it was realized that a fissioning nucleus can induce further nucleus fissions, thus inducing a self-sustaining chain reaction.
Once this was experimentally confirmed in 1939, scientists in many countries petitioned their governments for support of nuclear fission research, just on the cusp of World War II, for the development of a nuclear weapon.
In the United States, these research efforts led to the creation of the first man-made nuclear reactor, the Chicago Pile-1, which achieved criticality on December 2, 1942. The reactor's development was part of the Manhattan Project, the Allied effort to create atomic bombs during World War II.
It led to the building of larger single-purpose production reactors for the production of weapons-grade plutonium for use in the first nuclear weapons. The United States tested the first nuclear weapon in July 1945, the Trinity test, with the atomic bombings of Hiroshima and Nagasaki taking place one month later.
Despite the military nature of the first nuclear devices, the 1940s and 1950s were characterized by strong optimism for the potential of nuclear power to provide cheap and endless energy. Electricity was generated for the first time by a nuclear reactor on December 20, 1951, at the EBR-I experimental station near Arco, Idaho, which initially produced about 100 kW.
In 1953, American President Dwight Eisenhower gave his "Atoms for Peace" speech at the United Nations, emphasizing the need to develop "peaceful" uses of nuclear power quickly. This was followed by the Atomic Energy Act of 1954 which allowed rapid declassification of U.S. reactor technology and encouraged development by the private sector.
First power generation:
The first organization to develop practical nuclear power was the U.S. Navy, with the S1W reactor for the purpose of propelling submarines and aircraft carriers. The first nuclear-powered submarine, USS Nautilus, was put to sea in January 1954. The S1W reactor was a pressurized water reactor.
This design was chosen because it was simpler, more compact, and easier to operate compared to alternative designs, thus more suitable to be used in submarines. This decision would result in the PWR being the reactor of choice also for power generation, thus having a lasting impact on the civilian electricity market in the years to come.
On June 27, 1954, the Obninsk Nuclear Power Plant in the USSR became the world's first nuclear power plant to generate electricity for a power grid, producing around 5 megawatts of electric power. The world's first commercial nuclear power station, Calder Hall at Windscale, England was connected to the national power grid on 27 August 1956. In common with a number of other generation I reactors, the plant had the dual purpose of producing electricity and plutonium-239, the latter for the nascent nuclear weapons program in Britain.
Expansion and first opposition:
The total global installed nuclear capacity initially rose relatively quickly, rising from less than 1 gigawatt (GW) in 1960 to 100 GW in the late 1970s. During the 1970s and 1980s rising economic costs (related to extended construction times largely due to regulatory changes and pressure-group litigation) and falling fossil fuel prices made nuclear power plants then under construction less attractive.
In the 1980s in the U.S. and 1990s in Europe, the flat electric grid growth and electricity liberalization also made the addition of large new baseload energy generators economically unattractive.
The 1973 oil crisis had a significant effect on countries such as France and Japan, which had relied heavily on oil for electricity generation: both invested in nuclear power. France would construct 25 nuclear power plants over the next 15 years, and as of 2019, 71% of French electricity was generated by nuclear power, the highest percentage of any nation in the world.
Some local opposition to nuclear power emerged in the United States in the early 1960s. In the late 1960s some members of the scientific community began to express pointed concerns.
These anti-nuclear concerns related to:
In the early 1970s, there were large protests about a proposed nuclear power plant in Wyhl, Germany. The project was cancelled in 1975. The anti-nuclear success at Wyhl inspired opposition to nuclear power in other parts of Europe and North America.
By the mid-1970s anti-nuclear activism gained a wider appeal and influence, and nuclear power began to become an issue of major public protest. In some countries, the nuclear power conflict "reached an intensity unprecedented in the history of technology controversies".
The increased public hostility to nuclear power led to a longer license procurement process, more regulations, and increased requirements for safety equipment, which made new construction much more expensive. In the United States, more than 120 proposals for light-water reactors were ultimately cancelled and the construction of new reactors ground to a halt.
The 1979 accident at Three Mile Island, which caused no fatalities, played a major part in the reduction in the number of new plant constructions in many countries.
Chernobyl and renaissance:
During the 1980s one new nuclear reactor started up every 17 days on average. By the end of the decade, global installed nuclear capacity reached 300 GW. Since the late 1980s, new capacity additions slowed down significantly, with the installed nuclear capacity reaching 366 GW in 2005.
The 1986 Chernobyl disaster in the USSR, involving an RBMK reactor, altered the development of nuclear power and led to a greater focus on meeting international safety and regulatory standards. It is considered the worst nuclear disaster in history, both in total casualties, with 56 direct deaths, and financially, with the cleanup cost estimated at 18 billion roubles (US$68 billion in 2019, adjusted for inflation).
The international organization to promote safety awareness and the professional development of operators in nuclear facilities, the World Association of Nuclear Operators (WANO), was created as a direct outcome of the 1986 Chernobyl accident. The Chernobyl disaster played a major part in the reduction in the number of new plant constructions in the following years.
Influenced by these events, Italy voted against nuclear power in a 1987 referendum, becoming the first country to completely phase out nuclear power in 1990.
In the early 2000s, nuclear energy was expected to undergo a nuclear renaissance, an increase in the construction of new reactors, due to concerns about carbon dioxide emissions. During this period, newer generation III reactors, such as the EPR, began construction.
Fukushima:
Prospects of a nuclear renaissance were delayed by another nuclear accident. The 2011 Fukushima Daiichi nuclear accident was caused by a large tsunami triggered by the Tōhoku earthquake, one of the largest earthquakes ever recorded. The Fukushima Daiichi Nuclear Power Plant suffered three core meltdowns due to failure of the emergency cooling system for lack of electricity supply. This resulted in the most serious nuclear accident since the Chernobyl disaster.
The accident prompted a re-examination of nuclear safety and nuclear energy policy in many countries. Germany approved plans to close all its reactors by 2022, and many other countries reviewed their nuclear power programs.
Following the disaster, Japan shut down all of its nuclear power reactors, some of them permanently, and in 2015 began a gradual process to restart the remaining 40 reactors, following safety checks and based on revised criteria for operations and public approval.
In 2022, the Japanese government, under the leadership of Prime Minister Fumio Kishida, declared that ten more nuclear power plants idled since the 2011 disaster should be reopened. Kishida is also pushing for research on, and construction of, new and safer nuclear plants to shield Japanese consumers from the fluctuating fossil fuel market and to reduce Japan's greenhouse gas emissions. He intends for Japan to become a significant exporter of nuclear energy and technology to developing countries around the world.
Current prospects:
By 2015, the IAEA's outlook for nuclear energy had become more promising, recognizing the importance of low-carbon generation for mitigating climate change. As of 2015, the global trend was for new nuclear power stations coming online to be balanced by the number of old plants being retired.
In 2016, the U.S. Energy Information Administration projected for its "base case" that world nuclear power generation would increase from 2,344 terawatt hours (TWh) in 2012 to 4,500 TWh in 2040. Most of the predicted increase was expected to be in Asia. As of 2018, there are over 150 nuclear reactors planned including 50 under construction.
In January 2019, China had 45 reactors in operation, 13 under construction, and plans to build 43 more, which would make it the world's largest generator of nuclear electricity.
As of 2021, 17 reactors were reported to be under construction. China has built significantly fewer reactors than originally planned; its share of electricity from nuclear power was 5% in 2019, and observers have cautioned that, along with the risks, the changing economics of energy generation may cause new nuclear plants to "no longer make sense in a world that is leaning toward cheaper, more reliable renewable energy".
In October 2021, the Japanese cabinet approved the new Plan for Electricity Generation to 2030 prepared by the Agency for Natural Resources and Energy (ANRE) and an advisory committee, following public consultation. The nuclear target for 2030 requires the restart of another ten reactors.
Prime Minister Fumio Kishida in July 2022 announced that the country should consider building advanced reactors and extending operating licences beyond 60 years.
As of 2022, with world oil and gas prices rising and Germany restarting coal plants to cope with the loss of the Russian gas needed to supplement its Energiewende, many other countries have announced ambitious plans to reinvigorate ageing nuclear generating capacity with new investments.
French President Emmanuel Macron announced his intention to build six new reactors in coming decades, placing nuclear at the heart of France's drive for carbon neutrality by 2050.
Meanwhile in the United States, the Department of Energy, in collaboration with commercial entities, TerraPower and X-energy, is planning on building two different advanced nuclear reactors by 2027, with further plans for nuclear implementation in its long term green energy and energy security goals.
Power plants:
Main articles:
Nuclear power plants are thermal power stations that generate electricity by harnessing the thermal energy released from nuclear fission. A fission nuclear power plant is generally composed of a nuclear reactor, in which the heat-generating nuclear reactions take place; a cooling system, which removes the heat from the reactor; a steam turbine, which transforms the heat into mechanical energy; and an electric generator, which transforms the mechanical energy into electricity.
When a neutron hits the nucleus of a uranium-235 or plutonium atom, it can split the nucleus into two smaller nuclei, which is a nuclear fission reaction. The reaction releases energy and neutrons. The released neutrons can hit other uranium or plutonium nuclei, causing new fission reactions, which release more energy and more neutrons.
This is called a chain reaction. In most commercial reactors, the reaction rate is contained by control rods that absorb excess neutrons. The controllability of nuclear reactors depends on the fact that a small fraction of neutrons resulting from fission are delayed. The time delay between the fission and the release of the neutrons slows down changes in reaction rates and gives time for moving the control rods to adjust the reaction rate.
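The stabilizing role of delayed neutrons can be sketched with a minimal one-group point-kinetics calculation. The parameter values below are assumed textbook-style figures, not data from the article:

```python
# Minimal, illustrative point-kinetics sketch (one delayed-neutron group,
# assumed parameters) showing why the small delayed-neutron fraction makes a
# reactor controllable: a small reactivity insertion that would cause runaway
# growth on prompt neutrons alone produces only a slow power rise when the
# delayed neutrons are included.
import math

beta = 0.0065      # delayed-neutron fraction for U-235 fuel (assumed typical)
Lambda = 1.0e-4    # prompt-neutron generation time, seconds (assumed typical)
lam = 0.08         # effective one-group precursor decay constant, 1/s (assumed)
rho = 0.001        # small positive reactivity step, well below prompt critical

n, C = 1.0, beta / (Lambda * lam)   # start at equilibrium with unit power
dt, t_end, t = 1e-3, 30.0, 0.0
while t < t_end:
    dn = ((rho - beta) / Lambda * n + lam * C) * dt
    dC = (beta / Lambda * n - lam * C) * dt
    n, C, t = n + dn, C + dC, t + dt

print(f"relative power after {t_end:.0f} s with delayed neutrons: {n:.2f}")
# With prompt neutrons only, the same step would grow as exp(rho*t/Lambda),
# i.e. by a factor of e every 0.1 s -- far too fast for control rods to follow.
print(f"prompt-only growth factor in just 1 s: {math.exp(rho * 1.0 / Lambda):.1e}")
```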
Fuel cycle:
Main articles: Nuclear fuel cycle and Integrated Nuclear Fuel Cycle Information System
The life cycle of nuclear fuel starts with uranium mining. The uranium ore is then converted into a compact ore concentrate form, known as yellowcake (U3O8), to facilitate transport. Fission reactors generally need uranium-235, a fissile isotope of uranium.
The concentration of uranium-235 in natural uranium is very low (about 0.7%). Some reactors can use this natural uranium as fuel, depending on their neutron economy. These reactors generally have graphite or heavy water moderators.
For light water reactors, the most common type of reactor, this concentration is too low, and it must be increased by a process called uranium enrichment. In civilian light water reactors, uranium is typically enriched to 3.5–5% uranium-235.
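As an illustration of what enrichment involves in practice, the sketch below applies the standard separative-work formulas to assumed stream assays; the specific percentages and the one-kilogram batch are illustrative, not figures from the article:

```python
# Illustrative enrichment arithmetic (standard separative-work formulas with
# assumed assays): natural uranium feed and separative work (SWU) needed to
# produce 1 kg of 4.5%-enriched reactor fuel.
import math

def value(x):
    """Standard 'value function' used in separative-work calculations."""
    return (1 - 2 * x) * math.log((1 - x) / x)

x_product = 0.045   # 4.5% U-235, typical LWR fuel (assumed)
x_feed = 0.00711    # natural uranium
x_tails = 0.0025    # enrichment-plant tails assay (assumed)

product = 1.0                                              # kg of enriched product
feed = product * (x_product - x_tails) / (x_feed - x_tails)
tails = feed - product
swu = (product * value(x_product) + tails * value(x_tails)
       - feed * value(x_feed))

print(f"natural uranium feed: {feed:.1f} kg")   # roughly 9 kg
print(f"separative work:      {swu:.1f} SWU")   # roughly 7 SWU
```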
The uranium is then generally converted into uranium oxide (UO2), a ceramic, that is then compressively sintered into fuel pellets, a stack of which forms fuel rods of the proper composition and geometry for the particular reactor.
After some time in the reactor, the fuel will have reduced fissile material and increased fission products, until its use becomes impractical. At this point, the spent fuel will be moved to a spent fuel pool which provides cooling for the thermal heat and shielding for ionizing radiation.
After several months or years, the spent fuel is radioactively and thermally cool enough to be moved to dry storage casks or reprocessed.
Uranium resources:
Main articles:
Uranium is a fairly common element in the Earth's crust: it is approximately as common as tin or germanium, and is about 40 times more common than silver. Uranium is present in trace concentrations in most rocks, dirt, and ocean water, but is generally economically extracted only where it is present in high concentrations.
Uranium mining can be underground, open-pit, or in-situ leach mining. An increasing number of the highest output mines are remote underground operations, such as McArthur River uranium mine, in Canada, which by itself accounts for 13% of global production. As of 2011 the world's known resources of uranium, economically recoverable at the arbitrary price ceiling of US$130/kg, were enough to last for between 70 and 100 years.
In 2007, the OECD estimated 670 years of economically recoverable uranium in total conventional resources and phosphate ores assuming the then-current use rate.
Light water reactors make relatively inefficient use of nuclear fuel, mostly using only the very rare uranium-235 isotope. Nuclear reprocessing can make this waste reusable, and newer reactors also achieve a more efficient use of the available resources than older ones.
With a pure fast reactor fuel cycle with a burn up of all the uranium and actinides (which presently make up the most hazardous substances in nuclear waste), there is an estimated 160,000 years worth of uranium in total conventional resources and phosphate ore at the price of 60–100 US$/kg.
However, reprocessing is expensive, possibly dangerous, and can be used to manufacture nuclear weapons. One analysis found that uranium prices could increase by two orders of magnitude between 2035 and 2100 and that there could be a shortage near the end of the century.
A 2017 study by researchers from MIT and WHOI found that "at the current consumption rate, global conventional reserves of terrestrial uranium (approximately 7.6 million tonnes) could be depleted in a little over a century". Limited uranium-235 supply may inhibit substantial expansion with the current nuclear technology.
While various ways to reduce dependence on such resources are being explored, new nuclear technologies are not expected to be available in time for climate change mitigation or to compete with renewable alternatives, in addition to being more expensive and requiring costly research and development.
A study found it uncertain whether identified resources can be developed quickly enough to provide an uninterrupted fuel supply to expanded nuclear facilities, and noted that various forms of mining may be challenged by ecological barriers, costs, and land requirements.
Researchers also report considerable import dependence of nuclear energy. Unconventional uranium resources also exist. Uranium is naturally present in seawater at a concentration of about 3 micrograms per liter, with 4.4 billion tons of uranium considered present in seawater at any time.
In 2014 it was suggested that it would be economically competitive to produce nuclear fuel from seawater if the process was implemented at large scale. Like fossil fuels, over geological timescales, uranium extracted on an industrial scale from seawater would be replenished by both river erosion of rocks and the natural process of uranium dissolved from the surface area of the ocean floor, both of which maintain the solubility equilibria of seawater concentration at a stable level.
Some commentators have argued that this strengthens the case for nuclear power to be considered a renewable energy.
Waste:
Main article: Nuclear waste
The normal operation of nuclear power plants and facilities produces radioactive waste, or nuclear waste. This type of waste is also produced during plant decommissioning. There are two broad categories of nuclear waste: low-level waste and high-level waste.
The first has low radioactivity and includes contaminated items such as clothing, which pose a limited threat. High-level waste is mainly the spent fuel from nuclear reactors, which is very radioactive and must be cooled and then safely disposed of or reprocessed.
High-level waste:
Main articles: High-level waste and Spent nuclear fuel
The most important waste stream from nuclear power reactors is spent nuclear fuel, which is considered high-level waste. For LWRs, spent fuel is typically composed of 95% uranium, 4% fission products, and about 1% transuranic actinides (mostly plutonium, neptunium and americium).
The fission products are responsible for the bulk of the short-term radioactivity, whereas the plutonium and other transuranics are responsible for the bulk of the long-term radioactivity.
High-level waste (HLW) must be stored isolated from the biosphere with sufficient shielding so as to limit radiation exposure. After being removed from the reactors, used fuel bundles are stored for six to ten years in spent fuel pools, which provide cooling and shielding against radiation.
After that, the fuel is cool enough that it can be safely transferred to dry cask storage. The radioactivity decreases exponentially with time, such that it will have decreased by 99.5% after 100 years. The more intensely radioactive short-lived fission products (SLFPs) decay into stable elements in approximately 300 years, and after about 100,000 years, the spent fuel becomes less radioactive than natural uranium ore.
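The steep early decline can be illustrated with a simple decay calculation for two representative medium-lived fission products. The isotopes and half-lives below are assumed for illustration and are not taken from the article:

```python
# Minimal decay-curve sketch (assumed isotopes and half-lives) showing why
# spent-fuel radioactivity falls off steeply over the first few centuries:
# the dominant medium-lived fission products, caesium-137 and strontium-90,
# both have half-lives of roughly 30 years.
import math

half_lives = {"Cs-137": 30.1, "Sr-90": 28.8}   # years (approximate)

for years in (10, 100, 300):
    remaining = {
        iso: math.exp(-math.log(2) * years / t_half)
        for iso, t_half in half_lives.items()
    }
    fractions = ", ".join(f"{iso}: {frac:.1%}" for iso, frac in remaining.items())
    print(f"after {years:>3} years -> {fractions}")
# After ~300 years (about ten half-lives) only about 0.1% of these fission
# products remains; the residual hazard is then dominated by the much
# longer-lived transuranic actinides.
```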
Commonly suggested methods to isolate long-lived fission product (LLFP) waste from the biosphere include separation and transmutation, synroc treatments, and deep geological storage.
Thermal-neutron reactors, which presently constitute the majority of the world fleet, cannot burn up the reactor grade plutonium that is generated during the reactor operation. This limits the life of nuclear fuel to a few years.
In some countries, such as the United States, spent fuel is classified in its entirety as a nuclear waste. In other countries, such as France, it is largely reprocessed to produce a partially recycled fuel, known as mixed oxide fuel or MOX. For spent fuel that does not undergo reprocessing, the most concerning isotopes are the medium-lived transuranic elements, which are led by reactor-grade plutonium (half-life 24,000 years).
Some proposed reactor designs, such as the integral fast reactor and molten salt reactors, can use as fuel the plutonium and other actinides in spent fuel from light water reactors, thanks to their fast fission spectrum. This offers a potentially more attractive alternative to deep geological disposal.
The thorium fuel cycle results in similar fission products, though creates a much smaller proportion of transuranic elements from neutron capture events within a reactor. Spent thorium fuel, although more difficult to handle than spent uranium fuel, may present somewhat lower proliferation risks.
Low-level waste:
Main article: Low-level waste
The nuclear industry also produces a large volume of low-level waste, with low radioactivity, in the form of contaminated items like clothing, hand tools, water purifier resins, and (upon decommissioning) the materials of which the reactor itself is built. Low-level waste can be stored on-site until radiation levels are low enough to be disposed of as ordinary waste, or it can be sent to a low-level waste disposal site.
Waste relative to other types:
See also: Radioactive waste § Naturally occurring radioactive material
In countries with nuclear power, radioactive wastes account for less than 1% of total industrial toxic wastes, much of which remains hazardous for long periods. Overall, nuclear power produces far less waste material by volume than fossil-fuel based power plants.
Coal-burning plants, in particular, produce large amounts of toxic and mildly radioactive ash resulting from the concentration of naturally occurring radioactive materials in coal. A 2008 report from Oak Ridge National Laboratory concluded that coal power actually results in more radioactivity being released into the environment than nuclear power operation, and that the population effective dose equivalent from radiation from coal plants is 100 times that from the operation of nuclear plants.
Although coal ash is much less radioactive than spent nuclear fuel by weight, coal ash is produced in much higher quantities per unit of energy generated. It is also released directly into the environment as fly ash, whereas nuclear plants use shielding to protect the environment from radioactive materials.
Nuclear waste volume is small compared to the energy produced. For example, at Yankee Rowe Nuclear Power Station, which generated 44 billion kilowatt hours of electricity when in service, its complete spent fuel inventory is contained within sixteen casks. It is estimated that to produce a lifetime supply of energy for a person at a western standard of living (approximately 3 GWh) would require on the order of the volume of a soda can of low enriched uranium, resulting in a similar volume of spent fuel generated.
Waste disposal:
See also: List of radioactive waste treatment technologies
Following interim storage in a spent fuel pool, the bundles of used fuel rod assemblies of a typical nuclear power station are often stored on site in dry cask storage vessels. Presently, waste is mainly stored at individual reactor sites and there are over 430 locations around the world where radioactive material continues to accumulate.
Disposal of nuclear waste is often considered the most politically divisive aspect of the lifecycle of a nuclear power facility. The absence of significant migration of nuclear waste at the 2-billion-year-old natural nuclear fission reactors in Oklo, Gabon, is cited as "a source of essential information today."
Experts suggest that centralized underground repositories which are well-managed, guarded, and monitored, would be a vast improvement. There is an "international consensus on the advisability of storing nuclear waste in deep geological repositories". With the advent of new technologies, other methods including horizontal drillhole disposal into geologically inactive areas have been proposed.
There are no commercial scale purpose built underground high-level waste repositories in operation. However, in Finland the Onkalo spent nuclear fuel repository of the Olkiluoto Nuclear Power Plant is under construction as of 2015.
Reprocessing:
Main articles:
Most thermal-neutron reactors run on a once-through nuclear fuel cycle, mainly due to the low price of fresh uranium. However, many reactors are also fueled with recycled fissionable materials that remain in spent nuclear fuel. The most common fissionable material that is recycled is the reactor-grade plutonium (RGPu) extracted from spent fuel; it is mixed with uranium oxide and fabricated into mixed-oxide (MOX) fuel.
Because thermal LWRs remain the most common reactor worldwide, this type of recycling is the most common. It is considered to increase the sustainability of the nuclear fuel cycle, reduce the attractiveness of spent fuel to theft, and lower the volume of high level nuclear waste.
Spent MOX fuel cannot generally be recycled for use in thermal-neutron reactors. This issue does not affect fast-neutron reactors, which are therefore preferred in order to achieve the full energy potential of the original uranium.
The main constituent of spent fuel from LWRs is slightly enriched uranium. This can be recycled into reprocessed uranium (RepU), which can be used in a fast reactor, used directly as fuel in CANDU reactors, or re-enriched for another cycle through an LWR. Re-enriching of reprocessed uranium is common in France and Russia.
Reprocessed uranium is also safer in terms of nuclear proliferation potential. Reprocessing has the potential to recover up to 95% of the uranium and plutonium fuel in spent nuclear fuel, as well as reduce long-term radioactivity within the remaining waste.
However, reprocessing has been politically controversial because of the potential for nuclear proliferation and varied perceptions of increasing the vulnerability to nuclear terrorism.
Reprocessing also leads to a higher fuel cost compared to the once-through fuel cycle. While reprocessing reduces the volume of high-level waste, it does not reduce the fission products that are the primary causes of residual heat generation and radioactivity for the first few centuries outside the reactor. Thus, reprocessed waste still requires an almost identical treatment for the first few hundred years.
Reprocessing of civilian fuel from power reactors is currently done in France, the United Kingdom, Russia, Japan, and India. In the United States, spent nuclear fuel is currently not reprocessed.
The La Hague reprocessing facility in France has operated commercially since 1976 and is responsible for half the world's reprocessing as of 2010. It produces MOX fuel from spent fuel derived from several countries. More than 32,000 tonnes of spent fuel had been reprocessed as of 2015, with the majority from France, 17% from Germany, and 9% from Japan.
Breeding:
Main articles: Breeder reactor and Nuclear power proposed as renewable energy
Breeding is the process of converting non-fissile material into fissile material that can be used as nuclear fuel. The non-fissile material that can be used for this process is called fertile material, and it constitutes the vast majority of current nuclear waste. This breeding process occurs naturally in breeder reactors.
As opposed to light water thermal-neutron reactors, which use uranium-235 (0.7% of all natural uranium), fast-neutron breeder reactors use uranium-238 (99.3% of all natural uranium) or thorium. A number of fuel cycles and breeder reactor combinations are considered to be sustainable or renewable sources of energy.
In 2006 it was estimated that with seawater extraction, there was likely five billion years' worth of uranium resources for use in breeder reactors.
Breeder technology has been used in several reactors, but as of 2006, the high cost of safely reprocessing fuel means that uranium prices of more than US$200/kg would be needed before breeding becomes economically justified.
Breeder reactors are, however, being developed for their potential to burn up all of the actinides (the most active and dangerous components) in the present inventory of nuclear waste, while also producing power and creating additional quantities of fuel for more reactors via the breeding process.
As of 2017, there are two breeders producing commercial power, BN-600 reactor and the BN-800 reactor, both in Russia. The Phénix breeder reactor in France was powered down in 2009 after 36 years of operation. Both China and India are building breeder reactors. The Indian 500 MWe Prototype Fast Breeder Reactor is in the commissioning phase, with plans to build more.
An alternative to fast-neutron breeders is the thermal-neutron breeder reactor, which uses uranium-233 bred from thorium as fission fuel in the thorium fuel cycle. Thorium is about 3.5 times more common than uranium in the Earth's crust, and has different geographic characteristics.
India's three-stage nuclear power programme features the use of a thorium fuel cycle in the third stage, as it has abundant thorium reserves but little uranium.
Decommissioning:
Main article: Nuclear decommissioning
Nuclear decommissioning is the process of dismantling a nuclear facility to the point that it no longer requires measures for radiation protection, returning the facility and its parts to a safe enough level to be entrusted for other uses.
Due to the presence of radioactive materials, nuclear decommissioning presents technical and economic challenges. The costs of decommissioning are generally spread over the lifetime of a facility and saved in a decommissioning fund.
Production:
Further information: Nuclear power by country and List of nuclear reactors
Civilian nuclear power supplied 2,586 terawatt hours (TWh) of electricity in 2019, equivalent to about 10% of global electricity generation, and was the second largest low-carbon power source after hydroelectricity.
Since electricity accounts for about 25% of world energy consumption, nuclear power's contribution to global energy was about 2.5% in 2011. This is a little more than the combined global electricity production from wind, solar, biomass and geothermal power, which together provided 2% of global final energy consumption in 2014.
Nuclear power's share of global electricity production has fallen from 16.5% in 1997, in large part because the economics of nuclear power have become more difficult.
As of March 2022, there are 439 civilian fission reactors in the world, with a combined electrical capacity of 392 gigawatts (GW). There are also 56 nuclear power reactors under construction and 96 reactors planned, with a combined capacity of 62 GW and 96 GW, respectively.
The United States has the largest fleet of nuclear reactors, generating over 800 TWh per year with an average capacity factor of 92%. Most reactors under construction are generation III reactors in Asia.
Regional differences in the use of nuclear power are large:
The United States produces the most nuclear energy in the world, with nuclear power providing 20% of the electricity it consumes, while France produces the highest percentage of its electrical energy from nuclear reactors—71% in 2019.
In the European Union, nuclear power provides 26% of the electricity as of 2018.
Nuclear power is the single largest low-carbon electricity source in the United States, and accounts for two-thirds of the European Union's low-carbon electricity. Nuclear energy policy differs among European Union countries, and some, such as Austria, Estonia, Ireland and Italy, have no active nuclear power stations.
In addition, there were approximately 140 naval vessels using nuclear propulsion in operation, powered by about 180 reactors. These include military and some civilian ships, such as nuclear-powered icebreakers.
International research is continuing into additional uses of process heat such as hydrogen production (in support of a hydrogen economy), for desalinating sea water, and for use in district heating systems.
Economics:
Main articles:
The economics of new nuclear power plants is a controversial subject and multi-billion-dollar investments depend on the choice of energy sources. Nuclear power plants typically have high capital costs for building the plant. For this reason, comparison with other power generation methods is strongly dependent on assumptions about construction timescales and capital financing for nuclear plants.
Fuel costs account for about 30 percent of operating costs, while fuel prices are subject to market fluctuations.
The high cost of construction is one of the biggest challenges for nuclear power plants. A new 1,100 MW plant is estimated to cost between $6 billion and $9 billion. Nuclear power cost trends show large disparity by nation, design, build rate and the establishment of familiarity in expertise.
The only two nations for which data is available that saw cost decreases in the 2000s were India and South Korea.
Analysis of the economics of nuclear power must also take into account who bears the risks of future uncertainties. As of 2010, all operating nuclear power plants have been developed by state-owned or regulated electric utility monopolies.
Many countries have since liberalized the electricity market where these risks, and the risk of cheaper competitors emerging before capital costs are recovered, are borne by plant suppliers and operators rather than consumers, which leads to a significantly different evaluation of the economics of new nuclear power plants.
The levelized cost of electricity (LCOE) from a new nuclear power plant is estimated to be 69 USD/MWh, according to an analysis by the International Energy Agency and the OECD Nuclear Energy Agency. This represents the median cost estimate for an nth-of-a-kind nuclear power plant to be completed in 2025, at a discount rate of 7%.
Nuclear power was found to be the least-cost option among dispatchable technologies. Variable renewables can generate cheaper electricity: the median cost of onshore wind power was estimated to be 50 USD/MWh, and utility-scale solar power 56 USD/MWh.
At the assumed CO2 emission cost of 30 USD/ton, power from coal (88 USD/MWh) and gas (71 USD/MWh) is more expensive than low-carbon technologies. Electricity from long-term operation of nuclear power plants by lifetime extension was found to be the least-cost option, at 32 USD/MWh.
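To make the LCOE comparison concrete, the sketch below (with illustrative numbers, not the IEA/NEA inputs) shows the basic calculation: discount each year's costs and generation back to the present and divide. Under these rough assumptions the result lands near US$100/MWh, which mainly illustrates how strongly the figure depends on the assumed capital cost and discount rate.

```python
# Minimal LCOE sketch: levelized cost = discounted lifetime costs / discounted
# lifetime generation. All inputs below are illustrative assumptions.
def lcoe(capital_cost, annual_om, annual_fuel, annual_mwh, lifetime_years, discount_rate):
    """Return a levelized cost of electricity in $/MWh for a single plant."""
    disc_costs = capital_cost          # capital treated as spent at year 0, for simplicity
    disc_mwh = 0.0
    for year in range(1, lifetime_years + 1):
        df = 1.0 / (1.0 + discount_rate) ** year
        disc_costs += (annual_om + annual_fuel) * df
        disc_mwh += annual_mwh * df
    return disc_costs / disc_mwh

# Hypothetical 1,100 MW plant, 90% capacity factor, 60-year life, 7% discount rate.
annual_mwh = 1_100 * 8_760 * 0.90                  # roughly 8.7 million MWh per year
print(round(lcoe(capital_cost=7.5e9,               # mid-range of the $6-9 billion estimate
                 annual_om=250e6,                  # assumed fixed O&M, $ per year
                 annual_fuel=80e6,                 # assumed fuel cost, $ per year
                 annual_mwh=annual_mwh,
                 lifetime_years=60,
                 discount_rate=0.07), 1), "$/MWh")
```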
Measures to mitigate global warming, such as a carbon tax or carbon emissions trading, may favor the economics of nuclear power. Extreme weather events, including events made more severe by climate change, are decreasing all energy source reliability including nuclear energy by a small degree, depending on location siting.
New small modular reactors, such as those developed by NuScale Power, are aimed at reducing the investment costs for new construction by making the reactors smaller and modular, so that they can be built in a factory.
Certain designs showed considerable early economic advantages, such as the CANDU, which realized much higher capacity factors and reliability than generation II light water reactors up to the 1990s.
Nuclear power plants, though capable of some grid-load following, are typically run as much as possible to keep the cost of the generated electrical energy as low as possible, supplying mostly base-load electricity. Due to the on-line refueling reactor design, PHWRs (of which the CANDU design is a part) continue to hold many world record positions for longest continual electricity generation, often over 800 days. The specific record as of 2019 is held by a PHWR at Kaiga Atomic Power Station, generating electricity continuously for 962 days.
Costs not considered in LCOE calculations include funds for research and development, and disasters (the Fukushima disaster is estimated to cost taxpayers ≈$187 billion).
In some cases, governments were found to force "consumers to pay upfront for potential cost overruns" or to subsidize uneconomic nuclear energy, or were required to do so. Nuclear operators are liable to pay for waste management in the EU.
In the U.S. the Congress reportedly decided 40 years ago that the nation, and not private companies, would be responsible for storing radioactive waste with taxpayers paying for the costs.
The World Nuclear Waste Report 2019 found that "even in countries in which the polluter-pays-principle is a legal requirement, it is applied incompletely" and notes the case of the German Asse II deep geological disposal facility, where the retrieval of large amounts of waste has to be paid for by taxpayers. Similarly, other forms of energy, including fossil fuels and renewables, have a portion of their costs covered by governments.
Use in space:
Main article: Nuclear power in space
The most common use of nuclear power in space is the use of radioisotope thermoelectric generators, which use radioactive decay to generate power. These power generators are relatively small scale (few kW), and they are mostly used to power space missions and experiments for long periods where solar power is not available in sufficient quantity, such as in the Voyager 2 space probe.
A few space vehicles have been launched using nuclear reactors: 34 reactors belong to the Soviet RORSAT series and one was the American SNAP-10A.
Both fission and fusion appear promising for space propulsion applications, generating higher mission velocities with less reaction mass.
Safety:
See also: Nuclear safety and security and Nuclear reactor safety system
Nuclear power plants have three unique characteristics that affect their safety, as compared to other power plants. Firstly, intensely radioactive materials are present in a nuclear reactor.
Their release to the environment could be hazardous.
Second, the fission products, which make up most of the intensely radioactive substances in the reactor, continue to generate a significant amount of decay heat even after the fission chain reaction has stopped. If the heat cannot be removed from the reactor, the fuel rods may overheat and release radioactive materials.
Third, a criticality accident (a rapid increase of the reactor power) is possible in certain reactor designs if the chain reaction cannot be controlled. These three characteristics have to be taken into account when designing nuclear reactors.
All modern reactors are designed so that an uncontrolled increase of the reactor power is prevented by natural feedback mechanisms, a concept known as negative void coefficient of reactivity. If the temperature or the amount of steam in the reactor increases, the fission rate inherently decreases.
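The toy iteration below (a deliberately crude sketch, not a reactor model; every parameter value is an assumption) illustrates the feedback idea: with a negative temperature coefficient, a small positive reactivity insertion causes power to level off instead of growing without bound.

```python
# Toy iteration (not a reactor model; every number here is an assumption) showing
# how a negative temperature coefficient limits a power excursion.
def run(alpha, steps=500):
    """alpha: reactivity feedback per kelvin (negative = stabilizing)."""
    power, temp, temp0 = 1.0, 300.0, 300.0
    for _ in range(steps):
        reactivity = 0.005 + alpha * (temp - temp0)   # fixed insertion + feedback
        power *= 1.0 + reactivity                     # crude per-step power update
        temp += 0.5 * power - 0.1 * (temp - temp0)    # heating minus cooling
    return power

print("no feedback:       relative power", round(run(alpha=0.0), 1))    # keeps growing
print("negative feedback: relative power", round(run(alpha=-5e-4), 1))  # levels off
```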
The chain reaction can also be manually stopped by inserting control rods into the reactor core. Emergency core cooling systems (ECCS) can remove the decay heat from the reactor if normal cooling systems fail. If the ECCS fails, multiple physical barriers limit the release of radioactive materials to the environment even in the case of an accident.
The last physical barrier is the large containment building.
With a death rate of 0.07 per TWh, nuclear power is the safest energy source per unit of energy generated in terms of mortality when the historical track-record is considered.
Energy produced by coal, petroleum, natural gas and hydropower has caused more deaths per unit of energy generated due to air pollution and energy accidents. This is found when comparing the immediate deaths from other energy sources to both the immediate and the latent, or predicted, indirect cancer deaths from nuclear energy accidents.
When the direct and indirect fatalities (including fatalities resulting from the mining and air pollution) from nuclear power and fossil fuels are compared, the use of nuclear power has been calculated to have prevented about 1.84 million deaths from air pollution between 1971 and 2009, by reducing the proportion of energy that would otherwise have been generated by fossil fuels.
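A rough way to see what such mortality comparisons mean in practice is to scale the per-TWh rates by a fixed amount of generation. In the sketch below, only the 0.07 deaths/TWh figure for nuclear comes from the text above; the other rates are assumed placeholder values in the general range reported by life-cycle studies.

```python
# Rough mortality-per-energy comparison. Only the nuclear rate is taken from the
# text; the other rates are assumed illustrative values.
deaths_per_twh = {
    "coal":    24.6,   # assumed, dominated by air pollution
    "oil":     18.4,   # assumed
    "gas":      2.8,   # assumed
    "hydro":    1.3,   # assumed, dominated by dam failures
    "nuclear":  0.07,  # from the historical track-record cited above
}

annual_twh = 800  # roughly the annual output of the U.S. nuclear fleet (see production section)
for source, rate in sorted(deaths_per_twh.items(), key=lambda kv: kv[1]):
    print(f"{source:8s} ~{rate * annual_twh:8.0f} deaths if {annual_twh} TWh/yr came from this source")
```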
Following the 2011 Fukushima nuclear disaster, it has been estimated that if Japan had never adopted nuclear power, accidents and pollution from coal or gas plants would have caused more lost years of life.
Serious impacts of nuclear accidents are often not directly attributable to radiation exposure, but rather social and psychological effects. Evacuation and long-term displacement of affected populations created problems for many people, especially the elderly and hospital patients.
Forced evacuation from a nuclear accident may lead to social isolation, anxiety, depression, psychosomatic medical problems, reckless behavior, and suicide. A comprehensive 2005 study on the aftermath of the Chernobyl disaster concluded that the mental health impact is the largest public health problem caused by the accident.
Frank N. von Hippel, an American scientist, commented that a disproportionate fear of ionizing radiation (radiophobia) could have long-term psychological effects on the population of contaminated areas following the Fukushima disaster.
Accidents:
See also:
Some serious nuclear and radiation accidents have occurred. The severity of nuclear accidents is generally classified using the International Nuclear Event Scale (INES) introduced by the International Atomic Energy Agency (IAEA). The scale ranks anomalous events or accidents on a scale from 0 (a deviation from normal operation that poses no safety risk) to 7 (a major accident with widespread effects).
There have been three accidents of level 5 or higher in the civilian nuclear power industry, two of which, the Chernobyl accident and the Fukushima accident, are ranked at level 7.
The first major nuclear accidents were the Kyshtym disaster in the Soviet Union and the Windscale fire in the United Kingdom, both in 1957.
The first major accident at a nuclear reactor in the USA occurred in 1961 at the SL-1, a U.S. Army experimental nuclear power reactor at the Idaho National Laboratory. An uncontrolled chain reaction resulted in a steam explosion which killed the three crew members and caused a meltdown.
Another serious accident happened in 1968, when one of the two liquid-metal-cooled reactors on board the Soviet submarine K-27 underwent a fuel element failure, with the emission of gaseous fission products into the surrounding air, resulting in 9 crew fatalities and 83 injuries.
The Fukushima Daiichi nuclear accident was caused by the 2011 Tohoku earthquake and tsunami. The accident has not caused any radiation-related deaths but resulted in radioactive contamination of surrounding areas. The difficult cleanup operation is expected to cost tens of billions of dollars over 40 or more years.
The Three Mile Island accident in 1979 was a smaller scale accident, rated at INES level 5. There were no direct or indirect deaths caused by the accident.
The impact of nuclear accidents is controversial. According to Benjamin K. Sovacool, fission energy accidents ranked first among energy sources in terms of their total economic cost, accounting for 41 percent of all property damage attributed to energy accidents.
Another analysis found that coal, oil, liquid petroleum gas and hydroelectric accidents (primarily due to the Banqiao Dam disaster) have resulted in greater economic impacts than nuclear power accidents. The study compares latent cancer deaths attributable to nuclear power with immediate deaths from other energy sources per unit of energy generated, and does not include fossil-fuel-related cancers and other indirect deaths caused by fossil fuel consumption in its "severe accident" (an accident with more than five fatalities) classification.
The Chernobyl accident in 1986 caused approximately 50 deaths from direct and indirect effects, and some temporary serious injuries from acute radiation syndrome. The future predicted mortality from increases in cancer rates is estimated at 4,000 in the decades to come. However, the costs have been large and are increasing.
Nuclear power works under an insurance framework that limits or structures accident liabilities in accordance with national and international conventions. It is often argued that this potential shortfall in liability represents an external cost not included in the cost of nuclear electricity. This cost is small, amounting to about 0.1% of the levelized cost of electricity, according to a study by the Congressional Budget Office in the United States.
These beyond-regular insurance costs for worst-case scenarios are not unique to nuclear power. Hydroelectric power plants are similarly not fully insured against a catastrophic event such as dam failures. For example, the failure of the Banqiao Dam caused the death of an estimated 30,000 to 200,000 people, and 11 million people lost their homes. As private insurers base dam insurance premiums on limited scenarios, major disaster insurance in this sector is likewise provided by the state.
Attacks and sabotage:
Main articles:
Terrorists could target nuclear power plants in an attempt to release radioactive contamination into the community. The United States 9/11 Commission has said that nuclear power plants were potential targets originally considered for the September 11, 2001 attacks.
An attack on a reactor's spent fuel pool could also be serious, as these pools are less protected than the reactor core. The release of radioactivity could lead to thousands of near-term deaths and greater numbers of long-term fatalities.
In the United States, the NRC carries out "Force on Force" (FOF) exercises at all nuclear power plant sites at least once every three years. In the United States, plants are surrounded by a double row of tall fences which are electronically monitored. The plant grounds are patrolled by a sizeable force of armed guards.
Insider sabotage is also a threat because insiders can observe and work around security measures. Successful insider crimes depended on the perpetrators' observation and knowledge of security vulnerabilities.
A fire caused $5–10 million worth of damage to New York's Indian Point Energy Center in 1971. The arsonist was a plant maintenance worker.
Proliferation:
Further information: Nuclear proliferation
See also: Plutonium Management and Disposition Agreement
Nuclear proliferation is the spread of nuclear weapons, fissionable material, and weapons-related nuclear technology to states that do not already possess nuclear weapons. Many technologies and materials associated with the creation of a nuclear power program have a dual-use capability, in that they can also be used to make nuclear weapons. For this reason, nuclear power presents proliferation risks.
A nuclear power program can become a route to a nuclear weapon. An example of this is the concern over Iran's nuclear program. The re-purposing of civilian nuclear industries for military purposes would be a breach of the Non-Proliferation Treaty, to which 190 countries adhere.
As of April 2012, thirty-one countries had civil nuclear power plants, of which nine have nuclear weapons. The vast majority of these nuclear weapons states produced weapons before building commercial nuclear power stations.
A fundamental goal for global security is to minimize the nuclear proliferation risks associated with the expansion of nuclear power. The Global Nuclear Energy Partnership was an international effort to create a distribution network in which developing countries in need of energy would receive nuclear fuel at a discounted rate, in exchange for that nation agreeing to forgo their own indigenous development of a uranium enrichment program.
The France-based Eurodif/European Gaseous Diffusion Uranium Enrichment Consortium is a program that successfully implemented this concept, with Spain and other countries without enrichment facilities buying a share of the fuel produced at the French-controlled enrichment facility, but without a transfer of technology. Iran was an early participant from 1974 and remains a shareholder of Eurodif via Sofidif.
A 2009 United Nations report said that "the revival of interest in nuclear power could result in the worldwide dissemination of uranium enrichment and spent fuel reprocessing technologies, which present obvious risks of proliferation as these technologies can produce fissile materials that are directly usable in nuclear weapons."
On the other hand, power reactors can also reduce nuclear weapons arsenals when military-grade nuclear materials are reprocessed to be used as fuel in nuclear power plants.
The Megatons to Megawatts Program is considered the single most successful non-proliferation program to date. Up to 2005, the program had processed $8 billion of highly enriched, weapons-grade uranium into low enriched uranium suitable as nuclear fuel for commercial fission reactors by diluting it with natural uranium. This corresponds to the elimination of 10,000 nuclear weapons.
For approximately two decades, this material generated nearly 10 percent of all the electricity consumed in the United States, or about half of all U.S. nuclear electricity, with a total of around 7,000 TWh of electricity produced. In total it is estimated to have cost $17 billion, a "bargain for US ratepayers", with Russia profiting $12 billion from the deal.
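The down-blending behind such a program is a simple two-stream mass balance: conserve total uranium and total uranium-235. The sketch below uses assumed assay values and an assumed HEU quantity purely for illustration; none of these numbers come from the text.

```python
# Sketch of the down-blending mass balance behind a program like Megatons to
# Megawatts. Enrichment levels and the HEU quantity are illustrative assumptions.
def blend(heu_mass_t, x_heu, x_blend, x_product):
    """Return (blendstock tonnes, LEU product tonnes) for a simple two-stream blend.

    Conservation of total uranium and of U-235:
        m_heu + m_blend = m_product
        m_heu * x_heu + m_blend * x_blend = m_product * x_product
    """
    m_blend = heu_mass_t * (x_heu - x_product) / (x_product - x_blend)
    return m_blend, heu_mass_t + m_blend

blendstock, leu = blend(heu_mass_t=500,     # assumed HEU quantity, tonnes
                        x_heu=0.90,         # weapons-grade, ~90% U-235 (assumed)
                        x_blend=0.015,      # slightly enriched blendstock (assumed)
                        x_product=0.044)    # typical LWR fuel enrichment (assumed)
print(f"{blendstock:.0f} t of blendstock -> {leu:.0f} t of ~4.4% LEU")
```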
The profit was much needed by the Russian nuclear oversight industry, which, after the collapse of the Soviet economy, had difficulty paying for the maintenance and security of the Russian Federation's highly enriched uranium and warheads.
The Megatons to Megawatts Program was hailed as a major success by anti-nuclear weapon advocates, as it was largely the driving force behind the sharp reduction in the number of nuclear weapons worldwide since the Cold War ended. However, without an increase in nuclear reactors and greater demand for fissile fuel, the cost of dismantling and down-blending has dissuaded Russia from continuing its disarmament. As of 2013, Russia appeared not to be interested in extending the program.
Environmental impact:
Main article: Environmental impact of nuclear power
Being a low-carbon energy source with relatively modest land-use requirements, nuclear energy can have a positive environmental impact. It also requires a constant supply of significant amounts of water and affects the environment through mining and milling.
Its largest potential negative impacts on the environment arise from risks such as accidents and the long-term management of radioactive waste. However, these remain mostly risks: historically, there have been only a few disasters at nuclear power plants with known, relatively substantial environmental impacts.
Carbon emissions:
See also:
Nuclear power is one of the leading low-carbon methods of generating electricity and, in terms of total life-cycle greenhouse gas emissions per unit of energy generated, has emission values comparable to or lower than renewable energy.
A 2014 analysis of the carbon footprint literature by the Intergovernmental Panel on Climate Change (IPCC) reported that the embodied total life-cycle emission intensity of nuclear power has a median value of 12 g CO2eq/kWh, which is the lowest among all commercial baseload energy sources.
This is contrasted with coal and natural gas at 820 and 490 g CO2 eq/kWh. As of 2021, nuclear reactors worldwide have helped avoid the emission of 72 billion tonnes of carbon dioxide since 1970, compared to coal-fired electricity generation, according to a report.
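A back-of-envelope check, using only the life-cycle intensities and the 2019 output quoted in this article, shows the scale of the avoided emissions (the choice of coal as the displaced source is a simplification):

```python
# Back-of-envelope check of avoided CO2, using the intensities quoted above
# (12 g/kWh nuclear, 820 g/kWh coal) and the 2019 nuclear output of 2,586 TWh.
nuclear_twh = 2_586
g_per_kwh_coal, g_per_kwh_nuclear = 820, 12

kwh = nuclear_twh * 1e9                                            # 1 TWh = 1e9 kWh
avoided_tonnes = kwh * (g_per_kwh_coal - g_per_kwh_nuclear) / 1e6  # grams -> tonnes
print(f"~{avoided_tonnes / 1e9:.1f} billion tonnes CO2 avoided per year vs. coal")
# ~2.1 Gt/yr; several decades of output at roughly this scale is consistent with
# the ~72 billion tonnes avoided since 1970 cited above.
```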
Radiation:
The average dose from natural background radiation is 2.4 millisievert per year (mSv/a) globally. It varies between 1 mSv/a and 13 mSv/a, depending mostly on the geology of the location.
According to the United Nations (UNSCEAR), regular nuclear power plant operations, including the nuclear fuel cycle, increases this amount by 0.0002 mSv/a of public exposure as a global average.
The average dose from operating nuclear power plants to the local populations around them is less than 0.0001 mSv/a. For comparison, the average dose to those living within 50 miles of a coal power plant is over three times this dose, at 0.0003 mSv/a.
The Chernobyl accident resulted in the most affected surrounding populations and male recovery personnel receiving an initial average dose of 50 to 100 mSv over a few hours to weeks. The remaining global legacy of the worst nuclear power plant accident is an average exposure of 0.002 mSv/a, which continues to decline from an initial high of 0.04 mSv per person, averaged over the entire population of the Northern Hemisphere in 1986, the year of the accident.
Debate:
Main articles:
The nuclear power debate concerns the controversy which has surrounded the deployment and use of nuclear fission reactors to generate electricity from nuclear fuel for civilian purposes.
Proponents of nuclear energy regard it as a sustainable energy source that reduces carbon emissions and increases energy security by decreasing dependence on other energy sources that are often dependent on imports. For example, proponents note that nuclear-generated electricity avoids about 470 million metric tons of carbon dioxide emissions annually that would otherwise come from fossil fuels.
Additionally, the comparatively small amount of waste that nuclear energy does create is safely disposed of by large-scale nuclear energy production facilities or is repurposed/recycled for other energy uses. M. King Hubbert, who popularized the concept of peak oil, saw oil as a resource that would run out and considered nuclear energy its replacement.
Proponents also claim that the present quantity of nuclear waste is small and can be reduced through the latest technology of newer reactors and that the operational safety record of fission-electricity in terms of deaths is so far "unparalleled".
Kharecha and Hansen estimated that "global nuclear power has prevented an average of 1.84 million air pollution-related deaths and 64 gigatonnes of CO2-equivalent (GtCO2-eq) greenhouse gas (GHG) emissions that would have resulted from fossil fuel burning" and, if continued, it could prevent up to 7 million deaths and 240 GtCO2-eq emissions by 2050.
Proponents also bring to attention the opportunity cost of utilizing other forms of electricity.
For example, the Environmental Protection Agency estimates that coal kills 30,000 people a year, as a result of its environmental impact, while 60 people died in the Chernobyl disaster. A real world example of impact provided by proponents is the 650,000 ton increase in carbon emissions in the two months following the closure of the Vermont Yankee nuclear plant.
Opponents believe that nuclear power poses many threats to people's health and the environment, such as the risks of nuclear weapons proliferation, long-term safe waste management, and terrorism. They also contend that nuclear power plants are complex systems where many things can and have gone wrong.
Costs of the Chernobyl disaster amount to ≈$68 billion as of 2019 and are increasing, the Fukushima disaster is estimated to cost taxpayers ~$187 billion, and radioactive waste management is estimated to cost the EU nuclear operators ~$250 billion by 2050.
However, in countries that already use nuclear energy, when not considering reprocessing, intermediate nuclear waste disposal costs could be relatively fixed to certain but unknown degrees "as the main part of these costs stems from the operation of the intermediate storage facility".
Critics find that one of the largest drawbacks to building new nuclear fission power plants is the large construction and operating cost when compared to alternative sustainable energy sources. Further costs include those for ongoing research and development, expensive reprocessing where it is practiced, and decommissioning.
Proponents note that focusing on the levelized cost of energy (LCOE), however, ignores the value premium associated with 24/7 dispatchable electricity and the cost of the storage and backup systems necessary to integrate variable energy sources into a reliable electrical grid.
"Nuclear thus remains the dispatchable low-carbon technology with the lowest expected costs in 2025. Only large hydro reservoirs can provide a similar contribution at comparable costs but remain highly dependent on the natural endowments of individual countries."
Overall, many opponents find that nuclear energy cannot meaningfully contribute to climate change mitigation, generally regarding it as too expensive and too slow to deploy compared to alternative sustainable energy sources.
Nevertheless, there is ongoing research and debate over the costs of new nuclear, especially in regions where, among other factors, seasonal energy storage is difficult to provide and which aim to phase out fossil fuels in favor of low-carbon power faster than the global average.
Some find that the financial transition costs for a 100% renewables-based European energy system that has completely phased out nuclear energy could be higher by 2050, based on current technologies (i.e. not considering potential advances in, for example, green hydrogen, transmission and flexibility capacities, ways to reduce energy needs, geothermal energy and fusion energy) and assuming the grid extends only across Europe. Arguments of economics and safety are used by both sides of the debate.
Comparison with renewable energy:
See also: Renewable energy debate
Slowing global warming requires a transition to a low-carbon economy, mainly by burning far less fossil fuel. Limiting global warming to 1.5 °C is technically possible if no new fossil fuel power plants are built from 2019.
This has generated considerable interest and dispute in determining the best path forward to rapidly replace fossil-based fuels in the global energy mix, with intense academic debate.
The IEA has at times said that countries without nuclear power should develop it as well as their renewable power.
Several studies suggest that it might be theoretically possible to cover a majority of world energy generation with new renewable sources. The Intergovernmental Panel on Climate Change (IPCC) has said that if governments were supportive, renewable energy supply could account for close to 80% of the world's energy use by 2050.
While in developed nations the economically feasible geography for new hydropower is lacking, with every geographically suitable area largely already exploited, some proponents of wind and solar energy claim these resources alone could eliminate the need for nuclear power.
Nuclear power is comparable to, and in some cases lower than, many renewable energy sources in terms of lives lost historically per unit of electricity delivered.
Depending on the extent to which renewable energy technologies are recycled, nuclear reactors may produce a much smaller volume of waste, although that waste is much more toxic, more expensive to manage and longer-lived. A nuclear plant also needs to be disassembled and removed, and much of the disassembled plant must be stored as low-level nuclear waste for a few decades.
The disposal and management of the wide variety of radioactive waste, of which there were over a quarter of a million tonnes as of 2018, can cause damage and costs across the world for hundreds of thousands of years, possibly over a million years, due to issues such as leakage, malign retrieval, vulnerability to attacks (including on reprocessing and power plants), groundwater contamination, radiation and leakage to above ground, brine leakage, and bacterial corrosion.
The European Commission Joint Research Centre found that as of 2021 the necessary technologies for geological disposal of nuclear waste are now available and can be deployed. Corrosion experts noted in 2020 that putting the problem of storage off any longer "isn't good for anyone".
Separated plutonium and enriched uranium could be used for nuclear weapons, which, even with the current centralized control (e.g. at state level) and level of prevalence, are considered a difficult and substantial global risk with potential future impacts on human health, lives, civilization and the environment.
Speed of transition and investment needed:
Analysis in 2015 by professor Barry W. Brook and colleagues found that nuclear energy could displace or remove fossil fuels from the electric grid completely within 10 years. This finding was based on the historically modest and proven rate at which nuclear energy was added in France and Sweden during their building programs in the 1980s.
In a similar analysis, Brook had earlier determined that 50% of all global energy, including transportation synthetic fuels etc., could be generated within approximately 30 years if the global nuclear fission build rate was identical to historical proven installation rates calculated in GW per year per unit of global GDP (GW/year/$). This is in contrast to the conceptual studies for 100% renewable energy systems, which would require an order of magnitude more costly global investment per year, which has no historical precedent.
These renewable scenarios would also need far greater land devoted to onshore wind and onshore solar projects. Brook notes that the "principal limitations on nuclear fission are not technical, economic or fuel-related, but are instead linked to complex issues of societal acceptance, fiscal and political inertia, and inadequate critical evaluation of the real-world constraints facing [the other] low-carbon alternatives."
Scientific data indicates that, assuming 2021 emission levels, humanity has a carbon budget equivalent to only 11 years of emissions left to limit warming to 1.5 °C, while the construction of new nuclear reactors took a median of 7.2–10.9 years in 2018–2020. This is substantially longer than scaling up the deployment of wind and solar alongside other measures, especially for novel reactor types, and is also more risky, often delayed and more dependent on state support.
Researchers have cautioned that novel nuclear technologies, which have been in development for decades, are less tested, have higher proliferation risks, pose new safety problems, are often far from commercialization and are more expensive, may not be available in time.
Critics of nuclear energy often only oppose nuclear fission energy but not nuclear fusion; however, fusion energy is unlikely to be commercially widespread before 2050.
Land use:
The median land area used by US nuclear power stations per 1 GW installed capacity is 1.3 square miles. To generate the same amount of electricity annually (taking into account capacity factors) from solar PV would require about 60 square miles, and from a wind farm about 310 square miles.
Not included in this is land required for the associated transmission lines, water supply, rail lines, mining and processing of nuclear fuel, and for waste disposal.
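The capacity-factor arithmetic behind such comparisons is straightforward. In the sketch below, the per-GW footprints and capacity factors for solar and wind are assumed illustrative values chosen to roughly reproduce the quoted figures; real values vary widely by site.

```python
# Capacity-factor arithmetic behind the land-use comparison above. Solar and wind
# footprints and capacity factors are illustrative assumptions; the nuclear
# footprint and capacity factor come from this article.
sources = {
    #            (sq mi per GW installed, capacity factor)
    "nuclear":  (1.3, 0.92),    # footprint and CF from the text
    "solar PV": (16.0, 0.25),   # assumed
    "wind":     (120.0, 0.35),  # assumed, total project area including turbine spacing
}

target_twh = 1.0 * 8.760 * 0.92  # annual output of 1 GW of nuclear, ~8.1 TWh
for name, (sq_mi_per_gw, cf) in sources.items():
    gw_needed = target_twh / (8.760 * cf)
    print(f"{name:9s}: {gw_needed:4.1f} GW installed, ~{gw_needed * sq_mi_per_gw:5.0f} sq mi")
```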
Research:
Advanced fission reactor designs:
Main article: Generation IV reactor
Current fission reactors in operation around the world are second or third generation systems, with most of the first-generation systems having been already retired. Research into advanced generation IV reactor types was officially started by the Generation IV International Forum (GIF) based on eight technology goals, including to improve economics, safety, proliferation resistance, natural resource utilization and the ability to consume existing nuclear waste in the production of electricity.
Most of these reactors differ significantly from current operating light water reactors, and are expected to be available for commercial construction after 2030.
Hybrid fusion-fission:
Main article: Nuclear fusion–fission hybrid
Hybrid nuclear power is a proposed means of generating power by the use of a combination of nuclear fusion and fission processes. The concept dates to the 1950s and was briefly advocated by Hans Bethe during the 1970s, but largely remained unexplored until a revival of interest in 2009, due to delays in the realization of pure fusion.
If a sustained nuclear fusion power plant is built, it could potentially extract all the fission energy that remains in spent fission fuel, reduce the volume of nuclear waste by orders of magnitude and, more importantly, eliminate all actinides present in the spent fuel, substances which cause security concerns.
Fusion:
Main articles: Nuclear fusion and Fusion power
Nuclear fusion reactions have the potential to be safer and generate less radioactive waste than fission. These reactions appear potentially viable, though technically quite difficult and have yet to be created on a scale that could be used in a functional power plant.
Fusion power has been under theoretical and experimental investigation since the 1950s. Nuclear fusion research is underway but fusion energy is not likely to be commercially widespread before 2050.
Several experimental nuclear fusion reactors and facilities exist. The largest and most ambitious international nuclear fusion project currently in progress is ITER, a large tokamak under construction in France.
ITER is planned to pave the way for commercial fusion power by demonstrating self-sustained nuclear fusion reactions with positive energy gain. Construction of the ITER facility began in 2007, but the project has run into many delays and budget overruns.
The facility is now not expected to begin operations until 2027, 11 years later than initially anticipated. A follow-on commercial nuclear fusion power station, DEMO, has been proposed. There are also suggestions for a power plant based upon a different fusion approach, that of an inertial fusion power plant.
Fusion-powered electricity generation was initially believed to be readily achievable, as fission-electric power had been. However, the extreme requirements for continuous reactions and plasma containment led to projections being extended by several decades. In 2020, more than 80 years after the first attempts, commercialization of fusion power production was thought to be unlikely before 2050.
See also:
BWRs heat water and produce steam directly inside the reactor vessel. Water is pumped up through the reactor core and heated by fission. Pipes then feed the steam directly to a turbine to produce electricity.
The unused steam is then condensed back to water and reused in the heating process.
[End of DOE Article]
___________________________________________________________________________
Nuclear Power as an Energy Source (Wikipedia)
Nuclear power is the use of nuclear reactions to produce electricity. Nuclear power can be obtained from nuclear fission, nuclear decay and nuclear fusion reactions. Presently, the vast majority of electricity from nuclear power is produced by nuclear fission of uranium and plutonium in nuclear power plants.
Nuclear decay processes are used in niche applications such as radioisotope thermoelectric generators in some space probes such as Voyager 2. Generating electricity from fusion power remains the focus of international research.
Most nuclear power plants use thermal reactors with enriched uranium in a once-through fuel cycle. Fuel is removed when the percentage of neutron absorbing atoms becomes so large that a chain reaction can no longer be sustained, typically three years. It is then cooled for several years in on-site spent fuel pools before being transferred to long term storage.
The spent fuel, though low in volume, is high-level radioactive waste. While its radioactivity decreases exponentially it must be isolated from the biosphere for hundreds of thousands of years, though newer technologies (like fast reactors) have the potential to reduce this significantly.
Because the spent fuel is still mostly fissionable material, some countries (e.g. France and Russia) reprocess their spent fuel by extracting fissile and fertile elements for fabrication in new fuel, although this process is more expensive than producing new fuel from mined uranium.
All reactors breed some plutonium-239, which is found in the spent fuel, and because Pu-239 is the preferred material for nuclear weapons, reprocessing is seen as a weapon proliferation risk.
The first nuclear power plant was built in the 1950s. The global installed nuclear capacity grew to 100 GW in the late 1970s, and then expanded rapidly during the 1980s, reaching 300 GW by 1990.
The 1979 Three Mile Island accident in the United States and the 1986 Chernobyl disaster in the Soviet Union resulted in increased regulation and public opposition to nuclear plants.
These factors, along with high cost of construction, resulted in the global installed capacity only increasing to 390 GW by 2022. These plants supplied 2,586 terawatt hours (TWh) of electricity in 2019, equivalent to about 10% of global electricity generation, and were the second-largest low-carbon power source after hydroelectricity.
As of September 2022, there are 437 civilian fission reactors in the world, with overall capacity of 393 GW, 57 under construction and 102 planned, with a combined capacity of 62 GW and 96 GW, respectively.
The United States has the largest fleet of nuclear reactors, generating over 800 TWh of zero-emissions electricity per year with an average capacity factor of 92%. Average global capacity factor is 89%. Most new reactors under construction are generation III reactors in Asia.
Nuclear power generation causes one of the lowest levels of fatalities per unit of energy generated compared to other energy sources. Coal, petroleum, natural gas and hydroelectricity each have caused more fatalities per unit of energy due to air pollution and accidents.
Nuclear power plants emit virtually no greenhouse gases during operation. One of the dangers of nuclear power is the potential for accidents like the Fukushima nuclear disaster in Japan in 2011.
There is a debate about nuclear power. Proponents contend that nuclear power is a safe, sustainable energy source that reduces carbon emissions. The anti-nuclear movement contends that nuclear power poses many threats to people and the environment and is too expensive and slow to deploy when compared to alternative sustainable energy sources.
History:
Main article: History of nuclear power
Origins:
The discovery of nuclear fission occurred in 1938 following over four decades of work on the science of radioactivity and the elaboration of new nuclear physics that described the components of atoms. Soon after the discovery of the fission process, it was realized that a fissioning nucleus can induce further nucleus fissions, thus inducing a self-sustaining chain reaction.
Once this was experimentally confirmed in 1939, scientists in many countries petitioned their governments for support of nuclear fission research, just on the cusp of World War II, for the development of a nuclear weapon.
In the United States, these research efforts led to the creation of the first man-made nuclear reactor, the Chicago Pile-1, which achieved criticality on December 2, 1942. The reactor's development was part of the Manhattan Project, the Allied effort to create atomic bombs during World War II.
It led to the building of larger single-purpose production reactors for the production of weapons-grade plutonium for use in the first nuclear weapons. The United States tested the first nuclear weapon in July 1945, the Trinity test, with the atomic bombings of Hiroshima and Nagasaki taking place one month later.
Despite the military nature of the first nuclear devices, the 1940s and 1950s were characterized by strong optimism for the potential of nuclear power to provide cheap and endless energy. Electricity was generated for the first time by a nuclear reactor on December 20, 1951, at the EBR-I experimental station near Arco, Idaho, which initially produced about 100 kW.
In 1953, American President Dwight Eisenhower gave his "Atoms for Peace" speech at the United Nations, emphasizing the need to develop "peaceful" uses of nuclear power quickly. This was followed by the Atomic Energy Act of 1954 which allowed rapid declassification of U.S. reactor technology and encouraged development by the private sector.
First power generation:
The first organization to develop practical nuclear power was the U.S. Navy, with the S1W reactor for the purpose of propelling submarines and aircraft carriers. The first nuclear-powered submarine, USS Nautilus, was put to sea in January 1954. The S1W reactor was a pressurized water reactor.
This design was chosen because it was simpler, more compact, and easier to operate compared to alternative designs, thus more suitable to be used in submarines. This decision would result in the PWR being the reactor of choice also for power generation, thus having a lasting impact on the civilian electricity market in the years to come.
On June 27, 1954, the Obninsk Nuclear Power Plant in the USSR became the world's first nuclear power plant to generate electricity for a power grid, producing around 5 megawatts of electric power. The world's first commercial nuclear power station, Calder Hall at Windscale, England was connected to the national power grid on 27 August 1956. In common with a number of other generation I reactors, the plant had the dual purpose of producing electricity and plutonium-239, the latter for the nascent nuclear weapons program in Britain.
Expansion and first opposition:
The total global installed nuclear capacity initially rose relatively quickly, rising from less than 1 gigawatt (GW) in 1960 to 100 GW in the late 1970s. During the 1970s and 1980s rising economic costs (related to extended construction times largely due to regulatory changes and pressure-group litigation) and falling fossil fuel prices made nuclear power plants then under construction less attractive.
In the 1980s in the U.S. and 1990s in Europe, the flat electric grid growth and electricity liberalization also made the addition of large new baseload energy generators economically unattractive.
The 1973 oil crisis had a significant effect on countries such as France and Japan, which had relied more heavily on oil for electricity generation, prompting them to invest in nuclear power. France would construct 25 nuclear power plants over the next 15 years, and as of 2019, 71% of French electricity was generated by nuclear power, the highest percentage of any nation in the world.
Some local opposition to nuclear power emerged in the United States in the early 1960s. In the late 1960s some members of the scientific community began to express pointed concerns.
These anti-nuclear concerns related to:
In the early 1970s, there were large protests about a proposed nuclear power plant in Wyhl, Germany. The project was cancelled in 1975. The anti-nuclear success at Wyhl inspired opposition to nuclear power in other parts of Europe and North America.
By the mid-1970s anti-nuclear activism gained a wider appeal and influence, and nuclear power began to become an issue of major public protest. In some countries, the nuclear power conflict "reached an intensity unprecedented in the history of technology controversies".
The increased public hostility to nuclear power led to a longer license procurement process, regulations and increased requirements for safety equipment, which made new construction much more expensive. In the United States, over 120 LWR reactor proposals were ultimately cancelled and the construction of new reactors ground to a halt.
The 1979 accident at Three Mile Island, which caused no fatalities, played a major part in the reduction in the number of new plant constructions in many countries.
Chernobyl and renaissance:
During the 1980s one new nuclear reactor started up every 17 days on average. By the end of the decade, global installed nuclear capacity reached 300 GW. Since the late 1980s, new capacity additions slowed down significantly, with the installed nuclear capacity reaching 366 GW in 2005.
The 1986 Chernobyl disaster in the USSR, involving an RBMK reactor, altered the development of nuclear power and led to a greater focus on meeting international safety and regulatory standards. It is considered the worst nuclear disaster in history, both in total casualties, with 56 direct deaths, and financially, with cleanup costs estimated at 18 billion roubles (US$68 billion in 2019, adjusted for inflation).
The international organization to promote safety awareness and the professional development of operators in nuclear facilities, the World Association of Nuclear Operators (WANO), was created as a direct outcome of the 1986 Chernobyl accident. The Chernobyl disaster played a major part in the reduction in the number of new plant constructions in the following years.
Influenced by these events, Italy voted against nuclear power in a 1987 referendum, becoming the first country to completely phase out nuclear power in 1990.
In the early 2000s, the nuclear industry was expecting a nuclear renaissance, an increase in the construction of new reactors, due to concerns about carbon dioxide emissions. During this period, newer generation III reactors, such as the EPR, began construction.
Fukushima:
Prospects of a nuclear renaissance were delayed by another nuclear accident. The 2011 Fukushima Daiichi nuclear accident was caused by a large tsunami triggered by the Tōhoku earthquake, one of the largest earthquakes ever recorded. The Fukushima Daiichi Nuclear Power Plant suffered three core meltdowns due to failure of the emergency cooling system for lack of electricity supply. This resulted in the most serious nuclear accident since the Chernobyl disaster.
The accident prompted a re-examination of nuclear safety and nuclear energy policy in many countries. Germany approved plans to close all its reactors by 2022, and many other countries reviewed their nuclear power programs.
Following the disaster, Japan shut down all of its nuclear power reactors, some of them permanently, and in 2015 began a gradual process to restart the remaining 40 reactors, following safety checks and based on revised criteria for operations and public approval.
In 2022, the Japanese government, under the leadership of Prime Minister Fumio Kishida, declared that 10 more nuclear power plants, idle since the 2011 disaster, would be reopened. Kishida is also pushing for research into and construction of new, safer nuclear plants to safeguard Japanese consumers from the fluctuating fossil fuel market and to reduce Japan's greenhouse gas emissions. Prime Minister Kishida intends to have Japan become a significant exporter of nuclear energy and technology to developing countries around the world.
Current prospects:
By 2015, the IAEA's outlook for nuclear energy had become more promising, recognizing the importance of low-carbon generation for mitigating climate change. As of 2015, the global trend was for new nuclear power stations coming online to be balanced by the number of old plants being retired.
In 2016, the U.S. Energy Information Administration projected for its "base case" that world nuclear power generation would increase from 2,344 terawatt hours (TWh) in 2012 to 4,500 TWh in 2040. Most of the predicted increase was expected to be in Asia. As of 2018, there are over 150 nuclear reactors planned including 50 under construction.
In January 2019, China had 45 reactors in operation, 13 under construction, and plans to build 43 more, which would make it the world's largest generator of nuclear electricity.
As of 2021, 17 reactors were reported to be under construction. China built significantly fewer reactors than originally planned, its share of electricity from nuclear power was 5% in 2019 and observers have cautioned that, along with the risks, the changing economics of energy generation may cause new nuclear energy plants to "no longer make sense in a world that is leaning toward cheaper, more reliable renewable energy".
In October 2021, the Japanese cabinet approved the new Plan for Electricity Generation to 2030 prepared by the Agency for Natural Resources and Energy (ANRE) and an advisory committee, following public consultation. The nuclear target for 2030 requires the restart of another ten reactors.
Prime Minister Fumio Kishida in July 2022 announced that the country should consider building advanced reactors and extending operating licences beyond 60 years.
As of 2022, with world oil and gas prices on the rise and Germany restarting its coal plants to cope with the loss of the Russian gas needed to supplement its Energiewende, many other countries have announced ambitious plans to reinvigorate ageing nuclear generating capacity with new investments.
French President Emmanuel Macron announced his intention to build six new reactors in coming decades, placing nuclear at the heart of France's drive for carbon neutrality by 2050.
Meanwhile in the United States, the Department of Energy, in collaboration with commercial entities, TerraPower and X-energy, is planning on building two different advanced nuclear reactors by 2027, with further plans for nuclear implementation in its long term green energy and energy security goals.
Power plants:
Main articles:
Nuclear power plants are thermal power stations that generate electricity by harnessing the thermal energy released from nuclear fission. A fission nuclear power plant is generally composed of:
- a nuclear reactor, in which the nuclear reactions generating heat take place;
- a cooling system, which removes the heat from inside the reactor;
- a steam turbine, which transforms the heat into mechanical energy;
- an electric generator, which transforms the mechanical energy into electrical energy.
When a neutron hits the nucleus of a uranium-235 or plutonium atom, it can split the nucleus into two smaller nuclei, which is a nuclear fission reaction. The reaction releases energy and neutrons. The released neutrons can hit other uranium or plutonium nuclei, causing new fission reactions, which release more energy and more neutrons.
This is called a chain reaction. In most commercial reactors, the reaction rate is contained by control rods that absorb excess neutrons. The controllability of nuclear reactors depends on the fact that a small fraction of neutrons resulting from fission are delayed. The time delay between the fission and the release of the neutrons slows down changes in reaction rates and gives time for moving the control rods to adjust the reaction rate.
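A standard back-of-envelope calculation (a one-delayed-group textbook approximation with typical assumed parameter values, not figures from this article) shows why that small delayed fraction matters so much for controllability:

```python
# Back-of-envelope illustration of why delayed neutrons make reactors controllable.
# Parameter values are typical assumptions for a U-235 fueled LWR.
prompt_lifetime = 1e-4   # s, typical prompt-neutron lifetime (assumed)
beta = 0.0065            # delayed-neutron fraction for U-235 (assumed)
lam = 0.08               # 1/s, effective one-group precursor decay constant (assumed)
rho = 0.001              # small positive reactivity insertion

# If every neutron were prompt, power would e-fold on the order of lifetime/rho:
prompt_only_period = prompt_lifetime / rho
# With delayed neutrons (rho well below beta), the stable reactor period is roughly:
actual_period = (beta - rho) / (lam * rho)

print(f"prompt-only e-folding time: ~{prompt_only_period:.1f} s")   # ~0.1 s
print(f"with delayed neutrons:      ~{actual_period:.0f} s")        # tens of seconds
```

The tens-of-seconds period in the second case is what gives operators and control systems time to move control rods, as described above.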
Fuel cycle:
Main articles: Nuclear fuel cycle and Integrated Nuclear Fuel Cycle Information System
The life cycle of nuclear fuel starts with uranium mining. The uranium ore is then converted into a compact ore concentrate form, known as yellowcake (U3O8), to facilitate transport. Fission reactors generally need uranium-235, a fissile isotope of uranium.
The concentration of uranium-235 in natural uranium is very low (about 0.7%). Some reactors can use this natural uranium as fuel, depending on their neutron economy. These reactors generally have graphite or heavy water moderators.
For light water reactors, the most common type of reactor, this concentration is too low, and it must be increased by a process called uranium enrichment. In civilian light water reactors, uranium is typically enriched to 3.5–5% uranium-235.
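The enrichment step is governed by a simple mass balance plus the standard separative-work (SWU) value function. The sketch below assumes a 4.5% product assay (within the range quoted above) and a 0.25% tails assay; under these assumptions roughly nine kilograms of natural uranium and about seven SWU are needed per kilogram of LWR fuel.

```python
# Sketch of the standard enrichment mass balance and separative-work calculation.
# Product and tails assays are assumed example values.
import math

def value(x):
    """Separative value function V(x) = (2x - 1) ln(x / (1 - x))."""
    return (2 * x - 1) * math.log(x / (1 - x))

x_feed, x_product, x_tails = 0.00711, 0.045, 0.0025   # natural U, LEU product, tails

product_kg = 1.0
feed_kg = product_kg * (x_product - x_tails) / (x_feed - x_tails)
tails_kg = feed_kg - product_kg
swu = (product_kg * value(x_product) + tails_kg * value(x_tails)
       - feed_kg * value(x_feed))

print(f"feed needed: {feed_kg:.1f} kg natural uranium per kg of LEU")   # ~9 kg
print(f"separative work: {swu:.1f} SWU per kg of LEU")                  # ~7 SWU
```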
The uranium is then generally converted into uranium oxide (UO2), a ceramic, that is then compressively sintered into fuel pellets, a stack of which forms fuel rods of the proper composition and geometry for the particular reactor.
After some time in the reactor, the fuel will have reduced fissile material and increased fission products, until its use becomes impractical. At this point, the spent fuel will be moved to a spent fuel pool which provides cooling for the thermal heat and shielding for ionizing radiation.
After several months or years, the spent fuel is radioactively and thermally cool enough to be moved to dry storage casks or reprocessed.
Uranium resources:
Main articles:
Uranium is a fairly common element in the Earth's crust: it is approximately as common as tin or germanium, and is about 40 times more common than silver. Uranium is present in trace concentrations in most rocks, dirt, and ocean water, but is generally economically extracted only where it is present in high concentrations.
Uranium mining can be underground, open-pit, or in-situ leach mining. An increasing number of the highest output mines are remote underground operations, such as McArthur River uranium mine, in Canada, which by itself accounts for 13% of global production. As of 2011 the world's known resources of uranium, economically recoverable at the arbitrary price ceiling of US$130/kg, were enough to last for between 70 and 100 years.
In 2007, the OECD estimated 670 years of economically recoverable uranium in total conventional resources and phosphate ores assuming the then-current use rate.
Light water reactors make relatively inefficient use of nuclear fuel, mostly using only the very rare uranium-235 isotope. Nuclear reprocessing can make this waste reusable, and newer reactors also achieve a more efficient use of the available resources than older ones.
With a pure fast reactor fuel cycle with a burnup of all the uranium and actinides (which presently make up the most hazardous substances in nuclear waste), there is an estimated 160,000 years' worth of uranium in total conventional resources and phosphate ore at the price of 60–100 US$/kg.
However, reprocessing is expensive, possibly dangerous and can be used to manufacture nuclear weapons. One analysis found that uranium prices could increase by two orders of magnitude between 2035 and 2100, and that there could be a shortage near the end of the century.
A 2017 study by researchers from MIT and WHOI found that "at the current consumption rate, global conventional reserves of terrestrial uranium (approximately 7.6 million tonnes) could be depleted in a little over a century". Limited uranium-235 supply may inhibit substantial expansion with the current nuclear technology.
While various ways to reduce dependence on such resources are being explored, new nuclear technologies are not expected to be available in time for climate change mitigation or to compete with renewable alternatives, and they are more expensive and require costly research and development.
A study found it to be uncertain whether identified resources will be developed quickly enough to provide uninterrupted fuel supply to expanded nuclear facilities, and that various forms of mining may be challenged by ecological barriers, costs, and land requirements.
Researchers also report considerable import dependence of nuclear energy. Unconventional uranium resources also exist. Uranium is naturally present in seawater at a concentration of about 3 micrograms per liter, with 4.4 billion tons of uranium considered present in seawater at any time.
In 2014 it was suggested that it would be economically competitive to produce nuclear fuel from seawater if the process was implemented at large scale. Like fossil fuels, over geological timescales, uranium extracted on an industrial scale from seawater would be replenished by both river erosion of rocks and the natural process of uranium dissolved from the surface area of the ocean floor, both of which maintain the solubility equilibria of seawater concentration at a stable level.
Some commentators have argued that this strengthens the case for nuclear power to be considered a renewable energy.
Waste:
Main article: Nuclear waste
The normal operation of nuclear power plants and facilities produces radioactive waste, or nuclear waste. This type of waste is also produced during plant decommissioning. There are two broad categories of nuclear waste: low-level waste and high-level waste.
The first has low radioactivity and includes contaminated items such as clothing, which pose a limited threat. High-level waste is mainly the spent fuel from nuclear reactors, which is very radioactive and must be cooled and then safely disposed of or reprocessed.
High-level waste:
Main articles: High-level waste and Spent nuclear fuel
The most important waste stream from nuclear power reactors is spent nuclear fuel, which is considered high-level waste. For LWRs, spent fuel is typically composed of 95% uranium, 4% fission products, and about 1% transuranic actinides (mostly plutonium, neptunium and americium).
The fission products are responsible for the bulk of the short-term radioactivity, whereas the plutonium and other transuranics are responsible for the bulk of the long-term radioactivity.
High-level waste (HLW) must be stored isolated from the biosphere with sufficient shielding so as to limit radiation exposure. After being removed from the reactors, used fuel bundles are stored for six to ten years in spent fuel pools, which provide cooling and shielding against radiation.
After that, the fuel is cool enough that it can be safely transferred to dry cask storage. The radioactivity decreases exponentially with time, such that it will have decreased by 99.5% after 100 years. The more intensely radioactive short-lived fission products (SLFPs) decay into stable elements in approximately 300 years, and after about 100,000 years, the spent fuel becomes less radioactive than natural uranium ore.
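As a simple illustration of this decay behaviour, the sketch below applies the exponential decay law to two of the dominant medium-lived fission products. The half-lives are standard reference values, and the choice of isotopes is an assumption made purely for illustration.

```python
# Minimal sketch: exponential decay of two dominant medium-lived fission
# products, showing why activity falls dramatically after ~10 half-lives (~300 y).
import math

HALF_LIVES_Y = {"Cs-137": 30.1, "Sr-90": 28.8}  # standard reference half-lives

def fraction_remaining(half_life_y: float, years: float) -> float:
    """Fraction of the original activity remaining after the given time."""
    return math.exp(-math.log(2.0) * years / half_life_y)

for years in (10, 100, 300):
    remaining = {n: fraction_remaining(t, years) for n, t in HALF_LIVES_Y.items()}
    summary = ", ".join(f"{n}: {r:.2%}" for n, r in remaining.items())
    print(f"after {years:>3} years -> {summary}")
```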
Commonly suggested methods to isolate LLFP waste from the biosphere include separation and transmutation, synroc treatments, or deep geological storage.
Thermal-neutron reactors, which presently constitute the majority of the world fleet, cannot burn up the reactor grade plutonium that is generated during the reactor operation. This limits the life of nuclear fuel to a few years.
In some countries, such as the United States, spent fuel is classified in its entirety as a nuclear waste. In other countries, such as France, it is largely reprocessed to produce a partially recycled fuel, known as mixed oxide fuel or MOX. For spent fuel that does not undergo reprocessing, the most concerning isotopes are the medium-lived transuranic elements, which are led by reactor-grade plutonium (half-life 24,000 years).
Some proposed reactor designs, such as the integral fast reactor and molten salt reactors, can use as fuel the plutonium and other actinides in spent fuel from light water reactors, thanks to their fast fission spectrum. This offers a potentially more attractive alternative to deep geological disposal.
The thorium fuel cycle results in similar fission products, though creates a much smaller proportion of transuranic elements from neutron capture events within a reactor. Spent thorium fuel, although more difficult to handle than spent uranium fuel, may present somewhat lower proliferation risks.
Low-level waste:
Main article: Low-level waste
The nuclear industry also produces a large volume of low-level waste, with low radioactivity, in the form of contaminated items like clothing, hand tools, water purifier resins, and (upon decommissioning) the materials of which the reactor itself is built. Low-level waste can be stored on-site until radiation levels are low enough to be disposed of as ordinary waste, or it can be sent to a low-level waste disposal site.
Waste relative to other types:
See also: Radioactive waste § Naturally occurring radioactive material
In countries with nuclear power, radioactive wastes account for less than 1% of total industrial toxic wastes, much of which remains hazardous for long periods. Overall, nuclear power produces far less waste material by volume than fossil-fuel based power plants.
Coal-burning plants, in particular, produce large amounts of toxic and mildly radioactive ash resulting from the concentration of naturally occurring radioactive materials in coal. A 2008 report from Oak Ridge National Laboratory concluded that coal power actually results in more radioactivity being released into the environment than nuclear power operation, and that the population effective dose equivalent from radiation from coal plants is 100 times that from the operation of nuclear plants.
Although coal ash is much less radioactive than spent nuclear fuel by weight, coal ash is produced in much higher quantities per unit of energy generated. It is also released directly into the environment as fly ash, whereas nuclear plants use shielding to protect the environment from radioactive materials.
Nuclear waste volume is small compared to the energy produced. For example, at Yankee Rowe Nuclear Power Station, which generated 44 billion kilowatt hours of electricity when in service, its complete spent fuel inventory is contained within sixteen casks. It is estimated that to produce a lifetime supply of energy for a person at a western standard of living (approximately 3 GWh) would require on the order of the volume of a soda can of low enriched uranium, resulting in a similar volume of spent fuel generated.
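A rough order-of-magnitude check of the "soda can" comparison can be made with assumed values for can volume, uranium density, discharge burnup, and thermal efficiency; none of these inputs come from the text above, and the result should only be read as a consistency check.

```python
# Rough order-of-magnitude check (all inputs are assumptions) of the
# "soda can of low-enriched uranium per lifetime" comparison.
CAN_VOLUME_CM3 = 355.0        # assumed soda-can volume
URANIUM_DENSITY_G_CM3 = 19.1  # uranium metal (UO2 ceramic would be ~11)
BURNUP_GWD_PER_TONNE = 45.0   # assumed typical LWR discharge burnup (thermal)
THERMAL_EFFICIENCY = 0.33     # assumed thermal-to-electric conversion

uranium_tonnes = CAN_VOLUME_CM3 * URANIUM_DENSITY_G_CM3 / 1e6
electric_gwh = uranium_tonnes * BURNUP_GWD_PER_TONNE * 24.0 * THERMAL_EFFICIENCY
print(f"~{uranium_tonnes*1000:.1f} kg of uranium -> ~{electric_gwh:.1f} GWh(e), "
      "broadly consistent with the ~3 GWh lifetime figure")
```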
Waste disposal:
See also: List of radioactive waste treatment technologies
Following interim storage in a spent fuel pool, the bundles of used fuel rod assemblies of a typical nuclear power station are often stored on site in dry cask storage vessels. Presently, waste is mainly stored at individual reactor sites and there are over 430 locations around the world where radioactive material continues to accumulate.
Disposal of nuclear waste is often considered the most politically divisive aspect in the lifecycle of a nuclear power facility, with the lack of movement of nuclear waste at the 2-billion-year-old natural nuclear fission reactors in Oklo, Gabon, cited as "a source of essential information today."
Experts suggest that centralized underground repositories which are well-managed, guarded, and monitored, would be a vast improvement. There is an "international consensus on the advisability of storing nuclear waste in deep geological repositories". With the advent of new technologies, other methods including horizontal drillhole disposal into geologically inactive areas have been proposed.
There are no commercial-scale, purpose-built underground high-level waste repositories in operation. However, in Finland the Onkalo spent nuclear fuel repository at the Olkiluoto Nuclear Power Plant is under construction as of 2015.
Reprocessing:
Most thermal-neutron reactors run on a once-through nuclear fuel cycle, mainly due to the low price of fresh uranium. However, many reactors are also fueled with recycled fissionable materials that remain in spent nuclear fuel. The most common fissionable material that is recycled is the reactor-grade plutonium (RGPu) extracted from spent fuel; it is mixed with uranium oxide and fabricated into mixed-oxide or MOX fuel.
Because thermal LWRs remain the most common reactor worldwide, this type of recycling is the most common. It is considered to increase the sustainability of the nuclear fuel cycle, reduce the attractiveness of spent fuel to theft, and lower the volume of high level nuclear waste.
Spent MOX fuel cannot generally be recycled for use in thermal-neutron reactors. This issue does not affect fast-neutron reactors, which are therefore preferred in order to achieve the full energy potential of the original uranium.
The main constituent of spent fuel from LWRs is slightly enriched uranium. This can be recycled into reprocessed uranium (RepU), which can be used in a fast reactor, used directly as fuel in CANDU reactors, or re-enriched for another cycle through an LWR. Re-enriching of reprocessed uranium is common in France and Russia.
Reprocessed uranium is also safer in terms of nuclear proliferation potential. Reprocessing has the potential to recover up to 95% of the uranium and plutonium fuel in spent nuclear fuel, as well as reduce long-term radioactivity within the remaining waste.
However, reprocessing has been politically controversial because of the potential for nuclear proliferation and varied perceptions of increasing the vulnerability to nuclear terrorism.
Reprocessing also leads to higher fuel cost compared to the once-through fuel cycle. While reprocessing reduces the volume of high-level waste, it does not reduce the fission products that are the primary causes of residual heat generation and radioactivity for the first few centuries outside the reactor. Thus, reprocessed waste still requires an almost identical treatment for the first few hundred years.
Reprocessing of civilian fuel from power reactors is currently done in France, the United Kingdom, Russia, Japan, and India. In the United States, spent nuclear fuel is currently not reprocessed.
The La Hague reprocessing facility in France has operated commercially since 1976 and is responsible for half the world's reprocessing as of 2010. It produces MOX fuel from spent fuel derived from several countries. More than 32,000 tonnes of spent fuel had been reprocessed as of 2015, with the majority from France, 17% from Germany, and 9% from Japan.
Breeding:
Main articles: Breeder reactor and Nuclear power proposed as renewable energy
Breeding is the process of converting non-fissile material into fissile material that can be used as nuclear fuel. The non-fissile material that can be used for this process is called fertile material, and it constitutes the vast majority of current nuclear waste. This breeding process occurs naturally in breeder reactors.
As opposed to light water thermal-neutron reactors, which use uranium-235 (0.7% of all natural uranium), fast-neutron breeder reactors use uranium-238 (99.3% of all natural uranium) or thorium. A number of fuel cycles and breeder reactor combinations are considered to be sustainable or renewable sources of energy.
In 2006 it was estimated that with seawater extraction, there was likely five billion years' worth of uranium resources for use in breeder reactors.
Breeder technology has been used in several reactors, but as of 2006, the high cost of reprocessing fuel safely requires uranium prices of more than US$200/kg before becoming justified economically.
Breeder reactors are however being developed for their potential to burn up all of the actinides (the most active and dangerous components) in the present inventory of nuclear waste, while also producing power and creating additional quantities of fuel for more reactors via the breeding process.
As of 2017, there are two breeders producing commercial power, the BN-600 and the BN-800 reactors, both in Russia. The Phénix breeder reactor in France was powered down in 2009 after 36 years of operation. Both China and India are building breeder reactors. The Indian 500 MWe Prototype Fast Breeder Reactor is in the commissioning phase, with plans to build more.
Another alternative to fast-neutron breeders are thermal-neutron breeder reactors that use uranium-233 bred from thorium as fission fuel in the thorium fuel cycle. Thorium is about 3.5 times more common than uranium in the Earth's crust, and has different geographic characteristics.
India's three-stage nuclear power programme features the use of a thorium fuel cycle in the third stage, as it has abundant thorium reserves but little uranium.
Decommissioning:
Main article: Nuclear decommissioning
Nuclear decommissioning is the process of dismantling a nuclear facility to the point that it no longer requires measures for radiation protection, returning the facility and its parts to a safe enough level to be entrusted for other uses.
Due to the presence of radioactive materials, nuclear decommissioning presents technical and economic challenges. The costs of decommissioning are generally spread over the lifetime of a facility and saved in a decommissioning fund.
Production:
Further information: Nuclear power by country and List of nuclear reactors
Civilian nuclear power supplied 2,586 terawatt hours (TWh) of electricity in 2019, equivalent to about 10% of global electricity generation, and was the second largest low-carbon power source after hydroelectricity.
Since electricity accounts for about 25% of world energy consumption, nuclear power's contribution to global energy was about 2.5% in 2011. This is a little more than the combined global electricity production from wind, solar, biomass and geothermal power, which together provided 2% of global final energy consumption in 2014.
Nuclear power's share of global electricity production has fallen from 16.5% in 1997, in large part because the economics of nuclear power have become more difficult.
As of March 2022, there are 439 civilian fission reactors in the world, with a combined electrical capacity of 392 gigawatt (GW). There are also 56 nuclear power reactors under construction and 96 reactors planned, with a combined capacity of 62 GW and 96 GW, respectively.
The United States has the largest fleet of nuclear reactors, generating over 800 TWh per year with an average capacity factor of 92%. Most reactors under construction are generation III reactors in Asia.
Regional differences in the use of nuclear power are large:
The United States produces the most nuclear energy in the world, with nuclear power providing 20% of the electricity it consumes, while France produces the highest percentage of its electrical energy from nuclear reactors—71% in 2019.
In the European Union, nuclear power provides 26% of the electricity as of 2018.
Nuclear power is the single largest low-carbon electricity source in the United States, and accounts for two-thirds of the European Union's low-carbon electricity. Nuclear energy policy differs among European Union countries, and some, such as Austria, Estonia, Ireland and Italy, have no active nuclear power stations.
In addition, there are approximately 140 naval vessels using nuclear propulsion in operation, powered by about 180 reactors. These include military and some civilian ships, such as nuclear-powered icebreakers.
International research is continuing into additional uses of process heat such as hydrogen production (in support of a hydrogen economy), for desalinating sea water, and for use in district heating systems.
Economics:
Main articles:
- Economics of nuclear power plants,
- List of companies in the nuclear sector,
- Cost of electricity by source
The economics of new nuclear power plants is a controversial subject and multi-billion-dollar investments depend on the choice of energy sources. Nuclear power plants typically have high capital costs for building the plant. For this reason, comparison with other power generation methods is strongly dependent on assumptions about construction timescales and capital financing for nuclear plants.
Fuel costs account for about 30 percent of the operating costs, while prices are subject to the market.
The high cost of construction is one of the biggest challenges for nuclear power plants. A new 1,100 MW plant is estimated to cost between $6 billion to $9 billion. Nuclear power cost trends show large disparity by nation, design, build rate and the establishment of familiarity in expertise.
The only two nations for which data is available that saw cost decreases in the 2000s were India and South Korea.
Analysis of the economics of nuclear power must also take into account who bears the risks of future uncertainties. As of 2010, all operating nuclear power plants have been developed by state-owned or regulated electric utility monopolies.
Many countries have since liberalized the electricity market where these risks, and the risk of cheaper competitors emerging before capital costs are recovered, are borne by plant suppliers and operators rather than consumers, which leads to a significantly different evaluation of the economics of new nuclear power plants.
The levelized cost of electricity (LCOE) from a new nuclear power plant is estimated to be 69 USD/MWh, according to an analysis by the International Energy Agency and the OECD Nuclear Energy Agency. This represents the median cost estimate for an nth-of-a-kind nuclear power plant to be completed in 2025, at a discount rate of 7%.
Nuclear power was found to be the least-cost option among dispatchable technologies. Variable renewables can generate cheaper electricity: the median cost of onshore wind power was estimated to be 50 USD/MWh, and utility-scale solar power 56 USD/MWh.
At the assumed CO2 emission cost of 30 USD/ton, power from coal (88 USD/MWh) and gas (71 USD/MWh) is more expensive than low-carbon technologies. Electricity from long-term operation of nuclear power plants by lifetime extension was found to be the least-cost option, at 32 USD/MWh.
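Such LCOE figures come from discounting lifetime costs and generation. The sketch below shows the basic calculation with purely illustrative inputs; it is not the IEA/NEA model, and its output should not be read as a published estimate.

```python
# Hedged LCOE sketch (illustrative inputs only): discounted lifetime costs
# divided by discounted lifetime generation.
DISCOUNT_RATE = 0.07
CAPACITY_MW = 1100.0
CAPACITY_FACTOR = 0.90
CONSTRUCTION_YEARS = 7
OPERATING_YEARS = 60
OVERNIGHT_COST_USD = 6e9          # assumed total construction cost
FIXED_OM_USD_PER_KW_YR = 120.0    # assumed fixed operations & maintenance
FUEL_AND_VAR_OM_USD_PER_MWH = 10.0

annual_mwh = CAPACITY_MW * 8760.0 * CAPACITY_FACTOR
pv_costs = 0.0
pv_energy = 0.0
for year in range(CONSTRUCTION_YEARS + OPERATING_YEARS):
    discount = (1.0 + DISCOUNT_RATE) ** year
    if year < CONSTRUCTION_YEARS:                       # building the plant
        pv_costs += (OVERNIGHT_COST_USD / CONSTRUCTION_YEARS) / discount
    else:                                               # operating the plant
        annual_cost = (FIXED_OM_USD_PER_KW_YR * CAPACITY_MW * 1000.0
                       + FUEL_AND_VAR_OM_USD_PER_MWH * annual_mwh)
        pv_costs += annual_cost / discount
        pv_energy += annual_mwh / discount

print(f"LCOE ~ {pv_costs / pv_energy:.0f} USD/MWh for these assumed inputs")
```

Because most of the discounted cost is incurred up front, the result is highly sensitive to the discount rate and construction time, which is why comparisons between studies often diverge.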
Measures to mitigate global warming, such as a carbon tax or carbon emissions trading, may favor the economics of nuclear power. Extreme weather events, including events made more severe by climate change, are decreasing all energy source reliability including nuclear energy by a small degree, depending on location siting.
New small modular reactors, such as those developed by NuScale Power, are aimed at reducing the investment costs for new construction by making the reactors smaller and modular, so that they can be built in a factory.
Certain designs, such as the CANDU, had considerably favorable economics early on, realizing much higher capacity factors and reliability than generation II light water reactors up to the 1990s.
Nuclear power plants, though capable of some grid-load following, are typically run as much as possible to keep the cost of the generated electrical energy as low as possible, supplying mostly base-load electricity. Due to the on-line refueling reactor design, PHWRs (of which the CANDU design is a part) continue to hold many world record positions for longest continual electricity generation, often over 800 days. The specific record as of 2019 is held by a PHWR at Kaiga Atomic Power Station, generating electricity continuously for 962 days.
Costs not considered in LCOE calculations include funds for research and development, and disasters (the Fukushima disaster is estimated to cost taxpayers ≈$187 billion).
Governments were found in some cases to force "consumers to pay upfront for potential cost overruns", to subsidize uneconomic nuclear energy, or to be required to do so. Nuclear operators are liable to pay for waste management in the EU.
In the U.S. the Congress reportedly decided 40 years ago that the nation, and not private companies, would be responsible for storing radioactive waste with taxpayers paying for the costs.
The World Nuclear Waste Report 2019 found that "even in countries in which the polluter-pays-principle is a legal requirement, it is applied incompletely" and notes the case of the German Asse II deep geological disposal facility, where the retrieval of large amounts of waste has to be paid for by taxpayers. Similarly, other forms of energy, including fossil fuels and renewables, have a portion of their costs covered by governments.
Use in space
Main article: Nuclear power in space
The most common use of nuclear power in space is the use of radioisotope thermoelectric generators, which use radioactive decay to generate power. These power generators are relatively small scale (few kW), and they are mostly used to power space missions and experiments for long periods where solar power is not available in sufficient quantity, such as in the Voyager 2 space probe.
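As an illustration of how such generators behave over a mission, the sketch below models the fall-off of electrical output as the Pu-238 heat source decays. The initial power is an assumed value rather than mission data, and thermocouple degradation (which further reduces real output) is ignored.

```python
# Minimal sketch (assumed initial power, not mission data): decline of a
# radioisotope thermoelectric generator's output as its Pu-238 fuel decays.
import math

PU238_HALF_LIFE_Y = 87.7     # standard reference half-life
INITIAL_POWER_W = 300.0      # assumed beginning-of-life electrical output

def rtg_power(years_after_launch: float) -> float:
    """Electrical output (W) assuming output scales with remaining Pu-238."""
    return INITIAL_POWER_W * math.exp(-math.log(2.0) * years_after_launch
                                      / PU238_HALF_LIFE_Y)

for years in (0, 10, 25, 45):
    print(f"{years:>2} years after launch: ~{rtg_power(years):.0f} W")
```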
A few space vehicles have been launched using nuclear reactors: 34 reactors belonged to the Soviet RORSAT series, and one was the American SNAP-10A.
Both fission and fusion appear promising for space propulsion applications, generating higher mission velocities with less reaction mass.
Safety:
See also: Nuclear safety and security and Nuclear reactor safety system
Nuclear power plants have three unique characteristics that affect their safety, as compared to other power plants. Firstly, intensely radioactive materials are present in a nuclear reactor.
Their release to the environment could be hazardous.
Second, the fission products, which make up most of the intensely radioactive substances in the reactor, continue to generate a significant amount of decay heat even after the fission chain reaction has stopped. If the heat cannot be removed from the reactor, the fuel rods may overheat and release radioactive materials.
Third, a criticality accident (a rapid increase of the reactor power) is possible in certain reactor designs if the chain reaction cannot be controlled. These three characteristics have to be taken into account when designing nuclear reactors.
All modern reactors are designed so that an uncontrolled increase of the reactor power is prevented by natural feedback mechanisms, a concept known as negative void coefficient of reactivity. If the temperature or the amount of steam in the reactor increases, the fission rate inherently decreases.
The chain reaction can also be manually stopped by inserting control rods into the reactor core. Emergency core cooling systems (ECCS) can remove the decay heat from the reactor if normal cooling systems fail. If the ECCS fails, multiple physical barriers limit the release of radioactive materials to the environment even in the case of an accident.
The last physical barrier is the large containment building.
With a death rate of 0.07 per TWh, nuclear power is the safest energy source per unit of energy generated in terms of mortality when the historical track-record is considered.
Energy produced by coal, petroleum, natural gas and hydropower has caused more deaths per unit of energy generated due to air pollution and energy accidents. This is found when comparing the immediate deaths from other energy sources to both the immediate and the latent, or predicted, indirect cancer deaths from nuclear energy accidents.
When the direct and indirect fatalities (including fatalities resulting from the mining and air pollution) from nuclear power and fossil fuels are compared, the use of nuclear power has been calculated to have prevented about 1.84 million deaths from air pollution between 1971 and 2009, by reducing the proportion of energy that would otherwise have been generated by fossil fuels.
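The logic behind such estimates can be sketched as displaced generation multiplied by the difference in mortality rates per unit of electricity. The values below (mortality rates, cumulative generation, and displaced fuel mix) are illustrative assumptions, not the inputs used by the cited study.

```python
# Illustrative sketch only (assumed rates and mix): deaths avoided when
# nuclear generation displaces fossil generation.
DEATH_RATE_PER_TWH = {       # assumed illustrative mortality rates
    "coal": 24.6,
    "oil": 18.4,
    "gas": 2.8,
    "nuclear": 0.07,
}
NUCLEAR_TWH_1971_2009 = 64_000.0  # assumed cumulative nuclear generation
DISPLACED_MIX = {"coal": 0.85, "oil": 0.05, "gas": 0.10}  # assumed displaced mix

displaced_rate = sum(DEATH_RATE_PER_TWH[src] * share
                     for src, share in DISPLACED_MIX.items())
avoided = NUCLEAR_TWH_1971_2009 * (displaced_rate - DEATH_RATE_PER_TWH["nuclear"])
print(f"~{avoided/1e6:.1f} million deaths avoided under these assumptions")
```

The result is of the same order of magnitude as published estimates, but depends strongly on which fuels are assumed to have been displaced.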
Following the 2011 Fukushima nuclear disaster, it has been estimated that if Japan had never adopted nuclear power, accidents and pollution from coal or gas plants would have caused more lost years of life.
Serious impacts of nuclear accidents are often not directly attributable to radiation exposure, but rather social and psychological effects. Evacuation and long-term displacement of affected populations created problems for many people, especially the elderly and hospital patients.
Forced evacuation from a nuclear accident may lead to social isolation, anxiety, depression, psychosomatic medical problems, reckless behavior, and suicide. A comprehensive 2005 study on the aftermath of the Chernobyl disaster concluded that the mental health impact is the largest public health problem caused by the accident.
Frank N. von Hippel, an American scientist, commented that a disproportionate fear of ionizing radiation (radiophobia) could have long-term psychological effects on the population of contaminated areas following the Fukushima disaster.
Accidents:
See also:
- Energy accidents,
- Nuclear and radiation accidents and incidents,
- Lists of nuclear disasters and radioactive incidents
Some serious nuclear and radiation accidents have occurred. The severity of nuclear accidents is generally classified using the International Nuclear Event Scale (INES) introduced by the International Atomic Energy Agency (IAEA). The scale ranks anomalous events or accidents on a scale from 0 (a deviation from normal operation that poses no safety risk) to 7 (a major accident with widespread effects).
There have been three accidents of level 5 or higher in the civilian nuclear power industry, two of which, the Chernobyl accident and the Fukushima accident, are ranked at level 7.
The first major nuclear accidents were the Kyshtym disaster in the Soviet Union and the Windscale fire in the United Kingdom, both in 1957.
The first major accident at a nuclear reactor in the USA occurred in 1961 at the SL-1, a U.S. Army experimental nuclear power reactor at the Idaho National Laboratory. An uncontrolled chain reaction resulted in a steam explosion which killed the three crew members and caused a meltdown.
Another serious accident happened in 1968, when one of the two liquid-metal-cooled reactors on board the Soviet submarine K-27 underwent a fuel element failure, with the emission of gaseous fission products into the surrounding air, resulting in 9 crew fatalities and 83 injuries.
The Fukushima Daiichi nuclear accident was caused by the 2011 Tohoku earthquake and tsunami. The accident has not caused any radiation-related deaths but resulted in radioactive contamination of surrounding areas. The difficult cleanup operation is expected to cost tens of billions of dollars over 40 or more years.
The Three Mile Island accident in 1979 was a smaller scale accident, rated at INES level 5. There were no direct or indirect deaths caused by the accident.
The impact of nuclear accidents is controversial. According to Benjamin K. Sovacool, fission energy accidents ranked first among energy sources in terms of their total economic cost, accounting for 41 percent of all property damage attributed to energy accidents.
Another analysis found that coal, oil, liquid petroleum gas and hydroelectric accidents (primarily due to the Banqiao Dam disaster) have resulted in greater economic impacts than nuclear power accidents. The study compares latent cancer deaths attributable to nuclear with immediate deaths from other energy sources per unit of energy generated, and does not include cancer and other indirect deaths caused by fossil fuel consumption in its "severe accident" (an accident with more than five fatalities) classification.
The Chernobyl accident in 1986 caused approximately 50 deaths from direct and indirect effects, and some temporary serious injuries from acute radiation syndrome. The future predicted mortality from increases in cancer rates is estimated at 4,000 in the decades to come. However, the costs have been large and are increasing.
Nuclear power works under an insurance framework that limits or structures accident liabilities in accordance with national and international conventions. It is often argued that this potential shortfall in liability represents an external cost not included in the cost of nuclear electricity. This cost is small, amounting to about 0.1% of the levelized cost of electricity, according to a study by the Congressional Budget Office in the United States.
These beyond-regular insurance costs for worst-case scenarios are not unique to nuclear power. Hydroelectric power plants are similarly not fully insured against a catastrophic event such as dam failures. For example, the failure of the Banqiao Dam caused the death of an estimated 30,000 to 200,000 people, and 11 million people lost their homes. As private insurers base dam insurance premiums on limited scenarios, major disaster insurance in this sector is likewise provided by the state.
Attacks and sabotage:
Main articles:
- Vulnerability of nuclear plants to attack,
- Nuclear terrorism,
- and Nuclear safety in the United States
Terrorists could target nuclear power plants in an attempt to release radioactive contamination into the community. The United States 9/11 Commission has said that nuclear power plants were potential targets originally considered for the September 11, 2001 attacks.
An attack on a reactor's spent fuel pool could also be serious, as these pools are less protected than the reactor core. The release of radioactivity could lead to thousands of near-term deaths and greater numbers of long-term fatalities.
In the United States, the NRC carries out "Force on Force" (FOF) exercises at all nuclear power plant sites at least once every three years. In the United States, plants are surrounded by a double row of tall fences which are electronically monitored. The plant grounds are patrolled by a sizeable force of armed guards.
Insider sabotage is also a threat because insiders can observe and work around security measures. Successful insider crimes depended on the perpetrators' observation and knowledge of security vulnerabilities.
A fire caused 5–10 million dollars worth of damage to New York's Indian Point Energy Center in 1971. The arsonist was a plant maintenance worker.
Proliferation:
Further information: Nuclear proliferation
See also: Plutonium Management and Disposition Agreement
Nuclear proliferation is the spread of nuclear weapons, fissionable material, and weapons-related nuclear technology to states that do not already possess nuclear weapons. Many technologies and materials associated with the creation of a nuclear power program have a dual-use capability, in that they can also be used to make nuclear weapons. For this reason, nuclear power presents proliferation risks.
A nuclear power program can become a route leading to a nuclear weapon. An example of this is the concern over Iran's nuclear program. The re-purposing of civilian nuclear industries for military purposes would be a breach of the Non-Proliferation Treaty, to which 190 countries adhere.
As of April 2012, there are thirty-one countries that have civil nuclear power plants, of which nine have nuclear weapons. The vast majority of these nuclear weapons states produced weapons before building commercial nuclear power stations.
A fundamental goal for global security is to minimize the nuclear proliferation risks associated with the expansion of nuclear power. The Global Nuclear Energy Partnership was an international effort to create a distribution network in which developing countries in need of energy would receive nuclear fuel at a discounted rate, in exchange for that nation agreeing to forgo their own indigenous development of a uranium enrichment program.
The France-based Eurodif/European Gaseous Diffusion Uranium Enrichment Consortium is a program that successfully implemented this concept, with Spain and other countries without enrichment facilities buying a share of the fuel produced at the French-controlled enrichment facility, but without a transfer of technology. Iran was an early participant from 1974 and remains a shareholder of Eurodif via Sofidif.
A 2009 United Nations report said that: "the revival of interest in nuclear power could result in the worldwide dissemination of uranium enrichment and spent fuel reprocessing technologies, which present obvious risks of proliferation as these technologies can produce fissile materials that are directly usable in nuclear weapons."
On the other hand, power reactors can also reduce nuclear weapons arsenals when military-grade nuclear materials are reprocessed to be used as fuel in nuclear power plants.
The Megatons to Megawatts Program is considered the single most successful non-proliferation program to date. Up to 2005, the program had processed $8 billion of highly enriched, weapons-grade uranium into low enriched uranium suitable as nuclear fuel for commercial fission reactors by diluting it with natural uranium. This corresponds to the elimination of 10,000 nuclear weapons.
For approximately two decades, this material generated nearly 10 percent of all the electricity consumed in the United States, or about half of all U.S. nuclear electricity, with a total of around 7,000 TWh of electricity produced. In total it is estimated to have cost $17 billion, a "bargain for US ratepayers", with Russia profiting $12 billion from the deal.
This was much-needed profit for the Russian nuclear oversight industry, which, after the collapse of the Soviet economy, had difficulty paying for the maintenance and security of the Russian Federation's highly enriched uranium and warheads.
The Megatons to Megawatts Program was hailed as a major success by anti-nuclear weapon advocates as it has largely been the driving force behind the sharp reduction in the number of nuclear weapons worldwide since the cold war ended. However, without an increase in nuclear reactors and greater demand for fissile fuel, the cost of dismantling and down blending has dissuaded Russia from continuing their disarmament. As of 2013 Russia appears to not be interested in extending the program.
Environmental impact:
Main article: Environmental impact of nuclear power
As a low-carbon energy source with relatively small land-use requirements, nuclear energy can have a positive environmental impact. It also requires a constant supply of significant amounts of water and affects the environment through mining and milling.
Its largest potential negative environmental impacts may arise from:
- its transgenerational risks of nuclear weapons proliferation, which may increase the risk of their use in the future,
- problems associated with the management of radioactive waste, such as groundwater contamination,
- accidents,
- and various forms of attacks on waste storage sites, reprocessing plants, and power plants.
However, these remain mostly risks, as historically there have been only a few disasters at nuclear power plants with known, relatively substantial environmental impacts.
Carbon emissions:
Nuclear power is one of the leading low carbon power generation methods of producing electricity, and in terms of total life-cycle greenhouse gas emissions per unit of energy generated, has emission values comparable to or lower than renewable energy.
A 2014 analysis of the carbon footprint literature by the Intergovernmental Panel on Climate Change (IPCC) reported that the embodied total life-cycle emission intensity of nuclear power has a median value of 12 g CO2eq/kWh, which is the lowest among all commercial baseload energy sources.
This is contrasted with coal and natural gas at 820 and 490 g CO2 eq/kWh. As of 2021, nuclear reactors worldwide have helped avoid the emission of 72 billion tonnes of carbon dioxide since 1970, compared to coal-fired electricity generation, according to a report.
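A back-of-the-envelope version of this comparison multiplies annual nuclear generation by the difference in life-cycle emission intensities. The sketch below uses the median intensities and the 2019 generation figure quoted above; it is an illustration of the arithmetic, not a published accounting.

```python
# Back-of-the-envelope sketch: CO2 avoided in one year if nuclear generation
# displaces coal generation, using the median intensities quoted above.
NUCLEAR_TWH_2019 = 2586.0
INTENSITY_G_PER_KWH = {"coal": 820.0, "nuclear": 12.0}  # life-cycle medians

kwh = NUCLEAR_TWH_2019 * 1e9
avoided_tonnes = kwh * (INTENSITY_G_PER_KWH["coal"]
                        - INTENSITY_G_PER_KWH["nuclear"]) / 1e6
print(f"~{avoided_tonnes/1e9:.1f} billion tonnes of CO2 avoided per year vs coal")
```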
Radiation:
The average dose from natural background radiation is 2.4 millisievert per year (mSv/a) globally. It varies between 1 mSv/a and 13 mSv/a, depending mostly on the geology of the location.
According to the United Nations (UNSCEAR), regular nuclear power plant operations, including the nuclear fuel cycle, increases this amount by 0.0002 mSv/a of public exposure as a global average.
The average dose from operating nuclear power plants to the local populations around them is less than 0.0001 mSv/a. For comparison, the average dose to those living within 50 miles of a coal power plant is over three times this dose, at 0.0003 mSv/a.
Chernobyl resulted in the most affected surrounding populations and male recovery personnel receiving an average initial dose of 50 to 100 mSv over a few hours to weeks. The remaining global legacy of the worst nuclear power plant accident is an average exposure of 0.002 mSv/a, which continues to drop as the contamination decays, down from an initial high of 0.04 mSv per person averaged over the entire populace of the Northern Hemisphere in 1986, the year of the accident.
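The quoted contributions can be put side by side as fractions of the natural background, as in the short sketch below, which simply restates the figures given above.

```python
# Small sketch comparing the annual dose contributions quoted above
# (millisievert per year), expressed as fractions of natural background.
DOSES_MSV_PER_YEAR = {
    "natural background (global average)": 2.4,
    "nuclear fuel cycle, global average": 0.0002,
    "living near a nuclear plant": 0.0001,
    "living within 50 miles of a coal plant": 0.0003,
    "Chernobyl global legacy (recent)": 0.002,
}
background = DOSES_MSV_PER_YEAR["natural background (global average)"]
for source, dose in DOSES_MSV_PER_YEAR.items():
    print(f"{source:<42} {dose:>8.4f} mSv/a "
          f"({dose/background:.4%} of background)")
```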
Debate:
The nuclear power debate concerns the controversy which has surrounded the deployment and use of nuclear fission reactors to generate electricity from nuclear fuel for civilian purposes.
Proponents of nuclear energy regard it as a sustainable energy source that reduces carbon emissions and increases energy security by decreasing dependence on other energy sources that are also often dependent on imports. For example, proponents note that nuclear-generated electricity annually avoids 470 million metric tons of carbon dioxide emissions that would otherwise come from fossil fuels.
Additionally, the comparatively small amount of waste that nuclear energy does create is safely disposed of by large-scale nuclear energy production facilities or is repurposed/recycled for other energy uses. M. King Hubbert, who popularized the concept of peak oil, saw oil as a resource that would run out and considered nuclear energy its replacement.
Proponents also claim that the present quantity of nuclear waste is small and can be reduced through the latest technology of newer reactors and that the operational safety record of fission-electricity in terms of deaths is so far "unparalleled".
Kharecha and Hansen estimated that "global nuclear power has prevented an average of 1.84 million air pollution-related deaths and 64 gigatonnes of CO2-equivalent (GtCO2-eq) greenhouse gas (GHG) emissions that would have resulted from fossil fuel burning" and, if continued, it could prevent up to 7 million deaths and 240 GtCO2-eq emissions by 2050.
Proponents also bring to attention the opportunity cost of utilizing other forms of electricity.
For example, the Environmental Protection Agency estimates that coal kills 30,000 people a year, as a result of its environmental impact, while 60 people died in the Chernobyl disaster. A real world example of impact provided by proponents is the 650,000 ton increase in carbon emissions in the two months following the closure of the Vermont Yankee nuclear plant.
Opponents believe that nuclear power poses many threats to people's health and the environment, such as the risk of nuclear weapons proliferation, the challenge of long-term safe waste management, and the possibility of terrorism. They also contend that nuclear power plants are complex systems where many things can and have gone wrong.
Costs of the Chernobyl disaster amount to ≈$68 billion as of 2019 and are increasing, the Fukushima disaster is estimated to cost taxpayers ~$187 billion, and radioactive waste management is estimated to cost the EU nuclear operators ~$250 billion by 2050.
However, in countries that already use nuclear energy, when not considering reprocessing, intermediate nuclear waste disposal costs could be relatively fixed to certain but unknown degrees "as the main part of these costs stems from the operation of the intermediate storage facility".
Critics find that one of the largest drawbacks to building new nuclear fission power plants are the large construction and operating costs when compared to alternatives of sustainable energy sources. Further costs include costs for ongoing research and development, expensive reprocessing in cases where such is practiced and decommissioning.
Proponents note that focussing on the Levelized Cost of Energy (LCOE), however, ignores the value premium associated with 24/7 dispatchable electricity and the cost of storage and backup systems necessary to integrate variable energy sources into a reliable electrical grid.
"Nuclear thus remains the dispatchable low-carbon technology with the lowest expected costs in 2025. Only large hydro reservoirs can provide a similar contribution at comparable costs but remain highly dependent on the natural endowments of individual countries."
Overall, many opponents find that nuclear energy cannot meaningfully contribute to climate change mitigation. In general, they find it to be:
- too dangerous,
- too expensive,
- too slow to deploy,
- an obstacle to achieving a transition towards sustainability and carbon-neutrality,
- and effectively a distracting competition for resources (human, financial, time, infrastructure and expertise) needed for the deployment and development of alternative, sustainable energy system technologies, such as wind, ocean and solar (including, e.g., floating solar), as well as ways to manage their intermittency other than nuclear baseload generation, such as dispatchable generation, renewables diversification, super grids, flexible energy demand, supply-regulating smart grids, and energy storage technologies.
Nevertheless, there is ongoing research and debate over the costs of new nuclear power, especially in regions where, among other things, seasonal energy storage is difficult to provide and which aim to phase out fossil fuels in favor of low-carbon power faster than the global average.
Some find that the transition costs for a 100% renewables-based European energy system that has completely phased out nuclear energy could be higher by 2050, based on current technologies (i.e. not considering potential advances in, for example, green hydrogen, transmission and flexibility capacities, ways to reduce energy needs, geothermal energy and fusion energy), when the grid only extends across Europe. Arguments of economics and safety are used by both sides of the debate.
Comparison with renewable energy:
See also: Renewable energy debate
Slowing global warming requires a transition to a low-carbon economy, mainly by burning far less fossil fuel. Limiting global warming to 1.5 °C is technically possible if no new fossil fuel power plants are built from 2019.
This has generated considerable interest and dispute in determining the best path forward to rapidly replace fossil-based fuels in the global energy mix, with intense academic debate.
The IEA has sometimes said that countries without nuclear power should develop it as well as their renewable power.
Several studies suggest that it might be theoretically possible to cover a majority of world energy generation with new renewable sources. The Intergovernmental Panel on Climate Change (IPCC) has said that if governments were supportive, renewable energy supply could account for close to 80% of the world's energy use by 2050.
While in developed nations the economically feasible geography for new hydropower is lacking, with every geographically suitable area largely already exploited, some proponents of wind and solar energy claim these resources alone could eliminate the need for nuclear power.
Nuclear power is comparable to, and in some cases lower, than many renewable energy sources in terms of lives lost in the past per unit of electricity delivered.
Compared with renewable energy technologies (depending on how they are recycled), nuclear reactors may produce a much smaller volume of waste, although that waste is much more toxic, more expensive to manage, and longer-lived. A nuclear plant also needs to be disassembled and removed, and much of the disassembled nuclear plant needs to be stored as low-level nuclear waste for a few decades.
The disposal and management of the wide variety of radioactive waste, of which there are over one quarter of a million tons as of 2018, can cause damage and costs across the world for hundreds of thousands of years, possibly over a million years, due to issues such as leakage, malign retrieval, vulnerability to attack (including of reprocessing and power plants), groundwater contamination, radiation and leakage above ground, brine leakage, or bacterial corrosion.
The European Commission Joint Research Centre found that as of 2021 the necessary technologies for geological disposal of nuclear waste are now available and can be deployed. Corrosion experts noted in 2020 that putting the problem of storage off any longer "isn't good for anyone".
Separated plutonium and enriched uranium could be used for nuclear weapons, which – even with the current centralized control (e.g. state-level) and level of prevalence – are considered to be a difficult and substantial global risk for substantial future impacts on human health, lives, civilization and the environment.
Speed of transition and investment needed:
Analysis in 2015 by professor Barry W. Brook and colleagues found that nuclear energy could displace or remove fossil fuels from the electric grid completely within 10 years. This finding was based on the historically modest and proven rate at which nuclear energy was added in France and Sweden during their building programs in the 1980s.
In a similar analysis, Brook had earlier determined that 50% of all global energy, including transportation synthetic fuels etc., could be generated within approximately 30 years if the global nuclear fission build rate was identical to historical proven installation rates calculated in GW per year per unit of global GDP (GW/year/$). This is in contrast to the conceptual studies for 100% renewable energy systems, which would require an order of magnitude more costly global investment per year, which has no historical precedent.
These renewable scenarios would also need far greater land devoted to onshore wind and onshore solar projects. Brook notes that the "principal limitations on nuclear fission are not technical, economic or fuel-related, but are instead linked to complex issues of societal acceptance, fiscal and political inertia, and inadequate critical evaluation of the real-world constraints facing [the other] low-carbon alternatives."
Scientific data indicate that, assuming 2021 emission levels, humanity has a carbon budget equivalent to only 11 years of emissions left to limit warming to 1.5 °C. Construction of new nuclear reactors took a median of 7.2–10.9 years in 2018–2020, substantially longer than scaling up the deployment of wind and solar alongside other measures, especially for novel reactor types, and such construction is more risky, often delayed, and more dependent on state support.
Researchers have cautioned that novel nuclear technologies, which have been in development for decades, are less tested, have higher proliferation risks, present new safety problems, are often far from commercialization, and are more expensive, will not be available in time.
Critics of nuclear energy often only oppose nuclear fission energy but not nuclear fusion; however, fusion energy is unlikely to be commercially widespread before 2050.
Land use:
The median land area used by US nuclear power stations per 1 GW installed capacity is 1.3 square miles. To generate the same amount of electricity annually (taking into account capacity factors) from solar PV would require about 60 square miles, and from a wind farm about 310 square miles.
Not included in this is land required for the associated transmission lines, water supply, rail lines, mining and processing of nuclear fuel, and for waste disposal.
Research:
Advanced fission reactor designs:
Main article: Generation IV reactor
Current fission reactors in operation around the world are second or third generation systems, with most of the first-generation systems having been already retired. Research into advanced generation IV reactor types was officially started by the Generation IV International Forum (GIF) based on eight technology goals, including to improve economics, safety, proliferation resistance, natural resource utilization and the ability to consume existing nuclear waste in the production of electricity.
Most of these reactors differ significantly from current operating light water reactors, and are expected to be available for commercial construction after 2030.
Hybrid fusion-fission:
Main article: Nuclear fusion–fission hybrid
Hybrid nuclear power is a proposed means of generating power by the use of a combination of nuclear fusion and fission processes. The concept dates to the 1950s and was briefly advocated by Hans Bethe during the 1970s, but largely remained unexplored until a revival of interest in 2009, due to delays in the realization of pure fusion.
When a sustained nuclear fusion power plant is built, it may be capable of extracting all the fission energy that remains in spent fission fuel, reducing the volume of nuclear waste by orders of magnitude and, more importantly, eliminating all actinides present in the spent fuel, substances which cause security concerns.
Fusion:
Main articles: Nuclear fusion and Fusion power
Nuclear fusion reactions have the potential to be safer and generate less radioactive waste than fission. These reactions appear potentially viable, though technically quite difficult and have yet to be created on a scale that could be used in a functional power plant.
Fusion power has been under theoretical and experimental investigation since the 1950s. Nuclear fusion research is underway but fusion energy is not likely to be commercially widespread before 2050.
Several experimental nuclear fusion reactors and facilities exist. The largest and most ambitious international nuclear fusion project currently in progress is ITER, a large tokamak under construction in France.
ITER is planned to pave the way for commercial fusion power by demonstrating self-sustained nuclear fusion reactions with positive energy gain. Construction of the ITER facility began in 2007, but the project has run into many delays and budget overruns.
The facility is now not expected to begin operations until 2027, 11 years later than initially anticipated. A follow-on commercial nuclear fusion power station, DEMO, has been proposed. There are also suggestions for a power plant based upon a different fusion approach, that of an inertial fusion power plant.
Fusion-powered electricity generation was initially believed to be readily achievable, as fission-electric power had been. However, the extreme requirements for continuous reactions and plasma containment led to projections being extended by several decades. In 2020, more than 80 years after the first attempts, commercialization of fusion power production was thought to be unlikely before 2050.
See also:
- Atomic battery
- Nuclear power by country
- Nuclear weapons debate
- Pro-nuclear movement
- Thorium-based nuclear power
- Uranium mining debate
- World energy consumption
- U.S. Energy Information Administration
- Nuclear Fuel Cycle Cost Calculator
Nuclear Weapons including a List of nuclear weapons
- YouTube Video: The threat of nuclear weapons, explained | JUST THE FAQS
- YouTube Video: The Atomic Bombings of Hiroshima and Nagasaki - Animated
- YouTube Video: What a Nuclear Bomb Explosion Feels Like
TOP: Comparing the size of U.S. nuclear weapons over time.
BOTTOM: Remnants of the Japanese City of Hiroshima after the Nuclear Explosion
Nuclear weapon (Click here for a List of nuclear weapons)
A nuclear weapon is an explosive device that derives its destructive force from nuclear reactions, either fission (fission bomb) or a combination of fission and fusion reactions (thermonuclear bomb), producing a nuclear explosion. Both bomb types release large quantities of energy from relatively small amounts of matter.
The first test of a fission ("atomic") bomb released an amount of energy approximately equal to 20,000 tons of TNT (84 TJ). The first thermonuclear ("hydrogen") bomb test released energy approximately equal to 10 million tons of TNT (42 PJ).
Nuclear bombs have had yields between 10 tons TNT (the W54) and 50 megatons for the Tsar Bomba (see TNT equivalent). A thermonuclear weapon weighing as little as 600 pounds (270 kg) can release energy equal to more than 1.2 megatonnes of TNT (5.0 PJ).
A nuclear device no larger than a conventional bomb can devastate an entire city by blast, fire, and radiation. Since they are weapons of mass destruction, the proliferation of nuclear weapons is a focus of international relations policy. Nuclear weapons have been deployed twice in war, by the United States against the Japanese cities of Hiroshima and Nagasaki in 1945 during World War II.
Testing and deployment:
Nuclear weapons have only twice been used in warfare, both times by the United States against Japan at the end of World War II. On August 6, 1945, the United States Army Air Forces (USAAF) detonated a uranium gun-type fission bomb nicknamed "Little Boy" over the Japanese city of Hiroshima; three days later, on August 9, the USAAF detonated a plutonium implosion-type fission bomb nicknamed "Fat Man" over the Japanese city of Nagasaki.
These bombings caused injuries that resulted in the deaths of approximately 200,000 civilians and military personnel. The ethics of these bombings and their role in Japan's surrender are subjects of debate.
Since the atomic bombings of Hiroshima and Nagasaki, nuclear weapons have been detonated over 2,000 times for testing and demonstration. Only a few nations possess such weapons or are suspected of seeking them.
The only countries known to have detonated nuclear weapons, and that acknowledge possessing them, are (chronologically by date of first test): the United States, Russia (as the successor of the Soviet Union), the United Kingdom, France, China, India, Pakistan, and North Korea.
Israel is believed to possess nuclear weapons, though, in a policy of deliberate ambiguity, it does not acknowledge having them.
Germany, Italy, Turkey, Belgium and the Netherlands are nuclear weapons sharing states. South Africa is the only country to have independently developed and then renounced and dismantled its nuclear weapons.
The Treaty on the Non-Proliferation of Nuclear Weapons aims to reduce the spread of nuclear weapons, but its effectiveness has been questioned. Modernisation of weapons continues to this day.
Types of Nuclear Weapons
Main article: Nuclear weapon design
There are two basic types of nuclear weapons: those that derive the majority of their energy from nuclear fission reactions alone, and those that use fission reactions to begin nuclear fusion reactions that produce a large amount of the total energy output.
Fission weapons:
All existing nuclear weapons derive some of their explosive energy from nuclear fission reactions. Weapons whose explosive output is exclusively from fission reactions are commonly referred to as atomic bombs or atom bombs (abbreviated as A-bombs). This has long been noted as something of a misnomer, as their energy comes from the nucleus of the atom, just as it does with fusion weapons.
In fission weapons, a mass of fissile material (enriched uranium or plutonium) is forced into supercriticality—allowing an exponential growth of nuclear chain reactions—either by shooting one piece of sub-critical material into another (the "gun" method) or by compression of a sub-critical sphere or cylinder of fissile material using chemically fueled explosive lenses.
The latter approach, the "implosion" method, is more sophisticated and more efficient (smaller, less massive, and requiring less of the expensive fissile fuel) than the former.
A major challenge in all nuclear weapon designs is to ensure that a significant fraction of the fuel is consumed before the weapon destroys itself. The amount of energy released by fission bombs can range from the equivalent of just under a ton to upwards of 500,000 tons (500 kilotons) of TNT (4.2 to 2.1×106 GJ).
All fission reactions generate fission products, the remains of the split atomic nuclei. Many fission products are either highly radioactive (but short-lived) or moderately radioactive (but long-lived), and as such, they are a serious form of radioactive contamination. Fission products are the principal radioactive component of nuclear fallout.
Another source of radioactivity is the burst of free neutrons produced by the weapon. When they collide with other nuclei in the surrounding material, the neutrons transmute those nuclei into other isotopes, altering their stability and making them radioactive.
The most commonly used fissile materials for nuclear weapons applications have been uranium-235 and plutonium-239. Less commonly used has been uranium-233. Neptunium-237 and some isotopes of americium may be usable for nuclear explosives as well, but it is not clear that this has ever been implemented, and their plausible use in nuclear weapons is a matter of dispute.
Fusion weapons:
Main article: Thermonuclear weapon
The other basic type of nuclear weapon produces a large proportion of its energy in nuclear fusion reactions. Such fusion weapons are generally referred to as thermonuclear weapons or more colloquially as hydrogen bombs (abbreviated as H-bombs), as they rely on fusion reactions between isotopes of hydrogen (deuterium and tritium).
All such weapons derive a significant portion of their energy from fission reactions used to "trigger" fusion reactions, and fusion reactions can themselves trigger additional fission reactions.
Only six countries—the United States, Russia, the United Kingdom, China, France, and India—have conducted thermonuclear weapon tests. Whether India has detonated a "true" multi-staged thermonuclear weapon is controversial. North Korea claims to have tested a fusion weapon as of January 2016, though this claim is disputed.
Thermonuclear weapons are considered much more difficult to successfully design and execute than primitive fission weapons. Almost all of the nuclear weapons deployed today use the thermonuclear design because it is more efficient.
Thermonuclear bombs work by using the energy of a fission bomb to compress and heat fusion fuel. In the Teller-Ulam design, which accounts for all multi-megaton yield hydrogen bombs, this is accomplished by placing a fission bomb and fusion fuel (tritium, deuterium, or lithium deuteride) in proximity within a special, radiation-reflecting container. When the fission bomb is detonated, gamma rays and X-rays emitted first compress the fusion fuel, then heat it to thermonuclear temperatures.
The ensuing fusion reaction creates enormous numbers of high-speed neutrons, which can then induce fission in materials not normally prone to it, such as depleted uranium. Each of these components is known as a "stage", with the fission bomb as the "primary" and the fusion capsule as the "secondary". In large, megaton-range hydrogen bombs, about half of the yield comes from the final fissioning of depleted uranium.
Virtually all thermonuclear weapons deployed today use the "two-stage" design described above, but it is possible to add additional fusion stages—each stage igniting a larger amount of fusion fuel in the next stage. This technique can be used to construct thermonuclear weapons of arbitrarily large yield.
This is in contrast to fission bombs, which are limited in their explosive power due to criticality danger (premature nuclear chain reaction caused by too-large amounts of pre-assembled fissile fuel). The largest nuclear weapon ever detonated, the Tsar Bomba of the USSR, which released an energy equivalent of over 50 megatons of TNT (210 PJ), was a three-stage weapon. Most thermonuclear weapons are considerably smaller than this, due to practical constraints from missile warhead space and weight requirements.
Fusion reactions do not create fission products, and thus contribute far less to the creation of nuclear fallout than fission reactions, but because all thermonuclear weapons contain at least one fission stage, and many high-yield thermonuclear devices have a final fission stage, thermonuclear weapons can generate at least as much nuclear fallout as fission-only weapons.
Furthermore, high yield thermonuclear explosions (most dangerously ground bursts) have the force to lift radioactive debris upwards past the tropopause into the stratosphere, where the calm non-turbulent winds permit the debris to travel great distances from the burst, eventually settling and unpredictably contaminating areas far removed from the target of the explosion.
Other types of Nuclear Weapons:
Main articles below:
There are other types of nuclear weapons as well. For example, a boosted fission weapon is a fission bomb that increases its explosive yield through a small number of fusion reactions, but it is not a fusion bomb. In the boosted bomb, the neutrons produced by the fusion reactions serve primarily to increase the efficiency of the fission bomb.
There are two types of boosted fission bomb: internally boosted, in which a deuterium-tritium mixture is injected into the bomb core, and externally boosted, in which concentric shells of lithium-deuteride and depleted uranium are layered on the outside of the fission bomb core.
The external method of boosting enabled the USSR to field the first partially-thermonuclear weapons, but it is now obsolete because it demands a spherical bomb geometry, which was adequate during the 1950s arms race when bomber aircraft were the only available delivery vehicles.
The detonation of any nuclear weapon is accompanied by a blast of neutron radiation. Surrounding a nuclear weapon with suitable materials (such as cobalt or gold) creates a weapon known as a salted bomb. This device can produce exceptionally large quantities of long-lived radioactive contamination. It has been conjectured that such a device could serve as a "doomsday weapon" because such a large quantity of radioactivities with half-lives of decades, lifted into the stratosphere where winds would distribute it around the globe, would make all life on the planet extinct.
In connection with the Strategic Defense Initiative, research into the nuclear pumped laser was conducted under the DOD program Project Excalibur but this did not result in a working weapon. The concept involves the tapping of the energy of an exploding nuclear bomb to power a single-shot laser that is directed at a distant target.
During the Starfish Prime high-altitude nuclear test in 1962, an unexpected effect was produced which is called a nuclear electromagnetic pulse. This is an intense flash of electromagnetic energy produced by a rain of high-energy electrons which in turn are produced by a nuclear bomb's gamma rays.
This flash of energy can permanently destroy or disrupt electronic equipment if insufficiently shielded. It has been proposed to use this effect to disable an enemy's military and civilian infrastructure as an adjunct to other nuclear or conventional military operations. By itself it could as well be useful to terrorists for crippling a nation's economic electronics-based infrastructure.
Because the effect is most effectively produced by high altitude nuclear detonations (by military weapons delivered by air, though ground bursts also produce EMP effects over a localized area), it can produce damage to electronics over a wide, even continental, geographical area.
Research has been done into the possibility of pure fusion bombs: nuclear weapons that consist of fusion reactions without requiring a fission bomb to initiate them. Such a device might provide a simpler path to thermonuclear weapons than one that required the development of fission weapons first, and pure fusion weapons would create significantly less nuclear fallout than other thermonuclear weapons because they would not disperse fission products.
In 1998, the United States Department of Energy divulged that the United States had, "...made a substantial investment" in the past to develop pure fusion weapons, but that, "The U.S. does not have and is not developing a pure fusion weapon", and that, "No credible design for a pure fusion weapon resulted from the DOE investment".
Nuclear isomers provide a possible pathway to fissionless fusion bombs. These are naturally occurring isotopes (178m2Hf being a prominent example) which exist in an elevated energy state. Mechanisms to release this energy as bursts of gamma radiation (as in the hafnium controversy) have been proposed as possible triggers for conventional thermonuclear reactions.
Antimatter, which consists of particles resembling ordinary matter particles in most of their properties but having opposite electric charge, has been considered as a trigger mechanism for nuclear weapons. A major obstacle is the difficulty of producing antimatter in large enough quantities, and there is no evidence that it is feasible beyond the military domain.
However, the U.S. Air Force funded studies of the physics of antimatter in the Cold War, and began considering its possible use in weapons, not just as a trigger, but as the explosive itself. A fourth generation nuclear weapon design is related to, and relies upon, the same principle as antimatter-catalyzed nuclear pulse propulsion.
Most variation in nuclear weapon design is for the purpose of achieving different yields for different situations, and in manipulating design elements to attempt to minimize weapon size, radiation hardness or requirements for special materials, especially fissile fuel or tritium.
Tactical nuclear weapons:
Some nuclear weapons are designed for special purposes; most of these are for non-strategic (decisively war-winning) purposes and are referred to as tactical nuclear weapons.
The neutron bomb purportedly conceived by Sam Cohen is a thermonuclear weapon that yields a relatively small explosion but a relatively large amount of neutron radiation. Such a weapon could, according to tacticians, be used to cause massive biological casualties while leaving inanimate infrastructure mostly intact and creating minimal fallout.
Because high energy neutrons are capable of penetrating dense matter, such as tank armor, neutron warheads were procured in the 1980s (though not deployed in Europe, as intended, over the objections of NATO allies) for use as tactical payloads for US Army artillery shells (200 mm W79 and 155 mm W82) and short range missile forces. Soviet authorities announced similar intentions for neutron warhead deployment in Europe; indeed claimed to have originally invented the neutron bomb, but their deployment on USSR tactical nuclear forces is unverifiable.
A type of nuclear explosive most suitable for use by ground special forces was the Special Atomic Demolition Munition, or SADM, sometimes popularly known as a suitcase nuke.
This is a nuclear bomb that is man-portable, or at least truck-portable, and though of a relatively small yield (one or two kilotons) is sufficient to destroy important tactical targets such as bridges, dams, tunnels, important military or commercial installations, etc. either behind enemy lines or pre-emptively on friendly territory soon to be overtaken by invading enemy forces.
These weapons require plutonium fuel and are particularly "dirty". They also demand especially stringent security precautions in their storage and deployment.
Small "tactical" nuclear weapons were deployed for use as antiaircraft weapons. Examples include the USAF AIR-2 Genie, the AIM-26 Falcon and US Army Nike Hercules. Missile interceptors such as the Sprint and the Spartan also used small nuclear warheads (optimized to produce neutron or X-ray flux) but were for use against enemy strategic warheads.
Other small, or tactical, nuclear weapons were deployed by naval forces for use primarily as antisubmarine weapons. These included nuclear depth bombs or nuclear armed torpedoes.
Nuclear mines for use on land or at sea are also possibilities.
Weapons delivery:
See also:
The system used to deliver a nuclear weapon to its target is an important factor affecting both nuclear weapon design and nuclear strategy. The design, development, and maintenance of delivery systems are among the most expensive parts of a nuclear weapons program; they account, for example, for 57% of the financial resources spent by the United States on nuclear weapons projects since 1940.
The simplest method for delivering a nuclear weapon is a gravity bomb dropped from aircraft; this was the method used by the United States against Japan. This method places few restrictions on the size of the weapon. It does, however, limit attack range, response time to an impending attack, and the number of weapons that a country can field at the same time.
With miniaturization, nuclear bombs can be delivered by both strategic bombers and tactical fighter-bombers. This method is the primary means of nuclear weapons delivery; the majority of U.S. nuclear warheads, for example, are free-fall gravity bombs, namely the B61.
Pictured below: Montage of an inert test of a United States Trident SLBM (submarine launched ballistic missile), from submerged to the terminal, or re-entry phase, of the multiple independently targetable reentry vehicles.
A nuclear weapon is an explosive device that derives its destructive force from nuclear reactions, either fission (fission bomb) or a combination of fission and fusion reactions (thermonuclear bomb), producing a nuclear explosion. Both bomb types release large quantities of energy from relatively small amounts of matter.
The first test of a fission ("atomic") bomb released an amount of energy approximately equal to 20,000 tons of TNT (84 TJ). The first thermonuclear ("hydrogen") bomb test released energy approximately equal to 10 million tons of TNT (42 PJ).
Nuclear bombs have had yields ranging from 10 tons of TNT (the W54) to 50 megatons for the Tsar Bomba (see TNT equivalent). A thermonuclear weapon weighing as little as 600 pounds (270 kg) can release energy equal to more than 1.2 megatons of TNT (5.0 PJ).
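As a quick check of these figures, using only the standard convention that one ton of TNT is equivalent to 4.184 GJ (a worked example, not part of the original text):
\[
2.0\times10^{4}\ \mathrm{t} \times 4.184\ \mathrm{GJ/t} \approx 84\ \mathrm{TJ},\qquad
1.0\times10^{7}\ \mathrm{t} \times 4.184\ \mathrm{GJ/t} \approx 42\ \mathrm{PJ},\qquad
1.2\times10^{6}\ \mathrm{t} \times 4.184\ \mathrm{GJ/t} \approx 5.0\ \mathrm{PJ}.
\]
The last case also implies a yield-to-weight ratio of roughly \(1.2\ \mathrm{Mt} / 270\ \mathrm{kg} \approx 4.4\) kilotons of TNT per kilogram of weapon mass.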
A nuclear device no larger than a conventional bomb can devastate an entire city by blast, fire, and radiation. Since they are weapons of mass destruction, the proliferation of nuclear weapons is a focus of international relations policy. Nuclear weapons have been used twice in war, both times by the United States against the Japanese cities of Hiroshima and Nagasaki in 1945 during World War II.
Testing and deployment:
Nuclear weapons have only twice been used in warfare, both times by the United States against Japan at the end of World War II. On August 6, 1945, the United States Army Air Forces (USAAF) detonated a uranium gun-type fission bomb nicknamed "Little Boy" over the Japanese city of Hiroshima; three days later, on August 9, the USAAF detonated a plutonium implosion-type fission bomb nicknamed "Fat Man" over the Japanese city of Nagasaki.
These bombings caused injuries that resulted in the deaths of approximately 200,000 civilians and military personnel. The ethics of these bombings and their role in Japan's surrender are subjects of debate.
Since the atomic bombings of Hiroshima and Nagasaki, nuclear weapons have been detonated over 2,000 times for testing and demonstration. Only a few nations possess such weapons or are suspected of seeking them.
The only countries known to have detonated nuclear weapons—and acknowledge possessing them—are (chronologically by date of first test) the following:
- United States,
- the Soviet Union (succeeded as a nuclear power by Russia),
- the United Kingdom,
- France,
- China,
- India,
- Pakistan,
- and North Korea.
Israel is believed to possess nuclear weapons, though, in a policy of deliberate ambiguity, it does not acknowledge having them.
Germany, Italy, Turkey, Belgium and the Netherlands are nuclear weapons sharing states. South Africa is the only country to have independently developed and then renounced and dismantled its nuclear weapons.
The Treaty on the Non-Proliferation of Nuclear Weapons aims to reduce the spread of nuclear weapons, but its effectiveness has been questioned. Modernisation of weapons continues to this day.
Types of Nuclear Weapons
Main article: Nuclear weapon design
There are two basic types of nuclear weapons: those that derive the majority of their energy from nuclear fission reactions alone, and those that use fission reactions to begin nuclear fusion reactions that produce a large amount of the total energy output.
Fission weapons:
All existing nuclear weapons derive some of their explosive energy from nuclear fission reactions. Weapons whose explosive output is exclusively from fission reactions are commonly referred to as atomic bombs or atom bombs (abbreviated as A-bombs). This has long been noted as something of a misnomer, as their energy comes from the nucleus of the atom, just as it does with fusion weapons.
In fission weapons, a mass of fissile material (enriched uranium or plutonium) is forced into supercriticality—allowing an exponential growth of nuclear chain reactions—either by shooting one piece of sub-critical material into another (the "gun" method) or by compression of a sub-critical sphere or cylinder of fissile material using chemically fueled explosive lenses.
The latter approach, the "implosion" method, is more sophisticated and more efficient (smaller, less massive, and requiring less of the expensive fissile fuel) than the former.
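The phrase "exponential growth of nuclear chain reactions" can be made concrete with a standard textbook idealization (the symbols below are illustrative and do not appear in the original text): if each fission generation multiplies the neutron population by an effective factor \(k\) and generations are separated by a mean time \(\ell\), then
\[
N(t) \approx N_0\, e^{(k-1)\,t/\ell},
\]
which grows explosively once the assembly is supercritical (\(k > 1\)). Because \(\ell\) in a compressed fissile core is on the order of ten nanoseconds, the neutron population multiplies enormously within a microsecond, which is why the design challenge noted in the next paragraph is to hold the assembly together long enough for a significant fraction of the fuel to fission.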
A major challenge in all nuclear weapon designs is to ensure that a significant fraction of the fuel is consumed before the weapon destroys itself. The amount of energy released by fission bombs can range from the equivalent of just under a ton to upwards of 500,000 tons (500 kilotons) of TNT (4.2 to 2.1×10⁶ GJ).
All fission reactions generate fission products, the remains of the split atomic nuclei. Many fission products are either highly radioactive (but short-lived) or moderately radioactive (but long-lived), and as such, they are a serious form of radioactive contamination. Fission products are the principal radioactive component of nuclear fallout.
Another source of radioactivity is the burst of free neutrons produced by the weapon. When they collide with other nuclei in the surrounding material, the neutrons transmute those nuclei into other isotopes, altering their stability and making them radioactive.
The most commonly used fissile materials for nuclear weapons applications have been uranium-235 and plutonium-239. Less commonly used has been uranium-233. Neptunium-237 and some isotopes of americium may be usable for nuclear explosives as well, but it is not clear that this has ever been implemented, and their plausible use in nuclear weapons is a matter of dispute.
Fusion weapons:
Main article: Thermonuclear weapon
The other basic type of nuclear weapon produces a large proportion of its energy in nuclear fusion reactions. Such fusion weapons are generally referred to as thermonuclear weapons or more colloquially as hydrogen bombs (abbreviated as H-bombs), as they rely on fusion reactions between isotopes of hydrogen (deuterium and tritium).
All such weapons derive a significant portion of their energy from fission reactions used to "trigger" fusion reactions, and fusion reactions can themselves trigger additional fission reactions.
Only six countries—the United States, Russia, the United Kingdom, China, France, and India—have conducted thermonuclear weapon tests. Whether India has detonated a "true" multi-staged thermonuclear weapon is controversial. North Korea claims to have tested a fusion weapon as of January 2016, though this claim is disputed.
Thermonuclear weapons are considered much more difficult to successfully design and execute than primitive fission weapons. Almost all of the nuclear weapons deployed today use the thermonuclear design because it is more efficient.
Thermonuclear bombs work by using the energy of a fission bomb to compress and heat fusion fuel. In the Teller-Ulam design, which accounts for all multi-megaton yield hydrogen bombs, this is accomplished by placing a fission bomb and fusion fuel (tritium, deuterium, or lithium deuteride) in proximity within a special, radiation-reflecting container. When the fission bomb is detonated, gamma rays and X-rays emitted first compress the fusion fuel, then heat it to thermonuclear temperatures.
The ensuing fusion reaction creates enormous numbers of high-speed neutrons, which can then induce fission in materials not normally prone to it, such as depleted uranium. Each of these components is known as a "stage", with the fission bomb as the "primary" and the fusion capsule as the "secondary". In large, megaton-range hydrogen bombs, about half of the yield comes from the final fissioning of depleted uranium.
Virtually all thermonuclear weapons deployed today use the "two-stage" design described above, but it is possible to add additional fusion stages—each stage igniting a larger amount of fusion fuel in the next stage. This technique can be used to construct thermonuclear weapons of arbitrarily large yield.
This is in contrast to fission bombs, which are limited in their explosive power due to criticality danger (premature nuclear chain reaction caused by too-large amounts of pre-assembled fissile fuel). The largest nuclear weapon ever detonated, the Tsar Bomba of the USSR, which released an energy equivalent of over 50 megatons of TNT (210 PJ), was a three-stage weapon. Most thermonuclear weapons are considerably smaller than this, due to practical constraints from missile warhead space and weight requirements.
Fusion reactions do not create fission products, and thus contribute far less to the creation of nuclear fallout than fission reactions, but because all thermonuclear weapons contain at least one fission stage, and many high-yield thermonuclear devices have a final fission stage, thermonuclear weapons can generate at least as much nuclear fallout as fission-only weapons.
Furthermore, high yield thermonuclear explosions (most dangerously ground bursts) have the force to lift radioactive debris upwards past the tropopause into the stratosphere, where the calm non-turbulent winds permit the debris to travel great distances from the burst, eventually settling and unpredictably contaminating areas far removed from the target of the explosion.
Other types of Nuclear Weapons:
Main articles below:
- Boosted fission weapon,
- Neutron bomb,
- Radiological warfare,
- Induced gamma emission,
- Antimatter weapon
There are other types of nuclear weapons as well. For example, a boosted fission weapon is a fission bomb that increases its explosive yield through a small number of fusion reactions, but it is not a fusion bomb. In the boosted bomb, the neutrons produced by the fusion reactions serve primarily to increase the efficiency of the fission bomb.
There are two types of boosted fission bomb: internally boosted, in which a deuterium-tritium mixture is injected into the bomb core, and externally boosted, in which concentric shells of lithium-deuteride and depleted uranium are layered on the outside of the fission bomb core.
The external method of boosting enabled the USSR to field the first partially thermonuclear weapons, but it is now obsolete because it demands a spherical bomb geometry; this was adequate during the 1950s arms race, when bomber aircraft were the only available delivery vehicles, but it is poorly suited to the slender warheads required by modern ballistic missiles.
The detonation of any nuclear weapon is accompanied by a blast of neutron radiation. Surrounding a nuclear weapon with suitable materials (such as cobalt or gold) creates a weapon known as a salted bomb. This device can produce exceptionally large quantities of long-lived radioactive contamination. It has been conjectured that such a device could serve as a "doomsday weapon", because such a large quantity of radioactive material with half-lives of decades, lifted into the stratosphere where winds would distribute it around the globe, would make all life on the planet extinct.
In connection with the Strategic Defense Initiative, research into the nuclear pumped laser was conducted under the DOD program Project Excalibur but this did not result in a working weapon. The concept involves the tapping of the energy of an exploding nuclear bomb to power a single-shot laser that is directed at a distant target.
During the Starfish Prime high-altitude nuclear test in 1962, an unexpected effect was produced which is called a nuclear electromagnetic pulse. This is an intense flash of electromagnetic energy produced by a rain of high-energy electrons which in turn are produced by a nuclear bomb's gamma rays.
This flash of energy can permanently destroy or disrupt electronic equipment that is insufficiently shielded. It has been proposed to use this effect to disable an enemy's military and civilian infrastructure as an adjunct to other nuclear or conventional military operations. The same effect could also be useful to terrorists seeking to cripple a nation's electronics-based economic infrastructure.
Because the effect is most effectively produced by high altitude nuclear detonations (by military weapons delivered by air, though ground bursts also produce EMP effects over a localized area), it can produce damage to electronics over a wide, even continental, geographical area.
Research has been done into the possibility of pure fusion bombs: nuclear weapons that consist of fusion reactions without requiring a fission bomb to initiate them. Such a device might provide a simpler path to thermonuclear weapons than one that required the development of fission weapons first, and pure fusion weapons would create significantly less nuclear fallout than other thermonuclear weapons because they would not disperse fission products.
In 1998, the United States Department of Energy divulged that the United States had "made a substantial investment" in the past to develop pure fusion weapons, but that "The U.S. does not have and is not developing a pure fusion weapon", and that "No credible design for a pure fusion weapon resulted from the DOE investment".
Nuclear isomers provide a possible pathway to fissionless fusion bombs. These are long-lived excited states of atomic nuclei (178m2Hf being a prominent example) that store energy above the nuclear ground state. Mechanisms to release this energy as bursts of gamma radiation (as in the hafnium controversy) have been proposed as possible triggers for conventional thermonuclear reactions.
Antimatter, which consists of particles resembling ordinary-matter particles in most of their properties but having opposite electric charge, has been considered as a trigger mechanism for nuclear weapons. A major obstacle is the difficulty of producing antimatter in quantities large enough for such use to be feasible.
However, the U.S. Air Force funded studies of the physics of antimatter during the Cold War and began considering its possible use in weapons, not just as a trigger but as the explosive itself. Fourth-generation nuclear weapon designs are related to, and rely upon, the same principle as antimatter-catalyzed nuclear pulse propulsion.
Most variation in nuclear weapon design aims at achieving different yields for different situations, or at manipulating design elements to minimize weapon size, improve radiation hardness, or reduce requirements for special materials, especially fissile fuel or tritium.
Tactical nuclear weapons:
Some nuclear weapons are designed for special purposes; most of these serve non-strategic, battlefield roles rather than the decisively war-winning role of strategic weapons, and are referred to as tactical nuclear weapons.
The neutron bomb purportedly conceived by Sam Cohen is a thermonuclear weapon that yields a relatively small explosion but a relatively large amount of neutron radiation. Such a weapon could, according to tacticians, be used to cause massive biological casualties while leaving inanimate infrastructure mostly intact and creating minimal fallout.
Because high-energy neutrons are capable of penetrating dense matter, such as tank armor, neutron warheads were procured in the 1980s (though, owing to the objections of NATO allies, they were not deployed in Europe as intended) for use as tactical payloads for US Army artillery shells (the 203 mm W79 and 155 mm W82) and short-range missile forces. Soviet authorities announced similar intentions for neutron warhead deployment in Europe, and indeed claimed to have originally invented the neutron bomb, but deployment on Soviet tactical nuclear forces is unverifiable.
A type of nuclear explosive most suitable for use by ground special forces was the Special Atomic Demolition Munition, or SADM, sometimes popularly known as a suitcase nuke.
This is a nuclear bomb that is man-portable, or at least truck-portable, and though of relatively small yield (one or two kilotons), it is sufficient to destroy important tactical targets such as bridges, dams, tunnels, and important military or commercial installations, either behind enemy lines or pre-emptively on friendly territory about to be overrun by invading enemy forces.
These weapons require plutonium fuel and are particularly "dirty". They also demand especially stringent security precautions in their storage and deployment.
Small "tactical" nuclear weapons were deployed for use as antiaircraft weapons. Examples include the USAF AIR-2 Genie, the AIM-26 Falcon and US Army Nike Hercules. Missile interceptors such as the Sprint and the Spartan also used small nuclear warheads (optimized to produce neutron or X-ray flux) but were for use against enemy strategic warheads.
Other small, or tactical, nuclear weapons were deployed by naval forces for use primarily as antisubmarine weapons. These included nuclear depth bombs or nuclear armed torpedoes.
Nuclear mines for use on land or at sea are also possibilities.
Weapons delivery:
See also:
- Nuclear weapons delivery,
- Nuclear triad,
- Strategic bomber,
- Intercontinental ballistic missile,
- and Submarine-launched ballistic missile
The system used to deliver a nuclear weapon to its target is an important factor affecting both nuclear weapon design and nuclear strategy. The design, development, and maintenance of delivery systems are among the most expensive parts of a nuclear weapons program; they account, for example, for 57% of the financial resources spent by the United States on nuclear weapons projects since 1940.
The simplest method for delivering a nuclear weapon is a gravity bomb dropped from aircraft; this was the method used by the United States against Japan. This method places few restrictions on the size of the weapon. It does, however, limit attack range, response time to an impending attack, and the number of weapons that a country can field at the same time.
With miniaturization, nuclear bombs can be delivered by both strategic bombers and tactical fighter-bombers. This method is the primary means of nuclear weapons delivery; the majority of U.S. nuclear warheads, for example, are free-fall gravity bombs, namely the B61.
Pictured below: Montage of an inert test of a United States Trident SLBM (submarine launched ballistic missile), from submerged to the terminal, or re-entry phase, of the multiple independently targetable reentry vehicles.
Preferable from a strategic point of view is a nuclear weapon mounted on a missile, which can use a ballistic trajectory to deliver the warhead over the horizon. Although even short-range missiles allow for a faster and less vulnerable attack, the development of long-range intercontinental ballistic missiles (ICBMs) and submarine-launched ballistic missiles (SLBMs) has given some nations the ability to plausibly deliver missiles anywhere on the globe with a high likelihood of success.
More advanced systems, such as multiple independently targetable reentry vehicles (MIRVs), can launch multiple warheads at different targets from one missile, reducing the chance of a successful missile defense. Today, missiles are most common among systems designed for delivery of nuclear weapons. Making a warhead small enough to fit onto a missile, though, can be difficult.
Tactical weapons have involved the greatest variety of delivery types, including not only gravity bombs and missiles but also artillery shells, nuclear depth bombs and torpedoes, and nuclear land mines.
An atomic mortar has been tested by the United States. Small, two-man portable tactical weapons (somewhat misleadingly referred to as suitcase bombs), such as the Special Atomic Demolition Munition, have been developed, although the difficulty of combining sufficient yield with portability limits their military utility.
Nuclear strategy:
Main articles:
Nuclear warfare strategy is a set of policies that deal with preventing or fighting a nuclear war. The policy of trying to prevent an attack by a nuclear weapon from another country by threatening nuclear retaliation is known as the strategy of nuclear deterrence.
The goal in deterrence is to always maintain a second strike capability (the ability of a country to respond to a nuclear attack with one of its own) and potentially to strive for first strike status (the ability to destroy an enemy's nuclear forces before they could retaliate).
During the Cold War, policy and military theorists considered the sorts of policies that might prevent a nuclear attack, and they developed game theory models that could lead to stable deterrence conditions.
Different forms of nuclear weapons delivery (see above) allow for different types of nuclear strategies. The goals of any strategy are generally to make it difficult for an enemy to launch a pre-emptive strike against the weapon system and difficult to defend against the delivery of the weapon during a potential conflict.
This can mean keeping weapon locations hidden, such as deploying them on submarines or land mobile transporter erector launchers whose locations are difficult to track, or it can mean protecting weapons by burying them in hardened missile silo bunkers. Other components of nuclear strategy include using missile defenses to destroy missiles before they land, and implementing civil defense measures that use early-warning systems to evacuate citizens to safe areas before an attack.
Weapons designed to threaten large populations or to deter attacks are known as strategic weapons. Nuclear weapons for use on a battlefield in military situations are called tactical weapons.
Critics of nuclear war strategy often suggest that a nuclear war between two nations would result in mutual annihilation. From this point of view, the significance of nuclear weapons is to deter war because any nuclear war would escalate out of mutual distrust and fear, resulting in mutually assured destruction. This threat of national, if not global, destruction has been a strong motivation for anti-nuclear weapons activism.
Critics from the peace movement and within the military establishment have questioned the usefulness of such weapons in the current military climate.
According to an advisory opinion issued by the International Court of Justice in 1996, the use of (or threat of use of) such weapons would generally be contrary to the rules of international law applicable in armed conflict, but the court did not reach an opinion as to whether or not the threat or use would be lawful in specific extreme circumstances such as if the survival of the state were at stake.
Another deterrence position is that nuclear proliferation can be desirable. In this case, it is argued that, unlike conventional weapons, nuclear weapons deter all-out war between states, and they succeeded in doing this during the Cold War between the U.S. and the Soviet Union.
In the late 1950s and early 1960s, Gen. Pierre Marie Gallois of France, an adviser to Charles de Gaulle, argued in books like The Balance of Terror: Strategy for the Nuclear Age (1961) that mere possession of a nuclear arsenal was enough to ensure deterrence, and thus concluded that the spread of nuclear weapons could increase international stability.
Some prominent neo-realist scholars, such as Kenneth Waltz and John Mearsheimer, have argued, along the lines of Gallois, that some forms of nuclear proliferation would decrease the likelihood of total war, especially in troubled regions of the world where there exists a single nuclear-weapon state.
Aside from the public opinion that opposes proliferation in any form, there are two schools of thought on the matter: those, like Mearsheimer, who favored selective proliferation, and those, like Waltz, who were somewhat more non-interventionist. Interest in proliferation and the stability-instability paradox that it generates continues to this day, with ongoing debate about an indigenous Japanese and South Korean nuclear deterrent against North Korea.
The threat of potentially suicidal terrorists possessing nuclear weapons (a form of nuclear terrorism) complicates the decision process. The prospect of mutually assured destruction might not deter an enemy who expects to die in the confrontation. Further, if the initial act is from a stateless terrorist instead of a sovereign nation, there might not be a nation or specific target to retaliate against.
It has been argued, especially after the September 11, 2001, attacks, that this complication calls for a new nuclear strategy, one that is distinct from that which gave relative stability during the Cold War.
Since 1996, the United States has had a policy of allowing the targeting of its nuclear weapons at terrorists armed with weapons of mass destruction.
Robert Gallucci argues that although traditional deterrence is not an effective approach toward terrorist groups bent on causing a nuclear catastrophe, "the United States should instead consider a policy of expanded deterrence, which focuses not solely on the would-be nuclear terrorists but on those states that may deliberately transfer or inadvertently leak nuclear weapons and materials to them. By threatening retaliation against those states, the United States may be able to deter that which it cannot physically prevent."
Graham Allison makes a similar case, arguing that the key to expanded deterrence is coming up with ways of tracing nuclear material to the country that forged the fissile material. "After a nuclear bomb detonates, nuclear forensics cops would collect debris samples and send them to a laboratory for radiological analysis.
By identifying unique attributes of the fissile material, including its impurities and contaminants, one could trace the path back to its origin." The process is analogous to identifying a criminal by fingerprints. "The goal would be twofold: first, to deter leaders of nuclear states from selling weapons to terrorists by holding them accountable for any use of their weapons; second, to give leaders every incentive to tightly secure their nuclear weapons and materials."
According to the Pentagon's June 2019 publication "Doctrine for Joint Nuclear Operations", posted on the Joint Chiefs of Staff website, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."
Governance, control, and law:
Main articles:
Because they are weapons of mass destruction, the proliferation and possible use of nuclear weapons are important issues in international relations and diplomacy. In most countries, the use of nuclear force can only be authorized by the head of government or head of state. Despite controls and regulations governing nuclear weapons, there is an inherent danger of "accidents, mistakes, false alarms, blackmail, theft, and sabotage".
In the late 1940s, lack of mutual trust prevented the United States and the Soviet Union from making progress on arms control agreements. The Russell–Einstein Manifesto was issued in London on July 9, 1955, by Bertrand Russell in the midst of the Cold War. It highlighted the dangers posed by nuclear weapons and called for world leaders to seek peaceful resolutions to international conflict.
The signatories included eleven pre-eminent intellectuals and scientists, including Albert Einstein, who signed it just days before his death on April 18, 1955.
A few days after the release, philanthropist Cyrus S. Eaton offered to sponsor a conference—called for in the manifesto—in Pugwash, Nova Scotia, Eaton's birthplace. This conference was to be the first of the Pugwash Conferences on Science and World Affairs, held in July 1957.
By the 1960s, steps were taken to limit both the proliferation of nuclear weapons to other countries and the environmental effects of nuclear testing. The Partial Nuclear Test Ban Treaty (1963) restricted all nuclear testing to underground nuclear testing, to prevent contamination from nuclear fallout, whereas the Treaty on the Non-Proliferation of Nuclear Weapons (1968) attempted to place restrictions on the types of activities signatories could participate in, with the goal of allowing the transference of non-military nuclear technology to member countries without fear of proliferation.
In 1957, the International Atomic Energy Agency (IAEA) was established under the mandate of the United Nations to encourage development of peaceful applications of nuclear technology, provide international safeguards against its misuse, and facilitate the application of safety measures in its use.
In 1996, many nations signed the Comprehensive Nuclear-Test-Ban Treaty, which prohibits all testing of nuclear weapons. A testing ban imposes a significant hindrance to nuclear arms development by any complying country. The treaty requires ratification by 44 specific states before it can enter into force; as of 2012, ratification by eight of these states was still required.
Additional treaties and agreements have governed nuclear weapons stockpiles between the countries with the two largest stockpiles, the United States and the Soviet Union, and later between the United States and Russia. These include treaties such as the SALT agreements, the INF Treaty, START I, and New START.
Even when they did not enter into force, these agreements helped limit and later reduce the numbers and types of nuclear weapons between the United States and the Soviet Union/Russia.
Nuclear weapons have also been opposed by agreements between countries. Many nations have been declared Nuclear-Weapon-Free Zones, areas where nuclear weapons production and deployment are prohibited, through the use of treaties. The Treaty of Tlatelolco (1967) prohibited any production or deployment of nuclear weapons in Latin America and the Caribbean, and the Treaty of Pelindaba (1996) prohibits nuclear weapons in many African countries.
As recently as 2006 a Central Asian Nuclear Weapon Free Zone was established among the former Soviet republics of Central Asia prohibiting nuclear weapons.
In 1996, the International Court of Justice, the highest court of the United Nations, issued an Advisory Opinion concerned with the "Legality of the Threat or Use of Nuclear Weapons".
The court ruled that the use or threat of use of nuclear weapons would violate various articles of international law.
Given the unique, destructive characteristics of nuclear weapons, the International Committee of the Red Cross calls on States to ensure that these weapons are never used, irrespective of whether they consider them lawful or not.
Additionally, there have been other, specific actions meant to discourage countries from developing nuclear arms. In the wake of the tests by India and Pakistan in 1998, economic sanctions were (temporarily) levied against both countries, though neither was a signatory to the Nuclear Non-Proliferation Treaty.
One of the stated casus belli for the initiation of the 2003 Iraq War was an accusation by the United States that Iraq was actively pursuing nuclear arms (though this was soon discovered not to be the case, as the program had been discontinued). In 1981, Israel had bombed the Osirak nuclear reactor then under construction in Iraq, in what it called an attempt to halt Iraq's nuclear arms ambitions; in 2007, Israel bombed another reactor under construction in Syria.
In 2013, Mark Diesendorf said that the governments of France, India, North Korea, Pakistan, the UK, and South Africa had used nuclear power or research reactors to assist nuclear weapons development or to contribute to their supplies of nuclear explosives from military reactors.
In 2017, 122 countries mainly in the Global South voted in favor of adopting the Treaty on the Prohibition of Nuclear Weapons, which eventually entered into force in 2021.
The Doomsday Clock measures the likelihood of a human-made global catastrophe and is published annually by the Bulletin of the Atomic Scientists. The two years with the highest likelihood had previously been 1953, when the Clock was set to two minutes until midnight after the U.S. and the Soviet Union began testing hydrogen bombs, and 2018, following the failure of world leaders to address tensions relating to nuclear weapons and climate change issues.
In 2023, following the escalation of nuclear threats during the Russian invasion of Ukraine, the Doomsday Clock was set to 90 seconds to midnight, the highest assessed likelihood of global catastrophe in the Clock's history.
Disarmament:
Main article: Nuclear disarmament
For statistics on possession and deployment, see List of states with nuclear weapons.
Nuclear disarmament refers to both the act of reducing or eliminating nuclear weapons and to the end state of a nuclear-free world, in which nuclear weapons are eliminated.
Beginning with the 1963 Partial Test Ban Treaty and continuing through the 1996 Comprehensive Nuclear-Test-Ban Treaty, there have been many treaties to limit or reduce nuclear weapons testing and stockpiles.
The 1968 Nuclear Non-Proliferation Treaty has as one of its explicit conditions that all signatories must "pursue negotiations in good faith" towards the long-term goal of "complete disarmament". The nuclear-weapon states have largely treated that aspect of the agreement as "decorative" and without force.
Only one country—South Africa—has ever fully renounced nuclear weapons it had independently developed. The former Soviet republics of Belarus, Kazakhstan, and Ukraine returned Soviet nuclear arms stationed in their countries to Russia after the collapse of the USSR.
Proponents of nuclear disarmament say that it would lessen the probability of nuclear war, especially accidentally. Critics of nuclear disarmament say that it would undermine the present nuclear peace and deterrence and would lead to increased global instability.
Various American elder statesmen, who were in office during the Cold War period, have been advocating the elimination of nuclear weapons. These officials include Henry Kissinger, George Shultz, Sam Nunn, and William Perry. In January 2010, Lawrence M. Krauss stated that "no issue carries more importance to the long-term health and security of humanity than the effort to reduce, and perhaps one day, rid the world of nuclear weapons".
In January 1986, Soviet leader Mikhail Gorbachev publicly proposed a three-stage program for abolishing the world's nuclear weapons by the end of the 20th century. In the years after the end of the Cold War, there have been numerous campaigns to urge the abolition of nuclear weapons, such as that organized by the Global Zero movement, and the goal of a "world without nuclear weapons" was advocated by United States President Barack Obama in an April 2009 speech in Prague.
A CNN poll from April 2010 indicated that the American public was nearly evenly split on the issue.
Some analysts have argued that nuclear weapons have made the world relatively safer, with peace through deterrence and through the stability–instability paradox, including in south Asia. Kenneth Waltz has argued that nuclear weapons have helped keep an uneasy peace, and further nuclear weapon proliferation might even help avoid the large scale conventional wars that were so common before their invention at the end of World War II.
But former Secretary Henry Kissinger says there is a new danger, which cannot be addressed by deterrence: "The classical notion of deterrence was that there was some consequences before which aggressors and evildoers would recoil. In a world of suicide bombers, that calculation doesn't operate in any comparable way". George Shultz has said, "If you think of the people who are doing suicide attacks, and people like that get a nuclear weapon, they are almost by definition not deterrable".
As of early 2019, more than 90% of the world's 13,865 nuclear weapons were owned by Russia and the United States.
United Nations:
Main article: United Nations Office for Disarmament Affairs
The UN Office for Disarmament Affairs (UNODA) is a department of the United Nations Secretariat established in January 1998 as part of the United Nations Secretary-General Kofi Annan's plan to reform the UN as presented in his report to the General Assembly in July 1997.
Its goal is to promote nuclear disarmament and non-proliferation and the strengthening of disarmament regimes with respect to other weapons of mass destruction, such as chemical and biological weapons. It also promotes disarmament efforts in the area of conventional weapons, especially land mines and small arms, which are often the weapons of choice in contemporary conflicts.
Controversy:
See also: Nuclear weapons debate and History of the anti-nuclear movement
Ethics:
Main article: Nuclear ethics
Even before the first nuclear weapons had been developed, scientists involved with the Manhattan Project were divided over the use of the weapon. The role of the two atomic bombings in Japan's surrender, and the U.S.'s ethical justification for them, has been the subject of scholarly and popular debate for decades. The question of whether nations should have nuclear weapons, or test them, has been continually and nearly universally controversial.
Notable nuclear weapons accidents
Main articles:
August 21, 1945: While conducting experiments on a plutonium-gallium core at Los Alamos National Laboratory, physicist Harry Daghlian received a lethal dose of radiation when an error caused the core to enter prompt criticality. He died 25 days later, on September 15, 1945, from radiation poisoning.
May 21, 1946: While conducting further experiments on the same core at Los Alamos National Laboratory, physicist Louis Slotin accidentally caused the core to become briefly supercritical. He received a lethal dose of gamma and neutron radiation, and died nine days later on May 30, 1946.
After the deaths of Daghlian and Slotin, the mass became known as the "demon core". It was ultimately melted down in 1946 and its material was recycled for use in other cores.
February 13, 1950: a Convair B-36B crashed in northern British Columbia after jettisoning a Mark IV atomic bomb. This was the first such nuclear weapon loss in history. The accident was designated a "Broken Arrow"—an accident involving a nuclear weapon but which does not present a risk of war. Experts believe that up to 50 nuclear weapons were lost during the Cold War.
May 22, 1957: a 42,000-pound (19,000 kg) Mark-17 hydrogen bomb accidentally fell from a bomber near Albuquerque, New Mexico. The detonation of the device's conventional explosives destroyed it on impact and formed a crater 25 feet (7.6 m) in diameter on land owned by the University of New Mexico. According to a researcher at the Natural Resources Defense Council, it was one of the most powerful bombs made to date.
June 7, 1960: the 1960 Fort Dix IM-99 accident destroyed a Boeing CIM-10 Bomarc nuclear missile and shelter and contaminated the BOMARC Missile Accident Site in New Jersey.
January 24, 1961: the 1961 Goldsboro B-52 crash occurred near Goldsboro, North Carolina. A Boeing B-52 Stratofortress carrying two Mark 39 nuclear bombs broke up in mid-air, dropping its nuclear payload in the process.
December 5, 1965: the 1965 Philippine Sea A-4 crash, in which a Skyhawk attack aircraft carrying a nuclear weapon fell into the sea. The pilot, the aircraft, and the B43 nuclear bomb were never recovered. It was not until 1989 that the Pentagon revealed the loss of the one-megaton bomb.
January 17, 1966: the 1966 Palomares B-52 crash occurred when a B-52G bomber of the USAF collided with a KC-135 tanker during mid-air refuelling off the coast of Spain. The KC-135 was completely destroyed when its fuel load ignited, killing all four crew members. The B-52G broke apart, killing three of the seven crew members aboard. Of the four Mk28 type hydrogen bombs the B-52G carried, three were found on land near Almería, Spain. The non-nuclear explosives in two of the weapons detonated upon impact with the ground, resulting in the contamination of a 2-square-kilometer (490-acre) (0.78 square mile) area by radioactive plutonium. The fourth, which fell into the Mediterranean Sea, was recovered intact.
January 21, 1968: the 1968 Thule Air Base B-52 crash involved a United States Air Force (USAF) B-52 bomber. The aircraft was carrying four hydrogen bombs when a cabin fire forced the crew to abandon the aircraft. Six crew members ejected safely, but one who did not have an ejection seat was killed while trying to bail out. The bomber crashed onto sea ice in Greenland, causing the nuclear payload to rupture and disperse, which resulted in widespread radioactive contamination. One of the bombs remains lost.
September 18–19, 1980: the Damascus Accident occurred in Damascus, Arkansas, where a Titan missile equipped with a nuclear warhead exploded. The accident was caused by a maintenance man who dropped a socket from a socket wrench down an 80-foot (24 m) shaft, puncturing a fuel tank on the rocket. The leaking fuel resulted in a hypergolic fuel explosion that jettisoned the W-53 warhead beyond the launch site.
Nuclear testing and fallout:
Main articles:
Over 500 atmospheric nuclear weapons tests were conducted at various sites around the world from 1945 to 1980. Radioactive fallout from nuclear weapons testing was first drawn to public attention in 1954 when the Castle Bravo hydrogen bomb test at the Pacific Proving Grounds contaminated the crew and catch of the Japanese fishing boat Lucky Dragon. One of the fishermen died in Japan seven months later, and the fear of contaminated tuna led to a temporary boycotting of the popular staple in Japan.
The incident caused widespread concern around the world, especially regarding the effects of nuclear fallout and atmospheric nuclear testing, and "provided a decisive impetus for the emergence of the anti-nuclear weapons movement in many countries".
As public awareness and concern mounted over the possible health hazards associated with exposure to the nuclear fallout, various studies were done to assess the extent of the hazard.
A Centers for Disease Control and Prevention / National Cancer Institute study claims that fallout from atmospheric nuclear tests would lead to perhaps 11,000 excess deaths among people alive during atmospheric testing in the United States from all forms of cancer, including leukemia, from 1951 to well into the 21st century.
As of March 2009, the U.S. is the only nation that compensates nuclear test victims. Since the Radiation Exposure Compensation Act of 1990, more than $1.38 billion in compensation has been approved. The money is going to people who took part in the tests, notably at the Nevada Test Site, and to others exposed to the radiation.
In addition, leakage of byproducts of nuclear weapon production into groundwater has been an ongoing issue, particularly at the Hanford site.
Effects of nuclear explosions
Main article: Effects of nuclear explosions
Effects of nuclear explosions on human health:
Main article: Effects of nuclear explosions on human health
Some scientists estimate that a nuclear war with 100 Hiroshima-size nuclear explosions on cities could cost the lives of tens of millions of people from long-term climatic effects alone.
The climatological hypothesis is that if each city produced a firestorm, a great deal of soot could be lofted into the atmosphere, blanketing the earth, cutting out sunlight for years on end, and disrupting food chains, in what is termed a nuclear winter.
People near the Hiroshima explosion and who managed to survive the explosion subsequently suffered a variety of medical effects:
Fallout exposure—depending on if further afield individuals shelter in place or evacuate perpendicular to the direction of the wind, and therefore avoid contact with the fallout plume, and stay there for the days and weeks after the nuclear explosion, their exposure to fallout, and therefore their total dose, will vary.
With those who do shelter in place, and or evacuate, experiencing a total dose that would be negligible in comparison to someone who just went about their life as normal.
Staying indoors until after the most hazardous fallout isotope, I-131 decays away to 0.1% of its initial quantity after ten half-lifes—which is represented by 80 days in I-131s case, would make the difference between likely contracting Thyroid cancer or escaping completely from this substance depending on the actions of the individual.
Effects of nuclear war:
See also:
Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating large numbers of nuclear weapons would have an immediate, short term and long-term effects on the climate, potentially causing cold weather known as a "nuclear winter".
In 1982, Brian Martin estimated that a US–Soviet nuclear exchange might kill 400–450 million directly, mostly in the United States, Europe and Russia, and maybe several hundred million more through follow-up consequences in those same areas.
Many scholars have posited that a global thermonuclear war with Cold War-era stockpiles, or even with the current smaller stockpiles, may lead to the extinction of the human race.
The International Physicians for the Prevention of Nuclear War believe that nuclear war could indirectly contribute to human extinction via secondary effects, including environmental consequences, societal breakdown, and economic collapse.
It has been estimated that a relatively small-scale nuclear exchange between India and Pakistan involving 100 Hiroshima yield (15 kilotons) weapons, could cause a nuclear winter and kill more than a billion people.
According to a peer-reviewed study published in the journal Nature Food in August 2022, a full-scale nuclear war between the U.S. and Russia would directly kill 360 million people and more than 5 billion people would die from starvation. More than 2 billion people could die from a smaller-scale nuclear war between India and Pakistan.
Public opposition:
See also: Nuclear disarmament and International Day against Nuclear Tests
Peace movements emerged in Japan and in 1954 they converged to form a unified "Japan Council against Atomic and Hydrogen Bombs." Japanese opposition to nuclear weapons tests in the Pacific Ocean was widespread, and "an estimated 35 million signatures were collected on petitions calling for bans on nuclear weapons".
In the United Kingdom, the first Aldermaston March organised by the Campaign for Nuclear Disarmament (CND) took place at Easter 1958, when, according to the CND, several thousand people marched for four days from Trafalgar Square, London, to the Atomic Weapons Research Establishment close to Aldermaston in Berkshire, England, to demonstrate their opposition to nuclear weapons.
The Aldermaston marches continued into the late 1960s when tens of thousands of people took part in the four-day marches.
In 1959, a letter in the Bulletin of the Atomic Scientists was the start of a successful campaign to stop the Atomic Energy Commission dumping radioactive waste in the sea 19 kilometres from Boston.
In 1962, Linus Pauling won the Nobel Peace Prize for his work to stop the atmospheric testing of nuclear weapons, and the "Ban the Bomb" movement spread.
In 1963, many countries ratified the Partial Test Ban Treaty prohibiting atmospheric nuclear testing. Radioactive fallout became less of an issue and the anti-nuclear weapons movement went into decline for some years. A resurgence of interest occurred amid European and American fears of nuclear war in the 1980s.
Costs and technology spin-offs:
See also:
According to an audit by the Brookings Institution, between 1940 and 1996, the U.S. spent $10.9 trillion in present-day terms on nuclear weapons programs:
Non-weapons uses:
Main article: Peaceful nuclear explosion
Peaceful nuclear explosions are nuclear explosions conducted for non-military purposes, such as activities related to economic development including the creation of canals. During the 1960s and 1970s, both the United States and the Soviet Union conducted a number of PNEs. Six of the explosions by the Soviet Union are considered to have been of an applied nature, not just tests.
The United States and the Soviet Union later halted their programs. Definitions and limits are covered in the Peaceful Nuclear Explosions Treaty of 1976. The stalled Comprehensive Nuclear-Test-Ban Treaty of 1996 would prohibit all nuclear explosions, regardless of whether they are for peaceful purposes or not.
Click on any of the following blue hyperlinks for more about Nuclear Weapons:
More advanced systems, such as multiple independently targetable reentry vehicles (MIRVs), can launch multiple warheads at different targets from one missile, reducing the chance of a successful missile defense. Today, missiles are most common among systems designed for delivery of nuclear weapons. Making a warhead small enough to fit onto a missile, though, can be difficult.
Tactical weapons have involved the most variety of delivery types, including not only gravity bombs and missiles but also:
- artillery shells,
- land mines,
- nuclear depth charges,
- and torpedoes for anti-submarine warfare.
An atomic mortar has been tested by the United States. Small, two-man portable tactical weapons (somewhat misleadingly referred to as suitcase bombs), such as the Special Atomic Demolition Munition, have been developed, although the difficulty of combining sufficient yield with portability limits their military utility.
Nuclear strategy:
Main articles:
- Nuclear strategy and Deterrence theory
- Pre-emptive nuclear strike,
- Nuclear peace,
- Essentials of Post–Cold War Deterrence,
- Single Integrated Operational Plan,
- Nuclear warfare,
- and On Thermonuclear War
Nuclear warfare strategy is a set of policies that deal with preventing or fighting a nuclear war. The policy of trying to prevent an attack by a nuclear weapon from another country by threatening nuclear retaliation is known as the strategy of nuclear deterrence.
The goal in deterrence is to always maintain a second strike capability (the ability of a country to respond to a nuclear attack with one of its own) and potentially to strive for first strike status (the ability to destroy an enemy's nuclear forces before they could retaliate).
During the Cold War, policy and military theorists considered the sorts of policies that might prevent a nuclear attack, and they developed game theory models that could lead to stable deterrence conditions.
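As a rough illustration of what such a model looks like, the sketch below encodes a toy two-player "deterrence game" in Python. The payoff numbers are hypothetical and are not drawn from the source; they simply encode the assumption that a secure second-strike capability makes any first strike end in mutual destruction, so that restraint is never a worse choice than striking first.

```python
# Toy deterrence game: two states, each may "refrain" or "strike".
# Retaliation is assumed automatic, so any first strike ends in mutual destruction.

STRATEGIES = ("refrain", "strike")

# payoff[(a, b)] = (payoff to state A, payoff to state B); numbers are illustrative.
payoff = {
    ("refrain", "refrain"): (0, 0),        # uneasy peace
    ("refrain", "strike"):  (-100, -100),  # B strikes first, A retaliates
    ("strike",  "refrain"): (-100, -100),  # A strikes first, B retaliates
    ("strike",  "strike"):  (-100, -100),  # mutual destruction
}

def never_worse(player: int, s1: str, s2: str) -> bool:
    """True if strategy s1 does at least as well as s2 for `player` (0 = A, 1 = B)
    against every possible strategy of the opponent."""
    for opponent in STRATEGIES:
        pair1 = (s1, opponent) if player == 0 else (opponent, s1)
        pair2 = (s2, opponent) if player == 0 else (opponent, s2)
        if payoff[pair1][player] < payoff[pair2][player]:
            return False
    return True

# With an assured second strike, restraint weakly dominates striking first for both sides.
print(never_worse(0, "refrain", "strike"))  # True
print(never_worse(1, "refrain", "strike"))  # True
```

Real Cold War analyses were far richer, treating repeated play, signaling, and imperfect information, but the stability condition they sought has this same basic structure.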
Different forms of nuclear weapons delivery (see above) allow for different types of nuclear strategies. The goals of any strategy are generally to make it difficult for an enemy to launch a pre-emptive strike against the weapon system and difficult to defend against the delivery of the weapon during a potential conflict.
This can mean keeping weapon locations hidden, such as deploying them on submarines or land mobile transporter erector launchers whose locations are difficult to track, or it can mean protecting weapons by burying them in hardened missile silo bunkers. Other components of nuclear strategies included using missile defenses to destroy the missiles before they land, or implementing civil defense measures using early-warning systems to evacuate citizens to safe areas before an attack.
Weapons designed to threaten large populations or to deter attacks are known as strategic weapons. Nuclear weapons for use on a battlefield in military situations are called tactical weapons.
Critics of nuclear war strategy often suggest that a nuclear war between two nations would result in mutual annihilation. From this point of view, the significance of nuclear weapons is to deter war because any nuclear war would escalate out of mutual distrust and fear, resulting in mutually assured destruction. This threat of national, if not global, destruction has been a strong motivation for anti-nuclear weapons activism.
Critics from the peace movement and within the military establishment have questioned the usefulness of such weapons in the current military climate.
According to an advisory opinion issued by the International Court of Justice in 1996, the use of (or threat of use of) such weapons would generally be contrary to the rules of international law applicable in armed conflict, but the court did not reach an opinion as to whether or not the threat or use would be lawful in specific extreme circumstances such as if the survival of the state were at stake.
Another deterrence position is that nuclear proliferation can be desirable. In this case, it is argued that, unlike conventional weapons, nuclear weapons deter all-out war between states, and they succeeded in doing this during the Cold War between the U.S. and the Soviet Union.
In the late 1950s and early 1960s, Gen. Pierre Marie Gallois of France, an adviser to Charles de Gaulle, argued in books like The Balance of Terror: Strategy for the Nuclear Age (1961) that mere possession of a nuclear arsenal was enough to ensure deterrence, and thus concluded that the spread of nuclear weapons could increase international stability.
Some prominent neo-realist scholars, such as Kenneth Waltz and John Mearsheimer, have argued, along the lines of Gallois, that some forms of nuclear proliferation would decrease the likelihood of total war, especially in troubled regions of the world where there exists a single nuclear-weapon state.
Aside from the public opinion that opposes proliferation in any form, there are two schools of thought on the matter: those, like Mearsheimer, who favored selective proliferation, and Waltz, who was somewhat more non-interventionist. Interest in proliferation and the stability-instability paradox that it generates continues to this day, with ongoing debate about an indigenous Japanese or South Korean nuclear deterrent against North Korea.
The threat of potentially suicidal terrorists possessing nuclear weapons (a form of nuclear terrorism) complicates the decision process. The prospect of mutually assured destruction might not deter an enemy who expects to die in the confrontation. Further, if the initial act is from a stateless terrorist instead of a sovereign nation, there might not be a nation or specific target to retaliate against.
It has been argued, especially after the September 11, 2001, attacks, that this complication calls for a new nuclear strategy, one that is distinct from that which gave relative stability during the Cold War.
Since 1996, the United States has had a policy of allowing the targeting of its nuclear weapons at terrorists armed with weapons of mass destruction.
Robert Gallucci argues that, although traditional deterrence is not an effective approach toward terrorist groups bent on causing a nuclear catastrophe, "the United States should instead consider a policy of expanded deterrence, which focuses not solely on the would-be nuclear terrorists but on those states that may deliberately transfer or inadvertently leak nuclear weapons and materials to them. By threatening retaliation against those states, the United States may be able to deter that which it cannot physically prevent."
Graham Allison makes a similar case, arguing that the key to expanded deterrence is coming up with ways of tracing nuclear material to the country that forged the fissile material. "After a nuclear bomb detonates, nuclear forensics cops would collect debris samples and send them to a laboratory for radiological analysis.
By identifying unique attributes of the fissile material, including its impurities and contaminants, one could trace the path back to its origin." The process is analogous to identifying a criminal by fingerprints. "The goal would be twofold: first, to deter leaders of nuclear states from selling weapons to terrorists by holding them accountable for any use of their weapons; second, to give leaders every incentive to tightly secure their nuclear weapons and materials."
According to the Pentagon's June 2019 publication Doctrine for Joint Nuclear Operations, posted on the Joint Chiefs of Staff website, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."
Governance, control, and law:
Main articles:
- Treaty on the Non-Proliferation of Nuclear Weapons,
- Strategic Arms Limitation Talks,
- Intermediate-Range Nuclear Forces Treaty,
- START I,
- START II,
- Strategic Offensive Reductions Treaty,
- Comprehensive Nuclear-Test-Ban Treaty,
- Lahore Declaration,
- New START,
- Treaty on the Prohibition of Nuclear Weapons
- Anti-nuclear movement
Because they are weapons of mass destruction, the proliferation and possible use of nuclear weapons are important issues in international relations and diplomacy. In most countries, the use of nuclear force can only be authorized by the head of government or head of state. Despite controls and regulations governing nuclear weapons, there is an inherent danger of "accidents, mistakes, false alarms, blackmail, theft, and sabotage".
In the late 1940s, lack of mutual trust prevented the United States and the Soviet Union from making progress on arms control agreements. The Russell–Einstein Manifesto was issued in London on July 9, 1955, by Bertrand Russell in the midst of the Cold War. It highlighted the dangers posed by nuclear weapons and called for world leaders to seek peaceful resolutions to international conflict.
The signatories included eleven pre-eminent intellectuals and scientists, including Albert Einstein, who signed it just days before his death on April 18, 1955.
A few days after the release, philanthropist Cyrus S. Eaton offered to sponsor a conference—called for in the manifesto—in Pugwash, Nova Scotia, Eaton's birthplace. This conference was to be the first of the Pugwash Conferences on Science and World Affairs, held in July 1957.
By the 1960s, steps were taken to limit both the proliferation of nuclear weapons to other countries and the environmental effects of nuclear testing. The Partial Nuclear Test Ban Treaty (1963) restricted all nuclear testing to underground nuclear testing, to prevent contamination from nuclear fallout, whereas the Treaty on the Non-Proliferation of Nuclear Weapons (1968) attempted to place restrictions on the types of activities signatories could participate in, with the goal of allowing the transference of non-military nuclear technology to member countries without fear of proliferation.
In 1957, the International Atomic Energy Agency (IAEA) was established under the mandate of the United Nations to encourage development of peaceful applications of nuclear technology, provide international safeguards against its misuse, and facilitate the application of safety measures in its use.
In 1996, many nations signed the Comprehensive Nuclear-Test-Ban Treaty, which prohibits all testing of nuclear weapons. A testing ban imposes a significant hindrance to nuclear arms development by any complying country. The Treaty requires the ratification by 44 specific states before it can go into force; as of 2012, the ratification of eight of these states is still required.
Additional treaties and agreements have governed nuclear weapons stockpiles between the countries with the two largest stockpiles, the United States and the Soviet Union, and later between the United States and Russia. These include treaties such as:
- SALT II (never ratified),
- START I (expired),
- INF,
- START II (never in effect),
- SORT,
- and New START,
- as well as non-binding agreements such as SALT I and the Presidential Nuclear Initiatives of 1991.
Even when they did not enter into force, these agreements helped limit and later reduce the numbers and types of nuclear weapons between the United States and the Soviet Union/Russia.
Nuclear weapons have also been opposed by agreements between countries. Many nations have been declared Nuclear-Weapon-Free Zones, areas where nuclear weapons production and deployment are prohibited, through the use of treaties. The Treaty of Tlatelolco (1967) prohibited any production or deployment of nuclear weapons in Latin America and the Caribbean, and the Treaty of Pelindaba (1996) prohibits nuclear weapons in many African countries.
As recently as 2006, a Central Asian Nuclear-Weapon-Free Zone was established among the former Soviet republics of Central Asia, prohibiting nuclear weapons in the region.
In 1996, the International Court of Justice, the highest court of the United Nations, issued an Advisory Opinion concerned with the "Legality of the Threat or Use of Nuclear Weapons".
The court ruled that the use or threat of use of nuclear weapons would violate various articles of international law, including:
- the Geneva Conventions,
- the Hague Conventions,
- the UN Charter,
- and the Universal Declaration of Human Rights.
Given the unique, destructive characteristics of nuclear weapons, the International Committee of the Red Cross calls on States to ensure that these weapons are never used, irrespective of whether they consider them lawful or not.
Additionally, there have been other, specific actions meant to discourage countries from developing nuclear arms. In the wake of the tests by India and Pakistan in 1998, economic sanctions were (temporarily) levied against both countries, though neither was a signatory to the Nuclear Non-Proliferation Treaty.
One of the stated casus belli for the initiation of the 2003 Iraq War was an accusation by the United States that Iraq was actively pursuing nuclear arms (though this was soon discovered not to be the case, as the program had been discontinued). In 1981, Israel had bombed the Osirak nuclear reactor then under construction in Iraq, in what it called an attempt to halt Iraq's nuclear arms ambitions; in 2007, Israel bombed another reactor under construction in Syria.
In 2013, Mark Diesendorf said that governments of France, India, North Korea, Pakistan, UK, and South Africa have used nuclear power or research reactors to assist nuclear weapons development or to contribute to their supplies of nuclear explosives from military reactors.
In 2017, 122 countries mainly in the Global South voted in favor of adopting the Treaty on the Prohibition of Nuclear Weapons, which eventually entered into force in 2021.
The Doomsday Clock measures the likelihood of a human-made global catastrophe and is published annually by the Bulletin of the Atomic Scientists. The two years with the highest likelihood had previously been 1953, when the Clock was set to two minutes until midnight after the U.S. and the Soviet Union began testing hydrogen bombs, and 2018, following the failure of world leaders to address tensions relating to nuclear weapons and climate change issues.
In 2023, following the escalation of nuclear threats during the Russian invasion of Ukraine, the Doomsday Clock was set to 90 seconds to midnight, the highest assessed likelihood of global catastrophe in the Clock's history.
Disarmament:
Main article: Nuclear disarmament
For statistics on possession and deployment, see List of states with nuclear weapons.
Nuclear disarmament refers both to the act of reducing or eliminating nuclear weapons and to the end state of a nuclear-free world, in which nuclear weapons have been eliminated.
Beginning with the 1963 Partial Test Ban Treaty and continuing through the 1996 Comprehensive Nuclear-Test-Ban Treaty, there have been many treaties to limit or reduce nuclear weapons testing and stockpiles.
The 1968 Nuclear Non-Proliferation Treaty has as one of its explicit conditions that all signatories must "pursue negotiations in good faith" towards the long-term goal of "complete disarmament". The nuclear-weapon states have largely treated that aspect of the agreement as "decorative" and without force.
Only one country—South Africa—has ever fully renounced nuclear weapons they had independently developed. The former Soviet republics of Belarus, Kazakhstan, and Ukraine returned Soviet nuclear arms stationed in their countries to Russia after the collapse of the USSR.
Proponents of nuclear disarmament say that it would lessen the probability of nuclear war, especially accidentally. Critics of nuclear disarmament say that it would undermine the present nuclear peace and deterrence and would lead to increased global instability.
Various American elder statesmen, who were in office during the Cold War period, have been advocating the elimination of nuclear weapons. These officials include Henry Kissinger, George Shultz, Sam Nunn, and William Perry. In January 2010, Lawrence M. Krauss stated that "no issue carries more importance to the long-term health and security of humanity than the effort to reduce, and perhaps one day, rid the world of nuclear weapons".
In January 1986, Soviet leader Mikhail Gorbachev publicly proposed a three-stage program for abolishing the world's nuclear weapons by the end of the 20th century. In the years after the end of the Cold War, there have been numerous campaigns to urge the abolition of nuclear weapons, such as that organized by the Global Zero movement, and the goal of a "world without nuclear weapons" was advocated by United States President Barack Obama in an April 2009 speech in Prague.
A CNN poll from April 2010 indicated that the American public was nearly evenly split on the issue.
Some analysts have argued that nuclear weapons have made the world relatively safer, with peace through deterrence and through the stability–instability paradox, including in south Asia. Kenneth Waltz has argued that nuclear weapons have helped keep an uneasy peace, and further nuclear weapon proliferation might even help avoid the large scale conventional wars that were so common before their invention at the end of World War II.
But former Secretary Henry Kissinger says there is a new danger, which cannot be addressed by deterrence: "The classical notion of deterrence was that there was some consequences before which aggressors and evildoers would recoil. In a world of suicide bombers, that calculation doesn't operate in any comparable way". George Shultz has said, "If you think of the people who are doing suicide attacks, and people like that get a nuclear weapon, they are almost by definition not deterrable".
As of early 2019, more than 90% of world's 13,865 nuclear weapons were owned by Russia and the United States.
United Nations:
Main article: United Nations Office for Disarmament Affairs
The UN Office for Disarmament Affairs (UNODA) is a department of the United Nations Secretariat established in January 1998 as part of the United Nations Secretary-General Kofi Annan's plan to reform the UN as presented in his report to the General Assembly in July 1997.
Its goal is to promote nuclear disarmament and non-proliferation and the strengthening of the disarmament regimes in respect to other weapons of mass destruction, chemical and biological weapons. It also promotes disarmament efforts in the area of conventional weapons, especially land mines and small arms, which are often the weapons of choice in contemporary conflicts.
Controversy:
See also: Nuclear weapons debate and History of the anti-nuclear movement
Ethics:
Main article: Nuclear ethics
Even before the first nuclear weapons had been developed, scientists involved with the Manhattan Project were divided over the use of the weapon. The role of the two atomic bombings in Japan's surrender, and the U.S.'s ethical justification for them, has been the subject of scholarly and popular debate for decades. The question of whether nations should have nuclear weapons, or test them, has been continually and nearly universally controversial.
Notable nuclear weapons accidents:
Main articles:
- Nuclear and radiation accidents and incidents
- List of military nuclear accidents
- List of nuclear close calls
August 21, 1945: While conducting experiments on a plutonium-gallium core at Los Alamos National Laboratory, physicist Harry Daghlian received a lethal dose of radiation when a handling error caused the core to go briefly supercritical. He died 25 days later, on September 15, 1945, from radiation poisoning.
May 21, 1946: While conducting further experiments on the same core at Los Alamos National Laboratory, physicist Louis Slotin accidentally caused the core to become briefly supercritical. He received a lethal dose of gamma and neutron radiation, and died nine days later on May 30, 1946.
After the death of Daghlian and Slotin, the mass became known as the "demon core". It was ultimately used to construct a bomb for use on the Nevada Test Range.
February 13, 1950: a Convair B-36B crashed in northern British Columbia after jettisoning a Mark IV atomic bomb. This was the first such nuclear weapon loss in history. The accident was designated a "Broken Arrow"—an accident involving a nuclear weapon but which does not present a risk of war. Experts believe that up to 50 nuclear weapons were lost during the Cold War.
May 22, 1957: a 42,000-pound (19,000 kg) Mark-17 hydrogen bomb accidentally fell from a bomber near Albuquerque, New Mexico. The detonation of the device's conventional explosives destroyed it on impact and formed a crater 25 feet (7.6 m) in diameter on land owned by the University of New Mexico. According to a researcher at the Natural Resources Defense Council, it was one of the most powerful bombs made to date.
June 7, 1960: the 1960 Fort Dix IM-99 accident destroyed a Boeing CIM-10 Bomarc nuclear missile and shelter and contaminated the BOMARC Missile Accident Site in New Jersey.
January 24, 1961: the 1961 Goldsboro B-52 crash occurred near Goldsboro, North Carolina. A Boeing B-52 Stratofortress carrying two Mark 39 nuclear bombs broke up in mid-air, dropping its nuclear payload in the process.
December 5, 1965: the 1965 Philippine Sea A-4 crash, in which a Skyhawk attack aircraft carrying a nuclear weapon rolled off a U.S. aircraft carrier and fell into the sea. The pilot, the aircraft, and the B43 nuclear bomb were never recovered. It was not until 1989 that the Pentagon revealed the loss of the one-megaton bomb.
January 17, 1966: the 1966 Palomares B-52 crash occurred when a B-52G bomber of the USAF collided with a KC-135 tanker during mid-air refuelling off the coast of Spain. The KC-135 was completely destroyed when its fuel load ignited, killing all four crew members. The B-52G broke apart, killing three of the seven crew members aboard. Of the four Mk28 type hydrogen bombs the B-52G carried, three were found on land near Almería, Spain. The non-nuclear explosives in two of the weapons detonated upon impact with the ground, resulting in the contamination of a 2-square-kilometer (490-acre) (0.78 square mile) area by radioactive plutonium. The fourth, which fell into the Mediterranean Sea, was recovered intact.
January 21, 1968: the 1968 Thule Air Base B-52 crash involved a United States Air Force (USAF) B-52 bomber. The aircraft was carrying four hydrogen bombs when a cabin fire forced the crew to abandon the aircraft. Six crew members ejected safely, but one who did not have an ejection seat was killed while trying to bail out. The bomber crashed onto sea ice in Greenland, causing the nuclear payload to rupture and disperse, which resulted in widespread radioactive contamination. One of the bombs remains lost.
September 18–19, 1980: the Damascus Accident, occurred in Damascus, Arkansas, where a Titan missile equipped with a nuclear warhead exploded. The accident was caused by a maintenance man who dropped a socket from a socket wrench down an 80-foot (24 m) shaft, puncturing a fuel tank on the rocket. Leaking fuel resulted in a hypergolic fuel explosion, jettisoning the W-53 warhead beyond the launch site.
Nuclear testing and fallout:
Over 500 atmospheric nuclear weapons tests were conducted at various sites around the world from 1945 to 1980. Radioactive fallout from nuclear weapons testing was first drawn to public attention in 1954 when the Castle Bravo hydrogen bomb test at the Pacific Proving Grounds contaminated the crew and catch of the Japanese fishing boat Lucky Dragon. One of the fishermen died in Japan seven months later, and the fear of contaminated tuna led to a temporary boycotting of the popular staple in Japan.
The incident caused widespread concern around the world, especially regarding the effects of nuclear fallout and atmospheric nuclear testing, and "provided a decisive impetus for the emergence of the anti-nuclear weapons movement in many countries".
As public awareness and concern mounted over the possible health hazards associated with exposure to the nuclear fallout, various studies were done to assess the extent of the hazard.
A Centers for Disease Control and Prevention / National Cancer Institute study claims that fallout from atmospheric nuclear tests would lead to perhaps 11,000 excess deaths among people alive during atmospheric testing in the United States from all forms of cancer, including leukemia, from 1951 to well into the 21st century.
As of March 2009, the U.S. is the only nation that compensates nuclear test victims. Since the Radiation Exposure Compensation Act of 1990, more than $1.38 billion in compensation has been approved. The money is going to people who took part in the tests, notably at the Nevada Test Site, and to others exposed to the radiation.
In addition, leakage of byproducts of nuclear weapon production into groundwater has been an ongoing issue, particularly at the Hanford site.
Effects of nuclear explosions:
Main article: Effects of nuclear explosions
Effects of nuclear explosions on human health:
Main article: Effects of nuclear explosions on human health
Some scientists estimate that a nuclear war with 100 Hiroshima-size nuclear explosions on cities could cost the lives of tens of millions of people from long-term climatic effects alone.
The climatology hypothesis is that if each city firestorms, a great deal of soot could be lofted into the atmosphere, blanketing the earth and blocking sunlight for years on end, disrupting food chains in what is termed a nuclear winter.
People near the Hiroshima explosion who managed to survive it subsequently suffered a variety of medical effects, which unfolded in stages:
- Initial stage—the first 1–9 weeks, in which are the greatest number of deaths, with 90% due to thermal injury or blast effects and 10% due to super-lethal radiation exposure.
- Intermediate stage—from 10 to 12 weeks. The deaths in this period are from ionizing radiation in the median lethal range (LD50).
- Late period—lasting from 13 to 20 weeks. This period sees some improvement in survivors' condition.
- Delayed period—from 20+ weeks. Characterized by numerous complications, mostly related to the healing of thermal and mechanical injuries; if the individual was exposed to a few hundred to a thousand millisieverts of radiation, it is also coupled with infertility, sub-fertility, and blood disorders. Ionizing radiation above a dose of around 50–100 millisieverts has been shown to statistically increase one's lifetime chance of dying of cancer above the normal unexposed rate of roughly 25%. In the long term, a heightened rate of cancer, proportional to the dose received, begins to be observed after about five or more years, with lesser problems such as eye cataracts and other minor effects in other organs and tissues also observed over the long term.
- Fallout exposure—the dose received from fallout depends on whether individuals further afield shelter in place or evacuate perpendicular to the direction of the wind, thereby avoiding contact with the fallout plume, and on whether they remain sheltered for the days and weeks following the nuclear explosion. Those who shelter in place and/or evacuate receive a total dose that is negligible in comparison to someone who simply goes about their life as normal. Staying indoors until the most hazardous fallout isotope, I-131, has decayed to 0.1% of its initial quantity—ten half-lives, or about 80 days in I-131's case—can make the difference between likely contracting thyroid cancer and escaping exposure to this substance entirely (a back-of-the-envelope check of this decay figure follows this list).
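The decay arithmetic above can be sanity-checked in a few lines. This is a minimal illustrative sketch, not part of the source text; the roughly 8-day half-life used below is the standard value for iodine-131, and the fraction remaining after n half-lives is (1/2)^n.

```python
# Minimal sketch: fraction of I-131 activity remaining after a given time.
# Assumes the standard ~8-day half-life for iodine-131.
HALF_LIFE_DAYS = 8.02

def fraction_remaining(elapsed_days: float) -> float:
    """Fraction of the initial I-131 quantity left after `elapsed_days`."""
    return 0.5 ** (elapsed_days / HALF_LIFE_DAYS)

ten_half_lives = 10 * HALF_LIFE_DAYS
print(ten_half_lives)                      # ~80 days, as stated above
print(fraction_remaining(ten_half_lives))  # ~0.00098, i.e. roughly 0.1%
```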
Effects of nuclear war:
Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating large numbers of nuclear weapons would have immediate, short-term, and long-term effects on the climate, potentially causing cold weather known as a "nuclear winter".
In 1982, Brian Martin estimated that a US–Soviet nuclear exchange might kill 400–450 million directly, mostly in the United States, Europe and Russia, and maybe several hundred million more through follow-up consequences in those same areas.
Many scholars have posited that a global thermonuclear war with Cold War-era stockpiles, or even with the current smaller stockpiles, may lead to the extinction of the human race.
The International Physicians for the Prevention of Nuclear War believe that nuclear war could indirectly contribute to human extinction via secondary effects, including environmental consequences, societal breakdown, and economic collapse.
It has been estimated that a relatively small-scale nuclear exchange between India and Pakistan involving 100 Hiroshima-yield (15 kiloton) weapons could cause a nuclear winter and kill more than a billion people.
According to a peer-reviewed study published in the journal Nature Food in August 2022, a full-scale nuclear war between the U.S. and Russia would directly kill 360 million people and more than 5 billion people would die from starvation. More than 2 billion people could die from a smaller-scale nuclear war between India and Pakistan.
Public opposition:
See also: Nuclear disarmament and International Day against Nuclear Tests
Peace movements emerged in Japan and in 1954 they converged to form a unified "Japan Council against Atomic and Hydrogen Bombs." Japanese opposition to nuclear weapons tests in the Pacific Ocean was widespread, and "an estimated 35 million signatures were collected on petitions calling for bans on nuclear weapons".
In the United Kingdom, the first Aldermaston March organised by the Campaign for Nuclear Disarmament (CND) took place at Easter 1958, when, according to the CND, several thousand people marched for four days from Trafalgar Square, London, to the Atomic Weapons Research Establishment close to Aldermaston in Berkshire, England, to demonstrate their opposition to nuclear weapons.
The Aldermaston marches continued into the late 1960s when tens of thousands of people took part in the four-day marches.
In 1959, a letter in the Bulletin of the Atomic Scientists was the start of a successful campaign to stop the Atomic Energy Commission from dumping radioactive waste in the sea 19 kilometres from Boston.
In 1962, Linus Pauling won the Nobel Peace Prize for his work to stop the atmospheric testing of nuclear weapons, and the "Ban the Bomb" movement spread.
In 1963, many countries ratified the Partial Test Ban Treaty prohibiting atmospheric nuclear testing. Radioactive fallout became less of an issue and the anti-nuclear weapons movement went into decline for some years. A resurgence of interest occurred amid European and American fears of nuclear war in the 1980s.
Costs and technology spin-offs:
See also:
- Global Positioning System,
- Nuclear weapons delivery,
- History of computing hardware,
- ENIAC,
- Swords to ploughshares
According to an audit by the Brookings Institution, between 1940 and 1996, the U.S. spent $10.9 trillion in present-day terms on nuclear weapons programs:
- 57% of the total was spent on building nuclear weapons delivery systems;
- 6.3% of the total ($681 billion in present-day terms) was spent on environmental remediation and nuclear waste management, for example cleaning up the Hanford site;
- and 7% of the total ($763 billion) was spent on making the nuclear weapons themselves (a quick arithmetic check of these shares appears below).
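The dollar figures above follow directly from the quoted percentages of the $10.9 trillion total. The short sketch below is an illustration, not part of the Brookings audit itself; small differences from the quoted amounts reflect rounding of the published percentages.

```python
# Recompute the Brookings cost shares from the rounded percentages quoted above.
TOTAL_TRILLION = 10.9  # total U.S. nuclear weapons spending, 1940-1996, present-day terms

shares = {
    "delivery systems": 0.57,
    "environmental remediation and waste management": 0.063,
    "the nuclear weapons themselves": 0.07,
}

for item, fraction in shares.items():
    billions = TOTAL_TRILLION * fraction * 1000
    print(f"{item}: about ${billions:,.0f} billion")
# delivery systems: about $6,213 billion
# remediation and waste management: about $687 billion (quoted as $681 billion)
# the nuclear weapons themselves: about $763 billion
```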
Non-weapons uses:
Main article: Peaceful nuclear explosion
Peaceful nuclear explosions are nuclear explosions conducted for non-military purposes, such as activities related to economic development including the creation of canals. During the 1960s and 1970s, both the United States and the Soviet Union conducted a number of PNEs. Six of the explosions by the Soviet Union are considered to have been of an applied nature, not just tests.
The United States and the Soviet Union later halted their programs. Definitions and limits are covered in the Peaceful Nuclear Explosions Treaty of 1976. The stalled Comprehensive Nuclear-Test-Ban Treaty of 1996 would prohibit all nuclear explosions, regardless of whether they are for peaceful purposes or not.
Click on any of the following blue hyperlinks for more about Nuclear Weapons:
- History of development
- See also:
- Media related to Nuclear weapons at Wikimedia Commons
- Nuclear Weapon Archive from Carey Sublette: reliable source, has links to other sources and an informative FAQ.
- The Federation of American Scientists provide information on weapons of mass destruction, including nuclear weapons and their effects
- The National Museum of Nuclear Science & History (United States) – located in New Mexico; a Smithsonian Affiliate Museum
- Nuclear Emergency and Radiation Resources
- The Manhattan Project: Making the Atomic Bomb at AtomicArchive.com
- Los Alamos National Laboratory: History (U.S. nuclear history)
- Race for the Superbomb, PBS website on the history of the H-bomb
- Recordings of recollections of the victims of Hiroshima and Nagasaki
- The Woodrow Wilson Center's Nuclear Proliferation International History Project or NPIHP is a global network of individuals and institutions engaged in the study of international nuclear history through archival documents, oral history interviews and other empirical sources.
- NUKEMAP3D – a 3D nuclear weapons effects simulator powered by Google Maps.
- Cobalt bomb
- Cosmic bomb (phrase)
- Cuban Missile Crisis
- Dirty bomb
- Nth Country Experiment
- Nuclear blackout
- Nuclear bunker buster
- Nuclear weapons of the United Kingdom
- Nuclear weapons in popular culture
- OPANAL (Agency for the Prohibition of Nuclear Weapons in Latin America and the Caribbean)
- Three Non-Nuclear Principles of Japan
Nuclear Weapons of the United States
- YouTube Video: Atomic Bomb: The Rise Of The Nuclear Superpowers | M.A.D. World | Timeline
- YouTube Video: America's Nuclear Bomb: Most Dangerous Weapon In Action
- YouTube Video: The True Scale of Nuclear Weapons
The United States was the first country to manufacture nuclear weapons and is the only country to have used them in combat, with the bombings of Hiroshima and Nagasaki in World War II.
Before and during the Cold War, it conducted 1,054 nuclear tests, and tested many long-range nuclear weapons delivery systems.
Between 1940 and 1996, the U.S. federal government spent at least US$10.9 trillion in present-day terms on nuclear weapons, including platforms development (aircraft, rockets and facilities), command and control, maintenance, waste management and administrative costs.
It is estimated that the United States has produced more than 70,000 nuclear warheads since 1945, more than all other nuclear weapon states combined. Until November 1962, the vast majority of U.S. nuclear tests were above ground. After the acceptance of the Partial Nuclear Test Ban Treaty, all testing was relegated underground, in order to prevent the dispersion of nuclear fallout.
By 1998, at least US$759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. By March 2021 over US$2.5 billion in compensation had been paid to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program.
In 2019, the U.S. and Russia possessed a comparable number of nuclear warheads; together, these two nations possess more than 90% of the world's nuclear weapons stockpile.
As of 2020, the United States had a stockpile of 3,750 active and inactive nuclear warheads plus approximately 2,000 warheads retired and awaiting dismantlement. Of the stockpiled warheads, the U.S. stated in its March 2019 New START declaration that 1,365 were deployed on 656 ICBMs, SLBMs, and strategic bombers.
Development history:
Manhattan Project:
Main article: Manhattan Project
The United States first began developing nuclear weapons during World War II under the order of President Franklin Roosevelt in 1939, motivated by the fear that it was engaged in a race with Nazi Germany to develop such a weapon.
After a slow start under the direction of the National Bureau of Standards, at the urging of British scientists and American administrators, the program was put under the Office of Scientific Research and Development, and in 1942 it was officially transferred under the auspices of the United States Army and became known as the Manhattan Project, an American, British and Canadian joint venture.
Under the direction of General Leslie Groves, over thirty different sites were constructed for the research, production, and testing of components related to bomb-making. These included the Los Alamos National Laboratory at Los Alamos, New Mexico, under the direction of physicist Robert Oppenheimer, the Hanford plutonium production facility in Washington, and the Y-12 National Security Complex in Tennessee.
By investing heavily in breeding plutonium in early nuclear reactors and in the electromagnetic and gaseous diffusion enrichment processes for the production of uranium-235, the United States was able to develop three usable weapons by mid-1945. The Trinity test was a plutonium implosion-design weapon tested on 16 July 1945, with around a 20 kiloton yield.
Faced with a planned invasion of the Japanese home islands scheduled to begin on 1 November 1945 and with Japan not surrendering, President Harry S. Truman ordered the atomic raids on Japan. On 6 August 1945, the U.S. detonated a uranium-gun design bomb, Little Boy, over the Japanese city of Hiroshima with an energy of about 15 kilotons of TNT, killing approximately 70,000 people, among them 20,000 Japanese combatants and 20,000 Korean slave laborers, and destroying nearly 50,000 buildings (including the 2nd General Army and Fifth Division headquarters).
Three days later, on 9 August, the U.S. attacked Nagasaki using a plutonium implosion-design bomb, Fat Man, with the explosion equivalent to about 20 kilotons of TNT, destroying 60% of the city and killing approximately 35,000 people, among them 23,200–28,200 Japanese munitions workers, 2,000 Korean slave laborers, and 150 Japanese combatants.
On 1 January 1947, the Atomic Energy Act of 1946 (known as the McMahon Act) took effect, and the Manhattan Project was officially turned over to the United States Atomic Energy Commission (AEC).
On 15 August 1947, the Manhattan Project was abolished.
During the Cold War:
The American atomic stockpile was small and grew slowly in the immediate aftermath of World War II, and the size of that stockpile was a closely guarded secret. However, there were forces that pushed the United States towards greatly increasing the size of the stockpile.
Some of these forces were international in origin, focused on the increasing tensions of the Cold War, including the loss of China, the Soviet Union becoming an atomic power, and the onset of the Korean War. Others were domestic: both the Truman administration and the Eisenhower administration wanted to rein in military spending and avoid budget deficits and inflation, and the perception was that nuclear weapons gave more "bang for the buck" and thus were the most cost-efficient way to respond to the security threat the Soviet Union represented.
As a result, beginning in 1950 the AEC embarked on a massive expansion of its production facilities, an effort that would eventually be one of the largest U.S. government construction projects ever to take place outside of wartime. This production would soon include the far more powerful hydrogen bomb, which the United States had decided to move forward with after an intense debate during 1949–50, as well as much smaller tactical atomic weapons for battlefield use.
By 1990, the United States had produced more than 70,000 nuclear warheads, in over 65 different varieties, ranging in yield from around 0.01 kilotons (such as the man-portable Davy Crockett shell) to the 25 megaton B41 bomb. Between 1940 and 1996, the U.S. spent at least $10.9 trillion in present-day terms on nuclear weapons development. Over half was spent on building delivery mechanisms for the weapons. $681 billion in present-day terms was spent on nuclear waste management and environmental remediation.
Richland, Washington, was the first city established to support plutonium production at the nearby Hanford nuclear site; the plutonium produced there was used in Cold War atomic bombs for the American nuclear arsenal.
Throughout the Cold War, the U.S. and USSR threatened each other with all-out nuclear attack in case of war, regardless of whether it was a conventional or a nuclear clash. U.S. nuclear doctrine called for mutually assured destruction (MAD), which entailed a massive nuclear attack against strategic targets and major population centers of the Soviet Union and its allies.
The term "mutual assured destruction" was coined in 1962 by American strategist Donald Brennan. MAD was implemented by deploying nuclear weapons simultaneously on three different types of weapons platforms.
Post–Cold War:
After the 1989 end of the Cold War and the 1991 dissolution of the Soviet Union, the U.S. nuclear program was heavily curtailed: the United States halted its program of nuclear testing, ceased its production of new nuclear weapons, and reduced its stockpile by half by the mid-1990s under President Bill Clinton.
Many former nuclear facilities were closed, and their sites became targets of extensive environmental remediation. Efforts were redirected from weapons production to stockpile stewardship, which attempts to predict the behavior of aging weapons without using full-scale nuclear testing.
Increased funding was directed to nuclear non-proliferation programs, such as helping the states of the former Soviet Union to eliminate their former nuclear sites and assisting Russia in its efforts to inventory and secure its inherited nuclear stockpile.
By February 2006, over $1.2 billion had been paid under the Radiation Exposure Compensation Act of 1990 to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program, and by 1998 at least $759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. Over $15 million was paid to the Japanese government following the exposure of its citizens and food supply to nuclear fallout from the 1954 "Bravo" test.
In 1998, the country spent an estimated $35.1 billion on its nuclear weapons and weapons-related programs.
In the 2013 book Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters (Oxford), Kate Brown explores the health of affected citizens in the United States, and the "slow-motion disasters" that still threaten the environments where the plants are located.
According to Brown, the plants at Hanford, over a period of four decades, released millions of curies of radioactive isotopes into the surrounding environment. Brown says that most of this radioactive contamination at Hanford over the years was part of normal operations, but unforeseen accidents did occur, and plant management kept this secret as the pollution continued unabated. Even today, as pollution threats to health and the environment persist, the government keeps knowledge about the associated risks from the public.
During the presidency of George W. Bush, and especially after the 11 September terrorist attacks of 2001, rumors circulated in major news sources that the U.S. was considering designing new nuclear weapons ("bunker-busting nukes") and resuming nuclear testing for reasons of stockpile stewardship.
Republicans argued that small nuclear weapons appear more likely to be used than large nuclear weapons, and thus small nuclear weapons pose a more credible threat that has more of a deterrent effect against hostile behavior.
Democrats counterargued that allowing the weapons could trigger an arms race. In 2003, the Senate Armed Services Committee voted to repeal the 1993 Spratt-Furse ban on the development of small nuclear weapons. This change was part of the 2004 fiscal year defense authorization.
The Bush administration wanted the repeal so that they could develop weapons to address the threat from North Korea. "Low-yield weapons" (those with one-third the force of the bomb that was dropped on Hiroshima in 1945) were permitted to be developed.
The Bush administration was unsuccessful in its goal of developing a guided low-yield nuclear weapon; however, in 2010 President Barack Obama began funding development of what would become the B61-12, a guided low-yield nuclear bomb derived from the unguided B61 "dumb bomb".
Statements by the U.S. government in 2004 indicated that they planned to decrease the arsenal to around 5,500 total warheads by 2012. Much of that reduction was already accomplished by January 2008.
According to the Pentagon's June 2019 Doctrine for Joint Nuclear Operations, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."
Nuclear weapons testing:
Between 16 July 1945 and 23 September 1992, the United States maintained a program of vigorous nuclear testing, with the exception of a moratorium between November 1958 and September 1961.
By official count, a total of 1,054 nuclear tests and two nuclear attacks were conducted, with over 100 of them taking place at sites in the Pacific Ocean, over 900 of them at the Nevada Test Site, and ten on miscellaneous sites in the United States (Alaska, Colorado, Mississippi, and New Mexico).
Until November 1962, the vast majority of the U.S. tests were atmospheric (that is, above-ground); after the acceptance of the Partial Test Ban Treaty all testing was relegated underground, in order to prevent the dispersion of nuclear fallout.
The U.S. program of atmospheric nuclear testing exposed portions of the population to the hazards of fallout. Estimating the exact numbers, and the exact consequences, of people exposed has been medically very difficult, with the exception of the high exposures of Marshall Islanders and Japanese fishermen in the case of the Castle Bravo incident in 1954.
A number of groups of U.S. citizens—especially farmers and inhabitants of cities downwind of the Nevada Test Site and U.S. military workers at various tests—have sued for compensation and recognition of their exposure, many successfully. The passage of the Radiation Exposure Compensation Act of 1990 allowed for a systematic filing of compensation claims in relation to testing as well as those employed at nuclear weapons facilities.
By June 2009, over $1.4 billion in total had been given in compensation, with over $660 million going to "downwinders".
A few notable U.S. nuclear tests include:
A summary table of each of the American operational series may be found at United States' nuclear test series.
Delivery systems:
Main article: Nuclear weapons delivery
The original Little Boy and Fat Man weapons, developed by the United States during the Manhattan Project, were relatively large (Fat Man had a diameter of 5 feet (1.5 m)) and heavy (around 5 tons each), and they required specially modified bomber planes for their bombing missions against Japan. Each modified bomber could only carry one such weapon and only within a limited range.
After these initial weapons were developed, a considerable amount of money and research went toward standardizing nuclear warheads, so that they no longer required highly specialized experts to assemble them before use, as had been the case with the idiosyncratic wartime devices, and toward miniaturizing the warheads for use in more varied delivery systems.
With the aid of expertise acquired through Operation Paperclip at the tail end of the European theater of World War II, the United States was able to embark on an ambitious program in rocketry.
One of the first products of this was the development of rockets capable of holding nuclear warheads. The MGR-1 Honest John was the first such weapon, developed in 1953 as a surface-to-surface missile with a 15-mile (24 km) maximum range. Because of their limited range, their potential use was heavily constrained (they could not, for example, threaten Moscow with an immediate strike).
Development of long-range bombers, such as the B-29 Superfortress during World War II, was continued during the Cold War period. In 1946, the Convair B-36 Peacemaker became the first purpose-built nuclear bomber; it served with the USAF until 1959.
The Boeing B-52 Stratofortress was able by the mid-1950s to carry a wide arsenal of nuclear bombs, each with different capabilities and potential use situations. Starting in 1946, the U.S. based its initial deterrence force on the Strategic Air Command, which, by the late 1950s, maintained a number of nuclear-armed bombers in the sky at all times, prepared to receive orders to attack the USSR whenever needed. This system was, however, tremendously expensive, both in terms of natural and human resources, and raised the possibility of an accidental nuclear war.
During the 1950s and 1960s, elaborate computerized early warning systems such as Defense Support Program were developed to detect incoming Soviet attacks and to coordinate response strategies.
During this same period, intercontinental ballistic missile (ICBM) systems were developed that could deliver a nuclear payload across vast distances, allowing the U.S. to base nuclear forces in the American Midwest that were capable of hitting the Soviet Union. Shorter-range weapons, including small tactical weapons, were fielded in Europe as well, including nuclear artillery and the man-portable Special Atomic Demolition Munition.
The development of submarine-launched ballistic missile systems allowed for hidden nuclear submarines to covertly launch missiles at distant targets as well, making it virtually impossible for the Soviet Union to successfully launch a first strike attack against the United States without receiving a deadly response.
Improvements in warhead miniaturization in the 1970s and 1980s allowed for the development of MIRVs—missiles which could carry multiple warheads, each of which could be separately targeted.
The question of whether these missiles should be kept moving continuously on rail networks (to avoid being easily targeted by opposing Soviet missiles) or based in heavily fortified silos (to possibly withstand a Soviet attack) was a major political controversy in the 1980s; eventually the silo deployment method was chosen. MIRVed systems enabled the U.S. to render Soviet missile defenses economically unfeasible, as each offensive missile would require between three and ten defensive missiles to counter.
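The cost-exchange argument above can be made concrete with a small sketch. The numbers here are hypothetical and not from the source: each MIRVed missile multiplies the number of warheads the defense must intercept, and therefore the number of interceptors it must buy.

```python
# Interceptors the defense must field to cover every incoming warhead,
# assuming (hypothetically) a fixed number of interceptors per warhead.

def interceptors_needed(offensive_missiles: int,
                        warheads_per_missile: int,
                        interceptors_per_warhead: int = 1) -> int:
    """Total interceptors required to cover every warhead in the attack."""
    return offensive_missiles * warheads_per_missile * interceptors_per_warhead

# 100 single-warhead missiles vs. 100 MIRVed missiles with 10 warheads each.
print(interceptors_needed(100, 1))    # 100 interceptors
print(interceptors_needed(100, 10))   # 1,000 interceptors for the same 100 boosters
```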
Additional developments in weapons delivery included cruise missile systems, which allowed a plane to fire a long-distance, low-flying nuclear-armed missile towards a target from a relatively comfortable distance.
The current delivery systems of the U.S. make virtually any part of the Earth's surface within reach of its nuclear arsenal. Though its land-based missile systems have a maximum range of 10,000 kilometres (6,200 mi) (less than worldwide), its submarine-based forces can reach targets up to 12,000 kilometres (7,500 mi) inland from a coastline. Additionally, in-flight refueling of long-range bombers and the use of aircraft carriers extends the possible range virtually indefinitely.
Command and control:
Command and control procedures in case of nuclear war were given by the Single Integrated Operational Plan (SIOP) until 2003, when this was superseded by Operations Plan 8044.
Since World War II, the President of the United States has had sole authority to launch U.S. nuclear weapons, whether as a first strike or nuclear retaliation. This arrangement was seen as necessary during the Cold War to present a credible nuclear deterrent; if an attack was detected, the United States would have only minutes to launch a counterstrike before its nuclear capability was severely damaged, or national leaders killed.
If the President has been killed, command authority follows the presidential line of succession. Changes to this policy have been proposed, but currently the only way to countermand such an order before the strike was launched would be for the Vice President and the majority of the Cabinet to relieve the President under Section 4 of the Twenty-fifth Amendment to the United States Constitution.
Regardless of whether the United States is actually under attack by a nuclear-capable adversary, the President alone has the authority to order nuclear strikes. The President and the Secretary of Defense form the National Command Authority, but the Secretary of Defense has no authority to refuse or disobey such an order.
The President's decision must be transmitted to the National Military Command Center, which will then issue the coded orders to nuclear-capable forces.
The President can give a nuclear launch order using their nuclear briefcase (nicknamed the nuclear football), or can use command centers such as the White House Situation Room.
The command would be carried out by a Nuclear and Missile Operations Officer (a member of a missile combat crew, also called a "missileer") at a missile launch control center. A two-man rule applies to the launch of missiles, meaning that two officers must turn keys simultaneously (far enough apart that this cannot be done by one person).
When President Reagan was shot in 1981, there was confusion about where the "nuclear football" was, and who was in charge.
Starting with President Eisenhower, authority to launch a full-scale nuclear attack has been delegated to theater commanders and other specific commanders if they believe it is warranted by circumstances, and are out of communication with the president or the president had been incapacitated.
For example, during the Cuban Missile Crisis, on 24 October 1962, General Thomas Power, commander of the Strategic Air Command (SAC), took the country to DEFCON 2, the very precipice of full-scale nuclear war, launching the SAC bombers of the US with nuclear weapons ready to strike.
Moreover, some of these commanders sub-delegated to lower commanders the authority to launch nuclear weapons under similar circumstance. In fact, the nuclear weapons were not placed under locks (i.e., permissive action links) until decades later, and so pilots or individual submarine commanders had the power to launch nuclear weapons entirely on their own, without higher authority.
Accidents:
Main articles: Nuclear and radiation accidents and List of military nuclear accidents
The United States nuclear program since its inception has experienced accidents of varying forms, ranging from single-casualty research experiments (such as that of Louis Slotin during the Manhattan Project), to the nuclear fallout dispersion of the Castle Bravo shot in 1954, to accidents such as crashes of aircraft carrying nuclear weapons, the dropping of nuclear weapons from aircraft, losses of nuclear submarines, and explosions of nuclear-armed missiles (broken arrows).
How close any of these accidents came to being major nuclear disasters is a matter of technical and scholarly debate and interpretation.
Weapons accidentally dropped by the United States include:
In some of these cases (such as the 1966 Palomares case), the explosive system of the fission weapon discharged, but did not trigger a nuclear chain reaction (safety features prevent this from easily happening), but did disperse hazardous nuclear materials across wide areas, necessitating expensive cleanup endeavors.
Several US nuclear weapons, partial weapons, or weapons components are thought to be lost and unrecovered, primarily in aircraft accidents. The 1980 Damascus Titan missile explosion in Damascus, Arkansas, threw a warhead from its silo but did not release any radiation.
The nuclear testing program resulted in a number of cases of fallout dispersion onto populated areas.
The most significant of these was the Castle Bravo test, which spread radioactive ash over an area of over 100 square miles (260 km2), including a number of populated islands. The populations of the islands were evacuated but not before suffering radiation burns. They would later suffer long-term effects, such as birth defects and increased cancer risk. There are ongoing concerns around deterioration of the nuclear waste site on Runit Island and a potential radioactive spill.
There were also instances during the nuclear testing program in which soldiers were exposed to overly high levels of radiation, which grew into a major scandal in the 1970s and 1980s, as many soldiers later suffered from what were claimed to be diseases caused by their exposures.
Many of the former nuclear facilities produced significant environmental damages during their years of activity, and since the 1990s have been Superfund sites of cleanup and environmental remediation. Hanford is currently the most contaminated nuclear site in the United States and is the focus of the nation's largest environmental cleanup.
Radioactive materials are known to be leaking from Hanford into the environment. The Radiation Exposure Compensation Act of 1990 allows for U.S. citizens exposed to radiation or other health risks through the U.S. nuclear program to file for compensation and damages.
Deliberate attacks on weapons facilities:
Main article: Vulnerability of nuclear plants to attack
In 1972, three hijackers took control of a domestic passenger flight along the east coast of the U.S. and threatened to crash the plane into a U.S. nuclear weapons plant in Oak Ridge, Tennessee. The plane got as close as 8,000 feet above the site before the hijackers' demands were met.
Various acts of civil disobedience since 1980 by the peace group Plowshares have shown how nuclear weapons facilities can be penetrated, and the group's actions represent extraordinary breaches of security at nuclear weapons plants in the United States.
The National Nuclear Security Administration has acknowledged the seriousness of the 2012 Plowshares action. Non-proliferation policy experts have questioned "the use of private contractors to provide security at facilities that manufacture and store the government's most dangerous military material".
Nuclear weapons materials on the black market are a global concern, and there is concern about the possible detonation of a small, crude nuclear weapon by a militant group in a major city, with significant loss of life and property.
Stuxnet is a computer worm discovered in June 2010 that is believed to have been created by the United States and Israel to attack Iran's nuclear fuel enrichment facilities.
Development agencies
The initial U.S. nuclear program was run by the National Bureau of Standards starting in 1939 under the edict of President Franklin Delano Roosevelt. Its primary purpose was to delegate research and dispense funds. In 1940 the National Defense Research Committee (NDRC) was established, coordinating work under the Committee on Uranium among its other wartime efforts.
In June 1941, the Office of Scientific Research and Development (OSRD) was established, with the NDRC as one of its subordinate agencies, which enlarged and renamed the Uranium Committee as the Section on Uranium.
In 1941, NDRC research was placed under direct control of Vannevar Bush as the OSRD S-1 Section, which attempted to increase the pace of weapons research. In June 1942, the U.S. Army Corps of Engineers took over the project to develop atomic weapons, while the OSRD retained responsibility for scientific research.
This was the beginning of the Manhattan Project, run as the Manhattan Engineering District (MED), an agency under military control that was in charge of developing the first atomic weapons. After World War II, the MED maintained control over the U.S. arsenal and production facilities and coordinated the Operation Crossroads tests.
In 1946 after a long and protracted debate, the Atomic Energy Act of 1946 was passed, creating the Atomic Energy Commission (AEC) as a civilian agency that would be in charge of the production of nuclear weapons and research facilities, funded through Congress, with oversight provided by the Joint Committee on Atomic Energy.
The AEC was given vast powers of control over secrecy, research, and money, and could seize lands with suspected uranium deposits. Along with its duties towards the production and regulation of nuclear weapons, it was also in charge of stimulating development and regulating civilian nuclear power. The full transference of activities was finalized in January 1947.
In 1975, following the "energy crisis" of the early 1970s and public and congressional discontent with the AEC (in part because of the impossibility to be both a producer and a regulator), it was disassembled into component parts as the Energy Research and Development Administration (ERDA), which assumed most of the AEC's former production, coordination, and research roles, and the Nuclear Regulatory Commission, which assumed its civilian regulation activities.
ERDA was short-lived, however, and in 1977 the U.S. nuclear weapons activities were reorganized under the Department of Energy, which maintains such responsibilities through the semi-autonomous National Nuclear Security Administration. Some functions were taken over or shared by the Department of Homeland Security in 2002.
The already-built weapons themselves are in the control of the Strategic Command, which is part of the Department of Defense.
In general, these agencies served to coordinate research and build sites. They generally operated their sites through contractors, however, both private and public (for example:
Funding was received both through these agencies directly, but also from additional outside agencies, such as the Department of Defense. Each branch of the military also maintained its own nuclear-related research agencies (generally related to delivery systems).
Weapons production complex:
In addition to deploying weapons on its own soil, during the Cold War, the United States also stationed nuclear weapons in 27 foreign countries and territories, including:
The table below is not comprehensive, as numerous facilities throughout the United States have contributed to its nuclear weapons program. It includes the major sites related to the U.S. weapons program (past and present), their basic site functions, and their current status of activity.
Not listed are the many bases and facilities at which nuclear weapons have been deployed.
Before and during the Cold War, the United States conducted 1,054 nuclear tests and tested many long-range nuclear weapons delivery systems.
Between 1940 and 1996, the U.S. federal government spent at least US$10.9 trillion in present-day terms on nuclear weapons, including platforms development (aircraft, rockets and facilities), command and control, maintenance, waste management and administrative costs.
It is estimated that the United States produced more than 70,000 nuclear warheads since 1945, more than all other nuclear weapon states combined. Until November 1962, the vast majority of U.S. nuclear tests were above ground. After the acceptance of the Partial Nuclear Test Ban Treaty, all testing was relegated underground, in order to prevent the dispersion of nuclear fallout.
By 1998, at least US$759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. By March 2021 over US$2.5 billion in compensation had been paid to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program.
In 2019, the U.S. and Russia possessed a comparable number of nuclear warheads; together, these two nations possess more than 90% of the world's nuclear weapons stockpile.
As of 2020, the United States had a stockpile of 3,750 active and inactive nuclear warheads plus approximately 2,000 warheads retired and awaiting dismantlement. Of the stockpiled warheads, the U.S. stated in its March 2019 New START declaration that 1,365 were deployed on 656 ICBMs, SLBMs, and strategic bombers.
Development history
Manhattan Project
Main article: Manhattan Project
The United States first began developing nuclear weapons during World War II under the order of President Franklin Roosevelt in 1939, motivated by the fear that it was engaged in a race with Nazi Germany to develop such a weapon.
After a slow start under the direction of the National Bureau of Standards, at the urging of British scientists and American administrators, the program was put under the Office of Scientific Research and Development, and in 1942 it was officially transferred under the auspices of the United States Army and became known as the Manhattan Project, an American, British and Canadian joint venture.
Under the direction of General Leslie Groves, over thirty different sites were constructed for the research, production, and testing of components related to bomb-making. These included the Los Alamos National Laboratory at Los Alamos, New Mexico, under the direction of physicist Robert Oppenheimer, the Hanford plutonium production facility in Washington, and the Y-12 National Security Complex in Tennessee.
By investing heavily in breeding plutonium in early nuclear reactors and in the electromagnetic and gaseous diffusion enrichment processes for the production of uranium-235, the United States was able to develop three usable weapons by mid-1945. The Trinity test was a plutonium implosion-design weapon tested on 16 July 1945, with around a 20 kiloton yield.
Faced with a planned invasion of the Japanese home islands scheduled to begin on 1 November 1945 and with Japan not surrendering, President Harry S. Truman ordered the atomic raids on Japan. On 6 August 1945, the U.S. detonated a uranium-gun design bomb, Little Boy, over the Japanese city of Hiroshima with an energy of about 15 kilotons of TNT, killing approximately 70,000 people, among them 20,000 Japanese combatants and 20,000 Korean slave laborers, and destroying nearly 50,000 buildings (including the 2nd General Army and Fifth Division headquarters).
Three days later, on 9 August, the U.S. attacked Nagasaki using a plutonium implosion-design bomb, Fat Man, with the explosion equivalent to about 20 kilotons of TNT, destroying 60% of the city and killing approximately 35,000 people, among them 23,200–28,200 Japanese munitions workers, 2,000 Korean slave laborers, and 150 Japanese combatants.
On 1 January 1947, the Atomic Energy Act of 1946 (known as the McMahon Act) took effect, and the Manhattan Project was officially turned over to the United States Atomic Energy Commission (AEC).
On 15 August 1947, the Manhattan Project was abolished.
During the Cold War:
The American atomic stockpile was small and grew slowly in the immediate aftermath of World War II, and the size of that stockpile was a closely guarded secret. However, there were forces that pushed the United States towards greatly increasing the size of the stockpile.
Some of these were international in origin and focused on the increasing tensions of the Cold War, including the loss of China, the Soviet Union becoming an atomic power, and the onset of the Korean War. Others were domestic: both the Truman and Eisenhower administrations wanted to rein in military spending and avoid budget deficits and inflation, and nuclear weapons were perceived as giving more "bang for the buck" and thus as the most cost-efficient way to respond to the security threat the Soviet Union represented.
As a result, beginning in 1950 the AEC embarked on a massive expansion of its production facilities, an effort that would eventually become one of the largest U.S. government construction projects ever undertaken outside of wartime. This production would soon include the far more powerful hydrogen bomb, which the United States had decided to move forward with after an intense debate during 1949–50, as well as much smaller tactical atomic weapons for battlefield use.
By 1990, the United States had produced more than 70,000 nuclear warheads, in over 65 different varieties, ranging in yield from around .01 kilotons (such as the man-portable Davy Crockett shell) to the 25 megaton B41 bomb. Between 1940 and 1996, the U.S. spent at least $10.9 trillion in present-day terms on nuclear weapons development. Over half was spent on building delivery mechanisms for the weapon. $681 billion in present-day terms was spent on nuclear waste management and environmental remediation.
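To put these spending figures in proportion, the following minimal Python sketch computes the shares implied by the numbers quoted above. Its only inputs are the $10.9 trillion total, the $681 billion waste figure, and the statement that over half went to delivery systems (treated here simply as a lower bound); it is a back-of-the-envelope illustration, not an independent estimate.

    # Back-of-the-envelope shares implied by the spending figures quoted above
    # (all in present-day US dollars; the delivery figure is only a lower bound,
    # since the text says "over half" went to delivery mechanisms).
    total_spend = 10.9e12               # at least $10.9 trillion, 1940-1996
    waste_spend = 681e9                 # waste management and remediation
    delivery_floor = 0.5 * total_spend  # "over half" spent on delivery systems

    print(f"Waste/remediation share: {waste_spend / total_spend:.1%}")          # ~6.2%
    print(f"Delivery systems, lower bound: ${delivery_floor / 1e12:.2f} trillion")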
Richland, Washington was the first city established to support plutonium production at the nearby Hanford nuclear site, to power the American nuclear weapons arsenals. It produced plutonium for use in cold war atomic bombs.
Throughout the Cold War, the U.S. and the USSR each threatened the other with all-out nuclear attack in case of war, regardless of whether it was a conventional or a nuclear clash. U.S. nuclear doctrine called for mutually assured destruction (MAD), which entailed a massive nuclear attack against strategic targets and major population centers of the Soviet Union and its allies.
The term "mutual assured destruction" was coined in 1962 by American strategist Donald Brennan. MAD was implemented by deploying nuclear weapons simultaneously on three different types of weapons platforms.
Post–Cold War:
After the end of the Cold War in 1989 and the dissolution of the Soviet Union in 1991, the U.S. nuclear program was heavily curtailed: the United States halted its program of nuclear testing, ceased producing new nuclear weapons, and reduced its stockpile by half by the mid-1990s under President Bill Clinton.
Many former nuclear facilities were closed, and their sites became targets of extensive environmental remediation. Efforts were redirected from weapons production to stockpile stewardship, attempting to predict the behavior of aging weapons without using full-scale nuclear testing.
Increased funding was directed to nuclear non-proliferation programs, such as helping the states of the former Soviet Union eliminate their former nuclear sites and assisting Russia in its efforts to inventory and secure its inherited nuclear stockpile.
By February 2006, over $1.2 billion had been paid under the Radiation Exposure Compensation Act of 1990 to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program, and by 1998 at least $759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. Over $15 million was paid to the Japanese government following the exposure of its citizens and food supply to nuclear fallout from the 1954 "Bravo" test.
In 1998, the country spent an estimated $35.1 billion on its nuclear weapons and weapons-related programs.
In the 2013 book Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters (Oxford), Kate Brown explores the health of affected citizens in the United States, and the "slow-motion disasters" that still threaten the environments where the plants are located.
According to Brown, the plants at Hanford, over a period of four decades, released millions of curies of radioactive isotopes into the surrounding environment. Brown says that most of this radioactive contamination at Hanford was part of normal operations, but unforeseen accidents did occur, plant management kept them secret, and the pollution continued unabated. Even today, as pollution threats to health and the environment persist, the government withholds knowledge about the associated risks from the public.
During the presidency of George W. Bush, and especially after the 11 September terrorist attacks of 2001, rumors circulated in major news sources that the U.S. was considering designing new nuclear weapons ("bunker-busting nukes") and resuming nuclear testing for reasons of stockpile stewardship.
Republicans argued that small nuclear weapons appear more likely to be used than large nuclear weapons, and thus small nuclear weapons pose a more credible threat that has more of a deterrent effect against hostile behavior.
Democrats counterargued that allowing the weapons could trigger an arms race. In 2003, the Senate Armed Services Committee voted to repeal the 1993 Spratt-Furse ban on the development of small nuclear weapons. This change was part of the 2004 fiscal year defense authorization.
The Bush administration wanted the repeal so that they could develop weapons to address the threat from North Korea. "Low-yield weapons" (those with one-third the force of the bomb that was dropped on Hiroshima in 1945) were permitted to be developed.
The Bush administration was unsuccessful in its goal of developing a guided low-yield nuclear weapon; however, in 2010 President Barack Obama began funding and development of what would become the B61-12, a guided ("smart") low-yield nuclear bomb developed from the unguided B61 "dumb bomb".
Statements by the U.S. government in 2004 indicated that they planned to decrease the arsenal to around 5,500 total warheads by 2012. Much of that reduction was already accomplished by January 2008.
According to the Pentagon's June 2019 Doctrine for Joint Nuclear Operations, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."
Nuclear weapons testing:
Between 16 July 1945 and 23 September 1992, the United States maintained a program of vigorous nuclear testing, with the exception of a moratorium between November 1958 and September 1961.
By official count, a total of 1,054 nuclear tests and two nuclear attacks were conducted, with over 100 of them taking place at sites in the Pacific Ocean, over 900 of them at the Nevada Test Site, and ten on miscellaneous sites in the United States (Alaska, Colorado, Mississippi, and New Mexico).
Until November 1962, the vast majority of the U.S. tests were atmospheric (that is, above-ground); after the acceptance of the Partial Test Ban Treaty all testing was relegated underground, in order to prevent the dispersion of nuclear fallout.
The U.S. program of atmospheric nuclear testing exposed a portion of the population to the hazards of fallout. Estimating the exact numbers of people exposed, and the exact consequences, has been medically very difficult, with the exception of the high exposures of Marshall Islanders and Japanese fishermen in the case of the Castle Bravo incident in 1954.
A number of groups of U.S. citizens—especially farmers and inhabitants of cities downwind of the Nevada Test Site and U.S. military workers at various tests—have sued for compensation and recognition of their exposure, many successfully. The passage of the Radiation Exposure Compensation Act of 1990 allowed for a systematic filing of compensation claims in relation to testing as well as those employed at nuclear weapons facilities.
By June 2009, over $1.4 billion total had been given in compensation, with over $660 million going to "downwinders".
A few notable U.S. nuclear tests include:
- Trinity test on 16 July 1945, was the world's first test of a nuclear weapon (yield of around 20 kt).
- Operation Crossroads series in July 1946, was the first postwar test series and one of the largest military operations in U.S. history.
- Operation Greenhouse shots of May 1951 included the first boosted fission weapon test ("Item") and a scientific test that proved the feasibility of thermonuclear weapons ("George").
- Ivy Mike shot of 1 November 1952, was the first full test of a Teller-Ulam design "staged" hydrogen bomb, with a yield of 10 megatons. It was not a deployable weapon, however—with its full cryogenic equipment it weighed some 82 tons.
- Castle Bravo shot of 1 March 1954, was the first test of a deployable (solid fuel) thermonuclear weapon, and also (accidentally) the largest weapon ever tested by the United States (15 megatons). It was also the single largest U.S. radiological accident in connection with nuclear testing. The unanticipated yield, and a change in the weather, resulted in nuclear fallout spreading eastward onto the inhabited Rongelap and Rongerik atolls, which were soon evacuated. Many of the Marshall Islanders have since suffered from birth defects and have received some compensation from the federal government. A Japanese fishing boat, Daigo Fukuryū Maru, also came into contact with the fallout, which caused many of the crew to grow ill; one eventually died.
- Shot Argus I of Operation Argus, on 27 August 1958, was the first detonation of a nuclear weapon in outer space when a 1.7-kiloton warhead was detonated at an altitude of 200 kilometres (120 mi) during a series of high altitude nuclear explosions.
- Shot Frigate Bird of Operation Dominic I on 6 May 1962, was the only U.S. test of an operational submarine-launched ballistic missile (SLBM) with a live nuclear warhead (yield of 600 kilotons), at Christmas Island. In general, missile systems were tested without live warheads, and warheads were tested separately for safety reasons. In the early 1960s, however, technical questions mounted about how the systems would behave under combat conditions (when they were "mated", in military parlance), and this test was meant to dispel those concerns. The warhead had to be somewhat modified before its use, however, and the missile was an SLBM (not an ICBM), so by itself the test did not satisfy all concerns.
- Shot Sedan of Operation Storax on 6 July 1962 (yield of 104 kilotons), was an attempt to show the feasibility of using nuclear weapons for "civilian" and "peaceful" purposes as part of Operation Plowshare. In this instance, a 1,280-foot (390 m) diameter 320-foot (98 m) deep crater was created at the Nevada Test Site.
A summary table of each of the American operational series may be found at United States' nuclear test series.
Delivery systems:
Main article: Nuclear weapons delivery
The original Little Boy and Fat Man weapons, developed by the United States during the Manhattan Project, were relatively large (Fat Man had a diameter of 5 feet (1.5 m)) and heavy (around 5 tons each) and required specially modified bomber planes to be adapted for their bombing missions against Japan. Each modified bomber could only carry one such weapon and only within a limited range.
After these initial weapons were developed, a considerable amount of money and research was devoted to standardizing nuclear warheads so that they did not require highly specialized experts to assemble them before use, as was the case with the idiosyncratic wartime devices, and to miniaturizing the warheads for use in more varied delivery systems.
With the aid of expertise acquired through Operation Paperclip at the end of the European theater of World War II, the United States was able to embark on an ambitious program in rocketry.
One of the first products of this was the development of rockets capable of holding nuclear warheads. The MGR-1 Honest John was the first such weapon, developed in 1953 as a surface-to-surface missile with a 15-mile (24 km) maximum range. Because of their limited range, their potential use was heavily constrained (they could not, for example, threaten Moscow with an immediate strike).
Development of long-range bombers, such as the B-29 Superfortress during World War II, was continued during the Cold War period. In 1946, the Convair B-36 Peacemaker became the first purpose-built nuclear bomber; it served with the USAF until 1959.
The Boeing B-52 Stratofortress was able by the mid-1950s to carry a wide arsenal of nuclear bombs, each with different capabilities and potential use situations. Starting in 1946, the U.S. based its initial deterrence force on the Strategic Air Command, which, by the late 1950s, maintained a number of nuclear-armed bombers in the sky at all times, prepared to receive orders to attack the USSR whenever needed. This system was, however, tremendously expensive, both in terms of natural and human resources, and raised the possibility of an accidental nuclear war.
During the 1950s and 1960s, elaborate computerized early warning systems such as Defense Support Program were developed to detect incoming Soviet attacks and to coordinate response strategies.
During this same period, intercontinental ballistic missile (ICBM) systems were developed that could deliver a nuclear payload across vast distances, allowing the U.S. to base nuclear forces in the American Midwest that were capable of hitting the Soviet Union. Shorter-range weapons, including small tactical weapons, were fielded in Europe as well, including nuclear artillery and the man-portable Special Atomic Demolition Munition.
The development of submarine-launched ballistic missile systems allowed for hidden nuclear submarines to covertly launch missiles at distant targets as well, making it virtually impossible for the Soviet Union to successfully launch a first strike attack against the United States without receiving a deadly response.
Improvements in warhead miniaturization in the 1970s and 1980s allowed for the development of MIRVs—missiles which could carry multiple warheads, each of which could be separately targeted.
The question of whether these missiles should be based on continuously moving rail cars (to avoid being easily targeted by opposing Soviet missiles) or in heavily fortified silos (to possibly withstand a Soviet attack) was a major political controversy in the 1980s; eventually the silo deployment method was chosen. MIRVed systems enabled the U.S. to render Soviet missile defenses economically unfeasible, as each offensive missile would require between three and ten defensive missiles to counter.
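To make the cost-exchange argument concrete, here is a minimal, purely illustrative Python sketch. The three-to-ten interceptors-per-missile range comes from the paragraph above, while the missile count and warhead loadings are hypothetical values chosen only for the example.

    # Illustrative sketch of the MIRV cost-exchange argument described above.
    # The 3-to-10 defensive-missiles-per-offensive-missile range is quoted in
    # the text; the missile and warhead counts below are hypothetical.
    def interceptors_needed(offensive_missiles, warheads_per_missile,
                            interceptors_per_warhead=1):
        """Interceptors the defender must field to cover every incoming warhead."""
        return offensive_missiles * warheads_per_missile * interceptors_per_warhead

    missiles = 100
    for warheads in (3, 10):  # hypothetical MIRV loadings spanning the quoted range
        print(f"{missiles} missiles x {warheads} warheads -> "
              f"{interceptors_needed(missiles, warheads)} interceptors needed")

Because every additional warhead on an offensive missile multiplies the number of interceptors required, the offense can always outbuild the defense, which is the sense in which MIRVs made Soviet missile defenses economically unfeasible.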
Additional developments in weapons delivery included cruise missile systems, which allowed a plane to fire a long-distance, low-flying nuclear-armed missile towards a target from a relatively comfortable distance.
The current delivery systems of the U.S. place virtually any part of the Earth's surface within reach of its nuclear arsenal. Though its land-based missile systems have a maximum range of 10,000 kilometres (6,200 mi) (less than worldwide), its submarine-based forces extend its reach from a coastline to 12,000 kilometres (7,500 mi) inland. Additionally, in-flight refueling of long-range bombers and the use of aircraft carriers extend the possible range virtually indefinitely.
Command and control:
Command and control procedures in case of nuclear war were given by the Single Integrated Operational Plan (SIOP) until 2003, when this was superseded by Operations Plan 8044.
Since World War II, the President of the United States has had sole authority to launch U.S. nuclear weapons, whether as a first strike or nuclear retaliation. This arrangement was seen as necessary during the Cold War to present a credible nuclear deterrent; if an attack was detected, the United States would have only minutes to launch a counterstrike before its nuclear capability was severely damaged, or national leaders killed.
If the President has been killed, command authority follows the presidential line of succession. Changes to this policy have been proposed, but currently the only way to countermand such an order before the strike is launched would be for the Vice President and a majority of the Cabinet to relieve the President under Section 4 of the Twenty-fifth Amendment to the United States Constitution.
Regardless of whether the United States is actually under attack by a nuclear-capable adversary, the President alone has the authority to order nuclear strikes. The President and the Secretary of Defense form the National Command Authority, but the Secretary of Defense has no authority to refuse or disobey such an order.
The President's decision must be transmitted to the National Military Command Center, which will then issue the coded orders to nuclear-capable forces.
The President can give a nuclear launch order using their nuclear briefcase (nicknamed the nuclear football), or can use command centers such as the White House Situation Room.
The command would be carried out by a Nuclear and Missile Operations Officer (a member of a missile combat crew, also called a "missileer") at a missile launch control center. A two-man rule applies to the launch of missiles, meaning that two officers must turn keys simultaneously (far enough apart that this cannot be done by one person).
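As a rough illustration of the two-man rule described above (not the actual launch-control logic, which is classified), the following Python sketch accepts a launch command only when two different officers turn their keys nearly simultaneously. The officer identifiers and the two-second window are hypothetical, chosen only for the example.

    # Minimal sketch of a two-man rule: two *different* officers must turn
    # their keys within a short time window. All names and the 2-second
    # window are hypothetical, for illustration only.
    from dataclasses import dataclass

    @dataclass
    class KeyTurn:
        officer_id: str
        timestamp: float  # seconds since some reference time

    def launch_authorized(a: KeyTurn, b: KeyTurn, window: float = 2.0) -> bool:
        """True only if two distinct officers turned keys nearly simultaneously."""
        return a.officer_id != b.officer_id and abs(a.timestamp - b.timestamp) <= window

    print(launch_authorized(KeyTurn("officer_1", 0.0), KeyTurn("officer_2", 1.2)))  # True
    print(launch_authorized(KeyTurn("officer_1", 0.0), KeyTurn("officer_1", 0.5)))  # False: same officer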
When President Reagan was shot in 1981, there was confusion about where the "nuclear football" was, and who was in charge.
Starting with President Eisenhower, authority to launch a full-scale nuclear attack has been delegated to theater commanders and other specific commanders who believe circumstances warrant it and who are out of communication with the president, or if the president has been incapacitated.
For example, during the Cuban Missile Crisis, on 24 October 1962, General Thomas Power, commander of the Strategic Air Command (SAC), took the country to DEFCON 2, the very precipice of full-scale nuclear war, launching SAC bombers with nuclear weapons ready to strike.
Moreover, some of these commanders sub-delegated to lower commanders the authority to launch nuclear weapons under similar circumstances. In fact, nuclear weapons were not placed under locks (i.e., permissive action links) until decades later, so pilots or individual submarine commanders had the power to launch nuclear weapons entirely on their own, without higher authority.
Accidents:
Main articles: Nuclear and radiation accidents and List of military nuclear accidents
The United States nuclear program since its inception has experienced accidents of varying forms, ranging from single-casualty research experiments (such as that of Louis Slotin during the Manhattan Project), to the nuclear fallout dispersion of the Castle Bravo shot in 1954, to accidents such as crashes of aircraft carrying nuclear weapons, the dropping of nuclear weapons from aircraft, losses of nuclear submarines, and explosions of nuclear-armed missiles (broken arrows).
How close any of these accidents came to being major nuclear disasters is a matter of technical and scholarly debate and interpretation.
Weapons accidentally dropped by the United States include:
- incidents off the coast of British Columbia (1950) (see 1950 British Columbia B-36 crash),
- near Atlantic City, New Jersey (1957);
- Savannah, Georgia (1958) (see Tybee Bomb);
- Goldsboro, North Carolina (1961) (see 1961 Goldsboro B-52 crash);
- off the coast of Okinawa (1965);
- in the sea near Palomares, Spain (1966, see 1966 Palomares B-52 crash);
- and near Thule Air Base, Greenland (1968) (see 1968 Thule Air Base B-52 crash).
In some of these cases (such as the 1966 Palomares case), the conventional explosive system of the fission weapon detonated but did not trigger a nuclear chain reaction (safety features prevent this from happening easily); it did, however, disperse hazardous nuclear materials across wide areas, necessitating expensive cleanup efforts.
Several US nuclear weapons, partial weapons, or weapons components are thought to be lost and unrecovered, primarily in aircraft accidents. The 1980 Damascus Titan missile explosion in Damascus, Arkansas, threw a warhead from its silo but did not release any radiation.
The nuclear testing program resulted in a number of cases of fallout dispersion onto populated areas.
The most significant of these was the Castle Bravo test, which spread radioactive ash over an area of over 100 square miles (260 km2), including a number of populated islands. The populations of the islands were evacuated but not before suffering radiation burns. They would later suffer long-term effects, such as birth defects and increased cancer risk. There are ongoing concerns around deterioration of the nuclear waste site on Runit Island and a potential radioactive spill.
There were also instances during the nuclear testing program in which soldiers were exposed to overly high levels of radiation, which grew into a major scandal in the 1970s and 1980s, as many soldiers later suffered from what were claimed to be diseases caused by their exposures.
Many of the former nuclear facilities produced significant environmental damage during their years of activity, and since the 1990s have been Superfund sites undergoing cleanup and environmental remediation. Hanford is currently the most contaminated nuclear site in the United States and is the focus of the nation's largest environmental cleanup.
Radioactive materials are known to be leaking from Hanford into the environment. The Radiation Exposure Compensation Act of 1990 allows for U.S. citizens exposed to radiation or other health risks through the U.S. nuclear program to file for compensation and damages.
Deliberate attacks on weapons facilities:
Main article: Vulnerability of nuclear plants to attack
In 1972, three hijackers took control of a domestic passenger flight along the east coast of the U.S. and threatened to crash the plane into a U.S. nuclear weapons plant in Oak Ridge, Tennessee. The plane got as close as 8,000 feet above the site before the hijackers' demands were met.
Various acts of civil disobedience since 1980 by the peace group Plowshares have shown how nuclear weapons facilities can be penetrated, and the group's actions represent extraordinary breaches of security at nuclear weapons plants in the United States.
The National Nuclear Security Administration has acknowledged the seriousness of the 2012 Plowshares action. Non-proliferation policy experts have questioned "the use of private contractors to provide security at facilities that manufacture and store the government's most dangerous military material".
Nuclear weapons materials on the black market are a global concern, and there is worry about the possible detonation of a small, crude nuclear weapon by a militant group in a major city, with significant loss of life and property.
Stuxnet is a computer worm discovered in June 2010 that is believed to have been created by the United States and Israel to attack Iran's nuclear fuel enrichment facilities.
Development agencies
The initial U.S. nuclear program was run by the National Bureau of Standards starting in 1939, by order of President Franklin Delano Roosevelt. Its primary purpose was to delegate research and dispense funds. In 1940 the National Defense Research Committee (NDRC) was established, coordinating work under the Committee on Uranium among its other wartime efforts.
In June 1941, the Office of Scientific Research and Development (OSRD) was established, with the NDRC as one of its subordinate agencies, which enlarged and renamed the Uranium Committee as the Section on Uranium.
In 1941, NDRC research was placed under direct control of Vannevar Bush as the OSRD S-1 Section, which attempted to increase the pace of weapons research. In June 1942, the U.S. Army Corps of Engineers took over the project to develop atomic weapons, while the OSRD retained responsibility for scientific research.
This was the beginning of the Manhattan Project, run as the Manhattan Engineering District (MED), an agency under military control that was in charge of developing the first atomic weapons. After World War II, the MED maintained control over the U.S. arsenal and production facilities and coordinated the Operation Crossroads tests.
In 1946, after a protracted debate, the Atomic Energy Act of 1946 was passed, creating the Atomic Energy Commission (AEC) as a civilian agency in charge of the production of nuclear weapons and research facilities, funded through Congress, with oversight provided by the Joint Committee on Atomic Energy.
The AEC was given vast powers of control over secrecy, research, and money, and could seize lands with suspected uranium deposits. Along with its duties towards the production and regulation of nuclear weapons, it was also in charge of stimulating development and regulating civilian nuclear power. The full transference of activities was finalized in January 1947.
In 1975, following the "energy crisis" of the early 1970s and public and congressional discontent with the AEC (in part because of the inherent conflict in being both a producer and a regulator), the agency was split into the Energy Research and Development Administration (ERDA), which assumed most of the AEC's former production, coordination, and research roles, and the Nuclear Regulatory Commission, which assumed its civilian regulation activities.
ERDA was short-lived, however, and in 1977 the U.S. nuclear weapons activities were reorganized under the Department of Energy, which maintains such responsibilities through the semi-autonomous National Nuclear Security Administration. Some functions were taken over or shared by the Department of Homeland Security in 2002.
The already-built weapons themselves are in the control of the Strategic Command, which is part of the Department of Defense.
In general, these agencies served to coordinate research and build sites. However, they generally operated their sites through contractors, both private and public (for example:
- Union Carbide, a private company, ran Oak Ridge National Laboratory for many decades;
- the University of California, a public educational institution, has run the Los Alamos and Lawrence Livermore laboratories since their inception, and will jointly manage Los Alamos with the private company Bechtel as of its next contract).
Funding came both directly through these agencies and from additional outside agencies, such as the Department of Defense. Each branch of the military also maintained its own nuclear-related research agencies (generally related to delivery systems).
Weapons production complex:
In addition to deploying weapons on its own soil, during the Cold War, the United States also stationed nuclear weapons in 27 foreign countries and territories, including:
- Okinawa (which was US-controlled until 1971),
- Japan (during the occupation immediately following World War II),
- Greenland,
- Germany,
- Taiwan,
- and French Morocco, then (independent) Morocco.
The table below is not comprehensive, as numerous facilities throughout the United States have contributed to its nuclear weapons program. It includes the major sites related to the U.S. weapons program (past and present), their basic site functions, and their current status of activity.
Not listed are the many bases and facilities at which nuclear weapons have been deployed.
Proliferation:
Main article: Nuclear proliferation
Early on in the development of its nuclear weapons, the United States relied in part on information-sharing with both the United Kingdom and Canada, as codified in the Quebec Agreement of 1943.
These three parties agreed not to share nuclear weapons information with other countries without the consent of the others, an early attempt at nonproliferation.
After the development of the first nuclear weapons during World War II, though, there was much debate within the political circles and public sphere of the United States about whether the country should attempt to maintain a monopoly on nuclear technology, undertake a program of information sharing with other nations (especially its former ally and likely competitor, the Soviet Union), or submit control of its weapons to some sort of international organization (such as the United Nations) that would use them to attempt to maintain world peace.
Though fear of a nuclear arms race spurred many politicians and scientists to advocate some degree of international control or sharing of nuclear weapons and information, many politicians and members of the military believed that it was better in the short term to maintain high standards of nuclear secrecy and to forestall a Soviet bomb as long as possible (and they did not believe the USSR would actually submit to international controls in good faith).
Since this path was chosen, the United States was, in its early days, essentially an advocate for the prevention of nuclear proliferation, though primarily for the reason of self-preservation.
A few years after the USSR detonated its first weapon in 1949, though, the U.S. under President Dwight D. Eisenhower sought to encourage a program of sharing nuclear information related to civilian nuclear power and nuclear physics in general.
The Atoms for Peace program, begun in 1953, was also in part political: the U.S. was better poised to commit various scarce resources, such as enriched uranium, towards this peaceful effort, and to request a similar contribution from the Soviet Union, which had far fewer resources along these lines; thus the program had a strategic justification as well, as internal memos later revealed.
This overall goal of promoting civilian use of nuclear energy in other countries, while also preventing weapons dissemination, has been labeled by many critics as contradictory and as having led to lax standards for a number of decades, which allowed several other nations, such as China and India, to profit from dual-use technology (purchased from nations other than the U.S.).
The Cooperative Threat Reduction program of the Defense Threat Reduction Agency was established after the breakup of the Soviet Union in 1991 to aid former Soviet bloc countries in the inventory and destruction of their sites for developing nuclear, chemical, and biological weapons, and their methods of delivering them (ICBM silos, long-range bombers, etc.).
Over $4.4 billion has been spent on this endeavor to prevent purposeful or accidental proliferation of weapons from the former Soviet arsenal.
After India and Pakistan tested nuclear weapons in 1998, President Bill Clinton imposed economic sanctions on the countries. In 1999, however, the sanctions against India were lifted; those against Pakistan were kept in place as a result of the military government that had taken over.
Shortly after the September 11 attacks in 2001, President George W. Bush lifted the sanctions against Pakistan as well, in order to get the Pakistani government's help as a conduit for US and NATO forces for operations in Afghanistan.
The U.S. government has been vocal against the proliferation of such weapons in Iran and North Korea. The 2003 invasion of Iraq by the U.S. was undertaken, in part, on indications that weapons of mass destruction were being stockpiled (later, stockpiles of previously undeclared nerve agent and mustard gas shells were located in Iraq), and the Bush administration said that its policies on proliferation were responsible for the Libyan government's agreement to abandon its nuclear ambitions.
However, according to the Senate's report on pre-war intelligence on Iraq, released a year after the war began, no stockpiles of weapons of mass destruction or active weapons programs were found in Iraq.
Nuclear disarmament in international law:
The United States is one of the five nuclear weapons states with a declared nuclear arsenal under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), of which it was an original drafter and signatory on 1 July 1968 (ratified 5 March 1970). All signatories of the NPT agreed to refrain from aiding in nuclear weapons proliferation to other states.
Further under Article VI of the NPT, all signatories, including the US, agreed to negotiate in good faith to stop the nuclear arms race and to negotiate for complete elimination of nuclear weapons. "Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament, and on a treaty on general and complete disarmament."
The International Court of Justice (ICJ), the preeminent judicial tribunal of international law, in its advisory opinion on the Legality of the Threat or Use of Nuclear Weapons, issued 8 July 1996, unanimously interprets the text of Article VI as implying that:
There exists an obligation to pursue in good faith and bring to a conclusion negotiations leading to nuclear disarmament in all its aspects under strict and effective international control.
The International Atomic Energy Agency (IAEA) in 2005 proposed a comprehensive ban on fissile material that would greatly limit the production of weapons of mass destruction. One hundred forty-seven countries voted for the proposal, but the United States voted against it.
The US government has also resisted the Treaty on the Prohibition of Nuclear Weapons, a binding agreement for negotiations for the total elimination of nuclear weapons, supported by more than 120 nations.
International relations and nuclear weapons:
In 1958, the United States Air Force considered a plan to drop nuclear bombs on China during a confrontation over Taiwan, but the plan was overruled, as previously secret documents showed after they were declassified under the Freedom of Information Act in April 2008.
The plan included an initial proposal to drop 10–15 kiloton bombs on airfields in Amoy (now called Xiamen) in the event of a Chinese blockade against Taiwan's offshore islands.
Occupational illness:
The Energy Employees Occupational Illness Compensation Program (EEOICP) began on 31 July 2001. The program provides compensation and health benefits to Department of Energy nuclear weapons workers (employees, former employees, contractors and subcontractors) as well as compensation to certain survivors if the worker is already deceased.
By 14 August 2010, the program had already identified 45,799 civilians whose health was harmed (including 18,942 who developed cancer) by exposure to radiation and toxic substances while producing nuclear weapons for the United States.
Current status:
Further information: Nuclear triad § United States
The United States is one of the five nuclear-weapon states recognized by the signatories of the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). As of 2017, the US had an estimated 4,018 nuclear weapons in either deployment or storage.
This figure compares to a peak of 31,225 total warheads in 1967 and 22,217 in 1989, and does not include "several thousand" warheads that have been retired and scheduled for dismantlement.
The Pantex Plant near Amarillo, Texas, is the only location in the United States where weapons from the aging nuclear arsenal can be refurbished or dismantled.
In 2009 and 2010, the Obama administration announced policies that would invalidate the Bush-era policy on the use of nuclear weapons and its moves to develop new ones. First, in a prominent 2009 speech, U.S. President Barack Obama outlined a goal of "a world without nuclear weapons".
Toward that goal, U.S. President Barack Obama and Russian President Dmitry Medvedev signed the New START treaty on 8 April 2010, to reduce the number of active nuclear weapons from 2,200 to 1,550.
That same week Obama also revised U.S. policy on the use of nuclear weapons in a Nuclear Posture Review required of all presidents, declaring for the first time that the U.S. would not use nuclear weapons against non-nuclear, NPT-compliant states. The policy also renounces development of any new nuclear weapons.
However, the same Nuclear Posture Review of April 2010 stated a need to develop new "low-yield" nuclear weapons, which resulted in the development of the B61 Mod 12. Despite President Obama's goal of a nuclear-free world and his reversal of former President Bush's nuclear policies, his presidency cut fewer warheads from the stockpile than any previous post-Cold War presidency.
Following a renewal of tension after the Russo-Ukrainian War started in 2014, the Obama administration announced plans to continue renovating US nuclear weapons facilities and platforms, with a budgeted spend of about a trillion dollars over 30 years. Under these new plans, the US government would fund research and development of new nuclear cruise missiles. The Trump and Biden administrations continued with these plans.
As of 2021, American nuclear forces on land consist of 400 Minuteman III ICBMs spread among 450 operational launchers, staffed by Air Force Global Strike Command. Those in the seas consist of 14 nuclear-capable Ohio-class Trident submarines, nine in the Pacific and five in the Atlantic. Nuclear capabilities in the air are provided by 60 nuclear-capable heavy bombers, 20 B-2 bombers and 40 B-52s.
The Air Force has modernized its Minuteman III missiles to last through 2030, and a Ground Based Strategic Deterrent (GBSD) is set to begin replacing them in 2029. The Navy has undertaken efforts to extend the operational lives of its missiles and warheads past 2020; it is also producing new Columbia-class submarines to replace the Ohio fleet beginning in 2031.
The Air Force is also retiring the nuclear cruise missiles of its B-52s, leaving only half nuclear-capable. It intends to procure a new long-range bomber, the B-21, and a new long-range standoff (LRSO) cruise missile in the 2020s.
Nuclear disarmament movement
See also: Nuclear disarmament and Anti-nuclear movement in the United States
In the early 1980s, the revival of the nuclear arms race triggered large protests about nuclear weapons. On 12 June 1982, one million people demonstrated in New York City's Central Park against nuclear weapons and for an end to the cold war arms race. It was the largest anti-nuclear protest and the largest political demonstration in American history.
International Day of Nuclear Disarmament protests were held on 20 June 1983 at 50 sites across the United States. There were many Nevada Desert Experience protests and peace camps at the Nevada Test Site during the 1980s and 1990s.
There have also been protests by anti-nuclear groups at a number of U.S. nuclear weapons sites and laboratories.
On 1 May 2005, 40,000 anti-nuclear/anti-war protesters marched past the United Nations in New York, 60 years after the atomic bombings of Hiroshima and Nagasaki. This was the largest anti-nuclear rally in the U.S. for several decades.
In May 2010, some 25,000 people, including members of peace organizations and 1945 atomic bomb survivors, marched from downtown New York to the United Nations headquarters, calling for the elimination of nuclear weapons.
A number of scientists and engineers have also opposed nuclear weapons.
In recent years, many elder statesmen have also advocated nuclear disarmament. Sam Nunn, William Perry, Henry Kissinger, and George Shultz have called upon governments to embrace the vision of a world free of nuclear weapons, and in various op-ed columns have proposed an ambitious program of urgent steps to that end. The four have created the Nuclear Security Project to advance this agenda.
Organizations such as Global Zero, an international non-partisan group of 300 world leaders dedicated to achieving nuclear disarmament, have also been established.
United States strategic nuclear weapons arsenal:
New START Treaty Aggregate Numbers of Strategic Offensive Arms, 14 June 2023:
Main article: Nuclear proliferation
Early on in the development of its nuclear weapons, the United States relied in part on information-sharing with both the United Kingdom and Canada, as codified in the Quebec Agreement of 1943.
These three parties agreed not to share nuclear weapons information with other countries without the consent of the others, an early attempt at nonproliferation.
After the development of the first nuclear weapons during World War II, though, there was much debate within the political circles and public sphere of the United States about whether or not the country should attempt to maintain a monopoly on nuclear technology, or whether it should undertake a program of information sharing with other nations (especially its former ally and likely competitor, the Soviet Union), or submit control of its weapons to some sort of international organization (such as the United Nations) who would use them to attempt to maintain world peace.
Though fear of a nuclear arms race spurred many politicians and scientists to advocate some degree of international control or sharing of nuclear weapons and information, many politicians and members of the military believed that it was better in the short term to maintain high standards of nuclear secrecy and to forestall a Soviet bomb as long as possible (and they did not believe the USSR would actually submit to international controls in good faith).
Since this path was chosen, the United States was, in its early days, essentially an advocate for the prevention of nuclear proliferation, though primarily for the reason of self-preservation.
A few years after the USSR detonated its first weapon in 1949, though, the U.S. under President Dwight D. Eisenhower sought to encourage a program of sharing nuclear information related to civilian nuclear power and nuclear physics in general.
The Atoms for Peace program, begun in 1953, was also in part political: the U.S. was better poised to commit various scarce resources, such as enriched uranium, towards this peaceful effort, and to request a similar contribution from the Soviet Union, which had far fewer resources along these lines; thus the program had a strategic justification as well, as was later revealed by internal memos.
This overall goal of promoting civilian use of nuclear energy in other countries, while also preventing weapons dissemination, has been labeled by many critics as contradictory and as having led to lax standards for decades, which allowed other nations, such as China and India, to profit from dual-use technology (purchased from nations other than the U.S.).
The Cooperative Threat Reduction program of the Defense Threat Reduction Agency was established after the breakup of the Soviet Union in 1991 to aid former Soviet bloc countries in the inventory and destruction of their sites for developing nuclear, chemical, and biological weapons, and their methods of delivering them (ICBM silos, long-range bombers, etc.).
Over $4.4 billion has been spent on this endeavor to prevent purposeful or accidental proliferation of weapons from the former Soviet arsenal.
After India and Pakistan tested nuclear weapons in 1998, President Bill Clinton imposed economic sanctions on the countries. In 1999, however, the sanctions against India were lifted; those against Pakistan were kept in place as a result of the military government that had taken over.
Shortly after the September 11 attacks in 2001, President George W. Bush lifted the sanctions against Pakistan as well, in order to get the Pakistani government's help as a conduit for US and NATO forces for operations in Afghanistan.
The U.S. government has been vocal against the proliferation of such weapons in Iran and North Korea. The 2003 invasion of Iraq by the U.S. was justified, in part, by indications that weapons of mass destruction were being stockpiled (later, stockpiles of previously undeclared nerve agent and mustard gas shells were located in Iraq), and the Bush administration said that its policies on proliferation were responsible for the Libyan government's agreement to abandon its nuclear ambitions.
However, a year after the war began, the Senate's report on pre-war intelligence on Iraq concluded that no stockpiles of weapons of mass destruction or active weapons-of-mass-destruction programs had been found in Iraq.
Nuclear disarmament in international law:
The United States is one of the five nuclear weapons states with a declared nuclear arsenal under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), of which it was an original drafter and signatory on 1 July 1968 (ratified 5 March 1970). All signatories of the NPT agreed to refrain from aiding in nuclear weapons proliferation to other states.
Further under Article VI of the NPT, all signatories, including the US, agreed to negotiate in good faith to stop the nuclear arms race and to negotiate for complete elimination of nuclear weapons. "Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament, and on a treaty on general and complete disarmament."
The International Court of Justice (ICJ), the preeminent judicial tribunal of international law, in its advisory opinion on the Legality of the Threat or Use of Nuclear Weapons, issued 8 July 1996, unanimously interprets the text of Article VI as implying that:
There exists an obligation to pursue in good faith and bring to a conclusion negotiations leading to nuclear disarmament in all its aspects under strict and effective international control.
The International Atomic Energy Agency (IAEA) in 2005 proposed a comprehensive ban on fissile material that would greatly limit the production of weapons of mass destruction. One hundred forty-seven countries voted for the proposal, but the United States voted against it.
The US government has also resisted the Treaty on the Prohibition of Nuclear Weapons, a binding agreement for negotiations for the total elimination of nuclear weapons, supported by more than 120 nations.
International relations and nuclear weapons:
In 1958, the United States Air Force considered a plan to drop nuclear bombs on China during a confrontation over Taiwan, but the plan was overruled, according to previously secret documents declassified in April 2008 under the Freedom of Information Act.
The plan called initially for dropping 10–15 kiloton bombs on airfields in Amoy (now called Xiamen) in the event of a Chinese blockade against Taiwan's Offshore Islands.
Occupational illness:
The Energy Employees Occupational Illness Compensation Program (EEOICP) began on 31 July 2001. The program provides compensation and health benefits to Department of Energy nuclear weapons workers (employees, former employees, contractors and subcontractors) as well as compensation to certain survivors if the worker is already deceased.
By 14 August 2010, the program had already identified 45,799 civilians whose health was harmed (including 18,942 who developed cancer) by exposure to radiation and toxic substances while producing nuclear weapons for the United States.
Current status:
Further information: Nuclear triad § United States
The United States is one of the five nuclear-weapon states recognized under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). As of 2017, the US has an estimated 4,018 nuclear weapons in either deployment or storage.
This figure compares to a peak of 31,225 total warheads in 1967 and 22,217 in 1989, and does not include "several thousand" warheads that have been retired and scheduled for dismantlement.
The Pantex Plant near Amarillo, Texas, is the only location in the United States where weapons from the aging nuclear arsenal can be refurbished or dismantled.
In 2009 and 2010, the Obama administration announced policies that would reverse the Bush-era policy on the use of nuclear weapons and its plans to develop new ones. First, in a prominent 2009 speech, U.S. President Barack Obama outlined a goal of "a world without nuclear weapons".
To that end, U.S. President Barack Obama and Russian President Dmitry Medvedev signed the New START treaty on 8 April 2010, to reduce the number of active nuclear weapons from 2,200 to 1,550.
That same week Obama also revised U.S. policy on the use of nuclear weapons in a Nuclear Posture Review required of all presidents, declaring for the first time that the U.S. would not use nuclear weapons against non-nuclear, NPT-compliant states. The policy also renounces development of any new nuclear weapons.
However, the same Nuclear Posture Review of April 2010 stated a need to develop new “low yield” nuclear weapons, which resulted in the development of the B61 Mod 12. Despite President Obama's goal of a nuclear-free world and his reversal of former President Bush’s nuclear policies, his presidency cut fewer warheads from the stockpile than any previous post-Cold War presidency.
Following a renewal of tension after the Russo-Ukrainian War started in 2014, the Obama administration announced plans to continue renovating US nuclear weapons facilities and platforms, with a budgeted spend of about a trillion dollars over 30 years. Under these new plans, the US government would fund research and development of new nuclear cruise missiles. The Trump and Biden administrations continued with these plans.
As of 2021, American nuclear forces on land consist of 400 Minuteman III ICBMs spread among 450 operational launchers, staffed by Air Force Global Strike Command. Those in the seas consist of 14 nuclear-capable Ohio-class Trident submarines, nine in the Pacific and five in the Atlantic. Nuclear capabilities in the air are provided by 60 nuclear-capable heavy bombers, 20 B-2 bombers and 40 B-52s.
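For readers who want the 2021 force-structure figures above in one place, here is a minimal, illustrative Python sketch that records the numbers quoted in this section and checks that the subtotals add up. The field names and layout are my own invention, not an official data format.

# Illustrative summary of the 2021 U.S. strategic force figures quoted above.
# Field names and structure are hypothetical; the numbers come from the text.
us_triad_2021 = {
    "land": {"minuteman_iii_icbms": 400, "operational_launchers": 450},
    "sea": {"ohio_class_ssbns": {"pacific": 9, "atlantic": 5}},
    "air": {"nuclear_capable_bombers": {"b2": 20, "b52": 40}},
}

ssbns = us_triad_2021["sea"]["ohio_class_ssbns"]
bombers = us_triad_2021["air"]["nuclear_capable_bombers"]

assert sum(ssbns.values()) == 14    # 9 Pacific + 5 Atlantic submarines
assert sum(bombers.values()) == 60  # 20 B-2s + 40 B-52s

print(f"SSBNs: {sum(ssbns.values())}, nuclear-capable bombers: {sum(bombers.values())}")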
The Air Force has modernized its Minuteman III missiles to last through 2030, and a Ground Based Strategic Deterrent (GBSD) is set to begin replacing them in 2029. The Navy has undertaken efforts to extend the operational lives of its missiles and warheads past 2020; it is also producing new Columbia-class submarines to replace the Ohio fleet beginning in 2031.
The Air Force is also retiring the nuclear cruise missiles of its B-52s, leaving only half nuclear-capable. It intends to procure a new long-range bomber, the B-21, and a new long-range standoff (LRSO) cruise missile in the 2020s.
Nuclear disarmament movement:
See also: Nuclear disarmament and Anti-nuclear movement in the United States
In the early 1980s, the revival of the nuclear arms race triggered large protests about nuclear weapons. On 12 June 1982, one million people demonstrated in New York City's Central Park against nuclear weapons and for an end to the cold war arms race. It was the largest anti-nuclear protest and the largest political demonstration in American history.
International Day of Nuclear Disarmament protests were held on 20 June 1983 at 50 sites across the United States. There were many Nevada Desert Experience protests and peace camps at the Nevada Test Site during the 1980s and 1990s.
There have also been protests by anti-nuclear groups at the
- Y-12 Nuclear Weapons Plant,
- the Idaho National Laboratory,
- the Hanford Site,
- the Nevada Test Site,
- Lawrence Livermore National Laboratory,
- and transportation of nuclear waste from the Los Alamos National Laboratory.
On 1 May 2005, 40,000 anti-nuclear/anti-war protesters marched past the United Nations in New York, 60 years after the atomic bombings of Hiroshima and Nagasaki. This was the largest anti-nuclear rally in the U.S. for several decades.
In May 2010, some 25,000 people, including members of peace organizations and 1945 atomic bomb survivors, marched from downtown New York to the United Nations headquarters, calling for the elimination of nuclear weapons.
Some scientists and engineers have opposed nuclear weapons, including:
- Paul M. Doty,
- Hermann Joseph Muller,
- Linus Pauling,
- Eugene Rabinowitch,
- M.V. Ramana
- and Frank N. von Hippel.
In recent years, many elder statesmen have also advocated nuclear disarmament. Sam Nunn, William Perry, Henry Kissinger, and George Shultz have called upon governments to embrace the vision of a world free of nuclear weapons, and in various op-ed columns have proposed an ambitious program of urgent steps to that end. The four have created the Nuclear Security Project to advance this agenda.
Organizations such as Global Zero, an international non-partisan group of 300 world leaders dedicated to achieving nuclear disarmament, have also been established.
United States strategic nuclear weapons arsenal:
New START Treaty Aggregate Numbers of Strategic Offensive Arms, 14 June 2023:
Notes:
- Each heavy bomber is counted as one warhead (The New START Treaty)
- The nuclear weapon delivery capability has been removed from B-1 heavy bombers.
See also:
- Anti-nuclear movement in the United States
- Global Security Institute
- History of nuclear weapons
- International Day against Nuclear Tests
- List of nuclear weapons tests
- National Security Strategy (United States)
- Nuclear terrorism
- Nuclear-free zone
- U.S. nuclear weapons in Japan
- United States Strategic Command
Russia's Weapons of Mass Destruction
- YouTube Video: Russia-Ukraine war LIVE: Russian weapons of mass destruction in Ukraine war | Russian troops LIVE
- YouTube Video: Putin is close to 'creating weapons of mass destruction'
- YouTube Video: Ukraine war: Putin confirms first nuclear weapons moved to Belarus - BBC News
* -- What will Biden do if Putin goes nuclear? Experts say a nuclear response is unlikely but not impossible (NBC News 6/13/2022):
"It’s a troubling question with no palatable answer: What would President Joe Biden do if Russia used nuclear weapons in the Ukraine war?
A half dozen current and former government officials briefed on the issue, and several outside experts, told NBC News there was no playbook and little agreement about how the U.S. would respond to a norm-shattering act of destruction that could obliterate a Ukrainian city, kill tens of thousands and send a cloud of nuclear fallout drifting over NATO countries in Western Europe.
This isn’t new to the Biden administration. In fact, when the Obama administration conducted a war game simulating Russian use of nuclear weapons in the Baltics, there were fundamental disagreements about how to react.
U.S. intelligence officials say they have seen no signs that Russian President Vladimir Putin is poised to employ so-called battlefield nukes, but several versions of Russian military doctrine published since 2000 have envisioned the first use of nuclear weapons in a regional war in response to a conventional threat against the Russian homeland.
And military experts say Russia’s smallest warheads have many times the explosive power of the bombs dropped on Hiroshima and Nagasaki.
As the U.S. continues to send ever more sophisticated weapons designed to help Ukraine destroy invading Russian forces, American officials tell NBC News the Biden administration has for months been thinking the unthinkable about what Putin could do — and war-gaming scenarios envisioning Russia using an atomic bomb on Ukraine.
“We don’t see … practical evidence at this point of Russian planning for the deployment or even potential use of tactical nuclear weapons,” CIA Director William Burns said last month. But, he added, “given the kind of saber-rattling that ... we’ve heard from the Russian leadership, we can’t take lightly those possibilities.”
NBC News talked to a half dozen current and former officials briefed on the issue, and several outside experts. Current officials declined to speak on the record, citing the sensitivity of the planning.
It’s fair to say that the American response “would depend wildly on how the Russians used” a nuclear weapon, as one U.S. official regularly briefed on U.S. government deliberations put it.
A demonstration shot over the Black Sea? A strike on Ukrainian troops in a remote area? Or far more provocative scenarios, such as a devastating blow to a major Ukrainian city or a nuclear attack on a NATO country?
The menu of American options is stark, officials and outside experts say: Stay the course, up the sanctions and keep arming the Ukrainians, while building an international coalition against Russia that completely isolates the country; launch a conventional military attack on Russian forces in Ukraine or Russia; or respond with a nuclear attack.
Unless a NATO country was hit, the U.S. would not have any obligation to respond.
Some military and intelligence officials told NBC News they believe it’s unlikely the U.S. would retaliate militarily after a single Russian use of a so-called tactical nuclear weapon in Ukraine.
Others said Biden would have to unleash some conventional force, perhaps attacking Russian troops in Ukraine or the Russian military unit that launched the nuclear weapon, an option that could have serious repercussions, since Russian military leaders might be killed.
If Russia used a nuclear weapon of any type, “I expect (the president) to say we’re in a new situation, and the U.S. will directly enter the war against Russia to stop this government that has not only broken so many international laws and violated human rights but also now violated the nuclear taboo,” said Evelyn Farkas, a former top Pentagon official for Russia and now executive director of the McCain Institute. “Putin will be signing the order on changing the regime.”
But two U.S. officials briefed on the issue did not agree, with one saying, “Unless they use them on NATO we’re probably not going to respond militarily.”
Under this thinking, Biden would not want to risk an escalation into a full-scale nuclear war that leads to the destruction of American cities. But he might not have to, because if Putin were to go nuclear, experts believe most other countries in the world, including many that are sitting on the fence in the current conflict, would quickly turn against and isolate Russia.
“The whole world would stop,” said Joseph Cirincione, a nuclear expert and distinguished fellow at the Quincy Institute for Responsible Statecraft.
A remote possibility that can’t be taken lightly
American and Western national security officials tell NBC News there has been no sign that Russia has moved tactical nuclear weapons out of storage facilities. Intelligence officials have said they assess that Putin would consider nukes only if he believed his regime was in mortal danger.
But two U.S. officials, citing American intelligence assessments, say some in Putin’s inner circle have encouraged him to test a nuclear weapon as a show of strength during moments when his conventional forces have struggled in Ukraine. The officials said there is continuing concern that Putin could choose this option if he believes Russia has been backed into a corner.
Putin placed Moscow’s nuclear forces on high alert shortly after his invasion of Ukraine began on Feb. 24, and he warned that “no one should have any doubts that a direct attack on our country will lead to the destruction and horrible consequences for any potential aggressor.”
But U.S. officials told NBC News they did not see any changes to their footprint or movements at the time. In April, Russian foreign minister Sergey Lavrov warned the West not to underestimate the elevated risks of nuclear conflict over Ukraine. Putin supporters on Russian state TV in recent weeks have talked openly about a nuclear war with the U.S. and Europe.
All this comes against the backdrop of a Russian nuclear doctrine that has evolved in what Western officials consider disturbing ways. In a 1993 document, Russia said it would use nuclear weapons only when the existence of the country was threatened. But in versions published since 2000, Russia reserves the right to use nuclear weapons in response to any weapons of mass destruction used to attack Russia and its allies, according to scholars who have examined it.
The doctrine also allows for the use of nuclear weapons “in response to large-scale aggression using conventional weapons in situations critical to the national security of the Russian Federation.” Experts have described that strategy as “escalate to de-escalate,” and they say it means that Russia is willing to make limited use of nuclear weapons to win what would otherwise be a conventional war.
On paper, U.S. nuclear doctrine is similar, but in practical terms, experts do not believe an American president would ever use nuclear weapons in a regional conventional war, and the U.S. has not, through Korea, Vietnam, Afghanistan and Iraq. Officials say the main purpose of the U.S. nuclear arsenal is to deter or respond to a nuclear attack by an adversary.
Still, the U.S. has not ruled out using nuclear weapons in response to biological or chemical attacks, and in some limited circumstances, conventional attacks. It still maintains around 100 nuclear weapons in NATO countries, put there originally to stop Russian tanks from seizing Western Europe.
Nuclear warning shot?
Officials are struggling to understand exactly what could prompt Putin to use a nuclear weapon. To cement gains made on the battlefield? To reverse losses? To stave off a rout?
“It’s not clear where that red line is. If Ukrainian forces were to enter Russian territory, would that be sufficient? I don’t know," said Chris Chivvis, who served from April 2018 to April 2021 as the U.S. national intelligence officer for Europe.
Citing the Russian invasion of Georgia in 2008, its seizure of Ukraine’s Crimean Peninsula in 2014 and the Kremlin’s interference in the 2016 U.S. election, he added, “The reality is we have been surprised by Russia three or four times in the last 15 years.”
Although technology exists to make battlefield nuclear weapons smaller than those dropped on Hiroshima and Nagasaki, Russia does not have warheads that small, according to Jeffrey Lewis, a nuclear expert at the Middlebury Institute. All of its so-called tactical nuclear weapons have enormous explosive power.
There is no technological distinction between “tactical” nuclear weapons and “strategic” ones — the difference is in the targets and the goals. Tactical nukes are used to gain advantage on the battlefield, while strategic weapons are aimed at military infrastructure and even whole cities.
If Russia decided to use one, its options could include an attack on an airbase or other military target, an attack on a Ukrainian city or a test of a nuclear weapon at a remote site — a warning shot designed to signal Moscow’s willingness to use the ultimate weapon, former officials said. It could deliver the weapon as a bomb, or via a missile.
Although none of these scenarios are likely, the nuclear test could be the most attractive for Moscow, some experts said.
Testing a nuclear weapon would be an extraordinarily provocative step, something only North Korea has done in this century.
A test above ground would risk radioactive material drifting into populated areas in Russian territory or NATO countries, depending on where it was carried out and the weather conditions. The former Soviet Union’s last nuclear test was carried out underground in 1989.
If Russia faced impending defeat in Ukraine, a single "demonstration attack," either on Ukrainian territory or possibly on the Black Sea, could seek to “convey their resolve, to try to force terror on the other party and get the Ukrainians to fold,” said Rose Gottemoeller, a former deputy secretary general of NATO who is now at Stanford University’s Freeman Spogli Institute for International Studies.
“They would be trying to strike terror into the hearts of the Ukrainians, get them to back down, get them even to concede defeat,” Gottemoeller said.
“Whatever he (Putin) did, he would do it in the belief that it would ensure his survival and perhaps compel surrender or retreat for the Ukrainians.”
Instead of a nuclear exchange with the U.S., “Russia has many options that it could employ either in Ukraine or elsewhere that would be much smaller steps up the nuclear escalatory ladder, but that nevertheless would represent a sea change in world history,” said Chivvis, now a senior fellow at the Carnegie Endowment think tank. “I worry that people are not being open-minded to the reality that there are scenarios in which Russia could use nuclear weapons. They’re not the most likely scenarios, but to be responsible, we have to figure that into our thinking about this conflict,” he said.
Strategic ambiguity
The Biden administration has intentionally avoided spelling out how it would respond if Russia launched a nuclear attack in Ukraine, leaving open the possibility of retaliating with nuclear weapons, conventional forces, a cyber operation or other means.
“We have to be crystal clear in our policy of warning him of a swift and decisive response, without necessarily being unambiguous about what that would be,” said Alexander Vershbow, who served as deputy secretary general of NATO from 2012 to 2016 and as ambassador to Russia from 2001 to 2005.
Biden would have to at least consider a major conventional military response in support of Ukraine, former U.S. officials said.
A Russian attack on Ukraine with a tactical nuclear weapon would pose an excruciating dilemma for Washington and its NATO allies. Biden and Western political leaders would have to weigh a response that would avoid triggering a full-blown nuclear conflict with Russia, while still imposing a heavy cost on Moscow.
Jeffrey Edmonds, an expert on the Russian military at the CNA think tank, says there are four possible response scenarios, only two of which are plausible: Capitulate and sue for peace; stay the course with sanctions and pressure; mount a conventional attack to punish Russia; respond with a nuclear attack on Russia.
The real choice, he believes, is either staying the course or a conventional attack.
Biden could decide that “what we’re doing is working, we’re just going to keep going, we’re going to take the moral high ground.”
Presumably, Russia would become more isolated diplomatically and international sanctions pressure would ratchet up. But Edmonds noted that calls for a military response to a Russian use of nuclear weapons would be “deafening” in Washington.
In his book “The Bomb,” about nuclear war planning, author Fred Kaplan writes about a National Security Council war game during the Obama administration that simulated a Russian tactical nuclear attack on a NATO country during a Russian invasion of one of the Baltic States.
Lower level officials decided not to respond with a nuclear weapon, instead continuing to fight with conventional forces. But when the same scenario was presented to Cabinet level officials, they decided that the U.S. had to respond with a nuclear attack, and they targeted Russian ally Belarus.
“I think that’s nuts,” Cirincione said. “There is a belief that you can have a limited nuclear exchange. You don’t want to get in that box, because once you are in that nuclear war-fighting mindset, you can’t control it.”
The Biden administration’s track record so far suggests it would move cautiously, in consultation with its European allies, and seek to avoid plunging the world into a nuclear conflagration, former officials said.
The administration has faced criticism that it has moved too slowly to send advanced weapons to Ukraine, but the White House’s supporters say the administration has focused on avoiding actions that could escalate the crisis into a direct clash between Russia and the U.S.
Realistically, the U.S. would look for ways to respond short of launching a nuclear weapon, possibly through cyber operations or other support for Ukraine, said Gottemoeller.
The United States would need to avoid any kind of nuclear escalation in the interest of the U.S. and its allies, but also for “global survival,” she said."
[End of Article]
___________________________________________________________________________
Russia and its weapons of mass destruction (Wikipedia)
The Russian Federation is known to possess or have possessed three types of weapons of mass destruction: nuclear weapons, biological weapons, and chemical weapons.
It is one of the five nuclear-weapon states recognized under the Treaty on the Non-Proliferation of Nuclear Weapons.
Russia possesses a total of 5,889 nuclear warheads as of 2023, the largest stockpile of nuclear warheads in the world; the second-largest stockpile is the United States' 5,428 warheads. Russia's deployed missiles (those actually ready to be launched) number about 1,674, second to the United States' 1,770.
The remaining weapons are either in reserve stockpiles, or have been retired and are slated for dismantling. Russia's predecessor state, the Soviet Union, reached a peak stockpile of about 45,000 nuclear warheads in 1986. The number of weapons Russia may possess is currently controlled by the bilateral New START treaty with the United States.
The Soviet Union ratified the Geneva Protocol—prohibiting the use of biological and chemical weapons—on April 5, 1928, with reservations that were later dropped on January 18, 2001.
Russia is also party to the 1972 Biological Weapons Convention and the 1993 Chemical Weapons Convention. The Soviet biological weapons program violated the Biological Weapons Convention and was the world's largest, longest, and most sophisticated program of its kind. At its peak, the program employed up to 65,000 people.
Despite being a signatory to the Chemical Weapons Convention, Russia has continued to hold and occasionally use chemical weapons. In 1997, Russia declared an arsenal of 39,967 tons of chemical weapons, which it worked in part to decrease.
Its stock of weapons was declared destroyed in 2017. The poisoning of Sergei and Yulia Skripal in 2018 and the poisoning of Alexei Navalny in 2020, both carried out by Russia, revealed that the country maintained an illicit chemical weapons program.
Nuclear weapons:
History:
Soviet era
Main article: Soviet atomic bomb project
Post-Soviet era:
At the dissolution of the Soviet Union in 1991, Soviet nuclear weapons were deployed in four of the new republics: Russia, Ukraine, Belarus and Kazakhstan.
In May 1992, these four states signed the Lisbon Protocol, agreeing to join the Treaty on the Non-Proliferation of Nuclear Weapons, with Russia the successor state to the Soviet Union as a nuclear state, and the other three states joining as non-nuclear states.
Ukraine agreed to give up its weapons to Russia, in exchange for guarantees of Ukrainian territory from Russia, the United Kingdom, and the United States, known as the Budapest Memorandum on Security Assurances. China and France also made statements in support of the memorandum.
Arms reduction:
Main article: Megatons to Megawatts Program
Nuclear warfare was a persistent and terrifying threat during the Cold War. At its height, the Soviet Union and the United States each mustered tens of thousands of warheads under the doctrine of mutual assured destruction.
By the 1980s, both the United States and Soviet Union sought to reduce the number of weapons the other was fielding. This led to the opening of arms reduction talks in 1982.
This culminated in the signing of the START I treaty in 1991: the first nuclear arms reduction treaty between the two global powers. This first treaty limited the number of deployed warheads in each nation to 6,000, nearly halving the prior 10,000 to 12,000 being fielded in 1991.
The considerable success of START I, combined with the dissolution of the Soviet Union in 1991, led to the START II treaty. Russia never ratified the treaty, and it did not go into effect.
START III was attempted but could not get past negotiations.
Instead, the Strategic Offensive Reductions Treaty was passed in 2002, capping warheads at 2,200. The current limitations stem from the New START treaty, ratified in 2010. It limits each side to 1,550 weapons. Nuclear bombers only count as one weapon each, even though they may carry up to 20, so the actual limit on the countries is slightly higher. The treaty is in force through to 2026.
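The bomber counting rule is easiest to see with numbers. Below is a minimal sketch, using hypothetical force figures (not actual Russian or U.S. loadings), of how a force can stay under the 1,550 treaty-accountable limit while fielding more deliverable warheads, because each deployed heavy bomber counts as a single warhead under New START.

# Hypothetical, illustrative numbers only; not actual force loadings.
missile_warheads = 1400   # warheads on deployed ICBMs and SLBMs, counted one-for-one
deployed_bombers = 60     # each deployed heavy bomber counts as one warhead
weapons_per_bomber = 20   # a bomber may actually carry up to ~20 weapons

treaty_accountable = missile_warheads + deployed_bombers
potential_deliverable = missile_warheads + deployed_bombers * weapons_per_bomber

print(f"Treaty-accountable warheads: {treaty_accountable}")        # 1460, under the 1550 cap
print(f"Potentially deliverable warheads: {potential_deliverable}") # 2600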
After U.S. President George W. Bush withdrew from the 1972 Anti-Ballistic Missile Treaty, Russia responded by building up its nuclear capabilities in ways intended to counterbalance U.S. capabilities.
Russia decided not to sign the UN treaty on the Prohibition of Nuclear Weapons, which was adopted on 7 July 2017 by 122 States. Most analysts agree that Russia's nuclear strategy under Putin eventually brought it into violation of the 1987 Intermediate-Range Nuclear Forces Treaty (although this is not confirmed).
According to Russian officials, the American decision to deploy the missile defense system in Europe was a violation of the treaty. U.S. President Donald Trump announced on 20 October 2018 that the U.S. would no longer consider itself bound by the treaty's provisions, raising nuclear tensions between the two powers.
Nuclear arsenal of Russia:
The exact number of nuclear warheads is a state secret and is therefore a matter of guesswork.
As of 2022, the Federation of American Scientists estimates that Russia possesses 5,977 nuclear weapons, while the United States has 5,428; Russia and the U.S. each have about 1,600 active deployed strategic nuclear warheads.
Russia's stockpile is growing in size, while the United States' is shrinking. Russia has six nuclear missile fields in Kozelsk, Tatishchevo, Uzhur, Dombarovskiy, Kartalay, and Aleysk; nuclear missile submarines patrolling from three naval bases at Nerpich'ya, Yagel'Naya, and Rybachiy; and nuclear bombers at Ukrainka and Engels air bases.
The RS-28 Sarmat (Russian: РС-28 Сармат; NATO reporting name: SATAN 2), is a Russian liquid-fueled, MIRV-equipped, super-heavy thermonuclear armed intercontinental ballistic missile in development by the Makeyev Rocket Design Bureau since 2009, intended to replace the previous R-36 missile.
Its large payload would allow for up to 10 heavy warheads or 15 lighter ones, or a combination of warheads and massive amounts of countermeasures designed to defeat anti-missile systems. It was heralded by the Russian military as a response to the U.S. Prompt Global Strike.
In 2015, information emerged that Russia may be developing a new nuclear torpedo, the Status-6 Ocean Multipurpose System, codenamed "Kanyon" by Pentagon officials. This weapon is designed to create a tsunami wave up to 500 m tall that would radioactively contaminate a wide area of an enemy coast with cobalt-60, and to be immune to anti-missile defense systems such as laser weapons and railguns that might disable an ICBM.
Two potential carrier submarines, the Project 09852 Belgorod, and the Project 09851 Khabarovsk, are new boats laid down in 2012 and 2014 respectively.
Status-6 appears to be a deterrent weapon of last resort: a torpedo-shaped robotic mini-submarine that can travel at speeds of 185 km/h (100 kn). More recent information suggests a top speed of 100 km/h (54 kn), with a range of 10,000 km (6,200 mi) and a maximum depth of 1,000 m (3,300 ft). This underwater drone is cloaked by stealth technology to elude acoustic tracking devices.
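As a quick sanity check on the performance figures reported above, the short sketch below converts the quoted speeds to knots (1 kn = 1.852 km/h) and estimates transit time over the reported 10,000 km range. It is simply arithmetic on the numbers in the text, not an independent claim about the weapon.

KM_PER_KNOT = 1.852  # 1 knot = 1.852 km/h

def kmh_to_knots(kmh: float) -> float:
    return kmh / KM_PER_KNOT

range_km = 10_000
for speed_kmh in (185, 100):  # early reported speed vs. more recent top-speed figure
    hours = range_km / speed_kmh
    print(f"{speed_kmh} km/h ~ {kmh_to_knots(speed_kmh):.0f} kn; "
          f"10,000 km takes ~{hours:.0f} h (~{hours / 24:.1f} days)")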
During an annual state-of-the-nation address given on March 1, 2018, President Vladimir Putin publicly claimed that Russia was now in possession of several new classes of nuclear weapons, including some with capabilities previously speculated to exist. Putin discussed several new or upgraded weapons, including a hypersonic glide vehicle known as the Avangard, capable of performing sharp maneuvers while traveling at 20 times the speed of sound, making it "absolutely invulnerable for any missile defense system."
Putin discussed the existence of a nuclear powered underwater torpedo and a nuclear powered cruise missile (9M730 Burevestnik), both with effectively unlimited range. He discussed that Russia had tested a new class of traditional ICBM called the Sarmat, which expanded upon the range and carrying capability of the Soviet-era Satan ICBM. Animations of these weapons were shown in front of the live and televised audience. Putin suggested that an online poll be conducted to give them official public names.
Nuclear weapons in Russian military doctrine:
Main article: Military doctrine of Russia
According to a Russian military doctrine stated in 2010, nuclear weapons could be used by Russia "in response to the use of nuclear and other types of weapons of mass destruction against it or its allies, and also in case of aggression against Russia with the use of conventional weapons when the very existence of the state is threatened".
Most military analysts believe that, in this case, Russia would pursue an 'escalate to de-escalate' strategy, initiating limited nuclear exchange to bring adversaries to the negotiating table. Russia will also threaten nuclear conflict to discourage initial escalation of any major conventional conflict.
Nuclear proliferation:
Main articles:
After the Korean War, the Soviet Union transferred nuclear technology and weapons to the People's Republic of China as an adversary of the United States and NATO. According to Ion Mihai Pacepa, "Khrushchev's nuclear-proliferation process started with Communist China in April 1955, when the new ruler in the Kremlin consented to supply Beijing a sample atomic bomb and to help with its mass production. Subsequently, the Soviet Union built all the essentials of China's new military nuclear industry."
Russia is one of the five "Nuclear Weapons States" (NWS) under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), which Russia ratified (as the Soviet Union) in 1968.
Following the dissolution of the Soviet Union in 1991, a number of Soviet-era nuclear warheads remained on the territories of Belarus, Ukraine, and Kazakhstan.
Under the terms of the Lisbon Protocol to the NPT, and following the 1995 Trilateral Agreement between Russia, Belarus, and the US, these were transferred to Russia, leaving Russia as the sole inheritor of the Soviet nuclear arsenal. It is estimated that the Soviet Union had approximately 45,000 nuclear weapons stockpiled at the time of its collapse.
The collapse of the Soviet Union allowed for a warming of relations with NATO. Fears of a nuclear holocaust lessened. In September 1997, the former secretary of the Russian Security Council Alexander Lebed claimed 100 "suitcase sized" nuclear weapons were unaccounted for.
He said he was attempting to inventory the weapons when he was fired by President Boris Yeltsin in October 1996. Indeed, several US politicians have expressed worries and promised legislation addressing the threat.
There were allegations that Russia contributed to the North Korean nuclear program, selling it the equipment for the safe storage and transportation of nuclear materials.
Nevertheless, Russia has condemned North Korean nuclear tests since then. The Russian Federation also has wider commercial interests in selling nuclear technology to India and Iran, and has reached memorandums of understanding on training their technicians for their respective nuclear programs.
Russia is also allegedly working to build its influence in Africa, where it stands to earn several billion pounds by selling nuclear technology to developing countries.
Nuclear sabotage allegations from Russia:
The highest-ranking GRU defector Stanislav Lunev described alleged Soviet plans for using tactical nuclear weapons for sabotage against the United States in the event of war.
He described Soviet-made suitcase nukes identified as RA-115s (or RA-115-01s for submersible weapons) which weigh 50–60 pounds (23–27 kg). These portable bombs can last for many years if wired to an electric source. "In case there is a loss of power, there is a battery backup. If the battery runs low, the weapon has a transmitter that sends a coded message – either by satellite or directly to a GRU post at a Russian embassy or consulate."
Lunev was personally looking for hiding places for weapons caches in the Shenandoah Valley area. He said that "it is surprisingly easy to smuggle nuclear weapons into the US", either across the Mexican border or using a small transport missile that can slip through undetected when launched from a Russian airplane.
Searches of the areas identified by Lunev – who admits he never planted any weapons in the US – have been conducted, "but law-enforcement officials have never found such weapons caches, with or without portable nuclear weapons" in the US.
In a 2004 interview, colonel general of RVSN Viktor Yesin said that Soviet small-scale nuclear bombs have only been operated by the Army. All such devices have been stored in a weapons depot within Russia, and only left it for checks at the plant which produced them.
2020 Russian nuclear deterrence state policy:
On June 2, 2020, President Putin signed an Executive Order formally titled "Fundamentals of Russia’s Nuclear Deterrence State Policy", in an unprecedented public release of an official document on Russia's nuclear policy.
The six-page document identified the range of threats that Russia seeks to deter with its nuclear forces, clarified Russia’s general approach to nuclear deterrence, and articulated the conditions under which Russia might use nuclear weapons. The policy endorses the use of nuclear weapons in response to a non-nuclear strike due to the improved capabilities of U.S. conventional weapons.
2022 Russian invasion of Ukraine:
See also: Nuclear threats during the 2022 Russian invasion of Ukraine
During the 2022 Russian invasion of Ukraine, Russian President Vladimir Putin placed Strategic Rocket Forces's nuclear deterrence units on high alert, a move heavily condemned internationally.
Putin warned that "whoever tries to hinder us in Ukraine would see consequences you have never seen in your history". According to the US Director of National Intelligence, Avril Haines, Putin could potentially turn to nuclear weapons if he perceived an "existential threat" to the Russian state or regime; there has been speculation that he could regard defeat in Ukraine as an existential threat to his regime.
According to a peer-reviewed study published in the journal Nature Food in August 2022, a full-scale nuclear war between the United States and Russia, which together hold more than 90% of the world's nuclear weapons, would kill 360 million people directly and more than 5 billion indirectly by starvation during a nuclear winter.
In September 2022, Putin announced the 2022 Russian mobilization and threatened nuclear retaliation against the west if Russia's territorial integrity was threatened.
On February 21, 2023, Putin suspended Russia's participation in the New START nuclear arms reduction treaty with the United States, saying that Russia would not allow the US and NATO to inspect its nuclear facilities.
On 25 March 2023, Putin announced that Russia would station tactical nuclear weapons in Belarus. On 14 June 2023, Belarusian President Aleksander Lukashenko stated in a TV interview with Russian state channel Russia-1 that Belarus had started to take delivery of the weapons.
Click on any of the following hyperlinks for more about Russia and its "Weapons of Mass Destruction":
"It’s a troubling question with no palatable answer: What would President Joe Biden do if Russia used nuclear weapons in the Ukraine war?
A half dozen current and former government officials briefed on the issue, and several outside experts, told NBC News there was no playbook and little agreement about how the U.S. would respond to a norm-shattering act of destruction that could obliterate a Ukrainian city, kill tens of thousands and send a cloud of nuclear fallout drifting over NATO countries in Western Europe.
This isn’t new to the Biden administration. In fact, when the Obama administration conducted a war game simulating Russian use of nuclear weapons in the Baltics, there were fundamental disagreements about how to react.
U.S. intelligence officials say they have seen no signs that Russian President Vladimir Putin is poised to employ so-called battlefield nukes, but several versions of Russian military doctrine published since 2000 have envisioned the first use of nuclear weapons in response to a conventional threat in a regional war in response to a conventional threat against the Russian homeland.
And military experts say Russia’s smallest warheads have many times the explosive power of the bombs dropped on Hiroshima and Nagasaki.
As the U.S. continues to send ever more sophisticated weapons designed to help Ukraine destroy invading Russian forces, American officials tell NBC News the Biden administration has for months been thinking the unthinkable about what Putin could do — and war-gaming scenarios envisioning Russia using an atomic bomb on Ukraine.
“We don’t see … practical evidence at this point of Russian planning for the deployment or even potential use of tactical nuclear weapons,” CIA Director William Burns said last month. But, he added, “given the kind of saber-rattling that ... we’ve heard from the Russian leadership, we can’t take lightly those possibilities.”
NBC News talked to a half dozen current and former officials briefed on the issue, and several outside experts. Current officials declined to speak on the record, citing the sensitivity of the planning.
It’s fair to say that the American response “would depend wildly on how the Russians used” a nuclear weapon, as one U.S. official regularly briefed on U.S. government deliberations put it.
A demonstration shot over the Black Sea? A strike on Ukrainian troops in a remote area? Or far more provocative scenarios, such as a devastating blow to a major Ukrainian city or a nuclear attack on a NATO country?
The menu of American options is stark, officials and outside experts say: Stay the course, up the sanctions and keep arming the Ukrainians, while building an international coalition against Russia that completely isolates the country; launch a conventional military attack on Russian forces in Ukraine or Russia; or respond with a nuclear attack.
Unless a NATO country was hit, the U.S. would not have any obligation to respond.
Some military and intelligence officials told NBC News they believe it’s unlikely the U.S. would retaliate militarily after a single Russian use of a so-called tactical nuclear weapon in Ukraine.
Others said Biden would have to unleash some conventional force, perhaps attacking Russian troops in Ukraine or the Russian military unit that launched the nuclear weapon, an option that could have serious repercussions, since Russian military leaders might be killed.
If Russia used a nuclear weapon of any type, “I expect (the president) to say we’re in a new situation, and the U.S. will directly enter the war against Russia to stop this government that has not only broken so many international laws and violated human rights but also now violated the nuclear taboo,” said Evelyn Farkas, a former top Pentagon official for Russia and now executive director of the McCain Institute. “Putin will be signing the order on changing the regime.”
But two U.S. officials briefed on the issue did not agree, with one saying, “Unless they use them on NATO we’re probably not going to respond militarily.”
Under this thinking, Biden would not want to risk an escalation into a full-scale nuclear war that leads to the destruction of American cities. But he might not have to, because if Putin were to go nuclear, experts believe most other countries in the world, including many that are sitting on the fence in the current conflict, would quickly turn against and isolate Russia.
“The whole world would stop,” said Joseph Cirincione, a nuclear expert and distinguished fellow at the Quincy Institute for Responsible Statecraft.
A remote possibility that can’t be taken lightlyAmerican and Western national security officials tell NBC News there has been no sign that Russia has moved tactical nuclear weapons out of storage facilities. Intelligence officials have said they assess that Putin would consider nukes only if he believed his regime was in mortal danger.
But two U.S. officials, citing American intelligence assessments, say some in Putin’s inner circle have encouraged him to test a nuclear weapon as a show of strength during moments when his conventional forces have struggled in Ukraine. The officials said there is continuing concern that Putin could choose this option if he believes Russia has been backed into a corner.
Putin placed Moscow’s nuclear forces on high alert shortly after his invasion of Ukraine began on Feb. 24, and he warned that “no one should have any doubts that a direct attack on our country will lead to the destruction and horrible consequences for any potential aggressor.”
But U.S. officials told NBC News they did not see any changes to their footprint or movements at the time. In April, Russian foreign minister Sergey Lavrov warned the West not to underestimate the elevated risks of nuclear conflict over Ukraine. Putin supporters on Russian state TV in recent weeks have talked openly about a nuclear war with the U.S. and Europe.
All this comes against the backdrop of a Russian nuclear doctrine that has evolved in what Western officials consider disturbing ways. In a 1993 document, Russia said it would use nuclear weapons only when the existence of the country was threatened. But in versions published since 2000, Russia reserves the right to use nuclear weapons in response to any weapons of mass destruction used to attack Russia and its allies, according to scholars who have examined it.
The doctrine also allows for the use of nuclear weapons “in response to large-scale aggression using conventional weapons in situations critical to the national security of the Russian Federation.” Experts have described that strategy as “escalate to de-escalate,” and they say it means that Russia is willing to make limited use of nuclear weapons to win what would otherwise be a conventional war.
On paper, U.S. nuclear doctrine is similar, but in practical terms, experts do not believe an American president would ever use nuclear weapons in a regional conventional war, and the U.S. has not, through Korea, Vietnam, Afghanistan and Iraq. Officials say the main purpose of the U.S. nuclear arsenal is to deter or respond to a nuclear attack by an adversary.
Still, the U.S. has not ruled out using nuclear weapons in response to biological or chemical attacks, and in some limited circumstances, conventional attacks. It still maintains around 100 nuclear weapons in NATO countries, put there originally to stop Russian tanks from seizing Western Europe.
Nuclear warning shot?Officials are struggling to understand exactly what could prompt Putin to use a nuclear weapon. To cement gains made on the battlefield? To reverse losses? To stave off a rout?
“It’s not clear where that red line is. If Ukrainian forces were to enter Russian territory, would that be sufficient? I don’t know," said Chris Chivvis, who served from April 2018 to April 2021 as the U.S. national intelligence officer for Europe.
Citing the Russian invasion of Georgia in 2008, its seizure of Ukraine’s Crimean Peninsula in 2014 and the Kremlin’s interference in the 2016 U.S. election, he added, “The reality is we have been surprised by Russia three or four times in the last 15 years.”
Although technology exists to make battlefield nuclear weapons smaller than those dropped on Hiroshima and Nagasaki, Russia does not have warheads that small, according to Jeffrey Lewis, a nuclear expert at the Middlebury Institute. All of its so-called tactical nuclear weapons have enormous explosive power.
There is no technological distinction between “tactical” nuclear weapons and “strategic” ones — the difference is in the targets and the goals. Tactical nukes are used to gain advantage on the battlefield, while strategic weapons are aimed at military infrastructure and even whole cities.
If Russia decided to use one, its options could include an attack on an airbase or other military target, an attack on a Ukrainian city or a test of a nuclear weapon at a remote site — a warning shot designed to signal Moscow’s willingness to use the ultimate weapon, former officials said. It could deliver the weapon as a bomb, or via a missile.
Although none of these scenarios are likely, the nuclear test could be the most attractive for Moscow, some experts said.
Testing a nuclear weapon would be an extraordinarily provocative step, something only North Korea has done in this century.
A test above ground would risk radioactive material drifting into populated areas in Russian territory or NATO countries, depending on where it was carried out and the weather conditions. The former Soviet Union’s last nuclear test was carried out underground in 1989.
If Russia faced impending defeat in Ukraine, a single "demonstration attack," either on Ukrainian territory or possibly on the Black Sea, could seek to “convey their resolve, to try to force terror on the other party and get the Ukrainians to fold,” said Rose Gottemoeller, a former deputy secretary general of NATO who is now at Stanford University’s Freeman Spogli Institute for International Studies.
“They would be trying to strike terror into the hearts of the Ukrainians, get them to back down, get them even to concede defeat,” Gottemoeller said.
“Whatever he (Putin) did, he would do it in the belief that it would ensure his survival and perhaps compel surrender or retreat for the Ukrainians.”
Instead of a nuclear exchange with the U.S., “Russia has many options that it could employ either in Ukraine or elsewhere that would be much smaller steps up the nuclear escalatory ladder, but that nevertheless would represent a sea change in world history,“ said Chivvis, now a senior fellow at the Carnegie Endowment think tank. “I worry that people are not being open-minded to the reality that there are scenarios in which Russia could use nuclear weapons. They’re not the most likely scenarios, but to be responsible, we have to figure that into our thinking about this conflict,” he said.
Strategic ambiguityThe Biden administration has intentionally avoided spelling out how it would respond if Russia launched a nuclear attack in Ukraine, leaving open the possibility of retaliating with nuclear weapons, conventional forces, a cyber operation or other means.
“We have to be crystal clear in our policy of warning him of a swift and decisive response, without necessarily being unambiguous about what that would be,” said Alexander Vershbow, who served as deputy secretary general of NATO from 2012 to 2016 and as ambassador to Russia from 2001 to 2005.
Biden would have to at least consider a major conventional military response in support of Ukraine, former U.S. officials said.
A Russian attack on Ukraine with a tactical nuclear weapon would pose an excruciating dilemma for Washington and its NATO allies. Biden and Western political leaders would have to weigh a response that would avoid triggering a full-blown nuclear conflict with Russia, while still imposing a heavy cost on Moscow.
Jeffrey Edmonds, an expert on the Russian military at the CNA think tank, says there are four possible response scenarios, only two of which are plausible: Capitulate and sue for peace; stay the course with sanctions and pressure; mount a conventional attack to punish Russia; respond with a nuclear attack on Russia.
The real choice, he believes, is either staying the course or a conventional attack.
Biden could decide that “what we’re doing is working, we’re just going to keep going, we’re going to take the moral high ground.”
Presumably, Russia would become more isolated diplomatically and international sanctions pressure would ratchet up. But Edmonds noted that calls for a military response to a Russian use of nuclear weapons would be “deafening” in Washington.
In his book “The Bomb,” about nuclear war planning, author Fred Kaplan writes about a National Security Council war game during the Obama administration that simulated a Russian tactical nuclear attack on a NATO country during a Russian invasion of one of the Baltic States.
Lower level officials decided not to respond with a nuclear weapon, instead continuing to fight with conventional forces. But when the same scenario was presented to Cabinet level officials, they decided that the U.S. had to respond with a nuclear attack, and they targeted Russian ally Belarus.
“I think that’s nuts,” Cirincione said. “There is a belief that you can have a limited nuclear exchange. You don’t want to get in that box, because once you are in that nuclear war-fighting mindset, you can’t control it.”
The Biden administration’s track record so far suggests it would move cautiously, in consultation with its European allies, and seek to avoid plunging the world into a nuclear conflagration, former officials said.
The administration has faced criticism that it has moved too slowly to send advanced weapons to Ukraine, but the White House’s supporters say the administration has focused on avoiding actions that could escalate the crisis into a direct clash between Russia and the U.S.
Realistically, the U.S. would look for ways to respond short of launching a nuclear weapon, possibly through cyber operations or other support for Ukraine, said Gottemoeller.
The United States would need to avoid any kind of nuclear escalation in the interest of the U.S. and its allies, but also for “global survival,” she said."
[End of Article]
___________________________________________________________________________
Russia and its weapons of mass destruction (Wikipedia)
The Russian Federation is known to possess or have possessed three types of weapons of mass destruction:
It is one of the five nuclear-weapon states recognized under the Treaty on the Non-Proliferation of Nuclear Weapons.
Russia possesses a total of 5,889 nuclear warheads as of 2023, the largest stockpile of nuclear warheads in the world; the second-largest stockpile is the United States' 5,428 warheads. Russia's deployed missiles (those actually ready to be launched) number about 1,674, second to the United States' 1,770.
The remaining weapons are either in reserve stockpiles, or have been retired and are slated for dismantling. Russia's predecessor state, the Soviet Union, reached a peak stockpile of about 45,000 nuclear warheads in 1986. The number of weapons Russia may possess is currently controlled by the bilateral New START treaty with the United States.
The Soviet Union ratified the Geneva Protocol—prohibiting the use of biological and chemical weapons—on April 5, 1928, with reservations that were later dropped on January 18, 2001.
Russia is also party to the 1972 Biological Weapons Convention and the 1993 Chemical Weapons Convention. The Soviet biological weapons program violated the Biological Weapons Convention and was the world's largest, longest, and most sophisticated program of its kind. At its peak, the program employed up to 65,000 people.
Despite being a signatory to the Chemical Weapons Convention, Russia has continued to hold and occasionally use chemical weapons. In 1997, Russia declared an arsenal of 39,967 tons of chemical weapons, which it subsequently worked to reduce.
Its stock of weapons was declared destroyed in 2017. The poisoning of Sergei and Yulia Skripal in 2018 and the poisoning of Alexei Navalny in 2020, both carried out by Russia, revealed that the country maintained an illicit chemical weapons program.
Nuclear weapons:
History:
Soviet era:
Main article: Soviet atomic bomb project
Post-Soviet era:
At the dissolution of the Soviet Union in 1991, Soviet nuclear weapons were deployed in four of the new republics: Russia, Ukraine, Belarus and Kazakhstan.
In May 1992, these four states signed the Lisbon Protocol, agreeing to join the Treaty on the Non-Proliferation of Nuclear Weapons, with Russia the successor state to the Soviet Union as a nuclear state, and the other three states joining as non-nuclear states.
Ukraine agreed to give up its weapons to Russia, in exchange for guarantees of Ukrainian territory from Russia, the United Kingdom, and the United States, known as the Budapest Memorandum on Security Assurances. China and France also made statements in support of the memorandum.
Arms reduction:
Main article: Megatons to Megawatts Program
Nuclear warfare was a persistent and terrifying threat during the Cold War. At its height, the Soviet Union and the United States each mustered tens of thousands of warheads under the doctrine of mutual assured destruction.
By the 1980s, both the United States and Soviet Union sought to reduce the number of weapons the other was fielding. This led to the opening of arms reduction talks in 1982.
This culminated in the signing of the START I treaty in 1991: the first nuclear arms reduction treaty between the two global powers. This first treaty limited the number of deployed warheads in each nation to 6,000, nearly halving the prior 10,000 to 12,000 being fielded in 1991.
The considerable success of START I, combined with the dissolution of the Soviet Union in 1991, led to the START II treaty. Russia never ratified the treaty, and it did not go into effect.
START III was attempted but could not get past negotiations.
Instead, the Strategic Offensive Reductions Treaty was signed in 2002, capping deployed warheads at 2,200. The current limitations stem from the New START treaty, ratified in 2010, which limits each side to 1,550 deployed warheads. Nuclear bombers count as only one warhead each, even though they may carry up to 20, so the actual number of deliverable weapons is somewhat higher. The treaty remains in force through 2026.
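To make the bomber counting rule concrete, here is a minimal illustrative sketch in Python. The force numbers are hypothetical and chosen only so that they add up to the treaty ceiling; they are not actual U.S. or Russian figures.

# Illustrative sketch of New START accounting, using hypothetical force numbers.
# Under the treaty's counting rule, each deployed heavy bomber counts as one
# warhead regardless of how many weapons it could actually carry.

def new_start_count(icbm_warheads: int, slbm_warheads: int, bombers: int) -> int:
    """Warheads as counted under New START rules (each bomber counts as 1)."""
    return icbm_warheads + slbm_warheads + bombers

def potential_weapons(icbm_warheads: int, slbm_warheads: int, bombers: int,
                      weapons_per_bomber: int = 20) -> int:
    """Deliverable weapons if every bomber carried a full load."""
    return icbm_warheads + slbm_warheads + bombers * weapons_per_bomber

# A hypothetical force of 1,490 missile warheads plus 60 bombers:
print(new_start_count(1090, 400, 60))    # 1550 -- at the treaty ceiling
print(potential_weapons(1090, 400, 60))  # 2690 -- the real ceiling is higher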
After U.S. President George W. Bush withdrew from the 1972 Anti-Ballistic Missile Treaty, Russia responded by building up its nuclear capabilities in a way intended to counterbalance U.S. capabilities.
Russia decided not to sign the UN Treaty on the Prohibition of Nuclear Weapons, which was adopted on 7 July 2017 by 122 states. Most analysts agree that Russia's nuclear strategy under Putin eventually brought it into violation of the 1987 Intermediate-Range Nuclear Forces Treaty, although this has not been officially confirmed.
According to Russian officials, the American decision to deploy the missile defense system in Europe was a violation of the treaty. U.S. President Donald Trump announced on 20 October 2018 that the U.S. would no longer consider itself bound by the treaty's provisions, raising nuclear tensions between the two powers.
Nuclear arsenal of Russia:
The exact number of nuclear warheads is a state secret and is therefore a matter of guesswork.
As of 2022, the Federation of American Scientists estimates that Russia possesses 5,977 nuclear weapons, while the United States has 5,428; Russia and the U.S. each have about 1,600 active deployed strategic nuclear warheads.
Russia's stockpile is growing in size, while the United States' is shrinking. Russia has six nuclear missile fields in Kozelsk, Tatishchevo, Uzhur, Dombarovskiy, Kartalay, and Aleysk; nuclear missile submarines patrolling from three naval bases at Nerpich'ya, Yagel'Naya, and Rybachiy; and nuclear bombers at Ukrainka and Engels air bases.
The RS-28 Sarmat (Russian: РС-28 Сармат; NATO reporting name: SATAN 2) is a Russian liquid-fueled, MIRV-equipped, super-heavy thermonuclear-armed intercontinental ballistic missile, in development by the Makeyev Rocket Design Bureau since 2009 and intended to replace the previous R-36 missile.
Its large payload would allow for up to 10 heavy warheads or 15 lighter ones, or a combination of warheads and massive amounts of countermeasures designed to defeat anti-missile systems. It was heralded by the Russian military as a response to the U.S. Prompt Global Strike.
In 2015, information emerged that Russia may be developing a new nuclear torpedo, the Status-6 Ocean Multipurpose System, codenamed "Kanyon" by Pentagon officials. This weapon is designed to create a tsunami wave up to 500 m tall that would radioactively contaminate a wide area along an enemy's coast with cobalt-60, and to be immune to anti-missile defense systems such as laser weapons and railguns that might disable an ICBM.
Two potential carrier submarines, the Project 09852 Belgorod, and the Project 09851 Khabarovsk, are new boats laid down in 2012 and 2014 respectively.
Status-6 appears to be a deterrent weapon of last resort. It appears to be a torpedo-shaped robotic mini-submarine that can travel at speeds of 185 km/h (100 kn). More recent information suggests a top speed of 100 km/h (54 kn), with a range of 10,000 km (6,200 mi) and a maximum depth of 1,000 m (3,300 ft). This underwater drone is cloaked by stealth technology to elude acoustic tracking devices.
During an annual state-of-the-nation address given on March 1, 2018, President Vladimir Putin publicly claimed that Russia was now in possession of several new classes of nuclear weapons, including some with capabilities previously speculated to exist. Putin discussed several new or upgraded weapons, including a hypersonic glide vehicle known as the Avangard, capable of performing sharp maneuvers while traveling at 20 times the speed of sound, making it "absolutely invulnerable for any missile defense system."
Putin discussed the existence of a nuclear-powered underwater torpedo and a nuclear-powered cruise missile (9M730 Burevestnik), both with effectively unlimited range. He said that Russia had tested a new class of traditional ICBM called the Sarmat, which expanded upon the range and carrying capability of the Soviet-era Satan ICBM. Animations of these weapons were shown in front of the live and televised audience. Putin suggested that an online poll be conducted to give them official public names.
Nuclear weapons in Russian military doctrine:
Main article: Military doctrine of Russia
According to a Russian military doctrine stated in 2010, nuclear weapons could be used by Russia "in response to the use of nuclear and other types of weapons of mass destruction against it or its allies, and also in case of aggression against Russia with the use of conventional weapons when the very existence of the state is threatened".
Most military analysts believe that, in such a case, Russia would pursue an "escalate to de-escalate" strategy, initiating a limited nuclear exchange to bring adversaries to the negotiating table. Russia would also threaten nuclear conflict to discourage the initial escalation of any major conventional conflict.
Nuclear proliferation:
Main articles:
- Russia-China relations,
- Russia-India relations,
- Russia-North Korea relations,
- and Russia-Iran relations
After the Korean War, the Soviet Union transferred nuclear technology and weapons to the People's Republic of China as an adversary of the United States and NATO. According to Ion Mihai Pacepa, "Khrushchev's nuclear-proliferation process started with Communist China in April 1955, when the new ruler in the Kremlin consented to supply Beijing a sample atomic bomb and to help with its mass production. Subsequently, the Soviet Union built all the essentials of China's new military nuclear industry."
Russia is one of the five "Nuclear Weapons States" (NWS) under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), which Russia ratified (as the Soviet Union) in 1968.
Following the dissolution of the Soviet Union in 1991, a number of Soviet-era nuclear warheads remained on the territories of Belarus, Ukraine, and Kazakhstan.
Under the terms of the Lisbon Protocol to the NPT, and following the 1995 Trilateral Agreement between Russia, Belarus, and the US, these were transferred to Russia, leaving Russia as the sole inheritor of the Soviet nuclear arsenal. It is estimated that the Soviet Union had approximately 45,000 nuclear weapons stockpiled at the time of its collapse.
The collapse of the Soviet Union allowed for a warming of relations with NATO. Fears of a nuclear holocaust lessened. In September 1997, the former secretary of the Russian Security Council Alexander Lebed claimed 100 "suitcase sized" nuclear weapons were unaccounted for.
He said he was attempting to inventory the weapons when he was fired by President Boris Yeltsin in October 1996. Indeed, several US politicians have expressed worries and promised legislation addressing the threat.
There were allegations that Russia contributed to the North Korean nuclear program, selling it the equipment for the safe storage and transportation of nuclear materials.
Nevertheless, Russia has condemned North Korean nuclear tests since then. The Russian Federation also has wider commercial interests in selling nuclear technology to India and Iran, and has reached memorandums of understanding to train their technicians for their respective nuclear programs.
Russia is also allegedly making efforts to build its influence in Africa, hoping to earn several billion pounds by selling nuclear technology to developing African countries.
Nuclear sabotage allegations from Russia:
The highest-ranking GRU defector Stanislav Lunev described alleged Soviet plans for using tactical nuclear weapons for sabotage against the United States in the event of war.
He described Soviet-made suitcase nukes identified as RA-115s (or RA-115-01s for submersible weapons) which weigh 50–60 pounds (23–27 kg). These portable bombs can last for many years if wired to an electric source. "In case there is a loss of power, there is a battery backup. If the battery runs low, the weapon has a transmitter that sends a coded message – either by satellite or directly to a GRU post at a Russian embassy or consulate."
Lunev was personally looking for hiding places for weapons caches in the Shenandoah Valley area. He said that "it is surprisingly easy to smuggle nuclear weapons into the US" either across the Mexican border or using a small transport missile that can slip through undetected when launched from a Russian airplane.
Searches of the areas identified by Lunev – who admits he never planted any weapons in the US – have been conducted, "but law-enforcement officials have never found such weapons caches, with or without portable nuclear weapons" in the US.
In a 2004 interview, colonel general of RVSN Viktor Yesin said that Soviet small-scale nuclear bombs have only been operated by the Army. All such devices have been stored in a weapons depot within Russia, and only left it for checks at the plant which produced them.
2020 Russian nuclear deterrence state policy:
On June 2, 2020, President Putin signed an Executive Order formally titled "Fundamentals of Russia’s Nuclear Deterrence State Policy", in an unprecedented public release of an official document on Russia's nuclear policy.
The six-page document identified the range of threats that Russia seeks to deter with its nuclear forces, clarified Russia's general approach to nuclear deterrence, and articulated the conditions under which Russia might use nuclear weapons. The policy endorses the use of nuclear weapons in response to a non-nuclear strike, owing to the improved capabilities of U.S. conventional weapons.
2022 Russian invasion of Ukraine:
See also: Nuclear threats during the 2022 Russian invasion of Ukraine
During the 2022 Russian invasion of Ukraine, Russian President Vladimir Putin placed the Strategic Rocket Forces' nuclear deterrence units on high alert, a move heavily condemned internationally.
Putin warned that "whoever tries to hinder us in Ukraine would see consequences you have never seen in your history". According to the US Director of National Intelligence, Avril Haines, Putin could potentially turn to nuclear weapons if he perceived an "existential threat" to the Russian state or regime; there has been speculation that he could regard defeat in Ukraine as an existential threat to his regime.
According to a peer-reviewed study published in the journal Nature Food in August 2022, a full-scale nuclear war between the United States and Russia, which together hold more than 90% of the world's nuclear weapons, would kill 360 million people directly and more than 5 billion indirectly by starvation during a nuclear winter.
In September 2022, Putin announced the 2022 Russian mobilization and threatened nuclear retaliation against the West if Russia's territorial integrity was threatened.
On February 21, 2023, Putin suspended Russia's participation in the New START nuclear arms reduction treaty with the United States, saying that Russia would not allow the US and NATO to inspect its nuclear facilities.
On 25 March 2023, Putin announced that Russia would be stationing tactical nuclear weapons in Belarus. On 14 June 2023, Belarusian President Aleksander Lukashenko stated in a TV interview with Russian state channel Russia-1 that Belarus had started to take delivery of the weapons.
Click on any of the following hyperlinks for more about Russia and its "Weapons of Mass Destruction":
- Biological weapons
- Chemical weapons
- See also:
- Defense industry of Russia
- Father of All Bombs
- List of Russian weaponry makers
- Military doctrine of Russia
- New physical principles weapons
- Nuclear weapons and the United States
- Nuclear weapons of the United Kingdom
- Nunn–Lugar Cooperative Threat Reduction
- Soviet biological weapons program
- Video archive of the Soviet Union's Nuclear Testing at sonicbomb.com
- Abolishing Weapons of Mass Destruction: Addressing Cold War and Other Wartime Legacies in the Twenty-First Century by Mikhail S. Gorbachev
- Russia's Nuclear Policy in the 21st Century Environment - analysis by Dmitri Trenin, IFRI Proliferation Papers n°13, 2005
- Nuclear Threat Initiative on Russia by National Journal
- Nuclear stockpile estimate fas.org
- Nuclear Notebook: Russian nuclear forces, 2006, Bulletin of the Atomic Scientists, March/April 2006.
- Current information on nuclear stockpiles in Russia. nuclearfiles.org
- Chemical Weapons in Russia: History, Ecology, Politics by Lev Fedorov, Moscow, Center of Ecological Policy of Russia, 27 July 1994
United States and its Weapons of Mass Destruction
- Pictured below: Three nuclear weapon delivery systems in the United States nuclear arsenal
United States and weapons of mass destruction (Wikipedia)
The United States is known to have possessed three types of weapons of mass destruction: nuclear, biological, and chemical weapons.
The U.S. is the only country to have used nuclear weapons against another country, when it detonated two atomic bombs over the Japanese cities of Hiroshima and Nagasaki during World War II. It had secretly developed the earliest form of the atomic weapon during the 1940s under the title "Manhattan Project".
The United States pioneered the development of both the nuclear fission and hydrogen bombs (the latter involving nuclear fusion). It was the world's first and only nuclear power for four years, from 1945 until 1949, when the Soviet Union produced its own nuclear weapon. The United States has the second-largest number of nuclear weapons in the world, after the Russian Federation.
Nuclear weapons:
U.S. nuclear warhead stockpiles, 1945–2002.
Main article: Nuclear weapons of the United States
Nuclear weapons have been used twice in combat: two nuclear weapons were used by the United States against Japan during World War II in the atomic bombings of Hiroshima and Nagasaki. Altogether, the two bombings killed 105,000 people and injured thousands more while devastating hundreds or thousands of military bases, factories, and cottage industries.
The U.S. conducted an extensive nuclear testing program: 1,054 tests were conducted between 1945 and 1992. The exact number of nuclear devices detonated is unclear because some tests involved multiple devices while a few failed to explode or were designed not to create a nuclear explosion.
The last nuclear test by the United States was on September 23, 1992; the U.S. has signed but not ratified the Comprehensive Nuclear-Test-Ban Treaty.
Currently, the United States nuclear arsenal is deployed in three areas:
- Land-based intercontinental ballistic missiles, or ICBMs;
- Sea-based, nuclear submarine-launched ballistic missiles, or SLBMs; and
- Air-based nuclear weapons of the U.S. Air Force's heavy bomber group
The United States is one of the five "Nuclear Weapons States" under the Treaty on the Non-Proliferation of Nuclear Weapons, which the U.S. ratified in 1968. On October 13, 1999, the U.S. Senate rejected ratification of the Comprehensive Test Ban Treaty, having previously ratified the Partial Test Ban Treaty in 1963.
The U.S. has not, however, tested a nuclear weapon since 1992, though it has tested many non-nuclear components and has developed powerful supercomputers to duplicate the knowledge gained from testing without conducting the actual tests themselves.
In the early 1990s, the U.S. stopped developing new nuclear weapons and now devotes most of its nuclear efforts to stockpile stewardship, maintaining and dismantling its now-aging arsenal. The administration of George W. Bush decided in 2003 to engage in research toward a new generation of small nuclear weapons, especially "earth penetrators". The budget passed by the United States Congress in 2004 eliminated funding for some of this research, including the "bunker-busting" or "earth-penetrating" weapons.
The exact number of nuclear weapons possessed by the United States is difficult to determine. Different treaties and organizations have different criteria for reporting nuclear weapons, especially those held in reserve, and those being dismantled or rebuilt:
- In its Strategic Arms Reduction Treaty (START) declaration for 2003, the U.S. listed 5,968 deployed warheads as defined by START rules.
- The exact number as of September 30, 2009, was 5,113 warheads, according to a U.S. fact sheet released May 3, 2010.
In 2002, the United States and Russia agreed in the SORT treaty to reduce their deployed stockpiles to not more than 2,200 warheads each. In 2003, the U.S. rejected Russian proposals to further reduce both nations' nuclear stockpiles to 1,500 each.
In 2007, for the first time in 15 years, the United States built new warheads. These replaced some older warheads as part of the Minuteman III upgrade program. 2007 also saw the first Minuteman III missiles removed from service as part of the drawdown. Overall, stockpiles and deployment systems continue to decline in number under the terms of the New START treaty.
In 2014, the Bulletin of the Atomic Scientists released a report stating that a total of 2,530 warheads were kept in reserve and 2,120 were actively deployed. Of the actively deployed warheads, about 1,920 were strategic (excluding roughly 200 tactical B61s assigned to NATO nuclear weapon sharing arrangements). About 2,700 additional warheads were awaiting dismantlement, bringing the total United States inventory to about 7,400 warheads.
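As a quick arithmetic check on the 2014 figures above, the short Python sketch below simply sums the three categories quoted from the report. The dictionary labels paraphrase the text and are mine; the result of about 7,350 matches the article's rounded total of roughly 7,400.

# Minimal sketch: summing the 2014 warhead categories quoted above.
stockpile_2014 = {
    "actively deployed": 2120,        # of which ~1,920 strategic, excluding ~200 B61s
    "held in reserve": 2530,
    "awaiting dismantlement": 2700,
}

total = sum(stockpile_2014.values())
print(f"Total U.S. inventory: about {total:,} warheads")  # about 7,350 (~7,400 rounded)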
The U.S. government decided not to sign the UN treaty on the Prohibition of Nuclear Weapons, a binding agreement for negotiations for the total elimination of nuclear weapons, supported by more than 120 nations.
As of early 2019, more than 90% of the world's 13,865 nuclear weapons were owned by the United States and Russia. Russia has the most nuclear warheads, at 5,977, while the United States has 5,428.
Land-based ICBMs:
The U.S. Air Force currently operates 400 Minuteman III ICBMs, located primarily in the northern Rocky Mountain states and the Dakotas. Peacekeeper missiles were phased out of the Air Force inventory in 2005.
All USAF Minuteman II missiles were destroyed in accordance with the START treaty, and their launch silos were imploded, buried, and then sold to the public under START II. The U.S. goal under the SORT treaty was to reduce from 1,600 warheads deployed on over 500 missiles in 2003 to 500 warheads on 450 missiles in 2012.
The first Minuteman III missiles were removed under this plan in 2007 while, at the same time, the warheads deployed on Minuteman IIIs began to be upgraded from smaller W62s to larger W87s taken from decommissioned Peacekeeper missiles.
Air-based delivery systems:
B-2 Spirit stealth strategic bomber (see upper right image above).
The U.S. Air Force also operates a strategic nuclear bomber fleet. The bomber force consists of 36 nuclear-armed B-52 Stratofortresses and 13 B-2 Spirits. All 64 B-1s were retrofitted to operate in a solely conventional mode by 2007 and thus do not count as nuclear platforms.
In addition, the U.S. military can also deploy smaller tactical nuclear weapons, either through cruise missiles or with conventional fighter-bombers. The U.S. maintains about 400 nuclear gravity bombs capable of use by the following aircraft:
- F/A-18 Hornet
- F-15E,
- F-16,
- F-22,
- and F-35.
Some 350 of these bombs are deployed at seven airbases in six European NATO countries; of these, 180 tactical B61 nuclear bombs fall under a nuclear sharing arrangement.
Submarine-based ballistic missiles:
USS Kentucky (SSBN-737), an Ohio-class ballistic missile submarine.
The U.S. Navy currently has 18 Ohio-class submarines deployed, of which 14 are ballistic missile submarines. Each submarine is equipped with a maximum complement of 24 Trident II missiles. Approximately 12 U.S. attack submarines were equipped to launch nuclear Tomahawk missiles, but these weapons were removed from service by 2013.
As of 2018, the Ohio-class SSBNs carried 280 deployed and non-deployed SLBMs, of which 203 were deployed.
Click on any of the following blue hyperlinks for more about United States weapons of mass destruction:
- Biological weapons
- Chemical weapons
- See also:
- Defense Threat Reduction Agency – The U.S. Department of Defense's official Combat Support Agency for countering weapons of mass destruction.
- Dugway sheep incident
- Enduring Stockpile – the name of the United States's remaining arsenal of nuclear weapons following the end of the Cold War.
- List of U.S. biological weapons topics
- Nuclear weapons and the United States
- Operation Paperclip – the codename under which the U.S. intelligence and military services extricated scientists from Germany, during and after the final stages of World War II.
- Nuclear weapons of the United Kingdom
- United States Army Chemical Corps
- United States missile defense
- New START Treaty Aggregate Numbers of Strategic Offensive Arms
- Video archive of the US's Nuclear Testing
- United States Nuclear Forces Guide
- Abolishing Weapons of Mass Destruction: Addressing Cold War and Other Wartime Legacies in the Twenty-First Century By Mikhail S. Gorbachev
- Nuclear Threat Initiative on United States
- U.S. Army Chemical Weapons Agency website
- Nuclear Files.org Current information on nuclear stockpiles in the United States
- Trends in U.S. Nuclear Policy - analysis by William C. Potter, IFRI Proliferation Papers n°11, 2005
- The Woodrow Wilson Center's Nuclear Proliferation International History Project or NPIHP is a global network of individuals and institutions engaged in the study of international nuclear history through archival documents, oral history interviews and other empirical sources.
Nuclear Weapons of Mass Destruction
- Pictured below: Color-coded world map highlighting nations with nuclear weapons (red), nations with only chemical and biological weapons (orange), and nations that previously had weapons of mass destruction but no longer do (yellow).
[This web page primarily covers nuclear weapons. However, the following material, like most discussions of weapons of mass destruction, includes not only nuclear weapons but also biological and chemical weapons.
Even so, of the three, nuclear weapons are by far the worst. The following Wilson Center article elaborates on this]:
Evaluating the Threat: Chemical, Biological, and Nuclear
Technology raises the stakes of the terrorist threat. Former Congressman Lee Hamilton outlines the danger from chemical, biological and nuclear weapons.
Why are we so worried about terrorism? After all, terrorism has been part of human experience for centuries. Sometimes – when assessing endless airport lines, billions of dollars in homeland security expenditures, or constant media analysis – you cannot help but wonder what all the fuss is about.
A short answer is technology. Yes, people have been killing innocent civilians to spread terror as long as people have wielded weapons. But never before have such horrific weapons been so widely available to people who can travel and communicate so freely – a degree of lethality previously reserved for nation-states. That is why every American should be familiar with the threat of chemical, biological and nuclear terrorism.
Of the three, chemical weapons are the easiest to obtain and use, though they are also the least deadly. An individual with the right know-how could purchase components for a chemical like the nerve gas sarin on the open market, or a terrorist group could buy or steal a weapon from one of the dozen or so nations that have stockpiles. Released into an enclosed space like a subway or arena, a properly weaponized chemical agent could kill hundreds, even thousands. A terrorist could achieve more deadly results by blowing up part of a chemical plant, releasing a cloud of toxic agents.
Biological weapons have the potential to cause greater harm. We have seen the capacity of a terrorist to spread death and disease – while evading capture – with anthrax. Future attacks could be far more deadly if they involve more infectious biological agents like smallpox and plague. Most experts fear that a biological attack of some sort is likely, in part because information on creating bio-weapons, and the equipment to produce them, are widely available.
Biological attacks are extremely hard to defend against. Even with a recent government subsidy, pharmaceutical companies lack incentive to develop vaccines for which there is little profit. Public health experts worry about our ability to detect an attack before it has spread beyond containment, and fear that our hospitals and local health agencies would be overrun by people seeking vaccines or antidotes.
Meanwhile, terrorism analysts are left guessing at a seemingly infinite possibility of methods – from contamination of the food supply to delivery via a crop-duster – while revolutionary advances in biotechnology make it hard to anticipate what the threat will be in a decade.
The most catastrophic threat is an attack with a nuclear weapon. Though nuclear terrorism is also the most unlikely of the three, the sheer magnitude of the danger causes deep concern. It is extremely difficult to build, transport and use a nuclear device, but the number of countries with a nuclear weapon has been growing, and last year's revelation of a global black market of nuclear secrets shows how the most deadly of technologies does not belong solely to nation-states.
It should be stressed that nuclear, biological and chemical materials are all extremely difficult to acquire, build, and turn into workable weapons. Movies dwell on scenarios where whole cities are wiped out, but the more likely scenarios involve less horrific destruction: a "dirty bomb" emitting radiation rather than a nuclear explosion; an anthrax-style biological attack rather than a smallpox epidemic; the release of a modest amount of dangerous chemicals rather than a mass-casualty scenario.
Still, an attack with one of these weapons that kills only a handful of people – less than a fire or major car accident – could cause tens of billions of dollars in damage, cleanup and economic fallout, not to mention widespread panic.
Nor should we take comfort in the unlikelihood of worst-case scenarios. If an attack does cause mass destruction, society might change in ways that are hard to imagine – with a serious erosion, if not an end to civil liberties and open society as we know it.
There is no doubting the terrorists' intent and no reason to underestimate their capability – learning to master a chemical agent is no more unlikely than learning to fly, penetrating our borders and airline security, simultaneously gaining control of passenger airliners, and crashing them into buildings. As twenty or fifty years go by, the advances and diffusion of knowledge in the nuclear, chemical, and biological sciences will only increase, further raising the risks.
Preventing catastrophic terrorism should be the number one security objective of the 21st century – for all governments. We are better prepared than we were on 9/11, but much work remains. Steady advances in technology should galvanize our efforts to increase outreach, understanding, and cooperation with the Islamic world, and among all peoples – so science serves as a vehicle for peace and enlightenment, not war and terror.
[End of Article]
___________________________________________________________________________
A weapon of mass destruction (WMD) (Wikipedia)
is a chemical, biological, radiological, nuclear, or any other weapon that can kill or significantly harm many people or cause great damage to artificial structures (e.g., buildings), natural structures (e.g., mountains), or the biosphere.
The scope and usage of the term have evolved and been disputed, often signifying more politically than technically. Originally coined in reference to aerial bombing with chemical explosives during World War II, it later came to refer to large-scale weaponry of warfare-related technologies, such as chemical, biological, radiological, or nuclear warfare.
Early uses of this term:
The first use of the term "weapon of mass destruction" on record is by Cosmo Gordon Lang, Archbishop of Canterbury, in 1937 in reference to the aerial bombing of Guernica, Spain:
"Who can think at this present time without a sickening of the heart of the appalling slaughter, the suffering, the manifold misery brought by war to Spain and to China? Who can think without horror of what another widespread war would mean, waged as it would be with all the new weapons of mass destruction?"
At the time, nuclear weapons had not been developed. Japan conducted research on biological weapons (see Unit 731), and chemical weapons had seen wide battlefield use in World War I. Their use was outlawed by the Geneva Protocol of 1925. Italy used mustard agent against civilians and soldiers in Ethiopia in 1935–36.
Following the atomic bombings of Hiroshima and Nagasaki that ended World War II and during the Cold War, the term came to refer more to non-conventional weapons. The application of the term to specifically nuclear and radiological weapons is traced by William Safire to the Russian phrase "Оружие массового поражения" – oruzhiye massovogo porazheniya (weapon of mass destruction).
William Safire credits James Goodby (of the Brookings Institution) with tracing what he considers the earliest known English-language use soon after the nuclear bombing of Hiroshima and Nagasaki (although it is not quite verbatim): a communique from a 15 November 1945 meeting of Harry Truman, Clement Attlee and Mackenzie King (probably drafted by Vannevar Bush, as Bush claimed in 1970) referred to "weapons adaptable to mass destruction."
Safire says Bernard Baruch used that exact phrase in 1946 (in a speech at the United Nations probably written by Herbert Bayard Swope). The phrase found its way into the very first resolution the United Nations General Assembly adopted in January 1946 in London, which used the wording "the elimination from national armaments of atomic weapons and of all other weapons adaptable to mass destruction."
The resolution also created the Atomic Energy Commission (predecessor of the International Atomic Energy Agency (IAEA)).
An exact use of this term was given in a lecture titled "Atomic Energy as a Contemporary Problem" by J. Robert Oppenheimer. He delivered the lecture to the Foreign Service and the State Department, on 17 September 1947:
"It is a very far-reaching control which would eliminate the rivalry between nations in this field, which would prevent the surreptitious arming of one nation against another, which would provide some cushion of time before atomic attack, and presumably therefore before any attack with weapons of mass destruction, and which would go a long way toward removing atomic energy at least as a source of conflict between the powers."
The term was also used in the introduction to the hugely influential U.S. government document known as NSC 68 written in 1950.
During a speech at Rice University on 12 September 1962, President John F. Kennedy spoke of not filling space "with weapons of mass destruction, but with instruments of knowledge and understanding."
The following month, during a televised presentation about the Cuban Missile Crisis on 22 October 1962, Kennedy made reference to "offensive weapons of sudden mass destruction."
An early use of the exact phrase in an international treaty is in the Outer Space Treaty of 1967, but the treaty provides no definition of the phrase, and the treaty also categorically prohibits the stationing of "weapons" and the testing of "any type of weapon" in outer space, in addition to its specific prohibition against placing in orbit, or installing on celestial bodies, "any objects carrying nuclear weapons or any other kinds of weapons of mass destruction."
Evolution of its use:
During the Cold War, the term "weapons of mass destruction" was primarily a reference to nuclear weapons. At the time, the euphemism "strategic weapons" was used in the West to refer to the American nuclear arsenal. However, there is no precise definition of the "strategic" category in terms of either the range or the yield of the nuclear weapon.
Subsequent to Operation Opera, the destruction of a pre-operational nuclear reactor inside Iraq by the Israeli Air Force in 1981, the Israeli prime minister, Menachem Begin, countered criticism by saying that "on no account shall we permit an enemy to develop weapons of mass destruction against the people of Israel." This policy of pre-emptive action against real or perceived weapons of mass destruction became known as the Begin Doctrine.
The term "weapons of mass destruction" continued to see periodic use, usually in the context of nuclear arms control; Ronald Reagan used it during the 1986 Reykjavík Summit, when referring to the 1967 Outer Space Treaty. Reagan's successor, George H. W. Bush, used the term in a 1989 speech to the United Nations, primarily in reference to chemical arms.
The end of the Cold War reduced U.S. reliance on nuclear weapons as a deterrent, causing it to shift its focus to disarmament. With the 1990 invasion of Kuwait and 1991 Gulf War, Iraq's nuclear, biological, and chemical weapons programs became a particular concern of the first Bush Administration. Following the war, Bill Clinton and other western politicians and media continued to use the term, usually in reference to ongoing attempts to dismantle Iraq's weapons programs.
After the 11 September 2001 attacks and the 2001 anthrax attacks in the United States, an increased fear of nonconventional weapons and asymmetric warfare took hold in many countries.
The fear reached a crescendo with the 2002 Iraq disarmament crisis and the alleged existence of weapons of mass destruction in Iraq that became the primary justification for the 2003 invasion of Iraq; however, American forces found none in Iraq. They found old stockpiles of chemical munitions including sarin and mustard agents, but all were considered to be unusable because of corrosion or degradation.
Iraq, however, declared a chemical weapons stockpile in 2009 which U.N. personnel had secured after the 1991 Gulf War. The stockpile contained mainly chemical precursors, but some munitions remained usable.
Because of its prolific use and (worldwide) public profile during this period, the American Dialect Society voted "weapons of mass destruction" (and its abbreviation, "WMD") the word of the year in 2002, and in 2003 Lake Superior State University added WMD to its list of terms banished for "Mis-use, Over-use and General Uselessness" (and "as a card that trumps all forms of aggression").
In its criminal complaint against the main suspect of the Boston Marathon bombing of 15 April 2013, the FBI refers to a pressure-cooker improvised bomb as a "weapon of mass destruction."
There have been calls to classify at least some classes of cyber weapons as WMD, in particular those aimed to bring about large-scale (physical) destruction, such as by targeting critical infrastructure. However, some scholars have objected to classifying cyber weapons as WMD on the grounds that they "cannot [currently] directly injure or kill human beings as efficiently as guns or bombs" or clearly "meet the legal and historical definitions" of WMD.
Definitions of the term:
United States: Strategic definition:
The most widely used definition of "weapons of mass destruction" is that of nuclear, biological, or chemical weapons (NBC) although there is no treaty or customary international law that contains an authoritative definition. Instead, international law has been used with respect to the specific categories of weapons within WMD, and not to WMD as a whole.
While nuclear, chemical and biological weapons are regarded as the three major types of WMDs, some analysts have argued that radiological materials as well as missile technology and delivery systems such as aircraft and ballistic missiles could be labeled as WMDs as well.
However, there is an argument that nuclear and biological weapons do not belong in the same category as chemical and "dirty bomb" radiological weapons, which have limited destructive potential (and close to none, as far as property is concerned), whereas nuclear and biological weapons have the unique ability to kill large numbers of people with very small amounts of material, and thus could be said to belong in a class by themselves.
The NBC definition has also been used in official U.S. documents, by the U.S. President, the U.S. Central Intelligence Agency, the U.S. Department of Defense, and the U.S. Government Accountability Office.
Other documents expand the definition of WMD to also include radiological or conventional weapons. The U.S. military refers to WMD as:
"Chemical, biological, radiological, or nuclear weapons capable of a high order of destruction or causing mass casualties and exclude the means of transporting or propelling the weapon where such means is a separable and divisible part from the weapon. Also called WMD."
This may also refer to nuclear ICBMs (intercontinental ballistic missiles).
The significance of the words separable and divisible part of the weapon is that missiles such as the Pershing II and the SCUD are considered weapons of mass destruction, while aircraft capable of carrying bombloads are not.
In 2004, the United Kingdom's Butler Review recognized the "considerable and long-standing academic debate about the proper interpretation of the phrase 'weapons of mass destruction'".
The committee set out to avoid the general term but when using it, employed the definition of United Nations Security Council Resolution 687, which defined the systems which Iraq was required to abandon: "Nuclear weapons or nuclear-weapons-usable material or any sub-systems or components or any research, development, support or manufacturing facilities relating to [nuclear weapons]."
Chemical weapons expert Gert G. Harigel considers only nuclear weapons true weapons of mass destruction, because "only nuclear weapons are completely indiscriminate by their explosive power, heat radiation and radioactivity, and only they should therefore be called a weapon of mass destruction". He prefers to call chemical and biological weapons "weapons of terror" when aimed against civilians and "weapons of intimidation" for soldiers.
Testimony of one such soldier expresses the same viewpoint. For a period of several months in the winter of 2002–2003, U.S. Deputy Secretary of Defense Paul Wolfowitz frequently used the term "weapons of mass terror", apparently also recognizing the distinction between the psychological and the physical effects of many things currently falling into the WMD category.
Gustavo Bell Lemus, the Vice President of Colombia, at 9 July 2001 United Nations Conference on the Illicit Trade in Small Arms and Light Weapons in All Its Aspects, quoted the Millennium Report of the UN Secretary-General to the General Assembly, in which Kofi Annan said that small arms could be described as WMD because the fatalities they cause "dwarf that of all other weapons systems – and in most years greatly exceed the toll of the atomic bombs that devastated Hiroshima and Nagasaki".
An additional condition often implicitly applied to WMD is that the use of the weapons must be strategic. In other words, they would be designed to "have consequences far outweighing the size and effectiveness of the weapons themselves". The strategic nature of WMD also defines their function in the military doctrine of total war as targeting the means a country would use to support and supply its war effort, specifically its population, industry, and natural resources.
Within U.S. civil defense organizations, the category is now Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE), which defines WMD as:
Military definition:
For the general purposes of national defense, the U.S. Code defines a weapon of mass destruction as:
For the purposes of the prevention of weapons proliferation, the U.S. Code defines weapons of mass destruction as "chemical, biological, and nuclear weapons, and chemical, biological, and nuclear materials used in the manufacture of such weapons".
Criminal (civilian) definition:
For the purposes of U.S. criminal law concerning terrorism, weapons of mass destruction are defined as:
The Federal Bureau of Investigation's definition is similar to that presented above from the terrorism statute:
Indictments and convictions for possession and use of WMD such as truck bombs, pipe bombs, shoe bombs, and cactus needles coated with a biological toxin have been obtained under 18 USC 2332a.
As defined by 18 USC §2332 (a), a Weapon of Mass Destruction is:
Under the same statute, anyone who conspires, attempts, threatens, or uses a weapon of mass destruction may be imprisoned for any term of years or for life and, if death results, may be punished by death or by imprisonment for any term of years or for life. Offenders can also be fined up to $250,000.
The Washington Post reported on 30 March 2006: "Jurors asked the judge in the death penalty trial of Zacarias Moussaoui today to define the term 'weapons of mass destruction' and were told it includes airplanes used as missiles". Moussaoui was indicted and tried for conspiracy to both destroy aircraft and use weapons of mass destruction, among others.
The surviving Boston Marathon bombing perpetrator, Dzhokhar Tsarnaev, was charged in June 2013 with the federal offense of "use of a weapon of mass destruction" after he and his brother Tamerlan Tsarnaev allegedly placed crude shrapnel bombs, made from pressure cookers packed with ball bearings and nails, near the finish line of the Boston Marathon.
He was convicted in April 2015. The bombing resulted in three deaths and at least 264 injuries.
International law:
The development and use of WMD is governed by several international conventions and treaties (as follows):
Even so, of the three, nuclear weapons are by far the worst. The following Wilson Center article elaborates on this]:
Evaluating the Threat: Chemical, Biological, and Nuclear
Technology raises the stakes of the terrorist threat. Former Congressman Lee Hamilton outlines the danger from chemical, biological and nuclear weapons.
Why are we so worried about terrorism? After all, terrorism has been part of human experience for centuries. Sometimes – when assessing endless airport lines, billions of dollars in homeland security expenditures, or constant media analysis – you cannot help but wonder what all the fuss is about.
A short answer is technology. Yes, people have been killing innocent civilians to spread terror as long as people have wielded weapons. But never before have such horrific weapons been so widely available to people who can travel and communicate so freely – a degree of lethality previously reserved for nation-states. That is why every American should be familiar with the threat of chemical, biological and nuclear terrorism.
Of the three, chemical weapons are the easiest to obtain and use, though they are also the least deadly. An individual with the right know-how could purchase components for a chemical like the nerve gas sarin on the open market, or a terrorist group could buy or steal a weapon from one of the dozen or so nations that has stockpiles. Released into an enclosed space like a subway or arena, a properly weaponized chemical agent could kill hundreds, even thousands. A terrorist could achieve more deadly results by blowing up part of a chemical plant, releasing a cloud of toxic agents.
Biological weapons have the potential to cause greater harm. We have seen the capacity of a terrorist to spread death and disease– while evading capture – with anthrax. Future attacks could be far more deadly if they involve more infectious biological agents like smallpox and plague. Most experts fear that a biological attack of some sort is likely, in part because information on creating bio-weapons, and the equipment to produce them, are widely available.
Biological attacks are extremely hard to defend against. Even with a recent government subsidy, pharmaceutical companies lack incentive to develop vaccines for which there is little profit. Public health experts worry about our ability to detect an attack before it has spread beyond containment, and fear that our hospitals and local health agencies would be overrun by people seeking vaccines or antidotes.
Meanwhile, terrorism analysts are left guessing at a seemingly infinite possibility of methods – from contamination of the food supply to delivery via a crop-duster – while revolutionary advances in biotechnology make it hard to anticipate what the threat will be in a decade.
The most catastrophic threat is an attack with a nuclear weapon. Though nuclear terrorism is also the most unlikely of the three, the sheer magnitude of the danger causes deep concern. It is extremely difficult to build, transport and use a nuclear device, but the number of countries with a nuclear weapon has been growing, and last year's revelation of a global black market of nuclear secrets shows how the most deadly of technologies does not belong solely to nation-states.
It should be stressed that nuclear, biological and chemical materials are all extremely difficult to acquire, build, and turn into workable weapons. Movies dwell on scenarios where whole cities are wiped out, but the more likely scenarios involve less horrific destruction: a "dirty bomb" emitting radiation rather than a nuclear explosion; an anthrax-style biological attack rather than a smallpox epidemic; the release of a modest amount of dangerous chemicals rather than a mass-casualty scenario.
Still, an attack with one of these weapons that kills only a handful of people – less than a fire or major car accident – could cause tens of billions of dollars in damage, cleanup and economic fallout, not to mention widespread panic.
Nor should we take comfort in the unlikelihood of worst-case scenarios. If an attack does cause mass destruction, society might change in ways that are hard to imagine – with a serious erosion, if not an end to civil liberties and open society as we know it.
There is no doubting the terrorists' intent and no reason to underestimate their capability – learning to master a chemical agent is no more unlikely than learning to fly, penetrating our borders and airline security, simultaneously gaining control of passenger airliners, and crashing them into buildings. As twenty or fifty years go by, the advances and diffusion of knowledge in the nuclear, chemical, and biological sciences will only increase, further raising the risks.
Preventing catastrophic terrorism should be the number one security objective of the 21st century – for all governments. We are better prepared than we were on 9/11, but much work remains. Steady advances in technology should galvanize our efforts to increase outreach, understanding, and cooperation with the Islamic world, and among all peoples – so science serves as a vehicle for peace and enlightenment, not war and terror.
[End of Article]
___________________________________________________________________________
A weapon of mass destruction (WMD) (Wikipedia)
is a chemical, biological, radiological, nuclear, or any other weapon that can kill or significantly harm many people or cause great damage to artificial structures (e.g., buildings), natural structures (e.g., mountains), or the biosphere.
The scope and usage of the term have evolved and been disputed, its significance often being more political than technical. Originally coined in reference to aerial bombing with chemical explosives during World War II, it later came to refer to the large-scale weaponry of other warfare technologies, such as chemical, biological, radiological, or nuclear warfare.
Early uses of this term:
The first use of the term "weapon of mass destruction" on record is by Cosmo Gordon Lang, Archbishop of Canterbury, in 1937 in reference to the aerial bombing of Guernica, Spain:
"Who can think at this present time without a sickening of the heart of the appalling slaughter, the suffering, the manifold misery brought by war to Spain and to China? Who can think without horror of what another widespread war would mean, waged as it would be with all the new weapons of mass destruction?"
At the time, nuclear weapons had not been developed. Japan conducted research on biological weapons (see Unit 731), and chemical weapons had seen wide battlefield use in World War I. Their use was outlawed by the Geneva Protocol of 1925. Italy used mustard agent against civilians and soldiers in Ethiopia in 1935–36.
Following the atomic bombings of Hiroshima and Nagasaki that ended World War II and during the Cold War, the term came to refer more to non-conventional weapons. The application of the term to specifically nuclear and radiological weapons is traced by William Safire to the Russian phrase "Оружие массового поражения" – oruzhiye massovogo porazheniya (weapon of mass destruction).
William Safire credits James Goodby (of the Brookings Institution) with tracing what he considers the earliest known English-language use soon after the nuclear bombing of Hiroshima and Nagasaki (although it is not quite verbatim): a communique from a 15 November 1945, meeting of Harry Truman, Clement Attlee and Mackenzie King (probably drafted by Vannevar Bush, as Bush claimed in 1970) referred to "weapons adaptable to mass destruction."
Safire says Bernard Baruch used that exact phrase in 1946 (in a speech at the United Nations probably written by Herbert Bayard Swope). The phrase found its way into the very first resolution the United Nations General Assembly adopted in January 1946 in London, which used the wording "the elimination from national armaments of atomic weapons and of all other weapons adaptable to mass destruction."
The resolution also created the Atomic Energy Commission (predecessor of the International Atomic Energy Agency (IAEA)).
An exact use of this term was given in a lecture titled "Atomic Energy as a Contemporary Problem" by J. Robert Oppenheimer. He delivered the lecture to the Foreign Service and the State Department, on 17 September 1947:
"It is a very far-reaching control which would eliminate the rivalry between nations in this field, which would prevent the surreptitious arming of one nation against another, which would provide some cushion of time before atomic attack, and presumably therefore before any attack with weapons of mass destruction, and which would go a long way toward removing atomic energy at least as a source of conflict between the powers."
The term was also used in the introduction to the hugely influential U.S. government document known as NSC 68 written in 1950.
During a speech at Rice University on 12 September 1962, President John F. Kennedy spoke of not filling space "with weapons of mass destruction, but with instruments of knowledge and understanding."
The following month, during a televised presentation about the Cuban Missile Crisis on 22 October 1962, Kennedy made reference to "offensive weapons of sudden mass destruction."
An early use of the exact phrase in an international treaty is in the Outer Space Treaty of 1967, but the treaty provides no definition of the phrase, and the treaty also categorically prohibits the stationing of "weapons" and the testing of "any type of weapon" in outer space, in addition to its specific prohibition against placing in orbit, or installing on celestial bodies, "any objects carrying nuclear weapons or any other kinds of weapons of mass destruction."
Evolution of its use:
During the Cold War, the term "weapons of mass destruction" was primarily a reference to nuclear weapons. At the time, in the West the euphemism "strategic weapons" was used to refer to the American nuclear arsenal. However, there is no precise definition of the "strategic" category, whether in terms of range or of the yield of the nuclear weapon.
Subsequent to Operation Opera, the destruction of a pre-operational nuclear reactor inside Iraq by the Israeli Air Force in 1981, the Israeli prime minister, Menachem Begin, countered criticism by saying that "on no account shall we permit an enemy to develop weapons of mass destruction against the people of Israel." This policy of pre-emptive action against real or perceived weapons of mass destruction became known as the Begin Doctrine.
The term "weapons of mass destruction" continued to see periodic use, usually in the context of nuclear arms control; Ronald Reagan used it during the 1986 Reykjavík Summit, when referring to the 1967 Outer Space Treaty. Reagan's successor, George H. W. Bush, used the term in a 1989 speech to the United Nations, primarily in reference to chemical arms.
The end of the Cold War reduced U.S. reliance on nuclear weapons as a deterrent, causing it to shift its focus to disarmament. With the 1990 invasion of Kuwait and 1991 Gulf War, Iraq's nuclear, biological, and chemical weapons programs became a particular concern of the first Bush Administration. Following the war, Bill Clinton and other western politicians and media continued to use the term, usually in reference to ongoing attempts to dismantle Iraq's weapons programs.
After the 11 September 2001 attacks and the 2001 anthrax attacks in the United States, an increased fear of nonconventional weapons and asymmetric warfare took hold in many countries.
The fear reached a crescendo with the 2002 Iraq disarmament crisis and the alleged existence of weapons of mass destruction in Iraq that became the primary justification for the 2003 invasion of Iraq; however, American forces found none in Iraq. They found old stockpiles of chemical munitions including sarin and mustard agents, but all were considered to be unusable because of corrosion or degradation.
Iraq, however, declared a chemical weapons stockpile in 2009 which U.N. personnel had secured after the 1991 Gulf War. The stockpile contained mainly chemical precursors, but some munitions remained usable.
Because of its prolific use and (worldwide) public profile during this period, the American Dialect Society voted "weapons of mass destruction" (and its abbreviation, "WMD") the word of the year in 2002, and in 2003 Lake Superior State University added WMD to its list of terms banished for "Mis-use, Over-use and General Uselessness" (and "as a card that trumps all forms of aggression").
In its criminal complaint against the main suspect of the Boston Marathon bombing of 15 April 2013, the FBI refers to a pressure-cooker improvised bomb as a "weapon of mass destruction."
There have been calls to classify at least some classes of cyber weapons as WMD, in particular those aimed to bring about large-scale (physical) destruction, such as by targeting critical infrastructure. However, some scholars have objected to classifying cyber weapons as WMD on the grounds that they "cannot [currently] directly injure or kill human beings as efficiently as guns or bombs" or clearly "meet the legal and historical definitions" of WMD.
Definitions of the term:
United States: Strategic definition:
The most widely used definition of "weapons of mass destruction" is that of nuclear, biological, or chemical weapons (NBC) although there is no treaty or customary international law that contains an authoritative definition. Instead, international law has been used with respect to the specific categories of weapons within WMD, and not to WMD as a whole.
While nuclear, chemical and biological weapons are regarded as the three major types of WMDs, some analysts have argued that radiological materials as well as missile technology and delivery systems such as aircraft and ballistic missiles could be labeled as WMDs as well.
However, there is an argument that nuclear and biological weapons do not belong in the same category as chemical and "dirty bomb" radiological weapons, which have limited destructive potential (and close to none, as far as property is concerned), whereas nuclear and biological weapons have the unique ability to kill large numbers of people with very small amounts of material, and thus could be said to belong in a class by themselves.
The NBC definition has also been used in official U.S. documents, by the U.S. President, the U.S. Central Intelligence Agency, the U.S. Department of Defense, and the U.S. Government Accountability Office.
Other documents expand the definition of WMD to also include radiological or conventional weapons. The U.S. military refers to WMD as:
"Chemical, biological, radiological, or nuclear weapons capable of a high order of destruction or causing mass casualties and exclude the means of transporting or propelling the weapon where such means is a separable and divisible part from the weapon. Also called WMD."
This may also refer to nuclear ICBMs (intercontinental ballistic missiles).
The significance of the words separable and divisible part of the weapon is that missiles such as the Pershing II and the SCUD are considered weapons of mass destruction, while aircraft capable of carrying bombloads are not.
In 2004, the United Kingdom's Butler Review recognized the "considerable and long-standing academic debate about the proper interpretation of the phrase 'weapons of mass destruction'".
The committee set out to avoid the general term but when using it, employed the definition of United Nations Security Council Resolution 687, which defined the systems which Iraq was required to abandon: "Nuclear weapons or nuclear-weapons-usable material or any sub-systems or components or any research, development, support or manufacturing facilities relating to [nuclear weapons].
- Chemical and biological weapons and all stocks of agents and all related subsystems and components and all research, development, support and manufacturing facilities.
- Ballistic missiles with a range greater than 150 kilometres and related major parts, and repair and production facilities."
Chemical weapons expert Gert G. Harigel considers only nuclear weapons true weapons of mass destruction, because "only nuclear weapons are completely indiscriminate by their explosive power, heat radiation and radioactivity, and only they should therefore be called a weapon of mass destruction". He prefers to call chemical and biological weapons "weapons of terror" when aimed against civilians and "weapons of intimidation" for soldiers.
Testimony of one such soldier expresses the same viewpoint. For a period of several months in the winter of 2002–2003, U.S. Deputy Secretary of Defense Paul Wolfowitz frequently used the term "weapons of mass terror", apparently also recognizing the distinction between the psychological and the physical effects of many things currently falling into the WMD category.
Gustavo Bell Lemus, the Vice President of Colombia, at 9 July 2001 United Nations Conference on the Illicit Trade in Small Arms and Light Weapons in All Its Aspects, quoted the Millennium Report of the UN Secretary-General to the General Assembly, in which Kofi Annan said that small arms could be described as WMD because the fatalities they cause "dwarf that of all other weapons systems – and in most years greatly exceed the toll of the atomic bombs that devastated Hiroshima and Nagasaki".
An additional condition often implicitly applied to WMD is that the use of the weapons must be strategic. In other words, they would be designed to "have consequences far outweighing the size and effectiveness of the weapons themselves". The strategic nature of WMD also defines their function in the military doctrine of total war as targeting the means a country would use to support and supply its war effort, specifically its population, industry, and natural resources.
Within U.S. civil defense organizations, the category is now Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE), which defines WMD as:
- Any explosive, incendiary, poison gas, bomb, grenade, or rocket having a propellant charge of more than four ounces [113 g], missile having an explosive or incendiary charge of more than one-quarter ounce [7 g], or mine or device similar to the above.
- Poison gas.
- Any weapon involving a disease organism.
- Any weapon that is designed to release radiation at a level dangerous to human life.
Military definition:
For the general purposes of national defense, the U.S. Code defines a weapon of mass destruction as:
- any weapon or device that is intended, or has the capability, to cause death or serious bodily injury to a significant number of people through the release, dissemination, or impact of:
- toxic or poisonous chemicals or their precursors
- a disease organism
- radiation or radioactivity
For the purposes of the prevention of weapons proliferation, the U.S. Code defines weapons of mass destruction as "chemical, biological, and nuclear weapons, and chemical, biological, and nuclear materials used in the manufacture of such weapons".
Criminal (civilian) definition:
For the purposes of U.S. criminal law concerning terrorism, weapons of mass destruction are defined as:
- any "destructive device" defined as any explosive, incendiary, or poison gas – bomb, grenade, rocket having a propellant charge of more than four ounces, missile having an explosive or incendiary charge of more than one-quarter ounce, mine, or device similar to any of the devices described in the preceding clauses
- any weapon that is designed or intended to cause death or serious bodily injury through the release, dissemination, or impact of toxic or poisonous chemicals, or their precursors
- any weapon involving a biological agent, toxin, or vector
- any weapon that is designed to release radiation or radioactivity at a level dangerous to human life
The Federal Bureau of Investigation's definition is similar to that presented above from the terrorism statute:
- any "destructive device" as defined in Title 18 USC Section 921: any explosive, incendiary, or poison gas – bomb, grenade, rocket having a propellant charge of more than four ounces, missile having an explosive or incendiary charge of more than one-quarter ounce, mine, or device similar to any of the devices described in the preceding clauses
- any weapon designed or intended to cause death or serious bodily injury through the release, dissemination, or impact of toxic or poisonous chemicals or their precursors
- any weapon involving a disease organism
- any weapon designed to release radiation or radioactivity at a level dangerous to human life
- any device or weapon designed or intended to cause death or serious bodily injury by causing a malfunction of or destruction of an aircraft or other vehicle that carries humans or of an aircraft or other vehicle whose malfunction or destruction may cause said aircraft or other vehicle to cause death or serious bodily injury to humans who may be within range of the vector in its course of travel or the travel of its debris.
Indictments and convictions for possession and use of WMD such as truck bombs, pipe bombs, shoe bombs, and cactus needles coated with a biological toxin have been obtained under 18 USC 2332a.
As defined by 18 USC §2332a, a weapon of mass destruction is:
- (A) any destructive device as defined in section 921 of the title;
- (B) any weapon that is designed or intended to cause death or serious bodily injury through the release, dissemination, or impact of toxic or poisonous chemicals, or their precursors;
- (C) any weapon involving a biological agent, toxin, or vector (as those terms are defined in section 178 of this title); or
- (D) any weapon that is designed to release radiation or radioactivity at a level dangerous to human life;
Under the same statute, anyone who conspires, attempts, threatens, or uses a weapon of mass destruction may be imprisoned for any term of years or for life and, if death results, may be punished by death or by imprisonment for any term of years or for life. A fine of up to $250,000 may also be imposed.
The Washington Post reported on 30 March 2006: "Jurors asked the judge in the death penalty trial of Zacarias Moussaoui today to define the term 'weapons of mass destruction' and were told it includes airplanes used as missiles". Moussaoui was indicted and tried for conspiracy to both destroy aircraft and use weapons of mass destruction, among others.
The surviving Boston Marathon bombing perpetrator, Dzhokhar Tsarnaev, was charged in June 2013 with the federal offense of "use of a weapon of mass destruction" after he and his brother Tamerlan Tsarnaev allegedly placed crude shrapnel bombs, made from pressure cookers packed with ball bearings and nails, near the finish line of the Boston Marathon.
He was convicted in April 2015. The bombing resulted in three deaths and at least 264 injuries.
International law:
See also:
The development and use of WMD is governed by several international conventions and treaties (as follows):
Use, possession, and access:
Nuclear weapons:
Main articles:
The only country to have used a nuclear weapon in war is the United States, which dropped two atomic bombs on the Japanese cities of Hiroshima and Nagasaki during World War II.
There are eight countries that have declared they possess nuclear weapons and are known to have tested a nuclear weapon, only five of which are members of the NPT. The eight are:
- China,
- France,
- India,
- North Korea,
- Pakistan,
- Russia,
- the United Kingdom,
- and the United States.
Israel is considered by most analysts to have nuclear weapons numbering in the low hundreds as well, but maintains an official policy of nuclear ambiguity, neither denying nor confirming its nuclear status.
South Africa developed a small nuclear arsenal in the 1980s but dismantled it in the early 1990s, making it the only country to have fully given up an independently developed nuclear weapons arsenal.
Belarus, Kazakhstan, and Ukraine inherited stockpiles of nuclear arms following the break-up of the Soviet Union, but relinquished them to the Russian Federation.
Countries where nuclear weapons are deployed through nuclear sharing agreements include Belgium, Germany, Italy, the Netherlands, and Turkey.
Biological weapons:
Main articles:
The history of biological warfare goes back at least to the Mongol siege of Caffa in 1346 and possibly much farther back to antiquity. However, only by the turn of the 20th century did advances in microbiology allow for the large-scale weaponization of pathogens.
At least nine states have operated offensive biological weapons programs during the 20th century, including:
- Canada (1946–1956),
- France (1921–1972),
- Iraq (1985–1990s),
- Japan (1930s–1945),
- Rhodesia,
- South Africa (1981–1993),
- the Soviet Union (1920s–1992),
- the United Kingdom (1934–1956),
- and the United States (1943–1969).
The Japanese biological weapons program, which was run by the secret Imperial Japanese Army Unit 731 during the Sino-Japanese War (1937–1945), became infamous for conducting often fatal human experiments on prisoners and producing biological weapons for combat use.
The Soviet Union covertly operated the world's largest, longest, and most sophisticated biological weapons program, in violation of its obligations under international law.
International restrictions on biological warfare began with the 1925 Geneva Protocol, which prohibits the use but not the possession or development of biological and chemical weapons. Upon ratification of the Geneva Protocol, several countries made reservations regarding its applicability and use in retaliation. Due to these reservations, it was in practice a "no-first-use" agreement only.
The 1972 Biological Weapons Convention (BWC) supplements the Geneva Protocol by prohibiting the development, production, acquisition, transfer, stockpiling, and use of biological weapons. Having entered into force on 26 March 1975, the BWC was the first multilateral disarmament treaty to ban the production of an entire category of weapons of mass destruction. As of March 2021, 183 states have become party to the treaty.
Chemical weapons:
Main article: Chemical warfare
Chemical weapons have been used around the world by various civilizations since ancient times. In the industrial era, they were used extensively by both sides during World War I, and by the Axis powers during World War II (both in battle and in extermination camp gas chambers), though Allied powers also stockpiled them. Countries in Western Europe renounced the use of such weapons.
As of 2018, a handful of countries have known inventories, and many are in the process of being safely destroyed under the Chemical Weapons Convention. Nonetheless, proliferation and use in war zones remains an active concern, most recently the use of chemical weapons in the Syrian Civil War.
Ethics and international legal status:
Some commentators classify some or all the uses of nuclear, chemical, or biological weapons during wartime as a war crime (or crime against humanity if widespread) because they kill civilians (who are protected by the laws of war) indiscriminately or are specifically prohibited by international treaties (which have become more comprehensive over time).
Proponents of use say that specific uses of such weapons have been necessary for defense or to avoid more deaths in a protracted war. The tactic of terror bombing from aircraft, and generally targeting cities with area bombardment or saturation carpet bombing has also been criticized, defended, and prohibited by treaty in the same way; the destructive effect of conventional saturation bombing is similar to that of a nuclear weapon.
United States politics:
Due to the potentially indiscriminate effects of WMD, the fear of a WMD attack has shaped political policies and campaigns, fostered social movements, and has been the central theme of many films. Support for different levels of WMD development and control varies nationally and internationally. Yet understanding of the nature of the threats is not high, in part because of imprecise usage of the term by politicians and the media.
Fear of WMD, or of threats diminished by the possession of WMD, has long been used to catalyze public support for various WMD policies. Such appeals have included the mobilization of pro- and anti-WMD campaigners alike, and the generation of popular political support.
The term WMD may be used as a powerful buzzword or to generate a culture of fear. It is also used ambiguously, particularly by not distinguishing among the different types of WMD.
A television commercial called Daisy, promoting Democrat Lyndon Johnson's 1964 presidential candidacy, invoked the fear of a nuclear war and was an element in Johnson's subsequent election.
Later, United States President George W. Bush used the threat of potential WMD in Iraq as justification for the 2003 invasion of Iraq. Broad reference to Iraqi WMD in general was seen as an element of President Bush's arguments.
The claim that Iraq possessed weapons of mass destruction (WMD) was a major factor that led to the invasion of Iraq in 2003 by Coalition forces. Over 500 munitions containing mustard agent and sarin have been discovered throughout Iraq since 2003; they were made in the 1980s and are no longer usable as originally intended due to corrosion.
The American Heritage Dictionary defines a weapon of mass destruction as: "a weapon that can cause widespread destruction or kill large numbers of people, especially a nuclear, chemical, or biological weapon."
In other words, it does not have to be nuclear, biological or chemical (NBC). For example, Dzhokhar Tsarnaev, one of the perpetrators of the Boston Marathon bombing, was charged under United States law 18 U.S.C. § 2332a for using a weapon of mass destruction, in this case a pressure-cooker bomb. In other words, it was a weapon that caused large-scale death and destruction without being an NBC weapon.
Media coverage:
In March 2004, the Center for International and Security Studies at Maryland (CISSM) released a report examining the media's coverage of WMD issues during three separate periods: nuclear weapons tests by India and Pakistan in May 1998; the U.S. announcement of evidence of a North Korean nuclear weapons program in October 2002; and revelations about Iran's nuclear program in May 2003.
The CISSM report argues that poor coverage resulted less from political bias among the media than from tired journalistic conventions. The report's major findings were that:
- Most media outlets represented WMD as a monolithic menace, failing to adequately distinguish between weapons programs and actual weapons or to address the real differences among chemical, biological, nuclear, and radiological weapons.
- Most journalists accepted the Bush administration's formulation of the "War on Terror" as a campaign against WMD, in contrast to coverage during the Clinton era, when many journalists made careful distinctions between acts of terrorism and the acquisition and use of WMD.
- Many stories stenographically reported the incumbent administration's perspective on WMD, giving too little critical examination of the way officials framed the events, issues, threats, and policy options.
- Too few stories proffered alternative perspectives to the official line, a problem exacerbated by the journalistic prioritizing of breaking-news stories and the "inverted pyramid" style of storytelling.
In a separate study published in 2005, a group of researchers assessed the effects reports and retractions in the media had on people's memory regarding the search for WMD in Iraq during the 2003 Iraq War.
The study focused on populations in two coalition countries (Australia and the United States) and one opposed to the war (Germany). Results showed that U.S. citizens generally did not correct initial misconceptions regarding WMD, even following disconfirmation; Australian and German citizens were more responsive to retractions.
Dependence on the initial source of information led to a substantial minority of Americans exhibiting false memory that WMD were indeed discovered, while they were not. This led to three conclusions:
- The repetition of tentative news stories, even if they are subsequently disconfirmed, can assist in the creation of false memories in a substantial proportion of people.
- Once information is published, its subsequent correction does not alter people's beliefs unless they are suspicious about the motives underlying the events the news stories are about.
- When people ignore corrections, they do so irrespective of how certain they are that the corrections occurred.
A poll conducted between June and September 2003 asked people whether they thought evidence of WMD had been discovered in Iraq since the war ended. They were also asked which media sources they relied upon. Those who obtained their news primarily from Fox News were three times as likely as those who relied on PBS and NPR to believe that evidence of WMD had been discovered in Iraq, and one third more likely than those who primarily watched CBS.
The above is based on a series of polls taken from June–September 2003.
In 2006, Fox News reported the claims of two Republican lawmakers that WMDs had been found in Iraq, based upon unclassified portions of a report by the National Ground Intelligence Center. Quoting from the report, Senator Rick Santorum said "Since 2003, coalition forces have recovered approximately 500 weapons munitions which contain degraded mustard or sarin nerve agent".
According to David Kay, who appeared before the U.S. House Armed Services Committee to discuss these badly corroded munitions, they were leftovers, many years old, improperly stored or destroyed by the Iraqis.
Charles Duelfer agreed, stating on NPR's Talk of the Nation: "When I was running the ISG – the Iraq Survey Group – we had a couple of them that had been turned in to these IEDs, the improvised explosive devices. But they are local hazards. They are not a major, you know, weapon of mass destruction."
Later, WikiLeaks documents showed that munitions of these kinds continued to be found as the occupation of Iraq continued.
Many news agencies, including Fox News, reported the conclusion of the CIA that, based upon the investigation of the Iraq Survey Group, WMD had not been found in Iraq.
Public perceptions:
Awareness and opinions of WMD have varied during the course of their history. Their threat is a source of unease, security, and pride to different people. The anti-WMD movement is embodied most in nuclear disarmament, and led to the formation of the British Campaign for Nuclear Disarmament in 1957.
In order to increase awareness of all kinds of WMD, in 2004 the nuclear physicist and Nobel Peace Prize winner Joseph Rotblat inspired the creation of The WMD Awareness Programme to provide trustworthy and up-to-date information on WMD worldwide.
In 1998, the University of New Mexico's Institute for Public Policy released its third report on U.S. perceptions – including those of the general public, politicians and scientists – of nuclear weapons since the breakup of the Soviet Union. Risks of nuclear conflict, proliferation, and terrorism were seen as substantial.
While maintenance of the U.S. nuclear arsenal was considered above average in importance, there was widespread support for a reduction in the stockpile, and very little support for developing and testing new nuclear weapons.
Also in 1998, but after the UNM survey was conducted, nuclear weapons became an issue in India's March election, in relation to political tensions with neighboring Pakistan. Prior to the election, the Bharatiya Janata Party (BJP) announced it would "declare India a nuclear weapon state" after coming to power.
The BJP won the elections, and on 14 May, three days after India tested nuclear weapons for the second time, a public opinion poll reported that a majority of Indians favored the country's nuclear build-up.
On 15 April 2004, the Program on International Policy Attitudes (PIPA) reported that U.S. citizens showed high levels of concern regarding WMD, and that preventing the spread of nuclear weapons should be "a very important U.S. foreign policy goal", accomplished through multilateral arms control rather than the use of military threats.
A majority also believed the United States should be more forthcoming with its biological research and its Nuclear Non-Proliferation Treaty commitment of nuclear arms reduction.
A Russian opinion poll conducted on 5 August 2005 indicated that half the population believes new nuclear powers have the right to possess nuclear weapons, and 39% believe the Russian stockpile should be reduced, though not fully eliminated.
In popular culture:
Main article: Weapons of mass destruction in popular culture
Weapons of mass destruction and their related impacts have been a mainstay of popular culture since the beginning of the Cold War, as both political commentary and humorous outlet. The actual phrase "weapons of mass destruction" has been used similarly, and as a way to characterise any powerful force or product, since the Iraqi weapons crisis in the lead-up to the Coalition invasion of Iraq in 2003. Science fiction often introduces novel weapons of mass destruction with much greater yields or impacts than anything in reality.
Click on any of the following blue hyperlinks for more about Weapons of Mass Destruction:
- Common hazard symbols
- See also:
- The Bomb (film) – 2015 American documentary film
- Commission on the Prevention of WMD proliferation and terrorism
- List of CBRN warfare forces
- Core (game theory)
- Ethnic bioweapon
- Fallout shelter
- Game theory
- Global Partnership Against the Spread of Weapons and Materials of Mass Destruction
- Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction
- Kinetic bombardment
- Mutual assured destruction
- NBC suit
- New physical principles weapons
- Nuclear terrorism
- Operations Plus WMD
- Orbital bombardment
- Russia and weapons of mass destruction
- Strategic bombing
- United States and weapons of mass destruction
- Weapons of Mass Destruction Commission
- "Home". Center for the Study of Weapons of Mass Destruction.
- Journal dedicated to CBRNE issues
- United Nations: Disarmament at the Wayback Machine (archived 24 June 2005)
- US Department of State at the Wayback Machine (archived 13 March 2007)
- Nuclear Threat Initiative (NTI), non-profit organisation working to prevent catastrophic attacks and accidents with weapons of mass destruction
- Federation of American Scientists (FAS)
- Carnegie Endowment for International Peace
- Avoiding Armageddon, PBS
- FAS assessment of countries that own weapons of mass destruction
- National Counterproliferation Center – Office of the Director of National Intelligence
- HLSWatch.com: Homeland Security Watch policy and current events resource
- Office of the Special Assistant for Chemical Biological Defense and Chemical Demilitarization Programs, Official Department of Defense web site that provides information about the DoD Chemical Biological Defense Program
- Terrorism and the Threat From Weapons of Mass Destruction in the Middle East at the Wayback Machine (archived 29 April 2001)
- "Iranian Chemical Attacks Victims" Archived 22 May 2011 at the Wayback Machine (Payvand News Agency)
- Iran: 'Forgotten Victims' Of Saddam Hussein Era Await Justice
- Comparison of Chinese, Japanese and Vietnamese translations
- Nuclear Age Peace Foundation
- Radius Engineering International Inc., "Nuclear Weapons Effects" (PDF). Archived from the original on 14 December 2010; retrieved 20 December 2010. Tables describing the effects of various nuclear blast sizes; all figures assume 15 mph (24 km/h) winds, and thermal burns represent injuries to an unprotected person.
Nuclear Power Plant
- YouTube Video: MOST Advanced Nuclear Power Facilities on Earth
- YouTube Video: Why I changed my mind about nuclear power | Michael Shellenberger | TEDxBerlin
- YouTube Video: Inside the New Micro Nuclear Reactor that Could Power the Future
A nuclear power plant (NPP) is a thermal power station in which the heat source is a nuclear reactor. As is typical of thermal power stations, heat is used to generate steam that drives a steam turbine connected to a generator that produces electricity. As of August 2023, the International Atomic Energy Agency reported there were 412 nuclear power reactors in operation in 31 countries around the world, and 57 nuclear power reactors under construction.
Nuclear plants are very often used for base load since their operations, maintenance, and fuel costs are at the lower end of the spectrum of costs. However, building a nuclear power plant often spans five to ten years, which can accrue to significant financial costs, depending on how the initial investments are financed.
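To illustrate the financing point above, the following is a minimal Python sketch, using purely hypothetical figures for overnight cost, build time, and cost of capital, of how a multi-year construction period accrues interest on money spent before the plant earns any revenue:

# Minimal sketch with hypothetical numbers: interest accrued during a
# multi-year nuclear construction project ("interest during construction").
overnight_cost = 6_000_000_000    # assumed overnight capital cost in USD
build_years = 8                   # assumed construction period in years
cost_of_capital = 0.07            # assumed annual financing rate

# Assume the overnight cost is spent evenly over the build, and each year's
# spending accrues interest until the plant enters service.
spend_per_year = overnight_cost / build_years
financed_cost = sum(
    spend_per_year * (1 + cost_of_capital) ** (build_years - year)
    for year in range(build_years)
)

print(f"Overnight cost:      ${overnight_cost / 1e9:.1f} billion")
print(f"Cost with financing: ${financed_cost / 1e9:.1f} billion")

With these assumed inputs the financed cost comes out roughly a third higher than the overnight cost, which is why long construction timescales weigh so heavily on nuclear economics.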
Nuclear power plants have a carbon footprint comparable to that of renewable energy such as solar farms and wind farms, and much lower than fossil fuels such as natural gas and coal. Despite some spectacular catastrophes, nuclear power plants are among the safest modes of electricity generation, comparable to solar and wind power plants.
History:
Main article: History of nuclear power
The first time that heat from a nuclear reactor was used to generate electricity was on December 21, 1951, at the Experimental Breeder Reactor I, feeding four light bulbs.
On June 27, 1954, the world's first nuclear power station to generate electricity for a power grid, the Obninsk Nuclear Power Plant, commenced operations in Obninsk, in the Soviet Union.
The world's first full-scale power station, Calder Hall in the United Kingdom, opened on October 17, 1956; it was also designed to produce plutonium. The world's first full-scale power station devoted solely to electricity production, the Shippingport Atomic Power Station in Pennsylvania, United States, was connected to the grid on December 18, 1957.
Basic components:
Fuel handling:
- Radwaste system
- Refueling floor
- Spent fuel pool
- Online refueling machine(s) in some designs such as RBMK and CANDU
- Control rod drives
- Instrumentation such as ion chambers
- Control rods
- Coolant
- Neutron howitzer
- Neutron moderator
- Neutron poison
- Nuclear fuel
- Nuclear reactor core
- Reactor pressure vessel (In most reactors)
- Startup neutron source
- Containment building
- Emergency core cooling system
- Emergency power system
- Essential service water system
- Reactor protection system
- Standby liquid control system
- Boiler feedwater pump
- Steam generators (in PWR reactors, which also have pressurizers)
Picture below: Boiling water reactor (BWR)
Systems:
The conversion to electrical energy takes place indirectly, as in conventional thermal power stations. The fission in a nuclear reactor heats the reactor coolant. The coolant may be water or gas, or even liquid metal, depending on the type of reactor. The reactor coolant then goes to a steam generator and heats water to produce steam.
The pressurized steam is then usually fed to a multi-stage steam turbine. After the steam turbine has expanded and partially condensed the steam, the remaining vapor is condensed in a condenser. The condenser is a heat exchanger which is connected to a secondary side such as a river or a cooling tower. The water is then pumped back into the steam generator and the cycle begins again. The water-steam cycle corresponds to the Rankine cycle.
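As a rough illustration of this steam cycle's energy balance, the short Python sketch below converts an assumed reactor thermal power into electrical output using an assumed overall cycle efficiency of about one third, which is typical of light-water plants; both figures are illustrative, not values for any specific station:

# Rough energy balance for the steam cycle (illustrative numbers).
thermal_power_mwt = 3000      # assumed reactor thermal power, megawatts (thermal)
cycle_efficiency = 0.33       # assumed overall steam-cycle efficiency (~1/3)

electrical_output_mwe = thermal_power_mwt * cycle_efficiency
waste_heat_mwt = thermal_power_mwt - electrical_output_mwe

print(f"Electrical output: about {electrical_output_mwe:.0f} MWe")
print(f"Heat rejected via the condenser: about {waste_heat_mwt:.0f} MWt")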
The nuclear reactor is the heart of the station. In its central part, the reactor's core produces heat due to nuclear fission. With this heat, a coolant is heated as it is pumped through the reactor and thereby removes the energy from the reactor. The heat from nuclear fission is used to raise steam, which runs through turbines, which in turn power the electrical generators.
Nuclear reactors usually rely on uranium to fuel the chain reaction. Uranium is a very heavy metal that is abundant on Earth and is found in sea water as well as most rocks. Naturally occurring uranium is found in two different isotopes: uranium-238 (U-238), accounting for 99.3% and uranium-235 (U-235) accounting for about 0.7%. U-238 has 146 neutrons and U-235 has 143 neutrons.
Different isotopes have different behaviors. For instance, U-235 is fissile, which means that it is easily split and gives off a lot of energy, making it ideal for nuclear energy. U-238, on the other hand, does not have that property despite being the same element. Different isotopes also have different half-lives: U-238 has a longer half-life than U-235, so it decays more slowly, which also means that, gram for gram, U-238 is less radioactive than U-235.
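The link between half-life and radioactivity can be made concrete with a short calculation: the decay constant is ln 2 divided by the half-life, and the activity of a sample is the decay constant times the number of atoms. The Python sketch below compares the specific activity of U-238 and U-235 using their standard reference half-lives (about 4.47 billion and 704 million years, respectively):

import math

# Sketch: why a longer half-life means lower radioactivity per gram.
# Decay constant lambda = ln(2) / half-life; activity = lambda * (number of atoms).
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

def specific_activity_bq_per_g(half_life_years, molar_mass_g):
    lam = math.log(2) / (half_life_years * SECONDS_PER_YEAR)  # decays per second per atom
    atoms_per_gram = AVOGADRO / molar_mass_g
    return lam * atoms_per_gram                               # becquerels per gram

u238 = specific_activity_bq_per_g(4.468e9, 238.05)
u235 = specific_activity_bq_per_g(7.04e8, 235.04)

print(f"U-238: {u238:,.0f} Bq/g")   # roughly 12,000 Bq/g
print(f"U-235: {u235:,.0f} Bq/g")   # roughly 80,000 Bq/g

The longer-lived U-238 comes out several times less radioactive per gram than U-235, as noted above.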
Since nuclear fission creates radioactivity, the reactor core is surrounded by a protective shield. This containment absorbs radiation and prevents radioactive material from being released into the environment. In addition, many reactors are equipped with a dome of concrete to protect the reactor against both internal casualties and external impacts.
The purpose of the steam turbine is to convert the heat contained in steam into mechanical energy. The engine house with the steam turbine is usually structurally separated from the main reactor building. It is aligned so as to prevent debris from the destruction of a turbine in operation from flying towards the reactor.
Pictured below: Pressurized water reactor (PWR)
In the case of a pressurized water reactor, the steam turbine is separated from the nuclear system. To detect a leak in the steam generator and thus the passage of radioactive water at an early stage, an activity meter is mounted to track the outlet steam of the steam generator.
In contrast, boiling water reactors pass radioactive water through the steam turbine, so the turbine is kept as part of the radiologically controlled area of the nuclear power station.
The electric generator converts mechanical power supplied by the turbine into electrical power. Low-pole AC synchronous generators of high rated power are used. A cooling system removes heat from the reactor core and transports it to another area of the station, where the thermal energy can be harnessed to produce electricity or to do other useful work.
Typically the hot coolant is used as a heat source for a boiler, and the pressurized steam from that drives one or more steam turbine driven electrical generators.
In the event of an emergency, safety valves can be used to prevent pipes from bursting or the reactor from exploding. The valves are designed so that they can discharge all of the supplied flow with little increase in pressure. In the case of the BWR, the steam is directed into the suppression chamber, where it condenses; the suppression chambers are connected via heat exchangers to the intermediate cooling circuit.
The main condenser is a large cross-flow shell and tube heat exchanger that takes wet vapor, a mixture of liquid water and steam at saturation conditions, from the turbine-generator exhaust and condenses it back into sub-cooled liquid water so it can be pumped back to the reactor by the condensate and feedwater pumps.
In the main condenser, the wet vapor from the turbine exhaust comes into contact with thousands of tubes that have much colder water flowing through them on the other side. The cooling water typically comes from a natural body of water such as a river or lake.
Palo Verde Nuclear Generating Station, located in the desert about 97 kilometres (60 mi) west of Phoenix, Arizona, is the only nuclear facility that does not use a natural body of water for cooling; instead, it uses treated sewage from the greater Phoenix metropolitan area.
The water coming from the cooling body of water is either pumped back to the water source at a warmer temperature or returns to a cooling tower where it either cools for more uses or evaporates into water vapor that rises out the top of the tower.
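The amount of cooling water this implies follows from a simple heat balance, Q = ṁ·c_p·ΔT. The sketch below uses assumed round numbers rather than data for any actual station.

```python
# Rough estimate of condenser cooling-water flow from a heat balance:
#   Q = m_dot * c_p * delta_T   ->   m_dot = Q / (c_p * delta_T)
# All inputs are assumptions chosen only for illustration.

rejected_heat_w = 2.0e9     # heat rejected in the condenser, ~2000 MWt
cp_water = 4186.0           # specific heat of water, J/(kg*K)
delta_t_k = 10.0            # allowed temperature rise of the cooling water, K

mass_flow_kg_s = rejected_heat_w / (cp_water * delta_t_k)
volume_flow_m3_s = mass_flow_kg_s / 1000.0   # water density ~1000 kg/m^3

print(f"Required cooling-water flow: about {mass_flow_kg_s:,.0f} kg/s "
      f"({volume_flow_m3_s:.0f} cubic metres per second)")
```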
The water level in the steam generator and the nuclear reactor is controlled using the feedwater system. The feedwater pump has the task of taking the water from the condensate system, increasing the pressure and forcing it into either the steam generators (in the case of a pressurized water reactor) or directly into the reactor (for boiling water reactors).
Continuous power supply to the plant is critical to ensure safe operation. Most nuclear stations require at least two distinct sources of offsite power for redundancy. These are usually provided by multiple transformers that are sufficiently separated and can receive power from multiple transmission lines.
In addition, in some nuclear stations, the turbine generator can power the station's loads while the station is online, without requiring external power. This is achieved via station service transformers which tap power from the generator output before they reach the step-up transformer.
Economics:
The economics of nuclear power plants is a controversial subject, and multibillion-dollar investments ride on the choice of an energy source. Nuclear power stations typically have high capital costs but low direct fuel costs, with the costs of fuel extraction, processing, use and spent fuel storage internalized.
Therefore, comparison with other power generation methods is strongly dependent on assumptions about construction timescales and capital financing for nuclear stations. Cost estimates take into account station decommissioning and nuclear waste storage or recycling costs in the United States due to the Price-Anderson Act.
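A minimal levelized-cost sketch illustrates why those financing assumptions dominate the comparison: with high capital cost and low fuel cost, the result swings strongly with the discount rate. Every number below is an assumption chosen for illustration, not an actual plant figure.

```python
# Minimal levelized cost of electricity (LCOE) sketch for a capital-heavy plant.
# LCOE = (annualized capital + fixed O&M) / annual generation + fuel cost per MWh.
# All inputs are illustrative assumptions.

def capital_recovery_factor(rate, years):
    """Convert an up-front capital cost into an equivalent equal annual payment."""
    return rate / (1.0 - (1.0 + rate) ** -years)

capacity_mw = 1000.0
capacity_factor = 0.90
overnight_cost_per_kw = 6000.0   # $/kW, assumed
fixed_om_per_kw_year = 120.0     # $/kW-year, assumed
fuel_cost_per_mwh = 7.0          # $/MWh, assumed low, as for nuclear
lifetime_years = 40

annual_mwh = capacity_mw * capacity_factor * 8760.0
capital_total = overnight_cost_per_kw * capacity_mw * 1000.0
fixed_om_total = fixed_om_per_kw_year * capacity_mw * 1000.0

for discount_rate in (0.03, 0.07, 0.10):
    annual_capital = capital_total * capital_recovery_factor(discount_rate, lifetime_years)
    lcoe = (annual_capital + fixed_om_total) / annual_mwh + fuel_cost_per_mwh
    print(f"Discount rate {discount_rate:.0%}: LCOE ~ ${lcoe:.0f}/MWh")
```

In this sketch, doubling or halving the fuel cost barely moves the result, while a few percentage points of discount rate change it by tens of dollars per megawatt-hour.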
With the prospect that all spent nuclear fuel could potentially be recycled by using future reactors, generation IV reactors are being designed to completely close the nuclear fuel cycle.
However, up to now there has not been any actual bulk recycling of waste from an NPP, and on-site temporary storage is still being used at almost all plant sites due to construction problems for deep geological repositories. Only Finland has stable repository plans; therefore, from a worldwide perspective, long-term waste storage costs are uncertain.
Construction and capital costs aside, measures to mitigate global warming, such as a carbon tax or carbon emissions trading, increasingly favor the economics of nuclear power.
Further efficiencies are hoped to be achieved through more advanced reactor designs: Generation III reactors promise to be at least 17% more fuel efficient and have lower capital costs, while Generation IV reactors promise further gains in fuel efficiency and significant reductions in nuclear waste.
In Eastern Europe, a number of long-established projects are struggling to find financing, notably Belene in Bulgaria and the additional reactors at Cernavodă in Romania, and some potential backers have pulled out.
Where cheap gas is available and its future supply relatively secure, this also poses a major problem for nuclear projects.
Analysis of the economics of nuclear power must take into account who bears the risks of future uncertainties. To date all operating nuclear power stations were developed by state-owned or regulated utilities where many of the risks associated with construction costs, operating performance, fuel price, and other factors were borne by consumers rather than suppliers.
Many countries have now liberalized the electricity market, where these risks, and the risk of cheaper competitors emerging before capital costs are recovered, are borne by station suppliers and operators rather than consumers; this leads to a significantly different evaluation of the economics of new nuclear power stations.
Following the 2011 Fukushima nuclear accident in Japan, costs are likely to go up for currently operating and new nuclear power stations, due to increased requirements for on-site spent fuel management and elevated design basis threats.
However, many designs, such as the AP1000 then under construction, use passive nuclear safety cooling systems, unlike Fukushima I, which required active cooling systems; this largely eliminates the need to spend more on redundant backup safety equipment.
According to the World Nuclear Association, as of March 2020:
- Nuclear power is cost competitive with other forms of electricity generation, except where there is direct access to low-cost fossil fuels.
- Fuel costs for nuclear plants are a minor proportion of total generating costs, though capital costs are greater than those for coal-fired plants and much greater than those for gas-fired plants.
- System costs for nuclear power (as well as coal and gas-fired generation) are very much lower than for intermittent renewables.
- Providing incentives for long-term, high-capital investment in deregulated markets driven by short-term price signals presents a challenge in securing a diversified and reliable electricity supply system.
- In assessing the economics of nuclear power, decommissioning and waste disposal costs are fully taken into account.
- Nuclear power plant construction is typical of large infrastructure projects around the world, whose costs and delivery challenges tend to be under-estimated.
Safety and accidents:
Modern nuclear reactor designs have had numerous safety improvements since the first-generation nuclear reactors. A nuclear power plant cannot explode like a nuclear weapon because the fuel for uranium reactors is not enriched enough, and nuclear weapons require precision explosives to force fuel into a small enough volume to go supercritical.
Most reactors require continuous temperature control to prevent a core meltdown, which has occurred on a few occasions through accident or natural disaster, releasing radiation and making the surrounding area uninhabitable. Plants must be defended against theft of nuclear material and attack by enemy military planes or missiles.
The most serious accidents to date have been:
- the 1979 Three Mile Island accident,
- the 1986 Chernobyl disaster,
- and the 2011 Fukushima Daiichi nuclear disaster.
Professor of sociology Charles Perrow states that multiple and unexpected failures are built into society's complex and tightly coupled nuclear reactor systems. Such accidents are unavoidable and cannot be designed around.
An interdisciplinary team from MIT has estimated that given the expected growth of nuclear power from 2005 to 2055, at least four serious nuclear accidents would be expected in that period. The MIT study does not, however, take into account improvements in safety since 1970.
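The arithmetic behind an estimate of that kind is straightforward expected-value reasoning: an assumed accident frequency per reactor-year multiplied by the projected reactor-years of operation. The figures in the sketch below are illustrative assumptions, not the MIT study's inputs.

```python
# Expected number of serious accidents
#   = accident frequency per reactor-year x cumulative reactor-years of operation.
# The numbers below are illustrative assumptions, not the MIT study's inputs.

accident_frequency = 1.0e-4    # assumed serious accidents per reactor-year
average_fleet_size = 700       # assumed average number of reactors worldwide over the period
period_years = 50              # 2005-2055

reactor_years = average_fleet_size * period_years
expected_accidents = accident_frequency * reactor_years
print(f"{reactor_years} reactor-years x {accident_frequency} per reactor-year "
      f"= {expected_accidents:.1f} expected serious accidents")
```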
Controversy:
The nuclear power debate about the deployment and use of nuclear fission reactors to generate electricity from nuclear fuel for civilian purposes peaked during the 1970s and 1980s, when it "reached an intensity unprecedented in the history of technology controversies," in some countries.
Proponents argue that nuclear power is a sustainable energy source which reduces carbon emissions and can increase energy security if its use supplants a dependence on imported fuels.
Proponents advance the notion that nuclear power produces virtually no air pollution, in contrast to the chief viable alternative of fossil fuel. Proponents also believe that nuclear power is the only viable course to achieve energy independence for most Western countries.
They emphasize that the risks of storing waste are small and can be further reduced by using the latest technology in newer reactors, and the operational safety record in the Western world is excellent when compared to the other major kinds of power plants.
Opponents say that nuclear power poses many threats to people and the environment, and that costs do not justify benefits. Threats include health risks and environmental damage from uranium mining, processing and transport, the risk of nuclear weapons proliferation or sabotage, and the problem of radioactive nuclear waste.
Another environmental issue is the discharge of hot water into the sea, which modifies the environmental conditions for marine flora and fauna. Opponents also contend that reactors themselves are enormously complex machines where many things can and do go wrong, and that there have been many serious nuclear accidents.
Critics do not believe that these risks can be reduced through new technology, despite rapid advancements in containment procedures and storage methods.
Opponents argue that when all the energy-intensive stages of the nuclear fuel chain are considered, from uranium mining to nuclear decommissioning, nuclear power is not a low-carbon electricity source, even allowing for the possibility that refinement and long-term storage could themselves be powered by nuclear facilities.
Countries that have no uranium mines cannot achieve energy independence through existing nuclear power technologies. Actual construction costs often exceed estimates, and spent fuel management costs are difficult to define.
On 1 August 2020, the UAE launched the Arab region's first-ever nuclear energy plant. Unit 1 of the Barakah plant in the Al Dhafrah region of Abu Dhabi commenced generating heat on the first day of its launch, while the remaining three units were still under construction.
However, Paul Dorfman, head of the Nuclear Consulting Group, warned that the Gulf nation's investment in the plant risks "further destabilizing the volatile Gulf region, damaging the environment and raising the possibility of nuclear proliferation."
Reprocessing:
Nuclear reprocessing technology was developed to chemically separate and recover fissionable plutonium from irradiated nuclear fuel. Reprocessing serves multiple purposes, whose relative importance has changed over time.
Originally reprocessing was used solely to extract plutonium for producing nuclear weapons. With the commercialization of nuclear power, the reprocessed plutonium was recycled back into MOX nuclear fuel for thermal reactors. The reprocessed uranium, which constitutes the bulk of the spent fuel material, can in principle also be re-used as fuel, but that is only economic when uranium prices are high or disposal is expensive.
Finally, the breeder reactor can employ not only the recycled plutonium and uranium in spent fuel, but all the actinides, closing the nuclear fuel cycle and potentially multiplying the energy extracted from natural uranium by more than 60 times.
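The "more than 60 times" figure follows from simple isotope arithmetic: a once-through thermal cycle fissions only a small fraction of natural uranium (mostly the ~0.7% that is U-235, plus a little bred plutonium), whereas a closed breeder cycle can in principle consume most of it. The utilization fractions in the sketch below are assumptions for illustration.

```python
# Rough arithmetic behind the energy-multiplication claim for a closed breeder fuel cycle.
# The utilization fractions are assumptions chosen only for illustration.

once_through_utilization = 0.01   # assumed fraction of natural uranium fissioned in a
                                  # once-through thermal cycle (U-235 plus some bred Pu)
breeder_utilization = 0.70        # assumed fraction fissioned in a fully closed breeder cycle

multiplier = breeder_utilization / once_through_utilization
print(f"Energy multiplication from closing the fuel cycle: roughly {multiplier:.0f}x")
```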
Nuclear reprocessing reduces the volume of high-level waste, but by itself does not reduce radioactivity or heat generation and therefore does not eliminate the need for a geological waste repository.
Reprocessing has been politically controversial because of:
- the potential to contribute to nuclear proliferation,
- the potential vulnerability to nuclear terrorism,
- the political challenges of repository siting (a problem that applies equally to direct disposal of spent fuel),
- and because of its high cost compared to the once-through fuel cycle.
In the United States, the Obama administration stepped back from President Bush's plans for commercial-scale reprocessing and reverted to a program focused on reprocessing-related scientific research.
Accident indemnification:
Nuclear power works under an insurance framework that limits or structures accident liabilities in accordance with the Paris Convention on Third Party Liability in the Field of Nuclear Energy, the Brussels supplementary convention, and the Vienna Convention on Civil Liability for Nuclear Damage.
However, states with a majority of the world's nuclear power stations, including the U.S., Russia, China and Japan, are not party to international nuclear liability conventions.
United States:
In the United States, insurance for nuclear or radiological incidents is covered (for facilities licensed through 2025) by the Price-Anderson Nuclear Industries Indemnity Act.
United Kingdom:
Under the energy policy of the United Kingdom, the Nuclear Installations Act 1965 governs liability for nuclear damage for which a UK nuclear licensee is responsible. The Act requires the liable operator to pay compensation for damage up to a limit of £150 million for ten years after the incident.
Between ten and thirty years afterwards, the Government meets this obligation. The Government is also liable for additional limited cross-border liability (about £300 million) under international conventions (Paris Convention on Third Party Liability in the Field of Nuclear Energy and Brussels Convention supplementary to the Paris Convention).
Decommissioning:
Nuclear decommissioning is the dismantling of a nuclear power station and decontamination of the site to a state no longer requiring protection from radiation for the general public. The main difference from the dismantling of other power stations is the presence of radioactive material that requires special precautions to remove and safely relocate to a waste repository.
Decommissioning involves many administrative and technical actions. It includes all clean-up of radioactivity and progressive demolition of the station. Once a facility is decommissioned, there should no longer be any danger of a radioactive accident or to any persons visiting it. After a facility has been completely decommissioned it is released from regulatory control, and the licensee of the station no longer has responsibility for its nuclear safety.
Timing and deferral of decommissioning:
Generally speaking, nuclear stations were originally designed for a life of about 30 years. Newer stations are designed for a 40 to 60-year operating life.
The Centurion Reactor is a future class of nuclear reactor that is being designed to last 100 years.
One of the major limiting wear factors is the deterioration of the reactor's pressure vessel under the action of neutron bombardment; however, in 2018 Rosatom announced it had developed a thermal annealing technique for reactor pressure vessels which ameliorates radiation damage and extends service life by between 15 and 30 years.
Flexibility:
Nuclear stations are used primarily for base load because of economic considerations. The fuel cost of operating a nuclear station is smaller than the fuel cost of operating coal or gas plants. Since most of the cost of a nuclear power plant is capital cost, there is almost no cost saving by running it at less than full capacity.
In France, nuclear power plants are routinely used in load following mode on a large scale, although "it is generally accepted that this is not an ideal economic situation for nuclear stations."
Unit A at the decommissioned German Biblis Nuclear Power Plant was designed to modulate its output 15% per minute between 40% and 100% of its nominal power.
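For a unit rated at, say, 1,200 MWe (an assumed figure for illustration), that ramp capability works out as follows.

```python
# Ramp arithmetic for a unit able to change output by 15% of nominal power per minute.
nominal_power_mw = 1200.0          # assumed rating, for illustration only
ramp_rate_per_min = 0.15           # fraction of nominal power per minute
low_fraction, high_fraction = 0.40, 1.00

power_swing_mw = (high_fraction - low_fraction) * nominal_power_mw
ramp_time_min = (high_fraction - low_fraction) / ramp_rate_per_min
ramp_rate_mw_per_min = ramp_rate_per_min * nominal_power_mw

print(f"Swing of {power_swing_mw:.0f} MWe covered in {ramp_time_min:.0f} minutes "
      f"at {ramp_rate_mw_per_min:.0f} MWe per minute")
```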
Russia has led in the practical development of floating nuclear power stations, which can be transported to the desired location and occasionally relocated or moved for easier decommissioning.
In 2022, the United States Department of Energy funded a three-year research study of offshore floating nuclear power generation. In October 2022, NuScale Power and the Canadian company Prodigy announced a joint project to bring a North American floating plant based on small modular reactors to market.
See also:
- List of commercial nuclear reactors
- List of nuclear power stations
- Media related to Nuclear power plant at Wikimedia Commons
- Non Destructive Testing for Nuclear Power Plants
Nuclear Power in the United States, including a List of the Largest Nuclear Plants on U.S. Soil
- YouTube Video: The Greatest Threat: Preventing Nuclear Terrorism
- YouTube Video: Economics of Nuclear Reactor
- YouTube Video: Coal vs. Nuclear Power Plants
Click Here for a List of the Largest Nuclear Plants on U.S. Soil
Nuclear power in the United States
In the United States, nuclear power is provided by 92 commercial reactors with a net capacity of 94.7 gigawatts (GW), with 61 pressurized water reactors and 31 boiling water reactors. In 2019, they produced a total of 809.41 terawatt-hours of electricity, which accounted for 20% of the nation's total electric energy generation. In 2018, nuclear comprised nearly 50 percent of US emission-free energy generation.
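Those figures imply a very high fleet-wide capacity factor, as the simple check below shows (it divides the reported 2019 generation by the maximum the installed capacity could have produced).

```python
# Implied fleet-wide capacity factor from the figures quoted above.
net_capacity_gw = 94.7
generation_twh_2019 = 809.41
hours_per_year = 8760

max_possible_twh = net_capacity_gw * hours_per_year / 1000.0   # GWh -> TWh
capacity_factor = generation_twh_2019 / max_possible_twh
print(f"Implied fleet capacity factor: {capacity_factor:.1%}")   # roughly 98%
```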
As of September 2017, there are two new reactors under construction with a gross electrical capacity of 2,500 MW, while 39 reactors have been permanently shut down. The United States is the world's largest producer of commercial nuclear power, and in 2013 generated 33% of the world's nuclear electricity. With the past and future scheduled plant closings, China and Russia could surpass the United States in nuclear energy production.
As of October 2014, the Nuclear Regulatory Commission (NRC) has granted license renewals providing 20-year extensions to a total of 74 reactors. In early 2014, the NRC prepared to receive the first applications of license renewal beyond 60 years of reactor life as early as 2017, a process which by law requires public involvement.
Licenses for 22 reactors are due to expire before the end of 2029 if no renewals are granted. Pilgrim Nuclear Power Station in Massachusetts was the most recent nuclear power plant to be decommissioned, on June 1, 2019.
Another five aging reactors were permanently closed in 2013 and 2014 before their licenses expired because of high maintenance and repair costs at a time when natural gas prices had fallen:
- San Onofre 2 and 3 in California,
- Crystal River 3 in Florida,
- Vermont Yankee in Vermont,
- and Kewaunee in Wisconsin.
In April 2021, New York State permanently closed Indian Point in Buchanan, 30 miles from New York City.
Most reactors began construction by 1974; following the Three Mile Island accident in 1979 and changing economics, many planned projects were canceled. More than 100 orders for nuclear power reactors, many already under construction, were canceled in the 1970s and 1980s, bankrupting some companies.
In 2006, the Brookings Institution, a public policy organization, stated that new nuclear units had not been built in the United States because of soft demand for electricity, the potential cost overruns on nuclear reactors due to regulatory issues and resulting construction delays.
There was a revival of interest in nuclear power in the 2000s, with talk of a "nuclear renaissance", supported particularly by the Nuclear Power 2010 Program. A number of applications were made, but facing economic challenges, and later in the wake of the 2011 Fukushima Daiichi nuclear disaster, most of these projects were canceled.
Up until 2013, there had also been no ground-breaking on new nuclear reactors at existing power plants since 1977. Then in 2012, the NRC approved construction of four new reactors at existing nuclear plants. Construction of the Virgil C. Summer Nuclear Generating Station Units 2 and 3 began on March 9, 2013, but was abandoned on July 31, 2017, after the reactor supplier Westinghouse filed for bankruptcy protection in March 2017.
On March 12, 2013, construction began on the Vogtle Electric Generating Plant Units 3 and 4. The target in-service date for Unit 3 was originally November 2021. Vogtle Unit 3 reached "initial criticality" in March 2023 and entered commercial service on July 31, 2023.
On October 19, 2016, TVA's Unit 2 reactor at the Watts Bar Nuclear Generating Station became the first US reactor to enter commercial operation since 1996.
Pictured below: Nuclear reactors within the Contiguous United States as of October 2021; marker colors indicate respective administrative Regions of the Nuclear Regulatory Commission.
Click on any of the following blue hyperlinks for more about Nuclear Power Plants Located within the United States:
- History
- Nuclear power plants
- Safety and accidents
- Uranium supply
- Fuel cycle
- Water use in nuclear power production
- Plant decommissioning
- Organizations
- Debate
- Economics
- Statistics
- See also:
- High-level radioactive waste management
- Nuclear energy policy of the United States
- Energy policy of the United States
- List of articles associated with nuclear issues in California
- List of nuclear reactors
- List of largest power stations in the United States
- List of the largest nuclear power stations in the United States
- Nuclear energy policy
- Nuclear reactor accidents in the United States
- List of nuclear whistleblowers
- United States-India Peaceful Atomic Energy Cooperation Act
- United States-Japan Joint Nuclear Energy Action Plan
- GA Mansoori, N Enayati, LB Agyarko (2016), Energy: Sources, Utilization, Legislation, Sustainability, Illinois as Model State, World Sci. Pub. Co., ISBN 978-981-4704-00-7
- World Nuclear Association
- US Nuclear Power Plants – General U.S. Nuclear Info
- The Nuclear Energy Institute: The policy organization of the nuclear energy and technologies industry
- Nuclear power plant operators in the United States (SourceWatch).
- How many people live near a nuclear power plant in the United States? Data Visualization
Three Mile Island Power Plant and Accident
- YouTube Video: Movie Trailer from the Film China Syndrome (1979)
- YouTube Video: Three Mile Island - What Really Happened
- YouTube Video: Three Mile Island: On the Closure and Decommissioning of a Nuclear Power Plant
Three Mile Island Nuclear Generating Station (commonly abbreviated as TMI) is a closed nuclear power plant on Three Mile Island in Londonderry Township, Dauphin County, Pennsylvania on Lake Frederic, a reservoir in the Susquehanna River just south of Harrisburg. It has two separate units, TMI-1 (owned by Constellation Energy) and TMI-2 (owned by EnergySolutions).
The plant was the site of the most significant accident in United States commercial nuclear energy when, on March 28, 1979, TMI-2 suffered a partial meltdown. According to the Nuclear Regulatory Commission (NRC) report, the accident resulted in no deaths or injuries to plant workers or in nearby communities.
Follow-up epidemiology studies did not find causality between the accident and any increase in cancers. One work-related death has occurred on-site during decommissioning.
The reactor core of TMI-2 has since been removed from the site, but the site has not been fully decommissioned. In July 1998, Amergen Energy (now Exelon Generation) agreed to purchase TMI-1 from General Public Utilities for $100 million.
The plant was originally built by General Public Utilities Corporation, later renamed GPU Incorporated. The plant was operated by Metropolitan Edison Company (Met-Ed), a subsidiary of the GPU Energy division. In 2001, GPU Inc. merged with FirstEnergy Corporation. On December 18, 2020, FirstEnergy transferred Unit 2's license to EnergySolutions' subsidiary TMI-2 Solutions after receiving approval from the NRC.
Exelon had been operating Unit 1 at a financial loss since 2015. In 2017 the company said it would consider ceasing operations at Unit 1 because of high costs unless there was action from the Pennsylvania government. Unit 1 officially shut down at noon on September 20, 2019.
Unit 1 decommissioning is expected to be completed in 2079 and will cost $1.2 billion. Unit 2, which has been dormant since the accident in 1979, is expected to close in 2052.
Emergency zones and nearby population:
The NRC defines two emergency planning zones around nuclear power plants: a plume exposure pathway zone with a radius of 10 miles (16 km), concerned primarily with exposure to, and inhalation of, airborne radioactive contamination, and an ingestion pathway zone of about 50 miles (80 km), concerned primarily with ingestion of food and liquid contaminated by radioactivity.
The 2010 U.S. population within 10 miles (16 km) of Three Mile Island was 211,261, an increase of 10.9 percent in a decade, according to an analysis of U.S. Census data. The 2010 U.S. population within 50 miles (80 km) was 2,803,322, an increase of 10.3 percent since 2000. Cities within 50 miles include Harrisburg (12 miles to city center), York (13 miles to city center), and Lancaster (24 miles to city center).[26]
Electricity production:
During its last full year of operation in 2018, Three Mile Island generated 7,355 GWh of electricity. In that same year, electricity from nuclear power produced approximately 39% of the total electricity generated in Pennsylvania (83.5 TWh nuclear of 215 TWh total), with Three Mile Island Generating Station contributing approximately 4% to the statewide total generation. In 2021 the state of Pennsylvania generated approximately 241 TWh total electricity.
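The quoted shares can be checked with the arithmetic below.

```python
# Checking the quoted 2018 generation shares for Pennsylvania.
tmi_generation_twh = 7.355     # 7,355 GWh from Three Mile Island
pa_nuclear_twh = 83.5
pa_total_twh = 215.0

nuclear_share = pa_nuclear_twh / pa_total_twh            # share of state generation from nuclear
tmi_share_of_state = tmi_generation_twh / pa_total_twh   # Three Mile Island's share of the state total

print(f"Nuclear share of Pennsylvania generation: {nuclear_share:.0%}")
print(f"Three Mile Island share of the state total: {tmi_share_of_state:.1%}")
```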
Three Mile Island Unit 1:
The Three Mile Island Unit 1 is a pressurized water reactor designed by Babcock & Wilcox with a net generating capacity of 819 MWe. The initial construction cost for TMI-1 was US$400 million, equal to $2.37 billion in 2018 dollars. Unit 1 first came online on April 19, 1974, and began commercial operations on September 2, 1974.
TMI-1 was licensed to operate for 40 years from its first run; in 2009 the license was extended by 20 years, which means the unit could have operated until April 19, 2034.
TMI-1 had a closed-cycle cooling system for its main condenser using two natural draft cooling towers. Makeup water was drawn from the river to replace the water lost via evaporation in the towers. Once-through cooling with river water was used for the service water system, which cooled auxiliary components and removed decay heat when the reactor was shut down.
On February 17, 1979, TMI-1 went offline for refueling. It was brought back online on October 9, 1985, after public opposition, several federal court injunctions, and some technical and regulatory complications - more than six years after it initially went offline.
Unit 1 was scheduled to be shut down by September 2019 after Exelon announced they did not receive any commitments for subsidies from the state, rendering Exelon unable to continue operating the reactor. TMI-1 was shut down on September 20, 2019.
Incidents:
In February 1993, a man drove his car past a checkpoint at the TMI nuclear plant, then broke through an entry gate. He eventually crashed the car through a secure door and entered the Unit 1 turbine building. The intruder, who had a history of mental illness, hid in the turbine building and was apprehended after four hours.
During and following the September 11 attacks, there were fears that United Airlines Flight 93 was headed towards Three Mile Island. On that day, the NRC placed all of the nation's nuclear power plants at the highest level of security. United Flight 93 crashed into a field about 135 miles (217 km) west of Three Mile Island in Stonycreek Township, just outside Shanksville, Pennsylvania, with its actual target believed to have been Washington, D.C.
On November 21, 2009, a release of radioactivity occurred inside the containment building of TMI-1 while workers were cutting pipes. Exelon Corporation stated to the public that "A monitor at the temporary opening cut into the containment building wall to allow the new steam generators to be moved inside showed a slight increase in a reading and then returned to normal. Approximately 20 employees were treated for mild radiation exposure." As of November 22, 2009, it was believed that no radiation had escaped the containment building and that the public was not in any danger. The airborne contamination inside the building was caused by a change in air pressure inside the containment building that dislodged small irradiated particles in the reactor piping system.
Some of the small particles became airborne inside the building and were detected by an array of monitors in place to detect such material. The air pressure change occurred when inside building ventilation fans were started to support outage activities.
The site modified the ventilation system to prevent future air pressure changes. Work continued on the project the following day. On January 24, 2010, TMI-1 was brought back online.
Material handling accident:
On September 10, 2021, a contractor from Alabama was fatally injured while unloading equipment from a truck. Fire and emergency medical personnel from Londonderry Township were dispatched and declared the contractor dead on arrival. The Nuclear Regulatory Commission said the injury was work-related, and the contractor was outside the radiological controlled area.
Three Mile Island Unit 2:
The Three Mile Island Unit 2 was also a pressurized water reactor constructed by B&W, similar to Unit 1. The only difference was that TMI-2 was slightly larger, with a net generating capacity of 906 MWe, compared to TMI-1's 819 MWe.
Unit 2 received its operating license on February 8, 1978, and began commercial operation on December 30, 1978. TMI Unit 2 has been permanently shut down since the Three Mile Island accident in 1979.
Accident:
Main article: Three Mile Island accident
On March 28, 1979, a cooling system malfunction caused a partial meltdown of the reactor core. This loss-of-coolant accident resulted in the release of an estimated 43,000 curies (1.59 PBq) of radioactive krypton-85 gas (half-life about 10 years), and less than 20 curies (740 GBq) of the especially hazardous iodine-131 (half-life 8 days), into the surrounding environment.
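The quoted activities convert between curies and becquerels as shown below, and the short half-life of iodine-131 means its activity falls off quickly; the elapsed times in the sketch are arbitrary examples.

```python
# Converting the quoted releases and illustrating the decay of short-lived iodine-131.
CI_TO_BQ = 3.7e10   # 1 curie = 3.7e10 becquerels (disintegrations per second)

kr85_release_ci = 43000.0
i131_release_ci = 20.0   # "less than 20 curies"; treated here as an upper bound

print(f"Kr-85: {kr85_release_ci * CI_TO_BQ:.2e} Bq (about 1.59 PBq)")
print(f"I-131: {i131_release_ci * CI_TO_BQ:.2e} Bq (about 740 GBq, upper bound)")

i131_half_life_days = 8.0
for days in (8, 30, 80):
    remaining_fraction = 0.5 ** (days / i131_half_life_days)
    print(f"I-131 activity remaining after {days:3d} days: {remaining_fraction:.4f} of initial")
```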
Nearly 2 million people were exposed to radiation from the accident. A review by the World Nuclear Association concluded that no deaths, injuries or adverse health effects resulted from the accident, and a report by Columbia University epidemiologist Maureen Hatch confirmed this finding.
Because of the health concerns, the Pennsylvania Department of Health kept a registry of more than 30,000 people that lived within 5 miles (8.0 km) of TMI at the time of the accident. The registry was kept for nearly 20 years until 1997, when no evidence was found of unusual health effects.
Further epidemiology studies have not shown any increase in cancer as a result of the accident. However, almost $25 million was paid in insurance settlements to people who then agreed not to discuss their injuries in ongoing litigation.
Unit 2 has not been operational since the accident occurred.
The New York Times reported on August 14, 1993, 14 years after the accident, that the cleanup had been finished. According to the United States NRC, 2.3 million gallons of waste water had been removed.
The incident was widely publicized internationally and had far-reaching effects on public opinion, particularly in the United States. The China Syndrome (see the film's trailer among the YouTube videos above), a movie about a nuclear disaster released 12 days before the incident, received a glowing reception from the movie-going public and became a blockbuster hit.
Unit 2 Generator:
On January 22, 2010, officials at the NRC announced that the electrical generator from the damaged Unit 2 reactor at TMI would be used at the Shearon Harris Nuclear Plant in New Hill, North Carolina. The generator was transported in two parts, weighing a combined 670 tons. It was refurbished and installed during a refueling outage at Shearon Harris NPP in November 2010.
TMI's Unit 2 reactor has been shut down since the partial meltdown in 1979.
Post-accident:
Exelon Corporation was created in October 2000 by the merger of PECO Energy Company and Unicom, of Philadelphia and Chicago respectively. Unicom owned Commonwealth Edison. The PECO share in AmerGen was acquired by Exelon during late 2000.
Exelon acquired British Energy's share in AmerGen in 2003, and transferred Unit 1 under the direct ownership and operation of its Exelon Nuclear business unit. According to Exelon Corporation, "many people are surprised when they learn that Three Mile Island is still making electricity, enough to power 800,000 households" from its undamaged and fully functional reactor unit 1.
Exelon viewed the plant's economics of $44/MWh as challenging due to the low price of natural gas at $25/MWh. As of 2016, the average price of electricity in the area was $39/MWh.
Closure:
On June 20, 2017, Exelon Generation, the owners of Three Mile Island's Unit 1, sent to the Nuclear Regulatory Commission a formal notice of its intention to shut down the plant on September 30, 2019, unless the Pennsylvania legislature rescued the nuclear industry, which was struggling to compete as newfound natural gas resources drove down electricity prices.
Exelon Generation's Senior Vice President Bryan Hanson noted that once Three Mile Island was closed, it could never be reopened for use again. Hanson explicitly stated that the reason for the shutdown was the unprofitability of Unit 1, which had lost the company over $300 million over the previous half-decade despite being one of Exelon's best-performing power plants.
Shutdown of Unit 1 could proceed in one of two ways. The first is immediate dismantlement once the radioactive fuel has been moved away from the plant: the spent fuel is removed from the pool, placed into storage casks, and the casks are transferred to the ISFSI pad for storage until the DOE takes them away to a DOE repository.
Dismantling the plant this way would take roughly 8 to 10 years.
The second option is long-term storage: mothballing the plant and letting the radioactivity decay on its own for up to 60 years to a harmless level before the buildings are completely dismantled.
The advantage of long-term storage is the far lower radiation level when dismantlement eventually begins; the disadvantages are a possible shortage of qualified workers at that time and the cost of limited maintenance and security of the plant over the potential 60 years.
All of the spent fuel will be moved to the Londonderry Township facility, a process that could itself take decades to complete.
About 70 state legislators signed on to the industry-inspired Nuclear Caucus but made no financial commitments.
In April 2019, Exelon stated it would cost $1.2 billion over nearly 60 years to completely decommission Unit 1. Unit 1 closed on September 20, 2019.
In 2022, Unit 1 was transferred to Constellation Energy following its separation from Exelon. Unit 2 was transferred to TriArtisan ES Partners, LLC, following that company's acquisition of EnergySolutions.
Decommissioning:
Following the TMI-2 accident in 1979, approximately 99% of the fuel and damaged core debris was removed from the reactor vessel and associated systems and shipped to the Idaho National Laboratory near Idaho Falls, Idaho. Since 1993, when the initial cleanup of the plant was completed, TMI-2 has been in a condition known as Post Defueling Monitored Storage (PDMS) and is under constant monitoring to ensure the plant's safety and stability.
The cost of decommissioning a closed nuclear reactor and related structures at Three Mile Island is estimated at $918 million.
Seismic risk:
The Nuclear Regulatory Commission's estimate of the risk each year of an earthquake intense enough to cause core damage to the reactor at Three Mile Island was 1 in 25,000, according to an NRC study published in August 2010.
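As a rough illustration of what such an annual frequency implies over a plant's life, the sketch below compounds it over an assumed 40-year operating period; the 40-year figure is an assumption for illustration only, not part of the NRC study:

```python
# Converting an annual core-damage frequency into a cumulative probability.
# The 1-in-25,000-per-year figure is the NRC estimate quoted above; the
# 40-year window is an assumed operating period, used only for illustration.
annual_probability = 1 / 25_000
years = 40

cumulative = 1 - (1 - annual_probability) ** years
print(f"Probability over {years} years: {cumulative:.3%}")   # ~0.16%
```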
See also:
Chernobyl Nuclear Power Plant and Disaster
- YouTube Video: The Chernobyl Disaster: How It Happened
- YouTube Video: Inside the heart of the Chernobyl nuclear reactor | 60 Minutes Australia
- YouTube Video: Chernobyl Nuclear Explosion Disaster Explained (Hour by Hour)
TOP: Chernobyl Nuclear Power Plant in 2013
BOTTOM: Steam plumes continued to be generated days after the initial explosion
The Chernobyl Nuclear Power Plant is a nuclear power plant undergoing decommissioning. ChNPP is located near the abandoned city of Pripyat in northern Ukraine, 16.5 kilometers (10 mi) northwest of the city of Chernobyl, 16 kilometers (10 mi) from the Belarus–Ukraine border, and about 100 kilometers (62 mi) north of Kyiv. The plant was cooled by an engineered pond, fed by the Pripyat River about 5 kilometers (3 mi) northwest from its juncture with the Dnieper.
Originally named for Vladimir Lenin, the plant was commissioned in phases with the four reactors entering commercial operation between 1978 and 1984.
In 1986, in what became known as the Chernobyl disaster, reactor No. 4 suffered a catastrophic meltdown and explosion; as a result of this, the power plant is now within a large restricted area known as the Chernobyl Exclusion Zone. Both the zone and the power plant are administered by the State Agency of Ukraine on Exclusion Zone Management.
The three other reactors remained operational post-accident, maintaining a capacity factor of between 60 and 70%. In total, units 1 and 3 each supplied 98 terawatt-hours of electricity, with unit 2 slightly behind at 75 TWh. In 1991, unit 2 was placed into a permanent shutdown state by the plant's operator due to complications resulting from a turbine fire. This was followed by unit 1 in 1996 and unit 3 in 2000.
Their closures were largely attributed to foreign pressures. In 2013, the plant's operator announced that units 1-3 were fully defueled, and in 2015 entered the decommissioning phase, during which equipment contaminated during the operational period of the power station will be removed.
This process is expected to take until 2065 according to the plant's operator. Although the reactors have all ceased generation, Chernobyl maintains a large workforce as the ongoing decommissioning process requires constant management.
From 24 February to 31 March 2022, Russian troops occupied the plant as part of their invasion of Ukraine.
Click on any of the following blue hyperlinks for more about the Chernobyl Nuclear Power Plant and Disaster:
World Nuclear Association
- YouTube Video: World Nuclear Association: Building on 20 years of success
- YouTube Video: World Nuclear Performance Report 2023
- YouTube Video: World Nuclear Association Strategic eForum: Committing to Net Zero
Nuclear Power Reactors (Updated May 2023)
- Nuclear reactors work by using the heat energy released from splitting atoms of certain elements to generate electricity.
- Most nuclear electricity is generated using just two kinds of reactor which were developed in the 1950s and improved since.
- The first generation of these reactors has now been retired, and most of those operating are second-generation.
- New designs are coming forward, both large and small.
- About 10% of the world's electricity is produced from nuclear energy.
This page is about the main conventional types of nuclear reactor. For more advanced types, see pages on:
- Advanced Nuclear Power Reactors,
- Small Nuclear Power Reactors,
- Fast Neutron Reactors
- and Generation IV Nuclear Reactors.
How does a nuclear reactor work?
A nuclear reactor produces and controls the release of energy from splitting the atoms of certain elements. In a nuclear power reactor, the energy released is used as heat to make steam to generate electricity. (In a research reactor the main purpose is to utilise the actual neutrons produced in the core. In most naval reactors, steam drives a turbine directly for propulsion.)
The principles for using nuclear power to produce electricity are the same for most types of reactor. The energy released from continuous fission of the atoms of the fuel is harnessed as heat in either a gas or water, and is used to produce steam. The steam is used to drive the turbines which produce electricity (as in most fossil fuel plants).
The world's first nuclear reactors 'operated' naturally in a uranium deposit about two billion years ago. These were in rich uranium orebodies and moderated by percolating rainwater.
The 17 known examples, at Oklo in West Africa, each less than 100 kW thermal, together consumed about six tonnes of uranium. It is assumed that these were not unique worldwide.
Today, reactors derived from designs originally developed for propelling submarines and large naval ships generate about 85% of the world's nuclear electricity. The main design is the pressurised water reactor (PWR) which has water at over 300°C under pressure in its primary cooling/heat transfer circuit, and generates steam in a secondary circuit.
The less numerous boiling water reactor (BWR) makes steam in the primary circuit above the reactor core, at similar temperatures and pressure. Both types use water as both coolant and moderator, to slow neutrons. Since water normally boils at 100°C, they have robust steel pressure vessels or tubes to enable the higher operating temperature. (Another type uses heavy water, with deuterium atoms, as moderator. Hence the term ‘light water’ is used to differentiate.)
Components of a nuclear reactor:
There are several components common to most types of reactor:
Fuel
Uranium is the basic fuel. Usually pellets of uranium oxide (UO2) are arranged in tubes to form fuel rods. The rods are arranged into fuel assemblies in the reactor core.* In a 1000 MWe class PWR there might be 51,000 fuel rods with over 18 million pellets.
* - In a new reactor with new fuel a neutron source is needed to get the reaction going.
Usually this is beryllium mixed with polonium, radium or other alpha-emitter. Alpha particles from the decay cause a release of neutrons from the beryllium as it turns to carbon-12.
Restarting a reactor with some used fuel may not require this, as there may be enough neutrons to achieve criticality when control rods are removed.
Moderator
Material in the core which slows down the neutrons released from fission so that they cause more fission. It is usually water, but may be heavy water or graphite.
Control rods or blades
These are made with neutron-absorbing material such as cadmium, hafnium or boron, and are inserted or withdrawn from the core to control the rate of reaction, or to halt it.* In some PWR reactors, special control rods are used to enable the core to sustain a low level of power efficiently. (Secondary control systems involve other neutron absorbers, usually boron in the coolant – its concentration can be adjusted over time as the fuel burns up.) PWR control rods are inserted from the top, BWR cruciform blades from the bottom of the core.
* In fission, most of the neutrons are released promptly, but some are delayed. These are crucial in enabling a chain reacting system (or reactor) to be controllable and to be able to be held precisely critical.
Coolant
A fluid circulating through the core so as to transfer the heat from it. In light water reactors the water moderator functions also as primary coolant. Except in BWRs, there is a secondary coolant circuit where the water becomes steam. (See also later section on primary coolant characteristics.) A PWR has two to four primary coolant loops with pumps, driven either by steam or electricity – China’s Hualong One design has three, each driven by a 6.6 MW electric motor, with each pump set weighing 110 tonnes.
Pressure vessel or pressure tubes
Usually a robust steel vessel containing the reactor core and moderator/coolant, but it may be a series of tubes holding the fuel and conveying the coolant through the surrounding moderator.
Steam generator
Part of the cooling system of pressurised water reactors (PWR & PHWR) where the high-pressure primary coolant bringing heat from the reactor is used to make steam for the turbine, in a secondary circuit. Essentially a heat exchanger like a motor car radiator.* Reactors have up to six 'loops', each with a steam generator. Since 1980 over 110 PWR reactors have had their steam generators replaced after 20-30 years service, over half of these in the USA.
* These are large heat exchangers for transferring heat from one fluid to another – here from high-pressure primary circuit in PWR to secondary circuit where water turns to steam. Each structure weighs up to 800 tonnes and contains from 300 to 16,000 tubes about 2 cm diameter for the primary coolant, which is radioactive due to nitrogen-16 (N-16, formed by neutron bombardment of oxygen, with half-life of 7 seconds).
The secondary water must flow through the support structures for the tubes. The whole thing needs to be designed so that the tubes don't vibrate and fret, operated so that deposits do not build up to impede the flow, and maintained chemically to avoid corrosion. Tubes which fail and leak are plugged, and surplus capacity is designed to allow for this. Leaks can be detected by monitoring N-16 levels in the steam as it leaves the steam generator.
Containment
The structure around the reactor and associated steam generators which is designed to protect it from outside intrusion and to protect those outside from the effects of radiation in case of any serious malfunction inside. It is typically a metre-thick concrete and steel structure.
Newer Russian and some other reactors install core melt localisation devices or 'core catchers' under the pressure vessel to catch any melted core material in the event of a major accident.
There are several different types of reactor; the main ones are described later on this page.
Fuelling a nuclear reactor:
Most reactors need to be shut down for refuelling, so that the reactor vessel can be opened up. In this case refuelling is at intervals of 12, 18 or 24 months, when a quarter to a third of the fuel assemblies are replaced with fresh ones. The CANDU and RBMK types have pressure tubes (rather than a pressure vessel enclosing the reactor core) and can be refuelled under load by disconnecting individual pressure tubes.
The AGR is also designed for refuelling on-load.
If graphite or heavy water is used as moderator, it is possible to run a power reactor on natural instead of enriched uranium. Natural uranium has the same isotopic composition as when it was mined (0.7% U-235, over 99.2% U-238); enriched uranium has had the proportion of the fissile isotope (U-235) increased by a process called enrichment, commonly to 3.5-5.0%. If enriched fuel is used, the moderator can be ordinary water, and such reactors are collectively called light water reactors.
Because the light water absorbs neutrons as well as slowing them, it is less efficient as a moderator than heavy water or graphite. Some new small reactor designs require high-assay low-enriched uranium fuel, enriched to near 20% U-235.
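As a rough illustration of what those enrichment levels imply for uranium requirements, the sketch below applies a simple mass balance (separative work is ignored); the 0.25% tails assay is an assumed typical value, not a figure from this page:

```python
# Simple uranium mass balance for enrichment (separative work ignored).
# Assumed assays: natural feed 0.711% U-235, tails 0.25% (a typical
# commercial value, assumed here only for illustration).
def feed_per_kg_product(product_assay: float,
                        feed_assay: float = 0.00711,
                        tails_assay: float = 0.0025) -> float:
    """Kilograms of natural uranium feed needed per kilogram of product."""
    return (product_assay - tails_assay) / (feed_assay - tails_assay)

for assay in (0.035, 0.05, 0.197):   # 3.5%, 5% and ~20% (HALEU)
    print(f"{assay:.1%} product needs ~{feed_per_kg_product(assay):.1f} kg of feed per kg")
```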
During operation, some of the U-238 is changed to plutonium, and Pu-239 ends up providing about one-third of the energy from the fuel.
In most reactors the fuel is ceramic uranium oxide (UO2 with a melting point of 2800°C) and most is enriched. The fuel pellets (usually about 1 cm diameter and 1.5 cm long) are typically arranged in a long zirconium alloy (zircaloy) tube to form a fuel rod, the zirconium being hard, corrosion-resistant and transparent to neutrons.*
Numerous rods form a fuel assembly, which is an open lattice and can be lifted into and out of the reactor core. In the most common reactors these are about 4 metres long. A BWR fuel assembly may weigh about 320 kg and a PWR one about 655 kg, in which case they hold about 183 kg and 460 kg of uranium respectively. In both, about 100 kg of zircaloy is involved.
* Zirconium is an important mineral for nuclear power, where it finds its main use. It is therefore subject to controls on trading. It is normally contaminated with hafnium, a neutron absorber, so very pure 'nuclear grade' Zr is used to make the zircaloy, which is about 98% Zr plus about 1.5% tin, also iron, chromium and sometimes nickel to enhance its strength.
A significant industry initiative is to develop accident-tolerant fuels which are more resistant to melting under conditions such as those in the Fukushima accident, and with the cladding being more resistant to oxidation with hydrogen formation at very high temperatures under such conditions.
Burnable poisons are often used in fuel or coolant to even out the performance of the reactor over time from fresh fuel being loaded to refuelling. These are neutron absorbers which decay under neutron exposure, compensating for the progressive build up of neutron absorbers in the fuel as it is burned, and hence allowing higher fuel burn-up (in terms of GW days per tonne of U)*.
The best known is gadolinium, which is a vital ingredient of fuel in naval reactors where installing fresh fuel is very inconvenient, so reactors are designed to run more than a decade between refuellings (full power equivalent – in practice they are not run continuously).
Gadolinium is incorporated in the ceramic fuel pellets. An alternative is zirconium diboride integral fuel burnable absorber (IFBA) as a thin coating on normal pellets.
* Average burn-up of fuel used in US reactors has increased to nearly 50 GWd/t, from half that in the 1980s.
Gadolinium, mostly at up to 3 g of oxide per kilogram of fuel, requires slightly higher fuel enrichment to compensate for it, and after burn-up of about 17 GWd/t it retains about 4% of its absorptive effect and does not decrease further. The ZrB2 IFBA burns away more steadily and completely, and has no impact on fuel pellet properties.
It is now used in most US reactors and a few in Asia. China has the technology for AP1000 reactors.
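As a rough illustration of what a burn-up of 50 GWd/t means in terms of electricity, the sketch below converts it using an assumed thermal-to-electric efficiency of about 33%, a typical value for water-cooled reactors rather than a figure quoted here:

```python
# Rough conversion of fuel burn-up (thermal energy per tonne of uranium)
# into electricity sent out, assuming a ~33% thermal-to-electric efficiency
# (an assumed typical value for water-cooled reactors).
burnup_gwd_per_t = 50          # GW-days (thermal) per tonne of uranium, as cited above
thermal_efficiency = 0.33

gwd_electric = burnup_gwd_per_t * thermal_efficiency
twh_electric = gwd_electric * 24 / 1000    # GW-days -> TWh

print(f"~{twh_electric:.2f} TWh of electricity per tonne of fuel")  # ~0.4 TWh
```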
Main types of nuclear reactor:
Pressurised water reactor (PWR):
This is the most common type, with about 300 operable reactors for power generation and several hundred more employed for naval propulsion. The design of PWRs originated as a submarine power plant. PWRs use ordinary water as both coolant and moderator.
The design is distinguished by having a primary cooling circuit which flows through the core of the reactor under very high pressure, and a secondary circuit in which steam is generated to drive the turbine. In Russia these are known as VVER types – water-moderated and -cooled.
A PWR has fuel assemblies of 200-300 rods each, arranged vertically in the core, and a large reactor would have about 150-250 fuel assemblies with 80-100 tonnes of uranium.
Water in the reactor core reaches about 325°C, hence it must be kept under about 150 times atmospheric pressure to prevent it boiling. Pressure is maintained by steam in a pressuriser (see diagram). In the primary cooling circuit the water is also the moderator, and if any of it turned to steam the fission reaction would slow down. This negative feedback effect is one of the safety features of the type. The secondary shutdown system involves adding boron to the primary circuit.
The secondary circuit is under less pressure and the water here boils in the heat exchangers which are thus steam generators. The steam drives the turbine to produce electricity, and is then condensed and returned to the heat exchangers in contact with the primary circuit.
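The core inventory quoted above (80-100 tonnes of uranium in 150-250 assemblies) can be cross-checked against the per-assembly figure of about 460 kg of uranium given earlier; the sketch below does that arithmetic, with 193 assemblies included only as an assumed mid-range example:

```python
# Rough cross-check of a PWR core inventory from the per-assembly figure
# quoted earlier (about 460 kg of uranium per PWR fuel assembly).
uranium_per_assembly_kg = 460
for assemblies in (150, 193, 250):   # 193 is an assumed mid-range example
    tonnes = assemblies * uranium_per_assembly_kg / 1000
    print(f"{assemblies} assemblies -> ~{tonnes:.0f} t uranium")
# 150-250 assemblies gives roughly 70-115 t, broadly consistent with the
# 80-100 tonnes quoted for a large reactor.
```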
Boiling water reactor (BWR):
This type of reactor has many similarities to the PWR, except that there is only a single circuit in which the water is at lower pressure (about 75 times atmospheric pressure) so that it boils in the core at about 285°C. The reactor is designed to operate with 12-15% of the water in the top part of the core as steam, and hence with less moderating effect and thus efficiency there. BWR units can operate in load-following mode more readily than PWRs.
The steam passes through drier plates (steam separators) above the core and then directly to the turbines, which are thus part of the reactor circuit. Since the water around the core of a reactor is always contaminated with traces of radionuclides, the turbine must be shielded and radiological protection provided during maintenance. The cost of this tends to balance the savings due to the simpler design. Most of the radioactivity in the water is very short-lived*, so the turbine hall can be entered soon after the reactor is shut down.
* mostly N-16, with a 7 second half-life
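The practical effect of that 7-second half-life can be illustrated with a short decay calculation; the waiting times below are arbitrary examples:

```python
# Why a 7-second half-life matters: the fraction of N-16 activity remaining
# shortly after shutdown (the waiting times are arbitrary examples).
half_life_s = 7
for wait_s in (60, 300):
    fraction = 0.5 ** (wait_s / half_life_s)
    print(f"after {wait_s:4d} s: {fraction:.2e} of the original N-16 activity")
# After five minutes the N-16 activity has fallen by a factor of roughly 10^13.
```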
A BWR fuel assembly comprises 90-100 fuel rods, and there are up to 750 assemblies in a reactor core, holding up to 140 tonnes of uranium. The secondary control system involves restricting water flow through the core so that more steam in the top part reduces moderation.
Pressurised heavy water reactor (PHWR):
The PHWR has been developed since the 1950s in Canada as the CANDU, and from the 1980s also in India. PHWRs generally use natural uranium (0.7% U-235) oxide as fuel, and hence need a more efficient moderator, in this case heavy water (D2O).** The PHWR produces more energy per kilogram of mined uranium than other designs, but also produces a much larger amount of used fuel per unit output.
** With the CANDU system, the moderator is enriched (i.e. heavy water is used) rather than the fuel – a cost trade-off.
The moderator is in a large tank called a calandria, penetrated by several hundred horizontal pressure tubes which form channels for the fuel, cooled by a flow of heavy water under high pressure (about 100 times atmospheric pressure) in the primary cooling circuit, typically reaching 290°C.
As in the PWR, the primary coolant generates steam in a secondary circuit to drive the turbines. The pressure tube design means that the reactor can be refuelled progressively without shutting down, by isolating individual pressure tubes from the cooling circuit. It is also less costly to build than designs with a large pressure vessel, but the tubes have not proved as durable.
A CANDU fuel assembly consists of a bundle of 37 half-metre-long fuel rods (ceramic fuel pellets in zircaloy tubes) plus a support structure, with 12 bundles lying end to end in a fuel channel.
Control rods penetrate the calandria vertically, and a secondary shutdown system involves adding gadolinium to the moderator. The heavy water moderator circulating through the body of the calandria vessel also yields some heat (though this circuit is not shown on the diagram above).
Newer PHWR designs such as the Advanced Candu Reactor (ACR) have light water cooling and slightly-enriched fuel.
CANDU reactors can accept a variety of fuels. They may be run on recycled uranium from reprocessing LWR used fuel, or a blend of this and depleted uranium left over from enrichment plants. About 4000 MWe of PWR might then fuel 1000 MWe of CANDU capacity, with addition of depleted uranium. Thorium may also be used in fuel.
Advanced gas-cooled reactor (AGR):
These are the second generation of British gas-cooled reactors, using graphite moderator and carbon dioxide as primary coolant. The fuel is uranium oxide pellets, enriched to 2.5 - 3.5%, in stainless steel tubes. The carbon dioxide circulates through the core, reaching 650°C and then past steam generator tubes outside it, but still inside the concrete and steel pressure vessel (hence 'integral' design). Control rods penetrate the moderator and a secondary shutdown system involves injecting nitrogen to the coolant. The high temperature gives it a high thermal efficiency – about 41%. Refuelling can be on-load.
The AGR was developed from the Magnox reactor. Magnox reactors were also graphite moderated and CO2 cooled, used natural uranium fuel in metal form, and water as secondary coolant. The UK's last Magnox reactor closed at the end of 2015.
Light water graphite-moderated reactor (LWGR):
The main LWGR design is the RBMK, a Soviet design, developed from plutonium production reactors. It employs long (7 metre) vertical pressure tubes running through graphite moderator, and is cooled by water, which is allowed to boil in the core at 290°C and at about 6.9 MPa, much as in a BWR.
Fuel is low-enriched uranium oxide made up into fuel assemblies 3.5 metres long. With moderation largely due to the fixed graphite, excess boiling simply reduces the cooling and neutron absorption without inhibiting the fission reaction, and a positive feedback problem can arise, which is why they have never been built outside the Soviet Union. See appendix on RBMK Reactors for further information.
Fast neutron reactor (FNR):
Some reactors do not have a moderator and utilise fast neutrons, generating power from plutonium while making more of it from the U-238 isotope in or around the fuel. While they get more than 60 times as much energy from the original uranium compared with normal reactors, they are expensive to build.
Further development of them is likely in the next decade, and the main designs expected to be built in two decades are FNRs. If they are configured to produce more fissile material (plutonium) than they consume they are called fast breeder reactors (FBR). See also the information pages on Fast Neutron Reactors and Small Nuclear Power Reactors.
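The 'more than 60 times as much energy' figure can be rationalised with some rough arithmetic on uranium utilisation; the percentages in the sketch below are illustrative assumptions, not figures from this page:

```python
# Rough arithmetic behind the "60 times as much energy" claim.
# Assumed utilisation figures (illustrative only): a conventional thermal
# reactor fissions roughly 1% of the mined uranium (the 0.7% U-235 plus a
# little bred plutonium); a fast reactor with full recycle might use ~60-70%
# of the uranium, including most of the U-238.
thermal_utilisation = 0.01
fast_utilisation = 0.65

print(f"Energy ratio: ~{fast_utilisation / thermal_utilisation:.0f}x")   # ~65x
```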
For reactors under construction, see information page on Plans for New Reactors Worldwide.
Advanced reactors:
Several generations of reactors are commonly distinguished. Generation I reactors were developed in the 1950-60s and the last one (Wylfa 1 in the UK) shut down at the end of 2015.
They mostly used natural uranium fuel and used graphite as moderator. Generation II reactors are typified by the present US fleet and most in operation elsewhere. They typically use enriched uranium fuel and are mostly cooled and moderated by water. Generation III are the advanced reactors evolved from these, the first few of which are in operation in Japan, China, Russia and the UAE.
Others are under construction and ready to be ordered. They are developments of the second generation with enhanced safety. There is no clear distinction between Generation II and Generation III.
Generation IV designs are still on the drawing board. They will tend to have closed fuel cycles and burn the long-lived actinides now forming part of spent fuel, so that fission products are the only high-level waste. Of seven designs under development with international collaboration, four or five will be fast neutron reactors. Four will use fluoride or liquid metal coolants, hence operate at low pressure. Two will be gas-cooled. Most will run at much higher temperatures than today’s water-cooled reactors. See Generation IV Reactors paper.
More than a dozen (Generation III) advanced reactor designs are in various stages of development. Some are evolutionary from the PWR, BWR and CANDU designs above, some are more radical departures.
The former include the Advanced Boiling Water Reactor, a few of which are now operating with others under construction. Advanced PWRs operate in China, Russia and UAE, with more under construction. The best-known radical new design has the fuel as large 'pebbles' and uses helium as coolant, at very high temperature, possibly to drive a turbine directly.
Considering the closed fuel cycle, Generation I-III reactors recycle plutonium (and possibly uranium), while Generation IV are expected to have full actinide recycle.
Many advanced reactor designs are for small units – under 300 MWe – and in the category of small modular reactors (SMRs), since several of them together may comprise a large power plant, maybe built progressively. Apart from the normal oxide fuels, other fuel types are metal, TRISO*, carbide, nitride, or liquid salt.
* TRISO (tristructural-isotropic) particles less than a millimetre in diameter. Each has a kernel (c. 0.5 mm) of uranium oxycarbide (or uranium dioxide), with the uranium enriched up to 20% U-235. This kernel is surrounded by layers of carbon and silicon carbide, giving a containment for fission products which is stable to over 1600°C.
Floating nuclear power plants:
Apart from over 200 nuclear reactors powering various kinds of ships, Rosatom in Russia has set up a subsidiary to supply floating nuclear power plants ranging in size from 70 to 600 MWe. These will be mounted in pairs on a large barge, which will be permanently moored where it is needed to supply power and possibly some desalination to a shore settlement or industrial complex.
The first has two 40 MWe reactors based on those in icebreakers and operates at a remote site in Siberia. Electricity cost is expected to be much lower than from present alternatives. For more information see page on Nuclear Power in Russia.
The Russian KLT-40S is a reactor well proven in icebreakers. Here a 150 MWt unit produces 35 MWe (gross) as well as up to 35 MW of heat for desalination or district heating. These are designed to run 3-4 years between refuelling and it is envisaged that they will be operated in pairs to allow for outages, with on-board refuelling capability and used fuel storage. At the end of a 12-year operating cycle the whole plant is taken to a central facility for two-year overhaul and removal of used fuel, before being returned to service.
Second generation Russian FNPPs will have two 175 MWt, 50 MWe RITM-200M reactor units, each about 1500 tonnes lighter but more powerful than KLT-40S, and thus on a much smaller barge – about 12,000 rather than 21,000 tonnes displacement. Refuelling will be every 10-12 years.
Very similar RITM-200 reactors power the latest Russian icebreakers. In December 2022 Rosatom announced that it had developed nuclear fuel for the RITM-200s.
Power rating of a nuclear reactor:
Nuclear plant reactor power outputs are quoted in three ways:
- Thermal MWt, which depends on the design of the actual nuclear reactor itself, and relates to the quantity and quality of the steam it produces.
- Gross electrical MWe, which indicates the power produced by the attached steam turbine and generator, and also takes into account the ambient temperature for the condenser circuit (cooler means more electric power, warmer means less). Rated gross power assumes certain conditions with both.
- Net electrical MWe, which is the power available to be sent out from the plant to the grid, after deducting the electrical power needed to run the reactor (cooling and feedwater pumps, etc.) and the rest of the plant.*
* Net electrical MWe and gross MWe vary slightly from summer to winter, so normally the lower summer figure, or an average figure, is used. If the summer figure is quoted, plants may show a capacity factor greater than 100% in cooler times.
Watts Bar PWR in Tennessee is reported to run at about 1125 MWe in summer and about 1165 MWe net in winter, due to different condenser cooling water temperatures. Some design options, such as powering the main large feedwater pumps with electric motors (as in the EPR or Hualong One) rather than with steam turbines (taking steam before it gets to the main turbine-generator), explain some of the gross-to-net differences between reactor types. The EPR has a relatively large drop from gross to net MWe for this reason, and as noted above, the Hualong One needs about 20 MWe to run its primary pumps.
The relationship between these is expressed in two ways: as gross thermal efficiency (gross MWe relative to thermal MWt) and as net thermal efficiency (net MWe relative to thermal MWt).
In World Nuclear Association information pages and figures and World Nuclear News items, generally net MWe is used for operating plants, and gross MWe for those under construction or planned/proposed.
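The arithmetic relating the three ratings, and the capacity factor derived from the net rating, can be illustrated as follows; all numbers in the sketch are hypothetical and chosen only to show the calculations:

```python
# Illustrative relationships between the three ratings. All numbers below are
# hypothetical, chosen only to show the arithmetic, not data for any plant.
thermal_mwt = 3400        # reactor thermal output
gross_mwe = 1200          # electrical output at the generator terminals
house_load_mwe = 60       # power used to run the plant itself
net_mwe = gross_mwe - house_load_mwe

print(f"gross thermal efficiency: {gross_mwe / thermal_mwt:.1%}")   # ~35%
print(f"net thermal efficiency:   {net_mwe / thermal_mwt:.1%}")     # ~34%

# Capacity factor: energy actually sent out over a year versus the energy
# the net rating would give if the plant ran flat out all year (8760 hours).
energy_sent_out_gwh = 9200     # hypothetical annual net generation
capacity_factor = energy_sent_out_gwh * 1000 / (net_mwe * 8760)
print(f"capacity factor: {capacity_factor:.1%}")
```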
Lifetime of nuclear reactors:
Most of today's nuclear plants were originally designed for 30- or 40-year operating lives. However, with major investments in systems, structures and components, operating lifetimes can be extended, and in several countries there are active programmes to extend operation.
In the USA nearly all of the almost 100 reactors have been granted operating licence extensions from 40 to 60 years. This justifies significant capital expenditure in upgrading systems and components, including building in extra performance margins. Some will operate for 80 years or more.
Some components simply wear out, corrode or degrade to a low level of efficiency. These need to be replaced. Steam generators are the most prominent and expensive of these, and many have been replaced after about 30 years where the reactor otherwise has the prospect of running for 60 or more years. This is essentially an economic decision. Lesser components are more straightforward to replace as they age. In Candu reactors, pressure tube replacement has been undertaken on some plants after about 30 years of operation.
A second issue is that of obsolescence. For instance, older reactors have analogue instrument and control systems. Some have been replaced with digital systems. Thirdly, the properties of materials may degrade with age, particularly with heat and neutron irradiation. In respect to all these aspects, investment is needed to maintain reliability and safety. Also, periodic safety reviews are undertaken on older plants in line with international safety conventions and principles to ensure that safety margins are maintained.
Another important issue is knowledge management over the full lifecycle from design, through construction and operation to decommissioning for reactors and other facilities. This may span a century and involve several countries, and involve a succession of companies.
The plant lifespan will cover several generations of engineers. Data needs to be transferable across several generations of software and IT hardware, as well as being shared with other operators of similar plants.* Significant modifications may be made to the design over the life of the plant, so original documentation is not sufficient, and loss of design base knowledge can have huge implications (e.g. Pickering A and Bruce A in Ontario).
Knowledge management is often a shared responsibility and is essential for effective decision-making and the achievement of plant safety and economics.
* ISO 15926 is an open data standard covering the portability and interoperability of lifecycle plant information. EPRI also published Advanced Nuclear Technology: New Nuclear Power Plant Information Handover Guide in 2013.
See also section on Ageing, in Safety of Plants paper.
Primary coolants:
The advent of some of the designs mentioned above provides opportunity to review the various primary heat transfer fluids used in nuclear reactors. There is a wide variety – gas, water, light metal, heavy metal and salt:
Water or heavy water must be maintained at very high pressure (1000-2200 psi, 7-15 MPa, 150 atmospheres) to enable it to function well above 100°C, up to 345°C, as in present reactors.
This has a major influence on reactor engineering. However, supercritical water around 25 MPa can give 45% thermal efficiency – as at some fossil-fuel power plants today with outlet temperatures of 600°C, and at ultra supercritical levels (30+ MPa) 50% may be attained.
Water cooling of steam condensers is fairly standard in power plants because it works very well, is relatively inexpensive, and there is a huge experience base. Water (at 75 atm pressure) has good heat capacity – about 4000 kJ/m3 – so is a lot more effective than gas for removing heat, though its thermal conductivity is less than liquid alternatives.
A possible variation on this is having a high proportion of heavy water in the coolant early in the fuel cycle so that more Pu-239 is bred from U-238, thereby extending the cycle and improving uranium utilization. This is known as spectral shift control.
Helium must be used at similar pressure (1000-2000 psi, 7-14 MPa) to maintain sufficient density for efficient operation. However, even at 75 atm pressure its heat capacity is only about 20 kJ/m3. Again, there are engineering implications from the high pressure required, but it can be used in the Brayton cycle to drive a turbine directly.
Carbon dioxide was used in early British reactors, and their current AGRs, which operate at much higher temperatures than light water reactors. It is denser than helium and thus likely to give better thermal conversion efficiency. It also leaks less readily than helium. But at very high temperatures – such as in HTRs – it breaks down, hence the focus on helium. There is now interest in supercritical CO2 for the Brayton cycle.
Sodium, as normally used in fast neutron reactors at around 550ºC, melts at 98°C and boils at 883°C at atmospheric pressure, so despite the need to keep it dry the engineering required to contain it is relatively modest. It has high thermal conductivity and high heat capacity – about 1000 kJ/m3 at 2 atm pressure.
However, normally water/steam is used in the secondary circuit to drive a turbine (Rankine cycle) at lower thermal efficiency than the Brayton cycle. In some designs sodium is in a secondary circuit to steam generators. Sodium does not corrode the metals used in the fuel cladding or primary circuit, nor the fuel itself if there is cladding damage, but it is very reactive generally.
In particular it reacts exothermically with water or steam to liberate hydrogen. It burns in air, but much less vigorously. Sodium has a low neutron capture cross-section, but it is enough for some Na-23 to become Na-24, which is a beta-emitter and very gamma-active with 15-hour half-life, so some shielding is required.
In a large reactor, with about 5000 t sodium per GWe, Na-24 activity reaches an equilibrium level of nearly 1 TBq/kg – a large radioactive inventory. If a reactor needs to be shut down frequently, NaK eutectic which is liquid at room temperature (about 13°C) may be used as coolant, but the potassium is pyrophoric, which increases the hazard. Sodium is about six times more transparent to neutrons than lead.
Lead or lead-bismuth eutectic in fast neutron reactors are capable of higher temperature operation at atmospheric pressure. They are transparent to neutrons, aiding efficiency due to greater spacing between fuel pins which then allows coolant flow by convection for decay heat removal, and since they do not react with water the heat exchanger interface is safer.
They do not burn when exposed to air. However, they are corrosive of fuel cladding and steels, which originally limited temperatures to 550°C (boiling point of lead is 1750°C). With today's materials 650°C can be reached, and in future 800°C is envisaged with the second stage of Generation IV development, using oxide dispersion-strengthened steels.
Lead and Pb-Bi have much higher thermal conductivity than water, but lower than sodium. Rosatom is building a demonstration 300 MWe BREST lead-cooled fast neutron reactor in Russia.
Westinghouse is developing a lead-cooled fast reactor concept and LeadCold in Canada is developing one also, using novel aluminium-steel alloys that are highly corrosion-resistant to 450°C. The compound Ti3SiC2 (titanium silicon carbide) is suggested for primary circuits, resisting corrosion.
While lead has limited activation from neutrons, a problem with Pb-Bi is that it yields toxic polonium (Po-210) activation product, an alpha-emitter with a half-life of 138 days. Pb-Bi melts at a relatively low 125°C (hence eutectic) and boils at 1670°C, Pb melts at 327°C and boils at 1737°C but is very much more abundant and cheaper to produce than bismuth, hence is envisaged for large-scale use in the future, though freezing must be prevented.
The development of nuclear power based on Pb-Bi cooled fast neutron reactors is likely to be limited to a total of 50-100 GWe, basically for small reactors in remote places. In 1998 Russia declassified a lot of research information derived from its experience with submarine reactors, and US interest in using Pb generally or Pb-Bi for small reactors has increased subsequently. The Gen4 Module (Hyperion) reactor will use lead-bismuth eutectic which is 45% Pb, 55% Bi. A secondary circuit generating steam is likely.
For details of lead-bismuth eutectic coolants, see the 2013 IAEA report in References.
SALT: Fluoride salts boil at around 1400°C at atmospheric pressure, so allow several options for use of the heat, including using helium in a secondary Brayton cycle circuit with thermal efficiencies of 48% at 750°C to 59% at 1000°C, for manufacture of hydrogen.
Fluoride salts have a very high boiling temperature, very low vapour pressure even at red heat, very high volumetric heat capacity (4670 kJ/m3 for FLiBe, higher than water at 75 atm pressure), good heat transfer properties, low neutron absorbtion, good neutron moderation capability, are not damaged by radiation, are chemically very stable so absorb all fission products well and do not react violently with air or water, are compatible with graphite, and some are also inert to some common structural metals.
Some gamma-active F-20 is formed by neutron capture, but has very short half-life (11 seconds).
Lithium-beryllium fluoride Li2BeF4 (FLiBe) salt is a eutectic version of LiF (2LiF + BeF2) which solidifies at 459°C and boils at 1430°C. It is favoured in MSR and AHTR/FHR primary cooling and when uncontaminated has a low corrosion effect. LiF without the toxic beryllium solidifies at about 500°C and boils at about 1200°C. FLiNaK (LiF-NaF-KF) is also eutectic and solidifies at 454°C and boils at 1570°C. It has a higher neutron cross-section than FLiBe or LiF but can be used intermediate cooling loops.
For details of molten salt coolants, both as coolant only and as fuel-carriers, see the 2013 IAEA report on Challenges Related to the Use of Liquid Metal and Molten Salt Coolants in Advanced Reactors – Report of the Collaborative Project COOL of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO).
Chloride salts have advantages in fast-spectrum molten salt reactors, having higher solubility for actinides than fluorides. While NaCl has good nuclear, chemical and physical properties its high melting point means it needs to be blended with MgCl2 or CaCl2, the former being preferred in eutectic, and allowing the addition of actinide trichlorides. The major isotope of chlorine, Cl-35 gives rise to Cl-36 as an activation product – a long-lived energetic beta source, so Cl-37 is much preferable in a reactor. In thermal reactors, chlorides are only candidates for secondary cooling loops.
All low-pressure liquid coolants allow all their heat to be delivered at high temperatures, since the temperature drop in heat exchangers is less than with gas coolants. Also, with a good margin between operating and boiling temperatures, passive cooling for decay heat is readily achieved. Since heat exchangers do leak to some small extent, having incompatible primary and secondary coolants can be a problem. The less pressure difference across the heat exchanger, the less is the problem.
The removal of passive decay heat is a vital feature of primary cooling systems, beyond heat transfer to do work. When the fission process stops, fission product decay continues and a substantial amount of heat is added to the core.
At the moment of shutdown, this is about 6.5% of the full power level, but after an hour it drops to about 1.5% as the short-lived fission products decay. After a day, the decay heat falls to 0.4%, and after a week it will be only 0.2%. This heat could melt the core of a light water reactor unless it is reliably dissipated, as shown in the March 2011 accident at Fukushima Daiichi, where about 1.5% of the heat was being generated when the tsunami disabled the cooling.
In passive systems, some kind of convection flow is relied upon. Decay heat removal is more of a problem in gas-cooled reactors due to low thermal inertia, and this has limited the size of individual units.
- Thermal efficiency %, the ratio of gross MWe to MWt. This relates to the difference in temperature between the steam from the reactor and the cooling water. It is often 33-37% in light water reactors, reaching 38% in the latest PWRs.
- Net efficiency %, the ratio of net MWe achieved to MWt. This is a little lower than thermal efficiency, since it allows for the electricity used on site by the plant itself.
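As a rough illustration of these two ratios (a minimal sketch with assumed, hypothetical figures rather than data for any particular plant):

```python
# Minimal sketch: gross thermal efficiency and net efficiency for a
# hypothetical 3400 MWt PWR (all figures assumed for illustration only).
thermal_mwt = 3400.0   # reactor thermal output, MWt
gross_mwe   = 1250.0   # generator output at the terminals, MWe
house_load  = 60.0     # assumed on-site electricity use, MWe

net_mwe = gross_mwe - house_load

thermal_efficiency = gross_mwe / thermal_mwt   # gross MWe / MWt
net_efficiency     = net_mwe / thermal_mwt     # net MWe / MWt

print(f"Thermal efficiency: {thermal_efficiency:.1%}")   # about 37%
print(f"Net efficiency:     {net_efficiency:.1%}")        # about 35%
```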
In World Nuclear Association information pages and figures and World Nuclear News items, generally net MWe is used for operating plants, and gross MWe for those under construction or planned/proposed.
Lifetime of nuclear reactors:
Most of today's nuclear plants were originally designed for 30 or 40-year operating lives. However, with major investment in systems, structures and components, operating lifetimes can be extended, and in several countries there are active programmes to extend operation.
In the USA nearly all of the almost 100 reactors have been granted operating licence extensions from 40 to 60 years. This justifies significant capital expenditure in upgrading systems and components, including building in extra performance margins. Some will operate for 80 years or more.
Some components simply wear out, corrode or degrade to a low level of efficiency. These need to be replaced. Steam generators are the most prominent and expensive of these, and many have been replaced after about 30 years where the reactor otherwise has the prospect of running for 60 or more years. This is essentially an economic decision. Lesser components are more straightforward to replace as they age. In Candu reactors, pressure tube replacement has been undertaken on some plants after about 30 years of operation.
A second issue is that of obsolescence. For instance, older reactors have analogue instrument and control systems, some of which have been replaced with digital systems. Thirdly, the properties of materials may degrade with age, particularly with heat and neutron irradiation. In all these respects, investment is needed to maintain reliability and safety. Also, periodic safety reviews are undertaken on older plants, in line with international safety conventions and principles, to ensure that safety margins are maintained.
Another important issue is knowledge management over the full lifecycle from design, through construction and operation to decommissioning for reactors and other facilities. This may span a century and involve several countries, and involve a succession of companies.
The plant lifespan will cover several generations of engineers. Data needs to be transferable across several generations of software and IT hardware, as well as being shared with other operators of similar plants.* Significant modifications may be made to the design over the life of the plant, so original documentation is not sufficient, and loss of design base knowledge can have huge implications (e.g. Pickering A and Bruce A in Ontario).
Knowledge management is often a shared responsibility and is essential for effective decision-making and the achievement of plant safety and economics.
* ISO 15926 is an open data standard covering the portability and interoperability of lifecycle data. In 2013 EPRI also published Advanced Nuclear Technology: New Nuclear Power Plant Information Handover Guide.
See also the section on Ageing in the Safety of Plants paper.
Primary coolants:
The advent of some of the designs mentioned above provides opportunity to review the various primary heat transfer fluids used in nuclear reactors. There is a wide variety – gas, water, light metal, heavy metal and salt:
Water or heavy water must be maintained at very high pressure (1000-2200 psi, 7-15 MPa, up to about 150 atmospheres) to enable it to function well above 100°C, up to 345°C, as in present reactors.
This has a major influence on reactor engineering. However, supercritical water at around 25 MPa can give 45% thermal efficiency – as at some fossil-fuel power plants today with outlet temperatures of 600°C – and at ultra-supercritical levels (30+ MPa) 50% may be attained.
Water cooling of steam condensers is fairly standard in power plants because it works very well, is relatively inexpensive, and there is a huge experience base. Water (at 75 atm pressure) has good heat capacity – about 4000 kJ/m3 – so is a lot more effective than gas for removing heat, though its thermal conductivity is less than liquid alternatives.
A possible variation on this is having a high proportion of heavy water in the coolant early in the fuel cycle so that more Pu-239 is bred from U-238, thereby extending the cycle and improving uranium utilization. This is known as spectral shift control.
Helium must be used at similar pressure (1000-2000 psi, 7-14 MPa) to maintain sufficient density for efficient operation. However, even at 75 atm pressure its heat capacity is only about 20 kJ/m3. Again, there are engineering implications from the high pressure required, but it can be used in the Brayton cycle to drive a turbine directly.
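To give a feel for what these volumetric heat capacities mean in practice, the sketch below estimates the coolant flow needed to remove a given core power for water, helium and sodium. The kJ/m3 figures are those quoted in this section (treated here as per-kelvin values); the core power and temperature rises are assumed purely for illustration.

```python
# Rough comparison of volumetric coolant flow needed to remove core heat.
core_power_mw = 3000.0            # assumed thermal power to remove, MWt
heat_capacity = {                 # kJ/(m3*K), approximate figures from the text
    "water (75 atm)":  4000.0,
    "helium (75 atm)":   20.0,
    "sodium (~2 atm)": 1000.0,
}
delta_t = {                       # assumed coolant temperature rise, K
    "water (75 atm)":   30.0,
    "helium (75 atm)": 400.0,     # gas-cooled reactors rely on a much larger rise
    "sodium (~2 atm)": 150.0,
}

for coolant, cap in heat_capacity.items():
    # power [kJ/s] = capacity [kJ/(m3*K)] * flow [m3/s] * temperature rise [K]
    flow = core_power_mw * 1000.0 / (cap * delta_t[coolant])
    print(f"{coolant:>16}: {flow:8.1f} m3/s")
```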
Carbon dioxide was used in early British reactors, and their current AGRs, which operate at much higher temperatures than light water reactors. It is denser than helium and thus likely to give better thermal conversion efficiency. It also leaks less readily than helium. But at very high temperatures – such as in HTRs – it breaks down, hence the focus on helium. There is now interest in supercritical CO2 for the Brayton cycle.
Sodium, as normally used in fast neutron reactors at around 550°C, melts at 98°C and boils at 883°C at atmospheric pressure, so despite the need to keep it dry the engineering required to contain it is relatively modest. It has high thermal conductivity and high heat capacity – about 1000 kJ/m3 at 2 atm pressure.
However, normally water/steam is used in the secondary circuit to drive a turbine (Rankine cycle) at lower thermal efficiency than the Brayton cycle. In some designs sodium is in a secondary circuit to steam generators. Sodium does not corrode the metals used in the fuel cladding or primary circuit, nor the fuel itself if there is cladding damage, but it is very reactive generally.
In particular it reacts exothermically with water or steam to liberate hydrogen. It burns in air, but much less vigorously. Sodium has a low neutron capture cross-section, but it is enough for some Na-23 to become Na-24, which is a beta-emitter and very gamma-active with 15-hour half-life, so some shielding is required.
In a large reactor, with about 5000 t of sodium per GWe, Na-24 activity reaches an equilibrium level of nearly 1 TBq/kg – a large radioactive inventory. If a reactor needs to be shut down frequently, NaK eutectic, which is liquid at room temperature (melting at about −13°C), may be used as coolant, but the potassium is pyrophoric, which increases the hazard. Sodium is about six times more transparent to neutrons than lead.
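Taking the figures above at face value, a back-of-envelope sketch of the Na-24 inventory and how it decays after shutdown (simple exponential decay only; the tonnage and specific activity are those quoted, not measured data):

```python
# Rough Na-24 inventory implied by ~5000 t of sodium per GWe at ~1 TBq/kg,
# with a 15-hour half-life (all values taken from the text above).
sodium_mass_kg  = 5000 * 1000      # 5000 tonnes
specific_tbq_kg = 1.0              # equilibrium Na-24 activity, TBq/kg
half_life_h     = 15.0

inventory_tbq = sodium_mass_kg * specific_tbq_kg
print(f"Equilibrium Na-24 inventory: {inventory_tbq:.1e} TBq")   # about 5e6 TBq

for hours in (15, 72, 168):
    remaining = inventory_tbq * 0.5 ** (hours / half_life_h)
    print(f"after {hours:4d} h: {remaining:.2e} TBq")
```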
Lead or lead-bismuth eutectic used in fast neutron reactors is capable of higher-temperature operation at atmospheric pressure. These coolants are relatively transparent to neutrons, allowing greater spacing between fuel pins, which in turn permits coolant flow by convection for decay heat removal; and since they do not react with water, the heat exchanger interface is safer.
They do not burn when exposed to air. However, they are corrosive of fuel cladding and steels, which originally limited temperatures to 550°C (boiling point of lead is 1750°C). With today's materials 650°C can be reached, and in future 800°C is envisaged with the second stage of Generation IV development, using oxide dispersion-strengthened steels.
Lead and Pb-Bi have much higher thermal conductivity than water, but lower than sodium. Rosatom is building a demonstration 300 MWe BREST lead-cooled fast neutron reactor in Russia.
Westinghouse is developing a lead-cooled fast reactor concept and LeadCold in Canada is developing one also, using novel aluminium-steel alloys that are highly corrosion-resistant to 450°C. The compound Ti3SiC2 (titanium silicon carbide) is suggested for primary circuits, resisting corrosion.
While lead has limited activation from neutrons, a problem with Pb-Bi is that it yields toxic polonium (Po-210) activation product, an alpha-emitter with a half-life of 138 days. Pb-Bi melts at a relatively low 125°C (hence eutectic) and boils at 1670°C, Pb melts at 327°C and boils at 1737°C but is very much more abundant and cheaper to produce than bismuth, hence is envisaged for large-scale use in the future, though freezing must be prevented.
The development of nuclear power based on Pb-Bi cooled fast neutron reactors is likely to be limited to a total of 50-100 GWe, basically for small reactors in remote places. In 1998 Russia declassified a lot of research information derived from its experience with submarine reactors, and US interest in using Pb generally or Pb-Bi for small reactors has increased subsequently. The Gen4 Module (Hyperion) reactor will use lead-bismuth eutectic which is 45% Pb, 55% Bi. A secondary circuit generating steam is likely.
For details of lead-bismuth eutectic coolants, see the 2013 IAEA report in References.
SALT: Fluoride salts boil at around 1400°C at atmospheric pressure, so allow several options for use of the heat, including using helium in a secondary Brayton cycle circuit with thermal efficiencies of 48% at 750°C to 59% at 1000°C, for manufacture of hydrogen.
Fluoride salts have a very high boiling temperature, very low vapour pressure even at red heat, very high volumetric heat capacity (4670 kJ/m3 for FLiBe, higher than water at 75 atm pressure), good heat transfer properties, low neutron absorption and good neutron moderation capability. They are not damaged by radiation, are chemically very stable, absorb fission products well, do not react violently with air or water, are compatible with graphite, and some are also inert to some common structural metals.
Some gamma-active F-20 is formed by neutron capture, but has very short half-life (11 seconds).
Lithium-beryllium fluoride Li2BeF4 (FLiBe) salt is a eutectic mixture of LiF and BeF2 (2LiF + BeF2) which solidifies at 459°C and boils at 1430°C. It is favoured for MSR and AHTR/FHR primary cooling and, when uncontaminated, has a low corrosion effect. LiF without the toxic beryllium solidifies at about 500°C and boils at about 1200°C. FLiNaK (LiF-NaF-KF) is also eutectic; it solidifies at 454°C and boils at 1570°C. It has a higher neutron cross-section than FLiBe or LiF but can be used in intermediate cooling loops.
For details of molten salt coolants, both as coolant only and as fuel-carriers, see the 2013 IAEA report on Challenges Related to the Use of Liquid Metal and Molten Salt Coolants in Advanced Reactors – Report of the Collaborative Project COOL of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO).
Chloride salts have advantages in fast-spectrum molten salt reactors, having higher solubility for actinides than fluorides. While NaCl has good nuclear, chemical and physical properties, its high melting point means it needs to be blended with MgCl2 or CaCl2, the former being preferred in eutectic, allowing the addition of actinide trichlorides. The major isotope of chlorine, Cl-35, gives rise to Cl-36 as an activation product – a long-lived energetic beta source – so Cl-37 is much preferable in a reactor. In thermal reactors, chlorides are only candidates for secondary cooling loops.
All low-pressure liquid coolants allow all their heat to be delivered at high temperatures, since the temperature drop in heat exchangers is less than with gas coolants. Also, with a good margin between operating and boiling temperatures, passive cooling for decay heat is readily achieved. Since heat exchangers do leak to some small extent, having incompatible primary and secondary coolants can be a problem. The less pressure difference across the heat exchanger, the less is the problem.
Passive removal of decay heat is a vital feature of primary cooling systems, beyond their role in transferring heat to do work. When the fission process stops, fission product decay continues and a substantial amount of heat is still added to the core.
At the moment of shutdown, this is about 6.5% of the full power level, but after an hour it drops to about 1.5% as the short-lived fission products decay. After a day, the decay heat falls to 0.4%, and after a week it will be only 0.2%. This heat could melt the core of a light water reactor unless it is reliably dissipated, as shown in the March 2011 accident at Fukushima Daiichi, where about 1.5% of the heat was being generated when the tsunami disabled the cooling.
In passive systems, some kind of convection flow is relied upon. Decay heat removal is more of a problem in gas-cooled reactors due to low thermal inertia, and this has limited the size of individual units.
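As a quick illustration of the decay heat figures quoted above, the sketch below applies those percentages to a hypothetical 3000 MWt core; it simply looks up the quoted fractions rather than using a decay-heat correlation.

```python
# Decay heat of an assumed 3000 MWt core, using the approximate fractions
# of full power quoted in the text for various times after shutdown.
full_power_mwt = 3000.0
decay_fraction = {
    "at shutdown": 0.065,
    "1 hour":      0.015,
    "1 day":       0.004,
    "1 week":      0.002,
}

for when, frac in decay_fraction.items():
    print(f"{when:>12}: {full_power_mwt * frac:6.0f} MW of decay heat")
```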
Figure: heat transfer for different primary coolants – low-pressure liquid coolants allow more heat to be delivered, at higher temperatures (Source: Forsberg).
See also information page on Cooling Power Plants.
There is some radioactivity in the cooling water flowing through the core of a water-cooled reactor, due mainly to the activation product nitrogen-16, formed by neutron capture from oxygen. N-16 has a half-life of only 7 seconds but produces high-energy gamma radiation during decay. It is the reason that access to a BWR turbine hall is restricted during actual operation.
Load-following capability:
Nuclear power plants are best run continuously at high capacity to meet base-load demand in a grid system. If their power output is ramped up and down on a daily and weekly basis, efficiency is compromised, and in this respect they are similar to most coal-fired plants. (It is also uneconomic to run them at less than full capacity, since they are expensive to build but cheap to run.)
However, in some situations it is necessary to vary the output according to daily and weekly load cycles on a regular basis, for instance in France, where there is a very high reliance on nuclear power. Areva has developed its Advanced Load-Following Control System for PWRs that automatically adjusts the plant's electrical output according to the needs of the grid operator.
It involves a software upgrade of the reactor control system which varies the plant's output between 50% and 100% of its installed capacity without operator intervention. Since 2008, Areva NP has installed the technology at four German nuclear power units – Philippsburg 2 (now shut down), Isar 2, Brokdorf and Grohnde – as well as at Goesgen in Switzerland.
BWRs can be made to follow loads reasonably easily without burning the core unevenly, by changing the coolant flow rate. Load following is not as readily achieved in a PWR, but in France especially, so-called 'grey' control rods have been used since 1981. The ability of a PWR to run at less than full power for much of the time depends on whether it is early or late in its 18 to 24-month refuelling cycle, and whether it is designed with special control rods which diminish power levels throughout the core without shutting it down.
Thus, though the ability of any individual PWR to run on a sustained basis at low power decreases markedly as it progresses through the refuelling cycle, there is considerable scope for running a fleet of reactors in load-following mode.
European Utility Requirements (EUR) since 2001 specify that new reactor designs must be capable of load-following between 50 and 100% of capacity with a rate of change of electric output of 3-5% per minute. The economic consequences are mainly due to diminished load factor of a capital-intensive plant.
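A quick check of what the EUR figures imply in practice, for an assumed 1600 MWe unit ramping between 50% and 100% of capacity:

```python
# Ramp time and ramp rate in MWe implied by the EUR load-following requirement
# (3-5% of capacity per minute); the 1600 MWe plant size is assumed.
capacity_mwe = 1600.0
start, end   = 0.50, 1.00          # fraction of rated capacity

for ramp_pct_per_min in (3.0, 5.0):
    minutes = (end - start) * 100.0 / ramp_pct_per_min
    mwe_per_min = capacity_mwe * ramp_pct_per_min / 100.0
    print(f"{ramp_pct_per_min}%/min: {minutes:4.1f} min, {mwe_per_min:.0f} MWe per minute")
```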
Further information is given in the Nuclear Power in France information page and in the 2011 Nuclear Energy Agency report, Technical and Economic Aspects of Load Following with Nuclear Power Plants.
As fast neutron reactors become established in future years, their ability to load-follow will be a benefit.
Nuclear reactors for process heat:
Producing steam to drive a turbine and generator is relatively easy, and a light water reactor running at 350°C does this readily. As the above section and Figure show, other types of reactor are required for higher temperatures. A 2010 US Department of Energy document quotes 500°C for a liquid metal cooled reactor (FNR), 860°C for a molten salt reactor (MSR), and 950°C for a high temperature gas-cooled reactor (HTR).
Lower-temperature reactors can be used with supplemental gas heating to reach higher temperatures, though employing an LWR would not be practical or economic. The DOE said that high reactor outlet temperatures in the range 750 to 950°C were required to satisfy all end user requirements evaluated to date for the Next Generation Nuclear Plant.
For more information see page on Nuclear Process Heat for Industry.
Primitive reactors:
The world's oldest known nuclear reactors operated at what is now Oklo in Gabon, West Africa. About 2 billion years ago, at least 16 natural nuclear reactors achieved criticality in a high-grade deposit of uranium ore (a 17th was in the Bangombe deposit 30 km away). Each operated intermittently at about 20 kW thermal, the reaction ceasing whenever the water turned to steam so that it ceased to function as moderator.
At that time the concentration of U-235 in all natural uranium was about 3.6% instead of 0.7% as at present. (U-235 decays much faster than U-238, whose half-life is about the same as the age of the Earth. When the Earth was formed U-235 was about 30% of uranium.)
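The 3.6% figure can be checked by scaling today's isotopic abundances back two billion years using the half-lives of U-235 and U-238; a minimal sketch (half-life values taken from standard tables, not from this text):

```python
# Scale present-day U-235 and U-238 abundances back 2 billion years: each
# isotope was more plentiful by a factor of 2^(t / half-life).
t_back_my = 2000.0                                 # time before present, million years
half_life = {"U-235": 703.8, "U-238": 4468.0}      # million years (standard values)
abundance = {"U-235": 0.720, "U-238": 99.274}      # atom %, present day

past = {iso: abundance[iso] * 2 ** (t_back_my / half_life[iso]) for iso in abundance}

u235_fraction = past["U-235"] / (past["U-235"] + past["U-238"])
print(f"U-235 fraction 2 billion years ago: {u235_fraction:.1%}")   # about 3.7%
```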
These natural chain reactions started spontaneously and continued overall for one or two million years before finally dying away. It appears that each reactor operated in pulses of about 30 minutes. It is estimated that about 130 TWh of heat was produced. (The reactors were discovered when assays of mined uranium showed only 0.717% U-235 instead of 0.720% as everywhere else on the planet.
Further investigation identified particular reactor zones with U-235 levels down to 0.44%. There were also significant concentrations of decay nuclides from fission products of both uranium and plutonium.)
During this long reaction period about 5.4 tonnes of fission products as well as up to two tonnes of plutonium together with other transuranic elements were generated in the orebody.
The initial radioactive products have long since decayed into stable elements but close study of the amount and location of these has shown that there was little movement of radioactive wastes during and after the nuclear reactions. Plutonium and the other transuranics remained immobile.
___________________________________________________________________________
World Nuclear Association (author of the above "Nuclear Power Reactors" material) – Wikipedia:
World Nuclear Association is the international organization that promotes nuclear power and supports the companies that comprise the global nuclear industry. Its members come from all parts of the nuclear fuel cycle, including:
- uranium mining,
- uranium conversion,
- uranium enrichment,
- nuclear fuel fabrication,
- plant manufacture,
- transport,
- and the disposition of used nuclear fuel as well as electricity generation itself.
Together, World Nuclear Association members are responsible for 70% of the world's nuclear power as well as the vast majority of world uranium, conversion and enrichment production.
The Association says it aims to fulfill a dual role for its members: Facilitating their interaction on technical, commercial and policy matters and promoting wider public understanding of nuclear technology. It has a secretariat of around 30 staff.
The Association was founded in 2001 on the basis of the Uranium Institute, itself founded in 1975.
Membership:
World Nuclear Association continues to expand its membership, particularly in non-OECD countries where nuclear power is produced or where this option is under active consideration. Members are located in 44 countries representing 80% of the world's population.
The annual subscription fee for an institutional member is based on its size and scale of activity. Upon receiving an inquiry or application, the Association's London-based secretariat determines the fee according to standard criteria and informs the candidate organisation accordingly. The fee structure provides, in many cases, for significant discounts for organisations located in countries outside the OECD.
A low-fee non-commercial membership is available for organisations with a solely academic, research, policy or regulatory function.
A list of current members is published on the World Nuclear Association website.
Charter of Ethics:
World Nuclear Association has established a Charter of Ethics to serve as a common credo amongst its member organizations. This affirmation of values and principles is intended to summarize the responsibilities of the nuclear industry and the surrounding legal and institutional framework that has been constructed through international cooperation to fulfill U.S. President Dwight D. Eisenhower's vision of 'Atoms for Peace'.
Leadership:
World Nuclear Association members appoint a Director General and elect a 20-member board of management. The current Director General is Sama Bilbao y León. The Chairman of the board is Philippe Knoche, CEO of Orano.
The Vice Chairman is Kirill Komarov, First Deputy Director General at Rosatom. The board of management fulfils statutory duties pertaining to the organization's governance and sets World Nuclear Association policies and strategic objectives, subject to approval by the full membership.
Activities and services:
Industry interaction:
An essential role of World Nuclear Association is to facilitate commercially valuable interaction among its members.
Ongoing World Nuclear Association Working Groups, consisting of members and supported by the secretariat, share information and develop analysis on a range of technical, trade and environmental matters. These subjects include:
- Cooperation in reactor design, evaluation and licensing
- Radiological protection
- Industry economics
- Nuclear law
- Supply chain
- Transport of radioactive materials