Sunday, April 04, 2010

The problem of conveying nuclear effects facts to the public against anti-civil defense propaganda; reviewing the 1977 edition of Glasstone and Dolan


Above: the British Government's 1957 civil defence poster on The Hydrogen Bomb (U.K. National Archives, reference INF 13/281) grossly exaggerates the effects of nuclear weapons, owing to errors on thermal radiation transmission, blast and cratering in Dr Glasstone's June 1957 edition of The Effects of Nuclear Weapons. Thermal transmission was wrongly assumed to be about 50% at all distances beyond 10 miles. The crater size was quoted as 1 mile in diameter, based on the 10 megaton Mike test of 1952 on the water-wave-inundated, saturated porous coral reef around Elugelab Island of Eniwetok Atoll; the correct crater diameter for a 10 megaton surface burst on dry soil is just 0.11 mile, as finally established from gravitational potential energy considerations in 1991.





Above: David I. Feinstein of IIT Research Institute, Chicago, Illinois, developed a computer model (based on effects measured at nuclear tests), showing the large differences in protection between different types of building. Hiroshima and Nagasaki only burned down because they were overwhelmingly composed of wood-frame houses containing easily overturned charcoal cooking braziers, which were alight at the times of the attacks (breakfast time in Hiroshima; lunch preparation time in Nagasaki).

Brick, concrete, and steel frame buildings are far more fire resistant (the Twin Towers fires were due to the injection of aviation fuel, which nuclear weapons don't provide). Feinstein's report AD676183 is based on a 10 megaton nuclear surface burst, which has a longer blast wind drag duration than the smaller Hiroshima and Nagasaki explosions, so the speed attained by blast-carried debris is greater and blast casualty rates are higher for any fixed peak overpressure. There are huge differences in the median (50%) lethal peak overpressure for different situations: outdoors, 50% of people standing without any thermal radiation shadowing will be killed by burns and wind drag impacts at 3.0 psi, but inside a 7-story load-bearing brick warehouse 9.2 psi is needed. The types of buildings predominating in all modern cities provide immensely more protection than was generally available in Hiroshima and Nagasaki. In the predictions above, people are assumed to be standing with no "duck and cover" countermeasures. Injuries are here due primarily to flying glass, flying debris, bodily displacement by wind drag, and flash burns.
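As a rough illustration of how a "median (50%) lethal peak overpressure" enters a casualty model of this kind, the sketch below treats lethality as a log-normal function of peak overpressure. This is a minimal sketch, not Feinstein's actual model: the log-normal form and the spread parameter SIGMA are assumptions for illustration only; only the two median values (3.0 and 9.2 psi) come from the figures quoted above.

```python
# Sketch: lethality probability vs peak overpressure, modelled as a
# log-normal (probit-style) curve centred on the median lethal value.
# The median values (3.0 psi standing in the open, 9.2 psi inside a
# 7-story load-bearing brick warehouse) are the figures quoted above;
# the geometric spread SIGMA is an illustrative assumption, NOT from
# Feinstein's report.
import math

SIGMA = 0.3  # assumed log-standard-deviation (dimensionless)

def lethality(peak_psi, median_lethal_psi, sigma=SIGMA):
    """Fraction of exposed personnel killed at a given peak overpressure."""
    z = (math.log(peak_psi) - math.log(median_lethal_psi)) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF

for situation, median in [("standing in the open", 3.0),
                          ("inside 7-story brick warehouse", 9.2)]:
    print(situation)
    for p in (2.0, 5.0, 10.0):
        print(f"  {p:4.1f} psi -> {100 * lethality(p, median):5.1f}% killed")
```

The point the sketch makes is simply that the same 5 psi which kills most unprotected people standing in the open kills only a few per cent of those inside the warehouse, before any "duck and cover" is even considered.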

Because the blast wave takes time to arrive after the flash over large areas - unlike the popular impression based solely upon the always-lying television propaganda films of nuclear detonations, in which the blast effect is falsely superimposed on the first flash of the explosion - there is enough warning time over most of the damaged area for people to duck and cover effectively, even if no attack warning is given (due to government secrecy or incompetence). Lying prone allows the body length to attenuate some of the direct initial gamma radiation mid-line dose by self-shielding of tissue (see U.K. National Archives report HO 225/14, linked here), cuts down exposure to the thermal radiation by shadowing, and eliminates most of the danger from wind drag and most of the body area exposed to flying glass and other debris, as will be illustrated later in this post with data from Hiroshima.

Note that Feinstein's model for standing personnel is accurate, but the results predicted for prone personnel are exaggerations, because they ignore the shielding from thermal radiation by shadowing and do not properly account for the sliding resistance to translation. In addition, taking cover under a strong table or a strong staircase - the "Morrison shelter" effect in WWII Britain, also demonstrated by 1950s nuclear tests on brick houses - protects reasonably well against the debris collapse of a house, since the weight of falling debris when a house collapses is completely unaffected by the strength of the blast wave.


Above: as we shall see in this blog post, hiding under the stairs or under tables like the "Morrison" led to survival in houses destroyed in the Blitz. This page is from the 1964 edition of The Effects of Nuclear Weapons and shows two supposedly "brick" houses (actually the brick finish was just a veneer: "The exterior walls were of brick veneer and cinder block and the foundational walls of cinder block", according to page 182 in the 1977 edition of The Effects of Nuclear Weapons) subjected to peak overpressures of 5 and 1.7 psi from the 29 kt Apple-2 Nevada tower test of 5 May 1955. In the film of the 5 psi blast hitting the house, the wind pressure peels the roof off while the peak overpressure cracks the front wall and enters the house through the windows; then, as the blast passes and the external pressure drops below ambient pressure in the "negative phase", the blast overpressure inside the house causes the cracked walls to visibly explode outwards. The brick debris goes outwards, not inwards. People ducking under the stairs or a strong table to avoid flying glass could have avoided injury from blast, as well, obviously, as all of the thermal radiation and the larger part of the nuclear radiation dose. By ripping the roof off, the blast reduced the debris load on the floors below, preventing total collapse (some photos taken from other angles make it look as if this house was squashed flat, which is untrue). Of the house at 1.7 psi, Glasstone and Dolan (1977) state: "its condition was such that it could be made available for habitation by shoring and some fairly inexpensive repairs." They also show that precast concrete houses survived 5 psi in that test with just damage to windows and doors.

At higher blast yields, the wind pressure duration increases, so the debris loading problem is actually further reduced (making duck and cover countermeasures still more important), because the wind pressure carries more and more of the rubble horizontally beyond the building, instead of allowing it to fall vertically straight down on to the ground floor. The collapsing load is therefore reduced, unlike the situation with controlled demolition, the Twin Towers (which collapsed due to the heating of the metal frame, weakening it as a result of the intensely burning aviation fuel from the planes, which has little relevance to nuclear weapons), or with conventional TNT bombing (short duration wind pressures) in WWII, which all maximised the debris load per unit area on the ground floor.



Above: Dr Shields Warren and Dr Ashley Webster Oughterson compiled detailed data on the survival of groups of people at various distances in Hiroshima according to the degree of protection they had in their book Medical effects of the atomic bomb in Japan, by the Joint Commission for the Investigation of the Effects of the Atomic Bomb in Japan (McGraw Hill, New York, 1956, p. 103). The high casualty rates from thermal radiation in Japan are not generally applicable to other situations. The U.S. Congressional Office of Technology Assessment study The Effects of Nuclear War in 1979 pointed out that on a cold winter night typically only 1 % of the population would be exposed to thermal radiation, compared to typically 25 % for the summer and daytime. In addition, the weather (atmospheric visibility) affects thermal transmission from bomb to target, just as the wind direction affects fallout delivery to a target in a surface burst. Nobody therefore can assert that a nuclear weapon explosion will automatically produce the effects exhibited on Hiroshima. Even if the atmospheric conditions were similar, other factors would be different and the results would not be the same.



Above: at Hiroshima any opaque object like a hat prevented burns, so if personnel had ducked and covered when they saw the bomb fall, they would have avoided the thermal burns and flying glass injuries which caused the lethal synergism of combined infected wounds and radiation-depressed white blood cell counts, in which the radiation exposure would not have been lethal if unaccompanied by burns and other trauma (see the diagram below). Experiments on glass window breakage similarly show that merely ducking to 10, 20 and 24 degrees below the horizontal behind a glass window reduces the number of skin-penetrating, blast-wind-accelerated, high velocity glass fragments per unit area of skin to about 40%, 15% and 10%, respectively, of the values directly horizontally behind the window (ref.: page 21 of Dr E. Royce Fletcher's report Glass Fragment Hazard from Windows Broken by Airblast, ADA105824, DNA 5593T, 1980; clothing also provides a measure of protection). This demonstrates that even feeble "duck and cover" reduces not just the thermal flash exposure from nuclear weapons, but also the blast fragment laceration hazard.

Some 3 metres behind a glass window 1.37 x 1.83 metres in size, bare skin can be exposed to 140 incised wounds, the maximum number possible for a median glass fragment velocity of 40 m/s; this is reduced to 8 wounds if clothing is worn. For higher median velocities, the necessary shock wave is stronger, producing more fragments of smaller individual mass, so the momentum per fragment in fact falls despite the increase in velocity. (See Figure 16 on page 31 of E. Royce Fletcher, Donald R. Richmond, and John T. Yelverton, “Glass Fragment Hazard from Windows Broken by Airblast,” Lovelace Biomedical and Environmental Research Foundation, report ADA105824, DNA-5593T, 1980. Hence clothing can provide protection factors as high as 140/8 = 18 for reducing the number of glass fragment wounds, permitting the use of clothed parts of the body to shield unclothed parts.)
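A short sketch of the two protective effects just quoted - ducking angle and clothing - is given below, using only the data points cited above from Fletcher's report; the linear interpolation between those points is my assumption for illustration, not part of the report.

```python
# Sketch: relative glass-fragment wound density behind a blasted window,
# built from the data points quoted above from Fletcher et al. (DNA-5593T):
# ducking 10, 20 and 24 degrees below the horizontal reduces the wound
# density to about 40%, 15% and 10% of the value directly behind the
# window.  Linear interpolation between those points is an assumption.

DUCK_DATA = [(0.0, 1.00), (10.0, 0.40), (20.0, 0.15), (24.0, 0.10)]

def relative_wound_density(duck_angle_deg):
    """Interpolate the fraction of wounds remaining for a given ducking angle."""
    pts = DUCK_DATA
    if duck_angle_deg <= pts[0][0]:
        return pts[0][1]
    if duck_angle_deg >= pts[-1][0]:
        return pts[-1][1]
    for (a0, f0), (a1, f1) in zip(pts, pts[1:]):
        if a0 <= duck_angle_deg <= a1:
            return f0 + (f1 - f0) * (duck_angle_deg - a0) / (a1 - a0)

# Clothing protection factor quoted above: 140 wounds on bare skin vs 8 clothed.
CLOTHING_FACTOR = 140 / 8  # about 18

print(relative_wound_density(15))   # ~0.275 of the unprotected wound density
print(1 / CLOTHING_FACTOR)          # clothed skin: ~0.057 of the bare-skin wounds
```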



Above: the data from 11,055 Nagasaki case histories allowed Wayne L. Davis, William L. Baker and Donald L. Summers, in their report Analysis of Japanese Nuclear Casualty Data (Dirkwood Corporation, Albuquerque, DC-FR-1045, 1966, Figure 29, page 43), to analyze the relationship between the percentage of body area burned (up to 40% for unclothed flash burns from the direct line-of-sight thermal flash, since the back and sides will not be exposed if facing the explosion, and up to 100% for flame burns due to fires ignited by overturned charcoal lunch cooking braziers in blasted wooden houses) and mortality for 2nd degree (blistering) and 3rd degree (charring) skin burns. 1st degree burns were basically similar to sunburn and did not result in lethal infections. These curves above apply to combined synergism of thermal and nuclear radiation exposures. Most of the casualties in both cities were due to blast and thermal radiation, with infected wounds made worse by the synergism of initial radiation exposure, which lowers the white blood cell count; see the PDF linked here of James W. Brooks et al., "The Influence of External Body Radiation on Mortality from Thermal Burns", Annals of Surgery, vol. 136 (1952), pp. 533–45. (See also: G. H. Blair et al., "Experimental Study of Effects of Radiation on Wound Healing", in D. Slome, Editor, Wound Healing, Pergamon, N.Y., 1961.) Notice that for both types of burns, if only 20% of the surface area of the body was burned (either blistering or charring), the mortality rate was less than 10%. In medicine, the “rule of nines” allows the percentage of the body surface area to be quickly estimated:

Head and neck equal ................... 9%
Anterior trunk equals ....................18%
Posterior trunk equals ..................18%
Upper extremities (each 9%) ........18%
Lower extremities (each 18%) ......36%
Perineum ..................................... 1%
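A minimal sketch of the rule-of-nines arithmetic follows; the body regions chosen in the example are hypothetical, purely to show how the percentages combine.

```python
# Sketch: estimating percentage of body surface area burned with the
# "rule of nines" listed above.  The regions burned in the example are
# hypothetical; the point is simply that facing a thermal flash exposes
# at most roughly the front half of the body.
RULE_OF_NINES = {
    "head and neck": 9,
    "anterior trunk": 18,
    "posterior trunk": 18,
    "each upper extremity": 9,   # two arms = 18%
    "each lower extremity": 18,  # two legs = 36%
    "perineum": 1,
}

def burned_area(regions):
    """Sum rule-of-nines percentages for a list of (region, fraction burned) pairs."""
    return sum(RULE_OF_NINES[name] * fraction for name, fraction in regions)

# Hypothetical example: front of trunk, front halves of both arms, half the head.
example = [("anterior trunk", 1.0),
           ("each upper extremity", 0.5), ("each upper extremity", 0.5),
           ("head and neck", 0.5)]
print(f"Estimated body area burned: {burned_area(example):.0f}%")  # about 32%
```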

This demonstrates that a reduction of the area of skin exposed to the fireball thermal radiation can be vitally important in reducing the risk of mortality in nuclear war. Duck and cover is not a fraud. The protection afforded by clothing was established by Nevada nuclear tests and is reported in Capabilities of Atomic Weapons TM 23-200, November 1957 (dark clothing may flame and smoke at the higher exposure levels, but if the person is lying on the ground they can roll over to extinguish flames as the thermal pulse subsides):



Above: American data for thermal energy needed for burns under clothing, from page 6.2b of the 1960 (change 2 pages revision) Capabilities of Atomic Weapons, TM 23-200, Confidential.


Above: the 3-storey Bank of Japan was a modern type concrete building which survived 0.25 mile from ground zero in Hiroshima where the peak overpressure was 18 psi. The U.S. Strategic Bombing Survey found that half of the 100 people in the building survived; there were injuries to those standing near windows from horizontally flying glass window fragments, debris, nuclear and thermal radiation. No ignitions occurred in the building from blast or thermal radiation effects, and it was not ignited by burning wooden buildings 25 feet away. At 1.5 hours after the explosion, however, a firebrand from burning trees in the surrounding firestorm of wooden buildings containing overturned charcoal cooking braziers started a fire on the second storey: "The survivors extinguished the blaze with water buckets, preventing further damage. A little later, a fire started on the third floor. It was beyond control when discovered and the third floor burned out. But the fire did not spread to the lower floors." (This quotation is from Panel 26 of the DCPA Attack Environment Manual, Chapter 3, CPG 2-1A3, June 1973, PDF linked here; see also the September 1989 revision, FEMA-127, online PDF linked here, which gives later research data on the fuel loading of cities and the effect of blast in blowing out thermal ignition apart from open pans of ignited liquid fuel or the Encore instant flashover due to the ignition of rooms stuffed full of flammable kindling with windows having a direct view of the fireball.)

“Once a mass fire has formed, the usual prognosis for people trapped within the fire area is not very favorable. [However] ... records show that more than 85 per cent of the 280,000 people in the firestorm area of Hamburg survived ... (Earp, Kathleen F., Deaths from Fire in Large Scale Air Attack with Special Reference to the Hamburg Fire Storm, U.K. Home Office Scientific Advisory Branch report CD/SA-28, 1953; U.K. National Archives document reference: HO 225/28).” - Dr Abraham Broido, “Surviving Fire Effects of Nuclear Detonations”, Bulletin of the Atomic Scientists, March 1963, pp. 20-3.


This fire-fighting also saved the Geibi Bank Company building in Hiroshima which again was not ignited by thermal radiation, despite being close enough to receive a peak overpressure of 8 psi. Firebrands from the firestorm around it caused some furnishing and curtains on the 1st and 3rd floors to ignite: "The fires were extinguished with water buckets by the building occupants. Negligible fire damage resulted. ... If one assumes that Americans can do what the unsuspecting residents of Hiroshima did, self-help measures ... would appear to be effective."



A SANE POLICY (Harvard Crimson, Monday, October 30, 1961):

It has been brought to our attention that certain elements among the passengers and crew favor the installation of "life" boats on this ship. These elements have advanced the excuse that such action would save lives in the event of a maritime disaster such as the ship striking an iceberg. Although we share their concern, we remain unalterably opposed to any consideration of their course of action for the following reasons:

1. This program would lull you into a false sense of security.

2. It would cause undue alarm and destroy your desire to continue your voyage in this ship.

3. It demonstrates a lack of faith in our Captain.

4. The apparent security which "life" boats offer will make our Navigators reckless.

5. These proposals will distract our attention from more important things i.e. building unsinkable ships. They may even lead our builders to false economies and the building of ships that are actually unsafe.

6. In the event of being struck by an iceberg (we will never strike first) the "life" boats would certainly sink along with the ship.

7. If they do not sink, you will only be saved for a worse fate, inevitable death on the open sea.

8. If you should be washed ashore on a desert island, you will be unaccustomed to the hostile environment and will surely die of exposure.

9. If you should be rescued by a passing vessel, you would spend a life of remorse mourning over your lost loved ones.

10. The panic engendered by a collision with an iceberg would destroy all vestiges of civilized human behavior. We shudder at the vision of one man shooting another for the possession of a "life" boat.

11. Such a catastrophe is too horrible to contemplate. Anyone who does contemplate it obviously advocates it.

- Committee for a Sane Navigational Policy: Stephan A. Khiney '62, Robert Fresco '63, Richard W. Bulliet '62, Donald M. Scott '62.


This 1961 Harvard Crimson parody, which likens the vacuous political objections to President Kennedy's civil defence program to a rejection of lifeboats for ships, was quoted and expanded upon by strategist and civil defence advocate Herman Kahn in his 1962 book Thinking About the Unthinkable, where Kahn points out that the anti-lifeboat fanatics could add the deceptive complaint that adding lifeboats increases the weight of the ship, thereby increasing the rate of sinking in a disaster and making the problem worse! The point is that "objections" to civil defence are vacuous and are supported not by facts or by science, but by political bias, groupthink and wishful thinking. (This 1961 lifeboats analogy to civil defence is adapted to ambulances in the August 1962 magazine Fission Fragments, issue number 3, pages 14-5, located in the U.K. National Archives as document HO 229/3, edited by W. F. Greenhalgh of the Home Office civil defence Scientific Adviser’s Branch, London. In the 1980s, after the compulsory introduction of car seat belts, official civil defence publications used seat belts as the analogy instead; for example, the November 1981 U.K. Government publication Civil Defence - Why We Need It states: "Why bother with civil defence? Why bother with wearing a seat belt in a car? Because a seat belt is reckoned to lessen the chance of serious injury in a crash." The first publication of the ship analogy to civil defence is on page 3 of the 1938 British Home Office public civil defence manual, The Protection Of Your Home Against Air Raids: “On board ship, both crew and passengers are instructed where to go and what to do, not when danger threatens, but beforehand. The captain considers it a matter of ordinary routine and everyday precaution that everything is in readiness for a shipwreck which he hopes will never happen.”)

In fact, the analogy of civil defence to lifeboats goes a lot deeper: for many years lifeboats themselves were "debunked" and ridiculed as silly, expensive, useless, etc. That came to a dramatic end in 1912 with the testimony of Commander Charles Lightoller, the Second Officer aboard the Titanic, who was ordered to fill a grossly inadequate number of lifeboats, choosing who would survive and who would die. He recommended to the inquiry that lifeboat capacity be based on the number of passengers and crew instead of ship tonnage, that lifeboat drills be conducted regularly on ships so that passengers know where their lifeboats are and crew know how to operate them, and that early warnings of ice and collision be given by radio communications in all passenger ships. Summarizing the points made in Walter Lord's minute-by-minute account of the disaster based on interviews with 63 survivors, A Night to Remember (Longmans, Green and Corgi, London, 1956), Dr Tom Stonier explained this obvious analogy between the inadequate disaster preparations of the Titanic and the panic due to the inadequacy of civil defence for nuclear attack in Hiroshima, on page 55 of his 1964 book Nuclear Disaster (Penguin, London):

"The immediate survivors of a disaster are ... frequently so frightened or so stunned that they cannot utilize the resources available to them with the greatest effectiveness, nor can they muster the courage to conduct rescue operations. Nowhere is the incapacitating effect of fear more clearly illustrated than by the events that followed the sinking of the Titanic in 1912. Of sixteen hundred men, women, and children in the ice water, only thirteen people were picked up by the half-empty lifeboats nearby. Only one of the eighteen boats made the attempt to return and rescue them. The others failed to lend assistance out of fear of being swamped. In boat after boat, the suggestion to go back and help was countered by the sentiment, 'Why should we lose our lives in a useless attempt to save others from the ship?'

"The damaging effect of fear is therefore not so much that it elicits the flight reaction, which is a healthy, normal, and life-saving response, but that it leads to a paralysis of judgement and action that tends to prevent the maximum use of available resources and thereby prevents preserving the maximum number of lives."


Robert Jungk's book, Children of the Ashes (Heinemann, London, 1961) cites a report in Hiroshima by American psychologist Woodbury Sparks called Panic Among A-Bomb Casualties at Hiroshima which showed that, owing to their surprise at the effects of the Hiroshima nuclear explosion, only 26 percent (153 out of a random sample of 589) of bomb survivors in Hiroshima gave any assistance at all to anybody else after the explosion. Only 5% of people trapped alive by blast debris in Hiroshima were freed by others, while 50% freed themselves before the firestorm took hold. Because British brick houses produce heavier debris than Japanese wooden houses, only 25% of people trapped alive under the stairs or a strong table (Morrison shelter) in collapsed houses after air raids in Britain could free themselves, although the fire risk was lower because bricks do not burn, as the U.K. Home Office proved. Organised rescue efforts (see the earlier post, linked here) could therefore substantially increase the survival chance even in demolished wooden buildings.

For a detailed statistical analysis of the paltry attempt at rescue in Japan, see Table 6 on p. 101 of Wayne L. Davis, William L. Baker and Donald L. Summers, Analysis of Japanese Nuclear Casualty Data (Dirkwood Corporation, Albuquerque, DC-FR-1045, 1966, linked here), based on 35,099 personnel (24,044 in Hiroshima and 11,055 in Nagasaki). The figure of 5% rescued was a maximum.




Above: the new Blitz-experience-based Shelter at Home handbook, published in June 1941, marked a shift of civil defence policy away from cold, damp, flooded outdoor shelters toward the more popular home Morrison protected bed shelter. The British Government under Prime Minister Chamberlain had failed to properly fund civil defence research against high explosives in good time before World War II, resulting in idealistic solutions which were not properly tested for practical effectiveness before being deployed in panic after the September 1938 Munich crisis (when Prime Minister Chamberlain was intimidated in his second meeting with Hitler). The panic civil defence countermeasures were outdoor trenches in public parks and the "Anderson" shelter, a corrugated steel arch buried in the ground and covered with earth.

Most of the Nazi bombing of Britain occurred during the Blitz (between 7 September 1940 and 10 May 1941), when the U.K. Government's Shelter Census of central London in November 1940 found that 60% of the public were sleeping in their own homes during air raids, instead of getting up and dressed to go to a shelter upon the attack warning siren. Only 4% used the Underground system shelters, 9% used other public air raid shelters, and 27% used domestic Anderson shelters (Morrison indoor shelters were not even introduced until March 1941). The 60% who did not go out to any kind of shelter during air raids:

"stayed in their homes, sleeping downstairs, under stairs, under tables, in cupboards."


(For this census, see the "Home Shelters" tab at the internet site linked here, but beware that it states that Morrison shelters were then available, which is incorrect for November 1940.)


"... distribution of 'Andersons' had begun before their testing had been completed. At the opening of 1939 'load tests' had shown that 'Andersons' were strong enough to bear the weight of any debris falling on them from the type of house for which they were intended. But it was not until some months later [Sectional Steel Shelters, Cmd. 6055, July 1939] that a series of 'explosion tests' proved conclusively [that they] could withstand without damage a 500 lb. [227 kg] high explosive bomb falling at least fifty feet away [equivalent to a 12 kt Hiroshima nuclear bomb some 50(12,000/0.227)1/3 = 1,880 feet away: thus, Anderson shelters would have survived undamaged at ground zero after the air burst that high over Hiroshima] ... It was established at the same time that they would protect their occupants against blast from a bomb of this size bursting in the open at a distance of thirty feet or more. But this soundness of the 'Andersons' from a structural soundpoint, it soon became clear, was counterbalanced by an important practical defect, namely liability to flooding."

- Terence H. O'Brien, Civil Defence, H.M. Stationery Office, London, 1955, p. 196.
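The bracketed note in that quotation is just cube-root blast scaling; a minimal sketch of the arithmetic (treating the 500 lb bomb as 0.227 tons of TNT equivalent, the same simplification used in the bracketed note) is:

```python
# Sketch: cube-root blast scaling used in the bracketed note above.
# Distances for equal peak overpressure scale as the cube root of the
# ratio of explosive yields.  Treating the 500 lb (227 kg) bomb as
# 0.227 tons of TNT equivalent is the same simplification as in the note.
small_yield_tons = 0.227     # 500 lb bomb, taken as its own weight of TNT
large_yield_tons = 12_000.0  # 12 kt Hiroshima-sized explosion
distance_small_ft = 50.0     # proof-test stand-off for an undamaged Anderson shelter

scale = (large_yield_tons / small_yield_tons) ** (1.0 / 3.0)
distance_large_ft = distance_small_ft * scale
print(f"Equivalent stand-off for 12 kt: {distance_large_ft:,.0f} feet")  # ~1,876 ft
```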


There were very good reasons for the failure of 60% of the public to utilize outdoor shelters: Britain has a cool, wet climate with a high water table, so any below-ground structure rapidly became damp and cold during the winter, and flooded by rain. Before World War II it was believed that Nazi bombing would be in the daytime for reasons of accuracy, like World War I bombing. In fact, the Blitz was nighttime bombing, when people were trying to sleep, because anti-aircraft guns and fighter aircraft found it much harder to shoot down bombers in the dark, despite searchlights and early radar sets. London was bombed on 57 consecutive nights. Most people simply did not have time, upon hearing the air raid warning siren, to get dressed and go out to a cold, damp or flooded public or back-yard Anderson shelter, which in winter was often kept dark to allow some occupants to try to sleep, and was uncomfortable compared with a bed at home (London's underground rail communal shelters being an exception to the rule). Attempts to evacuate millions of women and children proved a failure, since most evacuees returned home within a few months of the outbreak of war, when the predicted air raids had still not occurred.

See Richard M. Titmuss, Problems of Social Policy, H.M. Stationery Office, London, 1950, online HTML version linked here, and Terence H. O'Brien, Civil Defence, H.M. Stationery Office, London, 1955, online PDF linked here. O'Brien at pp. 325-7 points out that the Government plan was to evacuate 4,000,000 women and children before the outbreak of war in early September 1939, but, unknown to the Government, fewer than half of those decided to leave, and furthermore:

"By Christmas more than one-half of the 1,500,000 mothers and children concerned had returned home; in the London and Liverpool areas about two-thirds of the evacuated children had returned. (The first count taken in January 1940 disclosed that about 900,000 had returned.) ... this evacuation scheme had, as Mr Titmuss says, 'largely failed to achieve its object of removing for the duration of the war most of the mothers and children in the target areas'."


Consequently, there was a shift based on these experiences away from the large failures of the "outdoor" shelter and evacuation policy, towards providing better protection within the home itself. Sir John Fleetwood Baker and his assistant Edward Leader-Williams at the Ministry of Home Security developed an indoor shelter which could absorb the energy of the falling debris from the collapse of a normal house. There is a really brilliant scientific proof film (on the linked Cambridge University engineering faculty internet site, right click on the video link to save the 7 MB mpg video file) showing precisely the mechanism by which the Morrison shelter deflects slightly in order to absorb the kinetic energy of a falling house by plastic deformation: Baker simply puts his own pocket watch inside a tiny model Morrison shelter within a model house, then slams down a 10 pound load to represent the debris of the collapsing house, and his watch remains safe!
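The energy balance that Baker's demonstration illustrates can be sketched in a few lines; all of the numbers below are illustrative assumptions, not the actual Morrison design figures.

```python
# Sketch: the Morrison shelter absorbs the kinetic energy of falling
# debris by plastic deformation of its steel top.  If the top yields at
# a roughly constant force F over a plastic deflection d, it absorbs
# energy F*d, which must match the potential energy m*g*h of the debris.
# All numbers below are illustrative assumptions, not design values.
g = 9.81                      # m/s^2

debris_mass_kg = 1000.0       # assumed mass of collapsing upper-floor debris
drop_height_m = 2.5           # assumed fall height of that debris

energy_to_absorb = debris_mass_kg * g * drop_height_m   # joules

yield_force_N = 250_000.0     # assumed constant crushing resistance of the top
plastic_deflection_m = energy_to_absorb / yield_force_N

print(f"Energy to absorb: {energy_to_absorb / 1000:.1f} kJ")          # ~24.5 kJ
print(f"Plastic deflection needed: {plastic_deflection_m * 100:.1f} cm")  # ~10 cm
```

The design trade-off the sketch shows is that a modest permanent deflection of a stiff steel top is enough to soak up the whole potential energy of a collapsing floor, which is why the occupants underneath feel only the yield force, not an impact.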

These Morrison table shelters were named after the Minister of Home Security (Herbert Morrison) and were introduced in March 1941. More than 500,000 were issued by November 1941, and each consisted essentially of a strong steel dinner table with space inside for a mattress for sleeping. They were 6' 6" long x 4' wide x 2' 9" high, with a top consisting of 1/8" solid steel plate, welded wire mesh sides and a metal lath floor. One wire side lifted up, allowing people to crawl inside the structure, where there was sleeping space for several people. These were placed in a ground floor (or basement) "refuge room", a technique revived for blast, thermal flash and fallout radiation shielding by the U.K. Government in its 1980 civil defence manual against nuclear attack, Protect and Survive. Edward Leader-Williams, assistant to Morrison shelter designer Sir John Baker during the experiments, worked in the U.K. Home Office Scientific Advisory Branch until 1965, and in 1955 initiated the basic Protect and Survive "inner refuge" research against nuclear war.

"In one examination of 44 severely damaged houses it was found that three people had been killed, 13 seriously injured, and 16 slightly injured out of a total of 136 people who had occupied Morrison shelters; thus 120 out of 136 escaped from severely bomb-damaged houses without serious injury. Furthermore it was discovered that the fatalities had occurred in a house which had suffered a direct hit, and some of the severely injured were in shelters sited incorrectly within the houses." - Wikipedia


The 22 May 1940 booklet Your Home as an Air Raid Shelter had already marked a change in policy as the discomfort and flooding of outdoor Anderson shelters became clear. As a result of the experience gained during the Blitz bombing, it was revised and greatly improved in June 1941 to create the new handbook (featuring the indoor Morrison shelter), Shelter at Home, which states:

“people have often been rescued from demolished houses because they had taken shelter under an ordinary table ... strong enough to bear the weight of the falling bedroom floor.”


The discovery of this table "duck and cover" effectiveness in air raids led to a revolutionary shelter design; the indoor Morrison table shelter of 1941. (For publication dates of these booklets, see T. H. O’Brien, Civil Defence, H.M. Stationery Office, 1955, pages 371 and 529.) It is the forerunner to the “inner core refuge” adopted for protection against thermal flash, blasted flying debris and fallout radiation in a nuclear war in the 1980 booklet Protect and Survive.


Above: the facts about the life-saving ability of the Morrison table shelter during aerial bombing in World War II Britain: it protects against the collapse of buildings regardless of whether that collapse is caused by TNT, a hurricane, an earthquake, or a nuclear bomb. A U.K. Government press release from November 1941, Morrison Shelters in Recent Air Raids, states:

“A report of Ministry of Home Security experts on 39 cases of bombing incidents in different parts of Britain covering all those for which full particulars are available in which Morrison shelters were involved shows how well they have stood up to severe tests of heavy bombing.

“All the incidents were serious. Many of the incidents involved direct hits on the houses concerned, a risk against which it was never claimed these shelters would afford protection. In all of them the houses in which shelters were placed were within the radius of damage by bombs; in 24 there was complete demolition of the house on the shelter.

“A hundred and nineteen people were sheltering in these ‘Morrisons’ and only four were killed. So that 115 out of 119 people were saved. Of these only 7 were seriously injured and 14 slightly injured while 94 escaped uninjured. The majority were able to leave their shelters unaided.”


The top set of instructions above, for assembling the Morrison shelter and using it as a table between air-raids, is taken from the instruction manual How to put up your Morrison “Table” Shelter, issued by the Ministry of Home Security, H.M. Stationery Office, March 1941 (National Archives document reference HO 186/580), which states:

“The walls of most houses give good shelter from blast and splinters from a bomb falling nearby. The bomb, however, may also bring down part of the house, and additional protection from the fall of walls, floors and ceilings is therefore very essential. This is what the indoor shelter has been designed to give. Where to put it up, which floor? Ground floor if you have no basement. Basement, if you have one. ... Protect windows of the shelter room with fabric netting or cellulose film stuck to the glass (as recommended in Your Home as an Air Raid Shelter). The sides of your table shelter will not keep out small glass splinters.”


According to the article, "Air Raid Precautions" in Nature, vol. 146, p. 125, 27 July 1940, "more than 700,000 copies" of Your Home as an Air Raid Shelter, were sold by the end of July 1940.

Your Home as an Air Raid Shelter



Above: The U.K. Government film, Your Home as an Air-Raid Shelter was issued in 1940 to accompany a manual of the same title, giving improved information based on bombing experience.

“The public outcry about conditions in the largest public shelters, often without sanitation or even lighting, and the appalling inadequacy of the over-loaded and ill-equipped rest centres for the bombed-out led to immediate improvements, but cost Sir John Anderson his job. ... His successor as Home Secretary, Herbert Morrison ...

“The growing reluctance of many people to go out of doors led the new Home Secretary to look again at the need for an indoor shelter… The result was the Morrison shelter, which resembled a large steel table … During the day it could be used as a table and at night it could, with a slight squeeze, accommodate two adults and two small children, lying down. The first were delivered in March 1941 and by the end of the war about 1,100,000 were in use, including a few two-tier models for larger families. Morrisons were supplied free to people earning up to £350 a year and were on sale at about £7 to people earning more. … the Morrison proved the most successful shelter of the war, particularly during the ‘hit and run’ and flying-bomb raids when a family had only a few seconds to get under cover. It was also a good deal easier to erect than an Anderson, and while most people remember their nights in the Anderson with horror, memories of the Morrison shelter are usually good-humoured.

“... A government leaflet, Shelter at Home, pointed out that ‘people have often been rescued from demolished houses because they had taken shelter under an ordinary table... strong enough to bear the weight of the falling bedroom floor’. I frequently worked beneath the solid oak tables in the school library during ‘imminent danger periods’ and, particularly before the arrival of the Morrison, families became accomplished at squeezing beneath the dining table during interrupted meals. ... Although the casualties were mercifully far fewer than expected, the damage to property was far greater. From September 1940 to May 1941 in London alone 1,150,000 houses were damaged ...”


- Norman Longmate, How we Lived Then - A history of everyday life during the Second World War, Pimlico, 1971.

Contents

1. Introduction
2. Height of burst blast curves: American and British analyses
3. Ground shock, cratering, water waves
4. Thermal phenomenology
5. Initial nuclear radiation
6. Fallout
7. Radio and radar temporary attenuation by ionization
8. Electronic and electrical equipment damage by EMP
9. Biological effects

1. Introduction

Each edition of the U.S. Department of Defense's book, The Effects of Nuclear Weapons, has coincided with significant public concern over a nuclear weapons threat. The first version, dated 1950, was a response to the development of nuclear weapons by Russia, which tested its first nuclear weapon in 1949 (the preparation of the 1950 edition actually began before the Russian test, when it was clear that Russia would have the bomb within a few years). The 1957 edition, based mainly on the Nevada tests up to and including Operation Teapot in 1955 and the Pacific tests up to and including Operation Redwing in 1956, was a response to the bigger threat from the megaton range hydrogen bombs and their fallout, and to the increasing test data. The 1962/4 edition was a completely rewritten, expanded version, written in response to the new data from the many 1957 Nevada tests of Operation Plumbbob and the 1958 Pacific tests of Operation Hardtack; to support President Kennedy's civil defense initiative, it has the best final chapter on the principles of civil defense, silencing most of the "protection is futile" appeasers, whose aim is always to exaggerate the effects to such a lying extent that civil defense disappears as an option and the only option left for avoiding the risk of "total annihilation" is surrender to secret terrorist regimes (as occurred in the 1930s, when Germany secretly rearmed and was able to intimidate a largely disarmed world).

The 1977 edition was published at a time when the Soviet strategic nuclear threat was finally outpacing the American stockpile, and is technically the most sophisticated. It is really a military textbook. It is not a compendium of all of the best nuclear test data, let alone of the scientific literature, although it does have many excellent chapter bibliographies listing very important research reports and books on each topic. Instead, it is a state-of-the-art summary of the results that have come out of generally secret research, as we will show later on. The result is a reliance on authority to a certain extent, although in some cases - the best example being the chapter on "Radio and Radar Effects" - most of the basic calculations are clearly set out in detail. Because detailed comparisons between theory and nuclear test data are generally excluded from the book due to secrecy in 1977, it suffers from an overly "theoretical" feel. The secrecy of nuclear test effects data, and even the full U.S. Strategic Bombing Survey reports on Hiroshima and Nagasaki, misleads the public (particularly many academic scientists) into believing falsely that such data simply does not exist, and was never measured accurately. One problem with The Effects of Nuclear Weapons in every edition has always been that it tends to encourage this belief by excluding a full comparison of theory versus all the (secret) nuclear test data.

People believe that the book contains everything known, rather than just being a summary of the conclusions of far more detailed secret research reports, and therefore they assume that human knowledge of nuclear effects is more "theoretical" than it really is. For instance, going right back to 1945, only a brief summary of the U.S. Strategic Bombing Survey report on Hiroshima and Nagasaki, excluding the mechanism of the firestorm, was published in unclassified form in 1947. The full report has never been published to this very day!

It was declassified in 1972 and is held in the U.K. National Archives in Kew, London (document references: AIR 48/160, AIR 48/161, AIR 48/162, AIR 48/163, AIR 48/164, and AIR 48/165). It is six volumes long (three volumes on Hiroshima and three on Nagasaki), is full of tables and graphs of data which were excluded from the 1947 published resume, and has the word "Secret" printed at the top of every page.

The originally ‘secret’ May 1947 U.S. Strategic Bombing Survey report on Hiroshima, pp. 4-6:

‘Six persons who had been in reinforced-concrete buildings within 3,200 feet [975 m] of air zero stated that black cotton black-out curtains were ignited by flash heat... A large proportion of over 1,000 persons questioned was, however, in agreement that a great majority of the original fires were started by debris falling on kitchen charcoal fires... There had been practically no rain in the city for about 3 weeks. The velocity of the wind ... was not more than 5 miles [8 km] per hour....

‘The fire wind, which blew always toward the burning area, reached a maximum velocity of 30 to 40 miles [48-64 km] per hour 2 to 3 hours after the explosion ... Hundreds of fires were reported to have started in the centre of the city within 10 minutes after the explosion... almost no effort was made to fight this conflagration within the outer perimeter which finally encompassed 4.4 square miles [11 square km]. Most of the fire had burned itself out or had been extinguished on the fringe by early evening ... There were no automatic sprinkler systems in building...’


The originally ‘secret’ May 1947 U.S. Strategic Bombing Survey report on Nagasaki states (vol. 1, p. 10):

‘... the raid alarm was not given ... until 7 minutes after the atomic bomb had exploded ... less than 400 persons were in the tunnel shelters which had capacities totalling approximately 70,000.’


The exclusion of such vital facts on Hiroshima and Nagasaki from public understanding of the effects of nuclear weapons is tragic for civil defense. The official unclassified presentations on the subject are totally misleading "groupthink" deceptions which served a purpose in helping to bolster the nuclear threat in order to deter Soviet aggression with a limited American nuclear stockpile during the Cold War, but it is unhelpful today where the main threats are from proliferation. Exaggerating nuclear threats makes nuclear intimidation more attractive to rogue states, just as the exaggeration of aerial attack effects of poison gas, incendiaries and high explosives in the 1920s to 1930s made such weapons attractive to the terrorist states in Europe.

Unlike the previous editions, which were cheaply printed in paperback with an empty pocket for the "Nuclear Bomb Effects Computer" inside the back cover (the computer was sold separately), the 1977 edition of The Effects of Nuclear Weapons was published in hardback with the slide-rule included. It is considerably shorter than the 1964 edition, since the civil defense final chapter was removed. It presents example problems and solutions in textbook format, and indeed a major purpose was the instruction of military personnel in nuclear weapons effects, not civil defense. Philip J. Dolan drafted the 1977 edition at Stanford Research Institute, California, shortly after editing the 1972 first two-part version of the U.S. Department of Defense's Secret - Restricted Data, 1,650-page Capabilities of Nuclear Weapons, Effects Manual EM-1. Many of the new diagrams in the 1977 edition of The Effects of Nuclear Weapons were declassified from the 1974 NATO edition of EM-1.

The military focus is shown by the relatively brief treatment of fallout radiation, ignoring the prediction of specific activity of fallout (the visibility of its mass deposit, as related to its hazardous radioactivity content), the uptake of fallout nuclides in food from contaminated soil, solubility of fallout from different kinds of detonation, decontamination measures and their effectiveness, and so on. The major focus is on blast, shock, cratering, thermal effects, initial nuclear radiation, and radio and radar effects. The chapter on "Radio and Radar Effects" is brilliant technically and scientifically, but is written up more like a detailed scientific review paper than a brief book chapter. All of the material there is excellent, but requires a lot of very close technical study to understand. Some of the material, such as detailed equations for calculations of the disappearance of electrons in the ionosphere due to combination with neutral particles or their ions, is more important for detailed EMP field predictions, which were not provided due to classification.

The ionization from a nuclear explosion attenuates or refracts radio and radar signals for relatively short periods of time for the extremely high frequencies that are used now. Satellite based communications systems are designed to use extremely high frequencies to penetrate the Earth's natural ionosphere, and such frequencies will also penetrate most nuclear explosion ionization regions at high altitudes, with only a temporary disruption at most. Radio systems that don't use satellites or the bouncing of signals off the ionosphere are immune to ionization regions unless the explosion is so close that blast and initial nuclear radiation would be of overriding importance. The permanent damage due to EMP is of more concern. The EMP can of course be degraded by the ionization. For instance, in a surface burst, the EMP radiated by the net upward Compton current above ground zero is partially absorbed by the air ionization caused by the gamma rays moving outwards in more horizontal directions. This reduces the observed EMP at a long distance from a surface burst to less than the field strength that could be predicted from the net vertical Compton current if air ionization is ignored.

The 1977 edition of The Effects of Nuclear Weapons is not a book whose content can be assimilated quickly. I think that a lot of the information could be communicated in a faster, more appealing way by changing the format to a larger, A4 size, and laying out the pictures and diagrams in a professionally designed way which helps to achieve rapid communication, preferably with equations briefly summarized inside graphical diagrams (so that they are available for checking and computation, but can be ignored if not needed). Pictures of blast destruction to buildings could be arranged in order of blast pressures and durations, showing at a glance the visible effects of different pressures and durations. In the case of Hiroshima and Nagasaki, the carefully documented survival rates in the various kinds of buildings could also be included, as well as declassified information on how the fires began and when buildings caught fire. Photographs of nuclear explosions should be laid out to demonstrate more clearly the visible time-sequence of detonation effects for the various kinds of burst and yields (surface bursts, underwater bursts, air bursts, and high altitude bursts). The 1964 civil defense material should be similarly revised and included.

This is because nuclear weapons effects data need to be used to actively debunk popular pro-disarmament lies of the sort circulated before World War II, which allowed Hitler to start the war by acquiring weapons of mass destruction such as gas. Gas was never used militarily in that war, since civil defense countermeasures negated the threat and, like some nuclear weapons effects, gas is wind dependent and not a reliable military weapon; but gas was used against millions of defenseless people murdered in gas chambers. Those people need not have been murdered if the pro-disarmament protestors of the 1920s and 1930s had not exaggerated to the point of outright lies against civil defence, but had been honest and admitted that simple civil defence will save lives in mass bombing and all other forms of attack. Exaggerations of Hitler's air bombing threat allowed public sympathy to support Prime Minister Chamberlain in ignoring the plight of the Jews and appeasing Hitler.

The public must never be misled, and such lies must be opposed, on the facts of the threat of nuclear weapons and on the effectiveness of the countermeasures the public themselves should and must take in the event of terrorist attack. Allowing lying anti-civil defense propaganda in the 1920s and 1930s led to appeasement which came close to permitting a Thousand Year Reich of global fascism.


The CND internet page on "The Effects of Nuclear Weapons" is a concise compendium of some of the major anti-civil defense lies which are promoted actively by the extreme pro-disarmament elements of the media, which shows the areas of concern that nuclear weapons effects information needs to debunk effectively:

"Over a wide area the resulting heat flash literally vaporises all human tissue. At Hiroshima, within a radius of half a mile, the only remains of most of the people caught in the open were their shadows burnt into stone."


- Complete lie against "duck and cover" facts that is shamefully unopposed by the nuclear effects and civil defense authorities. Nothing in Hiroshima was vaporized. Even objects placed within nuclear fireballs weren't instantly vaporized except right beside the bomb: the bomb itself is vaporized but the materials around it are heated to immense temperatures only for a very brief period of time, too short for the heat to diffuse into the depth of the material. A very thin surface layer is heated and charred, not the full thickness of the material. Any cover or intervening object provided protection.

"People inside buildings or otherwise shielded will be indirectly killed by the blast and heat effects as buildings collapse and all inflammable materials burst into flames. The immediate death rate will be over 90%."


- Complete lie, disproved at the Encore Nevada nuclear test on thermal ignition in 1953. Only inflammable trash and junk like old newspapers in the direct line-of-sight to the fireball can be ignited. Building collapse injuries were averted by duck and cover under strong tables (Morrisons) in the Blitz bombing. The immediate death rate in Hiroshima for people on the ground floors of concrete buildings was over 120 times less than for people in the open: a reduction in median lethal range from 1.3 miles to just 0.12 mile, which reduces areas and casualties by a factor of (1.3/0.12)^2 (Glasstone and Dolan 1977, p. 546, Table 12.17).
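That area factor can be checked with one line of arithmetic; a minimal sketch:

```python
# Sketch: if the median lethal range falls from 1.3 miles in the open to
# 0.12 mile on the ground floor of a concrete building (Glasstone and
# Dolan 1977, Table 12.17), the corresponding lethal *area* falls by the
# square of the range ratio, for a uniformly distributed population.
open_range_miles = 1.3
concrete_building_range_miles = 0.12

area_reduction = (open_range_miles / concrete_building_range_miles) ** 2
print(f"Lethal area reduced by a factor of about {area_reduction:.0f}")  # ~117
```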

"The many individual fires combine to produce a fire storm as all the oxygen is consumed. As the heat rises, air is drawn in from the periphery at or near ground level. This both results in lethal, hurricane force winds and perpetuates the fire as the fresh oxygen is burnt. (Such fire storms have also been produced by intense, large scale conventional bombing in cities such as Hamburg and Tokyo)."


- Complete lie: firestorms require incendiary bombing of multistorey medieval-style wooden houses like the medieval wooden part of Hamburg in World War II. In Hiroshima, the "firestorm" mechanism was slower than incendiaries, because the originally Secret U.S. Strategic Bombing Survey reports on Hiroshima show that it depended upon the ignition of black-coloured air raid blackout curtains and the overturning of charcoal cooking stoves in blasted wooden houses. By the time the firestorm was going, people had left the area. The Hiroshima firestorm was trivial in intensity compared to Hamburg, because the wooden buildings in Hiroshima were mainly only two storeys, not the 3-5 storeys of Hamburg. None of this has any relevance to brick and concrete buildings such as those subjected to incendiary bombing in Britain. Mass fires in Britain occurred in warehouses and dock areas filled with inflammables; modern homes proved very resistant to fire.

Dr Fred Charles Iklé (b. 1924) makes the point in his 1958 book The Social Impact of Bomb Destruction (University of Oklahoma Press) that five months after the July 1943 firestorm in Hamburg, the city’s production had recovered to 80% of that before the firestorm.

Dr Iklé also makes the point on page 147 that the lessons learned from Hiroshima were quickly applied to help survivors of the nuclear attack on Nagasaki; on 10 August (Day 2 after Nagasaki), emergency rations were brought in to feed 67,000 survivors: ‘this represents a remarkable feat of organisation that illustrates the great possibilities of mass feeding.’

Examine the post-attack recovery rate in Hiroshima before any significant outside help arrived:

7 August (Day 2): Survivors open bridges and roads to pedestrian traffic, clearing away debris.

8 August (Day 3): Tracks cleared and trains to Hiroshima resumed.

9 August (Day 4): Street trolley bus (electric tram) lines return to service.


Next, consider what civil defence did during the post-attack recovery process to help aid survivors in Nagasaki, subjected to a nuclear explosion just 3 days after Hiroshima:

9 August (Day 1): Emergency rations are brought in to feed 25,000 survivors (though less than the required amount, due to bureaucratic confusion). The survivors lived in the air-raid shelters, which had survived.

10 August (Day 2): Emergency rations are brought in to feed 67,000 survivors.

7 October (Day 60): The first green shoots of recovery appeared on an irradiated and firestorm burned chestnut tree, photographed by U.S. Air Force observers, and published in the U.S. Congress book, The Effects of Nuclear War, 1979, p. 114.


Robert Jungk, Children of the Ashes (Heinemann, London, 1961): 'one morning in April 1946, the Vice-Mayor [of Hiroshima] gazed for a long time. For what met his eyes was a sight he had scarcely hoped ever to see again ... The blackness of the branches was dappled with the brilliant white of cherry buds opening into blossom.'

Robert Jungk carefully investigated the history of the recovery in Hiroshima by interviewing the people involved and collecting first hand reports, and gives further interesting details in his book Children of the Ashes (Heinemann, London, 1961):

1. On 31 August 1945: "the first locally produced and locally printed post-war edition of the Chugoku Shimbun was on sale in the streets of Hiroshima ... 'Our darkroom was an air-raid shelter dug into the hillside', one of the editors remembers, 'but our type had to be cast in the open air, under the sunny sky'." (The Chugoku newspaper building, 870 m east of ground zero in Hiroshima, survived the blast but was full of paper and was later gutted by the firestorm; see the photo of it, linked here.)

2. On 7 September 1945, the Chugoku Shimbun reported that Hiroshima then had a population estimated to be 130,000.

3. On 10 September 1945, electricity was reconnected to some parts of Hiroshima: "huts made of planks quickly knocked together ... already had electric light."

4. On 5 November 1945, the Chugoku Shimbun reported that - despite inertia and delays due to "the rigidity of bureaucratic procedure" which was hindering the recovery rate - a lot of progress was being made:

"Housing. The building of houses is to be systematically begun on 15 November. ...

"Tramways. At present, ten trams are in commission on the main route, eight on the Miyajima route and five muncipal buses. These twenty-three vehicles must cater for an average of 42,000 persons daily."


It is a fact that 70% of the destroyed buildings of Hiroshima had been reconstructed by mid-1949. (Ref.: Research Department, Hiroshima Municipal Office, as cited in Hiroshima, Hiroshima Publishing, 1949. Other recovery data are given in U.S. Strategic Bombing Survey, The Effects of Atomic Bombs on Hiroshima and Nagasaki, Washington, D.C., 1946, p. 8.)

Dr. Harold Jacobsen (a Manhattan Project health physicist who knew nothing about the fallout particle size distribution in the air burst high over Hiroshima) made the following false claim, which was published in the Washington Post on August 8, 1945 (two days after Hiroshima and one day before Nagasaki):

"Hiroshima is contaminated with radiation. It will be barren of life and nothing will grow for 75 years. Hiroshima will be barren of human and animal life for 75 years. Any scientists who go there to survey the damage will be committing suicide."


Both General Groves and New York Times science journalist William Laurence attacked this falsehood, which caused needless panic among the survivors of Hiroshima and later Nagasaki. Most of the casualties in both cities were due to blast and thermal radiation, with infected wounds made worse by the synergism of initial radiation exposure, which lowers the white blood cell count; see the PDF linked here of James W. Brooks et al., "The Influence of External Body Radiation on Mortality from Thermal Burns", Annals of Surgery, vol. 136 (1952), pp. 533–45. (See also: G. H. Blair et al., "Experimental Study of Effects of Radiation on Wound Healing", in D. Slome, Editor, Wound Healing, Pergamon, N.Y., 1961.) There was no local fallout because the fireball did not touch the ground. The neutron induced activity in Hiroshima was (as intended) too low even at ground zero to cause radiation sickness, owing to the height of the detonation. The "black rain" in Hiroshima originated from the firestorm which began 30 minutes after the explosion, by which time the radioactive mushroom cloud had been blown many miles downwind.

The actual radioactive fallout around Hiroshima was measured and modelled theoretically, and was not lethal. It was due not to the firestorm "black rain" or dry fallout but to the "cloud seeding" rainout effect from hygroscopic salt crystals in sea level coastal air being entrained by the afterwinds into the mushroom cloud. This mechanism also caused a similar effect after the 1950s Pacific air bursts King (1480 feet burst altitude, 500 kt) and Cherokee (4350 feet burst altitude, 3.8 Mt), as described in the footnote to Table 1 on page 5 of report USNRDL-TR-899; Figure 4.2 on p. 136 of weapon test report WT-1317 shows that on the bridge of the ship YAG 40 (located 72 nautical miles directly downwind of the 3.8 Mt Cherokee air burst) fallout began to arrive at 6 hours after burst, with a peak exposure rate of 0.25 mR/hour (this figure includes the 0.02 mR/hour background radiation level measured prior to 6 hours) occurring at 8.8 hours after burst. This is trivial radiation, since - given fallout's fast decay rate - it does not significantly increase the annual natural background radiation exposure.

"Even people in underground shelters who survive the initial heat flash will die as all the oxygen is sucked out of the atmosphere."


- Complete lie, even in the worst city firestorm on record, that in Hamburg (not Hiroshima!). If there was no oxygen, there would be no fire! Fires only burn where there is oxygen. Underground shelters proved effective in Hamburg, contrary to unending lies from CND for over fifty years. In any case, underground structures are not needed for civil defense: it is possible to protect against all the effects in a normal city building to such an extent that the casualty rate is reduced by a factor of 120, as in the example for Hiroshima already given.

"Even those with possibly survivable injuries will die since almost all rescue and medical services will have been destroyed and personnel killed."


- Complete lie, disproved by survival data from the Hiroshima and Nagasaki nuclear attacks, where there was no medical assistance available or given to survivors.

"Many of the medical services needed such as specialist burns units are in strictly limited supply. The sheer scale of the casualties would overwhelm any state’s medical resources even in peace time."


- Complete lie, based on serious peacetime burns casualties such as people covered in burning gasoline, whose burns cover larger body areas than is possible for people exposed on one side to a thermal flash, and which last longer and therefore reach deeper tissue than the very brief surface skin burns at Hiroshima and Nagasaki. This is why so many burned survivors in Hiroshima and Nagasaki recovered without any help, in contrast to conventional burns cases. All of the "burned corpses" photos from Hiroshima and Nagasaki turned out to be standing people who had not ducked and covered, who had been knocked unconscious or had sustained broken legs and were then burned, often hours after the detonation, by fires in wooden houses with overturned cooking stoves. This does not disprove duck and cover civil defense: ALL such people would have SURVIVED if they had ducked and covered in Hiroshima, as PROVED by the detailed data for thousands of people as a function of body area burned and type of burns, for the flash burns and initial nuclear radiation synergism at Nagasaki given in the earlier post, linked here.

"Accurate estimates of long term fatalities at Hiroshima are not possible given the large scale destruction of records, population movements and a general censorship on nuclear effects by the US occupation regime."


- Complete lie, ignoring all the research on the long term effects of radiation at Hiroshima and Nagasaki by the Radiation Effects Research Foundation. The long term effects have been well documented:

The percentage of deaths due to delayed effects has always been dwarfed by the natural cancer and natural genetic defect rates; see for instance Radiation Research, volume 146, 1996, pp. 1-27. In a controlled sample of 36,500 survivors, there were 176 leukemia deaths over a 40 year period, 89 more than expected from the unexposed control group; the natural leukemia baseline arises from the thermal instability of DNA, which is continually broken by random molecular impacts from the Brownian motion of water molecules at body temperature, 37 °C. There were also 4,687 other, "solid", tumour cancer deaths, which was 339 above the unexposed matched control group. Hence in the 36,500 Hiroshima survivors over 40 years there were 4,863 cancer deaths of all kinds, which is 428 more than expected from the unexposed control group. In other words, about 12.2% would naturally have died from cancer over 40 years without any radiation exposure, while for the irradiated bomb survivors the figure was 13.3%. No increase whatsoever in genetic malformations could be detected: any effect was so low it was lost in the statistical noise of natural genetic defects - the effect of body temperature on DNA again - for the sample size. Nature is a way, way, way bigger problem than radiation from nuclear bombs.
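
For readers who want to check the arithmetic, here is a minimal Python sketch reproducing the percentages quoted above; the only inputs are the death counts cited in the text, and the variable names are ours.

cohort = 36500             # exposed survivors in the controlled sample
leukemia_total = 176       # leukemia deaths among the exposed over 40 years
leukemia_excess = 89       # leukemia deaths above the matched unexposed expectation
solid_total = 4687         # other ("solid") tumour cancer deaths among the exposed
solid_excess = 339         # solid cancer deaths above the matched unexposed expectation

all_cancers = leukemia_total + solid_total    # 4,863 cancer deaths of all kinds
all_excess = leukemia_excess + solid_excess   # 428 excess cancer deaths

exposed_rate = 100.0 * all_cancers / cohort                  # about 13.3%
natural_rate = 100.0 * (all_cancers - all_excess) / cohort   # about 12.2%

print(f"Exposed: {exposed_rate:.1f}%, natural expectation: {natural_rate:.1f}%")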

CND forgot to add the lie that the "nuclear winter" from the soot rainfall in Hiroshima froze the planet and exterminated all life, making the rebuilding of "eternally contaminated" Hiroshima impossible, and making all the tens of thousands of "survivors envy the dead", as Khrushchev's propaganda claimed (after he brutally suppressed the peaceful uprising in Hungary using tanks in 1956), thereby disproving the value of civil defense for survival. Maybe they will add this media lie later on? That policy will encourage aggressors to develop weapons of mass destruction in order to intimidate us into appeasement, just as Hitler did in the 1930s, while they prepare for war! Is this what CND wants? Why can't they ever tell the truth about the effects of nuclear weapons? Why do people listen to their Goebbels-style "big lie" propaganda confidence tricks? Why can't the civil defense authorities publish the truth in a clear concise way and debunk these people?

2. Height of burst blast curves: American and British analyses

The 1977 edition of The Effects of Nuclear Weapons contains revised height-of-burst blast curves, with "knees" differing from those in the 1962/4 edition. The 1957 edition contained no such curves at all, merely data for a surface burst, a "free air burst" (i.e., where the blast wave reaches the observer without first striking the ground), and a "typical air burst" at the altitude of the Nagasaki detonation, with burst altitude and ground distance scaled by the cube-root of the explosive yield in all cases. The 1950 edition, The Effects of Atomic Weapons, contained more information on the effect of the height of burst on blast pressures, based partly on chemical high explosive air bursts and partly upon data from early tests like Trinity, Crossroads-Able, and Operation Sandstone.

Before going into the full details, it is worth jumping forward in time to the present day. The current compendium is John A. Northrop's 736-page 1996 Handbook of Nuclear Weapon Effects: Calculational Tools Abstracted from DSWA's Effects Manual One (EM-1), a collection of declassified key equations and data extracted from Brode's 22-volume revision of Dolan's EM-1, Capabilities of Nuclear Weapons. Northrop's book is unclassified but of "limited" distribution and is banned from export outside the U.S.A. Regarding the height-of-burst effect on blast pressures, it follows a review by Brode, who developed an awesome-looking page-long mathematical formula to summarize the nuclear test data on height-of-burst effects in the massive secret American compilation, report DASA 1200.

If you don't have access to Northrop's book, you can find all of the lengthy Brode blast formulae (including expressions for overpressure and dynamic pressure impulses, not just peak overpressure) neatly compiled in the 300-page March 1985 handbook by the American Society of Civil Engineers, Design of Structures to Resist Nuclear Weapons Effects.

However, there is a problem with this. Brode's height-of-burst blast equations are not analytical approximations so much as empirical approximations, and the American Nevada nuclear test data they summarize is not universally valid, as explained in detail by Lord William G. Penney and others at the Atomic Weapons Research Establishment, Aldermaston, in their 1970 analysis of blast at Hiroshima and Nagasaki, which reproduces a summary of all British nuclear test data. British research showed that there is a major problem in the American approach to blast height-of-burst curves. Penney points out that most of the extreme "knees" in the American height-of-burst curves for blast are due to the heating of air near the dark-colored Nevada soil by the thermal radiation pulse in air bursts. Near ground zero this is trivial, since the blast arrives before most of the thermal radiation has been emitted, but at greater distances the blast (near ground level) gradually picks up thermal energy from the pre-shock heated air (due to the "smoking" of the ground surface, forming a hot pre-shock thermal layer), which directly adds energy to the blast wave shock front, increasing the overpressure to values beyond those which would occur in the absence of this thermal effect!

This thermal enhancement on air blast is quite apart from the (1) dust storm "precursor" which can occur and (2) the "Mach effect", which is the simple merging of reflected and incident shock waves, due to the fact that the reflected shock wave is travelling through air heated by the incident blast wave and therefore travels faster, catching up and merging with the incident shock wave to form the so-called "Mach stem". The thermal enhancement is also separate from the enhancement of the air blast by regular reflection and by the fact that air bursts have more blast and thermal energy available owing to the fact that they do not expend fireball energy in cratering, severe ground shock, and melting on the order of 100 tons of soil per kiloton to form fused fallout particles.

Penney's height-of-burst curves from British nuclear tests can be directly compared to those in Glasstone and Dolan, and show a yield effect. Most of the American Nevada test data in the "knees" region is for yields in the range of 20-30 kilotons, whereas the British test data is mostly for yields in the range of 1-20 kt. The American data shows greater "knees", giving larger ranges of blast for optimum heights of burst. This could be partly due to the higher average yield in American tests, because blast radii scale as W^(1/3), whereas thermal ranges (over relatively small distances in clear atmospheric visibility, so that thermal attenuation by air is trivial) scale far more strongly with yield, as W^(1/2) rather than W^(1/3). The thermal contribution to the "knees" means that the blast wave from optimized air bursts will depend on the colour of the soil and will not scale simply by the cube root law in clear atmospheric conditions: for very high yield optimum air bursts over dark colored soil, the peak overpressure distances will scale with yield more like W^(1/2) than W^(1/3).
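
A minimal sketch of the two scaling laws just mentioned, assuming an arbitrary 1 km reference range at 1 kt purely to show how a cube-root blast range and a square-root thermal range diverge with yield (the reference range is a placeholder, not calibrated effects data):

def blast_range(W_kt, r1_km=1.0):
    # Blast damage radius scaling as the cube root of yield, W^(1/3).
    return r1_km * W_kt ** (1.0 / 3.0)

def thermal_range(W_kt, r1_km=1.0):
    # Thermal range in clear visibility, scaling roughly as W^(1/2).
    return r1_km * W_kt ** 0.5

for W in (1, 20, 1000):  # yields in kilotons
    print(f"{W:>5} kt: blast scale {blast_range(W):7.1f}, thermal scale {thermal_range(W):7.1f}")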

However, unlike the Nevada test site, city buildings shadow the thermal radiation (at least prior to blast arrival at any given building), eliminating most of this effect over large ranges. Penney also points out another factor which is ignored by Glasstone in all editions after the first (1950) edition, namely the use of blast energy in irreversibly causing destruction of buildings in any given radial line outwards from ground zero. In Hiroshima and Nagasaki, Penney's research found that each wooden building blown up by the blast took out on average around 1% of the blast peak overpressure (compared to the unobstructed Nevada desert), so after 100 buildings in any given radial line from ground zero had been destroyed, the peak overpressure was down to just 0.99^100 ≈ 0.37 of that in unobstructed terrain. Although you might naively expect some non-radial diffraction of blast energy downwards from higher altitudes to offset this cumulative energy loss due to damage done, the blast pressures are greatest near the ground surface due to the thermal interaction anyway, so the general non-radial flow of energy due to vertical diffraction is upwards, not downwards. You can't get around this problem.
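
Penney's cumulative attenuation estimate is easy to reproduce; a minimal sketch, assuming the roughly 1% overpressure loss per destroyed wooden building quoted above:

def surviving_overpressure_fraction(n_buildings, loss_per_building=0.01):
    # Fraction of the open-terrain peak overpressure remaining after n buildings
    # along a radial line have been destroyed (each removing about 1% of it).
    return (1.0 - loss_per_building) ** n_buildings

for n in (10, 50, 100, 200):
    print(f"{n:>3} buildings destroyed: {surviving_overpressure_fraction(n):.2f} of open-terrain overpressure")
# 100 buildings gives 0.99^100 = 0.37, as stated above.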

All factors considered, the blast height-of-burst curves are oversimplified by Glasstone, Brode and Northrop. Most nuclear test data is applicable to unobstructed deserts with dark-colored Nevada soil and clear atmospheric visibility, and these conditions are not generally applicable to the use of nuclear weapons. It is easy, however, to use the nuclear test data to validate full theoretical solutions of the height-of-burst curves. All you need to do is to integrate the amount of thermal energy (emitted by the time the blast reaches any given radius) absorbed by the ground, and add a fraction of that energy to the effective blast yield of the weapon. The fraction will simply depend on the albedo of the ground for absorbing the thermal pulse (this is well known, since the fireball thermal pulse spectrum is very similar to that of sunlight), and on the sine of the angle which the radial line from the fireball makes with the ground. A considerable proportion of the thermal flash energy absorbed by the ground can convectively heat the air above the ground by "smoking" (a phenomenon visibly clear in many films of nuclear test effects) prior to the blast arrival at that point. The exact fraction of energy transferred from the thermal heating of the ground to the blast wave can be determined by comparing the calculations to the observed blast for given weapon yields in Nevada, and the result can then be used to predict height-of-burst curves for other bomb yields, allowing accurately for the pre-shock thermal layer boosting of the Mach stem.
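
As a rough illustration of the bookkeeping just described, here is a hedged Python sketch; every constant and function in it (the thermal output fraction, the albedo, the transfer fraction, the crude pulse-emission function and the arrival time) is an illustrative placeholder, not a value from DASA 1200 or EM-1, and a real calculation would use the measured thermal pulse shape and blast arrival times:

import math

def thermal_fraction_emitted(t_arrival_s, t_pulse_s=1.0):
    # Placeholder: crude fraction of the thermal pulse emitted by the time the shock arrives.
    return min(1.0, t_arrival_s / t_pulse_s)

def effective_blast_yield(W_kt, ground_range_km, burst_height_km,
                          thermal_output_fraction=0.35, ground_albedo=0.8,
                          transfer_fraction=0.1, t_arrival_s=2.0):
    # Nominal yield plus a share of the thermal energy absorbed by the ground before shock arrival.
    slant_km = math.hypot(ground_range_km, burst_height_km)
    sin_angle = burst_height_km / slant_km if slant_km > 0 else 1.0  # grazing incidence absorbs less
    absorbed_kt = (W_kt * thermal_output_fraction
                   * thermal_fraction_emitted(t_arrival_s)
                   * (1.0 - ground_albedo)
                   * sin_angle)
    return W_kt + transfer_fraction * absorbed_kt

print(effective_blast_yield(20.0, ground_range_km=1.5, burst_height_km=0.3))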

Friday, April 02, 2010

Fractionation of fission products: the large reduction in long term hazards from close-in fallout



Dr Carl F. Miller, “A Theory of Decontamination of Fallout from Nuclear Detonations. Part II. Methods for Estimating the Composition of Contaminated Systems”, U. S. Naval Radiological Defense Laboratory, report USNRDL-466, 29 September 1961.

Dr Terry Triffet and Philip D. LaRiviere, “Operation Redwing. Project 2.63. Characterization of Fallout”, Nuclear Weapon test report WT-1317, 15 March 1961.

"As the temperature decreases, positive ions regain their electrons and become atoms, atoms recombine to form molecules, molecules condense to form liquid droplets, and, finally, when the temperature is low enough, the droplets solidify."

- Dr Carl F. Miller, Fallout and Radiological Countermeasures, volume 1, Stanford Research Institute, January 1963, p. 14.


"In 1954 ... we were about 20 miles away when a 10-megaton shot was detonated ... The ship [YAG 39] sailed on a pathway that led to an area directly underneath the expanding cloud, so as to be exposed to a maximum amount of fallout ... Fallout arrived about 20 minutes after detonation, at which time I collected the first few drops of 'hot' washdown water ... With most of the local fallout that we're talking about, a lot of the larger particles are fused or melted to form little glassy marbles. ... The radioactive atoms that could be absorbed into, or by, body organs were the few that are plated out on the surface of the fallout particles during the later stages of condensation in the fireball. That's why the elements iodine, strontium, ruthenium and a few other isotopes of that nature have been found in organs of animals and humans."

- Dr Carl F. Miller, fallout countermeasures research award acceptance speech, U.S. National Council on Radiological Protection (NCRP) symposium on 27-29 April 1981 in Virginia, published in The Control of Exposure of the Public to Ionising Radiation in the Event of Accident or Attack, pp. 99-100.


In the cooling nuclear fireball, dust and dirt are continuously entering and leaving the regions of vaporized bomb debris and fission products. The gaseous fission products xenon-137 and krypton-90 cannot condense on to solid fallout particles until their radioactive decay transforms them from rare gases into solid elements.

Hence, large fallout particles can't collect much gaseous xenon-137 or krypton-90 before they leave the fireball due to gravitational settling. This is a simple example of fallout "fractionation". Xenon-137 is the precursor of cesium-137 (the well known long-lived gamma radiation emitter in fallout, which gets into food chains due to its chemical similarity to potassium), while krypton-90 is the precursor of rubidium-90, which then decays into the well-known long-lived beta emitter strontium-90, which gets taken up and deposited in bone a little like calcium (although most food chains discriminate against strontium relative to calcium to an impressive and helpful extent). The loss of most of the xenon-137 and krypton-90 from close-in fallout therefore reduces the long-term cesium-137 and strontium-90 radiation doses.
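
The point can be made quantitative with a small sketch: the fraction of a gaseous precursor that has decayed into a condensable element by the time a large particle leaves the fireball. The half-lives below are approximate textbook values, and the 60 second cut-off is an arbitrary illustration of an early particle-departure time, not a measured fireball history:

import math

HALF_LIFE_S = {
    "Xe-137 (precursor of Cs-137)": 3.8 * 60,  # approximate half-life in seconds
    "Kr-90 (precursor of Sr-90)": 32.0,        # approximate half-life in seconds
}

def fraction_decayed(half_life_s, t_s):
    # Fraction of the rare-gas precursor that has decayed (and so can condense) by time t.
    return 1.0 - math.exp(-math.log(2.0) * t_s / half_life_s)

for name, t_half in HALF_LIFE_S.items():
    print(f"{name}: {100.0 * fraction_decayed(t_half, 60.0):.0f}% decayed at 60 seconds")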

"As a result of radioactive decay, the gases krypton and xenon form rubidium and cesium, respectively, which subsequently condense onto solid particles. Consequently, the first particles to fall out, near ground zero, will be depleted in not only krypton and xenon, but also in their various decay (or daughter) products. ... For explosions of large energy yield at or near the surface of the sea, where the condensed particles consist of sea-water salts and water, fractionation is observed to a lesser degree than for a land surface burst. The reason is that the cloud must cool to 100 °C (212 °F) or less before the evaporated water condenses [compared a condensation temperature of 1,400 °C for Nevada silicate soil, causing a relatively] long cooling time ..."

- Samuel Glasstone and Philip J. Dolan, The Effects of Nuclear Weapons, U.S. Department of Defense, 3rd ed., 1977, p. 389.


There are also many short-lived nuclides in fallout subject to some fractionation. Iodine-131 and other isotopes of iodine are depleted from local fallout particles due to the chemical volatility of iodine and its precursors, which keeps them gaseous in the fireball for a long period. Most of these gases end up condensing so late in the fireball that most of the large fallout particles have already fallen out, and only very small particles remain, which on average take a long time to descend into the lower atmosphere, from whence they can then be scavenged to the surface by rainfall.

Fractionation therefore works to reduce the relative long-term agricultural dangers from the close-in fallout, and to enhance these dangers in the distant fallout. Put simply, the difference in contamination levels in plants and animals in the local fallout pattern and in the global fallout pattern is much smaller than you would predict from the gamma dose rates, if you ignore fractionation. But what about the hard numbers? Exactly how much fractionation is there for any given nuclide at any given distance from any given type of nuclear explosion?

Two experts, Dr Carl F. Miller and Dr Edward C. Freiling, both originally located at the U.S. Naval Radiological Defense Laboratory in the 1950s, investigated nuclear test fallout experimentally and theoretically in an effort to get the numbers right.

Dr Freiling published a series of unclassified articles empirically analysing the fractionation data from nuclear tests, mainly the four well documented Operation Redwing tests in 1956 (two water surface bursts, Navajo and Flathead, one coral island surface burst, Zuni, and one surface burst over water so shallow compared to the fireball radius that it was effectively a coral surface burst, Tewa). Freiling's first article was "Radionuclide Fractionation in Bomb Debris", in Science, v. 133 (1961), pp. 1991-8, which is based on his laboratory report Fractionation I. High Yield Surface Burst Correlations, USNRDL-TR-385 (29 October 1959). This simply plots on logarithmic graphs the ratios of the degrees of fractionation of pairs of nuclides in the fallout and cloud samples, finding straight line correlations to within a factor of 2 for all the data points. Fractionation is measured as the reduction factor of nuclide abundance in fractionated fallout, compared to the abundance in unfractionated fission products. In practice this ratio was obtained by taking a very refractory nuclide (in the 1950s this was assumed to be Mo-99, but from the early 1960s Zr-95 was preferred) as a measure of the total amount of unfractionated fission products in the sample, and then finding the relative abundances of other nuclides. The depletion factor due to fractionation, R, is then the factor by which the measured abundance of a nuclide in fallout is reduced below the M-shaped fission product yield curve determined for fission in a sealed sample of neutron irradiated fissionable material in a laboratory, where no fractionation occurs.
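
A minimal sketch of that bookkeeping, with invented abundances and nominal fission yields used purely to show the form of the calculation (the reference nuclide stands for Mo-99 or Zr-95, as discussed above):

def depletion_factor(atoms_nuclide, atoms_reference, yield_nuclide, yield_reference):
    # R = (fissions implied by the nuclide's abundance) / (fissions implied by the
    # unfractionated reference nuclide's abundance).
    fissions_from_nuclide = atoms_nuclide / yield_nuclide
    fissions_from_reference = atoms_reference / yield_reference
    return fissions_from_nuclide / fissions_from_reference

# Invented sample: the mass-89 chain depleted relative to the mass-95 reference chain.
print(depletion_factor(atoms_nuclide=2.0e14, atoms_reference=1.0e15,
                       yield_nuclide=0.048, yield_reference=0.051))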

Dr Miller published an alternative approach to fractionation in January 1963 in Fallout and Radiological Countermeasures, after moving from the U.S. Naval Radiological Defense Laboratory to Stanford Research Institute. Miller formulated a theory of how molten particles of soil absorb fission products in the fireball at high temperature before falling out and solidifying. Particles entering the fireball after it has cooled below the melting point of the soil are not melted, and are merely contaminated on their outer surfaces. Miller's theory follows experimental studies in the 1950s of fallout particles, in which individual fallout particles were sealed in solid transparent resin and then shaved into thin cross-sections which were exposed to photographic film to produce "radioautographs", x-ray like photos in which the source of the illumination is the beta radiation from the radioactivity in the particle.

Examples of such radioautographs are included in Miller's reports and the 1977 edition of Glasstone and Dolan's Effects of Nuclear Weapons. They show that small spheroidal and spherical fallout particles, which were melted from irregular sand grains by the heat of the fireball, tend to be uniformly contaminated throughout their entire internal volume. Because they were molten, the fission products condensing upon them could diffuse into the liquid interior before the fallout particle left the fireball, cooled, and solidified into glassy silica.

On the other hand, irregular shaped particles, which had not been melted by the fireball, were only contaminated on their outer surfaces, thus leaving an irregular loop shape on the radioautographs. Miller explains that these particles are generally formed at a late stage in the fireball history, when the fireball has cooled so much that it can no longer melt the debris entering it.

The problem with this theoretical model in 1963 was that most of the chemical constants for the diffusion of different fission products into molten silicate sand or other fallout materials were then unknown. Most of the research available on absorption was for the absorption of dyes into materials, and gases into activated charcoal absorbers for gas masks. There was little data available on the absorption of the various fission product vapours into molten silicate sand at the high temperatures relevant to a nuclear fireball. Miller's January 1963 report reviewed Freiling's logarithmic correlations and stated that a theoretical justification was needed. Freiling responded in the 15 March 1963 issue of Science, v. 139, pp. 1058-9, with an article called Theoretical Basis for Logarithmic Correlations of Fractionated Radionuclide Compositions.

This is easily explained with reference to the two different distributions of fission products within fallout: soil particles that enter the fireball while it is hotter than the soil melting point end up with fission products distributed uniformly throughout their volumes, while those that enter later on end up with surface contamination only.

Uniform contamination corresponds therefore to unfractionated or refractory (non-volatile) fission products, that can condense even at very high temperature. Thus, most of the unfractionated fission product radioactivity condenses uniformly throughout the internal volumes of large particles, making this radioactivity mostly insoluble (trapped in glass), if the soil is silicate in nature (calcium carbonate soils like coral or limestone produce more soluble early fallout than silicate soil).

If the radius of a spherical fallout particle is r, then the total uniformly distributed unfractionated activity throughout its volume is proportional to r^3. By contrast, the highly fractionated fission products condense at late times, after the fallout particle has cooled and solidified, so they just land on the solid outer surface and are unable to diffuse throughout the volume. The total amount of this fractionated activity which condenses upon a fallout particle is therefore proportional to the particle's surface area, which varies with particle size in proportion to r^2.

Hence, the ratio of fractionated to unfractionated fission products in a fallout particle of radius r is directly proportional to the ratio of surface area to volume, r^2/r^3 = 1/r. Hence, you would expect the fractionation depletion factor R to vary inversely with the radius of the particle: bigger particles show greater depletion in volatile nuclides, because they fall out of the fireball sooner and have less of the volatile gaseous fission products condensed upon them, relative to the refractory metallic fission products with high boiling points that are always unfractionated in fallout.

So for severely fractionated fission products, the depletion factor R varies with fallout particle radius, r, according to the rule R ~ 1/r, whereas for unfractionated fission products R = 1.

Freiling's logarithmic correlations can then be seen as a simple unification and interpolation between these two extremes: R ~ r^(-n), where n = 0 for unfractionated (refractory) nuclides and n = 1 for the most highly fractionated nuclides. For intermediate degrees of fractionation, n is between 0 and 1. This explains Freiling's logarithmic correlations: he called it the "radial distribution model" of fractionation.
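
A minimal sketch of the radial distribution model interpolation, with arbitrary illustrative particle radii and an arbitrary normalisation radius:

def radial_model_R(radius_um, n, r0_um=50.0):
    # Depletion factor R ~ r^(-n): n = 0 for refractory chains, n = 1 for the most volatile.
    return (radius_um / r0_um) ** (-n)

for r in (50, 100, 500):  # particle radii in microns (illustrative)
    print(f"r = {r:>3} um: n=0 -> {radial_model_R(r, 0.0):.2f}, "
          f"n=0.5 -> {radial_model_R(r, 0.5):.2f}, n=1 -> {radial_model_R(r, 1.0):.2f}")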

There are a couple of very deep insights to be gained from applying this theory to the interpretation of nuclear test data on fractionation. First, Freiling's original correlations were log-log plots of depletion factors, R (relative to unfractionated Mo-99 or to unfractionated Zr-95), comparing the fractionation of two different fission product beta decay chains, for example the R factor for nuclides of mass number 91 versus that for nuclides of mass number 89. Freiling's correlations were straight lines through the data points on the log-log axes, but he originally allowed two variables: the intercept of the line and the gradient of the line. His "radial distribution model" later led to the abandonment of the first variable, since the intercept of the line on either R axis cannot be a variable but must always equal R = 1 for both of the decay chains.

Hence Freiling's original correlation of the Cs-137 depletion factor R137 compared to the Sr-89 depletion factor R89 was:

log R137 = a + (b log R89)

containing two adjustable parameters, a and b, which his later theoretical radial distribution model of fractionation reduced to a single variable, b, by showing that the log-log intercept value a = 0:

log R137 = b log R89,

which yields the simple relationship:

R137 = R89^b,

which is exactly what Freiling's radial distribution model of fractionation gives.
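
A short sketch of the two forms side by side, with an invented slope b and invented R89 values; note how a non-zero intercept a forces the unphysical result R137 > 1 when R89 = 1, which is why the radial distribution model sets a = 0:

import math

def r137_two_parameter(r89, a, b):
    # Original empirical correlation: log R137 = a + b * log R89.
    return 10.0 ** (a + b * math.log10(r89))

def r137_radial_model(r89, b):
    # Radial distribution model: intercept forced to a = 0, so R137 = R89**b.
    return r89 ** b

for r89 in (1.0, 0.2, 0.03):
    print(f"R89 = {r89:>4}: two-parameter (a=0.1) {r137_two_parameter(r89, 0.1, 0.8):.3f}, "
          f"radial model {r137_radial_model(r89, 0.8):.3f}")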

In the original papers by Freiling and others, R89 is written r89,95. We're using upper case R for the fractionation depletion factor because lower case r is being used for the radius of the fallout particle, and we are dropping the "95" from the subscript which tells you that Nb-95 was used as the "unfractionated" standard nuclide in working out the depletion factor for Sr-89. This was important when the standard nuclide was switched from Mo-99 in the 1950s to Nb-95 in the 1960s. Precise definition:

"The fractionation ratio [R89,95] is the ratio of the number of fissions required to produce the amount of the mass-89 [beta decay] chain found in a sample to the number of fissions which would be required to produce the amount of the mass-95 chain found in the same sample."

- E. C. Freiling and S. C. Rainey, Fractionation II. On Defining the Surface Density of Contamination, USNRDL-TR-631, 13 March 1963, p. 9.


The absence of a variable intercept implies that where R = 1 for one decay chain (say mass number 89) in a sample, R must also equal 1 for all of the other decay chains in fallout particles leaving the fireball at the same time, i.e. in a sample which left the fireball at a particular time after detonation. If, for Sr-89, R = 1 in a sample, then the "radial distribution model" of fractionation predicts that all other nuclides in the sample are also unfractionated, having R = 1.

This theoretical conclusion simplified Freiling's logarithmic correlations by reducing the correlation of the degree of fractionation to a single variable, the slope of the line on the log-log plot of reduction factors, R. In Figure 4 of Glenn R. Crocker, Francis K. Kawahara and Edward C. Freiling's paper "Radiochemical-Data Correlations of Debris from Silicate Bursts" (in Alfred W. Klement, Jr., Radioactive Fallout from Nuclear Weapons Tests, U.S. Atomic Energy Commission, Symposium Series 5, Proceedings of the Second Conference, Germantown, Maryland, November 3-6, 1964, CONF-765, page 78), the R factor for fission product mass number 91 is plotted against the R factor for mass number 89 for the 1962 1.65 kt Nevada surface burst Small Boy. Freiling plots data from three different laboratories which were employed to determine the fractionation of debris from 43 fallout collection stations within 8.7 miles of ground zero.

Each of the three laboratories produced data with an almost identical slope, but with greatly different intercept values, due to systematic measurement errors in the laboratory analyses. Freiling could easily identify the most accurate laboratory from the data which gave intercept values of about R = 1 for both nuclides, thus using the theoretical elimination of one variable as a means to identify and discard inaccurate laboratory data!
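
That consistency check is easy to mimic: fit log R91 = a + b log R89 for each laboratory and flag any whose intercept is far from zero (i.e. far from R = 1 at R = 1). A hedged sketch with invented data, one laboratory given a deliberate constant calibration offset:

import math

def fit_line(xs, ys):
    # Least-squares intercept a and slope b for y = a + b*x.
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return mean_y - b * mean_x, b

r89_values = (0.05, 0.1, 0.2, 0.5)
lab_a = ([math.log10(r) for r in r89_values],
         [math.log10(r ** 0.9) for r in r89_values])   # consistent with intercept a = 0
lab_b = (lab_a[0], [y + 0.3 for y in lab_a[1]])        # constant systematic offset added

for name, (xs, ys) in (("laboratory A", lab_a), ("laboratory B", lab_b)):
    a, b = fit_line(xs, ys)
    flag = "  <- suspect systematic error" if abs(a) > 0.1 else ""
    print(f"{name}: intercept a = {a:+.2f}, slope b = {b:.2f}{flag}")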

Table 1 of that report by Crocker, Kawahara and Freiling summarizes the fractionation slopes for the 1962 Nevada surface bursts Small Boy (1.65 kt) and Johnie Boy (0.5 kt) and the 1956 coral surface burst Zuni (3.53 Mt). Before we repeat these data, it is important to quote what they say on pp. 73-5 about the samples from the July 1962 Nevada near-surface burst nuclear tests:

"Small Boy was a low-yield shot fired from atop a 10-ft high wooden tower above alluvial soil ... NRDL [U.S. Naval Radiological Defense Laboratory] collected many fallout samples of debris at 43 stations within 8.7 miles of ground zero ... The discussion in this report is mainly restricted to samples from within 8.7 miles of ground zero and from the cloud samples ... A total of about 187 samples is dealt with here, all of which were analyzed for Sr-89, Sr-90, Y-91, and Zr-95. In addition, about one-third of them were analyzed for Mo-99, Ru-103, Ru-106, Cs-136, Cs-137, Ba-140, Ce-141, Ce-144, Np-239, and Pu-239. Some of this last group of samples were also analyzed for I-131 and Te-132. The numbers quoted do not include a fairly large number of radiochemical analyses on samples used for solubility studies. ...

"... Johnie Boy was a low-yield burst 23 in. below the surface of basaltic material ... Forty-four fallout samples from the area out to about 1.25 miles from ground zero and two cloud samples were studied radiochemically. All of these were analyzed for Sr-89, Sr-90, Y-91, and Zr-95, and about one-third of them were analyzed for the long list of nuclides previously given for the Small Boy samples. ...

"... The Small Boy [deposited fallout] field was the cigar-shaped downwind area typically associated with such shots. The Johnnie Boy field was very, perhaps atypically, narrow with a very [radioactive] hot line down the centre, which was visible on the ground as a darkened streak. For Johnie Boy a weighted average for this [Sr-89 depletion R ratio due to fractionation relative to unfractionated fission products, determined from the abundance of the unfractionated nuclide Nb-95] for the hot-line stations is around 0.03, indicating very severe fractionation. For Small Boy the values for most stations are in the range 0.1 to 0.2, indicating more moderate fractionation."


[Insert fractionation correlation summary data table here]

There is also a problem with the way that some of the nuclear test data on fractionation has been analyzed to check this theory. It is clear that the fallout particles do not fall out of the hot fireball in a perfectly size-ordered way, largest first and smallest last. There is a toroidal circulation with an updraft in the centre carrying the dusty mushroom stem up into the cloud head, and there are downdrafts around the periphery of the mushroom cloud, where the ascending air column has collided with cool high altitude air, been cooled, and started to flow back downwards.

This is confirmed experimentally by the small spread of fractionation values in the large distribution of particle sizes collected in sequentially exposed fallout collection trays at fixed ground stations under mushroom clouds. Fallout particles of different sizes that arrive at the same place, at the same time after detonation, must have originated from different locations in the mushroom cloud, due to their different settling rates. In kiloton Nevada tests where the vertical extent of the cloud was far greater than its relatively small horizontal radius, it generally follows that the larger particles in such samples originated from higher altitudes than the smaller particles.

This implies that the smaller particles in such samples left the fireball sooner than the larger particles, which is the exact opposite of the overall fractionation trend with particle size within the downwind fallout pattern, hence offsetting the usual fractionation variation with size, and explaining discrepancies in the degree of fractionation as a function of particle size in individual samples of fallout obtained at a fixed time and location.

To give a specific example of this error in experimental data analysis, Glenn R. Crocker, Francis K. Kawahara and Edward C. Freiling, in Figure 5 of their paper "Radiochemical-Data Correlations of Debris from Silicate Bursts" (in Alfred W. Klement, Jr., Radioactive Fallout from Nuclear Weapons Tests, U.S. Atomic Energy Commission, Symposium Series 5, Proceedings of the Second Conference, Germantown, Maryland, November 3-6, 1964, CONF-765, page 80), plot a range of data on the depletion factor R for Sr-89 as a function of fallout particle radius for the 1.65 kt Nevada nuclear surface burst Small Boy. But because most of the data used was collected at similar locations and times after detonation, the data points for the fractionation R factor are mostly in the range of 0.1-0.2 and show only very weak evidence of a dependence upon particle size. Because of the relatively small cloud radius in this low yield test, most of the variation in the sizes of fallout particles deposited at close-in locations was due to variation in the altitude from which the particles originated, so this effect wiped out much of the anticipated radial distribution model fractionation with particle size! The same misleading graph is reproduced, with a poorly fitting fractionation prediction curve, as Figure 10 on page 26 of Freiling's symposium proceedings book Radionuclides in the Environment (1970).

In order to properly determine the fractionation depletion factor of a given nuclide as a function of particle size, it is therefore necessary to take account of the altitude from which individual particles originated. If particles originate from only one altitude, then to see a significant radial distribution in fractionation in fallout samples it is necessary to utilise samples from widely varying downwind distances, or the fractionation effect will be largely suppressed by differences in the altitude of origination of the fallout particles deposited at one location.

Effect of fractionation on the gamma ray spectrum of fallout

Glenn R. Crocker's 287-page report Radiation Properties of Fractionated Fallout; Predictions of Activities, Exposure Rates and Gamma Spectra for Selected Situations, U.S. Naval Radiological Defense Laboratory, USNRDL-TR-68-134, 27 June 1968 (mentioned previously in the post linked here) does not appear to be listed in any online database, although it is cited in the experimental report linked here, so we have created a PDF file which tabulates some of Crocker's most important gamma spectra data, linked here. This shows that for the fission of U-238 in an H-bomb by thermonuclear neutrons, the mean gamma ray energy for unfractionated fission products is 0.81 MeV at one hour and 0.48 MeV at 1 week after detonation, while for fission products in which 90% of the Sr-89 is depleted (i.e. where only 10% of the Sr-89 expected - from the abundance of unfractionated Nb-95 - is present), the mean gamma ray energy is just 0.71 MeV at 1 hour and 0.44 MeV at one week after detonation.

Hence, the depletion of volatile fission products due to fractionation does cause a shift in the spectrum to lower gamma ray energy. As explained in more detail in a previous post, this shift occurs because the most highly volatile fission products have electron and nuclear shell structures (via the exclusion principle) which result in higher than average gamma ray energy emissions, so the loss of these nuclides from fallout due to fractionation shifts the mean gamma ray energy downwards. This is quite apart from the additional effect of the very low energy gamma ray contributions from non-fission neutron captures in U-238, which produce large quantities of Np-239, U-237, U-240, etc., in the fallout, causing a further large reduction in the mean gamma ray energy and making shielding against fallout easier (at least for low to moderate protection factors, where the low energy gamma rays are easily filtered out, leaving only the smaller proportion of higher energy gamma rays to penetrate).
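
A crude two-group sketch of the mechanism (the energies and activity fractions below are invented placeholders, not Crocker's multi-group spectra): removing most of the higher-energy, volatile-chain component pulls the activity-weighted mean photon energy down.

def mean_energy_mev(groups):
    # Activity-weighted mean photon energy for (energy_MeV, relative_activity) pairs.
    total_activity = sum(activity for _, activity in groups)
    return sum(energy * activity for energy, activity in groups) / total_activity

unfractionated = [(1.2, 0.4), (0.5, 0.6)]        # volatile-chain group + refractory group
fractionated = [(1.2, 0.4 * 0.1), (0.5, 0.6)]    # 90% of the volatile-chain activity removed

print(f"{mean_energy_mev(unfractionated):.2f} MeV -> {mean_energy_mev(fractionated):.2f} MeV")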


(To be continued.)
