Peace through practical, proved civil defence for credible war deterrence
  • Please see also the post linked here, and our summary of the key points in Herman Kahn's much-abused call for credible deterrence, On Thermonuclear War, also linked here.

  • Hiroshima's air raid shelters were unoccupied because Japanese Army officers were having breakfast when B29s were detected far away, says Yoshie Oka, the operator of the Hiroshima air raid sirens on 6 August 1945...

  • In 1,881 burns cases in Hiroshima, only 17 (or 0.9 percent) were due to ignited clothing and 15 (or 0.7%) were due to the firestorm flames...


    "There has never been a war yet which, if the facts had been put calmly before the ordinary folk, could not have been prevented." - British Foreign Secretary Ernest Bevin, House of Commons Debate on Foreign Affairs, Hansard, 23 November 1945, column 786 (unfortunately, secret Cabinet committees called "democracy" for propaganda purposes have not been quite so successful in preventing war). Protection is needed against collateral civilian damage and contamination in conventional, chemical and nuclear attack, with credible low-yield clean nuclear deterrence against conventional warfare, which in reality (not science fiction) costs far more lives. Anti-scientific media, who promulgate and exploit terrorism for profit, censor (1) vital, effective civil defense knowledge and (2) effective, safe, low-yield air burst clean weapons like the Mk54 and W79, which deter conventional warfare and escalation, allowing arms negotiations from a position of strength. This helped end the Cold War in the 1980s. Opposing civil defense and the nuclear weapons that really deter conventional war is complacent and dangerous.

    War and coercion dangers have not stemmed from those who openly attack mainstream mistakes, but from those who camouflage themselves as freedom fighters while banning such free criticism itself, making the key facts seem taboo without even a proper debate, let alone financing research into unfashionable alternatives. Research and education in non-mainstream alternatives are needed to establish all the basic facts before an unprejudiced, real debate can take place. “Wisdom itself cannot flourish, nor even truth be determined, without the give and take of debate and criticism.” – Robert Oppenheimer (quotation from the H-bomb TV debate hosted by Eleanor Roosevelt, 12 February 1950).

    “Apologies for freedom? I can’t handle this! ... Deal from strength or get crushed every time ... Freedom demands liberty everywhere. I’m thinking, you see, it’s not so easy. But we have to stand up tall and answer freedom’s call!” – Freedom Kids

  • Friday, March 23, 2007

    Radiation Effects Research Foundation covers up the very low cancer rates of Hiroshima and Nagasaki nuclear survivors using cynical obfuscation tactics

    Over the 40-year follow-up, in a controlled sample of 36,500 monitored survivors, there were 176 leukemia deaths, which is 89 more than occurred naturally in the matched, unexposed control group. (Data: Radiation Research, volume 146, 1996, pages 1-27.) There were 4,687 other cancer deaths, but that was merely 339 above the number in the control group, statistically a much smaller rise than the leukemia result. Radiation thus accounted for 51% (89/176) of the leukemia deaths (roughly a doubling of the very low natural leukemia rate), but for merely 7% (339/4,687) of the other cancer deaths. Adding all the cancers together, the total was 4,863 cancer deaths (virtually all natural cancer, nothing whatsoever to do with radiation), just 428 more than in the unexposed control group. Hence only 9% of the cancer deaths were attributable to bomb exposure, spread over a period of 40 years. There was no increase whatsoever in genetic malformations.
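    The arithmetic behind these percentages can be checked directly; a minimal sketch (the function name is ours, the figures are those quoted above):

```python
# Attributable fractions from the 40-year RERF follow-up quoted above
# (36,500 monitored survivors; Radiation Research, v146, 1996, pp. 1-27).
def attributable_fraction(total_deaths, excess_deaths):
    """Share of observed deaths attributable to radiation exposure."""
    return excess_deaths / total_deaths

print(f"leukemia:     {attributable_fraction(176, 89):.1%}")                # 50.6%
print(f"other cancer: {attributable_fraction(4_687, 339):.1%}")             # 7.2%
print(f"all cancer:   {attributable_fraction(176 + 4_687, 89 + 339):.1%}")  # 8.8%
```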

    'This continues the series of general reports on mortality in the cohort of atomic bomb survivors followed up by the Radiation Effects Research Foundation. This cohort includes 86,572 people with individual dose estimates ... There have been 9,335 deaths from solid cancer and 31,881 deaths from noncancer diseases during the 47-year follow-up. ... We estimate that about 440 (5%) of the solid cancer deaths and 250 (0.8%) of the noncancer deaths were associated with the radiation exposure [emphasis added]. ... a new finding is that relative risks decline with increasing attained age, as well as being highest for those exposed as children as noted previously. A useful representative value is that for those exposed at age 30 the solid cancer risk is elevated by 47% per sievert at age 70. ... There is no direct evidence of radiation effects for doses less than about 0.5 Sv [emphasis added; notice that this report considers 86,572 people with individual dose estimates, and 40% have doses below 5 mSv or 0.005 Sv, so the politically expedient so-called 'lack of evidence' is actually a fact backed up by one hell of a lot of evidence that there are no radiation effects at low doses, a fact the biased scare-story-selling media and corrupt politically-expedient politicians will never report!].' - D. L. Preston, Y. Shimizu, D. A. Pierce, A. Suyama, and K. Mabuchi, 'Studies of mortality of atomic bomb survivors. Report 13: Solid cancer and noncancer disease mortality: 1950-1997', Radiation Research, volume 160, issue 2, pp. 381-407 (2003).
    Above: what is being politically covered up in the latest reports by the Radiation Effects Research Foundation. D. A. Pierce and D. L. Preston (Radiation Effects Research Foundation, Hijiyama Park, Hiroshima) wrote in 'Radiation-related cancer risks at low doses among atomic bomb survivors', Radiation Research, volume 154, issue 2, pp. 178-86 (August 2000): 'To clarify the information in the Radiation Effects Research Foundation data regarding cancer risks of low radiation doses, we focus on survivors with doses less than 0.5 Sv. ... Analysis is of solid cancer incidence from 1958-1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. The results provide useful risk estimates for doses as low as 0.05-0.1 Sv, which are not overestimated by linear risk estimates computed from the wider dose ranges 0-2 Sv or 0-4 Sv. There is ... an upper confidence limit on any possible threshold is computed as 0.06 Sv [emphasis added]. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.' In the illustration above, a dose of 3.4 rads (gamma equivalent) reduced the natural leukemia rate by 30% in the Hiroshima and Nagasaki data available in 1982. There seems to be a "threshold" of 8 rads before there is any increase in risk. (H. Kato and W. J. Schull, 'Studies of the mortality of A-bomb survivors. 7. Mortality, 1950-1978: Part I. Cancer mortality', Radiation Research, May 1982, volume 90, issue 2, pp. 395-432.) The accuracy in dosimetry at that time (substantiated by measurements of neutron-induced activity and gamma-ray thermoluminescence in the two cities) meant that the doses were generally believed accurate to about +/- 50% (the accuracy of later estimates has increased).
These data are based on a radiation quality factor of about 20 for neutrons, adopted to reconcile data from the two cities (the Hiroshima gun-type bomb leaked the most neutrons; in the Nagasaki device, which worked by spherically symmetric implosion, the neutrons were mainly absorbed in the surrounding high explosive), i.e., 1 rad from neutrons was considered equivalent to 20 rads of gamma rays. The reduction in the natural leukemia rate by a 3.4 rad dose may be due to stimulation of the protein P53 repair mechanism, which repairs DNA strands broken by radiation, to a long-term boost to the immune system somehow caused by surviving the explosions with low doses, or to both. It is unlikely to be purely a random statistical error, because the sample of people exposed to low doses of radiation is very large: 23,073 people exposed to an average of 3.4 rads, with an unexposed control group of 31,581. However, the exact doses received were still fairly uncertain in 1982, and the survivors of Hiroshima and Nagasaki were still dying:


    This means that the early data from the 1950s, upon which all of the health physics philosophy is based (a linear dose-effects relation with no threshold dose before effects start to appear, and no effect of dose rate; see the previous post), is useless not only because of the poor dosimetry but also because it was premature to judge long-term effects from such early data. For example, the major source of 1950s data from Hiroshima and Nagasaki is summarised in a table on page 966 of the 1957 U.S. Congressional Hearings before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, The Nature of Radioactive Fallout and Its Effects on Man. The table, headed "Incidence of leukemia among the combined exposed populations of Hiroshima and Nagasaki by distance from the hypocenter (January 1948-September 1955)", is divided into distances of 0-1 km (0.96% of survivors had leukemia), 1-1.5 km (0.30%), 1.5-2 km (0.043%) and beyond 2 km (0.017%). This early data was simply not detailed enough, nor collected over a long enough period, to assess the effects of radiation properly, and there was no proper dosimetry to determine the doses people received, their shielding by houses (and the mutual shielding of clusters of houses), and so on. The valuable data has taken decades to accumulate.
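    The steepness of the fall-off in that 1957 table is easier to see as ratios; a quick sketch using the quoted incidence figures:

```python
# Leukemia incidence by distance band, from the table on p. 966 of the
# 1957 Congressional Hearings quoted above (combined Hiroshima and
# Nagasaki survivors, January 1948 - September 1955).
incidence = {            # distance from hypocenter -> % with leukemia
    "0-1 km":   0.96,
    "1-1.5 km": 0.30,
    "1.5-2 km": 0.043,
    ">2 km":    0.017,
}
baseline = incidence[">2 km"]    # essentially unexposed survivors
for band, pct in incidence.items():
    print(f"{band:>8}: {pct / baseline:4.1f}x the >2 km rate")
```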

    The joint Japanese-American (Department of Energy)-funded Radiation Effects Research Foundation isn't putting the sort of detailed dose-effects data we need on the internet, owing to political bias in favour of fashionable prejudice in Japan, despite such bias being cynically anti-scientific, ignorance-promoting, politically expedient dogmatism: its online 16-page booklet called 'Basic Guide to Radiation and Health Sciences' gives no quantitative results on radiation effects whatsoever, while promoting lies about radioactive rainout on page 5:



    Above: by the time the mass fires developed in the wooden homes of Hiroshima (breakfast time) and Nagasaki (lunch time), from blast wind-overturned cooking braziers, paper screens, and bamboo furnishings, the mushroom cloud had been blown away by the wind. The moisture and soot from the Hiroshima firestorm, which condensed into a 'black rain' after rising and cooling above the city, fell on the city an hour or two after the explosion and did not intersect the radioactive mushroom cloud, which had in any case attained a much higher altitude than the firestorm soot and moisture. The neutron-induced activity doses from the ground were trivial compared to the outdoor initial nuclear radiation doses, as illustrated in a previous post using the latest DS02 dosimetry. The RERF propaganda seeks to discredit civil defence by false propaganda, a continuation of the fellow-travelled Cold War Soviet communist propaganda against Western defenses.

    ‘Science is the organized skepticism in the reliability of expert opinion.’

    - R. P. Feynman (quoted by Smolin, The Trouble with Physics, 2006, p. 307).

    ‘Science is the belief in the ignorance of [the speculative consensus of] experts.’

    - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

    The linear no-threshold (LNT) anti-civil-defence dogma results from ignoring the vitally important effect of dose rate on cancer induction, which has been known and published (in papers by Mole and a book by Loutit) for about 50 years. The current dogma is falsely based on the total dose alone, ignoring the time-dependent ability of protein P53 and other cancer-prevention mechanisms to repair broken DNA segments. This is particularly the case for double strand breaks, where the whole double helix gets broken; the repair of single strand breaks is less time-dependent, because there is no risk of the broken single strand being joined to the wrong end of a broken DNA segment. Repair only prevents cancer if the broken ends are repaired correctly before too many unrepaired breaks have accumulated in a short time; if too many double strand breaks occur quickly, segments can be incorrectly 'repaired' by being mismatched to the wrong segment ends, possibly inducing cancer if the resulting somatic cell can then undergo division without apoptosis.
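    The dose-rate argument can be illustrated with a deliberately toy model (all parameter values below are invented for illustration, not taken from the radiobiology literature): breaks arise in proportion to dose rate, repair proceeds at a fixed maximum rate, and mis-joining requires two breaks to coexist:

```python
# A toy model (all parameters invented, purely illustrative) of the
# dose-rate effect described above: double strand breaks arise in
# proportion to dose rate, the P53-mediated machinery repairs them at a
# fixed maximum rate, and mis-joining requires two or more breaks to
# coexist, so harm grows nonlinearly with dose RATE, not total dose alone.
def misrepair_risk(total_dose_rads, exposure_hours,
                   breaks_per_rad=1.0, repairs_per_hour=5.0):
    dose_rate = total_dose_rads / exposure_hours        # rads/hour
    breaks_per_hour = breaks_per_rad * dose_rate
    # Unrepaired backlog of breaks coexisting at any moment:
    backlog = max(0.0, breaks_per_hour - repairs_per_hour)
    # Mis-joining needs at least two simultaneous breaks (quadratic term):
    return backlog ** 2 * exposure_hours

# The same 100-rad total dose, delivered acutely vs. spread over a week:
print(misrepair_risk(100, 1))       # acute exposure: large backlog, harm
print(misrepair_risk(100, 168))     # chronic: repair keeps pace, zero harm
```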
    See http://www.rerf.jp/top/qae.htm. If you look at the data they provide at http://www.rerf.or.jp/eigo/faqs/faqse.htm, it only extends to 1990 and deliberately doesn't include any dosimetry at all (although the doses depend on shielding, they could have dealt with this by providing average shielding figures at each range, or simply by ignoring distance and plotting dose versus effects). But I found the 1988 report update based on the 1986 dosimetry, which is close to the latest data: Y. Shimizu, et al., Life Span Study Report 11, Part 2, Cancer Mortality in the Years 1950-1985 Based on the Recently Revised Doses (DS86), Radiation Effects Research Foundation, RERF-TR-5-88:

    You can see that small doses up to 5 rads have no effect either way on the leukemia risk, while 6-9 rads in this data seems to cause a reduction in the normal leukemia risk, from 0.17% to 0.12%. Doses exceeding this are harmful, possibly because the P53 repair mechanism was saturated and could not keep up with the rate at which radiation-induced DNA damage occurred at higher doses. A dose of 20-40 rads more than doubles the natural leukemia risk; hence anyone getting leukemia after such a dose is more likely than not (over 50% probability) to have got the cancer as a result of the radiation exposure rather than naturally. (You cannot say this about other forms of cancer, because 23% of Americans now die from some form of cancer anyway, so even the risks at massive radiation doses can't compete with the natural risk for most types of cancer.) Notice that the DS02 dosimetry dose-effects estimates are within 10% of the earlier DS86 estimates. DS02 (Dosimetry System 2002) was adopted in 2003 and gives a radiation dose at 1 m above the ground, in open terrain at 1 km from ground zero, of 4.5 and 8.7 Gy in Hiroshima and Nagasaki respectively, with 0.08 and 0.14 Gy at 2 km, respectively. According to the recent Life Span Study report for the period 1950-2000, among 86,611 survivors for whom individual doses were estimated there were 47,685 deaths (55% of the survivors alive in 1950), including 10,127 from solid cancer and 296 from leukemia. Of the 10,127 solid cancer deaths, only 5% were due to radiation, as shown by comparison to a non-exposed (but otherwise matched) control group.
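    The "more than 50%" reasoning is the standard probability-of-causation calculation: if radiation multiplies the natural risk by a factor (1 + ERR), where ERR is the excess relative risk at the dose received, the chance that a given cancer was radiogenic is ERR/(1 + ERR); a risk that has more than doubled means ERR > 1, hence a probability above 50%:

```python
# Probability of causation: the chance that an observed cancer was caused
# by the radiation rather than arising naturally, given the excess
# relative risk (ERR) at the dose received.
def probability_of_causation(err):
    return err / (1 + err)

print(f"{probability_of_causation(1.0):.0%}")   # risk doubled  -> 50%
print(f"{probability_of_causation(1.5):.0%}")   # risk x2.5     -> 60%
print(f"{probability_of_causation(0.1):.0%}")   # risk x1.1     -> 9%
```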

    In 1969, Professor Ernest Sternglass, a physicist, correlated a dramatic increase in infant mortality during the 1950s with the increasing fallout radiation from nuclear testing. His papers and books on low-level radiation effects were unscientific in the sense that they illustrate how not to do science: he had no control group, unlike the Hiroshima and Nagasaki data, so he had no idea what was actually driving childhood mortality. It could have been diet, proximity to smoking adults at home, effects of natal X-rays (see previous post), childhood X-ray checks or screening for TB, etc.


    Above: Professor Sternglass' analysis, which wasn't even based upon a real increase in childhood mortality, which was falling before, during and after the nuclear tests. Sternglass instead claimed that, in the absence of nuclear testing, childhood mortality should (in his opinion) have somehow continued to decrease at the average rate of 1935-50 (when better medical care was reducing childhood mortality). He then claimed that the flattening of the curve which occurred instead was evidence for a relative increase in childhood mortality due to radiation from nuclear test fallout.

    He was therefore first assuming that fallout from bomb testing was responsible, and then (without stating this assumption) using it to claim that the correlation between infant mortality and fallout rate was evidence that fallout was causing the increase! His first presentation was at the 9th Annual Hanford Biological Symposium, May 1969. On 24 July 1969, Dr Alice Stewart wrote an article for the New Scientist, "The Pitfalls of Extrapolation", which found another contradiction in Professor Sternglass' theory:

    "Sternglass has postulated a fetal mortality trend which would eventually produce rates well below the level which - according to his own theory - would result from background radiation."

    The danger here is that bad science, lacking any mechanism, can be asserted and gain credibility with the public despite being completely false, just because a scientist misuses authority to gain attention. In this case, when Sternglass' paper was rejected from a scientific journal, he had it published in the September 1969 issue of Esquire magazine, titled 'The death of all children'. The magazine advertised the story as a selling point and sent out copies to prominent people in politics. Had he possessed scientific evidence that was being covered up, that would have been a reason to do so, assuming the media would be interested in making a political storm out of the facts (which in reality strongly support the opposite of Sternglass' conclusion). So you end up with the idea that these false claims about low-level radiation stem from politics: if the public wants to fear low-level, low dose rate radiation, someone will fiddle the statistics accordingly. Anyone giving the facts is conveniently ignored, or ridiculed as being 'out of touch' or part of a conspiracy and cover-up.

    Sternglass' straight-line extrapolation is complete pseudoscience: carried into the past it predicts a time of 100% infant mortality (evidently wrong, because people are alive now), and extrapolated into the future it predicts 0% childhood mortality (clearly false, because disease cannot be eradicated, despite innovations like sulfonamides and antibiotics in the 1935-50 era). This type of error, due to a lack of causality and proper mechanism-based predictions, is not limited to the controversy over the effects of radiation. It is also typical of how controversy is created by people like Dr Edward Witten, string theorist, and is completely false science: Witten claims that string theory has the wonderful property of 'predicting gravity'. Actually, it predicts nothing checkable about gravity: 11-dimensional supergravity is simply assumed to be true, because it is assumed that gravity is due to spin-2 gauge bosons (gravitons), which nobody has ever observed. What Witten should have said is that string theory is an ad hoc model which includes spin-2 gravitons that nobody has ever seen, which is a far cry from claiming that it predicts gravity. These people are in some way well-meaning, I'm sure, but they are being deliberately misleading over scientific facts to boost some research program or political viewpoint, for the sake of politics or controversy, not science. As stated in an earlier post, quite a bit of iodine-131 was released across America by Nevada testing in 1951-62, but even the effects of that were far smaller than what Sternglass was claiming.
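    The absurdity of the straight-line extrapolation is easy to demonstrate with invented but representative numbers (the 1935 and 1950 mortality figures below are illustrative assumptions, not Sternglass' actual data):

```python
# Sternglass-style straight-line extrapolation with invented but
# representative numbers: suppose infant mortality fell linearly from
# 5.5% (1935) to 3.0% (1950), and extend that line in both directions.
slope = (3.0 - 5.5) / (1950 - 1935)     # percentage points per year

def extrapolated_mortality(year):
    return 3.0 + slope * (year - 1950)

print(extrapolated_mortality(1968))   # hits 0% by 1968, negative afterwards
print(extrapolated_mortality(1830))   # 23% going back; reaches 100% at 1368
```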

    Darrell Huff's book How to Lie with Statistics has the example that researchers found the number of children in a Dutch family correlated with the number of storks' nests on the roof of the home! Perhaps that proves storks really deliver children? Actually, the bigger the family, the bigger the home needed on average: the bigger families tended to have bigger, older houses, whose big old roofs held more storks' nests because of their size and age. Professor Sternglass has recently moved on from claiming that low-level radiation is lethal: he has published a book about what happened before the big bang, an analogy to an egg dividing many times to produce all the particles.

    (I don't find that too scientific either, because it simply ignores the pair-production mechanism for the creation of fundamental particles in strong fields. It is essentially ad hoc theorizing which doesn't explain or predict the key issues in the cosmological application of general relativity, such as the epicycles of dark matter and dark energy, and it doesn't explain the Standard Model of particle physics; as a result, perhaps, it has not gained as much attention as his claims on low-level radiation. However, Sternglass is right in some details: the cause of double-slit interference with single photons being the size of the photon compared to the slit spacing; Bohr's mainstream Copenhagen Interpretation orthodoxy being 'not even wrong' unpredictive belief; and Dirac's sea in quantum field theory being censored today as a physical mechanism because of heresies over aether, which he discussed with Einstein and others like Feynman, who advised him to check and prove his ideas more carefully.)

    Update: the report by Donald A. Pierce and Dale L. Preston of RERF, 'Radiation-Related Cancer Risks at Low Doses among Atomic Bomb Survivors' in Radiation Research v. 154 (2000), pp. 178–186 states: 'Analysis is of solid cancer [not leukemia] incidence from 1958–1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. ... There is a statistically significant risk in the range 0–0.1 Sv, and an upper confidence limit on any possible threshold is computed as 0.06 Sv. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.'

    D. L. Preston et al., 'Effect of Recent Atomic Bomb Survivor Dosimetry Changes on Cancer Mortality Risk Estimates', Radiation Research, v162 (2004), pp. 377-389, state: 'The Radiation Effects Research Foundation has recently implemented a new dosimetry system, DS02, to replace the previous system, DS86. This paper assesses the effect of the change on risk estimates for radiation-related solid cancer and leukemia mortality. The changes in dose estimates were smaller than many had anticipated, with the primary systematic change being an increase of about 10% in γ-ray estimates for both cities. In particular, an anticipated large increase of the neutron component in Hiroshima for low-dose survivors did not materialize. However, DS02 improves on DS86 in many details, including the specifics of the radiation released by the bombs and the effects of shielding by structures and terrain. ... For both solid cancer and leukemia, estimated age–time patterns and sex difference are virtually unchanged by the dosimetry revision. The estimates of solid-cancer radiation risk per sievert and the curvilinear dose response for leukemia are both decreased by about 8% by the dosimetry revision, due to the increase in the γ-ray dose estimates...' However, the difficulty of finding any recent reported summary of the key data on the internet suggests that maybe they are not publishing the detailed dose-versus-effects data, but just some average based on force-fitting the high-dose effects data to the linear no-threshold model. Otherwise, publication would just embarrass the orthodoxy and draw the ignorant scorn of the anti-nuclear lobby. Of course, the public at large only wants to hear lies about radiation because they've been brainwashed by propaganda based on prejudices, not science, and the media provide what readers want to hear: political arguments.

    The information from the current online version of http://www.rerf.or.jp/eigo/faqs/faqse.htm#faq2, quoting data for 1950-90 from Radiation Research (v146, pp. 1-27, 1996) without any doses to correspond to the effects, despite the massive 2002 dosimetry project, clearly seems to prove that the Radiation Effects Research Foundation is covering up the dose-effects data: it only makes available on the internet data stripped of the dosimetry, so as not to upset the 1950s linear no-threshold orthodoxy, or rather dogma. Of course, if they didn't cover up, the implications would cause uproar. So the one really valuable source of information is censored.

    There is no other really reliable data, because of the lack of good control groups (with similar exposures to other risks, similar lifestyles, etc.) and of statistically significant exposed population sizes. For example, the 64 Marshallese on Rongelap after the Bravo test, who were exposed to about 175 rads of gamma radiation from fallout over 44 hours in 1954, are too small a sample to get accurate long-term data from. In 1972, one person in the group of 64 died from leukemia attributed to the gamma radiation, and several thyroid nodules (most thyroid effects of radiation are not lethal) also occurred as a result of beta radiation to the thyroid gland, from drinking water collected in an open cistern which became contaminated by fallout containing iodine-131. Although this gives a leukemia risk of 1/64 after 175 rads received over 44 hours, the figure is statistically very weak because of the small sample size.
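    Just how weak a 1-in-64 observation is can be quantified with an exact (Clopper-Pearson) binomial confidence interval, computed here from first principles using only the standard library; the 95% interval spans roughly 0.04% to 8%:

```python
# An exact (Clopper-Pearson) 95% confidence interval for 1 leukemia death
# observed among 64 people, using only the standard library: bisection on
# the binomial tail probabilities.
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def bisect(f, lo, hi, tol=1e-9):
    """Root of a decreasing function f on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

n, k = 64, 1
# Lower bound: p at which P(X >= k) = 0.025.
lower = bisect(lambda p: 0.025 - (1 - binom_cdf(k - 1, n, p)), 0.0, 1.0)
# Upper bound: p at which P(X <= k) = 0.025.
upper = bisect(lambda p: binom_cdf(k, n, p) - 0.025, 0.0, 1.0)
print(f"point estimate 1/64 = {1/64:.4f}")
print(f"95% CI: ({lower:.4f}, {upper:.3f})")
```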

    Hiroshima and Nagasaki data are being deliberately abused for propaganda purposes, by ignoring the low dose data and falsely taking high dose data as if effects are directly proportional to dose, with no threshold and no dose rate effect. Sometimes in the past, claims have been made that the cancer rates were worse than previously thought. In 1957, Japanese-type isolated wooden houses were exposed to nuclear tests in Nevada during Operation Plumbbob, to determine how much radiation shielding they provided. It's obvious that a cluster of houses will provide more shielding than an isolated house in a desert, because the slant direct and scattered radiation must also penetrate the surrounding buildings. It turned out that the wooden houses gave a typical protection factor of about 2-3 against the initial neutrons and gamma rays; the shielding by adjacent buildings was ignored. Later it was shown that the mutual shielding of surrounding houses in a city doubles the overall protection factor for wooden houses, from 2-3 to 4-6. As a result, the estimated doses were halved. This meant that the same number of cancers was attributed to only half as much radiation, so the inferred number of cancers per unit of radiation was doubled.
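    The effect of that dosimetry revision on the inferred risk coefficient is simple arithmetic (the cancer count and doses below are illustrative numbers, not the actual data):

```python
# Effect of the dosimetry revision described above, in numbers
# (the cancer count and doses are illustrative, not the actual data).
cancers_observed = 100           # fixed: the observed cancer count did not change
dose_old = 50.0                  # rads, assuming protection factor 2-3
dose_new = dose_old / 2          # halved: mutual shielding doubles PF to 4-6

risk_per_rad_old = cancers_observed / dose_old
risk_per_rad_new = cancers_observed / dose_new
print(risk_per_rad_new / risk_per_rad_old)   # 2.0: inferred risk per rad doubles
```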

    So these revisions were caused by dosimetry, not new effects showing up! The dosimetry is very accurate now. The effects of radiation are "well known" in the scientific sense, although they're not "well known" in the political sense.

    Kenneth L. Mossman of Arizona State University wrote a review of the problem in the March 1998 issue of Medical Physics (v25, Issue No. 3, pp. 279-284), on 'The linear no-threshold debate: Where do we go from here?', arguing:

    'For the past several years, the LNT (linear no-threshold) theory has come under attack within the scientific community. Analysis of a number of epidemiological studies of the Japanese survivors of the atomic bombings and workers exposed to low level radiation suggest that the LNT philosophy is overly conservative, and low-level radiation may be less dangerous than commonly believed. Proponents of current standards argue that risk conservatism is justified because low level risks remain uncertain and it is prudent public health policy; LNT opponents maintain that regulatory compliance costs are excessive, and there is now substantial scientific information arguing against the LNT model. Regulators use the LNT theory in the standards setting process to predict numbers of cancers due to exposure to low level radiation because direct observations of radiation-induced cancers in populations exposed to low level radiation are difficult. The LNT model is simplistic and provides a conservative estimate of risk. Abandoning the LNT philosophy and relaxing regulations would have enormous economic implications. However, alternative models to predict risk at low dose are as difficult to justify as the LNT model. Perhaps exposure limits should be based on model-independent approaches. There is no requirement that exposure limits be based on any predictive model. It is prudent to base exposure limits on what is known directly about health effects of radiation exposure of human populations.'

    A more recent review, in 2005, of the mechanism behind the Hiroshima and Nagasaki data at low doses was done by L. E. Feinendegen in his paper, 'Evidence for beneficial low level radiation effects and radiation hormesis' in the British Journal of Radiology, v78 (2005), pp. 3-7:

    'Low doses in the mGy range [1 mGy = 0.1 rad, since 1 Gray = 1 Joule/kg = 100 rads] cause a dual effect on cellular DNA. One is a relatively low probability of DNA damage per energy deposition event and increases in proportion to the dose. At background exposures this damage to DNA is orders of magnitude lower than that from endogenous sources, such as reactive oxygen species. The other effect at comparable doses is adaptive protection against DNA damage from many, mainly endogenous, sources, depending on cell type, species and metabolism. Adaptive protection causes DNA damage prevention and repair and immune stimulation. It develops with a delay of hours, may last for days to months, decreases steadily at doses above about 100 mGy to 200 mGy and is not observed any more after acute exposures of more than about 500 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses reduction of damage from endogenous sources by adaptive protection maybe equal to or outweigh radiogenic damage induction. Thus, the linear-no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favour of a threshold or hormesis. This is consistent with data both from animal studies and human epidemiological observations on low-dose induced cancer. The LNT hypothesis should be abandoned and be replaced by a hypothesis that is scientifically justified and causes less unreasonable fear and unnecessary expenditure.'
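    For readers more familiar with SI units, the conversions used in the bracketed note above are trivial to encode (1 Gy = 100 rads; for gamma rays the radiation weighting factor is 1, so 1 Gy is roughly 1 Sv):

```python
# Dose-unit conversions from the bracketed note above:
# 1 gray (Gy) = 1 J/kg = 100 rads, so 1 mGy = 0.1 rad.
def gray_to_rad(gy):
    return gy * 100.0

def mgy_to_rad(mgy):
    return mgy / 10.0

print(gray_to_rad(1))      # 100.0 rads
print(mgy_to_rad(100))     # 10.0 rads (where adaptive protection starts to wane)
print(mgy_to_rad(500))     # 50.0 rads (above which it is no longer observed)
```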

    Online there is a 1982 book by Harvey Wasserman, Norman Solomon, Robert Alvarez and Eleanor Walters called 'Killing our Own: Chronicling the Disaster of America's Experience with Atomic Radiation, 1945-1982'. It contains a summary of all the radiation horror stories (some like Sternglass, et al., are pseudoscience, and some are valid). It doesn't contain any of the basic data with large control groups that shows how many excess cancers actually occur as a function of dose for particular dose rates. It relies instead on the opinions of committees and scientific authorities, repeating Sternglass' claims in chapter 11 and complaining that 'The industry as a whole has devoted thousands of dollars to undercutting his reputation.' That's the problem: you can't deal with errors by making ad hominem attacks on the reputations of the people making the errors, but by clearly emphasising where the errors are. Better still, publish the facts briefly, clearly, honestly, and fairly as simple graphs in the first place, and then the public will know what they are and will be able to make informed judgements.

    In May 1985, a U.S. National Research Council report on mortality in nuclear weapons test participants raised several questions. Some 5,113 nuclear test participants had died between 1952 and 1981, when 6,125 deaths would be expected for a similarly sized group of non-exposed Americans. The number of leukemia deaths was 56, identical to that in a similarly sized non-exposed group. However, as the graph at the top of this post shows, the risk depends on the dose, so the few people with the highest doses would have far greater risks. In 1983, a C.D.C. report on the effects of fallout from the Plumbbob-Smoky test in 1957 showed that 8 participants in that test had died from leukemia up to 1979, compared to only 3 expected from a similarly sized sample of non-exposed Americans. However, even for the Plumbbob-Smoky test, the overall death rate from all causes in the exposed test participants (320 deaths from 1957-79) was less than that in a matched sample of non-exposed Americans (365 deaths). The average dose to American nuclear test participants was only about 0.5 rad, although far higher doses were received by those working with fallout soon after nuclear tests. Altogether, out of 205,000 U.S. Department of Defense participants in nuclear tests, 34,000 were expected to die from naturally occurring cancer, and 11 from cancer due to radiation exposure. (According to the March 1990 U.S. Defense Nuclear Agency study guide DNA1.941108.010, report HRE-856, Medical Effects of Nuclear Weapons.)
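    As a rough back-of-envelope check (my own sketch, not part of the NRC or C.D.C. reports), the Plumbbob-Smoky leukemia figures can be tested against a simple Poisson expectation, which shows why 8 deaths against 3 expected drew attention:

```python
from math import exp, factorial

def poisson_tail(k_observed, mean):
    """P(X >= k_observed) for a Poisson-distributed count with the given mean."""
    cdf = sum(mean ** k * exp(-mean) / factorial(k) for k in range(k_observed))
    return 1.0 - cdf

# Plumbbob-Smoky participants up to 1979: 8 leukemia deaths observed, ~3 expected.
p_excess = poisson_tail(8, 3.0)
print(f"P(8 or more leukemia deaths by chance) = {p_excess:.3f}")  # ~0.012
```

    So the leukemia cluster had only about a 1% probability of arising by chance, even though all-cause mortality in the same group (320 deaths versus 365 expected) was below expectation.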

    Update (13 August 2007):

    There is an essay by Dr Donald W. Miller, Afraid of Radiation? Low Doses are Good for You, available in PDF format here. Two problems with that title are:

    (1) as pointed out in previous posts, only long-ranged, low-LET radiation like gamma rays and x-rays (which are electrically neutral, and thus only weakly ionizing) exhibit a threshold in all reliable data. Alpha and beta radiations are short-ranged, high-LET radiation, so where they can gain entry to the body (by being inhaled or ingested in soluble form, for example, which is not too easy for insoluble radioactivity trapped in fallout particles composed of fused glass spheres from melted sand grains), they can irradiate a few nearby cells very intensely because of their short range. With alpha and beta radiation, there is no threshold dose and all exposure is potentially harmful; the effects do obey a linear dose-response relationship at low doses of alpha and beta exposure. Only for gamma and x-rays at low dose rates are there thresholds and benefits possible from boosting the immune system and DNA repair mechanisms like P53.

    (2) the dose rate seriously affects the rate of cancer induction, which is an effect currently ignored completely by Health Physicists. This is because all laws and personal 'dosimeter' radiation monitoring systems for radiation safety record merely the integrated total doses, without regard to the dose rate at which the dose was received. (Some effects prediction schemes do make arbitrary 'factor of two' corrections, doubling the danger expected from doses received above some threshold for high dose rates, but these corrections grossly neglect the observed facts; see previous post for details of how this was discovered in animal experiments, and why it is still censored out!).

    Summary: gamma or x-ray radiation received at a low dose rate in small total doses can reduce the normal cancer rate. If this small total dose is received at a high dose rate, however, protein P53 may not be able to repair the damage fast enough during the exposure, and if multiple breaks are produced in DNA strands in a short period of time, the broken pieces risk being 'repaired' incorrectly (joined the wrong way around, for instance), initiating cancer at some future time when that DNA is unzipped and copied in order to create new cells.

    This isn't rocket science. As an analogy, solar radiation from the sun contains ultraviolet radiation, which will make a geiger counter (provided it is fitted with a transparent glass window, not a shield that keeps ultraviolet light out!) click rapidly, since it borders the soft x-ray spectrum and is weakly ionizing. If you receive ultraviolet radiation at a low dose rate in small total doses, the positive effects may outweigh the risks: vitamin D is produced, which is helpful rather than dangerous. If, however, you are exposed to very intense ultraviolet, the DNA in your skin gets broken up at a rate faster than protein P53 can stick the pieces together again, so some bits are put back together in the wrong order and skin cancer may eventually result when those cells try to divide to form new skin cells. The visible 'burning' of skin by ultraviolet is also due to dose rate effects causing cellular death and serious cellular disruption. It doesn't matter so much what the total dose is. What matters even more than the dose, for long term effects, is the dose rate (speed) at which the radiation dose is received.

    The key facts about radiation seem to be: it's all harmful at sufficiently high dose rates and at high doses. Gamma and x-rays are 'safe' (i.e., have advantages which outweigh risks) at low dose rates (obviously dose rates were high at Hiroshima and Nagasaki, where 95% of the doses were received from initial radiation lasting 10 seconds) and at low total doses. On the other hand, there is always a risk from cellular exposure to alpha and beta radiation, because they are short-ranged and their energy is all absorbed in just a small number of cells. Because they are quickly stopped by solid matter, they deposit all their energy in sensitive areas of bone tissue if you inhale or ingest sources of alpha and beta radiation that can be deposited in bones (a very small fraction of ingested soluble radium, strontium, uranium, and plutonium can end up in the bones). Gamma rays and x-rays are not dangerous at low dose rates and small total doses because, carrying no electrical charge, they are not stopped as easily by matter as alpha and beta particles are. This means that gamma and x-rays deposit their energy over a larger volume of tissue, so that at low dose rates DNA repair mechanisms can repair damage as soon as it occurs.

    Anyway, to get back to the paper by Donald W. Miller, Jr., MD, he does usefully explain an evolved conspiracy to confuse the facts:

    'A process known as radiation hormesis mediates its beneficial effect on health. Investigators have found that small doses of radiation have a stimulating and protective effect on cellular function. It stimulates immune system defenses, prevents oxidative DNA damage, and suppresses cancer.'

    He cites the monumental report on effects of low dose rate, low-LET gamma radiation on 10,000 people in Taiwan by W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, Is Chronic Radiation an Effective Prophylaxis Against Cancer?, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here:

    'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

    'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

    'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

    'The experience of these 10,000 persons suggests that long term exposure to [gamma] radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'
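    The headline figures in the quoted passage can be reproduced with a few lines of arithmetic (a sketch; I assume the 20-year upper end of the occupancy range as the person-year denominator, since that is what yields the quoted rate):

```python
persons = 10_000
occupancy_years = 20               # assumed: upper end of the 9-20 year range
cancer_deaths = 7
collective_dose_person_sv = 4_000

rate_per_100k_person_years = cancer_deaths / (persons * occupancy_years) * 100_000
mean_dose_sv = collective_dose_person_sv / persons

print(rate_per_100k_person_years)  # 3.5, versus 116 in the general Taiwanese population
print(mean_dose_sv)                # 0.4 Sv average dose per occupant
```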



    The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations of Hiroshima and Nagasaki data to low doses. For instance, the survivors of high doses in Hiroshima and Nagasaki knew they had been irradiated, so no blind was possible to prevent an 'anti-placebo' effect: increased fear, psychological stress and worry about the long term effects of radiation, and the associated behaviour. The 1958 book about the Hiroshima and Nagasaki survivors, 'Formula for Death', makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation induced cancer anyway. The fear culture of the irradiated survivors would therefore be expected to produce statistically deviant behaviour, in some cases increasing the cancer risks above those due purely to radiation exposure.

    For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see

    http://en.wikipedia.org/wiki/Radiation_hormesis

    The irrational, fashionable, groupthink semi-religious (believing in speculation) society we live in!

    ‘Science is the organized skepticism in the reliability of expert opinion.’ - R. P. Feynman (quoted by Lee Smolin, The Trouble with Physics, 2006, p. 307).

    ‘Science is the belief in the ignorance of [the speculative consensus of] experts.’ - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

    If we lived in a rational society, the facts above would be reported in the media and would be the focus of discussion about radiation hazards. Instead, the media and their worshippers (the politicians), as well as their funders (the general public, who pay for the media), choose to ignore or ridicule the facts because the facts are 'unfashionable', while lying bullshit (see the Sternglass graph above) is 'fashionable': a consensus of mainstream narcissistic elitists with a political mandate to kill people by lying about the effects of low-level radiation and refusing to discuss the facts. There is no uncertainty about these facts, as radiation effects have been better checked and more extensively studied than any other alleged hazard to life!

    Below is a little summary of politically-inexpedient facts from a book edited by Nobel Laureate Professor Eugene P. Wigner, Survival and the Bomb: Methods of Civil Defense, Indiana University Press, Bloomington, London, 1969.

    The dust jacket blurb states: 'The purpose of civil defence, Mr. Wigner believes, is the same as that of the anti-ballistic missile: to provide not a retaliation to an attack, but a defense against it; for no peace is possible as long as defense consists solely in the threat of revenge and as long as an aggressor - the one who strikes first - has a considerable advantage. Civil and anti-ballistic missile defense not only provide some protection against an attack, they render it less likely by decreasing the advantage gained by striking first.'

    The chapter on 'Psychological Problems of A-Bomb Defense' is by Professor of psychology, Irving L. Janis, who states on p. 61:

    'It has been suggested that the device of using increasing doses of graphic sound films (preferably in technicolor) showing actual disasters should be investigated as a possible way of hardening people and preventing demoralization.'

    He adds on pp. 62-3:

    'For the large number of patients who will be worried about epilation, ugly scar tissue, and other disfigurements, a special series of pamphlets and posters might be prepared in advance, containing reassuring information about treatment and the chances of recovery.'

    On pp. 64-5 he deals with the 'General Effects on Morale of A-Bomb Attack':

    'In general, a single atomic bomb disaster is not likely to produce any different kind of effects on morale than those produced by other types of heavy air attacks. This is the conclusion reached by USSBS [U.S. Strategic Bombing Survey, 1945] investigators in Japan. Only about one-fourth of the survivors of Hiroshima and Nagasaki asserted that they had felt that victory was impossible because of the atomic bombing. The amount of defeatism was not greater than that in other Japanese cities. In fact, when the people of Hiroshima and Nagasaki were compared with those in all other cities in Japan, the morale of the former was found to resemble that of people in the lightly bombed and unbombed cities rather than in the heavily bombed cities. This has been explained as being due to the fact that morale was initially higher than average in the two cities because, prior to the A-Bomb disasters, the populace had not been exposed to a series of heavy air attacks. Apparently a single A-Bomb attack produced no greater drop in morale among the Japanese civilians than would be expected from a single saturation raid of incendiaries or of high explosive bombs.'

    On p. 68, Professor Janis addresses the question 'Will There Be Widespread Panic?':

    'Prior to World War II, government circles in Britain believed that if their cities were subjected to heavy air raids, a high percentage of the bombed civilian population would break down mentally and become chronically neurotic. This belief, based on predictions made by various specialists, proved to be a myth.'

    The chapter on 'Decontamination' is by Dr Frederick P. Cowan (then the Head of the Health Physics Division, Brookhaven National Laboratory) and Charles B. Meinhold, who summarise data from a vital selection of decontamination research reports. The first report summarised (on page 227) is J. C. Maloney, et al., Cold Weather Decontamination Study, McCoy, I, II, and IV, U.S. Army Chemical Corps., Nuclear Defense Laboratory, Edgewood Arsenal, reports NDL-TR-24, -32, and -58 (1962, 1962 and 1964), which demonstrated that:

    1. 'In most cases, the time during which access to important facilities must be denied can be reduced by a factor of 10 (e.g., from two months to less than a week) using practical methods of decontamination.'

    2. 'Radiation levels inside selected structures can be reduced by a factor of 5.'

    3. 'Radiation levels outdoors in selected areas can be reduced by a factor of 20.'

    4. 'These results can be achieved without excessive exposure to individuals carrying out the decontamination.'
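    The factor-10 cut in denial time in point 1 is broadly consistent with a simple model (my sketch, not taken from the report) that combines a decontamination factor with the conventional t^{-1.2} decay rule for mixed fission products:

```python
# If the fallout dose rate decays as R(t) = R1 * t**-1.2 (the usual
# approximation for mixed fission products), then cutting the dose rate by a
# decontamination factor F shortens the wait until any fixed re-entry
# threshold is reached by a factor of F**(1/1.2).

def denial_time_ratio(decon_factor, decay_exponent=1.2):
    return decon_factor ** (1.0 / decay_exponent)

print(round(denial_time_ratio(10), 1))  # 6.8x shorter wait for a factor-10 cleanup
print(round(denial_time_ratio(20), 1))  # 12.1x for the outdoor factor-20 case
```

    On this model a two-month denial period shrinks to roughly nine days after factor-10 decontamination, the same order of magnitude as the report's 'two months to less than a week'.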

    On page 228, Cowan and Meinhold point out:

    'Although long sheltering periods may in some cases be reduced by the effect of rainfall or by transfer of people to less-contaminated areas, it is clear that decontamination is a very important technique for hastening the process of recovery.

    'Although the gamma radiation from fallout is the major concern, the effects of beta radiation should not be overlooked. Fallout material left on the skin for an extended period of time [this critical time is just a few minutes for fallout contamination an hour after the explosion, but much longer periods of exposure are required for burns if the fallout is more than an hour old, and after 3 days the specific activity of fallout from a land surface burst is simply too low to cause beta burns] can cause serious burns, and if inhaled or ingested in sufficient quantities, it can result in internal damage. Grossly contaminated clothing may contribute to such skin exposures or indirectly to the ingestion of radioactive material. Thus it may be necessary to resort to decontamination of body surfaces, clothing, food and water.'

    On pp. 229-230, the basic facts about land surface burst fallout stated are:

    1. 'The mass of the radioactive material itself is a tiny fraction of the mass of the inert fallout material with which it is associated. Thus, in discussing the mechanics of removal, fallout may be considered as a type of dirt.'

    2. 'In general, the amount of radioactive material removed is proportional to the total amount of fallout material removed.'

    3. 'Although the solubility of fallout particles depends on the composition of the ground where the detonation took place, it is fair to say that detonations over land will produce essentially insoluble particles while detonations over water will produce much less but fairly soluble fallout material. This soluble material will have a much greater tendency to adsorb to surfaces.'

    4. 'Under most circumstances one is dealing with small particle sizes.

    'The methods applicable to radiological decontamination are those available to dirt removal in general. Some common examples are sweeping, brushing, vacuuming, flushing with water, scrubbing, surface removal, and filtration. In addition, the radioactive material can be shielded by plowing, spading, covering with clean dirt or by construction of protective dikes. Such methods may utilize power equipment or depend upon manual labor. Their effectiveness will vary widely, depending upon the method of application, the type of surface, the conditions of deposition, etc. ...


    'Flushing with water can be very effective, particularly if the water is under pressure, the surface is smooth and proper drainage [to deep drains, where the radiation is shielded by intervening soil] is available. Under certain conditions, the use of water flushing during the deposition period can be of great value. The water will tend to fill the surface irregularities and prevent entrapment of particles. Soluble materials will be kept in solution, thereby reducing the chance of surface adsorption.'

    On p. 232, a useful summary table of decontamination is given:




    There is other extensive data on fallout decontamination in many previous posts on this blog, e.g., the posts here, here and here (this last link includes a slightly different table of decontamination efficiencies, which is interesting to compare to the table of data above), as well as several other earlier ones. In summing up the situation for urban area decontamination, Cowan and Meinhold state on p. 232:

    'A number of factors make large-scale decontamination useful in urban areas. Much of the area between buildings is paved and, thus, readily cleaned using motorized flushers and sweepers, which are usually available. If, in addition, the roofs are decontaminated by high-pressure hosing, it may be possible to make entire buildings habitable fairly soon, even if the fallout has been very heavy.'

    On page 237 they summarise the evidence concerning methods for the 'Decontamination of People, Clothing, Food, Water and Equipment':

    'Since fallout is basically dirt contaminated with radioactive substances, it can be largely removed from the skin by washing with soap and water. ... Not all the radioactivity will be removed by washing, but that remaining will not be large enough to be harmful. ... To be a problem in relation to food, fallout must get into the food actually eaten by people. ... Vegetables exposed to fallout in the garden will be grossly contaminated but may still be usable after washing if protected by an outer skin or husk or if penetration of fallout into the edible portions is not excessive. ... Reservoirs will receive fallout, but much of it will settle to the bottom, be diluted by the huge volume of water, or be removed by the filtering and purifying systems. Cistern water may be very contaminated if contaminated rainwater or water from contaminated roofs has been collected. Milk from cattle who have fed on contaminated vegetation may contain large quantities of radioactive iodine for a period of a month or more ... but milk can be used for durable products such as powdered milk or cheese, since the radioactive iodine decays with a half-life of eight days. Thus, after a month only 7 percent of the initial [Iodine-131] remains.'
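    The iodine-131 figure at the end of that passage is easy to verify with the standard exponential decay law (half-life 8 days):

```python
half_life_days = 8.0      # iodine-131
storage_days = 30         # e.g. milk held back as powder or cheese for a month

fraction_remaining = 0.5 ** (storage_days / half_life_days)
print(f"{fraction_remaining:.1%}")  # 7.4%, matching the 'only 7 percent' figure
```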

    There is then a chapter on 'Economic Recovery' by Professor of Economics, Jack Hirshleifer, who points out on page 244:

    'Economic recovery from localized bombing attacks in general has been quite remarkable. In Hiroshima, for example, power was generally restored to surviving areas on the day after the attack, and through railroad service recommenced on the following day. [Ref.: U.S. Strategic Bombing Survey, The Effects of Atomic Bombs on Hiroshima and Nagasaki, Washington, D.C., 1946, p. 8.]

    'By mid-1949, population was back to the preattack level, and 70 percent of the destroyed buildings had been reconstructed. [Ref.: Research Department, Hiroshima Municipal Office, as cited in Hiroshima, Hiroshima Publishing, 1949.]

    'In general, populations of damaged areas have been highly motivated to stay on, even in the presence of severe deprivation; once having fled, they have been anxious to return. The thesis has even been put forward that a community hit by disaster rebounds so as to attain higher levels of achievement than would otherwise have been possible. [Ref.: this refers to the study after the 1917 Halifax explosion, made by Samuel H. Prince, Catastrophe and Social Change, Columbia University-Longmans, Green, New York, 1920.] ...

    'In the midnineteenth century John Stuart Mill commented on:

    ... what has so often excited wonder, the great rapidity with which countries recover from a state of devastation; the disappearance, in a short time, of all traces of the mischiefs caused by earthquakes, floods, hurricanes, and the ravages of war. An enemy lays waste a country by fire and sword, and destroys or carries away nearly all the moveable wealth existing in it: all the inhabitants are ruined, and yet in a few years after, everything is much as it was before. - J.S. Mill, 'Principles of Political Economy', Ashley's New Edition, Longmans, Green, London, 1929, Book I, pp. 74-75.



    From Dr Samuel Glasstone and Philip J. Dolan, The Effects of Nuclear Weapons, 3rd ed., 1977, pp. 611-3:


    "From the earlier studies of radiation-induced mutations, made with fruitflies [by Nobel Laureate Hermann J. Muller and other geneticists who worked on plants, who falsely hyped their insect and plant data as valid for mammals like humans during the June 1957 U.S. Congressional Hearings on fallout effects], it appeared that the number (or frequency) of mutations in a given population ... is proportional to the total dose ... More recent experiments with mice, however, have shown that these conclusions need to be revised, at least for mammals. [Mammals are biologically closer to humans, in respect to DNA repair mechanisms, than short-lived insects whose life cycles are too small to have forced the evolutionary development of advanced DNA repair mechanisms, unlike mammals that need to survive for decades before reproducing.] When exposed to X-rays or gamma rays, the mutation frequency in these animals has been found to be dependent on the exposure (or dose) rate ...


    "At an exposure rate of 0.009 roentgen per minute [0.54 R/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. [Emphasis added.] There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent ... with adult female mice ... a delay of at least seven weeks between exposure to a substantial dose of radiation, either neutrons or gamma rays, and conception causes the mutation frequency in the offspring to drop almost to zero. ... recovery in the female members of the population would bring about a substantial reduction in the 'load' of mutations in subsequent generations."






    Above: the theory of the experimentally observed threshold doses for the radium dial painters and for the Hiroshima survivors.

    Updates: http://glasstone.blogspot.com/2009/10/secrecy-propaganda-factual-evidence.html

    Wednesday, March 21, 2007

    Effect of dose rate (not merely dose) on the effects of radiation


    Dr John F. Loutit of the Medical Research Council, Harwell, England, in 1962 wrote a book called Irradiation of Mice and Men (University of Chicago Press, Chicago and London), which examines in detail the evidence for leukemia induced by radiation as known 45 years ago.

    Obviously at that time the human data collected from Hiroshima and Nagasaki were far from complete, for although the peak radiation-induced leukemia rate occurred in 1951/52, some 6-7 years after exposure, the latent period was much longer in many cases. In any case, the doses which the survivors received were poorly known. Today, even the shielded doses (the calculation of radiation shielding is vital for those who survived in brick or concrete buildings around ground zero) are known quite well; the standard deviation is about +/-30%.

    These aren't just theoretical computer calculations. With very sensitive instruments and long counting periods, it is possible to measure neutron induced activity in iron from irradiated buildings in Japan, and assess the neutron exposure, while gamma ray energy stored in ceramics like roof tiles can be released as light by heating them. It's possible to do this in a calibrated way (e.g., you can measure the light emitted on heating and then irradiate the same sample with a known dose of radiation, and repeat the process, so that the comparison calibrates the original radiation dose to the material), and after allowing for background radiation you can find out the gamma ray doses from the bomb.

    Because in 1962 there was little useful human data, extensive experiments were made on animals, in particular mice. Hence the title of Dr Loutit's book, Irradiation of Mice and Men.

    What caught my eye was the section on pages 61-82 of factors relating to leukemia. On page 61 he states:

    "... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last, compared with 2 per cent incidence in unirradiated controls."

    So for a fixed dose, 1,000 R spread over a month (which is far less lethal in short-term effects than that dose spread over a few seconds, as occurs with initial radiation in a nuclear explosion, or over a few days, when most of the fallout dose is delivered), the leukemia rate varied from 5-40% as the dose rate varied from 1.3-81 r/hour.
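    A quick consistency check on Mole's figures (my arithmetic; I assume the low-dose-rate group was irradiated continuously over the full four weeks):

```python
total_dose_r = 1_000
hours_in_4_weeks = 4 * 7 * 24         # 672 hours

# Continuous group: 1,000 r spread evenly over the month.
continuous_rate_r_per_hour = total_dose_r / hours_in_4_weeks
print(round(continuous_rate_r_per_hour, 2))  # ~1.49 r/hour, close to Mole's 1.3 r/hour

# Intermittent group at 81 r/hour: total beam-on time needed for the same dose.
beam_on_hours = total_dose_r / 81
print(round(beam_on_hours, 1))               # ~12.3 hours of exposure, spread over 4 weeks
```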

    This does illustrate that the effects, even long-term effects, don't depend merely on the dose, but also upon the dose rate. There is a kind of religion that Health Physics is based upon, and has been based upon since about 1956, which states that the long-term effects of radiation are linearly dependent upon the total dose.

    The linear non-threshold (LNT) anti-civil-defence dogma results from ignoring the vitally important effect of dose rate on cancer induction, which has been known and published in papers by Mole and in Loutit's book for about 50 years. The current dogma is falsely based on the total dose alone, ignoring the time-dependent ability of protein P53 and other cancer-prevention mechanisms to repair broken DNA segments. This is particularly the case for double strand breaks, where the whole double helix is broken; the repair of single strand breaks is less time-dependent because there is no risk of the broken single strand being joined to the wrong end of another broken DNA segment. Repair only prevents cancer if the broken ends are repaired correctly before too many unrepaired breaks have accumulated in a short time; if too many double strand breaks occur quickly, segments can be incorrectly 'repaired' with double strand breaks mismatched to the wrong segment ends, possibly inducing cancer if the resulting somatic cell can then undergo division successfully without apoptosis.

    The only modifications officially included for other factors are a set of radiosensitivity factors for different organs (those with fast-dividing cells are of course the most sensitive to radiation, since their DNA is highly vulnerable, while dividing, for more of the time than in other cells), and a quality factor for the type and energy of radiation (the amount of energy deposited in tissue per unit length of a particle's track is the 'linear energy transfer' or LET; the higher the LET, the more ionisation-type disruption a single particle causes to tissue while passing). In general, alpha radiation (helium nuclei: massive and highly charged) inside the body is high-LET radiation, so even a single alpha particle carries a cancer risk regardless of the dose rate, and the cancer risk is simply proportional to dose, as orthodoxy says.

    But for gamma and X-rays, which are low-LET radiation, the amount of ionization per unit length of the particle path is very small. Beta particle radiation is intermediate between alpha and gamma, and its effects are very dependent on the exact energy of the beta particles (strontium-90 emits high energy beta particles which can penetrate thin metal foils, for example, while the very low energy beta particles from carbon-14 are stopped by a sheet of paper, like alpha particles). In general, gamma and X-ray radiation probably require multiple hits on a cell to overwhelm the DNA repair mechanism, so the dose rate is important.

    There is no inclusion in radiation dose effects assessments of the effect of dose rate! But it has been known for about twenty years that protein P53 repairs most breaks in DNA due to radiation if the dose rate is low, but can be saturated and overwhelmed at high dose rates.

    Hence, there is a known mechanism by which low dose rates are less likely to cause cancer than high dose rates.

    Now the entire controversy on radiation effects hinges on the effects of low-LET gamma and X-rays received at different dose rates, which are ignored in favour of just quoting total doses and trying to correlate those total doses to the statistical effects observed. Why? Well, it is administratively convenient just to record one numerical value - dose. Having to work out both the dose and the dose rate, and to set limits according to some combination, would have been difficult in the past (it probably is not difficult today, because modern electronic dosimeters measure dose rates and doses, and could easily be programmed to record the mean dose rate at which the dose was received).

    The data from Hiroshima and Nagasaki, and also for most X-ray and gamma ray medical treatment of people, applies to relatively high dose rates. You would expect this data to show more serious effects than similar doses received at lower dose rates.

    There is also an effect of age at exposure. Dr Alice Stewart and associates made the discovery (Brit. Med. J., v1, p1495, 1958, and v1, p452, 1961), after analysing the pre-natal X-ray exposures of all children who died from cancers, particularly leukemia, before age 10 in England and Wales between 1953 and 1955, that those who had been X-rayed in utero had an overall risk of dying from cancer before age 10 of 1/600, compared to 1/1,200 for non-exposed children. For children who had been exposed to X-rays while in utero, the peak leukemia and other cancer risk occurred at the age of 3, a shorter latent interval than for survivors of Hiroshima and Nagasaki. (The actual average X-ray dose received by each child in utero was estimated as 1-5 rads on page 82 of Loutit's Irradiation of Mice and Men.)
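A quick check of the arithmetic in Stewart's figures as quoted above: 1/600 versus 1/1,200 is a relative risk of exactly 2, i.e. a doubling of childhood cancer mortality:

```python
# Relative risk implied by Stewart's figures: 1/600 cancer deaths by age 10
# among children X-rayed in utero, versus 1/1,200 among unexposed children.
exposed_risk = 1 / 600
unexposed_risk = 1 / 1200

relative_risk = exposed_risk / unexposed_risk
excess_risk = exposed_risk - unexposed_risk

print(relative_risk)  # 2.0 -- a doubling of risk
print(excess_risk)    # ~0.00083, about one extra death per 1,200 exposed
```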

    In Irradiation of Mice and Men, Loutit discusses a serious public relations problem caused by Professor E. B. Lewis, author of Leukemia and Ionizing Radiation, Science 17 May 1957, v125, No. 3255, pp. 965-72. Dr Loutit describes Lewis on page 78 as "... a geneticist of great renown...".

    The problem is that Professor Lewis was largely responsible for ignoring dose rate effects. In his 1957 paper, Lewis calculated the probability of inducing leukemia per individual per rad per year, getting 0.000,002 both for radiologists (where doses were somewhat uncertain, owing to inaccurate dosimetry before the 1950s) and for whole-body exposure of atomic bomb survivors (a very uncertain figure in 1957, partly because most cancers had not yet appeared, and partly because of the massive uncertainty in the dosimetry for atomic bomb survivors at that time), and 0.000,001 for patient groups whose spines were irradiated for ankylosing spondylitis or whose chests were irradiated for thymic enlargement.

    Dr Loutit points out that all this data of Lewis' is for high dose rates. The problem is, Loutit writes on page 78 of Irradiation of Mice and Men:

    'What Lewis did, and which I have not copied, was to include in his table another group - spontaneous incidence of leukemia (Brooklyn, N.Y.) - who are taken to have received only natural background radiation throughout life at the very low dose-rate of 0.1-0.2 rad per year: the best estimate is listed as 2 x 10^{-6} like the others in the table. But the value of 2 x 10^{-6} was not calculated from the data as for the other groups; it was merely adopted. By its adoption and multiplication with the average age in years of Brooklyners - 33.7 years and radiation dose per year of 0.1-0.2 rad - a mortality rate of 7 to 13 cases per million per year due to background radiation was deduced, or some 10-20 per cent of the observed rate of 65 cases per million per year.'

    So Professor Lewis has no evidence whatsoever that his data from human beings exposed to high dose rates also applied to low dose rates like background radiation. He merely assumed this was the case, without evidence or explanation or mechanism, and used this assumption to make some fanciful and uncheckable calculations, such as the guess that 10-20% of natural leukemias are caused by background radiation.
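It is easy to reproduce the criticised arithmetic: multiplying the adopted coefficient of 2 x 10^{-6} per person per rad per year by the mean Brooklyn age of 33.7 years and by a background of 0.1-0.2 rad/year does indeed give the quoted 7-13 cases per million per year:

```python
# Lewis's adopted (not measured) arithmetic, as quoted by Loutit:
# 2e-6 leukemias per person per rad per year, mean Brooklyn age 33.7 years,
# natural background of 0.1 to 0.2 rad per year.
risk_per_person_per_rad_per_year = 2e-6
mean_age_years = 33.7

for background in (0.1, 0.2):  # rad per year
    rate = risk_per_person_per_rad_per_year * mean_age_years * background
    print(round(rate * 1e6, 1), "cases per million per year")
# Gives 6.7 and 13.5: the quoted "7 to 13 cases per million per year",
# i.e. 10-20% of the observed 65 per million per year. The flaw, as Loutit
# stresses, is that the 2e-6 coefficient was simply assumed to hold at
# background dose rates; it was never derived from low dose rate data.
```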

    On page 79, Dr Loutit challenges all of Professor Lewis' assumptions, pointing out for example that the effect of age at exposure and the effects of dose rate are being totally ignored by Lewis:

    "All these points are very much against the basic hypothesis of Lewis of a linear relation of dose to leukemic effect irrespective of time. Unhappily it is not possible to claim for Lewis's work as others have done, 'It is now possible to calculate - within narrow limits - how many deaths from leukemia will result in any population from an increase in fall-out or other source of radiation' [Leading article in Science, v125, p963, 1957]. This is just wishful journalese.

    "The burning questions to me are not what are the numbers of leukemia to be expected from atom bombs or radiotherapy, but what is to be expected from natural background .... Furthermore, to obtain estimates of these, I believe it is wrong to go to [1950s inaccurate, dose rate effect ignoring, data from] atom bombs, where the radiations are qualitatively different [i.e., including effects from neutrons] and, more important, the dose-rate outstandingly different."

    This conclusion about the importance of dose rate has been totally ignored, and Lewis' fiddles based only on dose have been continued ever since. No wonder the data from groups exposed to similar doses (but at different dose rates) remain in conflict. It's no mystery. There's nothing unknown about radiation. It's just censorship and officialdom enforcing confusion, as shown by that 1957 editorial in Science, mentioned above. It's straightforward to see that induction of cancer depends to some extent on the saturation of the P53 repair mechanism (for damage to DNA) due to radiation dose rate. This factor is completely ignored in Lewis' linear, no-threshold (LNT) model based on the faulty early data available in 1957.


    Notice that the dose rate varied with distance in Hiroshima and Nagasaki: the duration of exposure did not vary as rapidly with distance from ground zero as the dose did, so casualties nearer to ground zero received their doses at higher dose rates.

    W. L. Chen, Y. C. Luan, M. C. Shieh, S. T. Chen, H. T. Kung, K. L. Soong, Y. C. Yeh, T. S. Chou, S. H. Mong, J. T. Wu, C. P. Sun, W. P. Deng, M. F. Wu, and M. L. Shen, ‘Is Chronic Radiation an Effective Prophylaxis Against Cancer?’, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here:

    ‘An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

    'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...’
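The person-year arithmetic behind the quoted Taiwan figures checks out (assuming, as the paper does, roughly 10,000 occupants for about 20 years):

```python
# Person-year check of the Chen et al. figures quoted above.
persons = 10_000
years = 20
person_years = persons * years  # 200,000 person-years

observed_deaths = 7
rate_per_100k = observed_deaths / person_years * 100_000
print(rate_per_100k)  # ~3.5 per 100,000 person-years, as quoted

# At the quoted Taiwan average of 116 per 100,000 person-years, the same
# population would have been expected to show far more cancer deaths:
expected_deaths = 116 / 100_000 * person_years
print(expected_deaths)  # ~232 expected, versus 7 observed
```

The contrast between ~232 expected and 7 observed is the basis of the paper's claim; the age-distribution caveat discussed below is precisely about whether "expected" was computed from the right comparison population.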

    The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations of Hiroshima and Nagasaki data down to low doses. For instance, the radiation cancer scare-mongering directed at survivors of high doses in Hiroshima and Nagasaki prejudiced the outcome, in the sense that there was no “blinding” to prevent an “anti-placebo” effect: increased fear, psychological stress and worry about the long-term effects of radiation, and the behaviour that accompanies them. The 1958 book about the Hiroshima and Nagasaki survivors, “Formula for Death”, makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation-induced cancer anyway. Therefore, the fear culture among the irradiated survivors would statistically be expected to produce deviations from normal behaviour, in some cases increasing the cancer risks above those due purely to radiation exposure.

    For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see

    http://en.wikipedia.org/wiki/Radiation_hormesis

    ‘What is Science?’ by Richard P. Feynman, presented at the fifteenth annual meeting of the National Science Teachers Association, 1966 in New York City, and published in The Physics Teacher, vol. 7, issue 6, 1968, pp. 313-20:

    ‘... great religions are dissipated by following form without remembering the direct content of the teaching of the great leaders. In the same way, it is possible to follow form and call it science, but that is pseudo-science. In this way, we all suffer from the kind of tyranny we have today in the many institutions that have come under the influence of pseudoscientific advisers.

    ‘We have many studies in teaching, for example, in which people make observations, make lists, do statistics, and so on, but these do not thereby become established science, established knowledge. They are merely an imitative form of science analogous to the South Sea Islanders’ airfields--radio towers, etc., made out of wood. The islanders expect a great airplane to arrive. They even build wooden airplanes of the same shape as they see in the foreigners' airfields around them, but strangely enough, their wood planes do not fly. The result of this pseudoscientific imitation is to produce experts, which many of you are. ... you teachers, who are really teaching children at the bottom of the heap, can maybe doubt the experts. As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts.’

    Protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals including mice, rats and dogs. P53 is one of the proteins which continually repair breaks in DNA, which easily breaks at body temperature: breaks are caused by free radicals produced naturally in various ways, and also as a result of ionisation caused by radiation hitting water and other molecules in the body. Cancer occurs when several breaks in DNA happen to occur by chance at nearly the same time, giving several loose ends which P53 repairs incorrectly, causing a mutation. This cannot occur when only one break occurs, because only two loose ends are produced, and P53 will reattach them correctly. If low-LET ionising radiation levels are increased to a certain extent, causing more single strand breaks, P53 works faster and is able to deal with breaks as fast as they occur, so that multiple broken strand ends do not arise. This prevents DNA strands from being repaired incorrectly, and prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation, of course, overloads the P53 repair mechanism; it can then no longer repair breaks as they occur, so multiple breaks appear and loose ends of DNA are wrongly connected by P53, causing an increased cancer risk.

    1. DNA-damaging free radicals are equivalent to a source of sparks which is always present naturally.

    2. Cancer is equivalent to the fire you get if the sparks are allowed to ignite the gasoline, i.e. if the free radicals are allowed to damage DNA without the damage being repaired.

    3. Protein P53 is equivalent to a fire suppression system which is constantly damping out the sparks, or repairing the damaged DNA so that cancer doesn't occur.

    In this way of thinking, the ‘cause’ of cancer will be down to a failure of a gene like P53 to repair the damage.
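The saturation argument above can be made quantitative with a toy model (my own illustration, not taken from any source quoted here): if DNA breaks arrive at random at a mean rate proportional to the dose rate, and P53 repairs isolated breaks correctly, then the dangerous situation of two or more breaks present at once follows Poisson statistics and rises roughly as the square of the dose rate when rates are low:

```python
import math

# Toy Poisson model of repair saturation: breaks arrive at mean rate m per
# repair window (m proportional to dose rate). Misrepair needs >= 2 breaks
# present at once; a lone break is reconnected correctly.
def prob_two_or_more_breaks(m):
    """P(N >= 2) for a Poisson-distributed break count N with mean m."""
    return 1.0 - math.exp(-m) * (1.0 + m)

for m in (0.01, 0.1, 1.0, 5.0):
    print(m, prob_two_or_more_breaks(m))
# For small m the risk goes as m**2 / 2: quadratic in dose rate, not linear.
# At large m (saturation), almost every window contains multiple breaks.
```

This is only a cartoon of the mechanism, but it shows why a repair process that handles single breaks would make low dose rate exposure disproportionately less risky than the same dose delivered quickly.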



    ‘Professor Edward Lewis used data from four independent populations exposed to radiation to demonstrate that the incidence of leukemia was linearly related to the accumulated dose of radiation. ... Outspoken scientists, including Linus Pauling, used Lewis’s risk estimate to inform the public about the danger of nuclear fallout by estimating the number of leukemia deaths that would be caused by the test detonations. In May of 1957 Lewis’s analysis of the radiation-induced human leukemia data was published as a lead article in Science magazine. In June he presented it before the Joint Committee on Atomic Energy of the US Congress.’ – Abstract of thesis by Jennifer Caron, Edward Lewis and Radioactive Fallout: the Impact of Caltech Biologists Over Nuclear Weapons Testing in the 1950s and 60s, Caltech, January 2003.

    Dr John F. Loutit of the Medical Research Council, Harwell, England, in 1962 wrote a book called Irradiation of Mice and Men (University of Chicago Press, Chicago and London), discrediting the pseudo-science from geneticist Edward Lewis on pages 61 and 78-79:

    ‘... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last, compared with 2 per cent incidence in un-irradiated controls. ...
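Mole's numbers, tabulated for clarity (the same 1,000 r total dose in every irradiated group, only the dose rate differing):

```python
# Mole's 1959 mouse data as quoted above: 1,000 r total over 4 weeks.
leukemia_incidence = {
    # dose rate (r/hour): leukemia incidence within 15 months (per cent)
    81.0: 40.0,  # intermittent, high dose rate
    1.3: 5.0,    # continuous, low dose rate
}
control_incidence = 2.0  # per cent in control mice

for rate in sorted(leukemia_incidence):
    excess = leukemia_incidence[rate] - control_incidence
    print(f"{rate} r/hour: {leukemia_incidence[rate]}% "
          f"({excess}% above controls)")
# An 8-fold difference in incidence from exactly the same total dose.
```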

    Tragically, the entire health physics industry has conned itself for political reasons into ignoring the vitally important effect of dose rate, and all their current radiation dosimeters just measure the accumulated dose without accurately determining the effective dose rate. It is the dose rate which determines whether DNA repair mechanisms can cope with the rate at which damage occurs (or even be stimulated to greater activity and positive benefit), or whether the rate at which radiation damage occurs is sufficient to saturate the natural repair mechanisms.

    All future radiation dosimeters must incorporate some kind of integral to determine the effective dose rate at which doses are accumulated. A Fourier-spectrum type record (showing the proportion of the dose received as a function of dose rate) is needed in electronic dosimeters because, in real life, dose rates are never constant but vary with time as the distance to contamination and shielding vary, and as radioactive decay occurs.
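A sketch of what such a record might look like in software (hypothetical bin edges, not any real instrument's specification): instead of one accumulated total, the dose is binned by the dose rate at which it was received:

```python
from collections import defaultdict

# Hypothetical dose-rate "spectrum" record: dose binned by the rate at
# which it was received (bin edges illustrative, in rad/hour).
BIN_EDGES = [0.001, 0.01, 0.1, 1.0, 10.0, 100.0]

def bin_for(dose_rate):
    """Return the smallest bin edge the dose rate does not exceed."""
    for edge in BIN_EDGES:
        if dose_rate <= edge:
            return edge
    return float("inf")

record = defaultdict(float)  # bin edge -> rads accumulated in that bin

def log_exposure(dose_rate, hours):
    record[bin_for(dose_rate)] += dose_rate * hours

log_exposure(0.005, 100.0)  # 0.5 rad received slowly
log_exposure(50.0, 0.01)    # 0.5 rad received in a 36-second burst
print(dict(record))  # same total dose, but clearly separated by dose rate
```

The two exposures above are identical to a dose-only record, yet on the thesis of this post they carry very different cancer risks; the binned record preserves the distinction.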

    All radioactive materials decay exponentially, unless they are being formed from the decay of something else in a decay chain, as with many fission products and the decay chains of actinides like uranium and plutonium. But even for the case of simple exponential decay, the mathematical decay law predicts that the dose rate never reaches zero, so the effective dose rate for exposure to an exponentially decaying source needs clarification: taking an infinite exposure time will obviously underestimate the dose rate regardless of the total dose, because any dose divided by an infinite exposure time gives a false dose rate of zero.
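The point about infinite exposure times can be seen numerically. For a source with initial dose rate R0 and decay constant L = ln(2)/half-life, the dose accumulated in time T is (R0/L)(1 - e^{-LT}), so the mean dose rate over the exposure, dose/T, falls toward zero as T grows even though the total dose converges (the numbers below are purely illustrative):

```python
import math

# Dose accumulated from an exponentially decaying source with initial dose
# rate r0 (rad/hour) and half-life hl (hours): (r0/L)(1 - exp(-L*t)),
# where L = ln(2)/hl.
def dose_in_time(r0, hl, t):
    L = math.log(2) / hl
    return (r0 / L) * (1.0 - math.exp(-L * t))

r0, hl = 100.0, 8.0
for t in (8.0, 80.0, 800.0):
    d = dose_in_time(r0, hl, t)
    print(t, round(d, 1), round(d / t, 4))
# The accumulated dose converges (toward r0/L ~ 1154 rads) while the mean
# dose rate d/t keeps falling -- so "dose divided by exposure time" tends
# to zero for long exposures, as noted above.
```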

    Part of the problem here is that the exponential decay curve is false: it is based on calculus for continuous variations, and doesn't apply to radioactive decay which isn't continuous but is a discrete phenomenon. This mathematical failure undermines the interpretation of real events in quantum mechanics and quantum field theory, because discrete quantized fields are being falsely approximated by the use of the calculus, which ignores the discontinuous (lumpy) changes which actually occur in quantum field phenomena, e.g., as Dr Thomas Love of California State University points out, the 'wavefunction collapse' in quantum mechanics when a radioactive decay occurs is a mathematical discontinuity due to the use of continuously varying differential field equations to represent a discrete (discontinuous) transition!
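The discreteness point is easy to demonstrate with a toy simulation (a statistical illustration only, making no claim about the field mechanisms discussed below): with a finite number of atoms, every atom has decayed after some finite time, whereas the continuous exponential law predicts a nonzero activity forever:

```python
import random

# Discrete decay toy: each atom survives a time step with probability 0.9.
# With finitely many atoms, the activity always reaches exactly zero in a
# finite number of steps, unlike the continuous exponential law.
random.seed(1)

def steps_until_all_decayed(n_atoms, survival_prob=0.9):
    steps = 0
    while n_atoms > 0:
        n_atoms = sum(1 for _ in range(n_atoms)
                      if random.random() < survival_prob)
        steps += 1
    return steps

steps = steps_until_all_decayed(1000)
print(steps)  # finite -- typically of order 70 steps for these parameters
```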

    Alpha radioactive decay occurs when an alpha particle undergoes quantum tunnelling to escape from the nucleus through a 'field barrier' which should confine it perfectly, according to classical physics. But as Professor Bridgman explains, the classical field law falsely predicts a definite sharp limit on the distance of approach of charged particles, which is not observed in reality (in the real world, there is a more gradual decrease). The explanation for alpha decay and 'quantum tunnelling' is not that the mathematical laws are perfect and nature is 'magical and beyond understanding', but simply that the differential field law is just a statistical approximation and wrong at the fundamental level: electromagnetic forces are not continuous and steady on small scales, but are due to chaotic, random exchange radiation, which only averages out and approaches the mathematical 'law' over long distances or long times. Forces are actually produced by lots of little particles, quanta, being exchanged between charges.

    On large scales, the effect of all these little particles averages out to appear like Coulomb's simple law, just as on large scales air pressure can appear steady, when in fact on small scales it is a random bombardment of air molecules which causes Brownian motion. On small scales, such as the distance between an alpha particle and the other particles in the nucleus, the forces are not steady but fluctuate as the field quanta are randomly and chaotically exchanged between the nucleons. Sometimes the field is stronger and sometimes weaker than the potential predicted by the mathematical law. When the field confining the alpha particle is weaker, the alpha particle may be able to escape, so there is no magic to 'quantum tunnelling'. Therefore, radioactive decay only obeys the smooth exponential decay law as a statistical approximation for large decay rates. In general the exponential decay law is false: for a nuclide of short half-life, all of the radioactive atoms have decayed after a finite time, so the 'law's' prediction that radioactivity continues forever is wrong. Richard P. Feynman explains in his book, QED, Penguin, 1990, pp. 55-6 and 84:

    'I would like to put the uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas ... But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, "Your old-fashioned ideas are no damn good when ...". If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [arrows = phase amplitudes in the path integral] for all the ways an event can happen – there is no need for an uncertainty principle! ... on a small scale, such as inside an atom, the space is so small that there is no main path, no "orbit"; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by field quanta] becomes very important ...'

    There is a stunning lesson in human 'groupthink' arrogance today: Feynman's fact-based physics is still censored out by mainstream string theory, despite the success of path integrals based on this field-quanta interference mechanism! The entire mainstream modern physics waggon has ignored Feynman's case for simplicity and for understanding what is known for sure, and has gone off in the other direction (magical, unexplainable religion), building a 10-dimensional superstring model whose conveniently 'explained' Calabi-Yau compactification of the unseen 6 dimensions can take 10^500 different forms (conveniently explained away as a 'landscape' of unobservable parallel universes, from which ours is picked out using the anthropic principle: because we exist, the values of the fundamental parameters we observe must be such that they allow our existence). This combines one non-experimentally-justifiable speculation, that the forces unify at the Planck scale, with another: that gravity is mediated by spin-2 particles which are only exchanged between the two masses in your calculation, somehow avoiding exchange with the far bigger masses in the surrounding universe. (When you include in your path integral the fact that exchange gravitons coming from distant masses will be converging inwards towards an apple and the earth, it turns out that this exchange radiation with distant masses actually predominates over the local exchange and pushes the apple to the earth, so gravitons can be deduced to be spin-1, not spin-2; this makes checkable predictions and tells us exactly how quantum gravity fits into the electroweak symmetry of the Standard Model, altering the usual interpretation and radically changing the nature of electroweak symmetry breaking from the usual, poorly predictive mainstream Higgs field.)

    IMPORTANT NOTICE ON LINEAR ENERGY TRANSFER (LET) AND ITS CONSEQUENCES FOR DIFFERING EFFECTS FROM ALPHA, BETA AND GAMMA RADIATIONS:

    Just in case anyone ignorant of the basics of radiation reads this post, it should be explained that the data and conclusions given in this post apply to gamma and also to neutron radiation, both of which are ELECTRICALLY NEUTRAL PARTICLES, and together constitute the major radiation threat from nuclear explosions. Because gamma rays and neutrons are UNCHARGED, they tend to be weakly ionizing (i.e., they penetrate easily, and deposit energy only in discrete events such as occasional collisions with orbital electrons and nuclei of atoms, e.g., the Compton effect for gamma rays striking electrons). Neutrons, however, can emulate charged radiations when they hit protons (hydrogen nuclei): the protons are electrically charged and can carry off much of the energy, behaving almost like alpha particles and causing 20 times as much damage as gamma rays for every unit of dose (Joule per kilogram). This correction factor of 20 for neutrons is termed the relative biological effectiveness (RBE). However, low-energy (well scattered or 'thermalized') neutrons are unlikely to be scattered by protons; instead they are captured by protons to form heavy hydrogen (deuterium), and in this 'radiative capture' process the surplus energy is released in the form of gamma rays.

    The weakly ionizing nature of gamma rays means that they deposit relatively little energy per unit length of their path through living matter, so they are LOW-LET (Linear Energy Transfer) radiations. This is not the case with the ELECTRICALLY CHARGED alpha and beta particles from internal emitters like iodine-131, which concentrates in the thyroid glands of people who, without taking any precautions, drink milk from cattle grazing contaminated pasture. (This occurred in Russia, which did not issue potassium iodide tablets to civilians after the Chernobyl disaster, but not in Poland, where the tablets were issued. Ion exchange removes iodine-131 from milk, as does turning it into cheese and storing it, because of the short 8-day half-life of iodine-131; and if the predicted dose is above 25 cGy, the excess risk is negated by taking 130-milligram potassium iodide tablets to prevent the uptake of iodine-131.) Electrically charged alpha and beta particles in the body are stopped over a very small path length in tissue, and so deposit all of their energy in that small amount of tissue. This means that a single alpha or beta particle can potentially saturate the P53 DNA repair mechanism in a single cell nucleus and cause a cancer risk: hence alpha and beta particles, because of their electrical charge, are HIGH-LET radiations, which probably have no threshold for effects and are dangerous at all doses. On the positive side, the high-LET nature of alpha and beta particles means that they are not very penetrating, so they pose relatively little risk as long as you don't ingest or inhale contamination. Protective clothing and respirators can totally negate the risks from alpha and beta radiations during decontamination work after fallout. A report on beta skin burns from fallout on the Marshallese in 1954 is here, and calculations of the length of time over which fallout deposited on skin can cause beta burns are reported here, in the post about Dr Carl F. Miller's excellent fallout research. For proof of U.S. Department of Defense fallout exaggerations and the continuing cover-up of the limited extent of nuclear test fallout during the Cold War, see this post. For underwater burst contamination see this post. For data on the efficiency of decontamination, see this post, as well of course as the post about Dr Miller's work.
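The 8-day half-life figure for iodine-131 directly gives the benefit of storing contaminated milk products: the fraction of iodine-131 remaining after t days is simply 0.5^(t/8), so ten half-lives (80 days, e.g. as stored cheese) leaves only about 0.1%:

```python
# Fraction of iodine-131 (half-life 8 days, as stated above) remaining
# after a given storage time.
def i131_fraction_remaining(days, half_life_days=8.0):
    return 0.5 ** (days / half_life_days)

for days in (8, 24, 80):
    print(days, i131_fraction_remaining(days))
# 8 days -> 0.5; 24 days -> 0.125; 80 days (ten half-lives) -> ~0.001,
# which is why turning contaminated milk into stored cheese works.
```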

    Tuesday, March 13, 2007

    The Control of Exposure of the Public to Ionizing Radiation

    Above: The Control of Exposure of the Public to Ionizing Radiation in the Event of Accident or Attack, Proceedings of a Symposium Sponsored by the National Council on Radiation Protection and Measurements (NCRP), April 27-29, 1981, Held at the International Conference Center, Reston, Virginia. (The proceedings were published on May 15, 1982, by the U.S. National Council on Radiation Protection and Measurements, Bethesda, Md.) The NCRP was chartered by U.S. Congress in 1964 to analyze and publish information on radiation protection issues.


    In several previous posts, quotations are made from Dr Carl F. Miller's enlightening acceptance speech for an award (published beginning on page 99), which gives a first-hand idea of the awesome task of obtaining the vital early-time data on the arrival and deposit characteristics of fallout during the 1950s American atmospheric nuclear test series in the Pacific and at the Nevada test site. Dr Miller's 60 rads gamma dose more than doubled his natural leukemia risk from 0.5% to over 1%, and he tragically died from leukemia in August 1981. A longer excerpt from the Session B Discussion text follows, and illuminates the whole subject very clearly:

    'Mr Greene [Jack C. Greene, Moderator]: 'As all should know, much of the work that Jim Sartor described [in a summary of decontamination data] was done under the very able direction of Dr Carl Miller who is with us today and will join the panel.

    'Carl, I am very pleased to use this occasion to present to you a "certificate of appreciation" from the Planning Committee for this symposium, joined by members of the former NAS Advisory Committee on Civil Defense and others who have worked with you over the years. Here's what the certificate says:

    'We hereby present a Certificate of Appreciation to Carl F. Miller. Dr Miller's many years of dedicated research have combined the best of theoretical and applied work and have resulted in an unparalleled contribution to our understanding of the physical and chemical characteristics of radioactive fallout, as well as the means for protecting against it. It is specially meaningful to those of us who have known and admired Carl over the years to have an opportunity to publicly acknowledge and honor the work of this extraordinary, versatile and innovative scientist.'

    'We will start the discussion period by giving Carl five minutes to comment on anything he chooses.

    'Dr Miller: Thank you very much. I appreciate this commendation and your comments. I might add here that thanks should also be given to my co-workers who helped me and did most of the hard work - this includes, of course, Pete [Peter O. Strom of Sandia National Laboratories] and Jim [James D. Sartor of Woodward-Clyde Consultants] who just completed their presentations. Sitting here and listening, I was impressed with many of the talks that were given. Starting with Lew Spencer's, I think he did a very good job in giving the outline for the hazards to be discussed. One thing that occurred to me and probably occurs to a lot of you is that the real hazards in a nuclear attack are not from radiation - the real hazards are in the blast and other initial effects. Though his paper was clearly limited to the radiation effects, he knows, and you know, that the countermeasures against the other hazards would be more difficult to achieve.

    'Someone talked a little about risks. One thing that usually comes to my mind when risks are discussed goes back to World War II when I was in Burma working with the Chinese Fifth Army as an artillery-man. We (and they) used to be supplied by airplanes that would fly over and parachute-drop food and ammunition and so forth. The thing about risk was that we used to watch the behavior of the Chinese soldiers - they would make bets on who could catch a sack of peanuts dropping down and they would truly try. I don't know what the LD50 for that "exposure" was or might be, but it was exceedingly high for a good catcher. It was a high risk game in a rather high risk environment. However, in the (now) old days, some of us at the then existing U.S. Naval Radiological Defense Laboratory did about the same thing with fallout at nuclear weapons tests.

    'In 1954, some civilians and Navy men were on a specially outfitted ship - a monstrosity bedecked with all kinds of sprinklers for testing the Navy washdown system [i.e., the continuous spraying of decks by fire sprinklers, to flush fallout off the deck as it landed]. On one test, we were about 20 miles away when a 10-megaton shot was detonated. At the time, one piece of data we were interested in obtaining was the early time decay; also, additional data on the characteristics of the fallout were desired. My job was to put out a series of funnels, tubes, and other things on this ship to collect some of the fallout. The ship was sailed on a pathway that led to an area directly underneath the expanding cloud so as to be exposed to a maximum amount of fallout. The ship, called the YAG-39, was highly instrumented with gamma detectors; it was accompanied by a sister ship, the YAG-40, which was operated by remote control but without the washdown system. Fallout arrived about 20 minutes after detonation, at which time I collected the first few drops of "hot" washdown water from tubing that extended from the deck to the bottom of the ship across from where the radioactive assay equipment was located.

    'In 1957, at the Nevada Test Site, personnel from NRDL and the AEC sat in an underground shelter a mile away when Shot Diablo was detonated. Some of us collected fallout particles as they fell out of the sky from this event. We didn't chase after them on the outside of the shelter because we had little funnels and tubes running to the outside from inside. One could hear that stuff trickle down into containers in a deep cave from which we picked out single particles for assay. I was trying to do gamma spectrometry on particles. I picked up one little particle, and the spectrometer just about blew up, so I quickly put it back and got a smaller one. That didn't work either: it was too hot. Finally, I got a teeny one, but it was still too hot. So I took it back in and smashed it into smaller pieces, picked up a chip with tweezers and found out it didn't blank out the spectrometer. Of course, after about a half-hour or so, one could hardly get a reading on it anymore, because of the rapid decay rate. Many people received some gamma exposure on ventures such as these. I did as well. ...

    'I like the way Jim Sartor brought out the character of the fallout, and Pete Strom, too. With most of the local fallout that we're talking about, a lot of the larger particles are fused or melted to form little glassy marbles. The tower shots had iron in them so they were magnetic and we could separate hot fallout particles from tower shots with magnetism. The radioactive atoms that could be absorbed into, or by, body organs were the few that plated out on the surface of the fallout particles during the later stages of condensation in the fireball. That's why the elements iodine, strontium, ruthenium and a few other isotopes of that nature have been found in organs of animals and humans.'



    Above: this is a summary of the decontamination data tables presented by James Sartor which Dr Miller was commenting on. This type of empirical field information is vital for informed decision making about how best to deal with a nuclear fallout disaster of any kind, be it an accident or a weapon attack.

    The volume also contains other information of background importance. Dr Clarence Lushbaugh goes through the history of the LD50, i.e., the estimated dose which kills 50% of exposed people. He reveals that even before the bombs fell on Hiroshima and Nagasaki, the Manhattan Project had determined a figure of 500 +/- 100 R as the human LD50, based on extrapolations from animal data, and shows that the commonly quoted 450 R estimate of the LD50 stems not directly from any particular analysis of evidence, but instead from an average of the guesses made by 24 consultants to the U.S. Armed Forces Special Weapons Project who met at San Francisco in 1947 under the chairmanship of Dr R. R. Newell. (This 450 R human LD50 estimate was first published in 1950 by S. Warren and J. Z. Bowers in "Acute Radiation Syndrome in Man," Ann. Int. Med., v32, pp207-16.)

    Dr Lushbaugh also comments on the disagreement which occurred in 1959, when Dr Payne Harris testified before the U.S. Joint Committee on Atomic Energy that the human LD50 was 700 R +/- 25%, based on Oak Ridge and Yugoslavian accident data, while Drs Cronkite and Bond testified, using Marshallese evidence plus dog and swine data, that the human LD50 was 350 R. As a result of this disagreement (one estimate above the previous LD50 estimate of 450 R, and the other below it), the 450 R estimate continued to be used as the best available consensus. Lushbaugh however notes that he and Dr Auxier, using the best available data for shielding by buildings in Japan and the best empirical estimates of the radiation doses (confirmed by measurements of neutron induced activity in Hiroshima and Nagasaki, and by thermoluminescent data which allow the measurement of gamma ray doses in roof tiles at various distances, because some of the radiation energy is trapped in the crystalline structure of the ceramic and released as light when the material is subsequently heated), found an LD50 estimate of 260 REM, assuming a relative biological effectiveness (RBE) factor of 2 for neutrons. (REM = exposure in roentgens multiplied by RBE.) Lushbaugh comments that this low LD50 figure from the Hiroshima and Nagasaki data is due to the blast and burn trauma the victims suffered from the shock wave and thermal radiation which accompanied the nuclear radiation. (C. C. Lushbaugh and J. Auxier, "Reestimation of Human LD50 Radiation Levels at Hiroshima and Nagasaki", Radiation Research, v39, p526, 1969.)
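    The dose-equivalent convention used here (REM = roentgens multiplied by RBE, summed over radiation components) can be sketched in a few lines; the gamma and neutron figures in the example are hypothetical, chosen only to illustrate the scale of the 260 REM estimate:

```python
# Sketch of the dose-equivalent convention REM = roentgens x RBE,
# summed per radiation component. Gamma RBE is taken as 1 and neutron
# RBE as 2, following the Lushbaugh-Auxier assumption quoted above.
def dose_rem(gamma_r, neutron_r, neutron_rbe=2.0):
    """Total dose equivalent in REM for a mixed gamma/neutron exposure."""
    return gamma_r * 1.0 + neutron_r * neutron_rbe

# Hypothetical mix: 200 R of gamma rays plus 30 R of neutrons.
print(dose_rem(200.0, 30.0))  # 260.0 REM
```

    Note that with the epilation study's assumed neutron RBE of 4, the same physical exposure would yield a larger dose equivalent, which is why the RBE assumption matters when comparing LD50 estimates.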

    He reports another study of Hiroshima and Nagasaki effects which found that a 50% incidence of epilation (hair loss) occurred at a dose of 310 REM if the neutron RBE is 4, and a 50% incidence of hemorrhage (i.e., platelet suppression in blood due to irradiation of the bone marrow where blood cells are produced; the reduction in the platelet count causes small vessels to leak, producing small but visible skin hemorrhages below the outer skin layer). Some of these data will be obsolete now because they were based on the 1965 dosimetry of Hiroshima and Nagasaki, which has been updated with improved radiation transport models (although the 'improved' estimates of the yields of the Hiroshima and Nagasaki bombs may be a step backward, because the yields depend on the random chance of the time of initiation of the chain reaction after fissile assembly, and other chance factors, and so should be evaluated from the actual measured blast effects data, like the crushing of petrol tins and the overturning of stone slabs of known mass, as Penney did in his 1970 report, not from computer simulations of bomb dynamics).

    Lushbaugh also discusses the effects of protracted exposure, where the body can repair some of the damage if the radiation is received at a low dose rate. A man accidentally irradiated by a Co-60 radiotherapy source in Mexico in 1964 for 106 days at a gamma exposure rate of 9-16 R/day (total dose 980-1,700 R) was still alive 17 years later, but four others who suffered daily exposure rates at least twice that amount all died within 80 days, due to suppressed blood cell counts, hemorrhages, and the infections accompanying a reduced white blood cell count.

    Another interesting item in the report is the table of neutron induced activities in soils on different bedrocks (igneous, shale, sandstone, limestone and sediment), part of Dr Peter Strom's paper on page 81. This shows that the initial beta activity of Al-28 induced in soil is on the order of 1,000 times as intense as that of Na-24. This is partly due to the higher typical abundance of aluminium than sodium in most soils, but is mainly due to the shorter half life of Al-28 (2.3 minutes, compared to 15 hours for Na-24). The faster a nuclide decays, the more intense its decay rate (decays/second, i.e., becquerels) while it lasts.

    The typical igneous rock sample (at least half silicon dioxide, by mass) initially (i.e., at zero time) would give a beta activity from neutron induced Al-28 which is 550 times that from Na-24. After an hour (26 half lives of Al-28, but only 1/15th of a half life of Na-24) the ratio is only 0.0000088. Hence, despite the initially higher radiation levels from Al-28, it becomes trivial compared to other nuclides within about half an hour of a nuclear explosion.
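    The decay arithmetic behind that ratio is simple exponential decay; the sketch below assumes the figures quoted above (half-lives of 2.3 minutes for Al-28 and 15 hours for Na-24, and an initial activity ratio of 550 for typical igneous rock), and reproduces the order of the quoted one-hour ratio:

```python
# Exponential decay of the Al-28 : Na-24 activity ratio, using the
# half-lives and zero-time ratio quoted in the text for igneous rock.
def activity_fraction(t_min, half_life_min):
    """Fraction of initial activity remaining after t_min minutes."""
    return 2.0 ** (-t_min / half_life_min)

ratio_0 = 550.0   # Al-28 : Na-24 beta activity ratio at zero time
t = 60.0          # one hour after the burst, in minutes
ratio_1h = ratio_0 * activity_fraction(t, 2.3) / activity_fraction(t, 15 * 60)
print(f"{ratio_1h:.1e}")  # on the order of 1e-5, i.e., negligible
```

    Small differences from the quoted 0.0000088 reflect rounding of the half-lives; the conclusion that Al-28 is trivial after the first half hour is insensitive to that.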

    There are two interesting appendices in the volume. The first is by Philip J. Dolan of SRI International and is entitled Appendix A: Characteristics of the Nuclear Radiation Environment Produced by Several Types of Disasters, Summary Volume. On page 264, Dolan comments that:

    'The hypothetical attack selected for use is a strategic attack on U.S. military installations, military supporting industrial and logistics facilities, other basic industries, and major population centers.

    'The attack consists of 1,444 weapons with a total of 6,559 megatons, of which 5,051 megatons are surface burst. ...

    'More than 67 million persons are located in areas receiving unit-time [1 hour reference time, although fallout is obviously not deposited everywhere within 1 hour of detonation so these unit-time figures are gross exaggerations if applied to distances of several hours downwind] reference dose rates in excess of 3,000 R/hr, more than 159 million in areas receiving in excess of 300 R/hr, and more than 188 million in areas in excess of 30 R/hr.

    'The dose rates mentioned above would not necessarily exist since the deposition would take place over an extended time period and the fallout is decaying while deposition takes place. The four-day doses, which consider arrival time and which represent most of the lifetime accumulations, corresponding to the above-mentioned unit-time dose rates are 5,400, 360, and 24 roentgens, respectively. Shielding or relocation could reduce these accumulated doses.'
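    The relation between unit-time reference dose rates and accumulated doses can be sketched with the standard t^-1.2 (Way-Wigner) fallout decay approximation. This is an assumed model for illustration, not Dolan's actual computation, which follows arrival times across the whole deposition pattern:

```python
# Accumulated exposure from fallout under the t^-1.2 decay approximation:
# dose = integral of r1 * t^-1.2 dt from arrival to end, with r1 the
# unit-time (1 hour) reference dose rate in R/hr and t in hours.
def dose_r(r1, t_arrival_hr, t_end_hr):
    """Accumulated outdoor exposure in roentgens, t^-1.2 decay assumed."""
    return (r1 / 0.2) * (t_arrival_hr ** -0.2 - t_end_hr ** -0.2)

r1 = 3000.0  # R/hr unit-time reference dose rate contour
for arrival in (1.0, 4.0):
    print(f"arrival {arrival} h: {dose_r(r1, arrival, 96.0):.0f} R over 4 days")
```

    With fallout arriving around four hours after burst, this simple model lands near the 5,400 R four-day figure Dolan quotes for the 3,000 R/hr contour; assuming deposition at one hour everywhere overstates the dose, which is the exaggeration the bracketed note above warns about.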

    On page 265, Dolan adds:

    'The four major ways to reduce adverse effects of fallout are: shelter, relocation, decontamination, and minimization of ingestion and inhalation ...

    'The effectiveness of shelters usually is described in terms of a protective factor (PF), which is the ratio of the dose rate that would be measured 3 feet above an (imaginary) infinite smooth plane to the dose rate expected inside the shelter (accounting for surroundings as well as protection afforded by the shelter). About 20 percent of the urban population and 19 percent of the rural population of the U.S. could be afforded a PF of 1,000 or more (subways, mines, caves, and some basements) without evacuation, while about 75 percent of the urban population and 43 percent of the rural population could be afforded PF's of 100 or greater. ...

    'The consequences of a multiweapon nuclear attack would certainly be grave, but exact numbers have large uncertainties. Estimates of 20 to 160 million short term fatalities have been made, with the majority of the survivors receiving doses from >10 to a few hundred rem. Nevertheless, recovery should be possible if plans exist and are carried out to restore social order and to mitigate the economic disruption.'
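    The protective factor (PF) definition quoted above reduces to a single division; a minimal sketch, with an illustrative 300 R/hr reference field:

```python
# Protective factor (PF): the ratio of the dose rate 3 feet above an
# imaginary infinite smooth plane to the dose rate inside the shelter.
def sheltered_dose_rate(open_field_rate_r_hr, pf):
    """Dose rate (R/hr) inside a shelter with protective factor pf."""
    return open_field_rate_r_hr / pf

# Illustrative 300 R/hr reference field behind PF 100 and PF 1,000:
print(sheltered_dose_rate(300.0, 100.0))   # 3.0 R/hr
print(sheltered_dose_rate(300.0, 1000.0))  # 0.3 R/hr
```

    This is why the PF 100 and PF 1,000 population percentages Dolan cites matter so much: each factor of ten in PF cuts the occupants' dose rate by the same factor.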

    Commenting on the uranium and plutonium hazards of nuclear weapon accidents on page 272, Dolan states:

    'Uranium taken internally represents a heavy-metal poison hazard in quantities less than those required to be a radiation hazard.

    'Less than 10^{-4} of the plutonium eaten by man is absorbed from the intestine. Inhalation is a more probable route of deposition, but once the cloud has passed, inhalation requires that the plutonium be resuspended. This is an inefficient process.

    '"Soluble" plutonium may be cleared from the lung within a year or so and will be translocated primarily to bone and liver. "Insoluble" plutonium will be retained much longer in the lung and will be translocated principally to lymph nodes. Plutonium dispersed in a weapon accident is expected to be in the form of insoluble oxides.

    'Two accidents of this type are recorded. ... The first occurred near Palomares, Spain on January 17, 1966. A B-52 collided in flight with a tanker during a refueling operation, and 4 weapons were dropped. One weapon was found on the beach undamaged, and one was recovered intact from the sea at a much later date. The other 2 weapons resulted in high explosive detonations on impact with the earth. The resulting contamination covered about 650 acres with a concentration of about 5 micrograms per square metre or more.

    'The second accident occurred near Thule, Greenland on January 21, 1968. A B-52 crashed on an ice floe just off the coast. Snow was falling at the time of the accident, and the precipitation increased after the accident. Most of the plutonium sank with the aircraft debris, and the rest was trapped under the snow and the ice. ... The worst consequence of such an accident is likely to be a partial denial of the use of a relatively small area.'

    The second appendix is by Dr Alvin M. Weinberg, Appendix B: Civil Defense and Nuclear Energy, pages 275-7:

    'The rejection of nuclear energy has been catalyzed by the articulate and influential energy radicals in the Western world. ... I continue to believe, and preach, the obvious: that defensive systems are less threatening than offensive systems: 100 million Americans can't be killed with Russian ABM's or civil defense ... Escalation of defense is not nearly as threatening as is escalation of offense. ... The ultimate issue is not how many people are going to be killed in a nuclear war: it is how can we both maintain our freedoms and avoid nuclear war. ...

    'Nuclear power is an instrument of peace because it reduces pressure on oil. The energy crisis is primarily a crisis of liquid fuels. Insofar as nuclear power can replace oil, it helps stabilize the world order.

    'The world today uses about 60 million barrels of oil per day; of that, about 18 million barrels per day came through the Straits of Hormuz before the Iran/Iraq war. A nuclear reactor of 1,000 megawatts electric output uses the equivalent of about 25,000 barrels of residual oil per day. If the world had 1,000 reactors operating now, the primary energy supplied by uranium to those 1,000 reactors would exceed the 18 million barrels of oil per day that go through the Straits of Hormuz. To be sure, the substitution is not direct, since what would be displaced is residual oil, not gasoline or other higher distillates. But with an expenditure of about $10-15 thousand per daily barrel of capital equipment, refineries could convert the residual oil into higher distillates [i.e., break the longer hydrocarbon molecules into smaller ones]. So to speak, residual oil, made available by conversion from oil-fired to nuclear power plants, is the best feedstock for a synthetic fuel plant. To make high distillates from coal requires an expenditure of about $100,000 per daily barrel. To make high distillates from residual oil takes only about one tenth as much. ...

    'This simple-minded argument cannot be ignored: substitution of nuclear energy for oil reduces the pressure on oil and therefore reduces the political pressures that lead first to political instability, then to war, and possibly eventually to nuclear war. We forget that the immediate cause of the Japanese attack on Pearl Harbor was the decision by the United States to prevent Japan from moving into Indonesia to get oil. The Japanese entry into World War II demonstrated how oil can trigger a world conflagration. ...

    'I do not know whether nuclear energy, which is now in a state of moratorium [following Three Mile Island controversy in 1979], will get started again. ... That people will eventually acquire more sensible attitudes towards low level radiation is suggested by an analogy, pointed out by William Clark, between our fear of very low levels of radiation insult and of witches. In the fifteenth and sixteenth centuries, people knew that their children were dying and their cattle were getting sick because witches were casting spells on them. During these centuries no fewer than 500,000 witches were burned at the stake. Since the witches were causing the trouble, if you burn the witches, then the trouble will disappear. Of course, one could never be really sure that the witches were causing the trouble. Indeed, though many witches were killed, the troubles remained. The answer was not to stop killing the witches - the answer was: kill more witches. ...

    'I want to end on a happy note. The Inquisitor of the south of Spain, Alonzo Frias, in 1610 decided that he ought to appoint a committee to examine the connection between witches and all these bad things that were happening. The committee could find no real correlation ... So the Inquisitor decided to make illegal the use of torture to extract a confession from a witch. ...

    'I don't know whether the modern witch - low level radiation and the hysteria that is exhibited about nuclear energy - will be resolved soon enough for nuclear energy to play a proper part in avoiding the oil confrontation. After all, it took 200 years for the Inquisition to run its course on witches. I only hope that our attitude towards nuclear energy will become more sensible long before 200 years have gone by. The possible alternative - nuclear war sparked by competition for dwindling oil - is far too horrible to accept, whether or not we have civil defense.'
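    Weinberg's barrels-per-day arithmetic earlier in the appendix can be checked directly from his own figures (1,000 reactors of 1,000 MWe each, 25,000 barrels of residual oil equivalent per reactor per day, against the 18 million barrels per day pre-war flow through the Straits of Hormuz):

```python
# Check of Weinberg's oil-substitution arithmetic, using only the
# figures quoted in the appendix.
reactors = 1000
bbl_per_reactor = 25_000        # barrels of residual oil equivalent/day
hormuz_bbl = 18_000_000         # barrels/day through Hormuz, pre-war

nuclear_equiv = reactors * bbl_per_reactor  # 25 million barrels/day
print(nuclear_equiv, nuclear_equiv > hormuz_bbl)  # 25000000 True
```

    The 25 million barrels per day of oil equivalent does indeed exceed the Hormuz flow, as he claims, subject to his caveat that the displaced product is residual oil rather than higher distillates.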