Peace through practical, proved civil defence for credible war deterrence
  • Please see also the post linked here, and our summary of the key points in Herman Kahn's much-abused call for credible deterrence, On Thermonuclear War, also linked here.

  • Hiroshima's air raid shelters were unoccupied because Japanese Army officers were having breakfast when B29s were detected far away, says Yoshie Oka, the operator of the Hiroshima air raid sirens on 6 August 1945...

  • In 1,881 burns cases in Hiroshima, only 17 (0.9%) were due to ignited clothing and 15 (0.7%) were due to the firestorm flames...


    "There has never been a war yet which, if the facts had been put calmly before the ordinary folk, could not have been prevented." - British Foreign Secretary Ernest Bevin, House of Commons Debate on Foreign Affairs, Hansard, 23 November 1945, column 786. (Unfortunately, secret Cabinet committees called "democracy" for propaganda purposes have not been quite so successful in preventing war.) Protection is needed against collateral civilian damage and contamination in conventional, chemical and nuclear attack, together with credible low-yield clean nuclear deterrence against conventional warfare, which in reality (not science fiction) costs far more lives. The anti-scientific media, who promulgate and exploit terrorism for profit, censor (1) vital, effective civil defense knowledge, and (2) effective, safe, low-yield air-burst clean weapons like the Mk54 and W79, which deter conventional warfare and escalation, allowing arms negotiations from a position of strength. This helped to end the Cold War in the 1980s. Opposing civil defense and the nuclear weapons that really deter conventional war is complacent and dangerous.

    War and coercion dangers have not stemmed from those who openly attack mainstream mistakes, but from those who camouflage themselves as freedom fighters in order to ban such free criticism itself, making the key facts seem taboo without even a proper debate, let alone financing research into unfashionable alternatives. Research and education in non-mainstream alternatives are needed before any unprejudiced debate can take place, to establish the basic facts. “Wisdom itself cannot flourish, nor even truth be determined, without the give and take of debate and criticism.” – Robert Oppenheimer (quotation from the H-bomb TV debate hosted by Eleanor Roosevelt, 12 February 1950).

    “Apologies for freedom? I can’t handle this! ... Deal from strength or get crushed every time ... Freedom demands liberty everywhere. I’m thinking, you see, it’s not so easy. But we have to stand up tall and answer freedom’s call!” – Freedom Kids

  • Friday, March 23, 2007

    Radiation Effects Research Foundation covers up the very low cancer rates of Hiroshima and Nagasaki nuclear survivors using a cynical obfuscation tactic

    In a controlled study of 36,500 monitored survivors, there were 176 leukemia deaths over a 40-year period, which is 89 more than occurred naturally in the matched unexposed control group. (Data: Radiation Research, volume 146, 1996, pages 1-27.) There were also 4,687 other cancer deaths, but that was merely 339 above the number in the control group, a far smaller relative rise than for leukemia. Radiation thus accounted for 51% of the leukemia deaths in the exposed group (leukemia is naturally very rare, so even a modest absolute excess roughly doubles the rate), but for merely 7% of the other cancer deaths. Adding all the cancers together, the total was 4,863 cancer deaths (virtually all natural cancer, nothing whatsoever to do with radiation), just 428 more than in the unexposed control group. Hence the total increase over the natural cancer rate due to bomb exposure was only 9%, spread over a period of 40 years. There was no increase whatsoever in genetic malformations.
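    The percentages here follow directly from the quoted death counts; a short sketch of the arithmetic (figures as quoted above, from Radiation Research, volume 146, 1996):

```python
# Excess-risk arithmetic for the 40-year follow-up of 36,500 monitored
# survivors (figures as quoted above from Radiation Research, v146, 1996).
leukemia_deaths = 176   # leukemia deaths in the exposed group
leukemia_excess = 89    # deaths above the unexposed control group
other_deaths = 4687     # other cancer deaths in the exposed group
other_excess = 339

# Fraction of each category attributable to bomb radiation:
leukemia_attrib = leukemia_excess / leukemia_deaths   # ~0.51, i.e. 51%
other_attrib = other_excess / other_deaths            # ~0.07, i.e. 7%

# All cancers combined:
total_deaths = leukemia_deaths + other_deaths         # 4,863
total_excess = leukemia_excess + other_excess         # 428
total_attrib = total_excess / total_deaths            # ~0.09, i.e. 9%

print(f"leukemia {leukemia_attrib:.0%}, other {other_attrib:.0%}, "
      f"all cancers {total_attrib:.0%}")
```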

    'This continues the series of general reports on mortality in the cohort of atomic bomb survivors followed up by the Radiation Effects Research Foundation. This cohort includes 86,572 people with individual dose estimates ... There have been 9,335 deaths from solid cancer and 31,881 deaths from noncancer diseases during the 47-year follow-up. ... We estimate that about 440 (5%) of the solid cancer deaths and 250 (0.8%) of the noncancer deaths were associated with the radiation exposure [emphasis added]. ... a new finding is that relative risks decline with increasing attained age, as well as being highest for those exposed as children as noted previously. A useful representative value is that for those exposed at age 30 the solid cancer risk is elevated by 47% per sievert at age 70. ... There is no direct evidence of radiation effects for doses less than about 0.5 Sv [emphasis added; notice that this report considers 86,572 people with individual dose estimates, and 40% have doses below 5 mSv or 0.005 Sv, so the politically expedient so-called 'lack of evidence' is actually a fact backed up by one hell of a lot of evidence that there are no radiation effects at low doses, a fact the biased scare-story-selling media and corrupt politically-expedient politicians will never report!].' - D. L. Preston, Y. Shimizu, D. A. Pierce, A. Suyama, and K. Mabuchi, 'Studies of mortality of atomic bomb survivors. Report 13: Solid cancer and noncancer disease mortality: 1950-1997', Radiation Research, volume 160, issue 2, pp. 381-407 (2003).
    Above: what is being politically covered up in the latest reports by the Radiation Effects Research Foundation. D. A. Pierce and D. L. Preston (Radiation Effects Research Foundation, Hijiyama Park, Hiroshima) wrote in 'Radiation-related cancer risks at low doses among atomic bomb survivors', Radiation Research, volume 154, issue 2, pp. 178-186 (August 2000): 'To clarify the information in the Radiation Effects Research Foundation data regarding cancer risks of low radiation doses, we focus on survivors with doses less than 0.5 Sv. ... Analysis is of solid cancer incidence from 1958-1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. The results provide useful risk estimates for doses as low as 0.05-0.1 Sv, which are not overestimated by linear risk estimates computed from the wider dose ranges 0-2 Sv or 0-4 Sv. There is ... an upper confidence limit on any possible threshold is computed as 0.06 Sv [emphasis added]. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.' In the illustration above, a mean dose of 3.4 rads (gamma dose equivalent) is seen to have reduced the natural leukemia rate by 30% in the Hiroshima and Nagasaki data available in 1982. There seems to be a "threshold" of 8 rads before there is any increase in risk. (H. Kato and W. J. Schull, 'Studies of the mortality of A-bomb survivors. 7. Mortality, 1950-1978: Part I. Cancer mortality', Radiation Research, May 1982, volume 90, issue 2, pp. 395-432.) The accuracy in dosimetry at that time (substantiated by measurements of neutron-induced activity and gamma ray thermoluminescence in the two cities) meant that the doses were generally believed accurate to about +/- 50% (the accuracy of later estimates has increased).
These data are based on a radiation quality factor of about 20 for neutrons, adopted to reconcile data from the two cities (the Hiroshima gun-type bomb leaked the most neutrons; in the Nagasaki device, which worked by spherically symmetrical implosion, they were mainly absorbed in the surrounding high explosive), i.e., 1 rad from neutrons was considered equivalent to 20 rads of gamma rays. The reason for the reduction in the natural leukemia rate at 3.4 rads may be stimulation of the protein P53 repair mechanism, which repairs DNA strands broken by radiation, and/or some long-term boosting of the immune system caused by surviving the nuclear explosions with low doses. It is unlikely to be a purely random statistical error, because the sample of people exposed to low doses of radiation is very large - 23,073 people exposed to an average of 3.4 rads, with an unexposed control group of 31,581. However, the exact doses received were still fairly uncertain in 1982, and the survivors of Hiroshima and Nagasaki were still dying:


    This means that the early data from the 1950s, upon which the whole health physics philosophy is based (a linear dose-effects relation with no threshold dose before effects start to appear, and no effect of dose rate - see previous post), is useless: not only because of the poor dosimetry, but because it was premature to judge long-term effects from such early data. For example, the major source of 1950s data from Hiroshima and Nagasaki is summarised in a table on page 966 of the 1957 U.S. Congressional Hearings before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, The Nature of Radioactive Fallout and Its Effects on Man. This table was headed "Incidence of leukemia among the combined exposed populations of Hiroshima and Nagasaki by distance from the hypocenter (January 1948-September 1955)", and is divided into distances of 0-1 km (0.96% of survivors had leukemia), 1-1.5 km (0.30%), 1.5-2 km (0.043%) and beyond 2 km (0.017%). This early data was simply not detailed enough, nor collected over a long enough period, to assess the effects of radiation properly, and there was no proper dosimetry to determine the doses people received, their shielding by houses (and the mutual shielding of clusters of houses), etc. The valuable data has taken decades to accumulate.

    The joint Japanese-American (Department of Energy-funded) Radiation Effects Research Foundation isn't putting the sort of detailed dose-effects data we need on the internet, due to political bias in favour of fashionable prejudice in Japan, despite such bias being cynically anti-scientific, ignorance-promoting, politically expedient dogmatism: its online 16-page booklet called 'Basic Guide to Radiation and Health Sciences' gives no quantitative results on radiation effects whatsoever, while it falsely promotes lies about radioactive rainout on page 5:



    Above: by the time that the mass fires developed in the wooden homes of Hiroshima (breakfast time) and Nagasaki (lunch time), from blast-overturned cooking braziers igniting paper screens and bamboo furnishings, the mushroom cloud had been blown away by the wind. The moisture and soot from the firestorm in Hiroshima, which condensed to a 'black rain' once it had risen and cooled above the city, fell on the city an hour or two after the explosion and did not intersect the radioactive mushroom cloud, which had in any case attained a much higher altitude than the firestorm soot and moisture. The neutron-induced activity doses from the ground were trivial compared to the outdoor initial nuclear radiation doses, as illustrated in a previous post using the latest DS02 dosimetry. The RERF propaganda seeks to discredit civil defence, a continuation of the fellow-travelling Cold War Soviet communist propaganda against Western defenses.

    ‘Science is the organized skepticism in the reliability of expert opinion.’

    - R. P. Feynman (quoted by Smolin, The Trouble with Physics, 2006, p. 307).

    ‘Science is the belief in the ignorance of [the speculative consensus of] experts.’

    - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

    The linear non-threshold (LNT) anti-civil-defence dogma results from ignoring the vitally important effect of dose rate on cancer induction, which has been known and published in papers by Mole and a book by Loutit for about 50 years; the current dogma is falsely based on merely the total dose, thus ignoring the time-dependent ability of protein P53 and other cancer-prevention mechanisms to repair broken DNA segments. This is particularly the case for double strand breaks, where the whole double helix gets broken; the repair of single strand breaks is less time-dependent because there is no risk of the broken single strand being joined to the wrong end of a broken DNA segment. Repair is only successful in preventing cancer if the broken ends are repaired correctly before too many unrepaired breaks have accumulated in a short time; if too many double strand breaks occur quickly, segments can be incorrectly 'repaired' with double strand breaks being mismatched to the wrong segment ends, possibly inducing cancer if the resulting somatic cell can then undergo division successfully without apoptosis.
    See http://www.rerf.jp/top/qae.htm. If you look at the data they provide at http://www.rerf.or.jp/eigo/faqs/faqse.htm, it only extends to 1990 and deliberately doesn't include any dosimetry at all (although the doses depend on shielding, they could have dealt with this by providing average shielding figures at each range, or simply by ignoring distance and plotting dose versus effects). But I found the 1988 report update based on the 1986 dosimetry, which is close to the latest data: Y. Shimizu, et al., Life Span Study Report II, Part 2, Cancer Mortality in the Years 1950-1985 Based on the Recently Revised Doses (DS86), Radiation Effects Research Foundation, RERF-TR-5-88:

    You can see that small doses of up to 5 rads have no effect either way on the leukemia risk, while 6-9 rads in this data seem to cause a reduction in the normal leukemia risk from 0.17% to 0.12%. Doses exceeding this are harmful, possibly because the P53 repair mechanism was saturated and could not repair radiation-induced damage to DNA at the rate it occurred at higher doses. A dose of 20-40 rads more than doubles the natural leukemia risk. Hence anyone getting leukemia after such a dose is more likely than not (over 50% probability) to have got the cancer as a result of the radiation exposure rather than naturally. (You cannot say this about other forms of cancer, because 23% of Americans now die from some form of cancer anyway, so even the risks at massive radiation doses can't compete with the natural risk for most types of cancer.) Notice that the DS02 dosimetry dose-effects estimates are within 10% of the earlier DS86 estimates. DS02 (Dosimetry System 2002) was adopted in 2003 and gives a radiation dose at 1 m above the ground in open terrain at 1 km from ground zero of 4.5 and 8.7 Gy in Hiroshima and Nagasaki, respectively, with 0.08 and 0.14 Gy at 2 km, respectively. According to the recent Life Span Study report for the period 1950-2000, among 86,611 survivors for whom individual doses were estimated, there were 47,685 deaths (55% of the survivors alive in 1950), including 10,127 from solid cancer and 296 from leukemia. Of the 10,127 solid cancer deaths, only 5% were due to radiation, as shown by comparison to a non-exposed (but otherwise matched) control group.
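    The "more than 50% likely" criterion is what epidemiologists call the probability of causation: if a dose multiplies the natural risk by a relative risk RR, the fraction of cases attributable to radiation is (RR - 1)/RR. A minimal sketch (the function name is mine, not RERF's):

```python
def attributable_fraction(rr):
    """Probability that a given case was radiation-induced, for relative risk rr."""
    return (rr - 1.0) / rr

# Risk exactly doubled (RR = 2): half of all cases are attributable, so an
# individual case is as likely radiogenic as natural; above RR = 2 (as quoted
# for 20-40 rads) the balance tips to "more likely than not".
print(attributable_fraction(2.0))  # 0.5
print(attributable_fraction(3.0))  # ~0.667
```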

    In 1969, Professor Ernest Sternglass, a physicist, correlated a claimed dramatic increase in infant mortality during the 1950s with the increasing fallout radiation from nuclear testing. His papers and books on low-level radiation effects were unscientific: they illustrate how not to do science. He had no control group, unlike the Hiroshima and Nagasaki studies, so he had no idea what was actually influencing childhood mortality. It could have been diet, proximity to smoking adults at home, effects of natal X-rays (see previous post), childhood X-ray checks or screening for TB, etc.


    Above: Professor Sternglass' analysis, which wasn't even based upon a real increase in childhood mortality, which was falling before, during and after the nuclear tests. Sternglass instead claimed that, in the absence of nuclear testing, childhood mortality should (in his opinion) have somehow continued to decrease at the average rate of 1935-50 (when better medical care was reducing childhood mortality). He then claimed that the flattening of the curve which occurred instead was evidence for a relative increase in childhood mortality due to radiation from nuclear test fallout.

    He was therefore first assuming that fallout from bomb testing was responsible, and then - without stating this assumption - using it to claim that the correlation between infant mortality and fallout rate was evidence that fallout was causing the increase! His first presentation was at the 9th Annual Hanford Biological Symposium, May 1969. On 24 July 1969, Dr Alice Stewart wrote an article for the New Scientist, "The Pitfalls of Extrapolation", which found another contradiction in Professor Sternglass' theory:

    "Sternglass has postulated a fetal mortality trend which would eventually produce rates well below the level which - according to his own theory - would result from background radiation."

    The danger here is that bad science, lacking any mechanism, can be asserted and become credible with the public despite being completely false, just because a scientist misuses authority to gain attention. In this case, when Sternglass' paper was rejected by a scientific journal, he had it published in the September 1969 issue of Esquire magazine, titled 'The death of all children'. The magazine advertised the story as a selling point, and sent out copies to prominent people in politics. If he had possessed scientific evidence that was being covered up, that would have been a reason to do so, assuming the media would be interested in making a political storm out of the facts (which in reality strongly support the opposite of Sternglass' conclusion). So you end up with the idea that these false claims about low-level radiation stem from politics: if the public wants to fear low-level, low-dose-rate radiation, someone will fiddle the statistics accordingly. Anyone giving the facts is conveniently ignored, or ridiculed as being 'out of touch' or part of a conspiracy and cover-up.

    Sternglass' straight-line extrapolation is pure pseudoscience: carried into the past it predicts a time of 100% infant mortality (evidently wrong, because people are alive now), and extrapolated into the future it predicts 0% childhood mortality (clearly false, because disease cannot be wholly eradicated, despite innovations like sulfonamides and antibiotics in the 1935-50 era). This type of error, due to a lack of causality and proper mechanism-based predictions, is not limited to the controversy over the effects of radiation. It is also typical of how controversy is created by people like Dr Edward Witten, string theorist, and is completely false science: Witten claims that string theory has the wonderful property of "predicting gravity". Actually, it predicts nothing checkable: 11-dimensional supergravity is assumed to be true because gravity is assumed to be due to spin-2 gauge bosons (gravitons), which nobody has ever observed. What Witten should have said is that string theory is an ad hoc model which includes unobserved spin-2 gravitons, which is a far cry from claiming that it predicts gravity. These people are no doubt in some way well-meaning, but they are being deliberately misleading over scientific facts to boost some research program or political viewpoint, for the sake of politics or controversy, not science. As stated in an earlier post, quite a bit of iodine-131 was released across America by Nevada testing in 1951-62, but even the effects of that were far smaller than what Sternglass was claiming.
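    The reductio ad absurdum of a straight-line mortality extrapolation can be made concrete with purely illustrative numbers (the 1935 rate and the slope below are hypothetical, chosen only to exhibit the shape of the error, not Sternglass' actual fit):

```python
def linear_mortality(year, rate_1935=0.06, slope_per_year=-0.002):
    """Hypothetical infant mortality fraction under a straight-line 1935-50 fit."""
    return rate_1935 + slope_per_year * (year - 1935)

# Extrapolated forward, the line crosses zero and goes negative (impossible):
assert linear_mortality(1970) < 0
# Extrapolated backward, it exceeds 100% mortality (equally impossible):
assert linear_mortality(1400) > 1.0
```

    Any straight line with a nonzero slope must eventually leave the physically meaningful 0-100% band in both directions, which is why such fits can only ever be local interpolations, never predictions.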

    Darrell Huff's book How to Lie with Statistics gives the example of researchers who found that the number of children in a Dutch family correlated with the number of storks' nests on the roof of the home! Perhaps that proves storks really deliver children? Actually, the bigger the family, the bigger the home needed on average: bigger families tended to have bigger, older houses, whose big old roofs held more storks' nests because of their size and age. Professor Sternglass has recently made a change from claiming that low-level radiation is lethal: he has published a book about what happened before the big bang, drawing an analogy to an egg dividing many times to produce all the particles.

    (I don't find that too scientific either because it just ignores the pair-production mechanism for the creation of fundamental particles in strong fields, it is essentially ad hoc theorizing which doesn't explain or predict the key issues in the cosmological application of general relativity - such as the epicycles like dark matter and dark energy - it doesn't explain the Standard Model of Particle Physics, and as a result, perhaps, it has not gained so much attention as his claims on low-level radiation. However, Sternglass is right in some details, such as the cause of the double slit experiment interference with single photons being the size of the photon compared to the slit spacing, about Bohr's mainstream Copenhagen Interpretation orthodoxy being not even wrong unpredictive belief, and about Dirac's sea in quantum field theory being censored today as a physical mechanism because of heresies over aether, which he discussed with Einstein and others like Feynman, who advised him to check and prove his ideas more carefully.)

    Update: the report by Donald A. Pierce and Dale L. Preston of RERF, 'Radiation-Related Cancer Risks at Low Doses among Atomic Bomb Survivors' in Radiation Research v. 154 (2000), pp. 178–186 states: 'Analysis is of solid cancer [not leukemia] incidence from 1958–1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. ... There is a statistically significant risk in the range 0–0.1 Sv, and an upper confidence limit on any possible threshold is computed as 0.06 Sv. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.'

    D. L. Preston et al., 'Effect of Recent Atomic Bomb Survivor Dosimetry Changes on Cancer Mortality Risk Estimates,' Radiation Research, volume 162 (2004), pp. 377-389, state: 'The Radiation Effects Research Foundation has recently implemented a new dosimetry system, DS02, to replace the previous system, DS86. This paper assesses the effect of the change on risk estimates for radiation-related solid cancer and leukemia mortality. The changes in dose estimates were smaller than many had anticipated, with the primary systematic change being an increase of about 10% in γ-ray estimates for both cities. In particular, an anticipated large increase of the neutron component in Hiroshima for low-dose survivors did not materialize. However, DS02 improves on DS86 in many details, including the specifics of the radiation released by the bombs and the effects of shielding by structures and terrain. ... For both solid cancer and leukemia, estimated age–time patterns and sex difference are virtually unchanged by the dosimetry revision. The estimates of solid-cancer radiation risk per sievert and the curvilinear dose response for leukemia are both decreased by about 8% by the dosimetry revision, due to the increase in the γ-ray dose estimates...' However, the difficulty of finding any recent summary of the key data on the internet suggests that they are not publishing the detailed dose-versus-effects data, but just some average based on force-fitting the high-dose effects data to the linear, no-threshold model. Otherwise, presumably, it would be embarrassing to the orthodoxy and draw the ignorant scorn of the anti-nuclear lobby. Of course, the public at large only wants to hear lies about radiation because they've been brainwashed by propaganda based on prejudice, not science, and the media provide what readers want to hear: political arguments.

    The information in the current online version of http://www.rerf.or.jp/eigo/faqs/faqse.htm#faq2, quoting data for 1950-90 from Radiation Research (146:1-27, 1996) without any doses to correspond to the effects, despite the massive 2002 dosimetry project, clearly seems to prove that the Radiation Effects Research Foundation is covering up the dose-effects data: it makes available on the internet only data stripped of the dosimetry, so as not to upset the 1950s linear, no-threshold religious-style orthodoxy, or rather dogma. Of course, if they didn't cover up, there would be uproar. So the one really valuable source of information is censored.

    There is no other really reliable data, because of the lack of good control groups (with similar exposures to other risks, similar lifestyles, etc.) and of statistically significant exposed population sizes. For example, the 64 Marshallese on Rongelap at the Bravo test, who were exposed to about 175 rads of gamma radiation from fallout over 44 hours in 1954, are too small a sample to yield accurate long-term data. In 1972, one person in the group of 64 died from leukemia due to the gamma radiation, and several thyroid nodules (most thyroid effects of radiation are not lethal) also occurred, as a result of beta radiation to the thyroid gland from drinking water collected in an open cistern contaminated by fallout containing iodine-131. Although this gives a leukemia risk of 1/64 after 175 rads received over 44 hours, the figure is statistically very weak because of the small sample size.
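    Why a single case in 64 people is so weak can be shown with a simple binomial calculation: the chance of observing at least one case barely discriminates between true per-person risks differing by a factor of fifty (a sketch; the example risk values are arbitrary):

```python
def p_at_least_one(true_risk, n=64):
    """Chance of observing one or more cases among n people at a given per-person risk."""
    return 1.0 - (1.0 - true_risk) ** n

# A single observed case is compatible with a very wide range of true risks:
for risk in (0.001, 0.005, 0.01, 0.05):
    print(f"true risk {risk:.3f}: P(at least 1 case in 64) = {p_at_least_one(risk):.2f}")
```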

    Hiroshima and Nagasaki data are being deliberately abused for propaganda purposes by ignoring the low-dose data, and falsely taking high-dose data and applying it as if effects were directly proportional to dose, with no threshold and no dose-rate effect. Claims have sometimes been made in the past that the cancer rates were worse than previously thought. In 1957, isolated Japanese-type wooden houses were exposed to nuclear tests in Nevada during Operation Plumbbob, to determine how much radiation shielding they provided. Obviously a cluster of houses will provide more shielding than an isolated house in a desert, because the slant direct and scattered radiation must also penetrate the surrounding buildings. It turned out that the wooden houses gave a typical protection factor of about 2-3 against the initial neutrons and gamma rays; the shielding by adjacent buildings was ignored. Later it was shown that the mutual shielding by surrounding houses in a city doubles the overall protection factor for wooden houses, from 2-3 to 4-6. As a result, the estimated doses were halved. This meant that the same number of cancers was attributed to only half as much radiation, so the inferred number of cancers per unit of radiation was doubled.
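    The effect of that dosimetry revision on the inferred risk coefficient is simple arithmetic; a sketch with hypothetical numbers (the 100 excess cancers and the dose figures below are made up purely for illustration):

```python
def risk_per_rad(excess_cancers, estimated_dose_rads):
    """Inferred excess cancers per rad of estimated dose."""
    return excess_cancers / estimated_dose_rads

# Hypothetical: 100 excess cancers observed in a cohort whose estimated mean
# dose is revised downward from 50 rads to 25 rads by the shielding correction.
old_estimate = risk_per_rad(100, 50.0)
new_estimate = risk_per_rad(100, 25.0)
assert new_estimate == 2 * old_estimate  # risk per unit dose doubles
```

    The observed effect is fixed; only the denominator (dose) changed, so the apparent potency of radiation doubled without any new cancers appearing.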

    So these revisions were caused by dosimetry, not new effects showing up! The dosimetry is very accurate now. The effects of radiation are "well known" in the scientific sense, although they're not "well known" in the political sense.

    Kenneth L. Mossman of Arizona State University wrote a review of the problem in the March 1998 issue of Medical Physics (v25, Issue No. 3, pp. 279-284), on 'The linear no-threshold debate: Where do we go from here?', arguing:

    'For the past several years, the LNT (linear no-threshold) theory has come under attack within the scientific community. Analysis of a number of epidemiological studies of the Japanese survivors of the atomic bombings and workers exposed to low level radiation suggest that the LNT philosophy is overly conservative, and low-level radiation may be less dangerous than commonly believed. Proponents of current standards argue that risk conservatism is justified because low level risks remain uncertain and it is prudent public health policy; LNT opponents maintain that regulatory compliance costs are excessive, and there is now substantial scientific information arguing against the LNT model. Regulators use the LNT theory in the standards setting process to predict numbers of cancers due to exposure to low level radiation because direct observations of radiation-induced cancers in populations exposed to low level radiation are difficult. The LNT model is simplistic and provides a conservative estimate of risk. Abandoning the LNT philosophy and relaxing regulations would have enormous economic implications. However, alternative models to predict risk at low dose are as difficult to justify as the LNT model. Perhaps exposure limits should be based on model-independent approaches. There is no requirement that exposure limits be based on any predictive model. It is prudent to base exposure limits on what is known directly about health effects of radiation exposure of human populations.'

    A more recent review, in 2005, of the mechanism behind the Hiroshima and Nagasaki data at low doses was done by L. E. Feinendegen in his paper, 'Evidence for beneficial low level radiation effects and radiation hormesis' in the British Journal of Radiology, v78 (2005), pp. 3-7:

    'Low doses in the mGy range [1 mGy = 0.1 rad, since 1 Gray = 1 Joule/kg = 100 rads] cause a dual effect on cellular DNA. One is a relatively low probability of DNA damage per energy deposition event and increases in proportion to the dose. At background exposures this damage to DNA is orders of magnitude lower than that from endogenous sources, such as reactive oxygen species. The other effect at comparable doses is adaptive protection against DNA damage from many, mainly endogenous, sources, depending on cell type, species and metabolism. Adaptive protection causes DNA damage prevention and repair and immune stimulation. It develops with a delay of hours, may last for days to months, decreases steadily at doses above about 100 mGy to 200 mGy and is not observed any more after acute exposures of more than about 500 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses reduction of damage from endogenous sources by adaptive protection may be equal to or outweigh radiogenic damage induction. Thus, the linear-no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favour of a threshold or hormesis. This is consistent with data both from animal studies and human epidemiological observations on low-dose induced cancer. The LNT hypothesis should be abandoned and be replaced by a hypothesis that is scientifically justified and causes less unreasonable fear and unnecessary expenditure.'
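    Since the sources quoted here mix rads, grays, milligrays and sieverts, the conversions (stated in the bracketed note in the quotation above) can be collected in one trivial helper sketch:

```python
# 1 Gy = 1 J/kg = 100 rads; 1 mGy = 0.1 rad. For gamma rays (quality factor 1)
# 1 Gy of absorbed dose corresponds to 1 Sv of dose equivalent.
def gray_to_rad(gy):
    return gy * 100.0

def mgy_to_rad(mgy):
    return mgy * 0.1

def msv_to_sv(msv):
    return msv / 1000.0

assert gray_to_rad(1.0) == 100.0  # 1 Gy = 100 rads
assert mgy_to_rad(1.0) == 0.1     # 1 mGy = 0.1 rad, as in the quote above
assert msv_to_sv(5.0) == 0.005    # 5 mSv = 0.005 Sv, as quoted earlier
```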

    Online there is a 1982 book by Harvey Wasserman, Norman Solomon, Robert Alvarez and Eleanor Walters called 'Killing our Own: Chronicling the Disaster of America's Experience with Atomic Radiation, 1945-1982'. It contains a summary of all the radiation horror stories (some like Sternglass, et al., are pseudoscience, and some are valid). It doesn't contain any of the basic data with large control groups that shows how many excess cancers actually occur as a function of dose for particular dose rates. It relies instead on the opinions of committees and scientific authorities, repeating Sternglass' claims in chapter 11 and complaining that 'The industry as a whole has devoted thousands of dollars to undercutting his reputation.' That's the problem: you can't deal with errors by making ad hominem attacks on the reputations of the people making the errors, but by clearly emphasising where the errors are. Better still, publish the facts briefly, clearly, honestly, and fairly as simple graphs in the first place, and then the public will know what they are and will be able to make informed judgements.

    In May 1985, a U.S. National Research Council report on mortality in nuclear weapons test participants raised several questions. Some 5,113 nuclear test participants had died between 1952 and 1981, when 6,125 deaths would be expected for a similarly sized group of non-exposed Americans. The number of leukemia deaths was 56, identical to that in a similarly sized non-exposed group. However, as the graph at the top of this post shows, the risk depends on the dose, so the few people with the highest doses would have far greater risks. In 1983, a C.D.C. report on the effects of fallout from the Plumbbob-Smoky test in 1957 showed that 8 participants in that test had died from leukemia up to 1979, compared to only 3 expected from a similarly sized sample of non-exposed Americans. However, even for the Plumbbob-Smoky test, the overall death rate from all causes in the exposed test participants (320 deaths from 1957-79) was less than that in a matched sample of non-exposed Americans (365 deaths). The average dose to American nuclear test participants was only about 0.5 rad, although far higher doses were received by those working with fallout soon after nuclear tests. Altogether, out of 205,000 U.S. Department of Defense participants in nuclear tests, 34,000 were expected to die from naturally occurring cancer, and 11 from cancer due to radiation exposure. (According to the March 1990 U.S. Defense Nuclear Agency study guide DNA1.941108.010, report HRE-856, Medical Effects of Nuclear Weapons.)

    Update (13 August 2007):

    There is an essay by Dr Donald W. Miller, Afraid of Radiation? Low Doses are Good for You, available in PDF format here. Two problems with that title are:

    (1) as pointed out in previous posts, only long-ranged, low-LET radiation like gamma rays and x-rays (which are electrically neutral, and thus only weakly ionizing) exhibits a threshold in all reliable data. Alpha and beta radiations are short-ranged, high-LET radiation, so where they can gain entry to the body (by being inhaled or ingested in soluble form, for example, which is not too easy for insoluble radioactivity trapped in fallout particles composed of fused glass spheres from melted sand grains), they can irradiate a few nearby cells very intensely because of their short range. With alpha and beta radiation, there is no threshold dose and all exposure is potentially harmful; the effects do obey a linear dose-response relationship at low doses of alpha and beta exposure. Only for gamma and x-rays at low dose rates are there thresholds and benefits possible from boosting the immune system and DNA repair mechanisms like P53.

    (2) the dose rate seriously affects the rate of cancer induction, which is an effect currently ignored completely by Health Physicists. This is because all laws and personal 'dosimeter' radiation monitoring systems for radiation safety record merely the integrated total doses, without regard to the dose rate at which the dose was received. (Some effects prediction schemes do make arbitrary 'factor of two' corrections, doubling the danger expected from doses received above some threshold for high dose rates, but these corrections grossly neglect the observed facts; see previous post for details of how this was discovered in animal experiments, and why it is still censored out!).

    Summary: gamma or x-ray radiation received at a low dose rate in small total doses can reduce the normal cancer rate. If this small total dose is received at a high dose rate, however, protein P53 may not be fast enough to repair the damage successfully during the exposure, and if there are multiple breaks in DNA strands produced in a short period of time, the broken bits risk being 'repaired' incorrectly (the wrong way around or whatever), initiating cancer at some time in the future when that DNA is unzipped and thus copied in order to create new cells.

    This isn't rocket science. As an analogy, solar radiation contains ultraviolet radiation, which will make a geiger counter click rapidly (provided it is fitted with a transparent glass window, not a shield that keeps ultraviolet light out!), since ultraviolet borders the soft x-ray spectrum and is weakly ionizing. If you receive ultraviolet radiation at a low dose rate in small total doses, the positive effects may outweigh the risks: vitamin D is produced, which is helpful rather than dangerous. If, however, you are exposed to very intense ultraviolet, the DNA in your skin gets broken up at a rate faster than protein P53 can stick the pieces together again, so some bits are put back together in the wrong order, and skin cancer may eventually result when those cells try to divide to form new skin cells. The visible 'burning' of skin by ultraviolet is also due to dose rate effects causing cellular death and serious cellular disruption. It doesn't matter so much what the total dose is. What matters even more than the dose, for long term effects, is the dose rate (speed) at which the radiation dose is received.

    The key facts about radiation seem to be: it's all harmful at sufficiently high dose rates and at high doses. Gamma and x-rays are 'safe' (i.e., have advantages which outweigh risks) at low dose rates (obviously dose rates were high at Hiroshima and Nagasaki, where 95% of the doses were received from initial radiation lasting 10 seconds) and at low total doses. On the other hand, there is always a risk from cellular exposure to alpha and beta radiation, because they are short-ranged, so their energy is all absorbed in just a small number of cells. Because they are quickly stopped by solid matter, they deposit all their energy in sensitive areas of bone tissue if you inhale or ingest sources of alpha and beta radiation that can be deposited in bones (a very small fraction of ingested soluble radium, strontium, uranium, and plutonium can end up in the bones). Gamma rays and x-rays are not dangerous at low dose rates and small total doses because, carrying no electrical charge, they are not stopped as easily by matter as alpha and beta particles are. This means that gamma and x-rays deposit their energy over a larger volume of tissue, so that at low dose rates DNA repair mechanisms can repair damage as soon as it occurs.

    Anyway, to get back to the paper by Donald W. Miller, Jr., MD, he does usefully explain an evolved conspiracy to confuse the facts:

    'A process known as radiation hormesis mediates its beneficial effect on health. Investigators have found that small doses of radiation have a stimulating and protective effect on cellular function. It stimulates immune system defenses, prevents oxidative DNA damage, and suppresses cancer.'

    He cites the monumental report on effects of low dose rate, low-LET gamma radiation on 10,000 people in Taiwan by W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, Is Chronic Radiation an Effective Prophylaxis Against Cancer?, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here:

    'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

    'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

    'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

    'The experience of these 10,000 persons suggests that long term exposure to [gamma] radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'



    The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations of the Hiroshima and Nagasaki data to low doses. For instance, the survivors of high doses at Hiroshima and Nagasaki knew they had been irradiated, so there was no blinding to prevent an "anti-placebo" (nocebo) effect: increased fear, psychological stress and worry about the long term effects of radiation, and the associated behaviour. The 1958 book about the Hiroshima and Nagasaki survivors, "Formula for Death", makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation induced cancer anyway. The fear culture among irradiated survivors would therefore be expected, statistically, to produce deviations from normal behaviour, in some cases increasing the cancer risks above those due purely to radiation exposure.

    For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see

    http://en.wikipedia.org/wiki/Radiation_hormesis

    What an irrational, fashionable, groupthink, semi-religious (speculation-believing) society we live in!

    ‘Science is the organized skepticism in the reliability of expert opinion.’ - R. P. Feynman (quoted by Smolin, TTWP, 2006, p. 307).

    ‘Science is the belief in the ignorance of [the speculative consensus of] experts.’ - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

    If we lived in a rational society, the facts above would be reported in the media and would be the focus of discussion about radiation hazards. Instead, the media and their worshippers (the politicians), as well as their funders (the general public, who pay for the media), choose to ignore or ridicule the facts, because the facts are 'unfashionable' while lying bullshit (see the Sternglass graph above) is 'fashionable': some sort of consensus of mainstream narcissistic elitists with a political mandate to kill people by lying about the effects of low-level radiation and refusing to discuss the facts. There is no uncertainty about these facts, as radiation effects have been better checked and more extensively studied than any other alleged hazard to life!

    Below is a little summary of politically inexpedient facts from a book edited by Nobel Laureate Professor Eugene P. Wigner, Survival and the Bomb: Methods of Civil Defense, Indiana University Press, Bloomington, London, 1969.

    The dust jacket blurb states: 'The purpose of civil defence, Mr. Wigner believes, is the same as that of the anti-ballistic missile: to provide not a retaliation to an attack, but a defense against it; for no peace is possible as long as defense consists solely in the threat of revenge and as long as an aggressor - the one who strikes first - has a considerable advantage. Civil and anti-ballistic missile defense not only provide some protection against an attack, they render it less likely by decreasing the advantage gained by striking first.'

    The chapter on 'Psychological Problems of A-Bomb Defense' is by Professor of psychology, Irving L. Janis, who states on p. 61:

    'It has been suggested that the device of using increasing doses of graphic sound films (preferably in technicolor) showing actual disasters should be investigated as a possible way of hardening people and preventing demoralization.'

    He adds on pp. 62-3:

    'For the large number of patients who will be worried about epilation, ugly scar tissue, and other disfigurements, a special series of pamphlets and posters might be prepared in advance, containing reassuring information about treatment and the chances of recovery.'

    On pp. 64-5 he deals with the 'General Effects on Morale of A-Bomb Attack':

    'In general, a single atomic bomb disaster is not likely to produce any different kind of effects on morale than those produced by other types of heavy air attacks. This is the conclusion reached by USSBS [U.S. Strategic Bombing Survey, 1945] investigators in Japan. Only about one-fourth of the survivors of Hiroshima and Nagasaki asserted that they had felt that victory was impossible because of the atomic bombing. The amount of defeatism was not greater than that in other Japanese cities. In fact, when the people of Hiroshima and Nagasaki were compared with those in all other cities in Japan, the morale of the former was found to resemble that of people in the lightly bombed and unbombed cities rather than in the heavily bombed cities. This has been explained as being due to the fact that morale was initially higher than average in the two cities because, prior to the A-Bomb disasters, the populace had not been exposed to a series of heavy air attacks. Apparently a single A-Bomb attack produced no greater drop in morale among the Japanese civilians than would be expected from a single saturation raid of incendiaries or of high explosive bombs.'

    On p. 68, Professor Janis addresses the question 'Will There Be Widespread Panic?':

    'Prior to World War II, government circles in Britain believed that if their cities were subjected to heavy air raids, a high percentage of the bombed civilian population would break down mentally and become chronically neurotic. This belief, based on predictions made by various specialists, proved to be a myth.'

    The chapter on 'Decontamination' is by Dr Frederick P. Cowan (then the Head of the Health Physics Division, Brookhaven National Laboratory) and Charles B. Meinhold, who summarise data from a vital selection of decontamination research reports. The first report summarised (on page 227) is J. C. Maloney, et al., Cold Weather Decontamination Study, McCoy, I, II, and IV, U.S. Army Chemical Corps., Nuclear Defense Laboratory, Edgewood Arsenal, reports NDL-TR-24, -32, and -58 (1962, 1962 and 1964), which demonstrated that:

    1. 'In most cases, the time during which access to important facilities must be denied can be reduced by a factor of 10 (e.g., from two months to less than a week) using practical methods of decontamination.'

    2. 'Radiation levels inside selected structures can be reduced by a factor of 5.'

    3. 'Radiation levels outdoors in selected areas can be reduced by a factor of 20.'

    4. 'These results can be achieved without excessive exposure to individuals carrying out the decontamination.'

    On page 228, Cowan and Meinhold point out:

    'Although long sheltering periods may in some cases be reduced by the effect of rainfall or by transfer of people to less-contaminated areas, it is clear that decontamination is a very important technique for hastening the process of recovery.

    'Although the gamma radiation from fallout is the major concern, the effects of beta radiation should not be overlooked. Fallout material left on the skin for an extended period of time [this critical time is just a few minutes for fallout contamination an hour after the explosion, but much longer periods of exposure are required for burns if the fallout is more than an hour old, and after 3 days the specific activity of fallout from a land surface burst is simply too low to cause beta burns] can cause serious burns, and if inhaled or ingested in sufficient quantities, it can result in internal damage. Grossly contaminated clothing may contribute to such skin exposures or indirectly to the ingestion of radioactive material. Thus it may be necessary to resort to decontamination of body surfaces, clothing, food and water.'

    On pp. 229-230, the basic facts about land surface burst fallout stated are:

    1. 'The mass of the radioactive material itself is a tiny fraction of the mass of the inert fallout material with which it is associated. Thus, in discussing the mechanics of removal, fallout may be considered as a type of dirt.'

    2. 'In general, the amount of radioactive material removed is proportional to the total amount of fallout material removed.'

    3. 'Although the solubility of fallout particles depends on the composition of the ground where the detonation took place, it is fair to say that detonations over land will produce essentially insoluble particles while detonations over water will produce much less but fairly soluble fallout material. This soluble material will have a much greater tendency to adsorb to surfaces.'

    4. 'Under most circumstances one is dealing with small particle sizes.

    'The methods applicable to radiological decontamination are those available to dirt removal in general. Some common examples are sweeping, brushing, vacuuming, flushing with water, scrubbing, surface removal, and filtration. In addition, the radioactive material can be shielded by plowing, spading, covering with clean dirt or by construction of protective dikes. Such methods may utilize power equipment or depend upon manual labor. Their effectiveness will vary widely, depending upon the method of application, the type of surface, the conditions of deposition, etc. ...


    'Flushing with water can be very effective, particularly if the water is under pressure, the surface is smooth and proper drainage [to deep drains, where the radiation is shielded by intervening soil] is available. Under certain conditions, the use of water flushing during the deposition period can be of great value. The water will tend to fill the surface irregularities and prevent entrapment of particles. Soluble materials will be kept in solution, thereby reducing the chance of surface adsorption.'

    On p. 232, a useful summary table of decontamination is given:




    There is other extensive data on fallout decontamination in many previous posts on this blog, e.g., the posts here, here and here (this last link includes a slightly different table of decontamination efficiencies, which is interesting to compare with the table of data above), as well as several other earlier ones. In summing up the situation for urban area decontamination, Cowan and Meinhold state on p. 232:

    'A number of factors make large-scale decontamination useful in urban areas. Much of the area between buildings is paved and, thus, readily cleaned using motorized flushers and sweepers, which are usually available. If, in addition, the roofs are decontaminated by high-pressure hosing, it may be possible to make entire buildings habitable fairly soon, even if the fallout has been very heavy.'

    On page 237 they summarise the evidence concerning methods for the 'Decontamination of People, Clothing, Food, Water and Equipment':

    'Since fallout is basically dirt contaminated with radioactive substances, it can be largely removed from the skin by washing with soap and water. ... Not all the radioactivity will be removed by washing, but that remaining will not be large enough to be harmful. ... To be a problem in relation to food, fallout must get into the food actually eaten by people. ... Vegetables exposed to fallout in the garden will be grossly contaminated but may still be usable after washing if protected by an outer skin or husk or if penetration of fallout into the edible portions is not excessive. ... Reservoirs will receive fallout, but much of it will settle to the bottom, be diluted by the huge volume of water, or be removed by the filtering and purifying systems. Cistern water may be very contaminated if contaminated rainwater or water from contaminated roofs has been collected. Milk from cattle who have fed on contaminated vegetation may contain large quantities of radioactive iodine for a period of a month or more ... but milk can be used for durable products such as powdered milk or cheese, since the radioactive iodine decays with a half-life of eight days. Thus, after a month only 7 percent of the initial [Iodine-131] remains.'
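    The closing arithmetic in that quotation (an 8-day half-life leaving 'only 7 percent' after a month) follows directly from the exponential decay law. A quick sketch to verify it:

```python
# Fraction of iodine-131 remaining after t days, from its 8-day half-life.
HALF_LIFE_DAYS = 8.0

def fraction_remaining(t_days):
    """Exponential decay: fraction left after t_days."""
    return 0.5 ** (t_days / HALF_LIFE_DAYS)

# After one month (30 days): 0.5**(30/8) is about 0.074,
# i.e. roughly the 7 percent quoted above.
print(round(fraction_remaining(30) * 100, 1))  # 7.4 (percent)
```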

    There is then a chapter on 'Economic Recovery' by Professor of Economics, Jack Hirshleifer, who points out on page 244:

    'Economic recovery from localized bombing attacks in general has been quite remarkable. In Hiroshima, for example, power was generally restored to surviving areas on the day after the attack, and through railroad service recommenced on the following day. [Ref.: U.S. Strategic Bombing Survey, The Effects of Atomic Bombs on Hiroshima and Nagasaki, Washington, D.C., 1946, p. 8.]

    'By mid-1949, population was back to the preattack level, and 70 percent of the destroyed buildings had been reconstructed. [Ref.: Research Department, Hiroshima Municipal Office, as cited in Hiroshima, Hiroshima Publishing, 1949.]

    'In general, populations of damaged areas have been highly motivated to stay on, even in the presence of severe deprivation; once having fled, they have been anxious to return. The thesis has even been put forward that a community hit by disaster rebounds so as to attain higher levels of achievement than would otherwise have been possible. [Ref.: this refers to the study after the 1917 Halifax explosion, made by Samuel H. Prince, Catastrophe and Social Change, Columbia University-Longmans, Green, New York, 1920.] ...

    'In the midnineteenth century John Stuart Mill commented on:

    ... what has so often excited wonder, the great rapidity with which countries recover from a state of devastation; the disappearance, in a short
    time, of all traces of the mischiefs caused by earthquakes, floods, hurricanes,
    and the ravages of war. An enemy lays waste a country by fire and sword, and
    destroys or carries away nearly all the moveable wealth existing in it: all the
    inhabitants are ruined, and yet in a few years after, everything is much as it
    was before. -
    J.S. Mill, 'Principles of Political Economy', Ashley's New Edition, Longmans, Green, London, 1929, Book I, pp. 74-75.



    From Dr Samuel Glasstone and Philip J. Dolan, The Effects of Nuclear Weapons, 3rd ed., 1977, pp. 611-3:


    "From the earlier studies of radiation-induced mutations, made with fruitflies [by Nobel Laureate Hermann J. Muller and other geneticists who worked on plants, who falsely hyped their insect and plant data as valid for mammals like humans during the June 1957 U.S. Congressional Hearings on fallout effects], it appeared that the number (or frequency) of mutations in a given population ... is proportional to the total dose ... More recent experiments with mice, however, have shown that these conclusions need to be revised, at least for mammals. [Mammals are biologically closer to humans, in respect to DNA repair mechanisms, than short-lived insects whose life cycles are too short to have forced the evolutionary development of advanced DNA repair mechanisms, unlike mammals that need to survive for decades before reproducing.] When exposed to X-rays or gamma rays, the mutation frequency in these animals has been found to be dependent on the exposure (or dose) rate ...


    "At an exposure rate of 0.009 roentgen per minute [0.54 R/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. [Emphasis added.] There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent ... with adult female mice ... a delay of at least seven weeks between exposure to a substantial dose of radiation, either neutrons or gamma rays, and conception causes the mutation frequency in the offspring to drop almost to zero. ... recovery in the female members of the population would bring about a substantial reduction in the 'load' of mutations in subsequent generations."
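    The bracketed exposure-rate conversion in that quotation is a one-liner to verify (a trivial sketch):

```python
# Convert the quoted mouse-experiment threshold exposure rate
# from roentgens per minute to roentgens per hour.
rate_r_per_min = 0.009
rate_r_per_hour = rate_r_per_min * 60
print(round(rate_r_per_hour, 2))  # 0.54 R/hour, as given in brackets above
```

    For comparison, that threshold rate of 0.54 R/hour is tens of thousands of times the 0.02 mR/hour natural background rate discussed in the comments below.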






    Above: the theory of the experimentally observed threshold doses for the radium dial painters and for the Hiroshima survivors.

    Updates: http://glasstone.blogspot.com/2009/10/secrecy-propaganda-factual-evidence.html

    31 Comments:

    At 7:39 pm, Blogger nige said...

    http://nige.wordpress.com/2007/04/05/are-there-hidden-costs-of-bad-science-in-string-theory/

    ... Low-level radiation is another example of a science being controlled by politics.

    By the time the protein P53 repair mechanism for DNA breaks was discovered and the Hiroshima-Nagasaki effects of radiation were accurately known, the nuclear and health physics industries had been hyping inaccurate radiation effects models which ignored non-linear effects (like saturation of the normal P53 repair mechanism of DNA) and the effects of dose rate for twenty years.

    The entire industry had become indoctrinated in the philosophy of 1957, and there was no going back. Most health physicists are employed by the nuclear or radiation industry at reactors or in medicine/research, so all these people have a vested interest in not rocking their own boat. The only outsiders around seem to be politically motivated in one direction only (anti-nuclear), so there’s a standoff. Virtually everyone who enters the subject of health physics gets caught in the same trap, and so there is no mechanism in place to allow for any shift of consensus....

     
    At 9:52 am, Blogger nige said...

    http://motls.blogspot.com/2006/04/twenty-years-after-chernobyl.html

    Saturday, April 29, 2006

    Twenty years after Chernobyl

    On Wednesday morning, it's been 20 years since the Chernobyl disaster... The communist regimes could not pretend that nothing had happened (although in the era before Gorbachev, they could have tried to do so) but they had attempted to downplay the impact of the meltdown. At least this is what we used to say for twenty years. You may want to look at how the BBC news about the Chernobyl tragedy looked 20 years ago.

    Ukraine remembered the event (see the pictures) and Yushchenko wants to attract tourists to Chernobyl. You may see a photo gallery here. Despite the legacy, Ukraine has plans to expand nuclear energy.

    Today I think that the communist authorities did more or less exactly what they should have done - for example try to avoid irrational panic. It seems that only 56 people were killed directly and 4,000 people indirectly. See here. On the other hand, about 300,000 people were evacuated which was a reasonable decision, too. And animals are perhaps the best witnesses for my statements: the exclusion zone - now an official national park - has become a haven for wildlife - as National Geographic also explains:


    Reappeared: Lynx, eagle owl, great white egret, nesting swans, and possibly a bear
    Introduced: European bison, Przewalski's horse
    Booming mammals: Badger, beaver, boar, deer, elk, fox, hare, otter, raccoon dog, wolf
    Booming birds: Aquatic warbler, azure tit, black grouse, black stork, crane, white-tailed eagle (the birds especially like the interior of the sarcophagus)

    ... Greenpeace in particular are very wrong whenever they say that the impact of technology on wildlife must always have a negative sign. ...

    In other words, the impact of that event has been exaggerated for many years. Moreover, it is much less likely that a similar tragedy would occur today. Nuclear power has so many advantages that I would argue that even if the probability of a Chernobyl-like disaster in the next 20 years were around 10%, it would still be worthwhile to use nuclear energy.

    Some children were born with some defects - but even such defects don't imply the end of everything. On the contrary. A girl from the Chernobyl area, born around 1989, was abandoned by her Soviet parents, was adopted by Americans, and she became the world champion in swimming. Her name? Hint: the Soviet president was Gorbachev and this story has something to do with the atomic nucleus. Yes, her name is Mikhaila Rutherford. ;-)

    http://motls.blogspot.com/2007/04/chernobyl-21-years-later.html

    Thursday, April 26, 2007

    Chernobyl: 21 years later

    Exactly 21 years ago, the Ukrainian power plant exploded. ...

    A new study has found that the long-term health impact of the Chernobyl disaster was negligible. All kinds of mortality rates were at most 1% higher than normally.

    ScienceDaily, full study.

    Everyday life is riskier.

    Yushchenko calls for a revival of the zone. His proposals include a nature preserve - which is more or less a fact now - as well as production of bio-fuels and a science center. The Korean boss of the U.N. calls for aid to the region.


    copy of a fast comment there:


    Environmental thinking is in perfect harmony with media hype.

    Chernobyl wasn't the first case. Hiroshima was. A Manhattan District PhD physicist (Dr Jacobson, from memory?), who didn't actually work at Los Alamos and, because of the compartmentalization of secrets, didn't know anything about nuclear weapons effects, issued a press release about fallout the day after Hiroshima was on the front pages.

    He wrote that the radioactivity would turn Hiroshima into a radioactive waste land for 75 years. Not 70 or 80 years, but 75 years, which is a bit weird bearing in mind the fact that radioactivity decays exponentially.

    Actually there was no significant fallout or neutron induced activity beyond a few hours at Hiroshima due to the burst altitude. Even in a surface burst, the radioactivity drops to within the natural background at ground zero after a few years, and there are people living at Bikini Atoll today, where a 15 megaton surface burst was tested in 1954.

    The effects of radiation are serious at high doses, but there is plenty of evidence that they are exaggerated for low doses of gamma and neutron radiation...

    copy of another fast comment there:

    The full report http://www.biomedcentral.com/1471-2458/7/49/ states: "The ICRP risk estimate assumes a dose and dose-rate effectiveness factor (DDREF) of 2.0 (reducing predicted risk by a factor of 2.0) for extrapolation of the data from the bomb survivors (who were exposed at extremely high dose rate) to lower dose and/or dose-rate exposures."

    This is a vital issue, because cancer occurs when the damage to DNA occurs so quickly that protein P53 can't repair it as single strand breaks. As soon as you get double breaks of DNA, there is the risk of the resulting bits of loose DNA being "repaired" the wrong way around in the strand by protein P53, and this can cause radiation induced cancer.

    So at low dose rates to weakly ionizing (low linear energy transfer, or low LET) radiation like gamma rays, radiation causes single breaks in DNA and protein P53 has time to repair them before further breaks occur.

    At high dose rates, the breaks occur so quickly that the P53 repair mechanism is overloaded with work, and repairs go wrong because DNA gets fairly fragmented (not just two loose ends to be reattached, but many bits) and P53 then accidentally puts some of the ends "back" in the wrong places, causing the risk of cancer.

    The assumed factor of 2 difference in risk between high and low dose rates is nonsense; the true factor is far bigger, as Dr Loutit explained in his ignored book "Irradiation of Mice and Men" in 1962. On page 61 he states:


    "... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last compared with 2 per cent incidence in irradiated controls."

    So, for a fixed dose of 1,000 R spread over a month (which is far less lethal in short term effects than that dose spread over a few seconds, as occurs with initial radiation in a nuclear explosion, or over a few days, when most of the fallout dose is delivered), the leukemia rate can vary from 5-40% as the dose rate varies from 1.3-81 r/hour.

    The cancer rate doesn't just double at high dose rates. It increases by a factor of 8 (i.e., 5% to 40%) as the dose rate rises from 1.3 to 81 r/hour.

    In fact, for comparing cancer risks at low level (near background) and at Hiroshima, the dose rates cover a wider range than this experiment did, so the correction factor for the effect of dose rate on risk will be bigger than 8.

    Background radiation studies are based on average exposure rates of just 0.02 mr/hour, i.e., 0.00002 r/hour, while at Hiroshima and similar instrumented nuclear tests, the initial nuclear radiation lasted a total of about 20 seconds before being terminated by the buoyant rise of the cloud.

    Hence for a dose of 1 r spread over 20 seconds at Hiroshima, the dose rate at which it was received was 180 r/hour. (Although according to Glasstone and Dolan's nuclear test data, half of the initial radiation dose would generally be received in about half a second, so the effective dose rate would be even higher than 180 r/hour.)

    Hence, the range of dose rates from background to Hiroshima is 0.00002 r/hour to 180 r/hour or more, a factor of 9,000,000 difference or more.

    Since in the animal experiments the leukemia rate increased by a factor of 8 due to a 62 fold increase in dose rate (i.e., as the dose rate increased from 1.3 to 81 r/hour), cancer risk is approximately proportional to the square root of the dose rate, so a 9,000,000 fold increase in dose rate should increase the cancer risk by 3,000 times.

    Hence, the cancer risks at Hiroshima and Nagasaki by this model exaggerate low level radiation effects by over 3,000 times, not merely by a factor of 2.
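    To make the extrapolation explicit, here is a minimal numerical sketch of the argument above. It is purely illustrative: a power-law fit through just the two points from Mole's mouse experiment, extrapolated far beyond the measured range.

```python
import math

# Mole's mouse data quoted above: 1,000 r total dose, leukemia
# incidence of 5% at 1.3 r/hour versus 40% at 81 r/hour.
low_rate, high_rate = 1.3, 81.0   # r/hour
low_risk, high_risk = 0.05, 0.40

# Fit a power law, risk proportional to (dose rate)^n:
n = math.log(high_risk / low_risk) / math.log(high_rate / low_rate)
print(round(n, 2))  # ~0.5, i.e. risk roughly proportional to sqrt(dose rate)

# Hiroshima initial radiation was delivered in about 20 seconds,
# so each 1 r corresponds to a rate of 3600/20 = 180 r/hour.
hiroshima_rate = 3600.0 / 20.0   # 180 r/hour
background_rate = 0.00002        # r/hour
ratio = hiroshima_rate / background_rate
print(round(ratio))              # 9000000-fold spread in dose rate

# Under the square-root model, the extrapolated risk factor is:
print(round(math.sqrt(ratio)))   # 3000
```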

     
    At 1:17 pm, Blogger nige said...

    Update 28 April 2007: the last comment above contains an error and the exaggeration of radiation effects at low dose rates is even greater as a result.

    The calculation should have subtracted the 2% leukemia incidence in the non-irradiated control group from both the 40% and 5% figures. Hence, the radiation-induced leukemia for 1,000 R received at rates of 1.3 to 81 r/hour ranged from 3% to 38%, not 5% to 40%. This means that a 62.3 fold increase in dose rate increased the leukemia rate due to the radiation by a factor of 38/3 = 12.7. Hence, the radiation-induced (not the total) leukemia incidence is proportional to (dose rate)^{0.615}, instead of (dose rate)^{1/2}.

    Using this corrected result for a 9 million fold difference between the dose rates of background (low dose rate) and Hiroshima (high dose rate) radiation, the radiation induced leukemia incidence for a similar total radiation dose will increase by a factor of 18,900, not 3,000.

    Hence, radiation-induced leukemia rates currently being extrapolated from Hiroshima and Nagasaki data down to low dose rate radiation will exaggerate by a factor of 18,900 or so, rather than the factor of 2 currently assumed by orthodoxy.
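    The corrected arithmetic can likewise be sketched numerically, with the same caveats as before: a two-point power-law fit extrapolated enormously beyond the measured range.

```python
import math

# Corrected figures: subtracting the 2% incidence in non-irradiated
# controls gives radiation-induced leukemia of 3% at 1.3 r/hour
# and 38% at 81 r/hour, for the same 1,000 R total dose.
low_rate, high_rate = 1.3, 81.0
low_risk, high_risk = 0.03, 0.38

rate_ratio = high_rate / low_rate   # ~62.3-fold
risk_ratio = high_risk / low_risk   # ~12.7-fold

# Power-law exponent, radiation-induced risk ~ (dose rate)^n:
n = math.log(risk_ratio) / math.log(rate_ratio)
print(round(n, 3))                  # ~0.614 (quoted above rounded to 0.615)

# Extrapolate over the ~9,000,000-fold dose-rate spread from
# background radiation to Hiroshima:
factor = 9_000_000 ** n
print(factor)  # close to the ~18,900 quoted above (exact value depends
               # on how the exponent is rounded)
```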

     
    At 5:19 pm, Blogger nige said...

    15 May 2007 update: the Radiation Effects Research Foundation has deleted the pages linked to in this post, including http://www.rerf.or.jp/eigo/faqs/faqse.htm and http://www.rerf.jp/top/qae.htm

    The new locations are http://www.rerf.or.jp/general/qa_e/qa2.html and http://www.rerf.or.jp/general/qa_e/qa7.html

    http://www.rerf.or.jp/general/qa_e/qa2.html contains an interesting table which shows the probability that cancer is caused by radiation instead of natural causes (non-exposed control group data) at various distances from ground zero in Hiroshima and Nagasaki.

    For the most highly irradiated 810 survivors within 1 km of ground zero, 22 died from leukemia between 1950 and 1990, all attributable to radiation exposure. In the same group of 810 persons, 128 died from other forms of cancer, but only 42% of those 128 deaths were attributable to radiation.

    Hence, even the most highly irradiated survivors who did die from cancer (apart from leukemia) were more likely (58% chance) to have died from naturally contracted cancer than from radiation induced cancer (42% risk).

    Only for leukemia did the probability that a cancer death was due to radiation, rather than natural cancer risks, reach 100%. As explained in previous posts, this is because leukemia is both rare and more strongly correlated with radiation exposure than other cancers are.

    It is a pity that the Radiation Effects Research Foundation still has not added the mean shielded biologically equivalent doses (in centi-sieverts, which are identical to the old unit the rem) for each group of survivors listed using the DS02 dosimetry system.

    http://www.rerf.or.jp/general/qa_e/qa7.html states:

    "Question 7: What health effects have been seen among the children born to atomic-bomb survivors?

    "Answer 7: This was one of the earliest concerns in the aftermath of the bombings. Efforts to detect genetic effects were begun in the late 1940s and continue to this day. Thus far, no evidence of genetic effects has been found. ..."

     
    At 5:24 pm, Blogger nige said...

    Comment about Pugwash and the anti-nuclear hysteria propaganda

    I recently came across a free PDF book on the internet, authored by John Avery of the Danish Pugwash Group and Danish Peace Academy, called Space-Age Science and Stone-Age Politics.

    It is a similar kind of book to those published widely in the 1980s, full of pseudophysics like claims that nuclear weapons can somehow destroy life on earth, when a 200 teraton (equal to 200*10^12 tons, i.e., 200 million-million tons or 200 million megatons) explosion from the KT event 65 million years ago failed to kill off all life on earth!

    (This figure comes from David W. Hughes, "The approximate ratios between the diameters of terrestrial impact craters and the causative incident asteroids", Monthly Notices of the Royal Astronomical Society, Vol. 338, Issue 4, pp. 999-1003, February 2003. The KT boundary impact energy was 200,000,000 megatons of TNT equivalent for the 200 km diameter of the Chicxulub crater at Yucatan, which marks the KT impact site. Hughes shows that the impact energy (in ergs) is: E = (9.1*10^24)*(D^2.59), where D is the impact crater's diameter in km. To convert from ergs to teratons of TNT equivalent, divide the result by the conversion factor of 4.2*10^28 ergs/teraton.)
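    Hughes's scaling law is easy to check directly; this minimal sketch just plugs the quoted 200 km crater diameter into the formula given above.

```python
# Hughes (2003): impact energy in ergs as a function of crater
# diameter D in km, E = 9.1e24 * D**2.59.
def impact_energy_ergs(diameter_km: float) -> float:
    return 9.1e24 * diameter_km ** 2.59

ERGS_PER_TERATON = 4.2e28  # conversion factor quoted above

# Chicxulub crater, ~200 km diameter (the KT boundary impact):
teratons = impact_energy_ergs(200.0) / ERGS_PER_TERATON
print(round(teratons))  # 197, i.e. roughly 200 teratons (200 million megatons)
```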

    Compare the power of all the world's nuclear weapons to that impact, and you see it's a complete fantasy. A megaton low altitude burst nuclear weapon will cause rapidly decreasing casualty rates in houses and rapidly decreasing destruction as the distance increases from 2 to 3 miles. In U.K.-type brick houses (with standard 9 inch thick brick walls), the mortality risk from all blast effects (deaths are mainly due to debris impacts and the related collapse of the house) for people lying down on the floor falls from about 50% at 2 miles to only about 5% at 3 miles from a 1 Mt surface burst. (Depending on the weapon design and the shielding geometry of the house and neighbouring homes, particularly the locations of windows, initial nuclear radiation and/or thermal radiation may also cause some injury in some circumstances at these distances. If the location is downwind of the explosion, a quickly decaying fallout hazard may also exist in the event of true surface bursts on land (but not air bursts), depending on the wind shear, the fission yield fraction of the weapon, and whether the person stays indoors in a central area for the first few days or not.)

    In addition, a 1 Mt low altitude bomb explosion would break virtually 100% of windows for a 10 mile radius (with a lower incidence of breakage extending much further, and isolated breaks occurring due to atmospheric "focussing" of the shock wave periodically in focus-point zones hundreds of miles downwind), and cause immense panic and massive numbers of casualties for any people outdoors, particularly any who have a clear line-of-sight to the fireball in the few seconds before the heat flash stops (the blast wave generally takes longer to arrive than the heat flash lasts, so even if a shield is destroyed by the blast, it protects against the heat flash; the shadow cast by a single leaf is enough to prevent serious thermal burns over the shadow area, as proved by photographs taken by the U.S. Army at Hiroshima and Nagasaki: see previous posts on this blog, for example). Skin burns from thermal flash and from beta rays due to fallout contamination are two of the biggest immediate concerns.

    However, many nuclear weapons have yields lower than 1 Mt. Current (January 2007) mean yield of the 4,552 nuclear warheads and bombs in the deployed U.S. nuclear stockpile: 0.257 Mt. Total yield: 1,172 Mt.

    This is completely trivial compared to the 200,000,000 Mt KT-impact explosion 65 million years ago which did not kill off life on Earth.

    Let's now get back to John Avery's Space-Age Science and Stone-Age Politics, the Preface of which claims inaccurately:

    "... science has given humans the power to obliterate their civilization with nuclear weapons..."

    Avery gives no evidence for this claim; it's a typical example of groupthink propaganda.

    The idea is as follows:

    (1) claim that your pet hate, such as nuclear weapons or global warming or aliens, is a real threat to civilization,

    (2) ignore all evidence to the contrary, and blame politicians for stone age ideas which are causing or risking a disaster.

    In the 1930s, those who said Hitler was a threat and should be deterred by air power were dismissed by idealists and Nazi fellow-travellers as "war-mongers".

    They either didn't understand, or pretended that they didn't understand, that they were the war-mongers.

    If you want peace under freedom, human nature being as it is, you need to be strong enough to protect yourself.

    Weaken yourself by disarmament and you make yourself attractive to thugs by standing out as a good potential mugging victim.

    Somehow all those pacifists escaped the real world of childhood, where you find out that defenselessness makes you victim to all passing thugs who want someone to pick on, mug, rob, and fight.

    If you carry a big stick, you live a happier life if you want peace and freedom, than you live if you don't carry a big stick.

    Stone age politics is sensible, because the problems of life remain the same now as then: politics is not leading human nature into war. Human nature leads politicians into war.

    Contrary to Avery's mindset, you can't change the world by imposing a new political idealism on man.

    That's called dictatorship, and all versions so far have been evil failures: fascist Nazism as well as Communist dictatorship.

    It makes a powerful elite few into dictators who become corrupt, people have to be coerced or bribed into maintaining the status quo under the dictatorship, the whole thing is unstable and based on hatreds and violence and terror.

    The error is trying to impose a political "solution" on human nature. Human nature is determined by genes, not by ideals written in books by philosophers.

    We see this in the fact that even when identical twins are brought up separately, under different conditions in different places, they remain extremely similar in their interests, outlooks and intelligence. Genetics, sadly perhaps, exerts a massive influence on life.

    The idea that problems like war - which are caused by deep-rooted instincts, hatreds, feelings of injustice, tribal rivalries, and prejudices - can be wiped out by philosophical belief systems like pacifist ideology, is utopian, not realistic.

    Avery does reflect some of these issues on page 8 of his Preface, where he quotes Arthur Koestler's remark:

    "We can control the movements of a space-craft orbiting about a distant planet, but we cannot control the situation in Northern Ireland."

    This is very deep. The pacifists keep kidding themselves that if only they could explain to terrorists or dictators how wrong they are, the world would be put right. Wrong. The terrorists and dictators are paranoid, deluded bigots who simply don't want to hear the facts. The only thing they will listen to or respect is force, not talk. However, sometimes - as in the case in Northern Ireland - after several generations a compromise and settlement can be reached to halt the ongoing violence, at least for a while. Unless the core problems are resolved, however, violence might flare up again when tensions and pressures are increased at some time in the future.

    Don't rely on politicians and talk. Those who cherish peace agreements should remember that unless there is really genuine goodwill behind each signatory, the agreement isn't worth the paper it is written on. It's actually negative equity, if it brings one side a false sense of security, as was the case with Neville Chamberlain when he got Adolf Hitler's autograph on a peace pact at Munich in September 1938.

    In other words, making peace with a homicidal maniac and then waving the peace agreement around and saying "Peace in our time" is very dangerous. If Chamberlain had responded better (more aggressively) to Hitler, millions of lives might have been saved. Unfortunately he and his predecessors had a fixation with an inaccurate interpretation of World War I. They thought that "peace at any cost" was worthwhile. They were wrong in forgetting the advice of the Roman, Publius Flavius Vegetius Renatus, writing in his book Epitoma rei militaris (c. 390 AD): "Si vis pacem, para bellum." (If you wish for peace, prepare for war.)

    Moving on to Chapter 1 of Avery's book, The world as it is, and the world as it could be, things get more interesting. The chapter begins with a list of interesting facts. Some are a bit out of date. As of 1997, the annual U.S. military budget is not nearly a thousand billion, but merely $439 billion (only $23 billion is spent on nuclear warheads).

    Avery claims in his list on page 12 that:

    "In the world as it is, the nuclear weapons now stockpiled are sufficient to kill everyone on earth several times over."

    This is total nonsense, as already explained and shown by the previous posts on this blog: nuclear weapons are not that powerful, in part because the blast and other effects distances don't scale up in proportion to the energy released. Hence, increase the energy release by a million times, and the blast pressure damage distances increase by only the cube-root of a million, i.e., just 100 times.
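    The cube-root scaling claim is simple arithmetic; a minimal sketch:

```python
# Cube-root blast scaling: for a given peak overpressure, the damage
# distance scales as the cube root of the energy released,
# d2/d1 = (E2/E1)**(1/3).
def distance_scale_factor(energy_ratio: float) -> float:
    return energy_ratio ** (1.0 / 3.0)

# A million-fold increase in yield increases damage radii only 100-fold:
print(round(distance_scale_factor(1_000_000)))  # 100
```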

    Nuclear weapons actually "work" more by knocking over charcoal cooking braziers at breakfast time (as in the case of the firestorm at Hiroshima) and by the combination of nuclear radiation with thermal flash burns (the nuclear radiation lowers the white blood cell count for a few months after exposure, preventing infections of the burned skin from being healed naturally, so the person dies). Nuclear weapons are very dangerous and powerful, but they are certainly not the superbombs painted by media hysteria and political propaganda in the Cold War. That propaganda went unchallenged by most scientists largely because it helped stabilize the situation and deter war.

    However, it is now dangerous to over-hype the effects of nuclear weapons in this manner, because it detracts from the effectiveness of simple civil defence "duck and cover" and evacuation countermeasures. If you do get a terrorist nuclear attack, the casualty toll is basically dependent on whether people watch the flash and fireball through windows (getting burned by thermal radiation in the process, and then get killed or seriously injured when the blast wave arrives some seconds later, fragmenting the windows), or whether they "duck and cover".

    Avery then goes on to Africa and its problems: the need for safe and adequate drinking water supplies and medical help.

    Avery suggests funding these justifiable schemes by taxing international currency transactions. Sounds good if it works, i.e., if it doesn't put international trade out of business. If you tax too much, this will happen, because imports and exports will become even more highly taxed than they are now.

    Page 16 contains the stupid claim:

    "In the world as it could be, a universal convention against terrorism and hijacking would give terrorists no place to hide."

    So a piece of paper will prevent terrorists hiding in mountains, in vast sparsely populated areas? This is an example of dangerous nonsense. It's dangerous because it creates a false sense of security, resting on a piece of paper.

    In the real world, there are lots of bits of paper in each country with laws written on them. That doesn't abolish crime. Reason? The laws are just scribbles on pieces of paper. Human nature is such that criminals don't pay attention to laws, and even with the deterrents of fines and prison sentences, nobody has ever invented a way to make all people conform to the law. When you apply this fact to terrorists, the problem is immense because the punishments available cannot be scaled up in proportion to the potential crimes and acts of terrorism. The relative risks for terrorists don't increase in proportion to the threat that the terrorists present to civilization.

    Avery also goes on to claim (on page 17) that war could be abolished just as slavery was abolished.

    There is a deep-rooted problem with this claim: slavery in America was abolished not by a declaration but by fighting; the American Civil War was essentially a war against slavery.

    War and slavery have very little to do with one another. Often, people go to war to preserve their independence, i.e., to fight for freedom, or to fight against the prospect of becoming a slave under some political ideology (fascism, communism, etc.).

    It follows that if you want to ensure continued freedom from slavery, you need the ability to fight against those who would make a slave of you. Therefore, you need to be able to go to war to prevent slavery.

    If you abolish the possibility of fighting against thugs, you will risk becoming a slave. So Avery's idea that the abolition of slavery in the American Civil War and other fights should now be followed by an abolition of the possibility of war, i.e., fighting against the prospect of being enslaved by thugs, is a contradiction. It is manifestly gullible and dangerous.

    Moving on to Chapter 2 of Avery's book, Tribalism: this contains a discussion of how bees returning to their hives communicate to other bees how far away, and in what direction, good pollen sources can be found. This was discovered in 1945 by Karl von Frisch. The returning bee flies around in a kind of circle with occasional short cuts across the diameter of the circle. The direction of the short cut across the diameter marks the direction to the pollen source, and the number of waggles the bee's abdomen makes as it crosses the diameter indicates the distance to the pollen source:

    "Studies of the accuracy with which her hive-mates follow these instructions show that the waggle dance is able to convey approximately 7 bits of information - 3 bits concerning distance and 4 bits concerning direction."

    Interesting trivia.

    Moving on to Avery's Chapter 3, Nationalism, a false religion, things get back on topic again. Avery argues that the nation state is a kind of tribe, and wars between nation states are basically tribal wars: "... a totally sovereign nation-state has become a dangerous anachronism."

    Unless you precisely define a "totally sovereign nation-state", this is unclear. Most nation-states have some interdependence with others, such as being part of a union (e.g., the European Union or the United States of America) or federation (e.g., the Russian Federation).

    This doesn't prevent them being involved in wars, or starting wars to protect themselves if their vital interests are threatened.

    There is actually a danger in federation and union which I must explain:

    A few hundred years ago, there was an era called the "Age of Discovery" in which large unions, nations and empires, sent out armies to seize the assets of small, happy, free and independent tribes. This theft was "justified" by denying the victims any right of free expression, and indeed herding them up and selling them as slaves.

    The Spanish actually destroyed an ancient South American civilization in this way, while other Europeans colonised massive areas of Africa and Asia, forcing the people into slavery or brainwashing them with various religions.

    Therefore, even successful unions, federations and other groups may pose a threat to civilizations that differ from themselves. There is an enormous amount of sheer arrogance in the replies people give to this. They claim that errors that occurred before cannot happen again because people learn from their mistakes. Wrong. Errors that have occurred in the past actually keep on occurring:

    Even within any union or federation, you will find groups of people being exploited by the union or federation as a whole. They will be forced to pay taxes for services they do not use, and so on. They do not have any say because they are a minority and the particular form of so-called "democracy" (which is a sheer travesty of the term "democracy" as used in Ancient Greece, where every citizen had a daily vote on the day's policies) in use is basically a dictatorship with a choice between two rich old men, once every five years.

    If we go back in time a bit to the age before really effective military forces existed in England, you come to a time when England was free to all who came with enough swords. This is precisely the reason why first the Romans and later the Normans invaded England successfully.

    If we give up our armaments, we will be in a similar position to that we were in when we were conquered by the Romans and the Normans.

    Alliances are fickle. In World War I, from 1914 to 1918, the allies (Britain, France, Russia, Italy, Serbia, and Belgium) fought against Germany, Austria-Hungary, and Turkey. In 1915 Japan joined the allies and Bulgaria joined the enemy. In 1916 and 1917, Romania and America, respectively, joined the allies.

    In World War II, 1939-45, Italy and Japan, which had both been allies in World War I, became enemies, switching sides. Allegiances can shift, treaties can be broken. In World War II, America was surprised by the sneaky Japanese attack on Pearl Harbor, while Russia was surprised by Hitler's treachery. Russia had a pact with Hitler which guaranteed peace. It wasn't worth a cent.

    Avery correctly pins a share of the blame for World War II on the French, on page 62:

    "In 1921, the Reparations Commission fixed the amount that Germany would have to pay [mainly to France, in compensation for the costs of World War I] at 135,000,000,000 gold marks. Various western economists realized that this amount was far more than Germany would be able to pay; and in fact, French efforts to collect it proved futile. Therefore France sent army units to occupy industrial areas of the Ruhr in order to extract payment in kind. The German workers responded by sitting down at their jobs. Their salaries were paid by the Weimar government, which printed more and more paper money. The printing presses ran day and night, flooding Germany with worthless currency. By 1923, inflation had reached such ruinous proportions that baskets full of money were required to buy a loaf of bread. At one point, four trillion paper marks were equal to one dollar. This catastrophic inflation reduced the German middle class to poverty and destroyed its faith in the orderly working of society.

    "The Nazi Party had only seven members when Adolf Hitler joined it in 1919. By 1923, because of the desperation caused by economic chaos, it had grown to 70,000 members."

    Avery's Chapter 4 is called Religion: Part of the problem? - or the answer?. This is particularly interesting (page 67):

    "Early religions tended to be centred on particular tribes, and the ethics associated with them were usually tribal in nature. ... In the 6th century B.C., Prince Gautama Buddha founded a new religion in India, with a universal (non-tribal) code of ethics. Among the sayings of the Buddha are the following: Hatred does not cease by hatred at any time; hatred ceases by love. Let a man overcome anger by love; let him overcome evil by good. All [weak] men tremble at punishment. All [over-indulged] men love life. Remember that you are like them, and do not cause slaughter.

    "One of the early converts to Buddhism was the emperor Ashoka Maurya, who reigned in India between 273 B.C. and 232 B.C. After his conversion, he resolved never again to use war as an instrument of policy. He became one of the most humane rulers in history, and he also did much to promote the spread of Buddhism throughout Asia.

    "In Christianity, which is built on the foundations of Judaism, the concept of universal human brotherhood replaces narrow loyalty to the tribe. [This simplification of Avery's won't go down well with followers of Judaism, and ignores the crimes, from the Inquisition to Nazi Christianity, done in the name of Christianity over the centuries.] The universality of Christian ethical principles, which we see especially in the Parable of the Good Samaritan, make them especially relevant to our own times. Today, in a world of thermonuclear weapons, the continued existence of civilization depends on whether or not we are able to look on all of humanity as a single family."

    This again is wrong: thermonuclear weapons don't threaten our existence. They are there because people threaten our freedom.

    If they do get used in war again, that would be terrible, but how terrible it is depends on what people can do to protect themselves. It's a quantitative thing, not a qualitative thing.

    All disasters are terrible. They are more terrible if you give up in advance and don't have any civil defence advice in place with the evidence to support - to the general public hearing the advice - the fact that "duck and cover" and decontamination and other countermeasures do actually work and are feasible and have been well tested against a range of different types of nuclear explosion in carefully instrumented, scientific trials.

    On page 68, Avery states:

    "In the Christian Gospel According to Matthew, the following passage occurs: You have heard it said: Thou shalt love thy neighbor and hate thy enemy. But I say unto you: Love your enemies, bless them that curse you, do good to them that hate you, and pray for them that spitefully use you and persecute you. ...

    "The seemingly impractical advice given to us by both Jesus and Buddha - that we should love our enemies and return good for evil - is in fact of the greatest practicality, since acts of unilateral kindness and generosity can stop escalatory cycles of revenge and counter-revenge such as those which characterize the present conflict in the Middle East and the recent troubles in Northern Ireland. Amazingly, Christian nations, while claiming to adhere to the ethic of love and forgiveness, have adopted a policy of 'massive retaliation', involving systems of thermonuclear missiles whose purpose is to destroy as much as possible of the country at which retaliation is aimed. It is planned that entire populations shall be killed in a 'massive retaliation', innocent children along with guilty politicians."

    Avery here neglects a very important question:

    "Would you love Adolf Hitler as your neighbour and forgive him while he is in the middle of exterminating millions in gas chambers, and allow him to continue?"

    Jesus's advice to love thy neighbour doesn't, or shouldn't, apply to Mr Hitler. So we immediately find a massive hole in Christian ethics and philosophy. Avery merely ignores the existence of this hole, which in the real world (if he were a politician) would mean he would be liable to fall straight down the hole, dragging all those who followed him down there too.

    No. You shouldn't love thy neighbour if that neighbour is potentially a mass murderer who will use your good will to help accomplish evil goals. That's a massive problem that totally destroys the whole thesis of Avery.

    Avery goes on, still on page 68:

    "The startling contradiction between what Christian nations profess and what they do was obvious even before the advent of nuclear weapons..."

    Hold hard. Nuclear weapons ended World War II, because they forced Russia to declare war on Japan in order to get some of the advantages of being a victor. This made Japan's leaders realise that Japan had lost the war. The numbers of people killed in Hiroshima and Nagasaki were trivial compared to the numbers killed by regular incendiary air raids on Japan, which included the firestorm raid on Tokyo in March 1945 that was far more destructive than a nuclear bomb.

    The purpose of our having nuclear weapons is to deter war and prevent a war. 'Massive retaliation' is an old and largely outdated deterrent concept, and there are more modern strategies such as counterforce (hitting military targets with weapons of yields and burst types such as to minimise any possible collateral damage to civilian homes).

    However, the point is that World Wars are less likely when the potential losses to all parties are massive. This is the main reason why nuclear weapons have prevented regional Cold War conflicts from escalating to all-out World War.

    On page 160, Avery produces a graph (Figure 8.1) which shows the increase in infant deaths due to the sanctions imposed on Iraq in 1990, under U.N. Security Council Resolution 678 which authorized the use of 'all necessary means' to force Iraq to withdraw from Kuwait. The mortality rate of children under five years of age in Iraq doubled within a year.

    This highlights the perils of economic sanctions: they don't hurt the dictators, they kill innocent kids. They may sound "peaceful" but they still kill. In the same way, according to some muddled pacifist sentiments, only war using particular kinds of "violent weapons" is a bad thing. According to that bad philosophy, the use of gas chambers to massacre people is "peaceful" because there are no "horrible bombs or bullets" involved. Actually, it is just as bad to kill people regardless of the method used. Cold-blooded slaughter with gas, refusal to allow medical treatment, or starvation is - if anything - even more sinister than the use of violence in anger.

    Avery's Chapter 10 is World government. The flaw with this idea is simple to see: laws get broken. The idea that a world government based on laws will be a success is refuted amply by a look at what happens in any country when laws are made: criminals break them regardless of law enforcers. At present, the stability of the world is ensured by military deterrence. Remove that mechanism, and you are playing with fire. Every time people have tried to impose a philosophical solution like Marxism or Fascism, it has failed. Power corrupts, absolute power corrupts absolutely. The idea of a world government is that of absolute power, and absolute corruption.

    The Roman Empire was the world government of its time. It was maintained by ruthless suppression of dissent, and it was continually at war, often civil war.

    A world government would not abolish war, it would relabel all future wars as "civil wars". Merely adding the word "civil" to war is the kind of worse-than-useless political solution to a problem you can expect from moronic zombies.

    On page 214, Avery quotes a 1954 suggestion by Edith Wynner for world government (which sounds as if it is a line borrowed from the 1951 film The Day the Earth Stood Still):

    "A policeman seeing a fight between two men, does not attempt to determine which of them is in the right and then help him beat up the one he considers wrong. His function is to restrain violence by both, to bring them before a judge who has authority to determine the rights of the dispute, and to see that the court's decision is carried out."

    This is all false. First, a person who is being wrongfully attacked wants the attack to stop, not to beat up the other person.

    The suggestion in the quotation that people always want revenge is prejudiced and wrong.

    Second, the policeman does have a duty to collect relevant evidence and to do that efficiently he or she needs to take statements from any witnesses, and ascertain that any evidence (weapons with fingerprints, etc.) will be available for use in a prosecution. The policeman decides on the basis of this preliminary investigation who he or she should arrest.

    If the policeman arrests an innocent person to bring them before a judge, that is wrongful arrest. Arrests must be based on evidence or at least strong suspicion with some reasoning behind it.

    The whole idea that in any war both sides are equally at fault is nonsense: and an insult to those murdered by Hitler's thugs.

    In particular, the idea of an international police force to catch and punish criminals fighting terrorist wars is just nonsense because it can't deal with suicide bombers. You can jump up and down on the grave of the suicide bomber, but that will not deter other suicide bombers.

    The pacifist case for world government is just shallow and insulting. It is likely to cause more violent wars (which will be called, ironically, "civil wars") than before, simply because vast numbers of people will probably resent the system. It will permit corruption and "might is right" majority rule and barbarity on a scale not seen since the Roman Empire. It will not be capable of stopping 9/11 type suicide bombers.

    World government would reduce individualism by removing part of each person's sense of personal identity to a group, and would thereby increase the risk of subversive warfare and insurrection against the massive nanny-state quango of dictatorial majority-controlled officialdom that constitutes the travesty of democracy masquerading as a "world government".

     
    At 12:14 pm, Blogger nige said...

    copy of a comment made on John Horgan's blog:

    http://www.stevens.edu/csw/cgi-bin/blogs/csw/?p=50

    "... the fact is that, as near as we can tell from the fossil record, humans have not killed other humans as a matter of course for the greatest part of Homo Sapiens’ time on earth. The beginnings of our species are figured to be about 195,000 years ago, the date of the earliest anatomically modern skeletons, but there is no indication of anything like murder until about 20,000 years ago. Doesn’t seem to be in our blood, but in our circumstances. ...

    "But it is important to see that this kind of interpersonal violence and murder comes rather late in Sapiens development. In fact, for 90 per cent of our time on earth there is nothing to indicate that humans ever reached the extreme state of knocking each other off. It was a reaction to an extreme crisis, it got to be a familiar response to the increased tensions in a time of scarcity and competition, and once established it seems to have continued on.

    "But not because it was in our human nature. Rather it was in the conditions of our life. And therefore the obvious lesson is that we can’t just shrug and say some people are just “born killers,” or “it’s in the blood.” It’s not, and was not for 175,000 years." - Kirkpatrick Sale

    The killing started with people hunting animals for food. People were primarily tribal hunters for the 175,000 years before they became farmers around the time of the last ice age, around 20,000 years ago. Hunting is a violent activity, so when hunting ended, those people used to regular bouts of violence would be more likely to fight among themselves instead. You see this in primitive tribes even today: they have two important activities, both full of ceremony and skill: hunting and warfare. The hunting provides food. The warfare maintains order between rival tribes, driving away the hunting competition and the danger of invasion of their villages and theft of their wives by other tribes.

    According to Wikipedia: "Neanderthals became extinct in Europe approximately 24,000 years ago". Maybe they were driven away or killed off in warfare? Even if this was the case, it's not automatically the survivors who are to blame.

    The pacifist approach begins with the false assumption that fighting and violence are totally immoral and inexcusable under all circumstances. Yet the cold-blooded massacres of history, which pacifists don't seem to worry about, are the really big problem: concentration camps in which malnutrition and disease, slavery and neglect, or cold-blooded gassing did the killing. Anne Frank died from typhus, and millions died from murder or deliberate neglect in Axis civilian concentration camps.

    Saddam used nerve gas to murder thousands of Kurds in 1988, and he ordered the torture and murder of thousands of others. It doesn't make that much difference whether he used a bullet to "violently" murder someone, or simply let them die from thirst more "peacefully" in a cell. It's still murder.

    I think that this problem is deliberately neglected by pacifism, and it's the fatal flaw in pacifism. There was an infamous Oxford Union debate in 1933 on whether to "fight for King and Country". A pacifist philosopher, the immoral Professor C.E.M. Joad (later convicted of travelling without a valid railway ticket), was asked what he would do "if his wife was being raped by enemy soldiers". He dismissed the question with a comic reply that he would simply join in and have an orgy, which made most people there laugh, and he won the vote.

    The public viewed the plight of people in concentration camps as a joke at that time, circa 1933, in comparison to the violent horrors of having a major war.

    But the correct question to pin on the pacifist is what you do if the enemy is torturing people held without charge in concentration camps, as Hitler and Saddam did. Economic sanctions are worse than useless: the death rate of children under 5 years of age doubled within a year due to the sanctions imposed on Iraq in 1990, under U.N. Security Council Resolution 678, which authorized the use of 'all necessary means' to force Iraq to withdraw from Kuwait. You can't hurt the dictator by applying economic sanctions: innocent people suffer. The only real option is to go to war. It's simply not a case of "two wrongs don't make a right". You have to estimate how many more people you can save by going to war than will be killed if you don't fight. Whether war is right or not depends on whether there is a net saving of lives, i.e. whether the number of lives saved exceeds the number killed in the conflict.

    You can only call a war illegitimate or "murder" if the amount of anticipated suffering as a result of the conflict exceeds the amount of suffering which is likely if the war doesn't occur.

     
    At 12:46 am, Anonymous Susan said...

    Hi Nige

    Do you have any comments about the Japanese earthquake and the nuclear reactor yet?

    SM

     
    At 1:45 pm, Blogger nige said...

    Hi Susan,

    There is not much to say about the incident, really, which was hardly Chernobyl.

    It's interesting however that Japan has managed to embrace nuclear reactor technology despite the anti-nuclear sentiment in the aftermath of Hiroshima, Nagasaki, and the 1954 contamination of 23 Japanese fishermen on the "Lucky Dragon".

    What is interesting about the media is not the science (the newspaper editor doesn't know the difference between a pBq and a GBq, it's all the same), but the politics.

    Here in the UK, there is no antinuclear concern about the risks of having about 0.27 microcurie (9.9 kBq) of Am-241 (very similar to plutonium in its health effects) in smoke detectors in every house to save lives in fires.

    Antinuclear people don't put on a massive front-page propaganda attack saying that 9.9 kBq of Am-241 emits 9900 alpha particles per second, and since a single alpha particle can in principle set off a lung cancer, it follows that over a two week period a smoke detector emits enough alpha particles to totally wipe out humanity, at least in principle.

    It's fairly obvious that this scare-mongering won't get into the newspapers, although on a scientific footing it is quantitatively similar to much of the anti-nuclear protestors' propaganda.

    Nobody will listen to propaganda unless it reinforces their prejudices.

    If you point out that a single smoke detector, if incinerated in a fire, could - according to the exceptionally fiddled antinuclear lobby calculations - exterminate humanity, nobody listens.

    If the media publish the same fiddled calculation about a leak of radioactivity from a nuclear reactor, it gets a very different treatment from those reading it, starting off a panic wave.

    If you take a rock, in principle (according to misleading calculations) that could be used by a terrorist to kill everyone, simply by hitting people over the head. In practice, of course that is not going to happen. Similarly, the Am-241 contaminated smoke from a single burned smoke detector isn't going to end up in people's lungs, with one alpha particle setting off a cancer in each person in the world.

    If you want to play the numbers game, you can point out that Am-241 has a half-life of 432 years, so its effective life is statistically 432/(ln 2) = 623 years. (Am-241 emits the same total number of alpha particles over its complete decay as it would if the present emission rate were sustained for the statistically effective lifespan of 623 years.)

    So 9.9 kBq (about 0.27 microcurie) of Am-241 in a smoke detector emits a total of about 2x10^14 alpha particles in its lifespan.

    Since the world's population is 6,700,000,000 = 6.7x10^9, it is clear that if only 1 in 30,000 of the alpha particles emitted by a single smoke detector starts a lung cancer, the number of people killed will be equal to the number of people on the planet!
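    This arithmetic is easily checked with a short script. All the figures are the ones used above; the "effective life" of 432/(ln 2) years is just the time integral of the exponential decay curve.

```python
import math

# Am-241 smoke detector "numbers game" check (figures from the text above).
activity_bq = 9.9e3          # 9.9 kBq source = 9,900 alpha particles per second
half_life_years = 432.0      # Am-241 half-life
seconds_per_year = 3.156e7

# Effective life: integral of exp(-t*ln2/T_half) dt from 0 to infinity = T_half/ln 2.
effective_life_years = half_life_years / math.log(2)   # about 623 years

# Total alpha particles emitted over the source's complete decay.
total_alphas = activity_bq * effective_life_years * seconds_per_year

# If about 1 in 30,000 of these alphas "caused" a lung cancer,
# the death toll would equal the world's population.
world_population = 6.7e9
alphas_per_person = total_alphas / world_population

print(f"effective life: {effective_life_years:.0f} years")
print(f"total alphas emitted: {total_alphas:.1e}")
print(f"alphas per head of world population: {alphas_per_person:.0f}")
```

    Running it reproduces the 623-year effective life, the roughly 2 x 10^14 total alpha particles, and the 1-in-30,000 ratio quoted above.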

    So it's very easy to come up with scare-mongering statements about radiation, simply because the numbers are so big.

    The "problem" for scare-mongers is that the actual risks are diluted by immense factors. Not only is it extremely unlikely that smoke containing alpha emitters will get into many people's lungs, but even when it does, it is usually removed quickly like ordinary dust, and in any case the probability of a single alpha particle causing a lung cancer is extremely low.

    So because the quantitative errors involved in naive scare-mongering antinuclear propaganda are so extreme, the qualitative nature of the risk changes totally:

    it's a trivial risk compared to the hazards of inhaling natural radioactive radon gas that comes from the soil and seeps into houses.

    Traditionally, the pro-nuclear lobby has made an awful mess of this, and has never made people clearly aware of the nature and intensity of natural background radiation from space and from soil, water, food, and the air.

    If they did calculate and measure background radiation exposures carefully, they could express radiation levels in terms of the natural average sea level exposure, so people would be aware of radiation in a more useful, more quantitative sort of way.

    However, they don't do this. Edward Teller made a complete mess of it in the 1950s by comparing radiation from the nuclear industry to cigarette smoking, instead of comparing it quantitatively to natural background radiation.

    In addition, it is vital to present the facts of how background radiation levels vary in different locations.

    The best thing the nuclear industry could do is to publish a global map (like a layered Google map) on the internet with reliable data on radiation levels around the world, showing how cosmic radiation varies as a function of terrain altitude and proximity to the poles (where the earth's magnetic field lines are nearly vertical and so can't shield charged cosmic rays), as well as the effects of different types of soils which contain different amounts of radioactive minerals.

    They should also indicate the natural alpha, beta and gamma radiation in food, water, air and soil in different places.

    It would be useful knowledge that anyone could grasp if colour coding was used.

     
    At 2:46 pm, Blogger nige said...

    SM has kindly emailed me the following extract about firestorm exaggerations by Dr W. E. Strope who worked at the U.S. Naval Radiological Defense Laboratory at nuclear tests from Crossroads (1946) at Bikini Atoll, onward.

    Link: http://www.strategicdefense.org/Commentary/Worldonfire.htm



    Whole World on Fire—And All Wet

    Walmer (Jerry) Strope

    I have just finished reading a strange book, Whole World on Fire, by Lynn Eden, published by Cornell University Press a month ago. Ms. Eden is an historian at Stanford University. Her thesis is that Air Force targeteers perversely continued to use blast damage as the basis for targeting even though fire damage "would extend two to five times farther than blast damage" because of institutional biases stemming from the emphasis on precision bombing in World War II. That is, "organizations draw on past practices and ideas even as they innovate to solve new problems."

    To Ms. Eden, the question of prioritizing nuclear weapon effects is just a convenient example of this institutional characteristic. She does not purport herself to be an expert on the physics of mass fires. This helps explain part of the strangeness I find in the book; namely, Why now? After all, the Cold War is over and targeteers are not fine-tuning the SIOP. It seems she has spent 15 years reviewing the literature on nuclear fires, interviewing the knowledgeable people and writing the book. It just happened to come out now.

    In Chapter 1, Ms. Eden introduces her readers to the problem by postulating the detonation of a 300-kt bomb 1,500 feet above the Pentagon. It is here that I encounter more of the strangeness. It seems that Ms. Eden is under some pressure to convince her readers that the Air Force had plainly ignored the obvious. Therefore, she tends to present the most extreme positions on mass fire issues, as well as some of the "tricks of the trade." One trick: close in, we are told "the fireball would melt asphalt in the streets." But when the description gets to the Capitol building some three miles away, there is no comparable sentence. The previous image is permitted to carry over.

    Next, we are told, "Even though the Capitol is well constructed to resist fire, and stands in an open space at a distance from other buildings, light from the fireball shining through the building’s windows would ignite papers, curtains, light fabrics, and some furniture coverings. Large sections of the building would probably suffer heavy fire damage. The House and Senate office buildings would suffer even greater damage. The interiors of these buildings would probably burn."

    Hold on! Wait a minute! The Capitol building is completely protected by sprinklers. So are the House and Senate office buildings, the Library of Congress, the Supreme Court building, and the massive buildings lining the Mall and in the Federal Triangle. These buildings may become sopping wet but they probably will not burn. The monuments also will not burn.

    Why don’t mass fire calculators take sprinkler systems, venetian and vertical blinds, and other fire protection measures into account? Is the situation in the Nation’s Capital unusual? Not anymore. For decades, the lowly fire protection engineer and his employer, the fire insurance industry, have been gnawing away at the fire problem. According to the National Fire Protection Association, between 1977 and 2002 the annual number of building fires in the United States declined by 50%, from 3.2 million a year to 1.6 million a year. Fires in hotels and motels, which killed over 100 people a year as recently as the late 1960s, have become so rare that the U.S. Fire Administration no longer keeps statistics on them. If it were not for a sizable increase in wildfire damage—resulting from timber management practices—the statistics would look even better.

    Ultimately, Ms. Eden concludes, "Within tens of minutes, the entire area, approximately 40 to 65 square miles—everything within 3.5 or 4.6 miles of the Pentagon—would be engulfed in a mass fire. The fire would extinguish all life and destroy almost everything else." To reach this horrific prediction, Ms. Eden has to ignore more than the prevalence of sprinkler systems. Among these other issues are the hole in the doughnut problem and the survivability problem.

    I was introduced to the hole in the doughnut issue in 1963 when I first visited UK civil defense in the Home Office, Horseferry House, London. I discovered that the people I was talking to had planned the incendiary attacks during World War II. Their effectiveness depended on how many explosive bombs they included in an attack. If they included too many, the buildings were knocked down and didn’t burn well. In fact, the target just smoldered. If they included too few, the incendiaries often just burned out on undamaged roofs. Finally, in the Hamburg attack, they got it right, just opening up the buildings so they burned rapidly. The Hamburg mass fire was called a "fire storm." These people were adamantly unanimous that a nuclear weapon could never cause a firestorm. The severe-damage region around the explosion would just smolder, producing a "ring fire," called a doughnut by our fire research people. That’s apparently what happened at Hiroshima.

    Mass fire models that ignore such views produce fierce fires that would seem to destroy everything. But lots of people survived in the fire areas at Hamburg and Hiroshima. The late Dr. Carl F. Miller (after whom the California chapter of ASDA is named) did the definitive analysis of the records of the Hamburg Fire Department. About 20 percent of the people in the fire area were in underground bunkers. Eighty percent were in shelters in building basements. Survival in bunkers was 100%; in basements, it was 80%.

    Despite her exaggeration of mass fire effects, I don’t think Ms. Eden’s book would convince the joint strategic targeteers to change their ways. I have concluded that the blast footprint and the fire footprint will be roughly congruent. Thus, I refer to them simply as the "direct effects area" (See my Nuclear Emergency Operations 101.)

    Lynn Eden’s book is a strange book—and a little bit dumb (her term.) I wouldn’t recommend you buy it. But if you are part of the old civil defense research group, you should find the pages on that work interesting. If you just want to learn something about mass fires, try to find a copy of FEMA H-21 of August 1990, the Nuclear Attack Environment Handbook. It won’t lead you astray.


    I exchanged emails on the subject of blast wave energy attenuation in causing damage, a couple of years ago, with Dr Harold Brode, the RAND Corporation expert on the effects of nuclear weapons. I had read Dr William Penney's evidence published in 1970 about the blast in Hiroshima and Nagasaki which he had personally surveyed as soon as the war ended in 1945. The blast, Penney's studies showed, rapidly lost energy (and pressure) due to the work done in causing damage. According to the laws of physics, once damage is done like this, energy is irreversibly lost. The American book by Glasstone ignores this effect entirely, although it does cite Penney's paper.

    Harold Brode suggested that when the blast knocks down a house, the energy used to do that is not entirely lost because you get accelerated fragments of brick, glass and wood moving outward in the radial direction. However, these move far more slowly than the shock front and soon lag behind the shock, fall to the ground and decelerate by tumbling. The distances debris moved when houses were knocked down in nuclear tests in 1953 and 1955 were carefully measured and filmed; it is not that far, and most of the debris remains close (within a matter of metres) to the house. So there is a problem here. For relatively small weapons, the blast pressure drops so rapidly with distance in the range of serious damage, that the energy loss effect is not too severe (although it was apparent in Penney's measurements of the deflection of steel bars and the crushing of petrol cans at Hiroshima and Nagasaki). But for big weapons, it interferes seriously with the blast scaling laws and the result is that blast damage distances increase far more slowly than the official predictions, especially at low pressures.

    This is relevant to massive controversies over thermal radiation effects like skin burns and fires. The majority of fires in Japanese residential areas were caused by the blast wave overturning cooking braziers in homes full of inflammable paper screens, bamboo furniture, etc.; the charcoal braziers were in use at the time of each nuclear attack (breakfast time in Hiroshima, lunch time in Nagasaki). Colonel Tibbets, commander of the 509th which dropped the bombs (he piloted the Hiroshima raid), writes in his 1978 autobiography about his expertise in firestorms: he had served in Europe on successful incendiary missions before going to Japan, where he advised General LeMay on how to create firestorms with a mix of incendiaries plus EXPLOSIVES, which cause blast damage and enable fires to start in the debris of wooden buildings. The bomb on Hiroshima detonated at 8.15 am local time, when people were either on their way to work or school or having breakfast (using charcoal cooking braziers in wood-frame houses with inflammable bamboo and paper-screen furnishings), and that on Nagasaki at about 11 am, when many people were preparing lunch and others were out of doors.

    The skin burns risk depends mainly on the time of day, since the percentage of people in an unobstructed line-of-sight of a fireball in a modern city ranges from 1% in the early hours of the morning to an average of 25% during the daytime. Hence the flash burns casualty rate can easily vary by a factor of 25 merely as a function of the time of day an explosion occurs. Obviously the density of combustible materials on the ground determines the risk of a firestorm, but this is trivial for most modern city buildings, made largely from steel and concrete, which simply don't burn. Dr Brode did several studies of firestorm physics during the 1980s, which I feel are irrelevant because firestorms were well investigated in World War II, when incendiary attacks were made on many cities in an effort to start them. The areas which burned well and produced firestorms had a massive abundance of combustible material per square foot; these were mainly the wooden medieval parts of old cities like Hamburg, and the wooden-construction areas of Japanese cities. Once burned, they were rebuilt with less inflammable materials, so these firestorms cannot be repeated in the future. (Similarly, wooden London was burned down in 1666 and rebuilt in a more fire-resistant manner.)

    There are some interesting reports by Carl Miller on firestorms in Germany, written in the 1960s. Somehow, the RAND Corporation did not get hold of this information, or else it simply jumped on the "nuclear winter" funding bandwagon in 1983, ignoring the World War II firestorm facts obtained from personal experience by people like George R. Stanbury of the British Home Office Scientific Advisory Branch.

    Dr Strope wrote a 1963 NRDL unclassified report on the base surge radiation effects of the 1946 Baker underwater test, which took a lot of finding. Fortunately the British Library was at one time donated a lot of original NRDL reports (in printed form, not the usual poor-quality microfilm), which it holds at Boston Spa. I've compiled and assessed a great deal of information on radiation from underwater tests, but Blogger and Wordpress blog sites are not suited to publishing tables of data.

    The British information which Dr Strope refers to in 1963 is that of Home Office scientist George R. Stanbury, who did the civil defence studies at the first British nuclear test in Monte Bello, 1952. Stanbury writes in the originally 'Restricted' (since declassified) U.K. Home Office Scientific Adviser's Branch journal Fission Fragments, Issue Number 3, August 1962, pages 22-26:

    'The fire hazard from nuclear weapons

    'by G. R. Stanbury, BSc, ARCS, F.Inst.P.

    'We have often been accused of underestimating the fire situation from nuclear attack. We hope to show that there is good scientific justification for the assessments we have made, and we are unrepentant in spite of the television utterances of renowned academic scientists who know little about fire. ...

    'Firstly ... the collapse of buildings would snuff out any incipient fires. Air cannot get into a pile of rubble, 80% of which is incombustible anyway. This is not just guess work; it is the result of a very complete study of some 1,600 flying bomb [V1 cruise missile] incidents in London supported by a wealth of experience gained generally in the last war.

    'Secondly, there is a considerable degree of shielding of one building by another in general.

    'Thirdly, even when the windows of a building can "see" the fireball, and something inside is ignited, it by no means follows that a continuing and destructive fire will develop.

    'The effect of shielding in a built-up area was strikingly demonstrated by the firemen of Birmingham about 10 years ago with a 144:1 scale model of a sector of their city which they built themselves; when they put a powerful lamp in the appropriate position for an air burst they found that over 50% of the buildings were completely shielded. More recently a similar study was made in Liverpool over a much larger area, not with a model, but using the very detailed information provided by fire insurance maps. The result was similar.

    'It is not so easy to assess the chance of a continuing fire. A window of two square metres would let in about 10^5 calories at the 5 cal/(cm)^2 range. The heat liberated by one magnesium incendiary bomb is 30 times this and even with the incendiary bomb the chance of a continuing fire developing in a small room is only 1 in 5; in a large room it is very much less.

    'Thus even if thermal radiation does fall on easily inflammable material which ignites, the chance of a continuing fire developing is still quite small. In the Birmingham and Liverpool studies, where the most generous values of fire-starting chances were used, the fraction of buildings set on fire was rarely higher than 1 in 20.

    'And this is the basis of the assertion [in Nuclear Weapons] that we do not think that fire storms are likely to be started in British cities by nuclear explosions, because in each of the five raids in which fire storms occurred (four on Germany - Hamburg, Darmstadt, Kassel, Wuppertal and a "possible" in Dresden, plus Hiroshima in Japan - it may be significant that all these towns had a period of hot dry weather before the raid) the initial fire density was much nearer 1 in 2. Take Hamburg for example:

    'On the night of 27/28th July 1943, by some extraordinary chance, 190 tons of bombs were dropped into one square mile of Hamburg. This square mile contained 6,000 buildings, many of which were [multistorey wooden] medieval.

    'A density of greater than 70 tons/sq. mile had not been achieved before even in some of the major fire raids, and was only exceeded on a few occasions subsequently. The effect of these bombs is best shown in the following diagram, each step of which is based on sound trials and operational experience of the weapons concerned.

    '102 tons of high explosive bombs dropped -> 100 fires

    '88 tons of incendiary bombs dropped, of which:

    '48 tons of 4 pound magnesium bombs = 27,000 bombs -> 8,000 hit buildings -> 1,600 fires

    '40 tons of 30 pound gel bombs = 3,000 bombs -> 900 hit buildings -> 800 fires

    'Total = 2,500 fires

    'Thus almost every other building [1 in 2 buildings] was set on fire during the raid itself, and when this happens it seems that nothing can prevent the fires from joining together, engulfing the whole area and producing a fire storm (over Hamburg the column of smoke, observed from aircraft, was 1.5 miles in diameter at its base and 13,000 feet high; eyewitnesses on the ground reported that trees were uprooted by the inrushing air).

    'When the density was 70 tons/square mile or less the proportion of buildings fired during the raid was about 1 in 8 or less and under these circumstances, although extensive areas were burned out, the situation was controlled, escape routes were kept open and there was no fire storm.'
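    Stanbury's Hamburg figures are internally consistent, as a quick check shows. A long ton of 2,240 lb is assumed for the bomb counts; the variable names are mine.

```python
# Stanbury's Hamburg figures (Fission Fragments No. 3, 1962), as quoted above.
buildings = 6000        # buildings in the one square mile hit on 27/28 July 1943

he_tons, he_fires = 102, 100              # high explosive bombs
mg_tons, mg_bombs, mg_fires = 48, 27_000, 1600   # 4 lb magnesium incendiaries
gel_tons, gel_bombs, gel_fires = 40, 3_000, 800  # 30 lb gel incendiaries

total_tons = he_tons + mg_tons + gel_tons      # 190 tons on one square mile
total_fires = he_fires + mg_fires + gel_fires  # 2,500 fires

# Bomb counts follow from the tonnages (long ton = 2,240 lb assumed):
assert round(mg_tons * 2240 / 4, -3) == mg_bombs
assert round(gel_tons * 2240 / 30, -3) == gel_bombs

fire_fraction = total_fires / buildings   # roughly "every other building"
print(f"{total_tons} tons -> {total_fires} fires, 1 in {buildings / total_fires:.1f} buildings")
```

    The result, about 1 fire per 2.4 buildings, matches the "almost every other building" firestorm threshold in the quoted passage.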


    Regarding Hiroshima, Nagasaki, Tokyo and other incendiary attacks on Japan, there is an excellent table of comparison of all the data on page 336 of the 1950 "Effects of Atomic Weapons" (deleted from later editions), based on the U.S. Strategic Bombing Survey report of 1946 on Hiroshima and Nagasaki: the Hiroshima bomb destroyed 4.7 square miles, Nagasaki 1.8 square miles, and the 1,667 tons of incendiary and TNT dropped on Tokyo in one conventional raid destroyed 15.8 square miles, killing many more people than the atomic bombs.

    The nuclear winter Cold War propaganda, dependent as it was on firestorm nonsense, is of course a complete scientific lie, but it was a major political and media "spin event":

    "This study, which is based entirely on open Soviet sources, examines and analyzes Soviet views on and uses made by Soviet scientists of the so-called ''Nuclear Winter'' hypothesis. In particular, the study seeks to ascertain whether Soviet scientists have in fact independently confirmed the TTAPS prediction of a ''Nuclear Winter'' phenomenon or have contributed independent data or scenarios to it. The findings of the study are that the Soviets view the ''Nuclear Winter'' hypothesis as a political and propaganda opportunity to influence Western scientific and public opinion and to restrain U.S. defense programs. Analysis of Soviet publications shows that, in fact, Soviet scientists have made no independent or new contributions to the study of the ''Nuclear Winter'' phenomenon, but have uncritically made use of the worst-case scenarios, parameters, and values published in the Crutzen-Birks (Ambio 1982) and the TTAPS (Science, December 1983) studies, as well as models of atmospheric circulation borrowed from Western sources. Furthermore, current Soviet directives to scientists call for work on the further strengthening of the Soviet Union's military might, while it is also explained that the dire predictions of the possible consequences of a nuclear war in no way diminish the utility of the Soviet civil defense program and the need for its further improvement."

    - Dr Leon Goure, USSR foreign policy expert, Soviet Exploitation of the 'Nuclear Winter' Hypothesis, SCIENCE APPLICATIONS INTERNATIONAL CORP., MCLEAN, VA, report SAIC-84/1310, DNA-TR-84-373, SBITR-84-373, ADA165794, June 1985.

    A great deal of the problem is that following fashion and consensus is the easiest thing to do. Usually the mainstream viewpoint is the best there is, so people have a lot of faith in it, on the principle that "so many people can't all be wrong".

    Nuclear winter has quite an interesting history which I've followed from the beginning. It started off with the comet impact that wiped out the dinosaurs. The comet forms a fireball when it collides with the atmosphere, and the thermal radiation is supposed to ignite enough tropical vegetation to produce a thick smoke cloud, freezing the ground and killing off many species.

    The best soot for absorbing solar radiation is that from burning oil, and Saddam inadvertently tested this by igniting Kuwait's oil wells at the end of the first Gulf War in 1991. Massive clouds of soot were produced, but the temperature drop in the affected areas was far less than "nuclear winter" calculations predicted: http://en.wikipedia.org/wiki/Nuclear_winter#Kuwait_wells_in_the_first_Gulf_War

    The idea that a dark smoke layer will stop heat energy reaching the ground is naive because by conservation of energy, the dark smoke must heat up when it absorbs sunlight, and since it is dark in colour it is as good at radiating heat as absorbing it. So it passes the heat energy downwards as the whole cloud heats up, and when the bottom of the cloud has reached a temperature equilibrium with the top, it radiates heat down to the ground, preventing the dramatic sustained cooling.

    Although there is a small drop in temperature at first, as when clouds obscure the sun, all the soot cloud will do in the long run is to reduce the daily temperature variation of the air from day to night, so that the temperature all day and all night will be fairly steady and close to the average of the normal daytime and nighttime temperatures.
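    The argument above can be illustrated with a toy one-layer radiative model (my own illustrative sketch, not a calculation from any "nuclear winter" paper): treat the soot as a single opaque grey layer that absorbs the incoming solar flux and, once in radiative equilibrium, re-emits equally from its top and bottom faces, so half of the absorbed power still reaches the ground as thermal infrared. The 240 W/m² figure is the standard globally averaged absorbed solar flux.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 240.0         # globally averaged absorbed solar flux, W/m^2 (standard value)

# Opaque grey soot layer in radiative equilibrium: it absorbs S and
# emits sigma*T^4 from BOTH its top and bottom faces, so
#   2 * sigma * T^4 = S   =>   T = (S / (2*sigma))**0.25
T_layer = (S / (2.0 * SIGMA)) ** 0.25

# Downward thermal flux delivered to the ground at equilibrium
# (exactly S/2 by construction):
flux_down = SIGMA * T_layer ** 4

print(f"Equilibrium layer temperature: {T_layer:.0f} K")
print(f"Thermal flux re-radiated to the ground: {flux_down:.0f} W/m^2")
```

    So even a completely opaque layer in this crude model still delivers half the absorbed solar power to the surface as heat, rather than cutting it off entirely; the real argument is then over how long the soot stays aloft and at what altitude the heating occurs.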

    The dinosaur extinction evidence, http://en.wikipedia.org/wiki/Chicxulub_Crater, might be better explained by the direct effects of the comet impact: the air blast wave and thermal radiation effects on dinosaurs, and the kilometers-high tsunami. At the time the comet struck Chicxulub in Mexico 65 million years ago with 100 TT of energy (100,000,000 megatons, or 100 million million tons of TNT equivalent), the continents were all located in the same area (see the map at http://www.dinotreker.com/cretaceousearth.html) and would all have suffered severe damage from an explosion of that size. Most dinosaur fossils found are relatively close to the impact site on the world map of 65 million years ago.

    Another issue is that some proportion of the rock in the crater was calcium carbonate, which releases CO2 when heated in a fireball. If there was enough of it, the climatic effects would have been due to excessive heating, not cooling.

    The "nuclear winter" idea relies on soot, not dust such as fallout (which is only about 1% of the crater mass, the remainder being fallback of rock and crater ejecta which lands within a few minutes). So it is basically an extension of the massive firestorms theory, which has many issues because modern cities don't contain enough flammable material per square kilometre to start a firestorm even when using thousands of incendiaries. In cases such as Hiroshima, the heavy fuel loading of the target area created a smoke cloud which carried up a lot of moisture that condensed in the cool air at high altitudes, bringing the soot back promptly to earth as a black rain.

    Because this kind of thing is completely ignored by "nuclear winter" calculations, the whole "nuclear winter" physics looks artificial to me. In 1990, after several studies showed that TTAPS (Sagan et al.) had exaggerated the problem massively through assumptions such as a 1-dimensional model, TTAPS wrote another paper in Science in which they sneakily modified the baseline nuclear targeting assumptions so that virtually all the targets were oil refineries. This enabled them to claim that a moderate cooling was still credible. However, the Kuwait burning oil wells experience a few years later did nothing to substantiate their ideas. Sagan did eventually concede there were faulty assumptions in the "nuclear winter" model, although some of his collaborators continue to write about it.

     
    At 6:26 pm, Blogger nige said...

    copy of a comment:

    http://kea-monad.blogspot.com/2007/10/where-to-now.html

    Just to comment on this. I read Kahn's "On Thermonuclear War" (first published 1960) as a teenager and then requested his other books via the local library.

    Kahn's book "The Next 200 Years" is, if I recall, a slim paperback, and I don't think there was much data in it to make his case.

    The key book for environmentalism is Herman Kahn and Julian Simon, "The Resourceful Earth - A Response to Global 2000" published in 1984 (Kahn died in 1983 while it was still in the press).

    That volume is massive and contains hundreds of graphs and tables of data which really make a convincing case that environmentalism exaggerated the facts.

    I read that perhaps twenty years ago and don't have a copy handy. But I think it dealt with everything.

    Even things like species extinction are being grossly exaggerated - species are always becoming extinct, as the fossil record shows. It's nothing new. As new species come along, old ones die off. If that wasn't the case, there would still be dinosaurs around, and the world would be a lot less healthy for humans. The whole reason why saber-toothed tigers and other wild beasts were hunted to extinction was to make life bearable, not out of ignorance or selfishness!

    Most of this environmentalism is a replacement for religion. The rate of rise of sea levels, etc., is slow enough that low lying areas can build up defenses in the meanwhile - far more cheaply than cutting CO2 emissions.

    Better still, switch to nuclear power. The effects of low doses of external gamma radiation, especially if delivered at low dose rates, are actually beneficial to human beings as they stimulate DNA repair mechanisms like P53 and cut the cancer risk (it's only internal high-LET radiation like alpha and beta particles from ingested Sr-90 or Pu-239, or extremely large doses/dose rates from gamma rays, that cause a net health risk):

    See the monumental report on the effects of low dose rate, low-LET gamma radiation on 10,000 people in Taiwan by W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, Is Chronic Radiation an Effective Prophylaxis Against Cancer?, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format at

    http://www.jpands.org/vol9no1/chen.pdf

    'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

    'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

    'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

    'The experience of these 10,000 persons suggests that long term exposure to [gamma]radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'

    For Hiroshima-Nagasaki data supporting the fact that low level gamma radiation cuts down cancer risks, see the most recent two posts at http://glasstone.blogspot.com/

    Growing populations and economic activity really need to be used to sort out the problems in an unbiased way. Unfortunately, the mainstream approaches start with a prejudice dating back to 1957, when the facts were not known (P53 was only discovered twenty years later). The culture clash between fashionable politics and scientific facts always results in fashionable politics winning, and in people needlessly dying and suffering as a consequence.

    "History shows that much can change, expectedly or unexpectedly, over short periods, and it is unlikely that most trends would continue unabated for decades without changing course."

    I hope you are right. Unfortunately, they will probably make changes for the worse. Like spending enough to bankrupt the world by building giant CO2 extractors which will be completed just about the time the oil, gas and coal run out, and so will never be used. That's the story of how politics always works when it uses "common sense" to tackle complex problems: it is not merely "too little too late", but "completely crazy".

     
    At 10:06 pm, Blogger nige said...

    copy of a comment to Wikipedia:

    http://en.wikipedia.org/wiki/Talk:Ernest_J._Sternglass#POV_issues

    Fastfission, please don't make ''ad hominem'' personal insults about Sternglass being "semi-crackpot". If you want to see my alternative POV on Sternglass, see my top blog post at [[http://glasstone.blogspot.com/]], which analyses errors in Sternglass' work. Notice that this (Wikipedia) article on Sternglass contains a lot of bias. First, it claims in passing that Herman Kahn minimises the effects of radiation, when in fact radiation is the topic Kahn dwells on, e.g., Kahn stated in his 1960 book ''On Thermonuclear War,'' Princeton University Press, p 24:

    ‘... those waging a modern war are going to be as much concerned with bone cancer, leukemia, and genetic malformations as they are with the range of a B-52 or the accuracy of an Atlas missile.’

    Secondly, this Wiki article claims that Linus Pauling was warning that there is no safe threshold back in the 1960s. Scientifically, what matters is what evidence there is either for or against a threshold. Certainly there is no threshold for high-LET radiation like alpha and beta particles in tissue, because they are stopped within a small distance and the ionization density is large enough to overcome human DNA repair mechanisms like protein P53 (which was only discovered in the late 1970s). However, low-LET radiation like gamma rays, whether received at high or low dose rates, does show a threshold [[http://glasstone.blogspot.com/]]; this data is from the Japanese nuclear weapon attacks (where the dose rates were high, due to initial nuclear radiation) and from low-level radiation during an accident in which cobalt-60 got into steel used to make buildings lived in for 20 years by 10,000 people in Taiwan (see W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, ''Is Chronic Radiation an Effective Prophylaxis Against Cancer?,'' published in the ''Journal of American Physicians and Surgeons,'' Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here: [[http://www.jpands.org/vol9no1/chen.pdf]]).

    Thirdly, as explained in my blog [[http://glasstone.blogspot.com/]], Alice Stewart actually debunked Sternglass' model, instead of confirming it as this Wikipedia nonsense claims:

    Sternglass first publicised his "theory" at the 9th Annual Hanford Biological Symposium, May 1969. On 24 July 1969, Dr Alice Stewart wrote an article for the ''New Scientist'', "The Pitfalls of Extrapolation", which identified a big problem:


    "Sternglass has postulated a fetal mortality trend which would eventually produce rates well below the level which - according to his own theory - would result from background radiation." [[http://glasstone.blogspot.com/]]

    Fourthly, his book ''Before the Big Bang'' contains various errors and doesn't address or replace the standard model of particle physics. It's not that Sternglass belongs to a group of "crackpots"; it's just that his work on these subjects is severely defective and wanting. If he did a lot more work on it and resolved the problems, then that would be fine. What causes difficulties is when people try to dictatorially impose ideas containing errors on the world, without first correcting those errors. Labelling all people with alternative ideas "crackpots" by default isn't helpful, especially when you do it anonymously under the name "Fastfission".

    Sternglass may have a problem with nuclear power, but in that case he has the problem that the sun is nuclear and that background low-level radiation exists everywhere. Does he advise us to minimise it by living at the bottom of mineshafts in locations where there is little thorium-232, potassium-40, uranium-238 (and uranium decay daughters, like radon-222), etc.? What about carbon-14 naturally in food? People like Sternglass have helped prejudice the public against the facts. I've traced the history of radiation hysteria here: [[http://glasstone.blogspot.com/2007/03/control-of-exposure-of-public-to.html]], [[http://glasstone.blogspot.com/2007/03/effect-of-dose-rate-not-merely-dose-on.html]] in particular, and [[http://glasstone.blogspot.com/2007/03/above-3.html]]. The basic conclusion is that the "no threshold" dictum was popularised on the basis of a flawed paper by Professor E. B. Lewis, author of Leukemia and Ionizing Radiation, ''Science'', 17 May 1957, v125, No. 3255, pp. 965-72. Lewis used very preliminary Japanese and other data which wasn't detailed enough to show that a threshold existed. He was arguing from ignorance, not from evidence! Yet his argument, which ignored dose rate effects and the quality factor of the radiation, i.e., high or low linear energy transfer (LET). - Nigel Cook 172.207.139.192 (talk) 22:04, 17 November 2007 (UTC)

     
    At 10:13 pm, Blogger nige said...

    continuation of last sentence in previous comment:

    Yet his argument, which ignored dose rate effects and the quality factor of the radiation, i.e., high or low linear energy transfer (LET), was widely accepted at the time due to political prejudice about the Cold War, and has not been corrected as the facts have emerged since.

     
    At 10:28 pm, Blogger nige said...

    I've also got a lot of other evidence that backs up the Japanese and Taiwan data: there were studies, for example, of cancer rates in different cities with different levels of natural background radiation yet closely matched population groups (matched age groups, same diets, same habits as regards smoking and drinking, etc.).

    The fact that this isn't being done by professional health physicists is a sad reflection on the state of society with its severe radiation dogmas and orthodoxies, and character assassination of anyone who prefers FACTS to FASHIONS.

    ‘Science is the organized skepticism in the reliability of expert opinion.’ - R. P. Feynman (quoted by Smolin, The Trouble with Physics, 2006, p. 307).

    ‘Science is the belief in the ignorance of [the speculative consensus of] experts.’ - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p187.

    Cities at greater elevations above sea level have higher background radiation due to cosmic radiation (at sea level, the atmosphere is equivalent to a radiation shield of 10 metres of water, but as you move to higher altitudes, there is a fall in this amount of shielding between you and the nuclear furnaces called stars in the vacuum of outer space, so the cosmic background radiation exposure you get increases substantially).

    The studies date from the 1970s-1990s, and show that if anything there is a fall in the cancer rate as you go to cities at higher altitude and more background radiation.

    However, critics may claim that it is due to cleaner air, less smog, lower oxygen pressure and a healthier lifestyle.

    Other studies of this sort have therefore compared cities at similar altitude with matched populations, where differences in background radiation arise from the bedrock. A city built on granite will generally have a higher background radiation level than one built on clay or limestone, so it is possible to measure the considerably different background radiation levels in different cities, and these can be correlated with cancer rates.

    Obviously here things are complicated, because you get radon-222 gas inside buildings built on granite that contains traces of uranium ore. This radon-222 emits alpha particles, which are high-LET radiation and certainly don't conform to a threshold-effects relationship (there is no threshold for high-LET radiation like alpha particles inside the body, only for gamma rays at relatively low doses). So this complicates the results of such surveys.

    I will dig out all the graphs and other evidence and publish them on this blog in the future.

     
    At 12:34 pm, Blogger nige said...

    copy of a comment:

    http://riofriospacetime.blogspot.com/

    Louise, thank you for a very interesting post on a fascinating subject! Cosmic rays are amazing. Apparently about 90% of those that hit the Earth's atmosphere are protons, 9% are alpha particles (helium nuclei) and 1% are electrons.

    Of course the protons don't make it through the Earth's atmosphere (equivalent to a radiation shield of 10 metres of water, which is quite adequate to shield the core of a critical water-moderated nuclear reactor!!).

    When the high-energy protons hit air nuclei, you get some secondary radiation being created like pions which decay into muons and then electrons.

    A lot of the electrons get trapped into spiralling around the Earth's magnetic field lines at high altitudes, in space, forming the Van Allen radiation belts.

    Where the magnetic field lines dip at the poles, they all come together, and so the electron density increases at the poles. At some point this negative electric charge density is sufficiently large to "reflect" most incoming electrons back, and that spot is called the "mirror point".

    Hence the captured electrons are trapped into spiralling around magnetic field lines, to-and-fro between mirror points in the Northern and Southern hemispheres.

    There are also of course occasional irregular gamma ray flashes from gamma ray bursters, heavy particles, etc.

    It's not clear what the actual radiation levels involved are: obviously the radiation level from cosmic radiation on Earth's surface is known. It's highest at the poles where incoming radiation runs down parallel to magnetic field lines (without being captured), hence the "aurora" around the polar regions where cosmic rays leak into the atmosphere in large concentrations.

    It's also high in the Van Allen belts of trapped electrons.

    It's not quite as bad in space well away from the Earth. Apparently, the cosmic radiation level on the Moon's surface is approximately 1 milliroentgen/hour (10 microsieverts/hour), about 100 times the level on the Earth's surface. If that's true, then presumably the Earth's atmosphere (and the Earth's magnetic field) is shielding 99% of the cosmic radiation exposure rate.
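    Taking those figures at face value (they are rough, order-of-magnitude numbers, not precise data), the implied shielding fraction and annual surface dose follow directly:

```python
# Rough order-of-magnitude figures quoted above (assumptions, not measurements):
lunar_rate = 10.0                  # microsieverts/hour on the Moon's surface (~1 mR/h)
earth_rate = lunar_rate / 100.0    # ~100x lower at the Earth's surface

# Fraction of the free-space dose rate removed by the atmosphere
# plus the geomagnetic field:
shielded_fraction = 1.0 - earth_rate / lunar_rate

# Annualised surface exposure, if that hourly rate were sustained:
hours_per_year = 24 * 365.25
annual_dose_msv = earth_rate * hours_per_year / 1000.0   # millisieverts/year

print(f"Earth surface rate: {earth_rate} uSv/h")
print(f"Shielded fraction:  {shielded_fraction:.0%}")
print(f"Annual dose if sustained: {annual_dose_msv:.1f} mSv/yr")
```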

    All satellites have to have radiation-hardened solar cells and electronics, in order to survive the enhanced cosmic radiation exposure rate in space.

    In the original version of Hawking's "A Brief History of Time" he has a graph showing the gamma ray energy spectrum of cosmic radiation in outer space, with another curve showing the gamma ray output from black holes via Hawking radiation. Unfortunately, the gamma background radiation intensity at all frequencies in the spectrum is way higher than the predicted gamma ray output from massive black holes (which is tiny), so there is too much "noise" to identify this Hawking radiation.

     
    At 5:34 pm, Blogger nige said...

    copy of a comment:

    http://www.stevens.edu/csw/cgi-bin/blogs/csw/?p=85#comment-13372

    This list is mainly books I've actually read, whereas for the last one you did (a year or more ago), I hadn't heard of most of the titles. So I can make a comment or two.

    I have ... disagreements with some claims in these books. ...

    * I disagree with Weinberg's hype (The First Three Minutes) in applying general relativity to cosmology, because I think in a quantum gravity theory the exchanged gravitons will be received in a redshifted state for gravitational interactions between relativistically receding masses.

    * I disagree with several major errors by Richard Rhodes (The Making of the Atomic/Hydrogen Bomb). (1) Presenting the Copenhagen Interpretation of Bohr as if it were Gospel truth, and ignoring Feynman's path integrals interpretation, which replaces the Copenhagen Interpretation with chaotic effects due to path interference at small distance scales, e.g., interference with electron orbits by pair production of virtual particles, which causes Brownian-motion type chaos in the atom. (2) Not pointing out clearly that the role of Hiroshima and Nagasaki was to encourage Russia to declare war on Japan (so as to be on the list of victors), which ended Japan's hope that Russia might negotiate a settlement with America to end the war without loss of face - the hope Japan had been holding out for. (Although, because America had by 9 August exaggerated its hand with atomic warfare, and the President had promised an endless rain of ruin when in fact a third atomic bomb wouldn't be ready until September, America had to accept a conditional rather than an unconditional surrender from Japan: it couldn't afford to have its bluff called when it would be unable to deliver another atomic bomb for many weeks.) According to page 336 of the U.S. Government book "Effects of Atomic Weapons" (1950), the incendiary raid on Tokyo killed more than the number killed at Hiroshima and Nagasaki put together. According to the Radiation Effects Research Foundation internet site (they do the surveys of Hiroshima and Nagasaki survivors), even among the survivors within 1 km of the hypocentre, less than half the leukemia cases were due to radiation (the majority were natural). Leukemia is the cancer most enhanced by radiation exposure.
Altogether, from 1950-90, only 428 people died of cancer of all sorts due to radiation (9% of all the cancer deaths, i.e., 91% of cancer deaths were not connected to radiation, as proved by the control group survey) in a group of 36,500 survivors. Hence, the average risk of death from cancer due to radiation to a survivor for a period of 40 years after exposure was only 1.2%, compared to a natural (non-radiation) cancer death risk of 13%. No wonder that 50% of the survivors were still alive in 1995, fifty years after the bombings. Ref.: Radiation Research (146:1-27, 1996).
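    A quick arithmetic check of those figures, using only the numbers quoted above:

```python
survivors = 36_500
radiation_cancer_deaths = 428     # 1950-90, attributed to radiation
radiation_share = 0.09            # 9% of all cancer deaths in the cohort

# Risk of a radiation-induced cancer death per survivor over the 40 years:
radiation_risk = radiation_cancer_deaths / survivors

# Total and natural (non-radiation) cancer deaths implied by the 9% share:
total_cancer_deaths = radiation_cancer_deaths / radiation_share
natural_risk = (total_cancer_deaths - radiation_cancer_deaths) / survivors

print(f"Radiation cancer risk: {radiation_risk:.1%}")   # ~1.2%
print(f"Natural cancer risk:   {natural_risk:.1%}")     # ~12%, close to the ~13% quoted
```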

    Richard Rhodes, like Steven Weinberg, is not writing pure science, but spin that the public want to read because they are prejudiced by propaganda from political institutions with axes to grind. What really made me angry in Rhodes' books is his pseudoscientific treatment of fallout particles. He claims in one place that coral is reduced to calcium metal, and in another that the first H-bomb test in 1952 produced 80 million tons of mud. In fact the fallout had been carefully collected and analysed in weapon test reports WT-615, WT-915 and WT-1317, which show that the fallout from coral (CaCO3, calcium carbonate) is CaO (lime), with an outer layer of Ca(OH)2 (slaked lime, calcium hydroxide). The 80 million tons he quotes is the crater volume, which is mostly just ejected as rocks around the crater. The fallout mass taken up into the fireball is only 1% of the cratered material. Even if the fireball were hot enough to reduce 80 million tons of coral to calcium metal (it isn't), that calcium would oxidise in the atmosphere while falling out. Rhodes' science is a lot of hogwash!

     
    At 9:00 am, Blogger nige said...

    copy of a comment:

    http://riofriospacetime.blogspot.com/2007/12/night-at-museum-pt-2.html

    Thanks, Louise. This is extremely interesting and very informative! It's interesting that the dense meteorites, especially those composed of iron and nickel, tend to survive the ablation during their fall through the atmosphere, and hit the ground. Less dense stony objects of similar mass tend to heat up and then explode like an air burst nuclear bomb while still high in the atmosphere, as was the case of the Tunguska explosion of June 30, 1908 (an explosion equivalent to several megatons of TNT, see C. Chyba, P. Thomas, and K. Zahnle, "The 1908 Tunguska Explosion: Atmospheric Disruption of a Stony Asteroid", Nature, v361, 1993, p. 40-44).

    "Since the Hall of Meteorites contains similar samples, are any of them about to melt? If they contained even a tiny amount of radioactive isotopes, it would not be safe to go near this room. If they contained any isotopes, those would have decayed to nothing long ago. Today these rocks are cold as the New York Winter, yet Earth's core continues to produce heat."

    If a small rock were hot enough for the heat to be measurable, the radiation would be lethal. A radiation dose of 10 Sieverts, which equals 10 Joules/kg for a quality factor of 1 (low-LET radiation), is lethal within a few days. Since an average person weighs 70 kg, that means 700 Joules of absorbed radiation energy is lethal. To make a rock hot, and keep it hot for long periods, by the degradation of radioactive energy into heat, much larger amounts of radioactive energy are required, so the radiation from such a rock would be lethal.
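    The energy comparison in that argument is easy to make explicit. The 10 Sv lethal dose and 70 kg body mass are the figures used above; the rock-heating numbers (1 kg mass, 800 J/(kg·K) specific heat, 10 K warming) are illustrative assumptions of mine:

```python
# Lethal whole-body dose for low-LET radiation (quality factor 1):
lethal_dose = 10.0       # Sv = J/kg
body_mass = 70.0         # kg
lethal_energy = lethal_dose * body_mass     # joules absorbed by the body
print(f"Lethal absorbed energy: {lethal_energy:.0f} J")

# Compare: merely warming a small rock ONCE takes far more energy than that.
# Illustrative assumptions: 1 kg silicate rock, c ~ 800 J/(kg*K), 10 K warming.
rock_mass = 1.0          # kg
specific_heat = 800.0    # J/(kg*K)
delta_T = 10.0           # K
heating_energy = rock_mass * specific_heat * delta_T
print(f"Energy to warm the rock by 10 K: {heating_energy:.0f} J")
```

    And since a warm rock continuously loses heat to its surroundings, that energy must be replaced over and over, so the decay power (and hence the radiation field next to such a rock) would dwarf the mere 700 J that suffices to kill.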

    The thing about the earth is that you have a lot of radioactivity distributed within it, and very little leakage of that energy. A few feet of earth or rock can keep the embers of a fire hot for a long time. If you take account of the thickness of the earth's crust, it traps heat very efficiently, so that a moderate amount of radioactivity keeps the core hot. (However, I'm skeptical about the details as I've not seen any convincing calculations from geologists so far.)

    If you try testing those meteorites for radioactivity, you will find there is some in them (probably little, but still a trace)! The earth does contain a lot of uranium: http://www.uic.com.au/nip78.htm:

    "The convection in the core may be driven by the heat released during progressive solidification of the core (latent heat of crystallisation) and leads to the self-sustaining terrestrial dynamo which is the source of the Earth's magnetic field. Heat transfer from the core at the core/mantle boundary is also believed to trigger upwellings of relatively hot, and hence low density, plumes of material. These plumes then ascend, essentially without gaining or losing heat, and undergo decompression melting close to the Earth's surface at 'hot spots' like Hawaii, Reunion and Samoa.

    "However, the primary source of energy driving the convection in the mantle is the radioactive decay of uranium, thorium and potassium. In the present Earth, most of the energy generated is from the decay of U-238 (c 10-4 watt/kg). At the time of the Earth's formation, however, decay of both U-235 and K-40 would have been subequal in importance and both would have exceeded the heat production of U-238. ...

    "Measurements of heat have led to estimates that the Earth is generating between 30 and 44 terawatts of heat, much of it from radioactive decay. Measurements of antineutrinos have provisionally suggested that about 24 TW arises from radioactive decay. Professor Bob White provides the more recent figure of 17 TW from radioactive decay in the mantle. This compares with 42-44 TW heat loss at the Earth's surface from the deep Earth."

    There's nothing in the universe that isn't radioactive. (Even clouds of hydrogen gas contain traces of tritium.)

    Table 1 in that above-linked article shows that meteorites are 0.008 parts per million uranium, the earth's mantle is 0.021 parts per million uranium, and the continental crust is 1.4 parts per million uranium. The concentration of uranium in the earth's core is not very well known (though antineutrino measurements are available), but since uranium is denser than lead, there may be a considerable concentration of uranium in the earth's core, at least similar to that in the crust. The same goes for thorium-232, etc.

    ... The earth's core is hot not because the radioactivity is capable of keeping isolated rocks hot, but because the rate of loss of heat is minimised due to the poor thermal conductivity of the outer layers, particularly the crust. This keeps most of the heat trapped.

    The calculation to check the theory should be simple. Take the total radioactivity in the earth (in Becquerels, decays/second), multiply it by the average energy of the radiation emitted (0.3 MeV or so for a beta particle, 4 MeV or so for an alpha particle) and that gives you the total MeV/second, then convert that power ... into Joules/second (watts). Then estimate the diffusion rate of the heat out of the earth.
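    As a sketch of that "simple calculation", here is the first step done for U-238 alone: convert its specific activity and decay-chain energy into watts per kilogram, which reproduces the ~10^-4 W/kg figure quoted earlier, then scale by an assumed mantle uranium inventory. The 47 MeV per decay chain, the 4×10^24 kg mantle mass and the 0.021 ppm abundance are rough illustrative values, not precise data:

```python
import math

AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13
YEAR = 3.156e7                 # seconds

# U-238: half-life, and approximate total heat per full decay chain
# down to stable Pb-206 (excluding energy carried off by neutrinos):
half_life = 4.468e9 * YEAR     # seconds
chain_energy_mev = 47.0        # MeV per chain (rough value)

# Specific activity of pure U-238 (decays per second per kg):
atoms_per_kg = 1000.0 / 238.0 * AVOGADRO
activity = math.log(2) / half_life * atoms_per_kg      # Bq/kg

# Heat output per kg of U-238 (should come out near 1e-4 W/kg):
power_per_kg = activity * chain_energy_mev * MEV_TO_J

# Scale to the whole mantle (illustrative values):
mantle_mass = 4.0e24           # kg
u_abundance = 0.021e-6         # 0.021 ppm by mass
mantle_heat_tw = power_per_kg * mantle_mass * u_abundance / 1e12

print(f"U-238 specific heat output: {power_per_kg:.2e} W/kg")
print(f"Mantle heat from U-238 alone: ~{mantle_heat_tw:.0f} TW")
```

    Thorium-232 and potassium-40 contribute roughly comparable amounts again, which brings the total radiogenic heat up towards the 17-24 TW range quoted in the article above.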

     
    At 11:16 am, Blogger nige said...

    [BTW, I've noticed some typographical errors and errors of grammar in the last update added to the body of this post, e.g., the update section. I'm not going to try to update it, for the following reasons. There is a flaw in the old blogger template software used on this blog, and every time any changes are made, extra line spacings between paragraphs are automatically inserted when the changes are saved. There is also a flaw that if the template is changed, comments are lost and not transferred over.]

    Extract of relevant material from a comment to:

    http://kea-monad.blogspot.com/2007/11/panthalassa.html

    ...
    The evidence in favour of a supernova explosion shortly before the Earth formed 4,540 million years ago is compelling from the natural radioactivity distribution in the Earth. Earth is basically a giant fallout particle, as people like Edward Teller first pointed out over fifty years ago:

    ‘Dr Edward Teller remarked recently that the origin of the earth was somewhat like the explosion of the atomic bomb...’

    – Dr Harold C. Urey, The Planets: Their Origin and Development, Yale University Press, New Haven, 1952, p. ix.

    ‘It seems that similarities do exist between the processes of formation of single particles from nuclear explosions and formation of the solar system from the debris of a supernova explosion. We may be able to learn much more about the origin of the earth, by further investigating the process of radioactive fallout from the nuclear weapons tests.’

    – Dr P.K. Kuroda, ‘Radioactive Fallout in Astronomical Settings: Plutonium-244 in the Early Environment of the Solar System,’ Radionuclides in the Environment (Dr Edward C. Freiling, Symposium Chairman), Advances in Chemistry Series No. 93, American Chemical Society, Washington, D.C., 1970.
    ...

     
    At 5:02 pm, Blogger nige said...

    A rare, non-detailed background survey of the social reasons for nuclear weapons effects data censorship is the paper:

    Professor Brian Martin (then a PhD physicist at the Department of Mathematics, Faculty of Science, Australian National University, Canberra, but now he is Professor of Social Sciences in the School of Social Sciences, Media and Communication at the University of Wollongong), "Critique of Nuclear Extinction", published in Journal of Peace Research, Vol. 19, No. 4, pp. 287-300 (1982):

    "The idea that global nuclear war could kill most or all of the world's population is critically examined and found to have little or no scientific basis. A number of possible reasons for beliefs about nuclear extinction are presented, including exaggeration to justify inaction, fear of death, exaggeration to stimulate action, the idea that planning is defeatist, exaggeration to justify concern, white western orientation, the pattern of day-to-day life, and reformist political analysis. Some of the ways in which these factors inhibit a full political analysis and practice by the peace movement are indicated. Prevalent ideas about the irrationality and short duration of nuclear war and of the unlikelihood of limited nuclear war are also briefly examined."

    For his article debunking the "nuclear winter" hoax of Sagan et al., see Brian Martin's article, "Nuclear winter: science and politics", Science and Public Policy, Vol. 15, No. 5, October 1988, pp. 321-334, http://www.uow.edu.au/arts/sts/bmartin/pubs/88spp.html.

    Notice that Brian Martin is an immensely important figure in censorship studies: http://www.uow.edu.au/arts/sts/bmartin/pubs/controversy.html#nuclearwar.

    Of particular interest on the Brian Martin site are the following pages:

    http://www.uow.edu.au/arts/sts/bmartin/dissent/intro/

    and

    http://www.uow.edu.au/arts/sts/bmartin/pubs/controversy.html

     
    At 7:05 pm, Blogger nige said...

    Also see the informative article online in PDF:

    "Nitrogen oxides, nuclear weapon testing, Concorde and stratospheric ozone" by P. Goldsmith, A. F. Tuck, J. S. Foot, E. L. Simmons & R. L. Newson, published in Nature, v. 244, issue 5418, pp. 545-551, 31 August 1973:

    "ALTHOUGH AMOUNTS OF NITROGEN OXIDES EQUIVALENT TO THE OUTPUT FROM MANY CONCORDES WERE RELEASED INTO THE ATMOSPHERE WHEN NUCLEAR TESTING WAS AT ITS PEAK, THE AMOUNT OF OZONE IN THE ATMOSPHERE WAS NOT AFFECTED."

    What happens when nitrogen oxides are released in a nuclear explosion is partly that they combine with moisture in the mushroom cloud to form very dilute nitric acid which eventually (after being blown around the world in small particles) gets precipitated.

    More important, although the shock wave of a nuclear explosion creates nitrogen oxides, especially nitrogen dioxide, THE PROMPT X-RAYS AND GAMMA RADIATION CREATE OZONE!

    It's the ozone around the early fireball that shields most of the early-time thermal radiation, which is mainly in the ultraviolet.

    Hence, nuclear explosions in the atmosphere don't just release ozone-destroying nitrogen oxides, THEY ALSO RELEASE OZONE! Depending on the yield and the altitude of the detonation, in some cases the Earth's ozone layer can actually be INCREASED not reduced.

    A high altitude nuclear explosion does NOT produce a strong blast wave, and nitrogen oxide production requires a high-overpressure shock wave! Hence, in a high altitude nuclear explosion, the production of ozone from gamma radiation EXCEEDS the production of nitrogen oxides many times over. It is quite conceivable that suitable high altitude nuclear explosions over the South Pole would have the effect of repairing the hole in the ozone layer there. Of course, it won't happen, because as Feynman said when discussing nuclear testing hysteria in the 1960s, we really still live in a pseudo-scientific age.

    See also:

    J. Strzelczyk, W. Potter, & Z. Zdrojewicz, "Rad-By-Rad (Bit-By-Bit): Triumph of Evidence Over Activities Fostering Fear of Radiogenic Cancers at Low Doses", Dose Response, v. 5 (2007), issue 4, pp. 275-283:

    "Large segments of Western populations hold sciences in low esteem. This trend became particularly pervasive in the field of radiation sciences in recent decades. The resulting lack of knowledge, easily filled with fear that feeds on itself, makes people susceptible to prevailing dogmas. Decades-long moratorium on nuclear power in the US, resentment of "anything nuclear", and delay/refusal to obtain medical radiation procedures are some of the societal consequences. The problem has been exacerbated by promulgation of the linear-no-threshold (LNT) dose response model by advisory bodies such as the ICRP, NCRP and others. This model assumes no safe level of radiation and implies that response is the same per unit dose regardless of the total dose. The most recent (June 2005) report from the National Research Council, BEIR VII (Biological Effects of Ionizing Radiation) continues this approach and quantifies potential cancer risks at low doses by linear extrapolation of risk values obtained from epidemiological observations of populations exposed to high doses, 0.2 Sv to 3 Sv. It minimizes the significance of a lack of evidence for adverse effects in populations exposed to low doses, and discounts documented beneficial effects of low dose exposures on the human immune system. The LNT doctrine is in direct conflict with current findings of radiobiology and important features of modern radiation oncology. Fortunately, these aspects are addressed in-depth in another major report—issued jointly in March 2005 by two French Academies, of Sciences and of Medicine. The latter report is much less publicized, and thus it is a responsibility of radiation professionals, physicists, nuclear engineers, and physicians to become familiar with its content and relevant studies, and to widely disseminate this information. To counteract biased media, we need to be creative in developing means of sharing good news about radiation with co-workers, patients, and the general public."

    Here's a quotation from Feynman. (This is not his specific objection to low-level radiation hysteria, which he rejected elsewhere by saying that if Pauling et al. were so worried about such levels of radiation, they would campaign first and foremost to evacuate cities at high altitudes where cosmic radiation is highest, they would ban air travel, they would evacuate cities built on bedrock like granite that contains substantial quantities of naturally radioactive uranium-238, etc., and only THEN move on to the far smaller dangers of fallout from weapons tests, which only increased lifetime background radiation dosage by typically a mere 1%; see Feynman's book "The Meaning of It All".)

    "What is Science?" by R.P. Feynman, presented at the fifteenth annual meeting of the National Science Teachers Association, 1966 in New York City, and reprinted from The Physics Teacher Vol. 7, issue 6, 1968, pp. 313-320:

    "... great religions are dissipated by following form without remembering the direct content of the teaching of the great leaders. In the same way, it is possible to follow form and call it science, but that is pseudo-science. In this way, we all suffer from the kind of tyranny we have today in the many institutions that have come under the influence of pseudoscientific advisers.

    "We have many studies in teaching, for example, in which people make observations, make lists, do statistics, and so on, but these do not thereby become established science, established knowledge. They are merely an imitative form of science analogous to the South Sea Islanders' airfields--radio towers, etc., made out of wood. The islanders expect a great airplane to arrive. They even build wooden airplanes of the same shape as they see in the foreigners' airfields around them, but strangely enough, their wood planes do not fly. The result of this pseudoscientific imitation is to produce experts, which many of you are. ... you teachers, who are really teaching children at the bottom of the heap, can maybe doubt the experts. As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts.

    "When someone says, "Science teaches such and such," he is using the word incorrectly. Science doesn't teach anything; experience teaches it. If they say to you, "Science has shown such and such," you might ask, "How does science show it? How did the scientists find out? How? What? Where?"

    "It should not be "science has shown" but "this experiment, this effect, has shown." And you have as much right as anyone else, upon hearing about the experiments--but be patient and listen to all the evidence--to judge whether a sensible conclusion has been arrived at.

    "In a field which is so complicated ... that true science is not yet able to get anywhere, we have to rely on a kind of old-fashioned wisdom, a kind of definite straightforwardness. I am trying to inspire the teacher at the bottom to have some hope and some self-confidence in common sense and natural intelligence. The experts who are leading you may be wrong.

    "I have probably ruined the system, and the students that are coming into Caltech no longer will be any good. I think we live in an unscientific age in which almost all the buffeting of communications and television--words, books, and so on--are unscientific. As a result, there is a considerable amount of intellectual tyranny in the name of science.

    "Finally, with regard to this time-binding, a man cannot live beyond the grave. Each generation that discovers something from its experience must pass that on, but it must pass that on with a delicate balance of respect and disrespect, so that the race--now that it is aware of the disease to which it is liable--does not inflict its errors too rigidly on its youth, but it does pass on the accumulated wisdom, plus the wisdom that it may not be wisdom.

    "It is necessary to teach both to accept and to reject the past with a kind of balance that takes considerable skill. Science alone of all the subjects contains within itself the lesson of the danger of belief in the infallibility of the greatest teachers of the preceding generation."

     
    At 7:31 pm, Blogger nige said...

    On the subject of consensus-led "groupthink", see http://en.wikipedia.org/wiki/Groupthink:

    ’Groupthink is a type of thought exhibited by group members who try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas. During Groupthink, members of the group avoid promoting viewpoints outside the comfort zone of consensus thinking. A variety of motives for this may exist such as a desire to avoid being seen as foolish, or a desire to avoid embarrassing or angering other members of the group. Groupthink may cause groups to make hasty, irrational decisions, where individual doubts are set aside, for fear of upsetting the group’s balance.’

    - Wikipedia.

    ‘[Groupthink is a] mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.’

    - Professor Irving Janis.

    The wikipedia article on groupthink gives two examples which have been investigated in some detail: the Space Shuttle Challenger disaster (1986), which Feynman and an Air Force general investigated, and the Bay of Pigs invasion (1961).

    Challenger exploded on launch in 1986 because it was launched in freezing weather: the cold had caused the rubber O-rings (sealing the sections of the reusable booster rocket) to lose their resilience, so a booster joint leaked hot combustion gas. The escaping jet of flame quickly eroded the O-rings entirely, burned into the adjacent external fuel tank, and set off the explosion which destroyed the vehicle.

    In the days and hours before the disaster occurred, technicians repeatedly pointed out to their bosses that this risk was going to put lives on the line. The way out was to wait until the ambient temperature was above the point where the rubber ceased to function as a sealant. But the delay in launching the shuttle was deemed too costly and unnecessary. The risks, although known, were dismissed by the senior "experts" in charge of the operation who heard about them. There was also the problem that the major source of data on the problem was the contractor selling the boosters to NASA, which didn't want to lose its contract by raising unnecessary problems and worries. So everyone agreed to cross their fingers, hope for the best, and launch the shuttle when they knew the temperature was so low that the rubber seals could malfunction and leak, causing disaster.

    An account of the investigation was written by Feynman and included as an appendix to his book What Do You Care What Other People Think?; Feynman was the main physicist on the commission of inquiry into the disaster. However, Feynman couldn't find the exact cause directly himself because - despite going to all the contractors - nobody there told him the facts. The people concerned who knew were scared to lose their jobs or to seem somehow "disloyal" to their employers by speaking out.

    What happened instead was that Feynman was told the facts by another expert on the same committee, the rocket engineer General Donald J. Kutyna. Kutyna was the man who had headed the inquiry into the explosion of a liquid-fuelled Titan missile in its silo in 1980 (a technician had in that case caused the disaster by accidentally dropping a wrench socket down the silo, where it hit the fuel tank and caused a leak which led to a chemical explosion that blew the 9-megaton warhead off the missile - without detonating the 1-point safe nuclear core, of course). Because of his experience of investigating a liquid-fuelled rocket explosion, Kutyna was able to work out why Challenger blew up, and he told Feynman the facts to ensure that the NASA cover-up would be exposed and criticism levelled at those who made the decision to launch in low-temperature weather, when they should have delayed the launch to reduce risks and save lives.

    The Bay of Pigs disaster occurred when President Kennedy in 1961 authorised the invasion of Cuba by a group of Cuban exiles. They met fierce opposition and called for air support. Kennedy didn't want to provide air support using the U.S. military, for fear that the U.S. involvement would become known. In the end, he lost both the invasion and also the anonymity of the U.S., because when Castro captured the Cuban exiles they had U.S. equipment and admitted having been trained by the U.S. The cause of the failure was the groupthink of Kennedy's advisors, who feared speaking out of turn when the final planning for the operation was being approved.

    Another example of groupthink by a committee of leading experts is the design of the Hiroshima bomb, which used 64.1 kg of highly enriched uranium yet only managed to fission about 1% of it, yielding roughly 12 kt or so. The Nagasaki bomb contained only 6.19 kg of plutonium, but had a fission efficiency of over 20%. In 1952, an implosion bomb along the lines of the Nagasaki bomb (but with a hollow core), using the same quantity of uranium as the Hiroshima bomb, yielded 500 kt, i.e. about 50% fission efficiency.
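
    These efficiency figures can be roughly cross-checked. Taking the commonly cited value of about 17.6 kt of yield per kilogram of fissile material completely fissioned (an assumed round figure here, as are the representative yields of 13 kt and 21 kt for the two wartime bombs), the percentages come out close to those quoted:

```python
# Rough cross-check of the fission efficiency figures quoted above.
# KT_PER_KG_FISSIONED is an assumed round number (~17.6 kt of yield
# per kg of fissile material completely fissioned).
KT_PER_KG_FISSIONED = 17.6

def fission_efficiency(yield_kt, fissile_mass_kg):
    """Fraction of the fissile core actually fissioned."""
    return yield_kt / (fissile_mass_kg * KT_PER_KG_FISSIONED)

print(f"Hiroshima (13 kt, 64.1 kg HEU): {fission_efficiency(13, 64.1):.1%}")
print(f"Nagasaki (21 kt, 6.19 kg Pu):   {fission_efficiency(21, 6.19):.1%}")
print(f"1952 test (500 kt, 64.1 kg):    {fission_efficiency(500, 64.1):.1%}")
```

    For these assumed inputs the three efficiencies come out at roughly 1%, 19% and 44%, consistent (to within the roughness of the assumed constant) with the figures quoted in the text.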

    Why was the Hiroshima bomb relatively so inefficient? In the book The Curve of Binding Energy by John McPhee, Dr Theodore Taylor criticised the Hiroshima bomb design as a stupid design which was the result of groupthink-type incompetence due to a committee team.

    The Hiroshima bomb was like a gun: it fired a solid cylinder of highly enriched U-235 into a hollow cylinder. There were quite a few issues with this. If the fission reaction was started by a stray neutron before the U-235 projectile was fully home in the hollow U-235 sleeve, the result would be relatively inefficient, because the geometry would allow most of the neutrons to escape instead of producing further fissions. Since the duration of any fission reaction was trivial in comparison to the time taken for the projectile to move a few centimetres, there would be no further assembly once the reaction began.

    These details are disclosed in an article called The Physics of the Bomb, published in No. 2 of the Penguin series 'Science News', 1947, written by Los Alamos weapons designer Professor Philip Morrison. Morrison expanded on several details of the 1945 Smyth report (Atomic Energy for Military Purposes) in his article: for example, the statistical risks of pre-detonation (inefficiency) due to the neutron background in a gun-type weapon assembly, and the fact that neutrons reflected by a tamper take so long to leave the core, get reflected and return, that in that interval the chain reaction has grown exponentially and the returning neutrons are trivial. Morrison made it clear that the role of a neutron reflector is restricted largely to keeping the critical mass minimal at the moment the reaction starts and thus starting the reaction efficiently (with as much supercriticality as possible for a given mass of fissile material, since supercriticality depends on the ratio of the actual mass present to the critical mass at the moment the reaction begins), rather than preventing neutron escape once the reaction is growing at an exponential rate. He also made it clear that the fission chain reaction always ends prematurely to some degree (hence without 100% fission efficiency) due to thermal expansion of the bomb core, which soon makes it subcritical and quenches the reaction. He made it clear that the key to high efficiency is achieving the maximum possible degree of supercriticality at the start of the reaction, i.e. having a configuration which contains as many critical masses as possible.
This can be achieved by reducing the effective value of the critical mass while keeping the actual mass of fissile material constant; this is the route taken when a bomb core is compressed, and in high yield, high-efficiency fission weapons it is achieved using a hollow fissile core surrounded by a layer of chemical explosive which is detonated at many points simultaneously. Such points were not always intuitively obvious during WWII, and the nearest thing the Manhattan Project had to a computer was a non-electronic punched-card sorting system (mechanical, driven by electric motors, carrying no information in the form of electrical signals). Initially the person in charge of it spent months playing about with it, producing logarithmic tables, until Oppenheimer fired him and put Feynman in charge of making efficiency calculations for atomic bomb cores.

    In 1953, Morrison had to testify before the U.S. Congressional hearings on "Subversive Influence in the Educational Process" held by the Subcommittee to Investigate the Administration of the Internal Security Act and other Internal Security Laws of the Committee on the Judiciary, U.S. Senate, 83rd Congress. Morrison admitted having been a communist party member at college. In earlier hearings before the same committee, Morrison was accused of hyping the effects of nuclear weapons in Japan for political purposes. See http://writing.upenn.edu/~afilreis/50s/morrison.html

    Philip Morrison, a Cornell Professor of Physics, expresses doubts about atomic warfare and then faces a Congressional anticommunist investigating committee, 1952
    a brief excerpt from: SUBVERSIVE INFLUENCE IN THE EDUCATIONAL PROCESS (Hearings before the Subcommittee to Investigate the Administration of the Internal Security Act and other Internal Security Laws of the Committee on the Judiciary, US Senate, 82nd congress, 2nd session, Sept. 8, 9, 10, 23, 24, 25, and Oct. 13, 1952 (US Govt Printing Office, 1952)


    -----------------------------------
    Mr. Morris. Did you contribute an article to the Scientific American?
    Dr. Morrison. I have had it published. I don't know if you call that contributing or not.

    Mr. Morris. Did you write a review of a book by an Englishman named P.M.S. Blackett, entitled "Fear, War, and the Bomb?"

    Dr. Morrison. I reviewed P.M.S. Blackett's book for the Herald Tribune and for the Bulletin of Atomic Scientists.

    Mr. Morris. And you praised that book?

    Dr. Morrison. I said that book had many excellent things in it. I also criticized an amendment. I wrote an honest review of the book.

    Mr. Morris. Mr. Chairman, may that review of Dr. Morrison of P.M.S. Blackett's book entitled "Fear, War, and the Bomb" be put into the record?

    The Chairman. It may be made part of the record.

    (The material referred to follows:)


    "BLACKETT'S ANALYSIS OF THE ISSUES"
    by Philip Morrison
    [Bulletin of Atomic Scientists, February 1949]

    It is 3 years since the writing of the first extensive political work of the atomic scientists: One World or None. Now the same publishers put out the American edition of a book by another scientist, the distinguished, well-informed, and earnest P.M.S. Blackett of Great Britain. As a contributor to the first book, I feel no proprietary pangs in urging all those who bought or borrowed it--and there were many--to get hold of the Blackett book.

    It is written at a sadder time, and perhaps a wiser one. It is written by a man whose experience is both that of a physicist and that of a military man, and who is no American, but an Englishman, willing to take a somewhat more critical position on the issues of the day than almost any American scientist has publicly done. It is a book which does Professor Blackett credit for its thoughtfulness and scope, even though as he himself points out it is by no means "the whole truth." Read it if you wish to have an opinion on the issues of atomic energy.

    My piece in One World or None was the description of the effect of a single atomic bomb on New York City. It is a frightening article, as I have many times tested by direct observation. Yet it is a major thesis of the Blackett book--and I believe a correct thesis--that even a thousand bombs will not of themselves decide the issue of a major war. We said there is no defense, and we meant it. It is still true. But we spoke in a different language from the language of Blackett. We did not speak in terms of strategy, in terms of overall economies, in terms of production and territorial conquest. We spoke of the impact of the bomb on the homes and the hopes of men and women.

    I wrote of the lingering death of the radiation casualties, of the horrible flash burns, of the human wretchedness and misery that every atomic bomb will leave near its ground zero. Against this misery there is indeed no real defense. Neither our oceans nor our radar nor our fighters can keep us intact through another major war. But--and I quote Blackett (p. 159): "The very effective campaign, largely initiated by the atomic scientists themselves, to make the world aware of the terrible dangers of atomic bombs, played an important part in bringing pressure to bear on the American Government to propose measures to control atomic weapons and to take them out of the hands of the military."


    -----------------------------------
    The hearing transcript provides this note on Morrison:
    "Professor Morrison is a nuclear physicist who took part in the design and fabrication of the bomb at Los AIamos Laboratory. He is now a member of the Physics Department at Cornell University."


    Much of the effort made by people to publish nuclear weapon design secrets seems to be motivated by anti-deterrence sentiments.

    Information on details of nuclear weapon design is not needed to justify civil defence/defense efforts.

     
    At 11:57 am, Blogger nige said...

    Some historically scientific material of great relevance to this post is to be found in the book by physicists Dr. Edward Teller and Dr. Albert L. Latter, Our Nuclear Future ... Facts, Dangers and Opportunities, Criterion Books, New York, 1958:

    A very prescient passage from page 119:

    "It is possible that radiation of less than a certain intensity does not cause bone cancer or leukemia at all. In the past small doses of radiation have often been regarded as beneficial. This is not supported by any scientific evidence [as of 1958]. Today many well-informed people believe [without any evidence, i.e. on the basis of pure ignorance] that radiation is harmful even in the smallest amounts. This statement has been repeated [by mainstream "professional" cranks who haven't grasped the subtle difference between fact-based science and authority-based religion/belief, and that no amount of "professional" dogma can overrule the need for factual evidence in science. This is unlike subjective fields such as politics, education and religion, where the student must give exam answers which conform to groupthink ideology rather than to the facts, if the facts differ from the mainstream consensus behind the examinations board. Students who pass such exams by giving the "right" answers to subjective controversies are often instilled with a confusion between fact and speculative dogma, and as a result they defend crackpot mainstream beliefs as if those beliefs were science, not lies; the only way they can defend such lies is by personal abuse of those who present factual evidence, and by lying, since they have no factual evidence and no scientific basis for arguing their case, just vacuous assertions based on ignorance and a refusal to read the facts and act upon them] in an authoritative manner. Actually there can be little doubt that radiation hurts the individual cell. But a living being is a most complex thing. Damage to a small fraction of the cells might be beneficial to the whole organism. Some experiments on mice seem to show that exposure to a little radiation increases the life expectancy of the animals. Scientific truth is firm - when it is complete. The evidence of what a little radiation will do in a complex animal like a human being is in an early and uncertain state."

    On pages 121-122, the book points out that Denver in the United States is at an altitude of 5000 feet above sea level, and so receives 43% more cosmic radiation (because there is less air shielding between it and outer space) than is received at sea level.

    The bone cancer and leukemia rates in Denver, where the 5000-foot altitude causes a 43% increase in cosmic radiation, were significantly lower than those in the sea-level cities of San Francisco and New Orleans in 1947 (before any nuclear test fallout arrived).

    For example, there were 10.3 leukemia cases diagnosed per 100,000 of population in San Francisco in 1947, and only 6.4 in Denver.

    On page 122, Drs. Teller and Latter analyse the results as follows:

    "One possible explanation for the lower incidence of bone cancer and leukemia in Denver is that disruptive processes like radiation are not necessarily harmful in small enough doses. Cell deterioration and regrowth go on all the time in living creatures. A slight acceleration of these processes could conceivably be beneficial to the organism."

    Actually, the mechanism is more subtle: protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals, including mice, rats and dogs. P53 continually repairs breaks in DNA, which easily breaks at body temperature due to free radicals produced naturally in various ways and also as a result of ionisation caused by radiation hitting water and other molecules in the body. Cancer occurs when several breaks in DNA happen to occur by chance at nearly the same time, giving several loose ends which P53 repairs incorrectly, causing a mutation. This cannot occur when only one break occurs, because only two loose ends are produced, and P53 will reattach them correctly. If low-LET ionising radiation levels are increased to a certain extent, causing more single-strand breaks, P53 works faster and is able to deal with breaks as they occur, so that multiple broken strand ends do not arise. This prevents DNA strands being repaired incorrectly, and prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation of course overloads the P53 repair mechanism, which then cannot repair breaks as they occur, so multiple breaks begin to appear and loose ends of DNA are wrongly connected by P53, causing an increased cancer risk. Obviously there is a statistical risk: quite a lot of wrongly reassembled broken DNA needs to accumulate before the result causes cancer. Many wrongly assembled DNA strands simply result in the death of the cell when it tries to divide, instead of allowing endless divisions into defective cells, i.e. cancer cells. Besides P53, there are other proteins involved in DNA repair after damage. Over 50% of all cancers, however, result from mutated forms of P53 which are unable to repair damaged DNA.

    So it is clear that most cancers occur as a result of a rapid double break to the TP53 gene on human chromosome 17. The cell then divides normally, but the resulting cell produces its P53 from a mutated TP53 gene and thus produces a flawed P53 protein, which is unable to properly repair further damage to the DNA of the cell. As a result, these cells are subject to cumulative damage and mutations from free radicals, and are relatively likely to become cancer cells. The stimulation of P53 by low-LET (weakly ionising) radiation can boost its efficiency, preventing multiple strand breaks from having time to occur, because breaks get repaired faster before a backlog can accumulate. This is a homeostasis effect: an increase in the rate of weak ionisation from low-LET radiation naturally causes the body to slightly over-respond, increasing the rate of P53 repairs non-linearly (similarly, the body over-responds for a long time after an infection by boosting the white cell count to levels higher than those which existed before the infection). This disproportionate over-compensation boosts the body's ability to cope with other causes of DNA damage, such as natural causes, so the net effect is a reduction in natural cancer rates that far outweighs the trivial radiation damage at low dose rates. Hence, the overall cancer risk from low-LET radiation at low dose rates is less than it would be in the absence of the radiation.
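
    The coincidence argument above can be illustrated with a toy calculation (purely schematic, with made-up rates, and not biologically calibrated): if breaks arrive at random and each takes a mean time tau to repair, the number of unrepaired breaks present at any moment is roughly Poisson-distributed with mean lam*tau, and mis-repair requires two or more breaks to coexist.

```python
import math

# Toy model of the coincidence argument (illustrative only).  Breaks
# arrive at rate lam per hour and are repaired in mean time tau hours,
# so the number of unrepaired breaks present at once is roughly
# Poisson with mean lam*tau; mis-repair needs >= 2 breaks at once.
def p_multiple_breaks(lam, tau):
    mean = lam * tau
    # P(k >= 2) for a Poisson distribution with the given mean
    return 1.0 - math.exp(-mean) * (1.0 + mean)

baseline = p_multiple_breaks(lam=1.0, tau=0.1)
# Double the break rate, but suppose repair speeds up fourfold:
stimulated = p_multiple_breaks(lam=2.0, tau=0.025)
print(stimulated < baseline)  # prints True
```

    With these made-up numbers, doubling the break rate while quadrupling the repair speed lowers the chance of simultaneous breaks, which is the qualitative point of the homeostasis argument.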

    Teller and Latter then point out that if there is an effect of the enhanced cosmic radiation in Denver on the leukemia and bone cancer rate as compared to lower altitude cities, "the effect is too small to be noticed compared to other effects."

    In other words, this factual data as of 1947 set a limit on how bad the radiation-induced leukemia rate could be: if it existed at all, it was dwarfed by "noise" in the data. Whenever some signal gets drowned by "noise" in data, then the real scientist starts to investigate the "noise" which is more important than the trivial signal. (This was directly how the big bang was confirmed, when the microwave background noise in the sky was investigated in the mid-1960s and found to be severely red-shifted fireball radiation from the big bang.)

    On page 124, it is pointed out that mortality statistics - which don't show a decrease in cancer risks from living in places of high cosmic radiation exposure like Denver - and which therefore don't show any negative risks from low level radiation, do show correlations between other things. For example, being 10% overweight reduces life expectancy by 1.5 years, while smoking one pack of cigarettes a day reduces life expectancy by 9 years (equivalent to an average of 15 minutes reduction in life per cigarette smoked).
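
    The "15 minutes per cigarette" figure can be sanity-checked with one line of arithmetic; the 50-year smoking span assumed here is illustrative:

```python
# Assume (illustratively) a pack of 20 a day for 50 years, and the
# quoted 9-year loss of life expectancy.
cigarettes_smoked = 20 * 365.25 * 50        # about 365,000 cigarettes
minutes_lost = 9 * 365.25 * 24 * 60         # 9 years, in minutes
print(round(minutes_lost / cigarettes_smoked))  # prints 13
```

    That is the same order as the quoted 15-minute average; the exact figure depends on the smoking duration assumed.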

    These are things which are real, statistically significant risks. Low-LET radiation at low dose rates isn't that kind of problem (to say the very least of it).

     
    At 5:35 pm, Blogger nige said...

    For more about Lewis's non-threshold propaganda campaign "and the debate about nuclear weapons testing", see:

    http://etd.caltech.edu/etd/available/etd-03292004-111416/unrestricted/LewisandFallout.pdf

    EDWARD LEWIS AND RADIOACTIVE FALLOUT
    THE IMPACT OF CALTECH BIOLOGISTS ON THE DEBATE
    OVER NUCLEAR WEAPONS TESTING IN THE 1950s AND 60s
    Thesis by
    Jennifer Caron
    In Partial Fulfillment of the Requirements for the
    degree of
    Bachelor of Science
    Science, Ethics, and Society Option
    CALIFORNIA INSTITUTE OF TECHNOLOGY
    Pasadena, California
    2003
    (Presented January 8, 2003)


    "ACKNOWLEDGEMENTS
    Professor Ed Lewis, I am deeply grateful to you for sharing your story and spending
    hours talking to me. ...

    "ABSTRACT
    The work of Caltech biologists, particularly Edward Lewis, on leukemia and ionizing radiation transformed the public debate over nuclear weapons testing. The United States began testing hydrogen bombs in 1952, sending radioactive fallout around the globe. Earlier, more localized fallout was generated starting in 1945 from tests of atomic weapons at Nevada test sites. The Atomic Energy Commission claimed the tests would not harm human health. Geneticists knew from animal and plant experiments that radiation can cause both illness and gene mutations. They spoke out to warn the policymakers and the public. Edward Lewis used data from four independent populations exposed to radiation to demonstrate that the incidence of leukemia was linearly related to the accumulated dose of radiation. He argued that this implied that leukemia resulted from a somatic gene mutation. Since there was no evidence for the existence of a threshold for the induction of gene mutations down to doses as low as 25 r, there was unlikely to be a threshold for the induction of leukemia. This was the first serious challenge to the concept that there would be a threshold for the induction of cancer by ionizing radiation. Outspoken scientists, including Linus Pauling, used Lewis's risk estimate to inform the public about the danger of nuclear fallout by estimating the number of leukemia deaths that would be caused by the test detonations. In May of 1957 Lewis's analysis of the radiation-induced human leukemia data was published as a lead article in Science magazine. In June he presented it before the Joint Committee on Atomic Energy of the US Congress."
    (Emphasis added to key points.)

    Page 13:

    "The most controversial aspect of his analysis was the linear dose-response curve. This relationship made sense to geneticists who had found a linear relationship between radiation and mutations in Drosophila down to 25 rad (Stern and Spencer). Additionally, it fit with the hypothesis of Muller that cancer could result from somatic mutations. This was not the accepted idea in other scientific and medical communities. Rather, as the official voice, the AEC medical doctors and scientists promoted the assumption that there would be a threshold below which radiation would do no harm, just as there is frequently such a threshold in chemical toxicology because the body can process small quantities of toxins like alcohol. The AEC vocally assumed and defended the threshold hypothesis; furthermore, they seem to have assumed that the amount of radiation received by Americans from fallout would be less than the threshold. Lewis found no evidence for such a threshold, and the AEC scientists were unable to offer any."

    (Emphasis added to Lewis's failure to discover the facts about low level radiation, and to the misinterpretation of that failure as established fact rather than as an expression of ignorance. If a scientist fails to find evidence which in fact exists, that is hardly an accomplishment to be hyped or applauded. Lewis failed to find the evidence of a threshold because the dosimetry then available from Hiroshima and Nagasaki was too crude and inaccurate to produce accurate, detailed results. If Lewis had made efforts to obtain the facts, instead of presenting error as fact and crusading to promote it in journals like Science and in testimony to U.S. Congressional Hearings, he would have been doing science, not pseudoscience.)

     
    At 10:54 am, Blogger nige said...

    To make the mechanism easily understood, one simple analogy to the roles of protein P53, cancer and radiation is a gasoline dump:

    1. DNA-damaging free radicals are equivalent to a source of sparks which is always present naturally, and are caused by many interactions including those of ionizing radiation produced by many other causes in the body.

    2. Cancer is equivalent to the fire you get if the sparks are allowed to ignite the gasoline, i.e. if the free radicals are allowed to damage DNA without the damage being repaired.

    3. Protein P53 is equivalent to a fire suppression system which is constantly damping out the sparks, or repairing the damaged DNA so that cancer doesn't occur.

    In this way of thinking, the "cause" of cancer will be down to a failure of the P53 to repair the damage.

    Naturally, the majority of cancers involve cells containing mutated P53: to get cancer naturally you usually need to have a mutation in a cell's P53 protein, which stops P53 from repairing DNA.

    In other words, cancer appears when the cancer suppressor is damaged.

    In nuclear radiation induced cancer, the mechanism is just slightly different: radiation induced cancer occurs where the radiation level is so great that it overwhelms the ability of P53 to repair the damage to the DNA.

    However, there is another effect. As the radiation level increases, the rate of P53 repairs increases slightly faster than the DNA damage rate. This is because the body naturally detects radiation damage as an increase in free radicals (chemical-type poisoning) and over-compensates for this increase by dramatically increasing the P53 activity in repairing damaged DNA (cf. the old adage "a little of what does you harm, makes you stronger").

    Only when the radiation level is higher than the maximum rate at which P53 can repair broken DNA does the cancer rate start to rise. Up to that level, the increasing P53 activity over-compensates for the radiation damage by repairing DNA much more quickly than normal, in an attempt to return the body to homeostasis.

    As an analogy, think about flu: once you get infected the body's immune system must over-compensate, not just "tread water" in just keeping the infection level from rising.

    It's inadequate for the immune system to respond to rampant infection by merely increasing the attacks on bacteria (which surge through tissues damaged by the flu virus, and cause the worst symptoms) at the same rate that the bacteria are growing.

    If the rate of response of the immune system was the same as that of the cause of the problem, then the immune system would merely be containing the infection and preventing it from getting worse.

    That's not good enough.

    Instead, the immune system needs to increase the rate of attack on bacteria to a higher value than the rate at which the bacteria are multiplying, in order not merely to prevent the infection from getting worse, but to actually kill off the bacteria faster than they multiply. Only in this case can the population of bacteria decrease, instead of remaining constant, as would be the case if the immune system response were matched to the infection level.

    Similarly, in a war, if you only respond with exactly the same amount of force as your enemy, you will be able to prevent the enemy winning, but you won't be able to end the war! The battle will go on without end. The only way to win is for one side to use more force than the other. If both sides always remain equal in strength, then the war will last forever.

    Protein P53 inside individual cell nuclei, by analogy to the role of T-cells and the white blood cells of the immune system, must over-compensate for increasing problems in order to return the body to normal.

    It is no good if P53's response is identical to the rate of production of DNA damage. P53 must over-compensate to any increased damage, so that the overall amount of excess DNA damage, once it is detected, begins to decrease with time instead of merely remaining constant.

    Homeostasis is used in many organs and systems. In order for normal conditions to be maintained, as soon as any problem is experienced, the body must over-compensate to push conditions back towards the original conditions, not merely keep problems from getting worse.

    It's not good enough to merely negate additional damage. The body has to over-compensate in order to not just prevent the problem getting worse, but to restore health. And that is precisely what happens if the injury is not overwhelmingly severe.

    Once a fire starts, you don't want to simply respond by preventing it from getting bigger. You want, instead, to make the fire smaller. If the rate of growth of the fire is dF/dt, you don't want your fire-fighting response to equal dF/dt, or you will simply be preventing the fire from getting bigger. What you want is to respond at a rate which exceeds the rate of increase of the fire, so that the size of the fire falls with time instead of remaining constant.

    Similarly, with radiation or any other problem, the body's response at low levels is to over-compensate. This over-compensation will actually reduce the natural cancer rate at low radiation levels.

    At very high radiation levels, this effect disappears and the net response is negative, because damage occurs at such a high rate that the P53 repair mechanism is overloaded and is increasingly unable to repair the damage.

    Another analogy to P53 is the brakes of a vehicle. Cancer in this analogy is like an automobile crash. If the brakes are defective, that can cause a crash. The ability of a vehicle driver to avoid a crash depends to a considerable extent upon having good brakes. The ability of the brakes to prevent a crash may be impaired by various factors, such as excessive speed or oil on the road. If the brakes are merely capable of preventing the speed from increasing, they are not good enough. Brakes must be able to do more than cancel out acceleration and keep the velocity constant. Brakes must be able to bring about a deceleration, to slow a vehicle.

    It's pretty obvious that protein P53 is able to bring about a net reduction in the natural cancer rate when exposed to low-LET ionizing radiation at "low" dose rates (still many times the natural background dose rate).

    Once the excessive number of free radicals from radiation are detected as a chemical poisoning internally, P53 repair processes are greatly enhanced to over-compensate for the damage rate, in order to reduce the total amount of damage (rather than merely keeping it in check, or constant). By analogy, in any infection problem, homeostasis mechanisms act to restore the equilibrium not by keeping the damage level constant, but increasing the repair rate so that it exceeds the rate of damage. Only in this way can the total amount of damage be reduced.
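The over-compensation argument above can be sketched as a toy dose-response model (purely illustrative, with invented parameters, not fitted to any data): damage rises linearly with dose rate, the repair response over-shoots at low dose rates but saturates at a maximum capacity, so net risk dips below the zero-dose baseline before rising sharply.

```python
# Toy hormesis-style dose-response sketch (illustrative only; all parameters invented).
def damage(d, natural=1.0):
    # DNA damage rate grows linearly with dose rate d above a natural background.
    return natural + d

def repair(d, natural=1.0, boost=2.0, capacity=5.0):
    # Over-compensating repair response that saturates at a maximum capacity.
    return min(natural + boost * d, capacity)

def net_risk(d):
    # Unrepaired damage rate, relative to zero when repair matches damage.
    return damage(d) - repair(d)

print(net_risk(0.0))    # 0.0  : repair matches natural damage
print(net_risk(1.0))    # -1.0 : over-compensation -> net reduction in risk
print(net_risk(10.0))   # 6.0  : repair saturated -> net harm
```

The three printed values reproduce the qualitative curve argued for in the text: zero at baseline, negative (beneficial) at low dose rates, positive (harmful) once repair capacity is overwhelmed.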

     
    At 12:23 pm, Blogger nige said...

    copy of a comment in moderation queue to:

    http://sovietologist.blogspot.com/2008/04/funnist-thing-ever-said-about-herman.html

    dv8 2xl:

    If you actually read Kahn's most important work, On Thermonuclear War, the key arguments against wishful thinking are based on facts, not "opinions".

    Fact: arms control was tried throughout the 1930s to enable the world to "live in peace" with the Nazis.

    Fact: the Nazis simply agreed to everything then broke their word, broke the written agreements they gave to Prime Minister Chamberlain at Munich, etc.

    Fact: arms control does not protect you from other countries with secret rearmament programs.

    Fact: Hitler's Germany were able to almost instantly convert peacetime factories to munitions factories, by simply preparing the plans and blueprints. No practical arms-inspection policy can get around that.

    Fact: even if arms control and pacifism had "succeeded" in preventing World War II (which of course they failed to do), millions would still have died in concentration camps, and "peaceful invasions" could not have been prevented.

    Fact: if you want to prevent evil, you need leverage, not worthless paper agreements. The only leverage the Stalins and Hitlers understand is bombs. Everything else is propaganda and lies as far as they are concerned. Dictators aren't interested in being seen as respectable nice guys who stick to contracts.

    As Herman Kahn wrote, Khrushchev's proposal for arms control - whereby no inspections of Russian disarmament were allowed, and anyone cheating would (in Khrushchev's words) be expected to "cover themselves in shame" - was a hoax. The Soviets never covered themselves in shame. They broke the testing cessation in 1961 and detonated a 50 megaton bomb. They were proud, not covered in shame.

    Fact: the only way to encourage peace and freedom is to carry a big stick and be seen to be ready to actually USE the big stick. Having civil defence - even just improvised plans like the Kearny car-over-trench shelter that anyone can fix up in the time between a bomb going off and the fallout arriving and building up to a hazardous level downwind - is crucial. Three feet of dirt and you're safe. If you look at the fallout patterns actually measured after nuclear tests with the average yield of stockpiled bombs today, the danger is way exaggerated. Also, the fallout in hazardous areas is clearly visible. Walk crosswind, and you can get out of the danger area before you get a dangerous dose. All nuclear effects are grossly exaggerated. It's pretty easy to grasp this when you understand the physics, instead of believing uneducated hype and spin.

    Unless you can find some wood-frame cities like Hiroshima and Nagasaki to detonate the bombs over, the effects are not as impressive as the hype claims. Even in Hiroshima and Nagasaki, any kind of screening from the thermal flash (whose severe effects were stopped by just a single leaf, a thin white shirt, or a sheet of paper) cut casualty rates massively. Duck-and-cover does work. Nuclear radiation produced high mortality only when combined with thermal burns: this is the "synergism" effect, because the mechanism for death is that radiation reduces the white blood cell count at just the time when skin burn blisters burst and become infected. If you avoid thermal burns, the LD50 for nuclear radiation is about three times higher. That's why ducking and covering is so vital. It also reduces the amount of debris that can hit you in the face (like flying glass). Most of the people killed in Hiroshima and Nagasaki looked at the fireball, often through glass windows, as the blast wave was silently approaching. Films of nuclear explosions which superimpose the sound of the blast on to the fireball with no delay time mislead viewers about the time-sequence of the effects of nuclear weapons. Similarly, you get some time after a nuclear explosion to evacuate or prepare an improvised shelter before the fallout even starts to arrive. Philip LaRiviere in the 1950s measured the arrival times and times of maximum fallout dose rate after a range of Nevada and Pacific nuclear tests, in report USNRDL-TR-137 ("The Relationship of Time of Peak Activity from Fallout to Time of Arrival", U.S. Naval Radiological Defense Laboratory, 28 February 1957). On average, even once fallout begins to arrive, it settles diffusively and takes a long time to reach peak activity: the interval from first arrival to the time of peak radiation level is roughly as long as the time it took the fallout to begin arriving in the first place.
    So as with the delayed double heat flash pulse and the delayed arrival of the blast, you have enough time to protect yourself or evacuate from a potential downwind fallout area. If fallout begins to arrive, you can see it. It's clearly visible wherever the dose rate is life-threatening.

    The point is, nuclear weapons are not automatically going to produce a lot of civilian casualties if there is a reasonable civil defense education in the reliable (nuclear test based) facts.

    If you are going to deter dictators from walking all over you like Hitler and Stalin, then you need to be tough. Toughness is the only thing that deters the sort of trash who don't care about human values at all.

     
    At 1:17 pm, Blogger nige said...

    copy of a comment in moderation queue to:

    http://sovietologist.blogspot.com/2008/04/new-toon-et-al-study-on-regional.html

    The TTAPS (Toon et al.) studies have been wrong from day one. In 1983 they used flawed assumptions for everything, from the absorption coefficient for sunlight by soot to the neglect of scavenging and atmospheric turbulence. They also exaggerated the burnability of the fuel loading.

    When a building collapses, most of the combustible material is buried under tons of debris and dust and can't burn. You don't get firestorms anymore like you did in wood-frame buildings such as those in the old, medieval part of Hamburg or Dresden, or Hiroshima and Nagasaki.

    In addition, they ignored the fact that for surface bursts (unlike the air bursts over Japan) the EMP deposition region will overlap the ground surface and couple thousands of amps in microsecond surges into all the electrical conductors, which would branch out throughout the city before the crater had even formed, and would blow all the fuses/circuit breakers and cut off the electricity supply to buildings, reducing the fire risk.

    For a surface burst, the fireball elevation angle is such that most buildings will be "shadowed" by other buildings, preventing ignition.

    In their 1989 paper, the TTAPS team failed to retract their earlier errors and apologise for hyping poorly researched trash, and instead changed the targeting assumptions to oil refineries, in an attempt to maintain some climatic effects. It was still wrong: the smoke from mass oil refinery fires doesn't hang around freezing the ground for months. It gets blown around and dispersed by atmospheric winds and turbulence, and is washed out by rainfall.

    Another popular myth is that the entire crater volume gets converted into dust which enters the stratosphere. Actually, 99% of the apparent crater volume is due to compression of the ground and material dumped around the crater to form the crater "lip" and the ejecta zone surrounding the lip. Only 1% of the cratered mass ends up in the atmosphere, and that forms the fallout, 70% of which is deposited within 24 hours.
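On the figures quoted above (1% of the apparent crater mass lofted, 70% of that deposited as local fallout within 24 hours), the fraction of the crater mass still airborne after a day is tiny; a quick check of the bookkeeping:

```python
# Check of the crater-mass bookkeeping quoted in the text.
lofted_fraction = 0.01       # 1% of the apparent crater mass enters the atmosphere
deposited_24h = 0.70         # 70% of that is deposited as fallout within 24 hours

airborne_after_24h = lofted_fraction * (1 - deposited_24h)
print(round(airborne_after_24h, 4))   # 0.003 -> only 0.3% of the apparent crater mass
```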

    On the topic of ozone depletion, please notice that the prompt gamma rays from a nuclear explosion ionize the air, creating ozone. This effect seriously modifies the early-time history of the thermal pulse output, and has been intensively studied in nuclear tests (although early studies were classified).

    Hence, the production of ozone-destroying nitrogen oxides in the air shock wave at high overpressures must be balanced against the production of ozone by gamma rays.

    For increasing burst altitude, the amount of ozone produced by a nuclear explosion becomes greater than the nitrogen oxide ozone depletion effect, because at high altitudes the air shock wave does not reach sufficient overpressure to produce nitrogen oxides (the equilibrium concentration of nitrogen oxides is a strong function of the pressure).

    Hence, high altitude bursts - which have been threatened on the West by Russian leaders due to EMP effects - will actually increase the amount of ozone in the stratosphere!

    In addition, the net ozone depletion by a low altitude burst is a lot less than 1970s and 1980s predictions (which ignored the production of ozone in nuclear explosions) suggested. Much of the nitrogen oxides combine with water vapour in the fireball to form nitric acid, which eventually gets washed out of the atmosphere by rain and doesn't affect ozone. See:

    "Nitrogen oxides, nuclear weapon testing, Concorde and stratospheric ozone" P. Goldsmith, A. F. Tuck, J. S. Foot, E. L. Simmons & R. L. Newson, published in Nature, v. 244, issue 5418, pp. 545-551, 31 August 1973:

    "ALTHOUGH AMOUNTS OF NITROGEN OXIDES EQUIVALENT TO THE OUTPUT FROM MANY CONCORDES WERE RELEASED INTO THE ATMOSPHERE WHEN NUCLEAR TESTING WAS AT ITS PEAK, THE AMOUNT OF OZONE IN THE ATMOSPHERE WAS NOT AFFECTED."

    Below is an extract from a British Civil Defence magazine article written by George R. Stanbury, head of civil defence research on the British "Operation Hurricane" nuclear bomb test at Monte Bello, and before that an expert on the incendiary bombing of Britain in World War II.

    'Restricted' classified U.K. Home Office Scientific Adviser's Branch journal Fission Fragments, W. F. Greenhalgh, Editor, London, Issue Number 3, August 1962, pages 22-26:

    'The fire hazard from nuclear weapons

    'by G. R. Stanbury, BSc, ARCS, F.Inst.P.

    'We have often been accused of underestimating the fire situation from nuclear attack. We hope to show that there is good scientific justification for the assessments we have made, and we are unrepentant in spite of the television utterances of renowned academic scientists who know little about fire. ...

    'Firstly ... the collapse of buildings would snuff out any incipient fires. Air cannot get into a pile of rubble, 80% of which is incombustible anyway. This is not just guess work; it is the result of a very complete study of some 1,600 flying bomb [V1 cruise missile] incidents in London supported by a wealth of experience gained generally in the last war.

    'Secondly, there is a considerable degree of shielding of one building by another in general.

    'Thirdly, even when the windows of a building can "see" the fireball, and something inside is ignited, it by no means follows that a continuing and destructive fire will develop.

    'The effect of shielding in a built-up area was strikingly demonstrated by the firemen of Birmingham about 10 years ago with a 144:1 scale model of a sector of their city which they built themselves; when they put a powerful lamp in the appropriate position for an air burst they found that over 50% of the buildings were completely shielded. More recently a similar study was made in Liverpool over a much larger area, not with a model, but using the very detailed information provided by fire insurance maps. The result was similar.

    'It is not so easy to assess the chance of a continuing fire. A window of two square metres would let in about 10^5 calories at the 5 cal/(cm)^2 range. The heat liberated by one magnesium incendiary bomb is 30 times this and even with the incendiary bomb the chance of a continuing fire developing in a small room is only 1 in 5; in a large room it is very much less.

    'Thus even if thermal radiation does fall on easily inflammable material which ignites, the chance of a continuing fire developing is still quite small. In the Birmingham and Liverpool studies, where the most generous values of fire-starting chances were used, the fraction of buildings set on fire was rarely higher than 1 in 20.

    'And this is the basis of the assertion [in Nuclear Weapons] that we do not think that fire storms are likely to be started in British cities by nuclear explosions, because in each of the five raids in which fire storms occurred (four on Germany - Hamburg, Darmstadt, Kassel, Wuppertal and a "possible" in Dresden, plus Hiroshima in Japan - it may be significant that all these towns had a period of hot dry weather before the raid) the initial fire density was much nearer 1 in 2. Take Hamburg for example:

    'On the night of 27/28th July 1943, by some extraordinary chance, 190 tons of bombs were dropped into one square mile of Hamburg. This square mile contained 6,000 buildings, many of which were [multistorey wooden] medieval.

    'A density of greater than 70 tons/sq. mile had not been achieved before even in some of the major fire raids, and was only exceeded on a few occasions subsequently. The effect of these bombs is best shown in the following diagram, each step of which is based on sound trials and operational experience of the weapons concerned.

    '102 tons of high explosive bombs dropped -> 100 fires

    '88 tons of incendiary bombs dropped, of which:

    '48 tons of 4 pound magnesium bombs = 27,000 bombs -> 8,000 hit buildings -> 1,600 fires

    '40 tons of 30 pound gel bombs = 3,000 bombs -> 900 hit buildings -> 800 fires

    'Total = 2,500 fires

    'Thus almost every other building [1 in 2 buildings] was set on fire during the raid itself, and when this happens it seems that nothing can prevent the fires from joining together, engulfing the whole area and producing a fire storm (over Hamburg the column of smoke, observed from aircraft, was 1.5 miles in diameter at its base and 13,000 feet high; eyewitnesses on the ground reported that trees were uprooted by the inrushing air).

    'When the density was 70 tons/square mile or less the proportion of buildings fired during the raid was about 1 in 8 or less and under these circumstances, although extensive areas were burned out, the situation was controlled, escape routes were kept open and there was no fire storm.'
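Stanbury's figures in the article quoted above can be checked arithmetically: the 2 m² window at 5 cal/cm² does give 10^5 calories, and the 2,500 fires among 6,000 buildings give roughly the 1-in-2 initial fire density he cites as the firestorm threshold.

```python
# Checks of the figures in Stanbury's article quoted above.

# Window thermal energy: a 2 square metre window at the 5 cal/cm^2 range.
window_cm2 = 2 * 100 * 100                 # 2 m^2 expressed in cm^2
calories_in = 5 * window_cm2
print(calories_in)                         # 100000, i.e. 10^5 calories

# Hamburg raid fire total and initial fire density.
total_fires = 100 + 1600 + 800             # HE bombs + 4 lb magnesium + 30 lb gel
buildings = 6000
print(total_fires, round(buildings / total_fires, 1))   # 2500 fires, ~1 in 2.4 buildings
```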

     
    At 1:47 pm, Blogger nige said...

    Copy of a comment to:

    http://sovietologist.blogspot.com/2008/04/new-toon-et-al-study-on-regional.html

    My comment about the fact that high altitude nuclear explosions produce an excess of ozone (by gamma ray emission) without producing nitrogen oxides that destroy ozone (nitrogen oxide formation requires a very compressed shock wave, which can't occur in low density air at high altitude), needs the following reference:

    U.S. Congress Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack, 2004. These EMP hearings discuss the politics, such as an outrageous threat allegedly made by the Russian Ambassador to the U.S., Vladimir Lukin, who said to the Americans in Vienna in May 1999: 'we have the ultimate ability to bring you down [by EMP from high altitude nuclear detonations]'.

     
    At 11:10 am, Blogger nige said...

    copy of a comment to

    http://riofriospacetime.blogspot.com/2008/05/science-of-iron-man.html

    "The Tokomak is a donut-shaped magnetic bottle for containing hot plasma. Controlled fusion has long held the promise of limitless energy, but requires temperatures and pressures similiar to the Sun's interior. Despite decades of work, controlled fusion remains as it was in 1964, just around the corner."

    Controlled nuclear fusion by magnetic confinement of hot plasma is a joke. Strong magnetic fields are never perfectly uniform, and the plasma pressure needed to make deuterium and tritium nuclei fuse is immense, so instabilities always develop.

    The situation is similar to trying to use a low-density fluid to compress a higher-density fluid: a form of Rayleigh-Taylor instability develops.

    The magnetic field causes the plasma to not be uniformly compressed, but to break up into jets where the magnetic field is slightly weaker. Because you can't make the magnetic field perfectly uniform, this is inevitable.

    It's like squeezing an orange with your hands. You don't end up with a uniformly compressed orange. You end up with juice squirting into somebody's eye.

    The radioactive waste from a controlled nuclear fusion reactor, if it could be made to work efficiently, would in practical terms be even worse than that from nuclear fission!

    At least the 300 fission products decay, as a mixture, faster than the inverse of time. The fission product dose rate falls as about t^{-1.2} where t is time after fission. In any case, fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:
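The t^{-1.2} decay law quoted above is the basis of the familiar civil defence "7-10 rule": every seven-fold increase in time after burst cuts the fallout dose rate by roughly a factor of ten. A quick numerical check:

```python
# Way-Wigner fallout decay: dose rate relative to the 1-hour reference value.
def relative_dose_rate(t_hours):
    return t_hours ** -1.2

# "7-10 rule": 7x the time -> roughly 1/10 the dose rate.
print(round(relative_dose_rate(7), 3))    # ~0.097 (about 1/10 after 7 hours)
print(round(relative_dose_rate(49), 4))   # ~0.0094 (about 1/100 after ~2 days)
```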

    "Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

    - http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml

    But for fusion, you get the accumulation of relatively long lived iron-59, iron-55, cobalt-60, nickel-63, and many other nuclides which are caused by the capture in reactor materials of high energy neutrons from the fusion process. E.g., the fusion of tritium and deuterium releases 17.6 MeV, of which 14.1 MeV is carried by the neutron. This massive neutron energy is to be compared to the thermalized neutrons of 0.025 eV energy! As a result, whereas in fission you can reprocess the fuel rods to extract the radioactive waste without the whole reactor becoming dangerously radioactive, in fusion the whole reactor becomes almost uniformly contaminated by neutron capture in the structural elements! There is nothing you can do about this.
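The energy contrast in the paragraph above is worth quantifying: a 14.1 MeV D-T fusion neutron carries over half a billion times the energy of a 0.025 eV thermal neutron.

```python
# Ratio of D-T fusion neutron energy to thermal neutron energy.
fusion_neutron_ev = 14.1e6     # 14.1 MeV expressed in eV
thermal_neutron_ev = 0.025     # thermal neutron energy in eV

ratio = fusion_neutron_ev / thermal_neutron_ev
print(f"{ratio:.2e}")          # ~5.64e+08
```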

    Controlled nuclear fusion has a lot in common with string theory in terms of over-hype, and failure. The most sensible way to use safe nuclear fusion energy is to further develop solar power and other ways to extract the energy of fusion being carried out in the sun's core.

     
    At 6:22 pm, Blogger nige said...

    copy of a comment to

    http://backreaction.blogspot.com/2008/05/nuclear-power-return-of.html

    "Nuclear's OK, but cars can't run on nuclear, so how can that really be a solution?" - Andrew

    Nuclear power doesn't burn fossil fuels, which leaves more of those fuels for powering the internal combustion engine rather than generating electricity.

    Cars can eventually (when fossil fuel costs make the price of gasoline too much for most people to afford) be fitted with electric motors run on electricity using efficient, low-weight rechargeable lithium-ion batteries, and these can be recharged from mains supplied by nuclear reactors.

    Obviously, electric trains can run on nuclear generated electricity without any interim battery storage.

    The thing about nuclear power is that it is excessively expensive due to excessive safety precautions, and it is also a victim of lying propaganda from the environmental lobby which doesn't understand nuclear power in the proper context of natural radiation background levels and natural radon gas hazards, or even the naturally proved storage of intense radioactive waste for billions of years!

    Fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:

    "Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

    - http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml

    Data from Hiroshima and Nagasaki is strongest (most evidence) for low doses, where it shows a suppression and a threshold for low-LET (linear energy transfer) radiation such as gamma rays. See my post here for a discussion of the extensive evidence.

    High-LET radiation like alpha particles deposits a great deal of energy per unit path length through tissue, which can overwhelm the natural repair mechanism in which the protein P53 sticks broken DNA fragments back together. The main cancer risk comes from multiple DNA strand breaks, where fragments are rejoined in the wrong sequence, either killing the cell when it later tries to divide or, more seriously, producing an out-of-control damaged cell line which causes a tumour.

    But high-LET radiation like alpha particles is only a hazard internally, when radioactive material is inhaled or ingested. The alpha-emitting plutonium in a nuclear reactor is sealed inside the aluminium-clad fuel elements, and at no stage is such waste a serious inhalation or ingestion hazard.

    Gamma radiation is low-LET radiation, and it does exhibit a threshold before any excess cancer risk (predominantly leukemia) shows up. The evidence comes from Hiroshima and Nagasaki, and from the Taiwan incident in which 180 buildings occupied by 10,000 people for up to 20 years were constructed of steel accidentally contaminated with intensely radioactive cobalt-60 from discarded radiotherapy sources. There is evidence that the threshold for low-LET radiation such as gamma rays depends on the dose rate at which the radiation is received, not merely on the total dose. If the dose rate produces DNA damage more slowly than the maximum rate at which P53 can repair DNA strand breaks, no excess cancer (above the natural cancer rate) occurs. The cancer risk is proportional to the fraction of the dose received at a rate exceeding the repairable DNA damage rate.
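    As a minimal numerical sketch of this dose-rate-threshold idea (the threshold value and the risk coefficient below are purely illustrative assumptions, not measured constants), the model can be expressed in a few lines of Python:

```python
# Sketch of a dose-rate-threshold model: only the part of the dose delivered
# faster than the DNA repair rate contributes to excess cancer risk.
# Both constants below are ILLUSTRATIVE assumptions, not measured values.

REPAIR_THRESHOLD = 0.01  # Sv/hour; assumed maximum repairable damage rate
RISK_PER_SV = 0.05       # assumed excess risk per Sv of above-threshold dose

def effective_dose(dose_rate_sv_per_hr, hours):
    """Dose received at a rate exceeding the repair threshold."""
    excess_rate = max(0.0, dose_rate_sv_per_hr - REPAIR_THRESHOLD)
    return excess_rate * hours

def excess_risk(dose_rate_sv_per_hr, hours):
    """Excess cancer risk from the above-threshold portion of the dose."""
    return RISK_PER_SV * effective_dose(dose_rate_sv_per_hr, hours)

# Same 1 Sv total dose, delivered at two very different rates:
chronic = excess_risk(0.001, 1000)  # 1 Sv over 1000 h: below threshold, zero risk
acute = excess_risk(1.0, 1.0)       # 1 Sv in 1 h: almost all above threshold
```

    The point of the sketch is that the two exposures have identical total dose but radically different predicted risk, which is exactly why a monitor recording only integrated dose cannot assess the hazard.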

    W. L. Chen, Y. C. Luan, M. C. Shieh, S. T. Chen, H. T. Kung, K. L. Soong, Y. C. Yeh, T. S. Chou, S. H. Mong, J. T. Wu, C. P. Sun, W. P. Deng, M. F. Wu, and M. L. Shen, "Is Chronic Radiation an Effective Prophylaxis Against Cancer?", Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here:

    'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

    'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

    'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

    'The experience of these 10,000 persons suggests that long term exposure to [gamma]radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'
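    The quoted figures can be cross-checked with simple arithmetic: 10,000 persons over 20 years is 200,000 person-years, so 7 deaths reproduces exactly the quoted 3.5 per 100,000 person-years, while the spontaneous rate of 116 per 100,000 person-years would have predicted roughly 232 deaths in the same population:

```python
# Cross-check of the Taiwan cobalt-60 figures quoted above.
persons = 10_000
years = 20
person_years = persons * years  # 200,000 person-years of exposure

observed_deaths = 7
# Cancer mortality rate per 100,000 person-years:
observed_rate = observed_deaths * 100_000 / person_years  # 3.5

# General-population rate quoted for Taiwan over the same period:
spontaneous_rate = 116  # per 100,000 person-years
expected_deaths = spontaneous_rate * person_years / 100_000  # about 232
```

    So the observed mortality was roughly 3% of what the spontaneous rate would predict, which is the basis for the paper's hormesis claim (subject, of course, to the age and income caveats the authors themselves note).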

    The fact that leukemia risk is a sensitive function of dose rate, and not just of total dose, means that most radiation monitors worn by nuclear industry workers (which merely record total dose, i.e. integrated dose rate, and give no indication of the mean rate at which that dose was received) are almost useless for assessing risks.

    This has been known and published widely since 1962:

    "... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last compared with 2 per cent incidence in irradiated controls."

    All of this evidence is ignored or censored out of mainstream discussions by bigoted politicians, journalists, editors and environmental quangos. So "Health Physics" (as radiation safety is currently known) isn't really healthy physics any more; instead it is becoming a pseudoscientific exercise in political expediency and the ignoring of evidence.

    Fusion power doesn't look very realistic or safe either, because the high-energy neutrons given off in tritium-deuterium fusion will quite quickly turn the structural materials of the entire fusion reactor radioactive: they have a much greater range than the moderated (thermalized) neutrons in a nuclear fission reactor. So neutron-induced activity is a problem with fusion reactors. You also have to compress the plasma to enormous pressures using electrically controlled magnetic fields, which in a commercial fusion reactor producing gigawatts of power would not exactly have the "fail-safe" safety features of a fission reactor. Any slight upset to the carefully aligned and balanced magnetic fields compressing the fusion plasma could potentially turn the reactor into the scene of an H-bomb explosion, complete with radioactive fallout from the neutron-induced activity in the structural materials. This aspect of fusion power isn't hyped very much in the popular media, either. Could it be that the people working in such areas simply don't want their funding to dry up?

     
    At 11:59 pm, Blogger nige said...

    copy of a comment submitted to moderation queue at the blog:

    http://www.builtonfacts.com/2008/05/15/nuclear-fusion-power/#comment-16


    "That’s where nuclear fusion comes in. Like the sun, it fuses light atoms (hydrogen isotopes, generally) into heavier ones (helium, generally). Radioactivity is produced, but in vastly smaller and easier-to-handle amounts than in nuclear fission plants. But to get fusion to work, the power plant has to produce conditions of extreme heat and adequate pressure to get the hydrogen to fuse in the first place. On one hand this is a perfect safety feature. If a breakdown ever occurred, damage to the reactor instantly destroys the conditions necessary for continued nuclear reactions. And since only a very small amount of fuel is reacting in a given time, a problem instantly and automatically prevents the reactor from causing melting down. It’s a physical impossibility."

    You have a bit of disinformation here for some reason. If you know the physics you choose to write about, you are aware that the easiest fusion process to achieve is tritium + deuterium -> helium-4 + neutron + 17.6 MeV.

    By conservation of momentum, the lighter neutron carries the fraction of the energy equal to the helium's share of the mass: since 80% of the mass is in the helium nucleus and only 20% in the neutron, 80% of the 17.6 MeV, i.e. 14.1 MeV, is carried by the neutron, which can then induce radioactivity in the reactor containment vessel or building.

    In fission, an average of about 200 MeV of energy is released per fission event, of which only about 30 MeV appears as residual radioactivity from the fission products.

    So in fission, about 15% of the energy is released as radioactivity, while in fusion of tritium and deuterium it can be anything up to 80%.
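    The energy partition follows from momentum conservation alone. A quick Python check of the arithmetic (using the approximate mass numbers 4 and 1 for the helium nucleus and the neutron):

```python
# Energy partition in D + T -> He-4 + n. The two products have equal and
# opposite momenta, and kinetic energy E = p^2 / (2m), so the lighter
# neutron carries the fraction m_He / (m_He + m_n) of the released energy.
m_he, m_n = 4.0, 1.0  # approximate mass numbers
q_fusion = 17.6       # MeV released per D-T fusion event

neutron_energy = q_fusion * m_he / (m_he + m_n)  # about 14.1 MeV
fusion_neutron_fraction = neutron_energy / q_fusion  # 0.8

# Compare with fission: ~30 MeV of ~200 MeV per event appears as
# fission-product radioactivity.
fission_radioactive_fraction = 30 / 200  # 0.15
```

    This is the basis of the 80% versus 15% comparison above: the fractions refer to neutron kinetic energy in the fusion case and delayed fission-product energy in the fission case, which is why the two problems take such different forms.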

    Neutron-induced activity is a less severe problem in fission reactors than in experimental fusion reactors, because fission neutrons are thermalized to low energy (about 0.025 eV) and don't irradiate the entire reactor structure, whereas the 14.1 MeV fusion neutrons are highly penetrating and go everywhere, turning structural steel radioactive, etc. This is not "easily handled".

    'On one hand this is a perfect safety feature. If a breakdown ever occurred, damage to the reactor instantly destroys the conditions necessary for continued nuclear reactions. And since only a very small amount of fuel is reacting in a given time, a problem instantly and automatically prevents the reactor from causing melting down. It’s a physical impossibility.'

    To make a nuclear fusion reactor work at an energy density giving the gigawatts of power required for economic or meaningful commercial use, you need a massive amount of fuel at immense pressure, exerted on the conductive (ionized) plasma by strong magnetic fields which squeeze it.

    If anything goes wrong, you get an explosion. Trying to compress a plasma with magnetic fields is like trying to squeeze an orange with your fingers: it tends to slip out, which is one reason why fusion has always been a crackpot activity (all hype, no commercially viable success).

    It is the nuclear fission reactor which is inherently stable, because it has a built-in "fail-safe" design: the control rods fall back in and make it sub-critical if power fails. By contrast, if power to the electromagnets containing plasma at a thousand atmospheres or more in a fusion reactor fails, you get a nuclear explosion as a matter of course.

    The more you go into the details, the more stupid nuclear fusion becomes. If you want to use the easiest-to-achieve fusion reaction, you need tritium as well as deuterium, and tritium is exceedingly expensive (it is produced by bombarding lithium with neutrons in a fission reactor).

    If you want to use just deuterium, the pressure and temperature needed to make the reaction exothermal are far higher, because the reaction has a higher ignition threshold, like a high activation energy in a chemical reaction.

    The 'ITER' reactor page http://www.iter.org/a/index_faq.htm states:

    'The DT fusion reaction creates helium, which is inert and harmless, and neutrons, which can make surrounding materials radioactive for varying amounts of time.'

    This seems to indicate that they are planning to demonstrate the concept using DT fusion, with tritium presumably made at great expense in fission reactors. (Extremely expensive, but cheap compared with the cost of trying to extract the tiny amount of natural tritium in seawater.)

    The whole fusion spin industry is a complete fraud and pseudoscience. If you want to promote safe nuclear fusion energy, make do with sunlight and its derivatives.

    Chernobyl didn't blow up because it was an old design. It blew up because the Soviet RBMK reactor was a stupid design with a massive positive reactivity when the control rods are withdrawn, and because the engineers in charge in April 1986 were cowboys carrying out an obviously dangerous, unauthorized experiment (to see if the reactor could power its own emergency water cooling pumps in the event of losing external electric power). They switched off the water cooling system, switched off all the automatic safety systems (which cannot be switched off in Western reactor designs while the reactor is in use), and then withdrew most of the control rods. The design was stupid because the control rods were driven only by slow electric motors which couldn't insert them quickly in an emergency: full insertion took 18 seconds in the RBMK reactor (versus only 2-3 seconds in Western reactors), and the reactor exploded 40 seconds after the experiment began.

    Also, nuclear fission waste is easy to handle and has been proved safe for 1.7 billion years, which is longer than any other kind of industrial waste has been verified to be safe for!

    Fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:

    "Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

    - http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml

     
    At 9:21 pm, Blogger David Howard said...

    google: we got nuked on 9/11

     
    At 11:19 pm, Blogger nige said...

    Hi David Howard,

    I took a look at your blog and its link to http://wtcdemolition.blogspot.com/ which claims that the World Trade Center twin towers collapse in 2001 was due to an explosion, instead of planes crashing in and weakening the steel frame with burning aviation fuel, which then allowed the floors to collapse under gravity (piling up into an accumulating downward-travelling mass as they fell, the snowplow effect, which soon makes negligible the resistance of each extra floor the immense mass hits; so there is relatively little deviation from free fall, and it soon becomes like dropping a brick on a pile of leaves).

    The alleged evidence given for an explosion is not evidence of an explosion: dust, "extreme high heat in the ground zero rubble (widely-reported/well-substantiated)", etc., are normal results of a heavy mass of building falling a great height and hitting the ground. The kinetic energy is

    E = (1/2)mv^2

    and for gravitational near-free fall velocity v is related to gravitational acceleration g and vertical fall distance s by

    v^2 = 2gs

    Hence

    E = (1/2)mv^2 = (1/2)m(2gs) = mgs.

    Each WTC tower had a structural mass of 169,000,000 kg (mainly structural steel and concrete), and was 417 m high (to the top of the roof, not the spire/antenna). Hence the mean fall distance was 209 m.

    This gives an energy release of

    E = mgs = 169,000,000*9.81*209

    = 3.46*10^11 Joules

    Now remember that 1 kt of TNT is equivalent to 4.184*10^12 J.

    Hence, each of the twin towers released the equivalent of 0.083 kt of TNT just due to the gravitational collapse, neglecting the energy of the aircraft impacts and the aviation fuel.

    This 0.083 kt is in the yield range of the smallest American nuclear bomb, the 23 kg Davy Crockett, so it is equivalent to a very small nuclear explosion.

    So all the alleged "evidence" that the tower collapses had some characteristics of a small nuclear explosion misses the point that the energy released when 169,000 metric tons thuds to the ground after a fall of hundreds of metres is substantial! Of course it has some characteristics of a big explosion, and conventional explosions or collisions do generally release electromagnetic pulses (the heat ionizes material, and if the electrons are detached asymmetrically, a radio-frequency pulse is radiated).
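    The gravitational energy estimate is easy to verify numerically, using only the figures quoted in this comment:

```python
# Gravitational potential energy released by one WTC tower collapse,
# using the figures given above: 169,000,000 kg structural mass,
# mean fall distance of 209 m (half the 417 m height).
mass = 169e6     # kg
g = 9.81         # m/s^2, gravitational acceleration
mean_fall = 209  # m

energy_joules = mass * g * mean_fall  # E = mgs, about 3.46e11 J

# Convert to kilotons of TNT equivalent (1 kt TNT = 4.184e12 J):
kt_tnt = energy_joules / 4.184e12  # about 0.083 kt
```

    The conversion confirms the figure in the text: roughly a twelfth of a kiloton from gravity alone, before counting the aircraft impact and fuel energy.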

    The easy discriminator between a nuclear explosion and a conventional explosion or collapse is obviously the easily traced radioactive fission product signature. Anyone with a portable detector would have been able to detect whether a nuclear explosion had been involved.

    The simplest theory which fits the facts of the World Trade Center twin towers collapse is the most obvious one: the conspiracy was a terrorist group which flew aircraft into the twin towers. That was enough to cause the destruction observed. You don't need to add explosives; the weight of the building, in combination with the damage and the fires from the aviation fuel heating and weakening the steel frame, thereby allowing the floors to fall, was enough to cause all the effects.

    If you want to attack conspiracies, please attack the many real conspiracies instead of imaginary ones, e.g. discredit mainstream string theory for claiming to be a theory of quantum gravity when it predicts nothing, or discredit the conspiracy to misinform people on the effects of nuclear weapons tests and radiation effects as a function of dose rate!

    The problem is, as I'm sure you are aware, that the factual conspiracies just don't interest many people, who prefer more imaginary speculative stuff to sticking to solid evidence.

    However, thanks for your comment!

     
