Click here for the key declassified nuclear testing and capability documents compilation (EM-1 related USA research reports and various UK nuclear weapon test reports on blast and radiation), from nukegate.org

We also uploaded an online-viewable version of the full text of the 1982 edition of the UK Government's Domestic Nuclear Shelters - Technical Guidance, including secret UK and USA nuclear test report references and extracts proving protection against collateral damage, for credible deterrence (linked here).

ABOVE: Russian 1985 1st Cold War SLBM first strike plan. The initial use of Russian SLBM-launched nuclear missiles from off-coast against command and control centres (i.e. nuclear explosions to destroy warning satellite communications centres by radiation on satellites as well as EMP against ground targets, rather than missiles launched from Russia against cities, as assumed by 100% of the Cold War left-wing propaganda) is allegedly a Russian "fog of war" strategy. Such a "demonstration strike" is aimed essentially at causing confusion about what is going on and who is responsible - it is not quick or easy to fingerprint high altitude bursts fired by SLBMs from submerged submarines to a particular country, because you don't get fallout samples to identify isotopic plutonium composition. Russia could immediately deny the attack (implying, probably to the applause of the left-wingers, that this was some kind of American training exercise or computer-based nuclear weapons "accident", similar to those depicted in numerous anti-nuclear Cold War propaganda films). Thinly-veiled ultimatums and blackmail follow. America would not lose its population or even key cities in such a first strike (contrary to left-wing propaganda fiction); as with Pearl Harbor in 1941, it would lose its complacency and its sense of security through isolationism, and would either be forced into a humiliating defeat or a major war.

Before 1941, many warned of the risks but were dismissed on the basis that Japan was a smaller country with a smaller economy than the USA and war was therefore absurd (similar to the way Churchill's warnings about European dictators were dismissed by "arms-race opposing pacifists" not only in the 1930s, but even before WWI; for example, Professor Cyril Joad documents in the 1939 book "Why War?" his first-hand witnessing of Winston Churchill's pre-WWI warning and call for an arms race to deter that war, sneered at by Norman Angell). It is vital to note that there is immense pressure against warnings of Russian nuclear superiority even today, most of it contradictory. E.g. the left-wing (Russian-biased) "experts" whose voices are the only ones reported in the Western media (traditionally led by "Scientific American" and the "Bulletin of the Atomic Scientists") simultaneously claim Russia poses such a complex SLBM and ICBM threat that we must disarm now, while also claiming that their tactical nuclear weapons probably won't work so aren't a threat! In similar vein, Teller-critic Hans Bethe also used to falsely "dismiss" Russian nuclear superiority by claiming (without any more evidence than Brezhnev's word, it appeared) that Russian delivery systems are "less accurate" than Western missiles (as if accuracy has anything to do with high altitude EMP strikes, where the effects cover radii of thousands of miles). Such claims would then be repeated endlessly in the Western media by Russian-biased "journalists" or agents of influence, and any attempt to point out the propaganda would be turned into a "Reds under beds" argument, designed to imply that the truth is dangerous to "peaceful coexistence"!

The Top Secret American intelligence report NIE 11-3/8-74, "Soviet Forces for Intercontinental Conflict", warned on page 6: "the USSR has largely eliminated previous US quantitative advantages in strategic offensive forces." Page 9 of the report estimated that Russia's ICBM and SLBM launchers exceeded the USA's 1,700 during 1970, while Russia's on-line missile throw weight had exceeded the USA's one thousand tons back in 1967! Because the USA had more long-range bombers capable of carrying high-yield bombs than Russia (bombers are more vulnerable to air defences, so were not Russia's priority), it took a little longer for Russia to exceed the USA in equivalent megatons, but the 1976 Top Secret American report NIE 11-3/8-76, at page 17, shows that in 1974 Russia exceeded the 4,000 equivalent-megatons payload of USA missiles and aircraft (with less vulnerability for Russia, since most of Russia's nuclear weapons were on missiles, not in SAM-vulnerable aircraft), and by 1976 Russia could deliver 7,000 tons of payload by missiles compared to just 4,000 tons on the USA side. These reports were kept secret for decades to protect the intelligence sources, but they were based on hard evidence. For example, in August 1974 the Hughes Aircraft Company used a specially designed ship (Glomar Explorer, 618 feet long, developed under a secret CIA contract) to recover nuclear weapons and their secret manuals from a Russian submarine which sank in 16,000 feet of water, while in 1976 America was able to take apart the electronics systems of a state-of-the-art Russian MIG-25 fighter which was flown to Japan by defector Viktor Belenko, discovering that it used exclusively EMP-hard miniature vacuum tubes with no EMP-vulnerable solid state components.

There are four ways of dealing with aggressors: conquest (fight them), intimidation (deter them), fortification (shelter against their attacks; historically used as castles, walled cities and even walled countries in the case of China's 1,100 mile long Great Wall and Hadrian's Wall, while the USA has used the Pacific and Atlantic as successful moats against invasion, at least since British forces burned Washington D.C. in 1814 during the War of 1812), and friendship (which, if you are too weak to fight, means appeasing them, as Chamberlain shook hands with Hitler for worthless peace promises). These are not mutually exclusive: you can use combinations. If you are very strong in offensive capability and also have walls to protect you while your back is turned, you can - as Teddy Roosevelt put it (quoting a West African proverb) - "Speak softly and carry a big stick." But if you are weak, speaking softly makes you a target, vulnerable to coercion. This is why we don't send troops directly to Ukraine. When elected in 1960, Kennedy introduced "flexible response" to replace Dulles' "massive retaliation", by addressing the need to deter large provocations without being forced to decide between the unwelcome options of "surrender or all-out nuclear war" (Herman Kahn called this flexible response "Type 2 Deterrence"). This was eroded by both Russian civil defense and Russia's emerging superiority in the 1970s: a real missiles-and-bombers gap emerged in 1972 when the USSR reached and then exceeded the USA's total of 2,200, while in 1974 the USSR achieved parity at 3,500 equivalent megatons (then exceeded the USA), and finally today Russia has over 2,000 dedicated clean enhanced-neutron tactical nuclear weapons and we have none (except low-neutron-output B61 multipurpose bombs).
(Robert Jastrow's 1985 book How to Make Nuclear Weapons Obsolete was the first to publish graphs showing the downward trend in nuclear weapon yields created by the development of miniaturized MIRV warheads for missiles and tactical weapons: he shows that the average yield of US warheads fell from 3 megatons in 1960 to 200 kilotons in 1980, and the total from 12,000 megatons in 1960 to 3,000 megatons in 1980.)
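As a quick sanity check (our own arithmetic, not a figure printed in Jastrow's book), the quoted average yields and totals imply the approximate warhead counts:

```python
# Rough cross-check of Jastrow's quoted figures (our inference, not the
# book's): total megatonnage / average yield = implied warhead count.

def implied_warheads(total_mt, average_mt):
    """Approximate number of warheads implied by the quoted totals."""
    return total_mt / average_mt

print(implied_warheads(12_000, 3))    # 1960: ~4,000 warheads of ~3 Mt each
print(implied_warheads(3_000, 0.2))   # 1980: ~15,000 warheads of ~200 kt each
```

So the stockpile's warhead count rose while its total megatonnage fell by a factor of four.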

The term "equivalent megatons" roughly takes account of the fact that the areas of cratering, blast and radiation damage scale not linearly with energy but as something like the 2/3 power of energy release; but note that close-in cratering scales as a significantly smaller power of energy than 2/3, while blast wind drag displacement of jeeps in open desert scales as a larger power of energy than 2/3. Comparisons of equivalent megatonnage show, for example, that WWII's 2 megatons of TNT, in the form of about 20,000,000 separate conventional 100 kg (0.1 ton, i.e. 10^-7 megaton) explosives, is equivalent to 20,000,000 x (10^-7)^(2/3) = 431 separate 1 megaton explosions! The point is, nuclear weapons are not of a different order of magnitude to conventional warfare, because: (1) devastated areas don't scale in proportion to energy release, (2) the number of nuclear weapons is very much smaller than the number of conventional bombs dropped in conventional war, and (3) because of radiation effects like neutrons and intense EMP, it is possible to eliminate physical destruction by nuclear weapons by a combination of weapon design (e.g. very clean bombs like 99.9% fusion Dominic-Housatonic, or 95% fusion Redwing-Navajo) and burst altitude or depth for hard targets, and create a weapon that deters invasions credibly (without lingering local fallout radiation hazards), something none of the biased "pacifist disarmament" lobbies (which attract Russian support) tell you! There's a big problem with propaganda here.
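The 2/3-power arithmetic above can be checked directly; a minimal sketch of the standard equivalent-megatonnage formula, illustrative only:

```python
# Equivalent megatons (EMT): damage area scales roughly as yield^(2/3),
# so n weapons of yield Y megatons each count as n * Y^(2/3) EMT.

def equivalent_megatons(yield_mt, n=1):
    """EMT of n identical weapons of yield_mt megatons each."""
    return n * yield_mt ** (2 / 3)

# WWII: ~2 Mt delivered as ~20,000,000 separate 0.1-ton (1e-7 Mt) bombs:
print(round(equivalent_megatons(1e-7, 20_000_000)))  # ~431 EMT
# A single 1 Mt bomb is, by definition, 1 EMT:
print(equivalent_megatons(1.0))  # 1.0
```

Note how strongly subdivision raises EMT for fixed total energy: the same 2 Mt in one bomb would be only 2^(2/3) = 1.6 EMT.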

(These calculations, showing that even if strategic bombing had worked in WWII - and the US Strategic Bombing Survey concluded it failed, hence the early Cold War effort to develop and test tactical nuclear weapons and train for tactical nuclear war in Nevada field exercises - you would need over 400 one-megaton weapons to give the equivalent of WWII city destruction in Europe and Japan, are often inverted by anti-nuclear bigots to try to obfuscate the truth. What we're driving at is that nuclear weapons give you the ability to DETER the invasions that set off such wars, regardless of whether they escalate from poison gas - as feared in the 20s and 30s, hence appeasement and WWII - or nuclear. Escalation was debunked in WWII, where the only use of poison gases was in "peaceful" gas chambers, not dropped on cities. Rather than justifying appeasement, the "peaceful" massacre of millions in gas chambers justified war. But evil could and should have been deterred. The "anti-war" propagandists like Lord Noel-Baker and pals, who guaranteed immediate gas knockout blows in the 30s if we didn't appease evil dictators, were never held to account and properly debunked by historians after the war, so they converted from gas liars to nuclear liars in the Cold War and went on winning "peace" prizes for their lies, which multiplied up over the years, to keep getting news media headlines and Nobel Peace Prizes for starting and sustaining unnecessary wars and massacres by dictators. There's also a military side to this, with Field Marshals Lord Mountbatten and Lord Carver, and Lord Zuckerman, in the 70s arguing for UK nuclear disarmament and a re-introduction of conscription instead. These guys were not pacifist CND thugs who wanted Moscow to rule the world, but they were quoted by CND attacking the deterrent, without of course any mention of their call for conscription instead. The abolition of UK conscription for national service in 1960 was due to the H-bomb, and was a political money-saving plot by Macmillan.
If we disarmed our nuclear deterrent and spent the money on conscription plus underground shelters, we might well be able to resist Russia as Ukraine does, until we ran out of ammunition etc. However, the cheapest and most credible deterrent is tactical nuclear weapons, to prevent the concentration of aggressive force by terrorist states.)

Britain was initially in a better position with regard to civil defense than the USA, because in WWII Britain had built sufficient shelters (of various types, but all tested against blast intense enough to demolish brick houses, and later also tested at various nuclear weapon trials at Monte Bello and Maralinga, Australia) and respirators for the entire civilian population. However, Britain also tried to keep the proof testing data secret from Russia (which tested its own shelters at its own nuclear tests anyway), and this meant it appeared that civil defense advice was unproved and would not work, an illusion exploited especially for communist propaganda in the UK via CND. To give just one example, CND and most of the UK media still rely on Duncan Campbell's pseudo-journalism book War Plan UK, which is based entirely on fake news about UK civil defense, nuclear weapons, Hiroshima, fallout, blast, etc. He takes for granted that - just because the UK Government kept the facts secret - the facts don't exist, and to him any use of nuclear weapons which spreads any radioactivity whatsoever will make life totally impossible: "What matters 'freedom' or 'a way of life' in a radioactive wasteland?" (Quote from D. Campbell, War Plan UK, Paladin Books, May 1983, p387.) The problem here is the well-known fallout decay rate: Trinity nuclear test ground zero was reported by Glasstone (Effects of Atomic Weapons, 1950) to be at 8,000 R/hr at 1 hour after burst, yet just 57 days later, on September 11, 1945, General Groves, Robert Oppenheimer, and a large group of journalists safely visited it and took their time inspecting the surviving tower legs, when the gamma dose rate was down to little more than 1 R/hr! Fission products decay fast: 1,000 R/hr at 1 hour decays to 100 at 7 hours, 10 at 2 days, and just 1 at 2 weeks. So the "radioactive wasteland" is just as much a myth as any other nuclear "doomsday" fictional headline in the media.
Nuclear weapons effects have always been fake news in the mainstream media: editors have always regarded facts as "boring copy". Higher yield tests showed that even the ground zero crater "hot spot" dose rates were generally lower, due to dispersal of the fallout by the larger mushroom cloud. If you're far downwind, you can simply walk cross-wind, or prepare an improvised shelter while the dust is blowing. But point any such errors out to fanatical bigots and they will just keep making up more nonsense.
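The decay figures quoted above follow the standard t^-1.2 approximation for mixed fission products (the "7:10 rule": seven times the time, roughly a tenth the dose rate). A minimal sketch:

```python
# Standard t^-1.2 fallout decay approximation: dose rate at t hours after
# burst, given a reference rate r1 at H+1 hour. Illustrative only.

def dose_rate(r1, t_hours):
    """Gamma dose rate (R/hr) at t_hours, from r1 R/hr at 1 hour."""
    return r1 * t_hours ** -1.2

print(round(dose_rate(1000, 7)))           # ~97 R/hr at 7 hours (7:10 rule)
print(round(dose_rate(1000, 48), 1))       # ~9.6 R/hr at 2 days
print(round(dose_rate(1000, 14 * 24), 2))  # ~0.93 R/hr at 2 weeks
# Trinity: 8,000 R/hr at H+1; the visit came 57 days (1,368 hours) later:
print(round(dose_rate(8000, 57 * 24), 1))  # ~1.4 R/hr, matching the account
```

The same formula reproduces the Trinity figure quoted above, which is why the "radioactive wasteland" claim fails on its own numbers.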

Duncan Campbell's War Plan UK relies on the contradiction of claiming that the deliberately exaggerated UK Government worst-case civil defense "exercises" for training purposes are "realistic scenarios" (e.g. 1975 Inside Right, 1978 Scrum Half, 1980 Square Leg, 1982 Hard Rock planning), while simultaneously claiming the very opposite about reliable UK Government nuclear effects and sheltering effectiveness data, hoping nobody would spot his contradictory tactics. He quotes extensively from these lurid worst-case scenario UK civil defense exercises, as if they are factually defensible rather than imaginary fiction to put planners under the maximum possible stress (standard UK military policy of "train hard to fight easy"), while ignoring the far more likely limited-nuclear-use scenario of Sir John Hackett's Third World War. His real worry is the 1977 UK Government Training Manual for Scientific Advisers, which War Plan UK quotes on p14: "a potential threat to the security of the United Kingdom arising from acts of sabotage by enemy agents, possibly assisted by dissident groups. ... Their aim would be to weaken the national will and ability to fight. ... Their significance should not be underestimated." On the next page, War Plan UK quotes J. B. S. Haldane's 1938 book Air Raid Precautions (ARP) on the terrible destruction Haldane witnessed on unprotected people in the Spanish civil war, without even mentioning that Haldane's point is pro-civil defense, pro-shelters, and anti-appeasement of dictatorship, the exact opposite of War Plan UK, which wants Russia to run the world.
On page 124 of War Plan UK the false assertion is made that USA nuclear casualty data is "widely accepted" and true (declassified Hiroshima casualty data for people in modern concrete buildings proves it to be lies) while the correct UK nuclear casualty data is "inaccurate", and on page 126 Duncan Campbell simply lies that the UK Government's Domestic Nuclear Shelters - Technical Guidance "ended up offering the public a selection of shelters half of which were invented in the Blitz ... None of the designs was ever tested." In fact, Frank Pavry (who studied similar shelters surviving near ground zero at Hiroshima and Nagasaki in 1945 with the British Mission to Japan) and George R. Stanbury tested 15 Anderson shelters, together with concrete structures, at the first UK nuclear explosion, Operation Hurricane in 1952; many other improvised trench and earth-covered shelters were nuclear tested by the USA and UK at trials in 1955, 1956, 1957, and 1958, and later at simulated nuclear explosions by Cresson Kearny of Oak Ridge National Laboratory in the USA, having also earlier been exposed to early Russian nuclear tests (scroll down to see the evidence of this). These were improved versions of shelters proved both in war and at nuclear weapon tests! So War Plan UK makes no effort whatsoever to dig up the facts, and instead falsely claims the exact opposite of the plain unvarnished truth! War Plan UK shows its hypocrisy on page 383 in enthusiastically praising Russian civil defense:

"Training in elementary civil defence is given to everyone, at school, in industry or collective farms. A basic handbook of precautionary measures, Everybody must know this!, is the Russian Protect and Survive. The national civil defence corps is extensive, and is organized along military lines. Over 200,000 civil defence troops would be mobilized for rescue work in war. There are said to be extensive, dispersed and 'untouchable' food stockpiles; industrial workers are issued with kits of personal protection apparatus, said to include nerve gas counteragents such as atropine. Fallout and blast shelters are provided in the cities and in industrial complexes, and new buildings have been required to have shelters since the 1950s. ... They suggest that less than 10% - even as little as 5% - of the Soviet population would die in a major attack. [Less than Russia's loss of 12% of its population in WWII.]"

'LLNL achieved fusion ignition for the first time on Dec. 5, 2022. The second time came on July 30, 2023, when in a controlled fusion experiment, the NIF laser delivered 2.05 MJ of energy to the target, resulting in 3.88 MJ of fusion energy output, the highest yield achieved to date. On Oct. 8, 2023, the NIF laser achieved fusion ignition for the third time with 1.9 MJ of laser energy resulting in 2.4 MJ of fusion energy yield. “We’re on a steep performance curve,” said Jean-Michel Di Nicola, co-program director for the NIF and Photon Science’s Laser Science and Systems Engineering organization. “Increasing laser energy can give us more margin against issues like imperfections in the fuel capsule or asymmetry in the fuel hot spot. Higher laser energy can help achieve a more stable implosion, resulting in higher yields.” ... “The laser itself is capable of higher energy without fundamental changes to the laser,” said NIF operations manager Bruno Van Wonterghem. “It’s all about the control of the damage. Too much energy without proper protection, and your optics blow to pieces.” ' - https://lasers.llnl.gov/news/llnls-nif-delivers-record-laser-energy

NOTE: the "problem" of the very large lasers "required" to deliver ~2 MJ (roughly 0.5 kg of TNT energy) to ignite fusion explosions of 2 mm diameter capsules of frozen D+T inside a 1 cm diameter energy-reflecting hohlraum, and the "problem" of damage to the equipment caused by the explosions, are immaterial to clean nuclear deterrent development based on this technology, because in a clean nuclear weapon, whatever laser or other power ignition system is used only has to be fired once, so it can be far less robust than the NIF lasers, which are used repeatedly. Similarly, damage done to the system by the explosion is also immaterial for a clean nuclear weapon, in which the weapon is detonated once only! This is exactly the same point which occurred to the reviewers during a critical review of the first gun-type assembly nuclear weapon, in which the fact that it would only ever be fired once (unlike a field artillery gun) enabled huge reductions in the size of the device, into a practical weapon, as described by General Leslie M. Groves on p163 of his 1962 book Now it can be told: the story of the Manhattan Project:

"Out of the Review Committee's work came one important technical contribution when Rose pointed out ... that the durability of the gun was quite immaterial to success, since it would be destroyed in the explosion anyway. Self-evident as this seemed once it was mentioned, it had not previously occurred to us. Now we could make drastic reductions in ... weight and size."

This principle also applies to weaponizing NIF clean fusion explosion technology. General Groves' book was reprinted in 1982 with a useful Introduction by Edward Teller on the nature of nuclear weapons history: "History in some ways resembles the relativity principle in science. What is observed depends on the observer. Only when the perspective of the observer is known, can proper corrections be made. ... The general ... very often managed to ignore complexity and arrive at a result which, if not ideal, at least worked. ... For Groves, the Manhattan project seemed a minor assignment, less significant than the construction of the Pentagon. He was deeply disappointed at being given the job of supervising the development of an atomic weapon, since it deprived him of combat duty. ... We must find ways to encourage mutual understanding and significant collaboration between those who defend their nation with their lives and those who can contribute the ideas to make that defense successful. Only by such cooperation can we hope that freedom will survive, that peace will be preserved."

General Groves similarly comments in Chapter 31, "A Final Word" of Now it can be told:

"No man can say what would have been the result if we had not taken the steps ... Yet, one thing seems certain - atomic energy would have been developed somewhere in the world ... I do not believe the United States ever would have undertaken it in time of peace. Most probably, the first developer would have been a power-hungry nation, which would then have dominated the world completely ... it is fortunate indeed for humanity that the initiative in this field was gained and kept by the United States. That we were successful was due entirely to the hard work and dedication of the more than 600,000 Americans who comprised and directly supported the Manhattan Project. ... we had the full backing of our government, combined with the nearly infinite potential of American science, engineering and industry, and an almost unlimited supply of people endowed with ingenuity and determination."

Update: Lawrence Livermore National Laboratory's $3.5 billion National Ignition Facility, NIF, using ultraviolet laser beam pulses of 2 MJ on to a 2 mm diameter spherical beryllium shell of frozen D+T inside a 1 cm long hollow gold cylinder "hohlraum" (which is heated to a temperature where it re-radiates the energy at much higher frequency, as x-rays, on to the surface of the beryllium ablator of the central fusion capsule; the ablator blows off, causing the capsule to recoil inward so that, as for the 1962 Ripple II nuclear weapon's secondary stage, the capsule is compressed by a factor of 35, mimicking the isentropic compression mechanism of a miniature Ripple II clean nuclear weapon secondary stage), has now repeatedly achieved nuclear fusion explosions of over 3 MJ, equivalent to nearly 1 kg of TNT explosive. According to a Time article (linked here) about fusion system designer Annie Kritcher, the recent breakthrough was in part due to using a ramping input energy waveform: "success that came thanks to tweaks including shifting more of the input energy to the later part of the laser shot", a feature that minimises the rise in entropy due to shock wave generation (which heats the capsule, causing it to expand and resist compression) and increases isentropic compression, the principle used by LLNL's J. H. Nuckolls to achieve the 99.9% clean Ripple II 9.96 megaton nuclear test success in Dominic-Housatonic on 30 October 1962. Nuckolls in 1972 published the equation for the idealized input power waveform required for isentropic, optimized compression of fusion fuel (Nature, v239, p139): P ~ (1 - t)^-1.875, where t is time in units of the transit time (the time taken for the shock to travel to the centre of the fusion capsule), and the exponent -1.875 is a constant based on the specific heat of the ionized fuel (Nuckolls has provided the basic declassified principles; see extract linked here).
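Nuckolls' published waveform can be sketched numerically; a minimal illustration (normalized units, nothing beyond the declassified formula quoted above) of how sharply the idealized isentropic drive ramps late in the pulse:

```python
# Nuckolls' idealized isentropic drive waveform (Nature, v239, p139):
# P(t) ~ (1 - t)^-1.875, with t in units of the shock transit time.
# Most of the drive energy arrives late in the pulse, minimizing the
# shock heating (entropy rise) of the fuel - the same idea as Kritcher's
# "shifting more of the input energy to the later part of the laser shot".

def drive_power(t):
    """Relative drive power at normalized time t (0 <= t < 1)."""
    return (1.0 - t) ** -1.875

for t in (0.0, 0.5, 0.9, 0.99):
    print(f"t = {t:.2f}  P/P(0) = {drive_power(t):.1f}")
# The ramp rises from 1 at t=0 to ~75x at t=0.9 and ~5,600x at t=0.99.
```

The divergence as t approaches 1 is the idealization; a real pulse is truncated before the singularity.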
To be clear, the energy reliably released by the 2 mm diameter capsule of fusion fuel was roughly a 1 kg TNT explosion. 80% of this is in the form of 14.1 MeV neutrons (ideal for fissioning lithium-7 in LiD to yield more tritium), and 20% is the kinetic energy of fused nuclei (which is quickly converted into x-ray radiation energy by collisions). Nuckolls' 9.96 megaton Housatonic (10 kt Kinglet primary and 9.95 Mt Ripple II 100% clean isentropically compressed secondary) of 1962 proved that it is possible to use multiplicative staging, whereby a lower yield primary nuclear explosion triggers a fusion stage 1,000 times more powerful than its initiator. Another key factor, as shown on our graph linked here, is that you can use cheap natural LiD as fuel once you have a successful D+T reaction, because naturally abundant, cheap Li-7 more readily fissions to yield tritium with the 14.1 MeV neutrons from D+T fusion than expensively enriched Li-6, which is needed to make tritium in nuclear reactors, where the fission neutron energy of around 1 MeV is too low to fission Li-7. It should also be noted that despite an openly published 2021 paper by Jon Grams about Nuckolls' Ripple II success, the subject is still being covered up/ignored by the anti-nuclear biased Western media! Grams' article omits the design details, such as the isentropic power delivery curve, from Nuckolls' declassified articles that we include in the latest blog post here. One problem regarding "data" causing continuing confusion about the Dominic-Housatonic 30 October 1962 Ripple II test at Christmas Island is made clear in the DASA-1211 report's declassified summary of the sizes, weights and yields of those tests: Housatonic was Nuckolls' fourth and final isentropic test, with the nuclear system inserted into a heavy steel Mk36 drop case, making the overall size 57.2 inches in diameter, 147.9 inches long and 7,139.55 lb in mass, i.e. 
1.4 kt/lb or 3.0 kt/kg yield-to-mass ratio for the 9.96 Mt yield. This is not impressive for that yield range until you consider (a) that it was 99.9% fusion and (b) that the isentropic design required a heavy hohlraum around the large Ripple II fusion secondary stage to confine x-rays for a relatively long time, during which a slowly rising pulse of x-rays was delivered from the primary to the secondary via a very large area of foam elsewhere in the weapon, to produce isentropic compression. Additionally, the test was made in a hurry before the atmospheric test ban treaty, and this rushed use of a standard air-drop steel casing made the tested weapon much heavier than a properly weaponized Ripple II. The key point is that a 10 kt fission device set off a ~10 Mt fusion explosion, a very clean deterrent. Applying this Ripple II 1,000-factor multiplicative staging figure directly to this technology for clean nuclear warheads, a 0.5 kg TNT D+T fusion capsule would set off a 0.5 ton TNT 2nd stage of LiD, which would then set off a 0.5 kt 3rd stage "neutron bomb", which could then be used to set off a 500 kt 4th stage or "strategic nuclear weapon". It is therefore now possible, not just in principle but in practice, using suitable staging systems already proved in 1960s nuclear weapon tests, to design essentially 100% clean fusion nuclear warheads! Yes, the details have been worked out; yes, the technology has been tested in piecemeal fashion. All that is now needed is a new, but quicker and cheaper, Star Wars program or Manhattan Project style effort to pull the components together. This will constitute a major leap forward in the credibility of the deterrence of aggressors.
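The staging arithmetic above can be laid out explicitly; a sketch of the ~1,000-fold multiplicative factor (the stage names are this text's illustration, not an actual weapon specification):

```python
# Multiplicative staging at the ~1,000x factor demonstrated by Ripple II
# (10 kt primary -> ~10 Mt secondary). Yields in kg of TNT equivalent.

def staging_chain(initial_kg, factor=1000, stages=4):
    """Yield of each successive stage, starting from the fusion capsule."""
    return [initial_kg * factor ** i for i in range(stages)]

for i, y in enumerate(staging_chain(0.5), start=1):
    print(f"stage {i}: {y:,.1f} kg TNT")
# 0.5 kg capsule -> 0.5 t LiD second stage -> 0.5 kt "neutron bomb"
# third stage -> 500 kt "strategic" fourth stage.
```

Each extra stage multiplies the yield a thousandfold, which is why only four stages span nine orders of magnitude.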

ABOVE: as predicted, the higher the input laser pulse energy for the D+T initiator of a clean multiplicatively-staged nuclear deterrent, the lower the effect of plasma instabilities and asymmetries and the greater the fusion burn. To get ignition (where the x-ray energy injected into the fusion hohlraum by the laser is less than the energy released in the D+T fusion burn) they have had to use about 2 MJ delivered in 10 ns or so, equivalent to about 0.5 kg of TNT. But for deterrent use, why use such expensive, delicate lasers? Why not just use one-shot miniaturised x-ray tubes with megavolt electron acceleration, powered by a suitably ramped pulse from a chemical explosion driving magnetic flux compression current generation? At 10% efficiency, you need 0.5 x 10 = 5 kg of TNT! Even at 1% efficiency, 50 kg of TNT will do. Once the D+T gas capsule's hohlraum is well over 1 cm in size, to minimise the risk of imperfections that cause asymmetries, you no longer need focussed laser beams to enter tiny apertures. You might even be able to integrate many miniature flash x-ray tubes (each designed to burn out when firing one pulse of a MJ or so) into a special hohlraum. Humanity urgently needs a technological arms race akin to Reagan's Star Wars project, to deter the dictators from invasions and WWIII. In the conference video above, a question was asked about the real efficiency of the enormous repeat-pulse-capable laser system (an efficiency not required for a nuclear weapon, whose components only need the capability to be used once, unlike lab equipment): the answer is that 300 MJ was required by the lab lasers to fire a 2 MJ pulse into the D+T capsule's x-ray hohlraum, i.e. their lasers are only 0.7% efficient! So why bother? We know - from the practical use of incoherent fission primary stage x-rays to compress and ignite fusion capsules in nuclear weapons - that you simply don't need coherent photons from a laser for this purpose.
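The driver-energy arithmetic in this paragraph, made explicit (1 kg TNT = 4.184 MJ; the efficiencies are the text's assumed round figures, not measured hardware values):

```python
# Chemical energy a one-shot driver would need to deliver ~2 MJ into the
# hohlraum at a given conversion efficiency, in kg of TNT equivalent.

MJ_PER_KG_TNT = 4.184

def driver_mass_kg(delivered_mj, efficiency):
    """kg of TNT-equivalent chemical energy to deliver delivered_mj."""
    return delivered_mj / efficiency / MJ_PER_KG_TNT

print(round(driver_mass_kg(2.0, 0.10)))   # ~5 kg TNT at 10% efficiency
print(round(driver_mass_kg(2.0, 0.01)))   # ~48 kg TNT at 1% efficiency
# For comparison, NIF's lasers drew ~300 MJ to deliver 2 MJ: ~0.7% efficient.
```

Even the pessimistic 1% case keeps the chemical driver charge down to tens of kilograms, which is the point of the comparison with laboratory lasers.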
The sole reason they are approaching the problem with lasers is that they began their lab experiments decades ago with microscopic fusion capsules, and for those you need a tightly focussed beam to insert energy through a tiny hohlraum aperture. But now they are finally achieving success with much larger fusion capsules (to minimise the instabilities that caused the early failures), it may be time to change direction. A whole array of false "no-go theorems" can and will be raised by ignorant charlatan "authorities" against any innovation; this is the nature of the political world. There is some interesting discussion of why clean bombs aren't in existence today: basically the idealized theory (which works fine for big H-bombs but ignores small-scale asymmetry problems which are important only at low ignition energy) underestimated the input energy required for fusion ignition by a factor of 2,000:

"The early calculations on ICF (inertial-confinement fusion) by John Nuckolls in 1972 had estimated that ICF might be achieved with a driver energy as low as 1 kJ. ... In order to provide reliable experimental data on the minimum energy required for ignition, a series of secret experiments—known as Halite at Livermore and Centurion at Los Alamos—was carried out at the nuclear weapons test site in Nevada between 1978 and 1988. The experiments used small underground nuclear explosions to provide X-rays of sufficiently high intensity to implode ICF capsules, simulating the manner in which they would be compressed in a hohlraum. ... the Halite/Centurion results predicted values for the required laser energy in the range 20 to 100MJ—higher than the predictions ..." - Garry McCracken and Peter Stott, Fusion, Elsevier, 2nd ed., p149.

In the final diagram above, we illustrate an example of what could very well occur in the near future, just to really poke a stick into the wheels of "orthodoxy" in nuclear weapons design: is it possible to just use a lot of (perhaps hardened for higher currents, perhaps not) pulsed-current-driven microwave tubes from kitchen microwave ovens, channelling their energy using waveguides (simply metal tubes, i.e. electrical Faraday cages, which reflect and thus contain microwaves) into the hohlraum, and to make the pusher out of a material of dipole molecules (like common salt, NaCl) which is a good absorber of microwaves (as everybody knows from cooking in microwave ovens)? It would be extremely dangerous, not to mention embarrassing, if this worked but nobody had done any detailed research into the possibility, due to groupthink orthodoxy and conventional boxed-in thinking! Remember, the D+T capsule just needs extreme compression, and this can be done by any means that works. Microwave technology is now very well-established. It's no good trying to keep anything of this sort "secret" (either officially or unofficially) since, as history shows, dictatorships are the places where "crackpot"-sounding ideas (such as double-primary Project "49" Russian thermonuclear weapon designs, Russian Sputnik satellites, Russian Novichok nerve agent, Nazi V1 cruise missiles, Nazi V2 IRBMs, etc.) can be given priority by loony dictators. We have to avoid, as Edward Teller put it (in his secret commentary debunking Bethe's false history of the H-bomb, written AFTER the Teller-Ulam breakthrough), "too-narrow" thinking (which Teller said was still in force on H-bomb design even then). Fashionable hardened orthodoxy is the soft underbelly of "democracy" (a dictatorship by the majority, which is always too focussed on fashionable ideas and dismissive of alternative approaches in science and technology).
Dictatorships (minorities against majorities) have repeatedly demonstrated a lack of concern for the fake "no-go theorems" used by Western anti-nuclear "authorities" to ban anything but fashionable groupthink science.

ABOVE: 1944-dated film of the Head of the British Mission to Los Alamos, neutron discoverer James Chadwick, explaining in detail to Americans how hard it was for him to discover the neutron, taking 10 years on a shoe-string budget, mostly due to having insufficiently strong sources of alpha particles to bombard nuclei in a cloud chamber! The idea of the neutron came from his colleague Rutherford. Chadwick reads his explanation while rapidly rotating a pencil in his right hand, perhaps indicating the stress he was under in 1944. In 1946, when British participation at Los Alamos ended, Chadwick wrote the first detailed secret British report on the design of a three-stage hydrogen bomb, another project that took over a decade. In the diagram below, it appears that the American Mk17 only had a single secondary stage, like the similar-yield 1952 Mike design. The point here is that popular misunderstanding of the simple mechanism of x-ray energy transfer for higher-yield weapons may be creating a dogmatic attitude even in secret nuclear weapon design labs, where orthodoxy is followed too rigorously. The Russians (see quotes on the latest blog post here) state they used two entire two-stage thermonuclear weapons with a combined yield of 1 megaton to set off their 50 megaton test in 1961. If true, you can indeed use two-stage hydrogen bombs as an "effective primary" to set off another secondary stage of much higher yield. Can this be reversed, in the sense of scaling it down so you have several bombs-within-bombs, all triggered by a really tiny first stage? In other words, can it be applied to neutron bomb design?

ABOVE: 16 kt at 600m altitude nuclear explosion on a city, Hiroshima ground zero (in foreground) showing modern concrete buildings surviving nearby (unlike the wooden ones that mostly burned at the peak of the firestorm 2-3 hours after survivors had evacuated), in which people were shielded from most of the radiation and blast winds, as they were in simple shelters.

The 1946 Report of the British Mission to Japan, The Effects of the Atomic Bombs at Hiroshima and Nagasaki, compiled by a team of 16 in Hiroshima and Nagasaki during November 1945, which included 10 UK Home Office civil defence experts (W. N. Thomas, J. Bronowski, D. C. Burn, J. B. Hawker, H. Elder, P. A. Badland, R. W. Bevan, F. H. Pavry, F. Walley, O. C. Young, S. Parthasarathy, A. D. Evans, O. M. Solandt, A. E. Dark, R. G. Whitehead and F. G. S. Mitchell) found: "Para. 26. Reinforced concrete buildings of very heavy construction in Hiroshima, even when within 200 yards of the centre of damage, remained structurally undamaged. ... Para 28. These observations make it plain that reinforced concrete framed buildings can resist a bomb of the same power detonated at these heights, without employing fantastic thicknesses of concrete. ... Para 40. The provision of air raid shelters throughout Japan was much below European standards. ... in Hiroshima ... they were semi-sunk, about 20 feet long, had wooden frames, and 1.5-2 feet of earth cover. ... Exploding so high above them, the bomb damaged none of these shelters. ... Para 42. These observations show that the standard British shelters would have performed well against a bomb of the same power exploded at such a height. Anderson shelters, properly erected and covered, would have given protection. Brick or concrete surface shelters with adequate reinforcement would have remained safe from collapse. The Morrison shelter is designed only to protect its occupants from the debris load of a house, and this it would have done. Deep shelters such as the refuge provided by the London Underground would have given complete protection. ... Para 60. Buildings and walls gave complete protection from flashburn."

Glasstone and Dolan's 1977 Effects of Nuclear Weapons, in Table 12.21 on p547, flunks making this point by giving the data without citing its source, which would have made it credible to readers: it correlated 14% mortality (106 killed out of 775 people in Hiroshima's Telegraph Office) with "moderate damage" at 500 m in Hiroshima (the uncited "secret" source was NP-3041, Table 12, applying to unwarned people inside modern concrete buildings).

"A weapon whose basic design would seem to provide the essence of what Western morality has long sought for waging classical battlefield warfare - to keep the war to a struggle between the warriors and exclude the non-combatants and their physical assets - has been violently denounced, precisely because it achieves this objective." - Samuel T. Cohen (quoted in Chapman Pincher, The secret offensive, Sidgwick and Jackson, London, 1985, Chapter 15: The Neutron Bomb Offensive, p210).

The reality is, dedicated enhanced-neutron tactical nuclear weapons were used to credibly deter the concentrations of force required to trigger WWIII during the 1st Cold War, and the thugs who support Russian propaganda for Western disarmament got rid of them on our side, but not on the Russian side. Whether air burst or used as subsurface earth penetrators of relatively low fission yield (where the soil converts energy that would otherwise escape as blast and radiation into ground shock for destroying buried tunnels; new research on cratering shows that a 20 kt subsurface burst creates similar effects on buried hard targets to a 1 Mt surface burst), neutron bombs cause none of the vast collateral damage to civilians that we see now in Ukraine and Gaza, or that we saw in WWII and the wars in Korea and Vietnam. This is 100% contrary to CND propaganda, which is a mixture of lying about nuclear explosion collateral damage, escalation/"knockout blow" propaganda (of the type used by appeasers to start WWII), and lying about the designs of nuclear weapons in order to ensure that the Western side (but not the thugs) gets only incredible "strategic deterrence" that can't deter the invasions that start world wars (e.g. Belgium in 1914 and Poland in 1939). "Our country entered into an agreement in Budapest, Hungary when the Soviet Union was breaking up that we would guarantee the independence of Ukraine." - Tom Ramos. There really is phoney nuclear groupthink left agenda politics at work here: credible, relatively clean tactical nuclear weapons are banned in the West but stocked by Russia, which has civil defense shelters to make its threats far more credible than ours! We need low-collateral-damage enhanced-neutron and earth-penetrator options for the new Western W93 warhead, or we remain vulnerable to aggressive coercion by thugs, and invite invasions.
Ambiguity, the current policy ("justifying" secrecy on just what we would do in any scenario), actually encourages experimental provocations by enemies to test what we are prepared to do (if anything), just as it did in 1914 and the 1930s.

ABOVE: 0.2 kt (tactical yield range) Ruth nuclear test debris, with the lower 200 feet of the 300 ft steel tower surviving in Nevada, 1953. Note that the yield of the tactical invasion-deterrent Mk54 Davy Crockett was only 0.02 kt, 10 times less than the 0.2 kt Ruth.

It should be noted that cheap and naive "alternatives" to credible deterrence of war were tried in the 1930s, during the Cold War, and afterwards, with disastrous consequences. Heavy "peaceful" oil sanctions and other embargoes against Japan for its invasion of China between 1931-7 resulted in the plan for the Pearl Harbor surprise attack of 7 December 1941, with subsequent escalation to incendiary city bombing followed by nuclear warfare against Hiroshima and Nagasaki. Attlee's pressure on Truman to guarantee no use of tactical nuclear weapons in the Korean War (leaked straight to Stalin by the Cambridge Spy Ring) led to an escalation of that war, causing the total devastation of the cities of that country by conventional bombing (a sight witnessed by Sam Cohen, which motivated his neutron bomb deterrent of invasions), until Eisenhower was elected and reversed Truman's decision, leading not to the "escalatory Armageddon" asserted by Attlee, but instead to a peaceful armistice! Similarly, as Tom Ramos argues in From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Kennedy's advisers convinced him to go ahead with the moonlit 17 April 1961 Bay of Pigs invasion of Cuba without any USAF air support, which led to precisely what they claimed they would avoid: an escalation of aggression from Russia in Berlin, with the Berlin Wall going up on 13 August 1961, because showing weakness to an enemy, as in the bungled invasion of Cuba, is always a green light to dictators to go ahead with revolutions, invasions and provocations everywhere else. Rather than the widely hyped claims from disarmers and appeasers about "weakness bringing peace by demonstrating to the enemy that they have nothing to fear from you", the opposite result always occurs. The paranoid dictator seizes the opportunity to strike first.
Similarly, withdrawing from Afghanistan in 2021 was a clear green light to Russia to go ahead with a full-scale invasion of Ukraine, reigniting the Cold War. Von Neumann and Morgenstern's minimax theorem for winning games - minimise the maximum possible loss - fails for offensive action in war because it sends a signal of weakness to the enemy, which does not treat war as a game with rules to be obeyed. Minimax is only valid for defense, such as the civil defense shelters used by Russia to make its threats more credible than ours. The sad truth is that cheap fixes don't work, no matter how much propaganda is behind them. You either need to militarily defeat the enemy or at least economically defeat them using proven Cold War arms-race techniques (not merely ineffective sanctions, which they can bypass by making alliances with Iran, North Korea, and China). Otherwise, you are negotiating peace from a position of weakness, which is called appeasement, or collaboration with terrorism.

"Following the war, the Navy Department was intent to see the effects of an atomic blast on naval warships ... the press was invited to witness this one [Crossroads-Able, 23.5 kt at 520 feet altitude, 1 July 1946, Bikini Atoll]. ... The buildup had been too extravagant. Goats that had been tethered on warship decks were still munching their feed, and the atoll's palm trees remained standing, unscathed. The Bikini test changed public attitudes. Before July 1, the world stood in awe of a weapon that had devastated two cities and forced the Japanese Empire to surrender. After that date, the bomb was still a terrible weapon, but a limited one." - Tom Ramos (LLNL nuclear weaponeer and nuclear pumped X-ray laser developer), From Berkeley to Berlin: How the Rad Lab Helped Prevent Nuclear War, Naval Institute Press, 2022, pp43-4.

ABOVE: 16 February 1950 Daily Express editorial on the H-bomb problem, due to the fact that the UN is another virtue-signalling but really war-mongering League of Nations (which oversaw Nazi appeasement and the outbreak of WWII); however, Fuchs had attended the April 1946 Super Conference, during which the Russian version of the H-bomb, involving isentropic radiation implosion of a separate low-density fusion stage (unlike Teller's later dense-metal ablation-rocket implosion secondaries, the TX14 Alarm Clock and Sausage designs), was discussed and then given to Russia. The media was made aware only that Fuchs had given the fission bomb to Russia. The FBI later visited Fuchs in a British jail, showed him a film of Harry Gold (whom Fuchs identified as his contact while at Los Alamos) and also gave Fuchs a long list of secret reports to mark off individually, so that they knew precisely what Stalin had been given. Truman didn't order H-bomb research and development because Fuchs gave Stalin the A-bomb, but because he gave them the H-bomb. The details of the Russian H-bomb are still being covered up by those who want a repetition of 1930s appeasement, or indeed of the deliberate ambiguity of the UK Cabinet in 1914, which made it unclear what the UK would do if Germany invaded Belgium, allowing the enemy to exploit that ambiguity and start a world war. The key fact usually covered up here (Richard Rhodes, Chuck Hansen, and the whole American "expert nuclear arms community" all misleadingly claim that Teller's Sausage H-bomb design, with a single primary and a dense ablator - uranium, lead or tungsten - around a cylindrical secondary stage, is the "hydrogen bomb design") is that two attendees of the April 1946 Super Conference - the report author Egon Bretscher and the radiation implosion discoverer Klaus Fuchs - were British, and both contributed key H-bomb design principles to the Russian and British weapons (discarded for years by America).
Egon Bretscher, for example, wrote up the Super Conference report, during which attendees suggested various ways to try to achieve isentropic compression of low-density fusion fuel (a concept discarded by Teller's 1951 Sausage design, but used by Russia and re-developed in America in Nuckolls' Ripple devices, tested in 1962). After Teller left Los Alamos, Bretscher took over work on Teller's Alarm Clock layered fission-fusion spherical hybrid device, before Bretscher himself left Los Alamos to become head of nuclear physics at Harwell, UK, submitting a UK report together with Fuchs (head of theoretical physics at Harwell) which led to Sir James Chadwick's UK paper on a three-stage thermonuclear Super bomb, which in turn formed the basis of Penney's work at the UK Atomic Weapons Research Establishment. While Bretscher had worked on Teller's hybrid Alarm Clock (which originated two months after Fuchs left Los Alamos), Fuchs co-authored a hydrogen bomb patent with John von Neumann, in which radiation implosion and ionization implosion were used. Between them, Bretscher and Fuchs had all the key ingredients. Fuchs leaked them to Russia, and the problem persists today in international relations.

ILLUSTRATION: the threat of WWII and the need to deter it was massively derided by popular pacifism, which tended to make "jokes" of the Nazi threat until too late (an example of 1938 UK fiction on this is above; Charlie Chaplin's film "The Great Dictator" is another example), so that three years after the Nuremberg Laws and five years after illegal rearmament was begun by the Nazis, crowds of UK "pacifists" in Downing Street, London, supported friendship with the top racist, dictatorial Nazis in the name of "world peace". The Prime Minister used underhand techniques to try to undermine appeasement critics like Churchill, and also later to get W. E. Johns fired from both editorships of Flying (weekly) and Popular Flying (monthly), to make it appear that everybody "in the know" agreed with his actions; hence the contrived "popular support" for collaborating with terrorists depicted in these photos. The same thing persists today; the 1920s and 1930s "pacifist" was also driven by "escalation" and "annihilation" claims that explosions, fire and WMD poison gas would kill everybody in a "knockout blow" immediately any war broke out.

Update (4 January 2024): on the important world crisis, https://vixra.org/abs/2312.0155 gives a detailed review of "Britain and the H-bomb" (linked here), and of why the "nuclear deterrence issue" isn't about "whether we should deter evil", but precisely what design of nuclear warhead we should have in order to do that cheaply, credibly, safely, and efficiently, without guaranteeing either escalation or the failure of deterrence. When we disarmed our chemical and biological weapons, it was claimed that the West could easily deter those weapons by using strategic nuclear weapons to bomb Moscow (which has shelters, unlike us). That failed when Putin used sarin and chlorine to prop up Assad in Syria, and Novichok in the UK, killing Dawn Sturgess in 2018. So it's just not a credible deterrent to say you will bomb Moscow if Putin invades Europe or uses his 2000 tactical nuclear weapons. An even more advanced deterrent, the 100% clean, very low yield (or any yield) multiplicative staged design without any fissile material whatsoever, is just around the corner.
Clean secondary stages have been proof-tested successfully, for example the 100% clean Los Alamos Redwing-Navajo secondary and the 100% clean Ripple II secondary tested on 30 October 1962. Laser ignition of very tiny fusion capsules to yield more energy than supplied was achieved on 5 December 2022, when a NIF test delivered 2.05 MJ (the energy of about 0.5 kg of TNT) to a fusion capsule which yielded 3.15 MJ. So all that is needed is to combine both ideas in a system whereby suitably sized second stages - ignited in the first place by a capacitively charged circuit sending a pulse of energy to a suitable laser system (the schematic shown is just a sketch of principle; more than one laser would possibly be required for reliability of fusion ignition) acting on a tiny fusion capsule as shown - are encased in two-stage "effective primaries" which each become effective primaries of bigger systems, giving a geometric series of multiplicative staging until the desired yield is reached. Note that the actual tiny first T+D capsule can be compressed by one-shot lasers - compact lasers used way beyond their traditional upper power limit and burned out in firing a single pulse - in the same way that the gun assembly of the Hiroshima bomb was based on a one-shot gun. In other words, forget all about textbook gun design. The Hiroshima bomb gun assembly only had to be fired once, unlike a field artillery piece which has to be ready to fire many thousands of times (before metal fatigue and cracks set in). Thus, by analogy, the lasers for use in a clean bomb - which can be powered by ramping current pulses from magnetic flux compressor systems - can be much smaller and lighter than current lab gear, which is designed to be used thousands of times in repeated experiments.
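As a quick sanity check of the NIF figures quoted above, here is an illustrative sketch using only those numbers (the 4.184 MJ per kg of TNT conversion is a standard convention, not a figure from the source):

```python
# Sketch: target energy gain of the 5 December 2022 NIF ignition shot,
# using the figures quoted above (2.05 MJ delivered, 3.15 MJ yielded).
E_laser_MJ = 2.05    # laser energy delivered to the fusion capsule
E_fusion_MJ = 3.15   # fusion energy yielded by the capsule
TNT_MJ_PER_KG = 4.184  # conventional TNT equivalence (assumption, standard)

gain = E_fusion_MJ / E_laser_MJ
print(f"target gain = {gain:.2f}")                              # about 1.54
print(f"input  = {E_laser_MJ / TNT_MJ_PER_KG:.2f} kg TNT eq.")  # about 0.49
print(f"output = {E_fusion_MJ / TNT_MJ_PER_KG:.2f} kg TNT eq.") # about 0.75
```

The target gain of about 1.5 is the point: the capsule yielded more energy than the laser delivered to it, which is the minimum requirement for the "effective primary" staging scheme sketched above (the input figure also matches the "about 0.5 kg of TNT" stated in the text).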
The diagram below shows cylindrical Li6D stages throughout for a compact bomb shape, but spherical stages can be used, and once a few stages have fired, the flux of 14 MeV neutrons is sufficient to switch to cheap natural LiD. To fit it into a MIRV warhead, the low density of LiD constrains such a clean warhead to a low nuclear yield, which means a tactical neutron deterrent of the invasions that cause big wars: a conversion of incredible strategic deterrence into a more credible combined strategic-tactical deterrent of major provocations, not just direct attacks. It should also be noted that in 1944 von Neumann suggested that T + D inside the core of the fission weapon would be compressed by "ionization compression" during fission (where a higher-density ionized plasma compresses a lower-density ionized plasma, i.e. the D + T plasma), an idea that was, years later, named the Internal Booster principle by Teller; see Frank Close, "Trinity", Allen Lane, London, 2019, pp158-159, where Close argues that during the April 1946 Superbomb Conference, Fuchs extended von Neumann's 1944 internal fusion boosting idea to an external D + T filled BeO-walled capsule:

"Fuchs reasoned that [the very low energy, 1-10 kev, approximately 10-100 lower energy than medical] x-rays from the [physically separated] uranium explosion would reach the tamper of beryllium oxide, heat it, ionize the constituents and cause them to implode - the 'ionization implosion' concept of von Neumann but now applied to deuterium and tritium contained within beryllium oxide. To keep the radiation inside the tamper, Fuchs proposed to enclose the device inside a casing impervious to radiation. The implosion induced by the radiation would amplify the compression ... and increase the chance of the fusion bomb igniting. The key here is 'separation of the atomic charge and thermonuclear fuel, and compression of the latter by radiation travelling from the former', which constitutes 'radiation implosion'." (This distinction between von Neumann's "ionization implosion" INSIDE the tamper, of denser tamper expanding and thus compressing lower density fusion fuel inside, and Fuchs' OUTSIDE capsule "radiation implosion", is key even today for isentropic H-bomb design; it seems Teller's key breakthroughs were not separate stages or implosion but rather radiation mirrors and ablative recoil shock compression, where radiation is used to ablate a dense pusher of Sausage designs like Mike in 1952 etc., a distinction not to be confused for the 1944 von Neumann and 1946 Fuchs implosion mechanisms!

It appears Russian H-bombs used von Neumann's "ionization implosion" and Fuchs's "radiation implosion" for RDS-37 on 22 November 1955 and also in their double-primary 23 February 1958 test and subsequently, where their fusion capsules reportedly contained a BeO or other low-density outer coating, which would lead to quasi-isentropic compression, more effective for low density secondary stages than purely ablative recoil shock compression. This accounts for the continuing classification of the April 1946 Superbomb Conference (the extract of 32 pages linked here is so severely redacted that it is less helpful than the brief but very lucid summary of its technical content, in the declassified FBI compilation of reports concerning data Klaus Fuchs sent to Stalin, linked here!). Teller had all the knowledge he needed in 1946, but didn't go ahead because he made the stupid error of killing progress off by his own "no-go theorem" against compression of fusion fuel. Teller did a "theoretical" calculation in which he claimed that compression has no effect on the amount of fusion burn because the compressed system is simply scaled down in size so that the same efficiency of fusion burn occurs, albeit faster, and then stops as the fuel thermally expands. This was wrong. Teller discusses the reason for his great error in technical detail during his tape-recorded interview by Chuck Hansen at Los Alamos on 7 June 1993 (C. Hansen, Swords of Armageddon, 2nd ed., pp. II-176-7):

"Now every one of these [fusion] processes varied with the square of density. If you compress the thing, then in one unit's volume, each of the 3 important processes increased by the same factor ... Therefore, compression (seemed to be) useless. Now when ... it seemed clear that we were in trouble, then I wanted very badly to find a way out. And it occurred to be than an unprecedentedly strong compression will just not allow much energy to go into radiation. Therefore, something had to be wrong with my argument and then, you know, within minutes, I knew what must be wrong ... [energy] emission occurs when an electron and a nucleus collide. Absorption does not occur when a light quantum and a nucleus ... or ... electron collide; it occurs when a light quantum finds an electron and a nucleus together ... it does not go with the square of the density, it goes with the cube of the density." (This very costly theoretical error, wasting five years 1946-51, could have been resolved by experimental nuclear testing. There is always a risk of this in theoretical physics, which is why experiments are done to check calculations before prizes are handed out. The ban on nuclear testing is a luddite opposition to technological progress in improving deterrence.)

(This 1946-51 theoretical "no-go theorem" anti-compression error of Teller's, which was contrary to the suggestion of compression at the April 1946 Superbomb Conference (as Teller himself noted on 14 August 1952), and which was corrected only in February 1951 by comparison with the known validity of compression in pure fission cores, after Ulam's argument that month for fission core compression by lens-focussed primary-stage shock waves, did not merely lead to Teller's dismissal of vital compression ideas. It also led to his false equations - exaggerating the cooling effect of radiation emission - causing underestimates of fusion efficiency in all theoretical calculations of fusion done until 1951! For this reason, Teller later repudiated the calculations that allegedly showed his Superbomb would fizzle; he argued that if it had been tested in 1946, the detailed data obtained - regardless of whatever happened - would at least have tested the theory, which would have led to rapid progress, because the theory was wrong. The entire basis of the cooling of fusion fuel by radiation leaking out was massively exaggerated, until Lawrence Livermore weaponeer John Nuckolls showed that there is a very simple solution: use baffle-re-radiated, softened x-rays for isentropic compression of low-density fusion fuel, e.g. very cold 0.3 keV x-rays rather than the usual 1-10 keV cold-warm x-rays emitted directly from the fission primary. Since the radiation losses are proportional to the fourth power of the x-ray energy or temperature, losses are virtually eliminated, allowing very efficient staging, as in Nuckolls' 99.9% clean 10 Mt Ripple II, detonated on 30 October 1962 at Christmas Island. Teller's classical Superbomb was actually analyzed by John C. Solem in a 15 December 1978 report, A modern analysis of the Classical Super, LA-07615, according to a Freedom of Information Act request filed by mainstream historian Alex Wellerstein, FOIA 17-00131-H, 12 June 2017, listed at https://www.governmentattic.org/46docs/NNSAfoiaLogs_2016-2020.pdf. However, a Google search for the documents Dr Wellerstein requested shows only a few at the US Gov DOE OpenNet/OSTI database or otherwise online as yet, e.g. LA-643 by Teller, On the Development of Thermonuclear Bombs, dated 16 February 1950. The page linked here stating that report was "never classified" is mistaken! One oddity about Teller's anti-compression "no-go theorem" is that even if fusion rates were independent of density, you would still want compression of the fissile material in a secondary stage such as a radiation-imploded Alarm Clock, because the whole basis of implosion fission bombs is the benefit of compression; another issue is that even if fusion rates were unaffected by density, inward compression would still help to delay the expansion of the fusion system which leads to cooling and quenching of the fusion burn.)
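The fourth-power scaling claimed above implies a dramatic reduction in losses for Nuckolls' soft x-ray approach. A one-line illustration (the 3 keV comparison temperature is an assumed representative value taken from the 1-10 keV range quoted, not a figure from the source):

```python
# Sketch: fourth-power scaling of radiation losses with x-ray temperature.
def relative_loss(T_keV, T_ref_keV):
    """Loss relative to a reference temperature, assuming loss ~ T^4."""
    return (T_keV / T_ref_keV) ** 4

# Nuckolls' "very cold" 0.3 keV re-radiated x-rays versus an assumed
# representative 3 keV primary x-ray temperature: losses fall by ~10,000x.
print(relative_loss(0.3, 3.0))  # ~ 1e-4
```

A factor-of-ten reduction in x-ray temperature thus cuts the radiation loss by four orders of magnitude, which is the quantitative sense in which losses are "virtually eliminated" in the text above.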

ABOVE: the FBI file on Klaus Fuchs contains a brief summary of the secret April 1946 Super Conference at Los Alamos which Fuchs attended, noting that compression of fusion fuel was discussed by Lansdorf during the morning session on 19 April, attended by Fuchs, and that: "Suggestions were made by various people in attendance as to the manner of minimizing the rise in entropy during compression." This fact is vitally interesting, since it proves that an effort was being made to secure isentropic compression of low-density fusion fuel in April 1946, sixteen years before John H. Nuckolls's isentropically compressed Ripple II device was tested on 30 October 1962, giving a 99.9% clean 10 megaton real H-bomb! So the Russians were given a massive head start on this isentropic compression of low-density fusion fuel for hydrogen bombs, used (according to Trutnev) both in single-primary tests like RDS-37 in November 1955 and in the double-primary designs, 2.5 times more efficient on a yield-to-mass basis, tested first on 23 February 1958! According to the FBI report, the key documents Fuchs gave to Russia were LA-551, Prima facie proof of the feasibility of the Super, 15 April 1946, and LA-575, Report of conference on the Super, 12 June 1946. Fuchs also handed over to Russia his own secret Los Alamos reports, such as LA-325, Initiator Theory, III. Jet Formation by the Collision of Two Surfaces, 11 July 1945; Jet Formation in Cylindrical Implosion with 16 Detonation Points, Secret, 6 February 1945; and Theory of Initiators II, Melon Seed, Secret, 6 January 1945.
Note the reference to Bretscher attending the Super Conference with Fuchs; Teller, in a classified 50th-anniversary conference at Los Alamos on the H-bomb, claimed that after he (Teller) left Los Alamos for the University of Chicago in 1946, Bretscher continued work at Los Alamos on Teller's 31 August 1946 "Alarm Clock" nuclear weapon (precursor of the Mike Sausage concept etc.); it was this layered uranium and fusion fuel "Alarm Clock" concept which led to the departure of Russian H-bomb design from American H-bomb design, simply because Fuchs left Los Alamos in June 1946, well before Teller invented the Alarm Clock concept on 31 August 1946 (Teller remembered the date precisely simply because he invented the Alarm Clock on the day his daughter was born!). Teller and Richtmyer also developed a variant called "Swiss Cheese", with small pockets or bubbles of expensive fusion fuels dispersed throughout cheaper fuel, in order to kindle a more cost-effective thermonuclear reaction; this later inspired the fission and fusion boosted "spark plug" ideas in later Sausage designs. E.g., security-cleared Los Alamos historian Anne Fitzpatrick stated during her 4 March 1997 interview with Robert Richtmyer, who co-invented the Alarm Clock with Teller, that the Alarm Clock evolved into the spherical secondary stage of the 6.9 megaton Castle-Union TX-14 nuclear weapon!

In fact (see Lawrence Livermore National Laboratory nuclear warhead designer Nuckolls' explanation in report UCRL-74345): "The rates of burn, energy deposition by charged reaction products, and electron-ion heating are proportional to the density, and the inertial confinement time is proportional to the radius. ... The burn efficiency is proportional to the product of the burn rate and the inertial confinement time ...", i.e. the fusion burn rate is directly proportional to the fuel density, which in turn (for a fixed fuel mass) is inversely proportional to the cube of its compressed radius. But the inertial confinement time for fusion to occur is proportional to the radius, so the fusion stage efficiency in a nuclear weapon is the product of the burn rate (i.e. 1/radius^3) and time (i.e. radius), so efficiency ~ radius/(radius^3) ~ 1/radius^2. Therefore, for a given fuel temperature, the total fusion burn, or the efficiency of the fusion stage, is inversely proportional to the square of the compressed radius of the fuel! (Those condemning Teller's theoretical errors or "arrogance" should be aware that he pushed hard all the time for experimental nuclear tests of his ideas, to check whether they were correct - exactly the right thing to do scientifically, since others who read his papers had the opportunity to point out any theoretical errors - but was rebuffed by those in power, who used a series of contrived arguments to deny progress, based upon what Harry would call "subconscious bias", if not arrogant, damning, overt bigotry against the kind of credible, overwhelming deterrence which had proved lacking a decade earlier, leading to WWII.
This callousness towards human suffering in war and under dictatorship existed in some UK physicists too: Joseph Rotblat's hatred of anything to deter Russia, be it civil defense or the West's tactical neutron bombs - he had no problem smiling and patting Russia's neutron bomb when visiting their labs during cosy, groupthink-deluded Pugwash campaigns for Russian-style "peaceful collaboration" - came from deep family communist convictions, since his brother was serving in the Red Army in 1944 when Rotblat alleged he heard General Groves declare that the bomb must deter Russia! Rotblat stated he left Los Alamos as a result. The actions of these groups are analogous to the "Cambridge Scientists Anti-War Group" in the 1930s. After Truman ordered an H-bomb, Bradbury at Los Alamos had to start a "Family Committee" because Teller had a whole "family" of H-bomb designs, ranging from the biggest, "Daddy", through various "Alarm Clocks", all the way down to small internally-boosted fission tactical weapons. From Teller's perspective, he wasn't putting all his eggs in one basket.)
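Nuckolls' UCRL-74345 scaling argument quoted above reduces to a one-line calculation; the following is an illustrative sketch of the proportionalities only, not a burn model:

```python
# Sketch of the burn-efficiency scaling quoted above: for a fixed fuel mass
# compressed to radius r, density ~ 1/r^3, so burn rate ~ 1/r^3, while the
# inertial confinement time ~ r; efficiency ~ (1/r^3) * r = 1/r^2.
def relative_efficiency(r):
    density = 1.0 / r**3       # fixed mass in a volume proportional to r^3
    burn_rate = density        # burn rate proportional to density
    confinement_time = r       # inertial confinement time proportional to r
    return burn_rate * confinement_time

# Halving the compressed radius quadruples the fusion burn efficiency:
print(relative_efficiency(0.5) / relative_efficiency(1.0))  # -> 4.0
```

This is the quantitative content of the 1/radius^2 result in the text: each factor-of-two improvement in compression radius pays off four-fold in burn efficiency, which is why compression (the very thing Teller's 1946 "no-go theorem" dismissed) matters so much.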

Above: declassified illustration from a January 1949 secret report by the popular physics author and Los Alamos nuclear weapons design consultant George Gamow, showing his suggestion of using x-rays from both sides of a cylindrically imploded fission device to expose two fusion capsules, to test whether compression helps (fusion capsule in the BeO box, right side) or is unnecessary (capsule on the left side). Neutron counters detect 14.1 Mev T+D neutrons using the time-of-flight method (higher energy neutrons travel faster than ~1 Mev fission stage neutrons, arriving at the detectors first, allowing discrimination of the neutron energy spectrum by time of arrival). It took over two years to actually fire this 225 kt shot (8 May 1951)! No wonder Teller was outraged. A few interesting reports by Teller, and also Oppenheimer's secret 1949 report opposing the H-bomb project as it then stood on the grounds of low damage per dollar - precisely the exact opposite of the "interpretation" the media and gormless fools will assert until the cows come home - are linked here. The most interesting is Teller's 14 August 1952 Top Secret paper debunking Hans Bethe's propaganda, by explaining that contrary to Bethe's claims, Stalin's spy Klaus Fuchs had the key "radiation implosion" secret of the H-bomb (see second para on p2) because he attended the April 1946 Superbomb Conference, which was not even attended by Bethe! It was this very fact, noted by two British attendees of the April 1946 Superbomb Conference before collaboration was ended later in the year by the 1946 Atomic Energy Act, that led to Sir James Chadwick's secret use of "radiation implosion" for stages 2 and 3 of his triple-staged H-bomb report the next month, "The Superbomb", a still-secret document that inspired Penney's original Tom/Dick/Harry staged and radiation-imploded H-bomb thinking, which is summarized by security-cleared official historian Arnold's Britain and the H-Bomb.
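The time-of-flight discrimination Gamow relied on is easy to quantify. A minimal sketch, assuming (purely for illustration) a 100 m flight path from burst to detector; the neutron rest energy and speed of light are standard constants:

```python
import math

M_N = 939.565   # neutron rest energy, MeV
C = 2.998e8     # speed of light, m/s

def neutron_speed(kinetic_mev):
    """Relativistic speed (m/s) of a neutron with the given kinetic energy."""
    gamma = 1.0 + kinetic_mev / M_N
    return C * math.sqrt(1.0 - 1.0 / gamma**2)

baseline = 100.0  # metres from device to detector (illustrative assumption)
t_fusion = baseline / neutron_speed(14.1)   # D+T fusion neutrons
t_fission = baseline / neutron_speed(1.0)   # typical fission-spectrum neutrons

print(f"14.1 MeV neutrons arrive after {t_fusion * 1e6:.2f} us")   # ~1.95 us
print(f" 1.0 MeV neutrons arrive after {t_fission * 1e6:.2f} us")  # ~7.24 us
```

The several-microsecond gap between the two arrival times is what lets the counters separate 14.1 Mev fusion neutrons from ~1 Mev fission stage neutrons.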
Teller's 24 March 1951 letter to Los Alamos director Bradbury was written just 15 days after the historic Teller-Ulam 9 March 1951 report on radiation coupling and "radiation mirrors" (i.e. plastic casing lining to re-radiate soft x-rays on to the thermonuclear stage to ablate and thus compress it), and states: "Among the tests which seem to be of importance at the present time are those concerned with boosted weapons. Another is connected with the possibility of a heterocatalytic explosion, that is, implosion of a bomb using the energy from another, auxiliary bomb. A third concerns itself with tests on mixing during atomic explosions, which question is of particular importance in connection with the Alarm Clock."

There is more to Fuchs' influence on the UK H-bomb than I go into in that paper; Chapman Pincher alleged that Fuchs was treated with special leniency at his trial, and later given early release in 1959, because of his contributions and help with the UK H-bomb as author of the key Fuchs-von Neumann x-ray compression mechanism patent. For example, Penney visited Fuchs in June 1952 in Stafford Prison; see pp309-310 of Frank Close's 2019 book "Trinity". Close argues that Fuchs gave Penney a vital tutorial on the H-bomb mechanism during that prison visit. That wasn't the last help, either, since the UK Controller for Atomic Energy, Sir Freddie Morgan, wrote to Penney on 9 February 1953 that Fuchs was continuing to help. Another gem: Close gives, on p396, the story of how the FBI became suspicious of Edward Teller after finding a man of his name teaching at the NY Communist Workers School in 1941 - the wrong Edward Teller, of course - yet Teller's wife was indeed a member of the Communist-front "League of Women Shoppers" in Washington, DC.

Chapman Pincher, who attended the Fuchs trial, writes about Fuchs hydrogen bomb lectures to prisoners in chapter 19 of his 2014 autobiography, Dangerous to know (Biteback, London, pp217-8): "... Donald Hume ... in prison had become a close friend of Fuchs ... Hume had repaid Fuchs' friendship by organising the smuggling in of new scientific books ... Hume had a mass of notes ... I secured Fuchs's copious notes for a course of 17 lectures ... including how the H-bomb works, which he had given to his fellow prisoners ... My editor agreed to buy Hume's story so long as we could keep the papers as proof of its authenticity ... Fuchs was soon due for release ..."

Chapman Pincher wrote about this as the front page exclusive of the 11 June 1952 Daily Express, "Fuchs: New Sensation", the very month Penney visited Fuchs in prison to receive his H-bomb tutorial! UK media insisted this was evidence that UK security still wasn't really serious about deterring further nuclear spies, and the revelations finally culminated in the allegations that the MI5 chief 1956-65 Roger Hollis was a Russian fellow-traveller (Hollis was descended from Peter the Great, according to his elder brother Chris Hollis' 1958 book Along the Road to Frome) and GRU agent of influence, codenamed "Elli". Pincher's 2014 book, written aged 100, explains that former MI5 agent Peter Wright suspected Hollis was Elli after evidence collected by MI6 agent Stephen de Mowbray was reported to the Cabinet Secretary. Hollis is alleged to have deliberately fiddled his report of interviewing GRU defector Igor Gouzenko on 21 November 1945 in Canada. Gouzenko had exposed the spy and Groucho Marx lookalike Dr Alan Nunn May (photo below), and also a GRU spy in MI5 codenamed Elli, who used only duboks (dead letter boxes), but Gouzenko told Pincher that when Hollis interviewed him in 1945 he wrote up a lengthy false report claiming to discredit many statements by Gouzenko: "I could not understand how Hollis had written so much when he had asked me so little. The report was full of nonsense and lies. As [MI5 agent Patrick] Stewart read the report to me [during the 1972 investigation of Hollis], it became clear that it had been faked to destroy my credibility so that my information about the spy in MI5 called Elli could be ignored. I suspect that Hollis was Elli." (Source: Pincher, 2014, p320.) Christopher Andrew claimed Hollis couldn't have been GRU spy Elli because KGB defector Oleg Gordievsky suggested it was the KGB spy Leo Long (sub-agent of KGB spy Anthony Blunt). However, Gouzenko was GRU, not KGB like Long and Gordievsky! 
Gordievsky's claim that "Elli" was on the cover of Long's KGB file was debunked by KGB officer Oleg Tsarev, who found that Long's codename was actually Ralph! Another declassified Russian document, from General V. Merkulov to Stalin dated 24 Nov 1945, confirmed Elli was a GRU agent inside British intelligence, whose existence was betrayed by Gouzenko. In Chapter 30 of Dangerous to Know, Pincher related how he was given a Russian suitcase-sized microfilm enlarger by Michael J. Butt, a doorman for secret communist meetings in London who claimed to have witnessed Hollis spying in 1959. According to Butt, Hollis delivered documents to Brigitte Kuczynski, younger sister of Klaus Fuchs' original handler, the notorious Sonia, aka Ursula. Hollis allegedly passed Minox films to Brigitte discreetly while walking through Hyde Park at 8pm after work. Brigitte gave her Russian-made Minox film enlarger to Butt to dispose of, but he kept it in his loft as evidence. (Pincher later donated it to King's College.) Other, more circumstantial, evidence is that Hollis recruited the spy Philby, Hollis secured the spy Blunt immunity from prosecution, Hollis cleared Fuchs in 1943, and MI5 allegedly destroyed Hollis' 1945 interrogation report on Gouzenko, to prevent the airing of the scandal that it was fake, after checking it with Gouzenko in 1972.

It should be noted that the very small number of Russian GRU illegal agents in the UK and the very small communist party membership had a relatively large influence on nuclear policy via infiltration of unions which had block votes in the Labour Party, as well as the indirect CND and "peace movement" lobbies saturating the popular press with anti-civil defence propaganda to make the nuclear deterrent totally incredible for any provocation short of a direct all-out countervalue attack. Under such pressure, UK Prime Minister Harold Wilson's government abolished the UK Civil Defence Corps in March 1968, making the UK nuclear deterrent totally incredible against major provocations. While there was some opposition to Wilson, it was focussed on his profligate nationalisation policies, which were undermining the economy and thus destabilizing military expenditure for national security. Peter Wright's 1987 book Spycatcher and various other sources, including Daily Mirror editor Hugh Cudlipp's book Walking on Water, documented that on 8 May 1968, the Bank of England's director Cecil King (who was also Chairman of Daily Mirror newspapers), Mirror editor Cudlipp and the UK Ministry of Defence's anti-nuclear Chief Scientific Adviser Sir Solly Zuckerman met at Lord Mountbatten's house in Kinnerton Street, London, to discuss a coup d'état to overthrow Wilson and make Mountbatten the UK President, a new position. King's position, according to Cudlipp - quite correct, as revealed by the UK economic crises of the 1970s when the UK was effectively bankrupt - was that Wilson was setting the UK on the road to financial ruin and thus military decay. Zuckerman and Mountbatten refused to take part in a revolution; however, Wilson's government was attacked by the Daily Mirror in a front page editorial by Cecil King two days later, on 10 May 1968, headlined "Enough is enough ... Mr Wilson and his Government have lost all credibility, all authority."
According to Wilson's secretary Lady Falkender, Wilson was only told of the coup discussions in March 1976.

CND and the UK communist party alternately made two contradictory claims: (a) that they were too small in numbers to have any influence on politics, and (b) that they were leading the country towards utopia via unilateral nuclear disarmament saturation propaganda about nuclear weapons annihilation (totally ignoring essential data on different nuclear weapon designs, yields, heights of burst, the "use" of a weapon as a deterrent to PREVENT an invasion of concentrated force, etc.) through the infiltrated BBC and most other media. Critics pointed out that Nazi Party membership in Germany was only 5% when Hitler became dictator in 1933, while in Russia there were only 200,000 Bolsheviks in September 1917, out of 125 million, i.e. 0.16%. The whole threat of such dictatorships is therefore a minority seizing power beyond its justifiable numbers, and controlling a majority which has different views. Traditional democracy itself is a dictatorship of the majority (via the ballot box, a popularity contest); minority dictatorship, by contrast, is rule by a fanatically motivated minority using force and fear (coercion) to control the majority. The coercion tactics used by foreign dictators to control the press in free countries are well documented, but never publicised widely. Hitler put pressure on Nazi-critics in the UK "free press" via the UK Government appeasers Halifax, Chamberlain and particularly the loathsome UK ambassador to Nazi Germany, Sir Neville Henderson: for example, trying to censor or ridicule the appeasement critic David Low, to get Captain W. E. Johns fired (Johns was editor of both Flying and Popular Flying, which had huge circulations and attacked appeasement as a threat to national security, since appeasement was used to justify reduced rearmament expenditure), and to get Winston Churchill deselected. These were all sneaky "back door" pressure-on-publishers tactics, dressed up as efforts to "ease international tensions"!
The same occurred during the Cold War, with personal attacks in Scientific American and Bulletin of the Atomic Scientists and by fellow travellers on Herman Kahn, Eugene Wigner, and others who warned we need civil defence to make a deterrent of large provocations credible in the eyes of an aggressor.

Chapman Pincher summarises the vast hypocritical Russian expenditure on anti-Western propaganda against the neutron bomb in Chapter 15, "The Neutron Bomb Offensive", of his 1985 book The Secret Offensive: "Such a device ... carries three major advantages over Hiroshima-type weapons, particularly for civilians caught up in a battle ... against the massed tanks which the Soviet Union would undoubtedly use ... by exploding these warheads some 100 feet or so above the massed tanks, the blast and fire ... would be greatly reduced ... the neutron weapon produces little radioactive fall-out so the long-term danger to civilians would be very much lower ... the weapon was of no value for attacking cities and the avoidance of damage to property can hardly be rated as of interest only to 'capitalists' ... As so often happens, the constant repetition of the lie had its effects on the gullible ... In August 1977, the [Russian] World Peace Council ... declared an international 'Week of action' against the neutron bomb. ... Under this propaganda Carter delayed his decision, in September ... a Sunday service being attended by Carter and his family on 16 October 1977 was disrupted by American demonstrators shouting slogans against the neutron bomb [see the 17 October 1977 Washington Post] ... Lawrence Eagleburger, when US Under Secretary of State for Political Affairs, remarked, 'We consider it probable that the Soviet campaign against the neutron bomb cost some $100 million'. ... Even the Politburo must have been surprised at the size of what it could regard as a Fifth Column in almost every country." [Unfortunately, Pincher himself had contributed to the anti-nuclear nonsense in his 1965 novel "Not with a Bang", in which small amounts of radioactivity from nuclear fallout combine with a medicine to exterminate humanity! The allure of anti-nuclear propaganda extends to all who wish to sell "doomsday fiction", not just Russian dictators but mainstream media storytellers in the West.
By contrast, Glasstone and Dolan's 1977 Effects of Nuclear Weapons doesn't even mention the neutron bomb, so there was no scientific and technical effort whatsoever by the West to make it a credible deterrent even in the minds of the public it had to protect from WWIII!]

"The Lance warhead is the first in a new generation of tactical mini-nukes that have been sought by Army field commanders: the leading advocates have been the series of American generals who have commanded the North Atlantic Treaty Organization theater. They have argued that the 7,000 nuclear warheads now in Europe are old, have too large a nuclear yield and thus would not be used in a war. With lower yields and therefore less possible collateral damage to civilian populated areas, these commanders have argued, the new mini-nukes are more credible as deterrents because they just might be used on the battlefield without leading to automatic nuclear escalation. Under the nuclear warhead production system, a President must personally give the production order. President Ford, according to informed sources, signed the order for the enhanced-radiation Lance warhead. The Lance already has regular nuclear warheads and is deployed with NATO forces in Europe. In addition to the Lance warhead, other new production starts include: An 8-inch artillery-fired nuclear warhead to replace those now in Europe. This shell had been blocked for almost eight years by Sen. Stuart Symington (D-Mo.), who had argued that it was not needed. Symington retired last year. The Pentagon and ERDA say the new nuclear 8-inch warhead would be safer from stealing by terrorists, Starbird testified. It will have "a command disable system" to melt its inner workings if necessary. ... In longer-term research, the bill contains money to finance an enhanced-radiation bomb to be dropped from aircraft." - Washington Post, 5 June 1977.

This debunks fake news that Teller's and Ulam's 9 March 1951 report LAMS-1225 itself gave Los Alamos the Mike H-bomb design, ready for testing! Teller was proposing a series of nuclear tests of the basic principles, not the 10 Mt Ivy-Mike, which was based on a report the next month by Teller alone, LA-1230, "The Sausage: a New Thermonuclear System". Once you grasp that, what did Ulam actually contribute to the hydrogen bomb? Nothing about implosion, compression or separate stages - all already done by von Neumann and Fuchs five years earlier - and just a lot of drivel about trying to channel material shock waves from a primary to compress another fissile core, a real dead end. What Ulam did was to kick Teller out of his self-imposed mental objection to compression devices. Everything else was Teller's: the radiation mirrors, the Sausage with its outer ablation pusher and its inner spark plug. Note also that contrary to official historian Arnold's book (which claims, due to a misleading statement by Dr Corner, that all the original 1946 UK copies of Superbomb Conference documentation were destroyed after being sent from AWRE Aldermaston to London between 1955-63), the documents did exist in the AWRE TPN series (theoretical physics notes, 100% of which have been preserved) and are at the UK National Archives, e.g. AWRE-TPN 5/54 is listed in National Archives discovery catalogue ref ES 10/5: "Miscellaneous super bomb notes by Klaus Fuchs"; see also the 1954 report AWRE-TPN 6/54, "Implosion super bomb: substitution of U235 for plutonium" ES 10/6; the 1954 report AWRE-TPN 39/54, "Development of the American thermonuclear bomb: implosion super bomb" ES 10/39; ES 10/21 "Collected notes on Fermi's super bomb lectures"; ES 10/51 "Revised reconstruction of the development of the American thermonuclear bombs"; and ES 1/548 and ES 1/461 "Superbomb Papers", etc.
Many reports are secret and retained, despite containing "obsolete" designs (although UK report titles are generally unredacted, such as: "Storage of 6kg Delta (Phase) -Plutonium Red Beard (tactical bomb) cores in ships")! It should also be noted that the Livermore Laboratory's 1958 TUBA spherical secondary, with an oralloy (enriched U235) outer pusher, was just a reversion from Teller's 1951 core spark plug idea in the middle of the fusion fuel, back to the 1944 von Neumann scheme of having fission material surrounding the fusion fuel. In other words, the TUBA was just a radiation and ionization imploded, internally fusion-boosted, second fission stage, which could have been accomplished a decade earlier if the will had existed, when all of the relevant ideas were already known. The declassified UK spherical secondary-stage alternatives linked here (tested as Grapple X, Y and Z with varying yields but similar size, since all used the 5 ft diameter Blue Danube drop casing) clearly show that a far more efficient fusion burn occurs by minimising the mass of hard-to-compress U235 (oralloy) sparkplug/pusher, but maximising the amount of lithium-7, not lithium-6. Such a secondary with minimal fissionable material also automatically has minimal neutron ABM vulnerability (i.e., "Radiation Immunity", RI). This is the current cheap Russian neutron weapon design, but not the current Western design of warheads like the W78, W88 and bomb B61.

So why on earth doesn't the West take the cheap efficient option of cutting expensive oralloy and maximising cheap natural (mostly lithium-7) LiD in the secondary? Even Glasstone's 1957 Effects of Nuclear Weapons on p17 (para 1.55) states that "Weight for weight ... fusion of deuterium nuclei would produce nearly 3 times as much energy as the fission of uranium or plutonium"! The sad answer is "density"! Natural LiD (containing 7.42% Li6 abundance) is a low density white/grey crystalline solid, like salt, that actually floats on water (lithium deuteroxide would be formed on exposure to water), since its density is just 820 kg/m^3. Since the ratio of the molecular masses of Li6D and Li7D is 8/9, it would be expected that the density of highly enriched (95%) Li6D is 739 kg/m^3, while for 36% enriched Li6D it is 793 kg/m^3. Uranium metal has a density of 19,000 kg/m^3, i.e. 25.7 times greater than 95% enriched Li6D, or 24 times greater than 36% enriched Li6D. Compactness, i.e. volume, is more important in a Western MIRV warhead than mass/weight! In the West, it's best to have a tiny-volume, very heavy, very expensive warhead. In Russia, cheapness outweighs volume considerations. The Russians in some cases simply allowed their more bulky warheads to protrude from the missile bus (see photo below), or compensated for lower yields at the same volume using clean LiD by using the savings in costs to build more warheads. (The West doubles the fission yield/mass ratio of some warheads by using U235/oralloy pushers in place of U238, which suffers from the problem that about half the neutrons it interacts with result in non-fission capture, as explained below.
Note that the 720 kiloton UK nuclear test Orange Herald device contained a hollow shell of 117 kg of U235 surrounded by what Lorna Arnold's book quotes John Corner as calling a "very thin" layer of high explosive; it was compact and unboosted - the boosting failed to work - and gave 6.2 kt/kg of U235, whereas the first version of the 2-stage W47 Polaris warhead contained 60 kg of U235 which produced most of the secondary stage yield of about 400 kt, i.e. 6.7 kt/kg of U235. Little difference - but because perhaps 50% of the total yield of the W47 was fusion, its efficiency of use of U235 must have actually been less than the Orange Herald device's, around 3 kt/kg of U235, which indicates design efficiency limits to "hydrogen bombs"! Yet anti-nuclear charlatans claimed that the Orange Herald bomb was a con!)
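The density and kt/kg figures in the two paragraphs above can be reproduced with simple arithmetic; a quick check, in which the 50% fusion fraction for the W47 is the text's own estimate, repeated here as an assumption:

```python
# 1. Li6D density by isotopic mass scaling from natural LiD
#    (7.42% Li-6, 820 kg/m^3): same crystal lattice, lighter molecules.
RHO_NATURAL = 820.0   # kg/m^3, natural LiD
F_NATURAL = 0.0742    # natural Li-6 atom fraction

def mean_mass(li6_fraction):
    return li6_fraction * 8.0 + (1.0 - li6_fraction) * 9.0  # amu: Li6D=8, Li7D=9

def lid_density(li6_fraction):
    return RHO_NATURAL * mean_mass(li6_fraction) / mean_mass(F_NATURAL)

print(round(lid_density(0.95)))               # ~739-740 kg/m^3 (95% enriched)
print(round(lid_density(0.36)))               # ~793-794 kg/m^3 (36% enriched)
print(round(19000.0 / lid_density(0.95), 1))  # uranium metal is ~25.7x denser

# 2. Yield per kilogram of U235: Orange Herald vs. the first W47.
print(round(720.0 / 117.0, 1))       # Orange Herald: 6.2 kt/kg
print(round(400.0 / 60.0, 1))        # W47 overall: 6.7 kt/kg
print(round(400.0 * 0.5 / 60.0, 1))  # W47 fission-only, if ~50% was fusion: 3.3 kt/kg
```

The scaling in part 1 assumes the enriched salt keeps the natural LiD lattice, which is why the quoted densities follow directly from the 8/9 molecular mass ratio.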

ABOVE: USA nuclear weapons data declassified by the UK Government in 2010 (the information was originally acquired due to the 1958 UK-USA Act for Cooperation on the Uses of Atomic Energy for Mutual Defense Purposes, in exchange for UK nuclear weapons data) as published at http://nuclear-weapons.info/images/tna-ab16-4675p63.jpg. This single table summarizes all key tactical and strategic nuclear weapons secret results from 1950s testing! (In order to analyze the warhead pusher thicknesses and very basic schematics from this table it is necessary to supplement it with the 1950s warhead design data declassified in other documents, particularly some of the data from Tom Ramos and Chuck Hansen, as quoted in some detail below.) The data on the mass of special nuclear materials in each of the different weapons argues strongly that the entire load of Pu239 and U235 in the 1.1 megaton B28 was in the primary stage, so that weapon could not have had a fissile spark plug in the centre, let alone a fissile ablator (unlike Teller's Sausage design of 1951), and so it appears the B28 had no need whatsoever of a beryllium neutron radiation shield to prevent pre-initiation of the secondary stage prior to its compression (on the contrary, such neutron exposure of the lithium deuteride in the secondary stage would be VITAL to produce some tritium in it prior to compression, to spark fusion when it was compressed). Arnold's book indeed explains that UK AWE physicists found the B28 to be an excellent, highly optimised, cheap design, unlike the later W47, which was extremely costly. The masses of U235 and Li6 in the W47 show the difficulties of trying to maintain efficiency while scaling down the mass of a two-stage warhead for SLBM delivery: much larger quantities of Li6 and U235 must be used to achieve a LOWER yield! To achieve thermonuclear warheads of low mass at sub-megaton yields, both the outer bomb casing and the pusher around the fusion fuel must be reduced:

"York ... studied the Los Alamos tests in Castle and noted most of the weight in thermonuclear devices was in their massive cases. Get rid of the case .... On June 12, 1953, York had presented a novel concept ... It radically altered the way radiative transport was used to ignite a secondary - and his concept did not require a weighty case ... they had taken the Teller-Ulam concept and turned it on its head ... the collapse time for the new device - that is, the amount of time it took for an atomic blast to compress the secondary - was favorable compared to older ones tested in Castle. Brown ... gave a female name to the new device, calling it the Linda." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute press, 2022, pp137-8. (So if you reduce the outer casing thickness to reduce warhead weight, you must complete the pusher ablation/compression faster, before the thinner outer casing is blown off, and stops reflecting/channelling x-rays on the secondary stage. Making the radiation channel smaller and ablative pusher thinner helps to speed up the process. Because the ablative pusher is thinner, there is relatively less blown-off debris to block the narrower radiation channel before the burn ends.)

"Brown's third warhead, the Flute, brought the Linda concept down to a smaller size. The Linda had done away with a lot of material in a standard thermonuclear warhead. Now the Flute tested how well designers could take the Linda's conceptual design to substantially reduce not only the weight but also the size of a thermonuclear warhead. ... The Flute's small size - it was the smallest thermonuclear device yet tested - became an incentive to improve codes. Characteristics marginally important in a larger device were now crucially important. For instance, the reduced size of the Flute's radiation channel could cause it to close early [with ablation blow-off debris], which would prematurely shut off the radiation flow. The code had to accurately predict if such a disaster would occur before the device was even tested ... the calculations showed changes had to be made from the Linda's design for the Flute to perform correctly." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute press, 2022, pp153-4. Note that the piccolo (the W47 secondary) is a half-sized flute, so it appears that the W47's secondary stage design miniaturization history was: Linda -> Flute -> Piccolo:

"A Division's third challenge was a small thermonuclear warhead for Polaris [the nuclear SLBM submarine system that preceded today's Trident]. The starting point was the Flute, that revolutionary secondary that had performed so well the previous year. Its successor was called the Piccolo. For Plumbbob [Nevada, 1957], the design team tested three variations of the Piccolo as a parameter test. One of the variants outperformed the others ... which set the stage for the Hardtack [Nevada and Pacific, 1958] tests. Three additional variations for the Piccolo ... were tested then, and again an optimum candidate was selected. ... Human intuition as well as computer calculations played crucial roles ... Finally, a revolutionary device was completed and tested ... the Navy now had a viable warhead for its Polaris missile. From the time Brown gave Haussmann the assignment to develop this secondary until the time they tested the device in the Pacific, only 90 days had passed. As a parallel to the Robin atomic device, this secondary for Polaris laid the foundation for modern thermonuclear weapons in the United States." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute Press, 2022, pp177-8. (Ramos is very useful in explaining that many of the 1950s weapons with complex non-spherical, non-cylindrical shaped primaries and secondaries were simply far too complex to fully simulate on the really pathetic computers they had - Livermore got a 4,000 vacuum tube-based IBM 701 with 2 kB memory in 1956, and AWRE Aldermaston in the UK had to wait another year for theirs - so they instead did huge numbers of experimental explosive tests.
For instance, on p173, Ramos discloses that the Swan primary which developed into the 155mm tactical shell, "went through over 100 hydrotests", non-nuclear tests in which fissile material is replaced with U238 or other substitutes, and the implosion is filmed with flash x-ray camera systems.)

"An integral feature of the W47, from the very start of the program, was the use of an enriched uranium-235 pusher around the cylindrical secondary." - Chuck Hansen, Swords 2.0, p. VI-375 (Hansen's source is his own notes taken during a 19-21 February 1992 nuclear weapons history conference he attended; if you remember the context, "Nuclear Glasnost" became fashionable after the Cold War ended, enabling Hansen to acquire almost unredacted historical materials for a few years until nuclear proliferation became a concern in Iraq, Afghanistan, Iran and North Korea). The key test of the original (Robin primary and Piccolo secondary) Livermore W47 was the 412 kt Hardtack-Redwood shot on 28 June 1958. Since Li6D utilized at 100% efficiency would yield 66 kt/kg, the W47 fusion efficiency was only about 6%; since 100% fission of U235 yields 17 kt/kg, the W47's Piccolo fission (the U235 pusher) efficiency was about 20%; the comparable figures for secondary stage fission and fusion fuel burn efficiencies in the heavy B28 are about 7% and 15%, respectively:
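Those efficiency figures can be cross-checked against the 412 kt Hardtack-Redwood yield and the 60 kg U235 load from the declassified table. Splitting the yield into fission and fusion parts this way is my own bookkeeping under the quoted efficiencies, not a sourced breakdown:

```python
# Cross-check of the quoted W47 (Robin/Piccolo) efficiency figures.
U235_KT_PER_KG = 17.0   # 100% fission of U235
LI6D_KT_PER_KG = 66.0   # 100% burn of Li6D (figure quoted above)

u235_mass = 60.0        # kg of U235 in the pusher (declassified table)
fission_eff = 0.20      # ~20% pusher fission efficiency, per the text
total_yield = 412.0     # kt, Hardtack-Redwood, 28 June 1958

fission_yield = u235_mass * U235_KT_PER_KG * fission_eff
fusion_yield = total_yield - fission_yield
print(round(fission_yield))  # → 204 kt from the U235 pusher
print(round(fusion_yield))   # → 208 kt left over for fusion
# At ~6% burn efficiency this remainder implies roughly 50 kg of Li6D
# (an inference from the quoted figures, not a declassified number):
print(round(fusion_yield / (0.06 * LI6D_KT_PER_KG)))  # ~53 kg
```

The pusher fission and fusion contributions come out roughly equal, consistent with the "perhaps 50% fusion" estimate used earlier for the W47.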

ABOVE: the heavy B28 gave a very "big bang for the buck": it was cheap in terms of expensive Pu, U235 and Li6, and this was the sort of deterrent which was wanted by General LeMay for the USAF, which wanted as many weapons as possible, within the context of Eisenhower's budgetary concerns. But its weight (not its physical size) made it unsuitable for SLBM Polaris warheads. The first SLBM warhead, the W47, was almost the same size as the B28 weapon package, but much lighter due to having a much thinner "pusher" on the secondary, and a thinner casing. But this came at a large financial cost in terms of the quantities of special nuclear materials required to get such a lightweight design to work, and also a large loss of total yield. The fusion fuel burn efficiency ranges from 6% for the 400 kt W47 to 15% for the 1.1 megaton B28 (note that for the very heavily cased 11-15 megaton yield tests at Castle, up to 40% fusion fuel burn efficiency was achieved), whereas the secondary stage ablative pusher fission efficiency ranged from 7% for a 1.1 inch thick natural uranium (99.3% U238) ablator to 20% for a 0.15 inch thick highly enriched oralloy (U235) ablator. From the brief description of the design evolution given by Dr Tom Ramos (Lawrence Livermore National Laboratory), it appears that when the x-ray channelling outer case thickness of the weapon is reduced to save weight, the duration of the x-ray coupling is reduced, so the dense metal pusher thickness must be reduced if the same compression factor (approximately 20) for the secondary stage is to be accomplished (lithium deuteride, being of low density, is far more compressible by a given pressure than dense metal). In both examples, the secondary stage is physically a boosted fission stage.
(If you are wondering why the hell the designers don't simply use a hollow-core U235 bomb like Orange Herald, instead of bothering with such inefficient x-ray coupled two-stage designs as these, the answer is straightforward: the risk of large fissile core meltdown by neutrons from Moscow's ABM defensive nuclear warheads, i.e. neutron bombs.)

The overall weight of the W47 was minimized by replacing the usual thick layer of U238 pusher with a very thin layer of fissile U235 (supposedly Teller's suggestion), which is more efficient for fission but is limited by critical mass issues. The W47 used a 95% enriched Li6D cylinder with a 3.8mm thick U235 pusher; the B28 secondary was 36% enriched Li6D, with a very heavy 3cm thick U238 pusher. As shown below, it appears the B28 was related to the Los Alamos clean design of the TX21C, tested as the 95% clean 4.5 megaton Redwing-Navajo in 1956, and did not have a central fissile spark plug. From the declassified fallout composition, it is known the Los Alamos designers replaced the outer U238 pusher of Castle secondaries with lead in Navajo. Livermore did the same for their 85% clean 3.53 megaton Redwing-Zuni test, but Livermore left in the central fission spark plug, which contributed 10% of its 15% fission yield, instead of removing the neutron shield, using foam channel filler to slow down the x-ray compression, and thereby using primary stage neutrons to split lithium-6, giving tritium prior to compression. Our point is that Los Alamos got it wrong in sticking too conservatively to ideology: for clean weapons they should have got rid of the dense lead pusher and gone for John H. Nuckolls' idea (also used by Fuchs in 1946 and the Russians in 1955 and 1958) of a low-density pusher for isentropic compression of low-density fusion fuel. This error is the reason why those early cleaner weapons were extremely heavy, due to unnecessary 2" thick lead or tungsten pushers around the fusion fuel, which massively reduced their yield-to-weight ratios, so that LeMay rejected them!

Compare these data for the 20 inch diameter, 49 inch long, 1600 lb, 1.1 megaton B28 bomb to the 18 inch diameter, 47 inch long, 700 lb, 400 kt Mk47/W47 Polaris SLBM warhead (this is the correct yield for the first version of the W47, confirmed by UK data in Lorna Arnold's Britain and the H-Bomb, 2001, and AB 16/3240; Wikipedia wrongly gives the 600 kt figure from Hansen, which was a speculation or a later upgrade). The key difference is that the W47 is much lighter, and thus suitable for the Polaris SLBM, unlike the heavier, higher yield B28. Both B28 and W47 used cylindrical sausages, but they are very different in composition; the B28 used a huge mass of U238 in its ablative sausage outer shell or pusher, while the W47 used oralloy/U235 in the pusher. The table shows the total amounts of Pu, Oralloy (U235), Lithium-6 (excluding cheaper lithium-7, which is also present in varying amounts in different thermonuclear weapons), and tritium (which is used for boosting inside fissile material, essentially to reduce the amount of Pu and therefore the vulnerability of the weapon to Russian enhanced-neutron ABM warhead meltdown). The B28 also has an external dense natural uranium (99.3% U238) "ablative pusher shell" whose mass is not listed in this table. The table shows that the 400 kt W47 Polaris SLBM warhead contains 60 kg of U235 (nearly as much as the 500 kt pure fission Mk18) in an ablative pusher shell around the lithium deuteride, so that the cylinder of neutron-absorbing lithium-6 deuteride within it keeps that mass of U235 subcritical until compressed. So the 400 kt W47 contains far more Pu, U235, Li6 and T than the higher yield 1.1 megaton B28: this is the big $ price you pay for reducing the mass of the warhead; the total mass of the W47 is only 44% of the mass of the B28, since the huge mass of cheap U238 pusher in the B28 is replaced by a smaller mass of U235, which is more efficient because (as Dr Carl F. 
Miller reveals in USNRDL-466, Table 6), about half of the neutrons hitting U238 don't cause fission but instead undergo non-fission capture reactions which produce U239, plus the n,2n reaction that produces U237, emitting a lot of very low energy gamma rays in the fallout. For example, in the 1954 Romeo nuclear test (which, for simplicity, we quote since it used entirely natural LiD, with no expensive enrichment of the Li6 isotope whatsoever), the U238 jacket fission efficiency was reduced by capture as follows: 0.66 atom/fission of U239, 0.10 atom/fission of U237 and 0.23 atom/fission of U240, a total of 0.66 + 0.10 + 0.23 ~ 1 atom/fission, i.e. 50% fission in the U238 pusher, versus 50% non-fission neutron captures. So by using U235 in place of U238, you virtually eliminate the non-fission capture (see UK Atomic Weapons Establishment graph of fission and capture cross-sections for U235, shown below), which roughly halves the mass of the warhead for a given fission yield. This same principle of using an outer U235/oralloy pusher instead of U238 to reduce mass - albeit with the secondary cylindrical "Sausage" shape now changed to a sphere - applies to today's miniaturised, high yield, low mass "MIRV" warheads. Just as the lower-yield W47 counter-intuitively used more expensive ingredients than the bulkier higher-yield B28, modern compact, high-yield oralloy-loaded warheads literally cost a bomb, just to keep the mass down! There is evidence Russia uses alternative ideas.
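The neutron economy and mass arithmetic above can be checked with a short script (values as quoted from the declassified Romeo fallout data and the published warhead weights; an illustrative sketch, not a design code):

```python
# Sketch of the U238 pusher neutron economy and the W47/B28 mass comparison
# (Romeo fallout data and warhead weights as quoted in the text above).

# Non-fission capture products in the U238 pusher, atoms per fission:
captures_per_fission = {
    "U239 (radiative capture)": 0.66,
    "U237 (n,2n reaction)": 0.10,
    "U240 (further capture)": 0.23,
}

total_captures = sum(captures_per_fission.values())  # ~1 capture per fission
fission_fraction = 1.0 / (1.0 + total_captures)      # fraction of absorptions causing fission

print(f"Captures per fission in U238: {total_captures:.2f}")    # 0.99
print(f"U238 pusher fission fraction: {fission_fraction:.0%}")  # 50%

# W47 (700 lb) vs B28 (1600 lb) total mass ratio quoted in the text:
print(f"W47/B28 mass ratio: {700 / 1600:.0%}")                  # 44%
```

This reproduces the text's "~1 atom/fission" capture total, the 50% fission efficiency of the U238 jacket, and the 44% mass ratio.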

This is justified by the data for a total U238 capture-to-fission ratio of 1 in the 11 megaton Romeo test, and also by the cross-sections for U235 capture and fission on the AWE graph for the relevant neutron energy range of about 1-14 MeV. If half the neutrons are captured in U238 without fission, then the maximum fission yield you can possibly get from "x" kg of U238 pusher is HALF the energy obtained from 100% fission of "x" kg of U238. Since with U238 only about half the atoms can undergo fission by thermonuclear neutrons (because the other half undergo non-fission capture), the energy density (i.e., the Joules/kg produced by the fission explosion of the pusher) reached by an exploding U238 pusher is only half that reached by U235 (in which there is far less non-fission capture of neutrons, which doubles the pusher mass without doubling the fission energy release). So a U235 pusher will reach twice the temperature of a U238 pusher, doubling its material heating of the fusion fuel within, prolonging the fusion burn and thus increasing fusion burn efficiency. 10 MeV neutron energy is important since it allows for the likely average scattering of 14.1 MeV D+T fusion neutrons, and it is also the energy at which the most important non-fission reaction, the (n,2n) cross-section, peaks for both U235 (peak of 0.88 barn at 10 MeV) and U238 (peak of 1.4 barns at 10 MeV). For 10 MeV neutrons, U235 and U238 have fission cross-sections of 1.8 and 1 barn, respectively. For 14 MeV neutrons, U238 has a (n,2n) cross-section of 0.97 barn for U237 production. So, ignoring non-fission captures, you need 1.8/1 = 1.8 times greater thickness of pusher for U238 than for U235 to achieve the same amount of fission. But this simple consideration ignores the x-ray ablation requirement of the exploding pusher, so there are several factors requiring detailed computer calculations, and/or nuclear testing.
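A minimal sketch of this thickness estimate, using only the quoted 10 MeV fission cross-sections and ignoring ablation and non-fission capture (as the text itself cautions):

```python
# Pusher thickness estimate from the quoted 10 MeV fission cross sections
# (barns); ablation and non-fission capture are deliberately ignored here.
sigma_fission_u235 = 1.8  # barn, at 10 MeV
sigma_fission_u238 = 1.0  # barn, at 10 MeV

# For equal fission probability, the required thickness scales inversely
# with the fission cross section, so U238 needs a thicker layer:
thickness_ratio = sigma_fission_u235 / sigma_fission_u238
print(f"Required U238/U235 pusher thickness ratio: {thickness_ratio:.1f}")  # 1.8

# With ~50% of U238 absorptions being non-fission captures, the exploding
# U238 pusher reaches only about half the J/kg (and hence, per the argument
# above, half the temperature effect) of a U235 pusher:
relative_energy_density_u238 = 0.5
print(f"U238 energy density relative to U235: {relative_energy_density_u238:.0%}")
```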

Note: there is an extensive collection of declassified documents released after Chuck Hansen's final edition, Swords 2.0, which are now available at https://web.archive.org/web/*/http://www.nnsa.energy.gov/sites/default/files/nnsa/foiareadingroom/*, being an internet-archive back-up of a now-removed US Government Freedom of Information Act Reading Room. Unfortunately the documents were only identified by number sequence, not by report title or content, in that reading room, and so failed to achieve wide attention when originally released! (This includes extensive "Family Committee" H-bomb documentation and many long-delayed FOIA requests submitted originally by Hansen, but not released in time for inclusion in Swords 2.0.) As the extract below - from declassified document RR00132 - shows, some declassified documents contained very detailed information, or typewriter spaces that could only be filled by a single specific secret word (in this example, details of the W48 linear implosion tactical nuclear warhead, including the fact that it used PBX9404 plastic bonded explosive glued to the brittle beryllium neutron reflector around the plutonium core using Adiprene L100 adhesive!).

ABOVE: Declassified data on the radiation flow analysis for the 10 megaton Mike sausage: http://nnsa.energy.gov/sites/default/files/nnsa/foiareadingroom/RR00198.pdf Note that the simplistic "no-go theorem" given in this extract, against any effect from varying the temperature to help the radiation channelling, was later proved false by John H. Nuckolls (just as Teller's anti-compression "no-go theorem" was later proved false), since lowered temperature delivers energy where it is needed while massively reducing radiation losses (which go as the fourth power of temperature, i.e. of x-ray energy in keV).
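The fourth-power point can be illustrated numerically (a toy blackbody scaling argument, not a radiation-hydrodynamics calculation):

```python
# Illustration of the fourth-power radiation loss scaling mentioned above:
# blackbody radiated power per unit area goes as T^4 (Stefan-Boltzmann),
# so a modest drop in radiation channel temperature slashes wall losses.
def radiation_loss_ratio(t_low: float, t_high: float) -> float:
    """Ratio of blackbody radiated power at t_low versus t_high (same units)."""
    return (t_low / t_high) ** 4

# Halving the channel temperature cuts radiation losses 16-fold:
print(radiation_loss_ratio(0.5, 1.0))  # 0.0625 = 1/16
```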

ABOVE: Hans A. Bethe's disastrous back-of-the-envelope "no-go theorem" against lithium-7 fission into tritium by 14.1 MeV D+T neutrons in Bravo (which contained 40% lithium-6 and 60% lithium-7, unnecessarily enriched - at great expense and effort - from the natural 7.42% lithium-6 abundance). It was Bethe's nonsense "physics" speculation, unbacked by serious calculation, that caused Bravo to go off at 2.5 times the expected 6 megatons, and therefore for the Japanese Lucky Dragon tuna trawler crew in the maximum fallout hotspot area 80 miles downwind to be contaminated by fallout, and also for Rongelap's people to be contaminated ("accidents" that inevitably kickstarted the originally limited, early 1950s USSR-funded Communist Party anti-nuclear deterrence movements in the West into mainstream media and thus politics). There was simply no solid basis for assuming that the highly penetrating 14.1 MeV neutrons would be significantly slowed by scattering in the fuel before hitting lithium-7 nuclei. Even Teller's 1950 report LA-643, at page 17, estimated that in a fission-fusion Alarm Clock the ratio of 14 MeV to 2.5 MeV neutrons was 0.7/0.2 = 3.5. Bethe's complacently bad guesswork-based physics also led to the EMP fiasco for high altitude bursts, after he failed to predict the geomagnetic field deflection of Compton electrons at high altitude in his secret report “Electromagnetic Signal Expected from High-Altitude Test”, Los Alamos report LA-2173, October 1957, Secret. He repeatedly caused nuclear weapons effects study disasters.
For the true utility of lithium-7, which is actually BETTER than lithium-6 at tritium production when struck by 14.1 MeV D+T fusion neutrons, and its consequences for cheap isentropically compressed fusion capsules in Russian neutron bombs, please see my paper here, which gives a graph of lithium isotopic cross-section versus neutron energy, plus the results when Britain used cheap lithium-7 in Grapple Y to yield 3 megatons (having got lower yields with costly lithium-6 in previous tests!).
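The yield figures quoted above are easy to check arithmetically:

```python
# Arithmetic check of the Bravo yield figures quoted above.
expected_yield_mt = 6.0
overshoot_factor = 2.5
print(expected_yield_mt * overshoot_factor)  # 15.0 megatons actual yield

# Teller's LA-643 (p. 17) ratio of 14 MeV to 2.5 MeV neutrons in a
# fission-fusion Alarm Clock, as quoted:
print(round(0.7 / 0.2, 1))  # 3.5
```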

Update (15 Dec 2023): PDF uploaded of UK DAMAGE BY NUCLEAR WEAPONS (linked here on Internet Archive) - a secret 1000-page UK and USA nuclear weapon test effects analysis, and protective measures determined at those tests (not guesswork), relevant to escalation threats by Russia over EU invasion (linked here at wordpress) in response to Ukraine potentially joining the EU (this is now fully declassified without deletions, and in the UK National Archives at Kew):

Hiroshima and Nagasaki terrorist liars debunked by secret American government evidence that simple shelters worked, REPORT LINKED HERE (this was restricted from public view and never published by the American government, and Glasstone's lying Effects of Nuclear Weapons book reversed its evidence for propaganda purposes, a fact still covered by all the lying cold war pseudo "historians" today), Operation Hurricane 1952 declassified nuclear weapon test data (here), declassified UK nuclear tested shelter research reports (here), declassified EMP nuclear test research data (here), declassified clandestine nuclear bombs in ships attack on Liverpool study (here), declassified fallout decontamination study for UK recovery from nuclear attack (here), declassified Operation Buffalo surface burst and near surface burst fallout patterns, water decontamination, initial radiation shielding at Antler nuclear tests, and resuspension of deposited fallout dust into the air (inhalation hazard) at different British nuclear tests, plus Operation Totem nuclear tests crater region radiation surveys (here), declassified Operation Antler nuclear blast precursor waveforms (here), declassified Operation Buffalo nuclear blast precursor waveforms (here), declassified UK Atomic Weapons Establishment nuclear weapons effects symposium (here), and declassified UK Atomic Weapons Establishment paper on the gamma radiation versus time at Crossroads tests Able and Baker (here, paper by inventor of lenses in implosion weapons, James L. 
Tuck of the British Mission to Los Alamos and Operation Crossroads, clearly showing how initial gamma shielding in an air burst can be achieved with a few seconds warning, and giving the much greater escape times available for residual radiation dose accumulations in an underwater burst; key anti-nuclear hysteria data kept covered up by Glasstone and the USA book Effects of Nuclear Weapons), and Penney and Hicks paper on the base surge contamination mechanism (here), and Russian nuclear warhead design evidence covered-up by both America and the so-called arms control and disarmament "experts" who always lie and distort the facts to suit their own agenda to try to start a nuclear war (linked here). If they wanted "peace" they'd support the proved facts, available on this blog nukegate.org since 2006, and seek international agreement to replace the incredible, NON-war-deterring strategic nuclear weapons with safe tactical neutron warheads which are collateral-damage averting and invasion-deterring (thus war deterring in all its forms, not only nuclear), plus civil defence against all forms of collateral damage from war, which reduces escalation risks during terrorist actions, as proved in wars which don't escalate because of effective civil defence and credible deterrence (see below). Instead, they support policies designed to maximise civilian casualties and to deliberately escalate war, to profit "politically" from the disasters caused, which they blame falsely on nuclear weapons, as if deterrence causes war! (Another lie believed by mad/evil/gullible mainstream media/political loons in "authority".) A good summary of the fake news basis of "escalation" blather against credible tactical nuclear deterrence of the invasions that set off wars is inadvertently provided by Lord David Owen's 2009 "Nuclear Papers" (Liverpool Uni Press), compiling his declassified nuclear disarmament propaganda reports written while he was UK Foreign Secretary 1977-9.
It's all Carter era appeasement nonsense. For example, on pp. 158-8 he reprints his Top Secret 19 Dec 1978 "Future of the British Deterrent" report to the Prime Minister, which states that "I am not convinced by the contention ... that the ability to destroy at least 10 major cities, or inflict damage on 30 major targets ... is the minimum criterion for a British deterrent." (He actually thinks this is too strong a deterrent, despite the fact that it is incredible against the realpolitik tactics of dictators who make indirect provocations like invading their neighbours!) The reality Owen ignores is that Russia had and still has civil defence shelters and evacuation plans, so threatening some damage in retaliation is not a credible deterrent against the invasions that set off both world wars. On page 196, he gives a Secret 18 April 1978 paper stating that NATO then had 1000 nuclear artillery pieces (8" and 155mm), 200 Lance and Honest John tactical nuclear missile systems, and 135 Pershing; all now long ago disarmed and destroyed, while Russia now has over 2000 dedicated tactical nuclear weapons of high neutron output (unlike EM1's data for the low yield option of the multipurpose NATO B61). Owen proudly congratulates himself on his Brezhnev-supporting, anti-neutron bomb ranting 1978 book, "Human Rights", pp. 136-7. If Owen really wants "Human Rights", he needs to back the neutron bomb now, to deter the dictatorships which destroy human rights! His 2009 "Nuclear Papers" at p. 287 gives the usual completely distorted analysis of the Cuban missiles crisis, claiming that despite the overwhelming American tactical and strategic nuclear superiority for credible deterrence in 1962, the world came "close" to a nuclear war. It's closer now, mate, when thanks to your propaganda we no longer have a credible deterrent, civil defence, or tactical neutron warheads. Pathetic.

ABOVE secret reports on Australian-British nuclear test operations at Maralinga in 1956 and 1957, Buffalo and Antler, proved that even at 10 psi peak overpressure for the 15 kt Buffalo-1 shot, the dummy lying prone facing the blast was hardly moved, due to the low cross-sectional area exposed to the blast winds, relative to standing dummies which were severely displaced and damaged. The value of trenches in protecting personnel against blast winds and radiation was also proved in tests (gamma radiation shielding of trenches had been proved at an earlier nuclear test in Australia, Operation Hurricane in 1952). (Antler report linked here; Buffalo report linked here.) This debunks the US Department of Defense models claiming that people will automatically be blown out of the upper floors of modern city buildings at very low pressures, and killed by the gravitational impact with the pavement below! In reality, tall buildings mutually shield one another from the blast winds, not to mention the radiation (proven in the latest post on this blog), and on seeing the flash most people will have time to lie down on typical surfaces like carpet, which give a frictional resistance to displacement, ignored in fiddled models which assume surfaces have less friction than a skating rink; all of this was omitted from the American 1977 Glasstone and Dolan book "The Effects of Nuclear Weapons". As Tuck's paper below on the gamma radiation dose rate measurements on ships at the Operation Crossroads July 1946 nuclear tests proved, contrary to Glasstone and Dolan, scattered radiation contributions are small, so buildings or ships' gun turrets provided excellent radiation "shadows" to protect personnel. 
This effect was then calculated by UK civil defence weapons effects expert Edward Leader-Williams in his paper presented at the UK's secret London Royal Society Symposium on the Physical Effects of Atomic Weapons, but the nuclear test data as always was excluded from the American Glasstone book published the next year, The Effects of Atomic Weapons in deference to lies about the effects in Hiroshima, including an "average" casualty curve which deliberately obfuscated huge differences in survival rates in different types of buildings and shelters, or simply in shadows!

Above: Edward Leader-Williams on the basis for UK civil defence shelters in SECRET 1949 Royal Society's London Symposium on physical effects of atomic weapons, a study that was kept secret by the Attlee Government and subsequent UK governments, instead of being openly published to enhance public knowledge of civil defence effectiveness against nuclear attack. Leader-Williams also produced the vital civil defence report seven years later (published below for the first time on this blog), proving civil defence sheltering and city centre evacuation is effective against 20 megaton thermonuclear weapons. Also published in the same secret symposium, which was introduced by Penney, was Penney's own Hiroshima visit analysis of the percentage volume reduction in overpressure-crushed empty petrol cans, blueprint containers, etc., which gave a blast partition yield of 7 kilotons (or 15.6 kt total yield, if taking the nuclear blast as 45% of total yield, i.e. 7/0.45 = 15.6, as done in later AWRE nuclear weapons test blast data reports). Penney in a 1970 updated paper allowed for blast reduction due to the damage done in the city bursts.
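Penney's blast partition arithmetic can be verified directly (figures as quoted above):

```python
# Penney's Hiroshima yield arithmetic, as described above: a blast
# partition yield of 7 kt, with nuclear blast taken as 45% of total
# yield (the convention of later AWRE nuclear test blast data reports).
blast_partition_kt = 7.0
blast_fraction_of_total = 0.45
total_yield_kt = blast_partition_kt / blast_fraction_of_total
print(f"Total yield: {total_yield_kt:.1f} kt")  # 15.6 kt
```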

ABOVE: The 1996 Northrop EM-1 (see extracts below showing protection by modern buildings and also simple shelters very close to nuclear tests; note that Northrop's entire set of damage ranges as a function of yield for underground shelters, tunnels and silos is based on two contained deep underground nuclear tests of different yield, scaled to surface burst using the assumption of 5% yield ground coupling relative to the underground shots; this 5% equivalence figure appears to be an exaggeration for compact modern warheads, e.g. the paper “Comparison of Surface and Sub-Surface Nuclear Bursts,” from Steven Hatch, Sandia National Laboratories, to Jonathan Medalia, October 30, 2000, shows a 2% equivalence, i.e. Hatch shows that a 1 megaton surface burst produces identical ranges to underground targets as a 20 kt burst at >20m depth of burst, whereas Northrop would require 50 kt) has not been openly published, despite such protection being used in Russia! This proves heavy bias against credible tactical nuclear deterrence of the invasions that trigger major wars that could escalate into nuclear war (Russia has 2000+ dedicated neutron bombs; we don't!), and against simple nuclear-proof-tested civil defence, which makes such deterrence credible and of course is also of validity against conventional wars, severe weather, peacetime disasters, etc.
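The two coupling assumptions can be compared with a one-line calculation (figures as quoted; the coupling fractions themselves are the point in dispute):

```python
# Comparison of the two ground-coupling assumptions quoted above, for
# scaling a surface burst to an equivalent contained buried burst.
def equivalent_buried_yield_kt(surface_yield_kt: float, coupling_fraction: float) -> float:
    """Buried-burst yield giving the same ground shock as the surface burst."""
    return surface_yield_kt * coupling_fraction

surface_yield_kt = 1000.0  # 1 megaton surface burst

# Hatch (Sandia, 2000): ~2% coupling
print(f"{equivalent_buried_yield_kt(surface_yield_kt, 0.02):.0f} kt")  # 20 kt
# Northrop EM-1 assumption: 5% coupling
print(f"{equivalent_buried_yield_kt(surface_yield_kt, 0.05):.0f} kt")  # 50 kt
```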

The basic fact is that nuclear weapons can deter/stop invasions unlike the conventional weapons that cause mass destruction, and nuclear collateral damage is eliminated easily for nuclear weapons by using them on military targets, since at collateral damage distances all the effects are sufficiently delayed in arrival (unlike the case for the smaller areas affected by conventional weapons), and as the original 1951 SECRET American Government "Handbook on Capabilities of Atomic Weapons" (limited report AD511880L, forerunner to today's still secret EM-1) stated in Section 10.32:

"PERHAPS THE MOST IMPORTANT ITEM TO BE REMEMBERED WHEN ESTIMATING EFFECTS ON PERSONNEL IS THE AMOUNT OF COVER ACTUALLY INVOLVED. ... IT IS OBVIOUS THAT ONLY A FEW SECONDS WARNING IS NECESSARY UNDER MOST CONDITIONS TO TAKE FAIRLY EFFECTIVE COVER. THE LARGE NUMBER OF CASUALTIES IN JAPAN RESULTED FOR THE MOST PART FROM THE LACK OF WARNING."

As for Hitler's stockpile of 12,000 tons of tabun nerve gas, whose strategic and also tactical use was deterred by proper defences (gas masks for all civilians and soldiers, as well as UK stockpiles of fully trial-tested deliverable biological agent anthrax and mustard gas retaliation capacity), it is possible to deter strategic nuclear escalation to city bombing, even within a world war with a crazy terrorist, if all the people are protected by both defence and deterrence.

J. R. Oppenheimer (opposing Teller), February 1951: "It is clear that they can be used only as adjuncts in a military campaign which has some other components, and whose purpose is a military victory. They are not primarily weapons of totality or terror, but weapons used to give combat forces help they would otherwise lack. They are an integral part of military operations. Only when the atomic bomb is recognized as useful insofar as it is an integral part of military operations, will it really be of much help in the fighting of a war, rather than in warning all mankind to avert it." (Quotation: Samuel Cohen, Shame, 2nd ed., 2005, page 99.)

‘The Hungarian revolution of October and November 1956 demonstrated the difficulty faced even by a vastly superior army in attempting to dominate hostile territory. The [Soviet Union] Red Army finally had to concentrate twenty-two divisions in order to crush a practically unarmed population. ... With proper tactics, nuclear war need not be as destructive as it appears when we think of [World War II nuclear city bombing like Hiroshima]. The high casualty estimates for nuclear war are based on the assumption that the most suitable targets are those of conventional warfare: cities to interdict communications ... With cities no longer serving as key elements in the communications system of the military forces, the risks of initiating city bombing may outweigh the gains which can be achieved. ...

‘The elimination of area targets will place an upper limit on the size of weapons it will be profitable to use. Since fall-out becomes a serious problem [i.e. fallout contaminated areas which are so large that thousands of people would need to evacuate or shelter indoors for up to two weeks] only in the range of explosive power of 500 kilotons and above, it could be proposed that no weapon larger than 500 kilotons will be employed unless the enemy uses it first. Concurrently, the United States could take advantage of a new development which significantly reduces fall-out by eliminating the last stage of the fission-fusion-fission process.’

- Dr Henry Kissinger, Nuclear Weapons and Foreign Policy, Harper, New York, 1957, pp. 180-3, 228-9. (Note that sometimes the "nuclear taboo" issue is raised against this analysis by Kissinger: if anti-nuclear lying propaganda on weapons effects makes it apparently taboo in the Western pro-Russian disarmament lobbies to escalate from conventional to tactical nuclear weapons to end war as on 6 and 9 August 1945, then this "nuclear taboo" can be relied upon to guarantee peace for our time. However, this was not only disproved by Hiroshima and Nagasaki, but by the Russian tactical nuclear weapons reliance today, the Russian civil defense shelter system detailed on this blog which showed they believed a nuclear war survivable based on the results of their own nuclear tests, and the use of Russian nuclear weapons years after Kissinger's analysis was published and criticised, for example their 50 megaton test in 1961 and their supply of IRBM's capable of reaching East Coast mainland USA targets to the fanatical Cuban dictatorship in 1962. So much for the "nuclear taboo" as being any more reliable than Chamberlain's "peace for our time" document, co-signed by Hitler on 30 September 1938! We furthermore saw how Russia respected President Obama's "red line" for the "chemical weapons taboo": Russia didn't give a toss about Western disarmament thugs' prattle about what they think is a "taboo", Russia used chlorine and sarin in Syria to keep Assad the dictator, and they used Novichok to attack and kill in the UK in 2018, with only diplomatic expulsions in response. "Taboos" are no more valid to restrain madmen than peace treaties, disarmament agreements, Western CND books attacking civil defense or claiming that nuclear war is the new 1930s gas war bogeyman, or "secret" stamps on scientific facts. In a word, they're bullshit superstitions.)

(Quoted in 2006 on this blog here.)

All of this data should have been published to inform public debate on the basis for credible nuclear deterrence of war and civil defense, PREVENTING MILLIONS OF DEATHS SINCE WWII, instead of DELIBERATELY allowing enemy anti-nuclear and anti-civil defence lying propaganda from Russian supporting evil fascists to fill the public data vacuum, killing millions by allowing civil defence and war deterrence to be dismissed by ignorant "politicians" in the West, so that wars triggered by invasions with mass civilian casualties continue today for no purpose other than to promote terrorist agendas of hate and evil arrogance and lying for war, falsely labelled "arms control and disarmament for peace":

"Controlling escalation is really an exercise in deterrence, which means providing effective disincentives to unwanted enemy actions. Contrary to widely endorsed opinion, the use or threat of nuclear weapons in tactical operations seems at least as likely to check [as Hiroshima and Nagasaki] as to promote the expansion of hostilities [providing we're not in a situation of Russian biased arms control and disarmament whereby we've no tactical weapons while the enemy has over 2000 neutron bombs thanks to "peace" propaganda from Russian thugs]." - Bernard Brodie, pvi of Escalation and the nuclear option, RAND Corp memo RM-5444-PR, June 1965.

Note: the DELFIC, SIMFIC and other computer-predicted fallout area comparisons for the 110 kt Bikini Atoll Castle-Koon land surface burst nuclear test are false, since the distance scale of Bikini Atoll is massively exaggerated on many maps, e.g. in the Secret January 1955 AFSWP "Fall-out Symposium", the Castle fallout report WT-915, and the fallout patterns compendium DASA-1251! The western side of the Bikini Atoll reef is at 165.2 degrees East, while the most eastern island in the Bikini Atoll, Enyu, is at 165.567 degrees East: since a degree of latitude is 60 nautical miles by definition (and a degree of longitude at Bikini's low latitude is only about 2% less), the width of Bikini Atoll is therefore (165.567 - 165.2)(60) = 22 nautical miles, approximately half the distance shown in the Castle-Koon fallout patterns. Since area is proportional to the square of the distance scale, this constitutes a serious exaggeration in fallout casualty calculations, before you even get into the issue of the low energy (0.1-0.2 MeV) gamma rays from neutron-induced Np239 and U237 in the fallout enhancing the protection factor of shelters (usually calculated assuming hard 1.17 and 1.33 MeV gamma rays from Co60), during the sheltering period of approximately 1-14 days after detonation.
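The map-scale arithmetic can be checked, including the small longitude correction (Bikini's latitude of roughly 11.6 degrees North is an assumed value here, since the text only quotes the longitudes):

```python
# Check of the Bikini Atoll map-scale arithmetic. A degree of latitude is
# 60 nautical miles by definition; a degree of LONGITUDE is 60*cos(latitude),
# a ~2% correction at Bikini's low latitude (~11.6 deg N, assumed here).
import math

west_reef_lon = 165.2   # western side of the reef, degrees East
enyu_lon = 165.567      # Enyu island, eastern end, degrees East
latitude_deg = 11.6     # approximate atoll latitude (assumption)

width_nmi = (enyu_lon - west_reef_lon) * 60 * math.cos(math.radians(latitude_deg))
print(f"Atoll width: {width_nmi:.1f} nautical miles")  # ~21.6, i.e. about 22

# Area scales as the square of the distance scale, so a 2x distance-scale
# exaggeration overstates fallout areas 4-fold:
print((2.0) ** 2)  # 4.0
```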

"Since the nuclear stalemate became apparent, the Governments of East and West have adopted the policy which Mr Dulles calls 'brinkmanship'. This is a policy adopted from a sport ... called 'Chicken!' ... If one side is unwilling to risk global war, while the other side is willing to risk it, the side which is willing to run the risk will be victorious in all negotiations and will ultimately reduce the other side to complete impotence. 'Perhaps' - so the practical politician will argue - 'it might be ideally wise for the sane party to yield to the insane party in view of the dreadful nature of the alternative, but, whether wise or not, no proud nation will long acquiesce in such an ignominious role. We are, therefore, faced, quite inevitably, with the choice between brinkmanship and surrender." - Bertrand Russell, Common Sense and Nuclear Warfare, George Allen and Unwin, London, 1959, pp30-31.

Emphasis added. Note that Russell accepts lying about nuclear weapons just as gas weapons had been lied about in the 1920s-30s by "arms controllers" to start WWII, then he simply falls into the 1930s Cambridge Scientists Antiwar Group delusional propaganda fraud of assuming that any attempt to credibly deter fascism is immoral because it will automatically result in escalatory retaliation with Herman Goering's Luftwaffe drenching London with "overkill" by poison gas WMDs etc. In particular, he forgets that general disarmament pursued in the West until 1935 - when Baldwin suddenly announced that the Nazis had secretly produced a massive, unstoppable warmachine in two years - encouraged aggressors to first secretly rearm, then coerce and invade their neighbours while signing peace promises purely to buy more time for rearmament, until a world war resulted. Not exactly a great result for disarmament propaganda. So after obliterating what Reagan used to call (to the horror of commie "historians") the "true facts of history" from his mind, he advocates some compromise with the aggressors of the 30 September 1938 Munich Agreement peace-in-our-time sort, the historically proved sure-fire way to really escalate a crisis into a major war by showing the green lamp to a loon to popular media acclaim and applause for a fairy tale utopian fantasy; just as the "principled" weak, rushed, imbecile withdrawal from Afghanistan in 2021 encouraged Putin to invade Ukraine in 2022, and also showed the green lamp for Hamas to invade Israel in 2023.

"... deterrence ... consists of threatening the enemy with thermonuclear retaliation should he act provocatively. ... If war is 'impossible', how can one threaten a possible aggressor with war? ... The danger, evoked by numerous critics, that such research will result in a sort of resigned expectation of the holocaust, seems a weak argument ... The classic theory of Clausewitz defines absolute victory in terms of disarmament of the enemy ... Today ... it will suffice to take away his means of retaliation to hold him at your mercy." - Raymond Aron, Introduction to Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfield and Nicholson, London, pp. 9-12. (This is the commie support for arms control and disarmament has achieved, precisely the weakening of the West to take away credible deterrence.)

"75 years ago, white slavery was rampant in England. ... it could not be talked about openly in Victorian England, moral standards as to the subjects of discussion made it difficult to arouse the community to necessary action. ... Victorian standards, besides perpetuating the white slave trade, intensified the damage ... Social inhibitions which reinforce natural tendencies to avoid thinking about unpleasant subjects are hardly uncommon. ... But when our reluctance to consider danger brings danger nearer, repression has gone too far. In 1960, I published a book that attempted to direct attention to the possibility of a thermonuclear war ... people are willing to argue that it is immoral to think and even more immoral to write in detail about having to fight ... like those ancient kings who punished messengers who brought them bad news. That did not change the news; it simply slowed up its delivery. On occasion it meant that the kings were ill informed and, lacking truth, made serious errors in judgement and strategy. ... We cannot wish them away. Nor should we overestimate and assume the worst is inevitable. This leads only to defeatism, inadequate preparations (because they seem useless), and pressures toward either preventative war or undue accommodation." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfield and Nicholson, London, pp. 17-19. (In the footnote on page 35, Kahn notes that original nuclear bullshitter, the 1950 creator of fake cobalt-60 doomsday bomb propaganda, Leo Szilard, was in the usual physics groupthink nutters club: "Szilard is probably being too respectful of his scientific colleagues who also seem to indulge in ad hominem arguments - especially when they are out of their technical specialty.")

"Ever since the catastropic and disillusioning experience of 1914-18, war has been unthinkable to most people in the West ... In December 1938, only 3 months after Munich, Lloyd's of London gave odds of 32 to 1 that there would be no war in 1939. On August 7, 1939, the London Daily Express reported the result of a poll of its European reporters. 10 out of 12 said, 'No war this year'. Hitler invaded Poland 3 weeks later." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfield and Nicholson, London, p. 39. (But as the invasion of Ukraine in 2022 proved, even the label "war" is now "controversial": the aggressor now simply declares they are on a special operation of unifying people under one flag to ensure peace! So the reason why there is war in Ukraine is that Ukraine is resisting. If it waved a white flag, as the entire arms control and disarmament lobby insists is the only sane response to a nuclear-armed aggressor, there would be "peace," albeit on Russia's terms: that's why they disarmed Ukraine in 1994. "Peace propaganda" of "disarmers"! Free decent people prefer to fight tyranny. But as Kahn states on pp. 7-9:

"Some, most notably [CND's pseudo-historian of arms race lying] A. J. P. Taylor, have even said that Hitler was not like Hitler, that further appeasement [not an all-out arms race as was needed but repeatedly rejected by Baldwin and Chamberlain until far too late; see discussion of this fact which is still deliberately ignored or onfuscated by "historians" of the A. J. P. Taylor biased anti-deterrence left wing type, in Slessor's The Central Blue, quoted on this blog] would have prevented World War II ... If someone says to you, 'One of us has to be reasonable and it is not going to be me, so it has to be you', he has a very effective bargaining advantage, particularly if he is armed with thermonuclear bombs [and you have damn all civil defense, ABM, or credible tactical deterrent]. If he can convince you he is stark, staring mad and if he has enough destructive power ... deterrence alone will not work. You must then give in or accept the possibility of being annihilated ... in the first instance if we fight and lose; in the second if we capitulate without fighting. ... We could still resist by other means ranging from passive resistance of the Gandhi type to the use of underground fighting and sabotage. All of these alternatives might be of doubtful effectiveness against [the Gulag system, KGB/FSB torture camps or Siberian salt mines of] a ruthless dictatorship."

Sometimes people complain that Hitler and WWII - the most destructive and costly war in history, and the only nuclear war - are given undue attention. But WWII is a good analogy to the danger precisely because of the lying WMD gas war propaganda-based disarmament of the West which allowed the war, because of the attacks by Hitler's fans on civil defense in the West which made even the token rearmament after 1935 ineffective as a credible deterrent, and because Hitler has mirrors in Alexander the Great, Attila the Hun, Genghis Khan, Tamerlane, Napoleon and Stalin. Kahn explains on p. 173: "Because history has a way of being more imaginative and complex than even the most imaginative and intelligent analysts, historical examples often provide better scenarios than artificial ones, even though they may be no more directly applicable to current equipment, postures, and political situations than the fictional plot of the scenario. Recent history can be especially useful.")

"One type of war resulting at least partly from deliberate calculation could occur in the process of escalation. For example, suppose the Soviets attacked Europe, relying upon our fear of their reprisal to deter a strategic attack by us; we might be deterred enough to pause, but we might evacuate our cities during this pause in the hope we could thereby convince the Soviets we meant business. If the Soviets did not back down, but continued their attack upon Europe, we might decide that we would be less badly off if we proceeded ... The damage we would receive in return would then be considerably reduced, compared with what we would have suffered had we not evacuated. We might well decide at such a time that we would be better off to attack the Soviets and accept a retalitory blow at our dispersed population, rather than let Europe be occupied, and so be forced to accept the penalty of living in the hostile and dangerous world that would follow." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfield and Nicholson, London, pp. 51-2.

"We must recognise that the stability we want in a system is more than just stability against accidental war or even against an attack by the enemy. We also want stability against extreme provocation [e.g. invasion of allies, which then escalates as per invasion of Belgium 1914, or Poland 1939]." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfield and Nicholson, London, p. 53(footnote).

Note: this 1962 book should not be confused with Kahn's 1984 "updated" Thinking About the Unthinkable in the 1980s, which omits the best material in the 1962 edition (in the same way that the 1977 edition of The Effects of Nuclear Weapons omits the entire civil defense chapter, which was the one decent thing in the 1957 and 1962/4 editions!) and thus shows a reversion to the less readable and less helpful style of his 1960 On Thermonuclear War, which severely fragmented and jumbled up all the key arguments, making it easy for critics to misquote or quote out of context. For example, Kahn's 1984 "updated" book starts on the first page of the first chapter with the correct assertion that Jonathan Schell's Fate of the Earth is nonsense, but doesn't say why it's nonsense, and you have to read through to the final chapter - pages 207-8 of chapter 10 - to find Kahn writing in the most vague way possible, without a single specific example, that Schell is wrong because of "substantive inadequacies and inaccuracies", without listing a single example such as Schell's lying that the 1954 Bravo nuclear test blinded everyone well beyond the range of Rongelap, and that it was impossible to easily shield the radiation from the fallout or evacuate the area until it decays, which Schell falsely attributed to Glasstone and Dolan's nonsense in the 1977 Effects of Nuclear Weapons! Kahn eventually, in the footnote on page 208, refers readers to an out-of-print article for facts: "These criticisms are elaborated in my review of The Fate of the Earth, see 'Refusing to Think About the Unthinkable', Fortune, June 28, 1982, pp. 113-6."
Kahn does the same for civil defense in the 1984 book, referring in such general, imprecise and vague terms to Russian civil defence, with no specific data, that it is a waste of time, apart possibly from one half-baked sentence on page 177: "Variations in the total megatonnage, somewhat surprisingly, do not seem to affect the toll nearly as much as variations in the targeting or the type of weapon bursts." Kahn on page 71 quotes an exchange between himself and Senator Proxmire during the US Congressional Hearings of the Joint Committee on Defense Production, Civil Preparedness and Limited Nuclear War, where on page 55 of the hearings Senator Proxmire alleges America would escalate a limited conflict to an all-out war because: "The strategic value and military value of destroying cities in the Soviet Union would be very great." Kahn responded: "No American President is likely to do that, no matter what the provocation." Nuclear war will be limited, according to Herman Kahn's analysis, despite the bullshit from nutters to the contrary.

Kahn on page 101 of Thinking About the Unthinkable in the 1980s correctly and accurately condemns President Carter's 1979 State of the Union Address, which falsely claimed that just a single American nuclear submarine gives America an "overwhelming" deterrent against "every large and medium-sized city in the Soviet Union". Carter ignored Russian retaliation on cities if you bomb theirs: America has no equivalent of the intense Russian protection efforts that make the Russian nuclear threat credible, namely civil defense shelters and evacuation plans, and he also ignored the realpolitik of deterrence of world wars, which so far have only been triggered by invasions of third parties (Belgium '14, Poland '39). Did America strategically nuke every city in Russia when Russia invaded Ukraine in 2022? No, debunking Proxmire and the entire Western pro-Russian "automatic escalation" propaganda lobby; and it didn't even have tactical neutron bombs to help deter the Russians like Reagan in the 1980s, because in the 1990s America had ignored Kahn's argument and gone in for MINIMAL deterrence of the least credible sort (abolishing the invasion-deterring dedicated neutron tactical nuclear stockpile entirely; the following quotation is from p. 101 of Kahn's Thinking About the Unthinkable in the 1980s):

"Minimum deterrence, or any predicated on an escessive emphasis on the inevitably of mutual homocide, is both misleading and dangerous. ... MAD principles can promote provocation - e.g. Munich-type blackmail on an ally. Hitler, for example, did not threaten to attack France or England - only Austria, Czechoslovakia, and Poland. It was the French and the British who finally had to threaten all-out war [they could only do this after rearmament and building shelters and gas masks to reduce the risk of reprisals in city bombing, which gave more time for Germany to prepare since it was rearming faster than France and Britain which still desperately counted on appeasement and peace treaties and feared provoking a war by an arms-race due to endless lying propaganda from Lord Grey that his failure to deter war in 1914 had been due to an arms-race rather than the incompetence of the procrastination of his anti-war Liberal Party colleagues in the Cabinet] - a move they would not and could not have made if the notion of a balance of terror between themselves and Germany had been completely accepted. As it was, the British and French were most reluctant to go to war; from 1933 to 1939 Hitler exploited that reluctance. Both nations [France and Britain] were terrified by the so-called 'knockout blow', a German maneuver that would blanket their capitals with poison gas ... The paralyzing effect of this fear prevented them from going to war ... and gave the Germans the freedom to march into the Ruhr, to form the Anschluss with Austria, to force the humiliating Munich appeasement (with the justification of 'peace in our time'), and to take other aggressive actions [e.g. against the Jews in the Nuremberg Laws, Kristallnacht, etc.] ... If the USSR were sufficiently prepared in the event a war did occur, only the capitalists would be destroyed. The Soviets would survive ... that would more than justify whatever sacrifice and destruction had taken place.

"This view seems to prevail in the Soviet military and the Politburo even to the present day. It is almost certain, despite several public denials, that Soviet military preparations are based on war-fighting, rather than on deterrence-only concepts and doctrines..." - Herman Kahn, Thinking About the Unthinkable in the 1980s, 1984, pages 101-102.

Kahn adds, in his footnote on p. 111, that "Richard Betts has documented numerous historical cases in which attackers weakened their opponents' defenses through the employment of unanticipated tactics. These include: rapid changes in tactics per se, false alarms and fluctuating preparations for war ... doctrinal innovations to gain surprise. ... This is exactly the kind of thing which is likely to surprise those who subscribe to MAD theories. Those who see a need for war-fighting capabilities expect the other side to try to be creative and use tactical innovations such as coercion and blackmail, technological surprises, or clever tactics on 'leverage' targets, such as command and control installations. If he is to adhere to a total reliance on MAD, the MADvocate has to ignore these possibilities." See Richard Betts, "Surprise Despite Warning: Why Sudden Attacks Succeed", Political Science Quarterly, Winter 1980-81, pp. 551-572.

Compare two situations: (1) Putin explodes a 50 megaton nuclear "test" of the warhead for his new nuclear reactor powered torpedo, Poseidon, a revamped 1961 Tsar Bomba, or detonates a high-altitude nuclear EMP "test" over neutral waters but within the thousands of miles range of USA or UK territory; (2) Putin invades Poland using purely conventional weapons. Our point here is that both nuclear AND conventional weapons trigger nuclear threats and the risk of nuclear escalation, as indeed they have done (for Putin's nuclear threats, scroll down to the videos with translations below). So the fashionable CND style concept that only nuclear weapons can trigger nuclear escalation is bullshit, and is designed to help Russia start and win WWIII to produce a world government, by getting us to undertake further unilateral (not multilateral) disarmament, just as evolved in the 1930s, setting the scene for WWII. Japan, for example, did not have nuclear weapons in August 1945, yet triggered tactical nuclear war (both cities had some military bases and munitions factories, as well as enormous numbers of civilians); and the decision to attack cities for maximum impact with a very small supply of nuclear weapons, rather than just "test" a weapon above Tokyo Bay (as Teller demanded but Oppenheimer rejected), showed some strategic nuclear war thinking too. Truman was escalating to try to shock Japan into rapid surrender emotionally (many cities in Japan had already been burned out in conventional incendiary air raids, and the two nuclear attacks, while horrible for civilians in those cities, contributed only a fraction of the millions killed in WWII, despite anti-nuclear propaganda lies to the contrary).
Truman's approach of escalating to win is the opposite of the "Minimax game theory" (von Neumann's maths and Thomas Schelling's propaganda) gradual escalation approach that's currently the basis of nuclear deterrence planning, despite its failure wherever it has been tried (Vietnam, Afghanistan, etc). Gradual escalation is supposed to minimise the maximum possible risk (hence the "minimax" name), but it guarantees failure in the real world (unlike rule-bound games) by maximising the build up of resentment. E.g. Schelling/Minimax say that if you gradually napalm civilians day after day (because they are the unprotected human shields used by terrorists/insurgents; the Vietcong were hiding in underground tunnels, exactly like Hamas today, and the Putin regime's Metro-2 shelter tunnels under Russia) you somehow "punish the enemy" (although they don't give a toss about the lives of kids, which is why you're fighting them!) and force them to negotiate for peace in good faith, then you can pose for photos with them sharing a glass of champagne and there is "world peace". That's a popular fairy tale, like Marxist mythology.

Once you grasp this fact, that nuclear weapons have been and will again be "used" explosively without automatic escalation, for example provocative testing as per the 1961 Russian 50 megaton bomb test, or the 1962 high altitude EMP bursts, you should be able to grasp the fact that the "escalation" deception used to dismiss civil defense and tactical nuclear deterrence against limited nuclear war is fake news from Russian fellow-travellers like Corbyn. Once you assign a non-unity probability to "escalation", you're into conventional war territory: if you fight a conventional war, it can "escalate" to nuclear war as on 6 August 1945. Japan did not avoid nuclear attack by not having nuclear weapons on 6 August 1945. If it had had nuclear weapons ready to be delivered, a very persuasive argument could be made that unless Truman wanted to invite retaliation, World War II would have remained strategically non-nuclear: no net strategic advantage would have been achieved by nuclear city bombing, so only war-ending tactical nuclear threats could have prevailed in practice. But try explaining this to the groupthink pseudosocialist bigoted mass murderers who permeate fake physics with crap; it's no easier to explain to them the origins of particle masses or even dark energy/gravitation; in both cases groupthink lying hogwash persists because statements of proved facts are hated and rejected if they debunk religious-style fairy tales the mass media loves.
There were plenty of people warning that mass media gas war fear mongering was disguised Nazi-supporting propaganda in the 1930s, but the public listened to that crap then, just as it accepted the "eugenics" (anti-diversity evolution crap of Sir Francis Galton, cousin of Darwin) basis for Hitler's Mein Kampf without question, and just as it accepted the lying propaganda from the UK "Cambridge Scientists Anti-War Group", which, like CND and all other arms control and disarmament lobbies supporting terrorist states today, did more than even Hitler to deliberately lay the foundations for the Holocaust and World War II, while never being criticised in the UK media! Thus, it's surely time for people to oppose evil lying on civil defence, to save lives in all disasters from storms to conventional war, to collateral damage risks in nuclear terrorism by mad enemies. At some point, the majority has to decide either to defend itself honestly and decently against barbarism, or be consumed by it as a price for believing bullshit. It's time for decent people to oppose lying evil regarding the necessity of having credible tactical (not incredible strategic) nuclear weapons, as Oppenheimer called for in his 1951 speech, to deter invasions.

Democracy can't function when secrecy is used to deliberately cover-up vital data from viewing by Joe Public. Secrecy doesn't protect you from enemies who independently develop weapons in secret, or who spy from inside your laboratories:

"The United States and Great Britain resumed testing in 1962, and we spared no effort trying to find out what they were up to. I attended several meetings on that subject. An episode related to those meetings comes to mind ... Once we were shown photographs of some documents ... the photographer had been rushed. Mixed in with the photocopies was a single, terribly crumpled original. I innocently asked why, and was told that it had been concealed in panties. Another time ... questions were asked along the following lines: What data about American weapons would be most useful for your work and for planning military technology in general?"

- Andrei Sakharov, Memoirs, Hutchinson, London, 1990, pp. 225-6.

ABOVE: The British government has now declassified detailed summary reports giving secret original nuclear test data on the EMP (electromagnetic pulse) damage due to numerous nuclear weapons, data which is still being kept under wraps in America since it hasn't been superseded because Western atmospheric nuclear tests were stopped late in 1962 and never resumed - even though the Russians have even more extensive data - completely debunking Glasstone and Dolan's disarmament propaganda nonsense in the 1962, 1964 and 1977 Effects of Nuclear Weapons, which ignores EMP piped far away from low altitude nuclear tests by power and communications cables, and falsely claims instead that such detonations don't produce EMP damage outside the 2 psi blast radius! For a discussion of the new data and also a link to the full 200+ pages version (in addition to useful data, inevitably like all official reports it also contains a lot of "fluff" padding), please see the other (physics) site: https://nige.wordpress.com/2023/09/12/secret-emp-effects-of-american-nuclear-tests-finally-declassified-by-the-uk-and-at-uk-national-archives/ (by contrast, this "blogspot" uses old coding that is not smartphone-friendly, and is no longer properly indexed by "google's smartphone bot"). As long ago as 1984, Herman Kahn argued on page 112 of his book Thinking About the Unthinkable in the 1980s: "The effects of an EMP attack are simply not well understood [in the West, where long powerlines were never exposed on high altitude nuclear tests, unlike the Russians' 1962 Operation K, so MHD-EMP or E3 damage wasn't even mentioned in the 1977 Glasstone and Dolan Effects of Nuclear Weapons], but the Soviets seem to know - or think they know - more than we do."

BELOW: declassified British nuclear war planning blast survival data showing that even without special Morrison table shelters, the American assumption that nobody can survive in a demolished house is false, based on detailed WWII British data (the majority of people in houses flattened within 77 ft of V1 Nazi cruise missiles survived!), and secret American reports (contradicting their unclassified propaganda) proved that blast survival occurred at 16 psi peak overpressure in Hiroshima's houses, e.g. see the limited distribution Dikewood Corp. report DC-P-1060 for Hiroshima, also the secret 1972 Capabilities of Nuclear Weapons DNA-EM-1 table 10-1, and WWII report RC-450 table 8.2, p. 145 (for determining survival of people sheltered in brick houses, the WWII A, B, C, and D damage versus casualty data from V1 blast was correlated to similar damage from nuclear blast as given in Glasstone's 1957 Effects of Nuclear Weapons, page 249, Fig. 6.41a, and page 109, Fig. 3.94a, which show that A, B, C, and D damage to brick houses from nuclear weapons occur at peak overpressures of 9, 6, 3 and 0.5 psi, respectively; the longer blast from higher yields blows the debris over a wider area, reducing the load per unit area falling on to people sheltered under tables etc), and the declassified UK government assessment of nuclear terrorist attack on a port or harbour, as well as the confidential classified UK Government analysis of the economic and social effects of WWII bombing (e.g. the recovery times for areas as a function of the percentage of houses destroyed):
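The A, B, C and D damage thresholds for brick houses quoted above (9, 6, 3 and 0.5 psi peak overpressure, from Glasstone's 1957 figures) amount to a simple lookup, which can be sketched as follows. This is only an illustration of the correlation described in the text; the function name and the bracketed category descriptions are our own shorthand (the UK categories ran roughly from A = demolished to D = lightly damaged), not an official implementation.

```python
# Hedged sketch: map peak overpressure (psi) to the UK WWII A-D damage
# categories for brick houses, using the thresholds quoted in the text
# (A: 9 psi, B: 6 psi, C: 3 psi, D: 0.5 psi). Descriptions are approximate.

DAMAGE_THRESHOLDS_PSI = [
    ("A (demolished)", 9.0),
    ("B (damaged beyond repair)", 6.0),
    ("C (needing major repairs)", 3.0),
    ("D (light damage)", 0.5),
]

def brick_house_damage(peak_overpressure_psi: float) -> str:
    """Return the worst damage category reached at the given overpressure."""
    for category, threshold in DAMAGE_THRESHOLDS_PSI:
        if peak_overpressure_psi >= threshold:
            return category
    return "none (below D threshold)"

print(brick_house_damage(16.0))  # the Hiroshima survival overpressure cited above -> A
print(brick_house_damage(4.0))   # -> C
```

Note that, as the text stresses, even "A (demolished)" is not equivalent to 100% fatalities: the RC-450 casualty-versus-damage correlation is exactly what distinguishes structural damage from deaths.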

Unofficial Russian video on the secret Russian nuclear shelters from Russian Urban Exploration, titled "Проникли на секретный Спецобъект Метро!" = "We infiltrated a secret special facility of the Metro!":

ABOVE: Moscow Metro and Metro-2 (secret nuclear subway) horizontally swinging blast doors take only 70 seconds to shut, whereas their vertically rising blast doors take 160 seconds to shut; both times are however far shorter than the arrival time of Western ICBMs or even SLBMs, which take 15-30 minutes, by which time the Russian shelters are sealed against blast and radiation! In times of nuclear crisis, Russia planned to evacuate from cities those who could not be sheltered, and for the remainder to be based in shelters (similarly to the WWII British situation, when people slept in shelters of one kind or another when there was a large risk of being bombed without notice, particularly in supersonic V2 missile attacks where little warning time was available).
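The timing margin claimed above is simple arithmetic, sketched below. All the numbers (70 s and 160 s door closure, 15-30 minute missile flight times) come from the text; the labels are illustrative only, and the 15-minute SLBM figure is the worst case quoted.

```python
# Hedged sketch of the timing comparison in the caption above: even the
# slower (vertically rising) blast door closes with minutes to spare
# against the shortest quoted missile flight time.

door_close_s = {"horizontally swinging": 70, "vertically rising": 160}
flight_time_s = {"SLBM (~15 min, shortest quoted)": 15 * 60, "ICBM (~30 min)": 30 * 60}

for door, t_door in door_close_s.items():
    for missile, t_flight in flight_time_s.items():
        margin = t_flight - t_door
        print(f"{door} door vs {missile}: {margin // 60} min {margin % 60} s spare")
```

Even in the worst pairing (slow door, fast SLBM), the margin is over 12 minutes, which is the caption's point.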


ABOVE: originally SECRET diagrams showing the immense casualty reductions for simple shelters and local (not long distance as in 1939) evacuation, from a UK Home Office Scientific Advisers’ Branch report CD/SA 72 (UK National Archives document reference HO 225/72), “Casualty estimates for ground burst 10 megaton bombs”, which exposed the truth behind UK Cold War civil defence (contrary to Russian propaganda against UK defence, which still falsely claims there was no scientific basis for anything, playing on the fact that the data was classified SECRET). Evacuation plus shelter eliminates huge casualties for limited attacks; notice that for the 10 megaton bombs (more than 20 times the typical yield of today’s MIRV compact warheads!), you need 20 weapons, i.e. a total of 10 x 20 = 200 megatons, for 1 million killed, if civil defence is in place for 45% of people to evacuate a city and the rest to take shelter. Under civil defence, therefore, you get 1 million killed per 200 megatons. This proves that civil defence works to make deterrence more credible in Russian eyes. For a discussion of the anti-civil defence propaganda scam in the West led by Russian agents for Russian advantage in the new cold war, just read the posts on this blog, started in 2006 when Putin's influence became clear. You can read the full PDF by clicking the link here. Or see the files here.
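The CD/SA 72 arithmetic quoted above can be laid out explicitly. The inputs (twenty 10-megaton ground bursts per 1 million killed, given 45% evacuation plus shelter for the rest) are from the text; the "deaths per megaton" figure is our own derived illustration, not a number from the report.

```python
# Hedged sketch of the casualty arithmetic in the CD/SA 72 figures quoted
# above, under the stated civil defence assumption (45% evacuated, the
# rest sheltered). All inputs are from the text.

bombs = 20                 # 10 Mt ground bursts needed per million killed
yield_per_bomb_mt = 10
deaths = 1_000_000

total_mt = bombs * yield_per_bomb_mt   # 200 megatons total
deaths_per_mt = deaths / total_mt      # derived: deaths per megaton with civil defence

print(f"Total yield: {total_mt} Mt")                              # -> 200 Mt
print(f"With civil defence: {deaths_per_mt:,.0f} deaths per Mt")  # -> 5,000 deaths per Mt
```

I.e. with civil defence in place, the report's figures work out to roughly 5,000 deaths per megaton for this attack pattern, which is the basis of the caption's claim that shelter plus local evacuation slashes casualties.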

ABOVE: the originally CONFIDENTIAL classified document chapters of Dr D.G. Christopherson’s “Structural Defence 1945, RC450”, giving low cost UK WWII shelter effectiveness data, which should also have been published to prove the validity of civil defence countermeasures in making deterrence of future war more credible by allowing survival of “demonstration” strikes and “nuclear accidents / limited wars” (it’s no use having weapons and no civil defence, so you can’t deter aggressors; the disaster of Munich appeasement gave Hitler a green light on 30 September 1938, when Anderson shelters were only issued the next year, 1939!). For the original WWII UK Government low cost sheltering instruction books issued to the public (for a small charge!) please click here (we have uploaded them to internet archive), and for further evidence of the effectiveness of indoor shelters during WWII from Morrison shelter inventor Baker's analysis, please click here (he titled his book about WWII shelters "Enterprise versus Bureaucracy", which tells you all you need to know about the problems his successful innovations in shelter design experienced; his revolutionary concept was that the shelter should be damaged to protect the people inside, because of the vast energy absorption soaked up in the plastic deformation of steel - something which naive fools can never appreciate - by analogy, if your car bumper is perfectly intact after impact, you're unlikely to be, because it has not absorbed the impact energy, which has been passed on to you!). We have also placed useful declassified UK government nuclear war survival information on internet archive here and here.
There is also a demonstration of how proof-tested WWII shelters were tested in 1950s nuclear weapon trials and adapted for use in Cold War nuclear civil defence, here, thus permanently debunking the somewhat pro-dictatorship/anti-deterrence Jeremy Corbyn/Matthew Grant/Duncan Campbell anti-civil defence propaganda rants which pretend to be based on reality, but obviously just ignore the hard, yet secret, nuclear testing facts upon which UK government civil defence was based, as my father (a Civil Defence Corps instructor) explained here back in 2006. The reality is that the media follows herd fashion to sell paper/airtime; it doesn't lead it. This is why it backed Nazi appeasement (cheering Chamberlain's 1938 handshakes with Hitler, for instance) and only switched tune when it was too late to deter Nazi aggression in 1939; it made the most money that way. We have to face the facts!

NUKEGATE - Western tactical neutron bombs were disarmed after a Russian propaganda lie. Russia now has over 2000... "Disarmament and arms control" charlatans, quacks, cranks, liars, mass murdering Russian affiliates, and the evil genocidal Marxist media exposed for what it is: what it was in the 1930s, when it enabled Hitler to murder tens of millions in war. Glasstone's and Dolan's 1977 Effects of Nuclear Weapons deceptions totally disproved. Professor Brian Martin, TRUTH TACTICS, 2021 (pp. 45-50): "In trying to learn from scientific publications, trust remains crucial. The role of trust is epitomised by Glasstone’s book The Effects of Atomic Weapons. Glasstone was not the author; he was the editor. The book is a compilation of information based on the work of numerous contributors. For me, the question was, should I trust this information? Was there some reason why the editors or authors would present fraudulent information, be subject to conflicts of interest or otherwise be biased? ... if anything, the authors would presumably want to overestimate rather than underestimate the dangers ... Of special interest would be anyone who disagreed with the data, calculations or findings in Glasstone. But I couldn’t find any criticisms. The Effects of Nuclear Weapons was treated as the definitive source, and other treatments were compatible with it. ... One potent influence is called confirmation bias, which is the tendency to look for information that supports current beliefs and dismiss or counter contrary information. The implication is that changing one’s views can be difficult due to mental commitments. To this can be added various forms of bias, interpersonal influences such as wanting to maintain relationships, overconfidence in one’s knowledge, desires to appear smart, not wanting to admit being mistaken, and career impacts of having particular beliefs. It is difficult to assess the role of these influences on yourself."

Honest Effects of Nuclear Weapons!

ABOVE (VIDEO CLIP): Russian State TV Channel 1 war-inurer and enabler, NOT MERELY MAKING "INCREDIBLE BLUFF THREATS THAT WE MUST ALL LAUGH AT AND IGNORE LIKE DR GOEBBELS' THREATS TO GAS JEWS AND START A WORLD WAR" AS ALMOST ALL THE BBC SCHOOL OF "JOURNALISM" (to which we don't exactly belong!) LIARS CLAIM, but instead preparing Russians mentally for nuclear war (they already have nuclear shelters and a new Putin-era tactical nuclear war civil defense manual from 2014, linked and discussed in blog posts on the archive above), arguing for use of nuclear weapons in the Ukraine war in 2023: "We should not be afraid of what it is unnecessary to be afraid of. We need to win. That is all. We have to achieve this with the means we have, with the weapons we have. I would like to remind you that a nuclear weapon is not just a bomb; it is the heritage of the whole Russian people, suffered through the hardest times. It is our heritage. And we have the right to use it to defend our homeland [does he mean the liberated components of the USSR that gained freedom in 1992?]. Changing the [nuclear use] doctrine is just a piece of paper, but it is worth making a decision."

NOTE: THIS IS NOT ENGLISH LANGUAGE "PROPAGANDA" SOLELY ADDRESSED AS A "BLUFF" TO UK AND USA GOV BIGOTED CHARLATANS (those who have framed photos of hitler, stalin, chamberlain, baldwin, lloyd george, eisenhower, et al., on their office walls), BUT ADDRESSED AT MAKING RUSSIAN FOLK PARTY TO THE NEED FOR PUTIN TO START A THIRD WORLD WAR! Duh!!!!! SURE, PUTIN COULD PRESS THE BUTTON NOW, BUT THAT IS NOT THE RUSSIAN WAY, ANY MORE THAN HITLER SET OFF WWII BY DIRECTLY BOMBING LONDON! HE DIDN'T. THESE PEOPLE WANT TO CONTROL HISTORY, TO GO DOWN AS THE NEXT "PUTIN THE GREAT". THEY WANT TO GET THEIR PEOPLE, AND CHINA, NORTH KOREA, IRAN, ET AL. AS ALLIES, BY APPEARING TO BE DEFENDING RATIONALITY AND LIBERTY AGAINST WAR MONGERING WESTERN IMPERIALISM. For the KGB mindset here, please read Chapman Pincher's book "The Secret Offensive" and Paul Mercer's "Peace of the Dead - The Truth Behind the Nuclear Disarmers". Please note that the link to the analysis of the secret USSBS report 92, The Effects of the Atomic Bomb on Hiroshima, Japan (which google fails to appreciate is a report with the OPPOSITE conclusions on fire to those of the lying unclassified reports and Glasstone's book), is on internet archive in the PDF documents list at the page "The effects of the atomic bomb on Hiroshima, Japan" (the secret report 92 of the USSBS, not the lying unclassified version or the Glasstone book series). If you don't like the plain layout of this blog, you can change it into a "fashionable" one with smaller photos you can't read by adding ?m=1 to the end of the URL, e.g. https://glasstone.blogspot.com/2022/02/analogy-of-1938-munich-crisis-and.html?m=1

PLEASE BEAR WITH US - THIS SITE WAS DEVELOPED IN 2006, BEFORE GOOGLE SMARTPHONE BOT CACHING (GOOGLE BOTS CAN'T INDEX THIS FORMAT ANYMORE, AS IT IS SIMPLY UNSUITABLE FOR SMARTPHONES, WHICH DIDN'T EXIST BACK IN 2006) - WE WILL MOVE TO A NEW DOMAIN SOON TO OVERCOME THIS. (HOPEFULLY THE TEXT WILL ALSO BE EDITED AND RE-WRITTEN TO TAKE OUT TYPING ERRORS AND DEAD LINKS DATING BACK TO 2006 WHEN THE BLOG BEGAN - A LOT HAS CHANGED SINCE THEN!)

Glasstone's Effects of Nuclear Weapons exaggerations completely undermine credible deterrence of war: Glasstone exaggerates urban "strategic" nuclear weapons effects by using effects data taken from unobstructed terrain (without the concrete jungle shielding of blast winds and radiation by cities!), and omits the most vital uses and most vital effects of nuclear weapons: to DETER world war credibly by negating the concentrations of force used to invade Belgium in 1914 (thus WWI) and Poland in 1939 (WWII). The facts from Hiroshima and Nagasaki on the shielding of blast and radiation effects by modern concrete buildings (click here for data) support the credible nuclear deterrence of invasions, which - unlike the countervalue drivel that failed to prevent WW2 costing millions of human lives - worked in the Cold War, despite the Western media's obsession with treating as Gospel truth the lying anti-nuclear propaganda from Russia's World Peace Council and its allies (intended to make the West disarm to allow Russian invasions without opposition, as worked in Ukraine recently)! If we have credible W54 and W79 tactical nukes to deter invasions, as we did in the Cold War, pro-Russian World Peace Council inspired propaganda says: "if you use those, we'll bomb your cities" - but they can bomb our cities with nuclear weapons if we use conventional weapons, or for any pretext at all, if they want: we don't actually control what thugs in dictatorships do. It is like arguing that because Hitler had 12,000 tons of tabun nerve agent by 1945, we had to surrender for fear of it. Actually, he had to blow his brains out, because his was an incredible - that is, non-credible - deterrent: the risk of retaliation, plus defence (gas masks), negated it!

Credible deterrence necessitates simple, effective protection against concentrated and dispersed invasions and bombing. The facts debunk the massively inaccurate, deliberately misleading CND "disarm or be annihilated" pro-dictatorship ("communism" scam) political anti-nuclear deterrence dogma. The anti-nuclear propaganda lies about Hiroshima and Nagasaki blast and radiation effects on modern concrete cities are debunked by solid factual evidence kept from public sight for political reasons by the Marx-media, which is not opposed by the remainder of the media, so the completely fake "nuclear effects data" sneaks into "established pseudo-wisdom" by the back-door. Another trick is hate attacks on anyone telling the truth: a repeat of the lies from Nobel Peace Prize winner Angell and pals before WWI (when long-"outlawed" gas was used by all sides, contrary to claims that paper agreements had somehow "banned" it) and WWII (when pre-war gas bombing lies by Angell, Noel-Baker, Joad and others were used as an excuse to "make peace deals" with the Nazis - again, not worth the paper they were printed on). Mathematically, the subset of all States which keep agreements (disarmament and arms control, for instance) is identical to the subset of all States which are stable Democracies (i.e., tolerating dissent for the past several years), but this subset is - as Dr Spencer Weart's statistical evidence of war proves in his book Never at War: Why Democracies Will Not Fight One Another - not the bloody war problem! Because none of the disarmers grasp set theory, or bother to read Dr Weart's book, they can never understand that disarmament of Democracies doesn't cause peace, but causes millions of deaths.

PLEASE CLICK HERE for the truth from Hiroshima and Nagasaki on the shielding of blast and radiation effects by modern concrete buildings, vital to the credible nuclear deterrence of invasions. Realistic effects data and credible nuclear weapon capabilities are needed for deterring or stopping aggressive invasions and attacks which could escalate into major conventional or nuclear wars.

Glasstone's and Nukemap's fake Effects of Nuclear Weapons data for unobstructed deserts - rather than for the realistic blast and radiation shielding of concrete jungles, which mitigates countervalue damage, as proved in Hiroshima and Nagasaki by Penney and Stanbury - undermines credible world war deterrence, just as Philip Noel-Baker's 1927 BBC radio propaganda lies about a gas-war knock-out blow were used by Nazi-propaganda-distributing "pacifist disarmers" to undermine deterrence of Hitler's war, deliberately murdering tens of millions through lies (e.g. that effective gas masks don't exist) that were easy to disprove, but were supported by the mainstream fascist-leaning press in the UK. There is not just one country, Russia, which could trigger WW3: we know from history that the world forms alliances once a major war breaks out, apart from a few traditionally neutral countries like Ireland and Switzerland, so a major US-China war over Taiwan could draw in support from Russia and North Korea, just as the present Russian invasion of Ukraine has drawn in Iranian munitions support for Russia. So it is almost certain that a future East-vs-West world war would involve an alliance of Russia, China, North Korea and Iran fighting on multiple fronts, with nuclear weapons being used carefully for military purposes - not in the imaginary 1930s-style massive "knockout blow" gas/incendiary/high explosive raids against cities which the UK media used to scare the public into appeasing Hitler, thus enabling him to trigger world war (Chamberlain had read Mein Kampf and crazily approved Hitler's plans to exterminate Jews and invade Russia, starting a major war - a fact censored out of biased propaganda hailing Chamberlain as a peacemaker).

Realistic effects data and credible nuclear weapons capabilities are VITAL for deterring or stopping aggressive invasions and attacks which could escalate into major conventional or nuclear wars. These facts debunk the Marx-media propagandists who obfuscate because they don't want you to know the truth, so activism is needed to get the message out against lying frauds and open fascists in the Russia-supporting Marx mass media, which sadly includes government officialdom (still infiltrated by reds under beds - sorry, Joe McCarthy haters, but admit it as a hard fact that nuclear bomb labs in the West openly support Russian fascist mass murders; I PRAY THIS WILL SOON CHANGE!).

ABOVE: Tom Ramos of Lawrence Livermore National Laboratory (quoted at length on the development details of compact MIRV nuclear warhead designs in the latest post on this blog) explains how the brilliant small primary stage, the Robin, was developed and properly proof-tested in time to act as the primary for a compact thermonuclear warhead to deter Russia in the 1st Cold War - something now made impossible by Russia's World Peace Council propaganda campaigns. (Note that Ramos has a new book published, From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, which describes in detail in chapter 13, "First the Flute and Then the Robin", how caring, dedicated nuclear weapons physicists in the 1950s and 1960s actually remembered the lesson of the disarmament disaster of the 1930s, and so WORKED HARD to develop the "Flute" secondary and the "Robin" primary to enable a compact, light thermonuclear warhead to help deter WWIII! What a difference from today, when all we hear from such "weaponeers" is evil lying about nuclear weapons effects on cities, against Western civil defence, and against credible deterrence, on behalf of the enemy.)

ABOVE: Star Wars filmmaker Peter Kuran has at last released his lengthy (90 minutes) documentary on the neutron bomb. Unfortunately, it is not yet being widely screened in cinemas or issued on DVD or Blu-ray disc, so you have to stream it (if you have fast broadband internet hooked up to a decent telly). At least Peter managed to interview Samuel Cohen, who developed the neutron bomb out of the cleaner Livermore devices Dove and Starling in 1958. (Ramos says Livermore's director, who invented a wetsuit, is now trying to claim that Cohen stole the neutron bomb idea from him! Not so: as RAND colleague and 1993 Effects Manual EM-1 editor Dr Harold L. Brode explains in his recent brilliant book on the history of nuclear weapons in the 1st Cold War (reviewed in detail in a post on this blog), Cohen had been after the neutron bomb for many years before Livermore was even built as a rival to Los Alamos. Cohen had been into neutrons when working in the Los Alamos Efficiency Group of the Manhattan Project on the very first nuclear weapons - used, with neutron effects on people, by Truman back in 1945 to end a bloody war while the Livermore director was in short pants.)

For the true effects in modern concrete city buildings in Hiroshima and Nagasaki, disproving the popular lies for nudes in open deserts used as the basis for blast and radiation calculations by Glasstone and Nukemap, please click here. The deceptive bigots portraying themselves as the Federation of American Scientists - genuine communist disarmers in the Marx media, including TV scammers - have been suppressing the truth to sell fake news since 1945, in a repetition of the 1920s and 1930s gas-war media lying for disarmament and horror-news scams that caused disarmament and thus encouraged Hitler to initiate the invasions that set off WWII!

Disarmament and arms control funded propaganda says that any deterrent which is not actually exploded in anger is a waste of money since it isn't being "used" - a fraud apparently due to the title and content of Glasstone's book, which omits the key use and effect of nuclear weapons: preventing world wars. Glasstone and Dolan don't even bother to mention the neutron bomb, or the 10-fold reduced fallout of the Los Alamos 95% clean Redwing-Navajo test of 1956, despite the neutron bomb's enhanced radiation and reduced thermal and blast yield being analysed in detail in the 1972 edition of Dolan's edited secret U.S. Department of Defense Effects Manual EM-1, "Capabilities of Nuclear Weapons" - data now declassified, yet still covered up by "arms control and disarmament" liars today, to try to destroy credible deterrence of war in order to bolster their obviously pro-Russian political anti-peace agenda. "Disarmament and arms control" charlatans, quacks, cranks, liars, mass-murdering Russian affiliates, and the evil genocidal Marxist media must be exposed for what they are: what they were in the 1930s, when they enabled Hitler to murder tens of millions in war.

ABOVE: 11 May 2023, a Russian state TV Channel 1 loon openly threatens nuclear tests and the bombing of the UK. Seeing how the Russian media is under Putin's control, this is like Dr Goebbels' rantings 80 years ago. But it doesn't disprove the world war threat, any more than it did with Dr Goebbels. These people, like the BBC here, don't just communicate "news", but do so selectively, with interpretations and opinions that set the stage for a pretty obviously hate-based political agenda with their millions of viewers - a trick that worked in the 1st Cold War despite Orwell's attempts to lampoon it in "1984" and "Animal Farm". When in October 1962 the Russians put nuclear weapons into Cuba in secret, without any open "threats", and with a MASSIVELY inferior overall nuclear stockpile to the USA (the USA had MORE nuclear weapons, more ICBMs, etc.), the media made a big fuss, even when Kennedy went on TV on 22 October and ensured there would be no nuclear "accidents" in Cuba by telling Russia that any single accidentally launched missile from Cuba against any Western city would result in a FULL RETALIATORY STRIKE ON RUSSIA. There was no risk of nuclear war then except by accident, and Kennedy had, in his 25 May 1961 "Urgent National Needs" speech a year and a half earlier, instigated NUCLEAR SHELTERS in public building basements to help people in cities survive (modern concrete buildings survived near ground zero in Hiroshima, as proved by declassified USSBS reports kept covered up by Uncle Sam). NOW THAT THERE IS A CREDIBLE THREAT OF NUCLEAR TESTS AND HIROSHIMA-TYPE INTIMIDATION STRIKES, THE BBC FINALLY DECIDES TO SUPPRESS NUCLEAR NEWS, THEREBY HELPING "ANTI-NUCLEAR" RUSSIAN PROPAGANDA AIMED AT PREVENTING US FROM GETTING CREDIBLE DETERRENCE OF INVASIONS, AS WE HAD WITH THE W79 UNTIL DISARMERS REMOVED IT IN THE 90s!
This stinks of prejudice - the usual sort of hypocrisy from the 1930s "disarmament heroes" who lied their way to Nobel Peace Prizes and helped start a world war!



Friday, March 23, 2007

Radiation Effects Research Foundation covers up the very low cancer rates of Hiroshima and Nagasaki nuclear survivors using a cynical obfuscation tactic

In a monitored sample of 36,500 survivors followed over a 40-year period, there were 176 leukemia deaths, which is 89 more than the matched control (unexposed) group suffered naturally. There were 4,687 other cancer deaths, but that was merely 339 above the number in the control group - statistically a much smaller relative rise than the leukemia result. (Data: Radiation Research, volume 146, 1996, pages 1-27.) In other words, 51% of the leukemia deaths among irradiated survivors were radiation-associated (leukemia being a very low natural risk in any case), against merely 7% of the other cancer deaths. Adding all the cancers together, the total was 4,863 cancer deaths (virtually all natural cancer, nothing whatsoever to do with radiation), just 428 more than the unexposed control group: so only 9% of the cancer deaths among exposed survivors were attributable to bomb radiation, spread over a period of 40 years. There was no increase whatsoever in genetic malformations.
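As a purely arithmetic cross-check on the counts just quoted (nothing here is new data), the percentages are the excess deaths expressed as a share of all deaths in the exposed group - a minimal sketch:

```python
def attributable_pct(total_deaths, excess_over_control):
    """Excess deaths as a percentage of all deaths in the exposed group."""
    return 100.0 * excess_over_control / total_deaths

# Figures quoted above (Radiation Research, v. 146, 1996, pp. 1-27):
leukemia = attributable_pct(176, 89)      # leukemia deaths, 89 excess
solid = attributable_pct(4687, 339)       # other cancer deaths, 339 excess
all_cancer = attributable_pct(4863, 428)  # all cancer deaths, 428 excess

print(round(leukemia), round(solid), round(all_cancer))  # 51 7 9
```

The rounding reproduces the 51%, 7% and 9% figures quoted in the text.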

'This continues the series of general reports on mortality in the cohort of atomic bomb survivors followed up by the Radiation Effects Research Foundation. This cohort includes 86,572 people with individual dose estimates ... There have been 9,335 deaths from solid cancer and 31,881 deaths from noncancer diseases during the 47-year follow-up. ... We estimate that about 440 (5%) of the solid cancer deaths and 250 (0.8%) of the noncancer deaths were associated with the radiation exposure [emphasis added]. ... a new finding is that relative risks decline with increasing attained age, as well as being highest for those exposed as children as noted previously. A useful representative value is that for those exposed at age 30 the solid cancer risk is elevated by 47% per sievert at age 70. ... There is no direct evidence of radiation effects for doses less than about 0.5 Sv [emphasis added; notice that this report considers 86,572 people with individual dose estimates, and 40% have doses below 5 mSv or 0.005 Sv, so the politically expedient so-called 'lack of evidence' is actually a fact backed up by one hell of a lot of evidence that there are no radiation effects at low doses, a fact the biased scare-story-selling media and corrupt politically-expedient politicians will never report!].' - D. L. Preston, Y. Shimizu, D. A. Pierce, A. Suyama, and K. Mabuchi, 'Studies of mortality of atomic bomb survivors. Report 13: Solid cancer and noncancer disease mortality: 1950-1997', Radiation Research, volume 160, issue 2, pp. 381-407 (2003).
Above: what is being politically covered up in the latest reports by the Radiation Effects Research Foundation. D. A. Pierce and D. L. Preston (Radiation Effects Research Foundation, Hijiyama Park, Hiroshima) wrote in 'Radiation-related cancer risks at low doses among atomic bomb survivors', Radiation Research, volume 154, issue 2, pp. 178-86 (August 2000): 'To clarify the information in the Radiation Effects Research Foundation data regarding cancer risks of low radiation doses, we focus on survivors with doses less than 0.5 Sv. ... Analysis is of solid cancer incidence from 1958-1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. The results provide useful risk estimates for doses as low as 0.05-0.1 Sv, which are not overestimated by linear risk estimates computed from the wider dose ranges 0-2 Sv or 0-4 Sv. There is ... an upper confidence limit on any possible threshold is computed as 0.06 Sv [emphasis added]. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.' In the illustration above, a dose of 3.4 rads (gamma dose equivalent) reduced the natural leukemia rate by 30% in the Hiroshima and Nagasaki data available in 1982. There seems to be a "threshold" of 8 rads before there is any increase in risk. (H. Kato and W. J. Schull, 'Studies of the mortality of A-bomb survivors. 7. Mortality, 1950-1978: Part I. Cancer mortality', Radiation Research, May 1982, volume 90, issue 2, pp. 395-432.) The accuracy in dosimetry at that time (substantiated by measurements of neutron-induced activity and gamma ray thermoluminescence in the two cities) meant that the doses were generally believed accurate to about +/- 50% (the accuracy of later estimates has increased).
These data are based on a radiation quality factor of about 20 for neutrons, adopted to reconcile the data from the two cities (the Hiroshima gun-type bomb leaked the most neutrons; in the Nagasaki device, which worked by spherically symmetrical implosion, the neutrons were mainly absorbed by the surrounding high explosive): i.e., 1 rad from neutrons was considered to be equivalent to 20 rads of gamma rays. The reason for the reduction in the natural leukemia rate by 3.4 rads may be the stimulation of the protein P53 repair mechanism, which repairs DNA strands broken by radiation, and/or a long-term boost to the immune system caused somehow by surviving the nuclear explosions with low doses. It is unlikely to be a purely random statistical error, because the sample of people exposed to low doses of radiation is very large - 23,073 people exposed to an average of 3.4 rads, with an unexposed control group of 31,581. However, the exact doses received were still fairly uncertain in 1982, and the survivors of Hiroshima and Nagasaki were still dying:


This means that the early data from the 1950s, upon which the whole health physics philosophy is based (a linear dose-effects relation with no threshold dose before effects start to appear, and no allowance for dose rate - see previous post), is useless: not only because of the dosimetry, but because it was premature to judge long-term effects from such early data. For example, the major source of 1950s data from Hiroshima and Nagasaki is summarised in a table on page 966 of the 1957 U.S. Congressional Hearings before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, The Nature of Radioactive Fallout and Its Effects on Man. This table was headed "Incidence of leukemia among the combined exposed populations of Hiroshima and Nagasaki by distance from the hypocenter (January 1948-September 1955)", and it is divided into distances of 0-1 km (0.96% of survivors had leukemia), 1-1.5 km (0.30%), 1.5-2 km (0.043%) and beyond 2 km (0.017%). This early data was simply not detailed enough, nor collected over a long enough period of time, to assess the effects of radiation properly, and there was no proper dosimetry to determine the doses people received, their shielding by houses (and the mutual shielding of clusters of houses), etc. The valuable data has taken decades to accumulate.

The joint Japanese-American (Department of Energy funded) Radiation Effects Research Foundation isn't putting the sort of detailed dose-effects data we need on the internet, due to political bias in favour of fashionable prejudice in Japan, despite such bias being cynically anti-scientific, ignorance-promoting, politically expedient dogmatism: its online 16-page booklet, 'Basic Guide to Radiation and Health Sciences', gives no quantitative results on radiation effects whatsoever, while it falsely promotes lies about radioactive rainout on page 5:



Above: by the time that the mass fires developed in the wooden homes of Hiroshima (breakfast time) and Nagasaki (lunch time) - from blast-wind-overturned cooking braziers, paper screens, and bamboo furnishings - the mushroom cloud had been blown away by the wind. The moisture and soot from the Hiroshima firestorm, which condensed to a 'black rain' once it had risen and cooled above the city, fell on the city an hour or two after the explosion and did not intersect the radioactive mushroom cloud, which had in any case attained a much higher altitude than the firestorm soot and moisture. The neutron-induced activity doses from the ground were trivial compared to the outdoor initial nuclear radiation doses, as illustrated in a previous post using the latest DS02 dosimetry. The RERF propaganda seeks to discredit civil defence, a continuation of the fellow-travelled Cold War Soviet communist propaganda against Western defences.

‘Science is the organized skepticism in the reliability of expert opinion.’

- R. P. Feynman (quoted by Smolin, The Trouble with Physics, 2006, p. 307).

‘Science is the belief in the ignorance of [the speculative consensus of] experts.’

- R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

The linear non-threshold (LNT) anti-civil-defence dogma results from ignoring the vitally important effect of dose rate on cancer induction, which has been known and published for about 50 years in papers by Mole and a book by Loutit. The current dogma is falsely based on merely the total dose, thus ignoring the time-dependent ability of protein P53 and other cancer-prevention mechanisms to repair broken DNA segments. This is particularly the case for double strand breaks, where the whole double helix gets broken; the repair of single strand breaks is less time-dependent, because there is no risk of the broken single strand being joined to the wrong end of a broken DNA segment. Repair only succeeds in preventing cancer if the broken ends are repaired correctly before too many unrepaired breaks have accumulated in a short time; if too many double strand breaks occur quickly, segments can be incorrectly 'repaired' with double strand breaks mis-matched to the wrong segment ends, possibly inducing cancer if the resulting somatic cell can then undergo division without apoptosis.
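The dose-rate argument can be illustrated with a toy numerical sketch. All parameter values below (breaks per rad, repair capacity, misrepair rate) are invented purely for illustration - this is not a validated radiobiology model, just the saturable-repair logic made explicit:

```python
def misrepaired_breaks(total_dose, duration_hours, breaks_per_rad=10.0,
                       repairs_per_hour=5.0, misrepair_rate=0.1, dt=0.01):
    """Toy model: DNA breaks arrive at a rate set by the dose rate; a fixed
    repair capacity mends pending breaks correctly; breaks left pending have
    a chance per hour of being wrongly rejoined (mis-repaired)."""
    dose_rate = total_dose / duration_hours          # rads per hour
    pending = 0.0       # breaks awaiting repair
    misrepaired = 0.0   # breaks wrongly rejoined (potentially carcinogenic)
    for _ in range(int(duration_hours / dt)):
        pending += breaks_per_rad * dose_rate * dt       # new breaks
        pending -= min(pending, repairs_per_hour * dt)   # correct repairs
        bad = misrepair_rate * pending * dt              # wrong rejoining
        misrepaired += bad
        pending -= bad
    return misrepaired

# Same 10 rad total dose: acute delivery swamps the fixed repair capacity,
# while chronic delivery lets repair keep up, so mis-repairs stay near zero.
acute = misrepaired_breaks(10.0, 1.0)      # 10 rads in 1 hour
chronic = misrepaired_breaks(10.0, 100.0)  # 10 rads over 100 hours
print(acute > chronic)  # True
```

The point is that a total-dose-only model like LNT cannot distinguish these two cases, whereas any model with a finite repair capacity predicts a large difference.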
See http://www.rerf.jp/top/qae.htm. If you look at the data they provide at http://www.rerf.or.jp/eigo/faqs/faqse.htm, it only extends to 1990 and deliberately doesn't include any dosimetry at all (although the doses depend on shielding, they could have dealt with this by providing average shielding figures at each range, or simply by ignoring distance and plotting dose versus effects). But I found the 1988 report update based on the 1986 dosimetry, which is close to the latest data: Y. Shimizu, et al., Life Span Study Report 11, Part 2, Cancer Mortality in the Years 1950-1985 Based on the Recently Revised Doses (DS86), Radiation Effects Research Foundation, RERF-TR-5-88:

You can see that small doses up to 5 rads have no effect either way on the leukemia risk, while 6-9 rads in this data seems to cause a reduction in normal leukemia risk from 0.17% to 0.12%. Doses exceeding this are harmful, possibly because the P53 repair mechanism was saturated and could not repair radiation-induced damage to DNA at the rate it occurred at higher doses. A dose of 20-40 rads more than doubles the natural leukemia risk. Hence anyone getting leukemia after such a dose is more likely than not to have got the cancer as a result of the radiation exposure rather than naturally. (You cannot say this about other forms of cancer, because 23% of Americans die from some form of cancer now anyway, so for most types of cancer even the risks at massive radiation doses can't compete with the natural risk.) Notice that the DS02 dosimetry dose-effects estimates are within 10% of the earlier DS86 estimates. DS02 (Dosimetry System 2002) was adopted in 2003, and gives radiation doses at 1 m above the ground in open terrain at 1 km from ground zero of 4.5 and 8.7 Gy in Hiroshima and Nagasaki, respectively, with 0.08 and 0.14 Gy at 2 km, respectively. According to the recent Life Span Study report for the period 1950-2000, among 86,611 survivors for whom individual doses were estimated, there were 47,685 deaths (55% of the total number of survivors alive in 1950), including 10,127 from solid cancer and 296 from leukemia. Out of the 10,127 solid cancer deaths, only 5% were due to radiation, as shown by comparison to a non-exposed (but otherwise matched) control group.
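The "more likely than not" reasoning for risk-doubling doses follows from the standard probability-of-causation formula, PC = (RR - 1)/RR, where RR is the relative risk in the exposed group. A minimal sketch (the formula is standard epidemiology, not RERF's specific methodology):

```python
def probability_of_causation(relative_risk):
    """Fraction of cases in the exposed group attributable to the exposure,
    given relative risk RR = (rate in exposed) / (rate in unexposed)."""
    return (relative_risk - 1.0) / relative_risk

print(probability_of_causation(2.0))   # 0.5: doubling the risk -> 50% attributable
print(probability_of_causation(1.05))  # ~0.048: a small excess, like solid cancers
```

So a dose that more than doubles the leukemia rate pushes the probability of causation past 50%, while the roughly 5% solid cancer excess leaves any individual solid cancer overwhelmingly likely to be natural.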

In 1969, Professor Ernest Sternglass, a physicist, claimed to correlate a dramatic increase in infant mortality during the 1950s with the increasing fallout radiation from nuclear testing. His papers and books on low-level radiation effects were unscientific in the sense that they illustrate how not to do science: he had no control group, unlike the Hiroshima and Nagasaki data, so he had no idea what was actually driving childhood mortality. It could have been diet, proximity to smoking adults at home, effects of prenatal X-rays (see previous post), childhood X-ray checks or screening for TB, etc.


Above: Professor Sternglass' analysis, which wasn't even based upon a real increase in childhood mortality - mortality was falling before, during and after the nuclear tests. Sternglass instead claimed that, in the absence of nuclear testing, childhood mortality should (in his opinion) somehow have continued to decrease at the average rate of decline of 1935-50 (when better medical care was reducing childhood mortality). Then he claimed that the flattening of the curve which occurred instead was evidence for a relative increase in childhood mortality due to fallout radiation from nuclear testing.

He was therefore first assuming that fallout from bomb testing was responsible, and then - without stating this assumption - using the correlation between infant mortality and fallout rate as evidence that fallout was causing the increase! His first presentation was at the 9th Annual Hanford Biological Symposium, May 1969. On 24 July 1969, Dr Alice Stewart wrote an article for the New Scientist, "The Pitfalls of Extrapolation", which found another contradiction in Professor Sternglass' theory:

"Sternglass has postulated a fetal mortality trend which would eventually produce rates well below the level which - according to his own theory - would result from background radiation."

The danger here is that bad science, lacking a mechanism, can be asserted and become credible with the public despite being completely false, just because a scientist misuses authority to gain attention. In this case, when Sternglass' paper was rejected by a scientific journal, he had it published in the September 1969 issue of Esquire magazine, titled 'The death of all children'. The magazine advertised the story as a selling point, and sent out copies to prominent people in politics. If he had scientific evidence that was being covered up, that would have been a reason to do so - assuming the media would be interested in making a political storm out of facts (which in reality strongly support the opposite of Sternglass' conclusion). So you end up with the idea that these false claims about low level radiation stem from politics: if the public wants to fear low-level, low-dose-rate radiation, someone will fiddle the statistics accordingly. Anyone giving the facts is conveniently ignored, or ridiculed as being 'out of touch' or part of a conspiracy and cover-up.

Sternglass' straight-line extrapolation is complete pseudoscience: carried into the past, it predicts a time of over 100% infant mortality (evidently wrong, because people are alive now), and extrapolated into the future, it predicts 0% childhood mortality (clearly false, because disease cannot be entirely eradicated, despite innovations like sulfonamides and antibiotics in the 1935-50 era). This type of error, due to a lack of causality and proper mechanism-based prediction, is not limited to the controversy over radiation effects. It is also typical of the controversy created by people like Dr Edward Witten, string theorist: Witten claims that string theory has the wonderful property of "predicting gravity". Actually, it predicts nothing checkable about gravity: 11-dimensional supergravity is assumed to be true, because it is assumed that gravity is due to spin-2 gauge bosons (gravitons) which nobody has ever observed. What Witten should have said is that string theory is an ad hoc model which includes unobserved spin-2 gravitons - a far cry from claiming that it predicts gravity. These people may be well-meaning, but they are being misleading over scientific facts to boost a research program or political viewpoint, for the sake of politics or controversy, not science. As stated in an earlier post, quite a bit of iodine-131 was released across America by Nevada testing in 1951-62, but even the effects of that were far smaller than what Sternglass was claiming.
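The absurdity of the straight-line extrapolation is easy to demonstrate with made-up illustrative numbers (these are not real vital statistics, just a steady linear decline like that of 1935-50):

```python
# Made-up infant mortality percentages, falling steadily as in 1935-50:
years = [1935, 1940, 1945, 1950]
mortality_pct = [6.0, 5.0, 4.0, 3.0]

# Least-squares straight-line fit (pure Python, no libraries needed)
n = len(years)
xbar = sum(years) / n
ybar = sum(mortality_pct) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(years, mortality_pct))
         / sum((x - xbar) ** 2 for x in years))
intercept = ybar - slope * xbar

year_mortality_zero = -intercept / slope           # line hits 0% (impossible)
year_mortality_total = (100 - intercept) / slope   # line was at 100% (impossible)

print(round(year_mortality_zero))   # 1965: every childhood disease "eradicated"
print(round(year_mortality_total))  # 1465: supposedly no infant ever survived
```

A trend line with no causal mechanism behind it collapses into nonsense the moment it is pushed outside the fitted range, which is exactly the trap Dr Alice Stewart identified.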

Darrell Huff wrote a book called How to Lie with Statistics which includes the example that researchers found the number of children in a family in Holland correlated with the number of storks' nests on the roof of the home! Perhaps that proves that storks really deliver children to families? Well, actually, the bigger the family, the bigger the home it needed on average. The bigger families with more children tended to have bigger, older houses, with big old roofs which had more storks' nests precisely because of their size and age. Correlation is not causation. Professor Sternglass has recently moved on from claiming that low-level radiation is lethal: he has published a book about what happened before the big bang, using the analogy of an egg dividing many times to produce all the particles.

(I don't find that too scientific either, because it simply ignores the pair-production mechanism for the creation of fundamental particles in strong fields; it is essentially ad hoc theorizing which doesn't explain or predict the key issues in the cosmological application of general relativity - such as the epicycles of dark matter and dark energy - and it doesn't explain the Standard Model of particle physics. As a result, perhaps, it has not gained as much attention as his claims on low-level radiation. However, Sternglass is right in some details: that the double-slit interference seen with single photons is caused by the size of the photon compared to the slit spacing; that Bohr's mainstream Copenhagen Interpretation orthodoxy is "not even wrong" unpredictive belief; and that the Dirac sea in quantum field theory is censored today as a physical mechanism because of heresies over the aether - ideas he discussed with Einstein and others like Feynman, who advised him to check and prove them more carefully.)

Update: the report by Donald A. Pierce and Dale L. Preston of RERF, 'Radiation-Related Cancer Risks at Low Doses among Atomic Bomb Survivors' in Radiation Research v. 154 (2000), pp. 178–186 states: 'Analysis is of solid cancer [not leukemia] incidence from 1958–1994, involving 7,000 cancer cases among 50,000 survivors in that dose and distance range. ... There is a statistically significant risk in the range 0–0.1 Sv, and an upper confidence limit on any possible threshold is computed as 0.06 Sv. It is indicated that modification of the neutron dose estimates currently under consideration would not markedly change the conclusions.'

D. L. Preston et al., 'Effect of Recent Atomic Bomb Survivor Dosimetry Changes on Cancer Mortality Risk Estimates,' Radiation Research, v. 162 (2004), pp. 377-389, state: 'The Radiation Effects Research Foundation has recently implemented a new dosimetry system, DS02, to replace the previous system, DS86. This paper assesses the effect of the change on risk estimates for radiation-related solid cancer and leukemia mortality. The changes in dose estimates were smaller than many had anticipated, with the primary systematic change being an increase of about 10% in γ-ray estimates for both cities. In particular, an anticipated large increase of the neutron component in Hiroshima for low-dose survivors did not materialize. However, DS02 improves on DS86 in many details, including the specifics of the radiation released by the bombs and the effects of shielding by structures and terrain. ... For both solid cancer and leukemia, estimated age–time patterns and sex difference are virtually unchanged by the dosimetry revision. The estimates of solid-cancer radiation risk per sievert and the curvilinear dose response for leukemia are both decreased by about 8% by the dosimetry revision, due to the increase in the γ-ray dose estimates...' However, the difficulty of finding any recent reported summary of the key data on the internet suggests that maybe they are not publishing the detailed dose-versus-effects data, but just some average based on force-fitting the high-dose data to the linear, no-threshold model. Would publishing the full data be too embarrassing to the orthodoxy, and draw the ignorant scorn of the anti-nuclear lobby? Of course, the public at large only wants to hear lies about radiation, because it has been brainwashed by propaganda based on prejudice rather than science, and the media provide what readers want to hear: political arguments.

The information in the current online version of http://www.rerf.or.jp/eigo/faqs/faqse.htm#faq2, quoting data for 1950-90 from Radiation Research (146:1-27, 1996) without any doses to correspond to the effects, despite the massive 2002 dosimetry project, strongly suggests that the Radiation Effects Research Foundation is covering up the dose-effects data by only making available on the internet data stripped of the dosimetry, so as not to upset the 1950s linear, no-threshold religious-style orthodoxy - or rather, dogma. Of course, if they didn't cover up, the implication would be uproar. So the one really valuable source of information is censored.

There is no other really reliable data, because of the lack of good control groups (with similar exposures to other risks, similar lifestyles, etc.) and of statistically significant exposed population sizes. For example, the 64 Marshallese on Rongelap after the 1954 Bravo test, who received about 175 rads of gamma radiation from fallout over 44 hours, are too small a sample to yield accurate long-term data. In 1972, one person in that group of 64 died from leukemia attributed to the gamma radiation, and several thyroid nodules (most thyroid effects of radiation are not lethal) also occurred, as a result of beta radiation to the thyroid gland from drinking water collected in an open cistern which became contaminated by fallout containing iodine-131. Although this gives a leukemia risk of 1/64 after 175 rads received over 44 hours, the figure is statistically very weak because of the small sample size.
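The weakness of that 1-in-64 figure can be made quantitative with an exact (Clopper-Pearson) binomial confidence interval. The sketch below is my own illustration in standard-library Python, not anything from the Marshallese studies; it shows that a single event in 64 people is consistent with a true risk anywhere from roughly 0.04% up to around 8%.

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k + 1))

def bisect_root(f, lo=0.0, hi=1.0, tol=1e-10):
    """Root of an increasing function f on [lo, hi] by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided confidence interval for a binomial proportion."""
    lower = 0.0 if k == 0 else bisect_root(
        lambda p: (1 - binom_cdf(k - 1, n, p)) - alpha / 2)
    upper = 1.0 if k == n else bisect_root(
        lambda p: alpha / 2 - binom_cdf(k, n, p))
    return lower, upper

# One leukemia death among the 64 exposed Rongelap islanders:
lo_p, hi_p = clopper_pearson(1, 64)
print(f"95% CI for the underlying risk: {lo_p:.2%} to {hi_p:.2%}")
```

A two-hundred-fold spread between the lower and upper bounds is what "statistically very weak" means in practice.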

Hiroshima and Nagasaki data are deliberately abused for propaganda purposes by ignoring the low-dose data and falsely taking the high-dose data, treating effects as directly proportional to dose with no threshold and no dose-rate effect. Claims have sometimes been made in the past that the cancer rates were worse than previously thought; these stem from dosimetry revisions. In 1957, isolated Japanese-type wooden houses were exposed to nuclear tests in Nevada during Operation Plumbbob, to determine how much radiation shielding they provided. The wooden houses gave a typical protection factor of about 2-3 against the initial neutrons and gamma rays. But it's obvious that a cluster of houses will provide more shielding than an isolated house in a desert, because the slant direct and scattered radiation must also penetrate the surrounding buildings; this shielding by adjacent buildings was ignored. Later it was shown that the mutual shielding of houses in a city doubles the overall protection factor for wooden houses, from 2-3 to 4-6. As a result, the estimated doses to the survivors were halved. The same number of cancers was therefore attributed to only half as much radiation, so the inferred number of cancers per unit of radiation was doubled.
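The arithmetic of that revision is trivial but worth making explicit. The numbers below are invented for illustration (the cohort size and cancer count are hypothetical, not the actual survivor statistics); only the halving of the dose estimate reflects the text.

```python
# Hypothetical cohort, for illustration only: the observed excess cancers
# are fixed, but the dosimetry revision halves the estimated collective dose.
excess_cancers = 500           # invented figure
collective_dose_old = 10_000   # person-Sv, shielding understated (invented)
collective_dose_new = collective_dose_old / 2  # doses halved by the revision

risk_old = excess_cancers / collective_dose_old  # cancers per person-Sv
risk_new = excess_cancers / collective_dose_new

print(risk_new / risk_old)  # the inferred risk per sievert doubles: 2.0
```

The same observed effect divided by half the dose gives twice the risk coefficient: a change in dosimetry, not in biology.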

So these revisions were caused by dosimetry, not new effects showing up! The dosimetry is very accurate now. The effects of radiation are "well known" in the scientific sense, although they're not "well known" in the political sense.

Kenneth L. Mossman of Arizona State University wrote a review of the problem in the March 1998 issue of Medical Physics (v25, Issue No. 3, pp. 279-284), on 'The linear no-threshold debate: Where do we go from here?', arguing:

'For the past several years, the LNT (linear no-threshold) theory has come under attack within the scientific community. Analysis of a number of epidemiological studies of the Japanese survivors of the atomic bombings and workers exposed to low level radiation suggest that the LNT philosophy is overly conservative, and low-level radiation may be less dangerous than commonly believed. Proponents of current standards argue that risk conservatism is justified because low level risks remain uncertain and it is prudent public health policy; LNT opponents maintain that regulatory compliance costs are excessive, and there is now substantial scientific information arguing against the LNT model. Regulators use the LNT theory in the standards setting process to predict numbers of cancers due to exposure to low level radiation because direct observations of radiation-induced cancers in populations exposed to low level radiation are difficult. The LNT model is simplistic and provides a conservative estimate of risk. Abandoning the LNT philosophy and relaxing regulations would have enormous economic implications. However, alternative models to predict risk at low dose are as difficult to justify as the LNT model. Perhaps exposure limits should be based on model-independent approaches. There is no requirement that exposure limits be based on any predictive model. It is prudent to base exposure limits on what is known directly about health effects of radiation exposure of human populations.'

A more recent review, in 2005, of the mechanism behind the Hiroshima and Nagasaki data at low doses was done by L. E. Feinendegen in his paper, 'Evidence for beneficial low level radiation effects and radiation hormesis' in the British Journal of Radiology, v78 (2005), pp. 3-7:

'Low doses in the mGy range [1 mGy = 0.1 rad, since 1 Gray = 1 Joule/kg = 100 rads] cause a dual effect on cellular DNA. One is a relatively low probability of DNA damage per energy deposition event and increases in proportion to the dose. At background exposures this damage to DNA is orders of magnitude lower than that from endogenous sources, such as reactive oxygen species. The other effect at comparable doses is adaptive protection against DNA damage from many, mainly endogenous, sources, depending on cell type, species and metabolism. Adaptive protection causes DNA damage prevention and repair and immune stimulation. It develops with a delay of hours, may last for days to months, decreases steadily at doses above about 100 mGy to 200 mGy and is not observed any more after acute exposures of more than about 500 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses reduction of damage from endogenous sources by adaptive protection maybe equal to or outweigh radiogenic damage induction. Thus, the linear-no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favour of a threshold or hormesis. This is consistent with data both from animal studies and human epidemiological observations on low-dose induced cancer. The LNT hypothesis should be abandoned and be replaced by a hypothesis that is scientifically justified and causes less unreasonable fear and unnecessary expenditure.'

See also the previous post on this blog for the cause of the LNT error due to early, inaccurate 1950s data, and ignoring dose rate consequences.
Online there is a 1982 book by Harvey Wasserman, Norman Solomon, Robert Alvarez and Eleanor Walters called 'Killing our Own: Chronicling the Disaster of America's Experience with Atomic Radiation, 1945-1982'. It contains a summary of all the radiation horror stories (some, like Sternglass's, are pseudoscience, and some are valid). It doesn't contain any of the basic data with large control groups showing how many excess cancers actually occur as a function of dose at particular dose rates. It relies instead on the opinions of committees and scientific authorities, repeating Sternglass' claims in chapter 11 and complaining that 'The industry as a whole has devoted thousands of dollars to undercutting his reputation.' That's the problem: you don't deal with errors by making ad hominem attacks on the reputations of the people making them, but by clearly pointing out where the errors are. Better still, publish the facts briefly, clearly, honestly and fairly as simple graphs in the first place; then the public will know what they are and will be able to make informed judgements.

In May 1985, a U.S. National Research Council report on mortality in nuclear weapons test participants raised several questions. Some 5,113 nuclear test participants had died between 1952 and 1981, when 6,125 deaths would be expected for a similar-sized group of non-exposed Americans. The number of leukemia deaths was 56, identical to that in a similar-sized non-exposed group. However, as the graph at the top of this post shows, the risk depends on the dose, so the few people with the highest doses would have had far greater risks. In 1983, a C.D.C. report on the effects of fallout from the Plumbbob-Smoky test of 1957 showed that 8 participants in that test had died from leukemia up to 1979, compared to only 3 expected from a similar-sized sample of non-exposed Americans. However, even for the Plumbbob-Smoky test, the overall death rate from all causes among the exposed test participants (320 deaths from 1957-79) was less than that in a matched sample of non-exposed Americans (365 deaths). The average dose to American nuclear test participants was only about 0.5 rad, although far higher doses were received by those working with fallout soon after nuclear tests. Altogether, out of 205,000 U.S. Department of Defense participants in nuclear tests, 34,000 were expected to die from naturally occurring cancer, and 11 from cancer due to radiation exposure. (According to the March 1990 U.S. Defense Nuclear Agency study guide DNA1.941108.010, report HRE-856, Medical Effects of Nuclear Weapons.)
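How surprising is 8 leukemia deaths when 3 are expected? A quick Poisson check of my own (not a calculation from the C.D.C. report) puts a number on it:

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam): the chance of seeing k or more
    deaths by chance when lam are expected."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

# Plumbbob-Smoky: 8 leukemia deaths observed, 3 expected
p = poisson_sf(8, 3.0)
print(f"P(8 or more deaths | 3 expected) = {p:.4f}")
```

The probability comes out at about 1.2%: nominally significant, but such a figure takes no account of how many separate test groups were examined before this one cluster was singled out, so it is weaker evidence than it looks.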

Update (13 August 2007):

There is an essay by Dr Donald W. Miller, Afraid of Radiation? Low Doses are Good for You, available in PDF format here. Two problems with that title are:

(1) as pointed out in previous posts, only long-ranged, low-LET radiation like gamma rays and x-rays (which are electrically neutral, and thus only weakly ionizing) exhibits a threshold in all reliable data. Alpha and beta radiations are short-ranged, high-LET radiation, so where they can gain entry to the body (by being inhaled or ingested in soluble form, for example, which is not easy for insoluble radioactivity trapped in fallout particles composed of fused glass spheres from melted sand grains), they can irradiate a few nearby cells very intensely because of their short range. With alpha and beta radiation there is no threshold dose, all exposure is potentially harmful, and the effects obey a linear dose-response relationship at low doses. Only for gamma and x-rays at low dose rates are there thresholds, and possible benefits from stimulation of the immune system and of DNA repair mechanisms like P53.

(2) the dose rate seriously affects the rate of cancer induction, which is an effect currently ignored completely by Health Physicists. This is because all laws and personal 'dosimeter' radiation monitoring systems for radiation safety record merely the integrated total doses, without regard to the dose rate at which the dose was received. (Some effects prediction schemes do make arbitrary 'factor of two' corrections, doubling the danger expected from doses received above some threshold for high dose rates, but these corrections grossly neglect the observed facts; see previous post for details of how this was discovered in animal experiments, and why it is still censored out!).

Summary: gamma or x-ray radiation received at a low dose rate in small total doses can reduce the normal cancer rate. If the same small total dose is received at a high dose rate, however, protein P53 may not be able to repair the damage fast enough during the exposure; if multiple breaks in DNA strands are produced within a short period of time, the broken pieces risk being 'repaired' incorrectly (joined the wrong way around, for instance), initiating cancer at some future time when that DNA is unzipped and copied in order to create new cells.

This isn't rocket science. As an analogy, solar radiation contains ultraviolet radiation, which will make a geiger counter (provided it has a transparent glass window, not a shield that keeps ultraviolet light out!) click rapidly, since ultraviolet borders the soft x-ray spectrum and is weakly ionizing. If you receive ultraviolet radiation at a low dose rate in small total doses, the positive effects may outweigh the risks: vitamin D is produced, which is helpful rather than dangerous. If, however, you are exposed to very intense ultraviolet, the DNA in your skin gets broken up at a rate faster than protein P53 can stick the pieces together again, so some bits are put back in the wrong order, and skin cancer may eventually result when those cells try to divide to form new skin cells. The visible 'burning' of skin by ultraviolet is also a dose-rate effect, causing cellular death and serious cellular disruption. It doesn't matter so much what the total dose is. What matters even more than the dose, for long-term effects, is the dose rate (speed) at which the radiation dose is received.

The key facts about radiation seem to be: it's all harmful at sufficiently high dose rates and high total doses. Gamma and x-rays are 'safe' (i.e., have advantages which outweigh risks) at low dose rates and low total doses (obviously dose rates were high at Hiroshima and Nagasaki, where 95% of the doses were received from initial radiation lasting 10 seconds). On the other hand, there is always a risk from cellular exposure to alpha and beta radiation, because they are short-ranged, so all their energy is absorbed in just a small number of cells. Because they are quickly stopped by solid matter, they deposit all their energy in sensitive areas of bone tissue if you inhale or ingest sources of alpha and beta radiation that can be deposited in bones (a very small fraction of ingested soluble radium, strontium, uranium and plutonium ends up in the bones). Gamma rays and x-rays are not dangerous at low dose rates and small total doses because, carrying no electrical charge, they are not stopped as easily by matter as alpha and beta particles. This means that gamma and x-rays deposit their energy over a larger volume of tissue, so that at low dose rates DNA repair mechanisms can repair the damage as soon as it occurs.

Anyway, to get back to the paper by Donald W. Miller, Jr., MD: he does usefully explain the mechanism behind the beneficial effect, which the orthodoxy obscures:

'A process known as radiation hormesis mediates its beneficial effect on health. Investigators have found that small doses of radiation have a stimulating and protective effect on cellular function. It stimulates immune system defenses, prevents oxidative DNA damage, and suppresses cancer.'

He cites the monumental report on the effects of low dose rate, low-LET gamma radiation on 10,000 people in Taiwan by W. L. Chen, Y. C. Luan, M. C. Shieh, S. T. Chen, H. T. Kung, K. L. Soong, Y. C. Yeh, T. S. Chou, S. H. Mong, J. T. Wu, C. P. Sun, W. P. Deng, M. F. Wu, and M. L. Shen, 'Is Chronic Radiation an Effective Prophylaxis Against Cancer?', published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, p. 6, available in PDF format here:

'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

'The experience of these 10,000 persons suggests that long term exposure to [gamma]radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'



The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations of Hiroshima and Nagasaki data to low doses. For instance, the survivors of high doses at Hiroshima and Nagasaki knew they had been irradiated, so no blinding was possible to prevent an “anti-placebo” effect: increased fear, psychological stress and worry about the long-term effects of radiation, and the associated behaviour. The 1958 book about the Hiroshima and Nagasaki survivors, “Formula for Death”, makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation-induced cancer anyway. The fear culture among the irradiated survivors would therefore be expected, statistically, to produce deviations from normal behaviour, in some cases increasing the cancer risks above those due purely to radiation exposure.

For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see

http://en.wikipedia.org/wiki/Radiation_hormesis

The irrational, fashionable, groupthink semi-religious (believing in speculation) society we live in!

‘Science is the organized skepticism in the reliability of expert opinion.’ - R. P. Feynman (quoted by Smolin, TTWP, 2006, p. 307).

‘Science is the belief in the ignorance of [the speculative consensus of] experts.’ - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p. 187.

If we lived in a rational society, the facts above would be reported in the media and would be the focus of discussion about radiation hazards. Instead, the media and their worshippers (the politicians), as well as their funders (the general public, who pay for the media), choose to ignore or ridicule the facts because the facts are 'unfashionable', while lying bullshit (see the Sternglass graph above) is 'fashionable': some sort of consensus of mainstream narcissistic elitists with a political mandate to kill people by lying about the effects of low-level radiation and refusing to discuss the facts. There is no uncertainty about these facts: radiation effects have been better checked and more extensively studied than any other alleged hazard to life!

Below is a little summary of politically inexpedient facts from a book edited by Nobel Laureate Professor Eugene P. Wigner, Survival and the Bomb: Methods of Civil Defense, Indiana University Press, Bloomington, London, 1969.

The dust jacket blurb states: 'The purpose of civil defence, Mr. Wigner believes, is the same as that of the anti-ballistic missile: to provide not a retaliation to an attack, but a defense against it; for no peace is possible as long as defense consists solely in the threat of revenge and as long as an aggressor - the one who strikes first - has a considerable advantage. Civil and anti-ballistic missile defense not only provide some protection against an attack, they render it less likely by decreasing the advantage gained by striking first.'

The chapter on 'Psychological Problems of A-Bomb Defense' is by Professor of psychology, Irving L. Janis, who states on p. 61:

'It has been suggested that the device of using increasing doses of graphic sound films (preferably in technicolor) showing actual disasters should be investigated as a possible way of hardening people and preventing demoralization.'

He adds on pp. 62-3:

'For the large number of patients who will be worried about epilation, ugly scar tissue, and other disfigurements, a special series of pamphlets and posters might be prepared in advance, containing reassuring information about treatment and the chances of recovery.'

On pp. 64-5 he deals with the 'General Effects on Morale of A-Bomb Attack':

'In general, a single atomic bomb disaster is not likely to produce any different kind of effects on morale than those produced by other types of heavy air attacks. This is the conclusion reached by USSBS [U.S. Strategic Bombing Survey, 1945] investigators in Japan. Only about one-fourth of the survivors of Hiroshima and Nagasaki asserted that they had felt that victory was impossible because of the atomic bombing. The amount of defeatism was not greater than that in other Japanese cities. In fact, when the people of Hiroshima and Nagasaki were compared with those in all other cities in Japan, the morale of the former was found to resemble that of people in the lightly bombed and unbombed cities rather than in the heavily bombed cities. This has been explained as being due to the fact that morale was initially higher than average in the two cities because, prior to the A-Bomb disasters, the populace had not been exposed to a series of heavy air attacks. Apparently a single A-Bomb attack produced no greater drop in morale among the Japanese civilians than would be expected from a single saturation raid of incendiaries or of high explosive bombs.'

On p. 68, Professor Janis addresses the question 'Will There Be Widespread Panic?':

'Prior to World War II, government circles in Britain believed that if their cities were subjected to heavy air raids, a high percentage of the bombed civilian population would break down mentally and become chronically neurotic. This belief, based on predictions made by various specialists, proved to be a myth.'

The chapter on 'Decontamination' is by Dr Frederick P. Cowan (then the Head of the Health Physics Division, Brookhaven National Laboratory) and Charles B. Meinhold, who summarise data from a vital selection of decontamination research reports. The first report summarised (on page 227) is J. C. Maloney, et al., Cold Weather Decontamination Study, McCoy, I, II, and IV, U.S. Army Chemical Corps., Nuclear Defense Laboratory, Edgewood Arsenal, reports NDL-TR-24, -32, and -58 (1962, 1962 and 1964), which demonstrated that:

1. 'In most cases, the time during which access to important facilities must be denied can be reduced by a factor of 10 (e.g., from two months to less than a week) using practical methods of decontamination.'

2. 'Radiation levels inside selected structures can be reduced by a factor of 5.'

3. 'Radiation levels outdoors in selected areas can be reduced by a factor of 20.'

4. 'These results can be achieved without excessive exposure to individuals carrying out the decontamination.'

On page 228, Cowan and Meinhold point out:

'Although long sheltering periods may in some cases be reduced by the effect of rainfall or by transfer of people to less-contaminated areas, it is clear that decontamination is a very important technique for hastening the process of recovery.

'Although the gamma radiation from fallout is the major concern, the effects of beta radiation should not be overlooked. Fallout material left on the skin for an extended period of time [this critical time is just a few minutes for fallout contamination an hour after the explosion, but much longer periods of exposure are required for burns if the fallout is more than an hour old, and after 3 days the specific activity of fallout from a land surface burst is simply too low to cause beta burns] can cause serious burns, and if inhaled or ingested in sufficient quantities, it can result in internal damage. Grossly contaminated clothing may contribute to such skin exposures or indirectly to the ingestion of radioactive material. Thus it may be necessary to resort to decontamination of body surfaces, clothing, food and water.'

On pp. 229-230, the basic facts about land surface burst fallout stated are:

1. 'The mass of the radioactive material itself is a tiny fraction of the mass of the inert fallout material with which it is associated. Thus, in discussing the mechanics of removal, fallout may be considered as a type of dirt.'

2. 'In general, the amount of radioactive material removed is proportional to the total amount of fallout material removed.'

3. 'Although the solubility of fallout particles depends on the composition of the ground where the detonation took place, it is fair to say that detonations over land will produce essentially insoluble particles while detonations over water will produce much less but fairly soluble fallout material. This soluble material will have a much greater tendency to adsorb to surfaces.'

4. 'Under most circumstances one is dealing with small particle sizes.

'The methods applicable to radiological decontamination are those available to dirt removal in general. Some common examples are sweeping, brushing, vacuuming, flushing with water, scrubbing, surface removal, and filtration. In addition, the radioactive material can be shielded by plowing, spading, covering with clean dirt or by construction of protective dikes. Such methods may utilize power equipment or depend upon manual labor. Their effectiveness will vary widely, depending upon the method of application, the type of surface, the conditions of deposition, etc. ...


'Flushing with water can be very effective, particularly if the water is under pressure, the surface is smooth and proper drainage [to deep drains, where the radiation is shielded by intervening soil] is available. Under certain conditions, the use of water flushing during the deposition period can be of great value. The water will tend to fill the surface irregularities and prevent entrapment of particles. Soluble materials will be kept in solution, thereby reducing the chance of surface adsorption.'

On p. 232, a useful summary table of decontamination is given:




There is other extensive data on fallout decontamination in many previous posts on this blog, e.g., the posts here, here and here (the last link includes a slightly different table of decontamination efficiencies, which is interesting to compare with the table above), as well as several other earlier ones. Summing up the situation for urban area decontamination, Cowan and Meinhold state on p. 232:

'A number of factors make large-scale decontamination useful in urban areas. Much of the area between buildings is paved and, thus, readily cleaned using motorized flushers and sweepers, which are usually available. If, in addition, the roofs are decontaminated by high-pressure hosing, it may be possible to make entire buildings habitable fairly soon, even if the fallout has been very heavy.'

On page 237 they summarise the evidence concerning methods for the 'Decontamination of People, Clothing, Food, Water and Equipment':

'Since fallout is basically dirt contaminated with radioactive substances, it can be largely removed from the skin by washing with soap and water. ... Not all the radioactivity will be removed by washing, but that remaining will not be large enough to be harmful. ... To be a problem in relation to food, fallout must get into the food actually eaten by people. ... Vegetables exposed to fallout in the garden will be grossly contaminated but may still be usable after washing if protected by an outer skin or husk or if penetration of fallout into the edible portions is not excessive. ... Reservoirs will receive fallout, but much of it will settle to the bottom, be diluted by the huge volume of water, or be removed by the filtering and purifying systems. Cistern water may be very contaminated if contaminated rainwater or water from contaminated roofs has been collected. Milk from cattle who have fed on contaminated vegetation may contain large quantities of radioactive iodine for a period of a month or more ... but milk can be used for durable products such as powdered milk or cheese, since the radioactive iodine decays with a half-life of eight days. Thus, after a month only 7 percent of the initial [Iodine-131] remains.'
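The eight-day half-life arithmetic at the end of that passage checks out exactly; a quick sketch of my own (not from the book):

```python
# Radioactive decay of iodine-131 (half-life 8 days) after one month
half_life_days = 8.0
t = 30.0  # about a month

fraction = 0.5 ** (t / half_life_days)
print(f"{fraction:.1%} of the initial I-131 remains after {t:.0f} days")
```

This gives about 7%, agreeing with Cowan and Meinhold's figure, and is why a month's storage as cheese or powdered milk renders the iodine-131 content harmless.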

There is then a chapter on 'Economic Recovery' by Professor of Economics, Jack Hirshleifer, who points out on page 244:

'Economic recovery from localized bombing attacks in general has been quite remarkable. In Hiroshima, for example, power was generally restored to surviving areas on the day after the attack, and through railroad service recommenced on the following day. [Ref.: U.S. Strategic Bombing Survey, The Effects of Atomic Bombs on Hiroshima and Nagasaki, Washington, D.C., 1946, p. 8.]

'By mid-1949, population was back to the preattack level, and 70 percent of the destroyed buildings had been reconstructed. [Ref.: Research Department, Hiroshima Municipal Office, as cited in Hiroshima, Hiroshima Publishing, 1949.]

'In general, populations of damaged areas have been highly motivated to stay on, even in the presence of severe deprivation; once having fled, they have been anxious to return. The thesis has even been put forward that a community hit by disaster rebounds so as to attain higher levels of achievement than would otherwise have been possible. [Ref.: this refers to the study after the 1917 Halifax explosion, made by Samuel H. Prince, Catastrophe and Social Change, Columbia University-Longmans, Green, New York, 1920.] ...

'In the mid-nineteenth century, John Stuart Mill commented on:

... what has so often excited wonder, the great rapidity with which countries recover from a state of devastation; the disappearance, in a short time, of all traces of the mischiefs caused by earthquakes, floods, hurricanes, and the ravages of war. An enemy lays waste a country by fire and sword, and destroys or carries away nearly all the moveable wealth existing in it: all the inhabitants are ruined, and yet in a few years after, everything is much as it was before. - J. S. Mill, Principles of Political Economy, Ashley's New Edition, Longmans, Green, London, 1929, Book I, pp. 74-75.



From Dr Samuel Glasstone and Philip J. Dolan, The Effects of Nuclear Weapons, 3rd ed., 1977, pp. 611-3:


"From the earlier studies of radiation-induced mutations, made with fruitflies [by Nobel Laureate Hermann J. Muller and other geneticists who worked on plants, who falsely hyped their insect and plant data as valid for mammals like humans during the June 1957 U.S. Congressional Hearings on fallout effects], it appeared that the number (or frequency) of mutations in a given population ... is proportional to the total dose ... More recent experiments with mice, however, have shown that these conclusions need to be revised, at least for mammals. [Mammals are biologically closer to humans, in respect to DNA repair mechanisms, than short-lived insects whose life cycles are too small to have forced the evolutionary development of advanced DNA repair mechanisms, unlike mammals that need to survive for decades before reproducing.] When exposed to X-rays or gamma rays, the mutation frequency in these animals has been found to be dependent on the exposure (or dose) rate ...


"At an exposure rate of 0.009 roentgen per minute [0.54 R/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. [Emphasis added.] There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent ... with adult female mice ... a delay of at least seven weeks between exposure to a substantial dose of radiation, either neutrons or gamma rays, and conception causes the mutation frequency in the offspring to drop almost to zero. ... recovery in the female members of the population would bring about a substantial reduction in the 'load' of mutations in subsequent generations."






Above: the theory of the experimentally observed threshold doses for the radium dial painters and for the Hiroshima survivors.

Updates: http://glasstone.blogspot.com/2009/10/secrecy-propaganda-factual-evidence.html

31 Comments:

At 6:39 pm, Blogger nige said...

http://nige.wordpress.com/2007/04/05/are-there-hidden-costs-of-bad-science-in-string-theory/

... Low-level radiation is another example of a science being controlled by politics.

By the time the protein P53 repair mechanism for DNA breaks was discovered and the Hiroshima-Nagasaki effects of radiation were accurately known, the nuclear and health physics industries had been hyping inaccurate radiation effects models which ignored non-linear effects (like saturation of the normal P53 repair mechanism of DNA) and the effects of dose rate for twenty years.

The entire industry had become indoctrinated in the philosophy of 1957, and there was no going back. Most health physicists are employed by the nuclear or radiation industry at reactors or in medicine/research, so all these people have a vested interest in not rocking their own boat. The only outsiders around seem to be politically motivated in one direction only (anti-nuclear), so there's a standoff. Virtually everyone who enters the subject of health physics gets caught in the same trap, and so there is no mechanism in place to allow for any shift of consensus....

 
At 8:52 am, Blogger nige said...

http://motls.blogspot.com/2006/04/twenty-years-after-chernobyl.html

Saturday, April 29, 2006

Twenty years after Chernobyl

On Wednesday morning, it's been 20 years since the Chernobyl disaster... The communist regimes could not pretend that nothing had happened (although in the era before Gorbachev, they could have tried to do so) but they had attempted to downplay the impact of the meltdown. At least this is what we used to say for twenty years. You may want to look at how BBC news about the Chernobyl tragedy looked 20 years ago.

Ukraine remembered the event (see the pictures) and Yushchenko wants to attract tourists to Chernobyl. You may see a photo gallery here. Despite the legacy, Ukraine has plans to expand nuclear energy.

Today I think that the communist authorities did more or less exactly what they should have done - for example try to avoid irrational panic. It seems that only 56 people were killed directly and 4,000 people indirectly. See here. On the other hand, about 300,000 people were evacuated which was a reasonable decision, too. And animals are perhaps the best witnesses for my statements: the exclusion zone - now an official national park - has become a haven for wildlife - as National Geographic also explains:


Reappeared: Lynx, eagle owl, great white egret, nesting swans, and possibly a bear
Introduced: European bison, Przewalski's horse
Booming mammals: Badger, beaver, boar, deer, elk, fox, hare, otter, raccoon dog, wolf
Booming birds: Aquatic warbler, azure tit, black grouse, black stork, crane, white-tailed eagle (the birds especially like the interior of the sarcophagus)

... Greenpeace in particular are very wrong whenever they say that the impact of technology on wildlife must always have a negative sign. ...

In other words, the impact of that event has been exaggerated for many years. Moreover, it is much less likely that a similar tragedy would occur today. Nuclear power has so many advantages that I would argue that even if the probability of a Chernobyl-like disaster in the next 20 years were around 10%, it would still be worth using nuclear energy.

Some children were born with some defects - but even such defects don't imply the end of everything. On the contrary. A girl from the Chernobyl area, born around 1989, was abandoned by her Soviet parents, was adopted by Americans, and she became the world champion in swimming. Her name? Hint: the Soviet president was Gorbachev and this story has something to do with the atomic nucleus. Yes, her name is Mikhaila Rutherford. ;-)

http://motls.blogspot.com/2007/04/chernobyl-21-years-later.html

Thursday, April 26, 2007

Chernobyl: 21 years later

Exactly 21 years ago, the Ukrainian power plant exploded. ...

A new study has found that the long-term health impact of the Chernobyl disaster was negligible. All kinds of mortality rates were at most 1% higher than normal.

ScienceDaily, full study.

Everyday life is riskier.

Yushchenko calls for a revival of the zone. His proposals include a nature preserve - which is more or less a fact now - as well as production of bio-fuels and a science center. The Korean boss of the U.N. calls for aid to the region.


copy of a fast comment there:


Environmental thinking is in perfect harmony with media hype.

Chernobyl wasn't the first case. Hiroshima was. A Manhattan District PhD physicist (Dr Jacobson, from memory?), who didn't actually work at Los Alamos and, because of the compartmentalization of secrets, didn't know anything about nuclear weapons effects, issued a press release about fallout the day after Hiroshima was on the front pages.

He wrote that the radioactivity would turn Hiroshima into a radioactive waste land for 75 years. Not 70 or 80 years, but 75 years, which is a bit weird bearing in mind the fact that radioactivity decays exponentially.

Actually there was no significant fallout or neutron induced activity beyond a few hours at Hiroshima due to the burst altitude. Even in a surface burst, the radioactivity drops to within the natural background at ground zero after a few years, and there are people living at Bikini Atoll today, where a 15 megaton surface burst was tested in 1954.

The effects of radiation are serious at high doses, but there is plenty of evidence that they are exaggerated for low doses of gamma and neutron radiation...

copy of another fast comment there:

The full report http://www.biomedcentral.com/1471-2458/7/49/ states: "The ICRP risk estimate assumes a dose and dose-rate effectiveness factor (DDREF) of 2.0 (reducing predicted risk by a factor of 2.0) for extrapolation of the data from the bomb survivors (who were exposed at extremely high dose rate) to lower dose and/or dose-rate exposures."

This is a vital issue, because cancer occurs when the damage to DNA occurs so quickly that protein P53 can't repair it as single strand breaks. As soon as you get double breaks of DNA, there is the risk of the resulting bits of loose DNA being "repaired" the wrong way around in the strand by protein P53, and this can cause radiation induced cancer.

So at low dose rates to weakly ionizing (low linear energy transfer, or low LET) radiation like gamma rays, radiation causes single breaks in DNA and protein P53 has time to repair them before further breaks occur.

At high dose rates, the breaks occur so quickly that the P53 repair mechanism is overloaded with work, and repairs go wrong because DNA gets fairly fragmented (not just two loose ends to be reattached, but many bits) and P53 then accidentally puts some of the ends "back" in the wrong places, causing the risk of cancer.

The factor of 2 risk increase for high dose rates as opposed to low dose rates is nonsense; it's a far bigger factor, as Dr Loutit explained in his ignored book "Irradiation of Mice and Men" in 1962. On page 61 he states:


"... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last compared with 2 per cent incidence in non-irradiated controls."

So, for a fixed dose of 1,000 R spread over a month (which is far less lethal in short term effects than that dose spread over a few seconds, as occurs with initial radiation in a nuclear explosion, or over a few days, when most of the fallout dose is delivered), the leukemia rate can vary from 5-40% as the dose rate varies from 1.3-81 r/hour.

The cancer rate doesn't just double at high dose rates. It increases by a factor of 8 (i.e., 5% to 40%) as the dose rate rises from 1.3 to 81 r/hour.

In fact, for comparing cancer risks at low level (near background) and Hiroshima, the dose rates cover a wider range than this experiment, so the correction factor for the effect of dose rate on risk will be bigger than 8.

Background radiation studies are based on average exposure rates of just 0.02 mr/hour, i.e., 0.00002 r/hour, while at Hiroshima and similar instrumented nuclear tests, the initial nuclear radiation lasted a total of 20 seconds, until it was terminated by the buoyant cloud rise effect.

Hence for a dose of 1 r spread over 20 seconds at Hiroshima, the dose rate at which it was received was 180 r/hour. (Although according to Glasstone and Dolan's nuclear test data, half of the initial radiation dose would generally be received in about half a second, so the effective dose rate would be even higher than 180 r/hour.)

Hence, the range of dose rates from background to Hiroshima is 0.00002 r/hour to 180 r/hour or more, a factor of 9,000,000 difference or more.

Since in the animal experiments the leukemia rate increased by a factor of 8 due to a 62 fold increase in dose rate (i.e., as the dose rate increased from 1.3 to 81 r/hour), cancer risk is approximately proportional to the square root of the dose rate, so a 9,000,000 fold increase in dose rate should increase the cancer risk by 3,000 times.

Hence, the cancer risks at Hiroshima and Nagasaki by this model exaggerate low level radiation effects by over 3,000 times, not merely by a factor of 2.

 
At 12:17 pm, Blogger nige said...

Update 28 April 2007: the last comment above contains an error and the exaggeration of radiation effects at low dose rates is even greater as a result.

The calculation should have subtracted the 2% leukemia incidence in the non-irradiated control group from both the 40% and 5% figures. Hence, the radiation induced leukemia for 1000 R received at rates of 1.3 to 81 r/hour ranged from 3% to 38%, not 5% to 40%. This means that a 62.3 fold increase in dose rate increased the leukemia rate due to the radiation by a factor of 38/3 = 12.7. Hence, the radiation-induced (not the total) leukemia incidence is proportional to (dose rate)^{0.615}, instead of (dose rate)^{1/2}.

Using this corrected result for a 9 million fold difference between the dose rates of background (low dose rate) and Hiroshima (high dose rate) radiation, the radiation induced leukemia incidence for a similar total radiation dose will increase by a factor of 18,900, not 3,000.

Hence, radiation-induced leukemia rates currently being extrapolated from Hiroshima and Nagasaki data down to low dose rate radiation will exaggerate by a factor of 18,900 or so, rather than the factor of 2 currently assumed by orthodoxy.
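The corrected dose-rate extrapolation can be reproduced numerically. A minimal sketch, using only the figures quoted above (Mole's mouse data as corrected for the 2% control incidence, and the background-to-Hiroshima dose-rate ratio):

```python
import math

# Mole's mouse data (via Loutit, 1962), with the 2% spontaneous leukemia
# incidence of the non-irradiated controls subtracted:
low_rate, high_rate = 1.3, 81.0             # dose rates, r/hour
low_incidence, high_incidence = 0.03, 0.38  # radiation-induced leukemia fractions

# Fit a power law: incidence ratio = (dose-rate ratio)^n
n = math.log(high_incidence / low_incidence) / math.log(high_rate / low_rate)
print(f"exponent n = {n:.3f}")              # ~0.615

# Extrapolate over the ~9,000,000-fold dose-rate gap between natural
# background (0.00002 r/hour) and Hiroshima initial radiation (~180 r/hour):
rate_ratio = 180 / 0.00002
print(f"exaggeration factor = {rate_ratio ** n:,.0f}")    # ~18,900

# The earlier, uncorrected square-root model (n = 0.5) gives ~3,000:
print(f"sqrt-model factor = {rate_ratio ** 0.5:,.0f}")
```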

 
At 4:19 pm, Blogger nige said...

15 May 2007 update: the Radiation Effects Research Foundation has deleted the pages linked to in this post, including http://www.rerf.or.jp/eigo/faqs/faqse.htm and http://www.rerf.jp/top/qae.htm

The new locations are http://www.rerf.or.jp/general/qa_e/qa2.html and http://www.rerf.or.jp/general/qa_e/qa7.html

http://www.rerf.or.jp/general/qa_e/qa2.html contains an interesting table which shows the probability that cancer is caused by radiation instead of natural causes (non-exposed control group data) at various distances from ground zero in Hiroshima and Nagasaki.

For the most highly irradiated 810 survivors within 1 km of ground zero, 22 died from leukemia between 1950-90, of which 100% are attributable to radiation exposure. In the same group of 810 persons, 128 died from other forms of cancer, but only 42% of these 128 deaths were attributable to radiation.

Hence, even the most highly irradiated survivors who did die from cancer (apart from leukemia) were more likely (58% chance) to have died from naturally contracted cancer than from radiation induced cancer (42% risk).

Only for leukemia was a cancer death more likely to be due to radiation than to natural cancer risks (indeed, in this group the attributable fraction was 100%). As explained in previous posts, this is due to the fact that leukemia is both rare and more strongly correlated with radiation exposure than other cancers are.
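To make the attributable-fraction arithmetic explicit, a minimal sketch using the RERF figures quoted above for the most heavily exposed group:

```python
# RERF data for the 810 survivors within 1 km of ground zero (deaths, 1950-90):
leukemia_deaths, leukemia_attributable = 22, 1.00  # 100% attributed to radiation
solid_deaths, solid_attributable = 128, 0.42       # 42% attributed to radiation

radiation_solid = solid_deaths * solid_attributable
natural_solid = solid_deaths - radiation_solid
print(f"solid-cancer deaths from radiation: ~{radiation_solid:.0f}")   # ~54
print(f"solid-cancer deaths from natural causes: ~{natural_solid:.0f}")  # ~74
# So even in this group, a non-leukemia cancer death was more likely
# natural (58%) than radiation-induced (42%).
```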

It is a pity that the Radiation Effects Research Foundation still has not added the mean shielded biologically equivalent doses (in centi-sieverts, which are identical to the old unit the rem) for each group of survivors listed using the DS02 dosimetry system.

http://www.rerf.or.jp/general/qa_e/qa7.html states:

"Question 7: What health effects have been seen among the children born to atomic-bomb survivors?

"Answer 7: This was one of the earliest concerns in the aftermath of the bombings. Efforts to detect genetic effects were begun in the late 1940s and continue to this day. Thus far, no evidence of genetic effects has been found. ..."

 
At 4:24 pm, Blogger nige said...

Comment about Pugwash and the anti-nuclear hysteria propaganda

I recently came across a free PDF book on the internet, authored by John Avery of the Danish Pugwash Group and Danish Peace Academy, called Space-Age Science and Stone-Age Politics.

It is a similar kind of book to those published widely in the 1980s, full of pseudophysics, like claims that nuclear weapons can somehow destroy life on earth, when a 200 teraton (equal to 200*10^12 tons, i.e., 200 million-million tons or 200 million megatons) explosion from the KT event 65 million years ago failed to kill off all life on earth!

(This figure comes from David W. Hughes, "The approximate ratios between the diameters of terrestrial impact craters and the causative incident asteroids", Monthly Notices of the Royal Astronomical Society, Vol. 338, Issue 4, pp. 999-1003, February 2003. The KT boundary impact energy was 200,000,000 megatons of TNT equivalent for the 200 km diameter of the Chicxulub crater at Yucatan, which marks the KT impact site. Hughes shows that the impact energy (in ergs) is E = (9.1*10^24)*(D^2.59), where D is the impact crater's diameter in km. To convert from ergs to teratons of TNT equivalent, divide the result by the conversion factor of 4.2*10^28 ergs/teraton.)
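A quick numerical check of the Hughes crater-scaling formula as applied here (a sketch; the 200 km diameter is the value assumed in the text):

```python
# Hughes (2003) crater scaling: E [ergs] = 9.1e24 * D^2.59, D in km.
D = 200.0                         # assumed Chicxulub crater diameter, km
E_ergs = 9.1e24 * D ** 2.59
E_teratons = E_ergs / 4.2e28      # 1 teraton TNT equivalent = 4.2e28 ergs
E_megatons = E_teratons * 1e6

# Reproduces the ~200 teratons (~200,000,000 Mt) quoted above:
print(f"KT impact energy = {E_teratons:,.0f} teratons = {E_megatons:,.0f} Mt")
```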

Compare the power of all the world's nuclear weapons to that comet, and you see it's a complete fantasy. A megaton low altitude burst nuclear weapon will cause rapidly decreasing casualty rates in houses and rapidly decreasing destruction as the distance increases from 2 to 3 miles. In U.K.-type brick houses (with standard 9 inch thick brick walls), the mortality risk from all blast effects (deaths are mainly due to debris impacts and the related collapse of the house) for people lying down on the floor falls from about 50% at 2 miles to only about 5% at 3 miles from a 1 Mt surface burst. (Depending on the weapon design and the shielding geometry of the house and neighbouring homes, particularly the locations of windows, initial radiation and/or thermal radiation may also cause some injury in some circumstances at these distances. If the location is downwind of the explosion, some quickly decaying fallout hazard may also exist in the event of true surface bursts on land (but not air bursts), depending on the windshear, fission yield fraction of the weapon, and whether the person stays indoors in a central area for the first few days or not.)

In addition, a 1 Mt low altitude bomb explosion would break virtually 100% of windows for a 10 mile radius (with a lower incidence of breakage extending much further, and isolated breaks occurring due to atmospheric "focussing" of the shock wave periodically in focus-point zones hundreds of miles downwind), and cause immense panic and massive numbers of casualties for any people outdoors, particularly any who have a clear line-of-sight to the fireball in the few seconds before the heat flash stops (the blast wave generally takes longer to arrive than the heat flash lasts, so even if a shield is destroyed by the blast, it protects against the heat flash; the shadow cast by a single leaf is enough to prevent serious thermal burns over the shadow area, as proved by photographs taken by the U.S. Army at Hiroshima and Nagasaki: see previous posts on this blog, for example). Skin burns from thermal flash and from beta rays due to fallout contamination are two of the biggest immediate concerns.

However, many nuclear weapons have yields lower than 1 Mt. Current (January 2007) mean yield of the 4,552 nuclear warheads and bombs in the deployed U.S. nuclear stockpile: 0.257 Mt. Total yield: 1,172 Mt.

This is completely trivial compared to the 200,000,000 Mt KT-impact explosion 65 million years ago which did not kill off life on Earth.

Let's now get back to John Avery's Space-Age Science and Stone-Age Politics, the Preface of which claims inaccurately:

"... science has given humans the power to obliterate their civilization with nuclear weapons..."

Avery gives no evidence for this claim; it is a typical example of groupthink propaganda.

The idea is as follows:

(1) claim that your pet hate, such as nuclear weapons or global warming or aliens, is a real threat to civilization,

(2) ignore all evidence to the contrary, and blame politicians for stone age ideas which are causing or risking a disaster.

In the 1930s, those who said Hitler was a threat and should be deterred by air power were dismissed by idealists and Nazi fellow-travellers as "war-mongers".

They either didn't understand, or pretended that they didn't understand, that they were the war-mongers.

If you want peace under freedom, human nature being as it is, you need to be strong enough to protect yourself.

Weaken yourself by disarmament and you make yourself attractive to thugs by standing out as a good potential mugging victim.

Somehow all those pacifists escaped the real world of childhood, where you find out that defenselessness makes you victim to all passing thugs who want someone to pick on, mug, rob, and fight.

If you carry a big stick and you want peace and freedom, you will live a happier life than if you don't carry one.

Stone age politics is sensible, because the problems of life remain the same now as then: politics is not leading human nature into war. Human nature leads politicians into war.

Contrary to Avery's mindset, you can't change the world by imposing a new political idealism on man.

That's called dictatorship, and all versions so far have been evil failures: fascist Nazism as well as Communist dictatorship.

It makes a powerful elite few into dictators who become corrupt, people have to be coerced or bribed into maintaining the status quo under the dictatorship, the whole thing is unstable and based on hatreds and violence and terror.

The error is trying to impose a political "solution" on human nature. Human nature is determined by genes, not by ideals written in books by philosophers.

We see this in the fact that even when each individual in identical twins is brought up under different conditions in different places, they remain extremely similar in their interests and outlooks and intelligence. Genetics, sadly perhaps, does exert a massive influence on life.

The idea that problems like war - which are caused by deep-rooted instincts, hatreds, feelings of injustice, tribal rivalries, and prejudices - can be wiped out by philosophical belief systems like pacifist ideology, is utopian, not realistic.

Avery does reflect some of these issues on page 8 of his Preface, where he quotes Arthur Koestler's remark:

"We can control the movements of a space-craft orbiting about a distant planet, but we cannot control the situation in Northern Ireland."

This is very deep. The pacifists keep kidding themselves that if only they could explain to terrorists or dictators how wrong they are, the world would be put right. Wrong. The terrorists and dictators are paranoid, deluded bigots who simply don't want to hear the facts. The only thing they will listen to or respect is force, not talk. However, sometimes - as in the case in Northern Ireland - after several generations a compromise and settlement can be reached to halt the ongoing violence, at least for a while. Unless the core problems are resolved, however, violence might flare up again when tensions and pressures are increased at some time in the future.

Don't rely on politicians and talk. Those who cherish peace agreements should remember that unless there is really genuine goodwill behind each signatory, the agreement isn't worth the paper it is written on. It's actually negative equity, if it brings one side a false sense of security, as was the case with Neville Chamberlain when he got Adolf Hitler's autograph on a peace pact at Munich in September 1938.

In other words, making peace with a homicidal maniac and then waving the peace agreement around and saying "Peace in our time" is very dangerous. If Chamberlain had responded better (more aggressively) to Hitler, millions of lives might have been saved. Unfortunately, he and his predecessors had a fixation with an inaccurate interpretation of World War I. They thought that "peace at any cost" was worthwhile. They were wrong in forgetting the advice of the Roman, Publius Flavius Vegetius Renatus, writing in his book Epitoma rei militaris (c. 390 AD): "Si vis pacem, para bellum." (If you wish for peace, prepare for war.)

Moving on to Chapter 1 of Avery's book, The world as it is, and the world as it could be, things get more interesting. The chapter begins with a list of interesting facts. Some are a bit out of date (they refer to 1997): the annual U.S. military budget is not nearly a thousand billion dollars, but merely $439 billion (only $23 billion is spent on nuclear warheads).

Avery claims in his list on page 12 that:

"In the world as it is, the nuclear weapons now stockpiled are sufficient to kill everyone on earth several times over."

This is total nonsense, as already explained and shown by the previous posts on this blog: nuclear weapons are not that powerful, in part because the blast and other effects distances don't scale up in proportion to the amount of energy. Hence, increase the energy release by a million times, and the blast pressure damage distances are increased by only the cube-root of a million, i.e., just 100 times.

Nuclear weapons actually "work" more by knocking over charcoal cooking braziers at breakfast time (as in the case of the firestorm at Hiroshima) and by the combination of nuclear radiation with thermal flash burns (the nuclear radiation lowers the white blood cell count for a few months after exposure, preventing infections of the burned skin from being healed naturally, so the person dies). Nuclear weapons are very dangerous and powerful, but they are certainly not the superbombs painted by media hysteria and political propaganda in the Cold War. That propaganda went unchallenged by most scientists largely because it helped stabilize the situation and deter war.

However, it is now dangerous to over-hype the effects of nuclear weapons in this manner, because it detracts from the effectiveness of simple civil defence "duck and cover" and evacuation countermeasures. If you do get a terrorist nuclear attack, the casualty toll is basically dependent on whether people watch the flash and fireball through windows (getting burned by thermal radiation in the process, and then get killed or seriously injured when the blast wave arrives some seconds later, fragmenting the windows), or whether they "duck and cover".

Avery then goes on to Africa and its problems: the need for safe and adequate drinking water supplies and medical help.

Avery suggests funding these justifiable schemes by taxing international currency transactions. Sounds good if it works, i.e., if it doesn't put international trade out of business. If you tax too much, this will happen, because imports and exports will become even more highly taxed than they are now.

Page 16 contains the stupid claim:

"In the world as it could be, a universal convention against terrorism and hijacking would give terrorists no place to hide."

So a piece of paper will prevent terrorists hiding in mountains, in vast sparsely populated areas? This is an example of dangerous nonsense. It's dangerous because it creates a false sense of security, resting on a piece of paper.

In the real world, there are lots of bits of paper in each country with laws written on them. That doesn't abolish crime. Reason? The laws are just scribbles on pieces of paper. Human nature is such that criminals don't pay attention to laws, and even with the deterrents of fines and prison sentences, nobody has ever invented a way to make all people conform to the law. When you apply this fact to terrorists, the problem is immense because the punishments available cannot be scaled up in proportion to the potential crimes and acts of terrorism. The relative risks for terrorists don't increase in proportion to the threat that the terrorists present to civilization.

Avery also goes on to claim (on page 17) that war could be abolished just as slavery was abolished.

There are various deep-rooted problems with this claim: the American Civil War was essentially a war against slavery.

War and slavery have very little to do with one another. Often, people go to war to preserve their independence, i.e., to fight for freedom, or to fight against the prospect of becoming a slave in some political ideology (fascism, communism, etc.).

It follows that if you want to ensure continued freedom from slavery, you need the ability to fight against those who would make a slave of you. Therefore, you need to be able to go to war to prevent slavery.

If you abolish the possibility of fighting against thugs, you will risk becoming a slave. So Avery's idea that the abolition of slavery in the American Civil War and other fights should now be followed by an abolition of the possibility of war, i.e., fighting against the prospect of being enslaved by thugs, is a contradiction. It is manifestly gullible and dangerous.

Moving on to Chapter 2 of Avery's book, Tribalism, we find a discussion of how bees returning to their hives can communicate to other bees some information about how far away, and in what direction, any good pollen sources can be found. This was discovered in 1945 by Karl von Frisch. The returning bee flies around in a kind of circle with occasional short cuts across the diameter of the circle. The direction of the short cut across the diameter marks the direction to the pollen source, and the number of waggles the bee's abdomen makes as it crosses the diameter indicates the distance to the pollen source:

"Studies of the accuracy with which her hive-mates follow these instructions show that the waggle dance is able to convey approximately 7 bits of information - 3 bits concerning distance and 4 bits concerning direction."

Interesting trivia.

Moving on to Avery's Chapter 3, Nationalism, a false religion, things get back on topic again. Avery argues that the nation state is a kind of tribe, and wars between nation states are basically tribal wars: "... a totally sovereign nation-state has become a dangerous anachronism."

Unless you precisely define a "totally sovereign nation-state", this is illucid. Most nation-states have some interdependence on others, such as being part of a union (e.g., the European Union or the United States of America) or federation (e.g., the Russian Federation).

This doesn't prevent them being involved in wars, or starting wars to protect themselves if their vital interests are threatened.

There is actually a danger in federation and union which I must explain:

A few hundred years ago, there was an era called the "Age of Discovery" in which large unions, nations and empires, sent out armies to seize the assets of small, happy, free and independent tribes. This theft was "justified" by denying the victims any right of free expression, and indeed herding them up and selling them as slaves.

The Spanish actually destroyed an ancient South American civilization in this way, while other Europeans colonised massive areas of Africa and Asia, forcing the people into slavery or brainwashing them with various religions.

Therefore, even successful unions, federations and other groups may pose a threat to civilizations that differ from themselves. There is an enormous amount of sheer arrogance in the replies people give to this. They claim that errors that occurred before cannot happen again because people learn from their mistakes. Wrong. Errors that have occurred in the past actually keep on occurring:

Even within any union or federation, you will find groups of people being exploited by the union or federation as a whole. They will be forced to pay taxes for services they do not use, and so on. They do not have any say because they are a minority and the particular form of so-called "democracy" (which is a sheer travesty of the term "democracy" as used in Ancient Greece, where every citizen had a daily vote on the day's policies) in use is basically a dictatorship with a choice between two rich old men, once every five years.

If we go back to the age before really effective military forces existed in England, we come to a time when England was open to all who came with enough swords. This is precisely why first the Romans and later the Normans invaded England successfully.

If we give up our armaments, we will be in a similar position to that we were in when we were conquered by the Romans and the Normans.

Alliances are fickle. In World War I (1914-18), the Allies, including Britain, France, Russia, Serbia, Belgium and (from August 1914) Japan, fought the Central Powers: Germany, Austria-Hungary, and Turkey. In 1915 Italy joined the Allies and Bulgaria joined the Central Powers; Romania followed in 1916 and America in 1917.

In World War II (1939-45), Italy and Japan, which had both been on the Allied side in World War I, switched sides and became enemies. Allegiances can shift, treaties can be broken. In World War II, America was surprised by the sneak Japanese attack on Pearl Harbor, while Russia was surprised by Hitler's treachery. Russia had a pact with Hitler which guaranteed peace. It wasn't worth a cent.

Avery correctly pins a share of the blame for World War II on the French, on page 62:

"In 1921, the Reparations Commission fixed the amount that Germany would have to pay [mainly to France, in compensation for the costs of World War I] at 135,000,000,000 gold marks. Various western economists realized that this amount was far more than Germany would be able to pay; and in fact, French efforts to collect it proved futile. Therefore France sent army units to occupy industrial areas of the Ruhr in order to extract payment in kind. The German workers responded by sitting down at their jobs. Their salaries were paid by the Weimar government, which printed more and more paper money. The printing presses ran day and night, flooding Germany with worthless currency. By 1923, inflation had reached such ruinous proportions that baskets full of money were required to buy a loaf of bread. At one point, four trillion paper marks were equal to one dollar. This catastrophic inflation reduced the German middle class to poverty and destroyed its faith in the orderly working of society.

"The Nazi Party had only seven members when Adolf Hitler joined it in 1919. By 1923, because of the desperation caused by economic chaos, it had grown to 70,000 members."

Avery's Chapter 4 is called Religion: Part of the problem? - or the answer?. This is particularly interesting (page 67):

"Early religions tended to be centred on particular tribes, and the ethics associated with them were usually tribal in nature. ... In the 6th century B.C., Prince Gautama Buddha founded a new religion in India, with a universal (non-tribal) code of ethics. Among the sayings of the Buddha are the following: Hatred does not cease by hatred at any time; hatred ceases by love. Let a man overcome anger by love; let him overcome evil by good. All [weak] men tremble at punishment. All [over-indulged] men love life. Remember that you are like them, and do not cause slaughter.

"One of the early converts to Buddhism was the emperor Ashoka Maurya, who reigned in India between 273 B.C. and 232 B.C. After his conversion, he resolved never again to use war as an instrument of policy. He became one of the most humane rulers in history, and he also did much to promote the spread of Buddhism throughout Asia.

"In Christianity, which is built on the foundations of Judaism, the concept of universal human brotherhood replaces narrow loyalty to the tribe. [This simplification of Avery's won't go down well with followers of Judaism, and ignores the crimes, from the Inquisition to Nazi Christianity, done in the name of Christianity over the centuries.] The universality of Christian ethical principles, which we see especially in the Parable of the Good Samaritan, make them especially relevant to our own times. Today, in a world of thermonuclear weapons, the continued existence of civilization depends on whether or not we are able to look on all of humanity as a single family."

This again is wrong: thermonuclear weapons don't threaten our existence. They are there because people threaten our freedom.

If they do get used in war again, that would be terrible, but how terrible it is depends on what people can do to protect themselves. It's a quantitative thing, not a qualitative thing.

All disasters are terrible. They are more terrible if you give up in advance and have no civil defence advice in place, backed by evidence that persuades the general public that "duck and cover", decontamination and other countermeasures actually work, are feasible, and have been well tested against a range of different types of nuclear explosion in carefully instrumented, scientific trials.

On page 68, Avery states:

"In the Christian Gospel According to Matthew, the following passage occurs: You have heard it said: Thou shalt love thy neighbor and hate thy enemy. But I say unto you: Love your enemies, bless them that curse you, do good to them that hate you, and pray for them that spitefully use you and persecute you. ...

"The seemingly impractical advice given to us by both Jesus and Buddha - that we should love our enemies and return good for evil - is in fact of the greatest practicality, since acts of unilateral kindness and generosity can stop escalatory cycles of revenge and counter-revenge such as those which characterize the present conflict in the Middle East and the recent troubles in Northern Ireland. Amazingly, Christian nations, while claiming to adhere to the ethic of love and forgiveness, have adopted a policy of 'massive retaliation', involving systems of thermonuclear missiles whose purpose is to destroy as much as possible of the country at which retaliation is aimed. It is planned that entire populations shall be killed in a 'massive retaliation', innocent children along with guilty politicians."

Avery here neglects a very important question:

"Would you love Adolf Hitler as your neighbour and forgive him while he is in the middle of exterminating millions in gas chambers, and allow him to continue?"

Jesus's advice to love thy neighbour doesn't, or shouldn't, apply to Mr Hitler. So we immediately find a massive hole in Christian ethics and philosophy. Avery simply ignores the existence of this hole, which in the real world (if he were a politician) would mean he would be liable to fall straight down it, dragging all those who followed him down there too.

No. You shouldn't love thy neighbour if that neighbour is potentially a mass murderer who will use your good will to help accomplish evil goals. That's a massive problem that totally destroys the whole thesis of Avery.

Avery goes on, still on page 68:

"The startling contradiction between what Christian nations profess and what they do was obvious even before the advent of nuclear weapons..."

Hold hard. Nuclear weapons ended World War II, because they forced Russia to declare war on Japan in order to get some of the advantages of being a victor. This made Japan's leaders realise that Japan had lost the war. The numbers of people killed in Hiroshima and Nagasaki were trivial compared to the numbers killed by regular incendiary air raids on Japan, which included the firestorm raid on Tokyo in March 1945 that was far more destructive than a nuclear bomb.

The purpose of our having nuclear weapons is to deter war and prevent a war. 'Massive retaliation' is an old and largely outdated deterrent concept, and there are more modern strategies such as counterforce (hitting military targets with weapons of yields and burst types such as to minimise any possible collateral damage to civilian homes).

However, the point is that World Wars are less likely when the potential losses to all parties are massive. This is the main reason why nuclear weapons have prevented regional Cold War conflicts from escalating to all-out World War.

On page 160, Avery produces a graph (Figure 8.1) which shows the increase in infant deaths due to the sanctions imposed on Iraq in 1990 under U.N. Security Council Resolution 661, followed by Resolution 678, which authorized the use of 'all necessary means' to force Iraq to withdraw from Kuwait. The mortality rate of children under five years of age in Iraq doubled within a year.

This highlights the perils of economic sanctions: they don't hurt the dictators, they kill innocent kids. They may sound "peaceful" but they still kill. In the same way, according to some muddled pacifist sentiments, only war using particular kinds of "violent weapons" is a bad thing. According to that bad philosophy, the use of gas chambers to massacre people is "peaceful" because there are no "horrible bombs or bullets" involved. Actually, it is just as bad to kill people regardless of the method used. Cold-blooded slaughter with gas, refusal to allow medical treatment, or starvation is - if anything - even more sinister than the use of violence in anger.

Avery's Chapter 10 is World government. The flaw with this idea is simple to see: laws get broken. The idea that a world government based on laws will be a success is refuted amply by a look at what happens in any country when laws are made: criminals break them regardless of law enforcers. At present, the stability of the world is ensured by military deterrence. Remove that mechanism, and you are playing with fire. Every time people have tried to impose a philosophical solution like Marxism or Fascism, it has failed. Power corrupts, absolute power corrupts absolutely. The idea of a world government is that of absolute power, and absolute corruption.

The Roman Empire was the world government of its time. It was maintained by ruthless suppression of dissent, and it was continually at war, often civil war.

A world government would not abolish war, it would relabel all future wars as "civil wars". Merely adding the word "civil" to war is the kind of worse-than-useless political solution to a problem you can expect from moronic zombies.

On page 214, Avery quotes a 1954 suggestion by Edith Wynner for world government (which sounds as if it is a line borrowed from the 1951 film The Day the Earth Stood Still):

"A policeman seeing a fight between two men, does not attempt to determine which of them is in the right and then help him beat up the one he considers wrong. His function is to restrain violence by both, to bring them before a judge who has authority to determine the rights of the dispute, and to see that the court's decision is carried out."

This is all false. First, the person who is being wrongfully attacked wants the attack to stop, not to beat up the other person.

The suggestion in the quotation that people always want revenge is prejudiced and wrong.

Second, the policeman does have a duty to collect relevant evidence and to do that efficiently he or she needs to take statements from any witnesses, and ascertain that any evidence (weapons with fingerprints, etc.) will be available for use in a prosecution. The policeman decides on the basis of this preliminary investigation who he or she should arrest.

If the policeman arrests an innocent person to bring them before a judge, that is wrongful arrest. Arrests must be based on evidence or at least strong suspicion with some reasoning behind it.

The whole idea that in any war both sides are equally at fault is nonsense: and an insult to those murdered by Hitler's thugs.

In particular, the idea of an international police force to catch and punish criminals fighting terrorist wars is just nonsense because it can't deal with suicide bombers. You can jump up and down on the grave of the suicide bomber, but that will not deter other suicide bombers.

The pacifist case for world government is just shallow and insulting. It is likely to cause more violent wars (which will be called, ironically, "civil wars") than before, simply because vast numbers of people will probably resent the system. It will permit corruption and "might is right" majority rule and barbarity on a scale not seen since the Roman Empire. It will not be capable of stopping 9/11 type suicide bombers.

World government would reduce individualism by removing part of each person's sense of personal identity to a group, and will thereby increase the risk of subversive warfare and insurrection against the massive nanny-state quango of dictatorial majority-controlled officialdom that constitutes the travesty of democracy masquerading as a "world government".

 
At 11:14 am, Blogger nige said...

copy of a comment made on John Horgan's blog:

http://www.stevens.edu/csw/cgi-bin/blogs/csw/?p=50

"... the fact is that, as near as we can tell from the fossil record, humans have not killed other humans as a matter of course for the greatest part of Homo Sapiens’ time on earth. The beginnings of our species are figured to be about 195,000 years ago, the date of the earliest anatomically modern skeletons, but there is no indication of anything like murder until about 20,000 years ago. Doesn’t seem to be in our blood, but in our circumstances. ...

"But it is important to see that this kind of interpersonal violence and murder comes rather late in Sapiens development. In fact, for 90 per cent of our time on earth there is nothing to indicate that humans ever reached the extreme state of knocking each other off. It was a reaction to an extreme crisis, it got to be a familiar response to the increased tensions in a time of scarcity and competition, and once established it seems to have continued on.

"But not because it was in our human nature. Rather it was in the conditions of our life. And therefore the obvious lesson is that we can’t just shrug and say some people are just “born killers,” or “it’s in the blood.” It’s not, and was not for 175,000 years." - Kirkpatrick Sale

The killing started with people hunting animals for food. People were primarily tribal hunters for the 175,000 years before they became farmers, around the end of the last ice age some 20,000 years ago. Hunting is a violent activity, so when hunting declined, people used to regular bouts of violence would be more likely to fight among themselves instead. You see this in primitive tribes even today: they have two important activities, both full of ceremony and skill, hunting and warfare. The hunting provides food. The warfare maintains order between rival tribes, driving away the hunting competition and the danger of invasion of their villages and theft of their wives by other tribes.

According to Wikipedia: "Neanderthals became extinct in Europe approximately 24,000 years ago". Maybe they were driven away or killed off in warfare? Even if this was the case, it's not automatically the survivors who are to blame.

The pacifist approach begins with the false assumption that fighting and violence are totally immoral and inexcusable under all circumstances. Yet the cold-blooded massacres of history (which pacifists don't seem to worry about), such as concentration camps with their malnutrition, disease, slavery, neglect and gassing, are the really big problems. Anne Frank died from typhus, and millions died from murder or deliberate neglect in Axis civilian concentration camps.

Saddam used nerve gas to murder thousands of Kurds in 1988, and he ordered the torture and murder of thousands of others. It doesn't make that much difference whether he used a bullet to "violently" murder someone, or simply let them die from thirst more "peacefully" in a cell. It's still murder.

I think that this problem is deliberately neglected by pacifism, and it's the fatal flaw in pacifism. There was an infamous Oxford Union debate in 1933 on the motion that the House would in no circumstances "fight for King and Country". A pacifist philosopher, the immoral Professor C.E.M. Joad (later convicted of travelling without a valid railway ticket), was asked what he would do "if his wife was being raped by enemy soldiers". He dismissed the question with a comic reply that he would simply join in and have an orgy, which made most people there laugh, and he won the vote.

The public viewed the plight of people in concentration camps as a joke at that time, circa 1933, in comparison to the violent horrors of having a major war.

But the correct question to pin on the pacifist is what you do if the enemy is torturing people held without charge in concentration camps, as Hitler and Saddam did. Economic sanctions are worse than useless: the death rate of children under 5 years of age doubled within a year of the sanctions imposed on Iraq in 1990 under U.N. Security Council Resolution 661, followed by Resolution 678, which authorized the use of 'all necessary means' to force Iraq to withdraw from Kuwait. You can't hurt the dictator by applying economic sanctions: innocent people suffer. The only real option is to go to war. It's simply not the case that "two wrongs don't make a right". You have to estimate how many more people you can save by going to war than will be killed if you don't have the war. Whether it is right or not depends on whether there is a net saving of life, i.e. whether the number of lives saved exceeds the number killed in the conflict.

You can only call a war illegitimate or "murder" if the amount of anticipated suffering as a result of the conflict exceeds the amount of suffering which is likely if the war doesn't occur.

 
At 11:46 pm, Anonymous Anonymous said...

Hi Nige

Do you have any comments about the Japanese earthquake and the nuclear reactor yet?

SM

 
At 12:45 pm, Blogger nige said...

Hi Susan,

There is not much to say about the incident, really, which was hardly Chernobyl.

It's interesting however that Japan has managed to embrace nuclear reactor technology despite the anti-nuclear sentiment in the aftermath of Hiroshima, Nagasaki, and the 1954 contamination of 23 Japanese fishermen on the "Lucky Dragon".

What is interesting about the media is not the science (the newspaper editor doesn't know the difference between a pBq and a GBq, it's all the same), but the politics.

Here in the UK, there is no antinuclear concern about the risks of having 9.9 kBq (about 0.27 microcurie) of Am-241 (radiologically very similar to plutonium) in smoke detectors in every house to save lives in fires.

Antinuclear people don't put on a massive front-page propaganda attack saying that 9.9 kBq of Am-241 emits 9900 alpha particles per second, and since a single alpha particle can in principle set off a lung cancer, it follows that over a two week period a smoke detector emits enough alpha particles to totally wipe out humanity, at least in principle.

It's fairly obvious that this scare-mongering won't get into the newspapers, although on a scientific footing it is quantitatively similar to much of the anti-nuclear protestors' propaganda.

Nobody will listen to propaganda unless it reinforces their prejudices.

If you point out that a single smoke detector, if incinerated in a fire, could - according to the exceptionally fiddled antinuclear lobby calculations - exterminate humanity, nobody listens.

If the media publish the same fiddled calculation about a leak of radioactivity from a nuclear reactor, it gets a very different treatment from those reading it, starting off a panic wave.

If you take a rock, in principle (according to misleading calculations) that could be used by a terrorist to kill everyone, simply by hitting people over the head. In practice, of course that is not going to happen. Similarly, the Am-241 contaminated smoke from a single burned smoke detector isn't going to end up in people's lungs, with one alpha particle setting off a cancer in each person in the world.

If you want to play the numbers game, you can point out that Am-241 has a half-life of 432 years, so its effective life is statistically 432/(ln 2) = 623 years. (By the time it has completely decayed, Am-241 will have emitted the same total number of alpha particles as it would if the present emission rate were sustained for the statistically effective lifespan of 623 years.)

So 9.9 kBq of Am-241 in a smoke detector emits a total of about 2x10^14 alpha particles over its lifespan.

Since the world's population is 6,700,000,000 = 6.7x10^9, it is clear that if only 1 in 30,000 of the alpha particles emitted by a single smoke detector starts a lung cancer, the number of people killed will be equal to the number of people on the planet!

So it's very easy to come up with scare-mongering statements about radiation, simply because the numbers are so big.
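As a sanity check on the smoke-detector arithmetic above, here is a minimal sketch in Python; the 9.9 kBq activity, 432 year half-life, two-week scare figure and 6.7 billion population are the numbers used in the text:

```python
import math

# Figures from the text (assumptions of the argument, not measurements here)
half_life_years = 432.0                    # Am-241 half-life
activity_bq = 9.9e3                        # 9.9 kBq = 9900 alpha particles/second
world_population = 6.7e9

# "Statistically effective" life: half-life divided by ln 2, approx. 623 years
eff_life_years = half_life_years / math.log(2)

seconds_per_year = 365.25 * 24 * 3600
total_alphas = activity_bq * eff_life_years * seconds_per_year  # ~2e14

# The "two week" scare figure: alphas emitted in 14 days vs world population
two_week_alphas = activity_bq * 14 * 24 * 3600  # ~1.2e10, more than 6.7e9

print(f"Effective life: {eff_life_years:.0f} years")
print(f"Lifetime alpha emissions: {total_alphas:.1e}")
print(f"Alphas per person on Earth: {total_alphas / world_population:.0f}")
print(f"Two-week emissions exceed population: {two_week_alphas > world_population}")
```

The last figure, roughly 29,000 alphas per person, is where the "1 in 30,000" ratio in the text comes from.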

The "problem" for scare mongerers is that the actual risks are diluted by immense factors. Not only will it be extremely unlikely that smoke containing alpha emitters will get into many people's lungs, but even when it gets there, it is usually removed quickly like ordinary dust, and in any case the probability of a single alpha particle causing a lung cancer is extremely low.

So because the quantitative errors involved in naive scare-mongering antinuclear propaganda are so extreme, the qualitative nature of the risk changes totally:

it's a trivial risk compared to the hazards of inhaling natural radioactive radon gas that comes from the soil and seeps into houses.

Traditionally, the pro-nuclear lobby has made an awful mess of this, and has never properly made people clearly aware of the nature and intensity of natural background radiation from space and present in soil, water, food, and the air.

If they did calculate and measure background radiation exposures carefully, they could express radiation levels in terms of the natural average sea level exposure, so people would be aware of radiation in a more useful, more quantitative sort of way.

However, they don't do this. Edward Teller made a complete mess of it in the 1950s by comparing radiation from the nuclear industry to cigarette smoking, instead of comparing it to natural background radiation in a quantitative way.

In addition, it is vital to present the facts of how background radiation levels vary in different locations.

The best thing the nuclear industry could do is to publish a global map (like a layered Google map) on the internet with reliable data on radiation levels around the world, showing how cosmic radiation varies as a function of terrain altitude and proximity to the poles (where the earth's magnetic field lines are nearly vertical and so can't shield charged cosmic rays), as well as the effects of different types of soils which contain different amounts of radioactive minerals.

They should also indicate the natural alpha, beta and gamma radiation in food, water, air and soil in different places.

It would be useful knowledge that anyone could grasp if colour coding was used.

 
At 1:46 pm, Blogger nige said...

SM has kindly emailed me the following extract about firestorm exaggerations by Dr W. E. Strope who worked at the U.S. Naval Radiological Defense Laboratory at nuclear tests from Crossroads (1946) at Bikini Atoll, onward.

Link: http://www.strategicdefense.org/Commentary/Worldonfire.htm



Whole World on Fire—And All Wet

Walmer (Jerry) Strope

I have just finished reading a strange book, Whole World on Fire, by Lynn Eden, published by Cornell University Press a month ago. Ms. Eden is an historian at Stanford University. Her thesis is that Air Force targeteers perversely continued to use blast damage as the basis for targeting even though fire damage "would extend two to five times farther than blast damage" because of institutional biases stemming from the emphasis on precision bombing in World War II. That is, "organizations draw on past practices and ideas even as they innovate to solve new problems."

To Ms. Eden, the question of prioritizing nuclear weapon effects is just a convenient example of this institutional characteristic. She does not purport herself to be an expert on the physics of mass fires. This helps explain part of the strangeness I find in the book; namely, Why now? After all, the Cold War is over and targeteers are not fine-tuning the SIOP. It seems she has spent 15 years reviewing the literature on nuclear fires, interviewing the knowledgeable people and writing the book. It just happened to come out now.

In Chapter 1, Ms. Eden introduces her readers to the problem by postulating the detonation of a 300-kt bomb 1,500 feet above the Pentagon. It is here that I encounter more of the strangeness. It seems that Ms. Eden is under some pressure to convince her readers that the Air Force had plainly ignored the obvious. Therefore, she tends to present the most extreme positions on mass fire issues, as well as some of the "tricks of the trade." One trick: close in, we are told "the fireball would melt asphalt in the streets." But when the description gets to the Capitol building some three miles away, there is no comparable sentence. The previous image is permitted to carry over.

Next, we are told, "Even though the Capitol is well constructed to resist fire, and stands in an open space at a distance from other buildings, light from the fireball shining through the building’s windows would ignite papers, curtains, light fabrics, and some furniture coverings. Large sections of the building would probably suffer heavy fire damage. The House and Senate office buildings would suffer even greater damage. The interiors of these buildings would probably burn."

Hold on! Wait a minute! The Capitol building is completely protected by sprinklers. So are the House and Senate office buildings, the Library of Congress, the Supreme Court building, and the massive buildings lining the Mall and in the Federal Triangle. These buildings may become sopping wet but they probably will not burn. The monuments also will not burn.

Why don’t mass fire calculators take sprinkler systems, venetian and vertical blinds, and other fire protection measures into account? Is the situation in the Nation’s Capital unusual? Not anymore. For decades, the lowly fire protection engineer and his employer, the fire insurance industry, have been gnawing away at the fire problem. According to the National Fire Protection Association, between 1977 and 2002 the annual number of building fires in the United States declined by 50%, from 3.2 million a year to 1.6 million a year. Fires in hotels and motels, which killed over 100 people a year as recently as the late 1960s, have become so rare that the U.S. Fire Administration no longer keeps statistics on them. If it were not for a sizable increase in wildfire damage—resulting from timber management practices—the statistics would look even better.

Ultimately, Ms. Eden concludes, "Within tens of minutes, the entire area, approximately 40 to 65 square miles—everything within 3.5 or 4.6 miles of the Pentagon—would be engulfed in a mass fire. The fire would extinguish all life and destroy almost everything else." To reach this horrific prediction, Ms. Eden has to ignore more than the prevalence of sprinkler systems. Among these other issues are the hole in the doughnut problem and the survivability problem.

I was introduced to the hole in the doughnut issue in 1963 when I first visited UK civil defense in the Home Office, Horseferry House, London. I discovered that the people I was talking to had planned the incendiary attacks during World War II. Their effectiveness depended on how many explosive bombs they included in an attack. If they included too many, the buildings were knocked down and didn’t burn well. In fact, the target just smoldered. If they included too few, the incendiaries often just burned out on undamaged roofs. Finally, in the Hamburg attack, they got it right, just opening up the buildings so they burned rapidly. The Hamburg mass fire was called a "fire storm." These people were adamantly unanimous that a nuclear weapon could never cause a firestorm. The severe-damage region around the explosion would just smolder, producing a "ring fire," called a doughnut by our fire research people. That’s apparently what happened at Hiroshima.

Mass fire models that ignore such views produce fierce fires that would seem to destroy everything. But lots of people survived in the fire areas at Hamburg and Hiroshima. The late Dr. Carl F. Miller (after whom the California chapter of ASDA is named) did the definitive analysis of the records of the Hamburg Fire Department. About 20 percent of the people in the fire area were in underground bunkers. Eighty percent were in shelters in building basements. Survival in bunkers was 100%; in basements, it was 80%.

Despite her exaggeration of mass fire effects, I don’t think Ms. Eden’s book would convince the joint strategic targeteers to change their ways. I have concluded that the blast footprint and the fire footprint will be roughly congruent. Thus, I refer to them simply as the "direct effects area" (See my Nuclear Emergency Operations 101.)

Lynn Eden’s book is a strange book—and a little bit dumb (her term.) I wouldn’t recommend you buy it. But if you are part of the old civil defense research group, you should find the pages on that work interesting. If you just want to learn something about mass fires, try to find a copy of FEMA H-21 of August 1990, the Nuclear Attack Environment Handbook. It won’t lead you astray.


I exchanged emails on the subject of blast wave energy attenuation in causing damage, a couple of years ago, with Dr Harold Brode, the RAND Corporation expert on the effects of nuclear weapons. I had read Dr William Penney's evidence published in 1970 about the blast in Hiroshima and Nagasaki which he had personally surveyed as soon as the war ended in 1945. The blast, Penney's studies showed, rapidly lost energy (and pressure) due to the work done in causing damage. According to the laws of physics, once damage is done like this, energy is irreversibly lost. The American book by Glasstone ignores this effect entirely, although it does cite Penney's paper.

Harold Brode suggested that when the blast knocks down a house, the energy used to do that is not entirely lost, because you get accelerated fragments of brick, glass and wood moving outward in the radial direction. However, these move far more slowly than the shock front and soon lag behind it, fall to the ground and decelerate by tumbling. The distances debris moved when houses were knocked down in nuclear tests in 1953 and 1955 were carefully measured and filmed; they are not far, and most of the debris remains close (within a matter of metres) to the house. So there is a problem here. For relatively small weapons, the blast pressure drops so rapidly with distance in the range of serious damage that the energy loss effect is not too severe (although it was apparent in Penney's measurements of the deflection of steel bars and the crushing of petrol cans at Hiroshima and Nagasaki). But for big weapons it interferes seriously with the blast scaling laws, and the result is that blast damage distances increase far more slowly than the official predictions, especially at low pressures.
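For reference, the "official predictions" here rest on cube-root yield scaling, the standard rule in compilations such as Glasstone's. A minimal sketch of that rule; the 0.5 km reference distance is a made-up illustrative value, not a figure from Penney's data:

```python
# Standard cube-root yield scaling, as used in official blast-effect
# predictions: the distance at which a given peak overpressure occurs
# scales as the cube root of the weapon yield.
# NOTE: the 0.5 km reference distance below is purely illustrative.

def scaled_radius_km(reference_km: float, yield_kt: float) -> float:
    """Scale a damage radius from a 1 kt reference explosion."""
    return reference_km * yield_kt ** (1.0 / 3.0)

# A radius of 0.5 km at 1 kt scales to about 5 km at 1 Mt (1000 kt)
# under this rule. Penney's damage-energy-loss measurements imply the
# true large-yield radii fall short of this, especially at low pressures.
print(round(scaled_radius_km(0.5, 1000.0), 3))
```

The point of the passage is precisely that this idealised scaling ignores the energy irreversibly absorbed in doing damage, so it overstates damage distances for large yields.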

This is relevant to massive controversies over thermal radiation effects like skin burns and fires. The majority of fires in Japanese residential areas were caused by the blast wave overturning cooking braziers in homes full of inflammable paper screens, bamboo furniture, etc.; the charcoal braziers were in use at the time of each nuclear attack (breakfast time in Hiroshima, lunch time in Nagasaki). Colonel Tibbets, in charge of the 509th Composite Group which dropped the bombs (he piloted the Hiroshima raid), writes in his 1978 autobiography about his expertise in firestorms. He had served in Europe on successful incendiary missions before going to Japan, where he advised General LeMay on how to create firestorms with a mix of incendiaries plus EXPLOSIVES, which cause blast damage and enable fires to start in the debris of wooden buildings. The bombs landed at local times of 8.15 am on Hiroshima, when people were either on their way to work or school or having breakfast (using charcoal cooking braziers in wood frame houses containing inflammable bamboo and paper screen furnishings), and 12 pm on Nagasaki, when many people were preparing lunch and others were out of doors.

The skin burns risk depends mainly on the time of day, since the percentage of people who would be in an unobstructed line-of-sight of a fireball in a modern city ranges from 1% in the early hours of the morning to an average of 25% during the daytime. Hence, the flash burns casualty rate can easily vary by a factor of 25, simply as a function of the time of day at which an explosion occurs. Obviously, the density of combustible materials on the ground determines the risk of a firestorm, but this is trivial for most modern city buildings, made largely from steel and concrete, which simply don't burn. Dr Brode did several studies of firestorm physics during the 1980s, which I feel are irrelevant, because firestorms were well investigated in World War II when incendiary attacks were made on many cities in an effort to start them. The areas which burned well and led to firestorms had a massive abundance of combustible materials per square foot: mainly the wooden medieval parts of old cities like Hamburg, and the wooden construction areas of Japanese cities. Once burned, they were rebuilt with less inflammable materials, so these firestorms cannot be repeated in the future. (Similarly, wooden London was burned down in 1666, and was rebuilt in a more fire-resistant manner.)

There are some interesting reports by Carl Miller on firestorms in Germany, written in the 1960s. Somehow, the RAND Corporation either did not get hold of this information, or else simply jumped on the "Nuclear Winter" funding bandwagon in 1983 and ignored the facts about firestorms derived from first-hand WWII experience by people like George R. Stanbury of the British Home Office Scientific Advisory Branch.

Dr Strope wrote a 1963 NRDL unclassified report on the base surge radiation effects of the 1946 Baker underwater test, which took a lot of finding. Fortunately the British Library was at one time donated a lot of original NRDL reports (in printed form, not the usual poor-quality microfilm) and holds them at Boston Spa. I've compiled and assessed a great deal of information on radiation from underwater tests, but Blogger and WordPress blog sites are not well suited to publishing tables of data.

The British information which Dr Strope refers to in 1963 is that of Home Office scientist George R. Stanbury, who did the civil defence studies at the first British nuclear test in Monte Bello, 1952. Stanbury writes in the originally 'Restricted' (since declassified) U.K. Home Office Scientific Adviser's Branch journal Fission Fragments, Issue Number 3, August 1962, pages 22-26:

'The fire hazard from nuclear weapons

'by G. R. Stanbury, BSc, ARCS, F.Inst.P.

'We have often been accused of underestimating the fire situation from nuclear attack. We hope to show that there is good scientific justification for the assessments we have made, and we are unrepentant in spite of the television utterances of renowned academic scientists who know little about fire. ...

'Firstly ... the collapse of buildings would snuff out any incipient fires. Air cannot get into a pile of rubble, 80% of which is incombustible anyway. This is not just guess work; it is the result of a very complete study of some 1,600 flying bomb [V1 cruise missile] incidents in London supported by a wealth of experience gained generally in the last war.

'Secondly, there is a considerable degree of shielding of one building by another in general.

'Thirdly, even when the windows of a building can "see" the fireball, and something inside is ignited, it by no means follows that a continuing and destructive fire will develop.

'The effect of shielding in a built-up area was strikingly demonstrated by the firemen of Birmingham about 10 years ago with a 144:1 scale model of a sector of their city which they built themselves; when they put a powerful lamp in the appropriate position for an air burst they found that over 50% of the buildings were completely shielded. More recently a similar study was made in Liverpool over a much larger area, not with a model, but using the very detailed information provided by fire insurance maps. The result was similar.

'It is not so easy to assess the chance of a continuing fire. A window of two square metres would let in about 10^5 calories at the 5 cal/(cm)^2 range. The heat liberated by one magnesium incendiary bomb is 30 times this and even with the incendiary bomb the chance of a continuing fire developing in a small room is only 1 in 5; in a large room it is very much less.

'Thus even if thermal radiation does fall on easily inflammable material which ignites, the chance of a continuing fire developing is still quite small. In the Birmingham and Liverpool studies, where the most generous values of fire-starting chances were used, the fraction of buildings set on fire was rarely higher than 1 in 20.

'And this is the basis of the assertion [in Nuclear Weapons] that we do not think that fire storms are likely to be started in British cities by nuclear explosions, because in each of the five raids in which fire storms occurred (four on Germany - Hamburg, Darmstadt, Kassel, Wuppertal and a "possible" in Dresden, plus Hiroshima in Japan - it may be significant that all these towns had a period of hot dry weather before the raid) the initial fire density was much nearer 1 in 2. Take Hamburg for example:

'On the night of 27/28th July 1943, by some extraordinary chance, 190 tons of bombs were dropped into one square mile of Hamburg. This square mile contained 6,000 buildings, many of which were [multistorey wooden] medieval.

'A density of greater than 70 tons/sq. mile had not been achieved before even in some of the major fire raids, and was only exceeded on a few occasions subsequently. The effect of these bombs is best shown in the following diagram, each step of which is based on sound trials and operational experience of the weapons concerned.

'102 tons of high explosive bombs dropped -> 100 fires

'88 tons of incendiary bombs dropped, of which:

'48 tons of 4 pound magnesium bombs = 27,000 bombs -> 8,000 hit buildings -> 1,600 fires

'40 tons of 30 pound gel bombs = 3,000 bombs -> 900 hit buildings -> 800 fires

'Total = 2,500 fires

'Thus almost every other building [1 in 2 buildings] was set on fire during the raid itself, and when this happens it seems that nothing can prevent the fires from joining together, engulfing the whole area and producing a fire storm (over Hamburg the column of smoke, observed from aircraft, was 1.5 miles in diameter at its base and 13,000 feet high; eyewitnesses on the ground reported that trees were uprooted by the inrushing air).

'When the density was 70 tons/square mile or less the proportion of buildings fired during the raid was about 1 in 8 or less and under these circumstances, although extensive areas were burned out, the situation was controlled, escape routes were kept open and there was no fire storm.'
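As a quick arithmetic check on Stanbury's figures quoted above (taking his numbers at face value): a window of two square metres at the 5 cal/cm&sup2; range admits 10^5 calories, and one 4 lb magnesium incendiary liberates about 30 times that.

```python
# Checking Stanbury's window figure: a 2 m^2 window at the 5 cal/cm^2
# range admits 2 * 10^4 cm^2 * 5 cal/cm^2 = 10^5 calories.
window_area_cm2 = 2 * 100 * 100       # 2 square metres, in cm^2
thermal_exposure = 5                  # cal/cm^2 at this range
energy_in = window_area_cm2 * thermal_exposure
incendiary_energy = 30 * energy_in    # one magnesium bomb ~ 30x the window input

print(energy_in)          # 100000 cal, i.e. 10^5
print(incendiary_energy)  # 3000000 cal
```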


Regarding Hiroshima, Nagasaki, Tokyo and other incendiary attacks on Japan, there is an excellent table comparing all the data on page 336 of the 1950 "Effects of Atomic Weapons" (deleted from later editions), based on the U.S. Strategic Bombing Survey report of 1946 on Hiroshima and Nagasaki: the Hiroshima bomb destroyed 4.7 square miles; Nagasaki, 1.8 square miles; while the 1,667 tons of incendiary and TNT bombs dropped on Tokyo in one conventional raid destroyed 15.8 square miles, killing many more people than the atomic bombs.

The "nuclear winter" Cold War propaganda, dependent as it was on the firestorm nonsense, is of course a complete scientific lie, but it was a major political and media "spin event":

"This study, which is based entirely on open Soviet sources, examines and analyzes Soviet views on and uses made by Soviet scientists of the so-called ''Nuclear Winter'' hypothesis. In particular, the study seeks to ascertain whether Soviet scientists have in fact independently confirmed the TTAPS prediction of a ''Nuclear Winter'' phenomenon or have contributed independent data or scenarios to it. The findings of the study are that the Soviets view the ''Nuclear Winter'' hypothesis as a political and propaganda opportunity to influence Western scientific and public opinion and to restrain U.S. defense programs. Analysis of Soviet publications shows that, in fact, Soviet scientists have made no independent or new contributions to the study of the ''Nuclear Winter'' phenomenon, but have uncritically made use of the worst-case scenarios, parameters, and values published in the Crutzen-Birks (Ambio 1982) and the TTAPS (Science, December 1983) studies, as well as models of atmospheric circulation borrowed from Western sources. Furthermore, current Soviet directives to scientists call for work on the further strengthening of the Soviet Union's military might, while it is also explained that the dire predictions of the possible consequences of a nuclear war in no way diminish the utility of the Soviet civil defense program and the need for its further improvement."

- Dr Leon Goure, USSR foreign policy expert, Soviet Exploitation of the 'Nuclear Winter' Hypothesis, SCIENCE APPLICATIONS INTERNATIONAL CORP., MCLEAN, VA, report SAIC-84/1310, DNA-TR-84-373, SBITR-84-373, ADA165794, June 1985.

A great deal of the problem is that following fashion and consensus is the easiest thing to do. Usually the mainstream viewpoint is the best there is, so people have a lot of faith in it, on the principle that "so many people can't all be wrong".

Nuclear winter has quite an interesting history which I've followed from the beginning. It started off with the comet impact that wiped out the dinosaurs. The comet forms a fireball when it collides with the atmosphere, and the thermal radiation is supposed to ignite enough tropical vegetation to produce a thick smoke cloud, freezing the ground and killing off many species.

The best soot for absorbing solar radiation is that from burning oil, and Saddam tested this by igniting all of Kuwait's oil wells after the first Gulf War. Massive clouds of soot were produced, but the temperature drop in the affected areas was far less than "nuclear winter" calculations predicted: http://en.wikipedia.org/wiki/Nuclear_winter#Kuwait_wells_in_the_first_Gulf_War

The idea that a dark smoke layer will stop heat energy reaching the ground is naive because by conservation of energy, the dark smoke must heat up when it absorbs sunlight, and since it is dark in colour it is as good at radiating heat as absorbing it. So it passes the heat energy downwards as the whole cloud heats up, and when the bottom of the cloud has reached a temperature equilibrium with the top, it radiates heat down to the ground, preventing the dramatic sustained cooling.

Although there is a small drop in temperature at first, as when clouds obscure the sun, all the soot cloud will do in the long run is to reduce the daily temperature variation of the air from day to night, so that the temperature all day and all night will be fairly steady and close to the average of the normal daytime and nighttime temperatures.
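The energy-balance argument above can be put into a deliberately crude sketch: treat the soot as a single thin, opaque, black layer absorbing a globally averaged solar flux of roughly 240 W/m&sup2; (a round-number assumption, not a figure from the source). In equilibrium the layer radiates from both faces, so about half the absorbed flux is re-radiated downward to the ground rather than lost:

```python
# Crude single-layer energy balance (an idealisation, not a climate model):
# a thin opaque black layer absorbing flux S radiates from both faces,
#   2 * sigma * T^4 = S   ->   T = (S / (2 * sigma))^(1/4),
# so the downward thermal flux is sigma * T^4 = S / 2.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 240.0         # assumed globally averaged absorbed solar flux, W/m^2

T_layer = (S / (2 * SIGMA)) ** 0.25
down_flux = SIGMA * T_layer ** 4   # flux re-radiated toward the ground

print(f"layer temperature ~ {T_layer:.0f} K")
print(f"downward thermal flux ~ {down_flux:.0f} W/m^2 (half of {S:.0f})")
```

This is only the equilibrium end-state the text describes; it says nothing about how long the cloud takes to warm up, but it illustrates why a dark absorbing layer cannot simply switch off the heat supply to the surface.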

The dinosaur extinction evidence, http://en.wikipedia.org/wiki/Chicxulub_Crater, might be better explained by the direct effects of the comet impact: the air blast wave and thermal radiation effects on dinosaurs, and the kilometers-high tsunami. At the time the comet struck Chicxulub in Mexico with 100 TT (100,000,000 megatons or 100 million million tons) energy 65 million years ago, the continents were all located in the same area, see the map at http://www.dinotreker.com/cretaceousearth.html and would all have suffered severe damage from the size of the explosion. Most dinosaur fossils found are relatively close to the impact site on the world map 65 million years ago.
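A unit check on the quoted impact energy, using the standard definition of 4.184 GJ per ton of TNT:

```python
# 100 TT = 100 million megatons; check this equals 100 million million tons
# of TNT, and convert to joules (1 ton TNT = 4.184e9 J by definition).
megatons = 100e6
tons_tnt = megatons * 1e6          # 1e14 tons = 100 million million tons
joules = tons_tnt * 4.184e9

print(f"{tons_tnt:.0e} tons TNT = {joules:.2e} J")
```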

Another issue is that some proportion of the rock in the crater was calcium carbonate, which releases CO2 when heated in a fireball. If there was enough of it, the climatic effects would have been due to excessive heating, not cooling.

The "nuclear winter" idea relies on soot, not dust such as fallout (which is only about 1% of the crater mass, the remainder being fallback of rock and crater ejecta which lands within a few minutes). So it is basically an extension of the massive firestorms theory, which has many issues because modern cities don't contain enough flammable material per square kilometre to start a firestorm even when using thousands of incendiaries. In cases such as Hiroshima, the heavy fuel loading of the target area created a smoke cloud which carried up a lot of moisture that condensed in the cool air at high altitudes, bringing the soot back promptly to earth as a black rain.

Because this kind of thing is completely ignored by "nuclear winter" calculations, the whole "nuclear winter" physics looks artificial to me. In 1990, after several studies showed that TTAPS (Sagan et al.) had exaggerated the problem massively through assumptions such as a 1-dimensional model, TTAPS wrote another paper in Science, where they sneakily modified the baseline nuclear targeting assumptions so that virtually all the targets were oil refineries. This enabled them to claim that a moderate cooling was still credible. However, the Kuwait burning oil wells experience a few years later did nothing to substantiate their ideas. Sagan did eventually concede there were faulty assumptions in the "nuclear winter" model, although some of his collaborators continue to write about it.

 
At 6:26 pm, Blogger nige said...

copy of a comment:

http://kea-monad.blogspot.com/2007/10/where-to-now.html

Just to comment on this. I read Kahn's "On Thermonuclear War" (first published 1960) as a teenager and then requested his other books via the local library.

Kahn's book "The Next 200 Years" is, if I recall, a small, slim paperback, and I don't think there was much data in it to make his case.

The key book for environmentalism is Herman Kahn and Julian Simon, "The Resourceful Earth - A Response to Global 2000" published in 1984 (Kahn died in 1983 while it was still in the press).

That volume is massive and contains hundreds of graphs and tables of data which really make a convincing case that environmentalism exaggerated the facts.

I read that perhaps twenty years ago and don't have a copy handy. But I think it dealt with everything.

Even things like species extinction are being grossly exaggerated - species are always becoming extinct, as the fossil record shows. It's nothing new. As new species come along, old ones die off. If that wasn't the case, there would still be dinosaurs around, and the world would be a lot less healthy for humans. The whole reason why saber-toothed tigers and other wild beasts were hunted to extinction was to make life bearable, not out of ignorance or selfishness!

Most of this environmentalism is a replacement for religion. The rate of rise of sea levels, etc., is slow enough that low-lying areas can build up defences in the meantime - far more cheaply than cutting CO2 emissions.

Better still, switch to nuclear power. The effects of low doses of external gamma radiation, especially if delivered at low dose rates, are actually beneficial to human beings as they stimulate DNA repair mechanisms like P53 and cut the cancer risk (it's only internal high-LET radiation like alpha and beta particles from ingested Sr-90 or Pu-239, or extremely large doses/dose rates from gamma rays, that cause a net health risk):

See the monumental report on the effects of low dose rate, low-LET gamma radiation on 10,000 people in Taiwan by W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, Is Chronic Radiation an Effective Prophylaxis Against Cancer?, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format at

http://www.jpands.org/vol9no1/chen.pdf

'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

'The experience of these 10,000 persons suggests that long term exposure to [gamma]radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'
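The quoted Taiwan rates follow directly from the raw numbers in the extract above (a simple check, using 20 years, the upper end of the stated 9 to 20 year occupancy):

```python
# Checking the quoted Taiwan figures: 7 cancer deaths among 10,000 persons
# over ~20 years, at an average individual dose of 0.4 Sv.
persons = 10_000
years = 20
cancer_deaths = 7
mean_dose_sv = 0.4

person_years = persons * years                         # 200,000
rate_per_100k = cancer_deaths / person_years * 100_000 # deaths per 100,000 person-years
collective_dose = persons * mean_dose_sv               # person-Sv

print(rate_per_100k)     # 3.5 per 100,000 person-years, as quoted
print(collective_dose)   # 4,000 person-Sv, as quoted
```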

For Hiroshima-Nagasaki data supporting the fact that low level gamma radiation cuts down cancer risks, see the most recent two posts at http://glasstone.blogspot.com/

Growing populations and economic activity really need to be used to sort out these problems in an unbiased way. Unfortunately, the mainstream approaches start with a prejudice dating back to 1957, when the facts were not known (P53 was only discovered twenty years later). The culture clash between fashionable politics and scientific facts always results in fashionable politics winning, with people needlessly dying and suffering as a consequence.

"History shows that much can change, expectedly or unexpectedly, over short periods, and it is unlikely that most trends would continue unabated for decades without changing course."

I hope you are right. Unfortunately, they will probably make changes for the worse, like spending enough to bankrupt the world by building giant CO2 extractors which will be completed just about the time the oil, gas and coal run out, and so will never be used. That's the story of how politics always works when it uses "common sense" to tackle complex problems: it is not merely "too little too late", but "completely crazy".

 
At 10:06 pm, Blogger nige said...

copy of a comment to Wikipedia:

http://en.wikipedia.org/wiki/Talk:Ernest_J._Sternglass#POV_issues

Fastfission, please don't make ''ad hominem'' personal insults about Sternglass being "semi-crackpot". If you want to see my alternative POV on Sternglass, see my top blog post at [[http://glasstone.blogspot.com/]], which analyses errors in Sternglass' work. Notice that this (Wikipedia) article on Sternglass contains a lot of bias. First, it claims in passing that Herman Kahn minimises the effects of radiation, when in fact radiation is the topic Kahn dwells on, e.g., Kahn stated in his 1960 book ''On Thermonuclear War,'' Princeton University Press, p 24:

‘... those waging a modern war are going to be as much concerned with bone cancer, leukemia, and genetic malformations as they are with the range of a B-52 or the accuracy of an Atlas missile.’

Secondly, this Wiki article claims that Linus Pauling was warning that there is no safe threshold back in the 1960s. Scientifically, what matters is what evidence there is either for or against a threshold. Certainly there is no threshold for high-LET radiation like alpha and beta particles in tissue, because they are stopped within a small distance and the ionization density is large enough to overcome human DNA repair mechanisms like protein P53 (which was only discovered in the late 1970s). However, low-LET radiation like gamma rays, whether received at high or low dose rates, does show a threshold [[http://glasstone.blogspot.com/]]; this data is from the Japanese nuclear weapon attacks (where the dose rates were high, due to initial nuclear radiation) and from low-level radiation during an accident in which Cobalt-60 got into steel used to make buildings lived in for 20 years by 10,000 people in Taiwan (see W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, ''Is Chronic Radiation an Effective Prophylaxis Against Cancer?,'' published in the ''Journal of American Physicians and Surgeons,'' Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here: [[http://www.jpands.org/vol9no1/chen.pdf]]).

Thirdly, as explained in my blog [[http://glasstone.blogspot.com/]], Alice Stewart actually debunked Sternglass' model, instead of confirming it as this Wikipedia nonsense claims:

Sternglass first publicised his "theory" at the 9th Annual Hanford Biological Symposium in May 1969. On 24 July 1969, Dr Alice Stewart wrote an article for the ''New Scientist'', "The pitfalls of Extrapolation", which identified a big problem:


"Sternglass has postulated a fetal mortality trend which would eventually produce rates well below the level which - according to his own theory - would result from background radiation." [[http://glasstone.blogspot.com/]]

Fourthly, his book ''Before the Big Bang'' contains various errors and doesn't address or replace the standard model of particle physics. It's not that Sternglass belongs to a group of "crackpots"; it's just that his work on these subjects is severely defective and wanting. If he did a lot more work on it and resolved the problems, then that would be fine. What causes difficulties is when people try to impose ideas containing errors on the world without first correcting the errors. Labelling all people with alternative ideas "crackpots" by default isn't helpful, especially when you do it anonymously under the name "Fastfission".

Sternglass may have a problem with nuclear power, but in that case he has the problem that the sun is nuclear and that background low-level radiation exists everywhere. Does he advise us to minimise it by living at the bottom of mineshafts in locations where there is little thorium-232, potassium-40, uranium-238 (and uranium decay daughters, like radon-222), etc.? What about carbon-14 naturally present in food? People like Sternglass have helped prejudice the public against the facts. I've traced the history of radiation hysteria here: [[http://glasstone.blogspot.com/2007/03/control-of-exposure-of-public-to.html]], [[http://glasstone.blogspot.com/2007/03/effect-of-dose-rate-not-merely-dose-on.html]] in particular, and [[http://glasstone.blogspot.com/2007/03/above-3.html]]. The basic conclusion is that the "no threshold" dictum was popularised on the basis of a flawed paper by Professor E. B. Lewis, author of "Leukemia and Ionizing Radiation", ''Science'', 17 May 1957, v125, No. 3255, pp. 965-72. Lewis used very preliminary Japanese and other data which wasn't detailed enough to show that a threshold existed. He was arguing from ignorance, not from evidence! Yet his argument, which ignored dose rate effects and the quality factor of the radiation, i.e., high or low linear energy transfer (LET). - Nigel Cook 172.207.139.192 (talk) 22:04, 17 November 2007 (UTC)

 
At 10:13 pm, Blogger nige said...

continuation of last sentence in previous comment:

Yet his argument, which ignored dose rate effects and the quality factor of the radiation, i.e., high or low linear energy transfer (LET), was widely accepted at the time due to political prejudice about the Cold War, and has not been corrected as the facts have emerged since.

 
At 10:28 pm, Blogger nige said...

I've also got a lot of other evidence that backs up the Japanese and Taiwan data: there were studies for example of cancer rates in different cities with different levels of natural background radiation yet closely matched population groups (with matched age groups, same diets, same habits regards smoking and drinking, etc.).

The fact that this isn't being done by professional health physicists is a sad reflection on the state of society with its severe radiation dogmas and orthodoxies, and character assassination of anyone who prefers FACTS to FASHIONS.

‘Science is the organized skepticism in the reliability of expert opinion.’ - R. P. Feynman (quoted by Smolin, The Trouble with Physics, 2006, p. 307).

‘Science is the belief in the ignorance of [the speculative consensus of] experts.’ - R. P. Feynman, The Pleasure of Finding Things Out, 1999, p187.

Cities at greater elevations above sea level have higher background radiation due to cosmic radiation (at sea level, the atmosphere is equivalent to a radiation shield of 10 metres of water, but as you move to higher altitudes, there is a fall in this amount of shielding between you and the nuclear furnaces called stars in the vacuum of outer space, so the cosmic background radiation exposure you get increases substantially).
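The "10 metres of water" equivalence for the atmosphere can be checked with back-of-envelope hydrostatics: the mass of atmosphere above each square metre is sea-level pressure divided by g, and dividing by the density of water gives the equivalent water depth.

```python
# Water-equivalent thickness of the atmosphere as a radiation shield:
# column mass per m^2 = P / g, then divide by the density of water.
P = 101_325.0      # standard sea-level atmospheric pressure, Pa
g = 9.81           # gravitational acceleration, m/s^2
rho_water = 1000.0 # density of water, kg/m^3

column_mass = P / g                      # ~10,300 kg above each m^2
water_equiv_m = column_mass / rho_water  # equivalent depth of water

print(f"{water_equiv_m:.1f} m of water")  # ~10.3 m, matching the '10 metres' figure
```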

The studies date from the 1970s-1990s, and show that if anything there is a fall in the cancer rate as you go to cities at higher altitude and more background radiation.

However, critics may claim that it is due to cleaner air, less smog, lower oxygen pressure and a healthier lifestyle.

Other studies of this sort have therefore compared cities at similar altitude with matched populations, where differences in background radiation arise from the bedrock. A city built on granite will generally have a higher background radiation level than one built on clay or limestone, so it is possible to measure the considerably different background radiation levels in different cities and correlate them with cancer rates.

Obviously, here things are complicated, because you get radon-222 gas inside buildings built on granite that contains traces of uranium ore. This radon-222 emits alpha particles, which are high-LET radiation and certainly don't conform to a threshold-effects relationship (there is no threshold for high-LET radiation like alpha particles inside the body, only for gamma rays at relatively low doses). So this would complicate the results of the survey, and indeed it does.

I will dig out all the graphs and other evidence and publish them on this blog in the future.

 
At 12:34 pm, Blogger nige said...

copy of a comment:

http://riofriospacetime.blogspot.com/

Louise, thank you for a very interesting post on a fascinating subject! Cosmic rays are amazing. Apparently about 90% of the cosmic rays that hit the Earth's atmosphere are protons, 9% are alpha particles (helium nuclei) and 1% are electrons.

Of course the protons don't make it through the Earth's atmosphere (equivalent to a radiation shield of 10 metres of water, which is quite adequate to shield the core of a critical water-moderated nuclear reactor!!).

When the high-energy protons hit air nuclei, you get some secondary radiation being created like pions which decay into muons and then electrons.

A lot of the electrons get trapped into spiralling around the Earth's magnetic field lines at high altitudes, in space, forming the Van Allen radiation belts.

Where the magnetic field lines dip at the poles, they all come together, and so the electron density increases at the poles. At some point this negative electric charge density is sufficiently large to "reflect" most incoming electrons back, and that spot is called the "mirror point".

Hence the captured electrons are trapped into spiralling around magnetic field lines, to-and-fro between mirror points in the Northern and Southern hemispheres.

There are also of course occasional irregular gamma ray flashes from gamma ray bursters, heavy particles, etc.

It's not clear what the actual radiation levels involved are: obviously the radiation level from cosmic radiation on Earth's surface is known. It's highest at the poles where incoming radiation runs down parallel to magnetic field lines (without being captured), hence the "aurora" around the polar regions where cosmic rays leak into the atmosphere in large concentrations.

It's also high in the Van Allen belts of trapped electrons.

It's not quite as bad in space well away from the Earth. Apparently, the cosmic radiation level on the Moon's surface is approximately 1 milliroentgen/hour (10 microsieverts/hour), about 100 times the level on the Earth's surface. If that's true, then presumably the Earth's atmosphere (and the Earth's magnetic field) shields about 99% of the cosmic radiation exposure rate.
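The quoted lunar figure is self-consistent under two round-number assumptions not stated in the text: the rough gamma equivalence of 1 roentgen to about 0.01 sievert, and a typical sea-level cosmic background of about 0.1 microsievert/hour.

```python
# Rough consistency check (both conversion figures are round-number
# assumptions): 1 mR/h of gamma is ~10 microsieverts/h, and a typical
# sea-level cosmic background is ~0.1 microsieverts/h.
R_TO_SV = 0.01                 # assumed rough gamma equivalence, Sv per roentgen

moon_mR_per_h = 1.0
moon_uSv_per_h = moon_mR_per_h * 1e-3 * R_TO_SV * 1e6   # mR -> R -> Sv -> uSv
earth_uSv_per_h = 0.1                                   # assumed sea-level figure

print(moon_uSv_per_h)                     # 10 uSv/h, as quoted
print(moon_uSv_per_h / earth_uSv_per_h)   # ~100x the surface level
```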

All satellites have to have radiation-hardened solar cells and electronics, in order to survive the enhanced cosmic radiation exposure rate in space.

In the original version of Hawking's "A Brief History of Time" he has a graph showing the gamma ray energy spectrum of cosmic radiation in outer space, with another curve showing the gamma ray output from black holes via Hawking radiation. Unfortunately, the gamma background radiation intensity at all frequencies in the spectrum is way higher than the predicted gamma ray output from massive black holes (which is tiny), so there is too much "noise" to identify this Hawking radiation.

 
At 5:34 pm, Blogger nige said...

copy of a comment:

http://www.stevens.edu/csw/cgi-bin/blogs/csw/?p=85#comment-13372

This list is mainly books I've actually read, whereas [for] the last one you did (a year or more ago), I hadn't hea[r]d most of the titles. So I can make a comment or two.

I have ... disagreements with some claims in these books. ...

* I disagree with Weinberg's hype (The First 3 Minutes) in applying general relativity to cosmology, because I think in a quantum gravity theory the exchanged gravitons will be received in a redshifted state for gravitational interactions between relativistically receding masses.

* I disagree with several major errors by Richard Rhodes (The Making of the Atomic/Hydrogen Bomb). (1) Presenting Bohr's Copenhagen Interpretation as if it is Gospel truth, and ignoring Feynman's path integrals interpretation, which replaces the Copenhagen Interpretation with chaotic effects due to path interference at small distance scales, e.g., interference with electron orbits by pair production of virtual particles, which causes Brownian-motion type chaos in the atom. (2) Not pointing out clearly that the role of Hiroshima and Nagasaki was to encourage Russia to declare war on Japan (to be on the list of victors), which ended Japan's hope that Russia might negotiate a settlement with America to end the war without loss of face - the reason Japan was holding out. (Although, because America had exaggerated its hand with atomic warfare on 9 August, and the President had promised an endless rain of ruin when in fact a third atomic bomb wouldn't be ready until September, America had to accept a conditional surrender from Japan, rather than an unconditional surrender: it couldn't afford to have its bluff called when it would be unable to deliver another atomic bomb for many weeks.) According to page 336 of the U.S. Government book "Effects of Atomic Weapons" (1950), the incendiary raid on Tokyo killed more than the numbers killed at Hiroshima and Nagasaki put together. According to the Radiation Effects Research Foundation internet site (they do the surveys of Hiroshima and Nagasaki survivors), even among the survivors within 1 km of the hypocentre, less than half the leukemia cases were due to radiation (the majority were natural). Leukemia is the cancer most strongly enhanced by radiation exposure.
Altogether, from 1950-90, only 428 people died of cancer of all sorts due to radiation (9% of all the cancer deaths, i.e., 91% of cancer deaths were not connected to radiation, as proved by the control group survey) in a group of 36,500 survivors. Hence, the average risk of death from cancer due to radiation to a survivor for a period of 40 years after exposure was only 1.2%, compared to a natural (non-radiation) cancer death risk of 13%. No wonder that 50% of the survivors were still alive in 1995, fifty years after the bombings. Ref.: Radiation Research (146:1-27, 1996).
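The quoted figures are internally consistent, as a quick arithmetic check shows (a minimal sketch using only the numbers stated above; the ~13% figure corresponds to the cohort's overall cancer death risk, of which ~91% was unrelated to radiation):

```python
# Arithmetic check of the RERF cohort figures quoted above
# (Radiation Research 146:1-27, 1996).
survivors = 36_500              # cohort size
radiogenic_deaths = 428         # cancer deaths attributed to radiation, 1950-90
radiogenic_share = 0.09         # radiogenic fraction of all cancer deaths

radiation_risk = radiogenic_deaths / survivors
total_cancer_deaths = radiogenic_deaths / radiogenic_share
total_cancer_risk = total_cancer_deaths / survivors

print(f"40-year risk of cancer death from radiation: {radiation_risk:.1%}")   # → 1.2%
print(f"Overall cancer death risk in the cohort:     {total_cancer_risk:.1%}")  # → 13.0%
```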

Richard Rhodes - like Steven Weinberg - is not writing pure science, but spin that the public wants to read because it has been prejudiced by propaganda from political institutions with axes to grind. What really made me angry in Rhodes' books is his pseudoscientific treatment of fallout particles. In one place he claims that coral is reduced to calcium metal, and in another that the first H-bomb test in 1952 produced 80 million tons of mud. In fact the fallout was carefully collected and analysed in weapon test reports WT-615, WT-915 and WT-1317, which show that the fallout from coral (CaCO3, calcium carbonate) is CaO (lime), with an outer layer of Ca(OH)2 (slaked lime, calcium hydroxide). The 80 million tons he quotes is the crater volume, most of which is simply ejected as rock around the crater; the fallout mass taken up into the fireball is only about 1% of the cratered material. And even if the fireball were hot enough to reduce 80 million tons of coral to calcium metal (it isn't), that calcium would oxidise in the atmosphere while falling out. Rhodes' science here is a lot of hogwash!

 
At 9:00 am, Blogger nige said...

copy of a comment:

http://riofriospacetime.blogspot.com/2007/12/night-at-museum-pt-2.html

Thanks, Louise. This is extremely interesting and very informative! It's interesting that dense meteorites, especially those composed of iron and nickel, tend to survive ablation during their fall through the atmosphere and hit the ground. Less dense stony objects of similar mass tend to heat up and then explode like an air-burst nuclear bomb while still high in the atmosphere, as was the case with the Tunguska explosion of June 30, 1908 (an explosion equivalent to several megatons of TNT; see C. Chyba, P. Thomas, and K. Zahnle, "The 1908 Tunguska Explosion: Atmospheric Disruption of a Stony Asteroid", Nature, v. 361, 1993, pp. 40-44).

"Since the Hall of Meteorites contains similiar samples, are any of them about to melt? If they contained even a tiny amount of radioactive isotopes, it would not be safe to go near this room. If they contained any isotopes, those would have decaued to nothing long ago. Today these rocks are cold as the New York Winter, yet Earth's core continues to produce heat."

If a small rock were hot enough for its heat to be measurable, the radiation would be lethal. A radiation dose of 10 Sieverts, which is equal to 10 Joules/kg for a quality factor of 1 (low-LET radiations), is lethal within a few days. Since an average person is about 70 kg, that means about 700 Joules of absorbed radiation energy is lethal. To make a rock hot, and keep it hot for long periods, by the degradation of radioactive decay energy into heat, far larger amounts of radioactive energy are required, so the radiation from such a rock would be lethal.
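The scale of the mismatch is easy to show numerically. The sketch below uses the dose figures above plus an assumed specific heat of ~800 J/(kg·K), typical for silicate rock: the entire energy of a lethal whole-body dose would warm a 1 kg rock by less than one degree.

```python
# Energy of a lethal whole-body radiation dose vs. energy needed to warm a rock.
lethal_dose_sv = 10.0        # ~lethal within a few days (quality factor 1)
body_mass_kg = 70.0          # average adult
lethal_energy_j = lethal_dose_sv * body_mass_kg   # 10 J/kg x 70 kg = 700 J

rock_mass_kg = 1.0
specific_heat = 800.0        # J/(kg*K), assumed typical value for silicate rock
delta_t = lethal_energy_j / (rock_mass_kg * specific_heat)

print(f"Lethal dose energy: {lethal_energy_j:.0f} J")
print(f"Temperature rise of a 1 kg rock from that much energy: {delta_t:.2f} K")
```

A rock warm enough to notice must dissipate hundreds of joules continuously every few minutes, so its radiation field would exceed a lethal dose many times over, which is the point of the paragraph above.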

The thing about the earth is that you have a lot of radioactivity distributed within it, and very little leakage of that energy. A few feet of earth or rock can keep the embers of a fire hot for a long time. If you take account of the thickness of the earth's crust, it traps heat very efficiently, so that a moderate amount of radioactivity keeps the core hot. (However, I'm skeptical about the details as I've not seen any convincing calculations from geologists so far.)

If you try testing those meteorites for radioactivity, you will find there is some in them (probably very little, but still a trace)! The earth does contain a lot of uranium: http://www.uic.com.au/nip78.htm:

"The convection in the core may be driven by the heat released during progressive solidification of the core (latent heat of crystallisation) and leads to the self-sustaining terrestrial dynamo which is the source of the Earth's magnetic field. Heat transfer from the core at the core/mantle boundary is also believed to trigger upwellings of relatively hot, and hence low density, plumes of material. These plumes then ascend, essentially without gaining or losing heat, and undergo decompression melting close to the Earth's surface at 'hot spots' like Hawaii, Reunion and Samoa.

"However, the primary source of energy driving the convection in the mantle is the radioactive decay of uranium, thorium and potassium. In the present Earth, most of the energy generated is from the decay of U-238 (c 10-4 watt/kg). At the time of the Earth's formation, however, decay of both U-235 and K-40 would have been subequal in importance and both would have exceeded the heat production of U-238. ...

"Measurements of heat have led to estimates that the Earth is generating between 30 and 44 terawatts of heat, much of it from radioactive decay. Measurements of antineutrinos have provisionally suggested that about 24 TW arises from radioactive decay. Professor Bob White provides the more recent figure of 17 TW from radioactive decay in the mantle. This compares with 42-44 TW heat loss at the Earth's surface from the deep Earth."

There's nothing in the universe that isn't radioactive. (Even clouds of hydrogen gas contain traces of tritium.)

Table 1 in that above-linked article shows that meteorites contain about 0.008 parts per million uranium, the earth's mantle about 0.021 parts per million, and the continental crust about 1.4 parts per million. The concentration of uranium in the earth's core is not very well known (though antineutrino measurements are becoming available), but since uranium is very dense (denser than lead), there may be a considerable concentration of uranium in the earth's core, at least similar to that in the crust. The same goes for thorium-232, etc.

... The earth's core is hot not because the radioactivity is capable of keeping isolated rocks hot, but because the rate of loss of heat is minimised due to the poor thermal conductivity of the outer layers, particularly the crust. This keeps most of the heat trapped.

The calculation to check the theory should be simple. Take the total radioactivity in the earth (in Becquerels, decays/second), multiply it by the average energy of the radiation emitted (0.3 MeV or so for a beta particle, 4 MeV or so for an alpha particle) and that gives you the total MeV/second, then convert that power ... into Joules/second (watts). Then estimate the diffusion rate of the heat out of the earth.
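A minimal sketch of that estimate, using assumed round figures: a mantle mass of ~4e24 kg, a mantle uranium abundance of ~0.021 parts per million (i.e. ~21 parts per billion, the order of magnitude usually quoted), and the ~1e-4 watt per kilogram of uranium heat-production figure quoted above for the U-238 decay chain:

```python
# Back-of-envelope estimate of radiogenic heating from mantle uranium alone.
# All inputs are assumed round figures for illustration.
mantle_mass_kg = 4.0e24      # approximate mass of the Earth's mantle
uranium_ppm = 0.021          # mantle uranium abundance, parts per million
heat_per_kg_u = 1.0e-4       # W per kg of uranium (U-238 chain, quoted above)

uranium_mass_kg = mantle_mass_kg * uranium_ppm * 1e-6
power_w = uranium_mass_kg * heat_per_kg_u

print(f"Uranium in the mantle: {uranium_mass_kg:.1e} kg")
print(f"Radiogenic power from uranium alone: {power_w / 1e12:.1f} TW")
```

This gives roughly 8 TW from uranium by itself; adding thorium and potassium brings the total toward the 17-24 TW radiogenic estimates quoted in the article above.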

 
At 11:16 am, Blogger nige said...

[BTW, I've noticed some typographical errors and errors of grammar in the last update added to the body of this post, e.g., the update section. I'm not going to try to update it, for the following reasons. There is a flaw in the old blogger template software used on this blog, and every time any changes are made, extra line spacings between paragraphs are automatically inserted when the changes are saved. There is also a flaw that if the template is changed, comments are lost and not transferred over.]

Extract of relevant material from a comment to:

http://kea-monad.blogspot.com/2007/11/panthalassa.html

...
The evidence in favour of a supernova explosion shortly before the Earth formed 4,540 million years ago is compelling from the natural radioactivity distribution in the Earth. Earth is basically a giant fallout particle, as people like Edward Teller first pointed out over fifty years ago:

‘Dr Edward Teller remarked recently that the origin of the earth was somewhat like the explosion of the atomic bomb...’

– Dr Harold C. Urey, The Planets: Their Origin and Development, Yale University Press, New Haven, 1952, p. ix.

‘It seems that similarities do exist between the processes of formation of single particles from nuclear explosions and formation of the solar system from the debris of a supernova explosion. We may be able to learn much more about the origin of the earth, by further investigating the process of radioactive fallout from the nuclear weapons tests.’

– Dr P.K. Kuroda, ‘Radioactive Fallout in Astronomical Settings: Plutonium-244 in the Early Environment of the Solar System,’ Radionuclides in the Environment (Dr Edward C. Freiling, Symposium Chairman), Advances in Chemistry Series No. 93, American Chemical Society, Washington, D.C., 1970.
...

 
At 5:02 pm, Blogger nige said...

A rare (though not detailed) background survey of the social reasons for the censorship of nuclear weapons effects data is the paper:

Professor Brian Martin (then a PhD physicist at the Department of Mathematics, Faculty of Science, Australian National University, Canberra, but now he is Professor of Social Sciences in the School of Social Sciences, Media and Communication at the University of Wollongong), "Critique of Nuclear Extinction", published in Journal of Peace Research, Vol. 19, No. 4, pp. 287-300 (1982):

"The idea that global nuclear war could kill most or all of the world's population is critically examined and found to have little or no scientific basis. A number of possible reasons for beliefs about nuclear extinction are presented, including exaggeration to justify inaction, fear of death, exaggeration to stimulate action, the idea that planning is defeatist, exaggeration to justify concern, white western orientation, the pattern of day-to-day life, and reformist political analysis. Some of the ways in which these factors inhibit a full political analysis and practice by the peace movement are indicated. Prevalent ideas about the irrationality and short duration of nuclear war and of the unlikelihood of limited nuclear war are also briefly examined."

For his article debunking the "nuclear winter" hoax of Sagan et al., see Brian Martin's article, "Nuclear winter: science and politics", Science and Public Policy, Vol. 15, No. 5, October 1988, pp. 321-334, http://www.uow.edu.au/arts/sts/bmartin/pubs/88spp.html.

Notice that Brian Martin is an immensely important figure in censorship studies: http://www.uow.edu.au/arts/sts/bmartin/pubs/controversy.html#nuclearwar.

Of particular interest on the Brian Martin site are the following pages:

http://www.uow.edu.au/arts/sts/bmartin/dissent/intro/

and

http://www.uow.edu.au/arts/sts/bmartin/pubs/controversy.html

 
At 7:05 pm, Blogger nige said...

Also see the informative article on line in PDF:

"Nitrogen oxides, nuclear weapon testing, Concorde and stratospheric ozone" P. Goldsmith, A. F. Tuck, J. S. Foot, E. L. Simmons & R. L. Newson, published in Nature, v. 244, issue 5418, pp. 545-551, 31 August 1973:

"ALTHOUGH AMOUNTS OF NITROGEN OXIDES EQUIVALENT TO THE OUTPUT FROM MANY CONCORDES WERE RELEASED INTO THE ATMOSPHERE WHEN NUCLEAR TESTING WAS AT ITS PEAK, THE AMOUNT OF OZONE IN THE ATMOSPHERE WAS NOT AFFECTED."

What happens when nitrogen oxides are released in a nuclear explosion is partly that they combine with moisture in the mushroom cloud to form very dilute nitric acid which eventually (after being blown around the world in small particles) gets precipitated.

More important, although the shock wave of a nuclear explosion creates nitrogen oxides, especially nitrogen dioxide, THE PROMPT X-RAYS AND GAMMA RADIATION CREATE OZONE!

It's the ozone around the early fireball that shields most of the early-time thermal radiation, which is mainly in the ultraviolet.

Hence, nuclear explosions in the atmosphere don't just release ozone-destroying nitrogen oxides, THEY ALSO RELEASE OZONE! Depending on the yield and the altitude of the detonation, in some cases the Earth's ozone layer can actually be INCREASED, not reduced.

A high altitude nuclear explosion does NOT produce a strong blast wave, and nitrogen oxide production requires a high-overpressure shock wave! Hence, in a high altitude nuclear explosion, the production of ozone from gamma radiation EXCEEDS the production of nitrogen oxides many times over. It is quite conceivable that suitable high altitude nuclear explosions over the South Pole would have the effect of repairing the hole in the ozone layer there. Of course, it won't happen, because as Feynman said when discussing the nuclear testing hysteria of the 1960s, we really still live in a pseudo-scientific age.

See also:

J. Strzelczyk, W. Potter, & Z. Zdrojewicz, "Rad-By-Rad (Bit-By-Bit): Triumph of Evidence Over Activities Fostering Fear of Radiogenic Cancers at Low Doses", Dose Response, v. 5 (2007), issue 4, pp. 275-283:

"Large segments of Western populations hold sciences in low esteem. This trend became particularly pervasive in the field of radiation sciences in recent decades. The resulting lack of knowledge, easily filled with fear that feeds on itself, makes people susceptible to prevailing dogmas. Decades-long moratorium on nuclear power in the US, resentment of "anything nuclear", and delay/refusal to obtain medical radiation procedures are some of the societal consequences. The problem has been exacerbated by promulgation of the linear-no-threshold (LNT) dose response model by advisory bodies such as the ICRP, NCRP and others. This model assumes no safe level of radiation and implies that response is the same per unit dose regardless of the total dose. The most recent (June 2005) report from the National Research Council, BEIR VII (Biological Effects of Ionizing Radiation) continues this approach and quantifies potential cancer risks at low doses by linear extrapolation of risk values obtained from epidemiological observations of populations exposed to high doses, 0.2 Sv to 3 Sv. It minimizes the significance of a lack of evidence for adverse effects in populations exposed to low doses, and discounts documented beneficial effects of low dose exposures on the human immune system. The LNT doctrine is in direct conflict with current findings of radiobiology and important features of modern radiation oncology. Fortunately, these aspects are addressed in-depth in another major report—issued jointly in March 2005 by two French Academies, of Sciences and of Medicine. The latter report is much less publicized, and thus it is a responsibility of radiation professionals, physicists, nuclear engineers, and physicians to become familiar with its content and relevant studies, and to widely disseminate this information. To counteract biased media, we need to be creative in developing means of sharing good news about radiation with co-workers, patients, and the general public."

Here's a quotation from Feynman (not his specific objection to low-level radiation hysteria, which he made elsewhere, in his book "The Meaning of It All", by pointing out that if Pauling et al. were really so worried about such levels of radiation, they would campaign first and foremost to evacuate cities at high altitudes, where cosmic radiation is highest, to ban air travel, and to evacuate cities built on bedrock like granite, which contains substantial quantities of naturally radioactive uranium-238, and only THEN move on to the far smaller dangers of fallout from weapons tests, which typically increased lifetime background radiation dosage by a mere 1%):

"What is Science?" by R.P. Feynman, presented at the fifteenth annual meeting of the National Science Teachers Association, 1966 in New York City, and reprinted from The Physics Teacher Vol. 7, issue 6, 1968, pp. 313-320:

"... great religions are dissipated by following form without remembering the direct content of the teaching of the great leaders. In the same way, it is possible to follow form and call it science, but that is pseudo-science. In this way, we all suffer from the kind of tyranny we have today in the many institutions that have come under the influence of pseudoscientific advisers.

"We have many studies in teaching, for example, in which people make observations, make lists, do statistics, and so on, but these do not thereby become established science, established knowledge. They are merely an imitative form of science analogous to the South Sea Islanders' airfields--radio towers, etc., made out of wood. The islanders expect a great airplane to arrive. They even build wooden airplanes of the same shape as they see in the foreigners' airfields around them, but strangely enough, their wood planes do not fly. The result of this pseudoscientific imitation is to produce experts, which many of you are. ... you teachers, who are really teaching children at the bottom of the heap, can maybe doubt the experts. As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts.

"When someone says, "Science teaches such and such," he is using the word incorrectly. Science doesn't teach anything; experience teaches it. If they say to you, "Science has shown such and such," you might ask, "How does science show it? How did the scientists find out? How? What? Where?"

"It should not be "science has shown" but "this experiment, this effect, has shown." And you have as much right as anyone else, upon hearing about the experiments--but be patient and listen to all the evidence--to judge whether a sensible conclusion has been arrived at.

"In a field which is so complicated ... that true science is not yet able to get anywhere, we have to rely on a kind of old-fashioned wisdom, a kind of definite straightforwardness. I am trying to inspire the teacher at the bottom to have some hope and some self-confidence in common sense and natural intelligence. The experts who are leading you may be wrong.

"I have probably ruined the system, and the students that are coming into Caltech no longer will be any good. I think we live in an unscientific age in which almost all the buffeting of communications and television--words, books, and so on--are unscientific. As a result, there is a considerable amount of intellectual tyranny in the name of science.

"Finally, with regard to this time-binding, a man cannot live beyond the grave. Each generation that discovers something from its experience must pass that on, but it must pass that on with a delicate balance of respect and disrespect, so that the race--now that it is aware of the disease to which it is liable--does not inflict its errors too rigidly on its youth, but it does pass on the accumulated wisdom, plus the wisdom that it may not be wisdom.

"It is necessary to teach both to accept and to reject the past with a kind of balance that takes considerable skill. Science alone of all the subjects contains within itself the lesson of the danger of belief in the infallibility of the greatest teachers of the preceding generation."

 
At 7:31 pm, Blogger nige said...

On the subject of consensus-led "groupthink", see http://en.wikipedia.org/wiki/Groupthink:

’Groupthink is a type of thought exhibited by group members who try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas. During Groupthink, members of the group avoid promoting viewpoints outside the comfort zone of consensus thinking. A variety of motives for this may exist such as a desire to avoid being seen as foolish, or a desire to avoid embarrassing or angering other members of the group. Groupthink may cause groups to make hasty, irrational decisions, where individual doubts are set aside, for fear of upsetting the group’s balance.’

- Wikipedia.

‘[Groupthink is a] mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.’

- Professor Irving Janis.

The wikipedia article on groupthink gives two examples which have been investigated in some detail: the Space Shuttle Challenger disaster (1986), which Feynman investigated alongside the rocket engineer General Kutyna, and the Bay of Pigs invasion (1961).

Challenger exploded shortly after launch in 1986 because it was launched in freezing weather: the cold had caused the rubber O-rings (sealing the joints between sections of the reusable booster rockets) to lose their resilience, letting hot combustion gas leak through a booster joint. The leaking joint burned through entirely, and the escaping flame ignited the adjacent fuel tank, destroying the vehicle.

In the days and hours before the disaster, engineers repeatedly pointed out to their bosses that this risk was going to put lives on the line. The way out was to wait until the ambient temperature rose above the point at which the rubber ceases to function as a sealant. But delaying the launch was deemed too costly and unnecessary. The risks, although known, were dismissed by the senior "experts" in charge of the operation who heard about them. There was also the problem that the major source of data on the problem was the contractor selling the boosters to NASA, which didn't want to lose its contract by causing unnecessary problems and worries. So everyone agreed to cross their fingers, hope for the best, and launch the shuttle when they knew the temperature was so low that the rubber seals could malfunction and leak, causing disaster.

An account of the investigation was written by Feynman and included as an appendix to his book What Do You Care What Other People Think? (Feynman was the main physicist on the commission of inquiry into the disaster). However, Feynman couldn't find the exact cause directly himself because - despite going to all the contractors - nobody there told him the facts. The people who knew were scared of losing their jobs or of being somehow "disloyal" to their employers by speaking out.

What happened instead was that Feynman was told the facts by another expert, the rocket engineer General Donald J. Kutyna, who was investigating the disaster on the same committee. Kutyna was the man who had headed the inquiry into the explosion of a liquid-fuelled Titan missile in its silo in 1980 (in that case a technician had caused the disaster by accidentally dropping a wrench socket down the silo, where it hit the fuel tank and caused a leak that led to a chemical explosion, blowing the 9-megaton warhead off the missile without, of course, detonating its 1-point-safe nuclear core). Because of his experience investigating a liquid-fuelled rocket explosion, Kutyna was able to work out why Challenger blew up, and he told Feynman the facts to ensure that the NASA cover-up would be exposed, and criticism levelled at those who made the decision to launch in low-temperature weather when they should have delayed the launch to reduce risks and save lives.

The Bay of Pigs disaster occurred when President Kennedy in 1961 authorised the invasion of Cuba by a group of Cuban exiles. They met fierce opposition and called for air support. Kennedy didn't want to provide air support using the U.S. military, for fear that the U.S. involvement would become known. In the end, he lost both the invasion and also the anonymity of the U.S., because when Castro captured the Cuban exiles they had U.S. equipment and admitted having been trained by the U.S. The cause of the failure was the groupthink of Kennedy's advisors, who feared speaking out of turn when the final planning for the operation was being approved.

Another example of groupthink by a committee of leading experts is the design of the Hiroshima bomb, which used 64.1 kg of highly enriched uranium yet managed to fission only about 1% of it, yielding about 12 kt or so. The Nagasaki bomb contained only 6.19 kg of plutonium, but had a fission efficiency of over 20%. In 1952, an implosion bomb along the lines of the Nagasaki design (but with a hollow core), using the same quantity of uranium as the Hiroshima bomb, yielded 500 kt, i.e. about 50% fission efficiency.
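Those efficiency figures can be roughly cross-checked against the stated yields, assuming the standard rule of thumb that complete fission of 1 kg of fissile material yields about 17.5 kt (roughly 57 g fissioned per kiloton):

```python
# Yield implied by the fissioned fraction of each core, using the common
# rule of thumb of ~17.5 kt per kg of material completely fissioned (assumed).
KT_PER_KG = 17.5

def implied_yield_kt(core_mass_kg: float, fission_fraction: float) -> float:
    """Yield in kilotons from fissioning the given fraction of the core."""
    return core_mass_kg * fission_fraction * KT_PER_KG

print(f"Hiroshima (64.1 kg HEU, ~1% fission): {implied_yield_kt(64.1, 0.01):.0f} kt")   # → 11 kt
print(f"Nagasaki (6.19 kg Pu, ~20% fission):  {implied_yield_kt(6.19, 0.20):.0f} kt")   # → 22 kt
print(f"1952 design (64.1 kg, ~50% fission):  {implied_yield_kt(64.1, 0.50):.0f} kt")   # → 561 kt
```

These come out close to the figures quoted above (roughly 12 kt, the ~21 kt actual Nagasaki yield, and ~500 kt), so the quoted efficiencies hang together.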

Why was the Hiroshima bomb relatively so inefficient? In the book The Curve of Binding Energy by John McPhee, Dr Theodore Taylor criticised the Hiroshima bomb design as a stupid design which was the result of groupthink-type incompetence due to a committee team.

The Hiroshima bomb was like a gun and fired a solid cylinder of highly enriched U-235 into a hollow cylinder. There were quite a few issues with this. If the fission reaction was started by a stray neutron before the U-235 projectile was fully home in the hollow U-235 sleeve, the result would be relatively inefficient, because the geometry would allow most of the neutrons to escape instead of producing further fissions. And since the duration of any fission reaction was trivial in comparison with the time taken for the projectile to move a few centimetres, there would be no further assembly once the reaction began.

These details are disclosed in an article called "The Physics of the Bomb", published in No. 2 of the Penguin series Science News, 1947, written by Los Alamos weapons designer Professor Philip Morrison. Morrison expanded on several details of the 1945 Smyth report (Atomic Energy for Military Purposes), for example the statistical risk of pre-detonation (inefficiency) due to the neutron background in a gun-type weapon assembly, and the fact that neutrons reflected by a tamper take so long to leave the core, be reflected, and return that in that interval the chain reaction has grown exponentially, making the returning neutrons trivial. Morrison made it clear that the role of a neutron reflector is restricted largely to keeping the critical mass minimal at the moment the reaction starts, and thus to starting the reaction efficiently (with as much supercriticality as possible for a given mass of fissile material, since supercriticality depends on the ratio of the actual mass present to the critical mass at the moment the reaction begins), rather than to preventing neutron escape once the reaction is growing at an exponential rate. He also made it clear that the fission chain reaction always ends prematurely to some degree (hence without 100% fission efficiency), because thermal expansion of the bomb core soon makes it subcritical, quenching the reaction. The key to high efficiency is therefore to achieve the maximum possible degree of supercriticality at the start of the reaction, i.e. a configuration containing as many critical masses as possible.

This can be achieved by reducing the effective value of the critical mass while keeping the actual mass of fissile material constant; this is the route taken when a bomb core is compressed, and in high-yield, high-efficiency fission weapons it is achieved using a hollow fissile core surrounded by a layer of chemical explosive detonated at many points simultaneously. Such points were not always intuitively obvious during WWII, when the nearest thing the Manhattan Project had to a computer was a non-electronic punched-card sorting system (mechanical, driven by electric motors, but carrying no information as electrical signals). Initially the person in charge of it spent months playing about with it, producing logarithmic tables, until Oppenheimer fired that person and put Feynman in charge of making efficiency calculations for atomic bomb cores.

In 1953, Morrison had to testify before the U.S. Congressional hearings on "Subversive Influence in the Educational Process" (Hearings before the Subcommittee to Investigate the Administration of the Internal Security Act and other Internal Security Laws of the Committee on the Judiciary, US Senate, 83rd Congress). Morrison admitted having been a communist party member at college. In earlier hearings before the same committee, Morrison was accused of hyping the effects of nuclear weapons in Japan for political purposes. See http://writing.upenn.edu/~afilreis/50s/morrison.html
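Morrison's timing argument about the tamper can be sketched numerically. All the numbers below are assumed round figures for illustration only (not weapon data): a neutron generation time of ~10 ns in a fast metal assembly, a multiplication factor of ~1.5 per generation, and a core-to-reflector round trip (including scattering time in the tamper) of ~100 ns:

```python
# Illustrative sketch of Morrison's point: by the time a neutron that escaped
# the core has crossed to the reflector and returned, the chain reaction in
# the core has already multiplied many-fold, so the returning neutron is a
# small correction. All figures are assumed round numbers, not weapon data.
GEN_TIME_NS = 10.0       # assumed neutron generation time, fast assembly
K = 1.5                  # assumed neutrons produced per neutron per generation
ROUND_TRIP_NS = 100.0    # assumed core -> reflector -> core transit time

generations_elapsed = ROUND_TRIP_NS / GEN_TIME_NS
growth_factor = K ** generations_elapsed   # core population growth meanwhile

print(f"Core population grows {growth_factor:.1f}x during one round trip")
print(f"Relative weight of a returning neutron: {1 / growth_factor:.1%}")
```

With these assumptions the core population grows nearly 60-fold during one round trip, so a returning neutron carries under 2% of the weight it had on leaving, which is why the reflector matters mainly for setting the initial supercriticality rather than for containing the growing chain.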

Philip Morrison, a Cornell Professor of Physics, expresses doubts about atomic warfare and then faces a Congressional anticommunist investigating committee, 1952
a brief excerpt from: SUBVERSIVE INFLUENCE IN THE EDUCATIONAL PROCESS (Hearings before the Subcommittee to Investigate the Administration of the Internal Security Act and other Internal Security Laws of the Committee on the Judiciary, US Senate, 82nd Congress, 2nd session, Sept. 8, 9, 10, 23, 24, 25, and Oct. 13, 1952; US Govt Printing Office, 1952)


-----------------------------------
Mr. Morris. Did you contribute an article to the Scientific American?

Dr. Morrison. I have had it published. I don't know if you call that contributing or not.

Mr. Morris. Did you write a review of a book by an Englishman named P.M.S. Blackett, entitled "Fear, War, and the Bomb?"

Dr. Morrison. I reviewed P.M.S. Blackett's book for the Herald Tribune and for the Bulletin of Atomic Scientists.

Mr. Morris. And you praised that book?

Dr. Morrison. I said that book had many excellent things in it. I also criticized an amendment. I wrote an honest review of the book.

Mr. Morris. Mr. Chairman, may that review of Dr. Morrison of P.M.S. Blackett's book entitled "Fear, War, and the Bomb" be put into the record?

The Chairman. It may be made part of the record.

(The material referred to follows:)


"BLACKETT'S ANALYSIS OF THE ISSUES"
by Philip Morrison
[Bulletin of Atomic Scientists, February 1949]

It is 3 years since the writing of the first extensive political work of the atomic scientists: One World or None. Now the same publishers put out the American edition of a book by another scientist, the distinguished, well-informed, and earnest P.M.S. Blackett of Great Britain. As a contributor to the first book, I feel no proprietary pangs in urging all those who bought or borrowed it--and there were many--to get hold of the Blackett book.

It is written at a sadder time, and perhaps a wiser one. It is written by a man whose experience is both that of a physicist and that of a military man, and who is no American, but an Englishman, willing to take a somewhat more critical position on the issues of the day than almost any American scientist has publicly done. It is a book which does Professor Blackett credit for its thoughtfulness and scope, even though as he himself points out it is by no means "the whole truth." Read it if you wish to have an opinion on the issues of atomic energy.

My piece in One World or None was the description of the effect of a single atomic bomb on New York City. It is a frightening article, as I have many times tested by direct observation. Yet it is a major thesis of the Blackett book--and I believe a correct thesis--that even a thousand bombs will not of themselves decide the issue of a major war. We said there is no defense, and we meant it. It is still true. But we spoke in a different language from the language of Blackett. We did not speak in terms of strategy, in terms of overall economies, in terms of production and territorial conquest. We spoke of the impact of the bomb on the homes and the hopes of men and women.

I wrote of the lingering death of the radiation casualties, of the horrible flash burns, of the human wretchedness and misery that every atomic bomb will leave near its ground zero. Against this misery there is indeed no real defense. Neither our oceans nor our radar nor our fighters can keep us intact through another major war. But--and I quote Blackett (p. 159): "The very effective campaign, largely initiated by the atomic scientists themselves, to make the world aware of the terrible dangers of atomic bombs, played an important part in bringing pressure to bear on the American Government to propose measures to control atomic weapons and to take them out of the hands of the military."


-----------------------------------
The hearing transcript provides this note on Morrison:
"Professor Morrison is a nuclear physicist who took part in the design and fabrication of the bomb at Los AIamos Laboratory. He is now a member of the Physics Department at Cornell University."


Much of the effort made by people to publish nuclear weapon design secrets seems to be motivated by anti-deterrence sentiments.

Information on details of nuclear weapon design is not needed to justify civil defence/defense efforts.

 
At 11:57 am, Blogger nige said...

Some historically scientific material of great relevance to this post is to be found in the book by physicists Dr. Edward Teller and Dr. Albert L. Latter, Our Nuclear Future ... Facts, Dangers and Opportunities, Criterion Books, New York, 1958:

A very prescient passage from page 119:

"It is possible that radiation of less than a certain intensity does not cause bone cancer or leukemia at all. In the past small doses of radiation have often been regarded as beneficial. This is not supported by any scientific evidence [as of 1958]. Today many well-informed people believe [without any evidence, i.e. on the basis of pure ignorance] that radiation is harmful even in the smallest amounts. This statement has been repeated [by mainstream "professional" cranks, who haven't grasped the subtle difference between fact-based science and authority-based religion/belief, and that no amount of "professional" dogma can overrule the need for fact based evidence in science, unlike subjective fields like politics/education/religion, where the student must give answers in exams which confirm to groupthink ideology, not to the facts if the facts are different to the mainstream consensus behind the examinations board; students who pass such exams by giving the "right" answers to subjective controversies are often instilled with a confusion between what is fact and what is speculative dogma, and as a result they defend crackpot mainstream beliefs as if those beliefs were science, not lies: the only way they have to defend such lies is by personal abuse of those who factual evidence and by lying, since they have no factual evidence, no scientific basis for arguing their case, just vacuous assertions based on ignorance and a refusal to read the facts and act upon them] in an authoritive manner. Actually there can be little doubt that radiation hurts the individual cell. But a living being is a most complex thing. Damage to a small fraction of the cells might be beneficial to the whole organism. Some experiments on mice seem to show that exposure to a little radiation increases the life expectancy of the animals. Scientific truth is firm - when it is complete. 
The evidence of what a little radiation will do in a complex animal like a human being is in an early and uncertain state."

On pages 121-122, the book points out that Denver in the United States is at an altitude of 5000 feet above sea level, and so receives 43% more hazardous cosmic radiation (because there is less air shielding between it and outer space) than you get at sea level.

The bone cancer and leukemia rates in Denver, where the 5000 feet altitude caused a 43% increase in cosmic radiation, were significantly lower than those in the sea level cities of San Francisco and New Orleans in 1947 (before any nuclear test fallout arrived).

For example, there were 10.3 leukemia cases diagnosed per 100,000 of population in San Francisco in 1947, and only 6.4 in Denver.
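As a quick arithmetic check of those 1947 figures (a sketch using only the rates quoted above):

```python
# Arithmetic check of the 1947 figures quoted above (cases per 100,000).
sf_leukemia = 10.3      # San Francisco, sea level
denver_leukemia = 6.4   # Denver, ~5000 ft, ~43% more cosmic radiation

# Despite the higher cosmic-ray dose, Denver's rate was lower:
relative_rate = denver_leukemia / sf_leukemia
print(f"Denver rate is {relative_rate:.0%} of San Francisco's "
      f"({1 - relative_rate:.0%} lower), despite 43% more cosmic radiation")
```

This prints a Denver rate roughly 38% below San Francisco's, which is the point Teller and Latter draw on.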

On page 122, Drs. Teller and Latter analyse the results as follows:

"One possible explanation for the lower incidence of bone cancer and leukemia in Denver is that disruptive processes like radiation are not necessarily harmful in small enough doses. Cell deterioration and regrowth go on all the time in living creatures. A slight acceleration of these processes could conceivably be beneficial to the organism."

Actually, the mechanism is more subtle: protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals, including mice, rats and dogs. P53 continually repairs breaks in DNA, which easily breaks at body temperature due to free radicals produced naturally in various ways and also as a result of ionisation caused by radiation hitting water and other molecules in the body. Cancer occurs when several breaks in DNA happen to occur by chance at nearly the same time, giving several loose ends which P53 repairs incorrectly, causing a mutation. This cannot occur when only one break occurs, because only two loose ends are produced, and P53 will reattach them correctly. If low-LET ionising radiation levels are increased to a certain extent, causing more single strand breaks, P53 works faster and is able to deal with breaks as they occur, so that multiple broken strand ends do not arise. This prevents DNA strands being repaired incorrectly, and prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation of course overloads the P53 repair mechanism, and then it cannot repair breaks as they occur, so multiple breaks begin to appear and loose ends of DNA are wrongly connected by P53, causing an increased cancer risk. Obviously there is a statistical risk: quite a lot of wrongly reassembled broken DNA needs to occur before the result causes cancer. Many wrongly assembled DNA strands simply result in cell death when the cell tries to divide, instead of allowing endless divisions into defective cells, i.e. cancer cells. Besides P53, there are other proteins involved in DNA repair after damage. Over 50% of all cancers, however, result from mutated forms of P53 which are unable to repair damaged DNA.

So it is clear that most cancers occur as a result of a rapid double break to the TP53 gene on human chromosome 17. The cell then divides normally, but the resulting cell produces its P53 from a mutated TP53 gene and thus produces a flawed P53 protein, which is unable to properly repair further damage in the DNA of the cell. As a result, these cells are subjected to cumulative damage and mutations from free radicals, and are relatively likely to become cancer cells. The stimulation of P53 with low-LET (weakly ionising) radiation can boost its efficiency, preventing multiple strand breaks from having time to occur, because breaks get repaired faster before a backlog can accumulate. This is a homeostasis effect: an increase in the rate of weak ionisation from low-LET radiation naturally causes the body to slightly over-respond, increasing the rate of P53 repairs non-linearly (similarly, the body over-responds for a long time after an infection by boosting the white cell count to levels higher than those which existed before the infection). This disproportionate over-compensation boosts the body's ability to cope with other causes of DNA damage, such as natural causes, so the net effect is a reduction in natural cancer rates that far outweighs the trivial radiation damage at low dose rates. Hence, the overall cancer risk at low dose rates of low-LET radiation is less than it would be in the absence of the radiation.
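A toy numerical sketch of this over-compensation argument (the gain and saturation capacity below are my illustrative assumptions, not measured biological values):

```python
# Toy model of the over-compensating P53 repair response described above.
# The gain and capacity values are illustrative assumptions, not data.

def damage_rate(dose):
    """Extra DNA strand breaks per unit time, taken proportional to dose rate."""
    return dose

def repair_rate(dose, gain=1.5, capacity=10.0):
    """Repair response: over-compensates (gain > 1) at low dose rates,
    but saturates at a fixed maximum capacity at high dose rates."""
    return min(gain * dose, capacity)

def net_extra_damage(dose):
    """Unrepaired extra damage. A negative value means the boosted repair
    more than cancels the radiation damage, leaving spare capacity to mop
    up the ever-present natural background damage as well."""
    return damage_rate(dose) - repair_rate(dose)

assert net_extra_damage(2.0) < 0    # low dose rate: net benefit
assert net_extra_damage(50.0) > 0   # high dose rate: repair saturated, net harm
```

The crossover where the repair response saturates is the toy-model analogue of the dose rate above which the cancer risk starts to rise.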

Teller and Latter then point out that if there is an effect of the enhanced cosmic radiation in Denver on the leukemia and bone cancer rate as compared to lower altitude cities, "the effect is too small to be noticed compared to other effects."

In other words, this factual data as of 1947 set a limit on how bad the radiation-induced leukemia rate could be: if it existed at all, it was dwarfed by "noise" in the data. Whenever some signal gets drowned by "noise" in data, then the real scientist starts to investigate the "noise" which is more important than the trivial signal. (This was directly how the big bang was confirmed, when the microwave background noise in the sky was investigated in the mid-1960s and found to be severely red-shifted fireball radiation from the big bang.)

On page 124, it is pointed out that mortality statistics - which don't show a decrease in cancer risks from living in places of high cosmic radiation exposure like Denver - and which therefore don't show any negative risks from low level radiation, do show correlations between other things. For example, being 10% overweight reduces life expectancy by 1.5 years, while smoking one pack of cigarettes a day reduces life expectancy by 9 years (equivalent to an average of 15 minutes reduction in life per cigarette smoked).
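The "15 minutes per cigarette" figure can be roughly reproduced from the 9-year figure if you assume a smoking duration of about 45 years at a pack a day (the duration is my assumption for illustration, not stated in the book):

```python
# Rough check of the "~15 minutes per cigarette" figure quoted above.
# The ~45-year smoking duration is an assumed value for illustration.
years_lost = 9
minutes_lost = years_lost * 365.25 * 24 * 60   # ~4.73 million minutes

cigs_per_day = 20                              # one pack a day
smoking_years = 45                             # assumed duration of the habit
cigs_smoked = cigs_per_day * 365.25 * smoking_years

print(f"{minutes_lost / cigs_smoked:.0f} minutes lost per cigarette")
```

This gives roughly 14 minutes per cigarette, consistent with the quoted average of about 15.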

These are things which are real, statistically significant risks. Low-LET radiation at low dose rates isn't that kind of problem (to say the very least of it).

 
At 5:35 pm, Blogger nige said...

For more about Lewis's non-threshold propaganda campaign "and the debate about nuclear weapons testing", see:

http://etd.caltech.edu/etd/available/etd-03292004-111416/unrestricted/LewisandFallout.pdf

EDWARD LEWIS AND RADIOACTIVE FALLOUT
THE IMPACT OF CALTECH BIOLOGISTS ON THE DEBATE
OVER NUCLEAR WEAPONS TESTING IN THE 1950s AND 60s
Thesis by
Jennifer Caron
In Partial Fulfillment of the Requirements for the
degree of
Bachelor of Science
Science, Ethics, and Society Option
CALIFORNIA INSTITUTE OF TECHNOLOGY
Pasadena, California
2003
(Presented January 8, 2003)


"ACKNOWLEDGEMENTS
Professor Ed Lewis, I am deeply grateful to you for sharing your story and spending
hours talking to me. ...

"ABSTRACT
The work of Caltech biologists, particularly Edward Lewis, on leukemia and ionizing radiation transformed the public debate over nuclear weapons testing. The United States began testing hydrogen bombs in 1952, sending radioactive fallout around the globe. Earlier more localized fallout was generated starting in 1945 from tests of atomic weapons at Nevada test sites. The Atomic Energy Commission claimed the tests would not harm human health. Geneticists knew from animal and plant experiments that radiation can cause both illness and gene mutations. They spoke out to warn the policymakers and the public. Edward Lewis used data from four independent populations exposed to radiation to demonstrate that the incidence of leukemia was linearly related to the accumulated dose of radiation. He argued that this implied that leukemia resulted from a somatic gene mutation. Since there was no evidence for the existence of a threshold for the induction of gene mutations down to doses as low as 25 r, there was unlikely to be a threshold for the induction of leukemia. This was the first serious challenge to the concept that there would be a threshold for the induction of cancer by ionizing radiation. Outspoken scientists, including Linus Pauling, used Lewis's risk estimate to inform the public about the danger of nuclear fallout by estimating the number of leukemia deaths that would be caused by the test detonations. In May of 1957 Lewis's analysis of the radiation-induced human leukemia data was published as a lead article in Science magazine. In June he presented it before the Joint Committee on Atomic Energy of the US Congress."
(Emphasis added to key points.)

Page 13:

"The most controversial aspect of his analysis was the linear dose-response curve. This relationship made sense to geneticists who had found a linear relationship between
radiation and mutations in Drosophila down to 25 rad (Stern and Spencer). Additionally, it fit with the hypothesis of Muller that cancer could result from somatic mutations. This was not the accepted idea in other scientific and medical communities. Rather, as the official voice, the AEC medical doctors and scientists promoted the assumption that there
would be a threshold below which radiation would do no harm, just as there is frequently such a threshold in chemical toxicology because the body can process small quantities of toxins like alcohol. The AEC vocally assumed and defended the threshold hypothesis;
furthermore, they seem to have assumed that the amount of radiation received by Americans from fallout would be less than the threshold. Lewis found no evidence for such a threshold, and the AEC scientists were unable to offer any."

(Emphasis added to highlight Lewis's failure to discover the facts about low-level radiation, and the misrepresentation of that failure as a positive finding rather than an expression of ignorance. If a scientist fails to find evidence which in fact exists, that is hardly an accomplishment to be hyped or applauded. Lewis failed to find evidence of a threshold because the dosimetry then available from Hiroshima and Nagasaki was too crude and inaccurate to produce accurate, detailed results. If Lewis had made efforts to obtain the facts, instead of presenting ignorant error as fact and crusading to promote it in journals like Science and in testimony to U.S. Congressional Hearings, he would have been doing science, not pseudoscience.)

 
At 9:54 am, Blogger nige said...

To make the mechanism easily understood, one simple analogy to the roles of protein P53, cancer and radiation is a gasoline dump:

1. DNA-damaging free radicals are equivalent to a source of sparks which is always present naturally, and are caused by many interactions including those of ionizing radiation produced by many other causes in the body.

2. Cancer is equivalent to the fire you get if the sparks are allowed to ignite the gasoline, i.e. if the free radicals are allowed to damage DNA without the damage being repaired.

3. Protein P53 is equivalent to a fire suppression system which is constantly damping out the sparks, or repairing the damaged DNA so that cancer doesn't occur.

In this way of thinking, the "cause" of cancer will be down to a failure of the P53 to repair the damage.

Naturally, the majority of cancers involve cells containing mutated P53: to get cancer naturally you usually need to have a mutation in a cell's P53 protein, which stops P53 from repairing DNA.

In other words, cancer appears when the cancer suppressor is damaged.

In nuclear radiation induced cancer, the mechanism is just slightly different: radiation induced cancer occurs where the radiation level is so great that it overwhelms the ability of P53 to repair the damage to the DNA.

However, there is another effect. As the radiation level increases, the rate of P53 repairs increases slightly faster than the DNA damage rate. This is because the body naturally detects radiation damage as an increase in free radicals (chemical-type poisoning) and over-compensates for this increase by dramatically increasing the P53 activity in repairing damaged DNA (cf. the old adage "a little of what does you harm, makes you stronger").

Only when the radiation level is higher than the maximum rate at which P53 can repair broken DNA does the cancer rate start to rise. Up to that level, the increasing P53 activity over-compensates for the radiation damage by repairing DNA much more quickly than normal, in an attempt to return the body to homeostasis.

As an analogy, think about flu: once you get infected the body's immune system must over-compensate, not just "tread water" in just keeping the infection level from rising.

It's inadequate for the immune system to respond to rampant infection by merely increasing the attacks on bacteria (which surge through tissues damaged by the flu virus, and cause the worst symptoms) at the same rate as the bacteria are growing.

If the rate of response of the immune system was the same as that of the cause of the problem, then the immune system would merely be containing the infection and preventing it from getting worse.

That's not good enough.

Instead, the immune system needs to increase the rate of attack on bacteria to a higher value than the rate at which the bacteria are multiplying, in order not merely to prevent the infection from getting worse, but to actually kill off the bacteria faster than they multiply. Only in this case can the population of bacteria decrease, instead of remaining constant, as would be the case if the immune system response were merely matched to the infection level.

Similarly, in a war, if you only respond with exactly the same amount of force as your enemy, you will be able to prevent the enemy winning, but you won't be able to end the war! The battle will go on without end. The only way to win is for one side to use more force than the other. If both sides always remain equal in strength, then the war will last forever.

Protein P53 inside individual cell nuclei, by analogy to the role of T-cells and the white blood cells of the immune system, must over-compensate for increasing problems in order to return the body to normal.

It is no good if P53's response is identical to the rate of production of DNA damage. P53 must over-compensate to any increased damage, so that the overall amount of excess DNA damage, once it is detected, begins to decrease with time instead of merely remaining constant.

Homeostasis is used in many organs and systems. In order for normal conditions to be maintained, as soon as any problem is experienced, the body must over-compensate to push conditions back towards the original conditions, not merely keep problems from getting worse.

It's not good enough to merely negate additional damage. The body has to over-compensate in order to not just prevent the problem getting worse, but to restore health. And that is precisely what happens if the injury is not overwhelmingly severe.

Once a fire starts, you don't want to simply respond by preventing it from getting bigger. You want, instead, to make the fire smaller. If the rate of growth of the fire is dF/dt, you don't want your fire-fighting response to equal dF/dt, or you will simply be preventing the fire from getting bigger. What you want is to respond at a rate which exceeds the rate of increase of the fire, so that the size of the fire falls with time instead of remaining constant.
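The dF/dt argument can be sketched numerically (a toy illustration with arbitrary rates, not a physical fire model):

```python
# Discrete-time sketch of the dF/dt argument above: the fire only
# shrinks when the suppression rate exceeds the growth rate.
def fire_size_over_time(initial, growth_rate, suppression_rate, steps=10):
    size = initial
    history = [size]
    for _ in range(steps):
        size = max(0.0, size + growth_rate - suppression_rate)
        history.append(size)
    return history

matched = fire_size_over_time(5.0, growth_rate=1.0, suppression_rate=1.0)
excess  = fire_size_over_time(5.0, growth_rate=1.0, suppression_rate=1.5)

assert matched[-1] == matched[0]   # matched response: fire merely held constant
assert excess[-1] < excess[0]      # over-compensation: fire actually shrinks
```

The same inequality is what the body's over-compensating repair response has to satisfy to reduce, rather than merely contain, accumulated damage.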

Similarly, with radiation or any other problem, the body's response at low levels is to over-compensate. This over-compensation will actually reduce the natural cancer rate at low radiation levels.

At very high radiation levels, this effect disappears and the net response is negative, because damage occurs at such a high rate that the P53 repair mechanism is overloaded and is increasingly unable to repair the damage.

Another analogy to P53 is the brakes of a vehicle. Cancer in this analogy is like an automobile crash. If the brakes are defective, that can cause a crash. The ability of a vehicle driver to avoid a crash depends to a considerable extent upon having good brakes. The ability of the brakes to prevent a crash may be impaired by various factors, such as excessive speed or oil on the road. If the brakes are merely capable of preventing the speed from increasing, they are not good enough. Brakes must be able to do more than cancel out acceleration and keep the velocity constant. Brakes must be able to bring about a deceleration, to slow a vehicle.

It's pretty obvious that protein P53 is able to bring about a net reduction in the natural cancer rate when exposed to low-LET ionizing radiation at "low" dose rates (which may still be many times the natural background dose rate).

Once the excessive number of free radicals from radiation is detected as internal chemical-type poisoning, P53 repair processes are greatly enhanced to over-compensate for the damage rate, in order to reduce the total amount of damage (rather than merely keeping it in check, or constant). By analogy, in any infection, homeostasis mechanisms act to restore equilibrium not by keeping the damage level constant, but by increasing the repair rate so that it exceeds the rate of damage. Only in this way can the total amount of damage be reduced.

 
At 11:23 am, Blogger nige said...

copy of a comment in moderation queue to:

http://sovietologist.blogspot.com/2008/04/funnist-thing-ever-said-about-herman.html

dv8 2xl:

If you actually read Kahn's most important work, On Thermonuclear War, the key arguments against wishful thinking are based on facts, not "opinions".

Fact: arms control was tried throughout the 1930s to enable the world to "live in peace" with the Nazis.

Fact: the Nazis simply agreed to everything then broke their word, broke the written agreements they gave to Prime Minister Chamberlain at Munich, etc.

Fact: arms control does not protect you from other countries with secret rearmament programs.

Fact: Hitler's Germany was able to almost instantly convert peacetime factories to munitions factories, simply by preparing the plans and blueprints in advance. No practical arms-inspection policy can get around that.

Fact: even if arms control and pacifism had prevented World War II, which of course they failed to do, millions would still have died in concentration camps, and "peaceful invasions" could not have been prevented.

Fact: if you want to prevent evil, you need leverage, not worthless paper agreements. The only leverage the Stalins and Hitlers understand is bombs. Everything else is propaganda and lies as far as they are concerned. Dictators aren't interested in being seen as respectable nice guys who stick to contracts.

As Herman Kahn wrote, Khrushchev's proposal for arms control, whereby no inspections of Russian disarmament were allowed and anyone cheating would (in Khrushchev's words) be expected to "cover themselves in shame", was a hoax. The Soviets never covered themselves in shame. They broke the testing cessation in 1961 and detonated a 50 megaton bomb. They were proud, not covered in shame.

Fact: the only way to encourage peace and freedom is to carry a big stick and be seen to be ready to actually USE the big stick. Having civil defence, even just improvised plans like the Kearny car-over-trench shelter that anyone can fix up in the time between a bomb going off and the fallout arriving and building up to a hazardous level downwind, is crucial. Three feet of dirt and you're safe. If you look at the fallout patterns actually measured after nuclear tests with the average yield of stockpiled bombs today, the danger is greatly exaggerated. Also, the fallout in hazardous areas is clearly visible. Walk crosswind, and you can get out of the danger area before you get a dangerous dose. All nuclear effects are grossly exaggerated. It's pretty easy to grasp this when you understand the physics, instead of believing uneducated hype and spin.

Unless you can find some wood-frame cities like Hiroshima and Nagasaki to detonate the bombs over, the effects are not as impressive as the hype claims. Even in Hiroshima and Nagasaki, any kind of screening from the thermal flash (whose effects were stopped by just a single leaf, a thin white shirt, or a sheet of paper) cut casualty rates massively. Duck-and-cover does work. Nuclear radiation produced high mortality only when combined with thermal burns: this is the "synergism" effect, because the mechanism for death is that radiation reduces the white blood cell count at just the time when skin burn blisters burst and become infected. If you avoid thermal burns, the LD50 for nuclear radiation is about three times higher. That's why ducking and covering is so vital. It also reduces the amount of debris that can hit you in the face (like flying glass). Most of the people killed in Hiroshima and Nagasaki looked at the fireball, often through glass windows, as the blast wave was silently approaching. Films of nuclear explosions which superimpose the sound of the blast on to the fireball with no delay time mislead viewers about the time-sequence of the effects of nuclear weapons.

Similarly, you get some time after a nuclear explosion to evacuate or prepare an improvised shelter, before the fallout even starts to arrive. Philip LaRiviere in the 1950s measured the different arrival times and times of maximum fallout dose rate after a range of Nevada and Pacific nuclear tests in report USNRDL-TR-137 ("The Relationship of Time of Peak Activity from Fallout to Time of Arrival", U.S. Naval Radiological Defense Laboratory, 28 February 1957). On average, even once fallout begins to arrive, it settles diffusively and takes a long time to reach peak activity: the further delay from fallout arrival to the peak radiation level is about the same as the time taken for the fallout to begin to arrive in the first place.
So as with the delayed double heat flash pulse and the delayed arrival of the blast, you have enough time to protect yourself or evacuate from a potential downwind fallout area. If fallout begins to arrive, you can see it. It's clearly visible wherever the dose rate is life-threatening.
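Reading the USNRDL-TR-137 rule quoted above as "delay from arrival to peak is roughly equal to the arrival time itself", the peak falls at roughly twice the arrival time. A minimal sketch under that assumption (actual test data scatter widely around this average):

```python
# Rule of thumb paraphrased above from USNRDL-TR-137: the further delay
# from fallout arrival to peak dose rate is roughly equal to the arrival
# time itself, so the peak occurs at about twice the arrival time.
def estimated_peak_time(arrival_time_hours):
    """Estimated time of peak dose rate after burst (hours)."""
    return 2.0 * arrival_time_hours

# e.g. fallout first arriving 3 hours after burst peaks around H+6,
# leaving time to improvise shelter or to walk crosswind out of the area.
assert estimated_peak_time(3.0) == 6.0
```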

The point is, nuclear weapons are not automatically going to produce a lot of civilian casualties if there is a reasonable civil defense education in the reliable (nuclear test based) facts.

If you are going to deter dictators from walking all over you like Hitler and Stalin, then you need to be tough. Toughness is the only thing that deters the sort of trash who don't care about human values at all.

 
At 12:17 pm, Blogger nige said...

copy of a comment in moderation queue to:

http://sovietologist.blogspot.com/2008/04/new-toon-et-al-study-on-regional.html

The TTAPS (Turco, Toon, Ackerman, Pollack and Sagan) studies have been wrong from day 1. In 1983 they used flawed assumptions for everything, from the absorption coefficient for sunlight by soot, to ignoring scavenging and atmospheric turbulence, etc. They also exaggerated the burnability of the fuel loading.

When a building collapses, most of the combustible material is buried under tons of debris and dust and can't burn. You don't get firestorms anymore like you did in wood-frame buildings such as those in the old, medieval part of Hamburg or Dresden, or Hiroshima and Nagasaki.

In addition, they ignored the fact that for surface bursts (unlike the air bursts over Japan) the EMP deposition region overlaps the ground surface, coupling microsecond surges of thousands of amps into all the electrical conductors branching out throughout the city; before the crater had even formed, these surges would blow all the fuses/circuit breakers and cut off the electricity supply to buildings, reducing the fire risk.

For a surface burst, the fireball elevation angle is such that most buildings will be "shadowed" by other buildings, preventing ignition.

In their 1989 paper, the TTAPS team failed to retract their earlier errors and apologise for hyping poorly researched trash, and instead changed the targeting assumptions to oil refineries, in an attempt to maintain some climatic effects. It was still wrong! The smoke from mass oil refinery fires doesn't hang around freezing the ground for months. It gets blown around and dispersed by atmospheric winds and turbulence, and it gets washed out by rainfall.

Another popular myth is that the entire crater volume gets converted into dust which enters the stratosphere. Actually, 99% of the apparent crater volume is due to compression of the ground and material dumped around the crater to form the crater "lip" and the ejecta zone surrounding the lip. Only 1% of the cratered mass ends up in the atmosphere, and that forms the fallout, 70% of which is deposited within 24 hours.
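The arithmetic implied by those percentages (a sketch using only the figures quoted above):

```python
# Arithmetic implied by the crater figures above: ~1% of the apparent
# cratered mass is lofted into the atmosphere, and ~70% of the lofted
# material is deposited as local fallout within 24 hours.
lofted_fraction = 0.01
deposited_within_24h = 0.70

fallout_24h = lofted_fraction * deposited_within_24h
print(f"{fallout_24h:.1%} of the cratered mass falls out within 24 hours")
```

So under a percent of the apparent crater mass ends up as 24-hour fallout, with the rest accounted for by ground compression, the lip, and the ejecta zone.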

On the topic of ozone depletion, please notice that the prompt gamma rays from a nuclear explosion ionize the air, creating ozone. This effect seriously modifies the early-time history of the thermal pulse output, and has been intensively studied in nuclear tests (although early studies were classified).

Hence, the production of ozone-destroying nitrogen oxides in the air shock wave at high overpressures must be balanced against the production of ozone by gamma rays.

For increasing burst altitude, the amount of ozone produced by a nuclear explosion becomes greater than the nitrogen oxide ozone depletion effect, because at high altitudes the air shock wave does not reach sufficient overpressure to produce nitrogen oxides (the equilibrium concentration of nitrogen oxides is a strong function of the pressure).

Hence, high altitude bursts - which have been threatened on the West by Russian leaders due to EMP effects - will actually increase the amount of ozone in the stratosphere!

In addition, the net ozone depletion by a low altitude burst is a lot less than 1970s and 1980s predictions (which ignored the production of ozone in nuclear explosions) suggested. Much of the nitrogen oxide combines with water vapour in the fireball to form nitric acid, which eventually gets washed out of the atmosphere by rain and doesn't affect ozone. See:

"Nitrogen oxides, nuclear weapon testing, Concorde and stratospheric ozone" P. Goldsmith, A. F. Tuck, J. S. Foot, E. L. Simmons & R. L. Newson, published in Nature, v. 244, issue 5418, pp. 545-551, 31 August 1973:

"ALTHOUGH AMOUNTS OF NITROGEN OXIDES EQUIVALENT TO THE OUTPUT FROM MANY CONCORDES WERE RELEASED INTO THE ATMOSPHERE WHEN NUCLEAR TESTING WAS AT ITS PEAK, THE AMOUNT OF OZONE IN THE ATMOSPHERE WAS NOT AFFECTED."

Below is an extract from a British Civil Defence magazine article written by George R. Stanbury, head of civil defence research on the British "Operation Hurricane" nuclear bomb test at Monte Bello, and before that an expert on the incendiary bombing of Britain in World War II.

'Restricted' classified U.K. Home Office Scientific Adviser's Branch journal Fission Fragments, W. F. Greenhalgh, Editor, London, Issue Number 3, August 1962, pages 22-26:

'The fire hazard from nuclear weapons

'by G. R. Stanbury, BSc, ARCS, F.Inst.P.

'We have often been accused of underestimating the fire situation from nuclear attack. We hope to show that there is good scientific justification for the assessments we have made, and we are unrepentant in spite of the television utterances of renowned academic scientists who know little about fire. ...

'Firstly ... the collapse of buildings would snuff out any incipient fires. Air cannot get into a pile of rubble, 80% of which is incombustible anyway. This is not just guess work; it is the result of a very complete study of some 1,600 flying bomb [V1 cruise missile] incidents in London supported by a wealth of experience gained generally in the last war.

'Secondly, there is a considerable degree of shielding of one building by another in general.

'Thirdly, even when the windows of a building can "see" the fireball, and something inside is ignited, it by no means follows that a continuing and destructive fire will develop.

'The effect of shielding in a built-up area was strikingly demonstrated by the firemen of Birmingham about 10 years ago with a 144:1 scale model of a sector of their city which they built themselves; when they put a powerful lamp in the appropriate position for an air burst they found that over 50% of the buildings were completely shielded. More recently a similar study was made in Liverpool over a much larger area, not with a model, but using the very detailed information provided by fire insurance maps. The result was similar.

'It is not so easy to assess the chance of a continuing fire. A window of two square metres would let in about 10^5 calories at the 5 cal/(cm)^2 range. The heat liberated by one magnesium incendiary bomb is 30 times this and even with the incendiary bomb the chance of a continuing fire developing in a small room is only 1 in 5; in a large room it is very much less.

'Thus even if thermal radiation does fall on easily inflammable material which ignites, the chance of a continuing fire developing is still quite small. In the Birmingham and Liverpool studies, where the most generous values of fire-starting chances were used, the fraction of buildings set on fire was rarely higher than 1 in 20.

'And this is the basis of the assertion [in Nuclear Weapons] that we do not think that fire storms are likely to be started in British cities by nuclear explosions, because in each of the five raids in which fire storms occurred (four on Germany - Hamburg, Darmstadt, Kassel, Wuppertal and a "possible" in Dresden, plus Hiroshima in Japan - it may be significant that all these towns had a period of hot dry weather before the raid) the initial fire density was much nearer 1 in 2. Take Hamburg for example:

'On the night of 27/28th July 1943, by some extraordinary chance, 190 tons of bombs were dropped into one square mile of Hamburg. This square mile contained 6,000 buildings, many of which were [multistorey wooden] medieval.

'A density of greater than 70 tons/sq. mile had not been achieved before even in some of the major fire raids, and was only exceeded on a few occasions subsequently. The effect of these bombs is best shown in the following diagram, each step of which is based on sound trials and operational experience of the weapons concerned.

'102 tons of high explosive bombs dropped -> 100 fires

'88 tons of incendiary bombs dropped, of which:

'48 tons of 4 pound magnesium bombs = 27,000 bombs -> 8,000 hit buildings -> 1,600 fires

'40 tons of 30 pound gel bombs = 3,000 bombs -> 900 hit buildings -> 800 fires

'Total = 2,500 fires

'Thus almost every other building [1 in 2 buildings] was set on fire during the raid itself, and when this happens it seems that nothing can prevent the fires from joining together, engulfing the whole area and producing a fire storm (over Hamburg the column of smoke, observed from aircraft, was 1.5 miles in diameter at its base and 13,000 feet high; eyewitnesses on the ground reported that trees were uprooted by the inrushing air).

'When the density was 70 tons/square mile or less the proportion of buildings fired during the raid was about 1 in 8 or less and under these circumstances, although extensive areas were burned out, the situation was controlled, escape routes were kept open and there was no fire storm.'
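The arithmetic in the quoted lecture can be verified directly. A sketch, assuming the bomb tonnages are long tons of 2,240 lb (an assumption, but one that reproduces the quoted bomb counts):

```python
# Window ignition arithmetic: a 2 m^2 window at the 5 cal/cm^2 range.
heat_in = (2 * 100 * 100) * 5          # window area in cm^2 times cal/cm^2
print(heat_in)                          # 100,000 cal -- the quoted ~10^5 calories

# Hamburg raid bomb counts (assuming long tons of 2,240 lb).
LB_PER_TON = 2240
mag_bombs = 48 * LB_PER_TON // 4        # 4 lb magnesium bombs from 48 tons
gel_bombs = 40 * LB_PER_TON // 30       # 30 lb gel bombs from 40 tons
print(mag_bombs, gel_bombs)             # 26,880 and 2,986 -- the quoted ~27,000 and ~3,000

# Fire density: 100 + 1,600 + 800 = 2,500 fires among 6,000 buildings.
fire_density = (100 + 1600 + 800) / 6000
print(round(fire_density, 2))           # ~0.42, close to the "1 in 2" fire-storm condition
```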

 
At 12:47 pm, Blogger nige said...

Copy of a comment to:

http://sovietologist.blogspot.com/2008/04/new-toon-et-al-study-on-regional.html

My comment about the fact that high altitude nuclear explosions produce an excess of ozone (by gamma ray emission) without producing nitrogen oxides that destroy ozone (nitrogen oxide formation requires a very compressed shock wave, which can't occur in low density air at high altitude), needs the following reference:

U.S. Congress Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack, 2004. These EMP hearings discuss the politics, such as an outrageous threat allegedly made by the Russian politician and former Ambassador to the U.S., Vladimir Lukin, who said to the Americans in Vienna in May 1999: 'we have the ultimate ability to bring you down [by EMP from high altitude nuclear detonations]'.

 
At 10:10 am, Blogger nige said...

copy of a comment to

http://riofriospacetime.blogspot.com/2008/05/science-of-iron-man.html

"The Tokomak is a donut-shaped magnetic bottle for containing hot plasma. Controlled fusion has long held the promise of limitless energy, but requires temperatures and pressures similiar to the Sun's interior. Despite decades of work, controlled fusion remains as it was in 1964, just around the corner."

Controlled nuclear fusion by magnetic confinement of hot plasma is a joke. Strong magnetic fields are never perfectly uniform, and the plasma pressure needed to make deuterium and tritium nuclei fuse is immense, so instabilities always develop.

The situation is similar to trying to use a low-density fluid to compress a higher-density fluid: a form of Taylor (Rayleigh-Taylor) instability develops.

The magnetic field does not compress the plasma uniformly; the plasma breaks up into jets wherever the field is slightly weaker, and since the field can never be made perfectly uniform, this is inevitable.

It's like squeezing an orange with your hands. You don't end up with a uniformly compressed orange. You end up with juice squirting into somebody's eye.

The radioactive waste from a controlled nuclear fusion reactor, if it could be made to work efficiently, would in practical terms be even worse than that from nuclear fission!

At least the 300 fission products decay, as a mixture, faster than the inverse of time. The fission product dose rate falls as about t^{-1.2} where t is time after fission. In any case, fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:

"Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

- http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml
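The t^{-1.2} fission-product decay law quoted above is the Way-Wigner approximation. One consequence is the civil-defence "seven-ten rule": each seven-fold increase in time after fission cuts the mixed fission-product dose rate roughly ten-fold. A quick check:

```python
# Way-Wigner approximation: mixed fission-product dose rate ~ t^(-1.2).
def relative_dose_rate(t_hours, t_ref=1.0):
    """Dose rate at t_hours relative to the rate at t_ref hours after fission."""
    return (t_hours / t_ref) ** -1.2

# "Seven-ten rule": a 7x increase in time gives roughly a 10x drop in dose rate.
print(1 / relative_dose_rate(7))    # ~10.3
print(1 / relative_dose_rate(49))   # ~107 after two applications of the rule
```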

But for fusion, you get the accumulation of relatively long lived iron-59, iron-55, cobalt-60, nickel-63, and many other nuclides which are caused by the capture in reactor materials of high energy neutrons from the fusion process. E.g., the fusion of tritium and deuterium releases 17.6 MeV, of which 14.1 MeV is carried by the neutron. This massive neutron energy is to be compared to the thermalized neutrons of 0.025 eV energy! As a result, whereas in fission you can reprocess the fuel rods to extract the radioactive waste without the whole reactor becoming dangerously radioactive, in fusion the whole reactor becomes almost uniformly contaminated by neutron capture in the structural elements! There is nothing you can do about this.
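The 14.1 MeV figure follows from momentum conservation: the two fusion products fly apart with equal and opposite momenta, so the lighter neutron takes the larger share of the 17.6 MeV, in inverse proportion to its mass. A quick sketch:

```python
# D + T -> He-4 + n: the 17.6 MeV is shared in inverse proportion to mass,
# because the two products carry equal and opposite momenta.
Q = 17.6                 # MeV released per fusion event
m_he, m_n = 4.0, 1.0     # approximate product masses in atomic mass units

E_n = Q * m_he / (m_he + m_n)    # neutron's share (4/5 of the total)
E_he = Q * m_n / (m_he + m_n)    # helium nucleus's share (1/5 of the total)
print(E_n, E_he)                 # ~14.1 MeV and ~3.5 MeV
```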

Controlled nuclear fusion has a lot in common with string theory in terms of over-hype and failure. The most sensible way to use safe nuclear fusion energy is to further develop solar power and other ways of extracting the energy of the fusion already taking place in the sun's core.

 
At 5:22 pm, Blogger nige said...

copy of a comment to

http://backreaction.blogspot.com/2008/05/nuclear-power-return-of.html

"Nuclear's OK, but cars can't run on nuclear, so how can that really be a solution?" - Andrew

Nuclear power doesn't burn fossil fuels, which leaves more of those fuels for powering the internal combustion engine rather than generating electricity.

Cars can eventually (when fossil fuel costs make the price of gasoline unaffordable for most people) be fitted with electric motors running on efficient, low-weight rechargeable lithium-ion batteries, which can be recharged from mains electricity supplied by nuclear reactors.

Obviously, electric trains can run on nuclear generated electricity without any interim battery storage.

The trouble with nuclear power is that it is made excessively expensive by over-cautious safety precautions, and it is also the victim of lying propaganda from an environmental lobby which doesn't understand nuclear power in the proper context of natural background radiation levels and natural radon gas hazards, or even the naturally proven storage of intense radioactive waste for billions of years!

Fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:

"Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

- http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml

The data from Hiroshima and Nagasaki are strongest (i.e. carry the most statistical weight) at low doses, where they show a suppression effect and a threshold for low-LET (linear energy transfer) radiation such as gamma rays. See my post here for a discussion of the extensive evidence.

High-LET radiation like alpha particles deposits a lot of energy per unit path length through tissue, which can overwhelm the natural repair mechanism (mediated by the protein P53) that sticks broken DNA fragments back together. The main cancer risk arises from multiple DNA strand breaks, where fragments end up being rejoined in the wrong sequence, either killing the cell when it later tries to divide or, more seriously, causing cancer when the damaged cell divides out of control and produces a tumour.

But high-LET radiation like alpha particles is only a hazard internally, when radioactive material is inhaled or ingested. The alpha-emitting plutonium in a nuclear reactor is inside sealed, metal-clad fuel elements, and at no time is such waste a serious inhalation or ingestion hazard.

Gamma radiation, from evidence at Hiroshima and Nagasaki, as well as the Taiwan incident where 180 buildings lived in by 10,000 people for 20 years were constructed of steel which accidentally included intensely radioactive cobalt-60 from discarded radiotherapy sources, is low-LET radiation which does exhibit a threshold before any excess cancer risk (predominantly leukemia) shows up. There is evidence that the exact threshold dose effect for low-LET radiations such as gamma radiation depends on the dose rate at which the radiation is received, and not merely on the total dose. If the dose rate is producing DNA damage at a rate which is lower than the maximum rate at which P53 can repair DNA strand breaks, no excess cancer (above the natural cancer rate) occurs. The cancer risk depends on the proportion of the radiation dose which is above this threshold, and is proportional to that dose received at a rate exceeding the repairable DNA damage rate.
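The dose-rate threshold model described above can be sketched in a few lines. Note this is only an illustration of the argument, not a fitted model: the threshold rate and the exposure profiles below are purely hypothetical numbers.

```python
# Sketch of the dose-rate threshold model described above: only the part of the
# dose delivered FASTER than the DNA-repair capacity contributes to excess risk.
# The threshold rate and exposure profiles are illustrative, not measured values.
def effective_dose(dose_rates_per_hour, threshold_rate=0.01):
    """Sum the dose received in excess of the repairable rate (hourly samples)."""
    return sum(max(0.0, r - threshold_rate) for r in dose_rates_per_hour)

chronic = [0.005] * 200           # total dose 1.0, delivered slowly over 200 hours
acute = [0.5, 0.5]                # the same total dose 1.0, delivered in 2 hours
print(effective_dose(chronic))    # 0.0  -- below the repair rate, no excess risk
print(effective_dose(acute))      # 0.98 -- nearly all of the dose counts
```

On this model, two exposures with identical total dose give entirely different risks, which is the point made below about dose-only film badges.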

W.L. Chen, Y.C. Luan, M.C. Shieh, S.T. Chen, H.T. Kung, K.L. Soong, Y.C. Yeh, T.S. Chou, S.H. Mong, J.T. Wu, C.P. Sun, W.P. Deng, M.F. Wu, and M.L. Shen, "Is Chronic Radiation an Effective Prophylaxis Against Cancer?", Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, page 6, available in PDF format here:

'An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, low-LET gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19.

'The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure. ...

'The data on reduced cancer mortality and congenital malformations are compatible with the phenomenon of radiation hormesis, an adaptive response of biological organisms to low levels of radiation stress or damage; a modest overcompensation to a disruption, resulting in improved fitness. Recent assessments of more than a century of data have led to the formulation of a well founded scientific model of this phenomenon.

'The experience of these 10,000 persons suggests that long term exposure to [gamma]radiation, at a dose rate of the order of 50 mSv (5 rem) per year, greatly reduces cancer mortality, which is a major cause of death in North America.'

The fact that leukemia risk is a sensitive function of dose rate, and not just total dose, means that most radiation monitors worn by nuclear industry workers (which merely record total dose, i.e. integrated dose rate, and do not show the rate at which the dose was received) are almost useless for assessing risks.

This has been known and published widely since 1962:

"... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last compared with 2 per cent incidence in irradiated controls."

All of this evidence is ignored or censored out of mainstream discussions by bigoted politicians, journalists, editors and environmental quangos. So "Health Physics" (as radiation safety is currently known) isn't really healthy physics anymore; instead it is becoming a pseudoscientific exercise in political expediency and the ignoring of evidence.

Fusion power doesn't look very realistic or safe either, because the high-energy neutrons given off in tritium-deuterium fusion will quite quickly turn the structural materials of the entire fusion reactor radioactive, since they have a much greater range than the moderated (thermalized) neutrons in a nuclear fission reactor. So neutron-induced activity is a problem for fusion reactors. You also have to compress the plasma to enormous pressures using electrically controlled magnetic fields, which in a commercial fusion reactor producing gigawatts of power would not exactly have the "fail-safe" features of a fission reactor. Any slight upset to the carefully aligned and balanced magnetic fields compressing the fusion plasma would potentially turn the fusion reactor into the scene of an H-bomb explosion, complete with radioactive fallout from the neutron-induced activity in the structural materials. This aspect of fusion power isn't hyped very much in the popular media, either. Could it be that the people working in such areas simply don't want their funding to dry up?

 
At 10:59 pm, Blogger nige said...

copy of a comment submitted to moderation queue at the blog:

http://www.builtonfacts.com/2008/05/15/nuclear-fusion-power/#comment-16


"That’s where nuclear fusion comes in. Like the sun, it fuses light atoms (hydrogen isotopes, generally) into heavier ones (helium, generally). Radioactivity is produced, but in vastly smaller and easier-to-handle amounts than in nuclear fission plants. But to get fusion to work, the power plant has to produce conditions of extreme heat and adequate pressure to get the hydrogen to fuse in the first place. On one hand this is a perfect safety feature. If a breakdown ever occurred, damage to the reactor instantly destroys the conditions necessary for continued nuclear reactions. And since only a very small amount of fuel is reacting in a given time, a problem instantly and automatically prevents the reactor from causing melting down. It’s a physical impossibility."

You have a bit of disinformation here for some reason. If you know the physics you choose to write about, you are aware that the easiest fusion process to achieve is tritium+deuterium -> helium + neutron + 17.6 MeV.

Since the helium nucleus carries four-fifths of the product mass and the neutron only one-fifth, conservation of momentum gives the neutron four-fifths of the energy: 14.1 MeV of the 17.6 MeV released in each fusion event, energy which can potentially induce radioactivity in the reactor containment vessel or building.

In fission, an average of about 200 MeV of energy is released in each fission event of which only about 30 MeV is residual radioactivity from fission products.

So in fission about 15% of the energy appears as residual radioactivity, while in tritium-deuterium fusion up to 80% of the energy is carried by neutrons capable of inducing radioactivity.

Neutron-induced activity is a less severe problem in fission reactors than in experimental fusion reactors, because the fission neutrons are thermalized to low energy (about 0.025 eV) and don't irradiate the entire reactor structure, whereas the 14.1 MeV fusion neutrons are highly penetrating and do go everywhere, turning structural steel radioactive, etc. This is not 'easily handled'.

'On one hand this is a perfect safety feature. If a breakdown ever occurred, damage to the reactor instantly destroys the conditions necessary for continued nuclear reactions. And since only a very small amount of fuel is reacting in a given time, a problem instantly and automatically prevents the reactor from causing melting down. It’s a physical impossibility.'

To make a nuclear fusion reactor work at an energy density that gives the gigawatts of power required for economic or meaningful commercial use, you need a massive amount of fuel with an immense pressure, exerted on the plasma by strong magnetic fields which can squeeze the conductive (ionized) plasma.

If anything goes wrong, you get an explosion. Trying to compress a plasma with magnetic fields is like trying to squeeze an orange with your fingers, which is one reason why fusion has always been a crackpot activity (all hype, no commercially viable success).

It is the nuclear fission reactor which is inherently stable, because it has a built-in 'fail safe' design: the control rods fall back in and make it sub-critical if power fails. By contrast, if power fails to the electromagnets confining plasma at a thousand atmospheres or more in a fusion reactor, you get a nuclear explosion as a matter of course.

The more you go into the details, the more stupid nuclear fusion becomes. If you want to use the most easy to achieve fusion reaction, you need to use tritium as well as deuterium, and tritium is exceedingly expensive (it's produced by bombarding lithium with neutrons in a fission reactor).

If you want to use just deuterium, the pressure and temperature needed to make the reaction exothermal are far higher, because the reaction has a higher ignition threshold, like a high activation energy in a chemical reaction.

The 'ITER' reactor page http://www.iter.org/a/index_faq.htm states:

'The DT fusion reaction creates helium, which is inert and harmless, and neutrons, which can make surrounding materials radioactive for varying amounts of time.'

This seems to indicate that they are planning to demonstrate the concept using DT fusion, with tritium presumably made at great expense in fission reactors. (Which would be extremely expensive, but cheap compared to the cost of trying to extract the tiny amount of natural tritium in seawater.)

The whole fusion spin industry is a complete fraud and pseudoscience. If you want to promote safe nuclear fusion energy, make do with sunlight and its derivatives.

Chernobyl didn't blow up because it was an old design. It blew up because the Soviet RBMK reactor was a stupid design with a massive positive reactivity available when the control rods are withdrawn, and the engineers in charge in April 1986 were cowboys, carrying out an unauthorized and obviously dangerous experiment (to see if the reactor could power its own emergency water-cooling pumps in the event of losing external electric power). They switched off the water cooling system, they switched off all the automatic safety systems (which can't be switched off in Western reactor designs while the reactor is in use), and then they withdrew most of the control rods. The reactor design was stupid because the control rods were driven only by slow electric motors which couldn't quickly insert them in an emergency: full insertion took 18 seconds in the RBMK (against 2-3 seconds in Western reactors), and the reactor exploded 40 seconds after the experiment began.

Also, nuclear fission waste is easy to handle and has been proved safe for 1.7 billion years, which is longer than any other kind of industrial waste has been verified to be safe for!

Fission products have been proved to be safely confined with only a few feet migration over a time span of 1.7 billion years, as a result of the intense natural nuclear reactors in concentrated uranium ore seams at Oklo, in Gabon:

"Once the natural reactors burned themselves out, the highly radioactive waste they generated was held in place deep under Oklo by the granite, sandstone, and clays surrounding the reactors’ areas. Plutonium has moved less than 10 feet from where it was formed almost two billion years ago."

- http://www.ocrwm.doe.gov/factsheets/doeymp0010.shtml

 
At 8:21 pm, Anonymous Anonymous said...

google: we got nuked on 9/11

 
At 10:19 pm, Blogger nige said...

Hi David Howard,

I took a look at your blog and its link to http://wtcdemolition.blogspot.com/ which claims that the World Trade Center twin towers collapsed in 2001 due to an explosion, rather than from the planes crashing in and the burning aviation fuel weakening the steel frame, which then allowed the floors to collapse under gravity (piling up into an accumulating downward-travelling mass as they fell - the snowplow effect - which soon makes negligible the resistance of each extra floor the immense mass hits; so there is relatively little deviation from free fall, and it soon becomes like dropping a brick on a pile of leaves).

The alleged evidence given for an explosion is not evidence of an explosion: dust, "extreme high heat in the ground zero rubble (widely-reported/well-substantiated)" etc. are the normal results of the heavy mass of a building falling a great height and hitting the ground. The kinetic energy is

E = (1/2)mv^2

and for gravitational near-free fall velocity v is related to gravitational acceleration g and vertical fall distance s by

v^2 = 2gs

Hence

E = (1/2)mv^2 = (1/2)m(2gs) = mgs.

Each WTC tower had a structural mass of 169,000,000 kg (mainly structural steel and concrete), and was 417 m high (to the top of the roof, not the spire/antenna). Hence the mean fall distance was 209 m.

This gives an energy release of

E = mgs = 169,000,000*9.81*209

= 3.46*10^11 Joules

Now remember that 1 kt of TNT is equivalent to 4.184*10^12 J.

Hence, each of the twin towers released the equivalent of 0.083 kt of TNT just due to the gravitational collapse, neglecting the energy of the aircraft impacts and the aviation fuel.

This 0.083 kt is in the yield range of the smallest American nuclear bomb, the 23 kg Davy Crockett; so the collapse alone was equivalent to a very small nuclear explosion.
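The gravitational-energy estimate above is easy to verify numerically, using the same inputs as in the text:

```python
# Gravitational potential energy released by one tower's collapse.
m = 169_000_000        # structural mass in kg
g = 9.81               # gravitational acceleration, m/s^2
s = 209                # mean fall distance in metres (half the 417 m height)

E = m * g * s                  # energy in joules
kt = E / 4.184e12              # TNT equivalent (1 kt of TNT = 4.184e12 J)
print(E, kt)                   # ~3.46e11 J, ~0.083 kt
```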

So all the alleged "evidence" that the tower collapses had characteristics similar to a small nuclear explosion misses the point that the energy release when 169,000 metric tons thuds to the ground after falling hundreds of metres is substantial! Of course it has some characteristics of a big explosion, and you generally do get electromagnetic pulses from conventional explosions or collisions (the heat ionizes material, and if the electrons are detached asymmetrically this radiates a radio-frequency pulse).

The easy discriminator between a nuclear explosion and a conventional explosion or collapse is obviously the easily traced radioactive fission-product signature. Anyone with a portable detector would have been able to detect whether a nuclear explosion had been involved.

The simplest theory which fits the facts of the World Trade Centre twin towers collapse is the most obvious one: the conspiracy was a terrorist group which flew aircraft into the twin towers. That was enough to cause the destruction observed. You don't need to add explosives; the weight of the buildings, combined with the impact damage and the aviation-fuel fires heating and weakening the steel frame (thereby allowing the floors to fall), was enough to cause all the effects.

If you want to attack conspiracies, please attack the many real conspiracies instead of imaginary ones, e.g. discredit mainstream string theory for claiming to be a theory of quantum gravity when it predicts nothing, or discredit the conspiracy to misinform people on the effects of nuclear weapons tests and radiation effects as a function of dose rate!

The problem is, as I'm sure you are aware, the factual conspiracies just don't have any interest to many people, who prefer more imaginary speculative stuff, instead of sticking to solid evidence.

However, thanks for your comment!

 


All of this data should have been published to inform public debate on the basis for credible nuclear deterrence of war and civil defense, PREVENTING MILLIONS OF DEATHS SINCE WWII. Instead, enemy anti-nuclear and anti-civil-defence lying propaganda from Russian-supporting evil fascists was deliberately allowed to fill the public data vacuum, killing millions by allowing civil defence and war deterrence to be dismissed by ignorant "politicians" in the West, so that wars triggered by invasions with mass civilian casualties continue today for no purpose other than to promote terrorist agendas of hate, evil arrogance and lying for war, falsely labelled "arms control and disarmament for peace": "Controlling escalation is really an exercise in deterrence, which means providing effective disincentives to unwanted enemy actions. Contrary to widely endorsed opinion, the use or threat of nuclear weapons in tactical operations seems at least as likely to check [as Hiroshima and Nagasaki] as to promote the expansion of hostilities [providing we're not in a situation of Russian biased arms control and disarmament whereby we've no tactical weapons while the enemy has over 2000 neutron bombs thanks to "peace" propaganda from Russian thugs]." - Bernard Brodie, p. vi of Escalation and the Nuclear Option, RAND Corp memo RM-5444-PR, June 1965.

Update (19 January 2024): Jane Corbin of BBC TV is continuing to publish ill-informed nuclear weapons capabilities nonsense debunked here since 2006 (a summary of some key evidence is linked here), e.g. her 9pm 18 Jan 2024 CND-biased propaganda showpiece Nuclear Armageddon: How Close Are We? https://www.bbc.co.uk/iplayer/episode/m001vgq5/nuclear-armageddon-how-close-are-we which claims - from the standpoint of 1980s Greenham Common anti-American CND propaganda - that the world would be safer without nuclear weapons, despite the 1914-18 and 1939-45 trifles that she doesn't even bother to mention, which were only ended with nuclear deterrence. Moreover, she doesn't mention the BBC's Feb 1927 WMD-exaggerating broadcast by Noel-Baker, which used the false claim that there is no defence against mass destruction by gas bombs to argue for UK disarmament - a claim that later won him a Nobel Peace Prize and helped ensure the UK had no deterrent against the Nazis until it was too late, setting off WWII. (Nobel Peace Prizes were also awarded to others for lying; for instance Norman Angell, whose pre-WWI book The Great Illusion helped ensure that Britain's 1914 Liberal Cabinet procrastinated on deciding what to do if Belgium was invaded, and thus failed to deter the Kaiser from triggering the First World War!) The whole basis of her show was to edit out any realism whatsoever regarding the topic which is the title of her programme! No surprise there, then. Los Alamos, Livermore and Sandia are currently designing the W93 nuclear warhead for SLBM's to replace the older W76 and W88, and what she should do next time is address the key issue of what that design should be, to deter dictators without risking escalation via collateral damage: "To enhance the flexibility and responsiveness of our nuclear forces as directed in the 2018 NPR, we will pursue two supplemental capabilities to existing U.S. 
nuclear forces: a low-yield SLBM warhead (W76-2) capability and a modern nuclear sea launched cruise missile (SLCM-N) to address regional deterrence challenges that have resulted from increasing Russian and Chinese nuclear capabilities. These supplemental capabilities are necessary to correct any misperception an adversary can escalate their way to victory, and ensure our ability to provide a strategic deterrent. Russia’s increased reliance on non-treaty accountable strategic and theater nuclear weapons and evolving doctrine of limited first-use in a regional conflict, give evidence of the increased possibility of Russia’s employment of nuclear weapons. ... The NNSA took efforts in 2019 to address a gap identified in the 2018 NPR by converting a small number of W76-1s into the W76-2 low-yield variant. ... In 2019, our weapon modernization programs saw a setback when reliability issues emerged with commercial off-the-shelf non-nuclear components intended for the W88 Alteration 370 program and the B61-12 LEP. ... Finally, another just-in-time program is the W80-4 LEP, which remains in synchronized development with the LRSO delivery system. ... The Nuclear Weapons Council has established a requirement for the W93 ... If deterrence fails, our combat-ready force is prepared now to deliver a decisive response anywhere on the globe ..." - Testimony of Commander Charles Richard, US Strategic Command, to the Senate Committee on Armed Services, 13 Feb 2020. How to use nuclear weapons safely to deter major provocations that escalate into horrific wars is surely the key issue humanity should be concerned with, not the CND time-machine of returning to a non-nuclear 1914 or 1939! Corbin doesn't address it; she uses debunked old propaganda tactics to avoid the real issues and the key facts.

For example, Corbin quotes only half a sentence from Kennedy's TV speech of 22 October 1962: "it shall be the policy of this nation to regard any nuclear missile launched from Cuba against any nation in the Western hemisphere as an attack by the Soviet Union on the United States", and omits the second half of the sentence, which concludes: "requiring a full retaliatory response upon the Soviet Union." Kennedy was clearly using US nuclear superiority in 1962 to deter Khrushchev from allowing the Castro regime to start any nuclear war with America! By chopping up Kennedy's sentence, Corbin juggles the true facts of history to suit the CND agenda of "disarm or be annihilated." Another trick is her decision to uncritically interview CND-biased anti-civil-defense fanatics like the man (Professor Freedman) who got Bill Massey of the Sunday Express to water down my article debunking pro-war CND-type "anti-nuclear" propaganda lies on civil defense in 1995! Massey reported to me that Freedman claimed civil defense is no use against an H-bomb, which he claims is cheaper than dirt-cheap shelters - exactly what Freedman wrote in his deceptive letter published in the 26 March 1980 Times newspaper: "for far less expenditure the enemy could make a mockery of all this by increasing the number of attacking weapons", which completely ignores the Russian dual-use concept of simply adding blast doors to metro tubes and underground car parks, etc. In any case, civil defense makes deterrence credible, as even the most hard-left wingers like Duncan Campbell acknowledged on page 5 of War Plan UK (Paladin Books, London, 1983): "Civil defence ... is a means, if need be, of putting that deterrence policy, for those who believe in it, into practical effect."