“... Freedom is the right to question, and change the established way of doing things. It is the continuing revolution ... It is the understanding that allows us to recognize shortcomings and seek solutions. It is the right to put forth an idea ....” – Ronald Reagan, Moscow State University, May 31, 1988 (quoted at our physics site, www.quantumfieldtheory.org). Text in blue on this blog is hyperlinked directly to reference material (so can be opened in another tab by right-clicking on it):

Click here for the key declassified nuclear testing and capability documents compilation (EM-1 related USA research reports and various UK nuclear weapon test reports on blast and radiation), from nukegate.org

We also uploaded an online-viewable version of the full text of the 1982 edition of the UK Government's Domestic Nuclear Shelters - Technical Guidance, including secret UK and USA nuclear test report references and extracts proving protection against collateral damage, for credible deterrence (linked here).

For a review of this site see: https://www.nextbigfuture.com/2016/02/are-nuclear-weapons-100-times-less.html which states: "Cook is a master researcher who digs up incredible piles of research on all topics nuclear and the following is digest of various writings of his gathered for easy access centered on the remarkable thesis that the effects of nuclear weapons, while literally awesome, have been exaggerated or misunderstood to an even greater extent, with perhaps very considerable military consequences." Also see some key extracts from this blog published over at http://www.militarystory.org/nuclear-detonations-in-urban-and-suburban-areas/ and blog statistics (over 2.3 million views) linked here (populist pseudo-critics love to falsely claim that "nobody takes any notice of the truth", justifying their decision to ignore the facts by following the fashionable herd-groupthink agenda). (Or, for Field Marshal Slim's "the more you use, the fewer you lose" success formula for ending war by winning in Burma against Japan - where physicist Herman Kahn served while his friend Sam Cohen was calculating nuclear weapon efficiencies at the Los Alamos Manhattan Project, which again used "overkill" to convince the opponent to throw in the towel - please see my post on the practicalities of really DETERRING WWIII, linked here; this is the opposite of the failure-to-escalate formula used to drag out war until bankruptcy, aka the Vietnam effect.)

This blog's url is now "www.nukegate.org". When this nuclear effects blog began in 2006, "glasstone.blogspot.com" was used to signify the key issue: Glasstone's obfuscating Effects of Nuclear Weapons, specifically the final 1977 edition, which omitted not just the credible deterrent "use" of nuclear weapons but also the key final "Principles of protection" chapter that had been present in all previous editions, and which ignored the relatively clean neutron bombs developed in the intervening years as a credible deterrent to the concentrations of force needed for aggressive invasions, such as the 1914 invasion of Belgium and the 1939 invasion of Poland, both of which triggered world wars. The editors themselves were not subversives, but both held nuclear weapons security clearances which constituted political groupthink censorship control over which designs of nuclear weapons they could discuss and the level of technical data (they include basically zero information on their sources, and the "bibliographies" in most cases point not to their classified nuclear testing sources but merely to further reading); the 1977 edition had been initially drafted in 1974 solely by EM-1 editor Dolan at SRI International, and was then submitted to Glasstone, who made further changes. The persistent and hypocritical propaganda tactic of the Russian-backed World Peace Council and of hardline arms controllers - supported by some arms industry loons who have a vested interest in conventional war - has been to promote lies about nuclear weapons effects, to get rid of the credible Western nuclear deterrence of the provocations that start wars. Naturally, the Russians have now stocked 2000+ tactical neutron weapons of the very sort they got the West to disarm.

This means that they can invade territory with relative impunity, since the West won't deter such provocations by flexible response - Russia's aim is to push the West into a policy of massive retaliation against direct attacks only, and then to use smaller provocations instead - and Russia can then use its tactical nuclear weapons to "defend" its newly invaded territories by declaring them part of Mother Russia, under Moscow's nuclear umbrella. Russia has made it clear for decades that it expects a direct war with NATO to escalate rapidly into nuclear WWIII, and it has prepared civil defense shelters and evacuation tactics to endure it. Herman Kahn's public warnings of this date back to his testimony at the June 1959 Congressional Hearings on the Biological and Environmental Effects of Nuclear War, but for decades they were deliberately misrepresented by most media outlets. President Kennedy's book "Why England Slept" makes it crystal clear how exactly the same "pacifist" propaganda tactics in the 1930s (that time it was the "gas bomb knockout blow has no defense, so disarm, disarm, disarm" lie) caused war, by using fear to slow credible rearmament in the face of state terrorism. By the time the democracies finally decided to issue an ultimatum, Hitler had been converted - by pacifist appeasement - from a cautious tester of Western indecision into an overconfident aggressor who simply ignored last-minute ultimatums.

Glasstone and Dolan's 1977 Effects of Nuclear Weapons (US Government) is written in a highly ambiguous fashion (negating nearly every definite statement with a deliberately obfuscating contrary statement, leaving a smokescreen legacy of needless confusion and obscurity), omits nearly all key nuclear test data, and instead provides misleading generalizations of data from generally unspecified weapon designs tested over 60 years ago, which apply to freefield measurements on unobstructed radial lines in deserts and oceans. It makes ZERO analysis of the overall shielding of radiation and blast by their energy attenuation in modern steel and concrete cities, and even falsely denies such factors in its discussion of blast in cities and in its naive chart for predicting the percentage of burn types as a function of freefield outdoor thermal radiation, totally ignoring skyline shielding geometry (similar effects apply to freefield nuclear radiation exposure, despite vague attempts to dismiss this with non-quantitative talk about some scattered radiation arriving from all angles). It omits the huge variations in effects due to weapon design, e.g. cleaner warhead designs and the tactical neutron bomb. It omits quantitative data on EMP as a function of burst yield, height and weapon design.

It omits most of the detailed data collected from Hiroshima and Nagasaki on casualty rates as a function of type of building or shelter and blast pressure. It fails to analyse overall standardized casualty rates for different kinds of burst (e.g. shallow underground earth penetrators convert radiation and blast energy into ground shock and cratering against hard targets like silos or enemy bunkers). It omits a detailed analysis of blast precursor effects. It omits a detailed analysis of fallout beta and gamma spectra, fractionation, and specific activity (which determines the visibility of the fallout as a function of radiation hazard, and the mass of material to be removed for effective decontamination), as well as the data which does exist on the effect of crater soil size distribution upon the fused fallout particle size distribution (e.g. tests like Small Boy in 1962 on the very fine particles at Frenchman Flat gave mean fallout particle sizes far bigger than the pre-shot soil, proving that - as for Trinitite - melted small soil particles fuse together in the fireball to produce larger fallout particles, so the pre-shot soil size distribution is irrelevant for fallout analysis).

By generally (with few exceptions) lumping the "effects" of all types of bursts together into chapters dedicated to specific effects, it falsely gives the impression that all types of nuclear explosions produce similar effects with merely "quantitative differences". This is untrue: air bursts eliminate fallout casualties entirely, while slight burial (e.g. earth penetrating warheads) eliminates thermal effects (including fires and the dust "climatic nuclear winter" BS), initial radiation and severe blast, while massively increasing ground shock; the same applies to shallow underwater bursts. So a more objective treatment, to credibly deter all aggression, MUST emphasise the totally different collateral damage effects by dedicating chapters to different kinds of burst (high altitude/space bursts, free air bursts, surface bursts, underground bursts, underwater bursts), and would include bomb design implications for these effects in detail. A great deal of previously secret and limited-distribution nuclear effects data has been declassified since 1977, and new research has been done. Our objectives in this review are: (a) to ensure that an objective independent analysis of the relevant nuclear weapons effects facts is placed on the record, in case the current, increasingly vicious Cold War 2.0 escalates into some kind of limited "nuclear demonstration" by aggressors trying to end a conventional war with coercive threats; (b) to ensure the lessons of tactical nuclear weapon design for deterring large scale provocations (like the invasions of Belgium in 1914 and Poland in 1939 which triggered world wars) are re-learned, in contrast to Dulles' "massive retaliation" (incredible deterrent) nonsense; and finally (c) to provide some push to Western governments to "get real" with our civil defense, to try to make our ageing "strategic nuclear deterrent" credible.
We have also provided a detailed analysis of recently declassified Russian nuclear warhead design data, shelter data, effects data, tactical nuclear weapons employment manuals, and some suggestions for improving Western thermonuclear warheads to improve deterrence.

‘The evidence from Hiroshima indicates that blast survivors, both injured and uninjured, in buildings later consumed by fire [caused by the blast overturning charcoal braziers used for breakfast in inflammable wooden houses filled with easily ignitable bamboo furnishings and paper screens] were generally able to move to safe areas following the explosion. Of 130 major buildings studied by the U.S. Strategic Bombing Survey ... 107 were ultimately burned out ... Of those suffering fire, about 20 percent were burning after the first half hour. The remainder were consumed by fire spread, some as late as 15 hours after the blast. This situation is not unlike the one our computer-based fire spread model described for Detroit.’

- Defense Civil Preparedness Agency, U.S. Department of Defense, DCPA Attack Environment Manual, Chapter 3: What the Planner Needs to Know About Fire Ignition and Spread, report CPG 2-1A3, June 1973, Panel 27.

The Effects of the Atomic Bomb on Hiroshima, Japan, US Strategic Bombing Survey, Pacific Theatre, report 92, volume 2 (May 1947, secret):

Volume one, page 14:

“... the city lacked buildings with fire-protective features such as automatic fire doors and automatic sprinkler systems”, and pages 26-28 state the heat flash in Hiroshima was only:

“... capable of starting primary fires in exposed, easily combustible materials such as dark cloth, thin paper, or dry rotted wood exposed to direct radiation at distances usually within 4,000 feet of the point of detonation (AZ).”

Volume two examines the firestorm and the ignition of clothing by the thermal radiation flash in Hiroshima:

Page 24:

“Scores of persons throughout all sections of the city were questioned concerning the ignition of clothing by the flash from the bomb. ... Ten school boys were located during the study who had been in school yards about 6,200 feet east and 7,000 feet west, respectively, from AZ [air zero]. These boys had flash burns on the portions of their faces which had been directly exposed to rays of the bomb. The boys’ stories were consistent to the effect that their clothing, apparently of cotton materials, ‘smoked,’ but did not burst into flame. ... a boy’s coat ... started to smoulder from heat rays at 3,800 feet from AZ.” [Contrast this to the obfuscation and vagueness in Glasstone, The Effects of Nuclear Weapons!]

Page 88:

“Ignition of the City. ... Only directly exposed surfaces were flash burned. Measured from GZ, flash burns on wood poles were observed at 13,000 feet, granite was roughened or spalled by heat at 1,300 feet, and vitreous tiles on roofs were blistered at 4,000 feet. ... six persons who had been in reinforced-concrete buildings within 3,200 feet of air zero stated that black cotton blackout curtains were ignited by radiant heat ... dark clothing was scorched and, in some cases, reported to have burst into flame from flash heat [although as the 1946 unclassified USSBS report admits, most immediately beat the flames out with their hands without sustaining injury, because the clothing was not drenched in gasoline, unlike peacetime gasoline tanker road accident victims]

“... but a large proportion of over 1,000 persons questioned was in agreement that a great majority of the original fires was started by debris falling on kitchen charcoal fires, by industrial process fires, or by electric short circuits. Hundreds of fires were reported to have started in the centre of the city within 10 minutes after the explosion. Of the total number of buildings investigated [135 buildings are listed] 107 caught fire, and in 69 instances, the probable cause of initial ignition of the buildings or their contents was as follows: (1) 8 by direct radiated heat from the bomb (primary fire), (2) 8 by secondary sources, and (3) 53 by fire spread from exposed [wooden] buildings.”
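The USSBS fire-cause tally quoted above can be summarised numerically. A quick Python sketch, using only the figures quoted from report 92, volume 2, page 88:

```python
# Probable initial ignition causes for the 69 (of 107 burned) Hiroshima
# buildings whose cause could be identified, per USSBS report 92, vol. 2, p. 88.
fire_causes = {
    "primary (direct radiated heat from the bomb)": 8,
    "secondary sources (overturned stoves, process fires, short circuits)": 8,
    "fire spread from exposed wooden buildings": 53,
}

total = sum(fire_causes.values())  # 69 buildings
for cause, n in fire_causes.items():
    print(f"{cause}: {n}/{total} = {100 * n / total:.0f}%")
```

The point the tally makes: only about 12% of the identified ignitions were primary thermal-flash fires; roughly three-quarters were fire spread between the inflammable wooden buildings.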

There is now a relatively long introduction at the top of this blog, due to the present nuclear threat caused by disarmament and arms control propaganda, and the dire need to get the facts out past pro-Russian media influencers and a loony mass media which has never cared about nuclear and radiation effects facts, so please scroll down to see blog posts. The text below in blue is hyperlinked (directly to reference source materials, rather than numbered and linked to references at the end of the page), so you can right-click on it and open it in a new tab to see the source. This page is not about opinions; it provides censored-out facts that debunk propaganda, but for those who require background "authority" nonsense on censored physics facts, see stuff here or here. Regarding calling the war-mongering, world-war-causing, terrorism-regime-supporting UK disarmers of the 20th century "thugs" instead of using "kind language": I was put through the Christianity grinder as a kid so will quote Jesus (whom I'm instructed to follow), Matthew 23:33: "Ye serpents, ye generation of vipers, how can ye escape the damnation of Hell?" The fake "pacifist" thugs will respond with some kindly suggestion that this is "paranoid" and that "Jesus was rightfully no-platformed for his inappropriate language"! Yeah, you guys would say that, wouldn't ya. Genuine pacifism requires credible deterrence! Decent people seem to be very confused about the facts of this. Jesus did not say "disarm to invite your annihilation by terrorists". You can't "forgive and forget" when the enemy is still on the warpath. They have to be stopped: by deterrence, force, defense, or a combination of all three.

Above: Edward Leader-Williams on the basis for UK civil defence shelters, in the SECRET 1949 Royal Society symposium in London on the physical effects of atomic weapons - a study kept secret by the Attlee Government and subsequent UK governments, instead of being openly published to enhance public knowledge of civil defence effectiveness against nuclear attack. Leader-Williams also produced the vital civil defence report seven years later (published below for the first time on this blog), proving that civil defence sheltering and city centre evacuation are effective against 20 megaton thermonuclear weapons. Also published in the same secret symposium, which was introduced by Penney, was Penney's own Hiroshima visit analysis of the percentage volume reduction in overpressure-crushed empty petrol cans, blueprint containers, etc., which gave a blast partition yield of 7 kilotons (or 15.6 kt total yield, if taking the nuclear blast as 45% of total yield, i.e. 7/0.45 = 15.6, as done in later AWRE nuclear weapons test blast data reports). Penney, in an updated 1970 paper, allowed for the blast reduction due to the damage done in the city bursts.
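Penney's conversion from blast partition yield to total yield is simple arithmetic; a minimal Python sketch of the 7/0.45 = 15.6 kt calculation stated above (the 45% blast fraction is the AWRE convention cited in the text):

```python
# Penney's Hiroshima yield estimate: the crushed-container analysis gave a
# blast partition yield of 7 kt; later AWRE blast data reports took the
# nuclear blast as 45% of total yield, so the implied total yield is:
blast_partition_kt = 7.0
blast_fraction = 0.45          # AWRE convention: blast = 45% of total yield

total_yield_kt = blast_partition_kt / blast_fraction
print(f"Implied total yield: {total_yield_kt:.1f} kt")  # 15.6 kt
```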

The June 1957 edition of Glasstone's Effects of Nuclear Weapons was the first to include the effects of blast duration (which increases with the cube root of weapon yield) on blast damage from nuclear weapons. This is significant for wind drag loading on drag-sensitive targets, but is minimal for diffraction-sensitive targets, which respond to peak pressures, especially where the blast pressure rapidly equalizes around the structure (e.g. utility poles, or buildings with large expanses of glass which shatters, allowing rapid pressure equalization). For example, Glasstone 1957, Fig. 6.41b (p253, using Fig. 3.94a on p109 to convert scaled distances to overpressures from a surface burst on open deserted terrain) shows that for yields of 1 kt, 20 kt (approximately the 16 kt Hiroshima and 21 kt Nagasaki yields) and 1 megaton, peak overpressures of 55, 23 and 15 psi, respectively, are required for collapse (severe damage) of modern multistorey concrete buildings with light walls (Fig. 6.41a shows that about 5 psi will demolish a wood frame house - no longer found in modern city centres - regardless of yield). Notice that this means modern cities are extremely resistant to blast from ~1 kt neutron bombs, requiring more than twice the peak overpressure for collapse than was needed in Hiroshima and Nagasaki.
Also notice that very large amounts of energy are absorbed from the blast in causing severe damage to modern reinforced concrete city buildings, causing rapid attenuation of the free-field pressure, so that the ocean- and desert-test-validated cube-root damage scaling laws break down for high yield bursts in modern cities (see the latest blog post here for examples of calculations of this energy absorption, both in oscillating a building within its elastic deflection zone and in the much larger energy absorption of plastic-zone distortion of reinforced concrete - basically the former typically absorbs about 1% of the blast energy, whereas the latter takes up something like 10 times more, or 10%, a factor entirely dismissed by Glasstone and Dolan but analyzed by Penney). Above a megaton or so, the increasing blast duration has less and less effect on the peak overpressure required for severe damage, because a threshold blast loading exists for destruction, regardless of the blast duration. (A 1 mile/hour wind will not blow a wall down, regardless of how long it lasts. In other words, large impulses cease to be damage criteria if the blast pressure drops below the threshold needed for damage.) Glasstone 1957, Fig. 6.41c on p255, shows that automobiles suffer severe damage at 36 psi peak overpressure for 1 kt, 18 psi for 20 kt, and 12 psi for 1 megaton. (These pressures for destruction of cars are similar to the severe damage data for multistorey steel frame office buildings with light walls, given in Fig. 3.94a on p109.) The main point here is that low-yield (around 1 kt) tactical nuclear weapons produce far less collateral damage to civilian infrastructure than high yield bursts, and even the effects of the latter are severely exaggerated for modern cities when using wooden house data from unobstructed ocean or desert nuclear test terrain.
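The yield dependence described above can be sketched in a few lines of Python: cube-root scaling of blast duration, alongside the collapse overpressures quoted from Glasstone 1957, Fig. 6.41b (the figures are those in the text; the scaling function is the standard cube-root law, not a reconstruction of Glasstone's curves):

```python
# Blast positive-phase duration scales as the cube root of yield W, so a
# 1 Mt burst has a blast duration ~10 times that of a 1 kt burst.
def duration_scale(yield_kt, ref_kt=1.0):
    """Relative blast duration compared with a reference yield (cube-root law)."""
    return (yield_kt / ref_kt) ** (1.0 / 3.0)

# Peak overpressure (psi) for collapse of multistorey concrete buildings
# with light walls, as read from Glasstone 1957, Fig. 6.41b:
collapse_psi = {1: 55, 20: 23, 1000: 15}   # yield in kt -> psi

for W, psi in collapse_psi.items():
    print(f"{W:>5} kt: duration x{duration_scale(W):4.1f}, collapse at {psi} psi")
```

This makes the text's point numerically: the longer blast of a high-yield burst lowers the overpressure needed for collapse, so a ~1 kt neutron bomb needs 55 psi against the same building that a 20 kt Hiroshima-class burst collapses at 23 psi.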

ABOVE: The 1996 Northrop EM-1 (see extracts below, showing protection by modern buildings and also by simple shelters very close to nuclear tests) has not been openly published, despite such protection being used in Russia! Note that Northrop's entire set of damage ranges as a function of yield for underground shelters, tunnels and silos is based on two contained deep underground nuclear tests of different yields, scaled to surface bursts using the assumption that a surface burst couples 5% of its yield into the ground relative to the underground shots. This 5% equivalence figure appears to be an exaggeration for compact modern warheads: e.g. the paper "Comparison of Surface and Sub-Surface Nuclear Bursts", from Steven Hatch, Sandia National Laboratories, to Jonathan Medalia, October 30, 2000, shows a 2% equivalence - Hatch shows that a 1 megaton surface burst produces identical ranges to underground targets as a 20 kt burst at >20 m depth of burst, whereas Northrop would require 50 kt. This proves heavy bias against credible tactical nuclear deterrence of the invasions that trigger major wars that could escalate into nuclear war (Russia has 2000+ dedicated neutron bombs; we don't!), and against simple, nuclear-proof-tested civil defence, which makes such deterrence credible and is of course also valid against conventional wars, severe weather, peacetime disasters, etc.
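The 5% versus 2% ground-coupling discrepancy above is easy to check arithmetically. A small Python sketch (the coupling fractions and the 1 Mt example are the figures quoted in the caption; the linear equivalence is the stated assumption, not an independent model):

```python
# Equivalent-yield comparison for ground shock against underground targets:
# a surface burst couples only a small fraction of its yield into the ground
# compared with a contained buried burst. Northrop's 1996 EM-1 assumed 5%;
# Hatch (Sandia, 2000) gives ~2% for compact modern warheads.
def equivalent_buried_yield_kt(surface_yield_kt, coupling):
    """Buried-burst yield giving the same ground shock as the surface burst."""
    return surface_yield_kt * coupling

surface_kt = 1000.0  # 1 megaton surface burst
print(equivalent_buried_yield_kt(surface_kt, 0.02))  # 20.0 kt (Hatch, >20 m depth)
print(equivalent_buried_yield_kt(surface_kt, 0.05))  # 50.0 kt (Northrop EM-1)
```

So under Hatch's 2% figure, a 1 Mt surface burst matches only a 20 kt buried burst against hardened underground targets, not the 50 kt implied by Northrop's 5% assumption.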

The basic fact is that nuclear weapons can deter or stop invasions, unlike the conventional weapons that cause mass destruction. Nuclear collateral damage is easily eliminated by using the weapons on military targets, since for high yields, at collateral damage distances, all the effects are sufficiently delayed in arrival to allow duck-and-cover to avoid radiation and blast wind/flying debris injuries (unlike the case for the smaller areas affected by smaller yield conventional weapons, where there is little time on seeing the flash to duck and cover to avoid injury). As the original 1951 SECRET American Government "Handbook on Capabilities of Atomic Weapons" (limited report AD511880L, forerunner to today's still-secret EM-1) stated in Section 10.32:


As for Hitler's stockpile of 12,000 tons of tabun nerve gas, whose strategic and also tactical use was deterred by proper defences (gas masks for all civilians and soldiers, as well as UK stockpiles of fully trial-tested, deliverable anthrax biological agent and mustard gas retaliation capacity), it is possible to deter strategic nuclear escalation to city bombing, even within a world war against a crazy terrorist, if all the people are protected by both defence and deterrence.

J. R. Oppenheimer (opposing Teller), February 1951: "It is clear that they can be used only as adjuncts in a military campaign which has some other components, and whose purpose is a military victory. They are not primarily weapons of totality or terror, but weapons used to give combat forces help they would otherwise lack. They are an integral part of military operations. Only when the atomic bomb is recognized as useful insofar as it is an integral part of military operations, will it really be of much help in the fighting of a war, rather than in warning all mankind to avert it." (Quotation: Samuel Cohen, Shame, 2nd ed., 2005, page 99.)

‘The Hungarian revolution of October and November 1956 demonstrated the difficulty faced even by a vastly superior army in attempting to dominate hostile territory. The [Soviet Union] Red Army finally had to concentrate twenty-two divisions in order to crush a practically unarmed population. ... With proper tactics, nuclear war need not be as destructive as it appears when we think of [World War II nuclear city bombing like Hiroshima]. The high casualty estimates for nuclear war are based on the assumption that the most suitable targets are those of conventional warfare: cities to interdict communications ... With cities no longer serving as key elements in the communications system of the military forces, the risks of initiating city bombing may outweigh the gains which can be achieved. ...

‘The elimination of area targets will place an upper limit on the size of weapons it will be profitable to use. Since fall-out becomes a serious problem [i.e. fallout contaminated areas which are so large that thousands of people would need to evacuate or shelter indoors for up to two weeks] only in the range of explosive power of 500 kilotons and above, it could be proposed that no weapon larger than 500 kilotons will be employed unless the enemy uses it first. Concurrently, the United States could take advantage of a new development which significantly reduces fall-out by eliminating the last stage of the fission-fusion-fission process.’

- Dr Henry Kissinger, Nuclear Weapons and Foreign Policy, Harper, New York, 1957, pp. 180-3, 228-9. (Note that the "nuclear taboo" issue is sometimes raised against this analysis by Kissinger: if anti-nuclear lying propaganda on weapons effects makes it apparently taboo, in the Western pro-Russian disarmament lobbies, to escalate from conventional to tactical nuclear weapons to end a war as on 6 and 9 August 1945, then this "nuclear taboo" can supposedly be relied upon to guarantee peace for our time. However, this was disproved not only by Hiroshima and Nagasaki, but by the Russian reliance on tactical nuclear weapons today, by the Russian civil defense shelter system detailed on this blog (which shows they believe a nuclear war survivable, based on the results of their own nuclear tests), and by the use of Russian nuclear weapons years after Kissinger's analysis was published and criticised - for example their 50 megaton test in 1961, and their 1962 supply to the fanatical Cuban dictatorship of IRBMs capable of reaching East Coast mainland USA targets. So much for the "nuclear taboo" being any more reliable than Chamberlain's "peace for our time" document, co-signed by Hitler on 30 September 1938! We furthermore saw how Russia respected President Obama's "red line" for the "chemical weapons taboo": Russia didn't give a toss about Western disarmament thugs' prattle about what they think is a "taboo"; Russia used chlorine and sarin in Syria to keep Assad the dictator, and used Novichok to attack and kill in the UK in 2018, with only diplomatic expulsions in response. "Taboos" are no more valid for restraining madmen than peace treaties, disarmament agreements, Western CND books attacking civil defense or claiming that nuclear war is the new 1930s gas-war bogeyman, or "secret" stamps on scientific facts. In a word, they're crazy superstitions.)

(Quoted in 2006 on this blog here.)

All of this data should have been published to inform public debate on the basis for credible nuclear deterrence of war and for civil defense, PREVENTING MILLIONS OF DEATHS SINCE WWII, instead of DELIBERATELY allowing enemy anti-nuclear and anti-civil-defence lying propaganda from Russian-supporting evil fascists to fill the public data vacuum, killing millions by allowing civil defence and war deterrence to be dismissed by ignorant "politicians" in the West, so that wars triggered by invasions, with mass civilian casualties, continue today for no purpose other than to promote terrorist agendas of hate, evil arrogance and lying for war, falsely labelled "arms control and disarmament for peace":

"Controlling escalation is really an exercise in deterrence, which means providing effective disincentives to unwanted enemy actions. Contrary to widely endorsed opinion, the use or threat of nuclear weapons in tactical operations seems at least as likely to check [as Hiroshima and Nagasaki] as to promote the expansion of hostilities [providing we're not in a situation of Russian-biased arms control and disarmament whereby we've no tactical weapons while the enemy has over 2000 neutron bombs thanks to "peace" propaganda from Russian thugs]." - Bernard Brodie, p. vi of Escalation and the Nuclear Option, RAND Corp. memo RM-5444-PR, June 1965.

Russian Project 49 dual-primary thermonuclear weaponeer Dr Yuri Trutnev has an officially "proatom.ru"-published technical history of the design of Russian nuclear weapons (which differ fundamentally from UK-USA designs) here (extracted from the Russian "Atomic Strategy" No. 18, August 2005): "the problem of ensuring spherically symmetric compression of the secondary module was radically solved, since the time of “symmetrization” of the energy around the secondary module was much less than the time of compression of this module. ... The first two-stage thermonuclear charge, designated RDS-37, was developed in 1955 and successfully tested on November 22, 1955. The energy release of the charge in the experiment was 1.6 Mt, and since for safety reasons at the Semipalatinsk test site the charge was tested at partial power, the predicted full-scale energy release of the charge was ~ 3 Mt. The energy release amplification factor in RDS-37 was about two orders of magnitude, the charge did not use tritium, the thermonuclear fuel was lithium deuteride, and the main fissile material was U-238. ... Particular attention should be paid to the works of 1958. This year, a new type of thermonuclear charge, “product 49,” was tested [the double-primary H-bomb], which was the next step in the formation of a standard for thermonuclear charges (its development was completed in 1957, but testing at the test site did not take place that year). The ideologists of this project and the developers of the physical charge circuit were Yu. N. Babaev and I. The peculiarity of the new charge was that, using the basic principles of the RDS-37, it was possible to: • significantly reduce overall parameters due to a new bold solution to the problem of transfer of X-ray radiation, which determines implosion; • simplify the layered structure of the secondary module, which turned out to be an extremely important practical decision.
According to the conditions of adaptation to specific carriers, “product 49” was developed in a smaller overall weight category compared to the RDS-37 charge, but its specific volumetric energy release turned out to be 2.4 times greater.

"The physical design of the charge turned out to be extremely successful; the charge was transferred to service and subsequently underwent modernization associated with the replacement of primary energy sources. In 1958, together with Yu. N. Babaev, we managed to develop 4 thermonuclear charges, which were tested on the field in 7 full-scale tests, and all of them were successful. This work was practically implemented within 8 months of 1958. All of these charges used a new circuit, first introduced in Product 49. Their energy release ranged from 0.3 to 2.8 Mt. In addition, in 1958, under my leadership M. V. Fedulov also developed the lightest thermonuclear charge at that time according to the “product 49” design, which was also successfully tested. Work on the miniaturization of thermonuclear weapons was new at that time, and it was met with a certain misunderstanding and resistance. ... One of the well-known pages in the history of work on thermonuclear weapons of the USSR is the creation of a superbomb - the most powerful thermonuclear charge. I will dwell on some points of this development. ... Among the features of this charge, it should be noted that the large volume of the charge (due to its high energy release) required significant amounts of X-ray energy to carry out implosion. The developed nuclear charges did not satisfy this condition, and therefore, a previously developed two-stage thermonuclear charge with a relatively low energy release was used as the primary source of the “super-powerful charge”. This charge was developed by me and Yu. N. Babaev. ... In the next project (a return to the untested 1958 system) that I supervised, every effort was made to ensure near-perfect implosion symmetry. This brilliant work led to success, and in 1962, the problem of implementing thermonuclear ignition was solved in a special device. 
In other full-scale tests that followed, this success was consolidated, and as a result, thermonuclear ignition provided the calculated combustion of the secondary module with an energy release of 1 Mt. My co-authors in this development were V.B. Adamsky, Yu.N. Babaev, V.G. Zagrafov and V.N. Mokhov. ... This principle has found a variety of applications in the creation of fundamentally new types of thermonuclear charges, from special devices for the use of nuclear explosions for peaceful purposes to significant military applications." (Note there is a 2017 filmed interview of Trutnev - in Russian - linked here.)

This is the basis for both the Russian isentropic-compressed pure fusion secondary (99.85% clean) neutron bomb and related progress with strategic warheads:

“In 1966, VNIIEF conducted a successful test of the second generation charge, in which an almost doubling of the power density was achieved by increasing the contribution of fission reactions in the thermonuclear module. These results were subsequently used to create new third-generation products.” - A. A. Greshilov, N. D. Egupov and A. M. Matushchenko, Nuclear shield (official Russian nuclear weapons history), 2008, p171 (linked here: https://elib.biblioatom.ru/text/greshilov_yaderny-schit_2008/p171/ ). Note that the first double-primary Project 49 Russian test on 23 February 1958 was rapidly weaponised as the 1364 kg 8F12/8F12N warhead for the 8K63 missile in 1959, according to http://militaryrussia.ru/blog/index-0-5.html which also gives a table of yields and masses of other Russian warheads: the 2.3 megaton warhead 8K15 for the 8K65 missile had a mass of 1546 kg; the 5 megaton 8F116 warhead for the 8K64 and 8K65 missiles had a mass of 2175 kg; the 6 megaton 8F117 for the 8K64 and other missiles had a mass of 2200 kg, etc. The diagram below shows a cut-away through the shells in the isentropically-compressed megaton secondary stage of the first Russian weapon without a central fission neutron-producing sparkplug (1.1 megaton Russian test number 218 at Novaya Zemlya on 24 December 1962, an air drop detonating at 1320 m altitude). This diagram was declassified in the official Russian "History of the domestic nuclear project - Report by the scientific director of RFNC-VNIIEF, Academician of the Russian Academy of Sciences R.I. Ilkaev at the General Meeting, Department of Physical Sciences of the Russian Academy of Sciences December 17, 2012, RAS", after John H. 
Nuckolls' summary of the similar, 99.9% clean 10 megaton Ripple-2, tested 30 October 1962 as detailed in posts below (the detailed interior design analysis of the Russian megaton nuclear warhead for the R13 - which is on display in a Russian nuclear warhead design museum - is from the Russian sites here and here).

https://hbr.org/1995/05/why-the-news-is-not-the-truth/ (Peter Vanderwicken in the Harvard Business Review Magazine, May-June 1995): "The news media and the government are entwined in a vicious circle of mutual manipulation, mythmaking, and self-interest. Journalists need crises to dramatize news, and government officials need to appear to be responding to crises. Too often, the crises are not really crises but joint fabrications. The two institutions have become so ensnared in a symbiotic web of lies that the news media are unable to tell the public what is true and the government is unable to govern effectively. That is the thesis advanced by Paul H. Weaver, a former political scientist (at Harvard University), journalist (at Fortune magazine), and corporate communications executive (at Ford Motor Company), in his provocative analysis entitled News and the Culture of Lying: How Journalism Really Works ... The news media and the government have created a charade that serves their own interests but misleads the public. Officials oblige the media’s need for drama by fabricating crises and stage-managing their responses, thereby enhancing their own prestige and power. Journalists dutifully report those fabrications. Both parties know the articles are self-aggrandizing manipulations and fail to inform the public about the more complex but boring issues of government policy and activity. What has emerged, Weaver argues, is a culture of lying. ... The architect of the transformation was not a political leader or a constitutional convention but Joseph Pulitzer, who in 1883 bought the sleepy New York World and in 20 years made it the country’s largest newspaper. Pulitzer accomplished that by bringing drama to news—by turning news articles into stories ... His journalism took events out of their dry, institutional contexts and made them emotional rather than rational, immediate rather than considered, and sensational rather than informative. 
The press became a stage on which the actions of government were a series of dramas. ... The press swarmed on the story, which had all the necessary dramatic elements: a foot-dragging bureaucracy, a study finding that the country’s favorite fruit was poisoning its children, and movie stars opposing the pesticide. Sales of apples collapsed. Within months, Alar’s manufacturer withdrew it from the market, although both the EPA and the Food and Drug Administration stated that they believed Alar levels on apples were safe. The outcry simply overwhelmed scientific evidence. That happens all too often, Cynthia Crossen argues in her book Tainted Truth: The Manipulation of Fact in America. ... Crossen writes, “more and more of the information we use to buy, elect, advise, acquit and heal has been created not to expand our knowledge but to sell a product or advance a cause.” “Most members of the media are ill-equipped to judge a technical study,” Crossen correctly points out. “Even if the science hasn’t been explained or published in a U.S. journal, the media may jump on a study if it promises entertainment for readers or viewers. And if the media jump, that is good enough for many Americans.” ... A press driven by drama and crises creates a government driven by response to crises. Such an “emergency government can’t govern,” Weaver concludes. “Not only does public support for emergency policies evaporate the minute they’re in place and the crisis passes, but officials acting in the emergency mode can’t make meaningful public policies. 
According to the classic textbook definition, government is the authoritative allocation of values, and emergency government doesn’t authoritatively allocate values.” (Note that Richard Rhodes' Pulitzer Prize-winning books, such as The Making of the Atomic Bomb, which uncritically quote Hiroshima firestorm lies and survivors' nonsense about people running around without feet, play to this kind of emotional fantasy mythology of nuclear deterrence obfuscation so loved by the mass media.)

ABOVE: "missile gap" propaganda debunked by secret 1970s data; Kennedy relied on US nuclear superiority. Using a flawed analysis of nuclear weapons effects on Hiroshima - based on lying unclassified propaganda reports and ignorant dismissals of civil defense shelters in Russia (again based on Hiroshima propaganda by Groves in 1945) - America allowed Russian nuclear superiority in the 1970s. Increasingly, the nuclear deterrent was used by Russia to stop the West from "interfering" with its aggressive invasions and wars, precisely Hitler's 1930s strategy with gas bombing knockout-blow threats used to engineer appeasement. BELOW: H-bomb effects and design secrecy led to tragic mass media delusions, such as the 18 February 1950 Picture Post claim that the H-bomb can devastate Australia (inspiring the Shute novel and movie "On the Beach" and also other radiation scams like "Dr Strangelove", used by Russia to stir up the anti-Western disarmament movement to help Russia win WWIII). Dad was a Civil Defense Corps Instructor in the UK when this was done (the civil defense effectiveness and weapon effects facts on shelters at UK and USA nuclear tests were kept secret and not used to debunk lying political appeasement propaganda tricks in the mass media by sensationalist "journalists" and Russian "sputniks"):

Message to mass-media journalists: please don't indulge in lying "no defence" propaganda as was done by most of the media in previous pre-war crises!

ABOVE: Example of a possible Russian 1985 1st Cold War SLBM first strike plan. The initial use of Russian SLBM launched nuclear missiles from off-coast against command and control centres (i.e. nuclear explosions to destroy warning satellite communications centres by radiation on satellites as well as EMP against ground targets, rather than missiles launched from Russia against cities, as assumed by 100% of the Cold War left-wing propaganda) is allegedly a Russian "fog of war" strategy. Such a "demonstration strike" is aimed essentially at causing confusion about what is going on, and who is responsible - it is not quick or easy to finger-print high altitude bursts fired by SLBMs from submerged submarines to a particular country, because you don't get fallout samples to identify isotopic plutonium composition. Russia could immediately deny the attack (implying, probably to the applause of the left-wingers, that this was some kind of American training exercise or computer based nuclear weapons "accident", similar to those depicted in numerous anti-nuclear Cold War propaganda films). Thinly-veiled ultimatums and blackmail follow. America would not lose its population or even key cities in such a first strike (contrary to left-wing propaganda fiction), as with Pearl Harbor in 1941; it would lose its complacency and its sense of security through isolationism, and would either be forced into a humiliating defeat or a major war.

Before 1941, many warned of the risks but were dismissed on the basis that Japan was a smaller country with a smaller economy than the USA and war was therefore absurd (similar to the way Churchill's warnings about European dictators were dismissed by "arms-race opposing pacifists" not only in the 1930s, but even before WWI; for example Professor Cyril Joad documents in the 1939 book "Why War?" his first-hand witnessing of Winston Churchill's pre-WWI warning and call for an arms-race to deter that war, as dismissed by the sneering Norman Angell who claimed an arms race would cause a war rather than avert one by bankrupting the terrorist state). It is vital to note that there is an immense pressure against warnings of Russian nuclear superiority even today, most of it contradictory. E.g. the left-wing and Russian-biased "experts" whose voices are the only ones reported in the Western media (traditionally led by "Scientific American" and "Bulletin of the Atomic Scientists"), simultaneously claim Russia imposes such a terrible SLBM and ICBM nuclear threat that we must desperately disarm now, while also claiming that Russian tactical nuclear weapons probably won't work so aren't a threat that needs to be credibly deterred! This only makes sense as Russian-siding propaganda. In a similar vein, Teller-critic Hans Bethe also used to falsely "dismiss" Russian nuclear superiority by claiming (with quotes from Brezhnev about the peaceful intentions of Russia) that Russian delivery systems are "less accurate" than Western missiles (as if accuracy has anything to do with high altitude EMP strikes, where the effects cover huge areas, or large city targets). Such claims would then be repeated endlessly in the Western media by Russian-biased "journalists" or agents of influence, and any attempt to point out the propaganda (i.e. 
the real-world asymmetry: Russia uses cheap countervalue targeting on folk who don't have civil defense, whereas we need costly, accurate counterforce targeting because Russia has civil defense shelters that we don't have) became a "Reds under beds" argument, implying that the truth is dangerous to "peaceful coexistence"!

“Free peoples ... will make war only when driven to it by tyrants. ... there have been no wars between well-established democracies. ... the probability ... that the absence of wars between well-established democracies is a mere accident [is] less than one chance in a thousand. ... there have been more than enough to provide robust statistics ... When toleration of dissent has persisted for three years, but not until then, we can call a new republic ‘well established.’ ... Time and again we observe authoritarian leaders ... using coercion rather than seeking mutual accommodation ... Republican behaviour ... in quite a few cases ... created an ‘appeasement trap.’ The republic tried to accommodate a tyrant as if he were a fellow republican; the tyrant concluded that he could safely make an aggressive response; eventually the republic replied furiously with war. The frequency of such errors on both sides is evidence that negotiating styles are not based strictly on sound reasoning.” - Spencer Weart, Never at War: Why Democracies Will Not Fight One Another (Yale University Press)

The Top Secret American intelligence report NIE 11-3/8-74 "Soviet Forces for Intercontinental Conflict" warned on page 6: "the USSR has largely eliminated previous US quantitative advantages in strategic offensive forces." Page 9 of the report estimated that Russia's ICBM and SLBM launchers exceeded the USA's 1,700 during 1970, while Russia's on-line missile throw weight had exceeded the USA's one thousand tons back in 1967! Because the USA had more long-range bombers which could carry high-yield bombs than Russia (bombers are more vulnerable to air defences so were not Russia's priority), it took a little longer for Russia to exceed the USA in equivalent megatons, but the 1976 Top Secret American report NIE 11-3/8-76 at page 17 shows that in 1974 Russia exceeded the 4,000 equivalent-megatons payload of USA missiles and aircraft (with less vulnerability for Russia, since most of Russia's nuclear weapons were on missiles not in SAM-vulnerable aircraft), and by 1976 Russia could deliver 7,000 tons of payload by missiles compared to just 4,000 tons on the USA side. These reports were kept secret for decades to protect the intelligence sources, but they were based on hard evidence. For example, in August 1974 the Hughes Aircraft Company used a specially designed ship (Glomar Explorer, 618 feet long, developed under a secret CIA contract) to recover nuclear weapons and their secret manuals from a Russian submarine which sank in 16,000 feet of water, while in 1976 America was able to take apart the electronics systems in a state-of-the-art Russian MiG-25 fighter which was flown to Japan by defector Viktor Belenko, discovering that it used exclusively EMP-hard miniature vacuum tubes with no EMP-vulnerable solid state components.

There are four ways of dealing with aggressors: conquest (fight them), intimidation (deter them), fortification (shelter against their attacks; historically used as castles, walled cities and even walled countries in the case of China's 1,100 mile long Great Wall and Hadrian's Wall, while the USA has used the Pacific and Atlantic as successful moats against invasion, at least since Britain burned Washington D.C. in 1814 during the War of 1812), and friendship (which, if you are too weak to fight, means appeasing them, as Chamberlain shook hands with Hitler for worthless peace promises). These are not mutually exclusive: you can use combinations. If you are very strong in offensive capability and also have walls to protect you while your back is turned, you can - as Teddy Roosevelt put it (quoting a West African proverb): "Speak softly and carry a big stick." But if you are weak, speaking softly makes you a target, vulnerable to coercion. This is why we don't send troops directly to Ukraine. When elected in 1960, Kennedy introduced "flexible response" to replace Dulles' "massive retaliation", by addressing the need to deter large provocations without being forced to decide between the unwelcome options of "surrender or all-out nuclear war" (Herman Kahn called this flexible response "Type 2 Deterrence"). This was eroded by both Russian civil defense and their emerging superiority in the 1970s: a real missiles and bombers gap emerged in 1972 when the USSR reached and then exceeded the USA's total of 2,200, while in 1974 the USSR achieved parity at 3,500 equivalent megatons (then exceeded the USA), and finally today Russia has over 2,000 dedicated clean enhanced neutron tactical nuclear weapons and we have none (except low-neutron output B61 multipurpose bombs). 
(Robert Jastrow's 1985 book How to Make Nuclear Weapons Obsolete was the first to have graphs showing the downward trend in nuclear weapon yields created by the development of miniaturized MIRV warheads for missiles and tactical weapons: he shows that the average size of US warheads fell from 3 megatons in 1960 to 200 kilotons in 1980, and that the total stockpile yield fell from 12,000 megatons in 1960 to 3,000 megatons in 1980.)

The term "equivalent megatons" roughly takes account of the fact that the areas of cratering, blast and radiation damage scale not linearly with energy but as something like the 2/3 power of energy release; but note that close-in cratering scales as a significantly smaller power of energy than 2/3, while blast wind drag displacement of jeeps in open desert scales as a larger power of energy than 2/3. Comparisons of equivalent megatonnage show, for example, that WWII's 2 megatons of TNT in the form of about 20,000,000 separate conventional 100 kg (0.1 ton) explosives is equivalent to 20,000,000 x (10^-7)^(2/3) = 431 separate 1 megaton explosions! The point is, nuclear weapons are not of a different order of magnitude to conventional warfare, because: (1) devastated areas don't scale in proportion to energy release, (2) the number of nuclear weapons is very much smaller than the number of conventional bombs dropped in conventional war, (3) because of radiation effects like neutrons and intense EMP, it is possible to eliminate physical destruction by nuclear weapons by a combination of weapon design (e.g. very clean bombs like 99.9% fusion Dominic-Housatonic, or 95% fusion Redwing-Navajo) and burst altitude or depth for hard targets, and create a weapon that deters invasions credibly (without lying local fallout radiation hazards), something none of the biased "pacifist disarmament" lobbies (which attract Russian support) tell you, and (4) people at collateral damage distances have time to take cover from radiation and flying glass, blast winds, etc from nuclear explosions (which they don't in Ukraine and Gaza where similar blast pressures arrive more rapidly from smaller conventional explosions). There's a big problem with propaganda here.
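The equivalent-megatonnage arithmetic above can be sketched in a few lines of Python (the 2/3 exponent is the conventional damage-area approximation discussed in this paragraph; the function name is ours):

```python
# Equivalent-megatonnage sketch: damage areas scale roughly as yield^(2/3),
# so one explosion of yield Y megatons counts as Y^(2/3) "equivalent megatons".

def equivalent_megatons(yield_mt, exponent=2.0 / 3.0):
    """Equivalent megatons of a single explosion of yield_mt megatons."""
    return yield_mt ** exponent

# WWII example from the text: ~2 Mt of TNT delivered as ~20,000,000
# separate 100 kg bombs; 100 kg of TNT = 1e-7 megatons.
n_bombs = 20_000_000
yield_per_bomb_mt = 1e-7
total_emt = n_bombs * equivalent_megatons(yield_per_bomb_mt)
print(round(total_emt))  # prints 431, the figure quoted in the text
```

The same function shows why a single large burst is "worth" far less than its raw yield: a 10 Mt explosion counts as only about 4.6 equivalent megatons.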

(These calculations, showing that even if strategic bombing had worked in WWII - and the US Strategic Bombing Survey concluded it failed, thus the early Cold War effort to develop and test tactical nuclear weapons and train for tactical nuclear war in Nevada field exercises - you need over 400 one-megaton weapons to give the equivalent of WWII city destruction in Europe and Japan, are often inverted by anti-nuclear bigots to try to obfuscate the truth. What we're driving at is that nuclear weapons give you the ability to DETER the invasions that set off such wars, regardless of whether they escalate from poison gas - as feared in the 20s and 30s, thus appeasement and WWII - or nuclear. Escalation was debunked in WWII, where the only use of poison gas was in "peaceful" gas chambers, not dropped on cities. Rather than justifying appeasement, the "peaceful" massacre of millions in gas chambers justified war. But evil could and should have been deterred. The "anti-war" propagandists like Lord Noel-Baker and pals who guaranteed immediate gas knockout blows in the 30s if we didn't appease evil dictators were never held to account and properly debunked by historians after the war, so they converted from gas liars to nuclear liars in the Cold War and went on winning "peace" prizes for their lies, which multiplied up over the years, to keep getting news media headlines and Nobel Peace Prizes for starting and sustaining unnecessary wars and massacres by dictators. There's also a military side to this, with Field Marshals Lord Mountbatten and Lord Carver, plus Lord Zuckerman, in the 70s arguing for UK nuclear disarmament and a re-introduction of conscription instead. These guys were not pacifist CND thugs who wanted Moscow to rule the world, but they were quoted by CND attacking the deterrent, without of course the call for conscription instead. The abolition of UK conscription for national service in 1960 was due to the H-bomb, and was a political money-saving plot by Macmillan. 
If we disarmed our nuclear deterrent and spent the money on conscription plus underground shelters, we might well be able to resist Russia as Ukraine does, until we ran out of ammunition etc. However, the cheapest and most credible deterrent is tactical nuclear weapons to prevent the concentration of aggressive force by terrorist states.)

Britain was initially in a better position with regard to civil defense than the USA, because in WWII Britain had built sufficient shelters (of various types, but all tested against blast intense enough to demolish brick houses, and later also tested at various nuclear weapon trials in Monte Bello and Maralinga, Australia) and respirators for the entire civilian population. However, Britain also tried to keep the proof testing data secret from Russia (which tested their own shelters at their own nuclear tests anyway) and this meant it appeared that civil defense advice was unproved and would not work, an illusion exploited especially for communist propaganda in the UK via CND. To give just one example, CND and most of the UK media still rely on Duncan Campbell's pseudo-journalism book War Plan UK since it is based entirely on fake news about UK civil defense, nuclear weapons, Hiroshima, fallout, blast, etc. He takes for granted that - just because the UK Government kept the facts secret - the facts don't exist, and to him any use of nuclear weapons which spread any radioactivity whatsoever will make life totally impossible: "What matters 'freedom' or 'a way of life' in a radioactive wasteland?" (Quote from D. Campbell, War Plan UK, Paladin Books, May 1983, p387.) The problem here is the well known fallout decay rate; Trinity nuclear test ground zero was reported by Glasstone (Effects of Atomic Weapons, 1950) to be at 8,000 R/hr at 1 hour after burst, yet just 57 days later, on September 11, 1945, General Groves, Robert Oppenheimer, and a large group of journalists safely visited it and took their time inspecting the surviving tower legs, when the gamma dose rate was down to little more than 1 R/hr! So fission products decay fast: 1,000 R/hr at 1 hour decays to 100 at 7 hours, 10 at 2 days, and just 1 at 2 weeks. So the "radioactive wasteland" is just as much a myth as any other nuclear "doomsday" fictional headline in the media. 
Nuclear weapons effects have always been fake news in the mainstream media: editors have always regarded facts as "boring copy". Higher yield tests showed that even the ground zero crater "hot spot" dose rates were generally lower, due to dispersal of fallout by the larger mushroom cloud. If you're far downwind, you can simply walk cross-wind, or prepare an improvised shelter while the dust is blowing. But point any such errors out to fanatical bigots and they will just keep making up more nonsense.
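The decay figures quoted above follow the standard t^-1.2 approximation for mixed fission products (the "7:10 rule": a sevenfold increase in time gives roughly a tenfold drop in dose rate). A minimal sketch, assuming that law:

```python
# Fallout gamma dose-rate decay sketch using the standard t^-1.2
# approximation for mixed fission products.

def dose_rate(r1, t_hours):
    """Dose rate (R/hr) at t_hours after burst, given r1 R/hr at H+1."""
    return r1 * t_hours ** -1.2

# The figures quoted in the text, starting from 1,000 R/hr at H+1:
for t, label in [(7, "7 hours"), (49, "~2 days"), (343, "~2 weeks")]:
    print(f"{label}: {dose_rate(1000, t):.1f} R/hr")

# Trinity check: 8,000 R/hr at H+1; 57 days (~1,368 hours) later the
# rule gives "little more than 1 R/hr", as the text describes:
print(f"Trinity at 57 days: {dose_rate(8000, 57 * 24):.1f} R/hr")
```

Running this reproduces the quoted sequence (roughly 100, 10 and 1 R/hr) and about 1.4 R/hr for the Trinity revisit, consistent with Glasstone's figures cited above.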

Duncan Campbell's War Plan UK relies on the contradiction of claiming that the deliberately exaggerated UK Government worst-case civil defense "exercises" for training purposes are "realistic scenarios" (e.g. 1975 Inside Right, 1978 Scrum Half, 1980 Square Leg, 1982 Hard Rock planning), while simultaneously claiming the very opposite about reliable UK Government nuclear effects and sheltering effectiveness data, and hoping nobody would spot his contradictory tactics. He quotes extensively from these lurid worst-case scenario UK civil defense exercises, as if they are factually defensible rather than imaginary fiction to put planners under the maximum possible stress (standard UK military policy of “Train hard to fight easy”), while ignoring the far more likely limited nuclear uses scenario of Sir John Hackett's Third World War. His real worry is the 1977 UK Government Training Manual for Scientific Advisers which War Plan UK quotes on p14: "a potential threat to the security of the United Kingdom arising from acts of sabotage by enemy agents, possibly assisted by dissident groups. ... Their aim would be to weaken the national will and ability to fight. ... Their significance should not be underestimated." On the next page, War Plan UK quotes J. B. S. Haldane's 1938 book Air Raid Precautions (ARP) on the terrible destruction Haldane witnessed on unprotected people in the Spanish civil war, without even mentioning that Haldane's point is pro-civil defense, pro-shelters, and anti-appeasement of dictatorship, the exact opposite of War Plan UK which wants Russia to run the world. 
On page 124 of War Plan UK the false assertion is made that USA nuclear casualty data is "widely accepted" and true (declassified Hiroshima casualty data for people in modern concrete buildings proves it to be lies) while the correct UK nuclear casualty data is "inaccurate", and on page 126, Duncan Campbell simply lies that the UK Government's Domestic Nuclear Shelters - Technical Guidance "ended up offering the public a selection of shelters half of which were invented in the Blitz ... None of the designs was ever tested." In fact, Frank Pavry (who studied similar shelters surviving near ground zero at Hiroshima and Nagasaki in 1945 with the British Mission to Japan) and George R. Stanbury tested 15 Anderson shelters at the first UK nuclear explosion, Operation Hurricane in 1952, together with concrete structures, and many other improvised trench and earth-covered shelters were nuclear tested by the USA and UK at trials in 1955, 1956, 1957, and 1958, and later at simulated nuclear explosions by Cresson Kearny of Oak Ridge National Laboratory in the USA, using designs which had also earlier been exposed to early Russian nuclear tests (scroll down to see the evidence of this): improved versions of war-tested and nuclear-weapon-tested shelters! So War Plan UK makes no effort whatsoever to dig up the facts, and instead falsely claims the exact opposite of the plain unvarnished truth! War Plan UK shows its hypocrisy on page 383 in enthusiastically praising Russian civil defense:

"Training in elementary civil defence is given to everyone, at school, in industry or collective farms. A basic handbook of precautionary measures, Everybody must know this!, is the Russian Protect and Survive. The national civil defence corps is extensive, and is organized along military lines. Over 200,000 civil defence troops would be mobilized for rescue work in war. There are said to be extensive, dispersed and 'untouchable' food stockpiles; industrial workers are issued with kits of personal protection apparatus, said to include nerve gas counteragents such as atropine. Fallout and blast shelters are provided in the cities and in industrial complexes, and new buildings have been required to have shelters since the 1950s. ... They suggest that less than 10% - even as little as 5% - of the Soviet population would die in a major attack. [Less than Russia's loss of 12% of its population in WWII.]"

'LLNL achieved fusion ignition for the first time on Dec. 5, 2022. The second time came on July 30, 2023, when in a controlled fusion experiment, the NIF laser delivered 2.05 MJ of energy to the target, resulting in 3.88 MJ of fusion energy output, the highest yield achieved to date. On Oct. 8, 2023, the NIF laser achieved fusion ignition for the third time with 1.9 MJ of laser energy resulting in 2.4 MJ of fusion energy yield. “We’re on a steep performance curve,” said Jean-Michel Di Nicola, co-program director for the NIF and Photon Science’s Laser Science and Systems Engineering organization. “Increasing laser energy can give us more margin against issues like imperfections in the fuel capsule or asymmetry in the fuel hot spot. Higher laser energy can help achieve a more stable implosion, resulting in higher yields.” ... “The laser itself is capable of higher energy without fundamental changes to the laser,” said NIF operations manager Bruno Van Wonterghem. “It’s all about the control of the damage. Too much energy without proper protection, and your optics blow to pieces.” ' - https://lasers.llnl.gov/news/llnls-nif-delivers-record-laser-energy
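A quick arithmetic check of the target gain implied by the shot energies quoted above (Q is simply fusion energy out over laser energy delivered; the shot list below just restates the LLNL figures in the quote):

```python
# Target gain Q = fusion energy out / laser energy delivered, for the
# NIF ignition shots quoted above (energies in megajoules).
shots = {
    "2023-07-30": (2.05, 3.88),  # (laser MJ, fusion MJ)
    "2023-10-08": (1.9, 2.4),
}
for date, (e_laser, e_fusion) in shots.items():
    print(f"{date}: Q = {e_fusion / e_laser:.2f}")

# For scale: 1 kg of TNT releases about 4.184 MJ, so 3.88 MJ is just
# under 1 kg of TNT equivalent.
print(f"3.88 MJ is about {3.88 / 4.184:.2f} kg TNT equivalent")
```

So both shots exceeded Q = 1 (ignition in the NIF sense of target gain above unity), with the record shot reaching roughly Q = 1.9.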

NOTE: the "problem" of the very large lasers "required" to deliver ~2 MJ (roughly 0.5 kg of TNT energy) to cause larger fusion explosions of 2 mm diameter capsules of frozen D+T inside a 1 cm diameter energy reflecting hohlraum, and the "problem" of damage to the equipment caused by the explosions, are immaterial to clean nuclear deterrent development based on this technology, because in a clean nuclear weapon, whatever laser or other power ignition system is used only has to be fired once, so it can be far less robust than the NIF lasers, which are used repeatedly. Similarly, damage done to the system by the explosion is also immaterial for a clean nuclear weapon, in which the weapon is detonated once only! This is exactly the same point which finally arose during a critical review of the first gun-type assembly nuclear weapon, when the realisation that it would only ever be fired once (unlike a field artillery gun) enabled huge reductions in the size and weight of the device, turning it into a practical weapon, as described by General Leslie M. Groves on p163 of his 1962 book Now It Can Be Told: The Story of the Manhattan Project:

"Out of the Review Committee's work came one important technical contribution when Rose pointed out ... that the durability of the gun was quite immaterial to success, since it would be destroyed in the explosion anyway. Self-evident as this seemed once it was mentioned, it had not previously occurred to us. Now we could make drastic reductions in ... weight and size."

This principle also applies to weaponizing NIF clean fusion explosion technology. General Groves' book was reprinted in 1982 with a useful Introduction by Edward Teller on the nature of nuclear weapons history: "History in some ways resembles the relativity principle in science. What is observed depends on the observer. Only when the perspective of the observer is known, can proper corrections be made. ... The general ... very often managed to ignore complexity and arrive at a result which, if not ideal, at least worked. ... For Groves, the Manhattan project seemed a minor assignment, less significant than the construction of the Pentagon. He was deeply disappointed at being given the job of supervising the development of an atomic weapon, since it deprived him of combat duty. ... We must find ways to encourage mutual understanding and significant collaboration between those who defend their nation with their lives and those who can contribute the ideas to make that defense successful. Only by such cooperation can we hope that freedom will survive, that peace will be preserved."

General Groves similarly comments in Chapter 31, "A Final Word" of Now it can be told:

"No man can say what would have been the result if we had not taken the steps ... Yet, one thing seems certain - atomic energy would have been developed somewhere in the world ... I do not believe the United States ever would have undertaken it in time of peace. Most probably, the first developer would have been a power-hungry nation, which would then have dominated the world completely ... it is fortunate indeed for humanity that the initiative in this field was gained and kept by the United States. That we were successful was due entirely to the hard work and dedication of the more than 600,000 Americans who comprised and directly supported the Manhattan Project. ... we had the full backing of our government, combined with the nearly infinite potential of American science, engineering and industry, and an almost unlimited supply of people endowed with ingenuity and determination."

Update: Lawrence Livermore National Laboratory's $3.5 billion National Ignition Facility, NIF, has now repeatedly achieved nuclear fusion explosions of over 3 MJ, equivalent to nearly 1 kg of TNT explosive. It fires ultraviolet wavelength laser pulses of 2 MJ into a 1 cm-long hollow gold cylinder or "hohlraum", which is heated to a temperature where it re-radiates the energy at much higher frequency, as x-rays, on to the beryllium ablator of the central fusion capsule: a 2 mm diameter spherical beryllium shell of frozen D+T. The ablator blows off, and its recoil compresses the capsule efficiently, mimicking the isentropic compression mechanism of a miniature version of the 1962 Ripple II clean nuclear weapon's secondary stage. According to a Time article (linked here) about fusion system designer Annie Kritcher, the recent breakthrough was in part due to using a ramping input energy waveform: "success that came thanks to tweaks including shifting more of the input energy to the later part of the laser shot". This minimises the rise in entropy due to shock wave generation (which heats the capsule, causing it to expand and resist compression) and maximises isentropic compression, the principle used by LLNL's J. H. Nuckolls to achieve the 99.9% clean, 9.96 megaton Ripple II nuclear test success in Dominic-Housatonic on 30 October 1962. In 1972, Nuckolls published the equation for the idealized input power waveform required for isentropic, optimized compression of fusion fuel (Nature, v239, p139): P ~ (1 − t)^−1.875, where t is time in units of the shock transit time (the time taken for the shock to travel to the centre of the fusion capsule), and the exponent −1.875 depends on the specific heat of the ionized fuel (Nuckolls has provided the basic declassified principles, see extract linked here).
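Nuckolls' idealized drive law can be sketched numerically. The short Python snippet below (illustrative only: the absolute power scale and the sample times are our arbitrary choices, not from Nuckolls' paper) evaluates P ~ (1 − t)^−1.875 to show how the ramp stays gentle for most of the pulse and then rises steeply near the end of the shock transit time:

```python
# Illustrative sketch of Nuckolls' idealized isentropic drive-power law,
# P ~ (1 - t)^(-1.875), with t in units of the shock transit time
# (Nature, v239, p139, as quoted above). The power scale p0 is arbitrary.

def drive_power(t, p0=1.0, exponent=-1.875):
    """Relative drive power at normalized time t, 0 <= t < 1."""
    if not 0.0 <= t < 1.0:
        raise ValueError("t must lie in [0, 1): the power law diverges at t = 1")
    return p0 * (1.0 - t) ** exponent

# The ramp is gentle for most of the pulse, then steepens sharply near
# t = 1, which is why "shifting more of the input energy to the later
# part of the laser shot" approximates isentropic compression:
for t in (0.0, 0.5, 0.9, 0.99):
    print(f"t = {t:4.2f}   relative power = {drive_power(t):8.1f}")
```

The steep late-time rise is the whole point: most of the drive energy arrives near the end of the shot, after the fuel has already been gently pre-compressed.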
To be clear, the energy reliably released by the 2 mm diameter capsule of fusion fuel was roughly that of a 1 kg TNT explosion. 80% of this is in the form of 14.1 MeV neutrons (ideal for fissioning lithium-7 in LiD to yield more tritium), and 20% is the kinetic energy of fused nuclei (which is quickly converted into x-ray radiation energy by collisions). Nuckolls' 9.96 megaton Housatonic of 1962 (10 kt Kinglet primary and 9.95 Mt Ripple II 100% clean, isentropically compressed secondary) proved that it is possible to use multiplicative staging, whereby a lower yield primary nuclear explosion triggers a fusion stage 1,000 times more powerful than its initiator. Another key factor, as shown on our graph linked here, is that you can use cheap natural LiD as fuel once you have a successful D+T reaction: naturally abundant, cheap Li-7 fissions more readily with the 14.1 MeV neutrons from D+T fusion to yield tritium than does expensively enriched Li-6, which is needed only to make tritium in nuclear reactors, where the fission neutron energy of around 1 MeV is too low to fission Li-7. It should also be noted that although the secrecy surrounding Nuckolls' Ripple II success was stymied in 2021 by Jon Grams' openly published paper, the subject is still being covered up/ignored by the anti-nuclear biased Western media! Grams' article omits design details such as the isentropic power delivery curve from Nuckolls' declassified articles, which we include in the latest blog post here. One problem regarding "data" causing continuing confusion about the Dominic-Housatonic 30 October 1962 Ripple II test at Christmas Island is made clear in the DASA-1211 report's declassified summary of the sizes, weights and yields of those tests: Housatonic was Nuckolls' fourth and final isentropic test, with the nuclear system inserted into a heavy steel Mk36 drop case, making the overall package 57.2 inches in diameter, 147.9 inches long and 7,139.55 lb in mass, i.e.
a 1.4 kt/lb or 3.0 kt/kg yield-to-mass ratio for 9.96 Mt yield. That is not impressive for that yield range until you consider (a) that it was 99.9% fusion, and (b) that the isentropic design required a heavy hohlraum around the large Ripple II fusion secondary stage to confine x-rays for a relatively long time, during which a slowly rising pulse of x-rays was delivered from the primary to the secondary via very large areas of foam elsewhere in the weapon, to produce isentropic compression.
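As a quick arithmetic check of the DASA-1211 figures quoted above, this Python snippet converts the 9.96 Mt yield and 7,139.55 lb dropped mass into the yield-to-mass ratios cited (the pound-to-kilogram factor is the standard definition):

```python
# Arithmetic check of the Housatonic yield-to-mass figures quoted above
# (9.96 Mt yield, 7,139.55 lb Mk36 drop case, per DASA-1211).

LB_TO_KG = 0.45359237      # exact definition of the avoirdupois pound

yield_kt = 9.96e3          # 9.96 Mt expressed in kilotons
mass_lb = 7139.55
mass_kg = mass_lb * LB_TO_KG

kt_per_lb = yield_kt / mass_lb
kt_per_kg = yield_kt / mass_kg

print(f"{kt_per_lb:.2f} kt/lb  ({kt_per_kg:.2f} kt/kg)")
```

This reproduces the ~1.4 kt/lb and ~3.0 kt/kg figures quoted in the text.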

Additionally, the test was made in a hurry before an atmospheric test ban treaty, and this rushed use of a standard air-drop steel casing made the tested weapon much heavier than a properly weaponized Ripple II. The key point is that a 10 kt fission device set off a ~10 Mt fusion explosion, a very clean deterrent. Applying this Ripple II factor-of-1,000 multiplicative staging figure directly to this technology for clean nuclear warheads, a 0.5 kg TNT D+T fusion capsule would set off a 0.5 ton TNT 2nd stage of LiD, which would then set off a 0.5 kt 3rd stage "neutron bomb", which could then be used to set off a 500 kt 4th stage or "strategic nuclear weapon". In practice, the multiplication factor of 1,000 given by Ripple II in 1962 from 10 kt to 10 Mt may not be immediately achievable when going from ~1 kg TNT yield to 1 ton TNT, so a few more tiny stages may be needed at the lower yields. But there is every reason to forecast that with enough research, improvements will be possible and the device will become a reality. It is therefore now possible, not just in "theory" or in principle but with evidence obtained from practical experimentation using staging systems already proved in successful 1960s nuclear weapon tests, to design 100% clean fusion nuclear warheads! Yes, the details have been worked out; yes, the technology has been tested in piecemeal fashion. All that is now needed is a new, but quicker and cheaper, Star Wars program or Manhattan Project style effort to pull the components together. This would constitute a major leap forward in the credibility of the deterrence of aggressors.
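The staging chain described above can be sketched in a few lines of Python, assuming (optimistically, as the text itself cautions for the lowest yields) that the Ripple II factor of roughly 1,000 per stage holds at every level:

```python
# Sketch of the multiplicative staging chain described above, assuming
# (optimistically) the Ripple II factor of ~1,000 per stage at every level.
# All yields in kg of TNT equivalent.

def staging_chain(first_stage_kg, factor=1000, stages=4):
    """Yield of each stage when every stage multiplies the last by `factor`."""
    return [first_stage_kg * factor ** n for n in range(stages)]

labels = ["D+T capsule", "LiD 2nd stage", "3rd stage ('neutron bomb')",
          "4th stage ('strategic')"]
for label, y in zip(labels, staging_chain(0.5)):
    print(f"{label:28s} {y:>12g} kg TNT  = {y / 1e6:g} kt")
```

Starting from 0.5 kg TNT, the chain reproduces the 0.5 t, 0.5 kt and 500 kt stage yields given in the text.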

ABOVE: as predicted, the higher the input laser pulse for the D+T initiator of a clean multiplicatively-staged nuclear deterrent, the lower the effect of plasma instabilities and asymmetries, and the greater the fusion burn. To get ignition (where the x-ray energy injected into the fusion hohlraum by the laser is less than the energy released in the D+T fusion burn) they have had to use about 2 MJ delivered in 10 ns or so, equivalent to 0.5 kg of TNT. But for deterrent use, why use such expensive, delicate lasers? Why not just use one-shot miniaturised x-ray tubes with megavolt electron acceleration, powered by a suitably ramped pulse from a chemical-explosive magnetic flux compression current generator? At 10% efficiency, you need 0.5 x 10 = 5 kg of TNT! Even at 1% efficiency, 50 kg of TNT will do. Once the D+T gas capsule's hohlraum is well over 1 cm in size, to minimise the risk of imperfections that cause asymmetries, you no longer need focussed laser beams to enter tiny apertures. You might even be able to integrate many miniature flash x-ray tubes (each designed to burn out when firing one pulse of a MJ or so) into a special hohlraum. Humanity urgently needs a technological arms race akin to Reagan's Star Wars project, to deter the dictators from invasions and WWIII. In the conference video above, a question was asked about the real efficiency of the enormous repeat-pulse-capable laser system (repeat-pulse capability is not required for a nuclear weapon, whose components only need to work once, unlike lab equipment): the answer is that 300 MJ was required by the lab lasers to fire a 2 MJ pulse into the D+T capsule's x-ray hohlraum, i.e. their lasers are only 0.7% efficient! So why bother? We know - from the practical use of incoherent fission primary stage x-rays to compress and ignite fusion capsules in nuclear weapons - that you simply don't need coherent photons from a laser for this purpose.
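The efficiency arithmetic in this paragraph can be checked in a few lines of Python, using the standard convention of 4.184 MJ per kg of TNT:

```python
# Back-of-envelope check of the driver-efficiency argument above,
# using the standard 4.184 MJ per kg of TNT.

MJ_PER_KG_TNT = 4.184

laser_pulse_mj = 2.0     # energy delivered into the hohlraum
wallplug_mj = 300.0      # energy drawn by NIF's lab lasers, as quoted
laser_efficiency = laser_pulse_mj / wallplug_mj
print(f"laser wall-plug efficiency ~ {laser_efficiency:.1%}")

# Chemical energy needed to supply the same ~0.5 kg TNT equivalent pulse
# at an assumed conversion efficiency:
pulse_kg_tnt = laser_pulse_mj / MJ_PER_KG_TNT
for eff in (0.10, 0.01):
    print(f"at {eff:.0%} conversion efficiency: ~{pulse_kg_tnt / eff:.0f} kg of TNT")
```

The 2/300 ratio gives the ~0.7% laser efficiency quoted, and the 10% and 1% assumed conversion efficiencies give roughly the 5 kg and 50 kg TNT figures in the text.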
The sole reason they are approaching the problem with lasers is that they began their lab experiments decades ago with microscopic fusion capsules, for which you need a tightly focussed beam to insert energy through a tiny hohlraum aperture. But now that they are finally achieving success with much larger fusion capsules (to minimise the instabilities that caused the early failures), it may be time to change direction. A whole array of false "no-go theorems" can and will be raised by ignorant charlatan "authorities" against any innovation; this is the nature of the political world. There is some interesting discussion of why clean bombs aren't in existence today: basically, the idealized theory (which works fine for big H-bombs but ignores small-scale asymmetry problems that are important only at low ignition energy) underestimated the input energy required for fusion ignition by a factor of 2000:

"The early calculations on ICF (inertial-confinement fusion) by John Nuckolls in 1972 had estimated that ICF might be achieved with a driver energy as low as 1 kJ. ... In order to provide reliable experimental data on the minimum energy required for ignition, a series of secret experiments—known as Halite at Livermore and Centurion at Los Alamos—was carried out at the nuclear weapons test site in Nevada between 1978 and 1988. The experiments used small underground nuclear explosions to provide X-rays of sufficiently high intensity to implode ICF capsules, simulating the manner in which they would be compressed in a hohlraum. ... the Halite/Centurion results predicted values for the required laser energy in the range 20 to 100MJ—higher than the predictions ..." - Garry McCracken and Peter Stott, Fusion, Elsevier, 2nd ed., p149.
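To put the quoted numbers side by side (a simple cross-check, noting that the "factor of 2000" mentioned above corresponds to NIF's actual ~2.05 MJ ignition energy against Nuckolls' ~1 kJ estimate, while the Halite/Centurion predictions were higher still):

```python
# Comparing the ignition-energy figures quoted in the surrounding text:
# Nuckolls' 1972 estimate (~1 kJ), the Halite/Centurion range (20-100 MJ),
# and the ~2.05 MJ NIF actually needed for its December 2022 ignition shot.

nuckolls_estimate_kj = 1.0
nif_actual_kj = 2.05e3                 # 2.05 MJ
halite_low_kj, halite_high_kj = 20e3, 100e3

print(f"NIF actual / 1972 estimate:   {nif_actual_kj / nuckolls_estimate_kj:.0f}x")
print(f"Halite range / 1972 estimate: "
      f"{halite_low_kj / nuckolls_estimate_kj:.0f}x to "
      f"{halite_high_kj / nuckolls_estimate_kj:.0f}x")
```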

In the final diagram above, we illustrate an example of what could very well occur in the near future, just to really poke a stick into the wheels of "orthodoxy" in nuclear weapons design: is it possible to just use a lot of pulsed-current-driven microwave tubes from kitchen microwave ovens (perhaps hardened for higher currents, perhaps not), channelling their energy using waveguides (simply metal tubes, i.e. electrical Faraday cages, which reflect and thus contain microwaves) into the hohlraum, and to make the pusher out of dipole molecules (like common salt, NaCl), which are good absorbers of microwaves (as everybody knows from cooking in microwave ovens)? It would be extremely dangerous, not to mention embarrassing, if this worked but nobody had done any detailed research into the possibility due to groupthink orthodoxy and conventional boxed-in thinking! Remember, the D+T capsule just needs extreme compression, and this can be done by any means that works. Microwave technology is now very well-established. It's no good trying to keep anything of this sort "secret" (either officially or unofficially) since, as history shows, dictatorships are the places where "crackpot"-sounding ideas (such as double-primary Project "49" Russian thermonuclear weapon designs, Russian Sputnik satellites, Russian Novichok nerve agent, Nazi V1 cruise missiles, Nazi V2 IRBMs, etc.) can be given priority by loony dictators. We have to avoid, as Edward Teller put it (in his secret commentary debunking Bethe's false history of the H-bomb, written AFTER the Teller-Ulam breakthrough), "too-narrow" thinking (which Teller said was still in force on H-bomb design even then). Fashionable hardened orthodoxy is the soft underbelly of "democracy" (a dictatorship by the majority, which is always too focussed on fashionable ideas and dismissive of alternative approaches in science and technology).
Dictatorships (minorities against majorities) have repeatedly demonstrated a lack of concern for the fake "no-go theorems" used by Western anti-nuclear "authorities" to ban anything but fashionable groupthink science.

ABOVE: 1944-dated film of the Head of the British Mission to Los Alamos, neutron discoverer James Chadwick, explaining in detail to Americans how hard it was for him to discover the neutron, taking 10 years on a shoe-string budget, mostly due to having insufficiently strong sources of alpha particles to bombard nuclei in a cloud chamber! The idea of the neutron came from his colleague Rutherford. Chadwick reads his explanation while rapidly rotating a pencil in his right hand, perhaps indicating the stress he was under in 1944. In 1946, when British participation at Los Alamos ended, Chadwick wrote the first detailed secret British report on the design of a three-stage hydrogen bomb, another project that took over a decade. In the diagram below, it appears that the American Mk17 only had a single secondary stage, like the similar-yield 1952 Mike design. The point here is that popular misunderstanding of the simple mechanism of x-ray energy transfer for higher yield weapons may be creating a dogmatic attitude even in secret nuclear weapon design labs, where orthodoxy is followed too rigorously. The Russians (see quotes on the latest blog post here) state they used two entire two-stage thermonuclear weapons with a combined yield of 1 megaton to set off their 50 megaton test in 1961. If true, you can indeed use two-stage hydrogen bombs as an "effective primary" to set off another secondary stage of much higher yield. Can this be reversed, in the sense of scaling it down so you have several bombs-within-bombs, all triggered by a really tiny first stage? In other words, can it be applied to neutron bomb design?

ABOVE: 16 kt at 600m altitude nuclear explosion on a city, Hiroshima ground zero (in foreground) showing modern concrete buildings surviving nearby (unlike the wooden ones that mostly burned at the peak of the firestorm 2-3 hours after survivors had evacuated), in which people were shielded from most of the radiation and blast winds, as they were in simple shelters.

The 1946 Report of the British Mission to Japan, The Effects of the Atomic Bombs at Hiroshima and Nagasaki, compiled by a team of 16 in Hiroshima and Nagasaki during November 1945, which included 10 UK Home Office civil defence experts (W. N. Thomas, J. Bronowski, D. C. Burn, J. B. Hawker, H. Elder, P. A. Badland, R. W. Bevan, F. H. Pavry, F. Walley, O. C. Young, S. Parthasarathy, A. D. Evans, O. M. Solandt, A. E. Dark, R. G. Whitehead and F. G. S. Mitchell) found: "Para. 26. Reinforced concrete buildings of very heavy construction in Hiroshima, even when within 200 yards of the centre of damage, remained structurally undamaged. ... Para 28. These observations make it plain that reinforced concrete framed buildings can resist a bomb of the same power detonated at these heights, without employing fantastic thicknesses of concrete. ... Para 40. The provision of air raid shelters throughout Japan was much below European standards. ... in Hiroshima ... they were semi-sunk, about 20 feet long, had wooden frames, and 1.5-2 feet of earth cover. ... Exploding so high above them, the bomb damaged none of these shelters. ... Para 42. These observations show that the standard British shelters would have performed well against a bomb of the same power exploded at such a height. Anderson shelters, properly erected and covered, would have given protection. Brick or concrete surface shelters with adequate reinforcement would have remained safe from collapse. The Morrison shelter is designed only to protect its occupants from the debris load of a house, and this it would have done. Deep shelters such as the refuge provided by the London Underground would have given complete protection. ... Para 60. Buildings and walls gave complete protection from flashburn."

Glasstone and Dolan's 1977 Effects of Nuclear Weapons, in Table 12.21 on p547, flunks making this point credible to readers by giving the data without citing its source: it correlated 14% mortality (106 killed out of 775 people in Hiroshima's Telegraph Office) with "moderate damage" at 500 m in Hiroshima (the uncited "secret" source was NP-3041, Table 12, which applies to unwarned people inside modern concrete buildings).

"A weapon whose basic design would seem to provide the essence of what Western morality has long sought for waging classical battlefield warfare - to keep the war to a struggle between the warriors and exclude the non-combatants and their physical assets - has been violently denounced, precisely because it achieves this objective." - Samuel T. Cohen (quoted in Chapman Pincher, The secret offensive, Sidgwick and Jackson, London, 1985, Chapter 15: The Neutron Bomb Offensive, p210).

The reality is, dedicated enhanced-neutron tactical nuclear weapons credibly deterred the concentrations of force required to trigger WWIII during the 1st Cold War, and the thugs who support Russian propaganda for Western disarmament got rid of them on our side, but not on the Russian side. Whether air burst, or even used as subsurface earth penetrators of relatively low fission yield (where the soil converts energy that would otherwise escape as blast and radiation into ground shock for destroying buried tunnels - new research on cratering shows that a 20 kt subsurface burst creates similar effects on buried hard targets to a 1 Mt surface burst), neutron bombs cause none of the vast collateral damage to civilians that we see now in Ukraine and Gaza, or that we saw in WWII and the wars in Korea and Vietnam. This is 100% contrary to CND propaganda, which is a mixture of lying about nuclear explosion collateral damage, escalation/knockout-blow propaganda (of the type used by appeasers to start WWII), and lying about the designs of nuclear weapons in order to ensure that the Western side (but not the thugs) gets only incredible "strategic deterrence" that can't deter the invasions that start world wars (e.g. Belgium in 1914 and Poland in 1939). "Our country entered into an agreement in Budapest, Hungary when the Soviet Union was breaking up that we would guarantee the independence of Ukraine." - Tom Ramos. There really is phoney nuclear groupthink left agenda politics at work here: credible, relatively clean tactical nuclear weapons are banned in the West but stocked by Russia, which has civil defense shelters to make its threats far more credible than ours! We need low-collateral-damage enhanced-neutron and earth-penetrator options for the new Western W93 warhead, or we remain vulnerable to aggressive coercion by thugs, and invite invasions.
Ambiguity, the current policy (used to "justify" secrecy on just what we would do in any scenario), actually encourages experimental provocations by enemies to test what we are prepared to do (if anything), just as it did in 1914 and the 1930s.

ABOVE: 0.2 kt (tactical yield range) Ruth nuclear test debris, with the lower 200 feet of the 300 ft steel tower surviving in Nevada, 1953. Note that the yield of the tactical invasion-deterrent Mk54 Davy Crockett was only 0.02 kt, 10 times less than the 0.2 kt Ruth.

It should be noted that cheap and naive "alternatives" to credible deterrence of war were tried in the 1930s and during the Cold War and afterwards, with disastrous consequences. Heavy "peaceful" oil sanctions and other embargoes against Japan for its invasion of China between 1931-7 resulted in the plan for the Pearl Harbor surprise attack of 7 December 1941, with subsequent escalation to incendiary city bombing followed by nuclear warfare against Hiroshima and Nagasaki. Attlee's pressure on Truman to guarantee no use of tactical nuclear weapons in the Korean War (leaked straight to Stalin by the Cambridge Spy Ring) led to an escalation of that war, causing the total devastation of the cities of that country by conventional bombing (a sight witnessed by Sam Cohen, which motivated his neutron bomb deterrent of invasions), until Eisenhower was elected and reversed Truman's decision, leading not to the "escalatory Armageddon" asserted by Attlee, but instead to a peaceful armistice! Similarly, as Tom Ramos argues in From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Kennedy's advisers convinced him to go ahead with the moonlit 17 April 1961 Bay of Pigs invasion of Cuba without any USAF air support, which led to precisely what they claimed they would avoid: an escalation of aggression from Russia in Berlin, with the Berlin Wall going up on 13 August 1961, because showing any weakness to an enemy, as in the bungled invasion of Cuba, is always a green light to dictators to go ahead with revolutions, invasions and provocations everywhere else. Rather than the widely hyped claims from disarmers and appeasers about "weakness bringing peace by demonstrating to the enemy that they have nothing to fear from you", the opposite result always occurs. The paranoid dictator seizes the opportunity to strike first.
Similarly, withdrawing from Afghanistan in 2021 was a clear green light to Russia to go ahead with a full-scale invasion of Ukraine, reigniting the Cold War. Von Neumann and Morgenstern's Minimax theorem for winning games - minimise the maximum possible loss - fails for offensive action in war because it sends a signal of weakness to the enemy, which does not treat war as a game with rules to be obeyed. Minimax is only valid for defense, such as the civil defense shelters used by Russia to make their threats more credible than ours. The sad truth is that cheap fixes don't work, no matter how much propaganda is behind them. You either need to militarily defeat the enemy or at least economically defeat them using proven Cold War arms race techniques (not merely ineffective sanctions, which they can bypass by making alliances with Iran, North Korea, and China). Otherwise, you are negotiating peace from a position of weakness, which is called appeasement, or collaboration with terrorism.

"Following the war, the Navy Department was intent to see the effects of an atomic blast on naval warships ... the press was invited to witness this one [Crossroads-Able, 23.5 kt at 520 feet altitude, 1 July 1946, Bikini Atoll]. ... The buildup had been too extravagant. Goats that had been tethered on warship decks were still munching their feed, and the atoll's palm trees remained standing, unscathed. The Bikini test changed public attitudes. Before July 1, the world stood in awe of a weapon that had devastated two cities and forced the Japanese Empire to surrender. After that date, the bomb was still a terrible weapon, but a limited one." - Tom Ramos (LLNL nuclear weaponeer and nuclear-pumped X-ray laser developer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute Press, 2022, pp43-4.

ABOVE: 16 February 1950 Daily Express editorial on the H Bomb problem, due to the fact that the UN is another virtue-signalling but really war-mongering League of Nations (which oversaw Nazi appeasement and the outbreak of WWII); however, Fuchs had attended the April 1946 Super Conference, during which the Russian version of the H-bomb, involving isentropic radiation implosion of a separate low-density fusion stage (unlike Teller's later dense-metal ablation-rocket-implosion secondary TX14 Alarm Clock and Sausage designs), was discussed and then given to Russia. The media was made aware only that Fuchs had given the fission bomb to Russia. The FBI later visited Fuchs in a British jail, showed him a film of Harry Gold (whom Fuchs identified as his contact while at Los Alamos) and also gave Fuchs a long list of secret reports to mark off individually, so that they knew precisely what Stalin had been given. Truman didn't order H-bomb research and development because Fuchs gave Stalin the A-bomb, but because he gave them the H-bomb. The details of the Russian H-bomb are still being covered up by those who want a repetition of 1930s appeasement, or indeed of the deliberate ambiguity of the UK Cabinet in 1914, which made it unclear what the UK would do if Germany invaded Belgium, allowing the enemy to exploit that ambiguity and start a world war. The key fact usually covered up here (Richard Rhodes, Chuck Hansen, and the whole American "expert nuclear arms community" all misleadingly claim that Teller's Sausage H-bomb design, with a single primary and a dense ablator - uranium, lead or tungsten - around a cylindrical secondary stage, is the "hydrogen bomb design") is that two attendees of the April 1946 Super Conference - the report author Egon Bretscher and the radiation implosion discoverer Klaus Fuchs - were British, and both contributed key H-bomb design principles to the Russian and British weapons (discarded for years by America).
Egon Bretscher, for example, wrote up the Super Conference report, during which attendees suggested various ways to try to achieve isentropic compression of low-density fusion fuel (a concept discarded by Teller's 1951 Sausage design, but used by Russia and re-developed in America in Nuckolls' 1962 Ripple tests). After Teller left Los Alamos, Bretscher took over work on Teller's Alarm Clock layered fission-fusion spherical hybrid device before himself leaving Los Alamos to become head of nuclear physics at Harwell, UK, submitting a UK report together with Fuchs (head of theoretical physics at Harwell) which led to Sir James Chadwick's UK paper on a three-stage thermonuclear Super bomb, which formed the basis of Penney's work at the UK Atomic Weapons Research Establishment. While Bretscher had worked on Teller's hybrid Alarm Clock (which originated two months after Fuchs left Los Alamos), Fuchs co-authored a hydrogen bomb patent with John von Neumann, in which radiation implosion and ionization implosion were used. Between them, Bretscher and Fuchs had all the key ingredients. Fuchs leaked them to Russia, and the problem persists today in international relations.

ILLUSTRATION: the threat of WWII and the need to deter it was massively derided by popular pacifism, which tended to make "jokes" of the Nazi threat until too late (an example of 1938 UK fiction on this is above; Charlie Chaplin's film "The Great Dictator" is another example). So, three years after the Nuremberg Laws and five years after illegal rearmament was begun by the Nazis, crowds of UK "pacifists" in Downing Street, London, supported friendship with the top racist, dictatorial Nazis in the name of "world peace". The Prime Minister used underhand techniques to try to undermine appeasement critics like Churchill, and also later to get W. E. Johns fired from the editorships of both Flying (weekly) and Popular Flying (monthly), to make it appear that everybody "in the know" agreed with his actions, hence the contrived "popular support" for collaborating with terrorists depicted in these photos. The same thing persists today; the 1920s and 1930s "pacifist" was also driven by "escalation" and "annihilation" claims that explosions, fire and WMD poison gas would kill everybody in a "knockout blow" immediately any war broke out.

Update (4 January 2024): on the important world crisis, https://vixra.org/abs/2312.0155 gives a detailed review of "Britain and the H-bomb" (linked here), and of why the "nuclear deterrence issue" isn't about "whether we should deter evil", but about precisely what design of nuclear warhead we should have in order to do that cheaply, credibly, safely, and efficiently, without guaranteeing either escalation or the failure of deterrence. When we disarmed our chemical and biological weapons, it was claimed that the West could easily deter those weapons by using strategic nuclear weapons to bomb Moscow (which has shelters, unlike us). That failed when Putin used sarin and chlorine to prop up Assad in Syria, and Novichok in the UK to kill Dawn Sturgess in 2018. So it's just not a credible deterrent to say you will bomb Moscow if Putin invades Europe or uses his 2000 tactical nuclear weapons. An even more advanced deterrent, the 100% clean, very low yield (or any yield) multiplicatively staged design without any fissile material whatsoever, is just around the corner.
Clean secondary stages have been proof-tested successfully, for example in the 100% clean Los Alamos Redwing Navajo secondary and the 100% clean Ripple II secondary tested 30 October 1962; and the laser ignition of a very tiny fusion capsule to yield more energy than supplied was achieved on 5 December 2022, when a NIF test delivered 2.05 MJ (the energy of about 0.5 kg of TNT) to a fusion capsule which yielded 3.15 MJ. So all that is needed is to combine both ideas in a system whereby suitably sized second stages - ignited in the first place by a capacitively charged circuit sending a pulse of energy to a suitable laser system (the schematic shown is just a sketch of principle; more than one laser might be required for reliability of fusion ignition) acting on a tiny fusion capsule as shown - are encased to form two-stage "effective primaries", each of which becomes the effective primary of a bigger system: a geometric series of multiplicative staging until the desired yield is reached. Note that the actual tiny first T+D capsule can be compressed by one-shot lasers - compact lasers used way beyond their traditional upper power limit and burned out in firing a single pulse - in the same way the gun assembly of the Hiroshima bomb was based on a one-shot gun. In other words, forget all about textbook gun design. The Hiroshima bomb gun assembly system only had to be fired once, unlike a field artillery piece, which has to be ready to be fired many thousands of times (before metal fatigue/cracks set in). Thus, by analogy, the lasers - which can be powered by ramping current pulses from magnetic flux compressor systems - for use in a clean bomb will be much smaller and lighter than current lab gear, which is designed to be used thousands of times in repeated experiments.
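As a sanity check on the 5 December 2022 NIF figures quoted above, the target gain and TNT equivalents work out as follows (again using the standard 4.184 MJ per kg of TNT):

```python
# Sanity check of the 5 December 2022 NIF shot figures quoted above:
# 2.05 MJ delivered to the capsule, 3.15 MJ fusion yield.

MJ_PER_KG_TNT = 4.184

e_in_mj, e_out_mj = 2.05, 3.15
gain = e_out_mj / e_in_mj     # > 1 is the "ignition" criterion quoted

print(f"target gain = {gain:.2f}")
print(f"energy in   ~ {e_in_mj / MJ_PER_KG_TNT:.2f} kg TNT equivalent")
print(f"energy out  ~ {e_out_mj / MJ_PER_KG_TNT:.2f} kg TNT equivalent")
```

The gain of roughly 1.5 is modest, which is why the text argues that multiplicative staging, not the laser driver itself, would supply the amplification in any weaponized system.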
The diagram below shows cylindrical Li6D stages throughout for a compact bomb shape, but spherical stages can be used, and once the first few stages have fired, the flux of 14 MeV neutrons is sufficient to switch to cheap natural LiD. To fit it into a MIRV warhead, the low density of LiD constrains such a clean warhead to a low nuclear yield, which means a tactical neutron deterrent of the invasions that cause big wars: a conversion of incredible strategic deterrence into a more credible combined strategic-tactical deterrent of major provocations, not just direct attacks. It should also be noted that in 1944 von Neumann suggested that T + D inside the core of the fission weapon would be compressed by "ionization compression" during fission (where a higher density ionized plasma compresses a lower density ionized plasma, i.e. the D + T plasma), an idea that was - years later - named the Internal Booster principle by Teller; see Frank Close, "Trinity", Allen Lane, London, 2019, pp158-159, where Close argues that during the April 1946 Superbomb Conference, Fuchs extended von Neumann's 1944 internal fusion boosting idea to an external D + T filled BeO-walled capsule:

"Fuchs reasoned that [the very low energy, 1-10 keV, approximately 10-100 times lower in energy than medical] x-rays from the [physically separated] uranium explosion would reach the tamper of beryllium oxide, heat it, ionize the constituents and cause them to implode - the 'ionization implosion' concept of von Neumann but now applied to deuterium and tritium contained within beryllium oxide. To keep the radiation inside the tamper, Fuchs proposed to enclose the device inside a casing impervious to radiation. The implosion induced by the radiation would amplify the compression ... and increase the chance of the fusion bomb igniting. The key here is 'separation of the atomic charge and thermonuclear fuel, and compression of the latter by radiation travelling from the former', which constitutes 'radiation implosion'." (This distinction between von Neumann's "ionization implosion" INSIDE the tamper, where the denser tamper expands and thus compresses the lower density fusion fuel inside, and Fuchs' OUTSIDE-capsule "radiation implosion", is key even today for isentropic H-bomb design; it seems Teller's key breakthroughs were not separate stages or implosion, but rather radiation mirrors and ablative recoil shock compression, where radiation is used to ablate a dense pusher, as in Sausage designs like Mike in 1952 etc. - a distinction not to be confused with the 1944 von Neumann and 1946 Fuchs implosion mechanisms!)

It appears Russian H-bombs used von Neumann's "ionization implosion" and Fuchs's "radiation implosion" for RDS-37 on 22 November 1955 and also in their double-primary 23 February 1958 test and subsequently, where their fusion capsules reportedly contained a BeO or other low-density outer coating, which would lead to quasi-isentropic compression, more effective for low density secondary stages than purely ablative recoil shock compression. This accounts for the continuing classification of the April 1946 Superbomb Conference (the extract of 32 pages linked here is so severely redacted that it is less helpful than the brief but very lucid summary of its technical content, in the declassified FBI compilation of reports concerning data Klaus Fuchs sent to Stalin, linked here!). Teller had all the knowledge he needed in 1946, but didn't go ahead because he made the stupid error of killing progress off by his own "no-go theorem" against compression of fusion fuel. Teller did a "theoretical" calculation in which he claimed that compression has no effect on the amount of fusion burn because the compressed system is simply scaled down in size so that the same efficiency of fusion burn occurs, albeit faster, and then stops as the fuel thermally expands. This was wrong. Teller discusses the reason for his great error in technical detail during his tape-recorded interview by Chuck Hansen at Los Alamos on 7 June 1993 (C. Hansen, Swords of Armageddon, 2nd ed., pp. II-176-7):

"Now every one of these [fusion] processes varied with the square of density. If you compress the thing, then in one unit's volume, each of the 3 important processes increased by the same factor ... Therefore, compression (seemed to be) useless. Now when ... it seemed clear that we were in trouble, then I wanted very badly to find a way out. And it occurred to me that an unprecedentedly strong compression will just not allow much energy to go into radiation. Therefore, something had to be wrong with my argument and then, you know, within minutes, I knew what must be wrong ... [energy] emission occurs when an electron and a nucleus collide. Absorption does not occur when a light quantum and a nucleus ... or ... electron collide; it occurs when a light quantum finds an electron and a nucleus together ... it does not go with the square of the density, it goes with the cube of the density." (This very costly theoretical error, wasting the five years 1946-51, could have been resolved by experimental nuclear testing. There is always a risk of this in theoretical physics, which is why experiments are done to check calculations before prizes are handed out. The ban on nuclear testing is a luddite opposition to technological progress in improving deterrence.)
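Teller's corrected scaling argument can be sketched numerically (an illustrative toy calculation in arbitrary units, not any real weapon calculation): compressing the linear dimensions by a factor k raises the density as k^3; radiation emission then rises as the square of density, but re-absorption rises as its cube, so the absorption-to-emission ratio grows in direct proportion to the density and compression suppresses the net radiation leak.

```python
# Toy illustration of Teller's corrected argument (arbitrary units):
# emission per unit volume scales as density^2 (electron-nucleus collisions),
# but re-absorption scales as density^3 (quantum + electron + nucleus together),
# so compression shifts the energy balance towards the fuel retaining heat.
def absorption_to_emission(compression):
    """compression = linear radius reduction factor k (density grows as k^3)."""
    rho = compression ** 3          # density multiplier
    emission = rho ** 2             # radiation loss rate multiplier
    absorption = rho ** 3           # radiation re-absorption multiplier
    return absorption / emission    # grows linearly with density

for k in (1, 2, 3):
    print(k, absorption_to_emission(k))   # ratio = k^3, i.e. 1, 8, 27
```

Halving the radius (k = 2) thus multiplies the density eightfold and tilts the absorption-to-emission balance by the same factor of eight.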

(This 1946-51 theoretical "no-go theorem" anti-compression error of Teller's, which was contrary to the suggestion of compression at the April 1946 Superbomb Conference (as Teller himself noted on 14 August 1952), and which was corrected only in Feb '51, by comparison with the accepted validity of compression in pure fission cores, after Ulam's argument that month for fission core compression by lens focussed primary stage shock waves, did not merely lead to Teller's dismissal of vital compression ideas. It also led to his false equations - exaggerating the cooling effect of radiation emission - causing underestimates of fusion efficiency in all theoretical calculations of fusion done until 1951! For this reason, Teller later repudiated the calculations that allegedly showed his Superbomb would fizzle; he argued that if it had been tested in 1946, the detailed data obtained - regardless of whatever happened - would at least have tested the theory, which would have led to rapid progress, because the theory was wrong. The entire basis of the cooling of fusion fuel by radiation leaking out was massively exaggerated until Lawrence Livermore weaponeer John Nuckolls showed that there is a very simple solution: use baffle re-radiated, softened x-rays for isentropic compression of low-density fusion fuel, e.g. very cold 0.3 kev x-rays rather than the usual 1-10 kev cold-warm x-rays emitted directly from the fission primary. Since the radiation losses are proportional to the fourth power of the x-ray energy or temperature, losses are virtually eliminated, allowing very efficient staging as for Nuckolls' 99.9% clean 10 Mt Ripple II, detonated on 30 October 1962 at Christmas Island. Teller's classical Superbomb was actually analyzed by John C. Solem in a 15 December 1978 report, A modern analysis of Classical Super, LA-07615, the subject of a Freedom of Information Act request filed by mainstream historian Alex Wellerstein (FOIA 17-00131-H, 12 June 2017), according to a list of FOIA requests at https://www.governmentattic.org/46docs/NNSAfoiaLogs_2016-2020.pdf. However, a Google search for the documents Dr Wellerstein requested shows only a few at the US Gov DOE Opennet OSTI database or otherwise online yet, e.g. LA-643 by Teller, On the development of Thermonuclear Bombs, dated 16 Feb. 1950. The page linked here stating that report was "never classified" is mistaken! One oddity about Teller's anti-compression "no-go theorem" is that even if fusion rates were independent of density, you would still want compression of fissile material in a secondary stage such as a radiation imploded Alarm Clock, because the whole basis of implosion fission bombs is the benefit of compression; another issue is that even if fusion rates are unaffected by density, inward compression would still help to delay the expansion of the fusion system which leads to cooling and quenching of the fusion burn.)
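The fourth-power radiation-loss scaling just quoted is easy to verify numerically (a sketch: the 0.3 kev figure is from the text above, while 3 kev is taken here as a representative value of the "usual 1-10 kev" range, an illustrative assumption):

```python
# Fourth-power (Stefan-Boltzmann-type) scaling of radiation losses with
# x-ray temperature/energy, as quoted in the text for Nuckolls' approach.
def relative_radiation_loss(t_soft_kev, t_hard_kev):
    """Radiation loss of soft x-ray drive relative to harder x-ray drive."""
    return (t_soft_kev / t_hard_kev) ** 4

# 0.3 kev re-radiated drive vs an assumed representative 3 kev direct drive:
print(relative_radiation_loss(0.3, 3.0))   # ~1e-4, a ~10,000-fold reduction
```

On this scaling, softening the drive from ~3 kev to ~0.3 kev cuts radiation losses by roughly four orders of magnitude, which is why the text says losses are "virtually eliminated".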

ABOVE: the FBI file on Klaus Fuchs contains a brief summary of the secret April 1946 Super Conference at Los Alamos which Fuchs attended, noting that compression of fusion fuel was discussed by Lansdorf during the morning session on 19 April, attended by Fuchs, and that: "Suggestions were made by various people in attendance as to the manner of minimizing the rise in entropy during compression." This fact is vitally interesting, since it proves that an effort was being made then to secure isentropic compression of low-density fusion fuel in April 1946, sixteen years before John H. Nuckolls tested the isentropically compressed Ripple II device on 30 October 1962, giving a 99.9% clean 10 megaton real H-bomb! So the Russians were given a massive head start on this isentropic compression of low-density fusion fuel for hydrogen bombs, used (according to Trutnev) in both the single primary tests like RDS-37 in November 1955 and also in the double-primary designs which were 2.5 times more efficient on a yield-to-mass basis, tested first on 23 February 1958! According to the FBI report, the key documents Fuchs gave to Russia were LA-551, Prima facie proof of the feasibility of the Super, 15 Apr 1946 and the LA-575 Report of conference on the Super, 12 June 1946. Fuchs also handed over to Russia his own secret Los Alamos reports, such as LA-325, Initiator Theory, III. Jet Formation by the Collision of Two Surfaces, 11 July 1945, Jet Formation in Cylindrical Implosion with 16 Detonation Points, Secret, 6 February 1945, and Theory of Initiators II, Melon Seed, Secret, 6 January 1945.
Note the reference to Bretscher attending the Super Conference with Fuchs; Teller, in a classified 50th anniversary conference at Los Alamos on the H-bomb, claimed that after he (Teller) left Los Alamos for the University of Chicago in 1946, Bretscher continued work on Teller's 31 August 1946 "Alarm Clock" nuclear weapon (precursor of the Mike sausage concept etc) at Los Alamos; it was this layered uranium and fusion fuel "Alarm Clock" concept which led to the departure of Russian H-bomb design from American H-bomb design, simply because Fuchs left Los Alamos in June 1946, well before Teller invented the Alarm Clock concept on 31 August 1946 (Teller remembered the date precisely because he invented the Alarm Clock on the day his daughter was born, 31 August 1946! Teller and Richtmyer also developed a variant called "Swiss Cheese", with small pockets or bubbles of expensive fusion fuels dispersed throughout cheaper fuel, in order to kindle a more cost-effective thermonuclear reaction; this later inspired the fission and fusion boosted "spark plug" ideas in later Sausage designs; e.g. security cleared Los Alamos historian Anne Fitzpatrick stated during her 4 March 1997 interview with Robert Richtmyer, who co-invented the Alarm Clock with Teller, that the Alarm Clock evolved into the spherical secondary stage of the 6.9 megaton Castle-Union TX-14 nuclear weapon!).

In fact (see Lawrence Livermore National Laboratory nuclear warhead designer Nuckolls' explanation in report UCRL-74345): "The rates of burn, energy deposition by charged reaction products, and electron-ion heating are proportional to the density, and the inertial confinement time is proportional to the radius. ... The burn efficiency is proportional to the product of the burn rate and the inertial confinement time ...", i.e. the fusion burn rate is directly proportional to the fuel density, which in turn is of course inversely proportional to the cube of its radius. But the inertial confinement time for fusion to occur is proportional to the radius, so the fusion stage efficiency in a nuclear weapon is the product of the burn rate (i.e., 1/radius^3) and time (i.e., radius), so efficiency ~ radius/(radius^3) ~ 1/radius^2. Therefore, for a given fuel temperature, the total fusion burn, or the efficiency of the fusion stage, is inversely proportional to the square of the compressed radius of the fuel! (Those condemning Teller's theoretical errors or "arrogance" should be aware that he pushed hard all the time for experimental nuclear tests of his ideas, to check if they were correct, which is exactly the right thing to do scientifically, and others who read his papers had the opportunity to point out any theoretical errors; but he was rebuffed by those in power, who used a series of contrived arguments to deny progress, based upon what Harry would call "subconscious bias", if not arrogant, damning, overt bigotry against the kind of credible, overwhelming deterrence which had proved lacking a decade earlier, leading to WWII.
This callousness towards human suffering in war and under dictatorship existed in some UK physicists too: Joseph Rotblat's hatred of anything that might deter Russia, be it Western civil defense or tactical neutron bombs - he had no problem smiling and patting Russia's neutron bomb when visiting their labs during cosy, groupthink-deluded Pugwash campaigns for Russian-style "peaceful collaboration" - came from deep family communist convictions, since his brother was serving in the Red Army in 1944 when Rotblat alleged he heard General Groves declare that the bomb must deter Russia! Rotblat stated he left Los Alamos as a result. The actions of these groups are analogous to the "Cambridge Scientists Anti-War Group" in the 1930s. After Truman ordered an H-bomb, Bradbury at Los Alamos had to start a "Family Committee" because Teller had a whole "family" of H-bomb designs, ranging from the biggest, "Daddy", through various "Alarm Clocks", all the way down to small internally-boosted fission tactical weapons. From Teller's perspective, he wasn't putting all his eggs in one basket.)
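The Nuckolls scaling quoted above (burn rate ~ density ~ 1/radius^3, confinement time ~ radius, so efficiency ~ 1/radius^2) can be checked with a trivial calculation in arbitrary relative units:

```python
# Numerical check of the Nuckolls burn-efficiency scaling quoted above.
# All quantities are relative (arbitrary units); this is a scaling sketch,
# not a weapon calculation.
def relative_efficiency(radius):
    burn_rate = radius ** -3        # proportional to fuel density ~ 1/r^3
    confinement_time = radius       # proportional to compressed radius
    return burn_rate * confinement_time   # ~ 1/r^2

# Halving the compressed radius quadruples the fusion burn efficiency:
print(relative_efficiency(0.5) / relative_efficiency(1.0))   # 4.0
```

This is why compression matters: a factor-of-two radius reduction (an eightfold density increase) gives a fourfold gain in burn efficiency at fixed temperature.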

Above: declassified illustration from a January 1949 secret report by the popular physics author and Los Alamos nuclear weapons design consultant George Gamow, showing his suggestion of using x-rays from both sides of a cylindrically imploded fission device to expose two fusion capsules to x-rays, to test whether compression (fusion in BeO box on right side) helps, or is unnecessary (capsule on left side). Neutron counters detect 14.1 Mev T+D neutrons using the time-of-flight method (higher energy neutrons travel faster than ~1 Mev fission stage neutrons, arriving at detectors first, allowing discrimination of the neutron energy spectrum by time of arrival). It took over two years to actually fire this 225 kt shot (8 May 1951)! No wonder Teller was outraged. A few interesting reports by Teller, and also Oppenheimer's secret 1949 report opposing the H-bomb project as it then stood on the grounds of low damage per dollar - precisely the exact opposite of the "interpretation" the media and gormless fools will assert until the cows come home - are linked here. The most interesting is Teller's 14 August 1952 Top Secret paper debunking Hans Bethe's propaganda, by explaining that contrary to Bethe's claims, Stalin's spy Klaus Fuchs had the key "radiation implosion" secret of the H-bomb (see second para on p2), because he attended the April 1946 Superbomb Conference, which was not even attended by Bethe! It was this very fact, noted by two British attendees of the 1946 Superbomb Conference before collaboration was ended later in the year by the 1946 Atomic Energy Act, that led to Sir James Chadwick's secret use of "radiation implosion" for stages 2 and 3 of his triple staged H-bomb report the next month, "The Superbomb", a still secret document that inspired Penney's original Tom/Dick/Harry staged and radiation imploded H-bomb thinking, which is summarized by security cleared official historian Arnold's Britain and the H-Bomb.
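The time-of-flight discrimination Gamow proposed can be sketched with relativistic kinematics (the 100 m flight path below is an assumed illustrative figure, not from Gamow's report):

```python
import math

C = 2.99792458e8        # speed of light, m/s
MN_MEV = 939.56542      # neutron rest energy, MeV

def neutron_speed(kinetic_mev):
    """Relativistic speed (m/s) of a neutron with the given kinetic energy."""
    gamma = 1.0 + kinetic_mev / MN_MEV
    return C * math.sqrt(1.0 - 1.0 / gamma**2)

distance = 100.0        # assumed detector distance in metres (illustrative)
t_fusion = distance / neutron_speed(14.1)   # T+D fusion neutrons
t_fission = distance / neutron_speed(1.0)   # typical fission-spectrum neutrons
print(round(t_fusion * 1e6, 2), "microseconds")    # ~1.95
print(round(t_fission * 1e6, 2), "microseconds")   # ~7.24
```

The 14.1 Mev fusion neutrons (about 0.17c) outrun the ~1 Mev fission neutrons (about 0.046c) by several microseconds over this flight path, so the early-arriving counts unambiguously signal T+D fusion.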
Teller's 24 March 1951 letter to Los Alamos director Bradbury was written just 15 days after his historic Teller-Ulam 9 March 1951 report on radiation coupling and "radiation mirrors" (i.e. plastic casing lining to re-radiate soft x-rays on to the thermonuclear stage to ablate and thus compress it), and states: "Among the tests which seem to be of importance at the present time are those concerned with boosted weapons. Another is connected with the possibility of a heterocatalytic explosion, that is, implosion of a bomb using the energy from another, auxiliary bomb. A third concerns itself with tests on mixing during atomic explosions, which question is of particular importance in connection with the Alarm Clock."

There is more to Fuchs' influence on the UK H-bomb than I go into in that paper; Chapman Pincher alleged that Fuchs was treated with special leniency at his trial, and was later given early release in 1959, because of his contributions and help with the UK H-bomb as author of the key Fuchs-von Neumann x-ray compression mechanism patent. For example, Penney visited Fuchs in June 1952 in Stafford Prison; see pp. 309-310 of Frank Close's 2019 book "Trinity". Close argues that Fuchs gave Penney a vital tutorial on the H-bomb mechanism during that prison visit. That wasn't the last help, either, since the UK Controller for Atomic Energy, Sir Freddie Morgan, wrote to Penney on 9 February 1953 that Fuchs was continuing to help. Another gem: Close gives, on p396, the story of how the FBI became suspicious of Edward Teller, after finding a man of his name teaching at the NY Communist Workers School in 1941 - the wrong Edward Teller, of course - yet Teller's wife was indeed a member of the Communist-front "League of Women Shoppers" in Washington, DC.

Chapman Pincher, who attended the Fuchs trial, writes about Fuchs hydrogen bomb lectures to prisoners in chapter 19 of his 2014 autobiography, Dangerous to know (Biteback, London, pp217-8): "... Donald Hume ... in prison had become a close friend of Fuchs ... Hume had repaid Fuchs' friendship by organising the smuggling in of new scientific books ... Hume had a mass of notes ... I secured Fuchs's copious notes for a course of 17 lectures ... including how the H-bomb works, which he had given to his fellow prisoners ... My editor agreed to buy Hume's story so long as we could keep the papers as proof of its authenticity ... Fuchs was soon due for release ..."

Chapman Pincher wrote about this as the front page exclusive of the 11 June 1952 Daily Express, "Fuchs: New Sensation", the very month Penney visited Fuchs in prison to receive his H-bomb tutorial! UK media insisted this was evidence that UK security still wasn't really serious about deterring further nuclear spies, and the revelations finally culminated in the allegations that the MI5 chief 1956-65 Roger Hollis was a Russian fellow-traveller (Hollis was descended from Peter the Great, according to his elder brother Chris Hollis' 1958 book Along the Road to Frome) and GRU agent of influence, codenamed "Elli". Pincher's 2014 book, written aged 100, explains that former MI5 agent Peter Wright suspected Hollis was Elli after evidence collected by MI6 agent Stephen de Mowbray was reported to the Cabinet Secretary. Hollis is alleged to have deliberately fiddled his report of interviewing GRU defector Igor Gouzenko on 21 November 1945 in Canada. Gouzenko had exposed the spy and Groucho Marx lookalike Dr Alan Nunn May (photo below), and also a GRU spy in MI5 codenamed Elli, who used only duboks (dead letter boxes), but Gouzenko told Pincher that when Hollis interviewed him in 1945 he wrote up a lengthy false report claiming to discredit many statements by Gouzenko: "I could not understand how Hollis had written so much when he had asked me so little. The report was full of nonsense and lies. As [MI5 agent Patrick] Stewart read the report to me [during the 1972 investigation of Hollis], it became clear that it had been faked to destroy my credibility so that my information about the spy in MI5 called Elli could be ignored. I suspect that Hollis was Elli." (Source: Pincher, 2014, p320.) Christopher Andrew claimed Hollis couldn't have been GRU spy Elli because KGB defector Oleg Gordievsky suggested it was the KGB spy Leo Long (sub-agent of KGB spy Anthony Blunt). However, Gouzenko was GRU, not KGB like Long and Gordievsky! 
Gordievsky's claim that "Elli" was on the cover of Long's KGB file was debunked by KGB officer Oleg Tsarev, who found that Long's codename was actually Ralph! Another declassified Russian document, from General V. Merkulov to Stalin dated 24 Nov 1945, confirmed Elli was a GRU agent inside British intelligence, whose existence was betrayed by Gouzenko. In Chapter 30 of Dangerous to Know, Pincher related how he was given a Russian suitcase-sized microfilm enlarger by Michael J. Butt, a doorman for secret communist meetings in London who was a 1959 eyewitness to Hollis's spying. According to Butt, Hollis delivered documents to Brigitte Kuczynski, younger sister of Klaus Fuchs' original handler, the notorious Sonia aka Ursula. Hollis allegedly handed Minox films to Brigitte discreetly when walking through Hyde Park at 8pm after work. Brigitte gave her Russian made Minox film enlarger to Butt to dispose of, but he kept it in his loft as evidence. (Pincher later donated it to King's College.) Other more circumstantial evidence is that Hollis recruited the spy Philby, Hollis secured spy Blunt immunity from prosecution, Hollis cleared Fuchs in 1943, and MI5 allegedly destroyed Hollis' 1945 interrogation report on Gouzenko, to prevent the airing of the scandal that it was fake, after checking it with Gouzenko in 1972.

It should be noted that the very small number of Russian GRU illegal agents in the UK and the very small communist party membership had a relatively large influence on nuclear policy via infiltration of unions which had block votes in the Labour Party, as well as the indirect CND and "peace movement" lobbies saturating the popular press with anti-civil defence propaganda to make the nuclear deterrent totally incredible for any provocation short of a direct all-out countervalue attack. Under such pressure, UK Prime Minister Harold Wilson's government abolished the UK Civil Defence Corps in March 1968, making the UK nuclear deterrent totally incredible against major provocations. While there was some opposition to Wilson, it was focussed on his profligate nationalisation policies, which were undermining the economy and thus destabilizing military expenditure for national security. Peter Wright's 1987 book Spycatcher and various other sources, including Daily Mirror editor Hugh Cudlipp's book Walking on Water, documented that on 8 May 1968, Bank of England director Cecil King, who was also Chairman of Daily Mirror newspapers, Mirror editor Cudlipp and the UK Ministry of Defence's anti-nuclear Chief Scientific Adviser Sir Solly Zuckerman met at Lord Mountbatten's house in Kinnerton Street, London, to discuss a coup d'état to overthrow Wilson and make Mountbatten the UK President, a new position. King's position, according to Cudlipp - quite correct, as revealed by the UK economic crises of the 1970s when the UK was effectively bankrupt - was that Wilson was setting the UK on the road to financial ruin and thus military decay. Zuckerman and Mountbatten refused to take part in a revolution; however, Wilson's government was attacked by the Daily Mirror in a front page editorial by Cecil King two days later, on 10 May 1968, headlined "Enough is enough ... Mr Wilson and his Government have lost all credibility, all authority."
According to Wilson's secretary Lady Falkender, Wilson was only told of the coup discussions in March 1976.

CND and the UK communist party alternatively tried to claim, in a contradictory way, that (a) they were too small in numbers to have any influence on politics, and (b) they were leading the country towards utopia via unilateral nuclear disarmament saturation propaganda about nuclear weapons annihilation (totally ignoring essential data on different nuclear weapon designs, yields, heights of burst, the "use" of a weapon as a deterrent to PREVENT an invasion of concentrated force, etc.) via the infiltrated BBC and most other media. Critics pointed out that Nazi Party membership in Germany was only 5% when Hitler became dictator in 1933, while in Russia there were only 200,000 Bolsheviks in September 1917, out of 125 million, i.e. 0.16%. Therefore, the whole threat of such dictatorships is a minority seizing power beyond its justifiable numbers, and controlling a majority which has different views. Traditional democracy itself is a dictatorship of the majority (via the ballot box, a popularity contest); minority-dictatorship, by contrast, is dictatorship by a fanatically motivated minority using force and fear (coercion) to control the majority. The coercion tactics used by foreign dictators to control the press in free countries are well documented, but never publicised widely. Hitler put pressure on Nazi-critics in the UK "free press" via UK Government appeasers Halifax, Chamberlain and particularly the loathsome UK ambassador to Nazi Germany, Sir Neville Henderson, for example trying to censor or ridicule the appeasement critic David Low, to fire Captain W. E. Johns (editor of both Flying and Popular Flying, which had huge circulations and attacked appeasement as a threat to national security, since it was being used to justify reduced rearmament expenditure), and to try to get Winston Churchill deselected. These were all sneaky "back door" pressure-on-publishers tactics, dressed up as efforts to "ease international tensions"!
The same occurred during the Cold War, with personal attacks in Scientific American and Bulletin of the Atomic Scientists and by fellow travellers on Herman Kahn, Eugene Wigner, and others who warned we need civil defence to make a deterrent of large provocations credible in the eyes of an aggressor.

Chapman Pincher summarises the vast hypocritical Russian expenditure on anti-Western propaganda against the neutron bomb in Chapter 15, "The Neutron Bomb Offensive" of his 1985 book The Secret Offensive: "Such a device ... carries three major advantages over Hiroshima-type weapons, particularly for civilians caught up in a battle ... against the massed tanks which the Soviet Union would undoubtedly use ... by exploding these warheads some 100 feet or so above the massed tanks, the blast and fire ... would be greatly reduced ... the neutron weapon produces little radioactive fall-out so the long-term danger to civilians would be very much lower ... the weapon was of no value for attacking cities and the avoidance of damage to property can hardly be rated as of interest only to 'capitalists' ... As so often happens, the constant repetition of the lie had its effects on the gullible ... In August 1977, the [Russian] World Peace Council ... declared an international 'Week of action' against the neutron bomb. ... Under this propaganda Carter delayed his decision, in September ... a Sunday service being attended by Carter and his family on 16 October 1977 was disrupted by American demonstrators shouting slogans against the neutron bomb [see the 17 October 1977 Washington Post] ... Lawrence Eagleburger, when US Under Secretary of State for Political Affairs, remarked, 'We consider it probable that the Soviet campaign against the neutron bomb cost some $100 million'. ... Even the Politburo must have been surprised at the size of what it could regard as a Fifth Column in almost every country." [Unfortunately, Pincher himself had contributed to the anti-nuclear nonsense in his 1965 novel "Not with a bang", in which small amounts of radioactivity from nuclear fallout combine with medicine to exterminate humanity! The allure of anti-nuclear propaganda extends to all who wish to sell "doomsday fiction", not just Russian dictators but mainstream media storytellers in the West.
By contrast, Glasstone and Dolan's 1977 Effects of Nuclear Weapons doesn't even mention the neutron bomb, so there was no scientific and technical effort whatsoever by the West to make it a credible deterrent even in the minds of the public it had to protect from WWIII!]

"The Lance warhead is the first in a new generation of tactical mini-nukes that have been sought by the Army's leading field advocates: the series of American generals who have commanded the North Atlantic Treaty Organization theater. They have argued that the 7,000 nuclear warheads now in Europe are old, have too large a nuclear yield and thus would not be used in a war. With lower yields and therefore less possible collateral damage to civilian populated areas, these commanders have argued, the new mini-nukes are more credible as deterrents because they just might be used on the battlefield without leading to automatic nuclear escalation. Under the nuclear warhead production system, a President must personally give the production order. President Ford, according to informed sources, signed the order for the enhanced-radiation Lance warhead. The Lance already has regular nuclear warheads and is deployed with NATO forces in Europe. In addition to the Lance warhead, other new production starts include: An 8-inch artillery-fired nuclear warhead to replace those now in Europe. This shell had been blocked for almost eight years by Sen. Stuart Symington (D-Mo.), who had argued that it was not needed. Symington retired last year. The Pentagon and ERDA say the new nuclear 8-inch warhead would be safer from stealing by terrorists, Starbird testified. It will have "a command disable system" to melt its inner workings if necessary. ... In longer-term research, the bill contains money to finance an enhanced-radiation bomb to be dropped from aircraft." - Washington Post, 5 June 1977.

This debunks fake news that Teller's and Ulam's 9 March 1951 report LAMS-1225 itself gave Los Alamos the Mike H-bomb design, ready for testing! Teller was proposing a series of nuclear tests of the basic principles, not the 10 Mt Ivy-Mike, which was based on a report the next month by Teller alone, LA-1230, "The Sausage: a New Thermonuclear System". Given that, what did Ulam actually contribute to the hydrogen bomb? Nothing about implosion, compression or separate stages - all already done by von Neumann and Fuchs five years earlier - and just a lot of drivel about trying to channel material shock waves from a primary to compress another fissile core, a real dead end. What Ulam did was to kick Teller out of his self-imposed mental objection to compression devices. Everything else was Teller's: the radiation mirrors, the Sausage with its outer ablation pusher and its inner spark plug. Note also that contrary to official historian Arnold's book (which claims, due to a misleading statement by Dr Corner, that all the original 1946 UK copies of Superbomb Conference documentation were destroyed after being sent from AWRE Aldermaston to London between 1955-63), all the documents did exist in the AWRE TPN series (theoretical physics notes, 100% of which have been preserved) and are at the UK National Archives, e.g. AWRE-TPN 5/54 is listed in National Archives discovery catalogue ref ES 10/5: "Miscellaneous super bomb notes by Klaus Fuchs"; see also the 1954 report AWRE-TPN 6/54, "Implosion super bomb: substitution of U235 for plutonium" ES 10/6; the 1954 report AWRE-TPN 39/54, "Development of the American thermonuclear bomb: implosion super bomb" ES 10/39; see also ES 10/21 "Collected notes on Fermi's super bomb lectures", ES 10/51 "Revised reconstruction of the development of the American thermonuclear bombs", ES 1/548 and ES 1/461 "Superbomb Papers", etc.
Many reports are secret and retained, despite containing "obsolete" designs (although UK report titles are generally unredacted, such as: "Storage of 6kg Delta (Phase) -Plutonium Red Beard (tactical bomb) cores in ships")! It should also be noted that the Livermore Laboratory's 1958 TUBA spherical secondary with an oralloy (enriched U235) outer pusher was just a reversion from Teller's 1951 spark plug idea of a fission core in the middle of the fusion fuel, back to the 1944 von Neumann scheme of having fission material surrounding the fusion fuel. In other words, the TUBA was just a radiation and ionization imploded, internally fusion-boosted, second fission stage, which could have been accomplished a decade earlier if the will had existed, when all of the relevant ideas were already known. The declassified UK spherical secondary-stage alternatives linked here (tested as Grapple X, Y and Z with varying yields but similar size, since all used the 5 ft diameter Blue Danube drop casing) clearly show that a far more efficient fusion burn occurs by minimising the mass of hard-to-compress U235 (oralloy) sparkplug/pusher, but maximising the amount of lithium-7, not lithium-6. Such a secondary with minimal fissionable material also automatically has minimal neutron ABM vulnerability (i.e., "Radiation Immunity", RI). This is the current cheap Russian neutron weapon design, but not the current Western design of warheads like the W78, W88 and the B61 bomb.

So why on earth doesn't the West take the cheap efficient option of cutting expensive oralloy and maximising cheap natural (mostly lithium-7) LiD in the secondary? Even Glasstone's 1957 Effects of Nuclear Weapons on p17 (para 1.55) states that "Weight for weight ... fusion of deuterium nuclei would produce nearly 3 times as much energy as the fission of uranium or plutonium"! The sad answer is "density"! Natural LiD (containing 7.42% Li6 abundance) is a low density white/grey crystalline solid like salt that actually floats on water (lithium deuteroxide would be formed on exposure to water), since its density is just 820 kg/m^3. Since the ratio of the mass of Li6D to that of Li7D is 8/9, it would be expected that the density of highly enriched 95% Li6D is 739 kg/m^3, while for 36% enriched Li6D it is 793 kg/m^3. Uranium metal has a density of 19,000 kg/m^3, i.e. 25.7 times greater than 95% enriched Li6D, or 24 times greater than 36% enriched Li6D. Compactness, i.e. volume, is more important than mass/weight in a Western MIRV warhead! In the West, it's best to have a tiny-volume, very heavy, very expensive warhead. In Russia, cheapness outweighs volume considerations. The Russians in some cases simply allowed their more bulky warheads to protrude from the missile bus (see photo below), or compensated for lower yields at the same volume using clean LiD by using the savings in costs to build more warheads. (The West doubles the fission yield/mass ratio of some warheads by using U235/oralloy pushers in place of U238, which suffers from the problem that about half the neutrons it interacts with result in non-fission capture, as explained below.
Note that the 720 kiloton UK nuclear test Orange Herald device contained a hollow shell of 117 kg of U235 surrounded by what Lorna Arnold's book quotes John Corner as calling a "very thin" layer of high explosive; it was compact and effectively unboosted (the boosting failed to work), and gave 6.2 kt/kg of U235, whereas the first version of the 2-stage W47 Polaris warhead contained 60 kg of U235 which produced most of the secondary stage yield of about 400 kt, i.e. 6.7 kt/kg of U235. Little difference - but because perhaps 50% of the total yield of the W47 was fusion, its efficiency of use of U235 must actually have been less than the Orange Herald device's, around 3 kt/kg of U235, which indicates design efficiency limits to "hydrogen bombs"! Yet anti-nuclear charlatans claimed that the Orange Herald bomb was a con!)
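The density and yield-per-mass figures in the two paragraphs above can be reproduced in a few lines. This is a sketch, assuming only that Li6D and Li7D crystals have the same number density of molecules (so mass density scales with mean molecular mass); the 820 kg/m^3 natural-LiD density and the yield/mass inputs are the figures quoted in the text:

```python
# Reproduce the LiD density and kt-per-kg-of-U235 figures quoted above.
# Assumption: Li6D and Li7D have the same crystal number density, so mass
# density scales with mean molecular mass (isotopic masses in atomic units).
M_LI6D = 6.015 + 2.014   # Li6D molecular mass (u)
M_LI7D = 7.016 + 2.014   # Li7D molecular mass (u)

def lid_density(li6_fraction, natural_density=820.0, natural_li6=0.0742):
    """Density (kg/m^3) of LiD at a given Li-6 atom fraction."""
    def mean_mass(f):
        return f * M_LI6D + (1 - f) * M_LI7D
    return natural_density * mean_mass(li6_fraction) / mean_mass(natural_li6)

d95 = lid_density(0.95)      # ~740 kg/m^3 (text quotes 739)
d36 = lid_density(0.36)      # ~794 kg/m^3 (text quotes 793)
print(19000 / d95)           # uranium metal is ~25.7x denser
print(19000 / d36)           # ~24x denser

# Yield per kilogram of U235, from the yields and masses quoted above:
print(720 / 117)             # Orange Herald: ~6.2 kt/kg
print(400 / 60)              # W47 (total yield): ~6.7 kt/kg
print(0.5 * 400 / 60)        # W47 if ~50% of its yield was fusion: ~3.3 kt/kg
```

The agreement with the quoted 739 and 793 kg/m^3 figures confirms the simple mean-molecular-mass scaling used in the text.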

ABOVE: USA nuclear weapons data declassified by UK Government in 2010 (the information was originally acquired due to the 1958 UK-USA Act for Cooperation on the Uses of Atomic Energy for Mutual Defense Purposes, in exchange for UK nuclear weapons data) as published at http://nuclear-weapons.info/images/tna-ab16-4675p63.jpg. This single table summarizes all key tactical and strategic nuclear weapons secret results from 1950s testing! (In order to analyze the warhead pusher thicknesses and very basic schematics from this table it is necessary to supplement it with the 1950s warhead design data declassified in other documents, particularly some of the data from Tom Ramos and Chuck Hansen, as quoted in some detail below.) The data on the mass of special nuclear materials in each of the different weapons argues strongly that the entire load of Pu239 and U235 in the 1.1 megaton B28 was in the primary stage, so that weapon could not have had a fissile spark plug in the centre, let alone a fissile ablator (unlike Teller's Sausage design of 1951), and so it appears the B28 had no need whatsoever of a beryllium neutron radiation shield to prevent pre-initiation of the secondary stage prior to its compression (on the contrary, such neutron exposure of the lithium deuteride in the secondary stage would be VITAL to produce some tritium in it prior to compression, to spark fusion when it was compressed). Arnold's book indeed explains that UK AWE physicists found the B28 to be an excellent, highly optimised, cheap design, unlike the later W47 which was extremely costly. The masses of U235 and Li6 in the W47 show the difficulties of trying to maintain efficiency while scaling down the mass of a two-stage warhead for SLBM delivery: much larger quantities of Li6 and U235 must be used to achieve a LOWER yield! To achieve thermonuclear warheads of low mass at sub-megaton yields, both the outer bomb casing and the pusher around the fusion fuel must be reduced:

"York ... studied the Los Alamos tests in Castle and noted most of the weight in thermonuclear devices was in their massive cases. Get rid of the case .... On June 12, 1953, York had presented a novel concept ... It radically altered the way radiative transport was used to ignite a secondary - and his concept did not require a weighty case ... they had taken the Teller-Ulam concept and turned it on its head ... the collapse time for the new device - that is, the amount of time it took for an atomic blast to compress the secondary - was favorable compared to older ones tested in Castle. Brown ... gave a female name to the new device, calling it the Linda." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute press, 2022, pp137-8. (So if you reduce the outer casing thickness to reduce warhead weight, you must complete the pusher ablation/compression faster, before the thinner outer casing is blown off, and stops reflecting/channelling x-rays on the secondary stage. Making the radiation channel smaller and ablative pusher thinner helps to speed up the process. Because the ablative pusher is thinner, there is relatively less blown-off debris to block the narrower radiation channel before the burn ends.)

"Brown's third warhead, the Flute, brought the Linda concept down to a smaller size. The Linda had done away with a lot of material in a standard thermonuclear warhead. Now the Flute tested how well designers could take the Linda's conceptual design to substantially reduce not only the weight but also the size of a thermonuclear warhead. ... The Flute's small size - it was the smallest thermonuclear device yet tested - became an incentive to improve codes. Characteristics marginally important in a larger device were now crucially important. For instance, the reduced size of the Flute's radiation channel could cause it to close early [with ablation blow-off debris], which would prematurely shut off the radiation flow. The code had to accurately predict if such a disaster would occur before the device was even tested ... the calculations showed changes had to be made from the Linda's design for the Flute to perform correctly." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute press, 2022, pp153-4. Note that the piccolo (the W47 secondary) is a half-sized flute, so it appears that the W47's secondary stage design miniaturization history was: Linda -> Flute -> Piccolo:

"A Division's third challenge was a small thermonuclear warhead for Polaris [the nuclear SLBM system that preceded today's Trident]. The starting point was the Flute, that revolutionary secondary that had performed so well the previous year. Its successor was called the Piccolo. For Plumbbob [Nevada, 1957], the design team tested three variations of the Piccolo as a parameter test. One of the variants outperformed the others ... which set the stage for the Hardtack [Nevada and Pacific, 1958] tests. Three additional variations for the Piccolo ... were tested then, and again an optimum candidate was selected. ... Human intuition as well as computer calculations played crucial roles ... Finally, a revolutionary device was completed and tested ... the Navy now had a viable warhead for its Polaris missile. From the time Brown gave Haussmann the assignment to develop this secondary until the time they tested the device in the Pacific, only 90 days had passed. As a parallel to the Robin atomic device, this secondary for Polaris laid the foundation for modern thermonuclear weapons in the United States." - Dr Tom Ramos (Lawrence Livermore National Laboratory nuclear weapon designer), From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, Naval Institute press, 2022, pp177-8. (Ramos is very useful in explaining that many of the 1950s weapons with complex non-spherical, non-cylindrical shaped primaries and secondaries were simply far too complex to fully simulate on the really pathetic computers they had - Livermore got a 4,000 vacuum tubes-based IBM 701 with 2 kB memory in 1956, and AWRE Aldermaston in the UK had to wait another year for theirs - so they instead did huge numbers of experimental explosive tests.
For instance, on p173, Ramos discloses that the Swan primary which developed into the 155mm tactical shell, "went through over 100 hydrotests", non-nuclear tests in which fissile material is replaced with U238 or other substitutes, and the implosion is filmed with flash x-ray camera systems.)

"An integral feature of the W47, from the very start of the program, was the use of an enriched uranium-235 pusher around the cylindrical secondary." - Chuck Hansen, Swords 2.0, p. VI-375 (Hansen's source is his own notes taken during a 19-21 February 1992 nuclear weapons history conference he attended; if you remember the context, "Nuclear Glasnost" became fashionable after the Cold War ended, enabling Hansen to acquire almost unredacted historical materials for a few years until nuclear proliferation became a concern in Iraq, Afghanistan, Iran and North Korea). The key test of the original (Robin primary and Piccolo secondary) Livermore W47 was the 412 kt Hardtack-Redwood shot on 28 June 1958. Since Li6D utilized at 100% efficiency would yield 66 kt/kg, the W47 fusion efficiency was only about 6%; since 100% fission of U235 yields 17 kt/kg, the W47's Piccolo fission (the U235 pusher) efficiency was about 20%; the comparable figures for secondary stage fission and fusion fuel burn efficiencies in the heavy B28 are about 7% and 15%, respectively:

ABOVE: the heavy B28 gave a very "big bang for the buck": it was cheap in terms of expensive Pu, U235 and Li6, and this was the sort of deterrent which was wanted by General LeMay for the USAF, which wanted as many weapons as possible, within the context of Eisenhower's budgetary concerns. But its weight (not its physical size) made it unsuitable for SLBM Polaris warheads. The first SLBM warhead, the W47, was almost the same size as the B28 weapon package, but much lighter due to having a much thinner "pusher" on the secondary, and a thinner casing. But this came at a large financial cost in terms of the quantities of special nuclear materials required to get such a lightweight design to work, and also a large loss of total yield. The fusion fuel burn efficiency ranges from 6% for the 400 kt W47 to 15% for the 1.1 megaton B28 (note that for the very heavily cased 11-15 megaton yield tests at Castle, up to 40% fusion fuel burn efficiency was achieved), whereas the secondary stage ablative pusher fission efficiency ranged from 7% for a 1.1 inch thick natural uranium (99.3% U238) ablator to 20% for a 0.15 inch thick highly enriched oralloy (U235) ablator. From the brief description of the design evolution given by Dr Tom Ramos (Lawrence Livermore National Laboratory), it appears that when the x-ray channelling outer case thickness of the weapon is reduced to save weight, the duration of the x-ray coupling is reduced, so the dense metal pusher thickness must be reduced if the same compression factor (approximately 20) for the secondary stage is to be accomplished (lithium deuteride, being of low density, is far more compressible by a given pressure than dense metal). In both examples, the secondary stage is physically a boosted fission stage.
(If you are wondering why the hell the designers don't simply use a hollow-core U235 bomb like Orange Herald instead of bothering with such inefficient x-ray coupled two-stage designs as these, the answer is straightforward: the risk of large fissile core meltdown caused by neutrons from Moscow's defensive ABM nuclear warheads, i.e. neutron bombs.)
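The 66 kt/kg and 17 kt/kg benchmarks used in the efficiency estimates above follow from the reaction energetics. A rough sketch, assuming ~22.4 MeV is released per Li6D molecule (the Li6 + n → T + He4 reaction at 4.78 MeV, followed by D + T fusion at 17.59 MeV) and ~180 MeV of prompt energy per U235 fission; the 25 kg / 100 kt example at the end is purely hypothetical, for illustration only:

```python
# Derive the "kt per kg at 100% burn" benchmarks from reaction energetics.
J_PER_MEV = 1.602e-13     # joules per MeV
KG_PER_U = 1.66054e-27    # kilograms per atomic mass unit
J_PER_KT = 4.184e12       # joules per kiloton of TNT equivalent

# ~22.4 MeV per Li6D molecule (mass ~8.03 u): Li6(n,T)He4 then D+T fusion.
ktkg_fusion = 22.4 * J_PER_MEV / (8.03 * KG_PER_U) / J_PER_KT    # ~64 kt/kg
# ~180 MeV prompt energy per U235 fission (mass ~235 u).
ktkg_fission = 180 * J_PER_MEV / (235.04 * KG_PER_U) / J_PER_KT  # ~17.7 kt/kg
print(ktkg_fusion, ktkg_fission)

# Burn efficiency = stage yield / (fuel mass x specific yield). A purely
# hypothetical example: 100 kt of fusion yield from 25 kg of Li6D:
print(100 / (25 * ktkg_fusion))   # ~0.06, i.e. ~6% fusion burn efficiency
```

The fusion figure lands near the text's 66 kt/kg (the small difference depends on which reaction energies are counted), and the fission figure matches the quoted 17 kt/kg.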

The overall weight of the W47 was minimized by replacing the usual thick layer of U238 pusher with a very thin layer of fissile U235 (supposedly Teller's suggestion), which is more efficient for fission, but is limited by critical mass issues. The W47 used a 95% enriched Li6D cylinder with a 3.8mm thick U235 pusher; the B28 secondary was 36% enriched Li6D, with a very heavy 3cm thick U238 pusher. As shown below, it appears the B28 was related to the Los Alamos clean design of the TX21C tested as the 95% clean, 4.5 megaton Redwing-Navajo in 1956 and did not have a central fissile spark plug. From the declassified fallout composition, it is known the Los Alamos designers replaced the outer U238 pusher of Castle secondaries with lead in Navajo. Livermore did the same for their 85% clean, 3.53 megaton Redwing-Zuni test, but Livermore kept the central fission spark plug, which contributed 10% of its 15% fission yield, instead of removing the neutron shield, using foam channel filler for slowing down the x-ray compression, and thereby using primary stage neutrons to split lithium-6 giving tritium prior to compression. Our point is that Los Alamos got it wrong in sticking too conservatively to ideology: for clean weapons they should have got rid of the dense lead pusher and gone for John H. Nuckolls' idea (also used by Fuchs in 1946 and the Russians in 1955 and 1958) of a low-density pusher for isentropic compression of low-density fusion fuel. This error is the reason why those early cleaner weapons were extremely heavy, due to unnecessary 2" thick lead or tungsten pushers around the fusion fuel, which massively reduced their yield-to-weight ratios, so that LeMay rejected them!
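As a rough indication of the weight saving bought by the thin oralloy pusher, the two quoted thicknesses imply the following areal masses (a sketch ignoring geometry; U235 and U238 metal densities differ by under 2%, so ~19,000 kg/m^3 is used for both):

```python
# Areal mass (kg per square metre) of the two ablative pushers quoted above.
RHO_U = 19000.0                    # uranium metal density, kg/m^3
b28_pusher = 0.030 * RHO_U         # 3 cm U238 pusher:   ~570 kg/m^2
w47_pusher = 0.0038 * RHO_U        # 3.8 mm U235 pusher:  ~72 kg/m^2
print(b28_pusher / w47_pusher)     # ~7.9x less pusher mass per unit area
```

So, per unit area of secondary surface, the thin U235 pusher weighs roughly an eighth of the B28's thick U238 pusher, which is consistent with the W47 coming in at well under half the B28's total weight.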

Compare these data for the 20 inch diameter, 49 inch, 1600 lb, 1.1 megaton bomb B28 to the 18 inch diameter, 47 inch, 700 lb, 400 kt Mk47/W47 Polaris SLBM warhead (this is the correct yield for the first version of the W47 confirmed by UK data in Lorna Arnold Britain and the H-bomb 2001 and AB 16/3240; Wikipedia wrongly gives the 600 kt figure in Hansen, which was a speculation or a later upgrade). The key difference is that the W47 is much lighter, and thus suitable for the Polaris SLBM unlike the heavier, higher yield B28. Both B28 and W47 used cylindrical sausages, but they are very different in composition; the B28 used a huge mass of U238 in its ablative sausage outer shell or pusher, while the W47 used oralloy/U235 in the pusher. The table shows the total amounts of Pu, Oralloy (U235), Lithium-6 (excluding cheaper lithium-7, which is also present in varying amounts in different thermonuclear weapons), and tritium (which is used for boosting inside fissile material, essentially to reduce the amount of Pu and therefore the vulnerability of the weapon to Russian enhanced neutron ABM warhead meltdown). The B28 also has an external dense natural U (99.3% U238) "ablative pusher shell" whose mass is not listed in this table. The table shows that the 400 kt W47 Polaris SLBM warhead contains 60 kg of U235 (nearly as much as the 500 kt pure fission Mk18), which is in an ablative pusher shell around the lithium deuteride, so that the cylinder of neutron-absorbing lithium-6 deuteride within it keeps that mass of U235 subcritical, until compressed. So the 400 kt W47 contains far more Pu, U235, Li6 and T than the higher yield 1.1 megaton B28: this is the big $ price you pay for reducing the mass of the warhead; the total mass of the W47 is reduced to 44% of the mass of the B28, since the huge mass of cheap U238 pusher in the B28 is replaced by a smaller mass of U235, which is more efficient because (as Dr Carl F. 
Miller reveals in USNRDL-466, Table 6), about half of the neutrons hitting U238 don't cause fission but instead non-fission capture reactions which produce U239, plus the n,2n reaction that produces U237, emitting a lot of very low energy gamma rays in the fallout. For example, in the 1954 Romeo nuclear test (which, for simplicity, we quote since it used entirely natural LiD, with no expensive enrichment of the Li6 isotope whatsoever), the U238 jacket fission efficiency was reduced by capture as follows: 0.66 atom/fission of U239, 0.10 atom/fission of U237 and 0.23 atom/fission of U240 produced by fission, a total of 0.66 + 0.10 + 0.23 ~ 1 atom/fission, i.e. 50% fission in the U238 pusher, versus 50% non-fission neutron captures. So by using U235 in place of U238, you virtually eliminate the non-fission capture (see UK Atomic Weapons Establishment graph of fission and capture cross-sections for U235, shown below), which roughly halves the mass of the warhead, for a given fission yield. This same principle of using an outer U235/oralloy pusher instead of U238 to reduce mass - albeit with the secondary cylindrical "Sausage" shape now changed to a sphere - applies to today's miniaturised, high yield, low mass "MIRV" warheads. Just as the lower-yield W47 counter-intuitively used more expensive ingredients than the bulkier higher-yield B28, modern compact, high-yield oralloy-loaded warheads literally cost a bomb, just to keep the mass down! There is evidence Russia uses alternative ideas.
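The Romeo neutron bookkeeping above amounts to the following (using only the declassified atoms-per-fission figures quoted in the paragraph):

```python
# Neutron accounting in Romeo's natural-uranium (U238) jacket, in atoms
# produced per fission, from the declassified figures quoted above.
captures = 0.66 + 0.10 + 0.23      # U239 (capture) + U237 (n,2n) + U240
fission_fraction = 1 / (1 + captures)
print(captures)                     # ~0.99 capture-type events per fission
print(fission_fraction)             # ~0.50: about half of interactions fission
```

That is, with roughly one capture per fission, only about half of the neutron interactions in a U238 pusher release fission energy, which is the basis of the halving argument in the next paragraph.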

This is justified by the data given for a total U238 capture-to-fission ratio of 1 in the 11 megaton Romeo test, and also by the cross-sections for U235 capture and fission on the AWE graph for the relevant neutron energy range of about 1-14 MeV. If half the neutrons are captured in U238 without fission, then the maximum fission yield you can possibly get from "x" kg of U238 pusher is HALF the energy obtained from 100% fission of "x" kg of U238. Since with U238 only about half the atoms can undergo fission by thermonuclear neutrons (because the other half undergo non-fission capture), the energy density (i.e., the Joules/kg produced by the fission explosion of the pusher) reached by an exploding U238 pusher is only half that reached by U235 (in which there is less non-fission capture of neutrons, which doubles the pusher mass without doubling the fission energy release). So a U235 pusher will reach twice the temperature of a U238 pusher, doubling its material heating of the fusion fuel within, prolonging the fusion burn and thus increasing fusion burn efficiency. The 10 MeV neutron energy is important since it allows for the likely average scattering of 14.1 MeV D+T fusion neutrons, and it is also the energy at which the most important capture reaction, (n,2n), has its peak cross-section for both U235 (0.88 barn at 10 MeV) and U238 (1.4 barns at 10 MeV). For 10 MeV neutrons, U235 and U238 have fission cross-sections of 1.8 and 1 barn, respectively. For 14 MeV neutrons, U238 has a (n,2n) cross-section of 0.97 barn for U237 production. So, ignoring non-fission captures, you need 1.8/1 = 1.8 times greater thickness of pusher for U238 than for U235 to achieve the same amount of fission. But this simple consideration ignores the x-ray ablation requirement of the exploding pusher, so there are several factors requiring detailed computer calculations, and/or nuclear testing.
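The thickness argument can be made slightly more quantitative with the quoted 10 MeV cross-sections. This is a sketch that, as the text itself warns, ignores ablation requirements, neutron multiplication inside the pusher, and geometry; the mean-free-path sanity check at the end is an illustrative addition, not a figure from the source:

```python
import math

# Relative pusher thickness for equal fission probability, from the 10 MeV
# fission cross-sections quoted above (1.8 b for U235, 1 b for U238).
BARN = 1e-28                                 # m^2
sigma_f_u235, sigma_f_u238 = 1.8 * BARN, 1.0 * BARN
print(sigma_f_u235 / sigma_f_u238)           # U238 needs ~1.8x the thickness

# Sanity check on scale: mean free path against fission in U238 metal.
n_u238 = 19000 / (238.05 * 1.66054e-27)      # atoms per m^3
mfp_m = 1 / (n_u238 * sigma_f_u238)          # ~0.21 m
p_fission = 1 - math.exp(-0.03 / mfp_m)      # 3 cm pusher, single pass
print(mfp_m, p_fission)                      # ~13% fission chance per pass
```

The ~21 cm mean free path shows why even a 3 cm pusher fissions only a modest fraction of incident 14 MeV-class neutrons on a single pass, so pusher efficiency depends heavily on the neutron economy of the whole burning secondary.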

Note: there is an extensive collection of declassified documents released after Chuck Hansen's final edition, Swords 2.0, which are now available at https://web.archive.org/web/*/http://www.nnsa.energy.gov/sites/default/files/nnsa/foiareadingroom/*, being an internet-archive back-up of a now-removed US Government Freedom of Information Act Reading Room. Unfortunately they were only identified by number sequence, not by report title or content, in that reading room, and so failed to achieve wide attention when originally released! (This includes extensive "Family Committee" H-bomb documentation and many long-delayed FOIA requests submitted originally by Hansen, but not released in time for inclusion in Swords 2.0.) As the extract below - from declassified document RR00132 - shows, some declassified documents contained very detailed information or typewriter spaces that could only be filled by a single specific secret word (in this example, details of the W48 linear implosion tactical nuclear warhead, including the fact that it used PBX9404 plastic bonded explosive glued to the brittle beryllium neutron reflector around the plutonium core using Adiprene L100 adhesive!).

ABOVE: Declassified data on the radiation flow analysis for the 10 megaton Mike sausage: http://nnsa.energy.gov/sites/default/files/nnsa/foiareadingroom/RR00198.pdf Note that the simplistic "no-go theorem" given in this extract, against any effect from varying the temperature to help the radiation channelling, was later proved false by John H. Nuckolls (like Teller's anti-compression "no-go theorem" was later proved false), since lowered temperature delivers energy where it is needed while massively reducing radiation losses (which go as the fourth power of temperature/x-ray energy in kev).

ABOVE: Hans A. Bethe's disastrous back-of-the-envelope nonsense "no-go theorem" against lithium-7 fission into tritium by 14.1 MeV D+T neutrons in Bravo (which contained 40% lithium-6 and 60% lithium-7; unnecessarily enriched - at great expense and effort - from the natural 7.42% lithium-6 abundance). It was Bethe's nonsense "physics" speculation, unbacked by serious calculation, that caused Bravo to go off at 2.5 times the expected 6 megatons, and therefore caused the Japanese Lucky Dragon tuna trawler crew in the maximum fallout hotspot area 80 miles downwind to be contaminated by fallout, and also Rongelap's people to be contaminated ("accidents" that inevitably kickstarted the originally limited, early-1950s USSR-funded Communist Party anti-nuclear-deterrence movements in the West into the mainstream media and thus politics). There was simply no solid basis for assuming that the highly penetrating 14.1 MeV neutrons would be significantly slowed by scattering in the fuel before hitting lithium-7 nuclei. Even Teller's 1950 report LA-643 at page 17 estimated that in a fission-fusion Alarm Clock, the ratio of 14 MeV to 2.5 MeV neutrons was 0.7/0.2 = 3.5. Bethe's complacently bad guesswork-based physics also led to the EMP fiasco for high altitude bursts, after he failed to predict the geomagnetic field deflection of Compton electrons at high altitude in his secret report “Electromagnetic Signal Expected from High-Altitude Test”, Los Alamos report LA-2173, October 1957, Secret. He repeatedly caused nuclear weapons effects study disasters.
For the true utility of lithium-7, which is actually BETTER than lithium-6 at tritium production when struck by 14.1 MeV D+T fusion neutrons, and its consequences for cheap isentropically compressed fusion capsules in Russian neutron bombs, please see my paper here which gives a graph of lithium isotopic cross-section versus neutron energy, plus the results when Britain used cheap lithium-7 in Grapple Y to yield 3 megatons (having got lower yields with costly lithium-6 in previous tests!).

Update (15 Dec 2023): PDF uploaded of UK DAMAGE BY NUCLEAR WEAPONS (linked here on Internet Archive) - a secret 1000-page UK and USA nuclear weapon test effects analysis, and protective measures determined at those tests (not guesswork), relevant to escalation threats by Russia of EU invasion (linked here at wordpress) in response to Ukraine potentially joining the EU (this is now fully declassified without deletions, and in the UK National Archives at Kew):

Hiroshima and Nagasaki terrorist liars debunked by secret American government evidence that simple shelters worked, REPORT LINKED HERE (this was restricted from public view and never published by the American government, and Glasstone's lying Effects of Nuclear Weapons book reversed its evidence for propaganda purposes, a fact still covered by all the lying cold war pseudo "historians" today), Operation Hurricane 1952 declassified nuclear weapon test data (here), declassified UK nuclear tested shelter research reports (here), declassified EMP nuclear test research data (here), declassified clandestine nuclear bombs in ships attack on Liverpool study (here), declassified fallout decontamination study for UK recovery from nuclear attack (here), declassified Operation Buffalo surface burst and near surface burst fallout patterns, water decontamination, initial radiation shielding at Antler nuclear tests, and resuspension of deposited fallout dust into the air (inhalation hazard) at different British nuclear tests, plus Operation Totem nuclear tests crater region radiation surveys (here), declassified Operation Antler nuclear blast precursor waveforms (here), declassified Operation Buffalo nuclear blast precursor waveforms (here), declassified UK Atomic Weapons Establishment nuclear weapons effects symposium (here), and declassified UK Atomic Weapons Establishment paper on the gamma radiation versus time at Crossroads tests Able and Baker (here, paper by inventor of lenses in implosion weapons, James L. 
Tuck of the British Mission to Los Alamos and Operation Crossroads, clearly showing how initial gamma shielding in an air burst can be achieved with a few seconds warning, and giving the much greater escape times available for residual radiation dose accumulations in an underwater burst; key anti-nuclear hysteria data kept covered up by Glasstone and the USA book Effects of Nuclear Weapons), and Penney and Hicks paper on the base surge contamination mechanism (here), and Russian nuclear warhead design evidence covered-up by both America and the so-called arms control and disarmament "experts" who always lie and distort the facts to suit their own agenda to try to start a nuclear war (linked here). If they wanted "peace" they'd support the proved facts, available on this blog nukegate.org since 2006, and seek international agreement to replace the incredible, NON-war-deterring strategic nuclear weapons with safe tactical neutron warheads which avert collateral damage and deter invasion (thus deterring war in all its forms, not only nuclear), plus civil defence against all forms of collateral damage from war, which reduces escalation risks during terrorist actions, as proved in wars which don't escalate because of effective civil defence and credible deterrence (see below). Instead, they support policies designed to maximise civilian casualties and to deliberately escalate war, to profit "politically" from the disasters caused, which they blame falsely on nuclear weapons, as if deterrence causes war! (Another lie believed by mad/evil/gullible mainstream media/political loons in "authority".) A good summary of the fake news basis of "escalation" blather against credible tactical nuclear deterrence of the invasions that set off wars is inadvertently provided by Lord David Owen's 2009 "Nuclear Papers" (Liverpool Uni Press), compiling his declassified nuclear disarmament propaganda reports written while he was UK Foreign Secretary 1977-9.
It's all Carter era appeasement nonsense. For example, on pp158-8 he reprints his Top Secret 19 Dec 1978 "Future of the British Deterrent" report to the Prime Minister which states that "I am not convinced by the contention ... that the ability to destroy at least 10 major cities, or inflict damage on 30 major targets ... is the minimum criterion for a British deterrent." (He actually thinks this is too strong a deterrent, despite the fact that it is incredible against the realpolitik tactics of dictators who make indirect provocations like invading their neighbours!) The reality Owen ignores is that Russia had and still has civil defence shelters and evacuation plans, so threatening some damage in retaliation is not a credible deterrent against the invasions that set off both world wars. On page 196, he gives a Secret 18 April 1978 paper stating that NATO then had 1000 nuclear artillery pieces (8" and 155mm), 200 Lance and Honest John tactical nuclear missile systems, and 135 Pershing missiles; all now long ago disarmed and destroyed, while Russia now has over 2000 dedicated tactical nuclear weapons of high neutron output (unlike EM1's data for the low yield option of the multipurpose NATO B61). Owen proudly congratulates himself on his Brezhnev-supporting, anti-neutron bomb ranting 1978 book, "Human Rights", pp. 136-7. If Owen really wants "Human Rights", he needs to back the neutron bomb now, to deter the dictatorships which destroy human rights! His 2009 "Nuclear Papers" at p287 gives the usual completely distorted analysis of the Cuban missiles crisis, claiming that despite the overwhelming American tactical and strategic nuclear superiority for credible deterrence in 1962, the world came "close" to a nuclear war. It's closer now, mate, when thanks to your propaganda we no longer have a credible deterrent, civil defence, or tactical neutron warheads. Pathetic.

ABOVE secret reports on Australian-British nuclear test operations at Maralinga in 1956 and 1957, Buffalo and Antler, proved that even at 10 psi peak overpressure for the 15 kt Buffalo-1 shot, the dummy lying prone facing the blast was hardly moved due to the low cross-sectional area exposed to the blast winds, relative to standing dummies which were severely displaced and damaged. The value of trenches in protecting personnel against blast winds and radiation was also proved in tests (gamma radiation shielding of trenches had been proved at an earlier nuclear test in Australia, Operation Hurricane in 1952). (Antler report linked here; Buffalo report linked here.) This debunks the US Department of Defense models claiming that people will automatically be blown out of the upper floors of modern city buildings at very low pressures, and killed by the gravitational impact with the pavement below! In reality, tall buildings mutually shield one another from the blast winds, not to mention the radiation (proven in the latest post on this blog), and on seeing the flash most people will have time to lie down on typical surfaces like carpet which give a frictional resistance to displacement, ignored in fiddled models which assume surfaces have less friction than a skating rink; all of this was omitted from the American 1977 Glasstone and Dolan book "The Effects of Nuclear Weapons". As Tuck's paper below on the gamma radiation dose rate measurements on ships at Operation Crossroads, July 1946 nuclear tests proved, contrary to Glasstone and Dolan, scattered radiation contributions are small, so buildings or ships gun turrets provided excellent radiation "shadows" to protect personnel. 
This effect was then calculated by UK civil defence weapons effects expert Edward Leader-Williams in his paper presented at the UK's secret London Royal Society Symposium on the Physical Effects of Atomic Weapons, but the nuclear test data as always was excluded from the American Glasstone book published the next year, The Effects of Atomic Weapons in deference to lies about the effects in Hiroshima, including an "average" casualty curve which deliberately obfuscated huge differences in survival rates in different types of buildings and shelters, or simply in shadows!

Note: the DELFIC, SIMFIC and other computer-predicted fallout area comparisons for the 110 kt Bikini Atoll Castle-Koon land surface burst nuclear test are false, since the distance scale of Bikini Atoll is massively exaggerated on many maps, e.g. in the Secret January 1955 AFSWP "Fall-out Symposium", the Castle fallout report WT-915, and the fallout patterns compendium DASA-1251! The Western side of the Bikini Atoll reef is at 165.2 degrees East, while the most eastern island in the Bikini Atoll, Enyu, is at 165.567 degrees East: since a degree corresponds to 60 nautical miles (by definition for latitude; slightly less for longitude at Bikini's latitude), the width of Bikini Atoll is at most (165.567-165.2)(60) = 22 nautical miles, approximately half the distance shown in the Castle-Koon fallout patterns. Since area is proportional to the square of the distance scale, this constitutes a serious exaggeration in fallout casualty calculations, before you get into the issue of the low energy (0.1-0.2 MeV) gamma rays from neutron induced Np239 and U237 in the fallout enhancing the protection factor of shelters (usually calculated assuming hard 1.17 and 1.33 MeV gamma rays from Co60), during the sheltering period of approximately 1-14 days after detonation.
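The Bikini width arithmetic above can be checked directly (with the caveat, noted in the comments, that 60 nautical miles per degree is exact only for latitude; a degree of longitude at Bikini's latitude is slightly shorter, which only strengthens the exaggeration argument):

```python
# Width of Bikini Atoll from the longitudes quoted above. Note: 60 nautical
# miles per degree is exact for latitude; a degree of longitude at Bikini's
# ~11.6 deg N is about 60*cos(11.6 deg) ~ 58.8 nm, so the atoll is if
# anything slightly narrower than this estimate.
west_lon, east_lon = 165.2, 165.567      # degrees East
width_nm = (east_lon - west_lon) * 60
print(width_nm)                          # ~22 nautical miles

# A map that doubles the distance scale exaggerates areas by the square:
scale_error = 2.0
print(scale_error ** 2)                  # 4x exaggeration in contour areas
```

So a factor-of-two error in the map scale translates into roughly a factor-of-four error in any fallout contour area (and hence casualty estimate) read from those patterns.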

"Since the nuclear stalemate became apparent, the Governments of East and West have adopted the policy which Mr Dulles calls 'brinkmanship'. This is a policy adopted from a sport ... called 'Chicken!' ... If one side is unwilling to risk global war, while the other side is willing to risk it, the side which is willing to run the risk will be victorious in all negotiations and will ultimately reduce the other side to complete impotence. 'Perhaps' - so the practical politician will argue - 'it might be ideally wise for the sane party to yield to the insane party in view of the dreadful nature of the alternative, but, whether wise or not, no proud nation will long acquiesce in such an ignominious role. We are, therefore, faced, quite inevitably, with the choice between brinkmanship and surrender." - Bertrand Russell, Common Sense and Nuclear Warfare, George Allen and Unwin, London, 1959, pp30-31.

Emphasis added. Note that Russell accepts lying about nuclear weapons, just as gas weapons had been lied about in the 1920s-30s by "arms controllers" to start WWII; then he simply falls into the 1930s Cambridge Scientists Antiwar Group delusional propaganda fraud of assuming that any attempt to credibly deter fascism is immoral because it will automatically result in escalatory retaliation, with Herman Goering's Luftwaffe drenching London with "overkill" by poison gas WMDs etc. In particular, he forgets that general disarmament pursued in the West until 1935 - when Baldwin suddenly announced that the Nazis had secretly produced a massive, unstoppable warmachine in two years - encouraged aggressors to first secretly rearm, then coerce and invade their neighbours while signing peace promises purely to buy more time for rearmament, until a world war resulted. Not exactly a great result for disarmament propaganda. So after obliterating what Reagan used to call (to the horror of commie "historians") the "true facts of history" from his mind, he advocates some compromise with the aggressors, of the 30 September 1938 Munich Agreement peace-in-our-time sort: the historically proved sure-fire way to really escalate a crisis into a major war, by showing the green lamp to a loon to popular media acclaim and applause for a fairy tale utopian fantasy; just as the "principled" weak, rushed, imbecile withdrawal from Afghanistan in 2021 encouraged Putin to invade Ukraine in 2022, and also gave the green lamp for Hamas to invade Israel in 2023.

"... deterrence ... consists of threatening the enemy with thermonuclear retaliation should he act provocatively. ... If war is 'impossible', how can one threaten a possible aggressor with war? ... The danger, evoked by numerous critics, that such research will result in a sort of resigned expectation of the holocaust, seems a weak argument ... The classic theory of Clausewitz defines absolute victory in terms of disarmament of the enemy ... Today ... it will suffice to take away his means of retaliation to hold him at your mercy." - Raymond Aron, Introduction to Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfeld and Nicolson, London, pp. 9-12. (This is what commie-supported arms control and disarmament has achieved: precisely the weakening of the West, to take away credible deterrence.)

"75 years ago, white slavery was rampant in England. ... it could not be talked about openly in Victorian England, moral standards as to the subjects of discussion made it difficult to arouse the community to necessary action. ... Victorian standards, besides perpetuating the white slave trade, intensified the damage ... Social inhibitions which reinforce natural tendencies to avoid thinking about unpleasant subjects are hardly uncommon. ... But when our reluctance to consider danger brings danger nearer, repression has gone too far. In 1960, I published a book that attempted to direct attention to the possibility of a thermonuclear war ... people are willing to argue that it is immoral to think and even more immoral to write in detail about having to fight ... like those ancient kings who punished messengers who brought them bad news. That did not change the news; it simply slowed up its delivery. On occasion it meant that the kings were ill informed and, lacking truth, made serious errors in judgement and strategy. ... We cannot wish them away. Nor should we overestimate and assume the worst is inevitable. This leads only to defeatism, inadequate preparations (because they seem useless), and pressures toward either preventative war or undue accommodation." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfeld and Nicolson, London, pp. 17-19. (In the footnote on page 35, Kahn notes that the original nuclear bullshitter, the 1950 creator of fake cobalt-60 doomsday bomb propaganda, Leo Szilard, was in the usual physics groupthink nutters club: "Szilard is probably being too respectful of his scientific colleagues who also seem to indulge in ad hominem arguments - especially when they are out of their technical specialty.")

"Ever since the catastrophic and disillusioning experience of 1914-18, war has been unthinkable to most people in the West ... In December 1938, only 3 months after Munich, Lloyd's of London gave odds of 32 to 1 that there would be no war in 1939. On August 7, 1939, the London Daily Express reported the result of a poll of its European reporters. 10 out of 12 said, 'No war this year'. Hitler invaded Poland 3 weeks later." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfeld and Nicolson, London, p. 39. (But as the invasion of Ukraine in 2022 proved, even the label "war" is now "controversial": the aggressor now simply declares they are on a special operation of unifying people under one flag to ensure peace! So the reason why there is war in Ukraine is that Ukraine is resisting. If it waved a white flag, as the entire arms control and disarmament lobby insists is the only sane response to a nuclear-armed aggressor, there would be "peace," albeit on Russia's terms: that's why they disarmed Ukraine in 1994. "Peace propaganda" of "disarmers"! Free decent people prefer to fight tyranny. But as Kahn states on pp. 7-9:

"Some, most notably [CND's pseudo-historian of arms race lying] A. J. P. Taylor, have even said that Hitler was not like Hitler, that further appeasement [not an all-out arms race as was needed but repeatedly rejected by Baldwin and Chamberlain until far too late; see discussion of this fact, which is still deliberately ignored or obfuscated by "historians" of the A. J. P. Taylor biased anti-deterrence left-wing type, in Slessor's The Central Blue, quoted on this blog] would have prevented World War II ... If someone says to you, 'One of us has to be reasonable and it is not going to be me, so it has to be you', he has a very effective bargaining advantage, particularly if he is armed with thermonuclear bombs [and you have damn all civil defense, ABM, or credible tactical deterrent]. If he can convince you he is stark, staring mad and if he has enough destructive power ... deterrence alone will not work. You must then give in or accept the possibility of being annihilated ... in the first instance if we fight and lose; in the second if we capitulate without fighting. ... We could still resist by other means ranging from passive resistance of the Gandhi type to the use of underground fighting and sabotage. All of these alternatives might be of doubtful effectiveness against [the Gulag system, KGB/FSB torture camps or Siberian salt mines of] a ruthless dictatorship."

Sometimes people complain that Hitler - and WWII, the most destructive and costly war in history and the only nuclear war - is given undue attention. But WWII is a good analogy to the danger, precisely because of the lying WMD gas-war propaganda-based disarmament of the West which allowed the war, because of the attacks by Hitler's fans on civil defense in the West which made even the token rearmament after 1935 ineffective as a credible deterrent, and because Hitler has mirrors in Alexander the Great, Attila the Hun, Genghis Khan, Tamerlane, Napoleon and Stalin. Kahn explains on p. 173: "Because history has a way of being more imaginative and complex than even the most imaginative and intelligent analysts, historical examples often provide better scenarios than artificial ones, even though they may be no more directly applicable to current equipment, postures, and political situations than the fictional plot of the scenario. Recent history can be especially useful.")

"One type of war resulting at least partly from deliberate calculation could occur in the process of escalation. For example, suppose the Soviets attacked Europe, relying upon our fear of their reprisal to deter a strategic attack by us; we might be deterred enough to pause, but we might evacuate our cities during this pause in the hope we could thereby convince the Soviets we meant business. If the Soviets did not back down, but continued their attack upon Europe, we might decide that we would be less badly off if we proceeded ... The damage we would receive in return would then be considerably reduced, compared with what we would have suffered had we not evacuated. We might well decide at such a time that we would be better off to attack the Soviets and accept a retaliatory blow at our dispersed population, rather than let Europe be occupied, and so be forced to accept the penalty of living in the hostile and dangerous world that would follow." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfeld and Nicolson, London, pp. 51-2.

"We must recognise that the stability we want in a system is more than just stability against accidental war or even against an attack by the enemy. We also want stability against extreme provocation [e.g. invasion of allies, which then escalates as per the invasion of Belgium in 1914, or Poland in 1939]." - Herman Kahn's 1962 Thinking About the Unthinkable, Weidenfeld and Nicolson, London, p. 53 (footnote).

Note: this 1962 book should not be confused with Kahn's 1984 "updated" Thinking About the Unthinkable in the 1980s, which omits the best material in the 1962 edition (in the same way that the 1977 edition of The Effects of Nuclear Weapons omits the entire civil defense chapter, which was the one decent thing in the 1957 and 1962/4 editions!) and thus shows a reversion to the less readable and less helpful style of his 1960 On Thermonuclear War, which severely fragmented and jumbled up all the key arguments, making it easy for critics to misquote or quote out of context. For example, Kahn's 1984 "updated" book starts on the first page of the first chapter with the correct assertion that Jonathan Schell's Fate of the Earth is nonsense, but doesn't say why it's nonsense; you have to read through to the final chapter - pages 207-8 of chapter 10 - to find Kahn writing, in the vaguest way possible, that Schell is wrong because of "substantive inadequacies and inaccuracies", without listing a single example, such as Schell's lying that the 1954 Bravo nuclear test blinded everyone well beyond the range of Rongelap, and that it was impossible to easily shield the radiation from the fallout or evacuate the area until it decayed - claims which Schell falsely attributed to Glasstone and Dolan's nonsense in the 1977 Effects of Nuclear Weapons! Kahn eventually, in the footnote on page 208, refers readers to an out-of-print article for the facts: "These criticisms are elaborated in my review of The Fate of the Earth, see 'Refusing to Think About the Unthinkable', Fortune, June 28, 1982, pp. 113-6."
Kahn does the same for civil defense in the 1984 book, referring in such general, imprecise and vague terms to Russian civil defence, with no specific data, that it is a waste of time, apart possibly from one half-baked sentence on page 177: "Variations in the total megatonnage, somewhat surprisingly, do not seem to affect the toll nearly as much as variations in the targeting or the type of weapon bursts." Kahn on page 71 quotes an exchange between himself and Senator Proxmire during the US Congressional Hearings of the Joint Committee on Defense Production, Civil Preparedness and Limited Nuclear War, where on page 55 of the hearings Senator Proxmire alleges that America would escalate a limited conflict to an all-out war because: "The strategic value and military value of destroying cities in the Soviet Union would be very great." Kahn responded: "No American President is likely to do that, no matter what the provocation." Nuclear war will be limited, according to Herman Kahn's analysis, despite the bullshit from nutters to the contrary.

Kahn on page 101 of Thinking About the Unthinkable in the 1980s correctly and accurately condemns President Carter's 1979 State of the Union Address, which falsely claimed that just a single American nuclear submarine constitutes an "overwhelming" deterrent against "every large and medium-sized city in the Soviet Union". Carter ignored the threat of Russian retaliation on American cities if Russian cities are bombed: America has avoided the intense protection efforts that make the Russian nuclear threat credible, namely civil defense shelters and evacuation plans, and has also ignored the realpolitik of deterring world wars, which so far have only been triggered by invasions of third parties (Belgium '14, Poland '39). Did America strategically nuke every city in Russia when Russia invaded Ukraine in 2022? No, debunking Proxmire and the entire Western pro-Russian "automatic escalation" propaganda lobby; and America didn't even have tactical neutron bombs to help deter the Russians, as Reagan had in the 1980s, because in the 1990s America ignored Kahn's argument and went in for MINIMAL deterrence of the least credible sort (abolishing the invasion-deterring dedicated neutron tactical nuclear stockpile entirely; the following quotation is from p. 101 of Kahn's Thinking About the Unthinkable in the 1980s):

"Minimum deterrence, or any predicated on an excessive emphasis on the inevitability of mutual homicide, is both misleading and dangerous. ... MAD principles can promote provocation - e.g. Munich-type blackmail on an ally. Hitler, for example, did not threaten to attack France or England - only Austria, Czechoslovakia, and Poland. It was the French and the British who finally had to threaten all-out war [they could only do this after rearmament and building shelters and gas masks to reduce the risk of reprisals in city bombing, which gave more time for Germany to prepare since it was rearming faster than France and Britain, which still desperately counted on appeasement and peace treaties and feared provoking a war by an arms race, due to endless lying propaganda from Lord Grey that his failure to deter war in 1914 had been due to an arms race rather than the incompetent procrastination of his anti-war Liberal Party colleagues in the Cabinet] - a move they would not and could not have made if the notion of a balance of terror between themselves and Germany had been completely accepted. As it was, the British and French were most reluctant to go to war; from 1933 to 1939 Hitler exploited that reluctance. Both nations [France and Britain] were terrified by the so-called 'knockout blow', a German maneuver that would blanket their capitals with poison gas ... The paralyzing effect of this fear prevented them from going to war ... and gave the Germans the freedom to march into the Ruhr, to form the Anschluss with Austria, to force the humiliating Munich appeasement (with the justification of 'peace in our time'), and to take other aggressive actions [e.g. against the Jews in the Nuremberg Laws, Kristallnacht, etc.] ... If the USSR were sufficiently prepared in the event a war did occur, only the capitalists would be destroyed. The Soviets would survive ... that would more than justify whatever sacrifice and destruction had taken place.

"This view seems to prevail in the Soviet military and the Politburo even to the present day. It is almost certain, despite several public denials, that Soviet military preparations are based on war-fighting, rather than on deterrence-only concepts and doctrines..." - Herman Kahn, Thinking About the Unthinkable in the 1980s, 1984, pages 101-102.

Kahn adds, in his footnote on p. 111, that "Richard Betts has documented numerous historical cases in which attackers weakened their opponents' defenses through the employment of unanticipated tactics. These include: rapid changes in tactics per se, false alarms and fluctuating preparations for war ... doctrinal innovations to gain surprise. ... This is exactly the kind of thing which is likely to surprise those who subscribe to MAD theories. Those who see a need for war-fighting capabilities expect the other side to try to be creative and use tactical innovations such as coercion and blackmail, technological surprises, or clever tactics on 'leverage' targets, such as command and control installations. If he is to adhere to a total reliance on MAD, the MADvocate has to ignore these possibilities." See Richard Betts, "Surprise Despite Warning: Why Sudden Attacks Succeed", Political Science Quarterly, Winter 1980-81, pp. 551-572.

Compare two situations: (1) Putin explodes a 50 megaton nuclear "test" of the warhead for his new nuclear-reactor-powered torpedo, Poseidon (a revamped 1961 Tsar Bomba), or detonates a high-altitude nuclear EMP "test" over neutral waters but within thousands of miles of USA or UK territory; (2) Putin invades Poland using purely conventional weapons. Our point here is that both nuclear AND conventional weapons trigger nuclear threats and the risk of nuclear escalation, as indeed they have done (for Putin's nuclear threats, scroll down to the videos with translations below). So the fashionable CND-style concept that only nuclear weapons can trigger nuclear escalation is bullshit, designed to help Russia start and win WWIII to produce a world government by getting us to undertake further unilateral (not multilateral) disarmament, just as evolved in the 1930s, setting the scene for WWII. Japan, for example, did not have nuclear weapons in August 1945, yet triggered tactical nuclear war (both cities had some military bases and munitions factories, as well as enormous numbers of civilians); and the decision to attack cities, rather than just "test" a weapon above Tokyo Bay (as Teller demanded but Oppenheimer rejected, for maximum impact with a very small supply of nuclear weapons), showed some strategic nuclear war thinking. Truman was escalating to try to shock Japan emotionally into rapid surrender (many cities in Japan had already been burned out in conventional incendiary air raids, and the two nuclear attacks, while horrible for civilians in those cities, contributed only a fraction of the millions killed in WWII, despite anti-nuclear propaganda lies to the contrary).
Truman's approach of escalating to win is the opposite of the "minimax game theory" gradual-escalation approach (von Neumann's maths and Thomas Schelling's propaganda) that is currently the basis of nuclear deterrence planning, despite its failure wherever it has been tried (Vietnam, Afghanistan, etc.). Gradual escalation is supposed to minimise the maximum possible risk (hence the name "minimax"), but in the real world (unlike rule-bound games) it guarantees failure by maximising the build-up of resentment. E.g., Schelling/minimax say that if you gradually napalm civilians day after day (because they are the unprotected human shields used by terrorists/insurgents: the Vietcong hiding in underground tunnels, exactly like Hamas today, and the Putin regime's Metro-2 shelter tunnels under Russia), you somehow "punish the enemy" (although they don't give a toss about the lives of kids, which is why you're fighting them!) and force them to negotiate for peace in good faith; then you can pose for photos with them, sharing a glass of champagne, and there is "world peace". That's a popular fairy tale, like Marxist mythology.
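Von Neumann's minimax criterion mentioned above can be made concrete with a toy sketch. Note the strategy labels and payoff numbers below are purely hypothetical illustrations, not drawn from Schelling's or Kahn's actual analyses; the point is only to show what "minimise the maximum possible risk" means mechanically:

```python
# Toy illustration of von Neumann's minimax (security-level) criterion.
# Rows are our strategies, columns the opponent's responses; entries are
# our payoffs. All numbers are hypothetical, for illustration only.

def minimax_row(payoffs):
    """Pick the row whose worst-case (minimum) payoff is largest."""
    worst_cases = [min(row) for row in payoffs]
    best = max(range(len(payoffs)), key=lambda i: worst_cases[i])
    return best, worst_cases[best]

# Hypothetical strategies: row 0 = gradual escalation, row 1 = decisive escalation.
payoffs = [
    [-1, -4],   # gradual escalation: worst case is -4
    [-2, -3],   # decisive escalation: worst case is -3
]
choice, value = minimax_row(payoffs)
print(choice, value)  # row 1 is chosen: its worst case (-3) beats row 0's (-4)
```

The blog's criticism, of course, is that real opponents are not column players bound by the matrix: they change the game (resentment, surprise tactics, blackmail), which is exactly what a fixed minimax calculation cannot capture.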

Once you grasp this fact - that nuclear weapons have been, and will again be, "used" explosively without automatic escalation, for example in provocative testing such as the 1961 Russian 50 megaton bomb test or the 1962 high-altitude EMP bursts - you should be able to grasp that the "escalation" deception, used to dismiss civil defense and tactical nuclear deterrence against limited nuclear war, is fake news from Russian fellow-travellers like Corbyn. Once you assign a non-unity probability to "escalation", you're into conventional war territory: if you fight a conventional war, it can "escalate" to nuclear war, as on 6 August 1945. Japan did not avoid nuclear attack by not having nuclear weapons on 6 August 1945. If it had possessed nuclear weapons ready to be delivered, a very persuasive argument could be made that, unless Truman wanted to invite retaliation, World War II would have remained strategically non-nuclear: no net strategic advantage would have been achieved by nuclear city bombing, so only war-ending tactical nuclear threats could have prevailed in practice. But try explaining this to the groupthink pseudosocialist bigoted mass murderers who permeate fake physics with crap; it's no easier than explaining to them the origins of particle masses, or even dark energy/gravitation: in both cases groupthink lying hogwash persists because statements of proved facts are hated and rejected if they debunk the religious-style fairy tales the mass media loves.
There were plenty of people warning that mass media gas-war fear mongering was disguised Nazi-supporting propaganda in the 1930s, but the public listened to that crap then, just as it accepted without question the "eugenics" (anti-diversity evolution crap of Sir Francis Galton, Darwin's cousin) basis for Hitler's Mein Kampf, and just as it accepted the lying propaganda from the UK "Cambridge Scientists Anti-War Group", which - like CND and all the other arms control and disarmament lobbies supporting terrorist states today - did more than even Hitler to deliberately lay the foundations for the Holocaust and World War II, while never being criticised in the UK media! Thus, it's surely time for people to oppose evil lying about civil defence, which saves lives in all disasters, from storms to conventional war to the collateral damage risks of nuclear terrorism by mad enemies. At some point, the majority has to decide either to defend itself honestly and decently against barbarism, or to be consumed by it as the price of believing bullshit. It's time for decent people to oppose lying evil regarding the necessity of credible tactical (not incredible strategic) nuclear weapons, as Oppenheimer called for in his 1951 speech, to deter invasions.

Democracy can't function when secrecy is used to deliberately cover-up vital data from viewing by Joe Public. Secrecy doesn't protect you from enemies who independently develop weapons in secret, or who spy from inside your laboratories:

"The United States and Great Britain resumed testing in 1962, and we spared no effort trying to find out what they were up to. I attended several meetings on that subject. An episode related to those meetings comes to mind ... Once we were shown photographs of some documents ... the photographer had been rushed. Mixed in with the photocopies was a single, terribly crumpled original. I innocently asked why, and was told that it had been concealed in panties. Another time ... questions were asked along the following lines: What data about American weapons would be most useful for your work and for planning military technology in general?"

- Andrei Sakharov, Memoirs, Hutchinson, London, 1990, pp. 225-6.

ABOVE: The British government has now declassified detailed summary reports giving secret original nuclear test data on the EMP (electromagnetic pulse) damage due to numerous nuclear weapons - data which is still being kept under wraps in America, where it hasn't been superseded because Western atmospheric nuclear tests were stopped late in 1962 and never resumed (even though the Russians have even more extensive data). This completely debunks Glasstone and Dolan's disarmament-propaganda nonsense in the 1962, 1964 and 1977 Effects of Nuclear Weapons, which ignores EMP piped far away from low-altitude nuclear tests by power and communications cables, and falsely claims instead that such detonations don't produce EMP damage outside the 2 psi blast radius! For a discussion of the new data, and a link to the full 200+ page version (in addition to useful data it inevitably, like all official reports, contains a lot of "fluff" padding), please see the other (physics) site: https://nige.wordpress.com/2023/09/12/secret-emp-effects-of-american-nuclear-tests-finally-declassified-by-the-uk-and-at-uk-national-archives/ (by contrast, this "blogspot" uses old non-smartphone-proof coding, no longer properly indexed by "Google's smartphone bot"). As long ago as 1984, Herman Kahn argued on page 112 of his book Thinking About the Unthinkable in the 1980s: "The effects of an EMP attack are simply not well understood [in the West, where long power lines were never exposed on high-altitude nuclear tests, unlike the Russians' 1962 Operation K, so MHD-EMP or E3 damage wasn't even mentioned in the 1977 Glasstone and Dolan Effects of Nuclear Weapons], but the Soviets seem to know - or think they know - more than we do."

BELOW: declassified British nuclear war planning blast survival data showing that, even without special Morrison table shelters, the American assumption that nobody can survive in a demolished house is false. This is based on detailed WWII British data (the majority of people in houses flattened within 77 ft of V1 Nazi cruise missiles survived!), while secret American reports (contradicting their unclassified propaganda) proved that blast survival occurred at 16 psi overpressure in Hiroshima's houses: e.g. see the limited-distribution Dikewood Corporation report DC-P-1060 for Hiroshima, the secret 1972 Capabilities of Nuclear Weapons DNA-EM-1 table 10-1, and WWII report RC-450 table 8.2, p. 145. (For determining the survival of people sheltered in brick houses, the WWII A, B, C and D damage-versus-casualty data from V1 blast was correlated to similar damage from nuclear blast as given in Glasstone's 1957 Effects of Nuclear Weapons, page 249, Fig. 6.41a, and page 109, Fig. 3.94a, which show that A, B, C and D damage to brick houses from nuclear weapons occurs at peak overpressures of 9, 6, 3 and 0.5 psi, respectively; the longer blast duration from higher yields blows the debris over a wider area, reducing the load per unit area falling on people sheltered under tables etc.) See also the declassified UK government assessment of a nuclear terrorist attack on a port or harbour, and the confidential classified UK Government analysis of the economic and social effects of WWII bombing (e.g. the recovery times for areas as a function of the percentage of houses destroyed):
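For illustration only, the A, B, C, D brick-house damage thresholds quoted above from Glasstone's 1957 book (9, 6, 3 and 0.5 psi peak overpressure) can be written as a simple lookup. This sketch assumes a plain threshold correlation and does not reproduce the actual RC-450 casualty rates per damage category:

```python
# Illustrative lookup for the A, B, C, D brick-house damage categories
# quoted above from Glasstone (1957): damage occurs at peak overpressures
# of 9, 6, 3 and 0.5 psi respectively. A sketch of the correlation only;
# the RC-450 casualty data per category are not reproduced here.

DAMAGE_THRESHOLDS_PSI = [("A", 9.0), ("B", 6.0), ("C", 3.0), ("D", 0.5)]

def brick_house_damage(peak_overpressure_psi):
    """Return the damage category for a given peak overpressure (psi),
    or None if the overpressure is below the D-damage threshold."""
    for category, threshold in DAMAGE_THRESHOLDS_PSI:
        if peak_overpressure_psi >= threshold:
            return category
    return None

print(brick_house_damage(16.0))  # "A" (cf. the 16 psi Hiroshima house survival data above)
print(brick_house_damage(4.0))   # "C"
print(brick_house_damage(0.2))   # None (below the D-damage threshold)
```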

Unofficial Russian video on the secret Russian nuclear shelters from Russian Urban Exploration, titled "Проникли на секретный Спецобъект Метро!" = "We infiltrated a secret special facility of the Metro!":

ABOVE: Moscow Metro and Metro-2 (secret nuclear subway) horizontally swinging blast doors take only 70 seconds to shut, whereas their vertically rising blast doors take 160 seconds to shut; both times are, however, far shorter than the arrival time of Western ICBMs or even SLBMs, which take 15-30 minutes, by which time the Russian shelters are sealed against blast and radiation! In times of nuclear crisis, Russia planned to evacuate from cities those who could not be sheltered, and for the remainder to be based in shelters (similar to the WWII British situation, when people slept in shelters of one kind or another when there was a large risk of being bombed without notice, particularly in supersonic V2 missile attacks, where little warning time was available).


ABOVE: originally SECRET diagrams showing the immense casualty reductions for simple shelters and local (not long-distance, as in 1939) evacuation, from a UK Home Office Scientific Advisers’ Branch report CD/SA 72 (UK National Archives document reference HO 225/72), “Casualty estimates for ground burst 10 megaton bombs”, which exposed the truth behind UK Cold War civil defence (contrary to Russian propaganda against UK defence, which still falsely claims there was no scientific basis for anything, playing on the fact that the data was classified SECRET). Evacuation plus shelter eliminates huge casualties for limited attacks; notice that for the 10 megaton bombs (more than 20 times the typical yield of today’s MIRV compact warheads!), you need 20 weapons, i.e. a total of 10 x 20 = 200 megatons, to produce 1 million killed, if civil defence is in place for 45% of people to evacuate a city and the rest to take shelter. With civil defence, therefore, you get 1 million killed per 200 megatons. This proves that civil defence works to make deterrence more credible in Russian eyes. For a discussion of the anti-civil defence propaganda scam in the West, led by Russian agents for Russian advantage in the new cold war, just read the posts on this blog, started in 2006 when Putin's influence became clear. You can read the full PDF by clicking the link here. Or see the files here.
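The CD/SA 72 arithmetic quoted above is trivial to check; only the figures quoted in the report summary are used, and the per-megaton rate is our own derived illustration, not a number from the report:

```python
# Checking the arithmetic quoted above from Home Office report CD/SA 72:
# twenty 10-megaton ground bursts, with 45% of the city evacuated and the
# rest sheltered, give about 1 million killed.
bombs = 20
yield_per_bomb_mt = 10
killed = 1_000_000

total_mt = bombs * yield_per_bomb_mt      # total attack megatonnage
killed_per_mt = killed / total_mt         # derived rate, for illustration

print(total_mt)       # 200 megatons in total
print(killed_per_mt)  # 5000 killed per megaton with civil defence in place
```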

ABOVE: the originally CONFIDENTIAL classified document chapters of Dr D. G. Christopherson’s “Structural Defence 1945, RC450”, giving low-cost UK WWII shelter effectiveness data, which should also have been published to prove the validity of civil defence countermeasures in making deterrence of future war more credible by allowing survival of “demonstration” strikes and “nuclear accidents / limited wars” (it’s no use having weapons but no civil defence, so you can’t deter aggressors - the disaster of Munich appeasement giving Hitler a green light on 30 September 1938, when Anderson shelters were only issued the next year, 1939!). For the original WWII UK Government low-cost sheltering instruction books issued to the public (for a small charge!), please click here (we have uploaded them to internet archive); for further evidence of the effectiveness of indoor shelters during WWII from the Morrison shelter inventor Baker's analysis, please click here (he titled his book about WWII shelters "Enterprise versus Bureaucracy", which tells you all you need to know about the problems his successful innovations in shelter design experienced; his revolutionary concept was that the shelter should be damaged in order to protect the people inside, because of the vast energy absorption soaked up in the plastic deformation of steel - something which naive fools can never appreciate. By analogy, if your car bumper is perfectly intact after an impact, you're unlikely to be, because the bumper has not absorbed the impact energy, which has instead been passed on to you!). We have also placed useful declassified UK government nuclear war survival information on internet archive here and here.
There is also a demonstration of how proof-tested WWII shelters were tested in 1950s nuclear weapon trials and adapted for use in Cold War nuclear civil defence, here, thus permanently debunking the somewhat pro-dictatorship/anti-deterrence Jeremy Corbyn/Matthew Grant/Duncan Campbell anti-civil defence propaganda rants, which pretend to be based on reality but obviously just ignore the hard, yet secret, nuclear testing facts upon which UK government civil defence was based, as my father (a Civil Defence Corps instructor) explained here back in 2006. The reality is that the media follows herd fashion to sell paper/airtime; it doesn't lead it. This is why it backed Nazi appeasement (cheering Chamberlain's 1938 handshakes with Hitler, for instance) and only switched tune when it was too late to deter Nazi aggression in 1939; it made the most money that way. We have to face the facts!

NUKEGATE - Western tactical neutron bombs were disarmed after a Russian propaganda lie. Russia now has over 2000... "Disarmament and arms control" charlatans, quacks, cranks, liars, mass murdering Russian affiliates, and the evil genocidal Marxist media exposed for what they are - what they were in the 1930s, when they enabled Hitler to murder tens of millions in war. Glasstone's and Dolan's 1977 Effects of Nuclear Weapons deceptions totally disproved. Professor Brian Martin, TRUTH TACTICS, 2021 (pp. 45-50): "In trying to learn from scientific publications, trust remains crucial. The role of trust is epitomised by Glasstone’s book The Effects of Atomic Weapons. Glasstone was not the author; he was the editor. The book is a compilation of information based on the work of numerous contributors. For me, the question was, should I trust this information? Was there some reason why the editors or authors would present fraudulent information, be subject to conflicts of interest or otherwise be biased? ... if anything, the authors would presumably want to overestimate rather than underestimate the dangers ... Of special interest would be anyone who disagreed with the data, calculations or findings in Glasstone. But I couldn’t find any criticisms. The Effects of Nuclear Weapons was treated as the definitive source, and other treatments were compatible with it. ... One potent influence is called confirmation bias, which is the tendency to look for information that supports current beliefs and dismiss or counter contrary information. The implication is that changing one’s views can be difficult due to mental commitments. To this can be added various forms of bias, interpersonal influences such as wanting to maintain relationships, overconfidence in one’s knowledge, desires to appear smart, not wanting to admit being mistaken, and career impacts of having particular beliefs. It is difficult to assess the role of these influences on yourself."

Honest Effects of Nuclear Weapons!

ABOVE (VIDEO CLIP): Russian State TV Channel 1 war inurer and enabler, NOT MERELY MAKING "INCREDIBLE BLUFF THREATS THAT WE MUST ALL LAUGH AT AND IGNORE LIKE DR GOEBBELS THREATS TO GAS JEWS AND START A WORLD WAR" AS ALMOST ALL THE BBC SCHOOL OF "JOURNALISM" (to which we don't exactly belong!) LIARS CLAIM, but instead preparing Russians mentally for nuclear war (they already have nuclear shelters and a new Putin-era tactical nuclear war civil defense manual from 2014, linked and discussed in blog posts on the archive above), arguing for use of nuclear weapons in Ukraine war in 2023: "We should not be afraid of what it is unnecessary to be afraid of. We need to win. That is all. We have to achieve this with the means we have, with the weapons we have. I would like to remind you that a nuclear weapon is not just a bomb; it is the heritage of the whole Russian people, suffered through the hardest times. It is our heritage. And we have the right to use it to defend our homeland [does he mean the liberated components of the USSR that gained freedom in 1992?]. Changing the [nuclear use] doctrine is just a piece of paper, but it is worth making a decision."

NOTE: THIS IS NOT ENGLISH LANGUAGE "PROPAGANDA" SOLELY ADDRESSED AS A "BLUFF" TO UK AND USA GOV BIGOTED CHARLATANS (those who have framed photos of Hitler, Stalin, Chamberlain, Baldwin, Lloyd George, Eisenhower, et al., on their office walls), BUT AIMED AT MAKING RUSSIAN FOLK PARTY TO THE NEED FOR PUTIN TO START A THIRD WORLD WAR! Duh!!!!! SURE, PUTIN COULD PRESS THE BUTTON NOW, BUT THAT IS NOT THE RUSSIAN WAY, ANY MORE THAN HITLER SET OFF WWII BY DIRECTLY BOMBING LONDON! HE DIDN'T. THESE PEOPLE WANT TO CONTROL HISTORY, TO GO DOWN AS THE NEXT "PUTIN THE GREAT". THEY WANT TO GET THEIR PEOPLE, AND CHINA, NORTH KOREA, IRAN, ET AL. AS ALLIES, BY APPEARING TO BE DEFENDING RATIONALITY AND LIBERTY AGAINST WAR-MONGERING WESTERN IMPERIALISM. For the KGB mindset here, please read Chapman Pincher's book "The Secret Offensive" and Paul Mercer's "Peace of the Dead - The Truth Behind the Nuclear Disarmers". Please note that the analysis of the secret USSBS report 92, The Effects of the Atomic Bomb on Hiroshima, Japan (which Google fails to appreciate is a report with the OPPOSITE conclusions to the lying unclassified reports and Glasstone's book on fire), is on Internet Archive in the PDF documents list at the page "The effects of the atomic bomb on Hiroshima, Japan" (the secret report 92 of the USSBS, not the lying unclassified version or the Glasstone book series). If you don't like the plain layout of this blog, you can change it into a "fashionable" one with smaller photos you can't read by adding ?m=1 to the end of the URL, e.g. https://glasstone.blogspot.com/2022/02/analogy-of-1938-munich-crisis-and.html?m=1


Glasstone's Effects of Nuclear Weapons exaggerations completely undermine credible deterrence of war: Glasstone exaggerates urban "strategic" nuclear weapons effects by using effects data taken from unobstructed terrain (without the concrete jungle shielding of blast winds and radiation by cities!), and omits the most vital uses and most vital effects of nuclear weapons: to DETER world war credibly by negating the concentrations of force used to invade Belgium, 1914 (thus WWI) and Poland (WWII). The facts from Hiroshima and Nagasaki on the shielding of blast and radiation effects by modern concrete buildings support the credible nuclear deterrence of invasions (click here for data), which - unlike the countervalue drivel that failed to prevent WW2 costing millions of human lives - worked in the Cold War, despite the Western media's obsession with treating as Gospel truth the lying anti-nuclear propaganda from Russia's World Peace Council and its allies (intended to make the West disarm to allow Russian invasions without opposition, as worked in Ukraine recently)! If we have credible W54 and W79 tactical nukes to deter invasions, as in the Cold War, pro-Russian World Peace Council inspired propaganda says: "if you use those, we'll bomb your cities". But they can bomb our cities with nuclear weapons if we use conventional weapons, or even if we fart, if they want to - we don't actually control what thugs in dictatorships do. It is like claiming that, because Hitler had 12,000 tons of tabun nerve agent by 1945, we had to surrender for fear of it. Actually, he had to blow his brains out, because he faced a credible deterrent: retaliation risk plus defence (gas masks) negated it!

Credible deterrence necessitates simple, effective protection against concentrated and dispersed invasions and bombing. The facts can debunk the massively inaccurate, deliberately misleading CND "disarm or be annihilated" pro-dictatorship ("communism" scam) political anti-nuclear deterrence dogma. Hiroshima and Nagasaki anti-nuclear propaganda lies on blast and radiation effects in modern concrete cities are debunked by solid factual evidence kept from public sight for political reasons by the Marx-media, which is not opposed by the remainder of the media, so the completely fake "nuclear effects data" sneaks into "established pseudo-wisdom" by the back-door. Another trick is hate attacks on anyone telling the truth: this is a repeat of the lies from Nobel Peace Prize winner Angell and pals before WWI (when long-"outlawed" gas was used by all sides, contrary to claims that paper agreements had "banned" it somehow) and WWII (when pre-war gas bombing lies by Angell, Noel-Baker, Joad and others were used as an excuse to "make peace deals" with the Nazis - again, not worth the paper they were printed on). Mathematically, the subset of all States which keep agreements (disarmament and arms control, for instance) is identical to the subset of all States which are stable Democracies (i.e., those tolerating dissent for the past several years), but this subset is - as Dr Spencer Weart's statistical evidence of war proves in his book Never at War: Why Democracies Won't Fight One Another - not the bloody war problem! Because none of the disarmers grasp set theory, or bother to read Dr Weart's book, they can never understand that disarmament of Democracies doesn't cause peace but causes millions of deaths.

PLEASE CLICK HERE for the truth from Hiroshima and Nagasaki on the shielding of blast and radiation effects by modern concrete buildings in the credible nuclear deterrence of invasions, which - unlike the countervalue drivel that failed to prevent WW2 costing millions of human lives - worked in the Cold War despite the Western media's obsession with treating as Gospel truth the lying anti-nuclear propaganda from Russia's World Peace Council and its allies (intended to make the West disarm to allow Russian invasions without opposition, as worked in Ukraine recently)! Realistic effects and credible nuclear weapon capabilities are needed for deterring or stopping aggressive invasions and attacks which could escalate into major conventional or nuclear wars. Credible deterrence comes through simple, effective protection against concentrated and dispersed invasions and aerial attacks, debunking the inaccurate, misleading CND "disarm or be annihilated" left-political anti-nuclear deterrence dogma. Hiroshima and Nagasaki anti-nuclear propaganda lies on blast and radiation effects in modern concrete cities are debunked by solid factual evidence kept from public sight for political reasons by the Marx-media.

Glasstone's and Nukemap's fake Effects of Nuclear Weapons data for unobstructed deserts, rather than for the realistic blast- and radiation-shielding concrete jungles which mitigate countervalue damage (as proved in Hiroshima and Nagasaki by Penney and Stanbury), undermine credible world war deterrence, just as Philip Noel-Baker's 1927 BBC radio propaganda lies about a gas-war knock-out blow were used by Nazi-propaganda-distributing "pacifist disarmers" to undermine deterrence of Hitler's war, deliberately murdering tens of millions through lies (e.g. that effective gas masks don't exist) that were easy to disprove, but were supported by the mainstream fascist-leaning press in the UK. There is not just one country, Russia, which could trigger WW3, because we know from history that the world forms alliances once a major war breaks out, apart from a few traditionally neutral countries like Ireland and Switzerland; so a major US-China war over Taiwan could draw in support from Russia and North Korea, just as the present Russian invasion and war against Ukraine has drawn in Iranian munitions support for Russia. So it is almost certain that a future East-vs-West world war will involve an alliance of Russia-China-North Korea-Iran fighting on multiple fronts, with nuclear weapons being used carefully for military purposes (not in the imaginary 1930s-style massive "knockout blow" gas/incendiary/high explosive raids against cities that were used by the UK media to scare the public into appeasing Hitler and thus enabling him to trigger world war; Chamberlain had read Mein Kampf and crazily approved Hitler's plans to exterminate Jews and invade Russia, starting a major war - a fact censored out of biased propaganda hailing Chamberlain as a peacemaker).

Realistic effects and credible nuclear weapon capabilities are VITAL for deterring or stopping aggressive invasions and attacks which could escalate into major conventional or nuclear wars, and they debunk the Marx-media propagandists who obfuscate because they don't want you to know the truth. So activism is needed to get the message out against lying frauds and open fascists in the Russian-supporting Marx mass media, which sadly includes government officialdom (still infiltrated by reds under beds - sorry, Joe McCarthy haters, but admit it as a hard fact that nuclear bomb labs in the West openly support Russian fascist mass murders; I PRAY THIS WILL SOON CHANGE!).

ABOVE: Tom Ramos of Lawrence Livermore National Laboratory (quoted at length on the development details of compact MIRV nuclear warhead designs in the latest post on this blog) explains how the brilliant small primary stage, the Robin, was developed and properly proof-tested in time to act as the primary stage for a compact thermonuclear warhead to deter Russia in the 1st Cold War - something now made impossible by Russia's World Peace Council propaganda campaigns. (Note that Ramos has a new book published, called From Berkeley to Berlin: How the Rad Lab Helped Avert Nuclear War, which describes in detail in chapter 13, "First the Flute and Then the Robin", how caring, dedicated nuclear weapons physicists in the 1950s and 1960s actually remembered the lesson of the disarmament disaster of the 1930s, and so WORKED HARD to develop the "Flute" secondary and the "Robin" primary to enable a compact, light thermonuclear warhead to help deter WWIII! What a difference from today, when all we hear from such "weaponeers" is evil lying about nuclear weapons effects on cities, against Western civil defence, and against credible deterrence, on behalf of the enemy.)

ABOVE: Star Wars filmmaker Peter Kuran has at last released his lengthy (90 minute) documentary on the neutron bomb. Unfortunately, it is not yet being widely screened in cinemas or sold on DVD or Blu-ray disc, so you have to stream it (if you have fast broadband internet hooked up to a decent telly). At least Peter managed to interview Samuel Cohen, who developed the neutron bomb out of the cleaner Livermore devices Dove and Starling in 1958. (Ramos says Livermore's director, who invented a wetsuit, is now trying to say Cohen stole the neutron bomb idea from him! Not so: as RAND colleague and 1993 Effects Manual EM-1 editor Dr Harold L. Brode explains in his recent brilliant book on the history of nuclear weapons in the 1st Cold War (reviewed in detail in a post on this blog), Cohen was after the neutron bomb for many years before Livermore was even built as a rival to Los Alamos. Cohen had been into neutrons when working in the Los Alamos Efficiency Group of the Manhattan Project on the very first nuclear weapons - used with neutron effects on people by Truman back in 1945 to end a bloody war - while the Livermore director was in short pants.)

For the true effects in modern city concrete buildings in Hiroshima and Nagasaki, disproving the popular lies for unprotected people in open deserts used as the basis for blast and radiation calculations by Glasstone and Nukemap, please click here. The deceptive bigots portraying themselves as the Federation of American Scientists - genuine communist disarmers in the Marx media, including TV scammers - have been suppressing the truth to sell fake news since 1945, in a repetition of the 1920s and 1930s gas-war media lying for disarmament and horror news scams that caused disarmament and thus encouraged Hitler to initiate the invasions that set off WWII!

Glasstone's Effects of Nuclear Weapons exaggerations completely undermine credible deterrence of war: Glasstone exaggerates urban "strategic" nuclear weapons effects by using effects data taken from unobstructed terrain (without the concrete jungle shielding of blast winds and radiation by cities!), and omits the most vital uses and most vital effects of nuclear weapons: to DETER world war credibly by negating the concentrations of force used to invade Belgium, 1914 (thus WWI) and Poland (WWII). Disarmament and arms control funded propaganda lying says any deterrent which is not actually exploded in anger is a waste of money since it isn't being "used" - a fraud apparently due to the title and content of Glasstone's book, which omits the key use and effect of nuclear weapons: to prevent world wars. This is because Glasstone and Dolan don't even bother to mention the neutron bomb or the 10-fold reduced fallout of the Los Alamos 95% clean Redwing-Navajo test of 1956, despite the neutron bomb's effects being analysed in detail for its enhanced radiation and reduced thermal and blast yield in the 1972 edition of Dolan's edited secret U.S. Department of Defense Effects Manual EM-1, "Capabilities of Nuclear Weapons" - data now declassified, yet still being covered up by "arms control and disarmament" liars today to try to destroy credible deterrence of war, in order to bolster their obviously pro-Russian political anti-peace agenda. "Disarmament and arms control" charlatans, quacks, cranks, liars, mass-murdering Russian affiliates, and the evil genocidal Marxist media exposed for what it is: what it was in the 1930s, when it enabled Hitler to murder tens of millions in war.

ABOVE: 11 May 2023 Russian state TV Channel 1 loon openly threatens nuclear tests and bombing the UK. Seeing how the Russian media is under the control of Putin, this is like Dr Goebbels' rantings, 80 years on. But this doesn't disprove the world war threat any more than it did with Dr Goebbels. These people, like the BBC here, don't just communicate "news" but do so selectively, with interpretations and opinions that set the stage for a pretty obviously hate-based political agenda with their millions of viewers - a trick that worked in the 1st Cold War despite Orwell's attempts to lampoon it in books such as "1984" and "Animal Farm". When in October 1962 the Russians put nuclear weapons into Cuba in secret, without any open "threats", and with a MASSIVELY inferior overall nuclear stockpile to the USA (the USA had MORE nuclear weapons, more ICBMs, etc.), the media made a big fuss, even when Kennedy went on TV on 22 October and ensured no nuclear "accidents" in Cuba by telling Russia that any single accidentally launched missile from Cuba against any Western city would result in a FULL RETALIATORY STRIKE ON RUSSIA. There was no risk of nuclear war then except by accident, and Kennedy, in his 25 May 1961 speech on "Urgent National Needs" a year and a half before, had instigated NUCLEAR SHELTERS in public building basements to help people in cities survive (modern concrete buildings survived near ground zero in Hiroshima, as proved by declassified USSBS reports kept covered up by Uncle Sam). NOW THAT THERE IS A CREDIBLE THREAT OF NUCLEAR TESTS AND HIROSHIMA-TYPE INTIMIDATION STRIKES, THE BBC FINALLY DECIDES TO SUPPRESS NUCLEAR NEWS, HELPING THE "ANTI-NUCLEAR" RUSSIAN PROPAGANDA THAT IS TRYING TO PREVENT US FROM GETTING CREDIBLE DETERRENCE OF INVASIONS, AS WE HAD WITH THE W79 UNTIL DISARMERS REMOVED IT IN THE 90s!
This stinks of prejudice, the usual sort of hypocrisy from the 1930s "disarmament heroes" who lied their way to Nobel peace prizes by starting a world war!



Wednesday, March 29, 2006

EMP radiation from nuclear space bursts in 1962

Above: the master's degree thesis by Louis W. Seiler, Jr., A Calculational Model for High Altitude EMP, report ADA009208, computes these curves for the peak EMP at ground zero for a burst above the magnetic equator, where the Earth's magnetic field is far weaker than at high latitudes (nearer the poles), where magnetic field lines converge (increasing the magnetic field strength). The discoverer of the magnetic dipole EMP mechanism, Longmire, states in another report that the peak EMP is almost directly proportional to the transverse component of the Earth's magnetic field across the radial line from the bomb to the observer. Seiler shows that the peak EMP is almost directly proportional to the strength of the Earth's magnetic field: the curves above apply to a 0.3 gauss magnetic field strength, which is the weak field at the equator (the 1962 American tests over Johnston Island were near the equator). Over North America, Europe or Russia, peak EMP fields would be roughly double those in the diagram above, due to the Earth's stronger magnetic field of around 0.5 gauss, which deflects Compton electrons more effectively, causing more of their kinetic energy to be converted into EMP energy than in the 0.3 gauss field over Johnston Island in the 1962 American tests. If you look at the curves above, you see that the peak EMP is only a weak function of the gamma ray output of the weapon (the peak EMP increases by just a factor of 5, from roughly 10 kV/m to 50 kV/m, as the prompt gamma ray output rises by a factor of 10,000, i.e. from 0.01 to 100 kt); it is far less than directly proportional to yield.
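As a minimal numerical sketch of this near-linear geomagnetic scaling (the simple linear law and the 50 kV/m equatorial baseline are assumptions read off the discussion of Seiler's curves above, not his actual computational model):

```python
def scale_peak_emp(e_peak_equator_kv_m, b_local_gauss, b_equator_gauss=0.3):
    """Scale a peak EMP field read from Seiler's 0.3 gauss equatorial curves
    to a location with a different geomagnetic field strength, using the
    approximately linear dependence on the transverse field noted by Longmire."""
    return e_peak_equator_kv_m * (b_local_gauss / b_equator_gauss)

# ~50 kV/m at the 0.3 gauss equator (100 kt prompt gamma output) becomes
# about 83 kV/m under the ~0.5 gauss field over North America or Europe,
# approaching the near-doubling described above:
print(scale_peak_emp(50.0, 0.5))
```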
Seiler also shows that large two-stage thermonuclear weapons will often produce a smaller peak EMP than a single stage fission bomb, because of "pre-ionization" of the atmosphere by X-rays and gamma rays from the first stage, which ionize the air, making it electrically conductive so that free electrons and ions almost immediately short out the Compton current from the larger secondary stage, negating most of the EMP that would otherwise occur.

Above: the declassified principles involved in enhanced EMP nuclear weapons are very simple and obvious. Materials are selected to maximize the prompt gamma radiation that comes from the inelastic scatter of high-energy fusion neutrons, while a simple radiation shield around the fission primary stage part of the weapon averts the problem of the shorting-out of the final (fusion) stage EMP by fission primary stage pre-ionization of the atmosphere (which prevents most EMP-producing Compton currents, due to making the air so electrically conductive that it immediately shorts out secondary stage Compton currents). In the Starfish Prime test, the warhead was simply inverted before launch, so the fusion secondary stage prevented pre-ionization of the atmosphere by absorbing downward X-rays and gamma rays from the primary stage! In the film taken horizontally from a Hawaiian mountain top (above the local cloud cover), you can clearly see the primary stage of the Starfish Prime weapon being ejected upwards, out of the top, by the immense blast and radiation impulse which has been delivered to it due to the bigger explosion of the secondary (thermonuclear) stage. The primary stage of the bomb flies upwards into space, expanding as it does so, while the heavier secondary stage remains almost stationary below it (photo sequence below).

Philip J. Dolan's Capabilities of Nuclear Weapons, DNA-EM-1, chapter 7, page 7-1 (change 1 page updates, 1978), report ADA955391, states that low yield pure fission bombs typically release 0.5% of their yield as prompt gamma rays, compared to only 0.1% in old high yield warhead designs with relatively thick outer cases, like the 1.4 Mt STARFISH test in 1962. Furthermore, Northrop's 1996 handbook of declassified 1990s EM-1 data gives details of the prompt gamma ray output from four very different nuclear weapon designs, showing that the enhanced radiation warhead ("neutron bomb") releases 2.6% of its total yield as prompt gamma rays, mainly because the outer weapon casing is designed to minimize radiation absorption, allowing as much as possible to escape. This gives an idea of the enormous variation in the EMP potential of existing bomb designs. About 3.5% of the energy of fission is released as prompt gamma rays, and neutrons exceeding 0.5 MeV energy undergo inelastic scatter with heavy nuclei (such as iron and uranium), leaving the nuclei as excited isomers that release further prompt gamma rays.
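The quoted prompt gamma fractions can be compared directly. The fractions below are the figures cited above from EM-1 and Northrop's handbook; the yields plugged in are purely illustrative assumptions:

```python
# Prompt gamma ray fractions quoted above (EM-1 / Northrop handbook figures).
PROMPT_GAMMA_FRACTION = {
    "low-yield pure fission": 0.005,                   # 0.5% of total yield
    "thick-cased high-yield (STARFISH class)": 0.001,  # 0.1%
    "enhanced radiation (neutron bomb)": 0.026,        # 2.6%
}

def prompt_gamma_yield_kt(total_yield_kt, design):
    """Prompt gamma ray output in kt for a given design class."""
    return total_yield_kt * PROMPT_GAMMA_FRACTION[design]

# A mere 1 kt neutron bomb releases 0.026 kt as prompt gammas, while the
# 1,400 kt thick-cased STARFISH released about 1.4 kt:
print(prompt_gamma_yield_kt(1.0, "enhanced radiation (neutron bomb)"))
print(prompt_gamma_yield_kt(1400.0, "thick-cased high-yield (STARFISH class)"))
```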

Thus, low yield bombs at somewhat lower altitudes than 400 km can produce peak EMP fields that exceed those from the 1962 high altitude thermonuclear tests, while still affecting vast areas. As mentioned above, single stage (fission) weapons in some cases produce a larger EMP than high-yield two-stage thermonuclear weapons. Weapon designs that use a minimal tamper, a minimal shell of TNT for implosion or a linear implosion system, and a minimal outer casing, can maximise the fraction of the prompt gamma rays which escape from the weapon, enhancing the EMP. Hence, a low yield fission device could easily produce a peak (VHF to UHF) EMP effect on above ground cables similar to the 1962 STARFISH test (although the delayed, very low intensity MHD-EMP ELF effects penetrating through the earth into underground cables would be weaker, since the MHD-EMP is essentially dependent upon the total fission yield of the weapon, not the prompt radiation output; MHD-EMP occurs as the fireball expands and as the ionized debris travels along the magnetic field lines, seconds to minutes after detonation).

Naïvely, by assuming that a constant fraction of the bomb energy is converted into EMP, textbook radio transmission theory suggests that the peak radiated EMP should then be proportional to the square root of the bomb energy and inversely proportional to the distance from the bomb. But in fact, as the graph above shows, this assumption is a misleading, false approximation: the fraction of bomb energy converted into the EMP is highly variable instead of being constant, suppressing much of the expected variation of peak EMP field strength with bomb energy. For weapons with a prompt gamma ray yield of 0.01-0.1 kt, the peak EMP on the ground decreases as the weapon is detonated at higher altitudes, from 60 to 300 km. But for prompt gamma ray yields approaching 100 kt, the opposite is true: the peak EMP at ground zero then rises as the burst altitude is increased from 60 to 300 km. What happens here is due to a change in the effective altitude from which the EMP is generated. The fraction of prompt gamma rays absorbed by any thickness of air is constant, but large outputs of prompt gamma rays will allow substantial EMP generation to occur over larger distances than smaller outputs. Hence, high yields are able to ionize and generate EMP within a larger vertical thickness of air (a bigger "deposition region" volume) than smaller yields.
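The contrast between the naive textbook scaling and Seiler's curves can be made explicit. This is only an arithmetic illustration of the argument above, not an EMP model:

```python
import math

def naive_peak_field_ratio(gamma_yield_1_kt, gamma_yield_2_kt):
    """Textbook radio-transmission assumption: a constant fraction of bomb
    energy goes into EMP, so the peak field ratio goes as the square root
    of the yield ratio (at fixed distance)."""
    return math.sqrt(gamma_yield_2_kt / gamma_yield_1_kt)

# Naive prediction for a 10,000-fold rise in prompt gamma output
# (0.01 kt -> 100 kt): a 100-fold rise in peak field...
print(naive_peak_field_ratio(0.01, 100.0))
# ...versus the roughly 5-fold rise (10 -> 50 kV/m) actually shown by
# Seiler's curves, because the EMP energy fraction is not constant.
```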

For sufficiently large yields, this makes the peak EMP on the ground increase while the burst altitude is increased, despite the increasing distance between the ground and the bomb! This is because a large prompt gamma output is able to produce substantial EMP contributions from a bigger volume of air, effectively utilizing more of the increased volume of air between bomb and ground for EMP generation. This increasing deposition region size for higher yields increases the efficiency with which gamma ray energy is turned into EMP energy. Weapons with a lower output of prompt gamma rays produce a smaller effective "deposition region" volume for EMP production, concentrated at higher altitudes (closer to the bomb, where the gamma radiation is stronger), which is less effective in producing ground-level EMP.

Above: this comparison of the prompt gamma ray deposition regions for space bursts of 1 and 10 megatons total yield (i.e., 1 kt and 10 kt prompt gamma ray yield, respectively) in the 1977 Effects of Nuclear Weapons explains why the peak EMP at ground zero varies as Seiler's graph shows. In all cases (for burst heights of 50-300 km) the base of the deposition region is at an altitude of 8-10 km, but the height of the top of the deposition region is a function of bomb yield as well as burst altitude. The deposition region radius marks the region where the peak conductivity of the air (due to ionization by the nuclear radiation) is 10^-7 S/m; inside this distance the air is conductive and the EMP is being produced by transverse (magnetic field-deflected) Compton electron currents, and is being limited by the air conductivity rise due to secondary electrons. Beyond this radius, the EMP is no longer being significantly produced or attenuated by secondary electrons, and the EMP thus propagates like normal radio waves (of similar frequency). The greater the vertical thickness of the deposition region between the bomb and the surface for a given yield, the greater the EMP intensity. Thus, for the 1 megaton burst shown, the vertical height of the deposition region above ground zero reaches:

62 km altitude for 50 km burst height
84 km altitude for 100 km burst height
74 km altitude for 200 km burst height, and
67 km altitude for 300 km burst height

Hence, the 100 km burst height maximises the thickness of the prompt gamma ray deposition region above ground zero, and maximises the EMP for that 1 megaton yield. (For 1 megaton burst altitudes above 100 km, the inverse square law of radiation reduces the intensity of the prompt gamma rays hitting the atmosphere sufficiently to decrease the deposition region top altitude.) For the 10 megaton yield, the extra yield is sufficient to extend the deposition region to much greater sizes and enable its top to continue rising above ground zero as the burst height is increased to 200 km, where it reaches an altitude of 85 km, falling to 79 km for a 300 km burst altitude. The extra thickness of the deposition layer enables a greater EMP because the small fraction of the EMP generated in the lowest density air at the highest altitudes, above 70 km or so, suffers the smallest conduction current attenuation (EMP shorting by secondary electrons increases severely with increasing air density at lower altitudes), so it boosts the total EMP strength at ground zero.
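The burst-height optimum for the 1 megaton case can be read off the tabulated deposition-region tops above; a trivial lookup (figures copied from the list above, with an assumed 9 km base) confirms it:

```python
# Deposition region top altitude (km) above ground zero for a 1 Mt burst,
# keyed by burst height (km) - values quoted in the text above.
DEPOSITION_TOP_KM_1MT = {50: 62, 100: 84, 200: 74, 300: 67}
DEPOSITION_BASE_KM = 9  # base sits at roughly 8-10 km in all cases

def best_burst_height(top_by_height, base_km=DEPOSITION_BASE_KM):
    """Burst height giving the thickest deposition region above ground zero."""
    return max(top_by_height, key=lambda h: top_by_height[h] - base_km)

print(best_burst_height(DEPOSITION_TOP_KM_1MT))  # 100
```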

Honolulu Advertiser newspaper article dated 9 July 1962 (local time):

'The street lights on Ferdinand Street in Manoa and Kawainui Street in Kailua went out at the instant the bomb went off, according to several persons who called police last night.'

New York Herald Tribune (European Edition), 10 July 1962, page 2:

'Electrical Troubles in Hawaii

'In Hawaii, burglar alarms and air-raid sirens went off at the time of the blast.'

EMP effects data is given in the Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack, CRITICAL NATIONAL INFRASTRUCTURES, April 2008:

Page 18: “The Commission has concluded that even a relatively modest-to-small yield weapon of particular characteristics, using design and fabrication information already disseminated through licit and illicit means, can produce a potentially devastating E1 [prompt gamma ray caused, 10-20 nanoseconds rise time] field strength over very large geographical regions.”

Page 27: “There are about 2,000 ... transformers rated at or above 345 kV in the United States with about 1 percent per year being replaced due to failure or by the addition of new ones. Worldwide production capacity is less than 100 units per year and serves a world market, one that is growing at a rapid rate in such countries as China and India. Delivery of a new large transformer ordered today is nearly 3 years, including both manufacturing and transportation. An event damaging several of these transformers at once means it may extend the delivery times to well beyond current time frames as production is taxed. The resulting impact on timing for restoration can be devastating. Lack of high voltage equipment manufacturing capacity represents a glaring weakness in our survival and recovery to the extent these transformers are vulnerable.”

Pages 30-31: “Every generator requires a load to match its electrical output as every load requires electricity. In the case of the generator, it needs load so it does not overspin and fail, yet not so much load it cannot function. ... In the case of EMP, large geographic areas of the electrical system will be down, and there may be no existing system operating on the periphery for the generation and loads to be incrementally added with ease. ... In that instance, it is necessary to have a “black start”: a start without external power source. Coal plants, nuclear plants, large gas- and oil-fired plants, geothermal plants, and some others all require power from another source to restart. In general, nuclear plants are not allowed to restart until and unless there are independent sources of power from the interconnected transmission grid to provide for independent shutdown power. This is a regulatory requirement for protection rather than a physical impediment. What might be the case in an emergency situation is for the Government to decide at the time.”

Page 33: “Historically, we know that geomagnetic storms ... have caused transformer and capacitor damage even on properly protected equipment.”

Page 42: “Probably one of the most famous and severe effects from solar storms occurred on March 13, 1989. On this day, several major impacts occurred to the power grids in North America and the United Kingdom. This included the complete blackout of the Hydro-Quebec power system and damage to two 400/275 kV autotransformers in southern England. In addition, at the Salem nuclear power plant in New Jersey, a 1200 MVA, 500 kV transformer was damaged beyond repair when portions of its structure failed due to thermal stress. The failure was caused by stray magnetic flux impinging on the transformer core. Fortunately, a replacement transformer was readily available; otherwise the plant would have been down for a year, which is the normal delivery time for larger power transformers. The two autotransformers in southern England were also damaged from stray flux that produced hot spots, which caused significant gassing from the breakdown of the insulating oil.”

Page 45: “It is not practical to try to protect the entire electrical power system or even all high value components from damage by an EMP event. There are too many components of too many different types, manufactures, ages, and designs. The cost and time would be prohibitive. Widespread collapse of the electrical power system in the area affected by EMP is virtually inevitable after a broad geographic EMP attack ...”

Page 88: “The electronic technologies that are the foundation of the financial infrastructure are potentially vulnerable to EMP. These systems also are potentially vulnerable to EMP indirectly through other critical infrastructures, such as the power grid and telecommunications.”

Page 110: “Similar electronics technologies are used in both road and rail signal controllers. Based on this similarity and previous test experience with these types of electronics, we expect malfunction of both block and local railroad signal controllers, with latching upset beginning at EMP field strengths of approximately 1 kV/m and permanent damage occurring in the 10 to 15 kV/m range.”

Page 112: “Existing data for computer networks show that effects begin at field levels in the 4 to 8 kV/m range, and damage starts in the 8 to 16 kV/m range. For locomotive applications, the effects thresholds are expected to be somewhat higher because of the large metal locomotive mass and use of shielded cables.”

Page 115: “We tested a sample of 37 cars in an EMP simulation laboratory, with automobile vintages ranging from 1986 through 2002. ... The most serious effect observed on running automobiles was that the motors in three cars stopped at field strengths of approximately 30 kV/m or above. In an actual EMP exposure, these vehicles would glide to a stop and require the driver to restart them. Electronics in the dashboard of one automobile were damaged and required repair. ... Based on these test results, we expect few automobile effects at EMP field levels below 25 kV/m. Approximately 10 percent or more of the automobiles exposed to higher field levels may experience serious EMP effects, including engine stall, that require driver intervention to correct.”

Page 116: “Five of the 18 trucks tested did not exhibit any anomalous response up to field strengths of approximately 50 kV/m. Based on these test results, we expect few truck effects at EMP field levels below approximately 12 kV/m. At higher field levels, 70 percent or more of the trucks on the road will manifest some anomalous response following EMP exposure. Approximately 15 percent or more of the trucks will experience engine stall, sometimes with permanent damage that the driver cannot correct.”

Page 153: “Results indicate that some computer failures can be expected at relatively low EMP field levels of 3 to 6 kilovolts per meter (kV/m). At higher field levels, additional failures are likely in computers, routers, network switches, and keyboards embedded in the computer-aided dispatch, public safety radio, and mobile data communications equipment. ... none of the radios showed any damage with EMP fields up to 50 kV/m. While many of the operating radios experienced latching upsets at 50 kV/m field levels, these were correctable by turning power off and then on.”
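The susceptibility figures quoted in the extracts above (pages 110–153) can be collected into a small lookup table. The numbers are the report's; the labels, the single-value simplification of the quoted ranges, and the `affected_systems` helper are illustrative summaries of mine, not part of the report:

```python
# EMP susceptibility figures quoted above (field strength in kV/m at which
# upset/effects begin). Labels and helper are illustrative summaries of the
# quoted ranges, not from the report itself.
UPSET_THRESHOLD_KV_PER_M = {
    "railroad signal controllers": 1,   # latching upset from ~1 kV/m
    "computer networks": 4,             # effects begin at 4-8 kV/m
    "dispatch computers": 3,            # failures from 3-6 kV/m
    "trucks": 12,                       # few effects below ~12 kV/m
    "automobiles": 25,                  # few effects below ~25 kV/m
}

def affected_systems(field_kv_per_m):
    """Systems whose quoted upset threshold is at or below the given field."""
    return sorted(name for name, threshold in UPSET_THRESHOLD_KV_PER_M.items()
                  if field_kv_per_m >= threshold)

print(affected_systems(5))
# ['computer networks', 'dispatch computers', 'railroad signal controllers']
```

At a modest 5 kV/m, signal controllers, networked computers, and dispatch equipment are already past their quoted upset thresholds, while vehicles are not; this is the pattern the report's page ordering describes.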

Page 161: “In 1957, N. Christofilos at the University of California Lawrence Radiation Laboratory postulated that the Earth’s magnetic field could act as a container to trap energetic electrons liberated by a high-altitude nuclear explosion to form a radiation belt that would encircle the Earth. In 1958, J. Van Allen and colleagues at the State University of Iowa used data from the Explorer I and III satellites to discover the Earth’s natural radiation belts (J. A. Van Allen and L. A. Frank, “Radiation Around the Earth to a Radial Distance of 107,400 km,” Nature, vol. 183, p. 430, 1959). ... Later in 1958, the United States conducted three low-yield ARGUS high-altitude nuclear tests, producing nuclear radiation belts detected by the Explorer IV satellite and other probes. In 1962, larger tests by the United States and the Soviet Union produced more pronounced and longer lasting radiation belts that caused deleterious effects to satellites then in orbit or launched soon thereafter.”

Above: USSR Test ‘184’ on 22 October 1962, ‘Operation K’ (ABM System A proof tests), a 300-kt burst at 290-km altitude near Dzhezkazgan. The prompt gamma ray-produced EMP induced a current of 2,500 amps, measured by spark gaps, in a 570-km stretch of 500 ohm impedance overhead telephone line to Zharyq, blowing all the protective fuses. The late-time MHD-EMP was of low enough frequency to penetrate 90 cm into the ground, overloading a shallowly buried, lead and steel tape-protected, 1,000-km long power cable between Aqmola and Almaty, firing circuit breakers and setting the Karaganda power plant on fire.

In December 1992, the U.S. Defense Nuclear Agency spent $288,500 on contracting 200 Russian scientists to produce a 17-chapter analysis of effects from the Soviet Union’s nuclear tests, which included vital data on three underwater nuclear tests in the Arctic, as well as three 300 kt high altitude tests at altitudes of 59-290 km over Kazakhstan. In February 1995, two of the military scientists, from the Russian Central Institute of Physics and Technology, lectured on the electromagnetic effects of nuclear tests at Lawrence Livermore National Laboratory. The Soviet Union had first suffered electromagnetic pulse (EMP) damage to electronic blast instruments in their 1949 test. Their practical understanding of EMP damage eventually led them, on Monday 22 October 1962, to detonate a 300 kt missile-carried thermonuclear warhead at an altitude of 290 km (USSR test 184). That was at the very height of the Cold War, and the test was detected by America: at 7 pm that day, President John F. Kennedy, in a live TV broadcast, warned the Soviet Union’s Premier Khrushchev of nuclear war if a nuclear missile was launched against the West, even by accident: ‘It shall be the policy of this nation to regard any nuclear missile launched from Cuba against any nation in the Western hemisphere as an attack by the Soviet Union on the United States, requiring a full retaliatory response upon the Soviet Union.’ That Russian space missile nuclear test, conducted during the Cuban missiles crisis, deliberately used the instrumented civilian power infrastructure of populated, unwarned areas of Kazakhstan to assess EMP effects on a 570 km long civilian telephone line and a 1,000 km civilian electric power cable! This test produced the worst effects of EMP ever witnessed (the more widely hyped 1.4 Mt, 400 km burst STARFISH EMP effects were trivial by comparison, because of the weaker natural magnetic field strength at Johnston Island). The bomb released 10^25 MeV of prompt gamma rays (0.13% of the bomb yield).
The 550 km East-West telephone line was 7.5 m above the ground, with amplifiers every 60 km. All of its fuses were blown by the induced peak current, which reached 2-3 kA at 30 microseconds, as indicated by the triggering of gas discharge tubes. Amplifiers were damaged, and lightning spark gaps showed that the potential difference reached 350 kV. The 1,000 km long Aqmola-Almaty power line was a lead-shielded cable, protected against mechanical damage by spiral-wound steel tape, and buried at a depth of 90 cm in ground of conductivity 10^-3 S/m. It survived for 10 seconds, because the ground attenuated the high frequency field. However, it succumbed completely to the low frequency EMP at 10-90 seconds after the test, since the low frequencies penetrated through 90 cm of earth, inducing an almost direct current in the cable that overheated and set the power supply on fire at Karaganda, destroying it. Cable circuit breakers were only activated when the current finally exceeded the design limit by 30%. This limit was designed for a brief lightning-induced pulse, not for DC lasting 10-90 seconds. By the time they finally tripped, at a 30% excess, a vast amount of DC energy had been transmitted. This overheated the transformers, which are vulnerable to short-circuit by DC. Two later 300 kt Soviet Union space tests, with similar yields but lower altitudes down to 59 km, produced EMPs which damaged military generators.
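The quoted figure that 10^25 MeV of prompt gamma rays represents 0.13% of a 300 kt yield is easy to check with standard unit conversions; a minimal sketch:

```python
# Sanity check of the quoted figure: ~10^25 MeV of prompt gamma rays
# should be about 0.13% of a 300 kt yield. Conversion factors are the
# standard values; the 10^25 MeV figure is from the text above.
MEV_TO_J = 1.602e-13      # joules per MeV
KT_TO_J = 4.184e12        # joules per kiloton of TNT equivalent

gamma_energy_j = 1e25 * MEV_TO_J    # ~1.6e12 J of prompt gammas
total_yield_j = 300 * KT_TO_J       # ~1.26e15 J total yield
fraction = gamma_energy_j / total_yield_j
print(f"prompt gamma fraction = {fraction:.2%}")  # ~0.13%
```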

Above: the STARFISH (1.4 Mt, 400 km detonation altitude, 9 July 1962) detonation, seen from a mountain above the low-level cloud cover on Maui, consisted of a luminous debris fireball expanding in the vacuum of space with a measured initial speed of 2,000 km/sec. (This is 0.67% of the velocity of light and is 179 times the earth's escape velocity. Compare this to the initial upward speed of only 6 times earth's escape velocity, achieved by the 10-cm thick, 1.2 m diameter steel cover blown off the top of the 152 m shaft of the 0.3 kt Plumbbob-Pascal B underground Nevada test on 27 August 1957. In that test, a 1.5 m thick 2 ton concrete plug immediately over the bomb was pushed up the shaft by the detonation, knocking the welded steel lid upward. This was a preliminary experiment by Dr Robert Brownlee called 'Project Thunderwell', which ultimately aimed to launch spacecraft using the steam pressure from deep shafts filled with water, with a nuclear explosion at the bottom; an improvement of Jules Verne's cannon-fired projectile described in De la Terre à la Lune, 1865, where steam pressure would give a more survivable gentle acceleration than Verne's direct impulse from an explosion. Some 90% of the radioactivity would be trapped underground.) The film: 'shows the expansion of the bomb debris from approximately 1/3 msec to almost 10 msec. The partition of the bomb debris into two parts ... is shown; in particular the development of the "core" into an upwards mushroomlike expansion configuration is seen clearly. The fast moving fraction takes the shape of a thick disc. Also the interaction of the bomb debris with the booster at an apparent distance (projected) of approximately 1.5 km is shown.' (Page A1-38 of the quick look report.)
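The bracketed speed comparisons above are simple ratios against standard constants; a minimal check:

```python
# Check of the quoted ratios for the STARFISH debris expansion speed:
# 2,000 km/s versus the speed of light and Earth's surface escape velocity.
C_KM_S = 299_792.458    # speed of light, km/s
V_ESC_KM_S = 11.186     # Earth escape velocity at the surface, km/s
v_debris = 2_000.0      # measured initial expansion speed, km/s

print(f"{v_debris / C_KM_S:.2%} of c")                   # ~0.67%
print(f"{v_debris / V_ESC_KM_S:.0f} x escape velocity")  # ~179
```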

In this side-on view the fireball expansion has a massive vertical asymmetry due to the effects of the device orientation (the dense upward jetting is an asymmetric weapon debris shock wave, due to the missile delivery system and/or the fact that the detonation deliberately occurred with 'the primary and much of the fusing and firing equipment' vertically above the fusion stage, see page A1-7 of the quick look technical report linked here): 'the STARFISH test warhead was inverted prior to the high-altitude test over Johnston Island in 1962 because of concerns that some masses within the design would cause an undesirable shadowing of prompt gamma rays and mask selected nuclear effects that were to be tested.' (April 2005 U.S. Department of Defense Report of the Defense Science Board Task Force on Nuclear Weapon Effects Test, Evaluation, and Simulation, page 29.). The earth's magnetic field also played an immediate role in introducing asymmetric fireball expansion as seen from Maui: 'the outer shell of expanding bomb materials forms ... at ... 1/25 to 1/10 sec, an elongated ellipsoidal shape with the long axis orientated along the magnetic field lines.' (Page A1-12 of the quick look report.)

The STARFISH test as filmed from Johnston Island with a camera pointing upwards could not of course show the vertical asymmetry, but it did show that the debris fireball: 'separated into two parts ... the central core which expands rather slowly and ... an outer spherically expanding shell ... The diameter of the expanding shell is approximately 2 km at 500 microseconds ...' (William E. Ogle, Editor, A 'Quick Look' at the Technical Results of Starfish Prime, August 1962, report JO-600, AD-A955411, originally secret-restricted data, p. A1-7.) Within 0.04-0.1 second after burst, the outer shell - as filmed from Maui in the Hawaiian Islands - had become elongated along the earth's magnetic field, creating an ellipsoid-shaped fireball. Visible 'jetting' of radiation up and southward was observed from the debris fireball at 20-50 seconds, and some of these jets are visible in the late time photograph of the debris fireball at 3 minutes after burst (above right).

The analysis of STARFISH on the right was done by the Nuclear Effects Group at the Atomic Weapons Establishment, Aldermaston, and was briefly published on their website, with the following discussion of the 'patch deposition' phenomena which applied to bursts above 200 km: 'the expanding debris compresses the geomagnetic field lines because the expansion velocity is greater than the Alfven speed at these altitudes. The debris energy is transferred to air ions in the resulting region of tightly compressed magnetic field lines. Subsequently the ions, charge-exchanged neutrals, beta-particles, etc., escape up and down the field lines. Those particles directed downwards are deposited in patches at altitudes depending on their mean free paths. These particles move along the magnetic field lines, and so the patches are not found directly above ground zero. Uncharged radiation (gamma-rays, neutrons and X-rays) is deposited in layers which are centered directly under the detonation point. The STARFISH event (1.4 megatons at 400 km) was in this altitude regime. Detonations at thousands of kilometres altitude are contained purely magnetically. Expansion is at less than the local Alfven speed, and so energy is radiated as hydromagnetic waves. Patch depositions are again aligned with the field lines.'

The Atomic Weapons Establishment site also showed a Monte Carlo model of STARFISH radiation belt development, indicating that the electron belt stretched a third of the way around the earth's equator at 3 minutes, and encircled the earth at 10 minutes. The averaged beta particle radiation flux in the belt was 2 × 10^14 electrons per square metre per second at 3 minutes after burst, falling to a quarter of that at 10 minutes. As time goes on, the radiation belt pushes up to higher altitudes and becomes more concentrated over the magnetic equator. For the first 5 minutes, the radiation belt has an altitude range of about 200-400 km and spans from 27 degrees south of the magnetic equator to 27 degrees north of it. At 1 day after burst, the radiation belt height has increased to the 600-1,100 km zone and the average flux is then 1.5 × 10^12 electrons/m^2/sec. At 4 months the altitude for this average flux (plus or minus a factor of 4) is confined to altitudes of 1,100-1,500 km, and it is covering a smaller latitude range around the magnetic equator, from about 20 degrees north to about 20 degrees south. At 95 years after burst, the remaining electrons will be 2,000 km above the magnetic equator, the latitude range will be only plus or minus 10 degrees from the equator, and the shell will only be 50 km thick.
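The AWE flux figures above imply a decay that slows markedly with time. Treating each interval as a single exponential (my simplification, not the AWE Monte Carlo model) gives rough e-folding times:

```python
import math

# Rough e-folding times implied by the AWE flux figures quoted above,
# treating each interval as one exponential (a simplification of mine).
def efold_minutes(f1, t1, f2, t2):
    """e-folding time for flux decaying from f1 at t1 to f2 at t2 (minutes)."""
    return (t2 - t1) / math.log(f1 / f2)

# 2e14 at 3 min falls to a quarter (5e13) at 10 min; 1.5e12 at 1 day.
tau_early = efold_minutes(2e14, 3, 0.5e14, 10)
tau_late = efold_minutes(0.5e14, 10, 1.5e12, 24 * 60)
print(f"early decay e-folding time ~ {tau_early:.1f} min")   # ~5 min
print(f"later decay e-folding time ~ {tau_late:.0f} min")    # ~408 min
```

The two-orders-of-magnitude slowdown between the first ten minutes and the first day is why the late decay was "slow and hard to measure accurately", as noted below.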

Update: John B. Cladis, et al., “The Trapped Radiation Handbook”, Lockheed Palo Alto Research Laboratory, California, December 1971, AD-738841, Defense Nuclear Agency report DNA 2524H, 746 pages, is available online as a 57 MB PDF download linked here. (The key pages of nuclear test data, under 1 MB download, are linked here.) Page changes (updates) 3-5 are separately available: change 3 (254 pages, 1974), change 4 (137 pages, 1977), and change 5 (102 pages, 1977).

This handbook discusses the Earth’s magnetic field trapping mechanism for electrons emitted by a nuclear explosion at high altitude or in outer space, including some unique satellite-measured maps (Figures 6-15 and 6-16) of the trapped electron radiation belts created by the 1.4 Mt American nuclear test at 400 km altitude on 9 July 1962, Starfish (Injun 1 data for 10 hours after burst and Telstar data for 48 hours). In addition, the handbook includes Telstar satellite-measured maps of the trapped radiation shells for the 300 kt Russian tests at 290 and 150 km altitude on 22 and 28 October 1962 (Figures 6-23 and 6-24). The Russian space bursts were detonated at greater latitudes north than the Starfish burst, which occurred almost directly over Johnston Island, making them more representative of a high altitude burst over most potential targets. On page 6-39 the handbook concludes that 7.5 × 10^25 electrons from Starfish (10 percent of its total emission) were initially trapped in the Earth’s magnetic field to form radiation belts in outer space (the rest were captured by the atmosphere). Page 6-54 concludes that the 300 kt, 290 km burst altitude 22 October 1962 Russian test had 3.6 × 10^25 of its electrons trapped in the radiation belts, while the 300 kt, 150 km altitude shot on 28 October had only one-third as many of its electrons trapped, and the 300 kt, 59 km altitude burst on 1 November had only 1.2 × 10^24 electrons trapped in space. So increasing the height of burst for a given yield greatly increased the percentage of the electrons trapped in radiation belts in space by the Earth’s magnetic field.
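A rough plausibility check on the handbook's Starfish figure: if the 7.5 × 10^25 trapped electrons were 10% of the total beta emission, the implied total is consistent with the fission yield of a multi-stage megaton weapon. The fissions-per-kiloton and betas-per-fission values below are standard approximations, and the calculation is mine, not the handbook's:

```python
# Rough consistency check (assumptions are mine, not from the handbook):
# if 7.5e25 trapped electrons were 10% of the total beta emission,
# what fission yield does the total imply?
FISSIONS_PER_KT = 1.45e23    # standard figure for fission energy release
BETAS_PER_FISSION = 6.0      # approximate total beta decays per fission chain

total_betas = 7.5e25 / 0.10  # 7.5e26 electrons emitted in all
implied_fission_kt = total_betas / BETAS_PER_FISSION / FISSIONS_PER_KT
print(f"implied fission yield ~ {implied_fission_kt:.0f} kt")  # ~860 kt
```

An implied fission yield of order 1 Mt sits comfortably inside the 1.4 Mt total yield of the two-stage Starfish device, so the handbook's electron count is self-consistent.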

The yields and burst heights we give for the 1962 Russian high altitude tests are the Russian data, based on close-in accurate measurements and on the yields of similar bombs under other conditions, released in 1995. The original American data on the Russian tests was relatively inaccurate, since it was based on long-range EMP, air pressure wave, and trapped radiation belt measurements, but it has all recently been declassified by the CIA and is given in the CIA National Intelligence Estimate, July 2, 1963, on pages 43-44: "Joe 157" on 22 October 1962, "Joe 160" on 28 October and "Joe 168" on 1 November were initially assessed by America to be 200 kt, 200 kt and 1.8 Mt, detonated at altitudes of about 297 km, 167 km, and 93 km, respectively. As mentioned, the true yield was 300 kt in all cases and the true heights of burst were 290, 150 and 59 km. This is very interesting, as it indicates how accurately the yield and burst altitude could be determined in the event of an unexpected nuclear test by an enemy, even with 1962 technology. The report also indicates that the Russians carefully scheduled their high altitude tests to be measured by their COSMOS XI satellite:

"A unique feature of all three 1962 high-altitude tests [by Russia] was the apparent planned use of a satellite to collect basic physical data. COSMOS XI passed over the burst point of JOE 157 within minutes of the detonation; it was at the antipodal point for the JOE 160 test at the time of detonation; and it was near the magnetic conjugate point of the JOE 168 detonation at time of burst."

A very brief (11 pages, 839 kb) compilation of the key pages with the vital nuclear test data from the long Trapped Radiation Handbook is linked here. The rate at which the radiation belts diminished with time was slow and hard to measure accurately, and is best determined by computer Monte Carlo simulations like the AWRE code discussed in this post. If the altitude of the “mirror points” (where the Earth’s stronger magnetic field near the poles reflects back the spiralling electrons) dips into the atmosphere, electrons get stopped and captured by air molecules instead of being reflected back into space. Therefore, there is a leakage of electrons at the mirror points, if those points are at low enough altitudes.

When STARFISH was detonated: 'The large amount of energy released at such a high altitude by the detonation caused widespread auroras throughout the Pacific area, lasting in some cases as long as 15 minutes; these were observed on both sides of the equator. In Honolulu an overcast, nighttime sky was turned into day for 6 minutes (New York Times, 10 July 1962). Observers on Kwajalein 1,400 nautical miles (about 2,600 km) west reported a spectacular display lasting at least 7 minutes. At Johnston Island all major visible phenomena had disappeared by 7 minutes except for a faint red glow. The earth's magnetic field [measured at Johnston] also was observed to respond to the burst. ... On 13 July, 4 days after the shot, the U.K. satellite, Ariel, was unable to generate sufficient electricity to function properly. From then until early September things among the satellite designers and sponsors were "along the lines of the old Saturday matinee one-reeler" as the solar panels on several other satellites began to lose their ability to generate power (reference: The Artificial Radiation Belt, Defense Atomic Support Agency, 4 October 1962, report DASA-1327, page 2). The STARFISH detonation had generated large quantities of electrons that were trapped in the earth's magnetic field; the trapped electrons were damaging the solar cells that generated the power in the panels.' (Source: Defense Nuclear Agency report DNA-6040F, AD-A136820, pp. 229-30.)

Above: the conjugate region aurora from STARFISH, 4,200 km from the detonation, as seen from Tongatapu 11 minutes after detonation. (Reference: W. P. Boquist and J. W. Snyder, 'Conjugate Auroral Measurements from the 1962 U.S. High Altitude Nuclear Test Series', in Aurora and Airglow, B. M. McCormac, Ed., Reinhold Publishing Corp., 1967.) A debris aurora caused by fission product ions travelling along magnetic field lines to the opposite hemisphere requires a burst altitude above 150 km, and in the STARFISH test at 400 km some 40% of the fission products were transported south along the magnetic field lines into the conjugate region (50% were confined locally and 10% escaped into space). The resulting colourful aurora was filmed at Tongatapu (21 degrees south) looking north, and it was also seen looking south from Samoa (14 degrees south). The STARFISH debris reached an altitude of about 900 km when passing over the magnetic equator. The debris in the conjugate region behaves like the debris remaining in the burst locale; over the course of 2 hours following detonation, it simply settles back down along the Earth’s magnetic field lines to an altitude of 200 km (assuming a burst altitude exceeding 85 km). Hence, the debris is displaced towards the nearest magnetic pole. The exact ‘offset distance’ depends simply upon the angle of the Earth’s magnetic field lines. The ionisation in the debris region is important, since it can disrupt communications if radio signals need to pass through the region to reach an orbital satellite, and also because it may prevent radar systems from spotting incoming warheads (since radar beams are radio signals, which are attenuated).
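The 'offset distance' geometry described above can be sketched: debris sliding down a straight field line with dip angle I, from burst altitude to the ~200 km settling altitude, is displaced horizontally by the vertical drop divided by tan(I). The 40-degree dip below is an illustrative value, not a measured figure for any test site:

```python
import math

# Geometry of the debris offset described above: material settling down a
# straight field line of dip angle I (from horizontal) drops from burst
# altitude to ~200 km, displacing toward the nearest magnetic pole.
# The 40-degree dip is an illustrative value, not measured test data.
def offset_km(burst_alt_km, dip_deg, settle_alt_km=200.0):
    drop = burst_alt_km - settle_alt_km
    return drop / math.tan(math.radians(dip_deg))

print(f"offset ~ {offset_km(400, 40):.0f} km")  # ~238 km for a 400 km burst
```

Steeper field lines (higher magnetic latitudes) give smaller offsets; near the magnetic equator the shallow dip makes the displacement very large, which is why the patches are "not found directly above ground zero".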

In the Pacific nuclear high altitude megaton tests, communications using ionosphere-reflected high frequency (HF) radio were disrupted for hours at both ends of the geomagnetic field lines which passed through the detonation point. However, today HF is obsolete and the much higher frequencies involved do not suffer so much attenuation. Instead of relying on the ionosphere and conducting ocean to form a reflecting wave-guide for HF radio, the standard practice today is to use microwave frequencies that penetrate right through the normal ionosphere and are beamed back to another area by an orbital satellite. These frequencies can still be attenuated by severe ionisation from a space burst, but the duration of disruption will be dramatically reduced to seconds or minutes.

‘Recently analyzed beta particle and magnetic field measurements obtained from five instrumented rocket payloads located around the 1962 Starfish nuclear burst are used to describe the diamagnetic cavity produced in the geomagnetic field. Three of the payloads were located in the cavity during its expansion and collapse, one payload was below, and the fifth was above the fully expanded cavity. This multipoint data set shows that the cavity expanded into an elongated shape 1,840 km along the magnetic field lines and 680 km vertically across in 1.2 s and required an unexpectedly long time of about 16 s to collapse. The beta flux contained inside the cavity was measured to be relatively uniform throughout and remained at 3 × 10^11 beta particles/cm^2 s for at least 7 s. The plasma continued to expand upward beyond the fully expanded cavity boundary and injected a flux measuring 2.5 × 10^10 beta particles/cm^2 s at H + 34 s into the most intense region of the artificial belt. Measured 10 hours later by the Injun I spacecraft, this flux was determined to be 1 × 10^9 beta particles/cm^2 s.’ - Palmer Dyal, ‘Particle and field measurements of the Starfish diamagnetic cavity’, Journal of Geophysical Research, volume 111, A12211, 2006.

Palmer Dyal was the nuclear test Project Officer and co-author with W. Simmons of Operation DOMINIC, FISH BOWL Series, Project 6.7, Debris Expansion Experiment, U.S. Air Force Weapons Laboratory, Kirtland Air Force Base, New Mexico, POR-2026 (WT-2026), AD-A995428, December 1965:

'This experiment was designed to measure the interaction of expanding nuclear weapon debris with the ion-loaded geomagnetic field. Five rockets on STARFISH and two rockets on CHECKMATE were used to position instrumented payloads at various distances around the burst points. The instruments measured the magnetic field, ion flux, beta flux, gamma flux, and the neutron flux as a function of time and space around the detonations. Data was transmitted at both real and recorded times to island receiving sites near the burst regions. Measurements of the telemetry signal strengths at these sites allowed observations of blackout at 250 MHz ... the early expansion of the STARFISH debris probably took the form of an ellipsoid with its major axis oriented along the earth's magnetic field lines. Collapse of the magnetic bubble was complete in approximately 16 seconds, and part of the fission fragment beta particles were subsequently injected into trapped orbits. ...

‘At altitudes above 200 kilometres ... the particles travel unimpeded for several thousands of kilometres. During the early phase of a high-altitude explosion, a large percentage of the detonation products is ionized and can therefore interact with the geomagnetic field and can also undergo Coulomb scattering with the ambient air atoms. If the expansion is high enough above the atmosphere, an Argus shell of electrons can be formed as in the 1958 and 1962 test series. ... If this velocity of the plasma is greater than the local sound or Alfven speed, a magnetic shock similar to a hydro shock can be formed which dissipates a sizable fraction of the plasma kinetic energy. The Alfven velocity is C = B/(4π × {ion mass density})^1/2, where ... B is the magnetic field ... Since the STARFISH debris expansion was predicted and measured to be approximately 2 × 10^8 cm/sec and the Alfven velocity is about 2 × 10^7 cm/sec, a shock should be formed. A consideration of the conservation of momentum and energy indicates that the total extent of the plasma expansion proceeds until the weapon plasma kinetic energy is balanced by the B^2/(8π) magnetic field energy [density] in the excluded region and the energy of the air molecules picked up by the expanding debris. ... An estimate of the maximum radial extent of the STARFISH magnetic bubble can be made assuming conservation of momentum and energy. The magnetic field swept along by the plasma electrons will pick up ambient air ions as it proceeds outward. ...’
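The Alfven-speed comparison in the quoted passage can be reproduced with representative numbers in Gaussian units. The field strength and ion density below are illustrative assumptions for ~400 km altitude, not measured test data:

```python
import math

# Alfven speed in Gaussian units: v_A = B / sqrt(4*pi*rho), where rho is
# the ambient ion MASS density (g/cm^3). Field and ion-density values are
# illustrative assumptions for ~400 km altitude, not measured test data.
B_GAUSS = 0.3             # geomagnetic field, gauss
N_IONS_PER_CM3 = 1.0e6    # assumed atomic-oxygen ion number density
M_O_GRAMS = 16 * 1.66e-24 # mass of one oxygen ion, grams

rho = N_IONS_PER_CM3 * M_O_GRAMS
v_alfven = B_GAUSS / math.sqrt(4 * math.pi * rho)  # cm/s
v_debris = 2.0e8                                   # cm/s, from the report
print(f"v_A ~ {v_alfven:.1e} cm/s; debris is {v_debris / v_alfven:.0f}x faster")
```

With these inputs the Alfven speed comes out near the report's 2 × 10^7 cm/sec, an order of magnitude below the debris expansion speed, so a magnetic shock forms, exactly as the report concludes.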

Conservation of momentum suggests that the initial outward bomb momentum, M_bomb V_bomb, must be equal to the momentum of the total expanding fireball after it has picked up air ions of mass M_air ions:

M_bomb V_bomb = (M_bomb + M_air ions) V
where V is the velocity of the combined shell of bomb and air ions. The expansion of the ionized material against the earth’s magnetic field slows it down, so that the maximum radial extent occurs when the initial kinetic energy E = (1/2) M_bomb V_bomb^2 has been converted into the potential energy density of the magnetic field which stops its expansion. The energy of the magnetic field excluded from the ionized shell of radius R is simply the volume of that shell multiplied by the magnetic field energy density B^2/(8π). By setting the energy of the magnetic field bubble equal to the kinetic energy of the explosion, the maximum size of the bubble could be calculated, assuming the debris was 100% ionized.
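The bubble-size estimate can be sketched numerically. Setting E = (B^2/(8π)) × (4/3)πR^3 gives R = (6E/B^2)^(1/3) in Gaussian units (erg, gauss, cm). The kinetic-energy fraction and field strength below are illustrative assumptions of mine, not figures from the report:

```python
# Maximum radius of the STARFISH magnetic bubble from the energy balance
# described above: E = (B^2/(8*pi)) * (4/3)*pi*R^3, so R = (6E/B^2)^(1/3)
# in Gaussian units. The 25% kinetic-energy fraction and the field value
# are illustrative assumptions, not figures from the report.
KT_TO_ERG = 4.184e19
E_kinetic = 0.25 * 1400 * KT_TO_ERG  # assume ~25% of 1.4 Mt as debris KE
B_GAUSS = 0.3                        # geomagnetic field at burst altitude

R_cm = (6 * E_kinetic / B_GAUSS**2) ** (1.0 / 3.0)
print(f"bubble radius ~ {R_cm / 1e5:.0f} km")
```

The result is of order 1,000 km, the same order as the measured 1,840 × 680 km cavity reported by Dyal, which is as much agreement as this back-of-envelope balance can claim (the real cavity is elongated along the field lines and loses energy to swept-up air ions).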

For CHECKMATE, they reported: ‘Expansion of the debris was mostly determined by the surrounding atmosphere which had a density of 4.8 × 10^10 particles/cm^3.’

Richard L. Wakefield's curve above, although it suffers from many instrument problems, explained the EMP damage in Hawaii, some 1,300 km from the burst point - see map below. Dr Longmire explained Wakefield's curve by a brand new EMP theory called the 'magnetic dipole mechanism' - a fancy name for the deflection, at high altitudes, of Compton electrons by the Earth's natural magnetic dipole field. The original plan for the Starfish test is declassified here, and the first report on the effects is declassified here. The zig-zag on the measured curve above is just 'ringing' in the instrument, not in the EMP. The inductance, capacitance, and resistance combination of the electronic circuit in the oscilloscope used to measure the EMP evidently had a natural resonance - rather like a ringing bell - at a frequency of 110 MHz, which was set off by the rapid rise in the first part of the EMP and continued oscillating for more than 500 ns. The wavy curve from the instrument is thus superimposed on the real EMP.

Above: raw data released by America so far on the Starfish EMP consists of the graph on the left, based on a measurement by Richard L. Wakefield in a C-130 aircraft 1,400 km East-South-East of the detonation, with a CHAP (code for high altitude pulse) Longmire computer simulation for that curve both with and without instrument response corrections (taken from Figure 9 of the book EMP Interaction, online edition), and the graph on the right, which is Longmire's CHAP calculation of the EMP at Honolulu, 1,300 km East-North-East of the detonation (page 7 of Longmire's report EMP technical note 353, March 1985). By comparing the various curves, you can guess the correct scales for the graph on the left and also what the time-dependent instrument response is.

Above: locations of test aircraft which suffered EMP damage during Operation Fishbowl in 1962. In testimony to 1997 U.S. Congressional Hearings on EMP, Dr. George W. Ullrich, the Deputy Director of the U.S. Department of Defense's Defense Special Weapons Agency (now the DTRA, Defense Threat Reduction Agency), said that the lower-yield Fishbowl tests after Starfish 'produced electronic upsets on an instrumentation aircraft that was approximately 300 kilometers away from the detonations.' The report by Charles N. Vittitoe, 'Did high-altitude EMP (electromagnetic pulse) cause the Hawaiian streetlight incident?', Sandia National Labs., Albuquerque, NM, report SAND-88-0043C; conference CONF-880852-1 (1988), states on page 3: 'Several damage effects have been attributed to the high-altitude EMP. Tesche notes the input-circuit troubles in radio receivers during the Starfish [1.4 Mt, 400 km altitude] and Checkmate [7 kt, 147 km altitude] bursts; the triggering of surge arresters on an airplane with a trailing-wire antenna during Starfish, Checkmate, and Bluegill [410 kt, 48 km altitude] ...'

Below are the prompt EMP waveforms measured in California, 5,400 km away from Starfish (1.4 Mt, 400 km altitude) and Kingfish (410 kt, 95 km altitude) space shots above Johnston Island in 1962:

It is surprising to find that on 11 January 1963, the American journal Electronics, Vol. 36, Issue No. 2, had openly published the distant MHD-EMP waveforms from all five 1962 American high altitude detonations (Starfish, Bluegill, Kingfish, Checkmate, and Tightrope): 'Recordings made during the high-altitude nuclear explosions over Johnston Island, from July to November 1962, shed new light on the electromagnetic waves associated with nuclear blasts. Hydrodynamic wave theory is used to explain the main part of the signal from a scope. The results recorded for five blasts are described briefly. The scopes were triggered about 30 micro-seconds before the arrival of the main spike of the electromagnetic pulse.'

Above: if we ignore the late-time MHD-EMP mechanism (which takes seconds to minutes to peak and has extremely low frequencies), there are three EMP mechanisms at play in determining the radiated EMP as a function of burst altitude. This diagram plots typical peak radiated EMP signals from 1 kt and 1 Mt bombs as a function of altitude for an observer at a fixed distance of 100 km from ground zero. For very low burst altitudes, the major cause of EMP radiation is the asymmetry due to the Earth's surface (there is a net upward Compton current because the ground absorbs the downward-directed gamma rays). This is just like a vertical 'electric dipole' radio transmitter antenna radiating radio waves horizontally (at right angles to the direction of the time-varying current) when the vertical current supplied to the antenna is varied in time. Dolan's DNA-EM-1 states that a 1 Mt surface burst radiates a peak EMP of 1,650 V/m at 7.2 km distance (which falls off inversely with distance at greater distances). As the burst altitude is increased above about 1 km or so, this ground asymmetry mechanism becomes less important, because the gamma rays take 1 microsecond to travel 300 metres and don't reach the ground with much intensity; in any case, by that time the EMP has been emitted by another mechanism of asymmetry, the fall in air density with increasing altitude, which is particularly important for bursts at 1-10 km altitude. Finally, detonations above 10 km altitude send gamma rays into air of low density, so that the Compton electrons have the chance (before hitting air molecules!) to be deflected significantly by the Earth's magnetic field; this 'magnetic dipole' deflection makes them emit synchrotron radiation, which is the massive EMP hazard from space bursts discovered by Dr Conrad Longmire after the Starfish test on 9 July 1962.
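Dolan's surface-burst figure scales simply as 1/distance beyond the quoted reference radius; a minimal sketch (the function name is mine):

```python
# Dolan's DNA-EM-1 surface-burst figure quoted above: 1,650 V/m radiated
# at 7.2 km from a 1 Mt surface burst, falling off as 1/distance beyond
# that. The function name is illustrative.
def surface_burst_emp_v_per_m(distance_km, e_ref=1650.0, r_ref=7.2):
    if distance_km < r_ref:
        raise ValueError("1/R scaling is only quoted beyond the reference radius")
    return e_ref * r_ref / distance_km

print(f"{surface_burst_emp_v_per_m(100):.0f} V/m at 100 km")  # ~119 V/m
```

This weak 1/R fall-off for the radiated field from a surface burst contrasts with the tens of kV/m delivered over continental areas by the magnetic dipole mechanism of a space burst, which is the point of the altitude comparison above.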
After the Starfish EMP was measured by Richard Wakefield, the Americans started looking for 'magnetic dipole' EMP from normal megaton air bursts dropped from B-52 aircraft (at a few km altitude to prevent local fallout). Until then they measured EMP from air bursts using oscilloscopes set to measure EMP with durations of tens of microseconds. By increasing the sweep speed to sub-microsecond times (nanoseconds), they were then able to see the positive pulse of 'magnetic dipole' EMP even in sea level air bursts at relatively low altitude, typically peaking at 18 v/m at 70 nanoseconds for 20 km distance as in the following illustration from LA-2808:

Above: the long-duration, weak field electric-dipole EMP waveform due to vertical asymmetry from a typical air burst, measured 4,700 km from the Chinese 200 kt shot on 8 May 1966.

Because of Nobel Laureate Dr Hans Bethe's errors in predicting the wrong EMP mechanism for high altitude bursts back in 1958 (he predicted the electric dipole EMP, neglecting both the magnetic dipole mechanism and the MHD/auroral EMP mechanisms), almost all the instruments were set to measure a longer and less intense EMP with a different polarization (vertical, not horizontal), and at best they only recorded vertical-looking spikes which went off-scale and provided zero information about the peak EMP. In 1958 tests Teak and Orange, there was hardly any information at all due to both this instrumentation problem and missile errors.

Above: the American 1.4 Mt Starfish test at 400 km altitude, on 9 July 1962, induced large EMP currents in the overhead wires of 30 strings of Oahu streetlights, each string having 10 lights (300 streetlights in all). The induced current was sufficient to blow the fuses. EMP currents in the power lines set off ‘hundreds’ of household burglar alarms and opened many power line circuit breakers. On the island of Kauai, EMP closed down telephone calls to the other islands despite the sturdy electromechanical relay telephone technology of 1962, by damaging the microwave diode in the electronic microwave link used to connect the telephone systems between the different Hawaiian islands (because of the depth of the ocean between the islands, undersea cables were impractical). If the Starfish Prime warhead had been detonated over the northern continental United States, the magnitude of the EMP would have been about 2.4 times larger, because of the stronger magnetic field over the USA which deflects the Compton electrons that produce the EMP, and the much longer power lines there would have picked up far more EMP energy than the short lines of the Hawaiian islands. Furthermore, the 'vacuum tubes' or 'triode valves' commonplace in 1962 (before transistors and microchips), which could survive 1-2 Joules of EMP, have now been completely replaced by modern semiconductor microchips which are millions of times more sensitive to EMP (burning out at typically 1 microJoule of EMP energy or less), simply because they pack millions of times more components into the same space, so the over-heating problem is far worse for a very sudden EMP power surge (rising within a microsecond). Heat can't be dissipated fast enough, so the microchip literally melts or burns out under EMP exposure, while older electronics could take far more punishment. So new electronics is about a million times more vulnerable than that of 1962.

'The time interval detectors used on Maui went off scale, probably due to an unexpectedly large electromagnetic signal ...' - A 'Quick Look' at the Technical Results of Starfish Prime, 1962, p. A1-27.

The illustration of Richard Wakefield's EMP measurement from the Starfish test is based on the unclassified reference, K. S. H. Lee's 1986 book, EMP Interaction. (The earlier, 1980, edition is now online here as a 28 MB download, and it contains the Starfish EMP data.) However, although that reference gives the graph data (including instrument-corrected data from an early computer study called 'CHAP' - Code for High Altitude Pulse - by Longmire in 1974), it omits the scales from the graph for time and electric field, which need to be filled in from another graph declassified separately in Dolan's DNA-EM-1. Full calculations of EMP as a function of burst altitude are also online on pages 33 and 36 of Louis W. Seiler, Jr., A Calculational Model for High Altitude EMP, report AD-A009208, March 1975.

The recently declassified report on Starfish states that Richard L. Wakefield's measurement - the only one at extremely high frequency which measured the peak EMP with some degree of success - was an attempt to measure the time interval between the first and second stage explosions in the weapon (the fission primary produces one pulse of gamma rays, which subsides before the final thermonuclear stage signal). Wakefield's report title (taken from page 44 of the declassified Starfish report) is:

Measurement of time interval from electromagnetic signal received in C-130 aircraft, 753 nautical miles from burst, at 11 degrees 16 minutes North, 115 degrees 7 minutes West, 24,750 feet.

It is really no wonder that it remained secret for so long: the title alone tells you that you can measure not just the emission from the bomb but its internal functioning (the time interval between the primary fission stage and the secondary thermonuclear stage!), just by photographing an oscilloscope with a suitable sweep speed, connected to an antenna, from an aircraft 1,400 km away flying at an altitude of 24,750 feet! The longitude of the measurement is clearly in error, as it doesn't correspond to the stated distance from ground zero. Presumably there is a typing error and the C-130 was at 155 degrees 7 minutes West, not 115 degrees 7 minutes. This would put the position of Wakefield's C-130 some 800 km or so south of the Hawaiian islands at detonation time. The report also shows why all the other EMP measurements failed to record the peak field: they were almost all made in the ELF and VLF frequency bands, corresponding to rise times of milliseconds and seconds, not nanoseconds. They were concentrating on measuring the magnetohydrodynamic (MHD) EMP due to the ionized fireball expansion displacing the Earth's magnetic field, and totally ignored the possibility of a magnetic dipole EMP from the deflection of Compton electrons by the Earth's magnetic field.

Notice that the raw data from Starfish - without correction for the poor response of the oscilloscope's aerial orientation and amplifier circuit to the EMP - indicates a somewhat lower peak electric field at a later time than the properly corrected EMP curve. The true peak was 5,210 v/m at 22 nanoseconds (if this scale is correct; notice that Longmire's reconstruction of the Starfish EMP at Honolulu using CHAP gave 5,600 v/m peaking at 100 ns). The late-time (MHD-EMP) data for Starfish shown is for the horizontal field and is available online in Figure 6 of the arXiv filed report here by Dr Mario Rabinowitz.

Dr Rabinowitz has also compiled a paper here, which quotes some incompetent 'off the top of my head' political testimony from clowns at hearings in the early 1960s, suggesting that Starfish Prime did not detonate over Johnston Island but much closer to Hawaii. In fact the burst position was accurately determined from theodolite cameras to be 16° 28' 6.32" N and 169° 37' 48.27" W (DASA-1251, in the public domain since 1979, gives this, along with the exact burst positions of the other tests; this is not the position of launch or an arbitrary point on Johnston Island but the detonation point). The coordinates of the Johnston Island launch area are 16° 44' 15" N and 169° 31' 26" W (see this site), so Starfish Prime occurred about 16 minutes of latitude (nautical miles) south of the launch pad and about 6 minutes of longitude west of it, i.e., some 32 km from the launch pad (this is confirmed on page 6 of the now-declassified Starfish report available online).
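The launch-pad-to-burst-point arithmetic can be checked with a short script. This is a minimal sketch: the small-offset flat-earth approximation and the 16.5° reference latitude for the longitude correction are my assumptions; the coordinates are those quoted above from DASA-1251.

```python
import math

# One arcminute of latitude is one nautical mile (1.852 km); an arcminute of
# longitude shrinks by cos(latitude). Coordinates quoted in the text above.
launch = (16 + 44/60 + 15/3600, -(169 + 31/60 + 26/3600))       # Johnston Island pad
burst  = (16 + 28/60 + 6.32/3600, -(169 + 37/60 + 48.27/3600))  # Starfish Prime burst

KM_PER_NM = 1.852

dlat_arcmin = (launch[0] - burst[0]) * 60   # burst is south of the pad
dlon_arcmin = (launch[1] - burst[1]) * 60   # burst is west of the pad
north_south_km = dlat_arcmin * KM_PER_NM
east_west_km = dlon_arcmin * math.cos(math.radians(16.5)) * KM_PER_NM

offset_km = math.hypot(north_south_km, east_west_km)
print(f"{dlat_arcmin:.1f}' south, {dlon_arcmin:.1f}' west -> {offset_km:.0f} km")
```

This reproduces the figure in the text: roughly 16 nautical miles south, 6 west, about 32 km in all.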

Hence, Starfish Prime actually detonated slightly further away from Hawaii than the launch pad, not much closer to Hawaii! The detonation point was around 32 km south-south-west of Johnston Island, as well as 400 km up. It is, however, true, as Rabinowitz records, that the 300 streetlights fused in the Hawaiian Islands by Starfish were only 1-3% of the total number. But I shall have more to say about this later on, particularly after reviewing the extensive Russian EMP experience with long shallow-buried power lines and long overhead telephone lines, which Dr Rabinowitz did not know about in 1987 when writing his critical report.

Above: EMP waveform for all times (logarithmic axes) and frequency spectra for a nominal high altitude detonation (P. Dittmer et al., DNA EMP Course Study Guide, Defense Nuclear Agency, DNA Report DNA-H-86-68-V2, May 1986). The first EMP signal comes from the prompt gamma rays of fission, plus gamma rays released within the bomb by the inelastic scatter of neutrons with the atoms of the weapon. For a fission weapon, about 3.5% of the total energy emerges as prompt gamma rays, added to by the gamma rays due to inelastic neutron scatter in the bomb. But despite their high energy (typically 2 MeV), most of these gamma rays are absorbed by the weapon materials and don't escape from the bomb casing. Typically only 0.1-0.5% of the bomb energy is actually radiated as prompt gamma rays (the lower figure applying to massive, old-fashioned, high-yield Teller-Ulam multimegaton thermonuclear weapons with thick outer casings, and the higher figure to lightweight, low-yield weapons with relatively thin outer casings). The next part of the EMP from a space burst comes from the inelastic scatter of neutrons as they hit air molecules. Then, after those neutrons have been slowed down by successive inelastic scatters in the air (each releasing gamma rays), they are finally captured by the nuclei of nitrogen atoms, emitting gamma rays and producing a further EMP signal which adds to the gamma rays from decaying fission product debris. Finally, you get an EMP signal at 1-10 seconds from the magnetohydrodynamic (MHD) mechanism, where the ionized fireball expansion pushes out the earth's magnetic field (which can't enter an electrically-conductive, ionized region) with a frequency of less than 1 Hertz; then the auroral motion of charged particles from the detonation (spiralling along the earth's magnetic field between conjugate points in opposite magnetic hemispheres) constitutes another motion of charge (i.e. a time-varying electric current) which sends out a final EMP at extremely low frequencies, typically 0.01 Hertz.
These extremely low frequencies, unlike the high frequencies, can penetrate substantial depths underground, where they can induce substantial electric currents in very long (over 100 km long) buried cables.
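A rough way to see why only the extremely low frequencies penetrate deeply is the skin-depth formula for a conducting half-space, δ = √(2/(μ₀σω)). The sketch below is a hedged illustration: it assumes a soil conductivity of 0.001 S/m (the value quoted later in this post for the ground along the Russian test cable route) and is valid only at low frequencies, where conduction current dominates displacement current.

```python
import math

# Skin depth delta = sqrt(2/(mu0*sigma*omega)) in the good-conductor
# (quasi-static) limit, for soil of assumed conductivity 1e-3 S/m.
MU0 = 4e-7 * math.pi   # permeability of free space, H/m
SIGMA = 1e-3           # S/m, assumed soil conductivity

def skin_depth_m(freq_hz: float) -> float:
    """Depth at which the field amplitude falls to 1/e, in metres."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2.0 / (MU0 * SIGMA * omega))

for f in (0.01, 1.0, 1e5):
    print(f"{f:>10} Hz: skin depth {skin_depth_m(f)/1000:.3f} km")
```

A 0.01 Hz MHD-EMP signal has a skin depth of well over a hundred kilometres in such soil, while a 100 kHz signal penetrates only tens of metres, which is why the late-time pulse reaches buried cables that the fast pulse cannot.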

Above: the late-time magnetohydrodynamic EMP (MHD-EMP) measured by the change in the natural magnetic field flux density as a function of time after American tests Starfish (1.4 Mt, 400 km burst altitude), Checkmate (7 kt, 147 km burst altitude) and Kingfish (410 kt, 95 km burst altitude) at Johnston Island, below the detonations. The first (positive) pulse in each case is due to the ionized (diamagnetic) fireball expanding and pushing out the earth's magnetic field, which cannot penetrate into a conductive cavity such as an ionized fireball. Consequently, the pushed-out magnetic field lines become bunched up outside the fireball, which means that the magnetic field intensity increases (the magnetic field intensity can be defined by the concentration of the magnetic field lines in space). Under the fireball - as in the case of the data above, measured at Johnston Island, which was effectively below the fireball in each case - there is a patch of ionized air caused by X-rays being absorbed from the explosion, and this patch shields in part the first pulse of MHD-EMP (i.e., that from the expansion of the fireball which pushes out the earth's magnetic field). The second (negative) pulse of the late-time EMP is bigger in the case of the Starfish test, because it is unshielded: this large negative pulse is simply due to the auroral effect of the ionized fireball rising and moving along the earth's magnetic field lines. This motion of ionized fission product debris constitutes a large varying electric current for a high yield burst like Starfish, and as a result of this varying current, the accelerating charges radiate an EMP signal which can peak at a minute or so after detonation.

Above: the measured late-time MHD-EMP at Hawaii, 1,500 km from the Starfish test, was stronger than at Johnston Island (directly below the burst!) because of the ionized X-ray patch of conductive air below the bomb, which shielded Johnston Island. The locations of these patches of ionized air below bursts at various altitudes are discussed in the blog post linked here.
Above: correlation of global measurements of the Starfish MHD-EMP late signal which peaked 3-5 seconds after detonation.

The 3 stages of MHD-EMP:

  1. Expansion of ionized, electrically conducting fireball excludes and so pushes out Earth’s magnetic field lines, causing an EMP. This peaks within 10 seconds. However, the air directly below the detonation is ionized and heated by X-rays so that it is electrically conducting and thus partly shields the ground directly below the burst from the late-time low-frequency EMP.
  2. An MHD-EMP wave then propagates between the ionosphere’s F-layer and the ground, right around the planet.
  3. The final stage of the late-time EMP is due to the aurora effect of charged particles and fission products physically moving along the Earth’s magnetic field lines towards the opposite pole. This motion of charge constitutes a large time-varying electric current which emits the final pulse of EMP, which travels around the world.

MHD-EMP has serious effects on long conductors because its extremely low frequencies (ELF) can penetrate much further into the ground than higher frequencies can, as proved by its effect on a long shallow-buried power line during the nuclear test of a 300 kt warhead at 290 km altitude on 22 October 1962 near Dzhezkazgan in Kazakhstan (part of the Russian ABM System A proof tests). In this test, the prompt gamma-ray EMP induced a current of 2,500 amps, measured by spark gaps, in a 570-km stretch of overhead telephone line out to Zharyq, blowing all the protective fuses. But the late-time MHD-EMP was of special interest because its frequency was low enough to penetrate the 90 cm of soil covering a shallow-buried, lead and steel-tape protected, 1,000-km long power cable between Aqmola and Almaty, overloading it, firing the circuit breakers and setting the Karaganda power plant on fire. The Russian 300 kt test on 22 October 1962 at 290 km altitude (44.84° N, 66.05° E) produced an MHD-EMP magnetic field of 1,025 nT measured at ground zero, 420 nT at 433 km, and 240 nT at 574 km distance. Over ground of conductivity 0.001 S/m, 400 volts was induced in a cable 80 km long, implying an MHD-EMP field of 5 v/km.
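The 5 v/km figure follows directly from the quoted measurement, and scaling it up (under the idealized assumption, mine, of a uniform field along the whole route) indicates the order of magnitude of the end-to-end voltage on the 1,000-km Aqmola-Almaty cable:

```python
# Hedged arithmetic check of the figures above: 400 V induced along an 80 km
# cable implies 5 V/km. Extrapolating the same ELF field over a 1,000 km route
# is an idealization (the field is not really uniform over such distances),
# but it shows the order of magnitude of the end-to-end voltage.
induced_volts = 400.0
cable_km = 80.0

field_v_per_km = induced_volts / cable_km
long_cable_volts = field_v_per_km * 1000.0   # hypothetical 1,000 km route

print(f"MHD-EMP field: {field_v_per_km:.1f} V/km")
print(f"Uniform-field estimate over 1,000 km: {long_cable_volts:.0f} V end-to-end")
```

Several kilovolts driven slowly through a protected power cable is entirely consistent with the reported circuit-breaker firings and transformer/plant damage.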
Above: the incendiary effect of a relatively weak natural analogue of MHD-EMP, the geomagnetic solar storm of 13 March 1989, in saturating the core of a transformer in the Hydro-Quebec electric power grid. Hydro-Quebec lost electric power, cutting the supply of electricity to 6 million people for several hours; it took 9 hours to restore 83% (21.5 GW) of the supply, and 1 million people were still without electric power even then. Two 400/275 kV autotransformers were also damaged in England:

'In addition, at the Salem nuclear power plant in New Jersey, a 1200 MVA, 500 kV transformer was damaged beyond repair when portions of its structure failed due to thermal stress. The failure was caused by stray magnetic flux impinging on the transformer core. Fortunately, a replacement transformer was readily available; otherwise the plant would have been down for a year, which is the normal delivery time for larger power transformers. The two autotransformers in southern England were also damaged from stray flux that produced hot spots, which caused significant gassing from the breakdown of the insulating oil.' - EMP Commission report, 'Critical National Infrastructures', 2008, page 42.

A study of these effects is linked here. Similar effects from the Russian 300 kt nuclear test at 290 km altitude over Dzhezkazgan in Kazakhstan on 22 October 1962 induced enough current in a 1,000 km long protected underground cable to burn the Karaganda power plant to the ground. Dr Lowell Wood testified on 8 March 2005 during Senate Hearings 109-30 that these MHD-EMP effects are: 'the type of damage which is seen with transformers in the core of geomagnetic storms. The geomagnetic storm, in turn, is a very tepid, weak flavor of the so-called slow component of EMP. So when those transformers are subjected to the slow component of the EMP, they basically burn, not due to the EMP itself but due to the interaction of the EMP and normal power system operation. Transformers burn, and when they burn, sir, they go and they are not repairable, and they get replaced, as you very aptly pointed out, from only foreign sources. The United States, as part of its comparative advantage, no longer makes big power transformers anywhere at all. They are all sourced from abroad. And when you want a new one, you order it and it is delivered - it is, first of all, manufactured. They don't stockpile them. There is no inventory. It is manufactured, it is shipped, and then it is delivered by very complex and tedious means within the U.S. because they are very large and very massive objects. They come in slowly and painfully. Typical sort of delays from the time that you order until the time that you have a transformer in service are one to 2 years, and that is with everything working great. 
If the United States was already out of power and it suddenly needed a few hundred new transformers because of burnout, you could understand why we found not that it would take a year or two to recover, it might take decades, because you burn down the national plant, you have no way of fixing it and really no way of reconstituting it other than waiting for slow-moving foreign manufacturers to very slowly reconstitute an entire continent's worth of burned down power plant.'


‘The British Government and our scientists have … been kept fully informed ... the fall-out from these very high-altitude tests is negligible ... the purpose of this experiment is of the greatest importance from the point of view of defence, for it is intended to find out how radio, radar, and other communications systems on which all defence depends might be temporarily put out of action by explosions of this kind.’ – British Prime Minister Harold Macmillan, Statement to the House of Commons, 8 May 1962.

‘Detonations above about 130,000 feet [40 km] produce EMP effects on the ground … of sufficient magnitude to damage electrical and electronic equipment.’ – Philip J. Dolan, editor, Capabilities of Nuclear Weapons, U.S. Department of Defense, 1981, DNA-EM-1, c. 1, p. 19, originally ‘Secret – Restricted Data’ (declassified and released on 13 February 1989).

Above: area coverage by the first (fast, 'magnetic dipole' mechanism) peak EMP and by the second (slow, magnetohydrodynamic or MHD-EMP mechanism) for a 10-20 kt single stage (pure fission) thin-cased burst at 150 km altitude. Both sets of contours are slightly distorted from circles by the effect of the earth's slanting magnetic field (the burst is supposed to occur 500 km west of Washington, D.C.). Notice that the horizon range for this 150 km burst height is 1,370 km, and with the burst location shown that zaps 70% of the electricity consumption of the United States; but if the burst height were 500 km, the horizon radius would be 2,450 km and would cover the entire United States of America. This distance is very important because the peak signal has a rise time of typically 20 ns, which implies a VHF frequency on the order of 50 MHz, which cannot extend past the horizon (although lower frequencies will obviously bounce off the ionosphere and refract, and therefore extend past the horizon). However, if you simply increase the burst altitude, you then need a megaton explosion to avoid diluting the energy, and hence the effects, over the increased area of coverage.
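The horizon figures quoted (1,370 km for a 150 km burst and 2,450 km for a 500 km burst) match the simple spherical-earth arc length R·arccos(R/(R+h)). The sketch below assumes a mean Earth radius of 6,371 km and neglects atmospheric refraction:

```python
import math

# Ground range from the sub-burst point to the line-of-sight horizon, for a
# burst at altitude h over a spherical Earth of radius R (refraction neglected):
# arc length = R * arccos(R / (R + h)).
R_EARTH_KM = 6371.0

def horizon_ground_range_km(alt_km: float) -> float:
    return R_EARTH_KM * math.acos(R_EARTH_KM / (R_EARTH_KM + alt_km))

for h in (150.0, 500.0):
    print(f"burst at {h:4.0f} km altitude: horizon ground range "
          f"{horizon_ground_range_km(h):.0f} km")
```

This gives roughly 1,370 km and 2,440 km respectively, agreeing with the contour map's horizon radii to within rounding.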


In October 1957, Nobel Laureate Dr Hans A. Bethe's report, "Electromagnetic Signal Expected from High-Altitude Test" (Los Alamos Scientific Laboratory report LA-2173, secret-restricted data), predicted incorrectly that only a weak electromagnetic pulse (EMP) would be produced by a nuclear detonation in space or at very high altitude, due to vertical oscillations resulting from the downward-travelling hemisphere of radiation. This is the 'electric dipole' EMP mechanism and is actually a trivial EMP mechanism for high altitude bursts.

Hardtack-Teak, a 3.8 Mt, 50% fission test, missile-carried on 1 August 1958 to 77 km directly over Johnston Island, gave rise to a powerful EMP, but the close-in waveform measurements failed. This was partly due to an error in the missile which caused it to detonate above the island instead of 30 km down range as planned (forcing half a dozen filmed observers at the entrance to the control station to duck and cover in panic; see the official on-line U.S. Department of Energy test film clip), but mainly because of Bethe's false prediction that the EMP would be vertically polarised and very weak (on the order of 1 v/m). Owing to Bethe's error, the EMP measurement oscilloscopes were set to an excessive sensitivity which sent them immediately off-scale:

'The objective was to obtain and analyze the wave form of the electromagnetic (EM) pulse resulting from nuclear detonations, especially the high-altitude shots. ... Because of relocation of the shots, wave forms were not obtained for the very-high-altitude shots, Teak and Orange. During shots Yucca, Cactus, Fir, Butternut, Koa, Holly, and Nutmeg, the pulse was measured over the frequency range from 0 to 10 mega-cycles. ... Signals were picked up by short probe-type antennas, and fed via cathode followers and delay lines to high-frequency oscilloscopes. Photographs of the traces were taken at three sweep settings: 0.2, 2, and 10 micro-sec/cm.

'The shot characteristics were compared to the actual EM-pulse wave-form parameters. These comparisons showed that, for surface shots, the yield, range and presence of a second [fusion] stage can be estimated from the wave-form parameters. EM-pulse data obtained by this project is in good agreement with that obtained during Operation Redwing, Project 6.5.' - F. Lavicka and G. Lang, Operation Hardtack, Project 6.4, Wave Form of Electromagnetic Pulse from Nuclear Detonations, U.S. Army, weapon test report WT-1638, originally Secret - Restricted Data (15 September 1960).

However, the Apia Observatory at Samoa, 3,200 km from the Teak detonation, recorded the ‘sudden commencement’ of an intense magnetic disturbance – four times stronger than any recorded due to solar storms – followed by a visible aurora along the earth’s magnetic field lines (reference: A.L. Cullington, Nature, vol. 182, 1958, p. 1365). [See also: D. L. Croom, ‘VLF radiation from the high altitude nuclear explosions at Johnston Island, August 1958,’ J. Atm. Terr. Phys., vol. 27, p. 111 (1965).]

The expanding ionised (thus conductive, and hence diamagnetic) fireball excluded and thus ‘pushed out’ the Earth’s natural magnetic field as it expanded, an effect called ‘magnetohydrodynamic (MHD) EMP’. But it was on 9 July 1962, during the American Starfish shot - a 1.4 Mt warhead missile-carried to an altitude of 400 km - that EMP damage was seen over 1,300 km away to the east, and the Starfish space burst EMP waveform was measured by Richard Wakefield. Cameras were used to photograph oscilloscope screens, showing the EMP pickup in small aerials. Neither Dr Bethe’s downward current model nor the MHD-EMP model explained the immense peak EMP. In 1963, Dr Conrad Longmire at Los Alamos argued that, in low-density air, electrons knocked from air molecules by gamma rays travel far enough to be greatly deflected by the earth’s magnetic dipole field. Longmire's theory is therefore called the 'magnetic dipole' EMP mechanism, to distinguish it from Bethe's 'electric dipole' mechanism.

[Illustration credit: Atomic Weapons Establishment, Aldermaston, http://www.awe.co.uk/main_site/scientific_and_technical/featured_areas/dpd/computational_physics/nuclear_effects_group/electromagnetic_pulse/index.html (this page has been removed since it was accessed in 2006).]

Dr Longmire showed that the successive, sideways-deflected Compton-scattered electrons cause an electromagnetic field that adds up coherently (it travels in step with the gamma rays causing the Compton current), until ‘saturation’ is reached at ~ 60,000 v/m (when the strong field begins to attract electrons back to positive charges, preventing further increase). It is impossible to produce a 'magnetic dipole' EMP from a space burst which exceeds 65,000 v/m at the Earth's surface, no matter if it is a 10 Mt detonation at just 30 km altitude over the magnetic equator. The exact value of the saturation field depends on burst altitude. See pages 33 and 36 of Louis W. Seiler, Jr., A Calculational Model for High Altitude EMP, report AD-A009208, March 1975.

Many modern nuclear warheads with thin cases would produce weaker EMP, because of pre-ionisation of the atmosphere by the X-rays released by the primary fission stage before the major gamma ray emission from the final stage of the weapon. An EMP cannot be produced efficiently in ionised (electrically conducting) air, which literally shorts out the EMP very quickly. This means that many thermonuclear weapons with yields of around 100 kilotons would produce saturation electric fields on the ground of only 15,000-30,000 v/m if detonated in space. For more about this, see Dr Michael Bernardin's testimony to the U.S. Congress:

'I speak as a weapons designer with specialized knowledge in electromagnetic pulse. Since 1996, I have been the provost for the Postgraduate Nuclear Weapon Design Institute within the laboratory chartered with training the next generation of nuclear weapon designers. The issue to be addressed this morning is the impact of a high-altitude nuclear detonation over the United States to the civilian and military infrastructure. A high-altitude nuclear detonation would produce an electromagnetic pulse that would cover from one to several million square miles, depending on the height of burst, with electric fields larger than those typically associated with lightning. In such an event, would military equipment deployed within the area of EMP exposure be seriously impaired? Would civilian communications, the power grid and equipment connected to the power grid catastrophically fail? The answers to these questions depend on three elements: One, the types of threat weapons deployed; two, the EMP produced by these weapons; and three, the effects that are caused by EMP. The Defense Intelligence Agency (DIA) and the Central Intelligence Agency (CIA) identify current and projected nuclear weapon threats and provide inputs to the Department of Energy nuclear design labs, Los Alamos and Livermore National Laboratories, who model foreign nuclear weapons. The labs each have over 25 years of experience in performing this type of modeling. The weapon models serve as a basis for associated EMP threat assessments. For the purpose of EMP assessment, it is convenient to group the threat weapons into the following five categories: One, single-stage fission weapons; two, single-stage boosted weapons; three, nominal two-stage thermonuclear weapons with yields up to a few megatons; four, two-stage thermonuclear weapons with yields over a few megatons; and five, special technology thermonuclear weapons. 
...The ionization shorts out the EMP, limiting its value to typically 30,000 volts per meter. High-energy x-rays are also produced by the exploding weapon and can enhance the ionization in the high-altitude EMP source region. This source of ionization was largely ignored in EMP assessments until 1986. The inclusion of the X-rays lowered the assessed values of the peak field for many weapons. Note in graphic three that the thermonuclear weapon consists of two stages, a primary stage, which is typically of relatively low yield and is used to drive the secondary stage, which produces a relatively large yield. Each weapons stage produces its own EMP signal, but the primary stage gamma rays, after they go out, leave behind an ionized atmosphere from their EMP generation that is present when the secondary stage gamma rays arrive a moment later. Thus, the primary stage can degrade the EMP associated with the secondary stage.'

Dr William Graham, the President and CEO of National Security Research, then testified:

'By way of background, I have worked in EMPs since 1962, when I was a lieutenant at the Air Force weapons lab, handed a dataset taken from the last atmospheric and Pacific exoatmospheric nuclear test series, and asked to try to explain some very strange-looking phenomena that had been observed. Fortunately, we had the benefit of colleagues at Livermore, Los Alamos and other places in doing this, and the theory of high-altitude EMP, and, in fact, all EMP was developed over the next decade or so. Interestingly, though, like many important scientific discoveries, the intense electromagnetic pulse produced by the exoatmospheric nuclear weapon explosion was discovered by accident. It was first observed both directly and by its effects on civilian systems during the exoatmospheric nuclear test series we had conducted, primarily the Fishbowl series [tests Starfish, Checkmate, Bluegill, Kingfish] in the beginning of the 1960s. However, the theory that was being used at the time to predict the effect had been incorrectly derived by a Nobel laureate [Bethe] actually and caused all of the instrumentation on monitoring those exoatmospheric tests to be set at far too low a scale, far too sensitive a level, so that the data on the scope tended to look like vertical lines. We couldn't see the peak amplitudes that were being produced, and it was Conrad Longmire of Los Alamos National Laboratory who, after looking at the data, figured out what was really happening.'

In those same U.S. Congressional Hearings of October 1999, Dr Lowell Wood, of Lawrence Livermore National Laboratory, explained the effects of EMP as then known from Starfish test experience:

'I am grateful for the invitation to appear today. Like Dr. Graham, my esteemed senior colleague, I also commenced EMP studies in 1962, as my graduate advisor Willard Libby had recently retired from a long term of service as the Commissioner of the Atomic Energy Commission, and he assigned me EMP analysis problems kind of as exercises for the students, as he was then very keenly concerned by them.

'Indeed, electromagnetic pulses, EMP, generated by high-altitude nuclear explosions have riveted the attention of the military nuclear technical community for more than three and a half decades since the first comparatively modest one very unexpectedly and abruptly turned off the lights over a few million square miles of the mid-Pacific. This EMP also shut down radio stations and street-lighting systems, turned off cars, burned out telephone systems and wreaked other technical mischief throughout the Hawaiian Islands nearly 1,000 miles distant from ground zero.'

However, Dr Wood is not very specific when mentioning damage to radio stations and telephone systems. Dr John Malik notes on page 31 of Herman Hoerlin's Los Alamos National Laboratory report LA-6405, United States High Altitude Test Experiences:

'Starfish produced the largest fields of the high-altitude detonations; they caused outages of the series-connected street-lighting systems of Oahu (Hawaii), probable failure of a microwave repeating station on Kauai, failure of the input stages of ionospheric sounders and damage to rectifiers in communication receivers. Other than the failure of the microwave link, no problem was noted in the telephone system. No failure was noted in the telemetry systems used for data transmission on board the many instrumentation rockets.

'There was no apparent increase in radio or television repairs subsequent to any of the Johnston Island detonations. The failures observed were generally in the unprotected input stages of receivers or in rectifiers of electronic equipment; transients on the power line probably caused the rectifier failures. There was one failure in the unprotected part of an electronic system of the LASL Optical Station on top of Mount Haleakala on Maui Island. With the increase of solid-state circuitry over the vacuum-tube technology of 1962, the susceptibility of electronic equipment will be higher, and the probability of more problems for future detonations will be greater. However, if detonations are below line-of-sight, the fields and therefore system problems will be much smaller.'

In addition to the July 1962 Hawaiian experience of EMP-induced equipment failures - including some anecdotal evidence of car ignition systems fusing (modern microprocessor-controlled vehicles would be more vulnerable) - some severe Russian EMP damage occurred in ‘Operation K’ (ABM System A proof tests) of 1962. On 22 October - during the Cuban missile crisis - Russia detonated a 300 kt missile warhead at 290 km altitude. Prompt EMP fused 570 km of overhead telephone line west from Zharyq, then MHD-EMP started a fire at the Karaganda power plant and shut down 1,000 km of buried civilian power cables between Aqmola and Almaty. Russian Army diesel electricity generators were burned out by EMP after 300 kt tests at altitudes of 150 km on 28 October and 59 km on 1 November.

America produces two classified reports on nuclear weapons effects: a 'red book' of foreign threats and a 'blue book' of its own nuclear weapons radiation output data. See page 27 of the candid April 2005 U.S. Department of Defense Report of the Defense Science Board Task Force on Nuclear Weapon Effects Test, Evaluation, and Simulation. Page 29 says:

'The flux or fluence of prompt gammas, neutrons and X-rays is by no means isotropic about the burst point of a high-altitude detonation. Clumps of materials (thrusters, gas bottles, propellant tanks, firing units, etc., for example) surround a warhead in a non-symmetric fashion and make radiation output estimation inherently three-dimensional. In realistic situations, some warhead components will shield the prompt radiations from other components, creating a large shadow cone in a preferential direction.

'For example, the Starfish test warhead was inverted prior to the high-altitude test over Johnston Island in 1962 because of concerns that some masses within the design would cause an undesirable shadowing of prompt gamma rays and mask selected nuclear effects that were to be tested. In another example, a nuclear driven kinetic kill warhead (that destroys a reentry vehicle with steel pellets) will have a very low yield-to-mass ratio, which will greatly suppress the X-ray output. The Russians reported on their 1962 high-altitude testing of such a device at an International Conference on Electromagnetic Effects in 1994 held in Bordeaux, France.'

This is far more candid than the older data released here and here.

In addition, in testimony to 1997 U.S. Congressional Hearings, Dr. George W. Ullrich, the Deputy Director of the U.S. Department of Defense's Defense Special Weapons Agency (now the DTRA, Defence Threat Reduction Agency) said:

'... Enrico Fermi ... prior to the Trinity Event, first predicted that nuclear explosions were capable of generating strong electromagnetic fields. ... A less well known effect of high altitude bursts, but also one with potentially devastating consequences, is the artificial 'pumping' of the Van Allen belt with large numbers of electrons. The bomb-induced electrons will remain trapped in these belts for periods exceeding one year. All unhardened satellites traversing these belts in low earth orbit could demise in a matter of days to weeks following even one high altitude burst. ...

'One of our earliest experiences with HEMP dates back to the resumption of atmospheric nuclear testing in 1962 following a three year testing moratorium. Starfish Prime, a 1.4 megaton device, was detonated at an altitude of 400 kilometers over Johnston Island. Failures of electronic systems resulted in Hawaii, 1,300 kilometers away from the detonation. Street lights and fuzes failed on Oahu and telephone service was disrupted on the island of Kauai. Subsequent tests with lower yield devices [410 kt Kingfish at 95 km altitude, 410 kt Bluegill at 48 km altitude, and 7 kt Checkmate at 147 km] produced electronic upsets on an instrumentation aircraft [presumably the KC-135 that filmed the tests from above the clouds?] that was approximately 300 kilometers away from the detonations.

'Soviet scientists had similar experiences during their atmospheric test program. In one test, all protective devices in overhead communications lines were damaged at distances out to 500 kilometers; the same event saw a 1,000 kilometer segment of power line shut down by these effects. Failures in transmission lines, breakdowns of power supplies, and communications outages were wide-spread.

'... a 50 kiloton (KT) weapon detonated at a 120 km altitude (75 miles) can produce electron densities several orders of magnitude higher than the natural electron environment in low earth orbit. These elevated electron densities can last for months to years and significantly increase the total ionizing dose accumulated by space assets that transit these belts. This increase in total dose accumulation can dramatically shorten the lifetime of satellite systems. Projected lifetimes of up to ten years can be reduced to a mere two months after such an event.

'EMP does not distinguish between military and civilian systems. Unhardended systems, such as commercial power grids, telecommunications networks, and computing systems, remain vulnerable to widespread outages and upsets ... While DoD hardens assets it deems vital, no comparable civil program exists. Thus, the detonation of one or a few high-altitude nuclear weapons could result in devastating problems for the entire U.S. commercial infrastructure. Some detailed network analyses of critical civil systems would be useful to better understand the magnitude of the problem and define possible solution paths.'

However, some claim that EMP is an exaggerated threat. It is true that the 300 streetlights which failed on Oahu were only a small fraction (around 1-3%) of the total number of streetlights in the Hawaiian islands, but you have to remember that the small size of the islands meant that the conductors were similarly limited in length. The Russian experience of tests over land shows that the worst effects occur in electrical and electronics equipment connected to very long power transmission or telephone lines, which did not exist in the Hawaiian Islands. In addition, the threat is worse today than in 1962, because a microchip is roughly a million times more vulnerable to a power surge than the thermionic valves used in electronics in 1962.

The claim http://www.alternet.org/story/25738/ makes about EMP from a 10-20 kt fission bomb being proportionately weaker than that from the 1.4 Mt Starfish test is blatant nonsense. The formula for EMP, even neglecting saturation, shows that the peak electric field varies as the square root of the weapon yield divided by the distance from the burst. Hence, a 100-fold increase in yield only increases the EMP at a given distance by a factor of 10, even when you neglect saturation.

When you include saturation, the difference is even smaller. Saturation introduces an exponential limiting of the form E = Y[1 - exp{-(X/Y)^2}]^{1/2}, where X is the peak EMP predicted by the simple law that ignores saturation, and Y is the saturation field (Y ~ 60,000 V/m). (When X is very large, the exponential disappears, so this formula reduces to the saturation value E = Y; but when X is very small, the formula reduces to E = X, the weak-field limit. The reason for the square and square-root powers appearing, instead of just E = Y[1 - exp{-(X/Y)}], is that at the time of peak EMP the air conductivity is proportional to the square root of the Compton current. I'll return to this mathematical model in a later post. In the meantime, see the full calculations of EMP as a function of burst altitude online in Louis W. Seiler, Jr., A Calculational Model for High Altitude EMP, report AD-A009208, March 1975.)
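As a quick numerical illustration of this saturation formula (a sketch only: the unsaturated field X is treated here as a free input, not computed from weapon parameters):

```python
import math

E_SAT = 60_000.0  # saturation field Y in volts/metre (the approximate value quoted above)

def saturated_emp_field(x, y=E_SAT):
    """Peak EMP field with saturation: E = Y * [1 - exp(-(X/Y)^2)]^(1/2).

    x: peak field (V/m) predicted by the simple unsaturated scaling law.
    y: saturation field (V/m).
    """
    return y * math.sqrt(1.0 - math.exp(-(x / y) ** 2))

# Weak-field limit (X << Y): E is essentially equal to X.
print(round(saturated_emp_field(1_000.0)))    # close to 1000 V/m

# Strong-field limit (X >> Y): E saturates at Y.
print(round(saturated_emp_field(600_000.0)))  # 60000 V/m
```

Both limiting cases described in the text drop straight out of the exponential: for small X/Y the bracket expands to (X/Y)^2, giving E = X, while for large X/Y the exponential vanishes, giving E = Y.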

Still another factor you have to take into account is that Philip J. Dolan's formerly classified Capabilities of Nuclear Weapons, DNA-EM-1, chapters 5 and 7, show that the prompt gamma ray yield fraction was only 0.1% for Starfish but can be 0.5% for less efficient low-yield pure fission devices, depending on the design.

Hence a 10-20 kt fission weapon, because it has a thinner case than a massive x-ray coupled 1.4 Mt thermonuclear weapon (Starfish), can release up to 5 times as much prompt gamma ray energy per kiloton of yield, and it is the prompt gamma rays which cause the peak EMP. Taking all factors into account, it is easy to design a 10-20 kt fission weapon which produces the same peak EMP as Starfish if you reduce the burst altitude slightly (the area covered will still be massive). Another advantage is that, because you are only dealing with a single-stage design, there is no danger of pre-ionisation of the atmosphere.
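The arithmetic behind this argument can be sketched quickly (an illustration only: the scaling E proportional to the square root of the prompt gamma yield at fixed distance is the simple unsaturated law discussed above, the 15 kt figure is a representative mid-range value I have chosen, and burst-altitude and saturation effects are ignored):

```python
import math

def relative_peak_field(total_yield_kt, prompt_gamma_fraction):
    """Relative unsaturated peak field at a fixed distance, taken as
    proportional to sqrt(prompt gamma ray yield in kt)."""
    return math.sqrt(total_yield_kt * prompt_gamma_fraction)

starfish = relative_peak_field(1400, 0.001)  # 1.4 Mt, 0.1% prompt gamma yield
fission = relative_peak_field(15, 0.005)     # 15 kt fission, 0.5% prompt gamma yield

print(f"fission : Starfish peak field ratio = {fission / starfish:.2f}")
```

At equal distance the small weapon's unsaturated field comes out at roughly a quarter of Starfish's, which is exactly why the burst altitude (and hence the distance to the ground) must be reduced somewhat for the fission weapon to match the Starfish peak field on the ground.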

If gamma or x-rays from the first stage deposit much energy in the atmosphere, they cause ionisation and hence a rise in the conductivity of the air, which literally 'shorts out' much of the Compton current responsible for the EMP from the second pulse of gamma rays (see Dr Bernardin's comment, quoted above). Dr Mario Rabinowitz was censored in the early 1980s after writing a paper on this. (By email dated November 19, 2006 6:42 PM, he kindly confirmed to me: 'I actually did this work in the very early 80's. The forces that be suppressed release of my EPRI report, and prevented publication of my work until 1987. I even have a galley of my paper in Science which managed to get through their tough review process. It was about a week before being published, when it was killed.')

Dr Bernardin rediscovered this in a classified report dated 1986, refining the calculations to quantify precisely how primary-stage gamma and x-rays reduce the main EMP by pre-ionizing the atmosphere. Dr Rabinowitz independently published a general discussion in 1987, in his less weapons-sensitive, unclassified report in an IEEE journal, where he also notes that you can't use several EMP weapons together, or they will interfere with each other and reduce the total EMP.

So nuclear terrorism using EMP from one single-stage low-yield fission weapon is a very real threat. Unfortunately, Dr Lowell Wood did not explain these facts when asked, so the media ignored the report's vague (i.e., unscientific, as in lacking actual nuclear test data to validate claims) warning of EMP:

'Wood refused, however, to respond to questions about whether weapons capable of doing such damage are technologically possible and within reach of so-called “rogue” states and terrorists he said might pose a threat. “You seriously don’t expect answers in an unclassified [setting] to those sorts of questions?” he said.'

The media justifiably reported this poor answer under the banner 'Plausibility of EMP Threat Classified, Expert Says'. Why should the media believe severe claims without seeing hard nuclear test evidence and rigorous mathematical physics to back them up?

See the recent non-technical U.S. Congress sponsored discussion: Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack, Volume 1: Executive Summary, July 22, 2004. This unclassified volume of the report doesn't contain any science, but it does have colourful maps with circles to illustrate how much of America would be covered by EMP for different heights of burst, and so forth. The accompanying 2004 EMP hearings discuss the politics, such as an outrageous threat allegedly made by Vladimir Lukin, the former Russian Ambassador to the U.S., who told the Americans in Vienna in May 1999: 'we have the ultimate ability to bring you down [by EMP]'. (It was this alleged threat, or warning, or whatever you'd call it, that prompted the new American congressional EMP concerns.)

Appendix A of the July 2004 Commission EMP report quotes from Thomas C. Schelling's Foreword to Roberta Wohlstetter's book Pearl Harbor: Warning and Decision, Stanford University Press, 1962, p. vii:

'[There is] a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered looks strange; what looks strange is therefore improbable; what seems improbable need not be considered seriously.'

This is true. Even when Hitler massed 100 divisions on the Soviet Union's border in 1941, Stalin was dismissive of all reports of Nazi preparations to invade the Soviet Union (because the Nazi-Soviet peace pact of 1939 had created a false sense of security in the USSR). Herman Kahn explained in On Thermonuclear War (1960) how Pearl Harbor, Oahu, Hawaii (by coincidence also the centre of unpredicted EMP damage in the 1962 Starfish nuclear test) was supposedly immune from attack, because its harbour was shallower than the textbook-stated minimum water depth for a torpedo run. The Japanese simply made special torpedoes for the attack on the U.S. Pacific Fleet in 1941. (Even when America received advance warning of the attack, wishful thinking dismissed the warning as an error, so no alert was passed on, and the scale of the tragedy was maximised.)

Above: the 2004 Commission report on EMP includes this map of the solar storm of 13 March 1989, which had effects similar to a weak MHD-EMP and the auroral EMP (caused by the fraction of the debris fireball and charged-particle radiation which moves along magnetic field lines between conjugate points in opposite hemispheres). For example, the 1989 event overloaded and collapsed the Hydro-Québec power grid. Page 12 says:

'During the Northeast power blackout of 1965, Consolidated Edison generators, transformers, motors, and auxiliary equipment were damaged by the sudden shutdown. In particular, the #3 unit at the Ravenswood power plant in New York City suffered damage when the blackout caused loss of oil pressure to the main turbine bearing. The damage kept that unit out of service for nearly a year, and more immediately, complicated and delayed the restoration of service to New York City.'

There is a 2004 U.S. Army review of EMP by Thomas C. Riddle online: NUCLEAR HIGH ALTITUDE ELECTROMAGNETIC PULSE – IMPLICATIONS FOR HOMELAND SECURITY AND HOMELAND DEFENSE. There is also a U.S. Army EMP protection Technical Manual (TM 5-590) here, and Dr Glen A. Williamson, who was at Kwajalein Atoll when Starfish was detonated in 1962, has an informed page about EMP protection here.

But it is not even all one-sided doom and gloom! Lawrence Livermore National Laboratory in its February 1992 Energy and Technology Review was considering ‘EMP warheads for nonlethal attacks of targets with sensitive electronics.’ So it is even possible that the Allies could be the first to use this new effect for peaceful and safe conflict resolution, as I suggested in the November 1994 issue of Electronics World.

Pages 19-21 of A 'Quick Look' at the Technical Results of Starfish Prime, August 1962, state:

'At Kwajalein, 1400 miles to the west, a dense overcast extended the length of the eastern horizon to a height of 5 or 8 degrees. At 0900 GMT a brilliant white flash burned through the clouds rapidly changing to an expanding green ball of irradiance extending into the clear sky above the overcast. From its surface extruded great white fingers, resembling cirro-stratus clouds, which rose to 40 degrees above the horizon in sweeping arcs turning downward toward the poles and disappearing in seconds to be replaced by spectacular concentric cirrus like rings moving out from the blast at tremendous initial velocity, finally stopping when the outermost ring was 50 degrees overhead. They did not disappear but persisted in a state of frozen stillness. All this occurred, I would judge, within 45 seconds. As the greenish light turned to purple and began to fade at the point of burst, a bright red glow began to develop on the horizon at a direction 50 degrees north of east and simultaneously 50 degrees south of east expanding inward and upward until the whole eastern sky was a dull burning red semicircle 100 degrees north to south and halfway to the zenith obliterating some of the lesser stars. This condition, interspersed with tremendous white rainbows, persisted no less than seven minutes.

'At zero time at Johnston, a white flash occurred, but as soon as one could remove his goggles, no intense light was present. A second after shot time a mottled red disc was observed directly overhead and covered the sky down to about 45 degrees from the zenith. Generally, the red mottled region was more intense on the eastern portions. Along the magnetic north-south line through the burst, a white-yellow streak extended and grew to the north from near zenith. The width of the white streaked region grew from a few degrees at a few seconds to about 5-10 degrees in 30 seconds. Growth of the auroral region to the north was by addition of new lines developing from west to east. The white-yellow auroral streamers receded upward from the horizon to the north and grew to the south and at about 2 minutes the white-yellow bands were still about 10 degrees wide and extended mainly from near zenith to the south. By about two minutes, the red disc region had completed disappearance in the west and was rapidly fading on the eastern portion of the overhead disc. At 400 seconds essentially all major visible phenomena had disappeared except for possibly some faint red glow along the north-south line and on the horizon to the north. No sounds were heard at Johnston Island that could be definitely attributed to the detonation.

'Strong electromagnetic signals were observed from the burst, as were significant magnetic field disturbances and earth currents.'

Update: The DVD called Nukes in Space: the Rainbow Bombs (Narrated by William Shatner), contains an interview comment by Dr Byron Ristvet of the U.S. Defense Threat Reduction Agency who states that either the 1958 Teak or Orange shot caused unexpected EMP induced power cuts on Oahu in the Hawaiian Islands:

'As it was, one of those two high altitude shots [Teak and Orange, August 1958] did affect the power grid on Oahu, knocking out quite a bit of it. That was unexpected.'

Oahu is 71 km long by 48 km wide, and power cables could have picked up significant EMP, especially the MHD-EMP effect caused by fireball expansion. However, this is surmise. Why is the U.S. Defense Threat Reduction Agency being coy over their EMP effects data? Which test did this? Why not say "TEAK knocked out part of the power grid on Oahu"? Why secrecy?

Obviously the one factor against the 3.8 Mt TEAK shot causing damage in Hawaii was that its burst altitude of only 77 km placed it below the horizon as seen from Hawaii, cutting off the highest frequencies of the EMP from reaching Hawaii directly, although the rising fireball later appeared over the horizon as it gained altitude. However, a very useful Norwegian report on EMP seems to state that TEAK in 1958 had some effects similar to those of STARFISH:

'Spesielle sikringer som skulle beskytte disse lamper ble ødelagt. Ved en eksplosjon samme sted i 1958 på ca. 4 Mt i en høyde av 77 km (Teak) ble det også angitt at det oppsto feil på elektrisk utstyr i Hawaii (24).'

[‘Special fuses which were supposed to protect these lamps were destroyed. For an explosion at the same place in 1958, of approximately 4 Mt at a height of 77 km (Teak), it was also reported that faults occurred in electrical equipment in Hawaii (24).’]

Reference (24) is to two reports: EMP threat and protective measures, US Office of Civil Defence, report TR-61 A, US OCD (1971) and EMP and associated effects on power, communications and command and control systems, report JES-107-1M-12-63, Joslyn Electronic Systems (1963).
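The line-of-sight point made above about Teak's 77 km burst altitude can be verified with elementary geometry (a sketch only: it ignores atmospheric refraction and uses a mean Earth radius; the roughly 1,300 km Johnston-to-Hawaii distance is the figure quoted earlier in Ullrich's testimony):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def horizon_range_km(burst_altitude_km):
    """Great-circle distance from the point directly below the burst to
    where the burst sits on the geometric horizon (no refraction)."""
    r = EARTH_RADIUS_KM
    return r * math.acos(r / (r + burst_altitude_km))

print(f"Teak, 77 km altitude:      horizon at {horizon_range_km(77):.0f} km")
print(f"Starfish, 400 km altitude: horizon at {horizon_range_km(400):.0f} km")
```

With Hawaii some 1,300 km from Johnston Island, Teak (horizon at roughly 990 km) was indeed below the direct line of sight from Hawaii, while Starfish (horizon beyond 2,000 km) was well above it, which is consistent with the difference in directly radiated EMP effects.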

Another example: the sanitized report ITR-1660-(SAN), Operation Hardtack: Preliminary Report, Technical Summary of Military Effects Programs 1-9, DASA, Sandia Base, Albuquerque, New Mexico, 23 September 1959, sanitized version 23 February 1999.

Page 347 of ITR-1660-(SAN) shows that the first American measurement of high-altitude EMP was made not at Starfish in 1962 (as Dr Conrad Longmire claimed), but at the 2 kt Yucca test in 1958. (The Teak shot EMP measurements failed because the shot went off directly overhead, instead of 20 miles downrange, due to a missile guidance error.) They only measured the beta ionisation, which affects radio/radar transmissions for hours; but it is the brief, high-frequency EMP which causes physical damage to equipment. Although Yucca was of too low a yield to cause EMP damage, oscilloscopes in 1958 did record the intense, high-frequency magnetic dipole EMP mechanism which caused the damage in the higher-yield (1.4 Mt) Starfish test of 1962:

'Shot Yucca ... [EMP] field strength at Kusaie indicated that deflection at Wotho would have been some five times the scope limits... The wave form was radically different from that expected. The initial pulse was positive, instead of the usual negative. The signal consisted mostly of high frequencies of the order of 4 Mc, instead of the primary lower-frequency component [electric dipole EMP] normally received ...'

Another EMP cover-up story - which comes from Glen Williamson, who was on Kwajalein when Starfish was tested - is that the first surface burst in Nevada in 1951 (test Sugar) coupled EMP out of the cables running from the bomb to the control point, and on through the main power supply to Las Vegas, tripping circuit breakers:

'Right after WWII, during one Nevada test, circuit breakers, 90 miles away [Las Vegas], were tripped; thus giving early hints of EMP.'

Notice that there is some evidence of something like this in extracts from B. J. Stralser's 30 April 1961 EG&G Secret - Restricted Data report, Electromagnetic Effects from Nuclear Tests. Previous Nevada tests were aircraft-dropped free air bursts, with no close-in cables to couple EMP into equipment. As soon as cable-controlled Nevada testing started, they found that EMP returning along the cables would get into other circuits by cross-talk (i.e., mutual inductance, Ivor Catt's alleged area of excellence).

After the first bad EMP event in 1951, they switched the Nevada Test Site's telephone system over to diesel generators at shot times, to keep EMP out of the U.S. power grid. The Stralser report states that at the main power supply, 30 miles (50 km) from the detonation, technicians were warned over the loudspeaker system prior to each shot:

'Stand by to reset circuit breakers.'

Stralser also reports that protective measures like carbon-block lightning protectors proved useless at the Nevada Test Site against the EMP from the cables: the EMP was so severe that it would simply 'arc over' the surge arrestor. Lead-tape shielded cables out to 800 metres from Nevada tests with yields below 75 kt had their multicore conductors fused together by the heat of carrying thousands of amps of EMP current! The full Stralser report is unavailable at present; only a brief extract and summary of it can be found in the U.K. National Archives at Kew, in an originally 'Secret - Atomic' note (the British equivalent of the American 'Secret - Restricted Data' classification). The file is a British Home Office Scientific Advisory Branch report on the effects of nuclear detonations on communications technology. Dr R. H. Purcell was the chief scientific advisor in the Home Office at that time, and apparently he wrote the summary for the benefit of his scientists because the full American report was of too high a classification for them to see. A few years later, the summary was published - without the source (Stralser) report being disclosed - in the Home Office Scientific Advisory Branch magazine Fission Fragments.

UPDATE (10 November 2008)

Various later posts add to the information on this post. The following section from the latest EMP post (mainly concerned with surface and air bursts, but including the following on high altitude bursts) is particularly important and relevant so the excerpt is being copied from that post to here:



An earlier post on this blog, 'EMP radiation from nuclear space bursts in 1962', attempted to document the vital scientific data concerning high altitude nuclear test EMP from American and Russian nuclear tests in 1962 (and some previous tests in 1958 that were not properly measured due to a theory by Bethe that led to instruments being set up to detect a radiated EMP with the wrong polarization, duration and strength). That post still contains valuable data and the motivation for civil defence, although a great deal has changed and much new vital technical information on high altitude EMP predictions has come to light since that post was written.

Dr Conrad Longmire, as stated in that post, discovered the vital 'magnetic dipole' EMP mechanism for high altitude explosions (quite different to Bethe's 'electric dipole' predictions from 1958) after he saw Richard Wakefield's curve of EMP from the 9 July 1962 Starfish test of 1.4 Mt (1.4 kt of which was prompt gamma rays) at 400 km altitude.

'Longmire, a weapons designer who worked in [Los Alamos] T Division from 1949 to 1969 and currently is a Lab associate, played a key role in developing an understanding of some of the fundamental processes in weapons performance. His work included the original detailed theoretical analysis of boosting and ignition of the first thermonuclear device. Longmire ... wrote Elementary Plasma Physics (one of the early textbooks on this topic). He also became the first person to work out a detailed theory of the generation and propagation of the [high altitude magnetic dipole mechanism] electromagnetic pulse from nuclear weapons.'

Starfish did not, however, provide the first suitable measured curve of the magnetic dipole EMP: that was obtained from the 2 kt Yucca test in 1958 and described in detail in 1959 on page 347 of report ITR-1660-(SAN), but no EMP damage occurred from that test, so nobody worried about the size and shape of that EMP, which was treated as an anomaly: 'Shot Yucca ... [EMP] field strength at Kusaie indicated that deflection at Wotho would have been some five times the scope limits... The wave form was radically different from that expected. The initial pulse was positive, instead of the usual negative. The signal consisted mostly of high frequencies of the order of 4 Mc, instead of the primary lower-frequency component [electric dipole EMP] normally received ...'

Longmire's secret lectures on the magnetic dipole EMP mechanism were included in his April 1964 Los Alamos National Laboratory report, LAMS-3073. The first open publication of Longmire's theory was in the 1965 paper 'Detection of the Electromagnetic Radiation from Nuclear Explosions in Space' in the Physical Review (vol. 137B, p. 1369) by W. J. Karzas and Richard Latter of the RAND Corporation, which is available in RAND report format online as report AD0607788. (The same authors had previously, in October 1961, written a report on Bethe's misleading 'electric dipole' EMP mechanism - incorrectly predicting an EMP peak electric field of only 1 volt/metre at 400 km from a burst like Starfish, instead of the 50,000 volts/metre which occurs in the 'magnetic dipole' mechanism - called 'Electromagnetic Radiation from a Nuclear Explosion in Space', AD0412984.) It was only after the publication of this 1965 paper that the first real concerns arose about the civil defence implications of high altitude bursts.

The next paper which is widely cited in the open literature is Longmire's, 'On the electromagnetic pulse produced by nuclear explosions' published in the January 1978 issue of IEEE Transactions on Antennas and Propagation, volume 26, issue 1, pp. 3-13. That paper does not give the EMP field strength on the ground as a function of the high altitude burst yield and altitude, but it does give a useful discussion of the theoretical physics involved and also has a brief history of EMP. In the earlier post on this blog, I extracted the vital quantitative information from a March 1975 masters degree thesis by Louis W. Seiler, Jr., A Calculational Model for High Altitude EMP, AD-A009208, pages 33 and 36, which had gone unnoticed by everyone with an interest in the subject. I also obtained Richard Wakefield's EMP measurement from the Starfish test which is published in K. S. H. Lee's 1986 book, EMP Interaction, and added a scale to the plot using a declassified graph in Dolan's DNA-EM-1, Chapter 7. However, more recent information has now come to light.

The reason for checking these facts scientifically for civil defence is that critics will otherwise dismiss the entire EMP problem as a Pentagon invention for wasting time, either because of the alleged lack of EMP effects evidence, or because excessive secrecy is used as an excuse not to present the facts to the public in a scientific manner, with evidence for assertions ('extraordinary claims require extraordinary evidence' - Carl Sagan).

The latest information on EMP comes from a brand new (October 24, 2008) SUMMA Foundation database of EMP reports compiled by Dr Carl E. Baum of the Air Force Weapons Laboratory and hosted on the internet site of the Electrical and Computer Engineering Department of the University of New Mexico:

'Announcements. Update: Oct. 24, 2008 - We are pleased to announce that many of the unclassified Note Series are now on-line and is being hosted by the Electrical and Computer Engineering Department at the University of New Mexico. More notes will be added in the coming months. We appreciate your patience.'

The first of these reports that needs to be discussed here is Note 353 of March 1985 by Conrad L. Longmire, 'EMP on Honolulu from the Starfish Event'. Longmire notes that: 'the transverse component of the geomagnetic field, to which the EMP amplitude is approximately proportional, was only 0.23 Gauss. Over the northern U.S., for some rays, the transverse geomagnetic field is 2.5 times larger.' For Starfish he uses a 400 km burst altitude, 1.4 Mt total yield and 1.4 kt (i.e., 0.1%) prompt gamma ray yield, with a mean gamma ray energy of 2 MeV. Honolulu, Hawaii (1,450 km from the Starfish detonation point 400 km above Johnston Island) had a magnetic azimuth of 54.3 degrees East and a geomagnetic field strength in the source region of 0.35 Gauss (of which the transverse component was 0.23 Gauss).

Longmire calculates that the peak radiated (transverse) EMP at Honolulu from Starfish was only 5,600 volts/metre at about 0.1 microsecond, with the EMP delivering 0.1 J/m^2 of energy: 'The efficiency of conversion of gamma energy to EMP in this [Honolulu] direction is about 4.5 percent.' Longmire's vital Starfish EMP graph for Honolulu is shown below:
Longmire points out that much higher EMP fields occurred closer to the burst point, concluding on page 12: 'We see that the amplitude of the EMP incident on Honolulu [which blew the sturdy electric fuses in 1-3% of the streetlamps on the island] from the Starfish event was considerably smaller than could be produced over the northern U.S. ... Therefore one cannot conclude from what electrical and electronic damage did not occur in Honolulu that high-altitude EMP is not a serious threat.

'In addition, modern electronics is much more sensitive than that in common use in 1962. Strings of series-connected street lights did go out in Honolulu ... sensitive semiconductor components can easily be burned out by the EMP itself, 10^{-7} Joules being reportedly sufficient.'
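Longmire's statement that the EMP amplitude is approximately proportional to the transverse geomagnetic field invites a back-of-envelope check. The sketch below (Python) uses only the figures quoted above; the linear scaling is my assumption, and since it ignores the strong-field saturation Longmire discusses in Note 368, it is only a rough upper estimate from this argument alone:

```python
import math

# Figures quoted from Longmire's Theoretical Note 353 (Starfish EMP at Honolulu):
B_source = 0.35       # geomagnetic field in the source region, gauss
B_transverse = 0.23   # transverse component along the ray to Honolulu, gauss
E_peak = 5600.0       # peak radiated EMP at Honolulu, volts/metre

# Ray-to-field angle implied by the two field values:
theta_deg = math.degrees(math.asin(B_transverse / B_source))
print(f"ray-field angle ~ {theta_deg:.0f} degrees")

# Longmire: over the northern U.S. the transverse field is up to 2.5x larger.
# Naive linear scaling of the Honolulu peak (saturation ignored):
print(f"naive scaled peak ~ {2.5 * E_peak:,.0f} V/m")
```

The naive result, about 14 kV/m, happens to sit inside the 12 to 25 kV/m peak-field range quoted later in this post from Dr Bernardin's testimony, though that agreement should not be over-interpreted.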

The next vitally important report deserving discussion here in Dr Baum's collection is K. D. Leuthauser's A Complete EMP Environment Generated by High-Altitude Nuclear Bursts, Note 363, October 1992, which gives the following vital data (notice that 10 kt prompt gamma ray yield generally corresponds to a typical thermonuclear weapon yield of about 10 megatons):

Quotations from some of the Theoretical Notes on EMP in Dr Carl E. Baum's database:

Theoretical Note 368:

Conrad L. Longmire, Justification and verification of High-Altitude EMP Theory, Part 1, Mission Research Corporation, June 1986, pages 1-3:

'Over the 22 years since the first publication of the theory of High-Altitude Electromagnetic Pulse (HEMP), there have been several doubters of the correctness of that theory. ... commonly, it has been claimed that the HEMP is a much smaller pulse than our theory indicates and it has been implied, though not directly stated in writing, that the HEMP has been exaggerated by those who work on it in order to perpetuate their own employment. It could be noted that, in some quarters, the disparagement of HEMP has itself become an occupation. ...

'... One possible difficulty with previous papers is that they are based on solving Maxwell's equations. While this is the most legitimate approach for the mathematically inclined reader, many of the individuals we think it important to reach may not feel comfortable with that approach. We admit to being surprised at the number of people who have wanted to understand HEMP in terms of the fields radiated by individual Compton recoil electrons. Apparently our schools do a better job in teaching the applications of Maxwell's equations (in this case, the cyclotron radiation) than they do in imparting a basic understanding of those equations and how they work. ...

'The confidence we have in our calculations of the HEMP rests on two circumstances. The first of these is the basic simplicity of the theory. The physical processes involved, e.g., Compton scattering, are quite well known, and the physical parameters needed in the calculations, such as electron mobility, have been measured in relevant laboratory experiments. There is no mathematical difficulty in determining the solution of the outgoing wave equation, or in understanding why it is an accurate approximation. ...

'... the model of cyclotron radiation from individual Compton recoil electrons is very difficult to apply with accuracy to our problem because of the multitudinous secondary electrons, which absorb the radiation emitted by the Compton electrons [preventing simple coherent addition of the individual fields from accelerated electrons once the outgoing EMP wave front becomes strong, and therefore causing the radiated field to reach a saturation value in strong fields which is less than the simple summation of the individual electron contributions]. ...

'The other circumstance is that there is experimental data on the HEMP obtained by the Los Alamos Scientific Laboratory in the nuclear test series carried out in 1962. In a classified companion report (Mission Research Corp. report MRC-R-1037, November 1986) we present calculations of the HEMP from the Kingfish and Bluegill events and compare them with the experimental data. These calculations were performed some years ago, but they have not been widely circulated. In order to make the calculations transparently honest, the gamma-ray output was provided by Los Alamos, the HEMP calculations were performed by MRC and the comparison with the experimental data was made by RDA. The degree of agreement between calculation and experiment gives important verification of the correctness of HEMP theory.'

As stated in this blog post, Theoretical Note TN353 of March 1985 by Conrad L. Longmire, EMP on Honolulu from the Starfish Event calculates that the peak radiated (transverse) EMP at Honolulu from Starfish delivered only 0.1 J/m2 of energy: 'The efficiency of conversion of gamma energy to EMP in this [Honolulu] direction is about 4.5 percent.'

He and his collaborators elaborate on the causes of this inefficiency problem on page 24 of the January 1987 Theoretical Note TN354:

'Contributing to inefficiency ... only about half of the gamma energy is transferred to the Compton recoil electron, on the average [e.g., the mean 2 MeV prompt gamma rays create 1 MeV Compton electrons which in getting slowed down by hitting molecules each ionize 30,000 molecules releasing 30,000 'secondary' electrons, which uses up energy from the Compton electron that would otherwise be radiated as EMP energy; also, these 30,000 secondary electrons have random directions so they don't contribute to the Compton current, but they do contribute greatly to the rise in air conductivity, which helps to short-out the Compton current by allowing a return 'conduction current' of electrons to flow back to ions].'
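The '30,000 secondary electrons' figure in the bracketed note above can be checked from first principles. The mean energy expended per ion pair in air (the 'W-value', about 34 eV) is my assumed input here; it is not stated in the quoted report:

```python
# Rough check of the '30,000 secondary electrons per Compton electron' figure.
compton_energy_eV = 1.0e6  # 1 MeV Compton recoil electron (from 2 MeV gammas)
W_air_eV = 34.0            # assumed mean energy per ion pair in air (W-value)

ion_pairs = compton_energy_eV / W_air_eV
print(f"ion pairs per 1 MeV Compton electron: ~{ion_pairs:,.0f}")
```

The result is about 29,000, i.e. of order 30,000, consistent with the figure in the note.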

Longmire also points out that Glasstone and Dolan's Effects of Nuclear Weapons, pages 495 and 534, gives the fraction of bomb energy radiated in prompt gamma rays as 0.3%. If this figure is correct, then a 10 kt prompt gamma ray yield is produced by a 3.3 megaton nuclear explosion. However, the Glasstone and Dolan figure of 0.3% is apparently just the average of the 0.1% to 0.5% range specified by Dolan in Capabilities of Nuclear Weapons, Chapter 7, Electromagnetic Pulse (EMP) Phenomena, page 7-1 (Change 1, 1978 update):

'Briefly, the prompt gammas arise from the fission or fusion reactions taking place in the bomb and from the inelastic collisions of neutrons with the weapon materials. The fraction of the total weapon energy that may be contained in the prompt gammas will vary nominally from about 0.1% for high yield weapons to about 0.5% for low yield weapons, depending on weapon design and size. Special designs might increase the gamma fraction, whereas massive, inefficient designs would decrease it.'
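Dolan's range of gamma fractions pins down the total yields implied by a given prompt gamma ray output. A one-line calculation over the quoted range:

```python
# Total weapon yield implied by a 10 kt prompt gamma ray output, for the
# gamma-fraction range quoted by Dolan (0.1% to 0.5%) and the Glasstone
# and Dolan average (0.3%).
prompt_gamma_kt = 10.0

for fraction in (0.001, 0.003, 0.005):
    total_Mt = prompt_gamma_kt / fraction / 1000.0
    print(f"gamma fraction {fraction:.1%}: total yield ~ {total_Mt:.1f} Mt")
```

So a 10 kt prompt gamma output corresponds to anywhere from a 2 Mt weapon (0.5% fraction, low-yield designs) to a 10 Mt weapon (0.1% fraction, high-yield designs), with 3.3 Mt for the 0.3% average, reconciling the figures above.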

Later related posts:






At 9:52 pm, Anonymous Anonymous said...


The EMP from a high air burst is never strong enough at the Earth's surface to do this. The strongest EMP was produced by the Hardtack-Teak shot, not the Starfish test. (Teak was 3.8 Mt and was detonated at 77 km. EMP field strength (but not area coverage) is maximised for a burst at 40 km altitude, so Teak at 77 km would have produced a stronger ground level EMP than Starfish at 400 km.) The prompt EMP electric field from Teak was not measured due to instrument failure, but the late-time magnetic field variation was measured at an observatory which studies solar storms:

"... the Apia Observatory at Samoa recorded the ‘sudden commencement’ of an intense magnetic disturbance – four times stronger than any recorded due to solar storms – followed by a visible aurora along the earth’s magnetic field lines (reference: A.L. Cullington, Nature, vol. 182, 1958, p. 1365)."

Since this EMP covered vast areas (though not as wide as those from Starfish), if the magnetic field was strong enough to wipe magnetic information off swipe cards, it would in 1962 have wiped magnetic audio and data tapes (a swipe card is just a plastic card with a strip of magnetic tape stuck on it). This didn't happen. If you think about it, the electromagnetic radiation which propagates is governed by Maxwell's equations (like visible light), and the magnetic field component of such a light velocity wave is given by:

B = E / c

Inserting the commonly used value for EMP of E = 50,000 volts/metre for the prompt field with a rise time of about 20 nanoseconds, the magnetic field strength is seen to be B = 0.000167 Tesla. This is only 2.9 times the natural magnetic field strength in Washington D.C. according to http://www.vsg.cape.com/~pbaum/magtape.htm which gives the natural field there as 0.0000571 Tesla. However, the ability to erase magnetic tape or credit card strip information depends on the field intensity in Oersted, not the field strength in Tesla:

"QUESTION: What is the danger that my tape will accidentally be erased?
"ANSWER: Standard open reel audio tapes have a coercivity of approximately 360 Oersteds. It takes an even greater magnetic field (approaching 900 Oersted) to completely erase a tape. For a comparison: The earth's magnetic field is 0.6 Oersted." - http://www.vsg.cape.com/~pbaum/magtape.htm
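The arithmetic in this comment can be verified directly, converting the plane-wave magnetic field into the same units as the coercivity figures quoted above:

```python
# Check of the comment's figures: magnetic component of a 50 kV/m plane-wave
# EMP versus the coercivity of magnetic tape.
c = 3.0e8          # speed of light, m/s
E_peak = 50_000.0  # commonly quoted peak EMP field, V/m

B_tesla = E_peak / c            # plane wave: B = E / c
B_oersted = B_tesla * 1.0e4     # 1 tesla = 10^4 gauss; in air 1 gauss ~ 1 oersted

print(f"B = {B_tesla:.6f} tesla = {B_oersted:.2f} oersted")
# Tape coercivity is ~360 oersted and full erasure needs ~900 oersted, so
# the free-field EMP is over two orders of magnitude too weak to erase tape.
```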

EMP can't directly wipe out magnetic information. However, it could wipe magnetic information indirectly, if it induced a large current in a long conductor which runs near magnetic tape. Any conductor carrying an induced pulse of electric energy creates a magnetic field around it, which can easily be much stronger than the magnetic field of the EMP in free space. For example, a long overhead power transmission line, subjected to a 50,000 V/m peak EMP, will typically give a pulse with a peak of 1 million volts at 10,000 amps. This will create tremendous magnetic fields. When these pulses go into transformers at the end of the power line, the transformer can explode or catch fire, but some of the energy is passed on before that happens, and can end up in home power systems. Any loop of cable connected to the mains will be a source of a powerful magnetic field which could wipe nearby magnetic tape, cards, and discs. 21:03, 30 March 2006 (UTC)
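The indirect-erasure mechanism can be quantified with Ampere's law for a long straight conductor, B = mu0*I/(2*pi*r), using the 10,000 A induced-pulse peak quoted in the comment; the distances below are illustrative assumptions of mine, not figures from the original:

```python
import math

# Magnetic field around a long straight conductor carrying the induced pulse.
mu0 = 4 * math.pi * 1e-7  # permeability of free space, T*m/A
I_peak = 10_000.0         # peak induced current quoted above, amps

for r_m in (0.01, 0.1, 1.0):  # assumed distances from the conductor, metres
    B_gauss = mu0 * I_peak / (2 * math.pi * r_m) * 1e4  # tesla -> gauss
    print(f"r = {r_m:4.2f} m: B ~ {B_gauss:7.1f} gauss (~oersted in air)")
```

Within a few centimetres of such a conductor the field reaches about 2,000 oersted, exceeding the ~900 oersted full-erasure threshold quoted earlier, which supports the comment's point that induced line currents, not the free-field EMP, are the threat to magnetic media.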

Microchips are vulnerable. In the 1950s and 1960s, America tested weapons at Nevada with yields up to 74 kilotons in air bursts and near surface bursts, which just produced 'clicks' on car radios. If you see B. J. Stralser's declassified 30 April 1961 EG&G report, Electromagnetic Effects from Nuclear Tests, you see that there was no damage to anything unless it was physically connected to a cable which had an induced EMP current. Hence, in tower tests, with cables running from the bomb to the control point 50 km away, after serious damage in a 1951 test they had to switch off mains power and go over to diesel generators at shot time. In the 1958 Teak test the 3.8 Mt bomb exploded 77 km directly over Johnston Island, producing a massive EMP, but again no portable radios were destroyed. In the 1962 Starfish test, and also three Russian tests, lots of things were damaged but only if they were connected to long wires. Portable radios working off batteries were OK. Although modern microchips are up to a million times more sensitive than valve/vacuum tube radios, the aerial size in a UHF cellular phone is really tiny compared to the long aerials of old HF valve/vacuum tube radios, so things balance out. I agree that anything you can fit in your pocket is not likely to be damaged by EMP, unless it is being recharged from the mains when the bomb explodes. (Batteries could only be damaged if they were being recharged at the time.) However, a safe, working cellular phone wouldn't be any use to you if the network (running from mains electricity) was zapped by EMP! 21:24, 30 March 2006 (UTC)

At 10:27 am, Anonymous Anonymous said...

Dr Bernardin info:


Written Statement by Dr. Michael P. Bernardin

Provost for the Theoretical Institute for Thermonuclear and Nuclear Studies

Applied Theoretical and Computational Physics Division

Los Alamos National Laboratory

I have been employed in the nuclear weapon design division at Los Alamos National Laboratory since 1985 to work on nuclear weapon design, nuclear outputs, and high-altitude electromagnetic pulse (EMP) assessment. I discovered the impact of x-rays on EMP and quantified the impact of two-stage shadowing effects on it as well, revolutionizing the understanding of realistic EMP environments. From 1992 – 1995, I was the Laboratory Project Leader for the Joint DoD/DOE Phase 2 Feasibility Study of a High Power Radio Frequency (HPRF) Weapon. This study effort focused on the feasibility and effectiveness of developing an HPRF weapon for offensive purposes. Since 1996, I have been the Provost for a post-graduate nuclear weapon design Institute within the Laboratory, chartered with training the next generation of nuclear weapon designers. ...

The Defense Threat Reduction Agency (DTRA), through contractors that it employs, is the principal DoD organization for EMP assessment. Los Alamos also has a capability for assessing the large-amplitude portion of the EMP, and has provided the Joint Staff with independent EMP threat assessments since 1987. ...

For a 200-km height of burst, which might be appropriate for a hypothetical multi-Mt weapon, the horizon is located at about 1600 km (or 1000 miles) from the point on the ground directly beneath the burst. For a 50-km height of burst, which might be appropriate for a 10-kt fission weapon, the horizon is located at about 800 km from the ground point beneath the burst. ...

[Very interesting: a 10 kt weapon would be best detonated at 50 km to produce the same (?) intensity of peak EMP on the earth's surface as a Mt weapon detonated at 200 km. Radius for damage from 10 kt burst at 50 km altitude is 800 km. Quite big!]
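Dr Bernardin's horizon figures follow from simple line-of-sight geometry, and can be reproduced with a short calculation:

```python
import math

# Ground range to the line-of-sight horizon for a given burst altitude,
# checking the 1600 km (200 km HOB) and 800 km (50 km HOB) figures above.
R_EARTH_KM = 6371.0

def horizon_ground_range_km(burst_altitude_km: float) -> float:
    """Great-circle distance from the sub-burst point to the tangent point."""
    return R_EARTH_KM * math.acos(R_EARTH_KM / (R_EARTH_KM + burst_altitude_km))

for hob in (200.0, 50.0):
    print(f"burst at {hob:5.0f} km: horizon ~ {horizon_ground_range_km(hob):.0f} km")
```

This gives roughly 1,580 km and 800 km, agreeing with the round figures in the testimony (atmospheric refraction stretches the real ranges slightly beyond the geometric values).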

A characteristic amplitude of the electric field is 30,000 volts per meter (V/m) (Longmire, 1978). The intermediate-time component is defined as the portion of the pulse from one microsecond to one second, and it is produced primarily through prompt gamma rays that have been scattered in the atmosphere and by neutrons produced in the explosion. This component is characterized by a peak electric field value of 100 V/m (Radasky, 1988). The third component, the late-time component, is defined as the portion of the pulse beginning at one second and lasting up to several hundred seconds. It is produced primarily through the interaction of the expanding and rising fireball with the earth’s geomagnetic field lines. This EMP component is characterized by a peak field of 0.01 V/m. ...

[The reason why this weaker MHD-EMP causes damage is that low frequencies can penetrate the topsoil and affect very long buried electric cables. Although the MHD-EMP field strength is tiny compared to the 10 ns peak, the extra duration (a 1 second pulse lasts a factor of 100,000,000 longer than a 10 ns pulse) means that the energy deposited can be similar in both cases. However, the MHD-EMP depends largely on the fission yield of the weapon, not the amount of prompt gamma ray energy which escapes from the weapon casing. Hence, bigger bombs - despite thicker cases - produce far more MHD-EMP energy. A low yield weapon, say 10 kt, with a thin case, if burst at an appropriate altitude (50-150 km), may produce a similar 10 ns peak EMP on the ground to a 1 Mt burst at 300-500 km, but it will produce much lower MHD-EMP effects.]

The ionization shorts out the EMP, limiting its value to typically 30,000 V/m.

High-energy x-rays produced by the exploding weapon can also enhance the ionization in the high-altitude EMP source region. This source of ionization was largely ignored in EMP assessments until 1986. Inclusion of the x-rays lowered the assessed values of the peak field for many weapons.

Note in graphic 3 that a thermonuclear weapon consists of two stages. The primary stage is typically of relatively low yield and is used to drive the secondary stage that produces a relatively large yield. Each weapon stage produces its own E1 EMP signal. But the primary stage gamma rays leave behind an ionized atmosphere from their EMP generation that is present when the secondary stage gamma rays arrive. Thus, the primary stage can degrade the EMP associated with the secondary stage.

Graphic 4 shows the spatial distribution of the peak EMP fields for a hypothetical weapon detonated over the United States. The directionality of the earth’s magnetic field causes the largest peak-field region to occur to the south of the burst point. The larger numbers on the plot are peak electric field values, in thousands of volts per meter (kV/m), and the smaller numbers are distance increments in kilometers. Note that the peak field ranges from 12 to about 25 kV/m. ...

It is worthwhile reviewing the most famous of the EMP effects from U.S. atmospheric testing, namely the simultaneous failure of 30 strings of streetlights in Oahu during the Starfish event. Starfish was detonated at 400 km above Johnston Island in the Pacific on July 9, 1962. It had a yield of 1.4 Mt (about 115 times the yield of the bomb dropped on Hiroshima). Oahu was located approximately 1300 km from the designated ground zero of the burst, which was within line of sight of the detonation. A post-mortem following the event indicated that the failure of the strings of streetlights resulting from the Starfish event was due to damaged fuses. This event was analyzed by Charles Vittitoe, a Sandia National Laboratory scientist, in a report published in 1989 (SAND88-3341, April 1989). He notes that the observed damage is consistent with the magnitude and orientation of the EMP fields impinging on the streetlight strings that suffered damage. More importantly, he notes that the 30 strings of failed streetlights represented only about 1% of the streetlights that existed on Oahu at the time. Thus, the effects were not ubiquitous. ...


Barnes, P.R., et al, (1993). Electromagnetic Pulse Research on Electric Power Systems: Program Summary and Recommendations, Oak Ridge National Laboratory report ORNL-6708.

Longmire, C.L., (1978). On the Electromagnetic Pulse Produced by Nuclear Explosions, IEEE Transactions on Antennas and Propagation, Vol. AP-26, No. 1, p. 3.

Radasky, W.A., et al, (1988). High-Altitude Electromagnetic Pulse – Theory and Calculations,

Defense Nuclear Agency technical report DNA-TR-88-123. See figure on page 2.

Vittitoe, C.N., (1989). Did High-Altitude EMP Cause the Hawaiian Streetlight Incident? Sandia National Laboratories report SAND88-3341.

At 6:54 pm, Anonymous Anonymous said...

The White House is now ignoring high altitude EMP threats in its current civil defence planning. They also ignore the likely scenario of an underwater burst in a harbor. They only consider a 10 kt gun-type U-235 surface burst on land (in Washington D.C.). All the other scenarios are biological, chemical and radioactive ground-level attacks.

The study, marked "official use", is: http://www.strac.org/Docs/Exdocs/National%20Planning%20Scenarios%20Feb%202006.pdf :


Version 21.2 DRAFT


Created for Use in National, Federal, State, and Local Homeland Security Preparedness Activities

February 2006

White House Homeland Security Council

[This is a 164 page book]

Introduction.... ii
Scenario 1: Nuclear Detonation – 10-kiloton Improvised Nuclear Device .... 1-1
Scenario 2: Biological Attack – Aerosol Anthrax .... 2-1
Scenario 3: Biological Disease Outbreak – Pandemic Influenza.... 3-1
Scenario 4: Biological Attack – Plague .... 4-1
Scenario 5: Chemical Attack – Blister Agent .... 5-1
Scenario 6: Chemical Attack – Toxic Industrial Chemicals.... 6-1
Scenario 7: Chemical Attack – Nerve Agent.... 7-1
Scenario 8: Chemical Attack – Chlorine Tank Explosion .... 8-1
Scenario 9: Natural Disaster – Major Earthquake .... 9-1
Scenario 10: Natural Disaster – Major Hurricane.... 10-1
Scenario 11: Radiological Attack – Radiological Dispersal Devices.... 11-1
Scenario 12: Explosives Attack – Bombing Using Improvised Explosive Devices.... 12-1
Scenario 13: Biological Attack – Food Contamination .... 13-1
Scenario 14: Biological Attack – Foreign Animal Disease (Foot-and-Mouth Disease).... 14-1
Scenario 15: Cyber Attack .... 15-1
Appendix: Scenario Working Group Members .... A-1
Attack Timelines.... Published Under Separate Cover
Universal Adversary Group Profiles.... Published Under Separate Cover

page ii:


The Federal interagency community has developed 15 all-hazards planning scenarios (the National Planning Scenarios or Scenarios) for use in national, Federal, State, and local homeland security preparedness activities. The Scenarios are planning tools and are representative of the range of potential terrorist attacks and natural disasters and the related impacts that face our nation. The objective was to develop a minimum number of credible scenarios in order to establish the range of response requirements to facilitate preparedness planning. Since these Scenarios were compiled to be the minimum number necessary to develop the range of response capabilities and resources, other hazards were inevitably omitted. Examples of other potentially high-impact events include nuclear power plant incidents, industrial and transportation accidents, and frequently occurring natural disasters. Entities at all levels of government can use the National Planning Scenarios as a reference to help them identify the potential scope, magnitude, and complexity of potential major events. Entities are not precluded from developing their own scenarios to supplement the National Planning Scenarios.

These Scenarios reflect a rigorous analytical effort by Federal homeland security experts, with reviews by State and local homeland security representatives. However, it is recognized that refinement and revision over time will be necessary to ensure the Scenarios remain accurate, represent the evolving all-hazards threat picture, and embody the capabilities necessary to respond to domestic incidents.

How to Use the National Planning Scenarios:

Capabilities-Based Planning –
In seeking to prepare the Nation for terrorist attacks, major disasters, and other emergencies, it is impossible to maintain the highest level of preparedness for all possibilities all of the time. Given limited resources, managing the risk posed by major events is imperative. In an atmosphere of changing and evolving threat, it is vital to build flexible capabilities that will enable the Nation, as a whole, to prevent, respond to, and recover from a range of major events. To address this challenge, the Department of Homeland Security (DHS) employs a capabilities-based planning process that occurs under uncertainty to identify capabilities suitable for a wide range of challenges and circumstances, while working within an economic framework that necessitates prioritization and choice. As a first step in the capabilities-based planning process, the Scenarios, while not exhaustive, provide an illustration of the potential threats for which we must be prepared. The Scenarios were designed to be broadly applicable; they generally do not specify a geographic location, and the impacts are meant to be scalable for a variety of population and geographic considerations.

page 1-1

Scenario 1: Nuclear Detonation –
10-kiloton Improvised Nuclear Device

Scenario Overview:
General Description –

In this scenario, terrorist members of the Universal Adversary (UA) group—represented by two radical Sunni groups: the core group El-Zahir (EZ) and the affiliated group Al Munsha’a Al Islamia (AMAI)—plan to assemble a gun-type nuclear device using Highly Enriched Uranium (HEU) stolen from a nuclear facility located in Pakistan. The nuclear device components will be smuggled into the United States. The device will be assembled near a major metropolitan center. Using a delivery van, terrorists plan to transport the device to the business district of a large city and detonate it.

Detailed Attack Scenario –

Current intelligence suggests that EZ may be working with AMAI to develop an Improvised Nuclear Device (IND). It is suspected that special training camps in the Middle East have been established for IND training. Some IND manuals have also been confiscated from suspected EZ operatives. The volume of communications between EZ and AMAI operatives has increased significantly in the past two weeks.

EZ operatives have spent 10 years acquiring small amounts of HEU. Operatives acquired the material by posing as legitimate businessmen and by using ties to ideologically sympathetic Pakistani nuclear scientists. EZ plans to construct a simple gun-type nuclear device and detonate the weapon at a symbolic American location. EZ Central Command initiates the operation. To preserve operational effectiveness at all levels, compartmentalization and secrecy are required. Due to fears of penetration, EZ has become increasingly discreet in its decision-making process, with few operatives informed of the next target. Target selection, preparation, and acquisition are confined to a small number of terrorist operatives.

page 1-2:

This scenario postulates a 10-kiloton nuclear detonation in a large metropolitan area. The effects of the damage from the blast, thermal radiation, prompt radiation, and the subsequent radioactive fallout have been calculated (based on a detonation in Washington, DC), and the details are presented in Appendix 1-A. However, the calculation is general enough that most major cities in the United States can be substituted in a relatively straightforward manner. Enough information is presented in the appendix to allow for this kind of extrapolation. The radioactive plume track depends strongly on the local wind patterns and other weather conditions. In a situation where the wind direction cycles on a regular basis or other wind anomalies are present, caution should be exercised in directly using the fallout contours presented in the appendix.

If the incident happened near the U.S. border, there would be a need for cooperation between the two border governments. Additionally, the IND attack may warrant the closure of U.S. borders for some period of time. If the detonation occurs in a coastal city, the fallout plume may be carried out over the water, causing a subsequent reduction in casualties. On the other hand, the surrounding water will likely restrict the zones that are suitable for evacuation. Bridges and tunnels that generally accompany coastal cities will restrict the evacuation, causing delay and an increase in the radioactive dose that evacuees receive. This delay may be substantial, and the resulting dose increase may drive a decision to shelter-in-place or evacuate-in-stages. This assumes that the authorities have an effective communication channel with the public.

Page A-1:

APPENDIX: Scenario Working Group


The Homeland Security Council receives interagency guidance via a number of Policy Coordinating Committees (PCCs). One of them is the Domestic Threat, Response, and Incident Management (DTRIM) PCC; the Scenarios Working Group (SWG) supports the DTRIM. The members of the SWG are as follows:

CHAIR: Janet K. Benini, Director of Response and Planning, White House Homeland Security Council




Version 17.3 DRAFT

Attack Timelines

Created for Use in National, State, and Local Homeland Security Preparedness Activities

February 2006

White House Homeland Security Council

[This 112-page book sets out in diary format the envisaged activities of the terrorists in assembling and detonating various types of weapons for each of the 15 attack scenarios detailed above. All the details certainly do make my hair stand on end. But they don't consider other nuclear attacks like underwater bursts in ships, of the kind Britain tested in Operation Hurricane, 1952. The radioactive effects of a shallow underwater burst are more important than those of a surface burst on land, because of the early high speed base surge and also the difficulty in decontaminating ionic wet fallout - it becomes chemically attached to surfaces, unlike dry land burst fallout which can be swept away with a broom or hosed off.]

page 1-1

In this scenario, terrorist members of the UA group—represented by two radical Sunni groups: the core group El-Zahir (EZ) and the affiliated group Al Munsha’a Al Islamia (AMAI)—plan to assemble a gun-type nuclear device using Highly Enriched Uranium (HEU) stolen from a nuclear facility located in Pakistan. The nuclear device components will be smuggled into the United States. The device will be assembled near a major metropolitan center. Using a delivery van, terrorists plan to transport the device to the business district of a large city and detonate it.


At 10:15 pm, Blogger nige said...

Dr Mario Rabinowitz has very kindly emailed me (19 November 2006 18:42) some corrections to this blog post which I will make when time permits.

At present, this comment will indicate the changes required.

The report by Mario mentioned above with the 1987 IEEE journal publication date (in which he also notes that you can't use several EMP weapons together, or they will interfere with each other, reducing the total EMP) was actually written in:

"... the very early 80's. The forces that be suppressed release of my EPRI report, and prevented publication of my work until 1987. I even have a galley of my paper in Science which managed to get through their tough review process. It was about a week before being published, when it was killed.

"I'm sure many other scientists have encountered similar problems."

Well I have suffered problems of this sort myself.

The problem of censorship is precisely that it creates these priority issues.

Dr Bernardin was unaware of the work of Dr Rabinowitz because the latter was censored. It is extremely difficult to resolve such issues in a satisfactory way.

Dr Rabinowitz was generally at a disadvantage anyway by lack of access to classified nuclear test data and even declassified documents, which were not easy to find out about or obtain in the 80s.

Nigel Cook

At 11:53 am, Blogger nige said...

13 October 2007: updates

(1) Regarding the map showing USSR Test ‘184’ on 22 October 1962 (‘Operation K’, the ABM System A proof tests; a 300-kt burst at 290 km altitude near Dzhezkazgan): the source for the information in the box - that a radar installation 1,000 km away malfunctioned due to EMP and that radio receivers failed out to a distance of 600 km - is a summary briefing by General Vladimir Loborev (Director of the Russian Central Institute of Physics and Technology, CIPT, near Moscow), made at the June 1994 EUROEM Conference in Bordeaux, France.

It is not clear whether the effects were due to EMP received directly by the affected devices, or whether they were merely affected by power surges in long buried power lines or long overhead telephone lines connected to them.

However, see the later post http://glasstone.blogspot.com/2006/08/nuclear-weapons-1st-edition-1956-by.html for British Home Office Scientific Advisory Branch studies published in its restricted journal "Fission Fragments" on EMP effects to portable transistor-based battery powered radios (not connected to any external power line, external aerial, etc.):

"Fission Fragments", Issue No. 21, April 1977, pages 18-25:

On pages 20-24 there is an article by C. H. Lewis, MSc, The Effects of EMP, in Particular on Home Defence Communications which states:

'For a near ground-burst the downward component [of the outward Compton electron current in the air, produced by initial gamma radiation] is largely suppressed leaving the upward component to form what is virtually a conventional dipole aerial with a tremendously high current. ... Field strengths for a 5 Mt weapon may be about 20 kV/m at 3 miles, 5 kV/m at 5 miles and 1 kV/m at 8 miles, where blast pressure will be down to 2 psi. ... Consider first the possible effects on the power system. Fortunately the super-grid (which is designed to work at 400 kV) is not thought to be particularly vulnerable, but perhaps 1/4 of the pulse energy picked up by the supergrid may be passed on by the distribution transformers with consequent current surges in the lower voltage systems of perhaps 20,000 amps. Thus although the supergrid may survive, the current surges in the distribution system may result in major system instability with consequent serious breakdown ... It will be remembered that system instability in 1965 resulted in a total black-out of the north-east U.S. for several days. ... Turning to communications ... transmitters appear to be vulnerable to EMP, which can generate peak currents in the aerials of medium wave transmitters (which may be of the order of 100 m long) of several kiloamperes. As a result there is a considerable risk of breakdown in the high voltage capacitors of the transmitters. Additionally, the continuity of broadcasting depends on power supplies, communication with the studio and the studio equipment. Ironically the ordinary domestic transistor receiver with ferrite rod aerials is likely to survive, but VHF receivers with stick aerials are vulnerable when the aerial is extended. ... At this stage the vulnerability of various devices may be considered. A 300 ft length of conductor may pick up between 0.1 and 40 Joules (1 Joule = 1 watt-second). 
According to US sources, a motor or transformer can survive about 10,000 J, electronic valves about 0.01 J. Small bipolar transistors are sensitive to about 10^{-7} J and microwave diodes, field effect transistors, etc., are sensitive to about 10^{-9} J. ... With a rise time of 10^{-8} secs, 10^{-8} J equates to 1 watt - well beyond the capacity of small transistors. Clearly, motors and transformers are likely to survive, thermionic valves are reasonably good, but transistors in general are vulnerable, whilst equipment using field effect transistors or microwave diodes is especially vulnerable.'
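The energy-to-power conversion in the quoted figures is easy to cross-check: peak power is roughly pulse energy divided by rise time. A minimal sketch, using the damage thresholds quoted in the article above (the square-pulse assumption is a simplification of the real double-exponential EMP waveform):

```python
# Rough order-of-magnitude check of the damage thresholds quoted above.
# Peak power ~ pulse energy / rise time (square-pulse simplification).

RISE_TIME = 1e-8  # seconds, the rise time given in the article

# Damage-energy thresholds (joules) as quoted from the 1977 article:
thresholds = {
    "motor/transformer": 1e4,
    "electronic valve": 1e-2,
    "small bipolar transistor": 1e-7,
    "microwave diode / FET": 1e-9,
}

for device, energy_j in thresholds.items():
    peak_power_w = energy_j / RISE_TIME
    print(f"{device}: {energy_j:.0e} J over {RISE_TIME:.0e} s -> {peak_power_w:.0e} W peak")
```

This reproduces the article's point that 10^{-8} J delivered in 10^{-8} s is a 1 watt peak, far above what a small transistor junction can dissipate on that time scale.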

The remainder of that article discussed the effects of EMP on the British wired telephone system: 'The effect of any EMP pick-up in the system will be to cause flashover at one or more of a number of points - terminal boards, relay contacts, relay coil terminations, capacitors, etc. ... There are likely to be many domestic telephones connected in part by overhead lines, and these lines can pick up EMP currents, passing them into the exchange equipment. Because most telephone lines are underground, it is no longer Post Office policy to provide lightning protectors at the exchange or on subscribers premises. Within the exchange, all incoming cables are terminated at the Main Distribution Frame, and from this point the internal wiring to the exchange equipment is unshielded. In view of the tremendous amount and complexity of this internal wiring it appears that the major source of EMP pick-up may lie within the exchange. ... The limit of satisfactory direct speech transmission is about 25 miles and since this must include the subscribers lines to and from the exchange it is customary to provide "repeaters" (amplifiers [including inductance coils to prevent frequency-dependent distortion]) at intervals of 15 miles between exchanges.'

The next very interesting article in "Fission Fragments", Issue No. 21, April 1977, is at page 25: A. D. Perryman (Scientific Advisory Branch, Home Office), EMP and the Portable Transistor Radio. Perryman states: 'In an attempt to answer some of these questions [about EMP effects on communications] the Scientific Advisory Branch carried out a limited programme of tests in which four popular brands of transistor radio were exposed in an EMP simulator to threat-level pulses of electric field gradient about 50 kV/m.

'The receivers were purchased from the current stock of a typical retailer. They comprised:

'1. a low-price pocket set of the type popular with teenagers.

'2. a Japanese set in the middle-price range.

'3. a domestic type portable in the upper-price range.

'4. an expensive and sophisticated portable receiver.

'All these sets worked on dry cells and had internal ferrite aerials for medium and long wave reception. In addition, sets 2, 3 and 4 had extendable whip aerials for VHF/FM reception. Set 3 also had one short wave band and set 4 two short wave bands... .

'During the tests the receivers were first tuned to a well-known long-wave station and then subjected to a sequence of pulses in the EMP simulator. This test was repeated on the medium wave and VHF bands. Set 1 had no VHF facility and was therefore operated only on long and medium waves.

'The results of this experimentation showed that transistor radios of the type tested, when operated on long or medium waves, suffer little loss of performance. This could be attributed to the properties of the ferrite aerial and its associated circuitry (e.g. the relatively low coupling efficiency). Set 1, in fact, survived all the several pulses applied to it, whereas sets 2, 3 and 4 all failed soon after their whip aerials were extended for VHF reception. The cause of failure was identified as burnout of the transistors in the VHF RF [radio frequency] amplifier stage. Examination of these transistors under an electron microscope revealed deformation of their internal structure due to the passage of excessive current transients (estimated at up to 100 amps).

'Components other than transistors (e.g. capacitors, inductors, etc.) appeared to be unaffected by the number of EM pulses applied in these tests.

'From this very limited test programme, transistor radios would appear to have a high probability of survival in a nuclear crisis when operated on long and medium bands using the internal ferrite aerial. If VHF ranges have to be used, then probably the safest mode of operation is with the whip aerial extended to the minimum length necessary to give just audible reception with the volume control fully up.

'Hardening of personal transistor radios is theoretically possible and implies good design practice (e.g. shielding, bonding, earthing, filtering etc.) incorporated at the time of manufacture. Such receivers are not currently available on the popular market.'

The effects of EMP on electronics can be amplified if the equipment is switched on, because an operating circuit amplifies the EMP signal, adding extra power to the current surge. Damage also occurs when current passes the wrong way through transistors, overheating them (especially transistors built into ICs, which have no effective heat sink over the nanosecond time scale of the power surge).

(2) The 1963 secret American Defense Department film "High-Altitude Nuclear Weapons Effects - Part One, Phenomenology" (20 minutes), has been declassified.

It discusses in detail, including film clips and discussions of the sizes and quantitative phenomena of the tests, the effects of 1962 high altitude tests BLUEGILL (410 kt, 48 km altitude), KINGFISH (410 kt, 95 km altitude), and STARFISH (1.4 Mt, 400 km altitude).

This film is mainly concerned with fireball expansion, rise, striation along the Earth's natural magnetic field lines, and air ionization effects on radio and radar communications, but it also includes a section explaining the high altitude EMP damage mechanism.

Here is a summary of facts and figures from this film:

BLUEGILL (410 kt, 48 km height of burst, 26 October 1962): within 0.1 second the fireball is several km in diameter at 10,000 K so air is fully ionised. Fireball reaches 10 km in diameter at 5 seconds. By 5 seconds, the fireball is buoyantly rising at 300 metres/second. It is filmed from below and seen within a minute to be transforming into a torus or doughnut shape as it rises. The fireball has reached a 40 km diameter at 1 minute, stabilising at an altitude of 100 km some minutes later.

KINGFISH (410 kt, 95 km altitude, 1 Nov. 1962): fireball size is initially 10 times bigger than in the case of BLUEGILL. The KINGFISH fireball rises ballistically (not just buoyantly) at a speed 5 times greater than BLUEGILL. Its diameter (longways) is 300 km at 1 minute and it is elongated along the Earth's natural geomagnetic field lines while it expands. It reaches a maximum altitude of 1000 km in 7 or 8 minutes before falling back to 150-200 km (it falls back along the Earth's magnetic field lines, not a simple vertical fall). The settled debris has a diameter of about 300 km and a thickness of about 30 km. This emits beta and gamma radiation, ionizing the air in the D-layer, forming a "beta patch". Photographs of beta radiation aurora from the KINGFISH fireball are included in the film. These beta particles spiral along the Earth's magnetic field lines and shuttle along the field lines from pole to pole. Because magnetic field lines concentrate together as they approach the Earth's poles, the negative Coulomb field strength due to concentrated beta particles near the poles slows and reflects beta particles back. This is the "mirroring" effect discovered in Operation Argus in 1958. It only works effectively if the mirror point altitude is above 200 km, otherwise the beta particles will be rapidly absorbed by the atmosphere (after a few passes from pole to pole) before they can be reflected. Hence, only sufficiently high altitude nuclear explosions can create long-lasting "shells" of trapped electrons at very high altitude. To some extent, the trapping effect varies as the debris rises and sinks back in one explosion.

STARFISH (1.4 Mt, 400 km, 9 July 1962): the film shows STARFISH early fireball expansion effects. STARFISH produced an asymmetric fireball due to the missile which carried the warhead: a shock wave goes upward and another goes downward, while a small star-like remnant continues to glow at the detonation point (contrary to predictions!). Fireball expansion was resisted by geomagnetic back-pressure: the electrically conductive fireball gases exclude the Earth's magnetic field, so the latter is displaced as the fireball expands. This is the "magnetic bubble" effect.

The film then explains the mechanism for the magnetic dipole EMP: prompt gamma rays are mainly absorbed between 25-30 km altitude, the Compton electrons being deflected by the Earth's magnetic field lines, emitting coherent EMP in the process. The film shows the damaging results by depicting an overhead power line experiencing a power surge and sparking.

Near the end of the film, there is an amazing and impressive speeded-up film showing the KINGFISH fireball (initially a large egg shaped fireball) rising and striating into a series of line-like filaments orientated along the Earth's magnetic field lines.

Other declassified films worth mentioning are "Fishbowl High-Altitude Weapons Effects" (1962, 28 minutes) which explains the instrumentation and shows the effects of each detonation on Pacific radio communications at different frequencies, and the lengthy set of four films "Starfish Prime Event Interim Report By Commander JTF-8", "Fishbowl Auroral Sequences", "Dominic on Fishbowl Phenomena" and "Fishbowl XR Summary" (1 hour 9 minutes in total).

Some highlights of these films: the high-altitude 1962 Fishbowl series involved 266 instrument stations: 156 stations on land, 80 stations aboard 10 ships, and 30 stations aboard 15 test aircraft. They mention the 3 high altitude Argus tests in 1958 and the Yucca (1.7 kt, 26 km), Orange (3.8 Mt, 43 km) and Teak (3.8 Mt, 77 km) tests of Hardtack in 1958. The 3 objectives of Fishbowl are stated to be:

1. ICBM acquisition problems for ABM radar installations after a nuclear explosion,
2. AICBM (Anti-ICBM) kill mechanism to use a nuclear explosion to destroy an incoming ICBM (by neutron and gamma radiation, shock wave, and thermal ablation phenomena),
3. Communications effects of high altitude explosions of various yields and burst altitude.

STARFISH HF radio effects lasted for 2 days over the Pacific.

CHECKMATE (7 kt, 147 km burst altitude) HF radio effects extended out to 700 km for 30 minutes.

KINGFISH HF radio effects extended to 2500 km radius for 2 hours.

BLUEGILL HF radio was blacked out for over 1 minute within a 200 km radius, and lesser effects lasted over this region for 2 hours. BLUEGILL also produced retinal burns in test rabbits.

VLF was relatively unaffected by the tests, LF was degraded, and HF was extensively degraded, as was VHF (though with less severe absorption). UHF line-of-sight links were relatively unaffected, except where the signal path passed through a fireball region.

On the silent films there is an especially good BLUEGILL torus film, and nice films of KINGFISH auroral radiation emission from the fireball. There are also detailed films showing the STARFISH auroral fireball developing around the burst location, the striation of CHECKMATE fireball debris (a speeded up film) and some interesting films showing shock waves rebounding inside the TIGHTROPE fireball: explosive and implosive shock waves occur, with the implosion shock wave bouncing off the singularity in the middle and transforming itself into an outward explosive shock wave.

At 6:45 am, Blogger Corky Boyd said...

Regarding the Starfish test, I performed an unsophisticated test of EMP myself.

I was a junior officer in the Navy at Pearl Harbor assigned to Pacific Fleet Headquarters.

I knew of the test and the countdown frequency. I purchased an inexpensive Hallicrafter SW radio to monitor the countdown, which used the ID of April Weather. There were numerous scrubbed missions and one disaster when the radar lost track of the Thor IRBM and it had to be destroyed at a very low altitude.

My test was to monitor the countdown, which was broadcast from Johnston Island at just slightly above 10 MHz. Near the countdown frequency was a VOA broadcast from California. My intention was to shift frequency shortly after detonation, which I did, and test reception.

When the detonation occurred, the sky, which was overcast, lit up in a brilliant yellow/chartreuse color. After about 45 seconds the edges of the chartreuse turned a deep red, which worked its way into the center of the light until it darkened about 5 to 7 minutes after the test. It was an awesome experience.

At the time of the detonation there was a zzzzzt sound for about a half second. There was no loss of signal from April Weather and when I changed frequencies to VOA it was coming in as clear as before.

My recollection was the test altitude was significantly higher than 400km now being reported. It appeared to be 35 to 40 degrees above the horizon. The countdown from launch to detonation (nudet in the vernacular) was slightly in excess of 13 minutes.

The news outlets in Hawaii reported some lights going out, but no widespread effects. There were also reports of EMP related problems in New Zealand, but very little else. My own test did not show any electric power interruption, or any loss of signal in the 30 meter band.

Thought you might be interested.

At 3:30 pm, Blogger nige said...

"The news outlets in Hawaii reported some lights going out, but no widespread effects. There were also reports of EMP related problems in New Zealand, but very little else. My own test did not show any electric power interruption, or any loss of signal in the 30 meter band."

Hi Corky Boyd,

Thank you very much for your first-hand experience of the Starfish EMP. It is extremely useful to have first-hand accounts.

I exchanged an email with Glen Williamson ( http://www.williamson-labs.com/480_emp.htm ) who observed the same Starfish test from Kwajalein Atoll, 1500 miles away. He wrote, as he says on his site:

"I don't remember hearing of anything happening on Kwaj as a result of the shots. Of course, all of the technical facilities there were heavily shielded. Knowing that there were artifacts in Hawaii, I am surprised we didn't experience the same..."

- http://www.williamson-labs.com/480_emp.htm

It does seem that EMP effects on 1962 electronics on small islands were few and far between after Starfish.

I've seen the declassified reports, and they all - from the interim scientific report to the present day - give the Starfish burst altitude as 400 km. There is actual film of the Starfish device exploding, included in the set of films, "Starfish Prime Event Interim Report By Commander JTF-8", "Fishbowl Auroral Sequences", "Dominic on Fishbowl Phenomena" and "Fishbowl XR Summary" (1 hour 9 minutes in total).

These films indicate that the reported burst altitude is correct: the burst was above the horizon as seen from Hawaii. The calculation to determine the burst altitude, allowing for the Earth's curvature, is straightforward.
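To illustrate the curvature-corrected geometry, here is a minimal sketch; the formula is standard spherical geometry (not taken from the films), and the roughly 1,300 km Johnston-to-Oahu ground range is my assumed input:

```python
import math

R_EARTH = 6371.0  # km, mean Earth radius

def elevation_angle_deg(burst_alt_km, ground_range_km):
    """Elevation angle of a burst at the given altitude, seen by a
    sea-level observer at the given great-circle ground range,
    allowing for the Earth's curvature (no refraction correction)."""
    theta = ground_range_km / R_EARTH  # central angle, radians
    num = math.cos(theta) - R_EARTH / (R_EARTH + burst_alt_km)
    den = math.sin(theta)
    return math.degrees(math.atan2(num, den))

# Starfish (400 km burst altitude) seen from Oahu, ~1,300 km away:
print(round(elevation_angle_deg(400.0, 1300.0), 1))  # ~10.7 degrees
```

A positive result confirms the 400 km burst was above the horizon from Hawaii; the roughly 10 degree figure is also consistent with Corky Boyd's own rough estimate for Pearl Harbor in the comment above.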

This business about the streetlamps and radios in Hawaii is a red herring. Only 1-3% of streetlamps on the island of Oahu were put out (the 1-3% uncertainty is simply historical guesswork about how many streetlamps there were in Hawaii; what is known for sure is that engineers had to replace fuses in 300 streetlamps, in 30 overhead-connected strings of 10 lamps each). If you compare the size of the Hawaiian islands to the areas affected in the Russian tests, the overhead and buried power and communication lines in Hawaii were short. That, plus the electromechanical phone systems and valve/vacuum tube radios, is what limited the damage compared to what would happen if the test were repeated today over a large land area.

The electromechanical relay phone switchboards and vacuum tube electronics were capable of surviving power surges a million times greater than microchips and other transistor-dependent devices.

In addition, for above ground power cables, the current induced by the fast (prompt gamma ray produced) EMP pulse is almost directly proportional to the length of the line, for line lengths up to 100 km or so. Hence, even if a string of 10 streetlamps on Hawaii was say 1 km long, you would get 100 times more current induced in 100 km or more of overhead power line over land. In the case of the slow (MHD) EMP, the situation is even more severe, with the cable-length effect increasing the induced current out to even greater line lengths.
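The linear-scaling argument can be put as a one-line calculation; the 100 A reference current below is purely hypothetical, and only the proportionality (valid for the fast pulse up to line lengths of order 100 km) comes from the text:

```python
# Linear scaling of fast-EMP induced current with conductor length,
# valid only as an approximation up to lengths of order 100 km.

def scaled_current(i_ref_amps, length_ref_km, length_km):
    """Scale a reference induced current linearly with line length."""
    return i_ref_amps * (length_km / length_ref_km)

# If a hypothetical 1 km string of streetlamps picked up 100 A,
# a 100 km overhead line would pick up roughly 100 times more:
print(scaled_current(100.0, 1.0, 100.0))  # 10000.0
```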

The vulnerability of solid state chip computer systems to EMP is a problem that was never investigated in Russian or American nuclear tests.

Certainly the MHD EMP is slow enough (several seconds rise time) that circuit breakers in protected power supplies could fully protect equipment from damage, but the microsecond surge spike in powerlines from the fast EMP (caused by prompt gamma rays) is supposed to be faster than many circuit breakers can respond to (they are chiefly designed to stop millisecond spikes due to lightning flashes, not microsecond spikes from a high altitude nuclear explosion). It seems that any protective equipment would reduce damage in threshold cases, by stopping at least part of the surge after the spike has passed. However, most portable (laptop) equipment that was not connected to the mains at the time of the explosion would probably be unaffected, because such devices are so small that they can't directly pick up much damaging current from the EMP: their wireless antennae are also small and tuned for 2.4 GHz, much higher than the predominantly HF signal of the EMP. Mobile cellular phones similarly now work mainly on microwave frequencies and are small enough to resist fairly powerful EMPs of 5-20 kV/m quite well.

So the major crisis of EMP would be damage to power stations and distribution, and the knock-on effect of that on computers and mobile phone network repeaters. There is also the problem of electronic ignition failure in cars/automobiles due to EMP, again because microchips are far more sensitive to EMP than the simple distributor ignition systems used in Hawaii in 1962.

Altogether, it seems that there are concerns for countries with long power lines and long phone lines, that depend on microchips, and neither of these concerns existed in the small sized Hawaiian islands back in 1962.

One example of this kind is the failure of the telephone system on the Hawaiian island of Kauai due to the EMP destroying the microwave link, which was the one piece of crucial equipment there back in 1962. I think it was supposed to have burned out a semiconductor diode.

Really, in discussing 1962 nuclear test EMP effects in a modern context, emphasis needs to be placed on the relative insensitivity of 1962 electronic systems in general, and the small size of the conductor cables involved in those small islands. The Russian experience of detonating bombs over inhabited areas, which fused phone lines and caused lead-shielded underground cables to pick up enough current to set a power station on fire by overloading heavy-duty transformer coils, shows the likely effects of high altitude explosions over large, inhabited land areas.

At 6:47 am, Blogger Corky Boyd said...


It is possible I misread the Starfish test altitude, but my memory was that it was significantly higher than 250nm. A couple of items still make me question the officially reported altitude.

First, the countdown from launch to detonation was over 13 minutes, which included burn time and coast. Seems excessive for a 250nm burst. Second, would a 250nm altitude burst be directly visible above the horizon from Kwajalein 1500nm away? Also from rough calculations (please check me) a 250nm high burst would be about 10 degrees over the horizon at Pearl Harbor about 700nm away. It appeared higher than that.

On the other hand, it doesn’t make sense for the US to be deceptive on this. Surely the Soviets made their own measurements.

You sound as if you are well versed in physics. Would you run the numbers on the Kwajalein altitude and burn time scenarios?

I enjoy your discussions.

At 1:00 am, Blogger nige said...

Hi Corky,

The photos of the Starfish Prime fireball are shown on another post of this blog:

There is a set of photos of the fireball taken at 3 minutes after detonation with an 80 mm Hasselblad camera aboard a Los Alamos KC-135 instrumentation jet above the clouds, at 300 km horizontal distance from the detonation.

The photo shows the burst location against the background stars which are also visible behind the fireball. There is film also from earlier times, before the fireball had expanded so much. Therefore, it looks to me as if the burst altitude was accurately determined from careful measurements based on photos.

Visible effects of a nuclear detonation above the horizon were documented after the 1958 "Teak" nuclear test above Johnston Island at night, which was even more powerful than Starfish but was below the horizon as seen from the Hawaiian Islands (3.8 megatons, of which 1.9 megatons was from fission, at a burst altitude of 77 km).

There was little cloud cover at the time and a few people were able to photograph the "Teak" test. Four very good quality amateur photos, taken at intervals of about 50 seconds, were even published in the Journal of Geophysical Research, vol. 65, 1960, p. 545.

Despite detonating below the horizon, the "Teak" explosion was immediately visible (within a fraction of a second) due to beta particle radiation streaming upward from the radioactive fireball and causing a bright aurora in the low density air above the detonation point. After a few seconds, when the fireball started rising at a "ballistic" rate (because its altitude exceeded the height over which the air density falls by an order of magnitude), the fireball itself rose above the detonation point and could be seen directly from Hawaii, despite the burst having occurred at only 77 km altitude.

So, could it be the case that the apparently high angle of the flash as seen through the cloud at Hawaii was just a result of beta radiation causing a bright aurora high above the burst point, as the photo taken at 3 minutes seems to show?

This effect of a glow far higher than the detonation point due to the passage of radiation upward, would also account for some of the visible effects from Starfish seen at Kwajalein Atoll.

I can't find any data on how long the rocket burned before the Starfish device exploded. The declassified films I obtained (which I will be transferring to Google Video as soon as possible), did indicate that the Starfish missile with its 1.4 megaton thermonuclear warhead, instrument pods, etc., was very heavy and the previous attempt to fire it failed about a minute after launch.

I don't know how long it is supposed to take to get such a missile up to 400 km. It will depend on the rocket thrust and the total mass of the missile including all the attached instrument pods which were ejected at different altitudes on the way up, to measure the effects at different distances from the fireball.

The film does make it clear that the missile was tracked carefully by both radar and by camera stations until the detonation occurred.

In the DVD "Nukes in Space" there are recordings of President Kennedy's conferences discussing the nuclear tests in space in October 1962, and one of the major arguments was about "Uracca", a test planned for very high altitude (I think it was planned to be 7 kt at 1300 km altitude). That test had to be cancelled, and it is discussed as follows in a technical report I found about the general effects of American high altitude tests:

"In any case, Dr. Webb, the NASA administrator at that time, prevailed upon Dr. Jerome Wiesner, the Chief Scientific Advisor to the President, and reportedly also directly upon President Kennedy to have future nuclear space experiments restricted to lower altitudes. This, in my personal opinion, highly emotional response led unfortunately to the cancellation of the low-yield Uracca event, which was to be exploded at an altitude of 1300 km as proposed by LASL. The event, as planned, would have added less than 17% to the inventory of the artificial belts but would have increased our knowledge of near-space physics significantly."

- http://www.fas.org/sgp/othergov/doe/lanl/docs1/00322994.pdf

Thank you for the discussion, which is very interesting.

At 3:24 pm, Anonymous Anonymous said...

A minor comment. I was a high school student in Hilo (Hawaii) during Teak, and saw the burst. As I remember, there were two tests, separated by a week or maybe several weeks. The first one was unannounced and some of my friends were out, late at night, and were very frightened by what they saw. They weren't alone in that.

For the second test there was an official announcement. Many of the students in my high school, including me, drove over to Ka Lae (South Point) to watch the explosion, which we did indeed see.

As I remember, there were widespread reports of power outages for both tests. And also, as I remember, the authorities denied that the explosions could have had anything to do with them.

Of course this was 50 years ago and my memory may be faulty about anything except what I witnessed that night at Ka Lae.

At 7:30 pm, Blogger nige said...

Hilo boy,

Thank you. Could you please describe what you saw, presumably the "Orange" test on 12 August of 3.8 Mt (50% fission) at 43 km altitude over Johnston Island? "Teak" was an identical weapon design detonated at 76.8 km altitude on 1 August.

According to Glasstone & Dolan's Effects of Nuclear Weapons, 3rd ed., 1977, Chapter 2:

"2.56 The TEAK explosion was accompanied by a sharp and bright flash of light which was visible above the horizon from Hawaii, over 700 miles away. Because of the long range of the X rays in the low-density atmosphere in the immediate vicinity of the burst, the fireball grew very rapidly in size. In 0.3 second, its diameter was already 11 miles and it increased to 18 miles in 3.5 seconds. The fireball also ascended with great rapidity, the initial rate of rise being about a mile per second. Surrounding the fireball was a very large red luminous spherical wave, arising apparently from electronically excited oxygen atoms produced by a shock wave passing through the low-density air (Fig. 2.56). [Fireball and red luminous spherical wave formed after the TEAK high-altitude shot. (The photograph was taken from Hawaii, 780 miles from the explosion.)]

2.57 At about a minute or so after the detonation, the TEAK fireball had risen to a height of over 90 miles, and it was then directly (line-of-sight) visible from Hawaii. The rate of rise of the fireball was estimated to be some 3,300 feet per second and it was expanding horizontally at a rate of about 1,000 feet per second. The large red luminous sphere was observed for a few minutes; at roughly 6 minutes after the explosion it was nearly 600 miles in diameter. ...

"2.60 Additional important effects that result from high-altitude bursts are the widespread ionization and other disturbances of the portion of the upper atmosphere known as the ionosphere. These disturbances affect the propagation of radio and radar waves, sometimes over extended areas (see Chapter X). Following the TEAK event, propagation of high-frequency (HF) radio communications (Table 10.91) was degraded over a region of several thousand miles in diameter for a period lasting from shortly after midnight until sunrise. Some very-high-frequency (VHF) communications circuits in the Pacific area were unable to function for about 30 seconds after the STARFISH PRIME event.

"2.61 Detonations above about 19 miles can produce EMP effects (§ 2.46) on the ground over large areas, increasing with the yield of the explosion and the height of burst. For fairly large yields and burst heights, the EMP fields may be significant at nearly all points within the line of sight, i.e., to the horizon, from the burst point. ...

"2.62 An interesting visible effect of high-altitude nuclear explosions is the creation of an "artificial aurora." Within a second or two after burst time of the TEAK shot a brilliant aurora appeared from the bottom of the fireball and purple streamers were seen to spread toward the north. Less than a second later, an aurora was observed at Apia, in the Samoan Islands, more than 2,000 miles from the point of burst, although at no time was the fireball in direct view. The formation of the aurora is attributed to the motion along the lines of the earth's magnetic field of beta particles (electrons), emitted by the radioactive fission fragments. Because of the natural cloud cover over Johnston Island at the time of burst, direct observation of the ORANGE fireball was not possible from the ground. However, such observations were made from aircraft flying above the low clouds. The auroras were less marked than from the TEAK shot, but an aurora lasting 17 minutes was again seen from Apia. Similar auroral effects were observed after the other high-altitude explosions ..."

The earlier 2nd edition (1962 and massively corrected 1964 reprint) of that book contained a bit more information about the "Orange" test; it states that observers at Hawaii saw a grey cloud rise over the horizon about 1 minute after the detonation and disappear shortly thereafter. It would be interesting if you can recall what you saw of the explosion. Was there cloud intervening, or was the sky clear?

Both detonations were well below the horizon as seen from ground level at Hawaii. Because the long-range EMP that causes most of the damage is VHF frequency, it can't propagate around the horizon. The MHD-EMP is ELF and can get around the horizon, but the powerlines and phone lines in Hawaii probably were not long enough to pick up significant currents from MHD-EMP. I can't see how either "Teak" or "Orange" could have had much EMP effect out at Hawaii, because both shots were too low to allow VHF frequency EMP to propagate with sufficient strength (well past the horizon radius as seen from the burst point in those tests).
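The line-of-sight limit can be checked with the usual horizon-distance formula; a minimal sketch, assuming a sea-level observer and no atmospheric refraction, with burst altitudes from the tests discussed above:

```python
import math

R_EARTH = 6371.0  # km, mean Earth radius

def horizon_range_km(burst_alt_km):
    """Great-circle distance from ground zero out to where a burst at
    the given altitude drops below the horizon (sea-level observer,
    no refraction)."""
    return R_EARTH * math.acos(R_EARTH / (R_EARTH + burst_alt_km))

# Orange and Teak vs Starfish burst altitudes:
for name, alt in [("Orange, 43 km", 43.0),
                  ("Teak, 76.8 km", 76.8),
                  ("Starfish, 400 km", 400.0)]:
    print(f"{name}: line of sight out to ~{horizon_range_km(alt):.0f} km")
```

With Hawaii roughly 1,250 km from Johnston Island, Orange (~740 km line-of-sight radius) and Teak (~980 km) were indeed below the horizon there, while Starfish (~2,200 km) was not.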

There were certainly effects on radio propagation due to enhanced atmospheric ionisation by beta particles (the ionosphere was used to bounce radio signals to and from Australia and America, etc.). But this is not EMP damage, and doesn't damage equipment or cause power losses, it just introduces noise (static) in long range radio signals, or phase shifts in the paths taken by the radio signals (due to bouncing off the ionosphere at a different altitude from normal when being ducted between the sea and the ionosphere).

But do you remember any specific EMP effects occurring after the 1962 "Starfish" test?

At 5:23 pm, Blogger nige said...

More about the visible effects of 3.8 Mt "HARDTACK-ORANGE" at 43 km above Johnston Island in 1958:

"The dramatic display of southern lights [aurora] which TEAK generated raised considerable anxiety in Hawaii, but most observers in the islands were disappointed in ORANGE. One observer on the top of Mount Haleakala on Maui described the display as “... a dark brownish red mushroom [that] rose in the sky and then died down and turned to white with a dark red rainbow.” While ORANGE was visible for about 10 minutes in Hawaii, it had little effect on radio communications."

- Page 142 of http://www.dtra.mil/newsservices/publications/pub_includes/docs/DefensesNuclearAgency.pdf

At 5:25 pm, Blogger nige said...

The full title of that last linked reference above is:

"Defense's Nuclear Agency: 1947-1997", DTRA History Series, U.S. Defense Threat Reduction Agency, U.S. Department of Defense, Washington, D.C., 2002.

At 3:00 pm, Blogger nige said...

Another useful source of early unclassified and incomplete data on Starfish effects is:


NASA Technical Note NASA TN D-2402, The Effects of High Altitude Explosions, by Wilmot N. Hess, Goddard Space Flight Center, Greenbelt, Md., 1964.

It mentions the EMP radiated by electrons deflected by the Earth's magnetic field, but only under cover of the physics jargon "synchrotron radiation", and completely misses the important prompt gamma radiation induced VHF/UHF frequency, sub-microsecond duration EMP, mentioning on page 9 only the inconsequential, non-damaging, minutes-long low frequency radiation from electrons trapped in radiation belts:

"A few minutes after Starfish, synchrotron radiation from the trapped electron was observed in
Peru (Reference 15). This is the only effect of the artificial radiation belts that is observed on the ground for long periods. Synchrotron radiation is the electromagnetic radiation given off when an electric charge is accelerated in a circle (Reference 16 - Schwinger, J., "On the Classical Radiation of Accelerated Electrons," Phys. Rev. v75, pp1912-1925, 15 June 1949). It was first observed as light emitted from a synchrotron electron accelerator. If the charged particles have V << c , then the radiation is emitted only at the cyclotron frequency and is called cyclotron radiation; but, when the particle is relativistic, many higher harmonics of the cyclotron frequency are emitted, too, and the radiation is called synchrotron radiation. The radio emission of the planet Jupiter in the 30 cm range is tentatively identified as being synchrotron radiation from trapped electrons with energies in the order of 5 to 100 Mev ..."

Much more usefully, it gives some of the early data from Starfish on the radiation belts it caused in space (mapped by early satellites' geiger counters) and some data on the degradation of solar cells on satellites, due to radiation damage from traversing the radiation belts enhanced by the Starfish explosion. There are also various later, better papers on the subject, but as this one is already available in full on the internet it is worth linking to right away.

At 7:34 pm, Blogger nige said...

About the "Orange" test, Chuck Hansen's book "U.S. Nuclear Weapons", Orion Books, 1988, page 81 states (referencing Glasstone's Effects of Nuclear Weapons, Feb. 1964 revision pages 50-52, 82-3, which I don't have handy at present):

"The Orange fireball was also seen from Hawaii; about a minute later, a grayish-white radioactive cloud was seen low on the horizon, but it disappeared within four minutes."

At 10:56 am, Blogger nige said...

copy of a comment to:


Beautiful pictures of volcanic lightning and of Saturn! It is certainly true that cosmic rays can trigger lightning bolts. There is a large electric potential between the Earth's surface and the ionosphere, which is at high altitude and hence low pressure air. This is similar to conditions in a Geiger-Muller tube, where you have low pressure gas and a strong electric field. Any cosmic ray can potentially set off an electron avalanche which, in the absence of a quenching agent, grows into a large discharge (Geiger-Muller tubes add a quenching agent, such as a halogen or organic vapour, to an inert fill gas like helium, neon or argon, in order to limit the size of the electron avalanche and thus quench each small discharge). Since there is little quenching agent in the Earth's atmosphere, major lightning bolts can develop.

One pretty impressive lightning situation which demonstrates the connection between ionizing radiation and lightning was the lightning filmed around the periphery of the fireball from the "Mike" nuclear test on 1 Nov. 1952 at Eniwetok. The yield was 10.4 Mt, and the gamma rays set off at least five lightning flashes in the ionized air just around the fireball. All the lightning bolts were essentially vertical, from the scud cloud just above the fireball down to the lagoon water. This confirms that nuclear radiation, via the ionization it causes in the atmosphere, definitely can trigger a shorting of the natural vertical electric potential gradient in the atmosphere, resulting in a bolt of lightning.

At 11:32 pm, Blogger nige said...

The vital 1963 declassified films of the 1962 high altitude nuclear test effects (see my comment above) are available on YouTube:

Part 1: http://youtube.com/watch?v=tdrirktDT2Y&feature=related (20 minutes)

Part 2: http://youtube.com/watch?v=T6eLPLR_WPs&feature=related (16 minutes)

To recap, here again is my review and summary of Part 1 (the association of nuclear test names with the test events discussed in the film has to be deduced from the films of the explosions):

The 1963 secret American Defense Department film "High-Altitude Nuclear Weapons Effects - Part One, Phenomenology" (20 minutes), has been declassified.

It discusses in detail, including film clips and discussions of the sizes and quantitative phenomena of the tests, the effects of 1962 high altitude tests BLUEGILL (410 kt, 48 km altitude), KINGFISH (410 kt, 95 km altitude), and STARFISH (1.4 Mt, 400 km altitude).

This film is mainly concerned with fireball expansion, rise, striation along the Earth's natural magnetic field lines, and air ionization effects on radio and radar communications, but it also includes a section explaining the high altitude EMP damage mechanism.

Here is a summary of facts and figures from this film:

BLUEGILL (410 kt, 48 km height of burst, 26 October 1962): within 0.1 second the fireball is several km in diameter at 10,000 K so air is fully ionised. Fireball reaches 10 km in diameter at 5 seconds. By 5 seconds, the fireball is buoyantly rising at 300 metres/second. It is filmed from below and seen within a minute to be transforming into a torus or doughnut shape as it rises. The fireball has reached a 40 km diameter at 1 minute, stabilising at an altitude of 100 km some minutes later.

KINGFISH (410 kt, 95 km altitude, 1 Nov. 1962): fireball size is initially 10 times bigger than in the case of BLUEGILL. The KINGFISH fireball rises ballistically (not just buoyantly) at a speed 5 times greater than BLUEGILL. Its diameter (longways) is 300 km at 1 minute and it is elongated along the Earth's natural geomagnetic field lines while it expands. It reaches a maximum altitude of 1000 km in 7 or 8 minutes before falling back to 150-200 km (it falls back along the Earth's magnetic field lines, not a simple vertical fall). The settled debris has a diameter of about 300 km and a thickness of about 30 km. This emits beta and gamma radiation, ionizing the air in the D-layer, forming a "beta patch". Photographs of beta radiation aurora from the KINGFISH fireball are included in the film. These beta particles spiral along the Earth's magnetic field lines and shuttle along the field lines from pole to pole. ...

At 2:41 pm, Blogger nige said...

Nobel Laureate Hans A. Bethe's report containing the wrong EMP mechanism for high altitude bursts (electric dipole instead of magnetic dipole) is:

H. A. Bethe, "Electromagnetic Signal Expected from High-Altitude Test", Los Alamos Scientific Laboratory report LA-2173, October 1957, secret-restricted data.

This report is significant because it predicted all three major parameters so wrongly that it prevented the magnetic dipole EMP from being discovered for five years. It predicted (1) totally the wrong polarization (the direction antennas need to be pointed to detect the EMP), (2) completely the wrong rise time of the EMP (the oscilloscope time-sweep setting needed to show up the pulse on the display so it could be photographed; the pulse duration is tens of nanoseconds, not tens of microseconds), and finally (3) the wrong intensity of the pulse (about 1 volt/metre was predicted instead of 10,000 or more volts/metre, so the oscilloscope pulse height settings were wrong by a factor of 10,000, and any instruments which did detect the pulse just gave vertical spikes extending off-scale, with no information whatsoever about the peak EMP or its duration).

These problems were only resolved after one instrument, operated by Wakefield in an instrumentation aircraft at the 1962 Starfish test, was set with a very fast sweep and low intensity, so that it managed to capture the EMP peak and duration successfully:

Richard L. Wakefield, "Measurement of time interval from electromagnetic signal received in C-130 aircraft, 753 nautical miles from burst, at 11 degrees 16 minutes North, 115 degrees 7 minutes West, 24,750 feet", Los Alamos Scientific Laboratory, pages 44-45 of Francis Narin's Los Alamos Scientific Laboratory compilation "A 'Quick Look' at the Technical Results of Starfish Prime", report AD-A955411, August 1962. (Figure 8 on page 45 gives the Wakefield EMP waveform measurement for Starfish, and is headed "EM Time Interval Signal on C-130 aircraft 753 Nautical Miles from Burst".)

At subsequent 1962 "Fishbowl" (high altitude) tests Kingfish, Bluegill and Checkmate, similar oscilloscope settings were used to obtain further successful waveform measurements of EMP:

John S. Malik, "Dominic Fishbowl Radioflash Waveforms", Los Alamos Scientific Laboratory report LA(MS)-3105, May 1964, Secret-restricted data.

John S. Malik and Ralph E. Partridge, Jr., "Operation Dominic Radioflash Records", Los Alamos Scientific Laboratory report LAMS-3019, November 1963, Secret-restricted data.

The two reports above are still classified, more than 35 years after being written.

At 1:45 pm, Blogger nige said...

Update (26 Feb 2009): Vital fresh information on EMP from Starfish and other 1962 nuclear tests has been published and is reported on this blog in the new post:


'The street lights on Ferdinand Street in Manoa and Kawainui Street in Kailua went out at the instant the bomb went off, according to several persons who called police last night.'

- HONOLULU ADVERTISER newspaper article dated 9 July 1962 (local time; reprinted in the Tuesday 21 February 1984 edition, celebrating the 25th anniversary of Hawaiian statehood to the U.S.A.).

At 11 pm on 8 July 1962 (local time, Hawaii), 300 streetlights in 30 series connected loops (strings) were fused by the EMP from the Starfish nuclear test, detonated 800 miles away and 248 miles above Johnston Island. This is approximately 1-3% of the total number of streetlights on Oahu.
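As a quick sanity check on the "1-3%" figure above, the implied total number of streetlights on Oahu can be backed out by arithmetic (the totals below are inferred from the quoted percentages, not stated anywhere in the source):

```python
# Back out the implied total Oahu streetlight count from the figures above:
# 300 lamps fused, stated to be about 1-3% of the total.
fused = 300
for fraction in (0.01, 0.03):
    total = fused / fraction
    print(f"if {fraction:.0%} fused, total ~ {total:,.0f} streetlights")
```

So the quoted percentage range corresponds to a total stock of roughly 10,000 to 30,000 streetlights on the island.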

In a much earlier blog post (linked here), the 1962 EMP damage effects from high altitude explosions (including three Russian high altitude tests of 300 kt each with differing altitudes of burst) were examined in some detail.

Then, in a more recent blog post (linked here), freshly released information from Dr Carl Baum's EMP notes series was given and discussed, including Dr Conrad Longmire's investigation (Note 353 of March 1985, EMP on Honolulu from the Starfish Event) which assessed the EMP field strength at Hawaii, which peaked after 100 nanoseconds at 5,600 volts/metre.

Longmire stated on page 12 of his report:

'We see that the amplitude of the EMP incident on Honolulu [which blew the sturdy electric fuses in 1-3% of the streetlamps on the island] from the Starfish event was considerably smaller than could be produced over the northern U.S. ... Therefore one cannot conclude from what electrical and electronic damage did not occur in Honolulu that high-altitude EMP is not a serious threat. In addition, modern electronics is much more sensitive than that in common use in 1962. Strings of series-connected street lights did go out in Honolulu ... sensitive semiconductor components can easily be burned out by the EMP itself, 10^(-7) Joules being reportedly sufficient.'

This 5,600 v/m figure allows definite correlations to be made between the observed effects and the size of the EMP field, which is a massive leap forward for quantitative civil defence assessments of the probable effects of EMP.
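As an order-of-magnitude illustration of what Longmire's 5,600 v/m figure means energetically, the plane-wave power density and fluence can be sketched as follows (the ~100 ns effective pulse width used here is an illustrative assumption based on the rise time quoted above, not a figure from Longmire's note):

```python
ETA_0 = 376.73   # ohms, impedance of free space
E_PEAK = 5600.0  # V/m, Longmire's calculated peak field at Honolulu
TAU = 100e-9     # s, assumed effective pulse width (illustrative)

# Plane-wave power density S = E^2 / eta_0, then fluence = S * tau.
power_density = E_PEAK**2 / ETA_0   # W/m^2
fluence = power_density * TAU       # J/m^2

print(f"peak power density ~{power_density/1e3:.0f} kW per square metre")
print(f"fluence ~{fluence*1e3:.1f} mJ per square metre")
# Compare with the 1e-7 J semiconductor burnout energy quoted by Longmire:
print(f"ratio to 1e-7 J burnout threshold: ~{fluence/1e-7:,.0f} per m^2 of collector")
```

Even a few millijoules per square metre is tens of thousands of times the quoted 10^(-7) J burnout energy, which is why an antenna or power line only needs to collect a tiny fraction of the incident fluence to damage a semiconductor junction.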

Now Dr Baum (who has an important and interesting overview of EMP here, although it misses out some early important pieces of the secret history of EMP in the table of historical developments) has made available the report by Charles N. Vittitoe, 'Did high-altitude EMP (electromagnetic pulse) cause the Hawaiian streetlight incident?', Sandia National Labs., Albuquerque, NM, report SAND-88-0043C; conference CONF-880852-1 (1988).

Vittitoe on page 3 states: 'Several damage effects have been attributed to the high-altitude EMP. Tesche notes the input-circuit troubles in radio receivers during the Starfish [1.4 Mt, 400 km altitude] and Checkmate [7 kt, 147 km altitude] bursts; the triggering of surge arresters on an airplane with a trailing-wire antenna during Starfish, Checkmate, and Bluegill [410 kt, 48 km altitude] ...'

This refers to the KC-135 aircraft that filmed the tests from above the clouds, approximately 300 kilometers away from the detonations.

The reference Vittitoe gives to Dr Frederick M. Tesche is: 'F. M. Tesche, IEEE Transactions on Power Delivery, PWRD-2, 1213 (1987). [This reference is unfortunately wrong since there were only 4 issues of that journal published in 1987 and page 1213 occurs in issue 4 - in the middle of an article on EMP by Dr Mario Rabinowitz - that article being also available on arXiv.org and reviewed critically in a previous blog post here.] The effects were reported earlier by G. S. Parks, Jr., T. I. Dayharsh, and A. L. Whitson, A Survey of EMP Effects During Operation Fishbowl, DASA [U.S. Department of Defense's Defense Atomic Support Agency, now the DTRA] Report DASA-2415, May 1970 (Secret - Restricted Data).'

Vittitoe then quotes Glasstone and Dolan's statement in The Effects of Nuclear Weapons:

'One of the best authenticated cases was the simultaneous failure of 30 strings (series-connected loops) of street lights at various locations on the Hawaiian island of Oahu, at a distance of 800 miles from ground zero.'

The detonation occurred at 11 pm on 8 July 1962 (Hawaii local time), so the flash was seen across the night sky and the failure of some street lights was observed. Vittitoe usefully quotes, on page 5, the vital newspaper reports of the EMP damage, the first of which is the most important since it was published the very next day after the explosion:

'The street lights on Ferdinand Street in Manoa and Kawainui Street in Kailua went out at the instant the bomb went off, according to several persons who called police last night.'

- HONOLULU ADVERTISER newspaper article dated 9 July 1962 (local time; this amazing Starfish EMP effects article was reprinted in the Tuesday 21 February 1984 edition, celebrating the 25th anniversary of Hawaiian statehood to the U.S.A.).

A technical investigation was then made by the streetlight department into the causes of the 300 streetlight failures. On 28 July 1962, the HONOLULU STAR-BULLETIN newspaper article 'What Happened on the Night of July 8?' by Robert Scott (a professor at Hawaii University) reported that a Honolulu streetlight department official attributed the failure of the streetlights to blown fuses, due to energy released by the bomb test being coupled into the power supply line circuits (see illustration above; the street lamps were attached to regular overhead power line poles, allowing EMP energy to be coupled into the circuit).

On 8 April 1967, HONOLULU STAR-BULLETIN newspaper published an article by Cornelius Downes about the blown fuses: 'small black plastic rings with two discs of lead separated by thin, clear-plastic washers.'

Vittitoe reports that the streetlight officials found that: 'The failure of 30 strings was well beyond any expectations for severe [electrical lightning] storms (where ~4 failures were typical).'

Vittitoe then gives a full analysis of the physics of how the EMP calculated by Longmire turned off the streetlights, and confirms that the EMP was responsible for the fuse failures.

Interestingly, Vittitoe co-authored the 2003 arXiv.org paper Radiative Reactions and Coherence Modeling in the High-Altitude Electromagnetic Pulse with Dr Mario Rabinowitz, who has kindly corresponded with me by email on the subjects of EMP and also particle physics (although Dr Rabinowitz did not mention this EMP paper he co-authored with Vittitoe!).

At 5:58 pm, Blogger nige said...

Literature references to EMP effects data from the three Russian EMP nuclear tests at high altitudes over Kazakhstan in October and November 1962:

V. M. Loborev, “Up to Date State of the NEMP Problems and Topical Research Directions,” Electromagnetic Environments and Consequences: Proceedings of the EUROEM 94 International Symposium, Bordeaux, France, May 30-June 3, 1994, edited by D. J. Serafin, J. Ch. Bolomey, and D. Dupouy, published in 1995, pp. 15-21. (Details of 1962 Russian high altitude nuclear test damage to the fuses in a 500 km long above-ground communications line, and to the insulation to a 1,000 km long buried power line, as well as diesel generators and radar systems).

Greetsai, V. N., A. H. Kozlovsky, M. M. Kuvshinnikov, V. M. Loborev, Yu. V. Parfenov, O. A. Tarasov, L. N. Zdoukhov, “Response of Long Lines to Nuclear High-Altitude Electromagnetic Pulse (HEMP),” IEEE Transactions on EMC, vol. 40, No. 4, November 1998, pp. 348-354. (Details of 1962 Russian high altitude nuclear test damage to two communication lines. Abstract: “During high-altitude nuclear testing in 1962 over Kazakhstan, several system effects were noted due to the high-altitude electromagnetic pulse (HEMP). In particular a 500-km-long aerial communications line experienced a failure due to the damage of its protective devices. This failure is examined in detail beginning with the calculation of the incident HEMP environments, including those from the early- and late-time portions of the HEMP. In addition, the currents and voltages induced on the line are computed and the measured electrical characteristics of the protection devices are presented. With this information it is possible to determine which portions of the HEMP environment were responsible for particular protection failures. The paper concludes with recommendations for further work required to understand the best ways to protect power lines from HEMP in the future”.)

Howard Seguine (SeguineH@c3isky1.c3i.osd.mil), “US-Russian meeting – HEMP effects on national power grid & telecommunications”, 17 Feb. 1995, is a report that gives data relevant to the USSR Test ‘184’ on 22 October 1962, ‘Operation K’ (ABM System A proof tests), a 300-kt burst at 290-km altitude near Dzhezkazgan. Prompt gamma ray-produced EMP induced a current of 2,500 amps, measured by spark gaps in a 570-km stretch of overhead telephone line running westwards from Zharyq, blowing all the protective fuses. The late-time MHD-EMP was of low enough frequency to penetrate 90 cm into the ground, overloading a shallow-buried, lead and steel tape-protected, 1,000-km long power cable between Aqmola and Almaty, tripping circuit breakers and setting the Karaganda power plant on fire. Russian Army diesel electricity generators were burned out by EMP after 300-kt tests at altitudes of 150 km on 28 October and 59 km on 1 November. Seguine’s report gives many useful details, a few extracts from which follow:

“Lawrence Livermore National Lab (LLNL) hosted the Workshop on Atmospheric Nuclear Test Experience with the Russian Electric Power Grid, 14-15 Feb. Russian attendees were Professor (Maj Gen) Vladimir M. Loborev, Director, Russian Federal Ministry of Defense Central Institute of Physics and Technology (CIPT), Moscow; and Dr. (Colonel) Valery M. Kondrat’ev, Senior Scientist, CIPT. Dr. Lynn Shaeffer, LLNL, hosted the meeting. About 20 LLNL members attended. Other US attendees were Stan Gooch, STRATCOM; Chuck Lear, Silo-Based ICBM System Project Office, Hill AFB; Maj ValVerde, USSPACECOM; Balram Prasad, Defense Nuclear Agency (DNA); Mike Zmuda, Sacramento Air Logistic Center; two translators; and me. …

“Question [asked to Loborev]: Based on your understanding of what the US has published, can US models be improved by Russian models and/or data? Answer: We follow world literature, in this area, assiduously. I suspect the US doesn’t have close-in data on even the Soviet detonations. I’m convinced US-Russian specialists’ discussions in this area would be absolutely beneficial to both sides with regard to improving methodologies. But this type of collaboration is in the bailiwick of higher ups in both our governments. Such could occur if they agreed. The fact that I’m standing before you and that you have some Russian scientists at the lab says that the process has begun, as President Yeltsin recently said. We both should pursue this through our respective chains. …

“KONDRAT’EV – Formal paper (read by Kondrat’ev, with some difficulty)
a. USSR EMP theory was developed 1961-62. The Ministry of Communications did EMP experiments on communications lines.
b. The attached diagram [nuclear test of 23 October 1962] approximates a vu-graph used to discuss damages. Dimensions shown and information in the three boxes were provided verbally by Kondrat’ev and/or Loborev.
c. Amplifiers, spaced 40-80 km apart were damaged as were spark gap tubes. The latter were commonly used to protect the system from lightning damage. Spark gaps saw more than 350 volts for 30-40 microsecs; parts of the line saw more than kiloamps, and the rise time was 30-40 microsecs – these were actual measurements.
d. Experiments were set up specifically to study protection measures for critical items. We experienced fires from EMP and loss of communications gear. Seven-wire cables were common in telecommunications networks.
e. Destruction of power supply at Karaganda. Fuses failed during the test, as they were designed to do; actually, they burned. …”

Russian EMP effects report PDF link:

Seguine report on Russian EMP nuclear tests 1962

Corrected EMP effects illustration
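The Seguine extract above notes that the late-time MHD-EMP was of low enough frequency to penetrate the 90 cm of soil covering the buried Aqmola-Almaty cable. A rough skin-depth sketch shows why shallow burial is no protection at ELF (the soil conductivity of 0.01 S/m used here is an illustrative assumption, not a figure from the report):

```python
import math

MU_0 = 4e-7 * math.pi   # H/m, vacuum permeability
SIGMA = 0.01            # S/m, assumed soil conductivity (illustrative)

def skin_depth_m(freq_hz):
    """Skin depth delta = sqrt(1 / (pi * f * mu_0 * sigma))
    for a conducting medium in the good-conductor approximation."""
    return math.sqrt(1.0 / (math.pi * freq_hz * MU_0 * SIGMA))

# MHD-EMP is ELF (~1 Hz or below); early-time EMP energy extends up to ~100 MHz.
for f in (1.0, 1e3, 1e8):
    print(f"{f:>9.0e} Hz -> skin depth ~{skin_depth_m(f):,.1f} m in soil")
```

At around 1 Hz the skin depth in such soil is kilometres, so 90 cm of earth attenuates the MHD-EMP negligibly, while the ~100 MHz early-time EMP has a soil skin depth of only about half a metre and is largely screened from a buried cable.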

In testimony to the 1997 U.S. Congressional Hearings, “Threats Posed by Electromagnetic Pulse to U.S. Military Systems and Civilian Infrastructure; House of Representatives, Committee on National Security, Military Research and Development Subcommittee, Washington, DC, Wednesday, July 16, 1997” (Hon. Curt Weldon, Chairman of Military Research and Development Subcommittee), Dr. George W. Ullrich, the Deputy Director of the U.S. Department of Defense's Defense Special Weapons Agency, DSWA (now the Defence Threat Reduction Agency, DTRA) stated:

“Starfish Prime, a 1.4 megaton device, was detonated at an altitude of 400 kilometers over Johnston Island. Failures of electronic systems resulted in Hawaii, 1,300 kilometers away from the detonation. Street lights and fuzes failed on Oahu and telephone service was disrupted on the island of Kauai. Subsequent tests with lower yield devices [410 kt Kingfish at 95 km altitude, 410 kt Bluegill at 48 km altitude, and 7 kt Checkmate at 147 km altitude] produced electronic upsets on an instrumentation aircraft [the KC-135 that filmed the tests from above the clouds] that was approximately 300 kilometers away from the detonations.

“Soviet scientists had similar experiences during their atmospheric test program. In one test, all protective devices in overhead communications lines were damaged at distances out to 500 kilometers; the same event saw a 1,000 kilometer segment of power line shut down by these effects. Failures in transmission lines, breakdowns of power supplies, and communications outages were wide-spread.”

At 3:44 pm, Anonymous Hilo Boy said...

Hello Nige --

I'm very sorry that I failed to monitor the comments, and thus missed your question.

First, I was not in Hawai'i for the 1962 tests, and can't report anything. I did witness Orange (but not Teak).

Memory always causes problems in these matters, an obvious statement of course.

My memory is of a fireball as well as a cloud. For years, when telling about my experience, I would talk about a "mushroom cloud," until I began to think that a space burst could not have produced such a cloud, and that my memory had simply supplied the cloud to go along with the fireball.

I can't remember whether the sky was clear or not. I do remember that we were all looking in the direction to which a tracking dish at South Point was pointing -- we had no idea whether there was a connection or not, but it seemed reasonable. I don't know whose dish that was. We always just referred to it as "the tracking station." It's gone now, except for the concrete support.

So: I saw a fireball and I used to think I saw a cloud.

I think newspapers of that time would mention any electrical disturbances. I do remember hearing or reading about it, but it's also true that these memories could have come from clippings or reports that my mother might have sent me after Starfish, in 1962. It's possible.

I'm sorry I can't be much more help. Although witnessing Orange affected me strongly, I have to admit that -- since we were all teenagers -- there was a certain amount of drinking going on that night, not to mention fooling around with girls. It seemed a lark.

I don't see comment dates, so I'll make my own: 9 August 2010.

At 2:08 am, Blogger Unknown said...

I heard someone say an EMP would cause main power lines to glow and explode, as in melt metal.

1. Is this true?
2. Would an EMP that powerful kill people anyway?
3. If you were very deep in the ground, would you survive anyway?
4. Is it easier to kill people, or to melt metal, with an EMP?

At 1:24 pm, Blogger nige said...

You're confusing the EMP with the higher energy density of a microwave oven.

The energy density of the EMP isn't high enough to melt things on a large scale, only to melt quite small electronic connectors once the energy has been collected by large antennae or other metal collectors and channeled into that small connector, inside a transistor or a microchip.

The worst case is where you get a cable running close to a surface burst (inside the intense radiation deposition region), where you can get thousands of amps induced in the cable, overheating it, burning the insulation and allowing the conductors to touch and fuse together.

The human effects depend on a person shorting the EMP from a large collector to the ground. If you stand on a large metallic object, no effect. If you touch a large metallic object that is otherwise insulated from the earth, then the EMP current surge will try to pass through you to the ground, depending on the total resistance (whether your hands are dry, whether you are wearing rubber soled shoes, etc.). Someone touching a long metal railing or wire held above the ground by wooden posts could get a very brief electric shock from the EMP. Some electrical fires might be started, but people could easily put them out.

Mammals have small crystals of magnetite in their brains which can be twisted by very strong, rapidly changing magnetic fields, but this doesn't cause long term damage.

In summary, the highest frequencies of the EMP correspond to roughly the reciprocal of the rise time (first half cycle) of the EMP waveform. Since this is about 10 nanoseconds for most weapons, i.e. 10^{-8} second, the maximum EMP frequency is roughly 10^8 cycles/second or 100 megahertz. This is less than the gigahertz frequencies of microwave ovens that heat food. The EMP energy density (Joules per cubic metre) is proportional to the square of the field strength (volts/metre), but isn't high enough at 50 kV/m to cause significant heating, given the brief duration of the strong field intensity.

The only way EMP can cause significant damage is by being picked up by antennas and cables, and fed into sensitive equipment where it burns out delicate components.
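The reciprocal rise-time estimate and the field energy density argument in the summary above can be sketched numerically (a rough order-of-magnitude check using the figures quoted in the text, not values from any test report):

```python
EPS_0 = 8.854e-12          # F/m, vacuum permittivity

rise_time = 10e-9          # s, the ~10 ns rise time quoted above
f_max = 1.0 / rise_time    # rough reciprocal estimate: 1e8 Hz = 100 MHz

E = 50e3                   # V/m, the 50 kV/m peak field quoted above
# Total energy density of a plane wave: u = eps_0 * E^2
# (the electric and magnetic contributions are equal, each eps_0 * E^2 / 2).
u = EPS_0 * E**2           # J/m^3

print(f"max EMP frequency ~{f_max/1e6:.0f} MHz (below microwave-oven GHz)")
print(f"field energy density ~{u*1e3:.0f} mJ per cubic metre")
```

A few tens of millijoules per cubic metre, delivered in well under a microsecond, is indeed far too little for bulk heating; the damage mechanism is concentration of collected energy into small junctions, as the text says.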



All of this data should have been published to inform public debate on the basis for credible nuclear deterrence of war and civil defense, PREVENTING MILLIONS OF DEATHS SINCE WWII, instead of deliberately allowing enemy anti-nuclear and anti-civil defence lying propaganda from Russian-supporting evil fascists to fill the public data vacuum, killing millions by allowing civil defence and war deterrence to be dismissed by ignorant "politicians" in the West, so that wars triggered by invasions with mass civilian casualties continue today for no purpose other than to promote terrorist agendas of hate and evil arrogance and lying for war, falsely labelled "arms control and disarmament for peace": "Controlling escalation is really an exercise in deterrence, which means providing effective disincentives to unwanted enemy actions. Contrary to widely endorsed opinion, the use or threat of nuclear weapons in tactical operations seems at least as likely to check [as Hiroshima and Nagasaki] as to promote the expansion of hostilities [providing we're not in a situation of Russian-biased arms control and disarmament whereby we've no tactical weapons while the enemy has over 2000 neutron bombs thanks to "peace" propaganda from Russian thugs]." - Bernard Brodie, p. vi of Escalation and the Nuclear Option, RAND Corp memo RM-5444-PR, June 1965.

Update (19 January 2024): Jane Corbin of BBC TV is continuing to publish ill-informed nuclear weapons capabilities nonsense debunked here since 2006 (a summary of some key evidence is linked here), e.g. her 9pm 18 Jan 2024 CND-biased propaganda showpiece Nuclear Armageddon: How Close Are We? https://www.bbc.co.uk/iplayer/episode/m001vgq5/nuclear-armageddon-how-close-are-we which claims - from the standpoint of 1980s Greenham Common anti-American CND propaganda - that the world would be safer without nuclear weapons, despite the 1914-18 and 1939-45 trifles that she doesn't even bother to mention, which were only ended with nuclear deterrence. Moreover, she doesn't mention the BBC's Feb 1927 WMD-exaggerating broadcast by Noel-Baker, which used the false claim that there is no defence against mass destruction by gas bombs to argue for UK disarmament, something that later won him a Nobel Peace Prize and helped ensure the UK had no deterrent against the Nazis until too late, setting off WWII (Nobel peace prizes were also awarded to others for lying, too, for instance Norman Angell, whose pre-WWI book The Great Illusion helped ensure Britain's 1914 Liberal party Cabinet procrastinated on deciding what to do if Belgium was invaded, and thus failed to deter the Kaiser from triggering the First World War!). The whole basis of her show was to edit out any realism whatsoever regarding the topic which is the title of her programme! No surprise there, then. Los Alamos, Livermore and Sandia are currently designing the W93 nuclear warhead for SLBMs to replace the older W76 and W88, and what she should do next time is to address the key issue of what that design should be to deter dictators without risking escalation via collateral damage: "To enhance the flexibility and responsiveness of our nuclear forces as directed in the 2018 NPR, we will pursue two supplemental capabilities to existing U.S. nuclear forces: a low-yield SLBM warhead (W76-2) capability and a modern nuclear sea launched cruise missile (SLCM-N) to address regional deterrence challenges that have resulted from increasing Russian and Chinese nuclear capabilities. These supplemental capabilities are necessary to correct any misperception an adversary can escalate their way to victory, and ensure our ability to provide a strategic deterrent. Russia’s increased reliance on non-treaty accountable strategic and theater nuclear weapons and evolving doctrine of limited first-use in a regional conflict, give evidence of the increased possibility of Russia’s employment of nuclear weapons. ... The NNSA took efforts in 2019 to address a gap identified in the 2018 NPR by converting a small number of W76-1s into the W76-2 low-yield variant. ... In 2019, our weapon modernization programs saw a setback when reliability issues emerged with commercial off-the-shelf non-nuclear components intended for the W88 Alteration 370 program and the B61-12 LEP. ... Finally, another just-in-time program is the W80-4 LEP, which remains in synchronized development with the LRSO delivery system. ... The Nuclear Weapons Council has established a requirement for the W93 ... If deterrence fails, our combat-ready force is prepared now to deliver a decisive response anywhere on the globe ..." - Testimony of Commander Charles Richard, US Strategic Command, to the Senate Committee on Armed Services, 13 Feb 2020. This issue of how to use nuclear weapons safely to deter major provocations that escalate to horrific wars is surely the key issue humanity should be concerned with, not the CND time-machine of returning to a non-nuclear 1914 or 1939! Corbin doesn't address it; she uses debunked old propaganda tactics to avoid the real issues and the key facts.

For example, Corbin quotes only half a sentence by Kennedy in his TV speech of 22 October 1962: "it shall be the policy of this nation to regard any nuclear missile launched from Cuba against any nation in the Western hemisphere as an attack by the Soviet Union on the United States", and omits the second half of the sentence, which concludes: "requiring a full retaliatory response upon the Soviet Union." Kennedy was clearly using US nuclear superiority in 1962 to deter Khrushchev from allowing the Castro regime to start any nuclear war with America! By chopping up Kennedy's sentence, Corbin juggles the true facts of history to meet the CND agenda of "disarm or be annihilated." Another trick is her decision to uncritically interview CND-biased anti-civil defense fanatics like the man (Professor Freedman) who got Bill Massey of the Sunday Express to water down my article debunking pro-war CND type "anti-nuclear" propaganda lies on civil defense in 1995! Massey reported to me that Freedman claimed civil defense is no use against an H-bomb, which he claims is cheaper than dirt-cheap shelters, exactly what Freedman wrote in his deceptive letter published in the 26 March 1980 Times newspaper: "for far less expenditure the enemy could make a mockery of all this by increasing the number of attacking weapons", which completely ignores the Russian dual-use concept of simply adding blast doors to metro tubes and underground car parks, etc. In any case, civil defense makes deterrence credible, as even the most hard-left wingers like Duncan Campbell acknowledged on page 5 of War Plan UK (Paladin Books, London, 1983): "Civil defence ... is a means, if need be, of putting that deterrence policy, for those who believe in it, into practical effect."