Above: film of the Effects of Nuclear Weapons.

'The Life Span Study (LSS) population consists of about 120,000 persons who were selected on the basis of data from the 1950 Japanese National Census. This population includes ... atomic-bomb survivors living in Hiroshima or Nagasaki and nonexposed controls. ... all persons in the Master Sample who were located less than 2,500 meters from the hypocenter ATB were included in the LSS sample, with about 28,000 persons exposed at less than 2,000 meters serving as the core. Equal numbers of persons who had been located 2,500-9,999 meters from hypocenter ... were selected to match the core group by age and sex. ... As of 1995, more than 50% of LSS cohort members are still alive. As of the end of 1990, almost 38,000 deaths have occurred in this group, including about 8,000 cancer deaths among the 87,000 survivors. Approximately 430 of these cancer deaths are estimated to be attributable to radiation.'
Nuclear tests were later conducted in 1953 and 1955 to see whether white-washing wooden houses would reflect the heat flash and prevent ignition: it worked! Various types of full-scale houses were exposed in Nevada on March 17, 1953 to the 16-kiloton Annie shot and on May 5, 1955 to the 29-kiloton Apple-2 shot. The fronts of the houses were charred by the intense radiant heat flash, but none of them ignited, even where the blast was severe enough to physically demolish the house. Fences and huts exposed at the 1953 test had to be covered in rubbish before they would ignite: see this film.
Even ignoring the problem of starting a fire in a modern building after EMP has taken out the mains electricity, firestorms are impossible in most cities, judging by the fuel-loading criteria established when the R.A.F. and U.S.A.F. tried very hard to start them with thousands of tons of incendiaries dropped over limited areas in World War II. Some experts were aware of this as early as 1979:
‘Some believe that firestorms in U.S. or Soviet cities are unlikely because the density of flammable materials (‘fuel loading’) is too low–the ignition of a firestorm is thought to require a fuel loading of at least 8 lbs/ft² (Hamburg had 32), compared to fuel loading of 2 lbs/ft² in a typical U.S. suburb and 5 lbs/ft² in a neighborhood of two-story brick row-houses.’ – U.S. Congress, Office of Technology Assessment, The Effects of Nuclear War, May 1979, page 22.
See also the report by Drs. Kenneth A. Lucas, Jane M. Orient, Arthur Robinson, Howard MacCabee, Paul Morris, Gerald Looney, and Max Klinghoffer, ‘Efficacy of Bomb Shelters: With Lessons From the Hamburg Firestorm’, Southern Medical Journal, vol. 83 (1990), No. 7, pp. 812-20:
‘Others who have recently tried to develop criteria for the development of a firestorm state that the requisite fuel loading appears to be about four times the value of 8 lb/sq ft cited earlier. ... A standard Soviet civil defense textbook states: "Fires do not occur in zones of complete destruction [overpressure greater than 7 psi]; flames due to thermal radiation are prevented, because rubble is scattered and covers the burning structures. As a result the rubble only smolders."’
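Purely as an illustration, here is a minimal sketch in Python (mine, not from either source) that sets the fuel loadings quoted above against the two ignition thresholds quoted above (the OTA's 8 lb/ft² and the roughly fourfold higher figure cited by Lucas et al.):

```python
# Toy comparison of the fuel loadings quoted above (lb/ft^2) against the
# firestorm ignition thresholds quoted above. Illustrative only.

OTA_THRESHOLD = 8.0          # lb/ft^2 (OTA, 1979)
LUCAS_THRESHOLD = 4 * 8.0    # lb/ft^2 ("about four times" that, Lucas et al., 1990)

fuel_loadings = {
    "Hamburg 1943": 32.0,
    "Typical U.S. suburb": 2.0,
    "Two-story brick row-houses": 5.0,
}

for area, loading in fuel_loadings.items():
    if loading >= LUCAS_THRESHOLD:
        verdict = "exceeds both thresholds"
    elif loading >= OTA_THRESHOLD:
        verdict = "exceeds the OTA threshold only"
    else:
        verdict = "below the firestorm threshold"
    print(f"{area}: {loading:4.0f} lb/ft^2 -> {verdict}")
```

On these quoted figures, only the medieval core of Hamburg qualifies; the suburb and row-house loadings fall well short of even the lower threshold.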
The then-‘secret’ May 1947 U.S. Strategic Bombing Survey report on Nagasaki states (v. 1, p. 10): ‘… the raid alarm was not given ... until 7 minutes after the atomic bomb had exploded ... less than 400 persons were in the tunnel shelters which had capacities totalling approximately 70,000.’ This situation, with most people out in the open watching the lone B-29 bombers, led to the severe thermal-radiation burns and flying-debris injuries in Hiroshima and Nagasaki. The May 1947 U.S. Strategic Bombing Survey report on Hiroshima, pp. 4-6, states:
‘Six persons who had been in reinforced-concrete buildings within 3,200 feet [975 m] of air zero stated that black cotton black-out curtains were ignited by flash heat... A large proportion of over 1,000 persons questioned was, however, in agreement that a great majority of the original fires were started by debris falling on kitchen charcoal fires... There had been practically no rain in the city for about 3 weeks. The velocity of the wind ... was not more than 5 miles [8 km] per hour….
‘The fire wind, which blew always toward the burning area, reached a maximum velocity of 30 to 40 miles [48-64 km] per hour 2 to 3 hours after the explosion ... Hundreds of fires were reported to have started in the centre of the city within 10 minutes after the explosion... almost no effort was made to fight this conflagration within the outer perimeter which finally encompassed 4.4 square miles [11 square km]. Most of the fire had burned itself out or had been extinguished on the fringe by early evening ... There were no automatic sprinkler systems in building...’
Dr Ashley Oughterson and Dr Shields Warren noted a fire risk in Medical Effects of the Atomic Bomb in Japan (McGraw-Hill, New York, 1956, p. 17):
‘Conditions in Hiroshima were ideal for a conflagration. Thousands of wooden dwellings and shops were crowded together along narrow streets and were filled with combustible material.’
Dr Harold L. Brode and others have investigated the physics of firestorms in commendable depth; see, for example, the 1986 report The Physics of Large Urban Fires: http://fermat.nap.edu/books/0309036925/html/73.html
That report outlines the history of firestorms and gives some very interesting empirical and semi-empirical formulae. It contains computer simulations of firestorms which show the way Hiroshima and Hamburg must have burned, and these simulations might well apply to forests in the fall (with few green leaves left to shield the ground, and plenty of dry leaves and branches on the ground to act as kindling, like the litter used to start the fence and hut fires at the 1953 Nevada nuclear tests). However, modern cities would tend to be left as smouldering rubble. Dresden and Hamburg had medieval multistorey wooden buildings in the areas that burned, and people don't build cities like they used to!
The best equation in that 1986 report is Brode's formula for the thermal radiating power versus time curve of a low altitude nuclear detonation: P(t) ~ P_max × 2t^2/(1 + t^4), where t is the time measured in units of the time at which the final peak of thermal power occurs.
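As a quick illustration of the formula above (a sketch of my own, not from Brode's report), the normalized curve rises to its maximum of 1 at t = 1 and then falls off roughly as 2/t^2 at late times:

```python
# Brode's normalized thermal power curve: P(t)/P_max = 2 t^2 / (1 + t^4),
# with t measured in units of the time of the final thermal maximum.

def normalized_thermal_power(t):
    """Normalized radiated power; peaks at 1.0 when t = 1."""
    return 2.0 * t**2 / (1.0 + t**4)

for t in (0.25, 0.5, 1.0, 2.0, 4.0, 10.0):
    print(f"t = {t:5.2f} x t_max  ->  P/P_max = {normalized_thermal_power(t):.3f}")
```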
This useful equation adds to the vast collection of empirical formulae in his detailed 50-odd page mathematical analysis, 'Review of Nuclear Weapons Effects', Annual Review of Nuclear Science, vol. 18, 1968, and in his many blast wave reports from the early 1980s. However, he has to quote the thermal time scaling law from Glasstone and Dolan, which is nonsense.
Glasstone and Dolan state that the time of the final (second) maximum of thermal power from a low air burst is 0.0417W^0.44 seconds, where W is the yield in kilotons. This is a fiddle, based on computer calculations which use an imperfect knowledge of the properties of hot air. Nuclear test data from all the American low altitude tests of 1945-62 shows that the empirical law is quite different: 0.036W^0.48 seconds. In a later post I'll collect the most important empirical formulae together with the test data from which they are derived, and describe the controversy which resulted when Dolan took over editorship of Capabilities of Nuclear Weapons (and The Effects of Nuclear Weapons) and moved most nuclear effects predictions from being based on test data to being based on computer calculations of the physics from first principles.
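For a feel for how far apart the two scaling laws are, here is a short comparison (my own sketch, simply evaluating the two expressions quoted above, with W in kilotons and the result in seconds):

```python
# Time of the final (second) thermal maximum of a low air burst, in seconds,
# according to the two scaling laws quoted above (W in kilotons).

def t_max_glasstone_dolan(W_kt):
    return 0.0417 * W_kt**0.44   # Glasstone & Dolan 1977

def t_max_empirical(W_kt):
    return 0.036 * W_kt**0.48    # empirical fit to 1945-62 U.S. low altitude test data

for W in (1, 20, 1_000, 10_000):   # 1 kt, 20 kt, 1 Mt, 10 Mt
    print(f"W = {W:>6} kt:  G&D {t_max_glasstone_dolan(W):6.3f} s,  "
          f"empirical {t_max_empirical(W):6.3f} s")
```

The two expressions agree closely around 20 kt, the yield of the standard curve in the book, but diverge at megaton yields, which is exactly the region at issue here.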
The reason for the false time of the second thermal maximum must be to compensate for a change in the shape of the thermal power curve at higher yields. Glasstone and Dolan show a thermal power curve for low-yield weapons in which 20% of the thermal radiation is emitted by the time of the final peak of thermal power. Dolan's DNA-EM-1, however, gives a computer-simulated curve showing that about 30% is emitted by that time for a high-yield weapon (also an air burst in approximately sea-level air). Brode, in the 1968 Annual Review of Nuclear Science article, gives a formula which shows that the thermal yield increases from about 40% of the initial fireball energy to 44% as the yield increases from, say, 1 kt to 10 Mt or so.
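As a rough cross-check on the 20% figure (a sketch of my own, using the single normalized Brode curve quoted earlier rather than yield-dependent curves), numerical integration gives about 22% of the total thermal emission radiated by the time of the final maximum; the shift toward roughly 30% at high yields is not captured by that one fixed curve.

```python
# Integrate Brode's normalized curve, P/P_max = 2 t^2 / (1 + t^4), to estimate the
# fraction of the total thermal emission radiated by the final maximum (t = 1).

def power(t):
    return 2.0 * t**2 / (1.0 + t**4)

def integrate(f, a, b, n=100_000):
    """Plain trapezoidal rule."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

emitted_by_peak = integrate(power, 0.0, 1.0)
total = integrate(power, 0.0, 200.0)   # tail falls off as 2/t^2, so truncate far out
print(f"Fraction emitted by t = 1: {emitted_by_peak / total:.0%}")   # about 22%
```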
What physically happens is that the radius and the time of the final peak of fireball radiating power scale roughly as W^2/5 and W^0.48 respectively. However, the principal thermal minimum (before the final maximum brightness) is caused by nitrogen dioxide created in the shock front, and both the range and the duration of this shielding scale as W^1/3. Hence, as the bomb yield increases, the nitrogen dioxide shield created by the shock covers less of the fireball and a smaller proportion of the thermal radiation curve.
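A back-of-envelope sketch (mine, using only the exponents just quoted) shows the size of this effect: relative to the fireball radius the shield range scales as W^(1/3 - 2/5), and relative to the time of the final peak its duration scales as W^(1/3 - 0.48), so both relative coverages shrink as the yield grows.

```python
# Relative coverage of the shock-generated NO2 shield versus yield, using the
# scaling exponents quoted above: fireball radius ~ W^(2/5), time of final
# peak ~ W^0.48, shield range and duration ~ W^(1/3). Normalized to 1 at 1 kt.

def relative_coverage(W_kt, shield=1/3, radius=2/5, time=0.48):
    spatial = W_kt ** (shield - radius)   # shield range / fireball radius
    temporal = W_kt ** (shield - time)    # shield duration / time of final peak
    return spatial, temporal

for W in (1, 100, 10_000):                # 1 kt, 100 kt, 10 Mt
    s, t = relative_coverage(W)
    print(f"W = {W:>6} kt:  relative shield range x{s:.2f},  relative duration x{t:.2f}")
```

Going from 1 kt to 10 Mt, the shield's relative range drops to about half and its relative duration to about a quarter, consistent with the qualitative argument above.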
This means that the percentage of the bomb energy radiated as thermal energy increases slightly with yield, and so does the fraction radiated by the time of the final maximum. Rather than presenting a set of thermal emission curves for different yields, Glasstone and Dolan 1977 apparently used the standard 20 kt thermal power curve and changed the formula for the second maximum, so that it defined not the true time of the second maximum but rather the time by which 20% of the thermal energy had been emitted, bringing it into reasonable agreement with the curves presented.
Many other fiddles exist in Glasstone & Dolan 1977. In January 1963, Dr Edward C. Freiling and Samuel C. Rainey of the U.S. Naval Radiological Defense Laboratory issued a 17 page draft report, Fractionation II: On Defining the Surface Density of Contamination, stating: "The section ‘Radiation Dose Over Contaminated Surfaces,’ in The Effects of Nuclear Weapons, is out of date with regard to the account it takes of fractionation effects. This report presents the technical basis for revising that section. It recommends that the exposure rate from fractionated debris be presented as the product of a contamination surface density with the sum of 3 terms. The 1st term is the exposure rate contribution of refractorily (unfractionated) behaving fission products. The 2nd term is for volatilely (fractionated) behaving fission product nuclides. The 3rd term expressed the contribution of the induced activities."
This criticism of The Effects of Nuclear Weapons was deleted in the final March 1963 version of the report (USNRDL-TR-631). However, criticism of Glasstone’s neglect of the way fractionation varies with distance across the fallout pattern continued: R. Robert Rapp of the RAND Corporation authored the 1966 report RM-5164-PR, An Error in the Prediction of Fallout Radiation, and John W. Cane authored the 1967 DASIAC Special Report 64, Fallout Phenomenology: Nuclear Weapons Effects Research Project at a Crossroads. (All of this was ignored in the 1977 edition.)
Glasstone and Dolan state, for example, that water surface bursts deposit only 30% of their radioactivity as local fallout. This figure comes from an inaccurate analysis made during Operation Redwing, on page 57 of report WT-1318, which says that the water surface bursts Tewa and Flathead deposited 28% and 29% of their fallout activity locally (within areas of 43,500 and 11,000 square miles respectively). This is misleading, however, because the same report says that the water surface burst Navajo deposited 50% of its fallout activity locally over 10,490 square miles, while the land surface burst Zuni deposited 48% of its fallout activity locally over 13,400 square miles. On 9 July 1957, B. L. Tucker of the RAND Corporation showed, in his secret report Fraction of Redwing Activity in Local Fallout, that these percentages were based on a false conversion factor from dose rate to activity, and that with the correct conversion factor the corrected Redwing data show all of these tests depositing about 68-85% of their activity locally. More accurate data from Operation Hardtack in 1958 also show that water surface bursts deposit similar local fallout to ground surface bursts, although water burst fallout is less fractionated.
During the 1960s, the consequences of fission product fractionation for the ‘percentage’ of radioactivity deposited as local fallout, and for the radiation decay rate as a function of particle size and therefore of downwind distance, were worked out within the Defense Atomic Support Agency of the U.S. Department of Defense. It was established that there is no single fixed percentage of radioactivity in early fallout, since the different fission products fractionate differently, so the percentage depends on which nuclides are being considered. If attempts are made to add up the total radioactivity, it is found that the beta and gamma activities of the different fission products differ, as does the average gamma ray energy of fallout fractionated to different degrees at the same time after detonation, so it is not scientific to quote a single ‘average’ percentage of the total radioactivity in local fallout; each fission product must be considered separately. For example, after the Hardtack-Oak surface burst, 49% of the Cs-137 but 89% of the Mo-99 was deposited within 24 hours of burst, while Glasstone states simply that 60% of the activity is deposited within 24 hours. This makes the treatment of fallout in The Effects of Nuclear Weapons both misleading as a scientific explanation and generally inaccurate for any sort of numerical fallout calculation.
In a previous post, it was mentioned that the official U.S. manual in 1957 exaggerated the ignition radius for shredded dry newspaper from a 10-Mt air burst on a clear day by a factor of two, and that fire areas were therefore exaggerated by a factor of four (a circular area being pi times the square of its radius).
It also exaggerated the range of blistered skin (second-degree burns):
Dr Carl F. Miller, who worked for the U.S. Naval Radiological Defense Laboratory at nuclear tests, hit out in the February 1966 Scientist and Citizen: ‘Reliance on the Effects of Nuclear Weapons has its shortcomings... I was twenty miles from a detonation ... near ten megatons. The thermal flash did not produce the second-degree burn on the back of my neck, nor indeed any discomfort at all.’
The online NATO HANDBOOK ON THE MEDICAL ASPECTS OF NBC DEFENSIVE OPERATIONS, FM 8-9, 1996, calculates in Table 4-VI that second-degree skin burns even from 10 Mt air bursts (where the range is greater than from surface bursts) would only extend to a range of 14.5 km (9 miles) in a typical city with atmospheric visibility of 10 km.
There is plenty of evidence that the high mortality among thermal burns victims was due to combined thermal and nuclear radiation exposure, since the nuclear radiation doses to people in the open at thermal burns ranges were sufficient to depress the bone marrow, which produces the white blood cells. The maximum depression in the white blood cell count occurs a few weeks after exposure, by which time the thermal burns had generally become infected in the insanitary conditions the survivors had to make do with. This combination of depressed infection-fighting capability and infected burn wounds often proved lethal at Hiroshima and Nagasaki, but it is essential to note that apparently severe burns were more superficial than most propaganda makes out; nobody was vaporized:
‘Persons exposed to nuclear explosions of low or intermediate yield may sustain very severe burns on their faces and hands or other exposed areas of the body as a result of the short pulse of directly absorbed thermal radiation. These burns may cause severe superficial damage similar to a third-degree burn, but the deeper layers of the skin may be uninjured. Such burns would heal rapidly [unless the person also receives a massive nuclear radiation dose], like mild second-degree burns.’ – Dr Samuel Glasstone and Philip J. Dolan, editors, The Effects of Nuclear Weapons, U.S. Department of Defense, 1977, p. 561.
The British Home Office Scientific Advisory Branch, whose 1950s and 1960s reports on civil defence aspects of nuclear weapons tests are available at the U.K. National Archives (references HO229, HO338, etc., also see DEFE16 files), immediately distrusted the American manual on several points. Physicists such as George R. Stanbury from the Scientific Advisory Branch had attended Operation Hurricane and other U.K. nuclear weapons tests to measure the effects!
However, when they published the British data in civil defence publications, they were quickly 'discredited' by physics academics using the American manual. The problem was that they could not reveal where their data came from, because of secrecy. This plagued civil defence science not only in Britain but also in America throughout the cold war. The public distrusted all but the most exaggerated 'facts', being led by propaganda from various sources (including the Soviet-funded lobbies such as the U.S.S.R.-controlled 'World Peace Council') that the only way to be safe was to surrender by unilateral nuclear disarmament. (Similarly, Japan was supposedly safe from nuclear attack in August 1945 because it had no nuclear weapons.)
Professor Freeman Dyson helpfully explained the paradox in his 1984 book Weapons and Hope: '[Civil defence measures] according to the orthodox doctrine of deterrence, are destabilizing insofar as they imply a serious intention to make a country invulnerable to attack.'
This political attitude meant simply that, without civil defence, both sides need fewer weapons to deter one another. By one way of looking at the logic, therefore, it is more sensible to have no civil defence at all, since that allows both sides to agree to keep only a minimal number of weapons for deterrence. This thinking set in during the 1960s, and spread from passive civil defence to active defences, leading to the ABM Treaty in which both the U.S.S.R. and the U.S. agreed to limit the number of their ABM (anti-ballistic missile) systems.
The ABM Treaty was, I believe, signed by people like President Nixon, whom some regarded as slightly cynical. The problem with pure deterrence is that it doesn't help you if you have no civil defence and a terrorist attacks you, or there is a less than all-out war. (Even if you disarm your country of nuclear weapons entirely, you are obviously no more safe from a nuclear attack than Japan was in August 1945, when both Hiroshima and Nagasaki were blasted. So you still need civil defence, unless you start pretending - as many do - that people were magically vaporised in the hot but rapidly cooling, nitrogen dioxide-coloured 'fireball' of hot air, and falsely claim from this lie that duck and cover would not have helped the burned, battered and lacerated survivors.)
In America, nuclear-age civil defence had begun with the crazy-sounding but actually sensible 'duck and cover' advice of 1950, just after the first Russian nuclear test. Effects like shattered glass, bodily displacement, and thermal flash burns covered the largest areas in Hiroshima and Nagasaki, and, as nuclear test data show, were the easiest to protect against. The higher the overpressure, the more nearly horizontal the trajectories of the glass fragments, so you can avoid lacerations by ducking, which also protects against burns and against bodily displacement by the wind drag (dynamic pressure). The last big civil defence expenditure was President Kennedy's fallout shelter program of 1961, which stocked public building basements throughout America with food, water and radiation meters. (Something like two million radiation meters had to be made.)
In Britain the 'Civil Defence Corps', successor to the civil defence organisation honoured during the Blitz in World War II, was finally abolished in March 1968, after ridicule. (Civil Defence Handbook No. 10, dated 1963, was held up in Parliament for public ridicule, mainly because of one sentence advising people driving cars to 'park alongside the kerb' if an explosion occurred. This was regarded as patronising and stupid by the British Government of 1968 - which was, of course, a different one from that of 1963, when the manual was published in response to the public demand stemming from the Cuban missile crisis.)
Recently Dr Lynn Eden has written a book, with input from firestorm modeller Dr Harold Brode and various other physicists, Whole World on Fire: Organizations, Knowledge, and Nuclear Weapons Devastation (Ithaca, N.Y.: Cornell University Press, 2004), which makes the case that the U.S. Defense Intelligence Agency's secret technical publication, Physical Vulnerability Handbook - Nuclear Weapons (1954-92), never included any predictions of fire damage! (Some reviewers of that book have falsely claimed that fire risks were covered up, which is absurd seeing that the non-secret 1957 U.S. Department of Defense book The Effects of Nuclear Weapons falsely shows that dry shredded newspaper would be ignited 56 km from a 10 megaton air burst, a range that was halved, to 28 km, in the 1977 edition. Dr Eden misses this completely, and tends to poke fun at civil defence in her presentation of the 1953 Encore nuclear test thermal ignition experiments in Nevada. We'll look at the facts in a later post.)
However, the key scientific reference is not that physical vulnerability handbook but the U.S. Department of Defense's secret manual Capabilities of Nuclear Weapons, which does contain fire ignition predictions. The tests and the science will be examined in later posts.
Photo credit: Hiroshima photo in colour was taken by the United States Air Force.
very interesting and very helpful!!!!!!!!!!!
Copy of a comment to http://riofriospacetime.blogspot.com/2006/11/direct-route.html:
Hi Kea,
Lubos Motl is right about the climate change manure, because what the doom-sayers of climate change forget is that we're running out of fossil fuels anyhow!
Before the world is wrecked completely by global warming, we'll have run out of oil, coal and gas anyway. (North Sea oil is far more expensive than Arabian supplies because of the cost of North Sea oil rigs, and gas - "vapour", I suppose, to USA readers, to avoid confusion with gas(oline) - is seriously more expensive now that the market has been opened up to Europe by a new gas pipeline; if I buy a new home I'm getting all-electric heating, not the traditional piped gas heating currently used in most of the UK.)
To combat global warming, go nuclear. Nuclear power is clean, safe, and it is EVEN ECONOMICAL if you cut through the radioactive pollution horseshit propaganda and ignore the crackpots who claim radiation is lethal.
Those people don't understand background radiation, or the effect of higher altitudes, air travel, etc on radiation exposure.
They think "natural" radiation is safe and "artificial" radiation from nuclear power is totally different.
The Health Physicists who work in the nuclear industry are a load of gormless, feeble, patronising fools who couldn't explain anything to anybody without making it sound like condescending pro-nuclear propaganda, which is why the situation continues.
They have no idea that physics and maths are well known, and that people can by and large understand radiation. They perpetuate the myths.
The FIRST thing physicists in the f***ing nuclear industry should put on their posters is the fact that on the Moon the natural radiation is 50 times higher than on Earth: about 1 mR/hr on the Moon compared to 0.02 mR/hr on Earth (the Earth's atmosphere shields us from most of the cosmic background radiation, which is 99% protons and alpha particles).
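As a rough check (my own sketch, simply converting the hourly figures just quoted into annual totals, on the usual rule-of-thumb assumption that 1 mR of gamma exposure corresponds to roughly 0.01 mSv):

```python
# Convert the quoted background dose rates into annual doses.
# Assumption (illustrative rule of thumb): 1 mR of gamma exposure ~ 0.01 mSv.

HOURS_PER_YEAR = 24 * 365.25
MSV_PER_MR = 0.01

for place, mr_per_hr in {"Earth (sea level)": 0.02, "Moon (surface)": 1.0}.items():
    mr_per_year = mr_per_hr * HOURS_PER_YEAR
    print(f"{place}: {mr_per_hr} mR/hr  ->  {mr_per_year:,.0f} mR/yr "
          f"~ {mr_per_year * MSV_PER_MR:.1f} mSv/yr")
```

The Earth figure, around 1.8 mSv per year, is in the same general range as typical reported natural background doses, so the quoted 0.02 mR/hr is a sensible round number.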
Then they should give the ACTUAL (not relative!) natural dose rates in different cities with different bedrocks and altitudes above sea level! The thorium-rich beach sands of Brazil and India are more radioactive than 90% of the "nuclear waste" from the nuclear industry!
It is a wide range!!! Then, finally, they should expose the lies about low-level radiation by plotting the mortality in Hiroshima and Nagasaki and the other long-term reliable studies as a function of dose.
It is true that massive doses severely increase leukemia rates, but it is a lie that small doses do so in proportion, or that other cancers are increased in the same way. Leukemia is a special problem, because the bone marrow is very susceptible to ionising radiation.
In particular, it is high-LET (linear energy transfer) radiation like alpha particles, together with weakly penetrating radiations like beta particles and soft x-rays (which deposit their energy locally, causing ionisation by the photoelectric effect), inside the body that increase cancer risks, NOT penetrating low-LET radiation like gamma rays (unless the dose is really massive).
People should be aware that the more penetrating the radiation is, the less of it gets stopped by soft tissue in the body, so the LOWER the absorbed dose per unit of fluence!
The real dangers from low-level radiation are from ingesting or inhaling soluble alpha and beta emitters, like radium and strontium-90 respectively, which get deposited in bone and can cause leukemia. Radon gas from the decay of radium is also a massive natural hazard, killing far more people than all the hundreds of megatons of 1950s nuclear tests or Chernobyl.
Chernobyl showed that iodine-131 causes a short-term problem (half-life about 8 days) after an explosion, because it gets concentrated in milk and then in kids' thyroid glands, where it can cause lumps (mostly benign). The answer is simple: for the few weeks while milk is contaminated, either put it through an ion-exchanger to remove the iodine-131, switch to powdered milk, or simply keep the cattle in winter barns eating winter feed like hay so that they don't eat contaminated grass! Problem sorted!
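To put 'a few weeks' on a slightly firmer footing, here is a small sketch (mine, purely illustrative) of how quickly iodine-131 decays away with its roughly 8-day half-life:

```python
import math

# Iodine-131 decay: activity halves every ~8 days.
HALF_LIFE_DAYS = 8.02
DECAY_CONSTANT = math.log(2) / HALF_LIFE_DAYS   # per day

def fraction_remaining(days):
    return math.exp(-DECAY_CONSTANT * days)

for days in (8, 16, 30, 60, 90):
    print(f"After {days:3d} days: {fraction_remaining(days) * 100:6.2f}% of the I-131 remains")
```

After about two months the activity is below 1% of its initial value, so the contamination problem really is a matter of weeks rather than months or years.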
See http://glasstone.blogspot.com/2006/04/fallout-prediction-and-common-sense-in.html for more info on this!
Regarding the Hiroshima and Nagasaki long-term radiation effects cover-up, see the links on http://www.rerf.or.jp/top/introe.htm:
Fewer than 1% of victims died due to cancer caused by radiation!!!!!
The maximum leukemia rate occurred in 1952 and has been declining ever since. There were no genetic effects above the normal rate in the offspring of even highly irradiated survivors, and cancer risks were carefully studied:
'The Life Span Study (LSS) population consists of about 120,000 persons who were selected on the basis of data from the 1950 Japanese National Census. This population includes ... atomic-bomb survivors living in Hiroshima or Nagasaki and nonexposed controls. ... all persons in the Master Sample who were located less than 2,500 meters from the hypocenter ATB were included in the LSS sample, with about 28,000 persons exposed at less than 2,000 meters serving as the core. Equal numbers of persons who had been located 2,500-9,999 meters from hypocenter ... were selected to match the core group by age and sex. ... As of 1995, more than 50% of LSS cohort members are still alive. As of the end of 1990, almost 38,000 deaths have occurred in this group, including about 8,000 cancer deaths among the 87,000 survivors. Approximately 430 of these cancer deaths are estimated to be attributable to radiation.'
More here.
Sorry to go on, but all this environmental crackpottery just drives me nuts.
Best,
nc
Nuclear winter has quite an interesting history. It started off with the comet impact that wiped out the dinosaurs. The comet forms a fireball when it collides with the atmosphere, and the thermal radiation is supposed to ignite enough tropical vegetation to produce a thick smoke cloud, freezing the ground and killing off many species.
The best soot for absorbing solar radiation is that from burning oil, and Saddam inadvertently tested this by igniting Kuwait's oil wells at the end of the first Gulf War in 1991. Massive clouds of soot were produced, but the temperature drop in the affected areas was far smaller than "nuclear winter" calculations had predicted: http://en.wikipedia.org/wiki/Nuclear_winter#Kuwait_wells_in_the_first_Gulf_War
The idea that a dark smoke layer will stop heat energy reaching the ground is naive because by conservation of energy, the dark smoke must heat up when it absorbs sunlight, and since it is dark in colour it is as good at radiating heat as absorbing it. So it passes the heat energy downwards as the whole cloud heats up, and when the bottom of the cloud has reached a temperature equilibrium with the top, it radiates heat down to the ground, preventing the dramatic sustained cooling.
Although there is a small drop in temperature at first, as when clouds obscure the sun, all the soot cloud will do in the long run is to reduce the daily temperature variation of the air from day to night, so that the temperature all day and all night will be fairly steady and close to the average of the normal daytime and nighttime temperatures.
The dinosaur extinction evidence, http://en.wikipedia.org/wiki/Chicxulub_Crater, might be better explained by the direct effects of the comet impact: the air blast wave and thermal radiation effects on the dinosaurs, and the kilometres-high tsunami. When the comet struck Chicxulub in Mexico 65 million years ago with an energy of about 100 TT (100,000,000 megatons, or 100 million million tons of TNT equivalent), the continents were grouped more closely together than they are today (see the map at http://www.dinotreker.com/cretaceousearth.html) and would all have suffered severe damage from an explosion of that size. Most dinosaur fossils found are relatively close to the impact site on the world map of 65 million years ago.
Another issue is that some proportion of the rock in the crater was calcium carbonate, which releases CO2 when heated in a fireball. If there was enough of it, the climatic effects would have been due to excessive heating, not cooling.
The "nuclear winter" idea relies on soot, not dust such as fallout (which is only about 1% of the crater mass, the remainder being fallback of rock and crater ejecta which lands within a few minutes). So it is basically an extension of the massive firestorms theory, which has many issues because modern cities don't contain enough flammable material per square kilometre to start a firestorm even when using thousands of incendiaries. In cases such as Hiroshima, the heavy fuel loading of the target area created a smoke cloud which carried up a lot of moisture that condensed in the cool air at high altitudes, bringing the soot back promptly to earth as a black rain.
Because this kind of thing is completely ignored by "nuclear winter" calculations, the whole "nuclear winter" physics looks artificial to me. In 1990, after several studies had shown that TTAPS (Sagan et al.) had massively exaggerated the problem through their assumptions (a 1-dimensional model and so on), TTAPS wrote another paper in Science in which they sneakily modified the baseline nuclear targeting assumptions so that virtually all the targets were oil refineries. This enabled them to claim that a moderate cooling was still credible. However, the experience of the burning Kuwaiti oil wells a few years later did nothing to substantiate their ideas. Sagan did eventually concede that there were faulty assumptions in the "nuclear winter" model, although some of his collaborators continue to write about it.