Category Archives: existential risk

The Germy Paradox – Filters: A taboo

This is the fifth post in a sequence of blog posts on state biological weapons programs. (“A taboo” was originally going to be the second-to-last post, but has been switched with “The shadow of nuclear weapons”; the index in past posts has been rearranged accordingly.) Others will be linked here as they come out:

1.0 Introduction
2.1 The empty sky: A history of state biological weapons programs
2.2 The empty sky: How close did we get to BW usage?
3.1 Filters: Hard and soft skills
3.2 Filters: A taboo
3.3 Filters: The shadow of nuclear weapons
4.0 Conclusion: Open questions and the future of state BW   

In 3.1: Hard and soft skills, we discussed the possibility that the Germy Paradox exists because bioweapons aren’t actually easy to make. Today, we go into the past and discuss another possibility – that whether or not they’re effective, there’s some kind of taboo or cultural reason they aren’t used.

This is not a new idea, although there’s no real consensus. I separate scholarly explanations for the Taboo Filter into two schools: the humaneness hypothesis and the treachery hypothesis. In the humaneness hypothesis, people reject BW because they are unnecessarily cruel. The treachery hypothesis asserts that the taboo is an outgrowth of ancient beliefs about poison and disease in warfare – that they are secretive, unfair, inexplicable, and fundamentally evil. This hypothesis has many facets and intermingles with an evolved revulsion to poison and contamination.

But first, how do weapons taboos break?

The reason this filter explanation is particularly interesting is that taboos exist at the whim of the culture, and don’t have particular concrete reasons for existence. If we are protected by a taboo against BW usage, how resilient is that protection?

Trying to assess the strength of a taboo is difficult. We cannot reliably predict the future or the vicissitudes of future policy decisions, particularly when it comes to rare events like BW development or usage.

This is especially true when we are not even certain of the origin of the taboo. But we can perhaps compare it to another event: the chemical weapons taboo. Chemical and biological weapons are, while not terribly similar, often treated similarly in policy (Smith 2014), and are often lumped together under the categories of “biochemical weapons”, “CBW” (chemical and biological weapons), “CBRN” (chemical, biological, radiological and nuclear weapons), or “WMD” (weapons of mass destruction), so it is appropriate to compare policy decisions.

Chemical weapons have been used much more frequently than BW, and hence the taboo has been broken on multiple occasions. The Hague Convention was broken by the use of poison gas in WW1, and the Geneva Protocol was broken by chemical and biological weapons use by, among others, Japan, Iraq, and Syria (Jefferson 2014). Weapon usage is subject to “popularity”, and the up to 161 uses of chemical weapons in Syria have probably built upon each other, reducing psychological and political barriers to future attacks (Revill 2017). This taboo may already be on unsteady ground.

More directly, the USSR created the largest biological weapons program of all time after signing the Biological Weapons Convention, and Iraq operated an enormous program secretly until 1991 (Wheelis and Rózsa 2009). In addition to a handful of bioterror efforts in modern times, several states are currently suspected of having biological weapons programs. (A list and sources can be found in Aftergood, 2000.) The BW taboo, as well, may not be as resilient as it appears. That said, it seems likely that an outright declaration or usage of biological weapons by a state would both provoke a stronger response and erode the taboo severely. The likelihood of this last possibility is what seems most concerning. However, to understand it, we must understand what factors are behind the apparent taboo today.

Humaneness

The first school of thought I consider holds that a taboo exists because biological and chemical weapons are seen as unacceptably inhumane compared to conventional weapons. This seems to have been the motivation of one of the earliest modern explicit taboos against chemical and biological weapons usage, the 1863 Lieber Code (Jefferson 2014). This set of guidelines for humane warfare in the US Army included the following: 

Article 16: “Military necessity does not admit of cruelty – that is, the infliction of suffering for the sake of suffering or for revenge, nor of maiming or wounding except in fight, nor of torture to extort confessions. It does not admit of the use of poison in any way, nor of the wanton devastation of a district. It admits of deception, but disclaims acts of perfidy; and, in general, military necessity does not include any act of hostility which makes the return to peace unnecessarily difficult.”

Article 70: “The use of poison in any manner, be it to poison wells, or food, or arms, is wholly excluded from modern warfare. He that uses it puts himself out of the pale of the law and usages of war.”

Lieber, 1863

This code influenced other guidelines for warfare, leading to the forbidding of use of chemical and bacteriological weapons in the Hague Conventions and later the Geneva Protocol. 

As a principle, the humaneness taboo relies implicitly on at least one of two assumptions: first, that death or injury from BW is worse than death or injury from conventional weapons; second, that BW are more likely to be used against civilian populations, or to otherwise inflict harm on people who would not be affected by conventional weapons. It is easy to imagine that the days-long struggle of a lethal case of smallpox is less humane than an instantaneous death from a bomb, or that biological weapons are frequently targeted at civilians. But these assumptions should be assessed.

There is some evidence that soldiers affected by chemical weapons during World War 1 had higher rates of post-traumatic stress disorder (Jefferson 2014), and diseases are obviously capable of causing protracted suffering. Still, objections have been raised to the notion that biological weapons harm targets more than conventional weapons. Early US responses to the Lieber Code, quoted above, point out that effects from “poison” weapons are not necessarily worse than sinking ships and causing enemy soldiers to drown (Zanders 2003). BW may even be better – many pathogens give infected enemies “a fighting chance” to recover completely, unlike a kinetic attack that maims or kills outright (Rappert 2003). A minority objection is that since war-makers are motivated to end wars quickly, it may be inhumane to ban any kind of battlefield weapon, on the grounds that removing a country’s best strategy will cause the war to go on longer, and thus extend the suffering that goes along with it (Rappert 2003). These utilitarian frameworks are important to consider, although when tracing the roots of norms, the actual tradeoffs involved matter less than the perception of what the tradeoffs are. Either way, it seems as though neither academia nor military decision-makers have considered these points in detail when making choices about BW, suggesting that this kind of reasoned analysis is not being done anywhere.

The second assumption that the humaneness hypothesis may rely upon is that biological weapons are more likely than conventional weapons to be used on civilians. There is some evidence for this, in that Japanese BW were extensively used against civilians (Barenblatt 2004) and targets of later BW programs during the Cold War included cities and agriculture (Wheelis and Rózsa 2009). But the far larger and more influential nuclear war plans during the Cold War included the destruction of cities and billions of civilian lives (Ellsberg 2017), even after both Soviet and American governments agreed to give up their BW programs. Sparing civilian lives in worst-case scenarios cannot have been a military priority.

If the humaneness hypothesis is true, we should expect states to be more comfortable with facing and wielding nonlethal BW. For chemical weapons, and perhaps biological weapons as well, this seems to be true (Pearson 2006, Martin 2016). The American, Soviet, and Japanese BW programs devoted vast efforts to developing agricultural weapons, which damage crops or livestock without affecting humans (Wheelis and Rózsa 2009). Nonlethal anti-personnel weapons were a major component of the American BW program (Hay 1999) as well as an end goal of the South African BW program (Wheelis and Rózsa 2009). As far as chemical weapons go, the nonlethality of defoliants is considered to have been a major reason those weapons in particular were used by Kennedy during the Vietnam War (Martin 2016) (although other explanations have been proposed as well, as described later in this piece). It is less clear that revelations of a nonlethal BW program today would provoke less fear than revelations of a lethal program – recent literature has not discussed this question.

Disconcertingly, Alan Pearson suggests that the promise of incapacitating nonlethal BW may inspire development of other BW tactics (Pearson 2006). While nonlethal weapons were an eventual goal of the South African weapons program (Wheelis and Rózsa 2009), they planned on developing anthrax weapons first, so there is precedent for this claim. Any erosion of this norm may open the door for more overall use of BW, humane or not. (Ilchmann and Revill 2014)

Powerful and invisible: The treachery hypothesis

A separate body of thought holds that biological warfare exists in the public mind in a category that can be described as “treacherous”. This is related to what Jessica Stern calls the “dreaded” nature of BW: it is invisible, unfamiliar, and triggers a disproportionate degree of disgust and fear (Stern 2003). In explaining this nature, proponents generally refer to the history of revulsion to poison and disease. This would have begun in the evolution of the species, creating an intuitive revulsion to sickness and contamination that kept our ancestors alive. (Cole, 1998)

This trend can be observed in a huge variety of cultures: from Hindu laws of war from 600 AD (Cole 1998), to poison’s association in Christianity with the devil and witchcraft (van Courtland Moon 2008), to South American tribes that allowed warriors to poison their arrows when hunting but not for war (Cole 1998). Poison and disease are often seen as acceptable tools against those regarded as subhuman, but not against equals (Zanders 2003).

It’s important not to overstate it – the taboo is not a human universal. Both the Bible and the Quran contain provisions on how to wage war, but do not forbid poison, disease, or the like as weapons (Zanders 2003). In Europe, despite official prohibitions beginning in the Renaissance era, the use of poison weapons was still being defended until 1737 (Price 1995). Nonetheless, the taboo is still notably common in human culture. It is hard to imagine early taboos existing for humanitarian reasons, when conventional warfare before guns and modern medical treatment was so disabling. Instead, it may be because toxic weapons were seen as unbalanced – hard to detect, difficult to explain, and near-impossible to treat.

Proponents of this view rarely address an apparent contradiction this presents – that poison weapons are taboo because they are too powerful. This seems, on its face, absurd. Richard Price addresses this and suggests that despite the conception of poison as a “woman’s weapon” and a treacherous “equalizer” between forces, it is not actually very effective as a weapon (Price 1995). Similarly, in modern settings, Sonia Ben Ouagrham-Gormley makes a compelling case that the threat of biological weapons programs has been overstated, and that their actual accomplishments have been underwhelming (Ben Ouagrham-Gormley 2014).

Additionally, the persistence of these taboos into the modern age has not been fully explained. In the treachery lens, poison and disease are thought of as “the unknown” and often associated with magic (Price 1995). Today, we understand much more about biology. Does this imply that the taboo is weaker now than it has been? If not, why has the taboo on biological weapons persisted throughout recent history, while no comparable taboo exists against bullets, fire, or explosions? This theory does seem generally robust, and the comparisons to historical taboos are compelling, but existing research does not explain why the taboo persists and is so specific to biology.

Taboos into the future

What kills a taboo? Susan Martin discusses the idea in the context of US chemical weapons usage in Vietnam. In this case, Martin argues, politicians were able to override one norm by asserting another – that usage of defoliant chemical weapons was acceptable because the viable alternative was use of nuclear weapons, also taboo (Martin 2016). Kai Ilchmann and James Revill assert that this is part of a string of incidents that have been eroding the biological and chemical weapons taboos over time (Ilchmann and Revill 2014). In addition, many believe that the Biological Weapons Convention is nearly or entirely useless, since it contains no provisions for verification and since it allows for “defensive research” that is practically indistinguishable from offensive research. (Zanders 2003, McCauley and Payne 2010, Koblentz 2016)

The two hypotheses presented here are not the only ones. Further research or thinking in this area might identify more solidly the source of the taboo, and otherwise help determine how it can be strengthened.

The upcoming section will discuss the idea that there is no taboo, or at least no functional taboo any more. If this is the case, then the lack of observed weapons programs or usage is a purely strategic decision.


References

Aftergood, Steven. “States Possessing, Pursuing or Capable of Acquiring Weapons of Mass Destruction.” Federation Of American Scientists, July 29, 2000. https://fas.org.

Barenblatt, Daniel. A plague upon humanity: The secret genocide of axis Japan’s germ warfare operation. New York: HarperCollins, 2004.

Ben Ouagrham-Gormley, Sonia. Barriers to Bioweapons: The Challenges of Expertise and Organization for Weapons Development. Cornell University Press, 2014.

Cole, Leonard A. “The Poison Weapons Taboo: Biology, Culture, and Policy.” Politics and the Life Sciences 17 (September 1, 1998): 119–32. https://doi.org/10.1017/S0730938400012119.

Courtland Moon, John Ellis van. “The Development of the Norm against the Use of Poison: What Literature Tells Us.” Politics and the Life Sciences 27, no. 1 (2008): 55–77.

Ellsberg, Daniel. The Doomsday Machine: Confessions of a Nuclear War Planner. Bloomsbury Publishing USA, 2017.

Hay, Alastair. “Simulants, stimulants and diseases: the evolution of the United States biological warfare programme, 1945–60.” Medicine, Conflict and Survival 15, no. 3 (1999): 198-214.

Ilchmann, Kai, and James Revill. “Chemical and Biological Weapons in the ‘New Wars.’” Science and Engineering Ethics 20, no. 3 (September 1, 2014): 753–67. https://doi.org/10.1007/s11948-013-9479-7.

Jefferson, Catherine. “Origins of the Norm against Chemical Weapons.” International Affairs 90, no. 3 (May 1, 2014): 647–61. https://doi.org/10.1111/1468-2346.12131.

Koblentz, Gregory D. “Quandaries in contemporary biodefense research.” In Biological Threats in the 21st Century: The Politics, People, Science and Historical Roots, pp. 303-328. 2016.

Lieber, Francis. “Instructions for the Government of Armies of the United States in the Field, ‘The Lieber Code.’” Government Printing Office, April 24, 1863. Lillian Goldman Law Library, The Avalon Project.

Martin, Susan B. “Norms, Military Utility, and the Use/Non-use of Weapons: The Case of Anti-plant and Irritant Agents in the Vietnam War.” Journal of Strategic Studies 39, no. 3 (2016): 321-364.

McCauley, Phillip M., and Rodger A. Payne. “The Illogic of the Biological Weapons Taboo.” Strategic Studies Quarterly 4, no. 1 (2010): 6-35.

Pearson, Alan. “Incapacitating biochemical weapons: Science, technology, and policy for the 21st century.” Nonproliferation Review 13, no. 2 (2006): 151-188.

Price, Richard. “A Genealogy of the Chemical Weapons Taboo.” International Organization 49, no. 1 (1995): 73–103.

Rappert, Brian. “Coding ethical behaviour: The challenges of biological weapons.” Science and engineering ethics 9, no. 4 (2003): 453-470.

Revill, James. “Past as Prologue? The Risk of Adoption of Chemical and Biological Weapons by Non-State Actors in the EU.” European Journal of Risk Regulation 8, no. 4 (December 2017): 626–42. https://doi.org/10.1017/err.2017.35.

Smith, Frank. American biodefense: How dangerous ideas about biological weapons shape national security. Cornell University Press, 2014.

Stern, Jessica. “Dreaded risks and the control of biological weapons.” International Security 27, no. 3 (2003): 89-123.

Wheelis, Mark, and Lajos Rózsa. Deadly cultures: biological weapons since 1945. Harvard University Press, 2009.

Zanders, Jean Pascal. “International Norms Against Chemical and Biological Warfare: An Ambiguous Legacy.” Journal of Conflict and Security Law 8, no. 2 (October 1, 2003): 391–410. https://doi.org/10.1093/jcsl/8.2.391.

The Germy Paradox – Filters: Hard and soft skills

This is the fourth post in a sequence of blog posts on state biological weapons programs. Others will be linked here as they come out:

1.0 Introduction
2.1 The empty sky: A history of state biological weapons programs
2.2 The empty sky: How close did we get to BW usage?
3.1 Filters: Hard and soft skills
3.2 Filters: A taboo
3.3 Filters: The shadow of nuclear weapons
4.0 Conclusion: Open questions and the future of state BW 

Welcome to the second half of our series. (This is post 3.1.) I’ve established that despite extensive historical weapons programs, biological weapons haven’t really been used in a major way since WWII. We don’t ever seem to have been a “close call” away from biological warfare. Why not?

I don’t have a complete answer. I have some pieces of the answer, though. The first piece, and one very good answer, is that BW are not as cheap and deadly as commonly thought, and that substantial resources and expertise are needed to successfully create biological weapons. This argument is well made by Sonia Ben Ouagrham-Gormley in her book Barriers to Bioweapons, which attributes the difficulty to a combination of hard technical skill requirements and soft organizational factors like management. I’ve written a summary of the book on this blog before.

That post will act as Part 1 of this section.

References

Ben Ouagrham-Gormley, Sonia. Barriers to Bioweapons: The Challenges of Expertise and Organization for Weapons Development. Cornell University Press, 2014.

The Germy Paradox – The empty sky: How close did we get to BW usage?

This is the third post in a sequence of blog posts on state biological weapons programs. Others will be linked here as they come out:

1.0 Introduction
2.1 The empty sky: A history of state biological weapons programs
2.2 The empty sky: How close did we get to BW usage?
3.1 Filters: Hard and soft skills
3.2 Filters: A taboo
3.3 Filters: The shadow of nuclear weapons
4.0 Conclusion: Open questions and the future of state BW 

 I’ve heard a lot about “nuclear close calls.” Stanislav Petrov was, at one point, one human and one uncomfortable decision away from initiating an all-out nuclear exchange between the US and the USSR. Then that happened several dozen more times. As described in Part 1, there were quite a few large state biological weapons programs after WWII. Was a similar situation unfolding there, behind the scenes like the nuclear near-misses? Were any biological apocalypses-that-weren’t quietly hidden in the pages of history? What were the actual usage plans for biological weapons, and how close did any of them come to deployment?  

First, note that there are a few examples of state use of BW in modernity – the most significant being the Japanese BW program’s use of lethal weapons against Chinese civilians in the 1940s. More recently, the Rhodesian BW program used weapons against its own citizens on small scales in the 1970s, and South African program operatives attempted to contaminate a refugee camp’s water supply with cholera in 1989 (but failed). There are also a few ambiguous incidents, and the possibility of unnoticed usage. These incidents and possibilities are concerning, but they are rare and small in scale.

The claim I assert is that since the 1940s, there have been no instances of the sort of large-scale usage of BW in international warfare that the imagination conjures upon hearing the phrase “biological weapons.” My research suggests, more tentatively, that not only were there no such uses, but there were no close calls – that even in major state programs that created and stockpiled weapons in quantity, these weapons were virtually never incorporated into war plans, prepared for usage, or, especially, used. The strategies they were intended for were generally vague and retaliatory to offensive uses of biological or nuclear weapons. This approach has another benefit: it clarifies which biological weapons were ever made in mass quantity, or were ever seen to have particular strategic value.

Ways of classifying BW

Note two failure modes that are common in existing literature that tries to extrapolate from past bioweapons plans: first, an overly narrow reference class limited to weapons that were actually used in combat, of which there are very few; second, an overly broad class composed of every pathogen or plan that programs ever experimented with. The latter would include many plans that were impractical or undesirable and eventually abandoned. No literature thus far has focused on the middle ground between these extremes, or analyzed the strategic goals of such usages in depth.

Addressing BW under a framework of planned usage is rare. Categorizations of weapon usage are also uncommon. Most commonly found is the sorting of BW strategies by theater of war – strategic, tactical, and operational uses (Koblentz 2009, Leitenberg et al 2012). This categorization system is broadly applicable and not agent-based, and thus may apply to future weapons with novel agents as well as to past programs. On the other hand, three abstract categories are not enough to be predictive, and many usages could plausibly be described as affecting multiple theaters. Any new categorization system should factor in historical strategies and motivations as a reference class for understanding possible future uses.

No concrete intentions

To illustrate why considering usage plans is valuable, note that many sophisticated BW programs did not in fact make war plans that incorporated their weapons. In many cases, this is because early conceptual research was time-consuming and unsuccessful, and the results were considered inferior to conventional or nuclear weapons. The UK, South African, Canadian, and French programs never resulted in mass production of a finalized weapon, much less strategies for such a weapon’s use (Balmer 1997, Wheelis et al 2006). The South African and UK programs both explored unusual intended uses – assassinations and fertility control (Purkitt and Burgess 2002), and area denial (Wheelis et al 2006), respectively – but no weapons were created that would fulfill them. Even the USSR program, which weaponized more agents than any other BW program in history (Leitenberg et al 2012), failed to make concrete war plans involving BW (Wheelis et al 2006). The broader intents of these programs are still interesting, but clearly the relative importance of various programs changes when considering concrete war plans.

“No first use” of BW

Many state biological weapons programs post-WWII, and their strategies for weapons usage, were heavily intertwined with nuclear policy. From 1946, the UK BW program was guided by a principle of “no first use” that echoed that of their nuclear program – the UK would not use biological weapons unless it was first attacked with biological or nuclear weapons (Balmer 1997). This was also the original stated purpose of the US program, although the “no first use” doctrine was officially rescinded in 1956 (Wheelis et al 2006). The Iraq Survey Group investigation in 2004 revealed Saddam Hussein’s intent in 1991 to use BW to retaliate against “unconventional” weapon attacks by the US (“Iraq Survey Group” 2004). The USSR was likely also interested in using weapons after a nuclear exchange with the US (Leitenberg et al 2012). In this way, BW join nuclear weapons as a facet of mutually assured destruction dynamics.

BW have long been considered “the poor man’s nuclear weapon”. Numerous studies and documents have suggested that biological weapons are cheaper per death caused than nuclear weapons (Koblentz 2004), are relatively simple to construct, and could be developed by less-wealthy states to give them the MAD protection or bargaining capability of a full-fledged nuclear power. This idea has been criticized in several ways – for instance, by Gregory Koblentz, who points out that biological attacks are not reliable in the same way that nuclear attacks are and thus do not fulfill the major goals of deterrence (Koblentz 2009), and by Sonia Ben Ouagrham-Gormley, who observes that biological weapons are in fact significantly more difficult to produce than commonly believed (Ben Ouagrham-Gormley 2014).

Nonetheless, the perception that biological weapons are an alternative to nuclear weapons is widespread, including among powerful figures like Hillary Clinton (Reuters 2011) and Bill Gates (Farmer 2017). In light of this, it is interesting to observe that all three programs for which “no first use” was a primary strategic goal – the UK, US, and Iraq programs – belonged to states that were also nuclear powers or pursuing nuclear weapons. The UK and US BW programs were indeed later abandoned in favor of developing and relying on nuclear weapons programs for defense and deterrence (Balmer 1997).

Non-lethal weapons

The popular image of BW as lethal antipersonnel weapons does not describe the whole of BW development, but certainly describes large swathes of it. The USSR, US, and Iraq programs all mass-produced lethal weapons – for example, anthrax-filled munitions. Of these, it seems that the Iraq program was the only one that actually deployed such weapons. Saddam Hussein was particularly interested in anthrax, not just for its virulence, but for its ability to remain active in soil and potentially cause death over many years.

Despite the prominence of lethal weapons, both in actuality and in the public eye, a great deal of BW effort has been focused on non-lethal weapons. These may be incapacitating antipersonnel weapons, or weapons with non-human targets such as plants or animals, used to attack the enemy’s agricultural system. The tradeoffs between lethal and non-lethal biological weapons have not been well explored in the literature. It has been suggested that caring for incapacitated soldiers, particularly those who are sick over a long period of time, can be more burdensome on an enemy than simply burying and replacing killed soldiers (Pappas et al, 2006). While strongly dependent on the pathogen and expected death rates, this is a practical reason these weapons may be preferred. Inflicting nonlethal or economic damage is also seen as more humane than lethal damage, and perhaps more politically acceptable. Political acceptability was, for instance, part of the internal US motivation for using anti-plant chemical weapons rather than human-targeted weapons in the Vietnam War (Pearson 2006). However, humanitarian or political motivations for developing biological weapons have not been well documented in research.

Some of the first modern BW production was for weapons aimed at cattle: during WWII, the US and UK collaborated on producing anthrax-infected cow cakes, meant to be scattered over German fields to damage agriculture. Later, the US program explored agricultural weapons until 1958 (Hay 1999), and two of the twelve weapons finalized by the US program were agricultural – rice blast and wheat stem rust (Wheelis et al 2006). The USSR did not stockpile anti-agriculture weapons, but a variety of them were developed by a sub-program of the BW apparatus code-named Ecology (Koblentz 2009), and production was designed to scale quickly if needed (Alibek 1999). These would not be used in conditions of total war with the US, but in smaller wars with smaller nations (Alibek 1999).

Both the US and the USSR also invested in incapacitating weapons for use against humans. The USSR weaponized the rarely-lethal Brucella sp. and Burkholderia mallei, as well as Venezuelan equine encephalitis virus (VEEV), which causes flu-like symptoms. The US also invested heavily in incapacitating weapons. Four out of ten standardized antipersonnel weapons produced by the US BW program were not intended to be lethal: Brucella suis, Coxiella burnetii, and VEEV, as well as the staphylococcus-derived enterotoxin B (Wheelis et al 2006). In what may have been the closest call to major use of a BW since WWII, airplanes were loaded with VEEV-containing weapons during the Cuban missile crisis (Guillemin 2004). This was unauthorized, but it nonetheless seems to have come very close to real battlefield usage.

Here’s a summary of my findings by program:

Country | Program active period | Usage | Strategic goal for weapons | Degree of progress in creation
USSR | 1925-1990 | Never used | Somewhat unclear; as second-strike capability (to ensure complete destruction), and as a first-strike weapon against small countries | Multiple biological weapons created and stockpiled
UK | 1942-1956 | Never used | As second-strike capability | No weapons ever developed to completion
USA | 1942-1970 | Never used; nonlethal weapon loaded onto a plane in one instance | Unclear; complement to conventional weapons | Multiple biological weapons created and stockpiled
Canada | 1940s | Never used | Unclear | No weapons ever developed to completion
France | 1940s | Never used | Unclear | No weapons ever developed to completion
Japan | 1940s | Used weapons in international warfare | Death, area denial, regular weapon of war | Multiple biological weapons created, stockpiled, and used
Israel | 1948-??? | Probably never used | Unclear, perhaps as deterrence | Unclear
Rhodesia | 1975-1979 | Used weapons against dissidents inside the country | As internal tool | Crude weapons created and used (e.g. placing cholera in food and water supplies)
Iraq | 1980s-1996 | Never used | As deterrence, second strike | Multiple biological weapons created and stockpiled
South Africa | 1983-1988 | One instance of attempted use (failed) | Assassination, fertility control, possibly as regular weapon of war | No weapons ever developed to completion

[Bold country name = BW used. Italics: BW use attempted or “almost used.”]


References

Alibek, Kenneth. “The Soviet Union’s Anti-Agricultural Biological Weapons.” Annals of the New York Academy of Sciences 894, no. 1 (December 1, 1999): 18–19. https://doi.org/10.1111/j.1749-6632.1999.tb08038.x.

Balmer, Brian. “The drift of biological weapons policy in the UK 1945–1965.” The Journal of Strategic Studies 20, no. 4 (1997): 115-145.

Farmer, Ben. “Bioterrorism Could Kill More People than Nuclear War, Bill Gates to Warn World Leaders.” The Telegraph, February 17, 2017. https://www.telegraph.co.uk/news/2017/02/17/biological-terrorism-could-kill-people-nuclear-attacks-bill/.

Guillemin, Jeanne. Biological weapons: From the invention of state-sponsored programs to contemporary bioterrorism. Columbia University Press, 2004.

Hay, Alastair. “Simulants, Stimulants and Diseases: The Evolution of the United States Biological Warfare Programme, 1945–60.” Medicine, Conflict and Survival 15, no. 3 (July 1, 1999): 198–214. https://doi.org/10.1080/13623699908409459.

“Iraq Survey Group Final Report.” Iraq Survey Group, September 30, 2004. https://www.globalsecurity.org/wmd/library/report/2004/isg-final-report/.

Koblentz, Gregory. Living Weapons. Ithaca: Cornell University Press, 2009.

Koblentz, Gregory. “Pathogens as weapons: The international security implications of biological warfare.” International security 28, no. 3 (2004): 84-122.

Leitenberg, Milton, Raymond Zilinskas, and Jens Kuhn. The Soviet Biological Weapons Program: A History. Cambridge: Harvard University Press, 2012.

Ouagrham-Gormley, Sonia Ben. Barriers to Bioweapons: The Challenges of Expertise and Organization for Weapons Development. Cornell University Press, 2014.

Pappas, G., P. Panagopoulou, L. Christou, and N. Akritidis. “Brucella as a Biological Weapon.” Cellular and Molecular Life Sciences 63 (2006): 2229-36.

Pearson, Alan. “Incapacitating Biochemical Weapons.” The Nonproliferation Review 13, no. 2 (July 1, 2006): 151–88. https://doi.org/10.1080/10736700601012029.

Purkitt, Helen E., and Stephen Burgess. “South Africa’s Chemical and Biological Warfare Programme: A Historical and International Perspective.” Journal of Southern African Studies 28, no. 2 (2002): 229–53.

Thomson Reuters. “Hillary Clinton Warns Terror Threat from Biological Weapons Is Growing.” The National Post, December 7, 2011. https://nationalpost.com/news/biological-weapons-threat-growing-clinton-says.

Wheelis, Mark, Lajos Rozsa, and Malcolm Dando. Deadly Cultures: Biological Weapons since 1945. Cambridge, MA: Harvard University Press, 2006.

The Germy Paradox – The empty sky: A history of state biological weapons programs

This is the second post in a sequence of blog posts on state biological weapons programs. Others will be linked here as they come out:

1.0 Introduction
2.1 The empty sky: A history of state biological weapons programs
2.2 The empty sky: How close did we get to BW usage?
3.1 Filters: Hard and soft skills
3.2 Filters: A taboo
3.3 Filters: The shadow of nuclear weapons
4.0 Conclusion: Open questions and the future of state BW 

A lot of ink has been devoted to describing the history of modern BW programs. Martin et al’s “History of Biological Weapons: From Poisoned Darts to Intentional Epidemics,” Chapter 1 of Medical Aspects of Biological Warfare (2018) [PDF], is a superb short summary with a focus on the agents and weapons explored by these programs. It’s far better than I could write, and relevant to the rest of this sequence. I recommend reading it.

I present a condensed timeline of state BW programs within the last century here as well:

Country | Program active period | Notes
USSR | 1925-1990 | Major expansion and resurgence around 1970. By far the largest BW program to have ever existed. A great deal of our information on it is gleaned from Ken Alibek’s memoir Biohazard.
Japan | 1940s | Weapons were extensively tested and used against the Manchurian Chinese population.
USA | 1942-1970 | A fairly large and well-run program, similar in technical competence to the USSR program despite having 10-fold less funding. Ended with the US beginning the Biological Weapons Convention, which came into effect in 1975.
UK | 1942-1956 | Worked closely with the US BW program. Later helped introduce the Biological Weapons Convention.
Canada | 1940s | Partially sponsored/encouraged by the US.
France | 1940s | Details are scarce.
Israel | 1948-??? | Details are scarce.
Rhodesia | 1975-1979 | Crude weapons created and used (e.g. placing cholera in food and water supplies).
Iraq | 1980s-1996 | Created in response to the external threat represented by Israel. The Iraq Survey Group Final Report is a comprehensive summary.
South Africa | 1983-1988 | Mostly controlled by one person. Focus was on weapons against internal threats and assassination, not large-scale offensive weapons against other nations.

References

The reference mentioned in the first paragraph is: Martin, James W, George W Christopher, and Edward M Eitzen. “Chapter 1: History of Biological Weapons: From Poisoned Darts to Intentional Epidemics.” In Medical Aspects of Biological Warfare, 20. Textbooks of Military Medicine. Fort Sam Houston: Office of the Surgeon General, Borden Institute, 2018.

Iraq: See the 2004 US Iraq Survey Group Final Report. (Available here.)

Israel: See the Nuclear Threat Initiative’s summary (available here.)

Other countries’ details are either in the Martin et al piece linked above, or may be cross-pollinated from other sources.

The Germy Paradox: An Introduction

I’m writing a sequence of blog posts on state biological weapons programs. This is the first post. Others will be linked here as they come out:

1.0 Introduction
2.1 The empty sky: A history of state biological weapons programs
2.2 The empty sky: How close did we get to BW usage?
3.1 Filters: Hard and soft skills
3.2 Filters: A taboo
3.3 Filters: The shadow of nuclear weapons
4.0 Conclusion: Open questions and the future of state BW 

This is a bit of a review of my first year of graduate school. Unintentionally, many of my projects that year revolved around a question I’ve been mulling over for a long time. I’m calling it by a terrible nickname, “the germy paradox”, as a counterpart to the question posed by Enrico Fermi on the apparent nonexistence of aliens in a vast and star-filled night sky: “If biological weapons are as cheap and deadly as everyone seems to fear, then where are they?”

There are few known state bioweapon (BW) programs, and very few at all since the 1970’s. This is despite greater interest and advances in life sciences, as well as a lot of active concern over biodefense. Why don’t we see many state BW programs? Is this for concrete and technological reasons, for strategic reasons, or for reasons of social unacceptability? If some of these reasons go away, for instance as biotechnologies become cheaper and easier to weaponize (as many, e.g. Mukunda et al 2009, Schmidt 2008, believe they will), how safe will we be then?

The history of the biological weapons taboo is multifaceted and will be addressed in greater detail later in this series. But some basic background will help kick things off and explain the question. Biological and chemical weapons are often considered in the same breath (Smith 2014), referred to in early history as “poison” weapons (Cole 1998). Explicit prohibitions of the use of poison weapons in warfare began in the 1400s (Price 1995). Use of biological (or at the time, “bacteriological”) weapons was formally forbidden in combat by the 1925 Geneva Protocol, but several state BW programs went on to proliferate during WWII and the Cold War (Zanders 2003). These programs often stated that their weapons would be used only in retaliation against a nuclear or biological attack, in a version of the popular nuclear “no first use” policy, and they rarely developed clear strategies for BW wartime usage (see Wheelis and Rózsa 2009).

States today are constrained by the Biological Weapons Convention, an international protocol banning not only usage but development of BW. It was developed by the US in 1972 alongside the dismantling of the US’s own program, and has been signed by all but a handful of countries (UN Office).

 Today, no nation claims to have a BW program, nor to have used BW for large-scale attacks since WWII.


I use the language of the Fermi Paradox to structure this series. 

The first part of this series, “The Empty Sky,” describes the modern history of biological weapons. It then makes the case that we have never seen large-scale usage or near-usage of these weapons since WWII. 

The second part, “Filters”, borrows from Robin Hanson’s model of “great filters” that stand in the way of planets creating space-faring civilizations, thus creating the Fermi Paradox (Hanson, 1998). This section explores reasons we don’t see biological weapons as much as we might expect.
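
One rough way to formalize the metaphor – my own sketch, not a formula from Hanson’s paper or from later posts in this series – is to treat each filter as the probability that a candidate state clears that hurdle, so that the expected number of visible BW programs or uses is a product:

$$ \mathbb{E}[\text{observed BW use}] \;\approx\; N_{\text{states}} \times p_{\text{technically feasible}} \times p_{\text{no taboo}} \times p_{\text{strategically attractive}} $$

If each of those three probabilities were around 0.2, for instance, only about $0.2^3 = 0.008$, or 0.8%, of candidate states would ever cross the line – the same multiplicative logic Hanson applies to planets and space-faring life.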


New posts in this sequence will come out twice a week, on Mondays and Thursdays, until it’s done. (This is to give readers time to keep up.)


References

  1. Mukunda, Gautam, Kenneth A. Oye, and Scott C. Mohr. “What rough beast? Synthetic biology, uncertainty, and the future of biosecurity.” Politics and the Life Sciences 28, no. 2 (2009): 2-26.
  2. Schmidt, Markus. “Diffusion of synthetic biology: a challenge to biosafety.” Systems and synthetic biology 2, no. 1-2 (2008): 1-6.
  3. Smith, Frank. American biodefense: How dangerous ideas about biological weapons shape national security. Cornell University Press, 2014.
  4. Cole, Leonard A. “The Poison Weapons Taboo: Biology, Culture, and Policy.” Politics and the Life Sciences 17 (September 1, 1998): 119–32. https://doi.org/10.1017/S0730938400012119.
  5. Price, Richard. “A Genealogy of the Chemical Weapons Taboo.” International Organization 49, no. 1 (1995): 73–103.
  6. Zanders, Jean Pascal. “International Norms Against Chemical and Biological Warfare: An Ambiguous Legacy.” Journal of Conflict and Security Law 8, no. 2 (October 1, 2003): 391–410. https://doi.org/10.1093/jcsl/8.2.391.
  7. Wheelis, Mark, and Lajos Rózsa. Deadly cultures: biological weapons since 1945. Harvard University Press, 2009.
  8. “The Biological Weapons Convention.” The United Nations Office at Geneva. Accessed December 11, 2018. https://www.unog.ch/80256EE600585943/(httpPages)/04FBBDD6315AC720C1257180004B1B2F.
  9. Hanson, Robin. “The Great Filter – Are We Almost Past it?”. 1998. (Web Archive) https://web.archive.org/web/20100507074729/http://hanson.gmu.edu/greatfilter.html

Seattle-Oxford Petrov Day 2018

My friend Finan wrote up an account of the 2018 Seattle and Oxford Petrov Day Faux Nuclear Crisis.

Recognizing this, a couple of community members in Seattle whipped up a program to have mutually assured party destruction for Petrov Day. In the game you are told if the other party has launched and given time to retaliate if you so choose. Both parties successfully made it through several false alarms without nuking each other. It could be a testament to our general feelings of well being towards each other, and to lack of real incentive to nuke the other party–aside from protecting your own.

[…]

One of the partygoers in Seattle pressed the launch button at -1 seconds. 1 second past the end time, they were assuming the game was over and enjoying the humor of an actually quite low stakes game. The server and the Seattle computer were slightly out of sync, so although the game appeared over from the Seattle end, it was not according to the server.

“If our extinction proceeds slowly enough to allow a moment of horrified realization, the doers of the deed will likely be quite taken aback on realizing that they have actually destroyed the world. Therefore I suggest that if the Earth is destroyed, it will probably be by mistake.” – Eliezer Yudkowsky


The “Big Red Button” approach to the day has only been around for the last three years, but it has seen some adoption in that time. This is notably the first time that the button has been pressed at a party.

Further reading:

More on Petrov Day

There Is A Button

Book review: The Doomsday Machine

This book and the movie Dr. Strangelove are my two recommendations for learning about why you should still be concerned about nuclear war. The Dr. Strangelove post is coming soon. For now, The Doomsday Machine by Daniel Ellsberg:

…is a book about designing the end of the world as we know it, chronologically through Daniel Ellsberg’s career as a nuclear war planner. It’s well written, and Ellsberg makes a compelling hero.

He’s most famous for leaking the Pentagon Papers, government documents on the Vietnam War that contributed to Nixon’s resignation. This book came out of a second set of documents he photocopied and intended to release after his trial for the Pentagon Papers, but lost in an act of nature. Early on, he describes this second planned leak as the one that he fully expected to put him in jail for the rest of his life, and how he felt the loss of those documents as both a tragedy for the nation, and a blessing that allowed him to spend the following decades beside his wife. It’s the kind of thing that makes you glad you’re driving alone when your audiobook is making you tear up in the desert along the Washington-Idaho border.

But all this just helps – the real meat of the book is in the systems he describes.


Let’s talk about nuclear winter real quick. (My favorite line on dates.) Ellsberg puts this at the end, which makes sense chronologically, but it’d be burying the lede for an x-risk focused blog, so let’s get it out there now:

All of our plans for the Cold War were decided before anyone knew about nuclear winter. I feel like I should capitalize that – Nuclear Winter. It’s the hypothesized event where nuclear explosions cause fires in cities that launch so much ash into the stratosphere that it blots out the sun for months and makes it impossible for plants to grow, killing most human and large animal life. There’s uncertainty around the specifics, but its existence is generally agreed upon in the scientific community.

All US strategy during the early Cold War hinged on this idea of “general war”, an all-out nuclear exchange with Russia and China. General War included dropping enormous nuclear weapons on literally every single city in both Russia and China. Obviously, this is atrocious enough – this level of calamity was expected to kill something like 20% of the world’s population at the time, mostly from fallout.

But every time general war was mentioned, a little voice in my head yelled “nuclear winter!” – that the death toll is actually >90% of humanity, Americans, Russians, Chinese, and everyone else alike, unbeknownst to everybody at the time. My loose impression is that there’s not substantial reason to believe that nuclear war planning policy ever shifted to account for this fact.


Another quick takeaway: the US planned on making the first nuclear strike on Russia and China throughout the Cold War. Today we have a perception that the US only plans for using a second strike, but almost the entirety of planning material is based on the supposition of the US using nuclear weapons first. Again, there’s little reason to suspect this has changed now.


Through this book, I was repeatedly reminded of the Litany of Jai: Almost nobody is evil, almost everything is broken. The problems described in the book aren’t the result of insanity or complete carelessness, but instead a horrifying spider web of incentives, laid unwittingly by people with limited goals and limited knowledge. It’s a sinister net of multipolar traps. If you follow this web down, as Ellsberg does, you find yourself looking into the yawning chasm of a nuclear apocalypse – not built on purpose, but built nonetheless.

Let’s look at how some of these tangled incentives lead us there.


  • Branches of the military want high budgets.
  • Budget decisions are made based on intelligence from those branches.
  • Branches compete with each other for funding from Congress and other officials.

  • Various branches hugely overestimate enemy capacities.
  • E.g. the army reports extremely high Soviet ground force numbers.
  • The Air Force reports extremely high Soviet ICBM capacity.
  • Inter-branch coordination gets trampled.
  • There is no incentive for estimates or behavior that aligns with strategy or reality.

  • All military branches want to get in all-out war if/when it happens.

  • The Pacific Navy basically insists on attacking Asia alongside Russia in all cases, because they want to be involved and don’t just want to attack minor Siberian targets “on the sidelines” of The Big War.
  • Nuclear plans have the Moscow area getting blanketed with hundreds of nuclear bombs from all sides. “Hundreds of nuclear bombs” is a phrase that here and elsewhere means “calamitous overkill”.

  • Military branches don’t want to listen to civilian politicians.
  • Civilian politicians are powerful decision-makers.

  • Information is concealed, including from the president (for instance, the JSCP, which is the detailed plan for all-out war).
  • Military leaders just don’t listen to civilians who outrank them (e.g. in moving ships with nuclear warheads illegally stationed in Japanese ports).
  • Civilian President Kennedy is politically obliged not to override poor decisions made by President Eisenhower, the famous military general.

  • Nuclear bomber pilots need to receive an authorized signal to enact plans for bombing Russian and Asian targets.
  • Air force planners want as little delay as possible in executing war plans once they get the order.
  • Air Force planners want to save time and effort.

  • Authorization codes are stored in plaintext in envelopes in each plane, are the same between every plane, and are rarely changed.
  • Any pilot who realized this could easily lead their base in a nuclear strike, and almost certainly trigger all-out nuclear war.
  • There’s no way in the target database to easily distinguish Russian and Chinese targets, so everyone at Air Force bases assumes that if they get the war order, they’ll just drop nuclear weapons on everyone. All Chinese cities were going to be destroyed under every nuclear attack plan, throughout the entire early Cold War.

  • Communications systems with Washington DC might be destroyed if Russia attacks the US with nuclear weapons first.
  • Communication systems between bases might be destroyed during a Russian attack.
  • Communications in general are pretty unreliable.
  • Everyone in the military chain of command, including the President, wants the US to be able to respond as quickly as possible to a Russian first strike.

  • Ability to initiate a nuclear war is secretly delegated down the chain of command in cases where bases are not in touch with Washington DC.
  • Contact with Washington DC is often unreliable – for hours every day on some bases in the Pacific.
  • Basically anyone in the chain of command is not just capable of, but entirely authorized to, declare total nuclear war most of the time.

These are not even all the examples. A story retold in many different forms throughout the entire book goes like this:

  1. Daniel Ellsberg learns about one of these outcomes.
  2. Ellsberg talks to some relevant officials and outlines a possible catastrophe.
  3. The officials go still, think about it, and say with concern, “That seems entirely possible.”
  4. Nothing changes, ever.

A possible solution for most of these spiraling incentives is a countervailing force, balancing the dynamic back away from “total catastrophe”. An actor, or an incentive, or something. Often, that does not exist – in the veil of secrecy surrounding nuclear war, any party with an incentive to care about the implied risk isn’t aware of the entire situation, and can’t unilaterally fix it if it exists. Ellsberg tries to repair these flawed systems when he notices them, but has little power to do so.

He talks about how he suspects that some leaders, including President Kennedy, never had real intentions of using nuclear weapons, but even if that’s true, the scenarios above suggest that presidential intent may have had little to do with the outcome.

Ellsberg’s knowledge of the situation drops off in the 70’s or so when he started doing other work. Are all of these nuclear war and control systems still like this?? Maybe??!! Certainly nobody was rushing to reform them throughout his long tenure with the government.

I don’t know what to do about any of this. This book illuminates the number of needles we somehow threaded in avoiding catastrophe since the start of the Cold War. Here’s where you can get it.

Why was smallpox so deadly in the Americas?

In Eurasia, smallpox was undoubtedly a killer. It came and went in waves for ages, changing the course of empires and countries. 30% of those infected with the disease died from it. This is astonishingly high mortality for a disease – worse than botulism, Lassa fever, tularemia, the Spanish flu, Legionnaires’ disease, and SARS.

In the Americas, smallpox was a rampaging monster.

When it first appeared in Hispaniola in 1518, it spread 150 miles in four months and killed 30-50% of people – not just of those infected, but of the entire population [1]. It’s said to have infected a quarter of the population of the Aztec Empire within two weeks, killing half of those [2], and setting the stage for another disease to kill many more [3].
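
To make the arithmetic behind that distinction explicit – a back-of-the-envelope check on my part, taking the ~30% case fatality rate quoted above at face value rather than a calculation from the sources – population-wide mortality is the product of the attack rate (the share of the population infected) and the case fatality rate:

$$ \text{population mortality} = \text{attack rate} \times \text{case fatality rate} $$

$$ 0.30 \text{ to } 0.50 = \text{attack rate} \times 0.30 \;\;\Rightarrow\;\; \text{attack rate} \approx 1.0 \text{ to } 1.7 $$

Since an attack rate cannot exceed 1, a 30-50% population death toll implies that essentially everyone was infected, that the case fatality rate itself was far above Eurasia’s ~30%, or both.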

Then, alongside other diseases and warfare, it contributed to 84% of the Incan Empire dying [4].

Among the people who sometimes traded at the Hudson’s Bay Company’s Cumberland House on the Saskatchewan River in 1781 and 1782, 95% seem to have died. Of them, the U’Basquiau (also called, I believe, the Basquia Cree people) were entirely killed [5].

Over time, smallpox killed 90% of the Mandan tribe, along with 80% of people in the Columbia River region, 67% of the Omahas, and half of the Piegan tribe and of the Huron and Iroquois Confederations [6].

Here are some estimates of the death rates between ~1605 and 1650 in various Northeastern American groups, during a time of severe smallpox epidemics. Several of the figures are astonishingly high.

[Table omitted here; figure adapted from “European contact and Indian depopulation in the Northeast: The timing of the first epidemics” [7].]

Most of our truly deadly diseases either don’t move quickly or aren’t very contagious. Rabies, prion diseases, and primary amoebic meningoencephalitis have more or less 100% fatality rates. So do trypanosomiasis (African sleeping sickness) and HIV, when untreated.

When we look at the impact of smallpox in the Americas, we see extremely fast-moving epidemics with death rates worse than the worst forms of Ebola.

What happened?

In short, probably a total lack of previous exposure to smallpox and the other pathogenic European diseases, combined with cultural responses that helped the pathogen spread. The fact that smallpox was intentionally spread by Europeans in some cases probably contributed, but I’m not sure how much.

Virgin soil

Smallpox and its relatives in the orthopox family – monkeypox, cowpox, horsepox, and alastrim (smallpox’s milder variant) – had been established in Eurasia and Africa for centuries. Exposure to one would give some immune protection to the others. Variolation, a cruder version of vaccination, was also sometimes practiced.

Between these and the frequent waves of outbreaks, a European adult would have survived some kind of direct exposure to smallpox-like antigens in the past, and would have the protection of antibodies against them, preventing future sickness. As children, they would also have had the indirect protection of maternal antibodies [1].

In the Americas, everyone was exposed to the most virulent form of the disease with no defenses. This is called a “virgin soil epidemic”.

In this case, epidemics would stampede through occasionally – ferociously, but infrequently enough for any given tribe that antibodies wouldn’t successfully form and maternal protection didn’t develop. Many groups were devastated repeatedly over decades by smallpox outbreaks, as well as by other European diseases: the Cocoliztli epidemics [3], measles, influenza, typhoid fever, and others [7].

In virgin soil epidemics, including these ones, disease strikes all ages: children and babies, the elderly, and strong young adults [6]. This sort of indiscriminate attack on all age groups is a known sign in animal populations that a disease is extremely lethal [8]. In humans, it also grinds the gears of society to a halt.

When so much of the population of a village was too sick to move, not only was there nobody to tend crops or hunt – setting the stage for scarcity and starvation – but there was nobody to fetch water. Dehydration is suspected as a major cause of death, especially in children [16]. Very sick mothers would also have been unable to nurse infants [6].

Other factors that probably contributed:

Cultural factors

Native Americans had some concept of disease transmission – some people would run away when smallpox arrived in their village, possibly carrying and spreading the germ [7]. They would also steer clear of other tribes that had it. That said, many people lived in communal or large family dwellings, and didn’t quarantine the sick to private areas. They continued to sleep alongside and spend time with contagious people [6].

In addition, pre-colonization Native American measures against disease were probably somewhat effective against pre-colonization diseases, but tended to be ineffective or harmful for European diseases. Sweat baths, for instance, could have spread the disease and wouldn’t have helped [9]. Transmission could also have occurred during funerals [10].

Looking at combinations of the above factors, death rates of 70% and up are not entirely surprising.

Use as a bioweapon

Colonizers repeatedly used smallpox as an early form of biowarfare against Native Americans, knowing that they were more susceptible. This included, at times, intentionally withholding vaccines from them. Smallpox also spreads rapidly on its own, so I’m not sure how much the deliberate spreading contributed to the overall extreme death toll, although it certainly caused tremendous loss of life.

Probably not responsible:

Genetics. A lack of immunological diversity, or some other genetic susceptibility, has been cited as a possible reason for the extreme mortality rate. This might be especially expected in South America, because of the serial founder effect – a small number of people moving away from their home community to start a new one, repeated over and over again, all the way across Beringia, down North America, and into South America [9].

That said, this theory is considered unlikely today [1]. For one, the immune systems of native peoples of the Americas respond to vaccines much as European immune systems do [10]. For another, groups in the Americas also had unusually high mortality from other European diseases (influenza, measles, etc.), but this mortality dropped relatively soon after first exposure – far too quickly for genetic change to explain the improvement [10].

Some have also proposed general malnutrition, which would weaken the immune system and make it harder to fight off smallpox. This doesn’t seem to have been a factor [1]. Scarce food was a fact of life in many Native American groups, but the same was true of European peasants, who still didn’t suffer as badly from smallpox.

Africa

Smallpox has had a long history in parts of Africa – the earliest known evidence of smallpox infection comes from Egyptian mummies [2], and frequent European contact over the centuries spread the disease to the regions Europeans reached. Various groups in North, East, and West Africa developed their own variolation techniques [11].

However, when the disease was introduced to areas where it hadn’t existed before, we see death rates as astounding as those in the Americas: one source describes mortality rates of 80% among the Griqua people of South Africa. Less quantitatively, it describes how several Hottentot tribes were “wiped out” by the disease, how some tribes in northern Kenya were “almost exterminated”, and how parts of the eastern Congo River basin became “completely depopulated” [2].

This suggests that smallpox acted much the same way in unexposed populations in Africa. It also lends another piece of evidence against the genetic predisposition hypothesis – the disease behaved similarly in groups so geographically removed from one another.

Wikipedia also tells me that smallpox was comparably deadly when it was first introduced to various Australasian islands, but I haven’t looked into this further.

Extra

Required reading on humanism, smallpox, and smallpox eradication.


When smallpox arrived in India around 400 AD, it spurred the creation of Shitala, the Hindu goddess of (both causing and curing) smallpox. She is normally depicted on a donkey, carrying a broom for either spreading germs or sweeping out a house, and a bowl of either smallpox germs or cool water.

The last set of images on this page also seems to be a depiction of the goddess, and captures something altogether different, something more dark and visceral.


Finally, this blog has a Patreon. If you like what you’ve read, consider giving it your support so I can make more of it.

References


  1. Riley, J. C. (2010). Smallpox and American Indians revisited. Journal of the History of Medicine and Allied Sciences, 65(4), 445-477.
  2. Fenner, F., Henderson, D. A., Arita, I., Jezek, Z., Ladnyi, I. D., & World Health Organization. (1988). Smallpox and its eradication.
  3. Acuna-Soto, R., Stahle, D. W., Cleaveland, M. K., & Therrell, M. D. (2002). Megadrought and megadeath in 16th century Mexico. Revista Biomédica, 13, 289-292.
  4. Beer, M., & Eisenstat, R. A. (2000). The silent killers of strategy implementation and learning. Sloan Management Review, 41(4), 29.
  5. Houston, C. S., & Houston, S. (2000). The first smallpox epidemic on the Canadian Plains: In the fur-traders’ words. Canadian Journal of Infectious Diseases and Medical Microbiology, 11(2), 112-115.
  6. Crosby, A. W. (1976). Virgin soil epidemics as a factor in the aboriginal depopulation in America. The William and Mary Quarterly: A Magazine of Early American History, 289-299.
  7. Sundstrom, L. (1997). Smallpox used them up: References to epidemic disease in Northern Plains winter counts, 1714-1920. Ethnohistory, 305-343.
  8. MacPhee, R. D., & Greenwood, A. D. (2013). Infectious disease, endangerment, and extinction. International Journal of Evolutionary Biology, 2013.
  9. Snow, D. R., & Lanphear, K. M. (1988). European contact and Indian depopulation in the Northeast: The timing of the first epidemics. Ethnohistory, 15-33.
  10. Walker, R. S., Sattenspiel, L., & Hill, K. R. (2015). Mortality from contact-related epidemics among indigenous populations in Greater Amazonia. Scientific Reports, 5, 14032.
  11. Herbert, E. W. (1975). Smallpox inoculation in Africa. The Journal of African History, 16(4), 539-559.

[OPEN QUESTION] Insect declines: Why aren’t we dead already?

One study of German nature reserves found that insect biomass (i.e., the sheer kilograms of insects you’d catch in a net) has declined 75% over the last 27 years. Here’s a good summary that answered some questions I had about the study itself.

Another review study found that, globally, invertebrate (mostly insect) abundance has declined 35% over the last 40 years.
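As a rough sense of scale (my own back-of-envelope arithmetic, not a figure from either study), those two headline numbers imply very different average annual rates if you assume a steady exponential decline:

```python
# Convert a total percentage decline over a period into the implied average
# annual decline, assuming a steady exponential trend (a simplification).

def annual_decline(total_decline: float, years: float) -> float:
    remaining = 1 - total_decline
    return 1 - remaining ** (1 / years)

print(f"75% over 27 years -> about {annual_decline(0.75, 27):.1%} per year")  # ~5.0%
print(f"35% over 40 years -> about {annual_decline(0.35, 40):.1%} per year")  # ~1.1%
```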

Insects are important, as I’ve been told repeatedly (and written about myself). So this news raises a very important and urgent question:

Why aren’t we all dead yet?

This is an honest question, and I want an answer. (Readers will know I take catastrophic possibilities very seriously.) Insects are among the most numerous animals on earth and are central to our ecosystems, food chains, etcetera. Declines of 35% or more are the kind of thing where, if you’d asked me to guess the result in advance, I would have expected marked effects on ecosystems. At 75% declines – if the German study reflects the rest of the world to any degree – I would have predicted literal global catastrophe.

Yet these declines have apparently been going on for decades, apparently fairly consistently, and the biosphere, while not exactly doing great, hasn’t literally exploded.

So what’s the deal? Any ideas?

Speculation/answers welcome in the comments. Try to convey how confident you are and what your sources are, if you refer to any.

(If your answer is “the biosphere has exploded already”, can you explain how, and why that hasn’t changed trends in things like global crop production or human population growth? I believe, and think most other readers will agree, that various parts of ecosystems worldwide are obviously being degraded, but not to the degree I would expect from drastic global declines in insect numbers (especially compared to other well-understood factors like carbon dioxide emissions or deforestation). If you have reason to think otherwise, let me know.)


Sidenote: I was going to follow this with a similar question about the decline in ocean phytoplankton levels I’d heard about – the news that populations of phytoplankton, the little guys that feed the ocean food chain and make most of the oxygen on earth, have decreased 40% since 1950.

But a better dataset, collected over 80 years with consistent methods, suggests that phytoplankton have actually increased over time. There’s speculation that the apparent decrease in the other study may have arisen because they switched measurement methods partway through. An apocalypse for another day! Or hopefully, no other day, ever.


Also, this blog has a Patreon. If you like my work, consider incentivizing me to make more!

An invincible winter

[Picture in public domain, taken by Jon Sullivan]

Early September brought Seattle what were to be some of the hottest days of the summer. For weeks, people had been turning on fans, escaping to cooler places to spend the day, buying out air conditioners, which most of the city didn’t own. I cowered in my room with an AC unit on loan from a friend lodged in the window, only going out walking when the sun had set.

That week, Eastern Washington was burning. It does that every summer. But this year, a lot of Eastern Washington was burning. Say it with me – 2017 was one of the worst fire years on record. That week, the ash from the fires drifted over Seattle. You smelled smoke everywhere in the city. The sky was gray. At sunrise and sunset, the sun was blood-red. One day, gray ash drifted down from the sky, the exact size of snowflakes. It dusted the cars and kept falling through the afternoon.

That day, people said the weather was downright apocalyptic. They weren’t entirely wrong.

Many people aren’t clear on what exactly a nuclear winter is. The mechanism is straightforward. When cities burn in the extreme heat of a nuclear blast – and we do mean cities, plural; most nuclear exchange scenarios involve multiple strikes for strategic reasons – they turn into soot, and the soot floats up. If enough soot from burned cities reaches the stratosphere, an upper layer of the atmosphere, it stays there for a long time. The soot clouds blot out the sun, cool the earth, and choke off the growth of crops. Within weeks, agriculture grinds to a halt.

There’s a lot of uncertainty around nuclear winter. But by one estimate, detonating less than 1% of the world’s nuclear arsenal – a fairly small war – could drop global temperatures by five degrees Celsius, with the climate warming back up only slowly over the following twenty years. The ozone layer would thin. Less rain would fall. Two billion people would starve.
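As a purely illustrative toy sketch (not a climate model; the only inputs taken from the estimate above are the five-degree drop and the roughly twenty-year recovery, while the exponential shape and decay constant are my own assumptions), that trajectory looks something like this:

```python
# Toy sketch: a 5 °C global cooling that relaxes back toward normal over
# roughly twenty years, modeled as a simple exponential recovery.
# Illustration only; real nuclear winter modeling involves soot injection
# heights, lofting, rainout, ocean heat uptake, and much more.

import math

peak_cooling_c = 5.0             # initial temperature drop, from the cited estimate
recovery_years = 20.0            # rough recovery timescale, from the cited estimate
e_folding = recovery_years / 3   # assume ~95% recovery by the end of that period

for year in range(0, 21, 5):
    anomaly = peak_cooling_c * math.exp(-year / e_folding)
    print(f"year {year:2d}: about {anomaly:.1f} °C below normal")
```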

On Tuesday and Wednesday of that week, the temperature was predicted to reach over 100 degrees. It didn’t. The particulates in the air blocked enough of the sun’s heat that it barely hit the 90s. Pedestrians didn’t quite breathe easier, but did sweat less. Our own tiny, toy-model taste of a nuclear apocalypse.


I’d been feeling strange for the last few weeks, unrelatedly, and sitting at my desk for hours, my mind did a lot of wandering. I hoped things would be looking up – I’d just gotten back from an exciting conference with good friends, and also from seeing the solar eclipse.

I’d made the pilgrimage with friends. We drove for hours, east across the mountains the week before they burned. We crossed the Columbia River into Oregon, and finally, drove up a winding dirt road to a wide clearing with a small pond. I studied for the GRE in the shadows of dry pines. We played tug-of-war with the crayfish and watched the mayflies dance above the pond. The morning of, the sun climbed in the sky, and I had never appreciated how invisible the new moon is, or how much light the sun puts out – even when it was half-gone, we still had to peer through black plastic glasses to confirm that something had changed. But soon, it became impossible not to notice.

I kept thinking about what state I would have been thrown into if I hadn’t known the mechanism of an eclipse – how deep the pits of my spiritual terror could go. Whether it would be limited by biology or belief. As it was, it was only sort of spiritually terrifying, in a good way. The part of my brain that knew what was happening had spread that knowledge to all the other parts well enough that I could run around in excitement and really appreciate the chill in the air, the eerie distortion of shadows into slivers, and finally, the moon sealing off the sun.

The solar corona.

The sunset-colored horizon all around the rim of the sky.

Stars at midday.

We left after the daylight returned, but while the moon was still easing away, eager to beat the crowds back to the city. I thought about the mayflies in the pond, and their brief lives – the only adults in hundreds of generations to see the sun, see the stars, and then see the sun again.

I thought something might shake loose in my brain. Things should have been looking up, but the adventures had scarcely touched the inertia. Oh, right, I had also been thinking a lot about the end of the world.

I wonder about the mental health of people who work in existential risk. I think it must vary. I know people who are terrified on a very emotional and immediate level, and I know people who clearly believe it’s bad but don’t get anxious over it, and aren’t inclined to start. I can’t blame them. I used to be more of the former, and now I’m not sure if it’s eased up or if I’m just not thinking about things that viscerally scare me anymore. I’m not sure the existential terror can tip you towards something you weren’t predisposed to. In my case, I don’t think the mental fog was from it. But the backdrop of existential horror certainly lent it an interesting flavor.


It’s late October now. I’ve pulled out the flannel and the space heater and the second blanket for the bed. When I went jogging, my hands got numb. I don’t mind – I like autumn, I like the descent into winter, heralded by rain and red leaves and darkness, and the trappings of horror and then of the harvest. Peaches in the summertime, apples in the fall. The seasons have a comforting rhyme to them.

That strange inertia hasn’t quite lifted, but I’m working on it. Meanwhile, the world continues to cant sideways. When we arranged the Stanislav Petrov day party in Seattle this year, to celebrate the day a single man decided not to start World War 3, I wondered if we should ease up on the “nuclear war is a terrifying prospect” theme we had laid on last year. I thought that had probably been on people’s minds already.

So geopolitical tensions are rising, and have been rising. The hemisphere gets colder. Not quite out of nostalgia, my mind keeps flickering back to last month, to not-quite-a-hundred-degrees Seattle, to the red sun.

There’s a beautiful quote from Albert Camus: “In the midst of winter, I found there was, within me, an invincible summer.” That Tuesday, like the momentary pass of the moon over the sun in mid-day, in the height of summer, I saw the shadow of a nuclear winter.



For a more detailed exploration of the mechanics of nuclear winter and why we need more research, look at this piece from Global Risk Research Network.

What do you do if a nuclear blast is going to go off near you? Read this piece. Maybe read it beforehand.

What do you do if you don’t want a nuclear blast to go off near you? The Ploughshares Fund is one of the canonical organizations funding work on reducing risks from nuclear weapons. You might also be interested in Physicians for Social Responsibility and the Nuclear Threat Initiative.

This blog has a Patreon. If you like what you’ve read, consider giving it a look.