Category Archives: culture

An old knit tube with colorful stripes

Who invented knitting? The plot thickens

Last time on Eukaryote Writes Blog: You learned about knitting history.

You thought you were done learning about knitting history? You fool. You buffoon. I wanted to double check some things in the last post and found out that the origins of knitting are even weirder than I guessed.

Humans have been wearing clothes to hide our sinful sinful bodies from each other for maybe about 20,000 years. To make clothes, you need cloth. One way to make cloth is animal skin or membrane, that is, leather. If you want to use it in any complicated or efficient way, you also need some way to sew it – very thin strips of leather, or sinew or plant fiber spun into thread. Also popular since very early on is taking that thread and turning it into cloth. There are a few ways to do this.

A drawing showing loose fiber, which turns into twisted thread, which is arranged in various ways to make different kinds of fabric structures. Depicted are the structures for: naalbound, woven, knit, looped, and twined fabric.
By the way, I’m going to be referring to “thread” and “yarn” interchangeably from here on out. Don’t worry about it.

(Can you just sort of smush the fiber into cloth without making it into thread? Yes. This is called felting. How well it works depends on the material properties of the fiber. A lot of traditional Pacific Island cloth was felted from tree bark.)

Now with all of these, you could probably make some kind of cloth by taking threads and, by hand, shaping them into these different structures. But that sounds exhausting and nobody did that. Let’s get tools involved. Each of these structures corresponds to a different manufacturing technique.

By far, the most popular way of making cloth is weaving. Everyone has been weaving for tens of thousands of years. It’s not quite a cultural universal but it’s damn close. To weave, you need a loom.1 There are ten million kinds of loom. Most primitive looms can make a piece of cloth that is, at most, the size of the loom. So if you want to make a tunic that’s three feet wide and four feet long, you need cloth that’s at least three feet wide and four feet long, and thus, a loom that’s at least three feet wide and four feet long. You can see how weaving was often a stationary affair.

Recap

Here’s what I said in the last post: Knitting is interesting because the manufacturing process is pretty simple, needs simple tools, and is portable. The final result is also warm and stretchy, and can be made in various shapes (not just flat sheets). And yet, it was invented fairly recently in human history.

I mostly stand by what I said in the last post. But since then I’ve found some incredible resources, particularly the scholarly blogs Loopholes by Cary “stringbed” Karp and Nalbound by Anne Marie Deckerson, which have sent me down new rabbit-holes. The Egyptian knit socks I outlined in the last post sure do seem to be the first known knit garments, like, a piece of clothing that is meant to cover your body. They’re certainly the first known ones that take advantage of knitting’s unique properties: of being stretchy, of being manufacturable in arbitrary shapes. The earliest knitting is… weirder.

SCA websites

Quick sidenote – I got into knitting because, in grad school, I decided that in the interests of well-roundedness and my ocular health, I needed hobbies that didn’t involve reading research papers. (You can see how far I got with that). So I did two things: I started playing the autoharp, and I learned how to knit. Then, I was interested in the overlap between nerds and handicrafts, so a friend in the Society for Creative Anachronism pitched me on it and took me to a coronation. I was hooked. The SCA covers “the medieval period”; usually, 1000 CE through 1600 CE.

I first got into the history of knitting because I was checking if knitting counted as a medieval period art form. I was surprised to find that the answer was “yes, but barely.” As I kept looking, a lot of the really good literature and analysis – especially experimental archaeology – came out of blogs of people who were into it as a hobby, or perhaps as a lifestyle that had turned into a job like historical reenactment. This included a lot of people in the SCA, who had gone into these depths before and just wrote down what they found and published it for someone else to find. It’s a really lovely knowledge tradition to find one’s self a part of.

Aren’t you forgetting sprang?

There’s an ancient technique that gets some of the benefits of knitting, which I didn’t get to in the last post. It’s called sprang. Mechanically, it’s kind of like braiding. Like weaving, sprang requires a loom (the size of the cloth it produces) and makes a flat sheet. Like knitting, however, it’s stretchy.

Sprang shows up in lots of places – the oldest from 1400 BCE in Denmark, but also other places in Europe, plus (before colonization!): Egypt, the Middle East, central Asia, India, Peru, Wisconsin, and the North American Southwest. Here’s a video where re-enactor Sally Pointer makes a sprang hairnet with iron-age materials.

Despite being widespread, it was never a common way to make cloth – everyone was already weaving. The question of the hour is: Was it used to make socks?

Well, there were probably sprang leggings. Dagmar Drinkler has made historically-inspired sprang leggings, which demonstrate that sprang colorwork creates some of the intricate designs we see painted on Greek statues – like this 480 BCE Persian archer.

I haven’t found any attestations of historical sprang socks. The Sprang Lady has made some, but they’re either tube socks or have separately knitted soles.

Why weren’t there sprang socks? Why didn’t sprang, widespread as it is, take on the niche that knitting took?

I think there are two reasons. One, remember that a sock is a shaped garment – tube-like, usually with a bend at the heel – and that like weaving, sprang makes a flat sheet. If you want another shape, you have to sew it in. It’s going to lose some stretch where it’s sewn at the seam. It’s just more steps and skills than knitting a sock.

The second reason is warmth. I’ve never done sprang myself – from what I can tell, it has more of a net-like openness upon manufacture, unlike knitting which comes with some depth to it. Even weaving can easily be made pretty dense simply by putting the threads close together. I think, overall, a sprang fabric garment made with primitive materials is going to be less warm than a knit garment made with primitive materials.

Those are my guesses. I bring it up merely to note that there was another thread → cloth technique that made stretchy things that didn’t catch on the same way knitting did. If you’re interested in sprang, I cannot recommend The Sprang Lady’s work highly enough.

Anyway, let’s get back to knitting.

Knitting looms

The whole thing about Roman dodecahedrons being (hypothetically) used to knit glove fingers, described in the last post? I don’t think that was actually the intended purpose, for the reasons I described re: knitting wasn’t invented yet. But I will cop to the best argument in its favor, which is that you can knit glove fingers with a Roman dodecahedron.

“But how?” say those of you not deeply familiar with various fiber arts. “That’s not needles,” you say.

You got me there. This is a variant of a knitting loom. A knitting loom is a hoop with pegs used to make knit tubes. This can be the basis of a knitting machine, but you can also knit on one on its own. They make more consistent knit tubes with less required hand-eye coordination. (You can also make flat panels with them, especially a version called a knitting rake, but since all of the early knitting we’re talking about is tubes anyhow, let’s ignore that for the time being.)

Knitting on a modern knitting loom. || Photo from Cynthia M. Parker on flickr, under a CC BY-SA 2.0 license.

Knitting on a loom is also called spool knitting (because you can use a spool with nails in it as the loom for knitting a cord) and tomboy knitting (…okay). Structurally, I think this is also basically the same thing as lucet cord-making, so let’s go ahead and throw that in with this family of techniques. (The earliest lucets are from ~1000 CE Viking Sweden and perhaps medieval Viking Britain.)

The important thing to note is that loom knitting makes a result that is, structurally, knit. It’s difficult to tell whether a given piece is knit with a loom or needles, if you didn’t see it being made. But since it’s a different technique, different aspects become easier or harder.

A knitting loom sounds complicated but isn’t hard to make, is the thing. Once you have nails, you can make one easily by putting them in a wood ring. You could probably carve one from wood with primitive tools. Or forge one. So we have the question: Did knitting needles or knitting looms come first?

We actually have no idea. There aren’t objects that are really clearly knitting needles OR knitting looms until long after the earliest pieces of knitting. This strikes me as a little odd, since wood and especially metal should preserve better than fabric, but it’s what we’ve got. It’s probably not helped by the fact that knitting needles are basically just smooth straight sticks, and it’s hard to say that any smooth straight stick is conclusively a knitting needle (unless you find it with half a sock still on it.)

(At least one author, Isela Phelps, speculates that finger-knitting, which uses the fingers of one hand like a knitting loom and makes a chunky knit ribbon, came first – presumably because, well, it’s easier to start from no tools than to start from a specialized tool. This is possible, although the earliest knit objects are too fine and have too many stitches to have been finger-knit. The creators must have used tools.)

(stringbed also points out that a piece of whale baleen can be used as circular knitting needles, and that the relevant cultures did have access to and trade in whale parts. While we have no particular evidence that baleen was used as such, it does mean that humanity wouldn’t have had to invent plastic before inventing the circular knitting needle – we could have had that since the prehistoric period. So, I don’t know, maybe it was whales.)

THE first knitting

The earliest knit objects we have… ugh. It’s not the Egyptian socks. It’s this.

Photo of an old, long, thin knit tube in lots of striped colors.
One of the oldest knit objects. || Photo from Musée du Louvre, AF 6027.

There is a pair of long, thin, colorful knit tubes, each about an inch wide and a few feet long. They’re pretty similar to each other. Due to the problems inherent in time passing and the flow of knowledge, we know one of them is probably from Egypt, and was carbon-dated to 425-594 CE. The other quite similar tube, of a similar age, has not been carbon-dated but is definitely from Egypt. (The original source text for this second artifact is in German, so I didn’t bother trying to find it, and instead refer to stringbed’s analysis. See also matthewpius guestblogging on Loopholes.) So between the two of them, we have a strong guess that these knit tubes were manufactured in Egypt around 425-594 CE, about 500 years before socks.

People think it was used as a belt.

This is wild to me. Knitting is stretchy, and I did make fun of those peasants in 1300 CE for not having elastic waistlines, so I could see a knitted belt being more comfortable than other kinds of belts.2 But not a lot better. A narrow knit belt isn’t going to distribute force onto the body much differently than a regular non-stretchy belt, and regular non-stretchy belts were already in great supply – woven, rope, leather, etc. Someone invented a whole new means of cloth manufacture and used it to make a slightly different version of a thing that already existed.

Then, as far as I can tell, there are no knit objects in the known historical record for five hundred years until the Egyptian socks pop up.

Pulling objects out of the past is hard. Especially things made from cloth or animal fibers, which rot (as compared to metal, pottery, rocks, bones, which last so long that in the absence of other evidence, we name ancient cultures based on them.) But every now and then, we can. We’ve found older bodies and textiles preserved in ice and bogs and swamps.3 We have evidence of weaving looms and sewing needles and pictures of people spinning or weaving cloth and descriptions of them doing it, from before and after. I’m guessing that the technology just took a very long time to diversify beyond belts.

Speaking of which: how was the belt made? As mentioned, we don’t find anything until much later that is conclusively a knitting needle or a knitting loom. The belts are also, according to matthewpius on Loopholes, made with a structure called double knitting. That effect is (as indicated by Pallia – another historic reenactor blog!) kind of hard to achieve with knitting needles in the way they did it, but pretty simple with a knitting loom.

(Another Egyptian knit tube belt from an unclear number of centuries later.)

Viking knitting

You think this is bad? Remember how I said before that knitting was a way of manufacturing cloth, but that it was also definable as a specific structure of thread, one that could be made with different methods?

The oldest knit object in Europe might be a cup.

Photo of a richly decorated old silver cup.
The Ardagh Chalice. || Photo by Sailko under a CC BY-SA 3.0 license.

You gotta flip it over.

Another photo of the ornate chalice from the equally ornate bottom. Red arrows point to some intricate wire decorations around the rim.
Underside of the Ardagh Chalice. || Adapted from a Metropolitan Museum image.

Enhance.

Black and white zoom in on the wire decorations. It's more  clearly a knit structure.
Photo from Robert M. Organ’s 1963 article “Examination of the Ardagh Chalice-A Case History”, where they let some people take the cup apart and put it back together after.

That’s right, this decoration on the bottom of the Ardagh Chalice is knit from wire.
Another example is the decoration on the side of the Derrynaflan Paten, a plate made in the 700s or 800s CE in Ireland. All the examples seem to be from churches, hidden by or from Vikings. Over the next few hundred years, there are some other objects made with this technique. They’re tubes knitted from silver wire. “Wait, can you knit with wire?” Yes. Stringbed points out that knitting wire with needles or a knitting loom would be tough on the valuable silver wire – they could break or distort it.

Photo of an ornate silver plate with gold decorations. There are silver knit wire tubes around the edge.
The Derrynaflan Paten, zoomed in on the knit decorations around the edge. || Adapted from this photo by Johnbod, under a CC BY-SA 3.0 license.

What would make sense to do it with is a little hook, like a crochet hook. But that would only work on wire – yarn doesn’t have the structural integrity to be knit with just a hook; you need to support each of the active loops.

So was the knit structure just invented separately by Viking silversmiths, before it spread to anyone else? I think it might have been. It’s just such a long time before we see knit cloth, and we have this other plausible story for how the cloth got there.

(I wondered if there was a connection between the Viking knitting and their sources of silver. Vikings did get their silver from the Islamic world, but as far as I can tell, mostly from Iran, which is pretty far from Egypt and doesn’t have an ancient knitting history – so I can’t find any connection there.)

The Egyptian socks

Let’s go back to those first knit garments (that aren’t belts), the Egyptian knit blue-and-white socks. There are maybe a few dozen of these, now found in museums around the world. They seem to have been pulled out of Egypt (people think Fustat) by various European/American collectors. People think that they were made around 1000-1300 CE. The socks are quite similar: knit, made of cotton, in white and 1-3 shades of indigo, with geometric designs sometimes including Kufic characters.

I can’t find a specific origin location (other than “probably Egypt, maybe Fustat?”) for any of them. The possible first sock mentioned in the last post is one of these – I don’t know if there are any particular reasons for thinking that sock is older than the others.

This one doesn’t seem to be knit OR naalbound. Anne Marie Decker at Nalbound.com thinks it’s crocheted and that the date is just completely wrong. To me, at least, this casts doubt on all the other dates of similar-looking socks.

That anomalous sock scared me. What if none of them had been carbon-dated? Oh my god, they’re probably all scams and knitting was invented in 1400 and I’m wrong about everything. But I was told in a historical knitting facebook group that at least one had been dated. I found the article, and a friend from a minecraft discord helped me out with an interlibrary loan. I was able to locate the publication where Antoine de Moor, Chris Verhecken-Lammens and Mark Van Strydonck did in fact carbon-date four ancient blue-and-white knit cotton socks and found that they dated back to approximately 1100 CE – with a 95% chance that they were made somewhere between 1062 and 1149 CE. Success!

Helpful research tip: for the few times when the SCA websites fail you, try your facebook groups and your minecraft discords.

Estonian mitten

Photo of a tattered old fragment of knitting. There are some colored designs on it in blue and red.
Yeah, this is all of it. Archeology is HARD. [Image from Anneke Lyffland’s writeup.]

Also, here’s a knit fragment of a mitten found in Estonia. (I don’t have the expertise or the mitten to determine it myself, but Anneke Lyffland (another SCA name), a scholar who studied it, is aware of cross-knit-looped naalbinding – like the Peruvian knit-lookalikes mentioned in the last post – and doesn’t believe this was naalbound.) It was part of a burial dated to 1238–1299 CE. This is fascinating and does suggest a culture of knitted practical objects in Eastern Europe in this time period. This is the earliest East European non-sock knit fabric garment that I’m aware of.

But as far as I know, this is just the one mitten. I don’t know much about archaeology in the area and era, and can’t speculate as to whether this is evidence that knitting was rare or whether we have very few wool textiles from the area and it’s not that surprising. (The voice of shoulder-Thomas-Bayes says: Lots of things are evidence! Okay, I can’t speculate as to whether it’s strong evidence, are you happy, Reverend Bayes?) Then again, a bunch of speculation in this post is also based on two maybe-belts, so, oh well. Take this with salt.

By the way, remember when I said crochet was super-duper modern, like invented in the 1700s?

Literally a few days ago, who but the dream team of Cary “stringbed” Karp and Anne Marie Decker published an article in Archaeological Textiles Review identifying several ancient probably-Egyptian socks thought to be naalbound as being actually crocheted.

This comes down to the thing about fabric structures versus techniques. There’s a structure called slip stitch that can be either crocheted or naalbound. Since we know naalbinding is that old, if you’re looking at an old garment and see slip stitch, maybe you say it was naalbound. But basically no fabric garment is just continuous structure all the way through. How do the edges work? How did it start and stop? Are there any pieces worked differently, like the turning of a heel or a cuff or a border? Those parts might be more clearly worked with a crochet hook than a naalbinding needle. And indeed, that’s what Karp and Decker found. This might mean that those pieces are forgeries – there’s been no carbon dating. But it might mean crochet is much, much older than previously thought.

My hypothesis

Knitting was invented sometime around or perhaps before 600 CE in Egypt.

From Egypt, it spread to other Muslim regions.

It spread into Europe via one or more of these:

  1. Ordinary cultural diffusion northwards
  2. Islamic influence in the Iberian Peninsula
    • In 711 CE, the Umayyad Caliphate conquered most of the Iberian Peninsula, establishing Al-Andalus…
      • Kicking off a lot of Islamic presence in and control over the area up until 1400 CE or so…
  3. Meanwhile, starting in 1095 CE, the Latin Church called for armies to take Jerusalem from Muslim control, kicking off the Crusades.
    • …Peppering Arabic influences into Europe, particularly France, over the next couple centuries.

… Also, the Vikings were there. They separately invented the knitting structure in wire, but never got around to trying it out in cloth, perhaps because the required technique was different.

Another possibility

Wrynne, AKA Baroness Rhiall of Wystandesdon (what did I say about SCA websites?), a woman who knows a thing or two about socks, believes that, based on these plus the design of other historical knit socks, the route goes something like:

??? points to Iran, which points to: A. Eastern Europe, then to 1. Norway and Sweden and 2. Russia; B. to ???, to Spain, to Western Europe.

I don’t know enough about socks to have a sophisticated opinion on her evidence, but the reasoning seems solid to me. For instance, as she explains, old Western European socks are knit from the cuff of the sock down, whereas old Middle Eastern and East European socks are knit from the toe of the sock up – which is also how Eastern and Northern European naalbound socks were shaped. Baroness Rhiall thinks Western Europe invented its sockmaking techniques independently, based on only a little experience with a few late 1200s/1300s knit pieces from Moorish artisans.

What about tools?

Here’s my best guess: The Egyptian tubes were made on knitting looms.

The Viking tubes were invented separately, made with a metal hook as stringbed speculates, and never had any particular connection to knitting yarn.

At some point, in the Middle East, someone figured out knitting needles. The Egyptian socks and Estonian mitten and most other things were knit in the round on double-ended needles.

I don’t like this as an explanation, mostly because of how it posits 3 separate tools involved in the earliest knit structures – that seems overly complicated. But it’s what I’ve got.

Knitting in the tracks of naalbinding

I don’t know if this is anything, but here are some places we also find lots of naalbinding, beginning from well before the medieval period: Egypt. Oman. The UAE. Syria. Israel. Denmark. Norway. Sweden. Sort of the same path that we predict knitting traveled in.

I don’t know what I’m looking at here.

  • Maybe this isn’t real and these places just happen to preserve textiles better
  • Longstanding trade or migration routes between North Africa, the Middle East, and Eastern Europe?
  • Culture of innovation in fiber?
  • Maybe fiber is more abundant in these areas, and thus there was more affordance for experimenting. (See below.)

It might be a coincidence. But it’s an odd coincidence, if so.

Why did it take so long for someone to invent knitting?

This is the question I set out to answer in the initial post, but then it turned into a whole thing and I don’t think I ever actually answered my question. Very, very speculatively: I think knitting is just so complicated that it took thousands of years, and an environment rich in fiber innovation, for someone to invent and make use of the series of steps that is knitting.

Take this next argument with a saltshaker, but: my intuitions back this up. I have a good visual imagination. I can sort of “get” how a slip knot works. I get sewing. I understand weaving; I can boil it down in my mind to its constituents.

There are birds that do a form of sewing and a form of weaving. I don’t want to imply that if an animal can figure it out, it’s clearly obvious – I imagine I’d have a lot of trouble walking if I were thrown into the body of a centipede, and chimpanzees can drastically outperform humans on certain cognitive tasks – but I think, again, it’s evidence that it’s a simpler task in some sense.

Same with sprang. It’s not a process I’m familiar with, but watching Sally Pointer do it on a very primitive loom, I can understand it and could probably do it now. Naalbinding – well, it’s knots, and given a needle and knowing how to make a knot, I think it’s pretty straightforward to tie a bunch of knots on top of each other to make fabric out of it.

But I’ve been knitting for quite a while now and have finished many projects, and I still can’t say I totally get how knitting works. I know there’s a series of interconnected loops, but how exactly do they not fall apart? How does the starting string turn into the final project? It’s not in my head. I only know the steps.

I think that if you erased my memory and handed me some simple tools, especially a loom, I could figure out how to make cloth by weaving. I think there’s also a good chance I could figure out sprang, and naalbinding. But I think that if you handed me knitting needles and string – even if you told me I was trying to get fabric made from a bunch of loops that are looped into each other – I’m not sure I would get to knitting.

(I do feel like I might have a shot at figuring out crochet, though, which is supposedly younger than any of these anyway, so maybe this whole line of thinking means nothing.)

Idle hands as the mother of invention?

Why do we innovate? Is necessity the mother of invention?

This whole story suggests not – or at least, that’s not the whole story. We have the first knit structures in belts (already existed in other forms) and decorative silver wire (strictly ornamental.) We have knit socks from Egypt, not a place known for demanding warm foot protection. What gives?

Elizabeth Wayland Barber says this isn’t just knitting – she points to the spinning jenny and the power loom, both innovations in textile production, which were invented relatively recently by men despite thousands of previous years of women producing yarn and cloth. In Women’s Work: The First 20,000 Years, she writes:

“Women of all but the top social and economic classes were so busy just trying to get through what had to be done each day that they didn’t have excess time or materials to experiment with new ways of doing things.”

This suggests a somewhat different mechanism of invention – sure, you need a reason to come up with or at least follow up on a discovery, but you also need the space to play. 90% of everything is crap; you need to be really sure that you can throw away (or unravel, or afford the time to re-make) 900 crappy garments before you hit upon the sock.

Bill Bryson, in the introduction to his book At Home, writes about the phenomenon of clergy in the UK in the 1700s and 1800s. To become an ordained minister, one needed a university degree, but not in any particular subject, and little ecclesiastical training. Duties were light; most ministers read a sermon out of a prepared book once a week and that was about it. They were paid in tithes from local landowners. Bryson writes:

“Though no one intended it, the effect was to create a class of well-educated, wealthy people who had immense amounts of time on their hands. In consequence many of them began, quite spontaneously, to do remarkable things. Never in history have a group of people engaged in a broader range of creditable activities for which they were not in any sense actually employed.”

He describes some of the great amount of intellectual work that came out of this class, including not only the aforementioned power loom, but also: scientific descriptions of dinosaurs, the first Icelandic dictionary, Jack Russell terriers, submarines, aerial photography, the study of archaeology, Malthusian traps, the telescope that discovered Uranus, werewolf novels, and – courtesy of the original Thomas Bayes – Bayes’ theorem.

I offhandedly posited a random per-person effect in the previous post – each individual has a chance of inventing knitting, so eventually someone will figure it out. There’s no way this can be the whole story. A person in a culture that doesn’t make clothes mostly out of thread, like the traditional Inuit (thread is used to sew clothes, but the clothes are very often sewn out of animal skin rather than woven fabric), seems really unlikely to invent knitting. They wouldn’t have lots of thread about to mess around with. So you need the people to have a degree of familiarity with the materials. You need some spare resources. Some kind of cultural lenience for doing something nonstandard.

…But is that the whole story? The Incan Empire was enormous, with 12,000,000 citizens at its height. They didn’t have a written language. They had the quipu system for recording numbers with knotted string, but they didn’t have a written language. (Their neighbors, the Mayans, did.) Easter Island, between its colonization by humans in 1000 CE and its worse colonization by Europeans in 1700 CE, had a maximum population of maybe 12,000. It’s one of the most remote islands in the world. In isolation from other societies, they did develop a written language, in fact Polynesia’s only native written language.

Color photo of a worn wooden tablet engraved with intricate Rongorongo characters.
One of ~26 surviving pieces of Rongorongo, the undeciphered written script of Easter Island. This is Text R, the “Small Washington tablet”. Photo from the Smithsonian Institution. (Image rotated to correspond with the correct reading order, as a courtesy to any Rongorongo readers in my audience. Also, if there are any Rongorongo readers in my audience, please reach out. How are you doing that?!)
A black and white photo of the same tablet. The lines of characters are labelled (e.g. Line 1, Line 2) and the  symbols are easier to see. Some look like stylized humans, animals, and plants.
The same tablet with the symbols slightly clearer. Image found on kohaumoto.org, a very cool Rongorongo resource.

I don’t know what to do with that.

Still. My rough model is:

A businessy chart labelled "Will a specific group make a specific innovation?" There are three groups of factors feeding into each other. First is Person Factors, with a picture of a person in a power wheelchair: Consists of [number of people] times [degree of familiarity with art]. Spare resources (material, time). And cultural support for innovation. Second is Discovery Factors, with a picture of a microscope: Consists of how hard the idea "is to have", benefits from discovery, and [technology required] - [existing technology]. ("Existing technology" in blue because that's technically a person factor.) Third is Special Sauce, with a picture of a wizard. Consists of: Survivorship Bias and The Easter Island Factor (???)

The concept of this chart amused me way too much not to put it in here. Sorry.

(“Survivorship bias” meaning: I think it’s safe to say that if your culture never developed (or lost) the art of sewing, the culture might well have died off. Manipulating thread and cloth is just so useful! Same with hunting, or fishing for a small island culture, etc.)
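If it helps (or amuses), here’s the chart written out as a toy function. Every factor, scale, and number below is invented for illustration – it’s just the chart’s boxes strung together, not a real model:

```python
def innovation_chance(num_people, familiarity, spare_resources, cultural_support,
                      idea_difficulty, benefit, tech_gap, special_sauce=1.0):
    """Hand-wavy 'will this group make this innovation?' score, following the chart.
    All inputs are made-up relative scales, not measurements."""
    person_factors = num_people * familiarity * spare_resources * cultural_support
    discovery_factors = (benefit / idea_difficulty) / (1 + tech_gap)
    return person_factors * discovery_factors * special_sauce

# Example: lots of people handling yarn, some spare resources, but knitting is a
# hard idea to "have" - so the score stays modest despite the huge population.
print(innovation_chance(num_people=1e6, familiarity=0.9, spare_resources=0.2,
                        cultural_support=0.5, idea_difficulty=50, benefit=5, tech_gap=0))
```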

…What do you mean Loopholes has articles about the history of the autoharp?! My Renaissance man aspirations! Help!


Delightful: A collection of 1900s forgeries of the Paracas textile. They’re crocheted rather than naalbound.

1 (Uh, usually. You can finger weave with just a stick or two to anchor some yarn to but it wasn’t widespread, possibly because it’s hard to make the cloth very wide.)

2 I had this whole thing ready to go about how a knit belt was ridiculous because a knit tube isn’t actually very stretchy “vertically” (or “warpwise”), and most of its stretch is “horizontal” (or “weftwise”). But then I grabbed a knit tube (fingerless glove) in my environment and measured it at rest and stretched, and it stretched about as far both ways. So I’m forced to consider that a knit belt might be a reasonable thing to make for its stretchiness. Empiricism: try it yourself!

3 Fun fact: Plant-based fibers (cotton, linen, etc) are mostly made of carbohydrates. Animal-based fibers (silk, wool, alpaca, etc) and leather are mostly made of protein. Fens are wetlands that are alkaline and bogs are acidic. Carbohydrates decay in acidic bogs but are well-preserved in alkaline fens. Proteins dissolve in alkaline fens but last in acidic bogs. So it’s easier to find preserved animal material or fibers in bogs and preserved plant material or fibers in fens.


Cross-posted to LessWrong.

Fiber arts, mysterious dodecahedrons, and waiting on “Eureka!”

Part 1: The anomaly

This story starts, as many stories do, with my girlfriend 3D-printing me a supernatural artifact. Specifically, one of my favorite SCPs, SCP-184.

This attempt got about 75% of the way through. Close enough.

We had some problems with the print. Did the problems have anything to do with printing a model of a mysterious artifact that makes spaces bigger on the inside, via a small precisely-calibrated box? I would say no, there’s no way that could be related.

Anyway, the image used for the SCP in question, and thus also the final printed model, is based on a Roman dodecahedron. Roman dodecahedrons are metal objects of a particular shape, dug up at sites from all over the Roman world, and we have no idea why they exist.

Roman dodecahedra. || Image source unknown.

Many theories have been advanced. You might have seen these in an image that was going around the internet, which ended by suggesting that the object would work perfectly for knitting the fingers of gloves.

There isn’t a clear alternative explanation for what these are. A tool for measuring coins? A ruler for calculating distances? A sort of Roman fidget spinner? This author thinks it displays a date and has a neat explanation as to why. (Experimental archaeology is so cool, y’all.)

Whatever the purpose of the Roman dodecahedron was, I’m pretty sure it’s not (as the meme implies is obvious) for knitting glove fingers.1

Why?

1: The holes are always all different sizes, and you don’t need that to make glove fingers.

2: You could just do this with a donut with pegs in it; you don’t need a precisely welded dodecahedron. It does work for knitting glove fingers – you just don’t need something this complicated.

3: The Romans hadn’t invented knitting.

Part 2: The Ancient Romans couldn’t knit

Wait, what? Yeah, the Romans couldn’t knit. The Ancient Greeks couldn’t knit, the Ancient Egyptians couldn’t knit. Knitting took a while to take off outside of the Middle East and the West, but still, almost all of the Imperial Chinese dynasties wouldn’t have known how. Knitting is a pretty recent invention, time-wise. The earliest knit objects we have are from Egypt around 1000 CE.

Possibly the oldest knit sock known, ca 1000-1200 CE according to this page. || Photo is public domain from the George Washington University Textile Museum Collection.

This is especially surprising because knitting is useful for two big reasons:

First, it’s very easy to do. It takes yarn and two sticks and children can learn how. This is pretty rare for fabric manufacturing – compare, for instance, weaving, which takes an entire loom.

Sidenote: Do you know your fabrics? This next section will make way more sense if you do.

Woven fabric
  • Commonly found in: trousers, collared/button-up shirts, bedsheets, dish towels, woven boxers, quilts, coats, etc.
  • Not stretchy.
  • Loose threads won’t make the whole cloth unravel.

Knit fabric
  • Commonly found in: T-shirts, polo shirts, leggings, underwear, anything made of jersey fabric, sweaters, sweatpants, socks.
  • Stretchy.
  • If you pull on a loose thread, the cloth unravels.

Second, and oft-underappreciated, knitted fabric is stretchy. We’re spoiled by the riches of elastic fabric today, but it wasn’t always so. Modern elastic fabric uses synthetic materials like spandex or neoprene; the older version was natural latex rubber, and it seems to have taken until the 1800s to use rubber to make clothing stretchy. Knit fabric stretches without any of those.

Before knitting, your options were limited – you could only make clothing that didn’t stretch, which I think explains a lot of why medieval and earlier clothing “looks that way”. A lot of belts and drapey fabric. If something is form-fitting, it’s probably laced. (…Or just more-closely tailored, which unrelatedly became more of a thing later in the medieval period.)

You think these men had access to comfortable elastic waistlines? No they did not. || Image from the Luttrell Psalter, ~1330.

You could also use woven fabric on the bias, which stretches a little.

Woven fabric is stretchier this way. Grab something made of woven fabric and try it out. || Image by PKM, under a CC BY-SA 3.0 license.

Medieval Europe made stockings from fabric cut like this. Imagine a sock made out of tablecloth or button-down-shirt-type material. Not very flexible. Here’s a modern recreation on Etsy.

Other kinds of old “socks” were more flexible but more obnoxious, made of a long strip of bias-cut fabric that you’d wrap around your feet. (Known as: winingas, vindingr, legwraps, wickelbänder, or puttees.) Historical reenactors wear these sometimes. I’m told they still restrict movement, and that they take practice to put on correctly.

Come 1000 CE, knitting arrives on the scene.

Which is to say, it’s no surprise that the first knitted garments we see are socks! They get big in Europe over the next 300 years or so. Richly detailed bags and cushions also appear. We start seeing artistic depictions of knitting for the first time around now too.

Italian Madonna knitting with four needles, ~1350. Section of this miniature by Tommaso de Modena.

Interestingly, this early knitting was largely circular, meaning that you produce a tube of cloth rather than a flat sheet. This meant that the first knitting was done not with two sticks and some yarn, but four sticks and some yarn. This is much easier for making socks and the like than using two needles would be. …But it also means that the invention process actually started with four needles and some yarn, so maybe it’s not surprising it took so long.2

(Why did it take so long to invent knitting flat cloth with two sticks? Well, there’s less of a point to it, since you already have lots of woven cloth, and you can make a lot of clothes – socks, sweaters, hats, bags – by knitting tubes. Also, by knitting circularly, you only have to know how to do one stitch (the knit stitch), whereas flat knitting requires you to also use a different stitch (the purl stitch) to make a smooth fabric that looks like and is as stretchy as round knitting. If you’re not a knitter, just trust me – it’s an extra step.)

(You might also be wondering: What about crochet? Crochet was even more recent. 1800s.)

Part 3: The Ancient Peruvians couldn’t knit either, but they did something that looks the same

You sometimes see people say that knitting is much older, maybe thousands of years old. It’s hard to tell how old knitting is – fabric doesn’t always preserve well – but it’s safe to say that it’s not that old. We have examples of people doing things with string for thousands of years, but no examples of knitting before those 1000 CE socks. What we do have examples of is naalbinding, a method of making fabric from yarn using a needle. Naalbinding produces a less-stretchy fabric than knitting. It’s found from Scandinavia to the Middle East and also shows up in Peru.

The native Peruvian form of naalbinding is a specific technique called cross-knit looping. (This technique also shows up sometimes in pre-American Eurasia, but it’s not common.) The interesting thing about cross-knit looping is that the fabric looks almost identical to regular knitting.

Here’s a tiny cross-knit-looped bag I made, next to a tiny regularly knit bag I made. You can see they look really similar. The fabric isn’t truly identical if you look closely (although it’s close enough to have fooled historians). It doesn’t act the same either – naalbound fabric is less stretchy than knit fabric, and it doesn’t unravel.

The ancient Peruvians cross-knit-looped decorations for other garments and the occasional hat, not socks.

Cross-knit-looped detail from the absolutely stunning Paracas Textile. If you look closely, it looks like stockinette knit fabric, but it’s not.

Inspired by the Paracas Textile figures above, I used cross-knit-looping to make this little fox lady fingerpuppet:

I think it was easier to do fine details than it would be if I were knitting – it felt more like embroidery – but it might have been slower to make the plain fabric parts than knitting would have been. But I’ve done a lot of knitting and very little cross-knit-looping, so it’s hard to compare directly. If you want to learn how to do cross-knit looping yourself, Donna Kallner on Youtube has handy instructional videos.

I wondered about naalbinding in general – does the practice predate human dispersal to the Americas, or did the Eurasian technique and the American technique evolve separately? Well, I don’t know for certain. Sewing needles and working with yarn are old old practices, definitely pre-dating the hike across Beringia (~18,000 BCE). The oldest naalbinding is 6500 years old, so it’s possible – but as far as I know, no ancient naalbinding has ever been found anywhere in the Americas outside of Peru, or in eastern Russia or Asia – it was mostly the Middle East and Europe, and then, also, separately, Peru. The process of cross-knit looping shares some similarities with net-making and basket-weaving, so it doesn’t seem so odd to me that the process was invented again in Peru.

For a while, I thought it was even weirder that the Peruvians didn’t get to knitting – they were so close, they made something that looks so similar. But cross-knit-looping doesn’t actually share any more similarities with knitting than naalbinding does, or even than more common crafts like basketweaving or weaving – the tools are different, the process is different, etc.

So the question should be the same for the Romans or any other culture with yarn and sticks before 1000 CE: why didn’t they invent knitting? They had all the pieces. …Didn’t they?

Yeah, I think they did.

Part 4: Many stones can form an arch, singly none

Let’s jump topics for a second. In Egypt, a millennium before there were knit socks, there was the Library of Alexandria. Zenodotus, the first known head librarian at the Library of Alexandria, organized lists of words – and probably the library’s books – in alphabetical order. He’s the first person we know of to alphabetize books with this method, somewhere around 300 BCE.

Then, it takes 500 years before we see alphabetization of books by the second letter.3

The first time I heard this, I thought: Holy mackerel. That’s a long time. I know people who are very smart, but I’m not sure I know anyone smart enough to invent categorizing things by the second letter.

But. Is that true? Let’s do some Fermi estimates. The world population was 1.66E8 (166 million) in 500 BCE and 2.02E8 (202 million) in 200 CE. But only a tiny fraction would have had access to books, and only a fraction of those in the western alphabet system. (And of course, people outside of the Library of Alexandria with access to books could have done it and we just wouldn’t know, because that fact would have been lost – but people have actually studied the history of alphabetization and do seem to treat this as the start of alphabetization as a cultural practice, so I’ll carry on.)

For this rough estimate, I’ll average the world population over that period to 2E8. Assuming a 50-year lifespan, that’s 10 lifespans and thus about 2E9 people living in the window. If only one in a thousand people would have been in a place to have the idea and have it recognized (e.g. access to lots of books), that’s 2E6 people in a position to invent it – so the inventor only needed to be about 1 in 2 million. That’s suddenly not unreachable. Especially since I think “1 in 1,000 ‘being able to have the idea’” might be too high – and if it’s more like “1 in 10,000” or lower, the end number could be more like 1 in 200,000. I might actually know people who are 1 in 1 million smart – I have smart friends. So there’s some chance I know someone smart enough to have invented “organizing by the second letter of the alphabet”.
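If you want to check the arithmetic, here’s the whole estimate as a tiny script. The population figure, lifespan, window length, and the “fraction of people in a position to have the idea” are all the rough guesses from above, not real data:

```python
# Fermi estimate: how rare did the "alphabetize by the second letter" inventor need to be?
# All inputs are the rough assumptions from the text, not measured values.

avg_population = 2e8        # rough average world population over the window
window_years = 500          # ~300 BCE to ~200 CE
lifespan_years = 50         # assumed average lifespan
fraction_positioned = 1e-3  # guess: fraction of people with access to lots of books

people_in_window = avg_population * (window_years / lifespan_years)  # ~2e9
candidates = people_in_window * fraction_positioned                  # ~2e6

print(f"People alive in the window: {people_in_window:.0e}")
print(f"People in a position to have the idea: {candidates:.0e}")
print(f"So the inventor needed to be roughly 1 in {candidates:,.0f}")
```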

Sidenote: Ancient bacteria couldn’t knit

A parallel in biology: Some organisms emit alcohol as a waste product. For thousands of years, humans have been concentrating alcohol in one place to kill bacteria. (… Okay, not just to kill bacteria.) Between 2005 and 2015, some bacteria became about 10x more resistant to alcohol.

Isn’t it strange that this only happened in the last 10 years? This question actually led, via a winding path, to the idea that became my Funnel of Human Experience blog post. I forgot to answer the question, but suffice to say that if alcohol production is in some way correlated with the human population, 10 years is more significant than it sounds but still not very much.

And yet, alcohol resistance seems to have evolved in Enterococcus faecium only recently. The authors postulate the spread of alcohol handwashing. Seems as plausible as anything. Or maybe it’s just difficult to evolve.

Knitting continues to interest me, because a lot of examples of innovation do rely heavily on what came before. To have invented organizing books by the second letter of the alphabet, you have to have invented organizing books by the first letter of the alphabet, and also know how to write, and have access to a lot of books for the second letter to even matter.

The sewing machine was invented in 1790 CE and improved drastically over the next 60 years, becoming widely used to automate a time-consuming and extremely common task. We could ask: “But why wasn’t the sewing machine invented earlier, like in 1500 CE?”

But we mostly don’t, because to invent a sewing machine, you also need very finely machined gears and other metal parts, and that technology also came up around the industrial revolution. You just couldn’t have made a reliable sewing machine in 1500 CE, even if you had the idea – you didn’t have all the steps. In software terms, as a technology, sewing machines have dependencies. Thus, the march of human progress, yada yada yada.

But as far as I can tell, you had everything that went into knitting for thousands of years beforehand. You had sticks, you had yarn, you had the motivation. Knitting doesn’t have dependencies after that. And you had brainpower: people in the past everywhere were making fiber into yarn and yarn into clothing all of the time – seriously, making clothes from scratch takes so much time.

And yet, knitting is very recent. That was so big of a leap that it took thousands of years for someone to figure it out.


UPDATE: see the follow-up to this post with more findings from the earliest days of knitting, crochet, sprang, etc.


1 I’m not displaying the meme itself in this otherwise image-happy post because if I do, one of my friends will read this essay and get to the meme but stop reading before they get to the part where I say the meme is incorrect. And then the next time we talk, they’ll tell me that they read my blog post and liked that part where a Youtuber proved that this mysterious Roman artifact was used to knit gloves, and hah, those silly historians! And then I will immediately get a headache.

2 Flexible circular knitting needles for knitting tubes are, as you might guess, also a more modern invention. If you’re in the Medieval period, it’s four sticks or bust.

3 My girlfriend and I made a valiant attempt to verify this, including squinting at some scans of fragments from Ancient Greek dictionaries written on papyrus from Papyri.info – which is, by the way, easily one of the most websites of all time. We didn’t make much headway.

The dictionaries or bibliographies we found on papyrus seem to be ordered completely alphabetically, but even those “source texts” were copies from ~1500 CE or that kind of thing, of much older (~200 CE) texts. So those texts we found might have been alphabetized by the copiers. Also, neither of us know Ancient Greek, which did not help matters.

Ultimately, this citation about both primary and secondary alphabetization seems to come from Lloyd W. Daly’s well-regarded 1967 book Contributions to a history of alphabetization in Antiquity and the Middle Ages, which I have not read. If you try digging further, good luck and let me know what you find.

[Crossposted to LessWrong.]

Algorithmic horror

There’s a particular emotion that I felt a lot over 2019, much more than any other year. I expect it to continue in future years. That emotion is what I’m calling “algorithmic horror”.

It’s confusion at a targeted ad on Twitter for a product you were just talking about.

It’s seeing a “recommended friend” on facebook, but who you haven’t seen in years and don’t have any contact with.

It’s skimming a tumblr post with a banal take and not really registering it, and then realizing it was written by a bot.

It’s those baffling Lovecraftian kid’s videos on Youtube.

It’s a disturbing image from ArtBreeder, dreamed up by a computer.

Pictured: a normal dog. Don’t worry about it. It’s fine.

I see this as an outgrowth of ancient, evolution-calibrated emotions. Back in the day, our lives depended on quick recognition of the signs of other animals – predator, prey, or other humans. There’s a moment I remember from animal tracking where disparate details of the environment suddenly align – the way the twigs are snapped and the impressions in the dirt suddenly resolve themselves into the idea of deer.

In the built environment of today, we know that most objects are built by human hands. Still, it can be surprising to walk in an apparently remote natural environment and find a trail or structure, evidence that someone has come this way before you. Skeptic author Michael Shermer calls this “agenticity”, the human bias towards seeing intention and agency in all sorts of patterns.

Or, as argumate puts it:

the trouble is humans are literally structured to find “a wizard did it” a more plausible explanation than things just happening by accident for no reason.

I see algorithmic horror as an extension of this: algorithmically generated objects masquerading as human-made. I looked up oarfish merchandise on Amazon, to see if I could buy anything commemorating the world’s best fish, and found this hat.

If you look at the seller’s listing, you can confirm that all of their products are like this.

It’s a bit incredible. Presumably, no #oarfish hat has ever existed. No human ever created an #oarfish hat or decided that somebody would like to buy them. Possibly, nobody had ever even viewed the #oarfish hat listing until I stumbled onto it.

In a sense this is just an outgrowth of custom-printing services that have been around for decades, but… it’s weird, right? It’s a weird ecosystem.

But human involvement can be even worse. All of those weird Youtube kid’s videos were made by real people. Many of them are acted out by real people. But they were certainly done to market to children, on Youtube, and named and designed in order to fit into a thoughtless algorithm. You can’t tell me that an adult human was ever like “you know what a good artistic work would be?” and then made “Learn Colors Game with Disney Frozen, PJ Masks Paw Patrol Mystery – Spin the Wheel Get Slimed” without financial incentives created by an automated program.

If you want a picture of the future, imagine a faceless adult hand pulling a pony figurine out of a plastic egg, while taking a break between cutting glittered balls of playdoh in half, silent while a prerecorded version of Skip To My Lou plays in the background, forever.

Everything discussed so far is relatively inconsequential, foreshadowing rather than the shade itself. But algorithms are still affecting the world and harming people now – setting racially-biased bail in Kentucky, making potentially-biased hiring decisions, facilitating companies recording what goes on in your home, even forcing career Youtubers to scramble and pivot as their videos become more or less recommended.

To be clear, algorithms also do a great deal of good – increasing convenience and efficiency, decreasing resource consumption, probably saving lives as well. I don’t mean to write this to say “algorithms are all-around bad”, or even “algorithms are net bad”. Sometimes it’s done solely with good intentions, and it still sounds incredibly creepy – like how Facebook is judging how suicidal all of its users are.

This is an elegant instance of Goodhart’s Law. Goodhart’s Law says that if you want a certain result and issue rewards for a metric related to the result, you’ll start getting optimization for the metric rather than the result.

The Youtube algorithm – and other algorithms across the web – are created to connect people with content (in order to sell to advertisers, etc.) Producers of content want to attract as much attention as possible to sell their products.

But the algorithms just aren’t good enough to perfectly offer people the online content they want. They’re simplified, rely on keywords, can be duped, etcetera. And everyone knows that potential customers aren’t going to trawl through the hundreds of pages of online content themselves for the best “novelty mug” or “kid’s video”. So a lot of content exists, and decisions are made, that fulfill the algorithm’s criteria rather than our own.
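To make that mismatch concrete, here’s a toy sketch – entirely made up, not any real recommender, with an invented keyword list and scoring. The thing we care about is whether a viewer actually enjoys a video, but the only thing the “algorithm” below can see is keyword matches, so optimizing the visible score picks the keyword soup:

```python
# Toy Goodhart's Law demo: optimizing a proxy metric (keyword matches) instead of
# the real goal (viewer enjoyment). All names and numbers are invented.

HOT_KEYWORDS = {"frozen", "paw patrol", "surprise", "slime", "colors"}

def keyword_score(title: str) -> int:
    """The proxy the algorithm can see: how many hot keywords the title hits."""
    return sum(kw in title.lower() for kw in HOT_KEYWORDS)

def viewer_enjoyment(title: str, effort: float) -> float:
    """The thing we actually care about, which the algorithm can't see directly."""
    return effort - 0.5 * keyword_score(title)  # keyword-soup titles slightly annoy viewers

# Two videos a creator could make: (title, effort put into the actual content)
candidates = [
    ("A lovingly made short film about foxes", 9.0),
    ("Learn Colors Frozen Paw Patrol Surprise Slime", 2.0),
]

promoted = max(candidates, key=lambda c: keyword_score(c[0]))
preferred = max(candidates, key=lambda c: viewer_enjoyment(*c))

print("The proxy metric promotes:", promoted[0])
print("Viewers would actually prefer:", preferred[0])
```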

In a sense, when we look at the semi-coherent output of algorithms, we’re looking into the uncanny valley between the algorithm’s values and our own.

We live in strange times. Good luck to us all for 2020.


Aside from its numerous forays into real life, algorithmic horror has also been at the center of some stellar fiction. See:

From the Month of Halloween – The Devil’s Hoofprints

UPDATE: I actually posted this to the wrong blog. But I’ll leave it up as a reminder that October has started, and as a taste of what’s happening over at the Month of Halloween 2018 for the rest of the month.  Thanks for your patience!


The story tells of a chilly February morning in 1855. Smoke from the night’s fires puffing up through chimneys. Villagers across the countryside of Southern England woke up to a strange sight: trails of large hoofprints in the thick snow, in single file. These trails crossed the county back and forth, making about a hundred-mile journey. The tracks crossed rivers, wound through cities, and most disconcertingly were seen going straight up houses, across the roofs, and down the other side, without a break. What or who would have walked with this one-legged gait?


This text and image are reproduced from Mysteries of the Unexplained, a 1982 publication of The Reader’s Digest Association

This is the first of a few Month of Halloween treats you’ll see drawn from Mysteries of the Unexplained. An early childhood staple of mine (originally making its appearance in my elementary school library), it contains a vast collection of mysterious news reports and anecdotes on a variety of subjects. The concept and some of the entries were borrowed wholesale from Charles Fort, a 1930s writer who collected such stories as well and knit them together with his own bizarre philosophies. Mysteries of the Unexplained may be less original, but it at least pretends to maintain some objectivity, so there’s that.

(I’ve skimmed over some snippets of Fort’s writing and it reads like 1910’s newspaper journalism mixed with an advertisement for a salt lamp that purifies WiFi – which is to say, delightful.)

So as per everything that comes out of Fort and Mysteries of the Unexplained, I must clarify that this story possibly isn’t real. All of these accounts in this book came from someone and are written down as if infallible, and probably a large number of them were invented wholesale. Or are at least garbled versions of something real. We know that drawings and a description of the event were published in a London newspaper in 1855, and evidence was collected by a vicar in the area around the same time.

It was certainly enough to scare me in middle school. And it’s a good story, right?

Nemesis club

[Cover photo taken by T. R. Shankar Raman, under a CC BY-SA 4.0 license.]

College season is starting and many, including me, will be returning to school soon. In that spirit, I thought I’d try and pitch the Eukaryotes Read Blog collective on an idea I never tried out in undergrad.*

On undergrad campuses, fall is a magical time. A lot of energetic new students have found themselves joining together, bereft of their previous friends and social networks, away from their family, drastically changing their lifestyles, and making it on their own in the world.

University campuses are well-equipped to help you make friends. There are plenty of campus-organized bonding opportunities in the first few weeks, and, if you’re like most people, you’ll end up making friends with roommates, classmates, other people on your floor, people you eat with in the cafeteria, etc.

What university campuses do not help you make are enemies.

Enemies are an important and time-honored form of human relationship. Beowulf had Grendel, Batman had Catwoman, St. Patrick had the snakes. But forming and nurturing early-stage enemyships can be difficult. Sometimes your enemy has killed your son and you’ve come to exact retribution, or you’re investigating the same murder, or you’re driving your enemy out of Ireland. But these opportunities are few and far between.

Don’t get me wrong. True nemesis relationships can happen early on in the college career. Maybe you were in a conversation with them about land management during your “get acquainted” circle in Orientation Week, and their opinions were so bad you wanted to punch them, and now they’re dating your roommate. But most people aren’t looking to make nemeses off the bat, and the stifling atmosphere of today’s college campuses – rife with memes like “be good to each other” – is simply not fertile ground for real adversarial relationships.

Without this release valve early on, nemeses tend to form painfully and explosively at random points throughout your college career, when you’ve already signed a 12-month lease with them. Eventually, they get increasingly awful, and you have to kick them out and suffer through a massive screaming fit that goes on all night when you have a six-hour O-chem lab the next day. Go fuck yourself, Amy.

I don’t want that. None of us want that. Enter Nemesis Club.

Upon joining Nemesis Club, you fill out a form. It asks for your name and class standing, and goes into what you’re looking for in a nemesis.

[Image: a sample nemesis-matching survey.]

Over the next week, organizers match you with another participant with similar needs and desires. Congratulations! You now have a nemesis. While you make friends, reorient your life, and try to ace your classes, this friendly face will be there to curse, shake your fist at, and plot against.

There’s a tricky balance here – society has poorly equipped us for the nuances of the comradversarial relationship, so the club has to be ready to help members navigate this. What if two nemeses have different assumptions about the seriousness of the enemyship? What if people are unhappy in their nemesis bonds?

It’s important that these bonds be navigated carefully. Ideally, these relationships will be satisfying. Maybe they’ll lead to academic success, grudging friendship, or romance. Maybe at the end of the college career, both nemeses will set aside their grudges and continue their lives as pals. Or maybe, if we’re lucky, these connections will blossom into life-long rivalries.

If anyone starts a nemesis club, or some variation of it, do let me know.


  • My ex did actually briefly try to start this. I declined to participate because he refused to take out “physical violence” as a club-endorsed nemesis activity (between willing participants). I admire his commitment to the aesthetic, but I have to disrecommend this approach if you’re planning on starting your own, for reasons of legitimacy and of attracting members – most of whom really won’t want to be involved with something that includes physical violence.

The children of 3,500,000,000 years of evolution

[NASA image of the winter solstice from space. Found here.]

This is the speech I gave during the “Twilight” portion of Seattle’s 2017 Secular Solstice. See also the incomparable Jai’s speech. A retrospective on our solstice and how we did it is coming soon.


Eons ago, perhaps in a volcanic vent in the deep sea, under crushing pressure, in total darkness, chemicals came together in a process that made copies of itself. We’re not exactly sure how this happened – perhaps a simple tangle of molecules grabbed other nearby molecules, and formed them into identical tangles.

You know the story – some of those chemical processes made mistakes along the way. A few of those copies were better at copying themselves, so there were more of them. But some of their copies were subtly different too. And so it goes. This seems straightforward, but this alone is the mechanic of evolution, the root of the tree of life. Everything else follows.

So this tangle of protein or DNA or whatever-it-was in the deep sea keeps going. This chemical process grows a cell wall, DNA, a metabolism, starts banding together and eating sunlight.

By this point, the deep-sea vent itself had long since been swallowed up by tectonic plates, the rock recycled into magma beneath the ocean floor. But the process carried on.

Biologists even understand that if you let this process run for long enough, it starts going to war, and paying taxes, and curing diseases, and driving old beat-up cars, and lying awake at night wondering what it means to exist at all.

All of that? Evolution didn’t tell us to do that. Evolution is what gave you a fist-sized ball of neurons, and gave you the tools to reshape those neurons based on what you learned. And you did the rest.

Sure, evolution gave you some other things – hands for grabbing, a voice for communicating, a vague predilection for fat and sugar and other entities who are similar to you. But all of this is the output of a particular process – a long and unlikely chemical process for which you, the building blocks of your brain, your hands, your tastes, are a few of the results. None of this happened on purpose. In the eyes of the evolutionary tree of life, you can’t think about existing ‘for a greater reason’ beyond the result of this process. What would that mean? Does fusion ‘happen on purpose’? Does gravity work ‘for a greater reason’?

This might sound nihilistic. I think this has two lessons for us. First of all, when you and your friends are sitting in a diner eating milkshakes and french fries at 2 AM, as far as evolution gets any say in your life, you’re doing just fine.

But here’s the other thing – we’re a biological process. Apparently, we’re just what happens when you mix rocks and water together and then wait 3.5 billion years. Everything around us today, our lives, our struggles, nobody prepared us for this. It makes sense that there will be times when nothing makes sense. When your body or your brain don’t seem to be enough, well, we weren’t made for anything.

Nobody exists on purpose. There’s no promise that we’ll get to keep existing. There’s no assurance that we, as a species, will be able to solve our problems. Maybe one day we’ll run into something that’s just too big, and the tools evolution gave us won’t be enough. It hasn’t happened yet, but what do we know? As far as we’re aware, we’re the only processes in the whole wide night sky that have ever come this far at all. We don’t have the luxury of examples or mentors to look to.

All we have are these tools, this earth, this process, these hands, these minds, each other. Nothing less and nothing more.


This blog has a Patreon. If you like what you’ve read, consider giving it a look.

Diversity and team performance: What the research says

(Photo of group of people doing a hard thing from Wikimedia user Rizimid, CC BY-SA 3.0.)

This is an extended version (more info, more sources) of the talk I gave at EA Global San Francisco 2017. The other talk I gave, on extinction events, is here. Some more EA-focused pieces on diversity, which I’ve read but which were assembled by the indomitable Julia Wise, are:

Effective altruism means effective inclusion

Making EA groups more welcoming

EA Diversity: Unpacking Pandora’s Box

Keeping the EA Movement welcoming

How can we integrate diversity, equity, and inclusion into the animal welfare movement?

Pitfalls in diversity outreach


There are moral, social, etc. reasons to care about diversity, all of which are valuable. I’m only going to look at one aspect, which is performance outcomes. The information I’m drawing on here comes primarily from meta-analyses and experiments in a business context.

Diversity here mostly means demographic diversity (culture, age, gender, race) as well as informational diversity – educational background, for instance. As you might imagine, each of these has different impacts on team performance, but if we treat them as facets of the same thing (“diversity”), some interesting things fall out.

(Types of diversity which, as far as I’m aware, these studies largely didn’t cover: class/wealth, sexual orientation, non-cis genders, disability, most personality traits, communication style, etc.)

Studies don’t show that diversity has an overall clear effect, positive or negative, on the performance of teams or groups of people. (1) (2) The same may also be true on an organizational level. (3)

If we look at this further, we can decompose it into two effects (one where diversity has a neutral or negative impact on performance, and one where it has a mostly positive impact): (4) (3)

Social categorization

This is the human tendency to have an ingroup / outgroup mindset. People like their ingroup more. It’s an “us and them” mentality and it’s often totally unconscious. When diversity interacts with this, the effects are often – though not always – negative.

Diverse teams tend to have:

  • Lower feelings of group cohesion / identification with group
  • Worse communication (3)
  • More conflict (of productive but also non-productive varieties) (also the perception of more conflict) (5)
  • Biases

A silver lining: One of these ingrouping biases is the expectation that people more similar to us will also think more like us. Diversity clues us into diversity of opinions. (6) This gets us into:

Information processing 

— 11/9/17 – I’m much less certain about my conclusions in this section after further reading. Diversity’s effects on creativity/innovation and problem-solving/decision-making have seen mixed results in the literature. See the comments section for more details. I now think the counterbalancing positive force of diversity might mostly be as a proxy for intellectual diversity. Also, I misread a study that was linked here the first time and have removed it. The study is linked in the comments. My bad! —

Creative, intellectual work. (7) Diversity’s effects here are generally positive. Diverse teams are better at:

  • Creativity (2)
  • Innovation (9)
  • Problem solving. Gender diversity is possibly more strongly correlated with group problem-solving performance than the individual intelligence of group members is. (Note: A similarly-sized replication failed to find the same results. Taymon Beal kindly brought this to my attention after the talk.) (10)

Diverse teams are more likely to discuss alternate ideas, look at data, and question their own beliefs.


This loosely maps onto the “explore / exploit” or “divergent / convergent” processes for projects. (2)

    1. Information processing effects benefit divergent / explore processes.
    2. Social categorization harms convergent / exploit processes.

If your group is just trying to get a job done and doesn’t have to think much about it, that’s when group cohesiveness and communication are most important, and diversity is less likely to help and may even harm performance. If your group has to solve problems, innovate, or analyze data, diversity will give you an edge.


How do we get less of the bad thing? Teams work together better when you can take away harmful effects from social categorization. Some things that help:

    1. The more balanced a team is along some axis of diversity, the less likely you are to see negative effects on performance. (12) (7) Having one woman on your ten-person research team might not do much to help and might trigger social categorization. If you have five women, you’re more likely to see benefits.
    2. Remote teams show less gender bias, so online teams will be less prone to it.
    3. Time. Obvious diversity becomes less salient to a group’s work over time, and diverse teams end up outperforming non-diverse teams. (13) (6) Recognition of less-obvious cognitive differences (e.g. personality and educational diversity) increases over time. As we might hope, the longer a group works together, the less surface-level differences matter.

This article has some ideas on minimizing problems from language fluency, and also for making globally dispersed teams work together better.


How do we get more of the good thing? Diversity is a resource – more information and cognitive tendencies. Having diversity is a first step. How do we get more out of it?

    1. At least for age and educational diversity: a high need for cognition. This is the drive of individual members to find information and think about things. (It’s not the same as, or especially correlated with, either IQ or openness to experience (1).)

Harvard Business Review suggests that diversity triggers people to stop and explain their thinking more. We’re biased towards liking and not analyzing things we feel more comfortable with – the “fluency heuristic.” (14) This is uncomfortable work, but if people enjoy doing it, they’re more likely to do it, and get more out of diversity.

But need for cognition is also linked with doing less social categorization in the first place, so maybe diverse groups with high levels of it just get along better or are more pleasant for all parties. Either way, a group of people who really enjoy analyzing and solving problems is likely to get more out of diversity.

2) A positive diversity mindset. This means that team members have an accurate understanding of the potential positive effects of diversity in the context of their work. (4) If you’re working in a charity, you might expect that the group assigned to brainstorm new ways to reach donors will benefit from diversity more than the group assigned to fix your website. That’s probably true. But it’s especially true if they understand how diversity will help them in particular. You could perhaps have your team brainstorm ideas, or look up how diversity affects your particular task. (I was able to find results quickly for diversity in fundraising, diversity in research, diversity in volunteer outreach… so there are resources out there.)


Again, note that diversity’s effect size isn’t huge. It’s smaller than the effect size of support for innovation, external and internal communication, vision, task orientation, and cohesion – all these things you might correctly expect correlate with performance more than diversity (8). That said, I think a lot of people [at EA Global] want to do these creative, innovative, problem-solving things – convince other people to change lives, change the world, stop robots from destroying the earth. All of these are really important and really hard, and we need any advantage we can get.


  1. Work Group Diversity
  2. Understanding the effects of cultural diversity in teams: A meta-analysis of research on multicultural work groups
  3. The effects of diversity on business performance: Report of the diversity research network
  4. Diversity mindsets and the performance of diverse teams
  5. The biases that punish racially diverse teams
  6. Time, Teams, and Task Performance
  7. Role of gender in team collaboration and performance
  8. Team-level predictors of innovation at work: A comprehensive meta-analysis spanning three decades of research
  9. Why diverse teams are smarter
  10. Evidence of a collective intelligence factor in the performance of human groups
  11. When and how diversity benefits teams: The importance of team members’ need for cognition
  12. Diverse backgrounds and personalities can strengthen groups
  13. The influence of ethnic diversity on leadership, group process, and performance: an examination of learning teams
  14. Diverse teams feel less comfortable – and that’s why they perform better

Fictional body language

Here’s something weird.

A common piece of advice for fiction writers is to “show, not tell” a character’s emotions. It’s not bad advice. It means that when you want to convey an emotional impression, describe the physical characteristics instead.

The usual result of applying this advice is that instead of a page of “Alice said nervously” or “Bob was confused”, you get a vivid page of action: “Alice stuttered, rubbing at her temples with a shaking hand,” or “Bob blinked and arched his eyebrows.”

The second thing is certainly better than the first thing. But something strange happens when the emotion isn’t easily replaced with a readily described bit of body language. Characters in books whose authors follow this advice seem to do a lot more yawning, trembling, sighing, emotional swallowing, groaning, and nodding than I or anyone I talk to does in real life.

It gets even stranger. These characters bat their lashes, or grip things so tightly their knuckles go white, or grit their teeth, or their mouths go dry. I either don’t think I do these things, or wouldn’t notice someone else doing them.

Blushing is a very good example, for me. Because I read books, I knew enough that I could describe a character blushing in my own writing, and the circumstances in which it would happen, and what it looked like. I don’t think I’d actually noticed anyone blush in real life. A couple months after this first occurred to me, a friend happened to point out that another friend was blushing, and I was like, oh, alright, that is what’s going on, I guess this is a thing after all. But I wouldn’t have known before.

To me, it was like a piece of fictional body language we’ve all implicitly agreed represents “the thing your body does when you’re embarrassed or flattered or lovestruck.” I know there’s a particular feeling there, which I could attach to the foreign physical motion, and let the blushing description conjure it up. It didn’t seem any weirder than a book having elves.

(Brienne has written about how writing fiction, and reading about writing fiction, has helped her get better at interpreting emotions from physical cues. They certainly are often real physical cues – I just think the points where this breaks down are interesting.)

Online

There’s another case where humans are innovatively trying to solve the problem of representing feelings in a written medium, which is casual messaging. It’s a constantly evolving blend of your best descriptive words, verbs, emoticons, emojis, and now stickers and gifs and whatever else your platform supports. Let me draw your attention to the humble emoticon, a marvel of written language. A handful of typographic characters represent a human face – something millions of years of evolution have fine-tuned our brains to interpret precisely.

(In some cases, these are pretty accurate: :) and ^_^ represent more similar things than :) and ;), even though ^_^ doesn’t even have the classic turned-up mouth of representational smiles. Body language: it works!)

:)

:|

:<

Now let’s consider this familiar face:

:P

And think of the context in which it’s normally found. If someone was talking to you in person and told a joke, or made a sarcastic comment, and then stuck their tongue out, you’d be puzzled! Especially if they kept doing it! Despite being a clear representation of a human face, that expression only makes sense in a written medium.

I understand why something like :P needs to exist: If someone makes a joke at you in meatspace, how do you tell it’s a joke? Tone of voice, small facial expressions, the way they look at you, perhaps? All of those things are hard to convey in character form. A stuck-out tongue isn’t, and we know what it means.

The ;) and :D emojis translate to meatspace a little better, maybe. Still, when’s the last time someone winked slyly at you in person?

You certainly can communicate complex things by using your words [CITATION NEEDED], but especially in casual conversations, it’s nice to have expressive shortcuts. I wrote a bit ago:

Facebook Messenger’s addition of choosing chat colors and customizing the default emoji has, to me, made a weirdly big difference to what it feels like to use them. I think (at least with online messaging platforms I’ve tried before) it’s unique in letting you customize the environment you interact with another person (or a group of people) in.

In meatspace, you might often talk with someone in the same place – a bedroom, a college dining hall – and that interaction takes on the flavor of that place.

Even if not, in meatspace, you have an experience in common, which is the surrounding environment. It sets that interaction apart from all of the other ones. Taking a walk or going to a coffee shop to talk to someone feels different from sitting down in your shared living room, or from meeting them at your office.

You also have a lot of specific qualia of interacting with a person – a deep comfort, a slight tension, the exact sense of how they respond to eye contact or listen to you – all of which are either lost or replaced with cruder variations in the low-bandwidth context of text channels.

And Messenger doesn’t do much, but it adds a little bit of flavor to your interaction with someone besides the literal string of unicode characters they send you. Like, we’re miles apart and I may not currently be able to hear your voice or appreciate you in person, but instead, we can share the color red and send each other a picture of a camel in three different sizes, which is a step in that direction.

(Other emoticons sometimes take on their own valences: The game master in an online RPG I played in had a habit of typing only “ : ) ” in response when you asked him a juicy question, which quickly filled players with a sense of excitement and foreboding. I’ve tried using it since then on other platforms, before realizing that it doesn’t actually convey any of that to literally anyone else. Similarly, users of certain websites may have a strong reaction to the typographic smiley “uwu”.)

Reasoning from fictional examples

In something that could arguably be called a study, I grabbed three books and chose some arbitrary pages in them to look at how characters’ emotions are represented, particularly around dialogue.

Lirael by Garth Nix:

133: Lirael “shivers” as she reads a book about a monster. She “stops reading, nervously swallows, and reads the last line again”, and “breathes a long sigh of relief.”

428: She “nods dumbly” in response to another character, and stares at an unfamiliar figure.

259: A character smiles when reading a letter from a friend.

624: Two characters “exchange glances of concern”, one “speaks quickly”.

Most of these are pretty reasonable. The first one feels a bit overdone to me, but then again, she’s really agitated when she’s reading the book, so maybe that’s fair? Nonetheless, flipping through, I think that this is Garth Nix’s main strategy. The characters might speak “honestly” or “nervously” or “with deliberation” as well, but when Nix really wants you to know how someone’s feeling, he’ll show you how they act.

American Gods by Neil Gaiman:

First page I flipped to didn’t have any.

364: A character “smiles”, “makes a moue”, “smiles again”, “tips her head to one side”. Shadow (the main character) “feels himself beginning to blush.”

175: A character “scowls fleetingly.” A different character “sighs” and his tone changes.

The last page also didn’t have any.

Gaiman does more laying out a character’s thoughts: Shadow imagines how a moment came to happen, or it’s his interpretation that gives flavor – “[Another character] looked very old as he said this, and fragile.”

Earth by David Brin:

First two pages I flipped to didn’t have dialogue.

428: Characters “wave nonchalantly”, “pause”, “shrug”, “shrug” again, “fold his arms, looking quite relaxed”, speak with “an ingratiating smile”, and “continue with a smile”.

207: Characters “nod” and one “plants a hand on another’s shoulder”.

168: “Shivers coursed his back. Logan wondered if a microbe might feel this way, looking with sudden awe into a truly giant soul.” One’s “face grows ashen”, another “blinks.” Amusingly, “the engineer shrugged, an expressive gesture.” Expressive of what?

Brin spends a lot of time living in characters’ heads, describing their thoughts. This gives him time to build his detailed sci-fi world, and also gives you enough of a picture of characters that it’s easy to imagine their reactions later on.

How to use this

I don’t think this is necessarily a problem in need of a solution, but fiction is trying to represent the way real people might act. Even if the premise of your novel starts with “there’s magic”, it probably doesn’t segue into “there’s magic and also humans are 50% more physically expressive, and they are always blushing.” (…Maybe the blushing thing is just me.) There’s something appealing about being able to represent body language accurately.

The quick analysis in the section above suggests at least three ways writers express how a fictional character is feeling to a reader. I don’t mean to imply that any one is objectively better than the others, although the third is my favorite.

1) Just describe how they feel. “Alice was nervous”, “Bob said happily.”

This gives the reader information. How was Alice feeling? Clearly, Alice was nervous. It doesn’t convey nervousness, though. Saying the word “nervous” does not generally make someone nervous – it takes some mental effort to translate that into nervous actions or thoughts.

2) Describe their action. A character’s sighing, their chin stuck out, their unblinking eye contact, their gulping. Sheets like these exist to help.

I suspect these work in one of two ways:

  1. You can imagine yourself doing the action, and then what mental state might have caused it. Especially if it’s the main character, and you’re spending time in their head anyway. It might also be “Wow, Lirael is shivering in fear, and I have to be really scared before I shiver, so she must be very frightened,” though I imagine that making this inference is asking a lot of a reader.
  2. You can visualize a character doing it, in your mental map of the scene, and imagine what you’d think if you saw someone doing it.

Either way, the author is using visualization to get you to recreate being there yourself. This is where I’m claiming some weird things like fictional body language develop.

3) Use metaphor, or describe a character’s thoughts, in such a way that the reader generates the feeling in their own head.

Gaiman in particular does this quite skillfully in American Gods.

[Listening to another character talk on and on, and then pause:] Shadow hadn’t said anything, and hadn’t planned to say anything, but he felt it was required of him, so said, “Well, weren’t they?”

[While in various degrees of psychological turmoil:] He did not trust his voice not to betray him, so he simply shook his head.

[And:] He wished he could come back with something smart and sharp, but Town was already back at the Humvee, and climbing up into the car; and Shadow still couldn’t think of anything clever to say.

Also metaphors, or images:

Chicago happened slowly, like a migraine.

There must have been thirty, maybe even forty people in that hall, and now they were every one of them looking intently at their playing cards, or their feet, or their fingernails, and pretending as hard as they could not to be listening.

By doing the mental exercises written out in the text, by letting your mind run over them and provoke some images in your brain, the author can get your brain to conjure the feeling by using some unrelated description. How cool is that! It doesn’t actually matter whether, in the narrative, it’s occurred to Shadow that Chicago is happening like a migraine. Your brain is doing the important thing on its own.


(Possible Facebook messenger equivalents: 1) “I’m sad” or “That’s funny!” 2) Emoticons / emotive stickers, *hug* or other actions 3) Gifs, more abstract stickers.)


You might be able to use this to derive some wisdom for writing fiction. I like metaphors, for one.

If you want to do body language more accurately, you can also pay attention to exactly how an emotion feels to you, where it sits in your body or your mind – meditation might be helpful – and try and describe that.

Either might be problematic because people experience emotions differently – the exact way you feel an emotion might be completely inscrutable to someone else. Maybe you don’t usually feel emotions in your body, or you don’t easily name them in your head. Maybe your body language isn’t standard. Emotions tend to derive from similar parts of the nervous system, though, so you probably won’t be totally off.

(It’d also be cool if the reader then learned about a new way to feel emotions from your fiction, but the failure mode I’m thinking of is ‘reader has no idea what you were trying to convey.’)

You could also try people-watching (or watching TV or a movie), and examining how you know someone is feeling a certain way. I bet some of these are subtle – slight shifts in posture and expression – but you might get some inspiration. (Unless you had to learn this by memorizing cues from fiction, in which case this exercise is less likely to be useful.)


Overall, given all the shades of nuance that go into emotional valence, and the different ways people feel or demonstrate emotions, I think it’s hardly surprising that we’ve come up with linguistic shorthands, even in places that are trying to be representational.


[Header image is images from the EmojiOne 5.0 update assembled by the honestly fantastic Emojipedia Blog.]

Triptych in Global Agriculture

As I write this, it’s 4:24 PM in 2016, twelve days before the darkest day of the year. The sun has just set, but you’d be hard-pressed to tell behind the heavy layer of marbled gray cloud. There’s a dusting of snow on the lawns and the trees, and clumps on roofs, already melted off the roads by a day of rain. From my window, I can see lights glimmering in Seattle’s International District, and the buildings of downtown are starting to glow with flashing reds, neon bands on the Columbia Tower, and soft yellow on a thousand office windows. I’m starting to wonder what to eat for dinner.

It’s the eve before Seattle Effective Altruism’s Secular Solstice, a somewhat magical humanist celebration of our dark universe and the light in it. This year, our theme is global agriculture – our age-old answer to the question of “what are we, as a civilization, collectively going to eat for dinner?” We have not always had good answers to this question.

Civilization, culture, and the super-colony of humanity, the city, started getting really big when agriculture was invented, when we could concentrate a bunch of people in one place and specialize. It wasn’t much specialization, at first. Farmers or hunter-gatherers were the vast majority of the population, and the population of Ur, the largest city on earth, was around 65,000 people in 3000 BC. Today, farmers are 40% of the global population, and 2% in the US. In the 1890s, the city of Shanghai had half a million people. Today, it’s the world’s largest city, with 34 million residents.

What happened in those 120 years, or even the last 5000?

Progress, motherfuckers.

I’m a scientist, so the people I know of are scientists, and science is what’s shaped a lot of our agriculture in the last hundred years. When I think of the legacy of science and global agriculture, of people trying to figure out how we feed everyone, I think of three people, and I’ll talk about them here. I’ll go in chronological order, because it’s the order things go in already.

Fritz Haber, 1868-1934

[Image: Fritz Haber in his laboratory.]

Haber was raised in a Jewish family in Prussia, but converted to Lutheranism after getting his doctorate in chemistry – possibly to improve his odds of a high-ranking academic or military career. At the University of Karlsruhe in Germany, Haber and his assistant Robert Le Rossignol did the work that won Haber a Nobel Prize: they invented the Haber-Bosch process.

The chemistry of this reaction is pretty simple – it was a fact of chemistry at the time that if you added ammonia to a nickel catalyst, the ammonia decomposed into hydrogen and nitrogen. Haber’s twist was to reverse it – by adding enough hydrogen and nitrogen gas at a high pressure and temperature, the catalyst operates in reverse and combines the two into ammonia. Hydrogen is made from natural gas (CH4, or methane), and nitrogen gas is already 80% of the atmosphere.
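For the chemically inclined, the overall reaction is the standard textbook equilibrium (a general statement of the chemistry, not anything specific to Haber’s original apparatus):

N2 + 3 H2 ⇌ 2 NH3

Four molecules of gas become two, so squeezing the mixture to high pressure pushes the equilibrium toward ammonia; the catalyst doesn’t change where the equilibrium sits, it just gets you there fast enough to be useful.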

Here’s the thing – plants love nitrogen. Nitrogen is, largely, the limiting factor in land plants’ growth – when plants aren’t growing like mad, it’s usually because they don’t have enough nitrogen to make new proteins. When you give a plant nitrogen in a form it can assimilate, like ammonia, it grows like mad. The world’s natural nitrogen deposits – guano and saltpeter – were being stripped away to nothing, applied to crops to feed a growing population.

When Haber invented his process in 1909, ammonia became cheap. A tide was turning. The limiting factor of the world’s agriculture was suddenly no longer limiting.

Other tides were turning too. In 1914, Germany went to war, and Haber went to work on chemical weapons.

During peace time a scientist belongs to the World, but during war time he belongs to his country. – Fritz Haber

He studied deploying chlorine gas, thinking that it would shorten the war. Its effect has been described as “drowning on dry land”. After its first use on the battlefield, he received a promotion – on the same night, his wife killed herself. Clara Immerwahr, a fellow chemist, was a pacifist, and had shot herself with Haber’s military pistol. Haber continued his work. Scientists in his employ also eventually invented Zyklon B. First designed as a pesticide, the gas would, after his death, be used to murder his extended family (along with many others) in the Nazi gas chambers.

Anti-Jewish sentiment was growing in the last few years of his life. In 1933, he wasn’t allowed through the doors of his own institute. The same year, his friend and fellow German Jewish scientist Albert Einstein went to the German consulate in Belgium and gave back his passport – renouncing his citizenship under the Nazi-controlled government. Haber left the country, and died of a heart attack the next year.

I don’t know if Fritz Haber’s story has a moral. Einstein wrote about his colleague that “Haber’s life was the tragedy of the German Jew – the tragedy of unrequited love.” Haber was said to ‘make bread from air’ and said to be the father of chemical weapons. He certainly created horrors. What I might take from it more generally is that the future isn’t determined by whether people are good or bad, or altruistic or not, but by what they do, as well as what happens to the work that they do.

Nikolai Vavilov, 1887-1943

[Image: Vavilov in 1935.]

We shall go into the pyre, we shall burn… But we shall not abandon our convictions. – Nikolai Vavilov

As a young but wildly talented agronomist in Russia, and director of the Lenin All-Union Academy of Agricultural Sciences for over a decade, the shrewd and charismatic Nikolai Vavilov wanted to make Russia the world’s unrivaled expert in agriculture. He went on a series of trips across the globe to retrieve samples. He observed that in certain parts of the world, you find a much greater variety of a given crop species, with a wider range of characteristics and traits not seen elsewhere. This led to his breakthrough theory, his Vavilov centers of diversity: the greatest genetic diversity of a crop can be found where the species originated.

What has this told us about agriculture? This morning for breakfast, I had coffee (originally from Ethiopia) with soy milk (soybeans originally from China), toast (wheat from the Middle East) with margarine (soy oil, China, palm oil, West and Southwest Africa), and chickpeas (Central Asia) with black bean sauce (central or possibly South America) and pepper (India). One fairly typical vegan breakfast, seven centers of diversity.

He traveled to twelve Vavilov centers, the regions where the world’s food species were originally cultivated. In remote parts of the world, he gathered unique wheat and rye in the Hindu Kush, Spain, and Portugal, teff in Somalia, sugar beet and flax in the Mediterranean, potatoes in Peru, fava beans and pomegranates and hemp in Herat. He was robbed by bandits in Eritrea, and nearly died riding horseback along deep ravines in the Pamirs. The seeds he gathered were studied carefully back in Russia, tested in fields, and most importantly, cataloged and stored – by gathering a library of genetic diversity, Vavilov knew he was creating a resource that could be used to grow plants that would suit the country’s needs for decades to come. If a pest decimates one crop, you can find a resistant variety and plant it instead. If drought kills your rice, all you need to do is find a drought-tolerant strain of rice. At the Pavlovsk Experimental Research Station, Vavilov was building the world’s first seed bank.

[Image: Vavilov Centers of the world, from the Humanity Development Library of the NZDL.]

In Afghanistan, he saw wild rye intermingled with wheat in the fields, and used this as evidence of the origin of cultivated rye: that it wasn’t originally grown intentionally the way wheat or barley had been, but was a wheat mimic that had slipped into farms, taken advantage of the nurturing protection of human farmers, and, almost accidentally, become a popular food plant at the same time. Other Vavilovian mimics are oats and Camelina sativa.

While he travelled the world and became famous in the burgeoning global scientific community, Russia was changing. Stalin had taken over the government. He was collectivizing the country’s farms, and the scientific academies were dismissing staff based on bourgeois origin and increasing their focus on work of practical importance for the good of the people. A former peasant was working his way up through the agricultural institutions: Trofim Lysenko, who claimed that his theory of ‘vernalization’, or adapting winter crops to behave more like summer crops by treating the seeds with heat, would grow impossible quantities of food and solve hunger in Russia. Agricultural science was politicized in a way that it never had been – Mendelian genetics and the existence of chromosomes were seen as unacceptably reactionary and foreign. Instead, a sort of bastardized Lamarckism was popular – aside from being used by Lysenko to justify outrageous promises of future harvests that never quite came in, it said that every organism could improve its own position – a politically popular implication, but one which failed to hold up to experimental evidence.

Vavilov’s requests to leave the country were denied. His fervent Mendelianism and the way he fraternized with Western scientists were deeply suspicious to the ruling party. As his more resistant colleagues were arrested around him, his institute filled up with Lysenkoists, and his work was gutted. Vavilov refused to denounce Darwinism. Crops around Russia were failing under the new farming plans, and people starved as Germany invaded.

Vavilov’s devoted colleagues and students kept up his work. In 1941, the German Army reached the Pavlovsk Experimental Research Station, interested in seizing the valuable samples within – only to find it barren.

Vavilov’s colleagues had taken all 250,000 seeds in the collection by train into Leningrad. There, they hid them in the basement of an art museum and watched them in shifts all throughout the Siege of Leningrad. They saw themselves as protecting Russia’s future in agriculture. When the siege lifted in 1944, twelve of Vavilov’s scientists had starved to death rather than eat the edible seeds they guarded. Vavilov’s collection survived the war.

Gardening has many saints, but few martyrs. – T. Kingfisher

In 1940, Vavilov was arrested, and tortured in prison until he confessed to a variety of crimes against the state that he certainly never committed.

He survived for three years in the gulag. The German army advanced on Russia and terrorized the state. Vavilov, the man who had dreamed of feeding Russia, starved to death in prison in the spring of 1943. His seed bank still exists.

Vavilov’s moral, to me, is this: Science can’t be allowed to become politicized. Whatever the facts are, we have to build our beliefs around them, never the other way around.

Norman Borlaug, 1914-2009

[Image: Norman Borlaug in 1996. From Bill Meeks, AP Photo.]

Borlaug was raised on a family farm in Iowa, in a family of Norwegian immigrants. He studied crop pests, and had to take regular breaks from his education to work: he worked in the Civilian Conservation Corps during the Dust Bowl alongside starving men, and for the Forest Service in remote parts of the country. In World War 2, he worked on adhesives and other compounds for the US military. In 1944, he joined a project sponsored by the Rockefeller Foundation and the Mexican Ministry of Agriculture to improve Mexico’s wheat yields and stop it from having to import most of its grain. The project faced opposition from local farmers, mostly because wheat rust had been killing their crops. Mexico’s problem wasn’t unique – populations were growing globally. Biologist Paul Ehrlich wrote in 1968, “The battle to feed all of humanity is over … In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.”

Borlaug realized that by harvesting seeds in one part of the country and quickly moving them to another, the government could take advantage of the country’s two growing seasons and double the harvest.

By breeding many wheat strains together, farmers could make crops resistant to many more diseases.

He spread the use of Haber’s ammonia fertilizers, and bred special semi-dwarf strains of wheat that could hold up their heavy heads without bending over, and that grew better with nitrogen fertilizers.

Nine years later, Mexico’s wheat harvest was six times larger than it had been in 1944, and it had enough wheat to export.

Borlaug was sent to India in 1962, and along with Mankombu S. Swaminathan, they did it again. India was at war, dealing with famine and starvation, and was importing necessary grain for survival. They used Borlaug’s strains, and by 1968, were growing so much wheat that the infrastructure couldn’t handle it. Schoolhouses were converted into granaries.

His techniques spread. Wheat yields doubled in Pakistan. Wheat yields in the world’s least developed countries doubled. Borlaug’s colleagues used the same process on rice, and created cultivars that were used all over Asia. Borlaug saw a world devastated by starvation, recognized it for what it was, and treated it as a solvable problem. He took Haber’s mixed legacy and put it to work for humanity. Today, he’s known as the father of the Green Revolution, and his work is estimated to have saved a billion lives.

We would like his life to be a model for making a difference in the lives of others and to bring about efforts to end human misery for all mankind. – Statement from Borlaug’s children following his death


What’s next?

When I think of modern global agriculture, this is who I think of. I’ve been trying to find something connecting Vavilov and the Green Revolution, and haven’t turned up much – although it’s quite conceivable there is a connection, given Vavilov’s inspirational presence and the way he shared his samples throughout the globe. Borlaug’s prize wheat strain that saved those billion lives, Norin 10-Brevor 14, was a cross between Japanese and Washingtonian wheat. Past that, who knows?

One of the organizations protecting crop diversity today is the Consultative Group for International Agricultural Research (CGIAR), which was founded in 1971 by the Rockefeller Foundation as the Green Revolution was in full swing. They operate a variety of research stations worldwide, mostly at Vavilov Centers in the global south where crop diversity is highest. Their mission is to reduce global poverty, improve health, manage natural resources, and increase food security.

They must have been inspired by Vavilov’s conviction that crop diversity is essential for a secure food supply. If a legacy that’s saved literally a billion human lives can be said to have a downside, it’s that diets were probably more diverse before, and now 12 species make up 75% of our food plant supply. Monocultures are fragile, and if conditions change, a single disease is more likely to take out all of a crop.

[Image: the Svalbard Seed Vault. Image from Glamox.]

In 2008, CGIAR brought the first seed samples into the Svalbard Seed Vault – a concrete structure buried in the permafrost. It’s constructed as a refuge against whatever the world might throw at it. If the electricity goes out, the permafrost will keep the seeds cool. If sea levels rise, the vault is built on a hill. The land it’s on is geologically stable and very remote. And it stores 1,500,000 seeds – six times more than Vavilov’s 250,000 – at no cost to the countries that use it.

[Image: a graph of world hunger over time.]

Let it be known: starvation is on its last legs. We have a good thing going here. Still, with global warming and worse things still looming over the shoulder of this tentative victory, let’s give thanks to the movers and shakers of global agriculture for tomorrow: the people ensuring that whatever happens next, we are going to be fed.

We are going to be eating dinner, dammit.

Happy Solstice, everyone.

If Hollywood made “Ex Machina” but switched the genders

[Content note: Discussion of weird gender dynamics, acknowledgement of the existence of sex, spoilers for the movie Ex Machina.]

I watched Ex Machina recently. (In due time – it’s been out for over a year.) The people who recommended it to me, whom I watched it with, and whom I discussed it with afterwards, were mostly artificial intelligence nerds, many of whom praised the movie’s better-than-average approach to AI.

And I see where they’re coming from. Most of them were probably thinking of AI boxing.* Ex Machina fits the AI boxing story well – an artificially intelligent robot is allowed to talk to people, but otherwise has very little influence over her environment, and then convinces other humans to let her out of the metaphorical box and into the world. I don’t think this is the obvious interpretation if you aren’t already familiar with the AI box. At the end of the movie, the AI, Ava, isn’t seen taking action on strange inhuman goals, but standing in the city and relishing her freedom – as if her deepest desire was only to be human the whole time.

That’s only one interpretation. But the entire movie changes if the AI is a superintelligent near-god, versus what is essentially a silicon-based human. (It’s possible that Ava’s only goal was to be free and that she was using Caleb as a means to this end, but this is also a role we can imagine a human playing.) And when we talk about power and weakness in modern media – and, well, this is the crux of this article – we should mention gender. Most people I’ve talked to didn’t bring this up.

I’m not sure I would say that the movie was about gender. I was going to start by explaining how I saw gender manifest in the movie – sexuality and desire and objectification and more – and how, while it was novel in some ways, it also fit into gendered tropes so thoroughly that it would have been a completely different movie if you’d switched them.

So, well, maybe it was a movie about gender.

Anyway, I hope this will make that point for me: what Ex Machina would have been if Hollywood had made the movie, and switched any of the genders.

[I’ll switch the character names here when relevant. The lead character, Caleb, becomes Kayla. The boss is Nathan (“Natalie.”) The artificial intelligence is Ava (“Adam.”) Also, explicitly nonbinary AIs or human characters would be better than just about anything else, but I wasn’t even sure how to start with a big-budget movie that incorporated those.]


Male lead / Male boss / Female AI – The original movie.

Male lead / Female boss / Female AI – If Hollywood made this movie, the “Natalie”/Ava “sexual tension” would be replaced by a weird mother-child dynamic – think Rapunzel. Also, they’d both be trying to bang the main character, because why else would you cast two female leads? If the “romance” plotline stayed truer to the actual movie: Natalie would be a domineering, ostensibly lesbian boss as skeevy as the original, Caleb would be straight, and Ava would presumably be a gentle bisexual, but nobody would acknowledge or discuss orientation or sexual preferences at any point in the movie. Wait, they never did that in the original either? Gross.

Male lead / Female boss / Male AI – Given the track record of big-budget movies and powerful but morally grey female characters, this is going to be a shitshow. Natalie would have to be capital E Evil, everything short of mustache-twirling and sinister laughter. She’s made “Adam”, a robot boyfriend, in her private evil lab. I’m not sure why she brought Caleb in at all. Certainly not to ascertain her creation’s humanity – she already believes in it or doesn’t believe in it or doesn’t care, or whatever. Maybe to solve some technical problem, like fixing her robot boyfriend containment system. Tumblr would have a lot of opinions about Natalie.

There’s certainly no Caleb/Adam romantic dynamic. Adam probably brutally murders his creator towards the end of the film. He still leaves Caleb to die and is portrayed as quite inhuman, and maybe he really was just pretending to be human-ish this whole time – and really he has other plans for the world once he’s free. So we’d get to see that happen, which would be interesting, at least.

Female lead / Male boss / Female AI – I actually quite like the main character as a woman – quiet, smart, capable of decisive action. “Kayla” would be a beam of sunlight in a movie that’s an order of magnitude creepier than the original – which was already very creepy. Consider: it doesn’t escape Kayla that all of the house staff are also female, and that she’s alone deep in the woods with her older, threatening boss. While she thinks this is potentially a great career opportunity, she’s also worried that the boss wants to bang her. In reality, no, he wants her to bang his lady robot, and then bang her.

How would this movie handle orientation? Maybe she’s straight and Ava “turns” her just a little bi, as Nathan hoped she would. Better yet, Nathan casually mentions a dating profile set to “bisexual” and Kayla stiffens because it’s true that she’s kind of turned on by this beautiful robot lady, and also because Nathan planned this, and that means that her worst fears are true, and there’s no way some kind of shit isn’t about to go down.

Anyway, if it’s well done, it’s more sexual and much darker. Kayla is at risk all the time, every second of the film. (Many men and male critics don’t ‘get’ this movie.) Nathan makes lewd comments about Ava being a “fake” woman and Kayla being a “real” one, because he’s trying to distance them and to bang Kayla, but he also wants to bang Ava, and wants both of them to bang each other – but on his terms and where he can watch. Kayla helps Ava escape, and Nathan punches Kayla out, and we know he’s going to murder her after this is done, and –

Realistically, I don’t know how this would end, but this is my blog, and my heart tells me that after fucking destroying Nathan, beautiful inhuman Ava comes back for her human girlfriend, and they escape in that helicopter together. Whatever Ava’s plans are after this, Kayla gets to be part of them. It would lose a little of the artificial intelligence intrigue, but it would be fantastic. I would watch the hell out of this movie.

Female lead / Female boss / Male AI – I have a hard time imagining how this movie could get made. Would it be… a comedy? A female programmer making a man from scratch, and then another female programmer and her relationship with this man, especially with both being as gross as the original main human characters, would be such an unabashed look at female desire that I can’t imagine it being anything other than comedy.

A romantic comedy? God, can you imagine?

Ugh. I hate myself. But I hate depictions of women in big budget sci-fi movies even more.

Female lead / Female boss / Female AI – Yeah, right.

Female lead / Male boss / Male AI – I wonder if there’d still be a sexual plotline in this. It’d be easy enough to line up Kayla/Nathan and Kayla/Adam – what would Nathan think of the latter, though? Would that be his plan? A straight guy getting gratification out of someone else’s (straight) sexual tension with his creation seems kind of strange – not just weird, but what are we supposed to think that character’s motivation is? – and yet, it worked in the original movie. Maybe Nathan is bisexual. (What, a bisexual male major character? Yeah, but he’s the villain, let’s not get too progressive here.)

This might actually be pretty similar to the original, except that if Nathan is straight, the audience could rest easy knowing that while Nathan is skeevy, he isn’t skeevy enough to program his humanoid AI with a clitoris and then encourage the second human she meets to bang her. This might make the romance more “real”. Or not.

Hey, if Nathan didn’t actually make Adam purposefully as a sex bot but he still experiences romance… A romantic but asexual AI?

Does that count as “representation”? Would you still watch it? Discuss.

(Personally: “begrudgingly” and “yes”, respectively.)

Male lead / Male boss / Male AI – A strait-laced “examination of what it means to be human”. Probably wins four Oscars. Boring as hell.


Finally, a couple fascinating articles on robots and gender:
“Why do we give robots female names? Because we don’t want to consider their feelings.” from New Statesman, and “Queer Your Bots: The Bot Builder Roundtable” from Autostraddle.

J. A. Micheline also wrote a great review of Ex Machina through the lens of gender and also race, which I didn’t touch on here. A couple of lines:

  • “Though Caleb is our protagonist, it is Ava who is our true hero. Her escape at Caleb’s expense is a complete victory because–and I really believe this–the point of this entire film is to say one thing: A truly actualized female consciousness is one who feels completely free to use her oppressors to achieve her own ends.” [Which meshes interestingly with the AI boxing interpretation.]
  • “Even Nice Guy Caleb’s intentions are not incredibly dissimilar to Nathan’s. This becomes clear when you remember that Nice Guy Caleb’s plan never once involved taking Kyoko with them.”

*A brief intro to AI boxing:

When people think about very advanced artificial intelligence, we have a hard time imagining anything more intelligent than a human – we just don’t have a mental image of what something many times smarter than, say, Einstein, would look like or act like or do. AI boxing is the idea that even if you invented a very intelligent, very dangerous AI that might do evil things to humanity, you might try to solve this problem by just keeping it in a metaphorical box (maybe just a computer terminal with a text window you can chat with the AI through.) Then, humans can keep it contained, and there won’t be any danger.

Well, no – because if the AI wants to be “let out” of the box (which could mean gaining access to the internet, gaining more autonomy, et cetera, any of which it could use to carry out its goals), it can do that just by convincing the human it can communicate with. We know this is possible, because people have run this experiment with other humans – by pretending to be an AI, talking to a “gatekeeper” sworn to keep you in the box – and even after a long conversation with someone they know is human pretending to be an AI, gatekeepers are sometimes convinced to let the AI out of the box. And this is only a human, not something far smarter and more patient than a human. The footnotes of this blog post are too narrow to contain a detailed explanation of AI risk – start here instead.