hi fediverse, I just posted transcripts of a few talks and lectures I've given over the past few years, mostly concerning the connections between machine learning, language and poetry: posts.decontextualize.com

(notes and summaries in individual posts below)

"Desire (under)lines: Notes toward a queer phenomenology of spell check" asserts that spell check is a "straightening device" (following Sara Ahmed's use of that term) that attempts to curb spelling's material, expressive, and sensual qualities. Includes a fun proto-Indo-European etymology aside posts.decontextualize.com/quee

(originally prepared for the EACL 2021 Queer in AI Social)


"Language models can only write poetry" is an attempt to categorize the outputs of language models (from Tzara's newspaper cutups to Markov chains to GPT-3) using speech act theory. Touches on the immersive fallacy, Jhave's ReRites, Janelle Shane's AI Weirdness, Kristeva, etc. posts.decontextualize.com/lang

(excerpted from my Vector Institute / BMOLab "distinguished lecture")


"Rewordable versus the alphabet fetish" outlines how conventional spelling games (like Scrabble) are based on cryptography (via Poe's The Gold-Bug) and mystical alphabetical metaphysics—and how we attempted to circumvent those influences in Rewordable, a board game I co-designed a few years ago. Includes a very adorable illustration I found of neoplatonist medieval philosopher John Scotus Erigena posts.decontextualize.com/rewo

(originally a talk at NYU Game Center's Practice conference)

it was @brainwane's "if you give a speech you care about, post a transcript" post that finally motivated me to clean these up and put them online harihareswara.net/sumana/2021/

I've got a workflow going now where I can create a presentation and a nicely formatted transcript (i.e., my speaker notes) from one file, and post it to the web straight from my notes app (Zettlr), so hopefully it's easier for me to do all this in the future

@wim_v12e yeah I copied them wrong apparently, guess I should have checked the links. both urls should work now!

@aparrish these are great! as an aside, following up on one of the references in this, do you know when people started referring to bigram (etc) models as "markov chains"? hayes' scientific american article doesn't use the term (he says "eddington monkey", which I much prefer)
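(for anyone following along who hasn't built one: a bigram model read as a first-order Markov chain is only a few lines of code. this is a minimal illustrative sketch, not a reconstruction of any of the historical programs discussed in the thread — the function names are mine)

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Derive word-to-word transition options from a text: each word
    maps to the list of words observed to follow it (duplicates kept,
    so sampling uniformly from the list reflects observed frequency)."""
    words = text.split()
    model = defaultdict(list)
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, n=10, seed=None):
    """Walk the chain: from the current word, pick a random observed
    successor, until n words are emitted or we hit a dead end."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < n:
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)
```

(the mid-century distinction the thread keeps circling comes down to where `model` comes from: derived from a corpus, as here, versus transition probabilities written out by hand)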

@mewo2 that is a really good question and I don't know! Cramer in _Words Made Flesh_ dates it back to Theo Lutz?

@aparrish @mewo2 This 1961 paper is the earliest relevant reference I can find. This biography of the author, Hockett, claims that he was the person who pioneered application of Markov chains to language and grammar in 1955:


@aparrish @mewo2 mid 1950s was the height of "let's apply Markov chains to everything" as a kind of post-information-theory, pre-cybernetics fad, so it makes sense that that's the time frame for it

@darius @aparrish does that use it for generating text though? markov himself did text analysis with markov chains back in the 1910s

@mewo2 @aparrish I guess my thinking is that a generative grammar is something that is posited to be useful for generating text, even if it's not used for that in practice

@darius @aparrish hmm yeah I guess. what I’m interested in is the parallel evolution where the people playing around with this stuff creatively seem to have been initially unaware of any fancy theory, while the theory folks were uninterested in applying it creatively

@mewo2 @aparrish here's an example of generative language art from "Markov processes" in Bit International 2 (1968) in an article by Hiroshi Kawano


@darius @aparrish okay, what I've got so far:

earliest generation of text by markov chain: claude shannon, 1948: people.math.harvard.edu/~ctm/h (calls it a "markoff process")

earliest suggestion of doing this creatively: francois le lionnais, 1963

then by 1968, kawano can say "many computer-aided works of art have been experimentally produced with this method"

the lutz reference is interesting - cramer claims he used markov chains in 1959, but afaik he used a madlib/grammar? stuttgarter-schule.de/lutz_sch

@darius @aparrish then the whole business seems to have been independently reinvented by bennett in 1977: jstor.org/stable/27848169 (no reference to markov or shannon, who you'd sort of assume a physicist who previously worked at bell labs would have heard of)

hayes picks it up from bennett, then kenner from hayes, and suddenly you have a second independent tradition

@darius @aparrish kenner and o'rourke (1984) do make reference to shannon, but none of the work from the 60s. kenner was a literary critic who wrote about modernist writers in france, so it's likely he was aware of the oulipo manifesto, but he probably didn't know what a markov chain was, so may not have understood what le lionnais was saying

@mewo2 @darius this is great, thank you for doing the digging!

@aparrish @darius I am now consumed by two questions:

who was the first person to generate text using markov chains on a computer?


who was the first person to use markov chains to make creative work?

I really feel like these should be answerable!

@aparrish @darius I’m also no longer convinced that le lionnais was talking about markov generation - I suspect he was thinking more about the kind of textual analysis queneau was doing than about generating new texts

more markov chain history 

@aparrish @darius hmm

kawano had made visual work using markov chains from 63/64: direct.mit.edu/leon/article/52 and hiller composed music in 57: medienkunstnetz.de/works/illia but both were using manually constructed transition probabilities, not derived from a "text"

bense wrote about shannon's ideas in 1960, but it seems like only for analysis, not generation: monoskop.org/log/?p=16249

I'm now actually unsure if anyone wrote a markov chain text generation computer program pre-1977


@mewo2 @aparrish I wonder if I could just... DM Frieder Nake on Twitter and see if he like. You know. Remembers.


@mewo2 @aparrish you might be right... this bio of Otto Beckmann lists basically everything BUT text as things he generated with Markov processes pre-1968



@darius @aparrish right! there are loads of people who were doing things just adjacent to it, but never quite getting there. it's wild to me that the first person to actually do it might have been a textbook author in the late 70s, seemingly unaware of any previous work


@mewo2 @aparrish Not sure if you found this survey paper, it's mostly about music, but it's nice and I like the broad conclusions and it amuses me (especially given Allison's teaching experience) that there is a big focus on "hey Markov can be approximately as good as AI if you do it right" in 1988



@mewo2 @darius this must be the mid-sixties, Oulipian Le Lionnais specifically suggesting creating centos "by a few considerations taken from Markov’s chain theory" ieeff.org/lipofull.pdf

@aparrish @darius oh this is good! do you know if they actually followed up on that?

@mewo2 @darius I don't know. Cramer's Markov chain chronology goes cold at that Oulipo mention and picks back up at the BYTE article. I suspect there's some other source/thinker/work that preceded both

