hi fediverse, I just posted transcripts of a few talks and lectures I've given over the past few years, mostly concerning the connections between machine learning, language and poetry: posts.decontextualize.com

(notes and summaries in individual posts below)

"Desire (under)lines: Notes toward a queer phenomenology of spell check" asserts that spell check is a "straightening device" (following Sara Ahmed's use of that term) that attempts to curb spelling's material, expressive, and sensual qualities. Includes a fun Proto-Indo-European etymology aside posts.decontextualize.com/quee

(originally prepared for the EACL 2021 Queer in AI Social)

"Language models can only write poetry" is an attempt to categorize the outputs of language models (from Tzara's newspaper cutups to Markov chains to GPT-3) using speech act theory. Touches on the immersive fallacy, Jhave's ReRites, Janelle Shane's AI Weirdness, Kristeva, etc. posts.decontextualize.com/lang

(excerpted from my Vector Institute / BMOLab "distinguished lecture")

@aparrish these are great! as an aside, following up on one of the references in this, do you know when people started referring to bigram (etc) models as "markov chains"? hayes' scientific american article doesn't use the term (he says "eddington monkey", which I much prefer)

@mewo2 that is a really good question and I don't know! Cramer in _Words Made Flesh_ dates it back to Theo Lutz?

@aparrish @mewo2 This 1961 paper is the earliest relevant reference I can find. This biography of the author, Hockett, claims that he was the person who pioneered the application of Markov chains to language and grammar in 1955.


@aparrish @mewo2 mid 1950s was the height of "let's apply Markov chains to everything" as a kind of post-information-theory, pre-cybernetics fad, so it makes sense that that's the time frame for it

@darius @aparrish does that use it for generating text though? markov himself did text analysis with markov chains back in the 1910s
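[The bigram "Markov chain" under discussion can be sketched in a few lines of Python. This is a toy illustration of the general technique, not any of the historical systems mentioned in the thread, and the function names are my own: record which words follow which in a source text, then generate by repeatedly sampling a successor of the current word.]

```python
import random
from collections import defaultdict

def build_bigram_model(words):
    """Map each word to the list of words observed to follow it."""
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=8, seed=None):
    """Random-walk the chain, sampling one successor at each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:  # dead end: this word never appears mid-text
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran".split()
model = build_bigram_model(corpus)
print(generate(model, "the", seed=1))
```

[Analysis of the kind Markov himself did (counting transitions in Eugene Onegin) corresponds to `build_bigram_model` alone; generation is the later creative move of running that table in reverse, as in `generate`.]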

@mewo2 @aparrish I guess my thinking is that a generative grammar is something that is posited to be useful for generating text, even if it's not used for that in practice

@darius @aparrish hmm yeah I guess. what I’m interested in is the parallel evolution where the people playing around with this stuff creatively seem to have been initially unaware of any fancy theory, while the theory folks were uninterested in applying it creatively

@mewo2 @aparrish here's an example of generative language art from "Markov processes" in Bit International 2 (1968) in an article by Hiroshi Kawano

