hi fediverse, I just posted transcripts of a few talks and lectures I've given over the past few years, mostly concerning the connections between machine learning, language and poetry: posts.decontextualize.com

(notes and summaries in individual posts below)

"Desire (under)lines: Notes toward a queer phenomenology of spell check" asserts that spell check is a "straightening device" (following Sara Ahmed's use of that term) that attempts to curb spelling's material, expressive, and sensual qualities. Includes a fun proto-Indo-European etymology aside posts.decontextualize.com/quee

(originally prepared for the EACL 2021 Queer in AI Social)

"Language models can only write poetry" is an attempt to categorize the outputs of language models (from Tzara's newspaper cutups to Markov chains to GPT-3) using speech act theory. Touches on the immersive fallacy, Jhave's ReRites, Janelle Shane's AI Weirdness, Kristeva, etc. posts.decontextualize.com/lang

(excerpted from my Vector Institute / BMOLab "distinguished lecture")

@aparrish these are great! as an aside, following up on one of the references in this, do you know when people started referring to bigram (etc) models as "markov chains"? hayes' scientific american article doesn't use the term (he says "eddington monkey", which I much prefer)

@mewo2 that is a really good question and I don't know! Cramer in _Words Made Flesh_ dates it back to Theo Lutz?

@aparrish @mewo2 This 1961 paper is the earliest relevant reference I can find. This biography of its author, Hockett, claims that he pioneered the application of Markov chains to language and grammar in 1955:

nasonline.org/publications/bio

@aparrish @mewo2 the mid 1950s were the height of "let's apply Markov chains to everything" as a kind of post-information-theory, pre-cybernetics fad, so it makes sense that that's the time frame for it

@darius @aparrish does that use it for generating text though? markov himself did text analysis with markov chains back in the 1910s

@mewo2 @aparrish I guess my thinking is that a generative grammar is something that is posited to be useful for generating text, even if it's not used for that in practice

@darius @aparrish hmm yeah I guess. what I’m interested in is the parallel evolution where the people playing around with this stuff creatively seem to have been initially unaware of any fancy theory, while the theory folks were uninterested in applying it creatively

@mewo2 @aparrish here's an example of generative language art from "Markov processes," in an article by Hiroshi Kawano in Bit International 2 (1968)

monoskop.org/images/2/26/Bit_I

@darius @aparrish okay, what I've got so far:

earliest generation of text by markov chain: claude shannon, 1948: people.math.harvard.edu/~ctm/h (calls it a "markoff process"; the basic idea is sketched in code just after this post)

earliest suggestion of doing this creatively: francois le lionnais, 1963

then by 1968, kawano can say "many computer-aided works of art have been experimentally produced with this method"

the lutz reference is interesting - cramer claims he used markov chains in 1959, but afaik he used a madlib/grammar? stuttgarter-schule.de/lutz_sch
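
a minimal sketch of the shannon-style approach above, for concreteness: estimate transition probabilities from a source text, then random-walk them. illustrative modern python, my own reconstruction rather than anyone's historical code:

    import random
    from collections import defaultdict

    def build_transitions(text):
        # count which words follow which: state -> list of observed successors
        words = text.split()
        transitions = defaultdict(list)
        for a, b in zip(words, words[1:]):
            transitions[a].append(b)
        return transitions

    def generate(transitions, n=20):
        # each next word is drawn from the empirical distribution of words
        # that followed the current word in the source text
        state = random.choice(list(transitions))
        out = [state]
        for _ in range(n - 1):
            followers = transitions.get(state)
            if not followers:
                state = random.choice(list(transitions))  # dead end: restart
            else:
                state = random.choice(followers)
            out.append(state)
        return " ".join(out)

    source = "the cat sat on the mat and the dog sat on the log"
    print(generate(build_transitions(source)))

(sampling uniformly from the list of observed followers reproduces the bigram transition probabilities, since repeated followers stay in the list)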

@darius @aparrish then the whole business seems to have been independently reinvented by bennett in 1977: jstor.org/stable/27848169 (no reference to markov or shannon, who you'd sort of assume a physicist who previously worked at bell labs would have heard of)

hayes picks it up from bennett, then kenner from hayes, and suddenly you have a second independent tradition

@darius @aparrish kenner and o'rourke (1984) do make reference to shannon, but none of the work from the 60s. kenner was a literary critic who wrote about modernist writers in france, so it's likely he was aware of the oulipo manifesto, but he probably didn't know what a markov chain was, so may not have understood what le lionnais was saying

@mewo2 @darius this is great, thank you for doing the digging!

@aparrish @darius I am now consumed by two questions:

who was the first person to generate text using markov chains on a computer?

and

who was the first person to use markov chains to make creative work?

I really feel like these should be answerable!

@aparrish @darius hmm

kawano had made visual work using markov chains from 63/64: direct.mit.edu/leon/article/52 and hiller composed music in 57: medienkunstnetz.de/works/illia, but both were using manually constructed transition probabilities, not derived from a "text" (that hand-authored style is sketched in code below)

bense wrote about shannon's ideas in 1960, but it seems like only for analysis, not generation: monoskop.org/log/?p=16249

I'm now actually unsure if anyone wrote a markov chain text generation computer program pre-1977
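
by contrast with the shannon sketch earlier, the hiller/kawano style described above amounts to writing the transition table by hand instead of estimating it from a corpus. a hypothetical sketch of that, with invented states and weights rather than anyone's actual numbers:

    import random

    # hand-authored transition probabilities (invented for illustration):
    # each state maps to its possible successors and their weights
    transitions = {
        "C": {"E": 0.5, "G": 0.3, "C": 0.2},
        "E": {"G": 0.6, "C": 0.4},
        "G": {"C": 0.7, "E": 0.3},
    }

    def walk(start="C", n=12):
        # random-walk the hand-specified chain
        state = start
        out = [state]
        for _ in range(n - 1):
            successors = transitions[state]
            state = random.choices(list(successors),
                                   weights=list(successors.values()))[0]
            out.append(state)
        return out

    print(" ".join(walk()))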

@mewo2 @aparrish you might be right... this bio of Otto Beckmann lists basically everything BUT text as things he generated with Markov processes pre-1968

zkm.de/en/otto-beckmann

@darius @aparrish right! there are loads of people who were doing things just adjacent to it, but never quite getting there. it's wild to me that the first person to actually do it might have been a textbook author in the late 70s, seemingly unaware of any previous work

@mewo2 @aparrish Not sure if you found this survey paper; it's mostly about music, but it's nice and I like the broad conclusions. it amuses me (especially given Allison's teaching experience) that there's a big focus on "hey, Markov can be approximately as good as AI if you do it right" in 1988

jstor.org/stable/1575226

@darius @aparrish yes, I saw that, although re-reading it now I notice that it cites hiller and baker (1963) as deriving their transition probabilities from a traditionally composed piece of music, which definitely fills one of the gaps - still more interested in who was the first to do it for text though

@darius @aparrish okay holy shit, in 1963, hiller and baker were doing markov chain generation on strings of *phonemes* to produce "lyrics" for their markov-generated music: jstor.org/stable/832238

recording here: youtube.com/watch?v=85fvyWJFq2

@mewo2 @darius ok. well. this is obviously extremely my shit,

@aparrish @darius I looked at that page and immediately thought "allison is going to lose her shit at this"
