hi fediverse, you could read this essay I wrote, in which I try to explain why automatic writing (e.g. Breton, the Fox Sisters) and computer-generated writing often resemble one another in form and use, despite the (apparent) differences in how they're authored (with particular reference to Jenna Sutela's work) serpentinegalleries.org/art-an

(The Ephemerides btw is a poem generator I made a while ago that I used to make a twitter/tumblr bot the-ephemerides.tumblr.com/)

(it stopped working a while back because the OPUS API changed and I never bothered to fix my wrapper code. I'm making some zines with it for a show in a few weeks and decided to take this opportunity to revisit some of the inner workings)

(using AllenNLP to do constituency parsing instead of Pattern, and using my stichography model to add line breaks instead of the weird ad-hoc system of rules I had originally programmed for this purpose. right now constituents are just being swapped in at random, as in the original code, but I'm thinking about weighting the choices by semantic similarity to what is being replaced, and/or by phonetic similarity to previous parts of the poem)
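
(for concreteness, a rough sketch of what that similarity-weighted swap might look like. the phrase vectors here are made-up toy numbers standing in for real embeddings, and `weighted_swap` is a hypothetical helper, not anything from the actual generator:)

```python
import math
import random

def cosine(u, v):
    # cosine similarity between two equal-length vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv)

def weighted_swap(original, candidates, vectors, rng=random):
    # pick a replacement constituent, weighting the choice by
    # semantic similarity to the constituent being replaced
    # (small epsilon so no candidate has exactly zero weight)
    weights = [max(cosine(vectors[original], vectors[c]), 0.0) + 1e-6
               for c in candidates]
    return rng.choices(candidates, weights=weights, k=1)[0]

# toy phrase vectors standing in for real embeddings
vectors = {
    "the pale moon": [0.9, 0.1, 0.3],
    "the distant sun": [0.8, 0.2, 0.4],   # near "the pale moon"
    "a parking ticket": [0.1, 0.9, 0.0],  # far from it
}
candidates = ["the distant sun", "a parking ticket"]
picked = weighted_swap("the pale moon", candidates, vectors)
```

(the phonetically-weighted version would be the same shape, just with a phonetic distance in place of cosine similarity)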

oh wait, apparently with constituency parsing, you just sorta... take your best guess about what the head is. e.g. Pattern: github.com/clips/pattern/blob/ Even the latest edition of Jurafsky's _Speech and Language Processing_ just says to use a rule-driven approach found in a paper from 1999 mitpressjournals.org/doi/abs/1
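
(the rule-driven approach is basically head percolation: for each parent label, scan the children in a preferred direction for the first tag on a priority list. here's a toy sketch with made-up rules, not the actual tables from the paper:)

```python
# toy head-percolation rules: for each parent label, a scan
# direction and a priority list of child labels to look for
HEAD_RULES = {
    "NP": ("right", ["NN", "NNS", "NNP", "NP", "PRP"]),
    "VP": ("left", ["VB", "VBD", "VBZ", "VBP", "VP"]),
    "PP": ("left", ["IN", "TO"]),
}

def find_head(parent, children):
    # return the index of the guessed head child
    direction, priorities = HEAD_RULES.get(parent, ("left", []))
    indices = (range(len(children)) if direction == "left"
               else range(len(children) - 1, -1, -1))
    for label in priorities:
        for i in indices:
            if children[i] == label:
                return i
    # fallback: first child in the scan direction
    return 0 if direction == "left" else len(children) - 1
```

(e.g. for an NP like "the tall tree" with children DT JJ NN, the rules scan right-to-left and land on the NN)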

a fun thing I discovered about this model is that it doesn't label the constituent's head for you! which makes it sort of useless for what I want to use it for (swapping out constituents of a sentence at random with other constituents of the same type—you need to know the number of the head noun of an NP when you do this to retain subject/verb agreement). though I'm not sure if the model doesn't do this at all, or if it's just not exposed in the information returned from the wrapper code
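
(the agreement check itself is easy once you do have the head noun's tag; a hypothetical sketch, assuming Penn Treebank tags where NN/NNP are singular and NNS/NNPS plural:)

```python
def number_of_np(head_tag):
    # crude grammatical number from the head noun's POS tag
    return "plural" if head_tag in ("NNS", "NNPS") else "singular"

def agreeing_candidates(target_head_tag, candidates):
    # keep only replacement NPs whose head noun matches the
    # original in number, so subject/verb agreement survives
    # the swap; candidates are (phrase, head_tag) pairs
    want = number_of_np(target_head_tag)
    return [phrase for phrase, tag in candidates
            if number_of_np(tag) == want]
```

(which is exactly why a parse without labeled heads is a problem: there's nothing to feed `number_of_np`)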

covid, lockdown, work 

I actually kinda enjoyed doing an "intensive" workshop format over zoom—this was daily, with 90min sessions in the morning and individual student meetings all afternoon—maybe just because it was nice to have an actual immovable daily schedule for the first time since March. but I definitely would have preferred doing the workshop *at* Anderson Ranch, like I did last year. It's way up in a crisp and beautiful mountain valley in Colorado and has very good catering

lots of GPT-2 fine-tuning, markov chains, tracery and interesting combinations of all of the above!

last week I led a five-day remote workshop on creative writing w/computation for Anderson Ranch Arts Center. on the last day, we put together a quick zine with pieces we made during the week: static.decontextualize.com/zin

allison boosted

I just posted on my blog: I'd like to pay someone to port my personal blog to a new platform/blogging tool, and create one for Changeset Consulting as well. So now I need to decide on the platform and the vendor. I listed my requirements and nice-to-haves in the post, and mention what I'd like advice/recommendations about: https://www.harihareswara.net/sumana/2020/08/11/1

at times Sara Ahmed's _Queer Phenomenology_ reads like Inform 7 source code

allison boosted

There is a big change coming to pip in October -- a watershed moment, a minor revolution. It'll be a great foundation for making it easier to deal with #Python packaging.

This is a thread where I'll share some of the stuff @ThePyPA can build on that foundation.

(I'm interested in constituency parsing for poetic purposes bc I think it maps more neatly than dependency parsing to people's internal mental models of how syntax works)

oh hey a deep learning constituency parser! I thought everyone had just moved to dependency parsing forever demo.allennlp.org/constituency

initial observations: (a) this isn't "fun" but it's sort of meditative, like knitting maybe or a really easy jigsaw puzzle; (b) i put the n-grams on slips so i could alphabetize them more easily as i progressed (and therefore look them up more easily), but the technique might not scale to the amount of desk space i have; (c) tiny dopamine hit when you find an n-gram that has been used before; (d) not sure yet if this method of "reading" is telling me anything about the text, oh well

making a markov chain model by hand (this is about... eleven words out of a forty-word poem)
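
(the by-hand procedure is the same bookkeeping a few lines of code would do: record, for every n-gram, the words observed to follow it, then walk the table. a minimal sketch:)

```python
import random
from collections import defaultdict

def build_chain(words, n=1):
    # map each n-gram (tuple of words) to the list of words
    # observed to follow it, duplicates included
    chain = defaultdict(list)
    for i in range(len(words) - n):
        chain[tuple(words[i:i + n])].append(words[i + n])
    return chain

def generate(chain, start, length=10, rng=random):
    # walk the table: repeatedly pick a follower of the most
    # recent n-gram, stopping at a dead end
    out = list(start)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(start):]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return out

# tiny toy corpus
words = "the milky way swallows twitter the milky way".split()
chain = build_chain(words)
```

(the "tiny dopamine hit" from (c) is literally a key collision: an n-gram whose follower list already has entries)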

allison boosted

Back-lit computer monitors were a mistake

Every screen should be e-ink

sports (nba) 

also I can't believe J.R. Smith is playing professional basketball again

sports (nba) 

I can't get over the banal cyberpunk dystopia vibe of the NBA reboot. in lieu of a crowd, there are huge video walls showing, like, webcam feeds of fans with their backgrounds keyed out to make it look like they're sitting in arena seats. all the games are being played on a handful of courts in florida, but the names of the "home" team's arena sponsors are being composited onto the court for the broadcast. an ad along the sidelines reads "NBA App / Tap to cheer"

allison boosted

the milky way
swallows twitter
chestnuts in the field
