hi fediverse, here's an essay I wrote in which I try to explain why automatic writing (e.g. Breton, the Fox Sisters) and computer-generated writing often resemble one another in form and use, despite the (apparent) differences in how they're authored (with particular reference to Jenna Sutela's work) https://www.serpentinegalleries.org/art-and-ideas/the-umbra-of-an-imago-writing-under-control-of-machine-learning/
(The Ephemerides, btw, is a poem generator I made a while ago; I used it to run a twitter/tumblr bot https://the-ephemerides.tumblr.com/)
(it stopped working a while back because the OPUS API changed and I never bothered to fix my wrapper code. I'm making some zines with it for a show in a few weeks and decided to take this opportunity to revisit some of the inner workings)
(using AllenNLP to do constituency parsing instead of Pattern, and using my stichography model to add line breaks instead of the weird ad-hoc system of rules I had originally programmed for this purpose. right now constituents are just being swapped in at random, as in the original code, but I'm thinking about weighting the choices by semantic similarity to what is being replaced, and/or by phonetic similarity to previous parts of the poem)
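(the similarity weighting could look something like this — a sketch, not actual Ephemerides code, with toy dict-vectors standing in for real embeddings and all the names mine:)

```python
import math
import random

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def pick_replacement(original_vec, candidates, temperature=1.0):
    """Sample a replacement constituent from (text, vector) pairs,
    weighted by semantic similarity to the constituent being replaced.
    Lower temperature = more greedy, higher = closer to uniform."""
    weights = [
        math.exp(cosine(original_vec, vec) / temperature)
        for _, vec in candidates
    ]
    text, _ = random.choices(candidates, weights=weights, k=1)[0]
    return text
```

(the same shape would work for the phonetic weighting, just swapping the similarity function)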
oh wait, apparently with constituency parsing, you just sorta... take your best guess about what the head is. e.g. Pattern: https://github.com/clips/pattern/blob/53245196139c6ef26dc9c34873dda8a16f236d23/pattern/text/tree.py#L394. even the latest edition of Jurafsky's _Speech and Language Processing_ just says to use a rule-driven approach found in a paper from 1999 https://www.mitpressjournals.org/doi/abs/10.1162/089120103322753356
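(those head-percolation rules boil down to something like this — a minimal sketch in that spirit, with an illustrative subset of the rule table, not the full thing:)

```python
# Collins-style head finding: for each phrase label, scan the children
# in a given direction for the first category on the priority list.
HEAD_RULES = {
    # hypothetical subset of the full rule table
    "NP": ("right-to-left", ["NN", "NNS", "NNP", "NNPS", "NP", "PRP"]),
    "VP": ("left-to-right", ["VBD", "VBZ", "VBP", "VB", "VBN", "VBG", "VP"]),
    "PP": ("left-to-right", ["IN", "TO", "PP"]),
}

def find_head(label, children):
    """Return the index of the head child, given a phrase label and a
    list of child labels. Falls back to the first child in scan order
    if no rule fires (i.e. a best guess, as advertised)."""
    direction, priorities = HEAD_RULES.get(label, ("right-to-left", []))
    order = list(range(len(children)))
    if direction == "right-to-left":
        order.reverse()
    for cat in priorities:
        for i in order:
            if children[i] == cat:
                return i
    return order[0]
```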
a fun thing I discovered about this model is that it doesn't label the constituent's head for you! that makes it sort of useless for what I want to use it for (swapping out constituents of a sentence at random with other constituents of the same type—you need to know the number of the head noun of an NP when you do this to retain subject/verb agreement). though I'm not sure if the model doesn't do this at all, or if it's just not exposed in the information returned by the wrapper code
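(the agreement check I'd need could look something like this — a sketch using Penn Treebank POS tags, with helper names that are mine, not from any library:)

```python
# Penn Treebank noun tags, grouped by grammatical number.
SINGULAR_TAGS = {"NN", "NNP"}
PLURAL_TAGS = {"NNS", "NNPS"}

def noun_number(tag):
    """Map a POS tag to 'sg', 'pl', or None for non-nouns."""
    if tag in SINGULAR_TAGS:
        return "sg"
    if tag in PLURAL_TAGS:
        return "pl"
    return None

def agreement_safe(head_tag_a, head_tag_b):
    """True if swapping NP b in for NP a can't break subject/verb
    agreement: both head nouns have the same grammatical number."""
    num_a = noun_number(head_tag_a)
    num_b = noun_number(head_tag_b)
    return num_a is not None and num_a == num_b

def compatible_swaps(original_head_tag, candidates):
    """Filter (text, head_tag) candidates down to the ones whose head
    noun matches the original NP's number."""
    return [text for text, tag in candidates
            if agreement_safe(original_head_tag, tag)]
```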
covid, lockdown, work
I actually kinda enjoyed doing an "intensive" workshop format over zoom—this was daily, with 90min sessions in the morning and individual student meetings all afternoon—maybe just because it was nice to have an actual immovable daily schedule for the first time since march. but I definitely would have preferred doing the workshop *at* Anderson Ranch, like I did last year. It's way up a crisp and beautiful mountain valley in Colorado and has very good catering
lots of GPT-2 fine-tuning, markov chains, tracery and interesting combinations of all of the above!
last week I led a five-day remote workshop on creative writing w/computation for Anderson Ranch Arts Center. on the last day, we put together a quick zine with pieces we made during the week: http://static.decontextualize.com/zine-creative-writing-comp-learning-aug-20.pdf
I just posted on my blog: I'd like to pay someone to port my personal blog to a new platform/blogging tool, and create one for Changeset Consulting as well. So now I need to decide on the platform and the vendor. I listed my requirements and nice-to-haves in the post, and mentioned what I'd like advice/recommendations about: https://www.harihareswara.net/sumana/2020/08/11/1
There is a big change coming to pip in October -- a watershed moment, a minor revolution. It'll be a great foundation for making it easier to deal with #Python packaging.
This is a thread where I'll share some of the stuff @ThePyPA can build on that foundation.
(I'm interested in constituency parsing for poetic purposes bc I think it maps more neatly than dependency parsing to people's internal mental models of how syntax works)
oh hey a deep learning constituency parser! I thought everyone had just moved to dependency parsing forever https://demo.allennlp.org/constituency-parsing/
initial observations: (a) this isn't "fun" but it's sort of meditative, like knitting maybe or a really easy jigsaw puzzle; (b) i put the n-grams on slips so i could alphabetize them more easily as i progressed (and therefore look them up more easily), but the technique might not scale to the amount of desk space i have; (c) tiny dopamine hit when you find an n-gram that has been used before; (d) not sure yet if this method of "reading" is telling me anything about the text, oh well
also I can't believe J.R. Smith is playing professional basketball again
I can't get over the banal cyberpunk dystopia vibe of the NBA reboot. in lieu of a crowd, there are huge video walls showing, like, webcam feeds of fans with their backgrounds keyed out to make it look like they're sitting in arena seats. all the games are being played on a handful of courts in florida, but the names of the "home" team's arena sponsors are being composited onto the court for the broadcast. an ad along the sidelines reads "NBA App / Tap to cheer"
Poet, programmer, game designer, computational creativity researcher. Assistant Arts Professor at NYU ITP. she/her