
"haiku" here interpreted to mean "an english language poem with a 5-7-5 syllable pattern in its lines" which needless to say is a wild yet somehow conventional interpretation of the form


found haiku in Frankenstein. for a long time I have had a blanket ban on haiku generators in my classes, because what more is there to say about computer-generated haiku that wasn't already said 53 years ago (rwet.decontextualize.com/pdfs/)... but I had never actually programmed the "find haiku in an existing text" thing before. I did have fun and learned a bit making it, whoops
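the core of a "find haiku in an existing text" program is just counting syllables and looking for runs of words that split into 5-7-5. here's a rough sketch of the idea, using a naive vowel-group syllable heuristic; a real version would want a pronunciation dictionary (e.g. CMUdict) instead, and these function names are mine, not from the actual code:

```python
import re

def count_syllables(word):
    # Naive heuristic: count groups of consecutive vowels,
    # treating a trailing silent "e" as non-syllabic.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith("le") and n > 1:
        n -= 1
    return max(n, 1)

def find_haiku(text):
    # Slide through the words, greedily collecting consecutive
    # lines of exactly 5, 7, and 5 syllables.
    words = re.findall(r"[a-zA-Z']+", text)
    haiku = []
    for start in range(len(words)):
        lines, i = [], start
        for target in (5, 7, 5):
            line, total = [], 0
            while i < len(words) and total < target:
                total += count_syllables(words[i])
                line.append(words[i])
                i += 1
            if total != target:
                break
            lines.append(" ".join(line))
        if len(lines) == 3:
            haiku.append(lines)
    return haiku
```

the heuristic miscounts plenty of words, which is exactly why a phonetic lookup is the better move.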

(and yes, I should probably train this on something other than my laptop, but then I'd have to make the code pretty so I can copy it over, and that takes more effort than just waiting. and it'll go faster once the results of finding the phonetic states are cached at the end of the first epoch)
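the caching mentioned here can be as simple as memoizing the per-word phonetic lookup, so each distinct word is only computed once across epochs. a sketch, where `phonemes_for` is a hypothetical stand-in for whatever feature extraction the model actually uses:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def phonemes_for(word):
    # Stand-in for an expensive phonetic lookup (e.g. a CMUdict
    # query plus feature extraction); the first call per distinct
    # word computes, every later call hits the cache.
    return tuple(word.lower())  # placeholder "phonetic" representation
```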


whoops, I left the debug thing in that printed out any words that weren't within the expected length limits


the way I've been reviewing the output of this model is by looking at the probability scores juxtaposed with the words, one by one, and checking for the highest scores (higher score = greater probability that a line break will directly follow this word). anyway, now I'm having a hard time not reading "Stopping By Woods on a Snowy Evening" in the Beastie Boys style, with everyone shouting out the end rhymes


hmm, weirdly, the more I push the accuracy on the training set, the less it produces the result I want on arbitrary prose. (because there are stray prose snippets throughout the corpus, I think it might actually be learning the difference between prose and verse, whoops!) gonna try training again with *only* phonetic information about each word; maybe that will help


(there's a little bit of art in this—here I'm outputting a line break if the model's prediction was 0.25 or above. but I'm happy with the results so far!)
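applying that threshold is just a per-word pass over the model's scores. a minimal sketch, with hypothetical names and the 0.25 cutoff mentioned above:

```python
def enjamb(words, scores, threshold=0.25):
    # Emit a line break after any word whose predicted
    # break probability meets the threshold.
    lines, current = [], []
    for word, score in zip(words, scores):
        current.append(word)
        if score >= threshold:
            lines.append(" ".join(current))
            current = []
    if current:
        lines.append(" ".join(current))
    return "\n".join(lines)
```

the "art" is entirely in where you set `threshold`: lower values break more often and produce shorter, choppier lines.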


training a quick neural network to predict where to add poetic line breaks in text, based on a large corpus of public domain poetry and taking into account phonetics and semantics. the goal is to be able to enjamb prose passages in a somewhat principled way—after just a handful of epochs, here's what it does to a passage on hyacinths from Wikipedia:
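the setup described here is essentially per-word binary classification: does a line break follow this word or not? a minimal numpy sketch of that framing, with random toy features standing in for the phonetic and semantic ones, and plain logistic regression standing in for the actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each "word" is an 8-dim feature vector; the label is 1
# if a line break follows it (here generated from a hidden rule).
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)

# Logistic regression trained by gradient descent on log loss.
w = np.zeros(8)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))     # predicted break probability per word
    w -= 0.1 * X.T @ (p - y) / len(y)  # average-gradient step

accuracy = ((1 / (1 + np.exp(-(X @ w))) > 0.5) == y).mean()
```

the per-word probabilities `p` are exactly what gets thresholded downstream to decide where the line breaks go.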

interesting corpus: database of 60k+ poems written by kids (K-12), scraped with permission, "freely available for research with the condition that the research be used for the benefit of children" github.com/whipson/PoKi-Poems-

this has, once again, been "allison live-toots reading her e-mail"


I was invited earlier this year to contribute to a weird blockchain thing that Casey Reas and collaborators run for new media artists to trade work with each other—here's the thing I made: a2p.bitmark.com/v2/artworks/05

python programming 

I did this today and it feels GOOD


found this weird political compass in the scikit-learn documentation

possibly the first cat meme (Harry Pointer, 1870-ish)
