highly recommended generative technique: whenever you're selecting an item at random where the items are weighted (e.g. by their frequency in a corpus), sample with a softmax over the log of the weights, using temperature as a parameter. (at temp=1.0, it's the same as picking by the weights directly; below 1.0 it favors the most heavily weighted items; as temperature rises above 1.0, the sampling approaches a uniform distribution)
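a minimal sketch of the trick in Python (the function name and the toy word counts are mine, just for illustration; weights are assumed positive):

```python
import math
import random

def sample_weighted(items, weights, temperature=1.0):
    """Softmax sampling over log-weights. temperature=1.0 reproduces
    plain weighted sampling, <1.0 sharpens toward the heaviest items,
    >1.0 flattens toward a uniform distribution."""
    logits = [math.log(w) / temperature for w in weights]
    biggest = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - biggest) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(items, weights=probs, k=1)[0]

# e.g., word frequencies from a corpus (made-up numbers)
words = ["the", "hyacinth", "crocodile"]
counts = [1000, 12, 3]
print(sample_weighted(words, counts, temperature=0.5))
```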
here are examples from the thing I'm working on
helpful social media tips I found on everyword's twitter analytics page
beautiful Mirtha Dermisache exhibition catalog available for free download, for the asemic writing fans on here https://www.malba.org.ar/catalogo-mirtha-dermisache/ (click "Descargar PDF," i.e. "Download PDF," on the right side of the page) (most of it is essays and stuff in Spanish but there's a big chunk of reproductions of Dermisache's work in the middle)
obligatory "tag yourself"/Celestial Emporium of Benevolent Knowledge joke
another day, another VAE (nonsense words)
I got very helpful advice on this today: the distribution the VAE learns might not be centered at zero. after averaging together the latent vectors from a few thousand items from the data set and using *that* as the center of the sampling distribution, I get much better results when sampling!
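a rough sketch of the recentering idea in PyTorch (the tensor shapes and the toy off-center latents are made up for illustration; in practice the latents would come from running the trained encoder over the data set):

```python
import torch

def recentered_samples(latents: torch.Tensor, n: int) -> torch.Tensor:
    """Sample n latent vectors from a unit Gaussian centered on the
    empirical mean of `latents` instead of at the origin."""
    center = latents.mean(dim=0)
    return center + torch.randn(n, latents.shape[1])

# toy stand-in: pretend these were encoded from a few thousand data
# items and that the learned distribution sits off-center
latents = torch.randn(5000, 64) + 0.5
z = recentered_samples(latents, 16)  # feed these to the decoder
```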
found haiku in Frankenstein. for a long time I have had a blanket ban on haiku generators in my classes, because what more is there to say about computer-generated haiku that wasn't already said 53 years ago http://rwet.decontextualize.com/pdfs/morris.pdf but... I had never actually programmed the "find haiku in an existing text" thing before. I did have fun and learned a bit making it, whoops
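here's one way the haiku-finding could be sketched in Python, using the pronouncing library for CMU-dictionary syllable counts; the greedy 5/7/5 scan and the frankenstein.txt filename are my own assumptions, not necessarily how the original works:

```python
import re
import pronouncing

def syllables(word):
    """CMU dictionary syllable count, or None for unknown words."""
    phones = pronouncing.phones_for_word(word.lower())
    return pronouncing.syllable_count(phones[0]) if phones else None

def find_haiku(text):
    """Greedily scan for consecutive word runs with 5/7/5 syllables."""
    words = re.findall(r"[a-zA-Z']+", text)
    found = []
    for start in range(len(words)):
        lines, i = [], start
        for target in (5, 7, 5):
            count, line = 0, []
            while count < target and i < len(words):
                s = syllables(words[i])
                if s is None:  # skip windows containing unknown words
                    count = -1
                    break
                count += s
                line.append(words[i])
                i += 1
            if count != target:
                break
            lines.append(" ".join(line))
        if len(lines) == 3:
            found.append(lines)
    return found

for haiku in find_haiku(open("frankenstein.txt").read()):
    print(" / ".join(haiku))
```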
whoops, I left the debug thing in that printed out any words that weren't within the expected length limits
the way I've been reviewing the output of this model is looking at the probability scores juxtaposed with the words, one by one, and checking for the highest scores (higher score = greater probability that a line break will directly follow this word). anyway, now I'm having a hard time not reading "Stopping By Woods on a Snowy Evening" in the Beastie Boys style, with everyone shouting out the end rhymes
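a toy mock-up of that review format (the words and scores here are invented, not real model output):

```python
# hypothetical per-word break probabilities from the model
words = ["Whose", "woods", "these", "are", "I", "think", "I", "know"]
scores = [0.02, 0.91, 0.04, 0.03, 0.05, 0.08, 0.02, 0.88]

for word, p in zip(words, scores):
    marker = " <-- likely line break" if p > 0.5 else ""
    print(f"{word:<12}{p:.2f}{marker}")
```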
it *almost* gets "How Doth the Little Crocodile" right:
training a quick neural network to predict where to add poetic line breaks in text, based on a large corpus of public domain poetry and taking into account phonetics and semantics. the goal is to be able to enjamb prose passages in a somewhat principled way—after just a handful of epochs, here's what it does to a passage on hyacinths from wikipedia:
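a minimal sketch of what such a model could look like in PyTorch (the bidirectional GRU, the layer sizes, and the idea of concatenating a phonetic vector with a semantic vector per word are all my assumptions, not the actual architecture):

```python
import torch
import torch.nn as nn

PHON_DIM, SEM_DIM, HIDDEN = 32, 300, 128

class LineBreaker(nn.Module):
    """Reads per-word phonetic+semantic features and emits, for each
    word, the probability that a line break directly follows it."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(PHON_DIM + SEM_DIM, HIDDEN,
                          batch_first=True, bidirectional=True)
        self.out = nn.Linear(HIDDEN * 2, 1)

    def forward(self, feats):  # feats: (batch, words, PHON_DIM + SEM_DIM)
        hidden, _ = self.rnn(feats)
        return torch.sigmoid(self.out(hidden)).squeeze(-1)

model = LineBreaker()
fake_passage = torch.randn(1, 40, PHON_DIM + SEM_DIM)  # 40 words of made-up features
print(model(fake_passage).shape)  # per-word break probabilities: (1, 40)
```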