more fun with distilbert! this technique: (1) forward pass of the model to get the transformer hidden state (2) add random noise to the hidden state (3) predict tokens from the modified hidden state (the intensity of the noise increases with each line)
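
a minimal sketch of how those three steps might look with Hugging Face transformers; the model name, noise scales, and example sentence are placeholders, and the masked-LM head attributes (vocab_transform etc.) assume the current DistilBertForMaskedLM layout:

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, DistilBertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

inputs = tokenizer("the quick brown fox jumps over the lazy dog", return_tensors="pt")

with torch.no_grad():
    # (1) forward pass through the transformer body to get the hidden state
    hidden = model.distilbert(**inputs).last_hidden_state
    for scale in (0.0, 0.5, 1.0, 2.0):  # noise intensity increases with each line
        # (2) add random noise to the hidden state
        noisy = hidden + torch.randn_like(hidden) * scale
        # (3) predict tokens from the modified hidden state via the MLM head
        h = model.vocab_layer_norm(F.gelu(model.vocab_transform(noisy)))
        ids = model.vocab_projector(h).argmax(dim=-1)[0]
        print(scale, tokenizer.decode(ids, skip_special_tokens=True))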

generating 26 little poems about the alphabet by boosting the probability of tokens containing each letter during DistilGPT2 generation
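
one way to do that letter-boosting with transformers + distilgpt2; the boost value, prompt, and output length below are placeholders, not the settings used for these poems:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model.eval()

def letter_poem(letter, prompt="The", length=20, boost=4.0):
    # precompute which vocabulary items contain the target letter
    vocab = tokenizer.convert_ids_to_tokens(list(range(len(tokenizer))))
    bonus = torch.tensor([boost if letter in tok.lower() else 0.0 for tok in vocab])
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(length):
        with torch.no_grad():
            logits = model(ids).logits[0, -1]
        # boost the logits of every token containing the letter, then sample
        probs = torch.softmax(logits + bonus, dim=-1)
        next_id = torch.multinomial(probs, 1)
        ids = torch.cat([ids, next_id.unsqueeze(0)], dim=1)
    return tokenizer.decode(ids[0])

for letter in "abcdefghijklmnopqrstuvwxyz":
    print(letter, letter_poem(letter))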

conditional dcgan progress 

I sorta gave up on having the same model produce different fonts—it just didn't work and the samples across classes weren't similar for the same latent variable (which was the effect I was going for in the first place). HOWEVER, I am super pleased with the samples from the model I'm training on Garamond italics...

conditional dcgan progress 

this is so tantalizingly close to what I want—I'm training the GAN on images of words, conditioned on labels for different text styles (italics, all caps, title case, etc)—you can clearly see many of the different styles in this sample (trained on about 100k images). I managed to avoid mode collapse, but the GAN unfortunately fails to converge (after 200k images, the generator just makes white noise)
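
for reference, a conditional DCGAN generator along these lines might look like the sketch below in PyTorch; the image size, number of style labels, and layer widths are placeholders, not the architecture actually used here:

import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    def __init__(self, latent_dim=100, n_styles=8, feat=64):
        super().__init__()
        # learn an embedding for each style label and concatenate it with z
        self.embed = nn.Embedding(n_styles, latent_dim)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim * 2, feat * 4, 4, 1, 0, bias=False),
            nn.BatchNorm2d(feat * 4), nn.ReLU(True),
            nn.ConvTranspose2d(feat * 4, feat * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feat * 2), nn.ReLU(True),
            nn.ConvTranspose2d(feat * 2, feat, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feat), nn.ReLU(True),
            nn.ConvTranspose2d(feat, 1, 4, 2, 1, bias=False),
            nn.Tanh(),  # one-channel 32x32 word image
        )

    def forward(self, z, labels):
        # the same z with different labels should give the "same" sample
        # rendered in different text styles
        cond = torch.cat([z, self.embed(labels)], dim=1)
        return self.net(cond.unsqueeze(-1).unsqueeze(-1))

g = ConditionalGenerator()
print(g(torch.randn(4, 100), torch.tensor([0, 1, 2, 3])).shape)  # [4, 1, 32, 32]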

okay I THINK I finally found a way of doing this that comes close to meeting all of my criteria for this project (i.e., each step shows visible and meaningful change; the change is gradual, but the result "converges" after relatively few steps): compare the probability of each token in the source text with a token sampled from the distribution for the mask token at that position, then find "peaks" of improbable tokens and replace w/the sampled token at those peaks; stop when any output repeats
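
a rough sketch of that loop with transformers + DistilBERT; the peak test, the seed line, and the sampling details are my guesses at one workable reading of the procedure:

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

def step(text):
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    scores, samples = [], []
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        probs = torch.softmax(logits, dim=-1)
        sampled = torch.multinomial(probs, 1).item()
        # how much more probable is the sampled token than the one that's there?
        scores.append((probs[sampled] / probs[ids[i]]).item())
        samples.append(sampled)
    out = ids.clone()
    # replace only at local "peaks" of improbability
    for j in range(1, len(scores) - 1):
        if scores[j] > scores[j - 1] and scores[j] > scores[j + 1]:
            out[j + 1] = samples[j]
    return tokenizer.decode(out[1:-1])

text, seen = "the cat sat on the mat and sang a quiet song", set()
while text not in seen:  # stop when any output repeats
    seen.add(text)
    print(text)
    text = step(text)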

ANYWAY the goal of all this was to make poem things with the word image GANs I trained for NaNoGenMo last year—here's a result from the first working implementation! (now the question is: do I go easy on myself and generate a book with one poem per page, or do I let the poems stretch across pages, or do I try to group shorter poems on the same page, etc. etc. etc.)

hey fediverse, one of my (computer-generated) poems is on the cover of BOMB Magazine's Winter 2021 issue! there are also several new poems from the Compasses series inside bombmagazine.org/issues/154/

okay I guess today is a day where I'm just going through a bunch of nlproc papers?? I love this example output from "Controlled Affective Text Generation" (columns from left to right: text completion prompt for the language model; desired topic; desired emotion; "knob" value adjusting emotional intensity; text output) paper link: wordplay-workshop.github.io/mo

"common sense" AI, alcohol mention 

playing around with COMeT (language model trained on knowledge base tuples from ConceptNet) mosaickg.apps.allenai.org/come

more fun with DistilBERT: iteratively expanding a sentence by inserting a mask token *between* random adjacent tokens and predicting the most likely token for that position with the model
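
a small sketch of that expansion loop with transformers + DistilBERT; the seed phrase and the number of iterations are placeholders:

import random
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

ids = tokenizer("the owl calls", return_tensors="pt").input_ids[0].tolist()
for _ in range(15):
    # insert a mask token between two random adjacent (non-special) tokens
    pos = random.randint(1, len(ids) - 1)
    ids = ids[:pos] + [tokenizer.mask_token_id] + ids[pos:]
    with torch.no_grad():
        logits = model(torch.tensor([ids])).logits[0, pos]
    # fill the mask with the most likely token for that position
    ids[pos] = logits.argmax().item()
    print(tokenizer.decode(ids[1:-1]))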

playing around with perlin noise and variable fonts (choppy framerate because this is just a screencap of a browser window and animating variable font axes is apparently pretty slow? also I need to close some tabs haha)

doing the same thing but instead of starting with words selected at random from a word list, starting with letters selected at random from the alphabet (plus a few spaces thrown in)

pretty weird!

at each iteration, replace the token whose probability is the lowest compared to DistilBERT's prediction for the masked token in the same position
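
a compact sketch of one reading of that rule with transformers + DistilBERT: mask each position in turn, score the original token's probability under the model, and swap out the single least likely one (the actual scoring may instead be a ratio against the model's top prediction):

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

def replace_least_likely(text):
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    worst_pos, worst_prob, replacement = None, 1.0, None
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            probs = torch.softmax(model(masked.unsqueeze(0)).logits[0, i], dim=-1)
        # keep track of the token the model finds most out of place
        if probs[ids[i]].item() < worst_prob:
            worst_pos, worst_prob = i, probs[ids[i]].item()
            replacement = probs.argmax().item()
    ids[worst_pos] = replacement
    return tokenizer.decode(ids[1:-1])

line = "orange vapor concrete sleeping umbrella"
for _ in range(5):
    line = replace_least_likely(line)
    print(line)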

okay haha it's much better at this if I include the begin string/end string tokens. (first line is randomly selected words from a word list; in each line one word is replaced by DistilBERT's prediction)

same thing but starting with the word "hello" repeated twelve times. (also the procedure won't pick the word originally found at the random index, even if it would otherwise be the most likely token based on the model)

(1) sample words at random from a word list; (2) replace each word (in random order) with the most likely word for that context using BERT (actually Hugging Face DistilBERT)
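
a minimal sketch of steps (1) and (2) with transformers + DistilBERT; the word list below is a stand-in for whatever list was actually used:

import random
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

word_list = ["river", "lantern", "gravel", "hum", "afternoon", "salt", "window"]

# (1) sample words at random from a word list
words = random.choices(word_list, k=6)
print(" ".join(words))

# (2) replace each word (in random order) with the most likely word in context
for i in random.sample(range(len(words)), len(words)):
    masked = words[:i] + [tokenizer.mask_token] + words[i + 1:]
    inputs = tokenizer(" ".join(masked), return_tensors="pt")
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero()[0].item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    words[i] = tokenizer.decode([logits.argmax().item()]).strip()
    print(" ".join(words))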

poetry generation, sorta jiggly video 

using simplex noise to explore the parameter space of this generator...
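
the general idea, sketched with a tiny cosine-interpolated value-noise function standing in for simplex noise; the parameter names here are made up for illustration:

import numpy as np

rng = np.random.default_rng(1)
grid = rng.random(100)  # random values at integer steps

def smooth_noise(t, grid):
    # cosine interpolation between neighboring grid values -> smooth curve in [0, 1]
    i0 = int(np.floor(t))
    frac = t - i0
    u = (1 - np.cos(frac * np.pi)) / 2
    return grid[i0] * (1 - u) + grid[i0 + 1] * u

for frame in range(50):
    n = smooth_noise(frame * 0.1, grid)
    params = {  # hypothetical generator parameters driven by the noise value
        "words_per_line": int(2 + n * 6),
        "line_spacing": 1.0 + n * 0.5,
        "jitter": n * 3.0,
    }
    print(frame, params)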

playing around with some more expressive word placeholders (words will eventually be generated with a whole separate process but the rectangles felt too redactioney)

read the first of a handful of new poems to be posted over the course of the next few weeks: future-feed.net/variations-on- thx to Futurepoem for inviting me to be a futurefeed "blogger in residence!"
