highly recommended generative technique: whenever you're selecting an item at random where the items are weighted (e.g. by their frequency in a corpus), normalize the weights and sample from a softmax over their logs, using temperature as a parameter. (at temp=1.0 this is the same as picking by the weights directly; below 1.0 it favors the more heavily weighted items; as temperature increases past 1.0, the sampling approaches a uniform distribution)
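a minimal sketch of that technique, assuming plain Python and made-up function names (taking the softmax over the log of the weights is what makes temp=1.0 reproduce the original weights exactly):

```python
import math
import random

def softmax_probs(weights, temperature=1.0):
    # softmax over log-weights: at temperature 1.0 this recovers the
    # normalized weights exactly; <1.0 sharpens toward the heaviest
    # items, >1.0 flattens toward a uniform distribution
    logs = [math.log(w) / temperature for w in weights]
    m = max(logs)  # subtract the max before exp for numerical stability
    exps = [math.exp(x - m) for x in logs]
    total = sum(exps)
    return [e / total for e in exps]

def sample_weighted(items, weights, temperature=1.0):
    # draw one item according to the temperature-adjusted probabilities
    probs = softmax_probs(weights, temperature)
    return random.choices(items, weights=probs, k=1)[0]
```

e.g. with weights [1.0, 3.0] and temp=1.0 the probabilities come out [0.25, 0.75], same as normalizing directly; crank the temperature up and they drift toward [0.5, 0.5].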

here are examples from the thing I'm working on

another day, another VAE (nonsense words)

I got very helpful advice today on this, which is that the distribution the VAE learns might not be centered at zero—after averaging together the latent vectors from a few thousand items from the data set and using *that* as the center of the distribution, I get much better results when sampling!


whoops, I left the debug thing in where I printed out any words that weren't within the expected length limits


the way I've been reviewing the output of this model is by looking at the probability scores juxtaposed with the words, one by one, and checking for the highest scores (higher score = greater probability that a line break will directly follow this word). anyway, now I'm having a hard time not reading "Stopping By Woods on a Snowy Evening" in the Beastie Boys style, with everyone shouting out the end rhymes
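that review loop might look something like this (a hypothetical sketch; `scores` stands in for whatever per-word break probabilities the model actually outputs):

```python
def show_break_scores(words, scores, threshold=0.5):
    # print each word next to its predicted line-break probability,
    # flagging scores above a threshold as likely line-break positions
    lines = []
    for word, score in zip(words, scores):
        flag = "  <-- break?" if score >= threshold else ""
        lines.append(f"{word:15s} {score:.2f}{flag}")
    return "\n".join(lines)
```

scanning down the flagged words is how you'd end up hearing the end rhymes shouted out.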


- Home page
- https://www.decontextualize.com/

Poet, programmer, game designer, computational creativity researcher. Assistant Arts Professor at NYU ITP. she/her
