more fun with distilbert! this technique: (1) forward pass of the model to get the transformer hidden state (2) add random noise to the hidden state (3) predict tokens from the modified hidden state (each successive line uses noise of increasing intensity)
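a minimal sketch of the idea, with a tiny random embedding table standing in for DistilBERT's hidden states (all names here are hypothetical; the real version would noise the transformer's final layer and decode through the MLM head):

```python
import random

# toy stand-in for the pipeline: a small embedding table plays the role
# of the transformer's hidden states
VOCAB = ["the", "cat", "sat", "on", "mat"]
random.seed(0)
EMBED = {w: [random.gauss(0, 1) for _ in range(8)] for w in VOCAB}

def encode(token):
    # step 1: "forward pass" to a hidden state (here: an embedding lookup)
    return EMBED[token]

def add_noise(hidden, intensity):
    # step 2: perturb the hidden state with gaussian noise
    return [h + random.gauss(0, intensity) for h in hidden]

def decode(hidden):
    # step 3: predict a token from the (noisy) hidden state,
    # here by nearest-neighbour search over the embedding table
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(VOCAB, key=lambda w: dist(EMBED[w], hidden))

def noisy_line(tokens, intensity):
    return " ".join(decode(add_noise(encode(t), intensity)) for t in tokens)

line = ["the", "cat", "sat", "on", "the", "mat"]
for intensity in (0.0, 0.5, 2.0):  # each line gets stronger noise
    print(intensity, noisy_line(line, intensity))
```

at intensity 0 the line comes back unchanged; as intensity grows, more tokens flip to their noisy neighbours, which is what produces the progressively garbled lines.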

allison boosted

a personal project that i've come to appreciate is my screenshot garden, a small curated mirror of my desktop screenshots folder.

it is really just an online directory tree with a few automated processes on top, but even so i've liked having it: a little folder of time spent browsing, working, and collecting that bundles itself up for you

i've written out some instructions and template code for the workflow, so you can build your own.

allison boosted

Computer-generated literature 

In Brazil's main newspaper, Folha de S.Paulo, I argue that microblogging itself and National Novel Generation Month as initiated by @darius are more important to computer-generated literature than particular technologies like GPT-2 and -3

allison boosted

Two things I like in the "Getting Unstuck" ebook sampler I just released:

* "Who this book is for and what you should get out of it" but also "Who this book is NOT for"
* concrete examples and exercises to improve open source project management skills

@tripofmice I think it would work *surprisingly* well without a lot of effort even (given the homogeneity of book covers by genre and also how genres are already marketed toward very narrow segments of readers)

allison boosted

I made a toy that generates "What vibes do I give off?" style memes from Wikipedia categories. Enjoy.

allison boosted

bleaching your teeth
retiring Flash
talking trash
under my window

generating 26 little poems about the alphabet by boosting the probability of tokens containing each letter during DistilGPT2 generation
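a hedged sketch of the boosting trick, on a toy vocabulary rather than DistilGPT2 (the function names and the bonus value are made up; the real version would add the same bump to the model's logits at every generation step):

```python
# toy version of letter-boosting: add a bonus to the logit of every
# token whose text contains the target letter, then pick the next token
VOCAB = ["apple", "banana", "cherry", "kiwi"]

def boost_logits(logits, letter, bonus=5.0):
    # bump tokens containing the letter before selection
    return [l + (bonus if letter in tok else 0.0)
            for tok, l in zip(VOCAB, logits)]

def pick(logits):
    # greedy choice; sampling from softmax(logits) works too
    return VOCAB[max(range(len(logits)), key=lambda i: logits[i])]

logits = [1.0, 0.9, 0.8, 0.7]
print(pick(logits))                     # unboosted pick
print(pick(boost_logits(logits, "k")))  # pick steered toward tokens with "k"
```

running this per letter of the alphabet, with the boost applied at every step of generation, steers each poem toward words containing that letter.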

conditional dcgan progress 

I sorta gave up on having the same model produce different fonts—it just didn't work and the samples across classes weren't similar for the same latent variable (which was the effect I was going for in the first place). HOWEVER, I am super pleased with the samples from the model I'm training on Garamond italics...

allison boosted

i made a mashup EP! it’s called Jaw Bra: 20 minutes of high-energy throwback bops to get you thru winter lockdown ❄️💖

arranged in Fuser, lightly edited in Audacity.

hope you enjoy! xoxo

Friend Camp

Hometown is adapted from Mastodon, a decentralized social network with no ads, no corporate surveillance, and ethical design.