more fun with distilbert! the technique: (1) forward pass of the model to get the transformer hidden state (2) add random noise to the hidden state (3) predict tokens from the modified hidden state (the noise intensity increases with each line)
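a minimal sketch of the three steps, assuming Hugging Face transformers; the input sentence and noise scales here are made up, and the way the MLM head is applied (vocab_transform → activation → vocab_layer_norm → vocab_projector) reflects current library internals, which may differ by version:

```python
import torch
from transformers import AutoTokenizer, DistilBertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

inputs = tokenizer("the quick brown fox jumps over the lazy dog", return_tensors="pt")

with torch.no_grad():
    # (1) forward pass to the transformer hidden state
    hidden = model.distilbert(**inputs).last_hidden_state
    for scale in [0.0, 0.5, 1.0, 2.0, 4.0]:
        # (2) add random noise, louder with each line
        noisy = hidden + scale * torch.randn_like(hidden)
        # (3) run the masked-LM head on the modified hidden state
        x = model.vocab_transform(noisy)
        x = model.vocab_layer_norm(model.activation(x))
        logits = model.vocab_projector(x)
        print(scale, tokenizer.decode(logits.argmax(dim=-1)[0], skip_special_tokens=True))
```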

allison boosted

a personal project that i've come to appreciate is my screenshot garden, a small curated mirror of my desktop screenshots folder.

it is really just an online directory tree with a few automated processes on top, but even so i've liked having it: a little folder of time spent browsing, working, and collecting that bundles itself up for you

i've written out some instructions and template code for the workflow, so you can build your own:

https://screenshot-garden.neocities.org/build-your-own.html
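a rough sketch of the simplest version of that idea, not the actual template code from the link; the folder paths and page layout here are invented:

```python
import html
from pathlib import Path

SCREENSHOTS = Path.home() / "Pictures" / "Screenshots"  # wherever yours land
OUTPUT = Path("garden") / "index.html"
OUTPUT.parent.mkdir(exist_ok=True)

items = []
for img in sorted(SCREENSHOTS.glob("*.png")):
    name = html.escape(img.name)
    # a fuller version would also copy/resize the images into the output folder
    items.append(f'<li><a href="{name}">{name}</a></li>')

OUTPUT.write_text(
    "<h1>screenshot garden</h1>\n<ul>\n" + "\n".join(items) + "\n</ul>\n",
    encoding="utf-8",
)
```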

allison boosted

Computer-generated literature 

In Brazil's main newspaper, Folha de S.Paulo, I argue that microblogging itself and National Novel Generation Month as initiated by @darius are more important to computer-generated literature than particular technologies like GPT-2 and -3 https://www1.folha.uol.com.br/ilustrada/2021/01/robos-ja-escrevem-de-poemas-e-microcontos-no-twitter-a-romances-inteiros.shtml

allison boosted

Two things I like in the "Getting Unstuck" ebook sampler I just released:

* "Who this book is for and what you should get out of it" but also "Who this book is NOT for"
* concrete examples and exercises to improve open source project management skills

https://changeset.nyc/resources/getting-unstuck-sampler-offer.html

@tripofmice I think it would work *surprisingly* well, even without a lot of effort (given the homogeneity of book covers by genre and also how genres are already marketed toward very narrow segments of readers)

allison boosted

I made a toy that generates "What vibes do I give off?" style memes from Wikipedia categories. Enjoy.

tinysubversions.com/vibes/
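One possible way a toy like this could work (a guess, not the actual implementation): pull a random Wikipedia article from the public MediaWiki API, grab its visible categories, and present a few of them as your "vibes":

```python
import random
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "generator": "random",
    "grnnamespace": 0,       # random article, not a talk/user page
    "prop": "categories",
    "clshow": "!hidden",     # skip hidden maintenance categories
    "cllimit": 50,
    "format": "json",
}
page = next(iter(requests.get(API, params=params).json()["query"]["pages"].values()))
cats = [c["title"].removeprefix("Category:") for c in page.get("categories", [])]

print(f"what vibes do I give off? ({page['title']})")
for cat in random.sample(cats, k=min(4, len(cats))):
    print(" -", cat)
```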

allison boosted

bleaching your teeth
retiring Flash
talking trash
under my window

generating 26 little poems about the alphabet by boosting the probability of tokens containing each letter during DistilGPT2 generation
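a minimal sketch of the boosting trick, assuming Hugging Face transformers; the bonus value, prompt, and plain sampling loop are illustrative rather than the exact setup:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model.eval()

def letter_mask(letter):
    # 1.0 for every vocabulary entry whose surface form contains the letter
    tokens = tokenizer.convert_ids_to_tokens(list(range(len(tokenizer))))
    return torch.tensor([letter in t.lower() for t in tokens], dtype=torch.float)

def boosted_poem(letter, prompt="a little poem:\n", bonus=4.0, new_tokens=24):
    mask = letter_mask(letter)
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(new_tokens):
        with torch.no_grad():
            logits = model(ids).logits[0, -1]
        # boost the probability of tokens containing the target letter
        probs = torch.softmax(logits + bonus * mask, dim=-1)
        next_id = torch.multinomial(probs, 1).view(1, 1)
        ids = torch.cat([ids, next_id], dim=1)
    return tokenizer.decode(ids[0])

for letter in "abcdefghijklmnopqrstuvwxyz":
    print(letter, boosted_poem(letter))
```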

conditional dcgan progress 

I sorta gave up on having the same model produce different fonts—it just didn't work and the samples across classes weren't similar for the same latent variable (which was the effect I was going for in the first place). HOWEVER, I am super pleased with the samples from the model I'm training on Garamond italics...
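For context, a hypothetical sketch of the "same latent variable across classes" check with a small conditional DCGAN-style generator; the architecture, image size, and class count here are invented, not the model from this thread:

```python
import torch
import torch.nn as nn

class CondGenerator(nn.Module):
    def __init__(self, z_dim=100, n_classes=26, channels=1):
        super().__init__()
        self.embed = nn.Embedding(n_classes, n_classes)
        self.net = nn.Sequential(
            # project latent + class embedding up from 1x1 to 32x32
            nn.ConvTranspose2d(z_dim + n_classes, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(),
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, channels, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, z, labels):
        cond = self.embed(labels)                        # (batch, n_classes)
        x = torch.cat([z, cond], dim=1)[:, :, None, None]
        return self.net(x)

# the check: hold one latent vector fixed and sweep the class label; if the
# conditioning behaves, the 26 samples should share a "style"
gen = CondGenerator().eval()
z = torch.randn(1, 100).repeat(26, 1)
samples = gen(z, torch.arange(26))                       # (26, 1, 32, 32)
```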

allison boosted

i made a mashup EP! it’s called Jaw Bra: 20 minutes of high-energy throwback bops to get you thru winter lockdown ❄️💖

soundcloud.com/user-452090197/

arranged in Fuser, lightly edited in Audacity.

hope you enjoy! xoxo
