allison boosted

My computer-generated book GOLEM 

Coming in a few days from Dead Alive's New Sight imprint, available for pre-order now!

“Like all great art, unraveling the secret of Nick Montfort’s Golem…reads us into its riddle.…Golem is an astonishingly rich work of text generation” @zachwhalen

“A rewarding expedition on the branching paths that connect grammar and code…sonorous lexical and syntactic counterpoint” @aparrish

allison boosted


This has been on my mind for a long time, and now it's a real thing!

more fun with distilbert! this technique: (1) do a forward pass of the model to get the transformer hidden state (2) add random noise to the hidden state (3) predict tokens from the modified hidden state (each successive line uses a higher noise intensity)
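the three steps above can be sketched in miniature. this is a toy numpy version — the real post uses DistilBERT, but here a random matrix stands in for the model's output head and a random matrix stands in for the hidden state, just to show the pipeline of perturb-then-decode:

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, VOCAB, SEQ = 16, 50, 6
# stand-in for the model's hidden-state-to-vocabulary projection
lm_head = rng.normal(size=(HIDDEN, VOCAB))

def predict_tokens(hidden):
    """Greedy token ids decoded from hidden states via the toy LM head."""
    logits = hidden @ lm_head
    return logits.argmax(axis=-1)

# stand-in for the hidden state produced by a forward pass (step 1)
hidden = rng.normal(size=(SEQ, HIDDEN))

# steps 2 + 3: add noise, decode -- one line per noise intensity
for intensity in [0.0, 0.5, 1.0, 2.0]:
    noisy = hidden + intensity * rng.normal(size=hidden.shape)
    print(intensity, predict_tokens(noisy))
```

at intensity 0 the decoded tokens match the unperturbed ones; as intensity grows, more positions flip — the same effect the post describes line by line.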

allison boosted

a personal project that i've come to appreciate is my screenshot garden, a small curated mirror of my desktop screenshots folder.

it is really just an online directory tree with a few automated processes on top, but even so i've liked having it: a little folder of time spent browsing, working, and collecting that bundles itself up for you

i've written out some instructions and template code for the workflow, so you can build your own:
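one of those "automated processes on top" might look like this — a hypothetical sketch, not the author's actual template, that walks a screenshots folder and renders a bare-bones HTML index, newest first:

```python
import html
from pathlib import Path

def build_index(folder: Path) -> str:
    """Render a minimal HTML gallery of the .png files in `folder`,
    newest first -- the kind of small automation a screenshot garden runs."""
    images = sorted(folder.glob("*.png"),
                    key=lambda p: p.stat().st_mtime, reverse=True)
    items = "\n".join(
        f'<figure><img src="{html.escape(p.name)}">'
        f"<figcaption>{html.escape(p.stem)}</figcaption></figure>"
        for p in images
    )
    return f"<!doctype html>\n<main>\n{items}\n</main>\n"
```

point it at your screenshots directory and write the result to `index.html`; rerunning it is the "bundles itself up for you" part.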

allison boosted

Computer-generated literature 

In Brazil's main newspaper, Folha de S.Paulo, I argue that microblogging itself and National Novel Generation Month as initiated by @darius are more important to computer-generated literature than particular technologies like GPT-2 and -3

allison boosted

Two things I like in the "Getting Unstuck" ebook sampler I just released:

* "Who this book is for and what you should get out of it" but also "Who this book is NOT for"
* concrete examples and exercises to improve open source project management skills

allison boosted

I made a toy that generates "What vibes do I give off?" style memes from Wikipedia categories. Enjoy.

allison boosted

bleaching your teeth
retiring Flash
talking trash
under my window

generating 26 little poems about the alphabet by boosting the probability of tokens containing each letter during DistilGPT2 generation
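the boosting trick can be shown with a toy vocabulary — this numpy sketch adds a bonus to the logit of every token whose text contains the target letter, a stand-in for doing the same to DistilGPT2's next-token logits at each generation step:

```python
import numpy as np

def boost_letter(logits, vocab, letter, bonus=4.0):
    """Add `bonus` to the logit of every token containing `letter`,
    a toy stand-in for biasing DistilGPT2's next-token distribution."""
    out = np.array(logits, dtype=float)
    for i, tok in enumerate(vocab):
        if letter in tok:
            out[i] += bonus
    return out

# hypothetical mini-vocabulary, uniform logits before boosting
vocab = ["cat", "dog", "zebra", "ox", "quail"]
biased = boost_letter(np.zeros(len(vocab)), vocab, "z")
probs = np.exp(biased) / np.exp(biased).sum()   # softmax over biased logits
```

sampling from `probs` now strongly favors "zebra" — run once per letter and you get 26 letter-flavored distributions.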

conditional dcgan progress 

I sorta gave up on having the same model produce different fonts—it just didn't work and the samples across classes weren't similar for the same latent variable (which was the effect I was going for in the first place). HOWEVER, I am super pleased with the samples from the model I'm training on Garamond italics...

allison boosted

i made a mashup EP! it’s called Jaw Bra: 20 minutes of high-energy throwback bops to get you thru winter lockdown ❄️💖

arranged in Fuser, lightly edited in Audacity.

hope you enjoy! xoxo

allison boosted

Santa is just the fruiting body the real Christmas stretches for miles under the permafrost

conditional dcgan progress 

this is so tantalizingly close to what I want—I'm training the GAN on images of words, conditioned on labels for different text styles (italics, all caps, title case, etc)—you can clearly see many of the different styles in this sample (trained on about 100k images). I managed to avoid mode collapse, but the GAN unfortunately fails to converge (after 200k images, the generator just makes white noise)
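the "conditioned on labels" part usually means embedding the class label and concatenating it with the latent vector before the transposed convolutions. a minimal PyTorch sketch of that idea — sizes are illustrative, not the model from these posts:

```python
import torch
import torch.nn as nn

class CondGenerator(nn.Module):
    """Minimal conditional DCGAN generator: the style label is embedded
    and concatenated with the latent vector z before upsampling."""
    def __init__(self, n_classes=6, z_dim=100, emb_dim=16):
        super().__init__()
        self.embed = nn.Embedding(n_classes, emb_dim)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(z_dim + emb_dim, 128, 4, 1, 0),  # 1x1 -> 4x4
            nn.BatchNorm2d(128),
            nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1),               # 4x4 -> 8x8
            nn.BatchNorm2d(64),
            nn.ReLU(True),
            nn.ConvTranspose2d(64, 1, 4, 2, 1),                 # 8x8 -> 16x16
            nn.Tanh(),                                          # pixels in [-1, 1]
        )

    def forward(self, z, labels):
        cond = self.embed(labels)                         # (B, emb_dim)
        x = torch.cat([z, cond], dim=1)[..., None, None]  # (B, z+emb, 1, 1)
        return self.net(x)
```

holding `z` fixed and sweeping `labels` is how you'd check for the "same latent variable, similar samples across classes" effect the thread is after.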

allison boosted

logic - pitch us! 

we're now accepting pitches for our next issue of Logic! the theme is DISTRIBUTION.

more on the theme, rates, how to pitch, and deadlines are here:

outside of reported pieces, we're also always looking for interesting folks to talk to for interviews (identified or anonymous). our anonymous series talks with rank-and-file tech workers about their work.

if you know of someone who'd like to chat, contact us the same way as a pitch or DM me for my signal

what do you call a gan that doesn't work 

a gan't


allison learns about... gans 

apparently the answer to "why isn't my gan working" is usually "well why didn't you put more batch normalization in there, hotshot"

(currently trying to hack the dcgan model I've been using to generate conditioned on labels, with only sputtering success, woo)

allison boosted

what idiot called it vaccination and not a cowpoke

allison boosted

Folks with experience in nonprofit development and marketing: The Python Software Foundation is hiring. Remote full-time job; I can recommend the Executive Director and other staff (I have contracted with them frequently).

"doing this" = using DistilBERT to gradually transform a sequence of words picked at random from a word list into text that appears to make sense


okay I THINK I finally found a way of doing this that comes close to meeting all of my criteria for this project (i.e., each step shows visible and meaningful change; the change is gradual, but the result "converges" after relatively few steps): calculate the probability of each token in the source text vs. a token sampled from the distribution of the mask token at that position, then find "peaks" of improbable tokens and replace them with the sampled token at those peaks; stop when any output repeats
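the peak-finding step can be sketched concretely. this is a hedged numpy sketch, with made-up probabilities standing in for what DistilBERT would assign — the score at each position is the log-probability improvement of the sampled token over the source token, and a "peak" is a local maximum of that score:

```python
import numpy as np

def improbability_peaks(p_source, p_sampled):
    """Positions where the sampled token beats the source token,
    restricted to local maxima ('peaks') of the improvement score."""
    score = np.log(np.asarray(p_sampled)) - np.log(np.asarray(p_source))
    n = len(score)
    return [i for i in range(n)
            if score[i] > 0
            and (i == 0 or score[i] >= score[i - 1])
            and (i == n - 1 or score[i] >= score[i + 1])]

def replace_at_peaks(tokens, sampled, p_source, p_sampled):
    """One step of the procedure: swap in the sampled token at each peak."""
    out = list(tokens)
    for i in improbability_peaks(p_source, p_sampled):
        out[i] = sampled[i]
    return out
```

in the full procedure you'd re-run the model after each step and stop as soon as an output sequence repeats one already seen.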
