bike, park 

i only saw one other cyclist on the prospect park loop today but i refuse to accept that bikes-for-fun season is over in nyc. i am cold and damp and i REFUSE

HEY ARTISTS and general people who are looking for creative opportunities -->

i'm sharing what might be my life's work, which is a spreadsheet of almost 400 places that hold regular open calls for residencies, shows, and funding

y'all deserve this and we gotta take care of each other. apply to things and get that support:

https://docs.google.com/spreadsheets/d/1KWIzznlFNs_rQCEzW5ub6ehwaLcwR80xbuOokXwRa_Y/edit?usp=sharing

video games 

one thing about torchlight 2 is that if it's even vaguely close to bed time, it puts me to sleep almost immediately and i'll be jolted awake when it switches to the "resurrect here for 234 gold" screen. then again i'm playing on normal difficulty right now so dozing off while holding the regular attack isn't a bad overall strategy

also just had to teach it "variational", "autoencoders", "divinatory", "ontologies" and it's like, firefox textarea spellcheck, you really don't know my life

firefox textarea spellcheck suggests "phoneme-to-ephemera" as a replacement for "phoneme-to-grapheme" and now I have some new project ideas

skynettoday.com/editorials/ai- this is really really good. I think I'd add: "Don't use words like 'yet' or phrases like 'the current state' to imply that General AI is the natural and inescapable teleology of AI research"

weird thing about working with ML models that don't have a softmax layer as an output is that there's no temperature parameter, and it's always like, wait how do you make this thing DO stuff?
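
(for context, the temperature knob only exists because softmax outputs a probability distribution you can reshape before sampling; a minimal numpy sketch, function name invented, not tied to any particular model:)

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample one index from raw logits after temperature scaling.

    temperature < 1 sharpens the distribution (safer, more repetitive);
    temperature > 1 flattens it (weirder, more surprising samples).
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

without a softmax at the end there's no distribution to reshape, hence the "how do you make this thing DO stuff" feeling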

just realized that if I finish this project I'm going to become one of those people who need to put that diagram of GAN architecture into their slides. I'm going to find myself explaining how GANs work to someone at a party, dear god

nanogenmo, gans 

definitely bit off more than I could chew when it comes to making something that I feel is conceptually sound with this. the instant temptation is to go full "alien artifact" (and include GAN-generated body horror imagery or whatever), or at least make page layouts that resemble those of typical novels. but then the project feels like it's "about" layout, or "about" books as artifacts, which aren't topics that I personally care to spend time making arguments about at the moment

nanogenmo, gans 

retraining with a serif font instantly makes it seem more ancient manuscript-ey

prototype page layout, sampling each word at random (with the fully trained model, or at least as fully as I care to train it)

actually surprising how much it resembles this instagram.com/p/Bq1FHgil9Jl/, a piece I trained on a similar dataset, except with vector data from words spelled in Hershey fonts (using the sketch-rnn model)

another change I made was having it train on bitmaps of random words weighted by the frequency of the words in a reference corpus (i.e. in this case spaCy's unigram probabilities). the idea was that this would help it learn higher-frequency letter combinations and generate words that mostly replicate the "look" of English in use (rather than words in a word list). the drawback is that it looks like half the latent space is trying to spell out "the"
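
(the weighting step could look something like this; a sketch with made-up stand-in numbers, since in a real run the log-probabilities would come from spaCy lexemes via nlp.vocab[word].prob:)

```python
import math
import random

# stand-in unigram log-probabilities; in practice these would come
# from spaCy (Lexeme.prob is a smoothed log probability estimate)
log_probs = {"the": -3.5, "of": -4.2, "autoencoder": -12.0, "divinatory": -14.0}

def sample_training_words(log_probs, k, rng=random):
    """Draw k words for a training batch, weighted by corpus frequency."""
    words = list(log_probs)
    weights = [math.exp(log_probs[w]) for w in words]
    return rng.choices(words, weights=weights, k=k)
```

with weights like these, "the" dominates every batch, which is exactly how half the latent space ends up trying to spell "the"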

after a few thousand batches of training at a usable resolution on an actual GPU. recognizably "words" now—I wonder if increasing the depth of the model (or the kernel size of the convolutions?) would help it actually learn longer-distance dependencies...

latent space interpolation on a lower resolution version of this model after just 100 batches or so, using matplotlib's default colors because it looks vaguely metroid prime-y
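
(the interpolation itself is the easy part; a numpy sketch of linear interpolation between two latent vectors, with the function name invented for illustration. spherical interpolation is often preferred for GAN latents, but plain lerp is what's shown here:)

```python
import numpy as np

def interpolate_latents(z1, z2, steps):
    """Return `steps` latent vectors linearly interpolated from z1 to z2."""
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1.0 - t) * z1 + t * z2 for t in ts])
```

feed each row to the generator and you get the frames of the interpolation animation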

training a gan on bitmaps of words and it's making my eyes hurt (or maybe I just need to go to sleep)

procrastinating... with quantified-self text analysis 

the 25 most common words, with counts, in the URLs of my open tabs

https: 38
com: 29
google: 10
www: 10
2019: 9
http: 9
localhost: 8
1: 6
github: 6
11: 5
ipynb: 5
mail: 4
edit: 4
html: 4
calendar: 4
python: 4
notebooks: 4
8890: 4
0: 3
docs: 3
org: 3
aparrish: 3
bobey: 3
dig: 3
cloud: 3
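
(for the curious, the counting is just a Counter over URLs split on non-alphanumerics; a minimal sketch, function name invented:)

```python
import re
from collections import Counter

def url_token_counts(urls, n=25):
    """Count the most common alphanumeric tokens across a list of URLs."""
    counter = Counter()
    for url in urls:
        counter.update(tok for tok in re.split(r"[^A-Za-z0-9]+", url) if tok)
    return counter.most_common(n)
```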

machine learning, spooky 

early stage mnist gan looks like ghosts pressing their faces up against your window at night

i was on a podcast recently! it was a fun conversation (mostly about poetry and programming) corecursive.com/beautiful-and-
