the correct answer here was "country" but duolingo, my friend, my love, i *would* one day like to visit a french-speaking hamburger

been working on a lot of fancy text language poetry generation stuff lately but it's so tough to beat the tried and true technique of "replace words with other words sharing the same part of speech"
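the "same part of speech" trick can be sketched in a few lines. this isn't my actual generation code, just a minimal illustration with a tiny hand-made lexicon (a real version would get its tags from a tagger like NLTK's or spaCy's):

```python
import random

# tiny hand-made part-of-speech lexicon, purely illustrative
LEXICON = {
    "noun": ["hamlet", "hamburger", "country", "village"],
    "verb": ["visit", "admire", "avoid"],
}

def swap_same_pos(words, tags, lexicon, rng=random):
    """Replace each word with a random word sharing its part of speech."""
    out = []
    for word, tag in zip(words, tags):
        candidates = [w for w in lexicon.get(tag, []) if w != word]
        out.append(rng.choice(candidates) if candidates else word)
    return out

swap_same_pos(["visit", "hamlet"], ["verb", "noun"], LEXICON)
```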

this has been on my "open projects and ideas" board for like a year and i am very happy right now to be able to take it down and rip it in half and throw it away

using the same models and code I also made a book of latent space interpolations. this isn't 50k words long but I like it and wanted to share. PDF version: static.decontextualize.com/int image gallery: static.decontextualize.com/int

The final version of my asemic novel _Ahe Thd Yearidy Ti Isa_ (made for NaNoGenMo 2019) is now available, either as a 100MB PDF static.decontextualize.com/ahe or as an online image gallery static.decontextualize.com/ahe

it was generated from a suite of GANs trained on bitmaps of random words, which I then sampled from and arranged to look sorta novel-esque. training code: github.com/aparrish/word-dcgan novel generation/layout code: github.com/aparrish/word-gan-b
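the "arranged to look sorta novel-esque" step amounts to tiling sampled word bitmaps into a page grid. the real layout code is in the repo above; this is just a hypothetical sketch of the idea (function name, padding value, and shapes are all made up for illustration):

```python
import numpy as np

def layout_page(word_bitmaps, words_per_line, lines_per_page, pad=4):
    """Tile equal-sized word bitmaps, chosen at random, into a page-like grid.

    word_bitmaps: array of shape (n, h, w), values in [0, 1].
    """
    n, h, w = word_bitmaps.shape
    # blank "paper" big enough for the grid plus padding around each word
    page = np.ones((lines_per_page * (h + pad), words_per_line * (w + pad)))
    rng = np.random.default_rng()
    for row in range(lines_per_page):
        for col in range(words_per_line):
            bitmap = word_bitmaps[rng.integers(n)]  # pick a sampled word at random
            y, x = row * (h + pad), col * (w + pad)
            page[y:y + h, x:x + w] = bitmap
    return page
```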

ugh, macos preview overfilters images in PDFs and it's making the output of my gan thing look like garbage :( (macos preview on left, pdf.js in firefox on right)

nanogenmo, gans 

nanogenmo, gans, weirdly makes me motion sick? 

prototype page layout, sampling each word at random (with the fully trained model, or at least as fully as I care to train it)

after a few thousand batches of training at a usable resolution on an actual GPU. recognizably "words" now—I wonder if increasing the depth of the model (or the kernel size of the convolutions?) would help it actually learn longer-distance dependencies...

latent space interpolation on a lower resolution version of this model after just 100 batches or so, using matplotlib's default colors because it looks vaguely metroid prime-y
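latent space interpolation itself is very little code: walk in a straight line between two latent vectors and render each intermediate point. a minimal sketch (latent size of 100 is an assumption, and the generator call is left out):

```python
import numpy as np

def interpolate(z_start, z_end, steps):
    """Linear interpolation between two latent vectors, endpoints included."""
    ts = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - ts) * z_start + ts * z_end

z_a = np.random.default_rng(0).standard_normal(100)
z_b = np.random.default_rng(1).standard_normal(100)
frames = interpolate(z_a, z_b, 16)  # shape (16, 100); feed each row to the generator
```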

training a gan on bitmaps of words and it's making my eyes hurt (or maybe I just need to go to sleep)

machine learning, spooky 

Aarseth: Cybertext, as now should be clear, is the wide range... of possible textualities seen as a typology of machines, as various kinds of literary communication systems where the functional differences among the mechanical parts play a defining role in determining the aesthetic process. [...] If these texts redefine literature by expanding our notion of it—and I believe that they do—then they must also redefine what is literary...

2019 web:

bikes, nyc transit, self congratulation 

a truly bizarre yet compelling political advertisement inside this hotel elevator in irvine, california

Friend Camp

Hometown is adapted from Mastodon, a decentralized social network with no ads, no corporate surveillance, and ethical design.