latent space interpolation on a lower resolution version of this model after just 100 batches or so, using matplotlib's default colors because it looks vaguely metroid prime-y
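(for the record, the interpolation is nothing exotic: just a linear walk between two latent vectors, decoding a bitmap at each step. a rough sketch — the `generator.predict` call at the end is a stand-in for whatever the model actually exposes:)

```python
import numpy as np

def lerp_latents(z_a, z_b, steps):
    """Linearly interpolate between two latent vectors,
    returning `steps` points from z_a to z_b inclusive."""
    ts = np.linspace(0.0, 1.0, steps)
    return [(1 - t) * z_a + t * z_b for t in ts]

# hypothetical usage: decode each interpolated latent into a word bitmap
# frames = [generator.predict(z[None])[0] for z in lerp_latents(z_a, z_b, 16)]
```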
after a few thousand batches of training at a usable resolution on an actual GPU. recognizably "words" now—I wonder if increasing the depth of the model (or the kernel size of the convolutions?) would help it actually learn longer-distance dependencies...
another change I made was having it train on bitmaps of random words weighted by the frequency of the words in a reference corpus (i.e. in this case spaCy's unigram probabilities). the idea was that this would help it learn higher-frequency letter combinations and generate words that mostly replicate the "look" of English in use (rather than words in a word list). the drawback is that it looks like half the latent space is trying to spell out "the"
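(the weighting itself is nothing fancy — roughly this, where the log-probabilities are made-up numbers standing in for spaCy's lexeme `.prob` values:)

```python
import math
import random

def weighted_word_sampler(log_probs, rng=None):
    """Build a sampler that draws words proportional to corpus frequency.

    log_probs: dict mapping word -> log unigram probability
    (e.g. harvested from spaCy lexemes' .prob attribute).
    """
    rng = rng or random.Random()
    words = list(log_probs)
    weights = [math.exp(log_probs[w]) for w in words]
    return lambda: rng.choices(words, weights=weights, k=1)[0]

# toy numbers: "the" is vastly more probable than a rare word,
# which is exactly why half the latent space ends up spelling "the"
sample = weighted_word_sampler({"the": -3.0, "zymurgy": -14.0}, random.Random(0))
```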
prototype page layout, sampling each word at random (with the fully trained model, or at least as fully as I care to train it)
retraining with a serif font instantly makes it seem more ancient manuscript-ey
definitely bit off more than I could chew when it comes to making something that I feel is conceptually sound with this. the instant temptation is to go full "alien artifact" (and include GAN-generated body horror imagery or whatever), or at least make page layouts that resemble those of typical novels. but then the project feels like it's "about" layout, or "about" books as artifacts, which aren't topics that I personally care to spend time making arguments about at the moment
just realized that if I finish this project I'm going to become one of those people that needs to put that diagram of GAN architecture into their slides. I'm going to find myself explaining how GANs work to someone at a party, dear god
had an inkling to train a separate model for words with initial capitals, so I can introduce some structure (like sentences and paragraphs). the drawback here is that it won't share a latent space with the lower-case model, so interpolations won't work across the two. (also training a separate model for words with final punctuation)
nanogenmo, gans: weirdly makes me motion sick?
prototype with capitals and punctuation. you might need to take some dramamine before you try to read(?) this?
I wish more of this project was more "making weird text things" and less "reinventing typesetting from scratch," but here we are
it does full justification and indentation now! shown here zigzagging through interpolations of the latent space. (the capitalized words and end-of-sentence words are separately trained models, which is why they don't look like the surrounding words)
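(the justification part is basically just distributing the leftover pixels across the gaps between words — a minimal sketch, assuming the words actually fit on the line:)

```python
def justify(word_widths, line_width, min_gap):
    """Compute per-gap widths (in pixels) so the line of word bitmaps
    exactly fills line_width. Leftover pixels after even division are
    spread one at a time across the leading gaps."""
    n_gaps = len(word_widths) - 1
    if n_gaps <= 0:
        return []
    slack = line_width - sum(word_widths) - n_gaps * min_gap
    base, extra = divmod(slack, n_gaps)
    return [min_gap + base + (1 if i < extra else 0) for i in range(n_gaps)]
```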