using the same models and code I also made a book of latent space interpolations. this isn't 50k words long but I like it and wanted to share. PDF version: http://static.decontextualize.com/interpolations.pdf image gallery: http://static.decontextualize.com/interpolations-gallery/
The final version of my asemic novel _Ahe Thd Yearidy Ti Isa_ (made for NaNoGenMo 2019) is now available, either as a 100MB PDF http://static.decontextualize.com/ahe-thd-yearidy-ti-isa.pdf or as an online image gallery http://static.decontextualize.com/ahe-thd-yearidy-ti-isa-gallery/
it was generated from a suite of GANs trained on bitmaps of random words, which I then sampled from and arranged to look sorta novel-esque. training code: https://github.com/aparrish/word-dcgan novel generation/layout code: https://github.com/aparrish/word-gan-book-generator
had an inkling to train a separate model for words with initial capitals, so I can introduce some structure (like sentences and paragraphs). the drawback is that it won't share a latent space with the lower-case model, so interpolations won't work across the two. (also training a separate model for words with final punctuation)
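the interpolations these posts talk about boil down to blending latent vectors before decoding them with a single generator. a minimal sketch in plain NumPy (the `interpolate` helper and the 100-dim latent size are illustrative assumptions, not code from the word-dcgan repo):

```python
import numpy as np

def interpolate(z_a, z_b, steps=8):
    """Linearly interpolate between two latent vectors.

    Returns an array of shape (steps, latent_dim); feeding each row to
    one generator yields the in-between word images. Vectors from two
    separately trained models can't be mixed this way, since each model
    learns its own, unrelated latent coordinate system.
    """
    ts = np.linspace(0.0, 1.0, steps)[:, None]  # (steps, 1) mixing weights
    return (1.0 - ts) * z_a + ts * z_b

# two random points in the same 100-dim latent space (DCGAN-style Gaussian prior)
rng = np.random.default_rng(0)
z_a, z_b = rng.standard_normal((2, 100))
frames = interpolate(z_a, z_b)
# frames[0] is z_a, frames[-1] is z_b; the rows in between blend the two
```

(straight linear blending for simplicity; spherical interpolation is also common for Gaussian latent spaces, since it keeps intermediate vectors at a typical distance from the origin.)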
Aarseth: Cybertext, as now should be clear, is the wide range... of possible textualities seen as a typology of machines, as various kinds of literary communication systems where the functional differences among the mechanical parts play a defining role in determining the aesthetic process. [...] If these texts redefine literature by expanding our notion of it—and I believe that they do—then they must also redefine what is literary...
Poet, programmer, game designer, computational creativity researcher. Assistant Arts Professor at NYU ITP. she/her