the main trade-off of training a VAE is reconstruction fidelity vs. structure of the latent space, and the epoch 3 version of this model was much "better" at reconstructing arbitrary inputs. at epoch 12, it reconstructs inputs to appear more like the training data, which I kind of prefer? here's smashmouth again with reconstructions from this model
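that trade-off falls directly out of the two terms in the VAE objective: a reconstruction term pulls toward faithful copies of the input, while the KL term pulls the approximate posterior toward the prior (and thus toward a smoother, more sample-able latent space). a minimal numerical sketch of the balance (function names, the squared-error reconstruction term, and the `beta` weight are illustrative assumptions, not this model's actual loss):

```python
import numpy as np

def kl_standard_normal(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, 1) ), summed over latent dimensions
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

def vae_loss(x, x_recon, mu, logvar, beta=1.0):
    # reconstruction term (squared error here as a stand-in) plus
    # beta-weighted KL; larger beta favors latent structure over fidelity
    recon = np.sum((x - x_recon) ** 2)
    return recon + beta * kl_standard_normal(mu, logvar)
```

turning `beta` up is one common way to trade reconstruction fidelity for a better-behaved latent space, which is roughly the epoch-3 vs. epoch-12 difference described above.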

picking a random point in the latent space, then doing greedy decoding from various randomly selected nearby points (essentially generating variations on a line)—
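a sketch of that sampling step (the latent dimension, the 0.1 noise scale, and the `greedy_decode` helper are all illustrative assumptions, not the model's actual values):

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 64  # assumed; whatever the trained VAE's latent size is

z = rng.standard_normal(latent_dim)  # a random point in latent space
# eight nearby points: small gaussian perturbations around z
neighbors = z + 0.1 * rng.standard_normal((8, latent_dim))

# each row would then be greedy-decoded by the trained decoder, e.g.
# variations = [greedy_decode(decoder, z_i) for z_i in neighbors]
# (greedy_decode is a hypothetical helper: argmax token at each step)
```

because the perturbations are small, the decoded outputs stay close to the same "line" while varying in wording, which is the variation effect described above.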

greedy decoding from random samples at epoch 12, just at the right point before the model started to collapse. these are actually sorta... breathtaking?

now training a variational autoencoder neural network on the gutenberg poetry corpus but with pre-trained subword embeddings—it's MUCH better at reconstructing inputs now—here's a reconstruction of a little number you might know

i don't know why people say software engineering is difficult. just get the code cube to unlock the area leading to the boss. even kirby understands the basics

photos of people w/ec, marketing email 

COCO captions have to be the weirdest corpus commonly used in machine learning. just normal people describing normal things

I have new poems out in Andreas Bülhoff's sync² zine series today: for the series, or here's a direct link to the PDF:

I've posted prototypes of some of these poems here—"compass" shapes with real words on the points and interpolated words (generated from a seq2seq model I trained) between. a few excerpts:

in 1985 Ben Fong-Torres wrote an article about Amy Tan's Kaypro User Group, and I'm not sure why this fact is so amusing and surreal

(from Kirschenbaum's _Track Changes_, p. 65)

experimenting with RNN text generation using pre-trained byte-pair embeddings—output below shows results at various temperatures after five epochs with a minimalist architecture on a small corpus (A.E. Waite's _Pictorial Key to the Tarot_). it's not gpt-2 but it's not bad for 10 mins of training on my macbook air either
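"various temperatures" here means dividing the model's output logits by a temperature before sampling: low temperatures sharpen the distribution toward greedy decoding, high temperatures flatten it toward randomness. a minimal sketch (the function name is mine):

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    # scale logits by 1/temperature, then softmax and sample one token index.
    # temperature -> 0 approaches argmax; temperature >> 1 approaches uniform.
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # subtract max for stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)
```

at generation time you'd call this on the RNN's logits at each step and feed the sampled token back in.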

hard to imagine poetry more phonetically perfect than this snippet from Elsa von Freytag-Loringhoven

(quoted in C.D. Wright's introduction to Robert Hass, and Paul Ebenkamp, editors. Modernist Women Poets: An Anthology. Counterpoint, 2014, p. 3.)

part of what I showed last night at the Creative Machine Learning Meetup: a little riff on rich text editors, using the spelling/phonetics sequence-to-sequence model I've been working on

just noticed that this new concrete/sound poetry generator I'm working on inadvertently produced some Captain Planet fanfiction

the fluid, abstract, almost hallucinatory character forms in these textual talismans from "Scripture on the Rites of the Vajra-Being of Impure Traces for Exorcising the Hundred Transformations" are really beautiful—to my eye they're mimicking both the form of individual characters and the form of blocks of characters? (from Robson, James. "Signs of Power: Talismanic Writing in Chinese Buddhism." History of Religions, vol. 48, no. 2, 2008, pp. 130–169.)

when you choose too small a multiplier for your tsne coordinates and end up making a writhing mass of genesis 1:1

what I actually say: "This is wonderful, friend! Thank you for sharing your work with me."
what I want to say:

Friend Camp

The decentralized web is about trust. You should only join Friend Camp if you personally trust Darius Kazemi with your social media data. You probably only have that level of trust if we are IRL friends or have been internet friends for a long time. Generally speaking, this is a small, closed community. In the end, Darius is the arbiter of what is allowed here. If you don't have a good idea of the kind of behavior that flies with Darius, again, you probably shouldn't join this instance. In the interest of specificity, we do have a code of conduct and privacy policy which you should read.

Friend Camp features several modifications that were requested by our users:

* you can log in via any subdomain, which means you can log in to multiple accounts in the same browser session (for example, log in once on and then as another user on)
* they are no longer called "toots", they are now "posts"
* if you have a locked account and you get a follow request, a reminder appears under your "post" button (on normal Mastodon mobile it is otherwise buried in a sub-menu and you might not see it for a long time)
* the emoji dropdown is a neutral smiley face instead of the cry-laughing smiley
* @mentions are rendered as "@user" for a Friend Camp user and "@user@domain" for remote users. This helps clear up when you follow two people who have the same username on different servers.
* there is a "never ask me again" checkbox on the confirmation for clearing your notifications -- more info here
* when an mp3 link is in a post, we also embed an inline mp3 player. git commit here
* 500 characters of profile text. git commit here, requested by @deerful

Important bit from the privacy docs: if you want decent privacy (the info doesn't leave this server), the only way to do that is to set your account to private, only accept friend requests from other users, and only ever @ mention other users. Once you start talking to people on other servers, all bets are off. Any private message you send to someone on another server could be looked at by the admin of a different server. This is kind of like email: if you are on a private email server and you send an unencrypted email to a gmail account, congrats, Google now has the content of that email. But also, you do this every day, so, hey. The internet!

Our beautiful icon is based on photo3idea_studio from, licensed CC 3.0 BY. It has been modified by!