life hack: if someone asks for a business card and of course you don't have one, just give them one of your name tags from a recent conference that you still have stashed in your bag for some reason

boop, here's the github repository, since it's not immediately clear how to get back to the repo from the notebook: github.com/aparrish/vae-laggin (and again I want to thank the upstream researchers for making their code available!)

@halcy it's binder! mybinder.org/ it's like colab except, uh, it's still jupyter notebook and you can host it yourself and it's not owned by google, highly recommended

if you're curious about this poetry model, the code is now online and you can play around with it here: mybinder.org/v2/gh/aparrish/va (this will launch a cloud-hosted anonymous jupyter notebook—if you're not familiar with jupyter notebooks, just click on the cells with code in them and hit shift+enter, starting from the top)

@jonbro it's not really usable in any sense at this point! but yes, education contexts was exactly what I had in mind.

(if you don't know, p5.js is a creative coding framework, based on Processing, written in JavaScript: p5js.org/)

last week at the p5.js contributors' conference, I made a very (extremely) rough prototype of what I think a p5.js-specific notebook environment might look like, and wrote up a bit of my research and principles behind the prototype: github.com/aparrish/nb5js-proo

the main trade-off in training a VAE is reconstruction fidelity vs. the structure of the latent space, and the epoch 3 version of this model was much "better" at reconstructing arbitrary inputs. at epoch 12, it reconstructs inputs to appear more like the training data, which I kind of prefer? here's smashmouth again with reconstructions from this model gist.github.com/aparrish/d356d
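
(for anyone following along: that trade-off usually shows up as a weight on the KL term of the VAE's loss. a minimal sketch of the idea, not the notebook's actual code, with the tensor shapes and the name vae_loss assumed:)

    import torch
    import torch.nn.functional as F

    def vae_loss(logits, targets, mu, logvar, beta=1.0):
        # reconstruction term: how well the decoder reproduces the input tokens
        recon = F.cross_entropy(logits.view(-1, logits.size(-1)),
                                targets.view(-1), reduction="sum")
        # KL term: how closely the posterior q(z|x) hugs the N(0, I) prior;
        # raising beta trades reconstruction fidelity for a smoother latent space
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + beta * kl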

picking a random point in the latent space, then doing greedy decoding from various randomly selected nearby points (essentially generating variations on a line)—
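
(roughly what that procedure looks like as code, sketched under assumptions: "decoder" stands in for a trained decoder network that takes the tokens so far plus a latent vector, and latent_dim and the jitter scale are made up:)

    import torch

    latent_dim = 128  # assumed size of the latent vector

    def greedy_decode(decoder, z, max_len=20, bos=1, eos=2):
        # repeatedly commit to the single most probable next token (no sampling)
        tokens = [bos]
        for _ in range(max_len):
            logits = decoder(torch.tensor([tokens]), z)  # assumed decoder API
            next_tok = logits[0, -1].argmax().item()
            tokens.append(next_tok)
            if next_tok == eos:
                break
        return tokens

    z = torch.randn(latent_dim)  # one random point in the latent space
    for _ in range(12):
        # nudge the point a little; nearby points decode to variations on a line
        print(greedy_decode(decoder, z + 0.05 * torch.randn(latent_dim)))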

@halcy there's a bit more structural variety with beam search (a sketch of the decoding loop follows these lines), e.g.:

And rush'd from the city of his son,
And rush'd from the city of his son,
And rush'd from the palace of his son,
And rubs on his former tower,
And tossed with a hundred pounds,
And tossed with a hundred years,
Plucked with a hundred years,
Lambs on his country's command,
Drown'd by a country's throne,
Raging from the city of his father's throne,
And rear'd by the chief's son,
And rush'd from the city of his son,
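
(for reference, a generic beam search over the same hypothetical decoder as the greedy sketch above: instead of committing to the single best token at each step, it keeps the few highest-scoring partial lines and expands each of them, which is where the extra variety comes from:)

    import torch

    def beam_search(decoder, z, beam_width=5, max_len=20, bos=1, eos=2):
        # each hypothesis is a (token list, cumulative log-probability) pair
        beams = [([bos], 0.0)]
        for _ in range(max_len):
            candidates = []
            for tokens, score in beams:
                if tokens[-1] == eos:
                    candidates.append((tokens, score))  # finished, keep as-is
                    continue
                logits = decoder(torch.tensor([tokens]), z)  # assumed decoder API
                logprobs = torch.log_softmax(logits[0, -1], dim=-1)
                top = logprobs.topk(beam_width)
                for lp, tok in zip(top.values, top.indices):
                    candidates.append((tokens + [tok.item()], score + lp.item()))
            # prune back down to the best few hypotheses
            beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        return beams[0][0]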

@halcy not bad:

The sun is passing by one,
The living stream is one,
The living stream is one,
The living creature of a man,
The passing flag of a creature of man,
The passing foot of beauty is a foe,
From out of footsteps of a man of man,
From him of warlike and a man's eyes,
From heaven's mighty hand of the foe,
From heaven of footsteps of a foe,
The sunbeam is a foe,
The sun is passing by one,

@halcy I'm trying to work through it because it is a cool idea. what are latents_a, latents_b, latents_c in that code?

@halcy yeah that was the whole point of training this model and it's working GREAT, here's a greedy decoding of a linear interpolation between two random samples (a sketch of the interpolation follows these lines):

Sorrow than more, merrily
Sorrow of human fancy, unto aid
Sorrow of beauty, never beheld me
Good things of beauty, never slumbering
Good things of love! No longer to show
And sweet of love! No longer to behold
And love the world is lovely, my soul
And hear the world of beauty, never to me!
And thou art thou art thou, my soul!
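
(the interpolation itself is just a straight line between two latent vectors; a sketch reusing the hypothetical greedy_decode and decoder from the sketch further up:)

    import torch

    latent_dim = 128  # assumed, as above
    z_a, z_b = torch.randn(latent_dim), torch.randn(latent_dim)  # two random samples

    # walk a straight line through the latent space from z_a to z_b,
    # greedily decoding one line of poetry at each step
    for t in torch.linspace(0, 1, steps=9):
        print(greedy_decode(decoder, (1 - t) * z_a + t * z_b))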

(for reference I've never managed to train a sequence VAE before where greedy decoding of samples did anything but produce stuff like "the heart of the love the heart of hearts the love heart")

greedy decoding from random samples at epoch 12, just at the right point before the model started to collapse. these are actually sorta... breathtaking?

(you can see the subword embeddings at work here—it's learning "diverged" as "diver"+"ged" I think, hence the made-up words "freged" and "alterged" in the interpolation)

this model (after only 3 epochs!) is also much better at generating grammatical & semantically-appropriate interpolations between lines:

Two roads diverged in a yellow wood,
Two roads freged in a yellow wood,
Two roads alterged in a yellow wood,
As are lodged with a green tower,
As we weigh them in a trees street,
And which alter them of the midnight.
And that has made the music of.
And that has made all the smallest.
And that has made all the difference.

(the variational autoencoder works by squeezing down sequences of arbitrary length to fixed-length vectors, then trying to reconstruct the sequences on the other side—this is maybe a bit TOO good at reconstructing the input, or at least guessing semantically similar words—I might retrain with a smaller latent vector!)
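
(a sketch of the "squeezing down" half, assuming an LSTM encoder in PyTorch; the sizes and names here are invented, not the model's actual ones. the encoder reads a sequence of any length, keeps only its final hidden state, and samples a fixed-length latent vector from it via the reparameterization trick:)

    import torch
    import torch.nn as nn

    class SeqVAEEncoder(nn.Module):
        def __init__(self, vocab_size=10000, emb_dim=256, hidden=512, latent_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True)
            self.to_mu = nn.Linear(hidden, latent_dim)
            self.to_logvar = nn.Linear(hidden, latent_dim)

        def forward(self, tokens):
            # run the whole sequence through the LSTM; keep only the final hidden state
            _, (h, _) = self.rnn(self.embed(tokens))
            mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
            # reparameterization trick: a differentiable sample from q(z|x)
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            return z, mu, logvar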

(uh, this is that smash mouth song if you didn't guess)

now training a variational autoencoder neural network on the gutenberg poetry corpus but with pre-trained subword embeddings—it's MUCH better at reconstructing inputs now—here's a reconstruction of a little number you might know
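
(for context: "pre-trained subword embeddings" means the token vectors come from a byte-pair-encoding vocabulary learned elsewhere, so rare words get split into reusable pieces. a sketch of one way to get them, using the bpemb package, which is an assumption; the notebook may do this differently:)

    from bpemb import BPEmb
    import torch
    import torch.nn as nn

    # bpemb ships BPE subword vocabularies with embeddings pre-trained on wikipedia;
    # vs is the vocabulary size, dim the embedding dimension
    bpemb = BPEmb(lang="en", vs=10000, dim=100)
    print(bpemb.encode("diverged"))  # shows how one word splits into subword pieces

    # seed the model's embedding layer with the pre-trained vectors
    embed = nn.Embedding.from_pretrained(torch.tensor(bpemb.vectors))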
