training a neural network to "caption" summary vectors of lines of poetry, then using the network to generate text for summary vectors of arbitrary sentences. this is after seven epochs of training on my laptop last night; source line is first, output is second

the goal here was to be able to put in the vector for (e.g.) "dog" & get back a line about dogs. but it's learning the punctuation and the length of the lines, so putting in single words yields stuff like

abacus ➝ Beginabliny
allison ➝ It is is is is is is ine is ineay
cheese ➝ Great occhanting seaw
daring ➝ The left the lonious courtina
mastodon ➝ shorn born born borner
parrish ➝ the oh
purple ➝ Greath green green green
trousers ➝ To blenting my blank
whoops ➝ Aaann aaas! aaan aaas!
zoo ➝ T
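(for anyone curious about the setup: the original post doesn't include code, but the general shape of "caption a summary vector" is a decoder RNN whose hidden state is initialized from the sentence vector, then trained to emit the line character by character. a minimal sketch in PyTorch, with all names and dimensions being my own assumptions, not the actual model:)

```python
import torch
import torch.nn as nn

class VectorCaptioner(nn.Module):
    """Hypothetical sketch: a GRU decoder conditioned on a fixed
    summary vector. The vector is projected to the initial hidden
    state; characters are then generated one step at a time."""

    def __init__(self, vocab_size, vec_dim, hidden_dim=128, embed_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # project the summary vector into the decoder's hidden space
        self.init_h = nn.Linear(vec_dim, hidden_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, vec, tokens):
        # vec: (batch, vec_dim); tokens: (batch, seq_len) of char ids
        h0 = torch.tanh(self.init_h(vec)).unsqueeze(0)  # (1, batch, hidden)
        emb = self.embed(tokens)                        # (batch, seq, embed)
        out, _ = self.gru(emb, h0)
        return self.out(out)                            # per-step logits

    @torch.no_grad()
    def generate(self, vec, start_id, end_id, max_len=40):
        # greedy decoding from a single summary vector
        h = torch.tanh(self.init_h(vec)).unsqueeze(0)
        tok = torch.tensor([[start_id]])
        ids = []
        for _ in range(max_len):
            emb = self.embed(tok)
            out, h = self.gru(emb, h)
            tok = self.out(out[:, -1]).argmax(-1, keepdim=True)
            if tok.item() == end_id:
                break
            ids.append(tok.item())
        return ids
```

(an untrained model like this produces exactly the kind of repetitive near-words shown above; the punctuation/length mimicry the post describes comes from the decoder fitting surface statistics of the training lines before it picks up anything about the vector.)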


@Ranjit I just ran it to find out and

dogs ➝ earnest heed

which is...!! on a different inference it generated "ear ee" which is also appropriate

@aparrish as a dog partisan I’m hoping cats get something less sweet while feeling slightly guilty for hoping that

@Ranjit right now it generates

cats ➝ eers
kitten ➝ Lose lo lord

so... no idea

@aparrish happy to see both cats and dogs getting eers 🐶🐱
