today in machine learning 

instead of figuring out how to do fancy tricks with keras masking and tensorflow indexing, I just wrote a function to feed in batches where all of the data have the same number of timesteps. seems to work fine
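roughly, the idea is something like this. a minimal sketch, assuming the data is a list of (sequence, target) pairs where each sequence is a numpy array of shape (timesteps, features); the name batches_by_length is made up, not from the actual code:

    import numpy as np
    from collections import defaultdict

    def batches_by_length(pairs, batch_size):
        """Yield (x, y) batches where every sequence has the same number
        of timesteps, so no masking or padding is needed."""
        # bucket the examples by how many timesteps they have
        buckets = defaultdict(list)
        for seq, target in pairs:
            buckets[len(seq)].append((seq, target))
        # emit fixed-size batches from each bucket; everything stacks cleanly
        for same_length in buckets.values():
            for i in range(0, len(same_length), batch_size):
                chunk = same_length[i:i + batch_size]
                yield (np.stack([s for s, _ in chunk]),
                       np.stack([t for _, t in chunk]))

each batch then has one fixed shape like (batch, timesteps, features), which you can feed straight to train_on_batch without any masking layer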

today in machine learning 

... then spent like six hours debugging my decoding function, which was always returning sequences of length 20... it was because I'd hardcoded "20" as the length of the time series in the vectorization function (which worked fine, of course, when all of the training data was padded to the same length)
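a hypothetical reconstruction of the bug, not the actual code: if the vectorizer bakes the padded training length into its default, everything downstream is stuck at that many timesteps no matter what you feed it:

    import numpy as np

    CHARS = "abcdefghijklmnopqrstuvwxyz "
    CHAR_TO_IDX = {c: i for i, c in enumerate(CHARS)}

    def vectorize(text, maxlen=20):  # buggy: 20 was the padded training length
        """One-hot encode text into a (1, maxlen, vocab) array."""
        x = np.zeros((1, maxlen, len(CHARS)), dtype=np.float32)
        for t, ch in enumerate(text[:maxlen]):
            x[0, t, CHAR_TO_IDX[ch]] = 1.0
        return x

    def vectorize_fixed(text, maxlen=None):
        """Same thing, but the window defaults to the input's own length."""
        if maxlen is None:
            maxlen = len(text)
        x = np.zeros((1, maxlen, len(CHARS)), dtype=np.float32)
        for t, ch in enumerate(text[:maxlen]):
            x[0, t, CHAR_TO_IDX[ch]] = 1.0
        return x

the buggy version is invisible during training, since every padded example really is 20 timesteps long; it only bites once the sequences vary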

today in machine learning 

in other circumstances (i.e. when I'm doing it on purpose), a character RNN decoding after the stop token is one of my favorite things, it's just like "oh uh you want me to keep going? okay, um, ingisticanestionally...?"
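doing it on purpose looks roughly like this. a minimal sketch, assuming a Keras-style model whose predict() returns per-timestep character distributions; the stop token, the decode() helper, and the vectorizer argument are all made-up names:

    import numpy as np

    STOP = "\n"  # hypothetical stop token

    def decode(model, seed, vectorize, idx_to_char, max_chars=200,
               past_stop=False):
        """Sample one character at a time; halt at the stop token by
        default, or keep going past it (hello, gibberish) if past_stop."""
        out = seed
        for _ in range(max_chars):
            x = vectorize(out, maxlen=len(out))  # the fixed vectorizer above
            probs = model.predict(x, verbose=0)[0, -1]  # next-char distribution
            probs = probs / probs.sum()  # guard against float rounding
            ch = idx_to_char[int(np.random.choice(len(probs), p=probs))]
            if ch == STOP and not past_stop:
                break
            out += ch
        return out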
