highly recommended generative technique: whenever you're selecting an item at random where the items are weighted (e.g. by their frequency in a corpus), normalize the weights and sample from a softmax over their logs, using temperature as a parameter. (at temp=1.0, this is the same as picking by the weights directly; at temps <1.0 it increasingly favors the most heavily weighted items; as temperature increases past 1.0, the sampling approaches a uniform distribution)
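a minimal sketch of what this could look like in Python (the function name `sample_weighted` and the example words are just made up for illustration). taking each weight to the power 1/T is the same as a softmax over the log-weights scaled by 1/T, so at T=1.0 it reduces to ordinary weighted sampling:

```python
import random

def sample_weighted(items, weights, temperature=1.0):
    """Sample one item, sharpening (T < 1) or flattening (T > 1) the weights.

    w ** (1/T) == exp(log(w) / T), i.e. a softmax over the log-weights;
    at T = 1.0 this reduces to sampling by the normalized weights directly.
    """
    scaled = [w ** (1.0 / temperature) for w in weights]
    total = sum(scaled)
    probs = [s / total for s in scaled]
    return random.choices(items, weights=probs, k=1)[0]

# hypothetical frequency-weighted vocabulary
items = ["the", "cat", "ran"]
weights = [10, 3, 1]

# low temperature: the high-probability item dominates
print(sample_weighted(items, weights, temperature=0.2))

# high temperature: nearly uniform, so rarer items show up often
print(sample_weighted(items, weights, temperature=100.0))
```

at very low temperatures you'll see "the" almost every time; at very high ones the three words come out at roughly equal rates.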

here are examples from the thing I'm working on


(in this case, I'm replacing phrases in sentence templates with phrases that match the part of speech of the phrase being replaced. the effect of the temperature change is a little subtle because the replacement operation only happens about half the time, but you can see that with the low-temperature example you get the same high-probability items over and over, and with the high-temperature example you get a lot of weird stuff)
