Pinned post
allison boosted

Lately I've been reading a lot of children's picture books, over and over? I thought "Goodnight Moon" was pretty spooky, but I had trouble finding anyone writing about that online. @redoak jokingly suggested that I become the conspiracy theorist blogger I want to see in the world, so... I did it. Here's a totally serious take on why "Goodnight Moon" is an esoteric text, from me, a serious scholar of esotericism (aka podcast listener): https://pseudony.ms/blags/goodnight-nobody.html

logit biasing, markov chain style. here I'm doing it with phonetics—basically I check the possible outcomes for each context, and then artificially boost the probability of predictions that have certain phonetic characteristics. (in this case, more /k/ and /b/ sounds)
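
in sketch form, that's something like this (a word-level toy version rather than the real code; the model here is just a dict mapping contexts to next-word counts, and the pronouncing library stands in for whatever phonetic lookup you'd actually use; has_target_phone and biased_next are made-up names):

import random
import pronouncing  # CMU pronouncing dictionary lookups

def has_target_phone(word, targets=("K", "B")):
    # does any pronunciation of this word contain a /k/ or /b/ phone?
    for phones in pronouncing.phones_for_word(word.lower()):
        if any(p.rstrip("012") in targets for p in phones.split()):
            return True
    return False

def biased_next(model, context, boost=3.0):
    # model maps an ngram context (a tuple of words) to a dict of next-word counts;
    # candidates with the target sounds get their counts multiplied before sampling
    counts = model[context]
    words = list(counts)
    weights = [counts[w] * (boost if has_target_phone(w) else 1.0) for w in words]
    return random.choices(words, weights=weights)[0]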

Show thread

(tomorrow I'm going to see if stealing alternatives from similar ngrams helps... but I am beginning to more viscerally understand why the solution to language modeling that really caught on is just... More Training Data)

Show thread

I like having this extra setting to fiddle with! but based on my limited testing, the temperature doesn't really matter once the ngram gets past a certain length, since most ngrams only have one or two possible continuations. like... with word 3-grams, it's pretty difficult to tell a temperature of 0.35 from a temperature of 2.5
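
(roughly the arithmetic behind that, treating log-counts as the logits; a toy calculation, not the actual code:)

import math

def temperature_probs(counts, t):
    # softmax over log-counts: the same as renormalizing p ** (1 / t)
    logits = [math.log(c) / t for c in counts]
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [round(e / total, 3) for e in exps]

print(temperature_probs([7], 0.35), temperature_probs([7], 2.5))
# -> [1.0] [1.0]  (one continuation: temperature changes nothing)
print(temperature_probs([5, 1], 0.35), temperature_probs([5, 1], 2.5))
# -> [0.99, 0.01] [0.656, 0.344]  (two continuations: different on paper, hard to tell apart on the page)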

Show thread

generating with a markov chain using softmax sampling w/temperature (a la neural networks). this is an order 3 character model, and you can really see the difference between low temperature (instantly starts repeating itself) and high temperature (draws from wacky corners of the distribution) (if you've generated text with a markov chain before, it's probably using what amounts to a temperature of 1.0)
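
(the setup, more or less; a simplified sketch of the idea rather than my actual code, with made-up names:)

import math
import random
from collections import Counter, defaultdict

def train(text, order=3):
    # order-3 character model: each 3-character context maps to a Counter of next characters
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def sample(counter, temperature=1.0):
    # softmax over log-counts; temperature 1.0 reproduces plain count-proportional sampling
    chars = list(counter)
    logits = [math.log(counter[c]) / temperature for c in chars]
    m = max(logits)
    weights = [math.exp(x - m) for x in logits]
    return random.choices(chars, weights=weights)[0]

def generate(model, seed, n=300, order=3, temperature=1.0):
    out = seed
    for _ in range(n):
        context = out[-order:]
        if context not in model:
            break  # a context that only ever appears at the very end of the training text
        out += sample(model[context], temperature)
    return out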

Show thread
allison boosted

The new issue of Bad Quarto's literary magazine is out! Taper #6 offers 26 computational poems, none larger than 2KB, from 23 authors

https://taper.badquar.to/6/

Taper #6 is thanks to Kyle Booten, Angela Chang, Leonardo Flores, Judy Heflin, and Milton Läufer. This editorial collective determined the theme, selected poems, worked with authors, and did other editorial and production work

All poems are free software

here it is working on an oov ngram ("you ate books" is not an ngram that appears in Frankenstein; all of this is trained on Frankenstein, which I guess I forgot to mention)

Show thread

another way to find similar ngram contexts: each context has an embedding derived from the sum of positional encoding (they're not just for transformers!) multiplied by "word vectors" (actually just truncated SVD of the transpose of the context matrix). then load 'em up in a nearest neighbor index

(this is cool because I can use it even on ngrams that *don't* occur in the source text, though all of the words themselves need to be in the vocabulary)
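
(in sketch form, with made-up names: context_matrix is the contexts-by-next-word count matrix with columns in vocab order, contexts is its row labels, and sklearn stands in for whatever SVD / nearest-neighbor tools you'd actually reach for:)

import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.neighbors import NearestNeighbors

def positional_encoding(n_positions, dims):
    # the usual sinusoidal positional encoding (not just for transformers!)
    pos = np.arange(n_positions)[:, np.newaxis]
    i = np.arange(dims)[np.newaxis, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / dims)
    enc = np.zeros((n_positions, dims))
    enc[:, 0::2] = np.sin(angles[:, 0::2])
    enc[:, 1::2] = np.cos(angles[:, 1::2])
    return enc

def build_context_index(context_matrix, vocab, contexts, dims=64, order=3):
    # "word vectors": truncated SVD of the transpose of the contexts-by-next-word count matrix
    word_vecs = TruncatedSVD(n_components=dims).fit_transform(context_matrix.T)
    pe = positional_encoding(order, dims)
    index = {w: i for i, w in enumerate(vocab)}

    def embed(ngram):
        # sum over positions of (positional encoding * word vector), elementwise;
        # works for any ngram whose words are in the vocabulary, whether or not it was ever seen
        return sum(pe[pos] * word_vecs[index[w]] for pos, w in enumerate(ngram))

    nn = NearestNeighbors(n_neighbors=10).fit(np.array([embed(c) for c in contexts]))
    return embed, nn

(querying is then just nn.kneighbors([embed(some_ngram)]), which hands back the nearest known contexts)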

Show thread

poking at the edges of markov chain text generation... here I'm using truncated SVD to find similar ngrams, based on the tokens that follow them. (the goal is to add variety to the generation process by plucking possible next tokens from those following similar ngrams)
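
(a sketch of that step, with made-up names: model maps each ngram context to a Counter of next tokens, and sklearn's TruncatedSVD / cosine_similarity stand in for the details:)

import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def similar_contexts(model, vocab, n_components=64, k=5):
    # one row per context, one column per vocabulary word, counting which tokens follow it
    contexts = list(model)
    word_index = {w: i for i, w in enumerate(vocab)}
    counts = np.zeros((len(contexts), len(vocab)))
    for row, ctx in enumerate(contexts):
        for word, c in model[ctx].items():
            counts[row, word_index[word]] = c
    # a low-dimensional vector for each context, based only on its followers
    vecs = TruncatedSVD(n_components=n_components).fit_transform(counts)
    sims = cosine_similarity(vecs)
    # for each context, the k most similar *other* contexts
    return {ctx: [contexts[j] for j in np.argsort(-sims[i])[1:k + 1]]
            for i, ctx in enumerate(contexts)}

(during generation, the next token can then sometimes be plucked from the pooled counts of a context's neighbors instead of its own)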

allison boosted

Wow, this is a cool little experiment that maps a word-vector space to a text adventure space that you walk around in.

spinfoam-games.itch.io/rainbow

allison boosted

Anyway, large language models (LLMs, like GPT-3) are one of the actual new technologies that technology corporations are racing to get out to market so fast that they've had to sideline and censor all the pesky ethicists and scientists who keep getting in the way by pointing out the litany of actual harms caused by LLMs (discrimination and segregation, wide scale disinformation, environmental impacts of excess computation).

https://www.technologyreview.com/2021/05/20/1025135/ai-large-language-models-bigscience-project

The upsides of LLMs to surveillance capitalism are too high to let social good get in the way of their inevitable production.

allison boosted

In Strange Horizons, Kelly Jennings calls Situation Normal "a hilarious, deeply moving, fast-paced yarn that catches hold of its reader and never lets go."

http://strangehorizons.com/non-fiction/situation-normal-by-leonard-richardson/

allison boosted

someday I should develop a poetics where the success condition is something other than "yeahhh now it's giving me a good headache" but... today is not that day

Show thread
allison boosted

The Small File Media Festival works in defense of the tiny image. Size matters, and small is better, tiny is best, which is not merely to argue for a different aesthetics or narrative structures but also for an understanding that all media is media ecology – and as such, directly related to infrastructures with environmental costs.

– Jussi Parikka

https://smallfile.ca/

allison boosted

Food-related project 

I made a little site that suggests a random sandwich for you to eat from Wikipedia's list of notable sandwiches.

tinysubversions.com/stuff/sand

(I ended up doing all the front-end programming myself, which was fun, especially since I haven't done front-end dev since the jquery days... the machine learning part of the javascript was basically 100% done months and months ago and I spent the rest of the time neck-deep in MDN and questions about Svelte on stackoverflow, haha)

Show thread