as someone who grew up in that magical and apparently historically aberrant period when mostly standalone, usually un-networked personal computers were the norm, I'm always weirded out by the term "edge computing." e.g. desktop computers are hardly ever mentioned as "edge" devices in this context, but most of them effectively are (lacking the GPUs necessary to train ML models). in this metaphor, using your own device to do "computing" is now literally an "edge" case

(likewise, initiatives like uTensor, TensorFlow Lite, etc. seem to almost always be about easing "deployment" to devices in order to make "inference" faster—they're never about making it faster/easier to train your own models. anecdotally it feels like over the past 3–4 years, people have stopped even pretending that training an ML model can be done by someone without specialized training and specialized hardware—"democratizing" demos/tutorials are all about using pre-trained models now)


(I need to develop this argument more and find actual evidence to back up words like "never" and "all." but I do really feel like this technology in particular is in the process of being "locked up" in very material ways)

@aparrish this is an entire area i've kind of… avoided thinking about, if i'm honest, which isn't terribly professional of me. but this rings true.

the complexity ratchet that drives a lot of the centralization of services / infrastructure / platforms seems like it's accelerated by giant blobs of ML that're (in practice) the domain of massively resourced teams at megacorps with unfettered access to data and effectively infinite hardware budgets.

@aparrish I wonder about this too but I take some solace in the fact that most ML code I see seems so brittle that it would surprise me if it still runs 6 months later.

@KnowPresent lately I've felt like the intense rate of change in the field is—whether intentional or not—engineered specifically to encourage centralization of resources and technology. ("can't figure out how to install the latest versions? just buy cloud tpu time from us instead!")

@aparrish Yeah I feel like this is another extension of the DevOps/CI/CD/Containerization philosophy which seems to be greatly motivated by a fear of being too close to hardware.

@KnowPresent @aparrish

adding yet another layer of abstraction always makes things better!

@aparrish @KnowPresent this has been stated to me directly as being the goal by two different big cloud providers

@aparrish hoping it’s ok if i bug you about this exact topic as i’m literally on day one of switching over to ML. (and come from the comparatively egalitarian land of web dev)

Some folks *were* trying to push for thin clients in the period from the late 80s to ~2000 -- notably Sun. I don't think it caught on until relatively high-speed wifi became common. Netbooks & later chromebooks are probably the harbingers of the transition back to thin clients (now over web instead of X forwarding).

@enkiv2 @aparrish
I think there's a philosophical reason that folks stuck to it as long as they did, though. Or, a cultural one, at least. Over at PARC there was this association between time sharing & government-funded top-down-organized networks, & PARC did a lot to promote the idea that 'personal computing = personal freedom', which caught on with the computer hobbyist crowd.

@enkiv2 @aparrish
Apparently this caused a bunch of friction with PLATO & with the Dartmouth folks -- i.e., time-sharing folks who nevertheless were very interested in computing-for-the-people & accessibility. Centralized control really *was* a problem (& still is to some degree) but it was a long time before standalone machines could get as friendly as PLATO.

@enkiv2 @aparrish The Friendly Orange Glow talks about some of the PLATO folks talking to Alan Kay and deciding he was out of his mind for pushing for personal computers. I'm pretty sure at least some of them thought he was literally insane.

@freakazoid @aparrish @enkiv2
That book is one of my sources for this. Highly recommend it. (I recommend A People's History of Computing to a substantially lesser degree, unfortunately: it has a lot of good information that was unfamiliar to me, but desperately needed someone to reorganize the macro-level structure...)

@enkiv2 @aparrish
In that era, it seems like personal computing started turning into solitary computing (because no time sharing effectively meant no networking -- or only rare networking, if you had BBS access).

@enkiv2 @aparrish imagine if home computing had taken a turn towards timesharing services

@enkiv2 I feel like the current situation is the worst of all possible worlds—where we're essentially just all carrying around disposable subsidized supercomputers that we don't have control over but that big corporations can use to track us, pester us, and temporarily offload their "compute" onto when it's convenient for them

@aparrish @enkiv2
Absolutely! We're using overpowered devices as thin clients, and then they don't perform properly, *and* we can't fix them.

@aparrish @enkiv2 is it bad that the first thing that comes to mind when I learn about the new compute modules in modern phones is "oh God that's going to be used for advertisement tracking, isn't it?"


@aparrish Oh, that's a beautifully concise description of our wretched state of affairs.

@enkiv2 @aparrish there's a weird conflation of concerns here. Personal management originally came by virtue of owning a computer, but that management has stayed fairly stunted at a single device. We have either other people's ever more pervasive cloud or our personal devices.

@enkiv2 @aparrish
I truly think one of the most basic, personally liberating computing technologies is the thin client, for our personal systems. Not a month goes by that I don't wish Miracast had persisted & flourished, enabling us to work across personal devices.

& we still lack good ways to cohesively run our personal device*S*, plural.

@enkiv2 @aparrish
Small point, but wrt thin client tech, being able to cast linux to any web destination looks like it's going to be fast, easy, & good real soon. which is cool for linux at least. but "the network is the computer" means & takes more.

If anything I think the big challenge of AI is wading through the information glut of open amazing frameworks, tools, papers, models, &c offerings out there to know what to use. The gap to adoption is that so few know what to do when so much is dropped in front of them.

@jauntywunderkind420 true, it's just that I wonder whether or not that glut itself is a (perhaps unintentional) tool for enforcing centralization

@aparrish a significant reason not to deploy and run the "ML bits" of an application locally is that, depending on how a trained model is applied, it may be important to keep it behind lock and key (e.g., behind a rate-limited, login-gated API) so that bad actors can't reappropriate it for their own use or probe for weaknesses as easily. Until what we're calling machine learning gets a lot more robust, that's going to continue to be an issue.
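A minimal sketch of the kind of gate described above (all names and limits here are hypothetical, not any particular provider's API): the trained model stays server-side, and callers only ever get a key-checked, rate-limited `predict()` call, which slows down model extraction and weakness-probing.

```python
import time

class GatedModel:
    """Hypothetical sketch: keep a trained model 'behind lock and key'
    by gating every call on an API key and a per-key rate limit."""

    def __init__(self, model_fn, api_keys, max_calls=5, window=60.0):
        self._model_fn = model_fn        # the trained model, never shipped to clients
        self._api_keys = set(api_keys)   # login gate: only known keys may call
        self._max_calls = max_calls      # calls allowed per key per window
        self._window = window            # rate-limit window, in seconds
        self._history = {}               # api_key -> timestamps of recent calls

    def predict(self, api_key, inputs):
        if api_key not in self._api_keys:
            raise PermissionError("unknown API key")
        now = time.monotonic()
        # keep only calls that happened within the current window
        recent = [t for t in self._history.get(api_key, []) if now - t < self._window]
        if len(recent) >= self._max_calls:
            raise RuntimeError("rate limit exceeded")
        recent.append(now)
        self._history[api_key] = recent
        return self._model_fn(inputs)
```

Usage is just `gate.predict(key, inputs)`; the point of the design is that the model function itself is unreachable except through the gate, so adversaries can't freely batch millions of queries to clone or probe it.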
