A well-argued piece from @slightlyoff as to why we need to bring more capability to the web, and why the privacy concerns brought up by Mozilla and Apple are kind of a red herring.

infrequently.org/2020/06/platf

My brief time in W3C-land was filled with people yelling 'but fingerprinting!' any time someone proposed something useful. (Fingerprinting is the idea that the more features your browser exposes, the easier it is for a third party to track you as an individual collection of features.)
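To make the "collection of features" idea concrete, here is a minimal, illustrative sketch of how fingerprinting works: each exposed feature adds entropy, and combining them yields a value that narrows a visitor down to (nearly) one person. In a real browser the inputs would come from `navigator`, `screen`, canvas rendering, and so on; here they are hard-coded stand-ins so the example is self-contained, and the hash is a toy FNV-1a, not what real trackers use.

```javascript
// Illustrative fingerprinting sketch: combine exposed browser features
// into one identifier. The feature values below are hard-coded stand-ins
// for what a script would read from navigator, screen, etc.
function fingerprint(features) {
  const serialized = Object.entries(features)
    .map(([key, value]) => `${key}=${value}`)
    .join(";");
  let hash = 0x811c9dc5; // FNV-1a offset basis
  for (const ch of serialized) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 0x01000193) >>> 0; // FNV prime, kept to 32 bits
  }
  return hash.toString(16);
}

const features = {
  userAgent: "Mozilla/5.0 (X11; Linux x86_64)", // navigator.userAgent
  language: "en-US",                            // navigator.language
  screen: "1920x1080",                          // screen.width x screen.height
  timezone: "America/New_York",                 // Intl.DateTimeFormat().resolvedOptions().timeZone
  fontCount: 312,                               // e.g. enumerable local fonts
};
console.log(fingerprint(features)); // a short hex ID derived from the feature set
```

The point of the sketch: any new API that exposes another distinguishing value is another key in that `features` object, which is why "but fingerprinting!" gets raised against nearly every proposal.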

Apple's "structural under-investment" (as Alex puts it) in the browser is the main reason why iOS users on my instance have to resort to third-party apps to read posts on this server (no native notifications). This results in them having to use code that is authored by strangers instead of code that is maintained and run by me, their trusted admin. (Yes I suppose I could write an iOS app but it would take me a few years to learn to do that.)


Maybe I'm putting words in Alex's mouth, but the article more or less matches my thoughts at the time I was doing W3C stuff (2011): we are already fingerprinted and tracked to death. Alex points out that simply not using Tor means you are fully trackable no matter what other protections are in place. Since we are already living in this fully-tracked world, why not actually enhance browser features that will lead to better security along OTHER vectors like the one in my above post?


theorizing, not fully sold on this claim but chewing on it 

Perhaps our obsession with third-party trackers is the online equivalent of the focus on "stranger danger": yes, it's worth putting energy into solving, but I think it's low-hanging fruit that ultimately addresses a small amount of online abuse and makes for good PR. Meanwhile it tends to suck up resources, leaving not much left over for work on intimate/targeted attacks.


@darius yes, this! we're running an interview with sarah t hamid of the carceral tech resistance network for our next logic issue, and one of the *snaps* things she talks about is how the threat of surveillance capitalism has shifted the focus of surveillance critique to "oh creepy" instead of looking at the specific threats posed to specific communities by surveillance. this seems like a concrete example of that—focusing on privacy broadly, as opposed to the impact of that lack of privacy specifically

@christa Right, it's like, sure it's important to care about third-party trackers, but maybe put similar energy into thinking about cases like spyware installed by a loved one. It kind of mirrors the issues around abuse prevention irl, where people are incredibly worried about abuse/assault from strangers when most abuse/assault comes from literally inside the house.

@christa @darius what bothers me the most is that proprietary surveillance-capitalist video streaming services are most people's first introduction to video streaming and they don't look further. I'm so fucking sick of corporate platforms. Someone's gotta make the privacy-respecting free open source stuff more accessible and easier to use. And more popular.

@darius that's why we need "web pages" and "web apps" as separate things

@darius Don't forget that doing an iOS app means you pay Apple money each year for the privilege of adding value to their proprietary walled garden.

@darius I observed the same thing in the W3C Web Perf working group. At first I took it as "Wow, Apple/Mozilla really care about privacy," but later on I wondered if it was just an easy excuse not to invest. (To be fair, Google has way more browser devs than Mozilla and Apple do; they can't match the pace.)

Unless we revisit canvas, geolocation, hardware acceleration, local storage, and other APIs that have been there since forever, it's hard to argue against new APIs re:fingerprinting.

@nolan @darius Apple can hire infinite developers and task them to any damned thing they please. Capacity problems there must be understood as an explicit choice.

@nolan @darius also, my claim isn't that we should give up because we have X, Y, and Z that are already bad; it's that remediating those will also yield solutions for (or new constraints on) the new stuff that are nearly identical. The costs *look* linear, but they aren't.

@nolan @darius: to give you an explicit example, consider local font enumeration. We all want to turn off the organic variant of this via CSS, but design tools *need* access to this, even though it can persistently re-identify users, even across cache clearing (etc.). The question here isn't "should this be possible", but rather "how often?" and "with what friction?"

@nolan @darius: "solutions" like "don't provide this" only make the market for insecure software larger. Many good-sounding frictionful UIs also fail at reidentification risk *specifically*. It's an area where remediation will involve turning things off that don't involve consent *and* inventing new surfaces for managing and communicating the risks. Doing half the job isn't solving the problem, it's externalising costs onto users.

@slightlyoff Right, this is where Privacy Budget comes in, I imagine? Seems like a more holistic approach to the problem.

@nolan yep, but even without that, you can invent lots of other ways to coarsen the granularity that don't involve not implementing APIs; e.g. per-site opt-in to storage partitioning to unlock access, requirements around installation, etc. etc.
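A hypothetical sketch of what "coarsening the granularity" could look like, using font enumeration from the earlier example: the API answers coarsely by default and only returns full-fidelity data after an explicit per-site grant. The function name, grant states, and bucket thresholds here are all invented for illustration, not any shipping browser behavior.

```javascript
// Hypothetical policy gate (names and thresholds invented for illustration):
// an API like local font enumeration answers coarsely by default, and only
// returns full-fidelity data once the user has explicitly opted the site in
// (e.g. via installation or a storage-partitioning grant).
function exposedFontCount(realCount, grantState) {
  if (grantState === "granted") {
    return realCount; // full fidelity after explicit opt-in
  }
  // Default: round to a coarse bucket and cap it, so the answer
  // carries far less identifying entropy.
  return Math.min(Math.round(realCount / 100) * 100, 300);
}
```

The design choice being illustrated: the API still exists for the design tools that need it, but the default answer is shared by huge numbers of users, so it's nearly useless for re-identification.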

@nolan when you take the problem space seriously, you sort of get the opposite problem: debilitating numbers of potential solutions. Pushing through, taking a shot, and preparing to be wrong via defensive API design helps. Iterating shows seriousness, hence my disdain for unserious pushback dressed up in thoughtful garb.

Friend Camp
