The Facebook whistleblower's testimony was refreshing and frightening by turns, revealing the company's awful internal culture, where product design decisions that benefited its users were sidelined if they were bad for its shareholders.

The big question now is, what do we do about it? The whistleblower, Frances Haugen, rejected the idea that Facebook's power should be diminished; rather, she argued that it should be harnessed - put under the supervision of a new digital regulator.


Regulating tech is a great idea (assuming the regulations are thoughtful and productive, of course), but even if we can agree on what rules tech should follow, there's still a huge debate over how the tech sector should be structured.

Like, should we leave monopolies intact so that we only have to keep track of a few companies to make sure they're following the rules? Or should we smash them up - through breakups, unwinding anticompetitive mergers, and scrutinizing future mergers?



For me, the answer is self-evident: if we don't make Big Tech weak, we'll never bring them to heel. Giant companies can extract "monopoly rents" - huge profits - and cartels can agree on how to spend those profits to subvert regulation.

We need to fix the internet, not the tech giants.



The problem isn't just that Zuck is really bad at being the unelected pope-emperor of the digital lives of 3,000,000,000 people - it's that the job of "pope-emperor of 3,000,000,000 people" should be abolished.

I believe that people who rely on digital tools should have the final say in how those tools serve them. That's the proposition at the core of the "nothing about us without us" movement for accessible tech, and the ethos of Free Software.



Technologists should take reasonable steps to make their products suitable for users, and regulators should step in to ban certain design choices: for example, algorithms that result in racial discrimination in housing, finance and beyond.

The law should step in when sites or apps are deceptive or fraudulent or otherwise harmful; people hurt by negligent security and other choices should have remedies in law, both as private individuals and through their law enforcement officials.



But even if we did all that - and to be clear, we don't - it wouldn't be enough to deliver technological self-determination, the right to decide how the technology you use works.

For example, when the W3C was standardizing EME - a shameful incident in which they created a standard for video DRM - there was a lot of work put into accessibility, including safeguarding closed captions and audio description tracks.



But even the most inclusive design process can't contemplate all of the ways in which users will need to adapt their tools. My friend Jennifer has photosensitive epilepsy and was hospitalized after a strobe effect in a Netflix stream triggered a series of grand mal seizures.

EME could have accommodated that use-case by implementing a lookahead algorithm that checked for upcoming strobes and skipped past them or altered their gamma curves so that they didn't flash.
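To make the idea concrete, here is a minimal sketch of what such a lookahead might look like - this is a hypothetical illustration, not anything the EME committee considered; the function name, the per-frame luminance input, and the thresholds (loosely inspired by the WCAG "three flashes per second" guideline) are all assumptions.

```python
# Hypothetical sketch: given the mean luminance of each decoded frame,
# flag spans where brightness flips rapidly enough to risk triggering
# photosensitive epilepsy. A player could skip these spans or flatten
# their gamma curves before rendering. Thresholds are illustrative.

def detect_strobe_spans(luminance, fps, delta=0.1, max_flashes_per_sec=3):
    """Return (start, end) frame-index spans of suspected strobing.

    luminance: list of per-frame mean luminance values in [0, 1]
    fps: frames per second of the stream
    delta: minimum luminance swing that counts as a "flash"
    """
    # 1. Find frames whose luminance swings sharply from the previous frame.
    flashes = [i for i in range(1, len(luminance))
               if abs(luminance[i] - luminance[i - 1]) >= delta]

    # 2. Slide a one-second window across the flash positions; if more than
    #    max_flashes_per_sec flashes land inside it, mark the span unsafe,
    #    merging overlapping spans as we go.
    window = int(fps)
    spans = []
    for i, start in enumerate(flashes):
        in_window = [f for f in flashes[i:] if f - start < window]
        if len(in_window) > max_flashes_per_sec:
            lo, hi = start, in_window[-1]
            if spans and lo <= spans[-1][1]:
                spans[-1] = (spans[-1][0], max(spans[-1][1], hi))
            else:
                spans.append((lo, hi))
    return spans
```

A player with lookahead access to decoded frames could run this ahead of the playhead and skip or dim the flagged spans - exactly the kind of adaptation the proposal would have permitted.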



The committee rejected this proposal, though.

But that wasn't all they did. They also rejected a proposal to extract a promise from the companies involved in EME's creation to refrain from threatening toolsmiths who added this feature on their own, either to help themselves or on behalf of other users.



The reason such a promise was necessary is that DRM enjoys special legal protection: distributing a tool that bypasses DRM - even to prevent grand mal seizures - can be prosecuted as a felony under Sec 1201 of the DMCA, with 5 years in prison and a $500k fine for a first offense.



The companies making W3C DRM said they didn't need to promise not to destroy the lives of toolsmiths who added accessibility features to their product because they would add every necessary accessibility feature themselves.

Except they wouldn't. They blocked an anti-seizure tool, and Dan Kaminsky's proposal to shift color palettes to compensate for color-blindness, and a proposal for captioning tools to bypass DRM to ingest videos and run multiple rounds of text-to-speech analysis.



Even if they *had* accepted all of this, it wouldn't have been enough. No one can anticipate all the ways that people need to adapt their tools. "Nothing about us without us" can't just mean, "Some people with disabilities helped design this."

It also has to mean, "I, a person using this tool, get a veto over how it works. When my interests conflict with the manufacturer's choices, I win. It's my tool. Nothing about me without me." That's the soul of technological self-determination.



Not only is Zuck a bad custodian of 3b lives, but every company is a bad custodian - or at least, an imperfect one - when it comes to its users' lives. Companies often do good things for their users, but when user interests conflict with shareholder priorities, users lose.

