Installing Android phones in Blackberry chassis - Cory Doctorow

As much as I admire the techlash, I have some serious reservations. I worry that there are some pretty useful tech babies that we are at risk of throwing out with the bathwater.
For starters, there's the idea of "intermediary liability," which is the degree to which online services are held liable for the harms their users inflict on each other. Lots of people want to make Meta, Google and other tech giants liable for their users' actions, such as harassment and disinformation. These people are doubtless well-intentioned, but boy have they failed to pay attention to what happens when we create these liability rules.
Historically, the most important intermediary liability law is Section 230 of the Communications Decency Act. Despite the fact that its key provision is only 26 words long, it is among the most badly understood aspects of tech policy, worldwide:
https://www.techdirt.com/2020/06/23/hello-youve-been-referred-here-because-youre-wrong-about-section-230-communications-decency-act/
CDA 230 says that platforms aren't required to police their users' speech. If a user libels another user, or harasses them, or threatens them, that's between the users, who can sue each other, but not the platform (CDA 230 only relates to civil liability; it has no bearing on the ability of platforms to be held criminally liable for their users' actions).
Importantly, CDA 230 also says that if a platform does intervene to prevent one user from harming another, that doesn't mean they have to intervene in every such case. There's a good historical reason for this: back in the paleolithic era, Prodigy, a commercial online service, was sued after they stepped in to protect some users from other users' bad actions. The suit argued that once they'd set the precedent that they were going to police user conduct, they acquired an obligation to police every instance of bad user conduct. In response, Prodigy and its competitors stopped moderating altogether:
https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prodigy_Services_Co.
No one who's used big online services would say that the CDA 230 world is a great one, but it's provably a vastly better world than the world we get when we take away 230's protections.
Yes, provably.
In 2018, Donald Trump signed SESTA/FOSTA into law. This is a (supposedly) narrow exception to CDA 230 that makes platforms civilly liable when they are used in connection with sex trafficking:
https://decriminalizesex.work/advocacy/sesta-fosta/what-is-sesta-fosta/
Obviously, sex trafficking is a terrible crime (and again, CDA 230 has never affected a platform's criminal liability for sex trafficking, only civil liability). None of the people who spoke out against SESTA/FOSTA did so because they wanted to protect sex traffickers.
Rather, the opposition to SESTA/FOSTA was motivated by concern over the collateral damage that would ensue, and those concerns have been entirely borne out. Opponents of SESTA/FOSTA predicted that platforms would be unable or unwilling to distinguish between consensual sex work and trafficking, and that they would simply sweep all consensual sex work off of their platforms.
That's exactly what happened. Not only did the spaces where sex workers advertised and booked their work disappear, but so did the private "bad date" forums where sex workers helped one another steer clear of dangerous clients. Sex work moved back into the streets, and with it came a revival of pimping, a scourge that had been all but killed off by the use of online platforms by sex workers to find work and stay safe:
https://www.vice.com/en/article/fosta-sesta-sex-work-and-trafficking/
To the extent that sex work survives online, it has been relegated to a few fringe services that have no competitors and exploit their captive audience of sex workers to rake in massive fees for sub-par services. Meanwhile, the forcible relocation of sex work from searchable, visible online spaces to the streets has made it significantly harder for law enforcement to detect and interdict actual sex trafficking:
https://instituteforsheltercare.org/wp-content/uploads/2018/09/After-SESTA-FOSTA.pdf
That's the evidence for what happens when you make intermediaries liable for their users' conduct. Far from being a gift to Big Tech, protections from intermediary liability primarily benefit smaller online spaces, which can't afford the high compliance costs of spying on and controlling their users, unlike, say, Facebook, which is why Mark Zuckerberg wants to get rid of CDA 230:
https://www.nbcnews.com/tech/tech-news/zuckerberg-calls-changes-techs-section-230-protections-rcna486
Every Fediverse host depends on limitations on intermediary liability. So does anyone who hosts one of the new, federated Bluesky relays:
https://whtwnd.com/bnewbold.net/3lo7a2a4qxg2l
SESTA/FOSTA isn't the only experimental evidence we have for what happens when we kill CDA 230-like protections. In the UK, the Online Safety Act imposes a duty on people who provide online speech forums to monitor and police their users' words. The immediate effect of this was to kill off many small business and hobbyist forums. Now, even large, multinational corporations are killing off their forums and relocating them to Facebook, where there's the budget and resources to conduct the surveillance and control required by the Act:
https://mastodon.sdf.org/@monkeyben/114902255326864878
Moving every independent speech forum to Facebook is a funny way of punishing Big Tech. Fundamentally, the lesson here is that we can't fix Big Tech by making it use its power more wisely; the only way to fix Big Tech is to get rid of it, to make it smaller, to take away its power.
That's a lesson we keep missing. Take age verification laws: these require all online forums to exercise total control over their users, because they require platforms to know who a user is, to associate that user with every interaction, and, finally, to verify the user's age. But you can't verify a user's age unless you know which user is at the other end of an online connection. This affects every user, not just kids, because the only way to prove you're an adult is to prove that you're not a kid.
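To see why the identity requirement is unavoidable, consider a minimal sketch of what an age gate has to do (the names and the 18-year threshold here are illustrative, not drawn from any particular statute or platform):

// Illustrative sketch only: an age check cannot run on an anonymous session.
type VerifiedIdentity = { userId: string; dateOfBirth: Date };

function requireAdult(identity: VerifiedIdentity | null): void {
  if (identity === null) {
    // An anonymous user has no date of birth on file, so the gate forces
    // every user, adult or not, to identify themselves first.
    throw new Error("cannot verify age without knowing who the user is");
  }
  const ageInYears =
    (Date.now() - identity.dateOfBirth.getTime()) / (365.25 * 24 * 3600 * 1000);
  if (ageInYears < 18) {
    throw new Error("under-18 users are blocked");
  }
}

However the check is dressed up, the null branch is the whole story: the platform has to know who is on the other end of every connection before it can decide anything about them.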
Age verification and intermediary liability are measures that are diametrically opposed to the mission of making Big Tech weaker. These measures only work if Big Tech stays all-powerful, and they devastate independent online alternatives to Big Tech. What's more, they cut directly against efforts to make it easier for users to leave Big Tech, through interoperable gateways that make it possible for users who depart an online platform to stay in touch with the people who stay behind:
https://www.eff.org/interoperablefacebook
These interoperability mandates figure heavily in modern anti-Big Tech laws like the EU's DMA and DSA, but they cannot peacefully coexist with stricter liability and age verification rules. A platform simply cannot identify, monitor and control users and allow users to leave their platform while maintaining contact with their friends who stay.
These efforts to force Big Tech to behave don't just undermine interoperability mandates, they also kill off "adversarial interoperability," the principle that a user of a technology should be allowed to reverse-engineer and modify it, for example, to block ads or tracking, to sideload apps or extract their data or to monitor a platform's moderation failures:
https://www.eff.org/deeplinks/2019/10/adversarial-interoperability
When Big Tech does adversarial interoperability, they call it "move fast and break things," and that's another baby the techlash stands ready to throw out with the bathwater. There's nothing wrong per se with a technologist changing how a device or service works without permission from its maker. Every ad-blocker does that. So do accountability tools that scrape Facebook to document its failures to police paid political disinformation:
https://pluralistic.net/2021/08/05/comprehensive-sex-ed/#quis-custodiet-ipsos-zuck
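There's nothing exotic about this kind of intervention. As a rough illustration (the selectors and structure here are invented for the example, not taken from any real filter list or extension), an ad-blocking user script is just a small loop that removes page elements the user never asked for:

// Hedged sketch of a content blocker running in the browser.
// The selectors below are placeholders for illustration only.
const blockedSelectors = ["[data-ad-slot]", "iframe[src*='doubleclick']"];

function removeMatches(root: Document): void {
  for (const selector of blockedSelectors) {
    // Delete every element on the page that matches a blocked pattern.
    root.querySelectorAll(selector).forEach((el) => el.remove());
  }
}

// Run once at load, then again whenever the page injects new content.
removeMatches(document);
new MutationObserver(() => removeMatches(document)).observe(document.body, {
  childList: true,
  subtree: true,
});

The point isn't this particular script; it's that the page keeps working for the user, on the user's terms, without the site owner's permission.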
Moving fast and breaking things is fine, depending on whose things you're breaking. For example, I want every Tesla owner to be able to walk into any mechanic's shop and unlock all the subscription features and software upgrades, without paying a dime to Elon Musk:
https://pluralistic.net/2025/03/08/turnabout/#is-fair-play
And I want every person who uses a powered wheelchair to be able to alter its handling characteristics and other digital features without waiting months and paying through the nose to one of two private-equity backed duopolists:
https://www.eff.org/deeplinks/2024/06/disability-rights-are-technology-rights
I want gig workers to be able to mod the apps that hand out their jobs so that they don't get ripped off by their bosses:
https://pluralistic.net/2021/07/08/tuyul-apps/#gojek
Adversarial interoperability means that you and I don't need to convince tech bros to give us what we want: we can just take it from them.
That's important, because if there's one thing that tech companies keep proving, over and over again, it's that they don't give a shit what we want. Think of how they're force-feeding us AI (and how nice it would be to subscribe to a service run by adversarial interoperators who would automatically block every accursed AI popup in every app and service and device you use):
https://www.bloodinthemachine.com/p/how-big-tech-is-force-feeding-us
Or, more prosaically, how much mobile phone design has congealed around a monolithic design that has no room for a clicky little keyboard, something I first saw demoed 23 years ago:
https://memex.craphound.com/2002/03/25/the-danger-hiptop-kicks-azz/
Or even how they stole our 3.5mm headphone jacks:
https://www.fastcompany.com/90270691/i-still-miss-my-headphone-jack-and-i-want-it-back
It turns out that we don't have to take that shit lying down. Like Prometheus, we can steal our clicky keyboards and 3.5mm headphone jacks back from the tech gods. That's exactly what the Q25 Pro does: it's a mobile phone that is built inside the housing of a Research in Motion Blackberry Classic Q20, with a modern processor and camera, and a recent version of Android:
https://linkapus.com/products/q25-pro-full-device
It's a project from Zinwa Technologies, led by a young Chinese hacker named Zinwa who explained the gadget's design in detail on a recent installment of Returning Retro.
Zinwa explains how he grew up with Blackberries (and also Chinese clones of Blackberries) and never learned to enjoy a modern distraction rectangle. So, as all good hackers do when they get an itch, he scratched it. He realized that there was an essentially infinite supply of old Blackberry housings sitting around in drawers or making their slow, inexorable way to an e-waste dump, where they would leach out poisonous ooze forever, and that, rather than spending $200K+ to design a chassis for a new phone, he could just create a motherboard around a modern processor with a recent-model screen, all sized to occupy exactly the same space that the original Q20 board fit in.
The new device supports 4G/LTE networks and Android 13. It has an SD card slot, USB-C, and NFC on board, as well as the classic Blackberry keyboard and, yes, a 3.5mm headphone jack. Zinwa is launching with a small batch of conversion kits for hardware hackers who want to try their hand at a retro-restoration, with fully assembled units to follow.
Now, this isn't for everyone, but there's a huge community of people who are very excited about it indeed:
https://www.techradar.com/pro/the-return-of-the-og-chinese-firm-wants-to-androidify-the-blackberry-classic-and-sell-it-for-usd400-with-passport-and-keyone-to-follow
Mostafa, who sent me a tip about this project, writes:
After using [a Blackberry-like phone] for 3 years now, the form-factor is perfect for healthy phone usage habits. I've found the physical keyboard/small screen combo to be an optimal solution to the problem of having a simultaneously infinitely useful tool/infinitely novel toy in your pocket at all times: maximize the tool factor, minimize the toy. This concept has spawned a rich community around it.
If you want to be a part of that community, you can hang out on their Discord:
https://discord.com/invite/D2P7UqFdXz
The point here isn't merely that Zinwa is doing something very cool that meets the needs of a group of people who Big Tech doesn't give a shit about (though he is doing that): it's that anyone should be able to do this to any technology. That includes Zinwa's Q25: in his interview with Returning Retro, Zinwa waffles a little about whether the Q25 will have an open bootloader, which would allow other hackers to replace the OS with one that's been modded to their heart's delight. Whether or not you get to modify the tech you use to suit you better has nothing to do with whether it came from someone with good or bad intentions; you should have that right, no matter what, because it's your technology and you should be in charge of it.
This is the spirit of small tech: tech that communities bend to suit their needs. Just as CDA 230 primarily benefits small groups who are underserved or abused by Big Tech, the right to change your tech primarily helps marginalized groups. Marginalized groups have always relied on adapting their tech, because their needs rarely get taken into consideration by design teams at tech companies:
https://pluralistic.net/2022/05/19/the-weakest-link/#moms-are-ninjas
The world is full of "outdated" technology that has been replaced with enshittified versions. A robust right to tinker means that we can divert this superior, well-built technology from landfills, by retrofitting it with modern guts that keep it up to date with the good things that have emerged since it was built, while discarding all the garbage that came along with it.
Take the Thinkpad X220, one of the greatest computers ever made:
https://btxx.org/posts/x220/
As Brad at btxx wrote in 2023, the X220 was built like a tank, had every port under the sun, supported compact lightweight batteries and massive external ones, sported one of the greatest keyboards ever to grace a laptop, and had an open bootloader, making it a dream to run Linux on. It was incredibly easy to repair and maintain, too (I once swapped a keyboard on one of these one-handed while holding my infant daughter in my other hand).
I would love to have an X220 with a modern processor, a shit-ton of RAM, and an updated screen. There's no way I'm ever going to build it, but there are probably a couple thousand people like me who would pay, say, $2500 each for these retrofits. For some enterprising hardware hacker, that's a pretty good year's wages, and a project that could launch a reputation and future projects.
Thinkpads went steeply downhill after the X220, so much so that I abandoned them altogether, after more than a decade of annual hardware purchases, switching to the wonderful, repairable Framework:
https://pluralistic.net/2021/09/21/monica-byrne/#think-different
The fact that Lenovo, the current owner of the Thinkpad line, just sucks at making computers is no reason for those X220s to go to the landfill. Someone could and should move fast and break Lenovo.
For more than 20 years, we have tried to make tech better by "holding tech to account," trying to make giant tech companies wield their power more responsibly. This has been a total failure, which has done nothing but strengthen tech companies, making them both too big to jail and too big to care. A better tech future isn't one in which today's tech companies behave better, it's one in which their bad behavior doesn't matter because they no longer have any power over us.
To bring that future into being, we have to take away tech power, not try to direct it in positive ways. We need to design our policy around evacuating tech platforms, not fixing them. We need to encourage moving fast and breaking (Big Tech's) things. The problem with the world isn't that the wrong tech bosses wield vast power over the lives of billions of people; it's that anyone has that power.
https://pluralistic.net/2025/07/23/resto-modding/
Recommended!
