"A future for the i386 architecture" by the Debian Release Team
"Insofar as they still do, we anticipate that the kernel, d-i and images teams will cease to support i386 in the near future."
https://lists.debian.org/debian-devel-announce/2023/12/msg00003.html
When Debian does remove 32-bit support, that will eventually remove GNUinos and antiX-Libre as 32-bit options. But we will still have 32-bit Guix and Hyperbola options, I think.
Hyperbola also comes in 16-bit, 8-bit, tid-bit, wee-bit and hob-bit, I think.
But otherwise, yes, a whole wall of 32-bit options is going to be torn down when the aforementioned "near future for i386" becomes the present.
TinyCore and the Puppies have 32-bit versions, and probably will far into the future. That may be the way to go, ultimately, for ancient equipment, if we can find libre versions.
Don't think Hyperbola supports 16 bit or lower. I hope this is a silly joke. ;)
Hyperbola will likely support 32-bit for a very long time because of its decision to hard-fork OpenBSD.
I hope HyperbolaBSD ends up getting forked and worked on for other implementations as well, and that more people get involved.
If this happens, 32 bit would almost certainly be supported for a very long time.
I still stand behind this: https://jxself.org/32-bit.shtml
Hyperbola devs agree with you on that. I know one specifically who says that 32 bit should be supported for a very long time. I think a lifetime is what he means.
I hope 32 bit won't be dropped from debian though, for your sakes, because then it might be harder to support 32 bit in trisquel, etc.
I don't have a big love for Trisquel, admittedly, but if lots of people are going to use it, then they should have the best possible support in terms of security, privacy, and stability.
Killing 32 bit in debian and ubuntu might hurt that. I think ubuntu did already though.
"I think a lifetime is what he means."
Should a current GNU/Linux distribution support a computer manufactured in the 80s? Many of us were born in the 80s and are still very much alive, so those machines are still far short of "a lifetime" old. At some point, the computer becomes a museum piece. Even if it still works, whoever owns it wants to run an operating system of that time.
And supporting the processor is not enough. The first computer my parents bought, in 1995, was a Pentium 166 (that is the clock rate, in MHz) with 16 MB of EDO RAM and a 1.6-GB HDD: no currently supported GNU/Linux distribution, even one that supports i586, would run on that, and I do not think developers should spend time trying to support such antiquities (which requires owning them, for testing).
Just to be clear: it is ecologically important to have our machines last as long as possible, and software obsolescence is a reason for people to buy new hardware. That is bad: developers should target old machines... but not antique machines nobody uses for actual work.
The dilemma arises when developers cease support for their software on some hardware, despite the fact there are still users relying on it. It might be the case that developers deem the user base too small to matter, yet each user, understandably, believes their needs are significant and should be counted. This situation creates a self-perpetuating cycle. For instance, my ThinkPad X60, the first X60 to be flashed with a 100% free version of coreboot by GNUtoo at LibrePlanet before Libreboot even existed, now sits unused on a shelf. The reason? It's perceived that 'nobody uses it' anymore, leading to discontinued support. Consequently, I've shelved my computer because it lacks ongoing updates and support, which only reinforces the belief that it's not being used. However, I would actively use it if there were continued and supported updates for it. Thus, the original assumption by developers keeps perpetuating itself.
Have you tried Hyperbola on that X60? I've gotten some fairly ancient devices running with Hyperbola.
"This situation creates a self-perpetuating cycle."
The 32-bit frontier is indeed artificially pushing older hardware into undeservedly gathering dust. Many early 64-bit capable machines have lower specs than the X60 but can still run almost anything we want them to, just because they are 64-bit capable.
I have been running Trisquel Mini on an Atom N550 eeepc, which was already bottom-of-the-pit low end when it came out. Last time I checked, it was still running Aramo. This will only be possible as long as enough software projects keep an eye on resources, instead of assuming that everybody upgrades their hardware every other year with the last available power beast.
About a decade ago, I installed Wary Puppy on a third-hand tower powered by a *mobile* AMD sempron 3200+. The desktop had only two buttons - "Internet" and "Poweroff" - so my late father could easily read his favorite newspaper online. Lubuntu had already grown too thick for that.
Yeah it does depend on what people need to do, and I do need more powerful computers at home than the X60 in order to handle the compilation of Linux-libre, which currently means building 105 kernel packages weekly. My goal is to complete this task within a defined timeframe of 24 hours in order to deliver prompt updates to people's package managers. The X60 falls significantly short of this objective; I'm skeptical it could even accomplish the task within a week. To tackle this, I've set up a "compile farm" consisting of 4 Asus KGPE-D16s, with a total of 128 CPU cores, which typically manages to finish the job in about 9 hours. However, my demands on my laptops are minimal - I primarily use them for travel, simply to SSH back to my home machines, which is where all of the real work happens. I like to think of it as a dumb terminal. The X60 can fill that role quite well. :)
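Just to illustrate how that kind of farm-style parallelism might be arranged, here is a minimal Python sketch that fans package builds out over SSH to a handful of build hosts. The host names, the package list, and the "build-one-package" command are hypothetical placeholders, not the actual Linux-libre build scripts.

    # Minimal sketch (not the real setup): distribute package builds
    # across a small pool of build hosts over SSH.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    BUILD_HOSTS = ["kgpe-d16-1", "kgpe-d16-2", "kgpe-d16-3", "kgpe-d16-4"]  # hypothetical host names
    PACKAGES = [f"linux-libre-pkg-{n}" for n in range(1, 106)]              # stand-in for the 105 packages

    def build(job):
        index, package = job
        host = BUILD_HOSTS[index % len(BUILD_HOSTS)]  # round-robin over the farm
        # Placeholder remote command; a real setup would invoke its own build script.
        return subprocess.run(["ssh", host, "build-one-package", package]).returncode

    with ThreadPoolExecutor(max_workers=8) as pool:   # a couple of concurrent jobs per host
        results = list(pool.map(build, enumerate(PACKAGES)))

    print(f"{results.count(0)} of {len(PACKAGES)} builds succeeded")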
I would like to see an eeepc farm trying to compile a hundred linux-libre packages.
Hah! For that experiment, I would require 117 eeePC computers. 105 of these would be allocated for individually building one package each, while the remaining 12 would be used to compress the source code tarballs. Then all of this work could be done at the same time. Please let me know when the hardware has been obtained. :)
That's a fair point. I must have misunderstood. But in any case, for the next decade or two, I think that's the plan at least, possibly much longer, dunno.
But yeah, supporting ancient hardware is rather pointless given how much the web and software keep increasing in size.
Web 3.0 and beyond is another, even stronger, reason.