Low-Power Computing
I'm half-thinking of making a full article on this subject, but I wanted to lay down some preliminary thoughts first before I go all out researching and drafting and artsing and other article things. Articles by themselves take quite a bit of time and care, even if I'm a bit sloppy with proofreading sometimes. I've once or twice considered starting a YouTube channel, but I know that would never work out for me. How do YouTubers do it?
So yeah, there were a couple of Gemini posts and such that got me thinking about sustainable computing.
(External)
was the big one, being literally about that subject, but also to a lesser extent "small internet" observations like
(External)
along with some threads I picked up while writing my new article:
(External)
and the sort of Macintosh emulator development stuff I was working on for a bit.
Basically, for the uninitiated (which is probably nobody on Gemini but eh) the idea of "Solarpunk" is to be an alternative to "Cyberpunk". Cyberpunk is a dystopia where capitalism has destroyed any natural beauty in the world and replaced it with an oppressive system of neon billboards and loud noises. Cool aesthetics, until you realize that there are literally no parks. Good examples of this sorta vibe include Blade Runner, The Fifth Element, and Akira.
Solarpunk, on the other hand, fantasizes about a possible post-capitalist, sustainable future. Probably anarchist/communalist, but more specifically focused on being harmonious and symbiotic with nature: using the power of the wind and the sun as opposed to coal and oil, scaling things down and slowing society to a more comfortable speed, and living a good life instead of chasing endless growth. A good existing example of this is the Green-Sky Trilogy by Zilpha Keatley Snyder. The Obscuritory, a very good blog, took a look at the Commodore 64 adaptation:
(External)
Another good example is Sonic CD, although that isn't as heavy on the world-building. There's the bad cyberpunk future, where Dr. Robotnik has industrialized the entire planet, and the good solarpunk future, where some sustainable industry exists and it's just really pretty and chill ya know?
But, well, computers. I like computers. I think the internet, being able to make social connections that surpass geographic boundaries, is neat! And I like digital art, so much so that getting a drawing tablet has gotten me to draw SO much more because it's significantly cleaner and easier to work with and won't make me an anxious mess about wasting paper. And I like how, in theory, anyone can upload their own videos or songs and circumvent the centralized media networks. (So long as they either upload to YouTube or are fine with no ad revenue, but I digress.) I would have been completely miserable during this whole quarantine thing without my internet friends. And heck, it even makes things more accessible. Video chats are superfluous for some, but for Deaf people it's a way to FINALLY talk to other people using their native sign language. Blind people can have books and newspaper articles and such read out loud to them, or converted to Braille.
I guess my general annoyance with some discussions about sustainable computing is that the argument often tends to wander towards "oh, fuck it, throw out all the (personal) computers, we wouldn't need them if we were all just more social and understanding". And I don't think that's what we should be doing. Computers are good and I'm very glad they exist. But also, we do need to ask questions about how MUCH computer we all need.
I've been very vaguely investigating the idea of programming an internet software suite for the classic Macintosh (System 7 era), to see if modern internet services are possible on very low-end hardware. Because for the vast majority of text-based communication (that's Twitter, Mastodon, email, the news, etc.) we really don't need that much computing power. The SSL encryption might slow things down, sure, but by what, a second or two? I find it really silly that we need to get brand new Intel Core i9 Gen 153½ processors to read the news these days. And I'm giving the major stink-eye to webdev people here, with all their JavaScript frameworks and such.
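To make that concrete, here's roughly how small the core of a text-mode client is: one socket and a read loop. This is only a sketch, assuming a modern POSIX system for brevity; on a real System 7 machine it'd be MacTCP or Open Transport calls instead, with the SSL layer on top. The hostname and the Gopher-style request are placeholders, not anything I've actually built yet.

```
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

int main(void) {
    /* Hostname and request are placeholders: any Gopher-style
       plain-text service would do. */
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof hints);
    hints.ai_family = AF_UNSPEC;
    hints.ai_socktype = SOCK_STREAM;
    int rc = getaddrinfo("example.com", "70", &hints, &res);
    if (rc != 0) {
        fprintf(stderr, "lookup failed: %s\n", gai_strerror(rc));
        return 1;
    }
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) {
        perror("connect");
        return 1;
    }
    write(fd, "\r\n", 2);                   /* ask for the root selector */
    char buf[1024];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);  /* text goes straight to the screen */
    close(fd);
    freeaddrinfo(res);
    return 0;
}
```

A 68000 from 1984 can keep up with a read loop like that just fine; it's everything piled on top that got heavy.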
But also, graphics are still a thing. I know this might be an unpopular opinion on Gemini/Gopher, but I think pictures are a very effective communication tool and expecting everyone to go back to 1970s era text terminals is asinine. Graphics are also fairly hard to handle on, say, a 4MB Macintosh Plus. You basically have to download the image to some scratch space, convert the thing to B&W at the proper resolution (see the sketch below), and then render it. Which I guess isn't impossible, but it's slow. Also, monochrome. So we'll probably want a slightly better computer for that. And then videos, and streaming audio, those are all good as well, and the Macintosh is never going to be capable of those. Hence the whole "multimedia computer" movement of the 90s. Those were reasonable things to want to upgrade a computer for, and I'm not sure if artificially restricting ourselves is the best choice here.
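Back to that Mac Plus image pipeline for a second: the conversion step, at least, is only a screenful of code. Here's a minimal sketch of the classic Floyd-Steinberg dither, taking an 8-bit grayscale buffer down to the 1-bit packed format the Mac's screen wants. The buffer layout and function names here are my own assumptions for illustration, not real Toolbox calls, and a real client would also need to scale the image first.

```
#include <stdint.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* gray: w*h bytes of 8-bit grayscale, modified in place as error diffuses.
   out:  1-bit rows packed MSB-first, (w+7)/8 bytes each, zeroed by the
         caller; 1 = black, matching the classic Mac convention. */
void dither_1bit(uint8_t *gray, uint8_t *out, int w, int h)
{
    int rowbytes = (w + 7) / 8;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int old = gray[y * w + x];
            int q = old < 128 ? 0 : 255;      /* snap to black or white */
            int err = old - q;
            if (q == 0)                       /* dark pixel: set the bit */
                out[y * rowbytes + x / 8] |= (uint8_t)(0x80u >> (x % 8));
            /* Push the quantization error onto not-yet-visited neighbors,
               with the standard 7/16, 3/16, 5/16, 1/16 weights. */
            if (x + 1 < w)
                gray[y * w + x + 1] = clamp8(gray[y * w + x + 1] + err * 7 / 16);
            if (y + 1 < h) {
                if (x > 0)
                    gray[(y + 1) * w + x - 1] = clamp8(gray[(y + 1) * w + x - 1] + err * 3 / 16);
                gray[(y + 1) * w + x] = clamp8(gray[(y + 1) * w + x] + err * 5 / 16);
                if (x + 1 < w)
                    gray[(y + 1) * w + x + 1] = clamp8(gray[(y + 1) * w + x + 1] + err * 1 / 16);
            }
        }
    }
}
```

One pass over the pixels, integer math only. Slow on a Plus, sure, but hardly impossible.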
But what I will say is that I believe 80% of computer users (and even more smartphone users) probably don't need a 3D acceleration card, let alone one that does real-time raytracing or whatever the cool kids are doing these days. There are a lot of video games you can play with just early-00's era 3D capabilities. Maybe it won't be cinematic quality, and I do respect people's desires for better graphics... but IMO we've long since passed the point where new console generations look any better than the last.
So all of this discombobulated writing is me effectively asking how much computer is the minimum required for a sustainable yet still accessible life. I suppose the most sustainable computers are the ones that have already been made, and there are a crap load of mid-00s computers just sort of floating around with almost zero resale value. They're not old enough to be interesting, but not new enough to be compatible with the modern internet. I actually got my hands on an old Pentium 4 computer for a few bucks and I tried seeing if I could do everything on that and, yeah, I totally could with Linux. It's a computer. I stuck with my newer computers because I already had them and they worked a bit better, but that P4 was totally usable. Do we need anything newer than that?
I once had to give a presentation in class (not a very in-depth one, but a presentation nonetheless) with some classmates about the death of Moore's Law: what to do when we can't make computers better. They proposed ideas like "quantum computing!" or "neural networks!" or, ya know, buzzwords to throw VC dollars at to keep the machine of perpetual growth going. (I'm being overly pessimistic here; I'm not against new research, just to be clear.) My part of the presentation was just, hey, what if we optimize stuff? What if we come up with better compression algorithms, more optimized software? Heck, there are some new FPGA developments that let them be quickly reprogrammed, and that could be super neat for hardware acceleration stuff. And, generally, yeah, that's my take on sustainable computing. Let's just do more with what we have, instead of constantly buying/making new stuff.
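To give a toy example of "do more with what we have": the two loops below do identical arithmetic on identical hardware, but walking memory in the cache-friendly order is typically several times faster. The matrix size is arbitrary; the point is that a software choice, not new silicon, makes the difference.

```
#include <stdio.h>
#include <time.h>

#define N 2048

static double grid[N][N];   /* 32 MB of zeros; size is arbitrary */

int main(void) {
    double sum = 0.0;
    clock_t t0, t1;

    t0 = clock();
    for (int i = 0; i < N; i++)      /* row-major: sequential memory walk */
        for (int j = 0; j < N; j++)
            sum += grid[i][j];
    t1 = clock();
    printf("row-major:    %.3fs\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (int j = 0; j < N; j++)      /* column-major: big stride every step,
                                        so the cache barely helps */
        for (int i = 0; i < N; i++)
            sum += grid[i][j];
    t1 = clock();
    printf("column-major: %.3fs\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    printf("sum = %f\n", sum);       /* keep the loops from being elided */
    return 0;
}
```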
FPGAs are also fairly interesting, because IMO they solve a lot of the "fixed useless silicon" issues that discrete logic has. If something like Meltdown or Spectre occurs, you can just reprogram your entire processor to fix it. If you need/want to switch ISAs entirely, or if some computer scientist comes up with a better way of rendering polygons, or some new encryption standard comes out, you can reprogram your processor to hardware-accelerate all that. And over time we'll figure out how to better exploit FPGAs to do more and more stuff, while keeping the same silicon in service for decades, maybe. And we'll optimize the software in the meantime, as well.
For a point of comparison, look at the retro computer demoscenes and homebrew games. Heck, look at the commercially released games. The early ones were sloppy and didn't use the hardware very well, but as the years went by, the performance of a particular game/demo went up and a library of tricks and tips was built up for doing useful and cool stuff with the same old hardware. Maybe we should take the same approach to all software and hardware.
I'm getting to the point where I'm just repeating myself a dozen times so I'll just finish this. Sustainable computing. Yes.