💾 Archived View for dfdn.info › dfdn › computerfuture.gmi captured on 2024-09-29 at 01:01:05. Gemini links have been rewritten to link to archived content


Some Thoughts during the Summer Doldrums on the Past, Present, and Future of Computers and the Internet

8-16-2024

When I was in grade school back in the 1970's, school began on the first Monday in September and ended some time in the third week of May. Today's children may be surprised to learn that we had three-and-a-half months of Summer vacation to spend outside with our friends before we were forced back into bleak classrooms that smelled like unwashed feet and industrial cleaning products. Once back, we would resume our psychological conditioning for the life that awaited us as soon-to-be wage slaves--lives of arbitrary rules, arbitrary discipline, looking alike, acting alike, becoming alike. Becoming good little fungible cogs in the machine. The end of Summer was always an especially depressing time of the year for me.

I am writing this on August 9th, and I have already witnessed school buses picking up children in the mornings. It seems that children are having their childhoods stolen more completely with each new generation. No wonder an epidemic of depression exists among children. But, adults are facing the same thing. When I began my engineering career, we had unlimited sick days, and on my first job we had 20 days of holidays a year in addition to our vacation time, which varied from two to six weeks, depending on seniority. Why are we all working harder and enjoying life less? My guess is that it may have something to do with narcissistic slave drivers like Elon Musk being in charge of everything.

You may be asking yourself what this has to do with computers. Even though I have been out of school for decades, my psychological conditioning was so insidiously comprehensive that this time of the year remains a source of depression. And that makes it an especially bad time of the year to contemplate the future of anything. So, of course, it is the perfect time to talk about the bleak future of computers. But bleak as the future of computers looks, it doesn't have to be unless we let it happen.

The 1980's and early 1990's were the golden age of home computers. Computers were amazing back then. We waited with eager anticipation for the newest computer with the latest chip to come out because each new generation brought us new and surprising capabilities. I regularly spent many hours flipping through the pages of Computer Shopper, a magazine thick enough to knock you down if someone threw it hard at you. In the beginning, everything was done on the command line. Only nerds owned computers, and not a few of them wrote a significant portion of their own software. In fact, that was sort of the point of "microcomputers" in the very early days. Then, the Windows operating system came along, and normal people everywhere began to discover computers for the first time. Along with this new influx of computer users, we quickly transitioned to a dependence on software written mostly by the larger tech companies.

While Microsoft was establishing itself as a major player in the tech industry, the Internet sprang into existence and swelled, not unlike the big bang, to fill the void that had existed during all of human history. Even though it would be over a decade after the introduction of the first consumer Internet accounts until most people would begin to use it, everyone knew about the Internet. Perhaps the first widely-reported fight for dominance in the computer industry that normal people paid attention to was the first browser war that occurred in the late 1990's. Thanks in large part to the Internet, we discovered ways of listening to music on our computers. Then, we began to watch movies. Years later, Internet streaming services appeared. Early social media matured and became more widely known, and we began to make friendships of a sort with anonymous strangers on line. Later, most social media evolved to become less about friends and more about information and entertainment.

All the while, video games slowly matured into the amazing experiences they are today. Unfortunately, along the way, we have lost the experience of offline video games entirely and are now forced to play online with other people, whether we want to or not.

Computer hardware and software companies gradually became significant to us all, even to normally oblivious politicians, simply because we became so overly-dependent on them. Over the past four decades, computers have gradually transitioned from a source of excitement to a source of worry and even fear. Today, with very few exceptions, we no longer look forward to new computers coming out, probably because their capabilities rarely improve much. Instead, we worry more about retaining the freedoms that computers have brought us, some of which they now threaten to take away.

Sometime around 2005, Moore's law finally died without anyone much noticing, and the major innovations that led to the increasing usefulness of our computers began to evaporate like morning dew on the grass, replaced by much more gradual improvements in computer technology. Around that time, computers became nothing more than commodities. While I was in no way sorry to see the drop in computer prices that followed as a result, I was sorry to see the near death of innovation. From the end of the 2000's until today, try as Microsoft, Apple and the rest of the computer industry did to hide what was happening, the average person no longer needed to purchase newer, faster computers. As long as we refused to install the increasingly inefficient software put out by the companies that ran the computer industry, we were pretty much able to do whatever we liked without a significant amount of the historic pressure to buy new computers every two or three years. Today, with an old version of Linux Mint, I can still listen to music and even watch movies on my sixteen-year-old Thinkpad T500. In fact, I can still do all the things I really want to do with that computer.

These days, computer hardware and software manufacturers are so devoid of significant innovation that they are forced to find other ways of increasing profits. For the past five years or more, that seems to have taken the form of Software as a Service (SaaS) and cloud computing. SaaS and cloud service providers enticed us initially with smaller monthly fees for the software we used to pay high prices for every few years, but as the cost of providing the equivalent services in the cloud drops thanks to the lower costs of hard drive technology and network bandwidth, the prices of the services themselves seem to decline little, if at all, or even to increase. The cost of software as a service alone has been increasing at twice the rate of inflation. I am sure the sellers of SaaS and cloud services hope we won't notice.

Over the past few years, Microsoft seems to have been working hard to create an operating system that is increasingly cloud-centric. The point is of course to further lock us into paying for everything by the month, with no option of going back. Once we are wholly in the cloud, Microsoft, Google, and others hope to be able to charge whatever they like as we slowly lose the knowledge required to go back to the old way of computing. They appear to have already fooled the large majority of individuals and organizations that once ran their own web servers into believing that they should now be paying monthly fees to off-premises data centers for that service. I am convinced that they hope to also turn the rest of us into true point-and-clickers who have no conception of the possibility of installing software on our own computers or of even remembering that people used to do that.

For decades, the decreasing efficiency of new computer software has been more than offsetting the increasing power of new hardware, and I believe this trend will continue for some time. One reason is that developers have been slowly losing their ability to write efficient software as they have relied more on development "frameworks". But now they are also beginning to look to AI to write their code for them. In these two respects, we are definitely not heading in a good direction. Add to this the long-standing belief of developers that they should be creating new applications by assembling other people's code like jigsaw puzzles, rather than writing their own code from scratch. This programming mindset has been around at least since I started my first engineering job in the 1980's, so I seriously doubt it will be changing any time soon. While this may make sense in certain circumstances, over time it generally leads to bloated, slow-running software.

Many argue that increasing layers of abstraction are necessary to deal with ever more complicated software. As one who writes nearly every line of code that runs my websites, my perspective is that much of the reason software is increasingly complicated is the increasing layers of abstraction themselves. I can still create a website with only hand-coded HTML, CSS, and PHP. I don't need React, Ruby on Rails, Node.js, Laravel, or any of the rest. I don't ever need to touch a Docker container. As someone who at times during my professional career has worked on projects with millions of lines of code, I know that nothing more than a good CMS is required. Many people talk about the degradation in developers' ability to create efficient software, but none of the talk seems to be driving the message home. Developers seem determined to continue along the course they have been heading. Eventually, of course, they will arrive at their destination.
Just as a vanishingly small number of people can still write assembly, in a decade or two, few may be able to write HTML, CSS, or PHP. Since most software is moving into the cloud and becoming dependent on Web interfaces, this means the trend of manufacturing increasingly fast hardware to run decreasingly efficient software with no net increase in capability will very likely continue for the foreseeable future.

With very few exceptions like AntiX and some vintage computer software, Linux developers don't bother to update software for computers past a certain age. This means we simply cannot continue to run our old computers in environments where the latest security updates are required. With efficient software, computers from the mid-2000's are still perfectly capable of surfing the Web, listening to music, watching DVD's and online movies in standard definition, and doing everything else they have always done, but a lack of efficient, up-to-date software makes that difficult or impossible. Lately, I have been having increasing difficulty running recent open-source Linux software with the latest security features on my old computers. I see the time fast approaching when I will no longer be able to do so, and from that point forward, I will have to keep those computers off line. Oh, perhaps I will still be able to use them on Gopher, Gemini, and some other alternative networks, but with my normal usage patterns, the Web will be off limits.

The only way that I can see that my old software might remain relatively safe to use on the Internet would be if I run it on a non-persistent Linux USB flash drive. This way, any malware that is transferred to the USB drive during an Internet surfing session will not be there the next time the drive is used to boot the computer.
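As a sketch of that workflow, the commands below write a stock live ISO to a USB stick with no persistent partition. The ISO filename and the /dev/sdX device name are placeholders, not real paths; always confirm the device with lsblk first, since dd will overwrite whatever it is pointed at. For safety, this sketch prints the dd command as a dry run instead of executing it:

```shell
# Hypothetical paths: replace with your downloaded ISO and your USB device.
ISO="linuxmint.iso"
DEV="/dev/sdX"   # find the real device name with: lsblk

# Writing the ISO directly, with no persistence overlay, means every boot
# starts from the same pristine image, so anything picked up while surfing
# does not survive a reboot.
# Printed as a dry run for safety; remove the echo to actually run it.
echo "sudo dd if=$ISO of=$DEV bs=4M status=progress conv=fsync"
```

The key point is simply to write the image raw and to skip any "persistent storage" option that a USB-writing tool may offer.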

I have no interest in buying a new computer, because new computers seem to be becoming less practical every year. Manufacturers are increasingly ignoring the characteristics that make computers useful and functional and instead are designing whatever brings them the most profit. But, when has that not been true? Still, manufacturers seem to be banking on the fact that most consumers are like lambs being led to the slaughter. Just when you thought computers could not be any more locked down, computer industry leaders are inventing operating systems that only run on computers continuously connected to the cloud, along with hardware and firmware that is even less under the user's control thanks to new technologies like software-defined silicon and Pluton. Manufacturers fixate on ever-thinner PC's that consumers never asked for, at the expense of sufficient USB ports, while slots for future technology upgrades have completely disappeared. Their reasoning must be that since we no longer "need" to install operating systems or software of any kind or load music or movies for off line listening or viewing, and since all of our data resides in the cloud, why on earth would we ever again need to bother with another peripheral? Computers are increasingly fanless, even though that makes them either glacially slow or causes them to overheat and burn up a few months after purchase. But the more often our computers burn up or need to be upgraded to the latest models, the more manufacturers and their shareholders rejoice on their way to the bank. Even the now-ubiquitous buttonless touch pads are atrocious to use compared to those with buttons. Keyboard latency is also increasing on modern computers because so much unnecessary garbage is competing with keyboard inputs for the CPU's time.

Let's talk about what is likely to happen to computers in the near-term future. One bright spot is that computer manufacturers realized some years ago that low-power tablet and laptop computers with more hours of use per battery charge are important. We still have a long way to go, but the reason we have not made more progress is simply our horribly inefficient software. I contemplate a day soon when a manufacturer like Apple, which controls both the hardware and the software of a computer line, will realize that by demanding more efficient software from developers, it can create a line of very low-power, long-battery-life computers for those who don't need much computer power--people like writers and average Internet surfers. Combined with the leaps and bounds being made in battery technologies, this could lead to laptops that run for a week or two between charges. Such a line of laptops would do well in the consumer computer market. It might even spur a "netbook 2.0" revival.

As I have mentioned, the current trends are toward thinner, less functional cloud computers. Simultaneously, routine computer maintenance has increasingly moved from being the responsibility of the user to being under the manufacturer's sole control. Though many refuse to see the problem with this, Microsoft has been taking ever more control of our computers, leaving less and less for us. This has been blatantly obvious ever since the Windows 7 era, when Microsoft began blocking our access to our computers whenever it wanted to perform a random update. More recently, it has begun upgrading from Windows 10 to Windows 11 even when we have specified that we don't want that. As operating systems have become far too complicated for consumers to maintain, computers have become more and more like black boxes beyond our control. The manufacturers of the software on our computers increasingly track us everywhere we go on the Internet and monitor everything we do. With Pluton, Microsoft now has all the tools it needs to completely lock us out of the operating system that runs our computers, and that is exactly what we should expect to come soon.

We are already at the point where our computers are privacy nightmares; coming next will be even more egregious offenses than most of us can imagine. Microsoft's Windows 11 Recall feature now has the capability of recording everything we will ever do on our computers and automatically transferring all of that information to every computer we will ever own in the future without our even knowing. Since Windows 11 computers are now effectively always-connected cloud computers in nearly everything but name, on which we control neither which data resides on our hard drives nor which data resides in the cloud, we should assume that is exactly what will occur on all computers that have the Recall feature. And since companies' greed has no limits, we should expect that all of that data will soon be added to the information they already sell to anyone who wants to buy it, including criminal organizations and the totalitarian governments of the world.

The only way around this would be open-source software. Unfortunately, like rats following the pied piper, open-source software developers are already beginning to follow Microsoft into the cloud. I recently tried installing the BunsenLabs Linux distribution and decided not to use it in part because the installation process could not be completed without an Internet connection. No reason for that exists other than to collect data on whoever is installing the operating system. So, in terms of privacy violations, open-source software is lagging behind closed-source software, but they are both going in the same direction. Part of the reason for this, I think, is that open-source developers who once wanted nothing more than to create privacy-respecting free operating systems are now in it for the money. Instead of looking at the software they create as their gift to the world, they are looking for a payoff.

Since the distinction between the Internet and the computers in our homes is increasingly becoming blurred, let's talk a bit about the Internet. I recently wrote an article called Everyone Wants to Control the Internet in which I explained that the days of the open and free (as in freedom) Internet are most likely nearing an end. In large part, this is due to governments locking down the Internet. What technologists forgot when they naively claimed 30 years ago that the Internet's technology made it beyond the control of governments was that governments can always raise money and pass laws that force changes in technology or find ways around it. That is what has been happening and will continue to happen. Governments will soon take whatever control of the Internet they have not already taken. Sure, in the short-to-intermediate term we will still have alternative communication protocols and networks like Matrix, XMPP, the Gopher network, Gemini, Tor, IPFS, I2P, and Secure Scuttlebutt to protect our privacy, but their usefulness and capabilities are limited. Unless something new comes along that is truly decentralized, economically efficient, infinitely scalable, and unblockable, we have no real hope there either. As I have mentioned a number of times in past articles, Russia and China are already successfully blocking the Tor network and VPN's, and various countries have successfully de-anonymized select Tor users. Once we are all using cloud computers for everything, crooks and governments will have a perfect view of everything we do on our computers.

Another problem that I have talked about for years is that computers are increasingly becoming locked-down appliances. Since I have written about this before, I will not dwell on it here. We have already reached the point where most people access the Internet on their phones. Aside from iPhone users, phone users have zero privacy, because phones are largely locked down and run whatever spy software the phone companies want. My issue, however, is that the same thing is happening with general-purpose computers. We are quickly losing our ability to modify and repair the hardware of our computers. The way to put this off for as long as possible seems to be to go back to desktop computers or even to buy used industrial servers. But even desktop computers are less upgradable than they used to be. Twenty years ago we began to see new desktop PC's being sold with only one or two expansion slots on their motherboards. Now newer CPU's are supporting fewer PCIe lanes.

Meanwhile, I expect Internet overlay networks to grow and mature slowly as the enshittification of large social media networks and other Internet services on the Web continues. The chief advantage of these overlay networks is their lack of commercialism, which means they don't inundate users with advertisements or sell their data. The major problem is that, because they rely on resources provided by volunteers, most don't scale well. The notable example these days seems to be the Fediverse, which I think could have a bright future if instance owners are smart enough to block corporate-run instances before they have a chance to gain control of the network. Individual Fediverse instances do have problems scaling past tens of thousands of active users, but the solution to that is for individuals to realize that they need to limit the number of users on their instances. The other problem is that the Fediverse is not as efficient as it needs to be. Hopefully a solution can be found to change that.

The Gemini network currently has three or four small social media sites that users seem to be well pleased with. Hopefully they will grow, but even if they don't, an increase in their number would work also. Unfortunately, the size of social networks on Gemini seems to be limited at the moment due to the inefficiency of the server software. I don't see a reason for this other than the fact that amateurs are writing inefficient code.

I think the growth of the Tor network has stalled, either for lack of enough volunteers to run servers or because it is being actively attacked by governments like China's. Tor was once easy to use and fast, but over the past two or three years I have at times simply been unable to establish a connection and have been forced to wait a couple of days before trying again. I have no idea how to solve Tor's problems.

My hope is that the large social media networks run by corporations will be substantially replaced by tens or even hundreds of thousands of smaller ones run by individuals. The fact that this is already happening on the Fediverse has received much news coverage since the end of 2022. What no one seems to be acknowledging is that to a lesser extent the decentralization of social media has also been occurring on the Web and other networks. I strongly hope this trend will accelerate. Many of us dream of bringing back the types of small communities that were plentiful in the 1990's and early 2000's, where we didn't have to deal with big tech's enshittification and we could just talk to each other. I have written about this extensively in past articles, so I will not repeat myself here. I will simply say that I see no reason why this cannot happen, and I think it will. I wasted years on social media in a fruitless search for a better community. I wish I had realized sooner that the way to have it is to build my own community by running my own social media site on my own server and inviting my new online friends to join me. I think if enough of us learn this important lesson, the Internet will improve greatly.

This seems like a positive note on which to end this article. Hopefully, readers will think about what I have said and choose to do what they can to make the future of computers and the Internet a little better for us all.