< Some Brief Examples of the Unthinking Adoption of Technology (and a Solution?)
I have had a related thought: in recent years, the exponential rate of technological development has created a situation that wasn't quite present at any point in the past. Think of the printing press. It was developed at a time when writing had already existed for thousands of years, and books for a good many centuries, too. The press brought about a revolution in the distribution of knowledge, but it would take several centuries before new (modern) publishing technologies would make their appearance.
In the present, it is as if we developed writing, books, the printing press, and ereaders/epubs all within the span of 20 years. Mankind has not had the long spans of time needed to accommodate changing technologies in a "natural" way; that is, they don't go through the periods of development that carry them to maturity. Instead, we hastily jump from one development to the next: before we have learned to use a technology optimally, we are already adopting the next one.
A case in point is the development of storage and computing devices. There is a vertiginous obsolescence cycle in which you have to continually update your devices to keep up with the demands of the world. We went from 5.25-inch floppies to 3.5-inch floppies to CDs, and apparently we've reached the optimum size with USB sticks. Accordingly, our laptops have evolved to accommodate each of these. Every five or so years a new standard supersedes the last one: VGA to HDMI, micro-USB to USB-C, and so on.
Software is one of the biggest culprits here. While I can run a decent Linux system on a 15-year-old potato laptop, for the most part people are expected to have the latest processor to endure the demands of increasingly complex operating systems which offer little more functionality than they did 20 years ago. Don't get me started on mobiles. Not only are they monolithic black boxes; even a five-year-old device won't run the latest version of MOST APPS, because continuous software upgrades drop support for old devices, just for the sake of keeping the market active.
Interestingly, as I said, there is a proliferation of solutions that do not actually add functionality over decades-old ones. Take Obsidian, for example: a note-taking app built on a stack of technologies that piles up complexity and cruft, yet does nothing you couldn't do with sed and awk. Of course, there is the benefit of being easy for end users to pick up quickly, but does that really justify all the computing power and a line count somewhere in the millions? The essence of its functionality was settled some 50 years back: text files in a directory tree. The same applies to operating systems. Maybe this is the period of early development before we reach some kind of maturity; I like to think that at some point we'll return to some semblance of simplicity or versatility. Right now we have the equivalent of fat, unwieldy Swiss Army knives where we could be using actual knives, scissors, saws, and corkscrews, which do their jobs a lot better.
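To make the sed-and-awk claim concrete, here is a minimal sketch of the same core workflow with stock Unix tools. The ~/notes directory, the file name, and the "tags:" header line are my own illustration, not anything Obsidian prescribes, and the in-place edit flag assumes GNU sed.

```
# One directory, one plain-text file per note.
mkdir -p ~/notes

# Create a note (any editor works; a heredoc keeps this self-contained).
cat > ~/notes/example.txt <<'EOF'
tags: software, bloat
Note apps keep reinventing text files in a directory tree.
EOF

# Full-text search: list every note mentioning a phrase, case-insensitively.
grep -ril "directory tree" ~/notes

# List notes carrying a given tag, using awk on the "tags:" header line.
awk '/^tags:/ && /bloat/ { print FILENAME }' ~/notes/*.txt

# Rename a tag across every note (GNU sed; -i edits files in place).
sed -i 's/bloat/cruft/' ~/notes/*.txt
```

Even linking between notes, arguably Obsidian's main draw, reduces to grepping for a note's name across the directory.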
I am sure that in periods such as the Industrial Revolution there was a great proliferation of new techniques and technologies within a short span, and we may be right in the middle of one such period. Perhaps in a couple of decades the whirlwind will settle for a bit, and we will be left with the lessons of this rapid development and a basis for the solid further development of whatever technologies end up staying.
I don't give a flying F about AI, to be honest. I hardly see any value in it. Sure, as a PA it is probably a charm, but not in its current state, when it still makes a lot of shit up and can't yet follow a train of thought or carry an argument. At present it's mostly a shortcut to writing essays or drawing pictures, all of it devoid of style (or poorly emulating other styles, always leaving its eerily synthetic imprint). I hold fast to the idea that AI is heavily overhyped, and in a few years it will hopefully be seen, beyond the novelty, for what it really is: a natural language processor.
In conclusion, I think in about 20 or 30 years we may find our technologies reaching a sort of "metastable" configuration, reaching, as it were, their final form, in which hopefully everything will find its place. AI may become a smarter Clippy; well, hopefully it'll turn out to be useful at all.
Wow, a well-written and considered response to this post. I enjoy reading the responses here. I agree, too, that a plateau of online (or online-curated) knowledge will arrive after some time, and then it's all a discovery game. We are likely already there. Then it isn't WHAT is there, or HOW to find it, but just individuals deciding that they are INTERESTED in WHAT'S there and HOW they get it.