💾 Archived View for gemini.ctrl-c.club › ~stack › gemlog › 2022-06-06.new.cpus.gmi captured on 2023-05-24 at 18:33:23. Gemini links have been rewritten to link to archived content


You don't need a new computer

gemini://altesq.net/~masqq/gemlog/2022-06-05.gmi

Back in the early '90s the workings of the Wintel cabal became clear. Intel sells shiny new CPUs that are twice as fast as the last generation; Gates comes up with a new version of Windows that is twice as big and as slow as molasses, requiring the new CPU. But the pointer has drop shadows! Woo-hoo!

I tried my best to get off the Windows train, but let me tell you, the Apple train was even more expensive. At one point I was spending tens of thousands of dollars every year on new machines and developer memberships, because, you know, I was a professional coder. Heh.

When Linux became a reality I jumped ship. Amazingly, it worked well on leftover equipment I had neglected to give away; even on ancient boxes it was perfectly tolerable for many applications.

Unfortunately, an entire generation got caught up in this brainwashing scheme. Even my parents and in-laws think Windows is something they must have. In fact, every time I give them a Linux machine with a Windows-like distro, a 'friend' wipes it and puts a pirated copy of Windows on it. Gah.

For many years now, I've been picking up used i7 boxes for under $300, complete with everything I need. The last time, I did it because I was interested in the new SIMD instructions my old box lacked; I was writing a compiler to test some ideas.

Do you really need that new computer?

If you have any green bones in you, you can probably use your old machine for another ten years, and make a very small dent in saving your planet. If you can do that without eating meat, good on you!

Do you really need to play that new game? Maybe you can upgrade your GPU, because that's where it counts anyway. What is the difference between an 8-core and a 12-core CPU anyway?

I find that my 2015 i7-6500U 2-core/4-thread notebook is chugging at 688 MHz most of the time, occasionally venturing into the 1GHz+ territory. Rarely do I see it go to its 2.2GHz max at 15W of power - only when Firefox chokes on some crap and tries to start a fire. But I am rarely on the Web anymore.
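You can watch this for yourself on Linux. A minimal sketch (assuming the kernel's cpufreq sysfs interface is present, as it is on most laptops) that prints each core's current clock:

```python
from pathlib import Path

def cur_freq_mhz(raw_khz: str) -> float:
    """Convert a cpufreq sysfs reading (kHz, as text) to MHz."""
    return int(raw_khz.strip()) / 1000

def sample_all_cpus(base="/sys/devices/system/cpu"):
    """Read scaling_cur_freq for every core that exposes it."""
    freqs = {}
    for f in sorted(Path(base).glob("cpu[0-9]*/cpufreq/scaling_cur_freq")):
        # path looks like .../cpu0/cpufreq/scaling_cur_freq
        freqs[f.parts[-3]] = cur_freq_mhz(f.read_text())
    return freqs

if __name__ == "__main__":
    for cpu, mhz in sample_all_cpus().items():
        print(f"{cpu}: {mhz:.0f} MHz")
```

Run it in a loop while you work; you will likely see the cores idling far below their rated maximum most of the day.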

Do you really have compute-intensive applications? Are you using Python? Heh. That's a couple of orders of magnitude in performance right there, as Python is ass-slow and kind of dumb. You are smarter than that!
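Even within Python the interpreter overhead is easy to demonstrate: the same arithmetic run as an interpreted bytecode loop versus inside the C-implemented built-in sum(). A sketch (the exact ratio varies by machine; the full "orders of magnitude" show up against properly compiled code):

```python
import timeit

N = 1_000_000

def py_sum(n):
    """Naive interpreted loop: every addition goes through the bytecode VM."""
    total = 0
    for i in range(n):
        total += i
    return total

def c_sum(n):
    """Same arithmetic, but the loop runs inside CPython's C-level sum()."""
    return sum(range(n))

if __name__ == "__main__":
    t_py = timeit.timeit(lambda: py_sum(N), number=5)
    t_c = timeit.timeit(lambda: c_sum(N), number=5)
    print(f"interpreted loop: {t_py:.3f}s, built-in sum: {t_c:.3f}s, "
          f"ratio ~{t_py / t_c:.1f}x")
```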

The end of Moore's Law

The doubling of single-core CPU performance has stopped, and we've been getting smaller and smaller gains while going wider (more cores).

The problem with Moore's exponential expansion is that it killed software development. Back in 6502 days, hackers used every ounce of processing power, and a lot of trickery, to get these sub-mega-instruction-per-second boxes to do anything useful.

When the expansion started, writing software became a different art. Sure, choosing the right algorithm is worth spending time on, but subtle performance improvements are not, because next year's processors will be twice as fast anyway.

This is similar to the science-fiction problem of star travel. If it takes you 300 years to get to a nearby star, it's not worth leaving, because tomorrow's technology may cut the trip to 150 years, and by the time you get there, others will have built cities. Likewise, they should not go either, because a few years later the trip will be down to 100 years; and so on.
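This "wait calculation" can be made concrete. A sketch, under the purely hypothetical assumption that technology halves the remaining trip time every 50 years: launching in year t means arriving at t plus the trip time available in year t, and there is a best launch date after which waiting no longer pays.

```python
T0 = 300.0  # trip time with today's technology, in years (from the post)
H = 50.0    # assumed: technology halves the trip time every 50 years

def arrival(launch):
    """Arrival year if you launch in year `launch`."""
    return launch + T0 * 2 ** (-launch / H)

# Search launch years 0..300 for the earliest possible arrival.
best = min(range(0, 301), key=arrival)

if __name__ == "__main__":
    # Leaving immediately arrives at year 300; waiting about a
    # century arrives decades sooner, and waiting longer loses again.
    print(f"best launch year ~{best}, arriving ~{arrival(best):.0f}")
```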

So writing software is kind of a lost art. If you don't believe me, consider that as of today, Common Lisp, codified in 1984 and ANSI-standardized in 1994 (but born, as Lisp, in 1958), is still the pinnacle of languages. It is still not fully understood; you can still write PhD dissertations on the finer points of macrology. It is still imitated by others, as standard features of Common Lisp slowly migrate into other languages. All those revolutions in Computer Science (mostly not a science) - Artificial Intelligence, object orientation, functional programming, lambdas, you name it - have been inside Common Lisp from the start!

I am extremely happy about the recent events, as they signal a potential slowdown in the stupidity of Intel-driven instruction-set architectures, the emergence of alternative CPUs, the return of minimalist technologies (including Gemini but especially Spartan), and a possible beginning of the restructuring of our wasteful economy in all other areas.

Just ask yourself before buying that new Apple or Windows box: do I really want to be a sucker my entire life? Do I need to give billionaires even more of my money? The answer is: no.
