One of my favorite classes at university was a computer engineering class where we started with a RISC processor core, and added optimizations over the entire semester as we learned about the evolution of processor design. Things like pipelining, opcode and then data caching, out-of-order execution, etc.
I've been wanting to follow this style and write about taking a simple (web or Gemini) crawler and evolving it to web scale since I first started building Kennedy. It's still a work in progress, but the best way to write is to write:
gemini://gemi.dev/crawlers/
2 years ago · ttocsneb, krixano, mc