racing to the bottom


TL;DR: If the purpose of AI is to become a programmer's best friend, shouldn't we address a few issues in open source funding first?

When I was first coming up in programming, it was the early 90s. I was lucky, and had the privilege of a computer that could run a very early version of Turbo Linux Red Hat edition. I was even luckier that I didn't cook the monitor I had. But that aside, programming was help files, and programming was begging to find technical books, because at the end of the day you didn't have a lot of resources to learn from. From my experience, you had help files, shareware, and whatever technical books you could track down.

So that was it for me. I basically spent all the money I could on shareware and technical books, trying to guess what would make the most sense to learn. At the time I had a dual boot going, so I pinged between working in MS-DOS batch files and trying to understand X11. As funny as that sounds now, I really do have a memory of being stuck in vim and having to shamefully reboot the computer because I didn't know the magic of `:q`. So before the nostalgia overtakes me, let me get to the point.

The barrier to entry in programming was money.

It was a time when knowledge had a price and was kept behind lock and key. It must seem a triviality now, but I remember first buying 'Encarta' and thinking, 'holy shit', I'm never going to need to go to the library again to cumbersomely read an encyclopedia. Up to that point, I'd begged my parents for more than the letter 'M' volume of the encyclopedia we owned. So that to me was MY first experience of the barrier coming down a little bit. Now, I'll admit, this was only for general knowledge, not programming, but it was the first time people started copying and pasting sections of Encarta and using them in school. I was in 7th grade at the time and have a memory of a kid getting caught turning in verbatim copy/pasted Encarta entries as a paper.

Now, my own journey was also enriched during this time period by the rise of the internet. I was there with Mosaic on Windows 3.11, and I was there when Netscape first blazed to life with MIDI and animated GIFs. For an aspiring developer, that was like a drug. But more than that, you could read any webpage just by looking at its source code. It was intoxicating. This was the time of simple web editors, of GeoCities; it was the early internet, which, to be honest, I really never thought would catch on. It was so much fun, but it was super campy. It was a quilt of people and a snapshot of the early-to-mid nineties.

So I began learning more JavaScript and HTML, simply because they were easier to learn. The resources were a bit more accessible. Much like with Encarta, the barrier to entry for web development was lower. Were there still pieces of serving HTML that were hard? Sure. CGI-bin was still a foreign concept, PHP and ASP were still fighting it out, and there was Perl (which would become a lifelong love).

Later, after the dot-com bust and recovery, in the early 2000s, I landed my first technical job as a web developer, now in my twenties. The world was largely the same. I still had a pile of books in front of me: 'JavaScript: The Good Parts', 'Perl Cookbook', 'Unix/Linux Admin Manual', etc. My cubicle resembled a modern-day wizard's tabernacle, with all the Dr Pepper cans hot-glued together to create a wizard staff. It was challenging, because even though the books had the answers for most anything you wanted to build, I would still need to pore over them and decipher their secrets. Working with team members, we eventually put together a really interesting animation library, figured out how to normalize JavaScript between browsers, and were making a go of it.
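For anyone who never had to do that by hand: 'normalizing JavaScript between browsers' mostly meant writing small shims around the places where browsers disagreed. The sketch below is a hedged, hypothetical example of that era's approach, not the actual library we built (the `addEvent` name is my own illustration), papering over the old `addEventListener` vs `attachEvent` split.

```js
// A minimal sketch of the kind of cross-browser shim we hand-rolled
// before jQuery. The names here (addEvent, handler) are illustrative,
// not taken from the actual library described above.
function addEvent(element, type, handler) {
  if (element.addEventListener) {
    // Standards-compliant browsers
    element.addEventListener(type, handler, false);
  } else if (element.attachEvent) {
    // Old Internet Explorer used attachEvent with an "on" prefix
    element.attachEvent('on' + type, function () {
      // IE exposed the event on window.event and didn't bind `this`
      handler.call(element, window.event);
    });
  } else {
    // Last resort: assign the DOM0 handler property directly
    element['on' + type] = handler;
  }
}

// Usage: addEvent(document.getElementById('nav'), 'click', function (e) { /* ... */ });
```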


Then MooTools came out. Then jQuery came out. Things that took the work we had done and brought it into the forefront, available to everyone. So in a moment, I got to watch a year of work, and of some really great learning with coworkers, get tossed away. I wasn't jealous, but I was sad, because I knew that open source technology could take away a problem that an organization would otherwise pay for. The company that paid for my time wasn't having to pay for jQuery. So that was the first time I felt kinda bad. It wasn't like my employer was paying the folks behind jQuery. I was the one having to relearn, and my contributions were abysmal, because at the time it was a world where organizations (at least the ones I'd been in) didn't contribute back.

Then, too, came the rise of Experts Exchange, which was the first place outside of random forums and newsgroups where people started to congregate and solve common problems. I got there because I was trying to look up simple Apache errors, or weird JavaScript errors. But the thing was, the Exchange was behind a paywall. Well, sure enough, a year or so later, Stack Overflow came in like an asteroid, with people helping people, or open-sourcing the problem space. Experts Exchange went away, and even more developers could be brought in to play. No longer did you 'need' those large technical books; mostly the documentation and Stack Overflow would be enough. I still found myself buying books. Hell, at one time I truly believed that if I had access to all the O'Reilly books, I could get any job in the industry.

So what I'm trying to say is that during my life as a developer, I've seen the barrier to entry go down: from needing quite a lot of grit and expertise to just hitting up Google and searching.

Then of course came the massive increase in web frameworks after 2010, which was great for normalizing the front-end space (think Web 2.0, but for the front-end UX). This allowed for standardization in colleges and let code schools really start pushing new programmers into the field. I think even President Obama asked the American people to look into computer science. It was a flood of new talent into the industry, for jobs that could actually meet the cost of living. Which, I'm sorry to say, is the hardest pill to swallow.


So at this point, you're probably wondering: what? This hasn't been a rant on AI. Where is the AI bashing?

Well, it's here and it's not. AI, as I've used it so far (and this is just the beginning), can do some basic programming tasks. However, as of this date, using Claude, ChatGPT, and Gemini hasn't resulted in very much 'good' code. It's mostly sloppy, and it doesn't know what it's talking about. Still, it has taken that barrier to entry for anyone non-technical and moved the bar, making it seem like you can get away with a much smaller workforce.

AI lies impressively, and the prompts I've seen are very good at creating error-filled code with a lot of confidence.

It's kinda terrible.

But it does have me asking: what's the end goal for this technology? If it does succeed, and someday reaches the singularity with all the experience of a developer, what happens to me? Is it like the lamplighters of the early 1900s? What happens to all those people who went through code schools and are chasing a cost of living?

There are so many societal considerations to take into account, and things I'd love to see 'fixed', or at least some open questions addressed.

I personally am still thinking those over, but I wanted to get them on paper first, because it's interesting how often they seem to bubble up in conversation.

Thanks for reading.

UPDATE

After speaking with a friend, I was reminded of this fun xkcd comic:

[xkcd 2347: Dependency](https://xkcd.com/2347/)


---

updated: 24 June 2024.

