💾 Archived View for dioskouroi.xyz › thread › 29442307 captured on 2021-12-04 at 18:04:22. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
________________________________________________________________________________
"First a punch is just a punch,
Then a punch is not only a punch,
And finally a punch is just a punch" (heard from Bruce Lee)
Basically it means that in the beginning we punch however we can... then a martial arts student learns the proper and different ways to punch... so a punch is no longer just a punch: it's a lot of instructions.
Then through sheer practice and repetition the right way to punch becomes second nature... the right way is the only way, so it's just a punch again.
So coding is just ifs and loops... Then it's OOP and Functional and Logic... But once you transcend all those paradigms you understand that it's all ifs and loops.
I started off writing a lot of 68000 machine code. I was always amazed what you could accomplish with a lot of loops and branches. I never lost that sense that at whatever higher level I was at later on, it was all loops and branches.
Reminds me of biological metabolic systems. All loops and branches…
With a lot of fuzziness, some state and temporal stuff.
Have you seen this?
https://news.ycombinator.com/item?id=25788317
Seems like we lose a lot of good technology and progress for random reasons, like the “ram” of society is finite.
Yes, one of my early jobs was writing Z80 code and I have the same sense.
Coding is not "ifs and loops, _then_ $foo". That's a false premise. If you want to be fundamentalist about it (which I don't advise), a CPU is just a state machine.
assembly instruction + machine state => new machine state
Our idea of "jumping around code" and "control flow", as in for loops or if statements, are themselves abstractions of control state (an instruction pointer) and data state (perhaps a CPU register).
So coding is really "the execution of a notional interpreter (aka semantics), _then_ $foo." That gets to the absolute bottom of what "code" even is: instructions for an interpreting device (be it physical or mathematical).
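To make the "notional interpreter" framing concrete, here is a hedged toy sketch (all names invented for illustration, not any real ISA): a machine state is just an instruction pointer plus a register, and each "instruction" is a pure function from state to state, exactly matching the equation above.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Toy model: a machine state is an instruction pointer plus one register.
final class Machine {
    final int ip;   // instruction pointer (control state)
    final int acc;  // accumulator register (data state)
    Machine(int ip, int acc) { this.ip = ip; this.acc = acc; }

    // "assembly instruction + machine state => new machine state",
    // iterated until the instruction pointer runs off the program.
    static Machine run(List<UnaryOperator<Machine>> program, Machine s) {
        while (s.ip < program.size()) {
            s = program.get(s.ip).apply(s);
        }
        return s;
    }

    // A conditional jump is just a transition that picks the next ip.
    static UnaryOperator<Machine> jumpIfZero(int target) {
        return s -> new Machine(s.acc == 0 ? target : s.ip + 1, s.acc);
    }

    static UnaryOperator<Machine> add(int n) {
        return s -> new Machine(s.ip + 1, s.acc + n);
    }
}
```

In this model a "for loop" is nothing but a jump whose target points backwards — the if/for abstractions fall out of the state transitions, not the other way around.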
Oh, but CPU instructions can be built up from a series of µops which are then executed out of order or in an overlapping fashion, making both "instruction" and "state" boundaries in your equation fuzzier than they look. So at the absolute-absolute bottom of "code", it is instructions for an abstract model of an interpretative device.
(I'm not sure if your "but" is a retort, or an addendum. Assuming retort...) We can consider one more level of implementation machinery, but I fail to understand how uops aren't another form of instruction, and CPU internal gobbledygook like register files, caches, etc. aren't another form of state. It doesn't seem so fuzzy.
The difference is that you don't get to access the uops machine, subsets/sequences of that instruction set are offered as (macro-)instructions instead. Programming happens on the idea of a CPU.
Trying to explain Zen by starting with "basically" is a bit ambitious on your part haha :)
Before enlightenment: chop wood, carry water. After enlightenment: chop wood, carry water.
Or a set of composable primitives if you’re using an array language. What, you think if and for aren’t abstractions over jump/goto?
Everything else is only there for the monkeys pressing the keys.
I love this :). How much of mankind is creating things sheerly for their pleasure?
That’s what disappoints me in modern Java. There are practically no ifs. It makes the code inaccessible to beginners on the project, and the streams are about twice as slow… just because devs think they are more clever if they pile up 7 `.map()` calls to transform a list.
list.stream().filter(Objects::nonNull).map(User::getName).filter(Objects::nonNull).filter(name -> name.contains(searchString)).map(name -> "We have found your user: " + name).findFirst().orElse("We haven't found");
Basically unreadable.
for (User user : list)
    if (user != null && user.getName() != null && user.getName().contains(searchString))
        return "Found it!";
return "Not found";
Java has never really liked to do in 10 characters anything that could be done in 30 characters instead, especially if it obfuscated things a bit more at the same time.
Ah, keen observation, young grasshopper! But nota bene: just as man cannot live on bread alone, one's understanding does not arrive merely from the consideration of a collection of atoms.
That first reply is so funny to me because it hits too close to home
https://twitter.com/nice_byte/status/1466940940229046273
The more I do this, the more I gravitate towards the simple things.
New developers write simple but shortsighted code. It gets the job done, but it will be painful to work with in the future.
Intermediate developers, bitten by their past mistakes, decide to future proof their work. But they don’t just look one or two steps ahead, rather they try to look five steps ahead and identify problems that do not and may never exist. Consequently, they over-engineer, over-abstract, and over-complicate everything. Carmack’s recent “But here we are” speech resonated with me.
Advanced developers identify the right balance for each task between simple and complex, concrete and abstract, and pragmatic and idealistic. In my experience, they favor simple and pragmatic solutions, use abstraction effectively, and can satisfy near-term goals quickly without painting themselves into a corner. “As simple as possible, but not simpler.”
I try to avoid working for tech leads stuck in the second phase, which is not uncommon. If you suggest taking the “ifs and for loops” approach of solving a simple problem with a simple solution, they’ll assume you’re on the wrong side of the bell curve.
Had a boss once who insisted that all if statements should be pushed out to factory classes, and all control logic should be done by constructing a different instance of an interface. It was a pretty rigid system, but at least it did force me to write small focused classes that were easy to test.
Debated for a long time whether that methodology was stuck in the second phase or if it was actually the third. Still don't have an answer, but these days I think having a plan is better than just letting engineers run roughshod, as long as the conventions are easy to follow.
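For illustration, a minimal sketch of the convention that boss insisted on (the domain and names here are hypothetical, not from the original project): the branch runs once, in a factory, instead of being scattered through the control flow.

```java
// Hypothetical "push the if into a factory" sketch: control logic is
// expressed by choosing an implementation of an interface up front.
interface ShippingPolicy {
    double cost(double weightKg);
}

final class FlatRate implements ShippingPolicy {
    public double cost(double weightKg) { return 5.0; }
}

final class PerKilo implements ShippingPolicy {
    public double cost(double weightKg) { return 2.0 * weightKg; }
}

final class ShippingPolicyFactory {
    // The only if (here, a ternary) lives in the factory.
    static ShippingPolicy forCustomer(boolean premium) {
        return premium ? new FlatRate() : new PerKilo();
    }
}
```

The upside is exactly what the comment describes — each implementation is a small, easily tested class; the downside is that the control flow is now spread across several files instead of being readable in one place.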
Programming alone vs programming in a team are very, very different occupations. A lot of what applies to one doesn’t apply to the other. I’m still painfully learning this, after all these years.
Q: What's the difference between a novice and an expert?
A: The novice thinks twice before doing something stupid.
I don't understand this saying.
Recently posted this, which wasn’t well received.
https://motherfuckingwebsite.com/
Where does this fall?
It goes too basic. We can get to simple and elegant without going all the way back to crappy stone tools.
http://bettermotherfuckingwebsite.com/
https://evenbettermotherfucking.website/
Thanks for that link! I actually like the other one much more. I really appreciate JS free sites, Amazon can be used without JS for example.
That site is a piece of shit, because it uses google analytics, thus snitches to a big brother.
You’re right, I didn’t notice! Gonna link to evenbettermotherfucking.website from now on.
And that's also why a lot of Architecture Astronauts who looooved Java didn't see Python coming.
Funny, I took over a modern python service and I was pretty shocked at what I inherited. Long gone are the days of "There's one way to do things".
Instead, this thing would give the most "enterprisey" Spring JEE application a run for its money with its endless annotations, dependency injection magic, all sorts of pseudo-types - both the "built-in" Python 3 ones like Set and List, but also the libraries like Pydantic. But unlike Java, these types aren't even really guaranteed by the language at compile time, so even if your IDE plugin can successfully detect them, things will still (silently) slip through at runtime.
The async functionality that's been bolted on to the language is worse than even the old and confusing Java multi-threading primitives, and the funny thing is it still doesn't actually run things in multiple threads. For that, your simple Rest API is running on layers of C programs like Uvicorn which itself is then wrapped by another few processes running Gunicorn which in turn is probably running behind NGINX. LOL, and we thought the Servlet stuff with Tomcat and JBoss was clunky - this is insane.
To be honest, if there ever was a sweet spot for Python, it would have been for smaller code bases that weren't complex enough for big "enterprisey" langs like .Net or Java, but were more permanent and complex than shell scripts or (back in the day) Perl could handle.
But nowadays, I don't think modern Python fits any use case real well. It's still dynamically typed, slow, single-threaded, and has a poorly managed ecosystem and hodge-podge of tools like virtualenv, pyenv, poetry, etc. that never quite become standardized and stable.
So unless you've got a bunch of Python experts who aren't interested in moving to a better lang, I'd find it hard to go with Python for new projects.
Well said, this hits home. Also constant refactoring and maintenance of the codebase without adding any new features.
Bull's eye.
Despite being a joke, I know it's the "Ha Ha Only Serious" [0] sort. I can't help but think this is severely biased by the trends of "enterprise software," where you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature, Jedi-like way of thinking about programming, like the meme suggests. (And, consequently, you spend no less time debugging hairy, nested, imperative conditionals with for-loops that are off-by-1.)
I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.
Simpler building blocks does not necessarily mean simpler solution. If only!
[0]
https://en.m.wiktionary.org/wiki/ha_ha_only_serious
> you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature
This comment sure indicates to me where you most likely are on the curve.
In all seriousness, I think this is considerably off the mark. After enough experience you realize that expressivity and convenience are antipatterns and don't actually simplify things but are harbingers of complexity, bugs, tech debt, even the downfall of organizations and products.
Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar.
I try to keep a healthy and balanced diet, myself.
> industry is slow to assimilate most state-of-the-art ideas, sometimes by as much as 40 years.
Most of those ideas are terrible. The industry is so incredibly young and has experienced so much change over those 40 years that I have a hard time accepting the notion that the industry is slow to adopt. The reason the old building blocks are still popular is because they are a thin abstraction over how computers work, and ultimately that is at the root of everything we do.
Personally I disagree. We should be using state machines and pure functions. If+for loops are just what's easiest to express in the major languages of today. They are no more or less computationally expensive, but due to lack of tooling they are often cheaper to write.
In languages and libraries that allow FSM and pure functional kernel based designs you can get just as clear logic that is expressible not just to the programmer but also to business personnel. It's counter-intuitive to a certain extent because so much of programming is built around imperative programming but FSM based logic is and will continue to be easier to understand long term because you can trivially visualise it graphically. This ultimately is what a lot of the functional paradigm is built around. Use the mathematical and graphical representations we've used to understand systems for decades. They are well understood and most people can understand them with little to no additional education past what they learned in their business or engineering bachelors degrees.
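As a hedged sketch of what "FSM-based logic" can look like in a mainstream language (the order-workflow domain and all names here are invented for illustration): the transition table is plain data, so it can be rendered as the kind of diagram business personnel can review, and the step function is pure.

```java
import java.util.Map;

// Hypothetical order workflow as an explicit finite state machine.
enum OrderState { NEW, PAID, SHIPPED, CANCELLED }
enum OrderEvent { PAY, SHIP, CANCEL }

final class OrderFsm {
    // Transition table: (state, event) -> next state. Pure data, which is
    // what makes it trivially visualisable as a graph.
    private static final Map<OrderState, Map<OrderEvent, OrderState>> TABLE = Map.of(
        OrderState.NEW,  Map.of(OrderEvent.PAY,  OrderState.PAID,
                                OrderEvent.CANCEL, OrderState.CANCELLED),
        OrderState.PAID, Map.of(OrderEvent.SHIP, OrderState.SHIPPED,
                                OrderEvent.CANCEL, OrderState.CANCELLED));

    // Pure step function: no side effects. Illegal transitions leave the
    // state unchanged (a design choice; throwing is the other option).
    static OrderState step(OrderState s, OrderEvent e) {
        return TABLE.getOrDefault(s, Map.of()).getOrDefault(e, s);
    }
}
```

Note there is still an if hiding inside `getOrDefault` — the point is not that branching disappears, but that the set of legal transitions is stated in one auditable place.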
In a very real way, it's all conditional jumps in assembly, and everything you've learned to make programming easier by letting you express your high-level intent more directly is just sugar. It might even help some or most of the time. But what you're actually doing is creating a bunch of branches and loops, and as much as the high level stuff might help, you really shouldn't forget this is the medium you actually work in.
Most professions have a healthy respect for the base materials they work with no matter how high the abstractions and structures they build with it go. Artists know their paints, stone, metal, etc. Engineers know their materials as well. They build by taking the advantages of each material into consideration, not assuming that it's no longer relevant to their job because they get to just work in I-beams. Programmers would do well to adopt a healthy respect for their base materials, and it seems like often we don't.
Programming is primarily about _managing complexity_. Other than perhaps mathematics, there is no other field that must, should and can apply the amount of abstraction on top of abstraction as software engineers do.
It is a must, because decades of business requirement built on top each other without understanding the whole _is complex_. Writing a JIT-compiler that can routinely change between interpreting code and executing it natively, a database optimizing queries, a mathematical library using some fancy algorithm are all complex, in a way that is _not reducible_.
Complexity easily outgrows even the whole of our mathematics: we can’t prove any non-trivial property of a program in general — the halting problem, etc.
So all in all, no, we can only respect our “base materials” by finding the proper abstraction for the problem, as our base material is complexity itself. It might be for loops and ifs, but it may very well be a DSL built on top of who knows how many layers, because only at that abstraction level can we even start to map the problem domain to human-consumable ideas.
> high level intent is just sugar
I disagree with your use of "just" here. It's common for programmers to dismiss the importance of syntax, but syntax and notation are the interface and UX between the language semantics and your brain. It's no less important to get this right. There's a half-joke that continental Europe was able to rapidly advance in calculus beyond Britain due to the superiority of Leibniz's notation.
> healthy respect for their base materials
What's unique about computers is the theoretical guarantee that the base does not matter. Whether by lambda calculus, register machines or swarms of soldier crabs running from birds in specially designed enclosures, we're fine as long as we appropriately encode our instructions.
> bunch of branches and loops
You could also easily say it's just a bunch of state machines. We outsource tedious complexity and fine details to compiler abstractions. They track things for us that have analogues in logical deduction so that as long we follow their restrictions, we get a few guarantees. When say, writing asynchronous non-deterministic distributed programs, you'll need all the help you can get.
Even designing close to the machine (which most programs will not need) by paying attention to cache use, memory layout, branch divergence or using SIMD remain within the realm of abstractions.
Agree with this a lot. In other words, don’t be too clever. That leads to an unmaintainable codebase. There is value in simplicity and not overly using abstractions that take you farther and farther away from bare metal.
> Personally I disagree. We should be using state machines and pure functions. If+for loops are just what's easiest to express in the major languages of today. they are no more or less computationally expensive but due to lack of tooling they are often cheaper to write.
In my experience, programming with primitives and basic flow control operations frequently tends to be at least an order of magnitude faster than more complex state management paradigms. Compilers are very good at optimizing that style of code. Loops often get unrolled, and the branch predictor is kept happy. A good compiler may use vector instructions.
In many cases with cold code it flat out doesn't matter, the CPU is plenty fast, but when it does matter, explicit if-and-for code absolutely mops the floor with the alternatives.
Will your manually written imperative code beat an SQL database for the same task? Because it uses a very very high level description on what it has to do and chooses an appropriate algorithm for that for you.
You can optimize one specific query (quite painstakingly, I believe) to beat a general db, but it is very far from clear that “for loops will beat higher level abstractions”, imo.
At least in the C++ space, stuff like boost-sml is able to produce assembly that is often as fast or occasionally faster than handwritten if or switch based FSM logic.
for loop goes brrrrrrrrrrrrrrrrrrr!
I inherited a piece of code that was designed as a state machine and the state machine design basically made the project become pure technical debt as requirements increased over time and engineers had to make it work with the state machine model that had been chosen when the project started.
If the project had instead been designed to have less unnecessary state and “transitions” it would have been a lot easier to make changes.
All those ideas sound good by themselves but they are really bad for “defensive” coding. Good luck selling a project to refactor something when it’s going to take multiple people-years. Once you’ve made the mistake of committing to an inflexible design it’s either that, replace/ignore the thing, or deal with low productivity as long as the thing exists.
> state machine model that had been chosen when the project started.
so was the chosen model the issue or choosing a state machine model at all?
_FSM and pure functional kernel based designs you can get just as clear logic that is expressible not just to the programmer but also to business personnel_
I’ve yet to see a secretary who could “return a new table state such that documents become a right fold, binding together a map of the composition of signing and copy routines over documents” instead of “get these documents from the table, copy, sign and put them back in a binder”. This is a nonsense some of us _want_ to believe in, but it is not true.
_> We should be using state machines and pure functions._
For problems where those are the right tools, sure. But they aren't the right tools for all problems any more than ifs and for loops are.
I've always felt like explicit state machines are the sledge hammer you break out when you can't find any good abstraction to encapsulate the logic. As an intermediate step for parsers it's pretty powerful, but it's not something I want in my hand written code if I have any alternatives.
I've written many SM implementations, from ones used in low-level protocols up to business process middleware, so I have the experience and know how incredibly useful and powerful they are when used in the right place. But to use them everywhere, especially in some math algos, would be an insanity worse than goto.
do you know some nice examples of medium complexity that you can show for inspiration?
Really, there are no if+for, just compare and jump. Why don't we use what the metal uses, instead of these "expressive abstractions"?
If+for have no deeper foundational significance in the construction of programs or computations, literally, than say a lambda function. But because the latter is unfamiliar, it's spoken about in the same manner you present: as if it is some highly abstract, complicating, high-level feature (when truly that take is just baloney).
> as if it is some highly abstract, complicating, high-level feature
But symbol calculus _is_ a highly abstract, complicating, high-level system assembled out of more reality-based systems beneath it. If it seems simple to you, you're just under the curse of knowledge.
I'm not sure what a "symbol calculus" is. Do you mean "lambda calculus"? I think that's a lot less complicated and abstract than a fabled machine with an infinite tape that's controlled by a transition graph of symbolic states. :)
And I don't know what a "reality-based system" is.
You're confusing computer science with writing software to run on hardware. Coding requires the latter, but not the former.
Oof, and to think one could helpfully inform the other! :) To be clear, I am a programmer, not a computer scientist, so my opinions are based off writing code and managing teams to write code that works, and less so about abstract machinations of computer scientific thinking.
_> Why don't we use what the metal uses, instead of these "expressive abstractions"?_
Because the "expressive abstractions" are much easier to reason about and save programmers lots of mental effort. And, as I have commented upthread, ifs and for loops are by no means the only such abstractions.
_> because the latter is unfamiliar, it's spoken about in the same manner you present: as if it is some highly abstract, complicating, high-level feature_
If expressing your program in the lambda calculus is easy for you to reason about and saves you enough mental effort, go for it. But don't expect your code to be readable or understandable by many other people. The reason why ifs and for loops (and other "expressive abstractions", since as I have said, those are by no means the only ones) are ubiquitous in programs is that they are easy for _lots_ of programmers to reason about. Whereas the lambda calculus is only easy for a very small subset of programmers to reason about.
I'm not suggesting people "express programs in the lambda calculus", but instead that incorporating a philosophy of functions and their composition is not at all a bizarre concept.
Loops and ifs work miserably with any hierarchical data, compared to recursion, for example. A lot of the world's data is hierarchical.
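A hedged illustration of that point (the tree type here is invented): summing a tree is a few lines of recursion whose shape mirrors the data, while the loop-only version has to maintain an explicit stack by hand.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Toy tree node for illustration.
final class Node {
    final int value;
    final List<Node> children;
    Node(int value, List<Node> children) { this.value = value; this.children = children; }

    // Recursive version: the call stack mirrors the hierarchy.
    static int sumRec(Node n) {
        int total = n.value;
        for (Node c : n.children) total += sumRec(c);
        return total;
    }

    // Loop-only version: the hierarchy forces us to manage a stack ourselves.
    static int sumLoop(Node root) {
        int total = 0;
        Deque<Node> stack = new ArrayDeque<>();
        stack.push(root);
        while (!stack.isEmpty()) {
            Node n = stack.pop();
            total += n.value;
            for (Node c : n.children) stack.push(c);
        }
        return total;
    }
}
```

Both compute the same thing, of course — the loop version is essentially the recursion with its implicit stack made explicit, which is the sense in which "it's all ifs and loops" and the sense in which it isn't.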
We now have a chicken-egg problem. I can freely admit that for+if is easy for programmers to understand solely because of how we are educated, and not due to any Bruce Lee hocus pocus about simplicity or fundamentalism, as so many others here suggest.
A programmer who, say, learned from SICP first would find a for loop awkward and bizarre when you could "just" tail-recurse.
_> Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar._
You could just as well say that ifs and for loops are just sugar for gotos and all programming is just gotos.
The reason ifs and for loops are used instead of gotos is that they are very useful abstractions that are easy to reason about and save the programmer lots of mental effort. But they are not the only such abstractions.
To the extent that other abstractions can create problems, it's not because they're really just sugar for ifs and for loops, it's because they are not well crafted abstractions so they are not easy to reason about and don't really save the programmer any mental effort. But there are plenty of abstractions other than ifs and for loops that _are_ well crafted and _do_ save the programmer mental effort, in many cases lots of it.
> expressivity and convenience are antipatterns and don't actually simplify things
What does this mean? At this point I can't guess whether you're more likely to prefer Haskell or Python. Python is the language that lets you put your if statements and loops wherever, this is expressivity and convenience, and thus an antipattern? This seems contradictory.
What this really means that once you get to a certain level of experience and seniority the actual code you write in the small is pretty much irrelevant. What matters is the overall architecture of what you’re building: the data structures and APIs. The challenge becomes about working together as a team, and with other teams within your ecosystem. Sophisticated language constructs don’t actually help you solve those problems, and imo their benefit is marginal where they do help.
This!
With an additional level of abstraction you could say “goto jumps”, but “ifs and loops” give a commonly understandable logic for everyone; deeper abstractions increase reading complexity, while higher abstraction is achieved via functions and overall architecture.
Scaling up those “ifs and loops” is the challenge, as a team or alone, with the common goal being to keep the software under control.
>I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.
For assimilation to happen, the state-of-the-art solution also has to result in a net gain over the existing solution, and the higher the differential in complexity between the two, the bigger that gain has to be.
Functionally, this looks like selling off your client base and closing the doors rather than rewriting internal tools that mostly still work.
There's no "rubber meets the road" in OPs position because there's no cost in their calculations.
And, these days, "net gain" in an industrial context is typically tied to almost no aspect of the quality of the code, but more to the management of large groups of people, as well as stability and growth of the business.
It’s just so sad that the lowest common denominator has become the standard now. When I first learnt Clojure it entirely changed the way I think and solve problems. The code really was elegant.
Obviously, it can only be read by someone who can also understand programming beyond ifs and fors. That’s a non-starter in most environments - enterprise or otherwise.
Funny enough, I see most innovations coming from consultants who do the same work for multiple companies and realise the repeating patterns and extract an abstraction.
Ifs and fors are the easiest concepts to explain to non-developers, so it makes sense to start there.
I wouldn't say that they are the standard now, but using and mastering all features in a language is hard.
Add to that design patterns, classes and code layout, and it becomes a full-time job to keep up.
I have been in contact with code most of my professional life, but I'm still not comfortable writing large amounts of code. The simple reason is that I don't do it full-time.
Here are the features in C# just to illustrate how complex a programming language is.
https://www.c-sharpcorner.com/article/c-sharp-versions/
I agree that modern software development for non-full time developers is brutal, several of my data scientist colleagues are remarkably brilliant people and yet they struggle with some more advanced programming concepts.
However, most of those features are relatively standard and are more conceptual than syntactical in nature. Bashing people because they don't know stuff is stupid and counterproductive, but I shouldn't be forced to code in the programming equivalent of roman numerals just because someone else can't be properly fucked to understand lambdas or interfaces or generics, all stuff that's literally covered in your run-of-the-mill CS 101 course.
It all boils down to having enough humility and empathy to understand that other people are not, in fact, the same as us.
That’s what I mean. Each language has a different syntax and it takes a while to gain mastery over it and that’s fine. But there are concepts that are immediately portable to multiple language.
Meh, most business logic really is "if" and "foreach". That doesn't mean it's not complicated, as you say. But all that category theory stuff, at the end of the day, really is just an attempt to manage the complexity of lots of conditional branching and iteration.
_> they're "goto tools"_
I see what you did there.
> debugging hairy, nested, imperative conditionals with for-loops that are off-by-1
Isn't this just a complicated case of ifs and fors?
Sure, but the word "just" is doing a lot of work. It seems to be where a code base of uncomplicated ifs and fors leads to asymptotically, because both of those constructs don't prohibit you in any way from sneaking in "just another line" to either of them.
> is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years
How convenient that the software industry is about 40 years old. So these ideas should "break through" this invisible arbitrary corporate waiting area into the limelight any day now, right?
They are breaking through. For instance, Python just got (a very limited form of) pattern matching. It has been A Good Idea since at least the 1970s. Garbage collection has been known since the 1950s but only became "mainstream" with Java.
The significance of off-by-one errors depends on whether you predictably get a runtime error on the first execution, or not.
There are a lot of sophisticated problems dealing with enterprise software even in higher languages and even in situations where things like performance or resource usage is not a primary concern.
For example, how do you handle authorization, logging, and how do you make the code maintainable? That's a really tough problem that requires a lot of thought about the overall system design.
And of course it's always a lie to say that performance and resource usage aren't a concern -- they're not a concern until they are.
I'd never seen that meme before, but there's a Bruce Lee quote (maybe apocryphal) that has had a lot of meaning for me ever since I got over the same hump myself.
“Before I learned the art, a punch was just a punch, and a kick, just a kick. After I learned the art, a punch was no longer a punch, a kick, no longer a kick. Now that I understand the art, a punch is just a punch and a kick is just a kick.”
makes me think of the Buddhist "Before enlightenment: chop wood, carry water. After enlightenment: chop wood, carry water".
not my favorite source since it doesn't go into the 'scaling the mountain' bit, but every source that talks about that part seems to be... eh:
https://buddhism.stackexchange.com/questions/15921/what-is-t...
I always took that to mean something like this:
Q: What is the difference between an enlightened person and an ordinary person?
A: There is no difference, but only the enlightened person knows this.
“Before enlightenment: if then. After enlightenment: if then.”
Before agile: it's not done yet
After agile: it's not done yet
I may be biased because I spent too much time arguing about this,
but you hear those $fancy_principles / fp / hard oop / "clean code" evangelists, and then you go to any repo of real world software - linux, compilers, kubernetes, git, blablabla and everywhere you see for loops, goto, ladders of if statements
I mean, that's quite a cherry-picked selection. It's all C and Go, and they somewhat lack higher-level abstraction capabilities. On the other hand, compilers are often written in C++, or are bootstrapped in the very same language. Also, what about OpenJDK, Elasticsearch, all the web servers running the whole cloud? Linux might be the underlying OS, but it's not the program actually doing business logic. If anything, it's just another layer of abstraction.
Also, let’s be honest, C does all these “virtual method” magic on a per-project basis which will not be understood by any tool ever (all those function pointers to whole new implementations passed from God knows who, with barely any typing). At least FP and OOP knowledge somewhat transfers and is queryable by tools.
- it's okay to use printing instead of a debugger
- you don't need to write classes for everything
- it's okay to write something in a more verbose way to make it clear for other people
- your tools don't need to be perfect to get things done
I need more of these, maybe some that aren't as reductionist as Carmack's original post.
This post by Aaron Patterson made me realize it's fine to debug with print statements
https://tenderlovemaking.com/2016/02/05/i-am-a-puts-debugger...
In rare cases I pull out a real debugger, but most of the time the right prints in the right places are just as good. I can also iterate much faster because I'm not jumping between the code and the debugger, or pulling the debugger out of the loop it's stuck in.
The really useful 'print' debug lines might be kept at additional 'debug' flag levels. This is particularly useful for tracing the behavior of programs that no longer have debug symbols and are running in real environments.
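A minimal sketch of that flag-gated style in Python, using the stdlib `logging` module (the function and logger names here are just illustrative):

```python
import logging

# Leave these "prints" in permanently; they only fire when the
# debug flag raises the log level to DEBUG.
log = logging.getLogger("myapp")

def find_root(x, tolerance=1e-9):
    """Newton's method for sqrt(x), with trace output at DEBUG level."""
    guess = x / 2.0 or 1.0
    while abs(guess * guess - x) > tolerance:
        log.debug("guess=%r residual=%r", guess, guess * guess - x)
        guess = (guess + x / guess) / 2.0
    return guess

if __name__ == "__main__":
    # a --debug command-line flag would map to this call
    logging.basicConfig(level=logging.DEBUG)
    print(find_root(2.0))
```

The trace lines cost almost nothing when the level is above DEBUG, so they can stay in production code.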
Funny, I posted this on another HN thread [0] recently, but it's perfectly relevant again:
_We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time._
T. S. Eliot - Little Gidding
[0]
https://news.ycombinator.com/item?id=29043941
How then am I so different from the first men through this way?
Like them, I left a settled life, I threw it all away
To seek a Northwest Passage
at the call of many men
To find there but the road back home again
Stan Rogers, "Northwest Passage"
Sounds good. I still remember my BASIC.
That image[a] is so funny because it is so _true_:
Anyone who has ever written any software has felt like the unenlightened half-person in the middle of that distribution at least once -- for example, when learning how to code with a different approach in a new language.
I have felt like that more than once. Everyone here has, I suspect.
--
[a]
https://twitter.com/nice_byte/status/1466940940229046273/pho...
This is basically the progression in _any_ field or craft. As one becomes more experienced, one basically figures out the optimized stuff needed to successfully solve the problem in hand, successful in the sense that it both meets current requirements and enables future changes or evolution.
I describe this path of discovery as:
beginner: function over form
intermediate: form over function
transcendence: form _is_ function
However, I will disagree that coding is just about ifs and for loops. To me, coding, programming, software development, or whatever you want to call it is about three things: how to use a computer to do something, communication between people (including your future self), and how to think about a domain. “ifs and for loops” does not capture this.
I gravitate towards useful abstraction. I've written the same loops (find an element, find a maximum, do something to every element, filter elements, etc.) 10 000s of times by now. It got old after the first 100.
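In Python, for instance, each of those hand-rolled loops has a one-line built-in equivalent (a generic sketch, nothing project-specific assumed):

```python
xs = [3, 1, 4, 1, 5, 9, 2, 6]

# find an element matching a predicate (None if absent)
first_even = next((x for x in xs if x % 2 == 0), None)

# find a maximum
biggest = max(xs)

# do something to every element
doubled = [x * 2 for x in xs]          # or: list(map(lambda x: x * 2, xs))

# filter elements
odds = [x for x in xs if x % 2 == 1]   # or: list(filter(lambda x: x % 2, xs))

print(first_even, biggest, doubled, odds)
```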
Except it never can be that simple because many systems have millions of ifs when the entire system is considered. So architecture and parallel evolution of those millions of ifs becomes an entire field of study :)
Do you write your code in notepad or pico, or use brainfuck as your primary programming language, or copy your code to a new folder for version control? Those things are all the simplest in their tool category.
The more experience you have, the more you see _through_ the code to what is underneath.
"...there's way too much information to decode the Matrix. You get used to it, though. Your brain does the translating. I don't even see the code."
That's why I am seeing DB schemas without indexes lately. At the end of the day, people with this kind of thinking make others fix the broken code they leave behind.
As a senior software engineer I had to spend a lot of time at night fixing code written by junior devs and interns.
The code that the company and devs (the "just ifs and loops" gang) were proud of was a pain in the ass for me, so I quit the job entirely and do freelancing these days.
I tried to explain what was wrong and why, but no one would listen; they all believed they were 10x developers. Life lesson learned: never try to correct an idiot.
Here are some of the practices they followed
In my twenties, I wanted to use all the cool PL techniques: meta-programming, reflection, monads, macros, type-level programming, etc. I'm getting older and now I want my programs to be 90–95% functions, arrays, structs, enums, and loops and only parsimoniously throw in more advanced language features.
I also like "all web development is basically fancy string concatenation", and as a web dev I feel seen.
It may optimize down to string concatenation, or better yet streaming output, but you really shouldn't be doing that concatenation directly.
https://glyph.twistedmatrix.com/2008/06/data-in-garbage-out....
It's almost like "a series of tubes" that do nothing more than squirt text around.
"U+1F346 is an eggplant" and other oddities from the tubes we build modern society on
Pretty much. Many frameworks exist to make it safe.
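A minimal sketch of why those frameworks exist, using Python's stdlib `html.escape` (the template style here is illustrative, not any particular framework's API):

```python
from html import escape

user_input = '<script>alert("pwned")</script>'

# naive concatenation: the browser will execute the payload
unsafe = "<p>Hello, " + user_input + "</p>"

# what sane template engines do by default: escape on output
safe = "<p>Hello, {}</p>".format(escape(user_input))

print(unsafe)
print(safe)
```

Auto-escaping by default is the whole reason "fancy string concatenation" needs a framework at all.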
and file access
Well, quite: if you add that code must run sequentially it's the Boehm-Jacopini theorem:
https://en.wikipedia.org/wiki/Structured_program_theorem
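The folk proof is fun to sketch: model the flowchart's instruction pointer as a plain variable, and any tangle of ifs and gotos becomes one while loop full of ifs. A toy Python version of Euclid's gcd written that way (the label names are made up for illustration):

```python
def gcd(a, b):
    """Euclid's algorithm as a flowchart: one while loop, ifs, and a 'pc'."""
    pc = "test"
    while pc != "halt":
        if pc == "test":
            pc = "step" if b != 0 else "halt"
        elif pc == "step":
            a, b = b, a % b   # the only "real work" in the flowchart
            pc = "test"
    return a

print(gcd(48, 18))  # → 6
```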
If you want to go even lower than that, coding is basically just saying _yes_ or _no_ in response to a _yes_ or a _no_.
Sure, that's oversimplifying it, but that's the smallest unit of information being changed during computation.
But yes, once you learn the basics that are shared between most programming languages and don't get distracted by the nuances, it doesn't take that long to pick up a different language. Being _proficient_ is of course another question, but achieving a basic understanding shouldn't take all that long when you just focus on how to if-else, how to set up loops, and how to assign variables.
All chemistry is just sharing electrons.
yes, at the lowest level it's binary code, 0s and 1s
I like to think of myself, actually, as not a code writer, but an author. I just use zeros and ones instead of words 'cause words will let you down.
I like that. Someone on my team referred to us as (data) plumbers and I thought that was a pretty fitting analogy too.
Free Guy(2021)
https://www.imdb.com/title/tt6264654/
And really, aren't loops just ifs and jumps under the hood? So coding is just ifs
Indeed. Jump on zero and integer manipulation are sufficient for turing-completeness. For example:
https://en.wikipedia.org/wiki/Counter_machine
All you really need is `mov`.
https://github.com/xoreaxeaxeax/movfuscator
https://www.youtube.com/watch?v=R7EEoWg6Ekk
All you need is `mov`
`mov`
`mov` is all you need.
SUBLEQ :)
https://en.wikipedia.org/wiki/One-instruction_set_computer
λx. x S K
https://en.wikipedia.org/wiki/Combinatory_logic#One-point_ba...
It’s a pretty fun and enlightening exercise to sit down and work out how you’d implement all the usual instructions when all you have is Subtract And Branch If Negative.
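As a sketch of how little machinery that is, here is a complete SUBLEQ interpreter plus a three-instruction program that adds two numbers (the memory layout is my own choice for the example):

```python
def run_subleq(mem):
    """Each instruction is three cells a, b, c:
    mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through.
    A negative jump target halts the machine."""
    pc = 0
    while 0 <= pc < len(mem):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Data lives at addresses 9 (A=7), 10 (B=5), 11 (Z=0, a scratch cell).
# The program computes B += A, then halts.
A, B, Z = 9, 10, 11
mem = [A, Z, 3,    # Z -= A        (Z becomes -7; result <= 0, so jump to 3)
       Z, B, 6,    # B -= Z        (B becomes 5 - (-7) = 12)
       Z, Z, -1,   # Z -= Z = 0    (result <= 0, jump to -1: halt)
       7, 5, 0]    # data cells: A, B, Z
run_subleq(mem)
print(mem[B])  # → 12
```

Addition has to be built from two subtractions through a scratch cell, which is exactly the flavor of puzzle the parent comment is recommending.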
it’s NANDs all the way down
It's NP-junctions all the way down.
all the way up, shurely?
and capacitors
Wires figure in there somewhere.
And lack of wires in the other places.
https://youtu.be/otAcmD6XEEE?t=1965
Sequence, selection, iteration (or recursion if available).
Note that the child overlooked the assumption that it's sequential.
iterating and conditionals?
This is funny, but it's like saying "Math is basically pluses and minuses".
I see coding as playing with hardware without having to use soldering iron.
Well, it is just counting natural numbers and making up placeholders for whenever a subtraction or division wouldn't work out.
A marathon is just putting one foot in front of the other, after all. What’s the big deal? I mean a two year old can do that, and they can’t even handle logic yet.
As a marathon runner, people sometimes ask "How does a person run for 3 hours O.O" and well it's about the same as running for 5 minutes except you don't stop.
Simple != easy
Strength training is equally simple. Just keep increasing the weight you lift.
Being thin is also easy, just don’t eat too much.
They said simple, not easy.
Being pedantic is simple, you just say things that don't contain detectable errors.
(I'm agreeing with you about "simple, not easy")
Mathematics is "just" set theory and everything else, including arithmetic, can be built on top of that.
Honestly that's not a great example given that you can't understand ZFC until you already know enough set theory to understand the motivations for ZFC.
Maybe math is just equations and sets.
It is just sets! Set theories like Zermelo–Fraenkel can be the foundations for all of mathematics.
https://en.wikipedia.org/wiki/Set_theory
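The "numbers are just sets" claim can even be demoed directly: the von Neumann construction defines 0 as the empty set and n+1 as n ∪ {n}. A toy sketch with Python frozensets:

```python
def succ(n):
    """von Neumann successor: n + 1 = n ∪ {n}."""
    return frozenset(n) | {frozenset(n)}

zero = frozenset()    # 0 = {}
one = succ(zero)      # 1 = {0}
two = succ(one)       # 2 = {0, 1}
three = succ(two)     # 3 = {0, 1, 2}

# the "value" of each ordinal is just how many sets it contains
print(len(zero), len(one), len(two), len(three))  # → 0 1 2 3

# and < is just set membership
print(one in three)  # → True
```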
I think he knows.
Math is just writing on a blackboard. Equations and sets optional.
Math is just addition and subtraction.
Coding is just copy and pasting boilerplate code and googling how to make it work.
Copilot. FTFY.
Reminds me of a coworker who said that the only data structure you need is an array, since it can be used to mimic every other data type.
Somewhere in the 90s a colleague asked about why those pesky types we had to use while we had void *. We fought, I won. (but somehow we still ended up with Javascript 20 years later)
Or, coding is basically just folds and filters.
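That one is literally true in the fold direction: map and filter are both special cases of a fold. A quick Python sketch with `functools.reduce` (these list-appending folds are for illustration, not efficiency):

```python
from functools import reduce

def map_via_fold(f, xs):
    # each step appends f(x) to the accumulator
    return reduce(lambda acc, x: acc + [f(x)], xs, [])

def filter_via_fold(pred, xs):
    # each step keeps x only when the predicate holds
    return reduce(lambda acc, x: acc + [x] if pred(x) else acc, xs, [])

print(map_via_fold(lambda x: x * x, [1, 2, 3]))         # → [1, 4, 9]
print(filter_via_fold(lambda x: x % 2 == 0, range(6)))  # → [0, 2, 4]
```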
There’s a joke in the fp community I can’t find right now that describes the evolution of programs from imperative side-effectful statements to a for comprehension, with exception catching, that looks nearly identical.
Probably not the right one, but "The Evolution of a Haskell Programmer"[0] sounds like a similar idea which goes from a Freshman Haskell programmer's simple factorial to the abstract heights of a Post-doc Haskell programmer, then back down to a Tenured professor's simple factorial.
0 -
http://www.willamette.edu/~fruehr/haskell/evolution.html
?
UGH. Back in my day the only language was BASIC and we only had IF and GOTO. Dijkstra has made these children SOFT and I'd piss on his grave if I could be arsed to get out of my rocking chair.
What version of Basic were you using that lacked FOR? Even the shitty one crammed into my Sinclair ZX80 had FOR.
His mind will be blown to learn it all runs on protons, neutrons, and electrons.
Actually coding is just literary translation into a language spoken only by pedantic idiot savants
That's why it's pedantic idiot savants who tend to be the best coders
All life is just cytosine, guanine, adenine, and thymine.
Those mean nothing without the protein machinery that translates codons into other proteins, though. DNA and RNA without ribosomes is like a language without a compiler.
Umm.. what are those codons and other proteins made of? ...
Everything, everywhere is just entropy.
I remember explaining recursion to an aspiring programmer to apply to some tree node walking or something, and at some point it clicked! I saw the second it worked in the reflection in her eyes, they got big and lit up and there was this palpable sense of "a-ha" in the room! It was one of the coolest moments of my professional life.
But yeah, my kids (one of whom is picking up programming) would be right behind the "ifs and loops" statement.
Coding is basically just ifs and for loops.. But software engineering (or development) is much more than just coding.
Isn't software engineering pretty much ifs and loops too? Massively complicated ifs, and looping pretty much the same thing over and over again.
I like the phrasing "software engineering is programming integrated over time." It involves things like rollouts, migrations, deprecations, adapting to new requirements, and designing in such a way as to make all those easier.
coding is just ifs and for loops,
computer science is just types,
(to me) software engineering is a mix of those two
I like "Coding is to Software Engineering, as building a chair is to running a furniture factory". You need to know a fair amount about the former to excel at the latter - but the latter also requires project management, process and product design, collaboration, and communication skills.
It’s all just strings. Make the right strings happen in the right place and the computer does things.
Ifs and for loops are trash. Real programmers just write massive linear algebra operations that they can throw on a cluster of GPUs to get 50,000x parallelism. ;)
Coding is basically just ifs and _while_ loops.
Sounds like the Re-birth of the "Expert System". Now with neural networks ;)
Learning from Artificial Intelligence’s Previous Awakenings: The History of Expert Systems
https://ojs.aaai.org/index.php/aimagazine/article/view/2809
Coding is just electrons having fun.
And functional programming is just algebra. Ez Pz.
The way I interpret this is that the 'just ifs and for loops' is like Matrix rain code. In the beginning it looks like gibberish scrolling down the screen. When you master it, it's still gibberish scrolling down the screen, but it's simultaneously something else on another level as well.
I often find myself writing simple things with a compact-but-high-level conceptualization, and when someone else edits them it's clear they only saw the rain.
Wait till he reads about a Turing machine
Ahem, and _math_ in between. Quite a lot of it.
Calculating the square root of a number is ifs and loops but wouldn’t be much fun without the math.
you guys have for loops?
This was my first reaction too. Now I work in a (Swift) codebase where I could probably count the number of for loops we have on one hand. Pretty much everything is done through map/reduce/filter, etc.
At first, I thought it wasn't as readable, but now that I'm used to the syntax, I think it's much easier to parse what happening. I know what transformation is happening, and I know there shouldn't be any side-effects.
We used to. Now we mostly chain map functions. But there are those rare instances where we don't operate on iterables and still need to do something many times. I've seen people so weirded out by this that they prefer to _generate_ an iterable to work on rather than use a for loop.
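The two styles side by side, sketched in Python (a repeat-N-times task with no natural iterable in sight):

```python
import random

# plain for loop: just do the thing three times
rolls = []
for _ in range(3):
    rolls.append(random.randint(1, 6))

# the style described above: manufacture an iterable just to map over it
rolls2 = list(map(lambda _: random.randint(1, 6), range(3)))

print(len(rolls), len(rolls2))  # → 3 3
```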
Yeah, it's just branching and iteration. The kid's right.
All we need are functions!
Until he finds out about Promise.then chains. So, programming is basically just ifs, for loops, and then.
also "coding is basically over glorified plumbing".
~ my cynical coworkers
And thank goodness that it IS glorified!
and ops is custodial engineering for software. we mop up what leaks out after the plumbers are done
"construction is basically just concrete and steel" :)
Quite literally how I explain coding to people.
It’s basically that, plus resource management.
I describe coding as similar to a game of chess, in that it requires you to think several moves in advance.
Both involve filling up your mental stack.
my professor at uni always said that machine code is just 1's and 0's and if you can count to one you can understand computers
(with a smile on his face and shrugging his shoulders)
Thing with loops, they are complicated. Like infinity. I remember when someone taught me about multiple infinities. It is wonderful!
Building a house is just lumber and nails.
It really is. Everything complicated is either an interface or handling some edge case. Sound familiar?
For loops are basically ifs and gotos.
A Turing machine is basically just ifs in a loop.
for-switch, the weird uncle of the turing machine family.
Computers are just infinite tape marked out into squares... (Turing machine). A pyramid is just a stack of cubic rocks.
The earth is just a ball of mud
It's a ball bearing covered in dust
And people nagging you. No, clueless people nagging you with things like Jira, and idiots like Jeff Sutherland.
One time I told someone I work on web apps and they said “oh that’s just html and css that’s easy.”
It's nice to learn so much about someone so quickly.
And assignments.
Until you get to network programming, then it's "just open a socket"
It's just 0s and 1s, really.
I really wish it was as simple as that.
Add in some form of mutability and you’ve got yourself a Turing machine, yes.
haha, it is! it is a wonderful experience when you make that realization. That's the point with programming. Easy to learn. Difficult to master.
Amateur. We do branchless coding and unroll every loop here.
And c++ is basically just c with classes.
Computers are just thinking rocks.
> When we write programs that "learn", it turns out that we do and they don't.
Programmers are just linguists that use and make artificial languages, like Esperanto!
It’s just 0’s and 1’s, baby
And datastructures.
Humans are basically just ifs and for loops
The masses are just loops. The mindful are gets and loops. The ruling class are gets, sets, and loops. The elite are ifs, gets, sets, and loops.
With quantum irrationality and self induced sleep deprivation!
At least 5 ifs
Whys?
And print. Don't forget print
I wish I could have a son who understands Turing completeness without being explicitly told... I guess that's the power of the Carmack bloodline.
No, coding is just converting some data into other forms of data.
I once told a girlfriend that's what I did. It worked, she wasn't impressed with my job at all after that.
In the eyes of an 85 year old Italian woman I was a "writer"(or perhaps just "typist" - something could have been lost in translation).
Fair enough.
10 years on and I'm still converting relational data into HTML or JSON
I had a brief time in my career where we used a database that you would query and it just returned the html. It was amazing.
I kinda want to give you a hug and say it's all gonna be okay...
I squint at a screen and push buttons. But which buttons in which order matters.
We're data plumbers
Routing shit around the world…
git is just a graph, man
I think it's funny that it's kind of like a blockchain, and as soon as the business people and MBAs realize this, they're going to "evangelize this amazing application of blockchain to all code!" to death.
So Bitcoin is just Git with extra steps!
Github is just programmer Wordpress.
it's a chain. of blocks... hmmm.
Moronic nonsense, clear to anyone who has ever seen an enterprise spaghetti codebase.
I don't understand why there's a trend against functional programming. I started as a mechanical engineer and became a manager. I think I know a bit better than code monkeys.
All programs are by definition a combination of sequence, selection, and iteration.
There's no limit to how hard HN will fellate pg and Carmack.
“that happened” stories by egotistical parents who need to log off Twitter for their own mental health’s sake now make the top of HN
This is the most insightful thing I've ever seen on Twitter.
“coding is basically just ifs and for loops.”
Maybe he only knows that much, yet.