
RE: The Static File Startup

2022-03-19

It can be encouraging sometimes to hear stories of dysfunction at work; other times they hit a little too close to home. marginalia.nu¹ posted a satirical story that cut a bit too close to my own experiences. It seems only fair to pitch in a few instances of the kind of wrong-headed thinking that runs rampant in the industry.

1. The Static File Startup

A company spent months re-inventing cron. The stated goal of this system was to free it from the constraints of a 24-hour day, for example, should it ever need to operate off-planet. For bonus points, the new units of time and their intervals were codified into an elaborate type system that existed only at compile time. This meant that the system's actual inputs (hand-written JSON) were never validated against the type system at runtime.
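
To make that failure mode concrete, here is a minimal sketch (in TypeScript, with invented names rather than anything from the actual system) of how types that only exist at compile time wave hand-written JSON straight through:

```typescript
import { readFileSync } from "fs";

// The elaborate schedule types live only at compile time.
interface ScheduleEntry {
  task: string;
  intervalSols: number; // off-planet ambitions: intervals measured in sols
}

// The cast satisfies the compiler, but nothing checks the file's shape at
// runtime; a typo like "intervalSol", or a string where a number belongs,
// sails straight through.
const entries = JSON.parse(
  readFileSync("schedule.json", "utf8")
) as ScheduleEntry[];

for (const entry of entries) {
  // entry.intervalSols can be undefined here despite the declared type
  console.log(`${entry.task} every ${entry.intervalSols} sols`);
}
```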

A company utilized multiple Kubernetes instances to scale a web service that was served by the Flask development server. If you are unfamiliar, launching the Flask development server prints the following message:

```
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
```

A company decided they needed Kubernetes but also edge computing, so resolved to operate multiple nodes off of a collection of media PCs. The thinking went: Kubernetes is necessary for resiliency; resiliency is necessary because we are running on consumer-grade media PCs; and we are running on media PCs because we need a cluster of computers to run Kubernetes.

A company resolved to pick their technologies first and use those technologies to guide their data models. Event sourcing and Apache Pulsar will inform what data we collect and how we use it! The existing relational database was deemed too inefficient for these as-yet-undecided workloads. At the time the new technologies were adopted, the entire database was 1GB.

A company spent six figures per month on cloud computing to develop and operate a system with 600MB of data. The architecture used so many layers of indirection that it negated any possible value derived from running on a cloud platform.

Startups are weird, yes. More than that though, people are really, REALLY bad at computers. It is only through the fantastical amounts of computing power we throw at problems that any work gets done anymore. Abstractions, architectures, patterns, and technologies are all supposed to be tools that make a problem tractable. How much of your time is spent fighting the tools instead of making progress on the actual problems?