For those who don't know, the title is a reference to the book "Seeing Like a State", by James C. Scott.
"Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed is a book by James C. Scott critical of a system of beliefs he calls high modernism, that centers around confidence in the ability to design and operate society in accordance with scientific laws. It was released in March 1998, with a paperback version in February 1999. The book catalogues schemes which states impose upon populaces that are convenient for the state since they make societies 'legible', but are not necessarily good for the people. For example, census data, standardized weights and measures, and uniform languages make it easier to tax and control the population.
https://en.wikipedia.org/wiki/Seeing_Like_a_State
This becomes pretty damn obvious if you actually click on the link. But to be fair, who does that?
The reason they "See Like a State" is because it works better on average essentially. Despite the romanticism of "local community in tune with their needs". Just look at small town corruption for one.
The actual practice is more feudalism-lite, all allegiances and fiefdoms. That is why bureaucracy was an improvement: even with all of its pain-in-the-ass inanities and rigidities, fiefdom-seeking is lessened because officials have less discretion to abuse under defined rules.
Scott's agenda echoes bad ideas seen many times before. It seems to be the "anti-practical so they can call it intellectually pure" pseudo-intellectual legacy of philosophers idolizing the Greeks, who were largely being "paid" (even when technically slaves, their lifestyle was supported without toil) to kiss up to the aristocracy, promote its prestige, and look down upon anything practical.
That toxic anti-practical attitude keeps showing up in history: the "southern aristocrats" and old money distancing themselves from new money in the industrial revolution, because the aristocracy defined itself not by working but by fighting and owning. The split of natural philosophy away from philosophy and into science was a good thing, but it implicitly left philosophy with a "but not empirical thinking, as it has become too lowly and practical!" attitude, leaving it frankly tending towards the delusional. Yes, there is "philosophy of science", but the need for the qualifier is a hint that empiricism isn't an implicit norm there.
The article also misses the mark in claiming machine learning is a reversion. Even if it loses legibility, the fact that it is still based on data instead of direct personal whims is a net perk, even when the rules it operates by end up garbage-in, garbage-out. Its judgement can still be evaluated and adjusted, just like every other rule-based system.
Have you read the book? It's not about showing that states are 'wrong' or 'bad', it's about how measurement and standardization force changes that affect people, and what those changes are.
It has to in order to do things that scale.
I can buy wonderful handmade artisanal croissants at my local bakery, but they will never feed the number of people that Conagra can. (For example, they make Peter Pan peanut butter and everything else in your cabinet.)
You could host your email with a small family-owned and operated company, but it will never be as cheap as Gmail.
Imposing legibility is also a requirement for making a market. That's what makes products similar across different manufacturers/providers. Remember the bad old days when this programming language meant you had to use that database? Imposing legibility means that you can swap out one vendor for another! Suddenly, you have the freedom to choose.
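A minimal sketch of that freedom, with hypothetical vendor names (nothing here is a real library): once both providers fit the same legible interface, swapping one for the other touches a single line.

```typescript
// The legible, standardized surface: any vendor that fits this shape is interchangeable.
interface MailProvider {
  send(to: string, subject: string, body: string): Promise<void>;
}

// Two hypothetical vendors with different internals behind the shared interface.
class AcmeMail implements MailProvider {
  async send(to: string, subject: string, body: string): Promise<void> {
    console.log(`[acme] to=${to} subject=${subject} bytes=${body.length}`);
  }
}

class GlobexMail implements MailProvider {
  async send(to: string, subject: string, body: string): Promise<void> {
    console.log(`[globex] ${to} | ${subject} | ${body.length} bytes`);
  }
}

// Application code depends only on the interface, so the vendor is a one-line choice.
async function main(): Promise<void> {
  const mail: MailProvider = new AcmeMail(); // swap to `new GlobexMail()` without touching callers
  await mail.send("user@example.com", "hello", "standardization at work");
}

main();
```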
> Imposing legibility is also a requirement to making a market.
I agree with the idea but markets result in emergent legibility rather than imposed legibility.
An apt piece. These tendencies in Big Tech lay bare the mechanism that pushes towards legibility because it's literally easier to code. It's worth some thought for us programmers that we end up gravitating towards problems that can be solved by a tabulating machine (or a CRUD API).
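To make the CRUD point concrete, here's a toy sketch (hypothetical fields, no particular framework): the moment people are modelled as rows, anything that doesn't fit the columns simply doesn't exist for the system.

```typescript
// The legible view: whatever fits these columns is all the system can ever see.
interface UserRow {
  id: number;
  fullLegalName: string;   // no room for nicknames or names that change by context
  addressLine1: string;    // no room for "the third house past the well"
  occupationCode: string;  // a fixed taxonomy, chosen in advance
}

// A bare-bones in-memory CRUD store; a real one would be a database table with the same shape.
class UserStore {
  private rows = new Map<number, UserRow>();
  create(row: UserRow): void { this.rows.set(row.id, row); }
  read(id: number): UserRow | undefined { return this.rows.get(id); }
  update(row: UserRow): void { this.rows.set(row.id, row); }
  delete(id: number): void { this.rows.delete(id); }
}

// Everything the machine will ever know about this person is what survived the schema.
const store = new UserStore();
store.create({ id: 1, fullLegalName: "Jane Q. Example", addressLine1: "12 High St", occupationCode: "7212" });
```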
For the record, I'm strongly against these companies having free rein over markets and societies, but opportunistically sympathetic towards them sometimes counterbalancing _arbitrary_ government actions. Of course I'd prefer having sane governments controlled by the public, but that is a pipe dream in most countries.
And yet, very few people emigrate from highly legible countries like the United States and those of Western Europe in order to live in more informal ones. Meanwhile, many people around the world do choose to immigrate to those hyper-legible societies.
I don't think this is completely accurate. Really old republican democracies, mainly the US and Switzerland, seem to have less legibility than more modern, centralized and top-down states (model: France). There are more independent local institutions and old privileges of the populace that no autocratic government managed to remove. Things like guns, lack of compulsory IDs, local direct democracy, electing small local officials. Mind you, I'm not saying that the US is some civil rights paradise in practice (in fact, I'm happy to not be living there), just that it has more historical mess to counteract easy legibility and centralization.
I'd say that the Eastern Bloc tends towards even more legibility than Western Europe, because totalitarian regimes (and even earlier, reactionary monarchies) have destroyed organic, grassroots social institutions. People were taught to resist bureaucracy by petty cheating, not by conscious political action. (This is slowly changing, maybe.) Even more so, of course, mainland China.
In short: people immigrate to societies with high standard of living and rule of law. Both are _loosely correlated_ with legibility.
An ML-driven approach is only possible at large scale, and scale is only possible through legibility. But it’s the fate of all these legibility-imposers to move past legibility. They impose order on the world, and then they automate the order-imposing process, the order-imposer-refining process, and so on, until the end result is determined by a metis available to nobody.
This is a good explanation of how the process of instrumental rationality turned into a runaway positive feedback cycle of social irrationality.
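A toy sketch of that escalation (made-up numbers and thresholds, not anything from the article): a hand-written rule gets replaced by a fitted rule, then by a search over how the fitting is done, and the rule that actually ships was written by no one in particular.

```typescript
type Example = { amount: number; flagged: boolean };

// Layer 0: the hand-written, legible rule an analyst could explain.
const handRule = (amount: number): boolean => amount > 1000;

// Layer 1: automate the rule-making — fit the threshold to labeled data.
function fitThreshold(data: Example[], candidates: number[]): number {
  let best = candidates[0];
  let bestAcc = -1;
  for (const t of candidates) {
    const acc = data.filter(d => (d.amount > t) === d.flagged).length / data.length;
    if (acc > bestAcc) { bestAcc = acc; best = t; }
  }
  return best;
}

// Layer 2: automate the rule-maker-refining — a meta-search chooses which candidate grid to use.
function fitFitter(data: Example[], grids: number[][]): number {
  let bestThreshold = grids[0][0];
  let bestAcc = -1;
  for (const grid of grids) {
    const t = fitThreshold(data, grid);
    const acc = data.filter(d => (d.amount > t) === d.flagged).length / data.length;
    if (acc > bestAcc) { bestAcc = acc; bestThreshold = t; }
  }
  return bestThreshold;
}

const history: Example[] = [
  { amount: 200, flagged: false },
  { amount: 900, flagged: false },
  { amount: 1600, flagged: true },
  { amount: 4000, flagged: true },
];

// The rule actually deployed was chosen by a search over searches, not written by anyone.
const threshold = fitFitter(history, [[500, 1000, 1500], [250, 750, 1250, 3000]]);
console.log(`deployed rule: flag if amount > ${threshold}; hand rule on 1600: ${handRule(1600)}`);
```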