💾 Archived View for beyondneolithic.life › graeber_archive › utopia_of_rules › introduction.gmi captured on 2024-03-21 at 15:21:35. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
Nowadays, nobody talks much about bureaucracy. But in the middle of the last century, particularly in the late sixties and early seventies, the word was everywhere. There were sociological tomes with grandiose titles like A General Theory of Bureaucracy,[1] The Politics of Bureaucracy,[2] or even The Bureaucratization of the World,[3] and popular paperback screeds with titles like Parkinson’s Law,[4] The Peter Principle,[5] or Bureaucrats: How to Annoy Them.[6]
There were Kafkaesque novels, and satirical films. Everyone seemed to feel that the foibles and absurdities of bureaucratic life and bureaucratic procedures were one of the defining features of modern existence, and as such, eminently worth discussing. But since the seventies, there has been a peculiar falling off.
Consider, for example, the following graph, which charts how frequently the word “bureaucracy” appears in books written in English over the last century and a half. A subject of only moderate interest until the postwar period, it shoots into prominence starting in the fifties and then, after a pinnacle in 1973, begins a slow but inexorable decline.
Frequency of the word "Bureaucracy" (Image)
Why? Well, one obvious reason is that we’ve just become accustomed to it. Bureaucracy has become the water in which we swim. Now let’s imagine another graph, one that simply documented the average number of hours per year a typical American — or a Briton, or an inhabitant of Thailand — spent filling out forms or otherwise fulfilling purely bureaucratic obligations. (Needless to say, the overwhelming majority of these obligations no longer involve actual, physical paper.) This graph would almost certainly show a line much like the one in the first graph — a slow climb until 1973. But here the two graphs would diverge — rather than falling back, the line would continue to climb; if anything, it would do so more precipitously, tracking how, in the late twentieth century, middle-class citizens spent ever more hours struggling with phone trees and web interfaces, while the less fortunate spent ever more hours of their day trying to jump through the increasingly elaborate hoops required to gain access to dwindling social services.
I imagine such a graph would look something like this:
Hours per year spent on paperwork (Image)
This is not a graph of hours spent on paperwork, just of how often the word “paperwork” has been used in English-language books. But absent time machines that could allow us to carry out a more direct investigation, this is about as close as we’re likely to get.
By the way, most similar paperwork-related terms yield almost identical results:
Frequency of paperwork-related terms over time (Image)
The essays assembled in this volume are all, in one way or another, about this disparity. We no longer like to think about bureaucracy, yet it informs every aspect of our existence. It’s as if, as a planetary civilization, we have decided to clap our hands over our ears and start humming whenever the topic comes up. Insofar as we are even willing to discuss it, it’s still in the terms popular in the sixties and early seventies. The social movements of the sixties were, on the whole, left-wing in inspiration, but they were also rebellions against bureaucracy, or, to put it more accurately, rebellions against the bureaucratic mindset, against the soul-destroying conformity of the postwar welfare states. In the face of the gray functionaries of both state-capitalist and state-socialist regimes, sixties rebels stood for individual expression and spontaneous conviviality, and against (“rules and regulations, who needs them?”) every form of social control.
With the collapse of the old welfare states, all this has come to seem decidedly quaint. As the language of antibureaucratic individualism has been adopted, with increasing ferocity, by the Right, which insists on “market solutions” to every social problem, the mainstream Left has increasingly reduced itself to fighting a kind of pathetic rearguard action, trying to salvage remnants of the old welfare state: it has acquiesced in — often even spearheaded — attempts to make government efforts more “efficient” through the partial privatization of services and the incorporation of ever-more “market principles,” “market incentives,” and market-based “accountability processes” into the structure of the bureaucracy itself.
The result is a political catastrophe. There’s really no other way to put it. What is presented as the “moderate” Left solution to any social problem — and radical left solutions are, almost everywhere now, ruled out tout court — has invariably come to be some nightmare fusion of the worst elements of bureaucracy and the worst elements of capitalism. It’s as if someone had consciously tried to create the least appealing possible political position. It is a testimony to the genuine lingering power of leftist ideals that anyone would even consider voting for a party that promoted this sort of thing — because surely, if they do, it’s not because they actually think these are good policies, but because these are the only policies anyone who identifies themselves as left-of-center is allowed to set forth.
Is it any wonder, then, that every time there is a social crisis, it is the Right, rather than the Left, which becomes the venue for the expression of popular anger?
The Right, at least, has a critique of bureaucracy. It’s not a very good one. But at least it exists. The Left has none. As a result, when those who identify with the Left do have anything negative to say about bureaucracy, they are usually forced to adopt a watered-down version of the right-wing critique.[7]
This right-wing critique can be disposed of fairly quickly. It has its origins in nineteenth-century liberalism.[8] The story that emerged in middle-class circles in Europe in the wake of the French Revolution was that the civilized world was experiencing a gradual, uneven, but inevitable transformation away from the rule of warrior elites, with their authoritarian governments, their priestly dogmas, and their caste-like stratification, to one of liberty, equality, and enlightened commercial self-interest. The mercantile classes in the Middle Ages undermined the old feudal order like termites munching from below — termites, yes, but the good kind. The pomp and splendor of the absolutist states that were being overthrown were, according to the liberal version of history, the last gasps of the old order, which would end as states gave way to markets, religious faith to scientific understanding, and fixed orders and statuses of Marquis and Baronesses and the like to free contracts between individuals.
The emergence of modern bureaucracies was always something of a problem for this story because it didn’t really fit. In principle, all these stuffy functionaries in their offices, with their elaborate chains of command, should have been mere feudal holdovers, soon to go the way of the armies and officer corps that everyone was expecting to gradually become unnecessary as well. One need only flip open a Russian novel from the late nineteenth century: all the scions of old aristocratic families — in fact, almost everyone in those books — had been transformed into either military officers or civil servants (no one of any notice seems to do anything else), and the military and civil hierarchies seemed to have nearly identical ranks, titles, and sensibilities. But there was an obvious problem. If bureaucrats were just holdovers, why was it that everywhere — not just in backwaters like Russia but in booming industrial societies like England and Germany — every year seemed to bring more and more of them?
There followed stage two of the argument, which was, in its essence, that bureaucracy represents an inherent flaw in the democratic project.[9] Its greatest exponent was Ludwig von Mises, an exiled Austrian aristocrat, whose 1944 book Bureaucracy argued that by definition, systems of government administration could never organize information with anything like the efficiency of impersonal market pricing mechanisms. However, extending the vote to the losers of the economic game would inevitably lead to calls for government intervention, framed as high-minded schemes for trying to solve social problems through administrative means. Von Mises was willing to admit that many of those who embraced such solutions were entirely well-meaning; however, their efforts could only make matters worse. In fact, he felt they would ultimately end up destroying the political basis of democracy itself, since the administrators of social programs would inevitably form power-blocs far more influential than the politicians elected to run the government, and support ever-more radical reforms. Von Mises argued that as a result, the social welfare states then emerging in places like France or England, let alone Denmark or Sweden, would, within a generation or two, inevitably lead to fascism.
In this view, the rise of bureaucracy was the ultimate example of good intentions run amok. Ronald Reagan probably made the most effective popular deployment of this line of thought with his famous claim that, “the nine most terrifying words in the English language are, ‘I’m from the government and I’m here to help.’ ”
The problem with all this is that it bears very little relation to what actually happened. First of all, historically, markets simply did not emerge as some autonomous domain of freedom independent of, and opposed to, state authorities. Exactly the opposite is the case. Historically, markets are generally either a side effect of government operations, especially military operations, or were directly created by government policy. This has been true at least since the invention of coinage, which was first created and promulgated as a means of provisioning soldiers; for most of Eurasian history, ordinary people used informal credit arrangements, and physical money (gold, silver, bronze) and the kind of impersonal markets it made possible remained mainly an adjunct to the mobilization of legions, sacking of cities, extraction of tribute, and disposing of loot. Modern central banking systems were likewise first created to finance wars. So there’s one initial problem with the conventional history. There’s another even more dramatic one. While the idea that the market is somehow opposed to and independent of government has been used at least since the nineteenth century to justify laissez-faire economic policies designed to lessen the role of government, such policies never actually have that effect. English liberalism, for instance, did not lead to a reduction of state bureaucracy, but the exact opposite: an endlessly ballooning array of legal clerks, registrars, inspectors, notaries, and police officials who made the liberal dream of a world of free contract between autonomous individuals possible. It turned out that maintaining a free market economy required a thousand times more paperwork than a Louis XIV-style absolutist monarchy.
This apparent paradox — that government policies intended to reduce government interference in the economy actually end up producing more regulations, more bureaucrats, and more police — can be observed so regularly that I think we are justified in treating it as a general sociological law. I propose to call it “the iron law of liberalism”:
_The Iron Law of Liberalism_ states that any market reform, any government initiative intended to reduce red tape and promote market forces will have the ultimate effect of increasing the total number of regulations, the total amount of paperwork, and the total number of bureaucrats the government employs.
French sociologist Emile Durkheim was already observing this tendency at the turn of the twentieth century,[10] and eventually, it became impossible to ignore. By the middle of the century, even right-wing critics like von Mises were willing to admit — at least in their academic writing — that markets don’t really regulate themselves, and that an army of administrators was indeed required to keep any market system going. (For von Mises, that army only became problematic when it was deployed to alter market outcomes that caused undue suffering for the poor.)[11] Still, right-wing populists soon realized that, whatever the realities, making a target of bureaucrats was almost always effective. Hence, in their public pronouncements, the condemnation of what U.S. governor George Wallace, in his 1968 campaign for President, first labeled “pointy-headed bureaucrats” living off hardworking citizens’ taxes, was unrelenting.
Wallace is actually a crucial figure here. Nowadays, Americans mainly remember him as a failed reactionary, or even a snarling lunatic: the last die-hard Southern segregationist standing with an axe outside a public school door. But in terms of his broader legacy, he could just as well be represented as a kind of political genius. He was, after all, the first politician to create a national platform for a kind of right-wing populism that was soon to prove so infectious that by now, a generation later, it has come to be adopted by pretty much everyone, across the political spectrum. As a result, amongst working-class Americans, government is now generally seen as being made up of two sorts of people: “politicians,” who are blustering crooks and liars but can at least occasionally be voted out of office, and “bureaucrats,” who are condescending elitists almost impossible to uproot. There is assumed to be a kind of tacit alliance between what came to be seen as the parasitical poor (in America usually pictured in overtly racist terms) and the equally parasitical self-righteous officials whose existence depends on subsidizing the poor using other people’s money. Again, even the mainstream Left — or what is supposed to pass for a Left these days — has come to offer little more than a watered-down version of this right-wing language. Bill Clinton, for instance, had spent so much of his career bashing civil servants that after the Oklahoma City bombing, he actually felt moved to remind Americans that public servants were human beings unto themselves, and promised never to use the word “bureaucrat” again.[12]
In contemporary American populism — and increasingly, in the rest of the world as well — there can be only one alternative to “bureaucracy,” and that is “the market.” Sometimes this is held to mean that government should be run more like a business. Sometimes it is held to mean we should simply get the bureaucrats out of the way and let nature take its course, which means letting people attend to the business of their lives untrammelled by endless rules and regulations imposed on them from above, and so allowing the magic of the marketplace to provide its own solutions.
“Democracy” thus came to mean the market; “bureaucracy,” in turn, government interference with the market; and this is pretty much what the word continues to mean to this day.
It wasn’t always so. The rise of the modern corporation, in the late nineteenth century, was largely seen at the time as a matter of applying modern, bureaucratic techniques to the private sector — and these techniques were assumed to be required, when operating on a large scale, because they were more efficient than the networks of personal or informal connections that had dominated a world of small family firms. The pioneers of these new, private bureaucracies were the United States and Germany, and Max Weber, the German sociologist, observed that Americans in his day were particularly inclined to see public and private bureaucracies as essentially the same animal:
The body of officials actively engaged in a “public” office, along with the respective apparatus of material implements and the files, make up a “bureau.” In private enterprise, “the bureau” is often called “the office” …
It is the peculiarity of the modern entrepreneur that he conducts himself as the “first official” of his corporation, in the very same way in which the ruler of a specifically modern bureaucratic state spoke of himself as “the first servant” of the state. The idea that the bureau activities of the state are intrinsically different in character from the management of private economic offices is a continental European notion and, by way of contrast, is totally foreign to the American way.[13]
In other words, around the turn of the century, rather than anyone complaining that government should be run more like a business, Americans simply assumed that governments and business — or big business, at any rate — were run the same way.
True, for much of the nineteenth century, the United States was largely an economy of small family firms and high finance — much like Britain’s at the time. But America’s advent as a power on the world stage at the end of the century corresponded to the rise of a distinctly American form: corporate — bureaucratic — capitalism. As Giovanni Arrighi pointed out, an analogous corporate model was emerging at the same time in Germany, and the two countries — the United States and Germany — ended up spending most of the first half of the next century battling over which would take over from the declining British Empire and establish its own vision for a global economic and political order. We all know who won. Arrighi makes another interesting point here. Unlike the British Empire, which had taken its free market rhetoric seriously, eliminating its own protective tariffs with the famous Anti–Corn Law Bill of 1846, neither the German nor the American regime had ever been especially interested in free trade. The Americans in particular were much more concerned with creating structures of international administration. The very first thing the United States did, on officially taking over the reins from Great Britain after World War II, was to set up the world’s first genuinely planetary bureaucratic institutions in the United Nations and the Bretton Woods institutions — the International Monetary Fund, World Bank, and GATT, later to become the WTO. The British Empire had never attempted anything like this. They either conquered other nations, or traded with them. The Americans attempted to administer everything and everyone.
British people, I’ve observed, are quite proud that they are not especially skilled at bureaucracy; Americans, in contrast, seem embarrassed by the fact that on the whole, they’re really quite good at it.[14] It doesn’t fit the American self-image. We’re supposed to be self-reliant individualists. (This is precisely why the right-wing populist demonization of bureaucrats is so effective.) Yet the fact remains that the United States is — and for well over a century has been — a profoundly bureaucratic society. The reason it is so easy to overlook is that most American bureaucratic habits and sensibilities — from the clothing to the language to the design of forms and offices — emerged from the private sector. When novelists and sociologists described the “Organization Man,” or “the Man in the Gray Flannel Suit,” the soullessly conformist U.S. equivalent to the Soviet apparatchik, they were not talking about functionaries in the Department of Landmarks and Preservation or the Social Security Administration — they were describing corporate middle management. True, by that time, corporate bureaucrats were not actually being called bureaucrats. But they were still setting the standard for what administrative functionaries were supposed to be like.
The impression that the word “bureaucrat” should be treated as a synonym for “civil servant” can be traced back to the New Deal in the thirties, which was also the moment when bureaucratic structures and techniques first became dramatically visible in many ordinary people’s lives. But in fact, from the very beginning, Roosevelt’s New Dealers worked in close coordination with the battalions of lawyers, engineers, and corporate bureaucrats employed by firms like Ford, Coca Cola, or Procter & Gamble, absorbing much of their style and sensibilities, and — as the United States shifted to war footing in the forties — so did the gargantuan bureaucracy of the U.S. military. And, of course, the United States has never really gone off war footing ever since. Still, through these means, the word “bureaucrat” came to attach itself almost exclusively to civil servants: even if what they do all day is sit at desks, fill out forms, and file reports, neither middle managers nor military officers are ever quite considered bureaucrats. (Neither for that matter are police, or employees of the NSA.)
In the United States, the lines between public and private have long been blurry. The American military, for example, is famous for its revolving door — high-ranking officers involved in procurement regularly end up on the boards of corporations that operate on military contracts. On a broader level, the need to preserve certain domestic industries for military purposes, and to develop others, has allowed the U.S. government to engage in practically Soviet-style industrial planning without ever having to admit it’s doing so. After all, pretty much anything, from maintaining a certain number of steel plants, to doing the initial research to set up the Internet, can be justified on grounds of military preparedness. Yet again, since this kind of planning operates via an alliance between military bureaucrats and corporate bureaucrats, it’s never perceived as something bureaucratic at all.
Still, with the rise of the financial sector, things have reached a qualitatively different level — one where it is becoming almost impossible to say what is public and what is private. This is not just due to the much-noted outsourcing of one-time government functions to private corporations. Above all, it’s due to the way the private corporations themselves have come to operate.
Let me give an example. A few weeks ago, I spent several hours on the phone with Bank of America, trying to work out how to get access to my account information from overseas. This involved speaking to four different representatives, two referrals to nonexistent numbers, three long explanations of complicated and apparently arbitrary rules, and two failed attempts to change outdated address and phone number information lodged on various computer systems. In other words, it was the very definition of a bureaucratic runaround. (Neither was I able, when it was all over, to actually access my account.)
Now, there is not the slightest doubt in my mind that, were I to actually locate a bank manager and demand to know how such things could happen, he or she would immediately insist that the bank was not to blame — that it was all an effect of an arcane maze of government regulations. However, I am equally confident that, were it possible to investigate how these regulations came about, one would find that they were composed jointly by aides to legislators on some banking committee and lobbyists and attorneys employed by the banks themselves, in a process greased by generous contributions to the coffers of those same legislators’ reelection campaigns. And the same would be true of anything from credit ratings, insurance premiums, mortgage applications, to, for that matter, the process of buying an airline ticket, applying for a scuba license, or trying to requisition an ergonomic chair for one’s office in an ostensibly private university. The vast majority of the paperwork we do exists in just this sort of in-between zone — ostensibly private, but in fact entirely shaped by a government that provides the legal framework, underpins the rules with its courts and all of the elaborate mechanisms of enforcement that come with them, but — crucially — works closely with the private concerns to ensure that the results will guarantee a certain rate of private profit.
In cases like this the language we employ — derived as it is from the right-wing critique — is completely inadequate. It tells us nothing about what is actually going on.[15]
Consider the word “deregulation.” In today’s political discourse, “deregulation” is — like “reform” — almost invariably treated as a good thing. Deregulation means less bureaucratic meddling, and fewer rules and regulations stifling innovation and commerce. This usage puts those on the left-hand side of the political spectrum in an awkward position, since opposing deregulation — even pointing out that it was an orgy of this very “deregulation” that led to the banking crisis of 2008 — seems to imply a desire for more rules and regulations, and therefore, more gray men in suits standing in the way of freedom and innovation and generally telling people what to do.
But this debate is based on false premises. Let’s go back to banks. There’s no such thing as an “unregulated” bank. Nor could there be. Banks are institutions to which the government has granted the power to create money — or, to be slightly more technical about it, the right to issue IOUs that the government will recognize as legal tender, and, therefore, accept in payment of taxes and to discharge other debts within its own national territory. Obviously no government is about to grant anyone — least of all a profit-seeking firm — the power to create as much money as they like under any circumstances. That would be insane. The power to create money is one that, by definition, governments can only grant under carefully circumscribed (read: regulated) conditions. And indeed this is what we always find: government regulates everything from a bank’s reserve requirements to its hours of operation; how much it can charge in interest, fees, and penalties; what sort of security precautions it can or must employ; how its records must be kept and reported; how and when it must inform its clients of their rights and responsibilities; and pretty much everything else.
So what are people actually referring to when they talk about “deregulation”? In ordinary usage, the word seems to mean “changing the regulatory structure in a way that I like.” In practice this can refer to almost anything. In the case of airlines or telecommunications in the seventies and eighties, it meant changing the system of regulation from one that encouraged a few large firms to one that fostered carefully supervised competition between midsize firms. In the case of banking, “deregulation” has usually meant exactly the opposite: moving away from a situation of managed competition between midsized firms to one where a handful of financial conglomerates are allowed to completely dominate the market. This is what makes the term so handy. Simply by labeling a new regulatory measure “deregulation,” you can frame it in the public mind as a way to reduce bureaucracy and set individual initiative free, even if the result is a fivefold increase in the actual number of forms to be filled in, reports to be filed, rules and regulations for lawyers to interpret, and officious people in offices whose entire job seems to be to provide convoluted explanations for why you’re not allowed to do things.[16]
This process — the gradual fusion of public and private power into a single entity, rife with rules and regulations whose ultimate purpose is to extract wealth in the form of profits — does not yet have a name. That in itself is significant. These things can happen largely because we lack a way to talk about them. But one can see its effects in every aspect of our lives. It fills our days with paperwork. Application forms get longer and more elaborate. Ordinary documents like bills or tickets or memberships in sports or book clubs come to be buttressed by pages of legalistic fine print.
I’m going to make up a name. I’m going to call this the age of “total bureaucratization.” (I was tempted to call this the age of “predatory bureaucratization” but it’s really the all-encompassing nature of the beast I want to highlight here.) It had its first stirrings, one might say, just at the point where public discussion of bureaucracy began to fall off in the late seventies, and it began to get seriously under way in the eighties. But it truly took off in the nineties.
In an earlier book, I suggested that the fundamental historical break that ushered in our current economic regime occurred in 1971, the date that the U.S. dollar went off the gold standard. This is what paved the way first for the financialization of capitalism, but ultimately, for much more profound long-term changes that I suspect will ultimately spell the end of capitalism entirely. I still think that. But here we are speaking of much more short-term effects. What did financialization mean for the deeply bureaucratized society that was postwar America?[17]
I think what happened is best considered as a kind of shift in class allegiances on the part of the managerial staff of major corporations, from an uneasy, de facto alliance with their own workers, to one with investors. As John Kenneth Galbraith long ago pointed out, if you create an organization geared to produce perfumes, dairy products, or aircraft fuselages, those who make it up will, if left to their own devices, tend to concentrate their efforts on producing more and better perfumes, dairy products, or aircraft fuselages, rather than thinking primarily of what will make the most money for the shareholders. What’s more, since for most of the twentieth century, a job in a large bureaucratic mega-firm meant a lifetime promise of employment, everyone involved in the process — managers and workers alike — tended to see themselves as sharing a certain common interest in this regard, over and against meddling owners and investors. This kind of solidarity across class lines even had a name: it was called “corporatism.” One mustn’t romanticize it. It was among other things the philosophical basis of fascism. Indeed, one could well argue that fascism simply took the idea that workers and managers had common interests, that organizations like corporations or communities formed organic wholes, and that financiers were an alien, parasitical force, and drove them to their ultimate, murderous extreme. Even in its more benign social democratic versions, in Europe or America, the attendant politics often came tinged with chauvinism[18] — but they also ensured that the investor class was always seen as to some extent outsiders, against whom white-collar and blue-collar workers could be considered, at least to some degree, to be united in a common front.
From the perspective of sixties radicals, who regularly watched antiwar demonstrations attacked by nationalist teamsters and construction workers, the reactionary implications of corporatism appeared self-evident. The corporate suits and the well-paid, Archie Bunker elements of the industrial proletariat were clearly on the same side. Unsurprising then that the left-wing critique of bureaucracy at the time focused on the ways that social democracy had more in common with fascism than its proponents cared to admit. Unsurprising, too, that this critique seems utterly irrelevant today.[19]
What began to happen in the seventies, and paved the way for what we see today, was a kind of strategic pivot of the upper echelons of U.S. corporate bureaucracy — away from the workers, and towards shareholders, and eventually, towards the financial structure as a whole. The mergers and acquisitions, corporate raiding, junk bonds, and asset stripping that began under Reagan and Thatcher and culminated in the rise of private equity firms were merely some of the more dramatic early mechanisms through which this shift of allegiance worked itself out. In fact, there was a double movement: corporate management became more financialized, but at the same time, the financial sector became corporatized, with investment banks, hedge funds, and the like largely replacing individual investors. As a result the investor class and the executive class became almost indistinguishable. (Think here of the term “financial management,” which came to refer simultaneously to how the highest ranks of the corporate bureaucracy ran their firms, and how investors managed their portfolios.) Before long, heroic CEOs were being lionized in the media, their success largely measured by the number of employees they could fire. By the nineties, lifetime employment, even for white-collar workers, had become a thing of the past. When corporations wished to win loyalty, they increasingly did it by paying their employees in stock options.[20]
At the same time, the new credo was that everyone should look at the world through the eyes of an investor — that’s why, in the eighties, newspapers began firing their labor reporters, but ordinary TV news reports came to be accompanied by crawls at the bottom of the screen displaying the latest stock quotes. The common cant was that through participation in personal retirement funds and investment funds of one sort or another, everyone would come to own a piece of capitalism. In reality, the magic circle was only really widened to include the higher paid professionals and the corporate bureaucrats themselves.
Still, that extension was extremely important. No political revolution can succeed without allies, and bringing along a certain portion of the middle class — and, even more crucially, convincing the bulk of the middle classes that they had some kind of stake in finance-driven capitalism — was critical. Ultimately, the more liberal members of this professional-managerial elite became the social base for what came to pass as “left-wing” political parties, as actual working-class organizations like trade unions were cast into the wilderness. (Hence, the U.S. Democratic Party, or New Labour in Great Britain, whose leaders engage in regular ritual acts of public abjuration of the very unions that have historically formed their strongest base of support.) These were of course people who already tended to work in thoroughly bureaucratized environments, whether schools, hospitals, or corporate law firms. The actual working class, who bore a traditional loathing for such characters, either dropped out of politics entirely, or were increasingly reduced to casting protest votes for the radical Right.[21]
This was not just a political realignment. It was a cultural transformation. And it set the stage for the process whereby the bureaucratic techniques (performance reviews, focus groups, time allocation surveys …) developed in financial and corporate circles came to invade the rest of society — education, science, government — and eventually, to pervade almost every aspect of everyday life. One can best trace the process, perhaps, by following its language. There is a peculiar idiom that first emerged in such circles, full of bright, empty terms like vision, quality, stakeholder, leadership, excellence, innovation, strategic goals, or best practices. (Much of it traces back to “self-actualization” movements like Lifespring, Mind Dynamics, and EST, which were extremely popular in corporate boardrooms in the seventies, but it quickly became a language unto itself.) Now, imagine it would be possible to create a map of some major city, and then place one tiny blue dot on the location of every document that uses at least three of these words. Then imagine that we could watch it change over time. We would be able to observe this new corporate bureaucratic culture spread like blue stains in a petri dish, starting in the financial districts, on to boardrooms, then government offices and universities, then, finally, engulfing any location where any number of people gather to discuss the allocation of resources of any kind at all.
For all its celebration of markets and individual initiative, this alliance of government and finance often produces results that bear a striking resemblance to the worst excesses of bureaucratization in the former Soviet Union or former colonial backwaters of the Global South. There is a rich anthropological literature, for instance, on the cult of certificates, licenses, and diplomas in the former colonial world. Often the argument is that in countries like Bangladesh, Trinidad, or Cameroon, which hover between the stifling legacy of colonial domination and their own magical traditions, official credentials are seen as a kind of material fetish — magical objects conveying power in their own right, entirely apart from the real knowledge, experience, or training they’re supposed to represent. But since the eighties, the real explosion of credentialism has been in what are supposedly the most “advanced” economies, like the United States, Great Britain, or Canada. As one anthropologist, Sarah Kendzior, puts it:
“The United States has become the most rigidly credentialised society in the world,” write James Engell and Anthony Dangerfield in their 2005 book Saving Higher Education in the Age of Money. “A BA is required for jobs that by no stretch of imagination need two years of full-time training, let alone four.”
The promotion of college as a requirement for a middle-class life … has resulted in the exclusion of the non-college educated from professions of public influence. In 1971, 58 percent of journalists had a college degree. Today, 92 percent do, and at many publications, a graduate degree in journalism is required — despite the fact that most renowned journalists have never studied journalism.[22]
Journalism is one of many fields of public influence — including politics — in which credentials function as de facto permission to speak, rendering those who lack them less likely to be employed and less able to afford to stay in their field. Ability is discounted without credentials, but the ability to purchase credentials rests, more often than not, on family wealth.[23]
One could repeat the story in field after field, from nurses to art teachers, physical therapists to foreign policy consultants. Almost every endeavor that used to be considered an art (best learned through doing) now requires formal professional training and a certificate of completion, and this seems to be happening, equally, in both the private and public sectors, since, as already noted, in matters bureaucratic, such distinctions are becoming effectively meaningless. While these measures are touted — as are all bureaucratic measures — as a way of creating fair, impersonal mechanisms in fields previously dominated by insider knowledge and social connections, the effect is often the opposite. As anyone who has been to graduate school knows, it’s precisely the children of the professional-managerial classes, those whose family resources make them the least in need of financial support, who best know how to navigate the world of paperwork that enables them to get said support.[24] For everyone else, the main result of one’s years of professional training is to ensure that one is saddled with such an enormous burden of student debt that a substantial chunk of any subsequent income one will get from pursuing that profession will henceforth be siphoned off, each month, by the financial sector. In some cases, these new training requirements can only be described as outright scams, as when lenders, and those prepared to set up the training programs, jointly lobby the government to insist that, say, all pharmacists be henceforth required to pass some additional qualifying examination, forcing thousands already practicing the profession into night school, which, as these pharmacists know, many will only be able to afford with the help of high-interest student loans.[25] By doing this, lenders are in effect legislating themselves a cut of most pharmacists’ subsequent incomes.[26]
The latter might seem an extreme case, but in its own way it’s paradigmatic of the fusion of public and private power under the new financial regime. Increasingly, corporate profits in America are not derived from commerce or industry at all, but from finance — which means, ultimately, from other people’s debts. These debts do not just happen by accident. To a large degree, they are engineered — and by precisely this kind of fusion of public and private power. The corporatization of education; the resulting ballooning of tuitions as students are expected to pay for giant football stadiums and similar pet projects of executive trustees, or to contribute to the burgeoning salaries of ever-multiplying university officials; the increasing demands for degrees as certificates of entry into any job that promises access to anything like a middle-class standard of living; resulting rising levels of indebtedness — all these form a single web. One result of all this debt is to render the government itself the main mechanism for the extraction of corporate profits. (Just think, here, of what happens if one tries to default on one’s student loans: the entire legal apparatus leaps into action, threatening to seize assets, garnish wages, and apply thousands of dollars in additional penalties.) Another is to force the debtors themselves to bureaucratize ever-increasing dimensions of their own lives, which have to be managed as if they were themselves a tiny corporation measuring inputs and outputs and constantly struggling to balance its accounts.
It’s also important to emphasize that while this system of extraction comes dressed up in a language of rules and regulations, in its actual mode of operation, it has almost nothing to do with the rule of law. Rather, the legal system has itself become the means for a system of increasingly arbitrary extractions. As the profits from banks and credit card companies derive more and more from “fees and penalties” levied on their customers — so much so that those living check to check can regularly expect to be charged eighty dollars for a five-dollar overdraft — financial firms have come to play by an entirely different set of rules. I once attended a conference on the crisis in the banking system where I was able to have a brief, informal chat with an economist for one of the Bretton Woods institutions (probably best I not say which). I asked him why everyone was still waiting for even one bank official to be brought to trial for any act of fraud leading up to the crash of 2008.
OFFICIAL: Well, you have to understand the approach taken by U.S. prosecutors to financial fraud is always to negotiate a settlement. They don’t want to have to go to trial. The upshot is always that the financial institution has to pay a fine, sometimes in the hundreds of millions, but they don’t actually admit to any criminal liability. Their lawyers simply say they are not going to contest the charge, but if they pay, they haven’t technically been found guilty of anything.
ME: So you’re saying if the government discovers that Goldman Sachs, for instance, or Bank of America, has committed fraud, they effectively just charge them a penalty fee.
OFFICIAL: That’s right.
ME: So in that case … okay, I guess the real question is this: has there ever been a case where the amount the firm had to pay was more than the amount of money they made from the fraud itself?
OFFICIAL: Oh no, not to my knowledge. Usually it’s substantially less.
ME: So what are we talking here, 50 percent?
OFFICIAL: I’d say more like 20 to 30 percent on average. But it varies considerably case by case.
ME: Which means … correct me if I’m wrong, but doesn’t that effectively mean the government is saying, “you can commit all the fraud you like, but if we catch you, you’re going to have to give us our cut”?
OFFICIAL: Well, obviously I can’t put it that way myself as long as I have this job …
And of course, the power of those same banks to charge account-holders eighty bucks for an overdraft is enforced by the same court system content to merely collect a piece of the action when the bank itself commits fraud.
Now, on one level, this might just seem like another example of a familiar story: the rich always play by a different set of rules. If the children of bankers can regularly get off the hook for carrying quantities of cocaine that would almost certainly have earned them decades in a federal penitentiary if they happened to be poor or Black, why should things be any different when they grow up to become bankers themselves? But I think there is something deeper going on here, and it turns on the very nature of bureaucratic systems. Such institutions always create a culture of complicity. It’s not just that some people get to break the rules — it’s that loyalty to the organization is to some degree measured by one’s willingness to pretend this isn’t happening. And insofar as bureaucratic logic is extended to the society as a whole, all of us start playing along.
This point is worth expanding on. What I am saying is that we are not just looking at a double standard, but a particular kind of double standard typical of bureaucratic systems everywhere. All bureaucracies are to a certain degree utopian, in the sense that they propose an abstract ideal that real human beings can never live up to. Take the initial point about credentialism. Sociologists since Weber always note that it is one of the defining features of any bureaucracy that those who staff it are selected by formal, impersonal criteria — most often, some kind of written test. (That is, bureaucrats are not, say, elected like politicians, but neither should they get the job just because they are someone’s cousin.) In theory they are meritocracies. In fact everyone knows the system is compromised in a thousand different ways. Many of the staff are in fact there just because they are someone’s cousin, and everybody knows it. The first criterion of loyalty to the organization becomes complicity. Career advancement is not based on merit, and not even based necessarily on being someone’s cousin; above all, it’s based on a willingness to play along with the fiction that career advancement is based on merit, even though everyone knows this not to be true.[27] Or with the fiction that rules and regulations apply to everyone equally, when, in fact, they are often deployed as a means for entirely arbitrary personal power.
This is how bureaucracies have always tended to work. But for most of history, this fact has only been important for those who actually operated within administrative systems: say, aspiring Confucian scholars in medieval China. Most everyone else didn’t really have to think about organizations very often; typically, they only encountered them every few years when it came time to register their fields and cattle for the local tax authorities. But as I’ve pointed out, the last two centuries have seen an explosion of bureaucracy, and the last thirty or forty years in particular have seen bureaucratic principles extended to every aspect of our existence. As a result, this culture of complicity has come to spread as well. Many of us actually act as if we believe that the courts really are treating the financial establishment as it should be treated, that they are even dealing with it too harshly; and that ordinary citizens really do deserve to be penalized a hundred times more harshly for an overdraft. As whole societies have come to represent themselves as giant credentialized meritocracies, rather than systems of arbitrary extraction, everyone duly scurries about trying to curry favor by pretending they actually believe this to be true.
So: what would a left-wing critique of total, or predatory, bureaucratization look like?
I think the story of the Global Justice Movement provides a hint — because it was a movement that, rather to its own surprise, discovered this was what it was about. I remember this quite well because I was deeply involved in the movement at the time. Back in the 1990s, “globalization,” as touted by journalists like Thomas Friedman (but really, by the entire journalistic establishment in the United States and most of it in other wealthy countries) was portrayed as an almost natural force.
Technological advances — particularly the Internet — were knitting the world together as never before, increased communication was leading to increased trade, and national borders were rapidly becoming irrelevant as free trade treaties united the globe into a single world market. In political debates of the time in the mainstream media, all of this was discussed as such a self-evident reality that anyone who objected to the process could be treated as if they were objecting to basic laws of nature — they were flat-earthers, buffoons, the left-wing equivalents of Biblical fundamentalists who thought evolution was a hoax.
Thus when the Global Justice Movement started, the media spin was that it was a rearguard action of hoary, carbuncular leftists who wished to restore protectionism, national sovereignty, barriers to trade and communication, and, generally, to vainly stand against the Inevitable Tide of History. The problem with this was that it was obviously untrue. Most immediately, there was the fact that the protestors’ average age, especially in the wealthier countries, seemed to be about nineteen. More seriously, there was the fact that the movement was a form of globalization in itself: a kaleidoscopic alliance of people from every corner of the world, including organizations ranging from Indian farmers’ associations, to the Canadian postal workers’ union, to indigenous groups in Panama, to anarchist collectives in Detroit. What’s more, its exponents endlessly insisted that despite protestations to the contrary, what the media was calling “globalization” had almost nothing to do with the effacement of borders and the free movement of people, products, and ideas. It was really about trapping increasingly large parts of the world’s population behind highly militarized national borders within which social protections could be systematically withdrawn, creating a pool of laborers so desperate that they would be willing to work for almost nothing. Against it, they proposed a genuinely borderless world.
Obviously, these ideas’ exponents did not get to say any of this on TV or major newspapers — at least not in countries like America, whose media is strictly policed by its own internal corporate bureaucrats. Such arguments were, effectively, taboo. But we discovered that there was something we could do that worked almost as well. We could besiege the summits where the trade pacts were negotiated and the annual meetings of the institutions through which the terms of what was called globalization were actually concocted, encoded, and enforced. Until the movement came to North America with the siege of the World Trade Organization meeting in Seattle in November 1999 — and subsequent blockades against the IMF/World Bank meetings in Washington — most Americans simply had no idea that any of these organizations even existed. The actions operated like a magic charm that exposed everything that was supposed to be hidden: all we had to do was show up and try to block access to the venue, and instantly we revealed the existence of a vast global bureaucracy of interlocking organizations that nobody was supposed to really think about. And of course, at the same time, we would magically whisk into existence thousands of heavily armed riot police ready to reveal just what those bureaucrats were willing to unleash against anyone — no matter how nonviolent — who tried to stand in their way.
It was a surprisingly effective strategy. Within a matter of two or three years, we had sunk pretty much every proposed new global trade pact, and institutions like the IMF had been effectively expelled from Asia, Latin America, and, indeed, most of the world’s surface.[28]
The imagery worked because it showed everything people had been told about globalization to be a lie. This was not some natural process of peaceful trade, made possible by new technologies. What was being talked about in terms of “free trade” and the “free market” really entailed the self-conscious completion of the world’s first effective[29] planetary-scale administrative bureaucratic system. The foundations for the system had been laid in the 1940s, but it was only with the waning of the Cold War that they became truly effective. In the process, they came to be made up — like most other bureaucratic systems being created on a smaller scale at the same time — of such a thorough entanglement of public and private elements that it was often quite impossible to pull them apart — even conceptually. Let us think about it this way: At the top were the trade bureaucracies like the IMF, World Bank, WTO, and the G8, along with treaty organizations like NAFTA or the EU. These actually developed the economic — and even social — policies followed by supposedly democratic governments in the Global South. Just below were the large global financial firms like Goldman Sachs, Lehman Brothers, American International Group, or, for that matter, institutions like Standard & Poor’s. Below that came the transnational mega-corporations. (Much of what was being called “international trade” in fact consisted merely of the transfer of materials back and forth between different branches of the same corporation.) Finally, one has to include the NGOs, which in many parts of the world came to provide many of the social services previously provided by government, with the result that urban planning in a city in Nepal, or health policy in a town in Nigeria, might well have been developed in offices in Zurich or Chicago.
At the time, we didn’t talk about things in quite these terms — that “free trade” and “the free market” actually meant the creation of global administrative structures mainly aimed at ensuring the extraction of profits for investors, that “globalization” really meant bureaucratization. We often came close. But we rarely came right out and said it.
In retrospect, I think this is exactly what we should have emphasized. Even the emphasis on inventing new forms of democratic processes that was at the core of the movement — the assemblies, the spokescouncils, and so on — was, more than anything else, a way to show that people could indeed get on with one another — and even make important decisions and carry out complex collective projects — without anyone ever having to fill out a form, appeal a judgment, or threaten to phone security or the police.
The Global Justice Movement was, in its own way, the first major leftist antibureaucratic movement of the era of total bureaucratization. As such, I think it offers important lessons for anyone trying to develop a similar critique. Let me end by outlining three of them:
The armies of highly militarized police that appeared to attack the summit protestors were not some sort of weird side effect of “globalization.” Whenever someone starts talking about the “free market,” it’s a good idea to look around for the man with the gun. He’s never far away. Free-market liberalism of the nineteenth century corresponded with the invention of the modern police and private detective agencies,[30] and gradually, with the notion that those police had at least ultimate jurisdiction over virtually every aspect of urban life, from the regulation of street peddlers to noise levels at private parties, or even to the resolution of bitter fights with crazy uncles or college roommates. We are now so used to the idea that we at least could call the police to resolve virtually any difficult circumstance that many of us find it difficult to even imagine what people would have done before this was possible.[31] Because, in fact, for the vast majority of people throughout history — even those who lived in large cities — there were simply no authorities to call in such circumstances. Or, at least, no impersonal bureaucratic ones who were, like the modern police, empowered to impose arbitrary resolutions backed by the threat of force.
Here I think it is possible to add a kind of corollary to the Iron Law of Liberalism. History reveals that political policies that favor “the market” have always meant even more people in offices to administer things, but it also reveals that they mean an increase in the range and density of social relations that are ultimately regulated by the threat of violence. This obviously flies in the face of everything we’ve been taught to believe about the market, but if you observe what actually happens, it’s clearly true. In a sense, even calling this a “corollary” is deceptive, because these are really just two different ways of describing the same thing. The bureaucratization of daily life means the imposition of impersonal rules and regulations; impersonal rules and regulations, in turn, can only operate if they are backed up by the threat of force.[32] And indeed, in this most recent phase of total bureaucratization, we’ve seen security cameras, police scooters, issuers of temporary ID cards, and men and women in a variety of uniforms acting in either public or private capacities, trained in tactics of menacing, intimidating, and ultimately deploying physical violence, appear just about everywhere — even in places such as playgrounds, primary schools, college campuses, hospitals, libraries, parks, or beach resorts, where fifty years ago their presence would have been considered scandalous, or simply weird.
All this takes place as social theorists continue to insist that the direct appeal to force plays less and less of a factor in maintaining structures of social control.[33] The more reports one reads, in fact, of university students being tasered for unauthorized library use, or English professors being jailed and charged with felonies after being caught jaywalking on campus, the louder the defiant insistence that the kinds of subtle symbolic power analyzed by English professors are what’s really important. It begins to sound more and more like a desperate refusal to accept that the workings of power could really be so crude and simplistic as what daily evidence proves them to be.
In my own native New York, I have observed the endless multiplication of bank branches. When I was growing up, most bank offices were large, freestanding buildings, usually designed to look like Greek or Roman temples. Over the last thirty years, storefront branches of the same three or four megabanks have opened, it seems, on every third block in the more prosperous parts of Manhattan.
In the greater New York area there are now literally thousands of them, each one having replaced some earlier shop that once provided material goods and services of one sort or another. In a way these are the perfect symbols of our age: stores selling pure abstraction — immaculate boxes containing little but glass and steel dividers, computer screens, and armed security. They define the perfect point of conjuncture between guns and information, since that’s really all that’s there. And that conjuncture has come to provide the framework for almost every other aspect of our lives.
When we think about such matters at all, we generally act as if this is all simply an effect of technology: this is a world whisked into being by computers. It even looks like one. And indeed, all these new bank lobbies do bear a striking resemblance to the stripped-down virtual reality one often found in 1990s video games. It’s as if we have finally achieved the ability to make such virtual realities materialize, and in so doing, to reduce our lives, too, to a kind of video game, as we negotiate the various mazeways of the new bureaucracies. Since, in such video games, nothing is actually produced, it just kind of springs into being, and we really do spend our lives earning points and dodging people carrying weapons.
But this sense that we are living in a world created by computers is itself an illusion. To conclude that this was all an inevitable effect of technological development, rather than of social and political forces, would be making a terrible mistake. Here too the lessons of “globalization,” which was supposed to have been somehow created by the Internet, are critically important:
Just as what came to be called “globalization” was really a creation of new political alignments, policy decisions, and new bureaucracies — which were only later followed by physical technologies like containerized shipping, or the Internet — so the pervasive bureaucratization of everyday life made possible by the computers is not, itself, the result of technological development. Rather it’s the other way around. Technological change is simply not an independent variable. Technology will advance, and often in surprising and unexpected ways. But the overall direction it takes depends on social factors.
This is easy to forget because our immediate experience of everyday bureaucratization is entirely caught up in new information technologies: Facebook, smartphone banking, Amazon, PayPal, endless handheld devices that reduce the world around us to maps, forms, codes, and graphs. Still, the key alignments that made all this possible are precisely those that I have been describing in this essay, that first took place in the seventies and eighties, with the alliance of finance and corporate bureaucrats, the new corporate culture that emerged from it, and its ability to invade educational, scientific, and government circles in such a way that public and private bureaucracies finally merged together in a mass of paperwork designed to facilitate the direct extraction of wealth. This was not a product of new technologies. To the contrary, the appropriate technologies took decades to emerge. In the seventies, computers were still something of a joke. Banks and government offices were keen on putting them into service, but for most of those on the receiving end, they were the very definition of bureaucratic idiocy; whenever something went terribly, obviously wrong, the reaction was always to throw up one’s eyes and blame “some computer.” After forty years and the endless investment of research funding into information technologies, we have gotten to the point where the kinds of computers bankers employ, and provide, are our very definition of infallible, magical efficiency.
Consider the ATM. In the last thirty years, I can’t remember a single occasion on which I have asked an ATM for money and gotten an incorrect amount. Nor have I been able to find anyone I know who has. This is so true that in the wake of the 2000 U.S. presidential elections, when the public was being regaled with statistics on the 2.8 percent degree of error expected from this type of voting machine, or the 1.5 percent expected from that, some had the temerity to point out that in a country that defines itself as the world’s greatest democracy, where elections are our very sacrament, we seem to just accept that voting machines will regularly miscount the vote, while every day hundreds of millions of ATM transactions take place with an overall zero percent rate of error. What does this say about what really matters to Americans as a nation?
Financial technology, then, has gone from a running gag to something so reliable that it can form the assumed backbone of our social reality. You never have to think about whether the cash machine will dispense the correct amount of cash. If it’s working at all, it will not make a mistake. This gives financial abstractions an air of utter certainty — a “ready-to-hand” quality, as Martin Heidegger put it — such an essential part of the practical infrastructure of our daily projects and affairs that we never have to think about them as something in themselves at all. Meanwhile physical infrastructure like roads, escalators, bridges, and underground railways crumbles around us, and the landscape surrounding major cities is peppered with the futuristic visions of past generations now lying smelly, dirty, or abandoned. None of this just happened. It is, precisely, a matter of national priorities: the result of policy decisions that allocate funding for everything from landmark preservation to certain kinds of scientific research. This is the world that all those endless documents about “vision,” “quality,” “leadership,” and “innovation” have actually produced. Rather than causing our current situation, the direction that technological change has taken is itself largely a function of the power of finance.
The “self-actualization” philosophy from which most of this new bureaucratic language emerged insists that we live in a timeless present, that history means nothing, that we simply create the world around us through the power of the will. This is a kind of individualistic fascism. Around the time the philosophy became popular in the seventies, some conservative Christian theologians were actually thinking along very similar lines: seeing electronic money as a kind of extension of God’s creative power, which is then transformed into material reality through the minds of inspired entrepreneurs. It’s easy to see how this could lead to the creation of a world where financial abstractions feel like the very bedrock of reality, and so many of our lived environments look like they were 3-D-printed from somebody’s computer screen. In fact, the sense of a digitally generated world I’ve been describing could be taken as a perfect illustration of another social law — at least, it seems to me that it should be recognized as a law — that if one gives sufficient social power to a class of people holding even the most outlandish ideas, they will, consciously or not, eventually contrive to produce a world organized in such a way that living in it will, in a thousand subtle ways, reinforce the impression that those ideas are self-evidently true.
In the north Atlantic countries, all this is the culmination of a very long effort to transform popular ideas about the origins of value. Most Americans, for instance, used to subscribe to a rough-and-ready version of the labor theory of value. It made intuitive sense in a world where most people were farmers, mechanics, or shopkeepers: the good things in life were assumed to exist because people took the trouble to produce them; doing so was seen as involving both brain and muscle, usually, in roughly equal proportions. In the mid-nineteenth century even mainstream politicians would often use language that might seem to have been taken straight from Karl Marx. So Abraham Lincoln:
Labor is prior to, and independent of, capital. Capital is only the fruit of labor, and could never have existed if labor had not first existed. Labor is the superior of capital, and deserves much the higher consideration.[34]
The rise of bureaucratic capitalism in the Gilded Age was accompanied by a self-conscious effort, on the part of the new tycoons of the day, to put this kind of language aside, and to promulgate what was considered at the time a bold new philosophy — steel magnate Andrew Carnegie spoke of it as “The Gospel of Wealth” — that value was instead derived from capital itself. Carnegie and his allies embarked on a well-funded campaign of promoting the new gospel, not just in Rotary Clubs and Chambers of Commerce across the nation, but also in schools, churches, and civic associations.[35] The basic argument was that the very efficiency of the new giant firms these men directed could produce such a material bounty that it would allow Americans to realize themselves through what they consumed rather than what they produced. In this view, value was ultimately a product of the very bureaucratic organization of the new conglomerates.
One thing that the global justice movement taught us is that politics is, indeed, ultimately about value; but also, that those creating vast bureaucratic systems will almost never admit what their values really are. This was as true of the Carnegies as it is today. Normally, they will — like the robber barons of the turn of the last century — insist that they are acting in the name of efficiency, or “rationality.” But in fact this language always turns out to be intentionally vague, even nonsensical. The term “rationality” is an excellent case in point here. A “rational” person is someone who is able to make basic logical connections and assess reality in a non-delusional fashion. In other words, someone who isn’t crazy. Anyone who claims to base their politics on rationality — and this is true on the left as well as on the right — is claiming that anyone who disagrees with them might as well be insane, which is about as arrogant a position as one could possibly take. Or else, they’re using “rationality” as a synonym for “technical efficiency,” and thus focusing on how they are going about something because they do not wish to talk about what it is they are ultimately going about. Neoclassical economics is notorious for making this kind of move. When an economist attempts to prove that it is “irrational” to vote in national elections (because the effort expended outweighs the likely benefit to the individual voter), they use the term because they do not wish to say “irrational for actors for whom civic participation, political ideals, or the common good are not values in themselves, but who view public affairs only in terms of personal advantage.” There is absolutely no reason why one could not rationally calculate the best way to further one’s political ideals through voting. But according to the economists’ assumptions, anyone who takes this course might as well be out of their minds.
In other words, talking about rational efficiency becomes a way of avoiding talking about what the efficiency is actually for; that is, the ultimately irrational aims that are assumed to be the ultimate ends of human behavior. Here is another place where markets and bureaucracies ultimately speak the same language. Both claim to be acting largely in the name of individual freedom, and individual self-realization through consumption. Even supporters of the old Prussian bureaucratic state in the nineteenth century, like Hegel or Goethe, insisted that its authoritarian measures could be justified by the fact they allowed citizens to be absolutely secure in their property, and therefore, free to do absolutely anything they pleased in their own homes — whether that meant pursuing the arts, religion, romance, or philosophical speculation, or simply a matter of deciding for themselves what sort of beer they chose to drink, music they chose to listen to, or clothes they chose to wear. Bureaucratic capitalism, when it appeared in the United States, similarly justified itself on consumerist grounds: one could justify demanding that workers abandon any control over the conditions under which they worked if one could thus guarantee them a wider and cheaper range of products to use at home.[36] There was always assumed to be a synergy between impersonal, rule-bound organization — whether in the public sphere, or the sphere of production — and absolute free self-expression in the club, café, kitchen, or family outing. (At first, of course, this freedom was limited to male heads of household; over time, it was at least in principle extended to everyone.)
The most profound legacy of the dominance of bureaucratic forms of organization over the last two hundred years is that it has made this intuitive division between rational, technical means and the ultimately irrational ends to which they are put seem like common sense. This is true on the national level, where civil servants pride themselves on being able to find the most efficient means to pursue whatever national destiny their country’s rulers happen to dream up: whether that be rooted in the pursuit of cultural brilliance, imperial conquest, the pursuit of a genuinely egalitarian social order, or the literal application of Biblical law. It is equally true on the individual level, where we all take for granted that human beings go out into the marketplace merely to calculate the most efficient way to enrich themselves, but that once they have the money, there’s no telling what they might decide to do with it: whether it be to buy a mansion, or a race car, engage in a personal investigation of UFO disappearances, or simply lavish the money on one’s kids. It all seems so self-evident that it’s hard for us to remember that in most human societies that have existed, historically, such a division would make no sense at all. In most times and places, the way one goes about doing something is assumed to be the ultimate expression of who one is.[37] But it also seems as if the moment one divides the world into two spheres in this way — into the domain of sheer technical competence and a separate domain of ultimate values — each sphere will inevitably begin trying to invade the other. Some will declare that rationality, or even efficiency, are themselves values, that they are even ultimate values, and that we should somehow create a “rational” society (whatever that means). Others will insist that life should become art; or else, religion. But all such movements are premised on the very division they profess to overcome.
In the big picture it hardly matters, then, whether one seeks to reorganize the world around bureaucratic efficiency or market rationality: all the fundamental assumptions remain the same. This helps explain why it’s so easy to move back and forth between them, as with those ex-Soviet officials who so cheerfully switched hats from endorsing total state control of the economy, to total marketization — and in the process, true to the Iron Law, managed to increase the total number of bureaucrats employed in their country dramatically.[38] Or how the two can fuse into an almost seamless whole, as in the current era of total bureaucratization.
For anyone who has ever been a refugee, or for that matter had to fill out the forty-page application required to get one’s daughter considered for admission by a London music school, the idea that bureaucracy has anything to do with rationality, let alone efficiency, might seem odd. But this is the way it looks from the top. In fact, from inside the system, the algorithms and mathematical formulae by which the world comes to be assessed become, ultimately, not just measures of value, but the source of value itself.[39] Much of what bureaucrats do, after all, is evaluate things. They are continually assessing, auditing, measuring, weighing the relative merits of different plans, proposals, applications, courses of action, or candidates for promotion. Market reforms only reinforce this tendency. This happens on every level. It is felt most cruelly by the poor, who are constantly monitored by an intrusive army of moralistic box-tickers assessing their child-rearing skills, inspecting their food cabinets to see if they are really cohabiting with their partners, determining whether they have been trying hard enough to find a job, or whether their medical conditions are really sufficiently severe to disqualify them from physical labor. All rich countries now employ legions of functionaries whose primary function is to make poor people feel bad about themselves. But the culture of evaluation is if anything even more pervasive in the hypercredentialized world of the professional classes, where audit culture reigns, and nothing is real that cannot be quantified, tabulated, or entered into some interface or quarterly report. Not only is this world ultimately a product of financialization, it’s really just a continuation of it. 
For what is the world of securitized derivatives, collateralized debt obligations, and other such exotic financial instruments but the apotheosis of the principle that value is ultimately a product of paperwork, and the very apex of a mountain of assessment forms which begins with the irritating caseworker determining whether you are really poor enough to merit a fee waiver for your children’s medicine and ends with men in suits engaged in high-speed trading of bets over how long it will take you to default on your mortgage?
A critique of bureaucracy fit for the times would have to show how all these threads — financialization, violence, technology, the fusion of public and private — knit together into a single, self-sustaining web. The process of financialization has meant that an ever-increasing proportion of corporate profits come in the form of rent extraction of one sort or another. Since this is ultimately little more than legalized extortion, it is accompanied by ever-increasing accumulation of rules and regulations, and ever-more sophisticated, and omnipresent, threats of physical force to enforce them. Indeed they become so omnipresent that we no longer realize we’re being threatened, since we cannot imagine what it would be like not to be. At the same time, some of the profits from rent extraction are recycled to select portions of the professional classes, or to create new cadres of paper-pushing corporate bureaucrats. This helps explain a phenomenon I have written about elsewhere: the continual growth, in recent decades, of apparently meaningless, make-work, “bullshit jobs” — strategic vision coordinators, human resources consultants, legal analysts, and the like — despite the fact that even those who hold such positions are half the time secretly convinced they contribute nothing to the enterprise. In the end, this is just an extension of the basic logic of class realignment that began in the seventies and eighties as corporate bureaucracies become extensions of the financial system.
Every now and then you chance on a particular example that brings everything together. In September 2013, I visited a tea factory outside Marseille that was being occupied by its workers. There had been a standoff with local police for over a year. What had brought things to such a pass? A middle-aged factory worker, who took me on a tour of the plant, explained that while ostensibly the issue was a decision to move the plant to Poland to take advantage of cheaper labor, the ultimate issue had to do with the allocation of profits. The oldest and most experienced of the hundred-odd workers there had spent years tinkering with, and improving, the efficiency of the giant machines used to package teabags. Output had increased, and with it profits. Yet what did the owners do with the extra money? Did they give the workers a raise to reward them for increased productivity? In the old Keynesian days of the fifties and sixties they almost certainly would have. No longer. Did they hire more workers and expand production? No again. All they did was hire middle managers.
For years, he explained, there had only been two executives in the factory: the boss, and a human resources officer. As profits rose, more and more men in suits appeared, until there were almost a dozen of them. The suits all had elaborate titles but there was almost nothing for them to do, so they spent a lot of time walking the catwalks staring at the workers, setting up metrics to measure and evaluate them, writing plans and reports. Eventually, they hit on the idea of moving the entire operation overseas — largely, he speculated, because devising the plan created a retrospective excuse for their existence, though, he added, it probably didn’t hurt that while the workers themselves would mostly lose their jobs, the executives who made the plan would likely be relocated to a more attractive location. Before long, the workers had seized the building, and the perimeter was swarming with riot cops.
A left critique of bureaucracy, therefore, is sorely lacking. This book is not, precisely, an outline for such a critique. Neither is it in any sense an attempt to develop a general theory of bureaucracy, a history of bureaucracy, or even of the current age of total bureaucracy. It is a collection of essays, each of which points at some directions a left-wing critique of bureaucracy might take. The first focuses on violence; the second, on technology; the third, on rationality and value.
The chapters do not form a single argument. Perhaps they could be said to circle around one, but mainly, they are an attempt to begin a conversation — one long overdue.
We are all faced with a problem. Bureaucratic practices, habits, and sensibilities engulf us. Our lives have come to be organized around the filling out of forms. Yet the language we have to talk about these things is not just woefully inadequate — it might as well have been designed to make the problem worse. We need to find a way to talk about what it is we actually object to in this process, to speak honestly about the violence it entails, but at the same time, to understand what is appealing about it, what sustains it, which elements carry within them some potential for redemption in a truly free society, which are best considered the inevitable price to pay for living in any complex society, and which can and should be eliminated entirely. If this book plays even a modest role in sparking such a conversation, it will have made a genuine contribution to contemporary political life.
[1] Elliot Jacques (Ann Arbor: University of Michigan Press, 1976).
[2] Gordon Tullock (Washington, D.C.: Public Affairs Press, 1965).
[3] Henry Jacoby (Berkeley: University of California Press, 1973).
[4] C. Northcote Parkinson (Cambridge, MA: Riverside Press, 1957). “Work in an organization expands to fill the time allotted to do it.”
[5] Laurence J. Peter and Raymond Hill (London: Souvenir Press, 1969). The famous work on how those operating in an organization “rise to the level of their incompetence” also became a popular British TV show.
[6] R. T. Fishall (London: Arrow Books, 1982). A now-classic text on how to flummox and discomfit bureaucrats, widely rumored to be by British astronomer and BBC host Sir Patrick Moore.
[7] One could go further. The “acceptable” Left has, as I say, embraced bureaucracy and the market simultaneously. The libertarian Right at least has a critique of bureaucracy. The fascist Right has a critique of the market—generally, they are supporters of social welfare policies; they just want to restrict them to members of their own favored ethnic group.
[8] Owing to a peculiar set of historical circumstances, the word “liberal” no longer has the same meaning in the United States as it does in the rest of the world. The term originally applied to free-market enthusiasts, and in much of the world, it still does. In the United States, it was adopted by social democrats, and as a result, became anathema to the right, and free-market enthusiasts were forced to take the term “libertarian,” originally interchangeable with “anarchist,” used in such terms as “libertarian socialist” or “libertarian communist” to mean the same thing.
[9] In fact, Ludwig von Mises’s position is inherently antidemocratic: at least insofar as it tends to reject state solutions of any kind, while, at the same time, opposing left-wing antistatist positions that propose the creation of forms of democratic self-organization outside it.
[10] In the Durkheimian tradition this has come to be known as “the non-contractual element in contract,” certainly one of the less catchy sociological phrases of all time. The discussion goes back to The Division of Labor in Society (New York: Free Press, 1984 [1893]), p. 162.
[11] Michel Foucault’s essays on neoliberalism insist that this is the difference between the old and new varieties: those promulgating markets now understand that they do not form spontaneously, but must be nurtured and maintained by government intervention. Naissance de la biopolitique, Michel Senellart, ed. (Paris: Gallimard, 2004).
[12] “I don’t know how many times I have used the term ‘Government bureaucrat.’ And you will never find a politician using that term that doesn’t have some slightly pejorative connotation. That is, we know taxpayers resent the money they have to pay to the Government, and so we try to get credit by saying we’re being hard on bureaucrats or reducing bureaucrats … But remember, most of those people are just like most of you: They love their children. They get up every day and go to work. They do the very best they can … After what we have been through in this last month, after what I have seen in the eyes of the children of those Government bureaucrats that were serving us on that fateful day in Oklahoma City, or in their parents’ eyes who were serving us when their children were in that daycare center, I will never use that phrase again.” (www.presidency.ucsb.edu/ws/?pid=51382)
[13] From “Bureaucracy,” Max Weber, in From Max Weber: Essays in Sociology, H. H. Gerth and C. Wright Mills, eds. (New York: Oxford University Press, 1946), pp. 197–98.
[14] In many ways, the United States is a German country that, owing to that same early twentieth century rivalry, refuses to recognize itself as such. Despite the use of the English language, there are far more Americans of German descent than of English. (Or consider the two quintessentially American foods: the hamburger and the frankfurter.) Germany, in contrast, is a country quite proud of its efficiency in matters bureaucratic, and Russia, to complete the set, might be considered a country where people generally feel they really ought to be better at bureaucracy, and are somewhat ashamed that they are not.
[15] A British bank employee recently explained to me that ordinarily, even those working for the bank effect a kind of knowing doublethink about such matters. In internal communications, they will always speak of regulations as being imposed on them—“The Chancellor has decided to increase ISA allowances”; “The Chancellor has initiated a more liberal pension regime” and so on—even though everyone in fact knows bank executives have just had repeated dinners and meetings with the Chancellor in question lobbying them to bring these laws and regulations about. There is a kind of game where senior executives will feign surprise or even dismay when their own suggestions are enacted.
[16] About the only policies that can’t be referred to as “deregulation” are ones that aim to reverse some other policy that has already been labeled “deregulation,” which means it’s important, in playing the game, to have your policy labeled “deregulation” first.
[17] The phenomenon I am describing is a planetary one, but it began in the United States, and it was U.S. elites who made the most aggressive efforts to export it, so it seems appropriate to begin with what happened in America.
[18] In a way, the famous TV character of Archie Bunker, an uneducated longshoreman who can afford a house in the suburbs and a non-working wife, and who is bigoted, sexist, and completely supportive of the status quo that allows him such secure prosperity, is the very quintessence of the corporatist age.
[19] Though it is notable that it is precisely this sixties radical equation of communism, fascism, and the bureaucratic welfare state that has been taken up by right-wing populists in America today. The Internet is rife with such rhetoric. One need only consider the way that “Obamacare” is continually equated with socialism and Nazism, often, both at the same time.
[20] William Lazonick has done the most work on documenting this shift, noting that it is a shift in business models—the effects of globalization and offshoring really only took off later, in the late nineties and early 2000s. (See, for example, his “Financial Commitment and Economic Performance: Ownership and Control in the American Industrial Corporation,” Business and Economic History, 2nd series, 17 [1988]: 115–28; “The New Economy Business Model and the Crisis of U.S. Capitalism,” Capitalism and Society [2009], 4, 2, Article 4; or “The Financialization of the U.S. Corporation: What Has Been Lost, and How It Can Be Regained,” INET Research Notes, 2012.) A Marxian approach to the same class realignment can be found in Gérard Duménil and Dominique Lévy’s Capital Resurgent: The Roots of the Neoliberal Revolution (Cambridge, MA: Harvard University Press, 2004), and The Crisis of Neoliberalism (Cambridge, MA: Harvard University Press, 2013). Effectively, the investor and executive classes became the same—they intermarried—and careers spanning the financial and corporate management worlds became commonplace. Economically, according to Lazonick, the most pernicious effect was the practice of stock buybacks. Back in the fifties and sixties, a corporation spending millions of dollars to purchase its own stock so as to raise that stock’s market value would have likely been considered illegal market manipulation. Since the eighties, as executives have increasingly been paid in stock, it has become standard practice, and literally trillions of dollars in corporate revenue that would in an earlier age have been sunk into expanding operations, hiring workers, or research, have instead been redirected to Wall Street.
[21] A popular code word from the eighties onwards was “lifestyle liberal, fiscal conservative.” This referred to those who had internalized the social values of the sixties counterculture, but had come to view the economy with the eyes of investors.
[22] Just to be clear: this is by no means the case at major journalistic venues, newspapers like The New York Times, The Washington Post, or magazines like The New Yorker, The Atlantic, or Harper’s. In such institutions, a degree in journalism would probably be counted as a negative. At this point, at least, it’s only true of minor publications. But the general trend is always towards greater credentialism in all fields, never less.
[23] www.aljazeera.com/indepth/opinion/2014/05/college-promise-economy-does-n.... The cited text is in Saving Higher Education in the Age of Money (Charlottesville: University of Virginia Press, 2005), p. 85. It continues, “Why do Americans think this is a good requirement, or at least a necessary one? Because they think so. We’ve left the realm of reason and entered that of faith and mass conformity.”
[24] This was certainly my own personal experience. As one of the few students of working-class origins in my own graduate program, I watched in dismay as professors first explained to me that they considered me the best student in my class— even, perhaps, in the department—and then threw up their hands claiming there was nothing that could be done as I languished with minimal support—or during many years none at all, working multiple jobs, as students whose parents were doctors, lawyers, and professors seemed to automatically mop up all the grants, fellowships, and student funding.
[25] Loans directly from the government are not available for continuing education, so borrowers are forced to take private loans with much higher interest rates.
[26] A friend gives me the example of master’s degrees in library science, which are now required for all public library jobs, despite the fact that the yearlong course of study generally provides no essential information that couldn’t be obtained by a week or two of on-the-job training. The main result is to ensure that for the first decade or two of a new librarian’s career, 20 to 30 percent of his or her income is redirected to repaying loans—in the case of my friend, $1,000 a month, about half of which goes to the university (principal) and half to the loan provider (interest).
[27] This logic of complicity can extend to the most unlikely organizations. One of the premier left journals in America has, as editor in chief, a billionaire who basically bought herself the position. The first criterion for advancement in the organization is of course willingness to pretend there is some reason, other than money, for her to have the job.
[28] I outlined what happened in an essay called “The Shock of Victory.” Obviously, the planetary bureaucracy remained in place, but policies like IMF-imposed structural adjustment ended, and Argentina’s writing down of its loans in 2002, under intense pressure from social movements, set off a chain of events that effectively ended the Third World debt crisis.
[29] The League of Nations and the UN up until the seventies were basically talking-shops.
[30] In England, for instance, the anti–Corn Law legislation eliminating British tariff protections, which is seen as initiating the liberal age, was introduced by Conservative Prime Minister Sir Robert Peel, mainly famous for having created the first British police force.
[31] I was reminded of this a few years ago by none other than Julian Assange, when a number of Occupy activists appeared on his TV show The World Tomorrow. Aware that many of us were anarchists, he asked us what he considered a challenging question: say you have a camp, and there are some people playing the drums all night and keeping everyone awake, and they won’t stop, what do you propose to do about it? The implication is that police, or something like them—some impersonal force willing to threaten violence—were simply necessary in such conditions. He was referring to a real incident—there had been some annoying drummers in Zuccotti Park. But in fact, the occupiers who didn’t like the music simply negotiated a compromise with them where they would only drum during certain hours. No threats of violent force were necessary. This brings home the fact that, for the overwhelming majority of humans who have lived in human history, there has simply been nothing remotely like police to call under such circumstances. Yet they worked something out. One seeks in vain for Mesopotamian or Chinese or ancient Peruvian accounts of urban dwellers driven mad by neighbors’ raucous parties.
[32] It is possible that there could be market relations that might not work this way. While impersonal markets have been throughout most of history the creation of states, mostly organized to support military operations, there have been periods where states and markets have drifted apart. Many of the ideas of Adam Smith and other Enlightenment market proponents seem to derive from one such, in the Islamic world of the Middle Ages, where sharia courts allowed commercial contracts to be enforced without direct government intervention, but only through merchants’ reputation (and hence creditworthiness). Any such market will in many key ways operate very differently from those we are used to: for instance, market activity was seen as much more about cooperation than competition (see Debt: The First 5,000 Years [Brooklyn: Melville House, 2011], pp. 271–82). Christendom had a very different tradition where commerce was always more entangled in war, and purely competitive behavior, especially in the absence of prior social ties, requires, pretty much of necessity, something like police to guarantee people keep to the rules.
[33] There is some possibility this has begun to turn around somewhat in recent years. But it has been my own experience that pretty much any time I presented a paper that assumed that some form of social control is ultimately made possible by the state’s monopoly of violence, I would be instantly faced with someone challenging me on Foucauldian, Gramscian, or Althusserian grounds that such an analysis is foolishly outdated: either because “disciplinary systems” no longer work that way, or because we have now realized they never did.
[34] The Collected Works of Abraham Lincoln, vol. 5. Roy P. Basler, ed. (New Brunswick, NJ: Rutgers University Press), p. 52. Anthropologist Dimitra Doukas provides a good historical overview of how this transformation worked itself out in small-town upstate New York in Worked Over: The Corporate Sabotage of an American Community (Ithaca, NY: Cornell University Press, 2003). See also E. Paul Durrenberger and Dimitra Doukas, “Gospel of Wealth, Gospel of Work: Counter-hegemony in the U.S. Working Class,” American Anthropologist, Vol. 110, Issue 2 (2008), pp. 214–25, on the ongoing conflict between the two perspectives among contemporary American laborers.
[35] It would be interesting to compare this campaign to the similarly well-funded effort to promulgate free-market ideologies starting in the sixties and seventies, which began with the establishment of think tanks like the American Enterprise Institute. The latter appears both to have come from a smaller sector of the capitalist classes and to have taken much longer to achieve broad effects on popular opinion—though in the end it was if anything even more successful.
[36] Even the Soviet bureaucracy combined a celebration of labor with a long-term commitment to creating a consumer utopia. It should be noted that when the Reagan administration effectively abandoned antitrust enforcement in the eighties, they did it by shifting the criteria for approval of a merger from whether it operates as a restraint of trade to whether it “benefits the consumer.” The result is that the U.S. economy is in most sectors, from agriculture to book sales, dominated by a few giant bureaucratic monopolies or oligopolies.
[37] Similarly in the Classical world, or that of Medieval Christianity, rationality could hardly be seen as a tool because it was literally the voice of God. I will be discussing these issues in more detail in the third essay.
[38] Total number of civil servants employed in Russia in 1992: 1 million. Total number employed in 2004: 1.26 million. This is especially remarkable considering a lot of this time was marked by economic free-fall, so there was much less activity to administer.
[39] The logic is analogous to Marxian notions of fetishism, where human creations seem to take life, and control their creators rather than the other way around. It is probably best considered a subspecies of the same phenomenon.