Underneath the many theories of civilization collapse, one fact stands consistent: people replace goals with methods. Instead of focusing on what they are trying to achieve, they imitate whatever seems to be succeeding for everyone else, and so society repeats itself like a superstition even as results fail to match what is desired.
This process begins with jobs. A man in nature must clear some acres, set up a farm, and hunt; a man in a city must find a job, get credit, and buy goods and services. In the latter case, he has a job, and unlike the farmer he is not directly responsible for results; he is rewarded for how well he does what his customers or managers instruct.
Jobs require people to do what they are told, and therefore promote both slavish dedication and an equally strong desire to bend the rules out of resentment for being told what to do. You can rage at nature, but it gets nowhere; if you rage at your job, you can simply cheat, cut corners, not notice things, or any number of small sabotages.
Over time, jobs become a goal in themselves, and society trades them like properties. This leads to a situation where jobs are not created to fit the work, but work is invented to fit the need for jobs. As you can guess, this is a kissing cousin to what unions and socialism offer, so society shifts in that direction.
Jobs broke the West.
Saner alternatives like manorial feudalism and monarchism, in which everyone had a social role to which his job belonged and not the other way around, were discarded because they did not fit the appetite of the multitudes for easy, stupid labor in which most of the time is spent screwing around and socializing.
This led jobs to become an obsession, paired with the need to generate “growth” so that taxes did not crash the economy, and this in turn produced a situation where people live to work, instead of working to live. After all, pre-industrial and pre-agricultural societies had more time off than wagie slaves:
The Ju/’hoansi spent an average of 17 hours a week finding food—2,140 calories daily—and devoted another 20 to chores, as Suzman gleaned from other ethnographies and firsthand research. This left them with considerably more downtime than the typical full-time employee in the U.S., who spends about 44 hours a week doing work—and that doesn’t include domestic labor and child care. In that downtime, the Ju/’hoansi remained strikingly free, over centuries, from the urge to cram it with activities that we would classify as “productive” (or, for that matter, destructive). By day, they did go on walks with children to teach them how to read the canvas of the desert for the footprints of animals.
Suzman calls attention to the changing nature of work. He draws on the writing of the French sociologist Émile Durkheim, who pointed to a crucial difference between “primitive” and complex societies called interchangeability. For hunter-gatherers, chiefs and shamans could, and did, moonlight as foragers and hunters. Overlapping duties preserved a strong sense of community, reinforced by customs and religions that obscured individual differences in strength, skill, and ambition. Shared labor meant shared values.
The productivity mode thrived—and it just might deserve credit (along with luck) for almost all scientific progress and technological ingenuity. But it also bears the blame for what Durkheim called a “malady of infinite aspiration,” which by now we’ve discovered is chronic. When a recent Pew Research Center survey asked about the secret to happiness, most Americans, of all ages, ranked “a job or career they enjoy” above marriage, children, or any other committed relationship. Careerism, not community, is the keystone in the arch of life.
Why do we need productivity, and for that matter, why do we need an ever-rising population? Government takes half of our money in order to redistribute it; that redistribution makes up three-quarters of the budget, and this creates a loss that markets can only fill with endless growth.
In other words, the presumed solution — socialism — is the cause of the decline of capitalism only because the supposedly capitalist societies adopted it. Remove the entitlements and free stuff from government and suddenly you no longer need this vicious cycle.
Our own recent past shows us how jobs eat time and leave people essentially running on a treadmill to pay for the right to keep going to work:
Plowing and harvesting were backbreaking toil, but the peasant enjoyed anywhere from eight weeks to half the year off. The Church, mindful of how to keep a population from rebelling, enforced frequent mandatory holidays. Weddings, wakes and births might mean a week off quaffing ale to celebrate, and when wandering jugglers or sporting events came to town, the peasant expected time off for entertainment. There were labor-free Sundays, and when the plowing and harvesting seasons were over, the peasant got time to rest, too. In fact, economist Juliet Schor found that during periods of particularly high wages, such as 14th-century England, peasants might put in no more than 150 days a year.
When workers fought for the eight-hour workday, they weren’t trying to get something radical and new, but rather to restore what their ancestors had enjoyed before industrial capitalists and the electric lightbulb came on the scene. Go back 200, 300 or 400 years and you find that most people did not work very long hours at all. In addition to relaxing during long holidays, the medieval peasant took his sweet time eating meals, and the day often included time for an afternoon snooze.
It’s true that the New Deal brought back some of the conditions that farm workers and artisans from the Middle Ages took for granted, but since the 1980s things have gone steadily downhill. With secure long-term employment slipping away, people jump from job to job, so seniority no longer offers the benefits of additional days off. The rising trend of hourly and part-time work, stoked by the Great Recession, means that for many, the idea of a guaranteed vacation is a dim memory.
We get told how good we have it because we live longer and are healthier, but it is unclear if that is true. First, we probably do not actually live longer, although more of the sickly survive, and second, we are weaker, with worse eyesight and more chronic health conditions, than people were in the past.
To understand this, it is important to realize that an average is not a lifespan; the human lifespan has never changed, only the averages. Ancient people lived long lives and were free of the onslaught of beetus, cancers, and autoimmune diseases plaguing us now:
Life expectancy is an average. If you have two children, and one dies before their first birthday but the other lives to the age of 70, their average life expectancy is 35.
That’s mathematically correct – and it certainly tells us something about the circumstances in which the children were raised. But it doesn’t give us the full picture. It also becomes especially problematic when looking at eras, or in regions, where there are high levels of infant mortality. Most of human history has been blighted by poor survival rates among children, and that continues in various countries today.
If one’s thirties were a decrepit old age, ancient writers and politicians don’t seem to have got the message. In the early 7th Century BC, the Greek poet Hesiod wrote that a man should marry “when you are not much less than 30, and not much more”. Meanwhile, ancient Rome’s ‘cursus honorum’ – the sequence of political offices that an ambitious young man would undertake – didn’t even allow a young man to stand for his first office, that of quaestor, until the age of 30 (under Emperor Augustus, this was later lowered to 25; Augustus himself died at 75). To be consul, you had to be 43 – eight years older than the US’s minimum age limit of 35 to hold a presidency.
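The averaging arithmetic in the quoted passage can be sketched in a few lines. The two-child example comes from the quote; the "village" numbers below are purely hypothetical, chosen only to show how infant mortality drags the average down while survivors still reach old age:

```python
def life_expectancy(ages_at_death):
    """Mean age at death across a population -- the 'average' the quote warns about."""
    return sum(ages_at_death) / len(ages_at_death)

# The quoted two-child example: one dies before age 1, one lives to 70.
print(life_expectancy([0, 70]))  # 35.0

# A hypothetical village: 3 of 10 die in infancy, the rest reach old age.
village = [0, 0, 1, 65, 68, 70, 72, 74, 75, 80]
print(life_expectancy(village))  # 50.5 -- looks like "people died at 50"

# Restrict to those who survived childhood and the picture changes entirely.
adults = [age for age in village if age >= 15]
print(life_expectancy(adults))   # 72.0 -- survivors lived long lives
```

The point the code makes concrete: a low "life expectancy at birth" is compatible with most adults living into their seventies, which is why the average is not a lifespan.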
Modern people have more healthcare, but it does not necessarily make them healthier; in fact, it seems to keep alive those who would have died in nature so that they can become a profit center for healthcare through the money handed out by government. We are making ourselves sick with medical care.
In addition, people now have a greater burden to keep up. Yes, our technology factors in, but more importantly, we have to pay taxes, have insurance, and maintain the infrastructure for jobs, something that hunter-gatherers did not face:
Sahlins’s principal argument was simple but counterintuitive: before being driven into marginal environments by colonial powers, hunter-gatherers, or foragers, were not engaged in a desperate struggle for meager survival. Quite the contrary, they satisfied their needs with far less work than people in agricultural and industrial societies, leaving them more time to use as they wished. Hunters, he quipped, keep bankers’ hours. Refusing to maximize, many were “more concerned with games of chance than with chances of game.” The so-called Neolithic Revolution, rather than improving life, imposed a harsher work regime and set in motion the long history of growing inequality (a claim recently revived by James C. Scott in Against the Grain: A Deep History of the Earliest States [2017]).
Agricultural society seems to survive well when it consists of big farms run by higher-IQ people who have workers tied to sacred roles like caste layers. As soon as jobs appear, the decay accelerates, and soon you have a resentful population phoning it in while trying to steal from each other to accumulate enough to escape the treadmill.
Tags: antiwork, capitalism, jobs, manorial feudalism, socialism, work