To crank or not to crank
A properly socialized individual had a powerful sense that the wild world was feeding him, and he ought to be as grateful and as anxious to act decently as he would to any human who fed him out of sheer kindness.
– E.N. Anderson, Ecologies of the Heart
People intuitively view agriculture as the root of domination because intensifying food economies made possible large surpluses which could then support elites and their servants. As indeed they did. But the link with agriculture is conditional.
Certain well-endowed economies (whether based on foraging, horticulture, field agriculture, or grazing) make large surpluses possible. But they do not make them inevitable. Food harvests of any kind do not lead to surplus unless the people in question decide to produce it. Given that humans generally have better things to do with themselves than toil, they tend to work as little as necessary to cover their food needs, plus a little extra for the winter or an upcoming celebration. If they planted a field of rye and it produced twice as much as they expected, they would likely plant half as much the next year and spare themselves the extra work. If salmon or anchovies are particularly plentiful this year, why not kick back and enjoy the easy life?
And indeed, there is a great deal of evidence that “agriculture does not automatically create a food surplus. We know this because many agricultural people of the world produce no such surplus. Virtually all Amazonian Indians, for example, were agricultural, but in aboriginal times they did not produce a food surplus. That it was technically feasible for them to produce such a surplus is shown by the fact that, under the stimulus of European settlers’ desire for food, a number of tribes did raise manioc in amounts well above their own needs, for the purpose of trading.” These tribespeople went back to underproduction when their trading needs were satisfied.
Even the simplest foragers often produced some subsistence surplus. They were, however, not much given to planning ahead, and often blew through the entire cache at a midwinter feast, going hungry shortly thereafter and trusting that the world would provide. Many anthropologists have noted that strictures against taking “more than you need” were widespread in these societies.
Boreal Algonquians expected intermittent periods of hunger during the winter, and these fasts—and even the possible threat of death—were preferable to the planning and labor entailed by food storage. The definition of the resource situation was one in which animals were ordinarily available and hunger a predictable, endurable, and usually transient aspect of the winter round. It is precisely in this arbitrary weighting of risk aversion and optimism that the operation of the cultural logic of Cree labor is specifiable. The costs of the labor, always potentially superfluous, entailed in storage were reckoned disproportionate to the reliability ensured by the surplus. Before approximately 1900, boreal forest Algonquians often fasted and sometimes perished for lack of food. These tragedies would have occurred less frequently if more intensive food storage had been practiced. Experiencing long-term game shortages as though they were new instances of transient scarcity, the Algonquians continued, with some concessions, “to let tomorrow provide for itself.”

The decision to store less and starve more (or, among Chipewyans, to store more and starve less) was not objectively determined by the Canadian Shield ecosystem, the limits of the technology, or caloric efficiency. The paradox of the starving Montagnais consuming all their preserved eels in autumn feasts is a particularly forceful example of the meaningful construction of utility, efficiency, and the entire structure of foraging labor and consumption. This skepticism toward advance planning and reliability is not limited exclusively to foragers. Audrey Richards’s (1932) classic monograph on the Bemba is a detailed exposition of an agricultural society whose members preferred transient hunger to what they deemed excessive labor.
To broaden the areal focus, comparable practices existed even in a “delayed return” foraging society like the Alaskan Koyukons who occupied sedentary winter villages provisioned by preserved fish and caribou meat. According to Sullivan (1942), the Koyukons sometimes disposed of their stored foods during lavish feasts in late summer, midwinter, and early spring. The midwinter feasts, in particular, sometimes occasioned hardship if hunting was unsuccessful, but they continued into the present century. The Koyukon feasts pose the same paradox as the Montagnais: the surplus was accumulated and preserved but then consumed, precluding its use to level fluctuations in the long term. Murphy (1970:153) described among the Brazilian Munduruçu “the hunter’s glut, an abundance of meat that had to be consumed before it spoiled, and the men stayed at home because further hunting would have been a crime against the game and because they had to apply themselves steadily to the serious business of eating.”
These subsistence surpluses hedge the bets of survival a little; much of the time, though, simple (or “immediate return”) foragers only get enough to eat for the next several days. Surplus that goes beyond subsistence is a luxury good. Since it is above what the community needs, it can be traded, or given away, and no one is the worse off. It is not the little extra a community needs to weather a winter or to set aside seed for spring planting. That “little extra” is needed for survival and cannot be diverted toward optional undertakings. Luxury surplus is the kind that can support elites.
The extant records, like the ones quoted above, show that even the most basic subsistence surpluses were the result of choice. Luxury surpluses, then, are even more clearly the result of a choice (within forager, horticultural, and agricultural economies alike). They cannot be the automatic result of the agricultural way of life. There will be no surplus, no matter how abundant the land, unless the people in question decide to override their culture’s disapproval, begin taking more than they need, and devote much more effort to storage techniques. And it appears that the first people who chose to produce luxury surpluses were very ancient complex (or “delayed-return”) foragers. Brian Hayden has this to say:
From all the indications that prehistorians have gathered, it appears that humans have existed for well over 2 million years in a state of relative equality. It is possible to perceive the glimmerings of some changes toward socioeconomic inequality around 50,000 years ago. These changes became more pronounced in some areas about 30,000 years ago, and then became especially dramatic and widespread after about 15,000 years ago.
The shift toward socioeconomic inequality is not tied to food production, but occurred well before agriculture emerged. At the end of the Pleistocene, these changes occurred independently in a number of different areas of the globe. Thus the emergence of significant inequality followed a pattern that is strikingly similar to the emergence of food production, but preceded it by many millennia. (Richman, Poorman, Beggarman, Chief, 2007)
There we have it. The root of domination lies in the Paleolithic, deep in the forager world.