Does plant cultivation inevitably lead to Civilization? | Reader’s Correspondence
And, by extension, is farming a result of population growth – or does it cause population growth? --- [Estimated reading time: 30 min.]
This short essay was inspired by a brilliant inquiry from a reader – Thank you, Paul!
The main questions were:
Does human understanding of plant reproductive cycles inevitably lead to agriculture?
What happens when humanity has reached planetary carrying capacity under foraging (and later, horticultural) conditions?
Consequently, once that point is reached, would dispersal not cease to be an effective option to avoid exceeding carrying capacity?
And if so, wouldn't the choice become: adopt agriculture, or starve?
I’ve encountered and contemplated similar questions before, and will thus attempt to answer them in the detail they deserve.
People have been arguing for what seems like ages about what the exact point was when things took a wrong turn for our species (or at least for the dominant culture). Most people today would intuitively blame Capitalism, the Industrial Revolution, money, or technology; others blame the supposed greed of our species, or even “Human Nature.” Some people, equipped with a broader perspective on our species’ history, say it might have been the taming of fire, while others insist that it was agriculture, or metallurgy, or patriarchy, and some go as far as to say it was when we started hunting, or when we left the trees.
The first four culprits can easily be dismissed upon closer inspection – problems started appearing much, much earlier than that, and those aspects are merely symptoms of deeper-lying problems – and any claims of an allegedly insatiable greed inherent to our species are immensely exaggerated. Humans are surely capable of excessive greed, but there is absolutely no reason to assume that greed is a cornerstone of our existence. “Human Nature” especially is a concept that has been distorted to such a grotesque extent that most people have absolutely no clue what Human Nature actually encompasses. A whole generation grew up on far-fetched assumptions and erroneous mechanistic metaphors made by scientific materialists like Richard Dawkins, who coyly admitted in the foreword to the 30th-Anniversary Edition of his bestseller “The Selfish Gene” that he might as well have called the book “The Cooperative Gene.”1
Agriculture, patriarchy and metallurgy are all part of the same trend (and constitute fundamental causes of our current predicament), but the taming of fire is a bit different, since it occurred much earlier (10,000 years vs. 1+ million years) and there is no evidence of a prolonged or pronounced negative effect of fire usage by earlier human cultures on the biosphere.
The last two turning points, both exceptionally absurd, seem to gain traction around the edges of the mainstream, especially among fundamentalist vegans. Although proponents of those ideas would vehemently deny any such connection, this kind of thinking is in fact a direct extension of the Christian doctrine of Original Sin, presuming that the human species alone is somehow separate from all other animals, in that we alone are flawed. Needless to say, when the genus Homo first emerged, there was absolutely nothing wrong with us. It’s not like there was some sort of virus preprogrammed into the earlier stages of our evolutionary path that led us to become Enemies of Life, the bad apple in Mother Nature’s fruit basket, a whole three million years later. If a new species emerged, would you expect it to be flawed or successful? Species come into existence because they work, not because they’re faulty. It’s the same with us humans. If we had some inbuilt design flaw that made us destructive and overly greedy by default, it is highly questionable whether we would have made it very far. Usually, behavior that doesn’t work gets sorted out (read: goes extinct) quite rapidly.
So, what was our mistake? Where did we take the wrong turn?
Only a few people, usually folks who’ve already spent some time contemplating the issue and have above-average knowledge of this topic, arrive at the question of whether the cultivation of plants might have been Pandora’s Box. Could it be that plant cultivation itself sets us humans down a path that invariably leads to strongly stratified, large-scale societies that destroy their environment by necessity?
Let me make clear from the beginning that we’re in great danger of drifting down a slippery slope here. If civilization requires agriculture to feed it, and agriculture was made possible by the knowledge of how plants reproduce, and this knowledge was in turn made possible by the expansion of our brains, and this expansion took place because of complex interconnected positive feedback loops caused by a combination of increased dexterity, tool use, enhanced cooperation, new hunting techniques, more complex social arrangements and relationships, the development of symbolic language, and the use of fire, especially for cooking, then where the hell does it even start?!
To separate the wheat from the chaff (please excuse the agrarian metaphor), we have to look closely at which aspects are harmless in and of themselves, and which ones carry the seeds of destruction.
The utilization of fire, for instance, at first seems to carry this seed – yet people have used fire for well over a million years without laying waste to the biosphere. There is more and more evidence emerging that fire, if used wisely, can actually be beneficial to a given ecosystem, as shown by the fire-management techniques of Native American societies. Tool use, language, and complex social relationships were also present for millions of years without causing any major damage to the planetary ecosystem. And while our hunting of megafauna might have contributed to their eventual extinction, comprehensive reviews of the literature on this subject point to climatic changes as the main culprit. And, even assuming it was entirely our species’ fault, those extinctions happened at a rate of roughly two species per millennium – well within the limits of the natural background extinction rate, and a logical result if a capable predator starts settling in new territory.
So, what about plant cultivation? Humans probably understood very well that a plant grows when you leave a seed on the ground, most likely ever since we became humans (it’s something you encounter every day in Nature), and perhaps even before that. I’d go a step further and say that it’s not only humans who understand how the seed-to-plant life cycle works, but also a handful of other species, like acorn-burying squirrels, certain bird species, or some ant colonies. I don’t have any scientific evidence to back up those claims (yet); for now it’s more of a belief. The dominant culture of anthropocentrism teaches us to see other animals as passive seed dispersers, blindly following their instincts, but there is no way to know for sure that a squirrel burying acorns doesn’t understand that a tree grows in those places where acorns were buried.
Merely holding the knowledge of the life cycle of plants doesn’t automatically necessitate any interference with this cycle, and while humans have probably always experimented with seed dispersal (and other more passive forms of cultivation) to some extent (Paul and I agree on this), it was only once the climate stabilized that those experiments could really take off.
I do indeed believe that some form of large-scale agricultural society is inevitable if the climate stabilizes for a long enough period of time (as during the so-called Holocene). What I would dare to question is whether, had conditions differed ever so slightly, agriculture would have lasted as long as it did. The really strange thing is not that such societies came into existence, but that they lasted so long, and, even more so, were actually rebuilt time and again after they collapsed.
People have been experimenting with different variations on subsistence strategies ever since we came into existence, and curiosity and ingenuity are a part of who we are. We like trying new things, and if something seems to work, we try more of it and see what happens. Daniel Quinn said that it might easily be that prehistoric humans or other animals developed something like agriculture at one or another point in the past – but if they did, we wouldn’t know, since any species that follows such a highly unstable evolutionary strategy usually doesn’t last long enough to leave a fossil record (I’ll explain why agriculture is an evolutionarily unstable strategy in a minute). What baffles me is that some human cultures never stopped farming and building, even after repeated failure.
I’ve heard the argument that human activity (for instance deforestation, especially slash-and-burn, and the release of carbon stored in soils through plowing) might have been one factor that reinforced the long interglacial period that 19th-century scientists mistook for an entirely new epoch – the “Holocene” (a move that made the unusually stable climate of the past ten millennia seem like less of an abnormality than it really is). I’m not sure if the comparatively small human population at that time could have had an impact on the planetary ecosystem (I actually doubt it, although it might have been a contributing factor), but once you reach a certain population threshold, such self-reinforcing feedback loops are to be expected. And even pre-industrial humans did cause climatic changes we can measure today. The Orbis Spike, for instance, is a short but noticeable dip in atmospheric carbon dioxide captured in Antarctic ice cores around the year 1610, caused by the extermination of over 50 million people in the “New” World by Western intruders. Many of the societies affected were horticultural or even agricultural ones, and the subsequent gradual reversion of farmland to climax forest was what caused this inverted spike. The newly emerging trees sucked up so much carbon from the atmosphere that the planet briefly cooled.
Anyway, I guess it makes sense to say that, in certain ways, horticulture begets agriculture, and agriculture begets more agriculture. It’s a positive feedback loop, concerning climate, environment, human population, and human culture.
So, yes, I would agree that human understanding (and hence utilization) of the reproductive cycle of plants will inevitably lead to some form of agriculture and, in some instances, hierarchical societies, the largest of which we find ourselves in today – environmental (and especially climatic) conditions permitting!
This tendency is pretty much impossible to avert the first time such societies emerge (there are no precedents, so to speak), but it could theoretically have been avoided after the initial failure (and collapse) of said societies – if the necessary policies and cultural mechanisms inhibiting such extremes had been codified into the customs and traditions of the people in question. If people had learned from their mistakes and adapted their cultures to avoid the dangers inherent in the agricultural lifestyle, I believe they could have created sustainable horticultural societies – but more on that later.
Since the first larger agricultural societies (and the hierarchies they necessitated) were, culturally speaking, uncharted territory, people didn’t have the time or the necessary experience to institute social mechanisms discouraging behavior that leads to hierarchies. You have to get burned to truly understand that fire is hot.
And this is what’s particularly peculiar about the past few millennia: after the first few civilizations collapsed, people didn’t stop building more of them. The really interesting question here is why some people continued building civilizations! (My take on answering this question would exceed the scope of this essay, but I’d love to hear what you, dear reader, think in the comments!)
You would expect people to learn from their mistakes and try different social arrangements, but this clearly hasn’t been the case overall – until now! It most certainly happened in some cases, as for instance after the collapse of the Classic Maya civilization, when cities were abandoned (and sometimes ritually burned to the ground), or among some North American indigenous societies that were once farmers (or neighbors of a farming culture) but returned to a foraging lifestyle, like various tribes of the Great Plains after the introduction of horses by Europeans.
Although there were always dissidents, dropouts, hermits, runaways and doomsday cults (as Fredy Perlman has pointed out in Against His-Story, Against Leviathan), I’m not aware of any wider political or social movement that was decidedly and explicitly anti-civilization (and, by extension, anti-agriculture) before the end of the 20th century – apart from the countless indigenous societies resisting (sometimes violently!) the dominant culture’s encroachment on their traditional ways of life, of course. But there most certainly weren’t such movements of any noteworthy scale among the “civilized.” The people of this culture have been blinded by what Daniel Quinn has termed “The Great Forgetting” ever since the first civilizations arose. After a few thousand years, they simply forgot that humans ever lived any other way and that there is any alternative to farming and city-building, and the few remaining foragers they encountered were swiftly classified as “less-than-human,” “living relicts of an animal past,” or as “inhabiting lower rungs of the evolutionary ladder” at whose top we imagine ourselves to stand (and were consequently assimilated, expelled, or exterminated).
It’s only now that, with the archaeological and anthropological findings of the last few decades, we have a clear enough understanding of the entirety of human history (and beyond), and it is only now that there are, in fact, subcultures emerging on the fringes of mainstream society that actively oppose agriculture as a subsistence mode and civilization as the default social organization for humans: (anarcho-)primitivism is the prime example.
This kind of thinking is completely new to the members of the dominant culture, but has existed for quite some time among indigenous peoples all over the world (for as long as they’ve encountered agricultural societies, I assume), which is why the vast majority of hunter-gatherers did not adopt farming as long as they had other choices (meaning before their homes were razed to the ground and they were enslaved by adjacent farming communities). Some contemporary examples of sustainable horticultural societies include the Yanomami, the Achuar, the Kayapo, the Zo'é, and the Huaorani of the Amazon rainforest, as well as the Ayoreo of the Gran Chaco – all of whom practice plant cultivation as part of a mixed subsistence strategy that also includes hunting and gathering. All of those societies (or, better, those that survived the last few centuries) strongly resisted efforts by missionaries and governments to make them sedentary and start farming or working in the wage economy, and at least some members of each of those societies still do so to this very day. The most notorious examples are provided by the Sentinelese, who fire arrows at anyone approaching their island, and the Tagaeri, a splinter group of the Huaorani that formed around their leader, a man called Taga. The Tagaeri saw what happened to their culture at large after evangelical missionaries invaded their territory, and decided they would have none of it. They retreated deeper into the forest, and now kill anyone who dares to venture into their territory – loggers, prospectors, miners, missionaries, government officials, and even fellow Huaorani.
All the aforementioned groups have seen what agriculture (and the culture of anthropocentrism that emerges as a result) can do to a people, and they decidedly and unequivocally oppose it to this very day. This is a conscious political and social choice, and it is my firm conviction that societies that have experienced the horrors that result from agriculture first-hand are not in danger of making the same mistakes. Graeber and Wengrow have called this process of intentional cultural demarcation from adjacent cultures schismogenesis (borrowing a term from Gregory Bateson) – a concept that I completely endorse (since it’s partly what guides me as well), and a process that is one of the main reasons various Amazonian horticulturalists became the way they are.
Yet their own drastic reduction in numbers (and the demise of countless other indigenous cultures) was and is often a direct result of an increase in population among their agricultural neighbors, fueled by the enormous surpluses agriculture can produce under optimal conditions (alongside many other causes, such as the introduction of novel diseases or the discovery of “resources” on their ancestral lands). What followed this growth in population was territorial expansion, until the agriculturalists infringed on indigenous lands.
The lesson we can learn from this is that we don’t have to give up all forms of plant cultivation. Plant cultivation will inevitably lead to large-scale agriculture in some cases, but agriculture and hierarchical social organization don’t have to become the new normal. This already happened more times than necessary, and if we’re smart about it we can learn from it. If we are able to codify basic precautions into the new culture that succeeds the currently collapsing anthropocentric worldview, through deterrent myths and stories, and cultural practices that discourage certain behaviors, we might have a shot at creating a symbiotic form of plant cultivation – a compromise between foraging and farming – that doesn’t infringe on our basic needs and rights as social animals, doesn’t destroy and deplete the landscape we inhabit, and doesn’t lead to runaway inequality.
This form of plant cultivation cannot be agriculture, though, because agriculture – by definition – begets anthropocentrism, in that a given patch of land that used to feed myriad species of animals and plants is violently converted into an ecological desert that feeds only a single subgroup of a single species: the humans who cultivate it. There is no way to avert the anthropocentric implications of this subsistence mode. Of course it will drive a people crazy if they practice it over longer periods of time. Any truly sustainable form of cultivation must be closer to horticulture (first and foremost a difference in scale, but also concerning the diversity of crops, the plants’ own needs, and the management of non-human competitors for those target species).
Furthermore, there are certain ingenious ways to avoid the emergence of strictly enforced and/or hereditary hierarchies. One of the simplest examples of how to achieve the cultural shift towards discouraging the dangerous character traits that lead to hierarchies and tyranny is the remarkable way in which members of the Lisu hill people of Zomia (shifting cultivators, and hence somewhere between horticulture and agriculture) deal with overly ambitious chiefs: they kill them in their sleep, without any prior warning. Pretty straightforward, right? This way, you won’t have many pretentious village headmen or -women to begin with, since certain kinds of people are strongly deterred from taking the office by this practice. As Arnold Schroeder so succinctly put it, the people who want power are usually the ones who really shouldn’t have it. With this practice in place, you ensure they either won’t pursue their ambitions for power, or their character traits will be swiftly removed from the gene pool.
Other examples of the deterrent myths, stories and cultural practices mentioned above abound. One is a sophisticated social mechanism called “shaming the meat” among the !Kung people of the Kalahari Desert (a practice that uses humor and ridicule to proactively discourage anyone who brings home a large kill from thinking they’re better than everyone else). When an anthropologist asked why they do this, a man called Tomazo responded:
"When a young man kills much meat he comes to think of himself as a chief or a big man, and he thinks of the rest of us as his servants or inferiors. We can’t accept this. We refuse one who boasts, for someday his pride will make him kill somebody. So we always speak of his meat as worthless. This way we cool his heart and make him gentle."
Another great example of a cultural element that prevents agricultural/anthropocentric cultures from emerging is the Native American myth of the Windigo (often called Wetiko), an evil cannibalistic spirit that functions as an analogy for certain behaviors, and is also akin to an infectious disease that befalls humans. At this point I encourage the reader to grab a copy of Braiding Sweetgrass by Robin Wall Kimmerer and (re-)read the chapter ‘Windigo Footprints.’ She writes that the “Windigo is the name for that within us which cares more for its own survival than for anything else,” and it’s “the Windigo way that tricks us into believing that belongings will fill our hunger, when it is belonging that we crave.” But it’s impossible to explain this concept better than Robin, who is Potawatomi herself (please excuse the overly extensive quote):
In terms of systems science, the Windigo is a case study of a positive feedback loop, in which a change in one entity promotes a similar change in another, connected part of the system. In this case, an increase in Windigo hunger causes an increase in Windigo eating, and that increased eating promotes only more rampant hunger in an eventual frenzy of uncontrolled consumption. In the natural as well as the built environment, positive feedback leads inexorably to change—sometimes to growth, sometimes to destruction. When growth is unbalanced, however, you can’t always tell the difference. Stable, balanced systems are typified by negative feedback loops, in which a change in one component incites an opposite change in another, so they balance each other out. When hunger causes increased eating, eating causes decreased hunger; satiety is possible. Negative feedback is a form of reciprocity, a coupling of forces that create balance and sustainability.
Windigo stories sought to encourage negative feedback loops in the minds of listeners. Traditional upbringing was designed to strengthen self-discipline, to build resistance against the insidious germ of taking too much. The old teachings recognized that Windigo nature is in each of us, so the monster was created in stories, that we might learn why we should recoil from the greedy part of ourselves. […]
Johnston and many other scholars point to the current epidemic of self-destructive practices—addiction to alcohol, drugs, gambling, technology, and more—as a sign that Windigo is alive and well. In Ojibwe ethics, […] “any overindulgent habit is self-destructive, and self-destruction is Windigo.” And just as Windigo’s bite is infectious, we all know too well that self-destruction drags along many more victims—in our human families as well as in the more-than-human world.
The native habitat of the Windigo […] has expanded in the last few centuries. […] Multinational corporations have spawned a new breed of Windigo that insatiably devours the earth’s resources “not for need but for greed.”
The [Windigo’s] footprints are all around us, once you know what to look for. […] They stomp in the industrial sludge of Onondaga Lake. And over a savagely clear-cut slope in the Oregon Coast Range where the earth is slumping into the river. You can see them where coal mines rip off mountaintops in West Virginia and in oil-slick footprints on the beaches of the Gulf of Mexico. A square mile of industrial soybeans. A diamond mine in Rwanda. A closet stuffed with clothes. Windigo footprints all, they are the tracks of insatiable consumption. So many have been bitten. You can see them walking the malls, eying your farm for a housing development, running for Congress.
Cautionary Windigo tales arose in a commons-based society where sharing was essential to survival and greed made any individual a danger to the whole. In the old times, individuals who endangered the community by taking too much for themselves were first counseled, then ostracized, and if the greed continued, they were eventually banished. The Windigo myth may have arisen from the remembrance of the banished, doomed to wander hungry and alone, wreaking vengeance on the ones who spurned them. […]
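To make the systems-science framing in the passage above concrete, here is a minimal numerical sketch of the two loop types Kimmerer describes – my own illustration, not anything from the book, with “hunger” reduced to a single number and the gain and satiety parameters chosen purely for demonstration:

```python
# Positive feedback: eating feeds the hunger, which feeds more eating –
# the Windigo frenzy. Each step multiplies hunger by a gain > 1.
def positive_feedback(hunger=1.0, steps=10, gain=1.5):
    for _ in range(steps):
        hunger *= gain
    return hunger

# Negative feedback: eating reduces hunger toward zero – satiety.
# Each step removes a fraction of the remaining hunger.
def negative_feedback(hunger=1.0, steps=10, satiety=0.5):
    for _ in range(steps):
        hunger -= satiety * hunger
    return hunger

print(positive_feedback())  # ~57.7 – runaway, uncontrolled consumption
print(negative_feedback())  # ~0.001 – the system settles into balance
```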
It intuitively feels true that, if such cultural practices are made part of our new culture as well, we can easily avoid making the same mistakes that past horti- and agricultural societies fell prey to.
There is a catch, though. Horticulture, like agriculture, inevitably leads to an increase in population – I’ll explain the reason for this in a minute. While this increase is less pronounced among horticultural societies, it nonetheless leads to increasing population pressure over time, which is why low-level warfare is a phenomenon frequently observed among Amazonian horticulturalists (and some other similar societies all over the world). As with all other animal species, an increase in population results in an increase in competition – for territory, food and other resources, and mates – and thus in an increase in aggression. Of course, indigenous warfare can’t be compared to organized warfare waged by agricultural societies, city-states, and especially larger civilizations (!), as reports of such battles attest: the first example that comes to mind is the chapter ‘A Short Chapter, About a Tiny War’ in Jared Diamond’s The World Until Yesterday. The author describes a series of battles between two groups of Dani (horticulturalist-foragers in the highlands of western New Guinea), which was observed and documented by anthropologists, and resulted in a death toll of a mere eleven people over a period of six months (!). This kind of warfare doesn’t seek to exterminate the “enemy” (since you might depend on them for marriage partners and trade), but rather to show that you’re still here, still strong, and that the other group would do better not to infringe on your territory. It is more like an unmistakable demand to curb population growth: “don’t become too numerous – or else!” In fact, as Diamond’s chapter about Dani warfare seems to suggest, battles like that could actually be quite fun – if you’re not among the few casualties, that is. You’ll definitely have some exciting stories to tell around the campfire for the next few years!
“Fight” is the natural response to overcrowding exhibited by many other animal species as well, especially when “flight” cannot be considered a viable option.
The examples of pre-colonial sedentary foragers in Florida and the Pacific Northwest named by Graeber and Wengrow in The Dawn of Everything attest to this tendency even among non-horticulturalists; if food is abundant enough in your environment that it leads to an increase in population (and you do nothing to avert this), you’ll have some sort of hierarchy and some sort of warfare as a result, especially if the neighboring villages utilize a similar subsistence strategy.
Such population pressures, and the threat of raids by (or wars with) neighbors if one’s population was not kept in check, led hunter-gatherer-horticulturalists to actively control their birth rates: through natural methods of birth control (tracking cycles or the “pull-out method”), through any of the several hundred species of plants worldwide that act as either contraceptive or abortifacient, or, if everything else failed, through infanticide (kind of like a post-natal abortion). Those are the practices observed among the horticultural societies of the Amazon rainforest and elsewhere. “True” (immediate-return) hunter-gatherers already have lower birth rates for a variety of reasons, most of which have to do with extended breastfeeding (you’re not likely to get pregnant while nursing), food availability (without regular surpluses and stable calories from grains or tubers, women menstruate less frequently), higher infant mortality (without modern medicine, infants die much more easily), and nomadism (you can only carry one child).
So, in my opinion, the question is not “adopt agriculture, or starve,” but “stabilize population levels, or prepare for battle.”
But it is important to point out that forager populations (just like any other animal population) typically don’t experience steady population growth. They exist in equilibrium with their environments, and their numbers depend on food availability. If there is enough food for all members of the tribe in question – but no surpluses! – no increase in population can occur (although there will obviously be some minor fluctuations in any given year); no foraging culture regularly forages for more food than it immediately needs. Population growth is not something we passively experience or are unavoidably subject to, but something we can exert influence over – and a direct function of food availability.
Agricultural societies, by contrast, experience exponential growth (with the occasional setback in the form of a famine, an epidemic, or a war): the new people who make up the increase in population have to be made of something, and that something is food.
Daniel Quinn has illustrated this peculiarity of population dynamics with a simple thought experiment: if you have a large cage with 100 rats and you give them enough food for 100 rats, their population will stay the same (with slight oscillations). If you feed them enough for 200 rats, eventually you’ll reach a population that stabilizes somewhere around 200 rats. And if you slowly decrease the amount of food (over many rat generations) so that it’s just enough for, say, 150 rats, the population will slowly decrease to about 150 (and, likewise, a rapid reduction in food availability will be followed by an equally rapid reduction in the population through starvation).
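For readers who like to see the mechanism laid bare, here is a minimal sketch of Quinn’s rat-cage experiment as a simulation – the function name, birth rate, and food schedules are my own assumptions for illustration, not anything Quinn specified:

```python
# Each generation, surplus food becomes new rats (births) and any
# shortfall starves off the excess (deaths), so the population simply
# tracks whatever food level we impose on the cage.
def rat_cage(food_schedule, population=100.0, birth_rate=0.1):
    for food in food_schedule:
        surplus = max(food - population, 0)   # rations beyond bare subsistence
        births = birth_rate * surplus         # extra food becomes new rats
        deaths = max(population - food, 0)    # shortfall starves the excess
        population += births - deaths
    return round(population)

print(rat_cage([100] * 80))               # stays at ~100
print(rat_cage([200] * 80))               # climbs to and stabilizes near 200
print(rat_cage([200] * 80 + [150] * 80))  # falls back to ~150 once rations shrink
```

Whatever food level the schedule holds steady, the population converges on it – which is the whole point of the thought experiment.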
Many people in this culture express a strong allergic reaction to the implications of this scenario, because they cling to the identification with their ego (characterized by concepts like ‘free will’ and ‘rational thought,’ and thus implying a separation from other animals) as a basis of who they are, and thus vehemently reject the notion that they are animals as well. They might say something along the lines of “the above might be true for rats, yes, but definitely not for us humans! I mean, we can decide!” – which is true! We can decide, on the individual level, but who says that those decisions are not influenced subconsciously by environmental conditions and other biological factors? The truth is that, while some of the rats in our experiment might decide not to have children on the individual level, the overall number on the population level will still increase. The dominant culture has been running this experiment for several millennia already, and each year, a growth in agricultural production is followed by a growth in population.
Until the beginning of the “Holocene,” our species’ population experienced linear growth (overall!) throughout the 300,000+ years we have existed (although there was at least one serious population bottleneck during this time, possibly more), which is a direct result of Homo sapiens gradually settling in lands hitherto uninhabited by humans. Once all hospitable environments had been inhabited by foragers, the human population would have leveled off – had the climate stayed as erratic as it was during the entire Pleistocene, the relative equilibrium of the last few dozen millennia would never have been disturbed.
Human groups eventually reach the carrying capacity of the ecosystem they inhabit, but as long as they don’t appropriate more resources to their own exclusive benefit (and thus deprive other species of their food), they won’t experience a further growth in population. That’s basic population biology. A deer population is kept in check by the availability of fodder plants, but should the deer figure out a way to dramatically increase the availability of their fodder plants (maybe through plant cultivation?), they’ll experience an equally dramatic increase in population as a result (and an equally drastic reduction in various other animal and plant populations that they used to coexist with).
The biomass in any given ecosystem always stays more or less the same, but in one scenario (foraging society), this biomass is equally distributed among various animals, plants, fungi and bacteria, and in the other scenario (farming society), the biomass becomes more and more concentrated among humans (and their crops and livestock) and less concentrated in, say, forest trees, wolves and tigers – hence a reduction in diversity occurs.
If we continue a bit further down that road, what usually follows the latter scenario is a sudden crash due to some threshold being reached – like pressure steadily building up in a balloon until the point is reached where it pops, without any prior warning – partially caused by the very reduction in diversity that ensured the ecosystem’s resilience in the first place.
Agriculture wasn’t caused by an increase in population, but it led to one. People didn’t adopt farming because they were hungry (at first!) but because they experimented with plant cultivation (a point Graeber and Wengrow make that I agree with), and the first instances of true, field-scale agriculture were almost exclusively limited to river deltas where flood-retreat farming could be practiced (the only form of agriculture that requires an absolute minimum of work), and they were – in the beginning – always accompanied by hunting and gathering.
To sum up the foregoing (rather random) assemblage of different topics: I believe that, yes, human understanding of plant reproductive cycles inevitably leads to agriculture (if the climate allows it!), but history shows that agriculture is a self-eliminating evolutionary strategy, especially over the long term. In fact, it is far from certain that, were we to rewind the last ten millennia, agriculture would once again become as ubiquitous; moreover, it seems obvious that this ubiquity will not last for another century from now, since the relatively stable climate of the so-called “Holocene” has been over for (at least!) two decades. Unstable evolutionary strategies such as agriculture tend to disappear rather quickly on biological timescales, and the ten thousand years that agriculture has existed are a blink of an eye for our shared home, Planet Earth.
What happens when humanity has reached a state approaching planetary carrying capacity under foraging (and horticultural) conditions is that – climatic conditions permitting – small-scale, low-level warfare becomes inevitable wherever other natural methods to curb any potential population growth fail. This form of relatively minor conflict would in fact be limited to extraordinarily abundant habitats (such as salmon runs, bison ranges, or rich coastal fisheries), since deserts, high mountains, and other marginal lands support a much smaller human population to begin with.
But since in any case no steady growth in population can occur without agricultural (and, to a limited extent, horticultural) surplus, it is my understanding that the population will eventually stabilize, probably somewhere around pre-Holocene levels, but likely a bit lower. The ecosystems that allow for a mixture of horticulture and foraging are limited in number and scope, even more so considering the long-term implications of a CO2 level of over 400 ppm, so even if some societies experience sudden surges in population, those increases would be short-lived.
Please feel free to voice your opinion in the comments - I highly appreciate any feedback and hope this conversation will continue!
1. The foreword to the 30th-Anniversary Edition of ‘The Selfish Gene’ contains a number of such admissions of guilt, conceding that Dawkins’ original statements (statements that shaped the worldview of millions of people!) were erroneous. Examples are: “Altruism might well be favoured at other levels,” and “I do with hindsight notice lapses of my own on the very same subject. These are to be found especially in Chapter 1, epitomised by the sentence ‘Let us try to teach generosity and altruism because we are born selfish.’ There is nothing wrong with teaching generosity and altruism, but ‘born selfish’ is misleading [Emphasis mine].”
I broadly agree, and have about 3-4 points to add that came up while reading... I am aware that the main impulse behind commenting is usually some urge to voice a disagreement, some "yes, but..." – so that's why I focus on adding parameters!
The main shift could be illustrated by this image: pulling out a stuck drawer, a bit on each side. Maybe agriculture and population increased through this pattern?
Overpopulation is rarely considered before the very visible, sudden rise of recent times. You did consider it, and I appreciated the read.
So my first addition is to consider that we are unable to see what's slow, like plant growth – until we suddenly see the change.
My background is in behaviour (animal and plant), trauma resolution, and anthropology. I live in the Canaries, and you will find interesting information there: it is the only place with clear limits where the native islanders had NO BOATS to migrate with! The consequences differed greatly according to the size and steepness of each island. They had to control their population in other ways than in Polynesia...
I am not a native English speaker and can't recall all my thoughts at once, as that requires scrolling up and down your post – but we'll exchange as we go, right?
I'm in a bit of a rush and came past your stack on a different page, so I have not read properly, but it seems to me that there might be a generalisation going on here, which is possibly unhelpful. Pardon me if I missed something in this early-morning ramble...
There are different _kinds of civilisation_. Those that radiated out of West Asia – based on ploughing, grains, slavery, tax, and debt, commanded and controlled by an elite in a state bureaucracy – are but one kind of civilisation. Though that kind is dominant in more than one sense, including its hold on the human imagination, there have been other attempts at configuring the complex web of life.
Notably, much better attempts were made in the Amazon before the Europeans turned up. Charles Mann has written nicely about the cultural context and scientific framework relevant here in "1491", and this archaeology piece is an easy intro: https://pubmed.ncbi.nlm.nih.gov/30038410/ – see also the illuminating work of Michael Heckenberger.
The point I am trying to make is this: whether plant knowledge leads to civilisation or not is less interesting than the questions of what plants can teach us about building complex societies, and how we can learn from them to live in alliance with all the other beings (and, by extension, with their alliances with fungi and the rest of the soil communities).
In other words, what exists in the patterns we know about that we can deploy today, here and now, to regenerate our habitat, develop more-than-sustainable food systems, and enrich our landscape with biodiversity - and so on.
People can do good, people can do bad. All things have two handles, beware of the wrong one.