Tag Archives: Forager

Why Think Of The Children?

When a cause seems good, a variation focused on children seems better. For example, if volunteering at a hospital is good, volunteering at a children’s hospital is better. If helping Africa is good, helping African kids is better. If teaching people to paint is good, teaching children to paint is better. If promoting healthy diets is good, promoting healthy diets in kids is better. If protecting people from war is good, protecting kids from war is better. If comforting lonely people is good, comforting lonely kids is better.

Why do most idealistic causes seem better when directed at kids? One explanation is that kids count a lot more in our moral calculus, just as humans count more than horses. But most would deny this, I think. Another explanation is that kids just consistently need more of everything. But this just seems wrong. Kids are at the healthiest ages, for example, and so need health help the least. Even so, children’s health is considered a very noble cause.

For our forager ancestors, child rearing was mostly a communal activity, at least after the first few years. So while helping to raise kids was good for the band overall, each individual might want to shirk on their help, and let others do the work. So forager bands would try to use moral praise and criticism to get each individual to do their kid-raising share. This predicts that doing stuff for kids would seem especially moral for foragers. And maybe we’ve retained such habits.

My favored explanation, however, is that people today typically do good in order to seem kind, in order to attract mates. If potential mates are considering raising kids with you, then they care more about your kindness toward kids than about your kindness toward others. So to show off the sort of kindness that your audience cares about, you put a higher priority on kindness to kids.

Of course if you happened to be one of those exceptions really trying just to make the world a better place, why you’d want to correct for this overemphasis on kids by avoiding them. You’d want to help anyone but kids. And now that you all know this, I’ll wait to hear that massive rumbling from the vast stampede of folks switching their charity away from kids. … All clear, go ahead. … Don’t be shy …


Reward or Punish?

Many reality TV shows, like Project Runway, Hell’s Kitchen, or Survivor, focus on punishing the worst, instead of rewarding the best. Not only do viewers seem to find that more interesting, it actually works better to incentivize performance (many quotes below). Punishment works better to encourage lone behavior, to encourage behavior in a group, and as a tool for letting some group members encourage others.

The puzzle is that in most of our social worlds we instead focus on rewarding the best, not punishing the worst. If you search for “punish reward” you will mostly find the issue raised about how to treat kids; we are mainly willing to use punishment flexibly on them. And this even though young kids are the main exception – for them, punishment works worse. For adults, we tend to limit punishment’s use to extreme behavior that we all strongly agree is bad, like crime. And when you ask adults, they much prefer to be part of a group that uses rewards, not punishment.

As a college teacher, I expect that I’d get more effort from most students by regularly pointing out the worst student in the class than the best. But I also expect students to hate it and give me low evaluations. Similarly, I expect that if I wrote the occasional post criticizing a bad blog commenter here, instead of praising a good one, I’d get more change in commenting behavior. But I also expect that person to complain long and loud about how I was biased and unfair, and others to come to their defense. I expect a lot less complaining about bias in picking the best.

In both the class and comment cases, I expect people to see me as mean and cruel for punishing the worst, but kind and generous for rewarding the best. This even though all of these effects are relative – punishment would raise the rest of the class, or the rest of the commenters, up above the worst.

Note that rewarding the best is in practice more elitist than punishing the worst; punishing creates an underclass, not an overclass. And in fact our hyper-egalitarian forager ancestors were quite reluctant to overtly reward or praise; they focused their social coordination on having the group punish norm violators. Our hypersensitivity to being punished, and our elaborate instinctual strategies to give excuses and to coordinate to retaliate against any who might suggest we should be punished, are probably human adaptations to that forager history. And they make us especially unwilling to accept punishment by an authority, instead of by the informal consensus of the group.

This seems an interesting example of our seeking to avoid aspects of the forager way of life. Our forager-evolved aversion to being singled out for social shame is so strong that we’d rather create elites instead. At least this applies when we are relatively rich and comfortable. If we really feared being destroyed for lack of sufficient efforts, as farmers often did, we’d probably be a lot more eager to raise overall efforts by punishing the worst. I suspect that foragers themselves didn’t punish much in good times; punishment was invoked more, and mattered more, in hard times. In good times foragers probably more tolerated praising some as better, and weak forms of bragging.

In a more competitive future, with organizations and individuals that compete harder to survive, I’d expect more use of punishment, in addition to reward.

Today if you have a group that really needs to succeed, and to induce strong efforts all around, consider paying the social disruption costs of punishing the worst, instead of rewarding the best. You will probably get more effort that way, even if people end up hating you and calling you evil for it. And if your group doesn’t punish and fails, know that your reluctance to punish was probably a contributing factor.

Those promised quotes: Continue reading "Reward or Punish?" »


Imagining Futures Past

Our past can be summarized as a sequence of increasingly fast eras: animals, foragers, farmers, industry. Foragers grew by a factor of about four hundred over two million years, farmers grew by a factor of about two hundred over ten thousand years, and the industry economy has so far grown by a factor of about eight hundred over three hundred years. If this trend continues then before this era grows by another factor of a thousand, our economy should transition to another even faster growing era.
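The arithmetic behind these era summaries is worth making explicit. A minimal sketch (using only the rough growth factors and durations quoted above, which are order-of-magnitude figures, not precise data) shows how sharply the implied doubling time falls from era to era:

```python
import math

# Rough era growth factors and durations from the text (order-of-magnitude figures)
eras = {
    "forager":  (400, 2_000_000),   # ~400x growth over ~2 million years
    "farmer":   (200, 10_000),      # ~200x growth over ~10,000 years
    "industry": (800, 300),         # ~800x growth over ~300 years (so far)
}

for name, (factor, years) in eras.items():
    doublings = math.log2(factor)        # number of doublings implied by the factor
    doubling_time = years / doublings    # average years per doubling
    print(f"{name:9s} ~{doublings:.1f} doublings, one per ~{doubling_time:,.0f} years")
```

Each era packs in roughly eight to ten doublings, while the time per doubling drops from hundreds of thousands of years, to about a millennium, to decades.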

I saw the latest Star Trek movie today. It struck me yet again that such stories, set two centuries in our future, imagine an unlikely continuation of industry era styles, trends, and growth rates. At current growth rates the economy would grow by a factor of two thousand over that time period. Yet their cities, homes, workplaces, etc. look quite recognizably industrial, and quite distinct from either farmer or forager era styles. The main ways their world is different from ours is in continuing industry era trends, such as to richer and healthier individuals, and to more centralized government.

While this seems unlikely, it does make sense as a way to engage the audiences of today. But it leads me to wonder: what if past eras had set stories in imagined futures where their era’s trends and styles had long continued?

For example, imagine that the industrial revolution had never happened, and that the farming era had continued for another ten thousand years, leading to more than today’s world population, mostly farming at subsistence incomes within farmer-era social institutions. Oh there’d be a lot of sci/tech advances, just not creating much industry. Perhaps they’d farm the oceans and skies, and have melted the poles. Following farmer era trends, there’d be less violence, and longer term planning horizons. There’d be a lot more thoughtful writings, but without much intellectual specialization having arisen. Towns and firms would also still be small and less specialized.

Or, imagine that the farming revolution had never happened, but that foragers had continued to advance for another two million years, also reaching a population like today. They’d still live in small wandering bands collecting wild food, but in a much wider range of environments. Maybe they’d forage the seas and the skies. Their brains would be bigger, their tools more advanced, and their culture of participatory dance, music, and stories far more elaborate.

These sound like fascinating worlds to imagine, and would make good object lessons as well. Our future may be as different from the world of Star Trek as these imagined worlds would be from our world today.


A City Is A Village Of Villages

There have been three major eras of human history: foraging, farming, and industry. During each era our economy has grown at a roughly steady exponential rate, and I’ve written before about some intriguing patterns in these growth eras: eras encompassed a similar number of doublings (~7-10), transitions between eras were much shorter than prior doubling times, and such transitions encompassed a similar number of growth rate doublings (~6-8). I’ve also noted that transition-induced inequality seems to have fallen over time.

I just noticed another intriguing pattern, this time in community sizes. Today in industrial societies roughly half of the population lives in metropolitan areas with between one hundred thousand and ten million people, with a mid size of about a million. While good data seems hard to find, during the farming era most people seem to have lived in communities (usually centered around a village) of between roughly three hundred and three thousand people, with a mid size of about a thousand. Foragers typically lived in mobile bands of size roughly twenty to fifty, with a best size of about thirty.

So community sizes went roughly from thirty to a thousand to a million. The pattern here is that each new era had a typical community size that was roughly the square of the size during the previous era. That is, a city is roughly a village of villages, and a village is roughly a band of bands. We could extend this pattern further if we liked, saying that an extended family group has about four to eight members, with a mid size of six, so that a band is a family of families. (We might even go further and say that a family is a couple of couples, where a couple has two or so members.)
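This squaring rule is easy to sanity-check with the rough mid sizes given above (family ~6, band ~30, village ~1,000, city ~1,000,000); since these are order-of-magnitude figures, “roughly the square” means agreement within a small factor:

```python
# Rough mid community sizes per era, from the text
sizes = {"family": 6, "band": 30, "village": 1_000, "city": 1_000_000}

names = list(sizes)
for prev, cur in zip(names, names[1:]):
    predicted = sizes[prev] ** 2          # squaring rule: next era ~ (previous size)^2
    actual = sizes[cur]
    error = max(actual, predicted) / min(actual, predicted)
    print(f"{cur:7s} actual ~{actual:>9,}  predicted ~{predicted:>9,}  (off by {error:.1f}x)")

# Extrapolating one more step gives the next era's community size
print(f"next era ~{sizes['city'] ** 2:,}")  # a trillion
```

Each prediction lands within about twenty percent of the stated mid size, and squaring the city figure gives the trillion-soul community discussed below.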

If previous growth patterns were to continue, I’ve written before that a new growth era might appear sometime in roughly the next century, and over a few years the economy would transition to a new growth rate of doubling every week to month. If this newly-noticed community size pattern were to continue, the new era would have communities of size roughly a trillion, perhaps ranging from ten billion to a hundred trillion.

Admittedly, after a year or two of this new era, things might change again, to yet another era. And the growth and community size trends couldn’t both continue to that next era, since a community size of a trillion trillion would require much more than twenty doublings of growth. So these trends clearly have to break down at some point.

I’ve been exploring a particular scenario for this new era: it might be enabled and dominated by brain emulations, or “ems.” Interestingly, I had already estimated an em community size of roughly a trillion based on other considerations. Ems could take up much less physical space than do humans, and since ems could visit each other in virtual reality without moving physically, em community sizes would be less limited by travel congestion costs.

So what should one call a city of cities of a trillion souls? A “world”?


Placebos Show Care

Something similar to the placebo effect occurs in many animals. … Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.

Likewise, those people who think they are taking a drug but are really receiving a placebo can have a response which is twice that of those who receive no pills. In Siberian hamsters and people, intervention creates a mental cue that kick-starts the immune response. …

The Siberian hamster subconsciously acts on a cue that it is summer because food supplies to sustain an immune response are plentiful at that time of year. We subconsciously respond to treatment – even a sham one – because it comes with assurances that it will weaken the infection, allowing our immune response to succeed rapidly without straining the body’s resources. … Farming and other innovations in the past 10,000 years mean that many people have a stable food supply and can safely mount a full immune response at any time – but our subconscious switch has not yet adapted to this. (more)

OK, but the key question is: why would getting a placebo pill ever have been a credible signal that you could safely turn on your immune system? If for our ancestors treatments like pills tended to be very effective at improving health, you might think that a pill would give you so much extra energy that you could afford to spend some of that extra on your immune system. But pills are rarely that effective, and your body would quickly notice that fact.

My “showing that you care” theory, that the main function of medicine is to signal concern, fits well here. The idea is that we are reassured by the fact that people take the trouble to take care of us.

The most severe part of our ancestors’ environment wasn’t the weather, it was other humans. When people were sick, they worried that their rivals and enemies would use that opportunity to hurt them. If such harms were coming, they had to be attentive, wary, and ready to act — they couldn’t afford to turn on their immune system, which would make them lethargic.

But if someone had caretakers, who spent time and other resources to take care of them when they were sick, why then such caretakers would probably also protect them from rivals. So they could afford to turn on their immune system. If your associates spend resources to buy you pills, and then take time to make sure you take certain pills at certain times, they probably care enough to protect you from rivals.


Sex At Dusk v. Dawn

Two years ago I was persuaded by the book Sex At Dawn, at least on its “key claim, that forager females were sexually promiscuous.” While I didn’t buy the authors’ free-love scenario, I thought our ancestors were much less tied to their sex partners than most folks realize:

A Hadza man hunts big game to look sexy, even though that retrieves less food. Except that when a woman he has sex with has a kid he thinks is his, he’ll gather more but less-sexy food, to give this woman ~1/2 of her food for one year, ~1/4 for the next two years, and declining amounts thereafter. Now, yes, this may be more pair-bonding than in chimps or bonobos. But it is also far less than the farmer ideal of life-long monogamy! Many men today reluctant to marry for life would be ok with this level of commitment.

Lynn Saxon has a new book Sex At Dusk, quite critical of Sex At Dawn. She was kind enough to send me a copy, which I’ve just read. Searching, I’ve only found positive reviews of it (here, here, here, here).

Saxon doesn’t write as well, and especially fails to summarize well. Even so, she does successfully undercut many Sex At Dawn arguments. In humans, sexual jealousy is a universal, females are picky about sex partners, penises aren’t over-sized, testes are small, sperm production slow, and the evidence doesn’t suggest a great deal of sperm competition. Female chimps have little extra-group sex, bonobos don’t usually mate face-to-face, and many Sex At Dawn quotes are misleading, given their context.

On sound during sex, Saxon offers evidence that female primate cries during sex aren’t simple invitations:

A recent study of female chimpanzees found no support for the sperm competition hypothesis: females did not produce calls when mating with low-ranked males so it was not about inviting other males to join the party, and calls did not correlate with fertility and the likelihood of conception. … Females called significantly more while mating with high-ranked males, but suppressed their calls if high-ranked females were nearby. …

[Researchers] found that [bonobo] females were more likely to call with male rather than female partners but the patterns of call usage were very similar in that females called more with high ranked partners (as in chimpanzees), regardless of the partner’s sex. With a female partner copulation calls were consistently produced only by the lower ranking of the two females. … In bonobos calls increased in the alpha female’s presence. (p.279-280)

Yes, it looks like chimp and bonobo sex calls are more brags than invitations. Even so, brags make little sense when it is common knowledge who has sex with whom. So a habit of similar bragging sex calls by human females would suggest that humans often didn’t know who was having sex with whom, suggesting a lot of promiscuity.

A key question, to me, is what percentage of our forager ancestor kids were fathered outside pair-bonds. That is, what fraction of kids were born to mothers without a main male partner, or had a father different from that partner. This number says a lot about the adaptive pressures our ancestors experienced related to various promiscuous and polyamorous arrangements today. And hence says a lot about how “natural” are such things.

Alas, none of these authors give a number, but my impression was that Saxon would estimate less than 20%, while the Sex At Dawn authors would estimate over 50%. Even 20% would be consistent with a lot of human promiscuity adaptations, such as female sex brag calls. I asked Saxon directly via email, however, and she declined to give a number – she says her main focus was to argue against Sex At Dawn’s “paternity indifference” theory (that humans don’t care which kids are theirs). Which is fine – that is indeed a pretty crazy theory.

So where does the evidence sit on promiscuity? Our closest living relatives, chimps and bonobos, are quite promiscuous. Yes, their pair bonds are much weaker than ours, and pair-bonding usually greatly reduces promiscuity. But few pair-bonded animals live in big social groups where hidden extra-pair sex is so easy to arrange, and humans live in even bigger groups than chimps or bonobos. For humans, we have lots of clear evidence of outside-pair sex, mate-guarding to prevent such, bragging sex cries, and desires for sexual variety. And humans do seem to spend a record fraction of their time thinking about and doing sex.

Since humans usually have clear overt norms against extra-pair sex, but strong urges to arrange covert sex, promiscuity estimates come down in part to estimates of how well humans actually enforced their norms. My homo hypocritus theory suggests a lot of covert norm violation, and so a lot of promiscuity. So I’ll estimate the key promiscuity number in the 20-30% range.

Btw, here’s another fascinating quote from the book:

Hrdy writes about a startling interview with an old hunter in which he reminisces to a time when just the sound of his footsteps on the leaves of the forest floor struck terror in the hearts of old women. He was the socially sanctioned specialist in eliminating old women deemed no longer useful: coming up behind an old woman he would strike her on the head with his axe. (p.216)


The Riddle of Ritual

Since I love social puzzles, I was pleased to discover the book The Problem of Ritual Efficacy:

Ritual has come to be thought of in popular discourse as a kind of action that is ineffective, superficial, and/or purely formal, and this view is the unexamined premise behind much of ritual studies. This attitude explains why …we “know it when we see it” – and what we know to be rituals when we see them are acts that are apparently non rational, in which the means do not seem proportionate to the ends, the intended objects of human action are non empirical beings, or the theories of efficacy that ostensibly explain the ritual acts are inconsistent with modern, scientific paradigms. This reaction is similar to what an archaeologist does when he discovers a structure whose purpose is unclear – he calls it a temple. … The notion that ritual is ineffective is false. … Shamanic rituals heal, legal rituals ratify, political rituals unify, and religious rituals sanctify. Rituals transform sick persons into healthy ones, public spaces into prohibited sanctuary, citizens into presidents, princesses into queens … One of our most important tasks as scholars is to explain how rituals accomplish these things. (pp.6-7)

I bought and read that book, and have also been reading The Creation of Inequality, which emphasizes the centrality of rituals to foragers:

Cosmology, religion, and the arts were crucial to hunters and gatherers. … The lessons of myth were passed on audio visually. Performances combining art, music, and dance fixed in memory the myth and its moral lessons. … We doubt that art, music, and dance arose independently. More than likely they evolved as a package that committed sacred lore to memory more effectively than any lecture. … The archeological data suggest … that the use of the arts increased as larger social units appeared, because each moiety, clan, section, or subsection had its own body of sacred lore to commit to memory. … Dancing, drinking, and singing for days, as some tribes did, opened a window into the spirit world and thereby confirmed its existence. (pp. 62-63)

Also, reading a BBS article on “ritual behavior”, I came across this comment by Bjorn Merker:

Literal duplication … lies at the very heart of ritual. The need to remember and reproduce essentially arbitrary details on an obligatory basis burdens behavior with a handicap, and the ability to sustain that burden is proof of capacity, and hence tends to impress. … There is reason to believe that humans by nature are carriers of ritual culture in the sense just defined. We, in contrast to chimpanzees – indeed, alone among all the primates but like many songbirds and the humpback whale – are in possession of a neural mechanism that allows us to duplicate with our voice that which we have heard with our ears. Most mammals, who excel at learning in other respects, are incapable of doing so; yet we humans do so with every song we know how to sing and with every word we know how to pronounce. … Such a perspective helps us understand why ritual form marks human culture not only in domains touched by precautionary concerns, but in well nigh every area of human pursuit. (p.624)

While much about ritual remains puzzling, one thing seems clear: the essential human difference, the one that has let us conquer the Earth, was an ability to accumulate useful innovations via culture. And this primarily required a good ritual sense, i.e., a good way to watch and copy procedures like starting a fire, shaping a knife, etc. Having language and big neighbor-friendly social groups helped, but were less essential.

The fact that language requires good vocal ritual skills suggests better ritual abilities appeared before full recursive language. Since they were very social, pre-language but post-ritual humans probably filled much of their social lives with complex rituals, showing off their abilities to precisely execute complex procedures, and showing loyalty via doing their group’s procedures. Music, dance, and other such ritual habits continued after full language.

Language let us better express and enforce complex social norms, and helped us gain a more conscious and flexible understanding and control of our procedures. But we still have poor introspective access to our pre-language systems, and so still don’t know a lot about why we do which procedures when, and why they make us feel good or bad.

When humans believe in hidden spirits who take an interest in whether they follow social norms, hard-to-understand rituals offer a natural place to locate their connection to spirits. Humans can believe that spirits watch their rituals, and then respond by making them feel good or bad. This helps us understand why religions emphasize rituals.

Added: This poll has a majority favoring culture coming before language.


Forager, Farmer Morals

Looking for insight into farmer-era world views, I just read the 1931 novel The Good Earth, about Chinese farmers. It is of course more a morality tale than a documentary, and the main character soon gets rich, and is then no longer a representative farmer. But the story illustrates differences between farmer vs. forager style morality.

Foragers live in close egalitarian bands, with behavior well adapted to their environment. So forager morality issues are mostly about well-adapted personal behavior in conflict with group interests. Foragers sin by bragging, not sharing, being violent against associates, etc.

Farmer morality, in contrast, is much more about conflicts within people than within groups. Farmers sin by being lazy, wanting overly fancy foods, taking drugs, having sex with prostitutes, wanting status markers that cost too much in the long run, etc. Farmers need to resist internal temptations to do things that might make sense for foragers, but which can ruin farmers. These can also ruin one’s family and friends, so farmer sins also have shades of selfishness.

Of course farmers also care about bragging, violence, etc. In some sense farmers have more morality – more and stronger rules, to fight against stronger natural inclinations. So farming culture introduced religion and stronger social pressures to enforce their rules, to keep farmers from relapsing into foragers.

This helps me make sense of Jonathan Haidt’s observations that liberals, who I’ve called forager-like, rely on fewer moral principles than conservatives, who I’ve called farmer-like:

The current American culture war, we have found, can be seen as arising from the fact that liberals try to create a morality relying primarily on the Care/harm foundation, with additional support from the Fairness/cheating and Liberty/oppression foundations. Conservatives, especially religious conservatives, use all six foundations, including Loyalty/betrayal, Authority/subversion, and Sanctity/degradation. (more)

I’ve suggested that as we’ve become richer, we’ve become more forager-like. If our descendants get poor again, they’ll probably need stronger social norms again, to get them to resist temptations to act like foragers and do what is functional in their world. Their morality would probably rely on a wider more-conservative-like range of moral feelings.

In the em scenario I’ve been discussing here, sex would be unimportant except as a possible way to waste too much time. So em morality would be pretty liberal on sex. But money, work, and reputation would be important – ems would probably have pretty conservative attitudes on keeping their word, doing their job, obeying their boss, and not stealing. When mind theft or virus corruption are big risks, they’d also probably have strong purity feelings about avoiding acts that could risk such harms. And they’d probably feel strong clan loyalty, even beyond what farmers feel, to the clan of copies of the same original human.


Taming The Wild Idea

Foragers distinguish between camp and the wild. In camp, things are safe and comfortable, and people should be pleasant. The wild, in contrast, is dangerous and uncontrolled. In camp, some of us must watch out for intrusions from the wild, such as storms, wild animals, or hostile tribes.

Some of us must also periodically venture into the wild, to bring back food and other useful materials. But it is important that whatever we bring back be tamed before it gets here. Don’t bring back live dangerous animals, don’t leave poison berries around camp where people might think they are safe, and leave violent aggressive hunt habits out there in the wild. What happens in the wild, should stay in the wild.

Ideas and concepts can be dangerous and disruptive. Ideas influence the status and attractiveness of people and activities, and who is blamed and credited for what outcomes. For a society vulnerable to social disruption, ideas can be wild.

Today, most of the ideas and concepts that we come across have been tamed. They have long been integrated into our ways of thinking, and we have worked out attitudes and opinions to help us avoid being cut by their sharp edges.

But today we must also deal with a steady stream of new untamed ideas. Some of these are the side effect of ordinary people doing ordinary things. Others come from intellectual explorers, who purposely venture into the wild in search of new ideas. How do we tame such ideas?

We celebrate our intellectual explorers, both those who come back with useful ideas, and those whose useless ideas show off their impressive explorer abilities. But we are also wary of their trophies, just as foragers would be wary of a hunter bringing a strange live animal into camp. We want people we trust and respect to tame those ideas before letting them flow free in our camp of easily discussed ideas. Wild explorers, who may have “gone native”, can be useful in expeditions, but must remain under the control of more civilized explorers.

I think this helps us understand why universities, some of the most conservative institutions we have, are home to our most celebrated intellectuals. Academic institutions such as universities, academic journals, peer review, etc. seem far from ideal ways to encourage innovative ideas. But they seem like better ways to assure outsiders that ideas have been safely tamed. The new ideas that academics endorse can be safely quoted and applied with minimal risk of wild uncontrolled disruption. So when ideas originate among wild untamed academic-outsiders, we prefer to attribute them to the safe academic insiders who tame them.

When we are willing to risk being exposed to wild untamed ideas, we turn less to academics, and more to startup companies, passionate writers, activists, etc. And in our youth, many of us are eager for such exposure, to show that we are no longer children who must stay safely in camp – we are strong and brave enough to venture into the wild.

But when we get children of our own, and feel less a need to show off our derring-do, we prefer tamed idea sources. We prefer to hire kids who got their ideas from universities, not startups or activists. And most prefer their news to come from similarly tamed journalists. We applaud wild ideas, but prefer them tamed.


Testing My Growth Model

I have suggested that long run growth can be described as a sequence of exponential growth modes, from primates to foragers to farmers to industry, where mode transitions are similar in their degree of suddenness and in their growth rate change factors. This model will be tested in the future – it suggests that within a century or so we’ll see a transition, taking five years or less, to a new mode where the economy doubles every month or faster.
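A hypothetical back-of-envelope version of that projection (my own illustrative numbers here, not the model’s fitted estimates): take the economy as currently doubling about every fifteen years, and suppose the next transition multiplies the growth rate by a factor of very roughly fifty to two hundred fifty, in the spirit of past transitions. The implied new doubling time lands near the month scale:

```python
# Back-of-envelope projection of the next growth mode's doubling time.
# The 15-year doubling and the 50-250x rate-jump range are rough illustrative
# figures, assumed for this sketch rather than taken from fitted estimates.
DAYS_PER_YEAR = 365.25
industry_doubling_years = 15

for jump in (50, 250):
    days = industry_doubling_years * DAYS_PER_YEAR / jump
    print(f"rate jump x{jump}: new doubling time ~{days:.0f} days")
```

The range comes out at very roughly three weeks to a few months per doubling; the faster end matches the doubling-every-month pace suggested above.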

But my model can also be tested against the past. Our data on the animal, forager, and early farming eras is pretty poor. My hypothesis suggests that the forager era was one big growth mode similar to the farming or industry eras, with a relatively smooth rate of growth in capacity (even if rare disasters temporarily disrupted the use of that capacity), and that the forager to farming transition has a level of smoothness similar to that of the farming to industry transition.

Contrary to my model, many have suggested there was an important comparable revolution in human behavior around 50,000 years ago. My model predicts that growth accelerated smoothly from around 100,000 years ago to the near full speed farming world of about 5000 years ago, similar to the way growth accelerated from 1600 to 1900.

The latest results seem to support my model:

Back in 2000, a now famous scientific paper called “The Revolution That Wasn’t” argued that the then-conventional wisdom that modern human behavior had erupted in a “creative explosion” about 50,000 years ago in Europe was wrong. Rather, anthropologists Sally McBrearty and Alison Brooks contended that modern behavior, including creativity, has deep and ancient roots, going back some 300,000 years ago in Africa (Science, 15 February 2002, p. 1219).

At a meeting here last month, researchers heard new evidence that human evolution took a gradual, rather than revolutionary, course during two other key junctures in prehistory. A study of ancient stone tools from South Africa concludes that hunters manufactured spears with stone points—a sign of complex behavior—200,000 years earlier than had previously been thought. And new excavations at a 20,000-year-old settlement in Jordan, laden with artifacts typical of much later sites, suggest that the dramatic rise of farming villages in the Near East also had early and deep roots. … Many archaeologists now think that apparent “revolutions” are due to gaps in the record or to behavioral shifts triggered by changing conditions, rather than sudden advances in cognition. What appear to be precociously sophisticated behaviors are really reflections of what prehistoric humans were capable of all along. (more)
