Deep thought – Sept 1

September 1, 2009

NOTE: Images in this archived article have been removed.

Many more articles are available through the Energy Bulletin homepage

This week Deep Thought is mainly about the brain and what behaviors may have helped us get to where we are today as a species… -KS


In control? Think again. Our ideas of brain and human nature are myths

Madeleine Bunting, The Guardian
It was browsing in a bookshop that got me started. I was confronted by a bank of bestsellers on the brain: how it works and how we think. There were the books which have attracted huge attention, such as Nudge and Blink, but there were others popularising the new insights of a range of academic disciplines – social sciences such as evolutionary psychology as well as neuroscience – which are radically challenging the most fundamental assumptions on which human beings operate.

Perhaps that sounds a little overblown, but it’s not. Who, dear reader, do you think you are? Do you think your mind is capable of independent judgment and largely directs the course of your life? Do you think that most of your decisions in life have been the product of your rational, conscious self? Do you believe you are in control of your life? Do you cherish ideas such as self-expression, a sense of autonomy and a distinct, self-authored identity? The chances are that, albeit with a few qualifications, most of your answers are yes. Indeed, given a pervasive culture which reinforces all these ideas, it would be a bit odd if you didn’t.

But the point about this new explosion of interest in research into our brains is that it exposes many of these guiding principles of what it is to be a mature adult as illusions. They rest on a profound misunderstanding of how we think, and how our brains work. They are fairytales, about as fanciful and as implausible as goblins.

This is such dramatic stuff that Matthew Taylor at the Royal Society of Arts, which has pioneered public engagement with this new research, argues that we are on the verge of a new Enlightenment. He argues that the 18th-century concept of the individual self has run its course and that a new paradigm of human nature is emerging. Given that assumptions of an autonomous individual underpin every aspect of how we order society and our political economy, educate and tackle social issues, this kind of Big Idea tends to make you feel a tad dizzy…
(23 August 2009)


Brain changes may have led to Stone Age tools

David Perlman, San Francisco Chronicle
Once upon a time in the long evolution of Homo sapiens, a band of our African ancestors learned to use fire for more than cooking meat, lighting the dark or warding off attacking animals.

Those Stone Age people became the world’s first engineers – they discovered that the intense heat of a fire’s embers could make chunks of stone much easier to flake for making tools, and to make them much sharper too.

It was “a breakthrough adaptation in human evolution,” reports an international group of archaeologists and anthropologists. And it may have come about because of changes in those early humans’ brains, other scientists say.

What began at least 165,000 years ago became the most common method of stone toolmaking in Africa by about 72,000 years ago.

The scientists from Africa, Australia and Arizona analyzed nearly 200 ancient tools found around cave dwellings at a South African coastal site called Pinnacle Point, which earlier excavations had shown were inhabited by people of the Middle Stone Age…
(26 August 2009)


How cooking makes you a man

Sarah Karnasiewicz, Salon
Animals of the genus Homo are defined by their little mouths, large guts, big brains — and appetite for bratwurst. This, at least, is the provocative theory of evolution put forth by Dr. Richard Wrangham in his fascinating new book, “Catching Fire: How Cooking Made Us Human.”

Wrangham, the Ruth B. Moore Professor of Biological Anthropology at Harvard University’s Peabody Museum of Archaeology and Ethnology, began his career studying chimpanzees alongside Jane Goodall, and rose to academic acclaim as a primatologist specializing in the roots of male aggression. Naturally, he tends to think of most scientific questions in relation to chimps. And so it was that a few years ago, while sitting in front of his fireplace preparing a lecture on human evolution, he wondered, “What would it take to turn a chimpanzee-like animal into a human?” The answer, he decided, was in front of him: fire to cook food.

For years, accepted wisdom has held that it was a transition to meat eating that prompted human evolution — which makes Wrangham’s hypothesis a radical departure. Yet, the more he tested his theory, the more he found the science to back it up: Cooked food is universally easier to process and more nutritionally dense than raw food, which means adopting a cooked diet would have given man a biological advantage. The energy he once spent consuming and digesting raw food could be diverted to other physiological functions, leading to the development of bigger bodies and brains. And Wrangham’s “cooking hypothesis” not only explains the physical changes that humans underwent but also the social ones: Cooking created a sexual division of labor that informs our ideas of gender, love, family and marriage even to this day. “Humans are adapted to eating cooked food in the same essential way as cows adapted to eating grass, or fleas to sucking blood,” Wrangham concludes. “And the results pervade our lives, from our bodies to our minds. We humans are the cooking apes, the creatures of the flame.”…
(29 July 2009)


Cogito ergo sum, baby

Robert Burton, Salon
I confess the idea of babies carrying on philosophical investigations never crossed my mind until I met Alison Gopnik, professor of psychology at University of California, Berkeley. Gopnik, a cognitive scientist with cross-training in philosophy and common sense, has spent her career carefully and cleverly teasing out the previously unsuspected complexity of a baby’s thoughts. In her new book, “The Philosophical Baby: What Children’s Minds Tell Us About Truth, Love, and the Meaning of Life,” Gopnik incisively and compassionately highlights the extraordinary range of mental capabilities of even the youngest child.

What makes Gopnik’s book stand out from the myriad recent books on consciousness is her overarching insight into the sophisticated ways that even infants think and scheme. Citing her work and that of colleagues, Gopnik makes a convincing case that, from a very early age, even before the acquisition of language, we are actively engaged in assessing everything from statistics (probabilities) to right vs. wrong in a moral sphere. Recently I sat down with Gopnik for a conversation about how each of us began our thinking, and how kids see the world today.

…One of the difficulties in knowing how babies think is that they can’t describe their thought processes. Yet psychologists have devised some very ingenious experiments to show that by age 12 to 15 months, infants with very limited vocabulary are already developing a clear cause-and-effect sense of how the world is put together. Without the benefit of much language, how do you think the brain creates this knowledge?

Alan Turing had one of the greatest scientific insights of the 20th century, when he realized that a physical system that was organized in a particular way could do many of the things that a human mind can do. That idea allowed us to build computers, physical systems that can reason and calculate without language or consciousness. The great idea of cognitive science is that the human brain is a computer — though one profoundly different and vastly more powerful than the ones we have now. Once this idea was out there, it made sense to think that babies’ brains were just as capable of computation as adult brains, even though babies might not be able to report what their brains were doing in a self-conscious reflective way. And that’s just what we’ve discovered. In fact, studying babies can give us new ideas about how to design learning computers.
(13 August 2009)


Tags: Culture & Behavior, Education, Food, Media & Communications