Wednesday, August 3, 2011

Warfare and Complex Societies

I am seriously beginning to suspect that someone at the New York Times is taking our class and using it as inspiration for articles in the newspaper. Today's paper has an article about warfare and the rise of the state. As you no doubt recall from our lecture, this theory, also known as Circumscription Theory, has been around for quite some time. According to the article, recent archaeological finds seem to support this hypothesis.
Dr. Stanish believes that warfare was the midwife of the first states that arose in many regions of the world, including Mesopotamia and China as well as the Americas.

The first states, in his view, were not passive affairs driven by forces beyond human control, like climate and geography, as some historians have supposed. Rather, they were shaped by human choice as people sought new forms of cooperation and new institutions for the more complex societies that were developing. Trade was one of these cooperative institutions for consolidating larger-scale groups; warfare was the other.

Warfare may not usually be thought of as a form of cooperation, but organized hostilities between chiefdoms require that within each chiefdom people subordinate their individual self-interest to that of the group.

“Warfare is ultimately not a denial of the human capacity for social cooperation, but merely the most destructive expression of it,” the anthropologist Lawrence H. Keeley writes in his book “War Before Civilization” (Oxford, 1996).
(Note the use of the term "complex societies" in this excerpt.)

To read the entire article, CLICK HERE.

Tuesday, August 2, 2011

Myth and Star Wars

Since we will not have a chance to watch this excellent film about myth in class, I thought I would place it here for you guys to see. The movie discusses the mythological elements of the Star Wars movie series. Please watch at least the first hour of the movie.

Sunday, July 31, 2011

Love and Marriage in Afghanistan

We will discuss kinship and marriage in class today, so I thought it only appropriate to share this tragic love story that is making the rounds these days. Two Afghan teens who are members of different ethnic groups are currently sitting in jail in Herat for inciting a riot because of their unsanctioned relationship.
This month, a group of men spotted the couple riding together in a car, yanked them into the road and began to interrogate the boy and girl. Why were they together? What right had they? An angry crowd of 300 surged around them, calling them adulterers and demanding that they be stoned to death or hanged.

When security forces swooped in and rescued the couple, the mob’s anger exploded. They overwhelmed the local police, set fire to cars, and stormed a police station 6 miles from the center of Herat, raising questions about the strength of law in a corner of western Afghanistan and in one of the first cities that has made the formal transition to Afghan-led security.

The riot, which lasted for hours, ended with one man dead, a police station charred, and the two teens, Halima Mohammedi and her boyfriend, Rafi Mohammed, confined to juvenile prison. Officially, their fates lie in the hands of an unsteady legal system. But they face harsher judgments of family and community.

Mohammedi’s uncle visited her in jail to say she had shamed the family and promised that they would kill her once she was released. Her father, an illiterate laborer who works in Iran, sorrowfully concurred. He cried during two visits to the jail, saying almost nothing to his daughter.

“What we would ask is that the government should kill both of them,’’ said the father, Kher Mohammed.
Sadly, this story has it "all" - issues of ethnicity (Pashtun and Tajik), gender roles (honor killings, women's rights), marriage traditions (arranged marriage, endogamy), religion (sharia and Taliban), kinship, as well as ethnocentrism (how we react to the story based on our conceptions of love and our stories such as Romeo and Juliet).

Read it HERE.

Urban Foraging

In our subsistence strategy lecture, we mentioned that foraging is not something limited to the hidden corners of the Kalahari, but rather a practice that continues even in industrial societies. To prove this point, the New York Times published an article on Friday about foraging in New York parks. Whether due to the continuing economic crisis or to hipster appeal, there has recently been an upswing in urban foraging, which has prompted New York City to enforce laws against this activity. As the article explains:
New Yorkers are increasingly fanning out across the city’s parks to hunt and gather edible wild plants, like mushrooms, American ginger and elderberries.

Now parks officials want them to stop. New York’s public lands are not a communal pantry, they say. In recent months, the city has stepped up training of park rangers and enforcement-patrol officers, directing them to keep an eye out for foragers and chase them off.

“If people decide that they want to make their salads out of our plants, then we’re not going to have any chipmunks,” said Maria Hernandez, director of horticulture for the Central Park Conservancy, the nonprofit group that manages Central Park.
Yet, as Melissa Poe, a researcher at the Institute for Culture and Ecology points out, these arguments are not so clear cut and more often than not tangled up with cultural ideologies such as the nature-culture dichotomy and an underlying assumption that human use is necessarily destructive. She notes:
Urban foraging and gathering is a vibrant and important practice for diverse urban communities. People gather plants, plant parts, fruits, nuts, salvaged fiber, and fungi in cities to support livelihoods, provide essential foods, medicine, and materials for households. Urban gatherers also express a range of non-utilitarian motivations derived from participating in gathering, including pleasure and enjoyment, connecting with the biophysical world, strengthening social ties, and maintaining cultural traditions. In Seattle, gathering is also one avenue through which urban residents care for their environment; for example, many gatherers intentionally steward important species and have developed strong connections with nature through their gathering practices.

Excluding gathering as a legitimate activity presents a number of disadvantages for urban greenspace managers: 1) It creates confusion about what kinds of plant material are acceptable to remove (invasives, tree fruits, berries?) and who can legitimately do so (restorationists? educators?); 2) It criminalizes what are often otherwise benign gathering activities occurring on public land; 3) It adversely impacts lower-income and food-insecure individuals who may make use of products in urban forests to meet some of their nutritional and medicinal needs; 4) It reduces the urgency for land managers to avoid using toxic herbicides and other chemicals in vegetation management; and 5) It fails to create incentives for gatherers, who often possess sophisticated local environmental knowledge, to become involved in broader city forest stewardship initiatives.
To read the article in the New York Times CLICK HERE.

Melissa Poe's comments appeared in the E-ANTH listserve (http://www.eanth.org/).

Friday, July 29, 2011

Homo sapiens Outnumber Neandertals

A recently published article contends that the extinction of Homo neanderthalensis was due to the fact that they were outnumbered and outsmarted by Homo sapiens.
Mellars and French analysed archaeological evidence in Périgord, a former province of southwestern France, which is renowned for its Neanderthal and early human sites. They found that the population of Homo sapiens that arrived in the region was at least ten times larger than that of the Neanderthals already settled there.

In particular, the area saw a sharp rise in the number and size of early human sites and the detritus of life they left behind, such as stone tools and the remains of animal carcasses, according to a report in Science.

The researchers believe the sheer pressure of being outnumbered was exacerbated by the social and technological advantages that modern humans displayed, from long-range hunting spears to stronger cooperation and communication. The arrival of modern humans coincided with the appearance of elaborate cave paintings, decorative stones and beads, and imported shells, suggesting Homo sapiens had a more complex society than the Neanderthals.
To read the entire article, CLICK HERE.

Higher Latitudes, Larger Eyes

This appears to be a newly proposed "Bergmann's and Allen's Rule" for eyes. Researchers now contend that people at northern latitudes not only have larger eyes, but also larger brains, because they need more processing power to see in dim lighting.
"As you move away from the equator, there's less and less light available, so humans have had to evolve bigger and bigger eyes," said Eiluned Pearce from the Institute of Cognitive and Evolutionary Anthropology at Oxford University, a lead author on the study.

"Their brains also need to be bigger to deal with the extra visual input. Having bigger brains doesn't mean that higher-latitude humans are smarter, it just means they need bigger brains to be able to see well where they live."

This suggests that someone from Greenland and someone from Kenya will have the same ability to discern detail, but the person from the higher latitude needs more brainpower and bigger eyes to deal with the lower light levels.
Read the article HERE.

Tuesday, July 26, 2011

Post-processualism and Biblical Archaeology

In the last 20 years or so, there has been increasing awareness within the archaeological community that our cultural lens has a considerable effect on how we ask archaeological questions. Archaeological remains are now conceptualized by many workers in the field as vehicles for how we construct our modern identities; for example, French national pride played a large role in the way that monuments to the ancient Gauls were constructed at sites like Alesia, where Vercingetorix made his last stand against Julius Caesar in the 1st century BCE. This ancient Gallic general is really much more than just a figure from history now - he symbolizes French resistance to a foreign invader.

In this respect, the past becomes part of our modern cultural capital, and so has value not merely in the abstract, as part of some ostensibly objective history lesson, but instead as part of our subjective, constant creation of identity. This is no small thing in an increasingly globalized world where preservation versus progress creates a near-constant tension. In archaeology, this reflexive awareness that we create our past by how we frame our historical and archaeological questions has a name: Post-Processualism. Not only do Post-Processualists see the past as a cultural resource of the present, they also insist that, because of our own inherent cultural lenses, we can never be fully certain that we have accurately reconstructed the past through archaeological remains. Their logic is as follows: if a researcher's cultural lens causes her to make certain choices about which questions to research, then other questions, which might have provided a different picture, will go unasked. Further, the interpretation of a site will be filtered through her cultural lens.

Another example of how these issues play out in archaeology can be found in the Middle East today. There is perhaps no topic that inspires more impassioned debate in the West than Biblical Archaeology. We appear to be (pardon the pun) hell-bent on tying concrete archaeological remains to the histories recounted in the Bible. Furthermore, the conflict between the Palestinians and the Israelis adds another, often acid-coated, layer to this debate. For a good look at how the fur can fly, follow the jump to a very interesting look at the New Biblical Archaeology of today, framed by the conflict in the Middle East.

Monday, July 25, 2011

Hijras and Discrimination

As you recall from our reading about Hijras (eunuch/transvestite) in India, they are a marginalized community that has suffered a great deal of discrimination. Apparently 3,000 Hijras from all over the country met today in Kolkata to demand a separate quota for government jobs, health care, and subsidized food. It will be interesting to see if any of their demands are met, since I do not think that they have traditionally been recognized by the government as a "scheduled" caste (i.e., an officially recognized caste whose members were historically treated as outcastes) or as one of the "other backward classes" (officially recognized groups that face discrimination or economic disadvantage).

Evolution in New York City

In today's New York Times there is an article that uses New York City as an example of how large urban areas exert selective pressures that propel the process of evolution. What they report seems surprising, but it makes complete sense when you consider that it is consistent with both evolution and punctuated equilibrium. For example:
The researchers inspected 50 traps laid the day before and found seven mice inside. They plopped each mouse out of its trap and into a Ziploc bag. They clipped a scale to each bag to weigh the mice. Dr. Munshi-South gently took hold of the animals so his students could measure them with a ruler along their backs.

Dr. Munshi-South and his colleagues have been analyzing the DNA of the mice. He’s been surprised to find that the populations of mice in each park are genetically distinct from the mice in others. “The amount of differences you see among populations of mice in the same borough is similar to what you’d see across the whole southeastern United States,” he said.
Consistent with the Hardy-Weinberg principle - which holds that allele frequencies in a large, randomly mating population remain in stasis unless forces such as selection or migration act on them - this example of cadmium-resistant worms shows how gene flow can shift a population once the selective pressure is removed:
In 1989, Jeffrey Levinton of Stony Brook University and his colleagues discovered that a population of mud-dwelling worms in the Hudson had evolved resistance to cadmium. They lived in a place called Foundry Cove near a battery factory near West Point. Dr. Levinton and his colleagues found that the worms produced huge amounts of a protein that binds cadmium and prevents it from doing harm.

In the early 1990s, the federal Environmental Protection Agency hauled away most of the cadmium-laced sediment from Foundry Cove. Over nine generations, the Foundry Cove worm populations became vulnerable again. This shift occurred, Dr. Levinton and his colleagues reported last year, as worms from less contaminated parts of the river moved in. They are interbreeding with the resident worms, and the resistant mutations are becoming rarer.
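The population-genetic logic behind the worm story can be sketched numerically. This is a hypothetical illustration using made-up allele frequencies, not the actual Foundry Cove data: under random mating alone, the frequency of a resistance allele stays in stasis, while steady gene flow from nonresistant migrants makes it rarer generation by generation.

```python
# A minimal sketch of Hardy-Weinberg stasis versus gene flow.
# All numbers here are hypothetical, chosen only for illustration.

def next_generation(p, migration_rate=0.0, migrant_p=0.0):
    """Frequency of an allele after one generation of random mating.

    Random mating by itself leaves p unchanged (Hardy-Weinberg stasis);
    with migration, the population frequency moves toward the migrants'
    allele frequency.
    """
    return (1 - migration_rate) * p + migration_rate * migrant_p

# Case 1: resistance allele at frequency 0.9, no migration.
p = 0.9
for _ in range(9):  # nine generations, as in the Foundry Cove account
    p = next_generation(p)
print(round(p, 3))  # -> 0.9 (stasis: the frequency does not change)

# Case 2: each generation, 10% of the population are migrants
# from less contaminated water who lack the resistance allele.
p = 0.9
for _ in range(9):
    p = next_generation(p, migration_rate=0.1, migrant_p=0.0)
print(round(p, 3))  # -> 0.349 (resistance mutations becoming rarer)
```

This deterministic sketch ignores genetic drift, which would matter in small populations; the point is only that it takes a force like migration (or selection) to move allele frequencies away from equilibrium.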
Lastly, the article shows why it is not necessarily always a good idea to be a germophobe:
Evolution is not just taking place in New York’s rivers and parks. It’s also taking place inside its hospitals. In 1997, Dr. John Quale, an infectious diseases physician at SUNY Downstate Medical Center, discovered a newly evolved strain of bacteria in the city that is resistant to most antibiotics.

The bacteria, known as Klebsiella pneumoniae, is often found in hospitals, where it can cause pneumonia and other life-threatening infections. Doctors typically treat Klebsiella with an antibiotic called carbapenem. Dr. Quale and his colleagues discovered carbapenem-resistant Klebsiella in four hospitals in Brooklyn. The new genetic recipe proved to be a winning solution. Dr. Quale’s surveys charted the strain as it spread from hospital to hospital throughout New York. “It’s one strain that’s adapted very well to the hospital environment, and it clearly has a survival advantage over other bacteria,” Dr. Quale said.

Once the new strain had established itself in New York, it began to spread out of the city. It’s now reached 33 other states, and has become a serious problem in other countries including France, Greece, and Israel.
Dr. Quale and his colleagues found that this new strain of Klebsiella is especially dangerous. About half of patients who get infected die. Doctors can cure some infections, but only by using toxic drugs that can cause nerve and kidney damage.
To read the entire article, CLICK HERE.

Sunday, July 24, 2011

Breaking the Clovis Barrier

If you wanted to start a bar fight at an archaeology conference even just ten years ago, all you had to do was bring up the so-called "Clovis Barrier." Clovis was, for decades, the earliest accepted culture of humans in the Western Hemisphere, dating to about 13,000 years ago. It is characterized by beautiful spear points and mass kill sites of megafauna like mammoth and bison. These big game hunters only flourished for about 750 years or so, until the last of the megafauna died out at the end of the Pleistocene. The early portion of the Holocene, analogous to the Mesolithic, or Middle Stone Age, in Europe, saw increasing diversification of resource use and a highly visible change in stone tool (lithic) technologies; the gorgeous, massive Clovis points were abandoned for toolkits better suited to these new lifeways in a shifting climate and changing environment.

So what is this "Clovis Barrier"? It's the idea that there were no people here in the Western Hemisphere before Clovis arose. Even today, some archaeologists absolutely reject the idea that there were any people here in the Western Hemisphere prior to about 13,000 years ago. Their reasoning is that there is just no hard evidence for a discernible culture prior to that point. However, there are a few sites that challenge this notion. Meadowcroft in Pennsylvania, Monte Verde in Chile, the Topper Site in South Carolina, and now Buttermilk Creek in Texas, all purport to contain pre-Clovis stratigraphic levels and artifacts.

Both sides of the debate have valid points. Those who reject pre-Clovis cite the seeming invisibility of these hypothetical pre-Clovis groups within the landscape. Where are the identifiable tool types, they ask? Where are the sites? Those who accept pre-Clovis point out that linguistic and DNA evidence now both support the idea that people entered the Western Hemisphere as early as 25,000 years ago, before the height of the last glacial maximum (LGM, in paleo-climatological speak, also known as Marine Isotope Stage 2, or MIS 2). We know that sites as old as 30,000 years have been found in Siberia, although it is open to debate whether those sites were occupied by modern humans or Neandertals. And there's ample debate about exactly how people made their way into North America, and onward to South America, as well as how quickly.

While the Buttermilk Creek site may have settled the question for many, much more work on this question will have to be done. For example, if these people did not hunt big game using massive spear points, what did they do for a living instead? If the coastal migration model is correct, then maybe they fished - how visible will these activities be, archaeologically speaking? Maybe people used multiple routes, both inland and coastal, to colonize this hemisphere. If so, the archaeological imprints of their activities will vary accordingly. In summation, this issue isn't likely to become less complicated anytime soon.

Saturday, July 23, 2011

Controversy Over the Morton Skulls

This article looks at the Morton Skulls controversy which revolved around the question of brain size and intelligence. As the article succinctly explains:
In a 1981 book, “The Mismeasure of Man,” the paleontologist Stephen Jay Gould asserted that Morton, believing that brain size was a measure of intelligence, had subconsciously manipulated the brain volumes of European, Asian and African skulls to favor his bias that Europeans had larger brains and Africans smaller ones.

But now physical anthropologists at the University of Pennsylvania, which owns Morton’s collection, have remeasured the skulls, and in an article that does little to burnish Dr. Gould’s reputation as a scholar, they conclude that almost every detail of his analysis is wrong.

“Our results resolve this historical controversy, demonstrating that Morton did not manipulate his data to support his preconceptions, contra Gould,” they write in the current PLoS Biology.
To read the whole article and watch the video about this fascinating controversy, CLICK HERE.

Children Share; Chimps, Not So Much

Here is an interesting article about a recent Nature study on the difference between children and chimpanzees when it comes to collaborative work and sharing.
“Among great apes, only humans are true collaborative foragers.” Other species might look for food together, but being next to one another is not the same as working together. The only exceptions are the hunting parties of chimps, where several individuals work together to kill monkeys for food. The slain monkeys are shared, but either under duress or in exchange for favours.

With children, things are very different. Studies have shown that children as young as five to seven start sharing resources fairly among one another. On the other hand, when younger children come across a windfall of sweets, they tend to keep the majority for themselves. It’s tempting to think that children only develop a sharing ethic when they approach school age, but Hamann realised that something was missing.

In all the previous studies, scientists had given children an unexpected hand-out. What would happen if the kids had to work together to get their own rewards – a more common situation, and one that better reflects our evolutionary past?
To read the whole article, CLICK HERE.

Friday, July 22, 2011

Obstetrical Dilemmas and Human Culture

The so-called "obstetrical dilemma" is an area of great interest to several subfields of anthropology, because it appears to have initiated a concatenating series of effects on human behavior. In short, this dilemma is caused by the constriction of the pelvic opening as a result of the shift to bipedal locomotion. In order for hominins to walk upright, this opening in the pelvic girdle has become smaller, which means that considerably less room is left for an infant's head to pass through the pelvis during birth. Accordingly, the size of the infant's head has an upper limit; beyond it, the baby can't be born, and both mother and child will die during childbirth.

However, the development of bipedal locomotion resulted in increased opportunity to access a more diverse, and calorie-dense, set of food resources, specifically more protein. One side effect of this is increased brain development, for several reasons: one, if you can get more calories, then your body can allocate more energy to developing a bigger brain; and two, if you are eating a more diverse collection of foods, you'll probably be more motivated to develop tools and strategies to consume them. So, more diverse foods and more tool use equal a bigger brain, but the foundation for these behaviors - bipedalism - equals a smaller pelvic opening. Clearly, this is a problem.

This means that human infants must be born at a very early stage, developmentally speaking. It is no exaggeration to say that a human infant is essentially still a fetus, given how completely helpless s/he is, and how dependent s/he is on her or his mother. A newborn human infant can't even cling onto his or her mother effectively until months after birth, and therefore must be carried by the mother. Further, s/he must be fed every two hours (in some cases, more). For a human infant to survive, the mother must focus a considerable amount of effort on the baby, night and day.

Human mothers must also have help to give birth, generally speaking. In some cases, this even involves dramatic surgical interventions like C-sections. At minimum, a human mother generally requires a birth attendant who can help deliver the infant and assist the mother, whose body must go through considerable trauma to successfully give birth to her child. In most cultures, these attendants are other women who specialize in this; these are midwives in our culture, and even today, many women in our society give birth with midwives in attendance (many of whom are also nurses or otherwise trained in modern Western medicine).

Some anthropologists think that these effects have had considerable impacts on human cultural developments. In 1996, Trevathan argued that birth in bipedal hominins must needs be a social event, and that this difference from non-bipeds is at the root of much of culture (to read the full Trevathan paper, you need access to JSTOR). Certainly, if you consider that a considerable portion of a population - pregnant and laboring females, as well as their small offspring - would need social support to survive, it's reasonable to speculate on the different social mechanisms that might develop to help achieve this goal.

Thursday, July 21, 2011

Catching Evolution in the Act

A new study in the journal Current Biology demonstrates the concepts of micro- and macroevolution that we have been talking about in class. A new "super" mouse that is resistant to the rat poison warfarin has been found in Germany. DNA analysis suggests that this mouse successfully hybridized with another mouse species from Algeria that carries this resistance. The new hybrid is a product of both globalization and natural selection. As the article points out:
At some point in the past, Kohn and his team believe the Algerian mice mated with European house mice, conferring their poison resistance to them. This process is called "horizontal gene transfer," and is usually only seen in microbes.

He added, "The process we describe (horizontal gene transfer) introduces more variation in the genomes of populations than would otherwise, by mutation alone, be available. In that regard it potentially could speed up evolution."

Humans appear to be driving the process. The mice from different regions likely would not have met, were it not for spreading via human agricultural practices. Our use of pesticides also played a part.

"Unprofessional and widespread use of poison seems to have favored the evolution and spread of resistant mice and rats," Kohn said. "However, the novel thing reported here is that it has also enabled a potentially important process (hybridization) to turn up something advantageous that usually is not."
To read the whole article, click HERE.

Wednesday, July 20, 2011

DNA and the Traveling, Friendly Genome

Tracing the paths taken by human populations in prehistory is a subject of great interest to biological anthropologists and archaeologists. DNA is the key to understanding how, and when, people moved from one place to another. The Human Genographic Project has been collecting DNA data from all over the world in order to elucidate this subject, and their results so far can be seen here, after the jump.

But what about human ancestors? Can we get DNA for them? In fact, enough usable genetic material has been recovered from Neanderthal remains that researchers were able to sequence the Neanderthal genome last year. And what they found was remarkable - evidence that at some point, Neanderthal populations interbred with those composed of modern Homo sapiens sapiens.

These researchers found that somewhere between 1% and 4% of the genes in modern humans are inherited from Neanderthals. But here's the most remarkable part of their findings, in my view: only non-African populations interbred with Neanderthal groups. There are several logical conclusions to draw from this: one, that Neanderthals never encountered modern human groups in Africa in any fashion liable to promote intermingling (we can't say that they weren't there, just that the two groups didn't intermix on that continent); two, that two ostensibly different species of human not only co-existed, but managed to do so happily enough to swap a detectable amount of genetic material.

And finally, most humorously, it turns out that modern human groups from Africa have the "purest" DNA of all, if you consider that to mean that they have no genetic inheritance from older, allegedly more "primitive" human ancestors (the issue of how "primitive" Neanderthals really were is a topic for another post). I wonder what people who are biased against Africans will make of that?

Pakistani Culture and Identity

As you read in Chapter 14, culture and identity are important anthropological concepts that will come up again and again in this class. In this vein, I would like to recommend a recent Wall Street Journal article by Aatish Taseer about his slain father, Salman Taseer. The article, which attempts something like a psychoanalysis of a country and its people, does an excellent job of explaining the differences between India and Pakistan and why, in his opinion, Pakistan has suffered an identity crisis since its founding. Here is an excerpt:

To understand the Pakistani obsession with India, to get a sense of its special edge—its hysteria—it is necessary to understand the rejection of India, its culture and past, that lies at the heart of the idea of Pakistan. This is not merely an academic question. Pakistan's animus toward India is the cause of both its unwillingness to fight Islamic extremism and its active complicity in undermining the aims of its ostensible ally, the United States.

In the absence of a true national identity, Pakistan defined itself by its opposition to India. It turned its back on all that had been common between Muslims and non-Muslims in the era before partition. Everything came under suspicion, from dress to customs to festivals, marriage rituals and literature. The new country set itself the task of erasing its association with the subcontinent, an association that many came to view as a contamination. ...

Had this assertion of national identity meant the casting out of something alien or foreign in favor of an organic or homegrown identity, it might have had an empowering effect. What made it self-wounding, even nihilistic, was that Pakistan, by asserting a new Arabized Islamic identity, rejected its own local and regional culture. In trying to turn its back on its shared past with India, Pakistan turned its back on itself.

But there was one problem: India was just across the border, and it was still its composite, pluralistic self, a place where nearly as many Muslims lived as in Pakistan. It was a daily reminder of the past that Pakistan had tried to erase.
To read the whole article, click HERE.

Monday, July 18, 2011

Shunning and the Law

One of the ways in which cultures police themselves and enforce cultural norms of behavior is by defining who gets to live in a society, and who is not allowed to take part. Shunning people who fail to behave in culturally acceptable ways is known throughout the world and from all periods of history. Sometimes it's done as a means to enforce unwritten types of cultural conformity, and other times it's done to enforce written laws.

We don't tend to think of it as something that is done in any sort of formal way in our society, however. It might surprise many to know, then, that a form of shunning is currently being debated for its constitutionality in the state of Georgia. Although the Supreme Court of Georgia has ruled that banning criminals from the state is unconstitutional, it has upheld the right of judges to restrict criminals to specific counties, or forbid them from entering certain regions or counties. Currently, a case involving a mentally ill defendant is making its way through the appeals process and will probably be up for consideration by the Georgia State Supreme Court.

This is an interesting question, legally and anthropologically. At the heart of this debate, the right of the defendant to live as s/he chooses is being counterpoised against the right of our society to be free from criminal activity performed by the defendant. That's not a small question in our culture, because we tend to prefer to uphold the rights of the individual as an ideal, versus the rights of the State.

There are also pragmatic questions to be answered if the court upholds the practice: for example, what will the penalty be if a defendant who has been banned from a county violates that ban? Does this practice address the problem of recidivism, or does it merely ship the problem elsewhere? And finally, because most of these bannings have restricted defendants to rural counties, what sort of long-term effects might there be on the rural counties that are receiving these people?

Wednesday, July 13, 2011

Endangered Languages and The Linguists Who Love Them

While the Whorfian hypothesis has been pretty thoroughly rejected as too simplistic, most cognitive anthropologists are comfortable with the idea that language, cognition, and culture are tied together in a feedback loop, each influencing the others. When one conceptualizes language this way, it is easy to see how each language that dies out takes a small but important piece of the human experience with it. In an increasingly globalized world, languages are dying at an ever-increasing rate; however, linguists aren't sitting idly by! Follow the link for a fascinating look at the work that linguists do to document endangered languages, and check out the trailer for the documentary below.

Tuesday, July 12, 2011

North American Dialects

Please take a look at the full scale North American dialects map.

Note that this will not be on the test, but is just for your own edification.

The Emic Side of Benny Lava

Please check out "Benny Lava" with the proper translation. Please think about how it makes you see the clip differently.



Also read up on Prabhu Deva, the actor who is dancing in the film clip.

Sunday, July 10, 2011

Anthropologist Stanley Ann Dunham

For those of you who might be interested to learn more about Barack Obama's mother, Stanley Ann Dunham, check out this recent interview from the Colbert Report with Janny Scott, who has written a biography about her called "A Singular Woman." In the interview she states that Stanley Ann was "Very open emotionally and intellectually," and that "The only label she ever embraced was anthropologist."



On the Mental Health of Primates in Captivity

Here is a sad article that discusses the mental health of chimpanzees in zoos.
The documented behaviors, which included self-mutilation, repetitive rocking, and consumption of feces, are symptoms of compromised mental health in humans, and are not seen in wild chimpanzees, the authors say. The study found that even chimps at very well regarded zoos displayed the disturbing behaviors.

"Absolutely abnormal behavior and possible mental health issues are most commonly associated with lab chimps," co-author Nicholas Newton-Fisher told Discovery News. "This is one of the reasons we were surprised to see the levels of abnormal behavior that we did -- in chimpanzees living in good zoos."

"We conclude that the chimpanzee mind might have difficulties dealing with captivity," added Newton-Fisher, a primate behavioral ecologist at the University of Kent's School of Anthropology & Conservation.
That last bit about "might have difficulties" sounds like an understatement to me. To read the whole thing CLICK.

An Ethnography of Outsourcing

Maybe some of you have seen the NBC comedy Outsourced, about a group of Indians who work at a call center for an American company peddling tchotchkes. As I mentioned in class, anthropology insists on actually going to a place and experiencing it as the local people do before reaching any conclusions about the place or the people who live there. This is exactly what Andrew Marantz, a freelance journalist (though not an anthropologist), did when he went to Delhi and took a job in the BPO (business process outsourcing) industry. It turns out that what he found was not that funny.

Every month, thousands of Indians leave their Himalayan tribes and coastal fishing towns to seek work in business process outsourcing, which includes customer service, sales, and anything else foreign corporations hire Indians to do. The competition is fierce. No one keeps a reliable count, but each year there are possibly millions of applicants vying for BPO positions. A good many of them are bright recent college grads, but their knowledge of econometrics and Soviet history won't help them in interviews. Instead, they pore over flashcards and accent tapes, intoning the shibboleths of English pronunciation—"wherever" and "pleasure" and "socialization"—that recruiters use to distinguish the employable candidates from those still suffering from MTI, or "mother tongue influence."

(Photo caption: Monica Joshi, 22, kills some time before her graveyard shift at a Gurgaon call center.)

In the end, most of the applicants will fail and return home deeper in debt. The lucky ones will secure Spartan lodgings and spend their nights (thanks to time differences) in air-conditioned white-collar sweatshops. They will earn as much as 20,000 rupees per month—around $2 per hour, or $5,000 per year if they last that long, which most will not. In a country where per-capita income is about $900 per year, a BPO salary qualifies as middle-class. Most call-center agents, however, will opt to sleep in threadbare hostels, eat like monks, and send their paychecks home. Taken together, the millions of calls they make and receive constitute one of the largest intercultural exchanges in history.

Indian BPOs work with firms from dozens of countries, but most call-center jobs involve talking to Americans. New hires must be fluent in English, but many have never spoken to a foreigner. So to earn their headsets, they must complete classroom training lasting from one week to three months. First comes voice training, an attempt to "neutralize" pronunciation and diction by eliminating the round vowels of Indian English. Speaking Hindi on company premises is often a fireable offense.

To read the entire article, CLICK HERE.

We will be talking about globalization and its impacts on and off throughout the session, but I thought that this observation was a really succinct way of explaining the phenomenon without ever calling it such.
Trainers aim to impart something they call "international culture"—which is, of course, no culture at all, but a garbled hybrid of Indian and Western signifiers designed to be recognizable to everyone and familiar to no one.
(PS Of course the producers of the show Outsourced could not resist the urge to include a cow in the photo above. This is not only part of a Western trope, a signifier for "India," but almost interchangeable with the Indians, who are all behind the prominently featured American manager, who is clearly confident and in charge. If I were really cynical, I might even say that there is more than a little ethnocentrism here, with the Westerner represented as the focus/ideal, the Indians somewhere behind him, and hardly a step away from the cow/animals.)

Friday, July 8, 2011

Tiger Mothers and the Mommy Wars

We talked today about Amy Chua's piece, "Why Chinese Mothers Are Superior," which ignited such a firestorm of debate when it appeared in the Wall Street Journal. While Chua tries in this piece to detach literal ethnicity from her parenting practices, she still manages to highlight some very important cultural differences in how we raise our children.

This piece touches on so many very, very sensitive issues in our ongoing debate about who we are as a culture, where we're going, and with whom we compete. Some things to think about when reading this piece might be:
  • The degree to which we invest our children with our own needs and goals for the future of our culture, as opposed to their own ideas;
  • Our mythologies about childhood as a general concept;
  • Our anxiety about the rise of powers other than the United States in the global marketplace;
  • How we define achievement and success.

Multi-tasking and the Wired Generation

Here's a link to a fascinating look at the true efficacy (or lack thereof) of multitasking in today's digital world:


As I type this, I have four tabs open in my browser so that I can monitor my email, select a playlist that fits my mood from Pandora Radio, check the weather for my morning run tomorrow, and write this post. I also have my iPhone right next to the keyboard. Yet I don't consider myself to be multi-tasking right now. There are days when I might be running software analyzing ground penetrating radar data, checking my email, drafting a report, and listening to my iPod. That's a light day of multi-tasking for me.

I have to wonder how truly effective those days really are after watching this Frontline video. Thoughts?

Thursday, July 7, 2011

Was Shakespeare a Stoner?

Apparently there is an anthropologist in South Africa who would like to know if Shakespeare was high on marijuana. This is a very different type of forensic anthropology than that discussed in the following article.
Francis Thackeray, the director of the Institute for Human Evolution at the University of Witwatersrand in Johannesburg, has proposed to dig up Shakespeare's grave—along with the resting places of his family—to see if the skeleton could determine the cause of the bard's death. Hair and keratin from fingernails and toenails could also reveal a pattern of drug use, while a chemical analysis of teeth could expose the use of tobacco or marijuana.

Experts have long speculated whether drugs played a role in Shakespeare's genius; many have noted his references to a "noted weed" and "a journey in his head"—lines that appear in two different sonnets. For a study released in 2001, Thackeray discovered cannabis residue (along with cocaine) on clay pipe fragments found in Shakespeare's garden. Cannabis sativa, the plant from which marijuana is derived, was available in England during the Elizabethan era to make textiles, rope, paper, clothing and sails.
To read the entire article, click HERE.

Forensic Anthropology in the Military

Forensic Anthropology has recently become well known thanks largely to the hit TV show Bones. Here is an example of a real-life forensic anthropologist who works for the US military to help identify the remains of fallen soldiers.
As the military's only active duty forensic anthropologist, Regan unravels mysteries borne of wars in Afghanistan and Iraq, where the most common cause of death is not a bullet but a homemade bomb. She uses DNA, fingerprints, tissue analysis and painstaking observation to make positive identifications. Part of her "noble mission," she says, is making sure the remains survivors receive belong to their loved ones — and no one else.

These are the first wars in which every American battlefield death is autopsied — and, since 2004, the first in which every set of American military remains undergoes a CT scan. In previous wars, autopsies on American combat casualties were rare and CT scans were never done.

Like a casualty notification officer, Regan encounters family members on the worst days of their lives — and delivers painful truths.

In Regan's hands, each case is much more than anonymous remains. It's a fellow service member whose grieving family is desperate for answers.

"I'm a service member, and these people have made the ultimate sacrifice," she said inside a chilly basement morgue of the Armed Forces Medical Examiner in Rockville, Md. "Everything we do is to honor them and make sure we have uncovered the truth."
To read the entire article, click HERE.

Mayan Tomb Uncovered

Archaeologists have used a microscopic camera to peer into an ancient Mayan tomb in Palenque, Mexico.
A Mayan tomb closed to the world for 1,500 years has finally revealed some of its secrets as scientists snaked a tiny camera into a red-and-black painted burial chamber.

The room, decorated with paintings of nine figures, also contains pottery, jade pieces and shell, archaeologists from Mexico's National Institute of Anthropology and History (INAH) reported Thursday (June 23).

The tomb is located in Palenque, an expansive set of stone ruins in the Mexican state of Chiapas. According to the INAH, the tomb was discovered in 1999 under a building called Temple XX. But the stonework and location prevented exploration.
To read the article, click HERE. To see pictures of the site, click HERE.

Monday, June 13, 2011

Exciting Discoveries in Dmanisi

Here is some tantalizing evidence that suggests both an earlier Out of Africa date (1.85 million vs. 1.7 million years ago) and a two-way migration (out of Africa to Asia and back again). Basically, this means that it is possible that our earliest ancestors developed in the Caucasus and then returned to Africa, where they evolved into Homo sapiens. The evidence comes from the Dmanisi site in Georgia (the country, not the state), which has already revolutionized our ideas about Homo erectus and its march across Eurasia. From the article:

"The accumulating evidence from Eurasia is demonstrating increasingly old and primitive populations," said Reid Ferring of the University of North Texas. Dmanisi is located in the Republic of Georgia.

"The recently discovered data show that Dmanisi was occupied at the same time as, if not before, the first appearance of Homo erectus in east Africa," the team led by Ferring and David Lordkipanidze of the Georgia National Museum reported. They uncovered more than 100 stone artifacts in deep layers at the site. Previously, fossil bones from a later period had been found at the site.

The new discovery shows that the Caucasus region was inhabited by a sustained population, not just transitory colonists. "We do not know as yet what the first occupants looked like, but the implication is that they were similar to, or possibly even more primitive than, those represented by Dmanisi's fossils," Ferring explained.

The occupants of Dmanisi "are the first representatives of our own genus outside Africa, and they represent the most primitive population of the species Homo erectus known to date," added Lordkipanidze. The geographic origins of H. erectus are still unknown.

The early humans at Dmanisi "might be ancestral to all later H. erectus populations, which would suggest a Eurasian origin of H. erectus," said Lordkipanidze. However, there's another theory as well: H. erectus originated in Africa, and the Dmanisi group might represent its first migration out of Africa.

To read the article, click HERE.


Thursday, June 9, 2011

The Earth is Full

The most read story in the New York Times today is by Thomas Friedman who discusses ecological footprints to make the case that we may be reaching a tipping point with regard to the environment.
You really do have to wonder whether a few years from now we’ll look back at the first decade of the 21st century — when food prices spiked, energy prices soared, world population surged, tornados plowed through cities, floods and droughts set records, populations were displaced and governments were threatened by the confluence of it all — and ask ourselves: What were we thinking? How did we not panic when the evidence was so obvious that we’d crossed some growth/climate/natural resource/population redlines all at once?

“The only answer can be denial,” argues Paul Gilding, the veteran Australian environmentalist-entrepreneur, who described this moment in a new book called “The Great Disruption: Why the Climate Crisis Will Bring On the End of Shopping and the Birth of a New World.” “When you are surrounded by something so big that requires you to change everything about the way you think and see the world, then denial is the natural response. But the longer we wait, the bigger the response required.”

Gilding cites the work of the Global Footprint Network, an alliance of scientists, which calculates how many “planet Earths” we need to sustain our current growth rates. G.F.N. measures how much land and water area we need to produce the resources we consume and absorb our waste, using prevailing technology. On the whole, says G.F.N., we are currently growing at a rate that is using up the Earth’s resources far faster than they can be sustainably replenished, so we are eating into the future. Right now, global growth is using about 1.5 Earths. “Having only one planet makes this a rather significant problem,” says Gilding.
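The "1.5 Earths" figure quoted above is, at bottom, a simple ratio: humanity's total ecological footprint (demand) divided by the planet's biocapacity (supply), both measured in the same units. Here is a minimal sketch of that arithmetic; the function name and the round numbers are my own illustrative assumptions, not G.F.N.'s actual accounts, which are far more detailed:

```python
# Illustrative sketch of the "number of Earths" ratio described in the
# article: demand (ecological footprint) over supply (biocapacity).
# The input figures below are hypothetical round numbers, chosen only
# so the ratio comes out to the 1.5 Earths quoted in the post.

def earths_needed(footprint_gha, biocapacity_gha):
    """How many planet Earths the given demand implies."""
    return footprint_gha / biocapacity_gha

# Hypothetical totals, in billions of global hectares (gha):
total_footprint = 18.0    # assumed total human demand
total_biocapacity = 12.0  # assumed regenerative capacity of one Earth

print(earths_needed(total_footprint, total_biocapacity))  # 1.5
```

Any result above 1.0 means consumption is outpacing regeneration, which is what the post means by "eating into the future."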

Read the whole thing HERE.

For those who would like to explore this topic further, I would suggest looking into the debates surrounding Neo-Malthusianism and Cornucopianism.

Friday, May 6, 2011

Our Common Ancestor: Homo Heidelbergensis

We did not have a chance to talk about Homo heidelbergensis in our lectures, but it appears that there is a growing consensus that this hominid species was our common ancestor with the Neandertals. The species is believed to have been widespread in Africa and Asia during the middle Pleistocene, between 781,000 and 126,000 years ago. To read more about the latest study click HERE.

Wednesday, May 4, 2011

Senate Official: Wrong to Link Bin Laden, Geronimo

The news is currently dominated by stories of the capture and killing of al-Qaida leader Osama bin Laden. But here is one story I never expected to hear: "After bin Laden was killed, the military sent a message back to the White House: "Geronimo EKIA" — enemy killed in action."

Once again, we have a case of negative stereotyping of Native Americans and an inappropriate use of their iconography. As Tuell, a member of the Nez Perce tribe in Idaho, states, "These inappropriate uses of Native American icons and cultures are prevalent throughout our society, and the impacts to Native and non-Native children are devastating."

The linking of the Native American hero Geronimo and U.S. enemy Osama bin Laden is another example of the fact that "race," although non-existent in the biological sense, is alive and well as a cultural construct, with strong impacts on society today.

To read more, see: Senate official: Wrong to link bin Laden, Geronimo

UPDATE: Apache tribe is requesting a formal apology from President Obama.
The leader of Apache warrior Geronimo’s tribe is asking President Obama for an apology for the government’s use of his name as a code name for Osama bin Laden.

In a letter to Obama, Fort Sill Apache Tribal Chairman Jeff Houser said equating Geronimo to a “mass murderer and cowardly terrorist” was painful and offensive to all Native Americans. “Right now Native American children all over this country are facing the reality of having one of their most revered figures being connected to a terrorist and murderer of thousands of innocent Americans,” he wrote.

Sunday, May 1, 2011

Cave of Forgotten Dreams (2011) NYT Critics' Pick

I can't wait to see director Herzog's new documentary, "Cave of Forgotten Dreams".

As Dana Stevens, critic at the Washington Post, states, "If you are a member of the human race, you should see this movie."



And this is what Manohla Dargis, NYT reviewer, has to say:

"What a gift Werner Herzog offers with “Cave of Forgotten Dreams,” an inside look at the astonishing Cave of Chauvet-Pont-d’Arc — and in 3-D too. In southern France, about 400 miles from Paris, the limestone cave contains a wealth of early paintings, perhaps from as long ago as 32,000 years. Here, amid gleaming stalactites and stalagmites and a carpet of animal bones, beautiful images of horses gallop on walls alongside bison and a ghostly menagerie of cave lions, cave bears and woolly mammoths. Multiple red palm prints of an early artist adorn one wall, as if to announce the birth of the first auteur."

One of the most amazing things about this new film is that it captures what almost no human eyes have seen for at least 20,000 years. As the Washington Post review points out,

"Chauvet, the most recently discovered and by far the oldest of the great Paleolithic cave-painting sites of Western Europe, has been visited only by a small group of scholars since it was found by three spelunkers on a hike in 1994...Miraculously, Herzog was granted permission to film, and for a six-day period he and a skeleton crew of three descended into the cave with very limited equipment—three cold light panels and a custom-built 3D camera rig—to give the rest of the world what may be the closest look we will ever get at some of the world's earliest works of art."

Follow this link to read more about this amazing new film. To watch an interview with director Werner Herzog: NYT Herzog interview.
Lastly, if you still can't get enough, check out the Chauvet Caves Gallery.

Friday, April 29, 2011

Beer and dehydration



A recent article in Slate takes up the subject of beer, describing the risks of trying to survive by drinking only beer and water. Earlier in class, we discussed the role of beer in early societies and the risk of dehydration. The article describes what happened to an individual in Iowa who took on a strict beer diet. Beer was long touted as a healthy source of nutrition, and even as a preventive for scurvy, until it was discovered that it did no such thing. I hope this article provides some interesting reading on the importance of beer.

During the 46-day feat, J. Wilson consumed only beer and water, emulating a centuries-old tradition once practiced by the Paulaner monks of Munich, Germany. How long could a man survive on beer and water? Not more than a few months, probably. That's when the worst effects of scurvy and protein deficiency would kick in. (Liver disease is a serious risk of chronic alcohol use, but it takes longer to arrive.) If you kept to a strict beer diet—and swore off plain water altogether—you'd likely die of dehydration in a matter of days or weeks, depending on the strength and volume of beer consumed. There's plenty of water in beer, of course, but the alcohol's diuretic effect makes it a net negative in terms of hydration under most conditions.
Scurvy would be an ironic cause of death for a beer-dieter, since the drink was long considered a prophylactic against the disease. For much of the 1700s, doctors administered beer, wort, and malt to prevent the lethargy, wounds, gum disease, fever, and eventual death caused by scurvy. Legendary British explorer Captain James Cook touted the anti-scorbutic effects of beer; his sailors' rations typically included a gallon per day. (The low-alcohol, made-from-concentrate brew would be unrecognizable today.) Beer's failure to quell major outbreaks of scurvy, like those at the siege of Gibraltar in 1780 and aboard the HMS Jupiter in 1781, helped disprove the theory. In 1795, the British admiralty adopted lemon juice as the official cure.

Thursday, April 28, 2011

Death and Dowry in India


This article in the New York Times addresses some alarming patterns of gender-based violence in India. As discussed in class, there is a severe imbalance in the ratio of females to males in India...

In the 2001 census, the sex ratio — the number of girls to every 1,000 boys — was 927 in the 0-6 age group. Preliminary data from the 2011 census show that the imbalance has worsened, to 914 girls for every 1,000 boys.

Women’s groups have been documenting this particular brand of gender violence for years. The demographer Ashish Bose and the economist Amartya Sen drew attention to India’s missing women more than a decade ago. The abortion of female fetuses has increased as medical technology has made it easier to detect the sex of an unborn child. If it is a girl, families often pressure the pregnant woman to abort. Sex determination tests are illegal in India, but ultrasound and in vitro fertilization centers often bypass the law, and medical terminations of pregnancy are easily obtained...

The other side of this problem is that arrests for deaths linked to dowry appear to be slow to come, if they come at all. Families compete for a good groom and will promise a high dowry to secure a wedding. However, the dowry may be more than the bride's family can pay, and slow payment may lead to violence or death; in other cases, the groom's family demands more money than was agreed upon at the time of the wedding. Below is another excerpt from the same article.

Another form of violence against women — dowry deaths — is equally well-documented, and just as ugly, though Indians are so used to these that they have become almost invisible. The names of Sunita Devi, Seetal Gupta, Shabreen Tajm and Salma Sadiq will not resonate strongly for most Indians, though they were all in the news last week for similar reasons. Sunita Devi was strangled in Gopiganj, Uttar Pradesh, the pregnant Seetal Gupta was found unconscious and died in a Delhi hospital, Shabreen Tajm was burned to death in Tarikere, Karnataka, and Salma Sadiq suffered a miscarriage after being beaten by her husband in Bangalore.

Demands for larger dowries by the husband’s family were behind all of these acts of violence, so commonplace that they receive no more than a brief mention in the newspapers. National Crime Bureau figures indicate that reported dowry deaths have risen, with 8,172 in 2008, up from an estimated 5,800 a decade earlier.

Monobina Gupta, who has researched domestic violence for Jagori, a nongovernmental organization, draws a direct link between these killings and the abortion of female fetuses: “The dowry is part of the continuum of gender-based discrimination and violence, beginning with female feticide. Following the arrival of” economic “liberalization in 1992, the dowry list of demands has become longer. The opening up of the markets and expansion of the middle classes fueled consumerism and the demand for modern goods. For instance, studies show that color television sets or home video players have replaced black-and-white television sets, luxury cars the earlier Maruti 800, sophisticated gadgets basic food processors.

“It is similar to what is happening with female feticide,” she said. “As the middle class comes into more money, it is accessing more sophisticated medical technology either to ensure the birth of a boy or get rid of the unborn girl.”

What is the cost to the Indian family of having a girl, or to the boy’s family of forgoing a dowry? The economist T.C.A. Srinivasaraghavan puts the average dowry around 10,000 rupees, or $225. That average figure masks the exorbitant dowry demands that are often made by the family of the groom.

Sunday, April 24, 2011

Agriculture, Food Production Among Worst Environmental Offenders, Report Finds

A new article in Science Daily states that,

"a new report from the United Nations Environment Programme observes that growing and producing food make agriculture and food consumption among the most important drivers of environmental pressures, including climate change and habitat loss."


To learn more about the United Nations Environment Programme report, read: Agriculture, food production among worst environmental offenders, report finds.


But first, here are a few comments related to both the report and your Activity 2. In analyzing your carbon footprints, most of you stated that food consumption made up the largest portion of your total carbon footprint. Many of you also expressed being perplexed, wondering how daily eating can have such immense impacts, especially given that your diets are not even close to gluttonous.

Perhaps the reason this doesn't seem to make sense is that you are considering the impacts of your individual food consumption patterns, rather than how those patterns are themselves shaped by industrial agricultural production practices. It is not necessarily that you are eating so much, but that the food you are eating is energetically expensive: it takes a lot of energy to grow or produce it, and possibly to package and ship it.

Furthermore, factors such as how much packaging is used on the food items you purchase, and how far your food is shipped before it hits your market of choice, are determined by food market management decisions, which are partially the result of widespread industrial practices, which are themselves the result of economic competition and other forces, and so on. All of this is normalized at the societal level. You also contribute to societal norms through your behaviors, such as those related to spending (purchasing power), voting, and consumption. We are all role models (for instance, since the 1990s, demand for organic foods has increased dramatically, with the result that now you can even buy "organic" at Walmart). Some of you pointed this out.

Then there are also simple, unalterable biophysical facts that determine how much energy is consumed in food production. For instance, many of you noted that eating meat substantially increased your carbon footprints. Some of you wondered why. A major reason (but by no means the only one) why beef is especially "expensive" in terms of energy use is simply that cows need far more calories to survive than you get from eating them; they use up most of that energy sustaining their own biophysical life processes. Thus, a lot more grain must be grown to feed them than would be necessary to feed you directly. While this is unalterable, farm management practices, such as the use of fertilizer (made from natural gas and other energy sources) to stimulate the growth of plants that are fed to cows, are more flexible. The energy costs of running farms are also variable. Like the Miller family you read about, many farms could reduce their energy costs through investment in energy efficiency.
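The calorie argument above is easy to see with back-of-the-envelope arithmetic. The sketch below uses a feed-conversion efficiency of 10 percent, a rough rule-of-thumb figure for trophic transfer that I am assuming purely for illustration; the function name and numbers are my own, not data from the report:

```python
# Back-of-the-envelope sketch of why meat is energetically "expensive."
# ASSUMPTION: only 10% of the calories fed to a cow come back as edible
# beef calories. Real values vary by animal, feed, and farm practice.

FEED_EFFICIENCY = 0.10  # assumed fraction of feed calories recovered as beef

def feed_calories_needed(beef_calories, efficiency=FEED_EFFICIENCY):
    """Grain calories that must be grown to yield the given beef calories."""
    return beef_calories / efficiency

daily_diet = 2000.0  # calories one person eats in a day

# An all-beef day would require roughly ten times the grain calories
# (about 20,000) that eating the grain directly would require (2,000).
print(feed_calories_needed(daily_diet))
```

The exact efficiency figure matters less than the structure of the calculation: any efficiency below 100 percent means feeding grain to animals multiplies the land, water, and energy behind each calorie you eat.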

Lastly, according to the article, the new science report states that,

"impacts from agriculture are expected to increase substantially due to population growth, increasing consumption of animal products. Unlike fossil fuels, it is difficult to look for alternatives: people have to eat. A substantial reduction of impacts would only be possible with a substantial worldwide diet change, away from animal products."

However, I would disagree with the implication that little can be done. While we all need to eat, most of you pointed out many ways that we can eat more sustainably in order to decrease our local and global environmental impacts.

Saturday, April 23, 2011

Plan B: Mobilizing to Save Civilization

Reading through your Activity 2 assignments, I have been very impressed. So far, everyone has expressed a strong commitment to decreasing individual carbon footprints. Generally shocked to discover the ecological costs of your lifestyles, you are enthusiastic and energetic about creating change. You care about the environmental legacy you leave to the future. Some of you have rightly pointed out, however, that change at the individual level, while significant when considered collectively, may not be enough. This is where we often get discouraged. However, Lester Brown (U.S. environmentalist, founder of the Worldwatch Institute and Earth Policy Institute) provides a glimpse of the feasibility of creating change at the national and global levels in his inspiring short video Plan B: Mobilizing to Save Civilization:

Friday, April 22, 2011

Ishi Updated

If you have never heard the story of Ishi, the "last wild Indian" in the United States, then I highly recommend that you read this article. If you have read that anthropological classic Ishi in Two Worlds by Theodora Kroeber, then the article provides some interesting updates and new information on Ishi and his remarkable life based on the research carried out by anthropologist Richard Burrill.
The story of Ishi is familiar in part because it’s so remarkable. Known across California and beyond as the “last wild Indian,” he simply walked out of the wilderness one hot August day in 1911 and into civilization. He was 49 years old, or so they estimated, and he was to become one of the most famous Native Americans in history.

What seems to surprise people about Ishi was his ability to embrace Western culture while remaining true to himself. He wasn’t the “savage” that people thought he would be; he was amazingly similar in emotions and behaviors to the white anthropologists who became his friends.

"We make a big thing about these people being Russian, or these people being Indian. But we all have the same basic needs—we all cry, we all laugh,” said Richard Burrill, a teacher and author who’s been writing about Ishi for decades. “There are many more similarities than differences, and that’s what anthropology teaches us.”
The entire article can be accessed HERE.

Wednesday, April 20, 2011

Green Marketing Lies

I've been thinking a lot about how to reduce waste and lower my own carbon footprint, and I think a lot of you would like to do this too. Going over your activities, however, I keep seeing the same lament: you'd like to be less wasteful, but you just can't afford organic vegetables, free-range meat, and hybrid cars. It almost seems like being resource efficient is something only the very wealthy can manage.

But wait! In EVERY SINGLE ACTIVITY you all pointed out that lower-income countries have a smaller ecological footprint than the much wealthier United States. How are those folks managing it?

Not that I'm trying to imply that you should all start living like rural peasants, but I think the problem is that you've been taken in by a very intentional marketing scheme telling you that you have to buy more products in order to be more ecologically efficient. That's not true. Those companies need you to buy their products to stay in business.

While it's true that a new Toyota Prius uses less fuel per mile than a 1988 BMW 325, the energy and resources that go into building that new vehicle, and especially the resources required to keep the factory churning out replacements (since it's normal in America to replace one's vehicle every 3 years or so), are much, much greater than those needed to keep the same old car running until it is actually worn out.

What do you think would create a smaller ecological footprint: every American deciding to purchase a new fuel-efficient hybrid vehicle, or every American deciding to only buy one car every 20 years?
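To make the comparison concrete, here is a back-of-envelope sketch of the two strategies over 20 years. Every number in it (manufacturing emissions per car, annual driving emissions, replacement interval) is a hypothetical placeholder chosen only to illustrate the logic, not a real lifecycle figure:

```python
# Back-of-envelope comparison of two 20-year car strategies.
# ALL figures below are hypothetical, for illustration only.

def lifetime_footprint(cars_built, mfg_tons_per_car,
                       driving_tons_per_year, years):
    """Total CO2 (metric tons) = manufacturing emissions + driving emissions."""
    return cars_built * mfg_tons_per_car + driving_tons_per_year * years

YEARS = 20

# Strategy A: a new hybrid every ~3 years (7 cars over 20 years),
# assuming ~8 t CO2 to manufacture each car and ~2 t/year to drive it.
hybrid_cycle = lifetime_footprint(cars_built=7, mfg_tons_per_car=8,
                                  driving_tons_per_year=2, years=YEARS)

# Strategy B: keep one older, thirstier car the whole time. Its
# manufacturing emissions are already "sunk" (the car exists), so
# cars_built=0, but it burns roughly twice the fuel (~4 t/year).
keep_old_car = lifetime_footprint(cars_built=0, mfg_tons_per_car=8,
                                  driving_tons_per_year=4, years=YEARS)

print(f"New hybrid every 3 years: {hybrid_cycle} t CO2")
print(f"One old car for 20 years: {keep_old_car} t CO2")
```

Under these made-up assumptions, the repeated manufacturing in Strategy A outweighs its fuel savings, which is the point of the paragraph above: the answer depends on embodied costs, not just miles per gallon.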

What if, instead of buying prewashed, chopped, and bagged salad mix, you bought a much less expensive head of unpackaged lettuce? Instead of buying a box of frozen hamburger patties, get a package of ground beef and make your own.

Basically, what I'm trying to say is that the trick to wasting less isn't spending more. It's buying less (an alarmingly simple idea).