10 posts from January 2008

January 31, 2008

First Name Basis: Gender and Familiarity

By Karen Sternheimer

If you are a person of a certain age, you might remember a time when first names were reserved for those closest to us. 

When I was growing up, all adults were to be addressed as Mr., Miss, Mrs. (this was before Ms. became common), or Dr. We did call some of my parents’ closest friends by their first names, but for the most part every adult had a title. Even my friends’ mothers would sometimes call my mom “Mrs. Sternheimer” if they didn’t know her well.

This has changed quite a bit in recent years, as formality has given way to more egalitarian communication (particularly here in laid-back southern California). In-laws are seldom “mom” or “dad” but addressed by their first names. Some kids are encouraged to call their teachers by their first names. Most of my friends’ kids call me Karen (although one acquaintance insists that her daughter call me Ms. Karen), and many of my students do too.

I am on the fence about this. On the one hand, it challenges the hierarchies of status and age, and I want people to feel comfortable communicating with me. It is possible to maintain respect for someone in a position of authority when everyone is on a first-name basis. At several companies I have worked for, it really helped foster communication when the president and vice presidents insisted on being called by their first names.

But there is a definite gender factor at work here. Women sometimes have to try harder to establish their authority, especially if they are young and/or small in stature. There is a fine line between familiarity and disrespect; it’s not always clear when it is crossed.

I have noticed this especially on political talk shows recently. Hillary Clinton is nearly always referred to as “Hillary,” while her male counterparts are mostly identified by their last names. I watch a lot of Sunday morning talk shows, and this seems to be a reliable pattern each week, regardless of whether the pundits are male or female. They speak of a war of words between “Hillary and Obama,” not “Hillary and Barack.” While they may mention the other candidates’ first names, these are nearly always followed by their last names.

So this is not just about men devaluing women, nor is it the work of the political talking heads alone. After all, her campaign placards and bumper stickers say “Hillary” in big letters.

There is, of course, another unusual factor at work here: her need to distinguish herself from her husband and his presidency in her campaign. She could have called herself “Rodham Clinton,” but she curiously dropped her original last name. Perhaps this is an attempt to appeal to more traditional voters (ironically, in a very non-traditional situation).

Her campaign might have chosen the “Hillary” logo to try to overcome the aloofness that critics chide her for. Calling her by her first name is an invitation to familiarity.

But this still strikes me as deeply intertwined with gender. Think back to other presidential candidates. Did we see big, bold “George,” “John,” “Bill,” or “Al” bumper stickers? Maybe it’s that male names are so common in national politics that we need their last names for clarification. After all, there are lots of Johns running for office (interesting double entendre, no?), but any woman’s name really stands out.

No matter how much we might like to think otherwise, gender is central to the way people view presidential elections in particular and authority more generally. Part of Hillary Rodham Clinton’s challenge is to somehow seem to adhere to our gendered expectations while defying them at the same time.

First names can be a slippery slope—I notice that occasionally students’ papers will cite female authors by their first names only, but will not do the same for males. Or, even worse, some students will cite female authors’ ideas and last names but refer to them as “he.”

Gender constantly weaves its way into our relationships, even (or especially) when we are not thinking about it. So let’s think about it…what does using first names mean to you?

January 28, 2008

Real or Imagined: What are You Watching on TV?

By Sally Raskoff

Given the current writers’ strike in the entertainment industry, our television viewing habits may have to change. While late night and scripted shows are in early re-runs, “unscripted” shows are sure to vie for our attention. Reality shows are sold to us as unscripted quasi-documentaries or as competitions rife with drama (although they do employ writers). In any case, these shows have multiplied dramatically in recent years because they are inexpensive to produce and profitable.

Are you already a frequent viewer of reality shows? Do you have your favorite? Do you prefer the competitions and drama of Survivor or Road Rules? How about American Idol or Rock Star? Cowboy U or Coyote Ugly? The Bachelor, Beauty and the Geek, or A Shot at Love with Tila Tequila? Extreme Makeover or Tim Gunn’s Guide to Style? The Apprentice, Miami Ink, Iron Chef, or Project Runway? The Real World or Amish in the City? The Simple Life, Tommy Lee Goes to College, or the Two Coreys? Dancing with the Stars or So You Think You Can Dance? Kid Nation?

Each of these shows offers a slightly different take on the reality formula: collect a group of people to take on tasks that seem formidable and watch to see who annoys you the most, who you want to prevail, and who actually wins or stays the course. 

Shows like Snoop Dogg’s Father Hood, the latest in the Anna Nicole Smith and The Osbournes tradition, don’t have winners or a specific competition, as they are based on witnessing the lives of their subjects. However, competitive elements may intervene as family members vie for attention or realize that better ratings come from upping their “odd” quotient.

Celebrity reality shows have gained such popularity that a new word has emerged: celebreality. Language changes in response to how cultures accept or reject the ideas and concepts people communicate about. We’ll have to wait a few years to see if this word makes it into the dictionary. If it does, that would signal the cultural acceptance and lasting appeal of watching celebrities ostensibly live their lives in front of the cameras.

Most people are aware that reality shows, whether competitions or quasi-documentaries, do not really depict reality. The first television show to document family life was PBS’s An American Family in the 1970s. However, that show had more in common with the documentary genre than with what we now think of as reality TV.

Game shows are similar to reality shows in that they are unscripted and relatively cheap to produce, but there are some important differences. A competition on a game show typically lasts only an episode and does not intend to depict any type of reality outside the studio in which it is filmed. Some competitive reality shows (such as The Bachelor) are similar to game shows, but the competition spans the entire season and the drama of the experience is emphasized over the game.

The difference between reality shows today and those in the past has to do with the degree of reality that is presented and assumed. Contestants on The Dating Game and The Newlywed Game (popular shows in the 1960s) never left the studio. The shows were limited to asking and/or answering questions, often about sex. These dating game shows are vastly different from current dating reality shows. The current shows encourage obvious sexual references and activities, offer a range of racial and ethnic pairings, and do not restrict their participants to those who appear heterosexual.

In addition, on the current shows participants tend to date many different people at one time, whereas on The Dating Game the bachelor or bachelorette could not date all three of those vying to be chosen. I welcome the diversity of participants in the newer shows, as it is a step towards living up to many of our country’s ideals, not the least of which is equality for all. Depicting those who are not heterosexual, white, Protestant, or middle class has the potential to normalize those formerly subordinate or deviant groups of people. This can happen if they are presented not as foils or best buddies, but as people equivalent to those who fit the dominant status model.

Reality shows are similar to the more traditional soap operas that have long dominated daytime TV. The differences rest not only with the “actors” but also with the audiences to whom the shows are marketed: reality shows are aimed at the younger generations, while soap operas target women who are home during the day. With the advent of TiVo and DVD recorders, not to mention changes in our economy and labor practices, a more diverse audience may be watching daytime soaps. However, one look at the advertising between show breaks tells you quickly who they define as their audience.

Take a closer look at the television shows that you watch: do they reinforce or challenge our society’s norms? To whom is the show marketed? Does the advertising that is paired with your show illustrate any notions the network, station, or producers have about their audience? If you do not watch television, what is your reaction to this discussion?

January 25, 2008

Broken Windows

By Bradley Wright

A funny thing happens in our kitchen sink. Sometimes it doesn’t have any dirty dishes in it (okay, not that often, but it does happen). When the sink is empty, my family and I usually put our dishes straight into the dishwasher. At other times, however, there are dirty dishes sitting in the sink. When this happens, we all put any additional dishes straight into the sink, not even considering the extra several seconds it takes to put them into the dishwasher. Why in the world am I writing about my kitchen sink? It turns out that what happens with the sink is a reasonable analogy for one of the more important crime-prevention theories: the theory of broken windows.

The theory of broken windows originated from a 1982 article by James Q. Wilson and George Kelling in The Atlantic Monthly. They started with the idea that some broken windows in a building invite more broken windows. In their words:

“Consider a building with a few broken windows. If the windows are not repaired, the tendency is for vandals to break a few more windows. Eventually, they may even break into the building, and if it’s unoccupied, perhaps become squatters or light fires inside.” 

“Or consider a sidewalk. Some litter accumulates. Soon, more litter accumulates. Eventually, people even start leaving bags of trash from take-out restaurants there or breaking into cars.”

According to Wilson and Kelling, the same holds true for neighborhoods and crime. Just as broken windows invite rocks, and dirty sinks get more dishes, so too certain characteristics of neighborhoods attract and promote crime. A neighborhood riddled with vandalism, litter, and abandoned buildings and cars signals that no one is taking care of it. A neighborhood with lots of petty crime, such as public drunkenness, pickpocketing, and traffic violations, signals that crime is accepted. In both cases the neighborhood sends out a signal that crime is tolerated if not outright accepted, which encourages crime among residents of the neighborhood and attracts criminals from other neighborhoods as well.

The importance of this theory lies in its implications for crime prevention. The way to cut down on crime in a given location, according to broken windows theory, is to change its physical and social characteristics. This can be done by repairing buildings, sidewalks, and roads, and fixing anything that makes a neighborhood look run down. It also means enforcing the law for even the smallest infractions. Police should ticket and/or arrest people for things as small as jaywalking, illegal panhandling, and public disorder. The logic is that by cracking down on small problems, the police are preventing more serious crimes.

The best known application of broken windows theory occurred in New York City, and depending on who you talk to, it was a smashing success in preventing crime, an irrelevant policy, or an invasion of individuals’ rights.

In 1993, Rudy Giuliani—a current presidential candidate—was elected mayor of New York City based on his “get tough on crime” platform. He hired William Bratton as police commissioner. Bratton, who was heavily influenced by George Kelling, applied the principles of broken windows theory. Bratton initiated a program of zero tolerance in which the NYPD cracked down on all sorts of minor infractions, including subway fare dodging, public drinking, urinating in public, and even the squeegee men—people who would wipe the windows of stopped cars and demand payment. A friend of mine who lived in New York City at that time even saw police telling people they could not sit on milk crates on the sidewalk; apparently that was against the law as well.

Almost immediately, rates of both petty and serious crime dropped substantially. In the first year alone, murders were down 19% and car thefts fell by 15%, and crime continued to drop every year for the following ten years.

So, was this application of broken windows an unqualified success? Some critics say no.

In the same time period, crime dropped significantly in other major cities around the country, cities that had not adopted broken windows policies (see figure below). Crime dropped nationwide in the 1990s, and various reasons have been given for this overall crime drop: the crack epidemic of the 1980s was subsiding, and there were fewer people in the 15-to-25 age group, which accounts for so much crime. As such, critics argue, the declines seen in New York City did not result from new police policies; they would have happened anyway.

(Figure: The light blue line represents crime in Newark, NJ; purple, Los Angeles; red, New York; and black, the U.S. as a whole.)

Other critics argue that regardless of the effectiveness of broken windows, it was too costly in terms of individual rights. They claim that the police, emboldened by the mandate to enforce even the smallest of laws, frequently crossed over into harassment of individuals, especially racial minorities and the poor. The application of broken windows, with its zeal for reducing crime, produced unacceptable police behavior.

Nonetheless, the results in New York City were sufficiently interesting that various police departments around the country have adopted principles of broken windows theory. In fact, William Bratton is now the police chief of Los Angeles.

P.S., this post shows that everydaysociologyblog.com covers everything of social importance, including the kitchen sink.

January 22, 2008

Reality Life

By Karen Sternheimer

Confession: I am fascinated by reality shows. Not the game show kind, where there is a contest or people get eliminated (although I was into them at first). My weakness is for the ones that follow people around and promise to give us a glimpse into their everyday lives. I don’t admit this very often, but I have often thought about the significance of shows like The Osbournes, Hogan Knows Best, Hey Paula, My Life on the D-List, and Newlyweds. (Yes, I have watched several episodes—okay, every episode—of all of these shows…and others like them).

Most of these programs feature the daily lives of people at various levels of celebrity, or people who become celebrities based on their appearance on the show. We get an inside glimpse of what it is like to be one of “them” and temporarily feel like members of their inner circle. There’s a bit of a paradox at work here: on the one hand, the shows present everyday behaviors that make these people seem more like “us,” but the fact that they even have a reality show reinforces (or creates) their celebrity.

If you’ve ever seen the Geico insurance ads, you might have noticed that in spots like this one they pair a “real person” with a celebrity, as if the terms were mutually exclusive.

Even though some of the shows, like The Real Housewives of Orange County, The Hills, and My Super Sweet Sixteen focus on people who are not famous (at first), they do have one thing in common with “celebreality”: all the people we are watching are rich.

Are the lives of wealthy people really more interesting than everyone else’s?

It all depends on what a large number of people find interesting. And it just so happens that living in a fabulous home in an exclusive community filled with great stuff is interesting to a lot of people (myself included). This has something to do with how we currently define the American Dream: having financial independence and, of course, fame. What is it like to have all that? What’s it like to be the child of somebody rich and famous?

The flip side to all this should be lost on no one who has ever seen one of these shows, which are edited in such a way as to help us feel a bit superior to their subjects. Now, I would not say that the people on these shows are simply “made to look bad” by the editing alone, as some reality show participants have later complained. But in addition to watching reality denizens bask in their high-tax-bracket status, we get to judge them too. Remember how Jessica Simpson seemed to be, er, intellectually challenged? Or all the dog doo lying around the Osbourne house? The temper tantrums when the “sweet” sixteen-year-old didn’t have her way?

The wealthy people we see on television aren’t always admirable, either. Often shows like The Real Housewives of Orange County (which don’t really feature “housewives,” since nearly all of the women work outside the home and some aren’t married, but that’s the topic of another post) highlight the excesses and superficiality of their subjects. So in a way these shows both celebrate wealth and criticize the wealthy. If we’re not in the exclusive club of the wealthy, watching them might make us feel better about our relatively modest lives.

All of these examples point to the combined fascination and disgust that celebrities often generate. They have come to define what sociologist Thorstein Veblen called the “leisure class” in America. The real upper crust, whose money is not nearly as new, would probably not allow cameras in their home or want to call any attention to themselves, so they remain largely invisible. This helps to maintain the illusion of a completely open society, since it appears that anyone with an interesting personality can be famous, and perhaps rich. As of 2006, only 17 percent of American households earned $100,000 or more, and the wealthiest one percent of Americans hold about one-third of all wealth.

The continued focus on the newly minted rich serves to mask how the real elite got that way. CEOs of major corporations and families with multi-generational wealth and power are off the pop culture radar screen. Sociologist C. Wright Mills called these people the “power elite.”

Are they less interesting than the Hogan family of wrestling fame? Who knows. But one thing is for sure: no matter how wealthy (and strong) the Hulk might be, he has a whole lot less power than the invisible rich in the grand scheme of things. And our continued focus on wealth coming from hard work, talent, and being on a reality show masks the reality of where wealth mostly comes from in America.

January 19, 2008

Class and Race

By Janis Prince Inniss

America has a long, painful history with race relations, but has prided itself on being free of class conflicts. Most Americans—regardless of their actual income—consider themselves middle class. In what is considered the land of opportunity, most people believe that if you work hard, regardless of your beginnings, you can become wealthy. 

While there is some discussion in the public arena about race (and to some extent ethnicity), there is less about class. During campaign speeches we hear charges that an opponent is for the rich, at the expense of the common folks. Recently, I learned that U.S. presidential candidate Senator Hillary Rodham Clinton addressed the income gap. On November 19, 2007, in a speech entitled “Economy: Policy Address on America’s Economic Challenges,” she made the following comments:

(T)he gap between the rich and everybody else has only gotten broader. 

In 2005, the last year I could find the numbers for, all income gains went to the top 10 percent of households, while the bottom 90 percent saw their incomes decline. That is not the America that I grew up in; that is not the country that I believe is holding out the promise of prosperity for people willing to work hard and take responsibility. 

The wealthiest 1 percent of Americans held 22 percent of America's income. That's an astonishing figure, and it is the highest level of income inequality since the beginning of the Great Depression in 1929.

Indeed, the income gap exists and has widened. But what else can we learn about class by digging a little deeper? It is well documented that gender and race intersect with class and that these factors determine our relationships to power and privilege. 

This post examines some of the relationships between class and race— by looking at some of the differences in income, wealth, education and occupation by race. I am looking at these two socio-demographic factors for the sake of simplicity, but bearing in mind that the relationships between gender, class and race are highly interconnected. 

Income refers to wages and salaries for work we do, as well as to money we make on our investments. Income in the U.S. has increased significantly over the past decades for all sectors of workers (not only those in white collar, professional, and managerial occupations but also blue collar workers). At the same time, however, the divide between the top 5% of wage earners and pretty much everyone else has increased enormously.

A recently released report by the Pew Charitable Trusts entitled “Economic Mobility of Black and White Families” indicates that median family incomes have increased since the 1960s, but that this is less true for black families than for white families. (These are the only two groups the report addresses due to limitations of the data used.)

Researchers performed an intergenerational analysis and looked at how children fare in comparison to their parents in terms of income. They found that the economic benefits that black middle-class parents enjoy are mostly not being matched by their children. In fact, the majority of black children of middle-class parents fall below their parents in income and economic status, while white children exceed their parents’ attainments on those dimensions. Only 31 percent of black children grow up to earn more than their parents, compared to 68 percent of white children in that income range. This decline in income for blacks is found not only among middle-class children but also among upper-middle-class children.

Even worse, almost half (45 percent) of black children of middle-class parents fall to the very bottom of the income distribution, compared to 16 percent of white children. Looking at other income groups, black children fared better in the two lowest income groups, although they always remain well below the gains of white children.

Wealth refers to assets such as real estate, stocks, and bonds. Wealth in the U.S. is concentrated in the hands of very few people: the top 1% of families hold about one third (32.7%) of the nation’s total net worth, and in fact the top 10% of families hold about 70% of the total net worth.

This means that the majority of people – 90% of the population – have less than one third of the nation’s total net worth! 

Data from the U.S. Census Bureau illustrates the relationship between race and economic resources. Black households and Hispanic households held a significantly higher proportion of their net worth in housing than Whites. Black and Hispanic households have a significantly lower proportion in financial assets such as stocks and mutual fund shares compared with White households. 

Despite a historically large gap in high school completion rates between whites and blacks, by 2000 completion rates for whites and Asians were pretty close, with blacks in a not-very-distant third place. Barely half of Hispanics finished high school; this group has the highest dropout rate of any in the U.S.

Although a large number of blacks do attend college today, graduation rates are disappointing for this group and for Hispanics. Asians are the only group approaching 50% of their population having attained a Bachelor’s Degree or higher.

Clearly, education and its relationship to race has an impact on occupation: the types of jobs that people are qualified for and by extension the incomes they can command. 

Where do you fall on each of the four dimensions of class—income, wealth, education and occupation? What role do you think your race plays in your social class standing? 


January 16, 2008

Do We Really Know Better?

By Sally Raskoff

With the holidays and family gatherings over, it is a good time to ponder our behavior in these settings. During the holidays, do you indulge more in specific behaviors than you do in your everyday life? Most of us do.

Did you eat more food than usual? Did you eat more sugar or fat? Did you drink more than you usually do? Can’t remember what you did on New Year’s Eve? Do you have some tension or guilt about what you ate, drank, or did during the holidays?

These over-indulgent behaviors can cause cognitive dissonance during the holiday season and at family gatherings. Cognitive dissonance is a social-psychological term that describes the “tension” we experience when we think (or behave) in ways contrary to our “normal” modes. In other words, our thoughts (and behaviors) are in conflict with each other, and this creates “dissonance,” or tension, that we find very uncomfortable.

A great example of this phenomenon is a typical smoker’s attitude toward their habit: they know smoking is not healthy, yet they continue the behavior and may even value the act of smoking as much as they value their health. Social psychologist Leon Festinger first coined this term after investigating a doomsday cult and its members’ behaviors after their prophesied event didn’t take place. In resolving the conflict between their belief in the cult and the reality that the event had not occurred, some cult members gave up their belief while others rationalized the conflict by reinforcing their beliefs.

The more important something is to us or the more intense the conflict, the greater the cognitive dissonance. For example, when a strongly held belief is in conflict with our political thoughts or our behaviors, the dissonance can be quite strong. Compare a person who runs a red light with a Catholic who uses birth control, a pro-life person who chooses to end a pregnancy, and a lifelong political activist who votes for a candidate from the opposing party. How might their relative dissonance levels compare? The importance of one’s belief may affect dissonance.

Intensity also affects dissonance. For example, a person who abuses alcohol knows intellectually that drinking isn’t good for them, but they may drink anyway. If that person gets liver disease because of the drinking, that increases the conflict, which increases the dissonance. The disease may not make it easier to quit, but it will make them more uncomfortable by increasing the tension between their knowledge and their behavior.

Resolving cognitive dissonance involves alleviating the conflicts by either ignoring one side of the issue or rationalizing ideas or behaviors. The act of rationalizing adds more support to one side of the conflict, thus minimizing or overpowering the other side. 

Many of us know that we will violate our typical eating and drinking patterns during the holidays. That causes some dismay, but we figure we’ll slow down after the holidays and go back to our regular pattern, possibly fasting or cutting back for a while to compensate. The act of rationalizing that behavior reduces the cognitive dissonance we are feeling. But do we really end up fasting or cutting back our calories to compensate for the over-consumption?

I know that over the holidays this year, I’ve been eating more sugary foods (chocolate) than I typically do during the rest of the year. Why do I do this when it makes my body feel unwell? Because they taste good! Because relatives or friends made them or at least bought them. Because my favorite treats are rarely in the house during the rest of the year so why not eat while we have ‘em? Because this is what we do at our house—eat the foods and drink the beverages that people bring over. Are any of these good and rational reasons to eat all this sugar? Not really—but these rationalizations do enable me to pick up another cookie and not feel so guilty about it.

Alternatively, I could give myself more important reasons not to eat these items. For example, I could think about how I don’t feel well when I eat a lot of sugar. That should be rather important to me since I’d rather feel healthy than unhealthy. However, that rationale rarely overwhelms the others presented above because it is more abstract, not as immediate, and not as strong a connection. After all, I may feel bad for some other reason anyway, and then I’d have missed out on these treats. (See how that rationalization process keeps working?)

I have some relatives who were expected the morning of New Year’s Day. They showed up at three in the afternoon because they had gone out on New Year’s Eve and “partied” a bit longer than they had planned. Their cognitive dissonance rested with their decision to stay up and out most of the night and that was in conflict with their plan to come over in the morning. When they finally showed up in the afternoon, they mentioned that they had hangovers and were “so tired” so we shouldn’t chide them about being so late. Their attempt to relieve their dissonance involved focusing on how bad they were feeling so that they could feel less guilty about being late. If we knew they felt terrible because of their own choices, we would give them less grief about it, and their dissonance would dissipate. (We still teased them about it.)

Holiday behaviors are similar to those behaviors we may choose in young adulthood. In many American cultures, it is appropriate for people in their twenties (“college age”) to experiment and try new things as they develop their independence and maturity. That experimentation can generate plenty of cognitive dissonance, since the things we try may be in conflict with the values we were taught.

Cognitive dissonance can happen anytime, but it is more likely to occur during the holidays and at certain points in the life cycle (mid-life crisis, anyone?). What other examples come to your mind? How might they be resolved?

January 11, 2008

Applying Social Science in the Combat Zone

By C.N. Le

One of my core principles as a sociologist is for my academic research to have some kind of relevance to the "real world."

Instead of just conducting research and publishing it in obscure academic journals that few people outside academia read, I want to disseminate my academic knowledge to a wider, more popular audience and to use it to help address real world issues and problems. That is one of the reasons why I started my two blogs in the first place and why I participate in the Everyday Sociology Blog, which demonstrates that sociology has direct relevance to everyday events and people's lives.

More and more social scientists feel the same way. There has been a movement toward making sociology more “public,” and in other social science disciplines scholars are increasingly engaging with real world issues that affect society, both American and international.

But as Time magazine reports, one particular program of "applied social science research" is creating quite a controversy inside and outside of academia -- using social scientists to help the U.S. fight terrorists in Afghanistan and Iraq:

Two years ago, the CIA quietly started recruiting social scientists, advertising in academic journals and offering princely salaries of up to $400,000. But . . . in September, Washington turned a pilot project called Human Terrain Teams into a full-fledged, $40 million program to embed four- or five-person groups of scholars -- including anthropologists, sociologists and social psychologists — with all 26 U.S. combat brigades in Iraq and Afghanistan.

[S]ome preliminary reports are encouraging. From Afghanistan, the 4th brigade (82nd Airborne Division) reported a 60-70% drop in attacks -- and a dramatic spike in capture of [suspected terrorists] after anthropological advisers recommended redirecting outreach from village elders to focus on the local mullahs. One mullah was reportedly so moved after being invited to bless a restored mosque on the nearby U.S. base that he quickly agreed to record an anti-Taliban radio ad. . . .

In the wake of the colossal mishandling of the Iraq occupation, this new partnership manifests the military's renewed appreciation of the importance of culture. 

Montgomery McFate, a Navy anthropologist, [was an] early advocate of what she says is best described as anthropologizing the military, not militarizing anthropology.

Yet many in the profession contend that any collaboration of this nature compromises their field's integrity. Anthropology deployed under such circumstances will become "just another weapon...not a tool for building bridges between peoples," argues Roberto Gonzalez, an anthropologist at San Jose State University and member of the Network of Concerned Anthropologists.

I spent some time thinking about programs like this and trying to decide whether I think they are a good thing or a bad thing for the academic disciplines involved and for American society in general.

On the one hand, I would say that it's beneficial for social scientists to get involved in these efforts because they can fulfill the fundamental professional mission I mentioned above -- using their expertise to address an important social issue and to produce the most benefits for the most people possible.

On the other hand, it would not be beneficial for social scientists to apply their efforts for a "more effective method of killing people," to put it bluntly. That is, depending on how you choose to see it, their knowledge can basically be used for the purpose of perpetuating war and the taking of human lives.

So ultimately, when it comes to the question of whether programs like this are good or bad, I think my answer is that just like life in general, the final answer is not a simple binary of good/bad, yes/no, or moral/immoral. Although this may sound like a cop-out, there are both positive and negative aspects to it, like the rationales I just mentioned.

But if I had to pick one side of the argument over the other to support, at this point, I would agree with Prof. McFate's position that I quoted above, that programs like this are about "anthropologizing the military, not militarizing anthropology."

In other words, if used effectively and properly, the expertise of social scientists can indeed help people who may initially be on different sides of the war -- U.S. troops and Afghan or Iraqi civilians or tribal/religious leaders.

The U.S. would get culturally competent knowledge about how to best relate to the native population in order to effectively communicate and build interpersonal connections with them. The native population could also feel that their needs, issues, and concerns are genuinely being heard, understood, and incorporated into the actions of the U.S. military operating in their neighborhoods.

Of course, as I mentioned above, critics would point out that the assistance of social scientists is ultimately being used to promote war and killing. I respect that opinion, but I choose to see a more nuanced point: that terrorists who target the U.S. military, generally speaking, are likely to have little concern for the native civilian population either.

Therefore, if the terrorists see both of these groups as enemies or at least expendable casualties of war, the native population has a right to join efforts to oppose such terrorists. With that in mind, the U.S. military and the native population can work as allies, not in opposition or suspicion of each other.

Even if that means that some people will inevitably die, I would rather have those people be terrorists who indiscriminately target civilians and distort the doctrines of a just and honorable religion to suit their extremist views.

Sociologists and other social scientists can be useful in helping different groups of people recognize that not everything is cut-and-dried, black-and-white. Instead, every question and every goal has its own subtle and specific points that need to be addressed respectfully, thoughtfully, and competently.

January 08, 2008

Social Selection and Social Causation

By Bradley Wright

Some of the most interesting puzzles for sociologists have to do with differences between groups of people, and two common explanations for social differences are “social selection” and “social causation.” They apply to a remarkably wide range of phenomena, and they are kind of interesting to think about.

Here’s how they work. Suppose we have two types of people, those in social group “A” and those in social group “B”. Furthermore, we observe that the “A” and “B” people are different along some personal characteristic, say “X”. 

How do we interpret this difference? If being in group “A” increases people’s “X”, then we are making a social causation argument. That is, we believe that being in that social group causes people to be different. 

It could be, however, that being high on characteristic “X” makes people join group “A”. If so, we’re making a social selection argument, because people’s existing characteristics cause them to select into the group.

As such, the correlation between group “A” and characteristic “X” can be explained by both social selection and social causation, and what’s really interesting is to try to figure out which one (or whether both) is operating. To do so, you try to figure out which is the most plausible story and find any evidence that you can.
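For readers who like to tinker, here is a minimal simulation sketch in Python (my own hypothetical illustration with made-up numbers, not taken from any actual study). Under the selection mechanism, people who are already high on “X” join group “A”; under the causation mechanism, membership in “A” is random but raises “X”. Either way, the data come out looking the same:

    import random

    def simulate(mechanism, n=100000):
        """Return the mean of characteristic X inside and outside group A."""
        in_a_scores, out_a_scores = [], []
        for _ in range(n):
            x = random.gauss(0, 1)            # everyone starts with a baseline X
            if mechanism == "selection":
                in_a = x > 0.5                # high-X people select into group A
            else:
                in_a = random.random() < 0.3  # membership is random...
                if in_a:
                    x += 1.0                  # ...but being in A raises X
            (in_a_scores if in_a else out_a_scores).append(x)
        return (sum(in_a_scores) / len(in_a_scores),
                sum(out_a_scores) / len(out_a_scores))

    for mechanism in ("selection", "causation"):
        inside, outside = simulate(mechanism)
        print(f"{mechanism:>9}: mean X in A = {inside:.2f}, outside A = {outside:.2f}")

Both runs report a noticeably higher average “X” inside group “A,” which is exactly the point: the observed correlation alone cannot tell you which mechanism produced it. You need the plausibility arguments and outside evidence described above.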

Here are some examples. 

I know several people who graduated from Harvard, and they are some of the smartest people that I know. Freaking genius comes to mind. (Just for fun, I should say that one of them is a dunce just to keep my Harvard friends wondering if I mean them, but it wouldn’t be true. They’re all bright.) Why are they so smart? 

It could be that Harvard provides a superior education. You have brilliant professors, world class facilities, and a rich legacy. If sitting in Harvard classrooms makes you really smart, this would be a social causation argument. Or, it could be that Harvard attracts the smartest students. The very brightest of high school students get their choice of schools to attend, and if they favor Harvard (and other top schools) then Harvard will produce very smart graduates, even if a Harvard education isn’t better than other schools.

Hm-m-m-m, how can we figure out which it is? If I were to guess, I would probably lean more to social selection. We know for a fact that Harvard attracts the best students. Ninety-five percent come from the top 5% of their high school classes, and they have SAT scores well north of 2000. Wow! (Digression. The best line I’ve ever heard about SAT scores—Jennifer Lopez was asked what she got on her SATs, and she replied “nail polish.”) 

Harvard professors are brilliant, but they are selected for their ability as researchers, not teachers. Just because a professor creates new theories to revolutionize their field doesn’t mean they are particularly good at explaining the basics to students. They are also rewarded with salary and promotion for being star researchers, not for high quality teaching. I’m not saying they are bad teachers, but I can’t see how they would be much better teachers than those found in other four-year universities. If there is social causation, I would imagine it has to do with being around other smart students. Studying with, competing against, and talking to smart people will make you smarter, but this would be true with a group of people standing in a farmer’s field; it’s not a Harvard thing per se.

Here’s another social puzzle. There is a significant correlation between mental illness and lower social class; poor people are more likely to exhibit mental illness than rich people, including depression. How do we explain this? 

A social causation argument would say that being poor increases mental illness. Not having enough money creates stress in people’s lives, and this stress can result in depression and other disorders. The poor are also less able to afford medication, counseling, and other ways of dealing with mental illness such that their conditions are more likely to get worse than those of wealthy people. 

A social selection argument would portray the mentally ill as less able to get ahead in society. If you’re depressed, it may be hard to go to work regularly or to put a lot of energy into your career. People who have observable symptoms of mental illness might be less likely to interview well for a job or be promoted once they have a job. (An interesting question: Is this a form of discrimination?) 

Which do I think it is? Well, I’m cheating here because I participated in a study that looked at this very question, and we found that both social selection and social causation mechanisms were in effect. 

I’ve discussed two examples, but there are lots of other social phenomena that lend themselves to both social causation and selection arguments. For example: 

  • Religious people live longer than non-religious people. Does religion change people’s life expectancy, or does it attract people who would live long already?
  • Criminals have more criminal friends than do non-criminals. Does having criminal friends make you into a criminal, or do criminals attract other criminals as friends?
  • Capitalist countries are wealthier than those with other types of economies. Does capitalism make a country wealthy, or do wealthy countries gravitate toward capitalism?

Interesting stuff, no? If you keep these two mechanisms in mind, you’ll be surprised by how many things you can explain by using them.

January 05, 2008

Doggie's BFF

By Sally Raskoff

Do you think that dogs are a human’s best friend? Are they really our best friends forever? Many feel that their pets are members of the family, but how important are they compared to your other family members?

Today, I saw a family walking into the local mall and something about them caught my eye. The family consisted of four adults and one child. The child was pushing a stroller and she appeared to be about 8-10 years old. They caught my attention since the young girl (and not one of the four adults) was pushing the stroller.

She leaned down to tend to something in the stroller, and I realized that she was pushing a dog stroller! Her small puppy (or full-size toy dog) was attempting to climb out of the carrier, so she tucked it back in. As they approached the door of the mall, she had to push the dog back into the stroller two or three more times.

My first thought was that this was a great example of the reinforcement of gender norms, as the girl was practicing mothering skills, not just with a doll but with a living creature. However, I realized that this was much more than a gender socialization process when I noticed that the stroller was designed specifically for dogs.

Recently, we have seen small dogs traveling in women’s purses or in bags that appear to be purses yet are small dog carriers. The first time we notice these purse-dogs, it is somewhat surprising, but the more we see this, the more “normal” it becomes. 

The stroller-dogs are similar to the purse-dogs in that they are being carried about, not walking under their own power. They are not walking, sniffing, and socializing as typical dogs do.


I’ve always thought it odd to see a purse-dog. I wonder how it “does its business” or relieves itself. But dogs in strollers? This seems to be a very different situation. It seemed so odd to me since dogs need walking, not riding!

Why is it more startling to see a stroller-dog than a purse-dog?

  • While the dog is being carried about much like the purse-dogs, the stroller is more visible and it is obvious that people are pushing around a dog. 
  • When we see a stroller, we assume that it is carrying a human child or a doll; it is apparent that its occupant is equated with either a human or a doll. 
  • Dolls are proxy humans; girls are encouraged to play with them to prepare for their future role as a mother.

Thus these stroller dogs are treated not only as family members, but also as babies or children and equivalent to humans.

When I mentioned this to my friends, one mentioned that she had seen a woman shopping in the mall with her golden retriever. Adult retrievers are far too big to be carried in purses and are probably too big for most strollers, but they can participate in the take-my-dog-anywhere phenomenon too. If small dogs can go to the mall, why not larger dogs? 

Cities have the power to regulate where animals may go. Guide dogs are allowed into public places and elsewhere to assist their humans in navigating spaces and situations. But most cities have regulations requiring owners of other dogs to ensure that (1) their dogs are licensed, (2) their dogs are leashed when off the owner’s private property, and (3) any feces are removed when walking their dogs in public spaces. Many businesses have policies that prohibit the presence of all dogs except guide dogs, and health codes prohibit most businesses that serve food from allowing animals into their establishments.

Because so many dog owners have opted to bring their “best friend” with them everywhere they go, there is a robust market for books and websites that offer suggestions about pet-friendly businesses, hotels, and cities. 

One website lists malls across the country that are pet friendly. Some shopping center websites have a “pet friendly” icon on their page. The Prime Outlets at Pismo Beach, California, is so pet friendly that it offers doggie treats for its four-legged visitors.

So, how do we explain this relatively recent phenomenon of dogs becoming not just our best friend, but our constant companion (when they are allowed to be)?

Norms, or societal guidelines for expected behavior, are often reflected in organizational policies and governmental laws. Norms, like policies and laws, do change over time, yet they do not often change quickly. What is the source of such change? Typically, when enough people’s behaviors deviate from those norms over time, those behaviors become more accepted than the policies, laws, or norms that had restricted them.

Animals have always been important to our culture. However, in recent years the depth of integration of animals into our families and daily lives has increased tremendously. Animals have always been valued for their labor (farm animals), for amusement or education (circuses, zoos), and for companionship or for helping humans learn caring behaviors or parenting roles (domestic pets). These days, though, many pets are considered equal members of the family.

Why are we treating dogs as if they are children?

Symbolic interactionist theory and ethnographic research suggest that domesticated pets offer humans tremendously rewarding social and emotional contacts. Studies show that the elderly and those with disabilities are happier and healthier if they have regular interactions with a pet. Dogs also connect people to other people (much as children do) because they get their owners out into the social world. 

Canine companionship can also serve as a substitute for human contact. As societal and technological changes make us more isolated from each other, it seems logical that we increasingly incorporate pets, especially dogs, into our families and social lives.

Of course, it can cost money to bring your dog everywhere you go. Dog strollers sell for approximately $50 to over $200, and dog carriers or “purses” are not inexpensive either. While middle and upper class families can afford such luxury canine carrying items, those with lower earnings probably can’t.

Homeless people with dog companions are often denied shelter unless they give up their dogs—thus most refuse such shelter. Since these dogs are often the only positive emotional connection for their human companions, they will not give them up to get a roof over their heads. Their ties to their dogs are equal to, or even more powerful than, other people’s ties to theirs. Yet homeless people and their dogs are not welcomed inside shopping centers the way a well-off dog-purse-carrying owner might be.

Speaking of economics, the recent growing acceptance of dogs as family members has been a boon to the economy. The market for products created for dogs and other pets has steadily increased for many years. The American Pet Products Manufacturers Association estimates that $40.8 billion was spent on pet products in the U.S. in 2007. Their trend report cites not only an increasing focus on dog products but also notes that companies known for human products (e.g., Paul Mitchell, Origins, Old Navy) are adding dog products.

Historically, we have elevated humans above other species, and the differences between humans and animals have helped to define what humanity is. These days, those distinctions seem to be blurring—not just in the social world but in the scientific community as well.

Scientific studies, such as the gene mapping projects, have found fewer differences between humans and other species than we had previously thought. The paradigms of scientific thought are also changing: the Enlightenment perspective, that humans have the ability to understand and control the natural world, has been challenged by alternative theories. Some of these theories acknowledge that humans neither understand the laws of nature nor are in control of it; rather, they are part of a delicate ecosystem.

Thinking about this phenomenon through a sociological lens raises many questions: Should we expect increasing acceptance of dogs and other pets into our social spheres, whether public or private? Will we admit other species into our realm? Will we continue to adopt technologies and make them part of our lives without thought to how those technologies change our lives? Or will we reassert human primacy over other beings? Will we create and enforce laws and policies to keep non-human species on leashes and out of public spaces? Will we counter our blind adoption of technologies by encouraging human social interaction in a variety of situations?

No Dogs Photo: courtesy PDPhoto.org; Dog Stroller photo: PawsAboard.com

January 02, 2008

The Celebrity Effect

By Karen Sternheimer

If you follow politics even a little bit, by now you have heard that Oprah Winfrey is actively campaigning for Democratic presidential hopeful Barack Obama. Political commentators are busy asking each other what effect her support will have on his candidacy. Will she put him over the top to become the Democratic nominee? 

Critics tend to overestimate the influence celebrities actually have on the public, fearing that we are blinded by fame and idolize Hollywood stars. Some think that celebrities make us believe it is cool to have kids without being married or to be a single parent (as former Vice President Dan Quayle accused the fictional television character Murphy Brown of doing in 1992).

Oprah Winfrey is certainly not the first famous person to openly campaign for a candidate, but she is arguably the most powerful. No one can deny that she’s a one-woman empire of influence. But it is one thing to read a book because she loved it or buy a set of cotton jersey sheets because they were featured on her Favorite Things show once upon a time (I’m guilty of this one myself…they are really soft!). It’s another to cast a vote based on a recommendation. After all, we can read lots of books and own several sets of sheets, but we get just one vote.

A lot of people might like to hear about celebrities’ private lives, about their families and relationships, down to the most mundane minutiae, true or not. We may want to have their expensive stuff, to try to look as good as some of them do, or to have as attractive a mate, but much of the news coverage of celebrities is more of a handbook for what not to do (like drive drunk, shave your head, lose custody of your kids, and so forth).

Celebrities are a lot like the popular kids in school—they tend to have the best clothes, new cars, and lots of friends (as long as they are popular). Everyone else knows who they are, but we might not really like them. In fact, we may enjoy finding out that they aren’t that perfect after all. In a large, heterogeneous society such as our own, we tend to have fewer and fewer social networks in common with others—except for celebrities.

German sociologist Ferdinand Tönnies had a name for this condition: Gesellschaft. Celebrities can become a form of social glue that helps us bond through our admiration and (frequently) condemnation of high-profile people and reaffirm a sense of shared morality.

In contrast, people who live in smaller communities have many of the same social networks and know lots of the same people. Tönnies called this kind of condition Gemeinschaft (which literally translates to mean “community”). This is not to say people in small towns have no interest in celebrities or celebrity gossip; in fact, condemning the “Hollywood elite” can serve to reinforce boundaries and shared values of a particular group.

But even though we might follow the stories of celebrities’ private lives (which are often more entertaining than their movies), people will not necessarily follow them to the polls or make other major life choices based on what the rich and famous are doing.

In fact, in contemporary American politics, celebrity status can signify being out of touch with mainstream voters. For one, both celebrities and the popular culture their industry manufactures give the impression that Hollywood is a morally questionable place. Sometimes celebrity endorsements have the effect of making a candidate seem to be one of “them” rather than one of “us.” Some candidates’ appeal comes from presenting themselves as populists or regular people. Ironically, the very nature of celebrity implies that someone is not ordinary, so too much celebrity support can backfire.

Also, celebrities are often rich, unlike most of the general public, whose median household income is about $48,000 a year (about a week’s pay for a modestly successful star). Their interests are not necessarily ours, even if we may agree on a few things. I happen to live in the same zip code as many celebrities here in Los Angeles, and while there may be some overlap in our opinions, economically speaking we are in different worlds.

Of course, Oprah is not just your run-of-the-mill celebrity. She encourages her viewers to find their inner selves, to “use your life,” as she often says, for a higher purpose. She is more than a pitchwoman (as some celebrities basically are). Since she appears to chat with the public on a daily basis and is open about her flaws and struggles, she seems more like “one of us.” Yet she is also an inspirational leader, and she may transcend the “Hollywood elite” label so many celebrities cannot. (It also probably helps that her show tapes in Chicago, not Los Angeles.)

With primary caucusing and voting just days away, we will soon see whether her support translates into votes. If nothing else, she and fellow celebrities are good at drawing our attention. Senator Obama will certainly get a look from many people who might not have paid attention to him otherwise.
