8 posts from September 2009

September 28, 2009

Knowledge Matters

By Janis Prince Inniss

On September 8, 2009, President Obama gave a back-to-school speech described by the White House as inspirational and pro-education to elementary, middle, and high school children in the U.S. Even before the speech there were reports of parents who thought this was inappropriate. (President Obama is not the first U.S. president to hold such an address; President George H. W. Bush did so in 1991 and President Ronald Reagan talked politics with students in 1988.)

Some parents worried that the president’s speech would be political and that it was a way to reach parents through their children. Other parents complained that they were not notified about the event beforehand and asked to give consent via a permission slip, as they are with activities such as watching R-rated movies. In response to parental complaints, some school districts allowed parents to opt out and have their children take part in a different activity, while other upset parents simply kept their children at home.

Some of the furor about President Obama’s speech may have been related to an assignment created by the Education Department asking students to write essays about how they could help the president. When some critics of the president protested that the assignment was too partisan, the lesson plan was revised to ask students to write about how they can achieve their short- and long-term goals.

Recently, former President Carter said that “an overwhelming portion of the intensely demonstrated animosity toward President Barack Obama is based on the fact that he is a black man.” (The former president was responding to Rep. Wilson’s outburst during the president’s speech to Congress.) Whether President Carter is correct, and whether the president’s race is related to the fears parents expressed about him addressing their children, is a debate I’ll leave for another time.

The furor over the President’s address does raise the question of what constitutes information that should be central to our education. What should you be learning? Have you ever thought about who decides what you learn? Clearly your teacher does. But what influences your teacher to make the choices he or she does in selecting class materials?

There are the so-called three Rs: reading, writing, and arithmetic. These are considered the basics, seen as fundamental to other knowledge acquisition. Do you agree that these are fundamental? What about all the other subjects taught in school? I took a music appreciation class as a college freshman. The class was designed to give me an appreciation of classical music. Who decides what classical music is? Turns out that we only studied classical European music (two words that tend to go together—classical and European), but should music from other areas of the world have been included in such a class?

What about your English literature classes? In my survey American literature course, we read from an anthology that is more than 2,500 pages long. (Yes, I still have the book and just checked!) Given the racial/ethnic make-up of America, should such a course include non-white authors? (Only about four African American authors are included in the edition we used.) Who makes decisions about what is considered classical literature and decides what books you should read?

A 16-year-old International Baccalaureate (IB) student in Florida objected to passages in the highly regarded novel The Wind-Up Bird Chronicle, a book on the reading list of the prestigious IB English curriculum. The student objected to sexually graphic content in the Haruki Murakami novel and was allowed to select an alternate book. Have you or your parents ever objected to any material that you were being taught? On what grounds? How likely do you think it is that you would be able to change the information you are responsible for knowing in any of your classes?

While most of us now take education for granted, schooling is a relatively recent phenomenon—its modern form was developed in the early nineteenth century. This change was related to industrialization, which created a need for an educated workforce. In the modern era, job opportunities have increased with education; today, higher education is associated with increased income and a lower likelihood of unemployment (see the chart in a previous post). Is this how what is taught gets decided—based on the kinds of information that are likely to keep us employed?

If you think about some of the more recent changes to curricula, it seems that what is “worth knowing” changes. For example, after the civil rights movement and student activism, fundamental changes were made that resulted in the inclusion of more information on people of color and the creation of ethnic studies departments on several university campuses. As you may recall from an earlier posting, standpoint theory suggests that even what we know depends on our affiliations. And is all knowledge “equal”? Do you think there is an objective way to decide what knowledge matters?

September 24, 2009

Biography + History = Opportunity

By Karen Sternheimer

I recently read Malcolm Gladwell’s bestselling book Outliers: The Story of Success, and highly recommend it to anyone interested in a sociological perspective of what factors enable some people to achieve more than others. Although not a sociologist, Gladwell is a journalist with a knack for explaining sociological and social psychological concepts in a clear and interesting manner.

While the American ethos of success suggests that it is the result of talent and hard work, Gladwell examines factors that sociologists refer to as social structure—things beyond our individual control—to understand what else successful people have helping them on their journey. Let’s be clear: skills and hard work are important, but so is timing. And one of the most important things to time well is something none of us can choose—when we are born, and to whom we are born.

Sociologist C. Wright Mills describes the importance of timing in his classic 1959 book, The Sociological Imagination, where he notes that all of our life chances are shaped by the intersections of our own personal biographies and history. Gladwell provides numerous examples of this, finding that the so-called Robber Barons who became America’s captains of industry in the late 1800s were mostly born within a few years of each other. People like Andrew Carnegie and John D. Rockefeller were born just a few years apart in the 1830s, as were many other business titans who amassed great wealth. Was there something particularly profitable in the water back then? Were there lessons taught in school at that time that would have led to their incredible achievements?

As Gladwell points out, their timing couldn’t have been better. Yes, they likely worked hard and had brilliant business minds. But they also came of age just as the industrial revolution was exploding in America. They were able to get in on the ground floor of advanced capitalism.

Of course, people have gotten very rich before and after this period, and Gladwell describes how being born in the mid-1950s was particularly fortuitous for those interested in computer programming (think Bill Gates and Steve Jobs, both born in 1955). It also helped to be geographically near the gigantic mainframe computers that were the predecessors of the machine on which you’re reading this post. Back in the 1960s, when Gates and Jobs were coming of age, a computer took up a whole room and was not something most youngsters would have had a chance to see, let alone work on. But because of their proximity to actual computers, both Gates and Jobs had a leg up on others their age and had the chance to spend hours and hours (10,000 of them, in Gladwell’s estimation) learning about programming.

We can apply this model to more than just financial success. Think about what opportunities your own biography and history have afforded you. How has when, where, and to whom you were born shaped your life today?

I tried to think about the intersection of my biography and history to imagine how timing might have led me to write this post or to read Gladwell’s book in the first place.

As a member of “Generation X,” I was born following the massive baby boom. As you can see in the graph on the left, after a peak in the mid-1950s, the number of births sharply declined. How might this have affected me? As Gladwell describes, children born after booms, as I was, have the benefit of smaller class sizes. An unprecedented number of schools were built for Baby Boomers in the years before I was born. When my cohort was ready to go to school, there were newly built buildings waiting for us, especially for people like me who lived in well-funded suburbs. (My hometown boasts that residents have never rejected a school levy in its entire history.)

When I was in elementary school in the mid 1970s, there were so few students that many classes were combined: first and second graders had the same teacher, as did third and fourth graders. Looking back, this provided me with some unusual opportunities.

For one, a child in my district often had the same teacher for two years in a row. That teacher had the opportunity to know us better, to help us develop our strengths, and to provide lessons that could target any weaknesses. Teachers would also recommend us for special enrichment opportunities based on our talents; there was a “Special Talents Program” we called STP, in which a few kids would spend time with the art teacher, in the music room, or reading additional books if we seemed particularly interested.

Another advantage: because children in the same classroom would necessarily have different skill levels, and might be nearly two years apart in age (a big difference for six- and almost-eight-year-olds), the teachers would work with us in small groups, and sometimes one-on-one. Having small classes helped with that effort too.

We would be placed in small groups, sometimes based on reading level, sometimes based on more random factors (like where we happened to be sitting that day), and learned lessons with far more individualized attention. We were also given “contracts” by our teacher, who would meet with each student individually and assign lessons from workbooks based on our own level of achievement in reading, math, or another subject. We could then work individually and return to show our work to the teacher, who would sign off on the “contract” once we had completed the assignment. We would also get individual help if we needed it, from the teacher or occasionally from a student teacher if our classroom had one at the time.

Because I was a bit precocious as a child, this school structure really enabled me to thrive. Rather than get bored by a lesson designed to reach children at all levels, I could work as quickly as I wanted and sometimes discover topics that I wanted to learn more about and research separately on my own. I also had college-educated parents who had taught me to read well before I entered school, frequently bought me books, and could answer most of my questions.

Couple these factors with the lingering 1960s ethos, which promoted experimental methods of learning, and you have a better understanding of how the accident of my time and place of birth created an additional advantage. By the 1980s, when I went to middle school and then high school, this individualized learning model had disappeared in favor of more traditionally structured classrooms as the political backdrop shifted. There was one centralized lesson, one assignment for the whole class, and less one-on-one time with teachers. I got bored a lot more often.

So that’s the history portion of how my opportunities might have been shaped. Let’s bring biography back in.

You might have read about my elementary school days and thought, what’s to stop a kid from doing as little as possible? And what about children who aren’t willing or able to work independently?

I’m guessing there are many children who would not thrive in this independent environment that was so well-suited for me. Having the teacher meet with another group or another student one-on-one presented many opportunities for chit-chat and goofing off (I did my fair share of that too). So individual personality, work ethic, and talents do matter. They’re just not the only things that matter. How has your biography intersected with history to produce opportunities (or barriers) for you?

September 21, 2009

The Wages of War: The Oldest Profession

By Sally Raskoff

The other day I heard from someone currently on active duty in the Army. During a discussion of the difficulties of overseas deployments in wartime, he mentioned that female soldiers often make extra money “servicing” their fellow soldiers (i.e., providing sexual services, if you didn’t catch the innuendo). He said that this was common knowledge, a product of long deployments and the fact that there are so few available women, especially American women.

Are you shocked? Surprised?

Historically, prostitutes are usually available wherever a military congregates, whether they are provided by the local economy or by the sponsoring government itself. Rarely, however, are those selling their bodies also members of the active duty military. (Consider, for example, the “comfort women” of World War II.)

[Photo: a Chinese girl from one of the Japanese Army’s “comfort battalions”]
http://en.wikipedia.org/wiki/File:Chinese_girl_from_one_of_the_Japanese_Army%27s_%27comfort_battalions%27.jpg

Whether or not my acquaintance is right about this phenomenon, the pressures of the situation, and the structure of the situation itself, could easily lead to this kind of prostitution.

After my time on active duty (Air Force), I started taking college classes and wrote most of my sociology class papers analyzing my experiences in the military. This was a few decades ago, when women made up a smaller percentage of the armed forces, but today women are still a minority (less than 15%) of those on active duty.

At that time, it was very clear that a woman, no matter what her job skills, was considered “available” unless she was married. If a woman stayed single, the chances that she would develop a “reputation” were quite high. It didn’t matter whether she was exploring her sexuality with one or many partners or was celibate; many colleagues saw all single women as fair game. A civilian parallel is the phenomenon in which early-developing girls are sometimes called “sluts” whether they are sexually active or not. (Leora Tanenbaum wrote a great book on this phenomenon called Slut!: Growing Up Female with a Bad Reputation.)

When I was in the military, the pressure to get married was intense, especially for those concerned about maintaining a good or professional reputation. The service also encouraged marriage, on the theory that it made for more stable soldiers, even when those soldiers were married to each other. Married soldiers received higher pay, better housing, and more services and family support.

Anyway, back to what my friend was saying about female soldiers prostituting themselves.

When does prostitution occur? The research literature points out a number of patterns. Buyers buy sex when they can’t get it elsewhere or don’t want the encumbrances of getting it elsewhere. Sellers sell sex when they have no other options.

Alienation exists in each case, as each person is exchanging their body for cash; the body becomes an object and a commodity rather than remaining a personal subject of experience and emotion.

There is more prostitution in societies with greater gender inequality, since the more powerful gender can exploit the less powerful in a variety of ways. Where there is more gender equality, members of different gender groups are considered more equal in value, and thus there is less exploitation based on perceived differences in worth. Social interactions are also much less likely to enter the commercial arena, where pricing and negotiation assume power differences and competition.

What is it like to be deployed, especially in today’s wars?

Isolating. Alienating. Fearful. One doesn’t know if one is going home intact or at all.

Many members of the military struggle with depression and post-traumatic stress disorder, and the suicide rate is higher in the military than in the general population. It often takes outside organizations, such as Suicide.org and Gold Star Families Speak Out, to help soldiers deal with the conditions of their service, but those organizations are not meant for those currently deployed.

When you consider these unique stresses, is it surprising or shocking that an underground economy of sexual services emerges on overseas deployments, especially when outside purveyors of sexual services are not available in the areas to which these troops are deployed?

It’s not surprising at all when you consider that: (1) single female soldiers face intense pressure to be socially and sexually active; (2) women are a minority compared to men, so there are not equal opportunities for heterosexual social interaction; (3) there is much inequality between women and men, as men are more likely than women to hold positions higher in the authority structure; (4) people are isolated from home, family, and even those in the host country; and (5) they fear for their lives and may have no hope for their futures.

[Photo: the 755th at Bagram]
http://commons.wikimedia.org/wiki/File:755th_at_Bagram.jpg

An additional factor has to do with the age of these soldiers; most are in their twenties, some older, some younger. For these generations, oral sex is not as taboo as it was for previous ones. In fact, to many in these generations (and some others), oral sex isn’t really considered sex; rather, it is something friends may do in the absence of a steady partner. And yet oral sex is considered a form of sex, notably by the Centers for Disease Control, and one can spread sexually transmitted diseases through it.

If indeed some soldiers are making extra money by doing sexual favors for their colleagues, it can be explained by the structural demands of the overseas deployments.

If our military leaders wanted to decrease the chances of this happening, what could they do to ease the strain of these conditions and ensure an effective egalitarian military force?

September 17, 2009

What Does War Cost?

By Janis Prince Inniss

(Spoiler alert: In this post, I describe important plot lines of the movie In the Valley of Elah as well as some grisly details of a homicide.)

Recently, I saw the film In the Valley of Elah. Starring Tommy Lee Jones, the film tells the story of a retired Vietnam veteran named Hank (Jones) who is informed that his son Mike, having just returned from a deployment to Iraq, is absent without leave (AWOL). Finding this out of character for his son, Hank quickly heads from his home in Tennessee to his son’s base in New Mexico to investigate. Because he was once a military police officer, Hank spots a number of holes in the stories he is told about his son, who has apparently vanished.

When police find Mike’s body dismembered and burned in a field, there is the usual silence and stonewalling from various military officials. How did this happen? Who would have done such a thing? Was Mike a drug dealer? What might Mike have been involved with that might explain his death? What happened to him in Iraq? Hank finds Mike’s cell phone and, with the help of a technician, is able to view some of the videos on it. The videos are jarring, to say the least, but they provide a lens through which Hank—and moviegoers—see some of the grim reality that Mike experienced before his death.

Hank realizes that Mike’s buddies in his military unit are lying about their last night together, but Hank knows that this does not suggest their guilt because soldiers do not kill each other. Or do they? The truth is that Mike’s comrades murdered him and then lied about that night, the night after they all returned from Iraq.

Initially, I was surprised to learn that this film was based on the real-life murder of Army Specialist Richard T. Davis in 2003. As in the movie, Davis was reported AWOL to his father, a retired veteran, who then investigated his son’s disappearance. One night after returning from Iraq, Davis and some other soldiers went out; they drank and fought, and one of the soldiers, Alberto Martinez, stabbed Davis at least 33 times. The soldiers then bought lighter fluid, returned to the crime scene, lit Davis’ body on fire, and buried it in the woods.


How is this possible? Why would soldiers “on the same team” commit such a heinous act against one of their own? Martinez’s defense lawyer says the soldier was delusional at the time of the killing. Another soldier is said to have been experiencing post-traumatic stress disorder (PTSD) during this time. Recent research indicates that 37% of veterans of Iraq and Afghanistan who have sought care at VA medical centers were diagnosed with mental health problems—primarily PTSD and depression—and another 10% have alcohol and drug use disorders. A RAND study of veterans found that 14 percent screened positive for PTSD and the same number screened positive for depression, rates higher than those found in the general U.S. population (3.5% and 7.0%, respectively). Further, RAND extrapolates that if these rates hold for all military personnel deployed to Iraq and Afghanistan, about 300,000 people are experiencing depression or PTSD.
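As a rough check on that extrapolation, the arithmetic can be sketched in a few lines. Note that the deployment total (roughly 1.64 million, the figure RAND used) and the assumption that the PTSD and depression groups overlap substantially are my additions, not claims from the study as quoted here:

```python
# Back-of-the-envelope version of the RAND extrapolation described above.
# Assumed inputs: ~1.64 million service members deployed, and enough overlap
# between the PTSD and depression groups that the share with either
# condition is about 18.5%, rather than the full 14% + 14% = 28%.
deployed = 1_640_000
rate_ptsd = 0.14
rate_depression = 0.14
rate_either = 0.185  # assumed overlap-adjusted rate

# If the two groups did not overlap at all, this would be the ceiling:
upper_bound = deployed * (rate_ptsd + rate_depression)
# With substantial overlap, the estimate lands near the cited 300,000:
estimate = deployed * rate_either

print(f"upper bound: {upper_bound:,.0f}")
print(f"estimate:    {estimate:,.0f}")
```

The point of the sketch is simply that the 300,000 figure follows from applying the screening rates to the whole deployed population, with some allowance for people who screen positive for both conditions.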

As alarming are the numbers of suicides and suicide attempts by U.S. soldiers, which have skyrocketed since the start of the Iraq war: more than 2,100 attempts in 2007 compared to 350 in 2002, the year before the war began. The New York Daily News headline “More soldiers committed suicide in January than killed by Al Qaeda” highlights the fact that the suicide rate in the Army is the highest it has been in 30 years—143 suicides in 2008. The other leading concerns discussed—PTSD, depression, and drug use—may all be related to these suicides, as returning soldiers find their relationships unraveling while they struggle with untreated, perhaps undiagnosed, mental health problems. They turn to drugs and alcohol for comfort, and some finally commit suicide. (Thirty-one-year-old veteran Josh Barber’s suicide illustrates some of the complex relationships among these factors.)

You may recall that I was initially surprised to learn that In the Valley of Elah was based on a real homicide. Although I found the answer to the “whodunit” in the movie shocking and continue to be distressed by the crime, as a mental health professional and sociologist I have often worried and wondered about the impact of war on soldiers. (This is distinct from issues related to the impact of war on civilians in war-torn countries.) How do soldiers, trained to destroy others and required to act on that training, reintegrate into a peaceful society? What do we owe veterans as they grapple with these experiences? Are these questions related to ideology? In other words, are those “for” either war more or less concerned with these questions than those “against” them?

Surely you’ve heard various dollar figure costs of the war in Iraq and the one in Afghanistan. But what about the psychological toll experienced by soldiers? How do we measure those costs? What other psychological and sociological costs do you think wars create?

September 14, 2009

Suburbanizing Rural America

By Karen Sternheimer

When I was growing up, I loved going to an amusement park about 20 miles away from my home during the summer. To get there, we had to drive for what seemed like forever on a two-lane highway through what I thought was the middle of nowhere. I lived in a suburban area not too far from where Leave It to Beaver’s mythic home was supposed to be located. The rural landscape felt foreign, someplace to pass through on our way to someplace else.

Years later, my mother moved into a new development a few miles past the amusement park. It was a big change from where I grew up, but it had a retreat-like atmosphere. Similar developments sprang up in the area with names reminiscent of grand estates: Barrington, Hawthorn, and Hidden Valley, to name a few. Off two-lane highways, each development features homes that are nearly identical in size, design, and color and includes amenities like private lakes, clubhouses, pools, and golf courses. A five-star resort across the street from my mother’s development attracts corporate groups, weddings, and diners to the highly acclaimed restaurants on the property. This is a major change for an area once made up of forests, farms, and a far-flung amusement park.

I have watched this area morph into what some call “ruburbia”: a fully suburbanized community whose rural character is now more about style than substance. Such places are more commonly called exurbs, outer-ring communities farther removed from a central city than traditional suburbs. People who move there today will no longer find the small-town lifestyle that likely attracted people in the past, but instead remnants of small-town “charm” with all the amenities of most American suburbs: restaurant chains, fast food, big box stores, and eventually office buildings. In a region whose population has been shrinking over the last several decades, this area has seen a boom, from just over 8,000 residents in 1980 to over 14,000 in 2005. While this growth has expanded the once small town’s tax base—especially by drawing young affluent families—the shift into suburbia has brought with it some of the things residents might have hoped to avoid by moving to ruburbia in the first place.

One of the town’s initial attractions was its lush forests. When my mother bought her home, there were trees as far as the eye could see in her backyard, and realtors sold the property in part based on the serenity and privacy the trees provided. Now just a smattering of trees remains behind her house, and her view is mostly of a house built a few years later. To add insult to injury, its owners often hold loud parties late into the night during the summer. So much for serenity.

Of course, the environmental impact of cutting down trees trumps the inconvenience to the homeowners who once enjoyed looking at them. Deer with no place else to go often dart into traffic. The shade the trees once provided is gone, so people now require greater use of air conditioning in the summer. And while it might seem as though the economic downturn would slow development, developers are still clearing land in hopes of attracting new business to the area. Acres once covered with trees have been cleared to make way for new developments like the one visible in the background of the photo below. For Sale signs, like the one in the foreground of the photo below, offer unoccupied land throughout the area.

[photo]

Despite the signs indicating that several existing homes are for sale, many acres of land have been cleared to expand one housing development. While the housing bubble never affected this area as severely as other communities in the region, high unemployment in the state will likely make it more difficult to sell many homes, whose prices can be significantly higher than the region’s median home price.

[photo]

The acres surrounding big box stores, as pictured below, have also been cleared, leaving just a handful of trees; lawns, which require a great deal of water to maintain, have been planted in their place. At one strip mall, two of the major chains, Circuit City and Linens ‘n Things, have gone out of business. The Linens ‘n Things store is still vacant, and yet a sign just beyond this open space advertises that the land is for sale in case retailers would like to build a new store in the space.

[photos]

[photo]

Before the development frenzy began, the landscape was dotted with small houses and businesses like the one pictured above, a store that advertises feed and live bait to passersby. I recently went inside the shop for the first time. It was empty and the owner was glad to have potential customers to show around.

She told us that the building was constructed in 1899 but may soon be bulldozed to make way for an office building. The owner doesn’t want to sell, but she is afraid that eminent domain laws will force her out. I was surprised to hear that a building like this would even be considered for demolition, primarily because there are so many other areas of cleared land with no buildings, not to mention the empty spaces near the strip malls. Across the street from an outlet mall, the shop now seems to lend some of the rural authenticity that the new chain restaurants and big box stores do not. Even for those bent on new development, I would have assumed that keeping a few of these places around could add to the “small town charm” that could, cynically, be used to attract business.

I’m not bashing my mother’s community—I enjoy visiting and still find it a relaxing place to be (except when caught in traffic on the once rural roads now serving as major suburban arteries). What other social consequences of “ruburbia” can you think of?

Photos courtesy of Linda Sternheimer

September 10, 2009

Cultural Perspectives and Assumptions

By Sally Raskoff

We see the world through our own particular perspective and often assume that others share our views. Public events reveal these differing perspectives, such as the conflict between Professor Henry Louis Gates and the police officer called to Dr. Gates’ home to investigate a reported break-in. Reactions to this incident varied based on one’s experiences and cultural perspectives.

As I read and thought about the differing views of the Gates incident, I was reminded of a summer trip we took in 2006 to the Little Bighorn Battlefield in Montana. This is the place commonly known as the site of Custer’s Last Stand. We were on the trip with some people who were really into discussing the battle and the strategies that the U.S. Cavalry could have used to prevail. I wasn’t comfortable with that particular discussion, for many reasons.

For those who don’t remember their history, this is the notable battle in which Native Americans (specifically, Lakota Sioux, Cheyenne, and Arapaho) clashed with the U.S. Cavalry (and its Native American scouts), and both sides suffered tremendous loss of life. The five companies of the Seventh Cavalry under Custer’s immediate command perished, including Custer himself. My travel companions, like many others, feel that the cavalry underestimated the number of Native Americans or that its strategies were insufficient. Although the native peoples “won” this battle, their losses were also severe, and they ultimately did not win the war.

The battle took place in June of 1876, and my visit was on the 130th anniversary. The terrain there consists of beautiful rolling hills, and it is hard to see who or what is on the other side.

[photo]

Once you get to the battlefield, the most obvious feature is the monument on top of Last Stand Hill. In 1879, three years after the battle, this location officially became a national cemetery controlled by the War Department; in 1881, the twelve-foot-high granite monument was placed on top of Last Stand Hill.

Also at the top of Last Stand Hill is an enclosed area that delineates where Custer and many of his men fell. The fence around this area and the headstones evoke the common cultural markers of a cemetery.

[photo]

If you move back from this hill, the view is quite grand (see photo below) and the granite marker and the cemetery fence are visible from many different angles.

[photo]

The park map (see below) also shows this area as the focus of activity both historically and currently. Most of the tourists in this area were quiet and reverential, realizing that many people died at this location.

clip_image008

White headstones are placed wherever cavalry members fell. When you look over the expanse of the terrain, the white markers are obvious in contrast to the grass.

clip_image010

These two photos (above and below) were taken atop Last Stand Hill, looking south and east. The darker green grasses grow closer to the enclosed area mentioned above while the rest of the area is left to grow naturally.

clip_image012

If you’re at the top of this hill and you turn around to look to the northeast, you will see the image below – an iron sculpture of Indian riders moving through the hills.

clip_image014

What you are seeing is the top of the Indian Memorial.

In 1940, the National Park Service took control of the cemetery and in 1991 it was renamed the Little Bighorn Battlefield National Monument and plans began for a memorial to the Native Americans who perished here. By 1997, the design was selected and construction began.

The memorial is on the other side of the hill from the Custer cemetery but one can still see the granite marker atop the hill. This new memorial consists of a round area with inscriptions on one side framed by the sculpture of the riders on the other side.

clip_image016

I found it to be quite beautiful and reverential, as moving as the cemetery on the other side of the hill. However, my reverie in this space was interrupted by families traipsing through, children running and jumping up and down the steps in front of the sculpture. Tourists behaved appropriately on the other side of the hill, quieting their children and, if they were old enough, explaining why. Few ventured to this side, but many of those who did, at the time of my visit, seemed to forget they were still in a cemetery.

The quotes on the inside portion of the wall reminded people of what had happened there, yet I noticed that many of the tourists did not take the time to read them (see photo below of two markers). They were walking through the space as if it were a rest area.

clip_image018

The design of this memorial was intended to serve many functions but also to serve as a “weeping wound” in the earth.

As you leave Last Stand Hill and take the many trails throughout the rest of the park, there are some other interesting sociological aspects to notice.

Off the main trails, there are more headstones. In addition to the white headstones for the cavalry, there are dark red granite headstones for some of the Native Americans who died. The white headstones have paved paths leading to them; the red granite headstones that are relatively close to a trail also have paths, although those are not paved. The photo below shows two vistas with headstones from one of the trails: red granite on top, white on the bottom.

I noticed immediately that the red headstones were quite difficult to see, whereas the white markers were obvious even from quite a distance. I was curious why the white headstones had paved paths while the red headstones had identifiable yet unpaved ones.

clip_image020

Once back at the Visitor Center, we attended a park ranger talk. I was fascinated to learn that all the rangers who give the talks about the park are themselves Native American, and they tell the story that is not often told: the story from the Native American point of view. Thus we learned more about how this was the culminating battle in which they fought to defend their way of life against these invaders.

They mentioned that the paths to the red granite headstones were left unpaved to honor Native American cultural practices, while the paths to the white headstones were paved in keeping with that culture’s conventions. Thus, when ceremonies were held to honor the dead, people used the paved paths to lay items on the white headstones and took the unpaved yet very clear paths to the red ones. I also learned that the red headstones were placed in 1999.

We also learned some new facts about the battle. The enlisted men in the cavalry at that time were mostly new immigrants, Italian and Irish, who did not necessarily speak English well or at all, so the troops did not always understand the officers’ orders. This added to the problems the army was having in fighting this battle.

On the main building of the Visitor Center, the words of Black Elk are clearly placed (see photo below). I would have loved to ask everyone who passed by how they interpreted those words, but alas, I did not.

clip_image022

Visiting this site with acquaintances who were focused on cavalry strategies challenged me. I used my sociological analytical skills to see as many sides of the situation as possible and to consider the impact of culture on the site’s creation and on how people experience it as tourists and local residents.

I would encourage everyone to visit historical locations and use your sociological imagination to assess what you see. Don’t rely on your first impressions or assumptions. Instead, gather information about what you see so that you can compare the different sources and decide what makes sense to you. Consider alternative viewpoints, especially those that are contrary to your own.

Click here for further information about Little Bighorn.

September 07, 2009

When Is Silence Golden?

By Janis Prince Inniss

More than 40 years ago, Kitty Genovese was fatally stabbed as she arrived home from work at 3 a.m. Although there is now some debate about exactly how many of Ms. Genovese’s apartment neighbors heard her cries for help, it seems clear that some did.

From their Kew Gardens apartments in Queens, New York, neighbors reportedly heard her scream and some even looked out and saw Ms. Genovese struggle with her assailant. Someone who heard her cries called out and the assailant fled, allowing Ms. Genovese to get closer to her building. A few minutes later the attacker returned and murdered Catherine “Kitty” Genovese.

I first learned about Ms. Genovese’s murder in an Introductory Sociology class, and I know that it is still included in many textbooks. Perhaps you have read about this tragedy too. Sociologists and social psychologists reference this story because it highlights some important theoretical ideas about the way we behave: initial reports said there were 38 witnesses who heard or saw some aspect of this crime being committed (a number now highly disputed), but still, the fact that not one single person came to the aid of this woman is baffling.

In May, a 13-year-old Walker Middle School student in Tampa alleged that he was sexually battered by two flag-football teammates while two others held him down in the school locker room. Three of the boys accused are 14 and one is 15. Apparently the victim said nothing while the attack occurred and nothing afterward.

He claims that the same boys had been bullying him for almost two months, during which time he said nothing to his mother, his teachers, or any other adult. Reportedly there were nine witnesses who saw and/or heard various aspects of this assault, yet none of them said anything to authorities until they were called in for questioning.

All of this silence has been confounding for school authorities. For example, School Board member Jack Lamb said, “I'm very surprised that this was taking place and nothing was said. It's kind of amazing." Another board member, April Griffin, initially responded to the silence of the witnesses by suggesting that they should be punished: "If they watched these situations take place….I would like some repercussions for those students.” (Griffin later acknowledged that punishing youth who witness bullying will not make them more likely to report such behaviors.)

A variety of explanations have surfaced to explain the silence of the witnesses. Perhaps the witnesses thought it was a joke and didn’t realize it was as serious as it was, or maybe they assumed that the victim would report behavior this brutal. Even if these young witnesses recognized it as an attack and not a joke, perhaps they thought that “somebody” else would report the assault.


The Bystander Effect - Kitty Genovese

Social psychologists developed the idea of the bystander effect, which might help us understand the behavior of witnesses in both cases. Simply put, the theory predicts that large groups of people are less likely to respond to an emergency than smaller groups are, because in a large group there is diffusion of responsibility: we each presume that someone else will respond, or that someone else will handle the situation better. Sometimes people are even reluctant to help while others are looking on. Although continued experiments to test this theory have met with varied results, the notion is a compelling one and can be used to understand the lack of bystander response in these stories.



What about the idea that reporting criminal or other behavior is snitching? And that snitching is bad? Within the hip-hop community, and accompanied by t-shirts, DVDs (one notably featuring NBA player Carmelo Anthony), and other such accoutrements, the “Stop Snitching” campaign tells people not to cooperate with law enforcement authorities. Busta Rhymes and Cam’ron are just two of the more famous hip-hop artists to embrace this philosophy. Despite some mainstream press coverage of the “Stop Snitching” campaigns, such as Anderson Cooper’s report for “60 Minutes,” a code of silence prevails in many other arenas as well. Can you think of some of those? Is it likely that middle school students in Tampa have adopted the “Stop Snitching” philosophy?

Or consider the case of Lisa Torti, who pulled her co-worker Alexandra Van Horn out of a car after seeing Van Horn’s car hit a pole. Van Horn became a paraplegic and has sued Torti, alleging that her condition is due to Torti’s actions. Do you think people resist getting involved to avoid trouble for themselves?

Have you witnessed behaviors that should be reported? Did you? What influenced your decision either way? What are your theories about how cases like these two occur? And what do we know from the social sciences that could help school boards, for example, to find ways of changing such behaviors? What have you learned from sociology that would help you to provoke a response from bystanders if you were in harm’s way?

September 03, 2009

Thinking Like a Sociologist: Deconstructing Polls

By Karen Sternheimer

A recent episode of The Daily Show satirized both left-wing and right-wing cable news shows for using questionable poll data in their broadcasts. Regardless of your political orientation, this clip is a great introduction to some of the core concepts sociologists use in research methods and statistics: survey construction, sampling, and margin of error. Think about these issues as you watch the video below.

 

1. Survey construction

Did you notice the wording of the questions in the polls? Good survey questions should not suggest how respondents ought to answer, but some of the polls included wording that could certainly bias responses. Note how some of the questions in this clip tend to encourage one response or another, particularly the questions about taxes. Most people would respond negatively to a general rise in taxes. But there are many kinds of taxes (sales and income taxes, to name two), and rates vary based on a variety of factors (such as location, income level, and family size). By contrast, a question about a tax that excludes most potential respondents (raising taxes only on a small percentage of Americans) is more likely to generate favorable results.

A good survey question is worded as neutrally as possible and is specific so that respondents are very clear about the meaning of their response. A good question about taxes would be clear on what specific tax rate would change, by how much, and for whom. The word “tax” is so loaded that another word might even be used in its place, such as fee or levy.

2. Sampling

Let’s say we’ve written good survey questions. Our next step is to administer our survey to an appropriate sample.

What is a good sample? That depends on which group we are trying to learn about. If we want the opinions of sociology majors from across the country, then we certainly wouldn’t want to poll people who pass through the center of a couple of campuses. For one thing, we would get a lot of non-sociology majors; for another, a small number of campuses would not adequately represent the nation’s sociology majors.

Many polls reported on the news imply that they represent views of Americans. That might seem like a daunting task, but it is possible to get a sense of public opinion this way.

You might be thinking that with a population of around 300 million, it is nearly impossible to sample a thousand or so people and claim to know what Americans are thinking. To do so, pollsters need to construct a probability sample, in which every American has a known chance (and, in a simple random sample, an equal chance) of being selected at random.

Hypothetically, you can think of a random sample as similar to putting the name of everyone in the population into a huge hat, which is shaken so that anyone’s name can potentially be pulled.

But there isn’t a hat that big, nor is it practical to write everyone’s name on a slip of paper. Pollsters often rely on methods that are admittedly imperfect. One of the most common is random digit dialing, in which a computer calls random phone numbers in hopes of reaching respondents.
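The names-in-a-hat idea can be sketched in a few lines of Python. This is a toy illustration, not real polling data: the population size, the 31% figure, and the sample size are all assumptions chosen for the example.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical population of 1 million people; assume 31% hold some opinion
# (coded 1) and the rest do not (coded 0).
population = [1] * 310_000 + [0] * 690_000

# "Pulling names from a hat": a simple random sample of 1,000 people,
# each equally likely to be chosen.
sample = random.sample(population, 1000)
sample_share = sum(sample) / len(sample)

print(f"Population share: 31.0%")
print(f"Sample share:     {sample_share:.1%}")
```

Because every "name" has an equal chance of being drawn, the sample share almost always lands close to the population share, typically within a few percentage points for a sample of 1,000. That is what makes it plausible to learn about 300 million people from a thousand respondents.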

Have you ever gotten one of these calls? If you’re like me, you get a lot of calls that interrupt your busy day or evening, and caller ID allows you to ignore the numbers you don’t recognize. This creates sampling challenges: if there are consistent patterns in who is more or less likely to participate, then survey results might be skewed.

So there are no perfect samples, but there are some really imperfect samples, like the ones in the Daily Show clip above. Some internet and television polls make no attempt to create a probability sample, so their responses are limited to the demographic already watching or logging onto their site.

There are also other factors that might make someone more or less likely to participate. For instance, if a television show asks you to text your response, viewers might be more likely to participate if their cell phone plan includes generous texting privileges. Beyond that, some people are simply more motivated to respond than others. So these polls might be fun, but they are not very useful gauges of public opinion. That’s what “results not scientific” means when it flashes on the screen; often, though, shows do not even include that caveat.

3. Margin of Error

Even polls with well-written questions and the best possible samples are approximations of public opinion rather than perfect reflections of what people really think about an issue. While we may never be 100% certain that a probability sample reflects the true opinion of a population, thanks to probability theory we can be 95% confident that the actual opinion lies within a certain range.

Here’s an example. Gallup regularly conducts job approval surveys of political leaders. In an August 2009 survey, they found that 31% of Americans approve of the job Congress is doing. As you can see from the graph below, the approval rating appears to have fallen from previous months.

clip_image002

But every poll has a margin of error, since surveys are approximations rather than perfect reflections of a population’s beliefs. The margin of error is a measure that takes into account the sample size and the amount of variation in responses. In this case, the margin of error is ±4 percentage points, meaning that we can be 95% confident that between 27% and 35% of Americans approve of the job that Congress is doing.
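For a simple random sample, the 95% margin of error for a proportion can be computed directly. The sketch below uses the standard formula; note that the sample size of 1,000 is an assumption (the excerpt does not report Gallup’s actual sample size), and published margins like ±4 points are often a bit larger than this formula gives because of design effects and rounding.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion,
    assuming a simple random sample (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

p_hat = 0.31   # 31% approval, as in the poll discussed above
n = 1000       # assumed sample size for illustration
moe = margin_of_error(p_hat, n)

print(f"Margin of error: ±{moe:.1%}")
print(f"95% confidence interval: {p_hat - moe:.1%} to {p_hat + moe:.1%}")
```

The formula also shows why samples of about a thousand are so common: the margin of error shrinks with the square root of the sample size, so quadrupling the sample only cuts the margin in half.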

Now, if I were a member of Congress, I wouldn’t be too excited about this range of scores. But because it overlaps the ranges from earlier months, we probably cannot say with confidence that Americans’ approval of Congress is declining. No matter how you look at it, though, it’s pretty low.

You might notice that news shows reporting polls rarely discuss the meaning of the margin of error, if they mention it at all. Keeping this important measure in mind, you can use your sociological imagination to think critically about the next poll you hear about in the news.
