Ifs and Buts

Anyone whose examination of British poet Rudyard Kipling digs deeper than the Disney version of The Jungle Book will quickly stumble across the numerous portions of his biography and body of work that leave many contemporary critics recoiling in disgust. Chief among these is the rightfully notorious poem “The White Man’s Burden.” Other Kipling writings are only slightly less inflammatory when viewed through a 21st-century sociological lens. “If-,” another of his poems, can be read as a narration of the enforcement of rigid gender norms – a series of demanding expectations for what a man must be and do before the poem’s last two lines can be true of him: “Yours is the Earth and everything that’s in it/And – which is more – you’ll be a Man, my son!”

“If-” sets the standards for true manhood dauntingly high. A man is expected to be able to walk a slew of emotional tightropes – “If you can trust yourself when all men doubt you/But make allowance for their doubting too,” “If all men count with you, but none too much.” When I’m not distracted by the glaring discrepancies between Kipling’s view of the world and, say, Judith Butler’s, I return in my thoughts again and again to one line in “If-” in particular: “If you can talk with crowds and keep your virtue/Or walk with Kings – nor lose the common touch.” I see this as one of the foremost challenges (should one attempt to carry it out) of a life in academia, and in my opinion, it’s a challenge whose significance isn’t invalidated by Kipling’s imperialism or by its location in a highly gendered poem.

From the perspective of a grad student (or at least this grad student), “virtue” can be thought of as intellectual heft, and the “Kings” are the faculty and, to an even greater degree, the prominent scholars in one’s field even beyond one’s home campus. I realize that referring to college faculty as “Kings” is sure to provoke some eye-rolling, and it’s certainly true that the literal kings of Kipling’s era held powers that the contemporary sociology professor doesn’t come close to possessing and probably wouldn’t even want to possess. But even if my so-called kings – and queens – lack geopolitical power, they’re still, I suggest, somewhat removed from “the common touch,” and this removal is frequently framed as “virtue.” And it should be! Being able to think critically about that which others blithely accept, being willing to tackle difficult interpretations, being willing to conduct research and look at evidence rather than always rely on one’s gut – these are, in my opinion, fundamentally admirable qualities. If I didn’t think so, I wouldn’t be in grad school.

Ultimately, however, pursuing these qualities almost inevitably entails divorcing ourselves from the crowds and losing our “common touch.” The break is certainly more fluid than it is clean, and of course it will take different forms for different people. But no matter what the specifics of any individual’s circumstance may be, some separation is practically unavoidable. To some extent this is true for virtually any pursuit one may take up in life. If you become a banker, you put yourself in a position to socialize with bankers to the exclusion of others and to think and talk about banking to the exclusion of other topics. If you join a basketball team, you put yourself in a position to socialize with other basketball players to the exclusion of others and to think and talk about basketball to the exclusion of other topics. I would submit, however, that the pursuit of a career in academia requires, or at least strongly encourages, a particularly profound and consequential separation. As academics or aspiring academics, we take up dual roles. We go on living our lives, down there among the crowds going through the same utilitarian and banal experiences that even academics can’t completely remove themselves from. (Even emeritus professors can sometimes get stranded on the side of the road with a flat tire.) But we simultaneously take up a position high in the clouds, looking down upon that world and staking a claim to an understanding of it – or at least one slice of it – that is worthy of recognition and engagement. Social life, in its varied forms, becomes, for us, fodder for ideas and knowledge claims.

Ideas don’t exist in a vacuum, of course; they become sensible and meaningful only in relation to other ideas. But how do we engage in that meaning-making process? We can try to carry it out down there on the ground with the people who are stranded with flat tires, but they may be hard pressed to reciprocate – not because they’re stupid or lazy, but because they’re preoccupied by the frustrations of flat tires or minimum-wage jobs, and because, in many cases, they haven’t been provided with the privilege or good fortune of having some degree of financial or physical or social distance from these frustrations in the way that many academics have. And so, as we search for partners with whom we can engage in the conversations and debates that give our ideas merit, we stay in the clouds. Besides, it’s nice up here. The clouds have nice restaurants instead of fast food chains. They have jazz and opera on the radio instead of Van Halen. And most importantly, they have conversation partners. We challenge ourselves to be worthy of conversation with kings and queens. It’s a pursuit I’m taking part in right now, and a virtue I want to embody. But I worry about what I might miss if this pursuit leads me into a pattern of single-mindedness and myopia. Ultimately, I don’t just want my ideas to be in conversation with those of the kings. I want to keep the crowd involved, both in terms of the conversation’s substance and the music in the background as we have it. “If” I can do those things, I may not be “a Man, my son,” but I’ll be the scholar and the person I want to be.

High Noon of the 2000s

Catching a repeat of “The 100 Greatest Songs of the 2000s” on VH1 got me thinking about whether any particular genre or style of music could really be considered emblematic of the 2000s.  After initially struggling to formulate a general theory of 2000s music, I began to consider the arbitrary nature of decades as a reason for my struggles.  (It’s a more appealing explanation than my own lack of creativity.)  Our calendars are, after all, social constructions.  There’s nothing inherent in the laws of nature that says that the culture should undergo a paradigm shift on January 1st, 2010.  Rather than try to shoehorn our understandings of cultural trends into pre-formed ten-year chunks, maybe we should be more open to a classification of eras that isn’t boxed in by decades or inappropriately stretched to fill ten-year spans.  Ten years, after all, is a fairly long time (though that sort of judgment is relative), and the themes and attitudes that define one portion of that span may not maintain their relevance throughout the entire ten years.  Consider, for instance, “the ’60s” as they are understood in the popular imagination – a decade of war, protests, and the Beatles – even though none of these achieved particular prominence (at least in the United States) until 1964 at the earliest.  It is for these reasons that I present a concept I call “High Noon of the 2000s.”

The boundaries of High Noon of the 2000s are difficult to pin down, but I would define them as running from around early 2004 to late 2007.  They begin roughly at the point at which the initial shock of the 9/11 attacks had begun to wane (though of course there would be many exceptions to this generalization of our collective response to 9/11) and American society had begun to adjust to the more-or-less permanent war footing of the post-9/11 era as the “new normal.”  They end with a one-two punch around late 2007 and early 2008 – the financial crisis and the rise of Barack Obama’s presidential candidacy.

Having described when High Noon of the 2000s begins and ends, I should turn my attention to what exemplifies the period.  There is no better way to do this than to introduce the man who personifies High Noon of the 2000s better than anyone else – Kevin Federline.  Now I know what you’re thinking – “Kevin Federline?  But Matthew, he’s such a has-been artificial celebrity.  And even during his fifteen minutes of fame, he was insipid and crass and selfishly exploited the efforts and accomplishments of others.”  I say – exactly.  And that’s why he’s such a perfect representation of High Noon of the 2000s.  Kevin Federline’s success, such as it was, exemplifies success as it existed in the culture at large at High Noon of the 2000s – materialistic, narcissistic, independent of any commitment to something larger than one’s self, oblivious to the inequities of our society, and, ultimately, fleeting.  Listen to “A League of My Own” from his 2006 debut album Playing With Fire.  (The world is still waiting for its follow-up.)  Can you not picture it being blasted from the stereo of a Hummer as a newly-minted real estate millionaire rolls through the sprawling suburbs of Phoenix?

Nathan Rabin of the Onion AV Club wrote a survey of the songs on Volume 21 of the “Now That’s What I Call Music” compilation series that doubles as the greatest documentation of the zeitgeist of High Noon of the 2000s ever produced.  Though he makes no specific reference to Kevin Federline, Rabin’s analyses of songs like “Grillz” and “Honky Tonk Badonkadonk” (not to mention the picture of a lasciviously leering Trace Adkins that adorns the top of the article) bring High Noon of the 2000s into the sharp relief that this blog post can only dream of.

Oh, and in case you’re still a little unclear on exactly how our culture has changed since High Noon of the 2000s, this pretty much tells you everything you need to know.

Sociology at the Checkout

One of the grocery stores in town has a small sign at each of its checkouts that reads “If you’re lucky enough to look under 27, please have your ID ready!”  The language of the sign always grabs my attention.  If you’re lucky enough to look under 27.  Why is the grocery store so confident in the universality (or near-universality) of the notion that to look under 27 is to be lucky?  I don’t disagree with the idea that the average American would believe that, all other things being equal, looking young is desirable.  But how did we get to this point, and what does it say about our society? 

What makes this question especially fascinating to me is the sort of reversal that occurs during the life course with regard to the age people wish to be.  Kids frequently wish to be older, in order to have access to the various facets of life that are (at least in a legal sense) restricted to those of a certain age – driving, buying tobacco and alcohol, seeing R-rated movies, and so on.  Then, once we are firmly enmeshed in adulthood, the “desirable age” suddenly shifts to an age younger than we actually are.

I had a teacher in high school who grasped this conundrum and sought answers from our class.  He said something to the effect of “I always hear from you kids that you want to be older and you can’t wait for your birthdays, but people my age dread their birthdays and wish they were younger.  So exactly how old do you kids want to be, anyway?”  In response, a student in the class raised her hand and, in a matter-of-fact, isn’t-this-obvious tone of voice, replied “21.”  That was just one person’s opinion, but I’m sure she isn’t alone.  It’s not hard to figure out why 21 might be the “magic number,” as it were:  you’ve passed most of the age milestones (including, most notably, the legal drinking age), yet still have what are often called youthful good looks.

But what are “youthful good looks” anyway?  Why are good looks typically associated with youthfulness?  It calls to mind the significance of social factors in determinations of physical attractiveness.  We might often think that since physical attractiveness is, well, physical, it is an entirely or mostly biological process that society has no real input on.  Nancy Etcoff makes such a case in her 1999 book Survival of the Prettiest.  Her general claim is that modern notions of what constitutes physical attractiveness are rooted in evolutionary strategies for maximizing the likelihood that we can produce offspring and that these offspring will survive.  Social institutions such as advertising, corporations, and the media, she suggests, can tweak and interact with our hard-wired preferences, but they ultimately do not create them, any more than “Coca-Cola or McDonalds created our cravings for sweet or fatty foods” (1999: 4).

Sociology and evolutionary psychology have a number of disagreements, to put it mildly.  I don’t think I have enough knowledge of genetics (“none” would be my precise amount) to fully rule out the possibility of some biological influence on our perceptions of attractiveness.  Yet I do believe that society’s impact on these judgments is real and significant.  Superficial differences of biology are imbued with cultural meaning in ways that blur the lines between what is biological and what is social.  A comparison to race could be instructive.  Human beings are born with surface-level differences in skin color, it is true, but it is society that seizes upon these differences and uses them to construct the institution of “race.”  It is society that chooses to attach these social consequences to skin color and not to eye color, finger length, or other ways in which human beings can differ.  Just as it is society that frequently attaches esteem and privilege to light skin and stigma to dark skin (calling to mind W. I. Thomas’s observation that if human beings define something as real, it will be real in its consequences), it is society that can set the terms for what is considered an attractive human body.  What’s more, these standards can differ across cultures and are subject to change.  Consider the increase in the prevalence of disordered eating in non-Western societies whose traditional standards of beauty differed from those of the West but that are increasingly exposed to Western media and advertising (e.g., Becker et al. 2002).

What I’m grappling with here – recognizing cultural influence on attractiveness while still feeling that I lack the training in genetics to conclusively rule out any biological factors – is, of course, the classic nature-vs-nurture debate.  Are attractiveness standards hard-wired or socially constructed?  As a sociologist, I lean strongly toward the social construction end of the spectrum.  But more to the point, I don’t feel that leaving open the possibility of some biological influence necessitates abandoning efforts to recognize cultural influence, or to change or push back against it.

Sources: 

Becker, Anne E., Rebecca A. Burwell, David B. Herzog, Paul Hamburg, and Stephen E. Gilman. 2002. “Eating Behaviours and Attitudes Following Prolonged Exposure to Television Among Ethnic Fijian Adolescent Girls.” British Journal of Psychiatry 180:509-514.

Etcoff, Nancy. 1999. Survival of the Prettiest: The Science of Beauty. New York: Doubleday.

Sociology and the Presidency

The current election season prompted me to reflect on the access (or lack thereof) to the ears of Presidents that sociologists have enjoyed over the past several decades.  It seems clear that sociologists’ Oval Office influence pales in comparison to the access wielded by economists.  Numerous economist-and-President pairings have become famous (or infamous, depending on your point of view) – for instance, can you think of Arthur Laffer or Jude Wanniski without also thinking of Ronald Reagan?  If you’re searching for sociologists who have served as Presidential confidants, the pickings are slimmer.  There is Daniel Patrick Moynihan, who served as an advisor to Presidents of both parties and whose Washington connections helped ensure that his “Moynihan Report” on the African American family would become as famous (or – once again – infamous, depending on your point of view) as it did.

But the high-water mark of sociological influence on the Presidency may well have come in the summer of 1979.  The typical story of that period reads something like the following:  As the country grappled with high inflation and energy shortages, President Jimmy Carter delivered a speech diagnosing a national malaise, which fell flat and contributed to his eventual loss to Ronald Reagan.  In his 2009 book What the Heck Are You Up To, Mr. President?  Jimmy Carter, America’s “Malaise,” and the Speech That Should Have Changed the Country, Ohio University Professor Kevin Mattson fills in some of the gaps in the story of the speech.  For starters, Carter never actually used the word “malaise.”  But beyond that, Mattson’s book illustrates the extent of sociological thinking’s influence on Carter in the weeks leading up to the speech.

In an attempt to take the pulse of the nation and investigate his hunch that the nation’s troubles ran deeper than long lines at gas stations, the President met with dozens of academics, religious leaders, political figures, and ordinary Americans, as well as his own pollsters and advisors.  I took particular interest in Mattson’s references to Carter’s meetings with Robert Bellah and Christopher Lasch.  Bellah, a sociologist from the University of California at Berkeley, advised Carter to speak uncomfortable truths to the public about the decay of the bonds that held them together.  Americans had once shared a sort of “national covenant” – a commitment to one another that transcended self-interest.  By the 1970s, Mattson describes Bellah as telling the President, this covenant had eroded into a “contract model” of society that facilitated the growth of narcissism.  This trend toward selfishness was further discussed in Carter’s conversations with Christopher Lasch.  While Lasch was not a sociologist by trade, he and his book The Culture of Narcissism, a surprise bestseller in 1979, have long received considerable attention in sociological circles.  Intriguingly, Mattson describes Lasch as cautioning Carter that a discussion of the need for sacrifice might fall on deaf ears in light of increasing public cynicism.

Ultimately, on July 15th, 1979, Carter gave a speech in which the themes Bellah and Lasch had discussed figured prominently.  Carter appeared to hope that Lasch had been wrong about the way the public would react to a call for sacrifice on behalf of the common good.  The President spoke of a crisis of confidence that he saw sweeping the land and the renewed public commitments that would be needed to overcome it.  In other words, he exhibited more confidence in the ability of the American people to acknowledge and respond to Laschian concerns than did Lasch himself.  Among the President’s words were the following:

“In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God, too many of us now tend to worship self-indulgence and consumption.  Human identity is no longer defined by what one does, but by what one owns.  But we’ve discovered that owning things and consuming things does not satisfy our longing for meaning.  We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose” (quoted in Mattson 2009:  211).

So then what happened?  Another valuable contribution of Mattson’s book is its discrediting of the notion that the speech was an immediate disaster.  On the contrary, the public response was initially positive.  In time, however, a number of factors would contribute to the collapse of both Carter’s standing with the public and the resonance of his speech.  For one, Carter squandered much of the immediate momentum from the speech by orchestrating a purge of his Cabinet in the following days.  More fundamentally, however, the American public was unsure exactly how to go about addressing the crisis of confidence.  The most appealing response, offered by Republican Presidential candidate Ronald Reagan, proved to be the rejection of the entire concept of a decaying national covenant.  In Reagan’s eyes, according to Mattson, the American covenant required little in the way of sacrifice.  On the contrary, the pursuit of self-interest was to be celebrated and encouraged.

What lessons, then, do Carter’s speech and Mattson’s account of it hold for sociology?  I believe that the initially positive reaction to Carter’s speech suggests that a President can stand to gain from bringing sociology to the masses.  The juxtaposition of the positive response to the speech and Carter’s eventual loss to Reagan and his message of unabashed individualism points to a duality of American civic culture.  We value individualism, yet strive for something beyond ourselves. 

At the same time, we should also acknowledge that the variety of sociology that speaks of a “national covenant” is but one of many.  Who gets to define this national covenant?  Who has the right to try to change it?  It seems possible that the negotiation and maintenance of a national covenant could take place under the terms of those with the most power in our society – Whites, men, heterosexuals, and so forth.  The thicket of these debates is familiar terrain for sociologists, but the complexity is perhaps indicative of why Presidents have generally been reluctant to engage with the discipline.  Better to just tell everyone to go shopping.

How Sociology Explains How Soccer Explains the World

Like virtually any two influential social institutions, sport and academia have a knotty and multi-faceted relationship.  Despite popular assumptions that jocks and nerds inhabit mutually incomprehensible cultural worlds, there is a long tradition of academic attempts to grapple with the social significance of sport, and sociology boasts a thriving subfield devoted to the subject.  However, these academics’ engagements with sport are most often carried out in the context of their roles as scholars, rather than as “sports fans” per se.  Indeed, the very identity of the sports fan, rooted as it is in the word fanatic, seems difficult to reconcile with academic commitments to question dogmatic assumptions and pay heed to empirical evidence.  What empirical evidence could possibly exist to suggest that the devotion of attention and emotional investment to sports teams, particularly perennial losers like baseball’s Seattle Mariners or hockey’s St. Louis Blues, yields anything other than anguish and a monumental waste of time, money, and energy?  At least a life spent following the New York Yankees reaps the psychological reward of basking in the glory of championships at the rate of about two a decade.

Nevertheless, I can’t help noticing a conspicuous exception to this academic reluctance to engage in fandom when it comes to soccer.  The recent Euro 2012 tournament was another reminder of the genuine excitement many of my academic friends and acquaintances have for the sport.  I’m far from the first person to notice the affinity between the highly educated and the “beautiful game” (though every time I see that phrase applied to soccer, I am forced to conclude that the user has been cruelly denied the opportunity to witness the exquisite splendor of a fast break by the UNC basketball team).  In his 2004 book How Soccer Explains the World, Franklin Foer looks at soccer as a front in the culture wars.  Soccer fans, at least in the United States, are a worldly, cosmopolitan lot, while the sport’s detractors tend to be downscale believers in American exceptionalism whose distrust of globalization in general shines through in their dislike of soccer and of those who suggest that America take a cue from the rest of the world and embrace it.  It isn’t hard to pick out which side of that divide academics would want to line up on.

So it’s not without some serious cognitive dissonance that I announce to the world that I, an aspiring academic and a dutiful liberal, don’t particularly care for soccer.  I love football (as it is defined by Americans outside the ivory tower), basketball, and hockey, along with the North Carolina-based teams that play them; I love baseball even though North Carolina doesn’t have a major league team; and I love the basketball team at my alma mater, UNC-Chapel Hill.  But I just can’t muster enthusiasm for the sport that the rest of the world – and, more importantly for the social capital of a graduate student, the apparent majority of sports fans in academia – adores.

Realizing the sort of company one keeps by virtue of a disinterest in soccer is enough to provoke a near-existential crisis in a young academic.  Foer’s book describes some of the more vociferous members of this rogues’ gallery.  Witness, for instance, the appalling homophobia of sports radio shock jock Jim Rome, whom Foer quotes as saying “My son is not playing soccer.  I will hand him ice skates and a shimmering sequined blouse before I hand him a soccer ball,” or the peculiar style of patriotism practiced by NFL quarterback turned Congressman Jack Kemp, who argued against an American bid to host the 1994 World Cup by proclaiming “I think it is important for all those young out there, who someday hope to play real football, where you throw it and kick it and run with it and put it in your hands, a distinction should be made that football is democratic, capitalism, whereas soccer is a European socialist [sport].”

And so if these are the stakes, whose side am I on?  I used to believe that my hatred of the Atlanta Falcons was enough to solidify my understanding of myself as a fundamentally good and decent person with a bright future.  (Didn’t Max Weber call hating the Falcons the most commonly accepted sign of eternal salvation among people who couldn’t commit to the Protestant work ethic?  Everyone needs a Plan B.)  Now I realize I must expand my horizons.  I must make a change.  And so I resolve now that, as part of my broader system of preparation for a career in academia, I will apply the same level of dedication and diligence that I commit to studying for comprehensive exams and writing articles for publication toward acquiring a taste for soccer.  After all, if I ever have to make a good social impression at a reception following a job talk, I’ve begun to resign myself to the fact that no one is going to want to hear about the offensive line play of the Carolina Panthers.  But Ronaldo, on the other hand…

<Turns on soccer game.  Two minutes pass.>

Then again, forget it.  I’ll just go back to studying for comps.