Are We Done With College Yet?

There should be a word for the things we do not because we want to, but because we want to be the kind of person who wants to.
A Softer World #626

There is already a term for the things we do even though practically nobody wants to be the kind of person who wants to: status update.

In “Generation Why?” Smith contends in McLuhan-esque fashion that the structure of Facebook is making people shallow, concluding that “[The Social Network is] a cruel portrait of us: 500 million sentient people entrapped in the recent careless thoughts of a Harvard sophomore.”  But the movie’s version of those careless thoughts was “I’m talking about taking the entire social experience of college and putting it online.”  And that’s a fascinating insight to me, since one of the distinct memories I have collected from college is of cheerfully anonymous other students saying “How’s it going” as they walked through the courtyard without a pause in their step or a backward glance, as if they didn’t realize that it was a question.  This behavior marked the difference between being a node on their network and being a person in their set of friends.

So while the LA Times reports that “College students may be lacking in empathy, study finds,” I’d observe that the trend spans more than 30 years, so it’s not correct to heap the blame on Facebook.  People were already being vacuous at each other; Facebook just makes it easier and more pervasive than — for example — mere blogging ever did.  While the structure of tools certainly alters the way we engage in tasks, the tools were often created after the task, with the effect of making some things easier and other things almost impossible.  And when a tool is built for a task that the builder wasn’t necessarily very good at, the result is the occasional awkward and telling limitation.  That Facebook wants to refer to all people you contact as friends is, I suspect, not a coincidence.

One of the more pernicious side effects of the tool is how word-oriented it is.  Traditionally, face-to-face interaction has provided for richer communication because of all of the non-verbal cues that are put to use to suggest power relationships, emotional states, et cetera, that practically nobody has the vocabulary to encode into their words.  This makes sense given the relative freakishness and youth of words compared to the communication that can go on amongst other animals.  In the age of mediated communication, however, we’ve got words and the :).  This led the Wall Street Journal to report a correlation: an increase in mediated communication is impinging on our ability to read non-verbal cues, and non-verbal cues are prime triggers for empathetic response.  And that sounds like trouble to me.

In Practical Wisdom, Schwartz and Sharpe talk extensively about how word-structured rules frame behavior, limiting improvisational wisdom-guided behavior.  This matters to the asynchronous model of communication — from tweets to status updates to debate speeches — because the chosen structure which facilitates one type of communicative behavior may (from McLuhan on down) impede another.

When words are your tools for evaluation, then you try to cram everything you are experiencing into a verbal format. When words, in the form of rules, are your tools for moral evaluation and decision making, then you limit your consideration to aspects of the situation that the rule speaks to. If all you have is a hammer, everything is a nail. Words are good—even essential—for many of the challenges we face in our lives. But they are not always friends of pattern recognition. And they are not always friends of wisdom.

This is hardly news.  Daniel Sieberg claimed “My life is not a status update” last year.  But if we are, as Jeremy Rifkin claims in The Empathic Civilization, soft-wired for empathy, what are we doing instead?  In response to “Generation Why?,” Helena Fitzgerald’s “Coldness, Cruelty and the Status Update” suggests that what we’re doing is self-voyeurism bordering on masochism:

When our relationships end, we announce them to the internet. When we begin new relationships, we announce them for everyone, including the last person we loved, to see. We become our own voyeurs; we are hiding in the bushes outside our own windows, watching the drama unfold within: Infidelity, disinterest, distraction, and new love overtaking old loyalties — not in any scripted narrative but unfolding in real time and happening to us and to the people we know — make up our entertainment.

This makes a certain degree of sense if we accept Dunbar’s research, which says that humans are biologically incapable of effectively keeping track of more than 150 friends; beyond that point (or at least one close to it), people are dehumanized to some degree into mere nodes.  Indeed, if the advent of games on Facebook reduces the definition of “friend” to “somebody who clicks something to progress a virtual task in a game” and the modern social climber joins a network just to gain access to some of the people on it, then an ongoing expansion of consequentialist thinking towards other people should be utterly unsurprising.  While technical capacity has increased our awareness and net potential empathy as Rifkin advocates, I’m not at all certain that it is facilitating the ongoing development of actual empathy.  Schwartz reports:

Development of genuine empathetic understanding is not automatic. It depends on experience. But we seem to be predisposed to profit from the experience if we have it… Flanagan argues, for example, that children’s moral networks are tuned through experience and feedback: Children learn to recognize certain prototypical kinds of social situations, and they learn to produce or avoid the behaviors prototypically required or prohibited in each. Children come to see certain distributions of goodies as a fair or unfair distribution. They learn to recognize that a found object may be someone’s property… They learn to discriminate unprovoked cruelty, and to demand and expect punishment for the transgressor and comfort for the victim. They learn to recognize a breach of promise, and to howl in protest. They learn to recognize these and a hundred other typical social/moral situations, and the ways in which … society generally reacts to those situations and expects them to react.

The downside of mediated communication is that it hides our impact on other people.  Their reactions should be triggering empathic responses in us; without that visibility, they fall outside our empathic sphere and are relegated to being the Other.

Rifkin glosses over this as he waxes poetic about the progression of empathy from tribe to religion to state, never balancing his position with the realization that each of these groups was made possible by the existence of an Other.  Chris Hedges’s War Is a Force That Gives Us Meaning, however, discusses how this played out in the Balkans, with the horrifying brutality and ethnocentric slaughters that followed: a movement needs an Other to contrast itself against.  And, as Eric Hoffer puts forth in The True Believer, “A rising mass movement attracts and holds a following not by its doctrine and promises, but by the refuge it offers from the anxieties, barrenness and meaninglessness of an individual existence.” Hoffer also echoed a claim Fromm had made earlier in Escape from Freedom: that authoritarianism gains popularity by giving people “freedom from the necessity of informing themselves and making up their own minds concerning these tremendous complex and difficult questions.”

The point of concern is this: given enough connectivity, people can cobble together a support group for any manner of reprehensibility, taking refuge from the anxieties of the difficult questions they should be asking themselves while relegating anybody who would gawk at the behavior to being the Other.  In politics, this manifests as partisanship.  More broadly, and counter to the idealist theory of the marketplace of ideas, this is group polarization.  But where it starts is with what is described on The Last Psychiatrist as “crowdsourcing the superego.”  In particular, he looks at a couple of scenarios that made the news for no apparent reason other than to rally a group against social norms.

But what you need to get out of these stories is how this generation and forwards will deal with guilt: externalizing it, converting it to shame, and then taking solace in the pockets of support that inevitably arise.   Everyone is famous to 15 people, and that’s just enough people to help you sleep at night… It is, in effect, crowdsourcing the superego, and when that expression catches on remember where you first heard it.  Then remember why you heard it.  And then don’t do it.

Following up on that, he clarifies his particular concern with the amorality being effectively touted:

Many in the comments accused me of being an old codger, a “these kids today are immoral” uptight Rush Limbaughlite.  If you think that, you’re missing something truly important: these aren’t kids. These are middle aged professionals who have kids.  I expect– want–  a little Nietzsche in the 20 somethings of the world, to fuel them to do something with their lives.  But these are people who should know better.  Instead, they’ve convinced themselves, after 4 decades of life, that they deserve to be happy, that their happiness is more important than anything.

But this all makes perfect sense if a half-billion people are enjoying an online abstraction of “the entire social experience of college” rather than advancing the social experience of adulthood (which, I would separately contend, has also been injured by parents over-devoting themselves to their children to the detriment of their own adulthood).  The prime difference is that Facebook, in contrast to more conventional media, represents a democratization of crowdsourcing the superego — a boon for people not quite able to get on talk shows or reality TV, I’m sure.

What’s odd here isn’t that people are awful, but rather that acclimation to group polarization introduces dissonance into an otherwise perfectly maintained projection of persona.  Put another way, once a person has adopted a way of being, it is highly unnatural for them to stop being that way, even when they want to for political reasons.  This, according to Smith in some of the more intriguing bits of “Generation Why?,” was a crucial point of distinction between Roman behaviors and the Grecian sensibilities which have been embedded in Facebook:

What’s striking about Zuckerberg’s vision of an open Internet is the very blandness it requires to function, as Facebook members discovered when the site changed their privacy settings, allowing more things to become more public, with the (unintended?) consequence that your Aunt Dora could suddenly find out you joined the group Queer Nation last Tuesday. Gay kids became un-gay, partiers took down their party photos, political firebrands put out their fires. In real life we can be all these people on our own terms, in our own way, with whom we choose.

Smith’s closing claim there appears rooted in Rousseau and thus has an existential weakness: existence precedes essence.  Just because a person isn’t actively performing a given role in an interaction with another person doesn’t mean that they’ve given up that role.  Just because somebody doesn’t tell their husband that they’ve cheated on him doesn’t mean that they can avoid dropping accidental hints that, over prolonged and personal exposure, make him very suspicious.

But again, going back to the “social experience of college” model, this makes more sense: college was experimental freedom bounded by semesters.  If you didn’t want to take the consequences of what you were doing, then you could just claim that your coursework or whatever was too intense and drop whatever commitment you had picked up.  “Friend” was often synonymous with “study group for current class” or “coincidental neighbor this semester.”  Group hazings were practically required to create a sense of investment, fostering loyalty that would have otherwise been completely absent from recruits eager to continually try new things rather than settle into a lifestyle.  This is just the transition-heavy structure of college combining with incomplete brain development to produce the bit of Nietzscheanism that society expects in order to propel itself.  Whatever, no problem — speculative behavior is an important part of the learning process.  (Unless you fly directly home from Los Angeles without being very clear with the professor that you’re not going back to Sea-Tac with the rest of the class, as that will totally freak them out.)

The concern is ultimately that people are not outgrowing speculative behavior.  They are opting not to invest in anything or make any plans.  Indeed, having sold a house lately, I’ve discovered that it’s more rewarding to not be invested in home ownership — and apparently even Mark Zuckerberg is a rent-paying kind of guy.  This seems balanced in advantage and disadvantage to me.  On the one hand, I’m concerned that we’re making precious little progress in the physical world that will be left for our descendants — a position summarized by “Clicking ‘Like’ won’t solve America’s problems” — but at least it seems equally improbable that clicking ‘Like’ will result in genocide against the arbitrary Other du jour.