The Lexwerks

Linear Dragons

“We each make our own accommodation with the world and learn to survive in it in different ways.” –Douglas Adams, Last Chance to See

Photo of Komodo Dragon, 2012, at the Sydney Zoo

Tuka the Komodo Dragon

Aristotle was an asshole: his theories of virtue, ethics, and morality actively excluded the overwhelming majority of not only the world (no barbarians allowed) but even the residents of Athens (also no women, poor people, or ugly people). This is a crucial point that gets forgotten when people try to tap into ancient Greek philosophy: its aristocratic aspirationality may not scale up so well. And Aristotle’s blind spots make him generally incompatible with the teachings of early Christianity that started in a Roman-subjugated and entirely-underclass Jewish population. And while Nietzsche would later complain that the morality of Christendom was only suitable for herds of slaves that diminished the (romantic) dignity of aspirational — that is, both heroic and fictional — humanity, Nietzsche didn’t provide a form of ethics that scaled well either. And if you want more on this topic, then I’d suggest reading Alasdair MacIntyre’s After Virtue because my intention here is only to call to mind what Aristotle gets right, and what dragons get wrong, and what it means to ordinary modern people.

Aristotle gets two things right: any virtuous behavior tends to lie between too much and too little of that type of behavior, and a person develops that virtue by intentionally practicing hitting that mid-point of virtuous behavior. For example: while too little courage is obviously cowardice, too much courage is suicidal insanity — thus the virtue of courage is exercised and developed by finding the mid-point between timidity and wanton recklessness. Aristotle also asserts that relying on ingrained discipline or intuition could result in the short-term appearance of virtuous behavior, but it wouldn’t be a conscientious practice and development of it. On the whole, existentialists can look back to Aristotle’s insistence on practiced virtue for their core claim that Existence Precedes Essence.

Anyway, dragons. Nobody coming from the European tradition would make a claim as to the virtuousness of dragons. While they may have many very impressive qualities, dragons are considered to be bad by human standards. And this is true not only of the mythological dragons — I’ll get to them — but also of real dragons. Here’s a snippet of conversation Douglas Adams recorded regarding the Komodo Dragon in his book Last Chance to See:

What works as successful behavior for us does not work for lizards, and vice versa.

“For instance,” said Mark, “we don’t eat our own babies if they happen to be within reach when we’re feeling peckish… A baby dragon is just food as far as an adult is concerned… It moves about and has got a bit of meat on it. It’s food. If they ate them all, of course, the species would die out, so that wouldn’t work very well. Most animals survive because the adults have acquired an instinct to not eat their babies. The dragons survive because the baby dragons have acquired an instinct to climb trees. The adults are too big to do it, so the babies just sit up in the trees until they’re big enough to look after themselves. Some babies get caught though, which works fine. It sees them through times when food is scarce and helps to keep the population within sustainable levels. Sometimes they just eat them because they’re there.”

In this case, Aristotle would say the Komodo Dragon is not virtuous despite eating only a moderate quantity of its offspring because it’s not intentionally practicing moderation in eating its young, but instead encountering the constraint imposed by the baby dragons climbing trees to escape. There’s a moral lesson in there for us all, I’m sure.

But let’s talk about the European mythological/symbolic dragon. What we know about them tends to come from our perspective: they’re slain by heroes, most notably by Bard or Saint George, but really anybody trying to recover the treasure of a kingdom and/or a virginal princess has motivation to go after a dragon. And everybody knows where to find the dragon because by the time the dragon is big enough to have a kingdom’s worth of treasure or be stealing — or demanding — princesses, it’s already incredibly powerful and noteworthy and almost certainly well into its adult years. Conversely, a young dragon seems generally unremarkable: its lack of power prevents it from abusing its power, so continually pushing itself to its personal maximum of consumptive rapacity — symbolically both in terms of material wealth and sexually, with a standing preference for adolescent virginal girls — goes unnoticed until its power has grown to the point of being overwhelming and the consumptive rapacity that it’s been practicing for years, decades, maybe even centuries, is abusive. And this gets the attention of the heroes and gets the dragon killed.

From our point of view, the dragon should have known better than to irresponsibly and intolerably abuse its power. But what did the dragon know? The symbolic dragon is intelligent, though more cunning than wise. The dragon knew that if it didn’t fully secure its territory, it’d get slaughtered by humans or, presumably, attacked by other dragons. And so as it’s growing up, it is constantly straining its small self to ensure its individual security against a plethora of social threats. And then when it is an adult, it finds that its essence is simply what it was practicing for all those years, only now it’s got this incredible dragon body that’s making it wildly successful at defending its treasure hoard and intimidating humans into virgin sacrifice.

The ingrained insecurity of its youth leads the dragon not to practice finding mid-points of virtuous behavior, but instead to push towards chronic excess in the hopes of feeling secure. Of course, the success of its excessive behaviors increases its value as a target of valiant heroes and thus undermines its security.

One of the early lessons of Jordan Ellenberg’s How Not to Be Wrong is on non-linear (that is, “not in a straight line”) thinking as exemplified by the Laffer curve. And the lesson is simple and sounds just like Aristotle: the optimal position for a variable policy is somewhere in the middle. Laffer was originally describing the hypothetical curve of government revenue from taxing people too little or too much: if the government doesn’t tax anybody anything then it’ll collect $0, but if it taxes everybody everything, then nobody will bother earning anything and the government will also collect $0. Similarly, the dragon should strategically attune its greed to balance its need for power to fend off incidental challenges with its need to avoid the level of infamy that makes it a ripe target for heroes.
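The structural claim can be sketched in a few lines of code. To be clear, the quadratic below is my own toy stand-in — Ellenberg and Laffer both stress that nobody knows the real curve’s shape — but it captures the one thing everyone agrees on: revenue is zero at both extremes and peaks somewhere in between.

```python
# Toy stand-in for the Laffer curve. The quadratic shape is an
# illustrative assumption, not Laffer's actual (unknown) curve;
# only the endpoints and the interior peak are the real claim.

def revenue(rate: float) -> float:
    """Hypothetical government revenue (arbitrary units) at a tax rate in [0, 1]."""
    assert 0.0 <= rate <= 1.0
    taxable_activity = 1.0 - rate   # assume activity falls off linearly as rates rise
    return rate * taxable_activity  # collected = rate x what people still bother earning

print(revenue(0.0))  # 0.0  -- tax nothing, collect nothing
print(revenue(1.0))  # 0.0  -- tax everything, nobody bothers earning
print(revenue(0.5))  # 0.25 -- the optimum sits between the two extremes
```

The dragon’s error is reasoning only about the left half of this curve: more rapacity always looked better while it was small, so it never notices the downslope where infamy starts costing more security than treasure buys.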

Instead, the dragon’s linear more-is-better thought process leads it to a cruel optimism: that it may one day be secure if it just keeps doing what has finally started seeming successful for it with the gold-pillaging and virgin-consuming.

And all of this is actually super-relevant to humans, even the ones that prefer evolutionary biology over symbolic mythology, because our limbic system is what makes us similar to lizards and undermines our ability to break out of linear thinking.

“It doesn’t matter how clever you think you are; those years, they leave all kinds of scars.” –Curve, “Beyond Reach”

I sympathize with the dragons. It’s not just that I am amassing a treasure-trove, nor is it that the keenness of my physique can dissolve the attractive young fitness coach-lady’s professional veneer into flirtatiousness as she probes my pectorals for fat that is not there. But rather like the dragons, it’s that I was not always like this.

Growing up I was always one of the youngest kids in the class, and also tended to have the lowest BMI. I’m average height (5’10”) but until I was 31, my maximum adult weight was 128 lbs — so rewind that to high school and I’ve got the kind of scrawny body that results in girls telling me that I’m causing them body image issues. For most of my formative years, however, I went out of my way to avoid being persecuted. I learned very early — from my brother’s experience as well as popular culture — that there were violently dangerous bastards at school and that I should try to avoid them: give space, or blend in, or — late in high school — stand out by looking spiky, or, in the worst case, use my small size to slip between obstacles and people. Failing that, and knowing that I’d not win a “fair” fight, I should fight dirty and decisively.

I confess that I was temporally lucky; when I was growing up, it was not utterly horrifying to be the quiet inscrutable loner. Taking Machiavelli — “It is better to be feared than to be loved” — to heart was not yet grounds for expulsion. Columbine was still a few years away, so being a bit prickly wasn’t a therapy-inducing problem. Sara Taylor describes that time of my life perfectly in Boring Girls:

I suppose it was around this time that I started nurturing my desire to be feared. I wanted to surprise people who underestimated me, and rather than simply impress them, I wanted them to regret having felt that way. I became fixated on that moment of realization.

And this is simply because if a fight happens, then the fight isn’t really going to end until my opponent realizes the depth of their error in coming after me. I’ve heard that cis-boys can rough each other up to test and reinforce the strength of their hierarchical organizations; that team sports help engender this in them. I don’t understand that way of thinking at all. I think it’s fucking idiotic.

Anyway, my perpetual readiness to fight was thankfully never substantially tested. I suspect I was a step above where Paul Graham describes in his essay “Why Nerds are Unpopular” when he says that “to be unpopular in school is to be actively persecuted” — and indeed, I would assert that to be too popular is to be a precarious target for persecution, just like a dragon with too much treasure — but for me that just meant that I ought to seem complicit in whatever behavior, no matter how wretched, I couldn’t avoid, in order to ensure my personal safety.

And that’s how a kid keeps his social nose clean while running the risk of encountering a predatory gang of bored delinquents, or simply being trampled by some girl who literally runs him down because she’s oblivious to where she’s going, both of which come together later in the form of obnoxious college football jocks, inadvertently confirming that the lower half of campus is a hive of degenerate villainy — as I had long suspected and mostly avoided.

But now I’m “like this,” which means I’ve added 30 lbs of muscle and have traveled solo over much of the world (and surfed on three continents) and so on, and yet I’m still deeply, physiologically anxious about meeting people when I don’t know what they want from me. I still glance at joints, eyes, throat, and kidneys when encountering “sporty” looking guys even though I look almost identical to them. I resent people who are oblivious to how much physical space they’re taking up even though I’m not certain how much additional space I’m taking up.

The insecurities that I used to intentionally practice have persisted with me long past their defensive usefulness, corrupting my memories such that I identify with troubled Hal from Rocket Science more than I do with all the small-town newspaper reports of my long run of youthful accomplishments that my mother lovingly and comprehensively scrapbooked.

It turns out that bending an ingrained behavior into a non-linear virtue is a difficult thing to do; it is difficult to make the dragon lurking in the limbic system curve its habits.

So am I saying that despite appearances I’m actually a cold-blooded monster that will flame-broil you as soon as look at you? That’s certainly one interpretation, but I’d prefer you realize that this

is shockingly wrong linear thinking that fails to see how the taint of power drives nerd culture on a trajectory indistinguishable from bro culture in the same way that, as MacIntyre observed, power has consistently turned Marxists into Weberians. And this is well known, even specific to historic nerd culture as deconstructed from Revenge of the Nerds: their linear pursuit of one goal overshot the mark of virtue; in the end their behavior was practically indistinguishable from their antagonists. While it may be totally true that those people over there have created a nasty hostile asshole culture, this does not preclude the possibility we are living in a nasty hostile asshole culture ourselves, especially if we’ve spent an awful lot of time fixated on how much we hate their culture rather than positively cultivating our own.

And so consider the Komodo Dragon: just because the baby Komodo Dragon is able to climb a tree to flee its parents doesn’t mean it won’t try to eat its own babies when it grows up. It seems like we humans should be able to behave better than this, but it’s not at all clear that we’re going to.

Not a Bug; Loss Aversion is a Feature

TLDR? Loss Aversion is a psychological defense against focus capture, not a bug in our mathematical reasoning.

We’ll come back to that. But first I want to talk about restaurants and food allergies. See, back in 2012 I went from having a dodgy digestive system to having a wrecked digestive system, with my body deciding that an awful lot of otherwise normal foods — like bread — should be treated like poison. And this changed my entire approach to eating, which can interfere with the social aspects of eating. When Jennifer Esposito says that going out to eat isn’t fun anymore, this is the sort of thing she’s talking about: rather than focusing on having fun with your friends or meeting interesting new strangers, the top priority is Don’t Get Poisoned — and that’s even if your social circle understands that you’ve got biological issues and aren’t just trying to be a fad-tastic hipster.

Food allergies have changed the way I tip restaurant staff. Due to the irksome list of things my body rejects, there’s really not much on most menus that I can eat — the result is that my meals are rather repetitive. And what I’ve found is that I really like it when the server figures this out. It assures me that “Don’t Get Poisoned” is under control, so I can have a good meal with friends as if I were a normal person. My iconic instance of this: my friends and I went in while the young waitress was near the end of a very rough shift and looking worn down, dutifully taking the orders with the last reserves of her chipper customer-focus, and when she got to me, instead of saying “Do you know what you want?” she simply said “I know what you want.” Which was amazingly great as far as I’m concerned. Even my mom will occasionally try to poison me, so if a waitress is taking extra care to assure me that I’m not going to be poisoned then I’m certainly going to leave a larger tip.

To put it another way: I’m not paying more to avoid being poisoned (though the gluten-free bun on a burger often does come with a surcharge) because nobody should be poisoned; I’m paying more to not have to think about being poisoned. I’m paying more in gratitude to people who are thinking about how to prevent my cognitive drain.

So now it’s time to discuss the notion of cognitive drain. There’s a short article from Kathy Sierra, or her longer video if you want to learn about how it applies to software development, or her even longer book, but the short of it that’s relevant here is this: every demand on your attention draws from the same limited pool of cognitive resources. Whether we’re holding on to a concern about being poisoned, or exercising the self-control necessary to not yowl at idiocy on parade, or getting distracted by the co-dependently needy notifications of a fussy app, your mental capacity is being diminished.

This plays out with priorities: if you’re focused on being Not Poisoned, then the idiotic buzzing of the phone can bloody well wait, Nick Carr. The immediate priority (No Poison) creates a sort of tunnel vision that crowds out everything else, even when that’s not a good thing to do. Sure, it holds my attention away from the notification that another bot has followed me on Twitter, which is totally fine, but it also distracts me from my friends or from that incoming message from the lady I’m trying to cultivate a relationship with, either of which is clearly more desirable than something we’d all like to take for granted (Not Getting Poisoned). As a bonus, having a clear priority of Not Getting Poisoned also mitigates sexism and lookism when tipping the wait-staff — so let’s hear it for the social progressiveness of… autoimmune digestive disorders?

Anyway, I view the functional value of tunnel vision like a Laffer curve, and if you’ve not been properly introduced to the Laffer curve then I strongly suggest that you go read Jordan Ellenberg’s How Not to Be Wrong because, excepting a bit where he goes off at length on overloading the lottery, it’s the most interesting applied mathematics book I’ve ever seen. But the generalized point about the Laffer curve is that on one side of the curve you’ve got crap and on the other side of the curve you’ve also got crap and the optimum position is between the two sewage troughs. In dealing with focus, one side is distracted by everything (as many people are concerned about, inclusive of Carr, and presumably Postman, and anybody who looks at what colleges are coming to expect from high school juniors and seniors) while the other side is distracted by nothing regardless of how important or monumental it happens to be.

Tangent: To be fair to Carr et al, I do agree that we are faced with an ongoing firehose of irrelevant alerts, but it didn’t start with our phones. I can readily push the blame back to the rise of competitive 24/7 news channels, especially Fox News with their perpetual state of “News Alert.” If you compare what people learn from the Daily Show (1 day roll up) or, better yet, Last Week Tonight (7 day roll up) versus a Fox News Alert (no roll up at all) this should be awfully clear. But the predominant self-victimizing victims of this are the people affluent enough to have the slack to care. (If you want to read about the functional value of slack, Tom DeMarco has a nice little book with the obvious title of Slack.)

But where I’m really headed with this is into the territory of people who don’t have enough slack to afford distraction and a 90-minute lecture on the cognitive effects of poverty:

Now there’s also a New York Times article or a Guardian article, as well as the referenced book, but there are a few key points I’d like to bring out:

  • Our brain will subconsciously fixate on what it thinks we’re lacking, no matter how much we consciously assure ourselves that everything’s under control. This came out of the Minnesota Starvation Experiment. This results in focus capture. Conversely, if you have more than an optimal amount (think Laffer curve here) of a resource, you may have — as far as economists are concerned — insufficient focus on the resource, leading you to undervalue and possibly squander it. In the lecture, the example compared discounts on computers and MP3 players.
  • Focus capture is a cognitive drain. In the lecture, this comes out with the word hunt, but it’s a larger point. The not-actually-copacetic-at-all office worker is drained by diplomatically smiling at the idiot stakeholder, I’m drained by avoiding poisons, poor people are drained by wondering how they can possibly afford to pay for “this” regardless of what “this” is. All of this drains the resources available for doing other things, like doing the best possible work that we’re paid to do, or properly enjoying a stellar meal, or correctly filling out ridiculously obtuse paperwork which may or may not be some behavioral economist’s IQ test. (In aggregate this extends to “the stable home life of the affluent grants them a biological advantage in maintaining focus on the manufactured tedium of school, helping to maintain classism indefinitely.”)
  • The cascade of short-term thinking “errors” caused by focus capture leads to future difficulty. This is an ongoing point in the lecture in terms of loans, especially those with interest rates that may be termed “predatory” by people who can afford to have no part of them. But that’s the basic economic example for poverty. My focus on being Not Poisoned has me reluctant to try new foods regardless of how amazingly great they may turn out to be. And the ironic opposite of poverty has brazen consumerism leaving affluent people wondering how they can maintain all that they’ve acquired, from the too-big house to the too-full closet and even (dare I say it) the too-many children — or, to put it another way, action now commits future resources.
  • Our public policy does a bad job of being fault-tolerant despite the well-known ability of people to make mistakes, especially on obtuse bureaucratic paperwork. This comes into play at the end of the lecture; the articles get to it faster. But my short evaluation of it is this: politicians are intentionally hard on poor people to prevent “cheating the system” and thus (appear to) protect the ostensible resources of the affluent, with the sadistic irony being that the people who have the resources to cheat the system are the ones who aren’t being distracted by the ongoing/perpetual hustle-and-grind of poverty.

But the point that the lecture didn’t get to, and that is easy to hypothesize, is this: Loss Aversion is not a bug in our reasoning. It’s a feature that prevents focus capture. And this would be because an abstract loss from the status quo drops the person from a secure position into resource poverty, necessitating a sub-optimal focusing behavior until the resource has been re-secured. If that behavior pattern is known to be sub-optimal in an “it caused people to walk into a trap” kind of way, then it should also be the kind of behavior pattern that natural selection has discouraged from showing up again and again. Regardless, the economists who chortle in self-satisfaction at the innumeracy of the common populace should be slapped with a fish and then ignored rather than “regarded as clever,” which has tended to be the biopower-augmenting reaction for some time now.

The point is that keeping your brain from suffering the effects of poverty is why loss aversion is a feature of (and not a bug in) our reasoning; promise fulfilled.

But where this really goes is to Fault Tolerance.

  1. Our anti-poverty programs tend to be badly designed. This is documented in the articles above and mentioned in the lecture, and any further coverage at a bullet-point level would be doing it a disservice. But do tip your waitress generously for taking care to not poison you, especially at the end of the day she’s been having.
  2. The disjointed nature of our educational curriculum, particularly for high schoolers, is stupefying, and it has been for generations now. Context switching between the intimately-related mathematics and physics (to pick on the most obvious example) is an utter waste of kids’ cognitive capacity that could be applied to both concurrently. The artificial isolation of subjects tries to push even our most stable students into brief lanes of tunnel vision while ignoring the amount of focus captured by everything else going on in the rest of their day and then criticizes their performance. (An obvious path to investigate would be collapsing the not-uncommon block schedule slightly: half a day for math and science, half a day for problem-solving/engineering and technology; alternate day mixes history and literature for one half and art and civics for the other. Reducing the loss due to switching is good for cognitive resources; blurring the testability of material is good for fault-tolerance.)
  3. But let’s get personal: if you’re not a politician or school reformer or even necessarily poor, you may still be wasting resources on something as basic as getting dressed. Really, why did modern-day Faustian genius Steve Jobs always wear a black turtleneck? So that he wouldn’t squander his genius on deciding what to wear. That’s it, that’s the reason. Timoni West has a super-great article on how to populate a wardrobe with clothes you don’t have to think about, and I support it immensely in the same way I support Tina Fey and am gobsmacked that any press would criticize an economist over the simplicity of their fashion choices. Regardless, simplifying your wardrobe will save your decision-making capacity for when you need it, but it will also make it fault-tolerant by ensuring that you don’t don a fashion faux pas while only half-awake: win-win!

While this topic is vast because it goes into the core of reconciling our current actions to our future outcomes, both with underestimated commitments and unintended consequences, I’m short on points that I want to spend time making at this juncture. You should go get dressed and I should go get Not Poisoned.

Moral Panic, Nostalgia, and Phoning It In

I’m going to try to keep this short because I’ve already spent too long watching a Public Forum debate in the wild. The topic was (over)simply “Smart Technology is Making Us Dumb.” And as far as I’m concerned, the affirmative team lost in multiple ways.

Now I might forgive them their squishy failure to define terms. In a public forum, they’re appealing to the biases of what people can see: smart technology is the phone that the kid is texting on, dumb is the kid texting on the phone walking out into traffic. The topic is loaded with implicit value judgments about smart and dumb as Dr. Bell (on the negative) correctly observes, with a definite pejorative feeling about what dumb is. But then both of the speakers on the affirmative reveal that they’re carrying these menacing phones of stupidity-inducing doom as well, so we really shouldn’t trust them: either they’re obviously dumb or they’re lying to us about the effects of the technology that they’re using. Similarly: did the Google engineers Nick cites arrive at a spiritual awakening about the folly of their career path, or were the neuroscientists shocked into changing their relationship to technology? Such ripe possibilities go unmentioned.

But Nick wants us to believe that we’re becoming dumb because of all the notifications our phones give us that compel us to respond to them. Only what Nick is describing isn’t smart technology — it’s stupid technology. If it lacks the common sense (as per the definition of “stupid”) to know that it shouldn’t interrupt now, then it’s a stupid technology. And yes, the part of Android 5.0 where you can no longer easily drop your phone into STFU mode made Android phones stupid. But the notifications are allegedly symptomatic of Information Overload, which is non-unique if not actively untrue. And even if it were true and unique to the loss of gatekeeping on bloggers like me (never mind the explosion of channels on cable TV or the explosion of books due to that nefarious printing press), it would merely indicate that people aren’t very good at being their own gatekeepers.

And this goes to the main point of Nick’s partner, Andrew Keen, who vitriolically objects to the noises that the proles are making with their smart technology. He apparently pines for the days when the mass of men lived lives of quiet desperation instead of the noisy vulgarity they’re subjecting him to now (due to his lack of filters). But this is the affirmative’s false nostalgia, wishing for wise people thinking deep thoughts. The final audience question (in short: people without smartphones don’t sit around discussing philosophy or astrophysics; they go out dancing… so what’s your definition of “making”?) cleverly crystallizes the counter-point: the so-called smart technology is merely allowing more people to express their pre-existing vulgarity. Indeed, as Bruce Feiler reported in the New York Times,

A growing body of research indicates how deeply our brains are wired to seek social approval. A study out of Harvard in 2012 showed that humans devote up to 40 percent of our time to self-disclosure, and doing so is as pleasurable as having food or sex… Matthew D. Lieberman, a professor of psychology at U.C.L.A. and the author of “Social: Why Our Brains Are Wired to Connect,” told me that this need for positive social interaction is hardly new. “It’s been there in one form or another since before the dinosaurs 250 million years ago,” he said.

It isn’t making us smarter, but it’s not necessarily making us (in social aggregate) dumber, either, especially if we use the actual definition of “dumb” which is “mute; unable to speak.” Bonus points if you noticed that “how deeply our brains are wired” directly clashes with Nick Carr’s predominant concern about our brains being melted down into grey goo.

The affirmative concludes with Plato’s Allegory of the Cave, presuming to tell us that we are the prisoners in the cave, shackled there by our mobile phones. Except that they were also carrying their own shiny new cave-shackles, ergo we should not defer to their assertions. And do bear in mind that Plato’s elitism vis-a-vis the cave dwellers reflected his elitism vis-a-vis the majority of his society that were women and slaves. Indeed, it was expected that the women and slaves would never leave their small station and menial concerns to engage with the larger world, and only the great men who had seen the world could protect the state and its small-concerned inhabitants from it. But what we’ve seen is that Plato’s intentions are not reflected in modern democracy with (near) universal suffrage; that even if they were, most people have maintained a small sphere of actionable concern such that technological advances across thousands of years have changed very little about the common human condition; and that, on the other side of the social spectrum, being a governing elite in no way demonstrates the big-picture mentality the affirmative might expect it to, as demonstrated by, for example, their aggressively pipsqueak denial of climatic changes.

So for all the moral panic and faux-nostalgia the smartphone-indulging affirmative can conjure in the vein of “In the Old World, we were (Philosopher) Kings!,” the majority of their assertions are ill-based, misguided, or simply non-unique. To sum up the counter-position with the words of another Philosopher-King,

What has been will be again,
what has been done will be done again;
there is nothing new under the sun…
What is crooked cannot be straightened;
what is lacking cannot be counted…
Do not say, “Why were the old days better than these?”
For it is not wise to ask such questions.
(Ecclesiastes 1:9, 1:15 & 7:10)

On Getting Schooled

“In challenging times such as these, there is really no use in pretending all of our children represent the best we have to offer, because in every conceivable sense, that just isn’t the case,” said Education Secretary Arne Duncan, adding that operating under this delusion had cost the nation untold trillions of dollars. (The Onion, March 7, 2012)

I should have posted the negative material against “Resolved: In the United States, students should be guaranteed two years of free tuition to a community or technical college” the other day, but here it is now, really years overdue.

Please be reminded that I’m not a public schoolteacher, I just hang out with teachers an awful lot and am constantly reminded of how bad I would be in their career. While Camus admired actors for portraying the Absurd on stage, I admire public schoolteachers for living it as a career. A side-effect of this, however, is that my writing on the subject tends towards the dismal and veers away from any brief existential sparks of joy that briefly illuminate their systemic gloom. So, dear anonymous students, do your teachers a colossal favor and take them seriously. They’re barely compensated enough to oversee your barely-competent behavior, not compensated enough to correct your almost-cleverness or abject incompetence, and never compensated enough to deal with your bullshit.

Now that I’ve got that out of the way, let’s explain why tuition-free community college is a breathtakingly dumb idea.

The idea that students should be guaranteed 2 years of junior college takes everything you don’t like about public schooling and makes it last even longer.

Contention 1: School insulates, college exacerbates

It wasn’t until the post-World War 2 economic boom — my parents’ generation, really — that the United States could afford to leave most of its youth in school for 12-or-more years because, as the educators at Glogster note, “more people had the means to stay in school, and the states had increasingly more money to pay for it.”

But there’s a grievous down-side to this development, as venture capitalist Paul Graham explains in his essay “Why Nerds are Unpopular”:

School is a strange, artificial thing, half sterile and half feral. [ … And … ] If life seems awful to kids, it’s neither because hormones are turning you all into monsters (as your parents believe), nor because life actually is awful (as you believe). It’s because the adults, who no longer have any economic use for you, have abandoned you to spend years cooped up together with nothing real to do. Any society of that type is awful to live in.

This situation has only been exacerbated by high schools dropping expensive, real-world, trade-skill-oriented metal and wood shop classes — as Dr. Crawford documents in his book Shop Class as Soulcraft — to instead double down on the collegiate focus, as evidenced by the distressingly deterministic mantra that “College begins with Kindergarten,” a mantra fulfilled by a resolution that ostensibly aims to make 2 years of community college an extension of 13 years of public education.

Noting that High School only became common when we didn’t have as much economic demand for young workers, do we think that this push for Community College is to skill up workers, or to tamp down the size of the labor force? The Bureau of Labor Statistics reports that last year, only “51.9 percent of young people [16-24] were employed in July.” (“The month of July typically is the summertime peak in youth employment”) That’s barely more than half during a solid economic recovery… so we have to keep the other half occupied in the traditional way: doing homework.

So the push for free or subsidized continued education

  1. is caused by an over-supply of labor in an increasingly automated economy,
  2. causes the superfluity of education to fetishize itself as its own objective, and
  3. functionally stunts the ability of adults to take their socioeconomic position in civilization.

Contention 2: Publicizing College Ignores & Exacerbates Achievement Gap

Contrary to how much the United States loves its titans of industry, we don’t do much to support our most talented and gifted students, as the Davidson Institute reports in their book Genius Denied — instead we worry about an Achievement Gap that develops during students’ time in school, and we spent over $11B on special education for people with learning disabilities (2006) to mitigate that gap while special education for students with learning super-abilities goes generally unfunded.

This isn’t to say that abnormally advanced students are universally ignored. Indeed, even in Oregon at least the Beaverton and Albany school districts help their advanced high schoolers cross-apply community college credit that’s being counted towards an associate’s degree to their high school requirements as well — and that tuition is paid by the school district. This is in the status quo, and doesn’t even look at commonly-available AP credits that typically count for college credit, and certainly stops short of the online courses available for free to anybody with adequate initiative through Coursera.

So when the affirmative is arguing for 2 years of free community college that’s different from the status quo, they’re really short-changing communal support for our top-performing students even more while simultaneously ignoring the fact that the achievement gap — entrenched in the curricular difference between advanced placement and remedial courses in high school — would essentially be given a longer run-time in public education to develop.

This has three effects:

  1. the rarity of public support exacerbates the tendency of our top students to become elitist jerks buying into the mythology of the self-made person with no sense of noblesse oblige — this is status quo, but worsened with the aff — because
  2. the aff substantially increases the visibly disproportionate amount of public resources allocated to mediocre students in the hope that they’ll somehow become economically viable,
  3. despite not actually closing the achievement gap and instead passing it on to community colleges to try to handle.

Overall, there is a plethora of policy options for improving the 13-year system we’ve got that don’t involve keeping our kids mired in it for another 2 years.

And that’s the basic case that runs about 3:54 at a solid clip.

But there’s a lot more going on behind the scenes with education that you should be dialed in on if you want to speak competently to why the resolution is a bad sort of idea.

First, here’s the specific evidence on the Achievement Gap:

Decades of research confirm that summer learning loss is real. According to a report released last month by the RAND Corporation, the average summer learning loss in math and reading for American students amounts to one month per year. More troubling is that it disproportionately affects low-income students: they lose two months of reading skills, while their higher-income peers — whose parents can send them to enriching camps, take them on educational vacations and surround them with books during the summer — make slight gains. A study from Johns Hopkins University of students in Baltimore found that about two-thirds of the achievement gap between lower- and higher-income ninth graders could be explained by summer learning loss during the elementary school years.

The obvious way to try to avoid summer learning loss is to break up summer vacation into smaller chunks and attach them to Winter and Spring breaks and the like. It’s not the amount of time off but the continuity of it that is implicated here. We could get over half of the benefit of the affirmative position with no new costs and no new time spent if we just optimized our existing system for retention — though this may require community colleges to re-tool their course offerings to stay relevant in the future.

The point behind the Achievement Gap is simple: the resolution mis-diagnoses the harms entrenched in the current education system and therefore declares “Moar Education!” to be the affirmative solution.

First-and-a-half, the Achievement Gap is undermining the value of time spent in college by leaving students unprepared for college-level work:

  1. Here’s the general trend:

    Every year in the United States, nearly 60% of first-year college students discover that, despite being fully eligible to attend college, they are not ready for postsecondary studies. After enrolling, these students learn that they must take remedial courses in English or mathematics, which do not earn college credits. This gap between college eligibility and college readiness has attracted much attention in the last decade, yet it persists unabated. While access to college remains a major challenge, states have been much more successful in getting students into college than in providing them with the knowledge and skills needed to complete certificates or degrees.

  2. And specific to Community College:

    College students are increasingly spending federal financial aid and taking on debt for high school-level courses that don’t count toward a degree, despite mounting evidence the courses are ineffective and may contribute to higher dropout rates. … The trends reflect a sharp rise over the past decade in enrollment at community colleges, which disproportionately serve low-income, minority and older populations. About 40% of students entering community colleges enroll in at least one remedial course, according to the Education Department; only about 1 in 4 of them will earn a degree or certificate.

And this not only highlights that students are graduating from high school unprepared for college (which began with kindergarten!) but also that the time-binding of the resolution to “2 years” doesn’t really attach to any consistent degree of merit: people who were over-achieving in high school won’t use 2 years while the under-achievers taking remedial courses to catch up will find it nigh-impossible to get their 2-year degree in a mere 2 years.

Next, let’s handle the common claim that education is a fountain of never-ending economic goodness, to which we’ve got multiple responses:

  1. It is estimated that within 18 years, 45% of jobs will be vulnerable to automation. That’s an average of 2.5% of jobs per year, year after grinding year. Credentialed educational institutions will not be able to re-tool fast enough to keep their course catalogs economically relevant as they’ll be competing with the growing sector of the economy for instructors. (Pin that thought, we’ll come back to it.)
  2. But even now education is proving none-too-valuable in terms of raw ROI: “The total outstanding student loan balance is $1.08 trillion, and a whopping 11.5% of it is 90+ days delinquent or in default. That’s the highest delinquency rate among all forms of debt and the only one that’s been on the rise consistently since 2003.” If education were a reliable economic benefit, then we should see student loan debt shrinking, and being the least prone to default because of all the high-paying jobs people are able to get when they’re done being students. But we don’t.
  3. And we don’t because wages for college-educated people have been falling since the turn of the century, clearly showing that economics is about power more than education.

    If inequality was really about education, we would expect to see income growth sorted by education level: wages declining for workers who never made it past high school but rising for those who finished college. But as the graph [of median income for men with bachelor’s degrees, declining 8% since the turn of the century] above shows, that’s not the pattern. Rather, wages have stagnated for college graduates, too. What separates the members of the 99th percentile from their friends in the 91st percentile isn’t a college education. … But there’s a reason Washington prefers talking about education than power. If the answer to inequality is simply more education, then that’s relatively easy: most everyone agrees, conceptually at least, that a better education system would be better. But if the answer to inequality is redistributing economic power, well, that’s more controversial — particularly among those who currently hold the power.

  4. This should surprise us not-at-all when we look at the sorry state of college professors in the modern Community College (or even your public schoolteachers, pretty much all of whom have more formal education than I do).

    Adjunct professors scraping by on assistance from family, charities, and safety net programs like Medicaid and food stamps continue to push for fair compensation and work conditions. Higher education institutions across Colorado employ part-time faculty, but adjuncts in community colleges say their situation is particularly dire. Adjuncts currently represent 4,060 employees, or 78 percent of instructors at the 13 colleges in the Colorado Community College System, and are paid per class, largely without benefits, sick leave or job security.

And this not only shows that money doesn’t necessarily follow education, but also shows that educational institutions — especially those reliant on scarce public funding — will be hard-pressed to pick up support from people who are in-demand with and therefore well-compensated by the remaining portion of the economy. (Even if they could pull this off, the sheer tumult in the economy suggests that people who use educational institutions to acquire new skill-sets may well return time and again over the course of their life, spending well over the 2 years afforded by the resolution over the course of their lifetime.)
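For debaters who want to sanity-check the quoted figures before deploying them in a round, the arithmetic behind the automation and student-debt evidence above is simple enough to verify (a minimal sketch; the 45%-within-18-years and $1.08 trillion / 11.5% numbers come straight from the cited evidence):

```python
# Sanity-check the arithmetic behind the automation and student-debt evidence.

# Automation: 45% of jobs estimated vulnerable within 18 years.
vulnerable_share = 0.45
horizon_years = 18
per_year = vulnerable_share / horizon_years
print(f"jobs exposed per year: {per_year:.1%}")  # 2.5%, year after grinding year

# Student debt: $1.08 trillion outstanding, 11.5% of it 90+ days delinquent.
outstanding_usd = 1.08e12
delinquency_rate = 0.115
delinquent_usd = outstanding_usd * delinquency_rate
print(f"seriously delinquent balance: ${delinquent_usd / 1e9:.0f} billion")  # ~$124 billion
```

Nothing fancy, but being able to say “that’s roughly $124 billion in seriously delinquent student debt” is more vivid in a speech than quoting the percentage alone.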

But while we’re looking forward to the future of work, let’s also look forward to the future of our public schoolteachers, because it’s pretty grim:

This is the canary in the coal mine. Several big states have seen alarming drops in enrollment at teacher training programs. The numbers are grim among some of the nation’s largest producers of new teachers: In California, enrollment is down 53 percent over the past five years. It’s down sharply in New York and Texas as well. In North Carolina, enrollment is down nearly 20 percent in three years. … There are, of course, alternative teacher certification programs across the U.S. including Teach for America. But TFA, too, has seen large drops in enrollment over the past two years. … [Bill] McDiarmid [dean of the University of North Carolina School of Education] points to the strengthening U.S. economy and the erosion of teaching’s image as a stable career. There’s a growing sense, he says, that K-12 teachers simply have less control over their professional lives in an increasingly bitter, politicized environment.

So what we’re seeing is that, bluntly, increasingly shitty working conditions, stagnant pay, and being used as political props (in the role of either martyr or covetous villain) have made becoming a front-line teacher a bad idea. What this means is that the K-12 base of our education system will disintegrate at an accelerating rate for lack of fresh blood in the system as old teachers retire or are ground down into attrition statistics, unless/until we address the career satisfaction of our professional educators — and this is looking forward from the status quo achievement gap and all that it entails. When faced with these problems, fussing about free community college is like ensuring that there are enough parachutes on the Titanic.

If you want more on this, go check out the negative case from the last time the PF framers suggested an educational reform.

On Not Writing Sample Cases

It’s that time of the year for the nationals-qualifying tournament, so I’m not going to tell you what I told my students — or at least I’ll tell you very little of it.

Given the Public Forum resolution of “In the United States, students should be guaranteed two years of free tuition to a community or technical college,” the most common difficulty that’s going to be faced is that the speculative cost figures generated to go with President Obama’s similar proposal don’t necessarily match the resolution. So the affirmative should speak first and, ideally, limit the scope of the plan to Obama’s proposal, which crimps multiple lines of negative argumentation. If the negative speaks first, they could block this with a wider definition — so long as they’re willing to forego cost evidence that was based on a different scope. Bad angles on the topic (for the negative) include: arguing that paying tuition is what makes college students students and not just visiting randos, so the resolution contradicts itself; or that “free” implies that nobody pays and is distinctly different from “subsidized,” in which a 3rd party (typically government) pays, with that distinction scuttling the whole notion of tuition in a debate format that doesn’t allow for plans.

Regardless, I’m mildly optimistic that this topic will prompt at least some kids into researching how they can use community college credits to satisfy high school graduation requirements and then turn right around and transfer them to a 4-year university and count them again as credit towards their intended degree, as that’s the kind of cleverness my brother had that I… uh… totally slacked out on but shouldn’t have. (In Oregon, top students in Beaverton and Albany and perhaps some other school districts can actually get their associate’s degree while still in High School due to clever coordination with the local community college; talk to your academic counselor!)

Given the LD resolution of “Just governments ought to ensure food security for their citizens,” this totally lends itself to the same damned forced-value, implausible-state, and grammar-temporal critiques I’ve previously leveled against this imbecilic formula. The crucial difference with this resolution is how keen governments of all sorts are to provide food security for their citizens, which, even more than a “living” wage, lends itself to a critique of invasive bio-power. But beyond that: if the government loses the condition of being “just” in the quite-common pursuit of food security, then it almost certainly negates the resolution, as justice is always an implicit value (at least for the affirmative) that topically links to the obligation of food security, unless the affirmative can instead show that a just government is horrible and must therefore become unjust posthaste in pursuit of a better value than justice. Sound strange? I could probably make it work… and I’d probably start with Albert Camus: “I believe in justice, but I’ll defend my mother before justice.”

50 Stockholms of Syndrome

Please note that this is an amoral thought experiment which goes from the wretched in carnality to the shameless in jokes, and probably vice-versa too. While it is intended to provoke thought through its arrangement of artifacts, it is not intended to be taken too seriously. After all, “Sex is boring.”

“Lackluster take on best seller is way too graphic for kids.” –Sandie Chen, reviewing 50 Shades of Grey film

It was startling to realize that 50 Shades of Grey really is a children’s story. More precisely: 50 Shades of Grey, Twilight, and Beauty and the Beast are all fundamentally the same story, with the targeted age group being proportional to the apparent humanity of the dude involved. Beast, being clearly not human, is targeted at children who can be reassured that actual people they see are clearly not like that. But as the monsters get better at passing for human — vampires, werewolves, billionaires — the targeted age demographic increases. But this particular fairy tale plays off the archetypal promise that a woman can leave the safety of her family/familiars, be subsumed into the domain of an unrestrained and overwhelming foreign power-as-force embodied by her man/men with their possessions, and through passive perseverance (an expression of power that does not functionally compete with the masculine unrestrained power-as-force) gain control of the masculine power-as-force for the benefit of everybody. That’s certainly how Beauty and the Beast plays out, I’ve heard that that’s how Twilight plays out, and that’s how 50 Shades of Grey promises to play out in the movie trailers whether it actually does or not.

That’s also how the book of Esther in the Bible plays out — in which the pretty princess seduces the fearsome foreign despot and so convinces him to not commit genocide against her tribe — and when observant Jews celebrate Purim, they’re basically reading from 50 Shades of Goy.

But in order to understand why a pulpy bit of trash like 50 Shades of Grey developed critical popular mass while all manner of other tawdry romance novels fail to do so, we have to look both at the archetypes 50 Shades of Grey is exploiting and at where we’re at in society.

First, we have to understand that everybody loves Stockholm syndrome. I mean, sure, it’s got some difficult bits that seem really horrible — but it’s actually quite sweet when you get to know it, and, in the abstract, it makes for the best survival strategy when you’re conquered by an unrestrained and overwhelming force. And if you go back in time and look at where babies came from during the growth and subsequent collapse of — just as an example — the Roman Empire, you’d find that a lot of kids were born to mothers who were doing the best they could to survive knowing that they’d never see the home they knew so well ever again. And the mothers who were successful at surviving and, to a certain extent, thriving under vicious warlords passed those tricks and personality traits on to their children — who became our ancestors. Prior to that in Greece (as Foucault records in The History of Sexuality, Volume 2) all the slaves in a Grecian house were the physical property of the man of the house: lustful screwings were well within his expected behavior — so long as he didn’t acknowledge actual love for his slaves or any accidental offspring, anyway. And before that, back in the book of Genesis, Sarah “gave” Hagar to Abraham specifically to make a baby with because Hagar was a slave so it was, you know, totally okay. In more recent times, it turns out that a lot of Black Americans have a lot of White DNA in them from 19th century plantation owners continuing the, uh, tradition. And all of this is both horribly sexist and flatly horrifying. But the point is that our ancestors are used to accommodating overwhelming power as a means of survival. Even if we’ve not personally done it — or our ego has reshaped the memory of doing it to something less shameful — the experience is in our blood.
All of the fairy tales suggest that we’re children of royalty, that we’re princesses and such: well, we probably are sort-of children of the rare royalty, but we’re not the rare princesses — we’re the common and disavowed “and such.” And so, as crazy-unhealthy as it rationally seems, particularly in modern times, the story of a woman falling in love with an obviously predatory sociopath also seems natural and expected to us.

If we go back even into the pure mythology of Gilgamesh, Shamhat went and passively laid herself out to get fucked by the beast Enkidu for seven solid days until he was tame enough to be a civilized man… where “tame” meant picking a fight with the violent and powerful king and then becoming his best buddy and going on adventures with him until he offended the gods and was cursed to death. Really, “tame” isn’t used as a pejorative here so much as to indicate “has learned the value of a mattress.” But the point is that not only is the survival strategy common throughout history, but the (sexist and heteronormative) idea of feminine perseverance to gain control over masculine power-as-force has been with our species for as long as we’ve been literate.

Sidebar: There’s an unusual sex-inversion of this in The Odyssey where the demigoddess Kalypso has rescued Odysseus from drowning and he’s sleeping with her only “out of necessity.” But after Kalypso has promised to send Odysseus home (because the stupid boy-gods get super-violently jealous when lady-divinities start loving up mortal dudes) if he wants to go, and Odysseus reassures her of his ongoing love for his mere-mortal wife despite Kalypso’s obvious divine superiority… they go and have the gratuitously hottest sex they’ve ever had. Go figure. The strange thing about Kalypso is that The Odyssey seems to regard her favorably despite her utter disregard for Odysseus’s feelings, as if her divinity makes her assertion that she loves him true even though he’s sitting out on the beach crying in homesick despair day after day; her elevated status makes her good intentions of love and immortality unassailable despite the obvious cost in the lead protagonist’s misery.

Now one of the more pernicious things our subconscious does, per Dr. Carl Jung, is to look for ways to compensate for a lack of what our brain considers natural, giving itself leeway on actually having to adjust its definition of normality. When our primal behaviors are suppressed or ingrained expectations go unfulfilled in one way, they’re likely to show up only slightly transmuted elsewhere, often in an embarrassingly irrational way. Lacking an engageable power-as-force in waking life, the power-as-force is manifested in fantasy. In this case, it has had the good fortune to take the field as people “grew out of” Twilight, which hit the people who loved Disney’s Beauty and the Beast before it. (But, as a reality check, where’s the overlap between Twilight fans and Interview with the Vampire fans? I propose that there is practically none, and not just because Twilight takes an undead dump on centuries of folklore but rather because Interview isn’t a variation on the Beauty and the Beast story.) The pattern is shared elsewhere: despite having had their first books published at practically the same time, Game of Thrones only became culturally relevant after the conclusion of Harry Potter got people thinking “Yeah, I liked that — but it needed more sex and death” to compensate for their primally ingrained expectations of adulthood that aren’t meshing with our lengthening lifespans.

But there’s also an aspect of sex that we’ve practically lost that the modern uptick in BDSM compensates for: ritualized courtship and foreplay. We’ve got the obvious report that 50 Shades of Grey increases the practice of foreplay and sexual communication and the report that says actual BDSMers say “WTF?” But I think the better comparison is between the vulgar, spatially arbitrary debauchery people report engaging in on Twitter feeds like “Tinderfessions” — okay, moral judgment: I’m not linking to Tinderfessions — and the meticulously maintained and functionally sanctified (that is, set apart for a single purpose and approached in a specific mindset) “playroom.” The re-cloistering of sexual intercourse returns the primacy of discernment to its practice that was de-valued as dominance over our reproductive functionality shook the illusions of the significance of sexual intercourse. Indeed, the disciplinary aspect brings back what may be imagined as the penitent attitude regarding carnal guilt, with the long-running sensual (rather than sexual) aspects of the performance — and certainly the (faux-)documentation of the practice in 50 Shades of Grey — compensating for the old monastic habit of gaining dominance over carnal desire by suspending it in and transmuting it to the language of confession, as Foucault documented in The History of Sexuality Volume 1. As the old joke goes, the young man goes into confession and admits “I have had lustful thoughts about a woman,” to which the priest replies, “ah, that is common — but did you entertain these thoughts?” to which the young man replies “No father — but they certainly entertained me.” The monastic practice was to address lustful urges with ritualized discourse in pursuit of conscious mastery. The masters of BDSM use ritualized play in pursuit of performative mastery. The Tinder crowd just swipes right.

So to recap: given the history of our species in sex, violence, and violent sex, all of which appear to be on the decline, we enjoy increasingly sex-and-death oriented entertainments to reassure ourselves that our “jungle surplus hardware” isn’t mis-evolved for the safe and fairly sterile civilization we’ve built ourselves into (even though it pretty much is), and that’s why old stories get remixed from generation to generation even though they’re insensibly horrifying to modern thought.