Thursday, December 20, 2012

Mass shootings, terrorism, and Columbine

Brief followup to the last post: via cshalizi's pinboard, here's an article that taught me something new about Columbine:

[In his book Columbine, author Dave] Cullen reconstructed the shooters' state of mind based on their own extensive written and filmed records and countless interviews with friends, family, and law enforcement. He concluded that Eric Harris was a psychopath, a young man without empathy or conscience, who coldly manipulated the deeply depressed Dylan Klebold into joining his scheme.

It was clear from the boys' meticulously documented plans that Columbine was an act of non-ideological domestic terrorism. Their goal was not merely to shoot bullies. They sought to lure first responders and parents with a mass shooting and then blow everyone up with huge bombs. Harris hoped this spectacular televised violence would touch off some sort of revolution. The bombs failed to detonate, but the intent was clear.

If this is correct, then the Columbine shootings were in fact terrorism. It was violence staged against civilians specifically to generate an emotional reaction and an attendant policy reaction. However ill-conceived, weird, and amateurish, it appears to meet the definition.


Addendum: Incidentally this does not, of course, indicate that we should deal with terrorists of all types in the same way. The wall-to-wall media coverage of Columbine clearly played into the first half of the killers' plans. The most effective way to defeat this particular pair of terrorists would have been for the national media to ignore them entirely. The shooting should have been a purely local news story, focusing on memorializing the victims, and coverage of the killers should have been confined to wonky academic criminal science case studies.

Saturday, December 15, 2012

Are "lone gunman" school shootings terrorism?

I try not to take Twitter seriously, because I find its format frustratingly hostile to clear thinking, but once in a while something provokes me enough that I have to write.

Tweet by Jim Ray @jimray: @_ToTheLost you're right, people who go on mass murdering shooting sprees are also terrorists. Thank you for helping me clarify that.

People who think the Connecticut school shooting is "terrorism" are merely living evidence that our discourse has completely degraded the meaning of the term.

Terrorism is not a synonym for "violence you don't like". Terrorism is violence against civilians calculated to elicit an emotional and policy reaction. Properly understood, terrorism is a tactic of asymmetric warfare, no more and no less. It is not terrorism when a lunatic kills people because they are conveniently available targets, and then commits suicide for no particular reason. (Terrorist bombers often commit suicide, but only because it is necessary to deliver the bomb successfully without being captured and interrogated; if they could deliver the same bomb to the same target without risk of capture or death, they generally would.)

Incidentally, for the same reason, the United States' drone strikes are not terrorism either. The US does not perform drone strikes to provoke the Pakistani population into a policy response. It performs drone strikes mostly for the simple utilitarian reason that it wants certain people dead, and (international law be damned) those people shall therefore be killed.

But this is a language game, and I am playing the role of the futile prescriptivist holding the line against a huge tide of people who would prefer that "terrorism" simply be a synonym for "doubleplus-ungood". A perfectly accurate phrase like "heinous criminal violence" is simply not enough for these people: they demand that the T-word be deployed. And in the end, descriptivism is the correct school of linguistics, and thus in the long run I will inevitably be wrong. Eventually, terrorism will be a synonym for "doubleplus-ungood" and a useful tool of thought will have been blunted into uselessness, like a scalpel bashed repeatedly against a brick wall.

This reminds me of how we now call everything "war" — war on drugs, war on poverty, war on women, war on Christmas — except when we pay our armed forces to shoot at another nation's people, in which case we call it "kinetic military action" or whatever.

Does this matter? I can't say for certain, but it seems to me that our national policy response to terrorism is muddled in part because our thinking about terrorism is muddled. I claim you can draw a connection between people who throw around "terrorism" with such carelessness, and the ridiculous anti-terrorism policy of the Bush years, when we reacted to terrorism by invading a country that had not committed terrorist acts against us, sacrificed our liberties to defend our "freedom", and spent untold blood and treasure to prevent terrorists from achieving their goal of provoking America into losing blood and treasure.

Sunday, July 22, 2012

On the nature of software ownership, yet again

Being a Google employee, I shouldn't comment on the specific case, but I do think this is an opportune time to repeat that by adopting software, you make a calculated bet on the future behavior of a group of people. And sometimes those people behave differently than you hoped.

There is no magic bullet that will make software developers behave the way you want indefinitely. Pay for the software with cash, pay for it with attention, pay for it with a gift economy involving the mutual exchange of labor, don't pay for it at all; make it open source, make it proprietary; eventually you will always be at the mercy of developers whose incentives converge only imperfectly with your particular desires.

Well, I should amend that: if you really really want to control your destiny, you can dedicate your life to building a software ecosystem where every user has both the freedom and the burden of control. This requires much more than open source; it requires the embrace of both Free Software and software systems designed to be just as hackable as they are usable — perhaps even more hackable than they are usable. Basically, a lot of people need to (re)read In the Beginning was the Command Line, and the story of Richard Stallman and the printer. And maybe, overall, be a little less sniffy about the more "extreme" (that is, principled and consistent) versions of the Free Software credo.

Monday, June 25, 2012

Evolutionary psychology and philosophers: An anecdote

N. Strohminger's review of philosopher C. McGinn's recent book on disgust (via cshalizi's pinboard) is hugely amusing, and reminds me of something that happened when I was a freshman in college.

I was attending a talk by someone in the NYU philosophy department, and the subject was the almost universal (human) fear of death. Towards the end of a long disquisition by the speaker on why the fear of death was puzzling (for example, death itself is not painful, and dying need not be), and philosophical explanations for why it might exist regardless, I raised my hand and asked: "Since we were produced by evolution, wouldn't a fear of death be selected for, because a fear of death would make us more likely to avoid it, and organisms that avoid death are likely to be more successful?"

This seemed to me like a curious theory to have omitted entirely from the talk. At the time, I had recently read R. Dawkins's The Selfish Gene, so evolutionary explanations for our behavior were on my mind. And also, it seems to align with common sense.

However, a reader of a certain background might notice that my explanation was an invocation of evolutionary psychology. I had, unwittingly, landed in a minefield on the scholarly battlefront between evolutionary psychology and its critics. So the speaker replied, roughly: "Well, evolution might select for avoiding death, but it wouldn't necessarily select for fear. I would also direct you to [list of scholars' names here], who have produced devastating critiques of evolutionary psychology." He then breezily moved on, evidently feeling that he could now ignore the idea, and evolution was not mentioned further.

I was so astonished by this response that I couldn't even muster a follow-up question. At the time I thought: is this philosopher seriously claiming that evolution could play no role in our fear of death? Could there be any psychological phenomenon which is more likely to be evolutionarily motivated? (Maybe love for one's children?) The speaker's reply correctly noted that evolution did not necessarily entail a fear of death, but it certainly could be sufficient explanation, and under an evolutionary account there might simply be no necessary explanation, only a sufficient one: the fact that death-avoidance is motivated by fear, rather than some other mechanism, might be no more inevitable than the fact that we have five fingers per hand rather than four. (Or, if you prefer, its necessity might be an artifact of the adaptive mechanisms that were at hand when the selective pressure was applied, rather than an a priori truth about the nature of the mind.)

I don't remember the content of the rest of the talk, but if you've ever read a work of scholarly philosophy you should be familiar with the style: a lot of precise reasoning from first principles (which I love), coupled with an almost disdainful elision of the empirical (which I do not).

At any rate, it seems to me C. McGinn's bizarre ideas about disgust, and total ignorance of the modern scientific literature thereof, share something in common with the NYU speaker's dismissal of my question. There is a certain strain in philosophy which doesn't particularly like science. Science offers a mode of investigation of such staggering power that it threatens the philosophical discipline. When you like sitting on the metaphorical mountaintop, pondering from first principles about morality, or aesthetics, or emotion, it can be a rude awakening to learn that some yob with electrodes or a pipette or a computer might be able to probe deeper and more certainly into reality than you can — even those aspects of reality which you thought were your exclusive domain.

To muddle things up further, a great deal of evolutionary psychology is terrible science, and the worthy interdisciplinary project of battling that nonsense provides cover for this brand of intellectual anti-intellectualism.

Truthfully, the episode with the seminar speaker above left me more sympathetic to evolutionary psychology, not less, so whatever effect he wished to have on me was completely thwarted. I am not by nature contrarian or perverse for its own sake; in fact I think that contrarianism without regard for correctness is one of the most pernicious and irresponsible seductions that clever people allow themselves to fall prey to; so my sympathies are not merely a reaction against misguided authority. I suggest in all earnestness that the next time you read some hilarious evisceration of embarrassingly popular bad evolutionary psychology, recall Strohminger's review and consider that there's plenty of evisceration to be done on the other side as well.

Monday, June 18, 2012

D. Lowery on copyright infringement: How to alienate potential allies through intellectual dishonesty, illustrated by example

Young people's cavalier attitude towards copyright infringement is maddening, but this essay by D. Lowery comprehensively misunderstands what the Free Culture movement stands for. The guy repeatedly slams Free Culture but has apparently read neither Lessig's book Free Culture, which gives the movement its name, nor the Creative Commons website, even though he links to a PDF of CC's tax filing. This intellectually irresponsible ignorance makes me disinclined to be sympathetic to him.

Furthermore Lowery seems to have no conception whatsoever of:

  • The purpose of copyright law: viz., to ensure the production of the creative and useful arts. Note that there seems to be just a bit of culture being produced on the Internet these days, so the burden of proof is on those who want to lock the Internet down.
  • The costs and unintended consequences of reorganizing society in the ways that he prefers — so that service providers must implement comprehensive preemptive review of what is posted, or so that it would be dramatically easier for people to issue takedown notices on content published on the Internet. For example, Lowery's blog is hosted on Wordpress.com; comprehensively applied human review as a mandatory requirement for hosting service providers would have made Wordpress impossible in its current form; and dramatically easier takedown would make it trivial for a random troll to censor Lowery's blog post in a fit of pique.

...among other things.

Note that I buy or rent all my music and movies.  Almost nobody else I know does this consistently.  I did not illegally download the second season of Game of Thrones; I am waiting for it to come out on disc.  Mostly, the people I know seem to view me as sort of a sucker, but I do this because I view copyright infringement on entertainment media as beneath my dignity, like shoplifting.*  If my reaction to Lowery is to get extremely pissed off at his cavalier disregard for the truth, perhaps he should consider changing tactics.


*Of course, purely as a matter of utilitarian ethics, there is a huge difference between copyright infringement and shoplifting physical goods, but never mind that; this is just how it makes me feel, in this particular case.

Sunday, June 17, 2012

B. Caplan on Presidential nullification

Obama recently announced that he would stop deporting illegal immigrants, thus effectively nullifying parts of immigration law through non-enforcement. The injustice of immigration law today is so gross that we may well applaud this particular move. However, in response, B. Caplan writes:

I say the laws on the books are so overwhelmingly wrong that even random Presidential nullification would be a huge expected improvement.

In the year 2012, Caplan argues that an even more imperial Presidency would be a good thing, not in this one instance, but as a general rule. He has surveyed the landscape of the American government and decided that the big problem is that Presidents don't have enough unaccountable power. Yes, let's bring back the days of Worcester v. Georgia, when the Supreme Court failed to order federal marshals to enforce the ruling, for fear that Andrew Jackson would not comply!

Furthermore, the statement has a ridiculous premise. Laws will not, of course, be nullified at random. They will be nullified in the exact ways that Presidents find most politically advantageous or ideologically compelling. I suggest you ponder on your own whether you think that will be a good thing in general. At best, I think it is a mixed bag. Certainly, cravenly seeking political advantage can sometimes (as with Obama's strategic courting of the Latino population here) lead to policy outcomes that improve the welfare of many people, simply because politics is the art of getting many people to vote for you. And the dominant ideologies of American politics are not entirely malign (although a libertarian like Caplan should perhaps feel some unease at the thought that Presidents from the very parties which, allegedly, have been passing so many terrible, liberty-destroying laws will wield nullification as a liberty-enhancing tool).

It is remarkable how ideology can impair the thinking of extremely intelligent people. The points I make above are not the product of any profound thinking. Had Caplan focused his mind on his own statement in a similarly critical way, he might well have ended up feeling uneasy as well. But he did not, or he overcame that unease anyway.

I blame his ideology because I see this error as one common to a lot of libertarian thinking. Libertarians often embrace the wrong measures of the intrusiveness of government. Caplan thinks that less law will lead to greater liberty. (This is a cousin of the fallacious notions that lower tax rates lead to less government interference in the economy[0], or that private governments whose edicts are backed by the force of property law are not governments.) But of course that is an error. We should seek to have the right laws, and, perhaps equally importantly, just mechanisms for establishing the right laws. To focus on whether there are more of them, or fewer, is rather akin to Emperor Joseph II in Amadeus criticizing Mozart's compositions for having "too many notes — just cut a few".

Now, our laws are hardly Mozart operas, but the point is it's a silly way of measuring things. Just so with a typical libertarian's ways of measuring government.


[0] This is an error in at least two ways. First, lower taxes without lower spending merely time-shift interference into the future. Second, a simple, universal tax injects less entropy into the market mechanism than a large number of narrowly targeted taxes applied inconsistently; if we view markets as information-gathering mechanisms then this information-theoretic view of taxation may be a better measure of government interference than tax rates.
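The footnote's entropy point can be made concrete with a toy sketch (entirely my own construction, with made-up numbers; not from any economics text): if we treat relative prices as the signal that markets transmit, a flat tax rescales every price by the same factor and leaves the ordering of prices intact, while inconsistently targeted taxes rescale each good differently and can scramble that ordering.

```python
# Toy model: prices as signals of underlying cost, per the footnote's
# view of markets as information-gathering mechanisms.
costs = [10.0, 12.0, 15.0, 40.0]  # hypothetical unit costs of four goods

# A simple, universal tax rescales every price by the same factor,
# so the relative ordering of prices is preserved:
flat = [c * 1.20 for c in costs]

# Narrowly targeted taxes applied inconsistently rescale each good
# differently, which can scramble the ordering — the injected "entropy":
targeted_rates = [1.00, 1.45, 1.10, 1.02]  # arbitrary made-up rates
targeted = [c * r for c, r in zip(costs, targeted_rates)]

def ranking(prices):
    """Order of goods from cheapest to most expensive."""
    return sorted(range(len(prices)), key=lambda i: prices[i])

print(ranking(costs) == ranking(flat))      # True: flat tax preserves the signal
print(ranking(costs) == ranking(targeted))  # False: targeted taxes distort it
```

This is of course a cartoon, not a model of real tax incidence; it only illustrates why "how uniformly is the tax applied" can be a more informative measure of interference than "how high is the rate".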

Sunday, April 15, 2012

Stross on ebooks

Disclaimer: Since 2006 I have been employed by Google, which sells ebooks and related technology, and therefore competes with several of the companies involved in the subject matter. My opinions are my own and not those of my employer.

Charlie Stross has a compelling analysis of the book publishers' position, in light of the recently filed Department of Justice lawsuit against Apple for alleged price fixing. (Incidentally Stross's comments are usually decent and he engages with his readers as well, so they're worth a skim.)

Stross's post reminds me a lot of something I wrote about 2 years ago. Rereading that post today, I find that I have very little to add to it.

There is one additional thing from Stross's post, though, that I find particularly galling:

. . . if your boss is a 70 year old billionaire who also owns a movie studio and listens to the MPAA, you don't get a vote. Speaking out against DRM was, as more than one editor told me over the past decade, potentially a career-limiting move.

Publishing companies like to portray themselves as scrappy underdogs locked in heroic battle on the side of knowledge against the forces of ignorance. In fact, they are merely subordinate tentacles of large, stupid media conglomerates; they aid the forces of knowledge when it is convenient for business, and do whatever they can to muzzle open discourse when that is convenient. (Fortunately their power to silence critics is fairly limited in a free society.) Critics have been repeating for years that ebooks should be convenient and DRM-free. Publishers never listened; instead they threatened the careers of people like Stross's editors for even bringing it up.

Anyway, I can't comment on the substance of the DoJ lawsuit (and I know nothing about the facts of the case, so my comments would be pointless), but I find it hard to muster much sympathy for these people. The solution to their problems has been staring them in the face for as long as ebooks have existed. Unfortunately, I'm somewhat less sanguine than Stross that publishers are going to learn the lesson. They seem pretty impervious to persuasion.


p.s. See also Tim O'Reilly's Plus post.

Friday, April 13, 2012

The ideology of Dilbert

So apparently people are still leaving comments on that post about Dilbert from half a decade ago. The latest calls me a pointy-haired boss (this is a little funny because I've never been a manager; on the other hand you could make the case that PHB-ness is a state of mind rather than a job title). I've been arguing with people on the Internet for so long, and am consequently so thick-skinned, that insults as mild as that one barely even register; but anyway the ping prompted me to think even more about how Dilbert sucks.

Aside from failures in basic craftsmanship, which I discussed in my previous post, Dilbert cultivates a poisonous worldview. Here are a few lessons that you will learn by reading Dilbert strips:

  • Your boss, co-workers, and clients are not human beings, but objects to be ridiculed.
  • You are never at fault for anything; it is always your stupid boss, stupid co-workers, and stupid clients.
  • There is no possibility of change. Every attempt at change will be thwarted by the system.
  • There is no possibility of escape. Everywhere else you go will be equally bad.

Would you want to be friends with someone who believes these things? Do you want to become a person who believes these things? Then why would you read Dilbert?

As a corollary of the above lessons, here are a couple of plots that you will never see on Dilbert, even though I think they make sense in the context of a dysfunctional workplace, and could be funny.

Plot: Dilbert runs a job interview with a smart, capable young college graduate, who passes all his questions with flying colors. Dilbert takes him aside and whispers in his ear, "Run. Run now, while you still can."

Reasons you will never see this: (1) Dilbert never shows any compassion for another human being. (2) Nobody on Dilbert is ever competent at anything except Dilbert himself. (3) It would raise the uncomfortable question of why Dilbert himself does not leave.

Plot: In a long-running arc, Dilbert leaves his office to found his own startup, small business, or consultancy. He makes all kinds of hilarious mistakes, of the type which founders inevitably make, thus bringing his business to the brink of failure. He finds that he can only be rescued by dogged persistence and the help of mentors and allies.

Reasons you will never see this: (1) Dilbert can never be shown to be at fault for anything. (2) Other people can never be seen as a force for good. (3) The root causes of problems in the workplace can never be shown to arise from the inherent difficulty of making anything work well; it must always be rank stupidity, arrogance, or some other venial human flaw. (4) Scott Adams has no idea how to draw any setting besides a cube farm.

The second of these plots is particularly telling, because Adams himself left his job to found his own business as a cartoonist (and as a result became quite wealthy). But the fantasy he sells to his audience is not one of struggling to change the objectively terrible conditions of their lives, just like he did; it is one of complacently remaining in place while cultivating a smug sense of superiority and alienation. The relationship between Adams and his audience is therefore one of condescension and contempt.

Long ago, Paul Graham ran an ad for his seed capital fund titled Larry and Sergey Won't Respect You In The Morning; Graham was articulating a disdain for large corporate workplaces that is a distant cousin of Adams's, and yet there is a crucial difference. Graham is (admittedly for self-serving reasons) trying to channel his audience's discontent into an urge to follow their dreams and change the world. What would a similar ad for Adams's work look like? Oddly enough, it would be something like this: "I, Scott Adams, won't respect you in the morning. I left my job to build something of my own. Now sit there and read my comics like the powerless peon you are, and were always meant to be. Ha ha!"

Wednesday, February 15, 2012

In which Kevin Drum walks in wearing clown shoes, a clown nose, and a floppy wig, and is taken seriously

So Kevin Drum wrote something ridiculous about copy protection, and Tim Lee and others (see the comment thread) have tried to write thoughtful responses. But this is superfluous. Drum's original post is ignorant, arrogant, and insulting. Drum lazily handwaves away the multi-decades-long failed boondoggle of technological copyright enforcement, and the combined opinions of practically all subject matter experts who are not directly employed by the publishing oligopolies. He does not attempt to refute the evidence; in fact he does not even engage with it. Along the way he manages to sneak in some snide insults for the people who made general-purpose computers (like the one he is typing on) the incredible instruments of human creativity that they are today. What makes anyone think that a mere presentation of further evidence and argument are going to sway him?

Scientifically, the proposition that technological copyright enforcement can dramatically reduce infringement without severe and costly restrictions on liberty is in the same ballpark as climate change denial and cures for homosexuality. I could explain the implications of the Church-Turing thesis very patiently, in very small words, but frankly it strikes me as rather like reading aloud to a student who's not only too lazy to read the book, but too lazy to crack open the Cliffs Notes.

And I would add that this whole business looks very different when (as is the case for many in Silicon Valley) people in your extended social network have had startups or products crushed by errant IP law. Furthermore consider that countless engineer-years have been wasted dreaming up and implementing fruitless schemes like DVD CSS; however they were financially compensated, those are real and concrete wasted human lives. (Counting that production as economic output is rather like a broken window fallacy in which the window never gets fixed, but the guy who broke the window gets paid.) The publishing oligopolies demand the satisfaction of their fantasies, and engineers pay the price in sweat and tears. Against all this, consider the extremely weak empirical evidence for large-scale harms from digital copyright infringement.

Drum strikes me as the moral equivalent of a priest reassuring his lord that the farmers ought to be all-too-happy to be taxed a few more bushels of grain to burn to the sky gods. Of course it costs him nothing to utter those words, so he can afford to be unbelievably cavalier. But this type of behavior should not command our respect.

Sunday, February 05, 2012

Desktop environments: get out of my way

Increasingly the applications you use — yes, you, sitting there, not some rhetorical "you" — can be broken into two categories:

  1. Lightweight stuff where you just want to minimize your maintenance costs. You don't particularly care whether you learn every keyboard shortcut; you just want it to be simple and cheap and available wherever you go.
  2. Heavyweight applications where you do hardcore work. You are an expert operator of these applications and any minimal loss of functionality annoys you.

The categories that these fall into differ from person to person. For a programmer, category (1) might include a photo editor and category (2) might include your text editor. For a graphic artist, these might be reversed.

Desktop environment developers have embraced this arrogant idea that they can impose "human interface guidelines" on the user to make their user experience simpler, more consistent, etc. This is an illusion, because category (1) applications will eventually all be on the web, and category (2) applications have never obeyed any platform's human interface guidelines (HIG), and they never will.

Maya 3D laughs at your pathetic HIG, Microsoft. Emacs laughs at your pathetic HIG, Ubuntu. Photoshop laughs at your pathetic HIG, Apple. (For that matter, your own software usually laughs at your HIG, Apple.) When you spend all day doing hardcore work inside an application, you don't give a fuck if it's inconsistent with everything else you do, because the gains from having that app work exactly the way you want far outweigh the loss from having it be slightly inconsistent with your MP3 player and your PDF viewer. For category (2) applications, the homogenizing influence of the HIG makes about as much sense as having a carpenter install exactly the same grip on a hammer, a power drill, and a jigsaw.

As for web applications, the web laughs at all platforms' HIGs. And anyway it's more important that they be consistent among web browsers than that they be consistent with the conventions of the host platform. Unless you're one of those douchey people who carry their iPad everywhere in a little murse non-murse device that makes you feel acceptably masculine, you're going to check your email using some other device sometimes.

So what we basically need from desktop developers is to get the fuck out of the way. Integrate really well with the web browser, and also give the user the most efficient, unobtrusive way possible to switch between category (2) applications.