Is a C+ really that much worse than a B-? Please explain

What is it about C+ grades? I turn in my grades, and the email deluge begins roughly 5 minutes after I do so: “Can I have a B- instead of a C+? Of course you will be rounding my C+ up to a B-, right? It would mean the world to me.” (No, I shan’t.)

There’s something psychological about the C versus B line, and I don’t get it. It’s been a long time since I’ve thought about my own GPA, so maybe there is a big difference numerically, but I can’t recall that being the case; the difference between a B- and a C+ is pretty marginal. Both can keep you out of elite graduate programs. Perhaps they should.

I’d understand this if it were in the D+ versus C- range. At a D+, you wind up taking most required classes again, and that is an expensive and time-consuming proposition. But a C+ means you don’t have to take the class again.

I really don’t like letter grading. Let’s face it, I don’t like grading. By the time you are in graduate school, you should be self-motivated and self-evaluative enough to do your work in collaboration with a professor.

Qualitative research is not doomed, aka movie deals.

ATTENTION CONSERVATION NOTICE: The qual versus quant distinction that old timers have grown up with is dated, and it probably wasn’t even useful back in the day. Most of us academics are dinosaurs, so be humble when you throw poop around the dinosaur cage.

This piece by Stephen Porter crossed my desk via Twitter the other day from Noah Smith (@Noahpinion, who is wonderful, and you should follow), and at the time I shot back some opinions on Twitter. But it’s bothered me ever since, so I thought I would write a fuller response here. I don’t know Stephen Porter or his work, but, that said, I did read his bio and a couple of his papers after reading his blog post.

Let’s start with the overall snarky tone of the piece. As somebody who is frequently snarky, it raises a red flag. I know full well when I do it, and it’s not good scholarly behavior on a blog or anywhere else. When somebody is snarky about a topic that shouldn’t normally generate anger or condescension, it’s a warning sign, and the warning sign is simply that the author’s ego is at stake in the writing. If you really have the full force of both soundness and validity in your argument, you don’t need snark to bully the reader into believing you or to frighten dissenters from challenging you. I’m as guilty of this as anybody.

From there on, the argument is cherrypicked, overreaching, and blind to the overall research context we all live in.

Let’s start here:

[Screenshot of a tweet from Stephen Porter: “Speaking truth to power about qualitative research”]

His response to this tweet was:

Of course, the whining and outrage was predictable. More here:

I assume BMJ is the British Medical Journal.

So whining and outrage go together, and reactions to a medical journal’s business model of scholarly research dissemination are mere “outrage” rather than legitimate critique of a journal that extracts free content from scholars to sell at exorbitant prices…about inconsequential matters such as health. Okaaaay.

There actually are some pretty damn good reasons that medical research absolutely needs qualitative research, and some of the most important medical studies ever done have been qualitative. We know a lot more about the effects of toxins on the human body because of opportunistic studies of rare events like industrial accidents or London’s “killer fog.”

And, btw, what does shunning qualitative research mean for bioethics research? It should bother us when a medical journal is not interested in the casuistry of field practice. It’s one thing if BMJ intends to specialize and expects those researchers to go to specialty journals, but that’s not the same as the “it’s a low priority for us because it doesn’t sell” rationale.

The next point that strikes me as incorrect is this one:

Let’s face facts: it’s a quant world now. Policymakers and stakeholders don’t want to hear stories about the lived experience or any other such nonsense. Funders are increasingly adopting a similar mindset

The facts are, there isn’t any evidence to back up this assertion. The facts are…policymakers and stakeholders–an amorphous, ill-defined group of people, so God only knows who they are, but they of course agree with what Porter thinks…are often not terribly interested in any research of any kind unless it supports their interests.

But before I get too far into that, let’s deal with “don’t want to hear stories about the lived experience or any other such nonsense.” So if market interest is your measure of worth, then fine, but you should probably note that in the list of New York Times Best Sellers, historians outnumber economists roughly 8 to 1 (where economists would be doomed without the not-strongly quantitative Thomas Piketty), historians routinely win the National Book Award (year in and year out, actually, where economists have never posted a win, not ever*), and The Immortal Life of Henrietta Lacks–a qualitative book on bioethics and history–just landed a feckin movie deal. We should probably note that Freakonomics would not have been what it was in terms of runaway best sellers if it hadn’t had a research collaboration with a qualitative social scientist and a writing connection with a journalist. (And a platform in the NYT).

Matthew Desmond is currently tearing up the book sales with a book of “stories.”

I kind of think people are interested.

If your idea of qualitative research is that it is just about the “stories of the lived experience” and “nonsense”…then you aren’t qualified to make assessments of qualitative research because you don’t know what you are talking about. Yes, there are ethnography and interview studies still out there, and I find them often to be quite valuable (Henrietta Lacks, and many others). I’ve obviously done a fair amount of them in addition to my quantitative research because I’m not an ideologue about methods. I care about questions and getting answers.

But more than that, big data are–or should be, if you are awake–entirely changing the distinction between quantitative and qualitative. With digital technologies and social media, you are getting millions of data points that confound the traditional tools of econometrics. Later on, Porter says qualitative people are “dinosaurs,” but with his characterization of qualitative research here, I guess I have to question whether Porter is as cutting-edge as he thinks he is.

And I don’t know about the rest of you, but one of my econometrics instructors, a brilliant econometrician named Joel Horowitz, and I once had a really interesting discussion about whether Bayesian approaches are inherently qualitative, and it wasn’t one of your typical sniffy-snooty, looking-down, pissing-on-the-wall, I’m-ever-so-much-more-rigorous-than-thou academic discussions. People like Horowitz, who are genuinely secure in their work, don’t have to do that: it was just the two of us chatting about where ideas come from and how people use them to formulate theory, and getting into some pretty interesting epistemological waters as we went.

The part here that pains me to write: research and higher education do seem to be in the process of changing, but it’s not strictly a data revolution where quantoids like Porter stand astride the earth while the silly dinosaurs die. Instead, the star economy of the academy means that there are global academic darlings, who get all the sunshine, and then the rest of us–the Help–who get whatever crumbs are left.

Funding, particularly that for social science, is consolidating and drying up for just about everybody, not just those dummies who tell “stories.”

And if I were a betting woman, I’d guess that the university Porter teaches at, NC State, stands a good chance of either being the only state university in North Carolina…or being closed in the next 30 years. And since UNC at least has a sports dynasty on their side, I’d bet the latter.

I hope I am wrong. But I don’t think I am. I think some aspects of higher education are, in fact, dying, and a lot of what I see in Porter’s argument is the anxiety that all of us have about the changes going on around us: I’M not the dinosaur or the Help. YOU OTHERS ARE.

Then he goes on to say that quantitative dominance is only going to get worse because:

1. Statistics is now prominent in the K-12 math curriculum; it was nonexistent when I was a kid. Students at a young age will now be learning quant methods, not qual methods.

This assumes that students don’t learn qual methods, and I don’t think he’s right about that. I agree that we are seeing more statistical literacy in K-12 (and thank heaven), but we also seem to be seeing things like expanded service learning and visual ethnography in addition to data literacy.

2. The media has gotten much more data savvy, and now regularly present charts and graphs based on quant data. This is creating a culture where we tend to talk and view issues in terms of what the quant data tell us.

Yes, but the media also show us word clouds and videography; text mining appears regularly in the media, too. It’s not like you can’t graph various aspects of qualitative research.

And, um, “This is creating a culture where we tend to talk and view issues in terms of what the quant data tell us”…go read some media effects research before you make sweeping conclusions like this based on your impression. The tail can wag the dog in terms of what media shows us.

3. Number 2 is especially true for academic research. The Chronicle of Higher Ed and Inside Higher Ed report predominantly on quant studies. The major media outlets, like the NY Times, tend to report on work done by economists. When was the last time you read about an anthropological study in the national media?

And yet Matt Desmond got a six-figure book deal just telling stories, and we didn’t, with our big, giant, better-than-his data.

4. More and different quant datasets are continually collected, as we use more electronic devices and the cost of data storage continues to drop to almost nothing. So it’s becoming much easier to study a wide variety of topics using a quant lens than it was 20 or even 10 years ago.

5. Statistical and visualization software is easier to use every year, putting more tools in the hands of people who might normally never crack open R and run a regression analysis.

I’m currently doing a project with about 5,000 images from the web. Quant? But it’s coded images and text mining. Qual?

Porter’s approach seems to be “everything that is new and emerging is quant and everything old and lousy is qual”–and it’s an easy way to frame an argument you wish to win–but that doesn’t make you right about your basic definitions. Just because you have a lot of data doesn’t mean your approach isn’t qualitative. If I measure every single thing that happens every nanosecond of an individual’s life…I might have a lot of data, but not necessarily generalizable research conclusions. And it could still be interesting and useful as all hell.

This last point, to me, just suggests that old binaries like “quant” and “qual” are going away because they aren’t useful, not that Porter is right in his characterization of them.

The rest of the essay is academic posturing: my discipline does things in a rigorous way, education doesn’t, and so forth. Everybody knows that there are good studies and weak studies out there, and there are plenty of examples of weak quant and weak qual studies alike.

His link to an editorial targeted to qualitative researchers on how to get their work published strikes me as good advice for academic writers in general, though nothing here strikes me as particularly earth-shattering for those of us who get our work published. But here it is, for those out there who can use advice.

*Leontief and Galbraith were both nominated, but didn’t win. Always a bridesmaid.

Enough whining about liberal smugness: go read Orwell if you need that fix

I’ve had about a gillion people gleefully forward me this piece from Vox by Emmett Rensin: The High Price Democrats Pay for Liberal Smugness.


The idea here is that Jon Stewart is smug, John Oliver is smug, Stephen Colbert is smug…all you liberal proffies are smug, and all you people who don’t “get” Kim Davis….smug, smug, smug.

Have you watched Fox News recently? Bill O’Reilly…now THERE is a humble guy right there. A man of the people. Yesssirree. Rush Limbaugh. THERE’S a quiet, unassuming guy for you. Never a condescending word has ever been uttered or written by Ann Coulter. Or that king of smugness himself, William F. Buckley, Jr.

From their flouncing around about campus protests to assertions that they know what “history tells us,” there is plenty of smugness on the right, too.

But what really irritates me about the Vox piece is that it is old wine in a new bottle, and Rensin gets to ride a million forwards and clicks based on poor argumentation and vague assertions about what “the liberals do.” Here’s one:

That is: Kim Davis was not only on the wrong side of the law. She was not even a subscriber to a religious ideology that had found itself at moral odds with American culture. Rather, she was a subscriber to nothing, a hateful bigot who did not even understand her own religion.

Says who? Rensin’s Facebook feed?

Plenty of us in the world (like me) said “Hey, that’s too bad for Ms. Davis, but she doesn’t get to use her public office to pick and choose who exercises rights that have been settled in law based on her personal beliefs.” That’s hardly “hateful bigot” language. (It reminds me of Monica Lewinsky’s claim that “the feminists were mean to her.” Um, I’m a feminist and I distinctly remember telling people to shut their damn pie-holes and get out of her life. Do I not count as a feminist or a liberal? Or do we all just get to collapse everybody together based on our perceptions of what some of them did wrong as the definition of what’s wrong with that whole damn group we wish to condemn?)

These are straw-man arguments, easily posited when you don’t have to have any proof besides listing some leftie media personas who have adopted the same loud, table-slapping modalities as righty media personas.

There is nothing that anybody is ever going to say about smugness, liberal or otherwise, that will top George Orwell’s The Road to Wigan Pier. Liberals are damned if they do, damned if they don’t, and Orwell set up that damnation in 1937 England way before Rensin “discovered” how it was getting worse over 30 years in America. (No evidence it’s getting worse. Apparently somehow it is. I blame John Oliver. Oh wait, he’s English and you do realize we are talking about TV shows? And cable tv no less?)

Richard Bellamy wrote about this beautifully in “The Intellectual as Social Critic: Antonio Gramsci and Michael Walzer,” in Intellectuals in Politics: From the Dreyfus Affair to Salman Rushdie, ed. Jeremy Jennings and Anthony Kemp-Welch.

If they remain outside politics, they end up being charged with aloofness and a selective blindness to injustice. If they enter the political arena, they appear condemned either to prostrate themselves before the powerful or illegitimately to impose their ideals on others. On the one hand, they stand accused of a false objectivity obtained via a refusal to dirty their hands by engaging with the often messy affairs of the world; on the other hand, they are warned against covering their hands in blood by seeking to make a necessarily imperfect world conform to their abstract ideals.

IOW, you don’t get to feel good about not being smug just because your position is that the status quo is awesome while others are trying to change the status quo and finding it a hard go. There is an inherent condescension, too, not to mention a silly absolutism, in the notion that because society is imperfectible, no change is meaningful: it assumes you know the abilities and essential human nature of all who surround you.

Orwell, BTW, once wrote: “the real enemies of the working class are not those who talk to them in a too highbrow manner; they are those who try to trick them into identifying their interests with the interests of their exploiters.” (Letters, New English Weekly).

Oh, so he meant trickle-down economics, then?

I find the reification of “the working” class to be disingenuous in all its forms, and the “noble” working class v. the “effete” intellectual dichotomy is particularly specious now that most of the academic workforce has become as contingent and economically precarious as everybody else. Orwell hoped to secure the preservation of a socially conservative working class, and that’s fine as far as it goes, but as a refugee from that world, I can say without hesitation: it’s got its own damn problems, and it doesn’t deserve a pedestal (or blame) more than the rest of the world.

I find the dialogue around Donald Trump to be particularly odious in the way that people are so careful to note that “he’s clearly bright.” There is something especially perverse about somebody who is bright and who, nonetheless, persists in spouting base and childish arguments. It doesn’t matter if you are bright if you betray your own gifts by refusing to reflect on things.

Mansplaining in the philosophy classroom over at Crooked Timber

If you are an academic interested in politics and you are not reading Crooked Timber, you really ought to be. I tend to binge-read the entries as I go along.

My favorite piece there recently came from Harry Brighouse on Gender dynamics in the Philosophy Classroom, echoing a piece that originally appeared on Leiter Reports.

It’s worth reading both, including the comments that Harry directs you to, and I like the use of the word “strawmans” as a verb.

I am inspired by the idea of having pre-planned roundtables where students expect to discuss the ideas in play. That eliminates some of the terror of cold-calling, and the similar terror of undergraduates huddled around podiums with boring, soul-destroying PowerPoint presentations, but still requires participation. However, it does not decrease the likelihood that male students will dominate the discussion or strawman what their female counterparts say.

I am consistently surprised by my colleagues’ inability to see the role that mansplaining and strawmanning (and gaslighting) play in male speech in the academy, and it’s dangerous as all hell. Having recently confronted it, I’ve had to deal with one type of undermining behavior after another, from being told I’m making a “big deal” out of nothing to having it called a “personal misunderstanding.”

No, goddamn it, no.

Being smart and having scholarly capabilities–like launching a credible argument or using basic social science skills–are the currency of the academy. They are also the cornerstone of value in the knowledge economy more generally. It is how people assign value and status. Thus, my greatest financial asset is people’s perceptions of my abilities and intelligence. When women’s capabilities and intelligence get downplayed and undermined, it has financial consequences. It affects your ability to pay rent, let alone your ability to build the confidence you need in order to offer the world the products of your intellectual work in the first place. Not standing up for your skills, capabilities, and record in the knowledge economy is the financial equivalent of letting kids spray-paint graffiti on your house (and not painting over it afterward).

If you fail to confront it, you allow people to represent your capabilities as less than they are, or if you correct it, you will be endlessly nagged for being “not nice.”

Judith Butler on anti-Semitism

This is from quite some time ago, back when Larry Summers was the president of Harvard. But it’s still worth reading.
Judith Butler takes on the charge that protest is anti-Semitism, for those of us who want there to be an Israel but who also want to condemn human rights violations.

Summers’s view seems to imply that criticism of Israel is ‘anti-Israel’ in the sense that it is understood to challenge the right of Israel to exist. A criticism of Israel is not the same, however, as a challenge to Israel’s existence, even if there are conditions under which it would be possible to say that one leads to the other. A challenge to the right of Israel to exist can be construed as a challenge to the existence of the Jewish people only if one believes that Israel alone keeps the Jewish people alive or that all Jews invest their sense of perpetuity in the state of Israel in its current or traditional forms. One could argue, however, that those polities which safeguard the right to criticise them stand a better chance of surviving than those that don’t. For a criticism of Israel to be taken as a challenge to the survival of the Jews, we would have to assume not only that ‘Israel’ cannot change in response to legitimate criticism, but that a more radically democratic Israel would be bad for Jews. This would be to suppose that criticism is not a Jewish value, which clearly flies in the face not only of long traditions of Talmudic disputation, but of all the religious and cultural sources that have been part of Jewish life for centuries.

Just as for those who think to protest American actions is unpatriotic, it helps to remember what justice is for–justice isn’t some lofty idea that might never obtain. Perfect justice, perhaps. But a general sentiment that the system works justly and decently is a non-optional foundation to the long-term stability that many of us hope for Israel. Not everything can be accomplished with the barrel of a gun.

Sarah Manguso on writing from what is small, cheap, and false

We’ve been reading Isaiah Berlin’s Two Concepts of Liberty (pdf) in our class on justice. Our focus is on the first bits–the ideas about noninterference and coercion. This time reading it, I found myself much more interested in the second half material on self-mastery than I have ever been. Berlin and Aristotle have proved to be a fairly potent combination in playing about my mind throughout this week.

Philosophy in the latter part of the 20th century makes mastering the self more difficult, because it becomes less clear, at a metaphysical level, what the self is; the self has gotten rather complicated, and mastering it even more so. But let’s just go with the idea, for now, that there is a self and that you have some ability to discipline it. (There is less evidence of this in my case than I care to discuss, but…moving on.)

These ideas were floating around in my head this morning as I read this lovely essay from Sarah Manguso in the New York Times on writers and envy:

In 1818, Keats wrote to his publisher, “I would sooner fail than not be among the greatest.” His poem “Endymion” had recently been savaged by critics, one of whom called it “imperturbable driveling idiocy.” Keats’s terminal tuberculosis didn’t take full hold for at least another year, but Byron wryly remarked that Keats was ultimately “snuff’d out” by a bad review. But Keats also wrote, to another of his publishers, “If Poetry comes not as naturally as the Leaves to a tree it had better not come at all.” As leaves to a tree. A tree does not leaf out of envy of other trees. It leafs out all by itself, within a system of life and light, matter and time. Writing out of envy will not produce a tree in bloom. It will produce an expression of envy, and envy’s voice is ugly, small, cheap and false.

So inspiring. Write on, friends, from life and light, matter and time. Stretch your branches into the sun.

Governance post: Flint & Virginia Tech, A story of a mom, science, and governmental successes among the government failures

Marc Edwards, the scientist who worked on exposing the problems in Flint, MI, was one of my colleagues at Virginia Tech. He seems to be committed to the radical idea that people’s drinking water shouldn’t make them sick.

This piece in Washington Post is one of those “hero’s journey” journalistic accounts that drive me a little bit crazy, but hey. Edwards deserves it. The nice part of the story comes in here:

And then his phone rang in April 2015. It was a woman named Leeanne Walters, a Flint, Mich., stay-at-home mother who was getting nowhere convincing state and local officials that there was something seriously wrong with the orange-tinted water coming out of her tap. Her family’s hair was thinning. Her son’s skin was red and irritated. They told her the water was perfectly safe. And even months later, when it had been determined there were high traces of lead in her water, the officials shrugged it off as an isolated problem.

Desperately, she called Edwards, whom she had read about online. Over the phone, he walked her through how to take her own water samples. The next day she sent them FedEx to Edwards to test. It was the worst lead levels he had ever seen.

“When we saw that my heart skipped a couple of beats,” he said. “The last thing I needed in my life was another confrontation with government agencies. But it was us or nobody.”

First, yay Ms. Walters. Citizen-led science is never not awesome.

Second, the story highlights lousy behavior and governmental failures on the part of regulatory agencies, including the CDC, which really can’t be doing that. The CDC is an agency where nothing less than 100 percent effort toward transparency and accountability is acceptable. Heads need to roll.

But although much of the media wants to trumpet just the government failures, ahem. Marc Edwards was educated, his entire life, at state schools–SUNY Buffalo and the University of Washington. He’s spent his career fighting for clean water from a *tenured*–not contingent–position at a *state* school. He received grant money, though not enough, from the National Science Foundation–a *federal* source of grant funding that some of our friends in Congress buuuuuuuuuurn to cut because it’s just a waste. He disseminated his findings via an internet that the government helped research, develop, and build.

My point is not just to fling more confetti on Professor Edwards, but to point us back to a rather old-fashioned idea: that there is good government and there is bad government, and that good government is possible, and it is often all around us, invisible, and it is sometimes the only thing that is capable of standing up when government itself fails.