Curmie’s Conjectures: Book Reviews and the Warm Fuzzies

by Curmie

[This is Jack: I have to insert an introduction here. Curmie’s headline is fine, but it would come under the Ethics Alarms “Is We Getting Dumber?” or “Tales of the Great Stupid” banners if I had composed it. What he is describing is a culture-wide phenomenon that is far more insidious than its effects on scholarly book reviews alone. I also want to salute Curmie for slyly paying homage in his section about typos to one of my own most common and annoying typos. I know it was no coincidence.]

I published my first book review in an academic journal in 1991.  In all, I’ve written about 30 reviews on a wide range of topics for about a dozen different publications.  In some cases, I was only marginally qualified in the subdiscipline in question.  In others, especially more recently, I’ve been a legitimate authority, as well as being a full Professor (or Professor emeritus) rather than a grad student or rather green Assistant Professor.

The process has changed significantly in recent years, the biggest change being the increased level of editorial scrutiny.  A generation or more ago, I’d send in a review and it would be printed as written.  That was back when I was an early-career scholar, at one point even without a terminal degree, often writing about topics on the periphery of my interests and expertise.  My most recent reviews, when I was a senior scholar writing about subjects in my proverbial wheelhouse, went through three or four drafts before they were deemed publishable.  Note: I didn’t become more ignorant or a worse writer in the interim.

Some of the changes came indirectly, no doubt, from the publishers rather than the editors: I received the same stupid comment—to include the chapter number rather than a descriptor like “longest” or “most interesting”—from book review editors from two different journals published by the same firm.  Actually, one of those “corrections” wasn’t from the book review editor himself, but was a snarky comment from his grad assistant.  You can imagine how much I appreciated being condescended to by a grad student.  Other changes were just kind of dumb: one editor insisted that I change “whereas” to “while” (“whereas” was the better term).

But these are the kinds of revisions at which one just shakes one’s head and shrugs.  The ones that actually affect the argument are far more problematic.  One author was writing about the production of a play by a female playwright from the 1950s.  There’s no video footage (of course), and if literally anyone who saw that production is still alive, I think we could forgive them for not remembering many details.  But the author decried the (alleged) sexism of the male newspaper reviewers who weren’t impressed with the production.  Nothing they said, or at least nothing the author quoted, struck me as anything but a negative response to a poor performance.

Remember, they’re not talking about the play as written, but as performed, so the fact that the text isn’t bad (I’ve read it) doesn’t render the criticism of the acting and directing invalid.  I said that in what amounted to my first draft, but was told that I needed to say that the allegations of sexism could have been true (well, duh!), but weren’t necessarily.  In my view, declaring suspicions as fact, even if there’s some supporting evidence, might cut it as a blog piece, but it isn’t scholarship.  But whatever…

In another review I suggested that the mere fact that male dramatists wrote plays with specific actresses—their “muses”—in mind for the leading roles doesn’t mean that those women should share authorship credit any more than Richard Burbage should get co-authorship credit for Shakespeare’s plays.  I was ultimately able to make that point, but in a watered-down version. 

More recently, I was asked to “tone down” a comment that several of the authors in what purported to be an interdisciplinary collection of essays were so committed to discipline-specific jargon, incredibly complex sentences, and sesquipedalian articulations (see what I did there?) that readers, even those well-versed in the subject matter—me, for example—would find those chapters unreasonably difficult to read, and might be tempted to conclude that the authors were more interested in strutting their intellectuality than in enlightening the reader.

I stand by the analysis, but the editor was probably right to ask me to temper the cynicism.  I did so, but I kept the rest in a slightly revised version.  She seemed pleased, and told me she’d sent it off to press.  When it appeared in print, only the comment about jargon remained… and the verb wasn’t changed from plural to singular.  Sigh.

Perhaps the most telling episode was when I said that a book was extremely poorly edited and proofread.  I’ve never written a book, but I have published several chapters in collections of scholarly essays.  The process varies a little from publisher to publisher, but for one recent chapter I sent a draft to the book editor, who made editorial suggestions and proofread, and sent it back to me.  I approved some of the changes he suggested and made my case for not changing other parts of the essay.  After about three drafts, we both pronounced ourselves satisfied, and the essay went off to the series editor, who requested a couple of very minor changes.  And then it went to the publisher.  And then the professional proofreader.  And then back to the publisher.  And then back to me.  At least five different people proofread that chapter, some of us several times.

It’s almost inevitable that some typo will still sneak by.  Of course, some publishers will cheat and rely on spellcheck, sometimes without even checking the final product.  I once encountered a textbook that intended to reference the 19th century playwrights Henri Becque and Eugène Brieux, but rendered their surnames as Bisque and Brie—a nice lunch, perhaps, but hardly important dramatists.

But this book, published by a prominent academic press, was ridiculous.  There were four and five typos on a single page, formatting so inconsistent that it was impossible to tell where quoted material began and ended, at least two glaring malapropisms (that I caught), and a number of sentences or paragraphs so convoluted that it was literally impossible to tell what was intended.  We’re not talking “teh” for “the” or accidentally omitting the “l” in “public,” here.

I was insistent on making the point that the book was not yet ready to be published.  A lot of the scholarship was really excellent, but the volume read like a first draft, neither edited nor proofread.  Finally, the book review editor had to get permission from the journal’s editor-in-chief (!) for me to go ahead with that commentary.

So what’s going on, here?  I can offer no firm conclusions, only speculations… “conjectures,” to coin a phrase. 

Continue reading

Ethics Heroes: New York Times Readers

Who would have thought that New York Times readers could do such a terrific Peter Sellers impression?

Paul Krugman, once a Nobel Prize winner, now the very model of a modern progressive hack, issued his contribution to the current “Protect Joe Biden!” hysteria among pundits and journalists. It’s called “Why You Shouldn’t Obsess About the National Debt,” and if this won’t get the Nobel people to demand their prize in economics back, nothing will.

The intellectual dishonesty of the piece is stunning even for Krugman—I remember how an old friend favorably posted one of Krugman’s columns to Facebook and the scales fell from my eyes, making me realize that the old friend was an idiot and had always been one—and the rationalizations he uses to shrug away the $34 trillion national debt are breathtaking in their audacity. Some examples:

Continue reading

Ethics Observations on the Harvard/Columbia “Nakba” Article Episode

What’s Nakba? It is a pro-Palestinian framing of the forever conflict in the Middle East between Israel and the Palestinians. Nakba refers to the beginning, when the United Nations announced its two-state resolution of the Palestine conflict, with Israel getting one of the states, and the Arab states, along with the Palestinians, attacked the new Israeli territory with the objective of making the whole region a single Palestinian state. Israel won, and that historical episode is referred to as Nakba, “the disaster,” by the Palestinians.

I view it as the equivalent of the die-hard Confederacy fans in today’s South calling the Civil War “the war of Northern aggression.” It’s a false and biased framing that justifies everything the Palestinians do and try to do to Israel (like wiping it off the map), including terrorism. It is the reverse of the more correct and honest Israeli framing, which is that Palestinians could have had their state in 1948, tried to wipe out Israel instead, and now reside in the mess of their own making.

Soon after Hamas’s October 7 terrorist attack (the hostages appear to all be dead by the way, which should have been assumed by now), the Harvard Law Review asked Rabea Eghbariah, a Palestinian doctoral candidate at Harvard Law School and human rights lawyer, to prepare a scholarly article taking the Palestinian side of the latest conflict. Eghbariah, who has tried landmark Palestinian civil rights cases before the Israeli Supreme Court, submitted one, a 2,000-word essay arguing that Israel’s attack on Gaza following the Hamas act of war should be evaluated through the lens of Nakba, and within the “legal framework” of “genocide.”

Continue reading

Observations on the Early Post-Trump Conviction Polling

It’s early yet, and things could change, and yes, polls, but

Observations:

Continue reading

The DEI-ing of Major League Baseball’s Statistics: Oh. Wait, WHAT?

Major League Baseball’s absurd and self-wounding decision to lump all of the old Negro League season and career statistics in with those of its own players is impossible to defend logically or ethically. Ethics Alarms discussed this debacle of racial pandering here, three days ago. What is interesting—Interesting? Perhaps disturbing would be a better word—is how few baseball experts, statisticians, historians, players and fans are defending this indefensible decision or criticizing it. As to the latter, they simply don’t have the guts; they are terrified of being called racists. Regarding the former, there is really no good argument to be made. MLB’s groveling and pandering should call for baseball’s version of a welter of “It’s OK to be white” banners and signs at the games. Instead, both the sport and society itself are treating this “it isn’t what it is” classic like a particularly odoriferous fart in an elevator. Apparently it’s impolite to call attention to it.

Continue reading

So It’s Come To This: Baseball Destroys Its Hallowed History and Statistics To Signal Its Abject Wokeness, DEI Compliance and White Guilt

How sad. How transparent. How self-destructive.

Major League Baseball announced yesterday that it is now incorporating statistics of the Negro Leagues and the records of more than 2,300 black players who played during the 1920s, 1930s and 1940s into its own record books. This, of course, makes no sense at all: it is The Great Stupid at its dumbest. It is the epitome of DEI fiction and manipulating history. And, naturally, when everyone wakes up and realizes how brain-meltingly stupid this was, it cannot be reversed.

Because doing that would be “racist.”

Thus, lo and behold, legendary catcher Josh Gibson (above) becomes Major League Baseball’s career leader with a .372 batting average, surpassing Ty Cobb’s .367. Gibson’s .466 average for the 1943 Homestead Grays became the season standard, followed by the immortal (I’m kidding) Charlie “Chino” Smith’s .451 for the 1929 New York Lincoln Giants. These averages surpass the .440 hit by Hugh Duffy for the National League’s Boston team in 1894.

Gibson also becomes the career leader in slugging percentage (.718) and OPS (1.177), moving ahead of Babe Ruth (.690 and 1.164). Gibson’s .974 slugging percentage in 1937 is now the MLB season record, with Barry Bonds’ .863 in 2001 dropping to fifth (that stat is also corrupted, but for a different reason). Bonds now trails legendary (kidding again) Mule Suttles’ .877 in 1926, Gibson’s .871 in 1943 and Smith’s .870 in 1929. Bonds’ prior OPS record of 1.421 in 2004 dropped to third behind Gibson’s 1.474 in 1937 and 1.435 in 1943.

Continue reading

MIT Geniuses Finally Figure Out That Forcing Faculty To Pledge Fealty To Woke World Isn’t Academic Freedom

From one perspective, this development seems encouraging. Maybe the lesson of “The Emperor’s New Clothes” is finally starting to take down the destructive DEI delusion.

The Massachusetts Institute of Technology announced that it will end the use of diversity statements in the faculty hiring process. These statements, typically a page long, were required of all faculty candidates so they could persuade the institution that they could be relied upon to support and enhance the university’s commitment to “diversity.” The statements are now routine in faculty hiring at many public and private universities, as well as in corporations and other organizations. I confess that I had not focused on this development sufficiently; it is scary, and the mainstream media and its pundits apparently felt it was not something “the public has a right to know.” [The only previous Ethics Alarms essay on diversity statements is here. I helped sound the alarm, and then did nothing for two years.]

As she announced the reform, MIT’s president Sally Kornbluth, the lone survivor of the fateful Congressional hearing that led to the dismissal of two other female presidents of elite universities, the University of Pennsylvania and Harvard, condemned the statements as compelled speech. “My goals are to tap into the full scope of human talent, to bring the very best to M.I.T. and to make sure they thrive once here,” Dr. Kornbluth said. “We can build an inclusive environment in many ways, but compelled statements impinge on freedom of expression, and they don’t work.”

Interesting phrasing. If they “worked,” whatever sinister meaning that has, would she be eliminating them? The diversity statements are not just compelled speech, they represent compelled ideological conformity. That’s fascist stuff. Explain to me again: who are the “threats to democracy”? It also points to the other perspective besides the one I alluded to at the beginning. The fact that diversity statements have infested academia at all is ominous.

Continue reading

Reminder to California: Doing the “Right Thing” When It Can’t Possibly Have a Positive Outcome Isn’t Ethical


It’s amazing what a flat learning curve ideologues have.

Certain laws of economics are immutable: if someone’s skills and the value of their labor are not worth the amount they demand in compensation for it, then eventually no one will be willing to hire them. Way back in my foggy history, the U.S. Chamber of Commerce charged me with examining just this issue in my role as head of the National Chamber Foundation, the Chamber’s public policy research arm. I hired an independent economist to examine the issue, and he concluded that indeed, raising the minimum wage cost the most vulnerable American workers jobs every…single…time. He also explained that the political pressure for raising the minimum wage came from unions, which used a rise in the bottom wages as justification for demanding higher wages in their definitely un-minimum-wage-compensated fields. Unfortunately for me, my scholar, being independent, also disputed the Chamber’s position that minimum wage increases were automatically inflationary across the board. The President of the Chamber had my foundation’s study pulled out by a Democratic Party minimum wage hike advocate and used to refute his position on a Sunday morning public affairs show. (My ultimate boss had neglected to read the document.) This, as you might imagine, did not help my status in the organization.

If anything, the advances in technology have made that old study at NCF more accurate than ever. Never mind, though: 21st Century progressives seem to care about virtue-signalling and fealty to socialist cant more than actual results or, to put it another way, reality. Naturally, California, one of our extreme leftist kamikaze states, arguably the most reckless one, has adopted this attitude. And thus it came to pass that last fall, Governor Newsom signed into law a minimum wage hike to $20 an hour for the fast food sector, for the “benefit” of fast food workers, even as the segment of the public that most often consumes fast food has been slammed particularly hard by inflation and higher food prices.

Everything we have learned about minimum wage hikes indicated that this would be a disaster, but advocates of the move in the Democratic party pooh-poohed the objections as more proof that conservatives are cruel and greedy. Do these people ever get tired of being embarrassingly, absurdly wrong? As a Washington Times headline put it, “Fast food chains find a way around $20 minimum wage: Get rid of the workers.”

Continue reading

The Explanation For Everything That Afflicts Americans of Color Is Systemic Racism, Part II: Botched Executions

A report released last week by Reprieve, a human rights group that opposes the death penalty, apparently shows that the lethal injections of black convicted murderers are botched more than twice as often as the executions of white convicts. Spinning, the New York Times says, “That finding builds on a wealth of research into racial disparities in how the U.S. judicial system administers the death penalty. The proportion of Black people on death rows is far higher than their share of the population as a whole.”

“We know that there’s racism in the criminal justice system,” said Maya Foa, an executive director of Reprieve. “We know it’s there in the capital punishment system, from who gets arrested, who gets sentenced, all of it. This is, though, the first time that it’s been looked at in the context of the execution itself.”

To start with, they don’t “know” that at all. It is a self-perpetuating theory built on other debatable assumptions, such as dismissing the possibility that the disproportionate number of blacks on Death Row, and in the U.S. prison system generally, is because a disproportionate number of blacks commit crimes that legitimately put them there. Second, how exactly does doing a bad job killing a condemned prisoner show racial bias?

More from the Times:

Continue reading

The Explanation For Everything That Afflicts Americans of Color Is Systemic Racism, Part I: Insomnia

…someone just has to figure out how and why. Or just assume how and why. Oh, hell, just hand over the reparations already!

Researchers believe that black Americans are likely to have more trouble sleeping than the white Americans who oppress them. In fact, the darker your skin color is, the less sleep you get, says Dr. Dayna A. Johnson, a sleep epidemiologist at Emory University. “The theory is that racial minorities experience a stress that is unique and chronic and additive to the general stressors that all people experience,” said Johnson, a sleep disparity pioneer. “We all experience stress, but there are added stressors for certain groups. For certain populations, racism fits into that category.”

A Johnson-headed study published in the journal “Sleep” claims to find that experiencing racism can cause people to have problems falling asleep. (What did the researchers do, hire people to racially discriminate against their subjects before bedtime?) The study also concluded that people who anticipate racism may experience interference with their sleep-wake cycle because the dread causes their body to be in a heightened state of anxiety, with higher blood pressure and an accelerated heart rate. By this, I glean that being told by the media, politicians and race-hucksters that American society is all racist all the time causes black Americans to lose sleep. Got it. And being white, this is my fault.

Continue reading