EAGER TO do her part to demonstrate "solidarity with Black Lives Matter," the chair of the English Department at Rutgers University, Rebecca Walkowitz, recently sent a 3,400-word email on the subject to staff, students, and faculty members. Her message went into considerable detail to describe the "ongoing and future initiatives" that are planned to "create and promote an anti-racist environment," to eradicate "the violence and systemic inequities facing black, indigenous, and people of color members of our community," and to "cultivate critical conversations [on] state power; racism; violence; white supremacy; protest and resistance; and justice."
Most of these "initiatives" have little or nothing to do with English. Much of the email is filled with rhetoric about "equitable and diverse hiring" and "supporting black-owned businesses" and "engaging students in conversations about race" — all of it standard race-conscious boilerplate that could just as easily be copy-and-pasted into a similar email from the heads of the Anthropology, Molecular Biology, or Psychology departments.
But Walkowitz has found some means to incorporate "anti-racism" specifically into how Rutgers teaches English writing and grammar. Several of them seem to involve downplaying the importance of proper English writing and grammar. For example, one of the ways the Rutgers English Department is planning to improve its Writing Program is, in the memo's words, by "incorporating 'critical grammar' into our pedagogy." And what exactly is "critical grammar"? Walkowitz explains:
"This approach challenges the familiar dogma that writing instruction should limit emphasis on grammar/sentence-level issues so as to not put students from multilingual, non-standard 'academic' English backgrounds at a disadvantage."
Got that? There's more:
The new grammar curriculum "encourages students to develop a critical awareness of the variety of choices available to them w/ regard to micro-level issues in order to empower them and equip them to push against biases based on 'written' accents."
It takes years of graduate school and an advanced degree to reduce English prose to something so impenetrable and opaque. But Walkowitz's memo appears to be assuring students — or at least students from "non-standard" backgrounds — that failure to master standard English grammar will not be held against them, since it is more important for the Rutgers Writing Program to help students "push against" the traditional bias that favors proficiency in the rules and forms of English writing.
All this may advance the process of what Walkowitz calls "decolonizing the Writing Center," but it is not likely to ensure that Rutgers students who major in English acquire a reputation for clarity and elegance of writing.
In My Early Life, Winston Churchill's memoir of his first 30 years, the future prime minister recalls with deep appreciation Mr. Somervell, the master at Harrow who taught his charges to write English "as no one else has ever taught it." Churchill recalled his teacher's method:
Mr. Somervell had a system of his own. He took a fairly long sentence and broke it up into its components by means of black, red, blue and green inks. Subject, verb, object: Relative Clauses, Conditional Clauses, Conjunctive and Disjunctive Clauses! Each had its color and its bracket. It was a kind of drill. We did it almost daily. . . . I learned it thoroughly. Thus I got into my bones the essential structure of the ordinary British sentence, which is a noble thing. And when in after years my schoolfellows who had won prizes and distinction for writing such beautiful Latin poetry and pithy Greek epigrams had to come down again to common English, to earn their living or make their way, I did not feel myself at any disadvantage. Naturally I am biased in favor of boys learning English. I would make them all learn English: and then I would let the clever ones learn Latin as an honor, and Greek as a treat. But the only thing I would whip them for would be for not knowing English.
Today, of course, Rutgers and its champions of "critical grammar" would regard Churchill's emphasis on acquiring "the essential structure of the ordinary British sentence" as a primitive abomination. John F. Kennedy said of Churchill that he "mobilized the English language and sent it into battle"; there is little question that the power of Churchill's well-wrought English rhetoric helped save Western civilization in one of its darkest hours. (The power of that prose also earned Churchill the Nobel Prize for Literature in 1953.)
But English Department chairs in 2020 have more important goals than mere excellence in English. The achievement of an "anti-racist" classroom takes precedence over everything else, including elevating the skills and knowledge of students who struggle with standard English.
"In short," observes David Bernstein, a university professor and head of the Liberty & Law Center at George Mason University,
the Rutgers English Department wants to make sure that students who come to Rutgers with a poor grasp of standard written English not only remain in that state, but come to believe that learning standard English is a concession to racism. I remember when keeping "people of color" ignorant was considered part of white supremacy.
All this reminds me of the national uproar that was triggered in the 1990s, when school boards in Oakland and Los Angeles decided that black English should be recognized as a distinct language — "Ebonics" — and used in classrooms, much as students from Spanish-speaking homes were to be taught in Spanish. "Debate rages over whether Ebonics is a language or just slang," reported Kate Zernike, who was then The Boston Globe's education reporter. "But either way, Ebonics isn't simply about words. It's a philosophy, one where teachers preach about 'raising motivation,' 'reducing anxiety,' 'removing barriers,' and 'affirming self-concept.' More than anything, this is the highest crest of the self-esteem movement."
There was just one problem, which the Globe documented: "There is absolutely no hard evidence it works." In the Los Angeles school system, test scores at schools with Ebonics programs were plunging. But to the true believers, it was more important to validate the legitimacy of the ill-developed language skills the students brought with them to the classroom than to help those students achieve proficiency in standard English diction, grammar, and pronunciation before their time in classrooms came to an end.
Would W.E.B. Du Bois, Booker T. Washington, Malcolm X, and Martin Luther King, Jr., all brilliant American prose stylists, have regarded "critical grammar" as empowering — or demeaning?
As many commentators pointed out at the time, Black English — Ebonics — is not a language but a dialect. "It is a dialect we love, one that warms us, comforts us, and gives us community," wrote the black poet Patricia Smith, who was a Boston Globe columnist. "And I wish I could say that we couldn't care less how it sounds to the rest of the world."
But those of us who are bi-dialectical learned from people who did not speak the way we did. We listened to teachers, many of whom themselves had dialects, and our brains worked and we grasped concepts and ideas and theories. As black kids, we were introduced to a world we had to enter in order to survive, and then we were offered the tools to get there. What they're saying in Oakland is that those kids are too dumb to learn the way we did, and that's insulting.
I didn't grow up in a community that spoke Black English. Instead I was surrounded by grown-ups who spoke the heavily accented, nonstandard English of Eastern European immigrants. Many were like my father, a refugee from Czechoslovakia who immigrated to America in 1948. When he arrived, the only English he knew was what he had picked up on the boat coming over. But like millions of immigrants before him, and like scores of others he met after settling in Cleveland, he made learning English a priority.
Two nights a week he would take the bus to a public high school that offered English classes, and on a third night he would attend another English class at the Jewish community center. To practice their confusing new language, my father and a number of fellow immigrants formed a New Americans Club, which organized Sunday outings during which everyone was expected to speak English. His grammar never became perfect, and he never lost his accent, but for the past 70 years English has been my father's primary language.
America in the '40s and '50s didn't make life easy for non-English speakers, a fact for which I am enduringly grateful. My father was forced to learn English; it was the prerequisite to American life. I don't know that he would have been as diligent about getting on that bus three nights a week if Cleveland's banks had provided Slovak-speaking tellers, or if government forms could have been completed in Hungarian, or if there had been a "Press 2 for Yiddish" option when he had to contact the phone company or a utility. (My father was fluent in all three.) Not learning English wasn't an option. My father had to acquire the common American tongue. His life has been better for it, and so, consequently, has mine.
The Rutgers English Department trumpets its adoption of "critical grammar" as an expression of its commitment to "anti-racist" education. I'm sure that commitment is sincere and well-intentioned. But will it enhance its students' lives? Or will it do the opposite?
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
Don't scrap the filibuster. Restore it.
Democrats are increasingly confident of pulling off a trifecta in November — defeating Donald Trump, winning a majority of the Senate, and remaining in charge of the House of Representatives. If it happens, Democrats would have unified control of the White House and both chambers of Congress for the first time since 2009. But it still wouldn't give them unchallenged power to muscle through any piece of legislation the party supports — not as long as the Senate retains the legislative filibuster, which, under current rules, requires the support of at least 60 senators to bring a bill to a vote.
Consequently, pressure is growing within Democratic Senate ranks to abolish the filibuster if the party triumphs on Election Day. That would mean that the next Senate would need only a bare majority vote to pass everything on the progressive wish list, from the Green New Deal to statehood for the District of Columbia. During his 36 years in the Senate, Joe Biden always supported the filibuster as a key protection for the minority party against the tyranny of the majority party. But the presumptive Democratic nominee signaled last week that he will not object if his party moves to change the rules should it win a Senate majority in the fall.
In an interview with the New York Times, Biden was asked whether he supports doing away with the filibuster. "It's going to depend on how obstreperous they become," he said, referring to Republicans. "I think you're going to just have to take a look at it."
Wrangling over the filibuster isn't new. The practice is not mandated by the Constitution; it's merely a creation of the Senate rules, and senators have debated for years the form it should take. Before 1917, debate in the Senate had no limit — as long as any senators were prepared to keep discussing a measure, it could be kept from a vote indefinitely. In 1917, prodded by President Wilson, the Senate adopted a rule modifying the filibuster: For the first time it authorized "cloture" — the closing of a debate — if two-thirds of the senators agreed. In 1975, the threshold needed to invoke cloture was lowered to 60 votes, which is where it remains today, at least in theory.
But it's not that simple.
The filibuster has on the whole been a good rule, one that restrains majorities of either party from riding roughshod over the minority. Under Mitch McConnell and the Republicans, no less than under Harry Reid and the Democrats, the filibuster has prevented the Senate from becoming merely a smaller version of the House of Representatives, where the minority party has no bargaining power at all, and debate is ruthlessly curtailed.
It is true, as some on the left have been fuming lately, that as long as it takes 60 votes to get a controversial bill passed, Democrats' most radical proposals will never be enacted, even if Biden takes the White House and Chuck Schumer becomes majority leader. But that has been the reality on the right, too. Despite having a Republican president and Senate majority, the GOP has not been able to repeal Obamacare or deliver the border wall Trump craves: It has never been able to muster the necessary 60 votes. That explains why Trump has been as adamantly anti-filibuster as, say, Elizabeth Warren. "Republicans must get rid of the stupid Filibuster Rule — it is killing you!" he tweeted in 2018 in a typical denunciation. On that issue, if on no other, the president and the Massachusetts senator are in sync.
So should senators dump the filibuster?
No. They should restore it.
At their best, filibusters can be a valuable restraint on congressional overreach and runaway populism. Used prudently, a filibuster shelters the minority's right to be heard and pumps the brakes on surging legislative enthusiasm. The problem with the filibuster isn't that it exists, but that its use has become both routine and invisible. How this paradox came to be is a classic tale of unintended consequences, which I recounted in a column last year:
It used to be that any senator or group of senators could indefinitely block a vote on a bill the way Jimmy Stewart did in "Mr. Smith Goes to Washington" — by taking the floor to speak and refusing to stop until the majority agreed to give ground (or exhaustion overtook the speaker). Critically, while a filibuster was underway, all other Senate business was suspended.
But in 1970, then-Majority Leader Mike Mansfield introduced a "two-track" system, under which a bill being filibustered would be set aside so the Senate could take up other matters. Mansfield doubtless expected the change to make filibusters less attractive by stripping them of their power to gridlock the Senate. Instead, the number of filibusters soared. Or rather, the number of threatened filibusters soared. Those threats never had to be made good. The mere announcement that Senator X intended to filibuster Bill Y created a de facto requirement for a supermajority to move the legislation forward. Soon it was taken for granted that nearly every bill needed 60 votes to pass.
The solution to this problem isn't to eliminate filibusters altogether, but to eliminate the two-track system that has made them ubiquitous. Senators were far less likely to undertake a filibuster back when they knew that doing so would bring the Senate to a halt. It was a weapon used sparingly. During the entire 19th century there were only 23 filibusters. Since 1970 there have been more than 1,000.
The Senate can make filibusters rare again by making them real again. A determined minority should have the ability to resist passage of a measure they find intolerable. But they should also have to demonstrate their resistance the hard way — by taking the floor, staying on their feet, speaking without letup, and facing the consequences. Then and only then should it require a supermajority to cut off debate and vote.
All the talk of whether Democrats should eliminate the filibuster if they take control of the Senate is missing the point. The filibuster was eliminated decades ago, and it was replaced with something more like a genteel blackball — a mere agreement-by-courtesy not to take up a bill. Mansfield's innovation hasn't served anyone well, and it's long past time to undo it. Senators who wish to block a piece of legislation from coming to a vote should be required to fight it on the floor of the Senate. If they aren't willing to do that, then they have no right to complain when it passes with a simple majority.
Democrats and Republicans, debating whether to get rid of the filibuster, are on the wrong track. What they should really be debating is how to get it back.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
Politicians who don't give
One of my litmus tests for political candidates is generosity. Other things being equal, I'm inclined to vote for candidates who are prepared to spend money on worthy causes.
To spend their own money, that is.
Many would-be presidents, senators, and representatives are voluble on the subject of "generosity" when they are talking about the ways in which they want the government to spend public funds. As Senator Ed Markey runs for reelection, for example, he has plenty to say on his campaign website about all the money he has been instrumental in redirecting from the taxpayers who earned it to other recipients and purposes — such as the $200 million he backed for the creation of a flu vaccine, or the $5 billion of annual investment in intercity passenger rail projects, or the $25 million for gun violence research, or the $36.5 billion for critical relief in Puerto Rico. There is even a "Markey Map," on which voters are invited to "search to see how Ed Markey has delivered for your community."
But I derive no insight into Markey's character or qualifications for service from a litany of all the ways he arranges to spend money that he doesn't work for. By my lights, there is no virtue in voting to appropriate funds from the public treasury. I look for evidence of compassion or benevolence not in how openhanded Markey is in giving away our money, but in how liberally he gives away his own.
This month, Markey released his tax returns for the last seven years, which makes it possible to draw some conclusions on that score. They aren't very inspiring.
From 2013 through 2019, Markey's adjusted gross income totaled $1,248,441, an average of just over $178,300 per year. (His wife's tax returns were filed separately.) Over those seven years, he donated a grand total of $49,756 to charity, for a yearly average of about $7,100. As a percentage of his adjusted gross income, Markey gave 3.9% to charity. That share would have been even lower had Markey not dramatically increased his charitable donations in 2018 and 2019, as he was gearing up to run for reelection.
Representative Joe Kennedy III, who is challenging Markey in the Democratic primary, has an even less impressive record of charitable giving. Kennedy and his wife released tax returns covering six years, during which their joint adjusted gross income totaled $2,093,460, for an annual average of $348,910. Over the course of those six years, the Kennedys donated $49,597 to charity. That amounted to only 2.3% of their very substantial personal income from 2013 through 2018.
Of course there is no requirement that Markey, Kennedy, or any other public official give anything to charity. I'm glad that they give something. And in fairness, there are politicians who give far, far less. Representative Richard Neal, the "dean" of the Massachusetts congressional delegation and the chairman of the House Ways and Means Committee, recently released eight years' worth of his tax returns. They show that from 2011 through 2018, he earned more than $1.9 million in income, yet he donated a grand total of just $7,750 to charity — a paltry four-tenths of 1% of his earnings. As he campaigns for reelection, Neal has been talking up all the funds he has directed to his Western Massachusetts district from Washington, DC. To my mind, it says far more about Neal that he gives such negligible amounts of his own money to charities that help the poor, the sick, or the homeless.
Neal isn't alone, unfortunately. Many prominent politicians give little to charity — which doesn't stop many of them from lecturing Americans about the greed of the "billionaire class" or the need for more government spending to show "compassion." On the other hand, some politicians give great swaths of their income to charity as a matter of course. Mitt Romney's tax returns, for instance, have consistently documented levels of philanthropic giving far above the norm, not just in dollar terms but as a percentage of income. When Barack Obama was in the White House, his charitable giving topped $1 million.
For millions of Americans, charitable giving is a regular household expense — they would no more neglect to spend money on worthwhile philanthropic causes than they would overlook their rent payment or the grocery bill. I find that attitude entirely normal. I grew up watching many of the adults in my world give some coins to charity every day without fail. In the elementary school my siblings and I attended, kids were taught from an early age about the importance of giving to help others, and we would contribute our pennies daily when the charity box made the rounds.
I understand why politicians brag about how much money they have extracted from the government for this or that program. But when I want to form an opinion of a candidate's character, I find it much more illuminating to consider how much they have extracted from their own pockets. Or haven't.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
The Last Line
"And that's the end!" — Bugs Bunny in A Wild Hare (July 27, 1940 — the first official appearance of Bugs, 80 years ago today.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
(Jeff Jacoby is a columnist for The Boston Globe.)
-- ## --
Follow Jeff Jacoby on Twitter or on Parler.
Discuss Jeff Jacoby's columns on Facebook.
Want to read more Jeff Jacoby? Sign up for "Arguable," his free weekly email newsletter.