Tuesday, October 30, 2018

Forget quantum laptops, our quantum computing future is in the cloud

New Scientist - Home
Australian of the Year Michelle Simmons is hoping her work building a new type of quantum computer can solve problems we don't even know about
CNET News
Bitcoin is actually going to ruin the world, climate scientists say - CNET
Bitcoin requires electricity to exist, lots of it. Here's why that's bad news for our planet.
Five biggest mysteries about Apple's next MacBook - CNET
If there is indeed a new slim, reasonably priced mainstream Apple laptop coming this week, almost everything about it is going to be a surprise.
OnePlus 6T trade-in deal: $300 off at T-Mobile - CNET
That's not "up to" $300 -- that's $300, period, when you trade in an eligible phone. And that list includes nearly three dozen models.
Science says sand flies like, totally get the munchies for marijuana - CNET
The biting bugs appear to be willing to take long flights to find a fix.
The New Republic
Are Hurricanes Changing How We Talk?

Linguists have figured out a lot about the many different regional dialects of American English. They know why Brooklynites say “cawfee,” for example, and why Bostonians say “Hahvahd Yahd.” They’ve traced the history of our accents and phrases: how migration patterns influenced their development, and how technology shaped their recent evolution. But what do they know about how Americans will talk in the future?

Not a lot. Outliers in the field believe that the dialects of American English will die—or at least slowly fade—as technology expands connections between people from different regions. But everyone agrees that Americans won’t always speak like we do today. And one of the best ways to predict those future changes is to study places where American speech is shifting right now.

One of the most interesting such transformations is happening in New Orleans, according to Katie Carmichael. The linguistics researcher and assistant professor at Virginia Tech has been researching the Southern Louisiana city’s unique drawl for years. “It changes every six months, who lives there and what they’re arguing about,” she said. “Every time I go back there, it’s such a different place.”

In years of conversations with New Orleans residents, however, Carmichael has noticed one thing that’s always the same. “There’s not a single person who doesn’t bring up Hurricane Katrina,” she said. That observation led Carmichael to develop a unique hypothesis: Maybe the hurricane that devastated New Orleans in 2005 did more than just change the city’s physical and cultural landscape. Perhaps it altered how New Orleanians speak, too.

New Orleans English can be confusing to those who aren’t familiar with it. “Some of them speak with a familiar, Southern drawl; others sound almost like they’re from Brooklyn,” Jesse Sheidlower, the former editor-at-large of the Oxford English Dictionary, wrote in a definitive explainer of the dialect in 2005, shortly after Katrina. “Why do people in New Orleans talk that way?”

The “rich level of linguistic diversity” in New Orleans stems, in part, from its diverse migrants, he wrote. There were the French and Acadian settlers, of course, as well as Spanish, German, Irish and Italian immigrants. New Orleans was also “a gateway for the slave states, which brought in speakers of a variety of African languages.” The result was the hodgepodge of subdialects that exist there today, which change depending on the neighborhood or ethnic group.

The 24 regions of American English, according to Long Island University’s Robert Delaney. (Image: Robert Delaney)

Carmichael can’t yet say specifically how New Orleans English has changed since Katrina. But last week, the National Science Foundation awarded a $250,000 grant to Carmichael to explore her hypothesis, along with Tulane University linguistics professor Nathalie Dajko. Together, they “will interview 200 lifelong residents of different ethnic backgrounds and neighborhood origins to collect the largest and most diverse data sample ever assembled in the city,” according to a press release. The results could help researchers predict other dialect changes that might occur as sea-level rise and more severe storms force Americans away from the places that once shaped their speech.

Carmichael’s hypothesis is based on two documented effects of Katrina, both known to spur dialect change: widespread displacement of native speakers from the region and widespread migration of non-native speakers into it.

Katrina’s flooding displaced 400,000 people—nearly the entire population of the city. A year later, only about 53 percent of those people had returned—“less than a third at the home they’d lived in prior to Katrina,” according to CityLab. Today, the city has around 80 percent of the population it once had, though it has been difficult to track with precision how many residents are newcomers and how many are longtimers. One thing is for sure: New Orleans is a smaller, whiter city than it used to be. There is thus concern that some New Orleans subdialects could be at risk of disappearing. After all, Carmichael said, “Some of our major cities in the South no longer sound Southern because economic prosperity brought Northerners in.”

A severe example of climate change–fueled displacement is happening not far from New Orleans at Isle de Jean Charles, a Louisiana island that’s slowly sinking into the ocean. The community there is one of the last vestiges of Louisiana Regional French, and some scholars worry that an exodus of residents—who are often called America’s first climate refugees—will eradicate that dialect. “Sometimes when people move away, they leave the language behind,” said Dajko, who is writing a book about linguistic changes on the island.

There is hope, though, for both New Orleans English and Louisiana Regional French, and it rests in part on the strong sense of identity that comes with a unique way of talking. If people feel their identity is threatened—that their dialect is fading—they may work harder to preserve it. Carmichael hypothesizes that as a result of Katrina, some ethnic populations in New Orleans may retain their unique older dialects for longer than they would have otherwise.

While she and Dajko work to figure that out, though, they hope the mere idea of their research will provoke others to think about how climate change might affect American English in the future. “These mass migration scenarios are more likely to happen again,” Carmichael said. That’s not only because of hurricanes, which are getting more intense, but also due to sea-level rise.

If warming continues unabated, up to 13.1 million Americans living in coastal areas could be permanently displaced. More than one million people are at risk in Louisiana, further threatening the regional dialect. But so are 900,000 in New York, 800,000 in New Jersey, and 400,000 in Texas. “Climate change is going to be a catalyst for a mass migration that will result in language change,” Dajko said. What that heralds for y’all and yous guys remains to be seen.

The Promise of Polarization

How divided have Americans become? When it comes to the two-party war, the differences could not be starker. Pew Research Center has reported that 55 percent of Democrats are “afraid” of the Republican Party and nearly half of Republicans are similarly fearful of Democrats. These survey results were published in June 2016—before Donald Trump was elected. Since then, of course, the enmity has increased. Trump’s genius for stirring up discord is one reason, but only one: The ingredients of all-out political warfare have been simmering for many years, as each of the two parties has discarded the old-fashioned ideal of the “big tent” and enacted its own purifying rituals.

What has changed is how personal these political divisions have become. Partisanship has taken on an unsettling aspect and turned into something new: “affective polarization,” which dictates not only how we vote, but also, as social scientists have reported in the Harvard Business Review, how we “work and shop.” Politically minded consumers are “almost twice as likely to engage in a transaction when their partisanship matched the seller’s,” and they are “willing to work for less money for fellow partisans.” Is this honorable self-sacrifice or self-inflicted injury? It is hard to say, especially since, when it comes to political dispute, “particular policy beliefs” are often beside the point, the researchers write. What matters is who wants the new bill passed and who wants it stopped. It’s a zero-sum game in which victories are less important than the other side’s defeats.

THE POLARIZERS by Sam Rosenfeld. University of Chicago Press, 336 pp., $30

Yet, as Sam Rosenfeld shows in The Polarizers, the irrational-seeming “extreme partisanship” and “tribalism” that contaminate our politics today originated in the principled efforts of writers, activists, and politicians who thought the two parties needed more polarization, ideological fixity, and internal discipline. This idea went back to the New Deal era, when the two major parties were each riven by internal disagreements on race, the economy, and much else, so that President Roosevelt met opposition in Congress not only from Republicans but also from Southern Democrats. He tried to fix the problem, first mounting a campaign to purge conservatives from the Democratic Party in the 1938 midterms (it backfired) and then inviting the moderate Republican nominee he defeated in 1940, Wendell Willkie, to join him in a plan to break apart the two parties and reset them like straightened limbs, “one liberal, and the other conservative.”

Today that course seems fatefully misguided, but Rosenfeld is right to point out that what came before wasn’t always better. What some enshrine as an age of “statesmanlike civility and bipartisan compromise” often involved dark bargains and “dirty hands” collusions, and was not especially democratic. This is what led political scientists such as E. E. Schattschneider and James MacGregor Burns to argue in the 1940s and 1950s against bipartisanship, because it depended on toxic alliances that hemmed in political players, from presidents on down. Thus, even the immensely popular war-hero Dwight Eisenhower, the first Republican president elected in 24 years, was stymied time and again by in-built flaws in a defective system. Eisenhower wanted to do the sensible thing—to advance civil rights and economic justice at home while negotiating abroad with the Soviet Union. He repeatedly came up against a stubborn alliance of conservative Southern Democrats and heartland Republicans.

Out of all this came the drive to reform the two parties, to make them more distinct through what Rosenfeld calls “ideological sorting.” The hope was that clear agendas, keyed to voting majorities, would marginalize the reactionaries and extremists in both parties, and that mainstream, “responsible” forces would govern from the center, giving the public the expanded, activist government it obviously wanted. This was the initial promise of polarization. What went wrong?

For one thing, Schattschneider and Burns were viewing the system from the heights of presidential politics, where centrism did indeed dominate. The ideological distance from FDR in 1932 to Eisenhower’s successor Richard Nixon, elected in 1968, was not great. World War II and cold war “wise men” could be either Republicans or Democrats. They belonged to the same establishment, attended the same Ivy League colleges, were members of the same clubs, read the editorial pages of the same few newspapers. Two parties organized around such leaders could each have presented a coherent agenda, one to the left of center, one to the right, meeting in the middle.

It was the consensus ideal, and it ignored deeper tensions in parts of the country where politics was harder-edged and culturally driven. An ideology nourished in the small-town Midwest and rural South and in the growing population centers of Western states resented and opposed the approach, style, and transactional presumptions of East Coast elites. And this resistance found support from right-wing intellectuals, heirs to pre-World War II “Old Guard” conservatism. Its best minds coalesced around National Review, founded as an anti-Eisenhower weekly in late 1955. Rosenfeld has much to say about the magazine, but he leaves out its most original and penetrating thinker, the Yale political scientist and NR columnist, Willmoore Kendall. An incisive critic of the Schattschneider-Burns thesis, he helped coin the term “liberal Establishment” and theorized that proponents of the “presidential majority” seemed to be wishing away the second, “congressional majority” elected every two years and therefore more directly accountable to voters.

Burns could argue that the “true” Republican Party naturally reflected Eisenhower’s internationalism, because influential people—including the publishers of The New York Herald Tribune and Time magazine—approved of him. But much of the GOP base gave its loyalty to local figures, whose views more closely resembled their own on the whole range of issues: civil rights and civil liberties, military spending and foreign aid, free trade and the national debt, even “the scientific outlook.” When it came to these matters, the people’s tribune wasn’t Eisenhower, the five-star general, who had been the “supreme commander” of NATO and the president of Columbia University. It was Senator Joseph McCarthy, who became the hero to the emerging postwar right. His most eloquent defender, National Review’s editor, William F. Buckley Jr., applauded McCarthy’s Red-hunting investigations and ridiculed the tu quoque hypocrisies of McCarthy’s “enemies”—liberals and moderates in both parties.

Rosenfeld is curiously silent about all this. He praises Buckley’s 1959 manifesto Up From Liberalism, calling it a “thorough formulation of the connection between building a conservative ideological movement and recasting the party system.” In fact, Buckley said little about this, apart from restating the case for McCarthy. It was puzzling to readers, including some on the right, that Buckley never got around to saying what conservatism meant or even what conservatives should do. When he talked about policy, it was mainly to denounce liberal proposals—on voting rights, health care, battles between labor and management—without offering any serious alternative in their place. What would a truly conservative administration do if elected? Buckley had no idea. “Call it a No-program, if you will,” he cheerfully wrote, in words that sound like marching orders for today’s GOP. Undoing or rolling back the New Deal and post-New Deal programs already in place would suffice. “It is certainly program enough to keep conservatives busy.”

Buckley wasn’t being flippant. He was being honest. Conservatives really did have no interest in social policy. National Review writers excelled at philosophical theory and high rhetoric, but when the subject turned to “a crucial policy issue such as Medicare, you publish a few skimpy and haughty paragraphs,” Buckley’s friend Irving Kristol complained in 1964, when it was clear some kind of national health care for the elderly was going to be enacted, expanding the popular protections in Social Security. “Why not five or six pages, in which several authorities spell out the possible provisions of such a bill?” Kristol urged. “It could really affect the way we live now.” Buckley wasn’t interested, and Kristol plugged the hole himself with The Public Interest, the quarterly he founded with Daniel Bell in 1965. It was one of the era’s best journals, filled with well-written analysis and incisive commentary on the entire range of midcentury policy. But in the end, Buckley was right. As Rosenfeld says, it was National Review that gave direction to the conservative revolution and made the GOP better organized and more ideologically unified than the “polarizers” of the ’40s and ’50s could imagine.

Buckley’s brother-in-law, L. Brent Bozell, was a key figure in translating these ideas into political strategy. He brilliantly repackaged Buckley’s “No-program” in a tract he ghostwrote for Barry Goldwater, The Conscience of a Conservative, meant to launch a shot-across-the-bow challenge to Nixon in 1960. In a famous passage, Bozell and Goldwater project a vision of the ideal “man in office,” the savior of the Republic, who tells the people,

I have little interest in streamlining government or in making it more efficient, for I mean to reduce its size. I do not undertake to promote welfare, for I propose to extend freedom. My aim is not to pass laws, but to repeal them.

When the book became a best-seller and the guessing game of authorship began, Goldwater insisted he had written it—or that it grew out of his speeches and published writings (never mind that they’d been ghosted too). Under normal conditions, few would have cared—John F. Kennedy didn’t write his books either. But Goldwater was being marketed as a bold political thinker. Rosenfeld perpetuates this myth, the better to present Goldwater as a serious-minded intellectual who “framed his positions on disparate issues within an overarching ideological vision.” That vision consisted of libertarian economics at home and militant anti-Communism abroad. Goldwater didn’t come close to getting the nomination. Nixon did, as expected, and then lost, barely, to John F. Kennedy—another victory for the liberal Establishment.

Ronald Reagan and Barry Goldwater join William F. Buckley and his brother at a National Review party. (Bettmann/Getty)

Goldwater was too good a politician to chain himself to a single script, especially a losing script. It was dawning on some that Kristol had got one big thing right. The public really did want government programs, as long as the benefits accrued to them and not someone else. In early 1961, getting a jump on the next election, a second Goldwater ghostwriter, Michael Bernstein, drafted a prescient document, the “Goldwater Manifesto” or “Forgotten American” speech. It sketched out the beginnings of what later came to be called big-government conservatism—a reordering of spending away from the poor and minorities (singled out for help by Kennedy’s New Frontier) and toward a newly aggrieved group, “the silent Americans,” who truly “constitute the substantial majority of our people” and yet “cannot find voice against the mammoth organizations which mercilessly pressure their own membership, the Congress, and society as a whole for objectives which these silent ones do not want.”

What might the silent ones want instead? For one thing, Bernstein proposed, “tax relief for families with children attending college.” NR purists were appalled. This was still Big Brother—manna flowing from the Beltway—even if, in this case, the money was going back to overburdened taxpayers. In embarrassment, Goldwater backed away and made a new calculation. The most numerous “silent” votes were to be had in the South. White majorities there felt disrespected or worse by the presidencies of Kennedy and his successor, Lyndon Johnson. Civil rights was the pivotal issue, but not the only one. In fact, it overlapped with other tensions: in labor unions, public education, housing, anti-colonial uprisings abroad. Below the calm surface of consensus, a deeper struggle was going on. “There is a vague and bitter counter-revolution in this country—anti-big government, anti-union, anti-high taxation, anti-Negro, anti-foreign aid, and anti-the whole complex spirit of modern American life,” James Reston, The New York Times’ Washington bureau chief and most respected columnist, wrote in 1963, when Goldwater was the uncrowned king of an increasingly conservative GOP. The center that Schattschneider and Burns had counted on was coming apart.

What Reston missed was the sophistication of Goldwater’s rhetoric, helped along by the writings of Buckley, Bozell, and Bernstein. He overlooked as well the Southern strategy devised by NR’s publisher, William Rusher. It wasn’t a new idea. Goldwater’s first stab at the presidency, in 1960, had begun in South Carolina, when he won the delegates at the state Republican convention, catching Nixon off-guard. It was his first successful “duck hunting” expedition—that is, courting the votes of middle-class whites in the “New South,” with its rising business class. Uncomfortable with the overt race-baiting of Dixiecrats, these voters responded to a broader argument cast in the language of states’ rights and free enterprise, the true pillars of the constitutional republic as opposed to the Democrats’ promise of egalitarian democracy. You could make this case, and Goldwater did, without mentioning race at all. Buckley made the same adjustment. Instead of saying black people were inferior—National Review’s line in the 1950s—he now argued that Goldwater “does not intend to diminish the rights of any minority groups—but neither does he desire to diminish the rights of majority groups.”

While Democrats had become the party of civil rights, the Republican Party, without explicitly saying so, “was now a White Man’s Party,” as Robert Novak put it in his account of the 1964 election, The Agony of the G.O.P. The transformation began in earnest when Senator Strom Thurmond quit the Democratic Party, taking South Carolina’s electoral votes with him, and was welcomed into the GOP by his good friend Goldwater. Thurmond the defecting Democrat was joined by younger Southern politicians nourished within the GOP. These were figures like James Martin, who challenged and nearly unseated Lister Hill, the four-term incumbent Democratic senator in Alabama, in 1962. Martin was elected to the House in 1964, together with five others from the South, four of them from states—Tennessee, Texas, Florida, and Kentucky—that today contribute to the GOP’s base. Canny operatives like the Alabama prodigy John Grenier (oddly absent from Rosenfeld’s book) rose to top positions in Goldwater’s campaign. Its victories came almost entirely from the Deep South.


Outside the South (and his home state, Arizona), Goldwater got a thrashing in 1964. But he had opened up the route to what the political strategist Kevin Phillips soon called the “emerging Republican majority,” which nationalized the Southern strategy by courting alienated white voters in the North as the civil rights movement moved there; by focusing on racially charged issues like “forced busing” and the integration of labor unions, the GOP drove a wedge in what had once been Democratic strongholds. In 1968, Richard Nixon dusted off Bernstein’s “forgotten man” speech and made it the template for his appeal to the “silent majority,” as Garry Wills reported in his classic Nixon Agonistes. Like Goldwater, Nixon cast tribal politics in lofty ideological terms. He talked of “positive polarization” and promised to overturn “the false unity of consensus, of the glossing over of fundamental differences, of the enforced sameness of government regimentation.” Ronald Reagan, preparing to run in 1976, went even further, warning that if Republicans continued “to fuzz up and blur” the differences between the two parties when they should be “raising a banner of no pale pastels, but bold colors,” he might quit the GOP and form a third party. Instead he contested and badly weakened the incumbent Gerald Ford. Four years later, Reagan repeated the Goldwater and Nixon formula, rechristening the “forgotten American” and “silent majority” as the “moral majority,” and won in a landslide.

For all this talk of the fundamental differences between the parties, however, partisanship did not yet reach today’s poisonous extreme. Nixon and Reagan, experienced leaders, ran “against” government while also realizing there were very few programs the voting public would be willing to do without. Once in office, Republicans too were expected to make the system work. Democrats, with their long history of taking public policy seriously, were, however, better at it—as some conservatives acknowledged. In his influential book Suicide of the West, Buckley’s colleague James Burnham quoted Michael Oakeshott, who said fixing social problems was the liberal’s ambition, or delusion. While the liberal “can imagine a problem which would remain impervious to the onslaught of his own reason,” Oakeshott wrote, “what he cannot imagine is politics which do not consist in solving problems.” The conservatives’ job was to apply the brakes when necessary, to keep alive the opposition argument in a world in which all knew liberalism remained the basis of modern governance but weren’t always prepared to admit it.

This broad but tacit acceptance of activist government is what inspired the Democrat Daniel Patrick Moynihan to take a job in Nixon’s administration in 1969. He gambled that a moderate Republican, who said he disliked government but realized voters wanted it, might succeed in passing legislation where Democrats had failed. Despite encountering resistance from the “congressional majority,” Moynihan was vindicated. The Nixon years gave us a good deal of effective government. They saw the creation of the EPA, wage-and-price controls, the Equal Employment Opportunity Act, Supplemental Security Income (for the blind, disabled, and elderly), Pell Grants (college grants for lower-income students), the Endangered Species Act, and more. It was a “rich legislative record,” as the political scientist David Mayhew has written. The reason is conveyed in the title of Mayhew’s book, Divided We Govern, which showed how well government worked when voters split tickets and gave each party control of a different branch.

Rosenfeld’s thesis—that the postwar enthusiasm for ideologically unified parties yielded some positive good—works better when he turns to the Democratic Party, which really did clean house, cutting loose Southern reactionaries to make itself the party of civil rights. Stalwarts of the Senate “citadel” like Harry F. Byrd and Richard Russell lingered, but with diminished authority as civil rights became the party’s great cause, and Northern liberals—the Minnesotans Hubert Humphrey and Eugene McCarthy, to name two, and the Prairie populist George McGovern—gained national followings. There were also the brave organizing efforts of college students, white and black, who mobilized citizens in the South. Rosenfeld has very good pages on the 1964 Democratic convention, when members of the Mississippi Freedom Democratic Party, led by the activists Bob Moses and Fannie Lou Hamer, challenged the Dixiecrats. Their victory was symbolic, but politics is often written in symbols.

One wishes Rosenfeld had more to say about other political figures, particularly black leaders such as the Rev. Martin Luther King Jr., Julian Bond, and Shirley Chisholm, who guided the Democrats’ response to the most important polarization in America. Kendall’s “two majorities”—one “presidential,” the other “congressional”—only grazed the surface of a nation profoundly split into “two societies, one black, one white—separate and unequal,” to quote one of the period’s great public documents, The Kerner Report. Published in 1968 after a year of investigation by a presidential advisory commission, the report explored the causes of the urban disorder in almost 150 cities—especially Detroit and Newark—in the summer of 1967.

In April 1968, while the Kerner commission findings were still being digested, King was assassinated, and the two societies hardened along lines that prefigure today’s jagged divisions. Trump’s truest forerunner, many have pointed out, was the one true radical in the 1968 presidential campaign, the Alabama segregationist George Wallace, a lifelong Democrat who ran on a third-party ticket and preached a Trump-like gospel of revenge. “The desire for ‘law and order’ is nothing so simple as a code for racism,” Garry Wills wrote of Wallace’s message at the time. “It is a cry, as things begin to break up, for stability, for stopping history in mid-dissolution.” Fifty years ago, “middle America” already yearned to make their country “great” again.

In truth it was becoming great—or better, anyway. Rosenfeld’s book, though the last pages rush through the years between 2000 and 2016, says very little about President Barack Obama, whose two terms were a model of “responsible party” politics, ideologically moored but also pragmatic and aimed at the broad middle of the electorate. It led to much good policy, and to the strong economy that is now buoying Trump’s presidency. Why does Rosenfeld have nothing to say about Obama? One answer might be that Obama was detached from the Democratic base: It steadily eroded during his two terms, especially at the all-important state level, as Nicole Narea and Alex Shephard wrote soon after Trump was elected. The Republicans, meanwhile, had diligently rebuilt from the bottom up, bringing about today’s “relentless dynamics of party polarization” and a climate of “factional chaos.”

Rosenfeld blames our current partisan gridlock on the system’s “logic of line-drawing.” But he also warns that “any plausible alternatives to the rigidities and rancor of party polarization might well prove to be something more chaotic and dangerous.” What can he mean? He points to the dangers of “pragmatic bargaining” and to the unprincipled compromises that might take the place of “effective policymaking.” This, he worries, would leave us with the same problems Schattschneider and Burns identified decades ago. Yet the last half-century of legislative history suggests something very different: The only coherent policies we’ve seen in decades—from the great civil rights legislation of the 1960s through Medicare and then Reagan’s tax reform in the 1980s—owe their passage to exactly the bipartisanship Rosenfeld finds corrupting. The lone recent instance of one-party rule creating a powerful piece of legislation is the Affordable Care Act, and the bill was vulnerable to attack precisely because no Republicans in either the House or the Senate voted for it and so had no stake in protecting it.

In one important way, however, Rosenfeld could be right about the ultimate benefits of polarization. In the Desolation Row of the Trump era, “Which side are you on?” has become the paramount question. Trump’s coarseness has invigorated the forces of resistance: A politer figure would not have given us the Access Hollywood tape, and the brazen denials afterward, and would not have fed the outrage that burst into public consciousness with the “Me Too” movement. So too Trump and Paul Ryan’s failure to come up with a workable replacement for Obamacare—a failure rooted in half a century of a “No-program program”—has given Democrats one of their most potent issues in the midterms. And the excesses of House Republicans, especially the foot soldiers in the Freedom Caucus, may well create opportunities for another disciplined group whose presence has been growing on the other side, the Congressional Black Caucus. If these changes come, polarization will be a major reason. The most enduring accomplishment of Trump and Trumpism— the latest, most decadent stage of the American right—could be the rebirth of an authentic American left.

Yes, Trump Is Culpable

“Bring the war home.”

That’s what protesters at the University of Wisconsin chanted in early 1970, denouncing defense-related research at the school during the Vietnam War. Later that summer, four men brought the war to campus. They set off a bomb at the Army Mathematics Research Center, killing a physics graduate student and injuring three others.

Critics were quick to blame the entire antiwar movement. But the attack also triggered sober reflection among the protesters, who asked themselves whether their increasingly aggressive language had encouraged it.

That’s precisely the kind of honest, good-faith reckoning that’s been missing among most Republicans in recent days. Last Friday, police said that a Florida supporter of President Donald Trump, Cesar Sayoc, was responsible for sending explosive packages to Barack Obama, Hillary Clinton, and several other prominent Democrats. Then, on Saturday, 11 people at a Pittsburgh synagogue were murdered by alleged gunman Robert Bowers, who reportedly didn’t vote for Trump but had posted diatribes echoing the president’s overheated rhetoric about the caravan of migrants from Central America.

White House officials responded defensively after both attacks, denying that Trump’s language might have helped provoke them. “The president is certainly not responsible for sending suspicious packages to someone, no more than Bernie Sanders was responsible for a supporter of his shooting up a Republican baseball field practice last year,” White House Press Secretary Sarah Huckabee Sanders said, referring to the attack that critically injured Republican Congressman Steve Scalise.

But Bernie Sanders hasn’t encouraged people to behave violently; Trump has. He urged supporters at a rally to “knock the crap” out of anti-Trump protesters, promising to pay any resulting legal bills. He has repeatedly praised Montana Congressman Greg Gianforte for body-slamming a news reporter. And he grinned happily last week while supporters chanted “CNN sucks,” just days after the network received one of Sayoc’s mail bombs and mere hours after news reports showed Sayoc’s van bearing a bumper sticker with the same slogan.

And Trump certainly fueled anti-Semitic theories that the Central American caravan is funded by Jewish philanthropist and top Democratic donor George Soros, which was a recurring theme on Bowers’s social media account. Trump didn’t name Soros, whom Sayoc also targeted with a bomb, but warned that the caravan “didn’t just happen” and that “a lot of money” was “passing” to it from outside. That’s a clear nod to conspiracists like Bowers.

On Friday, Trump acknowledged that Sayoc “preferred” him over other political candidates. But he refused to admit that his own behavior could have encouraged Sayoc’s assassination attempts. “There’s no blame,” Trump declared. “There’s no anything.” 

His comment brought me back to the 1970 bombing at the University of Wisconsin, which shook the antiwar movement to its core. Instead of simply distancing themselves from the attack, protesters asked hard questions about how their own actions might have helped provoke it.

The same bombers had previously tossed a firebomb into an ROTC classroom at the university. And a few days before the 1970 New Year, they stole a plane and dropped a series of explosives onto a nearby ammunition plant. Nicknamed “The New Year’s Gang,” they were lionized as romantic heroes by Wisconsin’s student newspaper and other protesters at the university. 

But after the destruction of the math building and the death of the physics student, a 33-year-old father of three, protesters changed their tune. The bombing “was so extreme and unjustifiable and horrible, it stopped us in our tracks,” one activist said. Another noted that the attack “caused a lot of soul searching.” It was “a very pointed reminder ... that you can’t persuade people of the sanctity of human life by being recklessly unmindful of human life.”

When will Republicans search their own souls about the reckless rhetoric that provoked Sayoc and Bowers? Several GOP senators have released generic statements denouncing violence and calling for civility. But all of these comments ring hollow when they exempt the man who has done more than anybody else to encourage violence and erode civility. “Look, everyone has their own style. And frankly, people on both sides of the aisle use strong language about our political differences,” Vice President Mike Pence told NBC News on Saturday. “But I just don’t think you can connect it to threats or acts of violence.”

It’s true that Congresswoman Maxine Waters and a few other Democrats in Congress have engaged in their own kinds of incivility, such as calling on supporters to confront Trump administration officials in restaurants. But the Democratic leadership has unequivocally condemned their behavior. 

I’m not expecting Republicans to stop supporting Trump’s policies, just as the protesters in Wisconsin weren’t about to abandon their opposition to the Vietnam War over the actions of the New Year’s Gang. But not a single prominent Republican has stood up over the past few days to state the obvious: Trump’s violent rhetoric encourages violent action.

Decent people can endorse Trump’s views on immigration, health care, taxes, and so on, but decent people do not speak and act like he does. It’s time for Republican leaders to say that, clearly and unequivocally. If they don’t, they will be complicit in any further right-wing political violence that mars the country—and will be remembered as cowards and opportunists, who put their own immediate political concerns above the fate of America.

How Would a Good Leader Respond to Jamal Khashoggi’s Killing?

In 1919, the famed German theorist Max Weber gave a lecture to a group of idealistic left-wing students in Munich. It was a time of political shifts: Germany had lost the First World War, and revolution was in the air. For young students, politics must have seemed an attractive outlet for shaping a better world. Weber wanted them to be under no illusions, however. If salvation was what they wanted, for themselves and for others, they had the wrong idea. Politics is not just about doing what’s morally right, he warned them, in a lecture that would become his classic “Politics as a Vocation” essay. Or rather, what is morally right is not that straightforward in politics.

What “any person who wants to become a politician” needs to understand, said Weber, is an “ethical paradox” at the heart of politics: the contrast between the “ethics of conviction” and the “ethics of responsibility.” Those who act out of the “ethics of conviction,” according to Weber, do what they see as the morally right thing to do, independently of the consequences. Those guided by the “ethics of responsibility” try to anticipate the potential implications of their actions and take them under consideration before acting.

Many have criticized Donald Trump’s tepid reaction to the murder of Saudi journalist Jamal Ahmad Khashoggi. Mainly, they’ve focused on his lack of moral indignation, i.e. lack of an “ethics of conviction”—his interest in keeping the Saudi relationship going despite the regime’s murderous activities. But what Weberians might note is that the president also hasn’t displayed any real ethics of responsibility.  

Trump’s response to the death of Khashoggi has evolved over the past weeks, moving from statements on the importance of Saudi arms sales to criticisms of the regime. This has been, to some extent, a result of Saudi Arabia’s continuously evolving account of what happened. But Trump’s revised stance has also been a result of the public perception of this case, something that Trump himself admitted to, and lamented: “This one has caught the imagination of the world, unfortunately,” he said. “It’s not a positive. Not a positive.”


In his interview with 60 Minutes, Trump came close to recognizing the atrocity of a government murdering a journalist over critical op-eds: “There’s something—you’ll be surprised to hear me say that—there’s something really terrible and disgusting about that if that were the case, so we’re going to have to see.” He then added, “We’re going to get to the bottom of it and there will be severe punishment.” But Trump’s recognition of the morally unpalatable nature of the journalist’s murder was coupled with a reminder of the enormous military order Saudi Arabia has placed with the U.S. Further concessions were also made: Khashoggi, after all, wasn’t a U.S. citizen, and the murder took place in Turkey, not on American soil. Trump has also said he believes Crown Prince Mohammed bin Salman’s denial of involvement—that this was all carried out without his knowledge. True moral outrage was absent—probably unsurprising, as strong moral convictions are not something that Trump is known for, particularly when it comes to freedom of the press. The president has repeatedly called the media “the enemy of the people,” and recently applauded Republican Congressman Greg Gianforte for getting rough with a reporter.

Weber might point out that downplaying moral conviction can be a good thing for a leader. In fact, he criticized moralizing politicians, who he said “in nine out of ten cases are windbags,” self-satisfied with their own moral purity. More importantly, “they are not in touch with reality, and they do not feel the burden they need to shoulder”—to consider consequences, not just principles. For an illustration of that point, one need look no further than the Iraq War. According to reports from that time, the conviction that Saddam Hussein was “evil”—completely aside from the empirical question of whether he had weapons of mass destruction—trumped concerns about the possible complications and unwanted consequences of intervening in that area of the Middle East. Even though removing an evil dictator might have seemed a noble reason for going to war at the time, retrospectively, that seems a deeply irresponsible motivation, possibly even immoral, given the number of deaths, the power vacuum that allowed violent radical groups to proliferate, and the continuing instability of the country.

This brings us to Weber’s ethics of responsibility. It represents a down-to-earth pragmatism, one that takes into consideration the complexities of the world and the negative consequences that well-meaning actions might have.  

Given the geopolitical intricacies of the Middle East, there’s certainly a case to be made that U.S.-Saudi relations need to be preserved for non-monetary reasons—this security partnership provides some semblance of stability in a region that could otherwise erupt violently. On the other hand, this line of argument would have to be weighed against the consequences that allying with Saudi Arabia has had so far—including the famine a Saudi-backed alliance is causing in Yemen. Even taking an “ethics of responsibility” stance against Saudi Arabia does not provide easy options.


But this isn’t the sort of complicated political calculus the president has offered in place of moral indignation. Instead, the justification for being easy on Saudi Arabia has been couched in terms of financial loss—perhaps even on a personal level, as the president’s own business ties there have recently come under scrutiny. Trump’s caution over Saudi Arabia thus fails Weber’s definition of the ethics of responsibility as well: Weber explicitly stated that acting according to the ethics of responsibility was not the same as acting in pure self-interest.

On Thursday, news broke that Saudi Arabia’s public prosecutor was now acknowledging that Khashoggi’s murder was premeditated. This news will make it much harder for Mohammed bin Salman and his father, the king of Saudi Arabia, to maintain they were unaware of this plot. And it puts greater pressure on President Trump to choose a course of action.

Towards the end of “Politics as a Vocation,” after having lectured the young, idealistic students in his audience about the moral compromises that a life in politics involves, Weber had a moment of idealism himself. Sometimes, even a politician with a keen sense of the ethics of responsibility, and an awareness of the potential consequences of their actions, can’t help but act on the basis of moral conviction. “This should be possible for any of us,” Weber said, “who is not dead inside.”

Unlike the American president, who so far has displayed not so much a struggle between two ethical compulsions as between public perception and calculated financial interest, at least some members of Congress seem to be taking the moral weight of the events seriously. There seems to be bipartisan support for at least exploring sanctions. In these times, having one branch of government not be “dead inside” is perhaps all one can hope for.

Is the Democratic Party Progressive Enough for Muslims?

The relationship between American Muslims and the Democratic Party is often described as a marriage of convenience. One of the best illustrations of this was the appearance of Khizr and Ghazala Khan at the 2016 Democratic National Convention. The Khans, parents of a U.S. Army captain killed in the Iraq War, didn’t exactly fit the liberal mold: Khizr Khan was a political independent who supported Reagan twice. But now the Khans were ardent Democrats. “Vote for the healer, for the strongest, most qualified candidate, Hillary Clinton, not the divider,” Khizr Khan said.  

What choice did they have? Months earlier, Donald Trump had called for “a total and complete shutdown of Muslims entering the United States.” This blatantly discriminatory proposal was part of a larger political campaign steeped in Islamophobia. Not even the parents of a war hero—the so-called good Muslims—were protected. As a result, more than three-quarters of Muslim voters cast their ballot for Clinton. The Muslim-Democratic alliance has only been strengthened in the wake of President Trump’s Muslim ban, which translated his xenophobic campaign promises into the law of the land. Today, Muslims constitute the “most Democratic-identifying religious group” in the country.

This is despite the fact that many Muslims continue to lean conservative, as Wajahat Ali has pointed out in The New York Times. “[P]rivately, they adhere to traditional values, believe in God, and think gay marriage is a sin, even though an increasing number support marriage equality,” he wrote.

The Republican Party’s Islamophobia has turned Democrats and Muslims into strange bedfellows, while also masking differences that have emerged since the 2016 election. Interviews with Muslim leaders and activists, however, reveal that those differences often do not hinge on the Democratic Party being too progressive, but on the Democratic Party not being progressive enough. And far from treating the Democrats as a haven in troubled times, Muslim-Americans are starting to demand more from the only mainstream party that will have them.

For American Muslims, the challenges of living under a Trump administration started with the Muslim ban but are not limited to it. Trump’s first year in office corresponded with a 15 percent increase in hate crimes against Muslims in the U.S. The Trump administration’s foreign policy in the Middle East has also not sat well with most Muslims, including the administration’s recognition of Jerusalem as the capital of Israel and its decision to open a U.S. embassy there.

Trump has also used and legitimized anti-Muslim rhetoric as a campaign strategy in the midterms. A new report published by the group Muslim Advocates found “80 separate instances of clear anti-Muslim political rhetoric being directly used by candidates in 2017 and 2018 races.” A majority of the candidates using this rhetoric are Republican. And Trump himself tweeted last week that a migrant caravan approaching the U.S.-Mexico border—which has become a flashpoint for midterms races—had been infiltrated by “unknown Middle Easterners,” a clear attempt to inject an Islamophobic element into the issue. (He later admitted he had no proof for his claim.)

But Muslim-American political activity this campaign season has not been restricted to responding to these existential threats—to the contrary, it has been notable for its breadth, variety, and inventiveness.

This has been most evident in the midterms’ “blue Muslim wave,” in which more than 90 American Muslims ran for office across the country. Most lost in the primaries, but a few have made it onto the ballot in November, including two Muslim women running for Congress, Ilhan Omar in Minnesota and Rashida Tlaib in Michigan. What’s striking about Omar and Tlaib is that their platforms are squarely in the left wing of the Democratic Party. Like other progressive insurgents, they are committed to economic justice for working people, Medicare for All, abolishing ICE, and holding Democratic leadership accountable.

Muslim organizers are urging Muslim voters to think of their vote not simply as a means of ensuring their survival in this country, but also as a tool to shape a particular political vision. The Muslim grassroots organization MPower Change has been leading a nationwide get-out-the-vote and digital engagement campaign called #MyMuslimVote, partnering with local organizations in states with significant Muslim populations, including Michigan, Georgia, Virginia, and Texas. MPower’s campaign director Mohammad Khan explained that its voter mobilization strategy changed after the 2016 elections. Khan said, “We wanted our communities to think about voting in an aspirational way, we wanted to expand what people think of as Muslim issues. Muslims are not day-to-day thinking about how they’re going to fight terrorism, they’re thinking about the same things everyday that other people are thinking about.”

The results of the 2016 Democratic presidential primary contain some important clues on what these aspirations might be. In Dearborn, Michigan, a city where 40 percent of the population is Arab-American, Bernie Sanders beat Clinton with 59 percent of the vote. Some journalists were surprised that Arabs and Muslims had voted for a Jewish candidate, but it’s likely that voters were less interested in religion than the presence of a progressive, non-establishment candidate in the race.

According to a 2017 survey conducted by the Institute for Social Policy and Understanding (ISPU), “a substantial segment of Muslim respondents (roughly 30 percent) did not favor either of the two major party candidates” in the 2016 presidential election. Sanders’s appeal to Muslim voters is partly explained by the fact that, amongst major faith groups in the U.S., Muslims are the youngest and most likely to identify as low income. A 2018 ISPU survey revealed that one-third of Muslims find themselves at or below the poverty line. It should come as no surprise that, like other Americans who voted for Sanders, Muslims want better wages and affordable health care.

But Sanders also engaged with Muslims differently than Clinton did. Zohran Mamdani, board member of the Muslim Democratic Club of New York (MDCNY), said, “A lot of times in Democratic conversations, things are framed as looking at a whole community through terrorism and anti-terrorism and not seeing us as full, complex individuals who have a multitude of issues and deserve to be treated in a way that all other communities are.” The appearance of the Khans at the Democratic National Convention reinforced this framework, with its focus on a war that came shortly after 9/11 and that many Muslim Americans opposed.

One of the most powerful moments in Sanders’s campaign came at a rally in Virginia where a young Muslim student asked him how he would tackle Islamophobia as a president. Sanders responded by sharing his own Jewish family’s experiences with bigotry, placing Islamophobia in a larger context of American racism. In contrast, Clinton drew criticism from many Muslims, including MPower Change, after she blamed the 2016 attack at a gay nightclub in Orlando, Florida, on “radical Islamism.” Muslims felt that Clinton’s use of this term, which President Barack Obama avoided, implied that Islam’s more than one billion followers were responsible for the beliefs and actions of a small minority.

Muslim-American political activity this campaign season is perhaps most evident at the local level. Muslims make up about 1 percent of the U.S. population, which means that, unlike other minority groups, they do not have the numbers to influence election outcomes on a national level except at the margins. But they do have the numbers to make an impact at the local and state levels. 

For example, the Council on American-Islamic Relations (CAIR) recently published a special voter guide for Muslim voters in Maryland. The guide includes a survey of candidates’ views on a variety of national and local issues that Maryland Muslims care about, including their position on whether public schools in areas with significant Muslim populations should close for the Muslim holidays of Eid al-Fitr and Eid al-Adha. This is a particularly important issue for Muslims in Maryland because of the difficulties they faced in getting the state’s largest school district to start recognizing Muslim holidays in 2015.

One of the first groups to endorse Alexandria Ocasio-Cortez in her upset primary victory of Rep. Joe Crowley in New York this summer was a community organization called Muslims for Progress. Based in New York City and Long Island, the organization was created in 2017 by Toufique Harun and Saema Khandakar, husband and wife, in response to the disaster of the 2016 election. Harun and Khandakar, who describe themselves as “complete and total political novices,” said that the group is focused on increasing Muslim involvement in politics and that it was inspired by Indivisible, a nationwide grassroots movement of locally led progressive organizations.

Ocasio-Cortez’s director of organizing is Naureen Akhter, a 31-year-old Bangladeshi-American Muslim who also co-founded Muslims for Progress. Akhter heard Ocasio-Cortez speak at a rally in June 2017, and soon after started leading signature-gathering efforts in Queens to help her get on the primary ballot. Akhter was critical in shaping Ocasio-Cortez’s engagement with the large Bengali and Muslim community in New York’s 14th Congressional District—Ocasio-Cortez even made a special campaign video for Bengali voters in which she spoke in Bengali.

In 2017, Muslim and Arab voters in the Brooklyn neighborhood of Bay Ridge rallied behind the first Arab candidate to run for city council, Reverend Khader El-Yateem. El-Yateem, a democratic socialist, ultimately lost the Democratic primary to Justin Brannan, but his presence on the ballot inspired unprecedented voter engagement amongst Arabs and Muslims, especially Arab and Muslim women.

In addition to supporting new progressive voices, Muslim organizers are focused on holding the Democratic Party accountable. MDCNY has an official endorsement process in which candidates have to fill out a questionnaire on a wide range of issues that matter to American Muslims. Mamdani said the purpose of this process is to “distinguish between those that simply talk about Muslims within this larger framework of ‘diversity is good and immigration is good’ and those who actually know that we don’t want broad platitudes.”

Harun from Muslims for Progress said that Muslims drawn to his group care more about issues than parties. “We work with the establishment candidates who fight for the issues, we work with grassroots candidates who work for the issues, we work with Republican candidates who work for the issues, we will work with anybody who fights for the right issues,” he said.

The focus on accountability has forced Muslim organizers to make difficult choices in the midterms. The New York attorney general primary between Public Advocate Letitia James and law professor Zephyr Teachout was especially challenging for MDCNY, since MDCNY had endorsed James in past elections. This time, the club endorsed Teachout after a tight vote amongst club members. James’s ties to Governor Andrew Cuomo were particularly frustrating for Muslim progressives, partly because Cuomo has never visited a mosque in his seven years as governor.

Muslims, like other minority voters, expect more from the Democratic Party than it has given them in return. Khizr and Ghazala Khan’s appearance at the 2016 Democratic National Convention was, for many Muslims, a defining moment for Muslim visibility and inclusivity in American politics—but not for all of them. Whether the Democratic Party can speak to the diversity of American Muslim politics will determine how deep this alliance will go.

Sheldon Whitehouse’s Frustrating, Illogical Remarks on D.C. Statehood

Donald Trump’s presidency has sparked a rolling national discussion about the long-term vitality of America’s democratic system. Democrats are more than happy to talk about how his rise to power undermined the nation’s experiment in self-government, how his presence in the White House sullies it, and how his actions as president have imperiled it. But it’s unclear whether Democrats’ focus on acute threats to the republic’s long-term health extends to more chronic problems.

Rhode Island Senator Sheldon Whitehouse has been a sharp critic of Trump’s policies and behavior over the last two years. Last week, The Providence Journal’s editorial board asked him about his views on potential statehood for Puerto Rico and the District of Columbia. Whitehouse responded with disinterest in the question for the district, then raised doubts about the prospects for the island.

“I don’t have a particular interest in that issue,” Whitehouse said. “If we got one one-hundredth in Rhode Island of what D.C. gets in federal jobs and activity, I’d be thrilled.”

“Puerto Rico is actually a better case because they have a big population that qualifies as U.S. and they are not, as D.C. is, an enclave designed to support the federal government,” Whitehouse said. “The problem of Puerto Rico is it does throw off the balance so you get concerns like, who do [Republicans] find, where they can get an offsetting addition to the states.”

Whitehouse—whose comments caught wider notice on Thursday, prompting him to issue a statement saying that he would support statehood in both cases—is right that D.C. statehood raises some thorny questions for American governance. Placing the seat of federal power under a single state’s jurisdiction could be a constitutional crisis waiting to happen. (Imagine if D.C.’s power utilities cut off electricity to the White House and the Capitol during a dispute with the federal government, for example.) Ideally, the district’s residents could get congressional representation through a constitutional amendment, much as the Twenty-Third Amendment gave them votes in the Electoral College. But since there’s no explicit constitutional bar to it, statehood is a feasible and reasonable solution to 700,000 Americans’ permanent lack of representation.

He’s also correct that Puerto Rico has a good case for statehood. The commonwealth’s 3.3 million Americans outnumber the populations of almost two dozen individual states, but dwell in a constitutional purgatory of sorts. Puerto Ricans enjoy a greater degree of self-government than an ordinary territory, but lack the sovereignty, legal stature, and electoral weight of a state. A grim recent example is the disastrous federal response after Hurricane Maria devastated the island last year and killed thousands of people. While congressional representation and a few votes in the Electoral College aren’t a panacea for the scars of colonialism and racism, the island’s residents have indicated in referenda that they would prefer statehood to the status quo.

Whitehouse’s response goes awry at two key points, however. The first is his dismissive approach toward the question of D.C. statehood itself. Senators need not be passionate or particularly interested in every political issue, but it’s striking that he’s so indifferent toward one of the most feasible ways that Democrats could chip away at the GOP’s current structural advantage in the Senate. After all, he spent the last few months staunchly opposing Justice Brett Kavanaugh’s confirmation to the Supreme Court. If two D.C. senators had been able to cast votes, Whitehouse’s preferred outcome might well have prevailed.

The second and perhaps greater error is Whitehouse’s concern that there isn’t an “offsetting addition” for Republicans. This may be an important practical hurdle to Puerto Rico statehood while the GOP controls the presidency or part of Congress. But it’s neither a legal requirement nor a moral necessity if Democrats eventually control all the political levers. It’s one thing for lawmakers to engage in a little horse-trading across the aisle to secure funding for a pet issue or pass a budget. It’s quite another to do it with millions of Americans’ right to self-government.

This approach is something of a tradition for top Democrats these days: unilaterally imposing limits on their ability to leverage an electoral mandate into lasting political change. Yearning for bipartisan buy-ins might have made sense in less polarized eras of American history. Now it just seems self-injurious for Democrats to seek it in the age of Trump. The GOP certainly isn’t operating under these rules. Majority Leader Mitch McConnell, the Senate’s master of realpolitik, isn’t going out of his way to add a few liberals to the pile of conservative forty-somethings that he’s shoveling into the federal courts. Republican secretaries of state like Kris Kobach and Brian Kemp haven’t bent over backwards to disenfranchise as many likely Republican voters as Democratic ones.

If Democrats retake power over the next few years, there are major steps they could take to strengthen American democracy. The Week’s Ryan Cooper made the self-evident but necessary argument in March that there’s “nothing wrong with strengthening America’s democratic institutions—making it simpler and easier for all Americans to vote and obtain political representation—in part because it would provide a partisan benefit.” At the top of his list was statehood for Puerto Rico and the District of Columbia, followed by abolishing the legislative filibuster and passing a new, stronger Voting Rights Act. I’d add automatic voter registration and anti-gerrymandering reforms at the state level, too.

Indeed, there’s a similarly blunt clarity in the GOP’s current strategy. Without extreme partisan gerrymanders, widespread voter suppression, and strict anti-immigration measures, the conservative political coalition may no longer be electorally viable in an increasingly diverse country. Republicans therefore have a logical reason (albeit a morally flawed one) to oppose an American electorate that can fully impose its political will. The only real mystery is why any Democrats would oppose that, too.


This article has been updated to note Whitehouse’s statement on Thursday.

Trump’s Extreme (and Extremely Boring) U.S. Tour

Donald Trump thinks he’s a rock star. Touring the country in support of Republican candidates, the president puffed out his chest and bragged about his ability to draw a crowd. “Do you know how many arenas I’ve beaten Elton John’s record?” he reportedly told Congressman Kevin Cramer, who is running to unseat Democratic Senator Heidi Heitkamp in North Dakota.

The crowds at my Rallies are far bigger than they have ever been before, including the 2016 election. Never an empty seat in these large venues, many thousands of people watching screens outside. Enthusiasm & Spirit is through the roof. SOMETHING BIG IS HAPPENING - WATCH!

— Donald J. Trump (@realDonaldTrump) October 15, 2018

While the midterms are the ostensible reason for Trump’s fall tour, Trump’s message is all about Trump. The candidates themselves rarely factor in. At a stop in Texas on Monday in support of his frenemy Ted Cruz, Trump paid cursory attention to the Texas senator and instead gave a rambling, lengthy speech that hit all of the same notes as his presidential campaign: fear-mongering over immigrants and crime, attacks on the media, spiteful digressions about his political opponents, and a deluge of lies. He did refer to himself as a “nationalist” for the first time, but that’s the only real news that his rallies have made for weeks if not months. Things have gotten so dull that even Fox News, which has covered his rallies extensively over the past three-plus years, is barely covering them.

Republicans who need the president’s help are also tiring of his schtick. “Most of the president’s hour-plus performances are one-man shows,” The New York Times’s Jonathan Martin and Maggie Haberman reported earlier this week. “Unlike with past presidents, the candidate of the hour is handed the microphone by Mr. Trump only briefly sometime during his monologue. Strategists involved in the campaigns have even started to time how long into his remarks it is before the president mentions the race in question and starts attacking the Democrat on the ballot, which is the 30 seconds of footage they most covet.”

But the sense of unpredictability that once permeated Trump’s rallies and public statements has long since disappeared. That’s out of necessity. His historically low approval ratings may be inching upward, but Republicans are still facing a blue wave in November. With few real accomplishments to run on, Trump is leaning on what he knows best: his greatest hits. But American voters may be tired of hearing them.

As soon as Trump became the favorite to win the Republican presidential nomination in the spring of 2016, pundits began speculating that a pivot was imminent. Surely, Trump’s platform, particularly on immigration, would have to be moderated for him to win the general election? Trump’s apparent willingness to break with GOP dogma on foreign policy and entitlements only reinforced this belief. But the great moderation never came in his campaign against Hillary Clinton. Instead, he stuck with the same setlist: demonizing immigrants and demanding a wall be built at the United States’ southern border; suggesting that Clinton was a criminal who should be jailed; inviting several women who had accused Clinton’s husband of sexual harassment and assault to a presidential debate. Trump never wavered from serving red meat to his Republican base.

After Trump was elected, again there was the assumption that Trump would moderate his behavior—that the presidency would force him to. In his first address to a joint session of Congress in February of 2017, Trump was uncharacteristically muted and sounded like, well, a normal president. He feigned humility and expressed faith in American diversity and promise. The pundits ate it up, apparently forgetting his “American carnage” inaugural address just weeks earlier. Van Jones infamously remarked on CNN that “tonight, Donald Trump became President of the United States.” Fox News’ Chris Wallace said the exact same thing. ABC White House correspondent Jonathan Karl, meanwhile, tweeted that the speech showed Trump at “his most presidential—his most effective speech yet.”

Seven months later, the pivot was alive and well—despite the fact that, in the intervening time, Trump had fired FBI Director James Comey because of “this Russia thing,” blamed “both sides” after a white nationalist murdered a counter-protester in Charlottesville, tried to deliver on his promise to ban Muslims from entering the U.S., ordered an end to Obama-era legal protections for undocumented immigrants brought to the country as children, and so on. After Trump agreed to a funding deal with Democratic leaders Chuck Schumer and Nancy Pelosi, and suggested he was open to a deal to protect those undocumented children anew, Axios’ Mike Allen offered this take on the new Trump:

It’s like a fictional movie scene: A president wins election with harsh, anti-immigration rhetoric, then moves to terminate protections for kids of illegal immigrants. He’s ridiculed on both sides for his heartlessness — but cheered by a band of white voters who helped put him in office. Then he suddenly realizes he looks like a cold-hearted jerk—and starts musing about going farther than President Obama got in providing permanent protections to those children of illegal immigrants.

The deal for the DREAMers, who are still in limbo, never materialized. Trump resumed his usual habits of belittling the press, his opponents, and immigrants.

It took 18 months, but it seems, for the most part, that the pundit class finally caught on: With Trump, what you get is what you see. But that means that now Trump is oddly frozen in amber. While he occasionally talks up the tax cut, his Supreme Court nominees, and the economy, the bulk of his stump speeches in support of Republican candidates is made up of the usual Trumpian flourishes. In a rally for Nevada senator Dean Heller, Trump claimed that Democrats wanted to give undocumented immigrants the right to vote—and to give them cars, indeed Rolls Royces. He has continued to suggest, as he did in 2016, that, if put in power, Democrats would open the borders and abolish the Second Amendment. Despite unsuccessfully working to repeal Obamacare—and successfully working to weaken it—Trump has returned to 2016 claims that he will protect pre-existing conditions, as well as a host of other entitlements that the Republican Congress is hoping to undercut, including Medicare. (He also had the gall to claim that pre-existing conditions were imperiled by Democrats.) In Montana, he praised Greg Gianforte, who body-slammed a reporter from The Guardian in early 2017. “Never wrestle him, any guy that can do a body slam, he’s my kind of guy, he’s my guy,” Trump said of the congressman. And, two years after winning the presidency, he is still ranting about Hillary Clinton, now claiming that it was her campaign, not his, that colluded with the Russians.

The consensus among many in the media is that Trump’s repetitiveness, particularly on the issue of immigration, is tactical. “This pure brute force from Trump could work,” NBC News’ “First Read” briefing argued, “because there is no equal response from Democrats.” This “brute force” campaign built on fear-mongering, race-baiting, and conspiracy theories worked in 2016—why not now? Mike Allen concurred, writing that “immigration and stoking fear about Mexican immigrants propelled Trump to the White House.” Trump is claiming that he can set the terms for the midterms, unveiling a new battle plan at recent rallies: “This will be an election of Kavanaugh, the caravan, law-and-order and common sense.”

But Kavanaugh may be more of a boon for Democrats than Republicans—which could explain why Trump has emphasized the “Democratic mob” more than the Supreme Court justice. Immigration and “law and order” were the pillars of Trump’s 2016 campaign. While Democrats have struggled to combat the GOP’s immigration claims—or to put forth their own comprehensive solution—they may not need to. Health care, not immigration, has been the dominant issue of the midterms so far. Trump is retreating to familiar territory because he doesn’t have anywhere else to go, and Republicans are following him out of desperation. With an unpopular president and an even more unpopular agenda, these fear-based appeals may be Republicans’ only card. “Voters are motivated by fear and they’re also motivated by anger,” Newt Gingrich told The Washington Post. He was referring to the migrant caravan, but may as well have been describing the GOP’s election strategy.

Trump’s race-based appeals have been “effective for him politically,” Maggie Haberman pointed out on Twitter. But what worked in 2016 may not work in 2018, and not simply because Trump isn’t on the ballot, potentially depressing his supporters’ turnout. There’s a reason his rally venues have shrunken. He’s droning on about the same old things because he has very little to show for two years of unified Republican control of the government. His only legislative accomplishment is a tax bill that is hugely unpopular. His rallies in 2018 are a mix of ego-boosting and retreat to familiar territory. He has, two years into his presidency, become the political equivalent of a band that has been touring off the success of its first record for too long. The superfans are still buying it, but everyone else seems to be tuning it out.

The Growing Inequality of Civil Rights in Trump’s America

It’s often said that the arc of history bends toward justice, but the arc of American history seems to bounce toward it instead. During Reconstruction in the 1860s and 1870s, the federal government campaigned to build a multiracial democracy in the South. That project’s defeat in 1877 then ushered in 90 years of Redemption, mass disenfranchisement, and American racial apartheid. Only in the 1950s and 1960s did the civil-rights movement and the Warren Court finally drag the United States into genuine liberal democracy.

Another dip now appears to be underway. President Donald Trump and the Roberts Court are poised to spark a Second Redemption—an era where federal enforcement of civil rights is no longer assured, where states are free to allow citizens to be discriminated against in housing and employment, where voting is a privilege instead of a right, and where a person’s access to goods and services can be restricted by the beliefs of total strangers.

The latest blow came over the weekend when The New York Times reported that the Trump administration is considering plans to roll back civil-rights protections for an estimated 1.4 million transgender Americans. Some federal courts have concluded that gender identity is covered by existing federal laws that forbid discrimination on the basis of actual or perceived sex. But conservative policymakers in the administration disagree, arguing instead for a narrow definition of gender based on a person’s assigned sex at birth. The Justice Department asked the Supreme Court on Wednesday to overturn a Sixth Circuit Court of Appeals ruling in favor of a transgender worker, arguing that Title VII of the Civil Rights Act of 1964 doesn’t cover discrimination against gay, lesbian, and transgender Americans.

If the Trump administration succeeds, transgender Americans’ rights would rest on a patchwork array of state laws and local ordinances. Twenty states and the District of Columbia forbid discrimination on the basis of gender identity in housing, employment, and public accommodations. Another dozen states have limited legal protections for transgender people, while more than 15 states, mostly in the South and the Great Plains, have none.

The geographic division roughly matches the divide on other matters of gender and sexuality. A Washington Post analysis in September found that abortion would automatically become illegal in 14 states under current laws if the Supreme Court overturns Roe v. Wade. (Roughly a dozen others could follow suit depending on the state legislature’s makeup at the time.) Seven states allow pharmacists to refuse to fill a contraceptive prescription without referring it to another provider. Twenty-eight states don’t have anti-discrimination laws for gay and lesbian Americans in situations like housing and employment. Seven states explicitly allow discrimination in adoptions and foster care.

One of the most popular political cliches of the last few years is the notion that there are two Americas. But this is not simply an issue of political differences, of red states vs. blue states. Increasingly, there are two Americas in legal terms: one where citizens broadly enjoy a range of rights and legal protections, and one where they don’t. As the federal protection of civil rights falters, and the Supreme Court lurches to the right, those differences are becoming severe—with dire consequences for women, LGBT people, and many other disadvantaged citizens.

Perhaps the most well-known chasm is over abortion rights. In theory, a woman’s right to obtain an abortion is protected from undue state interference by the Constitution under current Supreme Court precedent. In practice, however, the procedure is increasingly hard to obtain in the nation’s rural regions due to state laws designed to force abortion clinics to shutter. Mississippi, Missouri, North Dakota, South Dakota, and Wyoming each have a single clinic that performs abortions, while Kentucky, West Virginia, and Utah have two apiece. A 2014 survey by the Guttmacher Institute found that one in five American women has to travel more than 43 miles on average to the nearest clinic.

Some of that distance can be attributed to simple geography and population density. But it’s also mediated by political forces. Louisiana, for example, is slated to have only a single clinic covering the entire state after the Fifth Circuit Court of Appeals upheld a restrictive admitting-privileges law earlier this year. The Eighth Circuit recently refused to block a Missouri law that will leave a St. Louis clinic as the only available provider in the state. The confirmation of the staunchly conservative Brett Kavanaugh to the Supreme Court, replacing swing Justice Anthony Kennedy, raises the likelihood that similar measures will survive legal challenges, both in the lower courts and before the nation’s highest court.

Indeed, the most probable future for reproductive rights is a balkanized one: Women in blue states will still have access to the procedure, while women in red states will face a gauntlet of regulatory hurdles or have to travel long distances to obtain it nearby—if they can at all. Every year in the United Kingdom, hundreds of Northern Irish women who can’t obtain an abortion there cross the Irish Sea to have the procedure performed elsewhere in the country. The United States could see similar migrations by those with the ability to afford it in a post-Roe landscape.

In Michigan earlier this year, a pharmacist at a Meijer supermarket refused to fill a 35-year-old woman’s prescription for the drug misoprostol. The woman’s physician prescribed her the drug to complete a miscarriage she had suffered, but the pharmacist refused to fill it because he was Catholic, according to a letter sent to Meijer by the American Civil Liberties Union on her behalf earlier this month. He also refused to let another pharmacist handle it or to transfer the prescription to another pharmacy. “Unfortunately in Michigan, we don’t have an explicit state law that goes so far as to protect patients like Rachel,” an ACLU official told the Detroit Free Press.

The episode appeared to be a potential sign of things to come. The Roberts Court has taken a keen interest in religious-liberty exemptions in recent years, often ruling in favor of those who tell the court that their religious convictions run counter to state and federal laws. In Burwell v. Hobby Lobby, the court’s five conservative justices, including Kennedy, sided with the craft store chain’s claims that the Affordable Care Act’s contraceptive mandate violated its religious freedom. The mandate required most American employers to offer insurance plans that covered contraceptives.

In its ruling, the court held that the Religious Freedom Restoration Act, also known as RFRA, allows the owners of certain types of corporations to opt out of government regulations if those regulations run counter to their religious beliefs. Though individuals had been able to make RFRA claims under the law, the court had never held that it applied to closely-held corporations as well. The justices’ decision pleased religious conservatives who see liberal policy-making as antithetical to their faith. By extending RFRA to companies, however, the ruling meant that employees could have their access to healthcare shaped by religious beliefs that may not match their own.

Indeed, the ruling raised the specter that access to goods, services, and healthcare will be mediated by another person’s religious beliefs. Last term, the justices heard Masterpiece Cakeshop v. Colorado Civil Rights Commission, a case involving a Christian baker who refused to bake a wedding cake for a same-sex couple, citing his personal religious beliefs. The Colorado Civil Rights Commission found that the baker had violated the state’s anti-discrimination law by refusing to serve the couple because of their sexual orientation.

In their ruling in June, the justices sidestepped whether the baker’s First Amendment claim outweighed the state’s decision to protect gay and lesbian Americans from discrimination in the marketplace. Instead, they issued a narrow ruling in the baker’s favor that found the state commission had violated the principle of religious neutrality by allegedly denigrating his beliefs during their deliberations. Without Kennedy, the court’s leading figure on gay rights for two decades, the newly entrenched conservative justices could give conservative Christians a legal path to bypass anti-discrimination laws that cover sexual orientation.

The end result of all this may be a country that feels less like the United States of the late-twentieth century, and more like the patchwork collection of German states that made up the Holy Roman Empire after the Protestant Reformation. Just as each duke and prince was free to establish Catholicism, Lutheranism, or Calvinism in their territory, many Americans may soon live in a country where their rights and liberties fluctuate as they travel from one state to the next, or even from store to store on the city streets. The laws across a broad section of the country may provide a safe space for those writing them, and for no one else.

Nihilist Nation
If you think Donald Trump is wrecking the republic and wonder why so many Americans can’t see that he is, you may be asking the wrong question. What if they see the same thing you do and happen to like what they see? What if the deficiency you’ve been ascribing to a lack of adequate insight, information, or alarm is a lack of something deeper, a vacancy mirrored by but hardly confined to the president’s stare?

At the very least, something seems to be missing in the usual explanations for his becoming president. They tend to fall into two plausible, if ultimately inadequate, schools of thought. One of them holds that Trump rose to power because a large slice of the electorate was left behind by the neoliberal agenda of free trade and tech-sector hegemony. Only someone with a very good job or a very small imagination would dismiss this view out of hand. Nevertheless, the fact remains that voters with the lousiest jobs or no jobs at all—categories in which minorities and immigrants abound—are no fans of Trump.

Thus the second explanation: The success of Trump’s demagoguery is driven by white nationalism and racist hate. This too comes with plenty of supporting evidence. Yet skeptics have a point in asking why so many of Trump’s allegedly racist supporters would have cast their ballots not once but twice for Barack Obama. My strong suspicion is that along with the undoubted racists and xenophobes in the Trump camp are a number of uneducated white voters who do not hate blacks, Muslims, or Mexicans but rather the educated white liberals whom they suspect of caring more about blacks, Muslims, and Mexicans than about uneducated whites. When Trump said during his campaign, “I love the poorly educated,” he may have revealed more about whom many of his supporters truly hate than David Duke did when he endorsed Trump. So for those white liberals wringing their hands over the purported racism of Trump’s supporters, there is good news and bad news: They don’t all hate black folk. A lot of them just hate you.

Hatred of some “other,” however construed, and a sense of betrayal by the powers that be—both make sense as contributing causes of the devotion Trump inspires. There may be a third cause, however, one easily overlooked—possibly because it is so deeply ingrained in the nation’s sensibilities as to escape notice. This third cause, which I will call nihilism, helps to account not only for Trump, but also and more importantly for the phenomena he has come to personify (berserker gun violence, climate-change denial, etc.)—all of which were present before he entered the Oval Office and are likely to be around long after he’s gone.

Leaving nuanced definitions to the philosophers, I would define nihilism as a combination of three basic elements: a refusal to hope for anything except the ultimate vindication of hopelessness; a rejection of all values, especially values widely regarded as sacrosanct (equality, posterity, and legality); and a glorification of destruction, including self-destruction—or as Walter Benjamin put it, “self-alienation” so extreme that humanity “can experience its own destruction as an aesthetic pleasure.” Nihilism is less passive and more perverse than simple despair. “Nihilism is not only despair and negation,” according to Albert Camus, “but, above all, the desire to despair and to negate.”

A nihilist is someone who dedicates himself to not giving a shit, who thinks all meanings are shit, and who yearns with all his heart for the “aesthetic pleasure” of seeing the shit hit the fan. Arguing with a nihilist is like trying to intimidate a suicide bomber: The usual threats and enticements have no effect. I suspect that is part of the appeal for both: the facile transcendence of placing oneself beyond all powers of persuasion. A nihilist is above you and your persnickety arguments in the same way that Trump fancies himself above the law.

Comparisons with Nazi Germany are often too glibly made and always too glibly dismissed. History does not repeat itself, true—I do not expect to see Donald Trump sporting a mustache the width of his nose—but history does show that similar social conditions can produce comparable political effects. With that in mind, it may not be out of bounds to quote from a nearly forgotten book by Nazi turncoat Hermann Rauschning called The Revolution of Nihilism. Published in 1939, and subtitled Warning to the West, the book characterizes Hitlerism as a form of vacuous “dynamism” with “no fixed aims” and “no program at all.” A movement of “utter nihilism,” it is “kept alive in the masses only in the form of permanent pugnacity.”

As early as 1932, Rauschning writes, Hitler was out “to liberate himself from all party doctrines in economic policy, and he did the same in all other fields,” believing that “the things that stir most men and fire their enthusiasm are the rhythm, the new tempo, the activity, that take them out of the humdrum daily life.” Especially if I’m reading at the end of a tiring day, this is the point at which I start losing track of whether Rauschning is talking about National Socialism or social media, but he has already said what he is talking about; he is talking about nihilism, which means that I wasn’t dozing after all.

I suppose that if I’m going to define nihilism as a lack of values—or to use Rauschning’s summation of Nazism, a “hostility to the things of the spirit, indifference to truth, indifference to the ethical conceptions of morality, honor, and equity”—I’m obliged to say what I mean by a value. I would call it any kind of allegiance for which you are willing to check your own desires for reasons other than pure self-interest. All values manifest themselves in restraint. You’d like to pitch out all those empty wine bottles, but you recycle them instead. You’re late for a doctor’s appointment but slow down your car so as not to hit a pedestrian crossing the street. (If your sole motivation is not to get gore on your front bumper, that is something else.) Values are by their very nature at odds with the amoral dynamism Rauschning describes; they are what applies the brakes. They also threaten the dynamism of an advanced capitalist economy by daring to suggest that something lower than the sky might be “the limit.” All the nameable avatars of the Almighty Market—pop psychology, digital fundamentalism, addictive consumption, cutthroat competition—are based on the premise that what you want is what you ought to have, and the quicker you can have it the better. By its very operation, the market inclines us away from principled restraint and toward nihilistic abandon.

For that reason, it’s probably a mistake to view nihilism as “an explanation apart” from the common analyses of the Trump phenomenon. Economic dispossession and virulent racism stand in relation to nihilism not as alternative theories but as reciprocal causes and effects. In other words, all three flourish in a moral vacuum. Tony Judt remarked on the “moralized quality” of political debates of the post-World War II era, reminiscent of those “19th century radicals” driven by “the belief that there were moral rules to economic life.” He saw that quality in stark contrast to “the selfish amoralism of Thatcher and Reagan.” What does “rising income inequality” imply if not a falling moral barometer? The question is as old as the prophet Isaiah.

In the same way, it would be difficult to draw a sharp line between nihilism and racism, or to find a trace of one without some germ of the other. “We hold these truths to be self-evident, that all men are created equal”—a twenty-first-century reader is almost made dizzy by the simultaneous affirmation that such things as truth, self-evident truth, and human equality exist (along with the gross depravity of the slaveholders who wrote and signed the document, though they could not escape the reproach of its implications). All societies, and certainly all democratic societies, rest on the notion that some values are self-evident. That is surely what Walt Whitman means when, after celebrating himself and singing himself, he goes on to say, “and what I assume you shall assume.” The fundamental equality of your self and my self is what allows us to have common assumptions and to believe they are common. Lose hold of that faith and no body camera on earth will capture the resulting disconnect, because people will not accept that the murder they witnessed “really happened” or that the unarmed suspect bleeding on the sidewalk was a human being who felt a bullet the same way they would.

A sense of radical incredulity, spectacularly typified by Trump’s refusal to believe his own intelligence services, is but one manifestation of the nihilism that brought him to power. What makes him “the real deal” in the eyes of his most ardent admirers is largely his insistence that almost everything else is fake. Like him, they know that the news is fake, the melting ice caps are fake, the purported citizenship of certain voters is fake, science is fake, social justice is fake, the whole notion of truth is fake. Whatever isn’t fake is so relative that it might as well be fake; “true for you,” maybe, but that’s as far as it goes. Among those who call themselves “believers” and are thus at least technically not nihilists, one frequently finds an obsession with apocalypse, a gleeful anticipation of the living end that will destroy the inherent fakery of all things. The social teachings of the Gospels need not trouble the Christian conscience so long as the troubles predicted in Revelation come to pass.

Not that Revelation features any event quite so diabolically nihilistic—and, yes, unbelievable—as a school shooting. The person looking for nihilism in its purest form need look no further than the de facto normalization of gunning down schoolchildren as an act of free expression—and, what is more, as an expression of nothing much in particular beyond the whim to do it. Less a “cry for help” than a grunt of “whatever.” What makes school shootings almost as interesting as they are atrocious is that they place an insupportable burden of proof on people whose knee-jerk response to any social calamity is to say, “This stuff has always gone on, we just didn’t hear about it.” Actually, no. In the same way as antecedents for Donald Trump can be found in Roman tribunes and Nazi demagogues but not in any previous American president, you will search the historical record in vain for persuasive evidence confuting that nihilism in this country is something new.

New doesn’t preclude boring, of course. In less murderous forms, you can see nihilism at work in the banal iconoclasm that exults in anything outrageous, provocative, or “transgressive,” that sees no qualitative difference between the offensive and the genuine. So you have Columbine, and then you have radio talk show host Howard Stern wondering aloud why the killers didn’t pause during the slaughter to have sex with the “really good-looking girls running with their hands over their heads.” Definition of a nihilist: someone who thinks nothing contained in the envelope is ever as important as pushing it. “One must shock the bourgeois,” Baudelaire is supposed to have said, speaking at a time when the bourgeoisie could still be shocked. I wonder what Baudelaire would have made of late-night TV. I wonder how many of those who tuned in to Saturday Night Live to see Alec Baldwin impersonating Donald Trump realized the extent to which Trump himself is an impersonation of Saturday Night Live.

The reason Trump has managed to get away with a truckload of gaffes and indiscretions, any one of which would have destroyed the career of another politician, is precisely because they make up a truckload. They do not constitute a singular blot on his character; they certify his identity. They prove him to be an authentic iconoclast, a superhero of transgression, the guy who brags about grabbing women’s crotches, makes fun of war heroes, and speaks unashamedly of waging nuclear war. He makes Iggy Pop look like Cotton Mather. He can be scary, but he’s never square. He can even shock the bourgeoisie, who as it turns out don’t much mind being shocked, having learned long ago that a little innocuous “subversion” is a hell of a lot cheaper than paying more taxes or raising the minimum wage.

Americans are more infected by this ethos than they might think. Here’s an easy way to test that proposition. When the activist African American pastor William J. Barber II speaks of the need to restore morality to public discourse, are you slightly embarrassed? Do you wonder to yourself, “Can he really be saying that?” or wish he could find a less, shall we say, “moralistic” way to put it? I happen to think he’s hit the nail squarely on the head, although, to tell you the truth, I am a bit shocked.

Nihilism can be simply defined and readily observed, but its causes are probably as complex as human beings themselves. Some can be located in certain primal emotions and the irrational behaviors they generate. Others I would locate in the workings of capitalism. Obviously, there is not always a clear distinction between the two. Capitalism can be seen, and has even been defended, as the systemic expression of unregenerate human nature. Trading in the stock market can be a highly primal affair. Not for nothing do investors speak of having “made a killing” on Wall Street.

Of the relevant causal emotions, perhaps the most primal is fear, and the impulse to overcome fear through recklessness. (It goes without saying that Trump’s presidency is simultaneously a response to fear, a stoker of fear, and a reason to fear.) “All men kill the thing they love,” Oscar Wilde writes, and perhaps most ruthlessly when the thing they love—or have convinced themselves they no longer love—is under threat. Need I say that “the thing” being killed is America?

Another pertinent factor is envy, a basic human emotion that rising social inequality can only exacerbate. To put it in cruder terms: “The world sucks for me, so I am going to make it suck for you too. I have lost my job, my status as a white male, and may even lose my gun. So you, my smug, privileged friend, are going to lose your civil liberties, your faith in social progress, your endangered species, your affirmative action, your reproductive freedom, your international alliances, your ‘wonderful’ exchange student from Syria.” The rationale is probably not too distant from that of the jealous husband who shoots his wife, her lover, and himself. Enjoying ourselves, are we? We will enjoy nothing!—which is to say, we will enjoy the only thing a nihilist can enjoy.

Whether envy also figures in the eschatological fantasies of lower-middle-class evangelicals is perhaps too speculative to discuss, though vengeful envy has been an observable factor in outbreaks of apocalyptic fervor throughout history. One of the pleasures of heaven, according to Tertullian, the second-century theologian known as “the father of Latin Christianity,” would be watching one’s former persecutors roast in hell. That this was addressed to people who had literally seen their loved ones burned alive and has been derided ever since by people whose closest brush with burning occurred when they forgot their sunscreen does not diminish its relevance. Surely there are people who thrill at the thought of FDR or JFK, if not roasting in hell, then rolling in their patrician graves every time Trump sends a tweet.

Along with primal emotions, always more acute in times of social stress, are certain mechanisms innate to capitalism. I am hardly the first to note that capitalism tends toward nihilism by reducing all values to market values. As Marx and Engels put it, capitalism “has left remaining no other nexus between man and man than naked self-interest, than callous ‘cash payment.’ ... It has resolved personal worth into exchange value.” Granted, capitalism makes use of belief systems in its ascent, but eventually eats them up like the proverbial female spider who devours her mate. If a Protestant ethic will make workers go more obediently into the factories, then capitalism will extol the Protestant ethic; but if blasphemy begins to move merchandise at the mall, then it will blaspheme to the point of making Beelzebub blush. If democracy furthers profit, then long live democracy; if democracy impedes profit, then long live Citizens United and private security forces flown in to beat back the disaster-riled mobs. In the capitalist bible, profit and loss always trump the Law and the Prophets.

The winner-take-all strain of capitalism also fosters nihilism by depriving certain classes of key ingredients that make or buttress a sense of purpose: work, family, social usefulness. How does an unemployed 35-year-old living in his parents’ basement make his life seem meaningful to himself, or at the least notable to others, especially if he lives in a culture where meaning and notoriety increasingly come down to the same thing? Perhaps by defying the taboo against murdering one’s parents. Perhaps by defying the taboo against murdering schoolchildren. Perhaps by defying both.

But the nihilism of the capitalist system is not confined to the social and cultural margins, to misfits in basements and meth-dealers on Harleys. In recent decades it has received influential support and exquisite expression from certain sectors of the intelligentsia. Others before me have pointed out how the theoretical game-play and moral relativism of the postmodern academy—masquerading as left-wing analysis no less!—serve the capitalist project. If there are no “grand narratives,” no self-evident truths, no straightforward texts, no criteria for determining artistic merit, then there is surely nothing to stop us from deconstructing such obsolete products as The New York Times and the Bill of Rights—or even, as so many academics seem obtusely unable to grasp, from deconstructing the self-evident merits of “diversity” itself. If you preach iconoclasm while dedicating a rainbow-colored stained-glass window, you shouldn’t be too surprised if somebody picks up a rock. Ultimately, you are left with no unassailable value but monetary value, the amount of your fellowship grant, the unpaid portion of your student loan.

It’s common to speak of Trump as a character out of a TV show; he might just as easily be viewed as a transplant from a cultural studies department (where much time is devoted to the study of TV shows). In his disdain for science, in the subjectivity of his worldview, in his radically solipsistic moral relativism (things are good or bad as they relate to him), he is a postmodern hero par excellence, Derrida with a funny haircut and a thousand-dollar suit. To put it more succinctly, he is the ungainly chicken of late-stage capitalism come home to roost.

Some will object that few people sporting a Make America Great Again baseball cap are going to have read postmodernist theory, so any claim of a cause-and-effect relationship here is ludicrous. No, the objection is ludicrous. It is like saying that a seabird cannot show up on a beach covered in petroleum since a seabird is obviously not an oil tanker. Culture is a highly permeable ecosystem. Mike Pence was influenced by Lady Gaga even if he couldn’t pick her out of a lineup. I have always thought of Ronald Reagan as the last of the California hippies, blithe in his stoned assurance that America could reach the New Jerusalem if everybody was left alone to do his own thing and consult his own astrologer. In the same way, I think of Mark Zuckerberg as an Ivy League Hells Angel. The Facebook motto “Move Fast and Break Things” wasn’t coined by Ralph “Sonny” Barger or his leather-clad sidekick Doug “the Thug” Orr, but it might as well have been. Ditto for Steve Jobs’s claim of having “put a ding in the universe.” When the universe itself is fair game for dinging, can nihilism be far behind?

Another tendency driven by the market, driven harder still by the information economy’s relentless “creative destruction,” is the compulsion to detect and get ahead of the latest trend. I suspect this compulsion will soon be as hardwired into our brain stems as the fight-or-flight response. If you asked me what Americans today fear more than anything else, I would answer that they fear being left behind. (The anxiety even comes with its own handy acronym: FOMO, or Fear of Missing Out.) They fear missing the boat, foundering in the backwash of the next wave. And what if “the next wave” is wholesale destruction? Common sense would say that you set about piling sandbags, but this would be as unthinkable to many people as resisting the depredations of Amazon by patronizing their local independent bookstore. Instead, the reflex is to make a pact with the vandals, to collaborate with them. To be “ahead of the curve.” Better to be with the destroyers than with the destroyed. If the barbarians are at the gates, best to dress up as a barbarian. More than a few servile Republican members of Congress seem to have drawn that conclusion in regard to Trump.

Finally, nihilism may emerge when people feel overwhelmed by their societies. This is where primal emotions and capitalist dynamism meet: in the moral deadening that comes of having few significant choices and infinite trivial ones. I suspect that somewhere in the heart of many Americans is the wish to see “it all come apart” because the “all” is simply too much to reckon with, too much to bear. King Lear calls on the hurricanes to blast the world and spill the “germens” that make humankind. Trump is too preposterous to be a Lear figure, but his rants may resonate with a Lear-like nerve in those weary of the world. Let the curtain come down. Let us finally be done.

I once caused a minor flap in a restaurant by referring to myself as a customer. I intended no offense. The imperious restaurateur thrust out his bared chest (he was wearing a low-cut caftan) and repeated the word as though I’d just piddled on his rug. I meekly corrected my usage to “guest,” whereupon he lowered his chin enough to indicate that I might possibly deserve to see a menu. His fastidiousness did not extend to the matter of the bill.

I’m told the place has since closed, though it will always be in operation for me as a metaphor. As metaphors go, it’s a bit complicated. The simple way to read it is like this: At the haughty urging of our “hosts,” we in capitalist America think of ourselves as liberals, progressives, conservatives, patriots, pacifists, intersectional-feminist-Marxist-Buddhist environmentalists, but what we are at the end of the day is a country of customers. If you doubt it, wait for the bill, which, if you have trouble paying, will tell you the other thing you are. Not that Marx hasn’t already told you. You can tattoo your eyelids if it makes you feel a tad more subversive, like a tech mogul packing his billionaire buns into a pair of jeans, but when they come to clear away the plates, he’s the guy who owns Facebook and you are just another face.

It would be tempting to leave it at that, but far too doctrinaire. Reflecting further on my faux pas, I’ve come to appreciate that the restaurateur saw the possibility of a value beyond the monetary transaction of our business. If the food I ate was delicious—which it was—that owed in part to his refusal to see his work solely in terms of his proprietorship and my patronage. Yes, there was something disingenuous in his regarding me as a “guest,” but there was also something crass and reductive in my calling myself a customer, especially after he’d gone to all the trouble of practically putting on a bathrobe so we could both feel more at home.

I’ve never stayed at a Trump hotel, where I’m sure all the customers are called guests, but I suspect that the man behind the brand doesn’t much care if they call themselves customers. As long as they pay up at the end of their stay, they can call themselves any damn thing they like.

And here may be yet another factor that draws people to Trump and draws them to nihilism as well: the suspicion that the moral niceties of their fellow Americans are either disingenuous or delusional. It is the inevitable disillusionment that comes of meeting Thomas Jefferson with Sally Hemings while the Devil whispers in your ear, “Heard any declarations of independence lately?” All too easily you can move from realizing that Jefferson is something of a humbug to believing that the equality he declared self-evident is humbug too, whereupon you have also moved from being justifiably indignant at meeting a Founding Father’s enslaved mistress to rejecting your best reason for wanting her free.

The same temptation can occur in less momentous deflations, whenever insincerity peeks from under a euphemism—whenever the “guest” turns out to be a customer. And it may be especially tempting when a culture suspicious of moral imperatives replaces them with the notion that sincerity is the highest virtue and hypocrisy the gravest fault. “I can’t say if he’s wrong or right, but he’s totally sincere.” The political philosopher Judith Shklar, who sees some degree of hypocrisy as essential to the workings of a liberal democracy (she cites the rascally Benjamin Franklin as a case in point), notes how “endless accusations of hypocrisy” invariably pursued “the most capable statesmen” in American history because they had “raised the level of moral and political expectations” and “failed to fulfill the standards they had themselves revived.” The corollary for the least capable statesmen is only too clear: In a moral universe where good and evil have been reduced to sincerity and hypocrisy, Donald Trump (the liar who believes his own lies) will always play the honest angel to Ben Franklin’s duplicitous imp.

It may be that nihilism, or Trumpism if you prefer, is the indignant customer’s reaction to being told for the thousandth time that he is something better than a customer when he has a thousand reasons for knowing that’s what he is and often all he is. On the one hand, his is a cynical denial that hospitality, idealism, and labors of love exist, an insistence on seeing every instance of venial human hypocrisy as proof of a meaningless void. (“Those Clintons, who do they think they’re kidding?”) On the other hand, isn’t it also a pathetic yearning for truth, even on the part of someone who may no longer believe that truth exists? American nihilism is an oozing sore, but like an oozing sore it is evidence both of a malady and of a body’s desperate attempt to heal itself. What I mean to say, if only for my own edification, is that some compassion for the wounded is in order here.

The difference between a genuine prophet of doom and a mere doomsayer (I think of myself as neither) is that the former hopes he is wrong. A recent event strongly suggested that America is hardly a nation of nihilists, nor in any present danger from the purported nihilism of a benighted few. The cruel separation of migrant parents and children brought forth an exhilarating storm of outrage and bipartisan rebuke. Four former first ladies (Carter, Bush, Clinton, Obama) were belting out the same righteous song—“immoral,” “disgraceful,” “a humanitarian crisis.” Given so heinous a crime, the ladies could hardly protest too much, though it’s possible we made too much of the protest.

For one thing, the separations had happened. Whatever you want to call them, “unthinkable” will not do; the word lost any pertinence as soon as the first kid screamed. For another, the separations happened over a period not of days but of months. Not least of all, they happened within a complex logistical framework. Who can adequately comprehend the degree of coordination and moral acquiescence required to move hundreds of Central American children from the southern border to the city of New York? Perhaps no one so well as a parent who has tried to get two of his or her own kids from a New York apartment to the beach. To get as far as it did, the operation required a number of operatives not only willing to “simply follow orders” but also able to make thoughtful adjustments along the way. Automation is coming, I’ve been told, but these children were not whisked away by droids. It would be less disturbing if they were.

One also observed a disquieting awkwardness as people scrambled for an acceptable explanation as to why kidnapping was wrong. “Un-American” appeared to trump “immoral,” an admittedly blameless choice of words, though it had the unfortunate effect of echoing the nationalist chauvinism of the president himself. He took kids away from their parents in the name of making America great again; his critics demanded he put the kids back in the name of being more authentically American. Am I a killjoy for feeling insufficiently soothed? I happen to own and fly an American flag, but if you want me to swear to tell the truth, the whole truth, and nothing but the truth, it’s not the first place I’m going to put my hand. I was reminded of the debates over “enhanced interrogation techniques” that followed September 11, with some commentators arguing that torture might be permissible in certain extraordinary cases (no rigid binaries like good and evil for those folks) and others arguing against torture on the sublimely moral high ground that it doesn’t yield reliable information. I suppose that someone about to be waterboarded for the tenth time will be grateful for any sort of preemptive argument, even one so grossly utilitarian, but Amnesty International will surely want a better slogan than “only if it works.”

It’s possible we will never know when or if all of these families have been reunited. It’s probable that many of us will have lost our zeal for redress long before the last traumatized children are returned to their parents. In the process of constructing this very paragraph, I realized with a jolt that I could not recall whether the special prison at Guantánamo Bay had been closed. I knew there’d been talk of closing it. Were there still a few inmates? Were they moved or set free? I had to go and ask my wife, who’s better than I am at staying current. “No, honey, it’s still there.”

Much has been written (especially about the young, because that is where adults like to locate cultural pathologies) regarding the shortness of the nation’s collective attention span, about information overload and its numbing effects. Less is written about the minuscule distance that exists between a short attention span and defective moral agency. Most people of conscience would agree that the true test of moral integrity is a principled act, but to perform such an act one needs to have enough presence of mind to move from the principle to its application. To demonstrate your commitment to ice cream, you need to remember why you got up to go to the fridge. Inattention is the gateway drug to nihilism. Distraction is the stealth weapon of the powers that be. Muscle-bound with information, we stand in front of our mirrors and flex. “Knowledge is power,” we croak through strained faces, adjusting our pose a few inches to the left or the right because our legs are falling asleep.

Not long ago I asked a veteran union organizer named Larry Fox what he had learned from his 40-plus years in the labor movement. Somewhat to my surprise, he said he had learned that love is a stronger motivating force for social change than anger.

The answer impressed me for two reasons. First, it brought to mind something the American socialist Michael Harrington had written in the early 1970s, around the same time as the angrier of his New Left comrades were falling under the spell of guns and bombs. “It was as a socialist, and because I was a socialist, that I fell in love with America,” he said, adding this note of prophetic warning: “If the Left wants to change this country because it hates it, then the people will never listen to the Left and the people will be right.” (Any self-satisfied right-wingers smirking at the partial fulfillment of Harrington’s prophecy should understand that the same goes for them.)

Second, I was struck that Fox, like Harrington, was uninhibited by sophistication. He was seemingly unaware that “someone like him” shouldn’t be saying “something like that.” As with the Reverend Barber’s use of the word morality, I realized that what I’m discussing here as a national predicament was in some ways manifest in my own awkward reaction to a word like love. In spite of my various convictions, nihilism had rubbed off on me.

Against nihilism only love can prevail. That is because love must always affirm a value, both the value of the loved one and the value of the love itself. It cannot do otherwise. But here’s the catch: A love capable of confronting nihilism must be nothing less than the militant, self-sacrificial force that it was for Martin Luther King Jr. “If a man has not discovered something that he will die for, he isn’t fit to live,” King said. In light of that statement, one could say that Trump-era America has become a nation of people whose fitness to live is at serious risk. Were it not, they would view the question “Are you prepared to die for your country?” as a tactical consideration instead of a plebeian breach of good taste. They would view the nation’s children as something more than “an investment in the future.” They would not speak of martyrdom as a form of pathology.

But aren’t martyrdom and nihilism close cousins? If you’re asking that with a straight face, then you too have been bitten by the nihilistic bug—or else have confined your understanding of martyrdom to those who strap on suicide vests. With terms derived from Nietzsche, Slavoj Žižek writes of a defining split between the “passive” nihilism of “First World” countries, whose inhabitants “find it more and more difficult even to imagine a public or universal cause for which one would be ready to sacrifice one’s life,” and the “active” nihilism of “Third World” militants who dedicate their lives to some “transcendent cause,” even to the “point of self-destruction.” In so doing, he blurs the distinction between those who would gladly kill for a cause and those who would reluctantly die for it. (He may also be making too neat a division between the First and Third worlds.) Martyrdom and nihilism are as different as Gandhi and Genghis Khan. A nihilist dreams of going out in a blaze of glory, taking as many with him as he can, because he hates his life and despises life in general. A martyr sacrifices her life, which by definition she cannot do unless that life is precious to her. You can only sacrifice what you hold dear. And you do so because something else is dearer still.

The recent plethora of school shootings makes plain not only the costs of weak gun laws, the risks of untreated mental illness, and the tragic repercussions of rearing children through the proxies of digital media and abandoning them to the tender mercies of the NRA—but also what amounts to the national morality play of this moment: nihilism versus self-sacrificing love. The indiscriminate shooter versus the teacher who throws his or her body at the shooter or on top of a wounded student—that is the mirror, that is the choice, that is the split image that keeps coming up on the screen. What’s happening in our schools may soon be happening in our streets, and not for the first time. King was speaking of what he had witnessed in Alabama no less than of what he believed in his heart. You know that punning slogan often seen at anti-Trump protests, “Love Trumps Hate”? If it seems a bit saccharine to you now, that is only because you have yet to tally what building a beloved community might cost. You’re forgetting all the architects who paid the first installments with their blood.

The Essential Difference Between Bernie Sanders and Elizabeth Warren

Senator Bernie Sanders wants to be president—he made that much clear with his energized campaign in 2016. Senator Elizabeth Warren does, too, as several recent moves show. They both have widespread name recognition and share similar political philosophies. Is there room enough in a Democratic primary for the two of them?

That was the premise of a Politico article last week about how the “two progressive behemoths [are] on a collision course in the presidential primary.” Sanders told the outlet that he has not spoken to Warren about their respective ambitions, but added, “I suspect that in the coming weeks and months, there will be discussions.”

Warren and Sanders recognize that they share the same lane in a presidential race: They’re both populists dedicated to fighting economic inequality. Splitting the vote and missing the opportunity to elevate that signature issue in presidential politics would be a worse fate than any extinguished personal ambition.

But Warren and Sanders are hardly identical progressives. They have markedly different approaches to empowering the working class. In the simplest possible terms, Warren wants to organize markets to benefit workers and consumers, while Sanders wants to overhaul those markets, taking the private sector out of them. This divide—and where Warren’s and Sanders’s putative rivals position themselves on it—will determine the future of the Democratic Party for the next decade or more.

Warren’s suite of policies, rolled out in recent months, aims to reform rules that empower the wealthy at the expense of regular Americans. Her Accountable Capitalism Act would ensure worker representation on corporate boards and require large corporations to consider all their stakeholders—not just investors but workers, consumers, and communities of interest—in any decision-making. Her climate bill would force public companies to disclose climate-related risks, giving investors more information to direct their money toward sustainable goals.

Her housing bill would transfer money for affordable housing to communities that adopt zoning policies to make it easier to build. Her 21st Century Glass-Steagall Act seeks structural separation of commercial and investment banks to keep companies making the riskiest trades away from taxpayer-funded bailouts. And the Anti-Corruption and Public Integrity Act would buttress these gains by reducing the role of special interest lobbying in federal policymaking.

Sanders, while concerned with making markets fairer, would rather just rip them up, either through limiting how much companies can grow or instituting publicly funded options alongside them. He would put a hard cap on the size of financial institutions to make them more manageable. He would make public colleges and universities tuition-free, rather than expanding access for certain needy students. He would create a federal jobs guarantee through 2,500 American Job Centers nationwide, and he estimates that his $1 trillion infrastructure investment would support 13 million publicly funded jobs as well. And in health care, he would simply nationalize the insurance sector, putting everyone on a Medicare-style plan.

I am leaving out some nuance. Both Warren and Sanders support single-payer health care, an example of nationalizing a market. They both want a public option for financial services through simple bank transactions at the post office. Sanders’s Workplace Democracy Act changes the rules to make it easier to form unions, a classic market-restructuring move. His proposed bill to tax employers whose workers receive federal benefits influenced Amazon’s increase of its entry-level wages, and that was the intent: organizing private-sector labor markets without taking them over or imposing standards.

But the two senators disagree over the best method to give the working classes a leg up. You can restructure markets so everyone benefits, or you can break down the market system, either eliminating the profit motive or giving everybody a public option. The impulse is the same: The game is rigged and must be fixed. But there’s a long gap between rewriting the rules of the game, as Warren wants, and turning over all the chess pieces, as Sanders does.

What makes more sense? It’s not necessarily a clear-cut choice. Market-based health care has proven more expensive everywhere it’s been tried, and as practiced in the United States it leads to worse outcomes. Even there, however, market competition can yield success. One of Warren’s biggest legislative victories involved breaking up the hearing aid oligopoly, which had been protected by rules requiring doctor prescriptions for all audiological devices. Since the passage of a Warren-written law allowing FDA-approved, over-the-counter hearing aids without a medical evaluation, competitors have jumped into the space, driving down the cost of audio assistance for everyone.

Liberals prefer the concept of a mixed economy—which even Sanders supports, more than his democratic socialist branding would imply. Competition can come from the public sector or the private sector: Breaking up the banks through size caps, or separating their investment and deposit-taking wings, gets you to the same place functionally. The public sector may be better positioned to build a road, and the private sector better positioned to sell you sandwiches.

There’s room for this mixture, but only if markets are bent to the will of the people, as in Warren’s conception. As long as the private sector can get away with pursuing profit at the expense of the public, even the deepest interventions into the market might not succeed. Even under single-payer health care, private hospitals and other providers would still be delivering medical care, and they are so concentrated that they would still facilitate waste and frustrate outcomes. Plus, on a practical basis you can certainly restructure markets faster, in many cases without new approval from Congress. For example, antitrust laws still exist, and regulators with sufficient political will can start enforcing them.

Sanders’s policies are appealing to voters in the way that throwing out a system that doesn’t work is inevitably more appealing than tweaking it. But fighting corporate power and making the rules work for people can also resonate on the campaign trail. Warren has been talking about how the game is rigged as long as Sanders has.

That’s why you see competitors to Warren and Sanders alternately picking up market-restructuring and market-overhaul policies. Senator Cory Booker wants to transfer wealth to young people through a social wealth fund, and also block mergers in food and agriculture markets. Senator Kirsten Gillibrand wants a public option with postal banking and a financial transaction tax to nudge markets away from securities trading.

The decision between Warren and Sanders as the standard-bearer on the left is not merely about personality or electability. It will have implications about what Democrats stand for: a party that wants to make capitalism work for everyone, or one that will nationalize parts of capitalism that don’t work. Ultimately, Democratic voters will have to decide which vision they prefer.

A Super Typhoon Is Pummeling the United States

A record-breaking hurricane slams into a United States territory. The U.S. government is supposed to respond to the damage. But the government is overstretched, and the island is remote. Amid poor transportation, the response suffers. Thousands die.

This might sound like the story of Puerto Rico and Hurricane Maria. But it could easily become the story of the Northern Mariana Islands and Super Typhoon Yutu, a Category 5-equivalent storm that passed over the U.S. territory in the Pacific Ocean on Wednesday. In the northwest Pacific, hurricanes are called typhoons; super typhoons are hurricanes with sustained winds above 150 miles per hour. And based on early estimations, Super Typhoon Yutu may be the strongest hurricane to ever hit U.S. soil, a devastating reminder both of the precarious, second-class citizen position of U.S. territories and of the storm readiness issues the U.S. will have to face as climate change accelerates.

Yutu is certainly the strongest storm to form on earth this year. With sustained winds of up to 180 miles per hour, it’s also “likely to be unprecedented in modern history for the Northern Mariana Islands ... home to slightly more than 50,000 people,” according to The Washington Post. The majority of those people live on the island of Saipan—which the storm’s inner eyewall directly passed over—along with the smaller island of Tinian.

Judging from the crickets on major U.S. news sites, who would ever know that a hurricane (#Yutu) is expected to slam into a U.S. commonwealth as a Category 5 storm in about 12 hours? Most of the 53,000 residents of the #northernmarianas are U.S. citizens https://t.co/HSBnZqcxy3 pic.twitter.com/FPGettue3K

— Bob Henson (@bhensonweather) October 24, 2018

The degree of damage is yet unclear. But Michael Lowry, a strategic planner for the Federal Emergency Management Agency (FEMA), predicted a “devastating strike” on Saipan. “This is a historically significant event,” he tweeted, calling Yutu “one of the most intense tropical cyclones we’ve observed worldwide in the modern record.”

Smaller storms have also wreaked havoc on Saipan in the past; Typhoon Soudelor hit Saipan in 2015 with maximum sustained winds of 105 miles per hour and wiped out much of the island’s power infrastructure. A month before that, another typhoon passed over the island and “disconnected an undersea cable, effectively severing communications between the Northern Mariana Islands and the rest of the world for a few days,” HuffPost reported at the time.

Response and recovery will likely prove challenging. Saipan and Tinian are extremely remote; the closest large countries are Japan and the Philippines, both of which are more than 1,000 miles away. Hawaii, nearly 4,000 miles away, is the closest U.S. state. FEMA thus considers the Northern Mariana Islands an “insular area,” meaning they face “unique challenges in receiving assistance from outside the jurisdiction quickly.”

Yutu is also striking at the worst possible time. The islands are still recovering from Super Typhoon Mangkhut, which slammed into the territory with 105-mile-per-hour winds last month. Mangkhut prompted President Donald Trump to sign a major disaster declaration for the islands, making federal funding available for both temporary and permanent rebuilding. But a major disaster declaration has not yet been issued for Yutu—only an emergency declaration, which makes federal money available for “debris removal” and “emergency protective measures.”

It’s likely Trump will issue a major disaster declaration for Yutu eventually. But if the past is any indication, it might take a while. His major disaster declaration for Mangkhut, for instance, didn’t come until September 29—and the storm hit the islands on September 10. Trump was also criticized for his slow response to Hurricane Maria in Puerto Rico—another remote island U.S. territory that was struggling to recover from an earlier storm when it was hit by an even stronger one. Hopefully, the administration has learned how deadly delay can be.

Can State Courts Save the Liberal Agenda?

No matter when, or how, Donald Trump leaves office, he will have dramatically remade the federal judiciary. His administration has struggled to achieve its goals on health care, the border, and infrastructure—but succeeded all too well with the courts. Trump took office with 105 vacancies in the federal district and appellate courts, almost twice the number Obama had in 2009, and Republicans have rushed to fill them, installing more than 50 judges. The impact of these jurists will be lasting; on the circuit courts, they are on average just 49 years old.

Whatever Trump does at the federal level, Democrats still have a fighting chance in the states. Neither Congress nor the president can control the selection of state judges. Term limits and regular elections make them more responsive to popular sentiment than their federal counterparts, who remain in office for life. (Until his death in August, there was a federal judge nominated by JFK on the bench.) This means that even in states like North Carolina, which Donald Trump won in 2016, there can still be a liberal majority on the state Supreme Court; and crucially, its decisions are rarely overturned by the federal Supreme Court, which typically avoids interpreting state constitutions.

Some Democrats seem to understand the opportunity this presents. “Everything from family law to much of criminal law to our education system here has been affected by a decision of the state Supreme Court,” said Anita Earls, a civil rights lawyer running for a seat on North Carolina’s top court. Those courtrooms may soon be among the last remaining venues in which to pursue a liberal legal agenda. “The state courts are going to be a place where we’re going to be fighting about all of the issues that are important to people’s liberty, people’s equality,” said Samuel Bagenstos, a law professor from Ann Arbor, Michigan, who is running for the Supreme Court there.

In some states, they already are fighting. In Massachusetts last year, the court ruled that state and local police could not detain immigrants solely to buy time for federal law enforcement to take them into custody. In Pennsylvania this past January, the state Supreme Court forced the state to redraw its congressional map, which was so gerrymandered, the justices wrote, that it “clearly, plainly, and palpably” violated the state constitution. And in Iowa, in June, the state Supreme Court ruled that “reproductive autonomy” was protected under the state constitution, meaning that abortion would be legal in the state even if Roe v. Wade were overturned.

A more liberal judiciary in other states could make similar rulings possible. While the Michigan Supreme Court has handed down staunch conservative rulings in recent environmental and labor cases, its ideological balance could shift this November, if Bagenstos and one other liberal can win. In Texas, three seats on the Supreme Court are also on the ballot, and if a single Democrat is elected, he or she would be the first liberal to sit on the court in 24 years. Two Democratic candidates for seats on Ohio’s Supreme Court could provide a vocal liberal minority to challenge conservative rulings on voter suppression and gerrymandering. And in April, Wisconsin elected Rebecca Dallet, a candidate backed by state Democrats, to its Supreme Court, which until now has consistently ruled in favor of Republican Governor Scott Walker’s agenda on such issues as public sector unions and his own recall election.

Meanwhile, some states are considering amending their constitutions to address issues that federal courts, now that they are increasingly conservative, won’t address. Florida, for instance, will vote on a major amendment to end felon disenfranchisement, a racially discriminatory practice that the U.S. Supreme Court has repeatedly upheld as constitutional. If the amendment passes, and the measure is challenged legally, the state Supreme Court, no matter its political leanings, would have to expand ballot access. Colorado, Michigan, and Utah have also advanced state constitutional amendments that, if implemented, would allow independent commissions to redraw gerrymandered districts instead of state lawmakers—a key issue as legislators prepare for redistricting after the 2020 census. And Hawaii’s voters will even decide whether to hold a convention to rewrite their state’s constitution in its entirety.

Republicans understand the stakes and in some instances have resorted to extreme measures to take, or retain, control of the state courts. In August, the West Virginia House of Delegates voted to impeach all four sitting justices on the state’s highest court. Lawmakers defended the move as an effort to restore good government—the justices were accused of lavish and ethically dubious expenditures, including the purchase of a $42,000 desk and a $32,000 couch—but critics saw it as an effort by the Republican state government to seize control. (One of the justices has already retired; the other three will remain on the bench pending a removal trial in the state Senate; if removed, they would almost certainly be replaced by more conservative jurists.) In North Carolina, Republican legislators went even further to take back the state court: They have added a constitutional amendment to the ballot in November that would let them pack the Supreme Court with two additional judges. If passed, it would turn a 4–3 Democratic majority into a 5–4 Republican majority.

A dogged focus on the courts is not new for Republicans. “They have made judicial appointments a priority in this country for a long time,” Bagenstos told me. “And on the other side, there has been much less attention until very recently.” Through the Federalist Society and Judicial Crisis Network, conservatives have been able to drive attention—and money—to key fights. When Trump nominated Neil Gorsuch to the Supreme Court, for example, conservative groups outspent liberals nearly 20 to one. Judicial Crisis Network reportedly spent $17 million. Democrats have a few similar organizations, including Demand Justice, but that group was founded just six months ago. Run by Brian Fallon, Hillary Clinton’s former spokesman, it has not yet weighed in on state races. As opportunities at the federal level dwindle, however, that could change.

Donald Trump’s legacy at the federal level may be assured, but the state courts are another matter. The November elections will decide the fate of the House and Senate. But state court races may prove more consequential for Democrats.

America’s Relentless Suppression of Black Voters

Brian Kemp currently holds two significant positions in Georgia politics, and he has been in the news for both of them. As the Republican nominee for governor, he is engaged in a fierce battle with Democrat Stacey Abrams, who, if she wins, would be the first female African-American governor in United States history. Polling indicates an extremely close race, one that could be decided by tens of thousands of votes.

Kemp is also Georgia’s current secretary of state, where one of his responsibilities is to oversee state elections. In that capacity, he has been engaged in a systematic campaign to restrict the number of Georgians allowed to cast ballots. In July 2017, Kemp’s office purged nearly 600,000 people, or 8 percent of the state’s registered voters, from the rolls; an estimated 107,000 of them were cut simply because they hadn’t voted in recent elections. This year, Kemp has blocked the registration of 53,000 state residents, 70 percent of whom are African-American and therefore could be reasonably expected to vote for Abrams.

Both moves were entirely legal. Georgia, along with at least eight other states, has a “use it or lose it” law that allows it to cancel voter registrations if a person hasn’t voted in recent elections. The state also has an “exact match” law, enacted last year, whereby a voter registration application must be identical to the information on file with Georgia’s Department of Driver Services or the Social Security Administration; if they don’t match, or no such information is on file, then the registration is put on hold until the applicant can provide additional documents to prove their identity. That’s why more than 50,000 applicants are on hold. (They can still vote, with a photo ID, but no doubt their pending status will discourage many.)

Georgia is only one of a number of states attempting to artificially suppress the (Democratic) vote, making voting rights a key issue in this election—not to mention 2020, when Donald Trump seeks a second term. With critics insisting that many state laws restricting voter registration are unconstitutionally discriminatory, a continued series of court tests is inevitable. The Supreme Court thus may be the ultimate arbiter of who is allowed to vote and who is not. This is not the first time the court has been cast in such a role, and history does not beget optimism.

Beginning in 1876, the Supreme Court presided over a three-decade-long dismantling of what seemed to be a constitutional guarantee of the right to vote for African-Americans. The groundwork was laid in May of that year, when, in United States v. Reese, the court determined that the 15th Amendment, which states that the right to vote “shall not be denied or abridged…on account of race, color, or previous condition of servitude,” did not mean what it seemed to mean.

As Justice Joseph Bradley wrote in a companion case, the amendment “confers no right to vote. That is the exclusive prerogative of the states. It does confer a right not to be excluded from voting by reason of race, color or previous condition of servitude, and this is all the right that Congress can enforce.” Bradley thus transferred the burden of proof from the government that has denied someone’s right to vote to the person whose right has been denied, a bar that would prove impossibly high.

In 1880, in a pair of cases decided the same day, the court overturned a West Virginia law that, by statute, limited jury service to white men, but sustained a murder conviction by an all-white jury in a Virginia case because, although no African-Americans were chosen to serve on juries, there was no specific law that prevented it. Southern whites got the idea. As long as a law did not announce its intention to discriminate, it would pass judicial muster.

When Justice Bradley, writing for an 8-1 majority in the Civil Rights Cases in 1883, declared the Civil Rights Act of 1875 unconstitutional and announced that black Americans would no longer be “the special favorite of the laws,” white supremacists in the South ramped up their efforts to keep black Americans from the ballot box, employing terror, fraud, and a series of ludicrous contrivances.

South Carolina, for example, introduced a device called the “eight-box ballot,” equipped with eight separate slots, each designated for a specific candidate or party. To cast a valid vote, a person was required to match the ballot to the correct slot, but the manner in which the ballot and the box were labeled made it virtually impossible for someone not fully literate to do so. Whites were given assistance by agreeable poll workers, while blacks, most of whom could read only barely or not at all, were left to try to decipher the system on their own.

Still, despite all efforts to stop them, African-Americans throughout the South continued to risk their lives and property in order to try to cast ballots. In 1890, armed with the roadmap supplied by the Supreme Court, Mississippi called a constitutional convention to end black voting once and for all.

The new state constitution required that, in order to register, potential voters be Mississippi residents for two years, pay an annual poll tax, and pass an elaborate “literacy” test, which required an applicant to read and interpret a section of the state constitution chosen by a local official. The “understanding and interpretation” test was meant not only to prevent new registration by Mississippi’s extensive African-American population, but also to disqualify those already on the rolls. Whites were given simple clauses to read (and, again, were often assisted by poll workers) while African-Americans were given serpentine, incomprehensible clauses, which had been inserted into the document for that very purpose. When African-Americans were off the voting lists, they would be stricken from jury rolls as well.

None of this was done in the shadows. James K. Vardaman, a racist Democrat who would go on to become governor and then a United States senator, was one of the new constitution’s framers.

“There is no use to equivocate or lie about the matter,” he said. “Mississippi’s constitutional convention of 1890 was held for no other purpose than to eliminate the nigger from politics … let the world know it just as it is.” Democratic Senator Theodore Bilbo, during his campaign for re-election in 1946, remarked, “The poll tax won’t keep ’em from voting. What keeps ’em from voting is section 244 of the constitution of 1890 that Senator George wrote. It says that for a man to register, he must be able to read and explain the constitution … and then Senator George wrote a constitution that damn few white men and no niggers at all can explain.” As a result, according to Richard Kluger’s Simple Justice, “almost 123,000 African American voters were defunct practically overnight.”

Every southern state eventually followed suit. In 1898, Louisiana convened a constitutional convention specifically to disenfranchise African Americans—“to establish,” a committee chairman at the convention said, “the supremacy of the white race.” After its adoption, the number of registered black voters dropped from 130,344 to 5,320.

Court tests of these new state constitutions went nowhere. In June 1896, Henry Williams was indicted for murder in Mississippi by an all-white grand jury. His attorney sued to quash the indictment based on the systematic exclusion of blacks from the voting rolls, specifically citing the 1890 Mississippi constitution. To most laymen looking at the Mississippi voting rolls, it would have been beyond question that some organized chicanery had been afoot.

Yet despite the fact that virtually none of the state’s 907,000 black residents were registered voters, and state officials had publicly announced their intention to disfranchise them, the court ruled that the burden was on Williams to prove, on a case-by-case basis, that registrars had rejected African-American applicants strictly because of race. Justice McKenna wrote that the Mississippi constitution did not “on [its] face discriminate between the races, and it has not been shown that their actual administration was evil; only that evil was possible under them.”

The final blow to African-American voting rights was struck in 1903 in Giles v. Harris, when the court rejected a challenge by Jackson W. Giles, a Montgomery janitor who had voted for two decades, to the registration provisions of Alabama’s 1901 constitution, which contained the usual poll tax, literacy requirement, and a grandfather clause (automatic registration if one’s father or grandfather had been registered).

In a perverse majority opinion, Oliver Wendell Holmes claimed that since Giles insisted “the whole registration scheme of the Alabama Constitution is a fraud upon the Constitution of the United States, and asks us to declare it void,” he was suing “to be registered as a party qualified under the void instrument.” If the Court then ruled in Giles’s favor, Holmes concluded, it would become “a party to the unlawful scheme by accepting it and adding another voter to its fraudulent lists.” This is the very definition of reductio ad absurdum. By Holmes’s reasoning, any law that was discriminatory would be a “fraud,” and the court would become party to that fraud by protecting the plaintiff’s right as a citizen.

To avoid the problem, Holmes could have struck down the offending sections, and asserted that any state provision that, in word or application, violated the fundamental tenets of equal access to the ballot box would also be void. But he chose not to. Law professor Richard H. Pildes described Giles as the “one key moment, one decisive turning point … in the bleak and unfamiliar saga … of the history of anti-democracy in the United States.” With the court’s complicity, by 1906, more than 90 percent of African-American voters in the South had been disfranchised. Unable to influence politics through voting, and with no recourse in federal court, African-Americans were forced to stand by helplessly as the horrors of Jim Crow took root across the South.

Poll taxes, literacy tests, and grandfather clauses are all illegal now, and so, as Brian Kemp and other Republicans have demonstrated, disfranchising black voters has needed to become ever so slightly more sophisticated. And certainly Kemp doesn’t brag about it like Vardaman and Bilbo did. Still, both the tactics and the intent are frighteningly familiar.

In deciding any voter suppression cases that come before it, the Supreme Court will have a stark choice. It can emulate decisions that upheld laws enacted solely and unapologetically to steal the right to vote from millions of African-Americans, or it can recognize the discriminatory and racist intent of these laws and strike them down.

But just as the distant past doesn’t beget optimism, neither does recent history. Back in 2013, the Supreme Court struck down the heart of the 1965 Voting Rights Act, ruling in Shelby County v. Holder that it was unconstitutional to require nine mostly Southern states to seek federal approval before changing their election laws. “Our country has changed,” Chief Justice John G. Roberts Jr. wrote for the 5-4 majority, which included swing Justice Anthony Kennedy. “While any racial discrimination in voting is too much, Congress must ensure that the legislation it passes to remedy that problem speaks to current conditions.”

Five years later, the disastrous effects of that ruling have become apparent. Nearly 1,000 polling places across the country have been eliminated since Shelby. As the Pew Trusts reported last month, “The trend continues: This year alone, 10 counties with large black populations in Georgia closed polling spots after a white elections consultant recommended they do so to save money.” With Kennedy’s seat now occupied by the significantly more conservative Justice Brett Kavanaugh, it’s fair to doubt that this court would find such trends any more troublesome than did the courts that ruled during one of the darkest periods in American history.

How Trump Is Warping the Debate on Trans Rights

This week, The New York Times obtained a draft memo leaked from the Department of Health and Human Services. It argues that the government needs to establish a binary definition of gender, particularly for the purposes of enforcing Title IX—the law stipulating that nobody can be discriminated against on the basis of sex in an educational context receiving federal funding. The new definition would have people defined as either male or female, according to the genitals observed at their birth, with disputes resolved by genetic testing. The memo suggests that gender would be defined “on a biological basis that is clear, grounded in science, objective, and administrable.”

The Office for Civil Rights at HHS is run by Roger Severino, an ideologue with a history of opposing queer Americans’ rights (he has defended gay “conversion therapy” and railed against gay marriage). It is not at all clear that the substance of the memo will be enacted in any form; that it was leaked by the White House suggests it is a provocation designed to inflame and divide voters in advance of the midterms. But there are two aspects to the document that reveal the utter bad faith with which the Trump administration, and Republicans more broadly, are approaching the issue of gender.

The first is very simple. HHS proposes to pin down sex as a matter of biological certainty that can trump a person’s claimed gender. But the concept of two “biological sexes” is inaccurate. For example, not all babies are born with bodies that can be neatly categorized. The Intersex Society of North America estimates that 1 in 1,500 to 1 in 2,000 babies are born with genitals that do not appear obviously male or female. That is a very large population of intersex Americans, whose existence would be falsely recategorized by this memo’s proposed view of sex.

For many years, intersex people have been encouraged or forced, via surgery, to identify with one gender and then “live as a woman” or “as a man.” But that just hasn’t worked, instead leading to great unhappiness for many people. The medical consensus now is that intersex people need to determine their gender for themselves, or risk serious psychological trauma.

It is a mistake to turn intersex people into a kind of “test case” to back up trans people, both because it’s reductive of intersex people’s experiences and because it wrongly implies that trans people’s gender identities have anything to do with their genitals. But the two groups are strongly allied, because trans and intersex people have both been harmed by the imposition of a binary view of birth gender—precisely what HHS wants to do now.

The medicalized view of gender assignment contained in the HHS memo runs counter to everything that research tells us about how gender actually works. But the “two genders and nothing else” idea is repeated over and over again by conservatives who think that there is a vast social justice campaign to undermine traditional American family values, under the umbrella term of “gender ideology.”

That mistake is the core piece of misinformation spread by the HHS memo. But there’s a second, more conceptual element to the memo’s perniciousness. It frames gender—a complex phenomenon that evolves through time and varies from individual to individual—as something basic and biological: as something “that is clear, grounded in science, objective, and administrable.” In doing so, the memo reduces the conversation around gender to body parts. It turns the question of trans rights—the right to be a free person in every legal respect—into a humiliating, dehumanizing debate about penises and vaginas. It demonstrates a refusal on the part of HHS to consider trans people as full human beings, with minds as well as bodies. Worse still, it forces all of us to argue for trans rights on those specious grounds.

That’s not a coincidence. Again and again, fights over civil rights in America (for black people, for gay people) have turned into fights over the barest, basest facts of life—whether this person can copulate with that person, whether this person can use a certain bathroom. The very premise of these debates contains the suggestion that the offending parties are not quite human, or that they have somehow transgressed the bounds of personhood.

This dehumanization is baked into the memo’s design. Want to be considered full, free, trans human beings? First, the memo says, let’s have a conversation about your genitals. Then we can talk about your rights. Because the memo sets the boundaries along those lines, advocates for trans rights can become trapped in an argument that works against them. The debate itself ends up transmitting the memo’s junk science and junk presumptions into the culture, where it can fulfill its purpose of engendering inequality.

There are many precedents for bringing the debate about rights down to the lowest possible denominator. In the nineteenth century, people claimed that mixed-race people were unable to have children, the same way that mules are infertile. It was patently untrue even at the time, but public “debates” over the claim meant that mixed-race people’s lives were always discussed in this degrading context.

In that case and in this, the political discussion purports to be scientific and objective. But it instead manipulates the terms of the conversation to be about organs, not hearts and minds. Trans people are, like any other group, citizens who vote. The leaked HHS memo is a cynical attempt to provoke a firestorm in our culture, in order to draw ever starker lines between the left and the right. But we can’t fall for it. If trans activists are reduced to talking about their genitals at the expense of talking about their human experience, then we are all degraded.

A War Without Civilian Deaths?

The killing of other human beings in war makes graphic an abiding moral dilemma: You might try to make an evil less outrageous, or you might try to get rid of it altogether—but it is not clear that it is possible to do both at the same time. In one of her Twenty-One Love Poems, Adrienne Rich imagines imposing controls on the use of force until it all but disappears: “Such hands might carry out an unavoidable violence / with such restraint,” she writes, “with such a grasp / of the range and limits of violence / that violence ever after would be obsolete.” Yet the lines contradict themselves: If violence is inevitable, however contained or humane, it is not gone.

Nick McDonell’s striking new book about America’s forever war, The Bodies in Person, is a call to contain or minimize one kind of outrageous violence: the killing of civilians in America’s contemporary wars, fought since 9/11 across an astonishing span of the earth. At a moment when Donald Trump has relaxed controls on American killing abroad even beyond what McDonell chronicles and our long-term proxy war in Yemen has broken into gross atrocities—like the Saudi air strike that killed scores of civilians in early August this year—it is a pressing theme. And McDonell’s appeal for Americans not only to attend more to the military excess of the war but also to bring down its civilian toll in the name of the country’s own founding ideals is nothing if not noble.

THE BODIES IN PERSON by Nick McDonell (Blue Rider Press, 304 pp., $28.00)

Pulsating with attention to moral principle, McDonell’s approach to this grisly reality is also highly personal. In his reporting, he restores the identities of civilian victims. “The first step away from a person’s name is the first step toward killing him,” he observes. He gives voice to misgivings that lurk beneath the surface of American consciousness, close enough to cause trouble but submerged too far to break through without help. Instead of explaining the world-historical setting of our wars or presenting another gritty narrative of soldiering abroad, McDonell is interested in how the military thinks, constantly asking our soldiers and their leaders hard moral questions. And yet, fixated on the range and limits of violence, he neglects to ask why our country treats it as unavoidable in the first place.

McDonell comes to the subject of American war as a disillusioned member of his country’s elite. He first became known as a prep school novelist, starting with the publication in 2002 of his smash hit Twelve, composed when he was only 17 years old and still a student at Riverdale Country School in New York. He wrote what he knew: Park Avenue apartments with absentee parents and posh drug dealers. This was the scene in which McDonell set a crime thriller, later made into a film with Kiefer Sutherland. It was both an ambitious undertaking and a remarkably insular vision.

After his years as a Harvard undergrad, McDonell must have concluded that he wanted to learn about more of the world. His third novel, An Expensive Education, marked a transition. Set partly in Somalia, it was still populated by young American elites, but the characters show their growing awareness of the global system of exploitation upon which their own privilege rests. They continent-hop, enjoying their position while also indulging in caustic irony from the top; they amuse themselves without believing anymore in the “stifling orthodoxies” of America’s “ill-conceived experiment in liberty.” The novel marked McDonell’s own evolution along the same lines: “I didn’t always think this way,” McDonell declares at the beginning of The Bodies in Person. “Halfway through my life my country went to war abroad.”

The realities of America’s global presence began to hit home for McDonell as he traveled the world, giving up fiction for reportage, but still equipped with a novelist’s sensibility and style. Afghanistan and Iraq have been his destinations, as he has embedded with ordinary soldiers and engaged in dialogue with higher-ups in the chain of command. Nearly a decade ago, McSweeney’s brought out his nonfiction book The End of Major Combat Operations, a disquieting look at America’s earliest attempt to withdraw with honor from the quagmire it created in Iraq. After a stint at the University of Oxford, McDonell wrote a book of political theory on nomads and their lack of representation in the international system of states. But America’s war kept drawing him back to the field and to the stifling orthodoxies of those who wage and support it.

Kabul, 2015. In recent years, the United States has turned to drone strikes and special forces. Gueorgui Pinkhassov/Magnum Photos for ICRC

American elites have, of course, discovered their country’s wars in prior generations. Fiction about and reportage on the moral realities of American conflict have long been a staple of the publishing industry, as generations of writers have dramatized the commitment of the soldiers while also offering skepticism about the missions on which they have been sent. “As if anyone needs another lyrical American war story,” one of McDonell’s cutting footnotes rightly observes, “when Michael Herr’s Dispatches is in every guesthouse from Kerada to Taimani.”

Like Herr’s and other earlier books, The Bodies in Person works through a series of narrative set pieces: McDonell witnesses the violence itself and studies its various aftermaths, like a seismologist traveling to assess the damage of an earthquake at various ranges from the epicenter. He movingly narrates the death of “Sara,” a young Iraqi girl from Tikrit who has been recuperating after a lifesaving operation, only to be killed by a bomb meant for the ISIS stronghold across the street. Like much prior literature in this vein, McDonell’s book recalls his own firsthand experiences, as he recounts his interactions through “fixers” with ordinary people in one scene, and then tells of jumping a helicopter to an undisclosed location in southern Afghanistan to observe American forces target and kill an enemy.

But through these stories, McDonell is preparing a very specific moral inquiry: How much should Americans contain their violence? Inside the computerized room where the group executes an air strike, McDonell surveys the protagonists, putting a human face on a general and his subordinates (and contractors). If every victim deserves a proper name, those who perpetrate their deaths are named too. These are troops who follow orders, assuming they are just, tolerating civilian fatalities as collateral damage; McDonell, outraged, wants to probe whether they are right to do so, and poses deliberately naive questions. “I’m trying to figure out in my own head,” he tells Callie, a private contractor from Missouri, “whether it is appropriate sometimes to kill people to get what we want.”

According to international law, it is. You can never aim at civilians, the law says. But it is not against the rules to kill civilians “collaterally,” so long as doing so is not out of proportion to the concrete and direct military aim, and so long as you take precautions to avoid or minimize harm. Given the consensus around this vague rule, for better or worse, the question is not whether this rule makes sense, but what it means in operations on the ground. What counts as disproportionate? And who decides? The law does not specify what number of civilian deaths counts as too many to allow attack. It all depends on what the military aim is—and who makes the calculation of how much harm is too much.

McDonell doesn’t delve into these questions as legal matters (he dismisses “legalese” at one point) but looks at how American policy defines acceptable violence—the killing of enemies and even civilians in the bargain, just not too many. He is especially concerned, it turns out, with how much of the decision-making is left to personnel on the ground, as he investigates the so-called Non-Combatant Casualty Cutoff Value (NCV)—the maximum number of civilians likely to be killed before troops have to ask Washington for permission to strike. This number was, according to McDonell’s reporting, 20 during the heat of the Iraq counterinsurgency, ten in Iraq more recently, ten in Syria during the campaign against the Islamic State, and zero in Afghanistan, after a higher number stoked popular outrage and anti-government protests.

Stretches of The Bodies in Person examine how the United States manages the optics of war, notably through policies that undercount collateral deaths. As a much-noted New York Times Magazine article by Azmat Khan and Anand Gopal last fall detailed, and as McDonell confirms, part of the callousness of American policy has been that it trivializes the scope of the damage after the fact. McDonell also covers the activities of the “Civilian Casualty Mitigation Team,” engaged in the macabre work of “consequence management” when civilians are injured or killed. A disturbing but fascinating system has been erected by the United States to offer payments for “solace” to next of kin, without taking blame for the deaths. “They don’t necessarily imply responsibility,” one lawyer tells McDonell. All they say is, “we’re very sorry this has happened.”

McDonell’s ultimate worry is not exactly that the number of civilians who die in America’s war is too high; it is that any civilians die at all. Human life, he reasons, is equally valuable, no matter where its bearer breathes and toils, and the United States should therefore set the NCV at zero. America’s current use of the NCV, allowing a certain number of civilian casualties, makes American lives “more valuable than others, which is contrary to the American axiom that all men are created equal.” McDonell dramatizes the facts of American carnage in Afghanistan and Iraq so powerfully that it’s difficult to criticize his sentiments.

But as the main conclusion of The Bodies in Person, this seems wrong. All policies, including those affecting Americans alone, value lives differently. And as McDonell himself reports, the NCV is already at zero in Afghanistan. Setting it there has by no means ended the bloodshed but has only required units to seek permission from a higher authority before they incur collateral damage. Last year alone, over 10,000 civilians were killed or wounded there, according to a United Nations report. And it’s hard to imagine that any army could completely avoid harming noncombatants while engaged in a conflict. The only way to ensure that no civilians are killed is to end this war altogether—something McDonell, like other American elites, is not ready to demand.

McDonell’s chief mistake may lie in focusing on civilian death as the source of the most serious immorality of American war in the first place. After all, in war, troops die too, and calls to contain violence against civilians (or combatants) in war could function unexpectedly to make war more humane and, therefore, more likely to last longer. Indeed, asking for war without collateral damage is like asking for policing without brutality: It would not tend to end it, but to perfect it as a form of control and surveillance. We don’t know enough about how precisely changes in the NCV in Afghanistan have altered the war. But, even if drones are less and less likely to make mistakes, we do know that on the ground ordinary Afghans complain that their presence above is felt as an oppressive menace. Calling for Americans to kill only the enemy, which is what the military is already trying to do, could polish the moral sheen of that menace.

The containment and minimization of violence in America’s war, particularly when it comes to civilian death, have only made it harder to criticize America’s use of force in other countries. As McDonell rightly puts it in an email to the military, “it’s clear that the U.S. has the most robust CIVCAS [civilian casualties] avoidance policy and process in the world (and in history).” And he records Barack Obama explaining, in justification for his new form of light-footprint war:

People, I think, don’t always recognize the degree to which the civilian-casualty rate, or the rate at which innocents are killed, in these precision strikes is significantly lower than what happens in a conventional war.

For this very reason, many of the Americans McDonell interviews think they are fighting a moral war. One of his sources, a colonel, assures him that U.S. policy is “about just not wantin’ to hurt civilians.” “I mean,” he pronounces, “we take our values with us when we go to war.” The policy of avoiding civilian casualties allows the United States to present warfare as a form of virtue, especially to the extent it is increasingly civilian-friendly.

This narrative has only been growing stronger in recent years. Over the second half of Obama’s presidency, the United States moved away from the large ground campaigns on whose final stages much of McDonell’s narrative centers. Instead, the military has lately turned to deadly policing from the air and the deployment of tiny bands of special forces, which America sent to a full three-quarters of the countries of the world last year. Even while Trump has relaxed applicable guidelines for his forces and tolerated the atrocious conduct of America’s Saudi allies in Yemen, he has maintained a twilight war with few boots on the ground or none. To an unprecedented extent, this novel form of war is defined by its attention to the “humanity” of how it is fought.

Which brings us back to the crux of Adrienne Rich’s poem. Is the trouble that there is not enough restraint, or is it the practice of war (or at least this war) itself? The more containment succeeds—leading to less and less objectionable violence, fewer atrocities, and lower body counts, not merely on the American side but also among civilians caught up in the fray and among legitimate military opponents—the more likely it is that the war will continue indefinitely. What if its worst feature is not collateral death, or even violence, but an attempt at global control and ordering that no one opposes?

Lots of care is needed in considering this possibility, yet it is ultimately even more disturbing than the reality that America must do even better counting its casualties and preventing collateral harm. What if, to be blunt, the deepest moral problem with contemporary American war is not its inhumanity but its existence? What if the Pentagon is outrageously mistaken in the details and is outrageously undercounting civilian deaths, but is also entirely correct that it is fighting the most humane war in history? And what if the deepest moral problem with it is not its failures of containment, but the fact that its increasing nonviolence is part of its endlessness and the global hegemony it enforces?

If McDonell doesn’t ask these questions, it may be because of his nostalgia for the standard beliefs of American elites, whose novelist he once seemed to be grooming himself to become. America was supposed to be founded on a new idea—the universal equality of human beings. Yet it has been engaged in global rule for a while now. “The only non-combatant casualty cutoff value consistent with our values is zero,” McDonell insists, even as he knows both that those values have never guided the U.S. military in a conflict, and that noncombatants have been treated much worse in every American war to date. “Insistence on equality, despite its piercingly slow entry onto the rolls of law, is the most powerful aspect of our experiment,” McDonell adds in the vein of conventional American exceptionalism.

McDonell is aware that this is overoptimistic, but he concludes that, because the “worst crimes stem from a failure of imagination rather than any evil particular to our kind or time or nation,” narratives like his own might change the equation someday. I hope so. But if they do, they will need to reckon with the fact that we confront our endless war only partially when we focus on the civilian death and injury (or, for that matter, death and injury generally) that it involves. If this war can last forever because it is so humane—and fought for ends of control and surveillance rather than with the aim of killing enemies to break the will of the other side—then making it less deadly and more humane may extend it only more.

Don’t Count on Republicans to Punish Saudi Arabia

President Donald Trump has stuck by Saudi Arabia through every twist of the saga of Jamal Khashoggi, the Washington Post columnist and Virginia resident who was murdered in the Saudi consulate in Istanbul earlier this month. He seemingly was the only person in Washington who believed the country’s initial denials of any involvement. When Saudi Arabia followed two weeks of lies with an admission and a bizarre defense—that Khashoggi had been killed after an “accidental fistfight”—Trump accepted that, too. The president has made it clear that all that really matters is the preservation of a $110 billion arms deal reached last year, which he falsely claims will create a million jobs.

“I would prefer that we don’t use, as retribution, canceling $110 billion worth of work,” he said on Friday, later adding, “You know, I’d rather keep the million jobs, and I’d rather find another solution.”

This has put Senate Republicans, particularly the self-appointed torch-bearers of the party’s foreign policy, in a familiar bind. Bob Corker told CNN’s Jake Tapper on Sunday that the Senate had invoked the Magnitsky Act, which could lead to sanctions being placed against Saudi Arabia. Ben Sasse told Tapper, “You don’t bring a bonesaw to an accidental fistfight,” referring to the implement allegedly used to dismember Khashoggi’s corpse. Peter King went even further, telling ABC’s George Stephanopoulos that he thinks the Saudis are “the most immoral government we’ve ever had to deal with.”

But with the exception of the usual exceptions, like the war skeptic Rand Paul, the calls for action have been relatively muted. Some Senate Republicans have even acted as though they’re the real victims of Khashoggi’s murder. Lindsey Graham told reporters, “I’ve been the leading supporter along with John McCain of the U.S.-Saudi relationship. I feel completely betrayed.” And Marco Rubio tweeted this unfortunate formulation:

The #KhashoggiMurder was immoral. But it was also disrepectful to Trump & those of us who have supported the strategic alliance with the Saudi’s. Not only did they kill this man,they have left @potus & their congressional allies a terrible predicament & given Iran a free gift.

— Marco Rubio (@marcorubio) October 22, 2018

It’s becoming increasingly clear that, for all their outrage (or mere disappointment) at Saudi Arabia, Senate Republicans have no intention of doing anything about the Kingdom’s murder of Khashoggi. And if that turns out to be true, it will be for the same reason that the Senate has backed down time and again since the beginning of 2017. Over the past two years, Trump has systematically demolished one Republican principle after another. The party’s foreign policy moralism would simply be the last one to die.

Despite all the talk of possible rapprochement with Russia, this administration’s most successful foreign policy reset has been with Saudi Arabia. When Trump took office, the long relationship between the United States and Saudi Arabia, built on oil and a desire to contain Iran, was in its worst shape in decades. Engaged in proxy wars across the Middle East, the Saudis had opposed both the 2003 invasion of Iraq and the 2015 Iran nuclear deal, on the grounds that both empowered their enemy.

But a visit from Trump in early 2017 ended in the $110 billion arms deal and, a couple of months later, the renunciation of the Iran deal. After Trump’s visit, CNN reported the “Saudis have begun to view Trump as a like-minded partner—one who put Iran ‘on notice’ early in his presidency and has vowed to take a tougher line on the Saudi nemesis than his predecessor. His team also seems less likely to chide the kingdom on human rights issues, a perennial thorn in the US-Saudi relationship.”

Crown Prince Mohammed bin Salman’s successful 2018 visit to the U.S.—and cultivation of elites in American media, business, and, most of all, politics—allowed some skeptics to see what they wanted to see in Saudi Arabia. Promising to remake his country’s oil-based economy, Bin Salman presented himself as a reformer, ending his country’s ban on female drivers in 2017. This image glossed over the Kingdom’s numerous human rights abuses both at home and abroad. The country is deeply engaged in two brutal wars in Syria and Yemen, which is in the midst of one of the worst famines in recent history thanks in large part to a Saudi blockade.

Nevertheless, that famine and Saudi Arabia’s indiscriminate bombing (of a bus filled with 50 schoolchildren, among other horrors) were largely ignored until Khashoggi’s brazen murder. Now, Republicans are having to answer for their support of an authoritarian country, and to figure out how far they can go in sanctioning it without provoking the wrath of a president who would “rather find another solution.” If history is any guide, they won’t go very far at all.

Ever since Trump took office, Senate Republicans have made a big show of pushing back against Trump whenever his foreign policy deviated from the party establishment. But true action has been rare. After Trump accepted Vladimir Putin’s promise that Russia didn’t interfere in the 2016 election, the “outcry, including from Republicans, was instant,” The New Yorker’s Evan Osnos noted. “More remarkable, though, was what didn’t happen. No one resigned from the Cabinet. No Republican senators took concrete steps to restrain or contain or censure the President.”

And even when the Republican-led Senate has taken concrete steps, they have often been symbolic. In response to Trump’s repeated criticism of NATO, the Senate “passed a non-binding measure, 97-2, that expresses support for NATO, its mutual self-defense clause and calls on the administration to rush its whole-of-government strategy to counter Russia’s meddling in the U.S. and other democracies,” DefenseNews reported in July. The Senate passed another non-binding resolution after Trump briefly flirted with the idea of handing over U.S. officials to Russia for questioning. Senate Republicans have also tried strongly worded committee reports and letters to the president.

As long as Trump is the most popular Republican politician in America, and he’s taking the arms deal off the table, it’s unclear what Senate Republicans could do to send a meaningful message to Bin Salman and Saudi Arabia. It’s also unclear that they even want to. From Senators Graham and Rubio, there’s the sense that Saudi Arabia’s human rights abuses are better left ignored—that what matters is that they are allies in the fight against Iran. Surely others agree with Trump that one man’s murder does not warrant reneging on a $110 billion arms deal, which might explain why talk of blocking the deal appears not to have nearly the necessary support in the Senate.

The New York Times reported on Saturday that Trump is “betting he can stand by his Saudi allies and not suffer any significant damage with voters.” He’s probably right, and some Senate Republicans probably are making the same wager. The midterms have revolved almost entirely around health care and immigration for weeks now, and that’s not likely to change. Some on the right, notably evangelical leader Pat Robertson, are shrugging off Khashoggi’s murder, which only gives them further cover.

If the Republican Senate ultimately does nothing meaningful to punish Saudi Arabia, it will represent the party establishment’s final capitulation to Trump. This was perhaps inevitable after Senator John McCain’s death in August. Though his own foreign policy views were deeply flawed, there’s little doubt that McCain would have been the most morally righteous Republican voice in this moment, chastising Trump and calling on the Senate to punish Saudi Arabia. “We are not the president’s subordinates,” McCain said upon his triumphant return to the Senate last year, after being diagnosed with brain cancer. “We are his equals.” It’s not clear that any of his surviving Republican colleagues feel the same way.

The Reality Behind ‘Migrant Caravans’

Seven thousand Central American migrants are traveling toward the United States. President Trump wants Mexico to stop them. “I must, in the strongest of terms, ask Mexico to stop this onslaught,” Trump tweeted on October 18, when the “caravan” of mainly Honduran, El Salvadoran, Guatemalan, and Nicaraguan migrants arrived at the Suchiate River on Mexico’s southern border. “Sadly, it looks like Mexico’s Police and Military are unable to stop the Caravan heading to the Southern Border of the United States. Criminals and unknown Middle Easterners are mixed in,” he tweeted on Monday.

Trump has often suggested Mexico should help the U.S. halt Central American migration. In early September, the Trump administration announced that it would give $20 million for bus and airplane fare to deport 17,000 undocumented Central Americans from Mexico. (Mexico refused the funds.) Last week, Trump revived a request that Mexico agree to a law similar to the European Union’s policy that migrants apply for asylum in the first “safe” country they arrive in: Central Americans bound for the United States would have to seek asylum in Mexico instead, although violence there is at an all-time high.  

In reality, Mexico has been serving as the United States’ militarized buffer zone for some time—in no small part due to concerted efforts by U.S. administrations. Since 2014, the United States has spent nearly $200 million expanding a deportation regime in Mexico that has expelled over 600,000 migrants, mostly to the Northern Triangle countries—Guatemala, Honduras, and El Salvador—but also to South America, Africa, and Central Asia. The arrangement has allowed both the Trump and Obama administrations to outsource their dirtiest work onto Mexico, deploying U.S. immigration officials and U.S. equipment throughout the country to help carry it out. 

Mexico’s transformation into a full-fledged deportation state began in the summer of 2014, when a large number of unaccompanied Central American children—over 68,000 in twelve months—arrived at the U.S.-Mexico border. The Obama administration moved swiftly with Mexico’s President Enrique Peña Nieto to approve a $100 million plan, known as the Programa Frontera Sur, that would protect “the safety and rights” of Central American migrants and secure Mexico’s southern border with Guatemala. “Both governments deny that the U.S. leaned on Mexico to crack down,” said Adam Isacson, a Mexico security expert at the Washington Office on Latin America (WOLA), but “in our view, it’s completely what happened.”

The reality of Programa Frontera Sur differed strikingly from its stated mission of protecting migrants. Programa Frontera Sur paid for advanced border control machinery—drones and security cameras, fences and floodlights, alarm systems and motion detectors—and the expansion of a controversial national immigration service known as Grupos Beta. The organization is tasked with providing water, first aid, and directions to migrants. But Grupos Beta workers—who stand out against the wilderness in neon orange t-shirts—have been known to extort cash from migrants and report them to immigration officials who detain and deport them. Cargo trains, known as “the Beast,” that Central American migrants rode atop on their journey north were sped up so that migrants could no longer jump onboard, forcing them to forge routes through the forests of Chiapas and Oaxaca—where they are frequently attacked and robbed. Valeria Luiselli, a celebrated Mexican writer, described Programa Frontera Sur as an “augmented-reality video game,” where “the player who hunts down the most migrants wins.”


With Programa Frontera Sur, the United States extended its reach deep into Mexico’s interior. U.S. Border Patrol agents were dispatched to train immigration agents throughout Mexico’s 58 detention centers. In April, the U.S. Department of Homeland Security expanded an $88 million program for biometric equipment at Mexico’s southern border checkpoints that shares the fingerprints, iris scans, and descriptions of scars and tattoos with the U.S. Immigration and Customs Enforcement (ICE). Another $75 million from the United States went towards building communications towers along the remote Guatemala border. “The entire country of Mexico is now a border,” one Mexican analyst declared.

By one measure, Programa Frontera Sur achieved its intended outcome. Since 2014, Mexico has deported more Central Americans each year than the United States—nearly 180,000 in 2015. (For comparison, the top-deporting country in the EU, Greece, deported only around 105,000 migrants in 2015, at the height of Europe’s migrant crisis.) Now the United States seems to be building on the model. Throughout 2018, reports surfaced of Mexican immigration agents in the northern Mexican border cities of Tijuana and Nogales receiving orders from U.S. Border Patrol to detain and deport Central American migrants, despite their legal right to apply for asylum in the United States. “It’s a collaborative program that we’re doing with the Americans,” a Mexican immigration official told a Texas immigration lawyer in July.

But the militarization hasn’t changed the underlying dynamics driving immigration: El Salvador and Honduras rank as the second and fourth most violent countries in the world, respectively, and Guatemala trails not far behind. As a result, immigration from the so-called Northern Triangle countries has risen even as the number of Mexicans immigrating to the U.S. has declined. Maureen Meyer, a Mexico expert at WOLA, estimates that 400,000 Central Americans pass through Mexico in any given year. Parents continue to send their children north because the dangers of staying outweigh those of leaving.

Mexico, too, wants Central American immigrants out. A 2014 study found that many Mexicans discriminate against Central Americans, who live in Mexico’s most dangerous neighborhoods and work low-wage jobs that Mexicans avoid. Migrant shelter workers told WOLA in one survey that they have extracted pellets out of migrants’ legs after Mexican immigration agents shot at them with pellet guns. The same report cited migrant testimonies that immigration agents also use electric shock devices, despite laws prohibiting the use of all weapons. The Mexican media often frames Central Americans as gangsters, even though only eight of the 21,000 migrants scanned last year with biometric equipment were identified as gang members.

These factors create a harrowing ordeal for those fleeing violence in their home countries. Women are particularly vulnerable. An estimated eight in 10 migrant women and girls are raped while traveling through Mexico. Many reportedly bring birth control as a precaution.

Mexico has repeatedly and adamantly refused to pay a cent for Trump’s wall on the U.S.-Mexico border. But in many ways, Mexico has long been paying for a wall—on its southern border, instead. Since 2014, a vast and sophisticated deportation apparatus has emerged in Mexico that has traumatized and harmed hundreds of thousands of people. Although U.S. assistance only accounts for about 2 percent of Mexico’s $10 billion annual defense budget, much of the new infrastructure would not exist if it weren’t for the United States persistently nudging Mexico to crack down. There are parallels here with how rich Western European countries like Germany and France have relied on poorer countries in southeastern Europe and North Africa to halt the flow of migrants coming from Syria, Afghanistan, Mali, and Guinea.

On October 3, President Trump called Mexico’s president-elect, the charismatic leftist Andrés Manuel López Obrador, to discuss among other things how to halt Central American migration. López Obrador said he planned to plant two and a half million acres of timber and fruit trees in southern Mexico and build a high-speed “Maya” tourist train linking the temples of Palenque to the pyramids of the Yucatan: Creating jobs for Central American migrants will keep them from the United States, López Obrador believes. “Great phone call,” Trump tweeted approvingly afterwards.

López Obrador, who will succeed the deeply unpopular Peña Nieto on December 1, could present a threat to Trump’s deportation agenda. A pacifist and anti-imperialist nationalist, López Obrador has repeatedly expressed disdain for Programa Frontera Sur—and promised to focus on “addressing the root causes of Central American migration.”

“We are not going to chase migrants. We are not going to criminalize them,” Alejandro Encinas, the incoming undersecretary of immigration, recently told The Washington Post. López Obrador’s incoming cabinet has said that it would not cooperate with U.S. requests for the policy that would require Central Americans to seek asylum in Mexico rather than the United States.

Pressing for jobs, not pellet guns, departs radically from prior approaches. López Obrador’s immigration agenda remains vague, and many doubt that he will entirely dismantle Programa Frontera Sur or open the southern border to Central American migrants. Come December, both countries will find out how serious the new Mexican president is.

How Colleges Fail Young Trump Supporters

Earlier this month, on the eve of a federal trial over Harvard’s use of race in admissions, the university’s president invoked the predominant defense of affirmative action: It enhances education for everybody. “Harvard is deeply committed to bringing together a diverse campus community where students from all walks of life have the opportunity to learn with and from each other,” Larry Bacow wrote.

As a professor, I believe in that ideal as deeply as I believe in anything else. But since the 2016 elections, I’ve come to question whether our elite universities believe it. Despite our rhetoric of diversity, we haven’t made a sustained, explicit effort to learn from a significant but typically ignored minority in our midst: Donald Trump supporters.

Since 2016, I’ve had several pro-Trump students come out to me in my office, with the door closed. One student reported that he had heard a slew of egregiously offensive statements by his peers—including “Trump voters are racists, idiots, or both”—but that he hadn’t said anything in response, for fear of drawing ridicule and hostility. “Please don’t out me in class,” he added.

As Jon Shields and Joshua Dunn wrote in their 2016 book, Passing on the Right, younger conservative professors sometimes describe their plight in the language of gay rights. Like closeted homosexuals, they frequently disguise their identities and play along with the majority—at least until they get tenure, when they’re more likely to express their true selves.

Conservative students don’t have the same freedom. Like the Bryn Mawr student who was flamed on social media after she sought a ride to a 2016 Trump rally—so brutally that she withdrew from school—my Trump-supporting students are understandably afraid that they’ll be vilified by their peers.

Others feel maligned or threatened by their professors, almost all of whom are opposed to the president. So am I. Precisely because I dislike Trump, however, I think it’s my duty to talk with—and learn from—people who disagree with me.

And that’s where I disagree with many of my colleagues, who seem perfectly content to let pro-Trump students stew on the sidelines. Recently, at a faculty meeting at my school, we were asked what we’d do if a Trump supporter said she didn’t feel comfortable expressing her views in class on immigration.

“Why should she feel comfortable?” one professor asked. The people we really need to worry about, another faculty member added, are immigrants and students of color, whose “humanity” is under assault every day.

But insulating our classrooms from pro-Trump sentiment condescends to our minority students, all in the guise of protecting them. They already know that Trump’s election unleashed ugly outbursts of bigotry across the country. Trump himself has made dozens of highly offensive remarks about racial minorities, women, and the disabled. I understand why our minority students would be skeptical about people who voted for him.

But it’s cynical and prejudicial to assume that every Trump voter is a racist or a misogynist. And, like every prejudice, it’s born of ignorance: We don’t talk to each other, so we don’t know about each other either. Since the 1970s, as Bill Bishop detailed in his 2009 book The Big Sort, a declining fraction of Americans have reported conversations with people of a different political perspective. And people with more education are even less likely to engage in discussions across the political aisle.

That’s the ultimate indictment of our universities, which should expose us to ideas and people outside of our personal experience. And it underscores the unmet vision of affirmative action, which was designed to help students “learn from their differences” and “stimulate one another to re-examine even their most deeply held assumptions about themselves and their world,” as former Princeton President William G. Bowen wrote decades ago.

Bowen’s comment was cited approvingly by Justice Lewis F. Powell Jr. in Regents of the University of California v. Bakke, the landmark 1978 Supreme Court case upholding affirmative action. Drawing again from Bowen, Powell quoted a Princeton graduate who noted that students “do not learn very much when they are surrounded only by the likes of themselves.”

That’s exactly right. And that’s why I hope the federal district court in Boston upholds Harvard’s affirmative action system. But in the same spirit, I also hope we’ll make a concerted effort to ensure that all of our students can say what they think once they get here. That means encouraging our Trump supporters to speak up in class (without outing them against their will, of course). And it means insisting that the rest of us grant them a respectful hearing, no matter what we think of Trump.

You can’t support race-conscious admissions, as a way to widen the conversation, then restrict the conversation to people who agree with you. That makes a mockery of affirmative action, and of the university itself. Students need to hear a broad array of voices, so they can learn from their differences. And they won’t learn very much if they are surrounded only by people like themselves.

How Dark Money Groups Keep Their Donors Hidden

Should nonprofit groups that buy ads supporting or attacking political candidates be required to disclose their donors? The answer would seem obvious. But in the post–Citizens United landscape—where politics are awash in funds from unknown sources, in unlimited amounts—it was entirely possible that when the Supreme Court got the chance to weigh in on that question, it might side with the defenders of dark money.

Last month, though, the court made the decision to let stand a lower-court ruling that forced dark money groups—mostly the nonprofit arms of groups such as the NRA and Planned Parenthood—to disclose the identity of donors who gave more than $200 for the purpose of influencing federal elections. Champions of campaign-finance reform hailed the decision.

“Great news!” tweeted Senator Sheldon Whitehouse, a Democrat from Rhode Island. “A blow to creepy #darkmoney forces.” Defending Democracy, an independent, nonpartisan initiative, tweeted that the ruling was a “major victory against #DarkMoney” and that it would bring more transparency to American elections, “effective immediately.” Ellen Weintraub, the Democratic vice chair of the Federal Election Commission, called the Supreme Court decision a “real victory for transparency.”

This is a real victory for transparency. As a result, the American people will be better informed about who’s paying for the ads they’re seeing this election season.

/2

— Ellen L Weintraub (@EllenLWeintraub) September 18, 2018

But the celebration proved premature. The latest FEC disclosure reports, released last week, show that most of these groups are still hiding their anonymous donors despite last month’s court order. It turns out that the new disclosure requirements are not as expansive as the reformers had hoped. There’s a gaping loophole—and Democrats are benefitting from it as much as Republicans are.

In 2012, Citizens for Responsibility and Ethics in Washington (CREW), a nonprofit watchdog, filed a complaint with the FEC against Crossroads GPS, a conservative nonprofit organization co-founded by Karl Rove. CREW alleged that Crossroads, which was spending tens of millions of dollars to support Republican political candidates, was violating federal law by keeping its donors secret.

It wasn’t until three years later, in 2015, that the FEC put the issue to a vote. But the six-member commission deadlocked, as the three Republican commissioners opposed an investigation into Crossroads. Its complaint dismissed, the watchdog sued the FEC for not investigating Crossroads. The case wound its way to the U.S. District Court for D.C., which ruled in CREW’s favor this past August. When the Supreme Court refused to block the ruling, the FEC was forced to issue new guidance earlier this month.

The FEC wrote the narrowest rules possible without running afoul of the courts. The commission didn’t require all nonprofit groups that fund political ads for or against candidates to unmask their donors, as reformers had hoped it would. Instead, it only required this of groups that solicited funds specifically for that purpose. As Brendan Fischer, the federal reform program director from the nonpartisan Campaign Legal Center (CLC), explained, the new requirements “won’t ensure disclosure of donors to groups that spend money on ads that don’t expressly tell viewers how to vote.”

For example, if a group raised money with an appeal to increase federal funding for birth control, but then spent the money on ads asking voters to oppose Democratic Senator Joe Manchin in the November election because he supported Supreme Court Justice Brett Kavanaugh, then it wouldn’t have to disclose its donors. The group would only have to disclose them if it had explicitly solicited donations with an appeal to take down Manchin in the midterms.

So it was no surprise when, on October 15, the FEC released the latest campaign finance filings and all but a few politically active groups continued to hide their anonymous donors. For the period from September 18 (after the court ruling) to September 30, only four of the 17 political nonprofits with independent expenditures—that is, money spent on advocating for or against candidates—disclosed their donors, according to the CLC. Only one of those groups began revealing its donors after the FEC’s new guidance, suggesting that the guidance had little impact on dark money disclosures.

A list of nonprofits with political spending in the third quarter as of October 3. Working People Rising later began disclosing its donors. (Campaign Legal Center)

Even in these few instances, the transparency only goes so far. The four groups, all left-leaning, are Unite Here Arizona, Working America, Mi Familia Vota, and Working People Rising. Their donors are largely labor unions or other nonprofits, and those groups have not disclosed their individual donors. So we still don’t know exactly where the money is coming from.

The Koch-backed Americans for Prosperity, the AFL-CIO, Heritage Action, and the Humane Society Legislative Fund were among the dark money groups that did not disclose their donors. But the group with the most independent expenditures in the third quarter was Majority Forward, a liberal nonprofit connected to the Democratic Senate Majority PAC. The group spent $12 million on political ads between September 19 and October 3, bringing its spending total this election cycle to almost $29 million—mostly on ads against Republican Senate candidates in swing states like Florida, Indiana, and Arizona.

Majority Forward doesn’t disclose its donors thanks to the aforementioned loophole, as its statement to the FEC last week makes clear: “As a matter of policy, Majority Forward does not accept funds earmarked for independent expenditure activity or for other political purposes in support or opposition to federal candidates.”

“Democratic Party officials say they oppose dark money,” the CLC’s Fischer said, “but so far a Democratic group (Majority Forward) is the biggest spender failing to disclose its donors in the wake of this pro-disclosure decision.” (Several other dark money groups, including the right-leaning Patriot Majority and left-leaning Taking Texas to the Top, echoed Majority Forward’s reasoning for not disclosing.)

Weintraub, the FEC’s Democratic vice chair, acknowledged her disappointment: “When we first read the opinion from the court, I think we got a little over excited about what it might be able to do.” But “any additional disclosure is a good thing,” she added, and insisted that there’s “much greater cause to investigate dark money groups, independent expenditures and straw donors now than in the past because of the new disclosure requirements.”

But can her commission keep up with dark money groups? With every new ruling or regulation, it seems, these groups just find another way to remain in the dark. True transparency will require comprehensive action from Congress and the White House to change our campaign finance laws. Anything short of that is just an illusory victory.

The Rise of Sucker-Punch Hurricanes

On Monday morning, a mild tropical storm in the Pacific Ocean heading toward Mexico’s western coast suddenly transformed into one of the strongest tropical cyclones ever to threaten the region. Hurricane Willa—which had maximum sustained winds of 40 miles-per-hour less than 48 hours prior—had quadrupled in intensity, gusting at 160 miles-per-hour.

Residents of the Mexican states Sinaloa and Nayarit must now scramble to prepare for a Category 5 storm they didn’t realize was coming. Expected to make landfall there on Tuesday afternoon, Willa is now threatening “life-threatening flash flooding and landslides,” “dangerous storm surge,” and “large destructive waves,” according to the National Hurricane Center.

Willa’s unexpected growth has been both “explosive” and “extremely impressive,” the NHC said. But it’s also eerily familiar. Willa is the third devastating hurricane in the last month to experience “rapid intensification,” a phenomenon in which a hurricane’s sustained wind speeds increase by at least 35 miles-per-hour over a 24-hour period.

Explosive rapid intensification from low-end tropical storm to Category 5 hurricane in 48 hours -- zooming in on the core of #Willa pic.twitter.com/4XJ9aASOuS

— Stu Ostro (@StuOstro) October 22, 2018

Hurricane Michael was the last to catch meteorologists off guard. It was only a Category 1 storm on October 8, less than two days before it made landfall as nearly a Category 5, one of the strongest hurricanes ever to make landfall in the United States. Hurricane Florence jumped from a Category 2 to Category 4 storm in a period of a few hours before it slammed into North Carolina in September. The trend isn’t limited to this year, either. The most destructive and infamous storms of 2017—Hurricanes Harvey, Irma, Jose and Maria—all underwent rapid intensification, according to The Washington Post.

It’s fairly normal for hurricanes to gain strength over short periods of time, but rapid intensification is becoming more severe. “A study in the journal Geophysical Research Letters found that the magnitude of these rapid intensification events increased from 1986 to 2015 in the central and eastern tropical Atlantic Ocean,” the Post noted. “From 1986 to 2000, the average storm that rapidly intensified saw its peak winds increase by 32 mph in 24 hours, but the increase was 36 mph in 24 hours from 2001 to 2015.”

It’s also abnormal—and far more dangerous—when hurricanes undergo rapid intensification right before they make landfall, thus leaving less time for people to prepare. The fact that this happened twice in a row with Michael and Florence was “unusual,” Corene Matyas, a climatologist at the University of Florida in Gainesville, told NBC News in mid-October.

Remarkable comparison of the rapid intensification of Hurricanes Michael & Harvey.

Both occurred just prior to landfall. pic.twitter.com/gfYuAudEQI

— Dakota Smith (@weatherdak) October 14, 2018

But the conditions were right for Michael to strengthen close to landfall, Matyas said, citing abnormally warm water running abnormally deep. Those conditions are becoming more common. “As the world continues to warm from the increase in greenhouse gases, the coming decades are likely to bring hurricanes that intensify even more rapidly,” reads a report from the non-profit Climate Central.

In other words, Willa’s rapid intensification may represent the third sucker-punch storm in a row this year. But as long as we remain on a course of increasing carbon emissions, we should expect to keep getting hit.

Sarah Perry’s Melmoth Is a Gothic Tale With a Conscience

Sarah Perry’s new book, Melmoth, is an extravagant mille-feuille of dread, disquiet, and fear. Her first novel, After Me Comes the Flood (2014), was an avant-garde tale of a man who drives out of a drought-stricken town to find his brother, but finds himself drawn into a strange house filled with people who seem to expect him. Her second novel, The Essex Serpent (2016), put her on the map. It won the 2016 Book of the Year at the British Book Awards and gathered tremendous reviews from The New York Times to the London Times. It follows Cora Seaborne, a beautiful widow who meets a gorgeous but married pastor in Essex in the nineteenth century. The titular serpent is a Loch Ness Monster–style local legend, supposedly returned to life to snatch goats and terrorize children.

Cora is interested in fossils, in classic late-Victorian fashion. The discovery of ancient remains shook Victorian culture deeply, as the Bible had been read to say that the world was only 6,000 years old. Women were key to early paleontology: In the early nineteenth century, Mary Anning discovered the first ichthyosaur skeleton, a monster rather matching the Essex Serpent’s description. Is the monster real, and just waiting for Cora to make a world-shattering identification and also propel women’s role in science forward? Or is it a local trauma sublimated into myth, providing a convenient way to frame a church-versus-science conflict between Cora and her handsome pastor?

The novel is engaging, florid, and fun. In The Washington Post, Ron Charles wrote that Cora is “the most delightful heroine I’ve encountered since Elizabeth Bennet in Pride and Prejudice.” Indeed Cora is delightful. But there is a faint sprinkling of schmaltz over the romance at the novel’s heart, by dint of Perry writing star-crossed lovers with strong arms and wild hair, respectively.

In contrast, Melmoth, inspired by the 1820 Gothic novel Melmoth the Wanderer by Charles Maturin, is schmaltz-free.* Our protagonist is Helen Franklin: “forty-two, neither short nor tall, her hair neither dark nor fair; on her feet, boots which serve from November to March, and her mother’s steel watch on her wrist.” We are in contemporary Prague, where Helen, a British expat, lives a life of severe austerity. She sleeps on a mattress with no sheets and refuses to eat properly or listen to music. The central mystery of the novel is this: What has Helen Franklin done that she must live like a solitary, faithless monk?

MELMOTH by Sarah Perry. Custom House, 288 pp., $27.99

Perry withholds the story until near the book’s end. The intervening pages are composed of another mystery with supernatural aspects. One day, Helen’s friend Karel shoves a folder into her arms before promptly disappearing. The folder contains research into a figure called Melmoth—the woman who denied seeing Christ in the garden on the night of his resurrection. Robed in dark clothes and condemned to walk the earth for eternity, Melmoth the Wanderer appears to people when they have lost all hope and invites them to join her in her suffering. “I’ve been so lonely,” she says, extending to them her ghostly hand.

Karel’s documents span medieval Europe, the Armenian genocide, and World War II Prague. A pattern emerges. Melmoth appears to people who have committed grave sins. A hapless bureaucrat does his government job, pushing paper that will lead to violent, unseen consequences elsewhere. Melmoth appears to him on a beach among the scattered bodies his paperwork has condemned. She is an angel of history, or an embodied, private conscience that attends to the machinery of evil. Individuals are just cogs in those machines, and Melmoth appears at the instant that those individuals realize their terrible culpability.

Is Melmoth real? As in The Essex Serpent, Perry conjures a dark, unseen monster to see what her characters do with it. Of course, a fiend that exists solely in the mind is no less frightening than one observed by science. In this sense, Melmoth resembles a classic Gothic work. Like Frankenstein’s monster, Melmoth speaks the truth. And Jonathan Harker’s words in Dracula could be Helen Franklin’s own: “I am all in a sea of wonders. I doubt; I fear; I think strange things, which I dare not confess to my own soul.”

Franklin’s refusal to disclose her own mysterious guilt mirrors the refusal of Melmoth’s other victims to acknowledge that they have had agency in historical atrocities. Melmoth asks a question: Is one person’s crime of equal weight to the crime of a whole society? Does an individual sin in a different way than a government? Either way, sins committed manifest themselves in the human mind as guilt, and it is guilt that drives all hope and joy from a person’s mind. That is the moment when Melmoth in her dark robe appears, reaches out. She’s been so lonely, she says, appealing to the total solitude of the guilty.

If The Essex Serpent mined Victorian history for a legend and worked it up into a romance with broader social themes, then Melmoth repeats that trick in multiple dimensions. It is the story of Helen, but also the story of nations. Far richer than a romance, Melmoth uses the Gothic mode to sketch a psychological model of guilt that scales up and down. Sin can be collective, but it is only repented individually, Perry seems to argue. Half spooky story, half meditation on history, Melmoth revives the Gothic form and drags it through time, into our present.

*A previous version of this article stated that Maturin’s Melmoth the Wanderer was originally published in 1920. The correct date is 1820. We regret the error.

The Dealmaker

On the evening of June 11, a few hundred people gathered across the street from the St. Regis Hotel in Singapore. Many were holding up cameras. Some were posing with selfie sticks. Others, fresh-faced and groomed despite the sweltering heat, positioned themselves with a view of the hotel behind them, facing camera crews. These were TV reporters, and I could hear one rehearsing “I am reporting live from...” in Korean. Someone asked me to move a bit because they needed a place for the reporters to do their stand-ups. They would be going live soon, broadcasting directly from here, though “here” was a glass-covered skyscraper, one of the most luxurious hotels in Singapore.

About 20 motorcycles were parked in a line in front of the entrance, suggesting the presence of bodyguards for a VIP, who in this case would be Kim Jong-un, the supreme leader of the Democratic People’s Republic of Korea, a global political celebrity perhaps second only to Donald Trump, who was also in town for a summit, the first of its kind between the two leaders. The media had staked out promising spots such as the St. Regis, like paparazzi angling for a rare shot, on the chance that Kim, who was staying there, would show up and do something noteworthy.

More than 3,000 reporters had flown in from around the world to cover the U.S.–North Korea summit, so many journalists that at times we were literally stumbling over one another. Most of us were settled in the media center in the cavernous three-story F1 Pit Building, where we watched events unfold from TV screens that hung from the ceiling. We saw Kim Jong-un arrive, we saw Trump arrive. We saw the two men shake hands, we saw them sign a vaguely worded document that had much in common with similarly vague agreements between the two governments in 1994 and 2007, both of which unraveled. As if to make up for the lack of any real news, Trump gave an hourlong press conference, during which he released a Hollywood-trailer-style video featuring himself and Kim Jong-un looking heroic, with a slogan: “Two men, two leaders, one destiny.”

There had been much confusion in the lead-up to the summit, questions about its structure and goals, and doubt as to whether it would take place at all. Trump canceled the whole thing in late May, only to declare it back on eight days later. Despite the confusion, everyone seemed to agree that something big was afoot, an international something that might even result in a Nobel Peace Prize for the U.S. president; after the first inter-Korean meeting in 2000, South Korean President Kim Dae-jung took one home.

During a cabinet meeting in late April, Moon Jae-in, South Korea’s current president, half-casually remarked, “It’s President Trump who should receive the Nobel Prize. We only need to take peace.” This comment, which had everything to do with Moon’s own efforts to bring peace to the Korean Peninsula and very little to do with Donald Trump, was largely misinterpreted by the foreign media as an endorsement of the American president for the prize, and by extension, the belligerent brinksmanship that may or may not have brought North Korea’s Kim to the negotiating table.

Deep Divisions

The 1953 armistice stopped the fighting in the Korean War, but didn’t bring lasting peace. Unresolved conflicts in South Korea over how to reunite with the North continue to drive politics there, and around the world.

1. From left: U.S. paratroopers trained for a mission during the Korean War; a North Korean general and his retinue attended the 1953 truce conference in Panmunjom. The continued American presence in South Korea is a source of both security and friction. It offers a military guarantee of the nation’s safety—but exacerbates concerns about renewed conflict, potentially with nuclear weapons.

The leader of the liberal Democratic Party, Moon had only taken office in May 2017, after the events of the 45-day Candlelight Revolution the year before, during which millions of South Koreans had attended protests against the corruption of conservative President Park Geun-hye. In December 2016, Park was impeached, and she was removed from office three months later. She is currently serving a 33-year prison sentence for corruption, illegal use of state funds, and violating election laws; Moon became president after an emergency election.

Moon was born in a war refugee camp in 1953 to parents who had fled the northern province of Hamgyong. He was jailed twice as a student for participating in protests against the dictatorships of President Park Chung-hee (father of Park Geun-hye), who staged a military coup in 1961 and ruled South Korea until his assassination in 1979, and Chun Doo-hwan, who took over in 1980. After his release from prison, Moon founded a human rights law firm with Roh Moo-hyun, who went on to become president in 2003. In 2004, while serving as a presidential secretary, Moon accompanied his mother to reunite with her sister at North Korea’s Mount Geumgang resort, as a part of Roh’s Sunshine Policy of engagement with the North. (The Sunshine Policy was an economic and intergovernmental policy, unilaterally initiated by South Korea in 1998; it was formally abandoned and declared a failure by the South Korean Ministry of Unification in 2010, after conservative Lee Myung-bak came to power.) During his presidential campaign, Moon vowed that if elected, his first trip out of the country would not be to the United States, as was traditional, but to North Korea. (His first trip, in June 2017, ended up being to Washington, D.C.) He had already met with Kim Jong-un twice before Trump’s summit in Singapore, in April and May.

I grew bored waiting for Kim to appear that night, and finally, around 9 p.m., after watching some of the TV folks, I returned to the media center to see if there was any news I might have missed. I wasn’t hopeful. I was convinced that little progress toward peace on the Korean Peninsula would be made in Singapore by Trump and Kim. But that didn’t mean there would be no progress.

“We do not wish for North Korea’s collapse, we will not pursue reunification by absorption in any form, nor will we pursue artificial reunification,” Moon had said during a July 2017 speech.

Without stating it directly, Moon had declared a new policy with regard to North Korea, one that differed significantly from the Sunshine Policy, which had a stated purpose of reuniting families and the two countries. Moon had also renounced the direction of conservative South Korean governments, which had long refused to acknowledge North Korea as a sovereign nation—the only reunification they would accept was absorption. (Their favored model was that of reunified Germany, in which East was swallowed by West.) Conservatives had for years argued that the only means to achieve that outcome was by pressuring North Korea economically until it collapsed.

Yet those conservative governments were gone, and Moon and the liberals held sway. So, the real shift in policy advanced by Moon related less to regime change in the North and more to a peace process that would allow Kim Jong-un to stay in power. Moon’s vision of peace was really one of containment, ensuring that North Korea and Kim Jong-un could no longer threaten the South.

Peace without reunification is a very different thing than what had come before. Yet it was an idea that seemed to have found its moment in South Korea, which had long been split between those favoring reunion and those favoring confrontation. Virtually no one on the left in South Korea to whom I have spoken in recent years has expressed any real desire for reunification, though few people are willing to admit it publicly. Younger people generally dismiss reunification outright. Older South Koreans will only say that reunification should happen, and yet inevitably add “in due time” or “at some point in the future.”

So, lost in the chaos that Trump had created, and the vagaries of the summit and its outcome, was a new and frankly bold change in the kind of treatment North Korea could expect from one of the two nations with which it is still formally at war. In that light, my trip to Singapore was something of a farce—at one point so little was going on that reporters began interviewing other reporters so that they’d have something to report. It seemed to me that the real talks were not taking place between Kim Jong-un and Trump at all, but between Moon and the people of South Korea.

Such is how I found myself, a few weeks before Singapore, standing in line outside a café bar called Bunker 1 in the business district of central Seoul’s Chungjeong-ro neighborhood. Above the entrance hung a large poster for That Day, The Sea, a smash hit documentary about the 2014 sinking of the Sewol ferry, which killed 304 passengers, most of them high school students. The film was produced by 49-year-old Kim Ou-joon, a well-known publisher, podcast host, and media personality. Bunker 1 serves as headquarters for Ddanzi Ilbo, a digital media company that Kim founded in 1998.

With his ’80s glam-rocker’s teased hair, mustache, and goatee, Kim looked more like an artist than a media powerhouse. He is the creator and former host of one of the world’s most downloaded political podcasts, Naneun Ggomsudah, which roughly translates as “I Am a Petty-Minded Creep.” The title is a sarcastic reference to conservative President Lee Myung-bak, who served from 2008 to 2013. Lee, like his conservative successor Park Geun-hye, whom Moon replaced, was indicted on multiple corruption charges. In October, the 76-year-old former president was convicted and sentenced to 15 years in prison.

2. From left: South Koreans protested the terms of the armistice in 1953; in 1980, democracy activists, arrested in Kwangju, were branded as pro-North Korean; three years later, President Ronald Reagan signaled U.S. vigilance at the DMZ; in 1989, Lim Su-kyung, a student, was imprisoned for five years for traveling illegally to the North and meeting with Kim Il-sung, the founder of North Korea; his son, Kim Jong-il, took power after his father’s death in 1994.

I had come to Bunker 1 to watch the live broadcast of Kim’s newest podcast, Dasvuida. (Translation: “I Will Show You DAS,” a play on the iconic Star Wars villain Darth Vader, and DAS, an auto-parts company embroiled in a federal investigation, whose actual ownership is murky, but which allegedly is controlled by Lee Myung-bak.) By 6:30 p.m., the bar was packed with several hundred people. The crowd had gathered for Kim and his cohost, another extremely popular media figure, 45-year-old Choo Chin-woo, South Korea’s foremost investigative journalist. When the two finally appeared, everyone cheered loudly, pressed toward the bar’s stage, and beseeched the two journalists to pose with them for selfies.

To understand the influence that Kim Ou-joon has in South Korea, it’s important to put his podcast in context. Under Lee Myung-bak, the heads of the major broadcast and news organizations were replaced by close associates of the president and corporate bureaucrats with explicitly pro-government stances, essentially turning the mainstream press into a propaganda machine. In 2012, thousands of journalists from MBC, KBS, YTN, and other major media outlets went on strike in protest. Many would eventually resign or were transferred to lesser roles where they were unable to report. It was also around this time that the government took a hand in setting up new, pro-government cable TV stations called jonghap pyunsung. Naneun Ggomsudah offered a crucial alternative for the public, exposing the corruption of their political, religious, and economic leaders. In 2012, it was shuttered during a federal investigation into charges of defaming political figures.

Lost in the chaos that Trump and Kim had created, and the vagaries of the summit and its outcome, was a bold change in what peace with North Korea might look like.

Such a breathless reception for two middle-aged media figures might seem odd to an American audience. But Choo’s exposé of Lee Myung-bak’s financial corruption badly damaged his presidency, and his investigation of Park Geun-hye and her unofficial adviser Choi Soon-shil contributed to Park’s impeachment and the emergency election that brought Moon to power. In a way, these two journalists were nearly as responsible for Moon’s approach to peace as Moon himself.

The night’s program was supposed to focus on the upcoming National Assembly election, but North Korea kept coming up. Among Kim and Choo’s guests that night was Lee Jae-jung, a former unification minister during the Roh Moo-hyun administration. Lee had been a strong proponent of the Sunshine Policy and was now running for reelection as superintendent of education of Gyeonggi province. Strangely, many of his remarks were in praise of Kim Jong-un: “Kim Jong-un seems totally different from his father, whom I did meet”; “Look how he was this time. When our envoy went, he and his wife even walked our envoy out to the parking lot.”

A North Korean defector-turned-journalist named Kang Mi-jin also commended North Korea. Giant root vegetables burst from the ground near Mount Baekdu in the North, she said. When she first fled, she could not believe people ate such small vegetables as they did in the South. Back home—a country in which a late 1990s famine claimed as many as three million lives—people threw out such paltry specimens. The malgup mushroom, a South Korean delicacy, was so unimpressive that she recently asked her mother, who also defected, “Mom, didn’t we use these as kindling to make fire?” It was hard to imagine why she would have left such a land of plenty, but the audience seemed to accept her skewed logic; they broke into applause.

When the subject came up of who should receive credit for Singapore and whatever agreement emerged from it, Kim Ou-joon put it simply: “Who cares if Trump’s not fit to be the U.S. president? That’s the American people’s business. For us, without Trump, the summit wouldn’t be possible.”

Singapore wasn’t what Kim Ou-joon wanted to discuss when I returned to Bunker 1 a few days later. He was sitting at one of the outdoor tables, his cell phone in one hand and a cigarette in the other. He was far more interested in a different summit: the April 27 meeting between Moon Jae-in and Kim Jong-un, at the military demarcation line inside the demilitarized zone, or DMZ, that has separated the countries since 1953. This, he said, was “far more important than Singapore.” Kim Jong-un was the first North Korean leader to step over the line to the South, and the occasion aired on live TV. Kim was described as “mannerly and courteous” and a man of “self-assurance.” According to Kim Ou-joon, “In just a matter of seconds, the South Korean public perception went from Kim the madman to a well-brought-up man from a decent family.”

These were things, he said, that Koreans notice but Americans would not have picked up on. I agreed with him. There are nuances you catch only if you are from within a culture, but I am Korean, and I didn’t see what he was talking about. Kim Jong-un wasn’t from a “decent” family but a homicidal one, and no gentle mannerisms he put on for TV would change that fact. Besides, what difference did it make? Who cared if he seemed nice? Did that change his treatment of his own people, or the threat of his nuclear weapons program? He insisted it mattered.

“From that moment on, we no longer saw him as an enemy,” Kim Ou-joon explained. “There was a fundamental shift among South Koreans. It wasn’t just hope for peace. It went from distrust to trust. Because we saw it with our own eyes, it’s difficult for that perception to change—that was the key. In one moment, Kim Jong-un’s approval ratings jumped from 20 percent to nearly 80 percent.”

When I asked about Kim Jong-un’s 2013 execution of his uncle, Jang Song-thaek, and the murder of his brother, Kim Jong-nam, who in February 2017 was assassinated at Kuala Lumpur International Airport when two hired killers smeared a lethal nerve agent called VX on his face, he said that every country had its own rules for establishing political order by eliminating a rival. “It’s not right to judge it according to the American way.”

This was his response to all my questions about human rights: North Korea had its own ways, its own logic; the only ones bringing up human rights at this juncture were South Korea’s extreme conservatives, because it fit with their agenda to demonize North Korea. Besides, the United States had its own ethical and moral shortcomings with which to reckon. “Imagine if other countries brought up the white American policemen’s murder of black people, or of its gun violence, every time there were to be political negotiations?”

Every country had its rules for establishing political order by eliminating a rival, Kim Ou-joon said. “It’s not right to judge North Korea according to the American way.”

This, he said, was why U.S. foreign policy on North Korea always failed. “In America’s perception, it’s always Good versus Evil, them being right and others wrong.” He mentioned Trump’s cancellation of the summit. “Since it was Trump who canceled, Americans saw it as a ‘deal making.’ If North Korea had done it, then it would have been desperate strategy.” He drew on his cigarette when he said this, and seemed emotional. “America simply won’t acknowledge that other countries have agency.”

This opinion was seconded by Choo Chin-woo, whom I met at his office that same week. He had a lean profile and angular features, and exuded the cool, chic haughtiness of a K-pop star. “There really isn’t much we as South Koreans can do,” he said. “America has always had the upper hand. Without the will of the American president, this moment with North Korea simply could not happen. Even if we, North Korea and South Korea, want to talk with each other, it cannot happen without American approval. Thus the tragedy of a small nation.”

Choo said that the dynamics of modern Korean history were set by the division between North and South, which he called “the seed of corruption and injustice.” The elites in both Koreas, as well as the South Korean mainstream media, had long benefited from the split. “They’ve lived off it, blocking reconciliation between the Koreas. The conservatives, and the surrounding countries, too—China, Japan, Russia, and especially the United States, their Defense Department, their arms dealer, those who made money off it.” He said, however, that he was the wrong person to ask questions about what kind of leaders Trump or Kim Jong-un were in their own countries. His main interest was exposing corruption within South Korea.

I encountered this attitude among many South Korean liberals. It is a sort of South Korea First mindset, which initially reminded me of Trump and his MAGA message. But the left and right in South Korea are not the same as America’s liberals and conservatives. Here, the defining issue remains the divided Korea. Choo insisted that if Park Geun-hye’s conservative administration had still been in power, the Singapore summit would never have been possible. The same went for the level of support for Moon in South Korea: it was an essential factor in getting Trump and Kim to Singapore.

That support was mobilized into action after the Sewol ferry disaster. It was later revealed that the sinking, and the botched rescue efforts, were the result of corruption involving the ferry owners, the insurance companies, and the Coast Guard and Navy. In July, a South Korean court ruled that the government was liable. It was also not lost on South Koreans that many of the dead were teenagers from low-income families in Ansan, an industrial city in Gyeonggi province; some questioned if the government would have acted quicker had the kids come from elite Seoul schools. The realization that this was not just an unfortunate accident struck a deep emotional chord in South Koreans, comparable to what Americans felt after Hurricane Katrina. It exposed a pervasive rottenness under the surface of the country’s infrastructure and governance.

“Information often comes from data, which in South Korea has inevitably been manipulated by the conservative media,” said Lee Sang-ho, a reporter best known for the “Samsung X-File,” an investigation that exposed the giant company’s use of bribes to influence the 1997 presidential election. “With Sewol, though, all that changed.” The nation watched in horror as its children died, and saw that the government did nothing to rescue them even when it could have. The president went missing during the immediate hours after the disaster without offering any explanation, and the mainstream conservative media misreported a supposedly successful rescue mission, while the independent media broadcast live from the scene. Because people saw the truth with their own eyes, and because the information was incontrovertible, it gave power to those who provided it.

“People would not be fooled,” Lee said to me at the screening of his latest film, The Blacklist, a sequel to the earlier documentary he’d done about the Sewol disaster. Lee said that conservative leaders had always used the mainstream media to exploit the division between North and South, in order to accrue personal power and wealth. The specter of national security, disseminated through the press, would stifle calls for reform in unrelated sectors of the government. “No reform, from education to labor laws, was possible, because the reformers challenging the establishment would be reframed, maligned, and banished as communists,” he added.

Junguk Gyojikwon Nodong Johap, the Korean Teachers and Education Workers Union, was a good example, he said. Some of its members had been barred from teaching because they were considered anti-government and pro-North Korea. In 2013, the Park Geun-hye administration banned the union altogether, stripping it of government subsidies and the right to collectively bargain. “Unless the fundamental perception of North Korea as the enemy is eradicated, no reform is possible,” Lee said.

He pointed to the positive coverage North Korea received during February’s Winter Olympics in Pyeongchang. Kim Jong-un’s sister, Kim Yo-jong, attended as a special envoy, and she sat next to Moon in the presidential box at a North Korean concert in Seoul, which was broadcast live on TV. “That was an opening act for this friendly image-building,” Lee said. That event was followed by the inter-Korean summit, which aired, again, live on TV, further improving the image of Kim Jong-un. Singapore would only be another, relatively minor, step in this process, which would be followed by Moon’s eventual visit to Pyongyang, promised the day after he took office. (Moon traveled to North Korea on September 18; during the visit, Kim said he would reciprocate with his own visit to the South.) “Outsiders forget that the Moon administration is not a normal one,” Lee said, regarding the urgency with which Moon has pursued his agenda with the North. “It was born out of a revolution, which then has to fulfill people’s expectations, which run very high.”

Kim Yeon-chul, the head of the Korea Institute for National Unification, a government-funded think tank in Seoul, told me that when Moon was part of the Roh Moo-hyun government, the official, liberal reunification strategy was one that had existed since 1989, and was mapped out in three steps: first, the reconciliation and cooperation phase, during which South Korea would build a peaceful relationship with North Korea; next, the two Koreas would form a transitional commonwealth, with two governments under one entity, perhaps along the lines of China’s relationship to Hong Kong; and finally, a long-term process that would result in unification as a single state.

This was no longer Moon’s definition of reunification. His government now appeared to view the commonwealth not as a transition but an end state. And to achieve that, the United States was necessary. “Nuclear North Korea is a direct result of a long-lasting Cold War,” Kim Yeon-chul said. “America approaches the problem solely as one of weapons—denuclearization—but in South Korea, we approach this from the vantage point of a relationship. If the relationship changes for the better, then the weapon will no longer be useful.”

Though conservatives’ power in government was much diminished, I remained curious to understand what they thought of Moon’s shift in policy toward the North. So, in late May, I accompanied Ha Tae-kyung, a National Assembly member representing the city of Busan for the center-right Righteous Future Party, to a campaign event for the upcoming local elections on the rooftop of a shopping mall on the edge of Seoul. The candidate was 32-year-old Harvard graduate Lee Jun-seok, nicknamed “Park Geun-hye’s kid” because he was a protégé of the impeached president.

As I climbed the steps to the roof, I could hear an announcer saying the words “Harvard” and “Kim Jong-un.” The Harvard pitch was no surprise—American educational credentials go a long way in South Korea, even now. But the mention of Kim Jong-un was baffling. The speaker said that Lee Jun-seok was only a year younger than Kim Jong-un, implying that their proximity in age suggested shared leadership traits.

The young candidate was rosy-cheeked, with smiling eyes, like the young northern leader, though a bit less chubby. But the logic of positively comparing a conservative candidate to South Korea’s murderous archenemy eluded me. Apparently the conservatives had a few things in common with Moon after all. I watched Ha busily work the room, shaking hands with the few hundred constituents who had gathered there. With Moon’s Democratic Party in control of the government, it was unlikely the young candidate would win his election.

Ha Tae-kyung is a rare politician in the South who has switched his political allegiance from left to right. Born in 1968, the same year as both Kim Ou-joon and Lee Sang-ho, he belongs to the “486 Generation”—born in the 1960s, now in their forties, educated in the turbulent 1980s. As a physics major at Seoul National University, he was one of the leading democracy activists in Junguk Daehaksaeng Daepyoja Hyupeuhwae (commonly known by its acronym, JunDaeHyup), the National College Student Leader Association, perhaps the most famous youth resistance group, known for its anti-government protests, which resulted in the torture and death of some members, and for its inflammatory pro-North Korean activities.

One of its members, a South Korean college student named Lim Su-kyung, traveled to Pyongyang in 1989, in violation of a national security law prohibiting visits to the North. Like a South Korean Hanoi Jane, she publicly praised the Great Leader, Kim Il-sung, sparking a controversy in the South. (The North Korean government dubbed her the “Flower of Unification.”) When she returned home, she was sentenced to five years in prison. Im Jong-seok, one of JunDaeHyup’s leaders, served three and a half years for his role in arranging the trip. Ha was also arrested, in 1991, and charged with breaking a national security law. He served two years in prison.

3. From left: North Korea’s weapons program was a source of tension during Bill Clinton’s 1993 DMZ tour; a mother and son, separated by the war, were reunited in Seoul in 2001, under the Sunshine Policy; anti-nuclear-weapon demonstrators burned a North Korean flag in 2009; the North successfully conducted a medium-range ballistic missile test in 2014; the Sewol ferry disaster that year left South Koreans disillusioned with the country’s conservative government.

Many former JunDaeHyup members later took on leadership roles in the Democratic Party—Lim Su-kyung was the party’s representative to the National Assembly until 2015—and some now hold senior positions in the Moon administration. Im Jong-seok is chief presidential secretary to Moon and was seated next to him during the inter-Korean summit in April.

After the campaign event, I waited with Ha for his train to Busan, and he told me about his political evolution. Soon after being released from prison, he had traveled to the North Korean border to do aid work. It was there that he met North Korean defectors and learned about the brutal realities in the North. (Later that day, Ha asked to take a selfie with me. He said he was impressed by a book I had written about living undercover in North Korea for six months, which he called “a suicide mission.”)

South Korea’s conservatives like to accuse some far-left members of Moon’s administration of hewing to a belief system known as jusapa, shorthand for a Korean term that translates as “the faction of the Juche ideology.” Juche is the political philosophy of Kim Il-sung, the founder of North Korea, and means “self-reliance.” It is known for its rather hyperbolic assertions of ideological self-importance, but it also holds appeal for the far left in South Korea in its synthesis of Marxist-Leninist ideology and Korean culture. This is an attractive concept for a country that has historically been either colonized or suppressed by stronger nations such as Japan, China, and the United States.

For the student leaders of the 486 Generation, to resist the South Korean dictatorship meant to politically and intellectually oppose the United States, which propped the dictatorship up, and to believe in a fictionally benign image of the North—akin to the 1930s American socialists who made excuses for Stalin until they learned the truth about the Soviet Union. “The liberals in South Korea have more in common with North Korea, in its ethno-nationalism and its socialist ideals, than with the American democracy that arose in South Korea,” Ha said. The levels of distrust between left and right are rooted in decades-old, still-unresolved antipathies.

Most of all, conservatives do not trust Moon, and I spoke to many of them, some with more extreme views, during my time in Seoul. For them, Moon’s push for peaceful coexistence between the two Koreas as two sovereign nations is not only false, but untenable. “It’s all a show, a fancy peace show! Why would Kim Jong-un change? Why would he give up his power? Sing peace all you want, would peace come?” said Jeong Kyu-jae, a 61-year-old former columnist at the Korea Economic Daily who is now a broadcaster, podcaster, and pundit famous for his conservative views.

He pointed to earlier failed engagement efforts, most notably the opening in 2004 of the Kaesong industrial zone in North Korea. More than 100 South Korean companies invested in Kaesong during the Sunshine Policy, employing some 54,000 North Korean workers. South Korean liberals cite the project, which closed in 2016, as an example of a relatively successful effort to nudge North Korea toward a market economy. Jeong dismissed it as a “slave factory”—with workers who had no freedom to negotiate a labor contract, for either wages or working conditions—used to bolster the North Korean regime. “Kaesong was never a road to a market economy for the North, but a system for legitimizing slave labor.”

“It’s all a show, a fancy peace show!” Jeong Kyu-jae said of Moon’s détente plans with the North. “Why would Kim Jong-un change? Why would he give up his power?”

Shaking his head, he asked me, “What do you think the South Korean left hates the most?” The answer, he said, was chaebol, the large, family-run business conglomerates that have traditionally dominated the South Korean economy. He cited the “nut rage incident,” in which Korean Air heiress Cho Hyun-ah made international news in 2014 when she angrily delayed a flight for 20 minutes because she was not properly served nuts (on a plate, apparently). The left-wing media gleefully skewered the incident as gapjil (bossy bullying) gone amok. “But who do you think is the biggest chaebol on the Korean Peninsula?” He paused. “It’s Kim Jong-un! The third-generation heir and mass murderer all those on the left are embracing right now. Remember Hitler? All those young Germans were fanatic about him too.”

A few weeks later, as if to confirm this thought, South Korean Prime Minister Lee Nak-yeon stepped off the country’s Air Force One in Kenya and praised Kim Jong-un as “a leader who thinks of the livelihoods of people as being more important than other things.”

Nam Si-uk, an 80-year-old former columnist at the conservative Dong-A Ilbo newspaper (and author of Korean Conservatism Studies and Korean Progressives Studies), explained to me why conservatives view Moon’s positions as a threat. For example, the Korean commonwealth ideal, which until recently Moon supported, wouldn’t be like China’s relationship with Hong Kong. A better comparison was the partition of Vietnam after the fall of the French colonialists in 1954, only on a nuclear-armed peninsula. “If it were a true commonwealth, the South would have nothing to lose, since it has twice the population of the North,” Nam said. “But what conservatives fear is that the North will try to infiltrate and convert the South ideologically. Meanwhile, we would lose the alliances that could help us if a war were to break out. And such a war would be considered a civil war, much like Vietnam.”

Moon’s position, with two independent Koreas, could only be achieved if South Korea and the United States promised to secure the Kim Jong-un regime. “An impossibility,” said Nam. “How could anyone secure their regime? The more you pump them up economically, the more the people there will want freedom, and that freedom directly threatens the survival of the regime. Remember, Kim Jong-un’s regime isn’t an elected government but a monarchy dictatorship.” After a pause, Nam said definitively, “Moon’s vision is a fantasy.”

Most of all, Nam claimed that key members of Moon’s government still actually want total reunification, because they are fundamental ethno-nationalists and believe in one Korea. “It’s the young people, the students, who don’t want reunification, because they don’t want to give money to North Korea. But the core left does.” What conservatives fear, then, is the liberal vision of reunification, made on North Korea’s terms, which would bring South Korea closer to China and Russia, and ultimately break up South Korea’s alliance with the United States. In the end, the new battle between left and right was an old one: Soviet-style socialism versus American democracy. Never mind that the Soviet Union was no more, North Korea was isolated, and China, its sole partner, wasn’t exactly a communist paradise. Conservatives, trapped in their never-ending cycles of political recrimination, could only see Moon’s wish to make peace for South Korea as a feint to get rid of the United States.

“Alliance” is a word that Moon Chung-in, special policy adviser to President Moon, doesn’t like. I met him in his office in a government building around the corner from the Blue House, South Korea’s presidential residence. In May, he said, he had told a reporter from The Atlantic that alliances are “a very unnatural state” in international relations, and he added, “Ideally speaking, a country without alliances can be said to be more secure and stable.” Alliances, he explained, assume a common enemy, a common threat. If the common threat is gone, countries have two choices: either create a new threat, or “adjust your alliances.” Moon Chung-in did concede, however, that, “in the short to medium term, our alliances with the United States should be maintained.”

The interview outraged South Korean conservatives, who interpreted his remarks as the Moon administration declaring its intention to forsake its ties with the United States. An article in The Chosun Ilbo, the country’s leading conservative daily, blasted the comment with the headline, “Moon Chung-in, Again ... In the Long Run, Best Is to Get Rid of Korea-U.S. Alliance.” In 2017, when Moon Chung-in made a similar statement, Hong Jun-pyo, a former presidential candidate from the Liberty Korea Party, the country’s biggest right-wing party, who is known as “Hong Trump” for his aggressive style, said, “Such a vulgar comment is not only shocking but also gives us the creeps,” and he suggested that Moon Chung-in move to North Korea. Ahn Cheol-soo, another former presidential candidate, and a leading figure in the Righteous Future Party, demanded Moon Chung-in be fired.

Moon Chung-in had also run into trouble a few months earlier during a speech in Washington, D.C., when he said that it would be difficult to justify the presence of U.S. troops in South Korea if there was ever a peace treaty with the North, a position he repeated in an April article he wrote for Foreign Affairs. “Many conservatives criticized me for saying that,” Moon Chung-in told me. “But my main concern was that there will be a debate in the United States” about whether, and how, its forces might withdraw, and that South Korea’s interests needed to be respected. Henry Kissinger, the former U.S. secretary of state, agreed with him, he said. “He told me himself that a debate about U.S. troops will be unavoidable and inevitable. But you see the reaction in South Korea!” Moon Chung-in said, shaking his head. “If the removal of troops becomes an issue, that will stand in the way of any peacemaking process. The reason for that is because of internal political resistance, and that’s between conservatives and liberals.”

4. From left: The 2016 Candlelight Revolution led to the arrest of President Park Geun-hye, who is serving a 33-year prison sentence for corruption and other crimes; President Moon Jae-in, joined by Kim Jong-un’s sister, Kim Yo-jong, at the Pyeongchang Olympics in February, has avoided taking a strong position on human rights violations in the North; Trump’s summit with Kim in Singapore has yielded few results; Moon’s trip to Pyongyang in September may prove more significant.

The South Korean conservatives, he told me, still considered the only “authentic peace” to be unification by absorption, one in which North Korea surrendered or collapsed. “No such thing is possible,” Moon Chung-in said. “The difficult task in all this,” he added, “is how to persuade North Korea to make peace with South Korea while maintaining the alliance with America and keeping the U.S. troops in South Korea.” To achieve that, North Korea could not be demonized. President Moon’s approach to détente was a pragmatic one, he claimed, not the fruitless dream that conservatives condemned it as.

With that in mind, South Korea’s interest in the Singapore summit, then, had little to do with a comprehensive deal to resolve Moon Jae-in’s “difficult task.” Any agreement would effectively be meaningless, since neither Trump nor Kim could be trusted. What Moon Jae-in wanted was for Donald Trump to avoid damaging the progress he’d made at the DMZ, along with a little help building support in South Korea for peace with the North. That’s why Moon Jae-in said Trump could have his Nobel Peace Prize—as a pat on the back! And if Kim stopped violating his subjects’ human rights, great, because it made it easier for Moon to sell peace to his citizens. But it wasn’t that important. (There is still no evidence that North Korea has suspended its nuclear weapons program, and in August, Trump canceled Secretary of State Mike Pompeo’s planned visit to Pyongyang; by September the trip was back on.)

When I brought up human rights, Moon Chung-in cut me short. “It’s for North Korea to decide. It’s their destiny. We tend to have some illusion that we can shape the political lives of North Korea. Look at America, and how many times they intervened in the name of human rights and democracy. They always failed!” He paused, as if reconsidering his words. He said that some Americans might argue that a free and democratic South Korea is a successful example of intervention. He disagreed. “It was the people of South Korea who achieved democracy.”

In June, the DMZ summit and Singapore would be cited as significant factors in the landslide electoral victories for Moon’s Democratic Party, which took eleven of the twelve available seats in the National Assembly and 14 of the 17 major mayoral and gubernatorial posts. Lee Jun-seok, “Park Geun-hye’s kid,” lost badly. Liberals are now fully in control of South Korea. Moon’s approval rating reached 84 percent, an all-time high.

Before leaving for Singapore, I visited the Yongsan Garrison, an American military base in the middle of Seoul. At 617 acres, and at its peak home to 22,000 U.S. military and other personnel, it is alarmingly large for a foreign military base in a country’s capital. It was originally built in 1910 by the Japanese Imperial Army during the occupation of Korea; in 1945, control of it passed to the U.S. Eighth Army. For Seoul’s ten million citizens, it’s a place of mystery; South Koreans must be approved for entry.

Under a relocation agreement signed in 2003 by American and South Korean presidents George W. Bush and Roh Moo-hyun, the U.S. Army is finally moving out of Seoul, to Camp Humphreys, 40 miles south, in the city of Pyeongtaek. The relocation was delayed for many years due to protests over the U.S. military presence, and over the cost. (South Korea is responsible for 92 percent of the $10.7 billion budget for the move, on top of its annual payments of more than $800 million for the upkeep of U.S. troops in the country.)

On the day of my visit, many of the barracks were already empty. A young American college student who had grown up on the base showed me around. She was home for summer break and was sad that by the next time she returned, it would all be dismantled. “What are the chances of your childhood home being completely erased?” she said, while leading me through what looked like an American suburb.

One of the barracks had a sign that read YUJIN KATUSA SNACKBAR. What made me pause was the word KATUSA—Korean Augmentation to the United States Army, a branch of the South Korean forces attached to the U.S. military. Growing up in South Korea, I had heard it mentioned often, although it seemed to me always with a trace of shame. All South Korean young men must serve two years of mandatory military service. Those who are selected for KATUSA are usually from the top colleges, and are typically regarded with envy, since they get a chance to practice their English and be posted in Seoul. But there is also an uncomfortable aspect to it. South Koreans are keenly aware that a group of their best young men are serving the U.S. Army, a sad metaphor for modern Korean history.

The inclusion of KATUSA in the name meant that this was a Korean restaurant, and, sure enough, the menu was all Korean dishes, though some were fusion, made with the military supply of Spam and yellow Kraft cheese slices. The American student told me that the place is run by a halmoni (old woman) who had been there “forever.” Inside, I found a very small, wrinkled old woman clearing tables. When she sat down with a huge basket of soybean sprouts, I sat across from her and offered to help snap the tail off each sprout, as my mother does when making soup.

Her name was No Jung-nyu. She was born in 1938 and had run this place since the 1980s. Her two daughters, now in their forties, cooked in the kitchen, and her middle-aged son worked the counter. She told me how life was hard, especially after the war, but here at Yongsan Garrison, she had found a home. Americans were so friendly and good to her. “Very kind,” she said.

She was hoping to be relocated to Camp Humphreys. Because I came from America, she seemed to think I might have some pull. She was losing sleep at night worrying about her future and her family’s livelihood. They would all be jobless if they were not selected, she said, and they loved serving their American army customers. When I asked her how the selection for relocation was being made, she said she had no idea, but that it would be Americans who decided who would go to the new base. “It’s always Americans who control.”


Photo credits, left to right: Corbis/Getty; Bettmann/Getty; Bettmann/Getty (x3); Chosun TV; ARGAS/Gamma-Rapho/Getty; Diana Walker/Time Life Pictures/Getty; Paula Bronstein/Getty; Ryu Seung-il/Polaris; Chung Sung-Jun/Getty; South Korea Coast Guard/Yonhap/AP; Jean Chung/Getty; Lee Young-ho/Polaris; Carl Court/Getty; Anthony Wallace/AFP/Getty; Pyeongyang Press Corps/Pool/Getty.
The Fundamentalist Trap

Julie was 16 years old when Bill Gothard, the founder of the fundamentalist Christian organization the Institute in Basic Life Principles (IBLP), pulled her aside at an IBLP event in 1996 in Dallas, Texas, to compliment her “bright, shining countenance.” She was 18 when Gothard, then 64 years old, invited her to work at IBLP’s sprawling headquarters near Chicago. Julie spent her days in a greeting lobby, often alone. Every time she began to connect with her on-campus roommates, Gothard moved her to a new room. Despite being a conscientious rule follower, Julie was immediately considered “rebellious” by the other staff, one of the worst labels possible within IBLP’s world of unquestioning obedience. She was called into her supervisor’s office and yelled at almost every day, despite being on her best behavior.

Julie, who asked that her last name be withheld, believes that Gothard was isolating her socially so that she would only feel safe with him. It worked. Julie, now 38 years old, remembers Gothard being the one kind face in a miserable existence. He frequently asked her to stay in his office with him until well past the 9 p.m. curfew, a shocking breach of protocol in a culture where simply talking to a member of the opposite sex could lead to a public shaming, or, worse, being sent home. One night, as Julie was transcribing Gothard’s dictation, she noticed he had gone quiet. When she looked up, she saw that Gothard was staring at her intently, his erection exposed.

Julie had been sheltered all her life and didn’t know precisely what Gothard was proposing. But she felt unnerved enough to tell Gothard that it was getting late and that she needed to leave. It was only well into her marriage that Julie realized what Gothard had done. And it was only many years later that she saw the similarities between that moment and the stories of ten plaintiffs who had accused Gothard of sexual harassment and molestation, including rubbing their breasts and genitals while clothed and placing their hands on his groin. Some of the plaintiffs were minors at the time.

IBLP parted ways with Gothard in 2014, following an internal investigation that, while finding no evidence of criminal activity, claimed that Gothard had acted in an “inappropriate manner” with members of IBLP. The organization has refused to release its findings, but Gothard has been blacklisted from all IBLP events and locations. When Gothard spontaneously showed up at an IBLP event in Big Sandy, Texas, in April, the police were called to remove him.

The plaintiffs’ suit was dropped in February, and Gothard’s camp declared vindication of its longstanding claim that the lawsuit was a coordinated conspiracy by a group of disillusioned, bitter women. The plaintiffs stated that they were not withdrawing their allegations of sexual harassment and assault, but that the lawsuit had been dropped due to “unique complexities ... including the statutes of limitation.”

Still, IBLP hasn’t entirely renounced its former leader. The IBLP website celebrates Gothard’s seismic influence on the organization specifically and the fundamentalist movement in America more broadly, going back decades to when he packed arenas with thousands of Christian conservatives eager to hear how his seven basic principles could shield their families from the cultural upheaval of the late 1960s and early 70s. Even when IBLP announced Gothard’s removal from the organization, it was quick to add that his departure in no way tainted the proud legacy of Gothard’s works.

The question facing IBLP and other fundamentalist organizations is whether theology, personality, and an organization’s culture can exist in isolation.

Ben Ziesemer, a longtime IBLP employee, told me that “you have to separate the truths that Gothard taught, the Biblical principles, from a man’s personality.” But the question facing IBLP and other fundamentalist organizations is whether theology, personality, and an organization’s culture can exist in isolation. The scandals that have emerged from some of this country’s preeminent fundamentalist institutions—Paige Patterson’s Southwestern Baptist Theological Seminary, Bob Jones University, and Sovereign Grace Churches, just to name a few—suggest that they cannot. This is partly why the wave of Christian fundamentalism that washed over America in the 1970s and 1980s—a movement that intertwined moral purity and political activity—is in retreat.

IBLP is a case study in how a religious culture can implode when an authoritarian theology allows the most vulnerable to be targeted by predators. But the fall of Bill Gothard also reflects a larger shift in the way many evangelical Christians are engaging in American culture and politics—abandoning the call for moral rectitude in favor of a more purely partisan antipathy, which has found its greatest expression in the Christian right’s support for Donald Trump.

Gothard founded the Institute in Basic Life Principles (first called Campus Teams, then the Institute in Basic Youth Conflicts) in 1961. Beginning in 1965, he purveyed his teachings in a conference called the Basic Seminar, through which he created a loyal group of adherents who found safety in his rigid approach to a changing world. While Billy Graham was filling stadiums with the good news of salvation, Gothard was preaching a staunch countercultural ode to the lost virtues of a Christian society, which he said emphasized personal responsibility, the value of suffering, and submission to authority. Gothard attracted a following in major cities across America, usually visiting them twice a year. The Los Angeles Times estimated that, by the early 1980s, more than 250,000 Californians had attended the biannual Basic Seminar conference in L.A.

Gothard’s principles went beyond the common fundamentalist warnings against alcohol and tobacco, calling on adherents to remove televisions from the house and to avoid music with a “tribal” beat. Men had to be clean-shaven, and women had to meticulously conceal their sexual appeal—the most valuable, and dangerous, attribute they possessed. All these instructions were, Gothard claimed, pulled straight from Scripture. They were what God wanted.

Gothard’s fundamentalist message gained traction at a time when conservative Christianity feared it was losing its privileged position in American culture, and Gothard’s success soon spread into the political sphere, in ways still felt today. Mike Huckabee has long been a Gothard supporter. Secretary of Agriculture Sonny Perdue says in an online review of a Gothard book, “Nothing besides Scripture itself played a more important role in the daily guiding and leading of our family than Bill Gothard and the Institute in Basic Life Principles.” The Christian-owned company Hobby Lobby donated millions of dollars and multiple properties to IBLP. Gothard’s institute even entered pop culture consciousness in 2015 when the Duggar family of reality TV fame sent their son Joshua to an IBLP campus in Arkansas for “counseling,” after discovering he’d sexually touched four of his sisters as well as a babysitter.

As Gothard’s dreams for IBLP grew, so did its reach. It began with two main properties in Michigan and Illinois in the early 1980s; over the years the organization added an array of campuses and offices throughout the United States and eleven additional countries, including Romania, Australia, and Russia. To help followers apply the wisdom of the Basic Seminar to daily life, Gothard launched the Advanced Training Institute International (ATI), a national network of homeschooling families committed to educating their children and organizing their homes around Gothard’s principles. While IBLP has declined to release numbers on the program’s size, at its pinnacle the ATI conference in Knoxville, Tennessee, attracted more than 15,000 attendees every year.

The ATI curriculum is built on “wisdom booklets,” which claim to provide practical instruction in linguistics, history, science, law, and medicine. In reality, students are taught a twisted form of Jesus’s teachings that focuses on themes of guilt, shame, fear, and self-flagellation. In one ATI booklet, students are taught how to properly mourn by reflecting on 26 different categories of sin. For students raised under the ATI curriculum, the message—both stated and implicit—is this: If you are suffering, it is because you are outside of God’s will.

Students who leave or criticize IBLP often find themselves excommunicated from their own families. Others know not to mention the abuse they’ve experienced, since their families still deny or minimize their claims. Survivors who publicly share their stories on Recovering Grace, a website and support group for former IBLP students, have been harassed by Gothard’s small but aggressive band of loyal followers. Because of all this, nearly every interviewee for this story asked that only their first name be shared. Others would only talk off the record.

One of these survivors is Johanna, whose father physically abused her for years. When she went to the pastor of her church, which was full of fellow ATI families, for advice, she was told her dad’s abuse was the result of her not “being submissive enough.”

Like all ATI students, Johanna was taught the “umbrella of authority” principle, which states that all authority comes from God, who then doles out that authority to parents (but mostly fathers), then pastors, then police, then higher government officials. Each human’s duty is to stay under their particular “umbrella of authority.” The authority’s decisions can technically be appealed, but in Johanna’s case that meant groveling before her abusive dad, hoping he would eventually relent.

Gothard’s fundamentalist message gained traction at a time when conservative Christianity feared it was losing its privileged position in American culture.

When I asked Ziesemer, the longtime IBLP employee, about what a girl like Johanna should do, his skepticism of such claims of abuse was apparent. “If the situation is abusive enough, a child can appeal to a government agency to intervene in their family,” he said, but added, “That is assuming that this young lady is not someone just wanting her own way.”

The idea that victims of abuse were to blame was rife at Gothard’s organization. The IBLP pamphlet “Our Most Important Messages Grow Out of Our Greatest Weaknesses” asks this question: “What if a wife is the victim of her husband’s hostility?” And the answer: “There is no ‘victim’ if we understand we are to suffer for righteousness.” In ATI Wisdom Booklet 36, students are taught that if a woman doesn’t “cry out” while being raped, she is “equally guilty with the attacker.”

Johanna suffered silently through severe abuse, because in this system of beliefs there were no victims, because suffering is the way to righteousness, and because it was probably her fault anyway. Then her father punched her sister in the sternum so hard she couldn’t breathe, a moment of clarity for Johanna, who eventually left home. Another former ATI student described Gothard’s ATI program this way: “It turned every homeschool father into a cult leader, and their home into an island.”

Most ATI girls are expected to become wives and mothers, via a courtship process facilitated by the father. Finding an after-school job or preparing for college is either frowned upon or outright forbidden. However, as a generation of children raised in ATI entered their teenage years, another option emerged: working for IBLP.

For ATI families, Gothard was a celebrity, a teacher, and a spiritual guru who took on godlike proportions in the minds of children. Some girls, like Johanna, were personally recruited by Gothard to come work for him. She was one of several girls known around headquarters as “Gothard’s girls,” a rotating group of attractive, teenage girls who functioned as personal assistants. Johanna had only known a chronically abusive father; in comparison, Gothard was, Johanna said, “the nicest man I’d ever met.” At one point Johanna told him he was like a father figure to her. Gothard squeezed her hand, looked deep in her eyes, and said, “I can be that for you.”

Many of the ten women who filed suit against Gothard in 2015 were once part of Gothard’s personally curated collection of teenage girls. But the damage he caused was not restricted to sexual abuse, or to the girls who surrounded him; throughout IBLP’s organization women were subjected to multiple forms of emotional and psychological abuse, and physical neglect.

Several people interviewed for this article described how their health was permanently ruined by their time at IBLP. At the age of 24, Lauren found herself on the leadership team for EXCEL, an eight-week program for teenage girls that focused on intensive spiritual training. Lauren routinely worked 80 hours a week, and sometimes as many as 100, believing that she would be held accountable to God for those who didn’t hear Gothard’s message and went to hell as a result. As Lauren’s endocrine system slowly collapsed from exhaustion, she began having suicidal thoughts so intense that she wouldn’t drive alone, for fear of what she might do. When she was finally taken to a doctor, Lauren was told her body might never recover from the damage. She suspects this trauma is the cause of her infertility now.

IBLP’s Northwoods campus, a retreat center in Michigan, is the site of one of IBLP’s most notorious stories. Gothard’s younger brother Steve had been caught at IBLP headquarters having sex with multiple women on staff, an already egregious violation of Gothard’s moral code, made worse by Bill regularly sending young women to the isolated Northwoods campus to serve on Steve’s staff. Among the women sent to serve him, and who became his victims, was Bill’s out-of-favor personal assistant Ruth, who claimed that Steve psychologically abused her for months before she was finally coerced into having sex with him. After leaving IBLP, Ruth had chronic nightmares about her experiences, and at 37 she developed stage 4 breast cancer that ultimately killed her. Her husband, Larne, believes that the extreme traumatic stress she experienced in her nine years working for Gothard made her more susceptible to the disease at such a young age.

When asked for comment, Steve Gothard told me, “I just wish this would all go away. I don’t see what the point in bringing it back up is.” (Gothard later claimed he had never spoken to me.)

“I saw him pick out young women who were obviously vulnerable and hurting—but also very attractive.”

For years, the largest numbers of volunteers for IBLP ended up at the Indianapolis Training Center (ITC), a former hotel that housed several of Gothard’s projects, including a school for teenagers deemed “rebellious.” Parents temporarily suspended their rights as authority figures and gave all their authority to ITC. One of the rights ITC claimed was sentencing teenagers to solitary confinement—benignly labeled a “prayer room”—sometimes for days on end. One former ITC resident named Karis remembers her roommate being sent to the prayer room for breaking curfew. “The whole thing was traumatic,” Karis recalled. “My roommate was different when she got out. Before she had this bubbly, happy personality, and you weren’t allowed to be happy at ITC. Because happiness was a sign that you weren’t as spiritual as you were supposed to be. Joyful is different than happy.”

Students were always fearful one of their peers might report them to the head of ITC, Rodger Gergeni, whom three different interviewees for this article independently referred to as “evil.” In 2002, the local Indianapolis news affiliate WTHR reported allegations that students were handcuffed, sat on, beaten, and locked in prayer rooms for weeks at a time. An investigation by the Indiana Family and Social Services Administration, however, found the allegations “unsubstantiated.”

Gothard would often recruit girls from ITC to come work for him at IBLP headquarters. Multiple students recalled joking about Gothard’s “harem.” One student named Micah said, “I saw him pick out young women who were obviously vulnerable and hurting—but also very attractive. I heard him promise them they’d be right at the center of the next big thing he was planning. Those plans never came to pass, but I saw the girls come and go.”

When I talked to Bill Gothard on the phone in August, he was full of hope for the future of his ministry. He is in his eighties, but he believes his best days are ahead of him, telling me that God will let him live until he is 120. While Bill has lost access to the IBLP email list, which is full of potentially sympathetic ATI families, he is working hard to rebuild his following. Over the past few years, he has written 26 books, which are for sale on his website, and there are more on the way. He is waiting for word on a property in Mexico that, due to legal maneuverings, could be returned to him. If that happens, Bill could build a new base of operations.

It seems unlikely, though, that this will come to pass. As a generation of ATI students have grown into adults, they’ve fled the program en masse. Some have found a more liberating form of the Christian faith, while others have left religion altogether. If Gothard were given the keys to IBLP tomorrow, he would find a dwindling number of true believers left. IBLP, meanwhile, is frantically selling off its properties to make ends meet.

Other Christian fundamentalist organizations are in similar straits. Paige Patterson, long lauded for saving Christianity from the clutches of liberalism (and who once wrote positively on Gothard’s doctrine of authority), was fired this past May as president of the influential Southwestern Baptist Theological Seminary for a series of misogynistic comments and his dismissive, allegedly abusive handling of sexual assault claims. Patterson was a pillar of the old guard Southern Baptist movement, one of the Moral Majority’s most powerful cultural influencers. But this legacy is being rejected by a new generation of Southern Baptist pastors. New SBC President J.D. Greear is the youngest president in 40 years, and has publicly stated that the SBC’s history of political involvement was a damaging distraction from the church’s mission to preach about Jesus.

While Bob Jones University continues to be a fundamentalist hub at the intersection of conservative politics and Christian movements—both Ted Cruz and Ben Carson spoke there in 2015—attendance has dropped 26 percent since 2001, and a $4.5 million budget shortfall led BJU to lay off 50 employees in 2018. Meanwhile, students at Liberty University, headed by Donald Trump enthusiast Jerry Falwell Jr., have publicly expressed their disgust with Falwell’s role as a Trump surrogate.

If the culture warriors of yore expressed their opposition to liberal America through a loud, righteous embrace of Christian values, they have now thrown their weight behind the decidedly un-Christian Donald Trump.

While data on the spirituality of millennials suggests that younger generations still gravitate toward religious belief, including personal spiritual practices such as prayer, there is an increasing abhorrence of religious authoritarianism. For Christians like myself, who have rejected the Moral Majority’s approach to the culture wars as well as Gothard’s repugnant twisting of Jesus’s teachings, this is a hopeful trend.

But these changes do not mean fundamentalist Christianity has necessarily turned a new leaf. If the culture warriors of yore expressed their opposition to liberal America through a loud, righteous embrace of Christian values, they have now thrown their weight behind the decidedly un-Christian Donald Trump. Indeed, white evangelical support for Trump has increased during his presidency, despite his unashamed acceptance of a lifestyle abhorrent to a traditional Christian sexual ethic, and his sneering at traditional Christian virtues like forgiveness, humility, and a compassion for the “least of these.”

It goes well beyond Trump. The same religious leaders who railed against the collapse of decency during the Bill Clinton impeachment proceedings in the 1990s now declare their support for badly tainted political candidates like Roy Moore, the Alabama Senate candidate who was accused of sexually abusing minors.

Gothard’s religious teachings are increasingly ignored today, even within fundamentalist churches, thanks in part to the various abuses they engendered. But the public sphere is another matter. Rather than reject misogyny, abuse, or patriarchal authoritarianism, a sizable segment of modern Christianity appears ready to tolerate these traits in its political leaders, as long as it is all in service of fighting the “enemy,” which is usually a shorthand for “liberals.”

Gothard’s legacy is not his thousands of pages of bizarre dogma, but the insight he offers into the way the Christian right once responded to the threats posed by liberal America. He was celebrated by the culture warriors for promising stability in a changing world, all while he warped the message of Jesus to build an empire for himself and prey on the vulnerable. The victims of his abuse are still waiting for justice—and watching as many of their fellow Christians show, in their actions and their politics, that they really don’t care.

The Mueller Report Is More Important Than Ever

Politico published a downer on Friday. “President Donald Trump’s critics have spent the past 17 months anticipating what some expect will be among the most thrilling events of their lives: special counsel Robert Mueller’s final report on Russian 2016 election interference,” the article began. “They may be in for a disappointment.” That’s the conclusion reporter Darren Samuelsohn came to after speaking with defense lawyers involved in the case and “more than 15 former government officials with investigation experience.”

The logic behind Samuelsohn’s theory is that Mueller’s report may never be released to the public—and even if it were, it may not contain any bombshells because the special counsel’s “by-the-books, conservative style” may cause him to “lean more toward saying less than more.” In short: Don’t get your hopes up, Trump critics.

There are reasons to expect otherwise. If this were a normal criminal investigation, special counsel Robert Mueller would likely end it as quietly as he ran it. Prosecutors don’t typically comment on what they find during a criminal inquiry unless they use it to bring charges, and for good reason. Investigators often trudge through intimate details of people’s lives, uncover embarrassing secrets, or find evidence of wrongdoing that falls short of criminal activity. Generally speaking, it would be deeply unfair for prosecutors to reveal what they learn outside of a courtroom.

But the Russia investigation is more than that. Trump and his Republican allies in Congress have resisted efforts to learn the full extent and effect of Russian election meddling in 2016. Party leaders refused to create a Watergate-style committee or a 9/11-style commission to find the truth, while House Republicans have spent the last year trying to discredit and shut down the Justice Department’s inquiry into the matter. By the time a Democratic president or Congress is able to order a full-scale inquiry, it may be too late. Memories fade. Documents go missing. Evidence disappears.

As a result, the special counsel’s inquiry may be Americans’ best chance to understand an attack on their democracy. Mueller could wrap up his work sooner, later, or not at all. I wrote in April that if he is unable to complete the investigation because of political interference by the Trump administration, he has a duty to go public with his findings. That reasoning applies even if he wraps up the probe of his own accord. Mueller’s silence is ethically, legally, and politically smart while the investigation is ongoing. But to keep quiet after it ends would be incompatible with a democratic society.

Under Justice Department guidelines, Mueller will provide a report to Deputy Attorney General Rod Rosenstein when he’s done. It’s unclear whether that moment is approaching. Bloomberg reported last week that Mueller is close to reaching a conclusion on the two main threads in the Russia investigation: whether the Trump campaign colluded with the Kremlin in 2016, and whether Trump himself obstructed justice through his firing of FBI Director James Comey and other efforts to undercut the investigation into election meddling. CNN disputed that account, quoting an unnamed Justice Department official who expected the inquiry to continue “well after the midterms.”

Though Mueller hasn’t taken public steps ahead of the November midterms, there are signs that he hasn’t been idle. Investigators continued to question associates of Roger Stone, the longtime Republican political operative, about his interactions with WikiLeaks founder Julian Assange during the 2016 election. The special counsel’s office reportedly met with former Trump campaign chairman Paul Manafort nine times in the last four weeks. Michael Cohen, Trump’s former personal attorney, also sat for multiple interviews with Mueller’s team since Labor Day.

Will Americans ever learn what they told Mueller? If he sticks with the practices of past investigations, maybe not. As Samuelsohn reported, Mueller’s findings “may never see the light of day” if he sticks to standard operating procedure. This may come as something of a surprise to Americans who remember independent counsel Ken Starr’s lurid report on President Bill Clinton’s relationship with Monica Lewinsky, which Congress used as the basis for Clinton’s impeachment in 1998. But the law that gave Starr his independence is no longer in force, Samuelsohn noted. Because Mueller’s powers flow through the Justice Department’s internal rules, he has far less leeway to make reports available to Congress or the public.

In this case, silence would be untenable on multiple levels. First, the public needs to know the full extent of what Mueller has learned about Russian interference. He’s already done some good work on this front with the Internet Research Agency indictment in February and the Fancy Bear indictment in July. Despite the president’s habitual denials, those charges reaffirmed that the Russian government bore responsibility for what happened in 2016. And while it’s unlikely any of the defendants will ever see a U.S. courtroom, the charges provide a common set of facts for the public to understand the depth and breadth of social-media manipulation during the election.

The need for accountability goes even further. Mueller’s findings would allow Americans to know who did what during the 2016 election and how they should be held to account. For some individuals, the answer may be criminal charges. In other cases, it may be useful to know what mistakes or acts of wrongdoing were committed so that changes can be made to prevent them from happening again. The 9/11 Commission, for example, made dozens of recommendations when it released its final report. While it may be inappropriate for Mueller to make the recommendations himself, what he’s uncovered would be useful in helping others figure out what steps to take.

There’s also civic value to knowing what happened. With so many threads and figures in the Russia investigation, it’s easy to lose track of why this all matters. At its core are questions about the validity of American self-government itself. Did the president conspire with foreign powers to take over the country? Did he break the law by trying to shut down inquiries into that question? Americans need answers to these questions for an abstract but fundamental purpose: so they can maintain faith in the nation’s democratic system, or take appropriate electoral action if that faith is not warranted.

Official silence in this case would do more harm than good. Thanks to the Warren Commission’s secrecy and mistakes, more than 61 percent of Americans still don’t believe that Lee Harvey Oswald acted alone when he assassinated John F. Kennedy in 1963. Widespread suspicions that President Gerald Ford made a corrupt bargain to pardon Richard Nixon during the Watergate crisis contributed to Ford’s defeat in the 1976 election. Mueller’s conclusion may not be accepted by everyone, of course. But the Russia saga is already a psychic wound in American political life. Without some kind of resolution, it will only continue to fester for the next hundred years.

It’s true that a report by Mueller likely won’t provide definitive answers on every aspect of Russian interference, and may even fail to answer key questions about what happened. That task will likely fall to historians, as it often has in the past. Americans shouldn’t have to wait that long for answers, though. At least half of the country believes the president may have cooperated with a foreign government while it broke the law to win his office. If Mueller can prove collusion, the American people deserve to know. If Mueller can’t prove it, that would be important to know, too.

Beto O’Rourke Isn’t Running for Senate Anymore

Beto O’Rourke, the three-term congressman from El Paso trying to unseat Texas Senator Ted Cruz in next month’s midterms, may be the Democrats’ biggest rising star since Barack Obama. Over the last few months, he has gone viral defending NFL players’ right to protest during the national anthem and air-drumming to The Who’s “Baba O’Riley.” He has broken fundraising records, raking in $38 million for his campaign against Cruz in only three months. Just about every national publication has devoted page upon page to profiling his quixotic quest to turn Texas—Texas!—blue. He even skateboarded on stage without looking like a complete dork.

But with just over two weeks to go until the election, O’Rourke’s momentum seems to have fizzled—in Texas, at least. After appearing in a few polls to be within spitting distance earlier in the campaign, O’Rourke has consistently trailed Cruz by between seven and ten points in recent days. He is still outperforming Cruz’s last opponent, former state legislator Paul Sadler, who lost by 16 points in 2012, but not by as much as you would expect, given O’Rourke’s fame.

But in recent weeks, it’s become increasingly clear that O’Rourke has a back-up plan: the 2020 Democratic presidential nomination. From his discussions about “inspiration” and American greatness to his fierce rhetoric on climate change, O’Rourke is aiming at a national audience. While he’s certainly to the right of, say, Alexandria Ocasio-Cortez, he’s unabashedly liberal in a state where Democrats are typically conservative. O’Rourke—much like, it should be said, Cruz—is running two races simultaneously: one for senator, one for president.

O’Rourke’s national appeal was to an extent strategic and to an extent inevitable. It was strategic insofar as Texas is a very large state and therefore a very expensive one in which to run a statewide election. Democrats have lacked the stomach to invest heavily even in smaller states they dream of converting, like Georgia, because so much of the political infrastructure candidates need has to be built from scratch. Given Texas’ size and population, even if that infrastructure did exist, it would be expensive to run a competitive race in the state. But nationwide appeal means O’Rourke (like Cruz) has raised heavily from out-of-state donors (though it’s unclear just how much).

At the same time, O’Rourke can thank his opponent for much of the prominence he’s gained over the last several months. Democrats have been dreaming of winning back Texas for decades, and O’Rourke is their best chance in a while. Cruz is not only one of the most prominent Republicans in America, but one of the most reviled men in politics. Given his prominence, his complicated relationship with Trump, and the threat of a “blue wave” in the November elections, Cruz was always going to attract a certain level of attention. The national media loves an upset, and O’Rourke unseating Cruz would be an even bigger shock than Ocasio-Cortez’s surprise defeat of establishment Democrat Joe Crowley. The national media also loves hip, handsome politicians, and O’Rourke’s boyish charm and optimism are all the more appealing when the foil is Cruz, “the Skeletor of American politics,” as Jack Shafer argued earlier this month (even if Cruz is just a year older).

With O’Rourke now a distinct long shot, the media has moved on to wondering if he will be a presidential candidate. In August, Vanity Fair said that O’Rourke’s campaign was a lot like being in Iowa with Obama in 2007. This is an act of journalistic hedging—it doesn’t make sense to deploy resources to extensively cover a guy who’s going to lose by ten points, but it does if he can be construed as a potential 2020 frontrunner. But O’Rourke has, particularly down the stretch, run a campaign as if his next stops were Iowa and New Hampshire, rather than the Senate. At his recent debate against Cruz, O’Rourke pontificated about “walls, Muslim bans, the press as the enemies of the people, taking kids away from their parents” and America’s place in the world. These are things that matter in Texas more than in many other places, to be sure. But this is also what one expects to hear at state fairs and town halls in states with early primaries and caucuses.

O’Rourke has downplayed the 2020 talk, saying that his focus on national issues (and the president) is just him keeping it real—something that Democrats in Texas just don’t do. “Democrats in Texas have been losing statewide elections for Senate for 30 years,” he told Vanity Fair in August. “So you can keep doing the same things, talk to the same consultants, run the same polls, focus-group drive the message. Or you can run like you’ve got nothing to lose. That’s what my wife, Amy, and I decided at the outset. What do we have to lose? Let’s do this the right way, the way that feels good to us. We don’t have a pollster. Let’s talk about the things that are important to us, regardless of how they poll. Let’s not even know how they poll.” Of course, running on authenticity is exactly how a consultant would tell you to run against the ever-calculating Cruz.

If O’Rourke’s bid somehow succeeds, he would immediately become a frontrunner—maybe, with the possibility of a blue Texas, the frontrunner. (The polls say he’s already a top-ten contender.) But O’Rourke’s fate, two weeks out, seems close to sealed: Even $38 million only goes so far. It’s also unlikely that if O’Rourke gets walloped, he’ll be able to carry the same swagger into Iowa and New Hampshire. Missouri’s Jason Kander won over a lot of nationwide Democrats with his losing Senate bid in 2016. He’s been able to parlay that into a national profile, but isn’t exactly being whispered about as a presidential candidate.

But however this plays out, O’Rourke has already accomplished a lot: He has shown Democrats how to run brave, uncompromised campaigns in the Trump era. As The Ringer’s Justin Charity wrote earlier this week, O’Rourke’s supporters “have come to regard his campaign as a quixotic candidacy: doomed but nonetheless hopeful, if only because O’Rourke has run so shamelessly through Texas as a true-blue liberal.” This is what the Democratic base wants out of the 2020 primary—especially with well-heeled centrists like Michael Bloomberg and Howard Schultz kicking the tires. And O’Rourke isn’t just running the kind of campaign Democrats in Texas want to see. He’s running the kind of campaign Democrats around the country want to see.

Can Change Wait?

Ayanna Pressley’s surprise primary win over Michael Capuano—a power in Boston politics, with 20 years in Congress representing the Massachusetts 7th and the support of virtually the entire state Democratic establishment—defies easy explanation. Unlike other insurgent leftists who have roiled the political landscape this campaign season, she had trouble outflanking Capuano, a stalwart liberal who supported Medicare for All well before it was popular, from the left. But she did have some advantages. She’s young—44 years old to his 66. She is black and he is white, a decisive factor in a district that had become majority-minority in the decades Capuano has served in Congress. Most important, she was something new, and something new seems very much to be what voters want. Pressley packed her message of transformation into a three-word slogan that captured the urgency and promise of the first campaign season of the Trump era: “Change can’t wait.”

If Democratic leaders in Congress have given the impression that they are clueless riders on the tiger of an inflamed Democratic base—consider their tepid slogans “A Better Deal” and “For the People”—here was the tiger itself, muscular and immediate. The national press seized on Pressley’s slogan as a kind of mantra for a new political generation. CNN declared that it “connected the Democratic optimism of the early Barack Obama years to the urgency of Donald Trump’s presidency.” The network was right; Pressley was clearly paying homage to the last great Democratic slogan, Obama’s “Change we can believe in.” But even as she was harnessing Obama’s rhetorical prowess to target his poisonous successor in the White House, she was also making a critique of Obama and his legacy, and of the party he left behind.

The Obama presidency was indeed a source of inspiration for African Americans—“an eight-year showcase of a healthy and successful black family spanning three generations, with two dogs to boot,” as Ta-Nehisi Coates put it—but it was also a source of frustration. The pace of change was too slow, and Obama’s temperament too unflappable in the face of widespread social injustice. He was caught between suspicious white voters on the right and the enormous expectations of black voters (and white liberals) on the left, which limited his ability to address the issues of race head on. His famously artful response to the death of Trayvon Martin in 2012—“If I had a son, he’d look like Trayvon”—was a case in point: a constrained attempt at catharsis.

Ayanna Pressley unwraps a gift from a supporter with her slogan in needlepoint in Boston, Aug. 26, 2018. Kayana Szymczak/The New York Times/Redux

Those frustrations extended to Obama’s steady, deliberate approach to issues beyond race: stagnant wages, student debt, sexism in the workplace (and now, the White House), a broken health care system, a broken immigration system, and the yawning divide between the rich and everyone else. For many Democrats, these issues require more daring policies—Medicare for All, the abolition of ICE—than what Obama was willing to offer. They will also require a fundamentally different kind of politician. “Those closest to the pain should be closest to the power,” Pressley would say during her campaign—another striking line that, like her slogan, is sure to make certain Democrats wince, as it implicitly pits groups (minorities, working people) against others (whites, the wealthy).

But the real downside to Pressley’s slogan lies not in its class and identity politics, but in its unapologetic impatience. Democrats like Pressley want not just to send Donald Trump back to the world of reality television, but also to right the nation’s long history of wrongs with a dramatic flurry of legislative action. If, as many expect, Democrats take back the House in 2018, they will still be, in effect, a very rowdy minority, since any bill they pass will meet opposition from the Republicans in the Senate and the White House. The real challenge will come if Democrats are able to build off that success to win back control of the entire federal government in 2020. Then, there will be pressure from Democrats like Pressley to repudiate Obama’s measured style, as well as his repeated deference to the ghost of a bipartisan consensus that had all but vanished by the time he reached office.

Can this new kind of Democrat be realistic about the pace of change in a divided country, and, for that matter, in a divided party? Can the party move at a blistering pace without falling apart? That remains to be seen. Campaigns, of course, are not a time for realism but for expressing hopes and dreams, anger and frustration. In this respect, today’s insurgent Democrats are not all that different from those who, back in 2008, believed that change couldn’t come soon enough, who were themselves clawing their way out of the political wilderness, and who were encapsulated in another, similarly determined Obama slogan: Yes, we can.

The #MeToo Workplace Policy That No One Is Talking About

It’s been a year since the New York Times and New Yorker published their bombshell articles outlining how Harvey Weinstein had sexually harassed and assaulted women for three decades. Since then, the #MeToo movement has brought millions of women into the streets, toppled prominent men in Hollywood, media, politics, and Silicon Valley, and sparked a long overdue discussion about sexual abuse, especially in the workplace.

Meanwhile, the Democrats have moved left on issues from the minimum wage to college tuition, health care, and job guarantees. But perhaps surprisingly, there has been no such shift toward expanding the social safety net for victims of sexual harassment or domestic abuse. Some Democratic candidates have been discussing #MeToo on the campaign trail: In Texas, House candidate MJ Hegar revealed that she was a survivor of domestic violence in her viral campaign ad “Doors”;  Anna Eskamani, who is running for a seat in Florida’s state house, spoke about being sexually harassed in her workplace; and Rachel Crooks won a seat in the Ohio state legislature in May after she publicly accused Donald Trump of forcibly kissing her in an elevator in 2005. But discussions of what they would do, if elected, to change workplace and domestic abuse policies for the better are less common. 

One of these solutions is simple: paid sick and safe days for survivors of domestic abuse. The policy has been around since 2005, and in other countries, it has been growing increasingly popular. Over the summer, New Zealand passed a law that grants ten days paid leave to survivors of domestic violence—time to flee their abuser and find housing, medical care, and legal assistance. Some states have passed similar laws. Currently, 33 provide some type of unemployment assistance to victims of domestic or sexual abuse, and ten offer mandatory paid sick days. “It’s spreading like wildfire,” said Marium Durrani, a senior policy attorney at the National Network to End Domestic Violence. “A lot more are following suit.” 

There are prototypes of federal bills making their way through Congress now, but they’ve been mired in congressional gridlock. Last November, a month after the Weinstein allegations broke, Washington Senator Patty Murray reintroduced her Security and Financial Empowerment (SAFE) Act to Congress. If passed, it would allow survivors to take up to 30 days off work to access the support they need, as well as protect victims from being fired if they are harassed at work. Meanwhile, a more general bill, The Healthy Families Act, would guarantee up to seven days of paid sick leave for all working families. Representative Rosa DeLauro introduced it in the House in March of last year.

These bills aren’t just good policy. They’re good politics. With President Trump cutting already limited funds for survivors, they would help Democrats draw contrasts between themselves and the administration. And with the framework for paid sick days already in place in some states, the legislation could be easier to pass than other domestic violence bills—especially if the Democrats, as expected, take back the House in November.

One in four women in the United States faces some form of domestic abuse in her lifetime, whether it’s physical, emotional, psychological, or sexual. This comes with an economic cost. Abusers often delay or prevent their partners from getting to work. “This is a classic example of power and control,” said Qudsia Raja, the policy director at The National Domestic Violence Hotline. The abuser will “try to manipulate any form of support the survivor has, and obviously your job is a big form of support.”

In other cases, women are harassed when they arrive at work—and when they report the abuse, according to a recent survey by the Maine Department of Labor, 60 percent either quit or are fired. In the United States alone, victims of abuse lose eight million paid days of work each year, according to the Centers for Disease Control and Prevention. That’s roughly 32,000 full-time jobs that employers aren’t filling, creating an unpredictable workforce rife with absenteeism and employee turnover. Even when victims do turn up to work, they are demoralized and less motivated, which stunts their advancement.

But despite the magnitude of the problem, it’s hard to pass legislation that would help fix the issue. For one, domestic abuse has long been considered a private issue rather than a public one. It was only around 1994—when Congress finally passed the Violence Against Women Act, which marked the first comprehensive federal legislation aimed at protecting women from rape and battery—that domestic abuse was first considered a human rights issue, rather than something that happened in private, behind closed doors.

Still, domestic violence remains notoriously underreported. The Bureau of Justice Statistics has found that serious domestic violence is 31 percent more likely to go unreported to law enforcement because victims fear reprisal—either from their partner or, on occasion, even from the government, which, in jurisdictions that have nuisance ordinances, actually punishes women who call 911 too many times.

From a legislative perspective, it’s hard to prevent women from being penalized for reporting abuse: Each state has different laws, and employers are given wide latitude to decide when and if to fire someone (domestic violence is not a protected class of employment, under Title VII of the Civil Rights Act of 1964), so if an abuser does call or turn up at the workplace of their victim and cause a disturbance, the employee is left vulnerable. And even if Congress were to pass a law to prohibit discrimination based on an abuser’s behavior towards a victim, how would they enforce it? 

A policy that gives victims of domestic violence access to paid sick days, on the other hand, would be much easier to implement. Today, more than 40 percent of private sector workers in the United States have no access to paid sick days—a problem that persists because of aggressive business lobbying and the weakening of labor unions. Organizations like the National Federation of Independent Business and the Chamber of Commerce have consistently resisted paid leave policies, arguing that they’re expensive and that it should be left up to a company to decide how much is offered.

This means that women who are financially dependent are often trapped in abusive relationships because they can’t take the time off work to flee their abusers or pursue them in court. “If you miss your court hearing, then you don’t get your protective order,” said Durrani, who, before she joined the National Network to End Domestic Violence, was a lawyer representing victims of domestic violence in court. “Or you miss your job and your employer finds grounds to terminate you, and you probably don’t have resources to combat that. Beyond the physical and emotional implications of abuse, there are these long reaching ramifications.” Murray’s bill, and others like it, are a remedy for this. 

One of the main conservative concerns about legislation like Murray’s SAFE Act is the cost: When a similar bill was introduced in Maryland, the National Federation of Independent Business, a conservative group funded by the Koch Brothers, argued that the legislation—The Healthy Working Family Act—could decrease output by over $1.5 billion by 2027. “This would lead to reduced profitability, lost sales and production, and lost jobs,” wrote Senior Data Analyst Michael Chow in a February 2017 report for the NFIB. 

But that’s not the full story. The Centers for Disease Control and Prevention estimated in 1995 that violence against women costs American companies $727.8 million in lost productivity each year—more than enough to make up the shortfall that Murray’s SAFE Act would produce. By allowing women to pursue safety without sacrificing a paycheck or their job, paid sick and safe days keep women in the workforce, which helps employers tap into a talent pool that is currently underused. Moreover, the Center for American Progress has found that when states passed similar paid family and medical leave laws, small businesses actually improved their profitability: their workers were more productive, their recruitment more effective, and their employees stayed at the company longer, too.

“We view this as making sure that workplaces implement policies so everyone can equally thrive,” said Sarah Gonzalez Bocinski, a program manager at Futures Without Violence, a nonprofit that works to end domestic and sexual violence. “Victims should not have to choose between job security and their own safety. It’s really a terrible choice that individuals have to make, especially for those in the lowest income jobs.”

For Democrats in Congress, aligning themselves with such legislation is smart from a strategic standpoint, as well. For one, it would help them set themselves apart from the president, who has not only stood by accused abusers but also ordered officers to reject the asylum applications of people who claim to be victims of domestic or gang violence. The bill also has the dual benefit of protecting both workers and businesses, at least in the long run. As the Democrats seek a message that sticks, these are just the kinds of policies they need more of, setting themselves up as the party of workers once more.

#MeToo “has created an opportunity where employers are reflecting on their own policies, particularly around sexual harassment,” Bocinski said. Now, it’s time for Democrats to “talk more broadly about a range of gender based violence and how [it] can manifest [itself] in the workplace.”

Are Brazil’s Businesses Cheating to Help an Extremist Win the Presidency?

Until this week, Jair Bolsonaro, the retired army captain leading the race for the Brazilian presidency, seemed like he would cruise to an easy win over his opponent, Fernando Haddad of the Workers’ Party (PT). Bolsonaro’s free-market proposals, including mass privatizations of state enterprises, appeal to investors. His attacks on human rights and due process, on the other hand, have alarmed others. Recent polls showed Bolsonaro with a double-digit advantage over Haddad ahead of election day on October 28. 

Bolsonaro’s support—which has only grown despite homophobic remarks and open nostalgia for Brazil’s twenty-one-year military dictatorship—rests primarily on rabid anti-PT sentiment. Resentment of the country’s most popular party, marred by scandal in the past decade, has been diligently stoked through Facebook, Twitter, and, most prominently, WhatsApp, the free messaging service owned by Facebook. Indeed, WhatsApp, which is enormously popular in Brazil, has been the main driver of Bolsonaro’s histrionic campaign. Images that a sophisticated eye might clearly identify as having been doctored circulate freely among Bolsonaro supporters, reinforcing a visceral hatred of the PT and underscoring Bolsonaro’s image as a stern but honest alternative. One crude photoshop job shows the cover of a book written by Haddad, followed by pages wherein he appears to defend incest and call for murdering opponents.


Now, it seems that aspects of the social media campaign may in fact have been illegal. On October 18, the newspaper Folha de São Paulo reported that during the campaign, private companies have spent 12 million reais ($3.2 million U.S.) to disseminate hundreds of millions of anti-PT messages through WhatsApp. Amounting to an undisclosed donation to Bolsonaro’s campaign, such spending would have violated Brazilian electoral law. Bolsonaro so far hasn’t denied the allegations. Instead, he’s described the measures as entirely voluntary and unconnected to his campaign, although one witness described Bolsonaro asking explicitly for such support at a dinner with wealthy followers. Meanwhile, Luciano Hang, a vocal Bolsonaro supporter who owns a chain of department stores, said that paying to inflate posts across social media networks would have been redundant. As he told Folha: “I just did a [Facebook] live here and it’s already reached 1.3 million people. Why would I need to [pay to] spread it?” On Thursday afternoon, Bolsonaro tweeted that the PT simply did not grasp that people would take such steps to voluntarily support his campaign.

For Bolsonaro, the allegations could be crippling. His appeal rests on his supposed incorruptibility—his purported rejection of under-handed means to secure political power, which he alleges is common practice for the PT. The PT won the last four presidential elections only to see its most recent standard-bearer, Dilma Rousseff, impeached in 2016 on charges of financial mismanagement.

Yet a great many Bolsonaro supporters have embraced the radical right-wing congressman, who has served in the legislature without distinction for almost three decades, with a zeal bordering on the fanatical. They may accuse the media of conspiring against their candidate, perhaps with the intention of shoring up Fernando Haddad. 

Contextual evidence should make this narrative implausible: Haddad is no media darling. He has campaigned on diversifying Brazil’s notoriously oligarchical media landscape and his party is universally reviled in the editorials of mainstream publications. The notion that Folha would concoct this story only to favor Haddad may prove too far-fetched even for some Bolsonaro supporters. This does not necessarily mean that they will reconsider their support for Bolsonaro, but it may lead to an acknowledgment that their candidate is not a squeaky-clean paragon of morality, a concession to reality that would improve the current political debate in Brazil.

For his part, Fernando Haddad, who has spent weeks publicly wondering who was financing the successive waves of fake news against him, has pounced on the allegations. Along with the Democratic Labor Party (PDT) of Ciro Gomes, who finished third in the first round of voting, Haddad’s party has filed charges with the Superior Electoral Court to annul Bolsonaro’s candidacy.

It remains unclear how quickly Brazil’s highest court for electoral matters will move to address the explosive findings. Brazil is hardly the only country at present where the role of online networks in extreme right-wing candidates’ campaigns has come under scrutiny. But in a country already polarized between those who oppose granting any quarter to far-right extremists and those who abhor the PT, tensions will almost certainly climb even higher with this new report. The revelations offer a distant hope for anyone concerned about an extremist like Bolsonaro taking command of the largest Latin American nation, and the ninth largest economy in the world. With only ten days until the vote, it’s possible not much will change. But there is now a sliver of a chance that Bolsonaro’s march to the presidency might be arrested, undone by a giddy reliance on the electoral potential of social media.

Ryan Zinke Is in a Real Mess

After Scott Pruitt resigned from the Environmental Protection Agency this summer, many in the media—this reporter included—wondered whether Ryan Zinke might be the next to go. The Interior secretary was facing more than a dozen investigations over allegedly unethical and wasteful behavior at the agency.

Zinke’s fate, however, ultimately depended on the results of those investigations. So many were alarmed when The Hill reported on Tuesday that Zinke was getting rid of the person in charge of several of those inquiries—and would replace her with a Trump political appointee with no experience in government oversight.

The Hill’s information seemed pretty solid. It came from a “Fond Farewell” email sent by Ben Carson, the secretary of Housing and Urban Development, to agency staff. “It is with mixed emotions that I announce that Suzanne Israel Tufts, our Assistant Secretary for Administration, has decided to leave HUD to become the Acting Inspector General at the Department of Interior,” Carson wrote. Though not explicit, this implied that Mary Kendall, who has served as the Interior Department’s acting inspector general for nearly a decade, would be fired or demoted.

The Washington Post, Outside magazine, and others soon picked up the news.

This is a very big deal. Politicizing the oversight function is dangerous, especially in the absence of any Congressional oversight. Changing IGs in the midst of multiple serious investigations of the agency's head should raise alarm bells everywhere. https://t.co/G8YvcRfX91

— Michael R. Bromwich (@mrbromwich) October 16, 2018

RM @RepRaulGrijalva, @RepMcEachin, @RepDonBeyer & @RepHuffman are urging scrutiny of the new @Interior Inspector General. It's deeply concerning that as IG this Trump appointee could interfere with the multiple Zinke Investigations currently underway. https://t.co/D2jQcFmjz3 pic.twitter.com/czpVYCggjV

— Nat Resources Dems (@NRDems) October 18, 2018

On Thursday evening, though, the story changed drastically. Heather Swift, the head spokesperson at Interior, denied in an email to BuzzFeed’s Zahra Hirji that Zinke ever planned to replace Kendall with Tufts, and accused Carson of sending “an email that had false information in it.”

“This is a classic example of the media’s jumping to conclusions and reporting before facts are known,” Swift wrote, adding that Kendall is still in charge of the Inspector General’s office at Interior. Contradicting Carson’s email, Swift said Tufts was merely considered “as a potential candidate for a position” in the Inspector General’s office, not offered the top job.

I want to reiterate how bizarre and messy this is. The top @Interior spokesperson just accused @HUDgov (and really @SecretaryCarson) of sending out an email "that had false information" https://t.co/sZOnk56hn9

— Zahra Hirji (@Zhirji28) October 18, 2018

Swift’s attempt at a clarification has not quieted the outrage. “This administration can’t stop embarrassing itself or keep its story straight for five minutes,” Arizona Rep. Raúl Grijalva, the top Democrat on the House Natural Resources Committee, told Politico. “Nobody is buying this explanation and we’re not going to stop pressing for answers.”

What this latest scandal has done, however, is shine a much-needed light on the messy state of agency Inspector General offices—particularly the Interior Department’s. These offices, as the Project On Government Oversight (POGO) notes, “serve as crucial independent watchdogs within federal agencies, and are indispensable to making our government effective and accountable. These watchdogs investigate agency mismanagement, waste, fraud, and abuse, and provide recommendations to improve federal programs and the work of federal agencies.”

Like Supreme Court nominees, Inspectors General are supposed to be nominated by the president and confirmed by the Senate. This process ensures a critical, public vetting under oath—and it ensures that the watchdog can’t be fired by the head of the agency they’ve been tasked with investigating. Swift noted this in her clarification email to BuzzFeed, saying Zinke couldn’t have fired Kendall because “only the White House is able to reassign senate confirmed officials.”

But Kendall was not confirmed by the Senate. Though she has been leading Interior’s office since 2009, she has been serving only in an “acting” capacity. In other words, the Interior Department has not had a confirmed inspector general for almost a decade. If it had, there never would have been a question about whether Zinke fired her, because he wouldn’t have had that power. “This story wouldn’t have been a story had there been a permanent IG in that office,” said Elizabeth Hempowicz, POGO’s public policy director.

The Interior Department’s current inspector general vacancy is the longest-ever vacancy for that position. But it’s not the only one. Fourteen other agencies—including the EPA, the CIA, the Department of Energy, and the Department of Defense—don’t have Senate-confirmed inspectors general. That’s not a problem limited to the Trump administration, as many vacancies date back to the Obama administration. And as Hempowicz told me, “Congress hasn’t moved as quickly as they can on some nominations.”

Whatever the truth of Zinke’s latest kerfuffle, Hempowicz sees one silver lining. “I was surprised at the enormity of the response to this story,” she said. It may have been just the brazenness of putting a political appointee in charge of independent oversight. But it also could mean that “people have a real appetite to know that government is working the way it’s supposed to work,” she said.

Just hours after we spoke, with very curious timing, news broke that the Interior Department’s inspector general had concluded that Zinke broke department rules by spending thousands of taxpayer dollars on travel with his wife. The “new report says Zinke sought to designate his wife an agency volunteer in order to obtain free travel for her, that he often brought her in federal vehicles in violation of agency policy and that he neglected to get permission from ethics officials when he took campaign donors on a boat trip,” according to Politico.

It seems that Kendall is still in charge of oversight at Interior, and that Zinke still might be the next cabinet member to lose his job over a travel scandal.

The Deceptive, Shameful, Lucratively Funded War Against Rent Control

On August 24, the tenants of two buildings near the Hollywood Forever Cemetery in Los Angeles received letters from their landlord notifying them of a rent increase of over $800 a month. The increase was not a result of repairs or tax increases but rather, the letter said, of the upcoming election in November.

The section of the ballot in question is Proposition 10, which, if it passes, would repeal a 1995 state law prohibiting local governments from enacting rent control on apartments and homes built after that year (or even earlier in cities like Los Angeles and San Francisco). According to the letter: “Although you don’t want higher rent and we did not plan on charging you higher rent, we may lose our ability to raise rents in the future. ... Therefore, in preparation for the passage of this ballot initiative we must pass along a rent increase today.” If the ballot initiative failed, however, the landlord, Rampart Property Management, promised to “revisit the rent increase with a desire to cancel it.”

For Maria (who prefers not to use her real name out of fear of retaliation by her landlord), waiting to find out was too big a risk. An immigrant from Guatemala, she pays $600 a month to share a bedroom with her 11-year-old son in a two-bedroom apartment—another family lives in the second bedroom, with a fifth tenant taking the living room. A couple hundred dollars extra in rent would not be feasible for Maria, who lives off welfare after losing her job last year. A month later, Maria received a second letter billed as an “olive branch”—a reduction of the increase to $238 a month. When Maria and several other tenants went to the property manager to ask for an explanation, he told them that so many tenants had threatened to leave that the landlord had no choice but to lower the proposed rent hike.

As the vote on rent control approaches, tenants across California have been harassed, served with eviction notices, and forced to pay more rent. In Concord, an entire building of 29 families was given 60-day eviction notices, with landlords explicitly citing Proposition 10 as the cause. In Modesto, tenants of a single-family building were not only notified of a rent increase, but also encouraged to vote against Prop 10, which the landlord said would “eliminat[e] the current availability of single family homes to rent.” Shanti Singh of the California renters’ rights organization Tenants Together says these are not isolated incidents: “This is punishing renters for participating in the democratic process. And we’re expecting to see a lot more of this in the coming month.”

These efforts are part of a massive attack corporate landlords have been waging on rent control across the state. And though they claim to be speaking for the mom-and-pop landlords of California, the leaders of this campaign are some of the largest property owners in the country. Blackstone, the world’s largest real estate management firm, has spent nearly $7 million to defeat Prop 10. Other top donors include Equity Residential, the third-largest apartment owner in the country, and AvalonBay Communities, the twelfth-largest property owner. These mostly Wall Street–based moguls have pooled as much as $60 million (with as much as $2 million raised in the last week alone) primarily to fund an enormous advertising blitz, eclipsing the $22 million raised by the coalition of over 150 housing advocacy, community, political, and faith-based organizations that, along with the California Democratic Party, has rallied around the ballot initiative.

If Proposition 10 passes, it would be not only the most significant attempt to roll back state limitations on rent control, but also the greatest success to date of the burgeoning national tenants’ rights movement—and real estate groups are responding with full force. Rent control, which is illegal in 27 states, has become a campaign issue across the country, and the landlord lobby has been rushing to squelch tenants’ rights campaigns wherever they spring up.

In Boston, a bill far more modest than Prop 10—it was intended merely to track evictions and give the city a way to notify evicted tenants of their rights—was killed in the state legislature this past May after landlord groups put pressure on lawmakers. In Oregon, the landlord lobby has already launched a multimillion dollar super PAC, More Housing Now!, to oppose an anticipated Prop 10–like bill in the 2019 legislative session. In New York City in April, a proposal to freeze rents for nearly one million rent-stabilized apartments was defeated after the city’s main trade group for residential landlords, the Rent Stabilization Association, reportedly spent over $1 million on lobbying in 2017.

To owners, landlord groups seek to portray renters as poor, unpredictable, and conniving. A mailer sent out to condo owners across Boston by the Small Property Owners Association in 2017 warned owners that if tenant protection legislation were to pass, “[d]isruptive renters will learn they can do anything with no consequences, no fines, no evictions.” It added, “Unevictable renters can pass their units & low rents on to their heirs. It never ends.”

To renters, real estate groups characterize rent control as anti-renter. In one recent ad, the executive director of the deceptively named California Council for Affordable Housing tells voters that Prop 10 will “drive up rents, take rental housing off the market, and make it harder to find a place to live.” Amy Schur of the Alliance of Californians for Community Empowerment, one of the organizations leading the “Yes” campaign on Prop 10, explains, “They are using that message because they read the same polls we read, which show that a majority of likely voters in California support rent control and want fast action to prevent rent gouging.” A 2017 poll by the Institute of Governmental Studies at the University of California, Berkeley, found that 60 percent of likely voters in the state support rent control. In Oregon, a research firm this past May found that nearly two-thirds of those surveyed support expanding rent regulations.

The anti–rent control rhetoric rests on the argument that rent control discourages developers from constructing new buildings, further aggravating housing shortages. But advocates say that landlord groups are blowing this threat out of proportion. Of all of the cities that have floated rent control legislation in the past few years, none have proposed extending rent control to include new construction. Even officials in Berkeley—who have been some of the strongest proponents of rent control—have proposed transitioning apartments into rent control on a rolling basis, exempting newly constructed buildings for 20 years. And a report published out of the University of Southern California last week shows that cities with rent stabilization ordinances for existing units have seen no decline in new construction.

Studies have indeed shown that rent control can affect existing housing stock—but for reasons real estate groups avoid spelling out. By taking advantage of loopholes for averting rent control requirements, landlords end up pulling more properties from the rental market, converting rent-regulated apartments into condos and reducing the overall supply of affordable housing. While rent control advocates acknowledge these risks, they maintain that rent control is necessary as a stopgap measure for tenants facing eviction in an extremely hostile rental market. Even a widely cited recent paper highlighting the potential negative effects of rent control found that, among tenants in San Francisco, beneficiaries of rent control are between 10 and 20 percent more likely to have remained in the same apartment since 1994 and that “absent rent control essentially all of those incentivized to stay in their apartments would have otherwise moved out of San Francisco.”

Despite these benefits, the real estate lobby’s scare tactics appear to be working. A poll this week shows that 46 percent of likely voters oppose Proposition 10 and only 35 percent are in favor. “We are finding voters in our community who are crystal clear that they support rent control, and then say, ‘So we should vote no on Prop 10, right?’” says Schur. In Mountain View this summer, San Jose Inside reported that nearly 300 voters had been misled by paid signature-gatherers (some of whom said they were paid $40 per signature) into thinking that a rent control bill was pro-rent control, when in fact it aimed to repeal a rent control ordinance.

In some ways this is an old story. “The mobilization of networks of local politicians and homeowners, the explicit use of race and class stereotypes, the references to renters as second-class citizens,” says Tony Roshan Samara of Urban Habitat, a grassroots advocacy organization for low-income communities of color in the Bay Area. “All of this goes back to the ’30s, ’40s, ’50s. We’re seeing the same politics of who gets to control land and who doesn’t.”

But the scale of today’s opposition campaign is a distinctly post–financial crisis phenomenon, dictated by a race-to-the-bottom rental market. Since 2013, private equity firms like Blackstone have been purchasing tens of thousands of homes, converting them into rental properties, and bundling and securitizing them to create triple A–rated “single-family rental bonds.” Unlike “mom-and-pop” landlords, who tend to rely on a single fixed-rate loan from a bank, this new model relies on big investments from Wall Street investors, who expect firms to extract ever-higher returns from their tenants.

“The financialization of the rental housing market has had profound ramifications,” explains Schur. “This is rip and run—the Blackstones of the world are not investing long-term in our communities, they are extracting wealth from California to give to investors in the global financial market.” The impact of legislation like Proposition 10 on a local landlord is nominal compared to the impact on a group like Blackstone, which has a portfolio of around 13,000 single-family rentals in California and a 40 percent stake in Invitation Homes, a property management group with another 13,000 homes in the state.

The viability of this profit structure relies on a great degree of political intervention, not just in ballot initiatives, but also in races for elected offices across the country. In Oregon, filings from the secretary of state’s office show that the More Housing Now! PAC and its member organizations have contributed thousands of dollars to help County Commissioner Loretta Smith defeat vocally pro-tenant candidate Jo Ann Hardesty for one of Portland’s open city council seats. The PAC of the California Apartment Association, one of the groups behind the No on Prop 10 campaign, was among the top donors to the campaigns of incumbent candidates in four different city council districts in Sacramento, all four of whom won reelection. And there’s likely far more money flowing behind the scenes: In 2015, the Mountain View Voice reported that the California Apartment Association had quietly funneled $90,000 to three city council candidates opposed to rent control through a PAC called Neighborhood Empowerment Coalition.

For tenant advocates working to advance rent control across the country, these tactics haven’t come as a big surprise. “Everyone expected to be out-funded by the real estate industry. It’s just standard practice, especially during a housing crisis, when rents are really high,” says Singh of Tenants Together. This election may be the first where the landlord lobby’s influence has emerged into full view, but as campaigns at all levels of government continue to embrace affordable housing as one of the most pressing domestic policy questions, it won’t be the last.

Welcome to Camping, the Most Misunderstood Show on TV

If this is what it takes to make Julia Davis a very rich woman, then so be it. The extraordinary writer and actress got her start on British TV in the 1990s, when she sent a character reel to Steve Coogan, who hired her as a writer. The first show she wrote was the BAFTA-winning Nighty Night, a pitch-black comedy about a narcissist (played by Davis) who uses her husband’s cancer to manipulate people around her. In 2016, Davis wrote a six-episode series for Sky Atlantic called Camping, about a starchy monster of a mom who organizes a camping trip for her family and two other couples. That show has now been snapped up by HBO and remade to gleaming American standards by Girls creators Lena Dunham and Jenni Konner.

It’s premiered to some fairly harsh reviews, all of which entirely miss the point. Camping, in its original incarnation, was a vicious satire about a middle-aged harridan who micromanages a trip at a campground run by a deranged mama’s boy. Her browbeaten husband joins her, as well as a son who is forced to wear a special helmet in case he gets injured. Also present is the mother’s pathetic sister, who brings along her boyfriend, a reformed alcoholic widower (his teenager comes with them). The third couple is a recently separated young man and his brand-new girlfriend, a nymphomaniac (played to perfection by Davis herself) who offers everybody speed or ketamine, sometimes mixing them up.

The jokes are blindingly cruel. The alcoholic starts drinking again, provoked by the nympho, and argues with another guy about his dead wife. “She drowned to death!” his friend cries. “Yeah,” he responds. “It’s like, watch where you’re going you stupid cow.” The overprotective mother takes her kid to the hospital after he gets a bump to the head. “I wonder if you wouldn’t mind having a quick look at his anus,” she asks the doctor. “I’m worried that he might have more than one.”

The miniature society of the camping trip totally breaks down. Repulsive sex ensues, along with diabolical mishaps, nudity, violence, and drug benders. The show isn’t so much dark as completely, disorientingly devoid of light.

Julia Davis in the original Camping. Sky Atlantic

Dunham and Konner have now remade the show for an American audience that is notoriously thinner-skinned than its British counterpart. It’s interesting to see what they’ve kept and what they’ve discarded. All the basic characters are there, played by a very starry cast: The micromanaging mom is Kathryn (Jennifer Garner). Her downtrodden husband is Walt (David Tennant, playing American). Miguel (Arturo del Puerto) is the newly separated horndog, with Jandice (Juliette Lewis) his boundary-trampling girlfriend. The innocent little sister is Carleen (Ione Skye), and her addict boyfriend is Joe (Chris Sullivan). They’ve added another couple to the mix, in the form of Nina-Joy and George (Janicza Bravo and Brett Gelman), to be foils to the other couples’ bad behavior.

The central change is that Dunham and Konner have imbued each character with a sympathetic twist. Kathryn still micromanages—forcing everybody to go bird-watching, for example—but now she’s a woman stricken by grief over her hysterectomy, her emotional dysfunction sublimated into worries over her body (echoing Dunham’s own medical woes). The brainless and nasty alcoholic of the British Camping has become a sweet, struggling guy who is just trying to find his way. Miguel the shagger is now a smart doctor finding a new light in his life, instead of a pathetic young Englishman who got hair plugs and speaks with a slight American accent.

Crucially, the girlfriend who shows up out of the blue to destabilize the group is now a toxic Reiki healer, rather than Davis’s dubstep DJ. In the original, Fay (Davis) is an irredeemable nightmare. In the new show, Juliette Lewis as Jandice gives us a character who is equally maddening but also charming. In this show, her oversteps sometimes really do liberate the uptight campers around her. She convinces Carleen to cut her hair, for example, and Carleen loves it. Lewis is by far the best actor in the cast, and she certainly has the best role to play with. She gets to be outlandish, horrible, and gorgeous, then flounce off into the fields aglow with hippie self-righteousness.

All these changes are more than acceptable. There is nothing on American television with the vinegar of the original Camping, and to my great sadness there probably will never be. Dunham and Konner’s light touch has allowed them to keep huge chunks of the original script. In fact, many of the jokes are exactly the same. “Velcome to ze camps!” both dads joke, to their wives’ chagrin. We are still allowed to hate everybody, just in lesser doses.

Sadly, the writers cut the best joke of the original script. The mother in the U.K. show won’t let her son eat sun-dried tomatoes, or mozzarella, or wraps, because she fears they will make him “a homosexual.” “They’ve found a link,” she spits, referring to some kind of imaginary science. Making the tyrannical mom into a homophobe is the perfect detail, an evil cherry on top of a very evil cake. The cut is symptomatic of the new show’s agenda: We can’t have Kathryn hate gay people if she is to be ultimately redeemable. We’ve got to keep the hope alive, or nobody will keep watching.

Almost every major review in the U.S. so far has bemoaned the painfulness of the Camping experience. In The New York Times, James Poniewozik lamented that “Camping could be a cutting social satire. ... But it’s hard to see past the harsh filter.” At Variety, Caroline Framke wrote that “the series wastes its potential, showing so little insight or movement that watching Camping becomes nearly as unpleasant as it is for the characters living through it.”

But great satire is meant to be unpleasant. It’s supposed to make your soul feel the way your mouth does when it fills with bile. When you have that feeling and then you laugh, it has the taste of truth. The new Camping has had some of the old force taken out of its swing, but now and then it delivers a real, live uppercut. A show like this should be a tussle: between viewer and character, between love and hate, between enjoyment and pain. Camping could be nastier still, but at least the fight is there.

Republicans Are Running Away From the Economy

The U.S. stock market’s weeklong decline reversed itself dramatically on Tuesday, producing one of the strongest climbs of the year. A gain driven by strong earnings reports from bulwarks like Goldman Sachs and Johnson & Johnson pushed the market, which had lost 1,600 points in the previous eight trading days, up nearly 600 points. While the Dow Jones still sits about 1,100 points shy of its early October high, the gains have calmed fears that the nearly decade-long bull market was coming to an end.

But if Wall Street was breathing easy on Tuesday, the media didn’t notice. Instead, cable news focused on the disappearance of Saudi journalist Jamal Khashoggi, the upcoming midterm elections, and President Donald Trump’s Twitter feed, where he called his alleged former mistress “horseface.” On Wednesday, in an attempt to work the refs, Trump called out the media for ignoring the market’s gains, implying that he should be given credit for them.

“Network News gave Zero coverage to the Big Day the Stock Market had yesterday.” @foxandfriends

— Donald J. Trump (@realDonaldTrump) October 17, 2018

But the media isn’t alone in ignoring good news about the market or the economy more broadly. After tethering himself to the stock market for most of his first year in office, Trump has distanced himself from its performance since February’s nosedive. And after spending the first half of the year planning to campaign on the $1.5 trillion tax cut passed in late December, since midsummer congressional Republicans have all but ignored their top legislative achievement of the Trump era. It’s now clear that, far from being a boon, the tax cut is a liability for Republicans, with Democrats using it as proof of the party’s upper-crust loyalties. Handed the strongest economy since the mid-’90s, the GOP instead has decided to campaign much like its leader did in 2016: on a platform of fear.

Back in February, Republicans planned a midterm strategy centered on the $1.5 trillion tax cut (which they promised, to much skepticism, would reduce the deficit) and on the strong economy (which they had inherited from President Barack Obama). “The tax bill is part of a bigger theme that we’re going to call The Great American comeback,” National Republican Congressional Committee chairman Steve Stivers told Bloomberg. “If we stay focused on selling the tax reform package, I think we’re going to hold the House and things are going to be OK for us.” In the weeks and months after the Tax Cuts and Jobs Act was narrowly enacted, Trump and other Republicans relentlessly flaunted the law as evidence of the party’s fiscal bona fides.

The message was supposed to be simple. “Congress has reached an agreement on tax legislation that will deliver more jobs, higher wages and massive tax relief for American families and for American companies,” Trump promised after congressional Republicans finalized the bill in December. Concerns about the potential adverse effects of passing a $1.5 trillion tax cut during a rosy economic period—namely that it would balloon the deficit—were dismissed, despite the presence of numerous nonpartisan studies arguing that the national debt would increase by as much as $2 trillion. White House Budget Director Mick Mulvaney told CNN he thought the bill “actually generates money,” while Treasury Secretary Steven Mnuchin argued “the plan will pay for itself through growth.” Senate Majority Leader Mitch McConnell told reporters, “We fully anticipate this tax proposal in the end to be revenue neutral for the government, if not a revenue generator.”

While the tax cut appeared to add rocket fuel to a booming stock market, Republicans were never able to connect it to perceptions about the overall health of the economy. Most voters, a recent Gallup poll shows, do not discern any change in their economic well-being tied to the tax cut, while an internal GOP poll reported by Bloomberg found that voters, by a two-to-one margin, believed the cuts favored corporations and the wealthy. Democrats were more effective in their messaging, reframing the law as what it (mostly) was: an unnecessary giveaway to corporations and the rich. “In terms of the bonus that corporate America received versus the crumbs they are giving to workers, to kind of put the schmooze is so pathetic, it’s so pathetic,” House Minority Leader Nancy Pelosi said in a January press conference. Republicans pounced at the time, thinking they had caught Pelosi making an elitist remark. But nearly a year later, many voters agree with Pelosi.

On Tuesday, the Treasury Department announced that the deficit had increased to nearly $800 billion—a jump of 17 percent—thanks in large part to declining tax revenue. The leap was the largest since 2009, at the height of the Great Recession. In April, the CBO released a report finding that the deficit would hit $1 trillion by 2020—two years earlier than initially thought. The rising deficit has caused Republicans, predictably, to call for entitlement reform—meaning cuts to Medicare, Medicaid, and Social Security. Ever the cynic, McConnell this week deflected reports that the GOP tax cut was driving the debt increase and instead suggested that entitlement programs were the real deficit busters. “It’s disappointing, but it’s not a Republican problem,” McConnell told Bloomberg on Tuesday. “It’s a bipartisan problem: Unwillingness to address the real drivers of the debt by doing anything to adjust those programs to the demographics of America in the future.”

But this has only fueled Democratic midterm messaging about potential GOP cuts to social spending. “Sen. McConnell gave the game up in his comment yesterday,” Maryland senator Chris Van Hollen, who chairs the Democratic Senatorial Campaign Committee, said in a press call on Wednesday. “It was very clear from what he said that a vote for Republican candidates in this election is a vote to cut Social Security, Medicare and Medicaid. That’s what he said.”

With voters dismissing the meager benefits the tax cut bestowed on them, and the deficit ballooning, Republicans can’t claim ownership of an economic boom that’s been years in the making. So instead they’re embracing Trump’s 2016 playbook: fear-mongering over Muslims, immigration, and crime. In the wake of the protests against Brett Kavanaugh’s confirmation to the Supreme Court, Republicans have argued, sometimes explicitly, that they must be kept in power to preserve the rule of law—that Democrats will burn everything to the ground if they’re put in charge. “You don’t hand matches to an arsonist, and you don’t hand power to an angry, left-wing mob, and that’s what they have become,” Trump recently said.

Republicans also may be realizing what Hillary Clinton’s campaign realized too late in 2016: The economy is strong by most metrics, but millions of voters don’t see it that way. Though the economy Trump inherited was booming, its gains were being reaped unequally—that’s partly why he won. Touting a strong economy to voters who have been left behind is far from a winning strategy, especially after passing a tax cut for the rich that has only made that inequality more pronounced.

All of this may be beside the point. Republicans hurried to enact the Tax Cuts and Jobs Act in part to please their donors, whose help they needed to fend off a blue wave in 2018. Sure enough, donors like Sheldon Adelson and the Koch brothers have opened their checkbooks, but Republicans are getting pummeled in fundraising in dozens of competitive districts: Democrats have out-raised Republicans in all 30 “tossup” races, according to FEC data released on Tuesday. Still, that money has done nothing to boost Republican efforts to sell the economy or the tax bill. “Their messaging has been extremely poor,” Steve Moore, who served as an economic adviser to Trump’s 2016 campaign, told The Washington Post. “We’ve got the best economy in 25 years and they aren’t really talking about it. They are letting Democrats control the messaging.”

Mind Games

In April, when the Senate Judiciary and Commerce committees summoned Facebook CEO Mark Zuckerberg to Washington, it looked as if the nation was finally going to reckon with the outsize role that technology companies now play in American elections. Seventeen months had gone by since Donald Trump’s stunning presidential victory—a success credited by many to his campaign’s mastery of Facebook’s advertising platform, as well as to the divisive agitprop seeded throughout Facebook by Russian trolls from the Internet Research Agency, whose 470 pages and accounts were seen by an estimated 157 million Americans.

But that was not what brought Zuckerberg to the Capitol. Instead, he was there to defuse the bomb dropped three weeks earlier by Christopher Wylie, former research director at Cambridge Analytica, the data science firm that Trump’s digital team had employed during the election campaign. In interviews with The Guardian and The New York Times, Wylie confirmed that his company had taken data from millions of Facebook users without their knowledge or consent—as many as 87 million users, he later revealed. Cambridge Analytica had used the information to identify Americans’ subconscious biases and craft political messages designed to trigger their anxieties and thereby influence their political decisions—recasting a marketing technique known as “psychographics” that, more typically, is used to entice retail customers with ads that spark their underlying emotional reflexes. (“This product will make you feel happy!” “This product will make you feel attractive!”)

Cambridge Analytica turned this technique sideways, with messaging that exploited people’s vulnerabilities and psychological proclivities. Those with authoritarian sympathies might have received messages about gun rights or Trump’s desire to build a border wall. The overly anxious and insecure might have been pitched Facebook ads and emails talking about Hillary Clinton’s support for sanctuary cities and how they harbor undocumented and violent immigrants. Alexander Nix, who served as CEO of Cambridge Analytica until March, had earlier called this method of psychological arousal the data firm’s “secret sauce.”

Cambridge Analytica had purchased its Facebook user data for more than $800,000 from Global Science Research (GSR), a company that was set up specifically to access the accounts of anyone who clicked on GSR’s “This Is Your Digital Life” app—and the accounts of their Facebook friends. At the time, Facebook’s privacy policy allowed this, even though most users never consented to handing over their data or knew that it had been harvested and sold. The next year, when it became aware that Cambridge Analytica had purchased the data, Facebook took down the GSR app and asked both GSR and Cambridge Analytica to delete the data. “They told us they did this,” Zuckerberg told Congress. “In retrospect, it was clearly a mistake to believe them.” In March, around the time Wylie came forward, The New York Times reported that at least some of the data was still available online.

Wylie, a pink-haired, vegan, gay Canadian, might seem an unlikely asset to Trump’s campaign. And as he tells the story now, he’s filled with remorse for creating what, in an interview with The Guardian, he referred to as “Steve Bannon’s psychological warfare mindfuck tool.” For months, he had been quietly feeding information to the investigative reporter Carole Cadwalladr, whose articles in The Guardian and The Observer steadily revealed a through-line from dark money to Cambridge Analytica to Trump. (Cadwalladr also connected Cambridge Analytica to the Brexit campaign, through a Canadian data firm that worked both for the Vote Leave campaign and for Cambridge Analytica itself.) When he finally went public, Wylie explained how, with financial support from right-wing billionaire Robert Mercer, Cambridge Analytica’s principal investor, and with Steve Bannon’s guidance, he had built the algorithms and models that would target the innate biases of American voters. (Ted Cruz was one of Cambridge Analytica’s first clients and was Mercer’s preferred presidential candidate in the primaries before Trump crushed him.) In so doing, Wylie told Cadwalladr, “We ‘broke’ Facebook.”

So Zuckerberg agreed to come to Washington to be questioned by senators about the way his company’s lax privacy policies had inadvertently influenced the U.S. election—and possibly thrown it to Donald Trump. But what should have been a grilling turned out to be more like a sous vide—slow, gentle, low temperature—as the senators lightly rapped Zuckerberg on his knuckles over Facebook’s various blunders, and he continually reminded them that he’d created the site in his Harvard dorm room, not much more than a decade before, and now look at it! Of course, he reminded them with a kind of earnest contrition, there were going to be bumps in the road, growing pains, glitches. The senators seemed satisfied with his shambling responses and his constant refrain of “My team will get back to you,” and only mildly bothered when he couldn’t answer basic questions like the one from Roger Wicker, a Mississippi Republican, who wanted to know if Facebook could track a user’s internet browsing activity, even when that person was not logged on to the platform. (Answer: It can and it does.)

Shortly before this tepid inquest, Zuckerberg publicly endorsed the Honest Ads Act, a bipartisan bill cosponsored by Democratic Senators Amy Klobuchar and Mark Warner and Republican John McCain, which, among other things, would require internet platforms like Facebook to identify the sources of political advertisements. It also would subject online platforms and digital communications to the same campaign disclosure rules as television and radio.

A tech executive supporting federal regulation of the internet may, at first, seem like a big deal. “I’m not the type of person that thinks all regulation is bad,” Zuckerberg told the senators. “I think the internet is becoming increasingly important in people’s lives, and I think we need to have a full conversation about what is the right regulation, not whether it should be or shouldn’t be.” But Facebook has spent more than $50 million since 2009 lobbying Congress, in part to keep regulators at a distance, and cynics viewed Zuckerberg’s support for the new law as a calculated move to further this agenda. (Indeed, after California passed the strongest data privacy law in the country in June, Facebook and the other major tech companies began lobbying the Trump administration for a national, and far less stringent, data privacy policy that would supersede California’s.) Verbally supporting the Honest Ads Act—legislation that is unlikely to be enacted in the current congressional climate—was easy, especially when Facebook had already begun rolling out a suite of new political ad policies that appeared to mirror the minimal transparency requirements lawmakers sought to establish. The subtext of this move was clear: Facebook could regulate itself without the interference of government overseers.

Zuckerberg’s congressional testimony was the culmination of an extensive apology tour in which he gave penitent interviews to The New York Times, Wired, Vox, and more, acknowledging that mistakes had been made. “This was a major breach of trust,” Zuckerberg told CNN. “I’m really sorry that this happened.” A month later, Facebook launched a major ad campaign, vowing, “From now on, Facebook will do more to keep you safe and protect your privacy.” Then, in mid-May, Cambridge Analytica declared bankruptcy, though this did not put an end to the whole affair. A legal challenge to the company by American professor David Carroll for processing his voter data without his knowledge or consent has been allowed to continue in the U.K., despite the firm’s dissolution.

It’s impossible to know whether Cambridge Analytica’s psychographic algorithms truly made a difference in Trump’s victory. But the underlying idea—that political campaigns can identify and influence potential voters more effectively by gathering as much information as possible on their identities, beliefs, and habits—continues to drive both Republican and Democratic data firms, which are currently hard at work on the next generation of digital campaign tools. And while the controversy surrounding Cambridge Analytica exposed some of the more ominous aspects of election campaigning in the age of big data, the revelations haven’t led to soul-searching on the part of tech companies or serious calls for reform by the public—and certainly not from politicians, who benefit most from these tactics.

If anything, the digital arms race is accelerating, spurred by advances in artificial intelligence and machine learning, as technologists working both sides of the political aisle develop ever-more-powerful tools to parse, analyze, and beguile the electorate. Lawmakers in Congress may have called Mark Zuckerberg to account for Facebook’s lax protection of its users’ data. But larger and more enduring questions remain about how personal data continues to be collected and used to game not just the system, but ourselves as sovereign individuals and citizens.

In 1960, John F. Kennedy’s campaign manager—his brother Robert—hired one of the first data analytics firms, the Simulmatics Corporation, to use focus groups and voter surveys to tease out the underlying biases of the public as the country considered whether to elect its first Catholic president.

The work was top secret; Kennedy denied that he’d ever commissioned the Simulmatics report. But in the decades that followed, as market researchers and advertisers adopted psychological methods to better understand and appeal to consumers, social scientists consulting on political campaigns embraced the approach as well. They imagined a real-world political science fashioned out of population surveys, demographic analyses, psychological assessments, message testing, and algorithmic modeling. It would be a science that produced rational and quantifiable strategies to reach prospective voters and convert them into staunch supporters. That goal—merely aspirational at the time—has since developed into a multibillion-dollar industry, of which Cambridge Analytica was a well-remunerated beneficiary. For its five-month contract with the Trump campaign in 2016, the company was paid nearly $6 million.

The kind of work Cambridge Analytica was hired to perform is a derivative of “micro-targeting,” a marketing technique that was first adapted for politics in 2000 by Karl Rove, George W. Bush’s chief strategist. That year, and to an even greater degree in 2004, Rove and his team set about finding consumers—that is to say, voters—who were most likely to buy what his candidate was selling, by uncovering and then appealing to their most salient traits and concerns. Under Rove’s guidance, the Bush team surveyed large samples of individual voters to assess their beliefs and behaviors, looking at such things as church attendance, magazine subscriptions, and organization memberships, and then used the results to identify 30 different kinds of supporters, each with specific interests, lifestyles, ideologies, and affinities, from suburban moms who support the Second Amendment to veterans who love NASCAR. They then slotted the larger universe of possible Bush voters into those 30 categories and tailored their messages accordingly. This approach gave the Bush campaign a way to supplement traditional broadcast media by narrowcasting specific messages to specific constituencies, and it set the scene for every campaign, Republican and Democratic, that followed.
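For readers who want to see the mechanics laid bare, the core of that approach is just segmentation plus message lookup. The sketch below is a minimal, hypothetical illustration in Python; the traits, segment names, and messages are invented for the example and are not taken from the Bush campaign’s actual categories.

```python
# Hypothetical, rule-based micro-targeting in miniature: survey traits map a
# voter to a named segment, and each segment gets its own tailored message.
# Segment names, rules, and messages are invented for illustration only.

SEGMENT_MESSAGES = {
    "suburban_second_amendment_mom": "Protecting your family and your rights.",
    "nascar_veteran": "Honoring your service, keeping taxes low.",
    "unmatched": "A stronger economy for every American.",
}

def assign_segment(voter: dict) -> str:
    """Slot one surveyed voter into a segment based on salient traits."""
    if voter.get("gender") == "F" and voter.get("suburban") and voter.get("gun_rights"):
        return "suburban_second_amendment_mom"
    if voter.get("veteran") and "NASCAR" in voter.get("interests", []):
        return "nascar_veteran"
    return "unmatched"

def tailor_message(voter: dict) -> str:
    """Return the message keyed to the voter's segment."""
    return SEGMENT_MESSAGES[assign_segment(voter)]

if __name__ == "__main__":
    sample = {"gender": "F", "suburban": True, "gun_rights": True}
    print(assign_segment(sample), "->", tailor_message(sample))
```

Real operations worked from far richer data and dozens of segments, but the logic is the same narrowcasting described above: slot the voter into a group, then serve that group’s message.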

In 2008, the micro-targeting advantage shifted to the Democrats. Democratic National Committee Chair Howard Dean oversaw the development of a robust database of Democratic voters, while for-profit data companies were launched in support of liberal causes and Democratic candidates. Their for-profit status allowed them to share data sets between political clients and advocacy groups, something the DNC could not do with its voter database because of campaign finance laws. One of these companies, Catalist, now controls a data set of 240 million voting-age individuals, each an aggregate, the company says, of hundreds of data points, including “purchasing and investment profiles, donation behavior, occupational information, recreational interests, and engagement with civic and community groups.”

“Catalist was a game changer,” said Nicco Mele, the director of Harvard’s Shorenstein Center on Media, Politics, and Public Policy, and a veteran of dozens of political campaigns. “It preserved data in an ongoing way, cycle after cycle, so it wasn’t lost after every campaign and didn’t have to be re-created for the next one. Catalist got things going.”

The data sets were just one part of it. In 2008 and 2012, the Democrats also had more sophisticated predictive models than the Republicans did, a result of having teams of data scientists and behavioral scientists advising Barack Obama’s presidential campaigns. While the data scientists crunched the numbers, the behavioral scientists conducted experiments to determine the most promising ways to get people to vote for their candidate. Shaming people by comparing their voting records to those of their family members and neighbors turned out to be surprisingly effective, and person-to-person contact was dramatically more productive than robocalls; the two combined were even more potent.

The Obama campaign also repurposed an advertising strategy called “uplift” or “brand lift,” normally used to measure consumer-brand interactions, and used it to pursue persuadable voters. First they gathered millions of data points on the electorate from public sources, commercial information brokers, and their own surveys. Then they polled voters with great frequency and looked for patterns in the responses. The data points, overlaid on top of those patterns, allowed the Democrats to create models that predicted who was likely to vote for Obama, who was not, and who was open to persuasion. (The models also indicated who would be disinclined to vote for Obama if contacted by the campaign.) These models sorted individuals into categories, as the Bush campaign had done before—mothers concerned about gun violence, say, or millennials with significant college debt—and these categories were then used to tailor communications to the individuals in each group, which is the essence of micro-targeting.
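The underlying arithmetic can be sketched with standard tools. The toy example below uses a common “two-model” formulation (one model fit on voters the campaign contacted, another on those it did not, with the gap in predicted support read as persuadability) on randomly generated data; it is an assumption about how such scoring might work, not a description of the Obama campaign’s actual system.

```python
# A toy "two-model" uplift sketch on synthetic data (illustration only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 4))              # stand-ins for demographic/consumer data points
contacted = rng.integers(0, 2, size=n)   # 1 if the campaign reached out to this voter
# Synthetic outcome: baseline support plus a small contact effect for some voters.
support = (X[:, 0] + 0.5 * contacted * (X[:, 1] > 0) + rng.normal(size=n)) > 0

# Fit one support model on contacted voters and one on uncontacted voters.
treated = LogisticRegression().fit(X[contacted == 1], support[contacted == 1])
control = LogisticRegression().fit(X[contacted == 0], support[contacted == 0])

# The difference in predicted support probability is the estimated persuasion effect.
uplift = treated.predict_proba(X)[:, 1] - control.predict_proba(X)[:, 1]

# The highest-uplift voters are the "persuadables"; strongly negative scores flag
# people whom a contact might push away.
print("most persuadable voter indexes:", np.argsort(uplift)[-5:])
```

Scores like these are what let a campaign rank voters for outreach and, just as important, flag the people who are better left alone.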

The Obama campaign had another, novel source of data, too: information that flowed from the cable television set-top boxes in people’s homes. Through agreements with industry research firms, the campaign sent them the names and addresses of individuals whom their models tagged as persuadable, and the research companies sent back anonymous viewing profiles of each one. The campaign used these profiles to identify which stations and programs would give them the most persuasion per dollar, allowing them to buy ads in the places and times that would be most effective. The campaign also mined—and here’s the irony—Facebook data culled from the friends of friends, looking for supporters.

The fact that the Obama campaign was able to use personal information in this way without raising the same ire as Cambridge Analytica and Facebook is a sign of how American views on technology and its role in politics have shifted over the past decade. At the time, technology was still largely viewed as a means to break traditional political structures, empower marginalized communities, and tap into the power of the grassroots. Today, however, many people have a much darker view of the role technology plays in politics—and in society as a whole. “In ’12 we talked about Obama using micro-targeting to look at your set-top box to tell you who should get what commercial, and we celebrated it,” said Zac Moffatt, who ran Mitt Romney’s digital campaign in that election. “But then we look at the next one and say, ‘Can you believe the horror of it?’ There’s an element of the lens through which you see it. To be a technology president used to be a very cool thing, and now it’s a very dangerous thing.”

Despite the innovations of both Obama campaigns, by the time the 2016 election season rolled around, the technological advantage had shifted back to the Republicans, who had developed a sophisticated, holistic approach to digital campaigning that benefited not only Donald Trump but down-ticket Republicans as well. Republicans had access to a revamped GOP Data Center run by the party, as well as to i360, a for-profit data operation bankrolled by the Koch brothers’ network that offered incredibly detailed information on potential voters. The i360 voter files combined information purchased from commercial sources, such as shopping habits, credit status, homeownership, and religious affiliation, with voting histories, social media content, and any connections a voter might have had with advocacy groups or other campaigns. To this, Politico reported in 2014, the Koch network added “polling, message-testing, fact-checking, advertising, media buying, [and] mastery of election law.” Democratic candidates, meanwhile, were largely beholden to the party’s official data provider, NGP VAN, with the DNC not only controlling the data, but deciding how it could be used and by whom.

The Obama team’s digital trailblazing also may have diverted attention from what the Republicans were actually up to during those years. “I think 2016 was kind of the realization that you had eight years of reporters believing everything told them by the Democratic Party—‘We know everything, and the Republicans can’t rub two rocks together,’” Moffatt said. “But if you look, the Republicans haven’t really lost since 2010, primarily based on their data fields and technology. But no one wanted to tell that story.”

One major plot point in that story’s arc is that the Republican Party devoted more resources to social media and the internet than the Democrats did. Eighty percent of Trump’s advertising budget went to Facebook in 2016, for example. “The Trump campaign was fully willing to embrace the reality that consumption had moved to mobile, that it had moved to social,” Moffatt said. “If you think about Facebook as the entry point to the internet and a global news page, they dominated it there, while Hillary dominated all the places campaigns have historically dominated”—especially television.

Clinton’s loss hasn’t changed the basic strategy, either. Going into the midterms, Republicans continue to focus on the internet, while Democrats continue to pour money into television. (An exception is the Democratic-supporting PAC Priorities USA, which is spending $50 million on digital ads for the midterms.) Republicans are reportedly spending 40 percent of their advertising budgets on digital, whereas Democrats are spending around 10 to 20 percent. Democratic strategist Tim Lim agrees that ignoring the internet in favor of television advertising is a flawed strategy. “The only way people can actually understand what we’re running for is if they see our messaging, and they’re not going to be seeing our messaging if we’re spending it on Wheel of Fortune and NCIS,” he said. “Democratic voters are not watching those shows.”

The Democrats are hampered by a structural problem, too: Each campaign owns its own digital tools, and when an election is over, those tools are packed up and put away. As a result, said Betsy Hoover, a veteran of both Obama campaigns who is now a partner at Higher Ground Labs, a liberal campaign-tech incubator, “four years later we’re essentially rebuilding solutions to the same problem from square one, rather than starting further up the chain.” The Republicans, by contrast, have been building platforms and seeding them up and down the party—which has allowed them to maintain their technological advantage.

“After losing in 2012, one of the most creative things the Republicans did was apply entrepreneurship to technology,” said Shomik Dutta, Hoover’s partner in Higher Ground Labs, who also worked on both Obama campaigns as well as in the Obama White House. “The Koch brothers funded i360, and the Mercers funded Cambridge Analytica and Breitbart, and they used entrepreneurship to take risks, build products, test them nimbly, and then scale up what worked quickly across the party. And that, I think, is a smart way to think about political technology.”

And so, taking a page from the Republican playbook, for the past two years Hoover and Dutta have been working to make Democrats competitive again in the arena of campaign technology. In the absence of deep-pocketed Democratic funders comparable to the Mercers and the Kochs, Higher Ground Labs acts as an incubator, looking particularly to Silicon Valley entrepreneurs to support the next generation of for-profit, election-tech startups. In 2017, the company divided $2.5 million in funding among 11 firms, and in April it announced that it was giving 13 additional startups an average of $100,000 each in seed money.

That’s still a far cry from the $50 million the Koch brothers reportedly spent to develop i360. And Higher Ground Labs faces other challenges, too. Political candidates and consultants are often creatures of habit, so getting them to try new products and untested approaches can be difficult. With presidential elections happening only every four years, and congressional elections happening every two, it can be difficult for election tech companies to sustain themselves financially. And, as has been the case with so many technology companies, growing from a small, nimble startup into a viable company that can compete on a national level is often tricky. “It’s easy to create a bunch of technology,” said Robby Mook, Hillary Clinton’s 2016 campaign manager, “but it’s a lot harder to create technology that creates the outcomes you need at scale.”

Hoover and Dutta are hopeful that their investments in these startups will pay off. If the companies make money, Higher Ground Labs will become self-sustaining. But even if the startups fail financially, they may show what’s possible technologically. Indeed, the new tools these companies are working on are a different order of magnitude from the searchable databases that companies like Catalist pioneered just a few election cycles ago. And if they help Democratic candidates win, Hoover and Dutta view it as money well spent. “We hope to be part of the cavalry,” Hoover said.

The companies that Higher Ground Labs is funding are working on all aspects of campaigning: fundraising, polling, research, voter persuasion, and get-out-the-vote efforts. They show where technology—especially artificial intelligence, machine learning, and data mining—is taking campaigning, much as Cambridge Analytica did two years ago when it launched “psychographics” into the public consciousness. One Higher Ground-funded company has developed a platform that uses website banner ads to measure public opinion. Another is able to analyze social media to identify content that actually changes minds (as opposed to messages that people ignore). A third has created a database of every candidate running for office across the country, providing actionable information to state party operatives and donors while building a core piece of infrastructure for the Democrats more generally.

An opposition research firm on Higher Ground’s roster, Factba.se, may offer campaigns the antidote to fake news (assuming evidence still matters in political discourse). It scours documents, social media, videos, and audio recordings to create a searchable compendium of every word published or uttered online by an individual. If you want to discover everything Donald Trump has ever said about women or steak or immigrants or cocaine, it will be in Factba.se. If you want to know every time he’s contradicted himself, Factba.se can provide that information. If you want to know just how limited his verbal skills are, that analysis is available too. (The president speaks at a fourth-grade level.) And if you want to know what’s really bugging him—or anyone—Factba.se uses software that can evaluate audio recordings and pick up on expressions of stress or discomfort in a person’s voice that are undetectable to the unaided ear.

To augment its targets’ dossiers, the company also uses personality tests to assess their emotional makeup. Is the subject extroverted, neurotic, depressed, or scared? Is he all of the above? (One of these tests, OCEAN—designed to measure openness, conscientiousness, extraversion, agreeableness, and neuroticism—is actually the same one that Cambridge Analytica used to construct its models.)
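Scoring such a test is straightforward arithmetic: each trait’s score is the average of its items, with reverse-keyed items flipped. The sketch below uses two made-up placeholder items per trait and only shows the shape of the calculation; it is not the validated inventory that Cambridge Analytica or FactSquared actually uses.

```python
# A toy Big Five ("OCEAN") scorer: each trait is the mean of its items on a
# 1-5 scale, with reverse-keyed items flipped. Items here are placeholders.
ITEMS = {
    "openness":          [("I enjoy trying new things", False), ("I prefer routine", True)],
    "conscientiousness": [("I finish what I start", False), ("I often procrastinate", True)],
    "extraversion":      [("I feel energized around people", False), ("I keep to myself", True)],
    "agreeableness":     [("I trust others easily", False), ("I can be blunt to a fault", True)],
    "neuroticism":       [("I worry about small things", False), ("I stay calm under pressure", True)],
}

def score(answers: dict) -> dict:
    """answers maps each item's text to a 1-5 rating; returns a 1-5 score per trait."""
    profile = {}
    for trait, items in ITEMS.items():
        values = [6 - answers[text] if reverse else answers[text] for text, reverse in items]
        profile[trait] = sum(values) / len(values)
    return profile

if __name__ == "__main__":
    example = {text: 4 for trait_items in ITEMS.values() for text, _ in trait_items}
    print(score(example))
```

Numbers of this kind are what end up attached to a target’s dossier.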

“We build these profiles of people based upon everything they do, and then we do an analysis,” said Mark Walsh, the CEO of Factba.se’s parent company FactSquared, who was the first chief technology officer of the Democratic Party, back in 2002. As an example, he cited the 2017 gubernatorial race in Virginia, where Ed Gillespie, the former head of the Republican National Committee, ran against Lt. Governor Ralph Northam. “We took audio and video of the three debates, and we analyzed Gillespie, looking for micro-tremors and tension in his voice when he was talking about certain topics,” Walsh said. “If you were watching the debates, you wouldn’t know that there was a huge spike when he was talking about his own party’s gun control policy, which he didn’t seem to agree with.” This was valuable intel for the Northam campaign, Walsh said—though in the end, Northam didn’t really need it. (Northam won the election by nearly nine points, the biggest margin for a Democrat in more than a quarter-century.) Still, it was a weapon that stood at the ready.

“These are the types of oppo things you’re going to start to see more and more of, where candidate A will be able to fuck with the head of candidate B in ways that the populace won’t know, by saying things that they know bothers them or challenges them or makes them off-kilter,” Walsh said. “After about 5,000 words, our AI engine can be quite predictive of what makes you happy and what makes you sad and what makes you nervous.”

Judging personalities, measuring voice stress, digging through everything someone has ever said—all of this suggests that future digital campaigns, irrespective of party, will have ever-sharper tools to burrow into the psyches of candidates and voters. Consider Avalanche Strategy, another startup supported by Higher Ground Labs. Its proprietary algorithm analyzes what people say and tries to determine what they really mean—whether they are perhaps shading the truth or not entirely comfortable with their views. According to Michiah Prull, one of the company’s founders, the data firm prompts survey takers to answer open-ended questions about a particular issue, and then analyzes the specific language in the responses to identify “psychographic clusters” within the larger population. This allows campaigns to target their messaging even more effectively than traditional polling can—because, as the 2016 election made clear, people often aren’t completely open and honest with pollsters.
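Avalanche Strategy has not published its method, but the general recipe (turn free-text answers into numeric features, then group similar respondents) can be approximated with off-the-shelf tools. The sketch below, with made-up responses, uses TF-IDF vectors and k-means as a stand-in for whatever proprietary clustering the firm actually runs.

```python
# A minimal sketch of clustering open-ended survey answers into candidate
# "psychographic" groups; the responses are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = [
    "I worry my kids' schools are falling behind",
    "Health care costs are crushing my family",
    "Schools need more funding and smaller classes",
    "My premiums doubled and I can't afford the doctor",
]

# Turn each answer into a weighted bag-of-words vector, then group similar answers.
vectors = TfidfVectorizer(stop_words="english").fit_transform(responses)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for text, label in zip(responses, labels):
    print(label, "|", text)  # each label is a candidate messaging cluster
```

Each resulting cluster would then get its own positioning and framing, which is the step Prull describes next.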

“We are able to identify the positioning, framing, and messaging that will resonate across the clusters to create large, powerful coalitions, and within clusters to drive the strongest engagement with specific groups,” Prull said. Avalanche Strategy’s technology was used by six female first-time candidates in the 2017 Virginia election, who used its insights to craft digital ads in the final weeks of the campaign. Five of the six women won.

Clearly, despite public consternation over Cambridge Analytica’s tactics, especially in the days and weeks after Trump won (and before its data misappropriation had come to light), political campaigns are not shying away from the use of psychographics. If anything, the use of deeply personal data is becoming even more embedded within today’s approach to campaigning. “There is real social science behind it,” Laura Quinn, the CEO of Catalist, told me not long after the 2016 election. “The Facebook platform lets people gather so much attitudinal information. This is going to be very useful in the future for figuring out how to make resource decisions about where people might be more receptive to a set of narratives or content or issues you are promoting.”

And it’s not just Facebook that provides a wealth of user information. Almost all online activity, and much offline, produces vast amounts of data that is being aggregated and analyzed by commercial vendors who sell the information to whoever will pay for it—businesses, universities, political campaigns. That is the modus operandi of what the now-retired Harvard business professor Shoshana Zuboff calls “surveillance capitalism”: Everything that can be captured about citizens is sucked up and monetized by data brokers, marketers, and companies angling for your business. That data is corralled into algorithms that tell advertisers what you might buy, insurance companies if you’re a good risk, colleges if you’re an attractive candidate for admission, courts if you’re likely to commit another crime, and on and on. Elections—the essence of our democracy—are not exempt.

Just as advertisers or platforms like Facebook and Google argue that all this data leads to ads that consumers actually want to see, political campaigns contend that their use of data does something similar: It enables an accurate ideological alignment of candidate and voter. That could very well be true. Even so, the manipulation of personal data to advance a political cause undermines a fundamental aspect of American democracy that begins to seem more remote with each passing campaign: the idea of a free and fair election. That, in the end, is the most important lesson of Cambridge Analytica. It didn’t just “break Facebook”; it broke the idea that campaigns work to convince voters that their policies will best serve them. It attempted to use psychological and other personal information to engage in a kind of voluntary disenfranchisement, suppressing turnout with messaging designed to keep voters who support the opposing candidate away from the polls—as well as using that same information to arouse fear, incite animosity, and divide the electorate.

“Manipulation obscures motive,” Prull said, and this is the problem in a nutshell: Technology may be neutral (this is debatable), but its deployment rarely is. No one cared that campaigns were using psychographics until it was revealed that psychographics might have helped put Donald Trump in the White House. No one cared about Facebook’s dark posts until they were used to discourage African Americans from showing up at the polls. No one noticed that their Twitter follower, Glenda from the heartland, with her million reasons to dislike Hillary Clinton, was really a bot created in Russia—until after the election, when the Kremlin’s efforts to use social media to sow dissension throughout the electorate were unmasked. American democracy, already pushed to the brink by unlimited corporate campaign donations, by gerrymandering, by election hacking, and by efforts to disenfranchise poor, minority, and typically Democratic voters, now must contend with a system that favors the campaign with the best data and the best tools. And as was made clear in 2016, data can be obtained surreptitiously, and tools can be used furtively, and no one can stop it before it’s too late.

Just as worrisome as political campaigns misusing technology are the outside forces seeking to influence American politics for their own ends. As Russia’s interventions in the 2016 election highlight, the biggest threats may not come from the apps and algorithms developed by campaigns, but instead from rogue operatives anywhere in the world using tools freely available on the internet.

Of particular concern to political strategists is the emerging trend of “deepfake” videos, which show real people saying and doing things they never actually said or did. These videos look so authentic that it is almost impossible to discern that they are not real. “This is super dangerous going into 2020,” Zac Moffatt said. “Our ability to process information lags behind the ability of technology to make something believable when it’s not. I just don’t think we’re ready for that.”

To get a sense of this growing threat, one need only look at a video that appeared on Facebook not long after the young Democratic Socialist Alexandria Ocasio-Cortez won her primary for a New York congressional seat in June. The video appeared to be an interview with Ocasio-Cortez conducted by Allie Stuckey, the host of Conservative Review TV, an online political channel. Stuckey, on one side of the screen, asks Ocasio-Cortez questions, and Ocasio-Cortez, on the other, struggles to respond or gives embarrassingly wrong answers. She looks foolish. But the interview isn’t real. The video was a cut-and-paste job. Conservative Review TV had taken answers from a PBS interview with Ocasio-Cortez and paired them with questions asked by Stuckey that were designed to humiliate the candidate. The effort to discredit Ocasio-Cortez was extremely effective. In less than 24 hours, the interview was viewed more than a million times.

The Ocasio-Cortez video was not especially well made; a discerning viewer could spot the manipulation. But as technology improves, deepfakes will become harder and harder to identify. They will challenge reality. They will make a mockery of federal election laws, because they will catapult viewers into a post-truth universe where, to paraphrase Orwell, power is tearing human minds to pieces and putting them together again in new shapes of someone else’s thinking.

Facebook didn’t remove the offensive Ocasio-Cortez video when it was revealed to be a fake because Stuckey claimed—after the video had gone viral—that it was satirical; the company doesn’t take down humor unless it violates its “community standards.” This and other inconsistencies in Facebook’s “fake news” policies (such as its failure to remove Holocaust denier pages) demonstrate how difficult it will be to keep bad actors from using the platform to circulate malicious information. They also reveal the challenge, if not the danger, of letting tech companies police themselves.

This is not to suggest that the government will necessarily do a better job. It is quixotic to believe that there will be a legislative intervention to regulate how campaigns obtain data and how they use it anytime soon. In September, two months before the midterm elections, Republicans in the House of Representatives pulled out of a deal with their Democratic counterparts that would have banned campaigns from using stolen or hacked material. Meanwhile, the Federal Election Commission is largely toothless, and it’s hard to imagine how routine political messages, never mind campaign tech, could be regulated, let alone if they should be. Though the government has established fair election laws in the past—to combat influence peddling and fraud, for instance—the dizzying pace at which campaign technology is evolving makes it especially difficult for lawmakers to grapple with the subject, intellectually and legislatively, and for the public to understand the stakes. “If you leave us to do this on our own, we’re gonna mess it up,” Senator Mark Warner conceded this past June, alluding to his and his colleagues’ lack of technical expertise. Instead, Warner imagined some kind of partnership between lawmakers and the technology companies they’d oversee, which of course comes with its own complications.

If there is any good news in all of this, it is that technology is also being used to expand the electorate and extend the franchise. Democracy Works, a Brooklyn-based nonprofit, for example, has partnered with Facebook on a massive effort to register new voters. And more Americans—particularly younger people—are participating in the political process through “peer-to-peer” texting apps like Hustle on the left (which initially took off during the Bernie Sanders campaign), RumbleUp on the right, and CallHub, which is nonpartisan. These mobile apps let supporters who may not want to knock on doors or make phone calls still engage directly in canvassing.

This is key, because if there is one abiding message from political consultants of all dispositions, it is that the most effective campaigns are the most intimate ones. Hacking and cheating aside, technology will only carry a candidate so far. “I think the biggest fallacy out there right now is that we win through digital,” Robby Mook said. “Campaigns win because they have something compelling to say.”

The Blue Wave Might Wipe Out Voter Suppression, Too

Every state routinely prunes its voter rolls when registered voters move, die, or get convicted of a felony. But under Secretary of State Jon Husted, Ohio has taken an aggressive approach to removing voters from state registration lists. The state purged more than two million voters from its rolls between 2011 and 2016. Many, if not most, of those voters were likely ineligible. But over the past few years, as part of an annual audit of sorts, Husted’s office has removed thousands of eligible voters from the rolls simply because they failed to vote in three consecutive elections and didn’t return a postcard confirming their registration.

Critics, including state Representative Kathleen Clyde, said this practice disproportionately affects low-income Ohioans and communities of color, two constituencies that typically favor Democratic candidates. In 2015, Clyde introduced a bill to block Husted from purging voters unless they leave the state. Two years later, she authored another bill that would enact automatic voter registration.

Neither measure became law in the Republican-led chamber. And this past June, the Supreme Court upheld Husted’s purge. But soon Clyde may be in a position to stop the practice herself: She’s the Democratic nominee to replace Husted as secretary of state, a position that would give her significant influence over the state’s election laws.

Clyde is one of roughly a dozen Democratic candidates across the country who could become their states’ chief election officers if a blue wave sweeps through polling places in November. Their victories would provide Democrats an opportunity to turn back voter suppression efforts by Republican officeholders—and could give the party a leg up in voter turnout when President Donald Trump is up for reelection in 2020.

Forty-seven states have a secretary of state, either as an appointed post or an elected office. While the position’s duties can vary from state to state, the most common is overseeing elections and voting procedures, which are shaped by a mixture of federal statutes, state laws, and county policies. Navigating that legal labyrinth often falls to secretaries of state—hall monitors, of sorts, for the nation’s democratic processes.

The role has taken on a heightened significance in recent years. Republicans hold more than half of the positions across the country, giving the party an advantage when shaping the nation’s election processes. Some Republican secretaries of state have used the position to crusade against the purported threat of voter fraud. Though vanishingly rare in the U.S., voter fraud has provided a useful justification for more restrictive voting measures that have kept tens of thousands of Americans from exercising their right to cast a ballot.

Governing magazine’s ratings of this year’s secretary of state races, as of October 12.

Democrats have an opportunity in the November midterm elections to tear down those barriers. Roughly two-thirds of the nation’s elected secretary of state positions are on the ballot this year, and Republicans are defending seven open seats, versus none for Democrats. (Governing magazine has rated eight races as competitive—all of them for seats currently held by Republicans.) What’s more, the secretaries of state elected this year will serve during the 2020 presidential election, meaning that Democratic officeholders would be well-placed to expand voter access in battleground states like Arizona, Georgia, Michigan, Nevada, Ohio, and Wisconsin should they prevail in two weeks.

Some contests have already drawn national attention. Georgia Secretary of State Brian Kemp, the Republican candidate for governor, froze 53,000 voter registration forms for dubious reasons, according to an Associated Press investigation earlier this month. More than 70 percent of the forms came from black applicants, raising concerns that the freeze is aimed at reducing voter turnout for Stacey Abrams, Kemp’s Democratic opponent. (If elected, Abrams would be the first black woman governor in American history.) Kemp also presided over a sweeping purge of the state’s voter rolls that removed almost 700,000 voters over the past two years.

Brad Raffensperger, the Republican candidate to replace Kemp, said at a debate earlier this month that he would continue the purges to “safeguard and keep our elections clean.” John Barrow, the Democratic challenger, opposes them. “Just because the Supreme Court allows you to discriminate doesn’t mean you must discriminate,” he wrote on Twitter after the court’s ruling in the Ohio case. “As your next Secretary of State, I’ll protect citizens who choose not to vote and keep them from being purged from voter rolls.” Barrow has also been critical of Kemp’s approach to election cybersecurity after Russian hackers targeted state election systems in 2016.

Kansas also moved to the center of the voting-rights wars after Kris Kobach’s election as secretary of state in 2010. He became a national spokesman of sorts for restrictive voting laws, championing the state’s strict voter ID law, successfully persuading Kansas lawmakers to let his office prosecute voter-fraud cases, and overseeing a dubious interstate anti-fraud program with a high false positive rate. His efforts to add a proof-of-citizenship requirement to the state’s voter registration forms sparked a legal showdown with the ACLU, which successfully argued in court that the move violated federal election law. A federal judge found Kobach in contempt earlier this year for failing to register voters who had been previously blocked by the requirement. Kobach is leaving the secretary of state post because he’s running for governor (and facing a strong challenge from Democratic state Senator Laura Kelly, who backed his voter citizenship bill).

It’s hard to imagine a sharper contrast to Kobach than Brian McClendon, the Democratic candidate to replace him. A former Google vice president who helped build Google Earth, McClendon is one of multiple Democrats running for secretary of state offices who have emphasized election cybersecurity. He’s also sketched out a less centralized version of the Kansas secretary of state’s office that focuses on voter access. “With appropriate leadership and support from the state, county elections staff are admirably effective at managing voting rolls,” he told the Topeka Capital-Journal in June. “And the office of the attorney general is better staffed and better qualified to handle law enforcement and prosecutions in the rare instances of voter fraud.”

Not every Republican secretary of state has tried to suppress voter participation, and not every Democratic candidate will have the power to carry out sweeping changes if they win next month. Some races, in fact, are referendums on expanding voter access rather than suppressing it: Michigan’s secretary of state contest is taking place alongside a major ballot initiative that would enact automatic voter registration, Election Day registration, no-excuse absentee ballots, and a slate of other reforms. Jocelyn Benson, the Democratic candidate, supports the initiative, while her Republican opponent Mary Treder Lang does not.

Secretary of state contests haven’t always been high profile or high stakes. But as the Supreme Court abandons its role as a guardian of voting rights, and the Trump administration ramps up its efforts to combat the illusory threat of voter fraud, this once-esoteric position could be a key bulwark in protecting Americans’ right to choose their own political destiny.

Hearing Secret Harmonies

Graham Greene famously observed that there is a splinter of ice in the heart of every writer, and certainly this is true of that generation of middle-class male novelists born in the decade before 1914. These were the men who, as one of them, John Heygate, remarked, were “too young to enter the war, too old to inherit the peace.” What they had in common was a deep-seated strain of melancholy verging on, and frequently lapsing into, a curiously unemphatic, almost whimsical form of despair. Heygate, a minor novelist and man-about-other-people’s-wives, killed himself in the 1970s, leaving instructions for his friends to have a lavish, celebratory meal after his funeral. Their outlook upon the world may have been bleak, but they did have style, those chaps.

Anthony Powell, the subject of Hilary Spurling’s elegant, affectionate biography Dancing to the Music of Time, was afflicted by recurring and utterly debilitating bouts of depression, one of the worst of which followed the discovery, after the event, of his wife’s adultery sometime in the 1940s. Nicholas Jenkins, the affectless narrator of Powell’s most famous work, the multivolume novel sequence A Dance to the Music of Time, published between 1951 and 1975, shares many of his creator’s own traits. He springs to something most closely resembling life on those occasions when he recalls an early love who betrayed him, and the torments of jealousy that he, like Powell, suffered because of her unfaithfulness: “I felt as if someone had suddenly kicked my legs from under me, so that I had landed on the other side of the room…with all the breath knocked out of me.” This is spoken, we feel, in a tone very close to Powell’s own.

ANTHONY POWELL: DANCING TO THE MUSIC OF TIME by Hilary Spurling. Knopf, 480 pp., $35.00

A Dance, as internal evidence suggests, and as Spurling frequently confirms, is Powell’s largely autobiographical account of the period just before what used to be called the Great War, through World War II, and into the 1970s. Or perhaps it would be more accurate to say that it is a fictionalized group portrait of certain people living in those times, since Jenkins, the narrator, figures as a version of Woody Allen’s character Zelig. He is a mostly passive participant in all the major events of the elaborately detailed plot, as he encounters dodgy aristocrats, artists manqués—in Jenkins’s world no one is ever quite first rate, and quite a few are hapless failures—aspiring politicians and expiring relatives, femmes fatales, money men and wastrels, peace-loving soldiers and warlike civilians: a latter-day Vanity Fair, in other words. Towering over all, in all his egregious awfulness, is the horribly memorable Kenneth Widmerpool, whose self-promoting machinations are among the forces that drive the novel sequence.

Powell himself, according to his biographer, considered the central theme of the sequence to be “human beings behaving.” Although it is hard to think what characters, like real people, might do other than “behave,” the most appealing quality of A Dance is the almost hallucinatory sense it conveys of real people performing real actions in a wholly realistic world. Powell had, as Spurling has him say of Shakespeare, “an extraordinary grasp of what other people were like.” As a novelist, he had an unusual ability to portray large gatherings of people, and he made the phenomenon of “the party” one of his specialties. His women are particularly convincing, while his best male characters are the louche and slightly disreputable ones. He is not as acute as Evelyn Waugh, the writer to whom he is most often compared, and is certainly not his equal as a stylist, but Powell is a far more disinterested writer than Waugh, and lets his characters reveal themselves in a wholly natural way that Waugh would not have been capable of. Waugh’s fiction always bears the artist’s stamp, whereas Powell’s work appears self-generated.

As an avid observer of the comédie humaine, Powell was drawn to John Aubrey, the seventeenth-century author of Brief Lives, a series of short, often comic, and sometimes scurrilous biographical sketches of many of the author’s contemporaries. “He contemplated the life round him as in a mirror,” Powell admiringly wrote in his biography of Aubrey,

He was there to watch and to record, and the present must become the past, even though only the immediate past, before it could wholly command his attention. For him the world of action represented unreality.

Powell, in his novels, took a similar stance in regard to his characters and the time they lived in. He catches perfectly the curiously languid pace of twentieth-century middle-class English life, which persisted even through two world wars, and which self-deluding Brexiteers vainly imagine can be reinstituted in today’s globalized world.

Powell’s own life spanned most of the last century—he was born in 1905 and died in 2000—and despite his urge toward self-effacement, it was in its way every bit as active, noteworthy, and odd as the lives that John Aubrey sketched, or as the fictional lives of the multitude of characters Powell himself invented over the span of his career. He was born in “one of 159 identical furnished flats in a set of five monolithic blocks” near Victoria Station in London. His mother’s people had been landed gentry in a small way, but they lost their modest fortune in the costly and vain pursuit of a peerage. Powell’s paternal grandfather, Lionel, settled near Melton Mowbray in Leicestershire and became a surgeon of sorts in order to finance his passion for foxhunting. He “relied throughout his career on the hunting shires around Melton,” Spurling writes, “for a steady supply of fresh fractures” to treat.

Philip Powell, Lionel’s son and Anthony Powell’s father, as a boy had been “blooded” by being smeared with the tail of a newly killed fox, an experience that, Spurling says, “seems to have inoculated him against the sport of kings ever after.” Still, he cannot have been entirely averse to bloodshed, since he became a career soldier. He was a “dashing young subaltern” of not quite 18 when he fell in love with his future wife, Maud, an impossibly young-looking 33-year-old, whose “banjo solos were a star attraction of the Ladies Mandoline and Guitar Band” in the 1890s. She had known Philip since he was a baby. The couple had to wait three years for Philip to reach 21 so that they could marry without the consent of his parents. The marriage was happy enough at first, but Maud soon became depressed by the conviction that she was looked upon as a cradle-snatcher. She shunned society and even her own friends and settled into a reclusive life, which, luckily, suited her husband—a life that Nick Jenkins, describing his own parents, renders with subdued pathos in The Kindly Ones, the sixth volume of A Dance.

These facts of his early life no doubt contributed to Powell’s lifelong diffidence and cool detachment from the lives going on around him, lives that he nevertheless tracked with the obsessiveness and detailed attention of a Nabokovian naturalist. Growing up “in rented lodgings or hotel rooms,” he was “constantly on the move as a boy,” and, Spurling proposes, he “needed an energetic imagination to people a sadly underpopulated world from a child’s point of view.” And these years perhaps shaped his view of himself as a keen-eyed outsider. A slight figure, with notably short legs, he used to represent himself in marginalia in his letters as a dwarf, complete with bobbled hat and bootees.

Powell’s father seems genuinely to have loved his wife, and probably loved his only child, too, but as the years went on he became a sort of second son to Maud, who it sometimes seemed “had to deal with two implacable infant male egos.” Indeed, accounts of Philip Powell’s character and behavior give a new and forceful meaning to the word irascible. In old age, when he was living in solitude in a seedy London hotel, the management, unable any longer to tolerate his impossible behavior, issued an ultimatum for his removal to a nursing home. Anthony’s wife, Violet, traveled up from their home in the country to break the news to the old boy, who, Spurling writes, “pre-empted alternative plans for his future by dying the same day.” At the funeral, Powell heard an explosion from a nearby quarry that sounded to him, so he told his son Tristram, “like Grandfather being received in the next world.”

Like many of the sons of English middle-class parents of the time, the boy Anthony—or “Tony,” as Hilary Spurling, who was a friend, calls him throughout her book—was dreadfully unhappy at school. He went first to the New Beacon School at Sevenoaks in Kent, where most of the pupils came from military families. He was lucky in making a close friend there, Henry Yorke—later to be the novelist Henry Green, another product of that melancholic prewar generation—who later described being offered “a stinking ham oozing clear smelly liquid, and boys so hungry they ate raw turnips and mangel wurzels” stolen from the farmers’ fields roundabout. For years after he left the school, Powell had recurring nightmares of being back there, until in his late 20s he dreamed he had killed the headmaster, which proved a curative.

From the Beacon, Powell went on to Eton, which seems to have been, if only by comparison, an improvement on what had gone before. Some of the housemasters there were interesting, and at least one of them, Arthur Goodhart, would find his way into A Dance as the restless, dim, and faintly sinister Le Bas, who is a mass of physical tics, has numerous passages of second-rate verse off by heart, and on one occasion is arrested by the police thanks to a prank played on him by Charles Stringham, Jenkins’s friend and one of the most vivid characters in the sequence. It was at Eton that Powell developed his interest in and talent for drawing, and from the beginning, according to Spurling, he “found his own natural habitat in the Drawing Schools on Keats Lane.” However, although his imagination was in many respects pictorial, he did not have the makings of an artist in this medium, and his drawing remained confined to caricatures and amusing sketches.

Throughout his life, though, he kept up the habit of assembling scrapbooks and murals, some of them considerable in size, style, and prolixity, from images cut out of newspapers and magazines—examples of these adorn the endpapers of Spurling’s biography—which might be speeded-up, manic versions of Poussin’s masterpiece A Dance to the Music of Time. The painting, which hangs in The Wallace Collection in London, and to which Powell returned again and again, shows a quartet of figures, three women and a man, engaged in a round dance to the music of a lyre played by the figure of Time, an elderly, naked man. Recalling in his memoirs his first view of the painting, he wrote, “I knew all at once that Poussin had expressed at least one important aspect of what the novel must be.”

One guesses this aspect to be what he referred to, with the urgency of italics, as “the importance of structure.” What Powell took from Poussin is a classically balanced coolness of style and treatment. Because one reads A Dance close-up, necessarily—it is after all a compelling, even a rollicking, narrative, except perhaps in the wartime sections, where, paradoxically, the pace slackens to a slow march—it is easy not to notice how tightly and expertly woven the tapestry is. Despite the claims of some of his more excitable admirers, Powell is a much lesser artist than Joyce, lacking Joyce’s stylistic exuberance and his determination to break out of the bonds of the traditional novel form. Yet he could with justice claim of A Dance, as Joyce did of Ulysses, that it is a triumphant feat of engineering.

Somewhat surprisingly, and unlike his friend and friendly rival Evelyn Waugh, Powell detested Oxford and chafed throughout his time at the college. Probably he was too solitary a soul—and too confirmed a heterosexual—to relish the jostling, sybaritic pleasures on offer in the City of Dreaming Spires in the interwar years. One night at dinner he made the mistake of confessing his distaste for college life to the legendary don Maurice Bowra, who was so shocked at the notion of anyone not venerating the alma mater that a rift was opened between the two men that was to last for 35 years. Spurling has no doubts that Powell’s happiest, or least unhappy, time at Oxford was his last year, when he was sharing rooms with Henry Green and they were discovering together, among other glories, Proust’s À la recherche du temps perdu, as each new volume appeared. Proust was, along with Poussin, a vital discovery of Powell’s younger years, the great exemplar who showed him what wonders, not only of narrative but also of style and form, could be achieved in the roman-fleuve.

Powell got out of Oxford as quickly as he could and went to work for the venerable and highly dysfunctional publishing firm of Gerald Duckworth & Co. Spurling’s pages on this period of his life, like the fictional version of it in A Dance, contain some of the most richly entertaining passages in the biography. Gerald Duckworth was a figure that Powell, or Waugh, would hardly have dared to invent. He “smoked foul-smelling cigars, enjoyed a bottle of claret a day over lunch at the Garrick, and was often half-tipsy in the office.” More pertinently, he hated books and, according to the head of a rival publishing house, he considered authors “a natural enemy against whom the publisher must hold himself arrayed for battle.”

In looks, Powell was no matinee idol, and to many he seemed cold, aloof, and arrogant, yet a remarkable number of remarkable women fell in love with him, or at least suffered him to fall in love with them. Few of his early affairs were satisfactory, until he met Violet Pakenham in the summer of 1934. The encounter took place at Pakenham Hall in County Westmeath, Ireland, seat of the Earl of Longford, who had inherited the title at 13 and later repudiated it, being an Irish nationalist to the extent of changing his name to Eamon de Longphort. The family was a sort of Irish version of the Mitford clan, though possibly a shade more eccentric, if such seems possible.

Powell was attending a house party at the Hall, not very happily; Christine Longford was also a novelist, and jealous, like all novelists, so that from the start the occasion was touched with a definite froideur. Powell was having his portrait painted by Henry Lamb, Edward Longford’s brother-in-law, who summoned his wife’s sister to keep the model from fidgeting. This indicates a touching naivety on the painter’s part, since Violet Pakenham was beautiful, intelligent, and a definite “catch”—Marion Coates, a girlfriend of Powell’s at the time, remarked wryly years later that she could quite see why he would throw her over in favor of the daughter of a belted earl.

After the sitting, Violet took Anthony outside to the kitchen garden and, in an Edenic gesture, conscious or otherwise, picked him an apple; that evening, as Violet wrote years later, there began “a conversation which has continued unabated until this day.” The marriage was long and happy, surviving even Violet’s secret affair with the man—Spurling has been unable to identify him—whom she described to Sonia Orwell as “the love of her life.” When Powell found out about Violet’s betrayal—probably in 1946, according to Spurling—“he plunged into a hole of depression, exhaustion and almost insane overwork.”

Powell was indeed a prodigious worker, who in his early years as a writer could read and review five or six books a week. Over his lifetime, he produced 19 novels, five volumes of memoirs and three of journals, a writer’s notebook, and, for good measure, two plays. As an artist, he probably lived too long: His work was largely done by the mid-1970s, when he published Hearing Secret Harmonies, the final volume in the Dance series, and when the world in which the series was set had largely disappeared. He spent the remainder of his life doing little more than tidying his desk, as Spurling tacitly acknowledges by wrapping up those years in an appositely titled, and decidedly perfunctory, 13-page Postscript.

Although all of Powell’s novels sparkle, if not all the time, his true achievement is A Dance. Even though it is probably not quite as good as many, including Powell himself, considered it to be, it will live if only through a handful of characters who have become emblematic of a milieu and a time. These include the toadlike Widmerpool; Charles Stringham, funny, fey, and doomed; Pamela Flitton, shamelessly based on the beautiful man-eater Barbara Skelton—who threatened to sue, but settled instead for advice on getting a novel published; the crafty, avaricious, and unforgettably awful Uncle Giles; and Jenkins’s lost love, the beautiful betrayer Jean Duport. These are, as Evelyn Waugh said of Captain Grimes in Decline and Fall, among the immortals.

And it was Waugh who paid Powell the best and certainly the most elegant tribute one novelist could bestow upon another. In an uncharacteristically warm and generous assessment of his friend’s masterwork, he wrote:

Less original novelists tenaciously follow their protagonists. In the Music of Time we watch through the glass of a tank; one after another various specimens swim towards us; we see them clearly, then with a barely perceptible flick of fin or tail, they are off into the murk. That is how our encounters occur in real life. Friends and acquaintances approach or recede year by year…Their presence has no particular significance. It is recorded as part of the permeating and inebriating atmosphere of the haphazard which is the essence of Mr. Powell’s art.

Despite inevitable flaws and weaknesses, Anthony Powell was a master of the traditional English novel form. As the dance of life proceeded around him, by turns gay and melancholy, he watched, he listened, he noted, with the most careful interest and attention. “Try,” Henry James, in his great essay The Art of Fiction, urged the tyro novelist, “to be one of the people on whom nothing is lost.” Anthony Powell was without doubt an artist of that rare kind.

The Democrats’ Incredible Shrinking Message
The Democrats’ Incredible Shrinking Message

In the summer of 2017, when the midterm elections were more than a year away but already on everyone’s mind, Democrats seemed to have an embarrassment of riches.

President Donald Trump was historically unpopular and engulfed in myriad scandals, from the tawdry (an alleged affair with a porn star, covered up with campaign funds) to the corrupt (using the presidency to enrich family businesses) to the existential (Robert Mueller’s investigation into the Trump campaign’s possible collusion with the Russian government to influence the 2016 election). Some of Trump’s cabinet members were similarly engulfed. His White House had become a reality TV psychodrama that not even Bravo’s producers could have dreamed up. And Congress, despite Republicans’ unified control of the government, was failing to accomplish much at all—including its years-long promise to repeal Obamacare.

Presented with so many gifts, Democrats’ only question was whether they should focus on one issue or try to synthesize them all into a single, winning message. “That message is being worked on,” Congressman Joseph Crowley, the number-four Democrat in the House, told the Associated Press. “We’re doing everything we can to simplify it, but at the same time provide the meat behind it as well. So that’s coming together now.”

It did not come together—not then, not ever. The midterms are less than three weeks away. The Democratic Party still hasn’t found its message, and the issues that many thought would feature prominently on the campaign trail—impeachment, Russia, corruption, #MeToo—have largely been relegated to subtext. But somewhere along the way, Democratic candidates around the country, almost in spite of the party’s dithering, have found the winning message themselves.

A year ago, if you were watching cable news—and not following the candidates—the major issues of the campaign would have seemed apparent.

The Russia inquiry had ensnared some of Trump’s top campaign and cabinet officials, including his former chairman, Paul Manafort, and national security adviser, Michael Flynn—both of whom are now convicted felons. And Trump’s firing of FBI Director James Comey suggested a possible attempt at obstructing the Russia investigation.

The president’s corruption, which he only barely seemed to hide, was underscored by the signing of a massive $1.5 trillion tax cut that will greatly benefit him and his family businesses. Trump’s administration, meanwhile, has been marked by ethics scandals and taxpayer waste. Health and Human Services Secretary Tom Price resigned after it was revealed he had spent more than $1 million on private flights, while EPA Administrator Scott Pruitt left the administration after spending hundreds of thousands on first-class travel.

Trump’s record on women—his well-documented history of misogyny, and the many allegations of sexual misconduct against him—was also believed to be a potent election issue, especially since it seemed to be driving the unprecedented number of Democratic women running for office. Brett Kavanaugh’s confirmation battle, which was fractious even before the emergence of allegations of sexual assault, only underlined the GOP’s vulnerability with women.

The number and breadth of these scandals created perhaps the biggest debate in Democratic circles over the past year: whether the party should pin its 2018 hopes on promising to impeach Trump. A few House Democrats support the idea, but party leaders have danced around the question—a recognition, perhaps, that the issue could hurt the party in November. While a significant majority of the party’s base (and megadonor Tom Steyer) support impeachment proceedings, polls consistently show that fewer than half of all Americans do.

Though cool on impeachment, the Democratic Party has repeatedly grasped for a similarly compelling, unified message. Its first attempt, unveiled in July 2017, was the well-conceived, poorly received “A Better Deal,” which stated that the party’s mission was “to help build an America in which working people know that somebody has their back. American families deserve A Better Deal so our country works for everyone again, not just the elites and special interests.” The message went nowhere. Almost exactly a year later, the party rolled out the even more milquetoast “For the people.” Most Democratic lawmakers, if put on the spot today, likely could not explain the three main issues the message represents.

This has led to some familiar Democratic anxiety. Writing in The Atlantic in August, former Democratic Congressman Steve Israel described attending a campaign fundraiser in “a plush residence on the 64th floor of Trump World Tower,” where “most in the crowd wanted to know one thing: What’s the Democratic message?”

“There, in a building staffed with uniformed doormen, standing on floors so fine that we’d been asked to remove our shoes, the donors demanded to know why their party had no unifying theme. Or, more precisely, why wasn’t the message the specific message that they wanted messaged?” he continued. “These questions have come up at Democratic gatherings across the country this year, from grassroots fund-raisers to posh weekend retreats.”

Israel argued that “Democrats have it wrong that they need a national-message template in the first place. Past elections have shown that the most effective messaging is local and specific to each district.” This year’s election seems to be proving this true, or at least Democratic candidates are campaigning as if it is. By and large, they are running on a single issue. It’s not impeachment or collusion or corruption or #MeToo; it’s not even specific to Trump. The election, for many Democrats, is all about health care.

“The top three issues this year are health care, health care, health care,” J.B. Poersch, the head of the Democratic Senate Majority PAC, told CNN last week. Candidates across the country, from Cindy Axne in Iowa to Claire McCaskill in Missouri to Josh Harder in California, are talking about their own struggles dealing with the high cost of medical care. West Virginia’s Joe Manchin, the lone Democratic senator to vote to confirm Kavanaugh, is leading in the polls in his state, thanks in large part to his embrace of Obamacare, which he even made an issue during the most recent Supreme Court confirmation.

Republicans are following suit, even those who voted to repeal the Affordable Care Act in 2017. Republican Martha McSally, who is running to fill Jeff Flake’s Arizona Senate seat, has campaigned on protecting coverage for pre-existing conditions, despite voting for the AHCA, which would have repealed the ACA, last year. In a debate on Monday, she told voters, “We can’t go back to where we were before Obamacare.”

Trump’s most significant legislative accomplishment, the $1.5 trillion corporate tax cut passed last December, has also factored into Democratic messaging—partly to highlight the hypocrisy of Republicans’ deficit hysteria during the Obama years, but also as another way to discuss health care.

Journalists and politicians talk about “the health care repeal and the Trump tax plan as two different issues,” Democratic consultant Jesse Ferguson told CNN back in May. But “the voters see them as ways Washington isn’t looking out for them.... On both of them, it’s basically the same: [Republicans] have been giving tax breaks to health insurance companies, to pharmaceutical companies, and those come at the expense of people who work for a living. It means higher health care costs, eventually higher taxes, more debt for your kids, and cuts to Social Security and Medicare as you get older.”

After Mitch McConnell said on Tuesday that entitlement cuts to Medicare, Medicaid, and Social Security are the only way to reduce the deficit, Democrats immediately sent out emails tying his statement to the tax cut.

Heads up for the "Dems have no national message" people: The McConnell comments on entitlements are driving today's state Dem messaging. pic.twitter.com/OzCvauPMax

— Dave Weigel (@daveweigel) October 16, 2018

It’s possible, of course, that Democrats are focusing on these issues in part because they don’t have to draw more attention to the president’s scandals, which already dominate cable news. The party’s fear has long been that its message, whatever it may be on a given day, would be drowned out by all things Trump. But the wall-to-wall coverage of Trump may in fact be helping Democrats. His scandals are now inevitably woven into the fabric of the 2018 campaign, such that Democratic candidates don’t need to go hoarse talking about Mueller or Trump’s tax returns or Stormy Daniels; voters are already motivated one way or another by those matters. Instead, Democrats can spend their time hammering the single issue that Republicans are most vulnerable on—which also happens to be the issue that voters care the most about.

The Mistake Countries Repeatedly Make When Dealing With the EU
The Mistake Countries Repeatedly Make When Dealing With the EU

“There isn’t even a one in a million chance that Merkel will say no.” These were the words of Alexis Tsipras shortly before becoming Greece’s prime minister in 2015. He was talking about his alternative negotiation proposals for Greece’s European Union bailout agreement—clearly in the EU’s interests, from his perspective, and vastly more palatable for Greece.

Populist politicians excel at presenting an effortless route to a rosy future. In the run-up to the UK’s referendum vote in 2016, Boris Johnson—then London mayor, now ex-foreign minister, and perennial prime ministerial hopeful—expressed his certainty that, after Brexit, the EU would surely agree to a tariff-free trading deal, just like the one the UK enjoyed by being in the EU: “Do you seriously suppose that they are going to be so insane as to allow tariffs to be imposed between Britain and Germany?” Both Tsipras and Johnson saw the future negotiation results as a done deal, bound by the EU’s own economic interests. All the people had to do was vote the right way.    

Greece found out the hard way that simply voting for the future deal you want isn’t enough when negotiating with a transnational institution representing 27 other democracies. It also found out that Tsipras and his government had been too arrogant in claiming to know what was in the EU’s interests. Angela Merkel, Germany’s chancellor, was not so taken by Tsipras’s alternative proposals, and insisted instead on the “austerity and reforms” recipe that Greece had been following.  

Meanwhile, at the EU summit on Wednesday, more than two years after the UK voted for Brexit, and with less than six months until its deadline for leaving the EU, the country is still negotiating its withdrawal agreement and the terms of the future EU-UK relationship. The UK’s Brexit vote resembles Greece’s vote for Tsipras not only in terms of the political naiveté of the populist promises, but also in terms of the strategies pursued in the subsequent negotiations with the EU. Both the Conservative government and the Labour opposition in the UK seem to be repeating Greece’s mistakes. The result is unlikely to be any more successful.

Prime Minister Theresa May is wedded to the withdrawal proposal her government arrived at after much deliberation: the so-called Chequers plan. May sees this as the best compromise possible, given the internal politics of her own government. Pro-EU members of parliament (MPs) want as close a relationship with the EU as possible, whereas pro-Brexit MPs want a relationship that gives the UK as much autonomy as possible. As a result, the Chequers plan is a patchwork that has been described by two confectionery analogies: “having your cake and eating it” and “cherry-picking.” It involves enjoying some of the benefits of EU membership, such as the free movement of goods, but without many of the commitments that come with it, such as the free movement of people.

The problem with Chequers is that the EU has signalled many times, including in a mocking Instagram post by EU president Donald Tusk, that it will not accept it. This has led to a stand-off. After the Salzburg summit last month, when the Chequers plan was once again rejected by the EU, May restated that Chequers was the only credible proposal the UK was prepared to make, and that the EU should offer an alternative if it could not accept it. Of course, the EU has offered the UK the alternatives: membership of the European Economic Area (EEA), or a Canada-style trade deal. May in turn finds those proposals unacceptable. Staying in the EEA would involve the continued freedom of movement of people, and a lack of autonomy when it came to striking new trade deals, making a mockery of the Brexit referendum result, according to May. A Canada-style deal, on the other hand, would amount to a trading relationship between the UK and the EU involving customs checks at the borders. In order to respect the Good Friday Agreement’s peace-promoting guarantee of no border between Northern Ireland (part of the UK) and the Republic of Ireland (part of the EU), however, those customs checks would have to take place somewhere between Northern Ireland and the rest of the UK, undermining the country’s unity. It would also profoundly destabilize May’s government, which is propped up by the votes of MPs belonging to the DUP, a Northern Irish party, who would not tolerate such a solution.

May seems to be engaged in a game of chicken. The assumption is that the EU would have just as much, if not more, to lose from a no-deal Brexit. Hence May’s mantra: “no-deal is better than a bad deal,” aimed at convincing the EU she is not afraid of the former. The similarities with Greece’s negotiating strategy are striking. After the 2015 elections, the leftist Syriza-led Greek government’s negotiation tactic to gain more favorable bailout terms was based on the assumption that if the sides couldn’t agree, Greece’s inevitable exit from the Eurozone would be even more damaging for the EU than for Greece. The EU called Greece’s bluff, and the Greek government eventually capitulated.

May’s game of chicken will probably end in a similar way. The EU is unlikely to budge; 27 countries, speaking with one voice, and with less to lose in the case of no deal, are in a much stronger position than a single country with a divided government, one that could face serious shortages if its EU trade is disrupted. And even if the EU were to back down, accepting some version of Chequers, the deal would have to be approved by parliament. Labour and pro-EU Conservative MPs have said they would vote such a plan down. On the other hand, if May accepts a variation of the Canada-style trade deal on offer from the EU, the same MPs, including the DUP, will again probably reject it in parliament. This all makes a no-deal result seem rather likely. At that point, it is unclear how May’s government can continue, and a Conservative party leadership contest, a general election, or a second referendum becomes plausible.

This brings us to the strategy of the opposition, the Labour party. Its plan is that if enough pro-EU Conservative MPs vote down the deal May brings to parliament, a general election could be triggered, which Labour assumes it would win. If that happens, the argument goes, Labour would proceed to renegotiate the terms of withdrawal from the EU, and get a better deal than May. This was in fact also Syriza’s and Tsipras’s plan when they ousted the prior government and came into power in January 2015. They believed that their fresh democratic mandate gave them the opportunity to start negotiations from scratch—only to be told by the EU that the Greek bailout was a national affair greater than any one party, and that the country was bound by what had already been discussed and agreed to.

A hypothetical Labour government in the UK would likely face a similar predicament. Labour would again have to choose between variations of the currently available options: staying in the EEA, or going for a Canada-style trade deal. More importantly, unless there was a pause to the withdrawal process, and the UK remained within the EU for longer than originally planned, there wouldn’t be enough time to re-negotiate. The Brexit date is March 29, and even if a general election took place in January, a couple of months is hardly enough time.

The UK is suffering from the same illusions as Greece was in its negotiations with the EU: overestimating its power, and believing that a shift in domestic politics, either by a change of government or Prime Minister, can yield a better result. Brexit, ironically, was sold by the likes of Johnson as a way for the UK to gain sovereignty and autonomy. It was also sold on the promise of continuing to trade with the EU as if nothing had changed—all that was needed was the right democratic mandate. Instead, what the UK is finding out is that in leaving the EU, it has much less say over its future than it did before. A single national democratic mandate is not all powerful against the interests of several other democracies organized together—a valuable lesson for those who advertise a return to national politics in a supranational era.

The New DNA Paradigm
The New DNA Paradigm

Big cultural changes happen slowly, then all at once. This summer, the Golden State Killer, a serial rapist and murderer, was identified through the search of a third-party consumer genomics service called GEDmatch, which turned up one of his distant relatives. The hit was no fluke: Science reports that the commercialization of genomics has grown so much that around 60 percent of Americans with European heritage could be linked to a relative through the databases of companies like 23andMe. On Monday this trend entered the political sphere, with Senator Elizabeth Warren announcing, in refutation of President Donald Trump’s skepticism, that her DNA shows “strong evidence” of Native American ancestry some six to ten generations ago.

These different but related news items tell the story of DNA science’s trajectory from the academic peer-reviewed realm, to the hands of law enforcement, to the broader culture of at-home genetic testing. For years, DNA has largely been considered part of an invisible, mysterious realm that experts can dip into as needed: to identify criminals, to screen for disease. But that paradigm is giving way to a new one. Now that so much of our genetic information is stored in databases, linking us all to each other, it turns out that DNA technology is not a neutral arbiter of truth. Rather, it exerts its own influence and can be used to enhance the power imbalances that exist in this country.

Since the first American was convicted using DNA evidence—Tommie Lee Andrews, for rape, in Florida, 1987—nearly 400,000 cases have concluded the same way, according to the FBI. Sixteen million Americans have their DNA stored in a law enforcement database. Meanwhile, 15 million people around the world have had their DNA analyzed by a direct-to-consumer (DTC) genetics company like MyHeritage or 23andMe. The way law enforcement authorities and 23andMe process DNA is different (the police only do routine tests, enough to match two samples, while DTC companies use a process called genotyping to define which genetic variants a person possesses). But these two worlds are starting to merge. In 2015, a 23andMe transparency report revealed that law enforcement agencies had requested access to the company’s genetic database, but had been denied. And with the capture of the Golden State Killer, the overlap between law enforcement’s priorities and the “fun” commercial aspect of genetic testing has become clear.

This is cause for concern not just because it represents a potentially vast infringement of privacy; it also could reinforce existing biases within the law enforcement system, whose use of genetic data is skewed. A 2011 study in PLOS Medicine showed that “[f]orensic DNA databases are growing to mirror racial disparities in arrest practices and incarceration rates.” As prison populations have grown, they have been accompanied by a “dramatic shift” in their racial proportions, as African Americans and Latinos have been disproportionately targeted by drug-focused policing. So, the authors observed, it follows that law enforcements’ DNA databases mirror those unequal incarceration rates.

Now that police can use open-access genetic databases, they could potentially introduce racial bias to information that originated with unwitting consumers. A person can upload their results to a third-party service like GEDmatch, where it can lead police to their relatives. These are public concerns, and the answers are clouded by the sheer vastness of the numbers and the secrecy around law enforcement’s process for sifting through them.

Then there are the ethical problems already at play in the DTC companies’ work. This July, 23andMe sold its consumers’ data for $300 million to GlaxoSmithKline, for the purposes of medical research. On the scientific level, this makes sense: You need a huge corpus in order to study genes across the population. But for many, the case recalled the story of Henrietta Lacks, whose own DNA was used in research without her consent. In 1951 her tissue was removed without her knowledge, and used to create an extremely profitable medical industry. She died, and her husband and five children were left in poverty, never seeing any of the benefits that Lacks’s cells brought to others.

The other important connection between race and the DTC DNA testing kits is more subtle and psychological. There are many African American users who test themselves to connect to their history. 23andMe has encouraged them to do so, via its “African American Sequencing Project.” Slavery and imperialism have severed countless African Americans from their deeper origins, and there has been a therapeutic benefit to many from the material proof of their family’s existence. In 2016 Cara Rose DeFabio wrote an insightful piece about black 23andMe users confronted by painful truths lying hidden in their own genes.

The irony of black users’ data being used for medical research, then, is painful. That breaches in security have led to their genetic data becoming accessible to police via third-party services is more painful still.

The intersection between race and medical technology, especially genomics, lies behind the scandal of Elizabeth Warren’s own genetic disclosure. In asserting that she has Native American “blood,” she implies—although she explicitly claims otherwise in the promotional video—that Native identity has something to do with DNA. In fact, it doesn’t. Tribes are free to determine membership as they choose, but none uses DNA testing as a membership standard. As Professor Kim TallBear of IndigenousSTS put it in a statement released on Twitter, Warren “focuses on and actually privileges DNA company definitions in this debate, which are ultimately settler-colonial definitions of who is Indigenous.” In other words, Warren “proving” that she is Native with her DNA undermines the real practices of Indigenous people and imposes an oppressive standard upon them.

Warren has effectively bought into a definition of Indigenousness that Trump established when he challenged her to prove her Native identity. Ironically, she has reinforced a medical model of ethnic “purity” reminiscent of eugenics. Many people have lamented the way that the Warren debacle has plunged our discourse on race back into the laboratory.

we really needed race science to return. there’s about to be craniometers at target.

— doreen st. félix (@dstfelix) October 15, 2018

In the medieval and early modern periods in Europe, explanations for the difference between people’s skin color tended to rely on either the Bible or simplistic environmental theories, like the idea that the sun burns people black. But in the 19th century European thinkers turned their attention to new models of race taxonomy. Scientists in the burgeoning field of anthropology measured skulls, weighed brains, and performed other kinds of physical measurements to “study” nonwhite people. These practices were always implicitly colonialist, because they posited whiteness as a norm, and became explicitly so in the colonial context. And of course “scientific racism” of this kind would be appropriated by the Nazis in the 20th century (even though many Victorian thinkers believed Jewish people to be superior), which is symptomatic of the extreme political pliability of the “medical” model of differentiating race.

This is where the early history of “race science” loops back around to our present moment. Trump has shot back at Warren, saying that he will only trust a DNA test that he personally administers. Just as in the colonial era, every test can be ruled as definitive or bunkum by whoever has the most power. There turns out to be no single “truth” about the information in Warren’s genome: just a process of deferral to the person who can speak the loudest.

Genomic science has saved countless lives, and given us miraculous insight into matters of the human body. But there is no neutral knowledge; it always has to exist in a flawed world. Now that genomics has become ubiquitous, it has taken on a very powerful politics. It has already become the territory of commerce; of law enforcement; of electioneering; of the biggest, oldest problems around race and identity in the United States. So much information on so many people is now stored in databases that we have reached a tipping point: Do we have control over DNA data, or does it have control over us? Our world is increasingly ruled by those who control such information. Our bodies are our own, but the data is not.

A Death Sentence Over a Cup of Water?
A Death Sentence Over a Cup of Water?

“To pardon or overturn the verdict against Asia Bibi, self-confessed blasphemer is the commission of blasphemy itself and is crime against Islam and the Constitution of Pakistan.” So read a handout distributed by the hardline Islamist group Tehreek-e-Labbaik Pakistan in rallies all over Pakistan last week. The group threatened to paralyze the country with protests if the mother of five were to be exonerated by Pakistan’s Supreme Court, with members of the group dispatched to all the major areas of the country.

Going by the laws of evidence and due process, Asia Bibi should be freed rather than put to death as ordered in 2010. Stemming from a dispute over a drinking cup, the case has huge evidentiary holes, violations of due process, and factual fabrications. And as it has proceeded to the Supreme Court in Pakistan, it has become an emblem of how longstanding hatreds and vague laws have enabled minority persecution.

The story began in the small village of Katanwala, in an area known as Nankana Sahib, which stands 30 miles from the Pakistani city of Lahore. There, on the afternoon of June 14, 2009, four women working in the fields got into a terrible argument over a drinking cup. Asia Bibi, the only Christian among them, allegedly grabbed the communal cup and drank from it before the other three could do so. The others claimed she had “contaminated” the cup and that they should have been permitted to drink first. The argument escalated and more fieldworkers gathered. In an interview with BBC Newshour, Bibi’s daughter recounted how she ran to get her father. When they returned, however, Bibi had already been taken away. Within days a blasphemy case had been registered against her by the village cleric, who additionally claimed she had “confessed” to the crime.

The question of drinking order is a vestige of the Hindu caste system that has lingered in the area even after most of the population converted to Islam over a hundred years ago. Christians, believed to be converts from the lowest classes of Hinduism, continue to be treated as untouchables in parts of Pakistan. For high caste Hindus, using the same utensils as someone from a lower caste represented contamination or impurity. It seems the women in the field with Asia Bibi on that ill-fated June day believed this as well.

The case seemed tailor-made for hardline parties looking to mobilize communities against religious minorities. Similar recent blasphemy cases have been brought against Shia Muslims and members of the Ahmadiyya sect. The country is rapidly transitioning from a mostly rural to mostly urban milieu. With caste and status in flux, clinging to some imagined superiority based on religion can be an attractive prospect—even if it only confers the privilege of drinking from a cup before a Christian.

The blasphemy law itself has been criticized even by Islamic scholars, who have pointed to its vaguely worded text. But the law has become the signature issue of hardline groups who oppose any change they see as weakening the Islamic nature of the Pakistani state. Tehreek-e-Labbaik have deployed themselves as watchdogs and vigilantes, supposedly policing the country against blasphemers. In another incident several months ago, they staged a protracted sit-in at one of the major intersections in Islamabad, Pakistan’s capital city, paralyzing traffic for months, because the government had surreptitiously removed the name of the Prophet Muhammad from the oath of office. The government capitulated, saying that the altered oath had been a “mistake.”

The new government installed after the election this summer has also shown susceptibility to hardline Islamic pressure. A little over a month ago, Prime Minister Imran Khan removed Princeton economist Atif Mian from his Economic Advisory Council because the latter belongs to the Ahmadiyya sect, which does not believe that the chain of prophets ended with Muhammad. Mian’s expertise in the area of debt and credit restructuring is sorely needed as Pakistan lobbies for an IMF bailout. But Khan, who had already expressed his support for the blasphemy law as he wooed Islamists during his election campaign, removed him anyway.

With a public that has increasingly championed the death penalty and cheered its resumption following a seven-year moratorium that ended in 2015, and a Prime Minister beholden to the very people who want her dead, Asia Bibi can only rely on the Supreme Court itself. The lawyers and judges have all faced intimidation from the hardliners who are issuing threats, insisting that those who exonerate Asia Bibi will be blasphemers themselves. The three male justices deciding her case heard arguments from both sides on October 8, and while they seemed interested in the way the witness statements contradicted each other, and in the fact that the male cleric who had filed the case was not actually present when the altercation took place, there were few clues as to which way the court leaned. At the end of the proceedings, the court said it would “reserve” the verdict. Pakistani media were told to refrain from discussing the case, a directive most have adhered to in recent days.

Aside from the threat of protests, a more sinister shadow hangs over the proceedings. Two politicians—the former Governor of Punjab, Salman Taseer, and a Federal Minister for Minorities, Shahbaz Bhatti—were gunned down in 2011 for supporting Asia Bibi’s innocence.

The Supreme Court has made bold rulings before, for example the one in 2017 that ended Prime Minister Nawaz Sharif’s time in office—he was subsequently convicted of corruption. And there is genuine concern in Pakistan about the international reaction to putting a woman to death over a dispute centered on a drinking cup.

Asia Bibi’s case may have begun as a macabre mélange of class, caste, and religious persecution, but it has quickly become a gendered narrative as well. Most if not all Tehreek-e-Labbaik members are men, as is Khadim Rizvi, its leader. Hers would be the first case ever of a woman being put to death for blasphemy. The power now lies in the hands of the all-male bench of the Supreme Court that heard her case, after two male lawyers presented their arguments.

The country’s Supreme Court has shown it can stand up to politicians. Now is its chance to show it can stand up to the mob, and the deeply ingrained prejudices mobilizing it. At the heart of the case is, quite simply, a woman—and a large number of people who want her dead. 

How Obamacare Became a Winning Issue
How Obamacare Became a Winning Issue

In 2009, when Barack Obama traveled to Bristol, Virginia, for a town hall to promote the Affordable Care Act, his motorcade passed a small but turbulent protest. I was raised just outside this small Appalachian city, and even then, three years after graduating from high school, I knew it desperately needed health care reform. At the time, according to data compiled by the Urban Institute, almost a fifth of Bristol’s residents under age 65 had no health insurance—one of the highest rates in the state. And yet, when Obama arrived, people greeted him with signs that read SOCIALISM ISN’T COOL and OBAMA: “GOD” DECIDES LIFE AND DEATH NOT YOU OR NATIONAL HEALTHCARE.

Six months into his first term, Obama was facing this kind of opposition not just in Bristol, but nationwide, even in districts he’d won the previous fall. Alarmed by conservative talk radio hosts and the constant harping of an intransigent Republican Party, many Americans believed that the ACA would rip apart the fabric of American life. “No one should be surprised at the coming embrace of euthanasia,” conservative columnist Cal Thomas warned.

Despite the opposition, Democrats were promising that Obamacare would eventually boost their chances in elections, as Americans gradually came to see the benefits of the law: It made sure that preexisting conditions were no longer cause for discrimination, and gave people with diabetes, cancer, and other serious conditions a chance to afford health insurance. “As people learn about the bill, it’s going to be more and more popular,” Senator Chuck Schumer said in March 2010. “By November, those who voted for health care will find it an asset, those who voted against it will find it a liability.”

What Schumer predicted never happened, at least not that year. A few months later, the GOP picked up 63 seats in the House and six in the Senate. A study published the following year estimated that at least 13 House Democrats lost their seats because of their support for the law. With the Tea Party sweeping into office, the ACA threatened to drag Democrats down.

This year, the prevailing attitude toward the ACA has changed. In West Virginia, in a September campaign ad, Senator Joe Manchin, perhaps the most conservative Democrat in the Senate, blasted a paper copy of a lawsuit challenging the ACA with a rifle. In Ohio, Democratic gubernatorial nominee Rich Cordray pledged to protect the state’s ACA Medicaid expansion from Republican interference. And in Wisconsin, Democratic gubernatorial candidate Tony Evers has repeatedly attacked Republican incumbent Scott Walker for joining a multistate lawsuit opposing the ACA. “If you want to protect the millions of Wisconsinites with a preexisting condition, drop Wisconsin from this lawsuit,” Evers said in September.

It’s not hard to see why Democrats are now eager to align themselves with Obamacare. Last year, for the first time, Gallup reported that a majority of Americans viewed the law favorably. This past March, the Kaiser Family Foundation released another poll that put public support for the law at 54 percent, the highest it’s been since 2010. Nationally, confidence in the Democratic Party’s ability to help solve health care is at its highest level since 2006, the last time there was a blue wave in the midterms.


Republican rhetoric suggests they understand this. It’s astonishing to watch a candidate like Michigan’s Republican attorney general, Bill Schuette, frame himself as a champion of government health care. He campaigned on repeal and replace in 2010, and as attorney general repeatedly joined lawsuits intended to strike down key provisions of the ACA. Seven years later, with about 660,000 Michigan residents enrolled in the state’s Medicaid expansion program, Schuette, who is running for governor, no longer opposes the plan. Similarly, in Virginia, some Republicans revolted against their party leadership in May, joining Democrats to pass a bill that expanded Medicaid under the ACA in the state. Some were coalfield Republicans, representing Virginians not far from Bristol’s 2009 picket.

Conservatives are partly responsible for the shift. For the better part of a decade, they promised to “repeal and replace” Obamacare, if voters only gave them the chance. But even with a unified government, the party failed—twice. Ironically, their attacks on the ACA may actually have convinced voters of its importance. Last March, as Republicans made their first attempt to replace the ACA, only 38 percent of independents supported the law. Over the next two months, with Republicans sniping at one another on television, these swing voters grew more uneasy about the idea that they might actually lose the ACA. By May, support for the law among independents had ticked up 10 points. The ACA, once a “radical” proposal, had become something more familiar, perhaps even reassuring, to swing voters. And in flippable districts like Wisconsin’s 1st, Virginia’s 7th, and New York’s 19th, that could be the difference between Democrats taking the House in November or not.

Now that Obamacare is finally winning majority approval, many Democrats are pushing for something more ambitious: Medicare for All. Popularized by Senator Bernie Sanders in 2016, the plan, a version of single payer, has growing support, but the polling is more complicated. Some polls show it with roughly the same support as the ACA, others with more, a few with much less.

Republicans, sensing that the numbers are mixed, have tried to capitalize. Internal Republican polling, reported by Axios in September, asserts that attacks on Medicare for All are “the best-performing message” with key demographics, including seniors and suburban women. Andy Barr, the Republican incumbent in Kentucky’s 6th Congressional District, has claimed that his Democratic challenger—charismatic former Marine pilot Amy McGrath—would end “Medicare as we know it,” if she were able to push Medicare for All through Congress. Similarly, as Dave Weigel of The Washington Post has reported, Representative Dave Brat of Virginia, a Tea Party favorite who unseated the powerful Eric Cantor in 2014, said in a recent ad that his Democratic challenger, Abigail Spanberger, would “bankrupt Medicare as we know it.”

Facing these attacks, Democrats may wonder if they should simply make a public show of affirming their support for the ACA and leave it at that. But voters know, after almost a decade living with the law, that it did not create “communist death panels” or lines like those at the DMV. Increased government involvement in the provision of health care—in this case, through subsidies for private insurance, and in some states, the expansion of Medicaid access—does not create shortages of doctors or overcrowded hospitals, and voters understand that now. While Medicare for All isn’t yet a guaranteed winner in all the districts Democrats need to control the House, these are encouraging signs that it, like the ACA, may one day be accepted law, with a radical reputation in the rearview mirror. In the meantime, Democrats shouldn’t abandon Obamacare now that it finally works for them. Instead, they should embrace the law for what it has revealed itself to be: proof that progress is possible.

Correction: A previous version of this article incorrectly identified Bill Schuette as Michigan’s governor. We regret the error.

The Uncertain Fate of Affirmative Action
The Uncertain Fate of Affirmative Action

“It remains an enduring challenge to our nation’s education system to reconcile the pursuit of diversity with the constitutional promise of equal treatment and dignity,” Justice Anthony Kennedy wrote in a Supreme Court opinion two years ago. The University of Texas, he and a majority of justices concluded, had met this challenge with its admissions policies. The 4-3 ruling in 2016’s Fisher v. University of Texas effectively meant that American universities could lawfully consider racial diversity when admitting new students as long as Kennedy, the court’s swing justice, remained on the court.

Now Kennedy is gone, and with him, a fifth vote on the Supreme Court to uphold the constitutionality of such admissions policies. Justice Brett Kavanaugh’s confirmation all but guarantees that the court will revisit the issue in the near future. His presence may also give the court’s conservative wing the votes it would need to chip away at 40 years of precedents affirming that American higher education has a compelling interest in ensuring a diverse student body.

Harvard’s legal battle over its own admissions practices began before Kennedy retired. But his departure raises the stakes even higher as the Ivy League university’s own case goes to trial this week in Boston. Harvard is defending its policies for admitting new students from a lawsuit brought by a group of Asian-American applicants who say they were kept out by an informal quota system at the school. Though many aspects of the case are unique to Harvard’s quirky system for choosing new students, it could still eventually give the high court the opening it needs to make far-reaching changes.

For years, the face of the movement to curb affirmative action was a young white woman. Abigail Fisher kicked off a years-long legal battle with the University of Texas after it denied her application to UT-Austin’s 2008 freshman class. Representing her in court was the Project for Fair Representation, a conservative legal organization that specializes in challenging the legislative victories of the civil-rights movement.

The organization’s founder, Edward Blum, is a not a lawyer. But he has an uncanny knack for bringing momentous cases before the Supreme Court. He orchestrated the successful legal campaign to gut the Voting Rights Act of 1965, which culminated in the court’s 2013 ruling in Shelby County v. Holder. The 5-4 decision struck down Congress’s formula for determining which states had to seek federal approval before changing their voting laws. Though Chief Justice John Roberts claimed the nation had moved on from the Jim Crow era, the ruling sparked an immediate surge in voter suppression in Republican-led states across the country.

In 2016, the Project for Fair Representation also urged the Supreme Court to rethink how it enforces the “one man, one vote” principle. The organization’s lawsuit in Evenwel v. Abbott tried to compel states to apportion their legislative districts by voting population, not by total population. Had it succeeded, the nation’s whiter and more rural regions would have seen a tremendous boost in legislative power at the expense of urban areas with more diverse communities. Instead, the Supreme Court unanimously rejected the proposition.

That same year, the justices rejected Blum and Fisher’s bid to strike down the University of Texas’s method for accepting new students. But the case hinted at another way forward. In a dissenting opinion, Justice Samuel Alito suggested that the university’s methods increased black and Hispanic representation in the student body at the cost of Asian-American representation. “In UT’s view, apparently, Asian-Americans are not worth as much as Hispanics in promoting ‘cross-racial understanding,’ breaking down ‘racial stereotypes,’ and enabling students to ‘better understand persons of different races,’” he wrote, quoting from a brief filed by an Asian-American legal group.

The Harvard lawsuit was brought in 2014 by Students for Fair Admissions, a nonprofit organization also established by Blum. It represented a group of Asian-American applicants who claimed that the school’s admissions policy violated federal education laws by setting a de facto racial quota through its evaluation of students’ personalities. At the time, Harvard and other Ivy League schools were under growing scrutiny for declining admission rates for students of Asian descent.

“Harvard and other academic institutions cannot and should not be trusted with the awesome and historically dangerous tool of racial classification,” the group argued in its complaint. “As in the past, they will use any leeway the Supreme Court grants them to use racial preferences in college admissions—under whatever rubric—to engage in racial stereotyping, discrimination against disfavored minorities, and quota-setting to advance their social-engineering agenda.”

Students for Fair Admissions’ lawsuit isn’t without its critics in the Asian-American community, some of whom have expressed concern that their situation is being used as a vehicle to undermine efforts to help other disadvantaged communities. “My stance on affirmative action is a general reminder to the rest of America—and especially to Edward Blum—that I, along with so many other Asian Americans, refuse to be tools of white supremacy, and that we stand in alliance with all communities of color,” Thang Q. Diep, a Harvard student, said at a rally on campus against the lawsuit on Sunday.

If the lawsuit or another one like it gives the Supreme Court an opportunity to strike down race-conscious admissions programs, colleges and universities may have to turn to alternative measures to ensure a diverse student body. One of the most controversial options would be to focus on socioeconomic factors instead of race. In 1996, California voters passed Proposition 209 to ban the state from considering race, gender, or ethnicity in higher education and certain other spheres. California’s state university system responded with sweeping changes to its admissions policies to offset the effects.

“Schools have reduced their reliance on standardized test scores for admissions, banned legacy preferences for the children of alumni, encouraged more community-college transfers to four-year institutions, and created new outreach programs to high-poverty high schools,” Richard Kahlenberg, a senior fellow at the Century Foundation, wrote in 2014. “In part because of these efforts, UCLA and UC–Berkeley are far more socioeconomically diverse than most selective colleges.” This, he argued, bolstered the higher-education system’s ability to encourage social mobility.

Another proposal is to rethink the admissions process entirely. The Atlantic’s Alia Wong wrote earlier this year that some experts are suggesting that elite schools could use a lottery system of sorts to fix structural issues in the current process. “To continue to promote diversity, the school could give extra weight to certain applicants depending on, say, their zip code, the kind of high school they attended, their income, and their race,” Wong explained. “Then admissions officers could use those criteria to whittle down their batch of 40,000 applicants to a much smaller pool of qualified contenders and from there select the final 2,000 or so through a lottery (not everyone who’s admitted attends).”

Neither of these alternatives is likely to produce as racially diverse a student body as the measures currently used by the nation’s top colleges and universities. The most effective tool may be race-conscious admissions policies in one form or another. But with the Supreme Court’s disapproval almost certainly imminent, more creative options may be needed to achieve the same goals.

Hacker News
Remix is hiring Software Engineers to build better public transit and cities
Founder’s Guide to the Y Combinator Interview
Micro.blog
Skills and education (2007)
Spatial: A High Level Programming Language for FPGAs
WebAssembly Threads ready to try in Chrome 70
Web Performance 101: how to optimize your JS, CSS, HTTP stuff, images, and fonts
Show HN: Stay with founders in San Francisco
Global Kernel Locks in APFS
Technology preview: Sealed sender for Signal
The unsolved murder of an unusual billionaire
Book Review: A Philosophy of Software Design
To Err Is Human: Mistakes and slips in skydiving and other disciplines
Growth AMA with YC Partner Gustaf Alströmer
How an outsider bucked prevailing Alzheimer's theory, clawed for validation
Three hundred and sixty years of United States caselaw
Bracketed paste mode (2013)
Building a fly brain in a computer
Chemists thrilled by speedy atomic structures
Facebook exodus: Nearly half of young users have deleted the app
Show HN: Vespene – My new Python CI/CD and automation server written in Django
Headquarter Locations of Top 101 Y Combinator Companies
Decensoring Hentai with Deep Neural Networks
A Look at the Design of Lua
IBM’s Old Playbook
The cost of keeping Singapore squeaky clean
Writing a Screencast Video Editor in Haskell
The D Language Front-End Merged Into GCC 9
Soup – Alan Kay on Objects
Qualcomm says Apple is $7B behind in royalty payments
