
When the Establishment No Longer Calls the Shots in Writing History

From Tiananmen Square to the trans-Atlantic slave trade, deciding how the past is remembered is one of the invisible prerogatives of rulers. But newer social movements are challenging whether the nation should be cast as hero or villain

A statue of Confederate States President Jefferson Davis lies on the street after protesters pulled it down in Virginia in 2020 / Parker Michels-Boyce / AFP via Getty Images

“What you can’t do is go around seeking to change our history retrospectively,” proclaimed British Prime Minister Boris Johnson in January 2022, “or to bowdlerize it or edit it in retrospect.” The prime minister was reacting to a jury verdict acquitting four protesters who had joined a Black Lives Matter demonstration in Bristol, England, in June 2020. The four were prosecuted for toppling a public statue of the 17th-century philanthropist and slave merchant Edward Colston.

One Conservative member of Parliament, Tom Hunt, added that the verdict would give a “green light for all sorts of political extremists . . . to ransack our past.” Another party member and former minister, Tim Loughton, objected that the jury decision “effectively allows anyone to rip down statues, vandalize public art and memorials or desecrate buildings because they disagree with what they stand for.” But the left-leaning writer Nesrine Malik fired back, accusing these Tory grandees of overlooking “deep inequalities that run, like cracks, from the past to the present.” Malik cast British history as a “legacy of supremacy, both racial and national” — a legacy that still lives on “not just in our streets and squares but in our politics, our education and our economy.” Malik charged that the country remains “as delusional about the moral integrity of its colonial heroes as it is about the health of its race and ethnic relations.”

Nothing about these disputes is uniquely British. Iconoclasm dates back millennia, recurring in many cultures during turbulent times. When rival factions clash, their politics become entangled with memory. Moreover, in today’s world, social media fuel street rebellions by increasing their payoff, particularly when monuments seem to glorify, say, European imperialism or the American Confederacy. Militants who deface these displays enjoy instant audiences and can spark global movements — or rather, they can do so if they live within sufficiently open societies. After all, China’s 1989 Tiananmen Square massacre claimed hundreds, by some estimates thousands, of victims, yet when Hong Kong universities removed public memorials commemorating it in 2021, barely a peep was heard. Few Chinese can afford to ruin their own or their families’ lives by candidly challenging official histories. The ordinary citizen cannot even obtain basic information about the uprising (China blocks foreign search engines such as Google and Yahoo, permitting only heavily censored domestic ones), so many people know little about it.

The warning sounded by George Orwell back in 1949 still rings true today: “Who controls the past controls the future. Who controls the present controls the past.” This raises a question that is relevant from Bristol to Beijing: Can we discover any patterns in how governments around the world promote public understanding of history?

Official histories usually glorify the state’s heroes, lament its victims and condemn its enemies — but rarely do such histories proclaim open remorse for past sins. Those that do tend to come from democratic rather than autocratic governments, even if the line between the two is sometimes hard to draw.

One context in which officials openly condemn their own state’s past is revolution, insofar as the new government proclaims a sharp break with its predecessor. By emphatically charging the old order with wrongdoing, the new regime aims to bolster its public approval. The French revolutionaries of 1789 were eager to proclaim the state’s past injustices so that they could be seen as legitimately overthrowing a brutal and corrupt ancien régime. They presented the aristocracy as traitors, as aliens, as part of the “they” and not part of the “we,” or even as a kind of foreign occupying force. In the 20th century, the Bolsheviks condemned tsarism in similar terms, just as Mao’s Cultural Revolution turned on older imperial remnants. Cuba, Nicaragua and others furnish further examples. In these revolutionary contexts the government realigns itself with the state, as if it were not only the government but also the state itself that was being renewed. The new officials feel confident condemning the state’s past wrongdoings as a means of justifying the overthrow of their predecessors.

Yet revolutions are the exception in history. Most of the time officials wish to exhibit continuity between state and government. They hesitate to condemn past evils committed by their own states, particularly when those evils do not lie far back in time, because that would mean condemning some predecessor government along with the often vast portions of the population who supported it. After the demise of leaders such as Italy’s Benito Mussolini and Spain’s Francisco Franco, any official proclamation of their crimes became a perilous affair, likely to spark outrage and factionalism, to divide populations and even families. Former administrators and supporters of those regimes — also called collaborators, depending on where you stand — were still alive, often still young and professionally or civically active. For many governments, silence seems to offer the only pathway toward national reconciliation.

Today, we can make sense of the politics of memory, of who is remembered and how, by starting with four assumptions. Each of them, it seems, is uncontroversial. First, most nations today have emerged because someone exerted power over others, whether that power be military, economic, ethnic, national, religious, gender-based or something else. Second, those power differentials have bred injustices, at least in the eyes of later generations, if not always from the standpoint of those who held power at the time. Third, it remains rare for governments to publicly proclaim their own responsibility for histories of mass and systemic injustice. Official histories usually glorify their own heroes, lament their own victims and condemn their real or fabricated enemies — but rarely do they proclaim open remorse for the state’s own wrongdoings, at least insofar as the existing constitutional order is still largely in place. Indeed, highly autocratic governments sanitize history by turning such acts into taboos, as in North Korea, China, Russia, Turkey and elsewhere. Democratic regimes may be more inclined to issue such statements, but the wording is often parsimonious since officials seek to avoid not only political backlash but also lawsuits from victims or their descendants who may be numerous and may demand large sums in compensation.

Yet despite all such obstacles, governments do, from time to time, confess past wrongs, which leads us to the fourth assumption: The dividing line between governments that are willing to acknowledge guilt and those that are not falls largely between democracy and autocracy — even if the boundary between these two terms has become ever harder to draw. Not only between democracies and autocracies but also within democracies, we witness would-be autocrats like former U.S. President Donald Trump, France’s far-right provocateur politician Eric Zemmour or Alexander Gauland, former leader of Germany’s right-wing Alternative for Germany party, pushing against self-critical histories in favor of sanitized national narratives. To be sure, officials within democracies generally have an easy enough time apologizing for incidental mishaps, for example, for the death of a recruit killed in a botched military exercise. The harder task is to take responsibility for mass injustices.

Only after World War II do we witness a break from those habits of silence and avoidance, most notably in what was then West Germany, when the government and intelligentsia adopted the narrative of a collectively responsible “we.” The disgraced Nazi regime would henceforth be treated not as “other” but as part of an ongoing history for which present and future governments would have to take responsibility. This has come to be known as “Erinnerungskultur,” literally “memory culture.” The phrase may sound stilted to Anglophone ears, but it has become mainstream, almost colloquial in Germany, where it no longer sounds novel or exotic.

To be sure, public consciousness is one thing, but public consensus is another. Even in Germany’s political mainstream, commemorative projects spur controversy. Berlin’s massive Holocaust memorial, opened in 2005, has long sparked quarrels about its aims and design. For many people, its conception seems inappropriately amorphous; at worst, there is the unseemly reality that visitors can easily flaunt their disrespect. Similarly, since reunification in 1990, debates about the former East Germany have remained contentious, touching on the dictatorship’s political legitimacy, its citizens’ participation in it, and West Germany’s overt and covert dealings with it during the Cold War. And yet surely all these disputes display not the weaknesses of Erinnerungskultur but its strengths. If democracies thrive through collective self-examination, then surely they offer a natural home for self-critical histories. When people like Trump reject critically minded memory, they reject the very idea of democracy as an arena for collective and deliberative reflection.

Obviously, stories of national guilt need not eclipse all others, nor do stories of national loss need to be excised. Russian or Polish authorities can credibly commemorate mass sufferings at the hands of Nazis. China can justifiably remember atrocities committed by Japan. Vietnam and Cambodia can rightly recall the victims of American war crimes. But when officials entrench mythologies about their nation solely as the hero or the victim, silencing any discussion about its role as a perpetrator, then they take ever further steps away from democracy itself. The problem is not that there is necessarily autocracy wherever we find sanitized history, but the converse: Wherever we find autocracy, we are sure to find sanitized history. In the same way, we do not necessarily find self-critical history wherever we find democracy, but the converse: Wherever we find self-critical history, we can surely expect to find at least incipient, if not yet full-fledged, democracy.

One hotly disputed by-product of Erinnerungskultur has been Germany’s ban on public statements that deny the occurrence or the extent of the Holocaust. Such a ban, also adopted in France and many other Western democracies, ends up placing one civic value above another, since the government’s constitutional duty to protect free expression becomes subordinated to a collective ethical duty of remembrance. The anti-denialist law coerces citizens either to confirm both the existence of the Holocaust and its gravity, or to dodge the matter when speaking publicly. What results is a rift in memory politics among Western democracies. In opting for such bans, countries like Germany or France subscribe to “militant democracy” as immortalized in the words of the 18th-century Jacobin Louis Antoine de Saint-Just: “no freedom for the enemies of freedom.” That philosophy contrasts with the Anglosphere’s traditionally laissez-faire policies, where governments certainly engage in commemorative activities yet prefer to leave much of the discussion, debate and research in the hands of citizens without imposing bans, as in the United States, or imposing comparatively mild ones, as in Britain.

Despite such surface disagreements between contemporary democracies, we should not exaggerate the divergences between Western European and Anglo-American attitudes. What unites them is a goal of strengthening the democratic public sphere, even if they dispute the best means of achieving that goal. In the past I have criticized speech bans by arguing that democracy cannot legitimately subordinate free expression to historical commemoration. I continue to hold that view, yet the difference between the age-old policies of officially self-glorifying histories and this newer, self-critical stance reveals something unprecedented, indeed admirable, about the German and French policies. Whatever may be the advantages and disadvantages of their speech bans, the anti-denialist laws contrast starkly with, for example, Poland’s 2018 legislation imposing criminal penalties on speakers who blame the Polish state or nation for complicity in Nazi atrocities.

What’s the difference? Polish authorities like to equate their ban with the German one, defending their crackdowns by arguing that Germany does the same. Yet the aims and effects of the two countries’ bans could not be more different. For all its faults, the German ban adopts the democratically credible stance of acknowledging the nation’s wrongdoings, while the Polish ban merely defaults to the old sanitizing rhetoric of officials proclaiming their nation’s heroism or victimhood. The Polish ban penalizes references to historically documented events, while the German one primarily obstructs the dissemination of patent falsehoods and conspiracy theories (which, incidentally, tend to be laced with hefty doses of antisemitism). The German ban raises concerns about democracy insofar as it curbs speech, and yet it strengthens democracy by entrenching a culture of collective self-criticism. The Polish policy fails on both counts.

Apologists for the Polish ban complain that frequent references to “Polish” concentration camps such as Auschwitz-Birkenau, Treblinka, Belzec or Sobibor, which the Nazis built in the occupied country, run the risk of misleading the public by suggesting that those compounds had been managed under Polish authority. But that’s a poor excuse. Warsaw’s lawmakers know that there are alternative means of avoiding that error without having to adopt a law so blatantly designed to curtail public discussion and scholarly research about Poles’ wartime crimes. Polish officials’ bad faith is underscored when we recall that, even before 2018, the governing party had sought to bring prosecutions for criminal libel against the Polish-American historian Jan Gross, who has published research on atrocities committed by Poles against Jews during the war.

It is important to add that Germany has also promoted measures that are less coercive but more effective — measures that advance Erinnerungskultur without punishing speakers. In particular, German school curricula do much to promote Holocaust and wartime education, often including guided class trips to former concentration camps; long-standing media policies have expanded documentary programming; and museums and other public forums promote information and discussion.

Similar policies have been adopted by other Western nations, yet not without backlashes. In former imperial powers such as Britain, France or the Netherlands, it has become increasingly untenable to discuss the achievements of empire without paying serious and even primary attention to the prices paid by colonized and Indigenous peoples. Recently in the Netherlands, some commentators have argued that the time has come to stop calling the 17th century — the era of Rembrandt, Vermeer, Hals, Spinoza, Huygens, the world’s first stock exchange and the Dutch East India Company — the “Golden Age,” given the blights of slavery, poverty, imperialism and warfare that accompanied it. Meanwhile, in recent decades American classrooms have devoted greater attention to the bleak pasts of slavery and Jim Crow, along with the brutalization of various Indigenous and immigrant peoples, yet have simultaneously faced hostility from reactionaries who seek to erase or downplay those histories and would restore schoolbooks to tales of Anglo-Saxon glory.

Despite the emergence of more self-critical official histories, it is self-glorification that continues to dominate throughout much of the world. People who face struggles in their daily lives may feel little sympathy for victims and events that seem far away. Many crave collective pride, not collective shame. Few politicians score points by telling the nation how horrid its ancestors were. Yet democracies have no other option. Self-critical history is a necessary ingredient not only of truth-telling at home but also of credibility abroad if democracies are to challenge others about past and present human rights violations.

Against the backdrop of official histories, how and why did this new countercurrent of self-criticism start to emerge? After all, self-evaluation is an ancient norm, found in many cultures and belief systems. Socrates launched much of Western philosophy by embracing such self-criticism, and yet he urged it only on individuals, not on governments acting in their official capacities. Similarly, we cannot rule out the origins of autocritique in medieval Christian practices of self-chastisement, yet here, too, such rituals were individual, never formally instituted as government practice.

The late Middle Ages and Renaissance gave rise to “mirrors for princes” and “mirrors for magistrates,” namely handbooks for rulers on good governance that highlighted the value of self-reflection. These texts urged leaders to embrace self-moderating qualities of humility, compassion and other such virtues of benign government (as poignantly ironized in Shakespeare’s mirror-smashing moment in “Richard II”). Desiderius Erasmus’ “The Education of a Christian Prince” offers a prominent example of the genre, but similar advice can also be found in non-European systems such as Confucianism, Buddhism and Islam. Albeit in their own idioms, all such traditions urged rulers to exercise power with erudition, yet here too, these teachings never recommended that governments engage in open self-rebuke by taking public responsibility for mass injustices.

One might argue that earlier societies felt no need for official proclamations of wrongdoing given that their evils did not take place on the scale witnessed in industrialized societies. Yet that argument is unpersuasive. After all, mass atrocities were certainly known in premodernity, even if not in the numbers seen in more recent times. And at any rate, it would seem easier for officials to take blame for wrongdoings that were smaller in scale, so it does not seem to be the sheer degree of wrongdoing that explains the recent shift toward self-critical history.

The reasons for the shift lie elsewhere and can be viewed as the logical conclusion of an admittedly idealistic, post-Enlightenment cosmopolitanism that reached a pinnacle around the mid-20th century in response to the horrors of World Wars I and II. Idealistic cosmopolitanism is a worldview that envisions a society of critically aware citizens who jointly agree about past failures and future reforms. It assumes a common “human family” sharing universal moral values, as witnessed when the U.N. General Assembly adopted the Universal Declaration of Human Rights in 1948, a document that opens by proclaiming that “disregard and contempt for human rights have resulted in barbarous acts which have outraged the conscience of mankind.”

Of course, the problem with that humanist idyll is that no such universal conscience ever existed. The U.N. Declaration’s drafters, including Eleanor Roosevelt, Peng Chun Chang, Charles Malik and others, were well-meaning yet privileged elites who shared little with ordinary people. In the long chronicle of human civilization, the shift from sanitized histories toward collective self-rebuke may yet prove to be a flash in the pan, a luxury of momentary prosperity.

Yet no one who believes in democracy can take an entirely pessimistic outlook. Policies of national self-inculpation are likely to remain the rare exception. But let’s not forget that constitutional democracy itself remains exceptional in history.
