A few years back, President Obama’s “red line” on Syria’s use of chemical weapons drew worldwide attention. In a December 2012 speech, the president stated,
“I want to make it absolutely clear to Assad and those under his command, the world is watching. The use of chemical weapons is and would be totally unacceptable. And if you make the tragic mistake of using these weapons, there will be consequences and you will be held accountable.”
When Obama made this speech, the civil war in Syria had been raging for more than a year and a half, and an estimated 60,000 Syrians had been killed. That got us wondering: what was it about the threat of chemical weapons that the president and others found so much less tolerable than all these deaths by so-called conventional means?
Richard Price, a political scientist at the University of British Columbia, traces the story of the chemical weapons taboo all the way back to 1899. In that year, delegates from the world’s most powerful nations got together in The Hague and hashed out a list of things that would make warfare more humane, like banning bullets that expand on impact, explosives fired from balloons, and projectiles designed to spread asphyxiating gases. But that last idea was far from the focus of the discussion there, says Price.
“It was a decidedly minor issue, almost a throwaway, because the delegates said, well, nobody has this anyway. So sure, it’s not going to harm any of us to go ahead and come up with this ban.”
But even that early in the history of chemical weapons, it was clear they were going to get special treatment. Unlike the crossbow, the first firearms, or the submarine, chemical weapons provoked repulsion at their destructive power before, not after, they were ever used, says Price: “Right from the get-go of this weapon, it was tagged as having this moral sensibility around it.”
A little over a decade later, World War I broke out in Europe. And while that moral sensibility may have been the thing that kept chemical weapons from being used against civilians, it didn’t keep them off the battlefield. By the war’s end, some historians estimate that 40% of artillery shells deployed were chemical rounds, killing thousands of soldiers on the battlefield. Those who survived often faced lifelong health problems.
People worried about how this now very real weapon might be used again in the future. And sure enough, another international agreement was signed banning chemical weapons: the Geneva Protocol of 1925. It was greeted with a whole lot of skepticism, but World War II proved it to be surprisingly effective. Neither side wanted to spend its limited military budget on something so taboo, and as a result neither the Americans nor the Germans felt well-equipped enough to initiate a chemical conflict. Says Price,
“Everybody expects World War II is going to be a chemical war. [But] this really precarious threshold somehow survived, even as virtually every other boundary in World War II was exploded.”
Thirty-five years after World War II had ended, the threshold was crossed but the taboo arguably remained just as strong as ever. Throughout the 1980s, Iraq used chemical weapons in its war with Iran, against both enemy soldiers and civilians. But while the Germans during World War I argued publicly that their use of chemical weapons was more humane than conventional warfare, the Iraqis did no such thing. Price explains,
“The Iraqis refused to admit that they had used chemical weapons, even as abundant evidence emerged to the contrary. They wouldn’t acknowledge it, so they actually contributed in a curious way to the notion that the use of these weapons is aberrant, even as they used them.”
Today, all but six of the world’s nations have signed the latest ban, the Chemical Weapons Convention. But for Price, the strength of the taboo is more a product of history than international law. Over the course of the 20th century, he says, we never got accustomed to images of civilians choking on poison gas the way we did, unfortunately, to images of civilians killed by aerial bombardment. And the longer chemical weapons existed but were not used, the more people believed they should never be used. What started as a kind of dotted line at the turn of the 20th century had by the turn of the 21st century coalesced into a solid red line.
But Price fears that red line has come about at the cost of other forms of warfare becoming more and more acceptable. Tanks, which during the World War I era were singled out as the most terrifying military innovation of the time, have joined automatic weapons, grenades, and landmines to become part of the routine of war.
“What’s really fascinating to me is how other things that we ought to also feel horrified at are put into this category we call conventional weapons, which is sort of soothing… [It] means it’s OK to get burnt to death, to get blown up to death, all these other horrible ways in which warfare is conducted.”
Richard Price is the author of The Chemical Weapons Taboo. You can listen to this segment or our entire episode on the rules of war here.
Pulling Out Your Heartstrings
If you’ve been near a TV or radio in the last twenty years, you’ve heard commercials and PSAs asking for charitable donations. After major disasters, celebrities like Billy Bob Thornton make appeals for Red Cross donations, or Alyssa Milano and UNICEF plead for aid on behalf of children worldwide. And of course, there are the ads Sarah McLachlan appears in for the ASPCA.
These ads typically include striking images: dogs who’ve nearly been beaten to death. Kids with flies in their eyes and protruding ribs. It’s hard to watch without feeling a pull on your heartstrings—and your purse strings.
Scholar Kevin Rozario says that appeals to compassion have a long history in America, a history that stretches at least as far back as 1852, when Harriet Beecher Stowe published Uncle Tom’s Cabin. Stowe, says Rozario, depicted the harsh conditions of slavery in the hope that reasonable Americans would feel driven to abolish it.
“She’s gonna tell them what’s going on and she hopes that they’ll be horrified by slavery and work to put an end to it. And so she’s got a strong sense that what she’s really catering to is the compassion and the enlightened reason of her readers.”
By and large, charity organizations during Stowe’s time approached their mission the same way: all you had to do was tell people how bad something was and their natural sense of right and wrong would require that they help. But in the early 1900s, Rozario says, charities started thinking about their donors in a new way: no longer as naturally compassionate, but as people who had to be convinced to care,
“As if they are consumers who have to be manipulated in some ways, by appealing to their desires. And one of the ways that you appeal to people’s desires…in the 1910s, especially, is that you present them with lots of very vivid, thrilling, exciting images to get their attention.”
It was a technique taken straight from the playbook of advertisers, who were also discovering new ways of capturing people’s attention around this time, Rozario says.
“This is exactly the kinds of advertising appeal that you’ll see in pulp magazines at the time…And as you read it through 1917, 1918 and so forth, there’s a sense that we have to keep making this more thrilling, more exciting, more vivid because readers are gonna be bored and they’ve seen these images before.”
Not everyone was comfortable with the Red Cross’ embrace of sensationalism. In 1917, one of the group’s field representatives — a guy named Robert Scott — was working to establish a new chapter in Alaska. The stakes were high — Americans were fighting, starving, and dying in the Great War, and the Red Cross was desperate for support on the homefront.
So Scott went to where the crowds were: keeping warm in the local movie theater. But the audience wasn’t moved by his descriptions of the front lines — really, it was nothing compared to the bloody, violent movie they’d just seen.
“So in a sense, what he was concerned about here was he was trying to rile up an audience, get them to care about these horrifying events in Europe, to feed their compassion – but this audience had already been satiated by the time he came on.”
It wasn’t simply that the movie was distracting people from the more important matter at hand. Scott was disturbed because the audience clearly found atrocity entertaining when they should have found it horrifying.
“And so his beef, I guess, with these new sensationalistic movies was that they seemed to be training people, conditioning audiences to respond to these images as forms of entertainment, rather than as something that should put them in touch with their compassionate instincts.”
But sensationalism paid off, for the Red Cross, at least. Between 1915 and 1919, the number of chapters in America grew from 145 to more than 3,700. During two National Red Cross Weeks in 1917 and 1918, 43 million Americans contributed 238 million dollars. As WWI came to an end, the American Red Cross was the country’s leading charity.
Like Scott, you might find it unsettling to think that people have to be convinced to care about each other – it undercuts the idea that we’re inherently compassionate beings. And there’s a danger we’ll get so caught up in the graphic nature of pretend images that we’ll miss the suffering of real people.
But for Rozario, the perils of sensationalism don’t outweigh the potential good. Making charity organizations competitive with mass media, he says, is precisely what allowed humanitarian efforts to flourish in the 20th and 21st centuries.
“You know, you use the techniques that you have and the cultural strategies that are available to you to try to serve the causes that you most believe in. Once you’ve got people’s attention, hopefully the educational part kicks in. Once you’ve got people caring about the issue, then you can start to really fill in the background.”
What does the United States really look like? You can describe the physical landscape, the rivers, the mountains, the Grand Canyon. You can talk about its citizens, both famous and ordinary. But if you had to choose one person who embodied the whole nation…well, you might pick this guy:
Uncle Sam has been used as an allegorical symbol of the U.S. – or perhaps more accurately, the U.S. government – for about 200 years. But starting in the 20th century, he began to supplant another figure, one who’d been with us since the very beginning: Columbia.
The figure of Columbia emerged during the Revolutionary War as an American equivalent to England’s Britannia, eventually becoming as easily recognizable to Americans as Uncle Sam is today. The first reference to Columbia as a human figure appeared in a poem by a woman named Phillis Wheatley. She wrote it for George Washington in 1775, and sent it to him as inspiration in the struggle for independence. Here’s an excerpt:
Celestial choir! enthron’d in realms of light,
Columbia’s scenes of glorious toils I write.
While freedom’s cause her anxious breast alarms,
She flashes dreadful in refulgent arms.
See mother earth her offspring’s fate bemoan,
And nations gaze at scenes before unknown!
See the bright beams of heaven’s revolving light
Involved in sorrows and the veil of night!
The poem is remarkable for several reasons. First, George Washington liked it enough to personally invite her to his camp and thank her for the poem she had written in his honor. He even helped her get the poem into print. And then there’s the story of its author. Wheatley was a former slave, who had learned to read and write while living in Boston with her owners, the Wheatley family. Even her name was a mark of her bondage: the slave ship that had taken her to America as a child had been called “The Phillis”.
After Washington helped publish Wheatley’s poem, Columbia began to show up in songs and newspaper cartoons. She helped give meaning to a nation in its infancy, says our guest Ellen Berg:
“She is this wise creature, this wise being who can lead the country. And I think that’s really important early on, because there’s some sense of a supernatural force who is helping us know what to do, where this country should be going.”
As the young nation grew, though, Columbia’s significance began to be superseded by that of another national symbol: Uncle Sam, who first appeared while America was fighting the War of 1812.
It wasn’t an exact replacement, since there were subtle differences in what the two symbols represented. Whereas Columbia was removed from politics and represented the nation itself, Uncle Sam came to stand for the more aggressive, assertive federal state. For a time, it was even common for artists to depict the two of them together, though their relationship wasn’t entirely clear. Sometimes Sam was Columbia’s uncle, and sometimes the two were linked romantically, with the states – or, as in the case below, various ethnic groups – as their children.
But as the federal government grew stronger, Uncle Sam’s power grew too, until, within a hundred years of his first appearance, his reputation came to eclipse even Columbia’s. Once ubiquitous, Columbia is rarely depicted today.
But she still lives on, even though we may not know her by name – most notably, says Berg, in the figure of the Statue of Liberty, standing proud in New York Harbor:
“As our knowledge of Columbia has fallen, what remains is the statue who we know is the goddess of liberty. And the statue has become the stronger figure. So in a way, we can say that Columbia is still there, we’re just not really aware of it.”