Feed aggregator

The Disease Map of Rural America

Dissent Magazine -

In the hardest-hit rural communities, the collective immune system was already fatally compromised. They were deep into a decades-old crisis when the pandemic arrived.

Revealed: How U.S. Gov't & Hollywood Secretly Worked Together to Justify Atomic Bombings of Japan

Democracy Now! -

On the 75th anniversary of the Hiroshima bombing, when the United States became the only country ever to use nuclear weapons in warfare, we look at how the U.S. government sought to manipulate the narrative about what it had done — especially by controlling how it was portrayed by Hollywood. Journalist Greg Mitchell’s new book, “The Beginning or the End: How Hollywood — and America — Learned to Stop Worrying and Love the Bomb,” documents how the bombings of Hiroshima and Nagasaki triggered a race between Hollywood movie studios to tell a sanitized version of the story in a major motion picture. “There’s all sorts of evidence that has emerged that the use of the bomb was not necessary, it could have been delayed or not used at all,” says Mitchell. “But what was important was to set this narrative of justification, and it was set right at the beginning by Truman and his allies, with a very willing media.”

"The Beginning of Our End": On 75th Anniversary, Hiroshima Survivor Warns Against Nuclear Weapons

Democracy Now! -

On the 75th anniversary of when the United States dropped the world’s first atomic bomb on the Japanese city of Hiroshima, killing some 140,000 people, we speak with Hideko Tamura Snider, who was 10 years old when she survived the attack. “The shaking was so huge,” she recalls. “I remember the sensation, the color and the smell like yesterday.” Tamura Snider describes her harrowing journey through a shattered city, suffering radiation sickness following the attack, and her message to President Trump.

Headlines for August 6, 2020

Democracy Now! -

Nobody Accurately Tracks Health Care Workers Lost to COVID-19. So She Stays Up At Night Cataloging the Dead.

Mother Jones Magazine -

This story was published originally by ProPublica, a nonprofit newsroom that investigates abuses of power. Sign up for ProPublica’s Big Story newsletter to receive stories like this one in your inbox as soon as they are published.

When police discovered the woman, she’d been dead at home for at least 12 hours, alone except for her 4-year-old daughter. The early reports said only that she was 42, a mammogram technician at a hospital southwest of Atlanta and almost certainly a victim of COVID-19. Had her identity been withheld to protect her family’s privacy? Her employer’s reputation? Anesthesiologist Claire Rezba, scrolling through the news on her phone, was dismayed. “I felt like her sacrifice was really great and her child’s sacrifice was really great, and she was just this anonymous woman, you know? It seemed very trivializing.” For days, Rezba would click through Google, searching for a name, until in late March, the news stories finally supplied one: Diedre Wilkes. And almost without realizing it, Rezba began to keep count.

The next name on her list was world-famous, at least in medical circles: James Goodrich, a pediatric neurosurgeon in New York City and a pioneer in the separation of twins conjoined at the head. One of his best-known successes happened in 2016, when he led a team of 40 people in a 27-hour procedure to divide the skulls and detach the brains of 13-month-old brothers. Rezba, who’d participated in two conjoined-twins cases during her residency, had been riveted by that saga. Goodrich’s death on March 30 was a gut-punch; “it just felt personal.” Clearly, the coronavirus was coming for health care professionals, from the legends like Goodrich to the ones like Wilkes who toiled out of the spotlight and, Rezba knew, would die there.

At first, seeking out their obituaries was a way to rein in her own fear. At Rezba’s hospital in Richmond, Virginia, as at health care facilities around the U.S., elective surgeries had been canceled and schedules rearranged, which meant she had long stretches of time to fret. Her husband was also a physician, an orthopedic surgeon at a different hospital. Her sister was a nurse practitioner. Bearing witness to the lives and deaths of people she didn’t know helped distract her from the dangers faced by those she loved. “It’s a way of coping with my feelings,” she acknowledged one recent afternoon. “It helps to put some of those anxieties in order.”

On April 14, the Centers for Disease Control and Prevention published its first count of health care workers lost to COVID-19: 27 deaths. By then, Rezba’s list included many times that number — nurses, drug treatment counselors, medical assistants, orderlies, ER staff, physical therapists, EMTs. “That was upsetting,” Rezba said. “I mean, I’m, like, just one person using Google and I had already counted more than 200 people and they’re saying 27? That’s a big discrepancy.”

Rezba’s exercise in psychological self-protection evolved into a bona fide mission. Soon she was spending a couple of hours a day scouring the internet for the recently dead; it saddened, then enraged her to see how difficult they were to find, how quickly people who gave their lives in service to others seemed to be forgotten. The more she searched, the more convinced she became that this invisibility was not an accident: “I felt like a lot of these hospitals and nursing homes were trying to hide what was happening.”

And instead of acting as watchdogs, public health and government officials were largely silent. As she looked for data and studies, any sign that lessons were being learned from these deaths, what Rezba found instead were men and women who worked two or three jobs but had no insurance; clusters of contagion in families; so many young parents, she wanted to scream. The majority were Black or brown. Many were immigrants. None of them had to die.

The least she could do was force the government, and the public, to see them. “I feel like if they had to look at the faces, and read the stories, if they realized how many there are; if they had to keep scrolling and reading, maybe they would understand.”

It’s been clear since the beginning of the pandemic that health care workers faced unique, sometimes extreme risks from COVID-19. Five months later, the reality is worse than most Americans know. Through the end of July, nearly 120,000 doctors, nurses and other medical personnel had contracted the virus in the U.S., the CDC reported; at least 587 had died.

Even those numbers are almost certainly “a gross underestimate,” said Kent Sepkowitz, an infectious disease specialist at Memorial Sloan Kettering Cancer Center in New York City who has studied medical worker deaths from HIV, tuberculosis, hepatitis and flu. Based on state data and past epidemics, Sepkowitz said he’d expect health care workers to make up 5% to 15% of all coronavirus infections in the U.S. That would put the number of workers who’ve contracted the virus at over 200,000, and maybe much higher. “At the front end of any epidemic or pandemic, no one knows what it is,” Sepkowitz said. “And so proper precautions aren’t taken. That’s what we’ve seen with COVID-19.”

Meanwhile, the Centers for Medicare and Medicaid Services reports at least 767 deaths among nursing home staff, making the work “the most dangerous job in America,” a Washington Post op-ed declared. National Nurses United, a union with more than 150,000 members nationwide, has counted at least 1,289 deaths among all categories of health care professionals, including 169 nurses.

The loss of so many dedicated, deeply experienced professionals in such an urgent crisis is “unfathomable,” said Christopher Friese, a professor at the University of Michigan School of Nursing whose areas of study include health care worker injuries and illnesses. “Every worker we’ve lost this year is one less person we have to take care of our loved ones. In addition to the tragic loss of that individual, we’ve depleted our workforce unnecessarily when we had tools at our disposal” to prevent wide-scale sickness and death.

One of the most potentially powerful tools for battling COVID-19 in the medical workforce has been largely missing, he said: reliable data about infections and deaths. “We don’t really have a good understanding of where health care workers are at greatest risk,” Friese said. “We’ve had to piece it together. And the fact that we’re piecing it together in 2020 is pretty disturbing.”

The CDC and the Department of Health and Human Services did not respond to ProPublica’s questions for this story.

Learning from the sick and dead ought to be a national priority, both to protect the workforce and to improve care in the pandemic and beyond, said Patricia Davidson, dean of the Johns Hopkins School of Nursing. “It’s critically important,” she said. “It should be done in real time.”

But data collection and transparency have been among the most glaring weaknesses of the U.S. pandemic response, from blind spots in the public health system’s understanding of COVID-19 in pregnancy to the sudden removal of hospital capacity data from the CDC’s website, later restored after a public outcry. The Trump administration’s sudden announcement in mid-July that it was wresting control over hospital coronavirus data from the CDC has only intensified the concerns.

“We’d be the first to agree that the CDC has been deficient” in its data gathering and deployment, said Jean Ross, a president of National Nurses United. “But it’s still the most appropriate federal agency to do this, based on clear subject-matter expertise in infectious diseases response.”

The CDC’s basic mechanism for collecting information about health care worker infections has been the standard two-page coronavirus case report form, mostly filled out by local health departments. The form doesn’t request much detail; for example, it doesn’t ask for employers’ names. Information is coming in delayed or incomplete; the agency doesn’t know the occupational status of almost 80% of people infected.

The data about infections and deaths among nursing home staff is more robust, thanks to a rule that went into effect in April that requires facilities to report directly to the CDC. The agency told Kaiser Health News that it is also “conducting a 14-state hospital study and tapping into other infection surveillance methods” to monitor health care worker deaths.

Another federal agency, the U.S. Occupational Safety and Health Administration, investigates worker infections and deaths on a complaint basis and has prioritized COVID-related cases about the health care industry. But it has suggested that most employers are unlikely to face any penalties and has issued only four citations related to the outbreak, to a Georgia nursing home that delayed reporting the hospitalization of six staffers and three Ohio care centers that violated respiratory protection standards. Of the more than 4,500 complaints OSHA has received about COVID-19-related working conditions in the medical industry, it has closed nearly 3,200, a ProPublica analysis found.

Data problems aren’t just a federal issue; many states have fallen short in collecting and reporting information about health care workers. Arizona, where cases have been surging, told ProPublica, “We do not currently report data by profession.” The same goes for New York state, though a report in early July hinted at just how devastating the numbers there might be: 37,500 nursing home employees, about a quarter of the state’s nursing home workforce, were infected with the coronavirus from March through early June. Other states, including Florida, Michigan and New Jersey, provide data about employees at long-term facilities but not about health care workers more broadly. “We are not collecting data on health care worker infections and/or health care worker deaths from COVID-19,” a spokesperson for the Michigan Health Department said in an email.

This problem is global. Amnesty International, in a July report, pointed to widespread data gaps as part of a broader suppression of information and rights that has left workers in many countries “exposed, silenced [and] attacked.” In Britain, where more than 540 medical workers have died in the pandemic, the advocacy group Doctors’ Association UK has begun legal action to force a government inquiry into shortages of personal protection equipment in the National Health Service and “social care” facilities such as nursing homes. And in May, more than three months after the first known medical worker’s death, the International Council of Nurses called for governments across the world to start keeping accurate data on such cases, and for the records to be centrally held by the World Health Organization. The WHO estimates that about 10% of COVID-19 cases worldwide are among health workers. “We are closely following up (on) these cases through our global networks,” a spokesperson said.

“Governments’ failure to collect this information in a consistent way” has been “scandalous,” said the council’s CEO, Howard Catton, and “means we do not have the data that would add to the science that could improve infection control and prevention measures and save the lives of other healthcare workers. … If they continue to turn a blind eye, it sends a message that [those] lives didn’t count.”

So regular people, like Rezba, have stepped up with their makeshift databases.

Rezba, 40, initially wanted a career in public health. While finishing her master’s degree at Emory University in Atlanta and for a few months afterward, she worked as a lab tech at the CDC, analyzing nasal swabs to track cases of MRSA, the drug-resistant staph bacteria. But she decided she cared more about people than bugs, so she headed to Virginia Commonwealth University medical school in Richmond, graduating in 2009 with plans to specialize in the treatment of chronic pain.

During her residency at VCU, her first rotation was in the neonatal intensive care unit. “There was a little baby I helped take care of for three weeks. And the very last day of that rotation, his parents withdrew care… He was the first little person I pronounced dead. I went and cried in the stairwell after that.” Her next rotation was in the burn unit, then the emergency department. “It seemed like death was just everywhere,” Rezba said. Witnessing it “is something very separate from the rest of your life experiences. People look different when they’re dying. It’s not like TV. They don’t look like they’re sleeping. CPR is pretty brutal. Codes are pretty brutal.”

She began keeping a list as a way to process the grief. “In residency, you record everything—your case logs, the procedures you do. It was just sort of second nature to record their names.” Whenever a patient died she would make another entry in her notebook, then “I would kind of perseverate”—ruminate—“over their names.” At the end of the year, she took the notebook to church. “I lit candles for them. I prayed. And then I let it go.”

A decade later, Rezba was working full time as an anesthesiologist and raising three small children, her list-compiling days long past her, she thought. Then COVID-19 hit. The onetime infectious disease geek became obsessed with the videos leaking out of China—the teams of health care workers in full protective gear, the makeshift wards in tents, the ERs in chaos: “I knew early on that this was going to be a big problem.” In her job, Rezba was often called upon to do intubations. “The possibility of not having enough PPE caused a lot of anxiety for her,” said her husband, Tejas Patel, whom she met in medical school. “She would be the one, if we did hit that level of New York, who could potentially be at risk and bring it home to the kids.”

As it turned out, Rezba’s hospital wasn’t inundated, nor did it experience the PPE shortages that plagued many health care facilities. But her anxiety didn’t disappear; it just took a new shape. If health care workers were front-line heroes, she decided, her role was to search the trenches for the bodies left behind.

Rezba is the first to admit she’s not great at technology; she rarely uses a computer at home. Patel discovered what she was doing because their iPhones and iCloud accounts are linked. “Whenever she saves a picture to the phone, I can see it. And I noticed a bunch of pictures of, you know, these strangers.” He remembered how, in their student days, Rezba had insisted on humanizing the cadaver in their anatomy lab: “It upset her that it was just this anonymous person. Knowing his birthday and little things like that would make her feel better.” Patel figured the photos were part of a similar coping strategy. “It wasn’t until much later that I found out she was putting them up on Twitter.”

Much of Rezba’s digging happens in the middle of the night, when she can’t sleep. She usually starts by Googling for local news stories; if she’s still not tired, she turns to the obituary site Legacy.com. The hunt for a person’s occupation and cause of death invariably takes her to Facebook, where she follows the trail to relatives and co-workers, to vacation slideshows and videos of old men serenading their grandkids on the guitar. Every few days, she checks GoFundMe, where she’s recently been struck by the number of people who linger for weeks or months before dying. She’s still discovering deaths that occurred in April and May. Anyone under 60 gets special scrutiny. “If the obit says, ‘They died surrounded by family,’ I usually don’t bother trying to find out more, because those people didn’t have COVID. The people with COVID are mostly dying alone.”

Doctors and nurses are the easiest to find. “If someone worked in the laundry service at the nursing home, the family doesn’t put that in,” Rezba said. Yet it’s the nonmedical staff that she feels a special obligation to uncover — the intake coordinators and supply techs, the food service workers and janitors. “I mean, the hospital’s not going to function if there’s nobody to take out the trash.” Every so often, a news story mentions that several staffers from a particular nursing home or rehab center have died, without mentioning their names, and Rezba feels the rage start to bubble. “What it comes down to is, these are people that are making $12 an hour. And they get treated like they’re disposable.”

If she can’t find someone’s identity right away, or if the cause of death isn’t clear, she’ll wait a couple of days or weeks and try again. Because she comes across them anyway, she’s started to keep track of other categories of COVID-19 deaths, like kids and pregnant women, as well as health care workers in their 30s and 40s who don’t appear to have the virus but suddenly perish from heart attacks or strokes or other mysterious reasons. “I have a lot of those,” she said.

Once she’s certain she’s found someone who belongs on her list, she selects a photo or two and writes a few words in their honor. Sometimes, these read like a scrap of poetry; sometimes, like a howl.

He enjoyed crazy-dancing at home to Bruno Mars, with the moves becoming wilder the more his family laughed.

As a child, she would wrap her clothes around Dove soap so they would smell like America.

This poor baby should have his mother in his arms. Instead he has her in an urn.

A preprint study out of Italy last week hinted at the kind of lessons researchers and policy makers might glean if they had more complete data about health care workers in the U.S. The study pooled data from occupational medical centers in six Italian cities, where more than 10,000 doctors, nurses and other providers were tested for coronavirus from March to early May. Along with basic demographic information, the data included job title, the facility and department where the employee worked, the type of PPE used and self-reported COVID-19 symptoms.

The most important findings: Working in a designated COVID-19 ward didn’t put workers at greater risk of infection, while wearing a mask “appeared to be the single most effective approach” to keeping them safe.

In the U.S., many medical facilities are similarly monitoring employee infections and deaths and adjusting policies accordingly. But for the most part, that information isn’t being made public, which makes it impossible to see the bigger picture, or for systems to learn from each other’s experiences, to better protect their workers.

Imagine all of the opportunities it would present if everyone could see the full landscape, said Ivan Oransky, vice president for editorial content at Medscape, where a memorial page to honor global front liners has been one of the site’s best-read features. “You could be doing some real great shoe-leather epidemiology… You could go: ‘Wait a second. That hospital has 12 fatalities among health care workers. The hospital across town has none. That can’t be pure coincidence. What did this one, frankly, do wrong, and what’s the other one doing right?’”

To Adia Harvey Wingfield, a sociologist at Washington University and author of “Flatlining: Race, Work, and Health Care in the New Economy,” some of the most pressing questions relate to disparities: “Where is this virus hitting our health care workers hardest?” Is the impact falling disproportionately on certain categories of workers—for example, doctors vs. registered nurses vs. nursing aides—on certain types of facilities, or in certain parts of the country? Are providers who serve lower-income communities of color more likely to become ill?

“If we aren’t attuned to these issues, that puts everybody at a disadvantage,” Wingfield said. “It’s hard to identify problems or identify solutions without the data.” The answers are especially important in Black and Latino communities that have suffered the highest rates of sickness and death—and where health care workers are themselves more likely to be people of color. Without good information to guide current and future policy, she said, “we could potentially be facing long-term catastrophic gaps in care and coverage.”

The near-term consequences have also been enormous. The lack of public data about health care workers and deaths may have contributed to a dangerous complacency as infections have surged in the South and West, Friese said—for example, the idea that COVID-19 is no more dangerous than other common respiratory viruses. “I’ve been at this for 23 years. I’ve never seen so many health care workers stricken in my career. This whole idea that it’s just like the flu probably set us back quite a way.”

He sees similar misconceptions about PPE: “If we had a better understanding of the number of health care workers infected, it might help our policymakers recognize the PPE remains inadequate and they need to redouble their efforts… People are still MacGyvering and wrapping themselves in trash bags. If we’re reusing N95 respirators, we haven’t solved the problem. And until we solve that, we’re going to continue to see the really tragic results that we’re seeing.”

The misconceptions appeared to stretch to the highest reaches of the federal government, even as infections and deaths started surging again. At a White House event in July focused on reopening schools in the fall, HHS secretary Alex Azar told the people gathered, “health care workers…don’t get infected because they take appropriate precautions.”

Even some medical workers have continued to be in denial. A few days before Azar spoke, Twitter was abuzz over an Alabama nurse who worked the COVID-19 floor at a hospital by day and decompressed at crowded bars by night, where she often went maskless. “I work in the health care industry,” she was quoted as saying, “so I feel like I probably won’t get it if I haven’t gotten it by now.”

Piercing that sense of invulnerability—making the enormity of the COVID-19 disaster seem real—isn’t only Rezba’s mission. From The New York Times’ iconic front page marking the first 100,000 American deaths to the Guardian/Kaiser Health News project “Lost on the Frontline,” news organizations and social media activists have grappled with how to convey the scale of the tragedy when people are distracted by multiple world-shattering crises and the normal rituals for processing grief are largely unavailable.

“The point at which accountability usually happens is when our leaders have to reckon with the families of those who’ve been lost, and that has not happened,” said Alex Goldstein, a Boston-area communications strategist behind the wrenching @FacesOfCOVID Twitter account, which has posted almost 2,000 memorials since March. With COVID-19, “no one has had to look in the eye of a crying parent who wants to show you a picture of their child or listen to someone telling you about who their mom or dad was. There has been no consequence. What would our policy decisions have looked like if [the people making them] had to come face to face with that death and loss in a more visceral way?”

It’s a question that weighs especially heavily on health care professionals, who have seen, in the most visceral way possible, the worst that COVID-19 can do. Erica Bial, a pain specialist in the neurosurgery department at a Boston-area hospital, fell dangerously ill from COVID-19 in March, her respiratory symptoms lingering for more than six weeks. She lived alone and opted not to go to the hospital, in part because she worried about infecting other people. “At that point [in the outbreak], they would have intubated me, given me hydroxychloroquine and azithromycin and probably killed me.” As her recovery dragged on, she wondered how other doctors were faring: “I couldn’t believe that I was the only physician I knew who was sick.” But as she searched online, “I could not find any data. I just started getting really frustrated at the lack of information and the disinformation… And then I started thinking about, well, what happens if I die here? Will anybody know?”

Like Rezba, Bial has a background in public health; the Facebook page she created, COVID-19 Physicians Memorial, was an attempt to build “a network where there’s accountability. I wasn’t necessarily trying to create, you know, reverence or memorialization. I was trying to understand the scope of the problem.”

Rezba soon began posting memorials on the page; as it grew to include more than 4,800 members, Bial asked her to help moderate it. Among the things the two women share is a determination to stick to facts. “I didn’t want any politics and I didn’t want any garbage,” Bial said. “(Rezba) was 100% like-minded and trustable.” She was also someone Bial could talk to, doctor to doctor, as she recovered. “It wasn’t just two people obsessed with something kind of morbid,” Bial said. “She was a source of support.”

Emergency room doctor Cleavon Gilman also gained a following for his posts on Facebook, a diary about what he witnessed as an ER resident in the NewYork-Presbyterian hospital system, battling the virus as it engulfed Washington Heights. “It was just … overwhelming,” he recalled. “We were intubating 20 patients a day. We had hallways filled with COVID patients; there was nowhere to put them.” In the space of a few brutal days in late April, three of Gilman’s colleagues died, including one by suicide. “When it’s a colleague that you’re taking care of and you know them as a person you’ve been on a journey with… man, that’s hard.”

Though much of the media focus was on the risks faced by older patients, Gilman was struck by how many of the critically ill were in their 20s, 30s and 40s. In mid-April, his own 27-year-old cousin, a gym teacher at a New Jersey charter school, suddenly died; he went to the ER twice with chest pain but was diagnosed with anxiety and sent home, according to his relatives, only to collapse in his car on the side of the road.

As the crisis in New York City ebbed, Gilman could see trouble ahead in other parts of the country, including in Yuma, Arizona, where he was about to start a new job. It seemed vitally important to help younger people understand the risks they faced—and that they created for others—by not adhering to physical distancing or wearing masks, not to mention the dangers that health care workers faced from continuing shortages of PPE. So Gilman began gathering the memorials he saw on Twitter and Facebook, many of them found by Rezba or on @FacesOfCOVID, and organizing the dead on his website in the type of gallery that he knew would pack an emotional wallop. Then he went a step further, making the photos and obituaries—more than 1,000 people—sortable by age and profession.

“You begin to see a pattern here,” he said. “When someone says, ‘Oh people aren’t dying, they’re not that [young],’ you can come back with actual names, actual articles, quickly. It’s more powerful. You have your evidence there.”

One of the most overtly political projects is Marked by COVID, formed by Kristin Urquiza in honor of her father, Mark, after her “honest obituary” of him went viral in early July. To Urquiza, who earned her master’s in public affairs from the University of California, Berkeley, and works as an environmental advocate in the San Francisco area, “the parallels between the AIDS crisis and what is happening now with COVID are just mind-boggling [in terms of] the inaction by governments and the failure to prioritize public health.” She and her partner, Christine Keeves, a longtime LGBTQ activist, hope the project will be both “a platform for people to come forward and share their stories” and the COVID-19 version of the anti-AIDS group Act Up.

They’re also raising money on GoFundMe to help other families pay for obituaries; the second honest obit on their site was for a respiratory therapist in Texas named Isabelle Odette Hilton Papadimitriou: “Her undeserving death is due to the carelessness of politicians who undervalue healthcare workers through lack of leadership, refusal to acknowledge the severity of this crisis and unwillingness to give clear and decisive direction to minimize the risks of coronavirus. Isabelle’s death was preventable; her children are channeling their grief and anger into ensuring fewer families endure this nightmare.”

It’s a trend that Rezba supports wholeheartedly. By the end of July, she had posted almost 900 names and faces of U.S. health care workers who had perished from COVID-19. She fantasized about what it would be like to leave the counting behind her. “It would be great if I could stop. It would be great if there was nobody else to find.” But she had a backlog of dozens of stories to post, and the number of deaths kept climbing.

The Trump Files: Cosmo Once Asked Donald to Pose Nude for $50,000

Mother Jones Magazine -

This post was originally published as part of “The Trump Files”—a collection of telling episodes, strange but true stories, and curious scenes from the life of our current president—on October 14, 2016.

It turns out Melania almost wasn’t the only Trump to strip down for a magazine.

In 1989, according to the Miami Herald, Cosmopolitan asked Donald to pose nude in “a feature lay-out à la Burt Reynolds” for its 25th anniversary issue. The Herald also reported that Trump was (inexplicably) “the readers’ choice as the sexy man they would most like to see featured in the nude.” To entice Trump, Cosmo promised to protect his modesty with “a stack of books, a potted plant, a towel or something,” and the magazine was even willing to throw in a $50,000 donation to the charity of his choice.

Alas, Trump said no (as, reportedly, did Dennis Quaid, Tom Hanks, Paul Newman, and others). The seemingly not-that-coveted slot eventually went to David Hasselhoff.

How the Fascists Won World War II

Counterpunch Articles -

Photograph Source: View of the IG Farben Building from the Main Tower by EvaK – CC BY 2.5

Come you masters of war
You that build the big guns
You that build the death planes
You that build all the bombs
You that hide behind walls
You that hide behind desks
I just want you to know
I can see through your masks

–“Masters of War,” Bob Dylan

This is a mystery story. It revolves around a building that—as you will all come to agree—should have been bombed.

Before construction of the Pentagon during World War II, the two largest and most famous office buildings on planet Earth were the Empire State Building and the headquarters of German industrial behemoth IG Farben. Building these palaces of capitalism was a frantic race run in 1930-1931, at the opening of the Great Depression. Both edifices were designed to inspire awe, by “skyscraper” height in New York, by overwhelming grandiosity in Frankfurt. Unlike the original World Trade Center, both buildings still stand. There is no mystery about how the rugged steel frame of the Empire State Building survived the 1945 direct crash into its 79th floor by a twin-engine B-25 bomber, lost in fog over the city. How the IG Farben HQ survived World War II, however, is a mystery whose dark depths hold secret links between the past and the present.

The Empire State Building was an indelible feature of my mental landscape, growing up as a Brooklyn kid during World War II. But my first glimpse of the IG Farben building was only in a movie: Jacques Tourneur’s Berlin Express, a 1948 film I first saw while the president of the United States was doing his best to follow Hitler’s path to power. Like Alfred Hitchcock’s 1946 Notorious, it’s a thriller revolving around a Nazi conspiracy to regain power. I can think of no other postwar film about a Nazi comeback attempt, rather surprising since these were the years of the denazification campaign and the war crimes trials.

Berlin Express would be worth watching as a mystery thriller with excellent production values and fine ensemble acting. It is also the only postwar film I know of that warned against the nascent Cold War and pleaded for the restoration of the wartime alliance against fascism. But its knockout power comes from astounding visual revelations. Shot in 1947, Berlin Express was the first commercial movie filmed in occupied Germany. A full-screen opening credit proclaims:

Actual scenes in Frankfurt and Berlin were photographed

by authorization of

The United States Army of Occupation

The British Army of Occupation

The Soviet Army of Occupation.

When the IG Farben building appeared early in the film, I gasped. There it stood. Enclosed by acres of manicured parkland, its six monumental interconnected wings (compared to the Pentagon’s five) reached out in a soaring arc that sought to dominate space. Each wing was itself a massive nine-story building clad with blocks of exquisite Travertine marble. The camera took us through the portico’s ponderous columns into the ornate lobby, then across glistening marble floors leading to non-stop no-doors paternoster elevators endlessly whisking people to and from their work. On an upper floor, we followed one of the forty-five curved corridors that interlaced this colossal structure, which housed ten million cubic feet of office space and, from 1933 until 1945, was the nexus of the Nazi war machine. The next scene took place inside an office, through whose windows we glimpsed endless vistas of rubble, the ruins of the city of Frankfurt.

For fifteen years, this had been headquarters of the giant German conglomerate IG Farben. The main slave labor camp at Auschwitz was designed, administered, and financed within these walls, and profits from the camp were remitted to these offices. Joseph Mengele submitted detailed reports on his hideous Auschwitz experiments directly to this building, where his directors dutifully authorized his payments and requisitioned whatever equipment and supplies he requested. Here was invented the Zyklon-B gas used to murder millions of Jews, Communists, Roma, and homosexuals. Even more important, in this building were the brains and other vital organs of the company that invented and produced the synthetic rubber, synthetic oil, and new lightweight alloys that enabled the Wehrmacht’s warplanes and tanks to conquer Europe all the way from the English Channel to the outskirts of Moscow, Stalingrad, and Leningrad. At the war crimes trial of the leaders, chief prosecutor Telford Taylor said these were the men who turned Hitler’s fantasies into reality. (For the definitive history of IG Farben, see Diarmuid Jeffreys’s superb Hell’s Cartel: IG Farben and the Making of Hitler’s War Machine.)

Before that vast Nazi industrial war machine could be administered inside this gorgeous palace of death, it had to be financed and created. IG Farben’s first contribution to Hitler and his Nazi party came at a crucial moment in history. The Nazis, who had won 37.3% of the vote in the July 1932 election, plummeted to 33.1% in the November election, costing them 34 seats in parliament. Outnumbered by the combined Social Democrat and Communist deputies, the Nazis were unable to form a majority coalition, but Hitler, backed by many German industrialists and some American corporations, persuaded President Hindenburg to appoint him Chancellor, with control over the police. New parliamentary elections were scheduled for March 1933. In late February, Hitler held a secret meeting with a who’s who of Germany’s industrialists. Led by IG Farben, which gave the largest contribution, the giant corporations financed a tsunami of Nazi propaganda, huge Nazi rallies, and the unleashing of Hitler’s Stormtroopers (Sturmabteilung or SA, known as the Brownshirts). In that March election, the last free one, the Nazis attained their peak vote (43.9%), enough to consolidate Hitler’s dictatorship.

How could this building not have been—for military reasons alone, not to mention moral ones—the prime target of U.S. and British bombing? But for other reasons, no Allied forces were ever permitted to attack the citadel of Nazi power and command center of Germany’s greatest war crimes. This was not because the ancient city of Frankfurt, where German kings and emperors had been crowned as early as 855 AD, was spared. The cloak-and-dagger action of Berlin Express takes us on a nightmare scenic tour of the bombed-out remains of Frankfurt. Lucien Ballard’s superb black-and-white photography captures endless miles of skeletal buildings, vast piles of rubble, mutilated beggars, and the homeless. Some of the main action takes place inside the ruins, including key scenes in a clandestine pro-Nazi night club hollowed out behind the rubble. The IG Farben building stood unscathed, amid a city bombed into a modern form of the stone age.

“The masters of war,” as Bob Dylan sang to us, “hide behind walls.” Which walls? In America, the closest image of the walls they hide behind is the Pentagon. But the Pentagon is merely the workplace of their minions, henchmen, and hirelings, sitting behind desks in high-ranking uniforms or scurrying around corridors after their military retirement, now in expensive suits as representatives of our “defense” corporations. But in Nazi Germany, anyone could see the ostentatious walls behind which lurked the profiteering masters of war. They gloried in their walls, and they wanted them known by the world. The IG Farben headquarters thus combined those prime targets of the 9/11 bombers: the World Trade Center and the Pentagon, the two most famous office buildings in the world at the dawn of the 21st century.

We call the 9/11 bombers “terrorists,” a term that obscures both their motivation and the astounding and ghastly success of their geopolitical mission: to plunge the United States into endless and unwinnable war in the heart of the Muslim world. Terror was not their aim, which was to lure the U.S. into Afghanistan, where the Jihadists—with major assistance from Washington—had recently defeated the USSR. In contrast, the World War II British and U.S. bombers of Frankfurt—and the other cities of Germany and Japan—were implementing a strategy of terrorism. That was an explicitly Fascist strategy, expounded by the Italian Fascist theorist Giulio Douhet and developed in Britain by Air Marshal Hugh “Boom” Trenchard and Air Marshal Arthur “Bomber” Harris and in America by General Billy Mitchell (as explained and documented at length in the “Victory through Air Power” section of my book, War Stars: The Superweapon and the American Imagination). The strategy developed from the Italian terror bombing of Libya in 1911 and the British terror bombing of Iraq in 1922.

Douhet explained his theory in a series of treatises compiled in The Command of the Air, the blueprint for what is known as “strategic bombing,” the main World War II U.S. air strategy, and later the mission of the U.S. Strategic Air Command, in which I served as a navigator and intelligence officer. Since, in Douhet’s words, the goal is “spreading terror and panic,” therefore “it is much more important” to destroy “a bakery” than “to strafe or bomb a trench.” The main targets are “warehouses, factories, stores, food supplies, and population centers.” Douhet enthused about incendiary as well as explosive bombs (and poison gas). He envisioned “panic-stricken people” fleeing burning cities “to escape this terror from the air.”

Perhaps the best known early victim of this Fascist theory of warfare is Guernica, a Spanish city of no military significance, subjected to saturation aerial bombing by the Luftwaffe in 1937. Pablo Picasso’s magnificent painting of the slaughter is rightly known as one of the greatest visual antiwar artworks. I find the horrors of Black Rain, the 1989 Japanese reenactment of the Hiroshima bombing and its aftermath, more gut-wrenching. But when I think deeply about what we see in Berlin Express and relate it to our daily news, this old Hollywood movie hits with more terrifying implications.

Why was the IG Farben HQ never bombed? How did this building become the safest building in any German city? No official explanation has ever been offered. Berlin Express repeats a rumored explanation: General Eisenhower decided in 1944 that he wanted the building for his headquarters (which is what we see in those scenes actually shot inside this structure in 1947). The problem with the explanation lies in the history of the bombing of Frankfurt.

Frankfurt was bombed fifty-four times by the British before July 25, 1942. During the rest of 1942 and 1943, massive RAF bomber raids intermittently saturated Frankfurt with explosives and incendiaries. The October 4, 1943 attack alone dropped some 300,000 liquid and solid incendiary bombs on the city. The first American air raid on Frankfurt came on January 29, 1944, when a vast fleet of 800 B-17 Flying Fortresses obliterated the entire city—except the IG Farben building and grounds.

Most of the B-17 crews were veterans of raids on other German cities. On this raid, they encountered something they had never experienced. They were “puzzled,” they reported, by the lack of any German resistance on the way in. They met neither flak from the antiaircraft batteries below nor fire from Nazi fighter planes above until after they had made their bombing run and turned back to head home.

Why? The B-17s were most vulnerable while they were laden with bombs and in tight bombing formations. The defenders could hardly miss 800 Flying Fortresses, and, compared with other missions, dozens of bombers would have been shot down. I can think of only one explanation for this behavior by the Nazi forces whose mission was to defend the city. If they attacked the bombers before they unleashed their bombs, those bombs could go anywhere—including the sacrosanct IG Farben headquarters. And that makes no sense unless the defenders knew in advance, no doubt from the dozens of past raids, that the attackers would not target IG Farben.

Who were IG Farben’s guardian angels? The answer to that question helps explain how the apparent victory in our so-called Good War somehow turned into our Forever War, today led by a president who is meticulously following Hitler’s path to power. It lies in the labyrinthine maze of IG Farben’s interconnections with giant British and American corporations.

One way to navigate the IG Farben international maze is to follow the footsteps of John Foster Dulles and Allen Dulles. Until the U.S. entered the war, IG Farben’s U.S. representative was Sullivan & Cromwell, a law firm headed by Foster, abetted by his partner Allen. As soon as Hitler won the 1933 election, Sullivan & Cromwell began all its German cables with “Heil Hitler.” While negotiating crucial international deals for IG Farben, including ways to hide the company’s control of strategic U.S. corporations, Foster was also an apologist for the Nazi regime and a founder of the America First appeasement movement. (The Dulles brothers’ pre-war story is told powerfully by Nancy Lisagor and Frank Lipsius in A Law Unto Itself: The Untold Story of the Law Firm Sullivan & Cromwell.) During the war, Allen led the OSS office in Switzerland, where he met various German spies and agents, suppressed reports of the Holocaust, and arranged for a post-war anti-Soviet strategy. (For essential reading on Allen, see David Talbot’s The Devil’s Chessboard: Allen Dulles, the CIA, and the Rise of America’s Secret Government.) The Dulles brothers took the helm of America’s Cold War when Foster became Secretary of State and Allen became head of the CIA under President Eisenhower.

As the U.S. was sliding into World War II in 1941, the Department of Justice exposed the byzantine cartel created by IG Farben and Standard Oil, including a jointly-owned U.S. corporation. Top executives of Standard Oil were convicted of criminal conspiracy with IG Farben. (Each of these corporate criminals was punished with a fine of five thousand dollars.) Using a host of holding companies and dummy corporations, IG Farben also gained stakes in other major U.S. competitors, including Dow Chemical and Alcoa. Its aim? To prevent the U.S. from producing its own synthetic rubber and oil, as well as lightweight strategic metals, especially the new forms of magnesium so important for fighter planes. Its tactic? Lure the U.S. companies with offers of IG Farben patents and then sign agreements severely limiting any production utilizing these patents. Thanks to cordial and intimate relations between the German and U.S. executives, this worked fine for Nazi war plans.

I discovered the complex ties between IG Farben and Dow Chemical in 1966, while working to help create the movement against the use of napalm in Vietnam. Dow, of course, was the main producer of napalm. As I wrote then: “In the 1930’s, Dow Chemical and IG Farben formed an international cartel. Part of their agreement was to restrain U.S. production of magnesium and allow Germany to take world leadership in the vital element. As a result, at the outset of World War II Germany was producing five times as much magnesium as the United States.”

Could the guardian angels of IG Farben’s HQ be the same angels who saved IG Farben’s leading executives from execution or lifetime sentences? As Berlin Express was filming in Frankfurt, 140 miles away in Nuremberg, twenty-three top IG Farben executives were being tried as war criminals. In the film, the Nazis are still the enemy. But by that time, America was already rebuilding German industry against a perceived menace from Soviet Communism.

Leading the charge against the prosecutors of IG Farben’s chieftains was Congressman George Dondero of Michigan, who asserted on the floor of Congress that Josiah DuBois, the prosecution’s lead attorney, as well as five other members of the team, were all “Communist sympathizers” “who are trying to blacken the name of IG Farben.” Dondero’s Congressional district happened to include the Midland, Michigan, international headquarters of Dow Chemical, whose links to IG Farben were already being exposed in U.S. newspapers. On the same House floor, Congressman John Rankin of Mississippi branded the trial a “disgrace” where members of a “racial minority” are “trying German businessmen in the name of the United States.”

Ten defendants were acquitted on all counts. Thirteen were convicted of various war crimes. None served more than three years of their prison sentences, and most served far less. As for IG Farben, it was divided up, mostly into the three companies that had previously merged to form the many-headed beast: BASF, Hoechst, and Bayer. As soon as they were released from prison, many of the convicts became leaders of BASF, Hoechst, and Bayer. Karl Wurster, who won total acquittal despite serving as a director of the company that supplied the Zyklon B gas for the death chambers, became the head of BASF.

BASF, short for Badische Anilin und Soda Fabrik, was the main company that originally created and merged into IG Farben. When I was researching Dow Chemical and napalm in 1966, I learned that Badische Anilin und Soda Fabrik had renewed its relations with Dow and the two companies were now partners in the Dow-Badische company, with a giant chemical plant in Freeport, Texas.

As a member of the small delegation that met in 1966 with the executives of UTC, a Dow subcontractor with a huge napalm contract in the San Francisco Bay Area, I naively presented my research. Barnet Adelman, the president of UTC and a fellow Jew, responded in these exact words, the core defense of the IG Farben war criminals at Nuremberg: “Whatever our government asks us to do is right.”

When the leaders of IG Farben escaped any meaningful punishment for their monstrous war crimes (with their fortunes intact), the U.S. prosecutors dropped their pending case against Deutsche Bank. As Germany’s largest bank, Deutsche Bank bankrolled the rise of the Nazis and amassed colossal wealth from the genocide of the Jews and the takeover of foreign banks as nations fell to the Wehrmacht. Deutsche Bank financed both the death camps and IG Farben’s slave labor factory at Auschwitz. As Jews and other victims were gassed by IG Farben’s Zyklon-B, their gold wedding rings and dental fillings were collected and melted down. Deutsche Bank then sold the gold, thus converting it into the hard cash desperately needed by the Nazi war machine. David Enrich’s Dark Towers: Deutsche Bank, Donald Trump, and an Epic Trail of Destruction reveals the sequel. After U.S. banks blacklisted Donald Trump because he had defaulted on many loans, Deutsche Bank gave Trump loan after loan, through bankruptcy after bankruptcy, default after default, effectively financing his real estate empire.

Born in 1934, I have often wondered, over the decades, how fascism triumphed in Germany, arguably then the most advanced nation in the world. I guess we are beginning to understand. I hope it’s not too late.

The post How the Fascists Won World War II appeared first on CounterPunch.org.

The American Narrative of Hiroshima is a Statue that Must be Toppled

Counterpunch Articles -

A photograph of Hiroshima seen from a US airplane after the attack, autographed by Enola Gay pilot Paul Tibbets.

In August 1945, the United States attacked two cities in Japan with nuclear weapons in the last days of World War Two. The US used weapons of mass destruction against a primarily civilian population, instantly killing over 100,000 human beings, with tens of thousands of wounded and irradiated people who would die in the subsequent months and years. The American narrative of the nuclear attacks was formalized in a piece written by former Secretary of War Henry Stimson in Harper’s in 1947. Stimson wrote that the use of nuclear weapons ended the war and, in making an invasion of the Japanese home islands unnecessary, saved millions of lives on both sides. He actually wrote that the use of weapons of mass destruction against urban centers saved lives.

The American telling of the nuclear attacks focuses on the astonishing accomplishments of scientists involved in developing the weapons, on industrial manufacturers producing the weapons, politicians “deciding” what to do with the revolutionary technology, and the highly trained military personnel who “dropped” the bombs (always a passive construction) on Hiroshima and Nagasaki. It seems that every year someone finds another way to tell the story that celebrates the inclusiveness prioritized in modern American narration. Some tell the story of children expressing pride in their parents’ involvement in creating this weapon. Others find “inspiring” angles of inclusiveness such as gender, or of minority racial groups, leaving unmentioned the enforcement of Jim Crow-style discrimination in employment practices in the Manhattan Project production workforce. But the central players in the story are Americans; there are no Japanese people in it. Japanese people are included only as statistics: how many dead; how many wounded. It is a story of the mass murder of hundreds of thousands of human beings in which those murdered are a footnote. No Japanese person is named.

This is a continuation of the war time erasure of Japanese humanity. In its practice of “urbacide,” the US military turned human urban settlements, which were full of innocent civilians, into “kill zones,” “target areas,” and “workers dwellings,” or simply equations or statistics of burned area and bomb tonnage. Hiroshima was the culmination of a campaign that saw up to 350,000 civilians bombed, burned and strafed by the US 20th Air Force. Yet, we treat the people who executed these raids as tortured souls who hated what they were doing. That is, if we think about them at all.

City sign as you enter Los Alamos (CC BY 2.0) by M McBey.

The fire raids are completely obscured by the A-bombs in American and Japanese memory. Historian Mark Selden called these, somewhat provocatively, a forgotten Holocaust. The comparison with the Holocaust is problematic for contemporary Americans. Even obscene. But, this was not always so. Already in August 1945, Pulitzer Prize-winning journalist Felix Morley, not exactly a Marxist firebrand, wrote “At Nazi concentration camps, we have paraded horrified German civilians before the piled bodies of tortured Nazi victims…It would be equally salutary to send groups of representative Americans to Hiroshima. There, as at Buchenwald, there are plenty of unburied dead.” We still have not heeded Morley’s advice. We are still refusing to look at the crimes we committed during our last good war. If Morley could say this in 1945, right after the liberation of the camps, American patriotism at its highest point, we should be able to think about the implications of the comparison now.

But we rarely do. The American narrative of the nuclear attacks on Hiroshima and Nagasaki—which are, by definition, war crimes—focuses entirely on the perpetrators. When it is being recounted by experts, it often obsesses over them. Who said what to whom on what day? What materials were moved from place A to place B on what day? How were the weapons of mass destruction assembled? Who did what? The American narrative of the nuclear attacks is an obsession with the killers, and with their weapon. To the degree that the war crime itself is discussed, the focus is on the physical effects and dynamics of the weapon. The presence of thousands of schoolchildren beneath this process goes unnoted.

Replica of the Fat Man bomb (used in the nuclear attack on Nagasaki) at the Bradbury Science Museum in Los Alamos.

Even amongst those of us on the left who are sympathetic to Hiroshima, the willful blindness endures. What American (and Japanese) liberals mostly renounce is not the US actions but “nuclear weapons.” As if the A-bomb “dropped” itself rather than having two billion 1945 dollars and hundreds of thousands of workers and the full power of the state behind it. So, we march for the abolition of nuclear weapons, we may learn how to fold paper cranes, even visit the peace museums in Hiroshima and Nagasaki, but we never name names. Never point fingers. We still marvel at the technological achievement while we condemn the monster it midwifed.

Some hibakusha were happy and grateful for this sympathy. Some of this was genuine thirst for peace with America and a desperate need for a silver lining. In an extreme example, one hibakusha told a Life Magazine reporter, “something good must come out of this. I now want to be sent to the U.S. so doctors can experiment with my body. It does not matter if I die as a result, as long as I can be of some use to the world of peace.” Such gestures were part of an emotional theater that Americans expected from both Holocaust survivors and hibakusha, which always ended in a Hollywood-like happy ending. Hiroshima wrapped itself in emotion. And we loved to participate and hug its survivors. President Obama hugged survivors and shed tears in his visit to the city. He offered no compensation for the death and misery we inflicted, and no help with treatment for decades-long radiogenic diseases. He brought the “nuclear football,” the mobile command center used by the US President to authorize launching nuclear weapons, into the Peace Park with him.

Leslie Groves Park in Richland, Washington.

So, yes, we like a good cry. We like to hear heart-wrenching stories about the last train to Hiroshima, or the inspiring struggle of someone’s grandma-in-law. All of this emotion (and these stories ARE heartbreaking) actually serves to take our gaze away from the dead and into the inspiring stories of survival and reconciliation. Of course, we need to hear the stories coming out of Hiroshima and Nagasaki. But we also need to go a step further. We still have not heeded Morley’s advice and still have not, like the Germans, been paraded through the dead. We’d rather not look.

We have not redefined for ourselves the relationship between our Manhattan Project heroes and those whose stories we sympathize with. We somehow manage to keep these stories in separate silos.

This is not a question of simple amnesia. It is more an issue of misremembering and of pointing the torch of historical enquiry in the completely wrong direction. From the 1960s onwards, scholars have deconstructed the narrative that the nuclear attacks “saved lives” and were entirely focused on ending the war. Truman and his Secretary of State James Byrnes were clearly focused on the Soviet Union as they moved through the authorizations for the attacks. Truman did not so much make a “decision” as approve plans drawn up by others that had already been set in motion. Some Manhattan Project scientists did actively oppose the use of the weapons in Japan. This is essential scholarship in understanding the history of the attacks, the end of World War Two, and of the initial dynamics of the Cold War. The contributions of the scholars who have explored these issues are notable.

However, the annual process of reveling in the details, of “live tweeting” the various steps and movements of materials as they are put into place to commit the war crimes, seems a little disturbing at this point. Yes, it is important to inform people who may only now be paying attention to the story about the historical background, but the perennial limiting of this retelling to the actions and thoughts of the perpetrators perpetuates the American narrative—that the story is about the feelings, thoughts and actions of those who committed a war crime, not about the actual crimes or the victims. Ending with one final tweet or mention of the number of human beings killed, and one last photo of the mushroom cloud or of the vaporized sections of the cities, seen from the air—the perpetrators’ perspective—reinforces the view that those acting are important, and those being acted upon are…statistics. It is time to move past detailing the minutiae of committing the war crimes and begin to expand our focus to the actual crimes and those who were attacked. Telling half of the story tells a disturbing tale about the storytellers.

Any political leader who would suggest today that the use of weapons of mass destruction against a civilian population would “save lives” because it would compel a surrender would be understood to be advocating war crimes. Why do we consider this justification worthy of anything but contempt when looking back at the spring and summer of 1945? Why do we obsess over the communications and preparations to commit these war crimes rather than clarify that what was being done was ghastly and inhumane? When we recount the history of slavery in the United States, we do not limit our focus to the “owners” of other human beings. We see the institution as the horror it was, and we recount the brutalities that people endured and how they resisted. When we recount the Holocaust we don’t obsess over the communications of Nazi leaders and the minutiae of constructing the concentration camps and of transporting the Zyklon B gas (when was the purchase order written? When was the delivery received? And what about the women guards?). While these details are an important aspect of the historical record, the story we tell is of the atrocities, and of those who suffered. If we talk about the perpetrators it is to understand how to prevent such historical events from happening again. In Hiroshima and Nagasaki, it is as though we tell the story of slavery, or of the Holocaust, with the victims remembered only as a statistical footnote at the end.

Author Jacobs outside of Richland High School, Home of the Bombers.

While many in America have directly faced and opposed the horrors of using nuclear weapons against human beings, our annual social media storytelling around the anniversary obscures the war crimes and casts a glowing light on the war criminals. This may be one reason that today, 75 years later, American society still embraces nuclear weapons as a valid military tool, and is preparing to spend trillions of tax dollars on “modernizing” them over the coming decades. Maybe if we call a war crime a war crime, we can more effectively resist committing our society to policies which sleepwalk us closer to what Noam Chomsky has called the “very severe threat of nuclear war.”

The American narrative of the nuclear attacks on Hiroshima and Nagasaki as celebrations of American technological prowess needs to be toppled. All of the historical nuance of the development, manufacture and use of nuclear weapons must remain an important part of the historical record, and further research must be done. However, the annual “real time” recitation of those facts, driven in part by the nature of social media, serves only to reinforce a triumphal fascination with the killers and the obfuscation of the killed.

Hiroshima and Nagasaki were part of a brutal war. But, it is a war crime to kill a mass population because some of those among the dead are defined as legitimate targets in war. If we want to work towards a world liberated from the threat of nuclear weapons and nuclear war, we must stop fetishizing the mechanics of their single use directly against human beings.

We are currently in the midst of a historical awakening in this country regarding the historical injustice and systematic violence perpetrated against African-Americans. This violence was not confined to American shores. There is a direct line running between oppression at home and nuclear violence abroad. African-American activists were often among the first to recognize these connections. We are now toppling statues erected to honor those who perpetuated historical violence, and fighting back against those who celebrate the history of that violence. Meanwhile, in the midst of this awakening, this historical reckoning, one of the largest war crimes in American history is once again being celebrated. Now is the time to awaken from the American narrative of the “great technological achievement” of nuclear weaponry, and the footnoted mass murder of hundreds of thousands of civilians in Hiroshima and Nagasaki.

The post The American Narrative of Hiroshima is a Statue that Must be Toppled appeared first on CounterPunch.org.

Reverse the New Nuclear Arms Race

Counterpunch Articles -

Image Source: Large stockpile with global range (dark blue), smaller stockpile with global range (medium blue), small stockpile with regional range (light blue) – Public Domain

August 6 is the 75th anniversary of the first time any nation in the world dropped an atomic bomb on people.

On August 6, 1945, President Harry Truman ordered Little Boy dropped on Hiroshima, killing an estimated 140,000 people. Three days later, Fat Man was dropped on Nagasaki, killing some 70,000 more.

Now would be an opportune moment for a major party presidential or US Senate candidate to make the new nuclear arms race a top campaign issue. Instead we hear nothing.

The Bulletin of the Atomic Scientists recently moved its Doomsday Clock the closest it has ever been to midnight. Tension in the world is very high. The US has withdrawn from one nuclear weapons treaty after another, from the Iran nuclear deal to the Intermediate-Range Nuclear Forces (INF) Treaty. The last bilateral nuclear weapons treaty between the US and Russia, the New START Treaty concerning strategic nuclear forces, expires next February 5.

As a condition of US Senate ratification of New START in 2010, the Obama administration initiated, and the Trump administration continued, a wasteful and reckless multi-trillion-dollar nuclear weapons “modernization” program. Russia and China have responded with their own nuclear “modernization” programs.

The new strategic arms are hypersonic, traveling at more than five times the speed of sound. Hypersonic speed means countries no longer have time to launch on warning of incoming missiles, but instead must launch a first strike in anticipation that the other side is about to strike. These new strategic arms are dangerously destabilizing.

Modernization also deploys more tactical nukes in conventional forces with the crackpot military doctrine of “escalate to de-escalate.” The tactical nukes would be used to degrade a stronger conventional force that was gaining the upper hand. The wishful thinking is that the conflict could then be de-escalated. But as former nuclear war planner Daniel Ellsberg demonstrates in his “second Pentagon Papers,” his book The Doomsday Machine, once one nuke is launched, they will all launch automatically and that will be the end of us.

The people of Japan know more than anyone how dangerous these weapons are. Since the late 1950s they have been warning the world that nuclear weapons and the human race are on a collision course.

We propose a set of peace initiatives as a way out of this life-or-death emergency. The U.S. should:

* Cut military spending by at least 50%,
* Withdraw from its endless, illegal wars abroad,
* Pledge “no first use” of nuclear weapons, and
* Disarm to a minimum credible deterrent.

On the basis of these tension-reducing peace initiatives, the US can credibly approach the other eight nuclear powers to negotiate complete and mutual nuclear disarmament, an obligation nuclear powers are already charged with under the 1970 Nuclear Non-Proliferation Treaty.

The US would enter disarmament negotiations with world public opinion on its side. In a meeting at the U.N. in 2017, 122 non-nuclear nations agreed to the text of the Treaty on the Prohibition of Nuclear Weapons. The International Campaign to Abolish Nuclear Weapons (ICAN) received the Nobel Peace Prize for that achievement.

Yet very few Americans know about these developments because none of the leaders of the Democratic and Republican parties and none of their presidential or senatorial candidates are addressing this crisis. None of them oppose the US nuclear modernization program. None have proposals to reverse the new nuclear arms race.
None have objected to the use of nuclear weapons in US foreign and military policy. Every time US leaders say “all options are on the table,” the threat of a nuclear strike is implied. The danger of nuclear escalation is always present in the never-ending US wars abroad.

Voting for the lesser evils among these candidates is still voting for the nuclear arms race and endless wars. Voting Green is voting for peace initiatives that offer a way out of this march toward nuclear doomsday. As the great 20th century anti-war leader A.J. Muste famously said, “There is no way to peace; peace is the way.”

The post Reverse the New Nuclear Arms Race appeared first on CounterPunch.org.

Ireland and Slavery: Debating the ‘Irish Slaves Myth’

Counterpunch Articles -

Mural of Frederick Douglass, Falls Road, Belfast. Irish people’s history has embraced both a legacy of identification with the oppressed and, in the propagation of the ‘Irish slaves’ myth, elements of racism. Photograph by Ross – CC BY-SA 2.0

Over recent months, social media in Ireland and the United States has been saturated with claims and counterclaims about ‘Irish slaves’ and a broader controversy about Irish complicity in the transatlantic slave trade. The timing of the ‘debate’ is far from coincidental: a series of false and malicious assertions that the American far Right have pushed aggressively for more than a decade, embraced with enthusiasm by the most conservative elements in Irish America, have grown wings in the new context opened up by the rise of Black Lives Matter. A controversy that has simmered below the surface has taken on new urgency as a fascist Right, emboldened by Trump, finds itself confronted for the first time with a powerful mass movement capable of pushing back. In this context racists in the US are attempting to weaponize a false version of Irish ‘history’ to undermine BLM. In the south of Ireland, especially, a small ‘anti-globalist’ Right sees in the controversy a possibility for redeeming their dismal showing in the recent election by drawing people in on the basis of a mawkish, fairytale nationalism. Socialists and all anti-racists have a responsibility in this situation to counter these lies, to build solidarity with BLM here in Ireland and abroad, and to confront racism wherever it is manifested in Irish society.

For the past seven years much of the burden of refuting these lies has been taken on by the Limerick-based independent scholar Liam Hogan. Working his way meticulously through a complicated historical record, Hogan has published extensively on the controversy, and is the main source for coverage that in recent weeks has appeared in leading newspapers in Britain, Ireland and the US. His research shows that a version of the ‘white slave memes’ first began surfacing in US websites associated with hardcore white supremacists around 2003, but made its way into broader Tea Party circles from about 2013, and has more recently become a staple in the larger far Right that has grown in size and confidence under Trump’s patronage. In the article below – focused on whether the experience of Irish indentures in the 17th and 18th century world is comparable with that of African slaves – and in a second installment that will follow on Irish complicity in transatlantic slavery, I argue that there are problems in Hogan’s approach, and in the framework in which he situates these questions. But it should be acknowledged unequivocally that his work has been critical for arming anti-racists against a deluge of lies and misinformation. All anti-racists are indebted to Hogan for taking this on almost singlehandedly.

Far Right Weaponizes ‘History’

At the outset it’s worth taking the measure of the scale of the problem we confront. The notion of ‘Irish slavery’ may have floated around Irish America in some vague form before 2003, but it does not seem to have played a significant role in underpinning white (Irish American) racism before then, and although by 2013 it had seeped into sections of the Irish American press, it was the white nationalist-influenced far Right rather than these outlets that drove its early dissemination. In Ireland former Irish army officer Sean O’Callaghan’s highly problematic To Hell or Barbados (published in 2001) popularized the belief that Cromwell’s Irish deportees had been enslaved in the British Caribbean, and was almost certainly the source for Gerry Adams’s cringe-worthy 2016 assertion that “through the penal days [the] Irish were sold as slaves.” The careless blurring of the lines between slavery and indenture in O’Callaghan’s work – rooted more in sentimental nationalism than in a commitment to white supremacy – provided an aura of credibility for the ‘Irish slaves’ meme that it would not have otherwise enjoyed.

To the extent that these falsehoods have taken root more broadly across Irish society, it is important to confront them. But it’s also worth pointing out that the surge in discussion of the ‘Irish slaves’ meme on social media does not necessarily indicate growing support for racism. In the south of Ireland at least, where between May and early July 2020 there has been a staggering increase in the volume of such discussion, posts debunking the ‘myth’ seem to exponentially outnumber those defending it. The five most popular Facebook posts over this period related to ‘Irish slaves’ all debunk the myth; only one in the top ten (from the right-wing student publication, The Burkean) defends it, but with only about 2% of the traction of the leading five.[1]

There are of course right-wing activists and individual racists in Ireland who want to weaponize the meme in the same way their American counterparts aim to do – to discredit Black Lives Matter and undermine calls for racial justice – but it’s clear also that there are (confused) BLM supporters and anti-racists even among those who believe at some level that Irish indenture and African slavery were equivalents. The assertion that Africans had been ‘third in slavery’ in the Americas (after indigenous peoples and white indentures) was earlier popularized by the race-conscious editor of Jet magazine, Lerone Bennett Jr., who wrote at the height of the Black Power movement that Africans “had inherited [their] chains, in a manner of speaking, from the pioneer bondsmen, who were red and white”. Clearly Bennett was blurring the lines here in the same way many do today, but he can hardly be accused of soft-pedaling the horrors of racially-based chattel slavery, and was obviously motivated by the hope that history might be put to work in building cross-racial alliances. The point here is not that populist renderings of the past should be given a free pass when they are put to service in building interracial solidarity – they shouldn’t – but that we need to be attentive to the context in which particular versions of the past gain traction, and the present emphasis on marking off the sharp line between indenture and slavery itself reflects important ideological shifts over recent years.

A Mountain of Falsehoods

Let’s get to the heart of the matter: on all the key questions – whether the Irish were ‘the first slaves’, whether their experience as indentured laborers in Britain’s colonies in the so-called ‘new world’ was equivalent to that endured by Africans, or whether ‘indenture’ is just a euphemism for ‘slavery’ (as many on social media want to insist) – the record is crystal clear and there should be no equivocation. These are all false assertions, almost always deliberately concocted at source through flagrant manipulation of numbers and chronology. Hogan has forensically dissected the numbers game here, and demonstrates repeatedly the dishonesty, embellishment and manipulation of the facts underpinning the Right’s disinformation campaign.

There is nothing to be gained by trying to diminish or downplay the suffering endured by seventeenth- and eighteenth-century Irish indentured servants – a point that some of those arguing Hogan’s corner seem oblivious to, and one I will come back to in some detail below. But even if we acknowledge the hardships many of them had to endure, and even if we accept at face value the hugely inflated numbers that those disseminating the ‘Irish slaves’ myth place in circulation, any attempt to conflate or render equivalent their nightmare with Africans’ experience as slave chattels is untenable, and callous in the extreme.

In scale, in duration, in the wrenching and long-term legacy of transatlantic slavery on Africa itself, in the absolutely central role which black slave labor played in generating the colossal wealth that helped fuel Europe’s industrial transformation – in effect launching global capitalism and the modern world as we know it – any attempt to draw equivalences is beyond absurd. By way of illustration, let us take at face value the hugely exaggerated numbers of Irish indentures claimed in one prominent meme, which asserts that over an eleven-year period in the middle of the 17th century the British “sold 300,000 [Irish people] as slaves”. Leaving aside momentarily the question of status, as Hogan points out this number represents six times the total number of Irish migrants to the West Indies for the whole of the 17th century, and almost double total Irish migration to the West Indies and North America in the century and a half after 1630. But leave this aside as well: let’s accept – for argument’s sake – that there were 300,000 Irish laborers condemned to some degree of unfreedom in the plantation societies of the Americas.

Now consider a second figure: over nearly three centuries of the transatlantic slave trade, some four million enslaved Africans died en route to the new world – either while being transported overland to coastal ports in west Africa, at sea during the Middle Passage, or shortly after landing in the ‘new world’. That is, thirteen times as many Africans died in transit to the ‘new world’ as the total number of Irish which the Right insists were deployed as ‘slave’ laborers. If we use instead the numbers for Cromwellian deportations accepted by credible scholars (10-12,000) then this single measure reveals the absurdity of trying to render these experiences equivalent: forty times as many Africans died in transit as the total number of Irish sent into indenture in the 1650s. The scale of the far Right’s intended swindle is breath-taking: that such idiocy can gain any traction at all shows an almost pathological aversion to facing up to the past among those circling the wagons against the renewed challenge to white supremacy.

Marx on Slavery and the Birth of Capitalism

Marxists understand the centrality of African slavery to the making of the modern world in ways that others, including liberal defenders of the present social order, miss or deliberately evade. For Marx himself, bloody conquest in the Americas (almost everywhere involving genocide against indigenous peoples) and chattel slavery on a scale the world had never previously seen were twin cornerstones of a newly emerging capitalism that would, over a remarkably short period in historical terms, bring under its ambit diverse and far-flung societies across the globe. “The different momenta of primitive accumulation [of capital] distribute themselves now,” he wrote in the first volume of Capital, “over Spain, Portugal, Holland, France, and England. In England at the end of the 17th century, they arrive at a [systematic] combination, embracing the colonies, the national debt, the modern mode of taxation, and the protectionist system. These methods depend in part on brute force, e.g., the colonial system.” More broadly, he insists:


The discovery of gold and silver in America, the extirpation, enslavement and entombment in mines of the aboriginal population, the beginning of the conquest and looting of the East Indies, the turning of Africa into a warren for the commercial hunting of black-skins, signalised the rosy dawn of the era of capitalist production. These idyllic proceedings are the chief momenta of primitive accumulation. On their heels treads the commercial war of the European nations, with the globe for a theatre.[2] [emphasis added]

The leading African American scholar-activist W. E. B. Du Bois, deeply influenced by Marxist materialism at the height of his intellectual powers, expanded on this argument from the vantage point of the early 20th century, when the global system that Marx identified in embryo had developed into mature form: “That dark and vast sea of human labor…the great majority of humankind,” he wrote in Black Reconstruction, “shares a common destiny…despised and rejected by race and color, paid a wage below the level of decent living[.] Enslaved in all but name,” he insisted, they gather up raw materials “at prices lowest of the low, manufactured, transformed, and transported,” with the resulting wealth “distributed and made the basis of world power…in London and Paris, Berlin and Rome, New York and Rio de Janeiro”. Written in 1935, at the height of the Great Depression, one could hardly find a more fitting depiction of the world we inhabit today, marked by exploitive relations between global capital and a racially stratified, multinational labour force. And of course it is impossible to understand the deep resonance that protests in the US have found across the globe without understanding the ways in which police violence – directed disproportionately at this ‘dark and vast sea of human labor’ – reinforces the wider system of exploitation that Marx and Du Bois identified.

This approach to understanding the centrality of slavery serves up a crushing riposte to far Right apologists, but it also marks off an alternative framework from which to understand the evolution of chattel slavery in the Americas – one that captures complex aspects of its development missing from the debate thus far, and perhaps excluded by the way it has been framed by Hogan and others. In Ireland the enduring legacy of the ‘revisionist’ reappraisal of the past is evident in writing on indenture and, more obviously, on Irish complicity in the transatlantic slave trade. Fearghal Mac Bhloscaidh has written perceptively of the ideological thrust of the revisionist project in Ireland: the “main current dominating Irish historiography,” he asserts, can be best understood as an “idiosyncratic Irish manifestation of a wider liberal defence of power”, which “employs a vulgar empiricism and constrictive ontology to prohibit a radical reading of the past or an awareness of history as process.”[3]

Framing Slavery and Indenture

The salience of this historiographical context for framing discussion of Irish indenture and its relationship to African slavery is obvious. Though the terrain of this discussion has been profoundly shaped by the regressive trends identified by Mac Bhloscaidh, this background will go unnoticed by many who happen upon the ‘debate’ online or in the press. The convergence of the revisionist sensibility in understanding the Caribbean is most obviously manifested in Donald Akenson’s lectures on the Irish presence in 17th century Montserrat, collected in a volume published under the suitably provocative title If the Irish Ran the World. Drawing on a recent literature that simultaneously denies Ireland’s colonial past and upholds the notion of an ‘Irish empire’, Akenson asserts that the smallest of the main Leeward islands constituted “Ireland’s only 17th and 18th century colony”. Indenture figures in his account as a story of rational-choice ‘upward mobility’; the text strains to obliterate the distinction between Irish (mainly though not exclusively Anglo-Irish and Protestant) planters and indentured servants, and asserts that indenture “was so very different from black slavery as to be from another galaxy of human experience”. [emphasis added] Though Hogan is mostly forthright in acknowledging the misery that attended indenture,[4] traces of Akenson’s cheerier rendering are evident in the present debate.

Three key elements prominent in earlier writing on indenture in the Caribbean and British North America are obscured in the way the recent controversy has been framed. First, an earlier historiography acknowledged, significantly, that indenture was one in a series of solutions by which planter elites attempted to solve the problem of labor scarcity in their ‘new world’ colonies: in short they could not reap profits selling staple crops on the transatlantic market without an adequate supply of labor, and for a while – in the earliest period of Anglo/European settlement – white indentured labor seemed a viable option. As the distinguished anthropologist Sidney Mintz put it, “planters were, in one sense, completely without prejudice [and] willing to employ any kind of labor…under any kind of arrangements, as long as the labor force was politically defenseless enough for the work to be done cheaply and under discipline”. In parts of the Caribbean Irish indentures seem to have played an especially prominent role early on (rooted in the Cromwellian deportations) but in North America indentures were drawn from England (mainly), Scotland and Ireland, without any obvious distinction being made between them. A second element, now largely absent from the ‘debate’, is the sense that the turn to African slavery as the foundation for plantation labor was a contingent development, and not preordained by history. A complex convergence of circumstances pushed Anglo planters toward a full commitment to black slavery: improved conditions in Ireland and Britain meant the flow of ‘voluntary’ migrants dried up; the turn from small-scale tobacco cultivation to sugar in the Caribbean (and from hopes of outright plunder to tobacco export further north in Virginia and the Chesapeake) generated an exponential increase in the planters’ labor requirements.
Finally, British entry into and then domination of the transatlantic slave trade brought a sharp decline in the costs of deploying slave labor, to the point where African slaves-for-life could be deployed on the plantations more economically than white indentures.[5]

Labour, Identity and the Construction of Race

In the face of the malicious attempt to muddy the waters around the horrors of African slavery, it is important to point out the very real, and substantial, differences between indenture and slavery for life. But the evolution of chattel slavery, its place in an evolving system of labor exploitation, the declining importance of indenture – indeed any sense of change over time – is almost entirely absent from the narrow framework of national identity through which discussion is now focused. And what do we lose in this shift? Most significantly we miss the possibility of grasping what the African American historian Barbara J. Fields has called the incremental “construction of race”.[6] If it is true, as is now widely accepted, that ‘race’ is a fiction, and that ‘whiteness’ was deliberately constructed as a way of marking off racial boundaries and securing the loyalty of white laborers to their social betters, then the new world plantation societies of the 17th and 18th centuries served as the setting in which this process was initiated and then consolidated.

Here Akenson’s assertion that indenture and chattel slavery were ‘galaxies apart’ – or his insinuation that Irish indentures were simply slaveowners-in-the-making who had not yet found their true calling – is misleading, and marks a retreat from a more dynamic and potentially productive framework. Those pushing the ‘Irish slaves’ myth suggest that ‘indenture’ is synonymous with slavery, but there are important distinctions: most (though not all) indentures from England, Scotland and Ireland were voluntary; in return for transatlantic passage and a modest package of land and tools, etc. at the end of their indentures, they signed away their freedom for terms ranging typically from 4 to 10 years. Historians will continue to debate the evidence from colonial North America and the Anglo Caribbean: there is a basis for emphasizing the ‘special unease’ that planters exhibited toward their African laborers, and the indications that ‘racial’ demarcation between African and ‘Christian servants’ [i.e. whites] was underway from the outset. But there is countervailing evidence, especially from colonial North America, that points to a period in which ‘race relations’ at the bottom were more fluid. It was not uncommon during the first half of the seventeenth century for Africans, Europeans, and indigenous ‘Americans’ in the colonial Chesapeake to work alongside one another, and even to share the same living quarters. No sharp racial division of labor had yet emerged to prescribe which work would ‘belong’ to a particular group. Contrary to the assumption that racism has always divided blacks and whites, unfree laborers of all ‘races’ in the early seventeenth century Chesapeake seemed “remarkably unconcerned about their visible physical differences”.[7] Their lives intersected in many ways, and there is clear evidence that they shared not only living quarters and daily toil, but close personal ties as well.
From the fragmentary record we know that at least some ‘white’ laborers were conscious of the common lot they shared with blacks: “We and the Negroes both alike did fare,” one servant-poet wrote: “Of work and food we had an equal share.”[8] An investigation into the death of a Dutch servant at the hands of his master in 1647 illustrates the sense of vulnerability shared among the ranks of the unfree. A plantation overseer testified that when he visited the servants’ quarters just after hearing of the death of one of their co-laborers, they all

sate very mallanchollye in the quartering house, and [the overseer] asked them what they ayled to bee soe mallanchollye. The Spanyard made answer and said Lord have mercy upon this boye hath been killed by b(l)owes, his conscience told him. Tom Clarke said Lord have mercy upon us that ever it was my hard fortune to come to this countrye for, if this bee suffered, it maye bee my turne to morrow or next daye. The Negro said Jesus Christ my mayster is not good. And they all wept bitterlye.[9]

The ‘construction of race’ in the vortex of ‘new world’ conquest, exploitation and inter-imperial rivalry can also explain the fundamental distinction between the form of slavery that took shape in the seventeenth and eighteenth-century Atlantic world and every other form of slavery that preceded it. What was distinct about the system that provided the foundations for modern capitalism was its elevation of racial distinctions: though the timing of its development varied here and there, black skin became everywhere in the Americas a marker for slave status. Unlike slavery in the ancient world, where ethnicity and national origin seem to have barely figured in denoting the status of human beings, what developed in the Americas was a racially-defined system of exploitation. And the stigma attached to race, the comprehensive system of racist ideology concocted to rationalize and justify the selective enslavement of black Africans (notions of black inferiority, white superiority), gave grounding and coherence to a set of deeply embedded racial assumptions that outlived slavery, and which very obviously retain their destructive power well into the 21st century.

Wherever historians come down on what the records show for the formative period in race-making, there can be little dispute that racial boundaries became more rigid over time, and that this hardening of the color line was reflected in evolving law and custom. In British North America the critical turning point in the fastening of racial hierarchy – the invention of ‘whiteness’ – came as a direct response by colonial elites to a multi-racial rebellion driven by the ‘lower orders’.[10]

Finally, understanding both slavery and indenture as evolving solutions to the labor problem rather than as mere reflections of a hierarchy of identities can explain how, as Kerby Miller has suggested, “the records of almost every major slave revolt in the Anglo-American world – from the West Indian uprisings in the late 1600s, to the 1741 slave conspiracy in New York City, through Gabriel’s rebellion of 1800 in Virginia, to the plot discovered on the Civil War’s eve in Natchez, Mississippi – were marked by real or purported Irish participation or instigation”.[11] The planters’ dread of rebel combinations between the Irish poor and African slaves – like their more general tendency to perceive slave plots all around them – was more often based on paranoia than firm evidence. “What worried masters in Barbados, above all,” Hilary McD. Beckles observed, “was Irish involvement in slave revolts.” In most cases “fear outran fact in this regard,”[12] he notes, but if it is true that the conditions which black slaves and white indentures worked and lived under were ‘galaxies apart’, how can we explain the persistence of this deep anxiety among Anglo slaveowners throughout the plantation societies of the Americas? Ironically, as Miller points out, “it was not the much-maligned Irish nationalists of the 19th and early 20th centuries who first constructed the image of Ireland’s Catholics (and [their] Protestant allies) as inveterate rebels against political and social authority. Rather, it was earlier Protestant (and Catholic) conservatives and counter-revolutionaries, for whom ‘essential’ (or ‘wild’) ‘Irishness’ seemed the inveterate enemy of the hierarchical systems, deferential habits, and genteel norms that maintained the prevailing, unequal distributions of rights, property, and power”.[13]

In obvious ways, the terms of the debate around slavery and indenture have been outside the control of Liam Hogan and others who have stood up to refute the intense disinformation campaign mounted by the far Right and a softer element of right-leaning, sentimental Irish nationalists. Clearly, they have performed an important service in deploying the historical record against sordid attempts to make light of the horrors of chattel slavery. But the narrow terms in which these issues have, until now, been discussed reveal also the persistent influence of conservatism in Irish history writing, itself a variant of a more general retreat among historians – away from an engaged social history that attends to the complex relationships between race, class and power and toward a fixation with culture and identity. This can obscure as much as it reveals. Indentured servitude and racially-based slavery for life were not equivalents, nor were they comparable in terms of scale or importance in generating the economic foundations that would launch global capitalism. But they were related forms of exploitation at the birth of the modern world, and the best way to honor the victims of both is to commit to rebuilding the rebel combinations that flickered, tentatively, across the color line among those at the bottom. The odious racism that the Black Lives Matter Movement confronts today has its origins in that harsh world, and it’s time we buried that part of our past.


1. Buzzsumo app, 29 June 2020: data in author’s possession.

2. Marx, Capital, Vol. 1, Ch. 31: “Genesis of the Industrial Capitalist”, available online at https://www.marxists.org/archive/marx/works/1867-c1/ch31.htm.

3. Mac Bhloscaidh, “Objective Historians, Irrational Fenians and the Bewildered Herd: Revisionist Myth and the Irish Revolution,” Irish Studies Review (April 2020): pp. 2, 6.

4. On this point see Liam Hogan, Laura McAtackney and Matthew C. Reilly, “The Irish in the Anglo-Caribbean: Servants or Slaves?,” History Ireland (March-April 2016), pp. 18-22.

5. On these interlinked developments see David W. Galenson, “The Rise and Fall of Indentured Servitude in the Americas: An Economic Analysis,” Journal of Economic History 44:1 (Mar. 1984), pp. 1-26. On parallels in the North American context see Kelly, “Material Origins of Racism in North America,” available at https://www.academia.edu/10195740/Material_Origins_of_Racism_in_North_America.

6. See Fields’s important discussion of this issue in Barbara Jeanne Fields, “Slavery, Race and Ideology in the United States of America,” New Left Review 1/181 (May/June 1990), available online at https://newleftreview.org/issues/I181/articles/barbara-jeanne-fields-slavery-race-and-ideology-in-the-united-states-of-america.

7. Kenneth Stampp, cited in Winthrop D. Jordan, “Modern Tensions and the Origins of American Slavery,” Journal of Southern History 28:1 (1962): 21.

8. Jacqueline Jones, American Work: Four Centuries of Black and White Labor (1999), p. 76.

9. North Carolina Deeds, Wills, etc., 1645-51, cited in J. Douglas Deal, Race and Class in Colonial Virginia: Indians, Englishmen, and Africans on the Eastern Shore During the Seventeenth Century (1993): pp. 117-118.

10. On the importance of Bacon’s Rebellion (1676) in galvanizing Virginia elites and solidifying racial hierarchy in the colonial Chesapeake, see Edmund S. Morgan, American Slavery, American Freedom: The Ordeal of Colonial Virginia (1975), and Theodore Allen, “‘They Would Have Destroyed Me’: Slavery and the Origins of Racism,” Radical America (May-June 1975): pp. 41-63.

11. Kerby Miller, “Epilogue: Re-Imagining Irish and Irish Diasporan History,” in Ireland and Irish America (2008).

12. Hilary McD. Beckles, “A ‘riotous and unruly lot’: Irish Indentured Servants and Freemen in the English West Indies, 1644-1713,” William and Mary Quarterly 47:4 (Oct. 1990), p. 517. On indenture and cooperation between Irish indentures and African and creole slaves in the British Caribbean see also Aubrey Gwynn, “Indentured Servants and Negro Slaves in Barbados (1642-1650),” Studies: An Irish Quarterly Review 19:74 (Jun. 1930), pp. 279-294. As a corrective to Akenson’s story of ‘upward mobility’, Gwynn reminds us (284) that although after the term of indenture “had been completed, the servant was free, and might be allotted land on the island. But not many lived to see the day”. Mortality on Caribbean-bound ships was high, a fact that seems to be missing from recent discussions of indenture. In Sugar and Slaves: The Rise of the Planter Class in the English West Indies, 1624-1713 (1972), for example, Richard S. Dunn reports that on one ship eighty of 350 passengers had died of sickness by the time it arrived in port in Barbados in 1638.

13. Miller, Epilogue, p. 23.

The post Ireland and Slavery: Debating the ‘Irish Slaves Myth’ appeared first on CounterPunch.org.

Trouble in Paradise Valley

Counterpunch Articles -

Paradise Valley, Montana. Photo by George Wuerthner.

Paradise Valley, Montana, is aptly named. The Yellowstone River flows north to Livingston, Montana, framed by the Absaroka Mountains on the east and the Gallatin Range on the west. It’s one of the most stunning landscapes in the entire West.

Due to its location immediately adjacent to Yellowstone Park and the Absaroka Beartooth Wilderness, Paradise Valley has extremely high wildlife values. The Yellowstone River itself is a “blue ribbon” trout stream, and the adjacent uplands are frequented by elk, mule deer, whitetail deer, bighorn sheep, moose, grizzly bears, black bears, wolves, and other charismatic wildlife.

Some 49% of the valley is privately owned, primarily as ranchland. However, as in other amenity-driven economies in the West, much of Paradise Valley has been subdivided into small parcels and rural sprawl.

Cattle in Paradise Valley. Photo by George Wuerthner.

A recent report on ranching in Paradise Valley, Montana, by the Property and Environment Research Center (PERC), based in Bozeman, Montana, was featured in an article in Mountain Journal.

PERC is funded by corporations and others to promote private property ideas. PERC has long championed private lands and resource development, including the privatization of public lands like Yellowstone National Park, as part of what it calls Free Market Environmentalism. This advocacy of private ownership as a solution to environmental issues is critical to understanding its perspective. PERC recently published Elk in Paradise: Conserving Migratory Wildlife and Working Lands in Montana’s Paradise Valley.

As is typical of advocates of the Free Market, they start with several flawed assumptions. One is that all resources would be well managed, and ecosystem values better protected, if they were privately owned. The flaw in this idea is easy to see when you view the clearcuts on private timberlands owned by corporations like Weyerhaeuser in places like western Oregon and Washington. The same is true of private grazing lands, which are often in worse ecological shape than adjacent, ecologically similar public land parcels.

With regard to ranching, PERC articulates what it sees as the benefits of private livestock operations, or what it calls “working landscapes” (as opposed to those lazy, unemployed lands like parks and wilderness). However, it conveniently ignores the many ecological costs of livestock production that are externalized or unaccounted for. You can read the report to find out what PERC considers the values of ranching. Here, I want to discuss the negative impacts of livestock grazing and “working landscapes.”

PERC reports that some 34 private landowners responded to a survey about ranching in the valley and the “problems” presented by wildlife—in particular, elk. Ranchers worry about the transmission of brucellosis from elk to their cattle, a disease that can cause abortion in livestock. The fear of brucellosis is one reason bison are shot when they leave Yellowstone Park. Despite the fact that there is no documented instance of brucellosis transmission from bison to cattle under natural conditions, the annual slaughter continues.

In their report, PERC has nine suggestions for improving tolerance for elk in the valley and retaining private ranchlands as an integral part of the landscape. Most of these suggestions include the public subsidy of ranch operations based on the presumption that improving the economics of the livestock operations will reduce the likelihood of subdivisions and subsequent sprawl.

For instance, PERC suggests an elk fund to compensate landowners for forage consumed by elk (though nothing, of course, about the forage consumed by ranchers’ cattle on public lands, which often drives elk onto private lands), along with more publicly funded promotion extolling the wonders of private-lands ranching and its “benefits” for wildlife, among other “incentives.”

Elk by Dome Mountain, Paradise Valley, Montana. Photo by George Wuerthner.

It also ignores the fact that none of these economic incentives has ever worked, and this is one of the real failures of this approach. Consider California, home to the most expensive and most productive Ag land in the country, where agricultural properties are regularly subdivided into housing tracts.

Subdivision sign in Paradise Valley. Montana. Photo by George Wuerthner.

What PERC and other champions of ranching ignore are land prices. Once land values exceed the value of Ag production, a point passed in Paradise Valley long ago, you can’t maintain a “working ranch,” because part of keeping a ranch going is the ability to buy additional land at a price you can pay off with Ag production. That is impossible in Paradise Valley, and in nearly all of western Montana, at this point.

There is some interesting statistical data that illustrates trends seen all around the Greater Yellowstone Ecosystem. All larger landowners in Paradise Valley are involved in livestock production. However, of the 34 landowners who responded to the survey, more than half (18) derive the majority of their income from sources outside of livestock production, while only 13 obtain 80-100% of their income from ranching.

This trend illustrates the rise of the “amenity rancher”: wealthy individuals who buy up ranches as an investment, to showcase their wealth, for the recreational opportunities of fishing and hunting, or simply for the status of owning a “spread” in Montana. Paradise Valley ranchlands are owned by everyone from movie actors like Jeff Bridges and musicians like John Mayer to wealthy individuals like Arthur Blank, of Home Depot, and Maryanne Mott, who owns the 15,000-acre B Bar Ranch in Tom Miner Basin.

Many of these amenity ranchers likely continue to produce cattle to maintain the low Ag land tax benefits.  They also tend to be more receptive to placing conservation easements on their lands.

In many ways, this change in ownership is positive if you value native biodiversity. Most of these amenity ranchers do not have to graze or otherwise manage their lands as intensively as those whose sole income derives from livestock production. They are often more tolerant of predators and will spend money on improving fish habitat and on other projects designed to improve the overall ecological health of the land. Most importantly, they are less likely to subdivide their properties and more likely to place conservation easements on them.

This may be one reason that as land ownership in Paradise Valley has shifted toward amenity ranching, the number of elk has risen. Keep in mind that Colorado, the most subdivided western state, has the largest elk herds in the country. As older ranch operations have been acquired by amenity buyers or sold off for housing tracts, tolerance for wildlife like elk has grown.

Furthermore, the idea that subdivisions are always detrimental ignores geography. Why? Because most of the land that has been subdivided hasn’t been elk habitat, or habitat for much of anything else except exotic grasses and animals, for a century. I acknowledge there are ranches with vital habitat, mostly the ones up against the mountain foothills. However, these are a small percentage of the ranches that might hold critical elk habitat.

The idea that we are going to “save” ranches in Paradise Valley or anyplace else by promoting livestock production is absurd. Nearly all communities in the West are built upon former Ag lands, yet the belief persists that agricultural production can preclude housing tracts.

Unless rich people like Arthur Blank buy them, all these ranches will go out of business. You cannot run cattle and make a profit in the arid West. The only way to make a profit is by externalizing your costs to the public. And it is these externalized costs that PERC and other advocates of “working landscapes” typically ignore.

One of the common faulty assumptions is that open space is the same as good wildlife habitat. It is not. A coal strip mine is “open space,” but no one would say it’s good for wildlife. That is an extreme example, but it is germane. A hayfield or a wheat field is poor habitat, even if you occasionally see a deer in it. Hayfields are dominated by one or a few exotic forage plants like smooth brome or alfalfa. Exotic grasses do not support native insects, and therefore support fewer native birds, amphibians, reptiles, and small mammals. Sure, you might see elk or deer in these fields, but elk and deer are not biodiversity.

Hayfield in Paradise Valley. Photo by George Wuerthner.

Agriculture by any real accounting is far more destructive to the West’s landscape than subdivisions, if for no other reason than that it dominates the land. Get up in a plane and fly over the Gallatin Valley, and what you see is not subdivisions but mostly Ag fields, chiefly hay and wheat. Collectively these fields represent a biological degradation far worse than your average neighborhood.

Although conducted more than two decades ago, a GAP analysis of Montana found that all human development, including highways, housing tracts, and all the rest, occupied only 0.17% of the landscape. In contrast, irrigated fields, primarily for hay production, occupied approximately 5% of the state. Throw in the rangelands grazed by livestock, and perhaps up to 70% of the state is used for livestock production.

Thus, in simple numbers, livestock production’s physical footprint far exceeds that of subdivisions and rural sprawl. I want to be precise: I am not a proponent of development, but I think PERC makes the opposite mistake of assuming that because subdivisions have impacts, ranching is more desirable.

Years ago, I did a quick survey of my urban block in Livingston and noted all the bird species there. Just in numbers and species of birds, there were far more than in a similar-sized hayfield. In my yard alone, I have native plants like chokecherry, serviceberry, snowberry, mock orange, plains cottonwood, and mountain ash—all of which are used by native insects and thus support numerous bird species. Larger mammals like elk are not going to live in urban settings, but one should not assume that elk alone represent “wildlife” or “biodiversity.”

Most of the Gallatin Valley is a biological desert created by agricultural lands. Photo by George Wuerthner.

The center of Paradise Valley, like the nearby Gallatin Valley where Bozeman is located, hasn’t been home to elk for at least a hundred years. Subdivisions, with landscaping that often includes at least some native plants, collectively INCREASE biodiversity compared to a hay or wheat field. Hayfields are biological deserts.

That is critical to understanding the situation. The fact is that ranching is highly destructive. I’m not suggesting subdivisions are better, but let’s not fool ourselves: cattle grazing is one of the most damaging land uses in the West. According to the USGS, ranching is the most significant contributor to species endangerment in the West.

In my book Welfare Ranching, I list hundreds of species that are in decline due to livestock production. That review includes 159 species, listed or candidates for listing under the Endangered Species Act, for which the US Fish and Wildlife Service noted ranching as a contributing factor in their decline.

I would posit that people dislike subdivisions because they represent change, and they do bring other issues like traffic congestion and higher taxes. But they are not necessarily worse for wildlife and ecological processes. The ONLY thing that works is zoning, as in Oregon. I’ve seen it work wonders there. But ranchers are some of the biggest opponents of land use zoning and planning. There are exceptions, of course, but as a rule, most ranchers hold the absurd attitude of “it’s my land and I can do what I want with it.”

And while subdivisions sometimes damage riparian areas, far more riparian habitat is destroyed by livestock than by housing tracts. Riparian areas are the most critical habitat in the arid West. I recently canoed the Wild and Scenic stretch of the Missouri, where for more than 100 miles the riparian habitat is nearly gone. Why? Not houses, but cows. I can attest that the riparian habitat in Paradise Valley, with all its subdivisions, is far better than on the Missouri, where there are no houses—just cows.

And then there is water pollution from cows. Subdivisions have leaky septic tanks, of course, but the average cow produces as much waste daily as 50 people. I have no idea how many cows are in Paradise Valley, but I am willing to bet they contribute more to water pollution than leaky septic tanks do.

Mill Creek, a major tributary of the Yellowstone River, completely dewatered for hay production. Photo by George Wuerthner.

Another ecological impact ignored by PERC and most “working lands” advocates is the loss of aquatic habitat. Nearly every stream that enters the Yellowstone River in Paradise Valley is dewatered annually for irrigation to grow the hay consumed by cattle in winter. This water removal harms the trout for which the Yellowstone is renowned (a fishery that survives mostly because most of the river’s summer flow comes from Yellowstone Park, where there are no withdrawals for irrigation) and affects other wildlife that depends on aquatic insects, such as birds and bats.

While the ranchers in the PERC report worry about the transmission of brucellosis to their livestock, the report ignores the fact that bison are killed annually primarily to protect local ranchers from brucellosis. Domestic animals also transmit diseases to wildlife: domestic sheep, for instance, can transmit pneumonia to their wild cousins and are responsible for the loss of bighorn sheep herds throughout the West.

Bison in Paradise Valley. Photo: George Wuerthner.

Then there is the social displacement of native species like elk by domestic livestock. Elk avoid areas actively being grazed by cattle. This is especially critical when domestic cattle are moved from private lands to summer pastures on public lands: cows come in; elk move out. Since one must presume elk are found in a particular area because the land meets their biological needs, they are often displaced into more marginal habitat.

Livestock are also the primary vector for the spread of weeds. And many public agencies and publicly funded Ag support systems like County Extension Services plant exotic grasses like crested wheatgrass that are favored by livestock to the detriment of native wildlife.

Lest we forget, many ranchers are intolerant of native predators. Wolves, coyotes, cougars, and bears are killed by Wildlife Services, a federal agency that uses taxpayer dollars to reduce predators for the livestock industry. The loss of predators has many adverse ecological effects. Predators help to maintain healthy ungulate herds by eliminating sick animals, including those infected with Chronic Wasting Disease, which is ultimately fatal.

Fences are another problem with livestock production. Fences block migrations and serve as perches for avian predators of sage grouse and other nesting birds. Collisions with fences are responsible for up to 30% of sage grouse mortality in some areas.

Native rodents like prairie dogs are poisoned to reserve more grass for livestock, even though prairie dogs are considered a keystone species. While prairie dogs are not found in Paradise Valley, in other parts of the West massive poisoning programs have been implemented to reduce or eliminate them, to the point where they are in danger of extinction. For example, on the Thunder Basin Grasslands in Wyoming, the Forest Service is preparing to poison most of the prairie dog colonies there.

Water developments for livestock often tap into springs and seeps, funneling the water to troughs for cattle to the detriment of native species from birds to frogs to snails that depend on these natural water holes.

Frogs and other amphibians rely on wet meadows, seeps and springs for habitat–much of which is destroyed to facilitate livestock production. Photo by George Wuerthner.

Then there is the destruction of biological soil crusts, which are critical to precluding the spread of cheatgrass, a highly flammable exotic grass taking over the sagebrush ecosystems of the West. Cattle and sheep destroy these crusts because they graze at higher densities than are typically found in native herds, facilitating the spread of cheatgrass.

In many parts of the West, federal agencies are destroying juniper and other native plant communities in part to favor livestock grazing.


If your goal is to preserve wildlife biodiversity and ecological values, the best way to do this is not to promote livestock or “working landscapes.” For all the money we annually pump into the Ag economy, including programs like the Conservation Reserve Program, predator control, and all the other “assistance” we provide to ranchers, a far wiser choice would be to spend that money buying private lands. For all its faults, federal management is typically better overall than what we see on private lands. At the very least, all citizens, in theory, have a say in the management of these lands.

In Paradise Valley, tens of thousands of acres have been acquired over the years, including places like the Dome Mountain Wildlife Management Area, the Slip and Slide Ranch, and other lands that ultimately have a higher value to wildlife than private ranchlands.

Beyond the public acquisition of private lands, the implementation of statewide, or at least county-wide, zoning can direct growth toward areas with fewer impacts and help preserve valuable features like wildlife migration corridors, wetlands, winter range, and other lands critical for wildlife.

Oregon has statewide zoning. It directed all communities, including small towns, to develop an “urban growth boundary” within which future development will occur. Urban growth boundaries not only prevent rural sprawl but also help communities and save tax dollars by locating growth close to existing infrastructure. Since you know where new homes and stores will be built, you can plan for sufficient roads, parks, and schools to accommodate that growth.

I have friends who tell me that in “conservative” Montana, you could never get zoning ordinances passed, in part because of resistance from ranchers. The irony is that Oregon’s law was passed in 1972 by a Republican governor and legislature, at a time when the state was dominated by natural resource extraction industries like logging and ranching.

One of the other factors overlooked by those promoting livestock production as an antidote to sprawl is that ranchers (who always claim to love the land—until they sell it off) can place their property in a conservation easement. Alternatively, they can sell to a buyer who is not interested in subdividing. They do have a choice.

In any event, the idea that promoting ranching is good for biodiversity is a flawed assumption. So is the belief that we can preclude subdivisions by subsidizing agriculture, especially ranching, while ignoring its real costs.

Time to dispel the myths and expose that the emperor has no clothes.

The post Trouble in Paradise Valley appeared first on CounterPunch.org.
