Does capitalism make us unhappy? No, but also yes

Money of various denominations and countries

Americans are more unhappy than ever. Is capitalism to blame?

If you ask social media, the answer is obvious. Polls show a similar shift in attitudes, especially among Gen Z: 54% of young people view capitalism negatively, according to a 2021 poll.

But if we’re going to make this argument, it’s important to get the details right. Why, specifically, is capitalism bad for human happiness?

The simplest explanation is that capitalism gives rise to enormous inequality and forces us to struggle for survival, and that makes people miserable. But that hypothesis might be a little too simple.

Massive inequality, poverty and precarity aren’t unique to capitalism. They’ve been features of every society since the dawn of agriculture. Does this imply that no one has ever been happy, except for a tiny minority of the privileged?

The data don’t support such a stark conclusion. If you look at a worldwide survey of happiness by country, it’s true that the richest countries are near the top and the poorer ones are on the bottom. However, happiness doesn’t correlate strictly with GDP or income.

Countries with a wide range of median incomes all cluster around similar numbers. The United States, the richest country in the world, is only the 26th happiest. Some less-wealthy countries, like Costa Rica, surpass us. Mexico, with one-eighth the median income of the U.S., is only two places behind. It appears that wealth doesn’t influence happiness as much as you might guess.

Of course, poverty is harmful. If you can’t afford enough to eat or a roof over your head, you’ll be unhappy. However, inequality isn’t harmful – at least, not intrinsically. It doesn’t hurt you by the mere fact of its existence, like disease or war. If your neighbor is ten times richer than you but is practicing stealth wealth, living in a modest house and wearing non-designer clothes, you’ll never know it.

Inequality only hurts if you’re aware of it. In that case, it naturally inspires feelings of envy (I want what he’s got!) and inadequacy (am I not as good as him?), both of which make us unhappy.

This is an ancient instinct, older than humanity. As a famous experiment shows, even monkeys feel unhappy because of inequality. They’re happy to get a cucumber slice as the reward for a task, unless they see another monkey getting a tasty grape for doing the same job. Then they throw a tantrum.

This is where capitalism comes in. We’re bathed in more advertising than ever, both in the sheer quantity of ads and in their intrusiveness. Marketers spend hundreds of billions of dollars a year to cram commercials into our eyeballs everywhere we look. Capitalism incentivizes this behavior in a way that no previous economic system did.

And that matters, because the purpose of advertising is to make us unhappy. Its goal is to make our lives feel incomplete so we’ll spend money trying to plug the hole. No matter how much you have, it sends the message that you’re falling behind and need more.

Ubiquitous social media also supercharges our ability to peer into other people’s lives. Once, the only people you could easily compare yourself to were your neighbors on the same block. At most, you could read a gossip column or watch a TV show about the lives of celebrities. Now you can see in real time how the richest people on the planet live. That widens the circle of people you compare yourself to, and as the saying goes, comparison is the thief of joy.

Social media can even make us perceive inequalities that don’t exist. We all feel the temptation to curate our lives for social media: to post only about the good parts, and to polish them up and present them as favorably as possible. That can make it seem, when you scroll through your friends list, as if everyone’s life is going great except for yours. It’s comparing someone else’s highlight reel to your cutting room floor.

It goes beyond curation into outright deception. Some “influencers” resort to selective editing and other tricks (like renting a private jet just to pose in the cabin for a photoshoot) to create the false impression that they’re leading a life of luxury.

Both advertising and social media contribute to unhappiness in this way. Capitalism doesn’t just create inequality, it strives to shove it in our faces at every turn, and that does make people unhappy.

Human psychology is such that we tend to be discontented and envious if we have less than our neighbors, and that’s true no matter how much money or how many possessions we have. Capitalism thrives on this mindset, because envy fosters the desire to compete and consume. But human suffering is the raw material that powers it.

Capitalism tells us lies about what makes us happy: more work, more money, more stuff. If you believe those lies, you’ll be running on an endless treadmill, seeking fulfillment through consumption but never finding it, going deeper into debt for no gain. Or, like many white-collar workers, you’ll get caught up in hustle culture, working grueling hours and killing yourself with stress when you already have more than enough for a good life.

The things that truly make human beings happy aren’t for sale. They include autonomy, meaningful relationships, leisure, creativity, and natural beauty. We’ll still need to work, if only to provide for our basic needs, but making more money on top of that doesn’t make people any happier.

Yet millions of Americans believe that happiness comes from being rich. They’re mortgaging their lives in service to this falsehood, encouraged by marketing that stokes feelings of greed and envy, and ruining the planet in the bargain.

The first step in fixing this problem is to recognize the illusion for what it is. All our extravagant consumption is neither a right nor a necessity. We can be perfectly content leading much simpler lives. The sooner we learn that, the happier we’ll be.

The hemorrhaging of Red America continues

[Previous: The lights are flickering in Red America]

Rural Americans are disproportionately white, conservative, and Republican. That pattern holds overwhelmingly true, and it explains much of the political divide in America, even if some experts shy away from the implications.

I make no claims to be impartial about this. But if you can judge a political ideology by anything, you should judge it by the success or failure it produces. When that ideology gets to govern, do its followers thrive, or do they suffer? Do they have stable, prosperous, happy lives, or do they spiral down a vortex of misery?

Let’s consider a new piece of evidence summed up by this headline: “The urban-rural death divide is getting alarmingly wider for working-age Americans”.

As recently as 25 years ago, urban and rural areas had comparable death rates among working-age adults. But since that time, cities have been improving, while rural areas are faring worse and worse. A new report from the Department of Agriculture’s Economic Research Service shows just how wide the gap has gotten:

The report focused in on a key indicator of population health: mortality among prime working-age adults (people ages 25 to 54) and only their natural-cause mortality (NCM) rates — deaths per 100,000 residents from chronic and acute diseases — clearing away external causes of death, including suicides, drug overdoses, violence, and accidents. On this metric, rural areas saw dramatically worsening trends compared with urban populations.

…In 1999, the NCM rate in 25- to 54-year-olds in rural areas was 6 percent higher than the NCM rate of this age group in urban areas. In 2019, the gap had grown to a whopping 43 percent. In fact, prime working-age adults in rural areas [were] the only age group in the US that saw an increased NCM rate in this time period.

Rural areas have higher rates of drug overdose, suicide, alcoholism and other ills than cities, per capita. However, this conclusion holds true even if you exclude those causes and focus only on so-called natural deaths.

Your first thought might be that COVID accounts for a big chunk of the difference, but no. The researchers specifically excluded 2020 from their data set, because they considered it an outlier. However, including COVID would make this already huge gap into an even wider chasm, considering that the reddest parts of America had a COVID death rate almost six times higher than the blue ones.

There are no plagues that are unique to rural America. Red-state residents die from the same causes as residents of blue cities. They just die from them more often:

Among all rural working-age residents, the leading natural causes of death were cancer and heart disease — which was true among urban residents as well. But, in rural residents, these conditions had significantly higher mortality rates than what was seen in urban residents.

Why do rural residents die in larger numbers? A big part of the problem is that, because they’re more spread out to begin with, doctors and hospitals are few and far between. That makes it harder for them to get the medical care they need, both on a regular basis and in emergencies.

That problem is rapidly getting worse, because the health-care system in rural America is running on fumes. With each passing year, more and more rural hospitals are losing money, forcing them to pare back services or close entirely:

A recently released report from the health analytics and consulting firm Chartis paints a clear picture of the grim reality Ryerse and other small-hospital managers face. In its financial analysis, the firm concluded that half of rural hospitals lost money in the past year, up from 43% the previous year. It also identified 418 rural hospitals across the United States that are “vulnerable to closure.”

…According to Chartis, nearly a quarter of rural hospitals have closed their obstetrics units and 382 have stopped providing chemotherapy.

The harm inflicted by hospital closures goes beyond the sick people who are most directly affected. It tears holes in the social fabric. People with means won’t move to places where they can’t get medical care. Young people who want to start families will avoid places without labor and delivery wards. (Some rural states, like Idaho, are losing so many obstetrics wards that they may soon become maternity deserts in their entirety.)

When people don’t want to live in these places, employers will move away because they can’t attract talent. That creates a brain drain, worsening poverty and leading to a death spiral:

While people in rural America are more likely to die of cancer than people in urban areas, providing specialty cancer treatment also helps ensure that older adults can stay in their communities. Similarly, obstetrics care helps attract and keep young families.

Whittling services because of financial and staffing problems is causing “death by a thousand cuts,” said [Michael] Topchik [co-author of the Chartis study], adding that hospital leaders face choices between keeping the lights on, paying their staff, and serving their communities.

The article suggests, hopefully, that Congress could step in to keep the lights on in rural hospitals. The brutal reality is that this isn’t going to happen, because there’s no constituency for it.

As the last few years have demonstrated, most voters living in these white rural enclaves don’t value their own health. More than that, they fight furiously against efforts to improve it. They’re loud supporters of Republicans whose only concern is punishing women for having abortions, or gay people getting married, or transgender people using bathrooms, or immigrants coming to this country to find jobs, or whatever culture-war noise Fox News is telling them to care about today. The politicians they elect do nothing tangible to improve their lives, and they seem content with that.

Meanwhile, the biggest policy proposals that would make a positive difference to rural voters’ lives – like the Medicaid expansion, or COVID vaccination programs, or infrastructure spending, or stronger unions, or even single-payer health care – are Democratic initiatives, and rural voters resist all of them with furious tenacity. As long as they cling to these attitudes, their doctors will keep fleeing, their hospitals will keep closing, and they’ll keep getting sicker and living shorter lives. They’re literally dying of whiteness, and it seems that’s the way they prefer it.

A hole in the sky

[Previous: The eclipse of 2024: why are we lucky enough to have these on Earth?]

I missed the eclipse in 2017. The path of totality wasn’t that close to me, and I had an infant to care for, so traveling was out of the question.

This time was different.

The April 2024 solar eclipse was the last one that’d be visible from the U.S. for decades. Quite possibly, it was the only one that would be within driving distance of me for the rest of my life. I wasn’t going to miss it.

The weekend before, I set out on the long drive north. On the day of, I was camped out in a meadow with a crowd of other sightseers. There were families with kids, young people with dogs, photographers with solar lenses and camera mounts. The sky was clear blue and sunny, with only wisps of cloud. The mood was one of pleasant anticipation.

When the time arrived, the sun looked no different to the naked eye, too brilliant to glance at. But as the minutes crawled by, you could sense something happening. The light dimmed; colors faded. The warmth of a sunny April afternoon cooled into something more like fall.

Through eclipse glasses, the scene was completely different. You could see the dark disc of the moon moving across the face of the sun. It was an eerie sight, all the more so because it was invisible to the unaided eye.

The author, with eclipse glasses on, gazing toward the sky

Yes, I wore my Arecibo shirt

Over the course of an hour, that twilight dimming grew more noticeable, and the crowd’s anticipation increased. Through the filter of the glasses, the sun dwindled. It was a circle with a bite taken out, then a crescent, at last a slender arc.

And then, all at once – totality.

The sun dazzled, then dimmed, then darkened. The shadow of the moon, which had been there all along, suddenly emerged into view like an actor rising out of a trapdoor onto center stage.

Night fell in an instant, as swiftly as if a curtain had dropped over the world. The temperature plunged, and a chilly breeze kicked up. Venus emerged in a twinkle. A reddish sunset glow clung to the horizon.

Where the sun had been, there was a black void surrounded by a ghostly ring of fire, like a burning hole in the sky.

There was a collective gasp. A mass indrawn breath.

The eclipse at the moment of totality, showing the dark sky and the sun as a blurred ring of light

A goad to the imagination

Solar eclipses have been occurring for the entire span of humanity’s existence. I can only imagine what it felt like for ancient people to see this without knowing what it was. There must have been mass panic, weeping and prayer and frenzy, orgies of hedonism and outbreaks of violence. They probably thought the world was ending, and with good reason.

Even I, knowing it was harmless and that the sun would return momentarily, still felt a little frisson, a shiver at the back of my neck. It was impossible not to.

It was a sharp reminder that the sun doesn’t exist for us; it’s not a hanging lamp put there for our convenience. We live on a planet plunging through the dark, whirling among many other celestial bodies all following their own courses. It was tangible evidence of the vast universe out there, that has nothing to do with us.

In those ancient times, when an eclipse ended and the sun returned, there must have been a rush to interpret its meaning, a proliferation of prophets all offering dueling explanations. Many new religions must have been born, and perhaps some old religions died.

I once wrote that my humanism comes from the stars. After seeing an eclipse, I’ve come to believe that religions come from the stars as well. Not in the sense of UFOs and alien astronauts bringing revelations, but in the sense that people’s imaginations have always been fired by dramatic sights in the world around them.

It’s not just eclipses, but comets, meteor showers, planetary conjunctions, constellations: everything in the heavens that seemed strange, significant or noteworthy. We know that the movement of the skies was important for ancient people, to mark the seasons and predict the times for agriculture, if for no other reason.

But it was also a wellspring of creativity and a goad to the imagination. I wonder how much of our mythology originates from people who saw something unusual in the sky and spun a story about it. The ancient Greeks put gods and heroes there, and other civilizations did the same. It’s not a stretch to imagine that myths of more recent vintage about resurrected saviors and heavenly battles of angels and demons may have similar origins.

Knowledge deepens wonder

Back in the present, that sheer, vertical awe only lasted a moment. I said there was a collective gasp – but then people broke out into laughter, cheers and applause. It was an upwelling of ecstasy. Being there, standing beneath that unreal sky, was transcendent in the truest sense of the word.

The spectacular sight of totality was fleeting. In a few short minutes, the sun reemerged. Light flooded into the world. The sky lightened to blue, and the chill faded.

It was an experience I’ll never forget. And it inspired me in another way as well.

Where people once cowered from eclipses or treated them as portents of doom, now we know them for what they are, and we appreciate them more because of it. People came from hundreds of miles around specifically to see this one, because they wanted to be present for it. Some towns, like Burlington, Vermont, saw their population temporarily double. Our greater knowledge deepened our sense of wonder at the majesty of nature, rather than dispelling it.

The eclipse was no longer something to fear, no longer a sign of divine wrath. The crowds treated it as they should have – just an awe-inspiring natural phenomenon, courtesy of the laws of orbital mechanics and the grand clockwork of the cosmos. From fear to awe, from terror to wonder. That’s what science does.

Israel had the world’s sympathy and squandered it

A window with tattered posters listing the names of Israeli hostages taken by Hamas

[Previous: Netanyahu speaks the language of genocide]

There was a moment when Israel had the whole world’s sympathy.

That moment was right after the October 7 attacks, when Hamas massacred hundreds of defenseless civilians and took hundreds more hostage. It was the single bloodiest day for Jewish people since the Holocaust. However, Hamas’ violence was indiscriminate, and the victims weren’t just Israelis. They were from many countries, including Thailand, Argentina, Germany, France, Russia, China and America.

Immediately after that brutal assault, it’s fair to say the world was on Israel’s side. Hamas committed an act of war, and no nation would have blamed Israel for responding in kind. If they’d gone after Hamas in a targeted and proportionate way, they could have had an international coalition to support them.

But Israel’s government has utterly squandered that sympathy. Instead of trying to limit civilian casualties – or even pretending to – they’re engaging in massive and indiscriminate bombing of Gaza. They’ve destroyed homes, hospitals, refugee camps, entire neighborhoods. It’s an unprecedented act of collective punishment.

Even worse, the Israeli military has blockaded Gaza, cutting off food to the Palestinian people. It’s a deliberate effort to create famine, and it’s working:

On October 9, Israel’s defense minister, Yoav Gallant, declared a “complete siege” of Gaza, stating, “There will be no electricity, no food, no fuel, everything is closed.” Since then, the Israeli bombing campaign has destroyed Gaza’s agriculture and infrastructure, and Israel has restricted aid coming from outside the Strip.

…In February, the deputy executive director of the World Food Programme, Carl Skau, announced that one out of every six Gazan children under the age of 2 was acutely malnourished.

That brings us to this week, when Chef José Andrés of the charity World Central Kitchen says that Israel intentionally bombed a convoy of volunteers delivering food aid, killing seven:

“This was not just a bad luck situation where, ‘Oops, we dropped a bomb in the wrong place,'” Andrés told the Reuters news agency, stressing that his team’s vehicles were clearly marked and “it’s very clear who we are and what we do.”

“They were targeting us in a deconflicting zone, in an area controlled by IDF. They, knowing that it was our teams moving on that road… with three cars,” he said, adding that he believed the seven aid workers killed by the strike in Gaza were targeted “systematically, car by car.”

Whatever crimes Hamas has committed, there’s no possible justification for preventing the delivery of supplies to starving people. This, more than anything else, shows that Israel’s plan is to commit genocide against the Palestinian people. I don’t use that word lightly, but there’s no other word that fits.

I agree with Bernie Sanders’ remarks:

“Too many people do not understand that the Israel of today is not the Israel of…20 to 30 years ago,” Mr Sanders told news outlet Crooked Media. “It is a right-wing country, increasingly becoming a religious fundamentalist country where you have some of these guys in office believe that God told them they have a right to control the entire area.”

… “So bottom line, Hamas committed an atrocity in my view, Israel certainly had the right to defend itself, but it did not and does not have the right to go to war against the entire Palestinian people,” Mr Sanders continued. “Two-thirds of the casualties and deaths are women and children. Unacceptable.”

Israel has even lost Senator Chuck Schumer, one of the most senior Jewish elected officials in America and for a long time one of its staunchest defenders. In a blistering speech, he called Netanyahu an obstacle to peace and argued that Israel is making itself a pariah state:

“I also believe Prime Minister Netanyahu has lost his way by allowing his political survival to take precedence over the best interests of Israel.

He has put himself in coalition with far-right extremists like Ministers Smotrich and Ben Gvir, and as a result, he has been too willing to tolerate the civilian toll in Gaza, which is pushing support for Israel worldwide to historic lows. Israel cannot survive if it becomes a pariah.”

(If you don’t know these names: Itamar Ben-Gvir is an Israeli politician who heads a party called Jewish Power; he’s the ideological descendant of Meir Kahane, a Jewish supremacist who called for ending democracy and replacing it with theocratic law based on the Torah. Bezalel Smotrich is a religious nationalist who advocates illegal settlements in both Gaza and the West Bank and expelling all Palestinians who object. Netanyahu has brought both their parties into his coalition, forming the most right-wing government in Israel’s history.)

The bleak irony is that none of this death and destruction is likely to break Hamas’ grip on power. Israel doesn’t even appear to have a plan for accomplishing that. It’s tempting to view their attacks on the Palestinians as motivated by sheer rage. It’s destruction with no goal other than “you made us suffer, so we’ll make you suffer more”.

However, I can’t help wondering if there’s a strategy behind all the bloodshed – a bottomlessly cynical and cruel strategy, but a strategy nonetheless. It’s not about getting rid of Hamas. It may be an attempt to create an us-against-the-world mentality.

It could well be that Netanyahu’s goal is to make Israel – and by extension, all Jewish people – so despised that Jews everywhere will feel as if they have to suppress their own opinions and throw their support to Israel, because they’ll have no other allies anywhere. It allows him to say, “The world will always hate you, and only strong leaders like me can protect you.” Or at least, if this isn’t his goal, he may view it as a beneficial side effect.

War is a potent driver of tribalism; that’s one reason why fundamentalists and fanatics glorify it. It coerces everyone to fall in line and obey, to keep the boundaries between Us and Them sharp and unbridgeable. It tamps down empathy and discourages people from trying to understand different viewpoints. Whether they’re Jewish or Muslim, Israeli or Palestinian, or for that matter American, Russian, evangelical Christian or Orthodox, all warmongers desire a world where everyone outside their own tribe is an enemy to be oppressed or destroyed without mercy. It serves the interests of the worst people on every side.

The religious demand for enforced ignorance

[Previous: That which must not be seen]

In the era of Don’t Say Gay laws, we mostly hear about Christian book-burners – like the eleven (yes, eleven) people who are responsible for a majority of all book challenges in the entire country.

However, whenever minority religions gain power, they prove just as eager to engage in censorship.

Case in point: Last year, a group of Somali Muslim families in St. Louis Park, a suburb of Minneapolis, complained to the public school district about its use of books that featured LGBTQ characters. They didn’t want their kids exposed to anything their religion treats as a sin.

Under Minnesota state law, families have a right to review school materials and make arrangements for alternative instruction if they object. So, the district had no choice but to grant their request:

Hodan Hassan, who has lived in St. Louis Park for 14 years and has four children in the district, said that she was glad when the district granted her request to allow her children to opt out of books with LGBTQ+ characters last week.

“We came to America for religious freedom in the Constitution, and so our kids will have a great opportunity,” Hodan said in an interview. “By granting us and other families the opportunity to opt out of teaching that violates our deeply held religious beliefs, families are able to raise their children according to the principle that they value the most.”

To be perfectly clear, these parents didn’t just want their kids kept out of sex ed classes that discuss alternative ideas of gender or sexuality. They wanted their kids kept out of any classroom lesson or discussion that mentions LGBTQ people in any way. They want to custom-tailor the entire public school curriculum to erase everything they disapprove of.

As a school board member noted, the law as currently written allows for parents to opt their kids out for any reason. It gives unchecked power to naked bigotry:

“The way this law currently reads means that someone can opt out of anything for any reason,” said board member Anne Casey. “If protected classes aren’t excluded, someone could come in and say, I don’t want my child to learn about people of color. I don’t want my child to learn about Jewish people. I don’t want my child to learn about people with disabilities. Those are literally all legal under the current iteration of this law, and that does not sit well with me.”

What these Muslim families obviously haven’t considered is that censorious Christian parents could just as easily use this rule against them, to exclude any positive or neutral mention of Islam from schools.

That very thing happened in Williamson County, Tennessee in 2015:

In seventh grade, kids study world geography and history, including a unit on “the Islamic world” up to the year 1500 A.D. “Williamson County parents and taxpayers have expressed concerns that some social-studies textbooks and supplemental materials in use in Tennessee classrooms contain a pro-Islamic/anti-Judeo-Christian bias,” one school-board member, Beth Burgos, wrote in a resolution. She questioned whether it’s right to test students on the tenets of Islam, along with the state and district’s learning standards related to religion. She also said the textbook should mention concepts like jihad and not portray Islam as a fundamentally peaceful religion.

In the U.K., where religion classes are part of the curriculum, there’s a chronic problem of parents who pull their kids out of lessons on Islam for similar reasons:

But a recent survey from the National Association of Teachers of Religious Education shows there appears to be a growing problem with parents taking their children out of school RE lessons. The findings show that parents are withdrawing children from lessons on Islam, or visits to the Mosque, calling into question their preparation for life in modern Britain.

Recently published research suggests that “withdrawal” has been requested in almost three quarters of schools. More than 10% of those withdrawing are open about the fact that they are doing so for racist or Islamophobic reasons.

The parents who want to yank their kids out of lessons on Islam are trying to protect their own carefully tended bigotry. They know, at some level, that better understanding promotes empathy, and they don’t want their kids to learn anything that would humanize Muslims.

When kids are kept ignorant of Islam – or any other belief system – it’s easier to portray its adherents as subhuman, backwards or violent. Muslim families have every right to object when Christians use that tactic against them. It’s fiercely ironic, then, that Muslim parents are using the same tactic to serve their own ends.

If Muslim parents wouldn’t want Christians to dictate what the curriculum says about Islam, those same parents should understand why they shouldn’t try to dictate what the curriculum says about gay, transgender, queer and nonbinary people.

There are people of every religion who want public schools to reflect their values and their sensibilities, and exclude every idea they disagree with. Trying to appease them all would be an impossible juggling act. There’s no way a school can accommodate the conflicting, incompatible demands of every faith in the world.

What we need is neutrality – or in other words, secularism – where schools present a diversity of viewpoints without endorsing any of them. Under the principle of secularism, parents have the right to expect that public schools not endorse a religious viewpoint the parents don’t agree with. What parents don’t have the right to do is demand that schools not mention any fact that they’d prefer to keep their children ignorant of. Creationist parents can’t demand that evolution be removed from science classes because it offends their beliefs, and Christian nationalist parents can’t expect that history classes make no mention of church-state separation in the Constitution because they object to it.

The same principle applies here. Like it or not, LGBTQ people exist. They vote, pay taxes, buy houses, settle down, fall in love, get married, and raise families. That’s a reality which no homophobic religious believer can wipe away. It’s appropriate for schools to teach about their existence as part of the bigger mission of educating kids about the world we live in.

There’s no right to enforce ignorance on children or anyone else for religious reasons. What’s more, it shows that these parents think their beliefs can’t withstand a challenge. Why else would they be anxious to censor the competition? Apparently, they’re afraid that if their kids find out that there are other ways to live, they’ll immediately abandon the faith they were raised in.

They can’t be confident in either the truth or the value of their religion, if they fear that young people will rush out the door as soon as they know they have a choice. As Daniel Dennett has said, if your faith is so fragile that it can’t survive learning about the existence of people who are different, then your faith deserves to go extinct.

The Christian cult of embryo worship

[Previous: Embryos are people in Alabama, but women aren’t]

In the wake of Alabama’s anti-IVF ruling, the American religious right belatedly realized that they’d overstepped. Even many people who consider themselves anti-choice saw this as a gross intrusion on their rights. (I was surprised to learn that 2% of American babies born each year were conceived through IVF. That’s more than I would have guessed.)

After a firestorm of criticism, the Alabama legislature hastily passed a law to shield IVF clinics from liability. This allowed fertility treatments to resume in the state.

However, the religious right can’t hide from their record. One of the first things Republicans did after taking over the House was to introduce a “Life at Conception Act” that would make single-celled embryos count as people in the eyes of the law. That’s exactly the logic that shut down IVF facilities in Alabama. An earlier draft of the House GOP bill contained a special exception for IVF, which has since been removed.

What’s more, Alabama’s fix sidesteps the question at the heart of this debate: Is a frozen embryo a person or not?

While the new law protects IVF providers from lawsuits and prosecution, it’s silent on this issue. It does nothing to contradict the Christian dominionist judge’s ruling that kicked off this furor. This means that, bizarrely, in Alabama, frozen embryos are legally considered people, yet it’s okay to kill them.

Do embryos have rights?

Anti-choicers frame this as a human rights issue. They’d have us believe that single-celled embryos and undeveloped fetuses are like every other oppressed minority in history.

However, they’re not like every other oppressed minority. Black people in slave societies, Jewish people in antisemitic societies, women in patriarchal societies, and others had an inner humanity which the dominant majority refused to recognize. They suffered and felt despair, rejoiced and felt hope, had dreams and aspirations for the future.

This isn’t true for embryos and fetuses. Carl Sagan and Ann Druyan’s classic essay on abortion explains the evidence:

By placing harmless electrodes on a subject’s head, scientists can measure the electrical activity produced by the network of neurons inside the skull. Different kinds of mental activity show different kinds of brain waves. But brain waves with regular patterns typical of adult human brains do not appear in the fetus until about the 30th week of pregnancy – near the beginning of the third trimester. Fetuses younger than this – however alive and active they may be – lack the necessary brain architecture. They cannot yet think.

It’s not a heartbeat or movement that makes a human body into a person. Even brain-dead bodies have those. What makes someone a person is thought: the complex inner life of a conscious being with a functioning brain. Before an embryo possesses that, it’s not a person, whatever else it is.

In fact, Sagan and Druyan further point out that many past legislators and theologians – including Roman Catholics – didn’t consider a fetus to be human, either:

Neither St. Augustine nor St. Thomas Aquinas considered early-term abortion to be homicide (the latter on the grounds that the embryo doesn’t look human). This view was embraced by the Church in the Council of Vienne in 1312, and has never been repudiated. The Catholic Church’s first and long-standing collection of canon law (according to the leading historian of the Church’s teaching on abortion, John Connery, S.J.) held that abortion was homicide only after the fetus was already “formed” – roughly, the end of the first trimester.

Unitarian Universalist minister William McLennan echoes this point:

For most of the history of the Catholic Church, one did not become a human being or a person until well after conception. Saint Augustine in the fourth century adopted the Aristotelian belief that the human soul didn’t enter the fetus until forty to ninety days after conception. In roughly the same era Saint Jerome emphasized human shape: “The seed gradually takes shape in the uterus, and it [abortion] does not count as killing until the individual elements have acquired their external appearance and their limbs.” The Apostolic Constitutions of the late fourth century allowed abortion if it was done both before the human soul entered and before the fetus was of human shape. Saint Thomas Aquinas of the thirteenth century followed Augustine in not considering the abortion of a non-ensouled fetus to be murder. Pope Innocent III, earlier in the same century as Aquinas, emphasized that the soul enters the body at the time of quickening – when a prospective mother first feels movement of the fetus. When Pope Gregory XIV affirmed the quickening test for ensoulment in 1591, he set the time for it as 116 days into pregnancy, or the sixteenth week.

The Catholic opposition to abortion only began in the late 1800s. What’s more, it was founded on a mistaken belief that dates back to the early days of microscopy: the idea that sperm cells contained a “homunculus”, a tiny but fully formed human being.

The modern anti-choice argument, which insists on personhood from the moment of fertilization, is the most extreme position on abortion that’s ever existed in history. It’s more radical than the prevailing beliefs of eras when women weren’t seen as autonomous people with their own rights. In every sense, it’s a cult of embryo worship.

An acorn is not an oak tree

An embryo or a fetus is a potential person. That is to say, it contains the necessary components to grow into a human body with a human mind. If a long and complex process of development is allowed to occur without interruption, it will become a person in the fullness of time.

But potentiality isn’t the same as actuality. An embryo isn’t a person, just as an acorn isn’t an oak tree, an apple seed isn’t a fruit, a sheaf of wheat isn’t a loaf of bread, and a spool of yarn isn’t a sweater. It’s a basic category error to mistake the preconditions of a thing for the thing itself.

There’s no dispute, even among anti-choice Christians, that an embryo doesn’t possess the bodily structure for human experience. The way they justify themselves is by resorting to the hypothesis of “ensoulment”. This belief holds that at some point, God grants the embryo a supernatural element of consciousness, and that this alone is the source of personhood and moral value.

No test or experiment can detect the soul, so this is an unevidenced assertion. Even if souls existed, when does a fetus acquire one? At the moment of conception? The first trimester? The moment of birth? Theologians through history have postulated different answers, but there’s no way to know. It’s purely a matter of faith.

Such ethereal beliefs can’t be the basis for law in a secular democracy. We have to stick to the facts that everyone can check for themselves.

The eclipse of 2024: why are we lucky enough to have total eclipses on Earth?

A total solar eclipse showing the "diamond ring" effect

Like lots of people, I’m planning to admire the total eclipse next month.

I’ll be heading north, into upstate New York. The path of totality will travel along an arc from Niagara Falls to Plattsburgh. I’m hoping to catch it near the Vermont state line. I’ve got the special protective eyewear, I’ve made hotel reservations, and I’m planning to leave well in advance to beat the traffic. Now I’m just hoping Mother Nature cooperates.

All of us eclipse-chasers are at the mercy of the weather. If it’s cloudy that day, the whole trip will be for nothing. But it’s a gamble I’m willing to take for an almost once-in-a-lifetime opportunity.

Total eclipses aren’t all that rare in themselves. One can be seen from somewhere on Earth about every eighteen months. However, this is the last one that will be visible from my neck of the woods for decades. If I don’t get to see it, it’s going to be a much bigger and more expensive proposition to catch another one a few years down the line.

What “rare Earth” gets right

Creationists love talking about the “rare Earth” idea: the argument that Earth is specially and uniquely fine-tuned to support life. It orbits in the habitable zone, not too close or too far from the sun, which is a stable star without massive flares. We have a regular day-night cycle, a mostly stable axial tilt, a magnetic field that screens out cosmic radiation, and so on. The creationists claim that this is evidence of God’s special favor.

The fallacy of the rare-Earth argument is that it’s an inference based on incomplete data. Just as you can’t compute the probability of a particular hand of cards unless you know what’s in the deck, we have no basis for proclaiming how common Earthlike planets are. Our sample size is too limited (although it’s growing all the time).

We also don’t know if life can exist on planets that are unlike Earth. We have no idea whether life could thrive with different chemistries, or under different conditions, than we’re used to. Could there be silicon-based life in the mantle of superhot planets, or aquatic life on ice worlds that uses ammonia or methane instead of water, or hydrogen-powered life soaring through the skies of gas-giant planets, or plasma-based life drifting in the coronas of stars? It might turn out that an Earthlike planet is one way, but not the only or even the best way, to support a biosphere.

However, the rare-Earth argument seems to be correct on just this one point. By – as far as we know – a complete coincidence, the Moon is about 400 times smaller than the Sun, but also about 400 times closer to Earth. The numbers cancel out almost exactly, allowing the Moon to block the Sun’s disc but not its corona, giving rise to the spectacular sight of fiery streamers haloing a disc of black shadow.
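The near-match is easy to check with a few round figures. Here’s a minimal Python sketch, using approximate published averages for the diameters and distances (not precise ephemeris values; the exact ratios drift over each orbit, which is why some eclipses are annular rather than total):

```python
# Rough figures, in kilometers; round published averages,
# not precise ephemeris values.
SUN_DIAMETER = 1_392_000
SUN_DISTANCE = 149_600_000   # roughly 1 AU
MOON_DIAMETER = 3_474
MOON_DISTANCE = 384_400      # mean Earth-Moon distance

size_ratio = SUN_DIAMETER / MOON_DIAMETER        # ~400
distance_ratio = SUN_DISTANCE / MOON_DISTANCE    # ~390 on average

# Apparent (angular) size goes as diameter / distance, so if the two
# ratios match, the two discs look the same size from Earth.
sun_angular = SUN_DIAMETER / SUN_DISTANCE
moon_angular = MOON_DIAMETER / MOON_DISTANCE

print(f"Sun: {size_ratio:.0f}x larger, {distance_ratio:.0f}x farther away")
print(f"Angular sizes differ by {abs(sun_angular - moon_angular) / sun_angular:.1%}")
```

With mean distances the two discs differ in apparent size by only a few percent. Because the Moon’s orbit is elliptical, that small margin tips either way: total eclipses when the Moon is near perigee, annular ones near apogee.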

A unique moment in cosmic time

If the numbers didn’t mesh so precisely, we wouldn’t get to enjoy this spectacle. The Moon either wouldn’t block the Sun’s entire disc (so we’d only see partial and annular eclipses, not total ones), or it would block the whole Sun, including the corona.

And in fact, it’s only at this moment in cosmic history that we can appreciate this sight – because the Moon is slowly moving further away from the Earth. In 600 million years, give or take, there won’t be total eclipses anymore. If there are still humans (or other intelligent beings) around by then, their view of the sky will be poorer for it.
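That figure of several hundred million years can be reproduced with back-of-the-envelope arithmetic. This Python sketch assumes a constant recession rate of about 3.8 cm per year and takes totality to end once even a perigee Moon can no longer cover an aphelion Sun; all the inputs are round approximations:

```python
# All distances in kilometers; round approximations.
MOON_DIAMETER = 3_474
MOON_PERIGEE = 356_500           # Moon's closest approach today
SUN_DIAMETER = 1_392_000
SUN_APHELION = 152_100_000       # Earth's farthest distance from the Sun
RECESSION_PER_YEAR = 3.8e-5      # 3.8 cm/year, expressed in km

# The Sun's smallest apparent size (radians), seen at aphelion
sun_min_angular = SUN_DIAMETER / SUN_APHELION

# Perigee distance at which even the largest-looking Moon
# just fails to cover that disc
critical_distance = MOON_DIAMETER / sun_min_angular

years_left = (critical_distance - MOON_PERIGEE) / RECESSION_PER_YEAR
print(f"Total eclipses end in roughly {years_left / 1e6:.0f} million years")
```

This lands in the same ballpark as the figure quoted above. The real answer depends on how the recession rate changes over geologic time, so treat it as an order-of-magnitude estimate rather than a prediction.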

Of course, this doesn’t help the creationists’ fine-tuning argument, because total eclipses aren’t necessary for life. They just look cool. Still, this kind of favorable coincidence must be rare. There’s a real sense in which the Earth, in this era, may indeed be uniquely privileged among planets in the galaxy.

If we ever make contact with extraterrestrial life, I like to imagine, Earth could be a major tourist draw for this reason. The prospect of seeing a total eclipse is something special, a spectacle that almost no other planet can offer. If you think Airbnb prices are out of control now, just wait!

National Review notices the kids aren’t conservative anymore

Millennials are breaking the oldest rule of politics, and the American right has noticed.

Historically speaking, it’s true that people tend to grow more conservative as they get older. Or at least, it was true for previous generations. As newer polls have shown, Millennials and Gen Z are breaking the pattern.

We’re not aging into church and conservatism – on the contrary. If anything, we’re getting less religious and more liberal as we get older.

And National Review, the conservative magazine, has noticed. They’ve published an article whose title betrays their scarcely contained panic: “The Link Between Age and Conservatism Is Breaking”.

I’m always interested to read articles by religious apologists noticing that young people are leaving religion in droves, and theorizing about the reasons for that. It’s a revealing glimpse into the way they see the world. This is another flavor of that lament.

There’s a rich harvest of schadenfreude to reap from this article, and I’ll get to that. But first, it wouldn’t be a conservative publication if it didn’t blame every problem on a lack of grit and moral fiber among the youth. NR doesn’t disappoint:

We find shocking levels of reported depression and suicidality among the young, although in some ways, younger people have been groomed to report themselves as depressed and mentally unstable.

NR implies that young people aren’t really depressed, they’re just being told (“groomed”) to believe that they are. Do they display this same skepticism toward self-reports when the conclusion is one they agree with? If a survey found that religious people report greater happiness, would they reject that conclusion because church members are “groomed” to say that believing in God makes them happy?

That’s not the bottom of the barrel, either. In an even grosser claim, NR asserts that not marrying makes women ill:

Unmarried women in middle age report huge rates of depression, or other mysterious illnesses that are diagnosed as fibromyalgia and rheumatoid arthritis, and are experienced as “constant pain” in all joints, poor sleep, and exhaustion.

Marriage is the cure for rheumatoid arthritis. Who knew?

However, once you get past the obligatory victim-blaming and insulting pseudoscience, NR proposes a different explanation. And what’s fascinating is that this one comes pretty close to the truth:

Conservatism is also associated with settling down. People who acquire property tend to become more conservative. And Millennials just aren’t doing that at the same rates as previous generations. By age 30, just 42 percent of Millennials own homes compared to 48 percent of Gen Xers and 51 percent of Baby Boomers. The gap persists into their early 40s.

…Are we surprised that a generation that feels least optimistic about living in a family and in their own home has little faith in the American dream? Should we really find it shocking that so many of them find something resonant in the 1619 Project’s understanding of this country, which theorizes that America is about exploiting labor unjustly without due rewards or respect?

To my own surprise, I agree.

NR doesn’t explain why Millennials are less optimistic about the future, but I can fill in the gaps for them. Inequality has skyrocketed in America over the last few decades, mostly because older generations built ladders to prosperity for themselves and then yanked them away from those who came after.

With enthusiastic support from older voters, Republicans have crippled unions, slashed tax rates at the top, frozen the minimum wage, tried to scrap social-welfare programs, done everything in their power to keep health care unaffordable, and presided over soaring prices for education and houses without any sign of concern. They want to raise the retirement age for those who come after them. They’ve fought tooth and nail against every effort to stop climate change, the single biggest crisis that makes younger people feel hopeless and pessimistic about the future.

At the same time, they’ve passed onerous voter-ID laws (while barring student IDs as valid forms of identification, of course), closed polling places on college campuses, fought against automatic voter registration, and taken other measures to discourage young people from voting. When even that hasn’t worked, they’ve resorted to aggressive gerrymandering to dilute young people’s votes into meaninglessness and enshrine permanent minority rule.

In short, conservatives have done everything possible to keep young people poor, oppressed and powerless. And now they’re upset and dismayed because those young people, as they transition into middle age, aren’t invested in the system anymore! It’s a true leopards-eating-my-face moment.

All the other social ills that NR recognizes, like high levels of depression and loneliness, civic fragmentation and chronic illness, are symptoms of this bigger problem. They’re the result of a crushingly stressful and precarious existence, where good jobs are dwindling and the ones that are left still don’t pay enough to afford health care, housing and other necessities.

Obviously, merely to recognize this is treading on dangerous territory for a conservative magazine. What NR doesn’t do is suggest what, if anything, they think conservatives should do about it.

We can guess. Much like the religious apologists who react to the decline of religion by saying they need to double down and do more of what wasn’t working, we can imagine that NR’s solution would entail more union-busting, more trickle-down economics, and more voter suppression. It’s the epitome of “the beatings will continue until morale improves” thinking, and it’s why the ground is going to keep crumbling under their feet.

No one gets what they deserve

Look around your house. How many of the things you own do you truly deserve?

If you’re a carpenter and built your own house, or a farmer and grew your own food, you could fairly say that you earned those things as the work of your own hands. However, I can’t make that claim for myself. The mug of coffee on my desk was brewed from beans grown halfway around the world, and I don’t know the people who harvested, shipped or roasted it. My clothes are made of cotton that was picked, spun, woven and sewed in farms and factories I’ve never seen. I didn’t cut down the trees to make my own furniture.

So how did I come to have these things? What makes me so sure I deserve them?

In the superficial, capitalist sense – the tiny Ayn Rand on one shoulder – I “deserve” them because I worked a job where I exchanged my labor for money, and that money is evidence of worth. Anything that I want and can afford, I deserve. If I didn’t have enough money for a roof over my head or food to eat, then in this worldview, that would be a sign that I don’t deserve them (and should, therefore, have the good manners to starve quietly, preferably out of sight of rich people).

There’s an appealing simplicity to this view. Like Eastern notions of karma, or medieval ideas of God-ordained hierarchy, it proclaims that we each occupy our right and proper station in life. Best of all, it proclaims that morality is built in. We don’t have to fret over injustice or put in the effort to demand change, because the market does that for us. It’s an infallible dispenser of rewards and punishments, giving each person what their actions merit.

Rigged games

However, even a passionate advocate of capitalism would have to admit this isn’t the whole picture. There’s such a thing as random chance, which sometimes benefits and sometimes harms us. You can work your hardest and fail through no fault of your own, or you can be lazy and irresponsible and yet have success rain down on you.

Most importantly, we all came into being through a birth lottery. I was born into a privileged position, in the richest nation in history – rather than, say, being born as a Siberian peasant or a Chinese factory worker. Did I deserve that?

Even in the wealthy nations, there are huge gradations of privilege. Does anyone “deserve” to be born into a segregated slum with crumbling schools and polluted air and water? How about into a rich family with a private summer cottage, a yacht and a trust fund?

Regardless of how smart you are or how hard you’re willing to work, these advantages of birth go a long way toward determining where you end up in life. A few extraordinary people succeed despite a disadvantaged background, and a few feckless rich people squander their wealth and end up poor. That doesn’t mean that the competition was fair. More often, the rich stay rich and the poor stay poor, regardless of choices they make.

Capitalism can’t be the arbiter of what we deserve. It’s a rigged game, where advantages we didn’t earn matter as much as, or more than, individual choices or abilities.

Some might say that if economics can’t be the judge of what people deserve, the law can. The legal system, unlike the market, is at least supposed to dispense justice. Yet it, too, falls short.

There’s no good-faith dispute that racial and class biases influence who gets arrested, who gets charged, and how harshly they’re punished. And even discounting those biases, the law can never be more than a rough approximation of deservingness. We punish wrongdoers with fines or prison, but those don’t undo the harm they caused. You can compensate victims with money, but that rarely if ever makes them truly whole.

Last but not least, there are those intangible connections of love and friendship that, more than money or possessions, make life truly worthwhile. But here, too, we fail to find any grounds for deservingness. Does anyone deserve a loving relationship, or a harmonious family, or a fulfilling social life?

None of these are rights or entitlements. At best, they’re blessings that some of us are fortunate enough to receive. Your choices influence how your personal relationships play out, but in no sense can you say that they’re fully under your control.

Two viewpoints on what we deserve

From one point of view – call it the technological view – no one deserves anything, because nothing is given to us. We’re large primates living on a ball of dirt and rock, whirling around an unremarkable yellow star, itself spinning through a vast and uncaring universe. Nature has no concern for our well-being; it kills us without a qualm.

The only comforts we have are things we’ve figured out how to create for ourselves. Through painstaking trial and error, we’ve learned to transmute the raw stuff of nature into objects that make our lives healthier, more comfortable or more pleasant. None of this is owed to us, and nothing about our technology inherently limits it to some people and not others. Why shouldn’t we all benefit from our collective cleverness?

From another point of view – call it the moral view – we’re all human beings, alike in dignity. Some people want to draw artificial lines dividing us, lines of class or race or nationality or gender, but those are nothing but superficial chalk marks. They reflect nothing fundamental, they don’t carve nature at any joint.

All humans are the same species, with the same abilities. We’re nearly identical in our DNA, save for a few variations that some people place inordinate importance on. We all feel the same pains, the same joys, the same wants, the same loves. In an otherwise indifferent universe, we’re meaning-makers and storytellers and hopers and dreamers.

From this standpoint, it’s all but impossible to argue that one human being deserves something which another doesn’t. If there’s anything that anyone deserves merely by virtue of existence, then we all deserve it. Our equal standing as sentient creatures demands this conclusion.

Either way, it’s not plausible to treat “deservingness” as a proxy for virtue. There’s no good argument to be made that some people deserve ease and comfort while others don’t, or that the amount of money you possess says anything about your value as a person. That’s solely a rationalization made by people who’ve benefited from the unfair nature of the world and want to reassure themselves that this is okay.

Adopting this reasoning prevents you from developing an unhealthy attachment to the material goods you possess right now – or a belief that you stand above other human beings because you possess them.

I find this to be a welcome dose of humility. You don’t deserve the good things you have, because nobody does. At the same time, we all deserve a life of safety and comfort, within the bounds of our collective ability to create those things. It’s a reminder to be grateful for the privileges we possess, despite not deserving them in any cosmic sense – and to be generous, as much as our circumstances permit, toward those who haven’t had the same good fortune.

How to listen to women

Christian fundamentalism, like all fundamentalisms, is a high-control belief system. It smothers believers – especially women – with an oppressive web of rules that dictate every aspect of their lives. It tells them how to dress, how to act in public and private, what they’re allowed to read, who they’re allowed to interact with, what ideas they’re allowed to express – and even how to speak, when they’re permitted to.

Those of us who watched Katie Britt’s response to Joe Biden’s State of the Union heard a demonstration of that. It’s called “fundie baby voice”.

Fundie baby voice (a term coined by ex-evangelical Jess Piper) is the art of speaking in a breathy, high-pitched, intentionally childlike tone. It’s a performance that’s expected of women in fundamentalist sects, as a display of subservience and inferior status.

And it is a performance. The voice that Britt used for her SOTU response isn’t what she really sounds like. To hear the difference between fundie baby voice and Britt’s normal voice, watch this TikTok video comparing the two. The difference is jarring, and once you know it for what it is, the fundie baby voice is inescapably creepy. It sounds like something out of a horror movie: the whispering of a ghost from beyond the veil.

As patriarchy escapee Tia Levings says:

It’s the denial of our voices, the suppression of our natural sound and range of emotion, and the terms used to train us are reflective of the agenda and abusive system we were in. They want us to sound like sexualized children.

This is a widespread phenomenon in Christian fundamentalism. Michelle Duggar is another high-profile example, as was shown in Amazon’s Shiny Happy People documentary series. Kelly Johnson, wife of Christian nationalist and House Speaker Mike Johnson, does it too.

Women in these cultish patriarchies are expected to be perpetually docile, accommodating and obedient to the men around them. They’re never permitted to be loud, assertive or overly emotional. Speaking in a register that’s typical of children reinforces that mindset.

When they go high, we go low

Fundie baby voice is a dramatic example of how women are expected to contort themselves to fit the demands of a sexist society. It works the other way, too.

You may remember Elizabeth Holmes, the convicted fraudster behind Theranos. Among her other affectations, she spoke in a deep baritone voice in public interviews. That’s not her normal voice, as she finally admitted once the jig was up:

That register is no more — and is now somewhat of a joke for Holmes, who seems to have embraced her natural pitch. She speaks in a “soft, slightly low, but totally unremarkable voice,” according to a recent New York Times profile of the founder, for which writer Amy Chozick spent time observing Holmes and her partner Billy Evans at home.

You can imagine why she did this. Politics, science and other prestigious and powerful fields are still male-dominated, and this creates a feedback loop of unconscious bias.

To people steeped in this legacy of sexism, a deeper – that is, more masculine – voice is unconsciously perceived as a sign of competence and intelligence. Feminine traits, on the other hand, are looked down upon and treated as evidence of ditziness and frivolity.

In an experiment demonstrating this, spoken recordings were shifted to be either higher or lower in pitch, and participants were asked to “vote” for one. People, both male and female, consistently chose the lower voice. Apparently, we have an unconscious bias that people with deeper voices make better leaders.

It’s a widespread assumption – so widespread that most people don’t realize they hold it, much less think to question it – that the more competent a woman is, the more she resembles a man. A con artist like Holmes, skilled at altering her self-presentation to match people’s expectations, cynically played along with this belief. She’s not the only one. Margaret Thatcher adopted a deeper voice as she rose in the ranks of British politics.

The fact that women feel pressure to shift their voices, whether higher or lower, to appeal to an audience is clear evidence that sexism isn’t a thing of the past. Some traditionalists, too steeped in their own assumptions to look past them, believe that women (but not men) wearing dresses or using makeup is somehow “natural”.

But there can’t be anything “natural” about women disguising their normal voices to fit what others expect. That’s literally unnatural, in the strictest sense of the word. Whether they’re exaggerating their voices to be higher or lower, either way it highlights the double standard that still reigns: femininity is associated with submission and inferiority, and masculinity with intelligence and dominance.

In an enlightened world, there’d be no reason for anyone to suppose that the pitch of a person’s voice had any correlation with the contents of their brain. Nor would we judge people’s ability by the attractiveness of their face, the shape of their body, the clothes they wear, or their makeup or jewelry or lack thereof. We’d look past all these things as the irrelevancies they are.

We’ve taken some small steps toward this ideal, but not nearly enough or fast enough. Religion, especially fundamentalist religion, is one of the biggest forces fighting progress toward equality, spreading toxic stereotypes about gender and trying to keep women subservient. It has to go if the world is ever going to become a better place.