
Tracking CO2 emissions from space could help support climate agreements


NASA’s Orbiting Carbon Observatory (OCO-2) satellite can make precise measurements of global atmospheric carbon dioxide (CO2) from space. (NASA/JPL-Caltech)

Ray Nassar, University of Toronto

The central objective of the Paris Agreement is to limit Earth’s warming to well below 2 C above pre-industrial levels, but preferably 1.5 C.

This challenging task will require policies and tools to enable every sector of society to drastically reduce greenhouse gas (GHG) emissions to eventually reach net-zero.

Enacting the most effective and efficient strategies to reduce emissions starts with knowing in detail where, when and how much of these greenhouse gases we are emitting, followed by implementing emission reduction policies and tracking our progress.

Is it possible to track carbon dioxide (CO2) emissions and emission reductions from space? New research from my group shows that it is.

Why CO2 matters

CO2 is the primary greenhouse gas driving climate change. Burning fossil fuels for electricity generation, heating buildings, industry and transportation has elevated the CO2 in our atmosphere well beyond natural levels.

Currently, CO2 emission reporting is mainly done by accounting for the mass of fossil fuels purchased and used, then calculating the expected emissions — not actual atmospheric CO2 measurements. The finer details about exactly when and where the emissions occurred are often not available, but more transparent monitoring of CO2 emissions could help track the effectiveness of policies to reduce emissions.

Today GPS satellites help us to get around, meteorological satellites track weather systems and communication satellites relay TV, internet and telephone signals. It is time we use satellites to help tackle the biggest challenge that humanity has ever faced — climate change.

Satellites for measuring CO2

A global network of ground-based CO2 measurements began in 1957 and now consists of over one hundred stations around the world. Accurate and precise measurements from these stations have revealed a lot about changes in global atmospheric CO2 and Earth’s overall carbon cycle, but we can’t place these stations everywhere on Earth.

Satellites can observe the entire planet. Those that measure CO2 in the lower atmosphere near Earth’s surface (where CO2 emissions and CO2 uptake by plants happen) first began making measurements in 2002. Since then, their capabilities have steadily improved, but there have been setbacks along the way.

About a decade of effort by NASA went into developing the Orbiting Carbon Observatory (OCO) satellite to make precise measurements of atmospheric CO2 across the Earth.

NASA's OCO undergoing development prior to launch
NASA developed the Orbiting Carbon Observatory satellite to make precise measurements of atmospheric CO2 across the Earth. (NASA/JPL), Author provided

In 2009, OCO was lost due to a launch problem. After sustained advocacy for a rebuild of this important climate mission, NASA secured new funding to launch the OCO-2 satellite in 2014 and OCO-3 to the International Space Station in 2019.

The OCO missions were designed to improve our understanding of vegetation’s CO2 absorption, also known as the land carbon sink. But what about fossil fuel CO2 emissions?

A new way to verify CO2 emissions

In 2017, I led a research team that published the first study showing that we can quantify CO2 emissions at the scale of an individual power plant using OCO-2 observations.

Since OCO-2 was not designed for this purpose, its coverage and infrequent visits were inadequate for operational global CO2 emission monitoring, but we can still quantify emissions in select cases when the satellite passes close enough and gets a good cloud-free view.

OCO-3 is very similar to OCO-2, but has an additional pointing mirror that enables it to better map CO2 around targets of interest like the Bełchatów Power Station in Poland, Europe’s largest fossil fuel burning power plant and CO2 source.

A Power Station
Bełchatów Power Station, Europe’s largest fossil fuel burning power plant. (Shutterstock)

In our new study, we analyzed ten clear views of CO2 emission plumes from Bełchatów imaged by OCO-2 and OCO-3 between 2017 and 2022, which allowed us to determine the plant’s emissions on those days.
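For readers curious about the underlying arithmetic, the core idea behind estimates like these can be sketched as a cross-sectional flux calculation: the emission rate of a point source is roughly the CO2 column enhancement integrated across the plume, multiplied by the wind speed carrying the plume downwind. The numbers below are purely illustrative (not data from the study), and this simple summation stands in for the more sophisticated plume modeling used in the actual research.

```python
# Hypothetical sketch of a cross-sectional flux estimate for a point source.
# Illustrative inputs only, not real OCO-2/OCO-3 data: CO2 column enhancements
# (kg of CO2 per square meter above background) for pixels along a transect
# crossing the plume, each pixel assumed ~2 km wide.
enhancements_kg_per_m2 = [0.0002, 0.0009, 0.0015, 0.0008, 0.0001]
pixel_width_m = 2_000       # assumed along-transect width of each footprint
wind_speed_m_per_s = 5.0    # assumed wind speed perpendicular to the transect

# Flux (kg/s) = wind speed x integral of enhancement across the plume.
flux_kg_per_s = wind_speed_m_per_s * sum(
    e * pixel_width_m for e in enhancements_kg_per_m2
)

# Convert to megatonnes of CO2 per year for comparison with annual inventories.
seconds_per_year = 365.25 * 24 * 3600
flux_mt_per_year = flux_kg_per_s * seconds_per_year / 1e9

print(f"Estimated emission rate: {flux_kg_per_s:.0f} kg/s "
      f"(~{flux_mt_per_year:.2f} Mt CO2/yr)")
```

A single snapshot like this gives the emission rate only for the moment of the overpass; combining many such snapshots with reported power generation is what lets researchers track changes over time.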

European power plants report hourly power generation but only annual CO2 emissions. Power generation fluctuates with electricity demand and generating unit shutdowns (for maintenance or decommissioning), and CO2 emissions are expected to fluctuate proportionally.

We confirmed this using OCO-2 and OCO-3 in our recent paper, which showed that satellite observations can track changes in facility-level CO2 emissions. This means that satellites can be used to verify (or refute) reported CO2 emission reductions that result from climate change mitigation — like mandated efficiency improvements, carbon capture and storage technology, etc.

OCO-3 observations of a CO2 emission plume from the Bełchatów Power Station in Poland on April 10, 2020 overlaid on Google Earth imagery.
A plume of high CO2 resulting from coal burning is evident down wind from the Bełchatów Power Station in OCO-3 observations. (Ray Nassar), Author provided

Emissions monitoring for the Paris Agreement

Our approach can be applied to more power plants or modified for CO2 emissions from cities or countries with OCO-2 and OCO-3. We can also try integrating the satellite observations with CO2 monitoring from the ground or aircraft.

While we are already working on this, advances will only be incremental until the launch of the European Commission-funded Copernicus Anthropogenic CO2 Monitoring Mission, or CO2M. CO2M comprises two satellites, aiming to launch in late 2025.

These satellites will provide about 50 times as much coverage as OCO-2 and OCO-3 combined and will form the space component of Europe’s system for CO2 emissions Monitoring, Verification and Support (MVS).

CO2M will be a major advance, but emissions monitoring, just like successful global climate action, requires contributions from many countries. Long-term, robust, operational global monitoring of GHG emissions will need a constellation of satellites contributed by multiple countries as part of an integrated global observing system.

Hopefully, with new, more detailed and transparent tracking of human-caused greenhouse gas emissions to assess and guide us toward the most effective policies, society can achieve the emission reductions needed to reach net-zero in time.

Ray Nassar, Research Scientist at Environment and Climate Change Canada (ECCC), Adjunct Professor in Atmospheric Physics, University of Toronto

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Scientists discover five new species of black corals living thousands of feet below the ocean surface near the Great Barrier Reef


Researchers discovered five new species of black corals, including this Hexapathes bikofskii growing out of a nautilus shell more than 2,500 feet (760 meters) below the surface. Jeremy Horowitz, CC BY-NC

Jeremy Horowitz, Smithsonian Institution

The Research Brief is a short take about interesting academic work.

The big idea

Using a remote-controlled submarine, my colleagues and I discovered five new species of black corals living as deep as 2,500 feet (760 meters) below the surface in the Great Barrier Reef and Coral Sea off the coast of Australia.

Black corals can be found growing both in shallow waters and down to depths of over 26,000 feet (8,000 meters), and some individual corals can live for over 4,000 years. Many of these corals are branched and look like feathers, fans or bushes, while others are straight like a whip. Unlike their colorful, shallow-water cousins that rely on the sun and photosynthesis for energy, black corals are filter feeders and eat tiny zooplankton that are abundant in deep waters.

In 2019 and 2020, a team of Australian scientists and I used the Schmidt Ocean Institute’s remotely operated vehicle – a submarine named SuBastian – to explore the Great Barrier Reef and Coral Sea. Our goal was to collect samples of coral species living in waters from 130 feet to 6,000 feet (40 meters to 1,800 meters) deep. In the past, corals from the deep parts of this region were collected using dredging and trawling methods that would often destroy the corals.

Our two expeditions were the first to send a robot down to these particular deep-water ecosystems, allowing our team to actually see and safely collect deep sea corals in their natural habitats. Over the course of 31 dives, my colleagues and I collected 60 black coral specimens. We would carefully remove the corals from the sandy floor or coral wall using the rover’s robotic claws, place the corals in a pressurized, temperature-controlled storage box and then bring them up to the surface. We would then examine the physical features of the corals and sequence their DNA.

Among the many interesting specimens were five new species – including one we found growing on the shell of a nautilus more than 2,500 feet (760 meters) below the ocean’s surface.

A robotic arm grabbing a thin coral off of a rock.
Researchers used the robotic arm of their rover to collect over 100 samples of rare corals and brought them up to the surface for further study. Jeremy Horowitz, CC BY-ND

Why it matters

Like the shallow-water corals that build colorful reefs full of fish, black corals act as important habitats where fish and invertebrates feed and hide from predators in what is otherwise a mostly barren seafloor. For example, a single black coral colony researchers collected in 2005 off the coast of California was home to 2,554 individual invertebrates.

Recent research has begun to paint a picture of a deep sea that contains far more species than biologists previously thought. Considering there are only 300 known species of black corals in the world, finding five new species in one general location was very surprising and exciting for our team. Many black corals are threatened by illegal harvesting for jewelry. In order to pursue smart conservation of these fascinating and hard-to-reach habitats, it is important for researchers to know what species live at these depths and the geographic ranges of individual species.

A large, white, tree-like coral underwater.
Black corals don’t form large reefs like shallow corals, but individuals can get quite large – like this Antipathes dendrochristos found off the coast of California – and act as habitat for thousands of other organisms. Mark Amend/NOAA via Wikimedia Commons

What still isn’t known

Every time scientists explore the deep sea, they discover new species. Simply exploring more is the best thing researchers can do to fill in knowledge gaps about what species live there and how they are distributed.

Because so few specimens of deep-sea black corals have been collected, and so many undiscovered species are likely still out there, there is also a lot to learn about the evolutionary tree of corals. The more species that biologists discover, the better we will be able to understand their evolutionary history – including how they have survived at least four mass extinction events.

What’s next

The next step for my colleagues and me is to continue to explore the ocean’s seafloor. Researchers have yet to collect DNA from most of the known species of black corals. In future expeditions, my colleagues and I plan to return to other deep reefs in the Great Barrier Reef and Coral Sea to continue to learn more about and better protect these habitats.

Jeremy Horowitz, Post-doctoral Fellow in Invertebrate Zoology, Smithsonian Institution

This article is republished from The Conversation under a Creative Commons license. Read the original article.

When tragedy becomes banal: Why news consumers experience crisis fatigue


As the war continues in Ukraine, a grandmother helps her grandchild light candles in a church in Lviv. AP Photo/Emilio Morenatti

Rebecca Rozelle-Stone, University of North Dakota

When Vladimir Putin launched a full-scale invasion of Ukraine by land, air and sea on Feb. 24, 2022, the images of war were conveyed to dismayed onlookers around the world. Far from the action, many of us became aware of the unprovoked aggression by reading online coverage or watching TV to see explosions and people running from danger and crowding into underground bunkers.

Half a year later, the violence continues. But for many people not directly affected by the events, the ongoing war and its casualties have been shifting to the periphery of attention.

This turning away makes sense.

Being attentive to realities like war is often painful, and people are not well-equipped to keep a sustained focus on ongoing or traumatic occurrences.

In addition, since the war in Ukraine began, many other events have arisen to occupy the world’s attention. These include droughts, wildfires, storms tied to global warming, mass shootings and the reversal of Roe v. Wade.

As the philosopher-psychologist William James asked, “Does not every sudden shock, appearance of a new object, or change in a sensation, create a real interruption?”

Ongoing tragic events, like the assault on Ukraine, can recede from people’s attention because many may feel overwhelmed, helpless or drawn to other urgent issues. This phenomenon is called “crisis fatigue.”

A firetruck drives near a burning wildfire.
The McKinney Fire burned more than 60,000 acres in Northern California this summer, killing four people and destroying 90 residences. Drought conditions enabled the fire to spread quickly. AP Photo/Noah Berger, CC BY

Roots of crisis fatigue

Malevolent actors and authoritarians like Putin are aware of public fatigue and use it to their advantage. “War fatigue is kicking in,” the Estonian prime minister, Kaja Kallas, said. “Russia is playing on us getting tired. We must not fall into the trap.”

In a speech to marketing professionals in Cannes, France, the president of Ukraine, Volodymyr Zelenskyy, asked them to keep the world focused on his country’s plight. “I’ll be honest with you – the end of this war and its circumstances depend on the world’s attention …,” he said. “Don’t let the world switch to something else!”

Unfortunately, many of us have already changed the channel. The tragic has become banal.

I became interested in the phenomenon of fatigue as a result of my scholarly research into moral attentiveness. This idea was articulated by the 20th-century French philosopher and social activist Simone Weil.

A 1936 photo of French philosopher Simone Weil dressed in military clothing holding a rifle.
Simone Weil, a French philosopher, joined the Durruti Column in 1936 during the Spanish Civil War. Her scholarly work of social justice focused on the oppressed and marginalized in society. Apic/Hulton Archives via Getty Images, CC BY

According to Weil, moral attention is the capacity to open ourselves up fully – intellectually, emotionally and even physically – to the realities that we encounter. She described such attention as vigilance, a suspension of our ego-driven frameworks and personal desires in favor of a Buddhist-like emptiness of mind. This mindset receives, raw and unfiltered, whatever is presented without avoidance or projection.

Not surprisingly, Weil found attention to be inseparable from compassion, or “suffering with” the other. There is no avoiding pain and anguish when one attends to the afflicted; hence, she wrote that “thought flies from affliction as promptly and irresistibly as an animal flies from death.”

The sensitivity involved in attending to crises can be a double-edged sword. On one hand, attention can put people in touch with the unvarnished lives of others so the afflicted are truly seen and heard. On the other, such openness can overwhelm many of us through vicarious trauma, as psychologists Lisa McCann and Laurie Pearlman have noted.

Two young people place candles on the ground.
Protests are a visual reminder of the devastating war in Ukraine. Ehimetalor Akhere Unuabona for Unsplash, CC BY

The difficulty of sustained focus on events like the war is due not only to the inherent fragility of moral attention, however. As cultural critics like Neil Postman, James Williams and Maggie Jackson have noted, the 24/7 news cycle is just one of many pressures clamoring for our attention. Our smartphones and other technologies, with their incessant communications – from the trivial to the apocalyptic – engineer environments that keep us perpetually distracted and disoriented.

Why audiences tune out

Aside from the threats to people’s attention posed by distracting technologies and information overload, crisis fatigue itself is leading readers to consume less news.

This year, a Reuters Institute analysis showed that interest in news has decreased sharply across all markets, from 63% in 2017 to 51% in 2022, while a full 15% of Americans have disconnected from news coverage altogether.

Men looking at multiple monitors.
The sheer volume of digital news and information has an unintended side effect: News consumers are tuning out. ThisisEngineering RAEng for Unsplash, CC BY

According to the Reuters report, the reasons for this differ, in part, with political affiliation. Conservative voters tend to avoid the news because they deem it untrustworthy or biased, while liberal voters avoid news because of feelings of powerlessness and fatigue. Online news, with its perpetual drive to keep eyes trained on screens, is unwittingly undermining its own goals: to provide news and keep the public informed.

Taking a new tack

How might we recover a capacity for meaningful attention and responses amid incessant, disjointed and overwhelming news? Scholars have made a variety of recommendations, usually focused on reining in digital device usage. Beyond this, readers and journalists might consider the following:

  1. Limiting the daily intake of news can help people become more attentive to particular issues of concern without feeling overwhelmed. Cultural theorist Yves Citton, in his book “The Ecology of Attention,” urges readers to “extract” themselves “from the hold of the alertness media regime.” According to him, the current media creates a state of “permanent alertness” through “crisis discourses, images of catastrophes, political scandals, and violent news items.” At the same time, reading long-form articles and essays can actually be a practice that helps with cultivating attentiveness.
  2. Journalists can include more solutions-based stories that capture the possibility of change. Avenues for action can be offered to readers to counteract paralysis in the face of tragedy. Amanda Ripley, a former Time magazine journalist, notes that “stories that offer hope, agency, and dignity feel like breaking news right now, because we are so overwhelmed with the opposite.”

Weil, who was committed to the responsibility of moral attentiveness but did not romanticize tragedy, wrote, “Nothing is so beautiful and wonderful, nothing is so continually fresh and surprising, so full of sweet and perpetual ecstasy, as the good.”

Rebecca Rozelle-Stone, Professor of Philosophy, University of North Dakota

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How was Halloween invented? Once a Celtic pagan tradition, the holiday has evolved to let kids and adults try on new identities


Kindergarten students in 1952 race out of school in Los Angeles, eager to celebrate Halloween. Los Angeles Examiner/USC Libraries/Corbis via Getty Images

Linus Owens, Middlebury

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to curiouskidsus@theconversation.com.


How was Halloween invented? – Tillman, age 9, Asheville, North Carolina


“It’s alive!” Dr. Frankenstein cried as his creation stirred to life. But the creature had a life of its own, eventually escaping its creator’s control.

Much like Frankenstein’s monster, traditions are also alive, which means they can change over time or get reinvented. Built from a hodgepodge of diverse parts, Halloween is one such tradition that has been continually reinvented since its ancient origins as a Celtic pagan ceremony. Yet beneath the superhero costumes and bags of candy still beats the heart of the original.

The Celts lived in what’s now Ireland as far back as 500 B.C. They celebrated New Year’s Day on Nov. 1, which they called Samhain. They believed that leading up to the transition to the new year, the door between the worlds of the living and the dead swung open. The souls of the recently dead, previously trapped on Earth, could now pass to the underworld. Since they thought spirits came out after dark, this supernatural activity reached its peak the night before, on Oct. 31.

The Celts invented rituals to protect themselves during this turbulent time. They put on costumes and disguises to fool the spirits. They lit bonfires and stuck candles inside carved turnips – the first jack-o’-lanterns – to scare away any spirits looking for mischief. If all else failed, they carried a pocketful of treats to pay off wayward spirits and send them back on their way to the underworld.

Sound familiar?

Although focused on the dead, Samhain was ultimately for the living, who needed plenty of help of their own when transitioning to the new year. Winter was cold and dark. Food was scarce. Everyone came together for one last bash to break bread, share stories and stand tall against the dead, strengthening community ties at the time they were needed most.

a collection of lit jack-o-lanterns
Ghouls, goblins and glowing jack-o’-lanterns have been synonymous with Halloween for a long time. Erik Freeland/Corbis Historical via Getty Images

When Catholics arrived in Ireland around A.D. 300, they opened another door between worlds, unleashing considerable conflict. They sought to convert the Celts by changing their pagan rituals into Christian holidays. They rechristened Nov. 1 “All Saints Day,” which today remains a celebration of Catholic saints.

But the locals held on to their old beliefs. They believed the dead still wandered the Earth. So the living still dressed in costumes. This activity still took place the night before. It just had a new name to fit the Catholic calendar, “All Hallows Eve,” which is where we got the name Halloween.

Irish immigrants brought Halloween to America in the 1800s while escaping the Great Potato Famine. At first, Irish Halloween celebrations were an oddity, viewed suspiciously by other Americans. As such, Halloween wasn’t celebrated much in America at the time.

As the Irish integrated into American society, Halloween was reinvented again, this time as an all-American celebration. It became a holiday primarily for kids. Its religious overtones faded, with supernatural saints and sinners being replaced by generic ghosts and goblins. Carved turnips gave way to the pumpkins now emblematic of the holiday. Though trick-or-treating resembles ancient traditions like guising, where costumed children went door to door for gifts, it’s actually an American invention, created to entice kids away from rowdy holiday pranks toward more wholesome activities.

Halloween has become a tradition many new immigrants adopt along their journey toward American-ness and is increasingly being exported around the world, with locals reinventing it in new ways to adapt it to their own culture.

postcard of a witch and black cat riding a broomstick
A Halloween postcard circa 1910. Trolley Dodger/Corbis Historical via Getty Images

What’s so special about Halloween is that it turns the world upside down. The dead walk the Earth. Rules are meant to be broken. And kids exercise a lot of power. They decide what costume to wear. They make demands on others by asking for candy. “Trick or treat” is their battle cry. They do things they’d never get away with any other time, but on Halloween, they get to act like adults, trying it on to see how it fits.

Because Halloween allows kids more independence, it’s possible to mark significant life stages through holiday firsts. First Halloween. First Halloween without a parent. First Halloween that’s no longer cool. First Halloween as a parent.

Growing up used to mean growing out of Halloween. But today, young adults seem even more committed to Halloween than kids.

What changed: adults or Halloween? Both.

Caught between childhood and adulthood, today’s young adults find Halloween a perfect match to their struggles to find themselves and make their way in the world. Their participation has reinvented Halloween again, now bigger, more elaborate and more expensive. Yet in becoming an adult celebration, it comes full circle to return to its roots as a holiday celebrated mainly by adults.

Halloween is a living tradition. You wear a costume every year, but you’d never wear the same one. You’ve changed since last year, and your costume reflects that. Halloween is no different. Each year, it’s the same celebration, but it’s also something totally new. In what ways are you already reinventing the Halloween of the future today?


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

Linus Owens, Associate Professor of Sociology, Middlebury

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Misinformation is a common thread between the COVID-19 and HIV/AIDS pandemics – with deadly consequences


Disinformation can derail public health measures vital to controlling the spread of infectious disease. AP Photo/Jeff Chiu

Cristian Apetrei, University of Pittsburgh Health Sciences

Since health officials confirmed the first COVID-19 cases, misinformation has spread just as quickly as the virus. Social media may have made the amount, variety and speed of misinformation seem unprecedented, but COVID-19 isn’t the first pandemic where false and harmful information has set back public health.

Misinformation altered how people trusted their governments and doctors during the 1918 influenza pandemic. It fueled the 19th century smallpox anti-vaccine movements through some of the same arguments as those currently used against the COVID-19 vaccine.

What sets the COVID-19 pandemic apart, however, is the sheer magnitude of damaging disinformation put in circulation around the world. Data shows that regions and countries where disinformation thrived experienced more lethal pandemic waves despite vaccine availability. In the U.S., for example, viewership of a Fox News program that downplayed the pandemic is associated with increased COVID-19 cases and deaths. Similarly, in Romania, disinformation was a contributing factor to the country’s disastrous fourth wave of COVID-19.

The problem of misinformation has been so widespread that it has its own word: “infodemic,” a portmanteau of “information” and “epidemic.” Coined by journalist David Rothkopf during the 2003 SARS outbreak, it describes a situation where “a few facts, mixed with fear, speculation and rumor, are amplified and relayed swiftly worldwide by modern information technologies.”

Infodemics can affect economies, politics, national security and public health. The COVID-19 infodemic became such a problem that the Royal Society and the British Academy released an October 2020 report noting its significant impact on vaccine deployment, endorsing legislation that prosecutes those who spread misinformation.

As a researcher who studies HIV and lived through the AIDS pandemic, I felt a sense of déjà vu as COVID-19 disinformation spread. In the 40 years since the emergence of AIDS, society has learned how to cope with the disease with more effective diagnostics, treatments and preventive strategies, transforming AIDS from a lethal condition to a chronic disease.

However, there are striking parallels between the HIV/AIDS and COVID-19 pandemics that show the dire consequences disinformation can have on both patients and society as a whole.

Denying the existence of a virus or a pandemic

There are people who deny the existence of COVID-19. There are abundant claims on social media that the virus that causes COVID-19 has never been isolated, or that it is insufficiently characterized. Others do not contest the existence of COVID-19 but ignore the severe consequences of infection.

In general, these groups tend to also deny germ theory, claiming that infectious diseases are not caused by pathogens like viruses and bacteria. Instead, they promote the idea that pathogens don’t cause disease, but rather are a consequence of it.

Likewise, some denied the role of HIV in causing AIDS. AIDS denialist Peter Duesberg was one person who disseminated this misinformation, which the scientific community at large had refuted. But his erroneous claim still reached Thabo Mbeki, then president of the Republic of South Africa, who banned the use of lifesaving antiretrovirals in public hospitals. This decision resulted in the deaths of over 330,000 people from HIV/AIDS between 2000 and 2005.

Mbeki’s decision was considered so damaging that scientists and physicians worldwide signed the Durban Declaration, reiterating that HIV indeed causes AIDS and urging Mbeki to reconsider his decision. While the government did reverse the ban after strong international political pressure, the damage had been done.

Gain of function claims

Gain of function experiments involve manipulating a pathogen to understand what contributes to its ability to cause disease. At the same time, such experiments can give pathogens new abilities, such as making viruses more transmissible or more dangerous to humans. Conspiracy theorists have made claims that the COVID-19 virus resulted from alterations to a bat version of the virus that gave it the ability to replicate in human cells.

But these claims ignore several key facts about the COVID-19 virus, including that bat coronaviruses can infect humans without additional adaptation. The mutations that increased the transmissibility of COVID-19 occurred after it started circulating in people, resulting in even more infectious variants.

HIV also saw conspiracy theories claiming that it was created in a lab for genocide. But research has shown that HIV, too, naturally evolved from animal viruses. African non-human primates are natural hosts to a vast group of viruses collectively called simian immunodeficiency viruses (SIVs). Despite high rates of SIV infection in the wild, these primate hosts typically don’t experience symptoms or progress to AIDS. Throughout the evolutionary history of SIV, jumping to a new host species involved naturally occurring genetic changes over the course of thousands of years.

Miracle cures

During a public health crisis, researchers and health officials are learning about a disease in real time. While missteps are expected, these can be perceived by the public as hesitation, incompetence or failure.

As researchers looked for possible COVID-19 treatments, others were promoting their own unproven drugs. Multiple candidate treatments for COVID-19, including ivermectin and hydroxychloroquine, were tested and abandoned, but not before large amounts of time, effort and money had been spent disproving claims that they were miracle cures. Similarly for HIV, frustration and anxiety over the continued lack of treatments amid rising deaths led to fraudulent cures with price tags of tens of thousands of dollars.

Even though treatment delays and changing guidelines are a natural part of learning about a new disease as it unfolds, they can open the door to disinformation and generate distrust in doctors even as they care for infected patients.

Preventing misinfodemics

The next pandemic is not a question of if but when and where it will occur. Just as important as devising ways to detect emerging viruses is developing strategies to address the misinfodemics that will follow them. The recent monkeypox outbreak has already attracted similar mis- and disinformation about its origin and spread.

As author Gabriel Garcia Marquez once said, “A lie is more comfortable than doubt, more useful than love, more lasting than truth.” Countering misinformation is difficult, because there are reasons other than ignorance for why someone believes in a falsehood. In those cases, presenting the facts may not be enough, and may sometimes even result in someone doubling down on a false belief. But focusing on urgent scientific and medical needs to the exclusion of rapidly addressing misinformation can derail pandemic control. Strategies that take misinformation into account can help other pandemic control measures be more successful.

Cristian Apetrei, Professor of Immunology, Infectious Diseases and Microbiology, University of Pittsburgh Health Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Reducing gun violence: A complicated problem can’t be solved with just one approach, so Indianapolis is trying programs ranging from job skills to therapy to violence interrupters to find out what works


Participants in ‘violence prevention’ programs seek to deescalate conflicts before they turn deadly. Andre Chung for The Washington Post via Getty Images

Thomas D. Stucky, IUPUI

Indianapolis is no stranger to gun violence. The city is also trying many promising approaches to reducing violence that – if proven successful – could benefit other urban areas across the U.S.

The city’s homicide rate in 2020, at 24.4 per 100,000 residents, was approximately triple the national average, and the city’s highest on record. Approximately 80% of those homicides were perpetrated using firearms.

Gun homicides ended about 240 lives there over a recent two-year period, according to a study of this city of about 900,000 people. The number of people who were shot but survived was far higher, and firearms also account for a significant number of suicide deaths.

I’m a former police officer who has studied policies and programs that seek to prevent gun violence since the late 1990s. I have periodically partnered with Indianapolis officials and community agencies on anti-violence initiatives coordinated by the local government with many private- and nonprofit-sector partners since 2004.

Some approaches developed in other places have worked here, and Indianapolis has implemented many programs shown to make a difference elsewhere. But there’s still not enough data to pinpoint which specific programs are the most effective.

But given the urgency of the problem, I believe it’s important to keep test-driving promising methods based on the information available so far. And because Indianapolis experiences many of the same gun violence issues that other medium and large cities face, what’s learned here can apply in many other places.

Stepping up efforts to reduce gun violence

Indianapolis intensified its efforts to reduce gun violence in 2006, when 144 people died by homicide – up 27% from a year earlier.

That year Bart Peterson, then serving as the city’s mayor, created the Community Crime Prevention Task Force, in which I played a role. Its mission was to seek evidence-based recommendations to reduce violence.

After reviewing the relevant academic research, I identified best practices and the most promising violence-prevention strategies. The task force, in turn, made recommendations to the Indianapolis City-County Council.

The city subsequently began to increase funding for efforts to reduce gun violence in coordination with the Indianapolis Foundation, a local charity.

This private-public partnership has been supporting nonprofits engaged in several approaches to reducing gun violence ever since.

The overarching purpose of all these programs is to reduce that risk by helping the people most likely to be wounded or killed by a gun obtain services in their communities – such as job training and health care – and by shifting community norms away from gun violence.

Because people killed by guns in Indianapolis are most likely to be male, young and Black, young Black men are a major focus for all the programs. Researchers have also determined that 3 in 4 gun homicide victims and suspects in the city were known to law enforcement through prior investigation, arrests or convictions. So that is another factor in terms of determining who gets these services.

Employing formerly incarcerated people

Other grants from the private-public partnership in Indianapolis have funded cognitive behavioral therapy for people at risk of engaging in or being victims of gun violence. This is a method in which people get help identifying and pushing back on their negative thoughts and behaviors, making it easier to resolve disputes without resorting to violence.

The city has also partnered with several community organizations to prevent gun violence.

One such group is Recycleforce, which hires formerly incarcerated people to recycle old electronic goods. It’s among several enhanced transitional job programs that provide services and training to the recently incarcerated.

One study showed that Recycleforce participants were 5.8% less likely to be arrested and 4.8% less likely to be convicted of a crime in the first six months of the period reviewed. However, in the second six months, the benefits were no longer statistically significant.

A second study used in-depth interviews to assess the program. It suggested that the peer-mentor model Recycleforce follows works well.

Preventing future gunshots

A large Indianapolis hospital, Eskenazi, also runs several important anti-violence programs. One, called Prescription for Hope, assists people treated there for gunshot wounds.

Like similar hospital-based programs around the country, the one based at Eskenazi helps participants develop effective life skills and connects them with community resources to reduce criminal and risky behaviors.

An initial study of the program showed that only about 3% of participants returned to the emergency department with a repeat violent injury within the first year, compared with an 8.7% rate when the program wasn’t underway. This translates to a two-thirds reduction in the likelihood that someone with a violent injury will need similar emergency medical assistance in the future.
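The arithmetic behind that “two-thirds” figure can be checked directly. A minimal sketch, using only the two rates quoted above (the variable names are illustrative, not from the study):

```python
# Relative reduction in repeat violent injury, from the rates quoted above:
# about 8.7% of gunshot patients returned with a repeat violent injury when
# the program wasn't underway, versus about 3% of program participants.
rate_without_program = 0.087
rate_with_program = 0.03

relative_reduction = (rate_without_program - rate_with_program) / rate_without_program
print(f"{relative_reduction:.0%}")  # about 66%, i.e. roughly two-thirds
```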

‘Violence interruption’

In 2021, Indianapolis began to hire “violence interrupters” to calm contentious situations and reduce the risk of violent retaliation.

The “violence interruption” method connects people with personal ties to those most at risk of becoming involved in gun violence as victims or perpetrators.

Violence interrupters try to mediate disputes and calm things down on the streets, at parties and during funerals before any shooting starts. They have credibility with violence-prone people because of their past experiences.

The interrupters also help at-risk people to obtain services and to change gun violence norms in their communities.

Violence interruption, part of a growing public health approach to reining in violence, originated in Chicago in 2000. Now called the “cure violence model,” it has spread quickly amid generally positive research results.

Indianapolis was employing about 50 violence interrupters as of mid-2022.

More federal funding

Most of the city’s violence-prevention grants funding these efforts have been relatively small until now, ranging from US$5,000 to $325,000.

But U.S. cities, including Indianapolis, now have until 2024 to tap into a comparatively large stream of federal funding for community-based violence intervention. That money was included in the $1.9 trillion stimulus package enacted in 2021.

Using these federal funds, the city is partnering with the Indianapolis Foundation to award grants totaling $45 million from 2022 through 2024 for local efforts to reduce gun violence.

Fortunately, Indianapolis’ homicides appear to be declining in 2022 compared with a year earlier.

As a local resident, I certainly welcome this news. But as a researcher, I consider it too soon to tell whether this trend will continue or what the many public and private efforts to reduce gun violence now underway will accomplish.

Thomas D. Stucky, Professor of Criminal Justice, IUPUI

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Climate pioneers: how small farmers could be leading the way towards sustainable agriculture


Small farmers in Maza village, Morogoro, Tanzania. US government/Flickr

Zareen Pervez Bharucha, Anglia Ruskin University

Agriculture is a leading cause of climate change, but it is also deeply affected by it. Farming must therefore change in order to keep up with global demand while reducing its environmental impact. Without these changes, it’s estimated that by 2030 the worsening impacts of climate change will cause yields to decline so much that any progress made towards eliminating global hunger will be cancelled out.

Some of those worst affected by climate change are small farmers (those working on land under two hectares). There are around 475m small farms around the world, cultivating around 12% of the world’s farmed land. Small farmers in the tropics and poorer agricultural communities will be particularly severely affected by climate change.

However, many of these small farmers are increasingly using innovative ways of reducing greenhouse gas emissions and adapting to climate change. They are the true pioneers of climate-smart agriculture, using practices that maintain productivity while decreasing emissions. They are also producing a range of other benefits such as poverty alleviation, better nutrition and biodiversity conservation.

Sustainable but healthy yields

In the 20th century, farmers boosted yields by intensifying production: using more water, land, energy, synthetic pesticides and fertilisers. This model tended to assume that you couldn’t have high yields as well as environmental protection. Now, we understand that this is a false choice, and that sustainable intensification – producing healthy yields and higher incomes while building ecosystems on and around the farm – is possible. And it looks like small farmers are leading the way in implementing such sustainable intensification around the world.

There are three steps towards sustainable intensification. These are increased efficiency (doing more with less), substitution (replacing ineffective or harmful products) and redesign (changing the whole farm to be more sustainable). These steps are not necessarily mutually exclusive.

For example, rice plants are typically planted close together in flooded nurseries. But they can also be grown in nutrient-rich nurseries that aren’t flooded – something that saves around 40% of the water used in conventional production. This approach is about more than simple resource efficiency, however – it involves a fundamental redesign of the whole system of rice production.

Rice farmer in Punjab, India. Neil Palmer (CIAT)/Flickr, CC BY-SA

Substitution involves replacing less efficient or harmful inputs such as synthetic pesticides, which can be harmful for wildlife, with better alternatives. You can also replace old crop varieties with new ones that can withstand sudden changes, or which need less water – important for climate resilience. New varieties may also be able to help reduce agricultural emissions. For example, plants with greater root mass could help sequester an estimated 50 to 100 tonnes of carbon per hectare.

Radical approaches

Radical redesign of farms involves techniques such as conservation agriculture – practices that minimise the disruption of the soil’s structure and biodiversity. Integrated pest management, which involves strategies to deal with pests without posing risks to the environment, and agroforestry, using trees in agriculture, are also good examples. A recent assessment estimated that around 163m farms worldwide (29% of the global total) practice some form of redesign.

The evidence shows that these methods are already helping small farmers achieve healthy yields while delivering a range of other benefits, including carbon sequestration, using less energy and synthetic inputs and climate resilience.

One example is “push-pull” integrated pest management. Push-pull is a method of pest control developed in East Africa to help farmers deal with stemborers and striga weeds, which attack crops such as maize. Instead of relying exclusively on synthetic pesticides, farmers grow pest-repelling plants such as desmodium (the “push”) in among the main crop. They also plant borders of other crops, such as Napier grass, around their fields to draw pests away (the “pull”).

This keeps pests away from the main cereal crops, reducing losses. In recent years, push-pull systems have been adapted to include plants such as Brachiaria, which can tolerate hotter and drier climates. Such systems are used across 69,000 small farms across Kenya, Uganda, Tanzania and Ethiopia.

Other methods of redesign are also being practised at scale by small farmers in other places. In India, 140,000 farmers in Andhra Pradesh and an estimated 100,000 in Karnataka practice “zero budget natural farming”. This is an initiative which promotes the natural growth of crops without adding any synthetic fertilisers and pesticides. In Africa, small farmers in Burkina Faso and Niger have taken up agroforestry and soil and water conservation, and transformed the landscape of around 500,000 hectares of degraded land.

The redesign of agriculture offers the best chance of achieving lower-carbon, climate-proof agriculture in the 21st century. But it requires new partnerships between farmers, development agencies, governments and researchers. Farming is knowledge intensive, and will be increasingly so in a changing world. Sustainable intensification initiatives that have spread to scale have all involved new efforts to support collaboration and learning. Farmer field schools – training programmes for local farmers – are key to this. So are plant breeding programmes in which participating farmers get opportunities to make decisions at different stages of the process.

Ultimately, climate proofing is best achieved by improving the sustainability of existing systems. Small farmers already know what works. The challenge remains to help them spearhead the global spread of redesigned agriculture.

Zareen Pervez Bharucha, Senior Research fellow, Anglia Ruskin University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Climate crisis: the countryside could be our greatest ally – if we can reform farming


The Yorkshire Dales, England. Jakob Cotton/Unsplash, CC BY-SA

Ian Boyd, University of St Andrews

Around 20% of the UK’s farms account for 80% of the country’s total food production, and they do this on about half of all its farmed land. The remaining farms – at least 80% of the total – don’t produce very much at all.

In England, just 7% of farms produce over half of the country’s agricultural output – on 30% of its farmland. A little under half (42%) of England’s farms produce a meagre 2% of the total agricultural output, working just 8% of the country’s total land.

In an average year, mixed farming, livestock grazing and cereal farms make a financial loss on what they produce, and government subsidy forms the majority of their income. Livestock farming is the least profitable sector of all, while some of the most profitable sectors, like horticulture – producing everything from vegetables to soft fruit and tomatoes – receive very little subsidy.

Land is precious, and there are trade-offs between designating enough of it to grow food and reserving it for other vital functions, like natural wilderness for biodiversity, recreation and carbon storage. This is as true in the UK as in the rest of the world.

Some farmers argue that they are the custodians of the land and the wildlife that live on it, but much of the evidence suggests that this role is neglected in the UK. Much farmed soil has been drained of its natural nutrients and now relies on artificial inputs like fertiliser. Rather than offering a haven for struggling bird species, it seems little progress has been made in halting declines in wildlife abundance on farmland.

A grey partridge – this species lives on farmland but has declined by more than 80% in the UK since 1970. Marek Szczepanek/Wikipedia, CC BY-SA

Agriculture is also a major emitter of greenhouse gases, accounting for about 10% of total UK emissions. Some estimates suggest that ten “calories” of fossil fuel energy are needed to produce a single calorie of protein.

The EU’s Common Agricultural Policy protected the right of people to farm unproductive land for the sake of countryside prosperity. But farming contributes only about 4% to the rural economy of England. Overall, UK agricultural production has stagnated in absolute terms since the late 1980s. This has meant unprofitable and environmentally damaging agriculture is maintained through subsidy. It’s time that a new policy shifted the balance.

Rewild, restore and reopen

Agriculture in the UK uses a vast amount of resources – energy, pesticides, water and mineral fertilisers – compared to the amount of goods it produces. For the productivity of agriculture to match other developed sectors of the economy like construction, agriculture would need to produce five to ten times more from the land it consumes.

Much of this inefficiency comes from the energy used to produce fertilisers and from livestock production. Only about 10-20% of the vegetable matter fed to livestock is converted into meat for people to eat. Animals are often fed plant-based food grown on land that could instead produce human food – around 75% of the calories fed to livestock in the UK come from such sources. As many as ten plant-based meals could be produced for the same material cost as a single meat-based meal.

So what’s the alternative? If the UK wants to play its part in feeding the world, keeping people healthy and conserving the environment, there is a very simple way forward: converting the 50% of land that’s mainly used for agriculture – but which produces only 20% of the UK’s total agricultural output – to other functions, including recreation, storing carbon and enhancing biodiversity.

This could be possible over ten years. It would give enough time for people involved in farming relatively unproductive land to adapt. Some of these people will still be paid from public funds but they could be tasked with rewilding their land to forest or other habitats that can lock away CO₂ and expand wildlife habitat. Some will also be rewarded for opening their land for public access. This will be especially important for land near urban areas as access to nature has serious benefits for human health.

Spending just two hours a week in nature has been shown to benefit a person’s health and mental wellbeing. Lukasz Szmigiel/Unsplash, CC BY-SA

Growing food in different ways could also make farming more efficient and it would be needed to make up for the small shortfall in production. Vertical farming, hydroponics and aeroponics are all techniques where food is grown according to the principles of manufacturing. This means it’s produced close to where it’s consumed, no pesticides are needed and all nutrients are closely controlled, reducing pollution.

Mobilising British agriculture to help the UK reach net zero emissions would be an incredibly valuable use of the UK’s landscape. But the main challenge to this is convincing the people who currently farm the relatively unproductive land that they need to be a part of this vision. The National Farmers Union – who represent many of these particular farmers – have done much to try and sustain the status quo, especially for livestock agriculture. Overcoming this social inertia will be hard work, but vital.


Click here to subscribe to our climate action newsletter. Climate change is inevitable. Our response to it isn’t.

Ian Boyd, Professor of Biology, University of St Andrews

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Three ways farms of the future can feed the planet and heal it too


Nature and technology can combine to help farms of the future nourish the earth and its inhabitants. SimplyDay/Shutterstock

Karen Rial-Lovera, Nottingham Trent University

Intensive agriculture may be nourishing most of the Earth’s inhabitants, but it’s doing the opposite to the earth itself. Its dependence on single crops, heavy ploughing machinery, and fossil-fuel-based fertilisers and pesticides is degrading our soils, wildlife and nutrient cycles, and contributing a quarter of the planet’s unwanted extra heat.

But we’re not powerless to change the future of food. Nature and technological innovation are tackling these problems head on – and if the solutions they’re offering are incorporated on a large scale and used together, a new agricultural revolution could be on its way. Here are three of the most exciting developments that can help farms not just feed the planet, but heal it too.

Crops, trees and livestock in harmony

Several UN reports have highlighted agroecology – farming that mimics the interactions and cycles of plants, animals and nutrients in the natural world – as a path to sustainable food.

The approach uses a wide variety of practices. For example, instead of artificial fertilisers, it improves soil quality by planting nutrient-fixing “cover crops” in between harvest crops, rotating crops across fields each season and composting organic waste. It supports wildlife, stores carbon, and conserves water through the planting of trees and wildflower banks.

It also integrates livestock with crops. This may seem counter-intuitive given their inefficient land use and high emissions. But having a small number of animals grazing land doesn’t have to accelerate global heating.

Grassland captures carbon dioxide. Animals eat the grass, and then return that carbon to the soil as excrement. The nutrients in the excrement and the continuous grazing of grass both help new grass roots to grow, increasing the capacity of the land to capture carbon.

Carefully managed grazing can help the environment, not harm it. Millie Olsen/Unsplash, CC BY-SA

Keep too many grazing animals in one place for too long and they eat too much grass and produce too much excrement for the soil to take on, meaning carbon is lost to the atmosphere. But if small numbers are constantly rotated into different fields, the soil can store enough extra carbon to counterbalance the extra methane emitted by livestock’s digestive rumblings.

While this doesn’t make them a carbon sink, livestock bring other benefits to the land. They keep soil naturally fertilised, and can also improve biodiversity by eating more aggressive plants, allowing others to grow. And if local breeds are adopted, they generally don’t require expensive feed and veterinary care, as they’re adapted to local conditions.

Pesticides no more

Pests, diseases and weeds cause almost 40% of crop losses globally – and without care, the figure could rise dramatically. Climate change is shifting where pests and diseases thrive, making it harder for farmers to stay resilient.

Many commonly used herbicides, pesticides and fungicides are now also under pressure to be banned because of their negative effects on the health of humans and wildlife. Even where they’re not, growing resistance to these chemicals is making weeds, pests and diseases increasingly hard to control.

Nature is again providing answers here. Farmers are starting to use pesticides derived from plants, which tend to be much less toxic to the surrounding environment.

They’re also using natural enemies to keep threats at bay. Some may act as repellents, “pushing” pests away. For example, peppermint disgusts the flea beetle, a scourge to oilseed rape farmers. Others are “pulls”, attracting pests away from valuable crops. Plants that are attractive for egg-laying but that don’t support the survival of insect larvae are commonly used for this purpose.

Nasturtiums are pest magnets – and they’re edible too. Shutova Elena/Shutterstock

Technology is also offering solutions on this front. Some farmers are already using apps to monitor, warn and predict when pest and diseases will attack crops. Driverless tractors and intelligent sprayers that can target specific weeds or nutritional needs have recently entered the market. Agritech companies are now also developing robots that can scan fields, identify specific plants, and decide whether to use pesticide or to remove a plant mechanically.

In combination, these methods can dramatically reduce agriculture’s reliance on herbicides and pesticides without lowering crop yields. This is important, since the world’s population is set to rise by a quarter in the next three decades.

Small tech, big difference

Soon, technology at an almost impossibly small scale could make a big difference to the way we grow our food. Companies have designed nanoparticles 100,000 times smaller than the width of a human hair that release fertiliser and pesticides slowly but steadily, to minimise their use and maximise crop yields.

New gene-editing techniques will also increasingly use nanomaterials to transfer DNA to plants. These techniques can be used to create crops that detect the presence of pests and nutrient deficiencies, or simply to improve plants’ resistance to extreme weather and pests. Given that increasingly frequent and severe extreme weather events due to global heating are putting the very functioning of the global food system at risk, these advancements could be vital for preventing agricultural collapse.

Nanotechnologies aren’t cheap yet and researchers have yet to conduct rigorous tests of how toxic nanomaterials are to humans and plants, and how durable they are. But should they pass these tests, agriculture will surely follow the path of other industries in adopting the technology on a large scale.

Save for nanotechnology and advanced robots, the above solutions are already in use in many small-scale and commercial farms – just not in combination. Imagine them working in synchrony and suddenly a vision of sustainable agriculture doesn’t seem so far away anymore.


Karen Rial-Lovera, Senior Lecturer in Agriculture, Nottingham Trent University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Coal mines can be closed without destroying livelihoods – here’s how


Owen Douglas, University College Dublin and Kieran Harrahill, University College Dublin

Countries across the globe are trying to wind down coal production. While this will help in the battle against climate change, those communities that have specialised in coal mining may see their local job market decline or be eliminated entirely. Most of these places have mined coal for many generations. Given long-standing traditions, such communities will inevitably resist decarbonisation unless they are given appropriate reassurances regarding their economic and social survival.

We recently researched what did – and didn’t – work in coal regions of Canada, Australia and Germany. Our aim was to identify which policies have been most successful in halting the production of coal without placing the economic burden on coal workers and communities. Our results are now published in the journal Energy Policy.

Workers in extractive industries like mining or oil are often presented as the public face of opposition to environmental protection. However, research has shown that workers in “dirty” industries do tend to support environmentally friendly policies once their immediate interests are not negatively affected.

Furthermore, there is clear evidence that environmental protection and transitioning to the low-carbon economy has the potential to create employment just as much as it can cause unemployment.

The European Trade Union Institute has developed various indicators of a “just transition” away from coal – dialogue, retraining, and so on. In our paper, we used these indicators to identify what worked in our three case study areas.

Talk to each other

We found that active dialogue with communities is key. In North Rhine-Westphalia, Germany, policy is jointly formulated by employees and employers, giving workers a voice largely equal to that of industrialists. Employee representation on supervisory boards scales with company size: employees hold one-third of the seats in companies with more than 500 employees, and half in companies with more than 2,000. This has meant coal mining could be gradually reduced – and now nearly eliminated – without major social or political upheaval.

In contrast, Hazelwood coal power station and its adjoining mine in Victoria, Australia, were closed with minimal consultation with unions or government, and with just five months’ notice.

Where dialogue does occur, it must be genuine and followed by action. In coal villages in Alberta, Canada, such as Forestburg or Wabamun, the industry did attempt to talk to workers and local officials but the structure of the talks was poorly defined, resulting in workers not trusting the decarbonisation processes.

Jobs after coal

We identified re-employment in “clean” industries as a way to maintain livelihoods. The German approach to re-employment has seen North-Rhine Westphalia reinvent itself as a leader in new energy technologies. Central to this has been a bottom-up approach involving co-operation between workers, communities, employers and government.

Essen, Germany, was once known for its coal. In 2017 it was made ‘European Green Capital’. Lukassek / shutterstock

In Victoria, the dominance of the coal industry has hindered the transition towards a lower-carbon economy. However, the establishment of the Earthworker Cooperative has provided a platform for various affected groups to establish sustainable enterprises such as Australia’s first worker-owned factory, making renewable energy appliances and components. This demonstrates how local communities can create employment and maintain profits within their area without relying on coal.

In Alberta, a number of production facilities are simply shifting from coal to gas. While this shift creates jobs outside the coal sector, it does little to secure employment overall, since natural gas extraction and production requires fewer workers than coal. For example, energy company TransAlta is converting its coal-fired Sundance power plant in Wabamun to natural gas, which means the overall workforce will be cut in half when the layoffs are complete.

Investing in people’s futures

Re-training allows workers to develop the skills they need to work outside the coal sector. In North Rhine-Westphalia, training programmes have targeted a number of different sectors including engineering, trades, business and technology. The industrial heartland of the Ruhr area – once the centre of Germany's coal industry – has gained six new universities, 15 colleges and 60 research facilities since 1961. This Strukturwandel, or structural change, has developed a highly skilled workforce and demonstrates the potential for economic growth and diversification beyond coal.

Education and training have been made more accessible through subsidised retraining – in Alberta, through the Coal and Electricity Transition Tuition Voucher, and in Victoria, through the Training Guarantee. This ensures that retraining does not place an additional burden on those facing redundancy.

Make former coal towns great again

Investing in infrastructure is a further means by which to secure sustainable transitions for workers and their families. In North Rhine-Westphalia and Victoria, government funding has primarily focused on roads and rail alongside investment in community infrastructure such as sports and recreational facilities. This ensures that former mining areas do not remain synonymous with coal production, pollution and socio-economic problems, and makes them a more attractive place for other industries to invest.

Moving away from fossil fuels such as coal is central to achieving emissions targets. This doesn’t have to create huge social unrest. With the goodwill of policymakers and through measures such as those we have identified, decarbonisation strategies can be developed and implemented while maintaining livelihoods for those directly affected.


Owen Douglas, Post-Doctoral Researcher, Environmental Policy, University College Dublin and Kieran Harrahill, PhD Researcher on the Bioeconomy and Society, University College Dublin

This article is republished from The Conversation under a Creative Commons license. Read the original article.