
Cash-strapped University of Arizona says climate action can wait

The University of Arizona this week delayed implementation of its climate action plan, citing a $177 million budget deficit. Despite rising revenues, the university has been grappling with low cash reserves due to overspending and is dealing with hiring freezes, flat-lined salaries, and potential layoffs. Now, the university’s climate commitments may be on the chopping block.

Nick Prevenas, director of media relations at the University of Arizona, said the administration is “currently reassessing how to approach the final steps in the development of the university’s Sustainability and Climate Action Plan to ensure it best supports the university’s Financial Action Plan.” 

Six working groups and two technical teams spent last fall working on nearly 100 recommendations to decrease carbon emissions at the university, including upgrading facilities, incentivizing cleaner transportation options, and improving public awareness of sustainability issues. The list of final recommendations includes divesting from fossil fuels by 2030, creating positions to oversee socially conscious investing, and creating policy to deal with donations from individuals or groups with ties to the fossil fuel industry. According to Prevenas, privately managed fossil fuel investments currently make up 6 percent of the University of Arizona Foundation’s endowment, or about $75 million.

It is now unclear when or if those proposals will be put into action, and Prevenas did not respond to direct questions about how long implementation may be delayed.

“We are the only public university in Arizona that doesn’t have a climate action plan,” said Samantha Gonsalves-Wetherell, a senior at the University of Arizona who has been a leader in the campus divestment movement. “It shows a lack of responsibility and accountability.”

Jake Lowe, executive director of the Campus Climate Network, says Arizona isn’t the first university to backtrack from divestment goals, noting that students at the University of Illinois have protested similar delays. But he says there’s a financial case for sticking with divestment goals, citing a recent analysis by the Institute for Energy Economics and Financial Analysis that advocates for a green transition. 

“Weak economic performance and an unstable future for fossil fuels have made it clear that divestment can be achieved without financial harm to any individual investment fund,” the analysis says. “Divestment is a defensive tool employed to protect investors from the loss of value — losses as certain as climate change’s global reach.”

The news comes just weeks after a Grist investigation found that Arizona is among several universities that rely on fossil fuel production, mining, and other extractive industries to earn revenue from land taken from Indigenous peoples. Divestment activists at the University of Arizona have called the practice shocking, but not surprising.

Nadira Mitchell, a Diné student at the university who is currently serving as Miss Native American University of Arizona, was among those disappointed by Grist’s findings, and the delay in the climate action plan compounds her frustration. 

“If sports funding isn’t cut and the climate action plan is,” she said, “that kind of shows what the university’s priorities are.”

This story was originally published by Grist with the headline Cash-strapped University of Arizona says climate action can wait on Feb 23, 2024.


20 Global Dishes With the Greatest Biodiversity Risks Include Meat-Based and Vegan Fare, Study Finds

Our food choices can have substantial environmental impacts. According to the United Nations Environment Programme, worldwide food production — especially animal agriculture — is the main cause of biodiversity loss. One of the major reasons is that livestock and their feed require a great deal of land.

Certain foods — such as Indian kidney bean curry, Brazilian steak, and lechazo, a Spanish dish made with lamb — have an outsized biodiversity footprint, a new study has found.

“Brazilian cattle, for example, need a lot of space. So do Spanish lambs,” said Roman Carrasco, one of the authors of the study and an associate professor in the department of biological sciences at the National University of Singapore (NUS), as The Guardian reported.

Scientists from NUS used prior research data identifying how certain crops encroached on the ranges of mammals, amphibians and birds to estimate how 151 popular international dishes affect biodiversity, the Public Library of Science (PLOS) said.

The dishes were taken from TasteAtlas.com and CNN.com, with each dish standardized to be 825 kilocalories.

The study, “Biodiversity footprints of 151 popular dishes from around the world,” was published in the journal PLOS One.

The research team calculated each ingredient’s biodiversity footprint by examining the conservation status, range and richness of the wild mammals, amphibians and birds within each product’s agricultural land area, PLOS explained. The scientists then combined the footprint of each ingredient to come up with a total biodiversity footprint for each dish. The footprint scores changed depending on whether an ingredient was sourced globally or locally and whether it was farmed on a small scale or industrially.
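To make that summing step concrete, here is a minimal, illustrative sketch in Python. The ingredient names, amounts, and per-kilogram footprint values are hypothetical placeholders, not the study’s data, and the study’s actual calculation further weighted each footprint by the conservation status, range and richness of affected species.

```python
# Illustrative sketch only -- not the study's code or data.
# Ingredient names, amounts, and per-kg footprint values are hypothetical,
# chosen just to show how per-ingredient scores could be combined into a
# dish-level biodiversity footprint for a standardized 825 kcal serving.

# (ingredient, kilograms used in the serving, footprint per kilogram)
INGREDIENTS = [
    ("beef",  0.15, 4.2),
    ("rice",  0.10, 0.8),
    ("onion", 0.05, 0.1),
]

def dish_footprint(ingredients):
    """Sum each ingredient's contribution: amount used x per-kg footprint."""
    return sum(amount_kg * footprint_per_kg
               for _, amount_kg, footprint_per_kg in ingredients)

if __name__ == "__main__":
    print(f"Dish biodiversity footprint: {dish_footprint(INGREDIENTS):.2f}")
```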

“The study highlights particular problems for dishes using ingredients from tropical areas rich in biodiversity, including Brazil and Mexico,” said lead author of the study Elissa Cheng, an NUS life science graduate, as reported by The Guardian.

According to the study, the 20 dishes with the biggest biodiversity footprints included both meat-based and vegan fare: yukgaejang — Korean vegetable and spicy beef stew; salsa verde pork; several Brazilian steak dishes; caldo de pollo, or chicken soup; and vegan dishes like rajma — kidney bean curry; dal; idli — fermented savory rice cake; and chana masala, a chickpea curry.

The dishes with the largest footprints tended to be made with beef, chicken, legumes and rice. Legumes and rice grown industrially for vegetarian and vegan dishes from the Indian subcontinent tended to have particularly strong effects on threatened species and biodiversity indicators related to range.

High biodiversity impacts were also found to result from lamb and Brazilian beef dishes because of the conversion of diverse ecosystems like the Amazon rainforest to pasture lands.

The smallest biodiversity footprints mostly came from starchy vegan and vegetarian dishes with a base of potato or grain, such as pommes frites, kartoffelpuffer — German potato pancake — triple cooked chips and baguettes.

The researchers looked at mammals, amphibians and birds specifically, without differentiating between wildlife with certain habitat requirements and those able to live in cultivated habitats. The research team said they focused on dishes from nations with a high gross domestic product and acknowledged that those dishes were not necessarily representative. They also noted that variations in recipes could lead to different outcomes.

“Small changes in the dish we choose to eat and where we get the ingredients from can go a long way in preventing species extinctions. In addition to the large footprint of beef and lamb dishes from countries containing biodiversity hotspots, vegetarian dishes from highly biodiverse and under strong human pressure countries, like India, can be also very detrimental for biodiversity,” the authors said.

The post 20 Global Dishes With the Greatest Biodiversity Risks Include Meat-Based and Vegan Fare, Study Finds appeared first on EcoWatch.


Thousands Evacuated as Australia Bushfire Burns Out of Control

More than 2,000 residents of Australia’s Victoria state were under evacuation orders on Thursday due to an out-of-control bushfire.

Residents of Beaufort, Raglan and surrounding areas were encouraged by the state emergency service to leave and head east toward Melbourne while it was safe to do so, reported Reuters.

“Leaving immediately is the safest option for those communities,” said State Premier Jacinta Allan in a news conference, as Reuters reported. “If you are located in these areas, please heed this advice, please act now to save your own life.”

The fire near Beaufort and Raglan covered more than 17,297 acres, according to The Age.

Jason Heffernan, chief officer of the Country Fire Authority (CFA), told Melbourne radio station 3AW that Lexton and Amphitheatre to the north of the fire, as well as Burrumbeet and Addington to the east, were currently the most threatened.

Wind gusts as high as 31 miles per hour had been making the fires more difficult to fight, Heffernan said.

“Thankfully, just before going on air here tonight, I’ve started to get some initial [intelligence that] those winds are dropping,” Heffernan said, as reported by The Age.

Smoke, ash and embers from the blaze were being seen in neighboring communities, the Australian Broadcasting Corporation (ABC) said.

In Tasmania, emergency and watch alerts, as well as smoke advisories, were in effect.

“People at higher risk from the effects of smoke, including those with medical conditions, should enact their personal plan for avoiding smoke and managing their health,” advised TasALERT, according to ABC. “People traveling in the area should be careful if driving in smoke. Turn on your headlights, drive slowly and be aware of emergency services in the area.”

Chris Hardman, chief of Forest Fire Management Victoria, told ABC Melbourne radio the bushfire was being fought by more than 1,000 firefighters.

In addition, more than 100 vehicles and 24 aircraft were providing support, with more on the way, according to Allan, as Reuters reported.

Approximately 12,355 acres were burning to the northwest of Ballarat, with another similarly sized area burning out of control to the west.

Several districts were under extreme fire danger warnings issued by the Bureau of Meteorology because of hot, dry winds and a risk of thunderstorms.

Heffernan expressed gratitude to firefighters and other emergency workers in a social media post.

“Two words ‘Thank You’! to all emergency service personnel on the fire line today and tonight. It’s not over by far as many are putting their lives at risk as I type, to fight for Beaufort, Lexton, Raglan and surrounding communities,” Heffernan posted on X.

The post Thousands Evacuated as Australia Bushfire Burns Out of Control appeared first on EcoWatch.


Heavy Metals and Other Toxins Found in Stranded Whales and Dolphins

A new study has detected heavy metals and other toxins in whales and dolphins that became stranded in Florida and Georgia over a 15-year timeframe.

Researchers analyzed tissue and fecal samples of 90 odontocetes (toothed whales), spanning nine different species, that had stranded in Florida and Georgia from 2007 to 2021. In total, the team led by the Harbor Branch Oceanographic Institute at Florida Atlantic University analyzed over 319 samples of blubber, kidney, liver, skeletal muscle, skin and feces for 12 trace elements: cobalt, copper, iron, manganese, molybdenum, selenium, zinc, arsenic, cadmium, lead, mercury and thallium.

The results revealed that the animals did experience bioaccumulation of heavy metals in varying amounts, which they could be exposed to through their diets. 

The study, published in the Cell Press journal Heliyon, revealed that the highest concentrations of mercury, cadmium and lead were found in two species, Risso’s dolphins (Grampus griseus) and short-finned pilot whales (Globicephala macrorhynchus).

The research also unveiled that heavy metal exposure may be increasing over time. According to the study, researchers found higher concentrations of several trace elements, including arsenic, copper, iron, lead, manganese, selenium, thallium and zinc, for adult pygmy (Kogia breviceps) and dwarf sperm whales (Kogia sima) that stranded from 2019 to 2021 compared to the same species that stranded earlier, in 2010 to 2018.

“When we separated phylogenetic groups into age classes and compared median concentrations of heavy metals in specific tissue types between adult specimens of species, we found some interesting trends,” Annie Page, senior author of the study, associate research professor and clinical veterinarian at Harbor Branch, said in a statement.

Another trend the authors noted in the study was that the highest concentrations of several elements were found in fecal matter. Why is this important? According to the authors, it shows how valuable data collected in a non-invasive way can be.

While mercury and other toxins can affect many species in marine food webs, these elements can be especially harmful to those at the top of the food chain, as reported by the Minamata Convention on Mercury. Heavy metals accumulate as predators consume their prey and can cause a range of health problems for whales, dolphins and other species.

“Exposure to heavy metal contaminants can result in oxidative stress, which can impair protein function, damage DNA and disrupt membrane lipids,” Page explained. “Heavy metal exposure has been linked to degenerative heart disease, immunodeficiency and increased parasite infestations, among other disease risks.”

The authors hope this research will contribute to baseline data on the risks of bioaccumulation of trace elements in whales and dolphins.

“This study illustrates the importance of monitoring toxic contaminants in stranded odontocetes, which serve as important sentinels of environmental contamination, and whose health may be linked to human health,” the study concluded.

The post Heavy Metals and Other Toxins Found in Stranded Whales and Dolphins appeared first on EcoWatch.


How air pollution delayed a surge in extreme rain

The past half-century has seen remarkable improvements in air quality in many parts of the world, thanks largely to legislation like the U.S. Clean Air Act. Efforts like these took aim at pollutants like the group of chemicals known as aerosols, which include sulfur dioxide, nitrogen dioxide, and other compounds that are harmful to human health.

Like greenhouse gases, aerosols are produced by cars, factories, and power plants — but unlike greenhouse gases, they make the earth cooler rather than warmer. This is because aerosols reflect the sun’s rays, rather than trapping its heat like carbon. Some studies estimate that, without aerosol pollution, the world might have already warmed by another half a degree Celsius.

This creates a tricky paradox, which renowned climate scientist James Hansen has called a “Faustian bargain.” If you remove aerosols from the air, you reduce the health impacts of pollution, saving thousands of people from lung and heart disease, but you might also make global warming worse. This powerful relationship has been on display over the past few years in the maritime shipping industry: As freight ships have stopped using dirty bunker fuel since 2020, they’ve also stopped emitting trails of sulfur dioxide, which has caused world temperatures to jump by an additional 0.05 degrees C.

Now, new research shows that the interaction between aerosols and greenhouse gases also has implications for flooding, which is one of the costliest climate disasters. A peer-reviewed paper published this week in Nature Communications finds that the presence of toxic aerosols in the atmosphere over the United States helped suppress the impacts of climate change on rainfall for decades, postponing a surge in rainfall and flood risk driven by climate change. The passage of clean air laws, which removed these aerosols from the atmosphere, ironically unleashed a trend of worsening floods.

The paper’s results help solve what had been something of a mystery in climate science: Even though warmer air holds more moisture, rainfall in the United States hasn’t been increasing in the way scientists expected as temperatures rise.

“This paper highlights that the counteraction between aerosols and greenhouse gases has likely masked a lot of climate hazards over the last few decades,” said Geeta Persad, an assistant professor of Earth sciences at the University of Texas at Austin and an expert on aerosols. (Persad wasn’t involved in the study.)

“If aerosol emissions drop drastically over the next few decades and greenhouse gases don’t, a lot of those unanticipated climate hazards could be revealed,” added Persad.

The paper uses data from thousands of rain gauges to tease out how aerosols and greenhouse gases have influenced rainfall averages and the frequency of extreme rain events. The use of rain gauges allowed researchers to trace how the two types of human-caused pollution balance each other out in different regions of the country.

Greenhouse gases have been stacking up in the atmosphere for more than a century, and they have a pretty simple impact on rainfall. The more carbon dioxide is in the atmosphere, the hotter it gets; the hotter it gets, the more moisture the atmosphere can hold. Aerosols are more complicated: They react differently with different types of clouds, and as a result their impact on rainfall varies from region to region and from season to season. In most of the U.S., they made things drier.

The passage of the landmark Clean Air Act in 1970 caused a rapid decline in aerosol pollution as factories installed “scrubber” devices to clean up their smokestacks and automakers updated their cars to comply with emission limits. The disappearance of these aerosols left greenhouse gases to dominate in the atmosphere, which started to ratchet up rainfall totals. If those aerosols hadn’t been there, the paper argues, rainfall and flooding might have started worsening in the United States several decades earlier.

Separating out the effect of these aerosols also allows the researchers to make predictions about how flood risk will change over the next decade. It’s not good news: Now that there’s nothing to offset the heat-trapping effect of carbon dioxide and methane, much of the country is about to get a lot wetter and see a lot more monster storms.

“This somewhat rapid intensification of rainfall extremes is the new normal, at least for the next five years,” said Mark Risser, a research scientist at Lawrence Berkeley National Lab and one of the paper’s lead authors.

The effect is most pronounced in the southeastern United States, where a slew of hurricanes and rainstorms have caused billions of dollars of flood damage in recent years. The authors find that aerosol pollution tamped down summer and fall precipitation until the late 20th century, when the effect of greenhouse gases started to dominate in the region. That led to both an increase in annual rainfall totals and an increase in the frequency of big rainstorms. (Previous research has shown that aerosols can also suppress the emergence of tropical storms by disrupting cloud formation.)

The paper’s findings could have big implications for the next few decades of environmental regulation. President Biden’s Environmental Protection Agency is racing to finalize strict regulations on industrial pollution that could slash emissions of key aerosol pollutants such as sulfur dioxide. If these regulations take effect, they would apply to numerous facilities in the Southeast, including the petrochemical facilities in the Louisiana region known as “Cancer Alley.”

These regulations would protect residents who live near industrial facilities from asthma, heart disease, and cancer, but a further decline in aerosols could also make hurricane season worse by allowing big storms to hold more moisture — meaning more events like Hurricane Harvey, which struck in 2017 and stunned climate scientists by dropping more than 50 inches of rain over Houston, Texas.

Persad, the aerosols expert, says the paper offers a grim warning about future climate risk. If air pollution declines in the United States over the next few decades, many more Americans in regions such as the Southeast could see stronger storms and more severe flooding.

“We’re looking at a situation where over the next 30 years, you could either keep masking, or you could reveal 50 percent more warming,” she said. “Up until now, there has not been very much recognition of how much the evolution of this aerosol signal, over the lifetime of a mortgage of a house that somebody buys today, is going to affect the climate hazards they’re exposed to.”

This story was originally published by Grist with the headline How air pollution delayed a surge in extreme rain on Feb 22, 2024.


Climate change is undoing decades of progress on air quality

A choking layer of pollution-laced fog settled over Minneapolis last month, blanketing the city in its worst air quality since 2005. A temperature inversion acted like a ceiling, trapping small particles emitted from sluggish engines and overworked heaters in a gauze that shrouded the skyline. That haze arrived amid the hottest winter on record for the Midwest. Warmer temperatures melted what little snow had fallen, releasing moisture that helped further trap pollution.

Though summertime pollution from wildfire smoke and ozone receives more attention, climate change is making these kinds of winter inversions increasingly common — with troubling results. One in four Americans is now exposed to unhealthy air, according to a report by First Street Foundation.

Jeremy Porter, head of climate implications research at the nonprofit climate research firm, calls this increase in air pollution a “climate penalty,” rolling back improvements made over four decades. On the West Coast, this inflection point was passed about 10 years ago; air quality across the region has consistently worsened since 2010. Now, a broader swath of the country is starting to see deteriorating conditions. During Canada’s boreal wildfires last summer, for example, millions of people from Chicago to New York experienced some of the worst air pollution in the world. It was a precedent-breaking spate that saw the average person exposed to more small particulate matter than at any time since tracking began in 2006.

It’s a preview of more to come.

Since Congress passed the Clean Air Act in 1970, federal law has regulated all sources of emissions, successfully reducing pollution. Between 1990 and 2017, the concentration of particles smaller than 2.5 micrometers, known as PM2.5, fell 41 percent. These particulates pose a significant threat because they can burrow into the lungs and enter the bloodstream. Exposure can cause heart disease, strokes, respiratory diseases like lung cancer, and premature death. Such concerns prompted the Environmental Protection Agency to toughen pollution limits for the first time in a decade, lowering the limit from 12 micrograms per cubic meter of air to 9 earlier this month.

But a stricter standard isn’t likely to resolve the problem, said Marissa Childs, a post-doctoral researcher at Harvard University’s Center for the Environment. That’s because the agency considers wildfires an “exceptional event,” and therefore exempt from the regulation. Yet about one-third of all particulate matter pollution in the United States now comes from wildfire smoke. “The Clean Air Act is challenged by smoke,” she said, both because wildfires defy the EPA’s traditional enforcement mechanisms, and because of smoke’s capacity to travel long distances. “Are we going to start saying that New York is out of compliance because California had a fire burning?”

To get a better sense of how a growing exposure to air pollution might impact the public, First Street used wildfire and climate models to estimate what the skies might look like in the future. (Though its researchers relied on Childs’ national database of PM2.5 concentrations, she was not otherwise involved with First Street’s report.) They found that by 2054, 50 percent more people, or 125 million in all, will experience at least one day of “red” air quality, with an Air Quality Index of 151 to 200, a level considered risky enough that everyone should minimize their exposure. “We’re essentially adding back additional premature deaths, adding back additional heart attacks,” Porter said at a meeting about the report. “We’re losing productivity in the economic markets by additionally losing outdoor job work days.”

First Street has now added its air quality predictions to an online tool that allows anyone to search for climate risks by home address. As extreme heat increases ozone and changing conditions intensify wildfires, it shows just how unequal the impacts will be. While New York City is projected to see eight days a year with the Air Quality Index at an unhealthy orange, meaning an index in the range of 101 to 150, an increase of two days, the Seattle metropolitan area is expected to see almost two additional weeks of poor air. “That’s two more weeks out of only 52,” said Ed Kearns, First Street’s chief science officer. “Twelve more days of being trapped in your house, not being able to go outside — worrying about the health consequences.”
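For reference, the color categories cited here follow the standard U.S. EPA Air Quality Index breakpoints; the short sketch below simply maps an index reading to its category label.

```python
# Standard U.S. EPA Air Quality Index categories, listed by upper bound.
AQI_CATEGORIES = [
    (50,  "Good (green)"),
    (100, "Moderate (yellow)"),
    (150, "Unhealthy for Sensitive Groups (orange)"),
    (200, "Unhealthy (red)"),
    (300, "Very Unhealthy (purple)"),
    (500, "Hazardous (maroon)"),
]

def aqi_category(aqi: int) -> str:
    """Return the EPA category label for an Air Quality Index reading."""
    for upper_bound, label in AQI_CATEGORIES:
        if aqi <= upper_bound:
            return label
    return "Beyond the AQI scale"

if __name__ == "__main__":
    print(aqi_category(120))  # Unhealthy for Sensitive Groups (orange)
    print(aqi_category(175))  # Unhealthy (red)
```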

Just as the sources of pollution are unevenly distributed, so too is people’s ability to respond. “People across the board are seeking information about air quality,” Childs said, for example, searching online about pollution levels on particularly smoky days. But not everyone has the same ability to make choices to protect themselves. Childs cowrote a 2022 Nature Human Behavior paper that found behavioral responses to smoke — staying indoors, for example, or driving to work rather than waiting for the bus — are strongly correlated with income. If left to individuals, she says, “the people who have the most resources are going to be the most protected, and we’re going to leave a lot of people behind.”

In a collaboration with real estate company Redfin, First Street found early signals that suggest people are already leaving areas with poor air quality. Tarik Benmarhnia, an environmental epidemiologist at the University of California, San Diego, quibbles with those conclusions, however, saying many variables influence both air quality and residential mobility, like income and housing prices. Air pollution is a notoriously complex subject — difficult to predict even a week out, much less speculate on what might happen in three decades. “I think the most critical problem is a total absence of any discussion of uncertainty,” he said. 

He also worries that First Street’s risk index could unintentionally magnify these distinctions of privilege. If potential homeowners use the database to avoid areas based on the report’s predictions, property values in those regions could fall accordingly, reducing tax bases and decreasing the ability to provide services like community clean air rooms during smoke events. “It may act like a self-fulfilling prophecy.”  

Benmarhnia notes that traditional sources of air pollution, like factory emissions, show a very consistent relationship between socio-economic status, race, and higher pollution levels, a pattern that repeats across the country. Smoke and ozone don’t tend to follow these social gradients because they disperse so widely. “But wildfire smoke doesn’t come on top of nothing, it’s on top of existing inequities” like access to health care, or jobs that increase outdoor exposures, he said. “Not everybody is starting from the same place.” Benmarhnia recently published a paper finding that wildfires, in concert with extreme heat, compound the risk to cardiovascular systems. But the people most likely to be harmed by these synergies live in low-income communities of color.  

“The thing about air pollution is there’s only so much you can do at individual or civil society level,” said Christa Hasenkopf, the director of the Clean Air Program at the Energy Policy Institute at the University of Chicago. “It’s a political and social issue that has to be tackled at a national level.” The university’s Air Quality Life Index measures how air pollution is contributing to early deaths around the world, aiming to provide a clearer image of the health gaps. “The size of the impact on life expectancy in two relatively geographically nearby areas can be surprising,” she says, like between eastern and western Europe. 

For her part, Hasenkopf is enthusiastic about First Street’s air quality report, hoping it will help highlight some of these inequities. Though 13 people die every minute from air pollution, funding for cleaner air solutions remains limited. “That disconnect between the size of the air pollution issue, and what resources we are devoting to it is quite startling,” Hasenkopf said. 

This story was originally published by Grist with the headline Climate change is undoing decades of progress on air quality on Feb 22, 2024.


Supreme Court weighs blocking a federal plan to cut smog pollution

The trouble with air pollution is that it tends to travel — blowing downwind for hundreds of miles, entering the lungs of people living far from its source. Nitrogen oxides emitted by coal-fired power plants, for example, can waft across state lines and react with other chemicals in the atmosphere to form ozone, a potent pollutant and the main ingredient in smog. Last March, the federal Environmental Protection Agency issued a rule to rein in those downwind ozone pollutants in 23 states. But in the months since, states and fossil fuel industry groups have filed dozens of lawsuits to block the plan. As a result of this ongoing litigation, the agency’s ozone pollution reduction rule, dubbed the “Good Neighbor” plan, has been put on hold in 12 states, including Kentucky, Texas, and Utah. 

Those legal battles have now reached the Supreme Court. On Wednesday, as supporters of the rule demonstrated outside, attorneys representing the state of Ohio, the oil and gas pipeline company Kinder Morgan, the American Forest and Paper Association, and the manufacturing company U.S. Steel, among others, presented oral arguments before the Supreme Court. The groups want the court to grant what’s called an “emergency stay,” which would halt the Good Neighbor plan entirely — even in the 11 states already implementing the rule — while lawsuits in lower courts play out. 

The justices wouldn’t have a final say on the legitimacy of the EPA’s rule — that’s up to the U.S. Court of Appeals for the District of Columbia Circuit, which is currently wrangling with 18 related lawsuits on that question. But legal experts say that Wednesday’s oral arguments seem to indicate that the Supreme Court could end up wading into the validity of the Good Neighbor plan in its decision anyway, with untold public health consequences for residents of downwind states.

“The applicants are trying to get the Supreme Court to weigh in on the merits through this procedural stay application,” Zachary Fabish, senior attorney at the Sierra Club, told Grist based on what he heard at the court on Wednesday. “And the downwind folks in those states are paying the public health price.”

A few justices commented on the plaintiffs’ unusual choice to argue in front of the Supreme Court before their pending litigation has been decided by the D.C. Circuit. The groups even admitted during oral arguments that they had requested a delayed briefing at the lower court so they could present their case to the Supreme Court first.

“It’s fairly extraordinary, I think, to be asking the court to decide this matter when you haven’t even lost below in terms of what is before the D.C. Circuit,” Justice Ketanji Brown Jackson told the plaintiffs. “So I’m trying to understand what the emergency is that warrants Supreme Court intervention at this point.”

That emergency, the state and industry plaintiffs argue, mostly boils down to the costs of complying with the EPA’s ozone reduction plan. In 2015, the EPA updated the federal air quality standard for ozone, which sets strict limits for that pollutant nationwide. According to federal law, each state was required to submit a plan within three years of the updated standard describing how it would reduce the amount of ozone pollution blowing downwind to other states. If they failed to do so, or submitted inadequate plans, the EPA was obligated under the Clean Air Act to enforce the Good Neighbor rule to reduce downwind pollution in those states. By February 2023, the EPA had rejected 21 states’ plans; another two, Pennsylvania and Virginia, did not submit one. 

In March, the agency issued the Good Neighbor plan for those 23 states, a rule that plaintiffs argued levies an unfair burden on states like Ohio and Indiana; oil and gas companies; and heavy industry. “In order to get into compliance with an unlawful federal rule, we are spending immense sums, both the states as well as our industries,” argued Ohio Deputy Solicitor General Mathura Sridharan. 

But Judith Vale, a deputy solicitor general for New York who argued in favor of the Good Neighbor plan, noted that the EPA’s rule helps address inherent cost imbalances between upwind and downwind states. In many cases, power plants and industrial facilities in upwind states in the South and Midwest would simply need to turn on existing pollution controls to come into compliance. Downwind states like Connecticut and Wisconsin, on the other hand, need to reduce their own pollution while also compensating for pollutants blowing into their jurisdiction. 

Often, those states have “already exhausted a lot of the less expensive strategies,” Vale said. “So they need to turn to more and more expensive strategies to find any further cuts.”

While Jackson and other liberal justices seemed to question challenges against the Good Neighbor plan, conservative justices like Justice Brett Kavanaugh appeared more sympathetic. In response to a point raised by Malcolm Stewart, a deputy solicitor general at the U.S. Department of Justice, that pausing the air pollution plan would disproportionately harm downwind states, Kavanaugh agreed but added that “there’s also the equities of the upwind states and the industry,” concluding that both sides had experienced irreparable harm.

Fabish noted that the court’s decision to even schedule oral arguments for this case is highly unusual. The request for the emergency stay arrived on the Supreme Court’s “shadow docket,” a lineup of cases that, until recently, involved less consequential matters and got decided on without oral arguments, extensive hearings, or even explanations from the judges. But by asking for briefings and an oral argument, the court has created a kind of “process conundrum” for themselves, Fabish said. While the justices have some materials to base a judgment on, he noted they lack most of the evidence used in a typical case, such as extensive briefs, documents, and arguments. The justices also lack detailed opinions from a lower court, since the D.C. Circuit has yet to issue a decision.

All those factors, in addition to dozens of pending lawsuits related to the Good Neighbor rule in courts across the country, create a great deal of uncertainty around how and when the Supreme Court will rule on this application, Fabish said. Richard Lazarus, an environmental law professor at Harvard Law School, told Harvard Law Today that anywhere from four to six justices could agree to halt the rule, pointing to Kavanaugh, Neil Gorsuch, Clarence Thomas, and Samuel Alito as likely votes to do just that. Meanwhile, other justices worried aloud whether this case could encourage future plaintiffs to use the shadow docket as a venue to challenge environmental regulations. 

“I mean, surely, the Supreme Court’s emergency docket is not a viable alternative for every party that believes they have a meritorious claim against the government and doesn’t want to have to comply with a rule while they’re challenging it,” Justice Jackson said. 

This story was originally published by Grist with the headline Supreme Court weighs blocking a federal plan to cut smog pollution on Feb 22, 2024.
