Carbon dioxide is increasing in the atmosphere 10 times faster than at any point in the last 50,000 years, according to a new study led by researchers from the University of St. Andrews and Oregon State University.
The findings shed light on periods of abrupt climate change in the planet’s history while offering new understanding of the impacts of today’s climate crisis.
“Studying the past teaches us how today is different. The rate of CO2 change today really is unprecedented,” said Kathleen Wendt, lead author of the study and an Oregon State University assistant professor in the College of Earth, Ocean and Atmospheric Sciences (CEOAS), in a press release from University of St. Andrews. “Our research identified the fastest rates of past natural CO2 rise ever observed, and the rate occurring today, largely driven by human emissions, is 10 times higher.”
The international research team conducted a detailed analysis of the chemicals in ancient Antarctic ice, which showed the impact of human carbon emissions.
Over hundreds of millennia, ice built up in Antarctica, trapping samples of the ancient atmosphere in air bubbles. Scientists drill cores as deep as two miles to sample the ice, analyze trace chemicals and reconstruct records of past climate.
Prior research revealed that the last ice age — which came to an end roughly 10,000 years ago — contained several periods during which carbon levels appeared to spike. However, Wendt said the measurements didn’t contain enough detail to show the complete picture of the changes, hampering scientists’ ability to comprehend what was going on.
“You probably wouldn’t expect to see that in the dead of the last ice age,” Wendt said in the press release. “But our interest was piqued, and we wanted to go back to those periods and conduct measurements at greater detail to find out what was happening.”
Using samples from the ice core of the West Antarctic Ice Sheet Divide, the team found a pattern showing that the sharp increases in carbon dioxide occurred alongside cold intervals in the North Atlantic — known as Heinrich Events — associated with abrupt global climate shifts.
“These Heinrich Events are truly remarkable,” said Christo Buizert, co-author of the study and an associate professor with CEOAS, in the press release. “We think they are caused by a dramatic collapse of the North American ice sheet. This sets into motion a chain reaction that involves changes to the tropical monsoons, the Southern hemisphere westerly winds and these large burps of CO2 coming out of the oceans.”
During the most prominent of the natural rises, carbon dioxide increased by approximately 14 parts per million over a 55-year period, and such spikes happened roughly once every 7,000 years. Today, an increase of that magnitude takes only five to six years.
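For readers who want to check the arithmetic, a quick back-of-the-envelope comparison (a sketch using only the figures quoted above, not the study’s own calculation) bears out the roughly tenfold difference in rates:

```python
# Back-of-the-envelope comparison of CO2 rise rates, using only the figures
# quoted in this article (illustrative, not the study's own calculation).
natural_rise_ppm = 14      # largest natural CO2 jump seen in the ice core
natural_years = 55         # duration of that natural jump

modern_years = 5.5         # "five to six years" for the same rise today

natural_rate = natural_rise_ppm / natural_years   # ~0.25 ppm per year
modern_rate = natural_rise_ppm / modern_years     # ~2.5 ppm per year

print(f"Natural rate: {natural_rate:.2f} ppm/yr")
print(f"Modern rate:  {modern_rate:.2f} ppm/yr")
print(f"Ratio: roughly {modern_rate / natural_rate:.0f}x")  # ~10x, matching the study
```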
The evidence suggested that during the periods of natural increase, westerly winds integral to deep ocean circulation were also getting stronger, bringing about a quick release of carbon from the Southern Ocean.
“These Heinrich Events kick off an astonishing sequence of rapid shifts in climate around the world,” said co-author of the study Dr. James Rae of the University of St. Andrews School of Earth and Environmental Sciences. “They start with a weakening of the North Atlantic’s circulation system, which causes rapid cooling in NW Europe, sea ice expansion from Scotland to New York, and disruption to tropical monsoons. Our paper shows they also change winds and circulation in the ocean round Antarctica, which belches out CO2.”
The study, “Southern Ocean drives multidecadal atmospheric CO2 rise during Heinrich Stadials,” was published in the journal Proceedings of the National Academy of Sciences.
Previous studies have suggested that climate change will cause westerly winds to get stronger over the course of the next century. The researchers noted that, if that happens, their findings suggest the Southern Ocean’s ability to absorb human-generated carbon will be reduced.
“We rely on the Southern Ocean to take up part of the carbon dioxide we emit, but rapidly increasing southerly winds weaken its ability to do so,” Wendt said.
The United States Federal Energy Regulatory Commission (FERC) on Monday approved the first overhaul of the country’s electric transmission policy in more than a decade.
The changes are intended to speed the planning and construction of new interregional lines and bring more renewable energy online to meet increasing demand.
“Our country is facing an unprecedented surge in demand for affordable electricity while confronting extreme weather threats to the reliability of our grid and trying to stay one step ahead of the massive technological changes we are seeing in our society,” said FERC Chairman Willie Phillips in a press release from FERC. “Our nation needs a new foundation to get badly needed new transmission planned, paid for and built. With this new rule, that starts today.”
The rule marks the first time FERC has directly addressed the country’s need for long-term energy transmission planning, and it will help achieve President Joe Biden’s target of decarbonizing the U.S. economy by 2050, Reuters reported.
FERC spent almost two years developing the rule, which sets new requirements for how electricity is transmitted across state lines and for how projects are approved and financed.
Transmission owners will now need to develop 20-year assessments of their regional transmission needs and revisit them every five years. The regulation also mandates that operators identify opportunities to modify existing transmission facilities, a practice known as “right-sizing.”
“This rule cannot come fast enough,” Phillips said, as Reuters reported. “There is an urgent need to act to ensure the reliability and the affordability of our grid. We are at a transformational moment for the electric grid with phenomenal load growth.”
To meet the Biden administration’s goal of decarbonizing the power sector by 2035, the U.S. must more than double the country’s regional transmission capacity, as well as expand the interregional transmission capacity by more than five times, according to a study conducted by the U.S. Department of Energy last year.
“Our country’s aging grid is being tested in ways that we’ve never seen before. Without significant action now, we won’t be able to keep the lights on,” Phillips said, as reported by The New York Times.
The new rule also includes a code of conduct asking project participants to work with Native American Tribes through early outreach, issue an environmental justice report and engage with communities.
“We need to seize this moment,” Phillips said in the press release. “Over the last dozen years, FERC has worked on five after-action reports on lessons learned from extreme weather events that caused outages that cost hundreds of lives and millions of dollars. We must get beyond these after-action reports and start planning to maintain a reliable grid that powers our entire way of life. The grid cannot wait. Our communities cannot wait. Our nation cannot wait.”
America’s energy system has a problem: Solar and wind developers want to build renewable energy at a breakneck pace — and historic climate legislation has fueled their charge with financial incentives worth billions of dollars. But too often the power that these projects can produce has nowhere to go. That’s because the high-voltage lines that move energy across the country don’t have the capacity to handle what these panels and turbines generate. At the same time, electric vehicles, data centers, and new factories are pushing electricity demand well beyond what was expected just a few years ago.
As a result, the U.S. is poised to generate more energy — and, crucially, more carbon-free energy — than ever before, but the nation’s patchwork system of electrical grids doesn’t have enough transmission infrastructure to deliver all that renewable energy to the homes and businesses that could use it. Indeed, this transmission gap could negate up to half of the climate benefits of the Inflation Reduction Act, according to one analysis.
On Monday, the Federal Energy Regulatory Commission, or FERC, approved a new rule that could help complete this circuit. The agency, which has jurisdiction over interstate power issues, is essentially trying to prod the country’s many electricity providers to improve their planning processes and coordinate with each other in a way that encourages investment in this infrastructure. The hope is that this new regulation will not only address the outstanding interconnection challenge and growing demand but also fortify the grid in the face of extreme weather, given that more transmission will make it easier to shift electricity from one grid to another when there are disaster-driven outages.
The new rule, which has been years in the making, creates two new critical requirements.
First, it will require the operators of regional grids across the country to forecast their region’s transmission needs a full 20 years into the future, develop plans that take those forecasts into account, and update those plans every five years. In practice, this should mean a more robust consideration of new wind and solar options, as well as greater adherence to the net-zero emissions targets set by many U.S. states. Second, the rule requires providers to identify opportunities where they can upgrade existing infrastructure in a way that increases capacity, creating an easier route to moving more power between states without the complexity of building new lines from scratch.
“This rule recognizes the reality on the ground, that the factors affecting our grid — they are changing,” FERC Chair Willie Phillips said at a press conference on Monday.
The nation’s energy system is in the midst of a massive transformation. Around two dozen states have established definitive goals for clean energy in the decades ahead, with most of those on the books before President Joe Biden committed the U.S. to achieving 100 percent carbon-free electricity by 2035. Already, the clear demand for building out renewable energy has resulted in nearly 2,600 gigawatts of generation and storage capacity bidding for permission to plug into the nation’s aging transmission system. This is more than double all the resources already connected.
While these goals and plans depend upon the nation’s transmission infrastructure, regional grid operators have until now faced no requirement to ensure that new sources of renewable generation can connect to power lines without overloading them. FERC’s new rule changes that.
Prior to the rule, the 10 regional transmission operators that make up America’s patchwork grid were able to take largely independent approaches to infrastructure planning. Few took a meaningfully proactive approach to meeting the demands of climate policy.
There is one exception: The new FERC rule builds off of a comprehensive approach developed by the transmission authority responsible for most of the Midwest, which retooled its planning in part to help meet some of its states’ aggressive climate targets. The new federal regulation requires operators in every U.S. region to similarly consider at least three potential scenarios for how their electricity requirements will change over the coming two decades and establish plans in line with them.
However, the reality of the rulemaking process means that the action might not come as quickly as the moment seems to demand. Though the rule was approved on Monday, it doesn’t take effect until 60 days after its publication, and then grid operators and transmission planners will have 10 to 12 months to outline how they intend to comply with the new rule. Only then will the actual planning begin.
“These reforms are coming at a critical time,” said Christine Powell, deputy managing attorney for the clean energy program at the nonprofit Earthjustice. Powell added that it can sometimes take 10 years to get new transmission lines built given the logistical hurdles involved, so getting planning processes started as soon as possible is essential.
Thankfully, not all the necessary work involves building new infrastructure on these lengthy timelines. The new rule also carves out a requirement for “right-sizing” existing infrastructure by, for instance, using state-of-the-art conductors and transformers to, in some cases, double the transmission capacity of existing towers.
Of course, these new requirements could be delayed or derailed by lawsuits — a likely prospect given the history of legal challenges faced by major FERC rules in the past. Both Powell and Phillips said that they believe that the new policy is durable enough to withstand those challenges. Powell told Grist that the rule went through a lengthy review process that involved extensive public comment. FERC went through 15,000 pages of those comments and ensured that the arguments and issues raised in each were weighed and considered before the final rule was completed.
Still, FERC commissioner Mark Christie, a Republican and the lone “no” vote in Monday’s decision, claimed in his dissent that the process was rushed. Powell, who worked at FERC for eight years, disputed this in an interview with Grist.
“That is not FERC rushing,” she said. “That is FERC really trying to deliberate and do the right thing.”
Chaz Teplin, who leads the clean competitive grids team at the nonprofit Rocky Mountain Institute, added that the new rule had broad bipartisan support, despite Christie’s dissent. Republican governors and lawmakers were among those who recognized the importance of a policy like this and issued comments in support of it. Even Neil Chatterjee, a Republican who chaired FERC under then-President Donald Trump, published a tweet stating, “If I were on the commission I would have voted for it.”
Editor’s note: Earthjustice is an advertiser with Grist. Advertisers have no role in Grist’s editorial decisions.
Following months of heavy rain, officials in Los Angeles announced that an estimated 96.3 billion gallons of stormwater were captured from October 2023 through April 2024.
As the Los Angeles Times reported, that amount of stormwater is enough to meet the water demands of about 2.4 million people, roughly 25% of the county’s population.
In February 2024 alone, the city captured 5 billion more gallons of stormwater compared to the previous year, according to Mayor Karen Bass. The total captured stormwater for that month reached 13.5 billion gallons.
In a regular year, the city captures an average of 8.8 billion gallons of stormwater, the Los Angeles Department of Water & Power reported. In the past year, stormwater capture has reached about 204 billion gallons, enough to supply water to about 5 million people, according to Water for LA, a county program.
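Those figures imply a fairly consistent per-person supply. A rough check (a sketch based only on the numbers reported above) works out to roughly 40,000 gallons per person per year, or about 110 gallons per person per day:

```python
# Implied per-person water supply behind the figures above
# (a sketch based only on the numbers reported in this article).
captured_gallons = 96.3e9   # captured October 2023 through April 2024
people_served = 2.4e6       # people officials say this could supply

per_person_year = captured_gallons / people_served   # ~40,000 gallons
per_person_day = per_person_year / 365               # ~110 gallons

print(f"~{per_person_year:,.0f} gallons per person per year")
print(f"~{per_person_day:.0f} gallons per person per day")

# The larger county-wide figure is consistent with the same ratio:
print(f"County-wide: ~{204e9 / 5e6:,.0f} gallons per person per year")
```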
The county had a particularly rainy winter and early spring, reaching the wettest day in over 20 years on Feb. 4, 2024, and February was the seventh-wettest month for Los Angeles in its recorded history, The New York Times reported. By April, Los Angeles had experienced more rainfall than infamously rainy Seattle.
Without proper stormwater management infrastructure, rainfall flows into storm drains, picking up pollution from the city along the way, and is eventually released untreated into local waterways and the Pacific Ocean.
The recent increase in stormwater capture can be linked to infrastructure projects designed to save more stormwater during the region’s recent wet winters. The county has invested about $1 billion in stormwater capture and storage projects since 2001, the Los Angeles Times reported. As of December 2023, Los Angeles County had established 126 stormwater management infrastructure projects, NBC4 reported.
Los Angeles recently established the L.A. County Water Plan, adopted by the Los Angeles County Board of Supervisors in December 2023. This plan is set to increase the local water supply by 162 billion gallons by 2045 and ultimately meet 80% of the county’s water demand. Currently, about two-thirds of the area’s water is imported from areas like Northern California or the Colorado River.
With improved stormwater capture management infrastructure and programs, the county is making progress toward its goal to source more water locally to meet demand.
“We know with weather volatility, we have to save every drop of water that we can. So this has to continue to be a trend that we invest in,” Lindsey Horvath, chair of the Los Angeles County Board of Supervisors, told the Los Angeles Times. “The more we see investment in infrastructure, the more we’re going to be able to capture and make a difference, and keep that water resource local.”
Grist, an award-winning, nonprofit media organization dedicated to highlighting climate solutions and uncovering environmental injustices, has acquired the nonprofit news site The Counter. Launched in 2015, The Counter investigated the forces shaping how and what America eats, and ceased publication in May 2022 due to a funding shortfall.
As part of the acquisition, Grist will maintain The Counter’s archives and will take ownership of the organization’s brand assets. Grist will carry on a large portion of The Counter’s mission by continuing to investigate “the forces shaping how and what America eats” and how those forces intersect with climate and the environment. To that end, today we’re launching a food and agriculture vertical, which includes two new hires.
Ayurella Horn-Muller joined Grist in April as a food and agriculture staff writer. She is an award-winning journalist based in Florida who has covered climate, justice, and food for Axios and Climate Central, with freelance bylines at The Guardian, CNN, National Geographic, USA Today, Forbes, NPR, and PBS NewsHour. She is also the author of the book Devoured: The Extraordinary Story of Kudzu, the Vine That Ate the South, which was published in March.
Frida Garza will also join Grist as a food and ag staff writer. She was the editor of The Guardian US’s environmental justice series, America’s Dirty Divide, and also wrote stories on the culture of food in North America. Previously, she was a senior staff writer at Jezebel, covering culture and politics. A native of El Paso, Texas, she recently finished a master’s in labor studies and is based in Brooklyn, New York. Garza starts on May 15.
Grist already has a decadeslong track record covering the intersection of food, agriculture, and climate. Most recently, freelancer Julia O’Malley was named a finalist for the James Beard Media Awards for her Grist story (in collaboration with the Food & Environment Reporting Network) on Alaska’s missing snow crabs, and former Grist fellow Max Graham won this year’s Alaska Press Club Award for best long feature for his salmon coverage. But the acquisition of The Counter represents a significant expansion of the news site’s coverage of this critical beat.
“Food is a unique entry point through which to enter the climate conversation — and it’s one that everyone around the world can easily grasp,” said Nikhil Swaminathan, Grist’s CEO. “We were great admirers of The Counter’s work, and when the opportunity came to breathe new life into a brand we held in such high regard, we jumped. This new vertical will allow Grist to very clearly frame coverage of climate as what it literally is: a kitchen table issue.”
“Nothing could be more gratifying than to see a publication of the caliber of Grist both committing itself to sustain public access to the award-winning archive of The Counter, and dedicating its energies and significant resources to relaunching and invigorating its approach to food journalism,” said Jeffrey Kittay, The Counter’s founder and publisher.
“As temperatures rise, global warming’s impacts on food security, agricultural economies, labor justice, and food culture are coming into sharp relief,” said Grist executive editor Katherine Bagley. “And unlike many climate risks, these changes are deeply personal, from the rising cost of your weekly grocery bill to disruptions in cultural traditions around food.”
To mark the moment, Grist has curated a downloadable cookbook filled with climate-friendly recipes from our archives, as well as from partners Cool Beans and Pale Blue Tart. The cookbook has everything from entrees to sides and desserts.
In addition to the new vertical, Grist is honored to ensure the outstanding work done by The Counter staff remains accessible to the public. And it’s not the first time Grist has taken on that responsibility: In 2020, Grist acquired Pacific Standard and continues to house its archives.
I have been at Grist for more than seven years now, and it’s been a hell of a ride. The organization has evolved — or, put more fairly, transformed — shifting from almost entirely Seattle-based to very nearly fully distributed, with staffers in more than 20 states. We’ve morphed from our roots as an iconoclastic blog to an authoritative digital magazine. We’ve experimented with new forms of storytelling, including climate fiction — which involves both a medium and ideas that weren’t on our radar back when I walked in the door in 2017.
In that time, we’ve accumulated dozens of collaborators, hundreds of syndication partners, amazing new staffers, 350 new Grist 50 Fixers, a trophy case of awesome awards, and more examples of our work’s impact out in the world than I could dare to count. Along the way, we’ve tried to do our part to not only continue to build the legacy of Grist, but also preserve the legacies of some of our fellow travelers in the nonprofit media space.
Nearly four years ago, we acquired the archive of a publication we deeply respected, Pacific Standard, promising to ensure that the organization’s thought-provoking, visually arresting, award-winning work would continue to be publicly available for anyone to read. Earlier this year, we hired Lyndsey Gilpin, the dedicated founder of Southerly, so she could bring to Grist her work covering environmental justice issues using her celebrated community engagement practices.
And today, Grist is bringing back a well-respected title on the food and agriculture beat and officially building out that topic’s intersection with climate in a major way. I’m thrilled to share that Grist has acquired the archive and brand assets of The Counter, a decorated nonprofit food and agriculture publication that we long admired, but that sadly ceased publishing in May of 2022. As evidence of our esteem for The Counter, the last piece it published on its website was a collaboration with Grist — a look at the use of methane digesters as a climate solution in California, and how the way the state was counting emissions reductions likely oversold their impact.
When The Counter’s founder and publisher Jeffrey Kittay reached out looking for a home for the site’s archive, which is rich with award-winning work and covers many of the food and agriculture issues that Grist readers care about, we saw an opportunity to catapult into our next chapter of high-quality journalism on this topic.
We hope to build on — and supercharge — The Counter’s strong journalism and the important work we’ve recently done on this beat. A story we collaborated on with the Food & Environment Reporting Network about an Alaska Native village reckoning with the disappearance of snow crab was nominated for a James Beard Award. The monthly Sidney Awards honored our investigation into a celebrated vertical farming startup in Kentucky that turned out to be a cauldron of workplace abuse and unrealized production. A story from our Michigan reporter housed at Interlochen Public Radio on the debate over using land in the state for farming or solar panels echoed arguments across the country and was one of our most popular stories of the past year. And two years ago, our creative storytelling team put together a brilliant package on the future of food, which included a Climate Future Cookbook complete with a dossier of resilient foods, like pawpaw and lionfish, that we could see more of in a few years.
The time for this expansion is now: Agriculture is under threat all over the world because of rising temperatures and increasingly severe weather events. At the same time, the way we grow our food has to change. According to the World Bank, “the global agrifood system emits one-third of all emissions.” Add to that how people’s relationship to food is changing in a warming world, from subsistence communities losing access to the foods that have kept them alive for generations to families, food purveyors, and cities thinking more deeply about how to manage food waste — which a group of scientists recently estimated accounts for half of food system emissions.
The Counter had hit on a rich vein to report on, and we’re excited to not only ensure the work of the staffers and contractors of that publication is available for posterity, but to build on it. So we’re relaunching The Counter as a food and agriculture vertical within Grist, continuing its smart and provocative reporting on food systems, specifically where they intersect with climate and environmental issues. We’ve also hired two amazing new reporters to make our plan a reality.
Being back on the food and agriculture beat in a big way is critical to Grist’s mission to lead the conversation, highlight climate solutions, and uncover environmental injustices. What we eat and how it’s produced is one of the easiest entry points into the wider climate conversation. And from this point of view, climate change literally transforms into a kitchen table issue.
The first thing you notice walking up to a dai pai dong, one of Hong Kong’s signature open-air street food stalls, is the smoke. Aromatic plumes billow out from aluminum-covered vent hoods as chefs with decades of experience produce steaming plates of crackled shrimp, juicy mussels, and crisped-up rice by tossing the ingredients in a giant, flame-cradled wok.
As a foodie and avid stir-fry consumer, I love everything involved in wok cooking — the artistry, the bursts of orange under the deep, round-bottomed pan, the incomparable taste. But as a climate reporter, I see just one problem: It typically relies on gas stoves, which release planet-warming methane even when turned off.
Climate experts say that we need to phase out fossil fuel use to address the climate crisis, especially in buildings, which account for 35 percent of U.S. greenhouse gas emissions. Gas stoves also produce harmful air pollutants like carbon monoxide, nitrogen dioxide, and benzene, a known carcinogen.
So when I heard that an all-electric food hall on Microsoft’s campus in Redmond, Washington, featured a pair of custom-made induction woks, I was eager to try out a climate-friendly stir-fry. Unlike gas stoves, induction ranges use electromagnetic currents to heat food, eliminating both the carbon emissions and harmful air pollutants produced by gas. Yet minutes into my lunch with a friend who works at Microsoft, my excitement dissolved. My tofu noodles arrived limp and drowning in vegetable oil.
As I poked at my soggy introduction to induction wok fare, I couldn’t help but think back to a plate of noodles I had eaten at a dai pai dong in Hong Kong just a few weeks before. The two noodle dishes could not have been more different. One was prepared with state-of-the-art climate tech — yet produced lukewarm results. The other was freshly tossed in a kerosene-fueled wok, yielding glossy, chewy noodles bursting with soy sauce, blackened slivers of onion, and, most importantly, that elusive, umami-filled char called wok hei.
Wok hei, loosely translated from Cantonese as the “breath of the wok,” represents the pinnacle of the stir-fry cooking technique most commonly associated with southern China. (While many cuisines rely on the wok, not all strive for that signature aroma.) From street food stalls to high-end restaurants, diners from all over the world seek the intangible flavor that renowned chef and wok whisperer Grace Young described as “a special life force or essence from the wok.”
For all its coveted glory, wok hei — and the question of what exactly produces it — remains somewhat mysterious. The term itself is fairly abstract: while wok refers to the cooking vessel, hei can simultaneously mean “air,” “breath,” “energy,” and “spirit,” leaving room for a variety of interpretations. Many chefs say that fire, and therefore a gas stove, is essential for achieving the aroma, putting it at odds with climate-driven legal trends: Since 2019, more than a hundred local governments across the United States have introduced policies to ban the use of natural gas in buildings, including gas stoves. Others argue that with high enough temperatures and a few adjustments, chefs can switch to induction and still produce foods with wok hei.
In the face of this gastronomic debate, many chefs are asking what an all-electric future will mean for cherished culinary traditions like wok cooking.
When the city of Berkeley, California, enacted its local gas ban in 2019, the California Restaurant Association sued, arguing that gas is essential for certain specialty techniques, including “the use of intense heat from a flame under a wok.” It wasn’t the only attempt to derail gas bans. An investigation by the Sacramento Bee, for example, revealed that the gas utility SoCalGas actively recruited Chinese American restaurant owners to advocate against electrification policies in Southern California.
It would be naive to say gas utility companies were driven by a love of great stir-fry when they turned their lobbying efforts toward wok-based cooking. But the culinary debate around whether wok hei can be achieved over an induction stove has certainly added fuel to the electrification debate.
An employee of the commercial kitchen equipment company Bartscher shows an induction wok at a trade event in 2019. Ulrich Perrey / Picture alliance via Getty Images
For chefs, the most important consideration when it comes to switching off gas is whether induction can support their livelihoods. In cities like San Francisco and Los Angeles, some restaurant owners serving Chinese, Thai, and other Asian cuisines using woks have expressed concerns that local gas bans could jeopardize signature tastes and textures.
Whether individual chefs think that induction can achieve wok hei depends largely on how they define it. Wok cooking expert and food writer J. Kenji López-Alt, for example, defines wok hei as a quintessential smoky flavor. He told Grist that it’s impossible to achieve wok hei without gas or fire — and the reason comes down to the food science.
A number of different elements go into that signature smoky aroma, according to López-Alt. One is the flavor imparted from hot, well-seasoned carbon steel or cast iron, two of the most common materials used to make woks. Another component is the caramelization that happens when sauce hits a searing hot pan. If you “watch a Chinese chef cooking, when they add soy sauce to a stir-fry, they swirl it around the outside of the pan where it immediately sizzles and gets intense heat, and that changes the flavor and gives it a bit of smokiness,” he said.
But the main flavor component of wok hei, López-Alt says, comes from the igniting of aerosolized oil with fire. As chefs toss food up into the flames of a gas stove, tiny droplets of fat suspended in the air catch on fire, dripping back down into the wok to impart a subtle smokiness. “You can’t get that without an actual fire,” he said.
Martin Yan, restaurateur and longtime host of the PBS cooking show Yan Can Cook, has a different take on wok hei, which he defines as an ephemeral, fragrant aroma that lasts a mere 15 to 20 seconds after a dish is prepared. He told Grist that achieving that aroma depends not on fire, but on applying intense, high heat. When fresh ingredients hit the wok’s surface, they undergo a Maillard reaction, in which proteins and sugars break down and develop new, complex flavors. “The wok hei is not created by the gas,” he said. “It’s created by the frying pan and that chemical reaction.”
In theory, Yan said, the heat could come from any source: electricity, gas, even wood or charcoal. “You could use nuclear fusion, as long as you can create that intense heat.”
Celebrity cook Martin Yan demonstrates his wok cooking skills over a gas-powered stove at an event at the Conrad Hotel in 2006. K. Y. Cheng / South China Morning Post via Getty Images
Induction stoves, which can instantly heat to temperatures of up to 643 degrees Fahrenheit, are capable of the intensity Yan describes as necessary for wok hei. Yet some chefs like López-Alt say that the shape of the wok presents another obstacle to using induction. Woks feature a deep, high-walled bowl, which allows flames to curl around the vessel and create varied temperature zones — ideal for moving sauces and ingredients around to optimize flavors and control heat. But induction stoves are typically flat and only activate when directly in contact with the pan’s surface. Lifting the wok to toss ingredients, therefore, would result in an instant loss of heating.
Jon Kung, a Detroit-based chef and TikTok personality who advocates for induction cooking, says that induction stoves designed specifically for woks can help with this issue. Like Yan, he defines wok hei as a “mix of char and caramelization” as a result of the Maillard reaction, requiring high heat rather than flames.
Kung owns two portable induction wok burners that feature a curved heating bowl in which the wok sits, allowing for better temperature control up the sides of the pan. While this setup may not totally replicate the temperature gradient present in a traditional fire-heated wok, Kung said the conditions are sufficient for producing high-quality stir-fry, a task he points out is difficult even for those with gas stoves at home.
“It’s incorrect to assume that the only things you need to achieve wok hei are a wok and a gas burner,” he said in a 2023 video. “The ones in Chinese restaurants have a power output of 150,000 BTUs. That’s way more than the 30,000 that comes out of your Viking range. The fact of the matter is, these induction wok burners do a better job at mimicking the focus of energy into the bottom of a wok that you get from a genuine Chinese wok burner.”
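For readers more used to thinking in kilowatts, the burner ratings Kung cites convert roughly as follows. This is a sketch: the BTU-to-kilowatt conversion is standard, but the efficiency figures in the comments are commonly cited rough estimates rather than numbers from this article.

```python
# Converting the quoted burner ratings from BTU per hour to kilowatts
# (the efficiency figures below are commonly cited rough estimates,
# not numbers from this article).
BTU_PER_HOUR_PER_KW = 3412

restaurant_wok_btu = 150_000   # commercial Chinese wok burner, per Kung
home_range_btu = 30_000        # high-end home gas range, per Kung

print(f"Restaurant wok burner: ~{restaurant_wok_btu / BTU_PER_HOUR_PER_KW:.0f} kW of gas input")
print(f"Home gas range burner: ~{home_range_btu / BTU_PER_HOUR_PER_KW:.1f} kW of gas input")

# Open flames lose much of that heat around the pan, while induction delivers
# most of its rated power into the metal itself (assumed efficiencies):
gas_efficiency = 0.35        # rough estimate for an open-flame burner
induction_efficiency = 0.85  # rough estimate for an induction element

effective_home_gas_kw = home_range_btu / BTU_PER_HOUR_PER_KW * gas_efficiency
print(f"Heat actually reaching the pan from a 30,000 BTU burner: ~{effective_home_gas_kw:.1f} kW")
```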
While Kung’s induction models plug into a typical outlet and are designed for home use, similarly shaped and far more powerful commercial induction wok ranges exist on the market — including at Microsoft’s all-electric food court. But the stove itself wasn’t the reason for the company’s substandard stir-fry. The noodles I ate there appeared to have been batch-cooked, an efficient way to feed hungry tech workers but a less-than-optimal method for achieving wok hei, which depends on the freshness of the ingredients. And since I wasn’t present when the noodles were cooked, I also can’t evaluate the temperature that was used.
As of now, I can safely say that my induction-versus-flame-fueled wok hei taste test remains inconclusive. And sadly, I don’t have many nearby options to gather more data. Although Yan reported that some hotels in China like the Hilton and Marriott already exclusively use induction woks, commercial induction kitchens are rare in the United States.
According to a 2022 survey by the National Restaurant Association, 76 percent of restaurants in the U.S. still use gas. That proportion goes up to 87 percent for full-service restaurants, or sit-down eateries that provide table service. Meanwhile, less than five percent of U.S. households currently use an induction stove — though wok expert Grace Young has said she’s often asked which wok to buy for induction and glass-topped ranges.
Chef and wok expert Grace Young has a razor clam dish at a restaurant in New York’s Chinatown in December 2021. Jeenah Moon / The Washington Post via Getty Images
A big reason for the lack of commercial induction uptake is the cost. Yan noted that induction wok burners for restaurants remain prohibitively expensive in the U.S., especially since the technology is still maturing. Upgrading a gas kitchen to accommodate all-electric appliances can require tens of thousands of dollars, an exorbitant price for businesses operating on thin profit margins. Commercial induction ranges also typically cost three to four times as much as gas-powered ones.
Kung told Grist that he is not aware of any restaurants in the U.S. achieving wok hei with induction — although he believes that with a few tweaks in technique, it’s “absolutely” possible. The problem, beyond the cost of induction ranges, is that chefs might also simply prefer the tactile experience of cooking with fire, or generally feel resistance to adopting new techniques. But Kung maintains that if governments want to take the climate crisis seriously, they need to pass policies to incentivize and help businesses switch to electric.
“Chefs are problem-solvers by nature,” Kung said, and will likely innovate and relearn how to achieve wok hei on induction at a commercial level.
Although López-Alt says achieving wok hei is not possible without a flame, he isn’t against induction stoves in general. He initially felt wary of switching when he first came across the debate over gas stoves a few years ago. Yet he eventually concluded that, for most Western cooking and home cooking, the technology can be just as good as gas if not better — not just for climate and health reasons, but also in terms of efficiency of cooking.
“It’s a topic that gets a lot of knee-jerk, immediate reactions,” he said. But “for most things it actually makes sense to get rid of gas.”
In early April of last year, a white capsule the size of a small school bus detached from the International Space Station and splashed down off the coast of Tampa, Florida. On board were 4,300 pounds of supplies and scientific experiments, including samples of dwarf tomatoes grown in space; crystals that could be used to make semiconductors; and medical data on the astronauts working in the space station. Tucked away among these contents was a much smaller and lighter cargo: more than a million tiny orange seeds.
Half a world away in Seibersdorf, Austria, a town about 22 miles outside the capital of Vienna, Pooja Mathur waited eagerly for the seeds — from a plant called arabidopsis, a member of the mustard family — to arrive. Mathur, a plant geneticist, leads the Plant Breeding & Genetics Laboratory for the Joint FAO/IAEA Centre of Nuclear Techniques in Food and Agriculture, a collaboration between two United Nations agencies: the Food and Agriculture Organization and the International Atomic Energy Agency.
For over 60 years, the laboratory has studied whether nuclear technologies can be used to breed new and more resilient varieties of crops, and the seeds from the space capsule were its newest venture. They had spent nearly five months in low Earth orbit, exposed to cosmic radiation, extreme temperatures, and low gravity, which altered their DNA in unpredictable but potentially beneficial ways. Scientists like Mathur hope that a few of these seeds might sprout into plants that can survive changing conditions here on Earth, such as varieties more resistant to drought or heat.
An expert at the Plant Breeding and Genetics Laboratory holds the sorghum seeds that spent five months at the International Space Station.
Katy Laffan / IAEA
“It was a great opportunity to receive them,” Mathur told Grist over a video call from her office in Austria. “But there was also a nervousness — there are always these questions when you embark on something unknown.”
The “cosmic crops” project is the United Nations’ first foray into space breeding, part of a global effort to address rising risks of food insecurity stemming from shifting land use patterns, population growth, and climate change-driven extreme weather. Heat waves, droughts, floods, erratic rainfall, and worsening pest and disease outbreaks all threaten agricultural production around the world, and the effects are already being felt in many countries. Massive flooding destroyed at least 4 million acres of farmland in Pakistan in 2022, triggering a food crisis for more than 8 million people; in East Africa, extreme drought has pushed millions of people to the brink of famine in the past three years. In the United States, natural disasters, many made worse by climate change, caused $21.5 billion in agricultural losses in 2022 alone.
While breeding seeds in space was first attempted in the 1960s, the scientific endeavor is currently experiencing a golden age as space travel and research become more accessible for nations outside the U.S., Russia, and Europe. Chinese researchers have been at the forefront of this experimentation, developing more than 200 varieties of space-mutated plants since 1987. Other countries that have developed space programs in recent years, like India and the United Arab Emirates, are also among the most vulnerable to climate change, and have expressed interest in the technology.
But the joint FAO/IAEA center’s project, known officially as Seeds in Space, is the first such effort on an international level, which will help make the results of these experiments available even to nations that can’t afford to build rockets or extensive plant genetics laboratories. And it will help answer essential questions about what makes space mutations different from those done here on Earth, and where scientists should direct their efforts in order to adapt to climate change.
“[If] we can understand how plants mitigate stress [in a space environment], we can use that knowledge in our approach to global warming on Earth,” said Tapan Mohanta, a former agricultural researcher at the University of Nizwa in Oman who has studied the potential of space breeding for developing new crop varieties and was not involved in the FAO/IAEA mission.
The joint FAO/IAEA center was founded in 1964 amidst a post-war push to use atomic energy for peaceful means. Researchers at the time found that exposing plant material to radiation encourages mutations at a much faster rate than conventional breeding, a painstaking procedure that requires multiple generations to show changes in the plants’ phenotype, or outward characteristics. Mutations occur naturally as cells multiply by making copies of their genetic code; what starts as a random error in one strand of DNA can be replicated over and over again until the organism either repairs the damage or allows it to spread to all of its cells.
Scientist Shoba Sivasankar, right, receives a package of seeds that journeyed from the International Space Station to the FAO/IAEA Plant Breeding and Genetics Laboratory in Seibersdorf, Austria, in 2023.
Katy Laffan / IAEA
Hitting seeds with gamma rays, the most powerful form of radiation, speeds up this process, known as “mutagenesis,” by as much as 1 million times. Irradiated seeds that survive the high doses of radiation can grow into plants that show much clearer phenotype variations than their conventionally bred counterparts; scientists can then test these new specimens to see whether they can withstand difficult conditions or produce a higher crop yield than currently existing varieties. This process does not make the seeds themselves radioactive, and the resulting crops are safe to eat, Mathur said.
By selecting and then further breeding the most promising candidates, researchers have produced over 3,400 new varieties of more than 210 plant species, according to the IAEA’s Mutant Variety Database. Farmers in more than 70 countries are already growing the resulting plants; the seeds are often crossbred with widely used “elite” varieties to better suit local conditions. Other mutations can be induced using chemicals, bypassing nuclear technology altogether.
Cosmic rays, which are emitted by distant space objects like the sun, other stars, and even black holes, offer a different way to trigger mutagenesis, Mathur said. One of the goals of the “cosmic crops” project is to determine whether radiation from space, which is lower intensity but applied over a longer period of time than in the lab, can create different results than experiments with gamma rays on Earth. Previous experiments by Chinese researchers have found that space radiation induces “useful” mutations more often than gamma radiation applied in a lab, according to the BBC.
“Mutagenesis is a very slow process on a day-to-day basis,” Mathur said. Space breeding “can accelerate the process to harness the power of natural changes at a much faster scale, considering that there is a dire need to have solutions in food and agriculture.”
Two types of seeds were picked for the experiment: arabidopsis, a weed that, while usually not edible, is a “model species” with a well-studied genome that researchers can quickly examine for the most obvious genetic changes and useful traits, and sorghum, a dryland crop that’s consumed by 500 million people around the world and is therefore useful from a food security standpoint, Mathur said. Half were kept outside the International Space Station, where they were exposed to the full range of cosmic radiation along with the extreme cold and zero-gravity environment of outer space; the other half stayed inside the station, under microgravity conditions but shielded from most radiation, to provide a point of comparison.
Because the mutations that occurred in space were random, scientists are taking two approaches to figure out what they look like. Since receiving the seeds in June of last year, Mathur’s lab has planted them and will now begin using DNA sequencing technology to study the arabidopsis seedlings and determine what changes took place at the genetic level; results are expected by summer or early fall. After that, researchers will screen the seedlings that display promising genetic changes to determine whether they can actually better withstand harsh conditions like drought, salinity, and pest infestations. They’ll follow up by testing the sorghum, which takes longer to sprout and grow to maturity.
Crops take root in a beaker at the IAEA Plant Breeding Unit in Seibersdorf, Austria.
Adriana Vargas Terrones / IAEA
Mathur’s lab is sharing its results with countries that want to learn which techniques — encompassing everything from the length of time the seeds are in space to the way they’re grown once they return — produce the most resilient crop varieties. One such “coordinated research project,” which would compare mutations induced by cosmic rays with those applied in the lab, has attracted researchers from Australia, Burkina Faso, China, France, Ghana, India, Kenya, Niger, South Korea, the United Kingdom, and the U.S.
“The molecular variations in plants induced by space mutagenesis are largely unknown,” said Hongchun Xiong, an associate professor at the Chinese Academy of Agricultural Sciences who is working on the coordinated research project. Although Xiong’s previous research using space-exposed seeds has identified mutant varieties of wheat that are more tolerant to saline soil, which can prove useful as saltwater encroaches on agricultural fields thanks to rising sea levels, she hopes to identify others that are resistant to dry conditions or use nitrogen more efficiently.
“We believe this is important for [the] development of new wheat varieties for food security and climate change adaptation,” Xiong said.
Previous experiments with space breeding have already yielded results. China registered a new variety of wheat called Yannong 5158, which was developed using space mutagenesis, in 2007. Smaller than conventional wheat, with dark green leaves, this version proved more resistant to bacterial diseases and stem rust, a type of fungal infection, while also producing a higher yield. This variety has since been planted in several villages in the Fuyang prefecture in eastern China. The country also harvested its first batch of rice that had traveled to deep space — nicknamed “rice from heaven” by state media — in 2021, though it has not yet announced whether the resulting plants were more resilient in any way than their Earth-bred counterparts.
Experiments like these carry risks, Mohanta pointed out. Mutant DNA could potentially escape and contaminate wild species or other crops through cross-pollination, which could pose a threat to biodiversity or human health if the mutations are harmful in any way — a small possibility, but one that plant breeders developing genetically modified organisms, or GMOs, also face. One genetically modified variety of corn, for example, was suspected of unintentionally introducing allergens into the U.S. food supply in the early 2000s and later had to be recalled, although officials could not prove that the GMO corn actually caused allergic reactions. And although contamination incidents are common, with nearly 400 recorded by Greenpeace between 1997 and 2014, researchers have found no definitive links between GMO foods and negative health effects.
While space-bred varieties are not GMOs, because the mutations that occur are random and not controlled by humans, the joint FAO/IAEA center still follows protocols to keep cross-contamination from occurring. But it can’t control what member states do once they have access to the technology and mutated seeds.
“Although developing plant varieties that thrive in microgravity and resist cosmic radiation may be an important goal for the scientific community, an undesirable mutation in the genome could have deleterious effects on other crop varieties,” Mohanta wrote in a 2021 paper in the journal Frontiers in Plant Science. “Therefore, the conduct of such research should be subject to strict international regulations to avoid the possibility of unexpected results.”
Mathur emphasized, though, that despite the unknowns, space breeding has enormous potential, which scientists are only just beginning to unpack. She pointed to previous studies that found peppers exposed to cosmic radiation had a higher nutritional content, a promising feature given widespread deficiencies of iron, zinc, vitamin A, and other nutrients around the world. And although space experiments are still a very small component of plant breeding, the results of the “cosmic crops” project will help researchers decide whether to invest more into this technology in the future.
Mutation breeding “has been the cornerstone of agriculture for a long, long time,” Mathur said. “Agriculture is all about harnessing mutations … and mutation is very much a part of our evolutionary process.”