Systems
10.04.2016

Counting Fish

An impossible challenge, but one that's critical to the fish, the ecosystems in which they live, and the humans who depend on them

A few miles off the coast of Massachusetts, aboard the fishing boat the Miss Emily, chains groaned as they lifted the sodden net out of the water. The multi-hued strands opened, spilling their meager contents onto the deck. “This is definitely a small catch,” said William Hoffman, senior marine biologist with the Massachusetts Department of Fisheries. The scientists and fishermen aboard the boat splashed through the flopping fish, shoveling them onto a conveyor belt and then quickly sorting the catch by species: flounder, hake, sea herring, haddock, lobster.

After sorting the fish, the team tossed them back onto the conveyor belt by species. Hoffman caught each fish as it came off the belt and slid it down the table to his colleague Nick Buchan. Hands protected by thick blue gloves, Buchan grabbed hold of a slippery flounder. He lined its nose up at the end of the electronic measuring board and stamped a small magnet onto the board just where the fish’s tail fin forked. The computer wired to the board blared as it recorded where the magnet landed, locking in the length of the flounder. Buchan seized the fish around its mid-section and tossed it into a nearby orange bucket to be weighed. The whole process took only a few seconds, and Hoffman and Buchan were on to the next fish. 32cm, BEEP. 28cm, BEEP.

Authors Cynthia Graber and Nicola Twilley host the award-winning podcast Gastropod. To hear more about this story, including how scientists are digging into ship archives and restaurant menus to learn about fish populations in the past, listen here.

The team worked quickly and efficiently, identifying, sexing, sizing, and weighing hundreds upon hundreds of fish. They would repeat this day’s activity multiple times over eight months, in a carefully plotted program to survey the abundance and variety of fish in Massachusetts state waters.

Scientists—and fishermen—want to know how many fish are left in the oceans, because the numbers amassed from these types of trawls provide the basis for local and national decisions about how many fish the fishermen are allowed to take out of the water—how much we can eat and how much we have to leave in. Can we strike a balance that maintains viable, sustainable marine populations? The answer comes down to a challenge that seemed straightforward aboard the Miss Emily (so simple, indeed, that by the end of the day we’d even slipped on gloves to help) and yet is actually incredibly difficult to resolve: counting fish.

Figuring out how many fish there are in the sea is an impossible, essential challenge. Marine life provides a crucial source of protein for billions of people around the world, as well as necessary income for people in coastal communities. But the trends are alarming. The World Wildlife Fund estimated in a 2015 report that some commercially important stocks have been reduced by almost 75 percent since the 1970s. If fishing pressure continues unchecked, several species could be wiped out entirely, causing unknown damage to marine ecosystems. At the same time, if governments place restrictions on the industry that are overly conservative, many fishermen could unnecessarily lose their jobs, and untold numbers of people could be deprived of an important food source.

The stakes are huge, the ocean is vast, and the fish won’t stay still, which makes counting them important and exceedingly difficult. For decades, scientists have relied upon counting techniques similar to the one used on the Miss Emily. But now researchers are looking to the latest technologies—artificial intelligence, autonomous submarines, and drones—to develop new methods for quantifying fish populations. In the process, they’re learning much more about the fish themselves. Are better numbers, combined with a richer understanding of marine ecosystems, enough to save both the fish and the humans who depend on them?

In many ways, New England is a perfect microcosm of this struggle. The region’s towns and cities flourished in large part due to the wealth flowing in from the docks. These waters have been fished for profit since Europeans first reached the shores hundreds of years ago, and Native Americans fished here thousands of years before that. Today, cod, haddock, lobster, and scallops remain a mainstay of the New England diet, but the jobs for fishermen who sail out of coastal cities such as Gloucester, New Bedford, and Scituate are on the decline. “There’s nobody left,” says Kevin Norton, captain of the Miss Emily. “Everyone is out of business.” In just a decade, from 2001 to 2011, the number of groundfish boats in New England plummeted from more than 1,000 to 344. (Groundfish include all the bottom-dwelling fish for which the region is famous: cod, haddock, yellowtail, and flounder.)

The loss of jobs is due to the loss of New England fish. When Europeans first arrived, the supply of cod seemed endless. Throughout the 20th century, as new engines roared to life, larger boats and trawls traveled farther and faster, scooping fish out of the sea with alarming efficiency. The region’s abundant fish stocks proved finite after all, crashing under the pressure. While fishermen and scientists agree on little, both groups admit that the ocean is a much emptier place today than it was a century ago.

By the 1970s, the federal government had stepped in, enacting a series of closures of fishing grounds and restrictions on the overall tonnage fishermen could haul to shore. In effect, the policies reduced the fishing fleet as well, as many fishermen could no longer catch enough to make a living.

This led to a tug-of-war, if not an all-out war, between scientists and fishermen. Public meetings about fisheries science and policy frequently turn into shouting matches. Fishermen insist that scientists have no idea how many fish are in the ocean and that they’re counting in the wrong places, while scientists defend their methods and the resulting figures. Who’s right? And why is it so difficult to determine?

Although they stand by their data, researchers admit that it’s nearly impossible to arrive at a definitive number for any particular species; they can’t say, for instance, that there are exactly 2.3 million cod in the Atlantic. Over the course of a dozen interviews, scientists repeatedly paraphrased an idea that originated in the 1970s with John Shepherd, a fisheries management specialist at England’s University of Southampton: Counting fish is like counting trees, but the trees are invisible and constantly on the move.

New England is not only a hotbed of fish fights; it is also an internationally renowned center of fisheries science. And so, in an attempt to do a better job of counting those invisible, moving creatures, New England researchers are developing and testing a diverse array of new techniques and technologies. They want not only to achieve a more accurate tally, but also to generate more trust in the count—to create the best possible policy, and, of course, to preserve commercial fish species far into the future.

For more than five decades, research ships have sampled the seas. In the 1960s, the organization that would become the National Oceanic and Atmospheric Administration, or NOAA, began counting fish. They set up a system that was, at the time, achievable and cost effective: For any given region, NOAA scientists charted a specific path through the ocean to trawl and sample the fish in exactly the same spots every year, using the same type of nets. This way, any changes they observed in the size of the catches would most likely be due to declining or increasing fish populations overall, rather than because the trawl had stumbled upon the spawning grounds of a particularly populous school of cod, or because a new type of net inadvertently picked up more hake than the previous net did.

There have been some advances in NOAA’s counting technology over the decades, most notably in the transition away from a waterproof pencil-and-paper system. Until 2001, ankle-deep in sloshing water, fish, and fish guts, with the boat rocking in the wind and waves, biologists had to record the species, sex, length, and weight of each fish by hand. Once ashore, they submitted the papers to be entered into computers, a process that took up to three months. “There were a lot of avenues for errors to creep in,” said Nancy McHugh, a fishery biologist at NOAA.

McHugh, one of the developers of Fiscus, the onboard computerized system used today, recalled that in 1999, she and some colleagues sat around a restaurant table in Mobile, Alabama, on a work trip to study electronic data capture systems. “While eating fried pickles, we took all the sugar packets and hot sauce and ketchup and mustard, and we created a set-up of what happens to a fish as it goes from the net to a basket, to being weighed, to being measured.” Ketchup and mustard bottles moved like chess pieces across the table as the scientists played out their data needs and processes. Two years later, she and her colleagues launched the first iteration of Fiscus. Today, as we saw aboard the Miss Emily, Fiscus can automatically capture weight and length, and scientists tap buttons on a touchscreen to fill in all the other necessary data about each animal. But aside from these computerized systems for logging catches, and the advanced sensors attached to trawl nets to make sure the net is functioning properly, the NOAA count is essentially unchanged from the original 1960s trawls.

The data that emerges from those trawls is, of course, a sample rather than a census. “The ocean is a big place,” says Chris Legault, who studies population dynamics for NOAA. “We can’t go out and count each individual fish. And so we have to get at it indirectly in order to figure out how many fish are out there.” Over the decades since the trawls began, NOAA researchers have developed models that combine trawl results with commercial catch statistics, as well as what they know about that species’ biology, to arrive at an estimate of how many fish can be pulled out, while still maintaining healthy stocks. (In other parts of the world, unlike in the U.S. where a scientific agency is tasked with counting fish, governments rely solely on the data from fishermen themselves. It’s called “catch per unit effort”—basically how many fish are caught given a certain amount of time and effort spent fishing. That number is included in decisions in the U.S. as well, but it’s primarily a supplement to the numbers that come from the scientists.)
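For readers who want to see the arithmetic, here is a minimal sketch of a catch-per-unit-effort calculation in Python. The logbook figures and the kilograms-per-hour units are invented for illustration, not data from any survey mentioned in this story.

```python
# Minimal sketch of a catch-per-unit-effort (CPUE) index, using made-up numbers.
# CPUE is simply the catch divided by the fishing effort expended to land it;
# tracked over time, it serves as a rough index of relative abundance.

def cpue(catch_kg: float, effort_hours: float) -> float:
    """Catch per unit effort: kilograms landed per hour of trawling."""
    return catch_kg / effort_hours

# Hypothetical logbook entries: (year, total catch in kg, total hours trawled)
logbook = [
    (2012, 180_000, 4_100),
    (2013, 150_000, 4_300),
    (2014, 130_000, 4_600),
]

for year, catch, hours in logbook:
    print(f"{year}: CPUE = {cpue(catch, hours):.1f} kg/hour")

# Falling CPUE despite rising effort is one warning sign of a shrinking stock;
# but, as noted above, fishermen concentrate where fish remain, so CPUE can
# stay deceptively high even as a population contracts.
```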

But there are problems with NOAA’s trawl system. Trawls are most efficient when the seabed is flat. Yet many commercially important species live in untrawlable rocky, craggy habitats. In addition, highly migratory species, such as tuna, range too widely and spend too much of their time high in the water column to be sampled using the trawl method—and so the numbers used to inform regulations of those species depend entirely on fishermen’s catch. But scientists say that relying on catch numbers from fishermen presents an inherent sampling bias: Fishermen tend to congregate where the fish are, and avoid areas devoid of fish, so commercial catches could remain high even as fish disappear from their former habitats.

There’s yet another issue. While the sample captured by research trawls is a small fraction of the overall commercial catch, some scientists say it’s wasteful, as most of the fish, after being hauled up to the surface, die even when thrown back overboard. This becomes a more significant concern in areas where fish populations are particularly low.

The biggest problem, however, is that scientists need to be able to accurately track trends over time, and so the methodology needs to be almost exactly the same, year after year. In effect, NOAA locked themselves into the 1960s counting system. “It’s old technology,” says Elizabeth Clarke, senior scientist at NOAA Fisheries in Seattle. Today, however, NOAA’s fish counting technology is finally on the verge of entering the 21st century—albeit with the 1960s trawl still in tow.

NOAA and Massachusetts Department of Fisheries scientists conduct extensive bottom trawl surveys off the East Coast of the U.S. each year. This map depicts the number of hauls (463) conducted by a single NOAA research vessel between September 9 and November 14, 2014.

On a clear, cold day off the coast near Woods Hole on Cape Cod, a small crane lifted the six-foot yellow torpedo-shaped machine off the boat and into the water. The researchers on the boat who were testing the robotic autonomous underwater vehicle, or AUV, sent computer signals out to the craft as it bobbed along at the surface, and received beeps in return. Once satisfied that it was ready, the researchers transmitted the command that told it to go, and the AUV disappeared under the waves. Each time they send the AUV underwater, said Hanumant Singh, an ocean engineer who splits his time between Northeastern University and the Woods Hole Oceanographic Institution (WHOI), “we’re holding our breath—we don’t breathe for six hours.”

While the scientists waited above, the AUV motored along at the pace of a human stroll some five to ten feet above the ocean floor. Following a pre-programmed course through the water, the craft used sonar to avoid boulders and cliffs in its path. As it glided, it monitored the water’s temperature, oxygen content, and levels of dissolved organic matter, and snapped dozens of photos per minute of the ocean bottom beneath it.

This AUV, designed by Singh, was originally intended to survey coral reefs. But when Singh presented this new technology at a conference, Clarke approached him with the idea that it could help her both count and study rocky-bottom-dwelling fish. So Singh built a clone for NOAA, and began working with Clarke’s team of fisheries scientists to optimize it for counting fish. Among the tweaks, image-processing specialists on Singh’s team had to devise methods to enable colors on the seafloor to pop in stark relief, despite the faint light far below the surface. Since then, the AUV has gone on repeated trips to survey fish stocks, capturing tens of thousands of images. This month, it will sail with Mary Yoklavich, marine biologist with NOAA’s Southwest Fisheries Science Center out of San Diego, to conduct a trial survey of rockfish, whose habitat is too rugged to sample with traditional NOAA trawls.

Singh is not the only WHOI scientist focused on advancing underwater imaging technologies. A new camera, called the Habitat Mapping Camera System, or HabCam, promises a better eye on the seabed. Towed behind a ship and cruising just five or six feet above the ocean floor, the Mini Cooper-sized camera snaps six overlapping images each second, creating a detailed, non-invasive, real-time view of the seabed passing beneath it.

Scott Gallagher, WHOI scientist and one of the image-processing and design experts on the HabCam system, says the team has developed what they call a stereo camera, one that generates a three-dimensional image of the sea floor and anything lying atop it. The camera will help identify species on the seabed, including shellfish such as scallops, as well as species like flounder partially buried in the sand. The team is currently working to integrate this stereo camera into the AUV system.
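The principle behind a stereo camera fits in a few lines. The sketch below uses the standard pinhole-camera relation (depth = focal length * baseline / disparity) with invented numbers; it illustrates the idea, not HabCam’s actual processing pipeline.

```python
# Illustrative sketch of how a stereo camera recovers depth. The focal length,
# baseline, and disparity values below are invented, not HabCam's real specs.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance (meters) to a point seen by both cameras.

    focal_px     -- camera focal length, expressed in pixels
    baseline_m   -- separation between the two camera centers, in meters
    disparity_px -- horizontal shift of the point between left and right images
    """
    return focal_px * baseline_m / disparity_px

# A feature on the seabed shifts 60 px between the left and right frames:
z = depth_from_disparity(focal_px=1400.0, baseline_m=0.25, disparity_px=60.0)
print(f"Estimated range to seabed feature: {z:.2f} m")  # about 5.8 m

# Repeating this for every matched pixel yields a depth map: a three-dimensional
# rendering of the seabed and whatever is sitting on, or half-buried in, it.
```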

“The first reaction is, ‘God, wow, fantastic, all these images.’ And the next reaction was, ‘Oh, what are we going to do with all this data.’”

—Hanumant Singh, Northeastern University / Woods Hole Oceanographic Institution

These new technologies solve one problem—counting fish in untrawlable habitats—but they also present their own particular challenge: too much information. “The first reaction is, ‘God, wow, fantastic, all these images,’” said Singh. “And the next reaction was, ‘Oh, what are we going to do with all this data.’” All of the teams are developing algorithms and machine-intelligence programs to automate the process of extracting useful information from the noise, but that also introduces new sources of error.

Still, counting fish using Singh’s AUV and Gallagher’s HabCam may improve the quality of existing data not only in terms of numerical precision but also, more importantly, in the richness of the accompanying visual imagery. The ability to map and photograph these previously unexplored rocky seabeds is allowing scientists to uncover new information about how fish live in and interact with their habitat. In some cases, this new information is already improving the official process for assessing fish stocks. Data from Gallagher’s HabCam, for example, has not only greatly improved the accuracy of NOAA’s scallop dredge counts, but also provided insights into juvenile scallop distribution that have helped the Fishery Management Council create temporarily restricted areas. Thanks, in part, to this technology, the New England scallop fishery is now considered a success story, having fully recovered from its depleted state in the 1990s.

As promising as these underwater imaging technologies are for some species, they cannot help researchers count fish that live higher in the water column, rather than along the bottom. To improve the count of these pelagic, or midwater, fish, such as herring, mackerel, and sardines, scientists are learning to listen, rather than look.

In 2010, borrowing from oil and gas industry research and engineering, Tim Stanton, scientist emeritus at WHOI, and his colleagues introduced a new broadband system capable of measuring sound scattering across a continuous range of frequencies, allowing scientists to pick out the acoustic signatures of various species and sizes of fish from within the noise. Stanton’s acoustic sampling technique is already being incorporated into the official stock assessment process: WHOI’s new research boat, the R/V Neil Armstrong, which was launched in September 2015, was one of the first vessels in the U.S. equipped with the technology.
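As a rough illustration of how an echosounder’s measurements become a fish count, the sketch below applies conventional narrowband echo integration: divide the measured backscatter by the expected echo from a single fish of the assumed size. The target-strength formula is a commonly used species-specific approximation, and all of the numbers are invented; Stanton’s broadband technique, which also separates species by their frequency response, is considerably more sophisticated than this.

```python
# Illustrative sketch of echo integration: converting acoustic backscatter into
# a fish density. The target-strength intercept (b20) is species-specific and
# the values used here are illustrative, not taken from any real survey.
import math

def target_strength_db(length_cm: float, b20: float = -71.9) -> float:
    """Commonly used approximation TS = 20*log10(L) + b20 for a fish of length L."""
    return 20.0 * math.log10(length_cm) + b20

def fish_per_m2(area_backscatter: float, length_cm: float) -> float:
    """Divide the measured area backscattering coefficient (m^2 per m^2 of sea
    surface) by the backscattering cross-section of one fish of the assumed size."""
    sigma_bs = 10.0 ** (target_strength_db(length_cm) / 10.0)  # one fish's echo, m^2
    return area_backscatter / sigma_bs

# Suppose the sounder measures an area backscatter of 1e-4 over a patch of sea,
# and the echoes are consistent with 25-cm herring:
density = fish_per_m2(1e-4, 25.0)
print(f"Roughly {density:.1f} fish per square meter of sea surface")
```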

By providing a real-time, three-dimensional survey, this acoustic technology offers a way to count fish without taking them out of the water. It also provides an entirely new view of fish behavior. For example, according to Stanton, it was once thought that fish assemble in layers of small prey fish and larger predator fish. However, broadband acoustics has shown that various fish species frequently intermingle throughout the water column. “The conventional wisdom was just based on using the wrong tools,” said Stanton.

These technologies have their limits, too, of course. Acoustic sampling doesn’t work on bottom-dwelling fish, such as cod and haddock, due to the distorting effect of echoes from the sea floor. And for highly migratory species such as tuna, which are both incredibly valuable and the subject of fierce debate over their numbers, a combination of technologies might be required.

Just a few minutes’ walk from Stanton’s office, out on a pier behind NOAA’s Northeast Fisheries Science Center (NEFSC), Mike Jech, who runs the Center’s Advanced Sampling Technologies Research team, prepared to launch Ginger, one of NOAA’s two new drones. (The drone’s twin is named Wasabi.) Ginger is a small but powerful hexacopter, 32 inches across, with custom feet made of pink and green pool noodles. With one hand on each foot, Jech held Ginger above his helmet. His colleague, Jennifer Johnson, stood a few paces away with the remote control. With Ginger’s rotors whirring, Jech let go, and within seconds the drone was a hundred feet in the air. Then, with Johnson piloting, Jech ducked under a black hood like an old-time photographer to watch the live feed from the digital camera mounted beneath Ginger’s body.

The team started flying the drones last year, working in partnership with commercial spotter pilot Mark Brochu and Molly Lutcavage, a scientist at the University of Massachusetts Boston. The idea was to develop a new way to count and measure tuna when they’re at or near the surface. Originally, Lutcavage worked solely with Brochu, using cameras mounted to his airplane, but now Brochu, whose plane has a far longer range, simply finds the tuna schools and directs Jech’s boat to them. The drone can stay in the air for only about fifteen minutes, but, because of its sophisticated GPS and orientation sensors, it’s easier to calculate the exact angle at which it is photographing the fish, and thus achieve a more precise measurement of their sizes.
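The geometry behind that measurement is simple enough to sketch. The snippet below converts a fish’s length in pixels into meters using the drone’s altitude, camera tilt, and lens specifications; all of the values are invented stand-ins rather than Ginger’s actual parameters.

```python
# Illustrative sketch of measuring a fish from an aerial photo. Knowing the
# drone's altitude, camera tilt, and lens geometry lets each pixel be mapped
# to a real-world distance. All numbers below are invented stand-ins.
import math

def ground_sample_distance(altitude_m: float, tilt_deg: float,
                           focal_mm: float, pixel_pitch_mm: float) -> float:
    """Meters of water surface covered by one pixel at the image center.

    Off-nadir tilt lengthens the slant range to the surface, so each pixel
    covers more area; altitude / cos(tilt) approximates that slant range.
    """
    slant_range_m = altitude_m / math.cos(math.radians(tilt_deg))
    return slant_range_m * (pixel_pitch_mm / focal_mm)

# A tuna spans 310 pixels in a photo taken from 30 m up with a 10-degree tilt:
gsd = ground_sample_distance(altitude_m=30.0, tilt_deg=10.0,
                             focal_mm=35.0, pixel_pitch_mm=0.004)
print(f"Estimated fork length: {310 * gsd:.2f} m")  # about 1.1 m
```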

It’s still early days for this surveying method, and Jech’s hexacopter only allows the researchers to count tuna at the surface. But by combining the drone data with acoustic sampling to survey the below-surface members of the school, the team hopes to arrive at the first-ever direct population counts of tuna. Last year’s work was a proof of concept demonstrating that the drone could capture usable images. This year, Jech will include population biologists in the research effort to determine how this new combined drone-and-sonar sampling method compares to the industry catch data on which tuna population estimates are currently based.

And, as with Stanton’s broadband acoustics and Singh’s submersible imaging, Jech points out that his aerial survey is also starting to provide new insights into fish behavior. “There are four or five different types of behaviors that you can see—different school formations,” said Jech. “We don’t know if that’s because different sizes behave differently, or why.”

These new fish counting technologies—the HabCam, AUV, broadband acoustics, drone, and others—are all, to a greater or lesser extent, still in their infancy. But will they help us count fish more accurately? Certainly, they offer an improvement on the status quo for species such as tuna, scallops, and herring, which are all but impossible to count using a survey trawl. Even when a species—for example, cod—can be surveyed using current methods, audio-visual techniques might offer scientists a chance to assess populations more accurately without taking more fish out of the sea.

That said, even using these new counting methods, the final number will always be an estimate based on a sample. And, like every survey method, these new approaches are subject to the observer effect. Just as some fish are known to avoid the net in traditional trawls, each of these new technologies exerts its own distorting influence on fish behavior. Some species, and even some fish within species, seem to avoid camera light; others are disturbed by sound. Factors like these affect the accuracy of the final count. The question that Singh, Gallagher, Stanton, and others are trying to answer is whether we can ascertain by how much.

In the end, even the proponents of these technologies don’t suggest that any of the new counting methods could, on its own, replace NOAA’s traditional survey. Each method, within its particular species or environmental niche, might help make the final number more accurate, but the time-honored trawls have to continue being conducted in the same way, each year, in order to see larger trends in fish numbers over time. “You don’t want to throw the baby out with the bathwater,” explained Clarke. “You need to somehow knit these old surveys and new surveys together in a logical, cost-effective way.”

This is no small challenge, given the limited amount of funding available for federal fisheries research. Adding an official drone or AUV survey each year, on top of NOAA’s existing trawls, is a stretch, if not altogether impossible, given the agency’s current budget. Nonprofit and academic funding is similarly limited: Lutcavage, for example, will have to put her research on hold at the end of the year in order to scramble for grants. But NOAA won’t—and shouldn’t, according to most scientists—stop funding their biannual trawls in favor of drone flights or submersibles. No matter the trawl’s flaws, the consistent methodology is critical to assessing those long-term trends—the trawls provide an irreplaceable and extremely valuable time series. After forty years of doing the same survey in the same way each year, the inaccuracy of individual results fades and larger patterns of fishery declines emerge.

And so none of these new technologies, despite their individual strengths, solves the problem of counting fish. “I wish there was one giant hammer we could take to the problem and solve it that way,” said Singh. “Unfortunately, that doesn’t work.” But, providing NOAA can afford to add drones, sonar, and AUVs to its existing fish-counting toolkit, these technologies have the potential to improve data quality for hard-to-count species, as well as provide additional, valuable information about fish behavior and habitat.

Debating the accuracy of stock assessments is… a dangerous distraction, “an endless argument about how many fish there are in the sea until all doubt is removed—and so are all of the fish.”

—Andrew Rosenberg, Union of Concerned Scientists

According to Andrew Rosenberg, director of the Center for Science and Democracy at the Union of Concerned Scientists, “Fisheries management is a science-informed political process.” In other words, funding challenges and improvements in data collection aside, decisions about how many fish can be caught are never based purely on how many fish scientists think there are.

Rosenberg spent years as the northeast regional administrator for NOAA’s Fisheries division, moderating shouting matches between fishermen, politicians, and scientists. Although fisheries management council members have all pledged to protect the nation’s fishery resources, rather than simply represent their own interests, in reality, their perspectives are inherently biased. Inaccurate data is only part of a much larger problem of balancing the needs of all of these communities, whose time frames and incentives are rarely aligned.

Given all of the political obstacles, it’s hard not to wonder whether it’s even worth fighting for better fish counts. Some scientists argue that extra precision is unnecessary when the trends are so clear. “I didn’t need to know whether it was 20 or 22 percent of the biomass that we should be removing each year,” argued Rosenberg. “We were removing 60 percent, so all I needed to know was that it had to be a lot less.” Debating the accuracy of stock assessments is, to his mind, a dangerous distraction, “an endless argument about how many fish there are in the sea until all doubt is removed—and so are all of the fish.”

There is no doubt that many, if not most, commercial fish stocks are at historic lows, and that human pressure is to blame. Indeed, comparing estimates of 19th century cod abundance to today’s impoverished population makes the battles over minor recoveries or quota reductions over the past few decades seem almost ridiculous. According to Poul Holm, a professor of environmental history at Trinity College Dublin, based on analyses of 200 years’ worth of catch data, it’s clear that, over the past two centuries, “we have basically eradicated nine-tenths of the biomass of the large fish and marine mammals.”

Meanwhile, climate change is disrupting ocean conditions so fast that even the most responsibly crafted management plan based on the best possible data would struggle to keep up. Andy Pershing, chief scientific officer at the Gulf of Maine Research Institute, has seen this play out with cod in the Gulf of Maine, whose waters are some of the fastest warming in the world. With that environmental change have come shifts in species’ ranges, predator-prey relationships, and fish behavior, none of which are accounted for in the fishery managers’ statistical models. Pershing says that over the past decade, failure to factor in variations related to climate change has led scientists to think there should be more fish than there actually were in New England waters, meaning that managers set quotas based on inaccurate numbers. The result? “You end up creating this really frustrating situation where fishermen are doing their job, they’re staying within the limits”—but, despite their sacrifices, the cod population continues to decline.

“We have basically eradicated nine-tenths of the biomass of the large fish and marine mammals.”

—Poul Holm, Trinity College Dublin

Aboard the Miss Emily, the first four trawls were disappointingly thin, and we hadn’t seen more than a handful of cod all day. There was thus a palpable shift in mood when, after the penultimate trawl of the day, the net finally came up full, bulging with silvery-skinned haddock. The fishermen were in their element, gutting and cleaning the fish as soon as Hoffman and Buchan had completed their measurements. As the crew cleaned the deck, the boat’s captain, Kevin Norton, breaded fresh haddock fillets for supper. Because this was a commercial vessel rather than a NOAA survey ship, the rest of the catch would be on a truck to market that night, and the proceeds funneled back to fund more cooperative research.

While the method Hoffman was using was the standard NOAA assessment technique, this particular survey represented an important advance: a collaborative effort in which commercial fishermen and scientists worked together to count fish. After all, while more accurate numbers can help fishery managers chart a better balance between sustainability and the survival of the industry, the process that translates those numbers into management plans is a political one, and finding ways for fishermen and scientists to work together to count fish can transform them into allies, rather than opponents.

The new technologies that Singh, Gallagher, Stanton, and others are working so hard to develop and to have incorporated into NOAA’s official count can’t guarantee that the humans charged with acting on those numbers will cooperate to make the right decisions. But they do offer a potentially more important benefit: a better picture of the entire ecosystem. After all, the ocean is a dynamic environment, and fish behavior and fish populations respond to all of the aspects of their habitat, not just fishing pressure—water temperature and pH shifts, run-off from human settlement, variations in seal populations and zooplankton biomass, El Niño, and more.

By their very nature, technologies such as HabCam and broadband acoustics do more than simply count fish; they offer a view into the complex marine ecosystem in which fish live, as well as how they interact with that environment and each other. Indeed, in many ways, the fact that these new tools are better at counting fish is a mere bonus compared to their real benefit: They offer us a better picture of the ocean itself. If their adoption can encourage fishery managers to update existing models to reflect the health of the ocean as a whole, rather than the population of a single commercially important species, it’s possible that they could result in policies that not only replenish and sustain reduced fish stocks but also preserve and even strengthen marine biodiversity.

“One of our most important goals is to have ecosystem-based assessments,” Elizabeth Clarke says. “It’s pretty important that we start gathering information that is about the fish and where they live, simultaneously.” Better data doesn’t guarantee better decisions, but it does give us a better shot at making them.

Gastropod

Gastropod is the award-winning podcast that looks at food through the lens of science and history. Every other week, co-hosts Cynthia Graber and Nicola Twilley release a new episode that explores the hidden history and surprising science behind a particular food- or farming-related topic, from aquaculture to ancient feasts, from cutlery to chile peppers, and from microbes to Malbec. Cynthia Graber is an award-winning print and radio reporter, whose work has appeared in venues including Wired, The New Yorker, and various NPR shows. Nicola Twilley is a contributing writer for The New Yorker, where she writes about science in print and online.

bioGraphic is powered by the California Academy of Sciences, a renowned scientific and educational institution dedicated to regenerating the natural world through science, learning, and collaboration.