The Brilliant 10: The most innovative up-and-coming minds in science


FRESH EYES can change the world, and a world stressed by a pandemic, climate change, and inequity is riper for change than any we've experienced before. That's why, after a five-year break, Popular Science is bringing back the Brilliant 10: an annual roster of early-career scientists and engineers developing ingenious approaches to problems across a range of disciplines. To find those innovators, we embarked on a nationwide search, vetting hundreds of researchers from institutions of all stripes and sizes. These thinkers represent our best hopes for navigating the unprecedented challenges of tomorrow—and today.

Making future forecasts less hazy

Allison Wing, Assistant Professor of Meteorology, Florida State University. Nicole Rifkin

Allison Wing sees a hole in the world’s major climate models: The reports published by the Intergovernmental Panel on Climate Change factor in water vapor, but not the way it forms clouds—or, more specifically, the way they cluster in the skies. In fact, says the Florida State University meteorologist, these airborne puffs may be the biggest source of uncertainty in our environmental projections. Wing’s models and simulations could help predict how a hotter planet will reshape clouds and storms and whether these changes will, in turn, exacerbate global warming.

It’s already apparent that cloud patterns can produce distinct local effects. “When clouds are clumped together, rather than being randomly distributed,” Wing explains, “the atmosphere overall is drier and warmer, and there’s actually less cloud coverage overall. And that affects how radiative energy flows through our climate system.”

Wing’s findings, published in the Journal of Advances in Modeling Earth Systems in 2020, suggest that the nuances of cloud behavior may alter notions of what our climate future looks like and perhaps how fast we’ll reach it. “Not just how they’re clustering,” she says, “but everything about them.” She—together with a group of 40 international scientists she leads in running mathematical simulations of the atmosphere—wants to get a better grip on how factors like cloud density, height, and brightness could change as the planet warms. Zeroing in on those details may hone the accuracy of global warming projections.

In the here and now, Wing wants to answer questions about extreme weather events, such as what controls the number of hurricanes we get in a given year and why big storms are getting larger and wetter faster. Her work points to a sort of "cloud greenhouse effect," in which infrared radiation given off by the Earth as the sun warms it gets trapped beneath nascent storms, helping stronger tempests build more quickly. She hopes observational data from the Jet Propulsion Laboratory's CloudSat research satellite, which she gained access to through a 2021 NASA grant, will verify this phenomenon's existence.

By simulating past hurricanes in vivid detail—a process involving so many variables that Wing runs them on the National Center for Atmospheric Research's supercomputer in Wyoming—she hopes to render the re-creations more realistically over time. Eventually, though, she wants to tap NASA's satellite imagery (aka the real world) to make potentially lifesaving predictions.

Turbocharging surgical pathology

Michael Giacomelli, Assistant Professor of Biomedical Engineering and Optics, University of Rochester. Nicole Rifkin

When it comes to speedy biopsy results, nothing beats Mohs surgery. To minimize scarring, pathologists analyze excised skin cancers on site to ensure all dangerous cells are gone. Other common cancer surgeries, such as those for the prostate and breast, still rely on lab work that takes days to confirm clear margins, which can mean repeat procedures are necessary. And it's all very labor-intensive. Michael Giacomelli, a University of Rochester biomedical engineer, has a microscope that could put even Mohs surgery's turnaround time to shame—spotting cancerous cells from a variety of tumors in near-real time.

The key is going small. Two-photon microscopes like the one he's built have been around for decades, but their hefty price tags (often $500,000 or more) and sprawling form factors (components are often racked in a space the size of a utility closet) make them impractical for most operating rooms. The scopes spy sick cells with the help of lasers: Tumor cells have characteristically enlarged nuclei, due to their excess of DNA; when soaked in a specialized dye, the oversize organelles fluoresce under the laser light. "They're able to reach into a wet, bloody, messy mass of tissue and look at what's inside," Giacomelli explains.

With a background in optics, he knew that smaller, lighter lasers were already being used for welding and on factory floors. The trick was to find dyes that worked at those lasers' wavelengths and that wouldn't ruin human tissue for more in-depth follow-up in a traditional lab. He identified one suitable hue in a derivative of the ink in pink highlighters. After years of trial and error, which began at MIT, and a few iterations, the laser that excites the dye weighs just 5 to 25 pounds. Combined with a microscope, monitor, CPU, keyboard, and joystick, the system fits on a handcart compact enough to wheel between surgeries. The price tag: around $100,000.

With more than 100,000 breast cancer surgeries and millions of skin cancer procedures each year in the US, the impact could be profound. Since 2019, an earlier version of Giacomelli's system (one the size of a washing machine) has been in a clinical trial for breast cancer patients at Beth Israel hospital in Boston. And a study on prostate cancer screening published in Modern Pathology found doctors could ID malignant cells just as well with the new system as with traditional methods. Next, Giacomelli wants to trial his sleeker setup in Mohs and other skin cancer surgeries. He's also interested in getting his imaging equipment into rural clinics that don't have tissue labs nearby for fast answers. Modifying his scope for 3D imaging could open doors, too, potentially improving outcomes for complexly shaped cancers like melanoma: Looking at tumors in 2D limits our understanding of what's going on, he says. "I really think 3D imaging is going to be huge for diagnosis."

Untangling transgenerational trauma

Bianca Jones Marlin, Assistant Professor of Psychology and Neuroscience, Columbia University. Nicole Rifkin

Bianca Jones Marlin credits her siblings for inspiring her career. All 30-plus of them. That’s not a typo: Her folks took in dozens of foster kids. “My siblings have gone through things you wouldn’t even want to imagine,” she says. That’s why Marlin, a psychologist and neuroscientist at Columbia University who has now fostered children herself, studies a unique sliver of epigenetics, or the impact our environments and behaviors have on our genes. She documents how stress and trauma pass between generations, even when forebears have little or no contact with their descendants.

“The world changes your brain and your body—and also your offspring,” she says. “That has such strong implications for society, for the way we predict what’s going to happen in the future.” Communities that have endured famine, genocide, or any number of other struggles, she points out, may experience heightened anxiety and PTSD in later generations. Revealing the levers by which stress “travels to the future” could open pathways to therapy and prevention—breaking the chain of trauma.

Marlin began her work, which centers on brain development and learning, by identifying one of the mechanisms responsible for a seismic shift in social behavior. In 2015, she showed how the hormone oxytocin sensitizes mouse moms to their pups’ distress calls. And since then, she’s studied the effects of environmental stress and trauma in lab mice.

But how are those changes passed down? “That is the beautiful, essential question that we’re working on,” Marlin says. Until now, scientists have seen such effects only observationally: An infamous famine in the Netherlands at the end of World War II, for example, increased health issues like diabetes, high blood pressure, and schizophrenia not only in those it affected, but also in their children, suggesting that reproductive cells could convey a memory of the trauma. Through her work on mice, Marlin has demonstrated how a learned behavior (associating the smell of almond with an electric shock) is tied to an increase, in the animals’ progeny, of olfactory cells that respond to that scent. “We talk about it in culture,” she notes, “but because we don’t know the mechanism, it’s considered a myth.”

Marlin’s aware that her findings could be used to stigmatize groups of people—even harm them. “I would be disappointed if, 15 years from now, people were able to take the work that we have done and use that as a wall—assuming that because your ancestors went through this, you obviously are going to suffer from this too,” she says. Or worse, she continues, malicious actors could torture or terrorize with the explicit intention of harming future generations.

The positive ramifications are enough to keep her going. “If we can induce negative changes and dramatic changes, we also can induce positive,” Marlin says. “That’s the beauty of epigenetics. It’s not permanent.”

Dusting for digital fingerprints to find deepfakes

Matthew Stamm, Associate Professor of Electrical and Computer Engineering, Drexel University. Nicole Rifkin

“It is impossible for a criminal to act, especially considering the intensity of a crime, without leaving traces of this presence,” wrote Edmond Locard, a 20th-century forensic science pioneer. It’s a quote Matthew Stamm frequently references. The Drexel University computer engineer isn’t after fingerprints or hair strands, however; his tools and techniques instead detect even the most subtle alterations to digital objects: deepfakes.

Since its earliest Reddit days in 2017, deepfaking has graduated from a revolting prank—using AI to put celebrity actors’ faces on porn stars’ bodies—to an alarming online threat that involves all sorts of synthetic multimedia. Because detection is even newer than the act itself, no one has a grasp on how widespread the phenomenon has become. Sensity, an Amsterdam-based security firm, reported that the examples spotted by its homegrown sniffer doubled in the first six months of 2020 alone. But that number is surely low, especially with the release of easy-to-use apps like MyHeritage, Avatarify, and Wombo, which have already been used to animate tens of millions of photos.

“The ability to rapidly produce visually convincing, completely fake media has outpaced our ability to handle it from a technological end. And importantly, from a social end,” notes Stamm. According to a 2021 Congressional Research Service report, such fakes pose considerable national security threats: They can be used to blackmail elected officials, spread false propaganda, radicalize populations, influence elections, and even incite war.

The budding threat has prompted a growing number of companies and researchers—including biggies like Microsoft and Facebook—to develop software that sniffs out AI fakes. But Stamm, who’s funded by DARPA to build automatic deepfake detectors, notes that artificial intelligence is used to make only a small subset of the tampered media we have to worry about. People can use Adobe Photoshop to create so-called cheapfakes or dumbfakes without specialized talent or hardware. In 2019, videos of Nancy Pelosi were altered by slowing soundtracks to make her appear drunk, slurring her words. In 2020, chopped-up videos made then-candidate Joe Biden appear to fall asleep during an interview.

Stamm’s approach to image analysis can catch even simple manipulations, no matter how convincing. “Every processing element, every physical hardware device that’s involved in creating a piece of media, leaves behind a statistical trace,” he notes. He based his algorithms on a concept called forensic similarity, which spots and compares the digital “fingerprints” left behind in different regions. His software breaks images into tiny pieces and runs an analysis that compares every part of the photo with every other part to develop localized evidence of just about any kind of nefarious editing.
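
For readers who want to see the gist, here is a minimal sketch of that patch-by-patch comparison in code. It's a toy: the patch size and the crude noise-residual "fingerprint" are illustrative stand-ins for the learned forensic features Stamm's detectors actually compute, not his published method.

```python
# Toy patch-wise "forensic similarity": split an image into patches, summarize
# each patch's high-frequency residual, and compare every patch against every
# other. A patch whose fingerprint sits far from the rest is flagged as the
# likeliest edit. Features, sizes, and the synthetic image are illustrative only.
import numpy as np

def patch_fingerprint(patch: np.ndarray) -> np.ndarray:
    """Summarize a patch by the distribution of its high-frequency residual."""
    residual = patch - np.median(patch)          # crude stand-in for a high-pass filter
    hist, _ = np.histogram(residual, bins=16, range=(-64, 64), density=True)
    return hist

def forensic_similarity_scores(image: np.ndarray, patch_size: int = 32) -> dict:
    """Map each patch's top-left corner to its average distance from all other patches."""
    h, w = image.shape
    prints, coords = [], []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            prints.append(patch_fingerprint(image[y:y + patch_size, x:x + patch_size]))
            coords.append((y, x))
    prints = np.array(prints)
    # Compare every fingerprint with every other fingerprint (pairwise L2 distance).
    dists = np.linalg.norm(prints[:, None, :] - prints[None, :, :], axis=-1)
    return dict(zip(coords, dists.mean(axis=1)))

# Synthetic grayscale image with one "spliced" region carrying different noise.
rng = np.random.default_rng(0)
img = rng.normal(128, 2, size=(128, 128))
img[64:96, 64:96] += rng.normal(0, 20, size=(32, 32))     # the tampered patch
scores = forensic_similarity_scores(img)
print(max(scores, key=scores.get))   # coordinates of the most anomalous patch, the spliced one
```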

Stamm’s latest work focuses on emotional consistency, matching voice patterns (intensity and tone) with facial cues (expressions and movements) in video. The idea, inspired by Stamm’s wife, a psychologist, stems from the notion that it’s difficult for video manipulations to sustain emotional consistency over time, especially in voices, he says. These techniques are still in development, but they show promise.

Removing ‘forever chemicals’ from drinking water

Frank Leibfarth, Assistant Professor of Chemistry, University of North Carolina Chapel Hill. Nicole Rifkin

The Cape Fear River in North Carolina supplies drinking water to much of the southeastern part of the state. But for decades the chemical giant DuPont fed something unsavory into the waterway: PFAS, or per- and polyfluoroalkyl substances, chains of tightly bonded carbon and fluorine with a well-earned rep as “forever chemicals.” A subset of them—PFOA and PFOS—can contribute to elevated cholesterol, thyroid disease, lowered immunity, and cancer. The Centers for Disease Control and Prevention has found them in the bloodstreams of nearly every American it’s screened since 1999. While DuPont (via a division now called Chemours) phased out production in 2013, the remnants of old formulations of household staples like Teflon, Scotchgard, and Gore-Tex linger.

Frank Leibfarth, a chemist at the University of North Carolina at Chapel Hill, has a filter that can remove these toxins—and he’s starting with the Tar Heel State’s polluted waterways.

Leibfarth specializes in fluorinated polymers like PFAS. Before the NC Policy Collaboratory funded him to help with the state’s water pollution problem in 2018, he was focused on finding cheap and sustainable alternatives to single-use plastics, whose exteriors are sometimes hardened with fluorine. Leibfarth’s solution took its cue from diapers: “They’re super-absorbent polymers that suck up lots of water,” he says. He developed a fluorine-based resin that’s similar enough in structure to PFAS to attract the compounds and hold on to them. The material removes nearly all of these substances from water, and 100 percent of PFOA and PFOS, according to results his team published in the journal ACS Central Science in April 2020. Because the resin is cheap and scalable, municipal water treatment plants can deploy it as an additional, cost-effective filtration step.

The North Carolina legislature is considering a series of PFAS-remediation bills in 2021, one of which would fund commercializing Leibfarth’s solution, including manufacturing the resin and fitting it to municipal filtration systems. Other locales will surely follow. According to the nonprofit Environmental Working Group, as of January 2021 there are more than 2,000 sites across the US with documented PFAS contamination. Seven states already enforce limits on the chemicals in their drinking water—with more to follow.

Amid all this, the Environmental Protection Agency in March 2021 identified another PFAS exposure threat: the very same hardened plastic containers that Leibfarth’s initial work aims to make obsolete. “I want to change the field’s thinking,” he says, “about what is needed to develop materials that are both useful and sustainable at the same time.”

Powering electronics without batteries

Josiah Hester, Assistant Professor of Computer Science, Computer Engineering, and Electrical Engineering at Northwestern University. Nicole Rifkin

Our love of personal gadgets is causing a major pileup. Based on current trends, humanity’s battery-powered gizmos could number in the trillions by 2030. Josiah Hester, a computer engineer at Northwestern University, hopes to keep those power-hungry devices from overloading landfills with their potentially toxic power cells. His plan is simple and radical: Let these little computers harvest their own juice.

Hester’s team creates arrays of small, smart, battery-free electronics that grab ambient energy. His work is based on a concept known as intermittent computing, an approach that can deal with frequent interruptions to power and internet connectivity—in other words, devices that do their jobs without a constant hum from the grid.

His team assembles circuit boards that combine off-the-shelf processors from companies like Texas Instruments with sensors and circuitry to tap power sources like the sun, radio waves from the environment, thermal gradients, microbes, and impact forces. The team also writes the custom software to keep the sensors running. The most notable feature of these circuit boards? No batteries. Juice flows through capacitors when it’s available, and devices are designed to handle brief power-downs when it’s not.
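
The pattern is easier to grasp in code. The sketch below is not Hester's firmware—his devices run C on microcontrollers—but it shows the checkpoint-and-restore idea at the heart of intermittent computing, with a JSON file standing in for non-volatile memory and made-up sensor and energy functions.

```python
# Toy checkpoint-and-restore loop: progress is committed to "non-volatile"
# storage after every unit of work, so a device that loses power mid-task
# simply resumes from its last checkpoint the next time it charges up.
import json, os, random

STATE_FILE = "checkpoint.json"        # stands in for FRAM or flash on real hardware

def load_checkpoint() -> dict:
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"next_sample": 0, "readings": []}

def save_checkpoint(state: dict) -> None:
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)

def capacitor_has_charge() -> bool:
    return random.random() > 0.3      # pretend harvested energy sometimes runs out

def read_sensor(i: int) -> float:
    return 20.0 + 0.1 * i             # placeholder measurement

state = load_checkpoint()
while state["next_sample"] < 10:
    if not capacitor_has_charge():    # power lost: a real MCU would die here...
        break                         # ...and this whole script reruns when energy returns
    state["readings"].append(read_sensor(state["next_sample"]))
    state["next_sample"] += 1
    save_checkpoint(state)            # commit after each unit of work

print(f"collected {state['next_sample']} of 10 samples so far")
```

Run it a few times in a row and the sample count keeps climbing, even though any single run can be cut short.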

In 2020, Hester debuted his proof of concept: a handheld gaming device (ENGAGE) modeled after a classic Game Boy. Its power comes from small solar cells framing its screen and from the impacts of button presses, which generate electricity when a magnet drops through a coil. (Shakable Faraday flashlights work in a similar way.) The toy is no match for the energy-gobbling processors in most immersive platforms on the market, but it’s a harbinger of what’s to come. During the pandemic, Hester’s lab developed a “smart mask” prototype decked out with tiny sensors that check vital signs like temperature, heart rhythm, and respiratory rate—all powered by the vibrations from the user’s breaths.

Untethering devices from the electrical grid also makes them more practical for remote applications. Hester has several programs underway, including one to monitor wild rice habitats and avian flocks in the Kakagon Sloughs, a Great Lakes conservation area managed by the Ojibwa people. When the sensors, which harvest energy from soil microbes and sunshine, are deployed later this year, they’ll track water quality and the sounds of crop-ravaging waterfowl. He’s also working with the Nature Conservancy to set up noninvasive, solar-powered cameras on Palmyra Atoll, an island in the heart of the Pacific Ocean surrounded by more than 15,000 acres of coral reef. Once a weather station and monitoring site for nuclear testing, the spot is now perfectly stationed to track migrating birds and, perhaps eventually, the effects of climate change on marine species.

As Hester pushes the limits of intermittent computing to improve device sustainability, he’s guided by a philosophy he attributes to his Native Hawaiian upbringing. It boils down to a simple question: “How do you make decisions now that will have positive impacts seven generations in the future?”

Storing data in chemical soup

Brenda Rubenstein, Assistant Professor of Chemistry, Brown University. Nicole Rifkin

According to a recent report, Earth has only enough permanent physical storage space to hold on to some 10 percent of the more than 64 billion terabytes of data humans generated in 2020. Luckily for us, not every meme and tweet needs to live forever. But given that our output has doubled since 2018, it’s reasonable to fear that crucial information like historical archives and precious family photos could find itself homeless in the near future. That’s the problem Brenda Rubenstein, a theoretical chemist at Brown University, hopes to solve. She wants to tap into evolution’s storage designs (read: molecules) to create a radical new type of hard drive—a liquid one. Her chemical computers use tiny dissolved molecules to crunch numbers and store information.

In 2020, she and her colleagues converted a cocktail of small amines, aldehydes, carboxylic acids, and isocyanides into a kind of binary code puree. “The way you can store information in that disordered mixture of molecules floating around is through their presence or absence,” Rubenstein notes. “If the molecule is there, that’s a one; if a molecule is not there, that’s a zero.” The method, published in Nature Communications, successfully stored and retrieved a scan of a painting by Picasso. In 2021, her team used a similar slurry to build a type of AI called a neural network capable of recognizing simple black-and-white images of animals, like kangaroos and starfish.
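
In code, that presence-or-absence scheme is almost disarmingly simple. The sketch below uses a made-up library of 16 molecule names; a real mixture draws on thousands of compounds a mass spectrometer can tell apart.

```python
# Toy presence/absence encoding: each slot in a fixed library of distinguishable
# molecules represents one bit. A "mixture" is just the subset of molecules you
# add (1) or leave out (0); reading back means checking which ones are detected.
LIBRARY = [f"molecule_{i:02d}" for i in range(16)]    # 16 molecules = 16 bits

def encode(bits: str) -> set:
    """Turn a bit string into the set of molecules to dispense into the well."""
    assert len(bits) <= len(LIBRARY)
    return {LIBRARY[i] for i, b in enumerate(bits) if b == "1"}

def decode(detected: set, n_bits: int) -> str:
    """Read the bits back from whichever molecules the spectrometer reports."""
    return "".join("1" if LIBRARY[i] in detected else "0" for i in range(n_bits))

message = format(ord("P"), "08b") + format(ord("S"), "08b")    # "PS" as 16 bits
mixture = encode(message)
assert decode(mixture, 16) == message
print(sorted(mixture))    # the handful of molecules that spell out "PS"
```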

Molecular data storage is already in the works elsewhere. Experiments with embedding information in DNA, a long-chain molecule, date back to the early 2000s, and tech titans like Microsoft and IBM have entered the mix, along with specialty companies and IARPA, the research agency of the US intelligence community.

But small molecules may have distinct advantages over DNA. Compared to the double helix, their structures are simpler to synthesize (cheaper to manufacture), more durable (less susceptible to degradation), and less error prone (because reading and writing don’t require sequencing or encoding). What’s more, according to Rubenstein’s rough calculations, a flask of small molecules could hold as much data as 200 Empire State Buildings’ worth of terabyte hard drives. Stored as dried crystals, the molecules could outlast even modern storage media—lasting perhaps thousands of years, compared to the 10 to 20 typical of current hard drives and magnetic tapes. The main trade-off is speed. Rubenstein’s tech would take about six hours to store this article, for example, and you would need specialized equipment like a mass spectrometer to read it back, making the method better suited to archival preservation than daily computing.

Within the last few years, Rubenstein and her colleagues have filed a chemical computing patent, and they are in talks with a venture capital firm to launch a startup focused on harnessing the budding new technology. “What gets me up in the morning,” says Rubenstein, “is the prospect of computing using small molecules.”

Tracking public health with smart sewers

Fangqiong Ling, Assistant Professor of Energy, Environmental & Chemical Engineering, Washington University in St. Louis. Nicole Rifkin

The name Beijing often conjures images of skyscrapers, traffic, and crowds. But Fangqiong Ling, who grew up in the city of more than 20 million, thinks of its scenic lakes, which still bear their 17th-century Qing dynasty names: Qianhai, Houhai, and Xihai. Ling studied algae blooms in these pools in high school. She and her classmates used benthic invertebrates (such as crayfish, snails, and worms) to analyze water quality, knowing that different groups of species tend to gather in clean or polluted environments. She’s been turning smaller and smaller biological organisms into sensors ever since.

Ling, an environmental microbiologist and chemical engineer at Washington University in St. Louis, still studies the H2O that flows through urban infrastructure. But she’s transitioned from water quality to wastewater-based epidemiology (WBE) and the use of “smart sewers.”

This concept isn’t new: Public health officials have sampled sewage for years to detect a wide spectrum of biological and chemical markers—including illicit drugs, viruses, bacteria, antibiotics, and prescription medications. But they have lacked tools to accurately account for the number of human sources represented in their samples, making it hard to assess the scope and scale of contamination. If a sewage sample turns up high concentrations of nicotine, for example, the spike could be the result of one toilet flush from a hardcore smoker close to the collection site, or the combined contribution of many smokers across the city. Substitute coronavirus or anthrax, and it’s easy to see how the difference matters.

Ling’s breakthrough was figuring out how to use the relative numbers of people’s gut bacteria in wastewater—revealed by rapidly sequencing their RNA—to estimate the true size of the population that contributed to that sample.
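
A deliberately oversimplified sketch shows why that head count matters. The divide-by-average estimator and every number below are hypothetical; Ling's actual approach models the composition of sequenced gut-microbiome RNA rather than a single marker.

```python
# Naive illustration: estimate contributors from a human fecal marker, then
# normalize a contaminant by that head count. One heavy source and many light
# ones look identical until you know how many people are behind the sample.
def estimate_population(marker_copies_per_liter: float,
                        flow_liters_per_day: float,
                        copies_per_person_per_day: float = 1e9) -> float:
    """Total daily marker load divided by a typical per-person shedding rate."""
    return marker_copies_per_liter * flow_liters_per_day / copies_per_person_per_day

def per_person_load(contaminant_ng_per_liter: float,
                    flow_liters_per_day: float,
                    population: float) -> float:
    """Normalize a measured contaminant by the estimated contributing population."""
    return contaminant_ng_per_liter * flow_liters_per_day / population

flow = 5e6                                    # liters per day past the sampler (hypothetical)
people = estimate_population(2e5, flow)       # marker level implies ~1,000 contributors
print(f"estimated contributors: {people:.0f}")
print(f"nicotine per person: {per_person_load(40.0, flow, people):.0f} ng/day")
```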

Her field is having a moment. During COVID-19, many cities have turned to WBE, which has exploded from a dozen or so projects to more than 200 worldwide. In 2020 the Centers for Disease Control and Prevention announced a new National Wastewater Surveillance System as a public health tool. With a 2021 National Science Foundation grant, Ling wants to improve population estimates to the point where the comings and goings of commuters, tourists, and other transients don’t skew results. Those tools are a step toward automatic, highly accurate assessments of contaminants and contagions in precise locations. “Microbes really have a very fundamental relationship with humans and our cities,” Ling notes. “I’m just trying to dig out the stories they have to tell.”

Shining light on dark matter

Michael Troxel, Assistant Professor of Physics, Duke University. Nicole Rifkin

The standard cosmological model describes how stars, planets, solar systems, and galaxies—even little-understood objects like black holes—congealed from a raucous cloud of primordial particles. While there’s abundant evidence to support the big bang (such as the expansion of the universe and the background radiation the cosmic event left behind), there are some vexing gaps. Dark matter, for instance. For galaxies to rotate at the speeds we observe, there should be at least five times more mass than we’ve been able to lay eyes on. “We have no evidence that dark matter exists, except that it is necessary for the universe to end up where we are today,” says Michael Troxel, a cosmologist at Duke University. To piece together what’s missing, Troxel builds maps of the universe larger and more precise than any before.

Since 2014, Troxel has worked with the Dark Energy Survey (DES), an ambitious international collaboration of more than 400 scientists, to address critical unknowns in the universe. To scope out distant skies, DES fitted a custom 570-megapixel camera with an image sensor highly attuned to red light—as objects move farther away, their wavelengths appear to stretch, making them look increasingly crimson—and mounted it on a telescope perched high in the Chilean Andes. From that vantage, it can spot some 300 million galaxies.
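
The stretching itself reduces to a one-line formula. The sketch below computes redshift from a single hydrogen emission line; the observed wavelength is an invented example, but the rule that bigger stretches mean more distant, older light is what lets a survey like DES turn color into a rough distance.

```python
# Redshift z is the fractional stretch of a known spectral line:
# z = (lambda_observed - lambda_rest) / lambda_rest
REST_H_ALPHA_NM = 656.3           # hydrogen-alpha's wavelength measured in the lab

def redshift(observed_nm: float, rest_nm: float = REST_H_ALPHA_NM) -> float:
    return (observed_nm - rest_nm) / rest_nm

print(round(redshift(984.5), 2))  # a line seen at 984.5 nm gives z ≈ 0.5, shifted toward the infrared
```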

Now co-chair of the DES Science Committee, Troxel coordinated the analysis of data collected through 2016 and, in doing so, spied dark matter’s myriad fingerprints on celestial bodies across spacetime in exquisite detail. The brightness and redness of objects indicate both their distance and—because the universe is expanding—how long their light has been traveling. Modeling subtle bends in light (think magnified or stretched waves) called weak gravitational lensing reveals massive objects both seen and unseen. And the makeup of the objects themselves helps fill in the picture even more: Troxel used machine learning to classify patterns in galaxy colors (shades of red and faintness) and mathematical modeling to infer shapes (elliptical, spiral, irregular), netting a catalog of more than 1,000 types of galaxies. Having a reference for what clusters should look like helps efforts to detect distortions that may point to dark matter. “That allows us to reconstruct this 3D picture of not just what the universe looks like now, but how it looked 6 or even 9 billion years ago,” Troxel explains.

The findings, announced in May 2021, cover one-eighth of Earth’s sky and more than 100 million galaxies. By the time the results of the full DES data set are published (possibly by 2023), Troxel is hopeful we’ll be able to predict and calculate dark matter. “There’s going to be this watershed moment where we measure the right thing, or we measure the things we’re measuring now with enough precision that we’re going to fundamentally learn where physics is broken,” he says. “We’re almost there.”

Adapting technology for those who need it most

Stacy Branham, Assistant Professor of Informatics, University of California, Irvine. Nicole Rifkin

To Stacy Branham, people with disabilities are the original life hackers—and that’s a bad thing. The University of California, Irvine computer scientist doesn’t think anyone should have to be a MacGyver just to get through life. Marginalized groups often adapt apps and gadgets to suit their needs. In the 1950s, for instance, visually impaired people manipulated record players to run at higher speeds, allowing them to “skim” audio books for school or work; today, browser extensions that speed up videos have the same effect. Branham wants to use similar insights to design better products from the start. “Innovation is having the right people in the room,” she says.

Branham takes off-the-shelf technologies, like virtual assistants, and puts them together in novel ways to address the needs of underserved communities. One of her projects, nicknamed Jamie, provides step-by-step directions to help the elderly and people with disabilities navigate byzantine airport checkpoints, signs, and corridors. Jamie uses voice assistance, a geolocation system that takes cues from sources like Bluetooth beacons and WiFi signals, “staff-sourcing” (daily reports by airport employees about dynamic changes like repair work), and audio cues or vibrations. COVID-19 derailed plans to pilot the system at Los Angeles International Airport, but Branham expects to resurrect it soon. “It was built from the beginning with input from people who are blind, people who are wheelchair users, and people who are older adults,” she says, but the resulting tech will benefit anyone who gets lost in airports.

Next, Branham wants to adapt text-to-speech tech to help blind people read with their children. Her proposed Google Voice–based app will act as an interpreter for e-books, prompting caregivers via earbuds with the right words and descriptions of images so they can have a richer story-time experience with their families.

When modern tools are designed with disabled communities in mind, there’s often a widespread benefit—see, for example, the now-ubiquitous curb cuts that enable passage for those with strollers and luggage as much as those in wheelchairs. Branham also points out how software like hers could help others, like those who speak English as a second language. Ultimately, she measures success unlike most people developing personal electronic gizmos: not by whether she can create flashy new features, but by whether the offerings of innovation and science are accessible to the people who might need them the most.

This story originally ran in the Fall 2021 Youth issue of PopSci.


Bill Gourgey

Contributing Writer

Bill Gourgey is a Popular Science contributor and unofficial digital archeologist who enjoys excavating PopSci’s vast archives to update noteworthy stories (yes, merry-go-rounds are noteworthy).