Everything Noise
"I like everything noise.”Jonas Zmuidzinas’s new favorite saying is a phrase that’s been running through his mind a lot lately. A physicist at Caltech who develops instrumentation for use in astronomy, he spends an inordinate amount of his waking hours thinking about noise—but not in the way you might expect. For the average person, thinking about noise might mean trying to ignore the loud neighbors on a Sunday morning or using sound-cancelling headphones on a flight full of babies.
But for many scientists and engineers, a broader definition also assigns the term to the fluctuations in a measured signal that can obscure it or reduce its clarity. “For people like myself who build instruments and detectors, noise is at the heart of what we do,” says Zmuidzinas. In engineering, for example, these fluctuations can arise from the random motions of atoms or electrons and can manifest as heat or electronic static, which in turn can lead to malfunctioning machines. A clearer understanding of noise sources, and of ways to minimize them, can lead to more efficient microchips and to telescopes capable of probing structures in the universe that were previously beyond reach.
At the same time, Zmuidzinas and his Caltech colleagues believe that noise can also be a useful scientific tool; some are investigating ways to harness the literal power of noise to create artificial cells that can function as powerful computers, while others are starting to uncover the important role noise plays in gene expression, research that suggests noise is vital to life itself.
SPACE NOISE Zmuidzinas’s focus on noise is helping him learn how to reduce it and thereby improve the ability of astronomical instruments to detect previously indiscernible signals and objects. “Big progress is made when you advance the frontier of detection, when you start measuring signals and detecting objects that have not been, and could not have been, seen before,” he says.
Zmuidzinas’s lab focuses on submillimeter astronomy, the branch that deals with wavelengths of light that are invisible to the human eye and that lie between the wavelengths observed by radio telescopes and optical telescopes.
“Submillimeter astronomy straddles these two very different domains of astronomy. It’s a mishmash of both,” says Zmuidzinas. “All of the physics that enables the detectors that you use in visible-light astronomy gets tossed out the window when you’re designing instruments for submillimeter astronomy. You have to start completely fresh and invent everything from the ground up.”
The construction of improved submillimeter instrumentation could help answer many of modern astronomy’s outstanding questions, such as when and how stars formed in galaxies over the history of the universe. Large clouds of gas and dust give birth to stars, but when stars blaze to life, the clouds enveloping them absorb the starlight and re-emit it back into the universe at longer, lower-energy wavelengths in the submillimeter band. Still, astronomers were shocked to discover in the 1990s that the universe is as bright in submillimeter light as it is in optical and infrared light, the primary wavelengths that stars emit. “The surprise was how much of the energy produced by stars gets shifted to the far-infrared and submillimeter range. Very few astronomers would’ve believed it if the measurements didn’t clearly show that this was the case,” says Zmuidzinas.
Building detectors sensitive enough to make this case, or, rather, to discern submillimeter light, is technically difficult. That’s because the particles of light, or photons, in the submillimeter band carry roughly 1,000 times less energy than optical photons, since the gas clouds and other celestial objects that emit them are typically cold and dark. Instruments intended to ensnare these feeble photons must be similarly cooled so that they do not radiate any heat that would interfere with signal detection. To build their detectors, Zmuidzinas and his team have turned to superconducting materials that can operate at temperatures well below 1 kelvin (–458 degrees Fahrenheit).
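The thousand-fold energy gap can be sketched with Planck’s relation, E = hc/λ. The two wavelengths used below (500 nanometers for visible light, 500 micrometers for the submillimeter band) are representative values chosen for illustration, not figures from Zmuidzinas’s work.

```python
# Rough check of the ~1,000x energy gap between optical and
# submillimeter photons via Planck's relation E = h*c/wavelength.
# The two wavelengths are illustrative choices, not from the article.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of one photon of the given wavelength, in joules."""
    return H * C / wavelength_m

optical = photon_energy(500e-9)   # 500 nm, visible light
submm = photon_energy(500e-6)     # 500 um, submillimeter band

print(f"optical photon:       {optical:.2e} J")
print(f"submillimeter photon: {submm:.2e} J")
print(f"energy ratio:         {optical / submm:.0f}x")
```

Because photon energy scales inversely with wavelength, the factor of 1,000 falls straight out of the thousand-fold difference in wavelength.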
At such frigid temperatures, the natural motions of the atoms that make up the superconductors slow to a near standstill, thus minimizing thermal vibrations. These vibrations—which are a kind of noise—can break the fragile bond between the electron pairs in the superconductors that form the core of the detectors Zmuidzinas’s team develops, and create false positives in their measurements.
“Understanding the origins of the noise allows us to engineer our detectors to reduce it,” says Zmuidzinas, who helps lead Caltech’s Submillimeter Astrophysics group, which operates the Caltech Submillimeter Observatory (CSO) on Mauna Kea in Hawaii.
Zmuidzinas’s immersion in the study of noise has resulted in some surprising collaborations. While researching a type of noise originating from atoms hopping around on the surfaces of their detector circuits, particularly in the capacitors—electrical components that store energy—Zmuidzinas and his coworkers discovered a common ground with physicists working on quantum computing, which aims to use the weird laws of quantum mechanics to process information.
“It turned out that the noise we were investigating also strongly influences the performance of the superconducting circuits that people interested in quantum computing have been using,” says Zmuidzinas, “and so we’ve had this nice back and forth that has helped us both.”
PHONON NOISE Delving into the mechanics of noise also resulted in new insights for mechanical engineer and applied physicist Austin Minnich. He and his team recently identified a source of electronic noise that sometimes affects the functioning of electronic instruments that operate at very low temperatures, including devices used in radio astronomy and in airport security scanners. The findings could also have implications for the future design of circuit elements, such as transistors, which amplify and switch electronic signals and electrical power.
The electronic noise Minnich’s team identified is related to the transfer of packets of vibrational energy, called phonons, which are present in all materials that have a crystal structure. “In a crystal, from those in ordinary table salt to the indium phosphide crystals used to make transistors, you have atoms that are arranged in an orderly lattice,” Minnich says. “Those atoms can vibrate in different ways, and you can break down those vibrations into modes. A phonon is a discrete packet of these vibrational modes.”
Phonons are important for electronics because they help carry away the thermal energy, or heat, generated by electrical currents. When electron flow is restricted, some of the energy involved in moving the electrons forward is converted to heat. How swiftly and efficiently phonons ferry heat away from a circuit element and into the environment depends in part on the device’s operating temperature: at high temperatures, phonons are more energetic and more likely to collide with one another and with imperfections in the atomic structures of electrical components. This noisy phenomenon, called scattering, results in phonon traffic jams that prevent phonons from leaving the device and therefore lead to a temperature rise. “Phonons can interact with each other, and you can imagine that if their direction is randomized, a lot of scattering will occur, and that makes it hard to move heat energy around,” Minnich says.
One way that engineers get around this problem is to operate electronics in extremely cold conditions, because scattering drops off dramatically when the temperature dips below about 50 kelvins, or about –370 degrees Fahrenheit. But the new findings by Minnich’s team showed that while phonon scattering ceases at low temperatures, another mechanism kicks in and severely restricts heat transfer away from a device.
In a study published last fall in the journal Nature Materials, Minnich and his colleagues demonstrated that at around 20 kelvins, or –424 degrees Fahrenheit, the high-energy phonons that are most efficient at transporting heat become deactivated. The remaining low-energy phonons don’t have enough energy to carry away the heat and, as a result, the transistors warm up until the temperature rises enough that the high-energy phonons become activated again.
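That freeze-out of high-energy phonons follows from Bose-Einstein statistics, which give the average number of phonons thermally excited in a vibrational mode. A minimal sketch, assuming a representative optical-phonon energy of 40 meV (the actual mode energies in the devices are not given in the article):

```python
import math

# Mean phonon occupation n = 1 / (exp(E/kT) - 1) for a single mode.
# Shows why energetic phonons effectively vanish at cryogenic
# temperatures. The 40 meV mode energy is a representative
# optical-phonon scale, not a value from Minnich's paper.
K_B = 8.617e-5  # Boltzmann constant, eV/K

def occupation(energy_ev, temp_k):
    """Average number of phonons thermally excited in the mode."""
    return 1.0 / math.expm1(energy_ev / (K_B * temp_k))

for temp in (300, 77, 20):
    print(f"T = {temp:3d} K: n = {occupation(0.040, temp):.2e}")
```

At room temperature the mode is appreciably occupied, but by 20 kelvins its occupation has collapsed by many orders of magnitude, which is the sense in which the heat-carrying phonons are “deactivated.”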
Minnich says the solution may be to design transistors that heat up less at low temperatures, reaching an equilibrium in which scattering doesn’t come back into play either. “The heat turns out to be generated in a very small region under a device. So if you can figure out how to spread out the phonon generation, you could, in principle, decrease the temperature rise that occurs,” Minnich says.
A better general understanding of phonon scattering could also lead to improved thermoelectrics: devices that can convert heat directly to electricity and that could lead to efficient waste-heat recovery, environmentally friendly refrigeration, and more capable rovers and other robots for space exploration.
ARTIFICIAL NOISE Understanding noise—and harnessing its power—is also a goal for computer scientist Erik Winfree, who sees noise as an intrinsic aspect of artificial cells designed at the molecular level. Winfree and his team of international collaborators want to create biochemical circuits that use DNA, proteins, and other biological molecules instead of silicon chips to perform programmed tasks, such as computations or signal processing, that control the activity of the artificial cell.
“I tend to think of cells as really small robots,” Winfree says. “Biology has programmed natural cells, but now engineers are starting to think about how we can program artificial cells. We want to program something that can interact with its chemical environment and carry out the spectrum of tasks that biological things do but according to our instructions.”
Because the components in their circuits are so tiny, the scientists have to contend with sources of noise—such as Brownian motion, which is the random jiggle of atoms and molecules suspended in fluid—that can be ignored in macroscopic systems.
Another source of randomness that comes into play at very small scales is partitioning noise, which results from small differences in how many particles end up in each compartment when a large volume of liquid gets divided, or partitioned, into many tiny compartments.
Winfree ran into an example of this kind of partitioning noise when he and his team recently tested the effect of small sample size on biochemical processes using a fluorescent biochemical oscillator circuit capable of pulsing rhythmically for several hours.
First, they designed the oscillator, a solution composed of small synthetic DNA molecules that are activated by RNA transcripts and enzymes. When the DNA molecules are activated by the other components in the solution, a biological circuit is created. The researchers then “compartmentalized” the oscillator by reducing it from one large system in a test tube to many tiny oscillators isolated within droplets surrounded by oil.
During their experiments, they found that the size of the droplets mattered: large droplets fluoresced mostly in sync with one another, acting in concert much as the larger circuit had in the test tube, while smaller droplets were much less consistent; their pulses quickly moved out of phase with those of the larger droplets.
The scientists think the main reason for this is partitioning noise: some of the droplets had more molecules than others when they were created, and the ratio of the various components also differed among the droplets. The smaller the droplets, the more important these differences became: with fewer molecules, slight differences in the timing of reactions were amplified.
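A minimal simulation of partitioning noise, with droplet counts and copy numbers invented for illustration: a fixed pool of molecules is dealt out at random among droplets, and the relative spread in per-droplet counts grows as the average number of molecules per droplet shrinks.

```python
import random
import statistics

random.seed(42)

def partition_cv(mean_copies, n_droplets=1000):
    """Distribute molecules uniformly at random among droplets and
    return the coefficient of variation (std dev / mean) of the
    resulting per-droplet counts."""
    counts = [0] * n_droplets
    for _ in range(mean_copies * n_droplets):
        counts[random.randrange(n_droplets)] += 1
    return statistics.pstdev(counts) / statistics.mean(counts)

# Big droplets (many molecules each) start out nearly identical;
# small droplets vary a lot, roughly as 1/sqrt(mean copies).
for mean in (1000, 100, 4):
    print(f"{mean:4d} molecules/droplet -> CV = {partition_cv(mean):.3f}")
```

The 1/sqrt(N) scaling is why shrinking the droplets made their starting conditions, and hence their oscillation phases, drift apart so quickly.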
“In one sense, partitioning noise is a nuisance, because it prevents you from knowing how any given droplet is going to behave; it could have different starting conditions compared to its neighbors,” Winfree says.
But Winfree thinks the data gleaned from this natural variability could also be useful for characterizing molecular circuits and measuring their behavior. “You can think of the randomness as a resource,” he says. “For example, if you have 1,000 droplets, and each one has a slightly different initial condition, you can essentially perform 1,000 different experiments in parallel.”
Other scientists are using the molecular circuits and design principles developed in Winfree’s group to create biomolecular systems that might one day operate within cells to diagnose or treat disease. But Winfree says he is interested in more fundamental questions, such as what kinds of computation are possible using chemical and biological systems, what the limits are of such computations, and how noise can either hinder or help molecular circuits function.
“We have a rich theory about deterministic computation like the kind performed by electronic computers,” he says. “But the universe is made out of molecules, and I want to understand what are the computational capabilities of these natural systems.”
NOISY BIOLOGY Bioengineer Michael Elowitz also investigates noise in nature. He and his team focus on gene expression—the cellular process by which DNA instructions get translated into proteins—and on better understanding and even designing genetic circuits composed of interacting genes and proteins; such circuits are already beginning to enable researchers to program new behaviors in living cells.
Back in 2002, his lab was looking specifically at the question of whether gene expression is deterministic—whether a specific gene action always leads to a specific protein outcome—or whether it is driven mostly by random, noisy fluctuations in gene signaling. The ideal way to test for this kind of randomness in genetic circuits would have been to place two identical cells in the same environment and see whether their behavior differed. “But you can’t really perform this experiment,” says Elowitz, “because no two cells are ever identical. Even two sister cells differ from one another and do so in many ways.”
So Elowitz’s team did the next best thing. They took two almost perfectly identical genes that coded for differently colored fluorescent proteins—call them green and red—and inserted them into single cells in a population of the bacterium Escherichia coli. Their hypothesis was that, if gene expression is deterministic, the cells would treat the two genes in the same way and express equal amounts of the two proteins. In that case, all cells would appear yellow, since the equal amounts of red and green protein produced would combine to make yellow. What happened instead was that the cells exhibited a much broader spectrum of colors, ranging from neon green cells to very red cells, with plenty of oranges and yellows in between. This color variation indicated that cells expressed widely varying amounts of the two genes, despite the cells’ similarity.
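The logic of the two-reporter test can be captured in a toy simulation (all rates and noise magnitudes below are invented): each simulated cell’s green and red reporters share a cell-wide scaling factor, but each gene also fluctuates on its own. If expression were deterministic, every cell’s green-to-red ratio would be exactly 1; with intrinsic noise, the ratios spread out, like the spectrum of colors the experiment revealed.

```python
import random

random.seed(7)

def simulate_cell(mean_expression=100.0, intrinsic_sd=30.0):
    """Return (green, red) protein levels for one simulated cell.
    The shared 'extrinsic' factor models cell-to-cell differences;
    the independent gaussian draws model intrinsic expression noise."""
    extrinsic = random.uniform(0.5, 1.5)
    green = max(1.0, random.gauss(mean_expression * extrinsic, intrinsic_sd))
    red = max(1.0, random.gauss(mean_expression * extrinsic, intrinsic_sd))
    return green, red

ratios = [g / r for g, r in (simulate_cell() for _ in range(1000))]
print(f"green/red ratio spans {min(ratios):.2f} to {max(ratios):.2f}")
```

Because the extrinsic factor cancels in the ratio, any spread in green/red across cells is attributable to the independent, intrinsic fluctuations, which is exactly what made the two-color design so informative.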
“Our experiment showed in a very visual way that cells are noisy,” says Elowitz. “It established that noise is real and in many cases is the dominant source of variation in gene expression between cells.”
Subsequent research by Elowitz’s lab has indicated that bacteria have evolved ways to exploit this kind of noise in order to hedge their bets against an uncertain future. For instance, his team showed in another experiment that genetically identical bacteria raised in the same environment nonetheless employ different survival tactics when stressed. Some enter a state in which they are more receptive to infusions of DNA from other bacteria, while others transform into spores—a dormant state that makes the cell extremely resilient and able to survive for hundreds of years. In many ways, this is a clever strategy, Elowitz says.
“The cells don’t know the future, so what they’ve evolved to do is diversify by having different fractions of the population enter into different states,” he says. “The cells use their own internal noise to roll the dice, if you will.”
Elowitz’s research suggests that the decision about which state to enter is governed by random chance at the level of gene expression, depending on what else is happening inside a cell at the time. His team has identified specific kinds of gene circuits in the cell that initiate specific cellular behaviors or programs in response to random fluctuations of protein levels—i.e., noise—within a cell.
Under conditions where the environment itself fluctuates randomly, noise-based strategies can be more advantageous than deterministic strategies. This makes sense, says Elowitz: a population of cells whose members can switch randomly between different states will be better prepared for future changes in the environment than one that just responds to current conditions.
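A toy model of that bet-hedging logic (all growth and survival rates below are invented for illustration): in benign generations, growing cells double while dormant spores merely persist; in occasional harsh generations, growers are nearly wiped out and only spores carry on. A lineage that randomly diverts a fraction of its cells into the spore state can outgrow one that commits everyone to growth.

```python
import random

def long_run_size(spore_fraction, generations=200, p_harsh=0.2, seed=3):
    """Final population size after many generations in a randomly
    fluctuating environment. Seeding per call gives every strategy
    the identical sequence of benign and harsh generations."""
    env = random.Random(seed)
    size = 1.0
    for _ in range(generations):
        growers = size * (1 - spore_fraction)
        spores = size * spore_fraction
        if env.random() < p_harsh:       # harsh: growers mostly die
            size = growers * 0.01 + spores
        else:                            # benign: growers double
            size = growers * 2.0 + spores
    return size

print(f"all growers: {long_run_size(0.0):.2e}")
print(f"10% spores:  {long_run_size(0.1):.2e}")
```

Using the same seed for both strategies ensures they face an identical environmental history, so the comparison isolates the effect of the strategy itself rather than luck of the draw.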
But noise is not just for bacteria. Elowitz’s team has begun taking the techniques they’ve developed for studying noise in bacterial cells and applying them to embryonic stem cells. These undifferentiated cells from the earliest stages of embryonic development can go on to produce a wide range of different cell types, even when grown in identical cell cultures. Elowitz’s group is now studying how noise could enable this type of cell differentiation and development in mammals.
Elowitz’s research strongly suggests that noise—far from being a nuisance—is essential for healthy cell functioning. “Computers function with essentially no noise. But cells are built very differently. They wouldn’t function without noise, or if they did, it wouldn’t be life as we know it,” he says. “I think we’re just beginning to scratch the surface of the many different roles that noise plays in living systems.”
Michael Elowitz is a professor of biology and bioengineering and an investigator with the Howard Hughes Medical Institute (HHMI). His work receives support from HHMI, the Army Research Office, the Human Frontier Science Program, the National Institutes of Health, the Paul G. Allen Family Foundation, and the Gordon and Betty Moore Foundation.
Austin Minnich is an assistant professor of mechanical engineering and applied physics. His research is funded by a Caltech start-up fund and by the National Science Foundation (NSF).
Erik Winfree is a professor of computer science, computation and neural systems, and bioengineering. His work receives support from the NSF and the Gordon and Betty Moore Foundation’s Programmable Molecular Technology Initiative.
Jonas Zmuidzinas is the Merle Kingsley Professor of Physics and chief technologist at NASA’s Jet Propulsion Laboratory. His research is funded by NASA, NSF, JPL, the Gordon and Betty Moore Foundation, and the Keck Institute for Space Studies.