Like archeologists carefully digging for fossils, scientists with the Planck mission are sifting through cosmic clutter to find the most ancient light in the universe.
The Planck space telescope has created the most precise sky map ever made of the oldest light known, harking back to the dawn of time. This light, called the cosmic microwave background, has traveled 13.8 billion years to reach us. It is so faint that Planck observes every point on the sky an average of 1,000 times to pick up its glow.
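To see why so many repeated passes matter, here is a minimal sketch (not Planck's actual pipeline): averaging N noisy measurements of the same sky point beats the random noise down by roughly a factor of the square root of N. The signal and noise levels below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

true_signal = 1.0      # brightness of one sky point, in made-up units
noise_level = 30.0     # per-observation noise, far larger than the signal
n_obs = 1000           # roughly the average number of Planck passes per point

# Any single observation is swamped by noise...
samples = true_signal + noise_level * rng.normal(size=n_obs)

# ...but averaging n_obs observations suppresses the noise by about 1/sqrt(n_obs).
print(f"one observation:         {samples[0]:+.2f}")
print(f"average of {n_obs}:        {samples.mean():+.2f}")
print(f"expected residual noise:  {noise_level / np.sqrt(n_obs):.2f}")
```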
The task is even more complex than excavating fossils because just about everything in our universe lies between us and the ancient light. Complicating matters further is "noise" from the Planck detectors that must be taken into account.
That's where a supercomputer helps out. Supercomputers are the fastest computers in the world, performing massive amounts of calculations in a short amount of time.
"So far, Planck has made about a trillion observations of a billion points on the sky," said Julian Borrill of the Lawrence Berkeley National Laboratory, Berkeley, Calif. "Understanding this sheer volume of data requires a state-of-the-art supercomputer."
Planck is a European Space Agency mission, with significant contributions from NASA. Under a unique agreement between NASA and the Department of Energy, Planck scientists have been guaranteed access to the supercomputers at the Department of Energy's National Energy Research Scientific Computing Center at the Lawrence Berkeley National Laboratory. The bulk of the computations for this data release were performed on the Cray XE6 system, called Hopper. This computer performs more than a quadrillion calculations per second, placing it among the fastest in the world.
One of the most complex aspects of analyzing the Planck data involves the noise from its detectors. To detect the incredibly faint cosmic microwave background, these detectors are made of extremely sensitive materials. When the detectors pick up light from one part of the sky, they don't reset afterwards to a neutral state; instead, they keep ringing briefly, like a struck bell. This ringing affects observations made at the next part of the sky.
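A rough way to picture this, in a deliberately simplified sketch: treat the ringing as a decaying tail that each sample leaves behind in the data stream. The exponential response below is invented for illustration; Planck's actual detector transfer functions are measured and far more detailed.

```python
import numpy as np

# Toy time-ordered data: the detector sweeps across five sky points in a row,
# only the second of which is bright.
sky = np.array([0.0, 5.0, 0.0, 0.0, 0.0])

# Illustrative "ringing": each sample leaves an exponentially decaying tail.
tail = np.exp(-np.arange(5) / 1.5)
tail /= tail.sum()

observed = np.convolve(sky, tail)[:len(sky)]
print(observed)
# The bright point now bleeds into the samples that follow it; correcting for this
# amounts to deconvolving the data with the known detector response.
```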
This noise must be understood, and corrected for, at each of the billion points observed repeatedly by Planck as it continuously sweeps across the sky. The supercomputer accomplishes this by running simulations of how Planck would observe the entire sky under different conditions, allowing the team to identify and isolate the noise.
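Here is a toy version of that idea, assuming simple uncorrelated noise (Planck's simulations model far more complicated, correlated detector noise): generate many fake noise-only skies under an assumed noise model and use their scatter to estimate how much residual noise remains at each point.

```python
import numpy as np

rng = np.random.default_rng(0)

n_points = 100_000                               # toy sky; Planck's real maps have ~a billion points
hits = rng.integers(500, 1500, size=n_points)    # uneven coverage: number of passes per point
sigma_per_obs = 30.0                             # assumed noise level of a single observation

# Many simulated noise-only "observations" of the whole sky under this noise model.
n_sims = 100
noise_maps = sigma_per_obs * rng.normal(size=(n_sims, n_points)) / np.sqrt(hits)

# The scatter across simulations estimates the residual noise expected at each point,
# which helps separate noise from true sky signal in the real data.
per_point_noise = noise_maps.std(axis=0)
print(per_point_noise[:3], sigma_per_obs / np.sqrt(hits[:3]))
```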
Another challenge is carefully teasing apart the signal of the relic radiation from the material lying in the foreground. It's a big mess, as some astronomers might say, but one that a supercomputer can handle.
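As a highly simplified sketch of the idea (Planck's real component separation uses several much more sophisticated methods and maps at many frequencies): if the spatial pattern of a foreground is known but its strength is not, one can fit that amplitude and subtract the foreground from the observed map.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix = 10_000

cmb = rng.normal(size=n_pix)                     # the relic radiation we actually want
dust_template = np.abs(rng.normal(size=n_pix))   # known pattern of one foreground, unknown strength
observed = cmb + 3.7 * dust_template             # foreground mixed into the observation

# Least-squares fit of the foreground amplitude, then subtract it out.
amp = (dust_template @ observed) / (dust_template @ dust_template)
cleaned = observed - amp * dust_template
print(f"fitted foreground amplitude: {amp:.2f}")  # recovers something close to 3.7
```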
"It's like more than just bugs on a windshield that we want to remove to see the light, but a storm of bugs all around us in every direction," said Charles Lawrence, the U.S. project scientist for the Planck mission. "Without the exemplary interagency cooperation between NASA and the Department of Energy, Planck would not be doing the science it's doing today."
The computations needed for Planck's current data release required more than 10 million processor-hours on the Hopper computer. Fortunately, the Planck analysis codes run on tens of thousands of processors in the supercomputer at once, so this only took a few weeks.
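The back-of-the-envelope arithmetic, assuming an illustrative 30,000 processors running in parallel (the release only says "tens of thousands"):

```python
processor_hours = 10_000_000      # quoted for this data release
processors_used = 30_000          # illustrative figure, not an official number

wall_clock_hours = processor_hours / processors_used
print(f"{wall_clock_hours:.0f} hours, or about {wall_clock_hours / (24 * 7):.1f} weeks")
# -> roughly 330 hours, i.e. on the order of two weeks of wall-clock time
```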
Planck is a European Space Agency mission, with significant participation from NASA. NASA's Planck Project Office is based at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, Calif. JPL, a division of the California Institute of Technology, contributed mission-enabling technology for both of Planck's science instruments. European, Canadian and U.S. Planck scientists work together to analyze the Planck data.