In recent months it has become abundantly clear that NASA’s James Webb Space Telescope is doing exactly what it set out to do. Just as its creators had hoped, the multibillion-dollar machine flawlessly “unfolds the universe” by revealing cosmic light we can’t see with our own eyes, and its superb results are captivating even the most unlikely stargazers.
Because of this gilded telescope, Twitter buzzed one day over a dim red dot. For 48 hours, people around the world stared at a galaxy born shortly after the birth of time itself. Thanks to the technological capabilities of the JWST, humanity stood united over stardust.
But here’s the thing.
Even as they share that admiration, scientists at the Massachusetts Institute of Technology warn that we should keep in mind one crucial scientific caveat that comes with a superhero telescope.
If the JWST is like a telescope upgrade from zero to 100, you might wonder, do our scientific models also need a zero-to-100 restart? Can the models and datasets scientists have been using for decades keep up with the instrument’s performance and reveal what it’s trying to tell us?
“The data we will get from JWST will be incredible, but…our knowledge will be limited if our models don’t match them qualitatively,” Clara Sousa-Silva, a quantum astrochemist at the Center for Astrophysics, Harvard & Smithsonian, told CNET.
And according to a new study she co-authored, published Thursday in the journal Nature Astronomy, the answer is yes.
More specifically, the paper points out that some of the light-parsing tools scientists typically use to understand exoplanet atmospheres are not fully equipped to deal with the extraordinary light data coming from the JWST. In the long run, such an obstacle could undermine the most celebrated JWST quest of all: the hunt for extraterrestrial life.
“Currently, the model we’re using to decode spectral information doesn’t match the precision and quality of the data we have from the James Webb Telescope,” Prajwal Niraula, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences and co-author of the study, said in a statement. “We need to up our game.”
Here’s a way to think about the riddle.
Imagine pairing the latest, most powerful Xbox console with the very first iteration of a TV. (Yes, I know how extreme this hypothetical is.) The Xbox would try to hand the TV fantastic, high-definition, colorful and beautiful graphics to show us, but the TV wouldn’t have the capacity to display any of it.
I wouldn’t be surprised if the TV exploded right away. But the point is, you wouldn’t know what the Xbox has to offer unless you got an equally high-definition TV.
Similarly, when studying exoplanets, scientists feed light, or photon, data from space into models that test for “opacity.” Opacity measures how easily photons pass through a material, and it depends on factors such as the light’s wavelength and the material’s temperature and pressure.
Each such interaction leaves a telltale signature of the photon’s properties and therefore, when it comes to exoplanets, of the kind of chemical atmosphere those photons passed through on their way to the light detector. In this way, scientists use light data to work out what makes up an exoplanet’s atmosphere.
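As a rough illustration of the underlying physics (this is not the study’s model, and every number below is invented), the simplest version of the idea is the Beer-Lambert law: the fraction of light that survives a trip through a gas falls off exponentially with the gas’s opacity along the path. A minimal Python sketch:

```python
import numpy as np

# Toy Beer-Lambert transmission: I/I0 = exp(-cross_section * column_density).
# All numbers here are made up for illustration; none come from the study.

wavelengths_um = np.array([1.0, 2.7, 4.3])       # sample wavelengths, in microns
cross_section = np.array([1e-26, 5e-25, 2e-24])  # per-molecule opacity, in cm^2 (hypothetical)
column_density = 1e24                            # molecules per cm^2 along the light path

transmission = np.exp(-cross_section * column_density)
for wl, t in zip(wavelengths_um, transmission):
    print(f"{wl:.1f} um: {t:.3f} of the starlight gets through")
```

The wavelengths where little light survives are the telltale absorption dips; which molecules cause them, and how deep they are, is exactly what an opacity model has to get right.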
In this case, the detector in question rides on the James Webb Space Telescope. But in the team’s new study, after testing the most commonly used opacity model against JWST-quality light data, the researchers found that the model runs into what they call an “accuracy wall.”
The model wasn’t sensitive enough to distinguish, the researchers say, whether a planet has an atmospheric temperature of 300 or 600 kelvins, or whether a particular gas makes up 5% or 25% of its atmosphere. Such a difference isn’t just statistically significant; according to Niraula, it’s also “important for us to constrain planet-forming mechanisms and reliably identify biosignatures.”
That is, evidence of extraterrestrial life.
“We need to work on our interpretation tools,” Sousa-Silva said, “so that we don’t end up seeing something amazing through JWST and not knowing how to interpret it.”
In addition, the team found that the models could obscure their own uncertainty. A few parameter adjustments can easily paper over that doubt and make a result look like a solid fit even when it’s wrong.
“We found that there are enough parameters one can tweak, even with the wrong model, to still get a good fit, meaning you wouldn’t know that your model is wrong, and that what it’s telling you is wrong,” Julien de Wit, an assistant professor in MIT’s EAPS and co-author of the study, said in a statement.
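To see how that degeneracy can arise, here is a hypothetical toy model (mine, not the researchers’ code): if the depth of an absorption line scales as abundance times a temperature-dependent line strength, then a cool, gas-rich atmosphere and a hot, gas-poor one can produce essentially the same signal.

```python
import numpy as np

# Hypothetical toy model, not the study's: line depth scales as
# abundance * exp(-E_lower / T), a crude stand-in for a temperature-dependent
# line strength. E_lower is an invented lower-state energy, in kelvins.
E_lower = 1000.0

def line_depth(abundance, temp_K):
    return abundance * np.exp(-E_lower / temp_K)

cool_rich = line_depth(abundance=0.25, temp_K=300.0)  # 25% gas at 300 K
hot_poor = line_depth(abundance=0.05, temp_K=580.0)   # 5% gas at ~580 K

print(f"cool, gas-rich atmosphere: {cool_rich:.4f}")
print(f"hot, gas-poor atmosphere:  {hot_poor:.4f}")
# Both print about 0.0089: to a coarse model, the two fits look equally good.
```

A sharper opacity model would capture how the line’s shape, not just its depth, changes with temperature, and that extra detail is what breaks the tie between the two scenarios.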
Going forward, the team is pushing for improved opacity models that can keep pace with JWST’s spectacular revelations, specifically by calling for more crossover studies between astronomy and spectroscopy.
“There is so much that could be done if we knew perfectly how light and matter interact,” Niraula said. “We know that well enough around Earth’s conditions, but as we move to different types of atmospheres, things change, and that’s a lot of data, of increasing quality, that we risk misinterpreting.”
De Wit compares the current opacity model to an ancient language-translation tool, the Rosetta Stone, explaining that this Rosetta Stone has worked well enough so far, for example with data from the Hubble Space Telescope.
“But now that we’re taking Webb’s precision to the next level,” the researcher said, “our translation process will prevent us from catching important subtleties, such as those that could determine whether a planet is habitable.”
As Sousa-Silva puts it, “It’s a call to improve our models so we don’t miss the intricacies of the data.”