NASA revealed plans to launch the Nancy Grace Roman Space Telescope in September 2026, eight months earlier than expected. Over its operational life, the telescope is expected to deliver roughly 20,000 terabytes of data to astronomers.
This will complement the 57 gigabytes of remarkable imagery produced daily by the James Webb Space Telescope, which launched in 2021, and the upcoming survey by the Vera C. Rubin Observatory in Chile, which is expected to collect 20 terabytes of data each night.
By comparison, the Hubble Space Telescope produces only 1-2 gigabytes of data daily. To keep pace with this growing flood of data, astronomers are increasingly turning to GPUs for analysis.
Brant Robertson, an astrophysicist at UC Santa Cruz, has been at the forefront of this shift, applying GPUs to space research for more than 15 years. He has collaborated with Nvidia on advanced simulations and is now building tools to manage the influx of data from the latest observatories.
“There’s been this evolution [from] looking at a few objects, to doing CPU-based analyses on large scales of the dataset, to then doing GPU-accelerated versions of those same analyses,” Robertson shared with TechCrunch.
Robertson, along with Ryan Hausen, created a deep learning model named Morpheus to analyze large datasets and identify galaxies. Their AI analysis of Webb data uncovered an unexpectedly large number of disc galaxies, challenging existing theories of galaxy formation.
Morpheus is now evolving to keep up with the new instruments: its architecture is shifting from convolutional neural networks to transformers, allowing it to analyze larger regions of the sky more quickly.
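To give a flavor of why that architectural shift matters, here is a minimal, hypothetical sketch (not the Morpheus codebase, and ignoring learned weights entirely): a vision-transformer-style model splits an image into patches and lets every patch attend to every other patch in a single self-attention step, whereas a convolutional network must stack many layers of small filters before distant pixels can influence one another.

```python
# Hypothetical illustration of ViT-style patch attention; function names
# and sizes are invented for this sketch, not taken from Morpheus.
import numpy as np

def image_to_patches(img, patch=4):
    """Split an HxW image into flattened, non-overlapping patches."""
    h, w = img.shape
    patches = [
        img[r:r + patch, c:c + patch].ravel()
        for r in range(0, h, patch)
        for c in range(0, w, patch)
    ]
    return np.stack(patches)  # shape: (num_patches, patch * patch)

def self_attention(x):
    """Single-head self-attention with no learned projections:
    every patch is mixed with every other patch in one step,
    giving a global receptive field immediately."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)                 # pairwise patch similarity
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # softmax over patches
    return weights @ x                            # context-mixed features

img = np.random.default_rng(0).random((16, 16))   # toy 16x16 "sky cutout"
tokens = image_to_patches(img)                    # 16 patches of 16 pixels
out = self_attention(tokens)
print(tokens.shape, out.shape)                    # (16, 16) (16, 16)
```

The practical upshot is throughput: one attention pass over patch tokens covers spatial context that a CNN would need a deep stack of convolutions to reach, which is one reason transformer backbones scale well to wide-field survey images.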
Robertson is also developing generative AI models, trained on space telescope data, to sharpen ground-based observations blurred by Earth's atmosphere. Since launching an 8-meter mirror like Rubin's into orbit remains impractical, improving Rubin's observations in software is the workable alternative.
However, Robertson feels the strain of growing global GPU demand. Although he built a GPU cluster at UC Santa Cruz with NSF support, the hardware is aging just as more researchers compete for computational resources. The Trump administration's proposal to cut NSF's budget by 50% adds to the pressure.
“People want to do these AI, ML analyses, and GPUs are really the way to do that,” Robertson stated. “You have to be entrepreneurial…especially when you’re working kind of at the edge of where the technology is. Universities are very risk averse because they just have constrained resources, so you have to go out and show them that, ‘look, this is where we’re going as a field.’”
