A galaxy aflame!

JWST image of NGC 5134 is fire. Also, using real AI to find galaxies.

The Trifid Nebula looks like a red flower with dark lines converging on its center, surrounded by pale blue gas and countless stars.

The Trifid Nebula and environs. Credit: RubinObs/NOIRLab/SLAC/NSF/DOE/AURA

April 13, 2026 Issue #1022

Should we mine asteroids?

I did a podcast interview about that plus space rocks in general

A cartoon of an astronaut in a spacesuit about to hit a tiny asteroid with a pickaxe.

Science Stuff! Credit: ScienceStuff

Wanna hear me talk about asteroids? Jorge Cham (creator of PhD Comics!) interviewed me for his Science Stuff podcast, where we talk space rocks: where they come from, how we can find them, how we can keep them from hitting us, and whether we should mine them for valuable minerals, including water ice.

Galaxy of flame

NGC 5134 seen by JWST looks like it’s on fire, but it’s really just smoke

Sometimes, the coolness of an astronomical image is in how it’s presented.

Sometimes also it’s the hotness. Like this image of NGC 5134, a spiral galaxy about 65 million light-years away in the constellation Virgo (not far from the bright star Spica on the sky, actually):

If you guessed this was an image from JWST give yourself an infrared star [and here’s a huge 4,200 x 4,200 pixel version of it]. It is, and it’s a combination of observations taken with both its NIRCam and MIRI instruments. The Near-Infrared Camera shows mostly stars, displayed as blue, teal, and green. It’s sensitive to redder stars (red in visible light, that is, like red giants and supergiants), and you can see them blurred together into a gentle glow from the center out to the edges.

The Mid-Infrared Instrument sees longer wavelengths, and in particular the 7.7-micron filter image (displayed here as orange) selects for dust grains in the galaxy, specifically PAHs, or polycyclic aromatic hydrocarbons. These are molecules made of multiple fused rings of carbon atoms, chemically very similar to soot.

I love the poetry here: the spiral arms are displayed in a way that makes them look like flames, but we’re seeing the smoke! 

PAHs are created in massive stars when they explode as supernovae, and even before that when they’re red supergiants. For example, remember when Betelgeuse got really dim in 2019-20? It expelled a huge cloud of dust (a generic term that includes PAHs) that made the star appear fainter, since that dust is opaque to visible light.

But warm PAHs emit light at long infrared wavelengths, which is why this image shows them so well. Massive stars don’t live long and stay near their stellar nurseries where they were born. Those are in the spiral arms, so the dust they blow into the galaxy is in the arms as well, and this image traces that structure well. Note the brighter sections at the ends of the galaxy along the long axis; those are gigantic complexes of nebulae making stars. That’s a bit clearer in images taken in visible light, like this one by the Carnegie Observatory (the image is copyrighted so I can’t display it here, but click through to see it; it’s rotated about 90° counterclockwise from the JWST image).

Observations like this help astronomers track where stars are born, where they die, how much dust they produce, and how that affects the galactic environment. And, as always, they’re also devastatingly beautiful.

It’s funny to me; the universe doesn’t have to be beautiful, and yet it is. I think this may be coincidence; we happened to evolve an aesthetic that appreciates graceful curves, flow, and color, and those are attributes common in galaxies. But whether this is true or not, art is science, and vice-versa. I’ve been saying that for years.

Neural net finds weird galaxies in Hubble images

This is real artificial intelligence, kinda

Hubble Space Telescope has taken hundreds of thousands of images of the sky, many of which are “deep”, meaning long exposures that can see faint objects. The vast majority of the galaxies in those images have never been examined before! That’s a big opportunity for astronomers, but how to capitalize on it?

A team of scientists decided to give this a try. They developed a neural net called AnomalyMatch to dig into the data, looking at almost 100 million (!!) cutouts of galaxies — literally, small sub-images a few dozen pixels on a side featuring a galaxy in each — to see if they could find odd-looking galaxies. These are usually the result of galactic collisions, or gravitational lensing distorting their shape. But there could be other reasons, too, so examining them on a large scale can be pretty instructive.
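The cutout idea itself is simple: slice a small, fixed-size box of pixels around each cataloged galaxy position. Here’s a minimal toy sketch of that step; the image is a nested list standing in for real pixel data, and names like `make_cutout` and the coordinates are illustrative, not from the actual paper.

```python
# Toy sketch of extracting fixed-size cutouts from a larger image.
# Real pipelines work on FITS pixel arrays; a nested list stands in here.

def make_cutout(image, x, y, size):
    """Return a size x size sub-image centered on (x, y)."""
    half = size // 2
    return [row[x - half : x + half] for row in image[y - half : y + half]]

# A fake 100 x 100 "image" whose pixel value encodes its own coordinates,
# so we can check which region the cutout grabbed
image = [[(col, row) for col in range(100)] for row in range(100)]

cutout = make_cutout(image, x=40, y=60, size=32)
print(len(cutout), len(cutout[0]))  # 32 32
print(cutout[0][0])                 # (24, 44): top-left pixel of the cutout
```

A real search just repeats this for every entry in the source catalog, which is how you end up with ~100 million postage stamps to feed a classifier.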

Neural nets are computer programs, loosely patterned on networks of brain cells, designed to tackle a specific problem where there is a large sample of data to examine. They can be trained, meaning they are fed examples of objects — in this case, weirdly shaped galaxies — then let loose on the dataset to find more. Neural nets aren’t exactly artificial intelligence, even though in a very narrow sense they can learn, but they fit the bill way better than the LLM grift going on right now.
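The train-then-scan workflow can be boiled down to a toy example. AnomalyMatch itself is a semi-supervised deep network, so this is emphatically not their method; it’s just a distance-based stand-in, with made-up feature vectors, to show the shape of the idea: learn what “normal” looks like from examples, then flag anything in the dataset that scores too far from it.

```python
# Toy stand-in for "train on labeled examples, then scan the dataset."
# All numbers and names here are illustrative, not from the paper.

def mean_vector(examples):
    """Average the training examples feature-by-feature."""
    n = len(examples)
    return [sum(e[i] for e in examples) / n for i in range(len(examples[0]))]

def score(vec, center):
    """Squared distance from the 'ordinary galaxy' template."""
    return sum((v - c) ** 2 for v, c in zip(vec, center))

# "Training": feature vectors for ordinary-looking galaxies
normal = [[1.0, 0.9, 1.1], [0.9, 1.0, 1.0], [1.1, 1.1, 0.9]]
center = mean_vector(normal)

# "Dataset": two ordinary galaxies and one oddball
dataset = {
    "gal_a": [1.0, 1.0, 1.0],
    "gal_b": [0.9, 1.1, 1.0],
    "gal_c": [3.0, 0.1, 2.5],  # the weird one
}

threshold = 1.0
anomalies = [name for name, vec in dataset.items()
             if score(vec, center) > threshold]
print(anomalies)  # ['gal_c']
```

A real neural net replaces the hand-rolled distance score with learned features, but the payoff is the same: the machine ranks 100 million cutouts so humans only have to eyeball the strangest few thousand.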

Anyway, looking through the images the net found about 1,300 objects of interest [link to journal paper]. Over 600 were galaxy mergers, 140 were gravitational lenses, 35 were “jellyfish” galaxies (galaxies moving rapidly through a galaxy cluster and having their internal gas stripped away, leaving long tendrils behind them), and two were edge-on protoplanetary disks (disks of material around young stars that planets form from — in fact, the net was first trained to look for disks, but the team expanded the list as time went on).

Six galaxies from the search, all showing odd shapes like loops, tendrils, and distortions.

Six galaxies from the search, including gravitational lenses and collisions. More info can be found by clicking the image. Credit: NASA, ESA, David O'Ryan (ESA), Pablo Gómez (ESA), Mahdi Zamani (ESA/Hubble)

1,300 out of 100 million is a small fraction, but imagine trying to do this yourself by eye! I imagine if they relax the parameters a bit they’d find many more, too. This was a first attempt, and shows that this sort of work is possible and helpful.

I played with neural nets a bit when I worked on Hubble — I was doing similar work looking for faint red dwarfs in the data — but found it a bit too far out of my wheelhouse to use well. Also, those were the early days of that sort of thing, and a lot of progress has been made since then. So I’m glad to see this going on! With Rubin and Roman coming online soon, we’re going to have vast amounts of data to sort through, so neural nets will become important, if not critical, tools for searching.

Et alia

You can email me at [email protected] (though replies can take a while), and all my social media outlets are gathered together at about.me. Also, if you don’t already, please subscribe to this newsletter! And feel free to tell a friend or nine, too. Thanks!
