The process by which stars form has been well established for decades. Then, in the 1990s, astronomers looked out at the universe and saw evidence everywhere telling them that they weren’t getting the whole picture. Stars were beginning to form, and then simply vanishing. How?
The Formation of a Star
Making a star requires only two things: enough matter and enough time. The matter doesn’t have to arrive in large chunks. The first stars formed from diffuse clouds of hydrogen and helium gas, perhaps with a few heavier elements formed by chance. The process hasn’t changed much since. Hydrogen gas drifts in giant space clouds. Over time it cools, and as it cools, gravity pulls it together and it condenses. That compression heats the gas up once more, especially toward the center. At a certain point, the heat and pressure cause the hydrogen atoms at the center of the cloud to fuse together, and the ball of matter ignites as a star.
Scientists knew how stars formed, and they knew where stars formed. Matter in the universe condenses unevenly, so dense gas clouds are separated by huge empty spaces. Inside these gas clouds, the gas also clumps unevenly, creating many stars in each cloud; that is how galaxies are built. And since galaxies already populated with hundreds of millions of stars still have gas clouds left over, all astronomers have to do to find newly forming stars is point their telescopes at the thick, dense centers of existing galaxies.
In the 1990s, astronomers turned their telescopes on the galaxies expecting to get a nice look at star formation, but it wasn’t there. The ingredients for stars were present. Astronomers could see gas, and they knew it was swirling around in sufficient quantities. They could even check its temperature: oxygen and iron, present only in trace amounts in these gas clouds but still observable, give off x-rays that reveal how hot the gas is. Astronomers could watch the cooling gas clouds that were on their way to making new stars...
...until they simply winked out of existence. They looked out and saw that in these regions gas condensed and cooled, and condensed and cooled, and then stopped. Oh, there was a little bit of still-cooling gas left over, but the majority of it not only stopped cooling but seemed to have simply disappeared.
This disappearing act wasn’t happening in real time before scientists’ eyes. The cooling process, astronomers estimated, takes about 10 million years. The disappearance showed up in x-ray images taken by telescopes, and by piecing together a lot of data, astronomers built a picture of what was happening, and it made no sense. They were watching stars form, just the way they knew stars always formed, until the proto-stars disappeared.
What Could Make a Star Disappear?
It was only when astronomers took a look at the Perseus cluster of galaxies that they finally understood what was happening. The centers of galaxies aren’t just home to stars and the dense gas clouds that make them; most galaxies harbor a giant, but hidden, black hole at their center. Black holes are the big scary beasts of the universe, and they’re scary because they don’t just pull matter in; the matter swirling toward them can also be blasted back out.
A two-year study of the Perseus cluster revealed the effects of a phenomenon that had until then been invisible to astronomers. Every few million years, the black hole at the center of the cluster was blowing “bubbles” of hot particles. These hot emissions of particles moved through the dense clouds of gas, heating them up and then dispersing them. The emissions weren’t continuous, but they didn’t have to be. The gas would condense and cool, condense and cool, and then get blown away by the black hole.
So now we know that making a star requires three things. It requires matter. It requires time. And it requires a secluded spot out of the reach of mysterious forces that obliterate a forming star the way a gust of wind obliterates a dandelion puff.
[Source: Gravity’s Engines, by Caleb Scharf]