Authors: Cole, J. W., et al.
First Author’s Institution: Department of Physics and Astronomy, Texas A&M University, College Station, TX, USA
Status: ePrint [open access]
In the local universe, galaxies tend to form stars at a slow and steady rate, but in the early universe, new observations suggest this is not the case. The James Webb Space Telescope has allowed astronomers to peer further back into the universe’s history, and the galaxies that they’ve found seem to defy expectations. For example, astronomers keep observing more bright galaxies than initially expected. One explanation for the observations made by JWST is that galaxies in the early universe form their stars in bursts, interspersed with periods of less intense star formation.
The average rate at which galaxies form stars varies over cosmic time: from the beginning of the universe, it rises for about 3.3 billion years, peaks at around a redshift of 2 (equivalent to about 10.4 billion years ago), and then begins to decline (see Figure 1). While an individual galaxy might deviate from this general trend, it's a good description of how an entire population of galaxies behaves over time. Importantly, this trend is slightly different depending on a galaxy's mass: more massive galaxies tend to form a larger fraction of their stars earlier and to reach their peak star formation rate earlier in the universe's history.
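(If you want to check that redshift-to-time conversion yourself, here's a minimal Python sketch using astropy's built-in Planck 2018 cosmology; the exact numbers shift slightly depending on which cosmological parameters you adopt, and this is not necessarily the cosmology used in the paper.)

```python
# Minimal sketch (not from the paper): convert the redshift of the peak of cosmic
# star formation, z ~ 2, into a lookback time and a cosmic age.
from astropy.cosmology import Planck18

z_peak = 2.0  # approximate redshift of the peak of cosmic star formation

lookback = Planck18.lookback_time(z_peak)  # time between z = 2 and today
age_at_z = Planck18.age(z_peak)            # age of the universe at z = 2

print(f"Lookback time to z = {z_peak}: {lookback:.1f}")        # ~10.5 Gyr
print(f"Age of the universe at z = {z_peak}: {age_at_z:.1f}")  # ~3.3 Gyr
```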
The correlation between mass and star formation rate gives rise to a relationship known as the star-forming main sequence, or SFMS. The SFMS is an observed correlation between a galaxy's stellar mass and its star formation rate (or SFR) – higher-mass galaxies generally form stars at a higher rate. Since the cosmic star formation rate changes with time, so does the SFMS: for example, the SFMS of galaxies close to the peak of cosmic star formation is shifted towards higher star formation rates than the SFMS of local galaxies.
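In equation form, the SFMS is usually written as a straight line in log-log space, with a slope, a normalisation, and some intrinsic scatter around the line. (The pivot at a stellar mass of 10^10 solar masses below is just a common convention, not necessarily the one today's authors adopt.)

```latex
\log_{10}\!\left(\frac{\mathrm{SFR}}{M_\odot\,\mathrm{yr}^{-1}}\right)
  = \alpha \left[\log_{10}\!\left(\frac{M_\star}{M_\odot}\right) - 10\right] + \beta
```

Here α is the slope, β the normalisation, and the intrinsic scatter σ (in dex) measures how far individual galaxies stray from the line – exactly the three quantities the authors measure later in the post.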
To make things more interesting, astronomers can use different features in a galaxy's spectrum to estimate the star formation rate averaged over different amounts of time. For instance, you can use the luminosity of the H-α emission line to estimate the star formation rate averaged over the last 10 million years. This is because the H-α emission line is produced in ionised gas surrounding the most massive stars (O types), which only live for a few million years. By contrast, the ultraviolet (UV) luminosity of a galaxy is more sensitive to the emission from lower-mass, longer-lived stars, giving astronomers a way to estimate the star formation rate over the past 100 million years.
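As a concrete (and deliberately simplified) illustration, the classic Kennicutt (1998) calibrations turn these two luminosities into star formation rates with a single multiplication. These particular coefficients assume a Salpeter stellar initial mass function, and the conversions the authors actually use may differ.

```python
# Sketch of luminosity-to-SFR conversions using the classic Kennicutt (1998)
# calibrations (Salpeter IMF). Illustrative only; not necessarily the paper's method.

def sfr_from_halpha(L_halpha_erg_s):
    """H-alpha traces ionising O stars, so this SFR averages over ~10 Myr."""
    return 7.9e-42 * L_halpha_erg_s          # solar masses per year

def sfr_from_uv(L_nu_uv_erg_s_hz):
    """UV traces longer-lived stars, so this SFR averages over ~100 Myr."""
    return 1.4e-28 * L_nu_uv_erg_s_hz        # solar masses per year

# Example: a galaxy with L(H-alpha) = 1e42 erg/s forms roughly 8 solar masses of stars per year.
print(sfr_from_halpha(1e42))   # ~7.9
print(sfr_from_uv(7e27))       # ~1.0
```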
The authors of today’s paper use these two measures of star formation rate to investigate the star-forming histories of over 1,800 high-redshift galaxies. They find that the SFMS has a lot more scatter if you use a short-timescale, H-α based SFR as opposed to a longer-timescale, UV-based SFR, suggesting that the star formation rate is more variable on short timescales and supporting the idea of bursty star formation.
The galaxies used in the analysis were observed as part of the Cosmic Evolution Early Release Science (CEERS) survey using the Near Infrared Camera (NIRCam) on JWST. All of the galaxies are in a very well-studied region of the sky, known as the Extended Groth Strip, so the authors were able to combine the JWST data with pre-existing Hubble data.
To determine how the SFMS evolves with time, the authors divided the sample into five redshift bins. For each bin and each star formation rate indicator, they estimated the slope, normalisation (y-intercept), and scatter of the SFMS. Across all redshifts, the authors found that the shorter-timescale, H-α-based SFR produced an SFMS with more scatter (see Figure 2) and a lower normalisation than an SFMS based on the longer-timescale, UV-based SFR. The larger scatter shows that the SFR varies more on short timescales than on long ones, reflecting a bursty star formation history. The lower normalisation shows that the short-timescale SFR is, on average, lower than the long-timescale SFR. So the authors conclude that while star formation does happen in bursts, it's more accurate to describe the star formation history as interruptions to normal star formation (lulls, or naps, lasting between 100 and 250 million years) rather than short periods of very intense star formation.
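Here's a rough sketch of what that measurement looks like in practice. Everything in it (the mock galaxy sample, the redshift bin edges, and the simple least-squares fit) is made up for illustration; it is not the authors' actual data, bins, or fitting code.

```python
# Sketch of measuring the slope, normalisation, and scatter of an SFMS in redshift
# bins, for two SFR indicators. Mock data and bin edges are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 1800
z = rng.uniform(4.0, 6.5, n)                    # mock redshifts
log_mstar = rng.uniform(8.0, 10.5, n)           # mock log10 stellar masses
# Mock SFRs: the H-alpha-based values are given extra scatter to mimic burstiness.
log_sfr_uv = 0.8 * (log_mstar - 10) + 0.5 + rng.normal(0, 0.3, n)
log_sfr_ha = 0.8 * (log_mstar - 10) + 0.3 + rng.normal(0, 0.5, n)

def fit_sfms(x, y):
    """Fit log SFR = slope * log M* + intercept; return slope, intercept, scatter (dex)."""
    slope, intercept = np.polyfit(x, y, deg=1)
    scatter = np.std(y - (slope * x + intercept))
    return slope, intercept, scatter

z_edges = [4.0, 4.5, 5.0, 5.5, 6.0, 6.5]        # five illustrative redshift bins
for z_lo, z_hi in zip(z_edges[:-1], z_edges[1:]):
    sel = (z >= z_lo) & (z < z_hi)
    for label, log_sfr in (("H-alpha", log_sfr_ha), ("UV", log_sfr_uv)):
        s, b, sig = fit_sfms(log_mstar[sel], log_sfr[sel])
        print(f"z={z_lo}-{z_hi} {label}: slope={s:.2f}, norm={b:.2f}, scatter={sig:.2f} dex")
```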
Today’s authors also found that the scatter of the shorter-timescale SFMS increased with redshift, suggesting that galaxies in the early universe had burstier star formation histories. However, the normalisation of the SFMS did not change significantly, suggesting that the intensity of the bursts was not significantly higher in the early universe.
When the authors divided the sample into higher-mass and lower-mass galaxies, they found some interesting behaviour. On average, lower-mass galaxies had more similar long- and short-timescale SFRs, while higher-mass galaxies showed a more marked difference between the two, with the short-timescale SFR falling below the long-timescale one. You can see this behaviour in Figure 3, which plots the ratio between the short- and long-timescale SFRs on the y-axis and shows a negative correlation between that ratio and stellar mass. This shows that lower-mass galaxies experience longer bursts of star formation than higher-mass galaxies: the authors estimate that star-forming bursts last around 60 million years in high-mass galaxies and about 110 million years in low-mass galaxies.
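The quantity on the y-axis of Figure 3 is just the logarithmic ratio of the two SFR estimates. A tiny sketch, with made-up numbers standing in for real measurements, shows how a ratio that declines with mass would look:

```python
# Sketch of the burstiness diagnostic in Figure 3: the ratio of the short-timescale
# (H-alpha) to the long-timescale (UV) SFR as a function of stellar mass.
# The values below are placeholders; only the quantity being computed is from the paper.
import numpy as np

log_mstar = np.array([8.2, 8.9, 9.4, 9.9, 10.4])   # mock log10 stellar masses
sfr_ha = np.array([1.1, 1.8, 2.9, 4.0, 6.0])       # mock short-timescale SFRs (Msun/yr)
sfr_uv = np.array([1.0, 1.9, 3.3, 5.2, 9.0])       # mock long-timescale SFRs (Msun/yr)

log_ratio = np.log10(sfr_ha / sfr_uv)              # y-axis of Figure 3
slope = np.polyfit(log_mstar, log_ratio, deg=1)[0]
print(log_ratio.round(2), f"trend with mass: {slope:+.2f} dex per dex")  # negative slope
```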
The results of today’s paper give us an exciting insight into the early stages of galaxy evolution and will help us to model these newly observed baby galaxies. There are still lots of open questions regarding why star formation histories seem to be so different in the early universe, so stay tuned as astronomers learn more about galaxy evolution!
Astrobite edited by Will Golay
Featured image credit: NASA/STScI/GLASS-JWST program: R. Naidu, G. Brammer, T. Treu.