Using Machine Learning to Make A Really Big Detailed Simulation

Title: A Gigaparsec-Scale Hydrodynamic Volume Reconstructed with Deep Learning
Authors: Cooper Jacobus, Solene Chabanier, Peter Harrington, JD Emberson, Zarija Lukic, and Salman Habib
First Author's Institution: University of California, Berkeley
Status: preprint on arXiv

Simulations have become one of the quintessential tools in modern astrophysics. Whether you are trying to understand a star, an active galactic nucleus (AGN) disk, a galaxy, or the universe itself, there is probably a simulation for that. One of the most popular kinds of simulations in modern astrophysics is a hydrodynamic simulation of large-scale structure (e.g., Illustris, EAGLE, Astrid), which is designed to mimic a chunk of the universe containing millions of galaxies. These simulations start with an N-body simulation in which billions of particles representing dark matter evolve under the attractive influence of gravity. In addition to dark matter, baryonic matter (everyday matter like protons, neutrons, and electrons) is simulated as a kind of "fluid" riding on the sea of dark matter, hence the "hydrodynamic" part.

Every simulation of this type is a balancing act between three competing choices: the total size of the simulation, the computing power needed to run it, and the resolution, which, in simple terms, is a combination of the smallest distance and time scales at which the data in the simulation are reliable or meaningful. When designing a simulation, changing any one of these three quantities forces a trade-off in the other two. For example, if you want a larger simulation volume, you must increase the computing power or lower your resolution. What decisions you make when designing a simulation comes down to the resources available and...
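To get a feel for why this trade-off is so punishing, here is a back-of-the-envelope cost model for a grid-based hydrodynamic volume. All numbers (box size, cell size, number of stored fields) are illustrative assumptions, not values from the paper; the point is only the cubic scaling of cost with resolved volume.

```python
# Rough cost model for a uniform-grid hydro simulation.
# Assumed, illustrative parameters -- not from the paper.

def grid_cells(box_size_mpc, cell_size_mpc):
    """Total number of cells: cells per side, cubed for the 3D volume."""
    per_side = box_size_mpc / cell_size_mpc
    return per_side ** 3

def memory_tb(n_cells, fields=5, bytes_per_value=4):
    """Memory footprint assuming a few single-precision hydro
    fields (density, temperature, velocity...) stored per cell."""
    return n_cells * fields * bytes_per_value / 1e12

# Baseline: a 500 Mpc box resolved with 0.05 Mpc (50 kpc) cells.
base = grid_cells(500, 0.05)    # about 10^12 cells
big = grid_cells(1000, 0.05)    # double the box at fixed resolution

print(memory_tb(base))          # roughly 20 TB just to hold the fields
print(big / base)               # ~8x: doubling the box octuples the cost
```

Doubling the box side at fixed cell size multiplies the cell count (and hence memory and, roughly, compute) by eight; halving the cell size at fixed box size does the same. That cubic scaling is exactly why you cannot have a gigaparsec volume, fine resolution, and a modest compute budget all at once.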