Published online by Cambridge University Press: 20 January 2023
The load imbalance and communication overhead of parallel computing are crucial bottlenecks for galaxy simulations. A successful way to improve the scalability of astronomical simulations is the Hamiltonian splitting method, which requires identifying the regions that must be integrated with timesteps smaller than the global timestep used for the entire galaxy. In galaxy simulations, the regions inside supernova (SN) shells require the smallest timesteps. We developed a deep learning model to forecast the region affected by the expansion of an SN shell during one global step, and we identified the particles requiring small timesteps within that region using image processing. Compared to the analytic approach based on the Sedov-Taylor solution, our method identifies target particles with a higher identification rate (88% to 98% on average) and a lower "non-target"-to-"target" fraction (reduced from 6.4 to 5.5 on average). Our method, combining Hamiltonian splitting and deep learning, will improve the performance of extremely high-resolution galaxy simulations.
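For context on the analytic baseline mentioned above: the Sedov-Taylor solution gives the blast-wave shell radius as R(t) = ξ (E t² / ρ)^(1/5), where E is the explosion energy, ρ the ambient density, and ξ ≈ 1.15 for an adiabatic index γ = 5/3. The sketch below (function names, parameter values, and the particle-selection step are illustrative assumptions, not the paper's actual code) shows how such a solution could be used to flag particles inside a predicted SN shell:

```python
import numpy as np

def sedov_taylor_radius(E, rho, t, xi=1.15):
    """Sedov-Taylor blast-wave radius R(t) = xi * (E * t**2 / rho)**(1/5).

    E in erg, rho in g/cm^3, t in s; returns the shell radius in cm.
    xi ~ 1.15 is the dimensionless constant for gamma = 5/3.
    """
    return xi * (E * t**2 / rho) ** 0.2

def particles_in_shell(positions, center, E, rho, t):
    """Boolean mask of particles lying inside the predicted SN shell.

    positions: (N, 3) array of particle coordinates in cm.
    center: (3,) array giving the SN position in cm.
    """
    r = np.linalg.norm(positions - center, axis=1)
    return r <= sedov_taylor_radius(E, rho, t)

# Example: E = 1e51 erg, rho = 1e-24 g/cm^3 (~1 H atom per cm^3),
# t = 1e4 yr ~ 3.15e11 s gives a shell radius of order 10 pc.
positions = np.array([[0.0, 0.0, 0.0], [1e21, 0.0, 0.0]])
center = np.zeros(3)
mask = particles_in_shell(positions, center, 1e51, 1e-24, 3.15e11)
```

Particles flagged by the mask would then be assigned the small local timesteps, while the rest of the galaxy advances with the global timestep.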