I don't actually know for certain; I've only learned it from using phones from all the major companies over the last 8 years, watching hundreds (thousands?) of reviews and comparisons, and reading about why the original Pixel and Pixel 2 were so far ahead and what the iPhone copied with the XS series to catch up.
Basically, you can expect all Samsung phones to always be this way, because there's never been any indication that they intend to compete with Google the way Apple did.
Just Google "rolling buffer Pixel" or "Pixel zsl" or "Pixel zero shutter lag" to see what's up.
Basically, as soon as you open the camera app, it begins shooting a sort of video in the background made of special types of frames. Then when you press the shutter, it stops capturing frames and combines the buffer into a picture—thus, it is almost impossible to get blurry subjects unless there's a very high amount of motion. It combines that technique with several other algorithms, including HDR+ (which combines frames from before AND after you press the shutter).
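To make the idea concrete, here's a toy sketch of a zero-shutter-lag ring buffer in Python. This is NOT Google's actual implementation (names, buffer size, and the "merge" step are all my own stand-ins); the real pipeline does sophisticated tile-based alignment and merging, but the buffering logic is the same shape:

```python
from collections import deque

BUFFER_SIZE = 9  # hypothetical: keep roughly the last 9 frames

class ZslCamera:
    def __init__(self, size=BUFFER_SIZE):
        # deque with maxlen automatically evicts the oldest frame
        self.ring = deque(maxlen=size)

    def on_frame(self, frame):
        """Called continuously by the sensor while the viewfinder is live."""
        self.ring.append(frame)

    def on_shutter(self):
        """Freeze the buffer and merge frames captured BEFORE the press."""
        frames = list(self.ring)
        # Stand-in for the real align-and-merge step: average the
        # frames pixel-by-pixel to reduce noise (real HDR+ aligns
        # tiles first, so motion doesn't smear).
        return [sum(px) / len(frames) for px in zip(*frames)]

# Toy usage with tiny 1-D "frames" of pixel values:
cam = ZslCamera(size=3)
for frame in ([10, 20], [12, 22], [14, 24], [16, 26]):
    cam.on_frame(frame)       # oldest frame [10, 20] gets evicted
print(cam.on_shutter())       # merges the last 3 buffered frames
```

The key point the sketch shows: the moment you press the shutter, the frames you want are already sitting in the buffer, so the "photo" can be assembled from the past instead of waiting for a fresh exposure.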
The iPhone does something similar. There are probably papers on it, but you can watch their keynote for the XS series to get a layman's overview.
The point is that Samsung adamantly refuses to get with the program. Google started doing this in 2016, Apple copied them by 2018, and there's no sign that Samsung intends to follow.
In general, the overall field is called "computational photography", which Google basically started with HDR+ back in the Nexus 5 days in 2013; the rolling buffer and ZSL are a newer subset of that technology.
Interestingly, iPhones have had a shocking ability to get photos of children and pets without blur since at least the iPhone 4s (my first iPhone). So they've always been extremely good at that. It's just the special computational HDR stuff that they started copying from Google in 2018.