Thank you. Yes, I will try to put short and proper material together for this, but it isn't in my power to simply order it!
In a nutshell, ACES was developed to mix footage from all cameras used during filmmaking without caring which camera produced what. I have followed it since its first announcement and tried to raise awareness here, as the misuse of LUTs and other odd treatment of renderings seemed to be heading in the wrong direction.
The core idea: each camera is profiled so that, say, a GoPro can be put into the mix alongside a RED or an ARRI camera, and from that point on it should make no difference which camera the data came from. Of course, limitations in dynamic range or tonal fidelity can't be overcome this way. But the space in which all of this happens is larger than any current camera can produce, and certainly larger than the human eye can capture. As usual, the pipeline must be bigger than the results, not the other way around. Yet some people even today still work in 8-bit!
For CG, the working space is usually ACEScg, or perhaps ACEScc (for color correction); the profiles assume that the values are scene-linear and that an 18% gray sits at a value of 0.18 (what some call mid-gray or 50%). With that, the transfer functions work as intended.
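To make the scene-linear idea concrete, here is a small sketch of moving a linear sRGB/Rec.709 render into ACEScg primaries. The 3x3 matrix is the commonly published sRGB(linear) to ACEScg transform (Bradford-adapted D65 to D60); the values are approximate, and the function name is just illustrative:

```python
# Sketch: converting one scene-linear sRGB/Rec.709 pixel to ACEScg.
# Matrix values are the commonly published approximations, not authoritative.

SRGB_TO_ACESCG = [
    [0.613097, 0.339523, 0.047379],
    [0.070194, 0.916354, 0.013452],
    [0.020616, 0.109570, 0.869815],
]

def srgb_linear_to_acescg(rgb):
    """Multiply a scene-linear (R, G, B) triple by the conversion matrix."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG)

# Because the matrix rows each sum to ~1, neutral values stay neutral:
# an 18% gray remains approximately (0.18, 0.18, 0.18) in ACEScg.
mid_gray = srgb_linear_to_acescg((0.18, 0.18, 0.18))
print(mid_gray)
```

The point of the gray check is exactly the 0.18 anchor mentioned above: a correct profile chain keeps mid-gray at mid-gray, while only the primaries change.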
Now one can mix and match without effort. Many options for the target device (screen, film print, projector, etc.) are available once everything is done.
After your questions, I searched for what Adobe offers so far, as Adobe should answer this for After Effects and Premiere. Maybe it is more hidden than I thought, but I couldn't find a thing.
Max has produced a lot of DaVinci material about ACES that explains the process, but yes, a short and proper After Effects session should be available.
All the best