You would only ever use 24Hz for two things:
1. to match the frame rate of a 24fps video
2. to reduce the bandwidth used so that you can use higher resolutions and color depth / chroma, etc.
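To put rough numbers on the bandwidth point, here's a back-of-the-envelope sketch of uncompressed video data rates (this ignores HDMI blanking intervals and TMDS encoding overhead, so the real link requirement is somewhat higher; the ~14.4 Gbps HDMI 2.0 payload figure is the usual ballpark, not an exact spec quote):

```python
# Rough uncompressed video data rate in Gbps.
# Ignores blanking intervals and link encoding overhead,
# so actual HDMI bandwidth requirements are higher.
def data_rate_gbps(width, height, hz, bit_depth, chroma):
    # Samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bit_depth * samples / 1e9

# 4K60, 10-bit, 4:4:4 -- right around HDMI 2.0's payload limit
print(round(data_rate_gbps(3840, 2160, 60, 10, "4:4:4"), 2))  # 14.93
# Same format at 24 Hz needs well under half the bandwidth
print(round(data_rate_gbps(3840, 2160, 24, 10, "4:4:4"), 2))  # 5.97
# 4K60, 10-bit, 4:2:0 (how most HDR streams are encoded) also fits
print(round(data_rate_gbps(3840, 2160, 60, 10, "4:2:0"), 2))  # 7.46
```

That's why dropping to 24Hz lets a device add 4:4:4 or higher bit depth: the pixel rate falls far enough to leave headroom on an HDMI 2.0 link.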
My understanding is that I can get the correct color depth (correct meaning whatever the content is encoded at) while using 60Hz, for both SDR and HDR content, right? I know what you're saying: if I drop the refresh rate, the device can output 4:4:4 for SDR. But if the content wasn't encoded that way, I'd rather let my TV upsample to that chroma.