^ The 23.976 rate is for compatibility with broadcast telecine.
In historical NTSC standard definition analog TV broadcasts, the actual frame rate was 29.97 frames per second, or 59.94 interlaced fields per second (rather than the "nominal" 30 and 60), for a technical reason: when color was added, the rate was lowered by 0.1% to keep the new color subcarrier from interfering with the audio carrier of the modulated broadcast signal, while staying within the limited frequency bandwidth assigned to each broadcast channel. That video rate has been carried over into the modern day. (NOTE: There is no similar issue with 50Hz PAL broadcasts.)
This is why 60 Hz power line interference bars crawl slowly up your TV screen: the 60 Hz power line rate is slightly faster than the TV's 59.94 Hz field rate.
The TV fields making up the broadcast signal don't arrive butted end to end; there's a gap in between them where nothing is displayed -- the vertical "blanking" interval -- which gives the TV a chance to get prepared to start displaying the next field. There's enough slack in that interval between fields (i.e., that waiting period before the next field begins) that the TV can easily stay synced with the real video signal even though the video rate is slightly different from the power line rate.
For movies recorded at 24 frames per second, converting them for TV broadcast requires adding in extra frames so the TV gets the frame rate it is expecting. This process (telecine) is done by breaking each frame of film up into 2 interlaced fields and then repeating some of the fields in a regular "cadence". Basically you need 4 frames (8 fields) of film to happen in the same amount of time as 5 frames (10 fields) of video. The standard "field repeat cadence" is 2-3 (commonly called 3:2 pulldown): the first film frame contributes 2 fields, the next contributes 3, and so on. The result is that some of the fields (interlaced half-frames of the film content) are shown on screen slightly longer than the rest -- which is where "cadence judder" comes from.
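To make that cadence concrete, here's a toy Python sketch of 2-3 pulldown. The single-letter frame labels, the "t"/"b" field suffixes, and the function name are all just illustrative; a real telecine operates on actual picture fields, not strings.

```python
def pulldown_32(film_frames):
    """Expand film frames into a 2-3 field repeat cadence (3:2 pulldown).
    Toy sketch: frames are labels like 'A'; 't'/'b' mark top/bottom fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3   # alternate 2 and 3 fields per frame
        for _ in range(repeats):
            # Field parity alternates with the overall field count, so the
            # extra field in each 3-group flips the cadence phase.
            parity = "t" if len(fields) % 2 == 0 else "b"
            fields.append(frame + parity)
    return fields

# 4 film frames (8 unique fields) become 10 video fields = 5 video frames.
print(pulldown_32(["A", "B", "C", "D"]))
# -> ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
```

Note the repeated 'Bt' and 'Db': those are the fields held on screen longer, and they're exactly where the cadence judder comes from.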
But since video is not really 60 fields per second but rather 59.94, the FIRST thing you have to do is slow the film down just a hair to maintain the same timing relationship between film frames and video frames. Thus 23.976 frames per second for film.
That slow down -- 0.1% less than the "nominal" rate for both the video and film rates -- is too tiny for the eyes or ears to spot, amounting to roughly one frame of drift every 42 seconds. For example, it is so tiny that no "pitch adjustment" is needed for the audio.
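If you want to check the arithmetic yourself, the rates are exact ratios, which Python's fractions module handles nicely:

```python
from fractions import Fraction

film = Fraction(24)                    # true film rate
ntsc_film = Fraction(24000, 1001)      # the exact "23.976" rate
slowdown = 1 - ntsc_film / film        # exactly 1/1001, just under 0.1%

# How long until the slowed stream falls one full frame behind:
drift_seconds = 1 / (film - ntsc_film)  # 1001/24, about 41.7 seconds

print(float(ntsc_film), float(slowdown), float(drift_seconds))
```

The same 1000/1001 ratio is behind 29.97 (30000/1001) and 59.94 (60000/1001).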
For home theater, modern electronics can handle the difference between 24.000 and 23.976 frame rates without problem. I.e., the player can generate the correct output timing regardless of which is on disc, and so the disc formats give studios the option of putting the content on disc either way.
In what follows I'll ignore the difference between 24.000 and 23.976 frame rates -- as that all gets handled automagically.
Now I mentioned "cadence judder" above. That's a VERY slight ratcheting of what is supposed to be smooth motion, caused by some fields of the film being held on screen slightly longer than others as part of that telecine process of raising the film frame rate to the video frame rate. You've been seeing cadence judder your whole life -- any time you watched a film on broadcast TV. It turns out the brain is VERY good at ignoring cadence judder, so unless you are unusually sensitive to it, odds are you've never noticed it, and won't notice it now unless you have a chance to compare normal and "judder free" display side by side. One good place to look for it is the credit scroll at the end of a movie, which will advance in a slightly ratcheting fashion due to cadence judder.
But these days, modern electronics allow for reasonably priced TVs that can display images at EITHER a 30-based or a 24-based refresh rate. Modern TVs will typically display 60 frames per second of TV (the normal 30 frame rate with each frame doubled). For film rates, the TV refresh rate will be a multiple of 24 -- 48, 72, and 96 being the most common -- differing by how many times each film frame is repeated on screen before the next one gets displayed. (These repeats reduce the chance you'll see "flicker" due to the slowness of 24 frames per second -- theater movie projectors do a similar thing, breaking up each frame into 2 repeats with a mechanical shutter mechanism.) This allows for the possibility of displaying movie content WITHOUT cadence judder.
Blu-ray movies are recorded on disc as 1080p/24. If you have a TV that can accept /24 input and "do the right thing" with it, then you get cadence judder free display of that movie.
But what of SD-DVDs? They are recorded on disc as 480i/60. That means movies on disc have ALREADY had telecine applied to them. The cadence judder is already built in.
Theoretically you can detect the field repeat cadence in the SD-DVD video stream and REMOVE the repeated fields -- thus recovering the original movie rate of 24 frames per second. That can be fed to the same sort of TV as above for cadence judder free display.
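A toy sketch of that repeated-field removal, again using labeled fields (a real player has to compare actual field contents, which is much harder and is where the glitches discussed below come from):

```python
def inverse_telecine(fields):
    """Drop repeated fields from a 3:2-pulldown stream, then pair the
    surviving fields back into progressive film frames. Toy sketch only:
    fields are labels like 'At', so 'repeated' means literally equal."""
    kept = []
    for f in fields:
        if f in kept[-2:]:      # a repeat within the 3:2 cadence window
            continue
        kept.append(f)
    # Adjacent kept fields are the two halves of one film frame.
    return [kept[i][0] for i in range(0, len(kept), 2)]

telecined = ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
print(inverse_telecine(telecined))   # -> ['A', 'B', 'C', 'D']
```

Ten video fields go in, the two repeats are discarded, and the original four film frames come back out at 24 frames per second.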
But the reality is that this is actually pretty tricky.
First off, suppose what's actually on disc is a TV show? Well that was recorded at 30 frames per second -- video rate -- not 24. And if you force it to 24 for output you are *DEFINITELY* going to have problems. That's because there's no good way to decide which frames to discard to LOWER the frame rate. So you will get "frame drop stutter". This is a significant jerkiness of motion -- most easily seen in horizontal or vertical pans as the whole image is shifting at the same time. Frame drop stutter is much MUCH worse than cadence judder. Almost everyone will see frame drop stutter, and it is patently obvious that it was there if you turn off 24p conversion and view the video normally -- motion will look distinctly better.
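You can see why the motion stutters with a little arithmetic. Forcing 30 frames per second down to 24 means discarding one frame in every five; here's an illustrative sketch using made-up pan positions (1 unit of image shift per 30fps frame):

```python
def force_to_24(video_frames):
    """Naively drop every 5th frame to force 30 fps video down to 24 fps."""
    return [f for i, f in enumerate(video_frames) if i % 5 != 4]

# A smooth pan: the image shifts 1 unit per 30fps frame.
positions = list(range(11))
kept = force_to_24(positions)
steps = [b - a for a, b in zip(kept, kept[1:])]
print(steps)   # -> [1, 1, 1, 2, 1, 1, 1, 2]
```

Those 2s are the frame drop stutter: four smooth steps, then a double-size jump, over and over -- which is why pans look so jerky.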
So the first issue is that you don't want to apply 24p conversion to SD-DVDs that were recorded at video rate. Now this will usually mean TV shows and live concerts on SD-DVD. HOWEVER, some TV series are actually recorded at 24 frames per second -- i.e., they are produced like movies. And some movies are EDITED with video rate equipment (for a variety of reasons that need not concern us). The point is you can get surprised and discover a TV show works well with 24p conversion or a movie DOESN'T work well with it.
In addition to the luck of the draw on how the content was produced, there's also the problem that the process of creating the "transfer" -- the digital rendition of the movie on SD-DVD -- often introduces glitches. These are points where the flow of the cadence is broken, and thus the 24p conversion goofs up in deciding which fields to discard in its attempt to extract the original 24 frames per second film content.
Such glitches often come from "bad edits", where the process of editing the film broke the field repeat cadence.
Others come from the mechanical telecine process used years ago to produce video broadcast versions of movies -- the result of which has just been dumped onto SD-DVD without regard to the problems that might cause. Movies on such discs HAVE NO fixed relationship between film and video frames. There is no uniform repeat cadence to be found. And thus 24p conversion will get all confused.
Glitches like this also produce frame drop stutter, or even worse artifacts depending on how the 24p conversion is implemented. And depending on the nature of the glitch, how frequently they occur, and the type of 24p conversion algorithm in use, the ability to RECOVER after a glitch will vary. So the period of bad video may be brief, longer, or even indefinite if the system can't figure out how to recover the cadence lock.
So what to do?
First, consider applying 24p conversion only to newer SD-DVDs of newer movies. These are more likely to be on disc "correctly".
Second, learn what frame drop stutter looks like. The easiest way to do that is to play an SD-DVD of a TV show and compare it with and without 24p conversion.
Then go ahead and try 24p conversion -- assuming you have a TV that "does the right thing" with /24 input in the first place (since there's no point in using 24p conversion if your TV is simply going to raise it BACK up to a 60 refresh rate!). And just be prepared to turn off 24p conversion if you see frame drop stutter, as that's your clue that THIS particular SD-DVD disc is not a good candidate for it.
TVs will attempt to do cadence detection "on the fly" and switch how they convert fields into frames, either discarding the repeated fields or not. That is, they will take the 60-field-per-second movie stream, drop the repeats to recover the original 24 film frames per second, and THEN raise that to 60 frames per second by repeating WHOLE FRAMES instead of fields. This makes for better looking "deinterlacing". This is what most TVs mean by "reverse telecine". Some TVs will actually shift their refresh rate on the fly as well -- detecting incoming film rate content and switching to, say, 72 frames per second refresh instead of 60.
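A minimal sketch of what "cadence detection" amounts to: look for field positions whose content matches the field two positions earlier, and check that those matches land every 5 fields. The labeled fields and function name are illustrative; real hardware does fuzzy pixel comparisons, which is why it can lose the lock.

```python
def find_repeats(fields):
    """Return positions whose field matches the one two positions back --
    the signature of a 3:2 repeat. Toy sketch over labeled fields."""
    return [i for i in range(2, len(fields)) if fields[i] == fields[i - 2]]

telecined = ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
print(find_repeats(telecined))   # -> [4, 9]
```

Repeats landing exactly 5 fields apart is what lets the TV lock the cadence phase; a bad edit that breaks that spacing is what throws the detection off.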
Disc players that do 24p conversion have a different issue to deal with, which is that changing between /24 and /60 output on an HDMI cable requires a new HDMI handshake -- which takes at least 2 seconds and sometimes longer. And so the player has to take a different tack. It has to stick with a choice of film or video rate processing to avoid constant new HDMI handshakes interrupting the flow of the viewing. And thus glitches can last longer if the player gets confused by the content into making the wrong choice. On the other hand, letting the player do the conversion means you can get true /24 into a TV that DOESN'T try to switch refresh rates itself unless fed with /24 to begin with.