Motion judder is inherent to any content shot at 24 frames per second.
Almost all movies are shot at 24 fps.
The only exceptions for now are The Hobbit films, shot in HFR at 48 fps.
I'm not a fan of HFR or any of the motion-smoothing enhancements. It looks like early videotape to me.
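Why does 24 fps content judder on a 60 Hz display? The two rates don't divide evenly, so each film frame is held for alternately 3 and 2 display refreshes (3:2 pulldown), and the unequal hold times read as judder. A minimal sketch of that cadence (the function name and frame labels are just for illustration):

```python
# Sketch: 3:2 pulldown, the usual way 24 fps film is mapped to 60 Hz.
# Frames are held for alternately 3 and 2 refreshes, so motion that
# should be evenly spaced isn't -- that uneven cadence is the judder.
def pulldown_32(frames):
    """Map a list of 24 fps frames to a 60 Hz refresh sequence."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))  # 3, 2, 3, 2, ...
    return out

refreshes = pulldown_32(["A", "B", "C", "D"])
print(refreshes)  # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(pulldown_32(list(range(24)))))  # 24 frames -> 60 refreshes
```

Note that a 48 fps HFR source divides cleanly into neither 60 Hz nor the 3:2 cadence; HFR-capable projectors simply run at the native rate, which is why HFR has no pulldown judder at all.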
For live TV, it's almost certain the signal arrives at your display as 1080i/60. What it was actually shot in could be any of several formats.
At the display, the signal gets upconverted (deinterlaced and scaled) to the display's native resolution, 1080p/60.
This deinterlacing process is very effective. You can read more here:
http://en.wikipedia.org/wiki/Deinterlacing
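To make the deinterlacing step concrete, here's a toy sketch of the two classic strategies real deinterlacers blend between: "weave" (interleave the two fields, ideal for static images) and "bob" (line-double one field, which avoids combing on motion). This assumes a simplified model where a frame is just a list of rows and the two fields carry alternate rows:

```python
# Toy model: a frame is a list of rows; an interlaced signal sends the
# even rows (top field) and odd rows (bottom field) in separate passes.

def weave(top_field, bottom_field):
    """Interleave two fields into one full frame (best for static content;
    combs if there was motion between the fields)."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.extend([top_row, bottom_row])
    return frame

def bob(field):
    """Repeat each field line to fill a full frame (no combing on motion,
    at the cost of vertical resolution)."""
    frame = []
    for row in field:
        frame.extend([row, row])
    return frame

top = ["t0", "t1"]      # rows 0 and 2 of a 4-row frame
bottom = ["b0", "b1"]   # rows 1 and 3
print(weave(top, bottom))  # ['t0', 'b0', 't1', 'b1']
print(bob(top))            # ['t0', 't0', 't1', 't1']
```

Good motion-adaptive deinterlacers in modern displays choose between these per pixel, which is why the process is as effective as it is.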
So why do you see so much motion blur on cable?
There are many variables that could be contributing to this.
I'm assuming that the connection from the cable box is all via HDMI.
The first thing I would check is whether the cable receiver is actually outputting HD.
I know this sounds ridiculous, but you would be surprised how many friends and family members think they are getting HD but are actually watching SD.
Make sure you are not double scaling. This can happen when you have a surround receiver between your sources and your display.
Many, if not all, new receivers have their own scaling and enhancement capabilities. Make sure these are all off so you are just passing the signal through to the display.
Next turn off all enhancements on the display.
If after all this you still have significant motion blur on live sports, it's something that your provider is doing or not doing.
Most likely too much compression.
I have seen awful HD, especially from U-verse and WOW.
Comcast is hit or miss depending on the area and the channel.
It's possible that by using some of the display's enhancements you can get to an image you find acceptable.
But there is only so much you can do with poor-quality content.
The 2nd Edition of the Spears & Munsil Blu-ray has great patterns for testing judder.
http://www.spearsandmunsil.com/portfolio/pattern-help-text/