It really comes down to your own level of anal-ness. (I say this with a disclaimer, as I am sure someone will bark down my throat that only by having your TV set exactly the way Ridley Scott expected it to be set when he mastered Black Hawk Down will you ever be able to appreciate it for blah, blah, blah.)
The difference is between reference and preference, and the debate has logged a lot of threads here in the forums. Basically, do you believe that your TV should just look good to you, or that it should look exactly like the recording studio says it should? If you think the former is all that matters, I would say that a person of average intelligence, with workable eyes, can get their displays to the point that they are not missing anything too terribly significant without having to buy additional tools.
I have a couple of i1 products that I use to make sure my desktop monitor is accurate for still photography work, but that is primarily to ensure that the photos I see on screen look identical to the photos I see in print, or that photos sent to others (also with properly set up displays) will show similar results. I've never bothered to use them to calibrate or profile my movie/TV displays.
If you have a general understanding of what a proper picture should look like (no clipped whites, no clipped blacks, nobody with green or orange skin, etc.), a number of freebies found online (there are some good downloadable DVD calibrators here in the display calibration forum) can get you a very well dialed-in picture.
Now the general retort is usually something like "but if you don't at least start with your display at reference levels, you'll never know what your display should look like to begin with," etc. etc. This is where your own personal level of anal-ness comes into play. If you get the display to where you think it looks good, and the various free/your-eyes-only calibration aids show that you are not vastly out of whack, is that good enough for you?
-Suntan