I don't care to look up pattern and folder names anymore - as I have verified "low inter-device variability" on my devices - and use HCFR and its built-in pattern generator instead of color pattern discs you have to switch through manually.
As "known good sources" (for example in combination with a PS3) they are a good baseline to "get to know" output source accuracy (similarity really, ...) - but for my taste - you should switch to HCFRs pattern generator as soon as possible. From hearsay - you now can also use a Google Chromecast as a pattern generator for HCFR - so if there is proven to be low variability / deviance from ideal - I'd recommend doing that as well.
Pattern discs are just a hassle that will keep you from doing several calibration runs and learning more. They are a crutch. They might be necessary if your playback device introduces huge color (/greyscale) shifts. Most (from experience) don't. Some do.
RGB limited / full: You should see it when you get there. We are talking about losing the bottom 16 shades of dark colors (all crushed to black), or raising black 16 shades too "bright" - so in effect you should see quite easily that something is wrong when the two don't match up.
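To make the levels concrete, here's a minimal sketch of the two mappings (16/235 are the standard limited-range video levels; the function names are mine):

```python
def limited_to_full(v):
    """Expand limited-range video (16-235) to full-range PC levels (0-255).
    Anything below 16 lands on 0, anything above 235 clips to 255."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_limited(v):
    """Compress full-range PC levels (0-255) into limited range (16-235)."""
    return round(v * 219 / 255) + 16

# The two mismatch cases described above:
# - TV expects limited but gets full: the TV treats level 16 as black, so
#   the bottom 16 shades of the full-range signal are all crushed to black.
# - TV expects full but gets limited: reference black arrives as level 16,
#   i.e. black is raised 16 shades too "bright" and the image looks washed out.
print(limited_to_full(16), limited_to_full(235))  # 0 255
print(full_to_limited(0), full_to_limited(255))   # 16 235
```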
HCFR's black border around all color fields, if you use an APL level above 0%, is always PC-level 0 black (while the black in the middle might not be, depending on your choices on the same settings screen), as far as I remember - and I have used this in the past as an indicator. If the two blacks look different on your screen, something is wrong (either the HCFR setting for limited or full, or the TV's setting for limited or full).
But you could also use any brightness test pattern - crushed blacks you should very much see on the patterns; raised black (starting 16 levels too high) you won't be able to see on a pattern, but really, looking at how real-world images come out, it should be obvious.
Mostly it's just finding the "full/limited/(auto)" setting on your TV and flipping it a few times - and you should know which one to use for the input signal you are faced with.
2-point and 10-point sometimes go together and sometimes are completely different settings; it depends on the TV and model. Usually it is recommended to calibrate 2-point first, and then calibrate 10-point on top - mostly because of little kinks, like LG TVs supposedly producing more artefacts if you try to correct everything with the 10-point sliders alone. It also means that in most cases they should be used together.
No, I won't provide videos (I've never once seen a YouTube tutorial that was even half as decent as a written write-up that had time to be self-referential, reflect, correct, rewrite, ...) - and I won't provide step-by-step handholding after having written a step-by-step tutorial. But should questions come up after you have at least tried to figure out a supposed "issue" first, I'm sure someone will provide support, me included.
Just don't expect others to do the figuring out for you. Questions like "what does 100% brightness, 100% saturation mean" warrant a "take a guess - 100% brightness, 100% saturation" answer.
Asking if you really need all those pesky color patterns to calibrate a TV's color almost asks for a "not if you only want to calibrate six colors total" answer.
Logic has to be applied.
Don't mistake forum users for app-usage assistants.
Also - because gamma is so horribly broken by design by now, I'd say that calibrating TVs currently is out of the question, and everyone suggesting that they are doing so is lying their asses off.
Calibrators today want to tell you that it is OK to calibrate TVs to either one of those two targets (or any number of different ones, no one cares) - just depending on "if your room is bright" or if "the black level of your TV demands it".
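Assuming the two targets in question are plain power gamma 2.2 and BT.1886 (my assumption - BT.1886 is the one where "the black level of your TV demands it", because the target curve is derived from the display's own black), a quick sketch shows how two nominally "calibrated" TVs end up with visibly different tone curves:

```python
def power_gamma(v, gamma=2.2, lw=100.0):
    """Classic power-law target: output luminance (cd/m2) for signal v in [0,1]."""
    return lw * v ** gamma

def bt1886(v, lw=100.0, lb=0.05):
    """BT.1886 EOTF: the curve is built from the display's own white (lw)
    and black (lb) luminance, so the target shifts with the panel."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

# Same 10% grey stimulus, three displays with different native blacks,
# three different "correct" luminances - all nominally calibrated:
for lb in (0.0005, 0.05, 0.5):  # OLED-ish, decent LCD, poor LCD black
    print(f"lb={lb}: power 2.2 -> {power_gamma(0.1):.3f}  BT.1886 -> {bt1886(0.1, lb=lb):.3f}")
```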
We have some real "voodoo sh*t" going on currently as far as "standards" are concerned.
Basically because of that, I'd suggest that TV calibration currently is more a distributed fraud model than an actual model that standardizes anything.
You are welcome to join that discussion once you have learned what's going on. You won't learn it if you are just asking questions on how to best follow tutorial steps - or how to match your test disc with a different test disc, or a specific calibration program.
Use several, find out the variance for yourself. If you aren't sure you are using the right patterns, try to understand what they are showing - instead of getting a "yes this one, and then that one" shortcut out of someone.
edit: If your CMS settings are labeled strangely - yes, in general you are looking for low dE values and a color target that's "spot on".
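"Low dE" is just a distance in color space. A minimal sketch using CIE76, the simplest delta E variant (HCFR also reports newer ones like dE2000; the measured/target numbers below are invented):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b*.
    Rough rule of thumb: dE < 1 invisible, < 3 hard to spot, > 5 obvious."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Invented example: measured vs. target red primary in L*a*b*
measured = (53.1, 79.5, 66.2)
target = (53.2, 80.1, 67.2)
print(round(delta_e_76(measured, target), 2))  # ~1.17 -> "spot on" for TV work
```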
But as you ordered a colorimeter that depends on factory corrections for each "display type" (defined by differences in their spectral graphs), and several new "display types" have been introduced in LED LCDs alone over the past three or four years, which no one cared to profile or compare - colorimeters without model-specific spectral corrections become more and more useless by the day.
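For context: those per-display-type corrections typically boil down to a 3x3 matrix applied to the raw XYZ readings, derived by comparing the colorimeter against a reference spectroradiometer on that panel type. A sketch with invented numbers - the point being that the wrong matrix silently skews every reading:

```python
RAW_XYZ = [24.1, 21.9, 31.7]   # invented raw colorimeter reading (X, Y, Z)

CORRECTION = [                  # invented per-display-type 3x3 matrix
    [1.020, -0.015, 0.004],
    [-0.008, 1.031, -0.002],
    [0.003, -0.011, 0.987],
]

def correct(xyz, m):
    """Apply a 3x3 correction matrix to an XYZ triple (matrix * vector)."""
    return [sum(m[r][c] * xyz[c] for c in range(3)) for r in range(3)]

print(correct(RAW_XYZ, CORRECTION))
# A matrix made for a different backlight/panel type applies just as
# happily - every reading then comes out plausible, but wrong.
```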
Also - you may or may not trust spectral corrections, as I have seen at least two of them break in spectacular fashion (compared to a spectroradiometer's readings) on current TVs -
and of course there is the color metamerism failure issue, which people in here want to ignore as well (but that will become more prevalent with the new quantum dot devices, I'm sure...). So color "science" for TVs is broken three or four different ways, and thoroughly broken by every one of those issues - so have fun measuring "something".