The added value of a 3D LUT calibration depends entirely on how well your display tracks at various levels of luminance and saturation once corrected using its internal controls.
For example, on my JVC X30, which has poor out-of-the-box (OOTB) calibration, no internal CMS, and uncorrectable green and cyan at 100% saturation, a 3D LUT calibration with a Lumagen Mini, even with a small LUT (5x5x5), was like getting a new display. Literally night and day: it went from really frustrating to close to perfect (from a color accuracy point of view). I'm sure it would get marginally better with a larger LUT, 9x9x9 or 17x17x17, but even a small LUT made a significant difference, well worth the cost of the hardware and software (to me).
On the other hand, when I evaluated a Sony 500ES recently, it measured so close to reference OOTB at all levels of saturation and luminance (I measured the 125 points of a non-calibrated LUT and all points were under 3 dE) that a 3D LUT would bring very little to this display from a calibration point of view - at least a 125-point one; I didn't test with a video processor offering a larger LUT. That's in my setup; it might be different with a less neutral screen, for example.
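For reference, "measuring the 125 points" just means reading each lattice point of a 5x5x5 grid and computing a color difference against the reference value. A small sketch of that evaluation, using the simple CIE76 dE (other formulas like CIEDE2000 weight the terms differently); the measurement data here is simulated, not from the Sony:

```python
import itertools
import numpy as np

def delta_e_76(lab_ref, lab_meas):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return float(np.linalg.norm(np.asarray(lab_ref) - np.asarray(lab_meas)))

# The 125 lattice points of a 5x5x5 LUT, as RGB triplets.
levels = np.linspace(0.0, 1.0, 5)
points = list(itertools.product(levels, repeat=3))

# Simulated data for illustration: reference Lab values, and "measured"
# values perturbed slightly, standing in for real meter readings.
rng = np.random.default_rng(0)
refs = rng.uniform([0, -50, -50], [100, 50, 50], size=(125, 3))
meas = refs + rng.normal(0.0, 0.8, size=(125, 3))

errors = [delta_e_76(r, m) for r, m in zip(refs, meas)]
print(f"points: {len(points)}, max dE76: {max(errors):.2f}, "
      f"mean dE76: {np.mean(errors):.2f}")
```

If every one of the 125 points comes back under roughly 3 dE, as the Sony did for me, a 125-point LUT has very little left to correct.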
Of course the display will drift as the lamp ages, especially regarding gamma, so more corrections would likely be needed after a few hundred hours. But the Sony software, which allows calibrating gamma, along with the autocalibration feature to bring the display back to its OOTB state, might suffice in that case.
So the first thing to do would be to measure your display and see whether it tracks well at various levels of luminance and saturation. If it tracks well, save the money, or get the equipment for other - valid - reasons, like upscaling, processing, flexibility, etc. If it doesn't track well, then the worse it tracks, the more you stand to gain from a good 3D LUT calibration.
That being said, it's also relative to how sensitive you are to color accuracy. I know many people who are perfectly happy with a non-calibrated JVC, or even with a JVC calibrated using its poor internal CMS. So you need to provide more information, both about your display and about your personal standards, to get informed replies.
One last thing: it's all well and good to use low-end meters to get a "perfect" calibration, but at the end of the day your calibration is only as good as your meter. If the starting errors are large (as on most JVCs), you can only benefit from a calibration, even with a low-end meter like an i1d3. But if the starting errors are small (as on the Sony 500ES), you need reference-grade equipment, otherwise you are calibrating to the meter, not to reference. A lot of people struggle with this notion and obsess about chasing the last 0.5 dE of accuracy, when the meter itself might be 3 dE off or more.
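To put rough numbers on "calibrating to the meter, not to reference": the meter's own offset adds to (or partially cancels) whatever residual you calibrate down to, so it bounds what you can actually guarantee. The dE figures below are illustrative, not measurements of any particular meter:

```python
# Illustrative numbers only. If the meter itself reads 3 dE away from
# reference, and you calibrate until the display is within 0.5 dE of
# what the meter reports, the triangle inequality gives the range of
# the display's true error versus reference:
meter_offset_de = 3.0   # meter's own error vs reference (assumed)
residual_de = 0.5       # residual you achieved, as seen by the meter

best_case = abs(meter_offset_de - residual_de)   # errors partly cancel
worst_case = meter_offset_de + residual_de       # errors add up

print(f"true dE vs reference: between {best_case} and {worst_case}")
```

In other words, chasing that last 0.5 dE with a 3 dE meter still leaves you anywhere between 2.5 and 3.5 dE from reference.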
Overall, spend as much as you want, but only if something in the picture frustrates you and you know what it is. Otherwise, save your money for a better display or more movies. That would be my advice.
PS: One very last thing: although software and hardware manufacturers want you to believe it's easy to simply set up a meter and press one button to get a perfect calibration, in my experience you need to invest quite a lot of time to get good results, even with autocalibration. So unless you want to invest a good amount of time learning, I would suggest dissociating the hardware - say, a Lumagen (or an eecolor if you don't care about 3D) - from the software and meters. Having the hardware installed by a good calibrator not only saves you money - no need to buy meters or software - but also a lot of time and headaches. You can spend more time watching movies and less time with meters, software, patterns, and bugs. If you are technically minded, calibration can be a great hobby and a fantastic way to waste a lot of time, especially during long winter nights :). However, you might end up spending more time calibrating than watching movies, so beware...