I am very new at this process, so here is what I've learned so far: it takes a long time and consumes a lot of disk space.
I experimented with my retail copy of The Hunger Games.
I ripped the entire BD disc to a local directory structure using DVDFab.
This generated some 44 GB of files and removed the copy protection; AnyDVD was not required.
Next, I ran the rip through the ClownBD package.
This generated some 42 GB of files in the Movie directory, and another 42 GB in the Remux directory.
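If you want to sanity-check the sizes at each stage, a quick directory tally works. Here is a minimal Python sketch; the paths are placeholders for wherever your rip and ClownBD output actually land:

```python
from pathlib import Path

def dir_size_gb(root: str) -> float:
    """Sum the sizes of all files under root, in gigabytes."""
    total = sum(f.stat().st_size for f in Path(root).rglob("*") if f.is_file())
    return total / 1e9

# Placeholder paths -- substitute your own rip and output directories.
for d in ("D:/Rips/HungerGames", "D:/ClownBD/Movie", "D:/ClownBD/Remux"):
    print(f"{d}: {dir_size_gb(d):.1f} GB")
```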
All told, I needed some 130 GB of free disk space to complete a rip to the final .M2TS file.
The temporary Movie and Remux space can be erased and reclaimed afterward.
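The arithmetic behind that peak figure, using the sizes from my run:

```python
# Working space for one movie, from the sizes above:
rip_gb   = 44          # DVDFab decrypted disc copy
movie_gb = 42          # ClownBD Movie directory (temporary)
remux_gb = 42          # ClownBD Remux directory (temporary)

peak_gb = rip_gb + movie_gb + remux_gb
print(f"peak working space: ~{peak_gb} GB")      # ~128 GB -> the ~130 GB figure

temp_gb = movie_gb + remux_gb                    # erasable once the .M2TS is done
print(f"reclaimable temp space: ~{temp_gb} GB")  # ~84 GB
```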
The whole process took about 1.5 to 2.0 hours, end to end.
My platform is a Core i7-870 (4 cores / 8 threads at 2.93 GHz),
with DDR3-1333 memory and SATA-II WDC drives.
The final .M2TS blu-ray file plays correctly on my new WD TV Live.
This seems like a lot of time, and a whole lot of fiddling around, for a single movie.
One could guess an average of 30 BD movies per terabyte of disk space.
Add in disk redundancy (a RAID or ZFS mirror), and the disk-plus-power cost per movie roughly doubles.
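Back-of-the-envelope math for those two estimates; the ~33 GB average title size is my assumption, not a measured figure:

```python
# Rough library-capacity math. The 33 GB average is an assumption;
# The Hunger Games came out at ~42 GB, but many titles run smaller.
avg_movie_gb = 33
usable_gb_per_tb = 1000

movies_per_tb = usable_gb_per_tb / avg_movie_gb
print(f"~{movies_per_tb:.0f} movies per TB")      # ~30

# With a mirror (RAID 1 or a two-way ZFS mirror), every byte is stored
# twice, so the raw disk consumed per movie doubles.
print(f"~{avg_movie_gb * 2} GB of raw disk per movie, mirrored")
```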
Consider the disk lifetime: roughly 26,000 hours, which is about three years of 24x7 operation.
Consider the disk array rebuild time when a drive is replaced.
Consider the power consumption of a RAID/ZFS server powered up 24x7.
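To put rough numbers on the lifetime and power points (the wattage and electricity rate below are assumptions; substitute your own):

```python
# Lifetime check: 26,000 hours of 24x7 operation is about 3 years.
hours_per_year = 24 * 365                        # 8,760 hours
print(f"{26_000 / hours_per_year:.1f} years")    # ~3.0

# Always-on power cost. Both inputs are assumptions:
server_watts = 100                               # assumed average draw for a small RAID/ZFS box
usd_per_kwh  = 0.12                              # assumed electricity rate

kwh_per_year = server_watts / 1000 * hours_per_year
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * usd_per_kwh:.0f}/year")
```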
All in all, this seems like a huge investment in time and money for the convenience of either not owning a BD player, or not getting off the couch to mount a BD disc.
One would simply have to love the server build-and-maintenance process, the BD imaging process, network shares, and so forth.