Originally Posted by Mac The Knife
(And I still can't understand why the heck Debian never seems to roll up those bug fixes).
Because neither testing nor unstable is meant for production use. Ubuntu essentially just fills that gap, doing whatever work needs to be done to make testing suitable for production use.
It's pretty simple, really. New versions of software go in unstable. If they work well, they go into testing. If they work really well, they will get pulled into stable when the new release is cut. Yes, this means stable is always several versions behind (aside from security backports), but it also means that unexpected behavior is rare.
Personally, I prefer the Debian model to any of the others. If you want to run something that almost certainly won't break, you've got stable. If you want to run something that probably won't break, but has newer packages, there's testing. If you want the latest packages regardless of stability, that's what unstable is for.
The best part is that APT makes it pretty easy to mix and match when you need to (presuming there's not an ABI break in any libraries the particular package depends on). A lot of the servers I run pull a few packages from testing and have everything else at stable.
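For anyone curious what that mix-and-match looks like in practice, here's a sketch using APT pinning. The mirror URL and release names are just the usual defaults, not anything specific to my setup:

```
# /etc/apt/sources.list -- list both stable and testing
deb http://deb.debian.org/debian stable main
deb http://deb.debian.org/debian testing main

# /etc/apt/preferences -- prefer stable by default;
# a priority below 500 means testing is only used when asked for
Package: *
Pin: release a=stable
Pin-Priority: 700

Package: *
Pin: release a=testing
Pin-Priority: 200
```

With that in place, `apt-get -t testing install somepackage` pulls that one package (and any newer library dependencies it needs) from testing, while everything else keeps tracking stable.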
Gentoo, at least when I last used it (about the time they were switching from named releases to dated releases) was a fracking nightmare. Portage makes it tolerable, but that's about as far as software could take it. They needed a lot more policy work. The worst part is that if you forgot about a machine for too long, you'd get your build profile deleted on you and spend an hour or two downloading the recovery profile (I forget what it's called) and doing a bunch of junk to make the new profile work. Even if you did keep up with updates, it wasn't at all unusual for a completely broken package to be uploaded, making even more work to keep things running.
I didn't so much mind the build times, but Gentoo was a complete mess organizationally.