The ratings are maximums of CLEAN CONTINUOUS power. When you drastically underpower a speaker, you tend to turn the volume up to the point where the amp distorts, trying to reach the level you want. When an amp goes into clipping, the high-frequency component of the signal (usually less than 5% of the overall power being delivered) rises dramatically, and you fry a tweeter.
For example, say you have a set of speakers rated at 200 W and a 30 WPC receiver, and the speakers have a sensitivity of 87 dB/1W/1m. To get 90 dB, you need 2 W; 93 dB, 4 W; 96 dB, 8 W, and so on, doubling the power for every 3 dB.
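That doubling rule is just the decibel formula in disguise: power required is 10^((target SPL − sensitivity) / 10). Here's a minimal sketch (it ignores listening distance, room gain, and the second speaker of a stereo pair, so treat the numbers as at-1-meter illustrations):

```python
def required_power(sensitivity_db: float, target_spl_db: float) -> float:
    """Watts needed at 1 m to hit target_spl_db, given sensitivity in dB/1W/1m."""
    return 10 ** ((target_spl_db - sensitivity_db) / 10)

# 87 dB/1W/1m speakers, as in the example above
for spl in (90, 93, 96, 102):
    print(f"{spl} dB -> {required_power(87, spl):.1f} W")
# 90 dB -> 2.0 W, 93 dB -> 4.0 W, 96 dB -> 7.9 W, 102 dB -> 31.6 W
```

Note how fast it escalates: 102 dB average already wants about 32 W, which is past what a 30 WPC receiver can deliver cleanly.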
Now, let's say those speakers are a 3-way design. The woofer can handle 125 W, the midrange 60 W, and the tweeter 15 W. So far so good. Say that with normal program material the woofer gets 75% of the power, the midrange 20%, and the tweeter 5%. At 96 dB, the tweeter is only seeing about 0.4 W. But if you listen at an average level around 102 dB, the amp is going to clip or distort. At that point it might be putting out 32 W, BUT the distorted signal now puts roughly 50% of the energy into the high frequencies. So the tweeter is getting 16 W, with the other 16 W split between the woofer and the midrange. Add program peaks on top of that, and it's easy to see how you could blow a pair of 200 W speakers with a 30 W receiver, where a 200 W amp wouldn't have that problem, since it wouldn't be clipping.
Realistically, as long as you run the amplifier within its clean operating range, you should have no problems. If all you ever send is 100 W CLEAN to speakers rated @ 150 W, you're good to go. The general advice is not to underpower a speaker: size the amp based on its efficiency and your desired loudness level, so it never has to clip.