Grey balance (greyscale) is indeed affected by irises, and I think this is one reason we are only now seeing widespread use of irises: The projector manufacturers have finally gotten greyscale under control enough that they might be able to make it work with irises.
Here's why the iris affects greyscale.
The iris doesn't directly change the color on its own; it's just a neutral density filter. But what it does do is move the greypoint around on the greyscale curve. A greyscale curve is a plot of the color (usually expressed in kelvin) on the Y axis versus the video level from 0 IRE to 100 IRE on the X axis. In a perfect world this would be a straight flat line, but in the real world it's usually not quite flat. Different output levels usually have slightly different colors, whether at 10 IRE, 50 IRE, or 90 IRE.
Now the problem is that the iris will move the scene around on this curve.
Let's say you have a scene with a 5 IRE shadow, a 30 IRE grey spaceship, and a 100 IRE glowing white robe. This scene will require the iris to stay wide open. Now let's say the 100 IRE white robe leaves the scene, so the 30 IRE spaceship is the brightest thing in it. The iris closes down to about a third of its original size. In addition, and this is the interesting part, the imaging device must increase its output so that the 30 IRE spaceship is now close to the "full on" imaging chip level, the same level that was used to reproduce the 100 IRE glowing white robe before. In the process the 5 IRE shadow gets scaled up to three times its pre-iris level as well.
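To make that rescaling concrete, here's a toy sketch (my own illustration in Python, not anything resembling actual projector firmware) of how the chip drive levels get remapped when the scene's peak brightness changes:

```python
# Toy model of a dynamic iris: the iris closes so the brightest element
# of the scene maps to full chip output, and every other level in the
# scene is scaled up by the same factor to compensate.
def chip_level(scene_ire, peak_ire):
    """Chip drive level (0-100) after the iris adapts to the scene peak."""
    return scene_ire * (100.0 / peak_ire)

# Robe (100 IRE) in the scene: iris wide open, levels pass through as-is.
print(chip_level(30, 100))  # spaceship driven at 30
print(chip_level(5, 100))   # shadow driven at 5

# Robe leaves: the 30 IRE spaceship is now the peak, iris closes to ~1/3.
print(chip_level(30, 30))   # spaceship driven at 100, the "full on" level
print(chip_level(5, 30))    # shadow driven about three times higher
```

The point is that the same scene content is now reproduced at very different positions on the chip's output range.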
Just as an example, let's say the best calibrated greyscale of the system is 6000K at 0 IRE and 7000K at 100 IRE, with a smooth linear transition between. That means that when the robe was in the scene, the 30 IRE spaceship was displayed at 6300K. But when the robe left the scene and the iris closed down, the 30 IRE spaceship shifted to 7000K. I would be worried that the 700K difference would be noticeable, especially if it occurred in 1/60 of a second, as some irises are capable of.
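Those numbers fall straight out of linear interpolation. A quick sketch, using the same hypothetical 6000K-to-7000K ramp from the example above:

```python
# Hypothetical greyscale curve: a linear ramp in color temperature
# from low_k at 0 IRE chip drive to high_k at 100 IRE chip drive.
def greyscale_kelvin(chip_ire, low_k=6000.0, high_k=7000.0):
    """Color temperature at a given chip drive level (0-100 IRE)."""
    return low_k + (high_k - low_k) * chip_ire / 100.0

# Robe in the scene: iris open, spaceship driven at its native 30 IRE.
before = greyscale_kelvin(30)   # 6300K
# Robe gone: iris closed, spaceship driven at full chip output.
after = greyscale_kelvin(100)   # 7000K
print(after - before)           # the 700K shift
```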
So that's how the iris will end up causing color shifts. Whether these shifts are noticeable will depend on how fast they occur, what scenes are viewed, the dynamic range of the scenes, and most importantly on how well the greyscale is calibrated in the first place.
Modulating the lamp output has the potential to cause the same shifts, plus color shifts arising from the fact that color balance often changes based on the lamp output and temperature.
Whether these things end up being problematic remains to be seen... I've always thought that dynamic irises are a great idea. It's the low contrast movie scenes that really demand high projector contrast, and that's where the irises shine.
I would like to see these auto iris systems come with built-in calibration systems that would measure the RGB output at different chip and iris output levels, perhaps with a sensor-containing lens cap that you put on every few hundred projector hours. The hardware really would be quite cheap, since by tailoring for the projector's color balance you avoid the need for fancy sensors. Just a few bucks for a cheap photocell and A/D converter should do it, plus some software costs.
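In rough pseudocode terms, such a calibration pass might look like the sketch below. This is purely hypothetical: `set_chip_level`, `set_iris`, and `read_rgb` are stand-ins I invented for whatever controls the projector exposes and whatever the cheap photocell in the lens cap reports.

```python
# Hypothetical calibration sweep: measure RGB output across a grid of
# chip drive levels and iris positions, producing a table the projector
# could use to correct greyscale on the fly as the iris moves.
def calibrate(set_chip_level, set_iris, read_rgb,
              chip_levels=(10, 30, 50, 70, 90),
              iris_positions=(1.0, 0.5, 0.33)):
    """Return {(iris, level): (r, g, b)} measured at each grid point."""
    table = {}
    for iris in iris_positions:
        set_iris(iris)               # e.g. 1.0 = wide open, 0.33 = a third
        for level in chip_levels:
            set_chip_level(level)    # drive a full-field grey at this level
            table[(iris, level)] = read_rgb()
    return table
```

With only 15 grid points and no need for absolute accuracy (the goal is consistency across iris positions, not a reference-grade measurement), a bargain sensor really could be enough.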