I've posted this before, but it seems to fall on deaf ears, so I'll take this opportunity to repeat it here, FWIW:
An analog loopback measurement of a sound card has nothing whatever to do with a frequency response measurement of your room. All analog stages have DC-blocking capacitors, and since their values are essentially arbitrary, the low-frequency roll-off varies hugely from one card to another; an analog loopback will show that roll-off. In almost every case (100% of the cases I investigated), the blocking caps are in the analog output stage.
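To put some numbers on that: a coupling cap into the input impedance of the next stage forms a first-order high-pass, and its -3 dB corner is 1/(2*pi*R*C). A minimal sketch, with purely hypothetical cap and load values, showing how much the roll-off at 10 Hz can differ between two otherwise similar output stages:

```python
import math

def hp_corner_hz(r_ohms, c_farads):
    """-3 dB corner of a first-order RC high-pass (coupling cap into load R)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

def hp_gain_db(f_hz, fc_hz):
    """Magnitude of that high-pass at frequency f_hz, in dB (0 dB = flat)."""
    ratio = f_hz / fc_hz
    return 20 * math.log10(ratio / math.sqrt(1 + ratio * ratio))

# Hypothetical output stages: a 10 uF vs a 1 uF coupling cap into a 10 kohm load
for c in (10e-6, 1e-6):
    fc = hp_corner_hz(10e3, c)
    print(f"C = {c * 1e6:.0f} uF -> corner {fc:.2f} Hz, "
          f"gain at 10 Hz {hp_gain_db(10, fc):+.2f} dB")
```

A 10x difference in an arbitrary cap value moves the corner from well below 2 Hz to almost 16 Hz, which is exactly the card-to-card variance a loopback sweep exposes.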
However, when you measure the FR of your room, the signal going through your SC into REW never sees the analog output stage of the sound card. It goes into the SC's A/D converter and stays in the digital realm from there to REW.
That means that if your particular SC rolls off at or above 10 Hz in its analog output stage (as many do, some worse) and you run an analog loopback from which REW makes a correction file, in most cases that correction file ends up applying a low-frequency boost to the actual A=>D signal path.
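The arithmetic behind that is simple: the correction file is the inverse of the loopback response, but the mic signal never passes through the stage that caused the droop. A sketch with an assumed 15 Hz output-stage corner and an A/D path that is flat (both numbers are illustrative, not from any particular card):

```python
import math

def hp_db(f, fc):
    """First-order high-pass response in dB at frequency f, corner fc."""
    r = f / fc
    return 20 * math.log10(r / math.sqrt(1 + r * r))

fc_out = 15.0  # assumed corner of the analog output stage's blocking cap

for f in (5, 10, 20, 40):
    loopback = hp_db(f, fc_out)        # loopback sees the output-stage roll-off
    correction = -loopback             # the correction file inverts it
    ad_path = 0.0                      # mic signal only sees the A/D (flat here)
    corrected = ad_path + correction   # net result: a spurious low-end boost
    print(f"{f:>3} Hz: {corrected:+.1f} dB of boost applied to a flat A/D path")
```

With those assumed numbers the "correction" adds roughly 10 dB at 5 Hz to a path that was never rolled off in the first place.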
The same goes for using a generic correction file for the RS meter, and I wouldn't pay much attention to any measurement result below 10 Hz taken that way.
Finally, there is the noise floor of the room/measurement system, which rises as frequency decreases. To clear that hurdle safely, you have to measure at 100 dB minimum, well above the level your posted graph shows. That also entails making sure REW is at least roughly calibrated. Since I've never used the RS meter to make a measurement, I don't know whether its SPL display still reads while the mic output is fed to REW, but if it does, use a 1000 Hz tone (where the RS meter is probably most accurate) and set REW's calibration before you measure. Then bump the sweep level to 100 dB and lose the SC correction file.
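The headroom check is just subtraction: sweep level minus noise floor at each frequency must leave enough signal-to-noise margin. A sketch with made-up room noise figures and an arbitrary 40 dB margin target, purely to illustrate why a quiet sweep falls apart at the bottom end:

```python
# Assumed room noise floor by frequency, in dB SPL (illustrative numbers only;
# real rooms are noisiest at the lowest frequencies, as the post says)
noise_floor_db = {20: 55, 40: 48, 80: 42}

sweep_level_db = 100   # the minimum sweep level recommended above
min_snr_db = 40        # assumed margin you'd want over the noise floor

for f, noise in sorted(noise_floor_db.items()):
    snr = sweep_level_db - noise
    verdict = "OK" if snr >= min_snr_db else "too low"
    print(f"{f:>3} Hz: noise {noise} dB, SNR {snr} dB -> {verdict}")
```

Run the same numbers at a 75 dB sweep and the 20 Hz margin drops to 20 dB, which is where low-frequency readings stop meaning much.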
Try that and repost, if you care to and get the time.