While it's technically possible, burn-in isn't realistically going to happen in a home environment, even if you're using the display with a computer all day, as long as you turn it off at night - unless the display is faulty.
With CRT, Plasma, OLED etc. the wear is cumulative. This is how Plasmas can end up with uneven wear across the screen - letterboxing "burned" into the panel if someone primarily watches films and little 16:9 content, for example.
The first couple of generations of OLED will probably be more susceptible to burn-in due to the materials being used, which is something manufacturers will surely figure out. Part of the problem is that they are trying to drive OLEDs to be as bright as an LCD, rather than competing with Plasmas.
Hopefully, they will be able to get them to be as resistant as CRTs in a few years' time. Late-generation CRTs were almost immune to burn-in in home use scenarios, even with gamers.
Plasma always drives its phosphors at 100% and varies the on-time (duty cycle) to modulate brightness.
CRTs adjusted the brightness of the phosphors directly via beam intensity, so they weren't driven as hard. OLED should be more CRT-like in this regard, being able to vary the brightness of each pixel directly, rather than using PWM (or a similar scheme) to modulate brightness.
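To make the distinction concrete, here's a toy sketch of the two approaches. The function names and the linear-brightness assumption are mine for illustration, not from any display spec - real panels apply gamma curves and far more complex drive schemes.

```python
def pwm_drive(target_brightness):
    """Plasma-style: the pixel is either fully on or off;
    the fraction of time spent on (duty cycle) sets perceived brightness."""
    return {"drive_level": 1.0, "duty_cycle": target_brightness}

def amplitude_drive(target_brightness):
    """CRT/OLED-style: the pixel is driven continuously at a reduced level."""
    return {"drive_level": target_brightness, "duty_cycle": 1.0}

for target in (0.25, 0.5, 1.0):
    p = pwm_drive(target)
    a = amplitude_drive(target)
    # Average light output (drive_level * duty_cycle) is the same either way...
    assert abs(p["drive_level"] * p["duty_cycle"]
               - a["drive_level"] * a["duty_cycle"]) < 1e-9
    # ...but the PWM pixel spends all of its on-time at full stress,
    # while the amplitude-modulated pixel never exceeds the target level.
    print(target, p["drive_level"], a["drive_level"])
```

Both methods produce the same average brightness; the difference is that under PWM the emitter always runs at its maximum stress level whenever it's on, which is the point about phosphor/pixel wear above.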