I've read a lot about OLED burn-in issues. It's clear that they exist and that for use cases that exacerbate the problem they can be a big deal. The same was true with plasma, though I don't know whether plasma or OLED is more susceptible.
Back around 2004 or so, I helped my mom pick out a high-end, well-regarded Pioneer plasma panel. She liked to leave it on CNN for background noise while she puttered around the house during the day. She passed away ~5 years later, and I inherited the TV. When I first set it up in my house, I could see some moderate burn-in on the lower portion of the screen where the crawl and the logo were always on. Over time, the burn-in faded away and was unnoticeable. (Not implying that OLED burn-in is reversible, btw; just illustrating that my usage habits are such that burn-in is not a big risk for me.)
That plasma TV was my first (and only, until just now) flat-panel TV, and it remained my primary screen until just yesterday, when my Black Friday OLED purchase arrived. That's ~15 years of "primary use", which is longer than any other TV I've ever owned.
When LCD/LED took over from plasma as the dominant (and for several years, only) TV technology, I felt like it was akin to the VHS/Betamax or other tech wars where the "better" technology lost out. OLED is still "young" enough that I may grow to regret my decision (either because we learn that this technology has flaws that emerge over time, or because it is rapidly eclipsed by something better), but for now, I'm excited to have the choice of an emissive display technology again. And based on my past experience with a "burn-in risky" technology, I'm pretty confident that my viewing habits will mitigate that risk.
As others have said...YMMV.