AVS › AVS Forum › Display Devices › Flat Panels General and OLED Technology › OLED TVs: Technology Advancements Thread

OLED TVs: Technology Advancements Thread - Page 97

post #2881 of 9448
Quote:
Originally Posted by Chronoptimist View Post

Am I understanding your post correctly, and you just don't believe that the delay numbers are accurate?

I have verified the delay numbers for my own set through various different methods, including but not limited to, high-speed photography comparing it and a reference CRT (1/4000s, 8fps) and measuring the audio delay introduced by the display for synchronisation. (changes with picture settings and matches other results)

Switching between game and theatre modes has an immediately noticeable impact on controlling games. Even if you don't believe that the measured results for game mode are accurate (the 33ms delay) the relative difference between game/theatre is.

For what it's worth, the difference between game mode and theatre is that it switches to 4:4:4 chroma rather than 4:2:2, drops to 60Hz from 480Hz and decouples the local-dimming array from the picture. (lags a frame or two behind the panel) The contrast is lowered to compensate for this. (only dims zones down to 3,000:1 rather than "infinite" black level)
Not sure I follow what you mean here, BFI does not operate at 30Hz. With my set, there is a subtle but noticeable flicker introduced when enabling 480Hz backlight scanning if you're looking for it. (and a very obvious drop in brightness)

Tests have shown that pilots can identify an aircraft from a single image flashed at 1/220s. (approx 4ms)


The best LCDs already cover the HD spec, increasing the gamut further is actually detrimental to image quality.

QDLED displays (rather than QD backlit LCDs) could prove to be an interesting alternative to OLED but I don't expect to see them for a number of years yet.


i raised the same question about qleds here a while back and got shot down by some stating their toxicity precludes widespread adoption. now we learn oleds have some toxic issues as well, so i'm not sure where we are.
post #2882 of 9448
Quote:
Originally Posted by Chronoptimist View Post

Am I understanding your post correctly, and you just don't believe that the delay numbers are accurate?

I have verified the delay numbers for my own set through various different methods, including but not limited to, high-speed photography comparing it and a reference CRT (1/4000s, 8fps) and measuring the audio delay introduced by the display for synchronisation. (changes with picture settings and matches other results)

Switching between game and theatre modes has an immediately noticeable impact on controlling games. Even if you don't believe that the measured results for game mode are accurate (the 33ms delay) the relative difference between game/theatre is.

For what it's worth, the difference between game mode and theatre is that it switches to 4:4:4 chroma rather than 4:2:2, drops to 60Hz from 480Hz and decouples the local-dimming array from the picture. (lags a frame or two behind the panel) The contrast is lowered to compensate for this. (only dims zones down to 3,000:1 rather than "infinite" black level)
Not sure I follow what you mean here, BFI does not operate at 30Hz. With my set, there is a subtle but noticeable flicker introduced when enabling 480Hz backlight scanning if you're looking for it. (and a very obvious drop in brightness)

Tests have shown that pilots can identify an aircraft from a single image flashed at 1/220s. (approx 4ms)

Like I said, I believe in display lag, but I don't believe it is the TOTAL delay introduced. As a PC guy, when we debottleneck we do it along the whole chain and find the weakest link instead of focusing on a single component.

BFI is not backlight scanning. It is obsolete now and was used years ago to reduce LCD hold time.

You are a gamer. There are sites that test your reaction time. You can go verify your reaction time and see how its order of magnitude compares to 33ms. As for the oft-quoted pilots' test, it is another case of skewed "science". Firstly, you can see it on a plain screen, but can you react fast enough? Secondly, I doubt anyone would notice if that single flash happened in CoD. That's how the brain works heuristically: it focuses on what we deem important and ignores details. A single flash amid a myriad of complex surroundings is not important to the brain.
post #2883 of 9448
Thanks for the quick response! If you don't mind, I have a few follow-up questions.

Quote:
Originally Posted by wjchan View Post

In terms of bells and whistles, the PVM is actually fairly bare when compared to a consumer LCD. You can set the gamma, the color LUT, deinterlace mode and that's about it. The BVM has a lot more features.

Hmmm. That actually surprises me a bit. I assumed it would be bare with regard to "pro" features...but I also assumed that its features would be a superset of commonly available consumer features. So are you saying that the PVM is missing standard picture controls such as Brightness, Contrast, Color and Tint?

Quote:


1. The contrast is extremely high and the monitor can get very bright. If you have eye issues due to Lasik, you might experience discomfort.

So again, no brightness setting? I'll be using the display in a dark environment, so if the picture is too bright and can't be toned down, that's a concern. I use my current LCD with the backlight set pretty close to its lowest value.

Quote:


2. Burn-in is real even though I have not experienced it yet. Be careful if you watch a lot of non-16:9 content.

I know OLED is susceptible to burn-in, and I certainly do watch a lot of movies in scope format, but I won't be watching TV on the thing. Only movies at most a few times a week. I figured that kind of usage probably pales in comparison to a professional using the thing 8 hours a day.

Quote:


3. Screensaver is very aggressive. It kicked in while I was editing a home movie with a hummingbird feeding her babies. The content displayed was about 80% static and the screensaver kicked in after 10 minutes.

I assume this can't be turned off then? It would be annoying having a screensaver kick in during a slow, fairly static scene.


Quote:


I believe the PVM-1741 is supposed to start shipping in November. These OLED displays are in very high demand. If you want to get your hands on one quickly, I suggest you contact a Sony Pro dealer ASAP. Thinking that the Hollywood-based dealers would get the most allocation, I put down a deposit with a Sony Pro dealer in the Burbank area. I was lucky enough to get a monitor in one of the first shipments.

I have not done this yet, but one fun thing you can do with this monitor is to pair it with an eeColor box. With the Large Color Gamut Add-On Pack, you can do quite a bit of tweaking with the color.

Thanks for the heads-up. Based on your responses, I'll have to give some serious thought to whether or not this display addresses my needs.

Thanks!
post #2884 of 9448
Quote:
Originally Posted by byancey View Post

Thanks for the quick response! If you don't mind, I have a few follow-up questions.



Hmmm. That actually surprises me a bit. I assumed it would be bare with regard to "pro" features...but I also assumed that its features would be a superset of commonly available consumer features. So are you saying that the PVM is missing standard picture controls such as Brightness, Contrast, Color and Tint?



So again, no brightness setting? I'll be using the display in a dark environment, so if the picture is too bright and can't be toned down, that's a concern. I use my current LCD with the backlight set pretty close to its lowest value.

It has all the basics but not the fancy dark-frame insertion, mosquito noise reduction, etc. You can create a login and download the full manual from the UK Sony site.

One more thing. 24FPS flickers quite a bit. You can always convert 24FPS to 60FPS using an external scaler. 1080p48 isn't officially supported and I haven't tried that yet.
post #2885 of 9448
Quote:
Originally Posted by specuvestor View Post

Like I said, I believe in display lag, but I don't believe it is the TOTAL delay introduced. As a PC guy, when we debottleneck we do it along the whole chain and find the weakest link instead of focusing on a single component.

When there are multiple different ways to measure the lag (compare to reference display, measure audio delay, measure time from key press to action being displayed etc) its not that difficult to figure out how much delay is caused by the display.

When the only variable that changes is a picture mode on the display, you know exactly how much delay is introduced, and the cause. Adding 67ms is very noticeably less responsive with my current screen.
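The arithmetic behind the high-speed-photo method is simple: if you know the source frame rate, the frame-counter offset between the reference display and the TV converts straight to milliseconds. A minimal sketch (the frame numbers are hypothetical, purely for illustration):

```python
# Estimate display lag from paired high-speed photos of a test display
# and a zero-lag reference (e.g. a CRT), both showing a running frame counter.
# The frame numbers below are made up for illustration only.

FRAME_PERIOD_MS = 1000 / 60  # 60 Hz source: ~16.7 ms per frame

def display_lag_ms(reference_frame: int, test_frame: int) -> float:
    """Lag is however many source frames the test display trails the reference."""
    return (reference_frame - test_frame) * FRAME_PERIOD_MS

# e.g. the CRT shows frame 240 while the TV still shows frame 238:
print(f"{display_lag_ms(240, 238):.1f} ms")  # 33.3 ms, i.e. two frames behind
```

Any zero-lag reference works here; the CRT just provides a display with effectively no processing delay to photograph against.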

Quote:
Originally Posted by specuvestor View Post

You are a gamer. There are sites that test your reaction time. You can go verify your reaction time and see what order of magnitude it is to 33ms. As to the oft quoted pilots' test, it is another case of skewed "science". Firstly you can see in a plain screen but can you react fast enough? Secondly I doubt anyone can see if the single flash happens in CoD. That's how the brain works heuristically: it focuses on what we deem important and ignores details. A single flash in a myriad of complex environment is not important to the brain.

This is not about your reaction times. It is about the display reacting to your inputs.

When there is more than about 33ms, there is a noticeable disconnect between moving my arm (to control the mouse) and my view changing in-game. (or the mouse cursor moving) It starts to feel like I'm dragging my view around, rather than controlling my view directly with the mouse.

It's not about how quickly I can react to what the screen is showing me (though if you are up against someone else, you want to remove as many disadvantages as possible); it's primarily about removing the disconnect between my actions in the real world and their appearing on the screen.

I would prefer lower if I could get it, but the only options other than gaming LCD monitors (which are small and low quality) are plasmas (Panasonic have had some with a 16ms delay), which are not really suitable for long periods of gaming and give me headaches from the flickering, or broadcast monitors (particularly the new OLED ones from Sony), which are also small, and way out of my budget.

I'm hoping that the new HMZ-T1 will be 16ms or lower as it is also OLED, but it is a consumer device, rather than broadcast, so it's probably less of a priority.


John Carmack has recently been testing head mounted displays and head tracking, and concluded that 60fps tracking (16ms) isn't enough, and that for him, 120 fps (8ms) was necessary.
Quote:


I re-did 60/120/180 fps tests recently. I think 120 is critical for truly believable head tracking
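The refresh rates being thrown around convert to frame periods with nothing more than 1000/Hz, which is where the 16ms and 8ms figures come from. A quick sketch:

```python
# Frame period implied by each refresh rate mentioned in the thread:
# simply 1000 / Hz, the source of the 16 ms and 8 ms figures above.
for hz in (60, 120, 180, 480):
    print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")
```

Note this is only the sampling interval; actual input-to-photon latency stacks on top of it.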
post #2886 of 9448
^^ I've said what logic and sense tell me: that 33ms cannot be noticeable. If it is noticeable (which I accept based on user feedback), then the TOTAL lag must be much longer. So I'll just leave this together with other issues, like whether CD is right to cut off at 20kHz.

On other developments, this just out from Nokia:
"The Lumia 800 features an outer design that's much shared with the MeeGo-based N9. Instead of the 3.9-inch on the N9, the Lumia 800 has to make room for Windows Phone buttons, so instead there's a 3.7-inch AMOLED ClearBlack curved display. It also packs a 1.4 GHz Qualcomm processor with hardware acceleration and graphics. The 8 MP Carl Zeiss optics with dual LED flash is activated with a dedicated camera button – something that the N9 doesn't have. It however, gives up a front-facing camera from the N9 in exchange for a status LED. It has 16GB of internal user memory and 25GB of free SkyDrive storage for storing images and music. For memory, the Lumia 800 has 512MB, down from the 1GB in the N9. The estimated retail price for the Nokia Lumia 800 will be approximately 420 EUR ($480), excluding taxes and subsidies – also down from the 600 EUR ($830) N9. Pick from cyan, magenta and black."
post #2887 of 9448
The display is not curved. The glass that wraps around the phone is slightly curved. Period.
post #2888 of 9448
Why PHOLEDs are far better in power efficiency than fluorescent OLEDs:

http://en.wikipedia.org/wiki/Phospho...emitting_diode

Quote:


Like all types of OLED, phosphorescent OLEDs emit light due to the electroluminescence of an organic semiconductor layer in an electric current. Electrons and holes are injected into the organic layer at the electrodes and form excitons, a bound state of the electron and hole.

Electrons and holes are both fermions with half integer spin. An exciton formed by the recombination of two such particles may either be in a singlet state or a triplet state, depending on how the spins have been combined. Statistically, there is a 25% probability of forming a singlet state and 75% probability of forming a triplet state.[2][3] Decay of the excitons results in the production of light through spontaneous emission.

In OLEDs using fluorescent organic molecules only, the decay of triplet excitons is quantum mechanically forbidden by selection rules, meaning that the lifetime of triplet excitons is long and phosphorescence is not readily observed. Hence it would be expected that in fluorescent OLEDs only the formation of singlet excitons results in the emission of useful radiation, placing a theoretical limit on the internal quantum efficiency (the percentage of excitons formed that result in emission of a photon) of 25%.[4]

However, phosphorescent OLEDs generate light from both triplet and singlet excitons, allowing the internal quantum efficiency of such devices to reach nearly 100%.
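The 25%/75% spin statistics quoted above are exactly where the efficiency ceilings come from; a trivial sketch of the arithmetic:

```python
# Internal quantum efficiency ceilings implied by the quoted spin statistics:
# excitons form as 25% singlets and 75% triplets.
P_SINGLET, P_TRIPLET = 0.25, 0.75

iqe_fluorescent = P_SINGLET                  # only singlet decay emits light
iqe_phosphorescent = P_SINGLET + P_TRIPLET   # both exciton states harvested

print(iqe_fluorescent, iqe_phosphorescent)   # 0.25 1.0
```

That factor-of-four ceiling is the whole basis for the PHOLED power advantage discussed below; real devices fall short of both limits due to outcoupling and other losses.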

If anyone has access, this IEEE article directly covers what we're discussing:

http://ieeexplore.ieee.org/Xplore/lo...hDecision=-203

Quote:


We model and analyze the power consumption and resulting temperature rise in active-matrix organic-light-emitting device (AMOLED) displays as a function of the OLED efficiency, display resolution and display size. Power consumption is a critical issue for mobile display applications as it directly impacts battery requirements, and it is also very important for large area applications where it affects the display temperature rise, which directly impacts the panel lifetime. Phosphorescent OLEDs (PHOLEDs) are shown to offer significant advantage as compared to conventional fluorescent OLEDs due to high luminous efficiency resulting in lower pixel currents, reducing both the power consumed in the OLED devices and the series connected driving thin-film transistor (TFT). The power consumption and temperature rise of OLED displays are calculated as a function of the device efficiency, display size, display luminance and the type of backplane technology employed. The impact of using top-emission OLEDs is also discussed.

So, absent any direct comparisons of 2011 OLED displays that use only PHOLED red vs those that will be using both red and green PHOLED it stands to reason that we should expect the types of savings Universal Display's website suggests, all other things being equal.
post #2889 of 9448
Quote:
Originally Posted by specuvestor View Post

With the passing of Steve Jobs and the dismal launch of 4S, the biggest winner looks to be S2.

Looks to be gaining even more momentum, though the caveat below is that Sammy sells many more models than Apple.

"Sammy overtook Apple in 3Q to become the world's largest smart
phone seller. Sammy shipped 27.8mn units in 3Q, taking 23.8%
of the market. vs. 3.7% M/S in 2009
#1 Sammy sold 27.8mn units, 23.8% M/S
#2 Apple 17.1mn 14.6% M/S
#3 Nokia 16.8mn 14.0% M/S
Sammy's telecommunication div posted the largest ever earnings
mainly due to Galaxy S2 sales
REV 14.9wtn (+22% QoQ, +37% YoY)
OP 2.52wtn (1st time to reach W2tn lvl)
OPM 16.9% (beats Semicon div OPM of 16.8% for the 1st time)"
post #2890 of 9448
They didn't give much info on future OLED plans on the English call, but one thing is for sure: mobile OLEDs are very profitable right now. It is the only thing saving SMD. They didn't give any details, but they made it pretty clear that they plan on being less reliant on LCDs going forward as they transition from smartphones to tablets to televisions.

Also interesting to hear that they will sell flexible displays in handsets next year.

Slacker
post #2891 of 9448
"....the dismal launch of 4S"

The most successful phone launch ever, you must be referring to. 4 million in less than 1 week.

http://news.cnet.com/8301-13506_3-20...first-weekend/

That's compared to 10 million Galaxy S II phones in 5 months.

http://www.electronista.com/articles....s.10m.record/

The Samsung "kitchen sink" approach combined with a later launch that expected certainly did propel Samsung past Apple in Q3. You want to make that bet again for Q4?
post #2892 of 9448
Lest we forget, we were expecting the iPhone 5. Versus that expectation, it was dismal.

I'll give the new Apple team the benefit of the doubt, but I would say the Galaxy S will be gaining market share (not top selling) in 4Q as well. Like I said, the caveat is that it refers to multiple Samsung models.
post #2893 of 9448
Quote:
Originally Posted by pdoherty972 View Post

If anyone has access, this IEEE article directly covers what we're discussing:

http://ieeexplore.ieee.org/Xplore/lo...hDecision=-203

I have ordered it but I think most of the information has been updated in recent Universal Display papers 2010-2011 from SID which I already have.

Here is a chart from a 2010 paper from Universal Display with projections on future PHOLED displays vs future LCD displays.

post #2894 of 9448
Yeah, but the guy who led that study is a hack.
post #2895 of 9448
Quote:
Originally Posted by mr. wally View Post

now we learn oleds have some toxic issues as well, so i'm not sure where we are.

Where did you learn that? That would be news to me, and to GE as well, who made this video on OLED white lighting.

http://www.efactormedia.com/archive/ge_oled/index.html
post #2896 of 9448
Quote:
Originally Posted by specuvestor View Post

Lest we forget, we were expecting the iPhone 5. Versus that expectation, it was dismal.

Only in Apple land is the greatest smartphone launch in history "dismal". People on the internet wanted a radically new model; in the real world, this is the fastest-selling phone ever. Period.
Quote:


I'll give the new Apple team the benefit of doubt but I would say Galaxy S would be gaining market share (not top selling) in 4Q as well. Like I said the caveat is that it refers to multiple Samsung models.

Samsung has a lot of models, so they will probably outship Apple even in Q4. It does appear Apple cares a bit and is slowly expanding its portfolio, but I imagine any significant regaining of market share will take time.

Man, I'm a sucker for bad reporting... While I still believe Samsung outsold Apple in Q3, it's very much worth reading this:

http://www.loopinsight.com/2011/10/2...phone-numbers/

Q4 will be much closer -- in reality, not fake shipment numbers -- than this fake 10 million unit advantage would have you believe.
post #2897 of 9448
Quote:
Originally Posted by xrox View Post

I have ordered it but I think most of the information has been updated in recent Universal Display papers 2010-2011 from SID which I already have.

Here is a chart from a 2010 paper from Universal Display with projections on future PHOLED displays vs future LCD displays.


Entire 32" TVs will use under 15w just 2 years from now? That's quite frankly amazing.

If you take 4 of those displays and slam them together, it would seem you get the following power consumption for a 65" display in 2016:

LCD -- 22w (4 x 4 for display + 6 for electronics)
OLED -- 10w (4 x 2 for display + 2 for electronics)

Part of me finds these forecasts entirely unbelievable, which is different from me saying "no way in hell". If this is remotely true, it should end the debate as to whether OLED is going to be able to be sold on power consumption: there isn't any chance of that. While people like me will look to find 10w here or there, that's beneath the noise threshold from a cost perspective, a reasonable "greening" of one's house perspective, etc.

A 12-watt difference is less than what you'd save by replacing a single incandescent bulb of 40w or above with any other bulb technology, quite likely including improved halogens. Since the average TV is not going to be 65" or replaced anytime soon, even if we multiply this by "a billion", we'd save about 6 billion watts x 2,500 hours, or 15 trillion watt-hours, or in more commonly used terms 15 million megawatt-hours.

In 2008, the world consumed about 17 billion megawatt-hours. Replacing every TV in the world with an OLED TV rather than an LCD TV would cut that consumption by less than 1/1000th! (Hint: We should focus on the light bulbs.) Because building TVs is energy intensive, focusing on replacing TVs is a bad idea anyway. There are other replacement items where the cost to build them is covered in the first year or two of operation (e.g. solar panels). TVs that are using 500w are probably worthy of being replaced for energy reasons alone. TVs using <100w? Not so much.
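The same back-of-envelope estimate as a sketch (all inputs are the post's stated assumptions, not measured data: roughly 6W average saving per TV, a billion TVs, 2,500 hours per year):

```python
# Reproduce the back-of-envelope savings estimate above. All inputs are
# the post's assumptions: ~6 W average saving per TV (the 65" gap is 12 W,
# but the average TV is smaller), a billion TVs, 2,500 viewing hours/year.
tvs = 1_000_000_000
avg_saving_w = 6
hours_per_year = 2_500

saved_wh = tvs * avg_saving_w * hours_per_year   # 1.5e13 Wh = 15 trillion Wh
saved_mwh = saved_wh / 1e6                       # 15 million MWh
world_mwh_2008 = 17e9                            # ~17 billion MWh consumed

print(f"{saved_mwh:,.0f} MWh saved per year")
print(f"about 1/{world_mwh_2008 / saved_mwh:.0f} of world consumption")
```

The fraction comes out around 1/1133, consistent with the "less than 1/1000th" figure above.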
post #2898 of 9448
Quote:
Originally Posted by rogo View Post

15 trillion watt/hours or in more commonly used terms 15 megawatt/hours annually (the size of a fraction of a power plant).

Correct me if I'm wrong, but wouldn't 15 trillion watt-hours equal 15 million megawatt-hours, or 15 terawatt-hours? I follow your math that this would be 1/8760th of the world's energy consumption (i.e. one hour's worth), just wondering about the conversion from Wh to MWh.
post #2899 of 9448
Quote:
Originally Posted by specuvestor View Post

Lest we forget, we were expecting the iPhone 5. Versus that expectation, it was dismal.

New CPU & GPU outperforming the 4 considerably. (twice the CPU power, 8x the GPU power, fastest phone on the market by a huge margin)
New camera with higher pixel count, lower noise, and considerably improved optics.
New dual antenna design. (avoids potential signal problems of 4, now a "world phone")
New operating system.
New "assistant" feature.
Larger battery.

Aside from the external appearance, which is still the best industrial design out there by some margin, it is a completely new phone.

The only people that seem to be disappointed were those who wanted a low-resolution device that isn't pocketable or able to be operated with one hand.

Quote:
Originally Posted by rogo View Post

Entire 32" TVs will use under 15w just 2 years from now? That's quite frankly amazing.

It's really exciting just how efficient products are getting these days now that companies are really pushing for it.

I have two Sony LCDs here, both 2010 models, one is the highest-end HX900 and the other is a lower end model. Both are the same size, and the local-dimmed HX900, which puts out a much better picture uses about 1/3 to 1/4 the power of the lower-end CCFL backlit screen.

We're just about to try switching to LED lighting, and while I'm yet to be convinced about the quality of light from those bulbs, the power savings are massive, going from about 200W in one fixture to 20W. Due to rising costs and the amount of use this particular fixture sees, the bulbs should pay for themselves in two years.

Hopefully OLED lighting will be available in a few years at lower prices with improved light quality and higher efficiency.
post #2900 of 9448
Quote:
Originally Posted by HogPilot View Post

Correct me if I'm wrong, but wouldn't 15 trillion watt/hours equal 15 million megawatt hours, or 15 terawatt hours? I follow your math that this would be 1/8760th of the world's energy consumption (i.e. one hour worth), just wondering about the conversion from w/h to mw/h.

Oh lord, did I cheat again. I divided the 15 trillion watt-hours by a million to get 15 million and then just discarded the million... [ /wristslap] Let me go fix that.
post #2901 of 9448
Quote:
Originally Posted by rogo View Post

In 2008, the world consumed about 17 billion megawatt/hours. The amount that replacing every TV in the world with an OLED TV would save vs. using an LCD TV would cut that consumption by less than 1/1000th! (Hint: We should focus on the light bulbs.)

Luckily, OLED used for white lighting will be doing that as well (it has already begun - see the GE video I posted a few posts back, or the commercial offerings of Konica Minolta, Philips, Novaled, etc).
post #2902 of 9448
Quote:
Originally Posted by Chronoptimist View Post

New CPU & GPU outperforming the 4 considerably. (twice the CPU power, 8x the GPU power, fastest phone on the market by a huge margin)

Most of the tests (like the one from CNET below) show the S2 to be faster (and don't forget the USA T-Mobile S2 is a 1.5GHz CPU, not 1.2GHz like the international one). And we already know it's far faster on the cell network, being a 4G phone.


CNET - Samsung Galaxy S2 vs iPhone 4S:

http://www.youtube.com/watch?v=lXhjAgRDhT8
post #2903 of 9448
Quote:
Originally Posted by pdoherty972 View Post

Luckily, OLED used for white lighting will be doing that as well (it has already begun - see the GE video I posted a few posts back, or the commercial offerings of Konica Minolta, Philips, Novaled, etc).

OLED lighting, in any meaningful quantity, is a long, long way off. It is awesome that big-name companies are working on commercializing lighting, but it is only going to be applicable to niche areas for quite a while.

Slacker
post #2904 of 9448
Quote:
Originally Posted by pdoherty972 View Post

Luckily, OLED used for white lighting will be doing that as well (it has already begun - see the GE video I posted a few posts back, or the commercial offerings of Konica Minolta, Philips, Novaled, etc).

As I've stated, LED lighting will absolutely obliterate OLED lighting in the marketplace for at least the balance of this decade, probably longer. The niche applications for OLED lighting will cover the small fraction of the market it will capture and that's about it.
post #2905 of 9448
Quote:
Originally Posted by rogo View Post

Oh lord, did I cheat again. I divided the 15 trillion hours by a million to get 15 million and then just discarded the million... [ /wristslap] Let me go fix that.

The end result is the same - i.e. best case scenario, minuscule power savings. Just wanted to make sure I was doing my conversions correctly.
post #2906 of 9448
Quote:
Originally Posted by pdoherty972 View Post

Most of the tests (like the one from CNET below) show the S2 to be faster (and don't forget the USA T-Mobile S2 is a 1.5GHz CPU, not 1.2GHz like the international one). And we already know it's far faster on the cell network, being a 4G phone.


CNET - Samsung Galaxy S2 vs iPhone 4S:

http://www.youtube.com/watch?v=lXhjAgRDhT8

That video is an iPhone 4 compared to the HTC Evo 4G, not the new iPhone 4S.

Despite the 800MHz clockspeed, the iPhone 4S still beats the 1.5GHz Galaxy SII in tests.
post #2907 of 9448
Quote:
Originally Posted by Chronoptimist View Post

That video is an iPhone 4 compared to the HTC Evo 4G, not the new iPhone 4S.

Despite the 800MHz clockspeed, the iPhone 4S still beats the 1.5GHz Galaxy SII in tests.

Yeah, looks like the video was mislabeled. And I didn't notice they only showed iPhone 4 in the summary at the end.
post #2908 of 9448
Quote:
Originally Posted by Chronoptimist View Post

New CPU & GPU outperforming the 4 considerably.
... stuff about iPhone...

New operating system.
New "assistant" feature.
Larger battery.

Aside from the external appearance, which is still the best industrial design out there by some margin, it is a completely new phone.

The only people that seem to be disappointed were those who wanted a low-resolution device that isn't pocketable or able to be operated with one hand.

This is terrifying, and I have some calls in to Lucifer asking about ice forming in Hades, but: I agree with all of this. That said, I did play with a Galaxy, and unless you hold it jammed into your thumb/forefinger joint, you can operate it with one hand for many, many things with "average-sized" hands. Most importantly, though, you, me, and the marketplace seem to agree. Tech fanboys and some analysts are the only people on the other side of the fence.
Quote:


It's really exciting just how efficient products are getting these days now that companies are really pushing for it.

I have two Sony LCDs here, both 2010 models, one is the highest-end HX900 and the other is a lower end model. Both are the same size, and the local-dimmed HX900, which puts out a much better picture uses about 1/3 to 1/4 the power of the lower-end CCFL backlit screen.

We're just about to try switching to LED lighting, and while I'm yet to be convinced about the quality of light from those bulbs, the power savings are massive, going from about 200W in one fixture to 20W. Due to rising costs and the amount of use this particular fixture sees, the bulbs should pay for themselves in two years.

Hopefully OLED lighting will be available in a few years at lower prices with improved light quality and higher efficiency.

I follow the green blogs fairly closely. I haven't seen one iota of evidence that OLED lighting is going to be used in the next 5 years as drop-in replacements for the billions of light bulbs in use. LED on the other hand is. If you are patient and wait for the new Switch Lighting bulbs and compare those to the Philips, I think you'll find nice options. The light quality from what I've seen looks pretty good.
post #2909 of 9448
Quote:
Originally Posted by rogo View Post

I follow the green blogs fairly closely. I haven't seen one iota of evidence that OLED lighting is going to be used in the next 5 years as drop-in replacements for the billions of light bulbs in use. LED on the other hand is. If you are patient and wait for the new Switch Lighting bulbs and compare those to the Philips, I think you'll find nice options. The light quality from what I've seen looks pretty good.

OLED lighting isn't destined for the light bulb replacement market. The first mass markets will depend on new installations, such as new office buildings choosing OLED ceiling tiles rather than fluorescent tube lighting. I could also see somebody choosing to use OLEDs rather than downlighting when finishing a basement. The diffuse light would help eliminate dark areas. Of course, everything depends on eventually commercializing roll-to-roll manufacturing so they can bring down the costs.


Slacker
post #2910 of 9448
Quote:
Originally Posted by slacker711 View Post

OLED lighting isn't destined for the light bulb replacement market. The first mass markets will depend on new installations, such as new office buildings choosing OLED ceiling tiles rather than fluorescent tube lighting.

Right, that makes perfect sense. This means as a practical matter it would be limited to single-digit percentages of the new office-building lighting market for the foreseeable future. This makes bullish forecasts associated with OLED lighting completely bizarre.

Quote:


I could also see somebody choosing to use OLEDs rather than downlighting when finishing a basement. The diffuse light would help eliminate dark areas.

If and when it's reasonably affordable.

Quote:


Of course, everything depends on eventually commercializing roll-to-roll manufacturing so they can bring down the costs.

Which, of course, hasn't happened yet.