BenQ W1070 : DLP Full HD, 3D Ready with lens-shift for 1000$ - Page 400 - AVS Forum | Home Theater Discussions And Reviews
post #11971 of 11994 Old 03-16-2017, 04:54 AM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
Quote:
Originally Posted by qal1h View Post
My experience with the 1070 in a few lines:

Great projector, nice colours, bright and very good video quality, no tearing or other issues.

A few things to note: I do sometimes get the rainbow effect, but it's only slightly noticeable, not a big issue at all.

The projector is a tad bright. I've not adjusted the settings at all; the original colour config is fine and I don't know where to start with adjusting it. We generally use it on a white wall, and the blacks aren't very deep at all.

However, we got a 150" grey projector screen and the colours look brilliant, so much better than on a white wall; the blacks are now deep and the overall picture quality is much improved.
Can you do me a favour? Try running the projector at 71.928 Hz (using my settings above) from your PC, and tell me if that makes the rainbows disappear.

In theory, going from 60 Hz to 72 Hz could help reduce RBE, since the colour wheel speed is synced to the refresh rate, and a 20% bump in wheel speed is nothing to sneeze at. It's like going from a 5X to a 6X colour wheel speed. Actually, exactly like that, since the W1070 drops to 5X at 60 Hz, so running it at 72 Hz likely brings it back up to the equivalent of 6X.

I've been meaning to try painting my wall gray, although reducing the brightness doesn't actually increase the contrast AFAIK: you're cutting both the white level and the black level by the same amount, so the ratio between them stays the same no matter what.
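For reference, here is that wheel arithmetic spelled out (a quick Python sketch; the assumption that the wheel locks to a whole multiple of the refresh rate is mine, inferred from the 5X-at-60 Hz behaviour above):

Code:
# Assumed: colour wheel speed is an integer multiple of the refresh rate,
# and the W1070 drops to a 5X multiplier at 60 Hz.
base_refresh = 60.0    # Hz
new_refresh = 72.0     # Hz (71.928 for film cadence)
multiplier = 5         # wheel multiplier at 60 Hz

cycles_60 = base_refresh * multiplier   # 300 colour cycles per second
cycles_72 = new_refresh * multiplier    # 360 colour cycles per second

# Same segment rate a 6X wheel would deliver at 60 Hz:
print(cycles_72 / base_refresh)         # -> 6.0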
RLBURNSIDE is online now  
post #11972 of 11994 Old 03-19-2017, 01:39 AM
andyxoxo | Member
Join Date: Jun 2011 | Posts: 36
Quote:
Originally Posted by RLBURNSIDE View Post
The smart move here is to replace your bulb before that happens, to avoid collateral damage (which can affect other components, including the colour wheel) and a big messy glass cleanup after the fact.

Commercial cinemas figured this out years ago. You do not want explosions. These bulbs are dirt cheap for the lifetime they offer: 80 bucks shipped from Amazon for 4,000-5,000 hours of usable lifespan is a bargain.

I would recommend that all W1070 users replace their bulbs at no longer than 4,500 hours. The bulb will be quite dim by then and due for a freshening up anyway.
Ya, that's very good advice and I am going to take it. I have one question though: I see that bulbs with housing are sold, and also just the bare bulbs themselves. Since it seems you have taken your W1070 apart (a few times ;-), have you seen heat damage or distortion on the housing that would require (or make it sensible for) an owner to replace the housing and bulb together? Or after prolonged usage can we get away with replacing only the bulb at 4,500 hours? I'm curious to know your recommendation for someone like me who has never had to replace the bulb yet: can we just go for the bulb, or should we change the housing as well for best optical performance? Cheers
andyxoxo is offline  
post #11973 of 11994 Old 03-19-2017, 12:30 PM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
Funny you should mention that: when my bulb exploded, it cracked the glass at the front of the housing, and I couldn't order that separately, so I had no choice but to order a new housing. But the 80-dollar bulbs now on Amazon often include the housing too, and it's convenient to have a spare. If you don't need it, though, and nothing's broken, then definitely save 30 bucks and get just the bulb without the housing.
RLBURNSIDE is online now  
 
post #11974 of 11994 Old 03-27-2017, 07:42 PM
DansWife | Newbie
Join Date: Mar 2017 | Posts: 3
Quote:
Originally Posted by monakh View Post
Assuming the bulb isn't bad, then yes, that's what the service manual says. I have no way to test this, nor can I test the blower to see whether it is functional.

I took my PJ apart. It was a b*tch to keep track of all the screws. The stupid fan was almost at the tail end of the disassembly! The fan is a blower type, ADDA AB5012DX-A03, with a standard PC-server-style 3-pin connection. I wish I had a mainboard to test it. I ordered one on eBay for $25 from China, though you can find it 30% cheaper at some wholesale Chinese stores; I just didn't have the patience to sign up and go through checkout.

I'll re-assemble the PJ in a few weeks when the new fan arrives. If this sucker works, I'll just sell it.
Thank you! I needed the blower fan part # and was not looking forward to disassembly just to get it!
DansWife is offline  
post #11975 of 11994 Old 03-29-2017, 11:47 AM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
Just ordered a new 25 ft VGA cable to get 444 + 10-bit native at 1080p 71.928 Hz, in prep for the Windows 10 Creators Update:

https://www.amazon.ca/gp/product/B00...?ie=UTF8&psc=1

72 Hz is a winner on this projector, and finally I'll be able to get true 10-bit (downscaled) UHD content without using fullscreen exclusive mode or dropping to 422 chroma.

Banding is one of the most annoying video artifacts for me, and this should cut it down by a factor of four. Hopefully without much in the way of analog signal artifacts from using a longish-run VGA cable instead of HDMI; I'm hoping the shielding and the many good reviews of this cable bear that out for FHD projector use.

According to the W1070's service manual, the entire internal signal chain is capable of 10-bit, including via the analog inputs (VGA and component). The reason I'm not using component is that I couldn't find any adapters with guaranteed native 72 Hz capability. It might work, but why take the chance? I also have a DP 1.2a to VGA adapter with 10-bit DACs in case I ever upgrade to a video card with no DVI-I connector; I believe DVI is being deprecated on many newer cards. I haven't tested whether that DP adapter actually does 72 Hz VGA output via a custom resolution, but it should definitely handle 10-bit 444 at 60 Hz at least. For me, though, I really want 10-bit + 444 + 72 Hz, or even 75 Hz, for gaming. Every bit counts.

These two things combined represent an effective 56% overclock over a typical 1080p60 8-bit HDMI signal. Using a VGA cable gives us the 10-bit "for free", without increasing bandwidth requirements.
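For what it's worth, the arithmetic behind that figure is below (a quick sketch; note that 72 Hz at 10-bit works out to +50%, while the 56% matches the 75 Hz case mentioned above):

Code:
# Effective "overclock" vs a 1080p60 8-bit signal: the pixel count is
# unchanged, so only refresh rate and bit depth scale the data rate.
baseline = 60 * 8                       # Hz x bits per channel
for hz in (71.928, 75.0):
    for bits in (8, 10):
        gain = (hz * bits) / baseline - 1
        print(f"{hz:g} Hz @ {bits}-bit: {gain:+.1%}")
# -> roughly +20% / +50% for 71.928 Hz, +25% / +56% for 75 Hz.
# 10-bit also means 1024 vs 256 code values per channel, which is
# where the "cut banding by a factor of four" estimate comes from.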
RLBURNSIDE is online now  
post #11976 of 11994 Old 03-30-2017, 02:59 PM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
These settings work for 72 Hz but not 71.928 Hz, both direct out of my GTX 970's DVI into the 25' VGA cable and through the DisplayPort 1.2a -> VGA adapter.

I have no real, concrete way of verifying whether either actually results in a true 10-bit SDR signal via the Deep Color setting in Alien Isolation, but via the DP -> VGA adapter there are no visual artifacts with these manual timings:

Active: 1920 x 1080
Front porch: 30 / 2
Sync width: 30 / 5
Total: 2080 x 1100
Polarity: + / -
Refresh: 72.000 Hz

(The pixel clock ends up at 164.736 MHz.)
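As a sanity check (reading those values in the CRU-style active / front porch / sync width / total layout, which is my interpretation of the dialog), the quoted pixel clock follows directly from the totals and refresh rate:

Code:
# Pixel clock = total pixels per line x total lines per frame x refresh
h_total, v_total = 2080, 1100
refresh_hz = 72.000

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(pixel_clock_mhz)   # -> 164.736, matching the figure quoted above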

Hopefully the Creators Update on April 10th will make me a happy 72 Hz @ 10-bit desktop camper

Last edited by RLBURNSIDE; 03-30-2017 at 06:46 PM.
RLBURNSIDE is online now  
post #11977 of 11994 Old 04-03-2017, 06:08 AM
zryder | Advanced Member
Join Date: Dec 2008 | Location: St Paul MN | Posts: 526
Quote:
Originally Posted by RLBURNSIDE View Post
Just ordered a new 25 ft VGA cable to get 444 + 10-bit native at 1080p 71.928 Hz, in prep for the Windows 10 Creators Update: [...]

These two things combined represent an effective 56% overclock over a typical 1080p60 8-bit HDMI signal. Using a VGA cable gives us the 10-bit "for free", without increasing bandwidth requirements.
It's very interesting to see your results with this. Maybe it's just my preconceived bias, but I'm having a hard time imagining why a digital source and a digital projector shouldn't use a fully digital signal path. Does image quality change at all when you compare the two?

"The satisfaction of a great price will be overshadowed by the bitterness of poor quality."

Quote:
Originally Posted by KO Abear
You still measure your TV in "inches"? Thats so cute! :D
zryder is offline  
post #11978 of 11994 Old 04-04-2017, 07:07 PM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
It looks just as sharp as HDMI to my eye, and certainly sharper than YCbCr 422, which is what HDMI requires to pass 10-bit to this projector.

But... I made a goof, guys and gals. Overclocking the refresh rate of this projector results in skipped frames, as per this test:

https://www.testufo.com/#test=frameskipping

So keep it to 60 Hz. This happens on both HDMI at 70/72 Hz and VGA at 70/72 Hz with manual / reduced timings.

The only point in using a VGA cable, then, is to get 10-bit colour + 444 at the same time. There is a downside: VGA doesn't seem to handle 3D SBS signals, so you'll have to pick. I guess everyone will just stick to 8-bit because of this, but the "hack" of using VGA to get 10-bit without dropping to 422 is legit.
skoolpsyk likes this.
RLBURNSIDE is online now  
post #11979 of 11994 Old 04-06-2017, 04:22 PM
bori | AVS Forum Special Member
Join Date: Sep 2006 | Posts: 3,174
Just replaced bulb with an Amazon bulb. Is it normal for the bulb to smell in the beginning?

Sent from my ZTE A2017U using Tapatalk
bori is offline  
post #11980 of 11994 Old 04-06-2017, 09:09 PM
dreamer | AVS Forum Special Member
Join Date: May 2000 | Location: Walnut, CA, USA | Posts: 3,016
Quote:
Originally Posted by bori View Post
Just replaced bulb with an Amazon bulb. Is it normal for the bulb to smell in the beginning?

Sent from my ZTE A2017U using Tapatalk
Yes. It should go away within a half hour.
bori likes this.

*********************
Kirk Ellis
BenQ W1070 VuTec 122" Screen
Harman Kardon AVR 247 Parasound L&R Amp
Psycoustic Mark III L&R Towers, Boston Center
Energy Take 5 Surrounds, HSU Research Sub
dreamer is offline  
post #11981 of 11994 Old 04-07-2017, 02:19 PM
grubadub | AVS Forum Special Member
Join Date: Nov 2006 | Posts: 1,242
Quote:
Originally Posted by bori View Post
Just replaced bulb with an Amazon bulb. Is it normal for the bulb to smell in the beginning?

Sent from my ZTE A2017U using Tapatalk
yes, mine took a few hours of play before the smell went away
bori likes this.

'...have an A1 day!'
grubadub is online now  
post #11982 of 11994 Old 04-07-2017, 04:25 PM
DaleNixon | Member
Join Date: Jul 2005 | Posts: 152
Here's a question for those who have replaced their bulbs: was there a lot of dust build-up around the bulb housing and venting area? I ask because I had the bulb in a 65" Sony DLP go out much earlier than expected, and when I took the housing out I noticed a lot of dust hampering airflow to it. That's what I attribute the short bulb life of that TV to.

My point is, I might go into the BenQ now and clean it out so as to extend the life of my W1070's original bulb.

Thanks!

__________________________________________________
Denon AVR-X4200W
7.4.2 Klipsch / Monoprice / Behringer / Volt-10LX Atmos build
NU3000DSP iNUKE / 2 SI HT18 D4's
NU6000DSP iNUKE / 2 Ultimax 18"s
BenQ W1070 / 10' AT screen
DaleNixon is online now  
post #11983 of 11994 Old 04-08-2017, 05:12 PM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
Keeping your projector clean will reduce heat buildup and consequently lower fan noise, and a cooler projector should help with bulb lifespan.

I would recommend cleaning out your projector every time you change a bulb, and changing the bulb at around 4,500 hours, PRIOR to it blowing up. It can be a royal pain to clean the optical path if you ever get dust inside the lens itself or inside the optical cavity, but cleaning the rest of the projector with canned air and removing baked-in dust with Q-tips is indeed a very smart thing to do.

Also, use SmartEco all the time: it extends bulb lifespan (possibly even more than Eco, since the variable bulb intensity seems to help a ton with durability) and improves dynamic contrast. People who suggest otherwise are giving bad advice (IMO). SmartEco is like having a perfectly silent dynamic iris, which is a high-end projector feature. Not using it is like a bird not using its wings (either to fly, or to run across the ground faster); it's a waste.
RLBURNSIDE is online now  
post #11984 of 11994 Old 04-15-2017, 07:35 PM
long_pn | Member
Join Date: Jun 2004 | Posts: 42
I've tried PowerDVD 17 Ultra on the W1070 and am very impressed by how it can upgrade 1080p YouTube/movie images to HDR-like quality. Since I still use my old settings for 1080p, a setting matched to HDR could surely improve things further. Does anyone have such a setting for HDR on the W1070?

Last edited by long_pn; 04-16-2017 at 04:07 AM.
long_pn is online now  
post #11985 of 11994 Old 04-16-2017, 01:42 AM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
Officially the W1070 only supports gamma / sRGB EOTFs. It takes in the signal and applies the current gamma setting via a LUT. If one could modify this LUT for the degamma step, it should be possible to add native PQ / ST 2084 (aka HDR10) support.

Then you'd just need to override the EDID somehow to pretend it supports HDR10. At that point Windows 10 should detect HDR and allow you to enable it. HDR doesn't require 2160p resolution; 1080p should be fine. I'm also far more interested in HDR + WCG than in pseudo / fake 4K.
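To make the LUT idea concrete, here's a minimal sketch of what the modified degamma table would hold, using the published ST 2084 constants. The 300-nit peak is an arbitrary placeholder for that custom peak-nits value, and whether the W1070 firmware LUT can actually be replaced is an open question:

Code:
import numpy as np

# Published SMPTE ST 2084 (PQ) EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code, peak_nits=300.0):
    """Decode normalized PQ code values to display-relative linear light,
    hard-clipping at an assumed (hypothetical) projector peak."""
    e = code ** (1.0 / m2)
    nits = (np.maximum(e - c1, 0.0) / (c2 - c3 * e)) ** (1.0 / m1) * 10000.0
    return np.minimum(nits / peak_nits, 1.0)

codes = np.arange(1024) / 1023.0   # 10-bit input code values
lut = pq_eotf(codes)               # what the modified degamma LUT would hold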
long_pn likes this.
RLBURNSIDE is online now  
post #11986 of 11994 Old 04-16-2017, 05:36 AM
long_pn | Member
Join Date: Jun 2004 | Posts: 42
Quote:
Originally Posted by RLBURNSIDE View Post
Officially the W1070 only supports gamma / sRGB EOTFs. It takes in the signal and applies the current gamma setting via a LUT. If one could modify this LUT for the degamma step, it should be possible to add native PQ / ST 2084 (aka HDR10) support. [...]
You're right, that's the correct way, but doing that is probably beyond the ability of normal users.
What about adjusting individual settings? Contrast is one I can think of; what else could make sense?
long_pn is online now  
post #11987 of 11994 Old 04-16-2017, 11:28 PM
RLBURNSIDE | AVS Forum Special Member
Join Date: Jan 2007 | Posts: 3,555
The W1070 supports a gamma of 2.8 max, so that's what would give the most high-contrast, HDR-like image. HDR rips watched from a PC (or through an HD Fury Linker set to 2.8) would then have relatively higher contrast and take better advantage of the 10 bits these projectors have to offer.

I would gladly add an HDR10 LUT if I could figure out how to do it; maybe the firmware can be hacked to modify the hex values. I'll look into it. E.g., modify the LUT for the 2.8 gamma setting to correspond to the brightness levels of a PQ signal, then upload that firmware as a "1.09 DIY" version. It could look pretty decent. But for me, I want Windows to think my W1070 is a true HDR10 display, with some custom peak-nits value set somewhere, because the projector doesn't understand HDR metadata. Maybe that would require a "strip metadata" feature.

I asked the folks over at HD Fury if they had a product that allows one to enter a custom 1:1 passthrough LUT, but they don't have one. That would allow arbitrary HDR curves to be converted to SDR at least, or HDR10 to masquerade as SDR10 (without any of the actual signal values being changed at all) and then switch to the PQ decoding mode inside the projector.

But out of the box, the HD Fury Linker does, I believe, allow you to pick the output gamma value; using a 2.8 value to match what the W1070 supports, with 10-bit input, would yield the highest-contrast / least-banding image. No mods required (just money). The only reason I haven't bought one myself (so I could at least watch Netflix in UHD on my W1070 with 10-bit colour) is that I much prefer to watch 1080p rips (even though I am a paying Netflix customer) and use MPC-HC to view all my content with frame interpolation.
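As a rough feasibility check on that idea, the inverse PQ curve shows how much of the 10-bit code range a projector-level peak actually uses; the 200-nit figure below is an assumed in-room peak, not a measured W1070 number:

Code:
# Published ST 2084 (PQ) constants, inverse-EOTF direction
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inverse_eotf(nits):
    """Encode absolute luminance (nits) to a normalized PQ code value."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# ~0.58: a 200-nit display only ever sees the bottom ~58% of the PQ range,
# so codes above ~594/1023 must be tone-mapped or clipped by the gamma curve.
print(pq_inverse_eotf(200.0))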

Last edited by RLBURNSIDE; 04-16-2017 at 11:31 PM.
RLBURNSIDE is online now  
post #11988 of 11994 Old 04-16-2017, 11:49 PM
gamermwm | Advanced Member
Join Date: Sep 2010 | Location: OK, USA | Posts: 903
So I've been playing Mass Effect Andromeda via my PC on my Samsung KS8000 in HDR at 1080p/60 (GTX 1070), and I was wondering whether that is possible on this projector as well. My search of the thread didn't yield anything as specific and cut-and-dried as I'd hoped, except for the fact that this projector actually does support 10bpc.

4K is great and all, but HDR is where it's at IMO. Can I just output 422 10bpc at 1080p/60 and go from there? Anyway, I'm kind of a noob when it comes to these things, so any advice would be great, thanks!

Edit: Mass Effect has an Auto Detect option for HDR, and it is neither detecting HDR capability nor outputting HDR in 422 10bpc.

What is the meaning of life?
www.iamsecond.com/films

Last edited by gamermwm; 04-17-2017 at 12:31 AM.
gamermwm is offline  
post #11989 of 11994 Old 04-19-2017, 10:05 PM
long_pn | Member
Join Date: Jun 2004 | Posts: 42
Quote:
Originally Posted by RLBURNSIDE View Post
The W1070 supports a gamma of 2.8 max, so that's what would give the most high-contrast, HDR-like image. [...]

But out of the box, the HD Fury Linker does, I believe, allow you to pick the output gamma value; using a 2.8 value to match what the W1070 supports, with 10-bit input, would yield the highest-contrast / least-banding image.
I've changed the gamma to 2.8 and HDR clips look better now. I've also tried increasing the contrast, but high values didn't seem to make the picture better, so I settled on 60.
long_pn is online now  
post #11990 of 11994 Old 06-13-2017, 02:17 PM
monakh | Member
Join Date: Jun 2001 | Location: Lahore | Manama | Washington | Posts: 175
Quote:
Originally Posted by monakh View Post
Just wanted to check in. My second bulb went out at 3793 hours (after about 14 months). That's worse than before, I think the first bulb crossed 4000 hours (too lazy to check this thread). The small explosion was just like the one early last year. Bloody thing always shatters inside, albeit with small pieces. It's a pain removing them. Luckily, I had a spare since I knew the bulb failure was imminent. The PJ has lasted me over two years which is longer than I have kept any other projector recently. Still going on strong. I will be replacing it in a few months though I am not sure what with. There is still nothing like this baby on the market with the same price/performance ratio.
13 months later and the third bulb exploded, at 2,500 hours. I only use genuine BenQ lamps. It seems the performance keeps getting worse as time goes by. This has been a great PJ for me, but I think it's time to move on.

Last edited by monakh; 06-13-2017 at 02:21 PM.
monakh is offline  
post #11991 of 11994 Old 06-13-2017, 02:58 PM
AVS Forum Special Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 3,555
Mentioned: 11 Post(s)
Tagged: 0 Thread(s)
Quoted: 1757 Post(s)
Liked: 1391
Just goes to show that BenQ uses the exact same OSRAM bulbs as those cheapie resellers on eBay.

Lifespan is a combination of randomness and heat. Older projectors have more heat buildup due to accumulated dust; presumably the inside is no longer 100% clean. Also, who knows what the environmental conditions of these projectors are. I suspect more bulbs explode when the room is hot than otherwise, although humidity and/or condensation can blow bulbs easily too.

Just bad luck, most likely.

Even if you're moving on, you can probably recoup the cost of a $50 lamp from China and sell the PJ used for a hundred bucks. Or maybe just give it to someone; it'd be a shame to let such a nice piece of gear not bring someone else joy, no? That's what I plan on doing with mine (although I may end up keeping it if I don't get a new PJ with 3D capability, for those times when I get the itch for 3D, or for 1080p SDR gaming). Don't forget the benefit of having a backup projector when you buy a new one, either: warranty repairs can put you out of movie watching for months. It happened to me when I had to get my W1070 repaired, and again while waiting for a replacement bulb to arrive in the mail.
skoolpsyk and monakh like this.
RLBURNSIDE is online now  
post #11992 of 11994 Old 06-14-2017, 09:19 AM
Member
 
monakh's Avatar
 
Join Date: Jun 2001
Location: Lahore | Manama | Washington
Posts: 175
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 25 Post(s)
Liked: 15
Quote:
Originally Posted by RLBURNSIDE View Post
Just goes to show that BenQ uses the exact same OSRAM bulbs as those cheapie resellers on eBay.

Lifespan is a combination of randomness and heat. Older projectors have more heat buildup due to accumulated dust; presumably the inside is no longer 100% clean. Also, who knows what the environmental conditions of these projectors are. I suspect more bulbs explode when the room is hot than otherwise, although humidity and/or condensation can blow bulbs easily too.

Even if you're moving on, you can probably recoup the cost of a $50 lamp from China and sell the PJ used for a hundred bucks. Or maybe just give it to someone; it'd be a shame to let such a nice piece of gear not bring someone else joy, no?
Yeah, you're right. When I get around to it, I'll pick up one of those bulbs from eBay and do a bare-bulb replacement. Thankfully, I have the PF1500 as my backup PJ, though it's in a different room.

I am sure heat is the killer here. I live in the Middle East, and summer here in the desert is worse than summer in Arizona.

Sent from my MIX using Tapatalk
RLBURNSIDE likes this.
monakh is offline  
post #11993 of 11994 Unread Yesterday, 08:05 AM
Daniel Chaves | AVS Forum Special Member
Join Date: Sep 2012 | Location: LA (Valley Village) | Posts: 1,439
Just picked up another one of these for a friend, with firmware 1.08. You'd think that over the course of this product's life they would have fixed the HDCP handshake issue, but nope, it looks like the ports still have seating problems...

Projector: BenQ w1500 + ES Sable 135" Screen AVR: Onkyo TX-NR646 ATMOS/DTSX + Darbee 5000
Speakers: Polk Audio TSX550t (FL/FR), CS2 Series II (C), Monitor40 Series II (RL/RR),
Onkyo THX Bookshelf Speakers (Ceiling L/R), (2) JL Audio 12" Subs + (2) Dayton 15" Subs + (2) ButtKicker LFE
Arrangement: 5.1.2 Source: HTPC, Roku 4, Nexus Player, Samsung UBD-K8500 4k Bluray Player
Daniel Chaves is offline  
post #11994 of 11994 Unread Today, 08:33 AM
jaggajatt | Newbie
Join Date: May 2017 | Posts: 14
Great projector in terms of value for money. I've had mine running for 3 years with zero complaints: bright, good colours, and good display quality.
jaggajatt is online now  