***Official*** Google Chromecast ULTRA owner's thread
Originally Posted by Masterbrew2
Like I said, they're 30-megapixel+ JPGs taken with a Sony RX1 Mark II.
The photos are stored locally on the iPhone, so I'm assuming Google Photos is accessing them that way? I don't use Google cloud storage.
A JPG would be lossy, so maybe that's it? Can you check the info on the actual photo on your phone to see its resolution? I don't care about the megapixel count it was shot at on the camera itself, but rather the photo's resolution where it ends up.
There's an info button for each photo in Google Photos that will tell you. How are you getting the photos to your phone? Direct Wi-Fi transfer from the camera? My wife just got a Sony a7 II. I'll have to ask her whether photos transferred via Wi-Fi in the app are being resized.
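Not Chromecast-specific, but as a sanity check for the resolution question: a JPEG's pixel dimensions are stored in its SOF (start-of-frame) header, so once you copy a photo off the phone you can read them straight from the file instead of trusting an app's info screen. A minimal Python sketch using only the standard library (the function name `jpeg_dimensions` is my own, not from any of the apps discussed here):

```python
import struct

def jpeg_dimensions(data: bytes):
    """Return (width, height) parsed from a JPEG's SOF marker, or None."""
    i = 2  # skip the SOI marker (FF D8)
    while i + 9 < len(data):
        if data[i] != 0xFF:
            i += 1
            continue
        marker = data[i + 1]
        # SOF0..SOF15 segments carry the frame dimensions,
        # excluding DHT (C4), JPG (C8), and DAC (CC).
        if 0xC0 <= marker <= 0xCF and marker not in (0xC4, 0xC8, 0xCC):
            # segment layout: marker(2) length(2) precision(1) height(2) width(2)
            height, width = struct.unpack(">HH", data[i + 5:i + 9])
            return width, height
        # otherwise skip this segment using its 2-byte big-endian length
        seg_len = struct.unpack(">H", data[i + 2:i + 4])[0]
        i += 2 + seg_len
    return None

# Usage: pass the raw file bytes, e.g.
#   with open("photo.jpg", "rb") as f:
#       print(jpeg_dimensions(f.read()))
```

If casting from local storage really is downscaling, the dimensions reported here for the original file would differ from what shows up on the TV.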
Until recently, photos had to be backed up to the cloud before they could be cast. That's how the Chromecast typically works: it pulls directly from the Internet.
Casting directly from local phone storage is a fairly new feature for Google Photos, so I'm wondering if it's tweaking the resolution when casting that way. I really only cast from the cloud.
I have a 65" LG B6 and everything looks great, especially the screensaver photos.
Streaming Devices: Nvidia Shield TV, 2x Roku 3, 1st- and 2nd-gen Chromecast, Amazon Fire TV Stick 2nd gen, Apple TV 4, Xbox 360, Xbox One
Displays: Vizio M602i-B3, LG OLED65B6, Panasonic AX100U on a 145" silver-painted screen, 40" Samsung (don't recall model #)
Receivers: Denon X3300, Yamaha RX-V663. Blu-ray/UHD player: Oppo UDP-203
Retired - HTPC: Intel E6300 2.8GHz, Intel DG45ID, 2GB DDR2, Radeon 5570, MCE IR receiver
Last edited by archer75; 12-14-2016 at 06:08 AM.