I'm working with a wireless HDMI adapter product that receives AV streams from handsets or laptops with wireless display protocol support. I need to measure the delay between audio and video at the end display device (the final output device the end user watches).
The end-to-end diagram is as below:
Laptop (Local Display) ----> Mirroring the content over air ----> Wireless Adapter/Receiver ----> (connected via HDMI) ----> HDTV/Monitor (Remote Display)
Here, I'm trying to measure the AV sync delay of the playback on the TV/monitor.
My proposed method :
I have downloaded a test video like this one: http://www.youtube.com/watch?v=EoeDZkGvvjg
This content plays alternating "bip" and "bop" sounds (once per second), with a sine-wave pulse displayed exactly at the moment each sound should be heard. It is meant to verify whether the audio you hear and the sine wave you see are in sync (AV sync).
Since it bips or bops only once per second, it's a little hard to perceive exactly at which instant of the video the bip/bop is heard.
So I'm using either VLC or Windows Media Player to play the same content on the laptop, but in slow motion, say at 0.5x or 0.25x speed. The content is then mirrored on the remote display (TV) at the same slow speed. This way it becomes much easier to pinpoint at which instant of the video playback the bip/bop is heard (say, with more than 90% accuracy).
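To make the measurement repeatable, the observation can be reduced to arithmetic: record the TV output (e.g., with a camera) and note the wall-clock times at which each bip/bop is heard and each sine-wave pulse is shown. The following is only a sketch under my own assumptions; the names (beep_times, flash_times, playback_speed) are illustrative, not from any real tool, and the final scaling step is only valid if the offset lives in the content's timeline rather than in fixed pipeline buffering:

```python
def av_offset_ms(beep_times, flash_times, playback_speed=1.0):
    """Mean audio-minus-video offset in milliseconds.

    beep_times / flash_times: wall-clock timestamps (seconds) of each
    heard beep and each seen sine-wave pulse, one beep per flash.
    Positive result: audio lags video.

    If the measurement was made at a slowed playback speed, and the
    offset is assumed to be in the content's own timeline (NOT fixed
    pipeline buffering), scale back to 1x by multiplying by the speed.
    """
    if len(beep_times) != len(flash_times):
        raise ValueError("need one beep timestamp per flash timestamp")
    diffs = [(b - f) * 1000.0 for b, f in zip(beep_times, flash_times)]
    mean_offset = sum(diffs) / len(diffs)
    return mean_offset * playback_speed

# Example: beeps heard 40 ms after each flash, measured at 0.5x speed.
beeps   = [1.040, 2.040, 3.040]   # seconds
flashes = [1.000, 2.000, 3.000]
offset = av_offset_ms(beeps, flashes, playback_speed=0.5)
print(round(offset, 1))  # 20.0 (ms at 1x, under the scaling assumption)
```

Whether that multiplication by the playback speed is actually correct is exactly the doubt raised below.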
My doubt is this: I'm supposed to check the AV sync at 1x speed, but instead I'm playing normal content at 0.5x speed. Is there any technical difference between the 1x and 0.5x playback speeds? If I measure the AV sync at slow speed, could the values I observe on the TV be wrong?
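To make the doubt concrete, here is a toy model (all numbers invented) of two ways a delay could enter the chain. A fixed buffering delay in the wireless/HDMI path stays the same number of wall-clock milliseconds at any speed, while an offset baked into the content's timestamps stretches on the wall clock when playback is slowed:

```python
def observed_offset_ms(pipeline_delay_ms, content_skew_ms, speed):
    """Wall-clock AV offset seen on the TV, in this toy model:
    a fixed pipeline delay plus a content-timestamp skew that
    stretches as playback slows down."""
    return pipeline_delay_ms + content_skew_ms / speed

# Pure pipeline delay: identical at both speeds.
print(observed_offset_ms(100, 0, 1.0))   # 100.0
print(observed_offset_ms(100, 0, 0.5))   # 100.0

# Pure content-timestamp skew: doubles at half speed.
print(observed_offset_ms(0, 100, 1.0))   # 100.0
print(observed_offset_ms(0, 100, 0.5))   # 200.0
```

So if the offsets I observe are the same at 0.5x and 1x, that would suggest a fixed pipeline delay; if they scale with the speed, the skew is in the content's timeline. That is the distinction I'm unsure about.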
Sorry for the long description, and apologies if this is the wrong forum. If someone could clarify this, it would be a great help.