Potential Audioquest ShootOut - Page 2 - AVS Forum
post #31 of 40 | 01-01-2014, 03:36 PM | goneten
It would be hilarious if several cable manufacturers agreed to a bias-controlled test to discern differences with their own cables. I find it strange that people turn into lumps of coal as soon as they are deprived of visual feedback, but let them see which cable is which and baby, watch the hyperbole fly! :D
post #32 of 40 | 01-01-2014, 04:05 PM | amirm (AVS Addicted Member)
Quote: Originally Posted by arnyk
How many high-end cables do I need to purchase to convince myself that my Fields and Waves prof and my Electronics prof, who taught me back in the early '70s, were not lying?

Is someone faulting me for having the opinion that this number is zero, as well?

Do tell! ;-)
I do :). If you are going to spend the trillion hours that you have arguing this topic, it would help your cause if you at least had some cables that you had measured, and had performed blind listening tests on them in addition to some casual subjective testing. The answer would be zero if we were talking about you buying them for your own use/motivation. But to argue with the other side, it is my opinion that you do need to have them. It is what gets you some amount of credibility with the other side. In that regard, it matters not whether you think you need them, but whether they think you need them.

Now, if you don't value convincing anyone and just want to argue the same points over and over again that countless people have made in these forums before, with no additional knowledge shared, by all means, go right ahead. Just don't ask the rest of us if that is good logic. :D It is not.

Amir
Owner of some high-end cables :).

Amir
Retired Technology Insider
Founder, Madrona Digital
"Insist on Quality Engineering"
post #33 of 40 | 01-01-2014, 04:43 PM | amirm (AVS Addicted Member)
Quote: Originally Posted by Glimmie
I have a different theory which especially applies to the $20K plus crowd.

With these guys it's all about how much you spent, not how it looks video-wise, sounds, and of course not how it measures technically. They just can't comprehend the fact that a $5 six-foot Monoprice HDMI cable is just as good performance-wise as their $500 Audioquest or whatever.
Hardly any audio people I know are into video that way. So you are talking about a very tiny population here.
Quote:
This is why all the jitter FUD was born. Yes, jitter is a legitimate problem, but as you have noted many times, its effects are largely inaudible in digital audio systems, especially with today's technology.
You have to demonstrate that it is *always* inaudible, not "largely". Because otherwise you are leaving open the possibility that it could be audible, and since you are not hearing what the other person is hearing, the argument is lost at that point. Worse yet, we seem to operate from a position of no data. Here is some data. These are two tests I performed on the same AVR using two *digital* interfaces. The red is S/PDIF and the green, HDMI:

[image: i-QHffrnc.png]

Look to the right side of the center tone and see the distortion spurs which I have the cursor on (the white plus signs). Notice a 25 dB rise in level with HDMI? Due to masking effects, this is likely not audible, but how do we as "degreed engineers" ever wake up in the morning and paper over such a step backward in measured performance? Why not strive for excellence so that we can *know* the distortion is inaudible?
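To make the mechanism behind those spurs concrete, here is a small simulation sketch of my own (the 1 kHz tone, 200 Hz jitter frequency, and 2 ns jitter amplitude are assumed illustrative values, not taken from the measurements above) showing how sinusoidal sampling-clock jitter produces discrete sidebands next to a test tone:

```python
import numpy as np

# Assumed, illustrative numbers: a 1 kHz tone sampled by a clock with
# 2 ns of sinusoidal jitter at 200 Hz. The jitter phase-modulates the
# tone, producing sidebands at 1 kHz +/- 200 Hz -- the kind of discrete
# spurs visible next to the main tone in a jitter spectrum.
fs = 48_000           # sample rate, Hz
f0 = 1_000.0          # test tone, Hz
fj = 200.0            # jitter frequency, Hz
tj = 2e-9             # peak jitter, seconds
n = 1 << 16

t = np.arange(n) / fs
# Ideal sampling instants displaced by sinusoidal jitter
x = np.sin(2 * np.pi * f0 * (t + tj * np.sin(2 * np.pi * fj * t)))

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
spec_db = 20 * np.log10(spec / spec.max())

def level_db(f):
    """Spectrum level at frequency f, in dB relative to the carrier."""
    return spec_db[int(round(f * n / fs))]

print(f"carrier at {f0:.0f} Hz:   {level_db(f0):6.1f} dB")
print(f"sideband at {f0 + fj:.0f} Hz: {level_db(f0 + fj):6.1f} dB")
```

For these numbers the sidebands land roughly 104 dB below the carrier, consistent with the small-modulation FM approximation sideband/carrier ≈ π·f0·tj; scaling up the jitter or the tone frequency raises them proportionally.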

Here is another good one. Again, two tests with a digital interface, with the output of the system (in the analog domain, post-DAC) measured on a different AVR:

[image: i-Rd4Tsrr-XL.png]

Both times the source is the same HDMI input. What is the difference? Nothing but time! Measure once and you get one response. Measure again and you get the other! :eek:

Here is more data showing jitter reduction in dB:

[image: i-L9sfQTP-XL.png]

What is the best gear at the bottom with the most rejection? Yup, the hated gear that the "$20K plus" people buy. In this case it is the Mark Levinson 502 processor. What is the next best? The sister to the processor you have: the Lexicon 12B. The junk on top with essentially no jitter reduction? Past and present AVRs! And I am not talking about cheap stuff; at least three of them retail for more than $1,000. These are random samples of gear I have on hand for testing, so nothing was rigged to make the high-end stuff look good. They look good because someone there knows what they are doing and has instruments to measure performance before releasing the box.
Quote:
As long as some people firmly believe cost directly and solely relates to quality, there will be a good market for audiofool products.
Maybe. The bigger sin in my opinion, though, is that people who know this stuff work from assumptions instead of data: that it must all be the same, that all designs perform the same. Well, they don't. I am only the second person ever to measure and publish the above data on HDMI performance in an audio/video magazine! So it stands to reason that most people operate in the dark.

When I managed development of pro video products and we sold them to major TV networks, their techs performed tests like this all the time. It mattered not whether a person watching the programming could tell; as long as their measurements showed the box to be worse than what they expected, they would reject it. I know you work/worked in that space. So it is surprising to me that the excellence in video engineering there is not translating into the same high standard for gear we buy with our own money.

So that this is not a one-sided argument: I tested three different HDMI cables, and their differences were lost in the run-to-run variations of the system performance.

post #34 of 40 | 01-01-2014, 06:03 PM | Glimmie (AVS Special Member)
Amirm,

I think by saying "largely inaudible", I do acknowledge that it is audible to some, especially people like yourself who have trained themselves to hear it.

Are you going to argue that some USB cables sound better than others? I do acknowledge that power supply noise can be conducted into the DAC, and a good USB DAC will use its own internal PS. But most of the USB audiophile accessory stuff on the market is snake oil.

Then there's the HDMI jitter you like to grandstand about. Yes, it is much higher than SPDIF, no question about it, and it is probably audible at -80 dB. I am not disputing that. HDMI is what it is. If you are bent on fixing the HDMI audio jitter problem, why not consult with the HDMI consortium to try and get it fixed? You do the same thing you accuse Arny of: you have posted those jitter pictures countless times here, and yet the factories continue to spew out these high-jitter HDMI products. It looks like nobody with the power to make changes is listening to you either. If someone today has an issue with HDMI jitter, there are ways around it, albeit expensive, like multichannel SPDIF/AES, which some high-end users and I resort to. Your measurements do show the SPDIF jitter is over 100 dB down. Nobody is going to hear that in normal listening.

There is nothing wrong with techniques and products that reduce jitter. But as you so correctly indicated, they need to be designed into the product. Cables can't reduce jitter. They can aggravate it, but the amount is minuscule in a consumer environment. If you are already so close to the cliff that cable-induced jitter pushes you over the edge, well, you have a bigger problem, and fixing it with a cable is a kludge.

And yes, commercial companies, the TV industry included, will reject products that do not meet published or agreed-to specifications. But you are ignoring the fact that those specifications are based on realistic engineering and financial parameters. Excellence can be and is measured by consistently meeting specifications and exceeding them. But those specifications have to be reasonable as well.

You seem not to acknowledge a fundamental discipline of the engineering profession. Good engineering is not simply making the best product possible; good engineering involves making the best product possible within the parameters and materials provided. I'm not so sure some of the high-end companies could produce an AVR at a $400 price point and get performance as good as Denon or Yamaha does. The level of true engineering in a low-cost consumer product is often much deeper than in a small-run high-end product. Throwing money at problems is not necessarily good engineering at all.

Glimmie's HT Page
Being redone - coming soon!
post #35 of 40 | 01-01-2014, 10:03 PM | amirm (AVS Addicted Member)
I don't have any data on USB cables one way or the other. I plan to test them some day. Until then, as with power cables, where I am similarly situated (i.e., I have not tested them), I personally use whatever cable is handy, and will not recommend anything fancy. At the same time, until I test them, I will stay quiet and not claim that some class at school on "fields and waves" says they make no difference. Whether they do or not depends on far more practical things than some terms one would want to throw around from a subset of the theory.

As to making some difference on HDMI, I am doing precisely that. In this forum and elsewhere I have tried to raise the visibility of it so that customers ask questions of their suppliers and insist on at least measurement data. This will not work on mass-market products, of course, as their designers aren't remotely interacting with their customers. But it could have an effect on high-end products, where they literally earn every unit sale. The other thing I did was that when I tested the HDMI products for my Widescreen Review magazine article (where these measurements came from), I did not identify which graph is which, although I did list all the manufacturers and model numbers at the start. I ended the article by saying that when I do the test in the future, I will identify poor-performing products by name. So a bit of an implied threat :). Not holding my breath, though, with these mass-market products, as I mentioned.

Importantly, I partnered with Prism Sound on this testing (makers of the test equipment I used). They had never heard of any HDMI problems or of any of their customers trying to measure its analog audio performance! They were delighted with the article and hoped it would entice their customers to do more testing in this area.

As to influencing the standard, HDMI actually has a mode where the DAC is the clock master. This is in HDMI 1.3a and later products and is called Audio Rate Control (ARC, not to be confused with Audio Return Channel). But it is optional and usually not implemented. If it is, it allows a local, stable clock to be used in the DAC and, with it, eliminates this entire problem. That is the theory. The practice is that usually only equipment from the same manufacturer works this way and does not interoperate with others. And as I mentioned, hardly any companies implement it. Pioneer's PQLS is one example of this solution.

But even without ARC, proper implementations can get jitter down to S/PDIF levels, as my testing showed. We had the same problem with S/PDIF. It too was designed by folks who didn't understand proper audio design. After years and years of refinement, we now have excellent performance despite the weakness of the interface. We could get to the same place with HDMI if we did not have the Arnys of the world constantly papering over the issues. Let's have some transparency here. None of us work for the manufacturers, nor get paid to run a campaign. Let's get rid of the constant chanting that "digital is digital" and hence is perfect. Let's get the measurements out. Let's get the message widely accepted that this interface has set us back 20 years. And let's get it fixed.

So this is what I have done about this problem. In private I also beat up the manufacturers I deal with. But I am retired now and no longer drive initiatives in the industry. It is for others who hopefully see these issues to help drive them.

As to standards, mine says we need to achieve transparency on an objective basis. For CD, we have 16-bit samples and hence a signal-to-noise ratio of 96 dB. I like to see distortion products below this. -80 dB is just sad. We are talking 16-bit audio samples some 30 years after the CD was introduced, and we still can't keep the distortion below the noise in HDMI in a $1,000 product? Since the limit of useful DAC performance is around 20 bits, for full transparency I like to see distortion down to -120 dB. All of this is fully achievable. Here is my 13-year-old Mark Levinson DAC over S/PDIF:

[image: i-p2Qmc84.png]

As we see, its distortion products are at -123 dB, and even there, it is just a couple of spikes and that is it. So we can achieve full transparency with the highest dynamic range we may see in our content. We did it 13 years ago, and high-end products over S/PDIF and USB do this day in and day out today. Mass-market products often fall short even over S/PDIF, but certainly over HDMI. I am fine if they don't fix it. I am not fine if folks keep saying they are all the same. They are not.
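The dynamic-range arithmetic behind those targets is worth spelling out. This sketch (mine, not from the post) just restates the standard quantization-SNR formula for a full-scale sine into an ideal N-bit converter:

```python
# Quantization SNR for a full-scale sine into an ideal N-bit converter:
# SNR ~= 6.02*N + 1.76 dB. The round numbers in the discussion use the
# simpler ~6 dB-per-bit rule of thumb.
def quant_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits:2d} bits -> {quant_snr_db(bits):6.2f} dB")
```

16 bits gives about 98 dB (the familiar "96 dB" CD figure rounds to 6 dB/bit), and 20 bits gives about 122 dB, which is where the -120 dB transparency target comes from.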

Finally, I have not advocated any cables here. I am advocating that we explain the science correctly. That we don't constantly deviate from the topic and keep ranting at the rest of the audiophile world while using incorrect explanations of how the technology works, or of the limits it must achieve for excellence.

post #36 of 40 | 01-02-2014, 04:43 AM | arnyk (AVS Addicted Member)
Quote: Originally Posted by Glimmie

I think by saying "largely inaudible", I do acknowledge that it is audible to some, especially people like yourself who have trained themselves to hear it.

IME it doesn't happen in good reliable listening tests for real world jitter in good real world audio gear.

The results of the HDMI jitter tests posted so far in this thread scare only small boys! ;-)

Some people fault me for not falling over myself to do DBTs related to the technical test results that they post. That's like faulting people for not launching large fishing boats in a tiny lake known to contain only minnows.

For those of us who have actually heard jitter, the most revelatory experience is to ask those who believe they have trained themselves to hear jitter what it sounds like.
post #37 of 40 | 01-02-2014, 12:06 PM | Glimmie (AVS Special Member)
Quote: Originally Posted by amirm
...... We had the same problem with S/PDIF. It too was designed by folks that didn't understand proper audio design. .....

No, I disagree with that. The problem with SPDIF is that it WAS designed by people who understood audio design. And that's the problem! SPDIF/AES is not at audio frequencies; in fact, 48K SPDIF/AES occupies the same bandwidth as analog NTSC video. Even worse, the rise times are more critical to link performance than with analog video. Poor HF rise time or excessive ringing causes data errors, not just slightly soft or ringing images as with analog video. The audio designers of that era typically lacked experience in RF/video and high-speed pulse design techniques.
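That bandwidth comparison checks out on the back of an envelope. This sketch uses my own assumed textbook figures (not numbers from the post): an S/PDIF frame carries two 32-bit subframes per audio sample, and biphase-mark coding can place two transitions per data bit, putting the signal on the wire in the same few-MHz territory as NTSC luma:

```python
# Back-of-the-envelope S/PDIF line-rate arithmetic (assumed textbook
# figures): each audio sample frame is two 32-bit subframes (L + R),
# and biphase-mark coding can place two transitions per data bit.
sample_rate = 48_000                      # Hz
bits_per_frame = 2 * 32                   # two subframes per sample
bit_rate = sample_rate * bits_per_frame   # data rate on the wire
cell_rate = 2 * bit_rate                  # biphase-mark half-bit cells
ntsc_luma_bw = 4.2e6                      # NTSC luma bandwidth, Hz

print(f"S/PDIF data rate:    {bit_rate / 1e6:.3f} Mbit/s")
print(f"Biphase cell rate:   {cell_rate / 1e6:.3f} MHz")
print(f"NTSC luma bandwidth: {ntsc_luma_bw / 1e6:.1f} MHz")
```

So a 48 kHz S/PDIF stream runs at 3.072 Mbit/s with half-bit cells at 6.144 MHz, which is why preserving edge fidelity matters as much here as it does in analog video distribution.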

I do stand corrected on your efforts to fix the HDMI problem. Publishing your WSR article did bring attention to the problem, and perhaps some manufacturers' engineering groups will look into it. So nice work on that.

It does interest me why there is so much jitter in the HDMI clock. As HDMI is a direct-wired, clock-synchronous interface, I would expect the resulting counted-down audio clock to be quite stable. ARC is a fix, but it will require sample rate conversion to a light extent; they probably don't advertise that fact to the audiophile public. You have to buffer a pack of samples to make that work. If a look-ahead memory control is used, the buffer over/underflow, which will certainly happen, could be masked in the audio material. But again, that's even more circuitry that adds at least R&D costs. And then there's the pass-through delay of that audio buffer - perhaps insignificant.

post #38 of 40 | 01-02-2014, 01:22 PM | amirm (AVS Addicted Member)
Quote: Originally Posted by Glimmie
No, I disagree with that. The problem with SPDIF is that it WAS designed by people who understood audio design. And that's the problem! SPDIF/AES is not at audio frequencies; in fact, 48K SPDIF/AES occupies the same bandwidth as analog NTSC video. Even worse, the rise times are more critical to link performance than with analog video. Poor HF rise time or excessive ringing causes data errors, not just slightly soft or ringing images as with analog video. The audio designers of that era typically lacked experience in RF/video and high-speed pulse design techniques.
Agreed, that is part of the problem. But the larger issue is what I call the push method and the embedded clock. The target should have been in charge of the clock and the source the slave (pull method). This would have eliminated all the problems of jitter with respect to extracting the clock from the source.
Quote:
I do stand corrected on your efforts to fix the HDMI problem. Publishing your WSR article did bring attention to the problem, and perhaps some manufacturers' engineering groups will look into it. So nice work on that.
Thanks smile.gif.
Quote:
It does interest me why there is so much jitter in the HDMI clock. As HDMI is a direct-wired, clock-synchronous interface, I would expect the resulting counted-down audio clock to be quite stable. ARC is a fix, but it will require sample rate conversion to a light extent; they probably don't advertise that fact to the audiophile public. You have to buffer a pack of samples to make that work. If a look-ahead memory control is used, the buffer over/underflow, which will certainly happen, could be masked in the audio material. But again, that's even more circuitry that adds at least R&D costs. And then there's the pass-through delay of that audio buffer - perhaps insignificant.
I think the issues of HDMI jitter are very different from S/PDIF's. The main problem I see is that there is just so much going on in the system once you require that video be decoded in the process of getting audio, since the audio clock is slaved to video and the audio data is stored in the spare area of the video stream. This means that there are a ton of clocks and other activity churning away in the system. Keeping all of that from leaking into the DAC clock simply becomes hard. If you look at the spectrum of the jitter, there are two observable sources:

1. Low-frequency noise. This is what causes the blown-up shoulders around the main tone. For it to be this random, there would need to be a ton of low-frequency activity. Some of this might be related to the vertical video clock, although I could not isolate this by changing the refresh rate in the limited testing I did. Perhaps the cause is the large amount of logic running in the HDMI transceiver and video processor. All the products except the Mark Levinson suffered from this low-frequency noise jitter.

2. Specific/correlated noise caused by the source and destination devices. Unfortunately, it seems that the HDMI connection fully couples the two systems electrically, and what is in the source then bleeds into the target DAC clock. Here is a remarkable test of that. Below are two measurements, *both* with S/PDIF into the Mark Levinson:

[image: i-TZ93scf.png]

Clearly the amber one has some problems with additional distortion spikes. What caused them? Simply plugging the HDMI cable into the Mark Levinson and *not* using it as a source! That's right, simply plugging in the HDMI cable degraded the performance of the S/PDIF input! And this is on the superb implementation of HDMI in the Mark Levinson. The same problem showed up at the same jitter frequency in other products, so clearly it was generated by the source (the HDMI output of my laptop). I was going crazy trying to generate my test results, as I would keep getting different results. Then I discovered this problem and realized that sometimes I did not have the HDMI cable plugged in when I was testing S/PDIF, and other times I did, so I was getting different results even though I was just measuring S/PDIF.

Given this, the results I see are *source dependent*! Worse or better performance might appear if one uses different HDMI sources. The matrix of what one has to test to get any idea of typical HDMI performance then becomes massive, as it is the combination of any source with any destination device!

These are the reasons I have just given up on these interfaces in favor of well-implemented asynchronous USB. There, the target is the clock master, and the activities of USB in the DAC can be dealt with a hell of a lot more easily than HDMI's. Everyone should be using a music media server anyway, and coupling that with an async USB interface/DAC, you are golden. Then use HDMI for video and you are all set. Thank heaven for computer technology saving us from these problems. :)

post #39 of 40 | 01-02-2014, 03:27 PM | GGA (Advanced Member)
Quote:
2. Specific/correlated noise caused by the source and destination devices. Unfortunately, it seems that the HDMI connection fully couples the two systems electrically, and what is in the source then bleeds into the target DAC clock. Here is a remarkable test of that. Below are two measurements, *both* with S/PDIF into the Mark Levinson:

Would a fiber optic HDMI cable eliminate this?
post #40 of 40 | 01-02-2014, 04:15 PM | amirm (AVS Addicted Member)
Quote: Originally Posted by arnyk
IME it doesn't happen in good reliable listening tests for real world jitter in good real world audio gear.

The results of the HDMI jitter tests posted so far in this thread scare only small boys! ;-)
Your comments always remind me of the famous chef/travel TV star Anthony Bourdain. In his book Kitchen Confidential, he talks about the dark secrets of restaurants and gives a couple of great examples. One was about the bread. He said something like, "You really think someone throws away the bread you did not eat? Of course not. They dust off the cigarette ashes and bring it to the next customer!" :eek: In another example, he talks about the New York seafood market and how the best chefs are out there by something like 6:00 am to get the best catches. Then by 10:00 am, when everything worth eating is gone, the Chinese cooks come and buy what is left for really low prices. I always wondered how one could buy Chinese seafood so cheap, and here was the answer.

Last year I was at the CEDIA show in Denver. I found a sushi place and started a conversation with the Japanese sushi chef (always a good idea to get better quality stuff, especially if you speak a bit of Japanese). To my surprise, he said that after some 20 years in the US, he was going back to Japan. I asked why. He said the quality standards for sushi are so low in the US. As an example, he said he worked at one sushi restaurant where they would get Hamachi (yellowtail) and, if they found worms in it, would just take the worms out and serve the fish to the customers! :eek:

Your comments always remind me of these restaurant owners, Arny. I suspect most people tolerate such low food quality standards. As long as someone doesn't get seriously ill, the practice is "good enough." Well, it is not for me. I couldn't eat Hamachi for a few weeks, even at my favorite Japanese-run sushi restaurant. My standard is not whether I die or get seriously ill. It is to make sure that if someone knew what was going on, they would not get disgusted with what I had put together.

So sure, be brave. Eat the worm-infested fish and the recycled bread with ashes on it, and put in extra soy sauce to mask the poor quality of the seafood in that stir-fry. It doesn't bother me a bit. But don't say we aren't brave for not doing the same. Good engineering hygiene is very important to many of us, just as food hygiene is for people frequenting restaurants.
Quote:
Some people fault me for not falling over myself to do DBTs related to the technical test results that they post. That's like faulting people for not launching large fishing boats in a tiny lake known to contain only minnows.
I have debated these topics with you for three years, Arny. In that entire time, across many discussions of DBTs, not once have I seen you document such a test. Not once! The couple of times you said you could do them, you then proceeded to ask to be paid to do so. As far as I am concerned, you may not know how to catch a fish in a trout farm! :D Let me know if you ever try to add something new of substance to the conversation. Chanting that you don't believe doesn't amount to anything. I am confident that no matter what DBT you put forward, I can find faults with it.
Quote:
For those of us who have actually heard jitter, the most revelatory experience is to ask those who believe they have trained themselves to hear jitter what it sounds like.
If there is better proof of one not understanding what jitter is than this statement, I don't know what it is. It is like saying you know all the symptoms of a virus infection by assuming there is only one kind of virus! There is no such thing as "hearing jitter." Jitter is the name for any and all clock anomalies. The created distortions can be random or predictable (correlated). The frequency and amplitude of those distortions can be anything. And last but not least, in a typical system there are many different jitters all mixed together. So to say you have heard jitter, when there are infinite variations of it, makes zero engineering or logical sense.

To show further evidence of this, here is the page you had created on your web site on jitter: http://web.archive.org/web/20050210185932/http://www.pcabx.com/technical/jitter_power/index.htm

No doubt the jitter you think you heard was from those files. As we see there, the jitter that is simulated is a single one at 60 Hz; in other words, power-supply-induced jitter. This is indeed a common jitter *component* in AV gear. So what is the issue with simulating it? Well, it is one of the least audible kinds, because it creates two distortion sidebands around each music tone, separated from it by only the same 60 Hz jitter frequency. The masking effect is quite powerful when secondary tones (jitter distortion) are very close to the primary one. Therefore such jitter is one of the least audible there is, unless you boost its level way beyond what we would find in real equipment.
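The masking argument can be quantified with the standard small-modulation approximation (the jitter amplitudes below are illustrative values of my own, not measurements from the thread): the jitter amplitude and the signal frequency set the sideband *level*, while the jitter frequency only sets how far from the tone the sidebands *land*, and 60 Hz lands well inside the masking skirt:

```python
import math

# For small sinusoidal jitter, the sideband-to-carrier ratio is roughly
# 20*log10(pi * f_signal * t_jitter). Note that the jitter *frequency*
# does not appear -- it only sets the sideband offset (f_signal +/-
# f_jitter), which is what determines how well masking hides it.
def sideband_db(f_signal_hz: float, jitter_s: float) -> float:
    return 20 * math.log10(math.pi * f_signal_hz * jitter_s)

for f_sig in (1_000, 10_000):
    for tj in (2e-9, 10e-9):   # assumed jitter amplitudes
        print(f"{f_sig:6d} Hz tone, {tj * 1e9:4.0f} ns jitter -> "
              f"{sideband_db(f_sig, tj):7.1f} dB sidebands")
```

Each tenfold increase in tone frequency or jitter amplitude raises the sidebands by 20 dB, which is why high-frequency content exposes clock problems that a 60 Hz-only simulation never will.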

Here again are some jitter distortion products:

[image: i-QHffrnc.png]

The first pair around the main signal in the middle are at 63 Hz. For the reason mentioned here, I explained earlier that they are likely not audible. But look past them: there are many distortion spikes going to the left and right. In this zoomed display, the span is +/- 600 Hz, not 60 Hz. Has Arny heard the "jitter" in this AVR? Of course not. How would he have known to create such a profile of jitter out of the blue?

What on earth would be the point of simulating 60 Hz jitter anyway? Does the person not know psychoacoustics? Or does he hope that by simulating the least audible kind, he can convince people it is not audible? I put my money on the first.

As I said, it comes down to transparency. It comes down to saying exactly what "jitter" you heard and not leaving that important part blank. It comes down to not dismissing jitter across a range of products without once measuring it. If we were running a political campaign, all of this would be fine. But not when discussing product engineering.
