THX Caught With Pants Down Over Lexicon Blu-ray Player 397
SchlimpyChicken writes "Lexicon and THX apparently attempted to pull a fast one on the consumer electronics industry, but got caught this week when a couple websites exposed the fact that the high-end electronics company put a nearly-unmodified $500 Oppo Blu-ray player into a new Lexicon chassis and was selling it for $3500. AV Rant broke the story first on its home theater podcast with some pics of the two players' internals. Audioholics.com then posted a full suite of pics and tested the players with an Audio Precision analyzer. Both showed identical analogue audio performance and both failed a couple of basic THX specifications. Audioholics also posted commentary from THX on the matter and noted that both companies appear to be in a mad scramble to hide the fact that the player was ever deemed THX certified."
Audio/Videophiles Beware (Score:5, Insightful)
Expensive isn't always better. Ever heard of Denon's $500 ‘Audiophile’ Ethernet cable? [wired.com]
Re:Audio/Videophiles Beware (Score:5, Funny)
Re: (Score:3, Funny)
"designed for the audio enthusiast"
Those people are fools! I only buy Monster cables.
Re: (Score:3, Informative)
The question is: do the Denon units use the Ethernet protocol?
The answer to that question will determine whether you're smart or whether you bought into their marketing chant.
Re: (Score:3, Interesting)
I don't own the Denon, so I can't say for sure.
I actually build my own S/PDIF hardware and audio DACs (my audio gear is all DIY stuff), and I do use I2S as an 'interconnect' between S/PDIF receivers and the DAC chips. (When we build DACs, we take great care to lay out the PCB traces to ENSURE that the I2S lines are exact(!) lengths. It's just proper engineering.)
Re:Audio/Videophiles Beware (Score:5, Insightful)
Re: (Score:3, Funny)
Ethernet signals travel at a very large fraction of the speed of light
0.59c isn't THAT large.
Large enough for your mother.
Re:Audio/Videophiles Beware (Score:5, Interesting)
I'm laughing my ass off. You don't seriously think the jitter caused by that minuscule difference in cable length will mess with anything designed to use twisted pair as an interconnect, do you?
We're not talking about memory buses running at several GHz; we're talking about relatively low-bandwidth interconnects between devices. And this is assuming that you're not encapsulating everything and just using Ethernet signaling like everyone else in the pro audio world does.
Re:Audio/Videophiles Beware (Score:4, Interesting)
I'm impressed, but you must admit, at least, that makes you a little "eccentric" (in the best definition of the word).
I'm curious: can you hear the difference when using unequal cable lengths?
I've known "audiophiles" to claim some pretty wild stuff, so I have to ask.
[Note: I produce recorded music for a living now, so I have a professional interest]
Re:Audio/Videophiles Beware (Score:5, Funny)
There's a world of misery in high-end audio these days. First off, I do NOT have a high-end DVD player, as I have yet to find a DVD player with decent enough audio out. But my neighbour and I both have really awesome stereo systems, and we regularly test different cables and suchlike. Oddly enough, cabling, even for digital, does make a difference, sometimes dramatic.
My system:
Computer: iBook, USB to (DAC)
CD: Rotel 855, spdif out to (DAC):
DAC: Musical Fidelity DAC, w/ M.F. power unit, kimber cable to:
Pre: Bryston, to:
Amp: B&K, using amazingly cheap yet excellent flat audio cable to:
Speakers: Home built. SEAS 8in woofers with 5in Audax mids and 1 in tweeters, in ported forward firing towers.
I also have a turntable: SOTA Comet with a REGA tonearm and Sumiko Blue Point Cart that goes to a Rotel ttable preamp. I also have an old Onkyo FM tuner that I rarely use.
My friend's system: Computer: IBM ThinkPad, USB to DAC
CD: Rotel 855, spdif out to DAC:
DAC: Benchmark to:
Pre: Melos optical, to
Amp: Phase Linear 400 to
Speakers: Watson Lab 10s (monster towers. Filled with Audax drivers)
And we did a series of tests. Our results were:
1. The best listening on both systems was this arrangement:
24bit FLAC files on Computer via USB to DAC to AMP to SPEAKERS.
The FLAC sounds better than CD because the CD player's error correction has to account for defects in the disc: dust, fingerprints, vagrant cruft, the fact that the discs aren't perfectly circular, etc.
2. Getting good electricity was paramount - plugging directly to the wall socket noticeably screwed with the sound.
3. We found that the Preamps very very very slightly altered the sound stage. We both have high quality passive preamps, and they shouldn't change anything, but they did. The Bryston was less affecting than the Melos. We swapped preamps one day, and decided the Melos sounded a wee tiny bit nicer, but was slightly more tiring with my speakers and amp. As a consequence, the ever so slightly better sound was to go directly from the DAC to the AMP.
4. Next to solid electric provided by power conditioners, cabling made a big difference. We both use fairly high end Kimber cables, so that is not the issue. What is supremely weird is the USB cable made a difference. We had some junk USB cables sitting around and used those. Then we both chipped in and got a stupidly expensive ($85) USB cable. It sounded great. That afternoon, I bought a hard drive that came with a USB cable. The FREE CABLE sounded better. No shit. On both systems. So, we got our money back on the USB cable and took our families out for pizza and beer.
Also, the SPDIF cable made a huge difference. The cheap plastic SPDIF lightpipe thingie sucked. It wreaked havoc with the soundstage. However, the SPDIF RCA style optical was WAY better. Why? No idea.
5. The second best arrangement was with the Preamps back in the system. Frankly, the differences were tiny. My neighbour noticed it more than me.
6. We both have shorter cable lengths. We both used to have longer cables, but after repeated testing on both systems, the longest cable either of us now has is 1 meter, except, of course, for the speaker cable. The speaker cable is an interesting issue. For years I used heavy-duty lamp cord. Then I bought balanced studio TRS cables. Then I figured out that any decent cable is just fucking copper. Pure copper. It's NOT the cable: it's the interconnects. Building my own cables is doable, as I can buy high-end silver interconnects and solder them to pure copper cables. I still have some Kimber cables, but it's mostly home built.
People poo poo home built, as if a kit is inferior. If you're careful and precise, and know how the stuff works, homebrew gear can be VASTLY superior. Example: I bought a pair of Polk Audio Monitor 5s at a pawn shop for $60. I used them until a tweeter failed. The cabinets were in PERFECT cond
Re:Audio/Videophiles Beware (Score:5, Insightful)
So what you are saying is that two guys spent large amounts of money on hifi gear and now both of you can hear a difference between digital cables which all the science and testing says are the same?
Psychology is always a problem with this sort of thing. Unless you can show that you can tell the difference in double blind tests then I'm afraid you won't be able to convince me. Every time people have done double blind tests the results have shown that they can't tell the difference between cheap digital cables and expensive ones, probably because there isn't any.
Re:Audio/Videophiles Beware (Score:5, Funny)
Yes.
It is common knowledge that the human body is not built symmetrically, and the distances between ear and brain vary from one side to the other.
I usually lug along my portable CAT scanner for adjusting the cable lengths properly and provide my customers with the best aural experience possible.
Re:Audio/Videophiles Beware (Score:5, Insightful)
Really? Is the clock *that* intolerant on a 3-and-a-bit Mbps signal, such that a couple of mm is really going to make a difference?
Sorry - but a normal ethernet cable will be more than adequate. You're wasting your money if you spent $500 on the Denon cable - you've been had. Ensuring the PCB traces are exactly the same length isn't good engineering for this particular task, it's simply wasting your time. I simply do not believe the clock tolerance is measured in picoseconds.
Re:Audio/Videophiles Beware (Score:5, Informative)
It doesn't matter for i2s, it's a clocked interface. As long as setup and hold times are met, the data will be valid. Picoseconds aren't going to mess things up when the setup and hold time specs are measured in nanoseconds or more.
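The setup/hold argument above can be sketched in a few lines. This is a toy model, not any real datasheet: the `meets_timing` helper and all the numbers (a 217 ns I2S-ish bit clock, 10 ns setup/hold) are illustrative assumptions chosen to show why picosecond-scale skew is irrelevant when the timing windows are nanoseconds wide.

```python
# Toy timing check: a clocked receiver latches data correctly as long as
# the data transition stays outside the setup/hold window around the
# clock edge. All numbers are illustrative, not from a real datasheet.

def meets_timing(skew_s, setup_s, hold_s, clock_period_s):
    """Return True if a data line skewed by `skew_s` relative to the
    clock still satisfies the setup/hold window, assuming the data
    transition is nominally centered in the clock period."""
    margin = clock_period_s / 2  # nominal distance from data edge to clock edge
    return (margin - abs(skew_s)) >= setup_s and (margin - abs(skew_s)) >= hold_s

# I2S-ish numbers: ~4.6 MHz bit clock (217 ns period), 10 ns setup/hold.
period = 217e-9
setup = hold = 10e-9

print(meets_timing(50e-12, setup, hold, period))   # 50 ps of skew: fine
print(meets_timing(105e-9, setup, hold, period))   # ~half a period off: fails
```

The point of the sketch: skew only matters once it starts eating into a margin that is four orders of magnitude larger than the picoseconds being argued about.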
Re: (Score:3, Insightful)
Using a protocol that absolutely requires perfectly identical conductor lengths on internal PCBs can be solid engineering.
Using that protocol outside of separate units, over flexible wiring, is questionable, since there is no control over wiring specs, thermal expansion, wiggling in the cable etc..
Which is probably the reason the Denon Link cable is that expensive: Ethernet cabling is probably incompatible with that protocol and the optimal cabling for the protocol requires very low tolerances for everythin
Re:Audio/Videophiles Beware (Score:5, Interesting)
Please do the calculation and tell us what the difference in transit times is for, say, 40m of cable.
Clue: do you actually believe that a band whose musicians use different-length guitar/mic cables cannot possibly play in time?
Re:Audio/Videophiles Beware (Score:4, Insightful)
Re:Audio/Videophiles Beware (Score:4, Informative)
The ONLY place where I expect the digital bit stream to have any problem is at the DAC (or when sampling at the ADC). The BER (bit error rate) in a standard digital link at 192 kHz audio is going to be on the order of 10^-10, which means 1 error in every ~18 minutes of audio. I doubt even trained ears can notice a 1-bit error over that interval. And heck, the interface is probably even better: 10^-12 BER isn't out of reach.
As for jitter: if you have a 30 kHz signal, which only your audiophile dog can hear, a 10 ps RMS jitter (which can be considered around 70 ps pk-pk jitter) would be translated by the DAC into roughly 115 dB SNR; and it would not introduce distortion, only noise (a good chunk of which would then be filtered out by the lowpass filters in the audio chain). If you are dealing with a 3 kHz tone, the SNR would be around 135 dB.
I've had to fight jitter problems on high-speed ADCs (100+ MSPS) in the past; you have to use BPF and clock-divider techniques there. Audio systems don't need that. To me, 192 kHz is DC. The wavelength of the ~10 Mbit/s interface clock is around 100 ft, and for a digital system you would only close the eye by 1% with a 1-foot mismatch.
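The BER and jitter figures above can be re-derived in a few lines. This is a back-of-envelope sketch assuming 192 kHz / 24-bit stereo PCM and the textbook jitter-limited SNR formula for a full-scale sine, SNR = -20·log10(2π·f·t_jitter); the function name is my own.

```python
import math

# Mean time between bit errors for stereo 24-bit PCM at 192 kHz,
# assuming a bit error rate of 1e-10 on the link.
bitrate = 192_000 * 24 * 2            # bits per second
ber = 1e-10
seconds_per_error = 1 / (ber * bitrate)
print(f"one bit error every {seconds_per_error / 60:.1f} minutes")  # ~18 min

def jitter_limited_snr_db(freq_hz, rms_jitter_s):
    """Jitter-limited SNR for a full-scale sine at freq_hz."""
    return -20 * math.log10(2 * math.pi * freq_hz * rms_jitter_s)

print(f"{jitter_limited_snr_db(30_000, 10e-12):.1f} dB")  # ~114.5 dB at 30 kHz
print(f"{jitter_limited_snr_db(3_000, 10e-12):.1f} dB")   # ~134.5 dB at 3 kHz
```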
Re:Audio/Videophiles Beware (Score:5, Informative)
Well, it was Grace Hopper, and she's the mother of COBOL, not FORTRAN. She used to give out "nanoseconds" at her lectures: 11.8-inch lengths of wire, representing how far electricity can go in a nanosecond. A friend of mine still has one of these "nanoseconds" he got from her.
Re: (Score:2, Informative)
I could be wrong, but this sounds like Cat-5 to me (rather than 5e, where the twists are all the same).
No it works fine with normal Cat-5 (Score:5, Insightful)
They say as much in the manual of Denon gear that has the port on it. You have to realize they used to stick Denon Link on most of their stuff. They do it much less now that HDMI works well. The original purpose of it was to get a digital multi-channel uncompressed audio signal off DVD-A and SACD. Prior to HDMI, there wasn't an interconnect that did that, so they rolled their own. Now it isn't so useful, so they've pulled it off most of their gear.
At any rate, I don't think they were seriously expecting people who bought $1,000 receivers to get a $500 cable. As I said, the manual doesn't say you need to. What I think it was is audiophiles whining. They do sell some pretty expensive stuff, like a $7,500 processor/preamp. Some people who buy that probably sniveled at the thought of having to use an ordinary Ethernet cable for their precious data. Denon then decided that if these people wished to waste money, they'd be happy to stick a vacuum in their pocket and suck it out.
I don't believe it uses I2S, as they specifically talk about jitter immunity, and even if so it wouldn't matter. The data from any of the digital inputs doesn't go to a DAC, it goes to a SHARC processor (or sometimes more than one) where it is manipulated according to the setup of the receiver. From there it goes to the DAC. So it is going to get re-clocked anyhow.
Re:No it works fine with normal Cat-5 (Score:4, Interesting)
A few years ago I worked for a famous chip company with only one real competitor. When they came out with a chip that was smaller, faster, and used less juice than ours, we were, ahem, green with envy.
And we raised our prices.
The marketing VP explained to us at a meeting that people will perceive our chips as being better, even when they know the facts prove otherwise, because if it "costs more it must be better."
I'd like to point out that most "audiophiles" are usually scrounging vintage gear at Goodwill, and pretty much tweak their analogue gear with rubber bands and safety pins or whatever works. It's the guys with too much money who are buying the alleged high-end gear.
Re: (Score:2)
The impact on the audio signal is irrelevant.
Re:Audio/Videophiles Beware (Score:5, Interesting)
Given that the maximum cable length under best conditions (I'm not even accounting for cable twisting here) is about 100m, at 0.5c the delay between sender and receiver is about 6.6*10^-9. Not quite 7 nanoseconds, if I am not mistaken. The time it takes your computer to execute about 30 atomic instructions. Considering your reflexes take a billion times longer, I would be amazed if you can hear THAT.
Re:Audio/Videophiles Beware (Score:4, Informative)
Thanks, I was too lazy to do that bit of the calculation, though note that 100 m at 0.5c actually works out to about 670 nanoseconds, not 7. Now let's put that into some more relevant terms. The highest sound a typical human can hear is around 20 kHz; let's be generous and say an audiophile can hear 25 kHz (the upper range of my hearing is well above average, but still a way below 25 kHz). At 25 kHz, the wavelength of the sound is about 1.37 cm (assuming one atmosphere of pressure), so that gives the absolute shortest wavelength you're likely to hear. In 670 nanoseconds, this sound travels about 0.017 wavelengths, so the skew over a 100 m cable could make the sound out of phase by roughly 1.7% at most. Now, the human ear is very good at detecting differences in phase; it's something you're likely to find subconsciously irritating even if you can't tell what's causing it. Even a magical audiophile ear (the one we're assuming can hear ultrasonic audio), however, is going to have difficulty spotting a phase variance that small, and on a digital link the delay doesn't shift the audio's phase at all anyway; everything just arrives a hair later.
I always assumed those Ethernet cables were a joke. I'm astonished that people take them seriously. If I had a few million dollars, I'd buy one to reward the manufacturers for making me laugh, but I never thought people would buy them expecting a real difference in quality.
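The cable-delay arithmetic can be double-checked in a few lines. This sketch assumes 0.5c propagation in the cable and 343 m/s for the speed of sound; both are round illustrative figures.

```python
# How long 100 m of cable delays a signal at 0.5c, and how big that
# delay is relative to the shortest wavelength a generous human ear
# could resolve (25 kHz).

C = 299_792_458.0        # speed of light in vacuum, m/s
SPEED_OF_SOUND = 343.0   # m/s at roughly room temperature

cable_m = 100.0
delay_s = cable_m / (0.5 * C)              # ~6.7e-7 s, i.e. ~670 ns
wavelength_m = SPEED_OF_SOUND / 25_000     # ~1.37 cm at 25 kHz
fraction = (SPEED_OF_SOUND * delay_s) / wavelength_m

print(f"delay: {delay_s * 1e9:.0f} ns")                              # ~667 ns
print(f"phase offset at 25 kHz: {fraction * 100:.2f}% of a wavelength")  # ~1.67%
```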
Re: (Score:3, Interesting)
It's like going to a fancy restaurant. The food might taste the same as, or even worse than, a moderately priced one, but you got dressed up and paid a lot of money for it, so you feel like the whole thing is a bit higher "quality". Quality being an entirely subjective measurement.
Re: (Score:2)
http://en.wikipedia.org/wiki/I%C2%B2S [wikipedia.org]
Note what they say at the bottom: I²S signals can easily be transferred via Ethernet-spec connection hardware (8P8C plugs and jacks, and Cat-5e and above cabling).
Care to correct wikipedia?
Re: (Score:2)
Am I surprised that wikipedia might be wrong? No.
Comment removed (Score:5, Informative)
Re: (Score:3, Informative)
His point is correct, although the details are a bit misleading.
Just to give an impression of the magnitudes involved, I2C high-speed signals (here's the spec [nxp.com]) have rise/fall times in the 10-80 ns range. The setup time, which depends on the synchronization between the data and clock lines, has a minimum spec of 10 ns. If the implementation puts things in the center of the window, there's about an 80 ns setup time, so there might be 70 ns of slop available on either side.
Twisted pair c
Re: (Score:2, Interesting)
Thank you for a good laugh. I love hearing audiophiles brag about wasting money on cabling and talk like they are experts on subjects they obviously know nothing about. I have no idea how sensitive the ear is to clock jitter, but I can tell you that the only clock that matters is the one driving the audio DAC, and it is not the same one used for the Ethernet connection.
Re: (Score:3, Insightful)
I design computer chips. Nothing high-speed, just 500 megahertz, but even I know it *is* possible to eliminate clock jitter. You can make the final received data look as good as the original source, such that even with an oscilloscope you can't see any difference.
Re:Audio/Videophiles Beware (Score:5, Funny)
Just because you can't measure it doesn't mean that an audiophile can't hear it.
Re: (Score:3, Funny)
It's just a shame that all that cash goes to the least scrupulous companies.
Someone should start a rumour that the perfect speaker-casing material is Haitian rubble.
Re:Audio/Videophiles Beware (Score:4, Insightful)
Yeah, you can. See, this is where you audiophiles fail: you take some word that refers to a real concept and mangle it into fairy tales that have nothing to do with reality.
Jitter is real, it affects craptacular dumb DACs driven by a clock recovered from a self-clocking signal, and it's eliminated completely by any form of clocked digital processing or buffering on the signal after clock recovery. Or in other words, yes, it exists, no, it doesn't affect 99% of AV receivers out there.
Re: (Score:2)
>>>ethernet cable (modern spec) has UNEQUAL LENGTH WIRES. This will 'mess' with digital audio clock and data (i2s).
Even if we assume that's true, digital error correction will correct any clock skew, so it does not matter if the wires are unequal. You will get the same result as if the wires were equal. That's the advantage of digital audio: it self-corrects.
You're the typical audiophile who is still thinking in analog terms (which can be affected by inferior wire), and not realizing the
Re:Audio/Videophiles Beware (Score:5, Informative)
We're not talking about data loss here, but data degeneration. And it used to be a problem with analog cables. A signal might have been distorted by a badly shielded cable, because the signal was sent into the wire and then reproduced the way it was received. If it was altered along the way, that alteration was often audibly noticeable.
That doesn't apply to digital data. If a 1 is sent and is received as a "0.8" or a "1.2", it will still be interpreted as a 1. Simply because there is no 0.8 or 1.2, as there used to be in analog times. Yes, the signal can still degenerate, but since we use discrete values of signals in digital media instead of a "sliding scale" analog signal, that degeneration is easily compensated. It can now be identified correctly and it is adjusted accordingly. So that signal degeneration plays a lesser role now. Of course, if you have REALLY crappy cables it will show. But the average cable that wasn't tied in a knot first will do just fine.
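The "0.8 is still a 1" argument above is easy to demonstrate. A minimal sketch, with a made-up `decode` helper and an assumed 0.5 decision threshold:

```python
# Why analog degradation doesn't carry through a digital link: the
# receiver only decides "above or below threshold", so a 1 that arrives
# as 0.8 or 1.2 is still decoded as exactly 1.

def decode(levels, threshold=0.5):
    """Slice noisy analog voltage levels back into clean bits."""
    return [1 if v > threshold else 0 for v in levels]

sent     = [1, 0, 1, 1, 0]
received = [0.8, 0.1, 1.2, 0.9, -0.05]   # noisy analog levels on the wire
print(decode(received))                   # [1, 0, 1, 1, 0] -- identical to `sent`
```

Only when the noise is large enough to push a level across the threshold does the decoded data differ at all, which is the REALLY-crappy-cable case the parent mentions.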
Re: (Score:3, Informative)
>>>That doesn't apply to digital data. If a 1 is sent and is received as a "0.8" or a "1.2", it will still be interpreted as a 1. ..... So that signal degeneration plays a lesser role now. Of course, if you have REALLY crappy cables it will show. But the average cable that wasn't tied in a knot first will do just fine.
>>>
Well said. If a digital signal "1" degrades below 0.6 there's a possibility the computer inside the receiving unit will misinterpret that "1" as a "0" but as you said tha
Re: (Score:3, Informative)
Well it turns out you're right. S/PDIF doesn't use error correction. It's as error-prone as analog. What idiot would design a digital transmission protocol without built-in error correction?
Re:Audio/Videophiles Beware (Score:5, Insightful)
The kind that wants to sell $500 cables to be used with it?
Re: (Score:3, Informative)
Re:Audio/Videophiles Beware (Score:4, Funny)
If an S/PDIF cable starts losing bits, it will be rather noticeable. It will be full of pops and clicks.
Having a 'bad signal' without it being blatantly obvious is near impossible, and if you get a bad signal, I suggest you move your audio equipment further away from your unshielded nuclear reactor or your kilowatt FM transmission tower or unwrap it from around powerline step-down transformers, or whatever the fuck is causing the problem.
Re: (Score:3, Insightful)
I find this whole argument ludicrous. You can pump a hundred million bits of information per second over a Cat-5 cable using Ethernet, but for a few hundred thousand bits of audio per second you suddenly need $495 worth of snake oil added?
Re:Audio/Videophiles Beware (Score:5, Funny)
(shines ethernet cable)
(attaches fake Denon label)
I've got some amazing Denon wire here, personally spit-polished to ensure the absolute best in digital transmission quality. And at only $249 this is a real bargain! (audiophiles stampede into the room). My god. It's almost like being Timothy Geithner - I'm printing my own money.
Re:Audio/Videophiles Beware (Score:4, Insightful)
Re: (Score:3, Informative)
Note that when you twist two wires together, they reach a shorter distance, because each wire now has to follow a longer spiral path; more twists means a longer path for the same reach. So to make up for the loss in length you need to use more wire. Thus a differing number of twists means a different amount of wire is needed to reach the same distance. Not a lot, but there _is_ a difference.
In ancient times, clamps were made by putting a wooden rod between two ropes and using the rod to twist the ropes, pulling the two ends cl
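The twist geometry above can be quantified: each conductor in a twisted pair follows a helix, so its length per twist is sqrt(pitch² + (π·d)²), where d is the helix diameter. The pitch and diameter below are made-up but plausible numbers, and `extra_length_pct` is my own helper name.

```python
import math

# Extra conductor length caused by twisting, from the helix arc length
# per twist: sqrt(pitch^2 + (pi * d)^2). Illustrative dimensions only.

def extra_length_pct(pitch_mm, helix_diameter_mm):
    """Percent extra wire per unit of cable reach for a given twist."""
    helix = math.hypot(pitch_mm, math.pi * helix_diameter_mm)
    return (helix / pitch_mm - 1) * 100

print(f"{extra_length_pct(12.0, 1.0):.1f}% extra wire")   # tighter twist
print(f"{extra_length_pct(18.0, 1.0):.1f}% extra wire")   # looser twist
```

A few percent per pair, and differing between pairs with different twist rates, which is exactly the "not a lot, but there _is_ a difference" the parent describes.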
Re: (Score:3, Interesting)
Still, at the end of the day (or the wire, rather) you get a signal that is composed of 0s and 1s. There is no "in between"; there is no "bit rot" in the medium. That might have been real for analog transfer, when it mattered that the signal was transferred verbatim. It can NOT be transferred any other way today. It is EITHER 0 OR 1. There is no in between.
Re:Audio/Videophiles Beware (Score:5, Interesting)
I have been told (directly, not third party) by one of the highest authorities at Denon Electronics that their cable is a shielded Cat5e cable... They only made it to satisfy custom installers who wanted something ridiculous to sell clients who had more money than sense. Off the record of course...
In this case Denon aren't bad guys, they just aren't stupid. They had enough requests and knew these guys would simply go elsewhere to get what they wanted (another product they could sell people who, if they dropped a $100 bill on the ground, would think it a waste of time to stoop over and pick it up).
In this case, the people at fault are the installers who can't seem to charge for their time and instead want to cultivate an industry where their services are "free" and everything is paid through them buying products at cost and selling them at retail to clients. The really big installers know how to run a business, but the middle and lower tiers are largely fueling customer ignorance of the value of their services.
Re: (Score:3, Insightful)
Re:Audio/Videophiles Beware (Score:5, Informative)
No, please let ME explain this.
The wires are in pairs. Color coded pairs.
Depending on your point of view, the green pair is TX (transmit) and the orange pair is RX (receive).
Since the green pair (solid/striped pair) is twisted together, both green wires are the same length. Both orange wires are the same length (although green and orange may have slightly different lengths).
ALL of the signal data is transmitted in ONE DIRECTION on ONE PAIR (man I love caps emphasis) whose wires are the SAME LENGTH.
Understand that yet? Data from component A to component B travels over two wires which are the _same length_. Data back from component B to A travels over another pair of same-length wires. There is no "messing" with clock data, as it's all serial. And even if it used the 1000BASE-TX standard requiring 2 pairs per direction, the difference in length of conductors in a 10-meter cable (10 meters would be a very big audio rack) is 4.44 centimeters (assuming 24 ga and standard insulation thickness, and miswiring to get the longest/shortest paired). Rounding that up to 5 centimeters, and using 300 million meters/second for the speed of light and 0.64c propagation speed in the wire, I get about one four-billionth of a second of difference. Meaning your sample rate would need to be in the GHz range, which means that if you can tell the difference, you would be able to "hear" VHF and UHF radio waves.
Keep rationalizing though... it keeps my debunking skills fresh.
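The skew arithmetic in the comment above checks out; here it is as a sketch, using the same rounded inputs (5 cm of conductor mismatch, 0.64c propagation, c rounded to 3×10^8 m/s):

```python
# Time skew from 5 cm of conductor-length mismatch at 0.64c.

C = 3.0e8                        # m/s, rounded as in the parent post
mismatch_m = 0.05
skew_s = mismatch_m / (0.64 * C)
print(f"skew: {skew_s:.2e} s")   # ~2.6e-10 s, about a quarter of a nanosecond
```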
Re: (Score:3, Funny)
That's why the only setup a true Audiophile will have is a listening room setup inside of a vacuum chamber. No worries about changes in air pressure.
Re:Audio/Videophiles Beware (Score:4, Informative)
Actually, interconnect and speaker cables do (audibly) benefit from good quality, to a reasonable extent.
Interconnect, yes. Speaker cables, no.
Plenty of blind tests have shown that there's no audible difference between the most expensive speaker cables and cheap telephone wire. If you look at the math, you'd see that the wire noise is something like a hundred times less than the distortion introduced by the speakers themselves, so spend the $500 on better speakers and use whatever wire you've got for cables.
Interconnect cables transmit much weaker signals, so noise has a greater effect there.
Re: (Score:2, Informative)
Actually, there IS something to that cable. Very, very minor, but it's there.
I believe that cable is NOT for Ethernet, even though it uses RJ45. I THINK it's used for I2S in audio, and that is VERY timing-dependent (clock and data on different wires).
Now here's where most people don't know something and think they do: Ethernet cable these days does NOT have equal-length wires! Yet I2S for S/PDIF break-out NEEDS each wire exactly the same length (timing matters, again), and so you cannot really use Ethernet cable. Look it up.
Re: (Score:3, Insightful)
Sorry, still no point.
Signal speed in copper is about 15-20cm per second.
Even if they were running those things at a GHz (how many hundreds of audio channels do they transport), being correct to the cm would be quite ok.
And even bog-standard cables are easily inside that tolerance.
Re: (Score:2, Interesting)
BTW, I am NOT defending the price on this! Just the fact that there IS something to the Denon cable that most people are not seeing and don't even know about (the Ethernet thing with different-length pairs inside).
It should be priced MUCH lower, of course. But still, the fact is that I2S does require exact-length wires on all the links between the S/PDIF receiver chip and the DAC chip (which is what I2S is all about, really; it's not even an external interconnect but intended entirely for use INSIDE cd players,
Re: (Score:2)
What? Signal speed in copper is over 100,000 KILOMETERS per second. Am I completely misunderstanding what you're trying to say?
Re:Audio/Videophiles Beware (Score:5, Informative)
I tell you, audiophiles have NO IDEA OF SCALE.
I am working with HF stuff. I run on cables that cost as much as that one, but in bulk supply from industry vendors (Huber+Suhner, for example), because they are linear to 18 GHz.
I also did an experiment where I had to synchronize two signals to within a few picoseconds, and that is damn hard. Damn hard in the sense of "a day of quality time with a network analyser and a few delay lines".
---
Speaking again of HDMI: yeah, cable quality matters there, as it's running at several hundred times the data rate of an audio connection.
HDMI is specified to transport up to 10Gbits/s, multiplexed on only 19 conductors.
Compare again with audio datarates...
Re:Audio/Videophiles Beware (Score:5, Informative)
At 20 kHz, two chunks of aluminum make for a pretty nice cable. With 50 GHz coax you need tiny precision-machined connectors (2.4 mm) and a very narrow cable with a low-permittivity dielectric. Such a cable costs about $2,000.
The reason for the precision, size, and expense at those frequencies (as you know but others probably don't) is that if you have a large cable, there are multiple different wave equation solutions (modes) which allow power of a particular frequency to travel down a cable, and they will propagate at different speeds in the cable (and different attenuations), so what you get out of the cable is a distorted version of the input. So you must make the cable with size on the order of the smallest wavelength you intend to transmit. And it has to be precisely made because imperfections, scratches, and so on need to be even smaller or they will cause an impedance shift which reflects some signal back at the source.
At 50 GHz a wavelength is 6 mm. At 20 kHz it is 15 km. This is why it is easy to make very nice audio cables, and hard to make nice HF cables.
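The wavelength comparison above is just λ = c/f, which a couple of lines confirm:

```python
# Free-space wavelength at the two frequencies from the parent post.

C = 299_792_458.0  # speed of light, m/s

for label, freq_hz in [("50 GHz", 50e9), ("20 kHz", 20e3)]:
    print(f"{label}: wavelength = {C / freq_hz:.4g} m")
# 50 GHz -> ~0.006 m (6 mm); 20 kHz -> ~15,000 m (15 km)
```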
Re:Audio/Videophiles Beware (Score:4, Insightful)
given a choice of 2 interconnects where one is to exact-length and the other is varying length, which would you choose, all else being equal? and ignoring the insane markup (yes, its uncalled for!).
What are you talking about? The insane markup is the whole point! You are saying that there is probably no measurable difference - in that case, any good engineer would choose the less expensive solution. End of story.
Re: (Score:3, Insightful)
easy proof (similar idea): look at the inside of an hdmi switch. it has parallel twisted pairs, too. look at the squiggles on the pc board traces. they are 'making up length' with lefty/righty (tech term, lol) loops of copper trace. TIMING MATTERS on parallel digital signals!
Of course it does, because the HDMI signal is 165MHz+ (HDMI 1.0, later added higher modes). It matters for two digital devices to talk to each other, but there's no way a human could recognize picosecond jitter in the decoded video or audio which runs in kilohertz for audio and hertz for video. And if the digital signals were wrong, like the LSB of one sample running over into the MSB of the next sample, you'd know extremely quickly unless you're blind and deaf as it'd all be noise.
In short, your technical
Re:Audio/Videophiles Beware (Score:5, Insightful)
You're confusing jitter with clock skew. Clock skew means nothing as long as the input signal is still within the setup/hold times of the receiver. It either works or doesn't. This isn't to say that you don't need good matching, just that better matching will not improve quality.
Jitter is different. Jitter is uneven clocking. On the other hand, jitter is almost nonexistent on separate clock/data connections because any delays in the clock are consistent.
Jitter does matter in things like S/PDIF that combine clock and data, because then the data will affect the distortion on the clock and it will be jittery when recovered. This is what all the talk about jitter is: S/PDIF (and similar) clock recovery. Don't mix it up with other issues and other interfaces.
S/PDIF does have improved quality if the signal is less distorted, because it improves jitter. This problem can be completely eliminated by using a buffer before the DAC, or at least a PLL to clean up the clock (it only affects DACs that clock straight off of the recovered S/PDIF clock). Other interfaces (I2S) with separate clock and data do not have this problem because any distortion on the clock is consistent cycle to cycle.
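The buffer-before-the-DAC point above can be illustrated with a toy simulation. Everything here is an illustrative assumption (48 kHz sample rate, nanosecond-scale Gaussian arrival jitter, a fixed-latency FIFO): samples arrive at jittery times, but they leave on the fixed ticks of a clean local clock, so the DAC never sees the arrival jitter.

```python
import random

# Why a FIFO plus a clean local clock removes recovered-clock jitter:
# jittery arrival times in, perfectly uniform output times out.

random.seed(1)
period = 1 / 48_000                              # nominal sample period, s
arrivals = [i * period + random.gauss(0, 5e-9)   # ns-scale arrival jitter
            for i in range(100)]

# Reclock: play sample i at exactly i * period on the local clock,
# plus a small fixed latency so the buffer never underruns.
latency = 10 * period
outputs = [latency + i * period for i in range(100)]

jitter_in = max(abs(t - i * period) for i, t in enumerate(arrivals))
jitter_out = max(abs(t - latency - i * period) for i, t in enumerate(outputs))
print(f"input timing error:  {jitter_in:.2e} s")   # nanosecond scale
print(f"output timing error: {jitter_out:.2e} s")  # ~0 (float rounding only)
```

The fixed latency is the price paid: the buffered approach trades a few samples of delay for complete immunity to upstream clock recovery jitter.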
Re: (Score:2)
ethernet cable these days is NOT equal length wires! yet i2s for spdif break-out NEEDS each wire exactly the same length (timing matters, again)
But the different twist rates of the pairs in Plain Ordinary Cat-5 don't make any difference. You're talking about a difference of a few millimetres over a whole 305 m roll of Cat-5; in a sane length of patch cable that would make a difference on the order of a few femtoseconds.
Re: (Score:2)
I2S clock jitter does NOT affect audio performance (Score:5, Informative)
As someone who has actually interfaced I2S sigma-delta DACs to DSPs, I can tell you that you are either confused or have your facts wrong.
The clocking setup is typically a master clock running at 256x, 384x, or 512x the sample rate feeding the DAC; it is the stability of this clock that determines the accuracy of the analogue output.
The I2S bus has three lines: CLK (the bit clock), which runs at 32x the sample rate (for 16-bit stereo audio); DATA (the actual bits); and LR, which indicates whether the data is for the left or right channel. Jitter on the data line has no bearing on the quality of the output as long as the data is present at the clock transition, since it is latched and presented synchronously to the analogue section of the DAC.
Although I2S was not designed for cabled communications, you could easily get away with using it over short distances, since even at 24 bits and 96kHz the bit clock is only 4.608MHz, with a cycle time of 217ns. Assuming a latch window of 25% of the cycle time gives us roughly 54ns; any device producing that much jitter would have to be pretty badly designed.
So to cut a long story short, yes for I2S using ethernet cable is more.
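The arithmetic above is easy to verify. A short back-of-the-envelope script (standard I2S framing: one bit-clock cycle per bit, two channels per frame):

```python
def i2s_bit_clock_hz(bits_per_sample, sample_rate_hz, channels=2):
    """I2S bit clock rate: one clock cycle per bit, all channels per frame."""
    return bits_per_sample * channels * sample_rate_hz

bck = i2s_bit_clock_hz(24, 96_000)   # 24-bit stereo at 96 kHz
cycle_ns = 1e9 / bck                  # one bit-clock period, in ns
latch_window_ns = 0.25 * cycle_ns     # the 25% latch window assumed above

print(f"bit clock: {bck / 1e6:.3f} MHz")
print(f"cycle time: {cycle_ns:.0f} ns, 25% latch window: {latch_window_ns:.0f} ns")
```

This reproduces the 4.608 MHz / 217 ns figures in the comment, and shows the 25% window works out to roughly 54 ns.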
Re: (Score:3, Informative)
You have no idea why twisted pairs are twisted do you?
ahem. as both a designer and builder of digital audio equipment, I have to say you are DEAD WRONG. I fully know about differential encoding over twisted, self-shielding pairs. it's the same idea POTS uses, and the same that pro audio uses with XLRs.
but running pairs next to each other causes interference. THIS is why they use unequal length PAIRS. PAIRS. that's the key: each pair 'beats' at a slightly diff frequency (swr, really) and there is some natural a
Re:Audio/Videophiles Beware (Score:4, Funny)
How does a 2-direction arrow silkscreened onto the connector improve anything?
Re: (Score:3, Interesting)
I always thought that was the reason why they used digital cabling in the first place: to get a perfectly lossless transfer and have CRCs to prove it.
"Common off the shelf" ethernet parts now have an uncorrectable bit error rate of around 10^-10, which means a cheap "small-office/home" Netgear or D-Link setup will see roughly one flipped bit every 10 seconds when continuously blasting at the full 1Gbps.
One bit off, every ten seconds, at maximum transfer speed.
Software and protocols handle that on the receivin
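The "one bit every ten seconds" figure follows directly from the numbers quoted above. A one-liner makes the arithmetic explicit:

```python
def mean_time_between_bit_errors_s(rate_bps, ber):
    """Expected seconds between bit errors at a sustained line rate.
    E.g. 1 Gbps at a bit error rate of 1e-10 -> one error every ~10 s."""
    return 1.0 / (rate_bps * ber)

t = mean_time_between_bit_errors_s(1e9, 1e-10)
print(f"~one bit error every {t:.0f} s at 1 Gbps with BER 1e-10")
```

Of course, as the comment notes, higher-layer checksums and protocols catch these errors on the receiving end, so a flipped bit does not mean corrupted audio.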
Re: (Score:3, Informative)
He was right. You don't know.
Pairs of wires are twisted together to couple them to each other, rather than to their environment. This is useful for both digital and analog signals, and is key to differential signals which may accumulate quite a large "common" signal over their length from environmental factors, especially differences in "ground". Similar effects can also be achieved by wrapping one wire inside the other, also known as coaxial cable, but that's far more expensive to make and more awkward to
Re:Audio/Videophiles Beware (Score:4, Informative)
The difference in wire length for I2S is either catastrophic or inaudible; there is no in-between. It does affect the DAC, and matching clock and data lengths is important, but it's a data-corruption issue: if the lengths differ enough that the signal falls outside the setup and hold times of the DAC, you get glitchy audio. It is not an "analog" difference.
Clock jitter may be audible, and mismatched clock skew between outputs can be too, but skewed clock and data to a single DAC will not cause any audible changes until you exceed the specifications, and then all hell breaks loose.
No shock (Score:5, Informative)
Say it ain't so!
Re:No shock (Score:5, Insightful)
I've never understood why you'd want to buy a "high end" Blu-ray player anyhow. Reason is I can see only two setups:
1) You own a low end TV and receiver, or maybe no receiver at all. You've got no digital inputs. Thus your Blu-ray player's DACs have to handle the conversion. However, their quality matters little. Why? Well you've got a low end setup. You clearly are not concerned with quality. As such a cheap player will do fine. Improvements to its DACs and supporting analogue circuitry won't be noticeable to you.
2) You own a high end TV/receiver and care a great deal about quality. In that case you hook the Blu-ray player up using HDMI, since HDMI gives you the best signal. But then the player isn't doing anything other than grabbing the data and passing it along. The analogue conversion happens in other units. So again, the player's quality isn't important: your receiver's high quality DACs will handle the audio, and the Blu-ray player will just send them data.
I just can't see the case where you'd need good analogue outputs for Blu-ray.
I can see potentially buying something like the Oppo player, if it had a good warranty and build quality. Makes sense to maybe pay more to have your gear last, but I can't see paying more for one just because it supposedly had better circuitry. Even if it does, you aren't going to make use of it. You'd be a fool to buy a high end HDTV and then not use the digital input, as the TV processes everything digitally internally.
Re:No shock (Score:5, Insightful)
This overlooks one group of people who actually exist in large numbers but are often overlooked:
3. You have a nice HDTV and use HDMI for that, but you also have a very nice audio system that you put together before the HDMI specification was well established, so it has no HDMI inputs. Your Receiver/PrePro/Amplifiers are very good, and you don't want to replace them just to get HDMI built in. Luckily, they can take 5.1 or 7.1 analog inputs from a player with good quality outputs.
This is exactly why I like the Oppo Blu-ray player. At the time, for a minimal cost increase over other BR players, I was able to use both a digital connection to my TV and the latest audio formats on Blu-ray along with my older, but very good, audio system. That being said, I would never pay $2000-plus for the 'high end' BR players. The Oppo is excellent, and I don't even have the special edition model with upgraded audio components. I'm sure it's fabulous, but the regular one I have is really, really good.
Why replace perfectly good equipment just to get a new connector, when you can still use it and get great performance out of it? I occasionally get the itch to replace those components, but when I research new ones I just don't see enough upgrade for what it would cost to justify it at this point.
oh..heh (Score:2, Insightful)
"THX certified" is that about as useful as "Designed for Windows"? or maybe "Windows Vista Certified"...hahaha
Credibility. (Score:5, Insightful)
Re:Credibility. (Score:5, Interesting)
Sadly, it's been a years-long downward slide with THX. They used to certify only high-end theatres; then they added high-end home theatre setups; then the standards for commercial theatres slowly slipped until basically everyone who wasn't showing films in a tin can got certified; then they started certifying middle-of-the-road home theatre setups, then individual pieces of home-theatre hardware, and recently even some decent but not exactly world-class Logitech computer speakers.
Re:Credibility. (Score:5, Informative)
Yeah. While overall I like the idea of certification grades, THX did a bad job of it. Part of the problem is that they don't do enough to differentiate the certification types: they all feature THX in big letters and then something small that tells you what the actual certification is. Well, that matters a lot. A high-end Ultra 2 certification on speakers pretty much means they can handle theatre reference levels of sound; they can truly give you a home theater. Their lower end stuff? Not so much.
Also when it came to computer speakers they started compromising too much. It wasn't a matter of backing off on some specs that really didn't matter too much, they changed it so much to accommodate the lower end nature of computer speakers as to make it more or less meaningless.
Personally, I don't buy THX gear. It is a waste of money in my book. All the gear I seem to like the best doesn't bother getting THX certified. They don't need a label saying "This is good for home theater." You take a listen to it and you say "This is good for home theater," no badge needed.
In some cases, they impose restrictions that aren't acceptable to manufacturers either. Speakers are a good example. The high end THX spec (don't know about the lower ones) requires speakers to be sealed with a natural rolloff at 80Hz. Ok, well maybe I don't want that. In fact, I for sure don't want that for music. I want more full range speakers, and I'd like them ported as that increases low end efficiency. Ok, well they can't be THX then, no matter how good they are.
Really, if you are looking for good home theatre, you'll do much better buying high quality gear you like, and making sure to get a receiver that has a good calibration solution like Audyssey MultEQ. Having your setup properly dialed in to correct levels and delays and such is way more important than if the speaker is precisely what THX likes.
Re: (Score:3, Insightful)
In some cases, they impose restrictions that aren't acceptable to manufacturers either. Speakers are a good example. The high end THX spec (don't know about the lower ones) requires speakers to be sealed with a natural rolloff at 80Hz. Ok, well maybe I don't want that. In fact, I for sure don't want that for music. I want more full range speakers, and I'd like them ported as that increases low end efficiency. Ok, well they can't be THX then, no matter how good they are.
Buy a (sub)woofer.
With a woofer, you won't notice the 80Hz rolloff.
If you're porting a mid-range in order to bump up the low end, you're doing it wrong.
Purpose of 80 Hz rolloff (Score:3, Informative)
While THX has no convenient spec for download on their homepage, I have gleaned the following from various forums (errors of the posters possible ;-)
- 80 Hz is the crossover frequency between the subwoofer and the full range speakers
- The subwoofer is fed the signal through a low-pass filter with a 24 dB/oct slope at 80 Hz
- The full range speakers are fed the signal through a high-pass filter with a 12 dB/oct slope at 80 Hz. Together with the natural roll-off, that amounts to a high-pass filter with a 24 dB/oct slope.
My semi-educated opinion (electric
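Those slopes follow from standard filter orders: an Nth-order Butterworth rolls off at 6N dB per octave, so 24 dB/oct is 4th order and 12 dB/oct is 2nd order. A short sketch using the ideal Butterworth magnitude response (the analytic formula, not any particular receiver's implementation):

```python
import math

def butter_lp_db(f, fc, order):
    """Magnitude (dB) of an ideal Nth-order Butterworth low-pass at f."""
    return -10 * math.log10(1 + (f / fc) ** (2 * order))

def butter_hp_db(f, fc, order):
    """Magnitude (dB) of an ideal Nth-order Butterworth high-pass at f."""
    return -10 * math.log10(1 + (fc / f) ** (2 * order))

fc = 80.0  # the THX crossover frequency discussed above

# Measure the slope one octave deeper into each filter's stopband:
lp_slope = butter_lp_db(fc * 4, fc, 4) - butter_lp_db(fc * 2, fc, 4)
hp_slope = butter_hp_db(fc / 4, fc, 2) - butter_hp_db(fc / 2, fc, 2)

print(f"sub low-pass slope:    {lp_slope:.1f} dB/oct (4th order, ~24 dB/oct)")
print(f"mains high-pass slope: {hp_slope:.1f} dB/oct (2nd order, ~12 dB/oct)")
```

Both filters are 3 dB down at the 80 Hz crossover point, which is why the mains' 12 dB/oct electrical filter plus the sealed box's natural 12 dB/oct roll-off sums to the same 24 dB/oct as the sub's low-pass.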
Re: (Score:3, Interesting)
True, but your typical $500-$1000 receiver from the late 2000s can probably produce output that is every bit as clean and generally good as a $2500 unit from the mid '90s or earlier.
Better DACs that use more bits, and hardware that internally uses higher sampling rates, have become cheap. The noise floor is lower on modern equipment too, as chip manufacturing (even for analog chips like op-amps) has improved greatly, with much lower distortion. DSPs have gotten cheap as well; in even a modest setup these days the internals are
More newsworthy... (Score:3, Insightful)
is the fact that anyone takes THX seriously anymore.
The moment they started "certifying" those horrid Logitech surround setups should have made their irrelevance clear.
Haha, and some people ridicule me.... (Score:5, Insightful)
Re: (Score:2)
I was involved in a quite heated /. discussion about this, and the conclusion was as follows:
Spend on the source ( cd player / turntable / receiver ) and the reproduction units aka speakers.
As for a lot of high-end equipment, there are still a few brands worth paying the price for, like McIntosh, but most of what you get these days is exactly what this story is about: selling the brand. Screw what's inside; sell the brand.
had a similar case with B&O and Panasonic (Score:5, Interesting)
When I was working for a Bang & Olufsen dealer, we had a broken TV we had to pick up from a client and fix. The TV in question was a rebadged Panasonic in a nice B&O frame. We repaired the TV in the workshop and tested it. After that we put it back in its B&O frame and returned it to the customer, only to find it wasn't working. Why? One of us had managed to accidentally press the original Panasonic power button while putting the TV back in the B&O frame. Try explaining that to a customer.
Re:had a similar case with B&O and Panasonic (Score:5, Informative)
want a worse example? let's continue with panasonic, but let's bring LEICA into it!
rebadging was never quite the same as when 'red dot' leica did it. they took semi-crappy pany digicams, slapped a leica logo on them, LIED TO THE PUBLIC about the lineage of the cameras (saying they were qa'd in germany, which is an out and out LIE) and then sold the cams at several times the pany price.
LEICA used to be a real high end camera company. they lost face when they pulled this stunt. there are leica lenses in the $3k range that are 'real leicas', but a $500 digicam that is rebadged is not a real leica, even though the brand lies thru their teeth about it. (when dpreview.com was pressed, they dodged the issue. probably due to lost advertising income if they fessed up that the fz50 and vlux1 are the same friggin cameras. touch that 'third rail' and you lose advertising revenue and review samples. yup, we know the game, guys...)
Re: (Score:3, Funny)
Woah! Hold on a second! Are you telling me that George Lucas is trying to pass crap off to an unsuspecting public?!?!?!?!? Say it ain't so!
THX? (Score:5, Funny)
Wow. I'm sticking with THC.
Could never happen with computers... (Score:4, Funny)
Imagine a company that would take a few hundred bucks worth of regular PC parts, add a slightly modified free open-source OS, package the thing in a white shiny box and sell it for a few thousand bucks... What a scam it would be!
Re: (Score:3, Insightful)
I like your post but there is a minor error. OS X is not open source. It's derived from NeXT which is a closed-source OS from the 1980s that was ported to the PowerPC platform, and is still closed source today.
Wow. I can't believe I just defended Apple. That's like defending Chrysler's practice of taking a Dodge Stratus, rebadging it as a Chrysler Sebring, and then adding $10,000 to the price tag. Honda/Acura and Toyota/Lexus do the same deal.
Re: (Score:3, Informative)
I like your post but there is a minor error. OS X is not open source. It's derived from NeXT which is a closed-source OS from the 1980s that was ported to the PowerPC platform, and is still closed source today.
You may want to read this [wikipedia.org]:
Darwin is an open source POSIX-compliant computer operating system released by Apple Inc. in 2000. It is composed of code developed by Apple, as well as code derived from NeXTSTEP, BSD, and other free software projects.
Darwin forms the core set of components upon which Mac OS X, Apple TV, and iPhone OS are based. It is compatible with the Single UNIX Specification version 3 (SUSv3) and POSIX UNIX applications and utilities.
Darwin's heritage began with NeXT's NeXTSTEP operating system (later known as OPENSTEP), first released in 1989. After Apple bought NeXT in 1997, it announced it would base its next operating system on OPENSTEP. This was developed into Rhapsody in 1997 and the Rhapsody-based Mac OS X Server 1.0 in 1999. In 2000, Rhapsody was forked into Darwin and released as open-source software under the Apple Public Source License (APSL), and components from Darwin are present in Mac OS X today.
Darwin version 10.2 corresponds to Mac OS X 10.6.2.
Re: (Score:3, Insightful)
Yes you can clean it out and even re-install a clean copy of windows to ensure it works to its best but then you're paying with your time rather than money.
Tight-wads love stories like this to justify buying the cheapest shit out there but in general you'll find middle of
Re:Could never happen with computers... (Score:4, Insightful)
Then you are talking to the wrong "Mac people", or are wilfully ignoring the ones who are telling you otherwise, unless we are going to expand this to "the sort of people who don't read slashdot", and if you're going to include the nominally "clueless" users then you have to do that for the Windows side too.
Assuming you are just talking to people with actual computer knowledge, there are very few Apple users who believe the components inside the box are some sort of magical things that are just not used by PC makers.
It's a long known business practice of Apple that has served them well - it's turnkey or nothing. Their direct competitor is not Dell or HP, or even a whitebox home builder, and not even really companies like Alienware who go for the prestige/high performance in fancy case gamer market. They're just kind of off on their own, doing their own thing. If you want a hassle free OS X box, you buy it from them. Sure you can make yourself a hackintosh if you like for less money, but you lose out on the form factor and warranty and so on. Those things are worth it to some people. The form factor of my iMac alone was worth the price I paid for it over the equivalent spec PC from any other vendor, not to mention OS X (and the ability to triple boot if needed).
It's simply not the case that Apple drop PCs into Apple cases and put the price up - not *literally* in any case (and watch this paragraph get selectively quoted by an AC for instant karma) - the components may be the same, but what size and shape is that $300 AMD machine? How loud is it? What version of OS X does it run out of the box?
The Mac Mini is expensive because it uses laptop components and crams them into a desktop form factor, and laptop parts cost more than desktop ones do. A better comparison would be a 3GHz laptop, minus screen (and yes, even then the PC will be cheaper).
My iMac is the same - C2D 2GHz, 2GB RAM, 500GB SATA HD (self upgrade - stock was 250GB), 20" 1680x1050 screen. I know that I could get a PC with those specs in late 2006 when I bought it for *much less*, but then I lose the all-in-one form factor and the fact that I can just pull the wall plug, put it into its box (that has a carry handle) and travel transatlantic with it several times as checked baggage as if it was just another suitcase.
Sure, most people who have one won't be moving it very often, but even at home, it is a very small footprint and small use of space for what it is - it's fabulous not having a tower stashed under my desk.
Not everything about buying a computer is about finding the most CPU+GPU for your money.
But it was greatly improved! (Score:3, Funny)
The blog got it all wrong! Lexicon is very honest about taking the Oppo player and improving upon it, and boy, did they!
It's common knowledge that the audiophile listener derives his pleasure not from the quality of sound reproduction but from the price tag of his equipment.
So an audiophile is getting 7x the pleasure from listening to the Lexicon compared to the Oppo. Beat that if you can!
/greger
Re: (Score:2)
Monster Cables (Score:4, Funny)
I'll bet they forgot to use the Monster Cables.
How many more products like this are there? (Score:4, Interesting)
One of the sites linked to by this story, in turn linked to a glowing review of this Blu-Ray player by another site that praised its superiority [hometheaterreview.com] over the very Oppo unit it is "based" on.
With my interest piqued, I browsed a little more on this site, and found a review for an HD projector that sounded weirdly similar [hometheaterreview.com] in that it appears to be a JVC projector that has been repackaged and rebadged at a higher price, and got a similarly glowing review. Without any real technical scrutiny, of course. I wonder how many more products are out there of a similarly repackaged and fraudulent nature.
Re: (Score:3, Interesting)
Is this really a surprise? (Score:4, Insightful)
The really, really stupid audiophiles don't stop at $3500, though. Go and have a laugh at the Goldmund [goldmund.com] players [goldmund.com]. How does anyone ever manage to play a Blu-ray without a "magnetic damper"? I expect if you cracked them open they'd be built around the same SoCs powering devices costing 1/20th the price.
"High end" computers (Score:3, Interesting)
It's amusing that we don't have "high end" computers for multimedia use. Features might include:
These are the kind of specs you see in hard real time systems that have to run both time-critical and non-time-critical code. "Multimedia PCs" ought to have specs like that, but they don't. So you still get pausing and stuttering if something else interferes with playback.
A typical test in the real time world is to hook up a square wave generator to an input pin and a digital oscilloscope to an output pin. You then run a program which is waiting for interrupts triggered by the input pin, and when the user process triggered by the interrupt gets control, it turns on the output pin. You load up the CPU with other, lower-priority tasks. You watch the results on a storage 'scope, timing the time from input to output. You expect all the spikes to be below the promised time threshold. If there are any outliers, users get annoyed, file bug reports, and it gets fixed. This is how you get rid of "jitter" at the OS level.
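A rough software-only analogue of that scope test can be run on any PC. This is a sketch, not a substitute for hardware pin-to-pin measurement: it asks the OS to wake the process after a fixed interval and records how late each wakeup actually is; the spread of the overshoot is the scheduling jitter an RTOS spec would bound.

```python
import time
import statistics

def measure_sleep_overshoot_us(target_us=1000, samples=200):
    """Request a target_us sleep repeatedly and record how many
    microseconds late each wakeup is (the scheduling overshoot)."""
    overshoots = []
    for _ in range(samples):
        t0 = time.monotonic_ns()
        time.sleep(target_us / 1e6)
        elapsed_us = (time.monotonic_ns() - t0) / 1000.0
        overshoots.append(elapsed_us - target_us)
    return overshoots

o = measure_sleep_overshoot_us()
print(f"median overshoot: {statistics.median(o):.0f} us, worst: {max(o):.0f} us")
```

On a stock desktop OS under load you will see exactly the outliers described above; a hard real-time system is one where the worst case, not the median, is guaranteed.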
Re: (Score:3, Informative)
Seriously? Yeah, Lexicon's ridiculously overpriced equipment used to be worth the ridiculous prices, but *now* they are ruining the company and overcharging.
Or maybe they have ALWAYS been charging a 500%+ markup on their products just because they could. I'm not saying Lexicon doesn't have some of the best products in the business - just that the best products in the business do NOT need to cost 5-10x the average products in the business...
Re: (Score:3, Insightful)
The fancier players tend to try post-processing the input to make it look "better", in order to validate their price. This made a decent amount of sense with DVD players, where motion compensation, de-interlacing and other things could really make a difference.
In reality, for Blu-Ray, buy a slimline PS3 and call it done, unless you want a player with a specific feature (DVR, Blu-Ray recording, etc.)