Music Hardware

THX Caught With Pants Down Over Lexicon Blu-ray Player 397

SchlimpyChicken writes "Lexicon and THX apparently attempted to pull a fast one on the consumer electronics industry, but got caught this week when a couple websites exposed the fact that the high-end electronics company put a nearly-unmodified $500 Oppo Blu-ray player into a new Lexicon chassis and was selling it for $3500. AV Rant broke the story first on its home theater podcast with some pics of the two players' internals. Audioholics.com then posted a full suite of pics and tested the players with an Audio Precision analyzer. Both showed identical analogue audio performance and both failed a couple of basic THX specifications. Audioholics also posted commentary from THX on the matter and noted that both companies appear to be in a mad scramble to hide the fact that the player was ever deemed THX certified."

Comments Filter:
  • by Entropy98 ( 1340659 ) on Saturday January 16, 2010 @05:47AM (#30788868) Homepage

    Expensive isn't always better. Ever heard of Denon's $500 'Audiophile' Ethernet Cable [wired.com]?

  • oh..heh (Score:2, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @05:55AM (#30788896)

    "THX certified"? Is that about as useful as "Designed for Windows"? Or maybe "Windows Vista Certified"... hahaha

  • Credibility. (Score:5, Insightful)

    by headkase ( 533448 ) on Saturday January 16, 2010 @06:04AM (#30788938)
    Years to build, seconds to destroy. So, who comes out on top over THX now?
  • More newsworthy... (Score:3, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @06:06AM (#30788948)

    is the fact that anyone takes THX seriously anymore.

    The moment they started "certifying" those horrid Logitech surround setups should have made their irrelevance clear.

  • by imsabbel ( 611519 ) on Saturday January 16, 2010 @06:07AM (#30788954)

    Sorry, still no point.

    Signal speed in copper is about 15-20 cm per nanosecond.
    Even if they were running those things at a GHz (how many hundreds of audio channels do they transport?), being correct to the cm would be quite OK.

    And even bog-standard cables are easily inside that tolerance.
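A quick back-of-the-envelope check of the propagation argument above. The specific numbers here are illustrative assumptions (a typical ~0.66 velocity factor for twisted pair, a 1 cm length mismatch, a 1 GHz clock), not measurements of any particular cable:

```python
# How much skew does a 1 cm length mismatch cause, and how does it
# compare to one bit period even at an (absurdly fast) 1 GHz clock?
C = 29.98               # speed of light, cm per nanosecond
velocity_factor = 0.66  # assumed typical for twisted pair copper
v = C * velocity_factor # ~19.8 cm/ns signal speed

mismatch_cm = 1.0                  # assumed 1 cm mismatch between pairs
skew_ns = mismatch_cm / v          # extra delay from the mismatch

bit_rate_hz = 1e9                  # assume a 1 GHz signal
bit_period_ns = 1e9 / bit_rate_hz  # 1 ns per bit

print(f"skew from 1 cm mismatch: {skew_ns * 1000:.1f} ps")
print(f"bit period at 1 GHz:     {bit_period_ns * 1000:.0f} ps")
print(f"skew as fraction of bit: {skew_ns / bit_period_ns:.1%}")
```

Even under these worst-case assumptions the mismatch eats only a few percent of a bit period; at audio-interconnect data rates it is orders of magnitude smaller still.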

  • by Tanuki64 ( 989726 ) on Saturday January 16, 2010 @06:22AM (#30789010)
    ...because I always buy the cheapest. Mostly it's people who deem themselves audiophiles and cannot understand that I am not. For me a cheap player was always enough. Now I also have the satisfaction of knowing I am not being cheated. At least I get what I pay for. :-)
  • by Sycraft-fu ( 314770 ) on Saturday January 16, 2010 @06:49AM (#30789130)

    They say as much in the manual of Denon gear that has the port on it. You have to realize they used to stick Denon Link on most of their stuff. They do it much less now that HDMI works well. The original purpose of it was to get a digital multi-channel uncompressed audio signal off DVD-A and SACD. Prior to HDMI, there wasn't an interconnect that did that, so they rolled their own. Now it isn't so useful, so they've pulled it off most of their gear.

    At any rate, I don't think they were seriously expecting people who bought $1,000 receivers to get a $500 cable. As I said, the manual doesn't say you need to. What I think it was is audiophiles whining. They do sell some pretty expensive stuff, like a $7,500 processor/preamp. Some people who buy that probably sniveled at the thought of having to use an ordinary ethernet cable for their precious data. Denon then decided that if these people wished to waste money, they'd be happy to stick a vacuum in their pocket and suck it out.

    I don't believe it uses I2S, as they specifically talk about jitter immunity, and even if so it wouldn't matter. The data from any of the digital inputs doesn't go to a DAC, it goes to a SHARC processor (or sometimes more than one) where it is manipulated according to the setup of the receiver. From there it goes to the DAC. So it is going to get re-clocked anyhow.

  • Re:No shock (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Saturday January 16, 2010 @06:57AM (#30789166)

    I've never understood why you'd want to buy a "high end" Blu-ray player anyhow. Reason is I can see only two setups:

    1) You own a low end TV and receiver, or maybe no receiver at all. You've got no digital inputs. Thus your Blu-ray player's DACs have to handle the conversion. However, their quality matters little. Why? Well you've got a low end setup. You clearly are not concerned with quality. As such a cheap player will do fine. Improvements to its DACs and supporting analogue circuitry won't be noticeable to you.

    2) You own a high end TV/receiver and care a great deal about quality. In this case you hook the Blu-ray player up using HDMI. Reason is HDMI gives you the best signal. However in this case, the player isn't doing anything other than nabbing the data and passing it along. The analogue conversion happens in other units. So again, the quality isn't important. Your receiver's high quality DACs will handle the audio; the Blu-ray player will just send them data.

    I just can't see the case where you'd need good analogue outputs for Blu-ray.

    I can see potentially buying something like the Oppo player, if it had a good warranty and build quality. Makes sense to maybe pay more to have your gear last, but I can't see paying more for one just because it supposedly had better circuitry. Even if it does, you aren't going to make use of it. You'd be a fool to buy a high end HDTV and then not use the digital input, as the TV processes everything digitally internally.

  • by Captain_Chaos ( 103843 ) on Saturday January 16, 2010 @07:15AM (#30789252)
    What everybody seems to be missing: it is the twist lengths that are different! Not the wire lengths.

    I find this whole argument ludicrous. You can pump a hundred million bits of information per second over a cat 5 cable using Ethernet, but for a few hundred thousand bits of audio per second you suddenly need $495 worth of snake oil to be added?

  • by Dahamma ( 304068 ) on Saturday January 16, 2010 @07:18AM (#30789268)

    given a choice of 2 interconnects where one is to exact-length and the other is varying length, which would you choose, all else being equal? and ignoring the insane markup (yes, its uncalled for!).

    What are you talking about? The insane markup is the whole point! You are saying that there is probably no measurable difference - in that case, any good engineer would choose the less expensive solution. End of story.

  • by commodore64_love ( 1445365 ) on Saturday January 16, 2010 @07:29AM (#30789310) Journal

    I design computer chips. Nothing high-speed, just 500 megahertz, but even I know it *is* possible to eliminate clock jitter. You can make the final received data look as good as the original source, such that even with an oscilloscope you can't see any difference.

  • by Xugumad ( 39311 ) on Saturday January 16, 2010 @07:33AM (#30789332)

    The fancier players tend to try post-processing the input to make it look "better", in order to validate their price. This made a decent amount of sense with DVD players, where motion compensation, de-interlacing and other things could really make a difference.

    In reality, for Blu-Ray, buy a slimline PS3 and call it done, unless you want a player with a specific feature (DVR, Blu-Ray recording, etc.)

  • by Kjella ( 173770 ) on Saturday January 16, 2010 @07:46AM (#30789374) Homepage

    easy proof (similar idea): look at the inside of an hdmi switch. it has parallel twisted pairs, too. look at the squiggles on the pc board traces. they are 'making up length' with lefty/righty (tech term, lol) loops of copper trace. TIMING MATTERS on parallel digital signals!

    Of course it does, because the HDMI signal is 165MHz+ (HDMI 1.0, later added higher modes). It matters for two digital devices to talk to each other, but there's no way a human could recognize picosecond jitter in the decoded video or audio which runs in kilohertz for audio and hertz for video. And if the digital signals were wrong, like the LSB of one sample running over into the MSB of the next sample, you'd know extremely quickly unless you're blind and deaf as it'd all be noise.

    In short, your technical knowledge is as lousy as your tech terms, because the situation audiophiles describe will never happen. Either you plug in a digital cable and the decoder will decode it perfectly with a timing accuracy far, far greater than human senses, or the decoder will fail and it'll all be shit. It's never "almost right with jitter", not on the kHz scale.
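The timescale mismatch in the comment above is easy to put in numbers. A rough sketch (the 100 ps jitter figure is an assumed, deliberately generous value, not a measured one):

```python
# Compare link-level jitter against the timescales a human could
# conceivably perceive in the decoded audio.
hdmi_clock_hz = 165e6    # HDMI 1.0 pixel clock rate
audio_rate_hz = 48_000   # typical audio sample rate
jitter_s = 100e-12       # assume a generous 100 ps of link jitter

hdmi_period_s = 1 / hdmi_clock_hz    # ~6 ns per clock
sample_period_s = 1 / audio_rate_hz  # ~20.8 us per audio sample

print(f"jitter / HDMI clock period:   {jitter_s / hdmi_period_s:.2%}")
print(f"jitter / audio sample period: {jitter_s / sample_period_s:.2e}")
```

The link either resolves each bit cleanly within its ~6 ns window or it fails outright; relative to a ~21 microsecond audio sample, 100 ps is a few parts per million.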

  • by commodore64_love ( 1445365 ) on Saturday January 16, 2010 @07:54AM (#30789402) Journal

    I like your post but there is a minor error. OS X is not open source. It's derived from NeXT which is a closed-source OS from the 1980s that was ported to the PowerPC platform, and is still closed source today.

    Wow. I can't believe I just defended Apple. That's like defending Chrysler's practice of taking a Dodge Stratus, rebadging it as a Chrysler Sebring, and then adding $10,000 to the price tag. Honda/Acura and Toyota/Lexus do the same deal.

  • by Anonymous Coward on Saturday January 16, 2010 @08:24AM (#30789530)
    Ethernet signals travel at a very large fraction of the speed of light. Light travels around 11 inches in a nanosecond. So you're claiming picosecond tolerances in your clock signals.
  • Re:Credibility. (Score:3, Insightful)

    by TubeSteak ( 669689 ) on Saturday January 16, 2010 @08:36AM (#30789612) Journal

    In some cases, they impose restrictions that aren't acceptable to manufacturers either. Speakers are a good example. The high end THX spec (don't know about the lower ones) requires speakers to be sealed with a natural rolloff at 80Hz. Ok, well maybe I don't want that. In fact, I for sure don't want that for music. I want more full range speakers, and I'd like them ported as that increases low end efficiency. Ok, well they can't be THX then, no matter how good they are.

    Buy a (sub)woofer.
    With a woofer, you won't notice the 80Hz rolloff.
    If you're porting a mid-range in order to bump up the low end, you're doing it wrong.

  • by marcansoft ( 727665 ) <hector AT marcansoft DOT com> on Saturday January 16, 2010 @08:43AM (#30789654) Homepage

    You're confusing jitter with clock skew. Clock skew means nothing as long as the input signal is still within the setup/hold times of the receiver. It either works or doesn't. This isn't to say that you don't need good matching, just that better matching will not improve quality.

    Jitter is different. Jitter is uneven clocking. On the other hand, jitter is almost nonexistent on separate clock/data connections because any delays in the clock are consistent.

    Jitter does matter in things like S/PDIF that combine clock and data, because then the data will affect the distortion on the clock and it will be jittery when recovered. This is what all the talk about jitter is: S/PDIF (and similar) clock recovery. Don't mix it up with other issues and other interfaces.

    S/PDIF does have improved quality if the signal is less distorted, because it improves jitter. This problem can be completely eliminated by using a buffer before the DAC, or at least a PLL to clean up the clock (it only affects DACs that clock straight off of the recovered S/PDIF clock). Other interfaces (I2S) with separate clock and data do not have this problem because any distortion on the clock is consistent cycle to cycle.
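The buffering argument above can be illustrated with a toy model. This is only a sketch with made-up numbers (a 48 kHz sample clock, up to 2 ns of data-dependent jitter on the recovered clock), contrasting a DAC clocked directly off the recovered S/PDIF clock with one fed through a FIFO reclocked by a clean local crystal:

```python
import random

# Toy model: a "dumb" DAC converts at the jittery recovered-clock
# edge times; a buffered DAC reads the same samples out on exact
# multiples of a clean local clock period.
FS = 48_000   # sample rate, Hz
T = 1 / FS    # ideal sample period, seconds

random.seed(1)
# Recovered clock edges carry data-dependent jitter (up to +/- 2 ns).
recovered_times = [n * T + random.uniform(-2e-9, 2e-9) for n in range(10)]

# A FIFO decouples the two clock domains: samples go in at the jittery
# recovered times, and come out at exact multiples of the local clock.
reclocked_times = [n * T for n in range(10)]

worst_in = max(abs(t - n * T) for n, t in enumerate(recovered_times))
worst_out = max(abs(t - n * T) for n, t in enumerate(reclocked_times))
print(f"worst-case timing error off recovered clock: {worst_in * 1e9:.2f} ns")
print(f"worst-case timing error after reclocking:    {worst_out * 1e9:.2f} ns")
```

In the model, the output timing after the buffer depends only on the local oscillator, which is the point: once the bits are safely in a buffer, jitter on the incoming link no longer reaches the DAC.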

  • by Anonymous Coward on Saturday January 16, 2010 @08:47AM (#30789678)

    Yes, this is a troll post, but all this has just proved to me that you so-called audiophiles are the biggest bunch of pretentious wankers ever to exist.

  • by Anonymous Coward on Saturday January 16, 2010 @09:09AM (#30789768)

    Wow, I bet that 4.6 megabit/s (per channel) throughput requires some serious signaling hardware!

  • by DrXym ( 126579 ) on Saturday January 16, 2010 @09:16AM (#30789802)
    Audiophiles are stupid. As long as something comes in a chunky heavy box with knobs, meters and valves they'll pay a substantial markup even if the innards are nothing special. Onkyo and Pioneer have both sold Blu Ray players which are almost the same as $100 Magnavox models sold in Walmart with a huge markup.

    The really, really stupid audiophiles don't stop at $3500 though. Go and have a laugh at the Goldmund [goldmund.com] players [goldmund.com]. How does anyone ever manage to play a Blu-ray without a "magnetic damper"? I expect if you cracked them open they'd be built around the same SoCs powering devices costing 1/20th the price.

  • Re:No shock (Score:5, Insightful)

    by Shawn Parr ( 712602 ) <<moc.rrapnwahs> <ta> <rrap>> on Saturday January 16, 2010 @09:57AM (#30789976) Homepage Journal

    This overlooks one group of people who actually exist in large numbers but are often overlooked:

    3. You have a nice HDTV and use HDMI digital for that. You also have a very nice audio system, one that you put together before the HDMI specification was well established, so it does not have HDMI. But your Receiver/PrePro/Amplifiers are very good, and you don't want to replace them just to get ones with HDMI built in. Luckily they can take 5.1 or 7.1 analog inputs from a player with good quality outputs.

    This is exactly why I like the Oppo BluRay player. At the time for a minimal cost increase over other BR players I was able to use both a digital connection to my TV, and use the latest audio upgrades on BR along with my older, but very good, audio system. That being said I would never pay the $2000 plus for the 'high end' BR players. The Oppo is excellent, and I don't even have the special edition model with upgraded audio components. I'm sure it's fabulous, but the regular one I have is really really good.

    Why replace perfectly good equipment just to get a new connector, when you can still use it and get great performance out of it? I occasionally get the itch to replace those components, but when I research new ones I just don't see enough upgrade for what it would cost to justify it at this point.

  • by jo_ham ( 604554 ) <joham999@noSpaM.gmail.com> on Saturday January 16, 2010 @09:58AM (#30789982)

    Then you are talking to the wrong "Mac people", or are wilfully ignoring the ones who are telling you otherwise, unless we are going to expand this to "the sort of people who don't read Slashdot", and if you're going to include the nominally "clueless" users then you have to do that for the Windows side too.

    Assuming you are just talking to people with actual computer knowledge, there are very few Apple users who believe the components inside the box are some sort of magical things that are just not used by PC makers.

    It's a long known business practice of Apple that has served them well - it's turnkey or nothing. Their direct competitor is not Dell or HP, or even a whitebox home builder, and not even really companies like Alienware who go for the prestige/high performance in fancy case gamer market. They're just kind of off on their own, doing their own thing. If you want a hassle free OS X box, you buy it from them. Sure you can make yourself a hackintosh if you like for less money, but you lose out on the form factor and warranty and so on. Those things are worth it to some people. The form factor of my iMac alone was worth the price I paid for it over the equivalent spec PC from any other vendor, not to mention OS X (and the ability to triple boot if needed).

    It's simply not the case that Apple drop PCs into Apple cases and put the price up - not *literally* in any case (and watch this paragraph get selectively quoted by an AC for instant karma) - the components may be the same, but what size and shape is that $300 AMD machine? How loud is it? What version of OS X does it run out of the box?

    The Mac Mini is expensive because it uses laptop components and crams them into a desktop form factor, and laptop parts cost more than desktop ones do. A better comparison would be a 3Ghz laptop, minus screen (and yes, even then the PC will be cheaper).

    My iMac is the same - C2D 2Ghz, 2GB Ram, 500GB Sata HD (self upgrade - stock was 250GB), 20" 1680x1050 screen. I know that I could get a PC with those specs in late 2006 when I bought it for *much less*, but then I lose the all-in-one form factor and the fact that I can just pull the wall plug, put it into its box (that has a carry handle) and travel transatlantic with it several times as checked baggage as if it was just another suitcase.

    Sure, most people who have one won't be moving it very often, but even at home, it is a very small footprint and small use of space for what it is - it's fabulous not having a tower stashed under my desk.

    Not everything about buying a computer is about finding the most CPU+GPU for your money.

  • by Alioth ( 221270 ) <no@spam> on Saturday January 16, 2010 @10:11AM (#30790052) Journal

    Really? The clock is *that* intolerant on a 3 and a bit Mbps signal that a couple of mm is going to really make a difference?

    Sorry - but a normal ethernet cable will be more than adequate. You're wasting your money if you spent $500 on the Denon cable - you've been had. Ensuring the PCB traces are exactly the same length isn't good engineering for this particular task, it's simply wasting your time. I simply do not believe the clock tolerance is measured in picoseconds.

  • by ultranova ( 717540 ) on Saturday January 16, 2010 @10:56AM (#30790290)

    What idiot would design a digital transmission protocol without built-in error correction?

    The kind that wants to sell $500 cables to be used with it?

  • by phoenix321 ( 734987 ) * on Saturday January 16, 2010 @11:29AM (#30790494)

    Using a protocol that absolutely requires perfectly identical conductor lengths on internal PCBs can be solid engineering.

    Using that protocol outside of separate units, over flexible wiring, is questionable, since there is no control over wiring specs, thermal expansion, wiggling in the cable etc..

    Which is probably the reason the Denon Link cable is that expensive: Ethernet cabling is probably incompatible with that protocol and the optimal cabling for the protocol requires very low tolerances for everything.

    Which is not to say the Denon Link cable is anything but price gouging: solid engineering would've either
    - required rigid connectors between units, where length is independent of wiggling and twisting of a cable
    - put all the components inside the same box to get over the need for external connectors
    - designed and used a protocol that increases the wire length tolerance
    - used cabling that already has identical strand lengths (DVI or HDMI?)

  • by thetoadwarrior ( 1268702 ) on Saturday January 16, 2010 @11:36AM (#30790550) Homepage
    A Mac would be comparable to a high end PC and they're about the same price. A Mac will be better than something bought in Wal-Mart because the Wal-Mart PC will have shit parts and will be subsidised by all the crap installed on it.

    Yes you can clean it out and even re-install a clean copy of windows to ensure it works to its best but then you're paying with your time rather than money.

    Tight-wads love stories like this to justify buying the cheapest shit out there but in general you'll find middle of the road stuff is the best. High-end stuff is always over priced and is more about brand than performance. Low-end is for people in trailers and will mean cheaper parts, less support or something.

    Mid-range is basically the real high-end, aimed at normal people. Whereas anything that is advertised as being high-end is for pompous jerks with more money than sense. These are the sort of people who don't care how long it lasts because they can buy 2 more to replace it.

    Apple does sort of move into the high-end market but, as I said, a good PC that won't be out of date in 2 months will cost about the same. Apple will likely charge more and they realise that which is why they act like Nazis and like to have their systems closed up as much as possible. Having fewer pieces of crap software on your system and less hardware variety will lead to perceived quality increase despite using the same parts.

    While it's the same parts inside, it's also what's on the outside that makes a difference, and Apple has led in design and usability. Even their old G4s had nice doors that are opened via a handle rather than some funky ass piece of metal that requires you to take the screws out, and even then they can often be a pain to slide in and out of place, or some big ass U-shaped piece of metal that is more likely to cut you than go back on properly.

    What you're paying for on average with a Mac is for them to employ more designers and usability experts than Acer will ever have and Nazi-like control. Sure they could subsidise the cost with shit-ware from Norton, McAfee, Real, etc and it would be cheaper and it would still have a superior design to a Wal-Mart PC but then it would run worse and there wouldn't be much point in moving to OSX and giving up all your Windows software if you're not getting the stability.

    Unlike the case with Lexicon, I don't think Apple hides the fact that their hardware is the same stuff inside and that their quality comes through other means, which do work if you're willing to give up the freedom. It happens that I'm not willing to give up the freedom, so I've never owned a Mac. It's just too easy to build your own quality PC.
  • by hufman ( 1670590 ) on Saturday January 16, 2010 @11:39AM (#30790580)
    I don't think this has to do with the length of time that a wireless signal takes to be transmitted. I think it's affected by the length of time that the music from the rest of band takes to travel through the air.
  • by u38cg ( 607297 ) <calum@callingthetune.co.uk> on Saturday January 16, 2010 @12:13PM (#30790780) Homepage
    You're doing it wrong. You need to charge *more* for your spit-shined cable, not less. Cheaper cable is just clearly cheaper. Can't you hear the bits screaming in pain, man?
  • by mgrivich ( 1015787 ) on Saturday January 16, 2010 @12:20PM (#30790822)
    If you assist a shady deal, you share responsibility for that shady deal. If you have the choice of assisting in a shady deal, or selling nothing, the better choice is to sell nothing. Of course the best option is to figure out how to make money without selling dishonest goods. You also seem to be saying, "It is the customer's fault for being an idiot." Tell me when you car gets stolen, is it your fault for not putting a tracker on it? When you get mugged, is it your fault for not taking combat training? We can't be good at everything. It is the moral obligation of the experts to not steal from the non-experts.
  • by AmiMoJo ( 196126 ) on Saturday January 16, 2010 @01:12PM (#30791272) Homepage Journal

    So what you are saying is that two guys spent large amounts of money on hifi gear and now both of you can hear a difference between digital cables which all the science and testing says are the same?

    Psychology is always a problem with this sort of thing. Unless you can show that you can tell the difference in double blind tests then I'm afraid you won't be able to convince me. Every time people have done double blind tests the results have shown that they can't tell the difference between cheap digital cables and expensive ones, probably because there isn't any.
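The double-blind point above is quantifiable with standard ABX scoring. A minimal sketch (the 12-of-16 figure is just an example, not a result from any actual test):

```python
from math import comb

# Score a double-blind ABX cable test: with n trials and k correct
# identifications, how likely is a result at least this good from
# pure guessing (each trial a fair coin flip, p = 0.5)?
def abx_p_value(k, n):
    # One-sided binomial tail: P(X >= k) for X ~ Binomial(n, 0.5)
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Example: a listener gets 12 of 16 trials right.
p = abx_p_value(12, 16)
print(f"p-value for 12/16: {p:.3f}")  # ~0.038: unlikely to be pure luck
```

Anything near chance (8 of 16, say) gives a large p-value, which is exactly the pattern blind cable tests keep producing.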

  • by Hatta ( 162192 ) on Saturday January 16, 2010 @01:14PM (#30791296) Journal

    And we did a series of tests.

    Were they properly blinded? If not, they mean nothing.

  • Re:No shock (Score:1, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @01:19PM (#30791324)

    You guys are forgetting another one.

    4) You like pretty things that look alike and match.

    Check out the sexy copper and brass colored components from ADA [ada.net] from 30 years ago. These systems weren't cheap and at the time, someone, somewhere, thought they looked great.

    People pay big bucks for this every day in other industries and with other products as well.

  • by Anonymous Coward on Saturday January 16, 2010 @01:24PM (#30791384)

    Imagine a company that would take a few hundred bucks worth of regular PC parts, add a slightly modified free open-source OS, package the thing in a white shiny box and sell it for a few thousand bucks... What a scam it would be!

    If you think OSX is "slightly modified", stop holding Command+S when you turn it on.

  • by marcansoft ( 727665 ) <hector AT marcansoft DOT com> on Saturday January 16, 2010 @02:04PM (#30791728) Homepage

    hint: you can't fully undo jitter at the reclocking phase. you can attenuate some but if the source adds jitter then you can't magically take it all 'back' via reclocking.

    Yeah, you can. See, this is where you audiophiles fail: you take some word that refers to a real concept and mangle it into fairy tales that have nothing to do with reality.

    Jitter is real, it affects craptacular dumb DACs driven by a clock recovered from a self-clocking signal, and it's eliminated completely by any form of clocked digital processing or buffering on the signal after clock recovery. Or in other words, yes, it exists, no, it doesn't affect 99% of AV receivers out there.

  • Simple fraud (Score:2, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @02:05PM (#30791740)

    This is a clear case of fraud, but because it was perpetrated by a corporation there will be no legal consequences.

  • by dotgain ( 630123 ) on Saturday January 16, 2010 @03:30PM (#30792410) Homepage Journal
    It's a fact that SPDIF doesn't use ACKs/ retransmits, yes. But that's not even relevant in this case. Clock recovery / regeneration never had anything to do with that.

    While -1, Troll is not the correct mod for an insubstantial argument, at this stage the GP is quite deserving of the downmods. I'm surprised more of his posts aren't -1, Funny.
