Music Hardware

THX Caught With Pants Down Over Lexicon Blu-ray Player 397

Posted by timothy
from the high-end-but-not-high-road dept.
SchlimpyChicken writes "Lexicon and THX apparently attempted to pull a fast one on the consumer electronics industry, but got caught this week when a couple websites exposed the fact that the high-end electronics company put a nearly-unmodified $500 Oppo Blu-ray player into a new Lexicon chassis and was selling it for $3500. AV Rant broke the story first on its home theater podcast with some pics of the two players' internals. Audioholics.com then posted a full suite of pics and tested the players with an Audio Precision analyzer. Both showed identical analogue audio performance and both failed a couple of basic THX specifications. Audioholics also posted commentary from THX on the matter and noted that both companies appear to be in a mad scramble to hide the fact that the player was ever deemed THX certified."
  • by TheGratefulNet (143330) on Saturday January 16, 2010 @06:13AM (#30788972)

    btw, I am NOT defending the price on this! just the fact that there IS something to the denon cable that most people are not seeing and don't even know about (the ethernet thing with diff length pairs inside).

    it should be priced MUCH lower, of course. but still, the fact is that i2s does require exact length wires on all the links between the spdif receiver chip and the dac chip (which is what i2s is all about, really; its not even an external interconnect but intended entirely for use INSIDE cd players, dat players, dvd players, etc).

  • Dear Lexicon (Score:1, Interesting)

    by Anonymous Coward on Saturday January 16, 2010 @06:21AM (#30789006)

    Oh please don't ruin your company. I know times are bad; if anything, just sit tight like the rest of us and keep what you have. Why not venture out into re-issuing older vintage models with enhancements for modern times? Or repairing the ones that still exist? It seems much more valuable to the community to actually make, and possibly service, your own product instead of doing case artwork for a doomed product and then getting punk'd by the THX fail that doomed it.

  • Re:Credibility. (Score:5, Interesting)

    by Trepidity (597) <delirium-slashdot AT hackish DOT org> on Saturday January 16, 2010 @06:26AM (#30789018)

    Sadly it's been a years-long downwards slide with THX. They used to certify only high-end theatres, then added high-end home theatre setups, then the standards for commercial theatres slowly started slipping until basically everyone who wasn't showing films in a tin can got certified, then they started certifying middle-of-the-road home theatre setups, then individual pieces of home-theatre hardware, and recently even some decent but not exactly world-class Logitech computer speakers.

  • by Anonymous Coward on Saturday January 16, 2010 @06:26AM (#30789022)

    When I was working for a Bang & Olufsen dealer, we had the case of a broken TV we had to pick up from a client and fix. The TV in question was a rebadged Panasonic in a nice B&O frame. We repaired the TV in the workshop and tested it. After that we put it back in its B&O frame and returned it to the customer, only to find it wasn't working. Why? One of us had managed to accidentally press the original Panasonic power button while putting it back in the B&O frame. Try explaining that to a customer.

  • by gowen (141411) <gwowen@gmail.com> on Saturday January 16, 2010 @06:34AM (#30789046) Homepage Journal

    Please do the calculation and tell us what the difference in transit times is for, say, 40m of cable.
    Clue: do you actually believe that a band whose musicians use different length guitar/mic cables cannot possibly play in time?

  • by phoenix321 (734987) * on Saturday January 16, 2010 @07:05AM (#30789218)

    I always thought that was the reason why they used digital cabling in the first place: to get a perfectly lossless transfer and have CRCs to prove it.

    "Common off-the-shelf" Ethernet parts now have an uncorrectable bit error rate below 10^-10 or so, which means a cheap "small-office/home-office" Netgear or D-Link setup will have one bit off every 10 seconds when continuously blasting at full 1Gbps.

    One bit off, every ten seconds under maximum transfer speeds.

    Software and protocols handle that on the receiving/sending units, that's why we don't have noticeable data corruption at all when transferring endless amounts of data across small home networks. That is two units connected by less than 30 USD worth of networking equipment.

    With higher-level equipment, medium- and large-company-grade material, this bit error rate is down to 10^-15. That is 1 uncorrectable bit error every 11 *days*(!) while continuously operating at maximum capacity, with 150 USD worth of networking equipment, of which certainly less than 10 USD is for the cable alone.

    If the Denon link protocol cannot handle 1 single bit error every 10 seconds: shame on them.
    If the components Denon uses for their Link interface have a higher bit error rate than enterprise-level network switches: shame on them.
    If these components actually perform worse than cheap commodity SOHO parts: feces will be hitting the fan.

    I don't know if audiophile humans can even detect a single bit error every ten seconds. I don't know if that would warrant spending that amount of money even if they actually did.

    But I certainly know that digital high-end equipment must outperform the cheapest commodity hardware, and I also know that software and protocols on either side of the link must provide for and correct single bit errors. The resulting data stream at the application level must have an error rate much lower than 10^-15 using regular cheap Cat5e cabling. That is about 1 bit error every 28 hours at 10Gbps, and I know for a fact that enterprise-grade storage systems have much less than that, or we would have corruption on all our filesystems within the blink of an eye.

    If a resulting bit error rate of less than 10^-15 is still producing visible and audible artifacts, someone made a big mistake in unit or software design or manufacturing.

    If a bit error rate of less than 10^-15 or even 10^-20 is desired, more shielding is needed. We're talking about centimeters of lead here, since the remaining bit errors are caused by cosmic radiation, not only in the wire, but in the entire circuitry of the connected units.
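The bit-error-rate arithmetic in this comment can be sanity-checked with a few lines of Python. The BER and link-rate figures are the commenter's assumptions, not measured values, and the model assumes errors are uniformly distributed over a link running flat-out:

```python
# Mean time between bit errors for a link at a given BER and line rate.
# BER and rate figures below are taken from the comment, not measured.

def seconds_between_errors(ber: float, bits_per_second: float) -> float:
    """Expected seconds between bit errors, assuming uniform error distribution."""
    return 1.0 / (ber * bits_per_second)

gigabit = 1e9  # 1 Gbps line rate

# SOHO-grade gear at BER 1e-10: roughly one bad bit every 10 seconds
print(seconds_between_errors(1e-10, gigabit))        # ≈ 10.0 s

# Enterprise-grade gear at BER 1e-15: roughly 11.6 days per bad bit
days = seconds_between_errors(1e-15, gigabit) / 86400
print(round(days, 1))                                 # ≈ 11.6 days
```

Both numbers match the figures quoted in the comment above.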

  • by Anonymous Coward on Saturday January 16, 2010 @07:08AM (#30789226)

    Thank you for a good laugh. I love hearing audiophiles brag about wasting money on cabling and talk like they are experts on subjects they obviously know nothing about. I have no idea how sensitive the ear is to clock jitter, but I can tell you that the only clock that matters is the one driving the audio DAC, and it is not the same one that is used for the Ethernet connection.

  • by TheGratefulNet (143330) on Saturday January 16, 2010 @07:10AM (#30789232)

    I don't own the denon and so I can't say for sure.

    I actually build my own spdif hardware and audio dacs (my audio gear is all DIY stuff). and I do use i2s as an 'interconnect' between spdif receivers and the dac chips (when we build dacs, we take great care to lay out the pcb traces to ENSURE that the i2s lines are exact(!) lengths. it's just proper engineering.)

  • by TheGratefulNet (143330) on Saturday January 16, 2010 @07:15AM (#30789254)

    Even if we assume that's true, digital error correction will correct any clock skew, so it does not matter if the wires are unequal.

    ahem.

    spdif is realtime, NO ACKS and no retries.

    care to rethink your 'solution' ?

    there is no digital error correction that fixes clock skew. error correction is one thing and clock dejittering is entirely separate. apples and oranges.

    we're not talking about wrapping digital audio in tcp/ip, here. pure spdif does NOT do ack'ing or retries or any classic datacomm things that you're thinking of.

  • by TheGratefulNet (143330) on Saturday January 16, 2010 @07:29AM (#30789308)

    any engineer would:

    - ignore the high cost profit-filled item
    - realize they had a good idea
    - make one himself from cat3 wire (perhaps; since that DOES have equal length wires inside the bundle)

    that's what a real engineer would do. not just use a wire that is out of spec but build one for almost no cost that IS in spec.

    why not? it's almost a no-cost item to DIY. if you can have exact length wires for pennies via a DIY, why NOT do it right?

  • by dangitman (862676) on Saturday January 16, 2010 @07:44AM (#30789366)

    One of the sites linked by this story in turn linked to a glowing review of this Blu-ray player by another site that praised its superiority [hometheaterreview.com] over the very Oppo unit it is "based" on.

    With my interest piqued, I browsed a little more on this site, and found a review for an HD projector that sounded weirdly similar [hometheaterreview.com] in that it appears to be a JVC projector that has been repackaged and rebadged at a higher price, and got a similarly glowing review. Without any real technical scrutiny, of course. I wonder how many more products are out there of a similarly repackaged and fraudulent nature.

  • by Opportunist (166417) on Saturday January 16, 2010 @07:56AM (#30789410)

    Given that the maximum cable length under best conditions (I'm not even accounting for cable twisting here) is about 100m, at 0.5c the delay between sender and receiver is about 6.7*10^-7 seconds, roughly 667 nanoseconds. That's the time it takes your computer to execute a couple of thousand instructions. Considering your reflexes take hundreds of thousands of times longer, I would be amazed if you can hear THAT.
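The transit-time figure above works out as follows in a quick Python sketch. The 0.5c velocity factor is the commenter's assumption; real cable velocity factors are typically around 0.6 to 0.7c:

```python
# Signal transit time through a cable, assuming a 0.5c velocity factor
# (the figure used in the comment above; real cables vary).

C = 299_792_458.0  # speed of light in vacuum, m/s

def transit_time(length_m: float, velocity_factor: float = 0.5) -> float:
    """Seconds for a signal to traverse length_m of cable."""
    return length_m / (velocity_factor * C)

# 100 m of cable at 0.5c:
t = transit_time(100.0)
print(t)  # ≈ 6.67e-7 s, i.e. about 667 ns
```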

  • by Opportunist (166417) on Saturday January 16, 2010 @07:58AM (#30789418)

    Still, at the end of the day (or the wire, rather) you get a signal that is composed of 0s and 1s. There is no "in between", there is no "bit rot" in the medium. That might have been real for analog transfer, when it mattered that the signal was transferred verbatim. It can NOT be transferred any other way today. It is EITHER 0 OR 1. There is no in between.

  • by Anonymous Coward on Saturday January 16, 2010 @08:29AM (#30789566)

    Imagine a company that would take a few hundred bucks worth of regular PC parts, add a slightly modified free open-source OS, package the thing in a white shiny box and sell it for a few thousand bucks... What a scam it would be!

    I take it the reference is to OS X. I hate to disagree, but it's hardly "slightly modified". It's also a prime example of the difference between Mac and Windows: I bought two machines at the same time, one running OS X Leopard and one Vista Pro. The Vista machine was higher-end than the Mac. After six months the Mac still works perfectly and the PC is bricked. I finally broke down and ordered Windows 7 Ultimate, and I'll have to redo the machine when it comes in. The fun part is reinstalling all the software.

    Just in the last three years I've owned four different Macs and twice that many PCs; I do graphics work. Other than a hard drive dying on my first iMac after two years, the Macs have been rock solid, while virtually every PC has required constant fiddling just to keep it working. Eventually all the PCs developed major problems and required reinstalling the OS. I never once had to do this with a Mac. If there's a scam here it's with Windows, not Macs. I'm stuck with them because some of my software and hardware is Windows-only, but if I could live without them I would in a heartbeat.

    Back in the day, redoing the OS wasn't that big a deal, but now the installs are so large and the software has so much security that it's a nightmare every time I have to start over. I hope Windows 7 sorts out the mess, but either way I hate all the changes. XP was serviceable and fairly easy to use. Vista was a mess: they ruined a perfectly good filing system (far better than the Mac's) and somehow managed to turn off by default all the useful stuff and turn on all the annoying and useless stuff. Add in all the instability and insecurity, and the only things keeping the average user on Windows are that they haven't tried a Mac, can't afford one, or are unwilling to pay the extra. The software limitations don't really affect 90% of the users out there, except gamers.

    The newer iMacs are pretty good, and the two Mac Pros I've owned were excellent. My first iMac gradually died (first the SuperDrive and eventually the hard drive), but the OS never missed a beat. The newer iMacs seem to be much better quality and better all-round machines. Also, with the last two machines I bought, the iMac took minutes to set up; it was over an hour before I could use the PC running Vista. I know we are all supposed to make fun of Macs, but honestly, if you just want a machine that works I'd go with a Mac every time. All my PCs are miserable media players; with the Macs you just pop in a DVD and it works every time.

  • by Taimoor (891521) on Saturday January 16, 2010 @08:43AM (#30789652)

    I'm laughing my ass off. You don't seriously think the jitter caused by that minuscule difference in cable length will fool with anything designed to use twisted pair as an interconnect, do you?

    We're not talking about memory busses running at several GHz, we're talking about relatively low-bandwidth interconnects between devices. And this is assuming that you're not encapsulating everything and just using ethernet signaling like everyone else in the pro audio world does.

  • by SchlimpyChicken (1250578) on Saturday January 16, 2010 @08:45AM (#30789660)
    There are a ton. In particular, JVC's DLA-RS2 projector got rebadged by a ton of companies (Audioholics also exposed the Meridian MF10 as a rebadged JVC [audioholics.com]), all of whom insisted that they made "dramatic" improvements to the picture quality. The problem is - reference is reference, and black is black. The system can only get so black, and a $350 calibration can bring the JVC DLA-RS2 to near-perfection. Happens all over the industry. Lexicon actually has a history of doing this, but this time they got caught in a more blatant example - and pulled THX down with them in the process.
  • by Anonymous Coward on Saturday January 16, 2010 @08:55AM (#30789712)

    I think a sense of proportion is needed when it comes to different cable lengths. It's worth finding out how far an electric signal will travel in one tick of the signal clock.

    Speed of light in a vacuum (according to Google) is 299,792,458 m/s and the speed of propagation of an electrical signal (http://en.wikipedia.org/wiki/Speed_of_electricity) is about 66% of that in coax.

    Let's assume the 'speed of electricity' in our Cat5 cable is 50% of the speed of light, say 150,000,000 m/s.

    Looking at i2s clock speeds (http://en.wikipedia.org/wiki/I2S) the bit clock that marks each bit ticks at "sample rate" * "number of channels" * "number of bits per sample". The single word select line suggests only 2 channels for I2S.

    At a typical (from the Wikipedia entry) bit clock speed of 2.8224MHz our electrical signal will travel 150,000,000 / 2,822,400 metres, which is about 53m.

    So a 53m cable length difference would result in a clock skew of 1 full tick cycle.

    I doubt there's that much of a length difference between the different pairs...
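The skew arithmetic in this comment can be written out as a short Python sketch. The bit-clock frequency (44.1 kHz sample rate, 2 channels, 32 bits per frame) and the 50% velocity factor are the figures assumed in the comment, not measurements:

```python
# How far a signal travels in one I2S bit-clock period, using the
# comment's assumed figures (not measured values).

signal_speed = 150_000_000          # m/s, i.e. 0.5c as assumed above
bit_clock = 44_100 * 2 * 32         # Hz: sample rate * channels * bits = 2,822,400

# Distance the signal covers in one bit-clock tick; a pair-length
# mismatch of this size would be needed for a full cycle of skew.
metres_per_tick = signal_speed / bit_clock
print(round(metres_per_tick, 1))    # ≈ 53.1 m
```

Since the twist-rate difference between pairs inside a Cat5 cable amounts to centimetres at most over typical runs, the resulting skew is a tiny fraction of a bit period.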

  • by SchlimpyChicken (1250578) on Saturday January 16, 2010 @08:56AM (#30789714)

    I have been told (directly, not third party) by one of the highest authorities at Denon Electronics that their cable is a shielded Cat5e cable... They only made it to satisfy custom installers who wanted something ridiculous to sell clients who had more money than sense. Off the record of course...

    In this case Denon aren't bad guys, they just aren't stupid. They had enough requests and knew these guys would simply go elsewhere to get what they wanted (another product they could sell people who, if they dropped a $100 bill on the ground, would think it a waste of time to stoop over and pick it up).

    In this case, the people at fault are the installers who can't seem to charge for their time and instead want to cultivate an industry where their services are "free" and everything is paid through them buying products at cost and selling them at retail to clients. The really big installers know how to run a business, but the middle and lower tiers are largely fueling customer ignorance of the value of their services.

  • by PopeRatzo (965947) * on Saturday January 16, 2010 @09:09AM (#30789766) Homepage Journal

    I actually build my own spdif hardware and audio dacs (my audio gear is all DIY stuff).

    I'm impressed, but you must admit, at least, that makes you a little "eccentric" (in the best definition of the word).

    I'm curious: can you hear the difference when using unequal cable lengths?

    I've known "audiophiles" to claim some pretty wild stuff, so I have to ask.

      [Note: I produce recorded music for a living now, so I have a professional interest]

  • Re:Credibility. (Score:3, Interesting)

    by DarkOx (621550) on Saturday January 16, 2010 @09:56AM (#30789972) Journal

    True, but your typical $500 - $1000 receiver from the late 2000s can probably produce output that is every bit as clean and generally good as a $2500 unit from the mid-'90s or before.

    Better DACs that use more bits, and hardware that internally uses higher sampling rates, have become cheap. The noise floor is lower on modern equipment too, as chip manufacturing (even for analog chips like op-amps) has improved greatly; much lower distortion. DSPs have gotten cheap as well; in even a modest setup these days the internals are fully digital. Traditional noise is a much smaller factor if the first thing your receiver does with any analog input is use some well-isolated circuits to digitize it. All the processing, effects, surround decoding, etc. gets done digitally. In the case of a digital source like HDMI or S/PDIF, it's not analog until right before the final amp that drives your speakers.

    Not having any line level analog transmission is probably the biggest factor in the improvements. I am not sure THX has really gone down hill so much as typical kit is just way better than it used to be.

  • by Anonymous Coward on Saturday January 16, 2010 @11:20AM (#30790444)

    A few years ago I worked for a famous chip company with only one real competitor. When they came out with a chip that was smaller, faster, and used less juice than ours, we were, ahem, green with envy.

    And we raised our prices.

    The marketing VP explained to us at a meeting that people will perceive our chips as being better, even when they know the facts prove otherwise, because if it "costs more it must be better."

    I'd like to point out that most "audiophiles" are usually scrounging vintage gear at Goodwill, and pretty much tweak their analogue gear with rubber bands and safety pins or whatever works. It's the guys with too much money who are buying the alleged high-end gear.

  • by kilodelta (843627) on Saturday January 16, 2010 @12:26PM (#30790866) Homepage
    And in my office we had a combination of Macintosh LC IIs, LC IIIs, and LC IVs.

    In one office someone got upgraded to an LC IV which made their office mate jealous because she only had an LC II. We didn't have the money to buy more of the newer machines but since they all used the same case/covers I had an idea. I took the cover from my LC IV and swapped it with her LC II cover. She was so happy with her new machine.
  • by BetterSense (1398915) on Saturday January 16, 2010 @12:32PM (#30790912)
    I never argue with audiophiles that they can't hear a difference. I even agree that their magic rocks, etc. can make their system sound different. I insist, however, that there is no change whatsoever in the output of the system. Just as you said, some products can make a stereo sound different without changing the output of the stereo system whatsoever. The listening happens in the brain. If a magic rock makes their system sound better to them, well, is it really a waste of money?
  • by AmiMoJo (196126) <mojoNO@SPAMworld3.net> on Saturday January 16, 2010 @01:24PM (#30791378) Homepage

    It's like going to a fancy restaurant. The food might taste the same as, or even worse than, at a moderately priced one, but you got dressed up and paid a lot of money for it, so you feel like the whole thing is a bit higher "quality". Quality being an entirely subjective measurement.

  • "High end" computers (Score:3, Interesting)

    by Animats (122034) on Saturday January 16, 2010 @02:10PM (#30791790) Homepage

    It's amusing that we don't have "high end" computers for multimedia use. Features might include:

    • No firmware runs in System Management Mode, stealing cycles from the main CPU.
    • No paging.
    • Operating system is tested and certified for interrupt response under 1us, 100% of the time. (Hard real-time operating systems like QNX can do this. Linux and Windows still have excessive interrupt lockout times; I think Linux is now below 1ms if you don't have any crappy drivers installed, but 1us is a long way off.)
    • Support for "sporadic scheduling", where the OS guarantees, say, 20% of the CPU every 1ms to a task that requests it. This allows playing multimedia with no breaks while doing something else. If you try to open too many multimedia windows, the scheduling request is rejected, because you're out of capacity.
    • Disk access prioritization, so that CPU priority affects disk access priority. (QNX has this).
    • All solid state disks.

    These are the kind of specs you see in hard real time systems that have to run both time-critical and non-time-critical code. "Multimedia PCs" ought to have specs like that, but they don't. So you still get pausing and stuttering if something else interferes with playback.

    A typical test in the real time world is to hook up a square wave generator to an input pin and a digital oscilloscope to an output pin. You then run a program which is waiting for interrupts triggered by the input pin, and when the user process triggered by the interrupt gets control, it turns on the output pin. You load up the CPU with other, lower-priority tasks. You watch the results on a storage 'scope, timing the time from input to output. You expect all the spikes to be below the promised time threshold. If there are any outliers, users get annoyed, file bug reports, and it gets fixed. This is how you get rid of "jitter" at the OS level.
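The pin-to-pin scope test described above needs real hardware, but a rough userspace analogue can be sketched in Python: ask the OS to wake a process every millisecond and record how late each wakeup actually is. This is a crude approximation for illustration, not a substitute for the hardware test, and on a stock desktop OS the outliers it reveals are exactly the kind that cause multimedia stutter:

```python
# Crude scheduler-jitter probe: request a wakeup every `period_s`
# seconds and log the lateness of each actual wakeup. On a hard
# real-time OS the worst case is bounded; on a desktop OS it isn't.

import time

def measure_wakeup_jitter(period_s: float = 0.001, samples: int = 200):
    """Return (worst, mean) wakeup lateness in seconds."""
    lateness = []
    deadline = time.perf_counter() + period_s
    for _ in range(samples):
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)          # may oversleep; that's the jitter
        lateness.append(time.perf_counter() - deadline)
        deadline += period_s
    return max(lateness), sum(lateness) / len(lateness)

worst, mean = measure_wakeup_jitter()
print(f"worst lateness: {worst * 1e6:.0f} us, mean: {mean * 1e6:.0f} us")
```

The hardware version replaces `time.sleep` with an interrupt from the input pin and the lateness log with a storage 'scope, but the pass/fail logic is the same: every sample must land under the promised threshold.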

"But this one goes to eleven." -- Nigel Tufnel
