
Clash of the Titans Over USB 3.0 Specification Process

Posted by timothy
from the so-you'd-cut-this-giant-electronic-baby-in-half dept.
Ian Lamont writes "Nvidia and other chip designers are accusing Intel of 'illegally restraining trade' in a dispute over the USB 3.0 specification. The dispute has prompted Nvidia, AMD, Via, and SiS to establish a rival standard for the USB 3.0 host controller. An Intel spokesman denies the company is withholding the USB specification, or that USB 3.0 'borrows technology heavily' from the PCI Special Interest Group. He does, however, say that Intel won't release an unfinished Intel host controller spec until it's ready, as it would lead to incompatible hardware."
  • 1394 For Life (Score:5, Insightful)

    by vertigoCiel (1070374) on Sunday June 15, 2008 @11:53PM (#23805955)
    Ever the more reason to never give up Firewire until they pry it from my cold, dead fingers.
    • Re:1394 For Life (Score:4, Insightful)

      by mrbluze (1034940) on Sunday June 15, 2008 @11:56PM (#23805967) Journal

      Ever the more reason to never give up Firewire until they pry it from my cold, dead fingers.
      But why does everything with firewire have to cost an extra $30 or so?
      • by Anonymous Coward on Sunday June 15, 2008 @11:59PM (#23805991)
        Because it was designed by Apple.
        • Re: (Score:2, Insightful)

          by armanox (826486)
          Nice try at humor. Also sold as Sony i.LINK, it costs more because it needs a dedicated FireWire controller.
        • Re:1394 For Life (Score:4, Informative)

          by armanox (826486) <asherewindknight@yahoo.com> on Monday June 16, 2008 @01:36AM (#23806549) Homepage Journal
          Also, the royalties are not in effect any longer...
      • Re: (Score:2, Informative)

        by theshibboleth (968645)
        Well Firewire is faster than USB, so people are willing to pay more for it. Plus it doesn't have quite as wide adoption as USB, so manufacturers don't make as many Firewire devices, which limits the supply.
      • Re: (Score:2, Insightful)

        by Macrat (638047)
        USB = cheap crap

        1394 = quality technology
      • Re:1394 For Life (Score:5, Informative)

        by outZider (165286) on Monday June 16, 2008 @01:16AM (#23806433) Homepage
        FireWire requires an actual IO controller, where USB 2 relies on the CPU and the driver.

        In short -- FireWire is faster and requires far less load on the target machine. The downside is the initial cost is higher. I find it pays for itself pretty quick.
        • by EmbeddedJanitor (597831) on Monday June 16, 2008 @04:31AM (#23807445)
          Firewire might pay for itself in high-speed applications where time == money, but it is severe overkill (and too costly) for many lower-speed applications such as mice and keyboards. USB is king of the low-speed domain because of low cost: a USB-capable microcontroller only costs a couple of bucks, and a sub-dollar micro can do a low-speed bit-banged implementation of USB. Adding USB to peripherals is almost free.
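As a rough illustration of why a tiny micro can bit-bang low-speed USB: the wire coding is just bit stuffing followed by NRZI, simple enough for a firmware loop. Below is a hypothetical Python sketch of the encoding step only (real implementations are tightly-timed C or assembly on the microcontroller, and this ignores sync patterns, CRCs, and timing):

```python
def usb_encode(bits):
    """Bit-stuff a logical bitstream, then NRZI-encode it, USB-style.

    Bit stuffing: after six consecutive 1s, a 0 is forced in so the
    receiver never loses clock sync. NRZI: a logical 1 holds the line
    level, a logical 0 toggles it. The line is assumed to start at
    level 1 (idle J state in this toy model).
    """
    # Stuffing pass: insert a 0 after every run of six 1s
    stuffed, run = [], 0
    for b in bits:
        stuffed.append(b)
        if b == 1:
            run += 1
            if run == 6:
                stuffed.append(0)
                run = 0
        else:
            run = 0
    # NRZI pass: 0 toggles the line level, 1 holds it
    level, out = 1, []
    for b in stuffed:
        if b == 0:
            level ^= 1
        out.append(level)
    return out
```

Seven 1s in a row come out as eight line symbols, because the stuffed 0 forces a transition the receiver can recover the clock from.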
      • Probably because they are generally designed better in other areas, too. My aluminium Firewire/USB2/eSATA drive enclosure was more expensive than my cheap, plastic USB2 enclosure.
        The market for people who want to buy Firewire is probably closer to the market that also want to pay a bit extra for quality. That's also partly why Firewire isn't going away anytime soon, at least on the Mac.
      • Re: (Score:2, Redundant)

        by petermgreen (876956)
        several reasons
        1: at least early in FireWire's life there were some fairly significant licensing fees; I don't know if that is still the case.
        2: FireWire is intrinsically a higher-spec and more expensive interface. A good example of this is power provision: FireWire can carry much more, but the higher voltage makes using it more awkward for devices.
        3: FireWire has become something of a niche product, and the more niche a product is, the fewer people the upfront costs are spread over.
        4: in the case of motherboard suppor
    • Re: (Score:3, Funny)

      by T3Tech (1306739)
      Viva IEEE 1284 FTW
  • So... (Score:5, Interesting)

    by Darkness404 (1287218) on Sunday June 15, 2008 @11:54PM (#23805963)
    So will this mean in the end we will have 2 competing USB standards? USB-Intel and USB-AMD? I can only hope that one will get picked over the other before it appears in most products because after the whole HD-DVD and Blu-Ray thing it would be an absolute pain to get a computer with USB-Intel in it when all the products will be USB-AMD.
    • Re: (Score:3, Insightful)

      At least with a computer you could just install a $20 PCI card, little bit harder with a DVD player.
    • Re: (Score:2, Informative)

      by mysidia (191772)

      I think we can be fairly confident if there were USB-AMD and USB-Intel, that:

      All other things being equal (no major bugs in one of the specs), USB-Intel would be the clear winner if the two standards came out about the same time, due to Intel's influence, name recognition, prestige, etc. The 5000 pound gorilla flattens the 200 pound monkey with 1 step.

      USB-AMD could win, but only if it came out far enough in advance for products to start being designed using it.

      There's a limited market for devices

      • Re: (Score:2, Insightful)

        by Anonymous Coward
        I'm sorry but I disagree with just about everything you said:

        - all things being equal, USB-Intel would lose; look at the companies opposing it: you have AMD, Intel's biggest rival in chipsets; you have nVidia, the biggest gfx company; you have VIA and SiS, who handle pretty much every other chip in your computer.

        In short, every chip in your computer except your intel chip would be specced to the disputing standard, what would Intel do to counter that? Personally try to take over the gfx market, the VIA mar
        • by Bert64 (520050)
          Intel already dominate the gfx market, they just don't offer any high-end products (enthusiasts/gamers/high-end)... A huge number of low-end systems (i.e. cheaper and greater volume) use Intel onboard video chips, and these systems are heavily used in offices around the world, and by home users who aren't interested in heavy video processing or gaming. The small cheap laptops which are increasingly popular these days tend to use Intel video too.

          Not sure what this has to do with Linux, AMD are quite good at s
      • Re:So... (Score:5, Insightful)

        by Gnavpot (708731) on Monday June 16, 2008 @01:23AM (#23806489)

        All other things being equal (no major bugs in one of the specs), USB-Intel would be the clear winner if the two standards came out about the same time, due to Intel's influence, name recognition, prestige, etc. The 5000 pound gorilla flattens the 200 pound monkey with 1 step.

        Oh, you mean like Intel won over AMD with their attempt at a 64 bit processor instruction set?

        (In case you don't know: they absolutely did not. Intel had to scrap their 64-bit processor because nobody wanted it, and today's Intel 64-bit processors use AMD's instruction set.)
        • by mlts (1038732) *
          The mistake Intel made was doing such a big jump with their Itanium line, with no 32 bit x86 compatibility. AMD extended the 32 bit x86 with 64 bit instructions, so people could continue running 32 bit code without issue.

          IMHO, the Itanium architecture is far better than AMD64, with 128 registers for integers and 128 registers for floating point, but because it couldn't run 32 bit x86 code natively, it has not obtained much marketshare other than for enterprise servers, where x86 compatibility does
          • by Bert64 (520050)
            Itanium may be superior to AMD64...
            Alpha, MIPS, HPPA and Sparc are superior to x86/amd64 too, but the problem is that people want to run closed source commercial software, which brings up a number of factors:

            Existing apps won't run, users need to replace their apps, commercial vendors will typically charge for the new version.

            A new architecture has no users yet, and thus no potential market for commercial vendors. (chicken)
            A new architecture has no commercial software vendors yet, and thus it will not gain
    • Once again, we'll have the VHS version and the Betamax version.

      One will win. Avoid whichever one Sony gets behind.

      • by sznupi (719324) on Monday June 16, 2008 @12:31AM (#23806189) Homepage
        Sooo...you're still waiting for HD-DVD to win?
        • Until DVD is considered dead or dying, in the same way VHS is now considered dead, they have BOTH failed.
        • by Miseph (979059)
          The exception that proves the rule. Sony got behind the one that was worse for the consumer and finally won one.

          I think Sony must have sent some of their upper management to the Microsoft/Big Oil/Ma Bell School of Ungodly Profit... conventional wisdom says that pissing off paying customers and then charging them extra for the "privilege" will lead to failure, but the SoUP teaches that such a tactic will, in fact, lead to otherwise impossible success.

          Of course, if they want their doctorates in screwing peopl
        • by symbolset (646467) on Monday June 16, 2008 @01:25AM (#23806501) Journal

          Sooo...you're still waiting for HD-DVD to win?

          This one's not over yet. Apparently online distribution was a third contender waiting in the wings. We shall see. Sony bought out HD-DVD. They can't buy out online distribution. In the meantime BD players and discs have gone up in price not down. That was a critical mistake.

          Sony has some of the most brilliant engineers on earth. They're chained to the marketing team from hell. They always try to exploit their market share before it's time. A shame, really. They do a host of other things wrong too. If it weren't so, their supercomputer-class gaming console [wired.com] would not be coming in third to the Xbox and the Wii. They could use a consultant to come in and tell them how misguided their marketing team is, but they have too much pride to win. Surely I'm not the only one who sees this.

      • by Fluffeh (1273756)
        My suggestion would be not to back what Microsoft backs on this one.

        *cough* Let's offer an HD-DVD add-on for the Xbox *cough*

        Sony seems to have done pretty damned well actually. *cough* PS3 will have a Blu-Ray DVD in the unit *cough*
        • by symbolset (646467)

          My suggestion would be not to back what Microsoft backs on this one.

          Regular slashdotters will know I'm not one to endorse Microsoft's stuff. The very notion is abhorrent.

          But even a stopped clock is right twice a day. This one's not over yet and Microsoft may still win this one with online distribution before market penetration of HD video is enough to lock the market.

          I could probably help Sony win this one. They won't listen to me. Their loss.

    • by Phong (38038) on Monday June 16, 2008 @12:47AM (#23806279)
      This isn't about competing USB 3 standards -- the spec is being designed by a group, and there is only one. This is about the design of the hardware used to implement a host controller that can comply with the spec. This is something that any company can develop if they want to, but since Intel is going to license their design of the host controller for free, most companies will just wait for that design and use it to implement USB 3.
    • by spotter (5662)
      no.

      currently, one has to write 1 driver for usb. no matter what chip is used, 1 driver should support it.

      In linux you'll see "uhci" and "ehci" modules.

      All this means is that one will have to write 2 drivers to support all usb 3 chips.

      It's a mountain out of a molehill.
      • by Tony Hoyle (11698)
        ...and don't forget ohci.
      • by Bert64 (520050)
        Actually..

        UHCI and OHCI for USB1...

        EHCI for USB2...

        AMDUSB3 and INTELUSB3 for USB3?

        But standards in hardware are good; the reliance on drivers to provide a compatible middle layer between hardware and software does nothing to help performance or ease of use.

        With standard hardware, we can...

        Make OS's easier to install (drivers for all standard hardware can easily be included, much less work for the OS authors).
        Make apps (games) that boot directly without a need for an OS, and derive maximum performance from t
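The driver-level point above can be sketched abstractly: the USB core talks to host controllers through one interface, so a second controller spec costs one extra driver, not new devices. The class and function names below are invented for illustration and are not the actual Linux interfaces:

```python
class HostControllerDriver:
    """What a USB core stack expects from any host-controller driver."""

    def submit(self, transfer: str) -> str:
        raise NotImplementedError


class UHCIDriver(HostControllerDriver):
    """Hypothetical driver for one controller design."""

    def submit(self, transfer: str) -> str:
        return f"uhci: scheduled {transfer}"


class EHCIDriver(HostControllerDriver):
    """Hypothetical driver for a rival controller design."""

    def submit(self, transfer: str) -> str:
        return f"ehci: scheduled {transfer}"


def usb_core_send(hcd: HostControllerDriver, transfer: str) -> str:
    # Device drivers talk to the USB core; the core talks to whichever
    # HCD matches the silicon. Devices and their drivers never see the
    # difference, which is why a second USB 3 controller spec means one
    # extra kernel driver rather than incompatible peripherals.
    return hcd.submit(transfer)
```

Swapping the `hcd` argument is the only change between machines with different controllers; everything above that call is untouched.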
    • Re:So... (Score:4, Informative)

      by Hal_Porter (817932) on Monday June 16, 2008 @02:01AM (#23806671)

      So will this mean in the end we will have 2 competing USB standards? USB-Intel and USB-AMD?
      I think this is about host controller specs, not wire protocols. So it will be like with USB 1.0, where there was OHCI and UHCI. The Universal Host Controller Interface was Intel and Via's controller standard, and OHCI was everyone else's, including Microsoft's. OHCI was supposed to do more in hardware, though I don't think it made much difference in practice. But both controllers were compatible on the wire - you could easily make devices that worked with both. IIRC there were cases where the OHCI controller, because it had more information about the protocol, could respond to information from a device inside the same frame. UHCI controllers were basically dumb and needed intervention from software on the host, so they'd respond to some device condition during the next frame, after the host stack had had a chance to think.

      But according to the USB spec both behaviours are correct since the device can't make any assumptions about what overheads exist on the host.

      I can't find the reference to device-visible differences between UHCI and OHCI, and in any case it was a very rare case. I did find this presentation by Intel that shows OHCI and UHCI performing almost identically, despite the fact that OHCI controllers basically do the USB protocol in hardware while UHCI is just a bus-master DMA engine attached to a serial interface, with the protocol done in software on the host.

      http://www.usb.org/developers/presentations/pres0598/bulkperf.ppt [usb.org]

      With USB 2.0 there was a push to a unified host controller spec called EHCI. From what I can tell, this spat means that there will possibly be two rival host controller specs, because Intel haven't published their spec in time for other people to implement it. But I don't think that will fork the wire protocol; I think it just means that OSs will need two new host controller drivers (as with USB 1.0) rather than one (as with USB 2.0).

      You could argue that UHCI was a good thing since it uses less hardware and performs about the same.

      Incidentally Wikipedia writes this up based on the "Good open standards vs vile proprietary standards" meme, which seems a bit unfair. Both OHCI and UHCI are based on published specifications which are freely available. I don't know if you need to pay a license fee to implement either or both of them - I actually think you don't since USB was successful because you didn't need to pay a per port fee when it was introduced, unlike Firewire.

      http://en.wikipedia.org/wiki/OHCI [wikipedia.org]

      The difference seems to me more like a software engineer's view of the world (Microsoft want to do it all in hardware, like OHCI) vs a hardware engineer's view of the world (Intel say do it all in software, with UHCI).
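The same-frame vs next-frame behaviour described above can be stated as a tiny model (hypothetical code, not drawn from either HCI spec):

```python
def response_frame(event_frame: int, protocol_in_hardware: bool) -> int:
    """Earliest 1 ms frame in which the host can answer a device condition.

    A controller that runs the protocol itself (OHCI-style) can react
    within the same frame; one that defers to host software (UHCI-style)
    answers no earlier than the following frame, after the host stack
    has had a chance to run. Per the USB spec both are legal, since a
    device may not assume anything about host-side overheads.
    """
    return event_frame if protocol_in_hardware else event_frame + 1
```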

      • by Bert64 (520050)
        Intel have been pushing to do more in software for a while, with software modems and the like, as it helps them sell faster CPUs...

        There are ups and downs to both sides... Doing things in hardware is great and performs better short term, but processing speed can quickly catch up and surpass the dedicated hardware if it's not also kept up to date...

        Use the Amiga as an example: when it came out, its dedicated video/sound hardware was great, and helped the Amiga massively outperform other systems using the sam
  • How is this article, published online by an employee of a company supported by Intel, not biased in its analysis of the situation?
    • How is this article, published online by an employee of a company supported by Intel, not biased in its analysis of the situation?

      The fine article doesn't have to be bias free. We'll cover every conceivable side of the issues in the slashdot comments, and much irrelevance also.

      My personal opinion: USB3.0 is cool, but give me external PCIe v2.0 x16 for the win. And Natalie Portman slathered in hot grits, of course.

      • by zappepcs (820751) on Monday June 16, 2008 @12:44AM (#23806257) Journal
        I agree... the battle is just heating up; how can you be biased? Not until there are two definitive sides can you get behind one or the other.

        This does point out one thing, there is a lot to be said for open standards ... even if some of them have been OOXML'd lately. (that's not even valid in Roman Numerals)

        No matter which version is better technically, if there is one that is not backwards compatible they will have an uphill slog trying to sell it. Yeah, I know, CDs were not backwards compatible with floppy drives, but this is a bit different. If the connector is the same, it MUST be compatible or my aunt nelly will kill someone.

        • Re: (Score:3, Interesting)

          The two sides I see here are not Specification A and Specification B, but producing an open standard versus not producing one.

          "there is a lot to be said for open standards"... Yes, something indeed. Who led the CD revolution? Sony. Who developed the standard? Sony (and Philips). They released the standard after they had working products to sell. The "standard" still then cost a lot of money to even look at. (See the Wikipedia article on the Red Book standard.)

          My Point (finally?): Giving the ex
          • by zappepcs (820751) on Monday June 16, 2008 @02:19AM (#23806769) Journal
            Interestingly (or not) you demonstrate a logical understanding of the technology marketplace. To paraphrase you, if I may: Intel and AMD are fighting about who gets to piss on the idea that competition creates value for the consumer. Any space where AMD and Intel are competing is full of this and, not inconsequentially, lawsuits. Intel has been partnered with MS for a long time, and they worked hard to be the hardware version of what MS was to software.

            We can detail the lawsuits ad nauseam, but my point is that anyone who was a close partner of MS has done to their industry what MS did to software. Like that or not, it is true. In the end, we have Mr Gates to thank for this, no matter how philanthropic he may try to be these days. I wonder sometimes how far exactly he has set the human race back from what will eventually, and necessarily, be.

            Though that is sort of scifi philosophy, it is true. In the name of riches, the advancement of technology has been slowed, deliberately, and with malicious intent against the betterment of mankind. In this way, I find his generosity a bit pale these days.

            Open standards are indeed the ONLY way to create technology and advancement that will last and actually advance mankind in a direction that betters all of us. Despite the socialist-sounding tone of that, it is true. We are all better for the sharing of technology from the space race. Technology, and specifically computing/networks, are still in the hands of those that would derail its benefits if there is profit in it. There are those who are trying to change this situation, but it is slow going. Even hardware manufacturers are hobbled by things like the DMCA and its ilk around the world. Sometimes I'm sad to say I'm American.

            Fighting against the 'right thing to do' for the sake of money is not in the best interests of the community, and in the end, it hurts your business. Customer is king, so they say, and when you put hurdles in the way of a complete and exemplary experience by the end user, you harm your business in some way, if not in big ways. It's unfortunate that not enough people will understand that the competitions in the technology markets have hurt them, and they will not understand how to express their frustration that older USB devices won't work with new USB hosts. It will be just one more black magic thing they don't understand about technology type things. They will go to PCs R Us and buy whatever the best they can get happens to be, hoping that it works for a couple of years, not unlike car buyers. So for profits, businesses promote the throw-away society. When there is something new, throw the old away, don't upgrade, don't re-use. How is this helpful to the human race?

            Well, just some late night thoughts about this whole thing, and the absolutely ignorant waste it makes of the world.

            BTW, there is hardware space competition.... if you are willing to build your own and not buy what the idiot^H^H^H^H^H salesman tells you at worstbuy.

            sigh
    • Re: (Score:2, Insightful)

      by Josue.Boyd (1007859)
      The fact that an employee of a company who is supported by Intel wrote this article does not make it biased. If it were written by an actual employee of Intel, or even Mr. Intel himself, that wouldn't by itself make the article biased. Is the article biased? Perhaps.
  • Bastard companies (Score:4, Insightful)

    by mark_hill97 (897586) <`moc.liamg' `ta' `swodahsforetsam'> on Monday June 16, 2008 @12:07AM (#23806039)
    As we have seen with wireless networking gear in the past, companies are all too eager to screw the consumer with incompatibilities because of pre-spec products being released. If Intel is doing this, I would say good for them; it's rare that a company would actively try to protect the consumer from these vultures.
  • by spinkham (56603) on Monday June 16, 2008 @12:11AM (#23806061)
    This is a replay of the OHCI/UHCI host controller interface standards of the original USB.
    This does NOT affect users at all, only driver writers.
    What is being forked is the USB driver interface, which does not affect device compatibility at all.
    As mentioned above, there were two driver interfaces for the original USB standard, and the only people who knew were driver writers and nerds compiling their own custom kernels.
    This is blown way out of proportion and doesn't affect 99.999% of us. Nothing to see here, move along....
    • Re: (Score:2, Insightful)

      by T3Tech (1306739)
      Unless users actually want to utilize the full capabilities of USB 3.0, which would require proper cabling [reghardware.co.uk]. Then it may affect a higher percentage when it comes time to blow up that bridge, but otherwise right now I think you're right.

      Though I'm sure Denon will be the first to come out with a super USB 3.0 optical cable for the bargain price of $750 as an upgrade to their $500 Ethernet cable [slashdot.org] which seems to have an issue with clearly transmitting the frequencies that dogs hear.
      So hopefully in a year or t
    • Re: (Score:3, Informative)

      by tjrw (22407)
      ... and people who ran into all sorts of nasty incompatibilities in the more scary corner-areas of the spec (isochronous transfers, etc.). Microsoft remember this fun, which is why they are not happy about this. I remember various issues with USB depending on whether you had an OHCI or UHCI controller.

      It is not in the interests of the consumer nor of the standard to have multiple host-controller interfaces. You may care to muse on why it might be in Intel's interests to the detriment of all others.
  • ... won't release ... unfinished ... until it's ready
    Hey! You can't just copy Microsoft like that!
  • by nick_davison (217681) on Monday June 16, 2008 @02:41AM (#23806877)
    Intel has a point: releasing documentation on a non-finalized standard creates a fluster-cluck of bad implementations that aren't necessarily compatible with each other. IIRC, isn't that what happened with 802.11n (pre-n, draft-n, n-ready, looks-a-bit-like-n-in-a-dress, MIMO, etc.), which just confuses the crap out of a consumer already pissed at the USB 2.0 Hi-Speed and Full-Speed labeling mess?

    nVidia has a point: Intel not telling anyone else until the last moment would, indeed, give Intel an unfair first mover advantage.

    Obvious solution: Release the pre and post release specs with an agreement attached that anyone wanting a copy has to sign. An amount of time that gives everyone a fair chance to get product ready is picked after final specs are chosen. Anyone gaining access to the specs agrees not to release until that time period has passed. Now no one releases incompatible hardware and no one gets an unfair first mover advantage.
  • by ILuvRamen (1026668) on Monday June 16, 2008 @02:54AM (#23806957)
    If any one of them was really smart and wanted to name it to win, they'd name it either blu-port, usblu, or usb 4.0. I mean seriously, which one are you going to use? One named USB 3.0 or 4.0?
  • Intel and some of their favourite customers are on the board of the USB Implementers Forum [usb.org].

    This may seem like the odds are stacked in Intel's favour, and I'm sure they think so too, by not allowing anyone else near the host controller spec. Despite this, I think that the other board members would fully realise that Intel is a minority against the combined force of AMD, Via, SiS and Nvidia in production of chipsets for desktop PCs.

    The USB-IF board knows the danger of losing control of the standard if it is f

  • ...it's the same HD-DVD vs. Blu-Ray / DVD-R vs. DVD+R shit, except now it's cables.

    The only people getting screwed are the customers.
