Clash of the Titans Over USB 3.0 Specification Process 269
Ian Lamont writes "Nvidia and other chip designers are accusing Intel of 'illegally restraining trade' in a dispute over the USB 3.0 specification. The dispute has prompted Nvidia, AMD, Via, and SiS to establish a rival standard for the USB 3.0 host controller. An Intel spokesman denies that the company is withholding the USB specification, or that USB 3.0 'borrows technology heavily' from the PCI Special Interest Group. He does, however, say that Intel won't release an unfinished Intel host controller spec until it's ready, as doing so would lead to incompatible hardware."
1394 For Life (Score:5, Insightful)
Re:1394 For Life (Score:4, Insightful)
Re:1394 For Life (Score:5, Funny)
Re: (Score:2, Insightful)
Re:1394 For Life (Score:5, Informative)
Firewire's main advantage now is that it is a point-to-point mechanism, not a bus. USB suffers because every so often the host must interrupt things to discover new devices, which can slow down large block transfers quite a bit.
Re:1394 For Life (Score:4, Informative)
The thing that does have a big impact is using 12 mbps or 1.5 mbps devices in a way that they hog the bus. Ideally, all non-high-speed transfers would be converted to 480 mbps.
You might imagine a motherboard with 10 USB ports could communicate with all 10 independently. But that is rarely the case; usually they all share the same bandwidth. You might expect there would be buffering for 12 and 1.5 mbps transfers, so they wouldn't hog the bus from the other 9 ports. That too is rarely the case.
USB 2.0 hubs do buffer and convert 12 and 1.5 mbps transfers to 480 mbps. Again, you might expect a 4-port hub to properly allow 4 slow devices to share. That is sometimes the case: better hubs are multi-TT (TT = transaction translator, basically the USB term for this buffer). But many hubs have only a single TT, which means only one downstream 12 mbps or 1.5 mbps device can talk at a time, and any others on that hub must wait until the single buffer is available.
If the USB 2.0 spec had required all hubs to include a TT on every downstream port, and had the "root hub" (on the motherboard which provides many ports with shared bandwidth) been required to implement TTs on every port, there would have been much higher levels of satisfaction with USB 2.0.
When Compaq, HP, Intel, Lucent, Microsoft, NEC and Philips wrote the USB 2.0 spec, they apparently believed 480 mbps speed would soon replace 12 mbps in most devices. Requiring many TTs probably seemed excessively costly just to support legacy devices that would soon become obsolete. What instead happened is that only certain devices requiring high speed implemented 480 mbps; almost all others stayed at 12 mbps. Most devices that implement 12 mbps use a 48 MHz clock internally, and many low-cost silicon processes really only support clocks up to about 60-100 MHz (especially if the process adds the extra polysilicon layers for implementing flash or EEPROM).
Let's hope they learn their lesson and require TTs in ALL cases where 480, 12 and 1.5 mbps devices could share the upstream bandwidth, especially on motherboards. If they do, USB 3.0 will probably be very nice, providing so much more shared bandwidth than necessary that hardly anybody will care if it's shared. But if they skimp and allow any sharing, anywhere, without TTs, the result will probably be a lot like USB 2.0: very fast, but sometimes you plug in another device and all of a sudden it sucks.
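The single-TT bottleneck described above can be sketched as a toy throughput model. This is a deliberate simplification of the real USB scheduler (it ignores frame scheduling and protocol overhead entirely); the numbers are just the nominal line rates from the spec:

```python
# Toy model (not the real USB scheduler): aggregate full-speed throughput
# behind a hub, with one shared TT vs. one TT per downstream port.
FULL_SPEED = 12.0    # mbps per full-speed device
HIGH_SPEED = 480.0   # mbps upstream high-speed budget

def aggregate_throughput(n_devices, n_tts):
    # Each TT buffers one full-speed stream at a time, so the hub sustains
    # at most n_tts concurrent 12 mbps streams, capped by the upstream link.
    concurrent = min(n_devices, n_tts)
    return min(concurrent * FULL_SPEED, HIGH_SPEED)

print(aggregate_throughput(4, 1))  # single-TT hub, 4 slow devices: 12.0
print(aggregate_throughput(4, 4))  # multi-TT hub, 4 slow devices: 48.0
```

Even in this crude model the multi-TT hub gets 4x the aggregate throughput from the same four devices, which is the whole complaint about single-TT hardware.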
Re:1394 For Life (Score:4, Informative)
Re: (Score:2, Interesting)
Re: (Score:2, Informative)
Re:1394 For Life (Score:5, Informative)
The entire royalty is something like $0.25 per device, Apple only gets a portion of that.
The cost is in the smarts, each device requires a more complicated controller and an additional chip.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re:1394 For Life (Score:5, Funny)
Re:1394 For Life (Score:5, Insightful)
Re: (Score:3, Insightful)
Look on the bright side: a few years from now you and your like will claim that Apple popularized USB3. If it weren't for Apple we would still be using low-speed Firewire, and so on. Great, isn't it?
Re:1394 For Life (Score:4, Informative)
Re:1394 For Life (Score:5, Funny)
Re:1394 For Life (Score:4, Interesting)
Think of a 6-port firewire hub. That's $6 just in royalties.
But I don't think they actually charged that much. Wasn't it more along the lines of $0.25 per device?
The biggest reason USB caught on so slowly with the x86 crowd was the lack of USB support in MS-Windows and other x86 OS's. In order to connect a USB device you had to install USB support, reboot, install the device drivers, reboot; sometimes there would be another driver to be loaded after the first one, so another install, reboot...
Also, some early USB equipped x86-mainboards didn't have USB support in BIOS, so you couldn't use a USB-keyboard to change BIOS-settings, enter Windows safe-mode, etc, etc.
USB was also slow as hell for most other uses than HID-devices or printers.
My first mp3 player would take more than an hour to fill. 6GB @ 12Mbps, the horror!
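That hour is not an exaggeration; a quick back-of-the-envelope check at the ideal line rate (ignoring USB protocol overhead, which makes the real transfer even slower, and assuming binary gigabytes):

```python
# 6 GB pushed over USB full speed (12 mbps), best case.
size_mbit = 6 * 8 * 1024         # 6 GB expressed as megabits: 49152
seconds = size_mbit / 12         # ideal 12 mbps line rate, zero overhead
print(seconds)                   # 4096.0 seconds
print(seconds / 3600)            # ~1.14 hours
```

So even a perfect full-speed link needs well over an hour for 6 GB, before any real-world overhead.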
When Apple put a port in their hardware, they usually already had the drivers ready, and they rely mostly on making their own hardware.
The iMac came with a USB keyboard and USB mouse made by Apple, and thus everyone who had an iMac used USB gear.
What has MS license cost or Microsoft's greed got to do with firewire royalty and Apple's greed?
They're not connected. Don't confuse subjects.
If anyone accuses Apple of greed, that doesn't mean that they think Apple is worse than Microsoft. Both are greedy. Microsoft more than Apple usually though.
Re: (Score:2)
That kind of thinking is why a company such as Microsoft couldn't put out a decent iPod clone even after a few years, despite anyone with half a clue being able to tell you what they were doing wrong.
Product design is more than just penny-pinching.
Re: (Score:2)
Re: (Score:3, Interesting)
Bite your tongue. I just spent my economic stimulus check on a new firewire audio interface for my digital audio workstation (on which I make part of my living).
If firewire "dies" companies like Avid, M-Audio, Prosonus, MOTU and many more are gonna have to go back to the drawing board.
USB (even 2.0) just isn't that great for moving a lot of digital audio. By comparison, Firewire (400 or 800) is a dream. If firewire goes,
USB2 is _not_ faster than firewire... (Score:4, Insightful)
USB2 is quoted as having 480Mbps throughput; however, as the grandparent points out, USB2 is not a fully-fledged I/O controller, just the PHY layer, with the host having to do all the heavy lifting.
The upshot is that when you actually use one bus or the other to, say, copy files, firewire at a mere 400Mbps trounces USB2 in throughput.
Yes USB3 is in the pipe with vastly improved on paper specs, but then again Firewire has 3200 and 6400 variants in the pipe as well.
Essentially USB should have been left as an interface for keyboards and mice, and firewire ought to have been adopted by Intel as the preferred bus for all high-throughput applications; it would also have been preferable to SATA.
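The "480 beats 400 on paper but loses in practice" point can be made concrete with a small bus-efficiency calculation. The sustained figures below are rough illustrative ballpark numbers I'm assuming for external disks of that era, not measurements from this thread:

```python
# Illustrative figures only (assumed, not measured here): signaling rate
# vs. typical sustained bulk-transfer throughput for external disks.
signaling = {"USB 2.0": 480, "FireWire 400": 400}   # mbps on the wire
sustained = {"USB 2.0": 240, "FireWire 400": 320}   # mbps, rough ballpark

for bus in signaling:
    efficiency = sustained[bus] / signaling[bus]
    print(f"{bus}: {efficiency:.0%} of signaling rate")
```

With numbers anywhere near these, the bus with the lower headline rate delivers more real throughput, because FireWire's controller handles the protocol while USB leans on the host.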
Re: (Score:3, Interesting)
Re:1394 For Life (Score:5, Informative)
Re:1394 For Life (Score:5, Informative)
Re: (Score:2, Insightful)
1394 = quality technology
Re:1394 For Life (Score:4, Insightful)
Why bother using firewire hacking when it is much simpler to do a hard reset and load a bootable CD?
*YMMV, See TrueCrypt for example.
Re: (Score:2)
Re: (Score:3, Informative)
Admit it, once you have access to the computer, it's game over. Unless you encrypt the hard drive. The whole thing. And your RAM as well. And use EFI. Encrypted...
Re: (Score:3, Funny)
Re:1394 For Life (Score:4, Funny)
Re: (Score:3, Funny)
Re: (Score:3, Funny)
Re: (Score:2)
Oh, and breaking BIOS passwords is trivial; virtually all of them have backdoors allowing you to bypass whatever password the user set.
As for not booting from CD, so what? Just move the existing HD to a different slot, and put your own HD in the primary slot so it gets booted instead... Same end result as booting from CD.
Re: (Score:2)
Does eSATA have this issue where one can plug in something into an external SATA port, then be able to dump the memory of a local computer to fish out encryption keys?
Of course, it's pretty much game over if the bad guys get physical access to the machine, but disabling IEEE 1394 will slow them down at least, forcing them to find another bus to hotplug onto for the RAM dump (PCI, PCI-e).
Re:1394 For Life (Score:5, Informative)
In short -- FireWire is faster and requires far less load on the target machine. The downside is the initial cost is higher. I find it pays for itself pretty quick.
Not quite true about the cost. (Score:5, Insightful)
Re: (Score:2)
The market for people who want to buy Firewire is probably closer to the market that also want to pay a bit extra for quality. That's also partly why Firewire isn't going away anytime soon, at least on the Mac.
Re: (Score:2, Redundant)
1: At least early in firewire's life there were some fairly significant licensing fees; dunno if that is still the case.
2: Firewire is intrinsically a higher-spec and more expensive interface. A good example of this is the power provision: firewire can carry much more, but the higher voltage makes using it more awkward for devices.
3: Firewire has become something of a niche product, and the more niche a product is, the fewer people the upfront costs are spread over.
4: in the case of motherboard suppor
Re: (Score:3, Funny)
Re:1394 For Life (Score:5, Interesting)
True, there is no HID standard for Firewire. But that's not its strength. Firewire's strength is USB's weakness, and Firewire's weakness is USB's strength.
Firewire seems to be fading into smaller niches, though. I don't want to daisy-chain hard drives, so eSATA will do fine, and eSATA does allow the use of port multipliers; one port still does five drives.
I have two HDV cameras, but I don't use them much, I prefer an HF10 which writes to SDHC cards. Firewire is good for audio tasks, which I don't do.
Re:1394 For Life (Score:5, Informative)
Re:1394 For Life (Score:5, Interesting)
It's not USB2 or SATA that cannibalized Firewire's supposed market... It's Ethernet.
Much better range, lower price, more devices, equally high speed, similar (controller) requirements, easier device sharing, etc.
High-end printers, scanners, CD/DVD duplicators, studio (audio/video) equipment, hard drive arrays, etc. They all have gigabit ethernet connectors now.
Ethernet ate the high-end, USB ate the low-end, Firewire got left out in the cold, with just a few niche applications where Ethernet is inconvenient and its benefits don't apply, and yet USB isn't quite fast/flexible enough. That basically means just digital camcorders, and a handful of studio equipment...
Re: (Score:3, Insightful)
Of course not. But be honest. How often is that a critical requirement? Like I said, Firewire has been relegated
Firewire m
Re: (Score:2)
You may not want to daisy-chain HDDs, but I do. For price reasons, I end up using USB for the task, but it's only just barely adequate for the task. Oh, USB does fine for copying a small number of files, wonderful for a thumb drive, but try transferring 100 GB at a time, and it chokes badly. Especially if you
Re: (Score:3, Interesting)
So...with USB 3 we have a case of extending USB into areas which it wasn't really meant to serve...and which already are served very well.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re: (Score:2)
That's exactly what we had with USB 2.0... and it wasn't exactly a flaming failure.
Re: (Score:2)
But it's not really better than what we had before it. Even in the better cases you have to use a stopwatch and watch CPU utilisation to notice any real difference (USB2 vs. Firewire HDDs).
But when it's not so good... oh boy. Once I tried to force USB connectivity on a digital camera that had both Firewire and USB (my advice: don't, just buy the damn Firewire card). And even when it comes to webcams, the one I consider "best ever" is on... Firewire. Which also would serve
Re:1394 For Life (Score:5, Insightful)
Also firewire IO is done on the card/chip, whereas USB is done to a large degree by the CPU. This is why we saw recent threads about the 'security risk' associated with jacking into the firewire port of a computer - you have direct access to system memory on most systems. Try a file copy with USB 2, and again with firewire, watch your processor. BIG difference. This is important when you are processing video, you can't have your video IO making your video processing lag and skip frames. That's one of the reasons firewire remains dominant on video.
The only aspect of this I find puzzling is the scarcity and cost of firewire flash drives. Kanguru makes them, but they cost 3-4x as much as comparable USB thumb drives. Best guess here is that thumb drives started their boom before most PCs had firewire ports, so vendors were just trying to hit the largest market, which lacked firewire, and now we're stuck with it.
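One rough way to see the host-CPU difference the parent describes is to time a copy and read the process's CPU time around it, then run it once with a disk on each bus. This is a sketch, not a rigorous benchmark (kernel work done outside this process is not counted), and the paths in the usage example are placeholders:

```python
import shutil
import time

def copy_with_cpu_cost(src, dst):
    # Returns (wall seconds, CPU seconds) for one file copy. A bus whose
    # controller offloads the protocol work should show less CPU per byte.
    t0, c0 = time.time(), time.process_time()
    shutil.copyfile(src, dst)
    wall = time.time() - t0
    cpu = time.process_time() - c0
    return wall, cpu
```

Hypothetical usage: `copy_with_cpu_cost('/media/usbdisk/big.bin', '/tmp/big.bin')` vs. the same file on a firewire disk; compare the CPU figures, not the wall times.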
Re: (Score:2)
Re: (Score:2)
I think for the kinds of tasks most people use key-chain-type flash drives for, USB is good enough. Perhaps it's because when people want the speed of Firewire, they're usually copying lots of data, possibly more than your average key-chain flash drive can hold. And for most people, compatibility/portability is a very important feature.
BTW. What is the new term for flash drives now that flash drives are starting to pop-up
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
Sorry for your BSOD, but that's not the device's problem; it has nothing to do with USB or Firewire. Linux and Macintosh do not have ANY issue with hot-swapping firewire.
And I work with firewire and USB storage many times every single day at work, so I believe I have a good sample to speak from.
I can say I've seen firewire damaged devices though... some of the cheap firewire port end cages are split st
Re: (Score:2)
So... (Score:5, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2, Insightful)
Re: (Score:2, Informative)
I think we can be fairly confident if there were USB-AMD and USB-Intel, that:
All other things being equal (no major bugs in one of the specs), USB-Intel would be the clear winner if the two standards came out about the same time, due to Intel's influence, name recognition, prestige, etc. The 5000 pound gorilla flattens the 200 pound monkey with 1 step.
USB-AMD could win, but only if it came out far enough in advance for products to start being designed using it.
There's a limited market for devices
Re: (Score:2, Insightful)
- All things being equal, USB-Intel would lose. Look at the companies opposing it: you have AMD, Intel's biggest rival in chipsets; you have nVidia, the biggest gfx company; you have VIA and SiS, who handle pretty much every other chip in your computer.
In short, every chip in your computer except your Intel chip would be specced to the disputing standard. What would Intel do to counter that? Personally try to take over the gfx market, the VIA mar
Re: (Score:2)
Not sure what this has to do with Linux, AMD are quite good at s
Re:So... (Score:5, Insightful)
Oh, you mean like Intel won over AMD with their attempt at a 64-bit processor instruction set?
(In case you don't know: they absolutely did not. Intel had to scrap their 64-bit processor because nobody wanted it, and today's Intel 64-bit processors use AMD's instruction set.)
Re: (Score:2)
IMHO, the Itanium architecture is far better than AMD64, with 128 integer registers and 128 floating-point registers, but because it couldn't run 32-bit x86 code natively, it has not obtained much marketshare other than in enterprise servers, where x86 compatibility does
Re: (Score:2)
Alpha, MIPS, HPPA and Sparc are superior to x86/amd64 too, but the problem is that people want to run closed-source commercial software, which brings up a number of factors:
Existing apps won't run, users need to replace their apps, commercial vendors will typically charge for the new version.
A new architecture has no users yet, and thus no potential market for commercial vendors. (chicken)
A new architecture has no commercial software vendors yet, and thus it will not gain
Betamax theory of CE (Score:3)
Once again, we'll have the VHS version and the Betamax version.
One will win. Avoid whichever one Sony gets behind.
Re:Betamax theory of CE (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
I think Sony must have sent some of their upper management to the Microsoft/Big Oil/Ma Bell School of Ungodly Profit... conventional wisdom says that pissing off paying customers and then charging them extra for the "privilege" will lead to failure, but the SoUP teaches that such a tactic will, in fact, lead to otherwise impossible success.
Of course, if they want their doctorates in screwing peopl
Re: (Score:3, Informative)
Re:Betamax theory of CE (Score:5, Insightful)
This one's not over yet. Apparently online distribution was a third contender waiting in the wings. We shall see. Sony bought out HD-DVD. They can't buy out online distribution. In the meantime BD players and discs have gone up in price, not down. That was a critical mistake.
Sony has some of the most brilliant engineers on earth. They're chained to the marketing team from hell. They always try to exploit their market share before it's time. A shame, really. They do a host of other things wrong too. If it weren't so, their supercomputer-class gaming console [wired.com] would not be coming in third to the Xbox and the Wii. They could use a consultant to come in and tell them how clueless their marketing team is, but they have too much pride to win. Surely I'm not the only one who sees this.
Re: (Score:2)
*cough* Let's offer an HD-DVD addon for the X-Box *cough*
Sony seems to have done pretty damned well, actually. *cough* the PS3 has a Blu-ray drive in the unit *cough*
Re: (Score:2)
Regular slashdotters will know I'm not one to endorse Microsoft's stuff. The very notion is abhorrent.
But even a stopped clock is right twice a day. This one's not over yet and Microsoft may still win this one with online distribution before market penetration of HD video is enough to lock the market.
I could probably help Sony win this one. They won't listen to me. Their loss.
Not competing standard, competing hardware designs (Score:5, Informative)
Re: (Score:2)
Currently, one has to write 1 driver for USB: no matter what chip is used, 1 driver should support it.
In Linux you'll see "uhci" and "ehci" modules.
All this means is that one will have to write 2 drivers to support all USB 3 chips.
It's a mountain out of a molehill.
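The uhci/ehci split the parent mentions shows up in how an OS binds a host-controller driver. Here's a minimal sketch keyed on the PCI programming-interface byte (class 0x0C = serial bus, subclass 0x03 = USB); the 0x00/0x10/0x20 values are the real PCI-SIG assignments, while the driver names are Linux-style labels. Any USB 3 entries would depend on which rival spec wins, so none are shown:

```python
# Host-controller driver selection by PCI prog-if byte. A forked USB3
# host-controller spec would simply mean adding two entries here instead
# of one, which is why this is a driver-writer problem, not a user problem.
HCD_BY_PROG_IF = {
    0x00: "uhci_hcd",  # Intel's USB 1.x interface
    0x10: "ohci_hcd",  # Compaq/Microsoft/National's USB 1.x interface
    0x20: "ehci_hcd",  # the single, unified USB 2.0 interface
}

def pick_driver(prog_if):
    return HCD_BY_PROG_IF.get(prog_if, "no driver bound")

print(pick_driver(0x20))  # ehci_hcd
print(pick_driver(0x00))  # uhci_hcd
```

Devices on the wire never see this table; only the kernel's choice of host-controller driver changes.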
Re: (Score:2)
Re: (Score:2)
UHCI and OHCI for USB1...
EHCI for USB2...
AMDUSB3 and INTELUSB3 for USB3?
But standards in hardware are good, the reliance on drivers to provide a compatible middle layer between hardware and software does nothing to help performance or ease of use.
With standard hardware, we can...
Make OS's easier to install (drivers for all standard hardware can easily be included, much less work for the OS authors).
Make apps (games) that boot directly without a need for an OS, and derive maximum performance from t
Re:So... (Score:4, Informative)
But according to the USB spec both behaviours are correct since the device can't make any assumptions about what overheads exist on the host.
I can't find the reference to device-visible differences between UHCI and OHCI, and in any case it was a very rare case. I did find this presentation by Intel that shows OHCI and UHCI performing almost identically, despite the fact that OHCI controllers basically do the USB protocol in hardware, while UHCI is just a bus-master DMA engine attached to a serial interface, with the protocol done in software.
http://www.usb.org/developers/presentations/pres0598/bulkperf.ppt [usb.org]
With USB 2.0 there was a push to a unified host controller spec called EHCI. From what I can tell, this spat means that there will possibly be two rival host controller specs, because Intel hasn't published its spec in time for other people to implement it. But I don't think that will fork the wire protocol; I think it just means that OSs will need two new host controller drivers, as with USB 1.0, rather than one, as with USB 2.0.
You could argue that UHCI was a good thing since it uses less hardware and performs about the same.
Incidentally, Wikipedia writes this up along the lines of the "good open standards vs vile proprietary standards" meme, which seems a bit unfair. Both OHCI and UHCI are based on published specifications which are freely available. I don't know if you need to pay a license fee to implement either or both of them; I actually think you don't, since USB was successful partly because you didn't need to pay a per-port fee when it was introduced, unlike Firewire.
http://en.wikipedia.org/wiki/OHCI [wikipedia.org]
The difference seems to me more like a software engineer's view of the world (Microsoft wanted to do it all in hardware, like OHCI) vs a hardware engineer's view of the world (Intel said do it all in software, with UHCI).
Re: (Score:2)
There are ups and downs to both sides... Doing things in hardware is great and performs better short-term, but processor speed can quickly catch up and surpass the dedicated hardware if it's not also kept up to date...
Use the Amiga as an example: when it came out, its dedicated video/sound hardware was great and helped the Amiga massively outperform other systems using the sam
Re: (Score:2)
Re: (Score:2)
Are you seriously making the argument that "800Mbps should be enough for anybody"?
Non-skewed article how? (Score:2, Insightful)
Re:Non-skewed article how? (Score:2)
The fine article doesn't have to be bias free. We'll cover every conceivable side of the issues in the slashdot comments, and much irrelevance also.
My personal opinion: USB3.0 is cool, but give me external PCIe v2.0 x16 for the win. And Natalie Portman slathered in hot grits, of course.
Re:Non-skewed article how? (Score:4, Funny)
This does point out one thing, there is a lot to be said for open standards
No matter which version is better technically, if there is one that is not backwards compatible, it will have an uphill slog trying to sell. Yeah, I know, CDs were not backwards compatible with floppy drives, but this is a bit different. If the connector is the same, it MUST be compatible, or my Aunt Nelly will kill someone.
Re: (Score:3, Interesting)
"there is a lot to be said for open standards"... Yes, Something indeed. Who lead the CD revolution? Sony. Who developed the standard? Sony (and Phillips). They released the standard after they had working products to sell. The "standard" still then cost a lot of money to even look at. (See the wikipedia article on the Red Book standard).
My Point (finally?): Giving the ex
Re:Non-skewed article how? (Score:4, Interesting)
We can detail the lawsuits ad nauseam, but my point is that anyone who was a healthy partner with MS has done to their industry what MS did to software. Like that or not, it is true. In the end, we have Mr Gates to thank for this, no matter how philanthropic he may try to be these days. I wonder sometimes how far exactly he has set the human race back from where it will eventually, and necessarily, be.
Though that is sort of scifi philosophy, it is true. In the name of riches, the advancement of technology has been slowed, deliberately, and with malicious intent against the betterment of mankind. In this way, I find his generosity a bit pale these days.
Open standards are indeed the ONLY way to create technology and advancement that will last and actually advance mankind in a direction that betters all of us. Despite the socialist-sounding tone of that, it is true. We are all better for the sharing of technology from the space race. Technology, and specifically computing/networks, are still in the hands of those who would derail its benefits if there is profit in it. There are those who are trying to change this situation, but it is slow going. Even hardware manufacturers are hobbled by things like the DMCA and its ilk around the world. Sometimes I'm sad to say I'm American.
Fighting against the 'right thing to do' for the sake of money is not in the best interests of the community, and in the end it hurts your business. The customer is king, so they say, and when you put hurdles in the way of a complete and exemplary end-user experience, you harm your business in some way, if not in big ways. It's unfortunate that not enough people will understand that the standards battles in the technology markets have hurt them, and they will not understand how to express their frustration when older USB devices won't work with new USB hosts. It will be just one more black-magic thing they don't understand about technology. They will go to PCs R Us and buy whatever the best they can get happens to be, hoping that it works for a couple of years, not unlike car buyers. So for profits, businesses promote the throw-away society: when there is something new, throw the old away; don't upgrade, don't re-use. How is this helpful to the human race?
Well, just some late night thoughts about this whole thing, and the absolutely ignorant waste it makes of the world.
BTW, there is hardware space competition.... if you are willing to build your own and not buy what the idiot^H^H^H^H^H salesman tells you at worstbuy.
sigh
Re: (Score:2, Insightful)
Bastard companies (Score:4, Insightful)
This is only a concern to driver writers (Score:5, Informative)
This does NOT affect users at all, only driver writers.
What is being forked is the USB host controller interface, which does not affect device compatibility at all.
As mentioned above, there were two driver interfaces for the original USB standard, and the only people who knew were driver writers and nerds compiling their own custom kernels.
This is blown way out of proportion, and doesn't affect 99.999% of us. Nothing to see here, move along....
Re: (Score:2, Insightful)
Though I'm sure Denon will be the first to come out with a super USB 3.0 optical cable for the bargain price of $750 as an upgrade to their $500 Ethernet cable [slashdot.org] which seems to have an issue with clearly transmitting the frequencies that dogs hear.
So hopefully in a year or t
Re: (Score:3, Informative)
It is not in the interests of the consumer nor of the standard to have multiple host-controller interfaces. You may care to muse on why it might be in Intel's interests to the detriment of all others.
unfinished spec (Score:2)
By the sounds of things: Both Right, Easy Solution (Score:3, Interesting)
nVidia has a point: Intel not telling anyone else until the last moment would, indeed, give Intel an unfair first mover advantage.
Obvious solution: Release the pre and post release specs with an agreement attached that anyone wanting a copy has to sign. An amount of time that gives everyone a fair chance to get product ready is picked after final specs are chosen. Anyone gaining access to the specs agrees not to release until that time period has passed. Now no one releases incompatible hardware and no one gets an unfair first mover advantage.
I've got an idea (Score:3, Funny)
Intel's positing on the USB-IF seems strong (Score:2)
This may seem like the odds are stacked in Intel's favour, and I'm sure they think so too, by not allowing anyone else near the host controller spec. Despite this, I think that the other board members fully realise that Intel is a minority against the combined force of AMD, Via, SiS and Nvidia in the production of chipsets for desktop PCs.
The USB-IF board knows the danger of losing control of the standard if it is f
Oh great... (Score:2)
...it's the same HD-DVD vs. Blu-Ray / DVD-R vs. DVD+R shit, except now it's cables.
The only people getting screwed are the customers.
Re: (Score:3, Informative)