PCI Express 3.0 Delayed Till 2011 80
Professor_Quail writes "PC Magazine reports that the PCI SIG has officially delayed the release of the PCI Express 3.0 specification until the second quarter of 2010. Originally, the PCI Express 3.0 specification called for the spec itself to be released this year, with products due about a year after the spec's release, or in 2010."
Delayed the release? (Score:5, Funny)
So the spec is complete, but we're not gonna tell you what it says!
Doesn't make sense!
Re: (Score:2)
Oh wait... they didn't delay the spec... the spec is not ready yet. BIG DIFFERENCE!
Re:Delayed the release? (Score:5, Funny)
They are just giving time to Amazon's EC2/S3 to get compliant.
Re: (Score:1)
Or they worked out a deal with computer manufacturers to get an extra upgrade cycle. There'll be one this year, for people who just have to have Windows Vista SP2/3, and then another one next year for businesses that want PCI3...
Re:Delayed the release? (Score:5, Informative)
So the spec is complete, but we're not gonna tell you what it says!
Doesn't make sense!
The article says they're working on getting it to be backward compliant with the current PCIe specs. You probably don't want to start building to the spec until that's in place anyway. You can find a lot of information on PCIe 3.0 [pcisig.com] on the FAQ on their site. If you're a member of PCI SIG, you might even be able to get the preliminary spec, who knows?
Re: (Score:2)
Nobody cares about your insignificant little life. Go kill yourself. It's across the lane and not down the street.
*facepalm* If you're going to flame, do it right. You got it backwards, dummy.
Or, in the more concise and efficient spirit of the Scary Devil Monastery, simply...
"Down, not across."
Re: (Score:3, Interesting)
Re: (Score:2)
AGP 1x and 2x were 3.3V, 4x was 1.5V and 8x was 0.8V.
Afaict virtually all stuff that supported 0.8V supported 1.5V as well. So that left 1.5V/0.8V vs 3.3V as the main compatibility issue. There was a notching system that was supposed to indicate whether a card/motherboard supported just 3.3V, just 1.5V/0.8V, or both, and prevent incompatible combinations from mating. Unfortunately some manufacturers miskeyed their products.
BTW PCI also had two voltages though the lower voltage was generally only seen on prett
Re: (Score:2)
Re: (Score:2)
"I am staring right now at a SFF 733Mhz Compaq where the PSU is shaped like a fricking triangle! "
Now that is truly awesome! What drugs were the engineers taking when they designed that?!
As for RAM, I've stuck different speeds together (not recommended) when Franken-building machines from old parts. It will work, if a bit quirky. I'm talking about 5-10 years ago; I don't know how current machines and RAM will behave if you try that now.
Then there was the Amiga's floppy drive. Even though the disks were t
Re: (Score:2)
The problem with the Amiga floppy was not really the drive itself but the floppy controller. You can read Amiga disks on a standard PC disk drive, but it requires either a special disk controller, or 2 disk drives and some REALLY clever software*.
*Googling for why it requires 2 disk drives for a PC to read 1 Amiga disk will really show some great software hacking.
Re: (Score:2)
Re: (Score:2, Interesting)
Personally I am quite glad they are delaying it until it is fully backwards compatible.
Umm, dude, this is Slashdot. The correct response is "This new standard sucks. It would be 10x faster if they didn't worry about backward-compatibility cruft" from a bunch of people who didn't understand the old standard but have been told it was really complex.
A good example would be x64 replacing x86. Every single nerd on the internet knows that x86 is bloated and that x64 should have started from scratch, despite the fact that a look at a picture of the die of a modern processor shows that the actual CPU cor
Re: (Score:2)
It is one thing to switch to a new processor (say Macs from Motorola to PPC to Intel), which is a good thing in the long run, and another in letting interconnects be backwards compatible. Remember the furor of phasing out ADB, PS/2, serial, and parallel for USB? If USB 3 is not back-compatible, yikes. As for PCI, I'm not a graphics guy, so I don't know if back-compat is a big deal performance wise or not.
Re: (Score:1)
Maybe it's like the USB 3.0 xHCI spec. Our spec is beautiful, man. But you can't just download the PDF from a webpage. You need to get your boss to sign something and fax it and then post the originals. Bureaucratic fucks.
http://www.intel.com/technology/usb/xhcispec.htm [intel.com]
Re: (Score:2)
Re:Whatever (Score:5, Funny)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Funny)
PCI Express 2.0 has more bandwidth than anyone will ever need
It might have more bandwidth than almost anyone will need before 2011... So start saving; in 2011 you'll be paying blood for your new PCIe 3.0 graphics card!
Who cares (Score:1, Insightful)
Just another reason to make everyone buy new motherboards. Add one more pin to the CPU while you're at it. Seriously, PCIe 1.1 or whatever is great for me, and I play Crysis at 1280x1024 with an old ATI X1900 - by no means top of the line - on an FX-60 Socket 939 CPU. Eventually I'll buy an AM2 or AM3 or AM9 or whatever they're on next. These PCIe upgrades really don't offer much anyway. Mainly we need to get manufacturers to stop selling x8 electricals as x16s.
Re: (Score:3, Insightful)
Cluster interconnects, high speed storage attachment, and various flavors of coprocessors are always hungry for more bandwidth.
Re: (Score:3, Insightful)
but isn't that the point of making it a channelized system? where each channel is full duplex? they can just add more channels as needed.
16x - 20x - 24x - 32x
you can plug a 1x or 4x card in a 16x slot and have it work - hell if you wanted to you could make a 3x card..
adding more available channels on the slot is much less of a change than PCI-X was to PCI.. and that actually turned out to work quite well..
i'm all for increasing the speed of interconnects - but adding more lanes seems to work just as
Re: (Score:3, Interesting)
For the moment, at least, our ability to drive wires faster at
Re: (Score:2)
Make a double-slot card that goes in two 16x slots if you need 32x?
The only problem with that is almost all boards with multiple 16x sockets have them with a 1x socket in between...
Re: (Score:2)
Looks like a PCIe x32 connector is 210 mm long ( http://az-com.com/pages/pcie/pcie_pdf/ds-06-01.pdf [az-com.com] ) compared to 158 mm for x16
I'm finding it tricky to find the length of standard PCI connectors, and things are also complicated by the fact that PCI Express connectors sit closer to the edge of the motherboard than PCI ones, but I'd guess it would reach back about as far as a 64-bit PCI slot does.
Still, I agree it would be a routing nightmare (which means more layers and therefore more cost).
Re: (Score:2)
Re: (Score:2)
whats in 3.0? (Score:5, Interesting)
the pci sig blurb says it's mostly cleanup and the removal of 5V support
does anyone know of anything interesting in 3.0?
Re:whats in 3.0? (Score:5, Interesting)
Twice as fast again: x16 is 32GB/s (counting both directions). They're looking to support 3 graphics cards per PC, which is cool if you're into that whole supercomputer-on-your-desk thing, but it's going to burn at least a kilowatt.
I'm sad we haven't seen external PCIe implemented. It was in the v2 specification. The idea of an external interconnect with that much bandwidth probably made some heavy players nervous.
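The "twice as fast" figure checks out on the back of an envelope. PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, while the published PCIe 3.0 numbers are 8 GT/s with the more efficient 128b/130b encoding; a quick sketch of the arithmetic (the rates and encodings come from PCI-SIG's public material, and the ~32 GB/s headline number counts both directions of the full-duplex link):

```python
# Rough PCIe usable-bandwidth estimate per lane count.
# PCIe 2.0: 5 GT/s with 8b/10b encoding (8 payload bits per 10 on the wire).
# PCIe 3.0: 8 GT/s with 128b/130b encoding (128 payload bits per 130).

def pcie_bandwidth_gbs(gt_per_s, payload_bits, encoded_bits, lanes):
    """Usable bandwidth in GB/s, one direction."""
    bits_per_s = gt_per_s * 1e9 * (payload_bits / encoded_bits) * lanes
    return bits_per_s / 8 / 1e9  # bits -> bytes, then GB

gen2_x16 = pcie_bandwidth_gbs(5.0, 8, 10, 16)     # ~8 GB/s each way
gen3_x16 = pcie_bandwidth_gbs(8.0, 128, 130, 16)  # ~15.75 GB/s each way

print(f"PCIe 2.0 x16: {gen2_x16:.2f} GB/s per direction")
print(f"PCIe 3.0 x16: {gen3_x16:.2f} GB/s per direction")
print(f"PCIe 3.0 x16 both directions: {2 * gen3_x16:.1f} GB/s")
```

So the doubling comes from less than double the signaling rate (5 to 8 GT/s) plus the leaner encoding, and "32 GB/s" is the bidirectional total, not one-way throughput.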
Re: (Score:2, Informative)
http://www.ni.com/pxi/mxiexpress/ [ni.com]
Re: (Score:3, Informative)
As the AC above me referenced, National Instruments uses PCI-e for a lot of their backplane communications in their equipment.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
External PCIe x1 isn't that expensive: http://www.magma.com/products/pciexpress/expressbox1/index.html [magma.com]
The x8 version starts to get expensive, though: http://www.magma.com/products/pciexpress/expressbox4-1u/prices.html [magma.com]
Re: (Score:2)
Nvidia has an external PCIe Tesla. I've also seen external GPUs for laptops and I heard something about RED Rocket for laptops that hangs off the ExpressCard slot.
Re: (Score:3, Interesting)
They're looking to support 3 graphics cards per PC
Interesting - I just read the specs on my motherboard, which has 4 slots for video cards. Granted, with all 4 slots used they're only x8 (which is OK since I live in 2D land), but with 3 or fewer in use they're all x16 (well, so it claims), so it would seem that's already covered.
Re: (Score:3, Informative)
Re: (Score:1)
There have been and still are a few implementations of external pci express. But they have all been prohibitively expensive and somewhat "special purpose".
Yeah, they're called ExpressCards.
Re: (Score:2, Funny)
Holy cow, that's what I was looking for, thanks! The Magma ExpressBox7: $2800 for 7 x4-electrical, x16-physical slots and an x4 host adapter with cable, rackmount. That's why I like Slashdot.
This enables some interesting configurations of those 1TB PCIe attached SSDs.
Re: (Score:2)
I haven't used it, but I came across it while looking for a way to relocate my Delta 1010 PCI card away from one of my PCIe 16x slots. Also available is a PCIe to external 4xPCI slots, which is great for legacy stuff (or interesting wifi configurations?).
Re: (Score:2)
Twice as fast again. x16 is 32GB/s. They're looking to support 3 graphics cards per PC, which is cool if you're into that whole supercomputer on your desk thing, but it's going to burn at least a kilowatt.
No.
Ever read those power consumption reviews with beefy high-end cards? Usually the computers (quad core, single high-end GPU) use 200-300W under load. Much of that comes from the CPU/mobo/RAM/HDD/etc. If you add a few more cards, it's unlikely you'll even hit 500 watts.
I picked up a Kill-A-Watt off newegg, a while back, and was surprised to find out my gaming computer only consumes ~100 watts from the wall. That's partly influenced by having a high efficiency PSU, and partly by parts not consuming nearly as much
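The parent's point can be made concrete with a rough power budget. All the wattages below are illustrative assumptions for a late-2000s gaming box, not measurements, but they show why even three GPUs lands well under the "at least a kilowatt" claim upthread:

```python
# Back-of-envelope system power budget (all wattages are hypothetical).
components = {
    "quad-core CPU (load)": 95,
    "motherboard + RAM": 50,
    "HDD + fans + misc": 35,
}
gpu_load_w = 150  # assumed draw for one high-end card under load

for n_gpus in (1, 3):
    dc_watts = sum(components.values()) + n_gpus * gpu_load_w
    wall_watts = dc_watts / 0.85  # assume an 85%-efficient PSU
    print(f"{n_gpus} GPU(s): ~{dc_watts} W DC, ~{wall_watts:.0f} W at the wall")
```

Even with generous per-card numbers, tripling the GPUs only adds the incremental card draw; the CPU/mobo/disk baseline doesn't triple with it, which is why measured wall figures stay far below naive PSU-rating guesses.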
Re: (Score:2)
I picked up a Kill-A-Watt off newegg, a while back, and was surprised to find out my gaming computer only consumes ~100 watts from the wall.
Is that an idle measurement or one under heavy load?
Re: (Score:2)
Idle. Heavy load peaks it up to about 150, depending on whether the CPU, GPU, or both are stressed.
If I were to multitask and burn a DVD while video encoding on one core and playing Left4Dead, I have a feeling I could push it higher - but let's be honest... that isn't really average use. ;)
And before anyone asks, I checked out the consumption of other stuff like lightbulbs, my monitor, microwave, etc. to make sure the Kill-A-Watt wasn't on the fritz.
Made sure the Kill-A-Watt wasn't on the fritz. (Score:1)
Cool. Because we wouldn't want your measurements to be out of range of variance for a microwave or retail 60 Watt bulb. That would be bad.
It has probably occurred to you that people using that "supercomputer on your desk thing" might have different use cases than yourself. You've probably also considered that since this is slashdot, you might be talking to someone with NIST certified test equipment rather than a Kill-A-Watt purchased from Newegg.
It's cool that you're interested enough to buy your own t
Re: (Score:2)
Hehe, sarcasm. ;)
I don't really care how accurate my Kill-A-Watt is, so long as it isn't reporting a 300 watt computer as using 150 watts. After testing various devices, I'm fairly satisfied that this isn't the case.
I tried some 20w energy efficient bulbs, and they were consuming 25 watts each. :/ My 35w monitor only consumes 28w when on.
Out of curiosity I also tried an old CRT TV. That thing was a monster! ;)
I wish more people were interested too. Power demands are always going up - but if we can make thin
2012 actually! (Score:1, Funny)
This is when the first pci express 3 spec computer is installed into the LHC control system.
strange self-reference (Score:5, Funny)
the PCI Express 3.0 specification called for the spec itself to be released this year
Now we know how time loops are accidentally created.
Re: (Score:1)
2nd half of 2010 == 2011 !?!? (Score:2)
Good Cause Creative still cant handle PCIe now! (Score:2)
Re: (Score:2)
Not really, because the Sigmatel chip built into my Intel board... had serious problems with monitoring input. It had much worse drivers than Creative X-Fi Titanium.
To be fair to Creative, they did fix a lot of the latency issues in the driver that were due to PCIe timings... but they still pop up on my system from time to time.
Audio in a program will sometimes degrade into a crackling stream until I restart the program. Sometimes I have to restart the entire PC.
OVERALL, the drivers have some nice fea
Re: (Score:2)
If you're doing anything serious with a line-in jack, you're probably using the wrong tools anyway.
I'd get a professional audio soundcard or, more likely, an external USB or Firewire unit from some company like M-Audio.
Re: (Score:2)
You're right, but I'm just using the line-in so that two PCs' audio comes through the same speaker set.
I work in film and know plenty of audio guys. My best friend has a recording studio. I'm a visual FX artist and photographer. I know gear pretty well... I was simply piping one PC's audio into another which had speakers attached.
Like I said, I work mostly in visual arts... so I didn't need a MOTU, M-Audio, etc...
Just something for basic PC sound while I work and the occasional game when I'm tired of working :)
Re: (Score:2)
Hardware EAX, quality audio for musicians (not Creative's cards), etc.
Re: (Score:1)
Did you really just now find out Creative drivers are shit?
Re: (Score:2)
I took a break from Creative for a long time. I owned the original SB many years ago, the AWE32, SB Lives, and Audigy 1...
After that I was done with Creative for some time. Those cards were all good and I still had some faith in Creative. For a long time they were a solid go-to company for sound cards... since the old DOS days.
Recent years... I guess that's not true. I knew that when I went in on the X-Fi Titanium, but my Sigmatel onboard chip SUCKED. It had terrible driver support and broken functionality d
Re:Good Cause Creative still cant handle PCIe now! (Score:5, Informative)
Creative purchased their drivers from a third-party company and then just updated them over the years. This has literally been the case since their sound products began. Once Vista came out with an entirely new sound infrastructure, nobody at Creative had the expertise to write a decent driver, so they cobbled one together (with Microsoft's help) from their old horrible drivers.
Fact is - Creative sound cards aren't worthwhile because the drivers are so poor. Even if the sound hardware could potentially take load off of the CPU, you're more likely to spend endless hours messing with it, and even if it does work it won't work as effectively as one might hope.
Re: (Score:2)
I have heard that Creative purchased their drivers from a third party. I'm not sure that it's completely true, or any different from what Creative has done in the past. I'm pretty sure that for a long time, Creative's products were all pretty much made and engineered overseas by tech companies they hired. I don't think Creative ever did any real driver development... short of maybe the original SB for DOS.
Till, until and 'til (Score:2)
Re: (Score:2, Informative)
Til(l) was the original form of the word. The redundant prefix un(d) was added later, and nowadays people mistake till as a shortened version of until, which gives 'til. So, there's nothing wrong with the headline.
http://www.etymonline.com/index.php?term=till [etymonline.com]
http://www.etymonline.com/index.php?term=until [etymonline.com]
Re: (Score:1)
Article promises big speed-ups. (Score:2)
I think the guy needs a dictionary. (Score:2)
Uh, I hate to break this to you, guy, but according to the dictionary, moving back the deadline is pretty close to the opposite of "doing diligence".
I suspect there are other reasons for the delay. (Score:1)
The capital equipment costs to buy IC testers that run up to 8 GHz are quite prohibitive. In this economy I don't think too many IC production facilities are willing to lay out the funds to buy equipment to test at this higher rate until they have cash flow coming in from the upturn. Until then, the test coverage of ICs that run at 8 GHz is minimal and will require bench test methods and "guarantee by design". This delay, if not due to the capital equipment requirements of testing at 8 GHz, will allow supplie