Despite FTC Settlement, Intel Can Ship Oak Trail Without PCIe
MojoKid writes "When the Federal Trade Commission settled their investigation of Intel, one of the stipulations of the agreement was that Intel would continue to support the PCI Express standard for the next six years. Intel agreed to all the FTC's demands, but Intel's upcoming Oak Trail Atom platform presented something of a conundrum. Oak Trail was finalized long before the FTC and Intel began negotiating, which means Santa Clara could've been banned from shipping the platform. However, the FTC and Intel have recently jointly announced an agreement covering Oak Trail that allows Intel to sell the platform without adding PCIe support — for now."
Am I the only one who is confused... (Score:5, Insightful)
...by what the actual issue is here? And I did RTFA.
Something about Intel pushing a new proprietary graphics bus into a new chipset...they never actually mentioned how the FTC thing got started.
Re: (Score:2)
Re:Am I the only one who is confused... (Score:5, Informative)
A little more digging brings us [computerworld.co.nz]
The FTC site adds that [ftc.gov]
Seems to have been part of a broader move against Intel at the time, I admit I don't remember it very clearly, but Reuters adds [reuters.com]
Oh and that case can be found here [ftc.gov]
Re: (Score:2, Interesting)
Instead of telling Intel how to make their product, I consider it much better to confiscate the relevant patents and copyrights and put them into the public domain. That way AMD, nvidia, etc. will have all the access they need. They use asset forfeiture on us all the time. Time to use it here. Fair is fair.
Re: (Score:2)
Personally I think that graphics integrated with the CPU is a good thing.
The "video card" is an essential component of a system, and having a last resort GPU built into the CPU is a nice backup measure.
What would piss me off is if the chip then blocked access to external video card ports.
Re: (Score:2)
like by not including support for PCI-E?
Re: (Score:2)
Precisely.
Re: (Score:2, Interesting)
mediocre solution that they have to glue it onto the CPU
The mediocre solution (GMA HD) they are gluing to the CPU is a derivative of the solution they shipped 140,000,000 of last year (a GMA* in 90%+ of all laptops manufactured). That's pretty hilarious. It will be downright hysterical when, integrated into the CPUs, another 100,000,000 displace most of the discrete desktop graphics cards.
Do not bet against integration.
But how does that equate to supporting PCIe? (Score:2)
How do we get from anti-compete against AMD to 6 years of support for PCIe?
Re: (Score:1, Offtopic)
even more confusing is that google is not mentioned.
and this is a slashdot article. there *has* to be a google angle somewhere.
Re:Am I the only one who is confused... (Score:5, Informative)
Here is a good article [arstechnica.com] about the original antitrust settlement.
Basically, Intel refuses to license its new DMI or QPI bus protocols to NVIDIA, so they can no longer make chipsets for Intel processors (like nForce). Furthermore, it has been feared that with the push towards systems on a chip, Intel would eliminate the PCIe bus as well, leaving no way for any graphics company to supply a discrete graphics chip for netbook or notebook computers.
Re: (Score:3, Interesting)
Re: (Score:1, Interesting)
AMD also has HyperTransport. Maybe this was why there were rumours about Nvidia making a CPU.
If Intel & AMD decided to offer GPUs linked by QPI & HT it would give their GPUs a big advantage with Nvidia unable to compete.
I think non-portable computers will end up a lot more modular in this way. Memory, CPUs, GPUs, Northbridge all connected to each other on a future generation of a switched HT/QPI bus. It would make the computers much more scalable, futureproof, adaptable and efficient. It might also
Re:Am I the only one who is confused... (Score:4, Interesting)
If Intel & AMD decided to offer GPUs linked by QPI & HT it would give their GPUs a big advantage with Nvidia unable to compete.
That would also kill Intel's high-end consumer products. Most high-end Intel CPUs are sold to gamers, who aren't going to be gaming on some crappy Intel integrated graphics chip.
At least for the foreseeable future, Intel needs Nvidia for the mid to high-end gaming market, because they're not going to be releasing GPUs in that arena any time soon.
Re: (Score:2, Insightful)
High-end graphics and discrete cards are making up a smaller and smaller percentage of the market. It is quickly getting to the point that the only people who are buying discrete GPUs are gamers and graphics professionals. Most people just don't see the need for the added expense.
The "mid to high-end gaming market" is fairly small on the PC, relative to the entire PC market.
Re: (Score:3, Insightful)
Intel's high-end consumer products aren't where they make their money.... and enough of them make it into prebuilt machines from Dell, HP, anyways.
Most high-end Intel CPUs are sold as server solutions, where a graphics card makes very little difference.
Re: (Score:2)
Re: (Score:2)
Did I say that they didn't? No... I said that high-end consumer gaming machines are not where Intel is making its money.
And those cards don't have to use PCIe; there are other ways of getting the controllers in. Gigabit network cards, for example, already offload a lot onto the processor itself and are usually built into the chipset.
Re: (Score:2)
Who in the hell is gonna buy a Core i7 laptop and stick with integrated graphics, again?
How much market? (Score:2)
That would also kill Intel's high-end consumer products. Most high-end Intel CPUs are sold to gamers, who aren't going to be gaming on some crappy Intel integrated graphics chip.
Can you quantify the gamer CPU market sales vs. the chipset sales currently held by nVidia?
Re: (Score:1)
Re: (Score:2)
I and everyone else should be able to get GPUs independent of CPUs, or any other hardware for that matter
How about FPU?
Re: (Score:3, Insightful)
If they did that, every manufacturer of even moderately high-end laptops would drop their CPUs faster than an LSD addict drops acid.
Even if Intel's GPUs were the best in the industry, there are too many other critical things you wouldn't be able to properly support without PCIe-
Ya I can't imagine them not wanting it (Score:5, Interesting)
Intel doesn't want nVidia making chipsets, true enough, because Intel makes chipsets. However they want expansion slots on their boards because they want people using their boards. I'm quite sure they are plenty happy with nVidia and ATI graphics cards. Heck, they've included ATI's CrossFire on their boards for a long time (they didn't have SLI because nVidia wouldn't license it to them). Intel has nothing that competes in that arena, and they recently revised their plan so they aren't even going to try. They want people to get those high end GPUs because people who get high end GPUs often get high end CPUs, since they are gamers. Not only that, they STILL sell the integrated GPU, since it is on chip.
I just can't see them not wanting PCIe in their regular desktop boards. They know expansion is popular, and they also know that the people who expand the most also want the biggest CPUs.
Now on an Atom platform? Sure makes sense. These are extremely low end systems. PCIe logic is really nothing but wasted silicon. You don't have room for PCIe expansions in there, never mind the desire for it. Those are integrated, all-in-one, low end platforms.
However desktop and laptop? I can't see them wanting to eliminate it there.
Re: (Score:2)
Re: (Score:2)
"LSD addict drops acid"
LSD is not addictive.
Re: (Score:2)
Okay, bad analogy. You get the point, though....
Re: (Score:2)
All those things you mention don't really require PCIe - if they are provided by the Intel chipset.
Comment removed (Score:5, Informative)
Re: (Score:2)
Just look at how many power-hogging P4s are still in use, thanks partially to the fact that Intel paid off OEMs not to use the (better at the time) AMD chips.
Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions. When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.
It was only in the final P4 space-heater era that choosing AMD became a no-brainer.
Re: (Score:2)
Re: (Score:2)
You really do have to consider that the performance per watt in the era since the P4 has been stellar from Intel, while AMD hasn't quite been up to par in that department. The roles really have reversed in the past few years in regards to wattage, with Intel also keeping the raw performance crown on the high end.
On the other hand, for the price of the highest-end Intel chip [newegg.com] (and a motherboard [newegg.com] to run it on) (also note: board and chip ONLY; no OS, no drives, no case, no nothing), I can practically build two high-end AMD systems [newegg.com] (If Newegg will sell me a pre-built system for just under a grand, I'm willing to bet I can build it myself for $800 or less - especially without the MSFT tax).
Re: (Score:2)
Re: (Score:2)
Now that AMD is about to integrate decent GFX...one can see why Nvidia wants to focus primarily on the "pro" market.
The last two performance-starved areas, games and video editing/encoding, should quickly become mostly covered even by entry-level CPUs...
Re: (Score:3, Insightful)
That's not a good comparison. Intel has an obscenely high priced chip. Fine, they always have, for those people who have more money than sense. They also have reasonably priced chips. Try instead looking at, say, a Core i5-760. 2.8GHz quad core chip for $210. Look up some performance numbers and then compare to AMD chips. It isn't very favorable. More or less they need their 6 core chips to compete, and then it is only competitive if you happen to have an app that can use all 6 cores (which is very rare still
Re: (Score:2)
Sure if you are the type to run only one app at a time.
my list:
1)emerge
2)ffmpeg
3)Wow.exe
4) Chrome while on careerbuilder.com (something is dumb there and loops using 10% CPU).
ffmpeg can use 2-4 cores, emerge can as well, WoW uses 2, and Chrome could use a bunch as well. Hmm, just about everything I run is threaded; 6-12 cores sure sounds nice.
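(If it helps anyone reproduce that, ffmpeg's worker count is tunable: "-threads 4", or "-threads 0" to auto-detect the core count, assuming the codec in use, such as libx264, supports frame/slice threading. Those flags are ffmpeg's standard options, not the parent's actual setup.)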
Re:Am I the only one who is confused... (Score:4, Interesting)
Try instead looking at, say, a Core i5-760. 2.8GHz quad core chip for $210. Look up some performance numbers and then compare to AMD chips.
Performance numbers based on Intel's crippling compiler.
Yeah. Even in cases where Intel's compiler isn't used for the benchmark program, many benchmarks still use libraries compiled with Intel's compiler.
Of significance are Intel's Math Kernel Library and even AMD's Core Math Library (compiled with Intel's Fortran compiler!).
These libraries are extensively used in most benchmark programs.
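For anyone wondering what dispatching on the CPU vendor actually looks like: the vendor string is readable from CPUID leaf 0, and a runtime can branch on it. A minimal sketch using GCC's <cpuid.h> (an illustration only, not ICC's actual code):

    /* Read the CPUID vendor string the way a dispatching runtime might.
     * Intel parts report "GenuineIntel", AMD parts "AuthenticAMD". */
    #include <cpuid.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13];

        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 1;

        /* The 12-byte vendor string is packed into EBX, EDX, ECX, in that order. */
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
        vendor[12] = '\0';

        printf("CPU vendor: %s\n", vendor);

        /* A vendor-keyed dispatcher would pick its code path right here. */
        return strcmp(vendor, "GenuineIntel") == 0 ? 0 : 1;
    }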
Well then perhaps AMD needs to get on that (Score:2)
See here's the problem: Even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot. Any time you see compiler benchmarks it is consistently the best thing on the market. Intel has a really, really good compiler team. So if that is a problem for AMD, well then they should be writing their own compiler. Like the ICC, they should make it plug in to Visual Studio so that VS developers can use it as a drop-in replacement to speed up
Re: (Score:2)
Even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot.
This isn't about not putting effort into optimizing for non-Intel. This is about intentionally putting effort into sabotaging non-Intel performance. They have been convicted of this act, and so far have not honored the court's ruling on the matter.
Remember when Microsoft intentionally sabotaged competing DOS clones? Yeah.
They only care how it does in the apps they actually use. So if all their apps are ICC apps and they run better on Intel processors, well then that's all that matters.
Who claimed that all their apps are ICC-based? The claim is that most BENCHMARKS use ICC-generated code at some point, and this claim is DEMONSTRABLY true (simply changing the CPUID Vendor
Re: (Score:2)
Try doing some benchmarks on a pure Linux system compiled with GCC...
GCC has no reason to bias any individual processor maker.
Re: (Score:2)
My "contrived situation" was simply to go to newegg.com, and look up the highest rated intel chip, then get the highest rated motherboard for that chip, then compare that price to a good gaming system sold as a single, pre-built unit.
The funny thing is, you managed to rant about how I used a six-core chip against the Intel four-core, while completely ignoring that for the money, I would actually get twelve cores with the AMD-based solution. Unless you can argue that it takes 3 AMD cores to equal the perform
Re: (Score:2)
Re: (Score:2)
You'd also tell the difference if you had an orangutan beating you with a stick. What does that have to do with performance, again?
Please don't argue just for the sake of argument.
Re: (Score:2)
Re: (Score:2)
Let's take this to an individual chip level, so my point comes through crystal clear.
Intel Core i7-975 Extreme Edition Bloomfield 3.33GHz 4 x 256KB L2 Cache 8MB L3 Cache LGA 1366 130W Quad-Core Processor BX80601975 [newegg.com] - $1,039.99
AMD Phenom II X6 1090T Black Edition Thuban 3.2GHz 6 x 512KB L2 Cache 6MB L3 Cache Socket AM3 125W Six-Core Desktop Processor HDT90ZFBGRBOX [newegg.com] $229.00
This is a direct comparison of the "best" Intel desktop chip on newegg.com and the "best" AMD desktop chip on newegg.com. I didn't research
Re: (Score:2)
You're picking Intel's most expensive chip to try and prove a point, and failing horribly. Intel has a $279.99 offering on Newegg [newegg.com] that beats [tomshardware.com] the [tomshardware.com] living [tomshardware.com] shit [tomshardware.com] out [tomshardware.com] of the AMD processor for things normal people do on their home computers, and is damn close in the rest. Oh, and it uses far less power both at idle and at load [anandtech.com]. (Tom's didn't have power numbers for the i7-860).
Now, you might have a point about code "not being optimized for AMD blahblahblah", but here's a newsflash: Not only do the testing suites us
Re: (Score:2)
Actual measurements of power consumed are very close, with upper Intel chips often consuming more, and with performance differences that can't be noticed anyway (TDP is not consumption, and Intel uses a more optimistic rating method anyway).
Re: (Score:2)
Not to mention that Intel's top of the line chip really does thrash AMD's...
Re: (Score:2)
Why are you thinking that you need a $290 motherboard again? AMD may be better bang for your buck but you have only yourself to blame if you want a budget system and then sink that much into a Mobo.
See my reply further back in the same thread [slashdot.org]. The discussion was about Intel chips being more expensive, without delivering enough additional power to justify the additional expense. Indeed, with the number of AMD chips you could buy for the price of a single Intel chip, you could outperform any multi-threaded process with the AMD chips; If you're building a cluster, AMD wins hands down.
I'm not trying to tweak your nose, here, but I just don't see any reason to buy Intel anymore - they might be the king of
Re: (Score:2)
Re: (Score:2)
The $300 motherboard wasn't really the kick in the pants... the big deal there was that the Intel chip is more than the AMD system.
By the way, why didn't you manage to read my comment before posting your own? I posted this [slashdot.org] more than an hour and a half before you asked about the motherboard...
Re: (Score:2)
Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions.
Clock for clock the Athlons beat the shit out of the P4. Only by getting a fast and thus very power-hungry and expensive processor could you build a faster machine with a P4. Does that mean they were "better"? Also, at the time floating point had just become massively important to gaming since we were fully into 3d land. fp math was one of the most important differentiators and competition over specfp benchmarks was intense.
When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.
Only if you bought your AMD from someone whose pricing was designed to remove their
Re: (Score:3, Funny)
"...they make MSFT look like the Care Bears."
Can't....type.....horrible image of Ballmer....in Care Bear outfit.
Re: (Score:2)
From 2000 - 2005 I bought a few AMD systems because they were a bit cheaper, but I also had quite a few CPUs fail and even one melt despite a heatsink, fan, two case fans, plus another PCI slot fan. Maybe it was just my luck of the draw, but since 2005 everything I've bought except my PowerMac G5 tower has been Intel CPUs. And I haven't had any problems with the Intel CPUs.
Re: (Score:2)
Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.
They did, but the DMI license is mostly a diversion. The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia manage to claw their way back into the motherboard game there's nothing for them there, since graphics used to be the main differentiator. By turning it into a license/contract issue it seems a lot cleaner than "oh, you can still produce boards but we moved the essential functionality into the CPU". Though honestly AMD has been talking about the m
Re: (Score:2)
The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia manage to claw their way back into the motherboard game there's nothing for them there since graphics used to be the main differentiator.
They still have shit graphics, so there is still a need for fancier graphics for laptops. If I was buying a laptop for my girlfriend I would have used Intel integrated video before, and I would still use it. If I am buying one for me then I wouldn't use it before, and I won't use it now. Nothing has changed except where the hardware I don't want lives. Before it was soldered to the motherboard where I couldn't (reasonably) remove it. Now it's built into the CPU and I still can't remove it.
Nothing has changed
Re: (Score:2)
Nothing has changed.
What has changed is that with Core 2 stuff you had the option of an nVidia chipset with integrated graphics that were better than Intel integrated graphics, while being physically smaller and lower power than a discrete solution with its own memory.
With current gen Intel stuff that option is gone (though admittedly, from a user's point of view, the fact that Intel integrated graphics are better than they used to be somewhat makes up for it).
Afaict Nvidia was afraid that Intel would
Re: (Score:2)
Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.
Do you understand what this chip is? It's a system on a chip. The whole point is a small, integrated, specialized, low power chip for things like tablets. There's absolutely no point in allowing for an NVIDIA chip on it because 1) the integrated graphics are ALL you need, 2) if you added another GPU chip you would hurt power consumption and increase overall costs, and 3) why the hell increase the complexity of the chip to support something that is fundamentally contrary to the design goals
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
The issue is that Atom, except on Intel STB boards with their media acceleration processor, is deliberately crippled in terms of video performance. As a result the entire system ends up being crippled wholesale, giving consumers the perception that the computer is slow while it really isn't.
Nvidia has demonstrated this - when paired with decent video chippery, Atom makes a perfectly adequate desktop and notebook. As a result Intel has gone as far as damaging its own media STB roadmap to lock Nvidia out so that Atom does
I think this was in the pipeline before the FTC rule (Score:2)
I think this was in the pipeline before the new FTC rule came out
Re: (Score:2)
In which case Intel and Microsoft are simply giving a huge market opportunity to Apple and any company that decides to build a decent platform on Android using non-Intel hardware (ARM, etc.).
Sounds like a loss for Intel/MS and a win for the consumer.
Re: (Score:2)
Intel has been bitten by its non-cannibalisations strategies in the past.
One of the main reasons for Athlon to have its time in the sun is not its performance. It was _NOT_ that much better initially. It was Intel deliberately limiting and crippling various product lines on non-cannibalisation grounds. i810, i840 and most importantly crippling the i815e, which had 2G memory addressing capacity by design with a strict 512MB marketing requirement (sounds familiar, doesn't it?), had more to do with it. As a result Athlon
Re: (Score:1, Funny)
Re: (Score:3, Funny)
... I went there expecting shoe porn.
Re: (Score:2)
I looked at the picture in the fine article: Oak Trail, Lincroft, Whitney Point, Langwell, and I thought they were talking about some suburban development in California.
Don't see any other way for Intel (Score:5, Insightful)
Re: (Score:1)
The FTC behaved pretty much reasonably in this case.
I'm pretty sure that's one of the signs of the apocalypse.
Re: (Score:2, Offtopic)
Maybe they can just glue on dummy PCIe slots, kind of like the Chinese used to hand-paint barcodes on boxes.
Dang, I'm no good with Google today. I can't find the reference. Years ago, when barcodes were just starting to become popular on boxes/cases used for shipping, I recall a story where some American company had specified that their Chinese supplier had to begin bar-coding boxes of goods sent to the US to make warehousing here easier, and proceeded to have fits when none of the barcodes scanned. They
Re: (Score:3, Interesting)
Re: (Score:2)
Off-topic, but FYI I think Oak Trail basically is a PC-compatible chipset for Moorestown (the other chipset was not PC-compatible). It includes all the legacy stuff that is needed to maintain compatibility back to the original IBM PC in 1981, and extensions such as ACPI, so most normal x86 OSes will run.
Re: (Score:2)
Oak Trail basically is a PC-compatible chipset for Moorestown
Oops, replace Moorestown here with Lincroft.
Re: (Score:2)
It's a SoC chip
To be more precise, Lincroft is the SoC chip, Whitney Point is the codename of the PC-compatible chipset, and Oak Trail is the codename for the entire platform.
Re: (Score:2)
Re:Don't see any other way for Intel (Score:4, Interesting)
That makes my next PC purchase easier... (Score:2)
With no PCI Express support, I can just skip anything from Intel, since I won't be able to use any decent video card in their rig.
Thanks, Intel, for throwing away any chance you had at selling stuff to the gaming market.
Wait... does this mean Intel is going to be the next big corporation screaming about piracy hurting their profits? I mean, obviously, if no one is buying their crap anymore, it's the fault of the pirates...
Re: (Score:2)
This applies only to one chip, and that chip's codename is right in the title.
Re: (Score:2)
Because Intel's cards don't do CUDA. And the 5450 is rather slow if you want to push a game to 3 monitors. Does the Intel card do 3 monitors? How about 6? How about decent h264 decoding in mainline mplayer? VLC? xine?
Re: (Score:2)
Even the HD4650 blows the doors off of the HD5450. That HD5450 ranks right up there with 2 year old mid range (at the time) graphics cards.
That the Intel integrated solution is a "tiny bit slower" than something no serious gamer would even consider buying today. The Far Cry 2 benchmark for the HD5450 puts it at an average 20 FPS on Medium settings, Low Shadows, No AA. Simply horrible. It's probably fine if you want to play
Human-readable analysis of the stuff (Score:2)
Here is a SemiAccurate article on this, with human-readable analysis: http://www.semiaccurate.com/2010/08/04/intel-settles-ftc-and-nvidia-win-big/ [semiaccurate.com]
Re:Human-readable analysis of the stuff (Score:4, Interesting)
If Intel doesn't want a GPU on their platforms, it is trivial to abide by the letter of the law and still screw Nvidia
During the public comment period, I submitted a comment about this and the FTC actually responded:
http://www.ftc.gov/os/adjpro/d9341/101102intelletterbao.pdf [ftc.gov]
Semi-accurate is Fully-retarded (Score:3, Interesting)
Don't believe their bullshit. Two major flaws with their argument:
1) Nobody gives a shit about PCIe speed on the Atom. It is a low end platform, for netbooks. You are not putting discrete GPUs on it at all, never mind fast ones. You do not want that kind of battery drain, or cost, for that platform. Speed is really not relevant.
2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_perf_x16x16_vs_x16x8/6) even for extremely high end cards in multi-card setups. For that matter on the forums Kyle said that 4x (quarter speed) is still more than enough for cards at 1920x1200. The highest end discrete cards don't need it, you are fine.
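(Rough numbers for context, from the PCIe 2.0 spec rather than that article: 5 GT/s per lane with 8b/10b encoding works out to about 500 MB/s usable per lane per direction, so x16 is ~8 GB/s and x8 is still ~4 GB/s each way - which is why halving the link width barely shows up in game benchmarks.)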
Re: (Score:2)
CUDA or GPGPU stuff needs more bandwidth. As for the Atom, it makes a decent media center when used with a GT240, or even a GT220.
Re: (Score:2)
And neither of those things at all matters. So CUDA stuff needs more bandwidth? I can believe it (though I'd like to see evidence) but you don't go and run CUDA stuff on an Atom. Intel's desktop boards have plenty of bandwidth for PCIe. All the desktop chipsets and boards support full PCIe 2.0 16x on their slot. Their x58 and 5520 chipsets support multiple 16x slots. You can pack on the CUDA cards and have plenty of bandwidth, no problem. We've got a Supermicro system at work that uses an Intel chipset that
Re: (Score:2)
I wasn't saying a media center needed a pile of bus speed; simply, if I want Netflix I have to run Windows, which means x86, and low-powered and cheap means Atom (yes, I know about VIA, and they are lower power than Atom, but are more expensive, cap-ex wise).
So if I want my Atom to do 1080p high-bitrate High Profile Level 5 h264 @ 24fps I need either the mini-PCIe Broadcom card or an nVidia GPU... Maybe the hd4500 can do otherwise, but not the last time I looked, nor could the low-powered AMD GPUs. So no p
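(For reference, the nVidia decode path there is VDPAU; a typical MPlayer invocation is something like "mplayer -vo vdpau -vc ffh264vdpau file.mkv", assuming a VDPAU-capable driver. The exact command is my example, not the parent's setup.)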
Re: (Score:2)
Are you fucking kidding me?
OpenCL is made for something like an Atom. When you start talking about number crunching, serious numeric computation, an Atom along with a couple of GPUs makes a hell of a lot more sense than almost anything else. Especially when you are talking about thousands of these machines.
You seem to have an obsession about the Atom and its inadequacy for most tasks. Guess what? They sell reams and shitloads of Atom boards to the server market, and I know of several big ass rooms that
Re: (Score:2)
In response to your point 1, Atom processors are used for a lot more than netbooks these days. It is not uncommon to find them in all sorts of servers.
People buy Atom motherboards and use them for all kinds of uses. Hell, for most people an Atom is all they need for their day to day work.
Re: (Score:2)
An Atom with a RAID card makes a great SOHO NAS... with enough grunt to even handle a bit more than just NAS if it has to.
Re: (Score:2)
No RAID card needed. Just gigE.
We have 2 desktops and one HTPC storing most data on an Atom 330, just using its internal SATA and IDE connectors, and it never even hiccups. ZFS is a beautiful thing.
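(If anyone wants to replicate that, a mirrored ZFS pool on two internal drives is a one-liner, roughly "zpool create tank mirror ada0 ada1" - the pool and device names here are just examples, not the parent's actual setup.)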
I also have seen these computers deployed in helicopters and fixed wing craft, in remote (read: tent and tiny generator) applications, in cars, and other places you wouldn't want to put a screaming server into mainly for power consumption issues.
Re: (Score:2)
You are not putting discrete GPUs at all on it, never mind fast ones.
What do you think NVIDIA ION 2 is?
Re:Semi-accurate is Fully-retarded - obligatory (Score:2)
2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_perf_x16x16_vs_x16x8/6) even for extremely high end cards in multi-card setups. For that matter on the forums Kyle said that 4x (quarter speed) is still more than enough for cards at 1920x1200. The highest end discrete cards don't need it, you are fine.
8x ought to be enough for anybody...
this is why monopolies suck (Score:2)
Re: (Score:2)
I was thinking more like Intel would be likely to add PCIe to the thing by making a PCI-to-PCIe bridge chip, making everything traverse the PCI bus. That'd show 'em.
More like Light Peak only: no DVI, no USB, no VGA (Score:4, Insightful)
More like Light Peak only: no DVI, no USB, no VGA.
Also, Light Peak only works with Intel video, and if you want to use your USB keyboard or mouse you'll need a $30 cable or a hub that needs a wall wart, as Light Peak may not pass power.
Want Ethernet? $30 cable.
Want to use an ATI or nVidia video chip? You may need a piggyback cable to make it tie into the Light Peak network.
Re: (Score:2)
Also, Light Peak only works with Intel video, and if you want to use your USB keyboard or mouse you'll need a $30 cable or a hub that needs a wall wart, as Light Peak may not pass power.
Not really, Light Peak complements USB as they share the same physical port. You can still use your cheap USB cables at USB speeds. If you want "Light Peak" speeds then you need to use a more expensive cable - one with 4 fiber optic lines in addition to the 4 regular USB electrical lines. Passing power will work just like with USB.
Want Ethernet? $30 cable.
Possibly, but only in those rare situations where your device is so small that there is no room for a regular Ethernet port.
Want to use an ATI or nVidia video chip? You may need a piggyback cable to make it tie into the Light Peak network.
Unlikely, just look at DisplayPort. DisplayPort
Re: (Score:2)
The original Apple laptops with DisplayPort also did not route audio through DisplayPort. This is why there were no DisplayPort to HDMI converters available from Apple (other people offered them but they lacked audio). So there might be some hardware design limitations to think about - or it might just be software.
But the new Apple computers do support audio through DisplayPort - so they have obviously thought about the problem. And it is not some hack like what you see with your ATI card. So
Re: (Score:2)
Dude, it's more like if Apple went back to the parallel port and removed USB.
Re: (Score:3, Informative)
Please re-read, it's Intel, not IBM... and there's lots of useful info in the comments.
Re: (Score:3, Funny)
Two identical posts this close together? You must be a Slashdot editor.