
Despite FTC Settlement, Intel Can Ship Oak Trail Without PCIe

MojoKid writes "When the Federal Trade Commission settled their investigation of Intel, one of the stipulations of the agreement was that Intel would continue to support the PCI Express standard for the next six years. Intel agreed to all the FTC's demands, but Intel's upcoming Oak Trail Atom platform presented something of a conundrum. Oak Trail was finalized long before the FTC and Intel began negotiating, which means Santa Clara could've been banned from shipping the platform. However, the FTC and Intel have recently jointly announced an agreement covering Oak Trail that allows Intel to sell the platform without adding PCIe support — for now."
  • by robot256 ( 1635039 ) on Friday November 05, 2010 @09:53PM (#34144856)

    I'm still confused by what the actual issue is here. And I did RTFA.

    Something about Intel pushing a new proprietary graphics bus into a new chipset...they never actually mentioned how the FTC thing got started.

    • by Kenja ( 541830 )
      At a guess, the FTC is the enforcement body for the ISO standard that Intel agreed to follow which includes PCIe.
    • by cappp ( 1822388 ) on Friday November 05, 2010 @10:18PM (#34144974)
      Okay, as far as I can tell:

      The FTC sued Intel, alleging that it had violated Section 5 of the FTC Act.

      A little more digging brings us to this [computerworld.co.nz]:

      The FTC filed its complaints against Intel on Dec. 16, 2009. It charged the chip maker with illegally using its dominant position to stifle competition for decades. The complaint was filed just a month after Intel had settled antitrust and patent disputes with Advanced Micro Devices for US$1.25 billion.

      The FTC site adds that [ftc.gov]

      ").(1) Section 5 of the FTC Act prohibits "unfair methods of competition," and was amended in 1938 also to prohibit "unfair or deceptive acts or practices.

      It seems to have been part of a broader move against Intel at the time (I admit I don't remember it very clearly), but Reuters adds [reuters.com]:

      A wide range of antitrust enforcers have gone up against Intel for its controversial pricing incentives. New York Attorney General Andrew Cuomo accused Intel in November of threatening computer makers and paying billions of dollars of kickbacks to maintain market supremacy. The European Commission has fined Intel 1.06 billion euros ($1.44 billion) for illegally shutting out AMD. In June 2008, South Korea fined Intel some $26 million, finding it offered rebates to PC makers in return for not buying microprocessors made by AMD. Japan's trade commission concluded in 2005 that Intel had violated the country's anti-monopoly act. The case before the FTC is "In the Matter of Intel Corporation," docket number 9341.

      Oh and that case can be found here [ftc.gov]

      • Re: (Score:2, Interesting)

        Instead of telling Intel how to make their product, I consider it much better to confiscate the relevant patents and copyrights and put them into the public domain. That way AMD, Nvidia, etc. will have all the access they need. They use asset forfeiture on us all the time; time to use it here. Fair is fair.

      • How do we get from anti-compete against AMD to 6 years of support for PCIe?

    • Re: (Score:1, Offtopic)

      Even more confusing is that Google is not mentioned.

      And this is a Slashdot article. There *has* to be a Google angle somewhere.

    • by pavon ( 30274 ) on Friday November 05, 2010 @10:35PM (#34145032)

      Here is a good article [arstechnica.com] about the original antitrust settlement.

      Basically, Intel refuses to license its new DMI and QPI bus protocols to NVIDIA, so they can no longer make chipsets for Intel processors (like nForce). Furthermore, it has been feared that, with the push towards systems on chip, Intel would eliminate the PCIe bus as well, leaving no way for any graphics company to supply a discrete graphics chip for netbook or notebook computers.

      • Re: (Score:3, Interesting)

        by bhcompy ( 1877290 )
        You mean leaving no way for nVidia to do it. ATI has always had solid mobility offerings and AMD owning them just means that nV is left out in the cold if Intel goes down the course you say.
        • Re: (Score:1, Interesting)

          by Anonymous Coward

          AMD also has HyperTransport. Maybe this was why there were rumours about Nvidia making a CPU.

          If Intel & AMD decided to offer GPUs linked by QPI & HT it would give their GPUs a big advantage with Nvidia unable to compete.

          I think non-portable computers will end up a lot more modular in this way. Memory, CPUs, GPUs, Northbridge all connected to each other on a future generation of a switched HT/QPI bus. It would make the computers much more scalable, futureproof, adaptable and efficient. It might also

          • by 0123456 ( 636235 ) on Friday November 05, 2010 @11:38PM (#34145300)

            If Intel & AMD decided to offer GPUs linked by QPI & HT it would give their GPUs a big advantage with Nvidia unable to compete.

            That would also kill Intel's high-end consumer products. Most high-end Intel CPUs are sold to gamers, who aren't going to be gaming on some crappy Intel integrated graphics chip.

            At least for the foreseeable future, Intel needs Nvidia for the mid to high-end gaming market, because they're not going to be releasing GPUs in that arena any time soon.

            • Re: (Score:2, Insightful)

              by Starrider ( 73590 )

              High-end graphics and discrete cards are making up a smaller and smaller percentage of the market. It is quickly getting to the point that the only people who are buying discrete GPUs are gamers and graphics professionals. Most people just don't see the need for the added expense.

              The "mid to high-end gaming market" is fairly small on the PC, relative to the entire PC market.

            • Re: (Score:3, Insightful)

              by Dynedain ( 141758 )

              Intel's high-end consumer products aren't where they make their money... and enough of them make it into prebuilt machines from Dell, HP, etc. anyway.

              Most high-end Intel CPUs are sold as server solutions, where a graphics card makes very little difference.

              • Yes, and RAID / SAS / GB NIC cards certainly don't use PCIe, right?
                • Did I say that they didn't? No... I said that high-end consumer gaming machines are not where Intel is making its money.

                  And those cards don't have to use PCIe; there are other ways of getting the controllers in. Gigabit network cards, for example, already offload a lot onto the processor itself and are usually built into the chipset.

            • So basically, if you're on a budget, you'd do AMD. And if you're looking to make a decent gaming rig, you'd do AMD. And if you were looking to do high end, you'd do Intel, except with no PCIe, no, you wouldn't -- you'd do AMD.

              Who in the hell is gonna buy a Core i7 laptop and stick with integrated graphics, again?
            • That would also kill Intel's high-end consumer products. Most high-end Intel CPUs are sold to gamers, who aren't going to be gaming on some crappy Intel integrated graphics chip.

              Can you quantify the gamer CPU market sales vs. the chipset sales currently held by nVidia?

        • I like and buy AMD CPUs, but I've always preferred nVidia for graphics cards, mostly because I run Linux, for which ATI/AMD cards have notoriously poor support compared to nVidia. Vendor lock-in, no matter the company, is a terrible thing. I and everyone else should be able to get GPUs independent of CPUs, or any other hardware for that matter.
          • by sznupi ( 719324 )

            I and everyone else should be able to get GPUs independent of CPUs, or any other hardware for that matter

            How about FPU?

      • Re: (Score:3, Insightful)

        by dgatwood ( 11270 )

        Furthermore, it has been feared that, with the push towards systems on chip, Intel would eliminate the PCIe bus as well, leaving no way for any graphics company to supply a discrete graphics chip for netbook or notebook computers.

        If they did that, every manufacturer of even moderately high-end laptops would drop their CPUs faster than an LSD addict drops acid.

        Even if Intel's GPUs were the best in the industry, there are too many other critical things you wouldn't be able to properly support without PCIe-

        • by Sycraft-fu ( 314770 ) on Saturday November 06, 2010 @04:11AM (#34145990)

          Intel doesn't want nVidia making chipsets, true enough, because Intel makes chipsets. However, they want expansion slots on their boards because they want people using their boards. I'm quite sure they are plenty happy with nVidia and ATI graphics cards. Heck, they've included ATI's CrossFire on their boards for a long time (they didn't have SLI because nVidia wouldn't license it to them). Intel has nothing that competes in that arena, and they recently revised their plans so they aren't even going to try. They want people to get those high-end GPUs, because people who get high-end GPUs often get high-end CPUs, since they are gamers. Not only that, they STILL sell the integrated GPU, since it is on-chip.

          I just can't see them not wanting PCIe in their regular desktop boards. They know expansion is popular, and they also know that the people who expand the most also want the biggest CPUs.

          Now on an Atom platform? Sure, it makes sense. These are extremely low-end systems. PCIe logic is really nothing but wasted silicon. You don't have room for PCIe expansion in there, never mind the desire for it. Those are integrated, all-in-one, low-end platforms.

          However desktop and laptop? I can't see them wanting to eliminate it there.

        • Depends on the market. How many laptop users actually care about ExpressCard? I've had one on this machine for 4 years, but never plugged anything in to it. FireWire and SATA controllers are small enough that they could be on die on a low-power SoC. Things like USB and Ethernet / 802.11 are already commonly provided on die by ARM SoCs, so I'd imagine that they would be with most Intel single-chip solutions too.
        • "LSD addict drops acid"

          LSD is not addictive.

        • by sznupi ( 719324 )

          All those things you mention don't really require PCIe - if they are provided by the Intel chipset.

    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Friday November 05, 2010 @11:13PM (#34145194)
      Comment removed based on user account deletion
      • by 0123456 ( 636235 )

        Just look at how many power-hogging P4s are still in use, thanks partially to the fact that Intel paid off OEMs not to run the AMD chips that were better at the time.

        Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions. When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.

        It was only in the final P4 space-heater era that choosing AMD became a no-brainer.

        • You really do have to consider that the performance per watt in the era since the P4 has been stellar from Intel, while AMD hasn't quite been up to par in that department. The roles really have reversed in the past few years in regards to wattage, with Intel also keeping the raw performance crown on the high end.
          • by znerk ( 1162519 )

            You really do have to consider that the performance per watt in the era since the P4 has been stellar from Intel, while AMD hasn't quite been up to par in that department. The roles really have reversed in the past few years in regards to wattage, with Intel also keeping the raw performance crown on the high end.

            On the other hand, for the price of the highest-end Intel chip [newegg.com] (and a motherboard [newegg.com] to run it on) (also note: board and chip ONLY; no OS, no drives, no case, no nothing), I can practically build two high-end AMD systems [newegg.com] (If Newegg will sell me a pre-built system for just under a grand, I'm willing to bet I can build it myself for $800 or less - especially without the MSFT tax).

            • Comment removed based on user account deletion
              • by sznupi ( 719324 )

                Now that AMD is about to integrate decent GFX...one can see why Nvidia wants to focus primarily on the "pro" market.

                Last two performance-starved areas, games and video editing/encoding, should quickly become mostly covered even by entry CPUs...

            • Re: (Score:3, Insightful)

              by Sycraft-fu ( 314770 )

              That's not a good comparison. Intel has an obscenely high priced chip. Fine, they always have for those people who have more money than sense. They also have reasonably priced chips. Try instead looking at, say, a Core i5-760. 2.8GHz quad core chip for $210. Look up some performance numbers and then compare to AMD chips. It isn't very favorable. More or less they need their 6 core chips to compete, and then it is only competitive if you happen to have an app that can use all 6 cores (which is very rare stil

              • by cynyr ( 703126 )

                Sure, if you are the type to run only one app at a time.

                My list:
                1) emerge
                2) ffmpeg
                3) Wow.exe
                4) Chrome while on careerbuilder.com (something is dumb there and loops using 10% CPU)
                ffmpeg can use 2-4 cores, emerge can as well, WoW uses 2, and Chrome could use a bunch as well. Hmm, just about everything I run is threaded; 6-12 cores sure sounds nice.

              • by Rockoon ( 1252108 ) on Saturday November 06, 2010 @08:40AM (#34146808)

                Try instead looking at, say, a Core i5-760. 2.8GHz quad core chip for $210. Look up some performance numbers and then compare to AMD chips.

                Performance numbers based on Intel's crippling compiler.

                Yeah. Even in cases where Intel's compiler isn't used for the benchmark program, many benchmarks still use libraries compiled with Intel's compiler.

                Of significance are Intel's Math Kernel Library and even AMD's Core Math Library (compiled with Intel's Fortran compiler!).

                These libraries are extensively used in most benchmark programs.

                • See, here's the problem: even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot. Any time you see compiler benchmarks it is consistently the best thing on the market. Intel has a really, really good compiler team. So if that is a problem for AMD, well then they should be writing their own compiler. Like ICC, they should make it plug in to Visual Studio so that VS developers can use it as a drop-in replacement to speed up

                  • Even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot.

                    This isn't about not putting effort into optimizing for non-Intel. This is about intentionally putting effort into sabotaging non-Intel performance. They have been convicted of this act, and so far have not honored the court's ruling on the matter.

                    Remember when Microsoft intentionally sabotaged competing DOS clones? Yeah.

                    They only care how it does in the apps they actually use. So if all their apps are ICC apps and they run better on Intel processors, well then that's all that matters.

                    Who claimed that all their apps are ICC-based? The claim is that most BENCHMARKS use ICC-generated code at some point, and this claim is DEMONSTRABLY true (simply changing the CPUID Vendo
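
For the curious, the check at issue can be sketched in a few lines of C. This is an illustrative mock-up, not Intel's actual dispatcher code: it keys the fast path off the CPUID vendor string ("GenuineIntel") instead of the SSE feature bits that actually say what the CPU supports.

    /* Illustrative sketch only -- a CPU dispatcher keyed off the CPUID
     * vendor string rather than the CPU's real feature flags. */
    #include <cpuid.h>   /* GCC/Clang helper for the CPUID instruction */
    #include <stdio.h>
    #include <string.h>

    static int vendor_is_intel(void)
    {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 0;
        /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX. */
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
        return strcmp(vendor, "GenuineIntel") == 0;
    }

    int main(void)
    {
        /* A fair dispatcher would test the SSE2/SSE3 feature bits from
         * CPUID leaf 1 (EDX/ECX). A vendor-keyed one does this instead: */
        if (vendor_is_intel())
            puts("fast SSE code path");
        else
            puts("generic fallback");   /* taken even if SSE is present */
        return 0;
    }

On CPUs that let you override the reported vendor string, flipping it is enough to change which path such a check takes, with no change in actual hardware capability -- which is the sort of demonstration the parent is referring to.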

                • by Bert64 ( 520050 )

                  Try doing some benchmarks on a pure Linux system compiled with GCC...
                  GCC has no reason to favor any individual processor maker.

              • by znerk ( 1162519 )

                My "contrived situation" was simply to go to newegg.com, and look up the highest rated intel chip, then get the highest rated motherboard for that chip, then compare that price to a good gaming system sold as a single, pre-built unit.

                The funny thing is, you managed to rant about how I used a six-core chip against the Intel four-core, while completely ignoring that for the money, I would actually get twelve cores with the AMD-based solution. Unless you can argue that it takes 3 AMD cores to equal the perform

            • Why are you thinking that you need a $290 motherboard again? AMD may be better bang for your buck but you have only yourself to blame if you want a budget system and then sink that much into a Mobo.

              Not to mention that Intel's top of the line chip really does thrash AMD's...
              • by znerk ( 1162519 )

                Why are you thinking that you need a $290 motherboard again? AMD may be better bang for your buck but you have only yourself to blame if you want a budget system and then sink that much into a Mobo.

                See my reply further back in the same thread [slashdot.org]. The discussion was about Intel chips being more expensive without delivering enough additional power to justify the additional expense. Indeed, with the number of AMD chips you could buy for the price of a single Intel chip, you could outperform it on any multi-threaded workload; if you're building a cluster, AMD wins hands down.

                I'm not trying to tweak your nose, here, but I just don't see any reason to buy Intel anymore - they might be the king of

                • Intel isn't dropping PCIe support except on the Atom chip line. I don't give a shit what they do to Atom's PCIe bus, since I'm not going to put a GTX480 or any other expansion card in one of those netbook systems anyway.
              • by znerk ( 1162519 )

                The $300 motherboard wasn't really the kick in the pants... the big deal there was that the Intel chip alone costs more than the entire AMD system.

                By the way, why didn't you manage to read my comment before posting your own? I posted this [slashdot.org] more than an hour and a half before you asked about the motherboard...

        • Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions.

          Clock for clock, the Athlons beat the shit out of the P4. Only by getting a fast, and thus very power-hungry and expensive, processor could you build a faster machine with a P4. Does that mean they were "better"? Also, at the time floating point had just become massively important to gaming, since we were fully into 3D land. FP math was one of the most important differentiators, and competition over SPECfp benchmarks was intense.

          When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.

          Only if you bought your AMD from someone whose pricing was designed to remove their

      • Re: (Score:3, Funny)

        by monktus ( 742861 )

        "...they make MSFT look like the Care Bears."

        Can't....type.....horrible image of Ballmer....in Care Bear outfit.

      • From 2000-2005 I bought a few AMD systems because they were a bit cheaper, but I also had quite a few CPUs fail, and even one melt despite a heatsink, a fan, two case fans, plus another PCI-slot fan. Maybe it was just my luck of the draw, but since 2005 everything I've bought except my PowerMac G5 tower has had Intel CPUs. And I haven't had any problems with the Intel CPUs.

      • by Kjella ( 173770 )

        Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.

        They did, but the DMI license is mostly a diversion. The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia managed to claw their way back into the motherboard game there's nothing for them there, since graphics used to be the main differentiator. By turning it into a license/contract issue it seems a lot cleaner than "oh, you can still produce boards but we moved the essential functionality into the CPU". Though honestly AMD has been talking about the m

        • The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia managed to claw their way back into the motherboard game there's nothing for them there, since graphics used to be the main differentiator.

          They still have shit graphics, so there is still a need for fancier graphics for laptops. If I was buying a laptop for my girlfriend I would have used Intel integrated video before, and I would still use it. If I am buying one for me, then I wouldn't have used it before, and I won't use it now. Nothing has changed except where the hardware I don't want lives. Before, it was soldered to the motherboard where I couldn't (reasonably) remove it. Now it's built into the CPU and I still can't remove it.

          Nothing has changed

          • Nothing has changed.
            What has changed is that with Core 2 stuff you had the option of an nVidia chipset with integrated graphics that was better than Intel's integrated graphics while being physically smaller and drawing less power than a discrete solution with its own memory.

            With current-gen Intel stuff that option is gone (though admittedly, from a user's point of view, the fact that Intel integrated graphics are better than they used to be somewhat makes up for it).

            Afaict Nvidia was afraid that Intel wo

      • Basically, no.

        Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.

        Do you understand what this chip is? It's a system on a chip. The whole point is a small, integrated, specialized, low-power chip for things like tablets. There's absolutely no point in allowing for an NVIDIA chip on it because 1) the integrated graphics are ALL you need, 2) if you added another GPU chip you would hurt power consumption and increase overall costs, and 3) why the hell increase the complexity of the chip to support something that is fundamentally contrary to the design goals

    • Re: (Score:3, Insightful)

      by arivanov ( 12034 )

      The issue is that Atom, except on Intel STB boards with their media-acceleration processor, is deliberately crippled in terms of video performance. As a result the entire system ends up being crippled wholesale, giving consumers the perception that the computer is slow when it really isn't.

      Nvidia has demonstrated this - when paired with decent video chippery, Atom makes a perfectly adequate desktop or notebook. As a result Intel has gone as far as damaging its own media STB roadmap to lock Nvidia out so that Atom does

      • I think this was in the pipeline before the new FTC rule came out

      • In which case Intel and Microsoft are simply giving a huge market opportunity to Apple and any company that decides to build a decent platform on Android using non-Intel hardware (ARM, etc.).

        Sounds like a loss for Intel/MS and a win for the consumer.

        • by arivanov ( 12034 )

          Intel has been bitten by its non-cannibalisation strategies in the past.

          One of the main reasons for the Athlon to have its time in the sun was not its performance. It was _NOT_ that much better initially. It was Intel deliberately limiting and crippling various product lines on non-cannibalisation grounds. The i810, i840 and, most importantly, the crippling of the i815e, which had 2GB of memory addressing capacity by design but a strict 512MB marketing requirement (sounds familiar, doesn't it?), had more to do with it. As a result Ath

  • by gman003 ( 1693318 ) on Friday November 05, 2010 @09:59PM (#34144880)
    I went and looked up the specs for the chip in question. It's an SoC; just a PCI bus is all I could find. There's no market reason for PCIe, and it really wouldn't even offer much of a benefit, since the single-core CPU is barely pushing a gigahertz. The FTC behaved pretty much reasonably in this case.
    • The FTC behaved pretty much reasonably in this case.

      I'm pretty sure that's one of the signs of the apocalypse.

    • Re: (Score:2, Offtopic)

      by jgreco ( 1542031 )

      Maybe they can just glue on dummy PCIe slots, kind of like the Chinese used to hand-paint barcodes on boxes.

      Dang, I'm no good with Google today. I can't find the reference. Years ago, when barcodes were just starting to become popular on boxes/cases used for shipping, I recall a story where some American company had specified that their Chinese supplier had to begin bar-coding boxes of goods sent to the US to make warehousing here easier, and proceeded to have fits when none of the barcodes scanned. They

    • Re: (Score:3, Interesting)

      by sarkeizen ( 106737 )
      I don't even see the relevance of the Atom platform anymore. It used to be about power efficiency, and they really got there with the Z series maxing out at 2.4W. This was, of course, at the expense of processing power, addressable memory and such. However, then came the SU7300, which maxes out at 10W and doesn't have the limitations of the Atom. I get that there are some power savings in there with all the integration Intel is planning, but I'm skeptical how much that bears out. My wife was r
    • by yuhong ( 1378501 )

      Off-topic, but FYI, I think Oak Trail is basically a PC-compatible chipset for Moorestown (the other chipset was not PC-compatible). It includes all the legacy stuff that is needed to maintain compatibility back to the original IBM PC of 1981, plus extensions such as ACPI, so most normal x86 OSes will run.

      • by yuhong ( 1378501 )

        Oak Trail is basically a PC-compatible chipset for Moorestown

        Oops, replace Moorestown here with Lincroft.

    • by yuhong ( 1378501 )

      It's a SoC chip

      To be more precise, Lincroft is the SoC chip, Whitney Point is the codename of the PC-compatible chipset, and Oak Trail is the codename for the entire platform.

      • Reminds me of drug naming, where the same drug might be sold under several different names depending on intended purpose and manufacturer, in addition to a generic name or two and a scientific name.
    • by squizzar ( 1031726 ) on Saturday November 06, 2010 @07:05AM (#34146464)
      I thought Intel wanted to use Atom to break into the embedded market, which contains a lot of ARM and PowerPC cores? The FPGA + embedded processor combination is pretty common, and PCIe is the way to interface them. Hence your low-power/low-performance chip is bundled together with another chip (FPGA or ASIC) that does the heavy lifting for a specific task. Every application that requires some serious, but fixed, number crunching is appropriate for this. I do broadcast-related stuff, so the things that spring to mind are video compressors, deinterlacers, etc. Why spend lots of dollars and lots of watts on a powerful CPU when you can combine a small core and an ASIC/FPGA and get the same result? Without PCIe no one is going to consider the Atom for these applications.
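
To make the interfacing point concrete: on Linux, host code can drive a PCIe-attached FPGA or ASIC simply by memory-mapping one of the device's BARs through sysfs. The sketch below is hedged: the PCI address and the register layout are hypothetical stand-ins for whatever a real FPGA design exposes.

    /* Minimal sketch: mapping BAR0 of a hypothetical PCIe FPGA from
     * userspace on Linux. Device address and registers are made up. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        /* sysfs resource file for a hypothetical FPGA at 01:00.0 */
        const char *bar = "/sys/bus/pci/devices/0000:01:00.0/resource0";
        int fd = open(bar, O_RDWR | O_SYNC);
        if (fd < 0) { perror("open"); return 1; }

        /* Map 4 KiB of BAR0; the device's control registers live here. */
        volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                       MAP_SHARED, fd, 0);
        if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

        regs[0] = 1;                          /* hypothetical "go" register */
        printf("status: 0x%08x\n", regs[1]);  /* hypothetical status word */

        munmap((void *)regs, 4096);
        close(fd);
        return 0;
    }

The host CPU does nothing heavy here, which is exactly the division of labor described above: the small core sequences the job, and the FPGA/ASIC does the number crunching over PCIe.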
  • With no PCI Express support, I can just skip anything from Intel, since I won't be able to use any decent video card in their rig.

    Thanks, Intel, for throwing away any chance you had at selling stuff to the gaming market.

    Wait... does this mean Intel is going to be the next big corporation screaming about piracy hurting their profits? I mean, obviously, if no one is buying their crap anymore, it's the fault of the pirates...

    • by yuhong ( 1378501 )

      This applies only to one chip, and that chip's codename is right in the title.

  • Here is a semiaccurate article on this, with human-readable analysis: http://www.semiaccurate.com/2010/08/04/intel-settles-ftc-and-nvidia-win-big/ [semiaccurate.com]

    Secondly, Intel doesn't need to be bastards; they can just continue with the bog-standard half-speed PCIe 2.0 link that they have on their Atoms. This doesn't provide enough bandwidth to run a retired analog cigarette vending machine, let alone a modern GPU. If Intel doesn't want a GPU on their platforms, it is trivial to abide by the letter of the law and still s

    • by yuhong ( 1378501 ) <yuhongbao_386@@@hotmail...com> on Saturday November 06, 2010 @03:25AM (#34145926) Homepage

      If Intel doesn't want a GPU on their platforms, it is trivial to abide by the letter of the law and still screw Nvidia

      During the public comment period, I submitted a comment about this and the FTC actually responded:
      http://www.ftc.gov/os/adjpro/d9341/101102intelletterbao.pdf [ftc.gov]

    • Don't believe their bullshit. Two major flaws with their argument:

      1) Nobody gives a shit about PCIe speed on the Atom. It is a low-end platform, for netbooks. You are not putting discrete GPUs on it at all, never mind fast ones. You do not want that kind of battery drain, or cost, on that platform. Speed is really not relevant.

      2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_pe

      • by cynyr ( 703126 )

        CUDA or GPGPU stuff needs more bandwidth. As for the Atom, it makes a decent media center when used with a GT 240, or even a GT 220.

        • And neither of those things matters at all. So CUDA stuff needs more bandwidth? I can believe it (though I'd like to see evidence), but you don't go and run CUDA stuff on an Atom. Intel's desktop boards have plenty of bandwidth for PCIe. All the desktop chipsets and boards support full PCIe 2.0 16x on their slot. Their X58 and 5520 chipsets support multiple 16x slots. You can pack on the CUDA cards and have plenty of bandwidth, no problem. We've got a Supermicro system at work that uses an Intel chipset that

          • by cynyr ( 703126 )

            I wasn't saying a media center needed a pile of bus speed; simply that if I want Netflix I have to run Windows, which means x86, and low-powered and cheap means Atom (yes, I know about VIA, and they are lower power than Atom, but they are more expensive, cap-ex wise).

            So if I want my Atom to do 1080p high-bitrate High Profile Level 5 H.264 @ 24fps, I need either the mini-PCIe Broadcom card or an Nvidia GPU... Maybe the hd4500 can do it now, but not the last time I looked, nor could the low-powered AMD GPUs. So no p

          • Are you fucking kidding me?

            OpenCL is made for something like an Atom. When you start talking about number crunching, serious numeric computation, an Atom along with a couple of GPUs makes a hell of a lot more sense than almost anything else, especially when you are talking about thousands of these machines.

            You seem to have an obsession with the Atom and its inadequacy for most tasks. Guess what? They sell reams and shitloads of Atom boards to the server market, and I know of several big-ass rooms that
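
The division of labor behind that argument is visible in ordinary OpenCL host code: the CPU merely enumerates devices and queues work, while the GPUs do the crunching, so a modest Atom host can suffice. A minimal sketch in C against the standard OpenCL host API, with error handling mostly trimmed:

    /* Minimal OpenCL host-side sketch: list the GPUs that would do the
     * heavy lifting. The host CPU's role is just orchestration. */
    #include <CL/cl.h>
    #include <stdio.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_uint nplat = 0;
        if (clGetPlatformIDs(1, &platform, &nplat) != CL_SUCCESS || nplat == 0) {
            fprintf(stderr, "no OpenCL platform found\n");
            return 1;
        }

        cl_device_id devices[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &ndev);

        for (cl_uint i = 0; i < ndev; i++) {
            char name[256];
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("GPU %u: %s\n", i, name);
        }
        return 0;
    }

From here the host would build a kernel and enqueue it with clEnqueueNDRangeKernel; none of those steps needs a fast host CPU, only fast GPUs to run the kernels themselves.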

      • In response to your point 1, Atom processors are used for a lot more than netbooks these days. It is not uncommon to find them in all sorts of servers.

        People buy Atom motherboards and use them for all kinds of uses. Hell, for most people an Atom is all they need for their day-to-day work.

        • by cynyr ( 703126 )

          An Atom with a RAID card makes a great SOHO NAS... with enough grunt to even handle a bit more than just NAS if it has to.

          • No RAID card needed. Just gigE.

            We have 2 desktops and one HTPC storing most data on an Atom 330, just using its internal SATA and IDE connectors, and it never even hiccups. ZFS is a beautiful thing.

            I also have seen these computers deployed in helicopters and fixed-wing craft, in remote (read: tent and tiny generator) applications, in cars, and other places where you wouldn't want to put a screaming server, mainly for power consumption reasons.

      • by yuhong ( 1378501 )

        You are not putting discrete GPUs at all on it, never mind fast ones.

        What do you think NVIDIA ION 2 is?

      • 2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_perf_x16x16_vs_x16x8/6) even for extremely high-end cards in multi-card setups. For that matter, on the forums Kyle said that 4x (quarter speed) is still more than enough for cards at 1920x1200. The highest-end discrete cards don't need it; you are fine.

        8x ought to be enough for anybody...
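
The arithmetic behind those claims is easy to check. Assuming PCIe 2.0 signaling (5 GT/s per lane with 8b/10b encoding, i.e. 20% line overhead), each lane carries about 0.5 GB/s of payload per direction; the short C program below just multiplies that out per link width:

    /* Back-of-envelope PCIe 2.0 payload bandwidth per link width.
     * Pure arithmetic; assumes 5 GT/s per lane and 8b/10b encoding. */
    #include <stdio.h>

    int main(void)
    {
        const double gtransfers = 5.0;        /* PCIe 2.0: 5 GT/s per lane */
        const double encoding   = 8.0 / 10.0; /* 8b/10b: 8 payload bits per 10 */
        const int widths[] = {1, 4, 8, 16};

        for (int i = 0; i < 4; i++) {
            /* GT/s * 8/10 = payload Gbit/s; divide by 8 for GB/s. */
            double gbps = gtransfers * encoding * widths[i] / 8.0;
            printf("x%-2d link: %.1f GB/s each way\n", widths[i], gbps);
        }
        return 0;
    }

That gives roughly 4 GB/s for x8 and 8 GB/s for x16 each way, which is why halving the link width barely registers in the HardOCP numbers cited above.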

  • Imagine a world without AMD, Cyrix, Nvidia or other chip manufacturers. There would be no market or competition to face Intel, and the company could force you to run whatever they wanted. I mean, a lot like it is now, but more so. As a consumer, figure out how to support the competition equally, or there won't be any.
