Intel Upgrades Hardware

Core i5 and i3 CPUs With On-Chip GPUs Launched

MojoKid writes "Intel has officially launched its new Core i5 and Core i3 lineup of Arrandale and Clarkdale processors today, for mobile and desktop platforms respectively. Like Intel's recent Pine Trail platform for netbooks, the new Arrandale and Clarkdale processors combine an integrated DDR3 memory controller and a GPU on the same package as the main processor. Though it is not a monolithic die but a multi-chip module, the packaging still allows these primary functional blocks to coexist in a single chip footprint and socket. In addition, Intel has beefed up its graphics core, and the new Intel GMA HD integrated graphics engine appears to offer solid HD video performance and even a bit of light gaming capability."
  • by NoNickNameForMe ( 884862 ) on Monday January 04, 2010 @06:04AM (#30638880)
    That is not the only problem nowadays, even processors within a given family may or may not have specific features (VT, for example) disabled. You'd think that there is a conspiracy going on...
  • by IYagami ( 136831 ) on Monday January 04, 2010 @06:25AM (#30638968)

    "As a CPU technology, Clarkdale is excellent. I can't get over how the Core i5-661 kept nearly matching the Core 2 Quad Q9400 in things like video encoding and rendering with just two cores. We've known for a while how potent the Nehalem microarchitecture can be, but seeing a dual-core processor take on a quad-core from the immediately preceding generation is, as I said, pretty mind-blowing. Clarkdale's power consumption is admirably low at peak. The integrated graphics processor on Clarkdale has, to some extent, managed to exceed my rather low expectations."
    "For an HTPC there's simply none better than these new Clarkies. The on-package GPU keeps power consumption nice and low, enabling some pretty cool mini-ITX designs that we'll see this year. Then there's the feature holy grail: Dolby TrueHD and DTS-HD MA bitstreaming over HDMI. If you're serious about building an HTPC in 2010, you'll want one of Intel's new Core i3s or i5s."

    "From the balanced notebook perspective, Arrandale is awesome. Battery life doesn't improve, but performance goes up tremendously. The end result is better performance for hopefully the same power consumption. If you're stuck with an aging laptop it's worth the wait. If you can wait even longer we expect to see a second rev of Arrandale silicon towards the middle of the year with better power characteristics. Let's look at some other mobile markets, though.
    If what you're after is raw, unadulterated performance, there are still faster options.
    We are also missing something to replace the ultra-long battery life offered by the Core 2 Ultra Low Voltage (CULV) parts."

  • by 0100010001010011 ( 652467 ) on Monday January 04, 2010 @06:26AM (#30638978)

    Not sure about Intel, but Nvidia has VDPAU, which is very nice. Feature Set C even added MPEG-4 decoding and SD content upscaling, all on the GPU.

    Broadcom finally released Crystal HD drivers for Linux, which means if you have a mini PCI-E slot, you can get HD content.

    If you want to know what is available for which GPU/platform, keep an eye on what the XBMC guys are doing. They seem to be at the forefront of getting hardware acceleration working on different setups.

  • Re:Sockets and mobos (Score:4, Informative)

    by beelsebob ( 529313 ) on Monday January 04, 2010 @06:38AM (#30639006)

    The average consumer doesn't give a shit what socket their CPU is in either, so it's all okay.

  • Well, if you read the specs you will see that it has 12 execution units, which I'm guessing is Intel-speak for stream processors. Considering that a $30 ATI card has 320, I'm guessing that, like all Intel GPUs, it's gonna be of the uber-suck.

    About the only ones I saddle with piss-poor Intel GPUs anymore are the housewives, who at most are playing a browser game on Facebook. Everyone else gets an Nvidia or ATI onboard chip so that if they decide to do a little light* gaming, they can.

    * The new ATI onboard GPUs are surprisingly good at gaming. I personally was playing Bioshock and Swat 4 on my 780v until I could get time to order a 4650 discrete. While these games aren't cutting edge, the fact that an onboard could actually game blew my fricking mind! Compared to the horrible chips that Intel calls GPUs it was actually nice, and it had full hardware acceleration for the most popular formats out of the box. I was impressed, and unlike so many horror stories I had heard the ATI drivers were just as solid and stable as could be.

  • by cowbutt ( 21077 ) on Monday January 04, 2010 @07:06AM (#30639094) Journal
    Even worse than that, at least one model, the Core 2 Quad Q8300, both does and does not have VT depending on the sSPEC code: SLB5W doesn't, SLGUR does. Good luck trying to buy one of those online and being sure of what you're gonna get!
  • Not that different (Score:5, Informative)

    by Anonymous Coward on Monday January 04, 2010 @08:24AM (#30639436)

    Intel also has three lines that more or less directly correspond to AMD's: Core/Phenom (good), Pentium/Athlon (OK) and Celeron/Sempron (cheap), plus the server Xeon/Opteron line. The real pain is the number of different model numbers and numbering schemes. The secret decoder ring for Intel models is:

    A) Old three-number codes
    E.g. Pentium 965, Celeron 450, ...
    The first digit is the model; the second digit corresponds to the speed.
    These are usually old crap and should be avoided, though the Celeron 743 and Celeron 900 are fairly recent low-end chips that you can still buy.

    B) Letter plus four numbers codes, e.g. SU7300:
    * S = small form factor
    * U = ultra-low voltage (5-10W), L = low-voltage (17W), P = medium voltage (25W), T = desktop replacement (35W), E = Desktop (65W), Q = quad-core (65-130W), X = extreme edition
    * 7 = model line, tells you about amount of cache, VT capability etc. Scale goes from 1 (crap) to 9 (can't afford).
    * 3 = clock frequency, relative performance within the line. Scale from 0 to 9.
    * 00 = random features disabled or enabled; you have to look these up for specific details.

    C) New Core i3-XYZa codes
    Similar to scheme B, but with an added dash and even more confusing:
    * i3 = Line within Core brand, can be i3 (cheap, but better than Celeron or Pentium), i5 (decent) or i7 (high-end)
    * X = the actual model, tells you the amount of cache and number of cores, but only together with the processor line (i3-5xx is very different from i5-5xx)
    * Y = corresponds to clock speed, higher is better
    * Z = modifier, currently 0, 1 or 5 for specific features
    * a = type of processor: X = extreme, M = mobile, QM = quad-core mobile, LM = low-voltage mobile, UM = ultra-low-voltage mobile
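The naming schemes above are mechanical enough to sketch in code. A minimal, hypothetical Python decoder (the field meanings are taken straight from this comment; the function names and returned dicts are my own, not anything Intel publishes):

```python
def decode_mobile_code(code):
    """Decode a Core 2-era code like 'SU7300' per scheme B above."""
    voltage = {
        "U": "ultra-low voltage (5-10W)", "L": "low voltage (17W)",
        "P": "medium voltage (25W)", "T": "desktop replacement (35W)",
        "E": "desktop (65W)", "Q": "quad-core (65-130W)",
        "X": "extreme edition",
    }
    small = code.startswith("S")        # leading S = small form factor
    body = code[1:] if small else code
    return {
        "small_form_factor": small,
        "voltage_class": voltage.get(body[0], "unknown"),
        "model_line": int(body[1]),     # 1 (crap) .. 9 (can't afford)
        "relative_speed": int(body[2]), # 0 .. 9, within the line
        "feature_suffix": body[3:],     # toggled features; look up per model
    }

def decode_core_i_code(code):
    """Decode a first-gen Core code like 'i5-661' or 'i7-720QM' (scheme C)."""
    line, rest = code.split("-")
    digits = "".join(ch for ch in rest if ch.isdigit())
    suffix = rest[len(digits):]         # X, M, QM, LM, UM, or empty
    return {
        "line": line,                   # i3 / i5 / i7
        "model": int(digits[0]),        # only meaningful together with line
        "speed": int(digits[1]),        # higher is better
        "modifier": int(digits[2]) if len(digits) > 2 else None,
        "type": suffix or "desktop",
    }

print(decode_mobile_code("SU7300"))
print(decode_core_i_code("i5-661"))
```

Either way you still need a spec sheet for the trailing feature digits, which rather proves the point about needing a decoder ring.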

  • Article is terrible (Score:5, Informative)

    by sammydee ( 930754 ) <> on Monday January 04, 2010 @08:27AM (#30639448) Homepage

    The article is awful. There is only one game benchmark, and that is against an integrated AMD GPU that hardly anybody has heard of. There is also no way of telling from the article whether the integrated Intel graphics actually has HD video decode acceleration or not. The modern Core i5 chips are pretty capable of decoding 1080p content by themselves, without any GPU assistance.

    I think the article writer misunderstands how hardware video decode assist actually works. It isn't magically engaged when you play any HD movie in any media player (usually it has to be turned on in an option somewhere with a media player app that supports it) and it isn't a sliding scale of cpu usage. Modern decoding chips either decode EVERYTHING on the card, reducing cpu usage to 1% or 2%, or the app decodes EVERYTHING in software, resulting in fairly high cpu usage.

    I still have no idea if the new intel graphics chip actually offers any HD video acceleration at all. If it did, it would make it a nice choice for low power and HTPC solutions. If it doesn't, it's just another crappy integrated graphics card.

  • Re:Solid huh? (Score:3, Informative)

    by DoofusOfDeath ( 636671 ) on Monday January 04, 2010 @08:51AM (#30639592)

    You shouldn't be recommending Intel graphics for pretty much anything.

    I disagree. I've had a few laptops that were primarily used for programming. On those, the basic, built-in Intel graphics (GMA950 and X3100, IIRC) were just fine.

    In fact, they were even better than ATI or nVidia graphics for me: those computers were running Linux, and I could always count on the Intel drivers being available for the most up-to-date Linux kernels, whereas I couldn't make that assumption for the closed-source nVidia or ATI drivers.

  • by Rockoon ( 1252108 ) on Monday January 04, 2010 @09:36AM (#30639892)
    1080p is quite a bit less than the 2560x1600 that the poster was talking about. In consumer terms, it's comparing 2 megapixels to 4 megapixels.

    Also, last I checked, the largest PC gaming segment still runs at 1280x1024 (presumably on the commodity 5:4 aspect LCDs which stormed the market several years ago). Only 12% run at 1080p or higher resolution.

    The 512MB NVIDIA 8800GT is probably still the best bang-for-your-buck card on the market given the resolutions people are gaming at. The 8800GT handles every game you can throw at it just fine at 1280x1024.
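For scale, the pixel counts behind those resolution comparisons (my own arithmetic, not from the post):

```python
def megapixels(width, height):
    """Total pixel count of a display mode, in megapixels."""
    return width * height / 1e6

# 1280x1024 (5:4 LCD), 1080p, and the poster's 2560x1600
for name, (w, h) in {"1280x1024": (1280, 1024),
                     "1920x1080": (1920, 1080),
                     "2560x1600": (2560, 1600)}.items():
    print(f"{name}: {megapixels(w, h):.1f} MP")
```

So 2560x1600 really is about double 1080p's pixel count, and roughly triple the 1280x1024 most people were gaming at.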
  • by Joce640k ( 829181 ) on Monday January 04, 2010 @09:41AM (#30639934) Homepage

    Monopolies are only illegal when you abuse them.

  • by sajjen ( 913089 ) on Monday January 04, 2010 @10:01AM (#30640084)
    After looking closer, the board is only fanless in the pictures. The box contains a 60mm fan for the CPU.
  • by W2k ( 540424 ) <{moc.liamg} {ta} {suilesnevs.mlehliw}> on Monday January 04, 2010 @10:52AM (#30640588) Homepage Journal

    Of course. Every PC hardware site worth a penny does regular articles on which CPU is currently the fastest and which will give you the most for your money. As well as comparisons between Intel/AMD. My favorite site for such things is Tom's Hardware, though Google will likely find you many more.

    Which CPU is actually fastest heavily depends on what you will be using it for. Your list of "regular geek activities" does not narrow it down enough. Also, many applications contain optimizations that target a particular CPU family or architecture.

    CPU articles:,1/CPU,1/ []

    Best (gaming) CPU for the money as of dec 09:,review-31755.html []

    All CPU performance charts:,6.html []

  • by nxtw ( 866177 ) on Monday January 04, 2010 @11:27AM (#30641106)

    I think the article writer misunderstands how hardware video decode assist actually works. It isn't magically engaged when you play any HD movie in any media player (usually it has to be turned on in an option somewhere with a media player app that supports it)

    DXVA acceleration works automatically with Windows 7 and any application using the proper built-in decoder and EVR renderer. It should also work with Media Player Classic Home Cinema, if the default renderer is compatible.

    I still have no idea if the new intel graphics chip actually offers any HD video acceleration at all.

    It does. The G45/GM45 chipsets released in 2008 also have full decoding.

  • Re:What the hell... (Score:1, Informative)

    by Anonymous Coward on Monday January 04, 2010 @12:52PM (#30642480)

    I have an i7 laptop (so basically an i5) with hyperthreading, and indeed it shows up as 8 threads. However, I assume the original poster did the same thing I did, saw 8 threads, went "WTF," then, "Oh, right, Intel brought back hyperthreading. I have 4 cores x 2 threads per core."

  • by Hadlock ( 143607 ) on Monday January 04, 2010 @02:08PM (#30643532) Homepage Journal

    The largest segment is technically 1280x1024 @ 21.2%, typically the highest resolution available on consumer CRTs. The next largest is 1680x1050 @ 19.98%, which is most definitely an LCD resolution. Technically it's not 1080p, but it's damn close for most applications. If you include all the resolutions from 1680x1050 up to 1920x1200, HD or "damn close to HD" makes up more than a third (36.69%) of the displays in use. The 8800 is no doubt a stellar card (I wish I'd bought one two years ago, instead of a "hold me over" 8600 until the next gen was released), but with a modern display the 8800 is mediocre at best for most people's gaming use.

  • Re:Solid huh? (Score:3, Informative)

    by Randle_Revar ( 229304 ) <> on Monday January 04, 2010 @02:20PM (#30643732) Homepage Journal

    No, Intel is great. They have the best drivers right now (AMD's OSS drivers are catching up, though), and as long as you don't play a lot of Crysis under Wine, they are plenty powerful.
