Core i5 and i3 CPUs With On-Chip GPUs Launched
MojoKid writes "Intel has officially launched their new Core i5 and Core i3 lineup of Arrandale and Clarkdale processors today, for mobile and desktop platforms respectively. Like Intel's recent release of the Pinetrail platform for netbooks, the new Arrandale and Clarkdale processors combine an integrated memory controller (DDR3) and a GPU (graphics processor) on the same package as the main processor. Though it's not a monolithic device but rather a multi-chip module, the packaging does allow these primary functional blocks to coexist in a single chip footprint or socket. In addition, Intel beefed up their graphics core, and it appears that the new Intel GMA HD integrated graphics engine offers solid HD video performance and even a bit of light gaming capability."
Intel branding considered harmful (Score:5, Insightful)
Grrr ... I wish Intel would go back to their system of giving new names to new chips and then adding a MHz rating (and if that's not enough, maybe a cache size and number of cores) to distinguish them, rather than using a weird combination of new names (for their top-tier chips) and old names (for their low-end gear).
I only just realized that Pentium no longer means "crappy NetBurst", but now means "low end C2D". And later this month, there will be "Pentiums" and even "Celerons" built on the same architecture as the i5. How do you let your friends know that the "Pentium" is either a worthless, power-hungry dinosaur; or a cheap version of the i5? Should people memorize the chip serial numbers? Because that seems to be the only way of figuring out what the chip is these days.
Re:Intel branding considered harmful (Score:5, Informative)
Re:Intel branding considered harmful (Score:5, Informative)
Re: (Score:3, Funny)
It's just Schroedinger's processor. You have to watch it closely to permanently enable or disable VT.
Re: (Score:2)
Re: (Score:2)
is VT really significant? VMware on a CPU without VT works just fine, doesn't it?
Re: (Score:3, Insightful)
is VT really significant? VMware on a CPU without VT works just fine, doesn't it?
Yes, but there are many applications where having VT will improve the performance of the VM. If you do a lot of virtualization, you'll definitely want it.
Re: (Score:3, Insightful)
VT lets you run a 64-bit guest OS on a 32-bit host OS. It probably has some performance benefits, too.
Re:Intel branding considered harmful (Score:4, Insightful)
A number of retailers have had CPU/mobo combos on sale with no way to determine which SSPEC they're actually stocking.
Send it back and demand a full refund, else charge it back. Let the retailers deal with Intel's BS.
Re: (Score:3, Insightful)
No flame intended to the Intel fans, but this is one thing I find much simpler with AMD's nomenclature.
Re: (Score:2)
I fully agree with this; it's absolutely impossible to fully understand Intel's CPU product lineup. And why make all those different models anyway? I understand you have a branch of products focusing on power consumption and another on speed, but the current number of different processors, brand names, code names, series, and serial numbers is completely insane. Especially, as you point out, because the meaning of these names keeps changing all the time!
Comment removed (Score:5, Interesting)
Not that different (Score:5, Informative)
Intel also has three lines that more or less directly correspond to AMD's: Core/Phenom (good), Pentium/Athlon (OK) and Celeron/Sempron (cheap), plus the server Xeon/Opteron. The real pain is the number of different model numbers and numbering schemes. The secret decoder ring for Intel models is:
A) Old three-number codes
E.g. Pentium 965, Celeron 450.
First digit is the model, second digit corresponds to the speed.
These are usually old crap and should be avoided. The Celeron 743 and Celeron 900 are fairly recent low-end chips that you can still buy.
B) Letter plus four numbers codes, e.g. SU7300:
* S = small form factor
* U = ultra-low voltage (5-10W), L = low-voltage (17W), P = medium voltage (25W), T = desktop replacement (35W), E = Desktop (65W), Q = quad-core (65-130W), X = extreme edition
* 7 = model line, tells you about amount of cache, VT capability etc. Scale goes from 1 (crap) to 9 (can't afford).
* 3 = clock frequency, relative performance within the line. Scale from 0 to 9.
* 00 = random features disabled or enabled; you have to look them up for specific details.
C) New Core i3-XYZa
Similar to scheme B, but with an added dash and even more confusing:
* i3 = Line within Core brand, can be i3 (cheap, but better than Celeron or Pentium), i5 (decent) or i7 (high-end)
* X = the actual model, tells you the amount of cache and number of cores, but only together with the processor line (i3-5xx is very different from i5-5xx)
* Y = corresponds to clock speed, higher is better
* Z = modifier, currently 0, 1 or 5 for specific features
* a = type of processor: X = extreme, M = mobile, QM = quad-core mobile, LM = low-voltage mobile, UM = ultra-low-voltage mobile
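The decoder ring above can be sketched as a toy parser. This is purely illustrative, built from the poster's summary rather than any official Intel naming specification; the voltage-class table, field meanings, and function names below are all assumptions taken from the comment, not documented semantics.

```python
import re

# The prefix/line meanings below are this thread's "decoder ring" summary,
# not official Intel data.
PREFIX_MEANINGS = {
    "U": "ultra-low voltage (5-10W)",
    "L": "low voltage (17W)",
    "P": "medium voltage (25W)",
    "T": "desktop replacement (35W)",
    "E": "desktop (65W)",
    "Q": "quad-core (65-130W)",
    "X": "extreme edition",
}

def decode_scheme_b(code):
    """Decode a letter-plus-four-digits code like 'SU7300' or 'T7300'."""
    m = re.fullmatch(r"(S?)([ULPTEQX])(\d)(\d)(\d{2})", code)
    if not m:
        return None
    small, volt, line, speed, feat = m.groups()
    return {
        "small_form_factor": small == "S",
        "voltage_class": PREFIX_MEANINGS[volt],
        "model_line": int(line),     # 1 (crap) .. 9 (can't afford)
        "relative_speed": int(speed),
        "feature_digits": feat,      # look up specifics per model
    }

def decode_scheme_c(code):
    """Decode a Core iN-XYZa code like 'i5-661' or 'i7-720QM'."""
    m = re.fullmatch(r"(i[357])-(\d)(\d)(\d)(X|QM|LM|UM|M)?", code)
    if not m:
        return None
    brand, model, speed, mod, suffix = m.groups()
    return {
        "line": brand,               # i3 cheap, i5 decent, i7 high-end
        "model": int(model),         # meaningful only together with the line
        "relative_speed": int(speed),
        "modifier": int(mod),        # 0, 1 or 5 for specific features
        "type": suffix or "desktop", # X/M/QM/LM/UM per the list above
    }

print(decode_scheme_b("SU7300")["voltage_class"])  # ultra-low voltage (5-10W)
print(decode_scheme_c("i7-720QM")["type"])         # QM
```

Which is exactly the problem: you need a lookup table and two regexes just to guess what you're buying.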
Re: (Score:2)
This secret decoder ring is exactly where Intel has got it all wrong. Who needs to charge them with antitrust violations when the marketing and product departments will run them into the ground anyway?
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
And as far as these new chips go, does Intel want to get a monopoly charge dropped on it?
The writing has been on the wall for a while, it will all be integrated into one chip at least on the low end. Oh sure Intel might get slapped one way or the other but by the time the dust settles it'll all be on a <30nm chip and no court will manage to force them to create discrete chips again.
The other part is games, but the chips are running ahead of eyes and displays and developer time; if you look at the latest reviews, they only test at 2560x1600 with full AA/AF. I'm sure Fermi will be impressive...
Re: (Score:2)
Last I checked, 1080p displays were becoming the norm for PCs...
Re:Intel branding considered harmful (Score:4, Informative)
Also, last I checked, the largest PC gaming segment still runs at 1280x1024 (presumably on the commodity 5:4 aspect LCDs which stormed the market several years ago). Only 12% run at 1080p or higher resolution. (source [steampowered.com])
The 512MB NVIDIA 8800GT is probably still the best bang-for-your-buck card on the market given the resolutions people are gaming at. The 8800GT handles every game you can throw at it just fine at 1280x1024.
Re: (Score:3, Informative)
The largest segment is technically 1280x1024 @ 21.2%, typically the highest resolution available on consumer CRTs. The next largest is 1680x1050 @ 19.98%, which is most definitely an LCD resolution. Technically it's not 1080p, but it's damn close for most applications. If you include all the resolutions from 1680x1050 all the way up to 1920x1200, HD or "damn close to HD" makes up a full third (36.69%) of the displays being used. The 8800 is no doubt a stellar card (I wish I'd bought one two years ago, instead...)
Repeat after me: A monopoly isn't illegal (Score:4, Informative)
Monopolies are only illegal when you abuse them.
Re: (Score:2)
When has that not been the case with PCs?
They've always been way past good enough for the average user.
Re: (Score:2)
Mainboards/chipsets for i5/i7 are expensive too (Score:2)
I tried to spec two alternatives for my imaginary new PC (as I am getting sick of Apple): one based on an AMD Athlon II quad-core, the other on an i5 or i7.
If you go with a trusted brand like Asus, the mainboard may cost more than the CPU itself! AMD mainboards are way cheaper and have an integrated but REAL GPU, an ATI 4000-something, which really supports up to DirectX 10.1 and has several 2D acceleration features.
I couldn't see the usual suspects (VIA etc.) offering i5/i7-supporting chipsets... or they are a bit late...
Re: (Score:2)
Nvidia has a license to make QPI-based chips. Whether they have any real plans to do so...
No they do not. That was the reason they bailed out of the controller market.
Intel licensed SLI from Nvidia for their QPI based Core processors.
SANTA CLARA, CA—AUGUST 10, 2009—NVIDIA Corporation today announced that Intel Corporation, and the world’s other leading motherboard manufacturers, including ASUS, EVGA, Gigabyte, and MSI, have all licensed NVIDIA® SLI® technology for inclusion on their Intel® P55 Express Chipset-based motherboards designed for the upcoming Intel® Core i7 and i5 processor in the LGA1156 socket.
http://www.electronista.com/articles/09/12/16/ftc.ignores.amd.settlement.in.intel.suit/ [electronista.com]
The FTC notes the lawsuit is not a direct antitrust case and only accuses Intel of violating competition and monopoly rules under Section 5 of the FTC Act. As a result, it prevents other companies from 'piggybacking' on the lawsuit by using an antitrust decision to demand triple damages in any private cases. In pursuing the complaint, the government commission is hoping to ban Intel from engaging in unfair bundling, pricing and exclusionary licenses and could, if victorious, force Intel to allow NVIDIA chipsets like the GeForce 9400M and Ion for Core i3, i5 and i7 processors as well as future Atom designs. Companies like Apple have faced the possibility of mandatory major reworkings of their computers to continue using modern processors.
Too bad Intel wants QPI only in $500+ CPUs (Score:2)
Too bad Intel wants QPI to be in $500+ CPUs only, and maybe Xeons as well.
The desktop i5 / i7 should have QPI as well.
i5/i7 am I missing something? (Score:2)
I finally understand: i5 is a lighter version of i7 with less cache. OK, it's what Intel did for years with Pentium/Celeron.
What is the basis for not enabling HT on the lower end while it is enabled on the higher end? The high end is already used by workstations and apps that actually use the cores and don't need some fake virtual CPU to fill threads.
Re: (Score:2, Funny)
I think the CPU lineup goes like this:
8088, 8086, 80286, 80386, 80486, Pentium, Athlon, umm not sure if there is anything faster than that.
Sockets and mobos (Score:2)
Re:Sockets and mobos (Score:4, Informative)
The average consumer doesn't give a shit what socket their CPU is in either, so it's all okay.
Re: (Score:2)
I bet they do give a shit when they try using a hammer to fit a 1366-pin chip into an 1156-pin socket!
Average consumers don't try to build their own computer from parts.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
No, the why is because they're not interested.
You don't build your own car, why? Because you're not interested in building cars.
You don't build your own house, why? Because you're not interested in building houses.
They don't build their own computers, why? Because they're not interested in building computers.
Re: (Score:2, Insightful)
That's a faulty comparison. Cars and houses take many well-trained hands to build, whereas a PC can be built by a single individual with little to no training in a few hours' time (or less). I don't change my oil, I don't paint my house, hell, I can't even fix the leaky faucet downstairs, but I can certainly build my own PC.
Re: (Score:2)
Yeah, that PC can be built with little training if you hand that person the exact collection of parts that they will be assembling and then supervise them. Picking parts, on the other hand, is a whole other ball of wax. I'm a systems engineer whose job it is to keep apprised of new PC technologies, and yet it took me quite a while to determine the final configuration for my latest build. There are a crapton of different options for most components, which is awesome for those of us who spend a lot of time learning...
Re: (Score:3, Insightful)
See, that's funny to me because changing oil, painting a house, or fixing a leaking faucet take FAR less knowledge and ability than assembling a computer. Hell....my WIFE changes the oil on the car.
Re: (Score:2)
Only until you find out you can get yours at IKEA [boklok.com] as well.
Re: (Score:2)
They buy a computer that is described as having a Core i7, and would like to know whether or not that allows them to run XP Mode in Windows 7. They don't care which type of socket the motherboard comes with.
Re: (Score:2)
What? With Pentium it was easy: there was a year-or-so-long break between using this brand for NetBurst and for the Core architecture. For around two years already, anything new under the Pentium brand gives you a nice, cheap C2D CPU... perfect in typical laptops. Yes, it's slightly slower, but together with Intel graphics and slow HDDs it doesn't matter.
Intel of course wasn't really promoting those CPUs, hoping you'd overpay for a full C2D, but they weren't secretive about them either.
Re: (Score:2)
Yup. And for a while, Pentium meant "Core Duo, not Core 2 though".
It's why I've been VERY wary of anything but "Core whatever" branded CPUs, to Intel's detriment - I've held off on an HTPC purchase for a while because the cost was more than I was willing to justify, and Adobe Flash is iffy unless you have a LOT of CPU horsepower...
Re: (Score:2)
You could compare Pentium 4s to each other pretty reliably based on clock speed. Sure, the Northwoods were a bit faster than the Prescotts, and the Extreme Edition chips had a nice speed boost from the cache, but generally clock speed made 'em match up.
However, turbo boost / new architectures can give a 50% speed boost on tasks like x264 encoding when you're talking core 2 vs i5. The frequent changes necessitate new naming schemes.
That was the problem, I think. They were rubbing Moore's law in people's faces. Now, the i7 can always be $999, the i5 can always be $250, and the Pentium can always be $100. People won't feel like idiots for splashing out on a marginally more powerful system, because their i7 will always have a superior sticker to the lowly Pentium.
Solid huh? (Score:5, Insightful)
In addition, Intel beefed up their graphics core and it appears that the new Intel GMA HD integrated graphics engine offers solid HD video performance
Solid HD video performance? I see 35% CPU load in the Casino Royale 1080p trailer screenshot, on a fast quad-core CPU. My puny single-core 1.6 GHz Atom with NVidia graphics does 6-10% max on any 1080p content I throw at it in XBMC.
It's better than what Intel offered before (nothing), but I still wouldn't recommend Intel graphics for any HD video player.
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
I disagree. I've had a few laptops that were primarily used for programming. On those, the basic, built-in Intel graphics (GMA950 and X3100, IIRC) were just fine.
In fact, they were even better than ATI or nVidia graphics for me: those computers were running Linux, and I could always count on the Intel drivers being available for the most up-to-date Linux kernels, whereas I couldn't make that assumption for the closed-source nVidia or ATI drivers.
Re: (Score:3, Informative)
No, Intel is great. They have the best drivers right now (AMD OSS drivers are catching up, though), and as long as you don't play a lot of Crysis under Wine, they are plenty powerful.
Re: (Score:2)
Actually, what you see is the entire CPU load including the GPU part; what you see in the second case is the pure CPU load with the GPU work offloaded. The end result is pretty much the same if you sum both up...
Re: (Score:2)
You're wrong. What you're seeing is that apparently the Casino Royale video is not fully accelerated by the GPU. The GPU load is never factored into the Windows performance monitor; you need GPU-Z or something like that.
Multitasking (Score:2)
Re: (Score:3)
Sure -- theoretically (Score:4, Insightful)
"In theory, practice and theory are the same. In practice they aren't"
Most people don't multitask on their desktop, or better described, "significantly multitask", meaning run multiple programs at once that are intensively using the CPU(s). Typically, they're running one application, which they're focused on, and other background applications, while they are running, are mostly idle, or utilising no more than the occasional few percent.
Ripping a movie on an Atom CPU PC (likely a netbook) at the same time as watching one? I think that's an unlikely event.
Running a highly trafficked web server on an Atom CPU? I think that's even less likely than ripping a movie while watching one.
Remember the OP's criticism? 35% CPU utilisation, which of course still leaves 65% CPU for any other tasks such as ripping a movie, running a web server, etc., was unacceptable. So how much unused CPU is enough for more-than-likely theoretical, rather than in-practice, use? 70%, 80%, 90%? Any free CPU is CPU you've paid for but aren't getting any value from. The greater the unutilised CPU percentage, the less value for money you're getting.
People buy CPU capacity based on their peak usage, not their average usage. My fundamental point, and why I agree with "solid HD" performance, is that the typical high-load use of a PC while watching a movie is only watching that movie. If these new Intel CPUs with GPUs still have 65% capacity left while the movie is playing, you could say they're significantly overspec'd for their likely peak use, by 65% or so.
Re:Sure -- theoretically (Score:4, Insightful)
Re: (Score:3, Insightful)
I won't bother pulling up the numbers but I'm pretty sure you'd find that a CPU spec'd to 15% of your current CPU's capacity uses a lot less power than the current CPU running at 15% capacity.
There's a reason they don't throw Core i3's in cell phones and just under-clock them. Low power CPUs exist for a reason.
Re: (Score:3, Interesting)
They might be watching a video while touching up their photos in Photoshop. That's probably the most likely heavy use scenario.
Re: (Score:2)
Re: (Score:2)
Heat and noise...
When CPU usage gets high, many machines especially laptops get hotter and crank up the fans to compensate...
If the work is being done by the GPU then typically less heat is generated and thus less noise...
Noise when watching a movie can detract from the enjoyment of the movie.
Re: (Score:2)
I think you also missed a key factor, and that is the concept of the GPU being improved. Intel has had the advantage in terms of CPU performance, so Intel is trying to improve their graphics performance now. And, if the new GPU isn't up to the task, THAT is where you CAN focus.
Now, let's be honest, the low end of the market is where integrated graphics really comes into play. So, we are talking about the $400 computer towers you can buy from HP, Gateway, and Dell. And for THOSE, you need the combination...
Re: (Score:3, Insightful)
65% of Core i5 CPU is worth much more than 90% of Atom for "multitasking". Plus, those numbers aren't correlated strongly with how smooth any hypothetical multitasking will be, it's more about OS & the way apps are written.
Re: (Score:2)
Re: (Score:2)
Well, for one, the machine is passively cooled and will jump over 70 degrees Celsius if I tax the CPU by more than a few percent; during GPU-accelerated playback it stays nicely around 60C. Also, the thing is in use as a home server/personal web server, which means there's all kinds of stuff running in the background. 35% of a Core i5 = around 300% of a single-core Atom; you do the math.
Last but not least, I like the idea that the most efficient part of my computer is used for the most appropriate task.
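The "you do the math" above can be made explicit. A throwaway sketch, where the i5-to-Atom throughput ratio is an invented figure chosen only to match the comment's claim, not a benchmark:

```python
# Back-of-envelope only: assume a dual-core Core i5 does roughly 8.5x the
# work of a single-core 1.6 GHz Atom (invented ratio, not a measurement).
i5_vs_atom = 8.5

cpu_load_on_i5 = 0.35                    # the review's 35% figure
atoms_equivalent = cpu_load_on_i5 * i5_vs_atom
print(round(atoms_equivalent, 1))        # about three whole Atoms' worth
```

So even "35% load" on the i5 represents far more decoding work than an Atom could do at 100%.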
Reviews online at anandtech.com and techreport.com (Score:5, Informative)
DESKTOP PROCESSORS
http://techreport.com/articles.x/18216/1 [techreport.com]
"As a CPU technology, Clarkdale is excellent. I can't get over how the Core i5-661 kept nearly matching the Core 2 Quad Q9400 in things like video encoding and rendering with just two cores. We've known for a while how potent the Nehalem microarchitecture can be, but seeing a dual-core processor take on a quad-core from the immediately preceding generation is, as I said, pretty mind-blowing. Clarkdale's power consumption is admirably low at peak
(...)
The integrated graphics processor on Clarkdale has, to some extent, managed to exceed my rather low expectations."
http://anandtech.com/cpuchipsets/showdoc.aspx?i=3704 [anandtech.com]
"For a HTPC there's simply none better than these new Clarkies. The on-package GPU keeps power consumption nice and low, enabling some pretty cool mini-ITX designs that we'll see this year. Then there's the feature holy-grail: Dolby TrueHD and DTS HD-MA bitstreaming over HDMI. If you're serious about building an HTPC in 2010, you'll want one of Intel's new Core i3s or i5s."
NOTEBOOK PROCESSORS
http://anandtech.com/mobile/showdoc.aspx?i=3705 [anandtech.com]
"From the balanced notebook perspective, Arrandale is awesome. Battery life doesn't improve, but performance goes up tremendously. The end result is better performance for hopefully the same power consumption. If you're stuck with an aging laptop it's worth the wait. If you can wait even longer we expect to see a second rev of Arrandale silicon towards the middle of the year with better power characteristics. Let's look at some other mobile markets, though.
(...)
If what you're after is raw, unadulterated performance, there are still faster options.
(...)
We are also missing something to replace the ultra-long battery life offered by the Core 2 Ultra Low Voltage (CULV) parts. "
Re: (Score:2)
Wow, you're cherry-picking in favor of Intel, how about some quotes like:
When I first started testing Clarkdale I actually had to call Intel and ask them to explain why this wasn't a worthless product. The Core i5 661 is priced entirely too high for what it is, and it's not even the most expensive Clarkdale Intel is selling!
Do Not Want! (Score:3, Interesting)
Anyone else suspicious of this? Intel trying to use its CPU monopoly to gain a GPU monopoly?
Re: (Score:2)
We breathe oxygen. What do you breathe there?
Re: (Score:2)
It will further cement their already near-monopoly in the integrated graphics segment for Intel-based systems. Whether it will have much impact on the gamer graphics segment depends on how well it performs. It seems that they have more or less caught up with AMD integrated graphics, but I don't think that in itself is enough to seriously impact sales of discrete graphics cards.
Unfortunately TFA jumps straight from integrated graphics to a £130 card and uses completely different settings for the two
Interesting implications (Score:4, Interesting)
While you might have missed that Intel has already been the largest GPU vendor in the world for years (gaming is small compared to B2B sales), you are right anyway. When offering Intel CPUs implies having to buy their GPU, the air will become thin for excellent integrated chipset offerings such as Nvidia's. Instead of pushing customers through secret, anti-competitive contracts, they have just changed their product lineup. Want a CPU? Fine, but you can't have it without a GPU.
It will be interesting to see whether Apple will get special treatment. They have already semi-officially let word slip that they are not interested in the Arrandale GPU and won't use it. It's just not powerful enough for their GPU-laden OS and application lineup compared to Nvidia's chipset offerings.
Re: (Score:2)
And remember the last time they did this? Before the 486, there were a few x87 manufacturers (including AMD). The 486 came with an integrated 487, so there was no need to buy one from a third party (they later split the line into 486SX and DX, where the SX was a 486 with the broken 487 disabled).
AMD survived by ramping up investment in their x86 clones and shifted to selling x86+x87 cores, rather than just x87 cores, as their primary market. ATi is doing the same thing by being purchased by AMD. nVidia...
Re:Do Not Want! (Score:4, Interesting)
Yep, they already said they want to bankrupt Nvidia; every move in the last year was in this direction. First, shutting out the ION chipset through illegal pricing; now, trying to push the GPU into the core, so the cheap-enough solution kills off wherever Nvidia (and ATI, but they are less bothered since they can do the same) got its core money from; third, fighting a patent war to shoot them out of the chipset market.
The entire thing started when NVidia was blabbering about how you don't need CPU upgrades anymore, just use the GPU for everything. That woke Intel up, and as usual with cheapass solutions which are worse but cheaper, they kill off the competition!
Worked in the past; works again.
I wonder if we will see NVidia in the PC market at all in 5 years; they might end up being a second PowerVR, still healthy in the embedded sector but not at all present on the PC side of things.
What the hell... (Score:5, Insightful)
Re: (Score:3, Interesting)
I bought an i7 as part of a general upgrade a few months ago; it wasn't until I had it installed and happened to check Task Manager that I realised it was a quad core chip.
Re:What the hell... (Score:4, Insightful)
Re: (Score:2)
Probably dual-core, with hyper-threading turned on. Try one of the many CPU-ID programs to view your CPU's full features (or cat /proc/cpuinfo on Linux)
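On Linux, the `cat /proc/cpuinfo` suggestion can be taken a step further: counting distinct (physical id, core id) pairs versus processor entries tells real cores apart from Hyper-Threading's logical CPUs. A rough sketch, assuming the kernel exposes those fields (typical on x86; the exact field layout can vary by platform):

```python
def core_counts(path="/proc/cpuinfo"):
    """Count physical cores and logical CPUs from a cpuinfo-style file."""
    physical = set()        # distinct (physical id, core id) pairs
    logical = 0
    phys_id = None
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "processor":
                logical += 1                 # one block per logical CPU
            elif key == "physical id":
                phys_id = value              # which socket/package
            elif key == "core id":
                physical.add((phys_id, value))
    return len(physical), logical

# Usage on a Linux box (hypothetical output, depends on your hardware):
#   cores, threads = core_counts()
#   threads > cores  =>  SMT/Hyper-Threading appears to be enabled
```

A dual-core chip with HT will report four "processor" blocks but only two unique core IDs, which is exactly the Task Manager surprise described above.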
Article is terrible (Score:5, Informative)
The article is awful. There is only one game benchmark, and that is compared to an integrated AMD GPU that hardly anybody has heard of. There is also no way of telling from the article whether the integrated Intel graphics actually has HD video decode acceleration or not. The modern Core i5 chips are pretty capable of decoding 1080p content by themselves without any GPU assistance.
I think the article writer misunderstands how hardware video decode assist actually works. It isn't magically engaged when you play any HD movie in any media player (usually it has to be turned on in an option somewhere, in a media player app that supports it), and it isn't a sliding scale of CPU usage. Modern decoding chips either decode EVERYTHING on the card, reducing CPU usage to 1% or 2%, or the app decodes EVERYTHING in software, resulting in fairly high CPU usage.
I still have no idea if the new Intel graphics chip actually offers any HD video acceleration at all. If it does, that would make it a nice choice for low-power and HTPC solutions. If it doesn't, it's just another crappy integrated graphics part.
Re:Article is terrible (Score:4, Informative)
DXVA acceleration works automatically with Windows 7 and any application using the proper built-in decoder and EVR renderer. It should also work with Media Player Classic Home Cinema, if the default renderer is compatible.
It does. The G45/GM45 chipsets released in 2008 also have full decoding.
tough day for nvidia stock (Score:2)
a wsj analyst has to be looking at this, and concluding that the gpu business is doomed.
Re: (Score:2)
First, Intel is licensing the Atom microarchitecture to SoC manufacturers. They can also license a GPU core design from nVidia, and maybe a DSP design from someone else, and build their own integrated SoC with an nVidia GPU and an Atom CPU. This is Intel's attempt to compete with ARM. When you buy an ARM chip, you almost always...
Re: (Score:2)
Intel claimed the video acceleration market was doomed years ago with their first AGP cards that accessed system memory and were about as useful for gaming as a Gameboy that's been through the laundry.
Integrated video has been available for a long time, and it keeps getting better, but the gap between that and truly good gaming hardware is still very wide. The difference is that video card makers simply need to refocus and won't have the low-end market to supply anymore for those easy profits.
Re: (Score:2)
Did you used to work for SGI? Your post sounds exactly like the arguments that SGI management was making in the '90s.
Discrete graphics chips are luxury items. For most users, integrated graphics are more than enough; there's a reason why Intel has more than 50% of the GPU market. I rarely come close to stretching my three-year-old laptop's GPU. Integrated graphics keep improving. The gap between integrated and discrete isn't the important thing; it's where on the line 'good enough' sits. For a lot of...
Nvidia does a real thing, Intel is fake (Score:2)
...or, being a journalist, he uses a Mac and he has purchased the i950/945 based scandals from Apple.
Trust me on that: Apple figured they made their biggest mistake by trusting Intel's "graphics". There are games that have to carry the note "Intel graphics based Macs aren't supported".
Imagine: you fix the endian issue, claim to have the "best OpenGL", and you base your OS on GPU acceleration features. Some CPU monopoly, which you stupidly relied on as a single vendor, offers you a graphics solution, and your "living room computer"...
Re: (Score:2)
a wsj analyst has to be looking at this, and concluding that the gpu business is doomed.
Intel is going to have to come out with a GPU that's better than a 4 year old nvidia gpu first.
OK can someone clear this up (Score:2)
Can someone answer these 'simple' questions, in terms of regular geek activities (movie playing/encoding, gaming, compiling, rendering, desktop use, all the regular things):
1. Which processor is the all-out fastest, best (money no object)?
2. Which processor is the best bang for the buck (money an object)?
3. How do Intel chips compare to AMD on the bang-per-buck level?
Re: (Score:3, Informative)
Of course. Every PC hardware site worth a penny does regular articles on which CPU is currently the fastest and which will give you the most for your money. As well as comparisons between Intel/AMD. My favorite site for such things is Tom's Hardware, though Google will likely find you many more.
Which CPU is actually fastest heavily depends on what you will be using it for. Your list of "regular geek activities" does not narrow it down enough. Also, many applications contain optimizations that target a particular...
only have 1 x16 + DMI IS bad as boards with usb 3. (Score:2)
Only having one x16 link plus DMI is bad, as boards with USB 3.0 / SATA 600 have to cut PCIe lanes or use PCIe switches to have the bandwidth to run them.
Apple better not use this gpu as it is slower 9400 (Score:2)
Apple had better not use this GPU, as it is slower than the 9400M and much slower than the newer 9400M gen 2.
Apple has too much invested in the GPU/CUDA to go back to an Intel GMA POS.
And if they do, Intel is just asking for people to use Mac OS on generic x86. Come on, a $1200 AIO with this? $1500-$1700 laptops with this and 13"/15" screens? An $800 desktop with this? When you can get a Core i7 920, ATI 5770 video, 6GB RAM, a 1TB HD and more for $1000-$1200. Apple had better not even think of this at $800+.
Re:Video decoding under Linux (Score:5, Informative)
Not sure about Intel. But Nvidia has VDPAU which is very nice. Feature Set C even added MPEG4 decoding and SD content upscaling, all in GPU (http://en.wikipedia.org/wiki/VDPAU#NVIDIA_VDPAU_Feature_Sets)
Broadcom finally released Crystal HD drivers for Linux, which means if you have a mini PCI-E slot, you can get HD content. (http://xbmc.org/davilla/2009/12/29/broadcom-crystal-hd-its-magic/)
If you want to know what is available for which GPU/platform, keep an eye on what the XBMC guys are doing. They seem to be at the forefront of getting hardware acceleration working on different setups.
http://xbmc.org/wiki/?title=Hardware_Accelerated_Video_Decoding [xbmc.org]
Re: (Score:2)
Not sure about Intel.
I don't know about these chipsets, but the current Intel chipsets with HD acceleration support like Poulsbo and GMA X4500HD have had extremely poor Linux support. nVidia and VDPAU was really the first viable solution.
Re:Video decoding under Linux (Score:4, Interesting)
Re: (Score:2)
Passively cooled? Forget it, unless you can find a Tegra-based system (which don't exist out of the box). Tegra can do it, but no one pushes out boards for it, because it is ARM-based, and you know how easy it is to get a decent ARM board (outside of the BeagleBoard there are none you can get). The ION is the closest you can get; add silent active cooling and you are off, even with the cheapass Atom processors.
You might be better off in half a year's time; NVidia is working on a VIA-based ION solution which...
Re: (Score:2)
Passive cooling hopefully isn't necessary. Consider just getting a big HSF with a big fan, to run at minimum speeds. I've got a Core 2 E6400 HTPC in a Silverstone LC11-M case. I could unplug every single case fan, set the stock cooler to the lowest possible speed (about 920rpm) and play videos without it overheating, or even getting close to overheating. At that point the noisiest item was the hard disc, even though it's got soundproofing panels around it.
I've recently bought a 30GB SSD to replace the noisy...
Re: (Score:2)
Quiet SATA drives are nearly silent these days, but fans still aren't. A large, slow fan moves a lot of air, and while you reduce the sound of the bearings in the fan, you still have the noise of airflow over uneven surfaces.
I have a couple of completely passive boards, and the noise difference from a nearly-silent Sonata case with a very slow CPU fan to truly silent is still very notable.
Re: (Score:2)
What I'd really like is to have a passively cooled box that's able to play 1080p H.264.
You mean like this? [newegg.com]
Re: (Score:2)
That one comes with a cooling fan, and from what I've heard there's a good chance you'll need it for the dual-core model. But I think the single-core is cool enough, based on what I've read.
Re: (Score:2, Informative)
Re: (Score:3, Interesting)
No, it's a laptop CPU/GPU combo, these things are aimed squarely at high end laptops like MacBook Pros.
Re: (Score:2)
Mid-range laptops like the (lower) MacBooks. MacBook Pros have always had decent discrete graphics, which is one of the primary factors differentiating them from the cheaper MacBooks.
Re: (Score:2)
No, it makes sense to advertise it when it's a single chip that does everything. Netbook CPUs don't offer *any* video decoding support (other than software); that's all done on the chipset (assuming you actually got an Ion).
Re: (Score:3, Informative)
Re: (Score:2)
By mobile the parent probably meant laptop.
Re: (Score:2)
don't worry, there is little doubt that this is a downgrade. (except for Atom owners)
Re: (Score:2)
Care to elaborate on how an i5 on a laptop is a downgrade for anyone?
Re: (Score:2)
You can't get a small/lightweight Core i7 laptop, though. I doubt any of them have spectacular battery life.
Re: (Score:2)
My question was a trick, too: Intel sold quite a few onboard graphics chips as "Vista Ready" in the past. I bought one without doing my homework, and right now I am quite cautious when it comes to Intel hype.