AMD Demos Llano Fusion APU, Radeon 6800 Series
MojoKid writes "At a press event for the impending launch of AMD's new Radeon HD 6870 and HD 6850 series graphics cards, the company took the opportunity to provide an early look at the first fully functional samples of their upcoming 'Llano' processor, or APU (Applications Processor Unit). For those unfamiliar with Llano, it's a 32nm 'Fusion' product that integrates CPU, GPU, and Northbridge functions on a single die. The chip is a low-power derivative of the company's current Phenom II architecture fused with a GPU, and will target a wide range of operating environments at speeds of 3GHz or higher. Test systems showed the integrated GPU had no trouble running Alien vs. Predator at a moderate resolution with DirectX 11 features enabled. In terms of the Radeon 6800 series, board shots have been unveiled today, as well as scenes from AMD's upcoming tech demo, Mecha Warrior, showcasing the new graphics technology and advanced effects from the open-source Bullet Physics library."
Deceiving naming... (Score:5, Informative)
The 6870 actually has less performance than the 5870... Same goes for the 6850/5850... I don't really understand why they named them the way they did... Either way, a 6970 is supposed to be released in the near future to surpass the GTX480/5870.
Re: (Score:2)
If the 5870 is still better in price, frame rate or power use, I am sure it will be noted.
The main thrust seems to be a new mid-range (in price) 6xxx that should be a bump towards the 5870's stats.
As for the top end, that will be fun.
Re: (Score:2)
I didn't understand any of what you said. This new scheme doesn't make much sense. Why didn't they just name these new cards 6770 and 6750 if that's the price range they're targeting? This will just confuse consumers and is something I would expect from nVidia or Intel, AMD are usually sensible in their naming conventions.
Re: (Score:2)
Re: (Score:3, Insightful)
That's a nice way of saying "give the consumer less".
Re:Deceiving naming... (Score:4, Insightful)
Re: (Score:3, Insightful)
You forgot the 9700 era, ATI totally owned NVIDIA then.
And the current era, they totally own that.
Re: (Score:1)
I said the FX5000 was the exception, being stupid.
Funny how you get moderated for being stupid, though.
And it wasn't so much ATI owning Nvidia as Nvidia screwing up (3Dfx-style).
Re:Deceiving naming... (Score:5, Insightful)
Many years ago I upgraded from a Voodoo 3 to a GeForce 4 Ti 4600, and for more than a few games that GF4 was slower in FPS than the Voodoo at first (but still more than fast enough for gaming).
This was at a time when games were almost strictly simple textured-polygon throwers, which was the Voodoo 3's only strength. As the use of multi-texturing became more prevalent (heavily used in terrain splatting... [google.com]), the advantages of the GF4 over the Voodoo became apparent, as more scene detail became essentially free, whereas the Voodoo required many rendering passes to accomplish the same thing.
Now I'm not saying I know that this generation of AMD GPUs will experience the same sort of future benefits as that GeForce 4 did, especially since DX10/DX11 really isn't having a rapid uptake, but there could easily be design choices here that favor DX11 features that just aren't being heavily used yet.
The question is not 'is the 6870 slower than the 5870?' in some specific benchmark. The question is which of these cards will provide a solid gaming platform for the most games. As with my experience, that Voodoo performed better than the GF4 for a while... but for the newest games the GF4 kept providing a good experience, whereas the Voodoo became completely unacceptable.
Re:Deceiving naming... (Score:4, Interesting)
What I suspect AMD has done is add tessellation units to the chip. This will be evident when running the Heaven benchmark with tessellation enabled. Keep in mind that tessellation is one of the key changes between DX10.1 and DX11 and, as you stated, this is future-looking. Sure, the chip may be a bit slower currently, but I suspect that when running something that depends heavily on tessellation, there won't be any slowdowns.
The reason I'm aware of this is my Radeon 5650. It's a DX11 card with 512MB onboard, and when running the Heaven test there's a big visual improvement with tessellation on, even though the card struggles and drops to between 4-12 frames per second. With tessellation off, the card easily handles the test at a playable rate of 45-60 frames.
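For anyone wondering what tessellation actually buys you: the hardware tessellator subdivides each coarse input triangle into many finer ones on the fly, and a domain shader then displaces the new vertices (by a height map, say) to add real geometric detail. A rough CPU-side sketch of the subdivision idea, in C++ (purely illustrative; the real DX11 pipeline does this in fixed-function hardware between the hull and domain shaders):

#include <array>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 midpoint(const Vec3& a, const Vec3& b) {
    return { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
}

// One coarse triangle in, 4^depth fine triangles out: split at the edge
// midpoints and recurse. A domain shader would then displace each new vertex.
void subdivide(const std::array<Vec3, 3>& tri, int depth,
               std::vector<std::array<Vec3, 3>>& out) {
    if (depth == 0) { out.push_back(tri); return; }
    Vec3 ab = midpoint(tri[0], tri[1]);
    Vec3 bc = midpoint(tri[1], tri[2]);
    Vec3 ca = midpoint(tri[2], tri[0]);
    subdivide({tri[0], ab, ca}, depth - 1, out);
    subdivide({ab, tri[1], bc}, depth - 1, out);
    subdivide({ca, bc, tri[2]}, depth - 1, out);
    subdivide({ab, bc, ca}, depth - 1, out);
}

Three levels of subdivision turn one triangle into 64, which is why a card with strong tessellation hardware can add detail the application never submits, and why cards without it fall off a cliff in the Heaven benchmark.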
You must have had a really crappy Geforce4 (Score:2)
I am wondering how your GeForce4 Ti 4600 got outclassed by a Voodoo3. The Voodoo3 was equaled or outclassed by the original GeForce 256. Maybe your memory is fuzzy, but there would be some major issues if your Voodoo3 was faster than the GeForce4. Also, multi-texturing was a big deal around the TNT2 and Voodoo2/3 days; the GeForce3/4 were way past that stage with the introduction of shaders. By the time the GeForce4 came around, 3Dfx had already been dead for two years after their Voodoo5 failed miserably against
Re: (Score:2)
Re: (Score:2)
Yeah, good points. The issue is that gaming graphics are now undeniably console-driven, and consoles don't do DX11/tessellation yet, so I suspect that feature set will be a bit niche. For every Stalker/Crysis type of envelope-pushing gfx-whore-heaven FPS (which I do oh so love) there will be 4-5 multiplatform releases that don't even push a 5770, let alone higher.
I'm considering an upgrade from my 5850 purely from a noise POV; I have the HIS iCooler V variant (stock, not OCed) and reviews say it's quiet, but under load
Re: (Score:2)
That, and only the Xbox 360 does DX9 (specifically 9.0c). The PS3 uses an OpenGL ES 2.0 library for the most part, with lots of hacks. If they drive the market, don't expect DX11 or later to be adopted until the next generation of consoles comes out.
I'm really excited about physics integration, as most of the stuff I've worked on recently has required passing physics information between hardware and software (in textures). For instance, cloth and hair both need physics to behave realistically.
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
I'm about ready to replace my 4650, too. I've got a new HD monitor coming and figure that's as good a time as any to up the graphic power, though I won't be going water-cooled.
My problem with the numbering system is always the second digit. For example, is a 5830 better than a 5770 or 4870? Do I add up the 4 digits and compare the sums? Is the first digit the most important, or the second, or
Re: (Score:3, Informative)
Re: (Score:3, Interesting)
I bought and use that exact water cooler on an AMD 965 (Phenom II X4 3.4GHz Black Edition). It works great and I highly recommend it. My only advice for anyone is to make sure your side panel doesn't have fans or protrusions in the back near your 120mm exhaust port. My case has a 180mm side fan that prevented the radiator (sandwiched between two 120mm fans) from being mounted inside the case. I dremeled out a slot so the coolant tubes could pass through the back (it's a closed coolant system, so you can't ju
Re: (Score:3, Interesting)
I have a stupid problem with that very case. I used it with a GA55-UD3P motherboard and the connector to the audio jacks was on a wire that was about 1.5 inches too short to connect to the onboard audio.
Do you know if you can buy extension cords for those little connectors? I'd hate to not be able to use the headphone jack because the wire inside the case is too short. (Note: I am not competent with a soldering iron)
Re: (Score:3, Informative)
Is this your mobo and that green spot (F_AUDIO) on the left by the audio jacks is where you need to plug in? http://www.orangeit.com.au/catalog/images/prodimg/img1338.jpg [orangeit.com.au]
I don't remember where mine went and can't check until later tonight... but I think mine was bottom left. Is it possible the cable is wrapped around something behind the other side panel? I don't know if anyone sells an extension cord for those, but I found some stuff that may work for you.
http://www.sparkfun.com/commerce/product_info.php?p [sparkfun.com]
Re: (Score:2)
Brother, that's really nice of you to offer, but I'm sure you have better things to do with your time.
I'll have to pop the cover on my case again to look, but I seem to recall that the connector on the mobo was one of those 2 or 3 bare pins sticking up and the wire from the case was one of those thin wires with the tiny black connector on the end that you slip over the pins. I may well be wrong. When I built the system, I was in a panic to get it up and running. All I remember is that I had trouble makin
Re: (Score:2)
Re: (Score:2)
105-110F usually and rarely does it go over 120F. I don't have a before and after comparison though, sorry. The only bad part about the thing is you have to remove the mobo from the case to install a custom bracket. That won't be a problem for you though if your friend is going to dremel the case anyways.
Re: (Score:1)
On an unrelated note, what's the best distro choice for a 600MHz Transmeta CPU laptop?
Re: (Score:1)
For example, is a 5830 better than a 5770 or 4870?
Probably.
Stupid guesses:
58xx > 48xx from the generation alone.
57xx is probably a more limited chip, or something else differs (memory?) versus 58xx.
xx30 is lower-end than xx70 of the same chip.
Or something; Wikipedia can most likely tell.
Facts:
HD 4870: 750/900 clocks, 800:40:16 (unified shaders : texture mapping units : render output units), 256-bit GDDR5.
HD 5770: 850/1200 clocks, 800:40:16, 128-bit GDDR5.
HD 5830: 800/1000 clocks, 1120:56:16, 256-bit GDDR5.
x7xx in both generations seems to be 128-bit memory.
x8xx is 256-bit.
48xx X2 is 2x256-bit.
x9xx is 2x256-bit.
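For what it's worth, the scheme this thread is reverse-engineering can be written down as a toy decoder (C++, just to make the digit positions explicit; the digit meanings are my reading of the specs above, not anything AMD documents):

#include <cstdio>
#include <initializer_list>

// Hypothetical reading of the four-digit ATI/AMD model numbers:
// thousands digit = generation, hundreds = market segment,
// tens = variant within that segment.
struct RadeonModel { int generation, segment, variant; };

static RadeonModel decode(int model) {
    return { model / 1000, (model / 100) % 10, (model / 10) % 10 };
}

int main() {
    for (int m : {4870, 5770, 5830, 6870}) {
        RadeonModel r = decode(m);
        printf("HD %d -> gen %d, segment %d, variant %d\n",
               m, r.generation, r.segment, r.variant);
    }
    return 0;
}

The catch, as the specs above show, is that the digits only order cards reliably within a generation; across generations (5770 vs 4870, or 6870 vs 5870) the segment digit gets re-based and you have to look at the actual shader counts and memory bus.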
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
I frankly can't tell you if a 9600Gt beats a GT210 or the other way around
I assume it does. What about a HD4250 vs HD 3650?
Hard to see the difference =P, though yeah, while LE, GT, GTX, GTS, ... may seem weird, 30, 50 and 70 are rather obvious.
Re: (Score:2)
Because shit dude the 6870 is 1000 better than the 5870! And it's going for a lot less! It's a GREAT deal.
In a Tablet, or Game Console (Score:2)
Re: (Score:2)
Which one? (Score:1, Offtopic)
PCs will be obsolete when I can have a chip implanted in my brain.
Will you go for Ponch, or Jon?
Cheers,
Re: (Score:1)
My brain will be obsolete when one can implant a chip in it.
Who's right? Just poking in an M68k or "implanting" a 5.56mm NATO round in there with no connections doesn't count :D. Though the latter is sure to make my brain obsolete :D
Deceiving? (Score:4, Insightful)
Re:Deceiving? (Score:4, Insightful)
Re: (Score:1)
From the comments, it looks like they said 5 cards, not 5 generations? 2 generations?
And they did the same (or was it three?) with the low-end GF4s, didn't they?
That didn't mean ATI had better cards than the Tis, though.
Re: (Score:1)
I think in the GF 3-4 range there were a lot of GF4-mx cards.
These were essentially low-middle end GF3 cards, and for some things my GF2 TI outperformed it (with less CPU to boot).
Re: (Score:2)
I think in the GF 3-4 range there were a lot of GF4-mx cards.
These were essentially low-middle end GF3 cards, and for some things my GF2 TI outperformed it (with less CPU to boot).
The 'Geforce 4 MX' was a Geforce 2 in drag, wasn't it?
Re: (Score:1)
That was the impression I had. Or at least that it wasn't better :)
Re: (Score:1)
Check again; you were undoubtedly looking at 5770s, not 5870s. And then that is correct: 5770 -> 6870. Which, as noted, doesn't make a lot of sense when the 5870 performs better.
They should have really just started their baseline at the 6660, obviously...
Re: (Score:1)
It makes sense for AMD to hold off on updating their lower end offerings since consumers are less demanding at this price point.
Re: (Score:1)
Re: (Score:2)
Low power but better manufacturing leaves it to be seen whether they can be overclocked better than the 5xxx series. TBD until people get some hands-on time, of course.
Re: (Score:2)
The general idea is that the 5850/5870 will be phased out as these cheaper new cards are introduced, so the AMD mid-to-upper-range lineup will be 5750/5770, 6850/6870, 6950/6970 (next month, plus 6990 later). So that does make sense, since the 6850 and 5750 will be out at the same time and the former is much faster (and that is indicated by its x8xx vs x7xx), and also has the new features (like HDMI 1.4 and 3D, and those are indicated by its 6xxx vs 5xxx). But the 5850/5870 will not disappear overnight, so for a w
"Alien vs. Predator" Movie or Video Game? (Score:2)
I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.
Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?
Re: (Score:1)
I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.
Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?
By current standards, Low-power would be just under the amount of energy required to power the sun to run a dual-card setup.
Re:"Alien vs. Predator" Movie or Video Game? (Score:4, Informative)
Without any fan? No probably not. It is a desktop processor. This isn't an ultra low power component, it isn't an Atom. The idea AMD is going for here, and I think there's some merit to it, is a low range desktop type of system. People who want something cheap, but still want to play games. Intel's integrated chips don't work that well for that (though they've improved) so this is to try and fill the market.
If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.
Re:"Alien vs. Predator" Movie or Video Game? (Score:5, Insightful)
The idea AMD is going for here, and I think there's some merit to it, is a low range desktop type of system. People who want something cheap, but still want to play games. Intel's integrated chips don't work that well for that (though they've improved) so this is to try and fill the market.
Think more mid-to-high-end laptops.
As mentioned in the summary, this is a low-power version of the Phenom II. Not an ultra-low power for consumer electronics or netbooks like Atom or AMD's Bobcat, but still solidly aimed at the mobile market. It provides all the power and cost advantages of a UMA solution plus gets rid of one of the system buses for more savings, while providing good-for-a-laptop graphics without having to significantly re-engineer the motherboard or cooling solution. This is still in theory; demonstrations of engineering samples are nice, but it'll be interesting once the reviewers get their hands on some.
Of course you're also right, since cost and power usage are relevant for desktops. Just not as much, since you're not dealing with battery life, or the form factors that make it difficult to work with discrete graphics. A single line of UMA-based motherboards with an optional PCIe graphics card can serve multiple markets with one design and acceptable margins.
Re: (Score:2)
"Think more mid-to-high-end laptops."
No, those will come with their own MXM expansion slot for dedicated GPU.
Re: (Score:2)
Most of the mid-range laptops don't have MXM because it's a waste. The price differential of adding the slot is almost certainly more of a burden to the user than a replacement, and many MXM cards have heat sinks in nonstandard locations, or come in nonstandard sizes that only fit one range of laptops. This is supposedly less common in MXM3 but people were still using MXM1 when MXM3 had come out.
Low- to Mid-range laptops will get these chips, netbooks will continue struggling along with the slowest CPUs, an
Re: (Score:2)
"Which suggests that MXM is a big fucking waste of time and money in all cases."
Yea, you go ahead and say that to my nx9240, which has had four video upgrades and still runs like a champion.
Re: (Score:2)
The plural of anecdote is not data, and a single anecdote is not even that. I was talking about a waste to the manufacturer, who is more important than you, especially when deciding whether the system shall have an MXM slot. Statistically nobody ever bought a laptop because they could upgrade it. You are noise.
Re: (Score:2)
The very high-end, sure. For the rest, the cost and battery life advantages will steer it towards Fusion. People these days want a high-end laptop that can play games, but that can also function as a useful portable computer in the absence of a power outlet.
Re: (Score:2)
If AMD gets the jump on Intel by integrating high performance GPUs with high performance CPUs it won't be the first time.
This is not a high-performance GPU; you really can't integrate a high-performance GPU with a high-performance CPU, because you'd have to suck 400W of heat out of the monstrous combined chip that would result, _and_ it would then be crippled by the poor memory bandwidth anyway.
No-one is going to buy this chip if they want high-performance 3D, they'll buy a CPU and a discrete graphics card. It's merely providing somewhat better performance than other integrated graphics chipsets, allowing people buying cheap P
Re: (Score:3, Interesting)
Firstly, this can save money. Integrating the GPU into the CPU can create a lower-cost part for an OEM than having to use two chips in separate packages. Second, this is a fusion between x86 and GPGPU/OpenCL. Once a critical mass of CPUs have an integrated GPU, you will probably see GPGPU tech really start to become integrated into programs that can take advantage of it. Suddenly your low-end budget-box CPU can encode and decode multiple HD streams from your camera or apply special effects in realti
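To make the GPGPU point concrete, here is roughly what "taking advantage of it" looks like from the host side: a minimal OpenCL 1.x vector-square in C++ using the C API (an illustrative sketch with error checking omitted; the kernel and buffer names are made up for the example, and the default device on a Fusion part would include the on-die GPU):

#include <CL/cl.h>
#include <cstdio>
#include <vector>

// OpenCL C source for the kernel, compiled at runtime for whatever device we get.
static const char* kSrc =
    "__kernel void square(__global const float* in, __global float* out) {"
    "  size_t i = get_global_id(0);"
    "  out[i] = in[i] * in[i];"
    "}";

int main() {
    const size_t n = 1024;
    std::vector<float> in(n), out(n);
    for (size_t i = 0; i < n; ++i) in[i] = (float)i;

    // Grab the first platform/device and set up a context and queue.
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, NULL);

    // Build the kernel and bind the input/output buffers.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "square", NULL);
    cl_mem bufIn = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                  n * sizeof(float), in.data(), NULL);
    cl_mem bufOut = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY,
                                   n * sizeof(float), NULL, NULL);
    clSetKernelArg(kern, 0, sizeof(bufIn), &bufIn);
    clSetKernelArg(kern, 1, sizeof(bufOut), &bufOut);

    // Run n work-items and read the result back.
    clEnqueueNDRangeKernel(q, kern, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, bufOut, CL_TRUE, 0, n * sizeof(float),
                        out.data(), 0, NULL, NULL);
    printf("out[10] = %f\n", out[10]); // expect 100.0

    clReleaseMemObject(bufOut); clReleaseMemObject(bufIn);
    clReleaseKernel(kern); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}

The pitch for an APU with this kind of code is mainly that the host-to-device copies stop crossing a PCIe bus, so even the budget box gets useful parallel throughput.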
Re: (Score:2)
It might be a medium-performance GPU by today's standards, integrated with a medium-performance CPU.
Right now, AMD has (for instance) the following separate parts:
-the 5570 GPU, a midrange GPU with a 128-bit data bus. Complete cards use at most 43 watts (according to alternate.de)
-the Athlon X4 610e (e for "energy efficient"), a "Propus" quad-core with a 45-watt TDP.
Put both on the same chip, assume some more improvement in power consumption, and you might get something like the Llano. Except maybe for the shared memo
Re: (Score:2)
Which kind of leaves me wondering what the point is; the primary market is people who want to play games but not enough to actually buy a graphics card which can do so. Maybe it would be beneficial in the laptop market where many systems can't really play any games at all.
Naaaaah, that's not it.
Just kidding, of course that's it. It's like I was trying to say: the point is having performance around maybe a low-end discrete card, or at least that feature set, but with the economics and power usage of an integrat
Re: (Score:2)
No, think value desktops.
A suitable market, but not the primary thrust. Read my post.
With regard to Fusion generally; never, ever bet against integration.
Integration is even more important in the mobile market, for all the reasons I already discussed.
Re: (Score:2)
If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones).
Ha... my LG Blu-ray player has been in three times for warranty repair, and now we're waiting for a replacement unit because they can't fix it, while my $90 unknown-Chinese-brand Blu-ray player from Walmart works perfectly and is multi-region out of the box.
And our Ion MythTV frontend does have a fan to keep it cool when playing 1080p, but it's inaudible from the sofa.
Re: (Score:2)
If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.
And if you want a car that doesn't use gas, get a bicycle.
Re:"Alien vs. Predator" Movie or Video Game? (Score:5, Interesting)
I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.
AvP is a relatively modern game. Came out in the last year or so. It isn't mind-shatteringly amazing, but it looks pretty decent.
Traditionally, integrated graphics have done a lousy job with serious gaming on PCs. Basically any FPS has required a discrete 3D card.
If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.
But this chip doesn't look like it'll be replacing 3D cards for serious gamers anytime soon.
Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?
It's a desktop chip, so I can't imagine it'll do anything without a fan. Although the integrated graphics means that you wouldn't need a separate graphics card with its own fan. So it should be at least a little quieter.
Re: (Score:1)
If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.
Well, they always could, but they are too ill-informed to make a good choice for their needs; they see a half dozen Intel stickers on the front of some demo box and instantly start rationalizing that Intel makes the best computer, just like they play their Nintendo tapes and wear their Nike Airs while blowing their nose in a Kleenex and using Clorox.
Personally, I think there ought to be a survey for the people who just don't know, can't understand, or are overwhelmed by a metric fuckton of TOTALLY meaningless number thi
Re: (Score:2)
well they always could, but they are too ill informed to make a good choice for their needs
The problem hasn't really been one of information.
Until fairly recently, your average off-the-shelf computer shipped with very crappy graphics. If you just ordered whatever was on Dell's website or grabbed something from Best Buy it would have enough integrated graphics to run Windows and not much else.
Sure, you can generally customize them with a video card of your choice... At least if you're ordering on-line... But even then the offerings weren't terribly impressive.
And there really is a limit to how
Re: (Score:1)
And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.
No, just as I said earlier about the little survey: I look in the book hanging off the shelf to quickly decide what I need for a spark plug.
It's not about Joe Sixpack educating himself; it's about providing reasonable information in an easy-to-digest format.
Why do most computers come with a crap video card? Because most of them say "great for gaming" on the front and are pushed out en masse as cake, and people eat it up, not having a clue either way.
(And yes, the cake is a lie.)
Re: (Score:2)
And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.
no just as I said earlier about the little survey, I look in the book hanging off the shelf to quickly decide what I need for a sparkplug
The book hanging off the shelf... At the car dealership?
I'm not talking about replacing the spark plugs in a vehicle you already own. I'm talking about going out to the dealership and purchasing a car. I don't think most people check to see what type of spark plugs are factory installed when they go looking for a new car. Either they know enough to care, and they'll just replace them with their preferred type when they get the vehicle home... Or they don't know enough to care, and they'll just use what
Re: (Score:1)
Yeah, OK, I misread the example, but if my Kia Rio had a big red sticker on it that said "great for off-roading" when in fact it's not, there would be some heck to pay.
Why it's okay to falsely advertise or exaggerate features in a computer, when it's not in any other industry, is what I am trying to get at here.
It's become so confusing I can't even keep up with it; there are more meaningless, random, tell-you-nothing numbers on computers, and that needs to stop for everyone's mental stability.
To extend the car analogy... (Score:3, Insightful)
Those who are interested in fast cars will usually make sure to buy the biggest engine (GPU/CPU) they can afford. ;-)
Your average ricer kids (gaming nerds) are also likely to obsess about technical details and be at least somewhat well informed.
They might also decorate the car (PC) with lots of spoilers (LED-illuminated fans).
And then they go drag racing (comparing benchmarks
Re: (Score:3, Funny)
It worked, I have travelled back to the year 2000.
Re: (Score:2)
It worked, I have travelled back to the year 2000.
Hmmm...
Well, I assumed they were talking about the 2010 AvP game [wikipedia.org]. As that would make more sense (seeing as DX11 didn't even exist in 2000).
But I suppose it could be the 2000 AvP game [wikipedia.org]. In which case I'm less impressed.
Re: (Score:1)
AvP is a relatively modern game. Came out in the last year or so.
Actually, there are more than a dozen AvP games according to Wikipedia:
http://en.wikipedia.org/wiki/List_of_Alien_and_Predator_games [wikipedia.org]
I liked the 1999 AvP a lot, and at first glance I did not understand why they would showcase an 11-year-old game.
Now I know I will have to buy an AvP 2010 with my next AMD laptop ;)
Re: (Score:2)
Yeah. I just assumed they were referring to the 2010 version, as the earlier ones probably didn't feature DX11 graphics. But just saying "AvP" doesn't really clarify things much at this point.
The 2010 game is a mixed bag.
The marine campaign is a ton of fun. The predator campaign is fun, but doesn't make a whole lot of sense. The alien campaign was a disappointment.
The multiplayer can be fun, or frustrating, depending on the map and who you're playing as/against. Some of the maps seriously favor one rac
Re: (Score:1)
You want Ontario or Zacate (Bobcat-based APUs).
Both offer h.264-accelerated playback and are 9W or 18W.
I am seeing mixed info, actually, on what is available dual vs single core at what wattage.
The 9W part is definitely single core, faster than Atom, with a decent-ish GPU (accelerated video); the Zacate I think is 18W with a single core.
I imagine they will not quite double when going dual core (as the graphics part will not increase).
And they are supposed to have tech to completely shut down parts of the chip that ar
Re: (Score:2)
As for low-power, if recent experience is any judge, the power usage will be low only in comparison to quadruple Pentium IVs. Some cards last gen were 300+ watts TDP.
Re: (Score:1)
Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?
Shouldn't anything new?
Re: (Score:1)
I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be
Agreed. How many IoCs (Instances of Crysis) is that? Like 0.25?
Re: (Score:2)
Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?
1080p Blu-ray isn't that hard. If that's your primary interest, you don't need to worry about getting a fancy card that scores well in gaming tests; the two things (mostly) don't correlate. With AMD cards, you're looking for something that has UVD2 support. That's the clue that indicates a card can give you proper Blu-ray playback. And yes, that's plenty possible without a fan. For example, this oldish 4350 [awd-it.co.uk] should be fine for Blu-ray (it has UVD2.2, the latest) and is fanless. But I'm certain you could find thi
Physics and GPUs (Score:2)
Nvidia has their PhysX engine, and Intel acquired Havok. Bullet is exciting for me. It was used in Grand Theft Auto IV, and in the movie Hancock.
So for me, reading AMD, ATI, Bullet in the same sentence is the interesting part.
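Since Bullet keeps coming up: it is an ordinary open-source C++ library, and a minimal rigid-body "hello world" against its standard API looks something like this (a sketch; a single sphere dropped under gravity and stepped at 60Hz):

#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet world setup: collision config, dispatcher, broadphase, solver.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.8f, 0));

    // A 1kg sphere of radius 0.5, starting 10 units up.
    btSphereShape sphere(0.5f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody body(
        btRigidBody::btRigidBodyConstructionInfo(1.0f, &motion, &sphere, inertia));
    world.addRigidBody(&body);

    // Step the simulation at 60Hz and watch the sphere fall.
    for (int i = 1; i <= 60; ++i) {
        world.stepSimulation(1.0f / 60.0f);
        btTransform t;
        body.getMotionState()->getWorldTransform(t);
        printf("t=%.2fs y=%.3f\n", i / 60.0f, t.getOrigin().getY());
    }
    world.removeRigidBody(&body);
    return 0;
}

The GPU angle is that Bullet's maintainers have been working on OpenCL-accelerated solvers, which is presumably why AMD is showcasing it in a graphics-card tech demo rather than the CUDA-only PhysX.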
Will Apple use this new CPU with GPU built in? (Score:2, Offtopic)
Will Apple use this new CPU with the GPU built in?
They like thin and small, and Intel's new chip with a built-in GPU does not fit with Apple's tech, and Apple does not like putting add-on video chips in their low-end systems.
Re: (Score:1)
I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.
Also, don't forget the right
But the lack of OpenCL in Intel's chip is bad and (Score:2)
But the lack of OpenCL in Intel's chip is bad. And do you want a $700 desktop with Intel video?
A $1200 laptop? A $1500 laptop?
Also, the 320M they have now is better than what Intel's new chip can do.
Re:Will Apple use this new CPU with GPU built in? (Score:5, Interesting)
I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.
Also, don't forget that right now AMD has the Phenom, which is a good chip, and Intel has their current Core line, which is an amazing line of chips. To go to AMD means sacrificing performance/watt on the CPU side.
Two years ago maybe it would have mattered. Today? Too little too late.
Being a former NeXT and Apple engineer, I can tell you unequivocally that your thought is bullshit. Intel gave NeXT practically zero information for the NeXTStep port to Intel. Apple designs around Intel specs and Intel helps as with any other OEM. No special treatment.
Re: (Score:2)
I've been hearing that Intel's latest graphics are finally pretty good for over a decade. At this point, they could release a graphics chip so amazing that each polygon gives me twenty dollars and a blowjob, and I'd still make a careful point of never using Intel graphics, no matter the cost. It's like the boy who cr
The article got it wrong (Score:5, Informative)
APU doesn't stand for Applications Processing Unit; it's an acronym for Accelerated Processing Unit.
http://sites.amd.com/us/fusion/apu/Pages/apu.aspx [amd.com]
"The GPU, with its massively parallel computing architecture, is increasingly being leveraged by applications to assist in these tasks. AMD software partners today are taking advantage of the GPU to deliver better experiences to across an ever-wider set of content, paving the way to breakthrough experiences with the upcoming AMD Fusion Family of Accelerated Processing Units (APU)."
Answering "Why is it so underpowered?" . . . (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
Typo in summary,
ATI makes Radeon. Not AMD.
Posting anonymous for obvious reasons.
Because otherwise everyone would laugh at you for not realizing AMD's ATI brand is on the way out? Don't worry, we'll do that anyway.
Re:Typo in summary, (Score:5, Informative)
No, AMD makes Radeon and has done for years.
They've _branded_ them ATI since the buyout, but even that has changed now and future parts (which is what these are) will be AMD branded.
Mod Parent Up (Funny) or Down (Troll), Please (Score:2, Offtopic)
AMD buys ATI. ATI animates Lizard. Lizard bites Spock. Spock buys nVidia. nVidia dominates ATI.
Re: (Score:1, Informative)
Re: (Score:2)
No, AMD makes Radeon. ATI was acquired. Accept this thing called reality. AMD kept the brand separate for a while (despite ATI not actually existing) because it was worth more as a separate brand. Times change. As AMD gets more and more into hybrid APUs, it makes less sense to have a separate fake company name on some (but not all) of its video cards.
The line between CPU and GPU will become very blurry over the next decade. AMD wants you to know they make it all. Dedicated CPU, dedicated GPU, low-power APU
Re: (Score:2)
Nice try, but for all intents and purposes, ATI does not even exist as a company, at least not as an independent one. It was bought by AMD some time ago, in the range of 3-5 years ago, I think. So anything sold under the ATI moniker is in fact made by AMD. AMD is killing off the ATI name, and is now beginning to label their graphics line strictly as AMD products.
Why did you post anonymously? If you had been right (you weren'
Re: (Score:2)
You must have not been paying attention when AMD bought ATI. You also must not have tried to download Radeon drivers and realized that you were at AMD's site.
LK
Re: (Score:2)
Nope, trolling. Way to feed 'em.
Re: (Score:2)
I know you are a troll and this is very much OT...
But to think that all things that exist had to have been consciously thought of is the foolish thing, my friend. Look around: do you think this is the best that a perfect being could have come up with?
Your own argument ('how wonderful things are') is the argument I use against you. Things are quite shitty 'down here' and there is no sign of anyone in control. A supreme being would probably not be an absentee ruler, and yet that's exactly what your religion wou
Re:Useless resolution/performance measure (Score:4, Insightful)
That's at least 90% of all the games in history released for PC on an integrated graphics processor. Pretty amazing if you ask me.
Re: (Score:2)
Re: (Score:2)
My MSI laptop with an ATI X200 (well, it says X1150 on the tin, but that is a rebadged X200) has been running Linux since 2007 (when I got it, actually). Compiz worked flawlessly, with the 3D cube nonsense and all, in the then-recent Ubuntu (don't remember if it was 7.04 or 7.10).
Yes, ATI drivers for Linux were hell back in 2004 with my 9600 Pro, but for the last few years it has all worked.