ARM VP To Keynote AMD Developer Conference
MojoKid writes "AMD is hosting its first AMD Fusion Developer Summit (AFDS) this summer, June 13-16. The conference will focus on OpenCL and the performance capabilities of AMD's upcoming Llano chips under various usage models. One interesting twist is that the keynote address will be given by Jem Davies, currently ARM's VP of technology. To date, AMD's efforts to push OpenCL as a programming environment have been limited, particularly compared to the work NVIDIA has sunk into CUDA. With its profit margins and sales figures improving, AMD is apparently turning back to address the situation, and ARM is a natural ally. The attraction of OpenCL is that it can potentially be used to improve handheld device performance. AMD's explicit mention of ARM hints that there might be more than meets the eye to this conference as well."
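For anyone who hasn't touched OpenCL: its pitch is a single C host API that enumerates whatever compute devices the installed drivers expose, whether that's an AMD GPU, an x86 CPU, or an ARM part. A minimal enumeration sketch, assuming an OpenCL SDK that provides CL/cl.h (link with -lOpenCL); error handling trimmed for brevity:

    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        /* Ask the ICD loader for every installed OpenCL platform. */
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            fprintf(stderr, "no OpenCL platforms found\n");
            return 1;
        }
        for (cl_uint i = 0; i < nplat; i++) {
            char name[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof name, name, NULL);
            printf("platform %u: %s\n", i, name);

            cl_device_id devs[8];
            cl_uint ndev = 0;
            if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL, 8, devs, &ndev) != CL_SUCCESS)
                continue;
            for (cl_uint j = 0; j < ndev; j++) {
                char dname[256];
                clGetDeviceInfo(devs[j], CL_DEVICE_NAME, sizeof dname, dname, NULL);
                printf("  device %u: %s\n", j, dname);
            }
        }
        return 0;
    }

The same binary would list a Llano APU or an ARM vendor's device alike, provided each ships an OpenCL driver, which is presumably the sort of cooperation the summit is about.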
Re:Goodnite x86 (Score:4, Informative)
I've been hearing this drivel for years.
It's beaten out other techs for a reason. ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.
Re: (Score:3)
No, ARM will not replace x86 on the desktop. The point is to keep x86 away from other platforms. The desktop is what it is: a Swiss Army knife, where a powerful CPU with a big feature set is called for because the system will do a little of everything. It's also true that different feature sets are needed by different users, but packing it all into a single one-size-fits-all chip is the way to go, because the incremental cost of adding features some won't use is less than that of making a wider array of products. Elect
Re: (Score:2)
I think the only one of those that would stand a chance is Alpha... *MAYBE* MIPS.
Sparc? Doubt it. It's nice for highly parallel tasks, but not so great for high-demand single-thread, which matches a lot of desktop apps.
Re: (Score:2)
RISC architecture is going to change everything.
(I couldn't resist)
Re: (Score:2)
s/is going to change/has changed/
Given that, to my knowledge, all x86 CPUs are internally RISC, that change may be necessary.
However, there's nothing wrong with exposing the x86 arch while internally using their own, separate instruction set; this allows a model that can perform standard optimizations, keeping programmers from having to worry about implementing them themselves. It also allows changing the executed instruction set to improve performance without causing problems for existing appl
Re: (Score:2)
x86 is ill suited to mobile devices for all the same reasons its great on desktops and laptops that can afford large heavy batteries, and are generally used where AC is available.
This is complete BS. As ARM starts to match the performance of x86, the power draw is beginning to match too. If Intel sticks it out for another generation or two, I'm betting Atom will absolutely dominate the power/performance curve, because there isn't anything fundamental in x86 that makes it draw more power. If anything building
It's not that it is better technically (Score:3)
Re: (Score:3, Interesting)
It's beaten out other techs for a reason. ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.
And what do you think this reason is?
Hint: it's not technical superiority.
Phillip.
Re: (Score:2)
Adaptability and technological superiority for the tasks on a desktop.
So, yeah, it is, in part, technological superiority.
Re: (Score:2)
> ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.
That's a beginning: when someone uses an iPad instead of a PC, they "replace" an x86 with an ARM.
Currently iPads and the like are mostly PC 'companions', but it wouldn't surprise me if they became PC replacements for many users.
Re: (Score:2)
Which features is the ARM architecture missing compared to x86?
Re: (Score:1)
Features of x86 that are currently missing in ARM? How about out-of-order execution, 64-bit operation, speed boost (some cores shut down to let other cores run faster), and a top-end speed around 3GHz just to name a few.
Of course the lack of those features lets it run cooler which makes the ARM processor ideal for low-power applications like cell phones.
Re: (Score:2)
Except for 64-bit operations, those features have nothing to do with x86 vs ARM. They're part of the microarchitecture. And by the way, the Cortex-A9 does support out-of-order execution.
Re: (Score:3)
Features of x86 that are currently missing in ARM? How about out-of-order execution,
The Cortex-A9 is out-of-order.
64-bit operation,
They indeed don't have 64-bit ALU or memory-addressing support yet.
speed boost (some cores shut down to let other cores run faster),
That's unrelated to the architecture. And at least NVidia's Tegra dual-core CPUs shut down one of the two cores if it's not in use; I don't think they automatically overclock the other one to run faster when doing so, though (the sysfs sketch below shows how to watch cores go on- and offline).
and a top-end speed around 3GHz just to name a few.
Yes, in absolute performance per core they are still trailing x86. I was mainly reacting to the "anorexic featureless simpleton CPU" remark with my question though.
Of course the lack of those features lets it run cooler which makes the ARM processor ideal for low-power applications like cell phones.
And server farms [eetimes.com].
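For what it's worth, the core-shutdown behavior mentioned above is observable from Linux userspace through sysfs. A rough sketch, assuming the standard /sys/devices/system/cpu layout (the actual hotplug policy is up to the kernel and the SoC vendor):

    #include <stdio.h>

    int main(void) {
        char path[64];
        for (int cpu = 0; cpu < 16; cpu++) {
            snprintf(path, sizeof path,
                     "/sys/devices/system/cpu/cpu%d/online", cpu);
            FILE *f = fopen(path, "r");
            if (!f)
                continue; /* core absent, or not hot-unpluggable (often cpu0) */
            int online = 0;
            if (fscanf(f, "%d", &online) == 1)
                printf("cpu%d: %s\n", cpu, online ? "online" : "offline");
            fclose(f);
        }
        return 0;
    }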
Re: (Score:2)
Server farms? Kind of a bitch without ECC but I guess it works in some cases. Granted, what do I know, maybe they will be slapping ECC support into them.
Why do you believe ARM doesn't support ECC right [mvdirona.com] now [google.com]?
Re: (Score:1)
It's missing various addressing modes, some tiny 8-bit instructions, a separate I/O address space, and probably more. Depending on what you do, however, missing these features may itself be a feature. ;)
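To make the separate-I/O-address-space point concrete: x86 has dedicated IN/OUT instructions addressing a 64K port space, while ARM memory-maps its devices instead. A sketch using GCC inline assembly; x86/Linux only, and it needs root for the ioperm() grant:

    #include <stdint.h>
    #include <stdio.h>
    #include <sys/io.h>   /* ioperm(); Linux/x86 only */

    /* Write one byte to an x86 I/O port; ARM has no equivalent instruction. */
    static inline void port_outb(uint16_t port, uint8_t val) {
        __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
    }

    int main(void) {
        /* Port 0x80 is the traditional POST diagnostic port; a write to it
         * is harmless and classically used as a tiny delay. */
        if (ioperm(0x80, 1, 1) != 0) { perror("ioperm"); return 1; }
        port_outb(0x80, 0xAB);
        puts("wrote one byte to I/O port 0x80");
        return 0;
    }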
Re: (Score:2)
I agree that has been extremely important until now. As the success of Android and iOS devices demonstrates, though, it's becoming much less so, even in the consumer space. Apple has already pulled off migrations from one architecture to another twice, quite successfully (although in those cases the increased performance of the new architecture mattered a lot). And in the server space, the underlying architecture is almost irrelevant.
Re: (Score:2)
I feel like x86 compatibility itself doesn't matter anymore either. The majority of users seem to depend on only a very small number of applications. You pretty much get all the average folk with a web browser, Flash, Microsoft Office, and maybe iTunes. Adobe, Microsoft, and Apple have demonstrated a willingness to work with whatever platforms are popular.
There are certainly large niches that matter too, like video games, but would companies/developers for those applications hold things up? I don't know.
Re: (Score:3)
With the plethora of JIT compiled languages doing high-end tasks today, and the increasing number of cross platform/arch libraries, I'm not sure that x86 compatibility is such a killer.
Re: (Score:2)
Actually, it wouldn't take much work to make ARM a good server CPU.
Most servers are better served by a lot of mediocre or slightly-below-mediocre cores than by one or a few heavier-duty cores. ARM's low power draw and high performance-per-watt make it a very likely contender for the server market in the next couple of years, if it is developed properly.
For the end-user segment (desktops and notebooks), where low-thread-count brute force tends to be the more relevant factor, ARM isn't as good a choice.
Proper Linux Support? (Score:3)
How about spending a few engineering dollars and releasing GOOD, well-documented drivers? I'm a regular reader of the XBMC forums [xbmc.org], and anyone who wants to use Linux more or less needs to buy Nvidia hardware.
I'm not in the 'anti-closed-binary' camp, I just want the best tool for the job. Nvidia provides great CUDA and VDPAU support and it more or less 'just works'. ATI & Intel decided to jump on the Linux bandwagon by opening everything up, and so far it seems like the community really hasn't taken them up on it. I paid money for your hardware; why not pay an engineer to write software I can actually use?
Say I go car shopping and the sales associate shows me two cars. One is completely built, works well enough, and has good factory support, BUT I'm not allowed to modify it. The second one is actually just in a crate. It comes partially assembled... but don't worry: there is complete documentation for every single loose part and instructions on how to put it together. And the two cars cost nearly the same.
I'm going to choose the first car. My time IS worth something, and I'd rather have something I can't modify but that works great as-is (NVidia's drivers) than something that really is useless unless I, or someone else, uses the documentation to do something with it (ATI). Especially when the hardware costs are nearly the same.
Re: (Score:3)
I'd be satisfied if ATI would release enough information to support their hardware. I have a netbook based on the R690M chipset and an Athlon 64 L110, and the graphics only work correctly under Vista. They limit me to one suspend-resume cycle under Windows 7, suspend never resumes properly under Linux, and I get massive graphics corruption in Linux even with RenderAccel disabled. Further, power saving doesn't work properly anywhere but Vista; I have a five-hour battery, and get about 4:30 in Vista (no crap) but about 3
Re: (Score:2)
I replaced the hard drive with an SSD, which stopped the thing's exhaust from scorching me and extended the battery life a bit... now I have an overpriced, overweight, but moderately nice-to-use-in-dim-light ebook reader. In tablet mode a single page of a magazine like SciAm fits perfectly on the screen and is still readable, so it pretty much eliminates scrolling.
It'
Re: (Score:2)
Are you sure the problem isn't simply that the manufacturer of the laptop produced rubbish?
Yeah, I'm sure, because it works 100% in Vista. Which is to say, even slower, but literally everything on the machine works great. The problem is not Gateway; IIRC this is really an Everex anyway. The problem is AMD. I knew better and bought something with ATI graphics, and now I am paying the price. Based on my experience with the Geode, though, I fully expected the processor to have good Linux support, which was not the case at all.
Between that and their lack of proper, timely support for K10 (hello, there's a
Re: (Score:2, Informative)
That second car does not come with all the docs, or someone would build it. AMD leaves out anything relating to video acceleration, for example. This is to protect their Windows DRM, meaning that I, a Linux user, suffer because of Windows DRM.
Re: (Score:2)
Then can't you just buy a different video card (or different laptop with built in video from another company), and let the market decide?
Re: (Score:2)
Decent Binaries?
Since when? Try using an ATI card to run games in Wine, or to do anything particularly OpenGL-heavy, and watch what happens.
Re: (Score:1)
I do watch what happens, sometimes for long periods of time. What happens? My game runs just fine, often even better than in Windows.
Re: (Score:1)
I am annoyed (under Linux) that support for video acceleration sucks or does not exist. Flash, and even just web browsing with Chrome and Firefox 4, is painful on that platform as a result.
I switched back to Windows 7 and use a VM for Linux programming for serverish things. Adobe maintains they will not support hardware acceleration at all for any Intel or ATI products, because the drivers are hacks and scripts and are not professional-grade like their Mac OS X and Windows counterparts.
Under Windows 7 I like my ATI
Re: (Score:2)
Define OpenGL-heavy. The drivers typically fall flat on their face compared to NVidia's when your developer "oopsed" something in a shader or a call; basically, the drivers are less tolerant of coding errors than NVidia's. As for WINE... I don't know, it's been a while since I've tried doing much in it. I port titles after hours, so I tend not to rely on band-aids to get games to play... ( :-D )
Re: (Score:2)
Funny, but the FOSS community has said for years that "if they just documented the chips, we would write the drivers." What it comes down to is money. Very few people buy hardware to run Linux on. Most people buy hardware to run Windows on. Most resources go where most of the profit comes from. From what I have heard, ATI drivers have gotten much better lately.
Re: (Score:1)
That's because AMD has only faked releasing documentation. They have released the most basic, borderline useless parts.
And as promised, the XOrg radeon team already implemented that, plus a lot more.
As can be seen here: http://xorg.freedesktop.org/wiki/RadeonFeature [freedesktop.org]
If they release stuff that actually documents the 3D and video stuff, instead of just basic mode setting & co, then we'll fix the bits that are missing too.
But it's so nice of this whole thread with all parent posts and most sibling posts, to just spew
Re: (Score:2)
Uh, no... It's more because they're trying to get the INFRASTRUCTURE right, so that you have no avoidable bottlenecks in the rendering path. You're watching the devs work toward what NVidia and AMD have already had 10+ years to do, with only a couple of years at it so far. It's NOT like the old drivers, where you just needed to know how to submit vertices and textures to the rasterization engine quickly, as with the RagePRO, Rage128, and G200/400 cards (I should know about BOTH classes of hardware..
Re: (Score:2)
I use AMD Catalyst with XBMC. I have the VA-API implementation and it works all right.
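If anyone wants to check that the VA-API path is alive on their box before pointing XBMC at it, a minimal probe looks roughly like this (assumes libva built with X11 support; link with -lva -lva-x11 -lX11):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <va/va.h>
    #include <va/va_x11.h>

    int main(void) {
        Display *x = XOpenDisplay(NULL);
        if (!x) { fprintf(stderr, "no X display\n"); return 1; }

        /* Bind libva to the X display and load the vendor driver. */
        VADisplay va = vaGetDisplay(x);
        int major = 0, minor = 0;
        VAStatus st = vaInitialize(va, &major, &minor);
        if (st != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed: %s\n", vaErrorStr(st));
            return 1;
        }
        printf("VA-API %d.%d, driver: %s\n", major, minor,
               vaQueryVendorString(va));
        vaTerminate(va);
        XCloseDisplay(x);
        return 0;
    }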
Re: (Score:1)
Nice try, Intel.
Re: (Score:2)
I have to say that is the general feeling I'm getting about a lot of stuff.
I want to play around a bit with GPU programming for scientific calculation and the feeling I am getting is that my choices are either Linux with NVidia or Windows with ATI (or NVidia).
It's pretty much the same with Linux itself. I talked the wife into using Linux instead of Windows on her desktop and netbook but that pretty much meant Ubuntu. I don't want to have to maintain two different flavors so that means I run Ubuntu too. M
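Apropos of the GPU-programming point above: the device-side source in OpenCL is the same no matter whose stack compiles it at runtime, which is why the vendor choice is, in theory, only a driver-quality question. A toy SAXPY kernel as a sketch, the kind of primitive scientific codes start from:

    /* OpenCL C: y = a*x + y, one work-item per element. The host is
     * expected to launch this with a global work size of n. */
    __kernel void saxpy(const float a,
                        __global const float *x,
                        __global float *y)
    {
        size_t i = get_global_id(0);
        y[i] = a * x[i] + y[i];
    }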
Re: (Score:1)
http://www.moviewavs.com/php/sounds/?id=bst&media=MP3S&type=Movies&movie=Star_Wars&quote=power.txt&file=power.mp3 [moviewavs.com]
Re: (Score:2)
Considering that "NVidia's drivers work well" is a your-mileage-may-vary-considerably claim (Fermi not being well supported under Nouveau, NVidia dropping 2D driver support and pointing people to Nouveau for initial bring-up, and select Fermi chips, the GT440 for example, NOT being supported), you MIGHT just want to moderate your remarks on that score.
Re: (Score:2)
What if one car is completely built, works 'well enough', and has good factory support, but the other one offers 30% more performance, has all-wheel drive (VT), and more than twice as much storage space (8-16 GB RAM support), though it needs new glow plugs (you can start it, but only in warm weather)?
That's the dichotomy of Atom vs. Bobcat, not what you propose.
ARM and AMD (Score:5, Informative)
Re: (Score:2)
That makes no sense. I am setting up a new box and will have to pay more for a worse NVidia card to get the performance I want, since I run Linux.
All my other machines use Intel graphics, but I want to game on that one.
Re: (Score:2)
I love AMD CPUs, and the next box will be an AMD CPU and an NVidia GPU. I need working drivers.
Re: (Score:2)
Last time, they were a good enough reason for me to get a new motherboard and nVidia graphics card. Boy, I really wish their drivers did not suck so much, since they've got the more interesting hardware. I had good 3D, good video, but they would just not work at the same time. Then you start to tinker, and before you know it you've got an X configuration that even I could not fix (and I've got some 17 years of experience). Their multi-monitor support wasn't up to par either, and that's stuff I *NEED* and expect to wor
2012 (Score:2)
Re:2012 (Score:4, Funny)
No, 1987 was the year of ARM on the desktop.
Re: (Score:2)
I must be an early adopter, I've already got an arm on my desktop.