AMD and Intel Update CPU Roadmaps
vincecate writes "Recently
AMD updated their processor roadmap. It shows their move to 90 nm and has a range of new processors over the next 1.5 years, including dual-core chips. An
unofficial AMD roadmap shows speeds and performance increasing.
Intel also recently updated their roadmap.
Intel does not show anything faster than
the current 3.6 GHz in the next 11 months, including the recently delayed 4 GHz chip, except to say '3.6 GHz or greater.' Strangely, some of the recent SPEC benchmark results show the 3.6 GHz chip to be slower than the 3.4 GHz chip. One possible explanation for this is that the 3.6 GHz chips will slow down due to 'thermal throttling' if you are not very careful to keep them cool. So it seems like heat may be the reason Intel's roadmap does not show much improvement."
Water cooling? (Score:5, Interesting)
Re:Water cooling? (Score:2, Interesting)
I guess, though, at some point in the future water cooling will have to be implemented in some form.
Re:Water cooling? (Score:4, Interesting)
Prescott in general has had more than its fair share of problems. Prescott is a massive CPU with a 31-stage pipeline, compared to the older P4's 20 and the Athlon XP's 12. I'm not sure off the top of my head how many stages the Athlon 64 has.
All this extra complexity is supposed to make it easier to clock up the processor, and was the same trick Intel used to gain clock speed from the PIII to the P4, so the marketing folks said "Do it again."
Of course the biggest reason why Intel doesn't show many (or any) speed increases is that they've scrapped all their future P4/Prescott-based designs, even projects that were close to or already completed, because of the problems they have had with Prescott. Intel's plan is to rework their Centrino/Pentium-M core into a desktop chip, but that will take several years.
Re:Water cooling? (Score:2)
Right. Identified by a "below-spec" sticker on them I assume. What's the "and such" specifically?
Using Pentium-M as a desktop processor won't take several years.
Re:Water cooling? (Score:4, Informative)
That's the problem Intel has right now, really. Marketing seems to say, "Make it sound faster," only looking for good warrior CPUs in the Megahertz Wars. IBM/Apple and AMD have not been trying to go for faster clock speeds but instead for faster CPUs.
Such long pipelines as the Prescott line's may help achieve higher clock speeds, but 31 stages means that you'll see more pipeline stalls, so your CPU is happily running at higher clock rates, doing nothing. Of course, not all instructions actually have to go through all 31 stages, but still, it's impractical to have so many stages in an architecture when you know that every so-many-but-fewer-than-31 instructions you're going to hit a branch. Not to mention the additional complication for the on-die dependency tracking that you need in out-of-order cores like Prescott.
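The stall arithmetic above can be sketched with a toy model. All figures here are illustrative assumptions, not measured Prescott data: assume a mispredicted branch costs roughly one pipeline refill (about the pipeline depth in cycles).

```python
# Toy model: a mispredicted branch flushes the pipeline, costing roughly
# one refill (~ the pipeline depth) in cycles. The branch frequency and
# misprediction rate below are illustrative assumptions.
def effective_ipc(pipeline_depth, branch_freq=0.20, mispredict_rate=0.05):
    """Instructions per cycle, counting only branch-miss stalls."""
    cpi = 1.0 + branch_freq * mispredict_rate * pipeline_depth
    return 1.0 / cpi
```

Under these assumed numbers, a 20-stage pipe lands near 0.83 IPC and a 31-stage pipe near 0.76, so at an equal clock the deeper design retires several percent fewer instructions; it has to clock higher just to break even.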
Of course in-order architectures with full predication ISAs would solve some of the problems with longer pipelines, but I guess we can't say that this other Intel architecture, ia64, is such a great success ;-)
Re:Water cooling? (Score:3, Informative)
Re:Water cooling? (Score:5, Informative)
Another disadvantage of this high heat production is that other core components in the computer (such as the mainboard) will be exposed to more heat as well, hence the durability of these components will be lower.
If Intel and AMD continue to approach Itanium's heat production, water cooling or similar technologies will become mandatory for high-end processors.
Re:Water cooling? (Score:5, Informative)
AMD used to have a high-heat reputation and used to be known for difficult-to-overclock processors. Honestly, I don't think that is nearly as much the case with their newer processors. The Athlon64 seems to run fairly cool, plus it supports frequency scaling when it isn't busy (note - the 55C figure I gave was under heavy load for considerable time - no scaling in effect). Right now, I'm typing on the machine and the CPU is reading 37C - only 1.5C higher than case temperature.
I think AMD is actually passing Intel in this respect. Intel had better watch out if they expect year-long delays - eventually AMD will be releasing 3-4GHz Athlon 64's and they'll be FAR faster than anything Intel currently has...
Apple's premiums can handle the extra cost... (Score:3, Insightful)
...but maybe the cheaper PCs cannot?
Also, a liquid cooler is probably inherently harder for Intel to package with an OEM processor. Affixing a liquid cooler to a processor requires more case-aware design than simply clipping a fan to a mainboard socket.
Re:Water cooling? (Score:2)
Is the processor clock rate trend coming to an end (Score:4, Interesting)
Don't really keep up on the hardware these days..
Cheers,
Simon.
Re:Is the processor clock rate trend coming to an (Score:3, Interesting)
I built my system about two years ago (actually it's a few months short of two years). AMD would have to release the equivalent of a 5600+ within a few months to keep the pace set by the 2800+ they released almost two years ago.
If they were a few months late that would be normal but it looks like it will take far longer.
Re:Is the processor clock rate trend coming to an (Score:4, Interesting)
Are the new processors really much faster?
Re:Is the processor clock rate trend coming to an (Score:3, Interesting)
But then again, your upgrade cycle seems to be longer than mine. I think it all depends on whether or not you want to run some apps that require a more powerful system, or one that actually runs them a good deal faster.
Re:Is the processor clock rate trend coming to an (Score:3, Interesting)
Don't get me wrong... I still love my Laptop! :-)
The important concept to keep in mind is that all these computers are powerful enough to do what I need them to do, so merely making CPU clocks tick at a higher rate isn't going to
Re:Is the processor clock rate trend coming to an (Score:2, Informative)
The Celeron is a severely crippled chip, unlike the Duron, which is a respectably performing budget processor. It only has 128KB cache, which is CPU suicide on a P4 core. The P4 needs large amounts of cache to keep its long pipeline filled. People who buy a high clock speed Celeron, thinking they're getting a fast CPU, are getting massively screwed by Intel. So much so it borders on being an unethical and immoral business
Re:Is the processor clock rate trend coming to an (Score:2)
I'm hoping that because of this 90nm barrier (or pause, or what have you) the advent of dual-core chips actually comes around this time. There have been many promises and comments, from the G3 750FX a few years ago up through to today, of chip manufacturers turning to dual-core CPUs.
I'd r
Re:Is the processor clock rate trend coming to an (Score:2)
That's the roadmap that I'm going to be following for any workstation that requires it. The downside is the cost, but from all reports it makes the operating system and everything much more responsive. Still, you know what they say: the only thing faster than X is *two* of X.
I'm hoping for dual-core as well... but unless they break the 90nm issue, are they really going to have room on the die for a 2nd core?
Re:Is the processor clock rate trend coming to an (Score:2, Interesting)
The next market force is competition. If AMD looks like it will be selling a 4000+ Intel will match that.
Processors capable of this speed are most likely possible. There's no way Intel can sell an
Re:Is the processor clock rate trend coming to an (Score:4, Informative)
I was talking to my friend about this the other day, and we think that eventually they cannot go that much faster (well, maybe have a SMALL core of the chip that can go faster), and they'll start stacking in parallel instead. I.e., massively hyperthreaded processor cores. So maybe in a few years we'll see 6 GHz chips with 8 or 16 hyperthreaded processors?
We're physicists, though, not engineers, maybe there are some other clever ways to keep pushing the envelope?
Re:Is the processor clock rate trend coming to an (Score:2)
Re:Is the processor clock rate trend coming to an (Score:2)
Re:Is the processor clock rate trend coming to an (Score:3, Informative)
Photons are the mediating particles of electromagnetic force, and it's definitely this force that couples two electrons together, or the electrons to the 'holes' in the doped semiconductors, etc etc. An elementary description of
Re:Is the processor clock rate trend coming to an (Score:2)
Whew (Score:4, Funny)
And here I was, afraid that they had decided to not increase speeds and performance. That was close.
Re:Whew (Score:3, Funny)
So I'm screwed? (Score:4, Interesting)
Re:So I'm screwed? (Score:4, Insightful)
Recent price drops of the Athlon 64 3000+ make Socket 754 solutions attractive price-wise.
Re:So I'm screwed? (Score:2)
Re:So I'm screwed? (Score:5, Interesting)
Re:So I'm screwed? (Score:2)
Re:So I'm screwed? (Score:5, Insightful)
Re:So I'm screwed? (Score:2)
Are you aware that means either your case temperature reading or the cpu temperature reading is wrong? Fans are not heat pumps.
cpu draws mainly outside air? (Score:2)
Many people place an intake fan right outside the CPU to feed the CPU outside air (with a slot fan or normal case fan), so it is quite plausible that the CPU temp is under the average or representative case temperature. Especially since the exhaust from the CPU fan is usually directed into the case and further heated by hard drives, GPUs, etc.
Granted, the output air temp cannot be lower than the intake, but there is no reason to assume that the intake temp is th
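The intake-versus-exhaust point above can be put in numbers: air crossing a cooler warms by dT = P / (mass flow x specific heat). The wattage and airflow figures below are illustrative assumptions; the air constants are standard.

```python
# Air picks up heat as it crosses the CPU cooler: dT = P / (m_dot * c_p),
# with c_p(air) ~ 1005 J/(kg K) and density ~ 1.2 kg/m^3.
# The wattage and CFM figures used with this are illustrative assumptions.
CFM_TO_M3S = 0.000471947  # cubic feet per minute -> cubic metres per second

def exhaust_rise_c(watts, airflow_cfm):
    """Temperature rise (deg C) of the air stream carrying away `watts`."""
    mass_flow = airflow_cfm * CFM_TO_M3S * 1.2   # kg of air per second
    return watts / (mass_flow * 1005.0)
```

For example, roughly 90 W pushed through 30 CFM warms the stream only about 5 degrees C, so the exhaust is always warmer than the intake (fans are not heat pumps), but not by much if airflow is decent.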
Re:So I'm screwed? (Score:2)
Re:So I'm screwed? (Score:2)
Re:So I'm screwed? (Score:2)
Re:So I'm screwed? (Score:2)
For example, I've got an EPOX motherboard ( can't remember which one ) and an Athlon 2100XP. It's got a 266MHz FSB ( from memory - I may be wrong ). I'm pretty sure if I wanted to, I could put a 3200XP chip into it. But at the moment, I really don't see the point. This runs all games quite well indeed ( partly due to my video card of course ). I d
Re:So I'm screwed? (Score:2)
Weird I have a 2100 and epox MB, wouldn't be the 8rda+ would it ?
Anyway you are absolutely right, most upgrades are short-term temporary solutions; eventually you have to bite the bullet and upgrade the whole MB, CPU and RAM combo.
Plus if you are a gamer, you really should wait till next year before spending any money, with PCI Express just around the corne
The P4 heat problems (Score:2)
Sure, but this is just another form of performance problem. That it takes the form of heat is just a sign of a design too inefficient for the speed. I'm of the
Is it just me or are people stupid these days? (Score:5, Informative)
What I don't understand is why more people aren't building Pentium M desktops.
Re:Is it just me or are people stupid these days? (Score:5, Insightful)
Re:Is it just me or are people stupid these days? (Score:2, Insightful)
Since it's already a cutthroat pricing market, I guess PC makers don't see the need to up the cost of a PC by putting in a Pentium M chip when they can use a P4. Even if the P4 does use more power.
Re:Is it just me or are people stupid these days? (Score:5, Interesting)
Re:Is it just me or are people stupid these days? (Score:2, Insightful)
most importantly the 3.4GHz one on the quoted SPEC page has 2MB of L3 cache, but the 3.6GHz one has none.
Correction (Score:2)
Re:Is it just me or are people stupid these days? (Score:2)
I modded you flamebait because that isn't true. The Pentium M is comparable in performance to the P4.
I thought the Pentium was out in speeds up to 3.4 gigahertz. I thought the 3.4 GHz desktops and mobile chips running at up to 2.0 GHz would have been informative, not flamebait. I guess I unintentionally started a flame war, and so my first comment is flamebait.
A clip from the Intel site regarding the Pentium M..
Processor, Intel® Pentium® M Processor.
*sigh* (Score:5, Insightful)
We're obviously starting to see a convergence between the industrial processor market and the end-user one. I mean, three years ago you would get a dual 3.2GHz (1.6 * 2) system to host a medium-sized website, and that kind of horsepower is probably still adequate today. So what kind of apps (I mean, apart from Doom 3) do end users need this kind of grunt for? 3GHz? 3.6GHz? 4GHz?! If architects could use AutoCAD 2000 on a 950MHz CPU without complaint, what has changed? Obviously a speed increase is nice, but three or four times that?
Are we going to see a point where the convergence turns to overtaking, and end-user CPUs need to be faster than a lot of corporate stuff?
p.s: I'm aware of shit.slashdot.org, no karma whores please.
Re:*sigh* (Score:2)
Re:*sigh* (Score:4, Insightful)
Didn't that skin make it hard to see the colour scheme?
But back to the actual topic...
CPU speeds are just stupid for most people. I write code for a living (at the moment anyway) and the 400Mhz "Mobile PII" I'm using at the moment is adequate for that.
I understand that if your job involved compiling Mozilla or X11 a lot then more CPU (and more importantly more and faster RAM) would make you more productive.
I understand that if you are doing computer generated animation or physics modelling or whatever then more grunt would make you more productive.
But for most people, computers were fast enough a long time ago. This 400Mhz laptop is my fastest machine - both home computers are 300Mhz or so. One of them runs Windows XP Pro just fine, runs Office just fine, runs firefox and IE just fine, runs gimp just fine and even does all of those at the same time just fine.
Every time someone asks me for advice on buying a computer, I ask "do you want to play games?", and if the answer is no then my answer is "buy the slowest CPU model they sell and spend any extra money on RAM".
Re:*sigh* (Score:2)
Every time someone asks me for advice on buying a computer, I ask "do you want to play games?", and if the answer is no then my answer is "buy the slowest CPU model they sell and spend any extra money on RAM".
Another good thing to spend the extra money on is monitor, keyboard, mouse - the things you actually interact with. I've seen people pay through the nose for the absolute latest Pentium, then buy a shitty mouse with it...
These are things you look at and touch all the time when using the computer, i
Re:*sigh* (Score:2)
In fact I'm using my laptop right now, because the significantly better desktop machine at the in-laws place (which is where I am) has an especially bad mouse (and to make it worse not enough "mousing space" on the tiny desk) - so bad that my laptop touchpad is nicer to use...
Re:*sigh* (Score:3, Insightful)
Re:*sigh* (Score:2)
[Aside: I've done video editing (with Premiere, which I suspect was a particularly inefficient application) on a Power Mac 7100 (I think, may have been a 6100 or an 8100...) - we went home for the weekend while it rendered and hoped it didn't crash]
Re:*sigh* (Score:4, Interesting)
I think we will actually. If I understand your meaning correctly when you say "corporate stuff" I'm thinking web, file, email servers and so on. Like you said, 3 year old machines are fine for most of that stuff now and will continue to be for some time. On the other hand, the end user is going to be requiring more and more power and not just for games or pretty interface animations. Apple and Microsoft have both been talking about the idea of the PC as a digital hub (well, I don't think MS uses that term exactly because it may be a Steve-ism) for a while. As it becomes a hub for more and more devices it's going to need more power. Loading an iPod with songs is trivial. Manipulating digital photos is a bit tougher. Beyond that you get into editing video and burning DVDs. Encoding and Decoding video. Music creation software. Maybe it won't be long before we see easy to use, prosumer quality 3D animation software...
We've seen a lot of things that used to require very expensive, specialized equipment make their way into the consumer space in the past few years. It's not too hard to guess where that trend may go next. One thing is for sure, it will continue to require more and more powerful processors. Not everyone will need all that power every day but when you get back from that European vacation and you want to do something cool with all the video you shot, you'll be glad it's there.
Re:*sigh* (Score:3, Informative)
A dual 3.2 GHz P4 would host a web site quite a bit larger than "medium sized." Consider this: for a web server, the bottleneck is usually bandwidth, not computation. If you had a number of static pages, you could probably serve up web pages from a single 1GHz to millions of visitors a day[1]. Anything more than that, you'd
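The "millions of visitors a day" figure checks out on a napkin. The per-request CPU cost below is an assumption for illustration, not a benchmark:

```python
# Back-of-the-envelope web server capacity: if one static request costs
# ~10 ms of CPU, one core sustains ~100 requests/s. The per-request cost
# is an illustrative assumption.
def requests_per_day(cpu_ms_per_request):
    """Daily requests one core can serve if CPU is the only limit."""
    per_second = 1000.0 / cpu_ms_per_request
    return per_second * 86_400  # seconds in a day
```

At an assumed 10 ms per request that is 8.64 million requests a day from a single core, which is why bandwidth, not the CPU, is usually what you run out of first.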
Re:*sigh* (Score:2)
If all you want from your computer is word processing, web browsing and email access then you don't need much processing power at all.
If, in contrast, you want to do video editing, applying plenty of real-time effects, and decoding/encoding to a compressed format then a high-end dual processor machine is handy.
People are getting used to their computers being able to do things in real-time. Consumer-level applications make a lot of use of real-time
Re:*sigh* (Score:2, Informative)
So what kind of apps (I mean, apart from Doom 3) do end users need this kind of grunt for? 3GHz? 3.6GHz? 4GHz?! If architects could use AutoCAD 2000 on a 950MHz CPU without complaint, what has changed? Obviously a speed increase is nice, but three or four times that?
IDL, IRAF, Mathematica, Matlab, etc. In other words, physicsists and astrophysicists can always use faster computers for their everyday work. Even more so, (astro)physicists running fluid dynamical systems of galaxies need every bit of spe
I'm starting to worry (Score:4, Interesting)
Re:I'm starting to worry (Score:3, Insightful)
Clock speeds seem to have stalled. (Score:5, Insightful)
What I personally really, really want to see is cooler CPUs. CPUs that don't require a huge fucking fan. CPUs that are content with a heatsink would be nice.
Furthermore, I would love it if Dual configuration became more widespread (and thus cheaper). Personally I would love a multi-CPU machine far more than single-CPU ones.
My personal wishlist:
- 64-bit CPUs to become the norm (seems to be happening).
- Cooler CPUs, not requiring fans (seems to be happening; look at the VIA Eden CPUs).
- Dual/Quad/Multi-CPU configurations becoming the norm in home computers.
I don't care much whether single CPUs get much faster at the moment, as there don't seem to be applications requiring it for regular use. There are of course specialist tasks that require more horsepower, but those are
Re:Clock speeds seem to have stalled. (Score:2, Informative)
- downclock and undervolt
- result: cool CPU
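The downclock-and-undervolt trick works because dynamic CMOS power scales roughly as P ~ C * V^2 * f, so the two reductions compound. The 89 W baseline below is an illustrative figure, not a measured value for any particular chip:

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
# Lowering clock (f) and voltage (V) together compounds the savings.
# The baseline wattage passed in is an illustrative assumption.
def scaled_power(base_watts, freq_ratio, volt_ratio):
    """Estimated dynamic power after scaling frequency and voltage."""
    return base_watts * freq_ratio * volt_ratio ** 2
```

For example, an assumed 89 W part run at 80% clock and 90% voltage drops to about 58 W: a 20% downclock buys a 35% power cut, which is the whole appeal of undervolting.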
Re:Clock speeds seem to have stalled. (Score:4, Informative)
You can have those, just not at the same time. The Via Eden runs fanless. But it's still 32-bit! And it doesn't run in SMP configurations (yet; there has been some info about SMP solutions).
I think you could buy an Opteron 2xx machine and underclock it to around 1GHz so it might run fanless. Then you would have your fanless 64-bit SMP machine,
Re:Clock speeds seem to have stalled. (Score:2)
Well I don't know about fanless, but if you're really after a quiet computer, and don't care at all about speed, there's always underclocking. I bought an AMD 2600+ for my dorm room PC and underclocked it (gasp) to 2100+ speeds. The Shuttle power supply has a small 40mm fan, but the only other fan in the case is a Zalman 80mm fan with a big fat
Re:Clock speeds seem to have stalled. (Score:4, Insightful)
Once upon a time, viewing images was a specialist task. Then, viewing them in full colour, then editing them. Sound used to be a specialist task, then editing sound.
Now, things like video encoding/editing is a specialist task, requiring (relatively) serious amounts of horsepower.
Well, once that sort of horsepower becomes commonplace, the task stops being specialist, as more and more people do it simply because they can.
True, there will always be truly specialist tasks that never become mainstream (animation and rendering work, physics simulation and similar number crunching), but there is stuff now that could most certainly benefit from more CPU power (whether it be from single- or multi-cored machines) that would become more mainstream when that power became affordable.
Re:Clock speeds seem to have stalled. (Score:2)
Animation isn't going to be a specialist task forever either. The porn applications of it are obvious. One example: remember how Celebrity Deathmat
Re:Clock speeds seem to have stalled. (Score:2)
The future may be MP via multicore processors because that's the easiest path to superior performance. It won't be because there's no demand for a faster single processor though.
Processor manufacturers have to co
Re:Clock speeds seem to have stalled. (Score:2)
It makes lots of sense.
With one CPU, one task typically occupies the entire CPU for an entire timeslice, however that is defined by the kernel at hand.
With two CPUs, two different tasks may occupy the differe
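The benefit is easy to model: independent tasks on one CPU serialize, while on two CPUs they overlap and finish when the longer one does. A minimal sketch (task times in arbitrary units, purely illustrative):

```python
# Completion time ("makespan") for independent tasks spread across CPUs
# with a simple greedy longest-task-first assignment. Times are
# arbitrary units, purely illustrative.
def makespan(tasks, cpus):
    """Time until the last task finishes."""
    loads = [0.0] * cpus
    for t in sorted(tasks, reverse=True):
        # place each task on the currently least-loaded CPU
        loads[loads.index(min(loads))] += t
    return max(loads)
```

Two tasks of 30 and 50 units take 80 on one CPU but only 50 on two, and the interactive task never has to wait behind the batch one.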
Re:Clock speeds seem to have stalled. (Score:2)
The first thing to do is stop buying cheap cases/power-supplies. It costs money to make things quiet; extra sensors to throttle fan speeds & quality bearings add a few $$$s to the cost of a unit.
With that said, I just threw together an Athlon64 3200+ with the stock AMD heatsink into one of these [newegg.com] and I couldn't hear it over the ventilation system. This is under full load, not sitting there idle with some sort
Re:Clock speeds seem to have stalled. (Score:2)
Doing many tasks at the same time. One music player, one movie player shoveling a movie to the TV in another room, some CPU-intensive work for the guy sitting at the computer, some this and some that, while the system stays responsive.
With a single CPU, this takes a lot of timing, prioritizi
Re:Clock speeds seem to have stalled. (Score:2)
Considering my 1.13 Ghz PIII-S dissipates 29.9 watts, I'm pretty impressed. I could keep my current clock speed and only use 5 watts. That Shuttle is going to be my next upgrade. Hopefully the Pentium M chips will have dropped somewhat in
What is with the 64-bit fetish? (Score:2)
Why do people care about this so much? A 64-bit address space only matters if you have more than 4 GB of RAM. That's a crapload of RAM, any way you look at it. In some cases, moving from 32-bit to 64-bit can even slow things down, because now you're spending time managing twice the address space. I'd much rather have a dual-core 32-bit chip than a 64-bit chip.
Re:What is with the 64-bit fetish? (Score:2)
# of registers is independent of word size (Score:2)
The number of registers has nothing to do with the word size. The 32-bit SPARC has 24 general-purpose registers, for example. The 64-bit Alpha has 64 GPRs (32 integer and 32 FP). There's no reason we have to have a 64-bit word size just to get more registers. Nor is there any reason to limit the number of registers to 16, for that matter. That smells like 8086-induced brain damage. If we're going to go for more registers, why n
Re:Clock speeds seem to have stalled. (Score:2)
I, the guy you're answering, remember. I also remember when I didn't even need a heat sink.
What I really want is for computing to return to the days of the heat sink. No moving parts for cooling is a good thing. Less maintenance.
I don't see any reason to return to the pre-heatsink days though, as a heatsink isn't a moveable part.
Opteron, Linux 2.6 and Java 5 benchmark (Score:4, Informative)
Re:Opteron, Linux 2.6 and Java 5 benchmark (Score:2, Interesting)
Where is the roadmap I want? (Score:5, Insightful)
I survived just fine on a PII for several years until recently biting the bullet and getting myself a P4 box in a Shuttle Zen XPC case (relatively quiet). I seriously considered getting myself an EPIA box as my main machine, simply because it would be lower power (therefore cheaper to run), silent and enough oomph to use mutt, firefox and ssh into the server kit where the real work is done. The only reason I ended up with a P4 is because a friend had a 3GHz one going very cheap.
I want less power, not more. The idea I should overclock, buy liquid cooling systems and should pay a ridiculous amount so I can play some games? I'm sorry, what planet are you all on?
Re:Where is the roadmap I want? (Score:2)
I reckon Intel won't have the 4 GHz CPUs out by then.
Your thinking is very vector-like, with high-level language teaching from schools being the target. In reality there are other directions CPUs are going, too. If the original poster is building a gateway with no moving parts, including CPU fans, hard drives,
Z80, 80C52,
Re:Where is the roadmap I want? (Score:2)
AI will likely only move forward along with advances in robotics and bio-connectionist systems. Traditional programming doesn't lend itself well to the task.
"Expert systems" is an unfortunate term. I prefer "knowledge based", but even those programs have proven impractical. The most ambitious attempt has been Doug Lenat's CYC project [sourceforge.net].
Faster processo
Re:Where is the roadmap I want? (Score:2)
Except for a very few specialist areas - yes, those are the only applications - and not even the current batch of games seem to rely mostly on CPU power, but on GPU power.
As the days move on, EVERYTHING will get bigger and smarter, until we reach a point where AI is king
We'll need a breakthrough in AI first. I doubt it will come very soon.
The drive for greater computing power far exceed
This shows a benchmarking problem (Score:3, Funny)
Or it could be that.... (Score:2)
article's author has failed to notice that Intel's new flagship line (the 7xx series) is continuing to improve while their higher-clocked but end-of-life Pentium 4 line is topped out.
End of Life on 32bit (Score:2)
When will they start building chips that have no support for 32-bit software?
I mean, we don't really support 8-bit today and I'm not really sure if we even have support for 16-bit software in today's CPUs. So when are we going to rid ourselves of the legacy throwbacks?
Re:End of Life on 32bit, wrong... (Score:2)
Example: AH is an 8-bit register and AX is the 16-bit counterpart. EAX is 32-bit; I forgot what the new 64-bit registers are called... but yes, you will find these instructions in old software and it can usually still run.
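(The 64-bit names are RAX, RBX, RCX, and so on.) The aliasing the comment describes can be demonstrated with a ctypes union that mirrors the little-endian layout of an x86 register; the helper name is ours, for illustration:

```python
import ctypes

# x86 registers alias narrower names onto wider ones: AL/AH are the low
# and high bytes of AX, AX is the low half of EAX, and on x86-64 EAX is
# the low half of RAX. The union below models the little-endian layout.
class _Bytes(ctypes.Structure):
    _fields_ = [("al", ctypes.c_uint8),   # lowest byte
                ("ah", ctypes.c_uint8)]   # next byte up

class _Reg(ctypes.Union):
    _fields_ = [("rax", ctypes.c_uint64),  # full 64-bit register (x86-64)
                ("eax", ctypes.c_uint32),  # low 32 bits
                ("ax", ctypes.c_uint16),   # low 16 bits
                ("b", _Bytes)]             # AL / AH views

def subregisters(value):
    """Return the (EAX, AX, AH, AL) views of a 64-bit register value."""
    r = _Reg()
    r.rax = value
    return r.eax, r.ax, r.b.ah, r.b.al
```

So loading RAX with 0x1122334455667788 makes EAX read 0x55667788, AX read 0x7788, AH 0x77 and AL 0x88, which is exactly why 8- and 16-bit code keeps working on the wide registers.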
Re:End of Life on 32bit (Score:2)
CPUs will never lose their 32-bit support because 64 bits are bad for an application if they aren't necessary. The binaries get larger, the pointers get larger, and the application won't run faster.
A good operating system and a good processor will allow for a mix of 32-bit and 64-bit applications. Use the technology we'
Start of clue on 32-bit (Score:2)
"64-bit" is not "better" than "32-bit" just because 64 > 32.
The size of an address word determines the memory space you can address, and most people are simply unlikely to need more than 4 GB of virtual segment size for at least the next ten years. I'm convinced that most processors will have a 64-bit address word because obviously most people think like you do and believe a bigger number automatically means a better produc
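The "pointers get larger" cost mentioned in nearby comments is easy to quantify for pointer-heavy structures. The node counts and payload size below are illustrative assumptions:

```python
# Memory footprint of a linked structure as pointer size changes.
# Payload size, pointer count per node, and node count are all
# illustrative assumptions, not measurements of any real program.
def node_bytes(payload_bytes, n_pointers, pointer_bytes):
    """Bytes per node: payload plus its link pointers."""
    return payload_bytes + n_pointers * pointer_bytes

def structure_mb(nodes, payload_bytes, n_pointers, pointer_bytes):
    """Total size in MB of `nodes` such nodes."""
    total = nodes * node_bytes(payload_bytes, n_pointers, pointer_bytes)
    return total / (1024 * 1024)
```

A million binary-tree nodes with a 16-byte payload and two child pointers take 24 MB with 32-bit pointers but 32 MB with 64-bit ones: a third more memory (and cache pressure) for exactly the same data, unless the program actually uses the extra address space.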
The whole heat problem, etc (Score:2)
One possibility is to keep throwing more power at the problem and keep cooling it off.
Which CPU is better for gaming? (Score:2)
Should I go AMD Athlon64 or Intel P4? I heard P4 Prescott CPUs have issues like heat, which is bad because my room can get up to 85 degrees (F) during heat waves. Any recommendations welcomed.
Re:Which CPU is better for gaming? (Score:2)
If you get a 775-based Prescott you will have a hard time finding a PCI Express video card to go along with that new processor. I just bought (last week) a 3.4GHz Northwood-core P4 instead of a Prescott, and an 875-chipset motherboard to go along with it.
A lot of folks here will probably say go AMD and give a reason like "its 64bit". Don't let that sway you. Currently
Re:Which CPU is better for gaming? (Score:2)
Re:Which CPU is better for gaming? (Score:2)
Trusted Computing may take some of the blame? (Score:2)
-
Re:Forget CPU, enter the GPU (Score:3, Informative)
Re:Forget CPU, enter the GPU (Score:3, Informative)
I haven't had the chance to play with a Pixel Shader 3.0 card yet, so I don't know how useful for generic computation they are. It usually helps if you're trying to process many sets of the same kind of data, rather than evolving one calculation through a long or iterative algorithm.
~phil
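The point about "many sets of the same kind of data" versus "one calculation evolved iteratively" comes down to two shapes of loop. Both functions below are plain Python stand-ins, purely for illustration:

```python
# Two shapes of work: a "map" (each element independent -> fits a GPU's
# many parallel pipes) versus a recurrence (each step needs the previous
# result -> inherently serial, no help from extra pipes).
def brighten(pixels, gain):
    # per-pixel work with no cross-element dependency: data-parallel
    return [p * gain for p in pixels]

def newton_sqrt2(steps, x=1.0):
    # Newton iteration for sqrt(2): step n+1 depends on step n
    for _ in range(steps):
        x = 0.5 * (x + 2.0 / x)
    return x
```

The first loop could run one pixel per GPU thread; the second gains nothing from a thousand pipes because only one of them ever has work to do.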
Re:Forget CPU, enter the GPU (Score:2)
I thought this was an AGP driver issue, not a hardware limitation? Who knows, maybe the same guy will write the PCIe driver and history will repeat itself...
Re:Forget CPU, enter the GPU (Score:2, Interesting)
A GPU is still only a CPU that has been optimized for multicore vector operations
eg: a GeForce 6800 is approx 10 programmable pipelines, with some entangling fixed function pipes, such as triangle setup, cache, memory and IO, etc.
6 of those cores are independent vertex processors
4 of them include 4 "pixel pipes" each, which, unfortunately, aren't actually independent; they just include writemasks to prevent unused ones from affecting what they shouldn't. if one pixel pipe s
Re:Forget CPU, enter the GPU (Score:2)
So you can use the CPU for things other than rasterizing scenes?
Re:heat and faster speeds ... (Score:3, Informative)
Also, the wafers used today are what, 300 mm in diameter while the chips are something like 10x10 mm, so t
Re:which to purchase? (Score:2)
I would place a bet that AGP will still be around and strong this time next year and maybe even the year after. There is too large of a user base to just drop it entirely.