The Gigahertz Race is Back On
An anonymous reader writes "When CPU manufacturers ran up against the power wall in their designs, they announced that 'the Gigahertz race is over; future products will run at slower clock speeds and gain performance through the use of multiple cores and other techniques that won't improve single-threaded application performance.' Well, it seems that the gigahertz race is back on — a CNET story talks about how AMD has boosted the speed of their new Opterons to 3GHz. Of course, the new chips also consume better than 20% more power than their last batch. 'The 2222 SE, for dual-processor systems, costs $873 in quantities of 1,000, according to the Web site, and the 8222 SE, for systems with four or eight processors costs $2,149 for quantities of 1,000. For comparison, the 2.8GHz 2220 SE and 8220 SE cost $698 and $1,514 in that quantity. AMD spokesman Phil Hughes confirmed that the company has begun shipping the new chips. The company will officially launch the products Monday, he said.'"
Oh come on (Score:4, Insightful)
I muchly prefer a fanless processor.
Re:More Power for What? (Score:3, Insightful)
You're correct that people don't need this much power for their desktops but there are still plenty of uses for more speed in servers and for certain other applications.
Re:More Power for What? (Score:5, Insightful)
Flash is ridiculously inefficient, and requires an extremely beefy machine to render real-time full-screen animation.
Re:More Power for What? (Score:5, Insightful)
You're correct that people don't need this much power for their desktops but there are still plenty of uses for more speed in servers and for certain other applications.
Then there's the fact that programmers these days are sloppy and waste resources. A machine that is faster than one needs today will only be adequate in 2 or 3 years, given upgrades to all the programs. (Am I being cynical? Maybe, but then again, maybe not.)
Re:Gigahertz a bad thing? (Score:4, Insightful)
You're missing the whole point. CPU performance is increasing all the time, which allows Microsoft to continue making everything slower. However, the GHz race had little to do with performance; Intel pushed their Pentium 4 toward 4 GHz even though it performed worse than many competing CPUs running between 2 and 3 GHz. They probably did it because most consumers only look at raw GHz instead of actual performance.
Re:More Power for What? (Score:1, Insightful)
Servers still do need more power.
With virtualisation software allowing dozens of people to share one server to replace their desktop apps,
we still have a long way to go.
This is only the beginning.
Re:NO NO NO NO NO (Score:5, Insightful)
Re:NO no NO no NO (Score:2, Insightful)
You'd be surprised (Score:5, Insightful)
E.g., sure, we like to use the stereotypical old mom as an example of someone who only sends emails to the kids and old friends. Unfortunately it's false. It was true in the 90's, but now digital cameras are everywhere and image manipulation software is very affordable. And so are the computers which can do it. You'd be surprised at the kind of heavy-duty image processing mom does on hundreds of pictures of squirrels and geese and whatever was in the park that day.
And _video_ processing isn't too far out of reach either. It's a logical next step too: if you're taking pictures, why not short movies? Picture doing the same image processing on some thousands of frames in a movie instead of one still picture.
E.g., software development. Try building a large project on an old 800 MHz slot-A Athlon, with all optimizations on, and then tell me I don't need a faster CPU. Plus, nowadays IDEs aren't just dumb editors with a "compile" option in the menus any more. They compile and cross-reference classes all the time as you type.
E.g., games, since you mention the graphics card. Yeah, ok, at the moment most games are just a glorified graphics engine, and mostly just use the CPU to pump the triangles to the graphics card. Well that's a pretty poor model, and the novelty of graphics alone is wearing off fast.
How about physics? It's just coming into fashion, and fast. Yeah, we make do at the moment with piss-poor approximations, like Oblivion's bump-into-a-table-and-watch-plates-fly-off-supersonic engine. There's no reason we couldn't do better.
How about AI? Already in X2 and X3 (the space sim games) it doesn't only simulate the enemies around you, but also what happens in the sectors where your automated trade or patrol ships are. I want to see that in more games.
Or how about giving good AI to city/empire building games? Tropico already simulated up to 1000 little people in your city, going around their daily lives, making friends, satisfying their needs, etc. Not just doing a dumb loop, like in Pharaoh or Caesar 3, but genuinely trying to solve the problem of satisfying their biggest need at the moment: e.g., if they're hungry, they go buy food (trekking across the whole island if needed), if they're sick, they go to a doctor, etc. I'd like to see more of that, and more complex at that.
Or let's have that in RPGs, for that matter. Oblivion, for example, made a big fuss about how smart and realistic its AI is... and it wasn't. But the hype it generated does show that people care about that kind of thing. So how about having games with _big_ cities, not just 4-5 houses, but cities with 1000-2000 inhabitants who are actually smart? Let's have not just a "fame" and "infamy" rating, let's have people who actually have a graph of acquaintances and friends, and actually gradually spread the rumours. (I.e., you're not just the guy with 2 points of infamy, but it's a question of which of your bad deeds this particular NPC heard about.) Let's not have omniscient guards that teleport, but actually have witnesses calculate a path and run to inform the guards, and lead them to the crime. Etc.
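The per-NPC rumour idea above is essentially graph traversal. A hedged sketch (all names here are illustrative, not from any shipped game): each NPC knows about a deed only if the rumour actually reached them through the acquaintance graph, modelled as a plain breadth-first search that dies out after a few retellings.

```cpp
#include <cassert>
#include <queue>
#include <set>
#include <utility>
#include <vector>

// NPCs are indexed 0..N-1; friends[i] lists who NPC i talks to.
// Returns the set of NPCs who have heard the rumour, starting from the
// witness and spreading at most `hops` retellings outward (BFS).
std::set<int> spreadRumour(const std::vector<std::vector<int>>& friends,
                           int witness, int hops) {
    std::set<int> heard{witness};
    std::queue<std::pair<int, int>> q;  // (npc, distance from witness)
    q.push({witness, 0});
    while (!q.empty()) {
        auto [npc, d] = q.front();
        q.pop();
        if (d == hops) continue;  // rumour dies out after a few retellings
        for (int f : friends[npc])
            if (heard.insert(f).second) q.push({f, d + 1});
    }
    return heard;
}

int main() {
    // A chain of acquaintances: 0 - 1 - 2 - 3.
    std::vector<std::vector<int>> friends{{1}, {0, 2}, {1, 3}, {2}};
    auto heard = spreadRumour(friends, 0, 2);
    assert(heard.count(2) == 1);  // two hops away: has heard
    assert(heard.count(3) == 0);  // three hops away: still in the dark
    return 0;
}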
Or how about procedurally generated content? The idea of creating whole cities, quests and whatnot procedurally isn't a new one, but unfortunately it tends to create boring repetition at the moment. (See Daggerfall or Morrowind.) How about an AI complex enough to generate reasonably interesting stuff? E.g., not just recombine blocks, but come up with a genuinely original fortress from the ground up, based on some constraints. E.g., how about generating whole story arcs? It's not impossible, it's just very hard.
And if you need to ask "why?", let's just say: non-linear stories. Currently if you want, for example, to play a light side and a dark side, someone has to code two different arcs, although most players will only see one or the other. If you add more points and ways you can branch the story (e.g.
Re:Oh come on (Score:4, Insightful)
To understand how this is not a sign of slacking off by the chip designers, you have to understand that the P4 was able to run at high clock speeds only because it was designed to use a very long pipeline of small functional units. This design has proven to be inefficient because it causes too many pipeline stalls and because it requires a higher clock speed and higher power consumption to achieve the same performance. The more complicated functional units of chips with shorter pipelines cannot be clocked as fast, but they perform better at the achievable clock rates than the P4 did at higher clock rates. The last Gigahertz race was ended by a shift of architecture, not by "hitting a wall". Then came multicore designs, which further reduced the need and opportunity for higher clock rates (heat dissipation is somewhat of a "wall"). All this caused clock rates to grow much slower. Now that chip designers have found ways to control power consumption, increasing the clock rate is viable again, so the race is back on.
AMD is desperate (Score:2, Insightful)
Funny, Intel was chumped by AMD just like this a couple of years ago, why did AMD let themselves get tagged back? Intel woke up in a major way. Can AMD? Doesn't look too good...
Re:Oh come on (Score:1, Insightful)
Re:More Power for What? (Score:2, Insightful)
Re:More Power for What? (Score:3, Insightful)
You know, that is something that really pisses me off.
Yes, I know that many times it is not the programmers' fault, and they have to be sloppy to be able to meet that stupid deadline. But, c'mon. Take a look at the system resources something like Beryl uses. Then take a look at what Vista's crappy 3D GUI uses.
And I'm pretty sure Beryl could be even more efficient (though I'm not sure if it would be worth the effort).
2GB to RUN an OS? Tons of processing power (both CPU and GPU) to run a simple game? I can understand beefy CPU needs for things like video encoding and cryptographic processing, along with a few other things. But most apps simply SHOULD NOT need that many resources. IT IS JUST PLAIN STUPID.
Re:AMD is desperate (Score:4, Insightful)
Moving to 45nm gives you extra headroom for clock speed, extra transistor budget, etc. So they might just be demoing systems with similar power envelopes/cost/whatever.
Throw some SSE4-enabled apps in the mix and Penryn would outperform an equalized Conroe by a fair margin.
2.5GHz - 3GHz a race? (Score:3, Insightful)
Re:More Power for What? (Score:5, Insightful)
Just shut the fuck up already. Anyone with more sense than a bag of rocks will conserve scarce resources, not plentiful ones. Clock cycles are cheap. Profiling is expensive. Megabytes are cheap. Time spent coming up with clever bithacks is expensive, especially since only the cleverest and generally highest-paid developers can do it. Second cores are cheap. More time spent coming up with whole new clever bithacks for the Pentium D version, because it has a different relative cost for jumps and floating-point ops, thus making your last batch of hacks do more harm than good, is expensive.
Furthermore, programmers don't so much *waste* resources as utilize them to provide more value. Yeah, I know the 2600 had 128 bytes of RAM, and those were some clever fellas who managed to make playable games on it. Let's see you play WoW on it. I know that your multimedia keyboard probably has more processing power than the PCjr that could once run Word. Fire up that version of Word, insert an image and a table, and hit "print preview."
Of course there are times when computing power is a precious resource. Console games that have to look awesome on 4 year old hardware. System libraries where every wasted clock will be multiplied by 2000 calls by 10000 different programs. Embedded systems where cost and size simply won't allow you to have those few extra Hz you crave. In these situations, when using extra cycles has more severe consequences than offending your sense of computational aesthetics, I believe you will find that these young whippersnappers aren't wasteful at all.
Re:More Power for What? (Score:5, Insightful)
If CPUs stayed the same power, people would write better code to improve performance.
Re:More Power for What? (Score:3, Insightful)
Those apps have changed. They have to change. Nobody wants the old ones. Funny how that works, eh? A clone of Microsoft Word 2.0, which I used in high school, would be fast, efficient, adequate for virtually all usage, and commercially worthless. You and I may want the old fast versions, but I haven't noticed anybody developing software just for me. (I keep waiting for that to happen. I also want a cell phone without a friggin' camera. And get those kids off my damn lawn.)
Those old apps weren't that responsive anyway, and they were incredibly unstable by today's standards. I'm younger than you, but I'm old enough to remember when saving your work every ten minutes was just common sense. These days that would be a neurosis. Maybe that has something to do with simpler, less-efficient code?
Actually, I haven't noticed anything of equivalent complexity that runs worse today than it did ten years ago. What I do notice is that the market for old, well-understood kinds of applications is dominated by the apps with the most features, so those apps aren't comparable to their predecessors of ten years ago. If you want to compare like with like, compare emacs today with emacs ten years ago. Compare gcc with gcc. It turns out this nifty new hardware is really smokin' fast!
No, I'm not fine with it. My brainpower, as copious as it is, is the most precious resource I have. Sure, I can handle memory allocation. Right now I do most of my programming in C++, with a bit of Python here and there. Sometimes I have to use the heap, and that means making sure my objects get deleted. Fine. I've had two or three memory leaks in over three years of work as primary developer on a C++ application, and only one made it into production, where it had a minimal impact. (Thank you, smart pointers.) That doesn't mean manual memory management is a good use of my time. Sure, it used to be a good idea, back in the day. Back in the day, people checked the oil in their cars a few times per month. If it were good to submit yourself to every possible discipline that used to be prudent, no matter what the present-day benefit, we'd all be checking our oil once a week, practicing martial arts, learning Latin, and swearing fealty to a feudal lord. Those things might be good fun and beneficial for our personal development, but neglecting them no longer entails unacceptable consequences.
You said yourself that extraordinary effort was put into optimization because it was the only way to make programs run acceptably fast. Well, that reason doesn't exist any more. You need a new reason. A good programmer invests his effort where he'll get the greatest return. A good programmer knows that meticulously pairing new and delete costs time and brainpower. (A good programmer also knows that meticulously pairing new and delete restricts his design space, sometimes precluding the simplest, most elegant design. Again, thank God for smart pointers.)
A lot of people regret the passing of the exigencies that forced them to come of age and learn discipline. For you, the exigency was meager hardware, and your measure of discipline is tight code. For the current generation, the exigency is complexity, and the measure of discipline is simplicity. Twenty years ago, restraint meant being frugal with machine resources, even when it caused you hardship. Today, restraint means being frugal with mental resources, including our own, even when it causes us hardship. It isn't all wine and roses. Imagine