The Future of Intel Processors
madison writes to mention coverage at ZDNet on the future of Intel technology. Multicore chips are the company's focus going forward, and its researchers are working on ways to adapt them for specific uses. The article cites an example where the majority of the cores are x86, with some accelerators and embedded graphics cores added on for extra functionality. "Intel is also tinkering with ways to let multicore chips share caches, pools of memory embedded in processors for rapid data access. Cores on many dual- and quad-core chips on the market today share caches, but it's a somewhat manageable problem. 'When you get to eight and 16 cores, it can get pretty complicated,' Bautista said. The technology would prioritize operations. Early indications show that improved cache management could improve overall chip performance by 10 percent to 20 percent, according to Intel." madison also writes, "In other news, Intel has updated its Itanium roadmap to include a new chip dubbed 'Kittson' to follow the release of Poulson. That chip will be based on a new microarchitecture that provides higher levels of parallelism."
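Why cache sharing "can get pretty complicated" at high core counts is easy to demonstrate even on today's parts. Below is a minimal C/pthreads sketch of cache-line contention (the struct layout and iteration count are illustrative, and this is not Intel's cache-management technology, just the underlying effect it has to manage): two cores updating counters that land on the same cache line spend much of their time bouncing that line back and forth.

    #include <pthread.h>
    #include <stdio.h>

    #define ITERS 100000000L

    /* Both counters land on the same cache line, so the two cores
     * continually steal it from each other ("false sharing"). Padding
     * each counter out to its own 64-byte line removes the contention. */
    static struct {
        volatile long a;   /* inserting char pad[56]; here would separate the lines */
        volatile long b;
    } counters;

    static void *bump_a(void *arg) {
        for (long i = 0; i < ITERS; i++) counters.a++;
        return NULL;
    }

    static void *bump_b(void *arg) {
        for (long i = 0; i < ITERS; i++) counters.b++;
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, bump_a, NULL);
        pthread_create(&t2, NULL, bump_b, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("%ld %ld\n", counters.a, counters.b);
        return 0;
    }

With two cores the slowdown is already measurable; with eight or sixteen cores sharing cache levels, the kind of prioritization Intel describes becomes much more valuable.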
gcc? (Score:3, Insightful)
Yeah, 'cause, you know, Intel doesn't make their own compiler: http://www.intel.com/cd/software/products/asmo-na
Re:Instead of more power (Score:5, Insightful)
I hate to break it to ya, but in a low-level language like C, doing the proper bounds checks and data sanitization required for security does not help performance (although it doesn't hurt much either, and should of course always be done).
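To put a number on "doesn't hurt much": a bounds check is one compare and one well-predicted branch. A minimal sketch of a checked accessor in C (buf_get is a made-up name for illustration):

    #include <stdbool.h>
    #include <stddef.h>

    /* Checked read: one extra compare per access. On modern hardware the
     * branch predictor makes the common (in-bounds) path nearly free. */
    bool buf_get(const int *buf, size_t len, size_t idx, int *out) {
        if (idx >= len)        /* the bounds check */
            return false;      /* caller decides how to handle the failure */
        *out = buf[idx];
        return true;
    }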
There is a lot of bloated code out there, but the bad news for people who always post "just write better code!" is that the truly processor-intensive stuff (like image processing, 3D games) is already pretty well optimized to take advantage of modern hardware.
There's also the definition of what "good code" actually is. I could write a parallelized sort algorithm that would be nowhere near as fast as a decent quicksort on modern hardware. However, on hardware 10 years from now, with a large number of cores, the parallelized algorithm would end up being faster. So which one is the 'good' code?
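To make the trade-off concrete, here's a rough C/pthreads sketch of the parallel version (illustrative, not tuned): sort the two halves on separate cores, then merge. On today's dual-core machines the thread and merge overhead can leave it behind a plain qsort(); on a many-core part, the same structure recursed a few levels deeper keeps scaling.

    #include <pthread.h>
    #include <stdlib.h>
    #include <string.h>

    static int cmp_int(const void *a, const void *b) {
        return (*(const int *)a > *(const int *)b) -
               (*(const int *)a < *(const int *)b);
    }

    struct span { int *data; size_t n; };

    static void *sort_span(void *arg) {
        struct span *s = arg;
        qsort(s->data, s->n, sizeof(int), cmp_int);
        return NULL;
    }

    /* Sort each half on its own core, then merge the results. */
    void parallel_sort(int *a, size_t n) {
        size_t half = n / 2;
        struct span lo = { a, half }, hi = { a + half, n - half };
        pthread_t t;
        pthread_create(&t, NULL, sort_span, &lo);
        sort_span(&hi);                 /* second half on this core */
        pthread_join(t, NULL);

        int *tmp = malloc(n * sizeof(int));
        size_t i = 0, j = half, k = 0;
        while (i < half && j < n)
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < half) tmp[k++] = a[i++];
        while (j < n)    tmp[k++] = a[j++];
        memcpy(a, tmp, n * sizeof(int));
        free(tmp);
    }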
As usual, real programming problems in the real world are too complex to be solved by 1-line Slashdot memes.
For the long term (Score:3, Insightful)
If software developers can't or won't take advantage of the potential benefits of multi-core, Intel and AMD may have to significantly cut the price of their processors because upgrading won't add much value.
New term war. (Score:4, Insightful)
What we really need is for software to catch up. Luckily, some programs like Premiere and Photoshop have supported multiple CPUs for a while now. But games and the like could really benefit from this. Just stick AI on one core, terrain on another, and so on.
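A hedged sketch of that division of labor in C with pthreads (update_ai and update_terrain are hypothetical stand-ins; a real engine also has to synchronize these subsystems every frame, which is exactly why this is harder than it sounds):

    #include <pthread.h>
    #include <stdbool.h>
    #include <unistd.h>

    /* Hypothetical per-frame subsystem work. */
    static void update_ai(void)      { /* pathfinding, decisions... */ }
    static void update_terrain(void) { /* LOD, streaming... */ }

    /* volatile is a simplification; production code would use atomics
     * or a per-frame barrier to coordinate the threads. */
    static volatile bool running = true;

    static void *ai_thread(void *arg)      { while (running) update_ai();      return NULL; }
    static void *terrain_thread(void *arg) { while (running) update_terrain(); return NULL; }

    int main(void) {
        pthread_t ai, terrain;
        pthread_create(&ai, NULL, ai_thread, NULL);
        pthread_create(&terrain, NULL, terrain_thread, NULL);
        sleep(1);                 /* stand-in for the main render loop */
        running = false;
        pthread_join(ai, NULL);
        pthread_join(terrain, NULL);
        return 0;
    }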
Where all the CPU time will go (Score:5, Insightful)
Where will all the CPU time go on desktops with these highly parallel processors?
Re:Instead of more power (Score:2, Insightful)
Besides, if we stopped adding features, we'd still be using things like ed for editing (and 'word processing'), our games would still be like Pong, and our remote access would still be through VT52 terminals.
Re:Instead of more power (Score:2, Insightful)
The parent's point is that in code where it makes a difference, the code is already thoroughly optimized, in general. Slimming down the code for Microsoft Word or XEmacs or Firefox or Nautilus or iTunes (there, now we've slaughtered everyone's sacred cow!) isn't likely to make much of a difference because apps like these already run plenty fast on modern hardware. Sure, bloat is bad, but it's a lot harder than it sounds to remove bloat from existing code without removing features. If bloat is an issue, use an equivalent app with fewer features -- nano instead of XEmacs, for instance.
More energy efficient chips... (Score:3, Insightful)
I dream of the day when my gaming computer doesn't need any active cooling, or heat sinks the size of houses. Focusing on efficiency would also force developers to write better code; honestly, it's unbelievable how badly some programs run and how resource-intensive they are for what they do.
Re:For the long term (Score:3, Insightful)
Ultimately I think you're right. Processors started out general and have become increasingly specialized. First we had the floating-point co-processor, then things like the MMU, and then GPUs came along. Multiple cores with differing functions are in many ways just a continuation of that trend.