Intel Core i7-4790K Devil's Canyon Increases Clocks By 500 MHz, Lowers Temps
Vigile (99919) writes "Since the introduction of Intel's Ivy Bridge processors, a subset of users has complained about the company's change of thermal interface material between the die and the heat spreader. With the release of the Core i7-4790K, Intel is moving to a polymer thermal interface material that it claims improves cooling on the Haswell architecture, along with the help of some added capacitors on the back of the CPU. Code-named Devil's Canyon, this processor boosts stock clocks by 500 MHz over the i7-4770K, all for the same price ($339), and lowers load temperatures as well. Unfortunately, in this first review at PC Perspective, overclocking doesn't appear to be improved much."
Speed is not the only thing. (Score:5, Informative)
For ordinary people, video re-rendering is probably the most CPU-intensive task, and even that can be farmed out to the GPU, if it hasn't been already.
There are some power users in the accounting and finance departments who commit crimes against software with atrociously written Excel macros. Their spreadsheet update time scales as the square or cube of the number of cells. They blame the computer for being slow and demand faster computers. Even this group does not benefit from overclocking, because Excel is so bloated that it triggers endless page faults and long fetches that miss the L1, L2 and L3 caches.
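(As a rough illustration only, in plain C rather than actual Excel/VBA, with the sheet size made up: this is the shape of macro that makes update time scale as the square of the cell count, because every recalculated cell rescans the whole sheet.)

```c
/* Hypothetical sketch, not Excel/VBA: why an atrociously written macro
 * makes update time scale as the square of the cell count. Each cell's
 * new value rescans the entire sheet, so a full refresh is O(n^2). */
#include <stdio.h>
#include <stdlib.h>

#define CELLS 20000   /* assumed sheet size, for illustration only */

int main(void) {
    double *sheet = malloc(CELLS * sizeof *sheet);
    double *next  = malloc(CELLS * sizeof *next);
    if (!sheet || !next) return 1;
    for (int i = 0; i < CELLS; i++) sheet[i] = (double)i;

    for (int i = 0; i < CELLS; i++) {      /* n cells to update...       */
        double sum = 0.0;
        for (int j = 0; j < CELLS; j++)    /* ...each rescanning n cells */
            sum += sheet[j];
        next[i] = sum / CELLS;             /* e.g. "average of the sheet" */
    }

    printf("%f\n", next[0]);
    free(sheet);
    free(next);
    return 0;
}
```

Keeping a single running total would make the same refresh linear in the cell count, which buys far more than any 500 MHz bump; and once the sheet spills out of cache, a faster clock mostly just waits on memory faster.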
So who might benefit? Maybe people like me, doing finite element analysis, mesh generation, or other such physics simulations.
For the vast majority of users, putting the lower temperatures toward more reliable, longer-lasting, less power-hungry chips would give the real bang for the buck. But that is difficult to test, does not garner press coverage, and, more importantly, cuts into future sales. So they will keep obsessing over overclocking gimmicks.
Re: Speed is not the only thing. (Score:1)
Who might benefit? Gamers, I imagine.
Re: (Score:3)
Secondly, gamers form a very small segment of computer users. The mobile phone gaming market is bringing in so many non-traditional gamers
Re: Speed is not the only thing. (Score:4, Insightful)
As a gamer, I can tell you there are a lot of games that are CPU-bound. ArmA 3 and Skyrim are great examples: basically any game with a complicated physics engine or that depends on a lot of AI calculations (game AI, meaning NPCs). A faster CPU can offer much better performance than a slower one with the same video hardware.
Re: Speed is not the only thing. (Score:4, Informative)
I play Skyrim on a Core 2 Duo that's sitting at about 30% load at the moment, at 1920x1200 with a GTX 460. It runs fine.
If it were actually CPU-bound, it'd play like complete crap.
Re: (Score:2)
Not true at all! My Phenom II with an ATI 7850 slows to a crawl with the polygon counts in BF4 and in SWTOR on the Imperial Fleet. Guild members with i7s and lower-end cards get double the FPS.
The card needs polygons fed to it for each render and has to wait on DirectX to redraw. A faster CPU is required.
Re: (Score:1)
Supposedly CryEngine 4 is CPU-bound. It's one of those engines where you really want to throw cores at it.
Re: (Score:2)
Yep, Star Citizen's Arena Commander just came out, and you need a properly overclocked CPU to reach 60 FPS. My i5 2500K @ 4.5 GHz is only able to give me 48 FPS, while stock CPUs won't give you more than 40 FPS.
Re: (Score:2)
Those things mostly go hand in hand: you can either increase performance or reduce power usage. Things get a little more complicated as you approach SoC power levels, but in general the vendor with the highest-performing chips can also scale them down into the lowest-consuming chips. There's a reason Intel can sell $500-1000 mobile chips: in that power envelope AMD doesn't have a match on performance, so Intel is free to set the price at will.
Re: (Score:2)
What do you think Intel is trying to do with Atom?
Re: (Score:2)
Re: (Score:3)
Maybe people like me, doing finite element analysis, mesh generation, or other such physics simulations.
Probably not even you. Such tasks demand a huge amount of memory, and the bottleneck is often the memory bandwidth available per core, i.e. the number of memory channels on the CPU. If you scale the benchmark result with the number of cores, a CPU with 4 cores and 4 channels will outperform a CPU with, say, 6 cores and 3 channels, even if the 6-core CPU is clocked higher. Provided the software scales nicely, it is better to add more CPUs to the cluster than to increase the clock speed. And if CUDA takes off, the clock speed of the CPU will matter even less.
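(A minimal sketch of that bottleneck, loosely modeled on a STREAM-style triad; the array sizes are assumptions chosen to blow out the caches, and this is single-threaded C, not any particular FEA code. Once the arrays are much larger than the last-level cache, the loop is limited by how fast DRAM can deliver bytes, so extra memory channels help directly while extra clock speed mostly adds idle cycles.)

```c
/* Bandwidth-bound kernel sketch (STREAM-like triad). Array sizes are
 * assumptions chosen to spill far out of any CPU cache, so the loop's
 * speed tracks memory bandwidth (channels), not core clock. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 25)   /* 32M doubles per array, roughly 256 MB each */

int main(void) {
    double *a = malloc((size_t)N * sizeof *a);
    double *b = malloc((size_t)N * sizeof *b);
    double *c = malloc((size_t)N * sizeof *c);
    if (!a || !b || !c) return 1;
    for (size_t i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    clock_t t0 = clock();
    for (size_t i = 0; i < N; i++)
        a[i] = b[i] + 3.0 * c[i];          /* 2 loads + 1 store per iteration */
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Roughly 3 arrays * 8 bytes * N moved; compare to your DRAM bandwidth. */
    printf("a[N/2]=%.1f, effective bandwidth: %.2f GB/s\n",
           a[N / 2], 3.0 * 8.0 * N / secs / 1e9);
    free(a); free(b); free(c);
    return 0;
}
```

On a part with more memory channels the printed figure scales up roughly with channel count, while a modest clock bump barely moves it; that is the channels-per-core argument in miniature.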
Re: (Score:2)
That, my friend, is the difference between an i5 and an i7. For having many apps and tabs open, an i5 is fine. For compiling code, the extra RAM channels on an i7 help.
Re: (Score:2)
It seems to me that this CPU would be the perfect choice for a MAME setup, given that MAME is one of the few things out there these days that is genuinely CPU-bound.
Re: (Score:2)
Re: (Score:2)
But the users of i7s are not the friendly receptionist at work or Grandma checking her FB. They are almost all gamers, mixed with software developers and a few niches.
So yes, the CPU is important to them. The rest of the normal users buy i3s, and maybe an i5 if they have more cash and heavier workflows. Unless you edit 1 TB videos or play Wolfenstein, the average user won't see any benefit between a four-core i5 and an i7.
Re: (Score:2)
So who might benefit? Maybe people like me, doing finite element analysis, mesh generation, or other such physics simulations.
Actually, finite element analysis is one of the fields that stands to benefit most from the use of GPUs, although it does depend on just what you are doing.
I remember back in the early 90s when our office got a brand-new 25 MHz 486. We had great expectations for it. We set up a groundwater flow model (I forget how many cells it was) on the Friday it came in. The simulation ran the rest of Friday... we went home. Monday when we came back it was still running...
Re: (Score:2)
Wow.. (Score:2)
Repost (Score:4, Informative)
No it's techni-babble (Score:1)
...a polymer thermal interface material that it claims improves cooling on the Haswell architecture,...
Captain! The polymer thermal interface is failing along the Haswell architecture! I dunno how long I can keep 'er go'in!
Re:Repost (Score:5, Informative)
That was a paper launch announcement. This post is a review with benchmarks and overclocking.
Re: (Score:1)
K part, 500MHz stock bump DOESNT MATTER (Score:2)
Intel bumped the stock clock on a K-designated unlocked part that NEVER gets run at stock. Pointless gesture.
And it appears to be OC-limited to boot. WTF, Intel?
Re: (Score:2)
start your water pumps (Score:2)
Yawn (Score:5, Informative)
30 Percent Faster in 3 Years:
http://www.pcper.com/reviews/P... [pcper.com]
Overclocking issues?
http://linustechtips.com/main/... [linustechtips.com]
Speed is dead, long live low power (Score:2)
Apart from games and video encoding, we've kind of reached a plateau of user requirements, such that a platform like AMD's Kabini would be plenty for the average user. Since the average Joe can live with only a tablet, I'm guessing even Kabini is overkill by comparison.
Re:Speed is dead, long live low power (Score:4, Insightful)
Agreed. I've said it before and I'll say it again: significant performance increases in the x86 world are a thing of the past.
There simply isn't enough money in the market chasing higher performance to make the development cost of faster chips worth the investment.
This is actually an opportunity for AMD. I expect it costs AMD less to catch up to Intel than it costs Intel to push to faster speeds, and since Intel isn't being paid anymore to get faster, AMD can, like the slow and steady tortoise, gradually catch up to Intel. I believe it will take a couple more years, but if AMD survives that long, I believe that it will have achieved near performance parity with Intel by then.
And then neither company's offerings will get much faster, forever thereafter, until there is some new kind of 'killer app' that demands increased CPU speeds that people are willing to pay for (could happen anytime; but the way things are going, with everyone moving to mobile phones and pads, I think we're in for a relatively long haul of form factor and power usage dominating the marketable characteristics of CPUs).
I believe Intel will continue to hold a power advantage over AMD for a long time, though AMD will gradually narrow that gap as well.
The thing is, AMD will be fighting Intel for a stagnating/shrinking CPU market, and more than likely AMD won't increase its margins significantly during this process, it will just reduce Intel's margins. Not really good news for either company, but worse for Intel.
I Miss the Good Old Days (Score:1)
Any young tykes in here remember when processor speeds used to increase by 500 MHz every 18 months, instead of being stuck in the 3-4 GHz range for almost a decade?
Re: (Score:1)
Yes, good times... anything seemed possible just by waiting a few months until the hardware could handle it.
On the other hand, now and going forward, good software developers will be valued highly, because efficient CPU resource usage and SMP are difficult but increasingly important.
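(A toy sketch of what that looks like in practice, using plain C with POSIX threads; the thread count and the summing workload are arbitrary assumptions. Once clocks stop climbing, the gains come from splitting work across cores and merging the partial results correctly.)

```c
/* Toy SMP sketch: sum an array by splitting it across worker threads.
 * Thread count and array size are arbitrary assumptions for illustration.
 * Compile with: cc -O2 -pthread sum.c */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4
#define N 8000000

static double data[N];

struct chunk { size_t lo, hi; double partial; };

static void *worker(void *arg) {
    struct chunk *c = arg;
    double s = 0.0;
    for (size_t i = c->lo; i < c->hi; i++)   /* each thread owns one slice */
        s += data[i];
    c->partial = s;                          /* no sharing, so no locks needed */
    return NULL;
}

int main(void) {
    for (size_t i = 0; i < N; i++) data[i] = 1.0;

    pthread_t tid[NTHREADS];
    struct chunk ch[NTHREADS];
    for (int t = 0; t < NTHREADS; t++) {
        ch[t].lo = (size_t)t * N / NTHREADS;
        ch[t].hi = (size_t)(t + 1) * N / NTHREADS;
        pthread_create(&tid[t], NULL, worker, &ch[t]);
    }

    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {     /* join, then merge partial sums */
        pthread_join(tid[t], NULL);
        total += ch[t].partial;
    }
    printf("total = %.0f\n", total);
    return 0;
}
```

The hard part in real code is that work rarely splits this cleanly; shared state, load imbalance, and synchronization are where skilled developers earn their keep.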
Re:I Miss the Good Old Days (Score:4, Insightful)
Re: (Score:1)
Yeah, I miss those days in the late 1980s... it wasn't just the clock speeds that were revving upwards. Hard disk drives were getting larger, doubling every six months (10, 20, 40, 80, 160 megabytes... we're up to 2 terabytes now), screen resolutions were getting larger (320x200, 640x480, 1024x768, 1280x1024, 1600x1200), as were pixel depths (8-bit, 16-bit, 24-bit). Now we're into quad monitors with HD resolutions, so maybe that is still going upwards.
But CPU cores are increasing, and caching is
Speed and Temp (Score:2, Insightful)
So, Intel lowers the temps, increases the speed by 500 MHz, and there are issues with overclockability? Man, just go invent and build your own proc. Want your cake and eat it too?
Pervasive threading (Score:2)
Why, on a modern machine running a modern flavor of Windows, does a heavy RPG only use 2 cores?
I7 core series (Score:1)
I'm still running a first-generation socket 1366 setup and I haven't even begun to find a reason to upgrade anything but the video card, and that's just to stay on top of gaming.
It would be nice to get USB 3.0, I suppose, but truthfully that's just a PCIe add-in card away.