Intel Core i7-5775C Desktop Broadwell With Iris Pro 6200 Graphics Tested
bigwophh writes: 14nm Broadwell processors weren't originally destined for the retail channel, but Intel ultimately changed course and recently launched a handful of 5th Generation Core processors based on the microarchitecture, the most powerful of which is the Core i7-5775C. Unlike all of the mobile Broadwell processors that came before it, the Core i7-5775C is a socketed LGA processor for desktops, just like 4th Generation Core processors based on Haswell. In fact, it'll work in the very same 9-Series chipset motherboards currently available (after a BIOS update). The Core i7-5775C, however, features a 128MB eDRAM cache and integrated Iris Pro 6200 series graphics, which can boost graphics performance significantly. Testing shows that the Core i7-5775C's lower CPU core clocks limit its performance versus Haswell, but its Iris Pro graphics engine is clearly more powerful.
Re: (Score:2)
This is why I have no problem staying an AMD shop despite AMD staying at 28nm: even at 28nm their chips are still vastly overpowered for what the average user does (especially when you look at non-rigged [youtube.com] benchmarks [phoronix.com]), because once we went multicore, chips went from "good enough" to so insanely powerful it isn't even funny.
While much of what you said is true, AMD's single biggest problem is power consumption.
No, not just for laptops; for desktops too...
What? Why does that matter?
Because power costs money, some places more than others, and consuming lots of power is generally bad for the planet. Even if your power is hydro, wind, or solar, that power could have been sent somewhere else and used to replace coal or natural gas, so it is still wasted.
The modern Haswell chips are so much more power-efficient than AMD's that it is sad. Compare
Re: (Score:2)
Fair enough, that might be an edge case...
However, one thing that you should keep in mind is that 18W is a whole lot more than 1W.
What does that mean? Intel has learned in recent years that rather than slowing the CPU down and letting it grind through tasks over time, it usually saves more power to run at full speed, finish quickly, and then drop into a sleep state (the "race to idle" strategy).
Even when this cycle repeats every second or two, it usually comes out ahead.
Consider that a 35W Haswell chip might actually pull less total power th
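A back-of-the-envelope sketch of that race-to-idle arithmetic (every wattage and timing figure below is an illustrative assumption, not a measurement of any particular chip):

```python
# Race-to-idle vs. slow-and-steady: energy (joules) = power (watts) * time (s).
# All inputs are assumed round numbers for illustration.

power_fast_w = 35.0    # assumed package power at full speed
power_slow_w = 10.0    # assumed package power at a low clock
power_idle_w = 1.0     # assumed deep-sleep package power

task_fast_s = 1.0      # assumed: task finishes in 1 s at full speed
task_slow_s = 4.0      # assumed: same task takes 4 s at the low clock
window_s = 4.0         # compare both strategies over the same 4 s window

# Race to idle: full power for 1 s, then sleep for the remaining 3 s.
energy_race = power_fast_w * task_fast_s + power_idle_w * (window_s - task_fast_s)

# Slow and steady: low power for the whole 4 s window.
energy_slow = power_slow_w * task_slow_s

print(f"race to idle:    {energy_race:.0f} J")   # 38 J
print(f"slow and steady: {energy_slow:.0f} J")   # 40 J
```

With these made-up numbers the fast-then-sleep strategy wins, but only because the idle power is so low; that is exactly the point about 18W versus 1W above.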
Re: (Score:2)
Sorry, friend, but you've been bamboozled: it would take SEVENTEEN YEARS of power savings to make up the price difference between an AMD and an Intel chip, and that is when picking the 125W part on the AMD side. Now are you seriously going to argue you are keeping your chip for nearly 20 years?
That is a cute video, but it doesn't show anything but a guy talking.
Frankly, my own testing shows otherwise: the AMD chip uses twice the power of the Intel chip at load and 50% more at idle.
For a computer that is on 24/7 and often has stuff running for hours (I often set it to run overnight tasks, so it spends many hours at 100% load), the difference adds up.
He also doesn't take into account the cooling needed for the extra heat. I'm in Texas; I pay to air-condition my home. The cost of the extra power is nearly doubled due
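For what it's worth, the break-even math both posters are arguing about is simple; the disagreement is entirely in the inputs. A hedged sketch (every figure below is an assumption, not a quote of either poster's numbers):

```python
# Payback period for a pricier but more efficient CPU. All inputs assumed.
price_delta_usd = 100.0   # assumed price premium for the efficient part
watts_saved = 50.0        # assumed average power difference while in use
hours_per_day = 8.0       # assumed daily usage; 24 for an always-loaded box
usd_per_kwh = 0.12        # assumed electricity rate

kwh_saved_per_year = watts_saved / 1000.0 * hours_per_day * 365
years_to_break_even = price_delta_usd / (kwh_saved_per_year * usd_per_kwh)
print(f"~{years_to_break_even:.1f} years to break even")  # ~5.7 with these inputs
```

Plug in a lightly used desktop and a small wattage gap and you get numbers in the seventeen-year range; plug in a 24/7 box at full load in a hot climate (where every watt is paid for twice, once at the plug and once at the air conditioner) and the payback can shrink to a year or two.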
Re: (Score:2)
I'm considering getting a PC/laptop for SteamOS. I'll probably go w/ the Intel graphics instead of either AMD or NVIDIA. If the Iris Pro has caught up w/ those other 2, good, but even otherwise, I'd want to avoid the fiasco of bad or incompatible drivers from either AMD or NVIDIA. Intel's graphics works even w/ BSD, so that's what I'd use.
Had I been shopping for another Windows 10 box, I'd go w/ an AMD. But as one poster observed, power consumption of those things is still an issue
Re: (Score:1)
My dad is in his mid 70s. He's stopped going to the local computer places because they won't stop hassling him about how he needs a new PC, etc. His computer is about 7 years old: a Core 2 Duo E8600 (3.33 GHz), 2 GB of DDR2, and a SATA HDD (250 GB, I think) with integrated graphics. He uses it for email, typing things up in Office, doing his income tax, looking at YouTube videos, and occasionally (once a month or so) converting video footage my mom took with their video camera to DVD so he can burn
Re: (Score:2)
So get a cheaper 4th Gen Core i-whatever. Or go with an AMD APU or CPU and get a discrete GPU in the meantime.
Intel Iris Pro graphics might not be great for gaming, but it sure will help with various GPU-accelerated tasks like compositing and so forth in non-gaming uses.
Re: (Score:3, Informative)
So get a cheaper 4th Gen Core i-whatever. Or go with an AMD APU or CPU and get a discrete GPU in the meantime.
I think you missed the point. GP expressed a wish for a 5th-gen chip with more cores and no graphics.
There's a lot to be said for that.
Re: (Score:2)
The point is that half the die area (plus the extra eDRAM module) costs you major $$$ even if all you wanted was a top-end 4-core processor. For quite a while now Intel has been working on graphics and power consumption, which is great for mobile and the low end, but the higher-end desktop folks are getting frustrated that CPU performance is dead flat, if not dropping.
Anyone paying $300-540 for a processor is not likely to cheap out and ONLY use the integrated graphics. These i5 and i7 processors are turning in
Re: (Score:2)
Yup, imagine how easily another four cores would fit in there.
However, that would compete with Xeons. Can't have that; it's where the money is.
Re: (Score:2)
Anyone paying $300-540 for a processor is not likely to cheap out and ONLY use the integrated graphics. These i5 and i7 processors are turning into a pretty big disappointment. You can get far better graphics with just about the lowliest sub-$100 graphics card available, but the options that ditch the graphics and add a couple more cores explode in price.
Well, if you are gaming, when are you ever CPU-limited with a 4+ GHz quad-core? It would be nice if they dropped the integrated graphics and sold it for less, but the six/eight-core processors are typically for people who do video encoding, 3D rendering, lots of VMs, or some other semi-pro use. Even a GTX 980 Ti in SLI should run fine on an i7-4790K; I guess for triple/quad-SLI you need the extra PCIe lanes, but then you're extremely far out of the mainstream even for gamers.
Re: (Score:1)
I'd settle for a CPU that is faster than the last generation's. Intel is throwing away CPU performance to compete on graphics and power consumption. For the ultrabook and tablet market this is a good move, but for desktops and high-end laptops it is a horrible idea. Some of us actually need more CPU and less GPU. For those of you who think the GPU counts, try implementing OpenCL on an open-source operating system.
After reading this review, I'm sticking with my Core i7-4770. It's 1.5 years later and Intel o
Re: (Score:2)
Many just want some PC to do audio, photo, or video work, or perhaps other uses, while still running "light" games (be it any Blizzard or Valve ones, or other stuff where being compatible and not CPU/RAM-starved is good enough).
Within some parameters (perhaps more so with the i5-5675C), the CPU has its merits. No need to spend cash, size, and weight on the PSU and cooling either. Yes, you can buy a 125W CPU and a 200W graphics card instead (going to the other extreme).
The elephant in the room is a $40 Celeron or a $50
Re:Slower and over 540$ - List is $366 (Score:3)
It's officially not released yet, and the seller linked to is in Japan, gouging anyone who can't wait another couple of weeks. I'm not sure how they got hold of stock.
According to Intel, list price is $377 boxed or $366 tray. Not $540.
Re: (Score:2)
do not want new and improved somewhat tolerable intel graphics
This, this, and more this. Intel graphics are pretty much diabolical everywhere, in every situation.
Re: (Score:2)
Fine. Go have a ball in your own playpen then. I won't even consider any system except Intel with Intel graphics. It works aces for me. So we cancel each other out.
Less GPU more cores. (Score:2)
I saw the uncapped version of the chip, showing its layout.
Are there versions of this chip with less GPU and more cores?
Re: (Score:2)
Isn't that what Haswell-E is about?
DO NOT WANT (Score:2)
Re: (Score:1)
" The output from the NVIDIA chip gets piped through the Intel chip, as the Intel chip is the only one attached to the display."
"There's also a mild performance reduction with the Optimus solution."
So, here I sit on a desktop system with both discrete GPU (GTX 260 Core 216) and an Intel IGP, triple-monitor setup.
I absolutely fucking hate when the system renders on the GTX yet insists on displaying through the monitor attached to the Intel IGP. We're talking a no-shit measured 50% drop in framerate - FOR DOO
Re: (Score:2)
Yes, farmers.
And businesses.
And schools, governments, integrated devices, laptops, tablets, low-cost devices, small-form-factor devices, and 95% of the general-purpose PCs on the market.
Re: (Score:2)
Horseshit. Video cards are special-purpose items these days, and there are many use cases for a system with a top-of-the-line CPU but little to no graphics grunt. But even if you did want graphics grunt, unless you're doing complex 3D drafting and need a Quadro or something similar, this CPU is able to beat last-generation dedicated GPUs and even scores an impressive 130fps in GTA V at 1080p. If you NEED something more than this you are most definitely in the special-purpose category (i.e. 3D design, or heavy gami
Re: (Score:1)
"Horseshit. Video cards are special purpose items these days "
I guess you haven't heard of GPGPU. Well, given your UID, not a surprise.
Re: (Score:1)
" this CPU is able to beat last generation dedicated GPUs"
To boot: no, bullshit, and I proved this in the AMD spreadsheet performance comments from yesterday/the day before. This new i7 processor can't even match a 9800GTX+ in single-precision FLOPs.
So, no, it isn't even beating GPUs from SEVEN GENERATIONS ago.
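For anyone wanting to check this kind of claim, peak single-precision throughput is usually estimated as units x FLOPs-per-unit-per-clock x clock. The unit counts and clocks below come from public spec sheets, but the per-clock figures are simplified assumptions, so treat the outputs as ballpark only:

```python
# Peak SP GFLOPS = execution units * FLOPs per unit per clock * clock (GHz).
def peak_gflops(units, flops_per_unit_per_clock, clock_ghz):
    return units * flops_per_unit_per_clock * clock_ghz

# Iris Pro 6200: 48 EUs; each Gen8 EU is assumed to do 16 SP FLOPs/clock
# (two 4-wide FMA pipes), at a 1.15 GHz max turbo.
print(f"Iris Pro 6200: ~{peak_gflops(48, 16, 1.15):.0f} GFLOPS peak")

# 9800 GTX+: 128 shaders at 1.836 GHz; 3 FLOPs/clock counts the co-issued
# MUL that real code rarely reached (2 FLOPs/clock gives ~470 GFLOPS).
print(f"9800 GTX+:     ~{peak_gflops(128, 3, 1.836):.0f} GFLOPS theoretical")
```

Depending on which per-clock assumptions you accept, the comparison can land on either side, which is part of why this subthread disagrees; peak FLOPs also say little about delivered game performance, which is what the reply below compares instead.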
Re: (Score:2)
The GTX 570 outperforms the 9800GTX in all specs and benchmarks by a wide margin, and the i7-5775C, sans dedicated graphics card, scores within 15% of the GTX 570 on all games tested.
So go fuck your hat.
Re: (Score:2)
CPU/GPU integration is for farmers, to paraphrase Seymour Cray.
CPU/GPU integration has much lower latency than a discrete GPU. The HSA-based AMD chips pass data from the fast, single-threaded, fast-branching cores to the massive array of relatively slow FPU units in a few nanoseconds.
Which is why HSA benchmarks seem to work so well:
http://www.tomshardware.com/re... [tomshardware.com]
http://wccftech.com/amd-kaveri... [wccftech.com]
If you want fast computing, low-latency comms is where it's at :)
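A toy model of why that latency matters when deciding whether to offload at all (every latency and bandwidth figure below is an assumed round number, not a measurement of any HSA chip):

```python
# When does shipping work to a GPU beat keeping it on the CPU?
# All inputs are illustrative assumptions.

def offload_time_us(bytes_moved, launch_latency_us, copy_gbs, gpu_compute_us):
    # copy_gbs is GB/s; None means a shared address space with no copies.
    copy_us = 0.0 if copy_gbs is None else bytes_moved / (copy_gbs * 1e3)
    return launch_latency_us + 2 * copy_us + gpu_compute_us  # in + back out

work_bytes = 1_000_000   # assumed 1 MB working set
gpu_us = 50.0            # assumed GPU compute time for the job

# Discrete GPU: assumed ~20 us launch latency, data crossing PCIe at ~12 GB/s.
print(f"discrete:   {offload_time_us(work_bytes, 20.0, 12.0, gpu_us):.0f} us")

# Integrated, HSA-style GPU: assumed ~5 us dispatch, pointer passed, no copies.
print(f"integrated: {offload_time_us(work_bytes, 5.0, None, gpu_us):.0f} us")
```

With these made-up numbers the discrete card spends most of its time on launch overhead and copies, which is the effect the HSA benchmarks linked above exploit on small, latency-sensitive kernels.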
Re: (Score:2)
It's HotHardware; what do you expect? There are plenty of quality hardware sites out there to get decent reviews from if you're interested. As it is HotHardware, you can guarantee they are at least a month behind the more reputable sites.
Skylake is two weeks away (Score:1)
DO NOT buy a Broadwell chip. Skylake desktop chips are expected to launch August 5th @ Gamescom.
Re:Skylake is two weeks away (Score:5, Informative)
Skylake has the same CPU performance and a slower GPU (no eDRAM). And version-1.0 motherboards.
Your advice is sound, but mostly if you don't need/care about the GPU, or if you want new features like HDMI 2.0 and an H.265 decoder.
Re: (Score:2)
Or support for 64GB of RAM on Skylake; or wanting the newer GPU 5-10 years down the road. Yes, there are many reasons to get a Skylake. But if you want the fastest CPU with the fastest integrated GPU, Broadwell it is.
Re: (Score:1)
"or wanting the newer GPU 5-10 years down the road"
At the current pace of technology, it is very likely that 5-10 years down the road your system won't even have the bus slot the newer GPUs will require. That includes the current gen of processors we are now discussing. PCI Express is going to go away in favor of direct processor interconnects on an interposer. Soon we will be at the point I predicted in my teenage years, where we have a closely-linked multi-socket motherboard, and all we do is repla
Re: (Score:2)
Eh, I wrote that badly :) I meant that 5-10 years down the road, you're a bit better off having a Skylake GPU than a Broadwell GPU, for driver support/features.
I believe PCIe still has some life left: PCIe 4.0.
You're probably right in some ways, but that would be the death of the PC as an open platform: you would only buy Intel stuff that only works with Intel stuff, or AMD stuff that only works with AMD stuff (it's already that way with motherboard chipsets, but you still have additional controllers), or the third
Re: (Score:3)
Re: (Score:1)
That will come in 2016... possibly late Q2/early Q3 2016. We may not even see anything quite like that until Kaby Lake.
Re: (Score:2)
Why would you care about the GPU on a desktop? If you don't want to bother with a graphics card and want to use integrated graphics, use a regular Broadwell or Skylake i7. If you need a GPU, add a graphics card.
Iris Pro sorta kinda makes sense on laptops: slightly better-than-integrated performance for slightly more power consumption, without
Re: (Score:2)
Not everyone wants the noise of an add-on video card. Some of us don't game and only need "good enough" graphics to drive the display manager. Add a dollar or two saved on the power bill per year, less money spent on the power supply, and the money saved on the no-longer-necessary add-on graphics card, and built-in CPU graphics sounds like a "win" to me.
I might keep using my fanless NVIDIA card on my next box, but I'm going to wait and see whether I can saturate my drive I/Os while
Re: (Score:1)
"Some of us don't game, and only need "good enough" graphics to drive the display manager requirements"
Okay, here's your 256KB GPU RAM. Have fun!
Re: (Score:2)
Even a megapixel display at 24 bits requires 3MB per frame... and a megapixel display has been "low end" for a lot of years now!
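The framebuffer arithmetic behind that figure, extended to a few common resolutions (single buffer only; real stacks double- or triple-buffer and usually pad 24-bit color to 32 bits):

```python
# Framebuffer size = width * height * bytes per pixel.
for name, w, h in [("~1 MP (1152x864)", 1152, 864),
                   ("1080p (1920x1080)", 1920, 1080),
                   ("4K UHD (3840x2160)", 3840, 2160)]:
    mb24 = w * h * 3 / 2**20   # 24-bit packed color
    mb32 = w * h * 4 / 2**20   # 32-bit padded color, the common case
    print(f"{name}: {mb24:.1f} MB at 24bpp, {mb32:.1f} MB at 32bpp")
```

So even a non-gaming desktop compositing a couple of 4K windows chews through far more than 256KB, which is the point being made.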
Seriously, though -- why does everyone sneer at the fact that not everyone is a gamer? Why are gamers so god damned fucking ARROGANT about their "my dick is bigger than yours" hardware?