The Impact of Memory Latency Explored
EconolineCrush writes "Memory module manufacturers have been pushing high-end DIMMs for a while now, complete with fancy heat spreaders and claims of better performance through lower memory latencies. Lowering memory latencies is a good thing, of course, but low-latency modules typically cost twice as much as standard DIMMs. The Tech Report has explored the performance benefits of low-latency memory modules, and the results are enlightening. They could even save you some money."
cache (Score:1, Informative)
Link crashed Firefox (Score:1)
Is this a bug that should be reported, or is something else going on?
Re:Link crashed Firefox (Score:5, Interesting)
Re:Link crashed Firefox (Score:2)
Re:Link crashed Firefox (Score:2)
Re:Link crashed Firefox (Score:2)
Re:Link crashed Firefox (Score:3, Informative)
Re:Link crashed Firefox (Score:1)
Re:Link crashed Firefox (Score:2)
"CA|NA||||||||||"
Re:Link crashed Firefox (Score:3, Informative)
What they're doing with the list is anybody's guess.
Re:Link crashed Firefox (Score:5, Funny)
Just another reason to switch to IE!
Re:Link crashed Firefox (Score:2)
Re:Link crashed Firefox (Score:2)
Re:Link crashed Firefox (Score:1)
Re: (Score:2)
Re:Link crashed Firefox (Score:2)
The only problem I've seen with it is a tendency to wipe my acls upon upgrading to a new version, but I imagine that'll be fixed.
apply this to picking a wife (Score:5, Funny)
I'd have to say this is right on when applied to picking a woman to spend your life with... low-latency memory is a BAD BAD thing, and VERY expensive. My next time around, I'm going with the "CHEAPER", high-latency model that can't immediately recall everything I've ever said while arguing her point... Roses and jewelry can cost you over the long run, friends...
Re:apply this to picking a wife (Score:2)
Fret thee not, for such a beast exists solely in myth!
Re:apply this to picking a wife (Score:2)
Matter and anti-matter, my friend. They just don't mix.
Just stick a few blue LEDs on it... (Score:5, Funny)
Re:Just stick a few blue LEDs on it... (Score:2)
Re:Just stick a few blue LEDs on it... (Score:5, Insightful)
Re:Just stick a few blue LEDs on it... (Score:4, Insightful)
Guess what? That wicked dual-core CPU actually runs games slower than its single-core cousin. That brand-spankin' new video card that cost you $400 (or more)? I spend that much on a video card once every several years. The difference is that I don't care about squeezing out my maximum frames per second, because most people couldn't even detect the difference if the game didn't have an option to show the number in the corner of the screen like some veritable rating of their manhood (sorry for my gender bias on that). And that super ultra OHMYFUCKINGGODITMAKESMYEXPLODEITSSOFAST low-latency RAM is giving you a performance boost of maybe 2% over what I've got now.
I find it educational to read these reports so I can make educated purchasing choices. For that, I'm quite grateful. However, I find it kind of sad that the parent post is unsettlingly accurate in that the 'hardcore pc gamers' will shove this to the side for the ATI SXL 10G Super Elite XTRME Pro card next week. Witness what happens when PC gaming meets MTV-esque marketing.
Re:Just stick a few blue LEDs on it... (Score:2)
Is that right? What FPS are you getting in Quake 4 at 1280x1024x32 with trilinear filtering and models/textures set to HIGH?
Re:Just stick a few blue LEDs on it... (Score:4, Insightful)
Re:Just stick a few blue LEDs on it... (Score:2)
Re:Just stick a few blue LEDs on it... (Score:2)
Now if he had said 1600x1200, then maybe we could have taken the question seriously - but I assure you guys that he is just yanking your chains (and you fell for it!)
That said, I second the guy who requested a roundup where they put 512MB of the uber1337 memory with chrome heat-spreaders and blinkenlichten in one machine, and 2GB of Crucial/Kingston budget-line memory in the other - and do a full suite
Re:Just stick a few blue LEDs on it... (Score:2)
FPS makes a difference to playability; anyone who says otherwise has dull senses.
Re:Just stick a few blue LEDs on it... (Score:1)
Why are you associating with such people?
Sad? Who cares? Let people spend their money the
Re:Just stick a few blue LEDs on it... (Score:3, Informative)
Guess what? That wicked dual-core CPU actually runs games slower than its single-core cousin.
Is this actually a true statement? I can't do any current testing since I don't have a reasonable 3D card in my machine, but I remember testing Quake 3 on my old dual Celeron machine with a TNT2 card. top showed Quake was using 95% or more of one CPU, and the X server was using 30% or more of the other CPU.
I don't expect the numbers to be the same today, but shouldn't there be at least some slight increase i
Re:Just stick a few blue LEDs on it... (Score:5, Interesting)
Conclusions
Let's start by talking about the Athlon 64 X2 4200+. This CPU generally offers better performance than its direct competitor from Intel, the Pentium D 840. Most notably, the X2 4200+ doesn't share the Pentium D's relatively weak performance in single-threaded tasks like our 3D gaming benchmarks. The Athlon 64 X2 4200+ also consumes less power, at the system level, than the Pentium D 840--just a little bit less at idle (even without Cool'n'Quiet) and over 100W less under load. That's a very potent combo, all told.
In fact, the X2 4200+ frequently outperforms the Pentium Extreme Edition 840, which costs nearly twice as much. Thanks to its dual-core config, the X2 4200+ also embarrasses some expensive single-core processors, like the Athlon 64 FX-55 and the Pentium 4 Extreme Edition 3.73GHz. Personally, I don't think there's any reason to pay any more for a CPU than the $531 that AMD will be asking for the Athlon 64 X2 4200+.
If you must pay more for some reason, the Athlon 64 X2 4800+ will give you the best all-around performance we've ever seen from a "single" CPU. The X2 4800+ beats out the Pentium Extreme Edition 840 virtually across the board, even in tests that use four threads to take best advantage of the Extreme Edition 840's Hyper-Threading capabilities. The difference becomes even more pronounced in single-threaded applications, including games, where the Pentium XE 840 is near the bottom of the pack and the X2 4800+ is constantly near the top. The X2 4800+ also consumes considerably less power, both at idle and under load.
The X2 4800+ gives up 200MHz to its fastest single-core competitor, the Athlon 64 FX-55, but gains most of the performance back in single-threaded apps thanks to AMD's latest round of core enhancements, included in the X2 chips. The X2 4800+ also matches the Opteron 152 in many cases thanks to Socket 939's faster memory subsystem. Remarkably, our test system consumes the same amount of power under load with an X2 4800+ in its socket as it does with an Athlon 64 FX-55, even though the X2 is running two rendering threads and doing nearly twice the work. Amazing.
There's not much to complain about here, but that won't stop me from trying. I would like to see AMD extend the X2 line down two more notches by offering a couple of Athlon 64 X2 variants at 2GHz clock speeds and lower prices. I realize that by asking for this, I may sound like a bit of a freeloader or something, but hey--Intel's doing it. No, the performance picture for Intel's dual-core chips isn't quite so rosy, but the lower-end Pentium D models will make the sometimes-substantial benefits of dual-core CPU technology more widely accessible. If AMD doesn't follow suit, lots of folks will be forced to choose between one fast AMD core or two relatively slower Intel cores. I'm not so sure I won't end up recommending the latter more often than the former.
Beyond that, the giant question looming over the Athlon 64 X2 is about availability, as in, "When can I get one?" Let's hope the answer is sooner rather than later, because these things are sweet.
Re:Just stick a few blue LEDs on it... (Score:1)
Re:Just stick a few blue LEDs on it... (Score:4, Interesting)
It's thanks to them that the rest of us can get normal gear at such reasonable prices...
Anandtech did this months ago (Score:4, Informative)
You'll basically find that the performance of value memory is pretty much on par with the high-end stuff. What you're really paying for is the ability to overclock on a more consistent basis.
So did ExtremeTech - and they included A64 and P4 (Score:5, Interesting)
Re:Anandtech did this months ago (Score:3, Informative)
Can't Read the Article (Score:2, Interesting)
Ed Almos
Re:Can't Read the Article (Score:1)
http://hardware.slashdot.org/comments.pl?sid=167101&cid=13932623 [slashdot.org]
Considering that just a handful of posts above you another user complained of the same thing, and yet another user provided a likely reason why:
Beware, one of the banner advertisers on that page (netshelter.net) is attempting a buffer overflow with a strangely crafted cookie. Hope you do not run your Firefox on Windows...
Re:Can't Read the Article (Score:1)
Re:Can't Read the Article (Score:1)
Re:Can't Read the Article (Score:1)
OS: Ubuntu 5.10
Browser: Firefox 1.0.7-0ubuntu20
Extension: Adblock 0.5.2.039
Filter: Filterset.G [nyud.net] 2005-10-31a
Re:Can't Read the Article (Score:5, Funny)
(I'm sorry, that's not helpful at all, is it?)
Insightful article (Score:3, Informative)
2-2-2-5 timings at 400MHz with a 1T command rate is the fastest memory, but it costs twice as much, and the performance gains are almost non-existent except in low-resolution games (e.g. at 800x600 you may see an increase of 20 fps, which I think is a lot!). Of course, the cost of the RAM in that case still wouldn't be justified, because putting the extra money into a better video card would be the better thing to do.
Only if you're an overclocker is this worth it, at least according to their benchmarking and their perspective, which I'll accept.
Oh yes, and that website also crashed my Firefox.
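To put the timings above in absolute terms: CAS latency is counted in memory clock cycles, so at DDR400 (a 200 MHz command clock) each cycle is 5 ns. Here is a minimal sketch of the arithmetic, assuming DDR400 and using CL3 as a typical value-RAM rating (neither figure is taken from the article itself):

/* Back-of-the-envelope: convert CAS latency from cycles to nanoseconds.
   Assumes DDR400, whose command clock is 200 MHz (the "400" is the data rate).
   The CL3 value-RAM figure is a typical spec, not taken from the article. */
#include <stdio.h>

int main(void)
{
    const double clock_mhz    = 200.0;               /* DDR400 command clock */
    const double ns_per_cycle = 1000.0 / clock_mhz;  /* 5 ns per cycle */

    printf("CL2: %.0f ns to first data\n", 2 * ns_per_cycle);  /* 10 ns */
    printf("CL3: %.0f ns to first data\n", 3 * ns_per_cycle);  /* 15 ns */
    return 0;
}

A 5 ns gap is small next to the tens of nanoseconds a full random DRAM access costs once row activation and precharge are included, which fits with the tiny differences the benchmarks show outside of CPU-limited, low-resolution tests.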
Re:Insightful article (Score:3, Insightful)
Perhaps. That particular benchmark was Far Cry at 800x600 with medium settings; the lowest fps was around 168 and the highest was 188, so a 20 fps difference.
They were using this video card: NVIDIA GeForce 6800 GT with ForceWare 77.77 drivers
However, if you look at the opportunity cost of buying this RAM because you have a bad video card and play at those resolutions, then it would still make more sense to just get a better vi
Re:Insightful article (Score:2)
Not really. The video card was irrelevant to the test -- all the high vs low benchmarks showed was that if you're GPU bound then your memory latency has virtually no impact whatsoever. An older video card would've just been GPU bound at lower resolutions (for most of the games they tested, the lower resolution test would've been GPU bound by itself; BF2 wouldn't even run on a GF3).
If you're a gamer,
If you can afford a cup of coffee a day... (Score:2)
Re:If you can afford a cup of coffee a day... (Score:5, Funny)
Your analogy does not hold. Slashdot is a high latency site. By the time I've read a few comments, I've usually forgotten what the story was about.
Wait, why am I posting this comment again?
But surely flashing LEDS make it go faster! (Score:3, Funny)
Crow T. Trollbot
My LEDs are blinkier than yours! (Score:2)
Google Images [google.com.au]
PDF from company [corsairmemory.com]
Note, due to their width, you can only put in one per bank. :)
Ostentation doesn't work so well when inside an opaque case.
Re: (Score:2)
Ask a builder (Score:3, Insightful)
The difference between, say, Corsair Value Select memory, and Corsair 1337 Ultra X2000 - the memory equipped with LCDs, heat spreaders, and a spoiler with metal-flake yellow paint that add at least 10 horsepower - is going to be absolutely unnoticeable in the real world. Even benchmark scores will show little to no improvement.
Ricer RAM - you know, the PC equivalent of this crap [hsubaru.com] - is for overclocking. If you're not planning on overclocking it, you're paying too damned much.
Re:Ask a builder (Score:1)
Re:Ask a builder (Score:4, Insightful)
These products are not for people who want to achieve a usable level of performance, and as such they are not marketed at those crowds. They are for people who already have fast equipment but want more. I won't say this is a good or bad thing, as it is simply a hobby for most of these people. Just like import tuners: they may drive funny-looking cars, but it's their choice of hobby.
Re:Ask a builder (Score:2)
If you have to use a stopwatch to tell which system is faster, they are the same speed.
If you calculate that one system is 7% faster than another system, they are the same speed.
If you have one system getting 127 frames per second, and another system getting 136 fps - they are the same speed.
103% for memory tweaks, 102% for OCing the CPU, 104% for a different tweak all add up to: same speed (the compounded figure is sketched below).
There are two magnificent pieces of equipment that are going to make your computer faster:
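Coming back to the percentage figures above: they compound multiplicatively rather than cancel out, though whether the result matters is exactly the parent's point. A quick sketch using those illustrative numbers (they are examples, not measurements):

/* Sketch: compound the example speedups quoted above (illustrative figures). */
#include <stdio.h>

int main(void)
{
    double combined = 1.03 * 1.02 * 1.04;  /* memory tweak * CPU OC * other tweak */
    printf("combined speedup: %.1f%%\n", (combined - 1.0) * 100.0);  /* ~9.3% */
    return 0;
}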
Re:Ask a builder (Score:2)
Re:Ask a builder (Score:2)
From run to run: if they ran 20 runs and in all 20 runs the same car was 0.001 seconds faster every single time, letting the drivers swap cars a few times so each driver got 10 runs in each car, I would be willing to budge a little. But you and I both know that isn't the case. The machines are the same, but one lane had a little more rubber on the ground, or the driver was a little better (or a few lbs lighter, or had on his lucky shoes, or
Re:Ask a builder (Score:2)
Re:Ask a builder (Score:2)
Welcome to Gentoo is Rice, the Volume goes to 11 here [funroll-loops.org]
Re:Ask a builder (Score:2)
For gaming (which is what inspires most insane PC builds) this article shows you'd be better off putting that $100 extra you'd spend on RAM into an even more eye-wateringly expensive video card.
Not to harp on the obvious (Score:1, Interesting)
The underestimated impact of latency. (Score:1, Insightful)
In the short run, these tests help a person decide whether to buy low-latency RAM. But they provide little long-term insight into how much fa
You really think so? (Score:1)
Really? MS hand-tunes the ASM code generated when they do a build of winword.exe? Maybe that's why OO.o is so slow?
If I sound sarcastic, I suppose I am. With a few exceptions, almost every coder I've worked with across multiple jobs has been of the 'throw CPU cycles at the problem' school. I can count on one hand those who actually design for a hardware architecture, since most of the coders these days are VBScript and Java
Re:The underestimated impact of latency. (Score:5, Interesting)
The kind of changes you're talking about require vastly faster memory. Not the kind of latency differences being discussed here at all. Both of these are "high latency" compared to what would be needed for your theoretical redesign of the entire software stack. And even then, you just become utterly and completely screwed if you have to hit virtual memory, possibly more so than you are now because you've re-orchestrated everything around the idea that latency is a non issue.
Oh, and latency is getting worse, not better, and has been for a long, long time. CPU speeds long ago outstripped the speeds of our fastest memory (well, fastest while still not costing absurd amounts of money...), and the newer memory formats (DDR, DDR2, DDR3, RDRAM, etc) have higher latencies in exchange for greater bandwidth.
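To make the latency-vs-bandwidth distinction concrete, here is a minimal sketch (the array size and methodology are my own arbitrary assumptions, not from the post): a chain of dependent loads through a shuffled array is bound almost entirely by memory latency, because each load must finish before the next address is known, so extra bandwidth doesn't help it at all.

/* Sketch: pointer chasing measures latency because every load depends on the
   previous one; prefetching and bandwidth can't hide it. Sizes are arbitrary. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (16 * 1024 * 1024)   /* 16M entries, far larger than any cache */

int main(void)
{
    size_t *next = malloc(N * sizeof *next);
    size_t i, j;

    if (!next)
        return 1;
    for (i = 0; i < N; i++)
        next[i] = i;
    /* Sattolo shuffle: produces a single cycle through all N entries, so the
       chase below never revisits a recently used (and therefore cached) slot. */
    for (i = N - 1; i > 0; i--) {
        size_t k = rand() % i;
        size_t tmp = next[i]; next[i] = next[k]; next[k] = tmp;
    }

    clock_t t0 = clock();
    for (i = 0, j = 0; i < N; i++)   /* serial dependent loads */
        j = next[j];
    clock_t t1 = clock();

    printf("~%.0f ns per dependent load (j=%zu)\n",
           1e9 * (double)(t1 - t0) / CLOCKS_PER_SEC / N, j);
    free(next);
    return 0;
}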
You made my point. (Score:2)
Exactly!
Oh, and latency is getting worse, not better, and has been for a long, long time.
Very true. My first full-sized computer had an 8 MHz processor and 150 ns RAM in 1985. Now there's more than an 8:1 ratio between CPU and RAM clocks (and th
Re:You made my point. (Score:2)
Re:The underestimated impact of latency. (Score:3, Informative)
Re:The underestimated impact of latency. (Score:2)
It's like putting 93 octane gas in a car tuned for 87 octane. You waste your money and get nothing out of it, except maybe a check engine light.
Re:The underestimated impact of latency. (Score:2)
Re:The underestimated impact of latency. (Score:2, Interesting)
Word/Excel isn't going to bother, but for a game it might be worth stuffing in a few versions of tweaked loops that are selected by a loop invariant, or by f
Re:The underestimated impact of latency. (Score:2)
Re:The underestimated impact of latency. (Score:2)
I agree. I did some benchmarking similar to this about 5-7 years ago for an avionics architecture, but was not funded to get into this level of detail. I was looking for the best CPU, best network, best OS combinations. Some days I miss the embedded world, as now I work in enterprise IT, where such stuff is WAY below the level of concern.
What does this mean? (Score:2, Interesting)
OT, but (Score:2)
JOC, why don't you specify Athlon X2 4400+ or 4800+s? They all have 1MB L2 per core, as well.
Re:The underestimated impact of latency. (Score:2)
What about cache? (Score:4, Interesting)
Also, outside of the HPC world, it seems very few programmers optimize their cache usage. Are there any tools (open source or otherwise) that can actually help you locate/fix inefficient uses of cache?
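Valgrind's cachegrind and oprofile are the usual suspects for finding cache misses on Linux, and Intel's VTune does the same on its platforms. As for what "optimizing cache usage" looks like in practice, here is a minimal sketch (array size and timing approach are my own assumptions): the same summation is either cache-friendly or cache-hostile depending purely on traversal order.

/* Sketch: identical work, very different cache behaviour. C stores arrays
   row-major, so the row-order loop streams through memory while the
   column-order loop takes roughly one cache miss per element once the
   matrix no longer fits in cache. Try it under valgrind --tool=cachegrind. */
#include <stdio.h>
#include <time.h>

#define N 2048

static double a[N][N];          /* 32 MB, larger than typical 2005-era caches */

int main(void)
{
    int i, j;
    double sum = 0.0;
    clock_t t0, t1;

    t0 = clock();
    for (i = 0; i < N; i++)     /* row order: sequential, prefetch-friendly */
        for (j = 0; j < N; j++)
            sum += a[i][j];
    t1 = clock();
    printf("row order:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (j = 0; j < N; j++)     /* column order: 16 KB stride between accesses */
        for (i = 0; i < N; i++)
            sum += a[i][j];
    t1 = clock();
    printf("column order: %.3f s  (sum=%g)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC, sum);
    return 0;
}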
Re:What about cache? (Score:1)
Re:What about cache? (Score:1)
Re:What about cache? (Score:1)
Remind me again, how much L1 cache exactly does a Pentium 4 have? Wasn't it something like 8KB fast cache on the older ones and 16KB at half the speed on the newer ones?
Re:What about cache? (Score:2)
Re:What about cache? (Score:3, Informative)
So, the bottom line is that cache is the most expensive type of memory in a computer. Some methods have been made
Re:What about cache? (Score:3, Informative)
Re:What about cache? (Score:3, Insightful)
Ok, you say, so move it off the chip. Well the problem is that part of the reas
Re:What about cache? (Score:2)
(can't have a subject that starts with $) $ (Score:3, Insightful)
Cache isn't some magical thing. It's simply RAM. SRAM, usually, which is why it's so fast (it doesn't have to waste power/time refreshing its contents). At the end of the day, it's just some very fast RAM. It sits between your CPU and the rest of your RAM, and uses its increased speed to "trick" the CPU into performing as if your main RAM were much faster than it is.
In my computer arch course a while back, someone asked why, if cache is so fast, we don't just build co
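You can actually watch that "trick" happen from userspace. A minimal sketch (buffer sizes, stride, and iteration count are arbitrary choices of mine): timing the same number of accesses while the working set grows shows distinct plateaus for L1, L2, and main memory.

/* Sketch: average access time vs. working-set size. The steps in the output
   roughly mark where the buffer stops fitting in L1, then L2, then falls
   through to main memory. Illustrative only, not a rigorous benchmark. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    const long accesses = 32L * 1024 * 1024;
    long size;

    for (size = 4 * 1024; size <= 64 * 1024 * 1024; size *= 4) {
        volatile char *buf = malloc(size);
        long i, idx = 0;
        clock_t t0, t1;

        if (!buf)
            return 1;
        for (i = 0; i < size; i++)   /* touch every byte so pages are mapped */
            buf[i] = (char)i;

        t0 = clock();
        for (i = 0; i < accesses; i++) {
            buf[idx] += 1;           /* 64-byte stride: one cache line per access */
            idx = (idx + 64) % size;
        }
        t1 = clock();

        printf("%8ld KB: %.2f ns/access\n", size / 1024,
               1e9 * (double)(t1 - t0) / CLOCKS_PER_SEC / accesses);
        free((void *)buf);
    }
    return 0;
}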
Re:(can't have a subject that starts with $) $ (Score:2)
There are a lot of good reasons... (Score:2)
What are you talking about? My G3 running at 450MHz has a 1MB L2 cache, and it has since 1999. Pentium Pros and various workstation/server class chips had multimegabyte caches a decade ago.
The reason you've seen less cache is that it didn't make sense to have a slow CPU with a 4MB cache that had to dissipate 100+ watts to operate. On-die cache is expensive in terms of heat, die space, and clock speed.
There's also the marketing factor; Intel would have
Save money? (Score:2)
Re:Save money? (Score:1)
Re:Save money? (Score:4, Funny)
Re:Save money? (Score:2)
Re:Save money? (Score:1)
Importance of clock speeds (Score:2)
Re:Importance of clock speeds (Score:2)
The real issue ... (Score:4, Insightful)
Re:The real issue ... (Score:2)
1st best-leveraged enhancement: disk access. Once you get that to such-and-such a speed, then the next important step is:
Bus speed. Once you get this to speed x:
Video card.
then RAM latency...
(and of course C
well Duh? (Score:2)
Scientific computing benefits from this (Score:3, Interesting)
Memory latency and memory bandwidth both impact how long it takes my simulations to complete. Let's say it is the difference between a simulation taking a week vs. five days... this is significant to me and to how much I can get done. With these heavy-duty scientific models and such, you really can see a noticeable benefit with the fancier hardware, and clock speed is certainly not the only factor to consider, by a long shot.
As we say in Psychology (Score:3, Funny)
Re:FP! (Score:1)
Re:FP! (Score:5, Informative)
Yes. I regularly buy high-speed RAM and downclock it, but run it at lower latency. For instance, if I wanted to run my RAM at 400MHz, I'd buy 433/466/500MHz VAL-U-RAM and run it as a stick of semi-premium 400MHz.
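The arithmetic behind that trick, for anyone wondering: the DRAM chips need a fixed amount of time in nanoseconds regardless of clock, and CL is just that time expressed in clock cycles. A minimal sketch with assumed ratings (CL3 at DDR500 is illustrative, not a claim about any particular stick):

/* Sketch: why RAM rated for a higher clock can often run tighter timings when
   downclocked. The chips need a fixed time in nanoseconds; the CL rating only
   has to cover that time in clock cycles. Figures below are illustrative. */
#include <stdio.h>

int main(void)
{
    const double rated_mhz = 250.0;                           /* DDR500 stick */
    const double rated_cl  = 3.0;                             /* rated CL3 */
    const double chip_ns   = rated_cl * 1000.0 / rated_mhz;   /* 12 ns */

    const double run_mhz   = 200.0;                           /* run it as DDR400 */
    const double cycles    = chip_ns * run_mhz / 1000.0;      /* 2.4 cycles */

    printf("chip needs %.1f ns; at %.0f MHz that's %.1f cycles,\n"
           "so a CL2.5 setting (if the board offers it) already covers it.\n",
           chip_ns, run_mhz, cycles);
    return 0;
}

Whether a given stick actually holds CL2.5 or CL2 at the lower clock still depends on the chips and the voltage, so treat this as the reasoning, not a guarantee.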