Intel in the GHz Game Again - Skulltrail Hits 5 GHz
An anonymous reader writes "Intel's Skulltrail dual-socket enthusiast platform has been making the rounds on the web for half a year or so, but we haven't seen many details yet. TG Daily got a close look at an almost complete prototype, which, judging from the article, sounds very nearly production-ready. Everything that TG Daily describes suggests Skulltrail PCs will be very limited in availability and insanely expensive. Intel has also said it developed 'special' Xeon processors with desktop processor attributes just for Skulltrail. These chips are currently running at a stable 5 GHz."
I guess... (Score:5, Funny)
Re:I guess... (Score:5, Funny)
Re: (Score:3, Funny)
So it all works out in the long run.
Re: (Score:2)
Also, Duncan Hill Coffee (Score:5, Funny)
Obviously, it's the only architecture hand-designed by Dethklok.
Re: (Score:2, Insightful)
Excessive? (Score:2, Interesting)
On the other hand...will this be out in time for Crysis?
Re:Excessive? (Score:5, Insightful)
Re: (Score:2)
I agree with you; games are power-hungry, but by no means the most power-hungry things you can do with a PC. Mind you, I'm weird - I've actually done proper scientific numerical simulation work (and had to leave it running overnight to finish). I've also done video transcoding, and while that doesn't take as long, it wasn't quite real-time the last time I did it, so there's definitel
Re: (Score:3, Funny)
Re: (Score:2)
But yes, you're correct. You did leave out data mining and a few other applications.
What is amazing is the size of problems that we are now willing to tackle with desktop hardware.
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
we all know that games generally speaking are the most intensive software ever run on a PC
Not even close. Games, after all, run in realtime. There are many, many applications out there that have no problem pegging top-of-the-line hardware for hours on end: DV editing, raytracing, scientific computing.
Maybe by "PC" the GP meant "desktop PC" and not "workstation." Of course, the Skulltrail platform is just workstation hardware (dual Xeons, ECC FB-DIMMS) with some modifications for uber-gamers that have more money than common sense.
Isn't a standard workstation (dual workstation CPUs, ECC RAM, workstation graphics card) more appropriate for DV editing, raytracing, and scientific computing? Isn't "desktop" hardware (fast single desktop CPU, faster non-ECC DDR2, "gamer" graphics card/cards) more appropriat
Wasting Time... (Score:4, Funny)
You are wasting your time; the answer will always be 42...
Re: (Score:2)
Which is exactly why more processing power is important. You can't wait hours for it to render the next scene. Then again, it really is just a semantics issue over what he meant by "intensive." You could always just throw your computer into an infinite loop and peg the processor as well.
Re: (Score:2)
One bizarre thing: I ran effectively the same job on an 8-CPU Intel machine and an 8-CPU AMD machine of very similar specs - ther
Re: (Score:3, Insightful)
I'm pretty sure games stress the whole system a lot more than any application I'm aware of. Math problems, ray tracing, and DV editing, if I'm not mistaken, are CPU-exclusive operations. I'm not an expert, but high-end graphics cards are more powerful than CPUs, even if they are specialized. I can't think of any other application that will stress the CPU, GPU, RAM, HDD, and everything else to 100% other than games.
You are mistaken. Take particle physics simulations, for example. The system might be downloading a 10GB dataset for the next simulation while it's working on detector simulations that use the current dataset. The download would max out your net connection while the simulation work would max out your CPU and require something like 2-3GB of RAM. The two activities probably generate a decent I/O load as well.
Same deal with audio or video processing, if you're streaming a vi
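To make that overlap concrete, here's a minimal sketch in Python - the URLs and the crunch step are placeholders, not anything from the article:

import threading, urllib.request

def prefetch(url, results, key):
    # Pull the next dataset down in the background; this part
    # saturates the network connection, not the CPU.
    results[key] = urllib.request.urlopen(url).read()

datasets = ["http://example.org/run1.dat", "http://example.org/run2.dat"]
buf = {}
t = threading.Thread(target=prefetch, args=(datasets[1], buf, 1))
t.start()                  # download proceeds while we compute
# crunch(datasets[0])      # meanwhile the cores chew on the current set
t.join()                   # next dataset is ready the moment we finish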
Re: (Score:3, Informative)
Re:Excessive? (Score:4, Insightful)
Not to mention running something like World Community Grid [worldcommunitygrid.org]. I love using my idle processor time to tackle AIDS, Cancer, Muscular Dystrophy, Dengue, etc.
-l
Re: (Score:2)
Translation (Score:5, Insightful)
It will be 20% faster, 200% hotter, need a 300% noisier fan, and consume 500% as much power.
Yes, but (Score:5, Funny)
Re:Yes, but (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Don't you mean 500% hotter? For any practical purpose, heat produced in an IC is equivalent to the amount of power it draws.
Re: (Score:3, Funny)
If you could make something require 500% more power but convert only 200% more energy to heat (ignoring photonic emissions), you'd have yourself a Nobel Prize.
I'm just sayin'.
Re: (Score:2)
That's pretty rough - by that stage the fans become stalkers.
Estimating power at 5GHz (Score:3, Interesting)
At 3GHz, it uses 8.79W when doing nothing, and 73W when running all four cores flat-out.
At 4GHz, it uses 16.83W to do nothing, and 135W with all four cores flat-out; on the other hand this required a voltage increase to 1.44V from the 1.25V that sufficed up to 3.33GHz.
Fitting curves suggests that you would be using something like 350W for four cores at 5GHz, which is quite impressi
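A back-of-envelope sketch of that extrapolation, assuming the usual dynamic-power model P ~ C * V^2 * f and a guessed 1.6V at 5GHz (leakage current grows too, so treat the result as a floor):

def dynamic_power(p_ref, v_ref, f_ref, v, f):
    # Scale a measured power figure to a new voltage and frequency;
    # ignores leakage current, so it underestimates at the high end.
    return p_ref * (v / v_ref) ** 2 * (f / f_ref)

# Measurements from above: 73W at 3.0GHz/1.25V, 135W at 4.0GHz/1.44V
print(dynamic_power(73, 1.25, 3.0, 1.44, 4.0))   # ~129W, close to the measured 135W
print(dynamic_power(135, 1.44, 4.0, 1.60, 5.0))  # ~208W before leakage; a fitted curve lands higher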
Not if it is putting out energy in other forms. (Score:5, Funny)
Insanely expensive... (Score:5, Insightful)
bragging rights (Score:5, Funny)
Remember, it's not just the spammers that profit off of people with small penises. Auto manufacturers, TV manufacturers, home builders, and now Intel all profit off of them too.
Re: (Score:2, Funny)
Ha! I don't buy things from any of those people, and my penis is tiny!
Re: (Score:2, Offtopic)
A third-party observation from the secretary was that guys who drive reasonable cars are more comfortable with themselves and more stable. Better material, etc.
Of course, I then bought an M3 and ruined it all!
Re:So... (Score:5, Insightful)
Confucius say, a small dick is still better than an unused one.
Re: (Score:2, Offtopic)
Perhaps those women are right; that they really can't find a comfortable, stable guy.
Perhaps many of the comfortable, stable guys aren't interested in girls with the kind of self-confidence issues that put them in M3s with assholes.
I'm drastically simplifying relationships when I write this and I'm also grossly generalizing, so please take what I write with a grain of salt. I'd love to hear other people's experience.
Here's my dating experience, having gone through many stages of
Re: (Score:2)
Re: (Score:2)
In my experience, physical attractiveness does play an important role in relationships. A friend put it well: "The difference between a close friend and a lover is attraction."
Just remember that attraction isn't only about looks.
A long time ago, I took up weightlifting and martial arts in order to improve my physique. I've been at both for years. But IMO, the biggest successes I've had have come from impro
An addendum... (Score:2)
I think it's very possible to be a comfortable, stable, and secure person around your friends, family members, and co-workers and still be insecure around women. Each is a unique experience.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I got more action, looks, and flirting when I had my sports car. Lots more.
Now it could be that the women I was seeing didn't want anything more than a quick weekend at the beach having sweaty sex.
So enjoy your M3.
Yes, I did post this to brag about my younger weekends. Considering I get paid the same to program computers as my colleagues who spent all their younger years constantly sitting in front of a computer, I win.
Re: (Score:2)
This is only talking about a 20% clockspeed jump over a current overclocked workstation platform. Sure, that won't translate into a 20% performance jump overall, but it'll still be a performance jump. I'm not really up for spending an extra $1000 for an 8% framerate increase, but for people who like that tradeoff this is a perfectly reasonable product. Further, it sets the bar higher for future products - and as a technology enthusiast I can always get behind raising the bar.
Re: (Score:2)
What's insanely expensive now will be standard consumer performance within a major game platform's development lifecycle.
That's not a bottleneck (Score:2, Informative)
It's just not that simple. A 5GHz CPU will be faster than a 3GHz CPU, and 3 video cards will be faster than 1 video card, almost regardless of other components. The only real bottlenecks you can talk about are the system buses, and at the moment that's not a problem either. HyperTransport 3.0 and Intel's quad-pumped buses are still plenty wide enough for 5GHz processors, no sweat.
I completely u
Hertz by themselves are useless (Score:5, Insightful)
Where did my /. go? (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Hertz? I care about Watt much more (Score:2, Insightful)
Re: (Score:2)
That's exactly how I explained it to my father, back in the K7 vs Pentium 4 days, and as far as he knew MHz was how fast it went and Intel was selling chips with big
MHz wars are over (Score:3, Insightful)
Look at the history of processor speeds. We've been pretty flat, and for all practical purposes will stay that way for a while.
Before someone throws out the quote like they're smart: Moore's law refers to transistor count, not speed.
1) Faster chips require better fabs. Fabs are having difficulty producing wafers with few enough flaws to produce mass quantities. Strides are being made, but no massive breakthroughs.
2) Multiple cores and real parallel processing development are just starting to become expected knowledge for the average application developer. Let's be honest, a lot of developers don't bother to understand multi-threading and avoid it like the plague. Fortunately there are some IDEs that make it easier for developers.
Re:MHz wars are over (Score:5, Insightful)
Re:MHz wars are over (Score:5, Interesting)
A properly functioning word processor can already do 99.99% of what a user asks of it as fast as the user can tell it to do something, even on a bottom-of-the-line processor.
Today's video games, sure, aren't going to benefit much from multicore. But I disagree that the benefits for future games will top out at 2 cores. I mean - you could have 1 core handling user input and processing, 1 core handling the physics environment, 1 core for unit AI, 1 core for graphics information. There's a quad core right there (a toy sketch follows below).
Business and scientific apps will see some benefit beyond that, but memory tends to be the bottleneck there - we'd be better off increasing memory bandwidth and reducing latency than raising clock speed.
Then they can start worrying about beefing up memory bandwidth - I've read about some technologies in the pipe that will help with this. And the scientific community can always use more bandwidth - they are one of the larger users of supercomputers, and this might take a project from 'Need to rent 24hrs on the supercomputer for $$$' to 'I can run this on my work computer for a month/week to get the same results for $'.
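As a toy illustration of that one-core-per-subsystem split - the subsystem bodies are empty placeholders, and multiprocessing is used because Python threads would share one core under the GIL:

import multiprocessing as mp

def physics(q):
    while True:
        state = q.get()
        if state is None:       # sentinel: shut down cleanly
            break
        # ... integrate positions, resolve collisions ...

def unit_ai(q):
    while True:
        state = q.get()
        if state is None:
            break
        # ... pick unit actions for this tick ...

if __name__ == "__main__":
    phys_q, ai_q = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=physics, args=(phys_q,)),
               mp.Process(target=unit_ai, args=(ai_q,))]
    for w in workers:
        w.start()
    for tick in range(3):        # main loop: input + rendering stay here
        phys_q.put({"tick": tick})
        ai_q.put({"tick": tick})
    for q in (phys_q, ai_q):
        q.put(None)
    for w in workers:
        w.join()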
Re: (Score:2)
Re: (Score:2)
Wait till they start modelling brains (Score:2)
Re: (Score:2)
Re: (Score:2)
That's true, but the algorithms / program designs that work great with a hundred cores work like crap on one or two cores. Personally, I expect to see video games designed to be truly concurrent just as soon as low-end gamers have quad core machines (and high-end gamers have 32-thread systems).
Re: (Score:2)
Re: (Score:2)
This is true.
Here's the thing: Every one of the applications that people commonly run on a desktop PC can be parallelized.
The real problem is that programmers who are used to single-threaded designs cringe when they see the parallel version. Not only is it moderately more complex, but generalizing a design to many cores frequently entails a 10% to 50% performance penalty compared to the
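A rough Amdahl's-law sanity check on that tradeoff - the 0.9 parallel fraction and 30% per-core penalty are illustrative assumptions, not measurements:

def speedup(cores, parallel_frac, per_core_penalty=0.0):
    # Amdahl's law, with each core running slower by the given penalty.
    eff = 1.0 - per_core_penalty
    serial = 1.0 - parallel_frac
    return 1.0 / (serial + parallel_frac / (cores * eff))

print(speedup(1, 0.9))        # 1.0  - single-threaded baseline
print(speedup(8, 0.9, 0.3))   # ~3.8 - slower per core, still much faster overall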
Re: (Score:2)
It'll get better if they improve game threading, but to truly double performance I'll probably need 4 cores.
I figure many-core (i.e., more than four) systems are about four years away. An eternity in computer terms, of course.
Re: (Score:2)
Word processing is a bad example because the software is so badly written and ridiculously resource-hungry while providing fewer features than a desktop publishing program that ran on a 286, or even one that ran on an Atari ST.
Re: (Score:2)
I agree that getting video games to take advantage of multiple cores isn't as easy as some people seem to think - OTOH, games already are massively parallelised. It's just that the parallelisation has already happened on dedicated hardware - a graphics card. Since the CPU tends not to be the bottleneck, it wouldn't benefit from a faster CPU anyway, whether that's higher GHz or more cores.
CPU right now might not be the
Re:MHz wars are over (Score:4, Insightful)
Fewer faster cores will always be more flexible than more slower ones. The reason we go with more slower ones is that slower cores use less power (power scales much worse than linearly with speed, so two 1GHz cores will use a lot less power than one 2GHz one). Some workloads are intrinsically parallel (e.g. web serving) and so having lots of cores using less power is a big win. Others are not and so extra cores are just a waste (although you can often consolidate multiple serial tasks onto one machine with lots of cores).
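The "worse than linearly" arithmetic, spelled out with assumed numbers (dynamic power P ~ V^2 * f, and the fast core needing 1.3V where the slow ones run at 1.0V):

def rel_power(freq_ghz, volts):
    # Relative dynamic power against a 1 GHz / 1.0 V baseline.
    return (volts ** 2) * freq_ghz

one_fast = rel_power(2.0, 1.3)      # one 2GHz core at higher voltage: ~3.4
two_slow = 2 * rel_power(1.0, 1.0)  # two 1GHz cores at baseline: 2.0
print(one_fast, two_slow)           # the slower pair does the same cycles for less power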
Re: (Score:2)
We've also had decades of practice solving problems using multiple processing units. In fact, for the sort of problems that today's processors can just barely handle (i.e. those problems that processing power is still an issue on) we've had *more* practice solving them on compute arrays than we have solving them on single processors.
Re: (Score:3, Insightful)
Re: (Score:2)
If you go on MHz alone, the 16-core machine should process work units at the same speed as the 64-core machine (1/4 the number of processors, but 4x faster CPUs, vs 4x more processors at 1/4 the speed). But that's not taking into account faster bus speeds, better architecture, improved floating point performance, etc.
However, a 96-core machine at 600MHz would process far more units of work than the Opteron machine,
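Making that arithmetic explicit (idealized: perfect parallelism, per-core throughput proportional to clock, and the 2.4GHz figure is an assumed clock for the 16-core box):

print(16 * 2.4)   # 38.4 GHz-cores: fewer, faster processors
print(64 * 0.6)   # 38.4 GHz-cores: a wash, on MHz alone
print(96 * 0.6)   # 57.6 GHz-cores: the 96-core machine pulls well ahead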
Re: (Score:2)
Please, it's all about cores.
I think to you it's all about throwing around bullshit unfounded opinions.
There are many tasks, some of them on the desktop, which will never be parallelizable. Single-core performance has been and will remain absolutely crucial, even when everyone and their mom can write code with a parallelizing toolkit.
Faster chips require better fabs. Fabs are having difficulty producing wafers with few enough flaws to produce mass quantities. Strides are being made, but no massive breakthroughs.
My phoniness meter just exploded.
Re: (Score:2)
Yes, I was once an AMD knight, the same as your father.
Chris Mattern
what does "desktop processor attributes" mean? (Score:3, Interesting)
Thad
Re: (Score:3, Insightful)
Re: (Score:2)
Many developers believe this, and in any practical application they are wrong.
I have never seen an application where real-world performance, and more importantly "perceived performance," isn't improved by threading. Too many times I have seen scenarios where a user sits waiting for the computer to finish when they could be doing other work.
I have seen developers wither under the request to add some multitasking.
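A minimal sketch of that point - slow_export is a stand-in for whatever long job the user kicked off:

import threading, time

def slow_export():
    time.sleep(5)              # pretend this is a long render/export
    print("export finished")

t = threading.Thread(target=slow_export)
t.start()                      # returns immediately
print("UI thread free; the user keeps working")
t.join()                       # collect the result when convenient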
Re: (Score:2)
Re: (Score:2)
You could always store your mp3s on one disk, another application on another disk, etc etc. It's a kluge though, because it requires more power to do so.
Re: (Score:2)
Thrashing? You need a lot more memory then! Are you sure you don't mean disk access? Thrashing occurs when [constantly swapping]
Quote WP:
In computer science, thrash is the term used to describe a degenerate situation on a computer where increasing resources are used to do a decreasing amount of work. Usually it refers to two or more processes accessing a shared resource repeatedly such that serious system performance degradation occurs because the system is spending a disproportionate amount of time just accessing the shared resource. Resource access time may generally be considered as wasted, since it does not contribute to the advancement of any process.
It's a degenerate situation because two or more processes are fighting over the read head on the HDD - it keeps going back and forth, getting very little done because it spends a disproportionate amount of time moving the read head so data can be accessed instead of advancing the process. I agree it's not the most common use of the term, but I think it's appropriate...
Re:what does "desktop processor attributes" mean? (Score:4, Insightful)
I'm not intimately familiar with the specifics in this case, but starting with a server chip and "adding desktop processor attributes" would typically entail:
adding the inability to use ECC.
adding a reduction in cache.
adding a lack of fault tolerance or error checking capabilities.
adding the feature of being impossible to use with > 2 sockets.
adding a whizzy new marketing name.
And, the enthusiast desktop parts are often easy to overclock, while server parts assume you'll just buy a faster CPU instead of wasting time fiddling with something that may catch fire.
BTW, hey, I remember you from alt.movies.visual-effects "back in the day" before the death of Usenet. Good to see you haven't fallen off the face of the planet. I'm now in the process of working on a compositing demo reel so I can try to jump from straight IT to visual effects in the near future. I blame this career change in part on all your interesting and informative posts getting stuck in my head.
Re: (Score:2)
Sorry about that whole visual effects problem... Hope it works out.
I thought your description of the difference between the server chips and desktop chips was right on.
I'm going to be building an 8-core AMD machine in a few days, and I'll use the "ser
Stable (Score:5, Funny)
This looks bad next to an AMD dual quad-core system (Score:4, Interesting)
The dual-socket AMD system this will compete with will use DESKTOP RAM and have 2 or more chipset choices. The AMD setup also lets you have 2 full Northbridge chipsets for even more I/O; the nForce 680a uses this, and NVIDIA will likely have a new chipset with PCI-E 2.0. The old one has x16/x8/x8/x16 PCI-E slots with a total of 56 PCI-E lanes.
The new AMD chipset is also coming, and you may even see a board with 2 Northbridges = 82 PCI-E lanes.
790FX
* Codenamed RD790, final name revealed to be "AMD 790FX chipset"
* Dual-socket (Quad FX, Dual Socket Direct Connect Architecture) or single AMD processor configuration
* Maximum of four physical PCI-E x16 slots and a discrete PCI-E x4 slot; the chipset provides a total of 52 PCI-E lanes, with 41 lanes in the Northbridge
* HyperTransport 3.0 with support for HTX slots and PCI Express 2.0
* ATI CrossFire X, see below
* AutoXpress, see below
* Extreme overclocking, reported to have achieved about a 420 MHz bus overclocking an Athlon 64 FX-62 processor, up from the original 200 MHz
* Discrete chipset cache memory of at least 16 KB to reduce latency and increase bandwidth
* Supports Dual Gigabit Ethernet, and teaming option
* Reference boards codenamed "Wahoo" (dual-processor reference design with three physical PCI-E x16 slots) and "HammerHead" (single-socket reference design with four physical PCI-E x16 slots); notably, the reference boards include two ATA ports and only four SATA 3.0 Gbit/s ports (being paired with the SB600 southbridge), but the final product with the SB700 or SB750 southbridge (see below) should support up to six SATA ports
* Northbridge made on a 65 nm process, manufactured by TSMC; runs at 3 W idle and a maximum of 10 W under load (8 W nominal). The northbridge was seen on reference designs with a single passive heatsink instead of the heat pipes frequently used on current mainstream motherboards; the combination of the 790FX northbridge with the SB600 southbridge normally consumes less than 15 W
* Enthusiast discrete multi-graphics segment
Even if the Intel system is faster, the AMD system, with a less costly motherboard and much cheaper RAM, will likely be a better buy.
Re:This looks bad next to an AMD dual quad-core sys (Score:2)
Progress (Score:3, Funny)
Re: (Score:2)
I don't care what motivates their purchases. They are paying for the hardware the rest of us will buy cheaply later in its product life cycle. The basement lifestyle must make for lots of disposable income...
Next, from AMD... (Score:5, Funny)
Compiling kernels in our undies ? (Score:2)
Nothing to see here. (Score:2)
8GB RAM + SLI? (Score:3, Interesting)
Correct me if I'm wrong, but if you're stuck with 32-bit Windows there's no point having much more than 2GB of RAM if you're doing SLI, given that you have 4GB of address space and the video cards take a large chunk of it.
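The quick arithmetic behind that, with assumed card sizes (two 768MB cards in SLI plus other MMIO - no figures in the article):

ADDR_SPACE_MB = 4 * 1024           # 2^32 bytes = 4096 MB of addresses, total
mmio_mb = 2 * 768 + 512            # GPU apertures plus other PCI/MMIO (a guess)
usable_ram_mb = ADDR_SPACE_MB - mmio_mb
print(usable_ram_mb)               # 2048 MB: barely 2GB of RAM is actually reachable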
Re:But... (Score:5, Funny)
Dude, you can run linux on a wristwatch. The question is, can it run Vista?
From an old K5 diary: [kuro5hin.org] -mcgrew
Re: (Score:2)
And it might not be Vista, or even Windows, but Microsoft is also getting into the wristwatch game [windowsfordevices.com].
Re: (Score:2)
No. But Linux can run on it...
Re: (Score:2)
Re: (Score:2)
Re:But... (Score:4, Insightful)
I'd be fairly certain that the NSA uses some kind of off-the-shelf processors, whether that be Power, Itanium, or x86.
What the NSA does differently, most likely, is scale. You put 1,000 of these in a supercomputer? They'll put in 100,000.
Chip fabs are expensive, as is chip design. There's no reason not to leave that to the experts (AMD/Intel). It's a commodity process, and they'll do it better than the government ever can.
Supercomputer design is something else. That's not commodity, and it's a simple scaling problem. More $$ = bigger computer.
Why should they bother reinventing the wheel?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Not the first one at all (Score:4, Funny)
Re: (Score:2)
I *personally* can't see any significant market for Crays either, but they clearly exist and are wo
Re: (Score:2)