Two Approaches to the Next-Generation Desktop 421
puppetman writes: "Tom's Hardware has a review up of a pre-production P4/2666 using 533MHz Rambus memory (and shows it stomping the competition). The Pentium 4 needs memory bandwidth, and DDR doesn't supply it. Or does it? Anandtech, ironically, has a preview of the E7500 chipset from Intel - dual-channel DDR with support for up to 16 gig of RAM. With a new bus architecture, this looks perfect for high-load databases that need wide pipes to hard drives, memory, and Ethernet. Both of these technologies look great for mid-range database servers.
Anandtech claims that dual DDR200 will provide 3.2 gig/second of bandwidth, whereas Tom claims that DDR266 (single channel) offers only 2.1 gig/second. Intel is sure hedging their bets. I wonder what AMD has up their sleeves."
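Both figures quoted above follow from the same simple arithmetic: a DDR channel moves 8 bytes per transfer and two transfers per clock. A quick sketch (the clock values are the standard base clocks behind the "DDR200"/"DDR266" names):

```python
# Peak memory bandwidth: base clock (MHz) x 2 transfers/clock (DDR)
# x 8 bytes per 64-bit channel x number of channels.
def ddr_bandwidth_gbs(base_clock_mhz, channels=1, bus_bytes=8):
    return base_clock_mhz * 2 * bus_bytes * channels / 1000

# "DDR266" runs a 133MHz base clock; "DDR200" runs a 100MHz base clock.
print(f"single-channel DDR266: {ddr_bandwidth_gbs(133):.1f} GB/s")             # ~2.1
print(f"dual-channel DDR200:   {ddr_bandwidth_gbs(100, channels=2):.1f} GB/s") # 3.2
```

So the two articles' numbers are consistent rather than contradictory: the E7500 gets its 3.2GB/s by pairing two slower channels.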
Overclocking (Score:5, Interesting)
Re:Overclocking (Score:2, Informative)
Re:Overclocking (Score:2)
Re:Overclocking (Score:2)
Desktop?!? (Score:2, Insightful)
I know that's no reason to stop advancing hardware, but it seems a good enough reason to slow down on the hype.
Re:Desktop?!? (Score:2, Insightful)
Re:Desktop?!? (Score:2, Insightful)
Every now and then one can find opinions like: "Nobody normal needs such speed or that much memory..." I can still remember zealous defenders of the good ol' 286 and their arguments that the 386 was an "unnecessary complication".
I agree that every new development in CPU muscle is usually wasted on making the Office Assistant do fancier tricks. However, new hardware developments eventually get employed for more useful purposes.
Geeks that frequent this board often discuss things like DV cameras and editing video material. Only a few years ago such things were reserved for expensive SGIs. Today you can do it on a platform with a price tag below $1k (hardware only, though).
Bottom Line: Every new breakthrough in technology is a Good Thing (TM). It means that by the time it hits the consumer market, geeks will have plenty of inexpensive toys to play with.
Re:Desktop?!? (Score:2, Insightful)
You made the valid point that "some hardware development has a much higher payoff than others" but I am not sure it applies to the 386. Protected mode, introduced by the 386, is THE advancement that enabled "modern" OSes on the consumer desktop. One may argue that the x86 architecture was the worst possible one to start with (a bunch of legacy crap: the 1MB barrier, etc, etc), but it is the standard accepted by the market. I would say that this particular development brought a handsome payoff at least to Intel.
As for your remark about "OS and Apps that don't run 'fast enough' on any existing hardware", it is a chicken-or-egg paradox. Every new generation of software prompts (commercial) development of hardware and vice versa. Conspiracy theorists drool about an unholy alliance between hardware and software vendors. The funny thing is that they are probably right. However, the end result is that we have more powerful computers at affordable prices. Computer hardware is one of the few things that has millions of engineer-hours behind its development and still sells for peanuts.
Re:Desktop?!? (Score:3, Interesting)
Hell, I was DISAPPOINTED with the ABYSMAL results that came in.
Huh?? You ask?
Well yah.
You see how that MPEG4 video took TIME to encode? Time that could be measured in MINUTES per video?
Tell me when I can do MPEG4 encoding at over 1000x real time speed with shitloads of VirtualDub filters running and without my CPU even going up to 10% utilization, and THEN I will say that we have (maybe) gone fast enough.
As it is I still have to hit the render button, wait... wait... wait... wait... wait... wait... wait... wait... wait... wait... Run it through my post production filters and repeat the waiting (seen above) if not for an even longer time, and THEN I get to compress it down to some sort of video stream (choose a codec folks).
Ooooh great...
Royal pain in the arse when rendering takes longer than creation.
Oh yah, and did I mention that I am not even using over $1k of software here? I am not even running some sort of fancy high end effects house, I am just doing regular quicky animations. But rendering those Terrains sure is a pain in the arse, and then those realistic clouds, ooh ouchies MAJOR performance hit there folks.
Heck, even Photoshop still takes time to run filters. Not even complex filters either, just single ones. (It has gotten A LOT better since the 'old days' of running Photoshop on those "brand new Pentium 166MHzs!!!" Oh man, that was
What about even transferring images from a digital camera? You know how bleeping long it takes to load previews of all the images in a folder? You know, all 100-200 images? Or more? Most likely of varying resolutions to boot. How lovely.
That _IS_ enough to annoy a Grandma and encourage her to upgrade to a new machine.
You think people want to WAIT to encode their MP3 streams? Why? By the time we hit 10GHz or so (and if HD speeds hopefully start scaling up a bit faster), we shouldn't have to.
Or at least we sure as friggin better, heh.
Until then my 700MHz Duron OC'd to 950MHz/1GHz (depending on time of year)
(well that and my 80GB + 20GB HDs which are quickly filling up. Screw CPU time, I can always play Gameboy, but I _NEED_ more HD space damnit! I filled up 40GB in two weeks, and I wasn't even trying!!)
Re:Desktop?!? (Score:3, Insightful)
---
I'm doing 3D animation; the faster the machine, the fewer hours I spend waiting for my renders to come out. That's ONE application... it's not because you're still playing Tradewars in ASCII on an XT that some other people won't benefit from advances in technology.
In your everyday life, other technologies benefit from it: CAD benefits from it, movie studios benefit from more power, science, etc. I can't believe some people are SO self-centered that they pull out comments like this (nor can I believe the moderators modding the parent up). I mean, if you have the IQ to come here and read the articles, how can you think like that?
Granted, these changes are kinda pointless for most people; after a 1GHz CPU and a GeForce2, you don't need much more to enjoy what most end-user technologies have to offer, but there are still DESKTOP users out there who enjoy powerful machines for things other than showing off
$0.02
Re:Desktop?!? (Score:2, Informative)
Team Competition (Score:2)
Distributed computing, of course. Lookie them blocks fly!
Re:Desktop?!? (Score:2)
Run GNOME at a decent speed?
Need and want: (Score:5, Insightful)
Do *users* need this memory bandwidth or does the proverbial Quake benchmark need it?
Show me a "desktop" application (as the headline implies) that requires this. Even the most cutting-edge 3D games don't use current 3D processors to their potential these days.
Re:Need and want: (Score:5, Insightful)
Re:Need and want: (Score:2)
iMovie
iDVD
Quartz (the displayPDF layer)
The stuff that you would need a Mac for...
Re:Need and want: (Score:2)
And, I also have Mathematica. I've done some huge calcs with it that would have been much more enjoyable with more memory bandwidth.
Re:Need and want: (Score:3, Insightful)
Even the most cutting edge 3D games don't use current 3D processors to their potential, these days.
What games are you playing? Firstly, of course, games are usually limited by the lowest common denominator (meaning that you severely limit the polygon count if 50% of the population is using the Virge 3D), but secondly there are some games that seriously tax current hardware: An excellent example is "Operation Flashpoint", which on a GeForce 3 Ti200 has a visibly stuttering frame rate at 1024x768/32-bit colour with a reasonable set of options (I'd say the frame rate is 10-25 FPS), yet even that game represents a massive set of compromises: Visibility is limited to 800m or so, there is a limited number of units in a set area, and land is mostly defined by textures rather than polygons, as polygons are too expensive. Even in the venerable Quake 3 with the Urban Terror mod, some of the maps (which still represent a massive collection of compromises) send the previously mentioned video card begging for mercy in parts (and Q3 is OLD). And 1024x768 is hardly a great resolution; of course if you want to use FSAA you'd better knock down to 800x600. Saying that "cutting edge" games don't use the hardware to its potential makes me presume that the most demanding game you've played is The Sims (though even it can get stuttery when you have fully decked out a multi-level house, and it is hardly an example of photo-realism).
The "too much power" argument has always been flawed, going back to when the 486 was introduced and countless pundits exclaimed that a 386/33DX was all anyone needed. This same argument has gone on, foolishly, since the beginning of computers I'm sure. Actually probably back to the abacus.
Photoshop (Score:4, Funny)
Re:Photoshop (Score:2)
I suspect Jobs won't be using Photoshop to compare PCs with Macs. He'll be using things like iDVD, iMovie, and their ilk. I mean, those are the biggest reasons to buy a Mac right now. At least on the consumer level.
Re:Photoshop (Score:4, Funny)
I mean, what better way to encourage people to upgrade from 'old' iMacs to 'new' iMacs when you show them the ability to burn DVD data at 4x the speed?
'With our old iMac you were able to burn a DVD in real time, which was incredible, but that still took forever. 90 minutes of video would take over your computer for too long, so we fixed that. With the new 1.4GHz processors, you can encode your DVDs at four times the speed. You can burn your home movies, 90 minutes of video, in just twenty minutes. Isn't that wonderful?
We've also added the capability of storing up to three hours of video on a DVD. That's 180 minutes of video, and it still only takes 45 minutes to burn. We think you'll find this very exciting. Marvelous.'
Something like that. Can't you just hear him saying something like that?
Re:Photoshop (Score:2)
And you're right, the DVD-R is going to be the limiting factor, but I doubt it will be so for long.
You asshole (Score:2)
I just got spam from you yesterday!
Why AMD won the battle before it even began (Score:4, Insightful)
The vast majority of systems that are being sold today are somewhere around the 1GHz mark. They represent the "sweet spot" on the price/performance curve, and quite frankly, users just don't need anything better. Open source OS users, such as most of us here, don't need to ratchet up the speed to 1.5GHz unless they're running a bleeding edge release of the bloated KDE 2. Windows XP runs just great (well, as well as Windows XP can run, anyway ;) on my Duron 900.
Desktop users don't need anything faster than 1GHz. So what's Intel's brilliant strategy? Why, they're going to develop chips that are even faster than the overpriced 2GHz P4s they're having difficulties unloading right now.
And that, my friends, is why AMD is well on its way to winning the war. Intel is putting a product on the market without bothering to notice that nobody needs anything faster. They will lose a lot of money doing this (a friend at Intel pegged the development costs for this chip at $3.7 billion). AMD is sitting tight and refining their core business: solid, stable, speedy, and inexpensive chips that consumers can afford and that consumers actually want to buy.
If I were a stock broker, I would be telling all of my clients to short Intel and go long on AMD right about now. The revolution is underway and the underdog is winning.
Mr. Uptime
Re:Why AMD won the battle before it even began (Score:4, Insightful)
Neither does Tom Pabst. No average consumer gives a damn what one computer guru says, or whoever the hell Tom Pabst is. They care what the stoned, look-you-got-a-dell kid has to say, and in a couple months, he's gonna say you want that 2GHz Dell. Advertising is everything. And Intel's ad budget is big.
Re:Why AMD won the battle before it even began (Score:5, Insightful)
Until we have machines that can perform (near) perfect speech control/dictation, face recognition (in real time, reading expressions), and can make realistic holograms (a la the ST:TNG Holodeck), I will not even begin to believe that CPUs have come far enough.
In the meantime, AMD rides the gravy train.
Re:Why AMD won the battle before it even began (Score:2, Insightful)
We don't really need systems that are any faster, unless they're orders of magnitude faster
What we need now (until some bloke figures out something new & spiffy to tax a P10 or an Athlon whatever) are systems that are rather more flexible. Right now cost is a pretty significant limiting factor, as is reliability.
Come to think of it, what we really need are appliances that cost $99, work more reliably than my toaster & can, with minimal fuss & expense, replace my word processor, PVR, fax, email station, CD-burning station, etc.
Re: (Score:2)
Re:Why AMD won the battle before it even began (Score:2)
Speech control vs Gregg Shorthand (Score:2, Interesting)
I think that speech data entry is inefficient and not appropriate in most office environments. Think of how noisy it would be if everyone spoke to their computers!
What would be really wonderful is a Gregg Shorthand recognition system, for palmtop, laptop, and desktop digitizer pads. It would be a lot faster than the current text recognition systems, and maybe even faster than a keyboard for prose input. I don't think that Gregg is being taught as much as it used to be, but a freely available Gregg input system would bring it back for sure. There are already several gesture recognition programs out there. Gregg is something like that.
Re:Why AMD won the battle before it even began (Score:4, Interesting)
I bought my computer with a 1.2GHz Athlon in September. At that point about 1/3 of the computers in stock were AMD. Since about a month after that I've been in that and several other computer stores, multiple times, and NOT ONCE have I seen a computer with an AMD chip. I'm sure the companies will only be too happy to oblige when you order (as I ended up doing with mine) but I've stopped seeing them in the stores. Could this have something to do with the fact that I'm in Canada, some bizarre business decision on AMD's part, or perhaps we just like Intel a lot more? Or is this happening generally in computer stores? If I recall, this sudden shift away from AMD happened around the time of the release of the P4s. Don't underestimate the public's willingness to succumb to hype and a feeling of security. Most people will gladly hand over the extra one or two hundred to make sure their two grand machine can surf the net and doesn't explode.
Re:Why AMD won the battle before it even began (Score:2)
Unfortunately my workplace's idiotic people seem to prefer crappy Dells these days, which are just the cheapest components soldered together and thrown into a pretty box. About 1/4 of them have extreme stability problems (shitty power supplies and bad ram), a few of them like to hang during POST!?
Screw Dell. Yay Cemtech!
Why you're wrong (Score:2)
As the operations manager for a medium-sized business, I am responsible for approving or denying acquisition requests (ARs as we call them). And I will strongly encourage my employees to buy Intel machines over AMD machines if they want their AR approved. Why is that? Although I am very impressed with the speed of AMD chips, and very unimpressed with RDRAM and P4s' performance (did you know they reduce the cache memory clock as they increase the core speed to prevent overheating?), P4s are an order of magnitude more stable than Athlons. Having seen several Athlons crash and burn in the past two years, I have been refreshing AnandTech every morning awaiting the release of a comparably fast P4.
Most businesses hire smart people, and there are probably thousands of people just like me who want the speed of an AMD chip, coupled with the reliability and quality of an Intel chip. Well, the day has finally come, and Intel will sell these chips faster than they can restock the shelves. Good for them.
freebsd guy
Re:"It's a well Known Fact"... (Score:2)
Here's a not-so-well-known fact: By the time AMD gets to a REAL 2GHz processor (Barton), Intel will be at 3.0GHz, and it ain't looking back.
True, but the 2GHz Athlon 3000+ will probably still meet the 3GHz Pentium's performance, or come close, for half the price.
Re:Why you're wrong (Score:2)
Re:Why AMD won the battle before it even began (Score:3, Insightful)
Which is AMD's contribution: bringing the price of heavy desktop computing firepower down to "Why not?" prices. And my HDTV PCI card chews serious CPU time, so having several hundred MHz to spare is rather nice.
On investing in AMD stock: speaking as a 2+ year AMD shareholder, if you buy in, prepare yourself to be in it for the long haul and for the insane price swings. AMD is one of the most manipulated stocks on the market. It's insanely undervalued right now, but there's absolutely no way to tell when its valuation will reflect reality.
Maybe the Hammers will do the trick. At the least they'll beat the crap out of those souped-up P4s Intel let Tom play with.
Re:Why AMD won the battle before it even began (Score:2)
the overpriced 2Ghz P4s they're having difficulties unloading right now
I wonder why my recent purchase of an IBM Intellistation M Pro was delayed for three weeks because of a shortage of PIV 2GHz processors?
(Yes, it was worth the wait. No, I didn't pay for it.)
Re:Why AMD won the battle before it even began (Score:2)
Don't rule Intel out yet, but certainly give AMD its due props for making fast computers for so cheap. I remember my IBM DX4-100 costing $300, mb and chip. Memories.. o/~
Re:Why AMD won the battle before it even began (Score:5, Interesting)
You're missing the second part of the story, here -- while increases in top-end processing speed are nice, they are not the only result of faster/more efficient processors.
Another major benefit is that a newer chip can run at the same clock speed on less power and with less heat, meaning that even if they only sold the chips to run at 1 GHz, they would be able to run on half or a third of the power that a current 1 GHz chip needs.
I recently replaced the 700 MHz celeron in my home entertainment machine with a 1.2 GHz Pentium 3 -- not because I needed more power, quite the contrary. I underclocked the P3 to 600 MHz and took off the processor fan, thereby reducing the total noise on the system. It's been running fine, only a few degrees warmer than the old chip with active cooling. Total power use and waste heat is down.
In a few years, the 20 GHz chips mean that we'll be able to run our wristwatches off a battery for months at 600 MHz without any cooling at all. THAT is the point...
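The parent's power win is plausible because dynamic CPU power scales roughly as frequency times voltage squared, and a lower clock usually tolerates a lower voltage. A toy model (the voltages below are illustrative guesses, not measured P3 figures):

```python
# Dynamic power toy model: P ~ C * V^2 * f. Capacitance C is treated as
# a constant scale factor, so only the ratio between the two cases matters.
def dynamic_power(freq_ghz, volts, capacitance=1.0):
    return capacitance * volts ** 2 * freq_ghz

full  = dynamic_power(1.2, 1.45)  # hypothetical stock speed and voltage
under = dynamic_power(0.6, 1.10)  # underclocked to 600 MHz and undervolted

# Halving the clock alone would halve power; undervolting drops it further.
print(f"power ratio: {under / full:.2f}")  # prints "power ratio: 0.29"
```

That extra factor from the voltage term is why an underclocked chip can shed its fan entirely rather than merely running a bit cooler.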
Re:Why AMD won the battle before it even began (Score:2)
Thanks for any info.
Re:Why AMD won the battle before it even began (Score:2)
Re:Why AMD won the battle before it even began (Score:2)
I don't know; I told my laptop to use the least power possible when I'm running on battery power. I specifically set the CPU speed to slowest when on battery to save power. I don't notice the difference in speed, and it is only a PII-266 (I think).
Re:Why AMD won the battle before it even began (Score:2)
Disagree... (Score:2)
I do live television recording off my WinTV card using Snapstream, and I can assure you the faster processor does matter. If I could have justified the price of a 2.2GHz P4, I would have done so. I figure I'll wait for this next generation memory bus to come out and then look at upgrading next year.
I also use VMWare, and I almost justified the price of going dual processor just for that.
I guess the point there is that as the computers get faster, bigger, better, we find new applications to take advantage of them. I know I certainly have. I also like the fact that things occur nearly instantaneously on my system.
Oh, and I bought the Intel over the AMD because I absolutely cannot stand it when my computer locks up. I want stability and reliability... Hence I also went with an Intel motherboard. Thus far no issues, this D815EPEA2U board is by far the best I have ever seen.
I like that AMD is in the market, but until I start seeing some reviews which acknowledge them for something besides speed, I'm leery. I'm certainly not betting against Intel; companies also want stability, and they buy far more computers than Linux nerds.
Re:Next-gen windows (Score:2)
Re:Why INTEL won the battle before it even began (Score:2)
AMD was never considered the performance leader until the Athlon came out, and even then Intel could still charge more for its (slower) procs. Why? Cause they were Intel and they have mindshare due to advertising. Why AMD doesn't allot a larger budget to advertising toward normal people (non-geeks), I don't know.
Nobody I know who bought their computer from a store has an AMD, and almost everyone I know who bought it from a computer chop shop or built it themselves has an AMD.
Good (Score:5, Interesting)
But flood the market with P4's and K-Whatevers
at 3+GHz, and the price, she'll keep on droppin'.
Thank God for the bleeding edge hardware buyers. They keep folks like me, who consistently buy CPUs at the $/MHz sweet spot, supplied with enough left over for more memory.
Knunov
Re:Good (Score:2)
For that price you could buy an Athlon XP 1600 and decent motherboard.
Re:Good (Score:2)
Check pricewatch or a similar website.
Re:Good (Score:2)
For what a P3 will cost you, you could get a faster chip AND the motherboard to support it
Re: be careful with the sweet-spot myth ... (Score:2, Interesting)
Simple Example:
Chip 1, 0.9 GHz: $200
Chip 2, 1.5 GHz: $450
Chip 3, 2.0 GHz: $1500
Knumov: buys Chip 1 at $200, waits a year, then buys Chip 2 when it reaches $200, and so on. Average $/MHz = 33 cents
Smart guy: buys Chip 2 and skips a cycle. Average $/MHz spent = 30 cents
Smart Guy saves some bucks and enjoys the most demanding games and apps for a year while Knumov does not.
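The example above is easy to play with in a few lines. The 33-cent average depends on how the two $200 purchases are amortized, so the sketch below sticks to the unambiguous part: the up-front $/MHz of each chip from the example prices:

```python
# (MHz, price) pairs taken from the example above.
chips = {"chip1": (900, 200), "chip2": (1500, 450), "chip3": (2000, 1500)}

def dollars_per_mhz(name):
    mhz, price = chips[name]
    return price / mhz

for name in chips:
    print(f"{name}: {dollars_per_mhz(name):.2f} $/MHz")
# chip1: 0.22 $/MHz, chip2: 0.30 $/MHz, chip3: 0.75 $/MHz
```

The mid-range chip is still a far better deal than the top bin; whether buying it immediately beats waiting for price drops depends on how much you value a year of extra speed.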
Re:Good (Score:2)
No kidding. I'm running YDL on a PowerMac 7200/120 with a 120MHz 601 PPC processor. Everything is slick as oil running Blackbox, until I do something foolhardy like launch Konq or another KDE pack-in.
Glad I upgraded from that 486/66 I was using before...
--saint
No surprises here... (Score:3, Insightful)
That said, I just wish the process of speeding up memory wasn't so painfully slow! It'd also be nice to have some kind of standard mem that'd work anywhere, even though both Rambus and DDR techs look promising for the immediate future (and they're now comparably priced).
16GB: 12GB unusable? (Score:3, Interesting)
The article didn't address (yuk yuk) this, and I'm certainly not on the cutting edge of chip design nowadays...can someone explain how you can use those upper 12GB? Is it increased address space on the P4 (seems unlikely) or some magic communication to the chipset (also seems unlikely), or something else entirely?
Re:16GB: 12GB unusable? (Score:2, Informative)
"Offers a maximum memory bandwidth of 3.2GB/s through a 144-bit wide, 200MHz dual data rate SDRAM memory interface supporting a maximum of 16GB of memory"
I believe the chip has 36 address lines...
Re:16GB: 12GB unusable? (Score:5, Informative)
Nope, all Pentium chips since the PPro and PII have had more than 32 address lines (not sure exactly how many; is it 40?). To access this with 32-bit registers requires the use of 4MB paging instead of 4KB paging. Check out http://x86.ddj.com/ [ddj.com] for more info.
The page size has nothing to do with it (Score:2)
The P4 has 36-bit addressing (Score:2)
All newer Intel chips have a mode called PAE, Physical Address Extension, that allows them to access over 4GB of RAM. In fact it allows them to access a total of 64GB of RAM (36 bits). Windows 2000 Advanced Server and Datacenter both expose PAE memory through a mechanism they call AWE.
What happens is that Windows presents every application with a 4GB virtual address space (2GB system, 2GB app) regardless of the actual physical RAM installed. It does this even if you have less than 4GB; all virtual addresses are mapped to physical addresses by the kernel. Now when a PAE-aware app is running on a system with more than 4GB of RAM, what it does is set up a window in its address space for mapping the higher memory into. It then orders Windows to point that window at whatever area it happens to need to see at the time.
It is a system very much like EMS/XMS from the days of DOS; however, paging is handled at the app level, not at the Windows level. It is not a permanent solution, but it provides a temporary stopgap for large databases that need over 4GB of memory until Intel has its 64-bit line out in earnest.
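The arithmetic behind those numbers, plus a toy model of the windowing scheme just described (the window and dataset sizes below are made up for illustration; no real AWE API calls are shown):

```python
# 36 physical address bits vs. a 32-bit virtual address space:
print(2 ** 36 // 2 ** 30, "GB physical")  # 64 GB physical
print(2 ** 32 // 2 ** 30, "GB virtual")   # 4 GB virtual

# AWE-style windowing, schematically: a fixed window in the app's virtual
# address space is repeatedly re-pointed at different physical regions.
WINDOW = 256 * 2 ** 20  # hypothetical 256 MB window

def remaps_needed(dataset_bytes):
    # Ceiling division: how many times the window must be re-pointed
    # to walk the whole dataset once.
    return -(-dataset_bytes // WINDOW)

print(remaps_needed(6 * 2 ** 30))  # a 6 GB table needs 24 window moves
```

The remapping cost is exactly why this is a stopgap: every pass over a big dataset pays for all those window moves, which a flat 64-bit address space avoids entirely.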
If you wish to buy Intel server boards with support for more than 4GB, look at the SuperMicro P4DL6 motherboard. It supports a maximum of 16GB of DDR-SDRAM (with two-way interleaving), 133MHz PCI-X, onboard SCSI, dual P4 Xeons, and so on.
http://www.supermicro.com/PRODUCT/MotherBoards/
Re:16GB: 12GB unusable? (Score:2)
When will heat become an issue? (Score:2, Interesting)
Looking at the pictures I see memory that requires fans. Granted, I've got 7 computers up here, and various other pieces of equipment (monitors, printers, etc.), but I don't think I can handle any more major BTU producers. At what point are the chip manufacturers going to be limited by the fact that the average person cannot provide the cooling required?
Re:When will heat become an issue? (Score:2)
Oh Come On! (Score:2)
There is a need for more speed... (Score:5, Insightful)
1. Current applications run fine. This is not true. More and more people are doing things such as video editing (think iMovie, for instance) and DVD encoding, as well as playing the next generation of high-powered games. It would be great if video effects and encoding could be done in real time or faster. It is only with faster chips that this is happening. Final Cut Pro 3 can now do some effects in real time on a G4. As chips get faster, current applications will speed up even more, which means less time waiting around and more time getting stuff done. And anyone who has seen the new Unreal2 engine benchmarks should know that next-gen games will require way more power, which leads nicely into the second point...
2. More power allows applications that we haven't even thought of yet, or that are currently not feasible. Stuff like near-perfect voice recognition, enhanced artificial intelligence/analysis, modeling, etc. are all applications that can't be realized in general right now, but may be some day. How about a navigation system on the web that uses a U2-like engine? Look at what has happened so far. Many user interface enhancements have only been made possible by greater speed/memory. Given more power, it is impossible to predict what enterprising people will come up with. I mean, the argument that "well, things are just fine for word processing" holds true for a 1980-era IBM/Apple II machine. Is anyone seriously arguing that nothing that has happened since has been important for how people work with their computers?
I know many of us on
Re:There is a need for more speed... (Score:2)
Every time there's a story on Slashdot about newer, faster hardware, somebody will say that it's more than anyone ever needs. That's as predictable as "how about a beowulf cluster of these" and as insightful as "640K of memory is all anyone will ever need."
Not terribly long ago, CPUs had to struggle for many seconds or even minutes just to display a JPEG image on the screen. Imagine the state of the web today if that were still the case. Not as long ago as that, CPUs didn't have enough power to decode MP3 in real time. Five to ten years from now, there will be something that our 10-100GHz CPUs do so quickly that we'll take it for granted.
Re:There is a need for more speed... (Score:2)
On a similar note, the current DirectX 8.0/8.1 that divides the 2 major video cards (Nvidia and ATI) may be a reason that these next-gen games are taking so long to create. Of course it could also be that with the introduction of pixel shaders and other new features, video graphics engines have gotten ten times harder to create.
If they could only figure out some way to turn CPU power into bandwidth.....
Great Server, Silly Desktop (Score:4, Insightful)
Give me a desktop with no fan, lots of pixels and video RAM, and a reasonable-sized disk and a CD-burner. In a small case. And put the disk in one of those removable-drive drawers so it's easy to replace. If it needs more than 500 MHz, it belongs on the server in the back room. Desktops are for running X (or VNC if you don't have a real OS), and doing light development, and running MP3s. If I need to have a dedicated machine to do development on instead of a shared environment, (which I don't), it almost certainly needs to be a slower machine to emulate a random customer.
Actually, my current desktop is a laptop running Win98. There's never enough RAM, and often not enough disk, but the 450MHz CPU is almost always fast enough.
Re:Great Server, Silly Desktop (Score:2)
It has one external 5.25" bay, one PCI slot, built-in AGP video, sound, Firewire, USB, composite video out, a drawer-mounted hard drive bay, and only weighs 6 pounds. It measures about 11" in all dimensions.
The main drawback is the CPU: you can install either a Celery or a PIII.
BUT: I understand that on or about April 1, a new version of the case is coming out. Here's hoping for AMD support!
Rambus makes poor server memory (Score:3, Informative)
As one can derive, this greatly increases latency as the number of modules increases. Servers, being systems that generally have lots of RAM, often have at least 8 modules available.
Due to this increased latency as a function of the number of modules (and other factors), Rambus is therefore poor memory for servers.
Note that this is per channel, meaning a dual channel Rambus system with eight modules has the memory latency of a four module system because the modules are split between the Rambus channels.
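The scaling described above can be put in a toy model: modules on a Rambus channel sit in series, so each one adds a hop to the worst-case access. The base latency and per-hop cost below are invented purely to illustrate the shape of the curve, not measured RDRAM figures:

```python
BASE_NS = 40  # hypothetical latency with a single module
HOP_NS = 5    # hypothetical extra latency per additional module on a channel

def worst_case_latency_ns(modules, channels=1):
    per_channel = modules // channels  # modules split evenly across channels
    return BASE_NS + HOP_NS * per_channel

print(worst_case_latency_ns(8, channels=1))  # 80
print(worst_case_latency_ns(8, channels=2))  # 60: same as 4 modules, 1 channel
print(worst_case_latency_ns(4, channels=1))  # 60
```

Whatever the real constants are, the point stands: a fully-loaded server pays a latency tax that a lightly-populated desktop never sees, and adding channels only divides the problem rather than removing it.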
My wishes for the next-generation desktop... (Score:3, Insightful)
A modern desktop environment is built on many layers, lots of processes and daemons, and many interfaces and abstractions, most of which could be delegated to and shared among other hosts. Poor performance? No need to throw away the old box, just add a new one. With open and interoperable interfaces like X11, CORBA, XML, HTTP or whatever, a next-generation desktop of this kind should be possible, especially with Free software.
In my view the most promising solution towards this concept is the GNU Network Object Model Environment (GNOME), largely based on CORBA, using only a few remaining locks which are likely to disappear within the next few years. If finally a common object model between GNOME, KDE, GNUstep and other backends can be established, the seamless multi-host cross-platform desktop could become reality.
The 2.6 GHz machine could then be used to build SETI packages and Linux kernels to heat up the office
Doesn't Nforce do dual channel now? (Score:2, Insightful)
Of course the E7500 is in a different league than the Nforce, but the Dual Channel Idea is pretty much the same.
FSB is the bottleneck (Score:2)
Re:FSB is the bottleneck (Score:2)
Not fast enough. (Score:4, Informative)
Ram bandwidth (Score:2, Informative)
Unfair comparison (Score:5, Interesting)
I would like to note that while the P4 did trounce the Athlon XP, take a look at the numbers (and I'm not talking about price, as I don't even want to know how much that P4 will cost!)
The AthlonXP 2000+ runs at 1,666MHz on a bus which is the equivalent of 266MHz.
The P4 is running at 2666MHz (a full Gigahertz higher frequency) with a bus at the equivalent of 533MHz.
The (essentially overclocked) Pentium 4 has a full SIXTY PERCENT CPU clockspeed advantage and a ONE HUNDRED PERCENT front side bus (FSB) advantage, yet look at its real-world performance:
MP3 encoding: 6.2% faster than the Athlon. (woop)
DivX encoding: 30% (note that the program is highly optimized, by Intel themselves, for the P4. How many programmers have an Intel engineer handy?)
Cinema 4D: 12.8%
3DMark 2001: 4.9%
Note that Lightwave was not included--the only common test that runs faster on the P4 is the raytracing test. Guess which one Tom's Hardware used?
I just thought I'd point out that the only conclusion that you can really draw from these tests is that, as many in the hardware community know, the P4's architecture is designed for high clockspeed, with zero regard to actual real-world performance. Which matters more to you?
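To make the parent's point concrete, here is a quick back-of-envelope calculation (a sketch of mine, using only the clock speeds and percentages quoted in this post) of how much work the P4 does per clock relative to the Athlon:

```python
# Rough per-clock comparison using the figures quoted in this post.
p4_mhz, athlon_mhz = 2666.0, 1666.0
clock_advantage = p4_mhz / athlon_mhz - 1.0  # ~0.60, the "SIXTY PERCENT" above

# Benchmark speedups of the P4 over the Athlon XP, as quoted above
speedups = {
    "MP3 encoding": 0.062,
    "DivX encoding": 0.30,
    "Cinema 4D": 0.128,
    "3DMark 2001": 0.049,
}

for name, s in speedups.items():
    # Work done per MHz, with the Athlon normalized to 1.0: anything
    # below 1.0 means the P4 needs more clocks to do the same work.
    per_clock = (1.0 + s) * athlon_mhz / p4_mhz
    print(f"{name}: P4 does {per_clock:.2f}x the Athlon's work per clock")
```

On these numbers the P4 does roughly 0.66 to 0.81 of the Athlon's work per clock cycle, which is the "high clockspeed, lower per-clock efficiency" trade-off the parent is describing.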
Unfair post (Score:5, Insightful)
The P4 is running at 2,666 MHz (a full gigahertz higher) with a bus at the equivalent of 533 MHz.
How come so many people rant and rant about how clockspeed isn't everything, then go and use the same argument in a different way to establish the "clear superiority" of the Athlon? Who cares how many Hz faster one is than the other? (Don't argue about consumers here; that's for another discussion...)
Sorry, but if you're going to paint it as an achievement that the Athlon performs so well 1000MHz slower than the 2.6GHz P4, then why can't the Intel fanboys paint the fact that the P4 runs at 2.6GHz as an achievement?
The (essentially overclocked) Pentium 4 has a full SIXTY PERCENT CPU clockspeed advantage and a ONE HUNDRED PERCENT front side bus (FSB) advantage, yet look at its real-world performance:
"Essentially overclocked" Pentium 4? It's not a new Pentium 4 chip, it's a new motherboard. Of course it's an "essentially overclocked" Pentium 4. Why add in the negative connotations?
I just thought I'd point out that the only conclusion that you can really draw from these tests is that, as many in the hardware community know, the P4's architecture is designed for high clockspeed, with zero regard to actual real-world performance. Which matters more to you?
I dunno, looking at these benchmarks I'd say the Pentium 4's architecture is damn fast. It's scaling up incredibly fast. Remember when it was first released and everybody called it a disaster?
Intel could easily release those 2.6GHz chips today, but they aren't doing it for marketing reasons. The architecture of the Pentium 4 is incredibly fast, but the management of the company is spreading out the releases over time. You can get a 2GHz today and overclock it to 2.6GHz. People are doing that all over.
The Athlon is a different design: it's very fast. The Pentium 4 is another design: it's very fast. The Athlon is cheaper, by a fair margin, especially at the highest-end chips. But painting the picture that the Pentium 4 is so very much slower than the Athlon, especially with benchmarks like this, is just plain stupid.
Re:Unfair post (Score:2)
Re:Unfair post (Score:2)
And exactly why is this OK to you? Do you like being marketed at? Do you like being fed shite and being told it's ice cream?
And before you talk about scaling, you should know that a processor "scales" well if you can run it at higher frequencies without increasing voltage or supercooling. At frequencies that AMD and Intel ship at, the processors benchmark similarly.
If you were not so busy singing the praises of P4 you might also notice that the Tualatin core is overclocking as well as the Athlon, and surpasses them both in some benchmarks.
Do some damn research before you post or start on about "fanboy blah blah fanboy" while being a fanboy.
I cannot believe you were modded up for that flamebait.
Re:Unfair comparison (Score:2)
Intel directly sent us a manipulated version of the conversion tool FlaskMPEG 0.6, although we don't recommend using it. This utility is only faster when used in conjunction with Pentium 4 processors, which will disappoint many users. CPUs from other manufacturers are out of luck. We'll include these effects in our benchmarks. [tomshardware.com]
You can get the file here [tomshardware.com].
Lightwave rendering benchmark. (Score:3, Interesting)
At least he does other benchmarks to rule out the possibility of errors.
Interesting number 2666 (Score:2, Funny)
These are not Quake-machines... (Score:2)
You don't run Quake 2 on a Sun E4500. True, Tom and Anand don't benchmark with Linux/Apache, Win2k/Oracle, or Solaris/Netscape, but they should.
Our database is Oracle on dual P3 933s with 2 gigs of RAM. An E7500 with up to 16 gigs of RAM would take the CPU usage on one of our database machines from 40% to about 20%.
Why do people keep talking about Quake benchmarks, kernel compiles, etc?
Is Tom Credible? (Score:2)
An interesting development in the market is in regard to the memory prices: currently, DDR SDRAM costs just as much as RDRAM. The high price of Rambus, which we have mentioned in many articles previously, should no longer be a purchase barrier.
Mushkin's price for 256 MB of DDR 2700 is $116, and a Mushkin 256 MB RIMM is $149. Who knows how much the unavailable 533 MHz RIMM will run, but it's certainly going to be more than $149.
Secondly, his benchmark charts don't jibe with other reviews where the 2000 XP is pitted against a 2.2 GHz P4. He's got the P4 trouncing the Athlon, whereas Anandtech gives only a slight edge to the P4.
Maybe Tom's gone to the Steve Jobs School of Benchmarks?
umm whooperty-shit (Score:2)
To be blunt, and without trying to start a flame: who cares? I'm as excited as the next guy for newer, faster machines. But, who cares. I'm using a 500 MHz AMD now and it's just starting to show a bit of grey. With the exception of super-duper digital video apps, Photoshop, and serious number crunching, what does anyone need these machines for? Nice to have one, but Word or AbiWord or StarOffice work the same at 500 MHz as at 1 or 2 or 10 GHz. What's the app that will make a machine this powerful useful for the great majority of PC users? I'm really curious. I want honest answers.
When do I get to walk up to a screen and say "hey monkeyface, what's my checking balance" and have it respond "zilch, po-boy, and who you callin' monkeyface"? When I can get a system to do that, then I'll give a damn.
Re:umm whooperty-shit (Score:2)
hardly "next generation" (Score:2, Insightful)
Tom Pabst over there is using some new hardware (basically some fatty P4's, and some juiced up RAMBUS), but his mobo, cards, software, etc, are all things that
"My reports repaginate in
If you think dual DDR channels is a lot... (Score:2)
Absurd (Score:5, Interesting)
"Our detailed tests show that forthcoming P4 CPUs with 133 MHz FSB clock used in conjunction with the 845E chipset (DDR SDRAM support) will effectively be castrated."
Intel castrated it themselves. Compare its performance to VIA's P4X266 chipset's performance and you will see that Intel crippled it to prevent it from competing with Intel's Rambus chipset. Notice that Intel is suing VIA over that chipset because it ruins the facade that RDRAM is better than DDR. Also note that Intel has refused nVidia's request for an Intel license for a DDR chipset. Intel knows that a dual channel DDR chipset would show RDRAM for what it is: a fraudulent attempt to maintain a high-performance monopoly. Whatever company "causes to be sold" the most RDRAM gets to own a controlling interest in Rambus Inc. At this point, Intel is the clear winner, even though Sony made a race out of it by packaging Rambus with the Playstation 2. Intel suppresses their own DDR performance to make people believe that RDRAM is the fastest stuff out there. AMD would be committing suicide by using RDRAM to capitalize on Intel's marketing hype, because that would place them directly under Intel's thumb.
"This is because the Pentium 4 has a problem: the increase in clock speed (e.g. P4/2533 or P4/2666) will be rendered useless by the slow DDR SDRAM memory bus of the 845 platform".
Again, this is Intel's doing, for product-placement purposes, just as was done with the Celeron when it competed with the Pentium III, and as Apple did with the new iMac's 100 MHz FSB, 800 MHz G4. A 133 MHz FSB does not cost any money; it is just an easily achievable clock frequency with current chipsets.
"And one shouldn't forget that even a dual DDR platform for P4 should be priced at a level that is similar to a Rambus system, considering that it's from Intel."
Rephrased: "And one shouldn't forget that even a dual DDR platform for P4 will be priced as high as an RDRAM system, because Intel will not license the platform to nVidia and Intel KNOWS it will outperform a Rambus system, ruining two years of carefully crafted marketing and gamesmanship." The fact is, a dual channel DDR chipset from Intel may eventually be available for the Pentium 4, but at first only for the Xeon, a processor not available except from Intel's favored OEM Partners, such as Dell.
Before you defend Intel, remember that Craig Barrett, after AMD went from 10% market share to 40% in a year, said "the market is dropping" to justify Intel's reduced profits. Well, Intel is a bellwether stock, and the market believed everything Craig said. The market did drop. We all lost our jobs. In hindsight, at least part of the market was due to drop anyway, but because of Craig's statement it was the tech sector that was hit first, and hardest. Instead of simply saying "Intel has reduced profits because of competitive pressure," he brought the entire tech sector down with him. The recession that was due could have been placed entirely on Enron's shoulders. The energy sector was in fact dropping, and Enron's insiders were cashing out at the same time Craig made his statement. People got scared and pulled their money out of the market. There was less money in the market than there had been, and it came out of the tech sector when it should have come out of energy.
Go ahead and defend Intel. They have made poor greedy choices, sold inferior products at exorbitant prices and done it at the expense of all our livelihoods. Shame on them.
Intel's market cap has been cut by Tom Pabst on more than one occasion. A series of his articles deriding Rambus, causing the 1.13 GHz recall, and showing the Pentium 4 for the paper tiger it is has seriously hurt Intel. But Tom, like all hardware websites, is cash poor. Tom's Hardware has resorted to doing marketing research among their readership for Socratic Technologies. Sometimes they have been overt; sometimes they have sent readers to secure servers just for simple popularity polls. Tom's latest revenue-generation technique is the introduction of "Editorial Content Sponsorships," which I'm going to guess prompted the recent editorial change of heart toward Rambus. Please notice that in the most recent article no AMD processors were overclocked according to their projected roadmaps, and the test is presented as if it were fiction. Unfortunately, it seems we have lost another fair and unbiased journalist. Another, because Sharky Extreme was the first to go into Intel's pocket, prompting Sharky himself to leave the website. Sharky's is owned by INT Media Group. Notable investors in INT Media include Dell Computer Corporation, International Business Machines Corporation, Lucent Technologies Inc., Macromedia Inc., Microsoft Corporation, Nortel Networks Corporation and Oracle Corporation.
Expect wonderful reviews of Intel hardware on Sharky's and, unfortunately, now Tom's. Look to [H]ard OCP, The Inquirer, The Register, Anandtech and Ars Technica for relatively unbiased hardware news.
Post Intelligently, Thanks
Re:Absurd (Score:2)
You must have meant to show this article [anandtech.com], where it is clearly shown that the VIA P4X266 has twice the memory bandwidth of the crippled i845. Or this page [anandtech.com], which clearly shows the P4X266 outperforming the crippled i845 by 12%, on par with Intel's RDRAM solution.
This article [forbes.com] shows Intel stands to gain considerably from every RIMM sold.
You said it yourself: Intel had an existing chipset in June of last year supporting DDR but would not allow motherboard manufacturers to use DDR with it. That means they crippled it themselves to make RDRAM look better.
Their deal did not end; it's just that Rambus's stock price dropped to less than $6 a share, making Intel's options to buy Rambus at $10 a share look a little weak.
Since Rambus's stock at one time traded at over $100, they could have seen a tenfold increase in their investment. But since Rambus stayed so expensive, and offered no performance advantage over the Athlon, their GREED caused their own loss of market share.
You may also note I am keeping tabs on IDF and also mentioned Intel's DDR chipset, but I am beginning to think you don't read. Take note: "The fact is, a dual channel DDR chipset from Intel may eventually be available for the Pentium 4, but at first only for the Xeon, a processor not available except from Intel's favored OEM Partners, such as Dell."
"Your entire post is full of inaccurate information and typical anti-Intel garbage. Don't take me as pro-Intel, but anyone can see right through your crap if they looked at it."
Why do you waffle here? We can certainly see through your crap; why be such a fence-sitter about it? I take you as pro-Intel with no spine. If you could stand up for them with a spine I would at least respect you, but as it is you post a few incorrect links, restate my point for me, and then roll around about what you like.
Dammit I told you to post Intelligently.
Re:Absurd (Score:2, Interesting)
Um. Maybe that's because in the benchmarks you're looking at, the i845 is using SDRAM and the P4X266 is using DDR?
Go figure that DDR has twice the bandwidth of SDRAM?
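The arithmetic behind that "twice the bandwidth" is simple: peak theoretical bandwidth is just bus width times effective clock. A quick sketch (my own illustration, using nominal 64-bit buses and 133/100 MHz base clocks; real parts use 133.33 MHz, which is where the familiar rounded 1.06/2.1 GB/s figures come from):

```python
# Peak theoretical memory bandwidth = bus width (bytes) * effective clock.
# The parts discussed here all use a 64-bit (8-byte) data bus; DDR transfers
# on both clock edges, which is why "DDR266" means a 133 MHz base clock.
def peak_bandwidth_mb(bus_bytes, clock_mhz, transfers_per_clock=1, channels=1):
    return bus_bytes * clock_mhz * transfers_per_clock * channels

pc133_sdram = peak_bandwidth_mb(8, 133)                         # 1064 MB/s
ddr266      = peak_bandwidth_mb(8, 133, transfers_per_clock=2)  # 2128 MB/s
dual_ddr200 = peak_bandwidth_mb(8, 100, 2, channels=2)          # 3200 MB/s

print(pc133_sdram, ddr266, dual_ddr266 if False else dual_ddr200)
```

So DDR266 really is twice PC133 SDRAM on paper, and dual-channel DDR200 lands at the 3.2 GB/s figure Anandtech quotes, so the i845-with-SDRAM comparison says nothing about the chipsets themselves.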
You said it yourself: Intel had an existing chipset in June of last year supporting DDR but would not allow motherboard manufacturers to use DDR with it. That means they crippled it themselves to make RDRAM look better.
No, it means Intel was under contract with Rambus not to release a DDR solution. That contract expired on Jan 1, 2002.
Their deal did not end
Yes, yes it did. That's why there's a DDR 845 now.
Dammit I told you to post Intelligently.
Would it kill you to take your own advice?
And 640K outta be enough for anybody... (Score:3, Informative)
First off, it is naive to think that current users wouldn't use or enjoy more powerful computers. It is the software industry's fault that end users are unable to fully utilize the more powerful machines being built. Already plenty of comments have suggested a variety of applications, from facial recognition to video editing, that would all benefit from faster, more powerful computers.
It is actually important to me that, regardless of the "need" the average user has for more powerful computers, the software industry does its job of driving users to want more power.
Only by nurturing and then feeding the public's appetite for technology does the industry continue to push us forward technologically. If millions of people and companies didn't demand the upgrades and new features that come with more powerful systems, we would risk losing all the potential gains these desires produce.
Now does anyone else find this odd: (Score:2)
The new Sysmark 2002 benchmark includes the following applications:
<snip/>
Office Productivity:
<snip/>
WinZip 8.0
Neat; my 866 is just *way* too slow at zipping up those files.
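For what it's worth, the kind of thing a WinZip benchmark measures is easy to approximate yourself. Here's a minimal Python sketch (the 8 MB buffer and compression level are arbitrary choices of mine, not anything taken from the Sysmark suite) that times deflate-style compression throughput with zlib:

```python
import time
import zlib

# Time zlib compression of a synthetic 8 MB buffer -- a rough stand-in
# for what a WinZip benchmark exercises (CPU-bound compression throughput).
data = (b"the quick brown fox jumps over the lazy dog " * 200000)[:8 << 20]

start = time.perf_counter()
compressed = zlib.compress(data, level=6)  # level 6 is zlib's default trade-off
elapsed = time.perf_counter() - start

throughput = len(data) / elapsed / 1e6  # MB/s
print(f"compressed {len(data)} bytes to {len(compressed)} in {elapsed:.3f}s "
      f"({throughput:.0f} MB/s)")
```

Run it on an 866 and on whatever is current and you can see exactly how much (or how little) the extra clocks buy you on this workload.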
Bloody hell... (Score:2)
I'm right now processing a track from 24-bit to 16-bit for an album remastering I'm doing, in the background, while reading Slashdot, and my _CPU_ is barely as fast as the _bus_ of whatever they're looking at. My bus is more like 33 MHz, I think...
If I can do this and not think too much of it, no wonder they're not going to sell one to me... I think I'm going to wait around for another year or so and then pick up one of the ol' blue-and-white G4s, maybe... gotta love being several years behind the curve; you get the same amount done, but for way cheaper. That will be the point when I start running OS X and programming in something more portable to Linux and the BSDs... by then I ought to be up to speed with that...
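The 24-bit-to-16-bit step the parent describes is cheap enough to run in the background on old hardware because, per sample, it is basically a shift plus dither. A rough Python sketch of the idea (the TPDF dither and round-to-nearest here are my own illustrative choices, not any particular mastering tool's algorithm):

```python
import random

def to_16bit(sample_24, dither=True):
    """Convert one signed 24-bit sample to signed 16-bit.

    Truncation alone correlates the quantization error with the signal;
    adding TPDF dither (the sum of two uniform random draws) before
    truncating decorrelates it. A sketch, not a mastering-grade tool.
    """
    if dither:
        # TPDF dither spanning roughly +/-1 LSB at the 16-bit level
        sample_24 += random.randint(-128, 127) + random.randint(-128, 127)
    # Round to nearest by adding half an LSB, then drop the low 8 bits
    sample_16 = (sample_24 + 128) >> 8
    # Clamp to the signed 16-bit range
    return max(-32768, min(32767, sample_16))

print(to_16bit(0x123456, dither=False))  # 4660 == 0x1234, the top 16 bits
```

One shift, one add, and a couple of random draws per sample: easy work even for a sub-100 MHz CPU at 44.1 kHz.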
Database RDRAM vs. DDR (Score:2, Interesting)
Re:FPS levels (Score:2)
You would be surprised at the number of people who believe that FPS really makes a difference. Try telling your average person that the human eye can't detect anything over about 40 fps (ideal situation, near-perfect eyesight). Your 85 fps is waayyyy too high; if you are in college, try taking a film studies class and they will explain the nuances of FPS to you, and the limitations of human eyesight. Also, try telling them that movies run at 24 fps and they won't believe you.
The fact is most people believe that 339 fps is somehow better than 35 fps (which it isn't). Because of this, these chips will sell. Of course, on the other hand, you also have to realize that eventually there will be an application that will use that much power.
Re:FPS levels (Score:2)
Re:FPS levels (Score:4, Interesting)
Re:FPS levels (Score:2)
Re:FPS levels (Score:2)
I got Quake up to about 120 fps. This allowed me to turn on more graphics features; now I'm at 80 fps. BUT as soon as I'm in a room with 15 other people, explosions, gunfire, etc... my fps dips to about 45 fps.
Now if I had started at 50 fps, by the time I was in a "real world" situation with other players, my fps would be about 18, and THAT does affect gameplay, significantly.
Plus, 24 frames a second in a movie is OK, but you would notice the difference if the film were 40 frames per second.
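The headroom argument can be put in rough numbers. This sketch assumes the busy scene adds a fixed amount of work per frame (a simplifying assumption of mine; real engine load doesn't scale this cleanly), with that cost derived from the 80-to-45 fps drop described above:

```python
# Frame time is 1000/fps milliseconds; model a busy scene as a fixed
# extra cost in ms added to every frame, regardless of the baseline fps.
def loaded_fps(idle_fps, extra_ms):
    """fps once a fixed per-frame cost (in ms) is added."""
    return 1000.0 / (1000.0 / idle_fps + extra_ms)

# Dropping from 80 fps idle to ~45 fps under load implies the busy
# scene costs about 1000/45 - 1000/80 ~= 9.7 ms of extra work per frame.
extra = 1000.0 / 45 - 1000.0 / 80

# Starting from only 50 fps idle, the same extra work per frame gives:
print(round(loaded_fps(50, extra)))  # ~34 fps
```

Under this constant-cost model a 50 fps baseline lands around 34 fps rather than 18, so the real drop depends on how the load actually scales, but either way the lesson holds: the headroom above 40 fps is what keeps the worst-case scenes playable.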
Re:FPS levels (Score:2)