Hardware

AMD Duron vs. Intel Celeron

DeadBugs writes: "With all the hype surrounding the new Athlon XP and P4 2.2 GHz, the more affordable processors have been ignored. Tech-Report has a great article comparing the new AMD Duron and Intel Celeron. Both are now running at 1.2 GHz and have upgraded cache. The new Duron contains XP technology, while the Celeron is a PIII Tulatin with a 100MHz bus and built on the .13 micron process."
This discussion has been archived. No new comments can be posted.

AMD Duron vs. Intel Celeron

Comments Filter:
  • From what I remember of the Intel roadmap around 1998, Intel was supposed to have come out with a 100MHz Celeron chipset sometime in the year 2000. Obviously this didn't happen on time...
    • by Anonymous Coward
      You're right, the first Celeron with 100 MHz FSB was the Celeron2 800 which was announced January 2001, missing 2000 by a couple days. That was only a year ago so the news probably hasn't trickled down to /. yet.
    • by questionlp ( 58365 ) on Wednesday January 09, 2002 @08:46PM (#2814140) Homepage
      Part of the problem is that Intel is being plagued by supply issues and they don't want their "crippled" or previous generation processors beating out their newer, more expensive processors. Remember that in some cases, the Tualatin Pentium III beats the Pentium 4 processor while having a lower clockspeed and lower heat dissipation.

      I wish Intel wouldn't have cut off the Tualatin P3 so quickly, as it would make a decent dual processor system... but now I'll be getting a dual Athlon instead :)

      • I really want to make a dual Athlon system, but I am waiting for the 0.13 micron process Athlons. I'd like to have a system that doesn't need roaring loud fans, and two CPU chips that dissipate about 60 Watts each is a bit much heat!

        I was wavering, almost considering breaking my "no Intel CPUs" rule, because the 0.13 micron version of the Pentium III is sweet. But Intel reminded me why I have that rule -- the 0.13 micron Pentium III was deliberately made incompatible with Socket 370 motherboards. I hate it when companies play those sort of stupid games.

        When the 0.13 micron chips come out, they will crush the Pentium 4. Right now the best Athlon is neck and neck with the best Pentium 4, and the Pentium 4 has the benefit of a 0.13 micron process (i.e. a much higher clock rate).

        steveha
    • In 1998 (or maybe in 1997, not positive), Intel already released a 200MHz Pentium chip. Its big selling point was MMX. Intel promised that MMX was "the future of computing". Remember those days? I sure do miss them.
      • MMX was 1996 or 1995 man. I know because I bought a pentium 200 non-MMX in late 1996, and it was very cheap because non-MMX chips were "old technology" by then, and 166 and 200MMX were the standard midrange in late 1996.

        In early 1997 the Pentium II 233 and 266 came out. The celeron didn't exist yet.
        • And when the Celery^W Celeron did come out, it was a castrated bugger... no L2 cache. I have no clue what Intel was thinking when they released that thing. At least the second revision was better, mainly since the cache was running at full speed and was on-die (only 128KB, but even then it still beat the Pentium II in some cases).
    • As has been said numerous times, Intel got hurt by failing to go with the DDR standard. The benchmarks still prove the Duron to be a bit better, but considering they are running at the exact same clock speed, you would almost expect the Duron to outperform. Then again, Intel is doing little more than taking PIII cores and calling them Celerons. Smart move to cut manufacturing and R&D costs, but it kind of reminds me of this old cartoon [userfriendly.org].

      Personally, I prefer AMD to Intel. Their K6s and Athlons have helped stave off a monopoly and they drive costs low. Works for me.
    • Celerons ran 100MHz starting with the 800; I think that WAS released in Dec 2000, but I may be wrong.
  • Tualatin "Celery" (Score:3, Informative)

    by questionlp ( 58365 ) on Wednesday January 09, 2002 @08:41PM (#2814113) Homepage
    The Celeron is also crippled by the poor FPU, which hasn't really changed since the Pentium II came out. The only reason I would buy a Celeron-based computer is if heat and noise cannot be tolerated; beyond that, even a slower Athlon or the Duron would be the processor of choice (both for people on a budget and for people who crave speed).
    • Re:Tualatin "Celery" (Score:2, Informative)

      by flight666 ( 30842 )
      This is incorrect. The latest Celerons are based on the PIII core and have the exact same FPU as the PIII.
      • Re:Tualatin "Celery" (Score:2, Informative)

        by questionlp ( 58365 )
        But the P3 still uses the same x87 FPU units as the Pentium II processor. The P3 adds SSE (yeah, like that is used a lot) and brings the cache on-die with the Copper[less]mine processors.

        The Celeron reviewed actually uses the Tualatin P3 core, not the older P3 core.

        • by DarkEdgeX ( 212110 ) on Wednesday January 09, 2002 @10:27PM (#2814503) Journal
          The P3 adds SSE (yeah, like that is used a lot) and brings the cache on-die with the Copper[less]mine processors.

          Actually, this brings up an important issue-- compiler technology, and the run-time libraries (RTL's) they use (in the case of C/C++, the standard libraries; in the case of Pascal/Delphi, the RTL and possibly parts of Borland's VCL/CLX). The problem, it seems to me, is that compiler authors don't seem to take advantage of architecture-specific improvements like they used to (and as they should). Sure, some libraries/RTL's take advantage of them (and the compiler may have switches to emit optimized code), but unless the standard libraries/RTL's are re-compiled (or even re-written) to take advantage of them, it's all for nothing.

          It seems to me that Intel has the right idea (the FPU is really useless if you know HOW to use SSE and SSE2 properly), and that if anything, it's poor software authors and poor compiler writers who are to blame for the lackluster performance of code on Intel's CPUs. It's saddening to me to see the optimization skills software engineers *used* to have back in the day diminishing year by year as the ability to write crappy code is justified by ever-faster CPUs. (Why spend the weeks or months needed to engineer everything to run properly now, when Intel/AMD will have a 'fix' for our sloppy code out in a few months?)

          I wish authors such as Michael Abrash still released optimization guides for assembly language (or even just updated versions for C/C++ and assembler).. his 'Zen of Code Optimization' (ISBN: 1-883577-03-9 *or* FatBrain.com's description (out of print) [fatbrain.com]) was probably the best investment *I* ever made.

          • by Grishnakh ( 216268 ) on Wednesday January 09, 2002 @11:06PM (#2814618)
            it's poor software authors and poor compiler writers that are to blame for the lackluster performance of code on Intel's CPUs.

            I think this one can be blamed on Intel too. If Intel really wanted to sell its processors, they'd invest a little money in helping push compiler improvements to take advantage of their processors--such as contributing to GCC. Instead, they do invest money in compiler technology, but only in their own proprietary compiler, and then try to sell that as competition for the other two mainstream, more popular compilers that everyone uses (GCC and MS VC++). Then they wonder why software isn't optimized for their processors.

            The compiler authors don't have time to make processor-specific optimizations for every single flavor of x86 architecture out there; they already have to deal with P5, MMX, P6, K6, 3D-now, etc. Why is Intel's newest fad so special that they should get extra attention? It's not. Compiler authors are going to write their compilers to perform the best on the majority of processors out there, instead of concentrating too much on one specific technology.
            • Ummmmmm (Score:3, Informative)

              by Sycraft-fu ( 314770 )
              The Intel Compiler isn't competition for MS VC++, it's a plugin. It just replaces the compiler and linker of VC++ with Intel's optimised one. It is well worth your money if you're going to be doing serious development, as it is just all around more efficient, even for Athlon chips.
            • Why is Intel's newest fad so special that they should get extra attention? It's not.

              Oh, but I think it is. Any new technology that can give you a 10:1 performance improvement is worth ANY person's time to investigate. That can make the difference between an MPEG2 video stream encoding in 10 hours or 4 hours.

              You seem to forget the whole point of optimizing the compiler is to make the compiled code perform faster and better (and possibly with fewer instructions, thus decreasing bloat). Now sure, SSE and SSE2 (and really, even MMX) require more thought from the actual developer as well, but implementing these enhancements in standard libraries and in commonly used functions would surely help people realize some of these benefits up-front with a simple re-compile of their code.

              There's also your issue of considering it a "fad". This technology is here to stay, and Intel has pretty much said that using the FPU is the wrong way to go about things in their newer CPU's. By this very virtue, one could pretty much surmise out of the gate that if a test or benchmark uses *only* the FPU, Intel will likely fail, and fail miserably. Intel has really only introduced three new technologies since the release of the Pentium--

              MMX (MultiMedia eXtensions) - SIMD (single instruction multiple data; basically performing the same operation repeatedly on a set of data, by only executing a single instruction) integer routines, useful in certain video functions, and in some string functions. Introduced in some later Pentium processors (not in the Pentium Pro, I believe, not sure though).

              SSE (Streaming SIMD Extensions) - SIMD again, but applied to floating point work. Introduced in the Pentium III.

              SSE2 (Streaming SIMD Extensions 2) - More SIMD, again applied to floating point (I believe; I've only begun to read the info on this from Intel's developer website, though developer.intel.com has tons of manuals in PDF format you can read on the subject). Introduced in the Pentium 4.

              Only three "fads", as you call them. Other than this, very few instructions have been added that a compiler team need worry about; the only ones I can even think of worth dealing with in a compiler are the CMOVcc instructions (and with these, it depends on the amount of time you save using these instructions; I assume they're an improvement though since it saves you the trouble of branching in your code).

              BTW: The compiler issue is really more of a Wintel-based problem (after all, the benchmarks being run are usually on Windows-based systems).. take a look at Borland's compilers for example; Borland Delphi offers few optimization options. Borland CBuilder offers more options, but as far as I recall, no ability to compile with MMX/SSE enhancements. As someone else pointed out, you can in fact purchase Intel's compiler technology as a plugin for Visual Studio, but Microsoft should *really* be adding these features directly to their compiler.

              And the latest optimization issue (which I don't believe can be addressed through compiler changes, but who knows) will likely be when Intel releases SMT-capable processors.. suddenly everyone running Windows XP Professional (or if MS decides to allow dual processors on WXP Home Edition) will be able to run multiple threads at the same time-- the problem is, not a lot of software utilizes multiple threads, even though the technology has been around since Windows '95 (obviously in a single processor environment, you don't get a speed increase, but it does make your app more responsive (the UI doesn't freeze during long operations, for example)). Benchmarking an AMD processor without any form of SMT against an SMT capable Intel processor would be unfair at that point.

          • by LarsWestergren ( 9033 ) on Thursday January 10, 2002 @03:34AM (#2815230) Homepage Journal
            It's saddening to me to see the optimization skills software engineers *used* to have back in the day diminishing year by year as the ability to write crappy code is justified by ever-faster CPUs.

            Well, it's just a matter of economics. In the beginning of computing, you could get maybe 50 programmers for the price of one computer. So the time of the computer was valued more than the time of the programmers. The programmers had to spend a lot of time optimizing.

            Now, the price of computers has fallen, until you get a lot of computing power for the price of one programmer. The time of programmers is valued a *lot* more than the time of computers. So the rational economic choice is to buy more (or more powerful computers) to make life easier for programmers.

            When you look at how some of the old time programmers react to this change, I think it is interesting to look at the medieval guilds. The programmers are angry at the newcomers and try to put them down, and make it as hard as possible for them to advance ("RTFM!"). The guilds, on the other hand, made it forbidden by law to practice the craft in question outside the guild. Both guilds and programmers occasionally justify their behaviour by the need to preserve the fine traditions of the art, and distrust new techniques and technologies that make things "too easy".

            But what it really is, is a fear of competition. Instead of trying to improve themselves and keep up with the times they try to stomp out the competition.

            This may cost me some karma...
  • (0, Troll) (Score:2, Informative)

    by Anonymous Coward
    Tom's Hardware did a review on this a couple of weeks ago: http://www.tomshardware.com/cpu/02q1/020103/index.html

    As you can see, the Celeron is actually at 1300 MHz, not 1200. Funny thing is the Duron still beats it by a good deal.
  • hmm (Score:4, Informative)

    by RainbowSix ( 105550 ) on Wednesday January 09, 2002 @08:45PM (#2814132) Homepage
    Unfortunately, AMD apparently isn't ready to move the Duron to a 266MHz bus just yet. That's really a pity, but AMD wants to differentiate between the Athlon and Duron

    They're not ready because putting the Duron and Athlon at the same bus speed would make their performance levels nearly equal. With the hardware prefetch and SSE we've already seen the 1GHz Duron keeping up with the 200MHz FSB 1GHz Athlons. To put the cheaper Duron at 266 would give little incentive to buy an Athlon of the same grade (save for the cache).
  • by klund ( 53347 ) on Wednesday January 09, 2002 @08:49PM (#2814150)
    Celeron is a PIII Tulatin with a 100MHz bus and built on the .13 micron process

    By the way, that's "0.13 microns."

    As my Nobel-Laureate physics lab professor used to say, "ALWAYS use leading zeros with decimal points; that way your readers can tell the difference between a fraction and fly shit."

    Go ahead and mod me down, but I'm not a grammar nazi, I'm a math nazi!
    • by Wavicle ( 181176 ) on Wednesday January 09, 2002 @09:05PM (#2814207)
      I assume that was important when the only way to get stuff was on printed paper and there could be ambiguity between a decimal point and an artifact (or an excrement). If you have a problem with fly shit on your monitor, you should clean it. I regularly clean my monitor with a soft lint-free cloth and a 10M solution of HCl.
      • I assume he meant molar, not million. Don't assume it's a different quantification just because you've never taken basic chemistry courses.
      • You know, using toilet paper and H2O works just as well if you're willing to put a tiny bit more effort into the cleaning (and as a bonus, you'll appear less anal retentive at the same time)! :)

        --

      • Re:leading zeros (Score:2, Informative)

        I regularly clean my monitor with a soft lint-free cloth and a 10M solution of HCl.

        I believe you can find enlightenment by embracing the tangential.

        Chlorine gas (which HCl releases in small amounts) is bad for your eyes. 10M HCl also burns cheap plastics, and isn't especially good at dissolving oily residue. If you have metal deposits on your monitor, HCl is the way to go; otherwise, no.

        The secret monitor cleaning solution only lab chemists knew about (until now) - 50% water, 50% acetonitrile. Unlike cheap malt liquor (which I used to clean my monitor with - seriously) it doesn't leave a stink or a funny residue.

        Two things to remember - only use it in a well ventilated area (which you should have for your computer anyway, in case it starts putting off Ozone) and try not to spill it on yourself, it permeabilises your skin (although not as much as DMSO.)
        • Hydrochloric acid releases small amounts of chlorine gas? Under what mechanism? I would imagine that any measurable chlorine gas that you'd find in a bottle of HCl was in solution from when the HCl was manufactured.

          I'm not a Chemistry whiz, but my old textbook says that the standard potential for 2 chloride ions to form chlorine gas (Cl2) and 2 electrons is -1.36 volts - so I don't see why this reaction would occur spontaneously.

          I won't argue with you that HCl vapor is bad for you, but it's certainly nowhere near as deadly as chlorine gas.

          So, am I forgetting any of my chemistry?

    • by xah ( 448501 )
      Again, the word "nazi" here is used in a positive sense. I find that regrettable. The Nazis did not become best known for their strictness, but for their acts of overt evil.

      For all numerical values greater than or equal to zero, and less than or equal to one, the numerical value is used in the singular sense. Thus, "0.13 micron" is the proper English usage.

      One could list a few values as: "Zero micron, 0.5 micron, 1 micron, 1.5 microns, 2 microns," etc.

    • by Alsee ( 515537 ) on Wednesday January 09, 2002 @09:13PM (#2814247) Homepage
      > the .13 micron process
      By the way, that's "0.13 microns."
      Go ahead and mod me down, but I'm not a grammar nazi, I'm a math nazi!


      No, it's 0.13 micron.

      Go ahead and mod me down, but I'm not a grammar nazi or a math nazi, I'm a "that just sounds stupid" nazi!

      Go ahead, just *try* saying it out loud: "0.13 microns process". That 's' rolls off the tongue like an anvil.

      -
      • micron? micrometer! (Score:2, Informative)

        by vrt3 ( 62368 )
        Actually, it is 0.13 µm (micrometer). Micron has long since been deprecated in favour of micrometer, part of our beloved SI system of units.
    • As my Nobel-Laureate physics lab professor used to say, "ALWAYS use leading zeros with decimal points; that way your readers can tell the difference between a fraction and fly shit."

      Marginal on-topicness, but I have this coupon in my pocket for .55c off something. Like you say, is that 55/100 of a Cent or do they mean 55 cents? :)

  • Just so you know... (Score:5, Interesting)

    by Daniel Wood ( 531906 ) on Wednesday January 09, 2002 @08:50PM (#2814154) Homepage Journal
    Tom's Hardware already did a review [tomshardware.com] of the Celeron 1300 vs the Duron 1200 (benchmarks included the Celeron 1200) where the Duron simply spanks the Celeron.

    SIS has restored my faith in AMD. The ECS K7S5A motherboard is only $64 after shipping and works with any Socketed Athlon/Duron cpu. It is fast and stable, accepts DDR and SDR, has built-in networking and sound (ok, AC'97 isn't that great) - a real winner. You can build a 1GHz system and only pay $120 for the cpu, heatsink/fan, and mobo.

    • by MtViewGuy ( 197597 ) on Wednesday January 09, 2002 @08:54PM (#2814167)
      Tom's Hardware already did a review [tomshardware.com] of the Celeron 1300 vs the Duron 1200 (benchmarks included the Celeron 1200) where the Duron simply spanks the Celeron.

      The test shows why the Celeron is inferior to the Duron: the Duron's vastly superior FPU allows it to substantially outrun the Celeron on FPU-intensive tasks. That is the reason why the Duron has become the choice for many do-it-yourself computer builders.
    • Speaking of SiS and ECS....

      I just got in an ECS K7SEM with onboard everything, and a Duron 950. All the onboard stuff is well supported in Linux. You had better have a recent X server to handle the onboard video though; the one that ships with Red Hat 7.2 (4.1.0) works well. The sound card gets a little flaky under heavy processor load (sometimes XMMS won't be able to open the sound if it changes tracks while the processor is loaded heavily), but it sounds great.

      So I got this setup working well. I ordered 6 more of them to build a MOSIX cluster with.
      from www.mwave.com:
      6-ECS K7SEM motherboards
      6-950MHz Durons
      1-16 port 10/100 switch

      Total w/shipping: about $880

      from www.sofistic.com:
      6-128 meg Micron DIMMs PC133
      6-el cheapo cases with 300 watt powersupplies

      Total w/shipping: $337 (watch out they rape you on shipping, but their prices are so low it offsets it)

      Anyway, so I am building a 6 node supercomputer for $1200. This is what a low end PC used to cost. Boy we have come a long way.

      There will be some other costs, like there will need to be a hard disk somewhere for these things to boot from, but no other major costs.
    • Speaking of cheap systems, does anyone know how overclockable the new Duron is? I'm almost tempted to buy one and another K7S5A and bump the FSB up to 133. A 1600MHz "Duron XP" (might as well call it that) should compete nicely with an Athlon XP 1900+, and at little more than 1/3 the price.
    • by debrain ( 29228 )
      "works with any Socketed Athlon/Duron cpu"
      You never tried the Athlon 1.4GHz T-Bird with the K7S5A. On a message board with, on average, 44 posts per topic, there were 14,000 posts on the Athlon 1.4 + K7S5A. Someone did solve the problem, that being total system instability, by putting a 200 ohm resistor in parallel with something underneath the chip (soldered onto the motherboard), but I wasn't brave enough for this and settled with upgrading to an Athlon XP, which works fine. Strangely enough, this issue really only reared itself en masse with revision 4 of the board, which constituted the most shipped by far. Revisions 1-3 were flakey, and oddly enough revision 0 was rock solid, from what I read (so this is hearsay), and I stopped paying attention by the time revision 5 was out.
    • by Geek In Training ( 12075 ) <cb398&hotmail,com> on Wednesday January 09, 2002 @10:04PM (#2814433) Homepage
      The ECS K7S5A motherboard is only $64

      I must also chime in as a fan of this board. I run it in my gaming rig with an Athlon XP 1600, and 512 megs of DDR. It whomps ass! Onboard 100baseT and ATA100, 4x AGP, and the SIS 735 chipset requires no fan. I got mine for $57 at newegg.com, whom I highly recommend for parts (this is an unsolicited testimonial for an independent party :).

      Also, if you look at chipset reviews, the SIS735 comes in JUST behind the high-end Via chipsets, at many $$ less.

      Yes, I put an Audigy in and disabled the onboard sound, but the AC97 is very workable if you're running a single pair of speakers or headphones.

      Just my $0.02. (Note the leading zero.)
    • Yeah, I read the article on Tom's like 5 minutes after I posted this article. Tech-Report gave more detail and I liked the pure 1200 vs. 1200. But the main topic was to point out that there are other new processors out there that don't cost between $350 & $600 [pricewatch.com]
  • Benchmark woes (Score:4, Insightful)

    by Cerebris ( 224937 ) on Wednesday January 09, 2002 @08:53PM (#2814165) Homepage
    A decent review, I suppose.

    I think it was a tad unfair to compare a Duron using DDR to a Celeron on PC100/133 (depends on the motherboard and how they set it up). They did acknowledge it directly when discussing the memory bandwidth (which showed the expected numbers, Duron was around 2x Celeron), but I think it shows only part of the picture (especially with DDR prices back up in the stratosphere compared to say, a month or two ago). This is one reason I take benchmarks with a grain of salt...it's very difficult to objectively compare AMD and Intel CPU's now due to the drastically different architectures.

    The article also mentioned the Intel heat spreaders... these should be standard on all processors. I can't count how many "cracked core" threads I've read on the [H]ardOCP forums... and a reasonable number of these guys are, shall we say, slightly above your average user.

    My $0.02...

    -Colin
    • Re:Benchmark woes (Score:3, Insightful)

      by Jay Bratcher ( 565 )
      it's very difficult to objectively compare AMD and Intel CPU's now due to the drastically different architectures.

      The primary difference is actually the chipset, and not the chip itself (at least in the case of DDR vs. SDR memory). That being said, look back at comparisons between any DDR and SDR system - memory bandwidth is dramatically increased, but real world performance rarely improves more than marginally. What I am trying to say is, it's fair to compare the 2 chips, especially since they run the same software.
    • DDR prices back up in the stratosphere compared to say, a month or two ago

      When did DDR prices go up? I just looked at the price list of one of the local parts stores (www.laboratorycomputers.com), and a 128M DDR is $38 vs. $31 for a 128M PC133. That is only a $7 difference. The difference is percentage wise the same on 256M modules, which are $69 and $55 respectively. The difference is percentage wise less on 512M modules, which are $138 and $122 respectively for a difference of only $16.

      Probably the biggest difference in price if you go with DDR is in motherboards which are about $35 or $40 more for boards with DDR support.

    • Difficult to objectively compare the processors? You're just an intel whore talking out his ass!

      In order to compare the processors objectively, all you need is to run the processors on testbeds as close to one another as possible, and when you can't have a certain piece of hardware crossover (i.e. motherboard), you insert the best piece of hardware available for that platform.

      Perhaps you haven't noticed, but running a Celeron on a DDR system doesn't give it any benefit. Wonder why? Its front side bus maxes out at 133MHz SDR.
      Don't whine about non-objective reviews if they ran each CPU on the best test-bed they could.
    • Re:Benchmark woes (Score:4, Interesting)

      by dbarclay10 ( 70443 ) on Wednesday January 09, 2002 @11:50PM (#2814754)
      I would agree with you, if the review was actually a comparison of technologies.

      Of course, if that were the case, then it wouldn't be a review - it would be a comparison of technologies :)

      However, since it *is* a consumer-oriented review, the focus is obviously on performance vs. price and a number of other factors; all easily summed up in the term "value".

      Since both the Duron and the Celeron have similar prices and are both targeted at the same market (at least in retail), it's totally fair to compare them, despite the fact that they have some relatively different technologies.

      Now, I would say it's unfair to compare, say, an Athlon XP 2000+ to a 386 used in "embedded" markets. This review, however, is more than fair :) Saying otherwise would be like saying it's unfair to compare the first- and second-place winners in the Olympic men's triathlon; yes, obviously one is faster than the other. Maybe one has more endurance (greater memory bandwidth), maybe his muscles are bigger (stronger FP units), but if you're not going to compare those two, what else are you going to compare? The winning triathlete vs. the winning 100m swimmer? :)

      Thought so ;)
  • by Anonymous Coward
    Faster bus speed, faster instruction execution, faster floating point - there is no way it can't.
  • (This starts off as a bit offtopic, but I felt like posting this)
    Price.

    Yes, you get that 1.2 second advantage in Photoshop if you have a Pentium III 900, but I couldn't care less if I have a Celeron over a Pentium III.

    The same goes for games: like I care if Quake III Arena runs 20 frames faster than on a Celeron; all the computer has to do is hit 30 frames per second. The human eye is not fast enough to see 100 frames per second. (Yes, I know video cards play a factor, but the processor does do a fair bit of work.)

    All I care about is if the processor can run applications at a decent speed.

    Also, is it just me or does the name "Duron" not sound very catchy? "Pentium," "Celeron," and "Athlon" all have catchy names. Yet for some reason, "Duron" doesn't sound all that great and sounds like a processor from the waste bucket.

    I know of a few people who own them and actually don't have anything negative to say about them.
    • As I see it, if your game is running at only 30 fps, whenever the cpu has to allocate more time to other things, be it AI, networking, cron jobs, etc., your fps will drop below that and you will see the difference. Granted, more ram will help eliminate this, but it's still something to consider, I think. Also, perhaps it is just me, but I can see a difference between 30 fps and say 60-70 fps. It just looks smoother. You can also feel it as you play; it stays smooth during the whole game - perhaps that is what I think looks smoother...

      PS - Everytime I hear Celeron, I think of celery, when I hear duron, I think of durable. :)
    • by bunhed ( 208100 ) on Wednesday January 09, 2002 @09:20PM (#2814280)
      Also, is it just me or does the name "Duron" not sound very catchy?

      The Duron is actually the middle processor of the "Moron" line of processors, designed specifically for XP. If you look in the start menu on XP and you're running a Duron you'll see the My Moron icon. The "Freon" is the fastest at 3.2 GHz but requires a gallon of coolant every 100 web pages. The "Peon" is the entry level processor. You can read the whole article here [bbspot.com]

    • Out of curiosity, how do you know that the human eye cannot see 100fps? When you say that, what do you mean? Cannot make out detail? The flash from a camera usually lasts less than 10ms, but we see it.

      I ask this because I once believed as you did and created a simple experiment to prove I was right. I took an old manual camera, removed the lens, set the shutter to 1/60 sec (its flash sync), opened up the back, put my eye up to the shutter curtain, pointed the camera at my wife, asked her to hold up some fingers but not to tell me how many, and snapped the shutter.

      To my somewhat surprise, I had no trouble telling how many fingers she was holding up. So I started clicking the shutter up to higher speeds. Only at 1/1000 sec did I start having trouble.

      Thus I had proved myself, and conventional wisdom, wrong. If you are getting 100fps, and your monitor is drawing 100fps, then if something important happens even for only a frame or two there is no reason your brain won't register it.
      • Your experiment seems biased. One of the eye's primary functions is to detect differences (motion) from one scene to the next. Motion detection is one of the lowest level functions subconsciously performed by the eye. So if you have an experiment where one scene is flat black and the next is an image, you aren't testing in a situation that is on par with reality. Instead, you'll need to run a test with fluid images, then your wife's fingers, then more fluid images. So try editing a movie reel and inserting images randomly into the film cells. If you run the film through a projector faster than 24fps, the average human eye will not detect inserted images. Instead, your eye will try to blend the oddball image into the two surrounding images.

        Pardon me if I got any of the neuroscience wrong on that description, but I think the general idea is correct.

        Cheers

    • > all the computer has to do is meet 30 frames per second. The human eye is not fast enough to see 100 frames per second.

      Tell me, can you discern a difference between your display at a refresh rate of 60Hz and a refresh rate of 100Hz? I thought so.
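As a back-of-the-envelope check on the timings debated in this sub-thread (the 1/60 and 1/1000 second shutter speeds and the 60 Hz vs. 100 Hz refresh rates come from the posts above; the script itself is just an illustrative sketch):

```python
# Convert the shutter speeds and refresh rates discussed above into
# per-exposure / per-frame durations in milliseconds, so they can be
# compared directly.

def duration_ms(rate_hz):
    """Duration of one frame or exposure at the given rate, in milliseconds."""
    return 1000.0 / rate_hz

for label, hz in [("1/60 s shutter", 60),
                  ("1/1000 s shutter", 1000),
                  ("60 Hz refresh", 60),
                  ("100 Hz refresh", 100)]:
    print(f"{label}: {duration_ms(hz):.2f} ms")
```

The 1/60 s exposure in the camera experiment and a single frame at 60 Hz last the same ~16.7 ms, which is why the experiment is a fair proxy for one displayed frame.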
  • relative cost (Score:4, Interesting)

    by Anonymous Coward on Wednesday January 09, 2002 @09:05PM (#2814209)
    Do these low-cost CPUs matter anymore? Celerons do not stand for value.

    In the late '90s, sure, it was a huge cost difference: $100 for that celery 300a@450mhz vs. a P2/P3 450 at about $500.

    As of right now celeron ghz is about $58
    ( http://www.pricewatch.com )
    AMD XP 1500 $107

    That's the battle; now I'll give you three guesses which is the better value.
    • I don't know how a Celeron 1.2GHz can be discounted that much, as it is just over 1/2 the street price locally (Austin, TX), while the AMD XP 1.5GHz price you quote is not that much lower than the local street price. The pricing at a local place (www.laboratorycomputers.com) looks like this:

      Celeron 1.2GHz $115

      AMD XP 1.5GHz $133

      For comparison:

      Duron 1.2GHz $99

      TBird 1.2GHz $112

      If Intel is trying to compete with AMD, it sure looks like a no-brainer choice to go with AMD. The only question is which AMD is the best value.

      On the other hand:

      P3 Tualatin 1.2GHz $273

      So if you have to have "Intel Inside" and you want a "1.2GHz computer", then the Celeron looks like a good deal in comparison to the P3.

      • Another factor is the motherboard.... you want to use the 1.2G Celeron? You need a board that supports it. A slocket adapter, even a fancy PowerLeap adapter, will not save my lovely SuperMicro SBU (slot 1, BX, SCSI) motherboard. I hate that!

        Price vs. performance, it's really hard to beat the Durons. My mom is getting a new box - found a 1G Duron from newegg.com for $54USD, and have it tracking here as we chat from a vendor who has been good to me in the past. Looking at an Intel solution, it was an extra $20 to get something that performed like a $30 AMD chip. No brainer - though my mom is fine now with my old 486dx2/50.... Still, my cash, a guy has to have standards...

        The price jump to the latest greatest is way too steep right now to look at the 1.2 Duron, but when the 1.3 comes out this month I suspect this might drop into that great-deal category too.
      • by CtrlPhreak ( 226872 ) on Wednesday January 09, 2002 @11:03PM (#2814606) Homepage

        Parent Post: AMD XP 1500 $107

        Above reply: AMD XP 1.5GHz $133

        I'm very glad to see that AMD's marketing strategy is working just as planned.
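Taking the street prices quoted in this sub-thread at face value, a rough dollars-per-GHz comparison can be sketched as follows (the prices and the "1.5GHz" label are the posters' own figures, not verified, and dollars-per-GHz ignores per-clock performance differences between the cores):

```python
# Rough price-per-GHz comparison using the figures quoted in the comments
# above. These prices were posted by readers in January 2002 and are not
# verified; the metric ignores architectural differences between chips.
chips = {
    "Celeron 1.2GHz": (115, 1.2),
    "AMD XP 1.5GHz": (133, 1.5),
    "Duron 1.2GHz": (99, 1.2),
    "TBird 1.2GHz": (112, 1.2),
    "P3 Tualatin 1.2GHz": (273, 1.2),
}

# Sort cheapest-per-GHz first.
for name, (price, ghz) in sorted(chips.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / ghz:.0f} per GHz")
```

By this (admittedly crude) metric the Duron comes out cheapest and the Tualatin P3 costs more than twice as much per GHz as anything else, which matches the thread's conclusion.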

  • XP Technology? (Score:1, Redundant)

    by nexex ( 256614 )
    If the duron comes with XP technology, does that mean there is a complementary holographic image of Bill Gates on the top of the cpu? Or does it mean you have to "activate" it and if you switch mobo's you have to call AMD to have it unlocked?
  • by Proud Geek ( 260376 ) on Wednesday January 09, 2002 @09:26PM (#2814311) Homepage Journal
    That really impresses me. I mean, they could shrink the die size, or use SOI, or even borrow a cue from the Alpha and use SMT. But no, they went way beyond all the rumours and used XP Technology. I wonder if they had to license it from Microsoft?
    • The "XP" technology refers at least to the incorporation of the SSE instruction set---my 1GHz duron has that. I think there are other non-marketing differences between "XP" and older K7s, but find someone more technical than me to explain them.
  • Celeron (Score:2, Informative)

    by ehiris ( 214677 )
    Intel is not advertising XP compatibility for their Celeron.

    Celerons work really well with XP in my experience, but Intel didn't put them on its compatibility list.

    Celeron Compatibility: Fully compatible with an entire library of PC software based on operating systems such as MS-DOS*, Windows* 3.1, Windows for Workgroups* 3.11, Windows 98, Windows 95, OS/2*, UnixWare*, SCO UNIX*, Windows NT, Windows 2000, OPENSTEP*, and Sun Solaris*.
  • Why bother? (Score:4, Troll)

    by jerkychew ( 80913 ) on Wednesday January 09, 2002 @09:48PM (#2814380) Homepage
    A Duron 1.2GHz costs $79, while a 1.2GHz Athlon costs $72... someone explain to me why the Duron is the 'budget' CPU.
    • Re:Why bother? (Score:2, Informative)

      by vipw ( 228 )
      I just wonder which one of those performs faster in most tasks. The Duron has prefetching, but the TBird has more cache. More on topic, though, the Duron consumes less power and outputs much less heat than the 1.2GHz Thunderbird, lowering the price of the heatsink/fan unit and decreasing the need for a high-wattage power supply and case cooling.
    • Source? (Score:2, Informative)

      by hendridm ( 302246 )
      Where are you getting your prices from? Pricewatch shows a different story...
    • Re:Why bother? (Score:3, Informative)

      by fireant ( 24301 )
      vipw mentioned it, but to be explicit...

      The new Duron is based on the Morgan core (think Athlon XP), while the 1.2 GHz Athlon is a Thunderbird, which came out at the same time as the old Spitfire core for the Duron (about a year and a half ago). I suspect that the T-bird would still beat the new Duron due to a bus speed advantage as well as the extra cache, so your point is still valid.

  • For the price of a 1200 MHz Duron, you can get a 1000 to 1100 MHz Athlon that will run rings around it. These "value" processors don't really look like much of a value to me.
  • When you go down 200MHz to a still-respectable 800MHz, the price plummets down to $40ish... from a decent vendor. Then you get potential. You can build a $250 countertop PC. Which is exactly what I did...

    Add $50ish for the surprisingly nice ECS motherboard (for $50, it's a GREAT deal. It's nice on quality and features. I would gladly pay twice that price for the same board) and $10 for a heatsink (AMD-approved, of course).
    Add another $50 for a GeForce2 MX. The MX isn't a horrible card; it is sufficient for running most games...
    Add (whatever) for RAM... it's SOOO cheap now. I just dug up 256mb from my closet full-o-parts. You can get that for about $30 from Crucial.
    Add about $30 for an AMD-approved PSU (I can't stress the approval more. It really makes a difference!)
    Then I added a keyboard and mouse (I got a refurbished wireless keyboard and mouse for $25. Most people can find a cheapo keyboard and mouse that is more than sufficient).
    I used a hard drive from an existing pc. It's fine for my purposes.

    Voila! For about $250, I have a NICE countertop PC, which is hooked to the hi-fi as a jukebox (hey, that's what it was advertised as when it was purchased... oh, 20 years ago :) ).

    Now, granted, I used some parts I already had, which isn't fair to the price factor. Take those off, and you go up to about $400 (case+ram+hdd). There are many people who can't afford a PC due to the astounding prices. To Dell and friends, I say poo!

    This says a lot about the bureaucracy behind Dell and all the big PC makers. One must wonder what profit margin Dell is enjoying (remember their little deal with Intel... surely they are sold chips at dirt-cheap prices)?
    • ECS == PCCHIPS (Score:2, Interesting)

      by Anonymous Coward
      You do realize who ECS is, right? You can get started with some info on these shysters here [fernuni-hagen.de].

      They are a distributor of PCChips, the same people who:

      - Fake the speed of their motherboards
      - Pirate their BIOSes
      - Rewrote BIOSes to display fake cache amounts, and glued black plastic squares to their boards with bits of metal sticking out of them and SRAM-looking part numbers etched on
      - Create chipsets deceptively similar to, but not at all related to, their brand-name counterparts. VXpro, VX+, BXpro, etc. are all famous PCChips parts.
      - Relabel chipsets that look/are too cheap with the above names
      - Seem to be gaining a reputation on newsgroups for selling complete garbage motherboards (just look for "PCchips shit/junk/suck" on deja)
      - Have an absolutely amazing number of aliases. Eurone, Houston Tech, ECS, Amptron, Protac, Aristo, Minstaple, Matsonic, Fugutech -- the list goes on and on.
      - Don't label their boards properly
      - Use huge jumper blocks to set memory type (no, not just 5 or 6, but jumper blocks the length of a DIMM slot!)

      BTW: As a tech at a computer store, I build systems with ECS motherboards (it's what we sell -- no, I won't say where I work :). They are unreliable garbage, IMHO.

      But AMD approved power supplies are a good idea. I don't know how many times I've seen cheap power supplies with the ratings simply stuck on them with an extra sticker.

      I'd really recommend swapping the board, though. Just a recommendation. But if it is performing alright for you right now, well, stick with it. Occasionally some are actually built OK! :)

      BTW: If you are using the board I am thinking of, it's also "made" by Amptron in red. Or blue; I forget which colour was for which company sometimes. With their most popular boards, by selecting the manufacturer you can select the colour. F-U-N

      When Tom's Hardware Guide gave a solid rating to a PCChips board (I think it was the K7S5A) I decided I'd never trust one of his recommendations again. If you are going to rate boards, you should at least do some homework on the company first!
      • I'm not proud of it, but I used to work for PCChips (or PC Shit as we used to call them), and I've heard it whispered a lot of times that most of what you say sounds familiar. In fact, I even have it on good authority that a few senior staffers in their European distribution center were prosecuted for filing away serial numbers on CPUs, so they could sell them as higher-end models than they really were.

        The chipset naming issue (my absolute favorite example has to be the slogan "rember, it's BXtoo!") you bring up, however, isn't PC Chips' fault per se: a lot of these get made by shady Chinese plants and slapped on PCShit mobo's just because they're cheap.
  • You know, if you look on pricewatch.com, the Duron is only $5-$10 cheaper than an Athlon Thunderbird of the same clock speed. Sometimes $20. Might be a good reason why they get ignored sometimes.
  • by cosmicg ( 313545 ) on Wednesday January 09, 2002 @10:22PM (#2814491)
    The Celeron might be slower, but it beats the PII 400 I've used it to replace. I just upgraded to the Celeron on my 3.5-year-old Dell. $170 buys the chip and slotkit.
    Because Intel is still producing inferior chips with slow bus speeds, I can play Black & White. Part of the fun of tech advances is the way they pull up the rear while dropping prices.
  • by bryan1945 ( 301828 ) on Wednesday January 09, 2002 @10:42PM (#2814544) Journal
    a new processor licensed from Motorola. In the vein of the Duron and Celeron, Apple has dubbed its chip the Dodecadon. It will feature a spherical chip package with one wing.

    After a lukewarm reception, Apple changed the name to the "Celery", and sales went up 1400%.

    (If you don't get the above [bad] joke, please "move along, nothing to see here!")
  • News??? (Score:4, Insightful)

    by SquierStrat ( 42516 ) on Wednesday January 09, 2002 @11:10PM (#2814630) Homepage
    How is this news? The Duron has been kicking the Celeron's tail end since the Duron's release. The Duron can usually keep up with a P3 of the same speed, or at least trail very closely behind. The fact that they are both over 1GHz is not new either, not to mention that the so-called advances for the Celeron truly haven't helped it much, as it's an aging architecture. Please people, save money, get a better product from the AMD processors. (I'd still prefer an XP over a P4 personally; I have yet to see a P4 that didn't give me the feeling that everything for some reason loads slower, though benchmarks seem to say it can be faster in some cases... yet it's only slightly faster than a much lower-clocked Athlon XP. See a problem here?)
  • The problem with Athlon-esque CPUs is that the motherboards are still rather expensive (at least over here in Oz); thus, although AMD CPUs are cheaper to purchase, you end up spending more on the motherboard anyway, and the whole price advantage that the Duron has over the Celeron is negated. The price difference between the Athlon and Duron, last time I checked, was so small that the Duron doesn't really compete.

    For this fact alone, if I had a choice between building a Duron box or a Celeron box, I'd choose to build an Athlon box instead :-)
  • by Anonymous Coward on Wednesday January 09, 2002 @11:40PM (#2814719)
    The new Celeron 1200MHz eludes me more than any product I've seen Intel release. Just recently I read www.tomshardware.com's article on Celeron vs. Duron, and although he doesn't touch on this subject at all... I can CLEARLY see in many of the benchmarks (SiSoft CPU bench, as well as MP3 encoding speed) that the Celeron 1200 is indeed OUTPERFORMING the Pentium 4 1400 and 1500MHz. Now is that silly? Yes, I think so. Do I find this terribly disturbing? Yes. Why? Because engineers at Intel seem to think higher MHz is better than good CPU design. And Intel's own marketing strategy is going to bite itself in the ass. Oops, too late.
    • Re: (Score:3, Insightful)

      Comment removed based on user account deletion
    • by RadioheadKid ( 461411 ) on Thursday January 10, 2002 @12:52AM (#2814937)
      I know that there seems to be a lot of bashing of the Celeron and Intel's marketing, but in some ways I see it as a response to the market.

      It used to be that when you talked about a PC, you gave the specs of your hard drive, RAM, graphics adapter, whether or not it had a soundcard, and what number came before the 86 in the processor name: 2, 3, or 4.

      Now having over 256 MB of RAM is not unreasonable. Hard drive size is mostly irrelevant, sound cards are standard, and except for the gamers, a graphics card is where you plug your monitor in and it works. So what's left to spec? MHz! It's a number, it sounds technical, and the Wintel PC marketing machine has jumped right on it. So much so that AMD now puts 4-digit numbers in their processor model names that don't necessarily represent the clock speed of the processor, but keep up with Intel's current MHz release.
  • by orgnine ( 529145 ) on Thursday January 10, 2002 @12:04AM (#2814802)
    The average consumer would assume that the Pentium chips are much faster, as Intel has branded a 'fancy' new 2.2 GHz chip, even though AMD chips' *model names* only reach 1900+ (about 1.63 GHz).

    Almost hilariously, AMD doesn't have to get their chips running at a 2.2 GHz frequency to get nearly the same performance.

    The same speed differences per frequency show up in the lower bus-speed chips (Duron / Celeron).

    The average consumer is completely unaware of the closeness between speed of the chips of each company.

    AMD chips are much better priced, and carry more value for their money. Stability is excellent, speed is unmatchable in identical frequency ranges. It has been this way for a couple years now.

    Aside: AMD has likely changed their naming system to make their chips 'sound' competitive compared to Intel chips (i.e. Athlon XP 1600+ sure sounds like 1600 MHz, doesn't it!).

    orgnine
    • This is only partially correct. True, the P4 architecture has poor performance per clock, enabling AMD offerings clocked at 1.67 GHz to be quite competitive with 2.2 GHz P4 offerings.

      As for the conservative model numbers, AMD wanted to play it safe and avoid the heavy criticism that comes from claiming more than you can back up completely. Better to underestimate yourself than overestimate, as reviewers tend to be much more friendly to companies that do this.

      This does not, however, apply to the Duron vs. Celeron scenario. The Celeron is based on the PIII core, a core that achieves much more competitive performance per clock but will not scale to higher clocks. The P4 architecture move intentionally sacrificed performance per clock to allow higher, thus more marketable, clock speeds. That is not to say I would go with the Celeron; Durons are typically cheaper. But the XP vs. P4 discrepancies are not the same thing as Duron vs. Celeron.
    • what I don't understand is why AMD decides to limit their chips to the performance of the top Intel chip.

      If the tech exists to produce a 2.2GHz chip, then shouldn't AMD be able to produce a chip at that speed?
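For readers puzzled by the model-number debate above, the commonly reported clock speeds for the first Athlon XP parts can be tabulated as below. Treat this as an informal summary of widely published figures from the period, not an AMD spec sheet:

```python
# Commonly reported clock speeds for the early Athlon XP model numbers
# (informal summary of period reviews; not taken from an AMD datasheet).
xp_clock_ghz = {
    "1500+": 1.333,
    "1600+": 1.400,
    "1700+": 1.467,
    "1800+": 1.533,
    "1900+": 1.600,
}

for model, ghz in xp_clock_ghz.items():
    gap = int(model[:-1]) - ghz * 1000
    print(f"Athlon XP {model} runs at about {ghz:.3f} GHz "
          f"({gap:.0f} 'bonus' MHz in the name)")
```

The gap between the model number and the actual clock is the marketing effect the posters are describing: every model number overstates the clock by roughly 170-300 MHz.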
  • Does anyone besides me find this type of thing annoying:

    The Duron 1.2GHz requires 1.75 volts of power.

    You would think that someone who wanted to be respected for writing a hardware review would at least try to remember the Voltage, Current, Resistance, and Power relationships that he or she learned in high school.

    Your processor may only use a supply voltage of 1-2 V, but if you still need a heatsink and fan it's obviously using tens of watts of power.

    Actually, one of the problems with modern processors is that power is going up while supply voltage is going down. That means much more current. Around half of the pins on modern processors are for power and ground. You start to see weird problems with metal migration on the power supply wires, ground-bounce problems from wire inductance, and all sorts of other strange problems.
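To make the point about current concrete: with power P = V × I, a fixed power draw at a lower supply voltage means proportionally more current (I = P / V). The wattage figure below is an illustrative ballpark for a 1.2 GHz desktop part of that era, not a datasheet value:

```python
# Illustrative only: current drawn at a given power and core voltage,
# using I = P / V. The 55 W figure is a rough ballpark for a circa-2002
# desktop CPU, not an official spec.

def current_amps(power_watts, volts):
    """Current in amps for a given power draw and supply voltage."""
    return power_watts / volts

for volts in (3.3, 1.75):
    print(f"A {volts} V core drawing 55 W needs {current_amps(55, volts):.1f} A")
```

Dropping the core voltage from 3.3 V to 1.75 V nearly doubles the current for the same power, which is exactly why so many pins are devoted to power and ground and why metal migration and ground bounce become concerns.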

  • When is some Far Eastern manufacturer with
    poor English (and marketing) skills going
    to come out with a chip called the 'Moron'?
