Intel Hardware

Intel Ramps Up 45nm Chip Production, Announces 'Atom' Line

Multiple readers have written to tell us of the latest developments out of Intel. Earlier this week, Intel announced the Atom brand of low-cost, low-power processors. The CPUs, measuring only 25 square millimeters, are the result of the Silverthorne and Diamondville projects. The announcement has caused this CNET columnist to question whether Intel can "spur innovation in ultrasmall devices the way it has in the PC and server industry." Concurrently, Intel has increased its production of 45nm processors to a rate of roughly 100,000 chips per day. As TG Daily notes, the massive investments Intel has made in chip production will make it difficult for AMD to catch up.
  • Isaiah (Score:4, Funny)

    by Clay Pigeon -TPF-VS- ( 624050 ) on Sunday March 09, 2008 @10:53AM (#22692690) Journal
    By the looks of things, Isaiah will wipe the floor with Atom if Intel doesn't bury Via with branding power. Isaiah's out-of-order execution will offer much better performance than Atom's in-order execution.
    • by gnutoo ( 1154137 ) on Sunday March 09, 2008 @11:07AM (#22692770) Journal

      Atom's performance in shipping hardware isn't something we've been able to test, yet, but given the architecture's simple, in-order nature, you shouldn't expect Atom to match even a Pentium M in raw performance.

      AMD is supposed to feel threatened by that?

      • by billcopc ( 196330 ) <vrillco@yahoo.com> on Sunday March 09, 2008 @11:29AM (#22692932) Homepage
        If Atom is cheaper than AMD's lowest offering, then yes, they should feel threatened. We've reached a point where even the most basic processor has more power than the common person needs. Combine that with the mindless eco-babble that has tainted every aspect of North American life in the last few years, and you've got a market that's perfect for a power-miser, medium-performance processor that will be at the heart of numerous little PC-like gadgets.

        Via's line doesn't get much traction outside of the tinkerer circles, because they're still tied to clumsy legacy chipsets and the costs are ridiculous, considering their extremely limited performance. If Intel can release a slightly better processor for less money, that can be paired with an inexpensive chipset and tiny power supply, they could take a bite out of the microcontroller segment and ARM's small but tenacious market share.
        • Re: (Score:3, Insightful)

          by Metasquares ( 555685 )
          I'm not too sure about that. Software scales to make use of available resources, which can result in the same task requiring more processing power over time. There's a vast difference between using, say, Vista and Word 2007 to do word processing vs. something like Windows 95 and Word 6.0 - even though you're using the two packages for the same purpose.
          • Re: (Score:3, Insightful)

            by Nullav ( 1053766 )

            Vista and Word 2007 to do word processing vs. something like Windows 95 and Word 6.0 - even though you're using the two packages for the same purpose.
            How about Windows XP/2k and Word 2000? Not much difference there, save for resources and a few superfluous features hanging off of 2k7. I've rarely needed more than a 700MHz P3 for 'everyday work'. If those chips are anywhere near that in performance, I'm sure they'll find a niche in cheap school/office computers.
        • by TheRaven64 ( 641858 ) on Sunday March 09, 2008 @12:29PM (#22693282) Journal

          ARM's small but tenacious market share
          Last figures I read (from early 2007, admittedly) showed that ARM was the most widely deployed CPU architecture in the world. Considering that mobile phones (which outnumber PCs by about 3:1) and set top boxes almost all use ARM cores, I think calling ARM's market share 'small' is quite funny.
          • Define your market. How much money are all these ARM CPUs bringing in to their respective manufacturers? Compare that with the money going to Intel and AMD, and you'll have your answer.
        • by hattig ( 47930 )
          "ARM's small but tenacious market share"

          Ten billion ARM CPUs deployed to date.

          Intel is a minnow in this area.
          • Intel is a minnow in this area.

            Not at all. Are you aware of how many embedded Intel chips there are? I am counting the older generation parts, of course. Aren't you also doing so with ARM?
            • Re: (Score:2, Insightful)

              by pslam ( 97660 )

              Not at all. Are you aware of how many embedded Intel chips there are? I am counting the older generation parts, of course. Aren't you also doing so with ARM?

              Are you aware of how many embedded ARM chips there are?

              Do you know how many mobile phones are sold every year? DSL modems? Cable modems? WiFi routers? MP3 players?

              Are you aware that every PC with an Intel chip in it has 1 or more ARM chips in it? Every recent hard disk I've seen has at least 1 (or more) ARM cores driving it. Monitors have them. The

        • by wytcld ( 179112 )
          The costs [clubit.com] are ridiculous? Yeah, at 1.5 GHz you're getting a slow CPU. Yet for a lot of server uses that's far more than enough, as well as for most of what normal citizens do with their machines if they aren't gamers or video editors. The cost of power isn't "mindless eco-babble," it goes directly to the bottom line, whether corporate or household. On the corporate side there are two routes: consolidate onto virtual machines (which AMD chips handle quite well), or go for power-efficient individual boxes (wh
          • Although Via is not overpriced any longer, they used to charge hundreds of dollars for their Mini-ITX boards, and thus they keep their hard-won reputation. Believe me, those skinflints deserve it.

            They used to charge OVER $300 just for passive-cooled Mini-ITX boards with processors slower than 1GHz. That's not competitively priced with ANY other platform. Just as an example, the (now $60) C7 1.5GHz board you linked went for over $200 on introduction last year! And the worst part: the prices never went dow
        • Combine that with the mindless eco-babble that has tainted every aspect of North American life in the last few years
          What?
          • by DaAdder ( 124139 )
            I agree wholeheartedly. What on earth is so mindless about this "eco-babble" ?
            • Re: (Score:3, Insightful)

              by billcopc ( 196330 )
              Being conscious of our environment is a good thing. Throwing around random buzzwords is not.

              I'm all for conserving energy, and I abuse my Kill-A-Watt meter on a daily basis. What irritates me, and this also applies to health fads, is the use of pseudo-science in marketing. Gadgets are being branded as "low power" when they were never high power in the first place, and sold at a premium. Other things are remade into low power variants, sold at a premium but consume more power during fabrication than the
          • by ppanon ( 16583 )

            Combine that with the mindless eco-babble that has tainted every aspect of North American life in the last few years

            What?

            He's clearly someone who has never been involved in running a datacentre and doesn't realize what part of the TCO is attributable to power consumption. In other words, like most anti-eco reactionaries, he doesn't know what the heck he's talking about; he's just PO'd that people question his conspicuous consumption.

        • by 3fiddy ( 899339 )

          If Intel can release a slightly better processor for less money, that can be paired with an inexpensive chipset and tiny power supply, they could take a bite out of the microcontroller segment and ARM's small but tenacious market share.
          Are you a sports commentator by chance?

          "...because you know, the team that goes out there and works the hardest, scores the most points, and shuts down the opposing team's offense puts themselves in the best position to win the game."
        • they could take a bite out of the microcontroller segment and ARM's small but tenacious market share.


          You've heard of Intel's xScale [intel.com] line of ARM chips [wikipedia.org], haven't you?
    • Re: (Score:3, Insightful)

      by CajunArson ( 465943 )
      That's like saying that a Phenom will bury Atom in performance... of course it will, but then you are missing the entire point of what Atom is about. Atom is about devices smaller than notebooks where Isaiah cannot go (look at the TDPs; Atom running full-tilt is in a much lower power envelope than Isaiah). The next generation after the current Menlow platform will even work at the cellphone level, but right now Intel is targeting MIDs (Mobile Internet Devices), which predominantly run Linux BTW. The At
    • And Intel themselves want to make Via engineers' lives easier by artificially ("for what purpose we will sell it") limiting the ways the CPU can be used...
    • By the looks of things, Isaiah will wipe the floor with Atom if Intel doesn't bury Via with branding power. Isaiah's out-of-order execution will offer much better performance than Atom's in-order execution. ...and more power consumption, from the looks of it. VIA estimates the 65nm Isaiah will have the same power envelope as their 90nm C7 (10-20W), which is not unreasonable. However, this is an order of magnitude higher than Atom, which sips power at 0.5-3W. The only thing VIA has that could
  • by pslam ( 97660 ) on Sunday March 09, 2008 @11:01AM (#22692736) Homepage Journal

    The Atom architecture is intended to give Intel a foothold in handheld devices that have traditionally been the sole domain of very low-power RISC processors. The chip itself is tiny at less than 25mm square, and, according to Santa Clara, has a TDP of 0.6W - 2.5W, as compared to a 35W TDP for a "typical" Core 2 Duo.

    Sigh. They do this every year or two - Intel announces a new core that will get them into more handhelds. They're still an order of magnitude short. Typical "very low-power RISC processors" you see in a device such as a mobile phone or MP3/video player are more like 0.01W - 0.25W, or even less. They're way more efficient clock-for-clock (and MIP-for-MIP) than any x86 core Intel has ever churned out.

    Unless they have a funny definition of hand-held device we don't normally use, of course.

    • by CajunArson ( 465943 ) on Sunday March 09, 2008 @11:32AM (#22692946) Journal
      These chips aren't designed to go into cellphones, and Intel frankly says they are not going into cellphones. They are instead designed for MIDs [wikipedia.org] that will predominantly run Linux. Think of these things as smaller & lighter than your notebook with customized interfaces (not just mini-desktops) that are also easier to use than cellphones for accessing the Internet. Considering that Atom chips are roughly equivalent in processing power to first-gen Centrino chips, these devices should be extremely capable with the right software. The next generation of Atom at 32nm will have the proper power envelope to run your cellphone BTW.
      • by pslam ( 97660 ) on Sunday March 09, 2008 @11:54AM (#22693060) Homepage Journal

        These chips aren't designed to go into cellphones, and Intel frankly says they are not going into cellphones. They are instead designed for MIDs that will predominantly run Linux.

        That's funny, because according to the link, MIDs are a class of hand-held device invented by Intel. So I'm right - they have a different definition of hand-held to everyone else.

        The next generation of Atom at 32nm will have the proper power envelope to run your cellphone BTW.

        They will be 10 times more power efficient than their 45nm version? Extremely unlikely. Also consider that the real low-power processor market isn't standing still either - they're managing about a 25%-50% power efficiency improvement per year. Also consider that the current high-end low-power CPUs you find in mobiles are comparable in performance to the first-gen Centrino chips.

        The kind of "hand-held" devices Intel are talking about have big batteries and are held with two hands. 1 Watt does not count as low power in this market. The real hand-held device chip market measures its power in milliwatts, not watts. These chips idle at a single milliwatt and average 20-50mW in use. Intel is still running orders of magnitude higher than that.
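
        To put some rough numbers on that gap (a back-of-envelope sketch; the 25%-50% rates come from the comment above, everything else here is illustrative):

        ```python
        # Back-of-envelope: years of compounding efficiency gains needed to close a
        # 10x power gap, assuming the low-power competition stood still (it doesn't).
        import math

        def years_to_close(gap, annual_improvement):
            # Solve gap = (1 + r)^years for years.
            return math.log(gap) / math.log(1.0 + annual_improvement)

        for rate in (0.25, 0.50):
            print(f"10x gap at {rate:.0%} improvement/year: {years_to_close(10, rate):.1f} years")

        # Roughly 10 years at 25%/year, still almost 6 years at 50%/year - and the
        # ARM-class parts improve at about that same rate, so the relative gap barely moves.
        ```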

        • by bhima ( 46039 ) * <Bhima,Pandava&gmail,com> on Sunday March 09, 2008 @12:48PM (#22693404) Journal
          From Linux Devices: http://www.linuxdevices.com/news/NS5492118276.html [linuxdevices.com]

          ARM vs. Atom

          There's much to like about the Intel Atom, writes Williston in EETimes. Yet, he suggests, the media and its readers may have been overwhelmed by the hype machine. Williston offers the following responses to typical arguments from the atomic power lobbyists, at times quoting analysts such as Forward Concepts's Will Strauss to back him up:
          Atom will beat ARM because it can run Vista. -- No it can't, says Williston. Atom can run Windows CE and Linux, but ARM can do the same.

          Only Atom offers a "real" Internet experience with Flash video, YouTube, etc. -- "Wrong," writes Williston, pointing to ARM Flash players from BSquare, and an ARM-based YouTube decoder from On2. He might also have noted that Nokia's ARM- and Linux-based Internet tablets use a Mozilla-based browser, with plugins for Flash, Windows Media files, and even Microsoft's Flash-like Silverlight technology.

          Intel dominates every market it enters. Here, the writer refers the reader to the history books, especially two years ago when Intel sold its PXA line of embedded processors to Marvell after failing to dominate the market for ARM-based SoCs.

          Atom will win because ARM is proprietary technology. Nope, he writes. ARM chips are available from a number of semiconductor vendors.

          Intel will win on cost. Not likely, he writes. Using a 65nm process, the Cortex-A8 occupies less than 3mm x 3mm, he notes, while the Atom core probably takes up about 9mm x 9mm of Atom's 25mm x 25mm die size, despite its smaller 45nm process. "With such a huge area disadvantage, it's hard to see how Intel will win on cost," he writes.

          Intel will win on power. Once again, not likely, he argues. Intel quotes a thermal design power (TDP) of 0.6W to 2W for Atom, he writes, but doesn't specify clock speeds. ARM offers only "typical" power measurements, making comparison difficult. But at best, he suggests, Intel matches ARM on power usage, while "in most scenarios, Atom burns more power."

          Intel will win because it has the most advanced fabs. Perhaps, he writes, but who cares? "Consumers focus on cost, power and speed," he writes.
          • Re: (Score:2, Informative)

            Sorry, if you RTFA carefully, the Atom chip size is 25 square mm. This means it's about 5mm by 5mm.

            While Intel did sell its PXA line of ARM uPs, it still makes a fairly large range of ARM processors, most of which clock at fairly impressive speeds (faster than most of the competitors' ARM uPs). Easy to check, just go to the Intel site.

            Even ARM processors start requiring a fair bit of power when the clock rate gets high.

            ARM IS proprietary. The fact that every semi vendor appears to have ARM in its l
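
          For what it's worth, the cost argument above mostly comes down to dies per wafer. A naive sketch of that (the wafer cost and yield numbers are placeholders I made up, not figures from the article), comparing a roughly 9 mm^2 core with a roughly 25 mm^2 die:

          ```python
          # Naive per-die cost model: a wafer costs a fixed amount, so cost per die
          # scales roughly linearly with die area (ignores edge loss and defect clustering).
          import math

          WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2   # 300mm wafer, ~70,700 mm^2

          def cost_per_die(die_area_mm2, wafer_cost_usd=5000.0, yield_fraction=0.9):
              good_dies = (WAFER_AREA_MM2 / die_area_mm2) * yield_fraction
              return wafer_cost_usd / good_dies

          print(f"~9 mm^2 core: ${cost_per_die(9):.2f} per die")
          print(f"~25 mm^2 die: ${cost_per_die(25):.2f} per die")
          # Whatever the real wafer cost, the per-die cost ratio tracks the area ratio (~2.8x here).
          ```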
        • The kind of "hand-held" devices Intel are talking about have big batteries and are held with two hands. 1 Watt does not count as low power in this market. The real hand-held device chip market measures its power in milliwatts, not watts. These chips idle at a single milliwatt and average 20-50mW in use. Intel is still running orders of magnitude higher than that.

          Hmm, if you're saying they would like to be running at such low power, or are trying to get their chips running at lower power, which is what you seem to be implying, I don't think you're right. As the article says, Intel is making 100k 45nm processors per day; how many micro-watt ultra-mobile processors are made per day?

          Intel can really be in any processing market they want, and if they're not it's probably because they don't think it's profitable enough.

          • by pslam ( 97660 )

            Hmm, if you're saying they would like to be running at such low power, or are trying to get their chips running at lower power, which is what you seem to be implying, I don't think you're right. As the article says, Intel is making 100k 45nm processors per day; how many micro-watt ultra-mobile processors are made per day?

            These apparently mythical chips I'm talking about: a) exist today and b) are in every mobile phone in the world. They make Intel's 100k per day figure look small.

            Intel can really be in

        • The number 1 reason why x86 processors burn this much power is the incredibly dense feature set they implement, a feature set that was designed with servers/workstations/desktops in mind, not .01W cell phones.

          First and foremost, they implement a variable-length CISC instruction set, which complicates nearly everything in the front end of the machine (branch prediction to instruction decode), and some stuff in the back end too. Add on multiple operating modes, multiple paging modes implemented in hardware,
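
          A small illustration of that variable-length decode point (the byte counts below are standard x86 encodings as I remember them; treat the list as illustrative, it isn't from the parent post): the decoder can't know where instruction N+1 starts until it has at least partly decoded instruction N, whereas a fixed 4-byte RISC encoding hands the front end every instruction boundary for free.

          ```python
          # A handful of common x86 instructions and their encoded lengths in bytes.
          x86_lengths = {
              "nop":                 1,  # 0x90
              "push ebp":            1,  # 0x55
              "add eax, ebx":        2,  # 0x01 0xD8
              "mov [ebx+8], eax":    3,  # 0x89 0x43 0x08
              "mov eax, 0x12345678": 5,  # 0xB8 + 32-bit immediate
          }
          print(sorted(set(x86_lengths.values())))  # several distinct lengths: [1, 2, 3, 5]
          # Classic ARM, by contrast, encodes every instruction in exactly 4 bytes, so
          # instruction boundaries are known before decode even starts.
          ```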

      • Re: (Score:2, Insightful)

        by hattig ( 47930 )
        The next generation of Atom at 32nm will have the proper power envelope to run your cellphone BTW.

        No it won't.

        It won't be small enough, nor will it be integrated enough. Sure, Intel will move the GPU and Northbridge into the CPU, but that's still nowhere near as integrated as the ARM based competition.

        Also it seems that people think that ARM will stay where they are now, and just happily let Intel slowly get to their power consumption over the next five years. What utter tosh. ARM have multi-core Cortex cor
    • by Kjella ( 173770 )

      Typical "very low-power RISC processors" you see in a device such as a mobile phone or MP3/video player are more like 0.01W - 0.25W, or even less. They're way more efficient clock-for-clock (and MIP-for-MIP) than any x86 core Intel has ever churned out.

      I was under the impression that most of these had extremely lousy performance, and relied on dedicated decoding chips specifically designed for the task they do. Not that it's a bad thing - I've got many appliances that do an excellent job - but if Intel is trying to corner the "general-purpose PC in a handheld" market, Atom is probably a strong contender. I'm not sure how large that market is though; the interface usually gets so cramped there's a limited number of applications that'd need it...

      • by pslam ( 97660 ) on Sunday March 09, 2008 @12:16PM (#22693198) Homepage Journal

        I was under the impression that most of these had extremely lousy performance, and relied on dedicated decoding chips specifically designed for the task they do.

        It's a common impression but it's wrong. For example, most of the CPUs you find powering MP3 players do decoding entirely in software. Even many CPUs powering hand-held video players do decoding entirely in software, but to be honest you'll find most of the high-end video players use hardware decode because a) it's faster and b) it's more power efficient.

        The performance of the CPUs you find in most high-end hand-held devices these days is surprisingly good. Well, it's surprising to people who haven't worked in the field, at least. We're constantly somewhat annoyed that the rest of the world hasn't worked it out yet. A high-end ARM11 (common in high-end mobile phones) is actually quite competitive with a Via C3, for example.

  • by Vellmont ( 569020 ) on Sunday March 09, 2008 @11:01AM (#22692750) Homepage
    This isn't a new market, it's a well established one. Intel already has serious competition in this market, as evidenced in the article:

    The Atom architecture is intended to give Intel a foothold in handheld devices that have traditionally been the sole domain of very low-power RISC processors.


    I'm not sure that anyone really cares about what the instruction set for a handheld device is, since the operating systems for handheld devices have been relatively chip-agnostic.
    • I think AMD has to be competitive at something, but for the moment, UMPCs are hardly a decent-sized market.
      • This isn't an Intel vs. AMD market segment. Intel may be marginalized, but AMD more so.
      • by jabuzz ( 182671 )
        With the Asus Eee 701 shipping something like 500,000 units so far and looking at something like three million units by year end, that is a pretty decent market to me. Anyone placing an order for three million processor units from *any* manufacturer, including Intel, will get plenty of attention.
    • Re: (Score:2, Informative)

      by ocirs ( 1180669 )
      That statement isn't aimed at Atom, but rather at Intel's ability to mass-produce 45nm chips, which cost less, perform better and generate less heat. AMD is just starting 45nm production, and by the time we see it hit full production, you can expect Intel to be transitioning to 32nm.
    • Re: (Score:3, Insightful)

      by Lord Ender ( 156273 )
      The Microchip PIC is an example of a "very low power RISC proc" but it doesn't even have an OS. With no OS, the instruction set matters.
      • An OS is not necessary. There are powerful high level languages for the PIC architecture. I tend to view said languages as macro collections to make deploying PICs easier.

        Yes, I have deployed compiled binaries on PIC controllers as small as the PIC10F202 [chipcatalog.com]. (24 bytes of read/write memory, 768 bytes of program memory)

    • I'm not sure that anyone really cares about what the instruction set for a handheld device is, since the operating systems for handheld devices have been relatively chip-agnostic.

      I'd really like to know where you're getting this.
      IIRC Pocket PC runs on only three chipsets, Windows CE.Net on one, Symbian on three, and Palm OS on two (sort of...).
      The only thing that I know of that comes close is Linux, and it's only mostly chipset agnostic because everything is written to run GCC, and GCC has been written to co
  • Intel (Score:1, Flamebait)

    Would this article read the same (AMD playing catch-up) if Intel didn't sponsor /. so much?
  • Ultrasmall devices? (Score:5, Interesting)

    by jhoger ( 519683 ) on Sunday March 09, 2008 @11:14AM (#22692816) Homepage
    "Ultrasmall" is fine if you don't need a display and keyboard.

    I think the utility for these new processors is reducing power consumption on devices that are the same size we normally expect.

    Is anybody really satisfied with ~3 hours of battery life on a laptop? Considering this is the 25th anniversary of the Model 100, which sold 6 million units, got 20 hours of battery life, was lighter than most laptops today, and was easier to use (instant on and off), people should know we can do better.

    -- John.
    • by Vellmont ( 569020 ) on Sunday March 09, 2008 @11:29AM (#22692928) Homepage

      Is anybody really satisfied with ~3 hours of battery life on a laptop?

      Given that laptop sales are at an all-time high, I'd say the answer is "yes". Do people want more? Sure, but they're willing to settle for 3 hours.

      Part of the problem is laptops are just an extension of desktops, and desktops are driven by more and more resource usage (and thus more power). I'm sure someone could come out with a laptop with a 12 hour battery life, but:

      It'd run modern desktop software slowly.
      It'd have less storage space (20 gigs of flash RAM? This isn't so bad, really)
      The screen wouldn't be quite as "nice" as the 3 hour laptop. The maker would likely have to compromise on the screen technology to reduce power consumption.

      Low-power devices like this exist, of course. They're just put in a different class of device because of the above compromises.

      • Sounds just like your description. Smaller screen, solid state storage (much less than what you expect, but one can always add a 16GB USB stick). Unfortunately its battery life is nowhere near 12 hours, but I guess it is a feasible goal.
      • by jhoger ( 519683 )

        Given that laptop sales are at an all-time high, I'd say the answer is "yes". Do people want more? Sure, but they're willing to settle for 3 hours.

        They don't have much choice but to settle for it when the market doesn't provide them an option. However, I think that's starting to change with Internet tablets and things like the Eee PC.

        Part of the problem is laptops are just an extension of desktops, and desktops are driven by more and more resource usage (and thus more power). I'm sure someone could come ou

        • unless you're doing video editing, 20 gigs is a lot of space
          I agree. I am using less than 5 gigs right now (no music, games on a Windows partition, etc). If you want to store videos/music, get a NAS.
    • I'd say the real question is, why aren't things like these [alphasmart.com] sold at Wal*Mart for $20 a pop? They're great for students, who don't really need full wordprocessing and internet capabilities all the time, just enough to type some things and then use a real computer for markup & such.

      But they cost as much as an OLPC, what gives?

      • But they cost as much as an OLPC, what gives?

        I guess because someone didn't have the capacity to produce a zillion of these things.

        It also looks like technology stolen from the 1980s. Who really wants a display that looks like it belongs on a calculator?

        It's a neat idea though. It just seems it's over-priced, and under-performing.
        • Well, yeah, for $400, they're not anywhere near worth it. But for 20 bucks, they'd have a decent market in the Walmart crowd. Heck, Walmart already sells toys that are nearly as functional, just with shorter LCDs and focused entirely on cheesy games instead of useful text entry.

          For $9-$12, they become disposable. People might buy a couple just to have around.

          And there's no reason they can't fit extremely limited internet access in there, like, just pine or something, via modem or ethernet port. They
      • by jhoger ( 519683 )
        The Alphasmart products are targeted at the education market, not the general market. So what advertising there is would never reach the general consumer. Also the higher price likely has something to do with the target market. The Alphasmart Dana, for example, is basically a Dragonball Palm with a larger screen, built in keyboard and USB.

        In any event, I think these days you cannot appeal to the mass audience unless the device provides Internet access. Although for some people avoiding distraction from the
    • by Erpo ( 237853 )
      Is anybody really satisfied with ~3 hours of battery life on a laptop?

      Agreed. The ideal laptop stays up at full performance on battery power alone for as long as I can stay awake. It recharges in less time than it takes me to sleep. I would gladly deal with a double-thickness, double-weight laptop if it meant significantly more battery life. Or even triple thickness, triple weight.
    • Yes.

      I will take a much faster computer with more RAM and sacrifice a bit of battery power in order to do it. I'm not that often removed from power for more than three hours. On cross-country trips (I take a handful a year) I have an inverter in the car. On an airplane? Except for international travel, recharge at the airport during layovers. I'll trade power for battery any day down to 2-3 hours.
      • by jhoger ( 519683 )
        So you're the guy ;-)

        In fact for most applications one need not trade off speed. My PDAs and cell phones are not "faster computers with more RAM" but all of them launch applications faster than my desktop or laptop, switch instantly between apps, turn on and off instantly, all of this while never thrashing on the disk and lasting for days on the battery rather than 2-3 hours.

        So my argument here is that, unless you are doing serious number crunching, CAD work, etc (which most people do at a desktop) most mea
  • In the announcement, Intel says the area is 25 square mm,
    which is a lot smaller than 25 mm square (25 mm on each side).
    A nit, perhaps.
    • by Your.Master ( 1088569 ) on Sunday March 09, 2008 @11:51AM (#22693046)
      I'm from a place that properly uses SI units, untainted by imperial measures. I went to University and picked up an engineering degree. I have never, ever heard of 25 mm square necessarily meaning (25 mm)^2 instead of 25 mm^2. I would always assume the latter, and that's how my peers and professors talked, too.
      • Well, lah-di-dah. I didn't choose the place of my birth, but I think I come from an OK place, too.
        And I have two engineering degrees, so there! :-)

        I don't think the units had as much to do with it as the ambiguity of area vs. dimensional size. Since the original posting said "measured 25 mm square," it wasn't clear whether it was referring to its area or its actual dimensions. Go have a pint of ale on me.
      • by smoker2 ( 750216 )
        Maybe you should work for NASA. 1 mm^2 is a measurement of total area, whereas 1 mm squared describes the dimensions of the sides. The correct way to write 1 mm^2 longhand is 1 square millimetre. Notice the position of the word "square". And to mention SI units is irrelevant, the same rules apply to inches, feet, yards and miles.
      • English is not my native language, but the parent has a point. If I were to mean (25mm)^2, I'd say 25 mm squared. 25mm square just doesn't have a proper meaning AFAIK, and can lead to confusion; 25 square mm is the right way to say it (25 mm^2) "aloud".
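
        Just to spell out how far apart the two readings are (trivial arithmetic, nothing Intel-specific):

        ```python
        # "25 mm square" read as a 25 mm x 25 mm square, versus "25 square mm".
        side_reading = 25 * 25          # (25 mm)^2 = 625 mm^2
        area_reading = 25               # 25 mm^2
        print(side_reading / area_reading)   # the two readings differ by a factor of 25
        print(25 ** 0.5)                     # a 25 mm^2 square die is only ~5 mm on a side
        ```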
      • by Kjella ( 173770 )

        I'm from a place that properly uses SI units, untainted by imperial measures. I went to University and picked up an engineering degree. I have never, ever heard of 25 mm square necessarily meaning (25 mm)^2 instead of 25 mm^2. I would always assume the latter, and that's how my peers and professors talked, too.

        I'd say that under some circumstances it means different things:
        "I was looking at this new apartment, very lovely balcony which was about four meters square."
        "I was looking at this new apartment, very lovely balcony which was about four square meters."

        The former I'd clearly interpret as being a square with each side being 4m (top floor?).
        The latter I'd clearly interpret as being 4m^2 of ambiguous shape.

        I guess it'd depend on the context if I got it wrong, but if it makes sense to tell me the shape

  • subsidies anyone? (Score:4, Insightful)

    by buddyglass ( 925859 ) on Sunday March 09, 2008 @11:20AM (#22692864)

    This reminds me of an economics lecture I attended once, which dealt with the topic of government subsidies. In general, the professor was extremely against subsidies, since they pervert free market dynamics and generally lead to lower overall efficiency, higher prices, etc. However, the one situation where he supported them was for industries where the cost of doing business is so high that the world market can only support a monopoly. In that case, he argued that subsidies were vital in that they enable the existence of two entities in a given space, thus creating competition and spurring innovation.

    His main example was the commercial aviation industry, where the two big players are Boeing and Airbus. According to him, without large subsidies from the U.S. and E.U., one of those two would "win" and the other would cease to exist, leaving us with a single global manufacturer of commercial airplanes. I wonder if this argument now applies to Intel?

    • Re: (Score:3, Insightful)

      by Albanach ( 527650 )
      Hmm, I'm not sure if it'd apply to Intel or not. Surely though it'd apply to operating systems since MS is a convicted monopolist. The solution is therefore for the government to subsidize linux.

      Seriously, Intel is huge, but there have been other, better chip makers before and there likely will be again. If Intel's competition declines, their designs will begin to stagnate as they try to increase profit and deliver 'shareholder value'. Then another AMD/ARM/IBM etc etc will come up with something different and th
      • Re:subsidies anyone? (Score:5, Interesting)

        by theskipper ( 461997 ) on Sunday March 09, 2008 @12:24PM (#22693240)
        An interesting aside wrt AMD. Apparently AMD's license for the x86 instruction set has a massive "catch":

        http://www.overclockers.com/tips01276/ [overclockers.com]

        what clause 6.2 appears to say is that if AMD gets taken over or goes bankrupt, Intel has the right to end AMD's right to use Intel's patents and copyrights after sixty days notice. This would seem to mean AMD couldn't make x86 processors anymore.

        The direct findlaw doc link:
        http://contracts.corporate.findlaw.com/agreements/amd/intel.license.2001.01.01.html [findlaw.com]

        So the arms race isn't so cut-and-dry because x86 is so pervasive. Any competitor would likely find themselves in the same situation as AMD because Intel holds the licensing trump card. Imagine being the startup trying to negotiate a fair arrangement under those conditions (i.e. where they could be truly competitive with Intel down the road).

        • Re: (Score:2, Informative)

          People need to remember that AMD is only in the x86 business at all because they got their foot in the door as a second-source producer of Intel chips decades back. Without those old agreements, they wouldn't be making an x86 processor at all.
      • Re: (Score:3, Insightful)

        Hmm, I'm not sure if it'd apply to Intel or not. Surely though it'd apply to operating systems since MS is a convicted monopolist. The solution is therefore for the government to subsidize linux.

        The difference is that it takes billions of dollars just to start to compete with Intel. Someone could make a Windows clone and compete with Microsoft for, say, a couple tens of millions. That nobody does it is the stupidity of most of the industry, who don't understand the power of compatibility. They just see

        • The difference is that it takes billions of dollars just to start to compete with Intel.

          Not quite. I talked to a former Intel chief engineer a couple of years ago who was thinking about starting up his own semiconductor firm. His business plan called for about $30m of funding, which he was offered but declined because he eventually decided his target market would have shrunk a lot by the time he got a product to market (when I talked to him it was too soon to tell if he was correct). You might need billions of dollars to compete with Intel in terms of scale, but you can start ma

        • In software development the last 10% of a sufficiently complex project takes 90% of the time and effort
          The last 5% takes 95% of the time and effort
          This pattern continues even after the software ships, unfinished of course like all large projects.
          Therefore I doubt windows can be cloned
          I have more hope for things like this atom chip to break the microsoft monopoly by creating classes of devices too cheap to be worth paying the microsoft tax on but powerful enough to create a new market and API ecosyste
        • Someone could make a Windows clone and compete with Microsoft for, say, a couple tens of millions. That nobody does it is the stupidity of most of the industry, who don't understand the power of compatibility.

          Your idea has already been tried twice: both WABI and OS/2 were attempts to build a "better Windows than Windows". There are not many companies better poised to take a run at Microsoft than IBM and Sun in their heydays.

          The plan most companies have now is smarter: build layers like Java, Flash, H

          • Your idea has already been tried twice: both WABI and OS/2 were attempts to build a "better Windows than Windows". There are not many companies better poised to take a run at Microsoft than IBM and Sun in their heydays.

            Au contraire, particularly for OS/2. IBM specifically declined to implement Win32, and also made the device drivers incompatible. That was the kiss of death. OS/2 was forever application and device driver starved. In fact, I recall IBM shipping their computers with both OS/2 and Windows 3

    • There are a lot of companies out there that make chips. IBM has major chip fabs (for things like the Cell and Power chips). Hitachi does at least some work on major chips (the Earth Simulator uses Hitachi chips, for example). I don't know if they fab them, but they design them at any rate. Then there's TSMC; they don't do any design, they are just a fab for hire. They are the major source of graphics card chips out there.

      It isn't that other companies couldn't compete in the desktop market, it is that they
      • While it isn't going to happen tomorrow or anything, I could very well see in 10 years that there are multiple different architectures for desktop systems. Nobody cares about that because the OS handles all the details, your apps run on any of them. People simply buy on price and performance criteria.

        If the OS is open source unix, it happened yesterday.

        • Uhhh, not really. In that case the OS handles no details at all; you have to handle everything. You need to compile all the apps. Sometimes it works nice n' easy. Sometimes it doesn't work at all and you get to dig through source code and try to figure out why. The OS itself will only run the binaries designed for it.

          I've gone through this at work, getting various OSS things written for x86 Linux to run on SPARC Solaris. Sometimes it is quite easy, just a recompile. Other times, it is a nightmare. I
          • Re: (Score:3, Interesting)

            by mechsoph ( 716782 )

            That's what I'm talking about, not code recompilation. That's fine and all, but don't confuse it with transparent portability.

            You're right that languages like Java and Python are much easier to port than C, but porting is the developer's problem, not the user's. When the software is open source, it is highly likely that, for major applications, some developer has already ported it to whatever architecture you may be interested in using. Debian runs on 11 architectures. When I used gentoo (perhaps not t

      • Re: (Score:3, Informative)

        by TheRaven64 ( 641858 )
        It's not the high-level runtimes that are making x86 obsolete, it's the low-level ones. Emulator technology has come a long way in the last few years. Rosetta (which Apple licensed from a little start-up out of Manchester University in the UK, by the way) has really shown how unimportant the ISA is. I run a few PowerPC apps on my MacBook Pro and don't even notice that they aren't native.
    • by khallow ( 566160 )
      I don't see how it applies to the commercial aviation industry. The subsidies are part of the reason that the industry is down to two players.
      • Basically, the argument was that the R&D costs involved in producing a new model of airplane are so incredibly high that they exceed half the global market for new planes. So if you have two producers, both of whom have to foot that same huge R&D bill, it's impossible for either of them to turn a profit.
        • by khallow ( 566160 )
          Keep in mind most of the R&D cost is due to government regulation. That sort of thing heavily favors the big company. Then toss in that the big company can bribe the government to subsidize it and you have the current oligopoly; competitors would have to both meet the stringent regulations of the US and EU, and compete without subsidies (at least until they get established). It's a very sweet deal for Boeing and Airbus.
          • Sure. It's virtually impossible for anybody besides those two to get in the game. This professor's point, though, was that if you took away the subsidies from both companies, one of them would necessarily fail, since the total amount of money to be spent on commercial airplanes worldwide is not enough to offset the development and production costs for *two* producers. Even if each claimed exactly half of the market, they would both end up losing money.

            I'm sure government regulation plays a part in the c

            • by khallow ( 566160 )
              My point is that there is room. The regulations are part of the subsidies. That vastly reduces R&D costs.
  • I grabbed an e8400 as soon as they became available (and I'm glad I did because they sold out quickly and are still hard to come by). I have an extremely moderate overclock to 3.5Ghz with a 1.2V Vcore and it doesn't even hit 60C when I'm torturing both cores with prime95. Additionally, the entire platform (x38 chipset, Nvidia 8800GT video card, Intel hi-def audio, gigabit ethernet, etc.) worked out of the box with Kubuntu 7.10, about the only tweak was that I manually upgraded to the newest Nvidia driv
  • by MsGeek ( 162936 ) on Sunday March 09, 2008 @12:16PM (#22693204) Homepage Journal
    Now that Intel has seen people go bonkers for the Eee and similar devices, I wonder if they will put out a consumer version of the Classmate with Atom inside? A little Atom-powered mini lappie with a 1.8" HD ala the Cloudbook and a decent amount of RAM would own. Another suggestion would be to put an IBM/Lenovo/Toshiba style pointing stick "eraserhead" as the pointing device. The Cloudbook's miniature trackpad on the left and clicking buttons on the right suck ass. And the full-size trackpad on the Eee is wasteful of space which could be freed up with a pointing stick and a set of clicking buttons beneath the keyboard.

    Gimme one of those, with a REAL Linux inside (Debian Lenny would be perfect, or Kubuntu) and I'd be sold.
  • Ooooh. That's a lot of potatoes.

    - difficult for AMD to catch up

    Do better on the benchmarks and it would be a smaller problem. People believe Intel is a performance winner, so AMD has to provide concrete evidence of equivalent or better performance. Easier said than done, but that's what can bring investment funding and sales.
  • AMD can go fabless (Score:5, Interesting)

    by cyfer2000 ( 548592 ) on Sunday March 09, 2008 @12:35PM (#22693324) Journal
    The advantage of AMD is design. AMD has never bested Intel in fabrication. It looks like AMD's design team has been held back by its fabrication capability. To solve this problem, AMD could outsource fabrication to companies like TSMC or Chartered Semiconductor.
    • by thsths ( 31372 )
      > The advantage of AMD is design. AMD has never bested Intel in fabrication.

      That may be true (although with the Athlon they were close), but you just do not get fabrication like Intel's on the open market. If you want cutting-edge CPUs, you need cutting-edge fabs, and that is not available as a commodity.
      • AMD only needs to find a foundry better than them in either quantity or quality or both, not a foundry better than Intel.
    • Re: (Score:2, Interesting)

      by Manatra ( 948767 )
      One small problem, AMD's contract with Intel states that they can't outsource more than 20% of their chip production.
      • by bhima ( 46039 ) *
        Isn't that chips with X86 instruction sets? Wonder why they never ventured into some of the alternates... SPARC, Power, PowerPC, or ARM...
        • I'm pretty sure that AMD had their own processor archs ages ago. They made a 'bit slice' processor for a time, where you could strap chips in series to make arbitrary bit-width CPUs. But when they decided to become an x86 clone outfit, that's what they remained. It wasn't until Intel 'foundered' on the future path for x86 and started working on Itanium that AMD actually became more than a knock-off second sourcer.

          (I am not an 'Intel backer' in the ludicrous Intel vs. AMD fanboy adventure, btw, just a bysta
          • I'm pretty sure that AMD had their own processor archs ages ago.
            The AMD 29000 series was quite successful until 1995, when AMD decided they needed the engineers working on it to concentrate on x86 (a shame, but it did seem to work out for them). It had a register window system like SPARC and the Intel 860 series (and the Berkeley RISC1 that they all descended from), but it used a variable-sized register window, which eliminates the register-wasting problem the other architectures had. To my mind, thi
      • I never realized there is such a contract. Then the game is getting more interesting.
      • One small problem, AMD's contract with Intel states that they can't outsource more than 20% of their chip production.

        I've seen this line dropped a million times on Slashdot, and I have NEVER seen anyone back it up with proof. This is turning into the ultimate geek internet rumor, because everyone parrots it verbatim without checking the facts.

        I say, put up a link, or shut the hell up. Unless you have a news article or a copy of AMD's contracts, you have proof of nothing.

        The fact is, you don't even need a
    • I'm curious as to why they don't alternate levels or something. Is there something about going from 90nm to 65nm that has to take a certain number of years, or that has to be done in order?

      I mean, it seems strange to me that these fab size reductions keep coming down like clockwork if it's research that's holding them back. I would expect that the jumps would be more irregularly spaced and dramatic, even if they average out to the Moore observation.
  • It's distinctive, high-tech, and suggestive of the character of this CPU.

    Intel has been running another clever commercial on TV news programs. A bunch of professionals hold the chip die (smaller than the chip case itself) and mention the remarkable contribution this tiny computer makes in some aspect of their lives.
  • by rbrander ( 73222 ) on Sunday March 09, 2008 @01:17PM (#22693542) Homepage
    Sounds like the new big market is "ultra-mobile" mini-laptops, from those links to "MID" and "UMPC" in the Wikipedia.

    My purchase of an Eee PC got me to do up a presentation for the engineers at work,

    "Poor Man's Computer: Cheap Internet Appliances for the Whole World"

    http://www.cuug.ab.ca/branderr/pmc [cuug.ab.ca]

    on the topic. Short version: as predicted by Dan & Jerry Hutcheson in Scientific American around 1997, the market is turning from "endlessly bigger and faster at the same price point" to "smaller and way cheaper, if not as fast". We're taking our "Moore's Law gains" in the form of money rather than speed, thanks very much.

    And this price drop into $300 and $200 laptops (and under, in the case of the XO) is colliding with the surging number of people in the developing world who make $10/day or more. Sales in the billions beckon. 100,000 per day? Hah. If they make the right product, they'll have to ramp up to many hundreds of millions per year.
    • Re: (Score:3, Insightful)

      by jhoger ( 519683 )
      I dunno. Aside from the current blog buzz, the only thing that will continue to excite folks in the developed world about the Eee, CloudBook, et al. is added utility. That utility needs to come as:

      all day battery life
      light-weight
      instant on/off (like a PDA)
      full-screen window manager (like a PDA)
      de-bloated software
      but with a full sized keyboard and display

      The point for me is not a low-cost cheapie computer. The point is more utility, usability, portability when moving from the kitchen to the confer
