Intel Launches Next-Gen Atom N450 Processor

MojoKid writes "Intel has unveiled its next-generation Atom N450 processor, and a review of the new Asus Eee PC 1005PE netbook that houses it shows decent gains in performance and lower power consumption. The Atom N450 has been re-architected along the lines of Intel's other notebook processors: it now has an integrated memory controller and graphics core on the CPU die itself. In addition, Intel's serial DMI (Direct Media Interface) now replaces the front-side bus to the Southbridge I/O controller. From a performance standpoint, the single-core Atom N450 offers a nice performance gain versus previous-generation Atom CPUs, and it appears Intel has dual-core variants of the chip on the horizon as well."
This discussion has been archived. No new comments can be posted.

  • So... (Score:4, Funny)

    by fuzzyfuzzyfungus ( 1223518 ) on Monday December 21, 2009 @10:29AM (#30512386) Journal
    Is the new integrated graphics core a descendant of Intel's much-maligned (but well supported in Linux) GMA950 line, or is it another take on the HD-media-accelerating-but-dear-god-the-drivers-oh-why-does-it-hurt GMA500 stuff?
    • RTFA, please (Score:2, Informative)

      by Anonymous Coward
      RTFA: "The graphics core is a basic DX9 instantiation that is a kin to Intel's GMA500 graphics core in the previous generation Intel 945G chipset"
      • Re:RTFA, please (Score:4, Informative)

        by nxtw ( 866177 ) on Monday December 21, 2009 @10:49AM (#30512636)

        RTFA: "The graphics core is a basic DX9 instantiation that is a kin to Intel's GMA500 graphics core in the previous generation Intel 945G chipset"

        Makes no sense: the 945G and variants had a GMA 950.

      • Re: (Score:3, Informative)

        by TeknoHog ( 164938 )

        RTFA: "The graphics core is a basic DX9 instantiation that is a kin to Intel's GMA500 graphics core in the previous generation Intel 945G chipset"

        I have a 945GM system whose graphics part is called GMA950. It uses the common open-source Intel drivers. By contrast, the GMA500, aka Poulsbo, is the problematic one with closed drivers.

        http://en.wikipedia.org/wiki/Intel_GMA [wikipedia.org]

      • Re: (Score:3, Informative)

        by MarcQuadra ( 129430 )

        That doesn't make any sense. The 945 chipset uses the GMA950; the GMA500 is actually a totally-outsourced PowerVR chip. The 'native' Intel chips (i810 through G45) are all totally supported by Intel's open-source drivers, while the GMA500 is almost impossible to get working in Linux.

        The new built-in N450, D410, and D510 graphics chips are based on the GMA3100, if I recall, they're even called 'GMA3150'. That means they're supported by open-source drivers (and possibly by Mac OS X!), but the performance is bad eno

    • Intel and Linux (Score:5, Interesting)

      by Enderandrew ( 866215 ) <enderandrew@gmSTRAWail.com minus berry> on Monday December 21, 2009 @10:35AM (#30512484) Homepage Journal

      Intel has been tearing apart their Linux graphics stack and rewriting it for the future. For a while, that meant poor performance during the rewrite, but it really is getting better. Intel is really helping push DRI2, GEM, TTM, UXA, etc.

      At least Intel does their development in the open. Didn't Intel also contribute code to Moblin to optimize Moblin performance on their hardware? I'd like to see some more general kernel enhancements for these processors. Any speed increase over Windows on the most common netbook processor is a huge win.

      Chrome OS is already fast. If Intel can help make it faster when comparing it side-by-side to 7, it only helps Linux adoption on the whole.

      I also have a small tangential question. I always hear about huge performance gains that can come from properly writing code to take advantage of the SSE2/3/4/etc. instruction sets. I also hear that almost no one writes code to take advantage of them. If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?

      • Re: (Score:3, Informative)

        by TeknoHog ( 164938 )

        I also have a small tangential question. I always hear about huge performance gains that can come from properly writing code to take advantage of the SSE2/3/4/etc. instruction sets. I also hear that almost no one writes code to take advantage of them. If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?

        The kernel doesn't do much CPU-bound processing. It's math and media libraries where these vector instructions would actually be useful. You can already get some of their benefit by using a decent compiler. Basically, that means different binaries for processors with different capabilities, so your average binary distro is not going to ship any fancy instructions. I suggest trying Gentoo if you actually want to use your modern CPU.

        • Wouldn't it be fair to assume anyone running a 64-bit distro has a processor capable of SSE4 instructions? Write the code to take advantage of these instruction sets, but only enable them in your 64-bit binaries, then.

          I'm no low-level programmer, but I assume I/O and CPU scheduling are math-intensive enough. If SSE instructions really boost video encoding, what about encryption algorithms, or file systems?

          • Re: (Score:3, Informative)

            by Virak ( 897071 )

            The various SSE instruction sets provide SIMD instructions, which is an acronym for "single instruction, multiple data". As the name suggests, they allow you to perform operations on multiple pieces of data with a single instruction. SIMD is great for media applications, where you often have to do the same mathematical operations over and over again on lots of data at once; however, pretty much all of the stuff that happens in a kernel is logic-heavy work that only deals with single pieces of data at a time,

          • Re: (Score:3, Interesting)

            by FreonTrip ( 694097 )
            Certainly not. No AMD CPUs prior to the Phenoms support SSE4.x, nor did any Intel chips prior to the 45nm switchover (later Core 2 CPUs). MMX, i686, SSE, and SSE2 are the baseline for all AMD64-capable CPUs. Subsequent instruction sets have been added to various architectures in willy-nilly fashion, and with varying levels of per-clock performance depending on the chip being discussed. I can't really speak for the utility of putting SIMD code to work in non-multimedia-related code, but it seems to be
          • No, it would not be fair to assume that at all. Most Intel Core 2 Quads in the LGA775 form factor do not have SSE4 capabilities. At the time, you had to buy a Xeon or other high-end quad-core to get SSE4. These processors are quite happy to run 64-bit OSes.
        • by bfields ( 66644 )

          Intel has been tearing apart their Linux graphics stack and rewriting it for the future. For a while, that meant poor performance during the rewrite, but it really is getting better.

          As the original poster points out, none of this applies to the GMA500, which is supported by a different driver--a proprietary binary driver, and not a very well-maintained one at that, if reports are true.

      • Re:Intel and Linux (Score:4, Informative)

        by Andy Dodd ( 701 ) <atd7@@@cornell...edu> on Monday December 21, 2009 @11:12AM (#30512948) Homepage

        Your post completely missed the original poster's point - the Intel GMA500 is a major outlier in terms of Linux support.

        The GMA950 series is well supported by Linux (with the exception of the re-architecture issues that hurt Ubuntu 9.04 so badly).

        The GMA500 is minimally supported in Linux, and all indications are that it will stay that way. The GMA500 graphics core was outsourced to another company, as was driver development.

        As to SSE2/3/4: they only benefit certain types of operations. Most kernel ops won't benefit, and using SSE usually means hand-coding in assembler, since compilers that generate good vector SIMD code are rare. The kernel developers tend to avoid hand-coded ASM whenever possible.

        However, I do recall that RAID checksumming code and memcpy() were once implemented using MMX to improve them, so these sections might benefit from SSE (and might already do so.)

      • If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?

        Well, the point has been made already: that stuff doesn't happen in the kernel [slashdot.org]. Here's the follow-up: if there are optimizations to be done, they can often be done by the compiler. Intel does of course have a snazzy compiler which produces (on average) better-performing executables than gcc does. On the other hand, gcc's focus tends to be x86 and now x86_64, so it's not bad either. In the other cases, the optimizations belong in an external library; libraries involving sound, graphics, and video are likely candidates for i

    • Re: (Score:3, Informative)

      by sznupi ( 719324 )

      It is, supposedly, the X3150, so basically the same part that's in the G31. Or the 3100/X3100? Anyway, it seems to be a "proper" Intel GMA, with good Linux support.

  • by Yvan256 ( 722131 ) on Monday December 21, 2009 @10:29AM (#30512392) Homepage Journal

    (photo) Asus Eee PC 1005PE In Midnight Blue

    What Midnight Blue? Oh, you mean underneath all those stickers? Seriously, why do non-Apple laptops always look like NASCAR, erm, cars?

    • Re: (Score:3, Interesting)

      by anss123 ( 985305 )
      They get paid for the stickers. What annoys me more is the 1,000,001 slightly different models; I would have preferred a slightly inferior but well-supported (by the community as well as the company) model, like the 700 was in the past.
    • Re:Midnight Blue? (Score:5, Insightful)

      by nine-times ( 778537 ) <nine.times@gmail.com> on Monday December 21, 2009 @10:44AM (#30512586) Homepage
      My guess is that it's a variety of factors:
      • Apple, having such a strong design culture, is the only manufacturer who realizes these stickers make your computer look cheap and stupid.
      • Apple's design culture is often about minimalism, and so they probably wouldn't put extra symbols or stickers on their computers even if it didn't look cheap and stupid.
      • Apple is just about the only laptop manufacturer who can't be bullied by Microsoft into putting any kind of "Microsoft certified" sticker on it.
      • Apple customers are less likely to be casual about their attachment to the brand. If you're a Dell customer, you might not think twice about buying an HP. If you're an Apple customer, buying an HP instead is a little more noteworthy. Therefore, they don't have to try to compete by advertising energy star compliance or the latest Intel chip. An awful lot of Apple customers couldn't care less about which Intel chip is in their computers.

      There are probably more, but that's off the top of my head.

      • Re: (Score:3, Interesting)

        I just bought one of the new HP Envy laptops and was pleasantly surprised at the lack of stickers. It's just an HP logo on the back, similar to Apple. In fact, the entire thing was pretty much ripped off from Apple: keyboard design, body construction, multi-touch mousepad, you name it. Even the packaging was slick and minimalist, just like an Apple. (Pricier than a PC, but way more bang for your buck than a similarly priced MacBook Pro.) And no, not a Windows-certified sticker in sight - oh snap, m

        • Re: (Score:2, Interesting)

          I just checked the HP Envy out; it is EXACTLY like a MacBook. They didn't even try to hide it.
          Still, I applaud the rip-off. It shows, at the very least, that they understand how ugly the rest of their lineup is.
          The guy who said "NASCAR" was right on the money. No other term quite embodies the black-hole-of-suck that is PC laptop design.
      • Re: (Score:3, Funny)

        by Anonymous Coward

        * PC customers are capable of removing the stickers.

      • My Dell XPS 1530 laptop did not come with any stickers on it at all. I think the laptop looks relatively nice, too. Maybe not as fancy as a MacBook Pro, but it won't burn your leg if you accidentally touch the machine to your skin, either.
    • by sznupi ( 719324 )

      Stickers can always be removed... What's really frustrating is that many otherwise fine laptops come in a glossy finish.

      That might look good on equipment that sits on a shelf in your house, or in the shop, but it's terrible for something that's meant to be routinely touched by hands and kept in an ordinary bag with other stuff.

      Guess it just shows that such manufacturers care more about how it looks in the shop...

    • Re:Midnight Blue? (Score:5, Informative)

      by Sponge Bath ( 413667 ) on Monday December 21, 2009 @11:33AM (#30513264)

      Once you have removed the stickers, you are often left with difficult-to-remove adhesive gunk on the laptop. An easy way of removing the gunk without damaging or scratching the surface is to spray a little silicone-based lubricant on the area and wipe with a paper towel. It quickly wipes off, and the silicone lubricant won't damage plastic the way petroleum-based lubricants (like WD-40) sometimes do.

      • Re: (Score:3, Informative)

        by rrohbeck ( 944847 )

        I prefer orange-oil-based cleaners. They are often marketed as label or gum removers.
        Not only do they smell good, they also don't damage plastics. Oh, and they're also a great insecticide and will keep ants away, because insects hate the smell - after all, the oil is the orange's natural defense.

  • Intel and Adobe have both completely dropped the ball, but right now it's Intel that's in trouble. The only "netbook" I know of that can handle fullscreen Flash is the LT3013u; at 12" and $350 it hits the price point okay but misses on size. Still, it at least has a 720p display, which means it has to do more than most of the competition just to break even, and it does better than that.

    • by Yvan256 ( 722131 ) on Monday December 21, 2009 @10:31AM (#30512422) Homepage Journal

      If you think Flash sucks on Windows then obviously you've never seen it run on Mac OS X. Adobe is a complete disgrace on that OS.

      • If you think Flash sucks on Windows then obviously you've never seen it run on Mac OS X. Adobe is a complete disgrace on that OS.

        That's okay, I can experience how much it blows on Linux. Using the 32-bit Flash for Linux in a 32-bit Firefox, or in a 64-bit Firefox with a little help, on my Athlon 64 X2 4000+ was about like using it on my Acer Aspire D250 (1.6GHz Atom, old type). Using the 64-bit Flash on that machine was more like using it on a 1.4GHz Thunderbird or something. Now I have a Phenom II 720 and I can just barely watch fullscreen Flash video, and Flash games perform worse than on a Core Duo T2600 with Windows XP. Adobe hates Linux as much as they hate Mac OS.

      • by Anonymous Coward on Monday December 21, 2009 @10:36AM (#30512496)

        If you think Flash sucks on Mac OS X then obviously you've never seen it run in Linux. Adobe is a complete disgrace on that OS.

        • Re: (Score:3, Informative)

          by dunkelfalke ( 91624 )

          On the other hand, the Flash Player for Linux is the only x64 Flash player out there.

        • I installed Flash through the website with the "Ubuntu Partners Channel" on 9.10 and it was really easy. The partners channel looks like just another apt source. It'll be interesting to see if upgrades "just work".
    • by jo42 ( 227475 ) on Monday December 21, 2009 @10:36AM (#30512486) Homepage

      Have you ever even considered that the problem isn't the hardware, but the [lousy, crappy pile of rancid sheep dip] software known as "Flash"?

      • I don't think anyone has argued in this thread that Flash is not a gigantic piece of crap. On the other hand, it's an absolute necessity for the use of many websites. If I want what they've got, I need flash. I don't use flash on my website, if that makes you feel any better.

    • Comment removed based on user account deletion
      • I'm definitely not buying any more single-core Atoms. I got an Acer Aspire D250 and it's something of a dog. I may be reselling it to someone to whom that won't matter, though. Then I got the LT3013u, which has an ATI GPU and a 1.2GHz Athlon 64. No power saving in the mainline Linux kernel yet, but it's coming... so right now it's running flat out. It's still within reasonable norms for temperature, though, and still gets about 3h45m on the battery, which is enough for my current purposes. I'm running Karmic on it, and

  • by sznupi ( 719324 ) on Monday December 21, 2009 @10:30AM (#30512410) Homepage

    Now only a few other pieces of the puzzle remain in the quest for the ultimate ultraportable.

    A Pixel Qi screen, for even longer battery life and legibility in sunlight.

    With the lower temps & power draw of Pinetrail, it might also become routine for netbooks to be cooled passively.

    Also, just for me and the other faithful... uhm... a clit mouse ;p (plus preferably as close in overall form to the original Lenovo S10 as possible; it was actually very nice). Can't help it: playing Diablo 2 in a cathedral during an organ concert, in a cemetery on the night of 1 November (it looks like this here: http://commons.wikimedia.org/wiki/File:Wszystkich_swietych_cmentarz.jpg [wikimedia.org] ), and in a train while sitting next to some nuns are things I simply must do. And with a touchpad that's not really possible.

    • Re: (Score:3, Interesting)

      by drinkypoo ( 153816 )

      Clits have been deprecated because they wear out. They just can't take any abuse whatsoever, and you're always having to buy replacement covers for them. The glidepad, on the other hand, is only hard on your fingerprint, and those are a liability anyway. :)

      I've actually done a bit of point-and-click gaming with a glidepad; it's not too bad. An FPS, on the other hand, is basically a gigantic fail. If not a mouse, I need a trackball [logitech.com] for that. I had the original Marble, whose ergonomics better suited my bear paw

      • Re: (Score:3, Funny)

        Clits have been deprecated because they wear out. They just can't take any abuse whatsoever...

        Just because your girlfriend isn't into S&M.

      • "Clits have been deprecated because they wear out. They just can't take any abuse whatsoever and you're always having to buy replacement covers for them. The glidepad, on the other hand, is only hard on your fingerprint, and those are a liability anyway. :)"

        Bullshit. I've used quite a few decade-old ThinkPads, and not a single one had problems with the trackpoint.

        I can understand preferring a trackpad, but a decent trackpoint/nipple/clit (I actually haven't seen any usable ones except on ThinkPads, TBH) won

  • by dwm ( 151474 ) on Monday December 21, 2009 @10:35AM (#30512474)

    The Atom N450 has been re-architected ...

    Wow -- I guess it was waaaaay too advanced to merely be "re-designed".

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Yes, the architecture changed: no more FSB, which also means no more alternative chipsets. The only chipset available for the new Atoms is Intel's one-chip NM10. The other changes are not really architectural, but they would not have been possible without the abandonment of the FSB architecture: the analog video output is limited to 1400x1050, and the LVDS port for the LCD only drives up to 1366x768. Intel would not have dared to cripple the chip so seriously if manufacturers could circumvent it by using a di

    • Re: (Score:3, Funny)

      They're just trying to be more precise. Doing so incentivizes brand awareness action-takers with post-current paradigms and forward-looking product models. A mere "re-design" would incorporate less-than-best-practice message exposure methodologies whereas a "re-architect" or architecture secondary optimization message distribution implies ground-up re-envisioning.

  • This would be a whole lot more interesting if Intel didn't have a pretty solid track record of producing some of the worst GPUs on the market. Perhaps the performance and power gains are more than I'm expecting, but from my perspective this seems like a pretty transparent move to cut Nvidia out of the netbook chipset market, and consequently cut down on consumer options on how they want to configure these types of machines as well.

  • by John Betonschaar ( 178617 ) on Monday December 21, 2009 @10:38AM (#30512520)

    If you ask me, it's still a slow piece of crap that would have no particular place in the market if it weren't for (consumer) Microsoft Windows being x86-only, and now it's even worse than the original Atom, since you get a crappy Intel GPU for free.

    In the low-power segment: you are still better off with an ARM chip if you don't need Windows (it consumes less power), another x86 SoC if you absolutely need Windows but don't need anything else (they also consume less power), or a VIA Nano if you are a consumer who likes Windows a lot but only does a little browsing and email (they are faster and comparable in terms of power consumption).

    In the HTPC/media-center segment: the Atom + Nvidia ION platform was great, a low-power/low-performance CPU with a GPU that does all the video decoding and OpenGL. Now you get an Intel GPU that is *still* not able to do fully GPU-accelerated video decoding. Better get yourself an old Atom, or, hopefully in the future, a VIA Nano + decent GPU.

    In the Netbook segment: with the performance of the original Atom being nothing but abysmal unless you only use Notepad, you really want a Celeron ULV anyway. It's a much better design, in a whole different performance class than the Atom, and you don't get any of the stupid restrictions Intel puts on using the Atom.

    In the embedded segment: you don't need x86 compatibility at all, so ARM would be your 1st choice.

    Maybe I'm missing something, but I really don't see the point of a crippled and slow x86 CPU with a design based on 10-year-old technology, forcibly coupled to an IGP that isn't able to do much more than render your desktop...

    • I agree 100%. Atom processors are a combination of stuff that I don't want. Too slow to do anything. So who cares about battery life.
      • I agree 100%. Atom processors are a combination of stuff that I don't want. Too slow to do anything. So who cares about battery life.

        A fast processor is useless if you haven't got the power to run it... The really nice thing about my Eee is that I can take it places: it's light enough to comfortably carry around, and it's got enough power that I can get several hours of use out of it... like 4-5 hours of actual usage, compared to the three or so I could get with my PowerBook. Doesn't sound like much, but in practice it's a big difference.

        It is too slow to do a fair number of things - for instance, Youtube and Hulu (i.e. Flash video) playbac

        • Yeah, but there is a middle ground between the Atom and the fastest Core 2 you can put into a 'laptop'. You can get low-power, fast processors and get 4-5 hours of battery easily.

          Look at the X200s by Lenovo.
          • Yeah, but there is a middle ground between the Atom and the fastest Core 2 you can put into a 'laptop'. You can get low-power, fast processors and get 4-5 hours of battery easily.

            Look at the X200s by Lenovo.

            With a 6-cell battery (which, I'm guessing, is what you need for actual 5 hours use as opposed to spec'd), it weighs about 50% more than my 901... Of course, that may be a result of other components' weight, such as the hard drive, rather than just the battery... (judging by the weight difference between the 6-cell and 9-cell versions of the X200, that's probably the case...)

            I guess I'd probably be inclined to agree that the Atom may not be the best point on the power consumption/processing capabilities c

    • Re: (Score:3, Insightful)

      by TheKidWho ( 705796 )

      They're cheap, that's the point behind them.

      Also, it seems like ION will still be usable, but in a slightly revised form for the Pinetrails.

      Don't exaggerate, the Atom isn't THAT bad.

      • The only way I can see Ion working is if it's treated like any other discrete GPU - attached via PCIe, and overriding the integrated graphics.

        That's not exactly cheap.

        • Re: (Score:3, Interesting)

          by b0bby ( 201198 )

          I have read that there's also the possibility of adding a Broadcom decoder chip to offload the work of video decoding, which might allow 1080p video while keeping power consumption low. That's what I'd like to see in my next netbook.

        • Yep, unless Intel gives Nvidia a DMI license, that's pretty much the only way.

    • by Joce640k ( 829181 ) on Monday December 21, 2009 @11:34AM (#30513274) Homepage

      Look again at the bit where it says "battery life"....

      In the real world outside Slashdot not everybody is hung up on their 3dMark scores. In fact very few people are, judging by the fact that Intel GPUs outsell both NVIDIA and ATI combined.

    • Re: (Score:3, Insightful)

      by LWATCDR ( 28044 )

      I think Intel is crippling it to keep from killing higher margin notebook sales.
      From AnandTech
      "The integrated GMA 3150 graphics hasn’t been used by Intel before, it’s a 45nm shrink of the GMA 3100. It’s technically a DX9 GPU running at 400MHz, however as you’ll soon see - you can’t really play any games on this platform. The GPU only offers hardware acceleration for MPEG-2 video, H.264 and VC-1 aren’t accelerated."

      No H.264 or VC-1 hardware support means poor performance.
      T

    • Re: (Score:3, Interesting)

      by Kjella ( 173770 )

      Sometimes I get the impression you're just trying to find fault; if it's so "abysmal unless you only use Notepad", why do you care about the "stupid restrictions"? The Atom is about two things, really: price and battery life. The Atom is a much smaller, much less hand-picked chip than any of Intel's very highly priced ULV editions. And sure, you can get better workhorses for your money, but not lower power than the N450, with a 5.5W TDP for CPU + memory controller + GPU and a sub-watt additional chipset.

      It's h

    • by tool462 ( 677306 )

      with a design based on 10-year old technology

      Wow, complaining about 10 year old tech? I'd hate to hear what you have to say about Unix!

  • So, I assume performance-wise this means going from the equivalent of a 700 MHz P3 to a 1 GHz P3.

    Sorry, but truth be told, the balance of performance and power consumption right now favors using the Pentium Dual Cores. The Atom is a niche product that works best with stuff like cash registers.

    • by Nadir ( 805 )

      I have built an Atom-based HTPC with a Zotac ION motherboard, and it does exactly what I want: fanless H.264 decoding. A bit more than a cash register can do.

  • That all sounds nice, but have they built a system that draws less power than a comparable Athlon 64 system?

    • Re:Power use? (Score:4, Informative)

      by bhtooefr ( 649901 ) <bhtooefr@bhtoo[ ].org ['efr' in gap]> on Monday December 21, 2009 @12:35PM (#30514106) Homepage Journal

      Even the original Atoms used less power than the most power-efficient single-core AMD platform.

      Platform TDP for the Yukon platform (RS690E northbridge, SB600 southbridge) ranges from 19 watts with a 1 GHz Sempron, to 26 for a 1.6 GHz Athlon. (29 for a dual-core 1.6 GHz Turion.) The most efficient Athlon-based Yukon is 1.2 GHz, and platform power consumption is 24 watts.

      Platform TDP for the typical N270+945GSE+ICH7M is 11.8 watts, N450+NM10 is 7 watts. Granted, the Yukon stuff doesn't really compete with the Atom, it competes with Intel CULV.

      CULV has a 14.5 watt chipset (GS45, ICH9M) TDP, add 5.5 watts for single-core, 10 watts for dual-core CPUs.

      Oh, and I'll toss the VIA Nano in, it fits somewhere between the Atom and the CULV and Yukon platforms in performance.

      The fastest current Nanos for netbooks are the U2225 and U2250, both at 1.3 GHz (the U2250 is at "1.3+ GHz") and 8 W TDP. (IIRC, though, the Nano is significantly faster than Atom.) The matching VX800U chipset has a 3.5 W TDP, so 11.5 W total platform TDP - less than the old Atom platform.

      The upcoming U3200 is at 1.4 GHz (and even faster than the clockspeed implies, apparently,) possibly 5 W TDP, and 2.3 W for the VX855, so 7.3 W platform TDP.

  • Who cares about the CPU? Gimme more pixels, preferably non-glossy.

    Have people still not figured out that the glossy screens are crap ... or does the magpie syndrome still dominate purchasing decisions?

    • Widescreen makes sense for form factor reasons, too, so don't expect 1024x768 any time soon. 1280x720 and 1366x768, that's slowly starting to appear.

      As for glossy screens, they're cheaper, and the margins are so slim on these things that I doubt you're going to see matte unless it's a "high-end" netbook (or just a straight-up CULV machine.)

  • So do we finally get Linux and Unix distros back in the netbooks instead of XP? Oh God do I hope so.
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Monday December 21, 2009 @12:12PM (#30513774)
    Comment removed based on user account deletion
