Intel Launches Next-Gen Atom N450 Processor
MojoKid writes "Intel has unveiled its next-generation Atom N450 processor, and a review of the new Asus Eee PC 1005PE netbook that houses it shows decent gains in performance and lower power consumption. The Atom N450 has been re-architected along the lines of Intel's other notebook processors, in that it now has an integrated memory controller and graphics core on the CPU itself. In addition, Intel's serial DMI (Direct Media Interface) now replaces the system bus to the Southbridge IO controller. From a performance standpoint, the single-core Atom N450 offers a nice gain versus previous-generation Atom CPUs, and it appears Intel has dual-core variants of the chip on the horizon as well."
So... (Score:4, Funny)
RTFA, please (Score:2, Informative)
Re:RTFA, please (Score:4, Informative)
Makes no sense: the 945G and variants had a GMA 950.
Re: (Score:3, Informative)
RTFA: "The graphics core is a basic DX9 instantiation that is a kin to Intel's GMA500 graphics core in the previous generation Intel 945G chipset"
I have a 945GM system whose graphics part is called the GMA 950. It uses the common open-source Intel drivers. By contrast, the GMA 500, aka Poulsbo, is the problematic one with closed drivers.
http://en.wikipedia.org/wiki/Intel_GMA [wikipedia.org]
Re: (Score:3, Informative)
That doesn't make any sense. The 945 chipset uses the GMA950; the GMA500 is actually a totally-outsourced PowerVR chip. The 'native' Intel chips (i810 through G45) are all totally supported by Intel's open-source drivers; the GMA500 is almost impossible to get working in Linux.
The new built-in N450, D410, and D510 graphics chips are based on the GMA3100; if I recall correctly, they're even called 'GMA3150'. That means they're supported by open-source drivers (and possibly by Mac OS X!), but the performance is bad eno
Intel and Linux (Score:5, Interesting)
Intel has been tearing apart their Linux graphics stack and rewriting it for the future. For a while, that meant poor performance during the rewrite, but it really is getting better. Intel is really helping push DRI2, GEM, TTM, UXA, etc.
At least Intel does their development in the open. Didn't Intel also contribute code to Moblin to optimize Moblin performance on their hardware? I'd like to see some more general kernel enhancements for these processors. Any speed increase over Windows on the most common netbook processor is a huge win.
Chrome OS is already fast. If Intel can help make it faster when compared side-by-side with Windows 7, it only helps Linux adoption on the whole.
I also have a small tangential question. I always hear about huge performance gains that can come from properly writing code to take advantage of the SSE2/3/4, etc. instruction sets. I also hear that almost no one actually writes code to take advantage of these instruction sets. If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?
Re: (Score:3, Informative)
I also have a small tangential question. I always hear about huge performance gains that can come from properly writing code to take advantage of the SSE2/3/4, etc. instruction sets. I also hear that almost no one actually writes code to take advantage of these instruction sets. If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?
The kernel doesn't do much CPU-bound processing. It is math and media libraries where these vector instructions would actually be useful. You can already get some of their benefits by using a decent compiler. Basically, that means different binaries for processors with different capabilities, so your average binary distro is not going to have any fancy instructions. I suggest trying Gentoo if you actually want to use your modern CPU.
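For illustration, here's the kind of loop a compiler can auto-vectorize for you if you tell it which CPU to target (a minimal sketch of my own, not from any real project; the function name and build flags are just examples, check your compiler's docs):

    /* saxpy.c - build with something like: gcc -O3 -march=native -c saxpy.c */
    #include <stddef.h>

    void saxpy(float a, const float *x, float *y, size_t n)
    {
        /* No loop-carried dependency, so with an appropriate -march/-msse
           flag the compiler is free to emit SSE code that processes
           four floats per instruction instead of one. */
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

The catch is exactly the point above: a binary built with -march=native only runs on CPUs that have those instructions, which is why generic distro packages stick to the lowest common denominator.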
Re: (Score:2)
Wouldn't it be fair to assume anyone running a 64-bit distro has a processor capable of SSE4 instructions? Write the code to take advantage of these instruction sets, but only enable them in your 64-bit binaries then.
I'm no low-level programmer, but I assume IO and CPU scheduling are math-intensive enough. If SSE instructions really boost video encoding, what about encryption algorithms, or file systems?
Re: (Score:3, Informative)
The various SSE instruction sets provide SIMD instructions, which is an acronym for "single instruction, multiple data". As the name suggests, they allow you to perform operations on multiple pieces of data with a single instruction. SIMD is great for media applications, where you often have to do the same mathematical operations over and over again on lots of data at once. However, pretty much all of the stuff that happens in a kernel is logic-heavy work that only deals with single pieces of data at a time,
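To make "multiple data per instruction" concrete, here's a rough sketch in C using SSE intrinsics (a toy example of mine, not from any real codebase):

    #include <stddef.h>
    #include <xmmintrin.h>   /* SSE intrinsics */

    /* Add two float arrays four elements at a time.
       Assumes n is a multiple of 4 and 16-byte-aligned pointers;
       real code would also handle the tail and misalignment. */
    void add_floats_sse(const float *a, const float *b, float *out, size_t n)
    {
        for (size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(&a[i]);             /* load 4 floats */
            __m128 vb = _mm_load_ps(&b[i]);
            _mm_store_ps(&out[i], _mm_add_ps(va, vb));  /* one add, 4 results */
        }
    }

That pattern fits audio mixing, pixel blending, video transforms and so on; it does nothing for the pointer-chasing and branching a kernel mostly does.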
Re: (Score:2)
And various cryptographic things (and somewhat relatedly, checksums) can take advantage of SSE stuff to various extents. And probably other little things in rare situations. Thus why I said "pretty much all" instead of simply "all".
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Intel has been tearing apart their Linux graphics stack and rewriting it for the future. For a while, that meant poor performance during the rewrite, but it really is getting better.
As the original poster points out, none of this applies to the GMA500, which is supported by a different driver--a proprietary binary driver, and not a very well-maintained one at that, if reports are true.
Re:Intel and Linux (Score:4, Informative)
Your post completely missed the original poster's point - the Intel GMA500 is a major outlier in terms of Linux support.
The GMA950 series is well supported by Linux (with the exception of the re-architecture issues that hurt Ubuntu 9.04 so badly).
The GMA500 is only minimally supported in Linux, and all indications are that it will stay this way. The GMA500 graphics core was outsourced to another company, as was driver development.
As to SSE2/3/4 - they only benefit certain types of operations. Most kernel ops won't benefit, and using SSE usually means hand-coding in assembler - compilers that generate good vector SIMD code are rare. The kernel developers tend to prefer to avoid hand-coded ASM whenever possible.
However, I do recall that the RAID checksumming code and memcpy() were once implemented using MMX to speed them up, so those sections might benefit from SSE (and might already do so.)
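As a rough sketch of why (this is just the idea, not the kernel's actual xor_blocks code), RAID-5 parity is one big XOR, which maps straight onto SSE2's 128-bit integer ops:

    #include <stddef.h>
    #include <emmintrin.h>   /* SSE2 intrinsics */

    /* XOR src into dst 16 bytes at a time - the core of RAID-5 parity.
       Assumes len is a multiple of 16; the kernel's real routines also
       unroll, prefetch, and save/restore the FPU/SSE state around this. */
    void xor_blocks_sse2(unsigned char *dst, const unsigned char *src, size_t len)
    {
        for (size_t i = 0; i < len; i += 16) {
            __m128i s = _mm_loadu_si128((const __m128i *)&src[i]);
            __m128i d = _mm_loadu_si128((const __m128i *)&dst[i]);
            _mm_storeu_si128((__m128i *)&dst[i], _mm_xor_si128(s, d));
        }
    }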
Re: (Score:2)
If Intel really wants to push their hardware, why not write such optimizations for the Linux kernel?
Well, the point has been made already: that stuff doesn't happen in the kernel [slashdot.org]. Here's the followup: if there are optimizations to be done, often they can be done by the compiler. Intel does of course have a snazzy compiler which produces (on average) better-performing executables than gcc does. On the other hand, gcc's focus tends to be x86 and now x86_64, so it's not bad either. In the other cases, they belong in an external library; libraries involving sound, graphics, and video are likely candidates for i
Re: (Score:3, Informative)
It is, supposedly, X3150, so basically the same part that's in G31. 3100/X3100? Anyway, seems it's "proper" Intel GMA, with good Linux support.
PS. (Score:2)
Wiki says it's 3150; from the in-house Intel line: http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100 [wikipedia.org]
And we all know wiki doesn't lie...
Re: (Score:3, Interesting)
Midnight Blue? (Score:5, Funny)
What Midnight Blue? Oh, you mean underneath all those stickers? Seriously, why do non-Apple laptops always look like Nascar, erm, cars?
Re: (Score:3, Interesting)
Re:Midnight Blue? (Score:5, Insightful)
There are probably more, but that's off the top of my head.
Re: (Score:3, Interesting)
I just bought one of the new HP Envy laptops and was pleasantly surprised at the lack of stickers. It's just an HP logo on the back, similar to Apple. In fact, the entire thing pretty much was just ripped off from Apple - keyboard design, body construction, multi-touch mousepad, you name it. Even the packaging was slick and minimalist, just like Apple's. (Pricier than a PC, but way more bang for your buck than a similarly priced MacBook Pro). And no, not a Windows-certified sticker in sight - oh snap, m
Re: (Score:2, Interesting)
Still, I applaud the rip-off. It shows, at the very least, that they understand how ugly the rest of their lineup is.
The guy who said "NASCAR" was right on the money. No other term quite embodies the black-hole-of-suck that is PC laptop design.
Re: (Score:3, Funny)
* PC customers are capable of removing the stickers.
Re: (Score:2)
Re: (Score:2)
Stickers can always be removed... what's really frustrating is that many otherwise fine laptops come in a glossy finish.
That might look good on equipment that sits on a shelf in your house... or in the shop. But it's terrible for something that's meant to be routinely touched by hands and kept in a typical bag with other stuff.
Guess it just shows that such manufacturers care more about how it looks in the shop...
Re: (Score:2)
ASUS isn't too bad in this regard, as a lot of their other Eees are done in a matte finish.
Re:Midnight Blue? (Score:5, Informative)
Once you have removed the stickers, you are often left with difficult-to-remove adhesive gunk on the laptop. An easy way of removing the gunk without damaging or scratching the surface is to spray a little silicone-based lubricant on the area and wipe with a paper towel. It quickly wipes off, and the silicone lubricant won't damage plastic the way petroleum-based lubricants (like WD-40) sometimes do.
Re: (Score:3, Informative)
I prefer orange-oil-based cleaners. They are often marketed as label or gum removers.
Not only do they smell good, they also don't damage plastics. Oh, and they're also a great insecticide and will keep ants away, because all insects hate the smell - after all, the oil is the orange's natural defense.
Re: (Score:2)
Because they want the sales?
I used to work in bicycle shops doing repair and sales. We *never* sold WD-40, and always recommended against its use (at least as a chain oil). It was not very good for that. Technically, it may be a lubricant, but it is not a very good one. It was designed to displace water (WD = Water Displacement). If you want something like a penetrating oil, something to drive out water, or something to clean with, WD-40 is probably very good at those. Light lubrication? Maybe. I wouldn't, myself.
Re:Midnight Blue? (Score:4, Insightful)
Because unlike pretentious Apple fanboys, most people care more about a computer being cost-effective and able to do what is needed. It's the reason why PCs, and not Macs, own most of the market.
Why does cost-effective, capable hardware imply a need for a billion stickers on the casing?
Re: (Score:2)
Because they probably get a discount for some of the stickers. (That's certainly why they put bloatware on there - the bloat developers pay the computer manufacturers to bundle it.)
Re: (Score:2)
Because they probably get a discount for some of the stickers.
Well, sure, but go up three posts in this thread and it looks more like AC is arguing that the stickers make it go faster... :) The stickers, by themselves, do not make the machine better. I think there's a fair case that they make the machine worse, at least until they are removed.
Still chokes on flash? (Score:2)
Intel and Adobe have both completely dropped the ball, but right now it's Intel that's in trouble. The only "netbook" I know of that can handle fullscreen Flash is the LT3013u; at 12" and $350 it hits the price point okay but misses on size. Still, it at least has a 720p display, which means it has to do more than most of the competition just to break even — and it does better than that.
Re:Still chokes on flash? (Score:4, Informative)
If you think Flash sucks on Windows then obviously you've never seen it run on Mac OS X. Adobe is a complete disgrace on that OS.
Re:Still chokes on flash? (Score:4)
If you think Flash sucks on Windows then obviously you've never seen it run on Mac OS X. Adobe is a complete disgrace on that OS.
That's okay, I can experience how much it blows on Linux. Using the 32 bit flash for Linux in a 32 bit firefox or in 64 bit firefox with a little help, on my Athlon 64 X2 4000+, was about like using it on my Acer Aspire D250 (1.6GHz Atom, old type.) Using the 64 bit flash on that machine was more like using it on a 1.4 GHz Thunderbird or something. Now I have a Phenom II 720 and I can just barely watch fullscreen flash video, and flash games perform worse than a Core Duo T2600 with Windows XP. Adobe hates Linux as much as they hate Mac OS.
Re:Still chokes on flash? (Score:4, Funny)
Doesn't Firefox need more than 4GB these days?? :D
Re: (Score:2)
The latest versions aren't too bad. I've never seen more than 1GB even with many windows and tabs open.
On 3.1 I have seen over 2GB.
Re: (Score:2)
Why would anyone run a browser in 64bit mode?
Why not?
I don't have a good reason, personally, for my decision to run 64-bit versions of any software I use, if it's available. I made the switch to the AMD64 platform rather late (late 2008) - by which time a lot of the problems had already been solved. I've never had to run the 32-bit Flash plugin on my 64-bit processor, for instance.
I don't know if there's any practical benefit to running a 64-bit build of the Browser... Running a 32-bit build on a 64-bit kernel would get me 4GiB of virtual memory sp
Re: (Score:2)
Why would anyone run a browser in 64bit mode?
It's an issue of speed. I want Flash to run as fast as possible, because it is a dog. So I want 64-bit Flash. Might as well have a 64-bit browser to go with it. 64-bit Flash on Linux (beta only) is indeed substantially faster than 32-bit, at least on the two systems I've compared it on (Athlon 64 X2 4000+ and Phenom II 720; other details available on request.)
Flash is so 1990's (Score:2)
Re: (Score:2)
Why run flash at all? Try patronizing sites that support better video technology and maybe even open standards.
When they have the content I want, I will. In the meantime, I need Flash video if I want to watch what I want to watch.
Re:Still chokes on flash? (Score:5, Informative)
If you think Flash sucks on Mac OS X then obviously you've never seen it run in Linux. Adobe is a complete disgrace on that OS.
Re: (Score:3, Informative)
On the other hand, the Flash Player for Linux is the only x64 Flash player out there.
Re: (Score:2)
Re:Still chokes on flash? (Score:5, Insightful)
Have you ever even considered that the problem isn't the hardware, but the [lousy, crappy pile of rancid sheep dip] software known as "Flash"?
Re: (Score:2)
I don't think anyone in this thread has argued that Flash is not a gigantic piece of crap. On the other hand, it's an absolute necessity for using many websites. If I want what they've got, I need Flash. I don't use Flash on my website, if that makes you feel any better.
Re: (Score:2)
Re: (Score:2)
I'm definitely not buying any more single-core Atoms. I got an Acer Aspire D250 and it's something of a dog. I may be reselling it to someone to whom that won't matter, though. Then I got the LT3013u, which has an ATI GPU and a 1.2GHz Athlon 64. No power saving in the mainline Linux kernel yet, but it's coming... so right now it's running flat-out. It's still within reasonable norms for temperature, though, and still gets about 3h45m on the battery, which is enough for my current purposes. I'm running Karmic on it, and
Finally proper platform (Score:3, Interesting)
Now only a few other pieces of the puzzle remain in the quest for the ultimate ultraportable.
Pixel Qi screen, for even longer battery life and legibility in sunlight.
With the lower temps & power draw of Pinetrail, it might also be possible for netbooks to become routinely cooled passively.
Also, just for me and the other faithful... uhm... a clit mouse ;p (plus preferably as close in overall form to the original Lenovo S10 as possible; it was actually very nice). Can't help it: playing Diablo 2 in a cathedral during an organ concert, in a cemetery on the night of November 1st (it looks like this here: http://commons.wikimedia.org/wiki/File:Wszystkich_swietych_cmentarz.jpg [wikimedia.org] ), and on a train while sitting next to some nuns are things I simply must do. And with a touchpad that's not really possible.
Re: (Score:3, Interesting)
Clits have been deprecated because they wear out. They just can't take any abuse whatsoever, and you're always having to buy replacement covers for them. The glidepad, on the other hand, is only hard on your fingerprint, and those are a liability anyway. :)
I've actually done a bit of point-and-click gaming with a glidepad; it's not too bad. An FPS, on the other hand, is basically a gigantic fail. If not a mouse, I need a trackball [logitech.com] for that. I had the original Marble, whose ergonomics better suited my bear paw
Re: (Score:3, Funny)
Clits have been deprecated because they wear out. They just can't take any abuse whatsoever...
Just because your girlfriend isn't into S&M.
Re: (Score:2)
"Clits have been deprecated because they wear out. They just can't take any abuse whatosever and you're always having to buy replacement covers for them. The glidepad, on the other hand, is only hard on your fingerprint, and those are a liability anyway. :)"
Bullshit, I've used quite a few decade-old Thinkpads, and not a single one had problems with the trackpoint.
I can understand preferring a trackpad, but a decent trackpoint/nipple/clit (I actually haven't seen any usable ones except on Thinkpads, TBH) won
Re: (Score:2)
Because of the improved form factor I'd love to see a netbook without a trackpad.
Re-Architecting English (Score:4, Funny)
The Atom N450 has been re-architected ...
Wow -- I guess it was waaaaay too advanced to merely be "re-designed".
Re: (Score:2, Informative)
Yes, the architecture changed: no more FSB, which also means no more alternative chipsets. The only chipset available for the new Atoms is Intel's one-chip NM10. The other changes are not really architectural, but they would not have been possible without abandoning the FSB architecture: the analog video output is limited to 1440x1050, and the LVDS port for the LCD only drives up to 1366x768. Intel would not have dared to cripple the chip so seriously if manufacturers could circumvent it by using a di
Re: (Score:2, Funny)
Re: (Score:3, Funny)
They're just trying to be more precise. Doing so incentivizes brand awareness action-takers with post-current paradigms and forward-looking product models. A mere "re-design" would incorporate less-than-best-practice message exposure methodologies whereas a "re-architect" or architecture secondary optimization message distribution implies ground-up re-envisioning.
Re: (Score:2)
You must have used this [erikandanna.com].
Not impressed (Score:2)
This would be a whole lot more interesting if Intel didn't have a pretty solid track record of producing some of the worst GPUs on the market. Perhaps the performance and power gains are more than I'm expecting, but from my perspective this seems like a pretty transparent move to cut Nvidia out of the netbook chipset market, and consequently to cut down on consumers' options for how they want to configure these types of machines as well.
Who actually needs this? (Score:5, Insightful)
If you ask me: it's still a slow piece of crap that would have no particular place in the market if it weren't for (consumer) Microsoft Windows being x86-only, and now it's even worse than the original Atom since you get a crappy Intel GPU for free.
In the low-power segment: you are still better off with an ARM chip if you don't need Windows (it consumes less power), another x86 SoC if you absolutely need Windows but don't need anything else (these also consume less power), or a Via Nano if you are a consumer who likes Windows a lot but only does a little browsing and email (they are faster and comparable in terms of power consumption).
In the HTPC/media center segment: the Atom + Nvidia ION platform was great - a low-power/low-performance CPU with a GPU that does all the video decoding and OpenGL. Now you get an Intel GPU that is *still* not able to accelerate the full video decoding pipeline. Better get yourself an old Atom, or hopefully in the future a Via Nano + decent GPU.
In the netbook segment: with the performance of the original Atom being nothing but abysmal unless you only use Notepad, you really want a Celeron ULV anyway. It's a much better design, in a whole different performance class than the Atom, and you don't get any of the stupid restrictions Intel puts on using the Atom.
In the embedded segment: you don't need x86 compatibility at all, so ARM would be your first choice.
Maybe I'm missing something, but I really don't see the point of a crippled and slow x86 CPU with a design based on 10-year-old technology, forcibly coupled to an IGP that isn't able to do much more than render your desktop...
Re: (Score:2)
Re: (Score:2)
I agree 100%. Atom processors are a combination of stuff that I don't want. Too slow to do anything. So who cares about battery life.
A fast processor is useless if you haven't got power to run it... The really nice thing about my EEE is I can take it places - it's light enough to comfortably carry it around, and it's got enough power that I can get several hours of use out of it... Like 4-5 hours of actual usage, compared to the three or so I could get with my Powerbook - doesn't sound like much but in practice it's a big difference.
It is too slow to do a fair number of things - for instance, Youtube and Hulu (i.e. Flash video) playbac
Re: (Score:2)
Look at the X200S by lenovo.
Re: (Score:2)
Yeah, but there is a middle ground between the Atom and the fastest Core 2 you can put into a 'laptop'. You can get fast, low-power processors and get 4-5 hours of battery easily.
Look at the X200S by lenovo.
With a 6-cell battery (which, I'm guessing, is what you need for actual 5 hours use as opposed to spec'd), it weighs about 50% more than my 901... Of course, that may be a result of other components' weight, such as the hard drive, rather than just the battery... (judging by the weight difference between the 6-cell and 9-cell versions of the X200, that's probably the case...)
I guess I'd probably be inclined to agree that the Atom may not be the best point on the power consumption/processing capabilities c
Re: (Score:3, Insightful)
They're cheap, that's the point behind them.
Also, it seems like ION will still be usable, but in a slightly revised form for the Pinetrails.
Don't exaggerate, the Atom isn't THAT bad.
Re: (Score:2)
The only way I can see Ion working is if it's treated like any other discrete GPU - attached via PCIe, and overriding the integrated graphics.
That's not exactly cheap.
Re: (Score:3, Interesting)
I have read that there's also the possibility of adding a Broadcom decoder chip to offload the work of video decoding, which might allow 1080p video while keeping power consumption low. That's what I'd like to see in my next netbook.
Re: (Score:2)
Yep, unless Intel gives Nvidia a DMI license, that's pretty much the only way.
Did you bother reading the article? (Score:4, Insightful)
Look again at the bit where it says "battery life"....
In the real world outside Slashdot not everybody is hung up on their 3dMark scores. In fact very few people are, judging by the fact that Intel GPUs outsell both NVIDIA and ATI combined.
Re: (Score:3, Insightful)
I think Intel is crippling it to keep from killing higher margin notebook sales.
From AnandTech
"The integrated GMA 3150 graphics hasn’t been used by Intel before, it’s a 45nm shrink of the GMA 3100. It’s technically a DX9 GPU running at 400MHz, however as you’ll soon see - you can’t really play any games on this platform. The GPU only offers hardware acceleration for MPEG-2 video, H.264 and VC-1 aren’t accelerated."
No H.264 or VC-1 hardware support means poor performance.
T
Re: (Score:3, Interesting)
Sometimes I get the impression you're just trying to find fault. If it's so "abysmal unless you only use Notepad", why do you care about the "stupid restrictions"? The Atom is really about two things: price and battery life. The Atom is a much smaller, much less hand-picked chip than any of Intel's very highly priced ULV editions. And sure, you can get better workhorses for your money, but nothing lower-power than the N450, which has a 5.5W TDP for CPU + memory controller + GPU, with a sub-watt additional chipset.
It's h
Re: (Score:2)
with a design based on 10-year old technology
Wow, complaining about 10 year old tech? I'd hate to hear what you have to say about Unix!
"decent gains"? (Score:2, Troll)
So, I assume performance-wise this means going from the equivalent of a 700 MHz P3 to a 1 GHz P3.
Sorry, but truth be told, the balance of performance and power consumption right now favors the Pentium Dual-Cores. The Atom is a niche product that works best in stuff like cash registers.
Re: (Score:2)
I have built an Atom-based HTPC with a Zotac ION motherboard, and it does exactly what I want: fanless H.264 decoding. A bit more than a cash register can do.
Power use? (Score:2)
That all sounds nice, but have they built a system that draws less power than a comparable Athlon 64 system?
Re:Power use? (Score:4, Informative)
Even the original Atoms used less power than the most power-efficient single-core AMD platform.
Platform TDP for the Yukon platform (RS690E northbridge, SB600 southbridge) ranges from 19 watts with a 1 GHz Sempron, to 26 for a 1.6 GHz Athlon. (29 for a dual-core 1.6 GHz Turion.) The most efficient Athlon-based Yukon is 1.2 GHz, and platform power consumption is 24 watts.
Platform TDP for the typical N270+945GSE+ICH7M is 11.8 watts; N450+NM10 is 7 watts. Granted, the Yukon stuff doesn't really compete with the Atom; it competes with Intel CULV.
CULV has a 14.5 watt chipset (GS45, ICH9M) TDP, add 5.5 watts for single-core, 10 watts for dual-core CPUs.
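Doing the arithmetic on those figures (assuming they're right): 14.5 W of chipset plus a 5.5 W CPU is roughly 20 W of platform TDP for single-core CULV, or about 24.5 W for dual-core, versus 7 W for the whole N450+NM10 platform.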
Oh, and I'll toss the VIA Nano in, it fits somewhere between the Atom and the CULV and Yukon platforms in performance.
The fastest current Nanos for netbooks are the U2225 and U2250, both at 1.3 GHz (the U2250 is at "1.3+ GHz") and 8 W TDP. (IIRC, though, the Nano is significantly faster than Atom.) The matching VX800U chipset has a 3.5 W TDP, so 11.5 W total platform TDP - less than the old Atom platform.
The upcoming U3200 is at 1.4 GHz (and even faster than the clockspeed implies, apparently,) possibly 5 W TDP, and 2.3 W for the VX855, so 7.3 W platform TDP.
I'm still waiting for a 1024x768 screen (Score:2)
Who cares about the CPU? Gimme more pixels, preferably non-glossy.
Have people still not figured out that the glossy screens are crap ... or does the magpie syndrome still dominate purchasing decisions?
Re: (Score:2)
Widescreen makes sense for form factor reasons, too, so don't expect 1024x768 any time soon. 1280x720 and 1366x768, that's slowly starting to appear.
As for glossy screens, they're cheaper, and the margins are so slim on these things that I doubt you're going to see matte unless it's a "high-end" netbook (or just a straight-up CULV machine.)
Linux Back in the Netbooks? (Score:2, Interesting)
Comment removed (Score:3, Insightful)
Re: (Score:2)
Microsoft seems to have been quietly killing them off, since unlike the Atom ones they can't run Windows. Some of the usual suspects had prototypes which were rapidly yanked.
Windows CE netbooks exist (Score:2)
Re: (Score:2)
At the moment, available ARM processors are still behind the Atom in performance by a fairly large margin, and ahead in power consumption by a similar margin. The current top-of-the-line ARM chip is the Cortex-A8 used in the Beagle Board and Gumstix systems-on-a-chip. When dual-core and quad-core ARM Cortex-A9 processors become available, that might change.
We are currently in the "roll your own" stage of development for ARM machines.
Buy a Beagle Board or a Gumstix, attach it to an LCD, mini keyboard and bat
Re: (Score:2)
Correction: the top of the line is currently the Snapdragon.
Re: (Score:2)
I see it stated by Wikipedia: "The Snapdragon application processor core is called Scorpion and is similar to the ARM Cortex-A8 core."
However, I also see that Snapdragon is powering Windows Mobile phones. This means that it cannot be an ARM chip, as Windows Mobile will not run on ARM chips.
Re: (Score:2)
Windows *Mobile* runs on ARM CPUs.
Re: (Score:2)
On further investigation, it looks like WinMo can run on ARM. But I still can't find any official word that Snapdragon is ARM.
Re: (Score:3, Insightful)
http://en.wikipedia.org/wiki/Snapdragon_(processor) [wikipedia.org]
First line:
"Snapdragon is a name of an architecture of a family of chipsets with an ARM-based CPU."
Re: (Score:2)
No, I'm just not taking Wikipedia as truth. It isn't a valid source for research, and nowhere does it state what ARM architecture it uses. Qualcomm's Snapdragon page does not state that Snapdragon is an ARM CPU. I have found a grand total of one blog article, in reference to an Asus Snapdragon Eee PC that was only shown for one day, that even mentions ARM in association with Snapdragon, and that article is also filled with innuendo that Microsoft forced Asus to withdraw the netbook.
If you have some real referenc
Re:meanwhile, where are the ARMs? (Score:4, Informative)
http://www.arm.com/markets/embedded_solutions/armpp/25333.html [arm.com]
Doesn't take much looking.
Re: (Score:2)
Accepted, I also found this while looking:
http://www.intomobile.com/2009/10/23/corrections-arm-cortex-a8-qualcomm-snapdragon-and-marvell-armada-oh-my.html [intomobile.com]
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Buy a beagle board or a gumstix, attach it to an lcd, mini keyboard and battery, now install one of the handful of linux operating systems available for it and you have an arm netbook.
Right... that'll go over big with the general public, so I'm sure I'll see that kit available at Newegg and Best Buy any day now.
Wait, that's your complaint? Not that building and setting up your own machine would be a pain in the ass, but just that the kit wouldn't be popular with the general population, and wouldn't be available on Newegg?
Point is, there was headline after headline proclaiming that 2009 was going to be the year of the ARM netbook, and by 2012 that 20 or 30% of the entire netbook market would be ARM based. That simply isn't going to happen if the answer is "buy your own components, get yourself a CNC milling machine and design a case for them, and fashion your own netbook".
People are always quick to blame MS and Intel, but the problem is more that their competitors keep dropping the ball.
Well, really, building your own ARM netbook isn't the answer to ARM netbooks being "the next big thing". It does sound like a fun project, actually (I think I'd start with an old EEE case or something) but, yeah, I really don't think a build-your-own-ARM-netbook would make for a successful popul
Re: (Score:2)
IA-32 addiction (Score:2)
Binary compatibility is a non-issue if you're free of Windows.
I am, and it isn't...
I mean, in theory, binary compatibility isn't an issue for me. In practice, when I've tried it, there was always some nice bit of software that was partially coded in IA-32 assembly, or that had platform-specific optimizations - or, like I said before, non-libre software like drivers or whatever for a piece of hardware in my system... Or maybe someone has a nice, closed-source app for Linux and they only build for Intel. Apart from things like critical drivers (video, audio, network)
Re: (Score:2)
You could say the same thing about the home computer before 1980; the Acorn Atom and Sinclair ZX80 were only available in kit form. And before that, much more self-fabrication was required.
Right now, you can:
A: build it yourself with hobby parts.
B: wait till someone makes a big enough investment to get mass production off the ground.
C: try to scrounge up enough capital to get it going yourself. These guys are doing just that: http://www.alwaysinnovating.com/home/index.htm [alwaysinnovating.com]
Re: (Score:3, Funny)
If you buy this one there's another one waiting in the wings to piss you off after you buy it.
Re: (Score:2)
You want as much screen real estate as you can get. These tiny "LCD watch" resolution screens suck for any real-world work. Sure, a netbook can be handy for travel, but for serious tasks like PCB design, you want pixels, and more than a thimbleful. I do PCB design with Protel (Altium, or whatever Protel turned into), and I do it at 1920x1200, and I would love twice that. You don't want to stare at the world through a toilet paper tube.
Sheldon