Moore's Law Is Becoming Irrelevant, Says ARM's Boss
holy_calamity writes "PCs will inevitably shift over to ARM-based chips because efficiency now matters more than gains in raw performance, the CEO of chip designer ARM tells MIT Technology Review. He also says the growing ecosystem of ARM-based chip suppliers is good for innovation (and for prices) because it spurs a competitive environment. 'There's been a lot more innovation in the world of mobile phones over the last 15-20 years than there has been in the world of PCs.'"
Duh (Score:5, Insightful)
CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.
Re: (Score:2)
CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.
OR, it's the other way round: the people currently at the helm thought around 1990 that the future would be in efficient CPUs and so they formed an efficient CPU company. Then, they just kept their point of view.
Re: (Score:3, Insightful)
Pretty sure they will be, since they are now and have been since ... forever?
Re: (Score:2, Informative)
Except Intel CPUs have been becoming far more power efficient over the last few years too. I recently replaced my old Pentium-4 Windows PC with a new i7-based PC. The P4 has one core, runs two threads, is rated at around 130W and, when playing games, the system sounds like a jet engine. The i7 has four cores, runs eight threads and is rated at 77W. When playing games I can hardly hear it under my desk and the air coming out the back is barely warm.
Re: (Score:2)
You do realize you're still an order of magnitude off in power usage, right?
Re:Duh (Score:5, Informative)
According to this [phoronix.com] old benchmark by Phoronix (which was even linked by Slashdot), the i7 is more power-efficient than the ARM Cortex A9 in PandaBoard. The i7 got 85 Mop/s per Watt, while the ARM managed only 38.
The advantage of low-power processors like ARM's is low power consumption when idle, which admittedly is where most computers (and tablets, phones, etc) spend most of their time.
Re: (Score:2)
Well, the point of that benchmark was for servers. The i7 is not a mobile phone chip. ARM is trying to increase its performance to compete in the server arena; Intel is trying to drop its energy usage to compete in mobile. The ARM system in that benchmark is kind of a dev board that's closer to a mobile device than the future ARM servers manufacturers are planning. So it's kind of not a fair comparison. It would be like throwing the i7 into a 7 inch "android phone" with an LCD as monitor and a mobile phone
Re: (Score:2)
A lot of the noise reduction has come from better cooling devices. Heatpipe coolers and sealed-unit watercoolers are much quieter than simple blocks of finned metal with fans strapped on, and even the design of simple blocks of metal with fans strapped on has improved over the years.
But yes, PC processors have improved considerably on both work done per watt and, perhaps more importantly, on idle power consumption.
PS3-class graphics (Score:3)
Re:Duh (Score:4, Interesting)
Re: (Score:3)
wrong.
If you compare performance per watt, they are about even. The only thing to be said is that Intel owns the high end, and ARM owns the low end. Intel hasn't (yet) produced a low-perf/low-wattage chip to rival ARM, and ARM hasn't produced a high-perf chip to rival Intel.
Let me know when ARM can make a processor that can power a modern laptop or desktop and beat the power consumption of Intel. They aren't anywhere close.
Re: (Score:2)
"That design..."
I think we both understood the actual meaning of GP.
Re: (Score:2)
I guess that's why all the low-power Pentiums with two cores and no hyperthreading have about ten or twenty reviews on Newegg, and all the Core i7s that score 5x higher on Passmark and use 3x as much power, while costing several times more, have hundreds of reviews.
More efficient processor (Pentium G630), 18 reviews: http://www.newegg.com/Product/Product.aspx?Item=N82E16819116406 [newegg.com]
Less efficient processor (Core i7-3770K), 357 reviews: http://www.newegg.com/Product/Product.aspx?Item=N82E16819116501 [newegg.com]
On Amazon it's the same pattern, with 47 reviews for the bitchin' fast processor and 7 for the futuristic low-power one.
I'm also noticing that the difference between the best GPU/CPU and the second-best model is a margin of 30-40% on a good day, just like it has been for the last decade.
You keep using that word...
A processor which does 5x as much and uses 3x as much power is more efficient.
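To put numbers on that (a quick back-of-the-envelope sketch in Python; the 5x/3x figures are the ones from the post above):

    # Efficiency = work done per watt. 5x the work at 3x the power:
    faster_chip = 5.0 / 3.0   # ~1.67 units of work per watt
    slower_chip = 1.0 / 1.0   # 1.0 unit of work per watt
    print(faster_chip > slower_chip)   # True: the power-hungry i7 wins on efficiency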
Not built for speed?!? (Score:5, Interesting)
But every newer version of an operating system has more bloat than ever. There must be some corollary to Moore's Law which states that successive operating systems will require ever higher performance, but users will now become accustomed to slower response times.
We could call it the Blort Law.
Re:Not built for speed?!? (Score:5, Informative)
Wirth's Law [techopedia.com]:
Software is getting slower more rapidly than hardware is getting faster.
Re: (Score:3)
Tell me about it. I have a nominally-1.5GHz quadcore Android phone that, when running Graffiti, can barely distinguish a "G" drawn like a "6" on the letter side from the letter "O" with better than 90% accuracy unless I use SetCPU to lock it to full speed (with devastating impact upon battery life) whenever the screen is on, yet somehow... SOMEHOW... a slow, lowly 16MHz Dragonball m68k could do the same thing with nearly perfect, flawless accuracy. The biggest single reason, as far as I can
Re:Not built for speed?!? (Score:5, Insightful)
Re: (Score:3)
Re: (Score:3)
Have you ever looked at your WinSxS folder? It's huge.
AFAIK much of the stuff in WinSxS is just links to other files. Therefore the net size might be much smaller than what Explorer shows.
Re: (Score:2)
Re: (Score:3, Insightful)
" Securing your code (making it not fail under the weight of random exploits) doesn't slow things down."
Of course it does. Checks take resources.
"Adding in additional complexity, holes, and latency to your software stack with DRM definitely slows things down."
also true
Re: (Score:2)
You don't always need to "check" things to make them more secure. Things like address randomization, making code read only, etc. Yes, there are ALSO checks (e.g. buffer checks).
Re: (Score:3)
Securing your code (making it not fail under the weight of random exploits) doesn't slow things down.
Code before security:
"if this, do that"
Code after security:
"if this, and this, and this, and this, and this, and this ... and this, and this, and this ... and this, and THIS, THEN do that"
Are you sure they both will run at the same speed?
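Here's a toy way to measure that kind of overhead yourself, in Python (the packet format and checks below are invented for illustration, not from any real codebase):

    import timeit

    def handle_unchecked(packet):
        return packet[4]                       # "if this, do that"

    def handle_checked(packet):
        # "and this, and this, and this...": the added security checks
        if not isinstance(packet, bytes):
            raise TypeError("packet must be bytes")
        if len(packet) < 5:
            raise ValueError("packet too short")
        if packet[0] != 0x7E:
            raise ValueError("bad magic byte")
        return packet[4]

    pkt = bytes([0x7E, 0, 0, 0, 42])
    print(timeit.timeit(lambda: handle_unchecked(pkt), number=100_000))
    print(timeit.timeit(lambda: handle_checked(pkt), number=100_000))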
Gate's Law (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But every newer version of an operating system has more bloat than ever.
Which is the core of the problem. We did so much more with less not all that long ago.
Re: (Score:3)
Is it true with Windows, even (anymore, at least)? Windows 7 is faster than Vista, and 8 is equal to or faster than Windows 7 in most cases. See: http://www.techspot.com/review/561-windows8-vs-windows7/page2.html [techspot.com]
According to Engadget, PCMark 7 has a bug that causes Windows 8 to score lower than it should.
Re: (Score:2)
Well, Win7 is faster than Vista, and Win8 is faster than Win7, but WinXP is/was faster than all of them, especially when first released. One of the service packs (SP2? SP3?) was a big security upgrade that cut the speed of many operations in half or worse.
Re: (Score:2)
Nope. Well, this may be true for windows, but windows is not the only OS around.
Even on Windows, this is not true. Windows 7 is faster and less bloated than Vista, and on par with XP. Supposedly Windows 8 is smaller still, though even on a diet, Windows can't compete with purpose-built portable OSes like Android and iOS when it comes to efficiency.
Re: (Score:2)
Really? What are you basing this assertion on? 10.2 and 10.3 weren't very refined overall, but they were plenty quick with sufficient RAM and GPU, especially given that the Quartz graphics system developed way back in 2001 made it (besides Amiga OS) the first mainstream OS to include graphics compositing capabilities [wikipedia.org] to offload window manager rendering from the CPU to the GPU. Apple was way ahead on that, MS didn't even have compositing working in
Re: (Score:2)
Somebody correct me if I'm wrong, but isn't Win8 64-bit only?
At least according to Wikipedia there is a 32-bit Win8 available.
Is there even such a thing as a Pentium 4 capable of executing x64 code?
Yes, Intel added 64-bit support towards the end of the Pentium 4's life.
Re: (Score:2, Troll)
I would like to inform you that Linux is NOT an operating system!!!
Neither is Windows. It may be a system, but I wouldn't call it operating.
Re: (Score:2)
Sorry, no. On all counts. No "shitware"; in fact both machines and both OSs generally have very similar software suites installed, as I build them for my own needs from the ground up, including software. That means OEM disk installation followed by manual install of all relevant software.
Of course, I don't spend days looking for drivers either. In general, when buying a new machine, I order parts to assemble the machine myself and before I order parts, I make sure that all of them have appropriate drivers. Manuf
Re: (Score:2)
I do not run aero at all. I always use the classic theme on both 7 and XP (one that looks like original win95/98 interface). Also, all drivers are up to date as I tend to play latest games, many of which have a tendency of having problems even with latest stable releases of drivers (such as GW2 essentially requiring beta drivers at release).
Comment removed (Score:5, Informative)
Re: (Score:2)
> and that ARM offers such efficiency
Big deal.
Intel has been getting its act together in this regard. So this advantage of ARM isn't so great anymore. Meanwhile, you still do have the massive performance gap between x86 and ARM should you decide to do something besides browse LOLcats.
If anything, it's AMD that's lagging behind here.
Re:Title is rubbish (Score:5, Insightful)
Smaller transistors can be operated with less current, so Moore's law remains as relevant as ever.
Re:Title is rubbish (Score:4, Informative)
What?
Twice the transistors, half the price. That is what Moore's law boils down to, according to his paper. Read it.
And yes, it's not relevant for a number of reasons.
As a real world example:
In 06 you could get a 3 GHz computer. If Moore's law still impacted speed, we would be able to get a 24GHz chip right now.
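The arithmetic behind that claim, for anyone following along (assuming the popular "doubles every two years" reading of Moore's law, which is really about transistor count, not clock speed):

    base_ghz = 3.0                     # what you could buy circa 2006
    doublings = (2012 - 2006) / 2      # one doubling per ~2 years
    print(base_ghz * 2 ** doublings)   # 24.0 (GHz), per the poster's logic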
Re: (Score:2)
...
As a real world example:
In 06 you could get a 3 GHz computer. If Moore's law still impacted speed, we would be able to get a 24GHz chip right now.
GHz is not the only measure, and maybe this is a little behind the curve, but still:
http://www.engadget.com/2012/11/08/intel-launches-8-core-itanium-9500-teases-xeon-e7-linked-kittson/ [engadget.com]
Re:Title is rubbish (Score:5, Insightful)
In 06 you could get a 3 GHz computer. If Moore's law still impacted speed, we would be able to get a 24GHz chip right now.
i7-3960X is 6 cores at 3.3 - 3.9 GHz each. That isn't all that far from 24 GHz.
Re:Title is rubbish (Score:4, Funny)
>i7-3960X is 6 cores at 3.3 - 3.9 GHz each. That isn't all that far from 24 GHz.
I feel a great disturbance in Slashdot, as if millions of CS majors cried out something rather uncomplimentary about your ignorance.
Re: (Score:2)
In 06!!!! Try 2002:
https://secure.wikimedia.org/wikipedia/en/wiki/Intel_Pentium_4 [wikimedia.org]
" This initial 3.06 GHz 533FSB Pentium 4 Hyper-Threading enabled processor was known as Pentium 4 HT and was introduced to mass market by Gateway in November 2002."
Re: (Score:2)
One problem.
Even assuming this is correct, we don't have that long.
Sometime around 2020-2030, conventional transistors stop working, as feature lengths approach 5nm.
Even neglecting that, in another 10-15 years of Moore's law, you'd expect gate lengths to hit 0.5nm.
Oops. Silicon atoms are spaced at 0.5nm.
You can see the problem.
3D is one partial solution, but it just lets you put more transistors on a die, not let them use less power, or be faster.
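To see roughly when that wall arrives, a crude sketch (the start year and feature size are assumptions, and real scaling is messier than a clean halving):

    size_nm, year = 22.0, 2012         # leading-edge process at the time
    while size_nm > 0.5:               # ~0.5 nm: silicon atom spacing
        size_nm /= 2
        year += 2
    print(year, round(size_nm, 2))     # 2024 0.34 -- atoms get in the way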
Re: (Score:2)
Back-slash-dot? Is that where you don't bother reading TFA? I read before I comment, anyway.
Re: (Score:2)
Efficiency! (Score:5, Interesting)
" efficiency now matters more than gains in raw performance"
Sure, so why don't you start off by telling us why an Exynos Cortex A-15 chip running a web benchmark is using about 8 watts of power, with the display turned off so only SoC power is being measured, while Intel has already demoed a full-blown Haswell running Unigine Heaven at... 8 watts.
So when the miraculous Cortex A-15 uses the same amount of power as the supposedly "bloated" x86 Haswell, while Haswell is running a benchmark that is massively more intensive than a web-browser test, who is really making the most "efficient" platform?
Exynos Source: http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/7 [anandtech.com]
Haswell Demo Video: http://www.youtube.com/watch?v=cKvVdhkgAxg [youtube.com]
Re:Efficiency! (Score:4, Informative)
That's a false comparison, though. If users mostly ran benchmarks 24x7, that would be a good test of efficiency. The reality, however, is that CPUs mostly sit idle, so to compute average efficiency, you have to factor that in.
Granted, a faster CPU that can reach an idle state sooner can be more efficient than a slower CPU that runs at full bore for a longer period of time, but only if the idle wattage is fairly similar.
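A toy model of that "race to idle" point, with made-up wattages just to show the shape of the math:

    def avg_power(active_w, idle_w, work_seconds, period=60.0):
        # Average power over one period: a burst of work, then idle.
        busy = min(work_seconds, period)
        return (active_w * busy + idle_w * (period - busy)) / period

    # Fast chip: 8 W active but done in 2 s; slow chip: 2 W active for 20 s.
    print(avg_power(active_w=8, idle_w=0.10, work_seconds=2))    # ~0.36 W
    print(avg_power(active_w=2, idle_w=0.05, work_seconds=20))   # ~0.70 W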
Re: (Score:3)
Re: (Score:3)
Good thing then that Haswell's idle power draw is 20x better than Ivy Bridge's, meaning that it is probably about the same as the Cortex A-15 (or maybe even better).
I'm not saying that Haswell belongs in a smartphone.. I'm also saying that unless you downclock that Exynos you don't want it in a smartphone either. I *am* saying that the blind assumption that ARM == efficiency tends to disintegrate when confronted with facts. I'm also saying that if Haswell can run at 8 watts, the whole "x86 wastes powar!"
Re:Efficiency! (Score:5, Informative)
Funny you should hate on Medfield when a RAZR i with Medfield has better battery life than a Krait-based RAZR M with the exact same screen and battery. But it looks like you never let facts get in the way of your koolaid.
Re: (Score:2)
That's a false comparison, though. If users mostly ran benchmarks 24x7, that would be a good test of efficiency. The reality, however, is that CPUs mostly sit idle, so to compute average efficiency, you have to factor that in.
Your efficiency of not doing work is like measuring the MPG you get idling in your driveway. Laptops have been either off or in sleep/suspend; they haven't had an "active idle" mode like cell phones waiting for calls/texts/emails because they've never needed one. It's like having a huge office building with only floor switches; cell phones have had a single light for the night receptionist, and laptop chips haven't because it's been lights out when they sleep. Now they need one and will get one with Haswell
Re: (Score:2)
Wait, wait... are you trying to say that in a notebook system doing wireless web surfing, the only sources of power are the CPU and the display?
If so, you are way off.
Re: (Score:3)
No, I'm saying that on a chromebook with a SoC (that stands for "system on a chip" you know...) the total power consumption of the SoC running a web benchmark that likely requires little or no wireless network power due to caching is equivalent to the power consumption of a low-power Haswell part (that is similar to a SoC but with a separate south-bridge MCM).
Oh, and if the Kraken benchmark is anything remotely similar to any other web browser benchmark I've ever seen, the CPU/GPU on the SoC are not being t
Re: (Score:2)
Wonder what the cost difference is between those two.. If you're putting it in a low cost device, a difference in price can be rather significant.. (ie, do I spend $30 more on each CPU, or go with the cheaper CPU, and $20 worth of extra battery)..
Oh, wait.. one is out in production. another has no firm release date.. So a brand new, not yet actually in use chip is faster, and uses less power than one that has been around a while.. Fascinating...
Re:Efficiency! (Score:5, Insightful)
Haswell is a (probably) ~1.6 Billion transistor chip that obviously costs more than a SoC that is really designed for tablets. Interesting then that a ~1.6 Billion transistor chip that includes similar functionality to the SoC uses about the same amount of power as that tablet SoC while including vastly more performance.
If you want cheap, Atoms are already out now that are quite cost competitive with ARM chips, and 22nm Atoms will be out next year.
Oh and as for "release dates" the Exynos has just barely begun to reach the market and Haswell will be out and about at around the same time that most Cortex A-15s really come into the market as well. Considering I've had to listen to "A15 will kill Intel!!!!" for over 2 years as if they were already coming out of faucets like water, I'm not too worried about part availability.
So here we are in the ARM vs. Intel Evolution:
2008: ARM is superior, Intel can NEVER scale its power consumption down below 100 watts!!
2009-2010: ARM is still superior! Atom sucks at performance and uses 10 WHOLE WATTS, that's more than 10X ARM! The Cortex A9 will annihilate Intel!
2011: ARM performance dominance is just around the corner! Ignore those useless benchmarks of Cortex A9 vs. Atom! So what if Atom has higher performance, IT SUCKS DOWN MORE POWER AND POWER CONSUMPTION IS ALL THAT MATTERS!
2012: Medfield sucks! Who cares if it gets better battery life than a dual-core 28nm Krait when put into Motorola RAZRs with the exact same screen and battery! See, we have benchmarks where the higher-clocked Krait gets 10% better performance (in some benchmarks while losing in others that we ignore)! WHO CARES THAT ATOM IS MORE POWER EFFICIENT, THE ONLY THING THAT MATTERS IS MORE PERFORMANCE!
INTEL IS STILL OVERPRICED EVEN THOUGH THE RAZR i AND RAZR M HAVE THE SAME PRICE!
2013: Uh... at least ARMs are cheap when you intentionally compare chips designed for cellphones to Intel's desktop chips and pretend that Atom doesn't exist. ARM WILL DESTROY INTEL!
Re: (Score:2)
Really.. please show me the same demo running on... let's say... the A6 on the newest iPad. I'm sure it will get *much* better framerates.... (or not).
You completely missed the point of my post, but I can see that Intel is gradually working its way up Gandhi's list. It's getting to be between the "then they laugh at you" and the "then they fight you" stages....
Hate the "Post-PC" era (Score:5, Insightful)
As a geek I love a powerful general purpose machine that can do all the things an ebook reader/music player/web browser can do AND a whole lot more: play 3d games, run a math or science simulation, record and edit video, do memory- and processor-intensive image editing. To me a tablet is little more than a crippled PC with the keyboard removed (fantastic, why did I learn to type at 90wpm again??), and a smudge-screen interface (hate viewing photos through finger marks!!!). It's really awesome that we have dumbed down our computers to the point of mediocrity. Even finding a decent e-book reading or music playing app - the things these pieces of shit are touted as making easier - is a nightmare. So many book readers don't even let you zoom on images. And browsing the web without Flash support is like trying to surf with one leg. I don't mind that there are dumbed down idiot boxes for those who like to post pictures of food on Facebook, but I really resent the impact on general purpose computing.
Re: (Score:3)
Re: (Score:2)
Really, you hate the fact that the mobile core i5 is more powerful than the previous generation while allowing all day battery life? Because that's the biggest way that tablets have affected general purpose computing that I can see. Sure, the current mobile i5 isn't going to transcode video as fast as a current desktop i7, but it'll do it considerably faster than a Core2 era desktop. Plus optimizing idle power is good for the environment, replacing P4 era desktops with current era machines will save you ton
Re: (Score:2)
could not agree more (Score:2)
Re: (Score:2)
Yeah, if you ignore Ivy Bridge and Haswell.
Makes no sense! (Score:3, Insightful)
Moore's law just predicts transistor density - it says absolutely nothing about computational power. Increases in transistor density can make electronics more efficient per watt, but that is still consistent with Moore's law.
The title is stupid, and the actual article says almost nothing like it.
Re: (Score:3)
Actually, this means that the CEO of ARM doesn't know what a transistor is and why you would want more transistors in a tiny space.
Power (Score:4, Insightful)
Sure, efficiency matters, but only in portable devices. Desktops or other computers connected to the mains don't have a problem.
Hey, it's winter already; a watt used by your CPU is a watt less that has to be used by your radiant or convective heater.
Re: (Score:3)
Re: (Score:2)
Now to expose my woeful lack of understanding of the topic!
Is it even apples to apples with electric heaters? I'm not sure how much power my PC is currently drawing, but its exhaust isn't particularly warm--in fact, it feels perceptibly cooler than the ambient temperature. I have no doubt there's a sort of "wind chill" factor going on (it's not a magic PC, so far as I know), but it seems like a damned inefficient heating appliance all the same, especially if I consider space heaters I've used in the past
Re: (Score:2)
Cinder6 (Score:2)
So is the PC just an inefficient heater, then? Even my aluminum case is cold to the touch. If I didn't have so many fans (10 in total), would it make the room hotter?
I'm asking because I often see it claimed that PCs make great space heaters, but in my experience, this one plain doesn't. Under full load, it should draw quite a bit of power, but it outputs much, much less heat than lower-energy dedicated space heaters. I'm tempted to find my Kill-A-Watt and see what it says.
Re: (Score:2)
So is the PC just an inefficient heater, then? Even my aluminum case is cold to the touch. If I didn't have so many fans (10 in total), would it make the room hotter?
I'm asking because I often see it claimed that PCs make great space heaters, but in my experience, this one plain doesn't. Under full load, it should draw quite a bit of power, but it outputs much, much less heat than lower-energy dedicated space heaters. I'm tempted to find my Kill-A-Watt and see what it says.
There's no such thing as an inefficient heater. All the energy your computer uses must end up somewhere, and that somewhere can only be sound or heat. The sound output is usually very low and, as GP explained, absorbed by walls and converted into heat as well. The exceptions are any long-range EM emitters, like WiFi and Bluetooth, which are still converted into heat but not always in the same room or house. So it is only the case and fan design that causes a difference in perceived heat.
Also, I doubt your ded
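A quick sanity check on the conservation-of-energy point above (wattages are illustrative, not measured):

    pc_watts, heater_watts = 400, 1500
    # Same energy in means same total heat out, just spread over more time:
    print(heater_watts / pc_watts)   # 3.75 h of PC == 1 h of space heater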
Re: (Score:2)
The heat that I can control is electric. The furnace is controlled by a thermostat that is upstairs.
And we live on the north side of the building
Re: (Score:3)
Re:Power (Score:5, Insightful)
Re:Power (Score:5, Insightful)
Except in the summer every watt used by your CPU requires your air conditioner to use more energy to counteract it.
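Roughly how that pencils out (the 77 W figure is the i7 TDP mentioned upthread; the COP is a typical assumed value):

    cpu_watts = 77           # heat the CPU dumps into the room
    cop = 3.0                # air conditioner coefficient of performance
    print(cpu_watts / cop)   # ~25.7 extra watts at the meter to pump it back out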
Re: (Score:2)
Wrong.
You assume one watt of electricity converted to heat by a CPU is the same as one watt converted by a heater. There are different devices with different inefficiencies.
Re: (Score:2)
I think he's assuming that one dollar of electricity is converted to as much heat as one dollar of something else.
When that's true, then CPUs are good heaters.
When that's false, then CPUs are second-rate heaters but OTOH you get some other kind of work out of them at that same time they heat, so maybe they're still ok. Or maybe they're not, depending on the cost difference and the value of the work.
An
Re: (Score:2)
Not really, because an electric heater gives off a small amount of energy as visible light, while a processor only gives off heat. So arguably, the processor is marginally more efficient at heating 'per Watt'.
Re: (Score:2)
To an extent. Try selling a desktop that sucks down two kilowatts under full load - see how well it sells. Now look at the sales data and see that Intel's best-selling processors have dropped from 100W+ down to 77W, because it seems, given two processors of similar price, and both having sufficient processing power for the users' needs, consumers prefer the one using less power.
Here come the ARM zombies (Score:5, Insightful)
Sigh. It seems there is a new, hip, propaganda trend on Slashdot: pro-ARM articles are posted, and a bunch of ARM zombies come out saying how anything ARM makes will (magically) be lower-power or more power-efficient than anything x86.
So I'll start a tradition of posting this same response every time (originally posted by me here [slashdot.org]):
"ARM isn't magic; there is nothing in the ARM ISA that makes it inherently lower power than x86. Yes, I'm counting all the decode hardware and microcode that x86 chips need to support legacy ISA. There just isn't much power burned there compared to modern cache sizes, execution resources, and queue/buffer depths which all high-performance cores need regardless of ISA. If you have an x86 processor that targets A9 performance levels, it will burn A9 power (or less if Intel makes it, given Intel's manufacturing advantage). If you have a ARM processor that targets Sandy Bridge performance levels, it will burn Sandy Bridge (or more) power."
Re: (Score:3, Funny)
Aaaaaaaarrrrrrrmmsssss!!
Re: (Score:2)
Re: (Score:3)
You know what I love? When the exact same people who say that an Intel workstation with a 6-core CPU being used for heavy compiling/CAD/etc. is "wasted" and "overkill" claim that a 256-core ARM chip on your cellphone will be insanely great... because... uh... Angry Birds is the most parallelizable program in human history?
No, it's still Moore's law (Score:3)
It is just expressing itself differently as we begin to hit the wall with process size decreases and speed increases. If wattage of the CPU goes down, you can pack more cores into the same area. Computing power is still going up.
Most of those 64 cores will sit idle (Score:2)
If wattage of the CPU goes down, you can pack more cores into the same area.
Most of those 64 cores will sit idle until programming techniques for making extreme parallelism reliable are taught in universities and vocational schools.
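The usual way to see why is Amdahl's law: even a small serial fraction caps the speedup no matter how many cores you add. A quick sketch:

    def amdahl_speedup(parallel_fraction, cores):
        # Speedup = 1 / (serial part + parallel part spread over cores)
        return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

    print(round(amdahl_speedup(0.95, 64), 1))   # 15.4x on 64 cores, not 64x
    print(round(amdahl_speedup(0.99, 64), 1))   # 39.3x even at 99% parallel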
Not So Sure About That (Score:4, Interesting)
Moores law == cost per transistor (Score:2)
I think at the end of the day what really matters whenever Moore's law is invoked is the underlying issue of cost per transistor... I don't see cost ever being relegated to irrelevance.
As transistors get cheaper you can take any combination of two paths:
1. Build cheaper gear with same capabilities.
2. Cram more into the same device to increase capabilities while maintaining price.
Either way Moore's law is still critically important to the industry no matter who wins a CPU architecture war.
With regards to ARM vs
Let me know when phones become render farms. (Score:3)
Mobile chips are shit to people who need render farms, simulation farms, etc. People still do real work out there.
People who keep crapping on workstations and servers seem to think everyone just needs a computer for texting, facebook and angry birds.
Architecture is becoming irrelevant (Score:3)
Sure, power efficiency and die-area are important in many places, but don't think ARM is somehow going to have a lock on that.
Processor + Display + Input + Sound = Value (Score:2)
a moped is like a Harley (Score:4, Interesting)
I think we need some expert analysis on this one.
The PC is used to create content. A smartphone is used to consume content. The PC functions autonomously (in a pinch). The smartphone is permanently welded to its cloud-nipple. The PC brings you smart ideas in shabby attire. The smartphone brings you shabby ideas in smart attire. The PC discourages walled gardens. A smartphone never leaves home without one.
Wake me up when my smartphone comes with a holographic projector capable of conjuring up 40" of viewing pleasure at a comfortable focal plane, and either a haptic keyboard (gravitational hologram?) or a brainstem feed a million times better than Swype.
Next we'll declare that mopeds and Harleys are the same form factor because there are more Asians than balding fat men. Clearly a moped is more like a Harley than a smartphone is like a PC.
People losing sight of the real issue (Score:2)
The real issue here is whether ARM can lock up the market before Intel's offerings become highly competitive. The answer to that is clearly NO, they can't. Intel wants to compete in the mobile SOC market and they clearly have enough of a technology edge with their Fabs to jam their foot in the door before ARM can lock it. Intel doesn't need to blow away ARM here, they only need to make sufficient progress on power consumption to put themselves on near-equal ground. They've already shown that progress.
Re: (Score:3)
...and by this, Intel's fab advantage will eventually make ARM irrelevant.
Simply optimizing code could do that
Re: (Score:2)
'Simply', huh?
Re:Sure... (Score:4, Funny)
Of course! Naturally, one *merely* needs a sufficiently clever compiler...
Re: (Score:2)
Moore's "Law" (Score:2)
I'm *very* glad you made your comment... it highlights a serious mistake in computing... a mistake that costs **BILLIONS** and affects people directly...
"If this guy gets his way, then we may never have sentient computers."
The idea that Moore's "Law" is somehow a scientific predictor that indicates the ability and future inevitability of humans making 'sentient' computers is...well...it's ridiculous.
1. Computers execute instructions...humans are beyond that, we have *free will*...For humans to make 'sentient'
Re: (Score:3)
So there you have it. IMHO, geeks create fictions by transmogrifying anecdotal data into quantitative data... why? Simple, quantitative data has less uncertainty, and therefore is easier for a 'geek-minded' person to synthesize
Put that dictionary away. It's doing more harm than good.
Re: (Score:2)
Doh, I was thinking Mac and typed 68000, I meant PowerPC.
But apparently I'm getting the sh*t modded down by ARM fanboys/bois because I'm trying to be objective.
Whatevs.
Re: (Score:2)
68000 was CISC, not RISC.
And its addressing modes, at least, were CISCier than x86's (auto-increment, auto-decrement, etc.).