Cellphones Intel Hardware

Intel Medfield SoC Specs Leak

MrSeb writes "Specifications and benchmarks of Intel's 32nm Medfield platform — Chipzilla's latest iteration of Atom and its first real system-on-a-chip aimed at smartphones and tablets — have leaked. The tablet reference platform is reported to be a 1.6GHz x86 CPU coupled with 1GB of DDR2 RAM, Wi-Fi, Bluetooth, and FM radios, and an as-yet-unknown GPU. The smartphone version will probably be clocked a bit slower, but otherwise the same. Benchmark-wise, Medfield seems to beat the ARM competition from Samsung, Qualcomm, and Nvidia — and, perhaps most importantly, its power consumption is in line with ARM's, with idle draw around 2 watts and around 3W under load."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • One benchmark (Score:2, Insightful)

    by teh31337one ( 1590023 ) on Tuesday December 27, 2011 @11:10PM (#38510988)

    It beats the current crop of dual-core ARM processors (Exynos, Snapdragon S3, and Tegra 2) in one benchmark that "leaked".

    Nothing fishy about that at all.

  • whoosh (Score:5, Insightful)

    by decora ( 1710862 ) on Tuesday December 27, 2011 @11:37PM (#38511186) Journal

    teh31337one's point, if I may, was that this 'leak' was actually a 'plant', a PR move by Intel to get people posting ridiculous speculative nonsense, like, exactly the stuff you posted in your comment.

    "if this is realistic, intel has an awesome CPU" etc etc etc.

    Does anyone care if it's realistic? Intel sure doesn't; it just wants people to speculate that it might be realistic, and then talk about Intel, and how awesome Intel is.

    But of course, it might be a load of crap... when the actual numbers come out, who knows what they will say? And when real programs hit the thing, who knows what it will do?

    That's why Intel is 'leaking' it. On purpose. So they can have 'plausible deniability'. They can churn the rumor mill, get their product mentioned in the 24-hour ADHD cycle of tech news, get people posting on Slashdot, etc., but Intel itself never has to sully its good name by engaging in outright pushing of vaporware.

    If only the guys at Duke Nukem had been smart enough to 'leak' stuff 'anonymously' to the press, instead of giving out press releases...

    Of course, another way to look at it is this: It's yet another example of the corporate philosophical suite that is drowning our civilization in garbage and awful values. Never say anything directly, never take responsibility for your words or actions, never be straight with people, and hide everything you are doing in layers and layers of techno jargon, babble, and nonsense.

  • Re:whoosh (Score:4, Insightful)

    by Jeremi ( 14640 ) on Tuesday December 27, 2011 @11:57PM (#38511366) Homepage

    Does anyone care if it's realistic? Intel sure doesn't

    Intel will care if the leaks create unrealistic expectations that their product can't meet. The result could be consumer rejection of an otherwise respectable product, because the public had been (mis)led to expect more than the product could actually deliver. (see: Itanium as replacement for x86)

    So the "secret Intel propaganda strategy" only works if Intel actually has a reasonable chance of living up to their own unofficial hype. And based on their recent track record, they probably do.

  • Re:One benchmark (Score:5, Insightful)

    by Anonymous Coward on Wednesday December 28, 2011 @12:14AM (#38511490)

    I did read the story - but did you? Its idle TDP stands at 2.6W. A 1700mAh battery (typical in a cell phone) @ 3.6V = 6.12 Wh. So you'll get around 2.4 hrs of uptime under idle conditions, assuming the battery is new. Good luck trying to charge that monster every couple of hours!
    Who cares about performance when your phone will be dead before making a single call? Not much better in tablets either!
    So, what is this chip competing against? Other laptop chips from Intel?
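    The runtime arithmetic in this comment can be sketched in a few lines. All figures (1700mAh, 3.6V, 2.6W idle) are the commenter's assumptions, not measured Medfield numbers:

    ```python
    # Battery life estimate from the comment above (hypothetical figures).
    # Capacity in watt-hours = mAh * volts / 1000; runtime = Wh / draw.
    def idle_runtime_hours(capacity_mah: float, volts: float, idle_draw_w: float) -> float:
        """Return idle runtime in hours for a battery of the given capacity."""
        capacity_wh = capacity_mah * volts / 1000.0
        return capacity_wh / idle_draw_w

    # 1700 mAh @ 3.6 V = 6.12 Wh; at 2.6 W idle draw that works out to ~2.4 hours.
    print(round(idle_runtime_hours(1700, 3.6, 2.6), 2))
    ```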

  • Re:One benchmark (Score:5, Insightful)

    by Anonymous Coward on Wednesday December 28, 2011 @12:38AM (#38511620)

    Yeah... no.

    vr-zone [vr-zone.com]

    As it stands right now, the prototype version is consuming 2.6W in idle with the target being 2W, while the worst case scenarios are video playback: watching the video at 720p in Adobe Flash format will consume 3.6W, while the target for shipping parts should be 1W less (2.6W)

    extremeTech [extremetech.com]

    The final chips, which ship early next year, aim to cut this down to 2W and 2.6W respectively. This is in-line with the latest ARM chips, though again, we’ll need to get our hands on some production silicon to see how Medfield really performs.

    And which ARM SoCs idle at 2W? That's at least an order of magnitude greater than any ARM SoC - those typically idle at a few tens or hundreds of milliwatts. ARM's big.LITTLE architectures will bring that down even further.
    So, Medfield may be competitive on speed and TDP at full load, but if you are a mobile device maker, would you care? You would probably be more interested in eking out more uptime from your tiny battery.

  • by mollymoo ( 202721 ) on Wednesday December 28, 2011 @12:53AM (#38511704) Journal
    x86 is a huge, complex instruction set. All else being equal, implementing it costs more silicon and more power than ARM architectures. Intel's great engineers and unmatched process can make up for this somewhat, but it would be a good effort for them just to achieve parity with ARM. To do so they're likely going to need to stay one process step ahead of the competition, which has cost implications.
  • Re:Dubious (Score:5, Insightful)

    by ArcherB ( 796902 ) on Wednesday December 28, 2011 @02:16AM (#38512132) Journal

    Intel took x86 to workstations and supercomputers killing many RISC processors in the process. It'll be fun to see them pull it off again against ARM.

    No, it wouldn't. RISC is a superior instruction set. x86 only beat RISC because it was really the only game in town if you wanted to run Windows, which every non-Mac user did. At the time, the desktop was king and made Intel lots and lots of money, which they used to beef up their server offerings. Now we are stuck with x86, with RISC being used only in "closed" architectures like smartphones, consoles, and big-iron servers.

    I like competition. I'd rather see ARM make gobs of money designing chips that everyone can improve on than Intel make gobs and more gobs of money selling desktop, server, and mobile chips that only they may design, produce, and sell.

    The final processor line that Intel makes will be the one they are producing when they become the only game in town.

  • by mirix ( 1649853 ) on Wednesday December 28, 2011 @03:00AM (#38512358)

    Bingo. My ageing Nokia, while lacking in horsepower, has excellent battery life. It has a 600MHz ARM and a 3.2Wh battery. It manages to idle for a week at least; I'm sure it's hit 10 days before, but let's say 7, to be safe.

    3.2Wh / (7 × 24 h) ≈ 19mW idle. Two fucking orders of magnitude better than their *target*. (Not to mention this includes the entire phone, not just the core, in real life.)

    I presume the more powerful Android phones still keep it within 100mW for the whole phone while idling - that would give you roughly two days of idle on a decent-sized phone battery (5Wh). That's still more than an order of magnitude difference.
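    This back-of-the-envelope check can be written out explicitly. All inputs (3.2Wh / 7 days for the Nokia, 5Wh / 2 days for a hypothetical Android phone) are the commenter's estimates for whole-phone draw, not SoC measurements:

    ```python
    # Average idle power implied by battery capacity and observed idle time.
    def avg_idle_power_mw(battery_wh: float, idle_days: float) -> float:
        """Return the average whole-phone idle draw in milliwatts."""
        return battery_wh / (idle_days * 24.0) * 1000.0

    nokia_mw = avg_idle_power_mw(3.2, 7)    # 3.2 Wh over 7 days: ~19 mW
    android_mw = avg_idle_power_mw(5.0, 2)  # 5 Wh over 2 days: ~104 mW
    print(round(nokia_mw), round(android_mw))
    ```

    Either figure is far below the 2W idle target quoted for the Medfield prototype, which is the commenter's point.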

  • Re:Dubious (Score:4, Insightful)

    by Henriok ( 6762 ) on Wednesday December 28, 2011 @04:37AM (#38512838)
    What RISC platform did XP, Vista, and Windows 7 run on? XP had support for Itanium, but that's not a RISC platform. Vista and Win7 only support 32- and 64-bit x86. So it seems your statement is wrong.
