NVIDIA Tegra X1 Performance Exceeds Intel Bay Trail SoCs, AMD AM1 APUs 57
An anonymous reader writes: An NVIDIA SHIELD Android TV modified to run Ubuntu Linux is providing interesting data on how NVIDIA's latest "Tegra X1" 64-bit ARM big.LITTLE SoC compares to various Intel/AMD/MIPS systems of varying form factors. Tegra X1 benchmarks on Ubuntu show strong performance from the X1 SoC in this $200 Android TV device, beating out low-power Intel Atom/Celeron Bay Trail SoCs and AMD AM1 APUs, and in some workloads even coming close to an Intel Core i3 "Broadwell" NUC. The Tegra X1 features Maxwell "GM20B" graphics, and total power consumption is less than 10 watts.
Re: (Score:2)
Re: (Score:2)
No mention of Bitcoin, 3D printing, Arduino or Raspberry Pi.
Lame.
nice, but they remove your games if you update (Score:1)
no thanks...
Re: (Score:2)
Why did they need to remove them? Android updates aren't that large. Can you redownload them, or are they gone forever?
Re: (Score:3)
Actually, the point of having stable APIs and ABIs is so that other dependent binaries and code continue to work and compile even when changes are made to the underlying software.
Re:nice, but they remove your games if you update (Score:4, Informative)
The more troubling question is why an application should feel forced to do anything in the face of a platform upgrade in order to keep working at all. A modern Windows desktop can still run 10-year-old software without a hiccup. Going back 20 years you start needing something like DOSBox for a lot of the applications, though it's still doable. I haven't tried firing it up in a while, but the last time I tried the commercial package of Quake 3 under Linux, it still worked on a modern distribution. Same for Linux Neverwinter Nights. As an application maintainer for some Linux stuff, the only things I can recall forcing my hand to change something just to keep things working were systemd and Python changes.
Android (and to a significant, but somewhat lesser, extent Apple) is not doing that well with respect to application and/or hardware compatibility with the past. It's a tiring situation for developers, having to follow an upgrade treadmill that caters to new system sales just to keep their current applications working as-is.
Not an AMD CPU (Score:5, Interesting)
The X1 uses standard ARM Cortex-A57 cores (specifically an A57/A53 big.LITTLE 4+4 config), so this says more about ARM's chip than anything nVidia did...
Now if you compared nVidia's Denver CPU, their in-house processor... Denver is nearly twice as fast as the A57 but only comes in a dual-core config, so it's probably drawing a good deal more power per core. When you compare a quad-core A57 to a dual-core Denver, the A57 comes out slightly ahead in multicore benchmarks. Of course, single-core performance is important too, so I'd be tempted to take a dual-core part over a quad-core if the dual-core had twice the per-core performance...
Why the X1 didn't use a variant of Denver isn't something nVidia has said, but the assumption most make is that it wasn't ready for the 20nm die shrink that the X1 entailed.
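For anyone who wants to confirm the 4+4 layout from a SHIELD running Ubuntu (or any other AArch64 Linux box), here's a minimal sketch. It only assumes the standard /proc/cpuinfo format and ARM's published part IDs (0xd03 = Cortex-A53, 0xd07 = Cortex-A57); nothing here is Tegra-specific, and cores that are currently offline won't be listed.

    /* cpu_parts.c - count Cortex-A53 vs. Cortex-A57 cores via /proc/cpuinfo.
     * Build on the device with: gcc cpu_parts.c -o cpu_parts
     */
    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("/proc/cpuinfo", "r");
        if (!f) { perror("/proc/cpuinfo"); return 1; }

        char line[256];
        int a53 = 0, a57 = 0, other = 0;
        while (fgets(line, sizeof line, f)) {
            unsigned part;
            /* arm64 kernels print one "CPU part : 0x..." line per online core. */
            if (sscanf(line, "CPU part : 0x%x", &part) == 1) {
                if (part == 0xd03)      a53++;   /* Cortex-A53 */
                else if (part == 0xd07) a57++;   /* Cortex-A57 */
                else                    other++;
            }
        }
        fclose(f);

        printf("Cortex-A53 cores: %d\nCortex-A57 cores: %d\nother: %d\n",
               a53, a57, other);
        return 0;
    }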
Re: (Score:3)
Sorry, I meant not an *nVidia* CPU.
Re: (Score:2)
Re: (Score:2)
I'm bully on ARM. With the (almost) collapse of AMD as a "first rate" processor maker, it's good to see Intel get some serious competition in a significant market space.
My only beef with ARM is that comparing CPUs is harder than comparing video cards! The ARM space is so fragmented with licensed cores and seemingly random numbers indicating the "version" that I have no idea how, for example, a Snapdragon 808 processor compares to a Cortex-A9 or an Apple A7.
Really, I'm lost. But the $40 TV stick with the 4x core A9 w
Re: (Score:2)
> I'm bully on ARM
Is that some new slang, or did you mean bullish?
Re: Not an AMD CPU (Score:2)
Actually it is old slang, dating back as far as 1609 according to Merriam-Webster. It enjoyed a place in the popular lexicon during the latter part of the 19th century and was a commonly used expression of U.S. President Theodore Roosevelt.
Re: (Score:2)
Interesting; even a literal 'bully on' search didn't turn up anything (m-w.com doesn't describe this usage either). I've only heard 'schoolyard bully' and 'bully for you' in the past, and of course someone being 'bullish on' or 'bearish on' something.
Re: (Score:2)
Re: (Score:2)
No, it says more about the bad benchmark than anything else.
To put it bluntly, I'm not impressed that ARM can NOP as fast as an i3.
I say this because of the way they ran the benchmarks: they compile the ARM variants in fully optimized mode and the x86 variants as generic x86 code. From that point on, reading is a waste of time. From a benchmark perspective, they might as well have compiled with debugging on.
It's intentionally skewed.
Interesting, but compiler settings aren't optimal (Score:5, Insightful)
Look here at the compiler settings [phoronix.com]. The x86 processors are somewhat hampered by non-optimal settings. For example, the i3 5010U is set to -mtune=generic. In my experience, that's basically going to default to AMD K8-era optimization with no AVX/AVX2 support. The better option would be -mtune=native, or better yet -march=native, which detects the CPU and produces a more optimized binary.
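As a rough sketch of what that difference means in practice (the file name, kernel, and sizes below are invented for illustration; the flags themselves are standard GCC options): compile the same trivially vectorizable loop with the generic baseline and then with -march=native, and compare the disassembly to see whether the compiler was allowed to emit AVX/AVX2 for a Broadwell i3 or stayed on baseline x86-64 SSE2.

    /* saxpy.c - hypothetical micro-example, not the article's workload.
     *
     * Roughly the article's baseline (no -march, so plain x86-64/SSE2):
     *   gcc -O3 -mtune=generic saxpy.c -o saxpy_generic
     * Tuned for the host CPU (AVX/AVX2 on a Broadwell i3):
     *   gcc -O3 -march=native saxpy.c -o saxpy_native
     *
     * "objdump -d saxpy_native | grep ymm" shows whether the vector loop
     * actually uses the wider AVX registers.
     */
    #include <stddef.h>
    #include <stdio.h>

    void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
    {
        /* Simple enough for GCC's auto-vectorizer at -O3. */
        for (size_t i = 0; i < n; i++)
            y[i] += a * x[i];
    }

    int main(void)
    {
        enum { N = 1 << 20 };
        static float x[N], y[N];
        for (size_t i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }
        saxpy(y, x, 0.5f, N);
        printf("y[0] = %f\n", y[0]);  /* expect 2.500000 */
        return 0;
    }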
So using a 20-year-old subset of the instructions (Score:2)
it does worse? You just found out that the benchmark is bullshit.
Re: (Score:2)
This is exactly why the benchmarks include
1) a way to repeat the benchmarks, as described in the article (see page 4): 'phoronix-test-suite benchmark 1507285-BE-POWERLOW159', and
2) the compiler options used.
Armed with those two pieces of information, you can go and "prove" that the benchmark is, as you called it, bullshit. Although rarely, if ever that I am aware of, does anyone respond to an article with those two pieces of information and say, "here, if you run it in thi
Re: (Score:2)
>If you really want to prove that the benchmark is crap, then by all means make meaningful suggestions to _any_ of the existing machine benchmarks.
That's a bit facetious. If you've been around the benchmarking world as long as you say you have, you'll know that the compiler settings are *always* a cause of controversy.
Nobody is happy when the compiler settings chosen don't favor their side (whatever that side is).
Re: (Score:2)
Well, with the possible exception of those running Gentoo, 99% of end users will be running precompiled software that has to be built for a generic CPU, since the distributor doesn't know exactly what type of processor it's going to end up running on.
Re: (Score:2)
I run Gentoo!!!
Besides that, I did some very recent Intel CPU benchmarking while trying to figure out IPC gains across CPU generations. I ran my benchmarks with GCC 4.8/4.9/5.2 and LLVM 3.6 on Nehalem and Ivy Bridge. I also compared the generic defaults against -march=native. Quick summary: for generic integer/floating-point code, Intel Core i7 CPUs don't actually benefit much from optimizations for newer architectures, especially on x86-64. The exception is that 32-bit generic x87 FPU code is slower than SSE2, but t
Re: (Score:1)
Well, these just happened to be on my Gentoo boxes, and therefore, of interest.
Re: (Score:2)
The problem being they didn't do the same for ARM. Either that argument applies to both sides or to neither; they need to be held to the same standard.
NUC? (Score:2)
What incompetence led Intel to use a temporally relative name? It's on par with putting 'new' in a product name. It seems to work OK until it doesn't, and then looks idiotic in retrospect.
Re: (Score:2)
What incompetence led Intel to use a temporally relative name? It's on par with putting 'new' in a product name. It seems to work OK until it doesn't, and then looks idiotic in retrospect.
What looks idiotic in retrospect is your comment. The name only has to make sense long enough to sell a bunch of units. Then they're on to the next product.
Re: (Score:2)
don't forget your meds, man
Re: (Score:2)
don't forget your meds, man
Don't forget to leave a comment worth reading, man
VDPAU support? (Score:3)
Do VDPAU (NVIDIA's video decode hardware acceleration API) drivers exist for this platform? In the past, I believe only the x86 binary blob drivers supported VDPAU.
If they exist, this would make an excellent MythTV DVR frontend device.
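One way to answer that without waiting on release notes is to simply try creating a VDPAU device against the running X server. This is only a sketch using the public libvdpau entry point (vdp_device_create_x11 from vdpau/vdpau_x11.h); whether NVIDIA actually ships a working ARM backend for the SHIELD's driver is exactly the open question.

    /* vdpau_probe.c - does this box have a usable VDPAU backend?
     * Build with: gcc vdpau_probe.c -o vdpau_probe -lvdpau -lX11
     */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau_x11.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open X display\n");
            return 1;
        }

        VdpDevice device;
        VdpGetProcAddress *get_proc_address;
        VdpStatus st = vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                             &device, &get_proc_address);
        if (st == VDP_STATUS_OK)
            printf("VDPAU device created; hardware decode should be usable\n");
        else
            printf("no usable VDPAU backend (status %d)\n", (int)st);

        XCloseDisplay(dpy);
        return 0;
    }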
Re: (Score:2)
It should run roughly the same driver as a graphics card under a Linux PC (which shares much with the Windows driver too).
When nvidia first showed off the Tegra K1, it was running Ubuntu 12.04, with a screenshot of nvidia-settings among a few other things.
Availability of Ubuntu (Score:1)
Bullshit (Score:1)
10 times out of 10, NVidia non-GPU chip benchmarks are paid for by NVidia and are complete bullshit, designed to get fanboys to buy their latest chip. There have been no exceptions with Tegra to date.
Re: (Score:1)
10W is hellish hot (Score:4, Insightful)
10W is incredibly hot for any sort of passively cooled, enclosed device.
The machine would be quite warm (almost hot) to the touch unless they use some inventive cooling. The current-gen Apple TV is about 6W, and a typical smartphone is around 2-3W.
There is a reason NV has only really been able to get a foothold in tablets, Android TV, cars, and their own Shield products. Quite simply, their chips have historically been fast and hot: great as an SoC within certain markets.
Re: (Score:1)
You are correct only if comparing within the same form factor. For me, this race to 10-20W for plugged-in computing is crazy; power isn't that expensive as long as the machine doesn't consume much at idle. I don't really care how much it uses during operation as long as it isn't >300W or so, or 100W in a mostly enclosed space, or 50-80W in a fully enclosed space.
Give me the better performance please.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Actually, it isn't too hot. ARM t
AMD did well! (Score:3)
Interesting take-home from the benchmark: the AMD desktop processors did pretty respectably compared to the i7s. Usually a bit slower, sometimes actually faster, and we know an AMD setup is certainly cheaper.
Interesting that in open-source, repeatable, examinable benchmarks, the difference between Intel and AMD is a lot less pronounced.
Re: No ARM for me (Score:1)
Carrots on sticks for gamers notorious for being baited.
Tegra sucks (Score:2)
NVidia should have spent more money on engineering and less on advertising. All the Tegra chipsets have overpromised and underdelivered. I see no reason why this one should be any different.