ARM Unveils Three Second-Generation Mali GPUs
Barence writes "ARM has taken the lid off three new Mali T600 graphics chips that form the second generation of its mobile Midgard architecture. Designed for use in smartphones, tablets and smart TVs, the three chips range from four to eight cores, improve performance by 'up to 50%' and offer greater efficiency. ARM expects devices to begin appearing with the chips this time next year."
That's fast! (Score:5, Funny)
That's like 20 Mali GPUs generated per minute, or 1200 per hour.
Re: (Score:3, Funny)
They are recompiled from HDL (Hardware Descriptive Langauge), so making up a new one can be as fast as a new FireFox rev.
Re: (Score:2)
They are recompiled from HDL (Hardware Descriptive Langauge), so making up a new one can be as fast as a new FireFox rev.
HDL = Hardware Description Language.
There, fixed that for you.
Works for me (Score:3)
I won't be eligible for a subsidized upgrade until this time next year.
Re: (Score:2)
Try buying Azawad separately.
Re: (Score:1)
Re: (Score:1)
Sometimes errors can be inconvenient.
http://www.dailysquib.co.uk/world/2568-sarah-palin-s-weak-geography-could-be-problem-in-nuclear-war.html [dailysquib.co.uk]
Re: (Score:2)
Open source drivers? (Score:5, Informative)
Unfortunately, they will not be *fully* open source:
For the ARM Mali T6xx Linux enablement, they are only using a DRM driver for driving the display controller while they have their own separate kernel driver for poking and handling the GPU itself. ARM though isn't being too open-source friendly in terms of a fully open stack or providing proper documentation.
Good grief.
Re:Open source drivers? (Score:5, Informative)
Re:Open source drivers? (Score:4, Informative)
Unfortunately, they will not be *fully* open source:
Shame.
It's silly really. Intel have shown that OSS 3D drivers are perfectly feasible.
Actually, Intel graphics are pretty much the best choice unless you need high performance for something, simply because the drivers are extremely solid and reliable.
Re:Open source drivers? (Score:4, Insightful)
It's been mentioned before. Many times the hardware may have features that are not yet fully developed or licensed. I remember Nvidia saying this about the Linux driver and shader languages - that GLSL wasn't licensed for Linux or something. Then you have patent trolls who will jump at every keyword as evidence that you are violating the patents they bought off Craigslist or eBay. Your mention of a "geometry tree" might mean a list of vertices; to them, it's their super-spiffy collision detection algorithm, so you end up paying lawyers large sums of money to fight it out on your behalf. You might also get the odd nutter or academic department who will go off and patent a future feature enhancement documented in the comments.
It's like the early days of the PC, when somebody figured out a way of reprogramming the system timer to drive 8 kHz interrupts so low-quality sound could be played on the internal speaker. They actually patented it, even though it completely scrambled all file update and creation times.
Given the rate of change that is going on (GLES 3.0 [khronos.org] is out today), it would be very confusing for developers if ARM had one version of the driver that supported the current standard and the user community had their own with custom extensions.
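For what it's worth, here is a rough sketch (mine, not from the parent post, and it assumes an EGL context is already current) of how a GLES application has to probe the driver's extension string at runtime - exactly the place where an official ARM driver and a community driver advertising different custom extensions would start confusing developers:

#include <stdio.h>
#include <string.h>
#include <GLES2/gl2.h>

/* Naive substring check against the space-separated extension list.
 * Returns 0 if no context is current (glGetString returns NULL then). */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

/* Example query: GL_OES_texture_float is a real, commonly probed extension. */
void report_float_textures(void)
{
    printf("float textures: %s\n",
           has_extension("GL_OES_texture_float") ? "available" : "missing");
}

Two drivers that implement the same GLES version but export different extension strings send an app down different code paths, which is the confusion the parent is worried about.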
Re: (Score:2)
From a pedant who remembers how this worked:
* You could get more than 8 kHz. But the number of PWM steps you had was 1.193180 MHz / sampling frequency - so only 54 at 22.050 kHz. The higher sampling frequencies made the PWM "whine" less audible.
* As long as you called the original timer interrupt code at the correct frequency (1.193180 MHz / 65536, ~18.2 Hz), the system clock would stay accurate. Of course, if you failed to do this, it wouldn't. (A quick sketch of the arithmetic follows below.)
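To put numbers on the two bullets above, here's a quick C sketch of the arithmetic (purely illustrative; the 1,193,180 Hz figure is the standard PIT input clock, and nothing here is real driver code):

#include <stdio.h>

#define PIT_HZ 1193180UL   /* input clock of the PC's programmable interval timer */

/* PWM "volume" steps per sample: the timer ticks left over between
 * one sample interrupt and the next. */
static unsigned long pwm_steps(unsigned long sample_rate_hz)
{
    return PIT_HZ / sample_rate_hz;   /* 1193180 / 22050 = 54 */
}

int main(void)
{
    unsigned long rates[] = { 8000UL, 11025UL, 22050UL, 44100UL };
    for (int i = 0; i < 4; i++)
        printf("%5lu Hz -> %3lu PWM steps\n", rates[i], pwm_steps(rates[i]));

    /* The stock BIOS tick expects PIT_HZ / 65536 (~18.2 Hz), so a driver
     * interrupting at 22050 Hz chains to the old handler once every
     * ~1211 of its own interrupts to keep the system clock honest. */
    printf("chain to the old handler every %lu audio interrupts\n",
           22050UL * 65536UL / PIT_HZ);
    return 0;
}

Running it gives 149, 108, 54 and 27 steps for the four rates, which is why the 22 kHz playback everyone remembers was only about 54 levels deep.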
Implications (Score:1)
Re: (Score:1)
Xbox360 has 40 early Radeon unified shader cores. This thing has 8 Mali cores.
Or hams (Score:1)
Good. 'cause my single-core Droid Charge can take up to 5s to respond to a button press on a web page. It would do MS Bloatware proud.
Too many Ice Cream Sandwiches, I'm guessin'.
battery life... (Score:1)
That sound you hear is your battery life flying out the window... I mean really, do we need to add this capability to a tablet and a phone? Mobile phone battery life decreases with each generation, so let's add a GPU! /facepalm
Re: (Score:1)
If graphics output can be off-loaded from the CPU to the GPU, which can do these computations more efficiently (read: use less energy to do so), that improves battery life. On the other hand, if the GPU is used for everything it can do...
Re: (Score:3)
Not sure if you're trolling or just woefully unaware of the state of smartphones, but my Original Motorola Droid A855 from Nov 2009 (which I still use as my primary mobile) has a PowerVR discrete GPU. To this day the battery life is still tolerable on the original battery, but I also cut my smartphone teeth on the UTStarcom PPC6700 and later HTC 6800 so I knew what I was in for.
Let's fast-forward nearly 3 years and compare my Droid to the Motorola Droid Razr Maxx. Huge screen, faster in every possible way,
Can I underclock it? (Score:2)
More efficiency is great. Can I underclock it and get a smartphone that'll last 3 days on a charge?
I have friends with 'smart'phones that last 8 hours on a charge. My 5-year-old 'feature phone' can regularly get 3 days, and its silicon is on an ancient process.
Re: (Score:1)
Re: (Score:2)
Thanks, that's all I want. CM wiki says the XT is GSM only, but I'll look for a CDMA version (all that exists in these parts).
Re: (Score:1)