Hardware

ARM Unveils Three Second-Generation Mali GPUs

Barence writes "ARM has taken the lid off three new Mali T600 graphics chips that form the second generation of its mobile Midgard architecture. Designed for use in smartphones, tablets and smart TVs, the three chips range from four to eight cores, improve performance by 'up to 50%' and offer greater efficiency. ARM expects devices to begin appearing with the chips this time next year."
  • by Hatta ( 162192 ) on Monday August 06, 2012 @03:24PM (#40898115) Journal

    That's like 20 Mali GPUs generated per minute, or 1200 per hour.

    • Re: (Score:3, Funny)

      by Anonymous Coward

      They are recompiled from HDL (Hardware Descriptive Langauge), so making up a new one can be as fast as a new FireFox rev.

      • They are recompiled from HDL (Hardware Descriptive Langauge), so making up a new one can be as fast as a new FireFox rev.

        HDL = Hardware Description Language.

        There, fixed that for you.

  • by rickb928 ( 945187 ) on Monday August 06, 2012 @03:28PM (#40898143) Homepage Journal

    I won't be eligible for a subsidized upgrade until this time next year.

  • Open source drivers? (Score:5, Informative)

    by Curupira ( 1899458 ) on Monday August 06, 2012 @03:35PM (#40898217)
    It seems that ARM will soon release open source drivers [phoronix.com] for those babies...
    Unfortunately, they will not be *fully* open source:

    For the ARM Mali T6xx Linux enablement, they are only using a DRM driver for driving the display controller while they have their own separate kernel driver for poking and handling the GPU itself. ARM though isn't being too open-source friendly in terms of a fully open stack or providing proper documentation.

    Good grief.
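
    For the record, here's a minimal sketch of what that split stack looks like from user space: the display controller is driven through the standard DRM node, while the GPU itself sits behind ARM's own kernel driver and a separate device node. The node names are assumptions for illustration ("/dev/dri/card0" is the conventional DRM node; "/dev/mali0" is what ARM's binary stack typically expects):

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Display controller: the open-source DRM/KMS driver */
        int drm_fd = open("/dev/dri/card0", O_RDWR);
        /* GPU: ARM's separate, mostly closed kernel driver (assumed node name) */
        int mali_fd = open("/dev/mali0", O_RDWR);

        printf("DRM node:  %s\n", drm_fd >= 0 ? "present" : "missing");
        printf("Mali node: %s\n", mali_fd >= 0 ? "present" : "missing");

        if (drm_fd >= 0) close(drm_fd);
        if (mali_fd >= 0) close(mali_fd);
        return 0;
    }

    Everything interesting (job submission, memory management) happens behind that second node through ARM's own interface, which is why a DRM driver for the display alone doesn't make the stack open.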

    • by serviscope_minor ( 664417 ) on Monday August 06, 2012 @04:24PM (#40898737) Journal

      Unfortunatelly, they will not be *fully* open source:

      Shame.

      It's silly really. Intel have shown that OSS 3D drivers are perfectly feasible.

      Actually, Intel graphics are pretty much the best choice unless you need high performance for something, simply because the drivers are so solid and reliable.

    • by mikael ( 484 ) on Monday August 06, 2012 @05:10PM (#40899213)

      It's been mentioned before. Many times the hardware may have features that are not yet fully developed or licensed. I remember Nvidia saying this about their Linux driver and shader languages - that GLSL wasn't licensed for Linux or something. Then you have patent trolls who will jump at every keyword as evidence that you are violating the patents they bought off Craigslist or eBay. Your mention of a "geometry tree" might mean a list of vertices; to them, it's their super-spiffy collision detection algorithm, so you end up paying lawyers large sums of money to fight it out on your behalf. You might also get the odd nutter or academic department who will go off and patent a future feature enhancement documented in the comments.

      It's like the early days of the PC, when somebody figured out a way of reprogramming the real-time clock to drive 8 kHz interrupts so low-quality sound could be played on the internal speaker. They actually patented it, even though it completely scrambled all file update and creation times.

      Given the rate of change that is going on (GLES 3.0 [khronos.org] is out today), it would be very confusing for developers if ARM had one version of the driver that supported the current standard and the user community had their own with custom extensions.

      • by don.g ( 6394 )

        From a pedant who remembers how this worked:

        * You could get more than 8 kHz. But the number of PWM steps you had was 1.193180 MHz / sampling frequency - so only ~54 at 22.050 kHz. The higher sampling frequencies made the PWM "whine" less audible.
        * As long as you called the original timer interrupt code at the correct frequency (1.193180 MHz / 65536, ~18 Hz), the system clock would stay accurate. Of course, if you failed to do this, it wouldn't.
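
        A quick sanity check of those numbers, assuming the standard 1.193180 MHz input clock of the PC's 8253/8254 interval timer (the sampling rates below are just illustrative values):

        #include <stdio.h>

        #define PIT_HZ 1193180.0 /* 8253/8254 timer input clock, in Hz */

        int main(void)
        {
            /* PWM resolution available per sample at a few sampling rates */
            double rates[] = { 8000.0, 11025.0, 22050.0 };
            for (int i = 0; i < 3; i++)
                printf("%5.0f Hz sampling -> ~%.0f PWM steps per sample\n",
                       rates[i], PIT_HZ / rates[i]);

            /* Chaining the stock BIOS tick: the maximum divisor of 65536 */
            printf("Stock timer tick: %.2f Hz\n", PIT_HZ / 65536.0);
            return 0;
        }

        This prints ~54 steps at 22.050 kHz and an ~18.2 Hz stock tick, matching the figures above.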

  • When the T604 was announced, ARM VP Lance Howarth made bold claims about the chip’s gaming potential, saying “we’re really now in the realm of an Xbox 360 in a mobile phone”.

    • by Anonymous Coward

      The Xbox 360 has 48 early Radeon unified shader cores. This thing has 8 Mali cores.

  • Good. 'cause my single-core Droid Charge can take up to 5s to respond to a button press on a web page. It would do MS Bloatware proud.

    Too many Ice Cream Sandwiches, I'm guessin'.

  • That sound you hear is your battery life flying out the window... I mean really, do we need to add this capability to a tablet and phone? Mobile phone battery life decreases with each generation, so let's add a GPU! /facepalm

    • by Anonymous Coward

      If graphics output can be off-loaded from the CPU to the GPU, which can do these computations more efficiently (read: use less energy to do so), that improves battery life. On the other hand, if the GPU is used for everything it can do...

    • Not sure if you're trolling or just woefully unaware of the state of smartphones, but my Original Motorola Droid A855 from Nov 2009 (which I still use as my primary mobile) has a PowerVR discrete GPU. To this day the battery life is still tolerable on the original battery, but I also cut my smartphone teeth on the UTStarcom PPC6700 and later HTC 6800 so I knew what I was in for.

      Let's fast-forward nearly 3 years and compare my Droid to the Motorola Droid Razr Maxx. Huge screen, faster in every possible way,

  • More efficiency is great. Can I underclock it and get a smartphone that'll last 3 days on a charge?

    I have friends with 'smart'phones that last 8 hours on a charge. My 5-year old 'feature phone' can regularly get 3 days, and its silicon is on an ancient process.

    • My Cliq XT running CM7 can easily last 3 or 4 days with mobile data turned off and Wifi used sparingly every day.
      • Thanks, that's all I want. CM wiki says the XT is GSM only, but I'll look for a CDMA version (all that exists in these parts).

        • I think it's more of just how you use it. Your friends' phones that last less than a day probably have data plans with 3G turned on 24/7. I think any android phone with decent battery capacity will do fine if you don't use 3G and GPS.
