
Intel Details Handling Anti-Aliasing On CPUs

Posted by timothy
from the upping-the-spec-of-normalcy dept.
MojoKid writes "When AMD launched the Barts GPU that powers the Radeon 6850 and 6870, they added support for a new type of anti-aliasing called Morphological AA (MLAA). However, Intel originally developed MLAA in 2009 and has released a follow-up paper on the topic--including a discussion of how the technique could be handled by the CPU. Supersampling is much more computationally and bandwidth intensive than multisampling, but both techniques generally demand more horsepower than modern consoles or mobile devices can provide. Morphological anti-aliasing, in contrast, is performed on an already-rendered image. The technique is embarrassingly parallel and, unlike traditional hardware anti-aliasing, can be handled effectively by the CPU in real time. MLAA is also equally compatible with ray-traced and rasterized graphics."
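The summary's point that MLAA operates on an already-rendered image can be illustrated with a toy post-process filter. This is a simplified sketch, not Intel's actual algorithm: real MLAA classifies edge shapes (L, Z and U patterns) and computes coverage-based blend weights, whereas this version merely detects luminance discontinuities between neighboring pixels and blends across them. All names here are invented for illustration.

```python
import numpy as np

def postprocess_aa(img, threshold=0.1):
    """Toy post-process anti-aliasing on a rendered H x W x 3 float image.

    Detects luminance jumps between adjacent pixels and replaces the
    pixel below/right of each jump with the average across the edge.
    """
    # Per-pixel luminance via the usual Rec. 601 weights.
    luma = img @ np.array([0.299, 0.587, 0.114])
    out = img.copy()

    # Horizontal edges: blend vertically where luminance jumps row-to-row.
    edge_h = np.abs(luma[1:, :] - luma[:-1, :]) > threshold
    blend_v = 0.5 * (img[1:, :] + img[:-1, :])
    out[1:, :][edge_h] = blend_v[edge_h]

    # Vertical edges: blend horizontally where luminance jumps col-to-col.
    edge_v = np.abs(luma[:, 1:] - luma[:, :-1]) > threshold
    blend_h = 0.5 * (img[:, 1:] + img[:, :-1])
    out[:, 1:][edge_v] = blend_h[edge_v]
    return out
```

Because the filter reads only the finished framebuffer, it works the same whether that image came from a rasterizer or a ray tracer, which is the compatibility point the summary makes.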
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by djdanlib (732853) on Sunday July 24, 2011 @08:16PM (#36866388) Homepage

    Well whaddya know, it's in Wikipedia. That makes it officially okay, right?

    I still think it's a poorly worded phrase.

  • by guruevi (827432) <evi@smo k i n g c ube.be> on Sunday July 24, 2011 @08:18PM (#36866408) Homepage

    If the task is 'embarrassingly parallel' and simple, then the GPU would be a better fit. GPUs typically have a lot of cores (200-400) that are optimized for embarrassingly simple calculations. Sure, you could render everything on a CPU these days; simpler games could even run with an old-school SVGA (simple framebuffer) card and let all the graphics be handled by the CPU, as used to be the case in the '90s and as evidenced by the 'game emulators in JavaScript' we've been seeing lately. But GPUs usually sit fairly unused except in ultramodern 3D shooters, which also tax the CPU pretty hard.

  • by dicobalt (1536225) on Sunday July 24, 2011 @09:24PM (#36866816)
    It can work on any DX9 GPU without dedicated support. http://hardocp.com/article/2011/07/18/nvidias_new_fxaa_antialiasing_technology/1 [hardocp.com]
  • by bored (40072) on Monday July 25, 2011 @12:24AM (#36867606)

    AA is a crutch to get around a lack of DPI. Take the iPhone 4 at 326 DPI: it's 3 to 4x the DPI of the average craptastic "HD" computer monitor. I have a laptop with a 15" 1920x1200 screen. At that DPI, seeing the "jaggies" is pretty difficult compared with the same resolution on my 24". On the 15" I can turn AA on/off and it's pretty difficult to discern the difference. That monitor is only ~150 DPI. I challenge you to see the effects of anti-aliasing on a screen with a DPI equivalent to the iPhone 4.

    The PlayStation/Xbox, on the other hand, are often used on TVs with DPIs approaching 30. If you get within a couple feet of those things, the current generation of game machines looks like total crap. Of course the game machines have AC power, so there really isn't an excuse. I've often wondered why Sony/MS haven't added AA to one of the respun versions of their consoles.

  • by isaac (2852) on Monday July 25, 2011 @12:58AM (#36867736)

    I challenge you to see the effects of anti-aliasing on a screen with a DPI equivalent to the iPhone 4.

    The eye is pretty good at picking out jaggies, especially in tough cases (high contrast, thin lines, shallow slopes against the pixel grid) and where the screen is viewed from close range (my eye is closer to my phone's screen than to my desktop monitor).

    Now, I don't think anti-aliasing makes a huge difference to game mechanics, but it is nice to have in high-contrast informational contexts (e.g. Google Maps) regardless of the pixel pitch of the underlying display.
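The 'embarrassingly parallel' property the summary claims (and guruevi's comment debates) just means each scanline or tile of the rendered image can be filtered with no dependence on the others, so the work splits cleanly across CPU cores. A minimal sketch, with a made-up per-row box filter standing in for a real AA pass:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def smooth_row(row):
    # Stand-in per-scanline filter: average each pixel with its
    # left/right neighbors (edges replicated).
    padded = np.pad(row, 1, mode='edge')
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

def filter_image_parallel(img, workers=4):
    # Rows are independent, so they can be dispatched to a worker pool
    # with no locking or communication between tasks.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.array(list(pool.map(smooth_row, img)))
```

A real CPU implementation would use SIMD within each row and one thread per core rather than a Python thread pool, but the decomposition is the same.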
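The DPI figures bored quotes can be checked with basic geometry: pixel density is the diagonal resolution in pixels divided by the diagonal size in inches.

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(960, 640, 3.5)))    # iPhone 4: ~330 (Apple quotes 326)
print(round(dpi(1920, 1200, 15)))   # 15" laptop panel: ~151, i.e. ~150 DPI
print(round(dpi(1920, 1080, 75)))   # 75" 1080p TV: ~29, approaching 30
```

The numbers line up with the comment: the laptop panel lands near 150 DPI, and a large 1080p living-room TV really does drop toward 30 DPI, which is why console jaggies are so visible up close.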

