
Intel Details Handling Anti-Aliasing On CPUs

MojoKid writes "When AMD launched their Barts GPU that powers the Radeon 6850 and 6870, they added support for a new type of anti-aliasing called Morphological AA (MLAA). However, Intel originally developed MLAA in 2009 and has now released a follow-up paper on the topic--including a discussion of how the technique can be handled by the CPU. Supersampling is much more computationally and bandwidth intensive than multisampling, but both techniques generally demand more horsepower than modern consoles or mobile devices can provide. Morphological anti-aliasing, in contrast, is performed on an already-rendered image. The technique is embarrassingly parallel and, unlike traditional hardware anti-aliasing, can be handled effectively by the CPU in real time. MLAA also works equally well with ray-traced and rasterized graphics."
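
The paper has the full details; purely as a rough, hypothetical sketch of the post-process idea (not Intel's actual algorithm), an MLAA-style pass boils down to finding color discontinuities in the finished frame and blending across them. In the NumPy snippet below, the function names and the flat 50/50 blend are illustrative inventions; real MLAA classifies the shapes of detected edges and derives blend weights from estimated coverage.

    # Hypothetical, heavily simplified MLAA-style post-process pass.
    # NOT Intel's implementation: real MLAA classifies edge patterns
    # and computes per-pixel blend weights from estimated coverage.
    import numpy as np

    def luminance(img):
        # Per-pixel luma of an (H, W, 3) float RGB image in [0, 1].
        return img @ np.array([0.299, 0.587, 0.114])

    def mlaa_like_pass(img, threshold=0.1):
        luma = luminance(img)
        out = img.copy()
        # Horizontal edges: a pixel differs sharply from the one below it.
        edge = np.abs(luma[1:, :] - luma[:-1, :]) > threshold
        # Blend the two pixels straddling each detected edge.
        blend = 0.5 * (img[1:, :][edge] + img[:-1, :][edge])
        out[1:, :][edge] = blend
        out[:-1, :][edge] = blend
        # A full pass would treat vertical edges the same way.
        return out

Since every output pixel depends only on the input frame, the work splits across cores with no communication, which is the embarrassingly parallel property the summary refers to.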
  • by MojoKid ( 1002251 ) * on Sunday July 24, 2011 @08:11PM (#36866332)
    I think you need to do your research before being critical... embarrassingly critical it appears.
  • by Anonymous Coward on Sunday July 24, 2011 @08:18PM (#36866410)

    Do you have any sort of formal (that is, university-level) training in Computer Science or Computer Engineering? Based on your comment, it really doesn't look like you have any at all.

    Like others have already pointed out, "embarrassingly parallel" is a very legitimate and correct term to use in the field of parallel computing. It may sound funny to you, but it's a term used by the experts. In fact, it's such a core concept that even most undergraduates are well aware of it and what it means.

    This is the sort of shit I see time and time again from Rails "developers" and JavaScript "programmers". Such people have no real training whatsoever, yet somehow believe themselves to be experts in the field. They go out and make blatantly ignorant and incorrect comments on various social media sites, and then wonder why actual professionals and academics think that these Ruby and JavaScript users are idiots.

  • by LanceUppercut ( 766964 ) on Sunday July 24, 2011 @08:25PM (#36866478)
    Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. But it cannot possibly be carried out on an already-rendered image. They must be trying to market some glorified blur technique under the anti-aliasing moniker. Nothing new here...
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Sunday July 24, 2011 @08:26PM (#36866482) Homepage

    "Embarrassingly parallel" refers to a problem made up of many isolated tasks -- such as running a fragment (pixel) shader on millions of different fragments, or a HTTP server handling thousands of clients -- that can all be run concurrently without any communication between them.

    It's odd that they use that term here, because the other anti-aliasing techniques are embarrassingly parallel as well.

    SSAA (super-sampling) always renders each pixel n times, at various locations within the pixel, and blends the results together (see the sketch after this comment).

    MSAA (multi-sampling) is basically the same as SSAA, but only works on polygon edges and is very dependent on proper mipmapping to reduce the aliasing introduced when scaling textures.
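
To make the no-communication property concrete, here is a toy Python supersampler; shade() is a made-up stand-in for a real renderer, and the scene is just a hard diagonal edge. Every row averages its own sub-pixel samples and never reads another pixel's data, so rows can be farmed out to worker processes independently.

    # Toy SSAA: each output pixel averages N x N sub-pixel samples of a
    # placeholder shade() function and needs no data from other pixels,
    # so rows can be distributed to workers with zero communication.
    from concurrent.futures import ProcessPoolExecutor

    WIDTH, HEIGHT, N = 64, 64, 4  # N x N samples per pixel

    def shade(x, y):
        # Stand-in scene: a hard diagonal edge (badly aliased at 1 sample).
        return 1.0 if y > x else 0.0

    def ssaa_row(y):
        row = []
        for x in range(WIDTH):
            total = sum(shade(x + (i + 0.5) / N, y + (j + 0.5) / N)
                        for i in range(N) for j in range(N))
            row.append(total / (N * N))
        return row

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            image = list(pool.map(ssaa_row, range(HEIGHT)))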

  • by Anonymous Coward on Sunday July 24, 2011 @08:46PM (#36866594)

    Uhh, it's not a new term at all. I distinctly remember it from my undergrad days, and those were in the early 1980s. In fact, I think we learned of it during one of our earliest introduction-to-computer-architecture courses. It was pretty basic knowledge that everyone in the program was assumed to know of and understand.

  • Re:Blur (Score:5, Informative)

    by djdanlib ( 732853 ) on Sunday July 24, 2011 @08:48PM (#36866608) Homepage

    It's different from a Gaussian blur or median filter because it attempts to be selective about which edges it blurs, and how it blurs those edges.

    This technique really wrecks text and GUI elements, though. When I first installed my 6950, I turned it on just to see what it was like, and it really ruined the readability of my games' GUIs. So, while it may be an effective AA technique, applications may need to be rewritten to take advantage of it.
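
The selectivity the parent describes can be shown with a toy 1-D contrast (hypothetical code; a plain contrast threshold stands in for MLAA's real shape-based edge classification). A Gaussian kernel softens every step it meets, while the selective filter rewrites only the samples on edges it picks out and leaves the rest bit-exact; by the same token, it will happily chew up any high-contrast text edge it selects.

    # Toy 1-D contrast between an unconditional Gaussian blur and an
    # edge-selective blend. The contrast threshold here is a crude
    # stand-in for MLAA's shape-based edge classification.
    import numpy as np

    signal = np.array([0., 0., 0., 1., 1., 1., 1.15, 1.15])

    def gaussian_blur(s):
        return np.convolve(s, [0.25, 0.5, 0.25], mode="same")

    def selective_blur(s, threshold=0.2):
        out = s.copy()
        for i in np.flatnonzero(np.abs(np.diff(s)) > threshold):
            out[i] = out[i + 1] = 0.5 * (s[i] + s[i + 1])
        return out

    print(gaussian_blur(signal))   # softens both steps indiscriminately
    print(selective_blur(signal))  # blends only the selected 0 -> 1 edge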

  • by thegarbz ( 1787294 ) on Monday July 25, 2011 @03:58AM (#36868352)

    Removing jaggies is not the only purpose of AA. The idea is also to be able to render objects that are smaller than the spatial resolution of the view. Imagine looking at the guy wire of a comms tower from a long way off: you may see a row of appearing and disappearing pixels, because on average the wire renders smaller than a pixel in width. AA takes care of this, which is far more annoying than mere sharp edges on objects.

    This glorified blurring algorithm, however, doesn't fix that.
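
The wire example is easy to reproduce numerically. In this hypothetical sketch, a 0.3-pixel-wide wire drifts across a single pixel: one centered sample flips between fully on and fully off (the appearing/disappearing effect), while 16 sub-pixel samples hold near the true coverage.

    # A 0.3-px-wide wire at three positions inside a single pixel.
    # One centered sample gives all-or-nothing hits; many sub-pixel
    # samples approximate the wire's true fractional coverage.
    WIDTH = 0.3  # wire width, in pixels

    def coverage(wire_left, samples):
        # Fraction of evenly spaced sample points covered by the wire.
        hits = sum(1 for s in range(samples)
                   if wire_left <= (s + 0.5) / samples < wire_left + WIDTH)
        return hits / samples

    for wire_left in (0.1, 0.35, 0.6):
        print(coverage(wire_left, 1), coverage(wire_left, 16))
    # prints:  0.0 0.25
    #          1.0 0.25
    #          0.0 0.25
    # The single sample flips between on and off as the wire drifts;
    # 16 samples stay at a stable fraction near the true 0.3 coverage.

A post-process like MLAA only ever sees the already-quantized pixels, so, as the parent says, it cannot recover the coverage that single-sampling threw away.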

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...