Intel Details Handling Anti-Aliasing On CPUs
MojoKid writes "When AMD launched their Barts GPU that powers the Radeon 6850 and 6870, they added support for a new type of anti-aliasing called Morphological AA (MLAA). However, Intel originally developed MLAA in 2009, and they have released a follow-up paper on the topic--including a discussion of how the technique could be handled by the CPU. Supersampling is much more computationally and bandwidth intensive than multisampling, but both techniques generally demand more horsepower than modern consoles or mobile devices are able to provide. Morphological Anti-aliasing, in contrast, is performed on an already-rendered image. The technique is embarrassingly parallel and, unlike traditional hardware anti-aliasing, can be handled effectively by the CPU in real time. MLAA is also equally compatible with ray-traced or rasterized graphics."
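For those wondering what "performed on an already-rendered image" looks like in practice, here is a minimal sketch of the post-process idea (my own illustration, not Intel's actual algorithm, which additionally classifies edge shapes and derives coverage-based blend weights): scan the finished frame for high-contrast neighbors and blend only there. Each output pixel depends only on a small neighborhood of the input image, which is why the pass is embarrassingly parallel and works the same whether the frame came from a rasterizer or a ray tracer.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Hypothetical post-process AA pass, roughly in the spirit of MLAA:
    // it operates on the final rendered image only, with no geometry info.
    // 'lum' is a row-major grayscale copy of the frame, values in [0,1].
    std::vector<float> edge_blend_pass(const std::vector<float>& lum,
                                       std::size_t w, std::size_t h,
                                       float edge_threshold = 0.1f)
    {
        std::vector<float> out = lum;
        for (std::size_t y = 1; y + 1 < h; ++y) {
            for (std::size_t x = 1; x + 1 < w; ++x) {   // each pixel is independent
                float c  = lum[y * w + x];
                float dx = std::fabs(c - lum[y * w + x - 1]);
                float dy = std::fabs(c - lum[(y - 1) * w + x]);
                if (dx > edge_threshold || dy > edge_threshold) {
                    // Blend with the 4-neighborhood only where an edge was found.
                    out[y * w + x] = 0.5f * c + 0.125f * (lum[y * w + x - 1] +
                                                          lum[y * w + x + 1] +
                                                          lum[(y - 1) * w + x] +
                                                          lum[(y + 1) * w + x]);
                }
            }
        }
        return out;
    }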
Why not leave it on the GPU? (Score:5, Insightful)
Intel wants for there to be no GPU (Score:3)
They want everything to run on the CPU, and thus for you to need a big beefy Intel CPU. Remember Intel doesn't have a GPU division. They make small integrated chips but they are not very powerful and don't stack up well with the low power nVidia/ATi stuff. What they make is awesome CPUs. So they really want to transition back to an all-CPU world, no GPUs.
They've been pushing this idea slowly with various things, mostly based around ray-tracing (which GPUs aren't all that good at).
Right now it is nothing but
Re: (Score:3)
Ray tracing can be very parallel, on GPUs (Score:3)
Many of the commercial ray tracing vendors have written GPU-based versions of their packages that work remarkably well.
V-Ray and mental ray, in particular, have very exciting GPU implementations. A presentation by mental images showed some very high-quality global illumination calculations done on the GPU. Once you get good sampling algorithms, the challenge is dealing with memory latency. It's very slow to do random access into memory on a GPU. mental images solved that problem by running a lot of threads, as GPU's con
Re: (Score:2)
Why is a GPU not good at ray tracing? As far as I know, they excel at tasks which exhibit massive dependency-free parallelism and lots of number crunching with little branching. It would seem to me that this describes ray tracing almost perfectly.
Re: (Score:2)
"They make small integrated chips but they are not very powerful and don't stack up well with the low power nVidia/ATi stuff."
Beg to differ. My last experience with nVidia vs. Intel graphics proved otherwise. In two laptops with otherwise more or less the same hardware (Core 2 Duo P-series processors, 4+ gigs of RAM, same chipset), the one with Intel graphics provided a much smoother video (HD and so on) experience (things like YouTube HD or Full HD H.264 in MKV), only marginally worse performance in 3D games,
Re: (Score:2)
Re: (Score:2)
If by Computational Intelligence you mean Neural networks, Evolutionary Computation or Fuzzy Logic, you should look for GPU use. You can achieve at least 10x, generally 100x, performance gains easily without making your code much more cumbersome or difficult to understand. Check this implementation of a neural network in C# and CUDA [codeproject.com] or some Fuzzy Logic [missouri.edu]. For portability, in the worst case where the computer can't have NVidia video cards, you still have MCUDA that will translate CUDA GPU processing into normal CP
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
why not leave it on the GPU?
A lot of modern PlayStation 3 games use that technique, as it lets them do something useful with the SPUs while the GPU is already busy enough rendering the graphics. It also helps with ray tracing, as you might need less CPU power to do anti-aliasing this way than with the proper approach. And when the GPU has some free cycles left, of course, there is no reason not to do it there.
Re: (Score:2)
Can't say I agree with this. Display pixel density is far too low for that... I'll agree with you when I'm using a 1080p 10" netbook at the same physical font sizes as now.
But for now, no way I'm turning off anti-aliasing (or cleartype or whatever) on regular 120-150dpi pixel densities...
Rain dances around Shannon (Score:3, Insightful)
If your signal is aliased during sampling, you are toast.
No voodoo will help you if your spectrum folded on itself.
So super-sample it or shut up.
Everything else is snake oil for the unwashed masses.
And yes, MLAA still looks like crap in comparison to SS.
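To put the Shannon point concretely: once a frequency above the Nyquist limit has been sampled, its samples are literally identical to those of some lower frequency, so no post-process can tell the two apart. A tiny sketch of my own (unit sample rate assumed):

    #include <cmath>
    #include <cstdio>

    // Two different cosines whose samples at integer n are identical:
    // cos(2*pi*0.9*n) == cos(2*pi*(1 - 0.1)*n) == cos(2*pi*0.1*n) for integer n.
    // Once sampled, no filter can recover which signal was really there.
    int main() {
        const double pi = 3.141592653589793;
        for (int n = 0; n < 8; ++n) {
            std::printf("n=%d  f=0.9: %+.6f   f=0.1: %+.6f\n",
                        n, std::cos(2 * pi * 0.9 * n), std::cos(2 * pi * 0.1 * n));
        }
        return 0;
    }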
Re: (Score:3)
Judging by the article, MLAA is actually just a technique that looks for jagged edges, and blurs them.
How this is better than just blurring the whole thing is beyond me. Those images look terrible.
You sure GPU's aren't better? (Score:4, Interesting)
If the system is 'embarrassingly parallel' and simple, then the GPU would be the better fit. GPUs typically have a lot of cores (200-400) that are optimized for embarrassingly simple calculations. Sure, you could render everything on a CPU these days; simpler games could even run with an old-school SVGA (simple frame buffer) card and let all the graphics be handled by the CPU, as used to be the case in the '90s and as evidenced by the 'game emulators in JavaScript' we've been seeing lately. But GPUs are usually fairly underused, except in the ultramodern 3D shooters, which also tax a CPU pretty hard.
Re: (Score:2)
Re: (Score:2)
Even if you have a good GPU it's still useful. In your typical game the GPU is generally the bottleneck, so if you can offload some stuff to the CPU, all the better. That's how it's done on the PS3. It has an OK GPU and a very good CPU, so a lot of graphics stuff is run on the CPU. In fact, even if it was invented by Intel, I believe it was the PS3 game developers who were the driving force behind MLAA's popularization.
Blur (Score:5, Insightful)
So, it basically blurs the image around areas of high contrast? Sounds like that's what's going on. Looks like it, too. I can understand why they are targeting this at mobile and lower-powered devices: it kinda looks crappy. I might even say that no anti-aliasing looks better, but I'd really have to see more samples, especially contrasting this with regular MSAA. I suspect, however, that normal anti-aliasing will always look considerably better. For instance, normal AA would not blur the edge between two high-contrast textures on a wall (I think, since it is actually aware that it is processing polygon edges), while I suspect MLAA will, since it only sees an area of high contrast. Look at the sample image they post in the article: the white snow on the black rock looks blurred in the MLAA-processed picture, while it has no aliasing artifacts at all in the unprocessed image. It's pretty slight, but it's definitely there. Like I say, I need to see more real-world renders to really tell if it's a problem at all or simply a minor thing no one will ever notice. I'll stick to my 4X MSAA, TYVM.
Re: (Score:2)
Re:Blur (Score:5, Informative)
It's different from a Gaussian blur or median filter because it attempts to be selective about which edges it blurs, and how it blurs those edges.
This technique really wrecks text and GUI elements, though. When I first installed my 6950, I turned it on just to see what it was like, and it really ruined the readability of my games' GUIs. So, while it may be an effective AA technique, applications may need to be rewritten to take advantage of it.
Bigger text (Score:2)
[Morphological AA postprocessing] really ruined the readability of my games' GUIs. So, while it may be an effective AA technique, applications may need to be rewritten to take advantage of it.
Just as games and other applications supporting a "10-foot user interface" need to be rewritten with larger text so that the text is not unreadable when a game is played on a standard-definition television. The developers of Dead Rising found this out the hard way.
Re: (Score:2)
I remember text in Dead Rising was totally unreadable. But I don't think there was any backlash, was there?
Wikipedia tells all [wikipedia.org].
[Unreadably small text] is true for most sports coverage in SDTV these days, especially EuroSport.
I was born outside Europe, so I'm only familiar with sports coverage on NBC, ABC, CBS, Fox, and ESPN. They at least try to keep the scores readable on SDTV.
Re: (Score:2)
Anti-aliasing is not supposed to blur edges arbitrarily. I suppose that's why this is selective, but it just seems like a crappy thing to be doing. And while it can be done by a CPU, that's probably not practical - either the CPU is busting its ass to do rendering and doesn't really have time to make an AA pass, or the GPU did all the rendering and may as well d
Re: (Score:2)
What I saw in non-technical terms: It appeared to blur edges whose location or neighboring pixels changed from one frame to the next. Unfortunately, whenever something changed behind text and GUI elements, it went right ahead and blurred those edges as well.
Re: (Score:2)
Re: (Score:2)
If it blurs the text and GUI then it's poorly implemented. The AA should be applied before drawing the UI.
Re: (Score:2)
Some comparison screen shots [ign.com]: essentially it performs extremely well on clean, high-contrast edges, but can lead to ugly blurring when the source image contains heavily aliased areas (i.e. small sub-pixel-width lines in the background). There are also some temporal issues, as some of the artifacts it causes get worse when the scene is animated. Overall I'd say it's a clear improvement; not perfect, but when you are stuck with 1280x720 on a 44" TV you are happy about any anti-aliasing you can get.
Re: (Score:2)
And the best part is... it looks like crap! (Score:2)
MLAA looks awful - and Intel's CEO famously knocked anti-aliasing as being a stupid blurring technique not long ago. So, he goes with the only form of AA that literally adds no value. Cutting off their nose to spite their face?
parallel (Score:2)
Yes, it's faster than MSAA, but modern GPUs are already pretty good at handling real-time MSAA.
Embarrassingly parallel (Score:2)
So, it is not anti-aliasing at all... (Score:5, Informative)
hq3x (Score:2)
Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. It cannot possibly be carried out on an already rendered image.
Ever heard of hq3x [wikipedia.org]? Or the pixel art vectorizer we talked about two months ago [slashdot.org]?
Even if it isn't technically AA (Score:3)
Your definition of anti-aliasing is off by a long shot.
Both this edge blending technique and pixel art upscalers work by guessing underlying shapes based on the corners within high-contrast edges in the image. Pixel art upscalers aren't the same as hand-drawing the image at a higher pixel density, but they still produce a picture with some of the same desirable qualities. Likewise, even if this sort of edge blending isn't the same as proper anti-aliasing, it still produces a picture with some of the same desirable qualities.
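To make the "guessing shapes from corners" idea concrete, here is a sketch of the Scale2x/EPX rule (a simpler cousin of hq3x, not hq3x itself): whenever two adjacent neighbors of a pixel match, the upscaler assumes a diagonal edge passes through that corner and fills in the new pixels to match that guess - data that was never in the original image.

    #include <algorithm>
    #include <vector>

    // Sketch of the Scale2x/EPX rule: each input pixel becomes a 2x2 block,
    // and a corner of the block is copied from a neighbor when two adjacent
    // neighbors agree, i.e. the algorithm infers a diagonal edge from the
    // corner pattern and invents pixels consistent with it.
    std::vector<int> scale2x(const std::vector<int>& src, int w, int h)
    {
        auto at = [&](int x, int y) {
            x = std::clamp(x, 0, w - 1);      // clamp so border pixels have neighbors
            y = std::clamp(y, 0, h - 1);
            return src[y * w + x];
        };
        std::vector<int> dst(4 * w * h);
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                int P = at(x, y);
                int A = at(x, y - 1), B = at(x + 1, y), C = at(x - 1, y), D = at(x, y + 1);
                int tl = P, tr = P, bl = P, br = P;
                if (C == A && C != D && A != B) tl = A;
                if (A == B && A != C && B != D) tr = B;
                if (D == C && D != B && C != A) bl = C;
                if (B == D && B != A && D != C) br = D;
                dst[(2 * y)     * (2 * w) + 2 * x]     = tl;
                dst[(2 * y)     * (2 * w) + 2 * x + 1] = tr;
                dst[(2 * y + 1) * (2 * w) + 2 * x]     = bl;
                dst[(2 * y + 1) * (2 * w) + 2 * x + 1] = br;
            }
        }
        return dst;
    }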
Making up data (Score:2)
What you want with anti-aliasing is to show more actual data from the scene - the vectorizer, Eagle, etc., all just make up data.
This edge-blending technique also makes up data, and it makes up the blending amounts in almost the exact same way that the extrapolators make it up. MLAA just presents the made-up data in a way that looks like AA.
Re: (Score:2)
Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. It cannot possibly be carried out on an already rendered image.
Sir –
Anti-aliasing can be performed on a rendered image by performing image recognition, i.e. vectorization. This is doable with edges of the geometric sort (i.e. straight lines, simple curves) and pre-existing patterns (e.g. glyphs of known fonts at given sizes). The result is probably absurd in terms of performance; however, "cannot possibly be carried out" is a bit too strong, in my humble opinion. It may be impractical, but it's certainly not impossible.
Re: (Score:2)
And what if the object you're applying your magic filter on is smaller than the available spatial resolution? (Think guard rails on a building far away.) AA accurately renders these, but if they aren't properly rendered to begin with you can't recreate them without knowing what they were supposed to look like.
Re: (Score:2)
There are no pixels in object space. It's an operation on pixels. But I agree with your second half - it's some new blur method that probably isn't worth it. Nothing to see here - in fact things are harder to see.
Re: (Score:2)
There are no pixels in object space.
A pixel in object space is a frustum. Performing anti-aliasing at this level not only can be done, but is frequently done within the VFX world. Remember that VFX shaders tend to be a single unified shader - instead of multi-stage vertex/geom/pixel - so calculations can be performed in any space you want. For a procedural-shader heavy scene, the ideal would be to get the shaders to perform the anti-aliasing for you, in object space, rather than resorting to super-sampling....
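A rough sketch of what "let the shader anti-alias itself" means for a procedural pattern (my own illustration, not any particular renderer's code): instead of returning a hard step at a pattern edge, the shader blends over the footprint that one screen pixel covers in the pattern's own coordinate space - the quantity a fragment shader would get from fwidth() - so the returned color is already filtered and no supersampling is needed.

    #include <algorithm>
    #include <cmath>

    // Filtered procedural stripe pattern, the usual trick for shader-side AA.
    // 'u' is the pattern coordinate; 'fw' is how much 'u' changes across one
    // screen pixel (what fwidth() would report in a fragment shader).
    float filtered_stripe(float u, float fw)
    {
        float t = u - std::floor(u);              // position within one stripe period [0,1)
        if (fw < 1e-6f)                           // negligible footprint: plain hard edge
            return t < 0.5f ? 0.0f : 1.0f;
        // Ramp linearly across the pixel footprint around the edge at t = 0.5
        // (assumes the footprint is smaller than one stripe period).
        return std::clamp((t - (0.5f - 0.5f * fw)) / fw, 0.0f, 1.0f);
    }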
Re: (Score:2)
I would call it Partial Gaussian Blur, since that is effectively what they are doing. They are blurring the sharp edges of the image.
AA always amuses me, for historical reasons... (Score:2)
Up until quite recently, with high-speed digital interfaces nowhere near what video of any real resolution required, and high-bandwidth analog components very expensive, AA was just something that happened naturally, whether you liked it or not: your not-at-all-AAed digital frame went to the RAMDAC (which, unless you had really shelled out, could likely have been a bit lax about accuracy in exc
Re: (Score:2)
It's funny - years ago I went on a hunt for the sharpest flat-front CRT I could find to see the "edges of pixels" and would not have it any other way for old bitmap games, but in modern high-res games on modern high-res monitors, if you don't have it, it just sprinkles the entire screen with jaggie noise.
Re: (Score:2)
Analog blur is different; but it also had the e
FXAA is a better choice (Score:5, Interesting)
Re: (Score:2)
Yeah, funny how this popped up only days after the FXAA announcement last week.
Re: (Score:2)
In the early days perhaps. These days most PS3 games use MLAA or variants of it running on the CPU.
decent phones don't need AA (Score:4, Interesting)
AA is a crutch to get around a lack of DPI. Take the iPhone 4 at 326 DPI: it is 3 to 4x the DPI of the average craptastic "HD" computer monitor. I have a laptop with a 15" 1920x1200 screen. At that DPI, seeing the "jaggies" is pretty difficult compared with the same resolution on my 24". On the 15" I can turn AA on/off and it's pretty difficult to discern the difference. That monitor is only ~150 DPI. I challenge you to see the effects of anti-aliasing on a screen with a DPI equivalent to the iPhone 4.
The PlayStation/Xbox, on the other hand, are often used on TVs with DPIs approaching 30. If you get within a couple of feet of those things, the current generation of game machines look like total crap. Of course the game machines have AC power, so there really isn't an excuse. I've often wondered why Sony/MS haven't added AA to one of the respun versions of their consoles.
Re: (Score:2, Interesting)
The eye is pretty good at picking out jaggies, especially in tough cases (high contrast, thin line, shallow slope against the pixel grid) and where the screen is viewed from close range (my eye is closer to my phone's screen than my desktop monitor).
Now, I don't think antialiasing makes a huge deal to game mechanics - but it is nice to have in high-contrast information situations (e.g. google maps) regard
Re: (Score:2)
The eye is pretty good at picking out jaggies, especially in tough cases (high contrast, thin line, shallow slope against the pixel grid)
In my case, on the higher-res displays, it's not the line stepping that is the problem so much as "crawling". In other words, the position of the step is moving around in an otherwise static display. That said, I would take a 2x DPI increase in any application. Of course, I'm the guy fighting to turn off ClearType because I can't stand the color bleeding.
Re:decent phones don't need AA (Score:4, Informative)
Seeing jaggies is not the only purpose of AA. The idea is also to be able to render objects that are smaller than the spatial resolution of the view. Think of looking at the guy wire of a comms tower a long distance away. You may see a row of appearing/disappearing pixels, as on average the wire is rendered smaller than a pixel width. AA takes care of this, which is far more annoying than simply a resolution issue of sharp edges on objects.
This glorified blurring algorithm, however, doesn't fix this.
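To make the guy-wire example concrete, here is a toy sketch with made-up numbers: a wire 0.2 pixels wide drifting across one pixel. A single sample at the pixel center either hits it or misses it entirely, so the wire pops in and out as it moves; averaging many sub-pixel samples gives a stable partial coverage instead.

    #include <cstdio>

    // Toy example: a vertical 'wire' 0.2 px wide drifting across one pixel.
    // One center sample is all-or-nothing; 16 sub-pixel samples give a
    // stable fractional coverage, which is what supersampling buys you.
    static bool hits_wire(double x, double wire_center, double wire_width = 0.2) {
        return x >= wire_center - wire_width / 2 && x <= wire_center + wire_width / 2;
    }

    int main() {
        for (double center = 0.1; center < 1.0; center += 0.2) {
            double single = hits_wire(0.5, center) ? 1.0 : 0.0;  // center sample only
            int hits = 0;
            for (int i = 0; i < 16; ++i)                         // 16 samples across x
                if (hits_wire((i + 0.5) / 16.0, center))
                    ++hits;
            std::printf("wire at x=%.1f: single sample %.2f, supersampled %.2f\n",
                        center, single, hits / 16.0);
        }
        return 0;
    }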
Re: (Score:3)
The idea is also to be able to render objects that are smaller than the spatial resolution of the view. Think of looking at the guy wire of a comms tower a long distance away.
Yes, you're right, but as I suggested it's a hack to get around lack of resolution. Forcing a fudge factor (a bunch of large grey pixels) in may not always be the best response. Plus, it's limited by the oversampling ratio. What I'm arguing is that a display with 2x the DPI will look better than a 2x oversample. Eventually increasing either
Re: (Score:2)
Fair point, though you miss the sub-pixel object rendering as somebody else already commented.
But even aside from that, while we are on the phone subject:
If you look at, for example, the Samsung Galaxy S2 (arguably the mobile performance king - at least until the new iPhone is out?), it does 4x MSAA without any performance hit and 16x with a very small one. I do believe we will see many mobile devices with the same GPU or its successor in future phones. Granted, its DPI is only around ~230, not ~330 like the iP
Re: (Score:3)
AA is a crutch to get around a lack of DPI
No, even with higher DPI than the eye can resolve, you still need AA sometimes, as aliasing can present other problems than the jagged lines you're familiar with (e.g. moiré patterns).
This is NOT antialiasing (Score:2)
This is image reconstruction, where additional information (not necessarily correct) is derived from a limited image.
Close equivalents are the "font smoothing" done by the earliest versions of Macintosh for printing their bitmap graphics on a PostScript printer to draw 72 dpi 1-bit images at 300 dpi. Also I believe Microsoft's earliest subpixel font rendering, smoothtype, was done this way (not cleartype or any other modern font rendering).
Much more complicated examples are algorithms for scaling up images,
Comparable? (Score:2)
distance field alpha testing (Score:2)
I am reminded of this :
"Text Rendering in the QML Scene Graph"
http://labs.qt.nokia.com/2011/07/15/text-rendering-in-the-qml-scene-graph/ [nokia.com]
"Some time ago, Gunnar presented to you what the new QML Scene Graph is all about. As mentioned in that article, one of the new features is a new technique for text rendering based on distance field alpha testing. This technique allows us to leverage all the power of OpenGL and have text like we never had before in Qt: scalable, sub-pixel positioned and sub-pixel antialiase
Re:Bah, humbug, tech writers need help (Score:5, Informative)
http://en.wikipedia.org/wiki/Embarrassingly_parallel [wikipedia.org]
Re: (Score:2)
Glad to see that amateur journalists have created an article in Wikipedia whose "sources remain unclear because it lacks inline citations."
Re: (Score:2)
Re: (Score:2, Informative)
Uhh, it's not a new term at all. I distinctly remember it from my undergrad days, and those were in the early 1980s. In fact, I think we learned of it during one of our earliest introduction-to-computer-architecture courses. It was pretty basic knowledge that everyone in the program was assumed to know of and understand.
Re:Bah, humbug, tech writers need help (Score:5, Insightful)
It's a term of art commonly used in the field for a very long time. That you don't like it really doesn't matter at all to anyone but you.
Re: (Score:3)
"I still think it's a poorly worded phrase."
Yeah. Embarrassingly poorly worded.
Re: (Score:2)
I think you've done a good job of pointing out what's wrong with the phrase, actually... For those in a technical field, or those who took a moment to read the Wikipedia article (and yes, usual caveats about believing everything you read in Wikipedia, just ask Stephen Colbert about elephants...), the phrase Embarrassingly Parallel has a specific meaning. A journalist or non-technical reader, however, will probably assume that it is an exclamation that doesn't actually add anything to the meaning, as though they'r
Re: (Score:2)
Re:Bah, humbug, tech writers need help (Score:4, Funny)
Translation: Damn, I was revealed as an ignoramus. How can I swing this back in my favor?
Re: (Score:3)
I shall swing it embarrassingly, of course.
Re: (Score:2)
Re: (Score:2)
I agree, and I think a much better phrase would be something like simplistically parallel, or even cakewalk parallel.
Thank you! Simplistically parallel works a lot better.
And frankly I don't care if Intel calls it Shaka Zulu parallel
I'd buy that. TAKE MY MONEY
all these things are done better and with lower heat and power on the GPU
Indeed, for most applications and users we have reached that "good enough" point, at least until someone develops more complex software. Someone else in the comments here mentioned bus overutilization as a potential future scenario. It all depends on how much you offload, I suppose.
Regardless of the antitrust issues which we both agree are terrible... you have to give Intel some credit for producing a killer combo with the i7-2
Re: (Score:2)
Re: (Score:2)
>>And frankly I don't care if Intel calls it Shaka Zulu parallel
Well.... I'd think it was awesome.
Hell, my Master's is even in parallel processing.
-ShakaUVM
Embarrassingly authoritative (Score:2)
In Wikipedia's defense, it's at the least more authoritative than The Bible, even though the latter is [i]embarrassingly trusted[/i] by more people.
Re: (Score:2)
What's he [youtube.com] gotta do with it? Is he embarrassingly parallel, too?
Re: (Score:2)
Maybe just embarrassing ?
Re: (Score:2)
In Soviet Russia, you embarrass Trololo.
Re: (Score:2)
In Soviet Russia, Trololo is embarrassed for you.
Re: (Score:2, Funny)
Sex. Sure, you and your SO may be so good at sex it only lasts a few seconds, but you'd never admit it in public.
Embarrassingly Parallel Processing is the same way.
Re:Bah, humbug, tech writers need help (Score:5, Funny)
This is Slashdot. Many here would be happy to admit to having an SO with whom they are having regular sex.
So would a lot of the people with SOs...
Re:Bah, humbug, tech writers need help (Score:5, Informative)
Re: (Score:3)
Touche, sir.
Re: (Score:2)
The word is "touché", not "touche".
Too fucking shay, senor!
Re: (Score:2)
Build identifier: Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0
It showed up fine, but then I copied and pasted it, and it showed up like the above. Too bad you're so fucking smart, though.
Re: (Score:2)
Thanks to the race for the almighty 'insightful' mod, we get precious little owning up to a mistake. With that understood, you'll pardon me for saying: Shut up.
Re:Bah, humbug, tech writers need help (Score:5, Informative)
"Embarrassingly parallel" refers to a problem made up of many isolated tasks -- such as running a fragment (pixel) shader on millions of different fragments, or a HTTP server handling thousands of clients -- that can all be run concurrently without any communication between them.
It's odd that they use that term here, because the other anti-aliasing techniques are embarrassingly parallel as well.
SSAA (super-sampling) always renders each pixel n times at various locations within the pixel, and blends them together.
MSAA (multi-sampling) is basically the same as SSAA, but only works on polygon edges and is very dependent on proper mipmapping to reduce the aliasing introduced when scaling textures.
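For instance, the SSAA resolve step is a textbook embarrassingly parallel loop (a generic sketch of the idea, not any particular driver's implementation): every output pixel is just the average of its own block of sub-samples and touches no one else's data.

    #include <cstddef>
    #include <vector>

    // Resolve a 2x2-supersampled buffer down to display resolution with a
    // box filter. Each output pixel reads only its own 2x2 block of samples,
    // so every iteration of the outer loops could run on a separate core.
    std::vector<float> resolve_ssaa_2x2(const std::vector<float>& samples,
                                        std::size_t out_w, std::size_t out_h)
    {
        const std::size_t in_w = out_w * 2;
        std::vector<float> out(out_w * out_h);
        for (std::size_t y = 0; y < out_h; ++y)
            for (std::size_t x = 0; x < out_w; ++x)
                out[y * out_w + x] = 0.25f * (samples[(2 * y)     * in_w + 2 * x]     +
                                              samples[(2 * y)     * in_w + 2 * x + 1] +
                                              samples[(2 * y + 1) * in_w + 2 * x]     +
                                              samples[(2 * y + 1) * in_w + 2 * x + 1]);
        return out;
    }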
Re: (Score:2)
Thanks for writing an educational reply! Alas I cannot mod it, being the OP.
It's still a weird phrase, though.
Re: (Score:2)
Re: (Score:2)
So perhaps you can explain this:
Why would someone use such a stupid term for it, when something much more intuitive like "independently parallel" might suffice?
Re: (Score:2)
MSAA isn't quite what you describe. Instead, several samples are taken, and which primitive they're in is tested. But only one fragment per primitive is generated – i.e. with supersampling, if all the fragments land on the same primitive, the fragment shader is run multiple times; with multi-sampling, the fragment shader is run only once. The reason people cite that it "only works on polygon edges" is because that's where you get pixels with more than one primitive in them. Don't be fooled t
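A toy sketch of that distinction (my own simplification, not a driver's actual code): the coverage test runs once per sample position, while the shading runs once per primitive per pixel. A real implementation would store the shaded color into each covered sample and resolve later, so pixels containing several primitives blend correctly.

    #include <array>
    #include <cstddef>
    #include <utility>

    struct Color { float r, g, b; };

    // One pixel's worth of the MSAA idea: cheap per-sample coverage tests,
    // one (possibly expensive) shading evaluation, weighted by coverage.
    template <typename CoverageFn, typename ShadeFn>
    Color msaa_pixel(const std::array<std::pair<float, float>, 4>& sample_pos,
                     CoverageFn inside_primitive,   // cheap test, run per sample
                     ShadeFn shade_once)            // expensive shading, run once
    {
        std::size_t covered = 0;
        for (const auto& p : sample_pos)
            if (inside_primitive(p.first, p.second))
                ++covered;
        const Color c = shade_once();
        const float w = static_cast<float>(covered) / sample_pos.size();
        return { c.r * w, c.g * w, c.b * w };
    }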
Re: (Score:2)
Re: (Score:2, Funny)
Can amateur journalists PLEASE stop using the phrase "embarrassingly parallel" to describe software tasks? Who's embarrassed? Why are they embarrassed about designing something that can be efficiently processed?
No can do. Journalists all read each other, and when one comes up with a catchy term, they all pick up on it. This is especially true if they have no idea what they're writing about, or some editor thinks it's punchier or more dramatic.
My pet peeve is "gun-toting." No one "totes" a firearm! If it's a pistol, you holster it. If it's a rifle, you sling it or shoulder it. I guess "armed" is too simple.
Re: (Score:2)
My pet peeve is "gun-toting." No one "totes" a firearm! If it's a pistol, you holster it. If it's a rifle, you sling it or shoulder it. I guess "armed" is too simple.
Seriously? Lots of people tote firearms. Tote means to carry or to have on one's person.
Depends how lazy the guy writing the dictionary is. I've never heard someone say, "tote that rifle to the ready line." You can't get a "tote concealed weapons" permit. No one talks about the "right to keep and tote arms."
The only other expression I'm familiar with is "tote-bag." All I want to know: are we about to go shooting or shopping?
Re: (Score:3)
Can amateur journalists PLEASE stop using the phrase "embarrassingly parallel" to describe software tasks? Who's embarrassed? Why are they embarrassed about designing something that can be efficiently processed?
But amateur journalism is embarrassingly parallel.
Re: (Score:2)
It's embarrassing because it's almost *too* efficient. The term first came about when CPU manufacturers and the industry in general were embarrassed at how much faster the GPU was for certain algorithms (i.e. the embarrassingly parallel ones). Programmers also were generally embarrassed at not using the technology sooner, and they often spent YEARS writing efficient code for the CPU, only to have a 5 minute knock-up code job on the GPU beat it when they finally experimented with the GPU.
The definition was t
Re: (Score:2)
A barn (symbol b) is a unit of area. Originally used in nuclear physics for expressing the cross-sectional area of nuclei and nuclear reactions, today it is used in all fields of high-energy physics to express the cross sections of any scattering process. A barn is defined as 10^-28 m^2 (100 fm^2) and is approximately the cross-sectional area of a uranium nucleus.
Two related units are the outhouse (10^-34 m^2, or 1 μb) and the shed (10^-52 m^2, or 1 yb), although these are rarely used in practice.
You'd never make i
Re: (Score:2)
Who's embarrassed?
*Cough* You? *Cough*
Re: (Score:2)
Nope, I studied photography, evangelized multiple distributions of Linux, wrote my own games, work in IT at a place where all the workloads are single-threaded, have my own home recording studio, and other such things. That's my geek cred, not CS/CE.
I don't understand why people need to anonymously criticize an honest mistake by someone not "in the business" to the point where it's the biggest thread. I guess it's the Internet, and that's what's been happening since USENET, so I don't let it get to me. (I a
Re: (Score:2)
I only said one thing, for which I accepted correction, but apparently the quality of discussion on Slashdot has degraded to the point of the mud-slinging on Digg and Youtube... I am mildly intrigued by one thing, though. Why do you feel the need to rip me to shreds so thoroughly? Are you trying to accomplish something? And, why are you doing it anonymously?
Re: (Score:3)
Let's be fair, "embarrassingly parallel" is an embarrassingly stupid phrase. It takes a word out of its normal context.
You'd think they would choose something less silly-sounding and less prone to confusing those who encounter it for the first time. Say, "independently parallel" - that seems to sum it up nicely while not confusing the hell out of those unfamiliar with the jargon.
Re: (Score:2)
I have, but I didn't come upon the term "embarrassingly parallel" until much later. Possibly because I'm embarrassingly old; the first use I can find of the term is in 1989, and it doesn't seem to have become mainstream even within computer science until a few years later.
Re: (Score:2)
Kinda hard to take you seriously, though, when you seem to confuse Ruby on Rails with the Ruby language it's built upon. Or just ignorantly generalize from one to the other.
Then again, as someone with formal training in Software Engineering, I hold a much lower opinion on Anonymous Cowards of all kinds than I do of people proficient in Ruby, Rails or Javascript ;)
No, MLAA is aliasing! (Score:3)
MLAA is also crap, compared to "proper" antialiasing (supersampling) or even "draft" antialiasing (multisampling). Any detail smaller than 1 pixel simply isn't rendered with MLAA (and that also means no sub-pixel motion). Essentially, MLAA is just a blur filter, which actually reduces the amount of detail in the image (unlike supersampling, which increases the detail).
Edge detect + supersampling (or edge detect + high multisampling) is by far the best solution.
Oh, and technically blurring is anti-aliasing. It's just a very primitive flavor of it.
Aliasing is when signals become indistinguishable. The common symptom of jaggies occurs when the ray chosen for sampling hides the signal of the nearby rays.
But blurring is aliasing! In physical blurring, signals from in-focus rays are overwhelmed by signals from out-of-focus rays. Similarly, with a blur filter you're definitely losing data by blurring it with neighboring signals.
And you're right: this technology is nothing but an elaborate blur filter. So this looks like anti-aliasing because it mas