The future of AA? Edge detect anti-aliasing on the Radeon HD 2900 XT
Edge detect anti-aliasing - How it works
In essence, the concept of ATI's edge detect anti-aliasing is all in the name. It detects edges. Hold the front page! In slightly more complex terms, this anti-aliasing mode looks for geometry edges in the rendered scene which cross a pixel, and uses both the position on screen and the direction of the detected edge to weight samples and then anti-alias that edge. ATI currently offers two levels of edge detection: one labelled 12x (which sits under the 4x mode with the edge detect filter selected in Catalyst Control Center) and the other 24x (which corresponds to 8x with the same filter in use in the driver control panel).
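ATI hasn't published the exact resolve shader, so as a rough illustration only, the idea described above can be sketched as a custom MSAA resolve: flag pixels whose samples disagree (a geometry edge crosses them), estimate the edge's direction from the local gradient, and blend extra samples along that direction. Everything here (the variance threshold, the 1-2-1 weighting, the function name) is an assumption for illustration, not ATI's actual filter:

```python
import numpy as np

def edge_detect_resolve(samples):
    """Illustrative edge-detect resolve over MSAA sample data (not ATI's
    actual filter). samples: (H, W, S, 3) per-pixel multisample colours.
    Returns an (H, W, 3) resolved image."""
    # Standard box resolve: average the samples within each pixel.
    box = samples.mean(axis=2)

    # Flag pixels whose samples disagree - a geometry edge crosses them.
    # (Threshold chosen arbitrarily for this sketch.)
    on_edge = samples.var(axis=2).sum(axis=-1) > 1e-4

    # Estimate edge direction from the luminance gradient of the
    # box-resolved image (central differences).
    luma = box @ np.array([0.299, 0.587, 0.114])
    gy, gx = np.gradient(luma)

    out = box.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if not on_edge[y, x]:
                continue
            g = np.array([gx[y, x], gy[y, x]])
            n = np.linalg.norm(g)
            if n == 0:
                continue
            # The edge runs perpendicular to the gradient; gather extra
            # samples from the neighbours along the edge direction and
            # weight them by distance (here: a simple 1-2-1 kernel).
            ex, ey = -g[1] / n, g[0] / n
            nx, ny = int(round(x + ex)), int(round(y + ey))
            px, py = int(round(x - ex)), int(round(y - ey))
            out[y, x] = (box[ny, nx] + 2 * box[y, x] + box[py, px]) / 4
    return out
```

The key point the sketch captures is that sample weights depend on the detected edge's position and direction, rather than being a fixed box or tent kernel over a static footprint, which is also why the cost scales with how much edge geometry is in the scene.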
Needless to say, this isn't a cheap process to carry out computationally, so it should be noted right here and now that performance hits have the potential to be rather large using this mode of anti-aliasing. Of course, we'll take a look at just how hefty that performance hit is in a handful of game titles in due course.
As you should have fathomed by now, the big claim to fame for edge detect anti-aliasing is its image quality, which promises to be something really rather special if it works well. To take an early look at this for ourselves we've used a scene from Half-Life 2, rendered at 1024x768 to bring any aliasing issues to the fore, examining it first with no anti-aliasing enabled, followed by looking at the same scene with both of the new edge detect anti-aliasing modes in use.
If you want to compare this to any of the other standard or CFAA modes available on the Radeon HD 2000 series using exactly the same scene, you can find images using all of these various anti-aliasing modes on this page of our review of Sapphire's Radeon HD 2900 XT.
Quite simply, the 12x edge detect mode does a superb job of anti-aliasing (aside from the alpha textures used by the tree on the right-hand side of the image, which it can't deal with): clean, smooth edges everywhere, and no noticeable sign of the blurring which took some of the sheen off ATI's narrow and wide tent filtering modes (although technically speaking, there is undoubtedly still some blurring present due to the way edge detect anti-aliasing works). The difference between the 12x and 24x edge detect modes is far more difficult to spot to my eyes, but the result still looks fantastic overall, as you would expect.
If you want to examine this image for any of the Radeon HD 2900 XT's anti-aliasing modes in isolation (including those from our previous Radeon HD 2900 XT review), then you can do so by clicking on the links below:
2x narrow tent CFAA
4x narrow tent CFAA
8x narrow tent CFAA
2x wide tent CFAA
4x wide tent CFAA
8x wide tent CFAA
Improved image quality is one thing, but how does edge detection compare to the other multi-sampling and Custom Filter AA modes on offer in ATI's latest architecture from a performance standpoint? To see just where these two new modes fit in to the grand scheme of things from this point of view, let's see how the various anti-aliasing modes available to the Radeon HD 2900 XT perform in Half-Life 2: Episode One at a resolution of 1600x1200.
We did warn you that edge detection was computationally expensive, didn't we? In this example, ATI's 12x edge detect mode performs better than both the highest wide tent and narrow tent CFAA modes, albeit not by much, while the 24x mode drops into the mid-teens frame rate-wise - certainly not playable at this resolution in this particular title.
Of course, performance or image quality in one game does not a good overall judgement make, particularly as the performance hit of edge detection could well vary quite heavily with different workloads, so let's now look more closely at both performance and image quality in a handful of other popular game titles.