A Little Guide to Antialiasing

Eric

Let me tell you a simple fact about antialiasing...

Low-resolution 72/96 dpi displays are the reason antialiasing exists in the
first place.

There are three options:

- Ignore it

- Supersampling/Multisampling

- 300+ dpi display

The third is the best option. The sampling-based antialiasing methods smooth
the "jaggies" by rendering the graphics at a much higher resolution than is
actually displayed and then averaging back down, so the extra detail is thrown
away. And since the GPU is calculating more pixels, games and graphics slow
down.

The human eye cannot easily resolve detail finer than about 300 dpi at a
normal viewing distance. As a result, a 300+ dpi display can do perfect
antialiasing without losing resolution, at no performance cost.
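
To put rough numbers on that claim, here is a quick Python sketch. The
one-arcminute acuity figure is a common rule of thumb rather than a
measurement, and the viewing distances are illustrative:

import math

# Back-of-the-envelope sketch: the dpi at which two adjacent pixels blur
# together, assuming ~1 arcminute of visual acuity (a common rule of thumb)
# and a given viewing distance in inches. Illustrative numbers only.
def dpi_limit(viewing_distance_in, acuity_arcmin=1.0):
    # Smallest feature the eye can resolve at this distance, in inches
    feature = 2 * viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0) / 2)
    return 1.0 / feature

print(round(dpi_limit(12)))  # ~286 dpi at a close 12 inches
print(round(dpi_limit(24)))  # ~143 dpi at a typical 24 inches

So the 300 dpi figure only holds at fairly close viewing distances; sit
farther back and even a lower-dpi screen passes the same test.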

For a more visual explanation:

Supersampling
-------------

Apparent Resolution: 1280 x 1024
Actual Resolution: 2560 x 2048 (4x Supersampling)
Resolution Loss: 3,932,160 pixels (75% of the rendered pixels are averaged away)

Performance penalty relative to the apparent resolution: Yes

300+ dpi display
----------------

Apparent Resolution: 2560 x 2048
Actual Resolution: 2560 x 2048
Resolution Loss: 0 pixels

Performance penalty relative to the apparent resolution: No


Results: with the supersampling method, the GPU renders more pixels than
you can actually see. With the 300+ dpi display method, the GPU renders
exactly as many pixels as you can actually see.
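
The pixel counts above are easy to verify (plain arithmetic, nothing more):

apparent = 1280 * 1024        # pixels actually shown: 1,310,720
actual = 2560 * 2048          # pixels rendered at 4x supersampling: 5,242,880
extra = actual - apparent     # 3,932,160 pixels rendered only to be averaged away
print(extra, extra / actual)  # 3932160 0.75 -> 75% of the rendered pixels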


Statement: 300+ dpi display "antialiasing" - no performance penalty

Conclusion: If you have a 300+ dpi display, you can turn off all antialiasing
features and devote more GPU power to rendering more complex visuals - the
display will do the antialiasing for you, without the GPU knowing.



*** Supersampling: the GPU renders more pixels than are displayed so that
when the extra pixels are scaled back down to the apparent resolution, the
jaggies are minimized. Although the jaggies are reduced, performance suffers.
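
For the curious, a minimal sketch of that scale-back step in Python/NumPy:
a plain box filter, with buffer sizes matching the example above (the random
frame is just a stand-in for rendered output):

import numpy as np

def downsample_2x2(hi_res):
    # Average each 2x2 block of rendered samples into one displayed pixel
    h, w, c = hi_res.shape                    # e.g. (2048, 2560, 3)
    blocks = hi_res.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))           # one output pixel per block

hi = np.random.rand(2048, 2560, 3)            # stand-in for the rendered frame
lo = downsample_2x2(hi)                       # (1024, 1280, 3) image for the display

Real GPUs typically use smarter sample patterns than a plain 2x2 grid, but the
idea is the same: render many samples, keep one averaged pixel.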

Andrew

On Wed, 13 Sep 2006 17:41:01 -0700, Eric
<[email protected]> wrote:

>- Ignore it


Thanks, but I will use my own eyes and continue to always use it for
gaming regardless of resolution. My eyes seem to know more on the
subject than your theoretical BS which seems to be based on
screenshots and not on moving images.
--
Andrew, contact via http://interpleb.googlepages.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.

MrCoffee

I have also found that AA modes are a waste of time and GPU resources
at 1920 x 1200 and up in games.
I run at either 1920 x 1200 or 2048 x 1280 on a 24" widescreen CRT.

"Andrew" wrote:

> On Wed, 13 Sep 2006 17:41:01 -0700, Eric
> <[email protected]> wrote:
>
> >- Ignore it

>
> Thanks, but I will use my own eyes and continue to always use it for
> gaming regardless of resolution. My eyes seem to know more on the
> subject than your theoretical BS which seems to be based on
> screenshots and not on moving images.
> --
> Andrew, contact via http://interpleb.googlepages.com
> Help make Usenet a better place: English is read downwards,
> please don't top post. Trim replies to quote only relevant text.
> Check groups.google.com before asking an obvious question.
>

Vanja Lolz

Sorry, but you're wrong. There is close to zero performance difference between
rendering at a 4x higher resolution (for a 300 dpi display) and rendering at
a 4x higher resolution and then downsampling to a lower one (supersampling).
Not to mention that 300+ dpi displays don't grow on trees. Antialiasing
algorithms are being (and have been) designed that use clever maths to find
the jagged edges without having to render at 4x the resolution, resulting in
much better performance than a 300 dpi display.
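
A toy illustration of that edge-hunting idea in Python/NumPy - a crude
stand-in, not any particular shipping algorithm: flag high-contrast pixels
and blend only those with their neighbours, leaving the rest of the frame
untouched.

import numpy as np

def edge_blend_aa(img, threshold=0.1):
    luma = img.mean(axis=2)
    # Contrast against the right-hand and lower neighbour
    dx = np.abs(np.diff(luma, axis=1, append=luma[:, -1:]))
    dy = np.abs(np.diff(luma, axis=0, append=luma[-1:, :]))
    edges = (dx + dy) > threshold
    # Cheap 3x3 box blur built from shifted copies of the image
    blurred = sum(np.roll(np.roll(img, i, 0), j, 1)
                  for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
    out = img.copy()
    out[edges] = blurred[edges]  # smooth only where an edge was found
    return out

The cost is a couple of passes over the frame instead of rendering four times
as many pixels, which is where the performance win comes from.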


"Eric" <[email protected]> wrote in message
news:[email protected]
> Let me tell you a simple fact about antialiasing...
>
> 72/96 dpi displays are the cause of antialiasing.
>
> There are three options:
>
> - Ignore it
>
> - Supersampling/Multisampling
>
> - 300+ dpi display
>
> The third is the best option, as the antialiasing sampling methods smooth
> the "jaggies" by rendering the graphics at much higher resolutions than is
> actually displayed, you lose detail. Since the GPU is calculating more
> pixels, games and graphics slow down.
>
> The human eye cannot easily resolve objects more than 300 dpi. As a
> result,
> a 300+ dpi display can do perfect antialiasing without losing resolution,
> at
> no performance cost.
>
> For a more visual explanation:
>
> Supersampling
> -------------
>
> Apparent Resolution: 1280 x 1024
> Actual Resolution: 2560 x 2048 (4x Supersampling)
> Resolution Loss: 3932160 pixels (75% of pixels lost)
>
> Performance Penalty as per apparent resolution: Yes
>
> 300+ dpi display
> ----------------
>
> Apparent Resolution: 2560 x 2048
> Actual Resolution: 2560 x 2048
> Resolution Loss: 0 pixels
>
> Performance Penalty as per apparent resolution: No
>
>
> Results: For the Supersampling method, the GPU is rendering more pixels
> than
> you can actually see. For the 300+ dpi display method, the GPU is
> rendering
> just as much pixels as you can actually see.
>
>
> Statement: 300+ dpi display "antialiasing" - no performance penalty
>
> Conclusion: If you have a 300+ display, you can turn off all antialiasing
> features and devote more GPU power to rendering more complex visuals - the
> display will do the antialiasing for you, without the GPU knowing.
>
>
>
> *** Supersampling: the GPU renders more pixels than is displayed so that
> when the extra pixels are scaled back to the apparent resolution jaggies
> can
> be minimized. Although the jaggies are reduced, performance suffers.

Joe

I have a hard time believing that you would get better GPU performance at
2560x2048 than at 1280x1024 with 4x anti-aliasing. If you have ever played any
recent games like HL2 or FEAR, you will notice significant performance drops
for each step up you go in resolution. At 800x600 I get a solid 300 fps
in HL2, at 10x7 I get about 200, and at 12x10 I get about 150 fps. See a
pattern? Just to be safe I tried these resolutions with and without
anti-aliasing turned on, and there was absolutely no difference in
performance. And isn't the average 19 inch LCD that most users own right now
capped at 12x10 anyway?
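
Joe's numbers track pixel count fairly well. A naive fill-rate-bound estimate
in Python, using only his reported 300 fps at 800x600 as a baseline -
everything else is extrapolation, and his actual 200/150 fps results sit above
it, so HL2 isn't purely fill-bound:

base_fps, base_pixels = 300, 800 * 600

for w, h in [(800, 600), (1024, 768), (1280, 1024), (2560, 2048)]:
    est = base_fps * base_pixels / (w * h)
    print(f"{w}x{h}: ~{est:.0f} fps")
# 800x600: ~300, 1024x768: ~183, 1280x1024: ~110, 2560x2048: ~27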
"Eric" wrote:

> Let me tell you a simple fact about antialiasing...
>
> 72/96 dpi displays are the cause of antialiasing.
>
> There are three options:
>
> - Ignore it
>
> - Supersampling/Multisampling
>
> - 300+ dpi display
>
> The third is the best option, as the antialiasing sampling methods smooth
> the "jaggies" by rendering the graphics at much higher resolutions than is
> actually displayed, you lose detail. Since the GPU is calculating more
> pixels, games and graphics slow down.
>
> The human eye cannot easily resolve objects more than 300 dpi. As a result,
> a 300+ dpi display can do perfect antialiasing without losing resolution, at
> no performance cost.
>
> For a more visual explanation:
>
> Supersampling
> -------------
>
> Apparent Resolution: 1280 x 1024
> Actual Resolution: 2560 x 2048 (4x Supersampling)
> Resolution Loss: 3932160 pixels (75% of pixels lost)
>
> Performance Penalty as per apparent resolution: Yes
>
> 300+ dpi display
> ----------------
>
> Apparent Resolution: 2560 x 2048
> Actual Resolution: 2560 x 2048
> Resolution Loss: 0 pixels
>
> Performance Penalty as per apparent resolution: No
>
>
> Results: For the Supersampling method, the GPU is rendering more pixels than
> you can actually see. For the 300+ dpi display method, the GPU is rendering
> just as much pixels as you can actually see.
>
>
> Statement: 300+ dpi display "antialiasing" - no performance penalty
>
> Conclusion: If you have a 300+ display, you can turn off all antialiasing
> features and devote more GPU power to rendering more complex visuals - the
> display will do the antialiasing for you, without the GPU knowing.
>
>
>
> *** Supersampling: the GPU renders more pixels than is displayed so that
> when the extra pixels are scaled back to the apparent resolution jaggies can
> be minimized. Although the jaggies are reduced, performance suffers.

MrCoffee

I play all my games at 1920 x 1200 or 2048 x 1280 with supersampling on, no
AA, and 16x AF, and yes, it performs better than any AA setting, including
the SLI AA modes. So if you're running at high res, you don't need any AA on.


"Joe" wrote:

>
> I have a hard time believing that you would get better gpu performance at
> 2560x2048 than 1200x1000 with 4x anti-aliasing. If you have ever played any
> recent games like HL2 or FEAR you will notice significant performance drops
> for each step up you go in resolution. At 800x600 I get about 300 fps solid
> in hl2, at 10x7 I get about 200, and at 12x10 I get about 150 fps. See a
> pattern? Just to be safe I tried these resolutions with and with out
> anti-aliasing turned on, and there was absolutely now difference in
> performance. And isn’t the average 19 inch lcd that most users own right now
> capped at 12x10k anyways?
> "Eric" wrote:
>
> > Let me tell you a simple fact about antialiasing...
> >
> > 72/96 dpi displays are the cause of antialiasing.
> >
> > There are three options:
> >
> > - Ignore it
> >
> > - Supersampling/Multisampling
> >
> > - 300+ dpi display
> >
> > The third is the best option, as the antialiasing sampling methods smooth
> > the "jaggies" by rendering the graphics at much higher resolutions than is
> > actually displayed, you lose detail. Since the GPU is calculating more
> > pixels, games and graphics slow down.
> >
> > The human eye cannot easily resolve objects more than 300 dpi. As a result,
> > a 300+ dpi display can do perfect antialiasing without losing resolution, at
> > no performance cost.
> >
> > For a more visual explanation:
> >
> > Supersampling
> > -------------
> >
> > Apparent Resolution: 1280 x 1024
> > Actual Resolution: 2560 x 2048 (4x Supersampling)
> > Resolution Loss: 3932160 pixels (75% of pixels lost)
> >
> > Performance Penalty as per apparent resolution: Yes
> >
> > 300+ dpi display
> > ----------------
> >
> > Apparent Resolution: 2560 x 2048
> > Actual Resolution: 2560 x 2048
> > Resolution Loss: 0 pixels
> >
> > Performance Penalty as per apparent resolution: No
> >
> >
> > Results: For the Supersampling method, the GPU is rendering more pixels than
> > you can actually see. For the 300+ dpi display method, the GPU is rendering
> > just as much pixels as you can actually see.
> >
> >
> > Statement: 300+ dpi display "antialiasing" - no performance penalty
> >
> > Conclusion: If you have a 300+ display, you can turn off all antialiasing
> > features and devote more GPU power to rendering more complex visuals - the
> > display will do the antialiasing for you, without the GPU knowing.
> >
> >
> >
> > *** Supersampling: the GPU renders more pixels than is displayed so that
> > when the extra pixels are scaled back to the apparent resolution jaggies can
> > be minimized. Although the jaggies are reduced, performance suffers.

Eric

Oops, got hit by a troll -_-

Dude, either come back with some actual facts backing up your statements, or
close your stinky mouth and have fun with your "games".

Well, I guess I can't blame you for not having the, ahem, luxury (or should I
say... eyesight?) to discern between 72 dpi and 300 dpi. It's quite expensive,
you know ; )

As far as I'm concerned, your post lacks substance.

Feel free to troll again, but it won't make a difference to the truth.

"Andrew" wrote:

> On Wed, 13 Sep 2006 17:41:01 -0700, Eric
> <[email protected]> wrote:
>
> >- Ignore it

>
> Thanks, but I will use my own eyes and continue to always use it for
> gaming regardless of resolution. My eyes seem to know more on the
> subject than your theoretical BS which seems to be based on
> screenshots and not on moving images.
> --
> Andrew, contact via http://interpleb.googlepages.com
> Help make Usenet a better place: English is read downwards,
> please don't top post. Trim replies to quote only relevant text.
> Check groups.google.com before asking an obvious question.
>

Eric

True. It may have something to do with your distance from the CRT monitor.
Since it's a 24" screen, you're likely to be sitting a bit farther away from
it than from, say, a 17" monitor. Pixels appear smaller, and so do the jaggies.
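
To make the distance point concrete, a small Python sketch of the angular
size of one pixel - 94 dpi is roughly a 24" 1920x1200 screen, and the viewing
distances are made-up examples:

import math

def pixel_arcmin(dpi, distance_in):
    # Angular size of a single pixel, in arcminutes
    return math.degrees(math.atan((1.0 / dpi) / distance_in)) * 60

print(round(pixel_arcmin(94, 18), 1))  # ~2.0 arcmin at 18 inches
print(round(pixel_arcmin(94, 30), 1))  # ~1.2 arcmin at 30 inches

Once a pixel shrinks toward the eye's ~1 arcminute limit, the jaggies stop
being visible no matter what the GPU does.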

As you said, I'd rather have the GPU rendering more complex scenes than
wasting cycles on supersampling.

"MrCoffee" wrote:

> i have also found that aa modes are a waste of time and gpu resources
> at 1920 x 1200 on up in games.
> i run at either 1920 x 1200 or 2048 x 1280 on a 24" wide screen crt.
>
> "Andrew" wrote:
>
> > On Wed, 13 Sep 2006 17:41:01 -0700, Eric
> > <[email protected]> wrote:
> >
> > >- Ignore it

> >
> > Thanks, but I will use my own eyes and continue to always use it for
> > gaming regardless of resolution. My eyes seem to know more on the
> > subject than your theoretical BS which seems to be based on
> > screenshots and not on moving images.
> > --
> > Andrew, contact via http://interpleb.googlepages.com
> > Help make Usenet a better place: English is read downwards,
> > please don't top post. Trim replies to quote only relevant text.
> > Check groups.google.com before asking an obvious question.
> >