1. #1
    Question about this. I got my GeForce 6800 card and it works wonderfully. I can run the game at 1024x768 with SM 3.0 and everything else maxed, or at 1024x768 with AA at 4x and everything else maxed. Both work great. What's the difference? Maybe I'm just not in the right areas of the game yet to see it. I have gone into dimly lit areas and brightly lit areas and changed the settings back and forth, and I don't see the difference. Is SM 3.0 supposed to be superior to using anti-aliasing, or vice versa? Thanks in advance.

  2. #2
    I think you mean HDR, or High Dynamic Range. That has nothing to do with SM3.0. In simple terms, HDR here is Nvidia's support for floating-point filtering and blending, which works independently of SM3.0 [yes, I know it's under the shader options, but trust me, it has nothing to do with it]. The choice between HDR and AA generally comes down to personal taste. I myself play the game with HDR enabled [I love AA, but I ain't in love with it], and what it does is improve lighting in certain situations. You can notice subtle brightness shifts as you go from areas of differing intensities. It's not as pronounced as Far Cry's HDR effect, which more or less tries to recreate a camera [Far Cry doesn't use tone mapping as far as I know], but the effect is there. BTW, generally on a GeForce 6800 series card, you will take more of a performance hit with HDR enabled than with 4xAA. By all accounts you should be able to run the game at 1280x1024 with at least 2xAA, unless 1024x768 is as far as your monitor will allow.

    So like I said, HDR or AA: it's all down to personal taste.
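    Those brightness shifts come from tone mapping: compressing a wide range of scene luminances into the display's narrow range. Here's a minimal sketch using the well-known Reinhard operator; the game's actual operator and exposure handling are unknown to me, so treat every value here as an illustrative assumption:

    ```python
    # Hedged sketch: a Reinhard-style tone mapper, compressing HDR
    # luminance (unbounded floats) into the displayable range [0, 1).
    # The exposure parameter is a made-up illustration, not the game's math.

    def reinhard(l_hdr: float, exposure: float = 1.0) -> float:
        """Map an HDR luminance value into [0, 1)."""
        l = l_hdr * exposure
        return l / (1.0 + l)

    # Dim values pass through almost linearly, bright values compress hard:
    # a luminance of 0.1 maps to about 0.09; a luminance of 100 to about 0.99.
    ```

    Animating the exposure over time as the scene's average brightness changes is what produces the eye-adaptation effect when you walk between dark and bright areas.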

  3. #3
    I guess. I know in the game's advanced graphics settings I have the option of shader 1.0, which lets me use AA, or shader 3.0, which automatically disables the AA choices. So you are saying the shader 3.0 option is actually HDR?

  4. #4
    The shader 3.0 option contains the ability to enable HDR, but HDR has nothing to do with the shader model. It sounds confusing, but trust me.

    You cannot use HDR and AA at the same time. It is a limitation of the 6800 and 7800 series of cards. It would require far too much bandwidth, and were it possible it would reduce the frame rate to a slide show. If you want to use AA, then disable HDR with Tone Mapping.
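    The bandwidth point can be made concrete with rough arithmetic. This is a hedged back-of-envelope sketch: the bytes-per-format figures are standard, but the comparison ignores compression, Z-buffers, and other hardware details:

    ```python
    # Why FP16 HDR plus multisampling is expensive: framebuffer bytes
    # multiply quickly. Resolution and formats here are illustrative.

    width, height = 1024, 768
    pixels = width * height

    int8_rgba = 4                    # bytes/pixel, standard 8-bit RGBA
    fp16_rgba = 8                    # bytes/pixel, 16-bit float RGBA
    msaa_samples = 4                 # 4x MSAA stores 4 samples per pixel

    ldr_aa_bytes = pixels * int8_rgba * msaa_samples  # ~12.6 MB per buffer
    hdr_aa_bytes = pixels * fp16_rgba * msaa_samples  # ~25.2 MB per buffer

    print(hdr_aa_bytes / ldr_aa_bytes)  # FP16 doubles the footprint: 2.0
    ```

    Every blend into that buffer reads and writes those wider samples, so the memory traffic per frame scales the same way.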

  5. #5
    Check the new nvidia demos. aa + hdr. no prob. all a matter of coding.

  6. #6
    Originally posted by Dojomann:
    Check the new nvidia demos. aa + hdr. no prob. all a matter of coding.
    Yes and no. The new Nvidia demos use software AA, which is not a viable option in games. Games like Splinter Cell use full hardware support for AA when HDR is not in use. The software option requires some CPU cycles.

    EDIT: I may be wrong about the CPU cycles. It probably uses the vertex or pixel shaders to create the smooth transitions, which is still effectively a software method. It's like ATI's 9x00 implementation of TruForm: it uses the vertex shader, whereas the 8500 had full hardware support for it. Either way, I will have to look more into this.

    Here is a snippet from Beyond3D


    You can also see it cleared up in an interview with David Kirk on the HDR and software AA issue.

    Link 2
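    The "software AA" idea being discussed boils down to supersampling: render at a higher resolution, then filter down so hard edges become gradual transitions. A minimal illustrative sketch of that filtering step, not the demos' actual method:

    ```python
    # Sketch of supersample-and-downfilter anti-aliasing: average each
    # 2x2 block of a high-resolution image down to one output pixel.

    def downsample_2x(img):
        """Box-filter a grayscale image (list of rows) by 2x in each axis."""
        out = []
        for y in range(0, len(img), 2):
            row = []
            for x in range(0, len(img[0]), 2):
                s = img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]
                row.append(s / 4.0)
            out.append(row)
        return out

    # A hard black/white edge gains an intermediate gray after filtering:
    edge = [[0, 0, 1, 1],
            [0, 0, 1, 1],
            [0, 1, 1, 1],
            [0, 1, 1, 1]]
    print(downsample_2x(edge))  # [[0.0, 1.0], [0.5, 1.0]]
    ```

    On hardware of that era this averaging could be done in a pixel shader pass, which is why it works even when the dedicated multisample hardware can't be combined with FP16 blending.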

  7. #7
    Ah, thanks for clearing that up. I knew something was fishy. Still, I think today's processors are up to it, for some games anyway. I wouldn't be surprised if Nvidia added this feature to their drivers. Would be nice for the people with dual-core processors especially, eh?

  8. #8
    I have been playing around with the settings using an ATI card and the newest patch. When I enable HDR I notice a few things. If I move the camera so Sam's head is in the way of a light source, the entire brightness of the room changes, which is kind of strange. I noticed that the shimmering effect around light sources is not as noticeable, and the light seems tighter somehow. Some things seem brighter than normal, and the frame rate is about 10 FPS slower than with 4x AA. Of course, the edges of objects are all jagged since AA doesn't work.

    Personally, I use AA. I find the on-the-fly light changing unrealistic, and the frame rate hit and jaggies everywhere make the game look and perform worse. I do like what HDR does to the light shimmer, but it's not enough for me to enable it, and if I wanted things to look brighter I would adjust the brightness and contrast on my monitor without a hit to performance.

    While we are on the subject, I was wondering what everyone thinks about the other SM3.0 options, like soft shadows and the mapping option.
    (Done some editing after further exploration of these effects.)
    In my opinion, the soft shadows make the shadows look lighter, but they seem less defined, which makes the overall image look less sharp in some situations but more realistic in others.

    Parallax mapping seems to make certain walls look more 3D, or have more depth. I notice no difference in some areas, but in others the difference is obvious.
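    That depth effect comes from shifting a surface's texture coordinates along the view direction in proportion to a height map, so a flat wall samples its texture as if it had relief. A rough sketch of the basic offset formula; the scale factor is an exaggerated illustrative assumption, not the game's value:

    ```python
    # Hedged sketch of basic parallax (offset) mapping: texture coords
    # slide along the tangent-space view direction, scaled by height.

    def parallax_offset(u, v, height, view_x, view_y, scale=0.25):
        """Offset texture coords (u, v) by height * view direction * scale."""
        return (u + view_x * height * scale,
                v + view_y * height * scale)

    # A raised texel (height=1.0) shifts with the view; a flat one doesn't:
    print(parallax_offset(0.5, 0.5, 1.0, 1.0, 0.0))  # (0.75, 0.5)
    print(parallax_offset(0.5, 0.5, 0.0, 1.0, 0.0))  # (0.5, 0.5)
    ```

    Since the offset depends on the view direction, the effect is strongest at glancing angles, which would explain why some walls show an obvious difference and others show none.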

    Turning on soft shadows and parallax mapping doesn't affect my FPS noticeably, but I am using V-sync, so they may be causing slight differences that are not noticeable when playing the game on my setup. It is nice to see the game without the banding problems of SM1.1, though. I think my FPS went up a little as well, but I haven't tested it yet. Many report FPS to have gone down when switching from SM1.1 to SM2.0, but I know my lows did not get any lower, because I checked with Fraps.

    It's nice to have the HDR option, but I prefer AA over HDR. HDR is unplayable at the settings I play at anyway. If they make HDR work with AA at playable frame rates while implementing it properly, it would be welcome, but that won't be for a while.

  9. #9
    Originally posted by the_sextein:
    I have been playing around with the settings using an ATI card and the newest patch. When I enable HDR I notice a few things...
    I thought HDR was exclusive to Nvidia...

  10. #10
    Well, after a bit of playing around, I found the best settings for me are HDR with everything enabled or on high and AF at 8x. I use 1024x768, not because my monitor won't handle higher resolutions, but because I get better frame rates at 1024x768, averaging about 35 FPS. I think my computer doesn't allow the full potential of the graphics card, but next year I am going to build an Athlon 64 system, so this is just fine for now. Thanks for the help, guys.