I discovered that the Dunia version in this game isn't handling vsync correctly. Far Cry 3 and 2 had a problem where the game only ran properly at multiples of 30 (30fps, 60fps, etc.), so you could fix the stutter by using those refresh rates. Far Cry 4 has the same problem, but it's worse now. The sparse vsync option no longer works correctly. The problem seems to be that the game only runs smoothly at exactly the vsync rate you set. I did a test and tried to run 30fps on 60Hz using different tools (Nvidia Inspector, MSI Afterburner, the in-game sparse vsync option); nothing got me smooth 30fps. Because I have a DLP that can do 30Hz, I set that as my refresh rate and restarted the game.
The frame rate was butter smooth. This tells me the engine is only capable of putting out frames at exactly the vsync refresh rate and nothing else, or else the frametimes fall apart. The only way to get perfectly smooth 30fps or 60fps is to run at those actual refresh rates and not let the fps vary. At 60fps you have to hold that as a minimum; same with 120, 144, etc. 30fps is only smooth at a 30Hz refresh, if your monitor is capable of it.
This only fixes the frame time variance. The stutter from loading new areas is a different issue, caused by texture streaming, and there is a fix for that in the XML. Let me know if this works.
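For anyone curious why the refresh rate match matters: the behavior described above can be sketched with a toy simulation of double-buffered vsync, where a finished frame is held until the next vblank. The render times (15-18 ms) and rates here are made-up numbers for illustration, not measurements from the game.

```python
import math

def vsync_frametimes(render_ms, refresh_hz):
    """Toy model of double-buffered vsync: a finished frame is shown at
    the first vblank at or after it is ready, and the GPU only starts
    the next frame once that swap happens."""
    interval = 1000.0 / refresh_hz              # vblank spacing in ms
    t, shown = 0.0, []
    for rt in render_ms:
        t += rt                                 # frame finishes rendering
        t = math.ceil(t / interval) * interval  # stall until the vblank
        shown.append(t)                         # frame hits the screen
    return [b - a for a, b in zip(shown, shown[1:])]

# GPU output hovering around 30 fps (render times wobble 15-18 ms):
work = [15, 18, 15, 18, 15, 18]
print(vsync_frametimes(work, 60))  # alternates ~16.7 / ~33.3 ms -> judder
print(vsync_frametimes(work, 30))  # steady ~33.3 ms -> smooth 30 fps
```

On the 60Hz grid the same workload lands on an uneven mix of one- and two-interval waits, which is exactly the frame time variance described above, while on a 30Hz grid every frame quantizes to the same 33.3 ms slot.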
For the past few days I've been running the game with adaptive vsync forced from Nvidia Inspector at 1/2 refresh rate (my monitor is 60Hz), and so far that has been the best situation I've found as far as frame timing issues go. It maintains a pretty solid 30fps and is definitely nowhere near perfectly smooth, but it's somewhat comparable to how Far Cry 3 ran for me. Which isn't exactly great, because I never thought that game felt as smooth as the framerate suggested it should, but it's at least tolerable. I don't think my monitor can do 30Hz so I can't test what you've done, but I do think it's true that the game has some very odd behavior with vsync and refresh rates that causes problems with frame time variance.
Vsync will always halve your frame rate from 60 to 30 if your PC can't maintain a constant 60+ all the time (and it can't, because of the stuttering). This is not a Dunia Engine issue, this is just how Vsync works. Triple buffering is supposed to fix this, but it does not seem to work in this title. Right now all you can do is use Adaptive Vsync (or the AMD equivalent, I don't know what it's called), or just turn off Vsync (but be prepared for screen tearing if you only have a 60Hz monitor; if you have 120 or 144 you're fine).
The Vsync issue has nothing to do with the stuttering; that is a texture streaming issue and, apparently, bad CPU usage.
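For what it's worth, the distinction between regular and adaptive vsync as commonly described can be sketched in a few lines. This is just an illustration of the idea (sync when the frame arrives in time, free-run when it doesn't), not the driver's actual logic, and the numbers are assumptions.

```python
import math

REFRESH_HZ = 60
INTERVAL = 1000.0 / REFRESH_HZ  # ms between vblanks

def present_time(frame_ms, adaptive):
    """When a frame hits the screen, given it took frame_ms to render
    (measured from the previous present)."""
    if not adaptive or frame_ms <= INTERVAL:
        # regular vsync: hold the finished frame until the next vblank
        return math.ceil(frame_ms / INTERVAL) * INTERVAL
    # adaptive vsync below refresh: present immediately (may tear)
    return frame_ms

print(present_time(20, adaptive=False))  # ~33.3 ms -> effectively 30 fps
print(present_time(20, adaptive=True))   # 20 ms -> ~50 fps, tearing possible
```

With a 20 ms frame on a 60Hz display, the synced path rounds up to the second vblank (the halving behavior), while the adaptive path keeps the native rate at the cost of possible tearing.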
Originally Posted by brutlern:

That isn't how Vsync works at all. Not even close.
Vsync basically makes your GPU a slave to your monitor. It stops the GPU from sending the next image from its primary buffer to the monitor until the monitor is ready, thus eliminating what we call "screen tearing". Screen tearing comes from all the extra frames your GPU loses when it tries to push too many frames onto the monitor. Meaning, if you have a 60Hz monitor, you can never exceed 60fps, as the monitor simply cannot handle any more than that per second.
If you cannot maintain 60fps, then vsync will absolutely not cut your fps to 30. This is completely false. It simply stops your fps from exceeding your monitor's refresh rate. Nothing more.
Originally Posted by Faide09:

Wrong. Educate yourself.
http://hardforum.com/showthread.php?t=928593
http://www.tweakguides.com/Graphics_9.html
Originally Posted by brutlern:
The second link backs up what I said, which was absolutely true.
"Enabling VSync tells your graphics card to synchronize its actions with your monitor. That means the graphics card is only allowed to swap its frame buffer and send a new frame to the monitor when the monitor says it is ready to repaint a new screen - i.e. during the VBI. Your graphics card and monitor do not have to be in sync; they can still operate properly when VSync is disabled, however when VSync is disabled, you can experience a phenomenon called Tearing in periods when your graphics card and monitor go out of sync, precisely because the graphics card and monitor are acting without regard for each other's limitations."
"Tearing
It is an unfortunate fact that if you disable VSync, your graphics card and monitor will inevitably go out of synch. Whenever your FPS exceeds the refresh rate (e.g. 120 FPS on a 60Hz screen), or in general at any point during which your graphics card is working faster than your monitor, the graphics card produces more frames in the frame buffer than the monitor can actually display at any one time. The end result is that when the monitor goes to get a new frame from the primary buffer of the graphics card during VBI, the resulting output may be made up of two or more different frames overlapping each other. The onscreen image may appear to be slightly out of alignment or 'torn' in parts whenever there is any movement - and thus it is referred to as Tearing. An example of this is provided in the simulated screenshot below. Look closely at the urinals and the sink - portions of them are out of alignment due to tearing:"
That's from your second link. This is almost exactly what I said. NEVER will it halve your FPS like you said. Buy yourself a clue.
Originally Posted by brutlern:

I actually think he was correct. He explained it the way I understand it.
Originally Posted by brutlern:

You are talking absolute rubbish. Not once in my life has vsync capped me to 30fps when I couldn't maintain 60fps.
Originally Posted by Faide09:

Not exactly true, Faide09. It will if standard buffering is in play. That is how Vsync was originally designed to function when FPS drops below refresh. Most people understand how Vsync works when FPS exceeds refresh, but many do not realize the actual intricacies of how things are handled when the GPU's FPS output falls below the Vsynced refresh rate.
Originally Posted by Devst8nDscoDve:
It's not actually rubbish. It is the truth, but there are many more factors in play in the big picture, which is why each of you may see different results.
From the way I learned it long ago (so pardon my memory), this has to do with VSYNC and standard double buffering vs. triple buffering (or even SLI quad buffering), and the locking of frame rate to the swap interval.
This was more evident in the past, when double-buffered VSYNC caused a drop to half the refresh rate. If a 60Hz VSYNC was in play and FPS dropped below 60, it would be cut back to 30 FPS, because the GPU could not render a new frame until the current one was finished being displayed. That is just the nature of VSYNC.
However, with triple buffering this does not occur, since the GPU can render new frames without having to wait to display the ready frame, so your FPS may not fall by half. In this case microstuttering can still occur.
On the other hand, Adaptive Vsync just disables Vsync if FPS drops below refresh and enables it if it stays above. I believe Adaptive Vsync came about to combat the stutter from FPS drops below the VSYNC/refresh rate.
All in all, the dreaded drop in FPS to half the refresh rate will NOT occur if triple buffering is in play, or if you have SLI, which uses quad buffering and disconnects the frame rate from the swap interval...
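The double- vs. triple-buffer arithmetic above can be checked with a toy simulation. The numbers are assumptions for illustration: a 60Hz display and a GPU that needs 20 ms per frame (50fps raw). "Triple buffered" here just means the display picks up the newest finished frame at each vblank while the GPU renders back to back; real drivers are more involved.

```python
import math

REFRESH_HZ = 60
INTERVAL = 1000.0 / REFRESH_HZ   # ~16.7 ms per vblank
RENDER_MS = 20.0                 # GPU needs 20 ms/frame = 50 fps raw

def double_buffered_fps():
    """GPU stalls until the swap, so a 20 ms frame always slips past one
    vblank and waits for the second: 2 intervals per frame = 30 fps."""
    t, frames = 0.0, 0
    while t < 1000.0:                            # simulate one second
        t += RENDER_MS                           # render the frame
        t = math.ceil(t / INTERVAL) * INTERVAL   # wait for the vblank
        frames += 1
    return frames

def triple_buffered_fps():
    """GPU renders nonstop into a spare buffer; the display shows the
    newest finished frame at each vblank, so no halving occurs."""
    shown, newest = 0, 0
    for n in range(1, REFRESH_HZ + 1):           # each vblank in 1 second
        done = int(n * INTERVAL // RENDER_MS)    # frames finished so far
        if done > newest:                        # a newer frame is ready
            newest = done
            shown += 1
    return shown

print(double_buffered_fps(), triple_buffered_fps())  # 30 vs 50
```

With those assumed numbers the double-buffered path quantizes every frame to two refresh intervals (the 30fps halving), while the triple-buffered path averages its raw 50fps, though the frames arrive on a slightly uneven cadence, which lines up with the microstutter caveat above.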
Originally Posted by Devst8nDscoDve:

It happened a lot in Far Cry 3.