
dx11+nvidia -> no hardware vsync




Hello,

I am trying to get our system vsynced (NVIDIA-based), and apparently I can't get it working correctly:

Just testing with a blank new project here:

  • DX11 + vsync option on the command line -> quite stable 60 Hz, but I'm not sure whether this is actually vsynced or just framerate-limited
  • DX11 + vsync enabled in the NVIDIA Control Panel -> failure, ~400 Hz or more. RivaTuner reads the same value, so definitely NOT vsynced
  • GL + vsync option on the command line -> also quite stable 60 Hz
  • GL + vsync enabled in the NVIDIA Control Panel -> 60 Hz, so most probably actually vsynced

(I also tried Adaptive VSync in the NVIDIA Control Panel, with the same results.)
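One rough way to settle the doubt in the first bullet (real vsync vs. a plain framerate cap) is to look at the frame intervals themselves: under vsync they snap to whole multiples of the display period (1x, and 2x when a frame is missed), while a simple limiter just hovers near the target with continuous jitter. A minimal sketch of that heuristic; the tolerance and the sample traces below are made up for illustration:

```python
REFRESH_HZ = 60.0
PERIOD = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh

def looks_vsynced(frame_intervals, tolerance=0.08):
    """Heuristic: under vsync, every frame interval snaps to a whole
    multiple of the display period (1x, 2x when a frame is missed);
    a plain fps limiter just hovers near the target with free jitter."""
    for dt in frame_intervals:
        multiple = dt / PERIOD
        if abs(multiple - round(multiple)) > tolerance:
            return False
    return True

# snapped intervals (one missed frame at 2x the period) -> vsynced
print(looks_vsynced([0.0167, 0.0166, 0.0334, 0.0167]))  # True
# continuous jitter around ~16 ms -> just a limiter
print(looks_vsynced([0.0145, 0.0182, 0.0159, 0.0149]))  # False
```

RivaTuner-style tools can log per-frame times, so a trace like this is easy to capture in practice.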

So my questions here:

  • Can you reproduce this? Or is it just me and my 3 PCs with 3 different NVIDIA cards and 3 different displays? ;-)
  • Are you aware of this? Is it an NVIDIA issue? Unigine? DX11?
  • Switching to GL could be a possibility, but I read in your docs that this could incur a (big) performance hit (plus some features are less well supported in GL).

Hi Stephane,

What kind of issue are you trying to solve? Do you have tearing on a single monitor? Enabling vsync will not give you a stable 60 fps if your application's fps drops below 60 (for example, to 59 or so).

Please keep in mind that most of the driver settings in the NVIDIA Control Panel (especially for DX11) work only when the application is started in fullscreen mode (video_fullscreen 1); otherwise the tests are not valid. For a windowed application, vsync should be handled from the engine side.

To limit fps without vsync, you can use the render_max_fps 60 console command. In theory, simply limiting the fps is a better option than using vsync (no input lag, at least).
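Conceptually, a framerate cap like render_max_fps just sleeps off the unused part of each frame's time budget, with no relation to the display's refresh (which is why it avoids vsync's input lag but cannot remove tearing). A rough Python sketch of that idea, not the engine's actual implementation:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_capped(n_frames, render=lambda: None):
    """Run n_frames iterations, sleeping off whatever is left of each
    frame's time budget so the loop never exceeds TARGET_FPS.
    Deadlines are absolute, so sleep overshoot doesn't accumulate."""
    start = time.perf_counter()
    deadline = start
    for _ in range(n_frames):
        render()                       # the actual frame work
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)      # burn the unused budget
    return time.perf_counter() - start

elapsed = run_capped(30)  # 30 frames at 60 fps should take ~0.5 s
```

If render() ever takes longer than the budget, the loop simply runs at whatever rate it can, with no half-rate snapping as vsync would cause.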

Even if all 3 of your displays have vsync enabled, that does not guarantee you will see the same frame across all the monitors. All vsync does is align the frame start with the refresh of a single monitor (removing tearing); there is no guarantee that the refresh rates of different monitors are in sync with each other.
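The drift between unsynchronized monitors is easy to quantify: two panels both nominally "60 Hz" but ticking slightly apart slip a full frame out of phase once per beat period. A small worked example with hypothetical measured rates:

```python
# Two panels both sold as "60 Hz" but actually ticking slightly apart
# (hypothetical measured refresh rates, just for illustration).
hz_a, hz_b = 60.00, 59.94

# They slip a full frame out of phase once per beat period:
beat_seconds = 1.0 / abs(hz_a - hz_b)
print(f"{beat_seconds:.1f} s")  # 16.7 s to drift a whole frame apart
```

This is exactly the problem hardware genlock (e.g. Quadro Sync) exists to solve: it forces all the panels onto one refresh clock instead of letting them free-run.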

Thanks!

How to submit a good bug report
---
FTP server for test scenes and user uploads:


Hi Silent, I didn't know about the DX11 fullscreen restriction, since it works as expected for GL in a windowed app, but our final app does indeed use video_fullscreen=1. Still no vsync visible in that case!

And on top of that, we use Quadro cards with Sync cards, so all displays must swap their buffers at the same time (well, at least NVIDIA advertises its Quadro + Sync cards for exactly this purpose).

About the perf drop when vsync is enabled on the card and the fps falls below 60: I guess NVIDIA's "Adaptive VSync" was made just for that, since it temporarily disables vsync when the fps drops below the refresh rate.

About syncing multiple displays without tearing: that's better discussed in a support thread, should you need more internal details of our setup or app.

Here, I just wanted to point out that DX11 + NVIDIA driver vsync (even with video_fullscreen=1) isn't working on our side (but I'm open to hearing that I did something awfully wrong here).

Thanks :)


That's strange :) If I enable vsync in the driver and start the application, I can see that the fps is around 60. I will pass this to QA; maybe there were some issues in the 2.9.x branch, or the issue can only be reproduced with a limited set of HW + SW configs.

Could you please share the HW config where vsync does not seem to be working as expected (including any additional hardware such as sync cards)? I think it's best to dump the whole dxdiag report.

Thanks!

