
1 hour ago, Griphos said:

You really shouldn’t make claims based on stuff you read (particularly on some forum, from some guy who thinks he figured something out from reading other stuff on other forums) which you may not fully understand yourself.

I use vsync, and before building my current machine, I dropped below 30 all the time and never reached 60, yet my frame rates were NEVER some fraction of 60. Using several frame rate counters, the rates varied across the spectrum. The counters measure the actual rate of screen refreshes; even though they report averages rather than each individual frame, the averages would cluster near the fractional value if the rate were really locked to one. My frame rates were frequently in the mid-20s for extended periods. On my new machine, FPS is often locked at 60, but when it drops below that, it doesn’t drop to 30; it ranges through the 40s and 50s.

There is no “in-game” frame rate separate from the actual frame rate of the monitor. 

Anyway, vortex681's claims are well-informed, because that's exactly how it works on my computer. If I check vsync in X-Plane's graphics options, the frame rate is always a fraction of my monitor's refresh rate (Linux here, but I could check on my Windows machine). Now, if, like ColonelX, you tweak Nvidia parameters (for example, using adaptive vsync), your mileage may vary.

Pascal


I'm referring to knowledge that comes from actually working in the industry.  If you don't have that, and you are just reading things, then you shouldn't pretend you know what is going on just because you read someone say something.

I DON'T work in the industry, but I can tell you from experience with standard vsync that my FPS is not some integer fraction of the refresh rate like 30 or 15.  Perhaps this works differently somehow with Linux, as Pascal suggests, but with Windows, it just doesn't work this way.  Or at least it never has with ANY of the hundreds of games and sims I have played across several different Windows machines.

If your computer can output 40 FPS, you have standard vsync on, and your refresh rate is 60 Hz, then your computer will draw 40 FPS and your monitor will try to display 40 FPS.  It will stutter, because some of those frames will stay on the screen longer (several refresh cycles) so that the next one can be drawn without tearing, as you state.  Not every frame will stutter, because not every frame takes longer than a single cycle to draw.  The refresh rate sets a cap.  And you're right that for a given screen frame, vsync will insist it be drawn all at once, but you're wrong that this means an FPS of 30.  Some frames can be drawn in a single cycle and some cannot; those that cannot will take two cycles, and the resulting FPS showing on your monitor will be an average of the total: anything between 0 and 60 (or whatever your monitor's refresh rate is), with stuttering, because of the uneven ability to draw without tearing.
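The averaging argument above can be sketched with a toy model (a sketch only, not how any real driver is implemented). Assume a 60 Hz display and made-up per-frame render times; the numbers below are illustrative, not measurements. Under vsync a finished frame is held until the next vertical blank, so each frame occupies a whole number of refresh cycles, and the average FPS lands somewhere between 30 and 60 rather than snapping to 30:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh cycle

# Hypothetical render times in ms: some frames fit in one refresh
# cycle, some need two (all values here are invented for illustration).
render_times_ms = [14, 22, 25, 15, 30, 18, 24, 16, 28, 20]

# With vsync, each frame is displayed for a whole number of cycles:
# round its render time up to the next vertical blank.
cycles = [max(1, math.ceil(t / INTERVAL_MS)) for t in render_times_ms]

total_time_ms = sum(c * INTERVAL_MS for c in cycles)
avg_fps = 1000.0 * len(render_times_ms) / total_time_ms
print(f"average FPS: {avg_fps:.1f}")  # average FPS: 35.3
```

Because the 1-cycle and 2-cycle frames are interleaved, the counter reads an in-between average (about 35 FPS here) with uneven frame pacing, i.e. stutter, rather than a clean lock at 30.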

The real problem with vsync on a system unable to draw frames at the refresh rate is stuttering, not FPS locked to a fraction of the refresh rate.

Edited by Griphos

10 hours ago, Griphos said:

If you don't have that, and you are just reading things, then you shouldn't pretend you know what is going on just because you read someone say something.

Straight from NVIDIA: https://www.geforce.co.uk/hardware/technology/adaptive-vsync/technology. I quote: " When frame rates dip below the cap VSync locks the frame rate to the nearest level, such as 45 or 30 frames per second. As performance improves the frame rate returns to 60". Although the article is notionally about adaptive vsync, it initially explains the problem with standard vsync. Is that authoritative enough for you?
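For what it's worth, the "levels" in the NVIDIA quote (60, 45, 30) are what whole-refresh frame pacing produces when it settles into a repeating pattern on a 60 Hz display: k frames shown over n refresh cycles gives 60·k/n FPS. A quick sketch (the patterns below are illustrative arithmetic, not a claim about what any particular driver actually chooses):

```python
REFRESH_HZ = 60

def paced_fps(cycles_per_frame):
    """FPS for a repeating pattern of whole-refresh display durations."""
    frames = len(cycles_per_frame)
    refreshes = sum(cycles_per_frame)
    return REFRESH_HZ * frames / refreshes

print(paced_fps([1]))        # every frame shown for one cycle  -> 60.0
print(paced_fps([2]))        # every frame shown for two cycles -> 30.0
print(paced_fps([1, 1, 2]))  # repeating 1-1-2 pattern          -> 45.0
print(paced_fps([1, 2]))     # alternating 1-2 pattern          -> 40.0
```

Note that even NVIDIA's own 45 FPS example is not a plain divisor of 60; it corresponds to a mixed pattern (3 frames per 4 refreshes), and other mixes give other in-between rates.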

Perhaps vsync isn't working correctly on your system (or you're using adaptive vsync).

Edited by vortex681


I'm using normal vsync via the in-game check box, with nVidia set to use the game setting (the default).  The passage you quote says it locks to the nearest level, and includes the possibility of 45; 45 is not half of 60, and that alone disputes your earlier post. And nowhere does it say it does so in one-second increments.  As I suggested, it is much more likely it does so only as long as needed, which essentially means on a frame-by-frame basis, up to 60 times per second, leaving the average in any given second to fall anywhere below 60, not locked at 30.

But you didn’t just quote an authoritative source and let it speak for itself. You referenced an unauthoritative source and began making extrapolative (and simply wrong) claims about what happens on all our computers as if you yourself were an authority.  That’s the problem. Common on enthusiast forums, to be sure.

If you’re still wanting to maintain your claims, why don’t you post a little video of you running XP with vsync enabled over demanding scenery, with the internal frame counter showing you locked at 30 FPS because you can’t reach 60? Just the game’s vsync, not nVidia’s, and not set to cap at 30, but at 60. In other words, show that vsync set to run at a 60 Hz refresh rate but unable to do so actually drops you to a locked 30 FPS.

Or, short of that, please explain why every YouTube video of XP out there with the frame counter showing actually shows FPS all over the place, anywhere from the 20s up to 60 FPS. If you were actually right, no video would ever show FPS at anything other than 60, 30, or 15. Neither you nor I have ever seen that.

Edited by Griphos

8 hours ago, Griphos said:

But you didn’t just quote an authoritative source and let it speak for itself. You referenced an unauthoritative source and began making extrapolative (and simply wrong) claims about what happens on all our computers as if you yourself were an authority.  That’s the problem. Common on enthusiast forums, to be sure.

I didn't claim to be an authority; I just repeated what I've learned from those I consider much better informed than either of us and whose expertise I trust. I have not made "extrapolative" claims; I've just quoted sources who know much more about how vsync works than either of us does. Since you clearly will not accept information even from NVIDIA, who actually make the hardware and firmware that most of us use, I see no point in continuing this discussion. Please consider this to be my final post in this thread.

