Bunchy

Will 3080Ti maintain 30FPS at 4k Ultra setting?


2 hours ago, Noel said:

To follow up on Lord Farringdon's comments, I tried setting Adaptive Vsync to ON in NCP, with Vsync ON in MSFS set at 60. While animation seems smooth, with a variable frame rate between 32 and 54 or so (dim dusk light in rain at CYYC, so not really sure yet), I had visible screen tearing when panning that I never see with Vsync ON in MSFS set to 30. Is screen tearing solved with G-Sync displays?

Yes, there's no screen tearing with true G-Sync displays.

But there are also some versions that are "G-Sync Compatible" or other monikers. They usually don't have a G-Sync chip in the monitor, but they will work fine at around 60 fps and up.

With true G-Sync monitors you get no screen tearing even at very low framerates.

I'll link to an article that explains it much better.


https://www.tomshardware.com/reviews/nvidia-gsync-monitor-glossary-definition-explained,6008.html


R7 5800X3D | RTX 4080 OC 16 GB | 64 GB 3600 | 3440x1440 G-Sync | Logitech Pro Throttles Rudder Yoke Panels | Thrustmaster T.16000M FCS | TrackIR 5 | Oculus Rift S
Experience with Flight Simulator since early 1990s

3 hours ago, Republic3D said:

But there are also some versions that are "G-Sync Compatible" or other monikers. They usually don't have a G-Sync chip in the monitor, but they will work fine at around 60 fps and up.

There seems to be some confusion about this.

I'm currently using a Samsung QN90A with a 2080 Ti, which is G-Sync Compatible, and it runs MSFS very well with the NCP limit at 30 or 40 fps without any screen tearing whatsoever. Performance is rock solid, and the fps varies only slightly in complex areas…


2 hours ago, craigeaglefire said:

There seems to be some confusion about this.

I'm currently using a Samsung QN90A with a 2080 Ti, which is G-Sync Compatible, and it runs MSFS very well with the NCP limit at 30 or 40 fps without any screen tearing whatsoever. Performance is rock solid, and the fps varies only slightly in complex areas…

Yes. It just depends on the VRR range of the monitor and whether it supports LFC or not. 


19 hours ago, Noel said:

Terry, it seems like you are saying Adaptive V-Sync obviates the need for G-Sync, or perhaps does essentially the same thing. One can, for example, set Vsync to 60, and when frames drop below 60, Adaptive Vsync eliminates the otherwise inevitable stuttering that ensues when the system is unable to maintain the minimum 60, or whatever Vsync frame lock you're using. If I understood this correctly, have you compared the two? Most would agree the goal, no matter what the hardware, is stutter-free performance at the greatest visual detail possible at the highest sustainable frame rate. G-Sync, and from what you're suggesting Adaptive V-Sync, address the sustainability problem. OTOH, if what I think you're saying is true, one should be able to set Vsync to 120 for a 120 Hz display and let Adaptive V-Sync save the day, which of course would be hard to fathom, verging on the miraculous. As mentioned above, there is currently no way to maintain frame rates of, say, 50 or 60 with the current hardware, including the 3090, in complex planes in all conditions, especially at 4K or with LOD dialed beyond ultra.

Hi Noel, Sorry for the late reply.

I'm no expert on this (few of us are), and the technology is changing so quickly it is sometimes hard to keep up with it. V-Sync, Adaptive V-Sync, G-Sync, FreeSync and VRR (to name a few) are all trying to do much the same thing, i.e. keep you from stuttering your way through the session or tearing your video apart.

It's not necessarily about giving you better frame rates, since that is a CPU/GPU tasking issue, but about giving you a smoother experience with the frame rates you do get. The GPU pushes frames out at whatever speed it can, without any consideration for the display refresh rate. So if your GPU is sending 100 FPS, you will overwhelm your 60 Hz display, and one part of the display will show one frame while another part is already showing the next. There's the video tear, right there! The V-Sync solution is to hold each finished frame until the next refresh, which caps output at 60 FPS to match your fixed 60 Hz monitor.
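The frame-holding behaviour described above is also why V-Sync stutters when you just miss the refresh rate. A rough numerical sketch (a simplified double-buffer model with an illustrative helper, not any real driver API):

```python
# Rough model of double-buffered V-Sync on a fixed-refresh display.
import math

def vsync_fps(gpu_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective displayed FPS: each frame waits for the next refresh,
    so the output rate snaps to refresh / n for an integer n."""
    if gpu_fps >= refresh_hz:
        return refresh_hz            # capped at the refresh rate, no tearing
    # the GPU now needs n refresh ticks per frame
    n = math.ceil(refresh_hz / gpu_fps)
    return refresh_hz / n

print(vsync_fps(100))   # 60.0 -> fast GPU is simply capped
print(vsync_fps(59))    # 30.0 -> just missing 60 Hz halves the rate
print(vsync_fps(32))    # 30.0
print(vsync_fps(25))    # 20.0
```

The jump from 59 FPS straight down to 30 FPS is the "stuttering" cost of classic V-Sync that the post goes on to describe.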

Of course there are a few issues with this. Despite preventing screen tearing, V-Sync often causes screen "stuttering" and input lag due to the delayed frames. And here is some vital information I have picked up along the way:

"V-Sync only is useful when the graphics card outputs video at a high FPS rate, and the display only supports a 60Hz refresh rate (which is common in legacy equipment and non-gaming displays). V-Sync enables the display to limit the output of the graphics card, to ensure both devices are operating in sync.

Although the technology works well with low-end devices, V-Sync degrades the performance of high-end graphics cards." (Source: ViewSonic Library)

I think the MSFS gurus have included V-Sync to help those guys with cards that can barely reach 30 FPS (cap at 20 FPS) or those who can't get near 60 FPS (cap at 30 FPS). That would probably give the best experience for most legacy cards, although clearly it's just an approximation, and different flying scenarios will change where you sit on this. But it seems reasonable to assume that V-Sync, and the fixed 60 Hz monitors it was devised around, does degrade the performance of high-end graphics cards. So new technologies were brought in: G-Sync by Nvidia and FreeSync by AMD.

Both of these technologies bring in another abbreviation, VRR or Variable Refresh Rate, and as such require monitors that can logically connect with the Nvidia or AMD card. So the screen manufacturers have been pumping out displays with 120, 144, 165 and 240 Hz refresh rates, and it works something like this: if your GPU is hanging at 50 FPS, G-Sync tells the display to match it in Hz; if it drops to 40 FPS, the display follows it down. This goes on with the screen able to go as low as 30 Hz or as high as its maximum rate, say 144 Hz. Apart from resolving screen tearing and stuttering, this VRR solution also unleashes the power of modern GPUs, and many Nvidia RTX 30-series cards are ready to use it.
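The matching behaviour described above amounts to clamping the display's refresh rate to the GPU's frame rate inside the panel's VRR window. A minimal sketch (the 30-144 Hz window is illustrative; real panels advertise their own range):

```python
def vrr_refresh(fps: float, vrr_min: float = 30.0, vrr_max: float = 144.0) -> float:
    """Refresh rate a VRR display would run at: it follows the GPU's
    frame rate directly while fps stays inside the VRR window."""
    return min(max(fps, vrr_min), vrr_max)

print(vrr_refresh(50))    # 50.0  -> display follows the GPU down
print(vrr_refresh(40))    # 40.0
print(vrr_refresh(200))   # 144.0 -> clamped at the panel maximum
```

Inside the window every frame is shown the moment it is ready, which is why tearing and the V-Sync stutter both disappear.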

FreeSync is AMD's version and is cheaper than Nvidia's, but in all cases you have to have a G-Sync-ready GPU and display, or an AMD card and a FreeSync-ready display. I understand that you can make an Nvidia card work on a cheaper FreeSync display, so we may be seeing more crossover in these technologies, which seem to be moving quite fast. To answer another of your questions, Noel: Adaptive-Sync seems to refer to the VESA Adaptive-Sync standard that AMD's FreeSync is built on, which requires connection through DisplayPort cables. The RTX cards have both HDMI 2.1 and DisplayPort outputs, so it looks like in 2021 there is an Nvidia VRR option that does not need the more expensive G-Sync monitor.

Whew!! So much stuff.   That's enough but, just one more thing. Heard of HDMI 2.1? Well, the earlier HDMI 2.0 had a transfer rate of 15Gb/s. The new HDMI 2.1 is 48Gb/s.  All RTX 30 cards are 2.1 ready and if you have a gaming mode on your display with a 2.1 input then by connecting the GPU and the display with an HDMI2.1 cable you are not limiting your frame rates by the pipe they have to travel down. Going back to the OP. Thinking of upgrading your screen? Make sure it has HDMI 2.1 to take advantage of coming improvements in this everchanging (and challenging) world of FPS 😊.
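As a back-of-the-envelope check on those link rates, you can compare a mode's raw pixel data rate against the cable's capacity. This ignores blanking intervals and link encoding overhead, so real connections need some headroom on top of these figures:

```python
def video_bitrate_gbps(w: int, h: int, hz: int,
                       bits_per_channel: int = 10, channels: int = 3) -> float:
    """Raw (pre-blanking, pre-encoding) pixel data rate in Gb/s."""
    return w * h * hz * bits_per_channel * channels / 1e9

# 4K at 60 Hz, 10-bit colour: ~14.9 Gb/s raw -- near HDMI 2.0's 18 Gb/s ceiling
print(round(video_bitrate_gbps(3840, 2160, 60), 1))
# 4K at 120 Hz, 10-bit colour: ~29.9 Gb/s raw -- needs HDMI 2.1 (48 Gb/s)
print(round(video_bitrate_gbps(3840, 2160, 120), 1))
```

The doubling from 60 Hz to 120 Hz is exactly why 4K high-refresh gaming was the headline use case for HDMI 2.1.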

Terry     


No. No, Mav, this is not a good idea.

Sorry Goose, but it's time to buzz the tower!

Intel(R) Core(TM) i7-10700 CPU @ 2.90 GHz, 32 GB RAM, NVIDIA GeForce RTX 3060, 12 GB VRAM, Samsung QN70A 4K 65-inch TV with VRR 120 Hz FreeSync (G-Sync Compatible).

Boeing Thrustmaster TCA Yoke, Honeycomb Bravo Throttle Quadrant, Turtle Beach Velocity One Rudder Pedals.   

1 hour ago, Lord Farringdon said:

The RTX cards have both HDMI 2.1 and DisplayPort outputs, so it looks like in 2021 there is an Nvidia VRR option that does not need the more expensive G-Sync monitor.

Not quite true: G-Sync Compatible devices use LFC, whereas hardware-based G-Sync displays can go all the way down to 1 Hz without using LFC. This is quite noticeable below 40 FPS/Hz too. Once you get below 40 on a Compatible display, let's say 38 FPS, LFC kicks in and the refresh rate is now 76 Hz. You can 'feel' the slight judder it causes as the Hz doubles. So if your frame rate was fluctuating between 35-45 FPS, you would be far better off with a hardware-based G-Sync display rather than a Compatible one, as it would constantly be shifting in and out of LFC.

 

I have been fortunate enough to test this numerous times, as I have an LG UltraGear hardware-based G-Sync monitor and also a G-Sync Compatible LG C1 TV.
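The LFC behaviour described above can be put in numbers. A minimal sketch (the VRR window values are illustrative; real panels report their own ranges): below the window, each frame is repeated just enough times to land the refresh rate back inside it, which is where the 38 FPS → 76 Hz doubling comes from.

```python
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 120.0) -> float:
    """Refresh rate with Low Framerate Compensation: below the VRR window,
    each frame is shown m times so that fps * m falls back inside it."""
    m = 1
    while fps * m < vrr_min:
        m += 1
    return min(fps * m, vrr_max)

print(lfc_refresh(38, vrr_min=40))   # 76.0 -> frames doubled, as described above
print(lfc_refresh(45, vrr_min=40))   # 45.0 -> inside the window, no LFC needed
```

A frame rate hovering right at the window's lower edge keeps toggling m between 1 and 2, which is the judder and flicker the posts above describe.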


22 minutes ago, Ianrivaldosmith said:

Not quite true: G-Sync Compatible devices use LFC, whereas hardware-based G-Sync displays can go all the way down to 1 Hz without using LFC. This is quite noticeable below 40 FPS/Hz too. Once you get below 40 on a Compatible display, let's say 38 FPS, LFC kicks in and the refresh rate is now 76 Hz. You can 'feel' the slight judder it causes as the Hz doubles. So if your frame rate was fluctuating between 35-45 FPS, you would be far better off with a hardware-based G-Sync display rather than a Compatible one, as it would constantly be shifting in and out of LFC.

I have been fortunate enough to test this numerous times, as I have an LG UltraGear hardware-based G-Sync monitor and also a G-Sync Compatible LG C1 TV.

The LFC shift also creates what is known as G-Sync/FreeSync flicker, which I found very noticeable.

We've been saying this all along, but with the right hardware the days of a static frame rate lock are numbered. With the right HW combination it's just not required anymore, and you can take advantage of everything the system has to give without handcuffing your V8 motor to a 4-cylinder... So yeah, I guess we don't have a disease after all and maybe, just maybe, we actually knew what we were talking about 😉

On the other hand, we could always start a poll with just one option and start insulting anyone who disagrees with us and position it as a PSA and a service to the flightsim community 😉 

Edited by psolk

Have a Wonderful Day

-Paul Solk


1 hour ago, Lord Farringdon said:

Hi Noel, Sorry for the late reply.

I'm no expert on this (few of us are), and the technology is changing so quickly it is sometimes hard to keep up with it. V-Sync, Adaptive V-Sync, G-Sync, FreeSync and VRR (to name a few) are all trying to do much the same thing, i.e. keep you from stuttering your way through the session or tearing your video apart.

It's not necessarily about giving you better frame rates, since that is a CPU/GPU tasking issue, but about giving you a smoother experience with the frame rates you do get. The GPU pushes frames out at whatever speed it can, without any consideration for the display refresh rate. So if your GPU is sending 100 FPS, you will overwhelm your 60 Hz display, and one part of the display will show one frame while another part is already showing the next. There's the video tear, right there! The V-Sync solution is to hold each finished frame until the next refresh, which caps output at 60 FPS to match your fixed 60 Hz monitor.

Of course there are a few issues with this. Despite preventing screen tearing, V-Sync often causes screen "stuttering" and input lag due to the delayed frames. And here is some vital information I have picked up along the way:

"V-Sync only is useful when the graphics card outputs video at a high FPS rate, and the display only supports a 60Hz refresh rate (which is common in legacy equipment and non-gaming displays). V-Sync enables the display to limit the output of the graphics card, to ensure both devices are operating in sync.

Although the technology works well with low-end devices, V-Sync degrades the performance of high-end graphics cards." (Source: ViewSonic Library)

I think the MSFS gurus have included V-Sync to help those guys with cards that can barely reach 30 FPS (cap at 20 FPS) or those who can't get near 60 FPS (cap at 30 FPS). That would probably give the best experience for most legacy cards, although clearly it's just an approximation, and different flying scenarios will change where you sit on this. But it seems reasonable to assume that V-Sync, and the fixed 60 Hz monitors it was devised around, does degrade the performance of high-end graphics cards. So new technologies were brought in: G-Sync by Nvidia and FreeSync by AMD.

Both of these technologies bring in another abbreviation, VRR or Variable Refresh Rate, and as such require monitors that can logically connect with the Nvidia or AMD card. So the screen manufacturers have been pumping out displays with 120, 144, 165 and 240 Hz refresh rates, and it works something like this: if your GPU is hanging at 50 FPS, G-Sync tells the display to match it in Hz; if it drops to 40 FPS, the display follows it down. This goes on with the screen able to go as low as 30 Hz or as high as its maximum rate, say 144 Hz. Apart from resolving screen tearing and stuttering, this VRR solution also unleashes the power of modern GPUs, and many Nvidia RTX 30-series cards are ready to use it.

FreeSync is AMD's version and is cheaper than Nvidia's, but in all cases you have to have a G-Sync-ready GPU and display, or an AMD card and a FreeSync-ready display. I understand that you can make an Nvidia card work on a cheaper FreeSync display, so we may be seeing more crossover in these technologies, which seem to be moving quite fast. To answer another of your questions, Noel: Adaptive-Sync seems to refer to the VESA Adaptive-Sync standard that AMD's FreeSync is built on, which requires connection through DisplayPort cables. The RTX cards have both HDMI 2.1 and DisplayPort outputs, so it looks like in 2021 there is an Nvidia VRR option that does not need the more expensive G-Sync monitor.

Whew!! So much stuff. That's enough, but just one more thing. Heard of HDMI 2.1? Well, the earlier HDMI 2.0 had a transfer rate of 18 Gb/s; the new HDMI 2.1 is 48 Gb/s. All RTX 30 cards are 2.1-ready, and if your display has a gaming mode with a 2.1 input, then by connecting the GPU and the display with an HDMI 2.1 cable you are not limiting your frame rates by the pipe they have to travel down. Going back to the OP: thinking of upgrading your screen? Make sure it has HDMI 2.1 to take advantage of coming improvements in this ever-changing (and challenging) world of FPS 😊.

Terry     

 

23 minutes ago, Ianrivaldosmith said:

Not quite true: G-Sync Compatible devices use LFC, whereas hardware-based G-Sync displays can go all the way down to 1 Hz without using LFC. This is quite noticeable below 40 FPS/Hz too. Once you get below 40 on a Compatible display, let's say 38 FPS, LFC kicks in and the refresh rate is now 76 Hz. You can 'feel' the slight judder it causes as the Hz doubles. So if your frame rate was fluctuating between 35-45 FPS, you would be far better off with a hardware-based G-Sync display rather than a Compatible one, as it would constantly be shifting in and out of LFC.

I have been fortunate enough to test this numerous times, as I have an LG UltraGear hardware-based G-Sync monitor and also a G-Sync Compatible LG C1 TV.

Thanks for explaining all this. I was pretty confused before.

I like the sound of a G-Sync hardware monitor, as this seems to offer the greatest FPS flexibility. I'm happy locked at 30 but would like to remove V-Sync and have a smooth experience if I find myself in the gap between 30 and 60 FPS. It's looking like G-Sync hardware monitors are more plentiful in the 27" to 32" sizes.

Stu


i7 12700K , 32GB RAM @3600MHz, Asus Z690-Plus D4 MB, Gainward 4090 RTX Graphics, 850W Corsair PSU, Kraken AIO watercooler, Nvme 1TB ssd, 1TB ssd, 500GB ssd.

Just now, Bunchy said:

 

Thanks for explaining all this. I was pretty confused before.

I like the sound of a G-Sync hardware monitor, as this seems to offer the greatest FPS flexibility. I'm happy locked at 30 but would like to remove V-Sync and have a smooth experience if I find myself in the gap between 30 and 60 FPS. It's looking like G-Sync hardware monitors are more plentiful in the 27" to 32" sizes.

Stu

It's pretty new to all of us despite being around for a couple of years now... I think Flight Sim is probably one of the biggest gainers with G-Sync, as I don't have a lot of other apps that can't hold 60 FPS consistently. In iRacing, though, it allows my FPS minimum to drop to 84 FPS and still be butter smooth without a hiccup, so it's working in other apps as well!



1 hour ago, Bunchy said:

I like the sound of a G-Sync hardware monitor, as this seems to offer the greatest FPS flexibility. I'm happy locked at 30 but would like to remove V-Sync and have a smooth experience if I find myself in the gap between 30 and 60 FPS. It's looking like G-Sync hardware monitors are more plentiful in the 27" to 32" sizes.

Stu

Looks like you and I are looking in the same direction, Stu. Unfortunately, I can't fit even the smallest LG C1 (48 inch); not quite enough room.

I have an old Samsung 28-inch 4K at the moment (not G-Sync etc.). Ideally I would like to go a bit bigger (32 inch), 4K, but with G-Sync, as like you I reckon I can hold 45-50 fps everywhere, but I have had no success achieving that by other means without stuttering and tearing.

As an experiment, on Saturday I tried Vsync 60 in sim, RTSS with a frame cap at 45 fps, and adaptive framerate in Nvidia control panel - no good!

As I was prepared to stretch to about £1,200 for the LG C1 if I could have made it fit, I decided to look at some 4k 32 inch g-sync monitors instead, but the prices are really steep. 
I saw one for £2,500!  😀

UPDATE:  At Overclockers: -  Gigabyte M32U - £749.99, but all out of stock.  Alternative - Asus 32" ROG SWIFT PG32UQX - £3,298.99! 😆

 

Edited by bobcat999

Call me Bob or Rob, I don't mind, but I prefer Rob.

I like to trick airline passengers into thinking I have my own swimming pool in my back yard by painting a large blue rectangle on my patio.

Intel 14900K in a Z790 motherboard with water cooling, RTX 4080, 32 GB 6000 CL30 DDR5 RAM, W11 and MSFS on Samsung 980 Pro NVME SSD's.  Core Isolation Off, Game Mode Off.

5 minutes ago, bobcat999 said:

As I was prepared to stretch to about £1,200 for the LG C1 if I could have made it fit, I decided to look at some 4k 32 inch g-sync monitors instead, but the prices are really steep. 

LG Ultragear is what you need....


36 minutes ago, Ianrivaldosmith said:

 

LG 38GL950G

 

 

I've seen this monitor for sale as G-Sync Compatible only, and then it's called the LG 38GN950.
So anyone looking for this monitor, make sure you get the right one: the LG 38GL950G.




Would it be worth me selling my MSI Lightning Z 1080 Ti for a 3070 Ti?

I'm wanting the 3090, but the price tag is just appalling.

Cheers,

Mike


24 minutes ago, mikeymike said:

Would it be worth me selling my MSI Lightning Z 1080 Ti for a 3070 Ti?

I'm wanting the 3090, but the price tag is just appalling.

Cheers,

Mike

It's a bit difficult to answer, because the 1080 Ti is still a good card, and it has 11 GB VRAM.
The 3070 Ti is a beast, and you'll get 20-30% more performance, but you're going from 11 GB of VRAM down to 8 GB.

The 3070 Ti cards I've seen on the market go for almost the same price as the 3080. If I were you I would try to find a 3080 for just about the same price; then it would be worth it. That's just my opinion. Not only because it has 10 GB VRAM, but because it's a significant step up from the 3070 Ti in performance.

The good news is you should be able to get a good price for your used 1080 Ti either way.

Btw, the Super series of cards is just around the corner, but the change from the current versions shouldn't be that big. And the prices are not going to come down on those models any time soon.

Edited by Republic3D


3 hours ago, Lord Farringdon said:

Whew!! So much stuff. That's enough, but just one more thing. Heard of HDMI 2.1? Well, the earlier HDMI 2.0 had a transfer rate of 18 Gb/s; the new HDMI 2.1 is 48 Gb/s. All RTX 30 cards are 2.1-ready, and if your display has a gaming mode with a 2.1 input, then by connecting the GPU and the display with an HDMI 2.1 cable you are not limiting your frame rates by the pipe they have to travel down. Going back to the OP: thinking of upgrading your screen? Make sure it has HDMI 2.1 to take advantage of coming improvements in this ever-changing (and challenging) world of FPS 😊.

Terry     

Thanks Terry, in particular for the information about HDMI. My Dell U3415W only supports HDMI 2.0. I won't be replacing this screen anytime soon, so I'm limited to that for now. It would be useful to calculate the maximum transfer rate encountered with the hardware and settings involved.

The fundamental problem as I see it currently (that is, for people without a G-Sync-capable display) is what I wrote about in my post above:

... the closer you are to the ground at both departure and arrival airports, the more you will appreciate a higher frame rate, because you are moving fast relative to close objects during takeoff and landing (and taxiing to a lesser extent, as you are moving at relatively low speeds), so the amount of motion per frame is larger. Up in the air, where I can easily maintain 40-55 FPS with these settings in something like the 787-10 HD, it doesn't matter at all: when you're looking at a building from, say, 2000' while flying at 180 knots, the distance that building moves on your display is measured in just a few pixels per frame at 30 fps anyway, so you can't notice any difference. It really needs to be the opposite: down low you need high frame rates, up high fewer frames. Therein lies the problem IF you're going to max out settings in MSFS.
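Noel's pixels-per-frame argument can be put to numbers. A rough small-angle sketch (all scene values here, such as distances, speeds and the 90° field of view, are illustrative assumptions, not taken from the sim):

```python
import math

def pixels_per_frame(speed_mps: float, distance_m: float, fps: float,
                     width_px: int = 3840, hfov_deg: float = 90.0) -> float:
    """Approximate on-screen motion of an object passing abeam,
    in pixels per rendered frame (small-angle, centre-of-screen)."""
    angular_rate = speed_mps / distance_m            # rad/s for an object abeam
    px_per_rad = width_px / math.radians(hfov_deg)   # linear approximation
    return angular_rate / fps * px_per_rad

KT = 0.5144  # metres per second in one knot

# Taxiing at 15 kt past a sign 20 m away: ~31 px of motion per frame at 30 fps
print(round(pixels_per_frame(15 * KT, 20, 30)))
# Cruising at 180 kt past a building ~2 km away: ~4 px per frame at 30 fps
print(round(pixels_per_frame(180 * KT, 2000, 30)))
```

Because angular motion scales with speed divided by distance, the slow taxi past nearby objects produces far more on-screen motion per frame than fast flight past distant ones, which is exactly the "high frames down low, fewer up high" point.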

For example, at the rather pedestrian CYYC (Calgary, Alberta, Canada) in the 787-10 HD in rain, with the sim unlocked and VSync disabled, the frame rate varies between 32 and 36 during taxi (I am currently set at LOD-O 300, LOD-T 400, and RS 120). And this is not smooth and stutter-free, due to the variable frame rate and the lack of syncing technology to exploit. So to achieve that rock-solid smooth video I'm limited to using VSync set at 30 in-sim. In P3D, which did not offer in-sim VSync, I set my display at 30 Hz and was able to achieve the same smooth, stutter-free 30 FPS. I don't find a 30 Hz refresh a significant issue at all on this display, so without G-Sync this is really my best option. Very early on I made the statement that this is where I will stay until my next build, which is several years out, or until my display dies and I can pick up a G-Sync display with HDMI 2.1 support or whatever is best then. And you know how displays are... they don't die!

As a side point, rather than describing VSync as 'degrading' higher-end graphics cards, I think it's more apt to say VSync restricts demand in order to hold a frame rate that can always be sustained for the settings employed: the best we can do for now. I was hopeful 'Adaptive Vsync' might be a driver/software solution, but alas it clearly wasn't in my brief test. Fortunately I'm not wanting in any way, so I am delighted to have smooth, stutter-free performance in any plane, anywhere, in any conditions, which matters far more to me than eking out more frames per second.

 


Noel

System:  7800x3D, Thermal Grizzly Kryonaut, Noctua NH-U12A, MSI Pro 650-P WiFi, G.SKILL Ripjaws S5 Series 32GB (2 x 16GB) 288-Pin PC RAM DDR5 6000, WD NVMe 2Tb x 1, Sabrent NVMe 2Tb x 1, RTX 4090 FE, Corsair RM1000W PSU, Win11 Home, LG Ultra Curved Gsync Ultimate 3440x1440, Phanteks Enthoo Pro Case, TCA Boeing Edition Yoke & TQ, Cessna Trim Wheel, RTSS Framerate Limiter w/ Edge Sync for near zero Frame Time Variance achieving ultra-fluid animation at lower frame rates.

Aircraft used in A Pilot's Life V2:  PMDG 738, Aerosoft CRJ700, FBW A320nx, WT 787X

 

