
Will 3080Ti maintain 30FPS at 4k Ultra setting?


40 minutes ago, Lord Farringdon said:

Given your experience and knowledge of G-Sync, @Ianrivaldosmith, I'd be interested in your thoughts on HDMI VRR: do you see it as achieving the same thing as G-Sync, or is it more just another in the line of G-Sync Compatible approaches that do the job, but not as well as G-Sync?

It's good, yes. The problem is the range. Let's say it's 44-120Hz, which a lot of these displays seem to be. Then as soon as you're below 44fps (which in MSFS can happen frequently) you get that slight judder as LFC kicks in and the Hz doubles. It can also cause flickering on some displays. You don't have this with a proper G-Sync monitor.

I have this issue with my LG C1. Get below 40fps and LFC kicks in for frames as low as 20, but climb above 40fps again and you notice the shift. It's subtle, and it wouldn't bother some people. However, I rarely go below 40fps, so I don't really have to deal with it much.

19 minutes ago, Ianrivaldosmith said:

It's good, yes. The problem is the range. Let's say it's 44-120Hz, which a lot of these displays seem to be. Then as soon as you're below 44fps (which in MSFS can happen frequently) you get that slight judder as LFC kicks in and the Hz doubles. It can also cause flickering on some displays. You don't have this with a proper G-Sync monitor.

I have this issue with my LG C1. Get below 40fps and LFC kicks in for frames as low as 20, but climb above 40fps again and you notice the shift. It's subtle, and it wouldn't bother some people. However, I rarely go below 40fps, so I don't really have to deal with it much.

Thanks Ian. That sent me on a search and I found this article (over a year old now, though), which I've included because I think a lot of our Xbox colleagues would be interested too. But you are right, it looks like HDMI VRR might only be 30Hz to 120Hz... not a showstopper though, by any means.

"Both current versions of the Xbox support VRR in a window of 40Hz to 60Hz, but things get really interesting when we look forward to the Xbox Series X. Microsoft's next-gen console supports HDMI 2.1 and will handle VRR in 4K from as low as 30Hz right up to 120Hz - as long as your TV can do the same. Again Xbox Series X will support both FreeSync and HDMI VRR."

https://www.whathifi.com/au/advice/vrr-everything-you-need-to-know-about-variable-refresh-rate

Terry


No. No, Mav, this is not a good idea.

Sorry Goose, but it's time to buzz the tower!

Intel(R) Core(TM) i7-10700 CPU @ 2.90GHz, 32GB RAM, NVIDIA GeForce RTX 3060 12GB VRAM, Samsung QN70A 4K 65-inch TV with VRR 120Hz FreeSync (G-Sync Compatible).

Boeing Thrustmaster TCA Yoke, Honeycomb Bravo Throttle Quadrant, Turtle Beach Velocity One Rudder Pedals.   

1 hour ago, Lord Farringdon said:

HDMI 2.1 and will handle VRR in 4K from as low as 30Hz right up to 120Hz

Yes, this is correct; the LG C1 is actually now 20Hz through 120Hz.

 

However, the issue remains that once it dips below 40 FPS/Hz, LFC kicks in, which doubles the refresh rate. This doesn't happen with a true hardware-based G-Sync monitor. Hence, for flight sim, they are 100% worth the extra money, as flight sim frame rates sit widely between 30 and 60 fps.

 

EXAMPLE: Let's say you're at 44 FPS, so your Hz will be at 44 and smooth. Now let's say you momentarily drop down to 38 FPS. On a true hardware-based G-Sync monitor, your Hz will also drop down to 38, mirroring the FPS. Smooth. However, on a G-Sync Compatible display (i.e. anything that is not hardware G-Sync based), your drop to 38 FPS will result in 76 Hz, as LFC has just kicked in. This can cause a slight judder. Noticeable to me, maybe not as noticeable to others. The problem is that it can also create a flickering effect, as the now higher Hz may have a different brightness; this depends on the quality of the display. It doesn't happen on the C1, but you do get the slight judder.
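To make the arithmetic above concrete, here is a minimal sketch of how the effective refresh rate behaves, assuming a simple frame-doubling LFC and an illustrative 40-120Hz VRR window (real displays and drivers will have their own thresholds and multipliers):

```python
# Illustrative sketch of the refresh-rate behaviour described above.
# The 40-120Hz window and the simple frame-repeating LFC are assumptions
# for the example, not any particular display's spec.

VRR_MIN_HZ = 40   # assumed lower bound of the VRR window
VRR_MAX_HZ = 120  # assumed upper bound of the VRR window

def effective_refresh_hz(fps: float) -> float:
    """Return the panel refresh rate for a given frame rate."""
    if fps >= VRR_MIN_HZ:
        return min(fps, VRR_MAX_HZ)   # VRR tracks the frame rate directly
    # Below the window, LFC repeats each frame until the multiple lands
    # back inside the supported range.
    multiplier = 2
    while fps * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return fps * multiplier

for fps in (44, 38, 20):
    print(f"{fps} fps -> {effective_refresh_hz(fps):.0f} Hz")
```

Running it for 44, 38 and 20 fps prints 44, 76 and 40 Hz respectively: the same jump from "refresh mirrors frame rate" to "refresh is a multiple of frame rate" that shows up as the slight judder.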

 

This is why hardware G-Sync monitors command a premium, and for flight sim, to me at least, that premium is justified, because I do notice stuttering; someone else may not notice it as much as I do.

On 10/8/2021 at 7:23 AM, Republic3D said:

I did, however, send the 3080 back and kept my 3090 after some serious consideration. And it's not about how the sim performed today; it was absolutely great with both GPUs. It was about my thoughts on VRAM usage in the future, with even more complex scenery and addons.

Unfortunately, this has again become my dilemma. I had to RMA the 3090 I purchased a couple of months ago as it was faulty. Now, while I was given a refund by the retailer (I was not given the option of a replacement card; I don't think they had stock), in the interim the gap between what you pay in Australia for a 3080Ti and a 3090 has blown out from about $350 to over $800! While I was, and still am, all for future-proofing when it comes to VRAM, as I run the sim at 4K, it is much harder now to justify the extra $800 expense for an additional 12GB of VRAM that you may or may not ever need. There is only so much money you can throw at this sim, so I may just have to live with 12GB of VRAM and hope for the best.

Bruce


Bruce Bartlett

 

Frodo: "I wish none of this had happened." Gandalf: "So do all who live to see such times, but that is not for them to decide. All we have to decide is what to do with the time that is given to us."

On 10/12/2021 at 7:07 PM, Ianrivaldosmith said:

You want the GPU 100% used by MSFS. That's the whole point. Actually, when the GPU is at 100%, it can stop you being main-thread limited and eliminate any last stutters.

I just have to ask, Ian:

How many watts are you drawing, and what temperature is your GPU at, when it's running at 100%?

22 minutes ago, craigeaglefire said:

I just have to ask, Ian:

How many watts are you drawing, and what temperature is your GPU at, when it's running at 100%?

Couldn't tell you how many watts it's using; it's an 850W PSU though. And the temperature is around 72-77°C on average, depending on the room temp.

11 minutes ago, Ianrivaldosmith said:

Couldn't tell you how many watts it's using,

HWiNFO is a good utility for GPU wattage.

An 850W PSU is on the low side for a 3090…
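If you'd rather log the figures yourself than read them off HWiNFO, here is a minimal sketch using NVIDIA's NVML bindings (the nvidia-ml-py / pynvml package); it assumes an NVIDIA card and reads the first GPU in the system:

```python
# Minimal sketch: read GPU power draw and temperature via NVML (pynvml).
# Assumes an NVIDIA GPU with recent drivers and the nvidia-ml-py package.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"GPU power draw: {power_w:.0f} W, temperature: {temp_c} C")
pynvml.nvmlShutdown()
```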

27 minutes ago, Ianrivaldosmith said:

Couldn't tell you how many watts it's using; it's an 850W PSU though. And the temperature is around 72-77°C on average, depending on the room temp.

That is similar to me on the 3080Ti: 850W power supply, temperatures roughly the same as Ian's when using MSFS. It never really goes above 79°C, even with a mild overclock, which I have turned down now as it isn't needed.


Call me Bob or Rob, I don't mind, but I prefer Rob.

I like to trick airline passengers into thinking I have my own swimming pool in my back yard by painting a large blue rectangle on my patio.

Intel 14900K in a Z790 motherboard with water cooling, RTX 4080, 32 GB 6000 CL30 DDR5 RAM, W11 and MSFS on Samsung 980 Pro NVME SSD's.  Core Isolation Off, Game Mode Off.

50 minutes ago, 6297J said:

850W on a 3090; it usually hovers around 65°C locked at 30 fps.

Unlock it 😉 see above posts ^^^

3 minutes ago, Ianrivaldosmith said:

Unlock it 😉 see above posts ^^^

No need 🙃

2 hours ago, craigeaglefire said:

an 850W PSU is on the low side for a 3090…

No, it’s not. At all. 
 


4 hours ago, craigeaglefire said:

an 850W PSU is on the low side for a 3090…

Outervision's calculator for my 9900K at 5.0GHz, 32GB DDR4, all storage devices, RTX 3090, 3 USB devices, everything powered simultaneously, and even a 32" display (which shouldn't even be included), running 3D games 8h/day, came out to 638W, and they recommend a 750W PSU. 850W is more than ample to accommodate electrical wear over a 6-year lifespan.
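As a rough illustration of that headroom arithmetic (the per-component wattages below are assumptions picked for the example, not Outervision's figures or measured draws):

```python
# Back-of-envelope PSU headroom check. The component wattages are
# illustrative assumptions, not measured values.
components_w = {
    "RTX 3090 (board power)": 350,
    "i9-9900K under load": 150,
    "Motherboard, RAM, fans": 60,
    "Storage and USB devices": 30,
}

total_w = sum(components_w.values())
psu_w = 850
headroom_w = psu_w - total_w

print(f"Estimated draw: {total_w} W")
print(f"PSU capacity:   {psu_w} W (headroom: {headroom_w} W, {headroom_w / psu_w:.0%})")
```

With these assumed numbers the 850W unit still keeps roughly a 30% margin, which matches the calculator's conclusion that 850W is ample.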


Noel

System:  7800x3D, Thermal Grizzly Kryonaut, Noctua NH-U12A, MSI Pro 650-P WiFi, G.SKILL Ripjaws S5 Series 32GB (2 x 16GB) 288-Pin PC RAM DDR5 6000, WD NVMe 2Tb x 1, Sabrent NVMe 2Tb x 1, RTX 4090 FE, Corsair RM1000W PSU, Win11 Home, LG Ultra Curved Gsync Ultimate 3440x1440, Phanteks Enthoo Pro Case, TCA Boeing Edition Yoke & TQ, Cessna Trim Wheel, RTSS Framerate Limiter w/ Edge Sync for near zero Frame Time Variance achieving ultra-fluid animation at lower frame rates.

Aircraft used in A Pilot's Life V2:  PMDG 738, Aerosoft CRJ700, FBW A320nx, WT 787X

 

9 hours ago, brucewtb said:

While I was, and still am, all for future-proofing when it comes to VRAM, as I run the sim at 4K, it is much harder now to justify the extra $800 expense for an additional 12GB of VRAM that you may or may not ever need. There is only so much money you can throw at this sim, so I may just have to live with 12GB of VRAM and hope for the best.

Bruce

Bruce--it just depends on where you are happy setting LOD, primarily. If you're fine at 200, you'll rarely even get to 9GB in the most complex of settings. People will cite seeing 16GB of VRAM in use, and sure, you can get there by increasing LOD, but the problem is you can't see detail that far out anyway, so it's a total waste. The raw processing power is essentially identical to the 3080Ti. So you are paying $800-plus for 12GB that will rarely if ever see an electron, and then there's the raw 7lb+ of weight hanging on the 3090's mounting hardware, and its prodigious size, all to house that extra 12GB of VRAM. Now, if the 3090's raw power were a good 20+% greater than the 3080Ti's, I could see it, but unfortunately it isn't. I'm at LOD-Objects 300 and LOD-Terrain 400, and the highest on record for me is 10.9GB--but most of the time it's around 8.5GB at these settings. Keep in mind MSFS is also tailored for the Xbox Series X, which only has ~10GB of VRAM to exploit; the rest is used as system RAM in some capacity, as I understand it. Render Scaling primarily impacts GPU utilization, not VRAM. I have RS at 130 for my 3440x1440 display and it all looks fabulous.
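For anyone who wants to capture their own peak VRAM figure rather than watching an overlay, a minimal sketch with the same pynvml bindings mentioned earlier (assumes an NVIDIA card; the 5-second polling interval is an arbitrary choice) could look like this:

```python
# Minimal sketch: sample VRAM usage while the sim runs and track the peak.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

peak_gb = 0.0
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = info.used / 1024**3
        peak_gb = max(peak_gb, used_gb)
        print(f"VRAM in use: {used_gb:.1f} GB (peak so far: {peak_gb:.1f} GB)")
        time.sleep(5)  # poll every 5 seconds; stop with Ctrl+C
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Leave it running during a flight and the peak line gives you the number to weigh against the 3080Ti's 12GB or the 3090's 24GB.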


Noel

System:  7800x3D, Thermal Grizzly Kryonaut, Noctua NH-U12A, MSI Pro 650-P WiFi, G.SKILL Ripjaws S5 Series 32GB (2 x 16GB) 288-Pin PC RAM DDR5 6000, WD NVMe 2Tb x 1, Sabrent NVMe 2Tb x 1, RTX 4090 FE, Corsair RM1000W PSU, Win11 Home, LG Ultra Curved Gsync Ultimate 3440x1440, Phanteks Enthoo Pro Case, TCA Boeing Edition Yoke & TQ, Cessna Trim Wheel, RTSS Framerate Limiter w/ Edge Sync for near zero Frame Time Variance achieving ultra-fluid animation at lower frame rates.

Aircraft used in A Pilot's Life V2:  PMDG 738, Aerosoft CRJ700, FBW A320nx, WT 787X

 


And MSFS will use system RAM (32GB here) if it needs more VRAM.

Cheers

bs


AMD RYZEN 9 5900X 12 CORE CPU - ZOTAC RTX 3060Ti GPU - NZXT H510i ELITE CASE - EVO M.2 970 500GB DRIVE - 32GB XTREEM 4000 MEM - XPG GOLD 80+ 650 WATT PS - NZXT 280 HYBRID COOLER

