PaulFWatts

Your thoughts on VRAM for MSFS


4 minutes ago, eaim said:

I wonder if issues with memory management could be one of the reasons why Asobo/MS initially chose to use DirectX 11?

It could be, but I have the feeling it's more likely because development on the simulator started around 5 years ago, when DirectX 12 was still very new (and probably a rushed release, as Microsoft needed a quick response to AMD's Mantle API). Or perhaps they felt that performance would be good enough even with DirectX 11 (Microsoft Flight on DirectX 9 proved that there was still a lot of potential for optimisation of the engine even with an older API), and decided on a DirectX 12 implementation later on to take advantage of DXR and help with the simulator's port to the Xbox consoles.


3 hours ago, ChaoticBeauty said:

No. GDDR6 is expensive and in very high demand, especially now that GDDR5 is being phased out and each unit of the new consoles will boast 16GB GDDR6 shared RAM. Same goes for DRAM and NAND flash, the oversupply is over and the demand is very high, as a result prices have been steadily increasing the past three months and will probably continue to do so throughout 2020 and 2021. In 2022 we could finally see graphics cards with double the amount of VRAM (GPUs with a 256-bit bus being equipped with 16GB VRAM for example).

https://www.prepar3d.com/forum/viewtopic.php?f=6312&t=136873&p=212507#p212543

According to that post from Beau Hollis, this problem is inherent to DirectX 12. But... I haven't seen any other game exhibit this issue, and Beau did make some ridiculous statements in the v2 era so I'm not sure what to believe. If anyone knows more about the matter, some input would be appreciated.

Since Microsoft is behind Asobo, it is almost certain that they will be working on a good implementation. Most games nowadays have terrible DirectX 12 implementations, causing stutters and other performance issues on some specific GPU architectures.

Like you said, no other DX12 title is crashing because of VRAM management. Actually, if you read the reports from people having those issues, you will see that some of them are not even using all the VRAM available and are still crashing.

DX12 gives a lot of access to the hardware, but at the same time demands a lot of work/attention by the developers so they don’t mess things up.

3 hours ago, arsenal82 said:

not sure where you got that from but LM has explained what the issue is with DX12 according to them

but hey hooo pretty sure LM devs know more than us 

what Asobo and MS will do remains to be seen, at this point I would not bet on anything yet as the proof will be in the pudding

Sure, P3D’s devs know more about their software than I do, but they are still investigating the issue, so they don’t really know what’s happening. You can infer that from reading the threads on their forums.

That said, you can see a lot of AAA games that are heavy on VRAM usage, because, for example, they use ray-traced reflections, and they still manage their VRAM just fine.
 

But well, if you think the problem is DX12 itself, it’s even worse. If the problem isn’t in P3D’s code, then LM can’t fix it?! We will have to accept CTDs again?! That would be really odd, when you look at a bunch of titles using DX12 that don’t crash like that.


6 hours ago, PaulFWatts said:

1. Do you think the new MSFS will require a large amount of VRAM when it moves to DX12 to achieve the best visual experience?

2. Do you see any hope for reasonably priced cards with large amounts of VRAM in 2020 or 2021?

 

I hope/expect the next round of GPUs that are coming from AMD and Nvidia in Q4(ish) will have more VRAM.  For the last few years, the mid-high range cards have all had between 4GB and 8GB. I think we'll see a migration in the next iteration of cards to the 8GB-16GB range.  AMD is planning a late 2020 launch of the Navi 2x cards and NVidia should be releasing at least the beginnings of the RTX30xx cards by year-end.  I'm sure the 3080Ti will have plenty of VRAM - hopefully at least 16GB - the question is where will the next gen of non-Ti/gamer cards (3060,3070,3080) land - and I'm hopeful that the low end will move from 6GB to 8GB and the high end will move from 8GB to 16GB (or at least 12GB).  I'm hoping the same will be true with the AMD Navi 2x cards. Right now, only the Radeon VII (now released over a year ago) has 16GB of VRAM, while the RX5700XT (the current highest end gaming GPU from AMD) has only 8GB.

The thing about VRAM is that the demand is primarily driven by resolution.  And what we're seeing with P3Dv5 is that, for the most part, the people (like me) running into VRAM limitations with 8GB GPUs are running at 4k.  Nobody running at 1080p, or really even 1440p should be having a problem with an 8GB GPU, and I expect this is largely the same with MSFS.  This is true across all gaming.  The only GPUs really targeted at true 4k gaming experiences are cards like the 2080Ti and above (or the Radeon VII to a lesser extent).  The non-top-of-the-line gaming cards like the RX5xxx series from AMD and the 20xx series (non Ti) from NVidia are more generally known/billed as '2k' or 1440p gaming cards.

But to your qualifier "best visual experience" - yes, in DX12 environments, at least 12GB of VRAM will most likely be required to run at high settings in 4K for the best visual experience.  This is true in DX11 as well - at 4k resolution, my P3Dv4.5 environment runs at capacity, using all 8GB of my VRAM with mid-high (not maxed) settings.  4k is just a whole lot of pixels to render compared with lower resolutions.
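To put some rough numbers on the resolution point, here's a back-of-the-envelope sketch (my own illustration, not anything from Asobo or LM) of how just the render targets scale with resolution:

```python
# Back-of-the-envelope estimate of per-frame render-target memory at
# different resolutions (an illustration only, not any sim's actual budget).
# Real engines hold many more buffers (G-buffer, shadow maps, textures),
# so these figures are just a lower bound on what resolution alone costs.

def frame_buffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of `buffers` full-resolution targets, in MiB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{frame_buffer_mb(w, h):.0f} MB")
```

4K has exactly 4x the pixels of 1080p, so every full-resolution buffer quadruples too - and the real VRAM pressure comes from the higher-resolution textures the engine streams in to match, which is why 8GB cards hit their limit at 4K long before they do at 1440p.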


5800X3D | Radeon RX 6900XT


Thanks for your very detailed response @cwburnett

I also run at 4K hence why I’ll be looking for as much VRAM as I can get with my next card upgrade.

On a more nostalgic note, I can remember the excitement of buying my first graphics card, driven by the release of Wolfenstein on the PC, to move into the world of 3D gaming. https://en.wikipedia.org/wiki/3dfx_Interactive . It had a massive 4MB of EDO DRAM for its video frame buffer. We all thought it was magic at the time, so the “voodoo chipset” it used was very appropriately named.

Everything has come a LONG way since then 🙂

 

Edited by PaulFWatts

19 minutes ago, PaulFWatts said:

Everything has come a LONG way since then 🙂

It sure has! Those were the good old days. I can't help but feel like I was completely satisfied in that era. I'm sure that's just rose-tinted glasses at this point, since it is ancient history.


5800X3D | Radeon RX 6900XT

5 minutes ago, cwburnett said:

It sure has! Those were the good old days. I can't help but feel like I was completely satisfied in that era. I'm sure that's just rose-tinted glasses at this point, since it is ancient history.

So true @cwburnett

And the fact that we were so much younger then may have something to do with it 😉


Keep in mind that not only the amount of detail influences VRAM usage, but also the framerate you want it to run at.

I have a 2080Ti connected to 2x 4K in NVSurround.

If I set 25-50 hertz all is well. If I set 60 hertz everything becomes green or red. The card just cannot handle it.

Setting 30 FPS or 60 FPS has a big influence on the VRAM and GPU. Add high AA settings to that.

Simbol (P3D forum) wrote that all P3Dv5 testers used 30 FPS and no one had CTDs or VRAM issues. Even with a 4K display. And they used 2/4x MSAA as the display was already in 4K.

As MSFS with DX11 is putting a little less on the GPU I hope that my 2080Ti can handle it...
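As a rough illustration of why the AA settings matter so much here (my own numbers, assuming a straightforward multisampled target, not anything measured in P3D or MSFS): 4x MSAA stores four samples per pixel in the colour/depth targets, which adds up quickly on a dual-4K surround setup.

```python
# Hypothetical illustration of MSAA's VRAM cost: a multisampled render
# target stores `samples` colour/depth samples per pixel, so 4x MSAA
# roughly quadruples the size of those targets.

def msaa_target_mb(width, height, samples=1, bytes_per_pixel=4):
    # one colour + one depth target, each holding `samples` samples per pixel
    return 2 * width * height * bytes_per_pixel * samples / (1024 ** 2)

# Dual-4K surround (7680x2160), as in the setup described above:
for s in (1, 2, 4):
    print(f"{s}x MSAA: ~{msaa_target_mb(7680, 2160, s):.0f} MB")
```

That is only the targets themselves, but it shows why testers on a single 4K display with 2/4x MSAA can sit comfortably while a surround setup at higher refresh rates runs out of headroom.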


13900 8 cores @ 5.5-5.8 GHz / 8 cores @ 4.3 GHz (hyperthreading on) - Asus ROG Strix Gaming D4 - GSkill Ripjaws 2x 16 Gb 4266 mhz @ 3200 mhz / cas 13 -  Inno3D RTX4090 X3 iCHILL 24 Gb - 1x SSD M2 2800/1800 2TB - 1x SSD M2 2800/1800 1Tb - Sata 600 SSD 500 Mb - Thermaltake Level 10 GT case - EKWB Extreme 240 liquid cooling set push/pull - 2x 55’ Sony 4K tv's as front view and right view.

13600  6 cores @ 5.1 GHz / 8 cores @ 4.0 GHz (hyperthreading on) - Asus ROG Strix Gaming D - GSkill Trident 4x Gb 3200 MHz cas 15 - Asus TUF RTX 4080 16 Gb  - 1x SSD M2 2800/1800 2TB - 2x  Sata 600 SSD 500 Mb - Corsair D4000 Airflow case - NZXT Kraken Z63 AIO liquid cooling - 1x 65” Sony 4K tv as left view.

FOV : 190 degrees

My flightsim vids :  https://www.youtube.com/user/fswidesim/videos?shelf_id=0&sort=dd&view=0

 


I believe Dos Equis has variable levels of graphics capabilities.

Are you really asking whether 'all sliders to the right', without any throttling, on a graphics card with more than 11 GB (which doesn't exist yet), using DX12 (which isn't implemented yet), is possible? Well, the answer is yes, of course. You should buy the graphics card with the greatest amount of memory over 11 GB (even though it hasn't yet been built), for whatever absurd price the card manufacturers demand.

1 hour ago, cwburnett said:

But to your qualifier "best visual experience" - yes, in DX12 environments, at least 12GB of VRAM will most likely be required to run at high settings in 4K for the best visual experience.  This is true in DX11 as well - at 4k resolution, my P3Dv4.5 environment runs at capacity, using all 8GB of my VRAM with mid-high (not maxed) settings.  4k is just a whole lot of pixels to render compared with lower resolutions.

This is why I think the Xbox version will be limited compared to the PC version. The current Xbox One X has 12 GB of RAM and the Series X has 16 GB, both of which are shared between the CPU and GPU. Even with the fast SSD in the Series X, they will likely have to cap or lower something, somewhere (texture resolution, details, frame rate, etc.). Don't get me wrong, it will still look great on the Xbox, just maybe not quite as good as on a PC.

18 minutes ago, GSalden said:

 

Simbol (P3D forum) wrote that all P3Dv5 testers used 30 FPS and no one had CTDs or VRAM issues. Even with a 4K display. And they used 2/4x MSAA as the display was already in 4K.

 

Do I understand it right? The v5 testing wasn’t done with various display parameters to find out how different user settings would react? If so, what kind of testing is that?

Seems to me that opening up testing to the great unwashed, as the FS20 team does, is a smarter way to flush out flaws like this VRAM overload than reserving the testing to a limited number of developers...


Dominique

Simming since 1981 -  4770k@3.7 GHz with 16 GB of RAM and a 1080 with 8 GB VRAM running a 27" @ 2560*1440 - Windows 10 - Warthog HOTAS - MFG pedals - MSFS Standard version with Steam

 

15 minutes ago, Dominique_K said:

Seems to me that opening up testing to the great unwashed, as the FS20 team does, is a smarter way to flush out flaws like this VRAM overload than reserving the testing to a limited number of developers...

An open beta of some sort would certainly have allowed for broader feedback before market release, but LM doesn't roll that way.  Purely speculation, but I suspect they have a much smaller team working on this and managing a huge open testing program takes a lot of resources and must be well coordinated. AFAIK, LM doesn't even have a support ticketing system or portal for P3D.  I'd surmise that loads of non-professional simmers buying academic licenses *IS* their beta: they iron out the issues after release, and once they get to 5.1 their big-spending Pro Plus customers start adopting.  Again, pure speculation.

Disclosure: Count me among those that bought v5 to test. 😉

Edited by cwburnett

5800X3D | Radeon RX 6900XT


IIRC one of the Asobo devs said his testing computer was running on a GTX1060

Edited by Tuskin38

5 minutes ago, Tuskin38 said:

IIRC one of the Asobo devs said his testing computer was running on a GTX1060

Yeah, that was Sebastian Wloch.

Edited by ca_metal

9 hours ago, cwburnett said:

Purely speculation, but I suspect they have a much smaller team working on this and managing a huge open testing program takes a lot of resources and must be well coordinated.


A reasonable speculation. Another hint of a small team is the marginal improvement from one version to the next.

And, version after version, the question of the quality of their beta testers comes up. Mostly 3PD devs in a hurry to see the product out, or self-proclaimed simulation experts lost in the mirror of their 24 GB Titan SLI 😅?

Edited by Dominique_K

Dominique

Simming since 1981 -  4770k@3.7 GHz with 16 GB of RAM and a 1080 with 8 GB VRAM running a 27" @ 2560*1440 - Windows 10 - Warthog HOTAS - MFG pedals - MSFS Standard version with Steam

 

9 hours ago, ca_metal said:

Yeah, that was Sebastian Wloch.

He said that he didn’t need anything stronger because he was in charge of designing the flight model, not because the whole sim could run on a 1060.

Edited by Dominique_K

Dominique

Simming since 1981 -  4770k@3.7 GHz with 16 GB of RAM and a 1080 with 8 GB VRAM running a 27" @ 2560*1440 - Windows 10 - Warthog HOTAS - MFG pedals - MSFS Standard version with Steam

 


  • Tom Allensworth,
    Founder of AVSIM Online


  • Flight Simulation's Premier Resource!
