WidowsSon

Nvidia RTX 3090 vs 2080 Ti vs 1070 Ti

Recommended Posts

8 minutes ago, WidowsSon said:

I'm working on a video that outlines how to set up and properly achieve a balanced load between CPU and GPU, ensuring you're taking advantage of whatever CPU/GPU combination you may have.

 

Geez, what are you waiting on, David... get on with that video! 😀

Seriously though, very enjoyable watch.

I spent a few days finding the best settings for my 3090/10900K combo, and it is stunning what is possible.

Waiting for Asobo to deliver on their recently stated promise to further optimize in the next update before I pick my new 4K monitor.

 

  • Like 2

i9-13900KF - RTX 4090

32GB DDR5 4800MHz, 2TB M.2 PCIe NVMe SSD

Internet - 300+ Mbps / Honeycomb Alpha yoke / Saitek throttle

Dell 43” 4K

8 minutes ago, micstatic said:

@WidowsSon I haven't seen any proof that AA isn't needed at 4K. I wanted to believe that, but MSFS and P3D proved it wrong to me.

I didn't say that.

10 minutes ago, WidowsSon said:

@Carts85 Yes, it's a big leap, night and day. At true 4K at the proper distance, you don't even need anti-aliasing. 8K will completely eliminate the need for AA.

@Virtual-Chris Yes, I considered Big Navi. I will most likely get a 6900 XT for comparison upon release. I'm planning a 5950X CPU upgrade anyway, regardless of GPU.

That being said, I'm not convinced the 6900 XT will be the superior card. It may be a better value for many people, but all things considered (DLSS, creative workflows, etc.), Nvidia has had the lead not just in hardware but in software and drivers for many years. I love what a competitive AMD card will bring to consumers, but AMD is and has always been a hype machine; it's an "I'll believe it when I see it" sort of thing. I'm not trying to hate on AMD, as I've clearly jumped ship for their 5000 series processors (currently running a 3900X) as by far the best value.

Don't expect too much from the AMD CPU/GPU combo (maybe 1-2% is what insiders might be expecting). This isn't a magic button. The biggest gain for AMD in gaming is doubling the core count and cache outside the Infinity Fabric. This will push their CPUs past Intel.

As for the 3090 barely pushing 30 fps at this point, that is because it's severely limited by the CPU. At full settings it was getting mid to high 30s, but was only at 75% utilization while the CPU mainthread was maxed. By limiting the settings reliant on the mainthread process, we should easily be able to achieve v-sync at 60 fps on Ultra settings, IMO.
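To make that concrete, here is a minimal sketch of the bottleneck (the numbers are hypothetical, chosen only to mirror the "mid to high 30s at 75% GPU" situation described above): each frame waits on both the mainthread and the GPU, so the slower of the two sets the frame rate.

```python
# Toy model of a CPU-bound frame. All numbers are hypothetical.
cpu_mainthread_ms = 28.0  # time the mainthread needs per frame
gpu_ms = 21.0             # time the GPU needs to render the frame

# The slower of the two determines the frame time.
frame_ms = max(cpu_mainthread_ms, gpu_ms)
fps = 1000.0 / frame_ms
gpu_utilization = gpu_ms / frame_ms

print(f"{fps:.1f} fps, GPU busy {gpu_utilization:.0%} of each frame")
# -> 35.7 fps, GPU busy 75% of each frame
# Lowering GPU-side settings changes nothing here; only reducing the
# mainthread's work raises the frame rate.
```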

I'm working on a video that outlines how to set up and properly achieve a balanced load between CPU and GPU, ensuring you're taking advantage of whatever CPU/GPU combination you may have.

 

I think that's all a fair assessment.

I'm fairly new to PC building, since I've taken 10 years out from it, so I'm interested in your view on a few things...

What advantages does AMD's Zen 3 CPU architecture offer over Intel's? At first glance, their 7nm node has given them more cores and cache, but strangely, 5GHz still remains elusive. And they are just now starting to claim the best gaming performance, barely... but honestly, Intel has held that crown even while being two years behind AMD on node shrinks. What advantages does Zen 3 bring to gaming?

I think it remains to be seen how much MS/Asobo can parallelize this engine. It's largely still the FSX engine from years ago, and a complete multi-threaded rewrite may not be in the cards here. What I don't know is how much DX12 will help. Can you explain what it brings to the table in offloading the main thread?

Finally, I'm new to DLSS and naturally skeptical. Is a flight sim, with lots of tiny detail changing with literally every frame, really conducive to DLSS? I could see DLSS working well on a Doom-style game with largely the same textures repeated over and over. But I'm looking at it more like: a simple image compresses better than a complex one, if you know what I mean. Maybe that's the wrong way to look at DLSS?

9 minutes ago, FrankR409 said:

Geez, what are you waiting on, David... get on with that video! 😀

Seriously though, very enjoyable watch.

I spent a few days finding the best settings for my 3090/10900K combo, and it is stunning what is possible.

Waiting for Asobo to deliver on their recently stated promise to further optimize in the next update before I pick my new 4K monitor.

 

Please share your findings on optimizing your setup.

  • Like 1

4 minutes ago, Virtual-Chris said:

Please share your findings on optimizing your setup.

Current system configuration in signature.

- All Ultra

- Terrain level of detail = 130

- Object level of detail = 200

- Render scale = 130, pushing approximately 8.3 million pixels (quick math sketch below)

Developer mode shows GPU-bound, but generally in the green.

That’s it really!
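For anyone checking the math on that render-scale figure, here is a quick sketch. The 3440x1440 base resolution is an assumption on my part; it is simply the common resolution that lands closest to the quoted 8.3 million pixels.

```python
# Render scale multiplies each axis, so the pixel count grows with
# its square. The 3440x1440 base resolution is an assumption.
base_w, base_h = 3440, 1440
scale = 1.30  # render scale 130

render_w, render_h = round(base_w * scale), round(base_h * scale)
print(f"{render_w}x{render_h} = {render_w * render_h / 1e6:.1f} million pixels")
# -> 4472x1872 = 8.4 million pixels, in line with "approximately 8.3 million"
```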

 

Edited by FrankR409
  • Like 2

i9-13900KF - RTX 4090

32GB DDR5 4800MHz, 2TB M.2 PCIe NVMe SSD

Internet - 300+ Mbps / Honeycomb Alpha yoke / Saitek throttle

Dell 43” 4K

56 minutes ago, Carts85 said:

Like many of us, I'm close to pulling the trigger on my next setup, so I appreciate the comparisons... I'm currently in the 1080p club and have never experienced this sim in 4K. For the guys that do fly in 4K: is it really that big of a leap? FPS aside, purely from a display point of view, is it worth it? I'm really banging my head over 4K vs 1440p.

It's totally worth it.

  • Like 3

Wayne Such

ASUS Hero Z690, Galax 3080 Ti, i7-12700K, Kraken X72 CPU cooler, 64GB Corsair DDR5, 32-inch 4K

2 hours ago, Ricardo41 said:

Conclusion: stick with 1080p and you'll be good. 

No way. No comparison to 4K. 

  • Like 4

1 minute ago, Virtual-Chris said:

I think that's all a fair assessment.

I'm fairly new to PC building, since I've taken 10 years out from it, so I'm interested in your view on a few things...

What advantages does AMD's Zen 3 CPU architecture offer over Intel's? At first glance, their 7nm node has given them more cores and cache, but strangely, 5GHz still remains elusive. And they are just now starting to claim the best gaming performance, barely... but honestly, Intel has held that crown even while being two years behind AMD on node shrinks. What advantages does Zen 3 bring to gaming?

I think it remains to be seen how much MS/Asobo can parallelize this engine. It's largely still the FSX engine from years ago, and a complete multi-threaded rewrite may not be in the cards here. What I don't know is how much DX12 will help. Can you explain what it brings to the table in offloading the main thread?

Finally, I'm new to DLSS and naturally skeptical. Is a flight sim, with lots of tiny detail changing with literally every frame, really conducive to DLSS? I could see DLSS working well on a Doom-style game with largely the same textures repeated over and over. But I'm looking at it more like: a simple image compresses better than a complex one, if you know what I mean. Maybe that's the wrong way to look at DLSS?

Zen 3's biggest advantage (as far as we know) will be IPC, or instructions per clock cycle, which isn't the same as MHz. You can run at 5GHz and still get less work done than a more efficient CPU running at 4.8, so GHz can be a bit misleading. I'm a big fan of Intel and run nothing but racks full of Xeons, which are beasts, but it's looking like Zen 3 will own the IPC and single-core performance crown when the 5000 series releases. Single-core performance is important in games like MSFS, as the "mainthread" is the most limiting process and cannot be split between cores.
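A back-of-the-envelope illustration of why GHz alone misleads (the IPC figures below are made up purely for the arithmetic): single-core throughput is roughly IPC multiplied by clock speed.

```python
# Single-core throughput ~ IPC x clock speed (a simplification; real
# workloads vary). IPC values are invented for illustration only.
chip_a_ipc, chip_a_ghz = 10.0, 5.0  # higher clock, lower IPC
chip_b_ipc, chip_b_ghz = 11.0, 4.8  # lower clock, higher IPC

throughput_a = chip_a_ipc * chip_a_ghz  # 50.0 (arbitrary units)
throughput_b = chip_b_ipc * chip_b_ghz  # 52.8 (arbitrary units)

print(f"Chip B is {throughput_b / throughput_a - 1:.1%} faster "
      f"despite a 200 MHz lower clock")
# -> Chip B is 5.6% faster despite a 200 MHz lower clock
```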

DX12 will help more than any hardware you can dream about at the moment. This "mainthread" that I reference is where the majority of the game's programming loop runs. Yes, some things that are not encapsulated in that thread can be offloaded in parallel, but for the most part all the important 3D work happens in the "mainthread": the game sends its "draw calls" to the CPU, and the CPU prioritizes them and sends them to the GPU. It doesn't matter how powerful the GPU is; if it doesn't get enough draw calls from the CPU, it won't do anything more. This creates a situation where more fps depends on your CPU issuing more draw calls, and the CPU becomes the bottleneck on a card like the 3090.

Where DX12 comes in is its ability to reduce this overhead by letting the application talk to the GPU far more directly, with much less driver work on the CPU in between. In turn, not only is your CPU freed from being overburdened by draw calls, it is now capable of doing the CPU things it should be doing instead of just feeding the GPU more objects/frames. Right off the bat, with nothing else considered, many games see a 50% decrease in CPU load and a 20% gain in GPU performance. On top of that, a card like the 3090 that may have been 60% utilized is now fully utilized. You start seeing compounding effects. There is more to it, but I'm already walking the line between a layman's explanation and an incorrect one.
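A rough sketch of that compounding effect, reusing the illustrative figures above (a 50% cut in CPU draw-call cost, a GPU sitting at 60% utilization). These are hypothetical frame times, not benchmarks.

```python
# Hypothetical frame times showing how a lighter API compounds.
cpu_ms_dx11 = 30.0  # mainthread cost per frame under the old API
gpu_ms = 18.0       # GPU render cost per frame (unchanged)

fps_dx11 = 1000.0 / max(cpu_ms_dx11, gpu_ms)  # CPU-bound: 33.3 fps
print(f"GPU utilization before: {gpu_ms / max(cpu_ms_dx11, gpu_ms):.0%}")

cpu_ms_dx12 = cpu_ms_dx11 * 0.5               # 50% less CPU overhead
fps_dx12 = 1000.0 / max(cpu_ms_dx12, gpu_ms)  # now GPU-bound: 55.6 fps

print(f"{fps_dx11:.1f} fps -> {fps_dx12:.1f} fps "
      f"(+{fps_dx12 / fps_dx11 - 1:.0%}) from the same GPU")
# -> 33.3 fps -> 55.6 fps (+67%) from the same GPU, now running flat out
```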

All things being equal, 3DMark scores on identical systems show DX11 at around a 2 million score and DX12 at 35 million. Yes, that is correct: 17 times faster. It's not just a little faster, but game-changingly faster (is that a word? changingly? I digress).

Finally, for your last question, take a look at some of the DLSS material from Nvidia on YouTube. DLSS output can actually look more detailed than native resolution. It's absolutely amazing; AMD, IMO, needs something similar to compete. DLSS is not a compression technology. It's a prediction technology that has shown it can reconstruct images that look sharper and better than native rendering of the same scene. It seems crazy, but if you have been using Nvidia's AI products on the creative side of things for any amount of time, you understand the difference between good graphics cards and amazing software, and Nvidia is truly a leader in this space. I hope, for our sake as consumers and for both companies driving change, that AMD can counter.

Hope that helps.

  • Like 8
  • Upvote 1


I got a 4K monitor a week or so ago. I was thinking/hoping that there'd be some new graphics cards available by now. No dice.

I was worried that I wouldn't be able to do much without a new graphics card, but I was wrong. My 1080 Ti works just fine at 4K. I have had to make a few compromises, but overall it is super.

I have my frame rate locked at 30 fps through the NCP. Render scaling is at 90. TAA is enabled. Otherwise, it's a mix of Ultra and High graphics settings. Mostly, it stays at 30 throughout. This is a good guide to the settings and the cost of each.
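For context, a quick sketch of the arithmetic behind that setup (the 3840x2160 panel resolution is taken from the signature below; the rest is straight math):

```python
# Numbers behind a 30 fps lock with render scale 90 on a 4K panel.
panel_w, panel_h = 3840, 2160
scale, fps_cap = 0.90, 30

render_pixels = (panel_w * scale) * (panel_h * scale)
native_pixels = panel_w * panel_h
budget_ms = 1000.0 / fps_cap  # time each frame may take at the cap

print(f"Shading {render_pixels / native_pixels:.0%} of native pixels, "
      f"with a {budget_ms:.1f} ms budget per frame")
# -> Shading 81% of native pixels, with a 33.3 ms budget per frame
```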

So I don't think there's any reason to rush into a new graphics card, provided you have at least a 1080.

As for going from 1080p to 4K? That would be a huge difference. You won't regret it.

  • Like 3

Richard Chafey

 

i7-8700K @4.8GHz - 32GB @3200 - ASUS ROG Maximus X Hero - EVGA RTX 3090 - 3840x2160 res - KBSim Gunfighter - Thrustmaster Warthog dual throttles - Crosswind V3 pedals

MSFS 2020, DCS

 


@WidowsSon is rapidly becoming a powerhouse of useful videos about MSFS. I really appreciate your efforts and the professional and objective standards of your productions. Many thanks.

  • Like 6

1 minute ago, Bottle said:

@WidowsSon is rapidly becoming a powerhouse of useful videos about MSFS. I really appreciate your efforts and the professional and objective standards of your productions. Many thanks.

Thanks, Bottle, very kind of you.


FYI, these are real-life 3DMark scores with no difference except the API: DX11 vs DX12. It seems unreal, but this is what bypassing the CPU bottleneck allows. I'm not proposing that DX12 is going to take you from 25 fps to 250 fps, but it does make real, measurable, and incredibly big differences. As someone else put it to me, DX11 gets stomped by modern APIs like DX12 and Vulkan.

DX11 vs DX12 Results:
https://ibb.co/sRtpjv3

 

Edited by WidowsSon

25 minutes ago, FrankR409 said:

Current system configuration in signature.

- All Ultra

- Terrain level of detail = 130

- Object level of detail = 200

- Render scale = 130, pushing approximately 8.3 million pixels

Developer mode shows GPU-bound, but generally in the green.

That's it really!

 

Interesting. I'm now curious about your thinking behind these particular trade-offs. Compromising on terrain LOD is interesting because, well, this is a flight simulator, where detail at a distance matters more than in any other genre of game. In fact, many people are frustrated with the LOD even at 200 and are hacking the config file to increase it to 400 and beyond. A low LOD causes more pop-in, especially with photogrammetry, so it's an interesting thing to compromise on.

Render scaling beyond 100% at 4K seems unusual as well. I imagine the effect is an increase in sharpness, but at a significant cost.
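That cost is easy to underestimate, because it grows with the square of the scale; a quick sketch:

```python
# Pixel cost grows with the square of the render scale setting.
for scale_setting in (90, 100, 130):
    relative_cost = (scale_setting / 100) ** 2
    print(f"Render scale {scale_setting}: {relative_cost:.0%} of native pixel cost")
# -> Render scale 90: 81% of native pixel cost
# -> Render scale 100: 100% of native pixel cost
# -> Render scale 130: 169% of native pixel cost
```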

What is the end result visually, and why do you prefer that over, say, more detail at a distance?

  • Like 3

7 minutes ago, WidowsSon said:

FYI, these are real-life 3DMark scores with no difference except the API: DX11 vs DX12. It seems unreal, but this is what bypassing the CPU bottleneck allows. I'm not proposing that DX12 is going to take you from 25 fps to 250 fps, but it does make real, measurable, and incredibly big differences. As someone else put it to me, DX11 gets stomped by modern APIs like DX12 and Vulkan.

DX11 vs DX12 Results:
https://ibb.co/sRtpjv3

 

I was playing Ubisoft's Ghost Recon Breakpoint, which launched as a DX11 title and then at some point gained Vulkan support. The load on my CPU went down about 20% and my FPS increased about 10%. Still, I'll take it.

15 minutes ago, WidowsSon said:

It seems unreal, but this is what bypassing the CPU bottleneck allows

Well, the CPU is only part of the equation.

Edited by Evros

