
WidowsSon

Frozen-Inactivity
  • Content Count

    53
  • Donations

    $0.00 
  • Joined

  • Last visited

Community Reputation

135 Excellent

1 Follower

Flight Sim Profile

  • Commercial Member
    No
  • Online Flight Organization Membership
    none
  • Virtual Airlines
    No


  1. Thanks Republic, no rush, just interested. I have noticed the cockpit refresh rate is a bit clunky on medium but figured high would restrict my CPU even further. Will have to do some more testing on that setting. Been looking for a 5900x or 5950x without any luck. May have missed the pre-order boat here.
  2. Thanks for the info Republic. I found with the 2080 Ti that I was GPU limited and the 3900X could keep up (maybe just barely). As soon as I switched to the 3090, the CPU had a lot of trouble issuing enough draw calls to produce the extra XX% of frames the 3090 could handle. I turned LODs way down to 50 and it was still somewhat CPU limited. Trying to squeeze the most out of the 3090 with the 3900X in MSFS seems nearly impossible. Out of interest, could you fire me a screenshot of your settings in a PM so we could do a little testing back and forth? Your setup is, like you said, almost identical, and RAM speed shouldn't make a difference. Would be interested to see if I can get similar results to you. All I know for sure is that I'm very CPU limited like you. I saw results of the 3900X at 55 fps (about what I'm getting after tweaking) and the 5950X getting 74 fps with the same settings and GPU, so I'm very excited to get that sucker installed.
  3. Thanks Republic, good info. I haven't had enough time to play with it yet, but on default settings (as in the video) I'm getting about 30-35% better frames at 4K than the 2080 Ti and, like you said, not running at 100% utilization. I'm guessing I could get up to around 50% over the 2080 Ti. I bet if I spent some time in the .cfg files (not sure if this is a good example), boosted LOD to 400 or more, and used higher-resolution texture maps, I could hopefully take advantage of all that VRAM beyond what the in-game settings expose. Will definitely have to limit CPU-bound settings and see how far I can push it. Will make a video on this, I think.
  4. Just to be clear, the video had no intended point other than a comparison across 3 generations of cards. If it demonstrates anything, it's that the limiting factor here is the CPU. If you are satisfied with lower resolutions at medium settings, any interest in greater hardware is a moot point.
  5. Of course, I never suggested otherwise. It's a wide and varied topic, but CPU was the context within a CPU-limited mainthread problem that many are experiencing in MSFS. Rarely in computers is anything like this limited to one aspect.
  6. FYI, these are real-life 3DMark scores with no difference except the API: DX11 vs DX12. It seems unreal, but this is what reducing CPU overhead allows. I'm not proposing that DX12 is going to take you from 25 fps to 250 fps, but it does make real, measurable, and surprisingly large differences. As someone else put it to me, DX11 gets stomped by modern APIs like DX12 and Vulkan. DX11 vs DX12 results: https://ibb.co/sRtpjv3
  7. Zen 3's biggest advantage (as far as we know) will be IPC, or instructions per clock cycle, which isn't the same as clock speed. You can run at 5 GHz and get less work done than a more efficient CPU running at 4.8 GHz, so GHz alone can be misleading. I'm a big fan of Intel and run nothing but racks full of Xeons, which are beasts, but it's looking like Zen 3 will own the IPC and single-core performance crown on the 5000 series release. Single-core performance is important in games like MSFS because the "mainthread" is the most limiting process and cannot be split between cores.
DX12 will help more than any hardware you can dream about at the moment. The "mainthread" I reference is where the majority of the game's main loop runs. Yes, some work that isn't encapsulated in that thread can be offloaded in parallel, but for the most part all the important 3D work happens in the mainthread: the game builds its "draw calls" on the CPU, and the CPU prioritizes them and submits them to the GPU. It doesn't matter how powerful the GPU is; if it doesn't receive enough draw calls from the CPU, it won't do anything more. This creates a situation where more fps depends on the CPU issuing more draw calls, and the CPU becomes the bottleneck on a card like the 3090. Where DX12 comes in is its ability to reduce this overhead by giving the application lower-level access to the GPU, so each submission costs the CPU far less. In turn, not only is your CPU freed from being overburdened by draw-call submission, it can now do the CPU work it should be doing instead of spending its time feeding the GPU. Right off the bat, all else being equal, many games see something like a 50% reduction in CPU overhead and a 20% GPU performance gain. On top of that, a card like the 3090 that may have been 60% utilized is now fully utilized, and you start seeing compounding effects. There is more to it, but I'm already balancing on the line between a layman's explanation and an incorrect one.
All things equal, 3DMark on identical systems shows DX11 at around a 2 million score and DX12 at a 35 million score. Yes, that's correct: about 17 times higher. It's not just a little faster, it's game-changingly faster (is that a word? changingly? I digress). Finally, for your last question, take a look at some of the DLSS material from Nvidia on YouTube. DLSS output can actually look more detailed than native resolution; it's absolutely amazing. AMD, IMO, needs something similar to compete. DLSS is not a compression technology; it's a prediction technology that has shown it can reconstruct images that look as good as, or better than, the native rendering of that image. Seems crazy, but if you have been using Nvidia's AI products on the creative side of things for any amount of time, you understand the difference between good graphics cards and amazing software from Nvidia. They are truly leaders in this space. I hope, for our sake as consumers and for both companies driving change, that AMD can counter. Hope that helps.
  8. @Carts85 Yes, it's a big leap, night and day. At true 4K at the proper viewing distance, you don't even need anti-aliasing; 8K will completely eliminate the need for AA. @Virtual-Chris Yes, I considered Big Navi. I will most likely get a 6900 XT for comparison upon release. I'm planning a 5950X CPU upgrade anyway, regardless of GPU. That being said, I'm not convinced the 6900 XT will be the superior card. It may be a better value for many people, but all things considered (DLSS, creative workflows, etc.), Nvidia has had the lead for many years in not just hardware but software and drivers. I love what a competitive AMD card will bring to consumers, but AMD is and has always been a hype machine: I'll believe it when I see it. I'm not trying to hate on AMD, as I've clearly jumped ship for their 5000-series processors (currently running a 3900X) as by far the best value. Don't expect too much from the AMD CPU/GPU combo (maybe 1-2% is what insiders might be expecting); this isn't a magic button. The biggest gains for AMD gaming come from doubling the core count and cache within a core complex, cutting trips across the Infinity Fabric. This will push their CPUs past Intel. As for the 3090 barely pushing 30 fps at this point, that's because it's severely limited by the CPU. At full settings it was getting mid-to-high 30s but was only at 75% utilization while the CPU mainthread was maxed. By limiting the settings reliant on the mainthread, we should easily be able to achieve v-sync at 60 fps on Ultra settings, IMO. I'm working on a video that outlines how to set up and properly achieve a balanced load between CPU and GPU, ensuring you're taking advantage of whatever CPU/GPU combination you may have.
  9. I do not see the relevance of which monitors I used at all. They don't affect either the test or your viewing of it.
  10. Have used and compared all three cards with findings here if you are interested. Follow up video including optimal settings coming soon. Running 3900X / 64GB 3200 RAM / Gen4 NVMe / 3090 but compare all three cards here:
  11. I wish I had more time. I would go grab fs2020.org and contact all of the livery developers he either stole from or who want to stick it to him (credit them all of course, because that's what any sane person would do) and re-create all of the garbage packages he's selling and offer them up for free. We'll call us PERFECTER FLIGHT
  12. Thank you very much leprechaun. Just started last month, as I had to quarantine for a bit and this whole COVID thing got me back into simming. Been simming since the early-to-mid '80s on an 8086 with a 5.25" floppy, lol. I was also licensed until I could no longer hold my Cat 3 medical at 30 (14 years ago, tough to admit). I have come and gone from sims over the years but jumped back into DCS/X-Plane in early Jan, and MSFS is just great. Can't wait to see where this takes us. When the flightsimulator forums opened there were so many new users asking the same questions over and over, so I made a video for them that covered the majority of what they were asking, and to my surprise they wanted more. I had no intention of a channel, but when it grew to a certain size so quickly I figured I'd better change from my personal name to a channel name. Since I'm a sucker for buying addons as it is (not that this one would have duped me; it was shady from the start, but I honestly wouldn't have guessed this much), I figured I might as well share them online. It just sort of happened. The rest is history... just not that old yet 🙂
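The IPC-versus-clock point in post 7 can be shown with a quick back-of-the-envelope calculation. This is a toy sketch with made-up numbers (the 1.6 and 1.9 IPC figures are hypothetical, not measured Intel/AMD values):

```python
# Effective single-core throughput is roughly IPC x clock frequency.
def throughput_gips(ipc: float, ghz: float) -> float:
    """Billions of instructions retired per second (GIPS)."""
    return ipc * ghz

# Hypothetical numbers: a 5.0 GHz core averaging 1.6 instructions per
# cycle vs. a 4.8 GHz core averaging 1.9 (e.g. a newer architecture).
faster_clock = throughput_gips(ipc=1.6, ghz=5.0)  # 8.0 GIPS
higher_ipc = throughput_gips(ipc=1.9, ghz=4.8)    # 9.12 GIPS
print(higher_ipc > faster_clock)  # True: the slower-clocked chip wins
```

This is why a headline GHz number alone says little about single-core gaming performance.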
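The mainthread bottleneck described in post 7 can also be sketched as a toy model: if CPU submission and GPU rendering overlap, the frame rate is set by whichever stage is slower, so cutting per-frame CPU cost (the DX12-style overhead reduction) only raises fps while you are CPU-bound. All millisecond figures below are hypothetical:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is set by the slower stage, assuming CPU
    draw-call submission and GPU rendering are perfectly pipelined."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical: the mainthread needs 18 ms per frame to build and
# submit draw calls, while the GPU needs only 12 ms to render one.
print(round(fps(18.0, 12.0)))  # 56 -- CPU-bound, GPU sits partly idle
# A lower-overhead API cuts the hypothetical submission cost to 9 ms.
print(round(fps(9.0, 12.0)))   # 83 -- now GPU-bound
```

In the second case the GPU, not the mainthread, sets the frame time, which is the "compounding effect" of freeing the CPU that the post describes.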