
TheFamilyMan

Members
  • Content Count

    1,176
  • Donations

    $0.00 
Everything posted by TheFamilyMan

  1. There is not much use in comparing flat screen resolution visuals to what you see in a VR headset at that resolution, except in the most extreme (and unrealistic) cases. Note that simple multiplication will yield any scaled resolution: HW resolution (both width and height) * render scale factor / 100 = scaled resolution.
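The multiplication in the post above can be sketched in Python (the example resolutions are my own, not taken from the post):

```python
def scaled_resolution(width, height, render_scale):
    """Apply a render scale factor (in percent) to a display resolution."""
    return (width * render_scale // 100, height * render_scale // 100)

# Example: a 2560x1440 panel at Render Scale 80
print(scaled_resolution(2560, 1440, 80))   # (2048, 1152)
# Example: a 4k panel at Render Scale 150
print(scaled_resolution(3840, 2160, 150))  # (5760, 3240)
```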
  2. A friend of mine uses an rtx 2080 Super to drive his HP Reverb G2 for MSFS and he's satisfied with it for now; he's waiting it out until graphics cards become readily available after the nvidia 40 series launches (still most likely a year away). Recently I was lucky enough to get an evga rtx 3080 ti FTW3 from a local computer store, but even that struggles to drive my new Reverb G2 at high/ultra settings in MSFS. Ended up having to use Render Scale 80 to get where I wanted to be visually and performance-wise, but dang, it is worth it!
  3. If you are 100% certain that your dedicated MSFS add-ons disk contains only add-ons, it'll be simple to move that disk's contents to a new drive. Though you are not familiar with disk imaging tools for copying a drive, using one will be A LOT quicker time-wise than a windows file copy (say about 1 hr vs. 3 hr or more). I've used Minitool Partition Wizard Free for doing this with excellent results (years ago I had a paid version of it). A super easy operation: install the new drive, then use the tool to image copy the old drive to the new drive and you're done. It is a non-destructive operation (other than what was on your new drive originally). If other parts of MSFS are hiding out on your dedicated disk, replacing that dedicated disk with a copied/imaged disk will most likely lead you to a world of hurt (from first-hand experience).
  4. HDR is a terribly overloaded name for a game setting. Its most recent meaning applies to monitors/TVs that support the hardware HDR color range, which uses 10 bits of dynamic range per color channel; non-HDR has only 8 bits. If a title implements HDR "properly", it should only be an option to enable if the display supports hardware HDR.
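As a quick illustration of the 8-bit vs 10-bit difference (a minimal sketch; the bit depths are the standard SDR and HDR10 per-channel values, nothing specific to any one title):

```python
def levels(bits):
    """Number of distinct intensity steps per color channel at a given bit depth."""
    return 2 ** bits

print(levels(8))   # 256 steps per channel (non-HDR / SDR)
print(levels(10))  # 1024 steps per channel (hardware HDR, e.g. HDR10)
```

Four times as many gradations per channel is what lets HDR displays cover a wider brightness range without visible banding.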
  5. Sheesh, the lengths some will go to purposely force their bitter feelings on, or towards, others...this pretty much sums up the act of road rage in general. A better rule to live by: "in a world where you can be anything, be kind".
  6. Though I don't own one (though I am tempted), IIRC eliminating screen tearing at a variable FPS is the primary purpose of GSync tech. I've abandoned trying to run anything beyond vsync 30 with my 60 Hz IPS panel. I can see that higher FPS does somewhat enhance the experience, but with my panel, all my attempts to run my rtx 3080 ti at higher FPS lead to either occasional and noticeable stutters with any sort of vsync (in-game or driver), or tearing if not using vsync.
  7. Having botched an update via MS Store by not verifying the install location in the past, and dealing with the teeth gnashing experience that followed to undo the mess, I ALWAYS triple verify that the install location is correct before clicking the install button. BTW, my MSFS install is on a dedicated drive, so that install location is not the out-of-box default. This Xbox app we now have to deal with to update MSFS is nothing but really irritating, to say it politely without censored text. And now we are forced to maintain updates for that app just for the privilege of updating MSFS. I definitely regret my MS Store purchase of MSFS!
  8. I kind of agree that this seems to be a rather self serving poll, or rather it's worded such that it sounds that way. My thoughts on poll prompts:
     • I use blah
     • Other blahs are what I use
     • I don't know anything about blahs
     Sounds less biased and clearly covers more bases IMO. Sorry for butting in, now back to flying
  9. Just upgraded from a gtx 1070 to an rtx 3080 ti (evga FTW3 Ultra). Set everything to max settings except render scale at 150 and LOD 3/4.5. Tried disabling in-game vsync: no thanks, obvious screen tearing present. Tried in-game 60fps vsync enabled: no thanks, obvious stutters. Tried in-game 30fps vsync enabled: smooooooth as can be. But hey, this is me with my 60 Hz 2K IPS monitor (check my computer specs below). If I'm leaving performance on the table I don't care, for MSFS looks and feels astounding wherever I fly, at any time of day and with any kind of cloud cover. That's enough to justify the expense, but also I'm now looking towards upgrading to VR.
  10. Though a bit late to this party, I must share my joy over my recent 3080 ti purchase, an evga FTW3 Ultra. I was shopping for an external backup drive at a local computer store and happened to be there the day they got a shipment from evga. Evga MSRP sticker shock to say the least, and they were asking $100 over it. But I couldn't resist since I really wanted a 3080 ti, was tired of waiting, plus it's got the full evga warranty. And really, who knows when this gcard drought will end; nvidia just announced the 40 series won't be out until 4Q 2022! Needless to say, an astounding upgrade in MSFS visuals and performance over my old gtx 1070 (had that card for over 5 years). Looking towards VR as my next upgrade.
  11. Can't say this is a fair assessment, but yeah, things are getting better all the time, especially since I just replaced my 5-year-old GTX 1070 with an RTX 3080 ti. Full Ultra is the real deal; nearly fell out of my chair with the enhanced visuals, not to mention the buttery smooth performance too. VR is my next upgrade.
  12. I agree, those two destinations (while impressive to say the least) are shameless touristy money grabs akin to the like priced "excursions" which surround the cruiseliner industry. As mentioned above, if you got the time, hiking is the best way to take in that area. I'm glad that MSFS does it justice.
  13. Thanks for the heads up on this. As you mentioned, I used a weather preset and the snow was gone. And BTW, that area is breathtakingly done in MSFS (without the snow), Bravo! Too bad the waterfalls are missing, but I can live with that. I totally recant my comment on the mountains, they are crisply and realistically detailed and look as I remembered when I visited that area 4 years ago (though not in an aircraft). The phrase "Thought I died and went to heaven" perfectly describes being there IRL (there's me standing before the Jungfrau).
  14. Crud, seems that the satellite imagery used to build the Bernese Oberland area of Switzerland (e.g. Wengen, Grindelwald & Lauterbrunnen) was shot in mid Spring, such that all its beautiful lush green scenery and picturesque villages are covered in snow; not to mention the meh rendering of the Eiger, Mönch, and Jungfrau. Yeah, this is a flight sim so I should not worry about that, but still, what a huge disappointment (at least for me). What seems extra ironic about mid Spring is that it has neither good snow for skiing nor the beautiful scenery for tourism, both of which are big draws for that area of Switzerland. Oh well, so much for my rant...back to flying.
  15. Bert's cure is what I instinctively and immediately did after I installed that travesty. Seems that the only way I could access the WU6 download was to have the Xbox app not only installed but running as well.
  16. Bad lottery pull with an Asus SP of 51, it happens ☹️. 1.22v AVX load at 4.9Ghz, 1.34v at 5.0Ghz. At least the performance difference here barely matters (now if only I could reasonably hit 5.2 w/HT...then we'd be talking). My old 4770k was nearly golden...oh, those were the days.
  17. The 4.9 OC of my 10700k, which is under an NH-D15S, hits a max of 82c in Cinebench but averages around 78c, room ambient 24c. When running MSFS SU5 I've seen some max core temps of 86c with the same room ambient, though they appear to be very short lived and the average core temp is around 64c. My guess is that MSFS uses AVX instructions, such as FMA3, which, like Cinebench, really bump temps, especially if you are overclocking. I don't see these temps as anything going wrong, other than a cooler that is being pushed to its limits. I go into a bit more detail in this post I made in the hardware section. In the end, like Rob said, I can get better cooling to take care of this, or simply reduce my OC.
  18. Though possible, applying render scale past 100% on a 4k monitor shouldn't be necessary and, as you figured and demonstrated, slays a top end gcard designed for 4k performance. Maybe if your screen size were 120" such a 4k render scale could be useful, but sitting close enough to it to notice the improvement would be rather impractical to say the least. As for the best mix of render scale and TAA, that IMO is a matter of personal preference determined by a little "trial and error" by the user...there is no one-size-fits-all setting.
  19. Bert, thanks for your tip (and excuse my above indulgence). I gave Render Scale a try with my gtx 1070 driving a 2k monitor. The best it could reasonably handle is 110%, but even that modest bump noticeably smoothed jaggies (while pushing my 1070 to its extremes...but why not), thanks again!
  20. The render scale is a really nice feature. It implements DSR in-game: it actually renders the scene at the render scale resolution, then downsample-scales that image to your monitor's resolution (in image processing this is called decimation). The decimation algorithm averages the portion of the rendered image's pixels that maps to each monitor pixel and sets that monitor pixel to the averaged value. This yields a more accurate final anti-aliased monitor image; thus DSR is effectively SSAA. I experimented with driver DSR when it came out a while back, before I got a 2k monitor, and loved its results, but it had a nasty side effect: any overlay text became very tiny. The MSFS implementation solves that problem.
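The averaging step described above can be sketched as a simple box-filter decimation in Python (an illustrative model only, assuming an integer scale factor; the filtering a real driver or game uses is more sophisticated):

```python
def decimate(image, factor):
    """Box-filter downsample: each output pixel is the average of a
    factor x factor block of input pixels (integer factor assumed)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 rendered image downsampled 2x to a 2x2 "monitor" image
img = [[0, 0, 8, 8],
       [0, 0, 8, 8],
       [2, 2, 4, 4],
       [2, 2, 4, 4]]
print(decimate(img, 2))  # [[0.0, 8.0], [2.0, 4.0]]
```

Because every output pixel blends several rendered samples, hard edges get the smoothed, supersampled look the post describes.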
  21. Running MSFS 2020 SU5, I recently gave my overclocked i7 10700 a close look running a stressing location, a 6k' flight from KBUR to KSNA, using LOD 300/300 and 30 fps vsync (see my computer specs in my signature). I originally tried using LOD 600/600 for this test but decided that such settings are impractical for my computer. For this LOD 300/300 test I stepped back my 5.0 Ghz OC to 4.9, primarily to reduce the load voltage by about .1v (though I lost 2% of the effective OC in the process). What I saw was surprising: about half of my physical cores had max temps between 84c and 86c! The average was around 63c. Though I can't tell for sure, it looks like those high temps occurred over very short intervals and had to be caused, IMO, by AVX instruction (FMA3?) execution. Though these temps can be seen as alarming, I'm glad that Asobo is taking advantage of these instructions, which GREATLY increase computational throughput; that is exactly why they exist and IMO it'd be foolish to leave such computational power "on the table". If I run all stock clocks, including disabling XMP, the temps hit 77c though the processor then occasionally pegs 100% in this test. In the end, I figure I'll live with my 4.9 Ghz OC and these short high temp bursts, for they are not causing thermal throttling, and the power consumption max is staying below 130w and averaging well below that. Seems that my NH-D15S cooler is being pushed to the wall here...plus my 10700 is a mediocre lottery pull...oh well, thanks for reading.
  22. Thanks for the offer, it'd be interesting, but regardless, I will be upgrading my GTX 1070 anyway. The only reason I am looking towards an rtx 3080 ti is that I see a near future with VR in it. Otherwise it would be overkill for my 2k panel.
  23. Complete agreement, been hoping to do that since the 3080 ti released (the 3080's 10GB just doesn't cut it for me). Knew when I built my system about a year and a half ago that I wanted to retire my gtx 1070 soon and almost bought an rtx 2080 S at that time, but its 8GB was a deal breaker, plus the 30 series was just about to be released. I now kick myself for not getting that 2080 S for about $660...if only I had known what the gcard market soon would become. I've been surprised, and mildly pleased, that my gtx 1070 handles my base settings well enough to drive a 2560x1440 panel at 30 fps, but yeah, it's really holding back my MSFS performance, and I'm not willing to spend $2000 or more to upgrade it. Seeing how LOD 6/6 really stresses my i7 10700 at 30 fps, I can't see how a 3080/3090 could get around that computation limitation, but dang, how I'd love to give it a try. Besides, 6/6 was simply a test I ran to humor myself.
  24. Seeing that SU5 works so well for how I fly in MSFS 2020, just for kicks I set both terrain and object LOD settings to 6 (600 in-game, though only possible by editing the UserCfg.opt file) and did my KBUR to KSNA flight to see how it goes, in a Waco YMF-5 at 6k'. Holy Mother of Pearl...ground detail for as far as it's practical to see (truly puts 2/2 to shame)! But it's anything but smooth, using the sim's 30 fps vsync. Other than having terrain vector data, texture resolution, texture synthesis, and ambient occlusion set to Ultra, all my settings are set to High. See my computer specs in my signature. Moving my virtual cockpit view with TIR causes big FPS hits for 2 or so seconds, plus the autogen pop-in is painfully obvious, mainly in the distance. But once it settles out, smoothness returns. The most interesting thing with 6/6, at least for me, is seeing its execution impact on my overclocked i7 10700. With the standard 2/2 LOD in SU5, no core ever comes close to hitting 100% use, but with 6/6 every view change spikes about half the cores to nearly 100%, and in general core utilization is about 30% elevated. Also noticed that some of my cores are hitting max temps of 84c, though only for very short periods; the average is in the upper 60s c. I see these temp spikes as a good thing, for it shows that MSFS utilizes AVX instructions (I imagine particularly FMA3 since it's tailored for linear algebra computations, plus that instruction cooks the cpu like no other). Of course, it's easy to blame my 6/6 performance on my (ancient) gtx 1070, but it was rarely pegging 100% and looked to be performing as well as when running 2/2. Clearly IMO it's the CPU that matters most for 6/6...oh well, now back to the mundaneness of 2/2 (or maybe a bit higher). Thanks for reading!
  25. If it's not MSFS 2020: IL-2 Battle of Stalingrad (or Moscow, Kuban, Bodenplatte, aka BoX), or its WWI version "Flying Circus", flying online private server missions with my squad mates. We've been together for 15 years, having started right before "IL-2 1946" released. I also write BoX missions, using the mission editor's "Graphical Data Flow" programming language, which is similar to the programming language "LabVIEW" that I used when I worked in aerospace at LM (recently retired). Pinball Arcade, which is a highly realistic pinball table sim. Its tables accurately mimic RL tables from the 70's to recent years, with excellent physics and visuals. Unfortunately its developer "FarSight" no longer supports it, primarily because about 3 years ago Williams and Bally didn't renew FarSight's license to sell their tables. Though not a sim, at times I spend a fair bit of effort editing HD footage shot while traveling into short films of our adventures, using Premiere Elements. It's nice to see my 16 virtual cores getting a worthy and useful workout when crunching H.265.