w6kd

Moderator
  • Content Count

    5,439
  • Joined

  • Last visited

Community Reputation

637 Excellent

About w6kd

  • Rank
    Global Forum Moderator

Flight Sim Profile

  • Commercial Member
    No
  • Online Flight Organization Membership
    Other
  • Virtual Airlines
    No

Profile Information

  • Gender
    Male
  • Location
    Woodland Park, CO

  1. The shortages aside, I'm not sure that the $950 price for a 9900K (all from affiliated resellers) on Newegg is meaningful...it's just a bunch of online scalpers, really. A few weeks ago Newegg had a price tag of $1900 on some eVGA 2080Ti cards that were out of stock there and still listed on eVGA's own site in the $1200-1300 range. The prices have come back down on Newegg's site, but still no stock there. It's gonna be a lean holiday season for the computer retailers at this rate.
  2. I use UTLive, but I also have the MT6 aircraft installed, and I use some of the MT acft where UTL does not have an appropriate model or repaint. With UTL, if it can't find a livery, it loads a generic acft or none at all. Not sure what happens if a traffic bgl references an AI model that's either not there or can't load due to a boogered-up config or texture. As part of my configuration, I removed all the military (and GA) acft from the MT aircraft directory, so not only are they not instantiated by MilTraffic.bgl, they also are not seen, loaded, or otherwise processed when P3D starts. I know P3D does something with the MT acft at startup, because if I take the MT entry out of the SimObjects.cfg file, it loads a lot faster. So it may not be enough to stop problematic (e.g. military) model loading by disabling the traffic bgl alone. Good luck!
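The pruning step described above (moving unwanted AI models out of the scanned aircraft directory) can be scripted. A minimal sketch in Python, assuming hypothetical install paths and model-name keywords; your MyTraffic install location and folder naming will differ:

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust to your own MyTraffic installation
MT_AIRCRAFT_DIR = Path(r"C:\MyTraffic Professional\MyTraffic\Aircraft")
PARKED_DIR = Path(r"C:\MyTraffic Professional\MyTraffic\Aircraft_disabled")

# Hypothetical name fragments identifying military AI models
MILITARY_KEYWORDS = ("f16", "f18", "c130", "kc135", "b52", "mil_")

def park_military_models(src: Path, dst: Path) -> list:
    """Move folders whose names match a military keyword out of the
    scanned aircraft directory, so the sim never sees them at startup."""
    dst.mkdir(parents=True, exist_ok=True)
    moved = []
    for folder in src.iterdir():
        name = folder.name.lower()
        if folder.is_dir() and any(k in name for k in MILITARY_KEYWORDS):
            shutil.move(str(folder), str(dst / folder.name))
            moved.append(folder.name)
    return moved
```

Moving the folders to a sibling "parked" directory (rather than deleting them) makes the change reversible if a traffic bgl turns out to need one of the models.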
  3. At the time you bought your 6850K, it was a current-generation Extreme Edition CPU, a hexacore when 6-core CPUs weren't common, had a quad-channel IMC, more cache, 40 PCIe lanes, etc. I spent $1000 on a quad-core i7-975 Nehalem in 2009 and the same on a dual-core X6800 Conroe CPU in 2006. The advantages were there at the time, but the edge you get at those premium prices is but a temporary, fleeting thing. My 2011-vintage $280 Sandy Bridge i7-2600K beats both of those, and the i7-4790K, i7-7700K, and i7-8086K CPUs that came after absolutely eclipse them. All are capable processors still, but no longer cutting edge. I still have my Conroe in a working PC and use it occasionally with a few niche WinXP hardware devices/software titles that never made it to 64-bit. The dilemma always is whether to pay the price for cutting edge now, or wait a few years and get it at mainstream prices. FWIW, the Conroe was the biggest single-step improvement in the sim I ever had...never regretted the few good years that one gave me before something better came along, as it always does. Cheers
  4. w6kd

    P3D v4 : New rig, new hopes...

    David--I think you have the same eVGA P4-2383 RTX2080Ti GPU that I have coming, and they (eVGA) have a bundle deal for a normally $500 1600W Titanium-rated PSU for $150. I saw somewhere on their site that price was for all current and former purchasers of that card...so if you haven't already gotten that PSU, it might be worth checking there. I currently have one of their 1000W modular supplies and really like it...the self-diagnostic circuit in it saved my bacon last build when I had a backplate shorting to the case--it just clicked and refused to power up rather than smoking something. Cheers
  5. I think the impact of memory performance is probably one of the least understood facets of hardware performance w/r/t P3D. Your 6850K has significant advantages in that department over the mainstream consumer CPUs and chipsets because of the larger (and 20-way set-associative) L3 cache and also quad-channel DDR4. That won't make up for a clock speed 24% slower than an 8086K or 9900K at 5.2GHz, but it certainly makes the most out of the clocks you're getting. L3 cache is important, but there is a point of diminishing returns, which would explain why Intel has landed on 2MB per physical core and not gone beyond that in several generations of "mainstream" CPUs. If it made a big difference, we'd see movement to add per-core L3 over time, I think. Can't deny that more is generally better, and since L3 cache is shared across the cores, that 16MB on the 9900K is something I wouldn't mind having. That said, I'm happy with what I'm getting out of an 8086K and its 12MB together with fast 8.3ns RAM. I do wish there was more data on the effects of memory performance. Regards
  6. w6kd

    Should I get the FSLABS A320?

    The FSL docs, to be honest, appear to be something of an afterthought. Very sparse and underwhelming...definitely not anything comparable to what PMDG provides for its add-ons. It's especially problematic, because anyone not already conversant with Airbus technology and terminology (e.g. MCDU vs FMC, green-dot speed, managed vs selected modes, normal vs alternate vs direct control laws) may never come to appreciate what is actually built into the add-on. I tend to rely on knowledge gained from using previous ScareBus add-ons. A pity when you have to go to 12-year-old PSS docs from an old FS9 add-on to answer a question. Regards
  7. Well that's it--no more Irish Coffee for the flight crews...
  8. w6kd

    Should I get the FSLABS A320?

    On my system, it performs roughly on par with the PMDG 737. I have no idea what is in an Alienware Aurora R5. I think that if you can run the PMDG birds, you should be OK. Regards
  9. Well, we have the software developer saying one thing, others saying the opposite based on experimentation, and a varied collection of user experiences that are all over the map. I guess "fiddle with it until it works for you" is the answer...
  10. As long as you're not otherwise limited by the GPU or RAM bandwidth, you can expect it to scale fairly linearly. So a 300 MHz bump to 5.0 GHz is a 6.4% increase in CPU speed...if you were getting 30 before, I'd expect you to be able to get about 1.064 x 30 or around 32 fps in the same scenario after. So not a huge bump. If you run frame rate limited, as I do using Vsync and a 30Hz display, then that extra performance will manifest itself as additional processor headroom available to absorb spikes in CPU load, which can make a noticeable difference in smoothness. That CoolerMaster CPU cooler isn't exactly top-shelf gear, unfortunately. That said, a good cooler like the Noctua NH-D15 or an AIO sealed liquid loop like the Corsair H100i is one of the cheaper upgrades you could do, and I know folks that have been using their Noctua coolers over 3-4 builds using the same fans. Regards
  11. One thing I'm wondering a bit about is whether a pair of 2080Ti GPUs on a Z390/Z370 board would/could have issues saturating the eight PCIe lanes to each GPU. Your 7900X isn't so constrained...I think you still get 16x on both GPUs there, correct me if I'm wrong. But an 8-core 9900K just might start to max out the buses with 2X 2080Ti at 8x/8x. Something to keep an eye on with your 9900K experimentation, anyway. Cheers
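For context on the x8/x8 concern: theoretical PCIe 3.0 bandwidth scales with lane count, at 8 GT/s per lane with 128b/130b encoding overhead. A quick sketch of the numbers involved:

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane
GBPS_PER_LANE_GEN3 = 8 * (128 / 130) / 8  # GB/s per direction

def link_bandwidth_gb_s(lanes: int) -> float:
    """Theoretical one-direction PCIe 3.0 bandwidth for a given link width."""
    return lanes * GBPS_PER_LANE_GEN3

for lanes in (8, 16):
    print(f"x{lanes}: {link_bandwidth_gb_s(lanes):.2f} GB/s")
```

So dropping from x16 to x8 halves the theoretical ceiling to roughly 7.9 GB/s per direction; whether a given sim workload actually pushes that much traffic per GPU is the open question.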
  12. OK, then the heat pipes are doing their job, and concentrating the heat on the fins away from the GPU. If you want them to run cooler, program a more aggressive fan curve on the GPU (using a utility like eVGA Precision, MSI Afterburner etc) to spin up the fans earlier/faster, and that'll solve that. But...short version is you don't have a problem. Regards
  13. w6kd

    P3D v4 : New rig, new hopes...

    No, P3D's internal frame rate is set to unlimited. VSync on with triple-buffering enabled, and the TV is set for 30 Hz. As long as I don't configure things too aggressively to the point where I start having frame rate excursions below 30, it makes for the smoothest config I've ever had.
  14. I gave up on this and now use my last-generation sim computer for driving/racing sims and other games. That said, there's no need at all to buy more hardware to switch between two monitors--that can be done easily within the standard nVidia Control Panel. I disagree about disconnecting unused controllers, for the reason that if you change the configuration of the connected USB HID devices from session to session, you will likely find that the GUIDs of the devices change as well, which makes keeping settings a real bear. When I ran multiple monitors (one for P3D with a floor-mounted yoke, one for FSX/DCS on the other side of the table with a HOTAS stick/throttle) I left everything connected, and configured the programs to use the appropriate controllers and to ignore the others. If P3D sees a new HID device it doesn't recognize (new as in a new GUID) it will often assign all the default axes to it. Regards
  15. The instrumentation for measuring GPU temp is already in place on the board, so why not use it? Get a copy of GPU-Z or any one of a host of other utilities that report the GPU's temp, fan speed, and load, and you'll immediately be able to determine if there's a problem. The temp on the heat pipes is not what's important, the temp at the GPU is. Might be that those hot heat pipes are normal...that's why they're called "heat" pipes and not "cold" pipes. 😉 That said, it's never a bad idea to blow the dust out of all your computer's heat sinks... Regards