About w6kd

  • Rank
    Global Forum Moderator

Profile Information

  • Location
    Woodland Park, CO


  1. w6kd

    Memory questions...

    Not sure I follow here...the difference between cutting-edge 8.25 ns RAM and average 10 ns RAM is 17.5%...not close together at all when related to the CPU speed. A 5 GHz CPU will tick through almost 9 additional idle clock cycles waiting for data with the average RAM vs the high-end stuff. Does it make a difference? It sure seems to, especially in areas of dense autogen and/or photoscenery (think ORBX SoCal). When you see performance hits like a dropping frame rate and/or stutters, yet neither the CPU nor the GPU is close to maxed out when it's happening, I believe limited memory bandwidth is the culprit in that situation. Regards
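The idle-cycle arithmetic above can be sketched in a few lines (a back-of-the-envelope check only; the latencies and CPU clock are the figures from the post, the variable names are mine):

```python
# Back-of-the-envelope check of the idle-cycle claim above.
fast_ram_ns = 8.25   # cutting-edge RAM true latency (ns)
avg_ram_ns = 10.0    # average RAM true latency (ns)
cpu_ghz = 5.0        # CPU clock in GHz = cycles per nanosecond

latency_gap_ns = avg_ram_ns - fast_ram_ns        # 1.75 ns longer wait per access
extra_idle_cycles = latency_gap_ns * cpu_ghz     # ~8.75 extra idle clock cycles
pct_slower = latency_gap_ns / avg_ram_ns * 100   # 17.5% latency difference

print(f"{extra_idle_cycles:.2f} extra cycles, {pct_slower:.1f}% gap")
```

At 5 GHz each clock cycle is 0.2 ns, so a 1.75 ns longer memory wait costs just under 9 idle cycles per stalled access, which is where the "almost 9" figure comes from.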
  2. w6kd

    very low frame rate ???

    The first thing I would try is single monitor on just the 4k display...multiple monitors have always hammered my frame rates when I tried it. A 4k display is a pretty heavy load on the GPU all by itself...aggressive AA with lots of clouds and/or dynamic lighting etc can bring frame rates down. And ORBX SoCal is a known performance challenge, especially with lots of traffic and/or add-on airports. Sounds to me like you're trying to throw the whole kitchen sink in...try backing off and building up gradually to see how different options hit your system. Regards
  3. w6kd

    Help lots of time put in

    I think you mean the P3D Developer's edition ($10/month for two copies). If you do a search of the LM forums, I asked them there quite a while back how to swap the registration of an existing installation to another registration key--there is a way of doing that, so you can swap in your regular registration and stop paying the dev edition monthly fee.

    Saving and restoring backup images is an essential skill for simming. Recovering from a borked system by restoring a backup image is far and away simpler and faster than reinstalling Windows, reconfiguring it, then reinstalling/configuring potentially tens or hundreds of add-ons. I use O&O DiskImage; others use Macrium Reflect, Acronis, or something else.

    I highly recommend that you learn this skill up-front...install Windows, make a backup, and then restore that backup (keeping copious notes on the process) so that if you ever need to, you'll be confident and competent in doing the restore. There's not much worse than discovering that you can't recover an image when you need it because you never really understood what you were doing with the backup software. Regards
  4. Balance is the key. Putting expensive fast memory on a stock or mildly overclocked CPU won't help much, and putting skunky slow consumer-grade bottom-bin RAM on a highly overclocked 6/8-core machine will leave some of that expensive CPU performance you paid for on the table, because the CPU will be idling (but very fast) while it waits through extra clock cycles for data from memory.

    Memory true latency is the primary metric--the time (in nanoseconds) from when a read/write request is made until data is available on the bus. DDR true latency is calculated by multiplying the CAS latency by 2000, then dividing by the memory frequency in MHz. So 2133 MHz CAS 14 RAM has a true latency of (2000 x 14) / 2133 = 13.13 ns, and 3600 MHz CAS 15 RAM has a true latency of (2000 x 15) / 3600 = 8.33 ns. Lower latency is better. Currently ~8.25 ns is near the cutting edge of extreme performance, ~9 ns is performance-level, 10 ns is average, and above 10 ns is slow consumer-grade stuff. When you buy memory, frequency is only part of the equation...lots of companies sell high-frequency RAM with ugly high CAS latency to unsophisticated buyers who don't understand this simple math. Caveat emptor. Regards
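The true-latency formula above, as a small sketch (the function name is mine; the figures are the two examples from the post):

```python
def true_latency_ns(cas_latency: int, freq_mhz: int) -> float:
    """DDR true latency in ns: time from read/write request to data on the bus.

    The factor of 2000 converts the DDR data rate (MHz) into a clock period
    in nanoseconds while accounting for double data rate: 2 * 1000 / freq.
    """
    return 2000 * cas_latency / freq_mhz

# The two examples from the post:
print(round(true_latency_ns(14, 2133), 2))  # 2133 MHz CAS 14 -> 13.13 ns
print(round(true_latency_ns(15, 3600), 2))  # 3600 MHz CAS 15 -> 8.33 ns
```

Note how the higher-frequency kit wins despite the higher CAS number: frequency and CAS latency have to be evaluated together, which is exactly the point of the post.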
  5. 1. I'd recommend 10% minimum free space unless you do a LOT of writing to the drive--so much and so fast that garbage collection might have trouble keeping up (not a likely scenario in a consumer PC).

    2. You shouldn't see any ill effect, except maybe a very small increase in the sim's initial load time. When the sim is running, it does lookahead buffering that makes storage speed essentially a nonfactor, unless you are loading large batches of huge photoscenery files. Regards
  6. That PSU test in the video doesn't tell you much except that the PSU is not completely fried. A decent PSU tester only costs $20-30 at someplace like Newegg, and will give you a go/no-go on each of the circuits. Since you removed the CPU cooler, make sure that the notch on the CPU heatsink backplate is correctly oriented and the backplate is not contacting screws/electronics on the back side of the board. My gut suspicion is that the PSU's ground fault detection is preventing the PSU from powering up. Regards
  7. w6kd

    Help lots of time put in

    Well, the unhappy news here is that P3D (and potentially other add-ons) use keys stored in the Windows registry, so you're going to need to reinstall them. P3D, in particular, is going to need a reset from Lockheed Martin, since it was not uninstalled before you reinstalled Windows and overwrote the key (the uninstaller goes to their server and releases your registration as part of the uninstall). I'd get that request out to them tonight in the hope they get back to you tomorrow; otherwise you may have to wait until Tuesday for reactivation (they advertise a two-working-day turnaround, and Monday, 21 Jan 2019, is a US national holiday). For future reference, what I do to avoid this when messing with overclock settings is take an image of the Windows drive before I do any overclocking or change the o/c settings; that way, if something gets hosed, I can restore the boot drive with all of the registry data intact and avoid dancing the re-registration tango. Regards
  8. Suggest you start by reading through this for an understanding of the debug configuration: https://www.prepar3d.com/SDKv4/sdk/simconnect_api/configuration_files/configuration_files_overview.html

    OutputDebugString only works if you are running from an integrated debug environment (IDE), which I suspect you are not. With the log file definition active, the console output may be getting diverted to the log file, so that might explain the missing output window. The debug window exposes the server-to-client comms, it doesn't create them...of course if there are no clients, there are no comms and nothing to display.

    A common technique is to use the AITraffic.exe program from the FSX SDK as a simple client to test the connection. Unfortunately that program is now included in the P3D SDK as source (C++) code only, so if you don't have the FSX SDK installed somewhere, or the ability to compile the P3D SDK example code, you won't have that tool available. Maybe someone else can suggest an alternative.

    I notice you have changed the default port from 500 to 4506--any reason for that?

    Also, this section of your xml is all buggered up, and may in fact be making the entire xml unreadable by P3D. It starts with an illegally-placed end tag "</SimConnect.Comm>":

    </SimConnect.Comm>
    <Disabled>False</Disabled>
    <Protocol>Auto</Protocol>
    < Scope>local</Scope>
    </SimConnect.Comm>

    It should read like this:

    <SimConnect.Comm>
    <Disabled>False</Disabled>
    <Protocol>Auto</Protocol>
    <Scope>local</Scope>
    </SimConnect.Comm>

    Regards
  9. w6kd

    SDKs, APIs, and a whole lot of headache

    Cargo and pax compartment temps are not among the parameters exposed in the PMDG SDK, at least not for the 747. That said, it's not particularly realistic to expect the temps in the cargo compartment to be controlled for a particular shipment. It wouldn't make sense, anyway, because the cargo is going to be in non-temp controlled areas during transit--on loaders sitting on the ramp or in the shipment facility, or in a bonded customs warehouse awaiting clearance. Generally, sensitive cargo like you describe is shipped in temp-controlled containers (e.g. the PharmaPort) where the temp is maintained and monitored within the container itself. Regards
  10. That 10-15% number isn't mine. The testing Rob Ainscough and others did some time ago concluded that in P3D there was something around a 30% performance increase from a single 1080Ti to two in SLI, based on frame rates.

    Problem is, I run with the frame rate locked at 30, so instead of watching for a delta in frame rates, I watch GPU load and the transition point into the stuttering regime during a controlled test scenario as the measure of how well the card is performing. The 2080Ti was able to match the 1080Ti SLI setup by keeping the test scenario smooth, and with a lower GPU load; and when increasing the GPU load (e.g. with resolution, DL, SSAA/SGSS AA), it resists dropping below the VSync lock at 30 fps and/or reaching the stutter threshold better than the 1080Ti SLI config did, and far better than a single 1080Ti did. So given generally-accepted controlled tests that conclude 1080Ti -> 2x1080Ti is a ~30% improvement, and my subjective observation that the 2080Ti bests the SLI setup, I think transitive logic supports my conclusion that the 2080Ti produces a performance increase over a single 1080Ti somewhere out beyond the 30% mark.

    I understand that some folks want hard metrics--I just don't find it worth my while to spend the significant amount of time needed to do the controlled testing required to determine those numbers. If you were to jump into a VW Bug and drive it a few times around the block, then jump into a BMW and drive it a few times around the block, you may not be able to quantify exactly how much better the BMW accelerates or takes a corner, but you can still correctly observe that the BMW is considerably faster and more nimble. That's all I'm trying to do...let folks know that I observed the 2080Ti easily outperforming my old 1080Tis, even a pair of them--and that the difference is non-trivial. But if you want hard numbers, I'm not your guy. Regards
  11. w6kd

    Flysimware got a new texture artist

    I'd like to see the same treatment on their Lear, too...I run P3D on a big (55") monitor, and the panel has textures that might pass as tolerable on a small display, but which look pretty terrible up on the big screen. The texture issues on the Lear have been what's keeping me from the Falcon...good to see that it's being worked. Regards
  12. A review? No, I'm writing here from my own personal experience. The price differential was about the same here in the US if you just swap dollars for pounds, so both cards were--proportionally--less expensive here. My 2x SLI config was two $850 1080Ti cards plus a $50 high-bandwidth SLI bridge ($1750 total)...the 2080Ti was $1250. You seem defensive, Ray, as if I were criticizing your choice of video card. I'm certainly not...but I am taking exception to your statements here that seem to suggest that the 2080Ti offers little in the way of performance gain over a 1080Ti (it performs better than a pair of 1080Tis) and/or is a poor value (the price is commensurate with the performance gain). Sure it's expensive, but high-end always is. The fact that it's pricey doesn't mean those who are willing and able to pay the high price aren't getting real value for their money.
  13. 40% higher cost for 30-40% better performance doesn't strike me as unreasonable. In fact, the 2080Ti ($1250) was roughly 30% cheaper than the 2x1080Ti config ($1750) that was somewhat slower. Cost effectiveness is always in the eye of the beholder. For those seeking high-end performance, the value is certainly there.
  14. No, I have to disagree here. In my experience the 2080Ti has proven considerably more powerful than the 1080Ti. I replaced a twin 1080Ti SLI setup with a single 2080Ti, and get better performance from the 2080Ti in P3Dv4. A 1080Ti is a good choice (if you can find one now), especially if the 2080Ti price waters your eyes, but it's no longer cutting edge. Regards
  15. When they measure the single-thread performance, it's just that--a single thread only, so the chip will clock up to its full turbo speed (assuming the heat sink keeps it below the thermal throttling limit and the chip doesn't exceed its TDP). A flight sim program is going to present loads to more than a single core--most or all of them to some degree--so the turbo mechanism will not clock to anywhere near max turbo, if it clocks up at all with even relatively light loads across all the cores. Using a "K" series CPU with an unlocked multiplier, and a suitable matched motherboard with a BIOS that supports overclocking, you can force the turbo onto all the cores up to the best stable speed. But without the "K" you're stuck with the default turbo mode control logic, which is really only going to get you to advertised max turbo speed with loads present on one, or maybe two cores (on some 6/8 core CPUs). Regards