
seasley

Members
  • Content Count

    13
  • Donations

    $0.00 
  • Joined

  • Last visited

Community Reputation

12 Neutral

Profile Information

  • Gender
    Male

Flight Sim Profile

  • Commercial Member
    No
  • Online Flight Organization Membership
    none
  • Virtual Airlines
    No
  1. When talking about internet speeds, make sure you're not comparing bits to bytes. MBs usually indicates bytes; Mbs usually indicates bits. I've seen more than one example of the wrong units appearing in documentation and measurements. You're consistent in your notation, so I'm guessing that's not the issue. How are you measuring the downstream speed? Are you using a diagnostic tool built into the cable modem? Are you running a speed test supplied by your provider's website? Or are you running something like speedtest.net? Unfortunately, I've seen providers detect third-party speed tests and throttle them - not always intentionally. They're just trying to keep the overall connection responsive, so they prevent any single device from maximizing the connection. Before upgrading your connection, I'd make sure you're getting the most out of what you currently have. I would expect you to be able to get into the low 200 Mbs range with your stated connection speed. Every device and connection between your computer and the internet has a potential speed limit / capacity. Your provider may be delivering a 275 Mbs signal to your location, but something in between is slowing you down. If you're using wi-fi to connect to your router, ~90 Mbs is probably the best you're going to be able to do. I've never seen a real-world wi-fi network perform anywhere close to the specifications. If you've got something limiting your bandwidth, a faster connection on the provider's side won't change anything. When I first got my cable connection at home, the provider swore I had a 300 Mbs downstream connection, but the modem they gave me would only connect on one or two channels at a time. They tried to tell me it was a wiring problem inside my house. To prove them wrong, I scheduled an appointment with the tech and set up the cable modem and my laptop outside my house at the cable box. 
When the tech saw I couldn't even get 5 Mbs sitting beside their service panel, he finally realized they had the channels mis-configured. (Incidentally, the whole street was affected - everybody saw their connections improve when he fixed it.) The moral of the story is to get your connection as simple as possible for testing purposes. Eliminate anything that could be causing the issue. Connect your cable modem as close to the service panel as possible - run a coax cable directly from the panel to your modem if you can. Temporarily get rid of any hardware that might be interfering - switches, hubs, extenders, etc. Plug your computer directly into your modem instead of using wi-fi. If that works, you know the limitation is on your end. Cable modems can be sensitive to interference from other devices. It's not supposed to happen, but it does. Disconnect everything else that connects to cable or your router and see if the situation gets better. A bad ethernet cable could cause your hardware to limit itself to a lower speed. (I had a bad cable drop an entire network from 1 Gb to 10 Mb - swapping out the cable fixed it.) Another thing that's tripped me up in the past - I had a cable modem / router that did QoS based on the connection speed that it measured. After I upgraded the service and increased the bandwidth, the modem was still limited to the original speed because I hadn't reset the QoS configuration. Also check the settings for filtering, parental controls, and firewalls. Some hardware can only provide these functions at a limited rate. Turning on every feature might result in lower bandwidth. For testing purposes, try turning everything off. If it works without them, add them back in one at a time until you discover what's causing the issue.
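To make the bits-versus-bytes point concrete: a megabyte is eight megabits, so a connection quoted in Mbs moves roughly one-eighth that many MBs. A minimal sketch of the conversion (the function names are mine; the 275 Mbs figure is just the plan speed mentioned above):

```python
def mbps_to_mbytes_per_sec(mbps: float) -> float:
    """Convert a link speed in megabits/s to megabytes/s (8 bits per byte)."""
    return mbps / 8.0

def download_seconds(file_mbytes: float, link_mbps: float) -> float:
    """Seconds to move file_mbytes megabytes over a link_mbps link,
    ignoring protocol overhead."""
    return file_mbytes / mbps_to_mbytes_per_sec(link_mbps)

# A 275 Mbs plan tops out around 34.4 MBs of actual file data:
print(mbps_to_mbytes_per_sec(275))          # 34.375
# So a 1000 MB download takes roughly half a minute at full speed:
print(round(download_seconds(1000, 275)))   # 29
```

If a speed test reports a number about eight times lower than what you're paying for, the first suspect is a bits/bytes mix-up, not the connection.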
  2. I use a 43" 4k TV for flight simming and a second one hooked up to my work computer for productivity tasks. By sitting close to the TV and running 100% scaling, it's like having four 1920 x 1080 monitors stacked in a grid. I can go ultra-wide if I'm working with spreadsheets or fully vertical if I'm working on code. I almost never maximize single windows for word processing or web browsing - too much head turning from side to side to read and look at user-interface elements. Constantly going from the bottom of the monitor to read the last line and then to the top to find a toolbar button does get tiring. Having a 4k running at 100% scaling is about being able to multi-task and see multiple windows simultaneously, with each one showing an entire screenful of information. There are lots of freeware utilities to help you use the screen space effectively - they enhance the Windows UI to allow you to snap windows to fill the left and right, or top and bottom, or just a corner. You can also divide the desktop into just about any arrangement you want. Being able to shift windows around and come up with different layouts makes me so much more effective. I'd concur that anything much bigger is too hard to scan from side to side and top to bottom on a desk without strain. I tried a 47" at first and returned it to get a 43". One thing to make sure of is that the LCD panel doesn't show the screen-door effect when you sit that close. Six feet away you'd never notice, but from 3 feet in, you want to make sure you can't see gaps between individual pixels. Scott
  3. A word of caution - I discovered when I was trying to transition airports from the scenery.cfg to the add-on.xml method that not all of the files for an airport are in its scenery and texture folders. For example, FlyTampa Tampa normally installs into a FlyTampa\Tampa folder in the Prepar3D folder. However, it also installs a few files in the Effects folder and the texture folder under it. If you move the FlyTampa\Tampa folder outside the Prepar3D folder and create an add-on.xml file for it, the airport will still work. However, if you then delete your Prepar3D folder as part of an upgrade or other event, the effects files will get lost. I don't know if the missing files will cause the sim to crash, but there would probably be some issue, even if a minor one. Also, just because a developer uses an add-on.xml file to point to the scenery and texture folders for the airport, it doesn't mean they don't also install files into other folders in the Prepar3D folder that could get lost. Drzewiecki Design sceneries seem to do this. (They use an add-on.xml file to reference the scenery and texture folders, which are installed into the Prepar3D folder by default, and they also add additional files elsewhere in the Prepar3D folder.) I think Lockheed Martin's goal was to allow 3rd party airports to install into a single folder that contains every file necessary for the airport, but only a few developers truly do this. FSDG, for example, is one that seems to do it right. When I installed their newer sceneries, they created an Effects folder outside the sim folder which is also referenced in their add-on.xml file. That way their effects files won't get lost if the Prepar3D folder is removed. If all developers used the add-on.xml file to the fullest like this, we truly could just wipe the entire P3D folder when it was time to upgrade. FYI - I've been using SoftPerfect's File Access Monitor to track what files get installed by 3rd party developers. 
It's a little cumbersome to use, but it's free. It gave me enough information to realize that trying to maintain a P3D folder with nothing but core files in it was going to be too hard, so I stopped worrying about it. The monitoring service will tank your performance, though, so don't leave it running when you run the sim.
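For reference, here is a hypothetical add-on.xml along the lines of what FSDG does - one component for the scenery and a separate one for the effects, both living outside the sim folder so nothing is lost when Prepar3D is wiped. The paths and names are made up for illustration; check Lockheed Martin's Prepar3D SDK documentation for the exact schema your sim version expects:

```xml
<SimBase.Document Type="AddOnXml" version="4,0" id="add-on">
  <AddOn.Name>Example Airport</AddOn.Name>
  <AddOn.Description>Scenery and effects kept outside the Prepar3D folder</AddOn.Description>
  <AddOn.Component>
    <Category>Scenery</Category>
    <Path>C:\P3D Addons\ExampleAirport\Scenery</Path>
    <Name>Example Airport</Name>
  </AddOn.Component>
  <AddOn.Component>
    <Category>Effects</Category>
    <Path>C:\P3D Addons\ExampleAirport\Effects</Path>
  </AddOn.Component>
</SimBase.Document>
```

With both components declared like this, deleting the Prepar3D folder leaves the airport's files untouched.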
  4. The setting you found is the one LM is referring to (I think). It is one that is more effective when used on laptops with dedicated graphics cards. My Dell XPS has both an Intel GPU built into the processor and a discrete NVIDIA adapter as well. On my laptop the setting allows me to choose the Intel GPU as the Power saving GPU and the NVIDIA as the High performance GPU. Setting it to Default allows Windows to "pick" the best adapter for the program and circumstance. For example, if I run a 3D application with the battery fully charged, Windows will use the NVIDIA GPU; if the battery is low, it will run the application using the Intel GPU. But on both my desktops I see the same dialog box but with the NVIDIA GPU under both options (one desktop is an F processor without a built-in GPU, on the other the built-in GPU is disabled in BIOS). It is possible that NVIDIA is exposing two instances of the same GPU to the operating system to allow you to select different settings, but I haven't found anything online or in the docs to suggest that they are. My guess is that Windows always displays the same three options regardless of the number of GPUs in the system, and selecting one versus the other makes no difference when there's only one GPU. It's obvious when I try to run an application on my laptop using the "wrong" GPU; I can't imagine trying to run P3D on a built-in Intel GPU with settings intended for a high-end NVIDIA card. I'm sure LM wants to cover even the obvious things - you've got to start somewhere when diagnosing problems. pdge is correct that NI and NCP are not the same program, but both accomplish the same thing - alter settings in the NVIDIA GPU driver. Inspector gives you deeper access to internal variables while Control Panel is much more user friendly. 
In my experience and reading, NI was essential in the FSX days and still very important during the early versions of P3D, because the simulators weren't designed to take advantage of all the features in the newest GPUs. NI allowed you to turn on these features to enhance the performance and appearance of the sim. The later versions of P3D are already aware of the latest GPU features, so the effectiveness of manually setting the GPU configuration is greatly reduced - and it can hurt performance if you're not careful, by putting the GPU in one configuration while the simulator is expecting something different.
  5. 1) In my experience, wireless range extenders do increase range, but they also introduce instability. I can't say for sure that they hurt speed, but I diagnosed a network that was using them and devices were constantly dropping connections. We wired in two additional access points, removed the range extenders, and the dropped connections went away. (I believe it was all NetGear equipment, but I don't remember specifics.) 2) You might not be able to do much better than that without a wired connection regardless of what you buy. During the first part of last year my company built out new office space in an empty warehouse. The fiber connection was installed quickly - we'd just finished the interior demo. So in the middle of an empty warehouse with the power shut off in an industrial area with my laptop sitting only 15 feet from the access point, I was only able to get about 570 Mbs on commercial equipment that was rated for over 1 Gb. It was about as close as you can get to a lab-quality situation and I still couldn't do any better than 50% of the rated throughput. Without details of your living space and device locations, it's hard to know what the factors are, but before you buy a better router / access point with more powerful radios / antennas, try this. Get a really long patch cable and use it to relocate your access point closer to the devices you want to have the best connection - even if it means letting the cable lie on the floor and down the stairs temporarily. If that doesn't improve things, buying better wi-fi equipment probably won't help much.
  6. I had an LG 34UM65 ultra-wide for a while and switched to a 4k TV because I liked it so much better. The problem I had with the ultra-wide was that while it did give me some nice horizontal peripheral vision, the aspect ratio meant I had a harder time seeing the instruments. Plus the fish-eye effect made it feel like I was sitting in the jump seat rather than the pilot's seat.
  7. I remember reading here that add-on aircraft will sometimes use libraries/routines/textures etc. from the default aircraft. Deleting them might cause unexpected failures in other products. I might be wrong overall, and it might not apply to your add-ons, but be cautious. Scott
  8. Completely non-P3D point here, but I've worked with some setups that mix a 4k monitor with a non-4k monitor. Windows 10 does a much better job of scaling programs between different DPIs than it used to, but it doesn't always handle changing DPI on the fly. If you drag a window from a high-DPI display (i.e. a 4k monitor) to a regular-DPI one (your existing 24"), you may have programs that do not scale interface elements immediately, rendering them unusable. (They work fine when started in a high-DPI environment, but don't handle changing without restarting.) I love working on my 4k monitor as a single large display. I treat it like having 4 normal monitors in a seamless grid. I sit close to it and leave the scaling at 100%. I can spread a program horizontally across the width of the screen (e.g. Excel), or vertically across the entire height (for code listings). There are free utilities out there that let you define complex custom snap points that make it really easy to make use of all that space. My guess is you'll quickly forget the additional monitors.
  9. I had an ultra-wide computer monitor (43", 21:9 ratio) and switched to a 47" 4K TV with a standard 16:9 ratio. I couldn't be happier with what it does for P3D, not to mention other things. I even liked it so much for productivity tasks that I convinced my company to buy one for my office. Check out https://www.rtings.com if you don't know about it. Many of their TV reviews specifically address using the TV as a computer / gaming monitor. You tend to sit a lot closer to a TV being used as a computer monitor, so it's helpful to have a review from that perspective. They saved me because the model I was going to buy had some visible cross-hatching from up close. It was not noticeable from room-sized viewing distances, but at a desktop distance it was like looking through a screen door.
  10. I upgraded my i7-4790K to the i7-8086K. With a good air cooler, power supply, and case, I'm able to run it at a constant 5.2 GHz. I wasn't necessarily hunting frames, but I was able to turn the settings in P3D up even further and maintain what I wanted, especially in dense scenery and heavy clouds. Still can't get into LAX without seeing a drop, but I fly with ORBX SoCal, UT Live, the works. Almost anywhere else I fly I can get the performance I want without compromising. As I understand it, the 8086K is the same processor as the 8700K, but Intel picks out the best quality ones and brands them as the 8086K. Using only the best chips allows them to bump the boost frequency to 5 GHz out of the box. The way I see it, an 8086K is much more likely to hit the higher speeds than an 8700K. Is it worth the price premium? It was for me. I've also seen that the newest Intel chips (the 9XXX series) might not be the best when overclocking either (example: https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/22) The new chips are faster out of the box, but Intel seems to be shipping them at much closer to their maximum clock than they used to. It seems to be a lot more work and money to squeeze out a couple hundred MHz on them than it used to be. There may be benefits to having the additional cores, but I think P3D still has a ways to go to automatically say more cores is better than more clock. As for the 10th generation, most of the speculation is on Intel moving to a smaller manufacturing process (10nm or even 7nm). Historically, Intel moves to a smaller process every other generation (which generally improves efficiency) and then stays there while improving the performance for the next generation (referred to as the tick-tock cycle in the past). The next smaller manufacturing process is already a little late (this is the second performance increase generation) although some of the new mobile chips are using a new process. 
As for the Z390 motherboard vs the Z370, most of the specs I've seen refer to improvements in USB, Wi-Fi, PCI lanes, etc., and not much that really centers on the processor. At one time I thought I read that the new chipset supported faster memory, but I can't find anything now that says it. DDR4 vs DDR3 vs DDR2 is basically a generational notation, much like the 8700K is the 8th generation processor and the 9700K is the 9th generation processor. Newer generations support faster speeds. RAM speed is generally measured in bandwidth instead of frequency because a number of factors make the specific clock frequency meaningless. The general rule is to buy the highest bandwidth that the processor and motherboard support. Then you get into the timings thing, which is above my pay grade. For example, I found a 16 GB stick of RAM advertised as DDR4-4000 (PC4-32000) with timings of 18-19-19-39. It runs at a 2000 MHz clock frequency, but because it transfers on both edges of the clock cycle, it moves data at 4000 MT/s (DDR stands for Double Data Rate - two transfers for every clock cycle). Because each transfer is 64 bits (8 bytes) wide, it has a peak bandwidth of 32,000 MB/s - which is where the PC4-32000 label comes from. (I might not have it exactly right, but you get the idea.) The timings (18-19-19-39) refer to how many clock cycles the RAM takes to perform certain actions, such as responding to an instruction or accessing a specific bit. It gets super technical, but the rule is always lower is better. I believe it is possible for RAM with a higher frequency but slower timings to end up having poorer performance than RAM with a lower frequency but faster timings. I think that's only true in extreme cases, but it means you can't ignore the timings entirely. As far as an SSD goes, I can't really speak about M.2 versus a standard SATA. I have a desktop with an M.2 SSD, but my P3D machine has SATA SSDs. 
I haven't benchmarked the drives, but the computers are nothing alike in terms of the other components, so I don't know that it would be meaningful anyway. Honestly, I can't say I notice any difference between them, but that's doing normal day-to-day tasks. I also don't think that P3D is particularly sensitive to disk speed above a certain point once you're flying. That may be different for photoreal, which I don't use. I've never felt that the SATA SSD I have was holding me back. Moving from a traditional rotating disk to an SSD is a massive game changer, but I don't think an M.2 makes quite the same impact. I guess what I'm trying to say is that in my opinion it would be better to make sure you have everything on an SSD, even a cheaper and slower one (non-M.2). I also understand that M.2 format drives can generate a lot of heat under load. My ASUS motherboard came with a special heat sink just for the M.2 slot.
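The DDR naming arithmetic above can be sketched in a few lines; here using DDR4-3200 (PC4-25600) as the worked example. This is peak theoretical bandwidth only - the timings determine how close real workloads get to it:

```python
def ddr_peak_bandwidth_mb_s(bus_clock_mhz: float, bus_width_bits: int = 64) -> float:
    """Peak bandwidth of a DDR module: two transfers per clock cycle
    (Double Data Rate) times the bus width (64 bits = 8 bytes per transfer)."""
    transfers_per_sec_m = bus_clock_mhz * 2       # e.g. 1600 MHz -> 3200 MT/s
    bytes_per_transfer = bus_width_bits // 8      # 64-bit bus -> 8 bytes
    return transfers_per_sec_m * bytes_per_transfer  # result in MB/s

# DDR4-3200: 1600 MHz clock -> 3200 MT/s -> 25,600 MB/s peak (hence PC4-25600)
print(ddr_peak_bandwidth_mb_s(1600))  # 25600
```

The "DDR4-xxxx" number is the transfer rate in MT/s, and the "PC4-yyyyy" number is that rate times eight bytes - two labels for the same module.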
  11. As noted, 4k refers to the resolution of the display device (monitor, projector, TV, etc.). Getting a 4k signal to the display device can also be an issue. Not all computers can do it, especially laptops and systems using GPUs integrated into the motherboard or CPU. You'll also need a newer HDMI or DisplayPort connection between the computer and the monitor - older HDMI, DVI, and VGA cables can't do it. Assuming all of the hardware is in place, you might still have to tell your computer to output a 4k signal. Windows will try to output at the optimal resolution (i.e. the one that matches your display), but you may have to tell other programs (like FSX, P3D, etc.) to output at the higher resolution. Not having the proper drivers installed could also limit you to a lower (non-4k) resolution.
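A back-of-the-envelope calculation shows why the cable matters: uncompressed 4k at 60 Hz and 24 bits per pixel needs roughly 12 Gb/s of raw pixel data, which exceeds HDMI 1.4's nominal ~10 Gb/s but fits within HDMI 2.0's ~18 Gb/s (the HDMI figures are nominal link rates; usable bandwidth is lower after encoding overhead). A small sketch of the arithmetic:

```python
def raw_video_gbps(width: int, height: int, refresh_hz: int,
                   bits_per_pixel: int = 24) -> float:
    """Raw pixel data rate in gigabits/s, ignoring blanking intervals
    and link encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4k (3840 x 2160) at 60 Hz:
print(round(raw_video_gbps(3840, 2160, 60), 2))  # 11.94
# The same panel at 30 Hz needs half the bandwidth:
print(round(raw_video_gbps(3840, 2160, 30), 2))  # 5.97
```

This is why an older cable or port sometimes "works" with a 4k display but silently limits you to 30 Hz.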
  12. In my opinion, stick with Intel over AMD for now. P3D is still very much a single-threaded program (LM has made a lot of improvements, but the speed of a single CPU core is usually the limiting factor in performance). That means you should take into account single-core speeds instead of overall processor benchmarks. Passmark has an easy way to look at the overall benchmark (https://www.cpubenchmark.net/high_end_cpus.html) and a single-core benchmark (https://www.cpubenchmark.net/singleThread.html). The i7-7700K is pretty far down the list of high-end CPUs and well below several different AMD options at lower prices. But for a single core, the i7-7700K is near the top and well above anything by AMD. Passmark's benchmarks may not be rigorously scientific, but I think they illustrate why there's not really a 1-to-1 match between Intel and AMD, and why Intel is a better option for P3D. (This is also why many advocate disabling hyper-threading on Intel processors; sharing one physical core between two logical cores means a potential performance hit to single-core speed. That performance hit may or may not outweigh the benefits of having more logical cores, depending on the system - the evidence is not completely clear cut.) The i5-8600K (FunknNasty's CPU and the one recommended by Nickbe) has a single-thread score on Passmark that is only 2.4% less than the i7-7700K you are considering. It is an excellent option at a lower price. But everything in your system has an effect. P3D hits a lot of components pretty hard, and upgrading one component (e.g. CPU) can reveal that another component (e.g. RAM) was running at near 100% capacity. So instead of getting a 10% boost from an upgrade, you only get 1% or 2%. That's why you want to pay attention to things like RAM and disk speed as well. Don't discount overclocking either, which is going to be greatly affected by the case and cooling system you put with the CPU. 
Even if you don't want to mess with voltages or timings, you can squeeze some GHz out of a CPU by using the motherboard's "automagic" overclocking routine, but the key is keeping things cool. My Asus board automatically bumped my i7-4790K from 4.0 GHz to 4.4 GHz, and I was able to get it to 4.6 GHz without trying hard or doing anything exotic like de-lidding or water cooling, but I bought a very nice air cooler to do it. That made a bigger difference in performance than the upgrade in the first place.
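The 2.4% figure above is just a percent difference between the two chips' single-thread scores. The score values in this sketch are illustrative placeholders, not current Passmark data - look up the real numbers at cpubenchmark.net before deciding:

```python
def pct_slower(score: float, baseline: float) -> float:
    """How much slower `score` is than `baseline`, as a percentage."""
    return (baseline - score) / baseline * 100.0

# Illustrative single-thread scores (placeholders, not real Passmark data):
i7_7700k_st = 2580.0
i5_8600k_st = 2518.0
print(round(pct_slower(i5_8600k_st, i7_7700k_st), 1))  # 2.4
```

A gap that small in single-thread speed is unlikely to be noticeable in P3D, which is why the cheaper chip is such a reasonable pick.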
  13. This is a bit technical but may be appropriate to your situation. The users at my company (where I am the IT manager) had all sorts of problems staying logged into a number of websites (very similar to what you are describing). The culprit turned out to be our new security hardware, which was inappropriately assigning different IP addresses to ongoing connections. It made it impossible for websites to recognize incoming traffic as being from the same computer. Each time a user went to a new page, the request appeared to be coming from a different computer, so users were constantly being asked to re-enter their credentials and shopping carts would mysteriously empty. (Not that employees were supposed to be shopping at the office.) Not all websites were affected; it depended on how the websites maintained the session information. Cellular phones have features like MAC address randomization that might cause this symptom; your gateway/router/firewall software could also be providing similar functionality as a privacy feature. Some VPN services could also do something similar as part of their anonymization process. I don't have anything specific to point you at, but you might take a look at how you are connecting to the internet as something to test.