Everything posted by seasley

  1. I use Cereproc voices. They are expensive at the normal price, so do look for sales. There are also rumors that if you contact them and explain what you are using them for, they may give you a special discount.
I used the free Windows voices thread mentioned above, but I wasn't thrilled with the quality, and there seemed to be some inconsistency in which voices would actually work depending on your Windows version. What was cool is that you could add additional language packs to Windows and get access to foreign-language voices. They would speak the English words, but with an accent, which was great. In the end, though, I went with Cereproc.
I also tried Amazon Polly. When it worked, it was great. The voice quality was excellent and there's a nice variety of voices. However, like many people using Amazon Polly, it was very hit or miss whether it would work at all. The problem seems to be that, as a cloud-based solution, its response time varies depending on server load, internet connection, etc. If Pilot2ATC isn't able to get a response from the Polly engine within a set amount of time, it just stops responding. Most people seem to have trouble getting it to initialize, but once it's working, it works for the entire flight. For me, I could only get it to initialize about 30% of the time, and then only 50% of the time would it still be working when I landed.
  2. My understanding is that flight levels are set using standard pressure (29.92 / 1013) and altitudes are set using local pressure. ATC isn't supposed to assign you a flight level that conflicts with (comes within 1,000 feet of) the transition altitude. So in your US example, if ATC assigns you 18,000 ft, they mean 18,000 ft at local pressure. If they assign you Flight Level 180, you'd set standard pressure and 18,000 feet. If you're currently at 12,000 feet and climbing, and ATC clears you to 21,000 feet, you go ahead and change to standard pressure when you set 21,000 feet, even though you're below the transition altitude. You don't have to wait until you cross 18,000 ft.
In your UK example, I would interpret 5,000 feet as 5,000 ft at local pressure. When ATC clears you to climb above 5,000 ft (say to Flight Level 100), you'd set standard pressure and dial in 10,000 ft. In the real world, I don't think ATC procedures would assign one aircraft the same altitude as another's flight level. However, ATC in MSFS can do some pretty strange things.
Looking at some of the FAA docs - ENR 1.7 Barometric Altimeter Errors and Setting Procedures (faa.gov), which applies to the US only - it seems that the lowest allowed flight level (i.e. above the transition altitude) must be at least 1,000 feet above the altitude reading using local air pressure. There's a table (TBL ENR 1.7-1) that defines the lowest usable flight level based on the local altimeter setting. So at your example of 30.34 (which is high pressure), the lowest flight level is 180. But if there is a low pressure system at the airport and the pressure is 28.91, the lowest usable flight level would be FL 200 - ATC couldn't assign you Flight Level 180. A rough sketch of the arithmetic behind that table is below.
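This is purely illustrative, not a reproduction of the FAA table: it uses the common rule of thumb of roughly 1,000 ft of altitude per inch of mercury, and the function names are just made up for the example.

```python
# Rough sketch, not the official FAA table: estimate where FL180 sits relative
# to local-pressure altitudes, using the ~1,000 ft per inch-of-mercury rule of thumb.
STD_PRESSURE_INHG = 29.92

def fl180_local_altitude(local_altimeter_inhg: float) -> float:
    """Approximate local-pressure altitude of FL180 (pressure altitude 18,000 ft)."""
    return 18000 + (local_altimeter_inhg - STD_PRESSURE_INHG) * 1000

def fl180_usable(local_altimeter_inhg: float) -> bool:
    """FL180 needs to stay at least 1,000 ft above traffic at 17,000 ft local."""
    return fl180_local_altitude(local_altimeter_inhg) - 17000 >= 1000

print(fl180_local_altitude(30.34), fl180_usable(30.34))  # ~18,420 ft -> FL180 usable
print(fl180_local_altitude(28.91), fl180_usable(28.91))  # ~16,990 ft -> conflicts with 17,000 ft, so ATC must assign a higher FL
```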
  3. I haven't done anything about overclocking my 12900K. I had my 9700K overclocked before I switched exclusively to MSFS, but like others here, I found that MSFS just wasn't stable with an overclock that everything else was fine with. I reverted to standard clocks on it, and I run my 12900K at standard clocks too. Maybe I'll play around when I have some time, but I've been enjoying flying too much since I upgraded.
  4. My philosophy is that the most important consideration for flight sims is the maximum speed of a single thread. P3D is definitely constrained by the speed of the main thread, as is MSFS despite all the advances Asobo made. (It's much improved, but just throwing more cores at it doesn't equal more FPS.) I don't have a lot of experience with X-Plane, but when I played around with XP11 a few years ago it seemed like the same story. While Ryzen processors can put out top numbers on overall benchmarks, their performance in gaming benchmarks is not always the best. Looking at just a single thread, the 12900KS benchmarks 25% faster than the 5950X (per PassMark.com). But Intel chips run hotter and cost more, so consider that as well.
I just upgraded my system from a 9700K to a 12900K and couldn't be happier. It's still paired with a 2080 Ti, which seems to be a good balance. Previously I was constantly CPU bound. Now I'm usually bouncing between being CPU and GPU bound when running at 4K. With a 3090, you'll most likely still be waiting on the CPU, so I'd definitely try to max it out. I know some people complain about thermals with the Alder Lake CPUs, but I haven't had any problems with mine. I installed a good 280mm AIO cooler, and my CPU definitely runs cooler and makes less noise than my GPU.
  5. I'm not criticizing the build quality of your adapter, but it is physically limited by its size and placement. Obviously the antenna can't be very big, which means it's very susceptible to interference. Plus, my guess is you have it plugged into the back of the computer or maybe into a hub near the monitor. In either case, it's located in close proximity to metal boxes filled with electrical equipment that can cause interference or physically block the signal. Your laptop, tablet, and phone were all designed around receiving radio signals via an antenna. Sticking a USB dongle in the back of a computer is a completely different thing.
If you have a USB extension cable, you could try using it to move the wi-fi adapter away from the case and further out into the open. That may improve reception. If you're looking for a recommendation, I recently moved to a new house and used an Asus USB-AC56 until I could get some cables run to my new office. It has a 6" external antenna as well as an optional extension cable and stand to further separate it from the computer. It mostly worked to get good speeds through a couple of walls and a floor, but it was still not 100% reliable.
I strongly encourage you to rethink your "no cables" stance. To really get those speeds, you're going to need a wi-fi adapter with an external antenna separated from the computer by a cable. You might as well switch to an ethernet cable and save yourself the trouble.
  6. I ran into something similar a year or two ago trying to buy scenery for Tehran. I don't remember the exact wording, but the message was similar. I could not complete the purchase using PayPal, and the first credit card I entered directly also failed. I found out from the credit card company that the fact that the transaction description referenced Iran caused it to be flagged. (A second credit card went through just fine, however.) Without starting a geopolitical discussion, it is possible that PayPal is trying to comply with various regulations about payments to foreign governments and blocking transactions because of key words. In my case the transaction had nothing to do with Iran (other than being a depiction of a location there), but it was enough to make an automated filter upset.
  7. I discovered that some TVs have a noticeable space between the pixels. It's not enough to see at normal TV viewing distance, but it's very disconcerting on a desk. Ironically, the higher-end model I tried had it, but the next step down did not. Take a look at the rtings.com site, because they specifically test and rate TVs for monitor use. I went with a 43 inch 4K TV, which I like better than the ultra-wide it replaced.
  8. This tripped me up at first and may be your issue. A lot of add-ons that go into the Community folder will come with an extra folder. For example, I downloaded the Working Title G1000 mod as a zip file and extracted it. That created a folder called "workingtitle-g1000-v0.3.4". Inside that folder was another folder called "workingtitle-g1000". If I dropped "workingtitle-g1000-v0.3.4" (the top-level folder) into the Community folder, the mod did not work. I had to put just the "workingtitle-g1000" folder into the Community folder instead.
MSFS scans the Community folder looking for the layout.json and manifest.json files, but it only scans one level deep. You can't nest those files in a folder inside another folder. So the path to the Working Title G1000 mod's json files needs to be ...LocalCache\Packages\Community\workingtitle-g1000 and not ...LocalCache\Packages\Community\workingtitle-g1000-v0.3.4\workingtitle-g1000 (the version number may be different). See if that's the issue - a quick way to check your whole Community folder is sketched below. (This is true for any mod in the Community folder.)
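Purely as an illustration of the "one level deep" rule, a small script can flag folders that don't have manifest.json / layout.json directly inside them. The Community path below is just an example for an MS Store install; substitute the path for your own setup.

```python
# Illustrative check only: flag Community sub-folders that don't have
# manifest.json / layout.json directly inside them (MSFS scans one level deep).
from pathlib import Path

# Example MS Store path - substitute the Community folder for your own install.
COMMUNITY = Path.home() / r"AppData\Local\Packages\Microsoft.FlightSimulator_8wekyb3d8bbwe\LocalCache\Packages\Community"

for addon in sorted(COMMUNITY.iterdir()):
    if not addon.is_dir():
        continue
    if (addon / "manifest.json").exists() and (addon / "layout.json").exists():
        print(f"OK       {addon.name}")
    else:
        # Peek one level further down to spot the "extra wrapper folder" mistake.
        nested = [d.name for d in addon.iterdir() if d.is_dir() and (d / "manifest.json").exists()]
        hint = f" (manifest found inside: {', '.join(nested)})" if nested else ""
        print(f"PROBLEM  {addon.name}{hint}")
```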
  9. When talking internet speeds, make sure you're not comparing bits to bytes. MB (capital B) usually indicates megabytes; Mb usually indicates megabits. I've seen more than one example of the wrong units appearing in documentation and measurements. You're consistent in your notation, so I'm guessing that's not the issue. (A quick conversion example is below.)
How are you measuring the downstream speed? Are you using a diagnostic tool built into the cable modem? Are you running a speed test supplied by your provider's website? Or are you running something like speedtest.net? Unfortunately, I've seen providers detect third-party speed tests and throttle them - not always intentionally. They're just trying to keep the overall connection responsive, so they prevent any single device from maxing out the connection.
Before upgrading your connection, I'd make sure you're getting the most out of what you currently have. I would expect you to be able to get into the low 200 Mbs range with your stated connection speed. Every device and connection between your computer and the internet has a potential speed limit / capacity. Your provider may be giving you a 275 Mbs signal to your location, but something in between is slowing you down. If you're using wi-fi to connect to your router, the ~90 Mbs is probably the best you're going to be able to do. I've never seen a real-world wi-fi network perform anywhere close to the specifications. If you've got something limiting your bandwidth, a faster connection on the provider's side won't change anything.
When I first got my cable connection at home, the provider swore I had a 300 Mbs downstream connection, but the modem they gave me would only connect on one or two channels at a time. They tried to tell me it was a wiring problem inside my house. To prove them wrong, I scheduled an appointment with the tech and set up the cable modem and my laptop outside my house at the cable box. When the tech saw I couldn't even get 5 Mbs sitting beside their service panel, he finally realized they had the channels mis-configured. (Incidentally, the whole street was affected - everybody saw their connections improve when he fixed it.)
The moral of the story is to get your connection as simple as possible for testing purposes. Eliminate anything that could be causing the issue. Connect your cable modem as close to the service panel as possible - run a coax cable directly from the panel to your modem if you can. Temporarily get rid of any hardware that might be messing with things - switches, hubs, extenders, etc. Plug your computer directly into your modem instead of using wi-fi. If that works, you know the limitation is on your end.
Cable modems can be sensitive to interference from other devices. It's not supposed to happen, but it does. Disconnect everything else that connects to the cable or your router and see if the situation gets better. A bad ethernet cable can also cause your hardware to limit itself to a lower speed. (I had a bad cable drop an entire network from 1Gb to 10Mb - swapping out the cable fixed it.) Another thing that's tripped me up in the past: I had a cable modem / router that did QoS based on the connection speed it measured. After I upgraded the service and increased the bandwidth, the modem was still limited to the original speed because I hadn't reset the QoS configuration. Also check the settings for filtering, parental controls, and firewalls. Some hardware can only provide these functions at a limited rate, and turning on every feature might result in lower bandwidth.
For testing purposes, try turning everything off. If it works without them, add them back in one at a time until you discover what might be causing the issue.
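The bits-versus-bytes conversion is just a divide-by-eight; here is a tiny sketch using the speeds mentioned above as example figures:

```python
# Megabits per second (Mb/s) vs megabytes per second (MB/s): divide by 8.
def mbps_to_mbytes(megabits_per_second: float) -> float:
    return megabits_per_second / 8

for mbps in (275, 90):  # advertised 275 Mb/s plan, ~90 Mb/s measured over wi-fi
    print(f"{mbps} Mb/s is about {mbps_to_mbytes(mbps):.1f} MB/s")
# 275 Mb/s is about 34.4 MB/s; 90 Mb/s is about 11.2 MB/s
```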
  10. I use a 43" 4K TV for flight simming and a second one hooked up to my work computer for productivity tasks. By sitting close to the TV and running 100% scaling, it's like having four 1920 x 1080 monitors stacked in a grid. I can go ultra-wide if I'm working with spreadsheets or fully vertical if I'm working on code. I almost never maximize single windows for word processing or web browsing - too much head turning from side to side to read and look at user-interface elements. Constantly going from the bottom of the monitor to read the last line and then to the top to find a toolbar button does get tiring. Having a 4K display running at 100% scaling is about being able to multi-task and see multiple windows simultaneously, with each one showing an entire screenful of information. There are lots of freeware utilities to help you use the screen space effectively - they enhance the Windows UI to let you snap windows to fill the left and right, or top and bottom, or just a corner. You can also divide the desktop into just about any arrangement you want. Being able to shift windows around and come up with different layouts makes me so much more effective.
I'd concur that anything much bigger is too hard to scan from side to side and top to bottom on a desk without strain. I tried a 47" at first and returned it to get a 43". One thing to make sure of is that the LCD panel doesn't show the screen-door effect when you sit that close. Six feet away you'd never notice, but from 3 feet in, you want to make sure you can't see gaps between individual pixels. Scott
  11. A word of caution - I discovered when I was transitioning airports from the scenery.cfg to the add-on.xml method that not all of the files for an airport are in the airport's scenery and texture folders. For example, FlyTampa Tampa normally installs into a FlyTampa\Tampa folder in the Prepar3D folder. However, it also installs a few files in the Effects folder and the texture folder under it. If you move the FlyTampa\Tampa folder outside the Prepar3D folder and create an add-on.xml file for it, the airport will still work. However, if you then delete your Prepar3D folder as part of an upgrade or other event, the effects files will get lost. I don't know if the missing files will cause the sim to crash, but there probably would be some issue, even if a minor one.
Also, just because a developer uses an add-on.xml file to point to the scenery and texture folders for the airport, it doesn't mean that they don't also install files into other folders in the Prepar3D folder that could get lost. Drzewiecki Design sceneries seem to do this. (They use an add-on.xml file to reference the scenery and texture folders, which are installed into the Prepar3D folder by default, and they also add additional files elsewhere in the Prepar3D folder.) I think Lockheed Martin's goal was to allow 3rd party airports to install into a single folder that contains every file necessary for the airport, but only a few developers truly do this. FSDG, for example, is one that seems to do it right. When I installed their newer sceneries, they created an Effects folder outside the sim folder which is also referenced in their add-on.xml file. That way their effects files won't get lost if the Prepar3D folder is removed. If all developers used the add-on.xml file to the fullest like this, we truly could just wipe the entire P3D folder when it was time to upgrade. (A quick way to see what a given add-on.xml actually points at is sketched below.)
FYI - I've been using SoftPerfect's File Access Monitor to track what files get installed by 3rd party developers. It's a little cumbersome to use, but it's free. It gave me enough information to realize that trying to maintain a P3D folder with nothing but core files in it was going to be too hard, so I stopped worrying about it. It will tank your performance, so don't leave the service running when you run the sim.
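If you want to see which folders a particular add-on.xml actually references, a few lines of Python can dump its components. The element names below follow the typical Prepar3D add-on.xml layout, and the file path is just an example, so treat this as a sketch rather than a finished tool.

```python
# Sketch: dump the component paths an add-on.xml references, so you can see
# whether a scenery keeps everything under one folder. Element names follow
# the typical Prepar3D add-on.xml layout; the file path is just an example.
import xml.etree.ElementTree as ET

ADDON_XML = r"D:\P3D Addons\FlyTampa Tampa\add-on.xml"  # example location

root = ET.parse(ADDON_XML).getroot()
for component in root:
    if component.tag != "AddOn.Component":
        continue
    category = component.findtext("Category", default="?")
    path = component.findtext("Path", default="?")
    print(f"{category:<10} -> {path}")
```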
  12. The setting you found is the one LM is referring to (I think). It is more effective on laptops with dedicated graphics cards. My Dell XPS has both an Intel GPU built into the processor and a discrete NVIDIA adapter. On my laptop the setting lets me choose the Intel GPU as the power-saving GPU and the NVIDIA as the high-performance GPU. Setting it to Default lets Windows "pick" the best adapter for the program and circumstance. For example, if I run a 3D application with the battery fully charged, Windows will use the NVIDIA GPU; if the battery is low, it will run the application on the Intel GPU.
But on both my desktops I see the same dialog box with the NVIDIA GPU listed under both options (one desktop has an F processor without a built-in GPU; on the other the built-in GPU is disabled in the BIOS). It is possible that NVIDIA is exposing two instances of the same GPU to the operating system to allow you to select different settings, but I haven't found anything online or in the docs to suggest that it does. My guess is that Windows always displays the same three options regardless of the number of GPUs in the system, and selecting one versus the other makes no difference when there's only one GPU. It's obvious when I try to run an application on my laptop using the "wrong" GPU; I can't imagine trying to run P3D on a built-in Intel GPU with settings intended for a high-end NVIDIA card. I'm sure LM wants to cover even the obvious things - you've got to start somewhere when diagnosing problems.
pdge is correct that NI and NCP are not the same program, but both accomplish the same thing - altering settings in the NVIDIA GPU driver. Inspector gives you deeper access to internal variables, while Control Panel is much more user-friendly. In my experience and reading, NI was essential in the FSX days and still very important during the early versions of P3D, because the simulators weren't designed to take advantage of all the features in the newest GPUs. NI allowed you to turn on these features to enhance the performance and appearance of the sim. The later versions of P3D are already aware of the latest GPU features, so the effectiveness of manually setting the GPU configuration is greatly reduced - and it can hurt performance if you're not careful, by putting the GPU in one configuration while the simulator is expecting something different.
  13. 1) In my experience, wireless range extenders do increase range, but they also introduce instability. I can't say for sure that they hurt speed, but I diagnosed a network that was using them and devices were constantly dropping connections. We wired in two additional access points, removed the range extenders, and the dropped connections went away. (I believe it was all NetGear equipment, but I don't remember specifics.) 2) You might not be able to do much better than that without a wired connection regardless of what you buy. During the first part of last year my company built out new office space in an empty warehouse. The fiber connection was installed quickly - we'd just finished the interior demo. So in the middle of an empty warehouse with the power shut off in an industrial area with my laptop sitting only 15 feet from the access point, I was only able to get about 570 Mbs on commercial equipment that was rated for over 1 Gb. It was about as close as you can get to a lab-quality situation and I still couldn't do any better than 50% of the rated throughput. Without details of your living space and device locations, it's hard to know what the factors are, but before you buy a better router / access point with more powerful radios / antennas, try this. Get a really long patch cable and use it to relocate your access point closer to the devices you want to have the best connection - even if it means letting the cable lie on the floor and down the stairs temporarily. If that doesn't improve things, buying better wi-fi equipment probably won't help much.
  14. I had an LG 34UM65 ultra-wide for a while and switched to a 4k TV because I liked it so much better. The problem I had with the ultra-wide was that while it did give me some nice horizontal peripheral vision, the aspect ratio meant I had a harder time seeing the instruments. Plus the fish-eye effect made it feel like I was sitting in the jump seat rather than the pilot's seat.
  15. I remember reading here that add-on aircraft will sometimes use libraries/routines/textures etc. from the default aircraft. Deleting them might cause unexpected failures in other products. I might be wrong overall, and it might not apply to your add-ons, but be cautious. Scott
  16. Completely non-P3D point here, but I've worked with some setups that mix a 4k monitor with a non-4k monitor. Windows 10 does a much better job of scaling programs between different DPIs than it used to, but it doesn't always handle changing dpi on the fly. If you drag a window from a high-dpi (i.e. 4k monitor) to a regular-dpi (your existing 24"), you may have programs that do not scale interface elements immediately, rendering them unusable. (They work fine when started in a high-dpi environment, but don't handle changing without restarting.) I love working on my 4k monitor as a single large display. I treat it like having 4 normal monitors in a seamless grid. I sit close to it and leave the scaling at 100%. I can spread a program horizontally across the width of the screen (e.g. Excel), or vertically across the entire height (for code listings). There are free utilities out there that let you define complex custom snap points that make it really easy to make use of all that space. My guess is you'll quickly forget the additional monitors.
  17. I had an ultra-wide computer monitor (43", 21:9 ratio) and switched to a 47" 4K TV with a standard 16:9 ratio. I couldn't be happier with what it does for P3D, not to mention other things. I even liked it so much for productivity tasks that I convinced my company to buy one for my office. Check out https://www.rtings.com if you don't know about it. Many of their TV reviews specifically address using the TV as a computer / gaming monitor. You tend to sit a lot closer to a TV being used as a computer monitor, so it's helpful to have a review from that perspective. They saved me because the model I was going to buy had some visible cross-hatching from up close. It was not noticeable from room-sized viewing distances, but at desktop distance it was like looking through a screen door.
  18. I upgraded my i7-4790K to the i7-8086K. With a good air cooler, power supply, and case, I'm able to run it at a constant 5.2 GHz. I wasn't necessarily hunting frames, but I was able to turn the settings in P3D up even further and maintain what I wanted, especially in dense scenery and heavy clouds. I still can't get into LAX without seeing a drop, but I fly with ORBX SoCal, UT Live, the works. Almost anywhere else I fly I can get the performance I want without compromising.
As I understand it, the 8086K is the same processor as the 8700K, but Intel picks out the best quality ones and brands them as the 8086K. Using only the best chips allows them to bump the boost frequency to 5 GHz out of the box. The way I see it, an 8086K is much more likely to hit the higher speeds than an 8700K. Is it worth the price premium? It was for me. I've also seen that the newest Intel chips (the 9XXX series) might not be the best for overclocking either (example: https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/22). The new chips are faster out of the box, but Intel seems to be shipping them much closer to their maximum clock than they used to, so it takes a lot more work and money to squeeze out a couple hundred MHz than it used to. There may be benefits to having the additional cores, but I think P3D still has a ways to go before more cores automatically beats more clock.
As for the 10th generation, most of the speculation is about Intel moving to a smaller manufacturing process (10nm or even 7nm). Historically, Intel moves to a smaller process every other generation (which generally improves efficiency) and then stays there while improving performance for the next generation (referred to as the tick-tock cycle in the past). The next smaller manufacturing process is already a little late (this is the second performance-increase generation), although some of the new mobile chips are using a new process. As for the Z390 motherboard vs the Z370, most of the specs I've seen refer to improvements in USB, Wi-Fi, PCI lanes, etc., and not much that really centers on the processor. At one time I thought I read that the new chipset supported faster memory, but I can't find anything now that says it.
DDR4 vs DDR3 vs DDR2 is basically a generational notation, much like the 8700K is the 8th generation processor and the 9700K is the 9th generation processor. Newer generations support faster speeds. RAM speed is generally quoted as bandwidth instead of frequency because a number of factors make the raw clock frequency misleading. The general rule is to buy the highest bandwidth that the processor and motherboard support. Then you get into the timings thing, which is above my pay grade. For example, I found a 16 GB stick of RAM advertised as DDR4-4000 (PC4-32000) with timings of 18-19-19-39. I'm pretty sure that it's running at a 2000 MHz clock frequency, but because it transmits on both edges of the clock cycle, it transfers data at a 4000 MT/s rate (DDR stands for Double Data Rate - two transfers for every clock cycle). Because each transfer moves 64 bits (8 bytes) at a time, it has a total bandwidth of about 32 GB/sec. (I might not have it exactly right, but you get the idea - there's a worked version of the arithmetic below.) The timings (18-19-19-39) refer to how many clock cycles / how much time the RAM takes to perform certain actions, such as responding to an instruction or accessing a specific bit. It gets super technical, but the rule is always: lower is better.
I believe it is possible for RAM with a higher frequency but slower timings to end up having poorer performance than RAM with a lower frequency but faster timings. I think that's only true in extreme cases, but it means you can't ignore the timings entirely.
As far as an SSD goes, I can't really speak to M.2 versus standard SATA. I have a desktop with an M.2 SSD, but my P3D machine has SATA SSDs. I haven't benchmarked the drives, but the computers are nothing alike in terms of other components, so I don't know that it would be meaningful anyway. I honestly can't say I notice any difference between them, but that's doing normal day-to-day tasks. I also don't think that P3D is particularly sensitive to disk speed above a certain point once you're flying. That may be different for photoreal, which I don't use. I've never felt that the SATA SSD I have was holding me back. Moving from a traditional rotating disk to an SSD is such a massive game changer, but I don't think an M.2 makes quite the same impact. I guess what I'm trying to say is that, in my opinion, it would be better to make sure you have everything on an SSD, even a cheaper and slower one (non-M.2). I also understand that M.2 format drives can generate a lot of heat under load. My ASUS motherboard came with a special heat sink just for the M.2 slot.
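Here is that RAM bandwidth arithmetic as a worked example, using an assumed DDR4-4000 / PC4-32000 stick purely for illustration:

```python
# Worked version of the DDR arithmetic for an assumed DDR4-4000 (PC4-32000) stick.
io_clock_mhz = 2000       # DDR4-4000 I/O clock of roughly 2000 MHz
transfers_per_clock = 2   # "Double Data Rate": a transfer on each clock edge
bus_width_bytes = 8       # 64-bit memory bus = 8 bytes per transfer

transfer_rate = io_clock_mhz * transfers_per_clock   # 4000 MT/s
bandwidth_mb_s = transfer_rate * bus_width_bytes     # 32,000 MB/s
print(f"{transfer_rate} MT/s -> {bandwidth_mb_s / 1000:.0f} GB/s")  # 4000 MT/s -> 32 GB/s
```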
  19. As noted, 4K refers to the resolution of the display device (monitor, projector, TV, etc.). Getting a 4K signal to the display device can also be an issue. Not all computers can do it, especially laptops and systems using GPUs integrated into the motherboard or CPU. You'll also need a newer HDMI or DisplayPort connection between the computer and monitor - older HDMI cables, DVI, and VGA can't do it. Assuming all of the hardware is in place, you might still have to tell your computer to output a 4K signal. Windows will try to output at the optimal resolution (i.e. the one that matches your display), but you may have to tell other programs (like FSX, P3D, etc.) to render at the higher resolution. Not having the proper drivers installed could also limit you to a lower (non-4K) resolution.
  20. In my opinion, stick with Intel over AMD for now. P3D is still very much a single-threaded program (LM has made a lot of improvements, but the speed of a single CPU core is usually the limiting factor in performance). That means you should look at single-core speeds instead of overall processor benchmarks. Passmark has an easy way to look at the overall benchmark (https://www.cpubenchmark.net/high_end_cpus.html) and a single-core benchmark (https://www.cpubenchmark.net/singleThread.html). The i7-7700K is pretty far down the list of high-end CPUs and well below several different AMD options at lower prices. But for a single core, the i7-7700K is near the top and well above anything by AMD. Passmark's benchmarks may not be rigorously scientific, but I think they illustrate why there's not really a 1-to-1 match between Intel and AMD, and why Intel is a better option for P3D. (This is also why many advocate disabling hyper-threading on Intel processors; sharing one physical core between two logical cores means a potential performance hit to single-core speed. That hit may or may not outweigh the benefits of having more logical cores, depending on the system - the evidence is not completely clear cut.) The i5-8600K (FunknNasty's CPU and the one recommended by Nickbe) has a single-thread score on Passmark that is only 2.4% less than the i7-7700K you are considering. It is an excellent option at a lower price.
But everything in your system has an effect. P3D hits a lot of components pretty hard, and upgrading one component (e.g. the CPU) can reveal that another component (e.g. the RAM) was running at near 100% capacity. So instead of getting a 10% boost from an upgrade, you only get 1% or 2%. That's why you want to pay attention to things like RAM and disk speed as well. Don't discount overclocking either, which is going to be greatly affected by the case and cooling system you put with the CPU. Even if you don't want to mess with voltages or timings, you can squeeze some GHz out of a CPU by using the motherboard's "automagic" overclocking routine, but the key is keeping things cool. My Asus board automatically bumped my i7-4790K from 4.0 GHz to 4.4 GHz, and I was able to get it to 4.6 GHz without trying hard or doing anything exotic like de-lidding or water cooling, but I bought a very nice air cooler to do it. That made a bigger difference in performance than the upgrade in the first place.
  21. This is a bit technical but may be relevant to your situation. The users at my company (where I am the IT manager) had all sorts of problems staying logged into a number of websites (very similar to what you are describing). The culprit turned out to be our new security hardware, which was inappropriately assigning different IP addresses to ongoing connections. That made it impossible for websites to recognize incoming traffic as being from the same computer. Each time a user went to a new page, the request appeared to be coming from a different computer, so users were constantly being asked to re-enter their credentials and shopping carts would mysteriously empty. (Not that employees were supposed to be shopping at the office.) Not all websites were affected; it depended on how each website maintained its session information. Cellular phones have features like MAC address randomization that might cause this symptom; your gateway/router/firewall software could also be providing similar functionality as a privacy feature. Some VPN services could also do something similar as part of their anonymization process. I don't have anything specific to point you at, but you might take a look at how you are connecting to the internet as something to test.