Everything posted by vortex681

  1. I've been simming since the very early days and have yet to have a problem with any sim which I could definitely connect with a Windows update. Some issues have coincidentally appeared to happen at the same time as updates, but I subsequently traced them to non-Windows software which was updated around the same time. I keep Windows up to date.
  2. I agree. More importantly, though, because 2k doesn’t divide cleanly into 4k like full HD does (from a resolution perspective), a 2k image would look even more fuzzy on a 4k screen. You’re not using a uniform square of pixels (2 x 2 or 4 x 4, for example) on screen to represent a single pixel in the image.
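The scaling argument above can be sketched numerically. This is an illustrative calculation only, assuming "2k" means 2560x1440 (QHD, as discussed elsewhere in the thread) and "4k" means 3840x2160: full HD scales to 4k by an exact factor of 2 (each image pixel maps to a clean 2 x 2 block of screen pixels), while QHD scales by 1.5, which forces interpolation and softens the image.

```python
# Illustrative: scale factors when displaying an image on a 3840x2160 panel.
# Integer factors map each image pixel to a uniform block of screen pixels;
# non-integer factors force interpolation, which looks soft/fuzzy.

def scale_factor(image_width: int, screen_width: int) -> float:
    """Horizontal scale factor from image resolution to screen resolution."""
    return screen_width / image_width

def is_integer_scale(image_width: int, screen_width: int) -> bool:
    """True if every image pixel maps to a uniform square of screen pixels."""
    return screen_width % image_width == 0

SCREEN_4K = 3840

for name, width in [("1080p (full HD)", 1920), ("1440p/QHD ('2k')", 2560)]:
    f = scale_factor(width, SCREEN_4K)
    print(f"{name}: x{f} scale, integer={is_integer_scale(width, SCREEN_4K)}")
```

Running this shows full HD at an exact x2.0 (integer) scale and QHD at x1.5 (non-integer), which is why the latter looks fuzzier on a 4k screen.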
  3. But quite possibly because there’s no competition at the moment. If AMD delivers with their next GPUs, the price could well fall.
  4. But that's the only easily accessible data with a massive player base. Do you know of a better data source? What data are you using to come to the conclusion that 4k is the norm? Apart from registering the resolution of your desktop, how else does Steam determine the resolution?
  5. If I recall correctly, there was a whole video that Asobo produced where they talked about this - it wasn't just a tweet. Whilst I have a lot of respect for Robert, personally I'd rather trust the devs who actually designed and programmed the sim. Why would they lie about this? Just in case I'm out of the loop and have missed it, is there a definitive statement from Asobo (with a link) where they state that the multi-point flight model isn't being used?
  6. Leaving browsers open in the background has always been a potentially risky business when gaming. It's nothing specific to MSFS or Windows. When the browser refreshes it can steal a lot of system resources.
  7. Certainly at that screen size. I couldn't fit a 40"+ monitor/TV into my workspace so 4k would be overkill, not to mention an unnecessary resource drain, for me.
  8. What resolution are you talking about? I'm currently very happy with my QHD monitor and can't see me changing it any time soon so suspect that 10GB will be enough for me. You also said "but as 4K is now the norm and 8K is becoming much more affordable and popular" - the norm for whom? Not gamers generally, as over 65% of them (if Steam is to be believed) still use 1920x1080. Less than 3% in the latest Steam Hardware survey use 4k. I think that NVIDIA is pitching the 3080 at the upper end of their main user base rather than the small, niche areas where 4k is more prevalent.
  9. I disagree, but there's no way to know for certain. They're certainly enthusiast CPUs, but that doesn't automatically mean that most people who buy them will want to push them to their limits. I would guess (and that's obviously all it is) that mild overclocks are much more common than massive, custom loop-cooled ones. People buy high-end CPUs because, even at stock speeds, they will outperform most other CPUs. My experience with overclocking has been that whilst you obviously gain performance, in my opinion it's rarely significant enough to risk the stability or longevity of the CPU or GPU. As I don't have unlimited funds, if I make a big investment in components I want them to last as long as possible. If I get smooth performance at stock speeds, that's generally where I've tended to end up after experimenting.
  10. In their testing, the maximum power usage that Gamers Nexus had was 537W with the 10900k and a 2080ti. The RTX 3090 will supposedly have a TGP of 350W, which is only 30W more than the 2080ti - that would still keep the total system power below 600W. Based on that, a 1000W PSU still looks like overkill for most people.
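The back-of-envelope sum in the post above, using the figures quoted there (the 537W measured peak and the reported 30W TGP difference between the 2080ti and the 3090):

```python
# Back-of-envelope estimate using the figures quoted in the post.
measured_peak_w = 537   # W, Gamers Nexus system peak (10900K + 2080 Ti)
tgp_delta_w = 30        # W, reported 3090 TGP (350 W) minus 2080 Ti TGP

estimated_peak_w = measured_peak_w + tgp_delta_w
print(f"Estimated 3090 system peak: {estimated_peak_w} W")  # 567 W, under 600 W
```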
  11. I agree - 1000W or more is complete overkill for the average gaming rig.
  12. Sorry, didn't watch all of the video but the other things to bear in mind are that most power supplies are most efficient at about 50% capacity. Also, my 850W PSU remains passive (hence very quiet) at all times as the temp never gets high enough to trigger the fan. Running a 400W PSU with a system that uses most of that power will inevitably lead to higher temps and a potentially shorter PSU life.
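The efficiency point above can be sketched with the published 80 Plus Gold test points (87%/90%/87% minimum efficiency at 20%/50%/100% load at 115V); the interpolation and the 420W unit are my own illustrative assumptions, not measurements of any specific PSU:

```python
# Illustrative sketch: why a PSU loaded near 50% of capacity draws less from
# the wall (and runs cooler) than one loaded near its limit. Efficiency points
# are the 80 Plus Gold (115 V) minimums; the curve between them is assumed.
import bisect

LOADS = [0.20, 0.50, 1.00]        # fraction of rated capacity
EFFICIENCY = [0.87, 0.90, 0.87]   # 80 Plus Gold minimum efficiency at each load

def efficiency_at(load: float) -> float:
    """Linear interpolation between the published test points."""
    if load <= LOADS[0]:
        return EFFICIENCY[0]
    if load >= LOADS[-1]:
        return EFFICIENCY[-1]
    i = bisect.bisect_left(LOADS, load)
    x0, x1 = LOADS[i - 1], LOADS[i]
    y0, y1 = EFFICIENCY[i - 1], EFFICIENCY[i]
    return y0 + (y1 - y0) * (load - x0) / (x1 - x0)

def wall_draw(dc_watts: float, rated_watts: float) -> float:
    """AC power drawn from the wall for a given DC load on a PSU of given rating."""
    return dc_watts / efficiency_at(dc_watts / rated_watts)

# The same 400 W DC load on an 850 W unit vs a hypothetical 420 W unit:
print(f"850 W PSU: {wall_draw(400, 850):.1f} W from the wall")  # near the sweet spot
print(f"420 W PSU: {wall_draw(400, 420):.1f} W from the wall")  # near full load
```

The difference between the two wall-draw figures is dissipated as heat inside the PSU, which is where the higher temperatures (and fan noise) come from.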
  13. Looking at the screenshot, there are some cores peaking well above 90%. Bearing in mind that there is a polling interval to the monitoring, these could easily be reaching 100%.
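The polling-interval point can be sketched with synthetic data: if a core hits 100% in short bursts between samples, the monitor's reported peak never shows it. The burst pattern and intervals below are entirely made up for illustration:

```python
# Illustrative: a monitor polling at a coarse interval can report a peak well
# below the true maximum, because short 100% bursts fall between samples.

def cpu_usage(t_ms: int) -> float:
    """Synthetic per-core usage: a 5 ms burst to 100% every 100 ms, 60% otherwise."""
    return 100.0 if t_ms % 100 < 5 else 60.0

# "True" peak, sampling every millisecond over one second:
true_peak = max(cpu_usage(t) for t in range(0, 1000))
# What a monitor polling every 250 ms (starting at t=10 ms) would report:
polled_peak = max(cpu_usage(t) for t in range(10, 1000, 250))

print(f"true peak: {true_peak}%, polled peak: {polled_peak}%")
```

Here the core genuinely reaches 100%, but every poll lands between bursts and the monitor reports a 60% peak.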
  14. I have mine top-mounted and pull ambient air through the rad from outside the case. I found it gives slightly better CPU temps with no significant rise in case temp. I have a large (200mm) inlet fan at the front of the case and a 140mm exhaust fan at the rear.
  15. Just wondering how many people complaining about the sounds have ever listened to them in real life from the cockpit of an airliner. As a passenger, you are generally much closer to the engines and don’t benefit from the additional noise protection that the pilots have.
  16. For the CPU, are you looking at average usage or individual core usage? Often the average is much lower than a single core (which then becomes the limiting factor). You can easily have 60% overall CPU usage and still be CPU-limited.
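A quick illustration of that average-vs-per-core point, with hypothetical per-core figures: one core pegged at 100% (the limiting factor) still only produces a 60% overall average across eight cores.

```python
# Illustrative: hypothetical per-core usage (%) on an 8-core CPU.
# One core is pegged at 100% - the bottleneck - yet the average looks modest.
per_core = [100, 55, 60, 50, 45, 55, 60, 55]

average = sum(per_core) / len(per_core)
limiting = max(per_core)
print(f"average: {average}%  busiest core: {limiting}%")
```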
  17. Absolutely! I spent most of my working life before retirement as a helicopter pilot and rarely, if ever, remember having vibration or turbulence so bad that it compromised my view out of the cockpit. There were many times that the instrument panel vibrated dramatically making the instruments difficult to read but this didn’t carry over to my view out of the cockpit. Having the outside view shaking dramatically is just not realistic. Your body, neck and eyes do a great job of stabilising things - unlike a cockpit camera, your eyes are not rigidly attached to the airframe.
  18. Just stop Windows automatically updating drivers. It's easy: https://pureinfotech.com/exclude-driver-updates-windows-10/
  19. Austin also confirmed a massive ATC update about 10 years ago and that's still to materialise.
  20. But then, as Andreas pointed out, XP 11 isn't stock any longer. You had to add something to it to get it close to a standard that you could use to compare to MSFS. X-Europe is obviously much better than the stock XP 11 scenery but doesn't realistically compare to the stock scenery in MSFS.
  21. Looking at just the base sim, I think that's debatable even at this early stage.
  22. But do you realise just how many different hardware variants these manufacturers produce? Take just a single model of GPU like the RTX 2070. There are so many versions with subtle changes to the firmware and clock speeds by each different manufacturer. Add to this the huge numbers of potential graphics drivers that are available to use with the cards (many people don't use the latest drivers). You're probably already into at least hundreds of possible configurations for just one GPU. Then look at all the possible choices of Intel and AMD CPUs that we have in our systems. We plug them into a vast array of different models of motherboard from many different manufacturers which all have a different BIOS, firmware and drivers which make the whole system operate differently to other similarly specced systems. You only need to look at the huge range of end-user benchmark figures for a single model of CPU. There's just no such thing as mil standard hardware. It would all be so much easier if there was.
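The combinatorics behind the point above can be sketched: even deliberately modest, entirely hypothetical counts per component multiply into millions of distinct configurations.

```python
# Illustrative: hypothetical variant counts per component. Even these modest
# numbers multiply into millions of distinct system configurations.
from math import prod

variants = {
    "RTX 2070 board-partner models": 30,
    "graphics driver versions in use": 20,
    "CPU models (Intel + AMD)": 40,
    "motherboard models": 100,
    "BIOS/firmware revisions per board": 5,
}

total = prod(variants.values())
print(f"{total:,} possible combinations")
```

With these assumed counts the product is 12,000,000 combinations, before even considering RAM, storage, or Windows versions - which is why no two "identical" systems behave quite the same.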
  23. You don't know what was changed in the release version of the sim compared to the alpha and beta versions. Features may have been turned off in earlier versions or different code may have been introduced in some parts. The fact remains that if most people don't appear to be getting stuttering, it's less likely to be the game code that's causing it.