Alpine Scenery

Everything posted by Alpine Scenery

  1. For those considering upgrading from the 5800x to the x3D, here is a DX12 benchmark. I've always maintained that the 5800x3D is about 15% to 20% faster than the 5800x in MSFS, and this benchmark shows about that. There will be variances depending on your GPU-CPU balance. I do not believe the 5800x3D is 30% to 50%+ faster, as some benchmarks will show. I'm considering grabbing the 5800x3D on Black Friday if there are any deals, and I'll do my own before-and-after benchmark.
  2. I don't think there is any benchmark we can reliably trust between these; too many variables. The Intel 13900K is what you want if absolute max speed is the goal, but I would take the 5800x3D or wait for the 7800x3D, as they use less wattage. I do not believe that in an equalized, proper benchmark the 5800x3D truly beats the 13900K in basically anything. Because people are forced to test on different PCs (different CPU architectures), the different caches cause issues in the testing, and the 5800x3D might win in certain edge-case configurations. Looking at raw performance, the Intel is too far ahead of the 5800x3D. Even the 13600K is quite a bit ahead of the 5800x3D in raw power, but it also draws a lot more wattage. Therefore, I'd suggest any of the three: a 5800x3D, a 13600KF, or a 13900KF.
  3. Yah, I trust you a lot, it's just hard, I don't really trust myself either. I probably trust my own conclusions maybe even less than I trust yours. lol...
  4. I understand your thinking, and a lot of people think that way. I was just wondering, because DLSS reduces visual quality more than 1440p does (in my estimation). The issue is that the source objects in the game are LOD'd too low to render at 4K unless you are very, very close to a very high-resolution texture, hence sightseeing rather than flying. The ground textures aren't rendered at 4K resolvability when you are in motion either; they are highly compressed, and the same goes for the trees unless you use custom-modeled trees. Same for buildings: the processing in MSFS is too heavy for 4K to help much at any speed, partly because of the LOD system. It might help in a hot-air balloon (not sure, never tested), which really only leaves anti-aliasing as the 4K benefit. Essentially, you are already upscaling even at native 4K, because the pixel and texture definition the game engine outputs doesn't match the actual 4K resolution, except for the anti-aliasing differences (reduction of edge fringing). It can be proven with screenshots by factually comparing a downscaled close-up of the same scene, but I don't want to upset anyone who loves their 4K... The downscaled close-up will pass a basic resolvability test; the original screenshot taken in motion will not. In all truth, you may be getting a bit more AA benefit than most at 77" with that particular TV, but on my TV it's not much benefit.
  5. Yah, it stinks. It's unfortunate, but this is the game they are playing (intentional low inventory, prioritizing card batches to scalpers, etc...). Nvidia was fined for lying about how many cards they sold to "crypto-miners" in their accounting, and now they are under investigation by multiple regulators for price manipulation using inventory controls, but it's unlikely anything will be done, especially if AMD has a large inventory. What they aren't telling you, is a huge number of "crypto-miners" are actually scalpers and weren't mining at all, because it's impossible to verify these small companies are actually mining. If AMD shorts everyone on inventory intentionally and keeps production low, we may see some regulatory intervention, but it's still doubtful.
  6. The issues are hard to compare; it takes a lot of testing, so just firing it up is a bit anecdotal, though I realize you are better than average at this sort of thing. Does 4K with DLSS beat 1440p without DLSS? The resolvability of what MSFS is rendering really isn't up to 4K levels anyhow; the AA might be. I don't see it, guys, I'm not convinced. I have read lots of material and used the previous version of DLSS. Some of the most renowned gamers, even just trying it in fairly slow single-player games, are seeing way too many side effects for it to be useful, and the side effects you see aren't necessarily the same from one PC to another, so it depends on your display as well. Too many variables IMO; I'm sticking with raw performance at 1440p, which is very predictable.
  7. Drop resolution to 1440p; 4K isn't worth it, since any 4K display upscales internally anyway. I wonder what you get at 1440p?
  8. Nah, just a 3080 12GB, but I'm basing it on DLSS in general, what the issues have always been, and what others say they still are. I am comparing with no traffic (other than static FSLTL), no seasons addon, and some settings turned down, which should be roughly equivalent to the 4090. If I had a 4090, I'd want to run everything, but I also need a new CPU, so there's not much point for me yet. I also run at 1440p, not 4K, as I don't see any benefit to 4K if you are already using a 4K display but upscaling 1440p; 2560x1440 upscaled by a good 4K display already has fewer issues than DLSS. Before I would even try DLSS, I would buy a Samsung with judder reduction, try that first, and run 2560x1440. The MSFS rendering engine just isn't clean enough for there to be much difference between 4K and 1440p; I couldn't really see anything. 4K is a waste of electricity in MSFS; in some games it's not. That said, when I was shopping for a new display, I didn't see many Samsungs with good contrast ratios other than their really expensive stuff, so I just got the TCL 43", which unfortunately doesn't have judder reduction or motion smoothing. I got tired of shopping; I wanted a display with at least a 5000:1 contrast ratio for under $400, and the TCL was all I could find.
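To put rough numbers on the 4K-vs-1440p cost argument above, here is a trivial illustrative calculation (not a benchmark, just pixel counting): native 4K pushes 2.25x the pixels of 1440p, which is roughly why GPU load and power draw climb so steeply for a gain that, per the post, MSFS's LOD system mostly can't deliver anyway.

```python
# Pixel-count comparison between 1440p and native 4K (illustrative only).
w1440, h1440 = 2560, 1440
w4k, h4k = 3840, 2160

pixels_1440 = w1440 * h1440   # 3,686,400 pixels per frame
pixels_4k = w4k * h4k         # 8,294,400 pixels per frame

ratio = pixels_4k / pixels_1440
print(ratio)  # 2.25
```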
  9. Fair enough, but on my setup in the largest airports, it really gets kind of jerky overall.
  10. If the Dash 7 comes out around that day from SimWorks Studios, I will be flying the Dash 7, then the A310 most likely.
  11. That is a great site, and I do wish my TCL had judder reduction, but I only paid $280 for a 43" 60Hz TCL, which is crystal clear with a native contrast ratio around 5000:1. Other than judder reduction, look at the contrast ratios. If you fly at night, try not to buy anything less than 3000:1 if you can help it; the higher the better. QD LED is going to be the best, but it's expensive and has its own drawbacks. A regular OLED might also do the trick, but then you have potential longevity issues (maybe, not sure).
  12. Did you first try to reboot and restart the game, before reinstalling? This has happened to me a few times here and there, it's probably just a cache issue, or a server not responding issue. You just have to end task and reload the game. It happens to me maybe twice a month and is no big deal since it only happens on the load screen.
  13. https://hardforum.com/threads/nvidia-dlss-3-frame-generation-lock-reportedly-bypassed.2022529/page-3
  14. Yah, AMD just won game 6 of the world series and is apparently celebrating in Houston with the Astros. Now they just have to get their software right and make sure there are no other issues.
  15. It's not. DLSS 3 inserts generated frames between rendered ones, but it does not help response time. When there is real lag in a game, the on-screen response gets worse; things do not respond right. It might help slightly if you are flying straight and not touching the controls, but only if the FPS is still relatively smooth. When the render-start to render-end time is out of sync (which it often is with lag), DLSS does nothing to help and may make it slightly worse. It's like the difference between a wave hitting a rock and spraying in all directions versus a wave hitting a boat and flowing back. Neither is going to look smooth; both will have that inherent jerky, out-of-place change, one just has fewer pieces flying around, so to speak.
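The point above can be sketched with a toy arithmetic model (this is a simplification, not the actual DLSS 3 pipeline): frame generation inserts an interpolated frame between each pair of real frames, so displayed FPS doubles, but your inputs are only sampled on real frames, so response latency stays tied to the real frame cadence.

```python
# Toy model: frame generation doubles displayed frames but not responsiveness.
real_fps = 20
real_frame_ms = 1000 / real_fps            # 50 ms between real (input-sampling) frames

displayed_fps = real_fps * 2               # one generated frame per real pair
displayed_frame_ms = 1000 / displayed_fps  # 25 ms between displayed frames

# Response latency is still bounded by the real frame interval; in practice it's
# slightly worse, because the generated frame needs the *next* real frame first.
input_latency_ms = real_frame_ms

print(displayed_fps)       # 40
print(displayed_frame_ms)  # 25.0
print(input_latency_ms)    # 50.0
```

And if the real frame times are jittery (the "wave hitting a rock" case), the generated frames inherit that jitter: interpolating between two badly spaced frames cannot make their spacing even.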
  16. The AMD is expected to be only about 10% slower than the 4090, so 20% worst case is probably a good guess. The problem is going to be availability; I expect it to be very difficult to get one until Mar-Apr 2023. https://www.notebookcheck.net/AMD-s-RX-7900-XTX-is-10-slower-on-average-versus-RTX-4090-according-to-extrapolated-preliminary-graphs.666320.0.html AMD's dreams have come true: the Nvidia connector fiasco will cause everyone to lump every failed power supply or surge into it and post it on Reddit claiming the connector melted. Not saying the connectors aren't melting, but my point is that everything will be lumped into this same problem for the next 12 months. Given the lower pricing of the AMD, the performance being close, and the Nvidia connector fiasco (even after they send new equipment out)... given all the above, AMD will have to REALLY screw this up not to take a serious amount of market share away. And believe me, AMD is fully capable of really screwing this up... DLSS does not actually induce any smoothing at the response level, only at the output level; it causes a slight worsening at the response level. Hence, it may help slightly if you are seeing a smooth 20 fps, but the game rarely produces a smooth 20 fps. When the FPS gets that low, the frames tend to get bouncy and jittery, which DLSS cannot fix.
  17. https://forums.guru3d.com/threads/do-you-use-nvidias-driver-limiter-or-rtss-to-limit-your-framerate-any-pros-and-cons-to-either.444458/#post-6047720
  18. I think it will happen, but it's 40-70 years away; my guess is about 50 years. So no worries for anyone here. By that time it won't matter: the image recognition and scenario processing will be so complete that not only can it do everything a pilot can do, it will do it 50x more precisely, and no comm links will be necessary; it will be flawless in its operation. All emergency procedures will be perfect and executed instantaneously. It will be like an autopilot for an autopilot for an autopilot. The most honest AI advocates who do a lot of lectures are pointing out that in 30-40 years, a computer will be able to interpret images and scenarios better than a human can. Hence, the vision of a computer will be far more accurate than the vision of a human, because a computer can process every single angle and view simultaneously. The computer will have a 360-degree, per-pixel view around the plane at all times, be able to see upwards of 50 miles out with perfect accuracy, and even in clouds be able to spot things like birds a human could not.
  19. That can probably be modded by someone, but I think the actual cockpit and controls look very, very good. This is probably a buy for me; I will wait to hear what others think once they own it. That said, it looks well done overall and might end up being one of my favorites. I never flew a Dash 7 that I recall, just the infamous Dash 8, so this should be boatloads of fun. Once I get this, I'm going to fly the PMDG 737-600 to a boneyard and let it rust 🙂
  20. Just to add to what I said above about how the scalping works: when you first think about it, it doesn't seem to make sense. Why would Nvidia and AMD play a direct part when they could just raise the prices and sell them normally? Of course there are also the traditional retail scalping channels, where the scalpers buy from the retailers (but that only supplements their direct stock and keeps retail inventory low while they control the direct stock). The answer is that if AMD and Nvidia raise the MSRP too high all at once, ONLY during initial releases of the cards, to match what some of the scalpers were charging, then governments (especially European) are more likely to hit them with an antitrust violation claiming they are artificially keeping supply low to sell the first 3-6 months' worth of cards at too-high prices. Large manufacturers like AMD and Nvidia in non-competitive markets are watched closely by regulators, so if they start shifting prices all around from launch phases to interim phases, they'll get hit with antitrust. So this is the dirty trick they figured out where "everyone wins": the scalper, the MFR, and the 3PD OEMs. It's very hard to prove what they are doing, because they will never put in the contracts that they "sold knowing the scheme." They all play dumb, but they all know what is going on, so there is no proof of any of it. The only real liability is if someone at Nvidia or AMD has one too many drinks at a bar and admits directly to a regulator what is going on, but that's a low chance, because it happens at the higher executive levels (director and above), and these guys are used to being tight-lipped. They didn't get where they are by running their mouths, so the whole thing will likely never be proven; there is no paper trail or real concrete way to prove it. It's quite ingenious.
It isn't really a conspiracy; it was an accidental development in the market that they were able to keep taking advantage of. Actually, the OEMs don't like it, because they get a smaller inventory, and the brick-and-mortar retailers hate it too. They don't have any control over it, though, and they play along because they either get some or nothing. I assume that is one reason EVGA was sick of dealing with Nvidia: the low inventory availability, the low margins, and the questionable business ethics.
  21. Yah, it's a big issue. Nope, the manufacturers are fully complicit and in on the scalping. The scalpers pretend to be crypto-miners needing to buy 50-100 cards at once, and they buy them directly from Nvidia, AMD, MSI, etc. MSI was the worst offender, I believe. Think about it like this: say you were a millionaire and the stock market had a negative return. It's really not much risk to pay MSRP for video cards, buying 100 at a time, and then try to sell them. That's a potential 30% or higher return in 30-60 days or even less, and they usually pay a bit under MSRP as well. It's a lot cheaper for the MFR to skip the retailer and sell 100 cards to a "crypto-miner," aka scalper. The MFR makes more profit, and the scalper has very little risk overall. The MFR can also pre-allocate a certain amount of supply directly to the third parties, like MSI, when they both know that supply will be paid for at a higher price because it's going to be sold to a scalper. So it's a way for AMD or Nvidia to avoid direct culpability but still get away with supplying the scalpers. Technically, this falls under several anti-competitive market practices, because there are only two companies here, but our useless govt doesn't care anymore; they just let corps do whatever they want.
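The "30% in 30-60 days" claim above works out as follows (illustrative figures only; the prices and the 45-day turnaround are hypothetical, not from any real sale data):

```python
# Rough scalper-return arithmetic behind the "30% in 30-60 days" claim.
cards = 100
buy_price = 1000        # ~MSRP per card (hypothetical)
sell_price = 1300       # ~30% markup (hypothetical)

outlay = cards * buy_price
profit = cards * (sell_price - buy_price)
return_pct = profit / outlay * 100

# If one buy-sell cycle takes 45 days, the naive annualized return:
annualized_pct = return_pct * (365 / 45)

print(return_pct)                 # 30.0
print(round(annualized_pct, 1))   # 243.3
```

Which is the point being made: against a down stock market, that kind of turnover looks very attractive to anyone with the capital to buy in bulk.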
  22. Apologies in advance for the long reply, but I had trouble summarizing, so here we go... If you can find a 4080, I don't see any real reason not to buy one. However, the first question has to be: can you even find a 4080 16GB? If there isn't enough inventory to buy one, there's not much use in us debating what to buy. For the CPU, get the best one you can afford. Intel is fine, but be aware that over time it will cost a bit more, as they draw quite a few more watts than the 5800x3D, so most of the Intels are going to increase the electricity bill just a tiny bit. I don't disagree with your decision not to wait (whether for the CPU or even an AMD GPU). For argument's sake, say you consider an AMD GPU: the chances of even being able to find a 7900 XT in December are not that great. With the release being right before Christmas, it's going to be a SUPER HOT item, especially given how much cheaper it's going to be than the 4090. You will have to be very "alert" in the first few hours it is released and try to snipe one, most likely; the scalpers are going to be out in full bloom for the Christmas sales. I personally would buy a 3080 12GB or Ti, or an AMD 6950 XT or 6900 XT, but that's just because I'm more budget-oriented and I don't care about DLSS at all. The 6950 XT can be had for $750 to $800 right now, and it is basically as fast as the 3090 (the 3090 is a tiny bit better), which is actually about the same as the 4080 (excluding DLSS differences). The 4080 is said to be about 10% faster than the 3090 in SOME games, but I am not sure about MSFS. On the other hand, the slightly faster 3090 Ti beats the 4080 in raw speed. What I would do might not be what most would want to do, but my specific plan would be to wait a week and buy a 6950 XT from somewhere that allows returns. Use that until the 7900 XT comes out, then TRY to buy the 7900 XT, and if you find one, use the 30-day window to return the 6950 XT.
If you cannot buy the 7900 XT, just keep the 6950 XT. That said, that's because I am not a believer in DLSS; I think it's basically a gimmick. That is my opinion; some would disagree. You certainly cannot really go wrong once you get up to at least a 3080/4080 or 3090/4090; it's not like one is going to be massively superior to the other in raw MSFS FPS, I believe. I am not going by DLSS benchmarks but by what users here with the 4090 have said. I believe GSalden said about 10% faster and Capt Piet maybe up to 25% faster or a little more, so as you can see, it just depends on how much load you are pushing to the CPU vs. the GPU. For G-Sync, what you have is fine. I don't think G-Sync adds much, to be honest; it also varies with each specific implementation and depends on your settings. You don't need it, and there are better ways to spend the money than upgrading a perfectly good monitor.
  23. Nope, all electrics, except one semi-hollow (Gretsch G2655T w / Bigsby)...
  24. Did we all forget to tell him to try it first without any addons? Umm, that's always #1.
  25. I have about 8 guitars I need to sell, no time to play them anymore, too busy sim'n...