

2 hours ago, Sethos said:

Why buy Intel for MSFS? It's power-hungry and runs extremely hot. Get a 7800X3D and enjoy an insanely power-efficient chip that walks all over even the fastest Intel chip in MSFS benchmarks.

My 12700K is very cool and not power hungry at 4.5 GHz. I don't know much about the 13xxx series, but from the benchmarks I've seen, the AMD X3Ds do the most damage at lower resolutions. I think at 4K the AMD and Intel top tiers are pretty close, at least for MSFS. Do you have any charts where you're seeing a different outcome?


| FAA ZMP |
| PPL ASEL |
| Windows 11 | MSI Z690 Tomahawk | 12700K 4.7GHz | MSI RTX 4080 | 32GB 5600 MHz DDR5 | 500GB Samsung 860 Evo SSD | 2x 2TB Samsung 970 Evo M.2 | EVGA 850W Gold | Corsair 5000X | HP G2 (VR) / LG 27" 1440p |

 

 

24 minutes ago, ryanbatc said:

My 12700K is very cool and not power hungry at 4.5 GHz. I don't know much about the 13xxx series, but from the benchmarks I've seen, the AMD X3Ds do the most damage at lower resolutions. I think at 4K the AMD and Intel top tiers are pretty close, at least for MSFS. Do you have any charts where you're seeing a different outcome?

But you wouldn't test CPUs at 4K to gain any meaningful information, because the higher the resolution, the more work is put on the GPU, which takes the CPU out of the equation. That's why you test at lower resolutions, and why the results tie up at 4K: at that point you're really just testing the GPU. The chart posted on the previous page shows exactly what most benchmarks will show you: the X3D CPUs absolutely walk all over even the 13900K, because that 3D cache is extremely good for MSFS and other games where a lot of data is passed between the CPU and the cache.
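For anyone wondering why reviewers insist on low-resolution CPU tests, here's a minimal toy model in Python. The millisecond figures are invented purely for illustration, not measured in MSFS; the point is only that the slower of the CPU and GPU sets the frame time.

```python
# Toy model (not how reviewers actually benchmark): per frame, the slower of
# the CPU and GPU determines the frame time, so a faster CPU only shows up
# when the GPU isn't the bottleneck. All numbers are made-up illustrations.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective FPS when each frame waits for whichever unit is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"fast CPU": 6.0, "slow CPU": 9.0}   # hypothetical ms of CPU work per frame
gpus = {"1080p": 5.0, "4K": 16.0}           # hypothetical ms of GPU work per frame

for res, gpu_ms in gpus.items():
    for name, cpu_ms in cpus.items():
        print(f"{res:>5} + {name}: {fps(cpu_ms, gpu_ms):5.1f} FPS")

# At 1080p the two CPUs give very different FPS; at 4K both land on ~62 FPS
# because the 16 ms of GPU work hides the CPU difference entirely.
```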

And the 13th Gen CPUs are a different beast; the 13900K in particular, which a lot of people here turn to, is an inefficient, hot chip that needs something like a 360mm AIO to keep in check. If gaming, and especially MSFS, is your primary use, the X3D CPUs are a no-brainer.

https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/23.html

Just look at the power consumption out of the box: 77 W versus the 13900K's 276 W. And the out-of-the-box temperatures are over 22°C apart.
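As a rough back-of-the-envelope example of what that gap means (assuming two hours of simming per day, the gaming power figures above, and an illustrative €0.30/kWh rate, none of which will match any real setup exactly):

```python
# Back-of-the-envelope only: assumes the CPUs actually sit at these package
# powers while simming, which real sessions won't match exactly.
p_7800x3d_w = 77      # W, gaming power draw cited above
p_13900k_w = 276      # W, gaming power draw cited above
hours_per_day = 2     # assumed simming time
price_per_kwh = 0.30  # assumed electricity price (EUR/kWh)

delta_kwh_year = (p_13900k_w - p_7800x3d_w) / 1000 * hours_per_day * 365
print(f"Extra energy per year: {delta_kwh_year:.0f} kWh")
print(f"Extra cost per year:   {delta_kwh_year * price_per_kwh:.0f} EUR")
# ~145 kWh and ~44 EUR/year under these assumptions -- not huge, but that same
# wattage gap is also what your cooler has to dump into the room.
```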

Also, one of the best improvements with the X3Ds, and one a lot of people miss: the 1% lows are really good, meaning many of the FPS drops and stutters you perceive are reduced or eliminated outright. So it's not just about straight framerate either.
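For anyone unfamiliar with the metric, here's a minimal sketch of how 1% lows can be derived from logged frame times. Definitions vary slightly between capture tools; this uses the common "average of the slowest 1% of frames" approach, and the sample numbers are made up.

```python
# Sketch only: "1% low" definitions differ slightly between tools; this takes
# the average of the slowest 1% of frames and converts it to FPS.
def one_percent_low(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # slowest 1% (at least one frame)
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 99 smooth frames at ~16.7 ms plus one 50 ms stutter: the average FPS barely
# moves, but the 1% low exposes the hitch.
frames = [16.7] * 99 + [50.0]
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(f"average FPS: {avg_fps:.1f}, 1% low: {one_percent_low(frames):.1f}")
```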

 


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

1 hour ago, Sethos said:

And the 13th Gen CPUs are a different beast; the 13900K in particular, which a lot of people here turn to, is an inefficient, hot chip that needs something like a 360mm AIO to keep in check.

Just look at the power consumption out of the box: 77 W versus the 13900K's 276 W. And the out-of-the-box temperatures are over 22°C apart.

Right now I'm at EGKK in MSFS in the Fenix A320 at 4K. CPU temp is 57°C, GPU temp is 45°C, and total power delivery from the PSU is 325 W. As you can see in my sig, I'm running an air cooler at the moment ... 🙂

 


Cheers, Søren Dissing

CPU: Intel i9-13900K @5.6-5.8 Ghz | Cooler: ASUS ROG RYUJIN III | GPU: ASUS Strix RTX4090 OC | MoBo: ASUS ROG Maximus Z790 Hero | RAM: 64Gb DDR5 @5600 | SSDs: 1Tb Samsung M.2 980 PRO (Win11), 1Tb Samsung M.2 980 PRO (MSFS), | Case: ASUS ROG Helios 601 | Monitors: HP Reverb G2, 28" ASUS PB287Q 4K | Additional Hardware: TM TCA Captain's Edition, Tobii 5 | OS: Win 11 Pro 64 | Sim: MSFS | BA Virtual | PSXT, RealTraffic w/ AIG models

 

 

9 minutes ago, SierraDelta said:

Right now I'm at EGKK in MSFS in the Fenix A320 at 4K. CPU temp is 57°C, GPU temp is 45°C, and total power delivery from the PSU is 325 W. As you can see in my sig, I'm running an air cooler at the moment ... 🙂

 

I mean, an anecdotal slice of information doesn't really negate actual reviews. Also, just as I said, you are sitting in a GPU-bound scenario playing at 4K and your CPU is still reaching almost 60°C -- that's not impressive in the slightest, kind of terrible actually. My 7800X3D at EGKK, in the Fenix at 4K, with full FSLTL traffic doesn't even break 40°C, and that's sitting in a hot ambient room.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

2 hours ago, Sethos said:

And the 13th Gen CPUs are a different beast; the 13900K in particular, which a lot of people here turn to, is an inefficient, hot chip that needs something like a 360mm AIO to keep in check.

What a bunch of bull.... Are these YOUR actual numbers or are you just regurgitating the stuff from YOUR internet? 


Most of what is said on the Internet may be the same thing they shovel on a regular basis at the local barn.

14 minutes ago, Silicus said:

What a bunch of bull.... Are these YOUR actual numbers or are you just regurgitating the stuff from YOUR internet? 

Let me guess, you own one and don't like hearing it? Everything I say is backed up by reviews and actual numbers, not feelings. How about Techspot's review calling it a hot and hungry chip? https://www.techspot.com/review/2552-intel-core-i9-13900k/ How about Custom PC's? https://www.custompc.com/intel-core-i9-13900k-review -- "However, this new CPU is also likely to be very hot and power-hungry, and it also demands a hefty premium."

Or perhaps Steve from GamersNexus calling the 13900K "extremely inefficient".

And if you had any interest in hardware and followed various hardware outlets and channels, you'd know that the very nature of the 13900K is to keep boosting until it hits its thermal limit when put under pressure. It's not uncommon for users to report insane numbers if you aren't keeping it in check.

Or the Linus review, where they recommend a Noctua NH-D15 as the MINIMUM.

 

Yes, you may like your expensive 13900K build and be very defensive about it, but it's factually an inefficient, hot chip; Intel is just cranking the dials to stay relevant.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

23 minutes ago, Sethos said:

Yes, you may like your expensive 13900K build and be very defensive about it, but it's factually an inefficient, hot chip; Intel is just cranking the dials to stay relevant.

Just as I thought. A couple of YouTubers say it and it is fact! Wow. What is your own experience? Mine is different from those 'experts'! I actually USE it and I am happy with the performance and heat, and have no issues.... I have nothing to defend, but you seem to.....


Most of what is said on the Internet may be the same thing they shovel on a regular basis at the local barn.

Just now, Silicus said:

Just as I thought. A couple of YouTubers say it and it is fact! Wow. What is your own experience? Mine is different from those 'experts'! I actually USE it and I am happy with the performance and heat, and have no issues.... I have nothing to defend, but you seem to.....

Did you just boil well-respected reviewers, especially someone like Steve, down to 'a couple of YouTubers' because they are stating facts that clash with your feelings, trying to dismiss every review in the process? That's just embarrassing. So basically, you are arguing that your own feelings and anecdotes are how hardware should be reviewed? And I just had to look at your profile and, of course, 19 hours ago you told us about your newly built 13900K machine: https://www.avsim.com/forums/topic/634916-building-new-pc/?tab=comments#comment-4972561

Of course, this is just you getting emotional about your new, very expensive build and not having the maturity to handle it.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

11 hours ago, Sethos said:

I mean, an anecdotal slice of information doesn't really negate actual reviews. Also, just as I said, you are sitting in a GPU-bound scenario playing at 4K and your CPU is still reaching almost 60°C -- that's not impressive in the slightest, kind of terrible actually. My 7800X3D at EGKK, in the Fenix at 4K, with full FSLTL traffic doesn't even break 40°C, and that's sitting in a hot ambient room.

I feel compelled to jump in here to note that my own 7800X3D at KDTW with the Headwind A330 and FSLTL traffic at UHD (not 4K) was in the mid-50s (°C) sitting at the gate today. For precision's sake, this is with water cooling, a 240mm radiator, and an ambient room temperature of roughly 23°C.

To be clear, I'm thrilled with the performance I'm getting with this chip. It's super smooth most of the time (and when it isn't, it's due to MSFS bugginess, not the CPU), and the temperatures in MSFS never seem worrying or excessive (they mostly stay below 80°C, though they can briefly go above that during boosts, especially flying in busy European airspace with tons of AI traffic). But there's sometimes a tendency on these forums to throw around CPU temperature numbers that don't really seem plausible unless your computer is in a freezer and/or you're running some very special custom cooling.

For the record, my hot take: if you're happy with the actual performance (FPS, smoothness) in heavy scenarios, you're not melting your CPU, and you're not running up an insane electricity bill, does it really matter whether you're sitting at EGKK with your CPU running at 40°C or 60°C?

James


5 hours ago, honanhal said:

I feel compelled to jump in here to note that my own 7800X3D at KDTW with the Headwind A330 and FSLTL traffic at UHD (not 4K) was in the mid-50s (°C) sitting at the gate today. For precision's sake, this is with water cooling, a 240mm radiator, and an ambient room temperature of roughly 23°C.

To be clear, I'm thrilled with the performance I'm getting with this chip. It's super smooth most of the time (and when it isn't, it's due to MSFS bugginess, not the CPU), and the temperatures in MSFS never seem worrying or excessive (they mostly stay below 80°C, though they can briefly go above that during boosts, especially flying in busy European airspace with tons of AI traffic). But there's sometimes a tendency on these forums to throw around CPU temperature numbers that don't really seem plausible unless your computer is in a freezer and/or you're running some very special custom cooling.

For the record, my hot take: if you're happy with the actual performance (FPS, smoothness) in heavy scenarios, you're not melting your CPU, and you're not running up an insane electricity bill, does it really matter whether you're sitting at EGKK with your CPU running at 40°C or 60°C?

James

I feel compelled to ask if you actually read the thread or got any sort of context out of it? Right off the bat you're making a pointless comparison to our numbers, as I mentioned a few posts ago. If you aren't at 4K, you are putting more strain on the CPU, so you'd see higher temperatures and the numbers won't line up. Calling what I've said into question, as if I'm lying, when you aren't even doing an apples-to-apples comparison is a bit silly. I don't even know what you've done with your setup; you might have just slapped on EXPO and off you go, pumping an unnecessarily high voltage into your chip for no added benefit, or you haven't touched a single curve. So there's absolutely no need to cast doubt on what I'm saying if you don't even understand the testing parameters. You are more than welcome to pop around my house and see my 240mm AIO hold the numbers I've posted.

And the entire basis of the thread is someone thinking about upgrading, for MSFS. Of course the capabilities of a chip matter. Why on earth is this suddenly an emotional, gut-feeling matter of personal happiness, when the beauty of hardware is that it's the one thing you can factually measure and test? So why on earth would you buy a chip that is more expensive, a lot less efficient, pumps out more heat, and performs worse in the task the OP intends it for? Logically it makes little sense. And it matters in the grand context of the chip, because the numbers, reviews, and reviewer statements explain why it's a questionable purchase, not the specific numbers themselves. Sitting in a GPU-bottlenecked scenario, i.e. a scenario where your chip should barely be breaking a sweat, and still hitting relatively high numbers perfectly backs up the problems with the 13900K.

So I'd say if you actually include the intended context, it matters for the thread at hand.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

1 hour ago, Sethos said:

I feel compelled to ask if you actually read the thread or got any sort of context out of it? Right off the bat you're making a pointless comparison to our numbers, as I mentioned a few posts ago. If you aren't at 4K, you are putting more strain on the CPU, so you'd see higher temperatures and the numbers won't line up. Calling what I've said into question, as if I'm lying, when you aren't even doing an apples-to-apples comparison is a bit silly. I don't even know what you've done with your setup; you might have just slapped on EXPO and off you go, pumping an unnecessarily high voltage into your chip for no added benefit, or you haven't touched a single curve. So there's absolutely no need to cast doubt on what I'm saying if you don't even understand the testing parameters. You are more than welcome to pop around my house and see my 240mm AIO hold the numbers I've posted.

Ok, I guess I deserve that, especially for using the word "implausible" in talking about this. I apologize. What I really mean -- and I think you've illustrated this well, to be honest! -- is that people quote CPU temps without giving the full context, and then people (who may be considering which CPU to buy) run with them without that context, thinking the results are something you'd get "out of the box" in a standard use case. You're right, you did mention the context of running at 4K, so that's on me. That said, you didn't mention the other key, and frankly pretty advanced, stuff you've done, like custom RAM overclocking and custom curves. (Yes, you're quite right that I just slapped on a profile! I'm guessing most people do.😀)

Seriously though, good on you for getting that performance/cooling using a 240mm AIO through messing with voltages/curves. My hat goes off to you, genuinely, for your skill and patience in seeing that through.

1 hour ago, Sethos said:

And the entire basis of the thread is someone thinking about upgrading, for MSFS. Of course the capabilities of a chip matters. Why on earth is this suddenly an emotional gut-feeling matter of personal happiness when the beauty of hardware is that it's the one thing you can factually measure and test everything. So why on earth would you buy a chip that is more expensive, a lot less efficient, pumps out more heat and performs worse in the task that OP intends for it? Logically makes little sense. And it matters in the grand context of the chip, because the numbers, reviews and reviewer statements tell why it's a questionable purchase, not the specific numbers themselves. Sitting in a GPU-bottlenecked scenario, i.e a scenario where your chip should barely be breaking a sweat and then hitting relative high numbers, perfectly backs up the problems with the 13900K.

Let me start by saying that I went with a 7800X3D for a reason, so I obviously agree with the thrust of this.🙂 I guess my other point is that -- skillful manipulation of voltages etc. aside -- while it's true that AMD chips are much less power-hungry than their Intel brethren, the X3D chips put out a lot of extra heat (by virtue of that 3D cache) that narrows the temperature gap a good deal, which wasn't obvious at all to me, as a fairly casual hardware news follower, until I was looking for a new CPU. You see a lot written about how cool and energy-efficient the AMD chips are, which is absolutely true, but there's an asterisk there with the X3D. (It can be mitigated, obviously, as you have done.)

Anyway, I'll leave it there. I really didn't mean to suggest you (or actually anyone else, for that matter) were lying about the temps, just that there was probably more context there. Which there was. 🙂

James


5 hours ago, Sethos said:

I feel compelled to ask if you actually read the thread or got any sort of context out of it? Right off the bat you're making a pointless comparison to our numbers, as I mentioned a few posts ago. If you aren't at 4K, you are putting more strain on the CPU, so you'd see higher temperatures and the numbers won't line up. Calling what I've said into question, as if I'm lying, when you aren't even doing an apples-to-apples comparison is a bit silly. I don't even know what you've done with your setup; you might have just slapped on EXPO and off you go, pumping an unnecessarily high voltage into your chip for no added benefit, or you haven't touched a single curve. So there's absolutely no need to cast doubt on what I'm saying if you don't even understand the testing parameters. You are more than welcome to pop around my house and see my 240mm AIO hold the numbers I've posted.

And the entire basis of the thread is someone thinking about upgrading, for MSFS. Of course the capabilities of a chip matter. Why on earth is this suddenly an emotional, gut-feeling matter of personal happiness, when the beauty of hardware is that it's the one thing you can factually measure and test? So why on earth would you buy a chip that is more expensive, a lot less efficient, pumps out more heat, and performs worse in the task the OP intends it for? Logically it makes little sense. And it matters in the grand context of the chip, because the numbers, reviews, and reviewer statements explain why it's a questionable purchase, not the specific numbers themselves. Sitting in a GPU-bottlenecked scenario, i.e. a scenario where your chip should barely be breaking a sweat, and still hitting relatively high numbers perfectly backs up the problems with the 13900K.

So I'd say if you actually include the intended context, it matters for the thread at hand.

Did you get any context from the OP as well?

"I am thinking of upgrading my pc. It will intel to intel upgrade."

It seems that almost every post about getting an Intel turns into a "get AMD instead" reply. I have no skin in the game, just noticing.

fly safe


Francisco Blas
Windows11 Pro ASUS Hero Z790 | Intel i9-13900K | G.SKILL 32GB DDR5 | ASUS STRIX 4090 | WD 4TB SN850X NVMe | ASUS Ryujin II AIO 360mm | Corsair AX1600i | Lian Li 011D EVO | DELL Alienware 38 G-SYNC

On 6/13/2023 at 2:22 AM, MrFuzzy said:

Why Intel? 


 

Maybe because of choice. Some people buy Toyota and some buy Nissan. Pushing agendas by showing graphs and videos won't change their choice.

fly safe


Francisco Blas
Windows11 Pro ASUS Hero Z790 | Intel i9-13900K | G.SKILL 32GB DDR5 | ASUS STRIX 4090 | WD 4TB SN850X NVMe | ASUS Ryujin II AIO 360mm | Corsair AX1600i | Lian Li 011D EVO | DELL Alienware 38 G-SYNC


I personally would do a fresh install on a new PC. As for Windows 11, it seems to be okay. The main reason for changing from Windows 10 to 11 is security updates; MS will eventually stop supporting Windows 10, and some software no longer supports Windows 10 either.

fly safe


Francisco Blas
Windows11 Pro ASUS Hero Z790 | Intel i9-13900K | G.SKILL 32GB DDR5 | ASUS STRIX 4090 | WD 4TB SN850X NVMe | ASUS Ryujin II AIO 360mm | Corsair AX1600i | Lian Li 011D EVO | DELL Alienware 38 G-SYNC

3 hours ago, honanhal said:

Ok, I guess I deserve that, especially for using the word "implausible" in talking about this. I apologize. What I really mean -- and I think you've illustrated this well, to be honest! -- is that people quote CPU temps without giving the full context, and then people (who may be considering which CPU to buy) run with them without that context, thinking the results are something you'd get "out of the box" in a standard use case. You're right, you did mention the context of running at 4K, so that's on me. That said, you didn't mention the other key, and frankly pretty advanced, stuff you've done, like custom RAM overclocking and custom curves. (Yes, you're quite right that I just slapped on a profile! I'm guessing most people do.😀)

Seriously though, good on you for getting that performance/cooling using a 240mm AIO through messing with voltages/curves. My hat goes off to you, genuinely, for your skill and patience in seeing that through.

Let me start by saying that I went with a 7800X3D for a reason, so I obviously agree with the thrust of this.🙂 I guess my other point is that -- skillful manipulation of voltages etc. aside -- while it's true that AMD chips are much less power-hungry than their Intel brethren, the X3D chips put out a lot of extra heat (by virtue of that 3D cache) that narrows the temperature gap a good deal, which wasn't obvious at all to me, as a fairly casual hardware news follower, until I was looking for a new CPU. You see a lot written about how cool and energy-efficient the AMD chips are, which is absolutely true, but there's an asterisk there with the X3D. (It can be mitigated, obviously, as you have done.)

Anyway, I'll leave it there. I really didn't mean to suggest you (or actually anyone else, for that matter) were lying about the temps, just that there was probably more context there. Which there was. 🙂

James

Good post, and I may have come off a little strong; I apologize. I will say, in case you ever want to mess around with your CPU, you can easily set the SOC voltage manually to 1.15 - 1.20 and see lower temperatures. Very easy to do, as just applying an EXPO profile sets it way higher than it needs to be. The X3D chips don't offer much headroom in terms of tweaking and dialing in; you can get one set up in less than 5 minutes, so there's very little knowledge or skill required 😄 The X3D chips are still extremely efficient. The 7800X3D draws less than 80 W, which is incredible for its performance. The additional heat just comes from the cache on the die adding a layer that makes it harder to keep cool, but that extra heat isn't like other chips' heat, where it's pure wattage that needs dissipating.

 

6 minutes ago, mokeiko said:

Did you get any context from the OP as well?

"I am thinking of upgrading my pc. It will intel to intel upgrade."

It seems that almost every post about getting an Intel turns into a "get AMD instead" reply. I have no skin in the game, just noticing.

fly safe

Yeah, it's a post asking for upgrade help, so of course suggestions from other brands will be mentioned. It's not uncommon for people to have brand preferences or to just want to stick to what they know -- because Intel has been a staple within the sim community for decades. That doesn't mean you can't show that there's now an objectively better choice in the X3D chips. And thinking it's some agenda is hilarious, when it's objectively the best CPU for MSFS by a country mile. That's literally how help and suggestions on the internet go: you show people the benchmarks, give them information on why another choice might be better, and then they can make their decision. So you're "noticing" this because the X3D CPUs are head and shoulders above whatever Intel has in terms of MSFS.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB

