fr3dm

Is it worth upgrading from i7 2600k to i7 8700k?


5 hours ago, martin-w said:

 

Yeah, that doesn't sound right. What's his ambient temp? Which CPU cooler? Overclocking or not?

Hi Martin, thanks for the comparison with your daughter's setup. The ambient temp is around 21C when his computer is idle, but rises to 25-26C after a few hours of gaming (the computer is heating the room). His CPU cooler is a Cooler Master Hyper 212 with 2 x 120mm fans (push-pull), and the CPU turbo is overclocked to 4.5 GHz on all cores.

I wonder if his high temps could be caused by the location of his case, which is on the floor under his desk, in the corner of a small room (6 inches of clearance from the back and side walls). Maybe the intake fans are re-ingesting a portion of the exhausted warm air. Even with good airflow in the case, maybe the airflow in that corner of the room is not optimal.


 

Normand

Intel i7 9700K @ 4.9 GHz / Asus Prime Z390-A / 32GB DDR4 3200 MHz / MSI RTX 4080 / PSU 750 Watt / Microsoft Flight Simulator / Windows 10 Pro x64

On 10/24/2017 at 2:16 PM, SpiritFlyer said:

With all the hardware changes needed to keep up with P3Dv4, and with the advent of the 8700K and 10-series Nvidia graphics cards, I thought it would be a good time to record a single slice in time with the flight sim computer I now use.

Processor: 2700K @ 4.938 GHz, binned and de-lidded (as shown)
Motherboard: Asus P67 Deluxe (revised and replaced by Asus)
Video Card: GTX 1080 overclocked with MSI Afterburner (as shown)...
RAM: 16GB at 2196 (as shown)
Primary Drive: 240GB SSD
Flight Sim Drive: 480GB SSD
Monitor: 4K 55-inch curved-screen Samsung Smart TV
Flight Simulator: P3Dv4.1 (over 340GB installed)
Settings: (as shown)
Results: Entirely satisfactory

I intend to replace the CPU when appropriate, perhaps in the New Year with Intel's newest releases. I do not think the newest CPUs are advanced enough in performance for me to replace my system yet. Amazing that Sandy Bridge still remains (somewhat) competitive this many years later.

https://www.flickr.com/gp/156377156@N06/S536g0

I hope this link works. It is what I customarily run my simulator at. I use Saitek controls, Track IR, almost all ORBX Scenery, Active Sky, Ultimate Traffic (high settings) and complex aircraft.

Any number of newer rigs probably outperform this one, but probably not by a whole lot.

Overall it runs fast and smooth and worry free under a load of add-ons that would choke a horse!

Hopefully the next round of breakthroughs will produce the results needed for substantial change.

Kind regards,

Interesting. My CPU is running at 4.8 GHz and I have a GTX 980. Based on your findings, my best bang for the buck right now is to get the GTX 1080? I too am not convinced that upgrading my MB and CPU is worth the money and hassle yet. Next year's Intel CPU lineup is supposed to be the big step forward, so that is when I will look at the MB and CPU.


Vu Pham

i7-10700K 5.2 GHz OC, 64 GB RAM, RTX 4070 Ti, SSD for Sim, SSD for system. MSFS2020

21 hours ago, NBouc said:

Hi Martin, thanks for the comparison with your daughter's setup. The ambient temp is around 21C when his computer is idle, but rises to 25-26C after a few hours of gaming (the computer is heating the room). His CPU cooler is a Cooler Master Hyper 212 with 2 x 120mm fans (push-pull), and the CPU turbo is overclocked to 4.5 GHz on all cores.

I wonder if his high temps could be caused by the location of his case, which is on the floor under his desk, in the corner of a small room (6 inches of clearance from the back and side walls). Maybe the intake fans are re-ingesting a portion of the exhausted warm air. Even with good airflow in the case, maybe the airflow in that corner of the room is not optimal.

 

The Cooler Master Hyper 212 is a good cooler for such a low price, but it's far from being a great performer for the 7700K. For example, I use an NH-D15, which runs approximately 15 degrees cooler. A top-of-the-range 280mm rad AIO would be cooler still.

Six inches of clearance from the wall is unlikely to make a significant difference to the CPU temp.

The 4.5 GHz overclock you refer to is the max turbo frequency. Intel's Turbo boosts one core to 4.5 GHz max but the others to a lower frequency. However, Asus boards, and I believe other manufacturers' boards, have something called MCE (Multi Core Enhancement), which is on by default. MCE overrides Intel's Turbo and sets all cores to the max turbo frequency of 4.5 GHz, which is why you see 4.5 GHz on all cores.

MCE is great for those that don't manually overclock. However, what often happens is that individuals fail to realise that MCE requires extra voltage. The motherboard manufacturer has no idea how good your CPU is in terms of the silicon lottery, so an auto-rule voltage is applied that covers the worst-case CPU needing high volts. The result is higher temperatures. Couple that with a less-than-top-notch cooler like the Hyper 212 and higher CPU temps than you would like ensue.
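If you want to confirm what MCE is actually doing, watch the per-core clocks while the CPU is loaded. Here's a minimal Python sketch, assuming the psutil package is installed (on Windows psutil may only report a single package-wide figure, in which case a tool like HWiNFO will show per-core clocks):

# Print each core's current clock so you can see whether every core is
# sitting at the single-core max turbo (i.e. MCE is in effect) under load.
import psutil

for i, freq in enumerate(psutil.cpu_freq(percpu=True)):
    print(f"core {i}: {freq.current:.0f} MHz (reported max {freq.max:.0f} MHz)")

Run it while the game or a stress test is active; with MCE on, all cores should be pinned at or near the 4.5 GHz single-core turbo.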

Re your son's CPU temp: firstly, 80 degrees is not a dangerous temp; Tj Max is, IIRC, 100 degrees. So the CPU is short of the max temp Intel regards as safe. However, it would obviously be nice if we could drop the temp under load.

Given that your son is a gamer, and most games lean on the GPU far more than the CPU, you could enter the BIOS and switch off Multi Core Enhancement (it may be called something different on a non-Asus board). The CPU would then run one core at the max 4.5 GHz turbo frequency and the other cores would turbo to a somewhat lower frequency. But importantly, the voltage would then be back to stock and temps should be lower. I wouldn't envisage any noticeable difference in gaming performance.

The other option, if you wish to retain 4.5 GHz on all cores, is to invest in a much better cooler.

Note: if you decide to switch off MCE, set it to Disabled, not the "Auto" setting it defaults to. Auto will enable it, OC to 4.5 GHz on all cores, and increase the voltage more than most chips require.
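If you do experiment with toggling MCE, it's worth logging the peak CPU temperature under the same game load before and after, so you can see exactly what you gained. Any monitoring tool will do; purely as an illustration, here's a rough Python sketch using psutil (its temperature sensors are Linux-only, so on Windows just note the peak from HWiNFO or Core Temp instead):

# Sample the CPU temperature once a second for a minute while the game is
# running, and report the peak reading seen.
import time
import psutil

peak = 0.0
for _ in range(60):
    temps = psutil.sensors_temperatures()        # Linux-only in psutil
    for reading in temps.get("coretemp", []):    # "coretemp" is the usual Intel sensor name
        peak = max(peak, reading.current)
    time.sleep(1)

print(f"peak core temperature over the last minute: {peak:.1f} C")

Do one run with MCE on Auto and one with it Disabled, and compare the two peaks.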

 

2 hours ago, Anxu00 said:

Interesting. My CPU is running at 4.8 GHz and I have a GTX 980. Based on your findings, my best bang for the buck right now is to get the GTX 1080? I too am not convinced that upgrading my MB and CPU is worth the money and hassle yet. Next year's Intel CPU lineup is supposed to be the big step forward, so that is when I will look at the MB and CPU.

Hi Vu Pham,

The data is somewhat compelling, but the experience is even more so. This is still a great simulator. I too am going to wait to see what is coming in 2018.

I would however recommend the GTX 1080Ti, or the GTX 1080 in SLI, as my system is running my single (significantly) overclocked GTX 1080 at 100% most of the time.

On the other hand, the forthcoming GTX 2080 may be another HUGE GPU jump, according to online speculation.

I need to choose wisely :b_hantu:

Exciting times! My new machine (when it comes) will need to last me a long time, like the present one I lucked out on years ago!

Kind regards,

 


1 hour ago, SpiritFlyer said:

I would however recommend the GTX 1080Ti

Ouch, this is getting expensive :laugh: Thanks for the heads-up.


Vu Pham

i7-10700K 5.2 GHz OC, 64 GB RAM, RTX 4070 Ti, SSD for Sim, SSD for system. MSFS2020

9 hours ago, martin-w said:

 

... The 4.5 GHz overclock you refer to is the max turbo frequency. Intel's Turbo boosts one core to 4.5 GHz max but the others to a lower frequency. However, Asus boards, and I believe other manufacturers' boards, have something called MCE (Multi Core Enhancement), which is on by default. MCE overrides Intel's Turbo and sets all cores to the max turbo frequency of 4.5 GHz, which is why you see 4.5 GHz on all cores.

MCE is great for those that don't manually overclock. However, what often happens is that individuals fail to realise that MCE requires extra voltage.

Martin, this is good information and something I did not know regarding MCE. His mobo is an Asus Prime Z270, and we indeed kept MCE enabled, mainly for the simplicity of getting a reasonable overclock without having to figure out the proper voltage to apply. I might try disabling MCE to see what difference it makes, or eventually look for a better cooling solution such as the NH-D15 (he doesn't want any water-cooling solution for some reason).

Reducing the CPU temperature might give it a longer life, even though my son does not seem to be too concerned about that since he replaces his computer every 4 or 5 years no matter what.

Anyway, thank you for the information you provided me.


 

Normand

Intel i7 9700K @ 4.9 GHz / Asus Prime Z390-A / 32GB DDR4 3200 MHz / MSI RTX 4080 / PSU 750 Watt / Microsoft Flight Simulator / Windows 10 Pro x64

14 hours ago, NBouc said:

Martin, this is good information and something I did not know regarding MCE. His mobo is an Asus Prime Z270, and we indeed kept MCE enabled, mainly for the simplicity of getting a reasonable overclock without having to figure out the proper voltage to apply. I might try disabling MCE to see what difference it makes, or eventually look for a better cooling solution such as the NH-D15 (he doesn't want any water-cooling solution for some reason).

Reducing the CPU temperature might give it a longer life, even though my son does not seem to be too concerned about that since he replaces his computer every 4 or 5 years no matter what.

Anyway, thank you for the information you provided me.

 

He won't notice any difference in games, to be honest. Games are mostly GPU-biased, and even in a CPU-biased application a few hundred megahertz amounts to a very small increase in frame rate, because in a reasonably well-balanced system frame rate scales at best linearly with clock speed. This is why it's somewhat amusing how gamers panic if they can't achieve a few hundred megahertz more to match their pals. They don't realise how little difference it makes. Different for enthusiasts of course, who do it for fun, or for competitive overclockers.
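To put rough numbers on that, here's a back-of-envelope Python sketch (the clocks and frame rate are hypothetical, and it assumes the best case where frame rate scales linearly with clock in a fully CPU-bound title):

# Best-case gain from a 4.5 -> 4.8 GHz bump in a CPU-bound scenario.
base_clock, oc_clock = 4.5, 4.8   # GHz
base_fps = 60.0

best_case_fps = base_fps * (oc_clock / base_clock)
print(f"{base_fps:.0f} fps -> at most {best_case_fps:.1f} fps "
      f"(a {(oc_clock / base_clock - 1) * 100:.1f}% clock increase)")

That's roughly 60 fps to 64 fps at best, and in a GPU-bound game the real gain is smaller still.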

For example, my daughter's CPU was OC'd to 5 GHz... great experience in games, high settings, high frame rate. Now, due to a cold-boot issue (an immature BIOS issue), it's at stock... again, great experience in games, high settings, high frame rate. She notices no difference. I would argue that for games, with the latest CPUs and GPUs, overclocking is at present unnecessary. The sim, being more CPU-biased, is a different story.

 

The MCE-on-by-default issue is a bit naughty, really. It should be off by default, and you should select it only if you know the consequences. To that end, the Asus BIOS should throw up an information screen to warn of higher temps. Incidentally, when Tom Logan of OC3D quizzed Asus on this, astonishingly, they weren't aware it was on even with optimised defaults set, so there will probably be a new BIOS update to address this.


I was able to snag an 8700K today on Newegg when it went back on sale for about 5 minutes. When it comes in and I'm done building the rig, I'll report performance findings.


FAA: ATP-ME

Matt Kubanda

8 hours ago, miguelpp said:

Very interesting comparison, which seems to show that the 2600K with a 1080 Ti (so definitely not GPU-bound) can still hold its own in most modern games.


 i7-6700k | Asus Maximus VIII Hero | 16GB RAM | MSI GTX 1080 Gaming X Plus | Samsung Evo 500GB & 1TB | WD Blue 2 x 1TB | EVGA Supernova G2 850W | AOC 2560x1440 monitor | Win 10 Pro 64-bit

59 minutes ago, vortex681 said:

Very interesting comparison, which seems to show that the 2600K with a 1080 Ti (so definitely not GPU-bound) can still hold its own in most modern games.

With a 5 GHz 2700K running an OC'd GTX 1080 wide open, there is nothing even in the 8700K camp worth spending any real money on yet. We will see what the next round brings for P3Dv4 flight simulation.

So far the best isn't quite good enough to make it worthwhile for me to worry much about. Hopefully a substantially improved replacement system will come soon that logically compels new FS adoption.

Looking forward to that day.

Kindest regards,

11 hours ago, miguelpp said:

Hi,

 

I would like to point out something about this comparison. The video is not at all legit; the benchmarks are surely manipulated. In the meantime I have watched many reviews, and I have seen that the 8700K is some 90-110% more powerful than the 2600K.

I would like to link these reviews:

http://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed

Also, on YouTube I have seen Digital Foundry's benchmarks with the 8700K and a 3770K @ 4.5 GHz. Games like Crysis 3 and The Witcher 3 averaged some 60 FPS more on the 8700K. Other games like Rise of the Tomb Raider showed a 30-40 FPS difference. So it's my humble advice not to believe this video.
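One caveat when turning those figures into percentages: a 60 FPS gap only means "twice as fast" if the baseline was around 60 FPS. A small Python sketch with hypothetical numbers to show how the same absolute gap reads against different baselines:

# Relative gain implied by an absolute FPS gap depends entirely on the baseline.
def percent_gain(old_fps: float, new_fps: float) -> float:
    return (new_fps - old_fps) / old_fps * 100

print(percent_gain(60, 120))   # 100.0 -> a 60 fps gap over a 60 fps baseline
print(percent_gain(120, 180))  #  50.0 -> the same 60 fps gap over a 120 fps baseline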

Also, I am sorry I haven't posted 8700K benches; the thing is I am on the move and won't reach home until the end of the month, so only then can I post some results. Heck, I couldn't even install P3D on my system, as I had to move out urgently because of some personal work.

 

Thanks.

1 hour ago, fr3dm said:

Hi,

 

I would like to point out something about this comparison. The video is not at all legit; the benchmarks are surely manipulated. In the meantime I have watched many reviews, and I have seen that the 8700K is some 90-110% more powerful than the 2600K.

I would like to link these reviews:

http://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed

 

There certainly was an agenda in these tests that would do Pravda proud in its heyday! LOL :blush: Every superlative adjective is used to deify the 8700K and degrade everything else. Just read it!! ^^^^ :cool: The author runs his favourites at extreme volts and speeds but (for instance) restricts the 2600K to just 4.6 GHz:

"I didn't stop turning up the clocks there, either. Intel's Sandy Bridge Core i7-2600K is rightly regarded as a legendary overclocker, so I pushed up the sliders on our particular chip and reached 4.6 GHz on all cores." 

http://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed

Good grief, my Sandy Bridge is running at 5 GHz as I type this post!
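A rough Python sketch of how much that cap leaves on the table (assuming a CPU-bound test that scales with clock, which is of course the best case):

# Clock headroom the review gave up by testing the 2600K at 4.6 GHz rather
# than a typical 5.0 GHz overclock.
capped, typical_oc = 4.6, 5.0   # GHz
print(f"{(typical_oc / capped - 1) * 100:.1f}% of achievable clock left unused")  # ~8.7%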

What a setup, straw men and all.

Makes me wonder if the books are cooked on the other comparisons as well. Which brings up a good point, and a good time to make it!

Saying someone else faked tests in order to introduce this ridiculous set of unfair testing is hogwash.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/76333-i7-2600k-vs-i7-8700k-upgrading-worthwhile.html

Pushing someone else down does not pull the accuser up. Pay-ware providers, hardware manufacturers and equipment sellers must give us reasonable and truthful product claims. There is no need for hyperbole, twisted information and distortion.

I have spent many thousands on products that simply do not measure up to their claims (especially pay-ware scenery and aircraft), just like most others in this hobby. Some of us want to actually pay for what we get, not for what we were told we were going to get but didn't. Did I say that right?

I am accusing no one of anything, but please, no more exaggerations (if there are any) for the sake of building bogus market demand. No offence intended, but we all know the 8700K is another advance for our hobby. The question is whether it is a solid enough product to build a brand-new flight sim system upon, for those whose competent existing equipment nearly replicates its gameplay when compared fairly.

As someone else said, just the facts, ma'am!

Kind regards,


14 minutes ago, SpiritFlyer said:

There certainly was an agenda in these tests that would do Pravda proud in its heyday! LOL :blush: Every superlative adjective is used to deify the 8700K and degrade everything else. Just read it!! ^^^^ :cool: The author runs his favourites at extreme volts and speeds but (for instance) restricts the 2600K to just 4.6 GHz:

"I didn't stop turning up the clocks there, either. Intel's Sandy Bridge Core i7-2600K is rightly regarded as a legendary overclocker, so I pushed up the sliders on our particular chip and reached 4.6 GHz on all cores." 

http://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed

Good grief, my Sandy Bridge is running at 5 GHz as I type this post!

What a setup, straw men and all.

Makes me wonder if the books are cooked on the other comparisons as well. Which brings up a good point, and a good time to make it!

Saying someone else faked tests in order to introduce this ridiculous set of unfair testing is hogwash.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/76333-i7-2600k-vs-i7-8700k-upgrading-worthwhile.html

Pushing someone else down does not pull the accuser up. Pay-ware providers, hardware manufacturers and equipment sellers must give us reasonable and truthful product claims. There is no need to use hyperbole and twisting of information and distortions.

I have spent many thousands on products that simply do not measure up to their claims, just like most others in this hobby. Some of us want to actually pay for what we get, not for what we were told we were going to get but didn't. Did I say that right?

I am accusing no one of anything, but no more exaggerations (if there are any) for the sake of building bogus market demand. No offence intended, but we all know the 8700K is a great advance for our hobby. The question is whether it is a solid enough product to build a brand-new flight sim system upon, for those whose competent existing equipment nearly replicates its gameplay when compared fairly.

As someone else said, just the facts Ma-am!

Kind regards,

Hi,

I would like to link another video : 

 

This video shows a massive difference between a 3770K and an 8700K, and it looks consistent with the TechReport review I linked in my previous comment.

The HardwareCanucks review shows very little difference between the 2600K and the 8700K, which just doesn't sink in for me.

Six generations later and still very little performance difference is something I can't believe.

Regards.

1 hour ago, SpiritFlyer said:

Makes me wonder if the books are cooked on the other comparisons as well. Which brings up a good point, and a good time to make it!

Stephen, my friend, it's the tech industry... which is all about cooking the books (always has been, probably always will be).

Won't waste my time on the Tech Report stuff, but the HardwareCanucks review is rather simple and straightforward... don't focus completely on the gaming comparisons between the 2600K/8700K. Look to the productivity comparisons. They tell us more about how the two CPUs compare in a sim.

Regards,

Greg

P.S. As much as I like the 8700K (only built one, but it's rockin') I still ain't ready to give up my 4790K/1080/ddr3_2600 rig for it... :laugh:   Still miles left in my box'o'bits! But if I were you...

