First of all, thanks in advance for the help.

I'm currently using P3D v4.5 with an Asus ROG monitor at 2K resolution, and the sim runs smoothly at 45-60 FPS. Last night I tried my 4K Samsung TV as the monitor for my PC, and everything is fine except that my 1080 Ti is constantly boosted to 100% and after a while it heats up too much. Even at 2K resolution it still runs at 100%; with my ASUS monitor, GPU usage was around 50-60%. Is there any reason why? I also tried turning off AA and FXAA, but no change... the GPU stays at 100% the whole time.


Lots more workload at 4K vs HD resolution...four times as much in fact.
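To put numbers on that, here's the quick arithmetic ("HD" meaning 1080p, and assuming the "2K" ROG monitor is a typical 2560x1440 panel):

```python
# Per-frame pixel counts at common resolutions
hd  = 1920 * 1080   # 1080p "HD":  2,073,600 pixels
qhd = 2560 * 1440   # 2K/QHD:      3,686,400 pixels
uhd = 3840 * 2160   # 4K UHD:      8,294,400 pixels

print(uhd / hd)     # → 4.0   (four times the 1080p workload)
print(uhd / qhd)    # → 2.25  (still 2.25x a 1440p monitor)
```

So even coming from a 1440p monitor rather than 1080p, the GPU is pushing more than twice the pixels per frame at 4K.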

I run a 55-inch 4K Samsung JS8500 TV as my sim monitor, and the key to success has been to set the PC to output at a 30Hz refresh rate, with unlimited frame rate target, and with VSync and triple-buffer turned on in P3D.  It holds 30fps rock-solid, and doesn't cook the GPU trying to crank out 8.3 megapixels per frame as fast as it possibly can trying to hit 60 fps.  Typical GPU load on my RTX2080Ti is in the 40-60% range unless I'm in heavy weather or using dynamic lighting at night, when it runs closer to 80%.
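For reference, those settings map to entries like the following in Prepar3D.cfg. This is a sketch only: the key names follow the FSX/P3D-family conventions I'm aware of, so verify them against your own file, and the in-sim Display options page is the safer way to change them.

```ini
; Prepar3D.cfg (typically under %APPDATA%\Lockheed Martin\Prepar3D v4)
[DISPLAY]
VSYNC=True                ; lock frame delivery to the display's refresh rate
TRIPLE_BUFFER=True        ; smooths pacing when a frame misses a refresh tick
UPPER_FRAMERATE_LIMIT=0   ; 0 = "Unlimited" on the in-sim frame rate slider
```

With the display itself set to 30 Hz in Windows/nVidia Control Panel, VSync then caps the sim at a steady 30 fps.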

Regards

 


Thanks. That is exactly what I did. The monitor is set to 4K 30 Hz with VSync and triple buffering on. I'm also getting a steady 30 FPS, but with CPU usage at 100%.

2 hours ago, matteospacca said:

Thanks. That is exactly what I did. The monitor is set to 4K 30 Hz with VSync and triple buffering on. I'm also getting a steady 30 FPS, but with CPU usage at 100%.

Is your frame rate target in P3D set to "unlimited"?  If you set it to a fixed value, the CPU will be loaded up to its capacity trying to produce frames to fill the lookahead buffer...with unlimited it'll only produce frames at the VSync rate.  Some other settings, like large autogen LOD, dense autogen, and heavy AI loads can also slam the CPU.  But if you're stutter-free at 30 but with 100% CPU all the time, I suspect you have something other than unlimited set as your target.
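A toy illustration of the difference (pure Python, and not how P3D actually schedules work, just the busy-spin vs. paced-wait pattern described above):

```python
import time

def produce_frames(seconds, vsync_interval=None):
    """Count 'frames' produced in a time window.

    With vsync_interval set, the loop sleeps until the next refresh
    tick, like unlimited + VSync (CPU mostly idle between frames).
    Without it, the loop spins flat out, like a fixed limiter trying
    to keep the lookahead buffer full.
    """
    frames = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        if vsync_interval is not None:
            time.sleep(vsync_interval)   # wait for the next refresh tick
        frames += 1
    return frames

spinning = produce_frames(0.2)                     # CPU pegged the whole window
paced = produce_frames(0.2, vsync_interval=1/30)   # ~6 frames, CPU mostly asleep
```

The spinning loop produces vastly more iterations while keeping the core at 100%; the paced loop produces only what the 30 Hz "refresh" asks for.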

Also, FFTF Dynamic helps by giving the CPU a bigger time slice when scenery/texture processing loads are minimal (like on the ground).

Last, there's no info here on your system...could be related to system power/performance issues as well.  100% load on a 3 GHz quad-core CPU wouldn't be surprising, for example.

Regards


I have been running a 46" 4K UHDTV plus a 30" monitor using a Titan X card and a 7700 CPU. You gotta be getting awesome FPS with your setup. I'm at 40-50 FPS right now at 10,000 ft approaching KLAS in the PMDG 747, cockpit view.


Thanks for all the info. The point is that if I use the TV instead of the monitor, GPU usage goes up to 100% and stays there no matter what. Even if I set my PC to render 1080p at 30 Hz instead of 4K, it still cooks the GPU.


I've noticed when playing certain games (looking at you, Assassin's Creed Odyssey) that my 1080 Ti heats up markedly. If I put my bare foot on the chassis next to the card, it gets uncomfortable after about 30 seconds. These cards, it seems, run hot when maxed out.

If your case doesn't have a good side fan, consider installing one. Mine had a broken one that wasn't needed in my old builds, but I replaced it for this build and the GPU temps dropped by 12 degrees C under load (and the cooler it runs, the faster it runs).

One of the problems with the ATX form factor is that with the longer cards, you're often forced to put the card on the bottom slot in order to clear the RAM or the CPU's heatsink, and that means you have 2 or 3 vidcard fans crammed up against the bottom of the case and losing cooling effectiveness. Even in my very large full-tower case, this is an issue. I suspect my next upgrade might require water cooling just to get around that problem.

As to the 100% even at HD resolutions, who made the card, and are you running their drivers, or the stock nVidia ones?

 


I'm using a Gigabyte Aorus card that has a closed-loop water cooler with an external fan. I'm on the latest drivers, updated via the GeForce Experience app.



Some confusion here: first it's the GPU, now it's the CPU?

26 minutes ago, matteospacca said:

It is the GPU. The CPU is fine, running at 45-55%.

Your card should run at 45-50°C max at 100% GPU load. Is the water pump working correctly? My 2080 Ti with similar cooling doesn't pass 40°C at 100% load.

I have a 1080 Ti with an NZXT G12 bracket and an NZXT 140mm AIO; that one maxed out at 45°C, on a 4K Samsung.


Temps go up to 70°C since the GPU is stuck at 100% all the time; the water pump is OK. What I really don't understand is why everything is normal with my 2K Asus monitor (usage around 40-70%), but with the TV running the exact same resolution and settings, GPU usage goes up to 100%. Maybe something with the cable? I'm using DisplayPort on the monitor and HDMI on the TV.

3 minutes ago, matteospacca said:

Temps go up to 70°C since the GPU is stuck at 100% all the time; the water pump is OK. What I really don't understand is why everything is normal with my 2K Asus monitor (usage around 40-70%), but with the TV running the exact same resolution and settings, GPU usage goes up to 100%. Maybe something with the cable? I'm using DisplayPort on the monitor and HDMI on the TV.

70°C seems high even with a 120mm radiator on the AIO. The card should manage to run at 100% GPU load; with turbo boost it should clock down a little. What is your clock speed?

1 hour ago, Pegaso said:

Some confusion here: first it's the GPU, now it's the CPU?

Good spot...I was answering the question based on CPU load at 100%.

For the OP: What about dynamic lighting and shadows?  Are you running nVidia inspector?


Have you set unlimited in P3D? Test with VSync on and 30 Hz set in the nVidia Control Panel.

1 minute ago, w6kd said:

Good spot...I was answering the question based on CPU load at 100%.

For the OP: What about dynamic lighting and shadows?  Are you running nVidia inspector?

No Inspector. The only thing I've changed from default in the nVidia Control Panel is "Prefer maximum performance" for the power management option, but setting it back to default doesn't solve the issue. BTW... I just want to understand why GPU usage changes so much with a different output device (TV instead of monitor) and the exact same settings.

1 minute ago, westman said:

Have you set unlimited in P3D? Test with VSync on and 30 Hz set in the nVidia Control Panel.

Yes. The TV is running at 30 Hz and VSync is on in the sim.
