dutto88

Good 30Hz monitor


Hey guys, 

 

So I've upgraded my PC, thanks for all your help! I have seen heaps of videos about locking P3D at 30 FPS and setting the monitor refresh rate to 30Hz. Was just wondering if anyone can link me to a good monitor?
 

Cheers 

I currently have an ASUS PB287Q 4K running at 30Hz, and it runs super smooth.

 

Without it, my P3D would not run well at all; I was getting 2-second stutters.


All G-Sync monitors can run at 30Hz.

 

Some FreeSync monitors can.


We are all connected..... To each other, biologically...... To the Earth, chemically...... To the rest of the Universe atomically.
 
Devons rig
Intel Core i5 13600K @ 5.1GHz / G.SKILL Trident Z5 RGB Series Ram 32GB / GIGABYTE GeForce RTX 4070 Ti GAMING OC 12G Graphics Card / Sound Blaster Z / Meta Quest 2 VR Headset / Klipsch® Promedia 2.1 Computer Speakers / ASUS ROG SWIFT PG279Q ‑ 27" IPS LED Monitor ‑ QHD / 1x Samsung SSD 850 EVO 500GB / 2x Samsung SSD 860 EVO 1TB /  1x Samsung - 970 EVO Plus 2TB NVMe /  1x Samsung 980 NVMe 1TB / 2 other regular hd's with up to 10 terabyte capacity / Windows 11 Pro 64-bit / Gigabyte Z790 Aorus Elite AX Motherboard LGA 1700 DDR5


All G-Sync monitors can run at 30Hz.

 

That's not true. Not to the extent that you can manually select 30Hz, at least.


Asus TUF X670E-PLUS | 7800X3D | G.Skill 32GB DDR @ CL30 6000MHz | RTX 4090 Founders Edition (Undervolted) | WD SNX 850X 2TB + 4TB + 4TB


That's not true. Not to the extent that you can manually select 30Hz, at least.

 

You don't have to.

 

If I understand correctly, setting FSX or P3D to run at 30 FPS should automatically make a G-Sync monitor match whatever the program is doing.

 

FreeSync usually goes down to about 40Hz, and all G-Syncs down to 30Hz.

 

https://pcmonitors.info/others/nvidia-g-sync-variable-refresh-rate-technology/

 

Nvidia have come up with a solution to this issue; G-SYNC. By integrating some clever electronics (below) into specific monitors it is possible to get the monitor to adopt a variable refresh rate, adjusting in real-time to accommodate the frame rate of a game. The frame rate of the monitor is still limited in much the same way it is without G-SYNC, but it adjusts dynamically to a refresh rate as low as 30Hz (current models) to match the frame rate of the game. By doing this the monitor refresh rate is perfectly synchronised with the GPU. You don’t get the screen tearing or ‘visual latency’ of having VSync disabled nor do you get the stuttering or input lag associated with using VSync.
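To make the quoted mechanism concrete, here is a small Python sketch (hypothetical frame times, not measurements of any real monitor) of why a fixed scan-out judders while a variable refresh tracks the renderer exactly:

```python
import math

REFRESH_HZ = 60.0
PERIOD = 1.0 / REFRESH_HZ  # one fixed scan-out interval (~16.7 ms)

def present_fixed_vsync(done_times, period=PERIOD):
    """With a fixed refresh plus V-Sync, a finished frame waits for the
    next scan-out boundary before it can be shown."""
    return [math.ceil(t / period) * period for t in done_times]

def present_vrr(done_times):
    """With G-Sync, the monitor starts a refresh the moment the frame is
    ready, so presentation time equals completion time."""
    return list(done_times)

def intervals(times):
    return [b - a for a, b in zip(times, times[1:])]

# Hypothetical ~30 FPS frame-completion times (seconds) with a few ms of jitter.
done = [0.035, 0.072, 0.099, 0.135, 0.163, 0.199]

fixed = intervals(present_fixed_vsync(done))
vrr = intervals(present_vrr(done))
# The fixed-refresh intervals snap to multiples of 16.7 ms, so a few ms of
# render jitter becomes a visible 16.7 ms hop; the VRR intervals equal the
# render intervals, so a few ms of jitter stays a few ms.
```

The point of the sketch is only the quantisation: a fixed refresh can only display on its own grid, while VRR has no grid.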

 




I have a Viewsonic UHD, and when I tried 30Hz it worked fine until I attempted to use my TrackIR; then it stuttered like Mel Tillis on a game show. I had to revert to 60Hz and all was smooth again.


Regards,

Pivot

i9-10900k * 64Gb Ram * MSI RTX 4070 Ti Gaming X Trio * Steel Series Arctis Pro Wireless Headset * Win11 Home x64 * Oculus Rift-S * Tobii ET 5 * TM Warthog Combo, Honeycomb Alpha & Saitek Pro-Rudders


You don't have to.

 

If I understand correctly, setting FSX or P3D to run at 30 FPS should automatically make a G-Sync monitor match whatever the program is doing.

 

FreeSync usually goes down to about 40Hz, and all G-Syncs down to 30Hz.

 

https://pcmonitors.info/others/nvidia-g-sync-variable-refresh-rate-technology/

 

Misread your post initially, so I'll rewrite my reply.

 

Yes, that's one way of getting 30Hz, but it's slightly beside the main point of the much-discussed 30Hz solution. First off, running at exactly 30 FPS / 30Hz with G-Sync can be detrimental to the experience, as G-Sync will turn off and on as the framerate crosses the 30 FPS limit; it doesn't work below that. So even at 29.x, it'll turn off and you might experience a stutter or tear.

 

But the key behind the discussed 30Hz mode is being able to enable triple buffering and V-Sync internally, in the sim. Triple buffering is what smooths out the perceived microstutters and frame delays that many experience with P3D. You won't get that with G-Sync, unfortunately, so you're just back to the original issue.

 

That's why people want a monitor able to do 30Hz: so they can add it as a custom resolution (or use it directly if it ships with the monitor) and then enable triple buffering and V-Sync at 30Hz.
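For reference, the sim-side half of that combination lives in Prepar3D.cfg (the same options are exposed in P3D's Display settings UI). This is a sketch only; the exact key names below are an assumption based on common tweak guides and may differ between P3D versions, so treat your own generated Prepar3D.cfg as authoritative:

```ini
[DISPLAY]
; assumed key names -- verify against your own Prepar3D.cfg
VSYNC=1           ; sim-level V-Sync, required for triple buffering to apply
TRIPLE_BUFFER=1   ; the extra buffered frame that smooths delivery at 30Hz
```

With the monitor itself at a 30Hz custom resolution, these two settings are what the posts in this thread call the "30Hz / V-Sync / TB" setup.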




Yes, that's one way of getting 30Hz, but it's slightly beside the main point of the much-discussed 30Hz solution. First off, running at exactly 30 FPS / 30Hz with G-Sync can be detrimental to the experience, as G-Sync will turn off and on as the framerate crosses the 30 FPS limit; it doesn't work below that. So even at 29.x, it'll turn off and you might experience a stutter or tear.

 

You actually made me boot up P3D, lock frames at 30, and fiddle around, swerving the view wildly while spinning a jet over New York, and I was unable to get any screen tearing at all on my monitor (and also no input lag, which is another big feature of G-Sync).

 

Maybe not for everyone, but it seems worth a try.




30Hz is great, especially if you can maintain >=30 FPS!


 


I have the Samsung 4K TV UN40KU6300, which runs at 30Hz with no lag or issues.


My setup is Unlimited / Triple Buffering On / V-Sync On, because I can maintain over 30 FPS, so this works well.


Nvidia Inspector settings: Prefer Maximum Performance / Override any app setting / 4x SGSS.


 


Here's a spreadsheet of my complete settings 


https://www.dropbox....Chart.xlsx?dl=0


 


Joe



Joe (Southern California)

SystemI9-9900KS @5.1Ghz/ Corsair H115i / Gigabyte A-390 Master / EVGA RTX 2080 Ti FTW3 Hybrid w 11Gb / Trident 32Gb DDR4-3200 C14 / Evo 970 2Tb M.2 / Samsung 40inch TV 40ku6300 4K w/ Native 30 hz capability  / Corsair AX850 PS / VKB Gunfighter Pro / Virpil MongoosT-50 Throttle / MFG Crosswind Pedals /   LINDA, VoiceAttack, ChasePlane, AIG AI, MCE, FFTF, Pilot2ATC, HP Reverb G2


You actually made me boot up P3D, lock frames at 30, and fiddle around, swerving the view wildly while spinning a jet over New York, and I was unable to get any screen tearing at all on my monitor (and also no input lag, which is another big feature of G-Sync).

 

Maybe not for everyone, but it seems worth a try.

 

I think you are getting hung up on the wrong parameters of why people are opting for monitors with a native 30Hz mode, namely V-Sync at 30Hz with triple buffering, the latter not being available unless you run the sim's own V-Sync. The ESP engine is notorious for frame judder, microstuttering and high frametimes, and it's especially noticeable due to the low and varying framerates inherent to the engine. Triple buffering goes a long way towards minimising, or even completely eliminating, the issue thanks to the extra buffered frame.
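The effect of that extra buffered frame can be sketched with a toy scheduler in Python. The render times are hypothetical and this is not a measurement of P3D itself: with classic double buffering under V-Sync, the GPU stalls on a full back buffer, so a render time just over one refresh interval halves the presented framerate, while a third buffer lets the GPU keep rendering so the same frames still come out one tick apart.

```python
import math

PERIOD = 1.0 / 30.0  # one refresh interval of a 30Hz monitor
TOL = 1e-9           # tolerance for float comparisons on tick boundaries

def double_buffered(render_times):
    """GPU must hold each finished frame until the next scan-out before it
    can reuse the buffer, so one slow frame delays every frame after it."""
    presents, t = [], 0.0
    for r in render_times:
        t += r                                      # render the frame
        t = math.ceil(t / PERIOD - TOL) * PERIOD    # stall until the next tick
        presents.append(t)
    return presents

def triple_buffered(render_times):
    """GPU renders continuously into a spare buffer; each tick simply shows
    the newest finished frame, repeating the old one when none is ready."""
    done, t = [], 0.0
    for r in render_times:
        t += r
        done.append(t)
    presents, tick, i = [], PERIOD, 0
    while i < len(done):
        if done[i] <= tick + TOL:
            # show the newest finished frame; anything older is superseded
            while i + 1 < len(done) and done[i + 1] <= tick + TOL:
                i += 1
            presents.append(tick)
            i += 1
        # if no frame was ready, the previous frame is simply repeated
        tick += PERIOD
    return presents

# 20 frames that each take 35 ms, just over the 33.3 ms budget.
slow = [0.035] * 20
db = double_buffered(slow)   # every frame occupies two ticks: a 15 FPS cadence
tb = triple_buffered(slow)   # frames still present one tick apart: 30 FPS cadence
```

The model is deliberately simple (no input latency accounting, no frame pacing), but it shows the mechanism the posts argue about: the spare buffer converts a GPU stall into, at worst, a single repeated refresh.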

 

Aside from that, you've provided no details of your own testing scenario. Are you in fact running G-Sync "for windowed and full screen mode" and not just "for full screen mode", given the borderless-window nature of P3D? Do you have some benchmark output showing the framerate at 30.0 or above for your testing run? Because when you lock at 30, it rarely runs at 30 but at 29.x, where G-Sync will be disabled and your desktop refresh rate takes over (given your monitor, you most likely wouldn't perceive tearing with a high desktop refresh rate). Are you running the sim's own internal framerate limiter or an external solution? The latter is preferred if you want to minimise performance loss and hold 30 FPS, though most likely it still won't hold exactly.

 

And running G-Sync on the very edge of where it works makes no sense, especially not in a sim that has a hard time maintaining a framerate. As mentioned above, there's no point in looking for tearing when you're running 120Hz+ on the desktop; the high refresh rate would practically eliminate it if G-Sync disables. However, with G-Sync on something like a 4K monitor, which is 60Hz at most, a drop below 30 FPS that turns G-Sync off would mean an unlocked game on a 60Hz monitor, and then you'd see tearing.

 

G-Sync on a 120 / 144Hz monitor can actually judder the frame when it disables and enables, seen as a quick stutter under some circumstances. So that's not really desirable either.

 

And ultimately, you aren't benefiting from triple buffering either way. 30Hz / V-Sync / TB is still the king when it comes to a smooth image. Having owned multiple G-Sync monitors and done extensive testing with P3D, I'd say the only monitors you'd run G-Sync with in P3D are probably the 4K / 60Hz ones, unless you go the 30Hz route. With the high-refresh-rate monitors, you're better off running 120 / 144Hz on the desktop and then externally locking your framerate for the smoothest result.




Aside from that, you've provided no details of your own testing scenario. Are you in fact running G-Sync "for windowed and full screen mode" and not just "for full screen mode", given the borderless-window nature of P3D?

 

It's been a while, since I don't use P3D as a main sim, but I believe running G-Sync with P3D became possible in the Nvidia 353.xx driver series, along with an option through Nvidia Inspector at around the same time, which is when I switched to the "Both windowed and fullscreen mode" setting.

 

Do you have some benchmark output showing the framerate at 30.0 or above for your testing run? Because when you lock at 30, it rarely runs at 30 but at 29.x, where G-Sync will be disabled and your desktop refresh rate takes over.

 

For that reason, you can set the Nvidia Inspector framerate limiter to 30.5 FPS.

 

And running G-Sync on the very edge of where it works makes no sense, especially not in a sim that has a hard time maintaining a framerate. As mentioned above, there's no point in looking for tearing when you're running 120Hz+ on the desktop; the high refresh rate would practically eliminate it if G-Sync disables. However, with G-Sync on something like a 4K monitor, which is 60Hz at most, a drop below 30 FPS that turns G-Sync off would mean an unlocked game on a 60Hz monitor, and then you'd see tearing.

 

Which is why I suggest that everyone try it for themselves. In an age where i7s and 1080s are common and PC internals so dissimilar, many people probably have little problem maintaining 30 FPS except in the harshest of conditions. It can't hurt to at least give it a shot and see what happens.

 

And ultimately, you aren't benefiting from triple buffering either way. 30Hz / V-Sync / TB is still the king when it comes to a smooth image.

 

Well, you wouldn't bother (I don't think?) with triple buffering on a G-Sync anyway, as it already only displays frames when they finish, which obviates the need for buffered frames in the first place.

 

I'm not an expert on this, which is why I was kind of hoping for Rob to step in on a previous thread that touched on this a few days ago, but I think it got lost. I use P3D only intermittently, and when I do, fiddling around at 30 FPS isn't necessary for me. It's just partially remembered experiments I did sometime in the last century.

 

In the end, all I can say is that people have reported good results here and in other places, and it's worth giving it a shot. In the meantime, G-Syncs are available for use at 30Hz. Mine has an internal FPS display, and the sim reads as perfectly stable at 30, with no trace of tearing or the lag that can sometimes appear at low FPS settings.

 

People out there can try it or not, as they wish.




It's been a while, since I don't use P3D as a main sim, but I believe running G-Sync with P3D became possible in the Nvidia 353.xx driver series, along with an option through Nvidia Inspector at around the same time, which is when I switched to the "Both windowed and fullscreen mode" setting.

G-Sync is hardware, not a driver feature.

 

For that reason, you can set the Nvidia Inspector framerate limiter to 30.5 FPS.

And you still think 0.5 FPS isn't within the massive variance of the ESP platform? The entire point of G-Sync is being able to work within a wider frame variance without noticing judder or tearing. It makes absolutely no sense to lock around the very lower limit of G-Sync.

 

 

Which is why I suggest that everyone try it for themselves. In an age where i7s and 1080s are common and PC internals so dissimilar, many people probably have little problem maintaining 30 FPS except in the harshest of conditions. It can't hurt to at least give it a shot and see what happens. Just as an aside, 4K G-Syncs haven't necessarily been stuck at 60Hz or even 144Hz for a while; higher-end models can reach 165Hz.

 

 

Except that doesn't hold true for the ESP platform, given that it's a 10-year-old underlying engine that simply doesn't perform all that well. Throw in a commercial jet, some dense autogen and a high-quality airport, and regardless of your specs your framerate will suffer. And what are people supposed to 'give a shot'? Do they need to go out and buy a monitor with G-Sync to try it out? I'm getting the feeling you think G-Sync is a driver-level feature.

Also, there is no 4K monitor on the consumer market today that does beyond 60Hz natively. They are coming, but they are not here yet.

 

Well, you wouldn't bother (I don't think?) with triple buffering on a G-Sync anyway, as it already only displays frames when they finish, which obviates the need for buffered frames in the first place.

You wouldn't bother because you can't. And no, that does not render buffered frames obsolete. The problem with the ESP platform is the highly inconsistent frame rendering and the stuttering; G-Sync cannot compensate for that. That's why you need to triple-buffer the rendering, to smooth out the inconsistency.

 

The fact that you somehow think G-Sync is a software feature tells me you really don't know much about G-Sync at all.




G-Sync is hardware, not a driver feature.

 

Controlled by the Nvidia drivers, though. Also, as you mentioned yourself, G-Sync was initially ineffective due to the way P3D handled its borderless screen. This changed after the introduction of the 353.xx driver series, I believe, followed by Nvidia Inspector exposing the G-Sync application mode choices (fullscreen only, or fullscreen and windowed). Again software, as is the NVI framerate limiter.

 

So no, I don't think G-Sync is software, but I do know that altering its behaviour is mostly done through software.

 

And you still think 0.5 FPS isn't within the massive variance of the ESP platform?

 

I think any massive variance is probably still well within the range of amelioration by the technology. https://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

 

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
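The multiplication logic PCPer describes can be modelled in a few lines of Python. Note the ~48Hz flicker floor below is purely an assumption fitted to the three data points in the quote (29 → 58, 25 → 50, 14 → 56); the real module's threshold isn't documented, so treat this as a sketch of the idea, not the actual firmware behaviour:

```python
import math

PANEL_MIN_HZ = 30.0      # bottom of the G-Sync window on a 30-144Hz panel
FLICKER_FLOOR_HZ = 48.0  # assumed redraw target, fitted to PCPer's numbers

def effective_refresh(fps, panel_min=PANEL_MIN_HZ, floor=FLICKER_FLOOR_HZ):
    """Refresh rate the panel would actually run at for a given frame rate."""
    if fps >= panel_min:
        return fps  # inside the VRR window: refresh simply tracks the frame rate
    # below the window: redraw each frame enough times to stay flicker-free
    multiplier = math.ceil(floor / fps)
    return fps * multiplier

# Reproduces the quoted measurements:
# effective_refresh(29) -> 58, effective_refresh(25) -> 50,
# effective_refresh(14) -> 56
```

The takeaway matches the quote: below the window the panel is never actually scanning at the game's frame rate; it multiplies the redraws to keep the pixels refreshed.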

 

And what are people supposed to 'give a shot'? Do they need to go out and buy a monitor with G-Sync to try it out? I'm getting the feeling you think G-Sync is a driver-level feature.

 

In a thread about buying a new monitor, I assume "giving it a shot" would mean deciding whether to go for a G-Sync monitor, as one known to be capable of handling the lower Hz requested by the thread starter.

 

You wouldn't bother because you can't. And no, that does not render buffered frames obsolete. The problem with the ESP platform is the highly inconsistent frame rendering and the stuttering; G-Sync cannot compensate for that. That's why you need to triple-buffer the rendering, to smooth out the inconsistency.

 

Putting aside all the rest of it, since I think a simple recommendation of a monitor has been dragged waaaaaaay off course, I would say I've always believed that compensating for such things was the very reason for G-Sync. https://blogs.nvidia.com/blog/2013/10/18/g-sync/

 

No More Tearing, No More Stutters

 

Hundreds of engineer-years later, we’ve developed the G-SYNC module. It’s built to fit inside a display and work with the hardware and software in most of our GeForce GTX GPUs.

 

With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU renders with variable time, the refresh of the monitor now has no fixed rate.

 

This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.




Tom Allensworth, Founder of AVSIM Online
AVSIM: Flight Simulation's Premier Resource!
