1X 2080Ti vs. 2X 2080Ti vs. 1X TitanXP - P3D V4.3


10 hours ago, Rob Ainscough said:

I would think a buyer would much rather have a GPU that runs 20-40C all the time rather than one that's been running for years at 60-80C … so I can see why they want to buy from me given the lower heat stress … but why have me convert them back to their "stressed" state with stock cooler?  Anyway, I have put many back together because buyers seem to want that.

Cheers, Rob.

If I had to guess, I would say differences in technical ability. Higher-end solutions like watercooled cards are unfamiliar to most people, and so is the care and feeding of such beasties.

If you are not technical, a waterblocked card can look complex, intimidating, and hard to maintain.


We are all connected..... To each other, biologically...... To the Earth, chemically...... To the rest of the Universe atomically.
 
Devons rig
Intel Core i5 13600K @ 5.1GHz / G.SKILL Trident Z5 RGB Series Ram 32GB / GIGABYTE GeForce RTX 4070 Ti GAMING OC 12G Graphics Card / Sound Blaster Z / Meta Quest 2 VR Headset / Klipsch® Promedia 2.1 Computer Speakers / ASUS ROG SWIFT PG279Q ‑ 27" IPS LED Monitor ‑ QHD / 1x Samsung SSD 850 EVO 500GB / 2x Samsung SSD 860 EVO 1TB /  1x Samsung - 970 EVO Plus 2TB NVMe /  1x Samsung 980 NVMe 1TB / 2 other regular hd's with up to 10 terabyte capacity / Windows 11 Pro 64-bit / Gigabyte Z790 Aorus Elite AX Motherboard LGA 1700 DDR5

12 hours ago, HiFlyer said:

If you are not technical, a waterblocked card can look complex, intimidating, and hard to maintain.

I think you mean "not mechanical" … there's nothing "technical" about the process … small screws, paste removal/application, and heat-transfer pad placement.

Cheers, Rob.


I have the following situation: night, an airport with dynamic lighting (e.g. Madrid), and nearly 100% GPU load. I've got a 1080 Ti and an i7 8700K at 5.2 GHz. The antialiasing setting is 8xMSAA. That gives nearly fluid simming at night, but the autogen lights shimmer in the distance. So I'd like to know (if anybody knows, of course) whether an RTX 2080 Ti would be able to decrease the GPU load, or even better, whether I would be able to set AA to 4x-8x SSAA?

 

Cheers, Kai

 

Unbenannt-1.jpg

Edited by Kai-Uwe Wei

Mainboard: ASUS ROG STRIX Z370-H Gaming, CPU: Coffee Lake i7 8700K @ 5.2 GHz
Gainward CardExpert RTX 2080Ti, Monitor: LG ULTRAWIDE(38UC99) 3840*1600

32 GB RAM DDR4 3200 GSkill, Intel OptaneMemory 32GB
Windows 10 pro, MSFS and P3Dv5.1

3 hours ago, Kai-Uwe Wei said:

I have the following situation: night, an airport with dynamic lighting (e.g. Madrid), and nearly 100% GPU load. I've got a 1080 Ti and an i7 8700K at 5.2 GHz. The antialiasing setting is 8xMSAA. That gives nearly fluid simming at night, but the autogen lights shimmer in the distance. So I'd like to know (if anybody knows, of course) whether an RTX 2080 Ti would be able to decrease the GPU load, or even better, whether I would be able to set AA to 4x-8x SSAA?

 

Cheers, Kai

 

Unbenannt-1.jpg

With my 2080 Ti, I can now maintain 30 fps in 4K at dynamic lighting-enabled airports with 4x SSAA set. I have jacked up all the GPU-driven graphics settings (shadows, textures), except for dynamic reflections, which I disabled. GPU utilization was still only about 70%, from what I recall, sitting at the gate at FlyTampa Boston in the FSL A320. This is with a 6700K @ 4.7 and 16 GB of RAM running at 3200.

Ben 

Edited by bbain1187

P3D 4.3, Windows 10/64 bit, Intel 6700k @ 4.7 air-cooled, NVidia 2080 Ti Founders Edition, ASUS Rog Maximus VIII Ranger, 16GB G.Skill Ripjaws DDR4 @3200, Phanteks Anthoo Pro Series Case, Samsung 950 Pro M.2 500GB, Sandisk 1TB SATA, Seagate 2TB Hybrid Drive, Cooler Master 700W, 40-inch Samsung 4k TV


I find this statement about DDR4 memory from Hilbert (Guru3d.com) interesting...

DDR4 Memory

For Coffee Lake-S (8th and 9th Gen Intel processors) and DDR4, we always say volume matters more than frequency. A 3,200 MHz kit, for example, is more expensive and does offer better bandwidth, but the performance increase in real-world usage will be hard to find, unless you transcode videos on the processor a lot. As always, my advice would be to go with lower-clocked DDR4 memory with decent timings, but get more of it. Don't go for 8 GB; get two or four DIMMs and a minimum of 16 GB in total. The reason we test at 3200 MHz is simple: we do the same for AMD Ryzen and want to create a fair and equal playing field for both. 3200 MHz is, however, a very nice equilibrium for both processor brands.

Edited by TuFun


Hello, great analysis.
I was surprised that even with the highest settings, you leave the special effects at a low level. What is the reason? Thank you


Thank you for this information. I'm wondering if a single 2080 would be an option in my case. I run a 1070 with a 6700K at 4.7, 16 GB DDR4 3000 CAS 15.

I'm running fairly low GPU-intensive settings right now and it's running great, but it would be nice to stay between 45 and 60 fps at night with DL on. It would also be nice to have higher shadow and reflection settings.

I was thinking a single 2080 Ti, but maybe the price difference is not worth the performance difference.

Thanks


There doesn’t seem to be much info on the 2080 around here, especially with regard to 4K.

8 hours ago, VHOJT said:

There doesn’t seem to be much info on the 2080 around here, especially with regard to 4K.

I think that is due to the availability (or non-availability) issue.


Gigabyte x670 Aorus Elite AX MB; AMD 7800X3D CPU; Deepcool LT520 AIO Cooler; 64 Gb G.Skill Trident Z5 NEO DDR5 6000; Win11 Pro; P3D V5.4; 1 Samsung 990 2Tb NVMe SSD: 1 Crucial 4Tb MX500 SATA SSD; 1 Samsung 860 1Tb SSD; Gigabyte Aorus Extreme 1080ti 11Gb VRAM; Toshiba 43" LED TV @ 4k; Honeycomb Bravo.

 

9 hours ago, pgde said:

I think that is due to the availability (or non-availability) issue.

That would make sense 🙂


Hey guys, I need your advice:

Last weekend I sold my two 1080 Ti cards. Both were Gigabyte 1080 Ti Gaming OCs.

At this moment, here in the Netherlands, 2080 Ti cards are all on pre-order, with delivery between 15 November and the end of December.

I had my eyes on an MSI Trio or Gigabyte Gaming OC, but I might be waiting 3-6 weeks...

However, by accident I found a Gigabyte 2080 Ti Windforce that I can buy today. It is a little slower in OC mode than the Gaming OC; in benchmark tests the difference in fps is 2-3%.

What would you do for a triple-view (no 4K) setup? Buy the Windforce version or wait for one of the other two cards?

I have to decide today...

Thanks


13900 8 cores @ 5.5-5.8 GHz / 8 cores @ 4.3 GHz (hyperthreading on) - Asus ROG Strix Gaming D4 - GSkill Ripjaws 2x 16 Gb 4266 mhz @ 3200 mhz / cas 13 -  Inno3D RTX4090 X3 iCHILL 24 Gb - 1x SSD M2 2800/1800 2TB - 1x SSD M2 2800/1800 1Tb - Sata 600 SSD 500 Mb - Thermaltake Level 10 GT case - EKWB Extreme 240 liquid cooling set push/pull - 2x 55’ Sony 4K tv's as front view and right view.

13600 6 cores @ 5.1 GHz / 8 cores @ 4.0 GHz (hyperthreading on) - Asus ROG Strix Gaming D - GSkill Trident 4x Gb 3200 MHz cas 15 - Asus TUF RTX 4080 16 Gb - 1x SSD M2 2800/1800 2TB - 2x Sata 600 SSD 500 Mb - Corsair D4000 Airflow case - NZXT Kraken Z63 AIO liquid cooling - 1x 65” Sony 4K tv as left view.

FOV : 190 degrees

My flightsim vids :  https://www.youtube.com/user/fswidesim/videos?shelf_id=0&sort=dd&view=0

 

On 10/20/2018 at 4:10 PM, Diigg said:

I was surprised that even with the highest settings, you leave the special effects at a low level. What is the reason?

Don't like shoreline waves as-is … they don't bring much to the visuals table but can impact CPU load.

Cheers, Rob.

On 10/23/2018 at 7:25 AM, GSalden said:

Hey guys, I need your advice:

Last weekend I sold my two 1080 Ti cards. Both were Gigabyte 1080 Ti Gaming OCs.

At this moment, here in the Netherlands, 2080 Ti cards are all on pre-order, with delivery between 15 November and the end of December.

I had my eyes on an MSI Trio or Gigabyte Gaming OC, but I might be waiting 3-6 weeks...

However, by accident I found a Gigabyte 2080 Ti Windforce that I can buy today. It is a little slower in OC mode than the Gaming OC; in benchmark tests the difference in fps is 2-3%.

What would you do for a triple-view (no 4K) setup? Buy the Windforce version or wait for one of the other two cards?

I have to decide today...

Thanks

Gerard, it is probably too late.

But take your time before buying a 2080 Ti card.
I just received mine, an EVGA GeForce RTX 2080 Ti XC ULTRA GAMING, and although it is a fantastic improvement over my "old" 1080, it is VERY noisy.
And I believe the Gigabyte has the same cooling solution.

So wait a bit until a better card is released, unless you plan on water-cooling it 😉

Edited by David Roch

Best regards,
David Roch

AMD Ryzen 5950X //  Asus ROG CROSSHAIR VIII EXTREME //  32Gb Corsair Vengeance DDR4 4000 MHz CL17 //  ASUS ROG Strix GeForce RTX 4090 24GB OC Edition //  2x SSD 1Tb Corsair MP600 PCI-E4 NVM //  Corsair 1600W PSU & Samsung Odyssey Arc 55" curved monitor
Thrustmaster Controllers: TCA Yoke Pack Boeing Edition + TCA Captain Pack Airbus Edition + Pendular Rudder.

 

51 minutes ago, David Roch said:

Gerard, it is probably too late.

But take your time before buying a 2080 Ti card.
I just received mine, an EVGA GeForce RTX 2080 Ti XC ULTRA GAMING, and although it is a fantastic improvement over my "old" 1080, it is VERY noisy.
And I believe the Gigabyte has the same cooling solution.

So wait a bit until a better card is released, unless you plan on water-cooling it 😉

It’s running in the PC as I write this. The card is quieter than the Gamers Edition and runs at 46 degrees Celsius, with boost temperatures of 46-52 degrees C.

I cannot hear the card, as the noise in my cockpit is louder than the PC ...


13900 8 cores @ 5.5-5.8 GHz / 8 cores @ 4.3 GHz (hyperthreading on) - Asus ROG Strix Gaming D4 - GSkill Ripjaws 2x 16 Gb 4266 mhz @ 3200 mhz / cas 13 -  Inno3D RTX4090 X3 iCHILL 24 Gb - 1x SSD M2 2800/1800 2TB - 1x SSD M2 2800/1800 1Tb - Sata 600 SSD 500 Mb - Thermaltake Level 10 GT case - EKWB Extreme 240 liquid cooling set push/pull - 2x 55’ Sony 4K tv's as front view and right view.

13600 6 cores @ 5.1 GHz / 8 cores @ 4.0 GHz (hyperthreading on) - Asus ROG Strix Gaming D - GSkill Trident 4x Gb 3200 MHz cas 15 - Asus TUF RTX 4080 16 Gb - 1x SSD M2 2800/1800 2TB - 2x Sata 600 SSD 500 Mb - Corsair D4000 Airflow case - NZXT Kraken Z63 AIO liquid cooling - 1x 65” Sony 4K tv as left view.

FOV : 190 degrees

My flightsim vids :  https://www.youtube.com/user/fswidesim/videos?shelf_id=0&sort=dd&view=0

 

