
Archived

This topic is now archived and is closed to further replies.

alainneedle1

Will or Is XP-10 taking advantage of SLI

Recommended Posts

Absolutely not!

 

XP10 STILL cannot take advantage of SLI or CrossFireX (unfortunately).


Nothing new, same old. I guess it would add a bit to support a specific chipset, but SLI would be a different rendering codepath than CrossFire, so LR would either have to choose one over the other or build for both, and that would take a lot of effort that could be used to expand the rest of the platform...


Aaron


Actually, yes, XPX will take advantage of SLI, especially for higher AA levels: you can use Nvidia Inspector to force an SLI AA level much higher than the in-program rendering settings.

 

BUT once you activate HDR, it does not seem to take advantage of SLI (maybe a few fps of improvement).

 

I had a GTX 690 and did this all the time. I had to give it up for a 680 4 GB because X-Plane 64-bit was going over the 2 GB limit on the card.


I have found some more recent (September 2012) words from Ben on this topic (Google helps):

http://developer.x-p...it-will-not-do/

 

Down in the comments you find these exchanges (quite technical, but they explain very well what works, what doesn't, and why):

 

Filippo says:

September 28, 2012 at 12:40 pm

 

Hi Ben,

I know I’m off topic, but speaking about graphics cards and drivers, I was eager to ask you this question :-)

I’ve been surfing the net to learn something about deferred rendering (which, to my understanding, is used by X-Plane in HDR mode). I found some contributions (forums, blogs) hinting that deferred rendering is not really “SLI/Crossfire-friendly” because of its internal workflow, which seems consistent with the current situation of X-Plane, which (for now) doesn’t get significant advantages from multiple GPUs.

Considering that more and more 3D games are gradually switching to deferred rendering engines (one example for all: Unreal Engine 3), if what I read is true, should I deduce that SLI/Crossfire configurations are gradually becoming useless in favour of high-power single-GPU configurations? Or is it something temporary, until future driver enhancements allow deferred rendering engines to fully exploit multi-GPU configs?

… or maybe I’ve completely misunderstood the situation, which could be possible given that I’m no expert at all in this field…

 

Ben Supnik says:

September 28, 2012 at 3:55 pm

 

Hi Filippo,

HDR is indeed a deferred renderer – it’s a deferred renderer that resolves into an HDR surface with a linear color space. Now…

1. It is my understanding that X-Plane _will_ scale with SLI now; the reason NV did not default us to using SLI in their profiles is that we are typically CPU bound on the type of system that they test (one big monitor, one bad-&@($* video card). For example, if you run a GeForce 680 at 1920 x 1200 with a lot of rendering settings, you’re going to max out on CPU due to objs and shadows, not GPU due to HDR and clouds.

2. SLI/Crossfire come in a number of flavors: AFR (alternate frame rendering) should be friendly with deferred rendering engines if they are coded correctly. They are unfriendly to temporal anti-aliasing – the driver has to do some special work to make this case work. Deferred renderers are very unfriendly to _split frame_ SLI and CrossFire, but this case is highly non-optimal anyway because the driver has to push the entire frame to BOTH GPUs (double the “push”), whereas each frame goes out over the bus only once in AFR.
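Ben's point about the doubled "push" in split-frame mode can be sketched with a tiny model. The scene-data size below is a made-up placeholder and real drivers are far more complicated; the sketch only illustrates why SFR doubles per-frame bus traffic while AFR does not:

```python
# Illustrative sketch of AFR vs split-frame (SFR) multi-GPU bus traffic.
# In AFR, each frame's scene data crosses the bus once, to one GPU.
# In SFR, every GPU renders part of the SAME frame, so the whole
# frame's data must reach ALL GPUs. Numbers are hypothetical.

SCENE_DATA_MIB = 300  # assumed per-frame scene upload, not a real measurement

def bus_traffic_per_frame(mode: str, num_gpus: int = 2) -> int:
    """MiB that must cross the bus to render one frame."""
    if mode == "AFR":
        return SCENE_DATA_MIB             # whole frame goes to one GPU only
    if mode == "SFR":
        return SCENE_DATA_MIB * num_gpus  # every GPU needs the full frame
    raise ValueError(f"unknown mode: {mode}")

print("AFR:", bus_traffic_per_frame("AFR"), "MiB per frame")
print("SFR:", bus_traffic_per_frame("SFR"), "MiB per frame")
```

With two GPUs, SFR pushes exactly twice the data of AFR per frame, which is the overhead Ben describes on top of deferred rendering's other problems with split-frame mode.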

Anyway, if you found a post where someone who works on Unreal 3 says “We don’t play nice with AFR” I’d be curious to see it, but two-pass rendering techniques like deferred rendering should not be fundamentally problematic.

What IS problematic with deferred rendering is anti-aliasing... the G-buffer has to be stored in “multi-sample” mode – which means 16x FSAA + deferred rendering is 16x the G-buffer storage. Who cares? Well, the problem is that G-buffers tend to be pretty huge – at a minimum typically 4x the cost of a regular framebuffer. So FSAA + deferred is very framebuffer expensive, which is why the anti-aliasing options in HDR mode are not as nice (from an anti-aliasing perspective) as in non-HDR mode.
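The storage cost Ben describes is easy to put numbers on. A back-of-envelope sketch, where the per-pixel byte counts are assumptions for a typical deferred G-buffer (albedo, normals, depth, material params), not X-Plane's actual layout:

```python
# Back-of-envelope memory cost of a G-buffer under MSAA.
# Byte counts are assumed values, not X-Plane's real layout.

WIDTH, HEIGHT = 1920, 1200
FRAMEBUFFER_BPP = 4   # plain RGBA8 back buffer
GBUFFER_BPP = 16      # assumed: ~4x a regular framebuffer, per Ben's note

def buffer_mib(bytes_per_pixel: int, samples: int = 1) -> float:
    """Memory in MiB for one full-screen buffer at a given MSAA sample count."""
    return WIDTH * HEIGHT * bytes_per_pixel * samples / 2**20

for samples in (1, 4, 16):
    print(f"{samples:>2}x MSAA: framebuffer "
          f"{buffer_mib(FRAMEBUFFER_BPP, samples):6.1f} MiB, "
          f"G-buffer {buffer_mib(GBUFFER_BPP, samples):6.1f} MiB")
```

At these assumed sizes, a 16x multisampled G-buffer at 1920 x 1200 runs to hundreds of MiB on its own, which is why high FSAA levels get prohibitive in HDR (deferred) mode while staying cheap in the forward renderer.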


Andras Fabian / Alpilotx

Visit www.alpilotx.net, a site about X-plane scenery

You can see some landscape and other photographs from me here:

http://www.flickr.co...s/weathermaker/

