Guest CHRISH

PIC 767 with Geforce4 Ti 4600


Guest

Hmmm, interesting: anti-aliasing works in both full-screen and windowed mode for me. That was true both when I ran a GeForce2 GTS and with my new GeForce4 Ti4400, on driver versions 21.85 and 29.20 respectively.

Scott


Scott,

I'm attaching two images, both taken at 1600x1200 in 32-bit. The first is full-screen with AA enabled; the second is windowed, still with AA enabled. I'll let you decide which you prefer, but to my eyes AA does not work in windowed mode. I'm running Detonator drivers 28.32.

Cheers,


Ray (Cheshire, England).
System: P3D v5.3HF2, Intel i9-13900K, MSI 4090 GAMING X TRIO 24G, Crucial T700 4Tb M.2 SSD, Asus ROG Maximus Z790 Hero, 32Gb Corsair Vengeance DDR5 6000Mhz RAM, Win 11 Pro 64-bit, BenQ PD3200U 32” UHD monitor, Fulcrum One yoke.
Cheadle Hulme Weather

Guest

I have a P4 1.7GHz, GF2 MX400 and 512MB SDRAM. Worth the upgrade?

Thanks,
Greg

Guest

I just performed my own screen captures to verify my statement, and I have to stand behind it. On my machine it makes no difference whether you are in full-screen or windowed mode: anti-aliasing works. Differences in our hardware or software configurations must be at work here, because your screenshots tell another story. It's an interesting puzzle as to why we are seeing different behaviour.

Scott

Guest

Well, only you can decide whether an upgrade is worth it, but let me ask you some questions:

1. Does your video card have 32MB or 64MB of memory? That affects the level of FSAA (anti-aliasing) you can actually use. Even if you select 4x in the driver properties, the driver will fall back to whatever level the card can support at your resolution and bit depth (e.g. 1024x768 at 16-bit or 32-bit). With 32MB the achievable level will be lower than with 64MB, even though the panel lets you select any value you want. With that said, are you happy with the FSAA you have now, or would you like it to be better? I noticed a big improvement in FSAA, and that helped me feel good about my upgrade.

2. You will be disappointed if you expect frame rates to improve greatly. You may see a few extra FPS, but that's about the most. If you upgrade without expecting a big increase, at least you won't be let down on that front.

3. Does your GF2 MX400 have single or dual head? If it's single-head and you plan to run dual monitors in the future, an upgrade looks sweeter and more worthwhile.

4. Your current card already lets you select different digital vibrance values, so that's a toss-up with no gain either way. However, there is the matter of image quality, and in MY eyes it improved a lot with my upgrade. I'd caution that image quality is VERY subjective from one person to the next.

If money is not tight, then to me it's a good choice to upgrade, especially if you would use the card with other games besides FS2002. I had serious problems with my old setup, a GeForce2 GTS plus a number of secondary PCI video cards that I tried, so when NVIDIA came out with a more powerful dual-head solution than the GF2 MX series I was ready to take the plunge. Hope this gives you some things to think about and helps you in your decision.

Scott
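Scott's point about card memory capping the usable FSAA level can be illustrated with some back-of-envelope arithmetic. The buffer layout below (a supersampled colour buffer plus depth buffer, plus a front buffer at display size) is an illustrative assumption, not the actual driver's allocation scheme:

```python
# Rough estimate of video memory a supersampled frame buffer needs.
# Assumed layout (for illustration only): 4x FSAA renders at 2x width
# and 2x height; colour + depth buffers at the supersampled size,
# plus a front buffer at the display size.

def fsaa_framebuffer_mb(width, height, bits, fsaa_factor):
    bytes_per_pixel = bits // 8
    scale = int(fsaa_factor ** 0.5)          # 4x FSAA -> 2x per axis
    super_pixels = (width * scale) * (height * scale)
    back_and_depth = super_pixels * bytes_per_pixel * 2  # colour + Z
    front = width * height * bytes_per_pixel
    return (back_and_depth + front) / (1024 * 1024)

for bits in (16, 32):
    mb = fsaa_framebuffer_mb(1024, 768, bits, 4)
    print(f"1024x768 @ {bits}-bit, 4x FSAA: ~{mb:.1f} MB")
```

Under these assumptions, 4x FSAA at 1024x768 in 32-bit already consumes around 27MB, which on a 32MB card leaves almost nothing for textures; that is consistent with the driver silently dropping to a lower FSAA level.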

Guest Charly

"Additionally, the Ti4600 supports dual displays and again, I wouldn't recommend it if you are going to run it with a single display. Buy a Ti4400 instead."

Why? The Ti4400 supports dual displays too, and the Leadtek A250TD Ti4400 can easily reach, and even exceed, Ti4600 performance. This card is an overclocker's dream come true, and you can spend the rest of the money on memory.

http://ftp.avsim.com/dcforum/User_files/3ce1f7745165a314.jpg
http://ftp.avsim.com/dcforum/User_files/3ce1f77f516b5ea6.jpg

Regards :-wave
Charly - BAW062


Charly,

"Why? The Ti4400 supports dual displays too. The Leadtek A250TD Ti4400 card can reach the Ti4600 performance easily...and exceed it."

My mistake: the Ti4400 also supports dual displays. I must have confused it with another of the models (there are so many of them!). However, with all things equal there's no way the Ti4400 can equal or outperform the Ti4600. It has slower memory (550MHz vs 650MHz), which makes a difference. An overclocked Ti4400 may outperform a stock Ti4600, but that's not a level playing field.
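Ray's 550MHz vs 650MHz comparison translates directly into peak memory bandwidth, since the GeForce4 Ti series uses a 128-bit memory bus and those figures are the effective (DDR) data rates. A quick sketch:

```python
# Peak memory bandwidth from effective memory clock and bus width.
# The GeForce4 Ti series uses a 128-bit memory bus; 550 MHz (Ti4400)
# and 650 MHz (Ti4600) are the effective DDR data rates.

def bandwidth_gb_s(effective_mhz, bus_bits=128):
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(f"Ti4400: {bandwidth_gb_s(550):.1f} GB/s")
print(f"Ti4600: {bandwidth_gb_s(650):.1f} GB/s")
```

That works out to 8.8 GB/s versus 10.4 GB/s, roughly an 18% advantage for the Ti4600 at stock clocks.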


Ray (Cheshire, England).


Scott,

"I just performed my own screen captures to verify my statement, and I have to stand behind it."

Yes, AA is definitely enabled in both of your shots. Maybe I need to set some option to enable it in windowed mode, but to be honest I can live without it, so it doesn't really bother me.


Ray (Cheshire, England).

Guest

I too have the Leadtek WinFast GeForce4 Ti4400. I care about cash, so I did not opt for the Ti4600; in the UK it really is too expensive at the moment. I would strongly advise running with the Ti4400 and then using the spare

Guest Charly

Ray:

I agree with you, but if you buy the Leadtek Ti4400 you can easily get Ti4600 performance, so why spend more money? The Ti4400 comes with exactly the same memory as the Ti4600, so there is little risk in overclocking it. In fact, the memory on the Ti4600 is already near its limit; you can't overclock a Ti4600 very much, but the Ti4400 is another story. In the chart below you can see that the overclocked VisionTek Ti4400 exceeds the overclocked eVGA Ti4600's performance (ALL the cards are overclocked).

http://ftp.avsim.com/dcforum/User_files/3ce32b1a13bde170.jpg

From another review: "nVidia's GeForce4 Ti4400 should not be ignored as some "middle of the road" solution that's neither the best performer or the lowest priced. Our tests show the Ti4400 family of cards to be extremely strong gaming cards that can put up numbers right along side nVidia's own Ti4600, without the extra cost. We've already seen quite a lot of interest in Ti4400-based cards, not only since they're cheaper than Ti4600 cards, but are actually available at many locations, whereas many Ti4600 cards sell out faster than can be kept in stock. If overclocking is all you're interested in, the Leadtek card is certainly your bag of tea. The monstrously large cooling on that card will no doubt give you the best chance at getting your GeForce4 Ti to higher clock speeds."

Regards :-wave
Charly - BAW062


Hi Charly,

You present a convincing argument :-) Pity the Creative doesn't appear on your chart.

All I would say in defence of those who want the best performance without the inherent dangers of overclocking is that the Ti4600 remains the pinnacle of performance (at least for now).

I'm not a fan of overclocking; because the Athlon needs so much more cooling than the Pentium, I would be very reluctant to go down that route. However, I accept that for those who like living on the edge it provides a way of achieving better performance than would otherwise be the case.

I waited three months for a GeForce4 Ti4600 and jumped at it once a supply became available. Supplies of the Ti4400 came later, albeit by only a few weeks here in the UK. The Leadtek is a decent card, but the Creative has also scored well in performance tables. Supplies of the Gainward are virtually non-existent.

Cheers,


Ray (Cheshire, England).

Guest

Hi Ray,

The anti-aliasing isn't working for you in windowed mode because you have the graphics card set to take its instructions from the application. For some reason, when anti-aliasing is turned ON in FS2002 and the graphics card is set that way, AA doesn't work in windowed mode.

To cure this for evermore, with no obvious disadvantage that I've come across, go to: Control Panel > Display > Settings > Advanced > your graphics card's tab > Additional Properties > 3D Anti-Aliasing Settings. When you finally arrive, turn on 'Manually select the anti-aliasing mode' (which will automatically turn off 'Allow applications to control anti-aliasing mode'), then choose what level of AA you want.

FWIW, I can find no difference at all between 2x, 4x and Quincunx with my setup (GeForce3 Ti500). If you want to test this for yourself, I found the easiest way was to turn on 'Display the QuickTweak icon in the taskbar' from the Desktop Settings tab in the same 3D Anti-Aliasing Settings dialog. That lets you swap AA settings at the drop of a hat to see just what difference they make.

Guest

Charly,

Where did you get those benchmarks? I would like to see the rest of the numbers and the write-up.

Scott

Guest Charly

From the AnandTech review:

"With the GeForce4, NVIDIA truly has a winner, and that makes the jobs of the card manufacturers much easier. Without a single card currently available that can touch the performance of either of the solutions featured here today, if you want the best of the best the choices are right here in front of you. Since all the memory is sourced from the same manufacturer (Samsung), the overclocking characteristics are really identical across the entire line of GeForce4 cards at this point. In fact, this comparison gives us a better idea of how the Samsung BGA memory chips overclock, as it's a larger sample size than we've taken in any individual memory comparison before."

http://ftp.avsim.com/dcforum/User_files/3ce477033f2ad53d.jpg
http://ftp.avsim.com/dcforum/User_files/3ce477113f3bc57f.jpg
http://ftp.avsim.com/dcforum/User_files/3ce477233f4c1d26.jpg
http://ftp.avsim.com/dcforum/User_files/3ce477313f68c318.jpg
http://ftp.avsim.com/dcforum/User_files/3ce4780240ad5897.jpg
http://www.anandtech.com/

Regards :-wave
Charly - BAW062

Guest Charly

Ray:

"All I would say in the defence of those who want the best performance but without the inherent dangers of overclocking is that the Ti4600 remains the pinnacle of performance."

Agree with you again! Your Ti4600 is an excellent card. I chose the Ti4400 because I prefer to buy a 256MB DDR DIMM with the rest of the money. Yeah, the Creative Ti4600 is not in the review... my Leadtek Ti4400 isn't either... :'(

Regards :-wave
Charly - BAW062

