How does F3 perform on your system?

ternarybit

First time out of the vault
My machine:
Pentium 4 3.2GHz Prescott, 9600GT OC, 4GB OCZ PC2-6400 on XP Pro/DX9c.

F3 settings:
1680x1050, 4xAA, 4xAF, all other settings on "high." No vsync.

Just playing around outside Megaton, fighting the critters and doing other such simple gameplay, yields 15-20fps. Loading times are very fast. I'm a sucker for eye candy, but I know my specs aren't top-of-the-line, so I decided to lower resolution & graphics settings in hopes of hitting at least 30fps. To my surprise, going to 1440 res & medium settings improved my framerates by no more than 2-5fps!

My questions: what are your specs and average performance? Might upgrading my aged Prescott improve framerates? Have any of you found a good 'sweet spot' between performance and visual appeal?

Cheers! :clap:
 
Yes, you'd benefit from a faster CPU. A cheap C2D should be sufficient to get the full benefit of your 9600GT.

That said, I think you're overly optimistic about your card's capability. You'll always have half-assed framerates if you insist on such a high resolution and 4xAA. Very likely you could get a stable 40-ish FPS by dropping AA and lowering your resolution to 1280x800 - assuming you get a new CPU.

My Q6600, 2GB RAM, 8800GTX, WinXP+SP2 machine runs the game at 1920x1200, no VSync, no AA, 8xAF and game settings on the "High" preset, at a fairly stable 60FPS.

- But I don't know how helpful that is to you. Our machines are very different.
 
Fallout 3 has terrible multicore support (just like Oblivion).

With all threaded variables enabled, I lose a total of (you guessed it) zero fps by setting F3 to a single core.
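For anyone who wants to repeat the test, these are the kind of threading switches I mean - the usual Gamebryo ones in Fallout.ini (My Documents\My Games\Fallout3). The exact names below are carried over from Oblivion-era tweak guides, so take them as a sketch rather than gospel:

[General]
bUseThreadedAI=1
iNumHWThreads=2

Then pin the game to one core (Task Manager > right-click Fallout3.exe > Set Affinity) and compare framerates.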

Oh, and vsync is stuck on until you force it off through your driver control panel or with third-party tools, so that's probably contributing to your low fps.
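There's also an ini route - assuming F3 kept the same Gamebryo setting Oblivion used, which I'd treat as an educated guess rather than a guarantee. In Fallout.ini / FalloutPrefs.ini (same My Documents\My Games\Fallout3 folder), back the file up first and set:

[Display]
; 0 = vsync off, 1 = vsync forced on (assumed carryover from Oblivion)
iPresentInterval=0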
 
Q6600, 9600GT, 3GB DDR2-667, Vista 32-bit, everything at stock. I get 50-80 FPS in most situations with most settings maxed (no AA/AF) at 1440x900, sometimes dropping to about 30 in large battles.

Try turning off AA; it's a huge performance drain. If that doesn't help, a newer CPU seems to be in order.
 
I KNEW IT! Thank you, phil! I turned off vsync in the game settings but didn't notice any screen tearing, so I suspected it was still on; I just hadn't tried disabling it manually. This should help a lot!

And thanks to the rest of you for your insight. I want to wait until after the holidays to get a Q9550, 'cause I know the price will drop once i7 gets into gear.

As far as scaling back settings, like I said, I tried no AA/AF, lower res, and this only improved my fps by 2-5 (for a total of 25fps max). So I figure if I'm at ~20 no matter what, might as well get the eye candy. I would rather have the game look gorgeous and a bit slower than atrocious and 60fps. It just seems that I can't find a good middle ground where things look fairly good at about 30-35fps. I just put a DuOrb on my 9600GT so hopefully I can squeeze a few more MHz out of it. I will let you know how it goes.

Thanks again! :mrgreen:
 
Vista 64-bit Business
AMD 6000+ dual core
Nvidia 8600GT, 1 gig
6 gigs RAM

Not one crash or stutter yet, and I'm nearly 60 hours into gameplay.
Also playing at full high quality.
 
C2D clocked at 3.5GHz
8800GTXs in SLI
2 gigs of RAM with XP.

Runs the game at Ultra (except in VATS) with no problem.

I run the game at all high settings and never dip below 60 fps with everything enabled. The VATS issue is annoying; I could run everything at Ultra if it weren't for that bug.
 
Terrible.

No shadows, and the lowest resolution possible so I may benefit from a stable fps.
 
I have everything maxed on my laptop and haven't seen any lag at all, even during Enclave firefights.
 
ternarybit said:
As far as scaling back settings, like I said, I tried no AA/AF, lower res, and this only improved my fps by 2-5 (for a total of 25fps max).
I didn't suggest a cheap C2D for the sake of multithreading; Oblivion barely supports it. But any C2D core is much, much faster than what you have - fast enough for Oblivion. And C2Ds are dirt cheap these days; you ought to be able to manage the upgrade for about the same as the game cost you.
 
I second that. I have one of the slowest C2Ds (an E6300 at 1.86GHz), an nVidia 8800GT and 2GB RAM, and I can run it at 1280x1024 with "Ultra" settings with AA and AF.
It's not just the multiple cores; the newer architecture is also much faster per clock than older processors. The Pentium 4 can't hold a candle to it.
 
Buxbaum666 said:
I second that. I have one of the slowest C2Ds (an E6300 at 1.86GHz), an nVidia 8800GT and 2GB RAM, and I can run it at 1280x1024 with "Ultra" settings with AA and AF.
It's not just the multiple cores; the newer architecture is also much faster per clock than older processors. The Pentium 4 can't hold a candle to it.

Wow. I never expected a CPU to weigh so heavily on the performance of a game. I suppose what you say makes sense though, because there is NO combination of settings that will bump my fps past 25 max. I have tried OC'ing my card, dropping all settings/res, everything. I know a 9600GT isn't exactly a GTX280, but it's not a terrible card either. Thanks a lot guys, now I don't have to go insane thinking I got some ridiculous lemon GPU!
 
I have everything set on maximum settings.

C2Q @ 2.4GHz
4GB DDR2
8800GTX x 2 (SLI)
Windows XP Pro & Vista Ultimate 64-Bit
 