Fallout 1 - CPU temperature problem

kojocel

First time out of the vault
Googling and searching this forum didn't turn up any reliable solutions to my problem.
I hope some replies in this topic will enlighten me as to the cause of the issue, and maybe offer some advice.

I recently got my hands on a new machine:

Intel Core i5 760 BOX
ASUS P7P55 LX
Nvidia GT 430 1GB / DX11
2x2GB DDR3 RAM 1600
All latest drivers

Now, I know this CPU doesn't come with the best stock cooler around, but I haven't overclocked the CPU, and the cooler performs fine when the need arises.
The usual idle temp [according to RealTemp from TechPowerUp] is around 38-45°C. While playing The Witcher, STALKER, Gothic 3 and a few others, the CPU temp only rarely went beyond 70°C; normally it stays in the 60-70°C range, which is okay.

When I load Fallout 1, however, things change. [Please note that I tried installing the game both from the game disc and, in a separate folder, from the Fallout Revived installer, and that I used the High Resolution Patch on the copy installed from the disc.]
At some point, both installs, but mostly the one from the game disc, pushed the CPU temp up to 95°C on Core0, and to about 70-80°C on the other three cores.
I looked into the performance tab, and Core0 was at 100% the whole time, which led me to believe that this is the cause of my problems.

After this, I installed the same game on 2 other machines, and got the following results:
- Athlon 64 X2 Dual Core 3600+: Core1 was loaded to 100% same as on mine, while the other stayed at about 6-7%, but the temperature didn't go above 60°C,
- Intel Core 2 Duo E4500: temp went from ~38°C at idle to ~55°C upon launching the game, on both cores, with Core0 slightly higher.

I ran some other older games on the same machines [Diablo, Baldur's Gate] and nothing unusual happened.
This is rather intriguing. I know that rendering in older games (and older versions of DirectX) relies mostly on CPU power, but I don't know how to put a frame limiter on Fallout 1, and I don't even know whether that would help, since Fallout is the only game affected.

Now, my questions are:
- Has anyone else encountered this problem? If yes, what did you do?
- I know I should get myself another cooler, and I intend to in the more or less near future, but are there any workarounds in the meantime? I mean, something to keep the CPU core usage from sitting at 100% all the time?

Thank you in advance for taking the time to read this, and for any help provided. If you need more info about my machine or anything else you deem important, please feel free to ask, and I'll be happy to oblige.

Looking forward to your replies.
Sincerely yours, S.
 
You never mentioned your OS? It could be that that's the problem.
Judging by that sweet rig, I'm assuming Win7 x64?
Just a thought; I don't know how or why it would do that, tho'.
Tried running it in XP compatibility mode?
 
Yes, I've got Windows 7 Ultimate 64-bit.
But I don't think that's the cause, really, since the other two machines I ran the game on have the same OS, yet they performed normally.

I reseated the CPU, only to get the same results. I'm expecting Arctic Silver 5 paste this week to use on the CPU anyway, so I hope I'll get better temps.

Regards, S.

p.s. Unbelievable - I'm doing all this for a 10-year-old game.
 
You could install the DOS version of Fallout and run it in DOSBox. The abstraction layers that come into play should eliminate your problem, and in theory you could take advantage of DOSBox's screen scaling.

You won't be able to install most third-party patches, though.
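If you do try the DOSBox route, you can also cap the emulated CPU speed so the host core never gets pegged. Something like this in dosbox.conf should work; the cycle count here is just a guess and would need tuning for Fallout, not a known-good value:

```
[cpu]
core=normal
# "fixed" pins emulation to a set number of cycles per millisecond
# instead of the "auto"/"max" modes that can eat a whole host core.
cycles=fixed 20000
```

Lower the number if the host core still runs hot; raise it if the game stutters.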
 
Fallout always caused 100% CPU usage for me; it's written for Win95, when power management was basically non-existent. Try loading Win95 in VirtualBox: it will also sit at 100% CPU all the time...
 