Searching Google and this forum for my problem didn't turn up any reliable solutions.
I hope some replies in this topic will enlighten me as to the cause of the issue, and maybe offer some advice.
I recently got my hands on a new machine:
Intel Core i5 760 BOX
ASUS P7P55 LX
Nvidia GT 430 1GB / DX11
2x2GB DDR3 RAM 1600
All latest drivers
Now, I know this CPU doesn't ship with the best stock cooler around, but I didn't overclock the CPU, and the cooler performs well when the need arises.
The usual idle temperature [according to Real Temp from TechPowerUp] is around 38-45°C. While playing The Witcher, STALKER, Gothic 3 and a few others, the CPU temperature only rarely went beyond 70°C; normally it stays in the 60-70°C range, which is okay.
When I load Fallout 1, however, things change. [Please note that I tried two installations: one from the game disc and one in a separate folder from the Fallout Revived Installer, and I applied the High Resolution Patch to the disc installation.]
At some point, both installations, but mostly the one from the game disc, pushed the CPU temperature up to 95°C on Core 0, and to about 70-80°C on the other 3 cores.
I looked at the Performance tab, and Core 0 was loaded at 100% the whole time, which led me to believe this is the cause of my problem.
After this, I installed the same games on 2 other machines, and got the results as follows:
- Athlon 64 X2 Dual Core 3600+: Core 1 was loaded at 100%, same as on my machine, while the other core remained at about 6-7%, but the temperature didn't go above 60°C;
- Intel Core 2 Duo E4500: the temperature went from ~38°C at idle to ~55°C after launching the game, on both cores, with Core 0 slightly higher.
I ran some other older games on the same machines [Diablo, Baldur's Gate] and nothing unusual happened.
This is rather intriguing, since I know rendering in older games (and older versions of DirectX) relies mostly on CPU power, but I don't know how to put a frame limiter on Fallout 1, and I'm not even sure it would help, since this happens only with Fallout.
Now, my questions are:
- Has anyone else encountered this problem? If yes, what did you do?
- I know I should get another cooler, and I intend to do so in the more or less near future, but are there any workarounds for this in the meantime? I mean, for keeping CPU core usage below 100% at all times?
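One workaround I've been considering (untested, so take it as a sketch): since the heat is concentrated on Core 0, the game could be pinned to other cores with cmd's `start /affinity <hexmask>`, which takes a hexadecimal bitmask of allowed cores (bit N set = core N allowed). The executable name below is just my guess at Fallout's binary. A quick illustration of how that mask is computed:

```python
# Build the hex affinity mask that `start /affinity` expects:
# bit N set -> the process may run on core N.
def affinity_mask(cores):
    mask = 0
    for c in cores:
        mask |= 1 << c  # set the bit for this core
    return mask

# Allow cores 1-3, keep the process off the hot Core 0:
mask = affinity_mask([1, 2, 3])
print(hex(mask))  # 0xe  ->  start /affinity e falloutw.exe
```

This wouldn't lower the 100% load itself, just move it off Core 0; only a proper frame limiter would actually reduce the load, as far as I understand.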
Thank you in advance for taking the time to read this, and for any help provided. If you need more info about my machine or anything else you deem important, please let me know, and I'll be happy to oblige.
Looking forward to your replies.
Sincerely yours, S.