Bridge Commander Central
BC Forums => BC General => Topic started by: sona1111 on January 03, 2011, 11:32:05 PM
-
Hello community!
I may be new to this forum, but ST:BC has been entertaining me for many years. I have many mods for it currently, including NanoFX, KM, BC scripts, etc. My goal is to be able to play this without getting the massive lag, and NOT by turning down the settings!
Yes, I am a computer nerd of sorts, and while my computer is far from the best, I do know what I am doing if I need to upgrade. I have been thinking about getting a new video card to replace my 8800 GT for a little while; however, when I was looking into a little overclocking to test how far the current card would go, I came across a very strange problem. The "GPU usage" readout in the EVGA Precision monitoring program does not go up AT ALL when BC is "lagging" hard. All of the work is put on the CPU, which I am sure is the main thing causing the lag; maybe I do not even need to upgrade!
Anyway, in short: what is the best way to get Star Trek: Bridge Commander to use my GPU properly instead of offloading everything onto the CPU?
-
Probably by rebuilding the game engine or waiting for the "Excalibur" version. I bet the processing load is hardcoded into the engine. Don't forget, this game was made a decade ago, for and on a Windows 9x-era OS.
-
Probably by rebuilding the game engine
Can't modify the exe, unfortunately... (also illegal)
or waiting for the "Excalibur" version.
To clarify: there isn't going to be an "Excalibur version" of BC. Star Trek: Excalibur is its own game on its own engine, built from scratch.
-
So is there some kind of program that lets you use a GPU as another CPU core for a certain process, or am I forced to downgrade the GPU or upgrade the CPU?
-
Even if you could find one, BC's engine would probably ignore it.
Jimmy's not kidding about rebuilding the game engine, either: the source has never been released, so it can't BE fixed.
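For what it's worth, GPU compute only happens when the program itself asks for it. Here's a quick illustrative sketch (it assumes NumPy plus the CuPy library and an NVIDIA GPU, and has nothing to do with BC): the same sum runs on the GPU only because the code explicitly calls a GPU library, which is exactly what a closed, compiled exe like BC will never do on its own.
[code]
import numpy as np

data = np.random.rand(1_000_000).astype(np.float32)

# CPU path: plain NumPy, runs on the CPU.
print("cpu sum:", data.sum())

# GPU path: same math, but only because this code explicitly
# imports a GPU library and copies the data to the card first.
# No external tool can inject this into a closed exe.
import cupy as cp
gpu_data = cp.asarray(data)               # host -> device copy
print("gpu sum:", float(gpu_data.sum()))  # this kernel runs on the GPU
[/code]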
-
It can be summed up like this: BC's engine is very old, lol. BC doesn't even know multicore exists. There were attempts in the past to achieve that (getting BC to use multiple cores), with one small problem, however: Python's global interpreter lock (GIL), which only lets one thread execute Python bytecode at a time.
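To illustrate the GIL problem, here's a minimal standalone sketch in ordinary CPython (nothing BC-specific; the workload size is made up for the demo): two threads doing CPU-bound work take roughly as long as doing it sequentially, because the GIL serializes the bytecode.
[code]
import time
import threading

def burn(n):
    # CPU-bound busywork: pure Python arithmetic holds the GIL.
    total = 0
    for i in range(n):
        total += i * i
    return total

N = 5_000_000

# Run the work twice sequentially.
start = time.perf_counter()
burn(N)
burn(N)
print("sequential:", time.perf_counter() - start)

# Run the same work in two threads: on CPython this is NOT
# about twice as fast, because only one thread runs at a time.
start = time.perf_counter()
t1 = threading.Thread(target=burn, args=(N,))
t2 = threading.Thread(target=burn, args=(N,))
t1.start(); t2.start()
t1.join(); t2.join()
print("threaded:  ", time.perf_counter() - start)
[/code]
(multiprocessing can sidestep the GIL by using separate processes, but you can't just bolt that onto BC's embedded interpreter.)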
-
Check the graphics card setting in BC's config menu. I've found that the default is for BC to run in Transform and Lighting (T&L) mode on the card, which, on my system at least, can cause lag; change it back to the standard GPU mode. (You'll have to change it every time you start the program; the setting doesn't stick.) This might help.
-
I have evidence that it uses the GPU at least a little. When textures are not power-of-two sizes, for example 2048x2048 scaled up to 2049x2049, there is massive lag plus white surfaces, which means the GPU is being used: GPUs constrain texture dimensions like that so the hardware can be spent on raw floating-point throughput instead. If rendering were done only on the CPU, textures like 2049x2049, 895x789, etc. would show up in-game just fine.
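If you want to check your own mod folders for that, here's a hedged little script (it assumes the Pillow imaging library is installed, and the folder path is just an example, not BC's actual layout) that flags images whose width or height isn't a power of two:
[code]
import os
from PIL import Image  # pip install Pillow

def is_power_of_two(n):
    # 2048 -> True, 2049 -> False
    return n > 0 and (n & (n - 1)) == 0

TEXTURE_DIR = "data/Textures"  # example path, adjust to your install

for root, _, files in os.walk(TEXTURE_DIR):
    for name in files:
        if not name.lower().endswith((".tga", ".dds", ".png", ".jpg")):
            continue
        path = os.path.join(root, name)
        try:
            with Image.open(path) as img:
                width, height = img.size
        except OSError:
            continue  # unreadable or unsupported file, skip it
        if not (is_power_of_two(width) and is_power_of_two(height)):
            print("non-power-of-two:", path, "(%dx%d)" % (width, height))
[/code]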
-
Another good example, this time for textures that aren't HUGE.
-
Aces: the only thing I get when I try to access that stuff in-game is a reiteration of the GPU's name.
And we all know that "huge" is a matter of opinion when it comes to graphics cards.