Full Version: Video Card Consultation
SWR Productions Forum > SWR Projects > Rise of the Reds
Edsato82
My question is simple: does C&C ROTR support CrossFire or SLI?

I'd appreciate an answer.
Skitt
From what I can find on forums/boards, it may work.
Edsato82
So it may be supported then, thanks!
Zion
You don't need a strong GPU for this game, but you do need a powerful CPU; lag in this game is mostly CPU-based. If the game runs smoothly in the first 30 seconds against an easy army, your GPU is fine.

CrossFire is software-based, while SLI is a physical bridge going on top of the cards. Theoretically SLI should work, and CrossFire will probably work too, but the game just won't take advantage of it.
Storm
QUOTE (Zion @ 24 Oct 2016, 18:57) *
lag in this game is mostly CPU-based

I disagree. It has something to do with the SAGE engine. I'm not aware of what exactly, but it does. When I play (most of the time on my home desktop - Win7 Ultimate), I ensure that only the most essential services are running. Everything else, I kill, including the anti-virus. My system resources are up to 90% free and my CPU usage is below 10%.

Thereafter, I start playing a Skirmish on 6-8 player maps. For the first 30 minutes everything is fine, and then the AI spam begins and there we go with the ultimate lag, most of the time resulting in serious errors. If it were CPU-based, this should not happen.
GeneralCamo
SAGE only uses a single thread on your CPU (it may use a second thread for texture loading, but that may have been optimized since Renegade). A 4-core CPU with Hyperthreading, for example, would only be using around 12.5% of your CPU at max - one of eight logical threads (with a bit of variance for hardware and Hyperthreading's interesting properties, but my point stands).
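That single-thread ceiling can be sketched as a quick back-of-the-envelope calculation (illustrative only; the thread counts below are assumptions, not measurements of SAGE itself):

```python
# Back-of-the-envelope: a single-threaded game can saturate at most one
# logical thread, so overall CPU usage in a task monitor tops out near 1/N.
def max_single_thread_usage(logical_threads: int) -> float:
    """Upper bound on the total CPU % a single-threaded process can show."""
    return 100.0 / logical_threads

# A 4-core CPU with Hyperthreading exposes 8 logical threads:
print(max_single_thread_usage(8))  # 12.5
# The same chip with Hyperthreading disabled (4 logical threads):
print(max_single_thread_usage(4))  # 25.0
```

So a "low" overall CPU reading can still mean one core is completely pegged.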
Zion
QUOTE (Storm @ 24 Oct 2016, 10:09) *
I disagree. It has something to do with the SAGE engine. I'm not aware of what exactly, but it does. When I play (most of the time on my home desktop - Win7 Ultimate), I ensure that only the most essential services are running. Everything else, I kill, including the anti-virus. My system resources are up to 90% free and my CPU usage is below 10%.

Thereafter, I start playing a Skirmish on 6-8 player maps. For the first 30 minutes everything is fine, and then the AI spam begins and there we go with the ultimate lag, most of the time resulting in serious errors. If it were CPU-based, this should not happen.

What you are describing is the CPU getting overloaded; it is your CPU.
Yes, the engine is bad because it's old and doesn't take advantage of all cores, but if you have a modern, high-clocked desktop CPU, running an 8-player map with 7 AI players is not a problem.

For example, my 4.0 GHz CPU runs it well; you must have a lower-clocked or laptop CPU.
Emin96
Even if you get the most expensive PC with the most recent hardware, the game will still lag. This happens for several reasons:
-The game records a replay, which slows it down
-As GeneralCamo said, SAGE uses only one core of your PC
-Pathfinding AI issues
-The game is not compatible with recent hardware.
Edsato82
I have an FX-6300, ASUS M5A97 R2.0, 8 GB RAM at 1333 MHz, an XFX 7870 Core Edition and an Antec 650 Green.

My build is somewhat old, but I see a CrossFire setup will not help :-(
Storm
QUOTE (Zion @ 24 Oct 2016, 21:14) *
What you are describing is the CPU getting overloaded; it is your CPU.
Yes, the engine is bad because it's old and doesn't take advantage of all cores, but if you have a modern, high-clocked desktop CPU, running an 8-player map with 7 AI players is not a problem.

For example, my 4.0 GHz CPU runs it well; you must have a lower-clocked or laptop CPU.


Yes, my CPU is around 2 years old and runs at 2.5 GHz, which can be termed a high-clocked CPU... no? However, I have 8 GB of RAM, though I'll need to check its speed. Could you please elaborate on how the CPU is responsible for the game lag? I'm a software guy, and though I have basic-to-average knowledge of hardware, I'd like to know more about which parts of a PC might be accountable for the performance of older games like ZH.

Moreover, I use my laptop only for office/work purposes. It's an Acer Aspire ultralight TimelineX. To add to it, I upgraded it to Windows 10 a few months back, so playing this game on that OS is out of the question.

QUOTE
-The game records a replay, which slows it down

Is there any way to disable this recording at its root in the game, if it causes lag?

QUOTE
-Pathfinding AI issues

You mean, the set of instructions coded for a unit to follow a particular path in the game?

One question: does this game, or to be precise the ROTR mod, make use of virtual functions, coded in an IDE like VC++? Or is FinalBig the main tool used to script this game?
Skitt
ok, this game will run on a damn toaster.

For ROTR,
and this is if you want to eliminate the hardware side of the lag, so the only lag you will get will be either connection or SAGE itself.

Lowest graphics settings:
Minimum:
CPU: Core 2 Duo E4500
RAM: depends on OS. On Windows XP, 2 GB DDR2; on Windows 7/8/10, 4 GB DDR2; on Windows Vista, 6 GB (the OS takes a stupid amount of RAM)
GPU: any graphics card made in 2004 or after

As for the SAGE lag, it's down to objects on the map as well as scripts firing every frame when AI is involved.
Obviously, the more AIs you have on the map, the more objects there are and the more scripts fire.
The level of detail on the map counts as well (World Builder starts to complain if it hits 3000 objects).
PvP lag is generally down to:
A: players having their graphics settings higher than they should.
B: the connection speed of one of the players.
C: low-spec hardware.

If you are getting high CPU usage from ROTR on minimum settings, I strongly suggest you run tests to check your CPU's health, or upgrade.
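The scaling described above can be put into rough numbers. This is a hypothetical cost model, not actual SAGE internals; every constant in it is made up for illustration:

```python
# Hypothetical model: each AI's scripts re-check their trigger conditions
# every frame, and many conditions scan the map's object list, so work per
# frame grows roughly as ai_count * scripts_per_ai * objects.
def per_frame_script_work(ai_count: int, scripts_per_ai: int, objects: int) -> int:
    """Rough per-frame work units for AI scripting (illustrative only)."""
    return ai_count * scripts_per_ai * objects

# One AI vs seven AIs on a 3000-object map, assuming ~50 scripts each:
print(per_frame_script_work(1, 50, 3000))  # 150000
print(per_frame_script_work(7, 50, 3000))  # 1050000, seven times the work
```

Since this work repeats every frame on a single thread, adding AI players multiplies the load directly, which matches the "more AIs, more lag" observation.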
M.P
QUOTE (Emin96 @ 24 Oct 2016, 22:16) *
Even if you get the most expensive PC with the most recent hardware, the game will still lag. This happens for several reasons:
-The game records a replay, which slows it down
-As GeneralCamo said, SAGE uses only one core of your PC
-Pathfinding AI issues
-The game is not compatible with recent hardware.

Well, I know that the second and fourth ones are correct, but where did you get info about the other two? :|

I'd like to see your source.
Graion Dilach
What is this nonsense about "game not compatible with recent hardware"? It is compatible, otherwise it couldn't even run. Hardware compatibility is a black/white thing, not something with fallbacks (true, you can emulate hardware through software, but there's nothing to emulate for Generals from a 21st-century PC's point of view).

2.5 GHz is slow. I'd even call everything less than 3 GHz slow. For a good while now, CPUs have been optimized towards pulling more complex instructions within a cycle, which means only programs compiled to support those complex instructions are capable of benefiting from them - this does not mean new hardware is incompatible with old programs, but old programs don't benefit from the new instruction sets*. (To put it more bluntly, all 2.5 GHz cores will run Generals the same, be it any family of Intel or AMD chip, since all it cares about is the plain old Pentium III instruction set, which they all have and all are compatible with.) Individual core speed is still important though (even if people tend to overlook it these days due to new brandings), since that applies to all programs regardless of optimizations (or the lack thereof).

Pathfinding is definitely a problem and a cause of lag, since it's among the most complex tasks there is (especially when you don't have a cell system or similar, since that would simplify it - the reason why a lot of games in the DOS/Win98 era went with one).
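To illustrate why a cell system simplifies pathfinding (a generic sketch, not SAGE's actual code): once the map is divided into cells, a plain breadth-first search finds a shortest route with bounded work per cell, instead of searching continuous space.

```python
from collections import deque

# Generic grid BFS: a cell system turns pathfinding into a bounded search,
# visiting each cell at most once. Nothing here is SAGE code.
def shortest_path_length(grid, start, goal):
    """grid: list of strings, '#' blocks movement. Returns steps or -1."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return -1  # goal unreachable

grid = ["....",
        ".##.",
        "...."]
print(shortest_path_length(grid, (0, 0), (2, 3)))  # 5
```

Without a cell discretization, each unit needs far more expensive geometric searches, which is the scaling problem described above.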

EDIT: * - The above holds even for the "64-bit broke DOS" issue - all PC CPUs can support 8- or 16-bit instructions in legacy 32-bit mode, however that means they cannot utilize the 64-bit features in said mode (like the 4 GB barrier - which itself isn't correct either, but that's a loooong story in itself; google PAE if interested). AMD decided it would bring more benefits in the long run to repurpose 8/16-bit instruction codes for 64-bit ones than to hack over the instruction set again (note that basically all 32-bit instructions start with a repurposed error code IIRC, because that was the sole unused 8086 instruction they could use as a prefix). So when your legacy program crashes on 64-bit OSes but works fine on 32-bit OSes on the same system, it's because somewhere deep down it tried to set the CPU to 16-bit mode.
Emin96
QUOTE (M.P @ 25 Oct 2016, 12:45) *
Well, I know that the second and fourth ones are correct, but where did you get info about the other two? :|

I'd like to see your source.

The source is the internet, around the ZH forums.
Zion
Guys, I have a 4 GHz processor, and this thing doesn't drop a frame on Intel graphics. Enough with all your stories of pathfinding and the game being incompatible with hardware.
A 6700K treats this game like a joke. Yes, IF the game supported multicore better it would run better "for you", but the reality is your PC is just bad.

*drops mic*
Storm
QUOTE (Graion Dilach @ 25 Oct 2016, 19:54) *
2.5 GHz is slow. I'd even call everything less than 3 GHz slow. For a good while now, CPUs have been optimized towards pulling more complex instructions within a cycle, which means only programs compiled to support those complex instructions are capable of benefiting from them - this does not mean new hardware is incompatible with old programs, but old programs don't benefit from the new instruction sets*. (To put it more bluntly, all 2.5 GHz cores will run Generals the same, be it any family of Intel or AMD chip, since all it cares about is the plain old Pentium III instruction set, which they all have and all are compatible with.) Individual core speed is still important though (even if people tend to overlook it these days due to new brandings), since that applies to all programs regardless of optimizations (or the lack thereof).

EDIT: * - The above holds even for the "64-bit broke DOS" issue - all PC CPUs can support 8- or 16-bit instructions in legacy 32-bit mode, however that means they cannot utilize the 64-bit features in said mode (like the 4 GB barrier - which itself isn't correct either, but that's a loooong story in itself; google PAE if interested). AMD decided it would bring more benefits in the long run to repurpose 8/16-bit instruction codes for 64-bit ones than to hack over the instruction set again (note that basically all 32-bit instructions start with a repurposed error code IIRC, because that was the sole unused 8086 instruction they could use as a prefix). So when your legacy program crashes on 64-bit OSes but works fine on 32-bit OSes on the same system, it's because somewhere deep down it tried to set the CPU to 16-bit mode.

Well, that was some excellent insight.

Most of the CPUs today, as you mentioned above, are equipped with Intel Pentium instruction sets, and Intel only makes CISC-based instruction sets as far as I know. In years past, I owned a Motorola CPU (RISC, with a Unix base), and those processors were proven to be quite good for gaming and other customized applications. It's a shame that, for lack of application software for RISC-based instruction sets, Windows, Sun Solaris and Novell's OS/2 ignored RISC chipsets universally.
Storm
QUOTE (Skitt @ 25 Oct 2016, 13:07) *
ok, this game will run on a damn toaster.

For ROTR,
and this is if you want to eliminate the hardware side of the lag, so the only lag you will get will be either connection or SAGE itself.

Lowest graphics settings:
Minimum:
CPU: Core 2 Duo E4500
RAM: depends on OS. On Windows XP, 2 GB DDR2; on Windows 7/8/10, 4 GB DDR2; on Windows Vista, 6 GB (the OS takes a stupid amount of RAM)
GPU: any graphics card made in 2004 or after

As for the SAGE lag, it's down to objects on the map as well as scripts firing every frame when AI is involved.
Obviously, the more AIs you have on the map, the more objects there are and the more scripts fire.
The level of detail on the map counts as well (World Builder starts to complain if it hits 3000 objects).
PvP lag is generally down to:
A: players having their graphics settings higher than they should.
B: the connection speed of one of the players.
C: low-spec hardware.

If you are getting high CPU usage from ROTR on minimum settings, I strongly suggest you run tests to check your CPU's health, or upgrade.

Did you forget to mention the speed of the CPU under the minimum settings required? Because if the speed is below 3 GHz, everything you said about the objects and scripts firing every frame would make no sense. I have a top-of-the-line graphics card with 4 GB of VRAM, but it's useless where ROTR lag is concerned, because the culprit is the speed of my desktop CPU.
Graion Dilach
QUOTE (Storm @ 25 Oct 2016, 19:40) *
Most of the CPUs today, as you mentioned above, are equipped with Intel Pentium instruction sets, and Intel only makes CISC-based instruction sets as far as I know. In years past, I owned a Motorola CPU (RISC, with a Unix base), and those processors were proven to be quite good for gaming and other customized applications. It's a shame that, for lack of application software for RISC-based instruction sets, Windows, Sun Solaris and Novell's OS/2 ignored RISC chipsets universally.


Uhm... this is wrong. RISC was never ignored - a lot of tablets and the like use ARM CPUs, which have a RISC architecture. Also, due to modernization, the line between RISC and CISC designs is blurry by now - the latest ARM CPUs are quite complex themselves. It's not on the PC market, though, for really, really obvious reasons: it would make all PC software useless (excluding software that is completely abstracted into a virtual environment, like .NET or Java), since all the instructions would need to be emulated (.NET is probably the sole reason why the Windows Store can ship the same application to all platforms). This is the main reason why PowerPC Apples functioned completely differently as well (although since Apple keeps its hardware consistent, testing different hardware combinations is unnecessary compared to the PC).

Motorola also produced CISC processors in the form of the 68000. I think that's what you had, actually - that's their most successful chipset after all.

The x86 derivatives grew into a giant monolithic jack-of-all-trades, and tbh the fact that they stood the test of time so well should be praised (the first x86 processor is from 1978, after all). It's inefficient due to feature creep and the requirement of maintaining legacy support, not because it's a CISC architecture.
Storm
QUOTE (Graion Dilach @ 26 Oct 2016, 3:17) *
Uhm... this is wrong. RISC was never ignored - a lot of tablets and the like use ARM CPUs, which have a RISC architecture. Also, due to modernization, the line between RISC and CISC designs is blurry by now - the latest ARM CPUs are quite complex themselves. It's not on the PC market, though, for really, really obvious reasons: it would make all PC software useless (excluding software that is completely abstracted into a virtual environment, like .NET or Java), since all the instructions would need to be emulated (.NET is probably the sole reason why the Windows Store can ship the same application to all platforms). This is the main reason why PowerPC Apples functioned completely differently as well (although since Apple keeps its hardware consistent, testing different hardware combinations is unnecessary compared to the PC).

Motorola also produced CISC processors in the form of the 68000. I think that's what you had, actually - that's their most successful chipset after all.

The x86 derivatives grew into a giant monolithic jack-of-all-trades, and tbh the fact that they stood the test of time so well should be praised (the first x86 processor is from 1978, after all). It's inefficient due to feature creep and the requirement of maintaining legacy support, not because it's a CISC architecture.

Hi. Something more to learn. Thank you!

No no... it was the 88000, a 32-bit one, that I remember well. It was like a PowerPC with a RISC architecture based on IBM's POWER architecture. In fact, I did not buy that PC; I inherited it from one of my relatives (who lived in Belgium) when they passed away.
Skitt
QUOTE (Storm @ 25 Oct 2016, 19:15) *
Did you forget to mention the speed of the CPU under the minimum settings required? Because if the speed is below 3 GHz, everything you said about the objects and scripts firing every frame would make no sense. I have a top-of-the-line graphics card with 4 GB of VRAM, but it's useless where ROTR lag is concerned, because the culprit is the speed of my desktop CPU.

It doesn't matter what CPU you have, be it a toaster or top of the line:
too many scripts/objects and you are going to lag. It's not something that can be fixed, only controlled.
Storm
QUOTE (Skitt @ 26 Oct 2016, 10:40) *
It doesn't matter what CPU you have, be it a toaster or top of the line:
too many scripts/objects and you are going to lag. It's not something that can be fixed, only controlled.

Although considering what Graion & Zion mentioned above about the CPU speeds, this makes sense.
Skitt
Another example of the game causing lag:
weather effects. They're so horribly optimized it's ridiculous.
I'm running:
GTX 650 Ti
12 GB RAM
i3 @ 3.3 GHz

Snow weather causes a 20 FPS drop for me.
Heavy rain/lightning effects on one of the 4-player FFA maps nearly lag me out.

The same system can run the likes of Shadow of Mordor / Fallout 4 / BioShock Infinite on ultra at a constant 30 FPS.
Storm
Yeah, I agree. I seldom play those maps with rain and/or falling snow. However, some snow maps do give me a good response in both ROTR & ShW, like "Siberian Dawn". You must know there is this certain map called "Lone Star State". Even leaving aside the spam built up by the AI, within the first 10 minutes in the game, irrespective of whether it is ROTR or ShW, the lag bombs the game, and within 20 to 30 minutes a serious error is guaranteed! No idea what it is about that map; it does not work properly, not even once in a blue moon.
ZunZero97
This game is old, likely 13 years (C&C Generals ZH, of course), so new hardware can't even run this game properly.
Skitt
wrong


new hardware CAN run this game
Graion Dilach
I've even explained it already some posts ago. Some fail at reading.

Regarding weather effects: a lot of engines tend to render them strictly as a post-effect on the GPU, but judging from the above I guess SAGE isn't among them. (Which is plausible, knowing how WW couldn't handle particles properly in RA2 either.)
X1Destroy
I fail to see the point of upgrading your computer if you only play ancient games from 2008 and earlier. Not only do you have to waste time tweaking it to work the way it's supposed to, it also can't use the available resources to their maximum potential. Just buy a new one for work and newer games.

Zion
QUOTE (Skitt @ 26 Oct 2016, 6:08) *
Snow weather causes a 20 FPS drop for me.
Heavy rain/lightning effects on one of the 4-player FFA maps nearly lag me out.

The same system can run the likes of Shadow of Mordor / Fallout 4 / BioShock Infinite on ultra at a constant 30 FPS.

Lol bring that poor computer indoors skitt xD xD
GeneralCamo
Do not mistake the inability to use newer features on newer hardware for the inability to run on newer hardware at all. I mentioned the lack of multi-core support. This means the game can't use more than one thread (normally), but it will still run fine on newer CPUs. Just don't expect your 24-core server CPU to be used to its fullest extent with this game.

Same with the GPU. The game uses DX8; don't expect features from DX12 to be in this game, but that doesn't mean the hardware can't run it. The hardware runs DX8 just fine, it just may not support many of the newer features like tessellation, etc.
{Lads}RikerZZZ
QUOTE (Zion @ 27 Oct 2016, 15:23) *
Lol bring that poor computer indoors skitt xD xD


That's a quality shitpost.
ZunZero97
QUOTE (Skitt @ 27 Oct 2016, 4:18) *
wrong


new hardware CAN run this game

Ahh, the ROTR mod. My mistake.

But generally, even if someone had the latest computer on the planet, the mod would still run at 30 FPS with drops - even on a NASA computer.
Storm
QUOTE (Graion Dilach @ 27 Oct 2016, 13:00) *
(Which is plausible knowing how WW couldn't handle particles properly in RA2 either)

What are WW (World of Warcraft?) and RA2?
Pardon my ignorance. I'm a fan of only a limited number of games.
Arc
WW is Westwood.
Tobę
RA2 is (Command and Conquer) Red Alert 2.

Not sure why you didn't google it. Nevertheless, I don't mind.
Storm
QUOTE (Tobę @ 28 Oct 2016, 16:02) *
Not sure why you didn't google it. Nevertheless, I don't mind

I did not put a gun to your head and demand an answer. If you mind, keep away; but apparently you don't mind, so why the double talk?
Again, you don't need to answer my question. Thank you.