Forums (http://www.abandonia.com/vbullet/index.php)
-   Tech Corner (http://www.abandonia.com/vbullet/forumdisplay.php?f=23)
-   -   Little insight required about "modern" computer industry (http://www.abandonia.com/vbullet/showthread.php?t=29082)

Japo 21-09-2012 05:39 PM

Quote:

Originally Posted by Eagle of Fire (Post 446162)
32 gigs of ram

That's a LOT! Even nowadays, hardly anything short of very high-end machines built for specific professional work packs more than 8 GB. But if you want that much, good for you; I always say nobody can have too much RAM. For the medium or long term I'd certainly recommend 16 GB. As I said, it shouldn't be much more than $10 per GB, but that means 32 GB could cost over $300!

Quote:

Originally Posted by Smiling Spectre (Post 446168)
XP itself can work with multiple cores - but it's the programs that must use them. Any modern program can utilize multiple cores on XP, but if, say, System Shock 2 was never aware of several cores, it will not use them, no matter what. :) Same for other games of that era.

What I heard about SS2 was that it tends to crash on computers with multiple cores, and that it can be solved by confining it to one of them in the Task Manager, or with the DDFix community mod/patch. I don't know if that's true, but if it were, the only way a program can be affected by the number of cores is by creating several threads on purpose.

jonh_sabugs 21-09-2012 06:01 PM

The multi-core problem on XP for SS2 is real, and it affects several other old games too, like Fallout 2, among others. I don't know the exact reason either; however, it seemed to be related to an error in how XP distributed work among the cores.

RRS 21-09-2012 06:58 PM

This problem is even older than multi-core: hyper-threading already started to interfere with older games. I recall using the "Set Affinity" option in Task Manager (WinXP) for Thief, which otherwise hung.
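
For the curious: what that Task Manager option does can also be done from a tiny launcher. A minimal sketch using the standard Win32 calls; "thief.exe" is only a placeholder path for illustration, not where the game actually lives:

Code:

#include <windows.h>

int main()
{
    STARTUPINFO si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };

    // "thief.exe" is a placeholder; start the game suspended.
    if (!CreateProcess(TEXT("thief.exe"), NULL, NULL, NULL, FALSE,
                       CREATE_SUSPENDED, NULL, NULL, &si, &pi))
        return 1;

    // A mask of 0x1 restricts the process to the first logical CPU,
    // which is what Task Manager's "Set Affinity" does after the fact.
    SetProcessAffinityMask(pi.hProcess, 1);

    ResumeThread(pi.hThread);   // now let the game actually run
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}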

Eagle of Fire 21-09-2012 07:50 PM

The reason I asked for 32 gigs of RAM is that I'm planning for the long run. I want another computer that can last me another 10 years.

If it gets too costly, the guy who is doing the research for me will let me know. I trust him; he's the one who got me the computer I'm typing on ATM.

Smiling Spectre 22-09-2012 10:38 AM

Quote:

Originally Posted by Japo (Post 446188)
What I heard about SS2 was that it tends to crash on computers with multiple cores, and that it can be solved by confining it to one of them in the Task Manager, or with the DDFix community mod/patch. I don't know if that's true, but if it were, the only way a program can be affected by the number of cores is by creating several threads on purpose.

Yeah, and I saw the answer to this question above: SS2 doesn't know about multiple cores, but it does use multithreading. And (guessing) since WinXP tries to run the multithreaded process in parallel - when it shouldn't - it crashes the game.

Japo 22-09-2012 12:45 PM

Quote:

Originally Posted by Smiling Spectre (Post 446202)
And (guessing) since WinXP tries to run the multithreaded process in parallel - when it shouldn't - it crashes the game.

Threads are parallel by definition and are meant to be so. As you say, and as I said earlier, the only way a program can behave any differently in this context is if it creates threads itself: the OS won't do it unsolicited, it doesn't happen on its own, and there are particular API calls for it. The problem must be that the Dark Engine had bugs that happened to be masked on single-core computers. Each thread decides when to release its core (or the whole CPU, on single cores without hyper-threading) to other waiting threads; if it doesn't, it simply keeps running ahead of the rest of the threads in time.
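
Just to illustrate that last point, a minimal sketch (plain C++, nothing to do with the Dark Engine itself) of the kind of explicit call a program has to make; the "sound_mixer" job is made up:

Code:

#include <thread>
#include <cstdio>

void sound_mixer()                      // a made-up background job
{
    std::puts("mixing audio on its own thread");
}

int main()
{
    // Without an explicit call like this, the process has exactly one thread.
    std::thread worker(sound_mixer);
    std::puts("main thread keeps running the game loop");
    worker.join();                      // wait for the worker before exiting
    return 0;
}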

But that isn't a feature of the code, and the only benefit of creating threads is to have them run in parallel when possible. If you want code to run in sequence, creating threads is just overhead, and it won't actually run in sequence unless the computer happens to be single-core without hyper-threading. And deliberately writing a program with multiple threads that don't release control at least for time slicing on single cores, and then on top of that making it break when multiple cores provide real parallelism, would be as stupid as it gets.

I'd guess the multi-threading in the Dark Engine was a commendable feature, perhaps with an eye on the future, but as so often happens with games, the engine shipped with bugs still outstanding. The problem with parallelization is that sooner or later you need to synchronize everything back, and that is hard and error-prone. The bug must have been considered non-fatal because at the time there was no multi-threading hardware.
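
Purely as a guess at what such a masked bug could look like (an illustration, not actual Dark Engine code): two threads share a counter with no synchronization around it. On one core the scheduler serializes them coarsely enough that it seems to work; with real parallelism, updates get lost:

Code:

#include <thread>
#include <cstdio>

static long frame_counter = 0;          // shared state, no lock around it

void simulate()                         // say, a physics or AI thread
{
    for (int i = 0; i < 1000000; ++i)
        ++frame_counter;                // read-modify-write, not atomic
}

int main()
{
    std::thread a(simulate);
    std::thread b(simulate);
    a.join();
    b.join();
    // On a single core the total is almost always 2000000; with two real
    // cores increments get lost. The fix is the synchronization step (a
    // mutex or std::atomic<long>) that is so easy to forget or get wrong.
    std::printf("counter = %ld\n", frame_counter);
    return 0;
}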

jonh_sabugs 22-09-2012 07:13 PM

Are you sure it was a bug inherent to the Dark Engine? I remember games running on other engines showing the same problem, like Fallout 2, as I mentioned, Syberia and others.

As for threads, originally they served another purpose besides actual parallelism: flow control. It was (and is) common to see programs use them to block on file/socket reads while the rest of the logic keeps running, or to run background tasks in (pseudo) parallel, etc.
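
Something like this generic sketch (not taken from any particular game; the names are made up): one thread sits blocked on a read while the main loop stays responsive, which is flow control rather than number crunching:

Code:

#include <thread>
#include <atomic>
#include <chrono>
#include <iostream>
#include <string>

std::atomic<bool> quit(false);

void input_reader()                     // spends its life blocked on a read
{
    std::string line;
    while (std::getline(std::cin, line) && line != "quit")
        ;                               // ignore everything else
    quit = true;                        // user typed "quit" or input ended
}

int main()
{
    std::thread reader(input_reader);
    while (!quit) {
        // the "game loop" keeps running while the other thread is blocked
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    reader.join();
    return 0;
}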

Edit -

I was thinking: when these games crashed on multi-core, they weren't simply raising a process exception and having Windows kill them; they actually locked up the entire machine, forcing a hard reset. In my opinion that looks like an OS fault, but I am not sure.

Japo 22-09-2012 09:13 PM

I was just speculating. I looked for info on the Dark Engine problem but found no technical details. Of course I never said that other engines or games couldn't have similar problems. And programs can stop working in many ways; it would be nice if they raised an exception every time, but sometimes they turn into zombies, eat all the CPU, etc.

It's true that a system crash cannot be caused by an application, only by a fault in the hardware, the OS or the drivers. A BSOD would be a sure tell, but an application can eat all the CPU so that the whole system stops responding without any fault in the OS, which is just busy running the application's infinite loop. Again, I don't know which is the case with the Dark Engine.
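
I mean something as dumb as this sketch; the OS isn't at fault, it just keeps scheduling the loop (a Sleep(1) inside it would already give the rest of the system room to breathe):

Code:

int main()
{
    volatile unsigned long long spin = 0;
    for (;;)                    // never sleeps, never waits, never yields
        spin = spin + 1;        // the OS dutifully keeps handing it time slices
}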

I don't know about those uses of threads... Surely a background thread isn't needed to enforce a lock.

jonh_sabugs 22-09-2012 09:44 PM

Well, I decided to go ahead and research it a bit. It seems the real cause is quite controversial: posts from back when multi-core CPUs started becoming available blame it on several things, from poor design choices in the software to OS/hardware malfunction.

Microsoft has some posts on clock issues in multi-core environments: since different cores can produce different clock readings, older software would get confused or assume wrong things. This could be part of the cause.
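
If that's the case, the pattern would look roughly like the sketch below (the standard QueryPerformanceCounter calls; the clamp at the end is just the obvious defensive workaround, not something taken from any specific game or Microsoft post):

Code:

#include <windows.h>

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 100; ++frame) {
        Sleep(16);                              // stand-in for rendering a frame
        QueryPerformanceCounter(&now);
        double dt = double(now.QuadPart - prev.QuadPart) / double(freq.QuadPart);
        if (dt < 0.0)                           // readings from "disagreeing" cores
            dt = 0.0;                           // defensive clamp older games lacked
        prev = now;
        // ... advance the game simulation by dt seconds ...
    }
    return 0;
}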

In the end, I think it's a bummer. Also, from what I researched, it doesn't seem that CPU starvation (as in completely locking out other threads/processes) is possible in XP, even though a runaway process can slow the machine down considerably.

Smiling Spectre 23-09-2012 05:01 PM

Quote:

Originally Posted by jonh_sabugs (Post 446224)
Also, from what I researched, it doesn't seem that CPU starvation (as in completely locking out other threads/processes) is possible in XP, even though a runaway process can slow the machine down considerably.

Oh, that's the theory, actually.

In real software it's quite possible to slow the machine down to a "not responding" state, on WinXP and Win2k alike (I never saw it on Win2003, though). The game/application eats so much processor time that the machine only responds for about 3 seconds every 3 minutes, and it breaks everything up to and including network connections. I've seen that several times with viruses, and, funnily enough, with Kaspersky anti-virus (a different case, not virus-related). :) And that last one was on a server with 2 processors - they didn't help free it up at all. :)

