Video: Bots gameplay
Here are Lagg's bots in action. In the first two minutes I also explain a few things with annotations; turn them off if you find them annoying, or leave them on if you want to read some info.
Sorry for the bad video quality...
If you have any questions I will gladly answer them if I can.
Comments
By the way, MS OSes don't really do realtime, they fake it. Just so you know.
Anyway, then you have the timeslices, or quanta, the OS gives to an app. When the OS is set to give most CPU attention to foreground apps, the foreground apps get more time and higher priority. When the system is set to give background apps more attention, the OS gives every app (every thread, really), not just background apps, a much longer timeslice, and that timeslice is equal for all apps, foreground or background. It may also reduce the priority given to keyboard and mouse input from the user. Server 2003 takes this to an extreme: it actually buffers keyboard and mouse input while it finishes background tasks such as disk and network calls.
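To make that concrete: on NT-family Windows, the "Programs" vs. "Background services" performance option just flips a registry value that encodes quantum length and the foreground boost. Here's a minimal Win32 sketch in C that reads it (link against advapi32; RegGetValueA needs Vista or later):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DWORD value = 0, size = sizeof(value);
        LSTATUS rc = RegGetValueA(
            HKEY_LOCAL_MACHINE,
            "SYSTEM\\CurrentControlSet\\Control\\PriorityControl",
            "Win32PrioritySeparation",
            RRF_RT_REG_DWORD, NULL, &value, &size);

        if (rc != ERROR_SUCCESS) {
            fprintf(stderr, "RegGetValueA failed: %ld\n", (long)rc);
            return 1;
        }
        /* The low bits encode quantum length, fixed vs. variable quanta,
           and the foreground boost. 0x2 is the usual client default
           ("Programs"); 0x18 means long, fixed quanta for every thread
           ("Background services"), matching the behavior described above. */
        printf("Win32PrioritySeparation = 0x%lX\n", (unsigned long)value);
        return 0;
    }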
What realtime does is give every possible timeslice, or quantum, to the app that has realtime priority, keeping in mind that no M$ OS I know of is actually a realtime OS; they just tell you the app is at realtime to fake you out. So when the server is at realtime, the OS gives it every possible timeslice it can, and hence the AI get much more CPU time than normal. If you spend much time fooling around with apps at realtime, you'll soon see why M$ does not recommend running apps at realtime: they crash a lot and glitch a lot. I find that "high" priority gives about as much boost as you'd want, but I must say that T2 AI at realtime, even the stock AI, become pretty vicious.
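If you want to experiment with this yourself, here's a minimal Win32 sketch in C that requests realtime priority and then reads back what the OS actually granted. Without the "increase scheduling priority" privilege (i.e. not elevated), Windows silently substitutes HIGH for REALTIME, which is one more reason "high" is the sane setting:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE self = GetCurrentProcess();

        /* Request realtime. If the process lacks the privilege, Windows
           quietly grants HIGH instead, so read the class back to see
           what actually stuck. */
        if (!SetPriorityClass(self, REALTIME_PRIORITY_CLASS)) {
            fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());
            return 1;
        }

        DWORD cls = GetPriorityClass(self);
        printf("priority class now: %s\n",
               cls == REALTIME_PRIORITY_CLASS ? "REALTIME" :
               cls == HIGH_PRIORITY_CLASS     ? "HIGH" : "other");
        return 0;
    }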
So in essence, instead of getting a slice of time for 10 ms or 16 ms, your thread gets 64 ms, or even longer when set to realtime, since the OS gives that app every possible timeslice that becomes available. The issue is that realtime priority preempts some things that need to be taken care of first, such as disk and network activity. This also lets the AI kick your ass, since the game may be having client packets withheld while the AI run amok.
There are high-precision timing facilities in such operating systems (used for things like media playback), but they are not implemented by manipulating scheduler timeslices.
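For example, the usual high-precision facility on Windows is the performance counter, which reads a hardware timer directly and has nothing to do with scheduler quanta or process priority. A minimal sketch in C (the Sleep call is just a stand-in for some work being measured):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);   /* counts per second, fixed at boot */

        QueryPerformanceCounter(&t0);
        Sleep(32);                          /* stand-in for one chunk of work */
        QueryPerformanceCounter(&t1);

        double ms = 1000.0 * (double)(t1.QuadPart - t0.QuadPart)
                           / (double)freq.QuadPart;
        /* Resolution is well under a microsecond, independent of the
           scheduler's quantum length or the process priority class. */
        printf("elapsed: %.3f ms\n", ms);
        return 0;
    }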
The game internally assumes 32-millisecond simulation frames regardless of what process priority you set. The time stamp counter or the High Precision Event Timer is used to resynchronize the event simulation with real time, compensating for variations in scheduling or in the execution time of simulation components (e.g. if running scripts takes longer than 32 ms).
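The standard pattern for that kind of resynchronization is a fixed-timestep loop with an accumulator. The sketch below in C illustrates the technique, not the game's actual code; simulate_one_tick and the tick bound are hypothetical stand-ins:

    #include <windows.h>
    #include <stdio.h>

    #define TICK_MS 32.0   /* the fixed simulation step described above */

    /* Hypothetical stand-in for one frame of AI/physics/script work. */
    static void simulate_one_tick(void) {}

    int main(void)
    {
        LARGE_INTEGER freq, now, last;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&last);
        double accumulator_ms = 0.0;
        int ticks = 0;

        while (ticks < 100) {              /* bounded just for the sketch */
            QueryPerformanceCounter(&now);
            accumulator_ms += 1000.0 * (double)(now.QuadPart - last.QuadPart)
                                     / (double)freq.QuadPart;
            last = now;

            /* If a tick ran long (scripts, a scheduling hiccup), the
               accumulator still counts the real time that passed, so extra
               ticks run back-to-back until simulation time has caught up
               with wall-clock time. */
            while (accumulator_ms >= TICK_MS) {
                simulate_one_tick();
                accumulator_ms -= TICK_MS;
                ticks++;
            }
            Sleep(1);  /* yield; actual sleep granularity is OS-dependent */
        }
        printf("ran %d fixed 32 ms ticks\n", ticks);
        return 0;
    }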
If scheduling affected the CPU time allocated to AI processing (say, if the AI ran against a realtime deadline, which it doesn't), increases in average CPU speed over the lifetime of the game would still be a far larger factor in AI performance than any process scheduling.