TIL that some programmers have to be taught what others learned firsthand by encapsulating IPX packets inside TCP/IP[1] packets inside PPP packets modulated over the POTS network.
(Not Kali, the OTHER one. :) )
Oh, and there used to be different types of game engines. You used to be able to SEE whether your engine was interpolating missing data, whether it was maintaining state on your end, their end, or both, and so on--just by observing how your opponents' ships reacted when your mom picked up the phone in the kitchen.
Descent had a pretty darned good method.
Alice and Bob play a game. From Alice's point of view, the visual of Alice's weapon fire striking Bob's ship is rendered on her machine, based on her machine's understanding of the location of his ship. But the AUDIO of the weapon impact is "rendered" on Bob's end. So if you see impact, it means nothing--but if you hear impact, you know that 1) your fire collided with Bob's ship on his end, and 2) it happened approx. n/2 seconds ago (n being the round-trip latency).
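(Roughly, the flow looks like the toy Python sketch below. To be clear, this is NOT Descent's actual netcode--just an illustration of the split as I understand it, and every name in it (Peer, HIT_CONFIRM, collides, ...) is invented.)

    import time

    def collides(a, b, radius=1.0):
        # toy sphere test
        return sum((x - y) ** 2 for x, y in zip(a, b)) <= radius ** 2

    class Peer:
        def __init__(self, send, play_sound, draw_effect):
            self.send = send                # ships a message to the other player
            self.play_sound = play_sound
            self.draw_effect = draw_effect
            self.my_ship_hp = 100           # authoritative: only I decide when I'm hit
            self.remote_ship_guess = None   # my (laggy) estimate of the opponent's ship

        def on_local_frame(self, my_shot_pos):
            # Purely cosmetic: my machine THINKS my shot hit, based on where I
            # currently believe the opponent's ship is. Draw sparks, but don't
            # trust it -- "if you see impact, it means nothing".
            if self.remote_ship_guess and collides(my_shot_pos, self.remote_ship_guess):
                self.draw_effect("impact", self.remote_ship_guess)

        def on_remote_shot(self, shot_pos, my_real_ship_pos):
            # Authoritative: the opponent's shot is checked against MY real position.
            if collides(shot_pos, my_real_ship_pos):
                self.my_ship_hp -= 10
                self.send({"type": "HIT_CONFIRM", "at": time.time()})

        def on_message(self, msg):
            if msg["type"] == "HIT_CONFIRM":
                # Only now does the impact SOUND play: the hit was real on the
                # other machine, roughly one one-way trip (~n/2) ago.
                self.play_sound("impact")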
This resulted in a kind of meta-game wherein the lagscape was just part of the terrain. There was a 'general relativity' component to it: since neither side was master or slave, no point of view was privileged[2]. Alice had to keep two gamestates in mind at all times: the local one to protect her own hitpoints, and the remote one to reduce Bob's.
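(Put another way, each machine is authoritative for exactly half the fight. Again, a made-up illustration of that split, not the real data layout:)

    from dataclasses import dataclass

    @dataclass
    class ShipState:
        position: tuple
        hitpoints: int

    @dataclass
    class AliceView:
        my_ship: ShipState          # authoritative: Alice decides when Alice is hit
        bobs_ship_guess: ShipState  # ~n/2 seconds stale; Bob's machine has the
                                    # final say on Bob's hitpoints

    # Bob's machine holds the mirror image: his ship is authoritative there, and
    # his copy of Alice's ship is the stale guess. Neither view is privileged.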
In my opinion, both the "sound/visual disconnect" and the "no privileged POV" were necessary ingredients for the lag to be "part of the landscape"--i.e., fair and fun.
[1]: Wait, did we already use UDP at that point? I can't recall...
[2]: Some pieces of gamestate were given to the client that started the game or the client with the highest FPS. I think the reactor countdown was finalized in one client, for example, resulting in the famous "3... 2... 1... 0... ... ... ... ... BOOM!"