These features are disabled by default. Edit `bf42plus.ini` in the game directory
- Chat/kill logging to file

### Details about `plus.smootherGameplay` option
In version 1.3.4, a new option was added that may make the game run smoother and reduce the time between you pressing a key and the server processing it. The impact of this is far less than the server-side "reg patch", but it may improve your game experience a bit. It is also possible that it won't have any effect on your game.

This option is really two adjustments under the hood. The first one tells DirectX not to mess with the CPU's floating point precision, so some calculations will be more accurate. The second one tries to clear a server-side buffer belonging to you every time you die.

To understand the following explanation, something must be made clear about how the game's network protocol works. During gameplay your client only sends your input (keypresses, mouse movement, controller input) to the server, nothing else. The server simulates everything and sends back the result, which includes every detail about every object that is being synced. If your client diverges from the server's simulation, it gets corrected to match what the server calculated. This is what happens when you are rubberbanding: you get moved to where the server thinks you are.
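
To make that concrete, here is a minimal sketch of that data flow (every name is invented for this example; none of them are the game's real symbols):

```cpp
// Conceptual sketch only; the real client is far more complex.
struct PlayerInput { int keys; float mouseDX, mouseDY; };
struct WorldState  { /* authoritative state of every synced object */ };

void sendToServer(const PlayerInput&) { /* network send */ }
bool divergedFromLocal(const WorldState&) { return false; }
void snapToServerState(const WorldState&) { /* move objects to match */ }

// Client side: inputs are the only thing sent during gameplay.
void onInputFrame(const PlayerInput& input) {
    sendToServer(input);
}

// Client side: the server's simulation result always wins. Getting snapped
// back to the server's state is what you experience as rubberbanding.
void onServerUpdate(const WorldState& authoritative) {
    if (divergedFromLocal(authoritative))
        snapToServerState(authoritative);
}
```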

By default, the DirectX code changes the FPU to single precision when a Direct3D Device is created. This affects the precision of some calculations in the game, mainly the ones responsible for timing the game's frames and input updates. Normally the game runs at 60 FPS, which means a game tick runs every 16.6666 milliseconds (0.01666... seconds). There is also a global precise timer counting the number of seconds since the game started. When the game calculates when the next frame should start, it adds the target frame interval (0.016666...) to the time the current frame started. For example, if the current frame started at `232.481s`, the next one should start at `232.4976s`. However, if the FPU is set to single precision by DirectX, this calculation is less accurate, and each frame may run a bit early or late, depending on the actual value of the global timer. If its value is below 2048 seconds (~34 minutes, a value that may be familiar to some of you), it gets rounded down and you get a slightly higher framerate; if the value is between 2048 and 4096 seconds (34-68 minutes), it gets rounded up, which means the framerate is a little lower. The error is about 1%, so you gain or lose an extra frame roughly every 100 seconds. This not only affects the game's frame rate, but also its input timing. Normally the game should read your inputs exactly 30 times a second, but this too happens either a little slower or faster.
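
The rounding error is easy to reproduce outside the game. The small standalone program below is not the game's code; casting to `float` approximates the FPU running in single precision mode, and the timestamps are made up for the example:

```cpp
#include <cstdio>

int main() {
    const double interval = 1.0 / 60.0; // target frame interval, ~16.67 ms

    // One timer value below 2048 s, one between 2048 s and 4096 s.
    const double starts[] = {232.481, 2332.481};
    for (double start : starts) {
        double precise = start + interval;              // full precision, like the server
        float rounded = (float)start + (float)interval; // single precision client timing
        printf("start=%9.4fs  double=%.6f  float=%.6f  error=%+.1fus\n",
               start, precise, (double)rounded, ((double)rounded - precise) * 1e6);
    }
    return 0;
}
```

The error grows with the timer value because a 24-bit mantissa leaves fewer bits for the fractional part as the integer part gets larger.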

Running the input code at a faster rate has the effect of your client sending inputs to the server faster than the server processes them. The server has a queue (buffer) for each player, where it stores the inputs it receives before using them. When it receives a new input, it adds it to one end of this queue, and when the server needs an input for a frame simulation, it removes one from the other end. By default this queue has a hard limit of 4 items. If there are more, the server starts dropping the oldest ones until only four remain. This is actually noticeable if you know what to look for, because the inputs are directly tied to the game simulation, and if one is dropped, the client will adjust its worldTime by one input frame (33ms). This usually results in a small jitter, which is most noticeable if you are moving the mouse while aiming, because the camera jumps back a little bit. Each item in this queue means the server will process your input 33ms later; if it has 4 items, that is a total of 133ms, which is not much but sometimes noticeable, especially if you have low ping, because then this delay is more unusual. This delay mainly shows up in events reported to the client, for example you pressing the fire button and the server sending you the kill message that your target died. The clientside lag compensation code accounts for this extra delay, however I haven't done much research on the lag compensation code yet, so this part is not clear to me.
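
As a rough model of that behavior (the type and function names are made up for this example, not taken from the game), the queue works something like this:

```cpp
#include <cstddef>
#include <deque>

struct PlayerInput { /* keypresses, mouse movement, controller input */ };

class InputQueue {
    static const std::size_t hardLimit = 4;
    std::deque<PlayerInput> items;

public:
    // Called whenever an input packet arrives from the client.
    void push(const PlayerInput& input) {
        items.push_back(input);
        // Above the hard limit the oldest inputs get dropped; each drop makes
        // the client adjust its worldTime by one input frame (33 ms).
        while (items.size() > hardLimit)
            items.pop_front();
    }

    // Called once per simulated input frame (30 times per second).
    bool pop(PlayerInput& out) {
        if (items.empty())
            return false; // starved: nothing to simulate with this frame
        out = items.front();
        items.pop_front();
        return true;
    }
};
```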

Servers don't have Direct3D, which means they already run the same frame timing code with this increased precision. Changing the floating point precision results in more accurate client frame timing, which means it will match the server more closely. This may have other benefits too, by matching the server better in other calculations during game simulation; however, I haven't researched this at all yet, and most of the simulation code uses single precision floats, which shouldn't be affected by this change.
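
For reference, the documented way for an application to opt out of this precision downgrade is the `D3DCREATE_FPU_PRESERVE` behavior flag passed to `CreateDevice`; whether the patch works exactly this way is an assumption here, and the snippet below is only an illustration:

```cpp
#include <d3d9.h> // illustration uses D3D9; the same flag exists in D3D8

// Creating the device with D3DCREATE_FPU_PRESERVE tells Direct3D not to
// switch the FPU to single precision. Everything else here is boilerplate.
HRESULT createDevicePreservingFPU(IDirect3D9* d3d, HWND window,
                                  D3DPRESENT_PARAMETERS* params,
                                  IDirect3DDevice9** deviceOut) {
    return d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, window,
                             D3DCREATE_HARDWARE_VERTEXPROCESSING |
                             D3DCREATE_FPU_PRESERVE,
                             params, deviceOut);
}
```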

The other thing this option does is try to empty your server-side input queue on each death, by dropping three inputs instead of sending them to the server. This either removes 3 items from the queue if it is full (has 4 items) or empties it completely. If it becomes completely empty, your game will glitch out for a bit until the server gets a new input, but because you just died, it doesn't really matter; the effects should also be mostly hidden by the death camera flying up into the air.
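
A minimal sketch of this adjustment, assuming a hypothetical client-side send hook (this is not the mod's actual code):

```cpp
struct PlayerInput { /* keypresses, mouse movement, controller input */ };

static void sendRaw(const PlayerInput&) { /* actual network send */ }

static int inputsToDrop = 0;

void onLocalPlayerDeath() {
    // Dropping 3 inputs either shrinks a full queue (4 items) down to 1,
    // or, if the queue held fewer, empties it completely.
    inputsToDrop = 3;
}

void sendInputToServer(const PlayerInput& input) {
    if (inputsToDrop > 0) {
        --inputsToDrop; // swallow this input instead of sending it
        return;
    }
    sendRaw(input);
}
```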

The performance impact of running in higher precision should be very small, because on modern systems the game's CPU usage mostly consists of busy-looping until the next frame. Higher precision mode only affects certain floating point operations (`fdiv` and `fsqrt`, i.e. division and square root calculations).

All this won't make you a pro player, but it may get rid of some of this game's annoyances. Also keep in mind that not every system has these issues. Some people saw differences in the "smoothness" of the game, some didn't. This may even be affected by things like whether you are running in windowed mode. However, it shouldn't have any negative effects, so there is no reason not to enable it.

I have been asked how exactly I got all this information, so after looking through various chatlogs, here is a timeline:
- 2023 spring or summer: Something was bothering me about the game's reaction time on some servers, so I added a mod to my client that tries to measure the latency between sending an input to the server and receiving acknowledgement that the server used the input in a simulation. This code was not very accurate, but it showed that this delay was a lot higher than my ping.
- 2023-07-02: While looking at the server code, I figured out that the server can queue player inputs before processing them. I realized that this could be the cause of the delay I was investigating. I also modified my client to starve this buffer every few seconds by dropping a bunch of inputs, tested it on SiMPLE, thought it was somewhat better, then disabled it because it felt like cheating. I also shared my findings with henk.
- 2023 October: I added some options to the server mod I am making to mess with the action buffer in various ways. I could not test it because my server was empty :(
- 2023 December?: Soldierr, who made the "reg patch", discovered that game.interpolationTime in the client has some weird patterns and is also too high. This variable was measuring pretty much the same thing I did, but I didn't know about it.
- 2024-01-10: I got in contact with Soldierr in a Discord group chat and told him about my findings (ActionBuffer in the server, etc.). In the following days we tried my server patch with tragic, Loony and a few others, but it was too crude and often caused jittering. Unrelated: there was also some discussion about creating a client mod with various improvements, which is what became bf42plus; sadly, Soldierr has been gone since early February.
- 2024-01-17: With all the information above and some more research into the client code, I figured out that the client's update rate is always slightly off because of bad floating point rounding when calculating frame times. Soldierr suggested that we either fix the client or speed up the server.
- 2024-01-20: Soldierr made a patched BF1942.exe for testing that has a patched Direct3D::CreateDevice call to avoid the low FPU precision.
- The second part, which clears the buffer on death, was added in 2024 April when I implemented this option in bf42plus.
