Deadlock in web sync agent #1282
Don't use
We tried it out. We used our own implementation of IProgress, both server-side and client-side.
Okay, can you share a simple application reproducing the error, for example as a GitHub repository? The idea is to have something I can work on. See an example where a GitHub repo was shared along with an SQL script to create the environment: #1286 (comment)
I created a repo HERE.
Another thing maybe worth mentioning: we create a singleton SyncAgent when we initialize our hosting service.
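Roughly, that registration looks like the following sketch (the provider type, connection string, and URL are illustrative placeholders, and exact constructors vary by Dotmim.Sync version):

```csharp
// Sketch only: one SyncAgent for the whole process lifetime.
// SqliteSyncProvider, the connection string, and the service URL
// are placeholders, not our real configuration.
services.AddSingleton(sp =>
{
    var clientProvider = new SqliteSyncProvider("Data Source=client.db");
    var remoteOrchestrator = new WebRemoteOrchestrator("https://example.com/api/sync");
    return new SyncAgent(clientProvider, remoteOrchestrator);
});
```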
Setup:
Our client runs in an on-premises network and we need to be resilient to internet downtime.
Calls to SynchronizeAsync are executed sequentially per scope inside a scheduled message handler. The message handler has a timeout and cancels the CancellationTokenSource if the sync call takes too long.
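A minimal sketch of that handler (the SynchronizeAsync overload, the timeout value, and the agent/scopeName/logger variables are illustrative; overloads differ between Dotmim.Sync versions):

```csharp
// Sketch: one sync per scope, cancelled if it runs past the timeout.
// "agent", "scopeName", and "logger" come from the surrounding service.
using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(10));
try
{
    var result = await agent.SynchronizeAsync(scopeName, cancellationToken: cts.Token);
    logger.LogInformation("Sync finished: {Result}", result);
}
catch (OperationCanceledException)
{
    // Timed out: the scheduler retries later with a backoff.
    logger.LogWarning("Sync for {Scope} cancelled after timeout", scopeName);
}
```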
Most of the time everything works as expected, even cancelling the sync and retrying with a backoff.
Sometimes, the progress gets stuck at 0%, cancelling the token doesn't result in an interruption, and the sync can only be unlocked by restarting the process.
This looks like a deadlock in the SyncAgent when using WebRemoteOrchestrator.
The same process does not hang when we use a LocalOrchestrator, but that is no longer an option for us.
We could not reproduce this issue in a POC, so maybe it's also related to the larger data volume in our production environment.
We noticed that SynchronousProgress uses a SynchronizationContext.
Could it help if we stop using it?
Do you have any other suggestions?
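For example, one thing we are considering is replacing SynchronousProgress with a hand-rolled IProgress&lt;T&gt; that invokes its callback inline and never touches a SynchronizationContext. A sketch (the InlineProgress name is ours, not a library type):

```csharp
using System;

// Hypothetical context-free reporter: Report runs inline on whatever
// thread calls it, so nothing is ever posted back to a captured
// SynchronizationContext.
public sealed class InlineProgress<T> : IProgress<T>
{
    private readonly Action<T> handler;

    public InlineProgress(Action<T> handler) => this.handler = handler;

    public void Report(T value) => handler(value); // synchronous, no marshalling
}
```

Unlike System.Progress&lt;T&gt;, which posts Report callbacks through the SynchronizationContext captured at construction time, this runs them directly on the reporting thread.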