UI timing out with "Server busy" error after inserting rows #1367
Comments
@atcooper1 If an ingestion is already running in the background (from a previous submission), you will see the "server is busy" error. Please wait; once the previous job has completed, you can submit again.
Regarding the insert issue, can you use clone so that you do not need to copy and paste after inserting? We are testing a fix in edge that disallows inserting more than 3 lines, and it may fix this issue as well. Please wait a while and we will push the changes to staging.
Hi Raymond, there is no ingestion running in the background. This is happening during a 'save and validate' function with a single datasheet active. The ingestion issue (i.e. sheets taking over half an hour to ingest) is another problem. Clone produces the same issues, but regardless, I need to be able to accurately copy and paste data from Excel into the inserted or cloned rows.
Hi Toni, I mean there is a global lock on the ingestion operation, so if the system is busy processing an ingestion, not necessarily one that you submitted (it can be from another user), it will show this message because it is busy processing. You need to wait until that processing is done before you can do another save and validate.
Hi Raymond, yes, I am aware of the global lock. But Lizzi was not ingesting data at the same time, and she is the only other user.
I downloaded the log from AWS; there is a materialized view generated which takes 45 minutes to complete. Although it does not exactly match the timing, I suspect this may cause the "server busy" error.
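If the backend is Postgres (an assumption; the AWS log only mentions a materialized view), a query like the following can confirm whether a long-running refresh is what holds the server for those 45 minutes. The view name `ep_rarity_extents` is purely illustrative, not taken from the log:

```sql
-- Show long-running statements so a 45-minute materialized view
-- refresh would stand out (assumes Postgres; pg_stat_activity is
-- a standard Postgres system view).
SELECT pid,
       now() - query_start AS runtime,
       state,
       left(query, 80)     AS query_start_text
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY runtime DESC;

-- If the view has a unique index, refreshing concurrently would at
-- least keep it readable during the refresh (illustrative view name):
REFRESH MATERIALIZED VIEW CONCURRENTLY ep_rarity_extents;
```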
Having looked at the code, the "server is busy" warning shows when either a "validate" or an "ingest" is in progress. That is, at any time there can be only one ingest or one validate running. Since it sets a marker in the db, there must be an in-progress "validate" or "ingest" happening.
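The marker behaviour described above can be sketched like this. This is a minimal illustration, not the actual NRMN implementation: the real marker lives in the database, while here an in-memory atomic reference stands in for it.

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of a global job lock: only one "ingest" or "validate" may run
// at a time; any other request is told the server is busy.
public class GlobalJobLock {
    // Holds the type of the running job, or null when idle. In the real
    // system this marker is a row/flag in the database.
    private final AtomicReference<String> runningJob = new AtomicReference<>(null);

    /** Atomically claim the lock; returns false => "server is busy". */
    public boolean tryStart(String jobType) {
        return runningJob.compareAndSet(null, jobType);
    }

    /** Clear the marker when the job finishes (or fails). */
    public void finish() {
        runningJob.set(null);
    }

    public static void main(String[] args) {
        GlobalJobLock lock = new GlobalJobLock();
        System.out.println(lock.tryStart("ingest"));   // lock acquired
        System.out.println(lock.tryStart("validate")); // refused: busy
        lock.finish();
        System.out.println(lock.tryStart("validate")); // previous job done
    }
}
```

One consequence of this design, visible in the thread: if the job that set the marker takes 45 minutes (or fails without calling `finish()`), every other user sees "server is busy" for that whole window.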
The insert should have been fixed with the recent release too.
@atcooper1 do you think this is still an issue?
The UI is very slow generally. I'm still waiting on a sheet that was ingested over 40 minutes ago to load so I can keep working in the UI. I am unable to do anything until this loads, as I get a "server is busy" error.
Screencast.from.04-10-24.16.41.56.webm
I really cannot reproduce what you see.
Can you share the Excel file you are importing? I can try it in staging to see if I can reproduce it, thanks.
I can press the Save and Validate button for any of the STAGED jobs, but so far I cannot trigger the "Server is busy..." error.
OK, your ingest completed. I see the following log captured; it took almost an hour to complete:
2024-10-04 05:43:19.882 INFO 2896 --- [nio-8080-exec-8] a.o.a.n.r.c.IngestionController : Ingest job id 762 with transaction
Yes, and while that is happening I cannot load new sites or new observable items, or work on additional jobs. So it is an hour I have to wait before I can continue working on data. This is too long to load a relatively small datasheet. Ingest time historically ranged from a few seconds up to 4 minutes (for very large sheets), but this lag has increased over time and is now unsustainable for a productive workflow.
OK, let me try it on my side to see where the time is spent, thanks.
Rows don't always insert where expected; instead they insert at the top of the sheet. But when users paste new data into the blank rows inserted at the top of the sheet, the UI re-orders the rows to the phantom locations within the sheet (i.e. where users originally tried to insert the blank rows). If rows are inserted/cloned, the datasheet cannot be validated, as a "Server Error" persists.
IMG_5660.MOV