While investigating https://github.com/elastic/ingest-dev/issues/3677, I noticed that this package does not handle a 413 Entity Too Large status from Elasticsearch. Even with batching, in specific use cases like in Beats (which may include large stack traces in documents), a batch can still exceed http.max_content_length, the maximum request size in Elasticsearch, which defaults to 100MB.
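A minimal sketch of one way a client could react to that status: on a 413, halve the batch and retry each half instead of dropping the whole flush. The names here (`flushWithSplit`, the caller-supplied `send` function) are hypothetical and do not reflect this package's actual API; only `http.StatusRequestEntityTooLarge` (413) is from the Go standard library.

```go
package bulksplit

import (
	"fmt"
	"net/http"
)

// flushWithSplit sends docs via the caller-supplied send function, which is
// assumed to perform the _bulk request and return the HTTP status code.
// On a 413 (http.max_content_length exceeded) it splits the batch in half and
// retries each half, so an oversized flush does not silently drop documents.
// A single document that is still too large is reported as a hard error.
func flushWithSplit(docs [][]byte, send func([][]byte) (int, error)) error {
	status, err := send(docs)
	if err != nil {
		return err
	}
	if status != http.StatusRequestEntityTooLarge { // anything other than 413
		return nil
	}
	if len(docs) <= 1 {
		return fmt.Errorf("document exceeds http.max_content_length (status %d)", status)
	}
	mid := len(docs) / 2
	if err := flushWithSplit(docs[:mid], send); err != nil {
		return err
	}
	return flushWithSplit(docs[mid:], send)
}
```

Splitting on 413 is only one option; the package could instead surface the status as a distinct error so callers can lower their batch size, or size batches in bytes against a configured limit before sending.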