
Handling for 413 entity too large in BulkIndexer #210

Open
mauri870 opened this issue Nov 4, 2024 · 2 comments

Comments

@mauri870 (Member)

mauri870 commented Nov 4, 2024

While investigating https://github.com/elastic/ingest-dev/issues/3677, I noticed that this package does not handle a 413 Entity Too Large status from Elasticsearch. Even with batching, in specific use cases like Beats (which may include large stack traces in documents), a batch can still exceed http.max_content_length, Elasticsearch's maximum request body size, which defaults to 100MB.

@mauri870 (Member, Author)

Closing this since it is handled in the elasticsearchexporter.

@kruskall (Member)

This is still relevant; elasticsearchexporter has nothing to do with this repo.

@kruskall kruskall reopened this Dec 23, 2024