When the journal contains large log messages, journald-cloudwatch-logs aborts with the following error:
Failed to write to cloudwatch: failed to put events: InvalidParameterException: Log event too large: 383595 bytes exceeds limit of 262144
status code: 400, request id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
It would be nice if large messages were instead forwarded as multiple chunks.

Edit: Not crashing would be enough already, no need for nice-to-haves :)
This is a CloudWatch limit of 256 KB per event, so you have to choose between splitting the message across multiple events, truncating it, or (the current behavior) not logging it at all. I'd suggest calculating the event size as described in the AWS docs and truncating if it is too big for a single event.
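For reference, a minimal sketch of that size check in Go. The 26-byte per-event overhead and the 262,144-byte limit come from the PutLogEvents documentation; the helper name and the `[truncated]` marker are made up for illustration and are not part of journald-cloudwatch-logs:

```go
package main

import "fmt"

// CloudWatch Logs limits (per the PutLogEvents documentation): an event's
// size is the UTF-8 byte length of its message plus 26 bytes of per-event
// overhead, and must not exceed 262,144 bytes.
const (
	perEventOverhead = 26
	maxEventSize     = 262144
	maxMessageBytes  = maxEventSize - perEventOverhead
)

// truncateMessage is a hypothetical helper: it trims an oversized journal
// message so it fits into a single CloudWatch Logs event and appends a
// marker so the truncation is visible in the log stream.
func truncateMessage(msg string) string {
	if len(msg) <= maxMessageBytes {
		return msg
	}
	const marker = " [truncated]"
	cut := maxMessageBytes - len(marker)
	// Back up so we never cut in the middle of a multi-byte UTF-8 sequence.
	for cut > 0 && msg[cut]&0xC0 == 0x80 {
		cut--
	}
	return msg[:cut] + marker
}

func main() {
	oversized := string(make([]byte, 400000))
	fmt.Println(len(truncateMessage(oversized))) // <= maxMessageBytes
}
```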
Yeah, I get that this is an AWS limit. The issue is that journald-cloudwatch-logs aborts/crashes when it fails to send a large log message. It would already be sufficient not to crash (no need for chunking/truncating). I've updated the issue description accordingly.
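To illustrate the "just don't crash" option, here is a rough sketch assuming the daemon uses aws-sdk-go v1 (the "status code / request id" error format above suggests it does); `handlePutError` is a made-up helper, not an existing function in the project:

```go
package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws/awserr"
)

// handlePutError is a hypothetical helper: it treats an
// InvalidParameterException from PutLogEvents (e.g. "Log event too large")
// as non-fatal, logging and dropping the rejected batch, while all other
// errors still propagate and keep their current handling.
func handlePutError(err error) error {
	if aerr, ok := err.(awserr.Error); ok && aerr.Code() == "InvalidParameterException" {
		log.Printf("dropping rejected batch instead of aborting: %v", aerr)
		return nil
	}
	return err
}

func main() {
	// With a nil (or swallowed) error the daemon would simply continue
	// with the next batch of journal entries instead of terminating.
	_ = handlePutError(nil)
}
```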