Add toggle to enable grouping log by date/instance in s3 (#941)
* add toggle to enable grouping log by date

* add information in environment.rst

* Update ENVIRONMENT.rst

Co-authored-by: Polina Bungina <[email protected]>

---------

Co-authored-by: Polina Bungina <[email protected]>
idanovinda and hughcapet authored Dec 4, 2023
1 parent d554db0 commit 836d266
Showing 3 changed files with 6 additions and 0 deletions.
1 change: 1 addition & 0 deletions ENVIRONMENT.rst
@@ -96,6 +96,7 @@ Environment Configuration Settings
- **LOG_S3_ENDPOINT**: (optional) S3 Endpoint to use with Boto3
- **LOG_BUCKET_SCOPE_PREFIX**: (optional) used to build the S3 file path, like `/spilo/{LOG_BUCKET_SCOPE_PREFIX}{SCOPE}{LOG_BUCKET_SCOPE_SUFFIX}/log/`
- **LOG_BUCKET_SCOPE_SUFFIX**: (optional) same as above
- **LOG_GROUP_BY_DATE**: (optional) enable grouping logs by date. Default is False, i.e. log files are grouped by instance ID.
- **DCS_ENABLE_KUBERNETES_API**: a non-empty value forces Patroni to use Kubernetes as a DCS. Default is empty.
- **KUBERNETES_USE_CONFIGMAPS**: a non-empty value makes Patroni store its metadata in ConfigMaps instead of Endpoints when running on Kubernetes. Default is empty.
- **KUBERNETES_ROLE_LABEL**: name of the label containing the Postgres role when running on Kubernetes. Default is 'spilo-role'.
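The toggle is read straight from the environment in the scripts changed in this commit, so any non-empty string value enables it; a minimal sketch of that truthiness check (values hypothetical):

```python
import os

# The scripts test the raw environment string, so any non-empty value
# reads as enabled; unset (or empty) means disabled.
os.environ['LOG_GROUP_BY_DATE'] = 'true'
print(bool(os.getenv('LOG_GROUP_BY_DATE')))  # True -> group logs by date

del os.environ['LOG_GROUP_BY_DATE']
print(bool(os.getenv('LOG_GROUP_BY_DATE')))  # False -> group by instance ID
```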
3 changes: 3 additions & 0 deletions postgres-appliance/scripts/configure_spilo.py
@@ -579,6 +579,7 @@ def get_placeholders(provider):
placeholders.setdefault('CLONE_TARGET_TIME', '')
placeholders.setdefault('CLONE_TARGET_INCLUSIVE', True)

placeholders.setdefault('LOG_GROUP_BY_DATE', False)
placeholders.setdefault('LOG_SHIP_SCHEDULE', '1 0 * * *')
placeholders.setdefault('LOG_S3_BUCKET', '')
placeholders.setdefault('LOG_S3_ENDPOINT', '')
@@ -758,6 +759,8 @@ def write_log_environment(placeholders):
log_env['LOG_AWS_REGION'] = aws_region

log_s3_key = 'spilo/{LOG_BUCKET_SCOPE_PREFIX}{SCOPE}{LOG_BUCKET_SCOPE_SUFFIX}/log/'.format(**log_env)
if os.getenv('LOG_GROUP_BY_DATE'):
log_s3_key += '{DATE}/'
log_s3_key += placeholders['instance_data']['id']
log_env['LOG_S3_KEY'] = log_s3_key

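The effect of this hunk on the generated key can be sketched as a standalone function (function and parameter names hypothetical, format string mirrored from the diff):

```python
def build_log_s3_key(prefix, scope, suffix, instance_id, group_by_date):
    # Mirrors the diff: base path, then an optional literal {DATE}
    # segment (substituted later at upload time), then the instance ID.
    key = 'spilo/{0}{1}{2}/log/'.format(prefix, scope, suffix)
    if group_by_date:
        key += '{DATE}/'
    key += instance_id
    return key

print(build_log_s3_key('', 'mycluster', '', '0', False))  # spilo/mycluster/log/0
print(build_log_s3_key('', 'mycluster', '', '0', True))   # spilo/mycluster/log/{DATE}/0
```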
2 changes: 2 additions & 0 deletions postgres-appliance/scripts/upload_pg_log_to_s3.py
@@ -48,6 +48,8 @@ def upload_to_s3(local_file_path):
bucket = s3.Bucket(bucket_name)

key_name = os.path.join(os.getenv('LOG_S3_KEY'), os.path.basename(local_file_path))
if os.getenv('LOG_GROUP_BY_DATE'):
key_name = key_name.format(**{'DATE': os.path.basename(local_file_path).split('.')[0]})

chunk_size = 52428800 # 50 MiB
config = TransferConfig(multipart_threshold=chunk_size, multipart_chunksize=chunk_size)
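At upload time the literal `{DATE}` segment is filled from the log file's name; a sketch of that substitution (helper name hypothetical), assuming log files are named after their date, e.g. `2023-12-04.csv`:

```python
import os

def resolve_key(log_s3_key, local_file_path, group_by_date):
    # Join the configured key with the file name, then substitute the
    # {DATE} placeholder with the date-named stem of the log file.
    key_name = os.path.join(log_s3_key, os.path.basename(local_file_path))
    if group_by_date:
        key_name = key_name.format(**{'DATE': os.path.basename(local_file_path).split('.')[0]})
    return key_name

print(resolve_key('spilo/mycluster/log/{DATE}/0', '/var/log/2023-12-04.csv', True))
# spilo/mycluster/log/2023-12-04/0/2023-12-04.csv
```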
