forked from ashleve/lightning-hydra-template
* add flake8 and prettier to pre-commit-config
* add setup.cfg
* add workers=True to seed_everything()
* update lightning badge logo
* bump package versions
* update README.md
* add __init__.py files
* add more logger configs parameters
* add default Dockerfile
* change .env.template to .env.example
* move inference example to readme
* remove img_dataset.py
* simplify names of wandb callbacks
* remove wandb test marker
* format files with prettier
Authored by Łukasz Zalewski on May 21, 2021
Commit 70ab061 (1 parent: 023d4bd)
Showing 49 changed files with 476 additions and 418 deletions.
.env.example (new file)

```diff
@@ -0,0 +1,7 @@
+# this is an example of a file that can be used for storing private and user-specific environment variables, like keys or system paths
+# create a file named .env (by default .env will be excluded from version control)
+# the variables declared in .env are loaded in run.py automatically
+# hydra allows you to reference variables in .yaml configs with special syntax: ${oc.env:MY_VAR}
+
+MY_VAR="/home/user/my/system/path"
+MY_KEY="asdgjhawi8y23ihsghsueity23ihwd"
```
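The `${oc.env:MY_VAR}` syntax is resolved at config load time. A minimal sketch of that substitution using only the standard library — `resolve_env_refs` is a toy stand-in for OmegaConf's real `oc.env` resolver, not the actual API:

```python
import os
import re

def resolve_env_refs(text: str) -> str:
    """Replace ${oc.env:VAR} references with values from os.environ
    (toy stand-in for OmegaConf's oc.env resolver)."""
    def _sub(match: re.Match) -> str:
        var = match.group(1)
        if var not in os.environ:
            raise KeyError(f"environment variable {var!r} is not set")
        return os.environ[var]
    return re.sub(r"\$\{oc\.env:([A-Za-z_][A-Za-z0-9_]*)\}", _sub, text)

os.environ["MY_VAR"] = "/home/user/my/system/path"
print(resolve_env_refs("data_dir: ${oc.env:MY_VAR}/data"))
# data_dir: /home/user/my/system/path/data
```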
This file was deleted.
.pre-commit-config.yaml

```diff
@@ -1,40 +1,40 @@
 default_language_version:
   python: python3.8

 repos:
   # Pre-commit hooks
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v3.2.0
+    rev: v3.4.0
     hooks:
       # list of supported hooks: https://pre-commit.com/hooks.html
       - id: trailing-whitespace
-      - id: debug-statements
-      - id: detect-private-key
       - id: end-of-file-fixer
       - id: check-yaml
       - id: check-added-large-files
+      - id: debug-statements
+      - id: detect-private-key

-  # Black (code formatting)
+  # python code formatting
   - repo: https://github.com/psf/black
     rev: 20.8b1
     hooks:
       - id: black
-        args: [
-            --line-length, "99",
-            # --exclude, src/train.py,
-          ]
+        args: [--line-length, "99"]

-  # Isort (import sorting)
+  # python import sorting
   - repo: https://github.com/PyCQA/isort
-    rev: 5.7.0
+    rev: 5.8.0
     hooks:
       - id: isort
         # profiles: https://pycqa.github.io/isort/docs/configuration/profiles/
         # other flags: https://pycqa.github.io/isort/docs/configuration/options/
         args: [
             --profile, black,
             --skip, src/train.py,
             --skip, run.py,
             --filter-files,
           ]
         # files: "src/.*"
+
+  # yaml formatting
+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: v2.3.0
+    hooks:
+      - id: prettier
+        types: [yaml]
+
+  # python code analysis
+  - repo: https://github.com/PyCQA/flake8
+    rev: 3.9.2
+    hooks:
+      - id: flake8
```
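The `trailing-whitespace` and `end-of-file-fixer` hooks configured above are simple text normalizers. A rough sketch of what they do to each staged file — this is an illustration, not the actual pre-commit hook implementation:

```python
def normalize_text(text: str) -> str:
    """Strip trailing whitespace from every line and ensure the file ends
    with exactly one newline -- roughly what the trailing-whitespace and
    end-of-file-fixer pre-commit hooks do (sketch, not the real hooks)."""
    lines = [line.rstrip() for line in text.splitlines()]
    return "\n".join(lines).rstrip("\n") + "\n"

print(normalize_text("x = 1   \ny = 2\n\n\n"))
# x = 1
# y = 2
```

In practice the hooks run automatically on each commit after a one-time `pre-commit install`, or on the whole repo with `pre-commit run -a`.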
Dockerfile (new file)

```diff
@@ -0,0 +1,22 @@
+# Build: docker build -t project_name .
+# Run: docker run --gpus all -it --rm project_name
+
+# Build from official Nvidia PyTorch image
+# GPU-ready with Apex for mixed-precision support
+# https://ngc.nvidia.com/catalog/containers/nvidia:pytorch
+# https://docs.nvidia.com/deeplearning/frameworks/support-matrix/
+FROM nvcr.io/nvidia/pytorch:21.03-py3
+
+
+# Copy all files
+ADD . /workspace/project
+WORKDIR /workspace/project
+
+
+# Create myenv
+RUN conda env create -f conda_env_gpu.yaml -n myenv
+RUN conda init bash
+
+
+# Set myenv to default virtual environment
+RUN echo "source activate myenv" >> ~/.bashrc
```
Default callbacks config

```diff
@@ -1,17 +1,16 @@
 model_checkpoint:
   _target_: pytorch_lightning.callbacks.ModelCheckpoint
   monitor: "val/acc" # name of the logged metric which determines when model is improving
   save_top_k: 1 # save k best models (determined by above metric)
   save_last: True # additionally always save model from last epoch
   mode: "max" # can be "max" or "min"
   verbose: False
-  dirpath: 'checkpoints/'
-  filename: '{epoch:02d}'
+  dirpath: "checkpoints/"
+  filename: "{epoch:02d}"

 early_stopping:
   _target_: pytorch_lightning.callbacks.EarlyStopping
   monitor: "val/acc" # name of the logged metric which determines when model is improving
   patience: 100 # how many epochs of not improving until training stops
   mode: "max" # can be "max" or "min"
   min_delta: 0 # minimum change in the monitored metric needed to qualify as an improvement
```
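The `early_stopping` entry configures Lightning's `EarlyStopping` callback. Its core bookkeeping — track the best metric, count epochs without improvement, stop after `patience` of them — can be sketched like this (a toy version of the `patience`/`mode`/`min_delta` logic, not the actual pytorch_lightning implementation):

```python
class EarlyStopper:
    """Toy version of the patience/mode/min_delta bookkeeping
    (a sketch, not pytorch_lightning.callbacks.EarlyStopping)."""

    def __init__(self, patience: int = 100, mode: str = "max", min_delta: float = 0.0):
        self.patience = patience
        self.sign = 1.0 if mode == "max" else -1.0  # flip sign so "bigger is better"
        self.min_delta = min_delta
        self.best = float("-inf")
        self.bad_epochs = 0

    def step(self, metric: float) -> bool:
        """Record one epoch's metric; return True when training should stop."""
        if self.sign * metric > self.best + self.min_delta:
            self.best = self.sign * metric
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

stopper = EarlyStopper(patience=2, mode="max")
print([stopper.step(m) for m in [0.5, 0.6, 0.55, 0.58]])
# [False, False, False, True]
```

With `patience: 100` as configured above, training would only stop after 100 consecutive epochs without `val/acc` improving by more than `min_delta`.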
Wandb callbacks config

```diff
@@ -1,32 +1,26 @@
 defaults:
   - default.yaml

 watch_model:
-  _target_: src.callbacks.wandb_callbacks.WatchModelWithWandb
+  _target_: src.callbacks.wandb_callbacks.WatchModel
   log: "all"
   log_freq: 100

 upload_code_as_artifact:
-  _target_: src.callbacks.wandb_callbacks.UploadCodeToWandbAsArtifact
+  _target_: src.callbacks.wandb_callbacks.UploadCodeAsArtifact
   code_dir: ${work_dir}/src

 upload_ckpts_as_artifact:
-  _target_: src.callbacks.wandb_callbacks.UploadCheckpointsToWandbAsArtifact
+  _target_: src.callbacks.wandb_callbacks.UploadCheckpointsAsArtifact
   ckpt_dir: "checkpoints/"
   upload_best_only: True

 log_f1_precision_recall_heatmap:
-  _target_: src.callbacks.wandb_callbacks.LogF1PrecRecHeatmapToWandb
+  _target_: src.callbacks.wandb_callbacks.LogF1PrecRecHeatmap

 log_confusion_matrix:
-  _target_: src.callbacks.wandb_callbacks.LogConfusionMatrixToWandb
+  _target_: src.callbacks.wandb_callbacks.LogConfusionMatrix

-log_images_with_predictions:
-  _target_: src.callbacks.wandb_callbacks.ImagePredictionLogger
-  num_samples: 8
+log_image_predictions:
+  _target_: src.callbacks.wandb_callbacks.LogImagePredictions
+  num_samples: 8
```
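Each `_target_` key above is a dotted import path that Hydra resolves to a class and instantiates with the remaining keys as keyword arguments. A minimal sketch of that mechanism — this `instantiate` is a toy stand-in for `hydra.utils.instantiate`, demonstrated on a stdlib class since the template's callback classes are not available here:

```python
import importlib

def instantiate(config: dict):
    """Import the class named by _target_ and call it with the remaining
    keys as kwargs -- a toy stand-in for hydra.utils.instantiate."""
    module_path, _, class_name = config["_target_"].rpartition(".")
    cls = getattr(importlib.import_module(module_path), class_name)
    kwargs = {k: v for k, v in config.items() if k != "_target_"}
    return cls(**kwargs)

# stdlib class standing in for e.g. src.callbacks.wandb_callbacks.LogImagePredictions
td = instantiate({"_target_": "datetime.timedelta", "days": 2, "hours": 3})
print(td)  # 2 days, 3:00:00
```

This is why renaming a callback class (e.g. `WatchModelWithWandb` to `WatchModel`) requires only updating the `_target_` string in the config.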