Finetuning X3D model on Hagrid dataset #129

Merged · 6 commits · Dec 10, 2023
checkpoint: [GCP] efficient_x3d_s finetuning on whole dataset
config:
    trainable params: s5.pathway, head, projection layers
    optimizer: AdamW with lr: 3e-5
    total_epochs: 1
    batch_size: 48
-> two consecutive runs, continuing from version_15
thisisWooyeol committed Dec 10, 2023
commit 5ff41fe15ca2385c2b91026cdc958ae0e12b5ac7
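The commit message describes the fine-tuning setup: only the s5.pathway, head, and projection layers are trainable, optimized with AdamW at lr 3e-5. A minimal sketch of that freeze-then-unfreeze pattern, using a toy `nn.ModuleDict` as a stand-in for the real efficient_x3d_s network (the block names below only mirror the config; the actual architecture comes from pytorchvideo):

```python
import torch
import torch.nn as nn

# Toy stand-in for efficient_x3d_s: only the names "s5", "head", and
# "projection" are taken from the commit's trainable-param config.
model = nn.ModuleDict({
    "s1": nn.Linear(8, 8),          # frozen backbone stage (illustrative)
    "s5": nn.Linear(8, 8),          # trainable per the config
    "head": nn.Linear(8, 4),        # trainable per the config
    "projection": nn.Linear(4, 4),  # trainable per the config
})

# Freeze everything, then re-enable gradients only for the listed blocks.
for p in model.parameters():
    p.requires_grad = False
for name in ("s5", "head", "projection"):
    for p in model[name].parameters():
        p.requires_grad = True

# Pass only the trainable parameters to AdamW; lr comes from the config.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=3e-5)
```

With `total_epochs: 1` and `batch_size: 48`, this optimizer would then drive a single pass over the whole HaGrid dataset.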
Binary file not shown.
Binary file not shown.
1 change: 1 addition & 0 deletions pytorchvideo/lightning_logs/HaGrid/version_16/hparams.yaml
@@ -0,0 +1 @@
{}
Binary file not shown.
Binary file not shown.
Binary file not shown.
1 change: 1 addition & 0 deletions pytorchvideo/lightning_logs/HaGrid/version_17/hparams.yaml
@@ -0,0 +1 @@
{}
Binary file not shown.
@@ -29,7 +29,7 @@
ShortSideScale,
UniformTemporalSubsample,
)
-from slurm import copy_and_run_with_config
+# from slurm import copy_and_run_with_config
from torchvision.transforms import (
CenterCrop,
Compose,
@@ -137,7 +137,7 @@ def __init__(self, args):
)

_state_dict = torch.load(
-    "./lightning_logs/version_14/epoch=1-step=33000.ckpt",
+    "./lightning_logs/HaGrid/version_16/epoch=1-step=9000.ckpt",
)['state_dict']
for key, value in _state_dict.copy().items():
_state_dict[key.replace('model.', '', 1)] = value
@@ -446,7 +446,7 @@ def main():
callbacks=[
# EarlyStopping('val_loss'),
ModelCheckpoint(
-dirpath="./lightning_logs/version_15/",
+dirpath="./lightning_logs/HaGrid/version_17/",
every_n_train_steps=3000,
save_last=True,
),
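The `ModelCheckpoint(every_n_train_steps=3000, save_last=True)` callback in the diff saves a checkpoint every 3000 optimizer steps plus a rolling "last" copy. A hand-rolled, torch-only sketch of that behavior (the step interval is scaled down so the loop stays tiny; paths and the toy model are illustrative, not from the repo):

```python
import os
import tempfile
import torch
import torch.nn as nn

# Stand-ins for the real training loop and model.
every_n_train_steps = 3  # the diff uses 3000; scaled down for the sketch
model = nn.Linear(2, 2)
dirpath = tempfile.mkdtemp()

for step in range(1, 10):
    # ... one training step would happen here ...
    if step % every_n_train_steps == 0:
        # Periodic checkpoint, like ModelCheckpoint(every_n_train_steps=...).
        torch.save(model.state_dict(), os.path.join(dirpath, f"step={step}.ckpt"))

# Rolling final checkpoint, like save_last=True.
torch.save(model.state_dict(), os.path.join(dirpath, "last.ckpt"))

saved = sorted(os.listdir(dirpath))
```

Lightning's callback also records optimizer and scheduler state in the `.ckpt`, which is why the earlier loading code indexes into `['state_dict']` rather than loading the file directly.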