# Pre-train

## Notice

We provide code for pre-training on both the DROID and OXE datasets. Before launching, update `save_checkpoint_path` to the directory where training checkpoints should be saved, set `root_dir` to the location of the preprocessed real data, and fill in the SLURM information in the provided scripts.
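For example, the two path settings look roughly like this (placeholder paths, not the repo's defaults):

```bash
# Placeholder paths -- replace with your own locations.
save_checkpoint_path=/path/to/checkpoints        # where training checkpoints are written
root_dir=/path/to/preprocessed_real_data         # where the preprocessed real data lives
```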

## Preparation

```bash
cd ${YOUR_PATH_TO_SEER}
conda activate seer
```

## Pre-train (DROID FULL)

- For single-node pre-training:

  ```bash
  bash scripts/REAL/single_node_full_cluster.sh
  ```

- For multi-node pre-training (see the SLURM header sketch after this list):

  ```bash
  bash scripts/REAL/slurm_s_full_cluster.sh
  ```
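The multi-node scripts target a SLURM cluster. As an illustrative sketch of the fields to fill in, a typical SLURM header looks like the following; the actual directives inside `slurm_s_full_cluster.sh` may differ:

```bash
# Illustrative SLURM header -- the real directives in the provided scripts
# may differ; fill in values for your own cluster.
#SBATCH --job-name=seer_pretrain    # job name shown in the queue
#SBATCH --nodes=4                   # number of nodes for multi-node training
#SBATCH --gres=gpu:8                # GPUs requested per node
#SBATCH --partition=your_partition  # cluster partition to submit to
```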

## Pre-train (DROID with Language)

- For single-node pre-training:

  ```bash
  bash scripts/REAL/single_node_language_cluster.sh
  ```

- For multi-node pre-training:

  ```bash
  bash scripts/REAL/slurm_s_language_cluster.sh
  ```

## Pre-train (OXE)

- For multi-node pre-training, first generate the data info using `oxe_dataset_info` in `Seer/utils/real_ft_data.py` (a hypothetical invocation is sketched after this list), then train the model:

  ```bash
  bash scripts/REAL/slurm_s_language_cluster.sh
  ```
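How `oxe_dataset_info` is invoked depends on how `real_ft_data.py` is organized; as a hypothetical sketch, assuming it is importable from the repo root, you could generate or inspect the data info with something like:

```bash
# Hypothetical invocation -- assumes oxe_dataset_info is importable from the
# repo root; check Seer/utils/real_ft_data.py for the actual entry point.
cd ${YOUR_PATH_TO_SEER}
python -c "from utils.real_ft_data import oxe_dataset_info; print(oxe_dataset_info)"
```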