BeMapNet

NEWS !!!

  • Sep. 01st, 2023: 📌 We upload the long-version paper of PivotNet to arXiv. The code is being sorted out. Paper
  • Aug. 23rd, 2023: 🚀 🚀 🚀 The official implementation of our BeMapNet is released now. Enjoy it!
  • Jul. 14th, 2023: 👏 Our PivotNet is accepted by ICCV 2023! The preprint paper is in progress.
  • Jun. 18th, 2023: 💡 The Innovation Award of the AD-Challenge goes to our MachMap solution! Tech-Report
  • May. 26th, 2023: 🏆 Our team won the Championship of the CVPR23 Online HD Map Construction Challenge! Leaderboard
  • May. 16th, 2023: 📌 We upload our long-version paper with detailed supplementary material to arXiv. Paper
  • Feb. 28th, 2023: 👏 Our BeMapNet is accepted by CVPR 2023! Refer to the Paper for more details.

Introduction

Vectorized high-definition map (HD-map) construction, which focuses on the perception of centimeter-level environmental information, has attracted significant research interest in the autonomous driving community. In this paper, by delving into parameterization-based methods, we pioneer a concise and elegant scheme that adopts a UNIFIED piecewise Bezier curve. In order to vectorize diverse map elements end-to-end, we elaborate a simple yet effective architecture, named Piecewise Bezier HD-map Network (BeMapNet), which is formulated as a direct set prediction paradigm and is postprocessing-free. The overall architecture contains four primary components for extracting progressively richer information: image-level multi-scale features, semantic-level BEV features, instance-level curve descriptors, and point-level Bezier control sequences.
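
As a concrete illustration of this piecewise Bezier parameterization (a sketch added here for clarity, not code taken from the repository), the snippet below samples points along a piecewise cubic Bezier curve from per-piece control points. The cubic degree, the number of pieces, and the array layout are illustrative assumptions.

    # Illustrative sketch: evaluate a piecewise Bezier curve from control points.
    # The degree (cubic), piece count, and array layout are assumptions for this
    # example, not the layout used inside BeMapNet itself.
    import numpy as np
    from math import comb

    def bezier_piece(ctrl_pts, ts):
        """Evaluate one Bezier piece. ctrl_pts: (n+1, 2), ts: (T,) in [0, 1]."""
        n = len(ctrl_pts) - 1
        # Bernstein basis B_{i,n}(t) = C(n, i) * t^i * (1 - t)^(n - i)
        basis = np.stack([comb(n, i) * ts**i * (1 - ts)**(n - i) for i in range(n + 1)], axis=1)
        return basis @ ctrl_pts  # (T, 2) sampled points

    def piecewise_bezier(pieces, pts_per_piece=20):
        """pieces: (num_pieces, n+1, 2) control points; adjacent pieces share endpoints."""
        ts = np.linspace(0.0, 1.0, pts_per_piece)
        return np.concatenate([bezier_piece(p, ts) for p in pieces], axis=0)

    # Two cubic pieces joined at (1.0, 1.0): a toy map element in BEV coordinates.
    pieces = np.array([
        [[0.0, 0.0], [0.3, 0.5], [0.7, 0.8], [1.0, 1.0]],
        [[1.0, 1.0], [1.3, 1.2], [1.7, 1.1], [2.0, 1.0]],
    ])
    print(piecewise_bezier(pieces).shape)  # (40, 2)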

Documentation

Step-by-step Installation
  • a. Check Environment

    Python >= 3.8
    CUDA 11.1
    # other versions of python/cuda have not been fully tested, but I think they should work as well.
  • b. Create a conda virtual environment and activate it. (Optional)

    conda create -n bemapnet python=3.8 -y
    conda activate bemapnet
  • c. Install PyTorch and torchvision following the official instructions.

    pip3 install torch==1.10.1+cu111 torchvision==0.11.2+cu111 -f https://download.pytorch.org/whl/torch_stable.html
  • d. Install MMCV following the official instructions. (requires a GPU)

    pip3 install -U openmim
    mim install mmcv==1.7.1
  • e. Install Detectron2 following the official instructions.

    python3 -m pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.10/index.html
  • f. Install BeMapNet. (A quick post-install sanity check is sketched after this list.)

    git clone git@github.com:qiaolimeng/bemapnet.git -b bemapnet-release
    cd bemapnet
    pip3 install -r requirement.txt
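
If the steps above completed successfully, a quick import check such as the sketch below (added here; not a script shipped with the repo) should run without errors and report a CUDA-enabled PyTorch build.

    # Post-install sanity check (not part of the repo): the core dependencies
    # from steps c-e should import cleanly and CUDA should be visible.
    import torch
    import torchvision
    import mmcv
    import detectron2

    print("torch:", torch.__version__)              # expect 1.10.1+cu111
    print("torchvision:", torchvision.__version__)  # expect 0.11.2+cu111
    print("mmcv:", mmcv.__version__)                # expect 1.7.1
    print("detectron2:", detectron2.__version__)
    print("CUDA available:", torch.cuda.is_available())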
Material Preparation
  • a. Data: NuScenes

    • Download and unzip the NuScenes dataset onto your server and link it to a desirable path.
      cd /path/to/bemapnet
      mkdir data
      ln -s /any/path/to/your/nuscenes data/nuscenes
    • Generate Bezier annotations from the NuScenes raw annotations.
      cd /path/to/bemapnet
      python3 tools/bezier_converter/nuscenes/convert.py -d ./data -n bemapnet
    • OR download from here and put it into /path/to/bemapnet/data/nuscenes
      cd /path/to/bemapnet
      mkdir data/nuscenes/customer
      cd data/nuscenes/customer
      wget https://github.com/er-muyue/BeMapNet/releases/download/v1.0/bemapnet.zip
      unzip bemapnet.zip -d bemapnet  # the extracted files should end up under data/nuscenes/customer/bemapnet/
  • b. Weights: Public-Pretrain-Models

    • Download the public pretrained weights for backbone initialization.
      cd /path/to/bemapnet
      cd assets/weights
      wget https://github.com/er-muyue/BeMapNet/releases/download/v1.0/efficientnet-b0-355c32eb.pth
      wget https://github.com/er-muyue/BeMapNet/releases/download/v1.0/resnet50-0676ba61.pth
      wget https://github.com/er-muyue/BeMapNet/releases/download/v1.0/upernet_swin_tiny_patch4_window7_512x512.pth
  • c. Check: Project-Structure

    • Your project directory should look like the tree below (a layout-check sketch follows it),
        assets
          | -- weights (resnet, swin-t, efficient-b0, ...)
          | -- 
        bemapnet
        configs
        data
          | -- nuscenes
            | -- samples (CAM_FRONT, CAM_FRONT_LEFT, CAM_FRONT_RIGHT, ...)
            | -- annotations
            | -- v1.0-trainval
            | -- ...
            | -- customer
              | -- bemapnet
                | -- *.npz
        tools
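
Before training, the layout above can be verified with a short script like the one below (added here, not part of the repo). The paths mirror the tree in step c; only the weight filenames from step b and the *.npz pattern are assumed.

    # Layout sanity check (not part of the repo). Run from /path/to/bemapnet.
    from pathlib import Path

    root = Path(".")
    expected = [
        "assets/weights/efficientnet-b0-355c32eb.pth",
        "assets/weights/resnet50-0676ba61.pth",
        "assets/weights/upernet_swin_tiny_patch4_window7_512x512.pth",
        "data/nuscenes/samples",
        "data/nuscenes/v1.0-trainval",
        "data/nuscenes/customer/bemapnet",
    ]
    for rel in expected:
        status = "ok" if (root / rel).exists() else "MISSING"
        print(f"{status:>7}  {rel}")

    # The Bezier annotations generated (or downloaded) in step a are stored as .npz files.
    npz_files = list((root / "data/nuscenes/customer/bemapnet").glob("*.npz"))
    print(f"{len(npz_files)} bezier annotation files found")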
Training and Evaluation
  • a. Model Training

    bash run.sh train bemapnet_nuscenes_swint 30  # default: 8GPUs, bs=1, epochs=30
  • b. Model Evaluation (a checkpoint-inspection sketch follows this list)

    bash run.sh test bemapnet_nuscenes_swint ${checkpoint-path}
  • c. Reproduce with one command

    bash run.sh reproduce bemapnet_nuscenes_swint
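
For b., the ${checkpoint-path} argument can be sanity-checked beforehand with a small helper like the sketch below (added here; not part of the repo). It only peeks at the file with torch.load, so the printed key names depend on how the training run saved its state.

    # Optional checkpoint peek (not part of the repo). Usage:
    #   python3 peek_ckpt.py /path/to/checkpoint.pth   (script name is illustrative)
    import sys
    import torch

    ckpt = torch.load(sys.argv[1], map_location="cpu")
    if isinstance(ckpt, dict):
        print("top-level keys:", list(ckpt.keys()))
    else:
        print("checkpoint object of type:", type(ckpt))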

Models & Results

Results on NuScenes Val Set

Citation

If you find BeMapNet/MachMap useful in your research or applications, please consider giving us a star ⭐ and citing them with the following BibTeX entries:

@InProceedings{Qiao_2023_CVPR,
    author    = {Qiao, Limeng and Ding, Wenjie and Qiu, Xi and Zhang, Chi},
    title     = {End-to-End Vectorized HD-Map Construction With Piecewise Bezier Curve},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {13218-13228}
}

@article{qiao2023machmap,
    author={Limeng Qiao and Yongchao Zheng and Peng Zhang and Wenjie Ding and Xi Qiu and Xing Wei and Chi Zhang},
    title={MachMap: End-to-End Vectorized Solution for Compact HD-Map Construction}, 
    journal={arXiv preprint arXiv:2306.10301},
    year={2023},
}

@misc{ding2023pivotnet,
      title={PivotNet: Vectorized Pivot Learning for End-to-end HD Map Construction}, 
      author={Wenjie Ding and Limeng Qiao and Xi Qiu and Chi Zhang},
      year={2023},
      eprint={2308.16477},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Stars

Stargazers over time
