[Docs] Docs revert (open-mmlab#1359)
* master

* master 0721

* add README

* Revert "[Docs] Merge docs & docs_zh (open-mmlab#1342)"

This reverts commit 364b54d.
gengenkai authored Dec 28, 2021
1 parent f0f3c2c commit ed53d94
Showing 134 changed files with 242 additions and 243 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build.yml
@@ -12,8 +12,8 @@ on:
- '!demo/**'
- '!docker/**'
- '!tools/**'
- '!docs/en/**'
- '!docs/zh_cn/**'
- '!docs/**'
- '!docs_zh_CN/**'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
3 changes: 1 addition & 2 deletions .gitignore
@@ -65,8 +65,7 @@ instance/
.scrapy

# Sphinx documentation
docs/en/_build/
docs/zh_cn/_build/
docs/_build/

# PyBuilder
target/
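
The bulk of this commit's 134 changed files is the mechanical path rewrite visible in the two files above: `docs/en/` reverts to `docs/` and `docs/zh_cn/` reverts to `docs_zh_CN/`. A minimal Python sketch of that kind of rewrite follows; it is illustrative only (the commit itself was produced with `git revert`), and the Markdown glob and in-place editing strategy are assumptions, not the script the maintainers used.

```python
"""Illustrative sketch: rewrite doc links the way this revert does."""
from pathlib import Path

# Mapping implied by the diff above: docs/en -> docs, docs/zh_cn -> docs_zh_CN
REWRITES = [
    ("docs/en/", "docs/"),
    ("docs/zh_cn/", "docs_zh_CN/"),
]


def rewrite_links(root: str = ".") -> None:
    """Apply the path rewrites in place to every Markdown file under root."""
    for md_file in Path(root).rglob("*.md"):
        text = md_file.read_text(encoding="utf-8")
        for old, new in REWRITES:
            text = text.replace(old, new)
        md_file.write_text(text, encoding="utf-8")


if __name__ == "__main__":
    rewrite_links()
```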
30 changes: 15 additions & 15 deletions README.md
@@ -51,24 +51,24 @@ The master branch works with **PyTorch 1.3+**.
- (2021-10-25) We provide a [guide](https://github.com/open-mmlab/mmaction2/blob/master/configs/skeleton/posec3d/custom_dataset_training.md) on how to train PoseC3D with custom datasets, [bit-scientist](https://github.com/bit-scientist) authored this PR!
- (2021-10-16) We support **PoseC3D** on UCF101 and HMDB51, achieving 87.0% and 69.3% Top-1 accuracy with 2D skeletons only. Pre-extracted 2D skeletons are also available.

**Release**: v0.20.0 was released on 30/10/2021. Please refer to [changelog.md](docs/en/changelog.md) for details and release history.
**Release**: v0.20.0 was released on 30/10/2021. Please refer to [changelog.md](docs/changelog.md) for details and release history.

## Installation

Please refer to [install.md](docs/en/install.md) for installation.
Please refer to [install.md](docs/install.md) for installation.

## Get Started

Please see [getting_started.md](docs/en/getting_started.md) for the basic usage of MMAction2.
Please see [getting_started.md](docs/getting_started.md) for the basic usage of MMAction2.
There are also tutorials:

- [learn about configs](docs/en/tutorials/1_config.md)
- [finetuning models](docs/en/tutorials/2_finetune.md)
- [adding new dataset](docs/en/tutorials/3_new_dataset.md)
- [designing data pipeline](docs/en/tutorials/4_data_pipeline.md)
- [adding new modules](docs/en/tutorials/5_new_modules.md)
- [exporting model to onnx](docs/en/tutorials/6_export_model.md)
- [customizing runtime settings](docs/en/tutorials/7_customize_runtime.md)
- [learn about configs](docs/tutorials/1_config.md)
- [finetuning models](docs/tutorials/2_finetune.md)
- [adding new dataset](docs/tutorials/3_new_dataset.md)
- [designing data pipeline](docs/tutorials/4_data_pipeline.md)
- [adding new modules](docs/tutorials/5_new_modules.md)
- [exporting model to onnx](docs/tutorials/6_export_model.md)
- [customizing runtime settings](docs/tutorials/7_customize_runtime.md)

A Colab tutorial is also provided. You may preview the notebook [here](demo/mmaction2_tutorial.ipynb) or directly [run](https://colab.research.google.com/github/open-mmlab/mmaction2/blob/master/demo/mmaction2_tutorial.ipynb) on Colab.

@@ -207,16 +207,16 @@ Datasets marked with * are not fully supported yet, but related dataset preparat

## Benchmark

To demonstrate the efficacy and efficiency of our framework, we compare MMAction2 with some other popular frameworks and official releases in terms of speed. Details can be found in [benchmark](docs/en/benchmark.md).
To demonstrate the efficacy and efficiency of our framework, we compare MMAction2 with some other popular frameworks and official releases in terms of speed. Details can be found in [benchmark](docs/benchmark.md).

## Data Preparation

Please refer to [data_preparation.md](docs/en/data_preparation.md) for a general overview of data preparation.
The supported datasets are listed in [supported_datasets.md](docs/en/supported_datasets.md).
Please refer to [data_preparation.md](docs/data_preparation.md) for a general overview of data preparation.
The supported datasets are listed in [supported_datasets.md](docs/supported_datasets.md).

## FAQ

Please refer to [FAQ](docs/en/faq.md) for frequently asked questions.
Please refer to [FAQ](docs/faq.md) for frequently asked questions.

## Projects built on MMAction2

@@ -226,7 +226,7 @@ Currently, there are many research works and projects built on MMAction2 by user
- Evidential Deep Learning for Open Set Action Recognition, ICCV 2021 **Oral**. [[paper]](https://arxiv.org/abs/2107.10161)[[github]](https://github.com/Cogito2012/DEAR)
- Rethinking Self-supervised Correspondence Learning: A Video Frame-level Similarity Perspective, ICCV 2021 **Oral**. [[paper]](https://arxiv.org/abs/2103.17263)[[github]](https://github.com/xvjiarui/VFS)

See [projects.md](docs/en/projects.md) for the full list of related projects.
See [projects.md](docs/projects.md) for the full list of related projects.

## License

28 changes: 14 additions & 14 deletions README_zh-CN.md
@@ -50,23 +50,23 @@ MMAction2 is an open-source video understanding toolbox based on PyTorch and a member of the [OpenMMLa
- (2021-10-25) Added a [tutorial](https://github.com/open-mmlab/mmaction2/blob/master/configs/skeleton/posec3d/custom_dataset_training.md) on training PoseC3D with custom datasets; this PR was contributed by [bit-scientist](https://github.com/bit-scientist)!
- (2021-10-16) Added support for **PoseC3D** on UCF101 and HMDB51, reaching 87.0% and 69.3% Top-1 accuracy respectively with only 2D keypoints. Pre-extracted skeletons for both datasets are publicly available.

Version v0.20.0 was released on 30 Oct 2021. Please refer to the [changelog](/docs/en/changelog.md) for details and release history.
Version v0.20.0 was released on 30 Oct 2021. Please refer to the [changelog](/docs/changelog.md) for details and release history.

## Installation

Please refer to the [installation guide](/docs/zh_cn/install.md) for installation.
Please refer to the [installation guide](/docs_zh_CN/install.md) for installation.

## Tutorials

Please refer to [getting_started](/docs/zh_cn/getting_started.md) for the basic usage of MMAction2. MMAction2 also provides more detailed tutorials:
Please refer to [getting_started](/docs_zh_CN/getting_started.md) for the basic usage of MMAction2. MMAction2 also provides more detailed tutorials:

- [How to write config files](/docs/zh_cn/tutorials/1_config.md)
- [How to finetune models](/docs/zh_cn/tutorials/2_finetune.md)
- [How to add a new dataset](/docs/zh_cn/tutorials/3_new_dataset.md)
- [How to design a data pipeline](/docs/zh_cn/tutorials/4_data_pipeline.md)
- [How to add new modules](/docs/zh_cn/tutorials/5_new_modules.md)
- [How to export a model to ONNX](/docs/zh_cn/tutorials/6_export_model.md)
- [How to customize runtime settings](/docs/zh_cn/tutorials/7_customize_runtime.md)
- [How to write config files](/docs_zh_CN/tutorials/1_config.md)
- [How to finetune models](/docs_zh_CN/tutorials/2_finetune.md)
- [How to add a new dataset](/docs_zh_CN/tutorials/3_new_dataset.md)
- [How to design a data pipeline](/docs_zh_CN/tutorials/4_data_pipeline.md)
- [How to add new modules](/docs_zh_CN/tutorials/5_new_modules.md)
- [How to export a model to ONNX](/docs_zh_CN/tutorials/6_export_model.md)
- [How to customize runtime settings](/docs_zh_CN/tutorials/7_customize_runtime.md)

MMAction2 also provides a Chinese Colab tutorial; click [here](https://colab.research.google.com/github/open-mmlab/mmaction2/blob/master/demo/mmaction2_tutorial_zh-CN.ipynb) to try it out!

@@ -203,15 +203,15 @@ MMAction2 will keep up with the latest progress of the community and support more algorithms and frameworks. If

## Benchmark

To demonstrate the high accuracy and efficiency of MMAction2, the developers compare its speed with other mainstream frameworks. See [benchmark](/docs/zh_cn/benchmark.md) for more details.
To demonstrate the high accuracy and efficiency of MMAction2, the developers compare its speed with other mainstream frameworks. See [benchmark](/docs_zh_CN/benchmark.md) for more details.

## Data Preparation

Please refer to [data_preparation](/docs/zh_cn/data_preparation.md) for an overview of dataset preparation. All supported datasets are listed in [supported_datasets](/docs/zh_cn/supported_datasets.md).
Please refer to [data_preparation](/docs_zh_CN/data_preparation.md) for an overview of dataset preparation. All supported datasets are listed in [supported_datasets](/docs_zh_CN/supported_datasets.md).

## FAQ

Please refer to the [FAQ](/docs/zh_cn/faq.md) for frequently asked questions from other users.
Please refer to the [FAQ](/docs_zh_CN/faq.md) for frequently asked questions from other users.

## Related Work

@@ -221,7 +221,7 @@ MMAction2 will keep up with the latest progress of the community and support more algorithms and frameworks. If
- Rethinking Self-supervised Correspondence Learning: A Video Frame-level Similarity Perspective, ICCV 2021 **Oral**. [[paper]](https://arxiv.org/abs/2103.17263)[[code]](https://github.com/xvjiarui/VFS)
- Video Swin Transformer. [[paper]](https://arxiv.org/abs/2106.13230)[[code]](https://github.com/SwinTransformer/Video-Swin-Transformer)

See [projects](docs/en/projects.md) for more details.
See [projects](docs/projects.md) for more details.

## License

6 changes: 3 additions & 3 deletions configs/detection/acrn/README.md
@@ -59,7 +59,7 @@ Current state-of-the-art approaches for spatio-temporal action localization rely

:::

For more details on data preparation, you can refer to AVA in [Data Preparation](/docs/en/data_preparation.md).
For more details on data preparation, you can refer to AVA in [Data Preparation](/docs/data_preparation.md).

## Train

@@ -75,7 +75,7 @@ Example: train ACRN with SlowFast backbone on AVA with periodic validation.
python tools/train.py configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py --validate
```

For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/en/getting_started.md#training-setting).
For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/getting_started.md#training-setting).

## Test

@@ -91,4 +91,4 @@ Example: test ACRN with SlowFast backbone on AVA and dump the result to a csv fi
python tools/test.py configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/en/getting_started.md#test-a-dataset).
For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/getting_started.md#test-a-dataset).
6 changes: 3 additions & 3 deletions configs/detection/acrn/README_zh-CN.md
@@ -46,7 +46,7 @@
According to the [Linear Scaling Rule](https://arxiv.org/abs/1706.02677), you need to scale the learning rate in proportion to the batch size when using a different number of GPUs or a different number of videos per GPU.
For example, lr=0.01 for 4 GPUs x 2 video/gpu and lr=0.08 for 16 GPUs x 4 video/gpu (see the short sketch below).

For details on data preparation, please refer to [data_preparation](/docs/zh_cn/data_preparation.md).
For details on data preparation, please refer to [data_preparation](/docs_zh_CN/data_preparation.md).

## Train

@@ -62,7 +62,7 @@ python tools/train.py ${CONFIG_FILE} [optional arguments]
python tools/train.py configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py --validate
```

For more training details, please refer to the **Training setting** part in [getting_started](/docs/zh_cn/getting_started.md#训练配置).
For more training details, please refer to the **Training setting** part in [getting_started](/docs_zh_CN/getting_started.md#训练配置).

## Test

@@ -78,4 +78,4 @@ python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
python tools/test.py configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs/zh_cn/getting_started.md#测试某个数据集).
For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs_zh_CN/getting_started.md#测试某个数据集).
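
The linear scaling rule quoted near the top of this file's diff fixes only a proportionality: lr=0.01 at 4 GPUs x 2 videos/GPU scales to lr=0.08 at 16 GPUs x 4 videos/GPU. A minimal sketch of that arithmetic is shown here; the function name and base values are illustrative assumptions, not part of MMAction2's API.

```python
def scaled_lr(num_gpus: int, videos_per_gpu: int,
              base_lr: float = 0.01, base_videos: int = 8) -> float:
    """Linear scaling rule: the learning rate grows with the total batch size.

    The base point (lr=0.01 at 4 GPUs x 2 videos/GPU = 8 videos per step)
    comes from the README text above; adjust it to match your own config.
    """
    return base_lr * (num_gpus * videos_per_gpu) / base_videos


print(scaled_lr(4, 2))   # 0.01 -> the quoted 4 GPUs x 2 videos/GPU setting
print(scaled_lr(16, 4))  # 0.08 -> the quoted 16 GPUs x 4 videos/GPU setting
```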
6 changes: 3 additions & 3 deletions configs/detection/ava/README.md
@@ -86,7 +86,7 @@ AVA, with its realistic scene and action complexity, exposes the intrinsic diffi

:::

For more details on data preparation, you can refer to AVA in [Data Preparation](/docs/en/data_preparation.md).
For more details on data preparation, you can refer to AVA in [Data Preparation](/docs/data_preparation.md).

## Train

@@ -102,7 +102,7 @@ Example: train SlowOnly model on AVA with periodic validation.
python tools/train.py configs/detection/ava/slowonly_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py --validate
```

For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/en/getting_started.md#training-setting).
For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/getting_started.md#training-setting).

### Train Custom Classes From the AVA Dataset

@@ -140,4 +140,4 @@ Example: test SlowOnly model on AVA and dump the result to a csv file.
python tools/test.py configs/detection/ava/slowonly_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/en/getting_started.md#test-a-dataset).
For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/getting_started.md#test-a-dataset).
6 changes: 3 additions & 3 deletions configs/detection/ava/README_zh-CN.md
@@ -72,7 +72,7 @@
For example, lr=0.01 for 4 GPUs x 2 video/gpu and lr=0.08 for 16 GPUs x 4 video/gpu.
2. **Context** means using both RoI features and global features for classification, which brings an improvement of about 1% mAP.

For details on data preparation, please refer to [data_preparation](/docs/zh_cn/data_preparation.md).
For details on data preparation, please refer to [data_preparation](/docs_zh_CN/data_preparation.md).

## Train

@@ -88,7 +88,7 @@ python tools/train.py ${CONFIG_FILE} [optional arguments]
python tools/train.py configs/detection/ava/slowonly_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py --validate
```

For more training details, please refer to the **Training setting** part in [getting_started](/docs/zh_cn/getting_started.md#训练配置).
For more training details, please refer to the **Training setting** part in [getting_started](/docs_zh_CN/getting_started.md#训练配置).

### Train Custom Classes From the AVA Dataset

@@ -126,4 +126,4 @@ python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
python tools/test.py configs/detection/ava/slowonly_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs/zh_cn/getting_started.md#测试某个数据集).
For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs_zh_CN/getting_started.md#测试某个数据集).
4 changes: 2 additions & 2 deletions configs/detection/lfb/README.md
@@ -98,7 +98,7 @@ python tools/train.py configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_
--validate --seed 0 --deterministic
```

For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/en/getting_started.md#training-setting).
For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/getting_started.md#training-setting).

## Test

@@ -123,4 +123,4 @@ python tools/test.py configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r
checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more details, you can refer to the **Test a dataset** part in [getting_started](/docs/en/getting_started.md#test-a-dataset).
For more details, you can refer to the **Test a dataset** part in [getting_started](/docs/getting_started.md#test-a-dataset).
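
The LFB commands above pass `--seed 0 --deterministic` to pin randomness for reproducible runs. As a hedged illustration of what such flags usually amount to in PyTorch training code (generic practice, not MMAction2's exact implementation), a sketch:

```python
import random

import numpy as np
import torch


def set_deterministic(seed: int = 0) -> None:
    """Pin every RNG we know about and ask cuDNN for deterministic kernels."""
    random.seed(seed)                          # Python's built-in RNG
    np.random.seed(seed)                       # NumPy RNG
    torch.manual_seed(seed)                    # CPU (and current GPU) RNG
    torch.cuda.manual_seed_all(seed)           # all GPU RNGs
    torch.backends.cudnn.deterministic = True  # deterministic cuDNN algorithms
    torch.backends.cudnn.benchmark = False     # disable the nondeterministic auto-tuner
```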
4 changes: 2 additions & 2 deletions configs/detection/lfb/README_zh-CN.md
@@ -75,7 +75,7 @@ python tools/train.py configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_
--validate --seed 0 --deterministic
```

For more training details, please refer to the **Training setting** part in [getting_started](/docs/zh_cn/getting_started.md#训练配置).
For more training details, please refer to the **Training setting** part in [getting_started](/docs_zh_CN/getting_started.md#训练配置).

## Test

@@ -100,4 +100,4 @@ python tools/test.py configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r
checkpoints/SOME_CHECKPOINT.pth --eval mAP --out results.csv
```

For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs/zh_cn/getting_started.md#测试某个数据集).
For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs_zh_CN/getting_started.md#测试某个数据集).
6 changes: 3 additions & 3 deletions configs/localization/bmn/README.md
@@ -60,7 +60,7 @@ Temporal action proposal generation is a challenging and promising task which a

*We train BMN with the [official repo](https://github.com/JJBOY/BMN-Boundary-Matching-Network) and evaluate its proposal generation and action detection performance with [anet_cuhk_2017](https://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json) for label assignment.

For more details on data preparation, you can refer to ActivityNet feature in [Data Preparation](/docs/en/data_preparation.md).
For more details on data preparation, you can refer to ActivityNet feature in [Data Preparation](/docs/data_preparation.md).

## Train

@@ -76,7 +76,7 @@ Example: train BMN model on ActivityNet features dataset.
python tools/train.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py
```

For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/en/getting_started.md#training-setting).
For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/getting_started.md#training-setting).

## Test

@@ -109,4 +109,4 @@ python tools/analysis/report_map.py --proposal path/to/proposal_file

:::

For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/en/getting_started.md#test-a-dataset).
For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/getting_started.md#test-a-dataset).
6 changes: 3 additions & 3 deletions configs/localization/bmn/README_zh-CN.md
@@ -48,7 +48,7 @@

*MMAction2 trains BMN with the [official repo](https://github.com/JJBOY/BMN-Boundary-Matching-Network) and evaluates its temporal action proposal generation and temporal detection results against the corresponding labels in [anet_cuhk_2017](https://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json).

For details on data preparation, please refer to the ActivityNet feature part of [data_preparation](/docs/zh_cn/data_preparation.md).
For details on data preparation, please refer to the ActivityNet feature part of [data_preparation](/docs_zh_CN/data_preparation.md).

## Train

@@ -64,7 +64,7 @@ python tools/train.py ${CONFIG_FILE} [optional arguments]
python tools/train.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py
```

For more training details, please refer to the **Training setting** part in [getting_started](/docs/zh_cn/getting_started.md#训练配置).
For more training details, please refer to the **Training setting** part in [getting_started](/docs_zh_CN/getting_started.md#训练配置).

## Test

@@ -95,4 +95,4 @@ python tools/analysis/report_map.py --proposal path/to/proposal_file
python tools/data/activitynet/convert_proposal_format.py
```

For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs/zh_cn/getting_started.md#测试某个数据集).
For more testing details, please refer to the **Test a dataset** part in [getting_started](/docs_zh_CN/getting_started.md#测试某个数据集).