This repository contains the source code and training sets for the following paper:

"Learning Multi-scale Features for Foreground Segmentation" by Long Ang LIM and Hacer YALIM KELES

The preprint is available at: https://arxiv.org/abs/1808.01477
If you find FgSegNet_v2 useful in your research, please consider citing:
```
@article{lim2018learning,
  title={Learning Multi-scale Features for Foreground Segmentation},
  author={Lim, Long Ang and Keles, Hacer Yalim},
  journal={arXiv preprint arXiv:1808.01477},
  year={2018}
}
```
This work was implemented with the following frameworks:
- Spyder 3.2.x (recommended)
- Python 3.6.3
- Keras 2.0.6
- Tensorflow-gpu 1.1.0
Clone this repo:
```
git clone https://github.com/lim-anggun/FgSegNet_v2.git
```
Download CDnet2014, SBI2015 and UCSD datasets, then put them in the following directory structure:
Example:

```
FgSegNet_v2/
├── scripts/
│   ├── FgSegNet_v2_CDnet.py
│   ├── FgSegNet_v2_SBI.py
│   ├── FgSegNet_v2_UCSD.py
│   ├── FgSegNet_v2_module.py
│   ├── instance_normalization.py
│   └── my_upsampling_2d.py
├── datasets/
│   ├── CDnet2014_dataset/...
│   ├── SBI2015_dataset/...
│   └── UCSD_dataset/...
└── training_sets/
    ├── CDnet2014_train/...
    ├── SBI2015_train/...
    ├── UCSD_train20/...
    └── UCSD_train50/...
```
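A minimal shell sketch to create this skeleton (directory names taken from the layout above; the scripts come from the cloned repo and the dataset contents must be downloaded separately):

```shell
# Create the directory skeleton expected by the training scripts.
mkdir -p FgSegNet_v2/scripts
mkdir -p FgSegNet_v2/datasets/CDnet2014_dataset
mkdir -p FgSegNet_v2/datasets/SBI2015_dataset
mkdir -p FgSegNet_v2/datasets/UCSD_dataset
mkdir -p FgSegNet_v2/training_sets/CDnet2014_train
mkdir -p FgSegNet_v2/training_sets/SBI2015_train
mkdir -p FgSegNet_v2/training_sets/UCSD_train20
mkdir -p FgSegNet_v2/training_sets/UCSD_train50
```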
Run the scripts with the Spyder IDE. All trained models are saved automatically in the current working directory.
We evaluate our method on three datasets: CDnet2014, SBI2015, and UCSD.
The table below shows overall results across the 11 categories of the Change Detection 2014 (CDnet2014) challenge.
Methods | PWC | F-Measure | Speed (320×240, batch size 1, NVIDIA GTX 970)
---|---|---|---
FgSegNet_v2 | 0.0402 | 0.9847 | 23 fps
The table below shows overall test results across the 14 video sequences of SBI2015.
Methods | PWC | F-Measure |
---|---|---|
FgSegNet_v2 | 0.7148 | 0.9853 |
The table below shows overall test results across the 18 video sequences of UCSD.
Methods | PWC (20% split) | F-Measure (20% split) | PWC (50% split) | F-Measure (50% split) |
---|---|---|---|---|
FgSegNet_v2 | 0.6136 | 0.8945 | 0.4405 | 0.9203 |
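For reference, PWC (Percentage of Wrong Classifications) and F-Measure are the standard pixel-level metrics used above. A minimal sketch of how they are computed from per-pixel counts (illustrative only, not the authors' evaluation code):

```python
def pwc(tp, fp, tn, fn):
    """Percentage of Wrong Classifications: misclassified pixels over all pixels."""
    return 100.0 * (fp + fn) / (tp + fp + tn + fn)

def f_measure(tp, fp, fn):
    """F-Measure: harmonic mean of pixel-level precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```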
07/08/2018:
- Added FgSegNet_v2 source code and training frames
Contact: lim.longang at gmail.com
Issues and discussions are welcome.