Multi-Source Domain Adaptation with Mixture of Experts

Code and data for the EMNLP 2018 paper: Multi-Source Domain Adaptation with Mixture of Experts

Running

cd msda-src

# example script for training uni-MS (baseline)
# run "python amazon-chen/senti_unified.py -h" for full options
./train_unified.sh

# example script for training MoE
# run "python amazon-chen/senti_moe.py -h" for full options
./train_moe.sh

(To be updated for the latest versions of PyTorch.)

Note: the official Chen12 dataset does not contain a dev split. For hyper-parameter selection, create multiple folds for cross-validation by randomly splitting a dev set (1/10) off the (multi-source) training data. The dev files follow the naming convention ${domain}_dev.svmlight and live in the same directory as the training and test sets; a minimal sketch of such a split follows.
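The sketch below shows one way such a fold could be produced with scikit-learn (already a dependency). It assumes, for illustration only, that training files are named ${domain}_train.svmlight; adapt the paths and the domain name to the actual data layout.

# Hypothetical helper (not part of the released code): hold out 1/10 of
# one domain's training data as a dev fold, in svmlight format.
from sklearn.datasets import load_svmlight_file, dump_svmlight_file
from sklearn.model_selection import train_test_split

domain = "books"  # illustrative domain name
X, y = load_svmlight_file("{}_train.svmlight".format(domain))  # assumed file name

# Randomly hold out 10% as the dev set; vary random_state to create
# additional folds for cross-validation.
X_train, X_dev, y_train, y_dev = train_test_split(
    X, y, test_size=0.1, random_state=0)

# Write the dev fold next to the training and test sets, following the
# ${domain}_dev.svmlight convention. Back up the original training file
# before overwriting it with the reduced split.
dump_svmlight_file(X_train, y_train, "{}_train.svmlight".format(domain))
dump_svmlight_file(X_dev, y_dev, "{}_dev.svmlight".format(domain))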

Dependencies

  • PyTorch 0.3/0.4
  • sklearn
  • termcolor

References

@InProceedings{guo2018multi,
  author = "Guo, Jiang and Shah, Darsh J and Barzilay, Regina",
  title = "Multi-Source Domain Adaptation with Mixture of Experts",
  booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
  year = "2018",
  publisher = "Association for Computational Linguistics",
  pages = "4694--4703",
  location = "Brussels, Belgium",
  url = "http://aclweb.org/anthology/D18-1498"
}

Contact

Please create an issue or email [email protected] if you have any questions, comments, or suggestions.
