# CoMave

GitHub page for the paper *CoMave: Contrastive Pre-training with Multi-scale Masking for Attribute Value Extraction*, accepted to Findings of ACL 2023.

## Model

We openly release CoMave in both a Large-Chinese and a Large-English version, continually pre-trained from RoBERTa on tens of millions of examples.
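As a rough illustration of the "multi-scale masking" named in the paper title, the sketch below masks contiguous token spans whose lengths are drawn from several scales until a target masking ratio is reached. This is our own simplified assumption of the general idea, not the paper's exact pre-training procedure; the function name, scales, and ratio are illustrative.

```python
import random

def multi_scale_mask(tokens, mask_token="[MASK]", scales=(1, 2, 3),
                     mask_ratio=0.15, seed=0):
    """Illustrative multi-scale span masking (a sketch, not CoMave's
    actual algorithm): repeatedly pick a span length from `scales` and
    mask a random contiguous span until ~mask_ratio of tokens are masked."""
    rng = random.Random(seed)
    tokens = list(tokens)
    n = len(tokens)
    budget = max(1, int(n * mask_ratio))  # target number of masked tokens
    masked = 0
    while masked < budget:
        span = rng.choice(scales)                      # pick a masking scale
        start = rng.randrange(0, max(1, n - span + 1))  # random span start
        for i in range(start, min(start + span, n)):
            if tokens[i] != mask_token:                 # avoid double-counting
                tokens[i] = mask_token
                masked += 1
    return tokens

# Example: mask roughly 15% of a 20-token sequence.
out = multi_scale_mask([f"tok{i}" for i in range(20)])
```

Masking at multiple span scales exposes the encoder to both word-level and phrase-level reconstruction, which is the intuition behind span-style masking objectives in general.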

## Data

This paper conducts experiments on four benchmarks: INS, MEPAVE, AE-Pub, and MAE.

  1. You can download AE-Pub and MAE here.
  2. MEPAVE was built by the team of the JD platform; please contact its authors to obtain the dataset.
  3. INS is collected from real business product data of the Alipay platform. For data privacy reasons, please contact us by email to obtain the dataset.