Error when converting BERT from pytorch_pretrained_bert to BERT from the transformers library: TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
#11 · Open · RihabElya opened this issue on Apr 1, 2021 · 0 comments
def forward(self, batch):
    """
    batch has the following structure:
        batch[0]: list, token ids
        batch[1]: list, token mask
        batch[2]: list, token type ids (for BERT)
        batch[3]: list, BERT label ids
    """
    encoded_layers, _ = self.model(
        input_ids=batch[0],
        token_type_ids=batch[2],
        attention_mask=batch[1],
        output_all_encoded_layers=self.config["mode"] == "weighted")
    if self.config["mode"] == "weighted":
        encoded_layers = torch.stack([a * b for a, b in zip(encoded_layers, self.bert_weights)])
        return self.bert_gamma * torch.sum(encoded_layers, dim=0)
    return encoded_layers
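The error occurs because the transformers library removed the `output_all_encoded_layers` keyword from `BertModel.forward()`. The replacement is `output_hidden_states=True`, and the per-layer outputs are read from the `hidden_states` attribute of the returned output object. One difference to account for: `hidden_states` contains `num_layers + 1` tensors, with the embedding output first, whereas `output_all_encoded_layers=True` returned only the encoder layers. Below is a minimal sketch of an adapted `forward`, assuming transformers v4.x (where `return_dict=True` is the default) and that `self.model` is now a `transformers.BertModel`; `self.config`, `self.bert_weights`, and `self.bert_gamma` are the attributes from the code above:

```python
import torch

def forward(self, batch):
    """Same batch layout as above, adapted to the transformers API."""
    # output_all_encoded_layers no longer exists: pass output_hidden_states
    # instead and read the per-layer outputs from the returned object.
    outputs = self.model(
        input_ids=batch[0],
        token_type_ids=batch[2],
        attention_mask=batch[1],
        output_hidden_states=self.config["mode"] == "weighted")
    if self.config["mode"] == "weighted":
        # outputs.hidden_states has num_layers + 1 entries; the first is the
        # embedding output, which output_all_encoded_layers did not return,
        # so drop it to keep the old per-layer weighting behaviour.
        encoded_layers = outputs.hidden_states[1:]
        encoded_layers = torch.stack(
            [a * b for a, b in zip(encoded_layers, self.bert_weights)])
        return self.bert_gamma * torch.sum(encoded_layers, dim=0)
    # With output_hidden_states=False, the old API returned only the last
    # encoder layer, which corresponds to last_hidden_state here.
    return outputs.last_hidden_state
```

The old tuple unpacking `encoded_layers, _ = self.model(...)` also has to go: with `return_dict=True` the model returns a `BaseModelOutputWithPoolingAndCrossAttentions` object rather than a `(encoded_layers, pooled_output)` pair.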