Hi, I'd like to confirm something with you. When doing text classification with Hugging Face's BertForSequenceClassification class, the model uses BERT's pooled_output and then feeds it into a final classifier layer. Your paper, however, says: "We build the downstream models for the natural language understanding tasks by adding a linear classifier on top of the "[CLS]" token to predict label probabilities." Does that mean you use only BERT's [CLS] token representation and feed it directly into the final classifier? Since your pre-training objectives include an NSP task, I'd like to confirm which of the two approaches you actually used for text classification. Thanks!
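For reference, here is a minimal sketch of the two readouts I'm asking about (the model name "bert-base-uncased", the number of labels, and the classifier layer are my own assumptions for illustration): Hugging Face's pooler_output, which is the [CLS] hidden state passed through an extra dense + tanh pooler layer (the one tied to NSP pre-training), versus the raw [CLS] hidden state taken directly from last_hidden_state.

```python
import torch
from transformers import BertModel, BertTokenizer

# Hypothetical setup for illustration only; checkpoint and label count are assumptions.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
num_labels = 2
classifier = torch.nn.Linear(bert.config.hidden_size, num_labels)

inputs = tokenizer("An example sentence for classification.", return_tensors="pt")
outputs = bert(**inputs)

# Variant 1 (what BertForSequenceClassification does): the [CLS] hidden state
# passed through the dense + tanh pooler layer, then into the classifier.
logits_pooled = classifier(outputs.pooler_output)

# Variant 2 (my reading of the paper): the raw [CLS] hidden state fed directly
# into the linear classifier, with no pooler in between.
cls_hidden = outputs.last_hidden_state[:, 0]
logits_cls = classifier(cls_hidden)
```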