This repository has been archived by the owner on Mar 1, 2024. It is now read-only.
Hi! Thank you for your great work!
To my understanding, BLINK uses special tokens to mark the mention position and the entity title in both the bi-encoder and the cross-encoder.
In the cross-encoder, your code does add these special tokens to the tokenizer:
https://github.com/facebookresearch/BLINK/blob/main/blink/crossencoder/crossencoder.py#L82-L89
But in the bi-encoder, add_special_tokens is not called, which means the special tokens are processed as [UNK]:
https://github.com/facebookresearch/BLINK/blob/main/blink/biencoder/biencoder.py#L82-L87
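To make the concern concrete, here is a minimal toy sketch (not BLINK's actual code) of why marker tokens must be registered with the tokenizer; the `[M_START]`/`[M_END]` names are hypothetical placeholders, not BLINK's real tags:

```python
# Toy tokenizer illustrating the [UNK] fallback for unregistered tokens.
class ToyTokenizer:
    def __init__(self, vocab):
        self.vocab = {tok: i for i, tok in enumerate(vocab)}
        self.unk_token = "[UNK]"

    def add_special_tokens(self, tokens):
        # Register each new token with a fresh id, mimicking
        # tokenizer.add_special_tokens in the cross-encoder path.
        for tok in tokens:
            if tok not in self.vocab:
                self.vocab[tok] = len(self.vocab)

    def convert_tokens_to_ids(self, tokens):
        unk_id = self.vocab[self.unk_token]
        return [self.vocab.get(tok, unk_id) for tok in tokens]


tok = ToyTokenizer(["[UNK]", "[CLS]", "[SEP]", "jordan", "played", "basketball"])
mention = ["[CLS]", "[M_START]", "jordan", "[M_END]", "played", "[SEP]"]

# Without registration, both markers collapse to the [UNK] id (0),
# so the model cannot tell where the mention span is:
print(tok.convert_tokens_to_ids(mention))  # → [1, 0, 3, 0, 4, 2]

# After registering them, each marker gets a distinct id:
tok.add_special_tokens(["[M_START]", "[M_END]"])
print(tok.convert_tokens_to_ids(mention))  # → [1, 6, 3, 7, 4, 2]
```

If the bi-encoder skips this registration step, both mention markers would map to the same [UNK] id, which is the behavior the question is asking about.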
Did you write this intentionally? If so, could you elaborate on that?