fix: align readme with current mteb #1493

Open · Samoed wants to merge 1 commit into main

Conversation

@Samoed (Collaborator) commented Nov 24, 2024

Checklist

  • Run tests locally to make sure nothing is broken using make test.
  • Run the formatter to format the code using make lint.

The current README is a bit misleading and is causing issues for newcomers. See #1490 and #1491 for reference.

@@ -50,6 +50,8 @@ model_name = "average_word_embeddings_komninos"
 # model_name = "sentence-transformers/all-MiniLM-L6-v2"
 
 model = SentenceTransformer(model_name)
+# or directly from mteb:
+model = mteb.get_model(model_name)
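
For context, a minimal sketch of how the README then uses the loaded model (assuming the get_tasks/MTEB/run flow from the rest of the README; the task name and output folder are only illustrative):

import mteb
from sentence_transformers import SentenceTransformer

model_name = "average_word_embeddings_komninos"
model = mteb.get_model(model_name)  # or: model = SentenceTransformer(model_name)

# Run a single task and write the scores to disk.
tasks = mteb.get_tasks(tasks=["Banking77Classification"])
evaluation = mteb.MTEB(tasks=tasks)
results = evaluation.run(model, output_folder=f"results/{model_name}")
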
@KennethEnevoldsen (Contributor):

Should we just recommend this one always?

And maybe rephrase it to:

Suggested change
model = mteb.get_model(model_name)
model = mteb.get_model(model_name)  # if the model is not implemented in MTEB, this is equivalent to SentenceTransformer(model_name)
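
A minimal sketch of the fallback this suggestion describes (assuming mteb's public get_model; the model name is taken from the snippet above):

import mteb
from sentence_transformers import SentenceTransformer

# If the model has no dedicated implementation in mteb's model registry,
# get_model is expected to fall back to loading it via sentence-transformers,
# so the two lines below behave equivalently for evaluation purposes.
model = mteb.get_model("average_word_embeddings_komninos")
model = SentenceTransformer("average_word_embeddings_komninos")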

@Samoed (Collaborator, Author):

Yes, but I'm not sure removing it is the best approach. I can change it to whatever you think is better.


-class CustomModel:
+class CustomModel(Wrapper):
@KennethEnevoldsen (Contributor):

Why does it need to inherit from Wrapper? I would instead inherit from the Encoder protocol.
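
A rough sketch of the Encoder-protocol alternative (a hypothetical toy model; the encode signature follows the README's custom-model example, and the task name is only illustrative):

import numpy as np
import mteb
from mteb.encoder_interface import PromptType


class CustomModel:
    """Structurally satisfies the Encoder protocol, without inheriting from Wrapper."""

    def encode(
        self,
        sentences: list[str],
        task_name: str,
        prompt_type: PromptType | None = None,
        **kwargs,
    ) -> np.ndarray:
        # Placeholder embeddings; a real model would encode the sentences here.
        return np.random.default_rng().random((len(sentences), 16))


tasks = mteb.get_tasks(tasks=["Banking77Classification"])
mteb.MTEB(tasks=tasks).run(CustomModel(), output_folder="results")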

@Samoed (Collaborator, Author) commented Nov 24, 2024:

The MTEB class will wrap models that do not inherit from the Wrapper class:

if not isinstance(model, Wrapper):
    model = SentenceTransformerWrapper(model)

@KennethEnevoldsen (Contributor) commented Nov 24, 2024:

Hmm, shouldn't it just wrap SentenceTransformers?

@Samoed (Collaborator, Author):

It could be like this (you changed it this way in mieb):

if isinstance(model, (SentenceTransformer, CrossEncoder)):
    model = SentenceTransformerWrapper(model)

I can change it there too.

@KennethEnevoldsen (Contributor):

Hmm, then it will also be merged into v2.0.0, in which case we should probably just update the README there (I plan to merge it during December).

@Samoed (Collaborator, Author):

So for now, I think this PR can be merged, and I will update the README in the 2.0 branch.
