
Python client library for mlflow server #1892

Open
idlefella opened this issue Aug 22, 2024 · 0 comments

@idlefella (Contributor) commented:

Hello everyone,

I was exploring mlserver for deploying ML models as a REST service and ran into an issue: if you want to call mlserver from Python and reuse its codecs (e.g. the numpy or pandas codecs), you have to add mlserver itself as a dependency of your codebase. That pulls in many transitive dependencies, such as fastapi, aiokafka, and uvicorn, which significantly bloats the dependency footprint. Wouldn't it be more practical to have a separate mlserver-client package that contains only the codecs and types?

Or how do you currently integrate mlserver with another microservice? Do you manually create the Open Inference Protocol v2 JSON?
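For context, this is roughly what building the request by hand looks like today. A minimal sketch, using only the standard library, of constructing an Open Inference Protocol v2 inference payload without mlserver (the tensor name `input-0` and the shape/datatype values are illustrative, not mandated by the protocol):

```python
import json


def build_v2_request(name: str, data: list, datatype: str = "FP32") -> dict:
    """Hand-roll an Open Inference Protocol v2 request body for one tensor."""
    # Record the shape of the (possibly nested) list input.
    shape = []
    probe = data
    while isinstance(probe, list):
        shape.append(len(probe))
        probe = probe[0] if probe else None

    # The V2 protocol expects tensor data as a flat, row-major list.
    flat = data
    while flat and isinstance(flat[0], list):
        flat = [x for row in flat for x in row]

    return {
        "inputs": [
            {
                "name": name,
                "shape": shape,
                "datatype": datatype,
                "data": flat,
            }
        ]
    }


# Example: a 2x2 float tensor, ready to POST to /v2/models/<model>/infer
payload = build_v2_request("input-0", [[1.0, 2.0], [3.0, 4.0]])
body = json.dumps(payload)
```

This works, but it re-implements exactly the encoding logic that mlserver's codecs already provide, which is why a lightweight client-only package would be useful.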
