
Add vLLM inference provider for OpenAI compatible vLLM server #25

Triggered via pull request: October 11, 2024 01:34
Status: Success
Total duration: 35s
Artifacts: none

Workflow: pre-commit.yml (on: pull_request)
Job: pre-commit (27s)
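
The run page does not include the workflow file itself, only its name and trigger. Below is a minimal sketch of what a pre-commit.yml workflow triggered on pull_request typically looks like; the Python version, action versions, and exact steps are assumptions, not taken from this repository.

```yaml
# Hypothetical reconstruction of .github/workflows/pre-commit.yml;
# the actual file in the repository may differ.
name: pre-commit

on: pull_request

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"  # assumed version
      # Install pre-commit and run every configured hook against the full tree
      - run: pip install pre-commit
      - run: pre-commit run --all-files
```

A setup like this fails the check if any hook (formatting, linting, etc.) reports changes, which matches the short job duration seen in this run.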