
Add vLLM inference provider for OpenAI compatible vLLM server #19

Triggered via pull request on October 11, 2024 00:59
Status: Failure
Total duration: 41s

Workflow: pre-commit.yml
on: pull_request

Job: pre-commit (33s)
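The workflow definition itself is not shown on this run page. As a rough point of reference only, a pre-commit workflow wired up this way (a pre-commit.yml triggered on pull_request with a single pre-commit job) is commonly written along the lines of the sketch below, using the published pre-commit/action action; the actual file in the repository may differ.

```yaml
# Hypothetical sketch only; not the repository's actual pre-commit.yml.
name: pre-commit

on: pull_request

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
      # Runs every hook declared in .pre-commit-config.yaml and fails the job
      # (exit code 1) if any hook reports a problem.
      - uses: pre-commit/action@v3.0.1
```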

Annotations

1 error
pre-commit: Process completed with exit code 1.
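An exit code of 1 from the pre-commit step typically means that at least one configured hook failed, for example a formatter or linter flagging files touched by the pull request. The same checks can usually be reproduced locally by installing pre-commit and running `pre-commit run --all-files` from the repository root before pushing an updated branch.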