Actions: xorbitsai/inference

Showing runs from all workflows
4,791 workflow run results

BUG: [UI] Fix authentication mode related bugs
Python CI #3931: Pull request #1772 opened by yiboyasss
July 3, 2024 10:10 2h 36m 40s yiboyasss:FIX_authorityBug
BUG: Does the glm4-chat API not support streaming responses?
Assign #1128: Issue comment #1766 (comment) created by xiaohuihuige
July 3, 2024 10:02 10s
ENH: Add more log modules
Python CI #3930: Pull request #1771 opened by ChengjieLi28
July 3, 2024 09:58 1h 2m 0s ChengjieLi28:enh/more_log
BUG: Does the glm4-chat API not support streaming responses?
Assign #1127: Issue comment #1766 (comment) created by liaotingyao
July 3, 2024 09:58 10s
BUG: Does the glm4-chat API not support streaming responses?
Assign #1126: Issue comment #1766 (comment) created by xiaohuihuige
July 3, 2024 09:49 14s
FEAT: support MLX engine
Python CI #3929: Pull request #1765 synchronize by qinxuye
July 3, 2024 09:36 1h 41m 32s qinxuye:feat/mlx
ENH: Add guard for model launching
Python CI #3928: Pull request #1680 synchronize by frostyplanet
July 3, 2024 09:04 44m 54s frostyplanet:launch_guard
BUG: Does the glm4-chat API not support streaming responses?
Assign #1125: Issue comment #1766 (comment) created by liaotingyao
July 3, 2024 08:57 10s
Failed to build chatglm-cpp llama-cpp-python pynini
Assign #1124: Issue comment #1690 (comment) created by highwalker2
July 3, 2024 08:47 9s
chatTTS deployment error
Assign #1123: Issue comment #1770 (comment) created by qinxuye
July 3, 2024 08:44 9s
GLM4-9B: when run with int4 quantization, GPU memory usage is the same as without int4
Assign #1122: Issue comment #1768 (comment) created by zidanereal5
July 3, 2024 08:10 10s
BUG: Does the glm4-chat API not support streaming responses?
Assign #1121: Issue comment #1766 (comment) created by xiaohuihuige
July 3, 2024 07:52 15s
no package metadata was found for auto-gptq
Assign #1120: Issue comment #1769 (comment) created by qinxuye
July 3, 2024 07:06 10s
GLM4-9B: when run with int4 quantization, GPU memory usage is the same as without int4
Assign #1119: Issue comment #1768 (comment) created by qinxuye
July 3, 2024 07:05 11s
BUG: Does the glm4-chat API not support streaming responses?
Assign #1118: Issue comment #1766 (comment) created by qinxuye
July 3, 2024 07:04 11s
FEAT: support MLX engine
Python CI #3927: Pull request #1765 synchronize by qinxuye
July 3, 2024 06:42 2h 0m 10s qinxuye:feat/mlx
FEAT: support MLX engine
Python CI #3926: Pull request #1765 synchronize by qinxuye
July 3, 2024 06:21 20m 30s qinxuye:feat/mlx
BLD: Pin llama-cpp-python to v0.2.77 in Docker for stability (#1767)
Python CI #3925: Commit 7e643f1 pushed by qinxuye
July 3, 2024 06:16 1h 33m 32s main
FEAT: support MLX engine
Python CI #3923: Pull request #1765 synchronize by qinxuye
July 3, 2024 06:14 7m 47s qinxuye:feat/mlx
TST: Fix llama-cpp-python issue in CI (#1763)
Python CI #3922: Commit 7ab624e pushed by qinxuye
July 3, 2024 06:12 4m 54s main
FEAT: support MLX engine
Python CI #3921: Pull request #1765 synchronize by qinxuye
July 3, 2024 06:08 6m 22s qinxuye:feat/mlx
FEAT: support MLX engine
Python CI #3920: Pull request #1765 synchronize by qinxuye
July 3, 2024 06:07 1m 30s qinxuye:feat/mlx