Commit 09b5ae5 (1 parent: 1fbcb80). Showing 112 changed files with 9,227 additions and 323 deletions.

doc/source/locale/ja_JP/LC_MESSAGES/examples/ai_podcast.po (168 additions, 0 deletions)
@@ -0,0 +1,168 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2023, Xorbits Inc.
# This file is distributed under the same license as the Xinference package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2023.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Xinference \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2023-10-16 10:33+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: ja_JP\n"
"Language-Team: ja_JP <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.12.1\n"

#: ../../source/examples/ai_podcast.rst:5
msgid "Example: AI Podcast 🎙"
msgstr ""

#: ../../source/examples/ai_podcast.rst:7
msgid "**Description**:"
msgstr ""

#: ../../source/examples/ai_podcast.rst:9
msgid "🎙️AI Podcast - Voice Conversations with Multiple Agents on M2 Max 💻"
msgstr ""

#: ../../source/examples/ai_podcast.rst:11
msgid "**Support Language** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:13
msgid "English (AI_Podcast.py)"
msgstr ""

#: ../../source/examples/ai_podcast.rst:15
msgid "Chinese (AI_Podcast_ZH.py)"
msgstr ""

#: ../../source/examples/ai_podcast.rst:17
msgid "**Used Technology (EN version)** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:19
msgid ""
"@ `OpenAI <https://twitter.com/OpenAI>`_ 's `whisper "
"<https://pypi.org/project/openai-whisper/>`_"
msgstr ""

#: ../../source/examples/ai_podcast.rst:21
msgid ""
"@ `ggerganov <https://twitter.com/ggerganov>`_ 's `ggml "
"<https://github.com/ggerganov/ggml>`_"
msgstr ""

#: ../../source/examples/ai_podcast.rst:23
msgid ""
"@ `WizardLM_AI <https://twitter.com/WizardLM_AI>`_ 's `wizardlm v1.0 "
"<https://huggingface.co/WizardLM>`_"
msgstr ""

#: ../../source/examples/ai_podcast.rst:25
msgid ""
"@ `lmsysorg <https://twitter.com/lmsysorg>`_ 's `vicuna v1.3 "
"<https://huggingface.co/lmsys/vicuna-7b-v1.3>`_"
msgstr ""

#: ../../source/examples/ai_podcast.rst:27
msgid "@ `Xinference <https://github.com/xorbitsai/inference>`_ as a launcher"
msgstr ""

#: ../../source/examples/ai_podcast.rst:29
msgid "**Detailed Explanation on the Demo Functionality** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:31
msgid ""
"Generate the Wizardlm Model and Vicuna Model when the program is "
"launching with Xorbits Inference. Initiate the Chatroom by giving the two"
" chatbot their names and telling them that there is a human user called "
"\"username\", where \"username\" is given by user's input. Initialize a "
"empty chat history for the chatroom."
msgstr ""

#: ../../source/examples/ai_podcast.rst:35
msgid ""
"Use Audio device to store recording into file, and transcribe the file "
"using OpenAI's Whisper to receive a human readable text as string."
msgstr ""

#: ../../source/examples/ai_podcast.rst:37
msgid ""
"Based on the input message string, determine which agents the user want "
"to talk to. Call the target agents and parse in the input string and chat"
" history for the model to generate."
msgstr ""

#: ../../source/examples/ai_podcast.rst:40
msgid ""
"When the responses are ready, use Macos's \"Say\" Command to produce "
"audio through speaker. Each agents have their own voice while speaking."
msgstr ""

#: ../../source/examples/ai_podcast.rst:43
msgid ""
"Store the user input and the agent response into chat history, and "
"recursively looping the program until user explicitly says words like "
"\"see you\" in their responses."
msgstr ""

#: ../../source/examples/ai_podcast.rst:46
msgid "**Highlight Features with Xinference** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:48
msgid ""
"With Xinference's distributed system, we can easily deploy two different "
"models in the same session and in the same \"chatroom\". With enough "
"resources, the framework can deploy any amount of models you like at the "
"same time."
msgstr ""

#: ../../source/examples/ai_podcast.rst:51
msgid ""
"With Xinference, you can deploy the model easily by just adding a few "
"lines of code. For examples, for launching the vicuna model in the demo, "
"just by::"
msgstr ""

#: ../../source/examples/ai_podcast.rst:68
msgid ""
"Then, the Xinference client will handle \"target model downloading and "
"caching\", \"set up environment and process for the model\", and \"run "
"the service at selected endpoint. \" You are now ready to play with your "
"llm model."
msgstr ""

#: ../../source/examples/ai_podcast.rst:71
msgid "**Original Demo Video** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:73
msgid ""
"`🎙️AI Podcast - Voice Conversations with Multiple Agents on M2 Max💻🔥🤖 "
"<https://twitter.com/yichaocheng/status/1679129417778442240>`_"
msgstr ""

#: ../../source/examples/ai_podcast.rst:75
msgid "**Source Code** :"
msgstr ""

#: ../../source/examples/ai_podcast.rst:77
msgid ""
"`AI_Podcast "
"<https://github.com/xorbitsai/inference/blob/main/examples/AI_podcast.py>`_"
" (English Version)"
msgstr ""

#: ../../source/examples/ai_podcast.rst:79
msgid "AI_Podcast_ZH (Chinese Version)"
msgstr ""
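The msgid for ai_podcast.rst:51 refers to a launch snippet that is not part of the catalog itself (Sphinx does not extract literal code blocks for translation). For orientation, a minimal sketch of what that launch step could look like with the Xinference Python client, assuming a server at ``http://127.0.0.1:9997`` and the 2023-era ggmlv3 build of vicuna-v1.3 used in the demo; names and signatures may differ in newer releases, and examples/AI_podcast.py remains the authoritative code::

    # Hedged sketch only; see examples/AI_podcast.py for the real implementation.
    from xinference.client import RESTfulClient

    client = RESTfulClient("http://127.0.0.1:9997")

    # Download (or reuse a cached copy of) the model, start a worker process
    # for it, and return a UID identifying the running model in the cluster.
    model_uid = client.launch_model(
        model_name="vicuna-v1.3",
        model_format="ggmlv3",
        model_size_in_billions=7,
        quantization="q4_0",
    )

    # Chat-capable models expose a chat() method on the returned handle.
    model = client.get_model(model_uid)
    print(model.chat("Hello, who are you?"))
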
doc/source/locale/ja_JP/LC_MESSAGES/examples/chatbot.po (98 additions, 0 deletions)
@@ -0,0 +1,98 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2023, Xorbits Inc.
# This file is distributed under the same license as the Xinference package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2023.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Xinference \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2023-10-16 10:33+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: ja_JP\n"
"Language-Team: ja_JP <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.12.1\n"

#: ../../source/examples/chatbot.rst:5
msgid "Example: chatbot 🤖️"
msgstr ""

#: ../../source/examples/chatbot.rst:7
msgid "**Description**:"
msgstr ""

#: ../../source/examples/chatbot.rst:9
msgid ""
"Demonstrate how to interact with Xinference to play with LLM chat "
"functionality with an AI agent 💻"
msgstr ""

#: ../../source/examples/chatbot.rst:11
msgid "**Used Technology**:"
msgstr ""

#: ../../source/examples/chatbot.rst:13
msgid ""
"@ `ggerganov <https://twitter.com/ggerganov>`_ 's `ggml "
"<https://github.com/ggerganov/ggml>`_"
msgstr ""

#: ../../source/examples/chatbot.rst:15
msgid "@ `Xinference <https://github.com/xorbitsai/inference>`_ as a launcher"
msgstr ""

#: ../../source/examples/chatbot.rst:17
msgid ""
"@ All LLaMA and Chatglm models supported by `Xorbitsio inference "
"<https://github.com/xorbitsai/inference>`_"
msgstr ""

#: ../../source/examples/chatbot.rst:19
msgid "**Detailed Explanation on the Demo Functionality** :"
msgstr ""

#: ../../source/examples/chatbot.rst:21
msgid ""
"Take the user command line input in the terminal and grab the required "
"parameters for model launching."
msgstr ""

#: ../../source/examples/chatbot.rst:23
msgid ""
"Launch the Xinference frameworks and automatically deploy the model user "
"demanded into the cluster."
msgstr ""

#: ../../source/examples/chatbot.rst:25
msgid "Initialize an empty chat history to store all the context in the chatroom."
msgstr ""

#: ../../source/examples/chatbot.rst:27
msgid ""
"Recursively ask for user's input as prompt and let the model to generate "
"response based on the prompt and the chat history. Show the Output of the"
" response in the terminal."
msgstr ""

#: ../../source/examples/chatbot.rst:30
msgid ""
"Store the user's input and agent's response into the chat history as "
"context for the upcoming rounds."
msgstr ""

#: ../../source/examples/chatbot.rst:32
msgid "**Source Code** :"
msgstr ""

#: ../../source/examples/chatbot.rst:33
msgid ""
"`chat "
"<https://github.com/RayJi01/Xprobe_inference/blob/main/examples/chat.py>`_"
msgstr ""
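The chatbot.rst strings above describe a simple read, generate, and store loop. A rough sketch of that flow, assuming a model handle launched as in the earlier example and that ``chat()`` accepts an OpenAI-style message list as history; the linked chat.py is the authoritative implementation::

    # Hedged sketch of the loop described in chatbot.rst:21-30; not the real chat.py.
    chat_history = []  # context carried into later rounds
    while True:
        prompt = input("you: ")
        if prompt.strip().lower() in {"exit", "quit"}:
            break
        completion = model.chat(prompt, chat_history=chat_history)
        reply = completion["choices"][0]["message"]["content"]
        print("assistant:", reply)
        # Store both sides of the exchange so the next round sees the context.
        chat_history.append({"role": "user", "content": prompt})
        chat_history.append({"role": "assistant", "content": reply})
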
doc/source/locale/ja_JP/LC_MESSAGES/examples/index.po (25 additions, 0 deletions)
@@ -0,0 +1,25 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2023, Xorbits Inc.
# This file is distributed under the same license as the Xinference package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2023.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Xinference \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2023-10-16 10:33+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: ja_JP\n"
"Language-Team: ja_JP <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.12.1\n"

#: ../../source/examples/index.rst:5
msgid "Examples"
msgstr ""
doc/source/locale/ja_JP/LC_MESSAGES/getting_started/accessing_models.po (80 additions, 0 deletions)
@@ -0,0 +1,80 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2023, Xorbits Inc.
# This file is distributed under the same license as the Xinference package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2023.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Xinference \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2023-10-16 10:33+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: ja_JP\n"
"Language-Team: ja_JP <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.12.1\n"

#: ../../source/getting_started/accessing_models.rst:5
msgid "Accessing Models"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:8
msgid ""
"Suppose you have started the Xinference server endpoint at "
"``http://127.0.0.1:9997``."
msgstr ""

#: ../../source/getting_started/accessing_models.rst:10
msgid ""
"Please refer to :ref:`Launching Models <launching_models>` guide to get "
"your model running."
msgstr ""

#: ../../source/getting_started/accessing_models.rst:13
msgid "Using Xinference Client"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:17
msgid "Large Language Model"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:18
msgid ""
"When the abilities of the LLM include \"chat,\" we can converse with it "
"using the model's chat interface:"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:34
msgid "The response will look like:"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:61
msgid "Embedding Model"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:63
msgid ""
"To interact with Xinference's embedding model, i.e., inputting a text and"
" getting an embedding from ``RESTfulClient``:"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:72
msgid "The response will be:"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:93
msgid "Using OpenAI Python SDK"
msgstr ""

#: ../../source/getting_started/accessing_models.rst:95
msgid ""
"Xinference provides an OpenAI-compatible RESTful interface. Thus, you can"
" also use the OpenAI Python SDK to chat with the model via the service's "
"endpoint:"
msgstr ""
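The code blocks that accessing_models.rst wraps around these strings are likewise not extracted into the catalog. A hedged sketch of the two access paths they describe, assuming a server at ``http://127.0.0.1:9997``, already-launched models with placeholder UIDs, and the pre-1.0 ``openai`` package that was current in 2023::

    # Hedged sketch; the UIDs below are placeholders, not real model UIDs.
    from xinference.client import RESTfulClient

    client = RESTfulClient("http://127.0.0.1:9997")

    # Embedding model: send a text, get back an OpenAI-style embedding payload.
    embedding_model = client.get_model("my-embedding-model-uid")
    result = embedding_model.create_embedding("What is the capital of China?")
    print(result["data"][0]["embedding"][:5])

    # OpenAI-compatible endpoint: point the (pre-1.0) openai package at Xinference.
    import openai

    openai.api_key = "not-used"  # the endpoint does not validate the key
    openai.api_base = "http://127.0.0.1:9997/v1"
    chat = openai.ChatCompletion.create(
        model="my-chat-model-uid",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(chat["choices"][0]["message"]["content"])
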