From 0335144eadc43ae4abb275c80a3367235073b20c Mon Sep 17 00:00:00 2001
From: Anthony Naddeo
Date: Fri, 26 Jan 2024 13:00:09 -0800
Subject: [PATCH] Update the base readme

---
 README.md | 41 +++++++++++++++++------------------------
 1 file changed, 17 insertions(+), 24 deletions(-)

diff --git a/README.md b/README.md
index 9abcd29..53f6f0b 100644
--- a/README.md
+++ b/README.md
@@ -4,9 +4,7 @@ This repo has various examples for configuring and using the WhyLabs langkit con
 different use case and contains a `test` directory with working Python code that demonstrates how to use it. Browse
 that folder if you're looking for a specific example. Each example either configures or calls a deployed instance of
 the container.
 
-## Links
-
-- [Docker](https://hub.docker.com/repository/docker/whylabs/whylogs/general)
+The rest of this README goes over the general process of customizing the langkit container.
 
 # Configuration Steps
 
@@ -21,8 +19,6 @@ breaking changes.
 ## Step 2: Create Configuration Files
 
-
-
 Depending on whether you're using Python, Yaml, or both, you'll be creating different files. Yaml is the easiest way
 to configure the container, but sometimes you need more control. If you're going to be deploying custom models, for
 example, then you'll most likely need to use Python, since you'll probably have to reference libraries like `torch`
 and execute some setup code.
@@ -45,8 +41,7 @@ This is what your project will look like.
 These files will be included by your Dockerfile in the section below. The container is hard coded to search
 `/opt/whylogs-container/whylogs_container/whylogs_config/` for yaml files that it understands at startup.
 
-- See [configure-container][configure_container] for an example that demonstrates using Python to configure the container.
-
+See [configure_container_yaml][configure_container_yaml] for examples of what you can put in those yaml files.
 
 ### Step 2.2: Custom Python Configuration
 
@@ -66,7 +61,7 @@ The container is hard coded to import whatever is at `/opt/whylogs-container/why
 mechanism by which the custom configuration code is evaluated. If you need to deploy and reference other Python files,
 then you can include them as well and use relative imports, which will work out at runtime.
 
-
+See [configure_container_python][configure_container_python] for an example that demonstrates using Python to configure the container.
 
 ## Step 3: Create a Dockerfile
 
@@ -94,16 +89,16 @@ you can find the dependencies it was bundled with. You can get this from the ima
 
 ```
 # Get the declared dependencies
-docker run --platform=linux/amd64 --rm --entrypoint /bin/bash whylabs/whylogs:py-llm-latest -c 'cat pyproject.toml'
+docker run --platform=linux/amd64 --rm --entrypoint /bin/bash whylabs/whylogs:py-llm-1.0.2.dev0 -c 'cat pyproject.toml'
 
 # Or start an interactive Python shell and test imports
-docker run -it --platform=linux/amd64 --rm --entrypoint /bin/bash whylabs/whylogs:py-llm-latest -c "source .venv/bin/activate; python3.10"
+docker run -it --platform=linux/amd64 --rm --entrypoint /bin/bash whylabs/whylogs:py-llm-1.0.2.dev0 -c "source .venv/bin/activate; python3.10"
 ```
 
 In general, you can expect `pandas`, `whylogs`, `langkit`, and `torch/torchvision==2.0.0` to be present, as well as
 whatever dependencies they pull in.
 
-
+See the [custom_model][custom_model] example for a complete demonstration of packaging extra dependencies.
 
 ## Step 4: Build the Image
 
@@ -114,6 +109,8 @@ very fast, while installing dependencies can be very slow.
 docker build . -t my_llm_container
 ```
 
+Each of the examples does this with `make build`.
+
 ## Step 5: Deploy a Container
 
 This will depend heavily on your infrastructure. The simplest deployment method is Docker of course.
@@ -122,24 +119,20 @@ This will depend heavily on your infrastructure. The simplest deployment method
 docker run -it --rm -p 127.0.0.1:8000:8000 --env-file local.env my_llm_container
 ```
 
-
+Each of the examples does this with `make run`.
+
+See [our sample Helm file][helm_llm_file] for an example of deploying via Helm.
 
 ## Step 6: Call the Container
 
-The container has a client that you can use to call it, [python-container-client][python-container-client]. See the README in that project
-or any of the example's `test` folders for examples.
+The container has a client that you can use to call it, [python-container-client][python-container-client].
 
-[configure_container]: https://github.com/whylabs/langkit-container-examples/tree/master/examples/configure-container
+[configure_container_python]: https://github.com/whylabs/langkit-container-examples/tree/master/examples/configure_container_python
+[configure_container_yaml]: https://github.com/whylabs/langkit-container-examples/tree/master/examples/configure_container_yaml
 [docker_tags]: https://hub.docker.com/repository/docker/whylabs/whylogs/tags?page=1&ordering=last_updated&name=llm
 [python-container-client]: https://pypi.org/project/whylogs-container-client/
-
-
-
-
-
-
-
-
-
+[custom_model]: https://github.com/whylabs/langkit-container-examples/tree/master/examples/custom_model
+[helm_repo]: https://github.com/whylabs/charts
+[helm_llm_file]: https://github.com/whylabs/charts/tree/mainline/charts/langkit
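The Dockerfile from Step 3 of the patched README is not itself shown in any hunk above. As a rough sketch only: the base image tag and the hard-coded config path below are taken from the commands in the diff, while the local `whylogs_config/` folder name is an assumption.

```dockerfile
# Sketch, not the repo's actual Dockerfile. The image tag and destination
# path come from the README diff; the local folder name is an assumption.
FROM whylabs/whylogs:py-llm-1.0.2.dev0

# The container scans this directory for yaml configuration files at startup.
COPY whylogs_config/ /opt/whylogs-container/whylogs_container/whylogs_config/
```

With this layout, `docker build . -t my_llm_container` from Step 4 bakes the yaml configuration into the image.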
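Step 2.1 of the patched README amounts to placing yaml files where the Dockerfile can copy them. A minimal shell sketch follows; the file name and contents are placeholders, not a real langkit policy (see the configure_container_yaml example linked in the diff for the supported schema).

```shell
# Create the config folder that a Dockerfile would copy into the image.
# The file name and contents below are hypothetical placeholders.
mkdir -p whylogs_config
cat > whylogs_config/policy.yaml <<'EOF'
# yaml metric/validator configuration goes here; see the
# configure_container_yaml example for the supported schema.
EOF
ls whylogs_config
```

The only hard requirement from the README is the destination: the container searches `/opt/whylogs-container/whylogs_container/whylogs_config/` for yaml files at startup.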