From 0508a664eaec95f25dc0a475c5484b2ce6982aa2 Mon Sep 17 00:00:00 2001
From: Zach Kimberg
Date: Wed, 11 May 2022 15:33:48 -0700
Subject: [PATCH] Fix broken and redirected links (#1647)

---
 CONTRIBUTING.md                      | 4 ++--
 docs/forums.md                       | 2 +-
 docs/load_model.md                   | 2 +-
 engines/mxnet/mxnet-engine/README.md | 5 ++++-
 extensions/benchmark/README.md       | 3 +++
 5 files changed, 11 insertions(+), 5 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f2e9f180ea1..d5932ce673c 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -37,8 +37,8 @@ To send us a pull request, please:
 6. Send us a pull request, answering any default questions in the pull request interface.
 7. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation.
 
-GitHub provides additional document on [forking a repository](https://help.github.com/articles/fork-a-repo/) and
-[creating a pull request](https://help.github.com/articles/creating-a-pull-request/).
+GitHub provides additional document on [forking a repository](https://docs.github.com/en/get-started/quickstart/fork-a-repo) and
+[creating a pull request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request).
 
 ## Finding contributions to work on
 
diff --git a/docs/forums.md b/docs/forums.md
index 010e91a90d9..99d95fd40b2 100644
--- a/docs/forums.md
+++ b/docs/forums.md
@@ -34,7 +34,7 @@ If you want to talk about the development of DJL itself, look at our [developmen
 
 ## Pull Request
 
-If you have an idea that you want to implement for changes to DJL, a bug fix, new datasets, new models, or anything else, open a new [pull request](https://github.com/deepjavalibrary/djl/compare). You can view this guide on [git and how to fork the project and make a pull request](https://guides.github.com/activities/forking/). We also have [documentation for contributors](development/README.md) that can help setup development, explain DJL coding conventions, working with DJL CI, and troubleshooting common problems.
+If you have an idea that you want to implement for changes to DJL, a bug fix, new datasets, new models, or anything else, open a new [pull request](https://github.com/deepjavalibrary/djl/compare). You can view this guide on [git and how to fork the project and make a pull request](https://docs.github.com/en/get-started/quickstart/contributing-to-projects). We also have [documentation for contributors](development/README.md) that can help setup development, explain DJL coding conventions, working with DJL CI, and troubleshooting common problems.
 
 ## Follow DJL
 
diff --git a/docs/load_model.md b/docs/load_model.md
index b4a03ff6ca3..b9035225a17 100644
--- a/docs/load_model.md
+++ b/docs/load_model.md
@@ -130,7 +130,7 @@ Criteria criteria = Criteria.builder()
 ZooModel model = criteria.loadModel();
 ```
 
-You can [customize the artifactId and modelName](#customize-artifactid-and-modelname) the same way as loading model from the local file system.
+You can customize the artifactId and modelName the same way as loading model from the local file system.
 
 ### Load model from AWS S3 bucket
 DJL supports loading a model from an S3 bucket using `s3://` URL and the AWS plugin. See [here](../extensions/aws-ai/README.md) for details.
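The load_model.md passage touched above describes loading a model through `Criteria`, including from an S3 bucket via an `s3://` URL once the `aws-ai` extension referenced in the hunk is on the classpath. Below is a minimal sketch of what that can look like, assuming DJL's model zoo API; the bucket path, model name, and input/output types are illustrative placeholders, not values taken from this patch.

```java
import ai.djl.inference.Predictor;
import ai.djl.modality.Classifications;
import ai.djl.modality.cv.Image;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class LoadFromS3Sketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical bucket and key; the aws-ai extension must be on the classpath
        // for DJL to resolve s3:// URLs.
        Criteria<Image, Classifications> criteria = Criteria.builder()
                .setTypes(Image.class, Classifications.class)
                .optModelUrls("s3://my-bucket/models/resnet18") // illustrative URL, not from this patch
                .optModelName("resnet18")                       // optional override of the model name
                .build();

        try (ZooModel<Image, Classifications> model = criteria.loadModel();
             Predictor<Image, Classifications> predictor = model.newPredictor()) {
            // predictor.predict(image) would run inference here.
        }
    }
}
```

If the model artifacts do not bundle their own pre/post-processing configuration, a `Translator` would typically also be supplied via `optTranslator` before calling `build()`.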
diff --git a/engines/mxnet/mxnet-engine/README.md b/engines/mxnet/mxnet-engine/README.md
index db1788c6c26..50d1e4df1ad 100644
--- a/engines/mxnet/mxnet-engine/README.md
+++ b/engines/mxnet/mxnet-engine/README.md
@@ -26,6 +26,7 @@
 The javadocs output is built in the `build/doc/javadoc` folder.
 
 ## Installation
+
 You can pull the MXNet engine from the central Maven repository by including the following dependency:
 
 ```xml
@@ -43,6 +44,7 @@ It will automatically determine the appropriate jars for your system based on th
 You can choose a native library based on your platform if you don't have network access at runtime.
 
 ### macOS
+
 For macOS, you can use the following library:
 
 - ai.djl.mxnet:mxnet-native-mkl:1.8.0:osx-x86_64
@@ -59,6 +61,7 @@ For macOS, you can use the following library:
 ```
 
 ### Linux
+
 For the Linux platform, you can choose between CPU, GPU.
 If you have Nvidia [CUDA](https://en.wikipedia.org/wiki/CUDA) installed on your GPU machine,
 you can use one of the following library:
@@ -115,7 +118,7 @@ DJL on Windows, please download and install
 
 For the Windows platform, you can use CPU package.
 MXNet windows GPU native library size are large, we no longer provide GPU package, instead you have to
-use [Automatic](#automatic-(recommended)) package.
+use the Automatic package.
 
 #### Windows GPU
 
diff --git a/extensions/benchmark/README.md b/extensions/benchmark/README.md
index 56ec1b78145..ab3a8ab0f2f 100644
--- a/extensions/benchmark/README.md
+++ b/extensions/benchmark/README.md
@@ -77,6 +77,7 @@ gradlew benchmark --args="--help"
 ```
 
 ## Prerequisite
+
 Please ensure Java 8+ is installed and you are using an OS that DJL supported with.
 
 After that, you need to clone the djl project and `cd` into the folder.
@@ -93,6 +94,7 @@ If you are trying to use GPU, please ensure the CUDA driver is installed. You ca
 ```
 nvcc -V
 ```
+
 to checkout the version. For different Deep Learning engine you are trying to run the benchmark,
 they have different CUDA version to support. Please check the individual Engine documentation to
 ensure your CUDA version is supported.
@@ -289,6 +291,7 @@ You can also do multi-threading inference with DJL. For example, if you would li
 ```
 -t 10
 ```
+
 Best thread number for your system: The same number of cores your system have or double of the total cores.
 You can also add `-l` to simulate the increment load for your inference server.
 It will add threads with the delay of time.
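The final hunk above touches the benchmark README's note on multi-threaded inference (`-t 10`). As a rough sketch of the pattern that option exercises, not the benchmark tool's own implementation, the following Java fragment shares one loaded `ZooModel` and gives each worker thread its own `Predictor`; the class name, helper signature, and pool sizing are illustrative assumptions.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import ai.djl.inference.Predictor;
import ai.djl.repository.zoo.ZooModel;

public final class MultiThreadedInferenceSketch {

    // Share one loaded model; each worker creates its own Predictor,
    // the common DJL pattern since a predictor is not shared across threads.
    static <I, O> void runConcurrently(ZooModel<I, O> model, I input, int numThreads)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(numThreads);
        for (int i = 0; i < numThreads; i++) {
            pool.submit(() -> {
                try (Predictor<I, O> predictor = model.newPredictor()) {
                    O output = predictor.predict(input);
                    System.out.println(Thread.currentThread().getName() + ": " + output);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
    }
}
```

Following the guidance quoted in the hunk, `numThreads` would usually be set near the machine's core count, or up to twice it.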