This repository has been archived by the owner on Sep 9, 2020. It is now read-only.

Fix max_prediction_explanations explanation (#144)
* Fix `max_prediction_explanations` explanation

* Fix up api version
Axik authored Nov 28, 2018
1 parent a5f2379 commit be4224a
Showing 1 changed file with 1 addition and 1 deletion: README.rst
@@ -140,7 +140,7 @@ The following table describes each of the arguments:
out=<filepath> \+ \+ Specifies the file name, and optionally path, to which the results are written. If not specified, the default file name is ``out.csv``, written to the directory containing the script. The value of the output file must be a single .csv file that can be gzipped (extension .gz).
verbose \+ \+ Provides status updates while the script is running. It is recommended that you include this argument to track script execution progress. Silent mode (non-verbose), the default, displays very little output.
keep_cols=<keep_cols> \+ \+ Specifies the column names to append to the predictions. Enter as a comma-separated list.
- max_prediction_explanations=<num> \+ \+ Specifies the number of the top prediction explanations to generate for each prediction. If not specified, the default is ``0``. **Not compatible with api_version** ``api/v1``.
+ max_prediction_explanations=<num> \+ \+ Specifies the number of the top prediction explanations to generate for each prediction. If not specified, the default is ``0``. **Compatible only with api_version** ``predApi/v1.0``.
n_samples=<n_samples> \+ \+ Specifies the number of samples (rows) to use per batch. If not defined, the ``auto_sample`` option is used.
n_concurrent=<n_concurrent> \+ \+ Specifies the number of concurrent requests to submit. By default, the script submits four concurrent requests. Set ``<n_concurrent>`` to match the number of cores in the prediction API endpoint.
create_api_token \+ \+ Requests a new API token. To use this option, you must specify the ``password`` argument for this request (not the ``api_token`` argument). Specifying this argument invalidates your existing API token and creates and stores a new token for future prediction requests.
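The arguments described in the table above can be combined in a single invocation. A minimal sketch follows; the script name ``batch_scoring``, the ``--flag`` spelling, and all argument values are assumptions for illustration and are not taken from this diff:

```shell
# Hypothetical invocation illustrating the documented arguments.
# Script name and exact flag syntax are assumptions, not confirmed by this commit.
batch_scoring \
    --out=predictions.csv \
    --verbose \
    --keep_cols=customer_id,region \
    --max_prediction_explanations=3 \
    --n_samples=1000 \
    --n_concurrent=4
```

Per the corrected table row, ``max_prediction_explanations`` works only with the ``predApi/v1.0`` API version, so an invocation like this would need to target that endpoint.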
