Update docstrings (#1289)
Edit docstrings so they can be rendered using MDX
pollfly authored Jun 20, 2024
1 parent 4c79e06 commit 60d138b
Showing 18 changed files with 121 additions and 111 deletions.
38 changes: 19 additions & 19 deletions clearml/automation/controller.py
@@ -480,7 +480,7 @@ def add_step(
The current step in the pipeline will be sent for execution only after all the parent nodes
have been executed successfully.
:param parameter_override: Optional parameter overriding dictionary.
- The dict values can reference a previously executed step using the following form '${step_name}'. Examples:
+ The dict values can reference a previously executed step using the following form ``'${step_name}'``. Examples:
- Artifact access ``parameter_override={'Args/input_file': '${<step_name>.artifacts.<artifact_name>.url}' }``
- Model access (last model used) ``parameter_override={'Args/input_file': '${<step_name>.models.output.-1.url}' }``
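These ``'${<step_name>.artifacts...}'`` strings are resolved against the referenced step's outputs before the step runs. A minimal, self-contained sketch of that kind of resolution in plain Python (the step and artifact names are hypothetical, and this is not the actual ClearML implementation):

```python
import re

def resolve_reference(ref, steps):
    """Resolve a '${step.section.name.field}'-style reference against a
    nested dict of step outputs. Non-reference strings pass through."""
    match = re.fullmatch(r"\$\{([^}]+)\}", ref)
    if not match:
        return ref
    value = steps
    for part in match.group(1).split("."):
        value = value[part]  # a KeyError here means a bad reference
    return value

# Hypothetical outputs of a step named 'preprocess'
steps = {"preprocess": {"artifacts": {"dataset": {"url": "s3://bucket/data.csv"}}}}
resolve_reference("${preprocess.artifacts.dataset.url}", steps)
# -> 's3://bucket/data.csv'
```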
@@ -494,11 +494,11 @@ def add_step(
:param configuration_overrides: Optional, override Task configuration objects.
Expected dictionary of configuration object name and configuration object content.
Examples:
- {'General': dict(key='value')}
- {'General': 'configuration file content'}
- {'OmegaConf': YAML.dumps(full_hydra_dict)}
+ ``{'General': dict(key='value')}``
+ ``{'General': 'configuration file content'}``
+ ``{'OmegaConf': YAML.dumps(full_hydra_dict)}``
:param task_overrides: Optional task section overriding dictionary.
- The dict values can reference a previously executed step using the following form '${step_name}'. Examples:
+ The dict values can reference a previously executed step using the following form ``'${step_name}'``. Examples:
- get the latest commit from a specific branch ``task_overrides={'script.version_num': '', 'script.branch': 'main'}``
- match git repository branch to a previous step ``task_overrides={'script.branch': '${stage1.script.branch}', 'script.version_num': ''}``
@@ -549,7 +549,7 @@ def add_step(
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
@@ -774,7 +774,7 @@ def mock_func(matrix_np):
If not provided automatically take all function arguments & defaults
Optional, pass input arguments to the function from other Tasks' output artifact.
Example argument named `numpy_matrix` from Task ID `aabbcc` artifact name `answer`:
- {'numpy_matrix': 'aabbcc.answer'}
+ ``{'numpy_matrix': 'aabbcc.answer'}``
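The ``'aabbcc.answer'`` value packs a task ID and an artifact name into one string. A tiny sketch of how such a string could be split (plain Python, not the ClearML implementation; the IDs are hypothetical):

```python
def parse_artifact_ref(ref):
    """Split a '<task_id>.<artifact_name>' string into its two parts.
    Split only on the first dot, since artifact names may contain dots."""
    task_id, _, artifact = ref.partition(".")
    if not task_id or not artifact:
        raise ValueError("expected '<task_id>.<artifact_name>', got %r" % ref)
    return task_id, artifact

parse_artifact_ref("aabbcc.answer")  # -> ('aabbcc', 'answer')
```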
:param function_return: Provide a list of names for all the results.
If not provided, no results will be stored as artifacts.
:param project_name: Set the project name for the task. Required if base_task_id is None.
@@ -842,7 +842,7 @@ def mock_func(matrix_np):
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
@@ -991,7 +991,7 @@ def start(
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
@@ -1416,7 +1416,7 @@ def add_parameter(self, name, default=None, description=None, param_type=None):
The parameter can be used as input parameter for any step in the pipeline.
Notice all parameters will appear under the PipelineController Task's Hyper-parameters -> Pipeline section
Example: pipeline.add_parameter(name='dataset', description='dataset ID to process the pipeline')
- Then in one of the steps we can refer to the value of the parameter with '${pipeline.dataset}'
+ Then in one of the steps we can refer to the value of the parameter with ``'${pipeline.dataset}'``
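Conceptually, a ``'${pipeline.<name>}'`` placeholder inside a step argument gets substituted with the pipeline parameter's value. A minimal stand-alone sketch of that substitution (plain Python with a plain dict of parameters; not the ClearML implementation):

```python
import re

def fill_pipeline_params(value, params):
    """Replace every '${pipeline.<name>}' placeholder in a string with
    the matching entry from a dict of pipeline parameters."""
    return re.sub(
        r"\$\{pipeline\.([^}]+)\}",
        lambda m: str(params[m.group(1)]),
        value,
    )

fill_pipeline_params("process --dataset ${pipeline.dataset}", {"dataset": "ds-123"})
# -> 'process --dataset ds-123'
```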
:param name: String name of the parameter.
:param default: Default value to be put as the default value (can be later changed in the UI)
@@ -1445,7 +1445,7 @@ def enqueue(cls, pipeline_controller, queue_name=None, queue_id=None, force=Fals
.. note::
A worker daemon must be listening at the queue for the worker to fetch the Task and execute it,
- see `ClearML Agent <../clearml_agent>`_ in the ClearML Documentation.
+ see "ClearML Agent" in the ClearML Documentation.
:param pipeline_controller: The PipelineController to enqueue. Specify a PipelineController object or PipelineController ID
:param queue_name: The name of the queue. If not specified, then ``queue_id`` must be specified.
@@ -1661,7 +1661,7 @@ def _start(
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
@@ -2127,7 +2127,7 @@ def mock_func(matrix_np):
If not provided automatically take all function arguments & defaults
Optional, pass input arguments to the function from other Tasks' output artifact.
Example argument named `numpy_matrix` from Task ID `aabbcc` artifact name `answer`:
- {'numpy_matrix': 'aabbcc.answer'}
+ ``{'numpy_matrix': 'aabbcc.answer'}``
:param function_return: Provide a list of names for all the results.
If not provided, no results will be stored as artifacts.
:param project_name: Set the project name for the task. Required if base_task_id is None.
@@ -2195,7 +2195,7 @@ def mock_func(matrix_np):
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
@@ -3015,7 +3015,7 @@ def _daemon(self):
def _parse_step_ref(self, value, recursive=False):
# type: (Any) -> Optional[str]
"""
- Return the step reference. For example "${step1.parameters.Args/param}"
+ Return the step reference. For example ``"${step1.parameters.Args/param}"``
:param value: string
:param recursive: if True, recursively parse all values in the dict, list or tuple
:return:
@@ -3047,7 +3047,7 @@ def _parse_step_ref(self, value, recursive=False):
def _parse_task_overrides(self, task_overrides):
# type: (dict) -> dict
"""
- Return the step reference. For example "${step1.parameters.Args/param}"
+ Return the step reference. For example ``"${step1.parameters.Args/param}"``
:param task_overrides: string
:return:
"""
@@ -3265,11 +3265,11 @@ def _get_pipeline_task(cls):
def __verify_step_reference(self, node, step_ref_string):
# type: (PipelineController.Node, str) -> Optional[str]
"""
- Verify the step reference. For example "${step1.parameters.Args/param}"
+ Verify the step reference. For example ``"${step1.parameters.Args/param}"``
Raise ValueError on misconfiguration
:param Node node: calling reference node (used for logging)
- :param str step_ref_string: For example "${step1.parameters.Args/param}"
+ :param str step_ref_string: For example ``"${step1.parameters.Args/param}"``
:return: If step reference is used, return the pipeline step name, otherwise return None
"""
parts = step_ref_string[2:-1].split('.')
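The verification above starts from exactly that ``step_ref_string[2:-1].split('.')`` parse. A standalone sketch of the same idea, with hypothetical step names and simplified error handling (the real method performs many more checks):

```python
def step_name_from_ref(step_ref_string, known_steps):
    """Extract and validate the step name from a '${step.<...>}' reference,
    mirroring the `step_ref_string[2:-1].split('.')` parsing.
    Raises ValueError on a malformed or unknown reference."""
    if not (step_ref_string.startswith("${") and step_ref_string.endswith("}")):
        raise ValueError("not a step reference: %r" % step_ref_string)
    parts = step_ref_string[2:-1].split(".")
    if len(parts) < 2:
        raise ValueError("incomplete step reference: %r" % step_ref_string)
    if parts[0] not in known_steps:
        raise ValueError("unknown step %r" % parts[0])
    return parts[0]

step_name_from_ref("${step1.parameters.Args/param}", {"step1"})  # -> 'step1'
```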
@@ -4076,7 +4076,7 @@ def example_retry_on_failure_callback(pipeline, node, retries):
the Node is skipped and so is any node in the DAG that relies on this node.
Notice the `parameters` are already parsed,
- e.g. `${step1.parameters.Args/param}` is replaced with relevant value.
+ e.g. ``${step1.parameters.Args/param}`` is replaced with relevant value.
.. code-block:: py
26 changes: 13 additions & 13 deletions clearml/automation/hpbandster/bandster.py
@@ -135,17 +135,18 @@ def __init__(
Optimization. Instead of sampling new configurations at random,
BOHB uses kernel density estimators to select promising candidates.
- .. note::
+ .. code-block::
For reference:
@InProceedings{falkner-icml-18,
- title = {{BOHB}: Robust and Efficient Hyperparameter Optimization at Scale},
- author = {Falkner, Stefan and Klein, Aaron and Hutter, Frank},
- booktitle = {Proceedings of the 35th International Conference on Machine Learning},
- pages = {1436--1445},
- year = {2018},
+     title = {{BOHB}: Robust and Efficient Hyperparameter Optimization at Scale},
+     author = {Falkner, Stefan and Klein, Aaron and Hutter, Frank},
+     booktitle = {Proceedings of the 35th International Conference on Machine Learning},
+     pages = {1436--1445},
+     year = {2018},
}
:param str base_task_id: Task ID (str)
:param list hyper_parameters: list of Parameter objects to optimize over
:param Objective objective_metric: Objective metric to maximize / minimize
@@ -218,18 +219,17 @@ def set_optimization_args(
Optimization. Instead of sampling new configurations at random,
BOHB uses kernel density estimators to select promising candidates.
- .. note::
+ .. code-block::
For reference:
@InProceedings{falkner-icml-18,
- title = {{BOHB}: Robust and Efficient Hyperparameter Optimization at Scale},
- author = {Falkner, Stefan and Klein, Aaron and Hutter, Frank},
- booktitle = {Proceedings of the 35th International Conference on Machine Learning},
- pages = {1436--1445},
- year = {2018},
+     title = {{BOHB}: Robust and Efficient Hyperparameter Optimization at Scale},
+     author = {Falkner, Stefan and Klein, Aaron and Hutter, Frank},
+     booktitle = {Proceedings of the 35th International Conference on Machine Learning},
+     pages = {1436--1445},
+     year = {2018},
}
:param eta: float (3)
In each iteration, a complete run of sequential halving is executed. In it,
after evaluating each configuration on the same subset size, only a fraction of
8 changes: 4 additions & 4 deletions clearml/automation/job.py
@@ -531,13 +531,13 @@ def __init__(
:param str base_task_id: base task ID to clone from
:param dict parameter_override: dictionary of parameters and values to set for the cloned task
:param dict task_overrides: Task object specific overrides.
- for example {'script.version_num': None, 'script.branch': 'main'}
+ for example ``{'script.version_num': None, 'script.branch': 'main'}``
:param configuration_overrides: Optional, override Task configuration objects.
Expected dictionary of configuration object name and configuration object content.
Examples:
- {'config_section': dict(key='value')}
- {'config_file': 'configuration file content'}
- {'OmegaConf': YAML.dumps(full_hydra_dict)}
+ ``{'config_section': dict(key='value')}``
+ ``{'config_file': 'configuration file content'}``
+ ``{'OmegaConf': YAML.dumps(full_hydra_dict)}``
:param list tags: additional tags to add to the newly cloned task
:param str parent: Set newly created Task parent task field, default: base_task_id.
:param dict kwargs: additional Task creation parameters
2 changes: 1 addition & 1 deletion clearml/automation/monitor.py
@@ -124,7 +124,7 @@ def get_query_parameters(self):
Return the query parameters for the monitoring.
This should be overloaded with specific implementation query
- :return dict: Example dictionary: {'status': ['failed'], 'order_by': ['-last_update']}
+ :return dict: Example dictionary: ``{'status': ['failed'], 'order_by': ['-last_update']}``
"""
return dict(status=['failed'], order_by=['-last_update'])
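Since ``get_query_parameters`` is meant to be overloaded, a subclass only has to return a different query dict. A minimal sketch using a stand-in base class (the class names here are hypothetical, not the real clearml monitor API):

```python
class MonitorSketch:
    """Stand-in for the monitor base class (not the real clearml class)."""
    def get_query_parameters(self):
        # default: watch failed tasks, newest first
        return dict(status=['failed'], order_by=['-last_update'])

class CompletedTasksMonitor(MonitorSketch):
    def get_query_parameters(self):
        # overload: watch completed tasks instead
        return dict(status=['completed'], order_by=['-last_update'])

CompletedTasksMonitor().get_query_parameters()
# -> {'status': ['completed'], 'order_by': ['-last_update']}
```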

4 changes: 2 additions & 2 deletions clearml/automation/optimization.py
@@ -638,7 +638,7 @@ def get_top_experiments_details(
:param all_hyper_parameters: Default False. If True, return all the hyperparameters from all the sections.
:param only_completed: return only completed Tasks. Default False.
- :return: A list of dictionaries ({task_id: '', hyper_parameters: {}, metrics: {}}), ordered by performance,
+ :return: A list of dictionaries ``({task_id: '', hyper_parameters: {}, metrics: {}})``, ordered by performance,
where index 0 is the best performing Task.
Example w/ all_metrics=False:
@@ -1733,7 +1733,7 @@ def get_top_experiments_details(
:param all_hyper_parameters: Default False. If True, return all the hyperparameters from all the sections.
:param only_completed: return only completed Tasks. Default False.
- :return: A list of dictionaries ({task_id: '', hyper_parameters: {}, metrics: {}}), ordered by performance,
+ :return: A list of dictionaries ``({task_id: '', hyper_parameters: {}, metrics: {}})``, ordered by performance,
where index 0 is the best performing Task.
Example w/ all_metrics=False:
32 changes: 18 additions & 14 deletions clearml/automation/parameters.py
@@ -69,7 +69,7 @@ def to_list(self):
"""
Return a list of all the valid values of the Parameter.
- :return: List of dicts {name: value}
+ :return: List of dicts ``{name: value}``
"""
pass

@@ -147,7 +147,7 @@ def get_value(self):
"""
Return uniformly sampled value based on object sampling definitions.
- :return: {self.name: random value [self.min_value, self.max_value)}
+ :return: ``{self.name: random value [self.min_value, self.max_value)}``
"""
if not self.step_size:
return {self.name: self._random.uniform(self.min_value, self.max_value)}
@@ -160,7 +160,7 @@ def to_list(self):
Return a list of all the valid values of the Parameter. If ``self.step_size`` is not defined, return 100 points
between min/max values.
- :return: list of dicts {name: float}
+ :return: list of dicts ``{name: float}``
"""
step_size = self.step_size or (self.max_value - self.min_value) / 100.
steps = (self.max_value - self.min_value) / step_size
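The two lines above carry the whole fallback logic: without a step size, the range is split into 100 evenly spaced points. A standalone sketch of that grid construction (plain Python; a simplification of the real method, which also handles an include-max option):

```python
def uniform_to_list(min_value, max_value, step_size=None):
    """Build the grid of candidate values: when no step size is given,
    fall back to 100 evenly spaced points across [min_value, max_value)."""
    step_size = step_size or (max_value - min_value) / 100.0
    steps = int((max_value - min_value) / step_size)
    return [min_value + i * step_size for i in range(steps)]

uniform_to_list(0.0, 1.0, 0.25)   # -> [0.0, 0.25, 0.5, 0.75]
len(uniform_to_list(0.0, 1.0))    # -> 100
```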
@@ -208,7 +208,7 @@ def get_value(self):
"""
Return uniformly logarithmic sampled value based on object sampling definitions.
- :return: {self.name: random value self.base^[self.min_value, self.max_value)}
+ :return: ``{self.name: random value self.base^[self.min_value, self.max_value)}``
"""
values_dict = super().get_value()
return {self.name: self.base**v for v in values_dict.values()}
@@ -250,7 +250,7 @@ def get_value(self):
"""
Return uniformly sampled value based on object sampling definitions.
- :return: {self.name: random value [self.min_value, self.max_value)}
+ :return: ``{self.name: random value [self.min_value, self.max_value)}``
"""
return {self.name: self._random.randrange(
start=self.min_value, step=self.step_size,
@@ -262,7 +262,7 @@ def to_list(self):
Return a list of all the valid values of the Parameter. If ``self.step_size`` is not defined, return 100 points
between minmax values.
- :return: list of dicts {name: int}
+ :return: list of dicts ``{name: int}``
"""
values = list(range(self.min_value, self.max_value, self.step_size))
if self.include_max and (not values or values[-1] < self.max_value):
@@ -291,7 +291,7 @@ def get_value(self):
"""
Return uniformly sampled value from the valid list of values.
- :return: {self.name: random entry from self.value}
+ :return: ``{self.name: random entry from self.value}``
"""
return {self.name: self._random.choice(self.values)}

@@ -300,7 +300,7 @@ def to_list(self):
"""
Return a list of all the valid values of the Parameter.
- :return: list of dicts {name: value}
+ :return: list of dicts ``{name: value}``
"""
return [{self.name: v} for v in self.values]

@@ -321,15 +321,19 @@ def __init__(self, parameter_combinations=()):
.. code-block:: javascript
- [ {"opt1": 10, "arg2": 20, "arg2": 30},
-   {"opt2": 11, "arg2": 22, "arg2": 33} ]
+ [
+     {"opt1": 10, "arg2": 20, "arg2": 30},
+     {"opt2": 11, "arg2": 22, "arg2": 33}
+ ]
Two complex combinations, each sampled from a different range:
.. code-block:: javascript
- [ {"opt1": UniformParameterRange('arg1',0,1) , "arg2": 20},
-   {"opt2": UniformParameterRange('arg1',11,12), "arg2": 22},]
+ [
+     {"opt1": UniformParameterRange('arg1',0,1) , "arg2": 20},
+     {"opt2": UniformParameterRange('arg1',11,12), "arg2": 22},
+ ]
"""
super(ParameterSet, self).__init__(name=None)
self.values = parameter_combinations
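Each combination dict can mix fixed values with nested parameter ranges, and enumerating a set means expanding every such range. A simplified standalone sketch, where a plain list stands in for a nested Parameter object (not the real ``ParameterSet.to_list`` implementation):

```python
from itertools import product

def expand_combinations(parameter_combinations):
    """Expand each combination dict into concrete dicts: treat any list
    value as a set of candidates and take their cartesian product."""
    expanded = []
    for combination in parameter_combinations:
        keys = list(combination)
        candidate_lists = [
            v if isinstance(v, list) else [v] for v in combination.values()
        ]
        for values in product(*candidate_lists):
            expanded.append(dict(zip(keys, values)))
    return expanded

expand_combinations([{"opt1": [10, 11], "arg2": 20}])
# -> [{'opt1': 10, 'arg2': 20}, {'opt1': 11, 'arg2': 20}]
```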
@@ -339,7 +343,7 @@ def get_value(self):
"""
Return uniformly sampled value from the valid list of values.
- :return: {self.name: random entry from self.value}
+ :return: ``{self.name: random entry from self.value}``
"""
return self._get_value(self._random.choice(self.values))

@@ -348,7 +352,7 @@ def to_list(self):
"""
Return a list of all the valid values of the Parameter.
- :return: list of dicts {name: value}
+ :return: list of dicts ``{name: value}``
"""
combinations = []
for combination in self.values:
4 changes: 2 additions & 2 deletions clearml/automation/scheduler.py
@@ -628,9 +628,9 @@ def add_task(
then recurring based on the timing schedule arguments. Default False.
:param reuse_task: If True, re-enqueue the same Task (i.e. do not clone it) every time, default False.
:param task_parameters: Configuration parameters to the executed Task.
- for example: {'Args/batch': '12'} Notice: not available when reuse_task=True
+ for example: ``{'Args/batch': '12'}`` Notice: not available when reuse_task=True
:param task_overrides: Change task definition.
- for example {'script.version_num': None, 'script.branch': 'main'} Notice: not available when reuse_task=True
+ for example ``{'script.version_num': None, 'script.branch': 'main'}`` Notice: not available when reuse_task=True
:return: True if job is successfully added to the scheduling list
"""