Releases: Netflix/metaflow

2.10.10

15 Jan 10:28
36c5bd9

Features

Support git repositories as @pypi dependencies

This release adds support for git repositories as dependencies for the @pypi decorator.

Usage

You can pin the dependency to a specific version with an "@branch", "@commit", or "@tag" suffix, or track the head of the default branch by omitting the suffix.
Under the hood, @pypi resolves the dependency to a concrete commit at the moment the environment is created.
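The pinning rules above can be sketched with a small helper; parse_git_dep is a hypothetical function for illustration, not Metaflow's actual resolver:

```python
def parse_git_dep(spec):
    """Split a git dependency spec into (repo_url, ref).

    A ref suffix like "@branch", "@commit" or "@tag" pins the version;
    with no suffix the head of the default branch is used (ref=None).
    Hypothetical helper, for illustration only.
    """
    url, ref = spec, None
    # the scheme contains "+" and "://", so only treat a "@" that
    # appears after the scheme separator as a ref suffix
    at = spec.rfind("@")
    if at > spec.find("://"):
        url, ref = spec[:at], spec[at + 1:]
    return url, ref
```

For example, `parse_git_dep("git+https://github.com/Netflix/metaflow@v2.10.10")` yields the repository URL plus the pinned tag, while the same URL without a suffix yields `ref=None`, i.e. "resolve to the head of the default branch at environment-creation time".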

Limitations

Currently, conda and pypi environments are created in advance, before deploying to remote platforms; this includes gathering all the dependencies and bundling them up. pypi dependencies have been limited to binary wheels only, in order to support cross-platform deployments.

For git sources, we build a wheel from the source and check whether it is compatible with the target platform. This rules out sources whose output wheel is a platform-specific binary when the build and target platforms differ, for example building on an ARM machine while deploying to x86.
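The compatibility check described above can be sketched roughly as follows. The parsing is a simplification (real wheel tags follow the full PEP 425/427 rules, including compressed tag sets), so treat this as illustrative only:

```python
def wheel_platform(filename):
    """Extract the platform tag from a wheel filename.

    PEP 427 layout: name-version(-build)?-python-abi-platform.whl.
    Simplified sketch: takes the last dash-separated component.
    """
    stem = filename[:-len(".whl")]
    return stem.split("-")[-1]

def compatible(filename, target_platform):
    """A pure-Python wheel ("any") runs anywhere; in this sketch a
    platform-specific wheel must match the deployment target exactly."""
    plat = wheel_platform(filename)
    return plat == "any" or plat == target_platform
```

A wheel built on an ARM Mac (`...-macosx_11_0_arm64.whl`) would fail this check against a Linux x86 target, which is exactly the limitation the paragraph above describes.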

What's Changed

Full Changelog: 2.10.9...2.10.10

2.10.9

11 Jan 18:49
036c8df

Features

EFS Volume support for @batch

This release adds support for mounting existing EFS volumes with @batch

Usage

You can specify one or more volumes to be mounted with the batch decorator's efs_volumes= attribute. Each volume is mounted under /mnt/<volume-name> by default, but you can also specify a custom mount point by appending it to the volume name, separated by a colon.

Examples

@batch(efs_volumes="fs-001")
@step
...
@batch(efs_volumes=["fs-002", "fs-003"])
@step
...
@batch(efs_volumes="fs-003:/mnt/custom-mountpoint")
@step
...
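The mount-point convention above (default /mnt/<volume-name>, or an explicit path after a colon) can be sketched as follows; this helper is illustrative, not Metaflow's internal code:

```python
def resolve_mounts(efs_volumes):
    """Map an efs_volumes attribute (string or list of strings) to
    (volume_id, mount_point) pairs.

    "fs-001" mounts at /mnt/fs-001 by default;
    "fs-001:/mnt/custom" mounts at the given path.
    Illustrative sketch only.
    """
    if isinstance(efs_volumes, str):
        efs_volumes = [efs_volumes]
    mounts = []
    for vol in efs_volumes:
        if ":" in vol:
            vol_id, mount = vol.split(":", 1)
        else:
            vol_id, mount = vol, "/mnt/%s" % vol
        mounts.append((vol_id, mount))
    return mounts
```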

Support custom mount point for host volumes with @batch

This release also adds support for custom mount points for host volumes with @batch.
Example

@batch(host_volumes="/home:/mnt/host-homedir")
@step

Improvements

@conda and @pypi changes

@pypi can now handle dependencies with special characters in versions

Fixes a @conda bug where environments were not appended to PATH at runtime if the decorator was applied implicitly to a step. This affected dependencies that install a binary: the binary could not be found at runtime because the environment was missing from PATH.

Decorator lifecycle improvement

Previously the lifecycle methods task_pre_step and task_decorate were being called consecutively on a per-decorator basis. This release changes the lifecycle so that task_pre_step is called on all decorators before any task_decorate is called.
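The change can be illustrated with a toy scheduler; the decorator class and driver below are stand-ins for illustration, not Metaflow's API:

```python
class Deco:
    """Toy decorator that records which lifecycle hook ran."""
    def __init__(self, name, log):
        self.name, self.log = name, log
    def task_pre_step(self):
        self.log.append(("pre_step", self.name))
    def task_decorate(self):
        self.log.append(("decorate", self.name))

def run_lifecycle(decorators):
    # New behavior: task_pre_step runs for ALL decorators before any
    # task_decorate runs. Previously the two hooks were interleaved
    # per decorator (pre_step, decorate, pre_step, decorate, ...).
    for d in decorators:
        d.task_pre_step()
    for d in decorators:
        d.task_decorate()
```

With two decorators "a" and "b", the log reads pre_step(a), pre_step(b), decorate(a), decorate(b) instead of the old pre_step(a), decorate(a), pre_step(b), decorate(b).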

Better checking of datastore dependencies

This release improves how datastore dependencies are checked, avoiding an issue where run metadata could be published to the metadata service even though the execution then failed to start because dependencies for pushing data to the datastore were missing.
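The fix amounts to validating datastore dependencies before anything is published to the metadata service; a toy sketch of that ordering (function names are hypothetical):

```python
def start_run(check_datastore, publish_metadata, execute):
    """Fail fast if the datastore's dependencies are missing, so run
    metadata is never published for a run that cannot store data.
    Illustrative ordering only; not Metaflow's actual entry point.
    """
    check_datastore()   # raises if dependencies are missing
    publish_metadata()  # safe: datastore is known to be usable
    return execute()
```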

What's Changed

New Contributors

Full Changelog: 2.10.8...2.10.9

Metaflow 2.10.8

04 Dec 21:08
cf4b4b6

Improvements

  • Warnings about regex patterns not being raw strings were fixed.
  • An issue with how the environment escape dealt with aliased modules was fixed.

What's Changed

New Contributors

Full Changelog: 2.10.7...2.10.8

2.10.7

17 Nov 12:27
02d769d

Improvements

pypi decorator enhancements

This release fixes support for pip environment variables that specify a custom location for the config file (PIP_CONFIG_FILE or PIP_CONFIG).

The release also adds support for defining a custom index-url through the pip-supported environment variable PIP_INDEX_URL.
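Both behaviors can be driven from the environment before running a flow; the path and URL below are placeholders, not defaults:

```shell
# Point pip (and therefore @pypi dependency resolution) at a custom
# config file and a custom package index. Values are examples only.
export PIP_CONFIG_FILE="$HOME/.config/pip/pip.conf"
export PIP_INDEX_URL="https://private.example.com/simple"
```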

What's Changed

Full Changelog: 2.10.6...2.10.7

2.10.6

03 Nov 14:00
6badc1d

Improvements

Fix environment activation issue with pypi decorator

The pypi decorator had a bug that caused it to be treated as disabled unless specifically passing disabled=False as an attribute to it.
This release fixes the default case so that pypi environments activate correctly.

Add debug flag to tracing

This release adds a METAFLOW_DEBUG_TRACING environment variable to toggle more verbose output for tracing related issues.

By default, any errors related to missing tracing dependencies are now silenced completely, so as not to affect platforms that want tracing environment variables present for all deployments, whether or not the required dependencies are installed.
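To surface the otherwise-silenced tracing errors, set the flag before running a flow:

```shell
# Enable verbose output for tracing-related issues
export METAFLOW_DEBUG_TRACING=1
```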

What's Changed

Full Changelog: 2.10.5...2.10.6

2.10.5

30 Oct 16:10
a9add2a

What's Changed

Full Changelog: 2.10.4...2.10.5

2.10.4

26 Oct 18:27
d8ad275

Features

Support for tracing

With this release it is possible to gather telemetry data using an OpenTelemetry endpoint.

Specifying an endpoint in one of the environment variables

  • METAFLOW_OTEL_ENDPOINT
  • METAFLOW_ZIPKIN_ENDPOINT

will enable the corresponding tracing provider.

Some additional dependencies are required for the tracing functionality in the execution environment. These can be installed in the base Docker image, or supplied through a conda environment. The relevant packages are

opentelemetry-sdk, opentelemetry-api, opentelemetry-instrumentation, opentelemetry-instrumentation-requests

and depending on your endpoint, either opentelemetry-exporter-otlp or opentelemetry-exporter-zipkin
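For example, for an OTLP endpoint the setup could look like the following; this assumes the opentelemetry packages listed above are already installed in the base Docker image or conda environment, and the collector URL is a placeholder:

```shell
# Point Metaflow at an OTLP collector; setting this variable enables
# the corresponding tracing provider. URL is an example only.
export METAFLOW_OTEL_ENDPOINT="https://otel-collector.example.com:4317"
```

Setting METAFLOW_ZIPKIN_ENDPOINT instead would select the Zipkin provider, paired with opentelemetry-exporter-zipkin.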

Custom index support for the pypi decorator

The pypi decorator now supports using a custom index configured in the user's pip configuration under global.index-url.
This enables using private indices, even ones that require authentication.

For example, the following would set up one authenticated index and two extra non-authenticated indices for package resolution:

pip config set global.index-url "https://user:pass@private.example.com"
pip config set global.extra-index-url "https://extra.example.com https://extra2.example.com"

Specify Kubernetes job ephemeral storage size through resources decorator

It is now possible to specify the ephemeral storage size for Kubernetes jobs when using the resources decorator with the disk= attribute.
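On Kubernetes, a disk= request plausibly translates into an ephemeral-storage entry in the pod's resource spec; the mapping below is an illustrative guess, not Metaflow's actual code:

```python
def ephemeral_storage_spec(disk_mb):
    """Build a Kubernetes-style resource dict requesting ephemeral
    storage. disk_mb mirrors the decorator's disk= attribute (in MB).
    Hypothetical sketch only.
    """
    qty = "%dM" % disk_mb
    return {"requests": {"ephemeral-storage": qty},
            "limits": {"ephemeral-storage": qty}}
```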

Introduce argo-workflows status command

Adds a command for easily checking the current status of a workflow on Argo workflows.

python flow.py argo-workflows status [run-id]

Improvements

Add more randomness to Kubernetes pod names to avoid collisions

There was an issue where relying solely on the Kubernetes apiserver to generate random pod names resulted in significant collisions at a sufficiently large number of executions.

This release adds more randomness to the pod names besides what is generated by Kubernetes.
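The idea is simply to append client-side entropy to the pod name rather than relying only on the apiserver's name generation; a sketch (the suffix length is an assumption):

```python
import secrets

def randomize_pod_name(base, entropy_bytes=4):
    """Append a random hex suffix so two pods created from the same
    base name are very unlikely to collide. Illustrative sketch."""
    return "%s-%s" % (base, secrets.token_hex(entropy_bytes))
```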

Fix issues with resources decorator in combination with step functions

This release fixes an issue where deploying flows on AWS Step Functions was failing in the following cases

  • @resources(shared_memory=) with any value
  • combining @resources and @batch(use_tmpfs=True)

What's Changed

New Contributors

Full Changelog: 2.10.3...2.10.4

2.10.3

18 Oct 10:45
783715d

What's Changed

New Contributors

Full Changelog: 2.10.2...2.10.3

2.10.2

09 Oct 20:22
0acf15a

Features

  • New configuration option to use the same headers as the metadata service for Argo Events webhook calls by @oavdeev in #1560. Default behavior is unchanged.

  • Metaflow CLI now supports a list-workflow-templates command to list deployed Argo workflows by @saikonen in #1577

Full Changelog: 2.10.0...2.10.2

2.10.0

06 Oct 01:21
9d3f860

Coming soon!

Full Changelog: 2.9.15...2.10.0