Release Process
The TileDB-Py release process consists of the following high-level steps:
- Build PyPI packages
  The source and wheel package creation process is automated in the file azure-release.yml in the TileDB-Py repository. This CI target runs for any branch starting with release- and creates both the release source package and the binary wheels for all platforms. The release process builds libtiledb once, before executing the Python packaging steps, so note that the libtiledb build steps in setup.py are not executed for release builds as they are for regular CI. The libtiledb build for Windows and macOS happens in azure-build-libtiledb, whereas the libtiledb build for Linux happens in Dockerfile2010.
- Upload PyPI packages (currently manual)
  Packages are uploaded to PyPI using twine. See the upload steps detailed below, and the Python Packaging User's Guide for background.
- Release conda-forge packages
  Conda packages for TileDB-Py are built by the tiledb-py-feedstock conda-forge repository. Most ongoing maintenance of this repository is automated by conda-forge infrastructure bots (for example, update pull requests are generated automatically after a PyPI upload, typically within 1-2 hours).
  In case of problems, start with the conda-forge documentation, in particular the Maintaining Packages section, which describes the package lifecycle and update steps. The conda-smithy tool is used to "rerender" feedstocks, bringing the repository-local CI scripts and configuration up to date with the conda-forge settings and templates (for example, re-rendering updates the repository relative to the global pins); a manual rerender is sketched below.
  Note: tiledb-py-feedstock depends on tiledb-feedstock, which builds TileDB Embedded (libtiledb), so that feedstock must be updated first for releases that bump the libtiledb version target (especially important for major releases).
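If a manual rerender is ever needed, the following is a minimal sketch (the clone location, working branch name, and install method are assumptions, not part of an official workflow; in most cases commenting "@conda-forge-admin, please rerender" on the feedstock pull request lets the bot do this instead):

  conda install -c conda-forge conda-smithy                 # install the rerendering tool
  git clone https://github.com/conda-forge/tiledb-py-feedstock.git
  cd tiledb-py-feedstock
  git checkout -b rerender-update                           # hypothetical working branch
  conda smithy rerender -c auto                             # regenerate CI config; -c auto commits the result
  # push the branch and open a PR against the feedstock for review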
- Update the following variables in misc/azure-release.yml (see example): TILEDBPY_VERSION, LIBTILEDB_VERSION, LIBTILEDB_SHA
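For orientation, the values can be checked with a quick grep; the sample output below is illustrative only, and LIBTILEDB_SHA should be the full commit SHA of the libtiledb release tag being targeted:

  $ grep -E 'TILEDBPY_VERSION|LIBTILEDB_VERSION|LIBTILEDB_SHA' misc/azure-release.yml
  # expect lines resembling:
  #   TILEDBPY_VERSION: 0.X.Y      (the TileDB-Py version being released)
  #   LIBTILEDB_VERSION: 2.X.Y     (the TileDB Embedded version target)
  #   LIBTILEDB_SHA: <full commit SHA of the corresponding libtiledb release tag>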
- Push the desired release branch to origin as release-X.Y.Z (assuming you are tagging from dev):
  git push origin dev:release-0.X.Y
  Note: the release builder does not currently run tests, so if you are tagging from a branch/commit that has not been tested, also push that commit for testing (git push origin <commit>:test-release-0.X.Y) and check the build results.
- Retrieve the build artifacts
  - Build progress is tracked on the TileDB-Py Azure Pipelines project; check the build status.
  - After the release build completes successfully, the Release stage should have 3 build artifacts available.
  - Download the "drop" artifact, which is saved as drop.zip. This file should contain the source tarball and wheel files (see Release Files on PyPI for prior versions).
  - Unzip drop.zip
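A minimal sketch of unpacking and inspecting the artifact (the download path is a placeholder):

  cd ~/Downloads               # or wherever drop.zip was saved
  unzip drop.zip -d drop
  ls drop                      # expect the source tarball plus one wheel per platform/Python version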
- Check import and versions in a clean test environment:
NOTE: deactivate any existing venv or conda environment before running these steps.
cd /tmp
python3 -m venv test-env
source test-env/bin/activate
# substitute unzip path from step (3)
pip install /PATH/TO/drop/tiledb-0.8.4-cp38-cp38-manylinux2010_x86_64.whl
python
>>> import tiledb
>>> tiledb.libtiledb.version()
>>> tiledb.version.version
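For a non-interactive variant of the same check, a one-liner such as the following can be used; the printed values should match the LIBTILEDB_VERSION and TILEDBPY_VERSION set in misc/azure-release.yml:

  python -c "import tiledb; print(tiledb.libtiledb.version(), tiledb.version.version)"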
- Upload the packages. See the PyPI Source Packaging section below for username/password:
- pip install twine (if necessary)
- cd /PATH/TO/drop
- twine upload tiledb-*
- Username:
- Password:
- Account Email:
Configuration:
>> cat ~/.pypirc
[distutils]
index-servers =
pypi
pypitest
[pypi]
repository=https://pypi.python.org/pypi
username=tiledbinc
password=...
[pypitest]
repository=https://testpypi.python.org/pypi
username=tiledbinc
password=...
- Paste the above file to ~/.pypirc (skip this step if you just want to enter the username/password each time you call twine upload)
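Before uploading, the archives can optionally be validated with twine's check subcommand (a sketch, assuming the drop directory from the artifact step above):

  cd /PATH/TO/drop
  twine check tiledb-*      # verifies that the package metadata/long description will render on PyPI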
Note that the package creation steps are now done automatically by the azure-release.yml build target, but these instructions are kept as a fallback reference.
- Build the “sdist” (source distribution) archive, which is what will be uploaded to PyPI:
$ cd TileDB-Py
$ git checkout -b release-x.y.z
- NOTE: sdist does not integrate with git, so the checkout directory should be clean (no untracked files listed by git status) to avoid picking up extra files.
$ git pull
$ rm -rf build dist tiledb.egg-info/ tiledb/native/ tiledb/libtiledb.cpp tiledb/libtiledb.cpython-36m-darwin.so
$ python setup.py sdist
- This should create the file dist/tiledb-x.y.z.tar.gz. If instead you get a file like dist/tiledb-x.y.(z+1).dev2.tar.gz, check to make sure that the x.y.z tag points to the latest commit of the release-x.y.z branch (a quick check is sketched below).
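An illustrative way to confirm the tag and the branch head agree (substitute the actual version; both commands should print the same commit hash):

  $ git rev-parse "x.y.z^{commit}"     # commit the x.y.z tag resolves to
  $ git rev-parse release-x.y.z        # head of the release branch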
- Upload the sdist archive to TestPyPI
- Original instructions: https://packaging.python.org/guides/using-testpypi/
$ pip install twine
$ twine upload --repository-url https://test.pypi.org/legacy/ dist/tiledb-x.y.z.tar.gz
- If the credentials for tiledbinc don't work, try logging in at https://test.pypi.org/account/login/. Periodically users are deleted from the TestPyPI database, so just recreate the account. Note that you will need to provide an email address for verification, even though it's on the testing server.
- Check that the project uploaded and that the README looks correct: https://test.pypi.org/project/tiledb/
- Check installation from pip works:
$ virtualenv tmpvenv
$ source tmpvenv/bin/activate
$ pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple tiledb
$ python -c "import tiledb ; print(tiledb.libtiledb.version())"
- If everything works, just upload the same .tar.gz archive to production PyPI:
- Note: Be sure this is the archive you want to upload, as you cannot replace it on PyPI. PyPI does not allow for a filename to be reused, even once a project has been deleted and recreated. After uploading this archive, fixes will have to occur in point releases. Because of this, make sure things that depend on TileDB-Py releases work first (e.g. the conda package for TileDB-Py). If something breaks with those, it's not possible to update the PyPI release archive with the fixes.
twine upload dist/tiledb-x.y.z.tar.gz
- (for versions of twine released before Jan 2019, use: twine upload --repository-url https://upload.pypi.org/legacy/ dist/tiledb-x.y.z.tar.gz)
Binary wheels should also be created for PyPI, meeting the manylinux standard. This process has been scripted in azure-release.yml, but these manual steps are included as a fallback reference:
- Build wheels: from a TileDB-Py checkout, run the following commands
  - docker build misc/pypi_linux
  - copy the resulting IMAGE_HASH displayed at the end of the docker build (Successfully built IMAGE_HASH): export IMAGE_HASH=<image hash> (an alternative that avoids copying the hash is sketched below, after the output check)
  - docker run -v `pwd`/wheels:/wheels -ti $IMAGE_HASH build.sh
  - (note: -v creates a volume mount from the `pwd`/wheels sub-directory to /wheels inside the container)
- Check output: the second command should display build progress, and should result in four .whl packages in the wheels/ sub-directory, one for each of Python 2.7 (cp27-cp27mu), 3.5 (cp35-cp35m), 3.6 (cp36-cp36m), and 3.7 (cp37-cp37m). Look over the docker run output if any packages are missing.
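As an alternative to copying the image hash in the build step above, the image can be given a name at build time (a sketch; the tag name here is arbitrary):

  docker build -t tiledb-py-wheels misc/pypi_linux
  docker run -v `pwd`/wheels:/wheels -ti tiledb-py-wheels build.sh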
- Verify wheels. The quickest way to do this is to use the manylinux1 docker image upon which the above process is based: it contains Python interpreters for all four versions. Ideally we would test on a different image.
  - docker run -v `pwd`/wheels:/wheels -ti quay.io/pypa/manylinux1_x86_64 bash
  - now inside the docker container:
    - export VERSION=<version>
    - /opt/python/cp27-cp27mu/bin/pip install /wheels/tiledb-$VERSION-cp27-cp27mu-manylinux1_x86_64.whl
    - /opt/python/cp27-cp27mu/bin/python -m unittest tiledb.tests.all.suite_test
    - Repeat the pip and test steps above for each of /opt/python/cp35-cp35m/bin/python, /opt/python/cp36-cp36m/bin/python, and /opt/python/cp37-cp37m/bin/python (see the loop sketch below).
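The repetition can be scripted; a minimal sketch, run inside the container, assuming the interpreter tags listed above:

  export VERSION=<version>
  for PYTAG in cp27-cp27mu cp35-cp35m cp36-cp36m cp37-cp37m; do
    /opt/python/$PYTAG/bin/pip install /wheels/tiledb-$VERSION-$PYTAG-manylinux1_x86_64.whl
    /opt/python/$PYTAG/bin/python -m unittest tiledb.tests.all.suite_test
  done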
- Assuming the tests pass, upload wheels to PyPI. From the same directory as above (the TileDB-Py checkout), run the following command:
  twine upload wheels/*
  (See the "PyPI Source Packaging" instructions above for PyPI login/configuration details.)