Opinionated templates for Azure Pipelines builds.
This repo is meant to be used as a central repository for an organization's Azure Pipelines templates, so that all libs in an organization can reuse the same core build template without having to update each lib independently when the build process changes.
Supported build types:
- Python package builds - single python version (python_package_main.yml)
- Golang module builds (go_module_main.yml)
To use these templates:
- Fork this repo into your organization. Azure DevOps requires that remote templates referenced by a pipeline be part of the same organization.
- Set up an Azure DevOps account and an Azure Pipelines project.
- Connect a pipeline to your github repo.
- Make the repo accessible via ssh: these templates use ssh to handle git commands. See the Azure DevOps directions for enabling ssh access.
- Use a dev branch: builds are triggered from the dev branch, and changes are merged into master at the end of a successful build.
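In the repo-side pipeline definition, restricting CI to the dev branch is a small trigger block (a minimal sketch; the actual trigger lives in the repo_yaml_templates definitions):

```yaml
# Run CI builds only for commits to the dev branch.
trigger:
  branches:
    include:
      - dev
```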
There are a few other repos related to these templates.
- azure-pipelines-scripts: Helper scripts for some pipeline steps. Can be forked or left as-is if no changes are desired. If forked, the BUILD_SCRIPTS_REPO variable in variables.yml should be changed to point at the fork.
- islelib-py: python package template with all tooling configs preset.
- isleservice-py: python service template with all tooling configs preset.
- isleconsumer-py: python consumer service template with all tooling configs preset.
- islelib-go: golang module template with all tooling configs preset.
The /repo_yaml_templates folder contains template pipeline definitions to be placed in your repos; they reference these core templates.
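For illustration, a consuming repo's pipeline definition references this repo and hands off to a core template roughly like so (a sketch; the repository name, service connection, and exact hand-off style are assumptions, so use the definitions in /repo_yaml_templates as the source of truth):

```yaml
# azure-pipelines.yml in a consuming repo (hypothetical names).
resources:
  repositories:
    - repository: templates             # alias used when referencing template files
      type: github
      name: my-org/azure-pipelines      # your fork of this repo (assumed name)
      endpoint: my-github-connection    # GitHub service connection (assumed name)

extends:
  # Assumes the root template is written as an extendable pipeline; it may
  # instead be consumed as a jobs/steps template, depending on its structure.
  template: python_package_main.yml@templates
```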
Builds are performed in a single Linux job in order to reduce build time. All pipelines follow the same general order:
1. Lint: Run linters to check code style.
2. Test: Run tests.
3. Check test coverage: By default, 85% code coverage is required.
4. Version up: The pipeline automatically chooses the next available patch version for the target major.minor version in setup.cfg (see the sketch after this list).
5. Build docs: Static html docs are built to ./docs/. There are options for publishing to an S3 bucket, github pages, or both. This build is added to the merge for master.
6. Build software: Creates the target build type for the software (go binary / python package / docker container) if needed.
7. Upload build: To pypi / dockerhub, etc.
8. Git tag version: A version tag is added to git (ex: v1.1.13).
9. Force merge master: The source code is tagged with the version and force merged into master, favoring the dev branch in code conflicts.
10. Master and tags pushed to git: The new master branch and build tags are pushed to git.
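The version-up step can be pictured as deriving the next free patch number from existing git tags (a hypothetical sketch; the real logic lives in the azure-pipelines-scripts helpers):

```yaml
steps:
  # Sketch: choose the next available patch for an assumed target MAJOR.MINOR.
  - bash: |
      MAJOR_MINOR="1.1"  # in the real pipeline this comes from setup.cfg
      # Find the highest existing patch tag for this major.minor, if any.
      LAST_PATCH=$(git tag --list "v${MAJOR_MINOR}.*" \
        | sed "s/^v${MAJOR_MINOR}\.//" \
        | sort -n | tail -1)
      NEXT_PATCH=$(( ${LAST_PATCH:--1} + 1 ))
      # Expose the chosen version to later pipeline steps.
      echo "##vso[task.setvariable variable=BUILD_VERSION]${MAJOR_MINOR}.${NEXT_PATCH}"
    displayName: Pick next patch version
```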
The pipeline is run on any PR made to dev by default. Build steps (stages 5-10 above) are not executed during PR validation.
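In Azure Pipelines YAML, PR validation against dev is expressed with a pr trigger (a minimal sketch):

```yaml
# Validate pull requests that target the dev branch.
pr:
  branches:
    include:
      - dev
```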
Push docs to S3 Bucket: Whenever a new commit is made to the master branch, documentation is built as part of the pipeline and can be pushed to an Amazon S3 bucket for hosting. The documentation will be pushed to two locations:
- s3://{docs_bucket}/{repo_name}/latest
- s3://{docs_bucket}/{repo_name}/v{version}
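A push to those two locations amounts to something like the following (a sketch, assuming the AWS CLI is on the agent and the AWS_DOCS_BUCKET_CREDENTIALS variables described below are linked; REPO_NAME and BUILD_VERSION are assumed helper variables):

```yaml
steps:
  # Sketch: sync built docs to the versioned and "latest" locations.
  - bash: |
      aws s3 sync ./docs "s3://${DOCS_S3_BUCKET}/${REPO_NAME}/v${BUILD_VERSION}"
      aws s3 sync ./docs "s3://${DOCS_S3_BUCKET}/${REPO_NAME}/latest" --delete
      # Invalidate cached "latest" docs so the new version is served immediately.
      aws cloudfront create-invalidation \
        --distribution-id "${DOCS_CLOUDFORMATION_DISTRO_ID}" \
        --paths "/${REPO_NAME}/latest/*"
    displayName: Push docs to S3
    env:
      AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
      AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
```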
S3 can be configured to handle user authorization through Cloudfront and Cognito using this lambda edge template, following this tutorial. There are very few affordable, cross-platform static website hosting services that allow easy identity protection for private docs, and we have found that this solution works well for us.
Publish to Github Pages: Since docs are always copied and committed to /docs/ on the master branch, it's easy to configure documentation publication through github pages. NOTE: GITHUB PAGES WILL ALWAYS BE PUBLIC, EVEN ON A PRIVATE REPO.
These templates rely on access to logins or credentials for other services, and access them via pipeline variables. The following variables are required for these templates (depending on the type of build). They are broken into suggested groups.
- GIT_CREDENTIALS: Used for adding tags and updating master.
  - GIT_SSH_PUBLIC_KEY: Public key for git ssh access.
  - GIT_SSH_PASSPHRASE: Passphrase for the ssh key.
  - GIT_KNOWN_HOSTS_ENTRY: Known hosts entry created with the public and private keys.
  - GIT_USERNAME: Username to use for git commits.
  - GIT_EMAIL: Email to use for git commits.
For more information on how to create ssh keys, see here. For more information on how these values are installed in a pipeline, see here.
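These values are typically consumed by the InstallSSHKey task together with the secure file described below (a sketch; the exact step lives in the core templates):

```yaml
steps:
  - task: InstallSSHKey@0
    displayName: Install git ssh key
    inputs:
      knownHostsEntry: $(GIT_KNOWN_HOSTS_ENTRY)
      sshPublicKey: $(GIT_SSH_PUBLIC_KEY)
      sshPassphrase: $(GIT_SSH_PASSPHRASE)
      sshKeySecureFile: git_ssh_key.private  # see "secure files" below
```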
- OPEN_SOURCE_TWINE_CREDENTIALS: Used for uploading public packages to pypi.org.
  - PYPIORG_USER: Username for pypi.org.
  - PYPIORG_PASSWORD: Password for pypi.org.
- CONTAINER_REGISTRY_CREDENTIALS: Used for uploading service images.
  - CONTAINER_REGISTRY_URL: URL for pushing / pulling from the container registry.
  - CONTAINER_REGISTRY_ID: ID to sign into the container registry.
  - CONTAINER_REGISTRY_PASSWORD: Password to sign into the container registry.
- AWS_DOCS_BUCKET_CREDENTIALS: Used for uploading docs to the S3 bucket.
  - AWS_ACCESS_KEY_ID: Access key id for the account to upload docs with.
  - AWS_SECRET_ACCESS_KEY: Secret access key for the account to upload docs with.
  - DOCS_S3_BUCKET: S3 bucket name to upload docs to.
  - DOCS_CLOUDFORMATION_DISTRO_ID: Cloudfront distro to invalidate indexes for when altering "latest" docs.
NOTE: These groups will need to be added to every build pipeline individually. The configuration pane for adding groups to a pipeline is currently a little hard to find. From the overview page for a pipeline, go to Edit -> Hamburger Menu -> Triggers. This will bring you to a general settings page. At the top, choose Variables -> Variable Groups -> Link variable group to give the pipeline access to a variable group. To make a new group, look at the left-hand pane and go to Pipelines -> Library -> Variable groups.
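Once a group is linked (authorized for the pipeline), its variables can also be referenced directly from the pipeline YAML:

```yaml
# Make the linked groups' variables available to this pipeline.
variables:
  - group: GIT_CREDENTIALS
  - group: AWS_DOCS_BUCKET_CREDENTIALS
```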
The following secure files are required for pipelines to function properly:
- git_ssh_key.private: SSH private key for accessing git.
To upload this file, look at the left-hand pane and go to Pipelines -> Library -> Secure files.
The template library for using these build pipelines can be found here. The following tools are used during python package builds:
- Root Template: python_package_main.yml
- Dependency Installation: Handled via pip.
- Linters: Linting is done via Flake8 and Black, with mypy for static-type analysis.
- Tests: Handled via pytest.
- Docs: Handled via sphinx.
- Builds: Handled via setuptools.
- Package Uploads: Handled via twine.
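The lint and test stages for a python package boil down to steps like these (a sketch of assumed invocations; the real commands and flags live in the core templates and azure-pipelines-scripts):

```yaml
steps:
  # Sketch: assumed lint/test invocations for a python package build.
  - script: black --check .
    displayName: Check formatting
  - script: flake8 .
    displayName: Lint
  - script: mypy .
    displayName: Static type analysis
  - script: pytest --cov --cov-fail-under=85  # matches the default 85% bar
    displayName: Run tests with coverage check
```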
The template library for using these build pipelines can be found here. The following tools are used during go module builds:
- Root Template: go_module_main.yml
- Dependency Installation: Handled via go get. Git is configured to use ssh instead of http to enable private package fetching (see the sketch after this list).
- Linters: Linting is done via Revive.
- Tests: Handled via go test.
- Docs: Handled via sphinx for quickstarts and guides, plus a cli tool to generate API documentation via godoc.
- Builds: Go modules do not need to be built.
- Package Uploads: Handled via merge into master and version tag.
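The ssh-instead-of-http configuration mentioned above is typically a single git setting (a sketch, assuming the modules are hosted on github):

```yaml
steps:
  # Route https fetches through ssh so private modules resolve with the ssh key.
  - bash: git config --global url."git@github.com:".insteadOf "https://github.com/"
    displayName: Use ssh for go module fetches
```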
Docker image builds use the following tools:
- Root Template: docker_image_main.yml
- Image Builds: Handled via the docker command line + a dockerfile.
- Image Uploads: Handled via the docker command line.
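An image build and upload amounts to something like this (a sketch; the image tag layout is an assumption, and the credentials come from the CONTAINER_REGISTRY_CREDENTIALS group above):

```yaml
steps:
  # Sketch: log in, then build, tag, and push the service image.
  - bash: |
      echo "${CONTAINER_REGISTRY_PASSWORD}" \
        | docker login "${CONTAINER_REGISTRY_URL}" \
            --username "${CONTAINER_REGISTRY_ID}" --password-stdin
      docker build -t "${CONTAINER_REGISTRY_URL}/${REPO_NAME}:v${BUILD_VERSION}" .
      docker push "${CONTAINER_REGISTRY_URL}/${REPO_NAME}:v${BUILD_VERSION}"
    displayName: Build and upload image
    env:
      CONTAINER_REGISTRY_PASSWORD: $(CONTAINER_REGISTRY_PASSWORD)
```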
The template library for using these build pipelines can be found here for REST services and here for Consumer services. The following tools are used during python service builds:
- Root Template: python_service_main.yml
- Dependencies, Linting, Tests, and Docs: Identical to python package builds.
- Image Builds and Uploads: Handled via the docker command line + a dockerfile.