fix: Add robots.txt so ahrefs.com, etc. don't index results that expire #131
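The fix this PR describes is a crawler-directive file. As a minimal sketch (not the PR's actual file, which is not shown here), a robots.txt that tells Ahrefs and other crawlers not to index anything could look like:

```
User-agent: *
Disallow: /
```

If only the expiring result pages needed to be excluded, the `Disallow` line would instead name their path prefix; the site's actual URL layout is not given in this page, so the blanket rule above is only an assumption.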

Workflow file for this run

name: Lint
on: [push, pull_request]
env:
  BASEDIR: https://raw.githubusercontent.com/open-contracting/standard-maintenance-scripts/main
jobs:
  build:
    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
          cache: pip
          cache-dependency-path: '**/requirements*.txt'
      - id: changed-files
        uses: tj-actions/changed-files@v39
      - uses: pre-commit/[email protected]
        with:
          extra_args: pip-compile --files ${{ steps.changed-files.outputs.all_changed_files }}
      - shell: bash
        run: curl -s -S --retry 3 $BASEDIR/tests/install.sh | bash -
      - shell: bash
        run: curl -s -S --retry 3 $BASEDIR/tests/script.sh | bash -
      - run: pip install -r requirements_dev.txt
      - env:
          # https://github.com/OpenDataServices/lib-cove/pull/118
          STANDARD_MAINTENANCE_SCRIPTS_IGNORE: jsonschema
        run: pytest /tmp/test_requirements.py