2023-11-15 | MAIN --> PROD | DEV (26c8d50) --> STAGING #2806

Merged: 7 commits, Nov 15, 2023
11 changes: 10 additions & 1 deletion .github/workflows/deploy-development.yml
@@ -49,6 +49,15 @@ jobs:
autoapprove: false
secrets: inherit

new-relic-record:
name: Record deployment to New Relic
needs:
- deploy-infrastructure-dev
uses: ./.github/workflows/new-relic-deployment.yml
with:
environment: "dev"
secrets: inherit

deploy-dev:
name: Deploy application
needs:
@@ -69,7 +78,7 @@
generate-e2e-test-data:
needs:
- deploy-dev
name:
uses: ./.github/workflows/end-to-end-test-data-generator.yml
secrets: inherit
with:
9 changes: 9 additions & 0 deletions .github/workflows/deploy-production.yml
@@ -32,6 +32,15 @@ jobs:
autoapprove: false
secrets: inherit

new-relic-record:
name: Record deployment to New Relic
needs:
- deploy-infrastructure-production
uses: ./.github/workflows/new-relic-deployment.yml
with:
environment: "production"
secrets: inherit

deploy-production:
name: Deploy application
needs:
11 changes: 10 additions & 1 deletion .github/workflows/deploy-staging.yml
@@ -31,6 +31,15 @@ jobs:
autoapprove: false
secrets: inherit

new-relic-record:
name: Record deployment to New Relic
needs:
- deploy-infrastructure-staging
uses: ./.github/workflows/new-relic-deployment.yml
with:
environment: "staging"
secrets: inherit

deploy-staging:
name: Deploy application
needs:
@@ -61,7 +70,7 @@ jobs:
generate-e2e-test-data:
needs:
- deploy-staging
name:
uses: ./.github/workflows/end-to-end-test-data-generator.yml
secrets: inherit
with:
51 changes: 41 additions & 10 deletions .github/workflows/new-relic-deployment.yml
@@ -1,26 +1,57 @@
name: Record Deployment And Add New Relic Monitor
on:
-  push:
-    branches:
-      - main
-      - prod
-    tags:
-      - v1.*
+  workflow_call:
+    inputs:
+      environment:
+        required: true
+        type: string

jobs:
-  newrelic:
+  newrelic-dev:
if: ${{ inputs.environment == 'dev' }}
runs-on: ubuntu-latest
name: New Relic Record Deployment
steps:
# This step builds a var with the release tag value to use later
- name: Set Release Version from Tag
run: echo "RELEASE_VERSION=${{ github.ref_name }}" >> $GITHUB_ENV
# This step creates a new Change Tracking Marker

- name: Add New Relic Application Deployment Marker
uses: newrelic/[email protected]
with:
apiKey: ${{ secrets.NEW_RELIC_API_KEY }}
guid: ${{ secrets.NEW_RELIC_DEV_DEPLOYMENT_ENTITY_GUID }}
version: "${{ env.RELEASE_VERSION }}"
user: "${{ github.actor }}"

newrelic-staging:
if: ${{ inputs.environment == 'staging' }}
runs-on: ubuntu-latest
name: New Relic Record Deployment
steps:
- name: Set Release Version from Tag
run: echo "RELEASE_VERSION=${{ github.ref_name }}" >> $GITHUB_ENV

- name: Add New Relic Application Deployment Marker
uses: newrelic/[email protected]
with:
apiKey: ${{ secrets.NEW_RELIC_API_KEY }}
guid: ${{ secrets.NEW_RELIC_STAGING_DEPLOYMENT_ENTITY_GUID }}
version: "${{ env.RELEASE_VERSION }}"
user: "${{ github.actor }}"

newrelic-production:
if: ${{ inputs.environment == 'production' }}
runs-on: ubuntu-latest
name: New Relic Record Deployment
steps:
- name: Set Release Version from Tag
run: echo "RELEASE_VERSION=${{ github.ref_name }}" >> $GITHUB_ENV

- name: Add New Relic Application Deployment Marker
uses: newrelic/[email protected]
with:
apiKey: ${{ secrets.NEW_RELIC_API_KEY }}
-          guid: ${{ secrets.NEW_RELIC_DEPLOYMENT_ENTITY_GUID }}
+          guid: ${{ secrets.NEW_RELIC_PRODUCTION_DEPLOYMENT_ENTITY_GUID }}
version: "${{ env.RELEASE_VERSION }}"
user: "${{ github.actor }}"

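The three jobs above are identical apart from their `if:` guard and the entity GUID secret each one reads. As a rough sketch (the function and dict below are hypothetical; only the secret names come from the diff), the dispatch the guards implement amounts to:

```python
# Hypothetical model of the per-environment job selection in
# new-relic-deployment.yml: the caller passes an `environment` input,
# exactly one job's `if:` guard matches, and that job reads the
# corresponding entity GUID secret.
GUID_SECRETS = {
    "dev": "NEW_RELIC_DEV_DEPLOYMENT_ENTITY_GUID",
    "staging": "NEW_RELIC_STAGING_DEPLOYMENT_ENTITY_GUID",
    "production": "NEW_RELIC_PRODUCTION_DEPLOYMENT_ENTITY_GUID",
}


def guid_secret_for(environment: str) -> str:
    """Return the secret name the matching job would use."""
    try:
        return GUID_SECRETS[environment]
    except KeyError:
        # No job's `if:` guard matches, so no marker is recorded.
        raise ValueError(f"no New Relic job is guarded for {environment!r}")
```

One trade-off of the three-job layout is duplication of the steps; the gain is that each environment's GUID stays in its own scoped secret.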
1 change: 1 addition & 0 deletions backend/audit/admin.py
@@ -20,6 +20,7 @@ def has_view_permission(self, request, obj=None):
"cognizant_agency",
"oversight_agency",
]
readonly_fields = ("submitted_by",)


class AccessAdmin(admin.ModelAdmin):
8 changes: 4 additions & 4 deletions backend/census_historical_migration/README.md
@@ -46,17 +46,17 @@ NOTE: Never check in the census_historical_migration/data folder into GitHub.

2. In the FAC/backend folder, run the following to load CSV files from census_historical_migration/data folder into fac-census-to-gsafac-s3 bucket.
```bash
-docker compose run web python manage.py fac_s3 fac-census-to-gsafac-s3 --upload --src census_historical_migration/data
+docker compose run --rm web python manage.py fac_s3 fac-census-to-gsafac-s3 --upload --src census_historical_migration/data
```

3. In the FAC/backend folder, run the following to read the CSV files from fac-census-to-gsafac-s3 bucket and load into Postgres.
```bash
-docker compose run web python manage.py csv_to_postgres --folder data --chunksize 10000
+docker compose run --rm web python manage.py csv_to_postgres --folder data --chunksize 10000
```

### How to run the historic data migrator:
```
-docker compose run web python manage.py historic_data_migrator --email [email protected] \
+docker compose run --rm web python manage.py historic_data_migrator --email [email protected] \
--years 22 \
--dbkeys 100010
```
@@ -65,7 +65,7 @@ docker compose run web python manage.py historic_data_migrator --email any_email

### How to run the historic workbook generator:
```
-docker compose run web python manage.py historic_workbook_generator
+docker compose run --rm web python manage.py historic_workbook_generator \
--year 22 \
--output <your_output_directory> \
--dbkey 100010
```
@@ -10,11 +10,11 @@
import pprint

from census_historical_migration.workbooklib.workbook_creation import (
-    sections,
-    workbook_loader,
+    generate_workbook,
)
+from census_historical_migration.workbooklib.workbook_section_handlers import (
+    sections_to_handlers,
+)

import datetime

from census_historical_migration.workbooklib.census_models.census import (
CensusGen22 as Gen,
@@ -181,16 +181,11 @@ def handle(self, *args, **options):  # noqa: C901
logger.info("could not create output directory. exiting.")
sys.exit()

-        entity_id = "DBKEY {dbkey} {date:%Y_%m_%d_%H_%M_%S}".format(
-            dbkey=options["dbkey"], date=datetime.datetime.now()
-        )
-
-        loader = workbook_loader(
-            None, None, options["dbkey"], options["year"], entity_id
-        )
         json_test_tables = []
-        for section, fun in sections.items():
-            (wb, api_json, filename) = loader(fun, section)
+        for section, fun in sections_to_handlers.items():
+            (wb, api_json, _, filename) = generate_workbook(
+                fun, options["dbkey"], options["year"], section
+            )
if wb:
wb_path = os.path.join(outdir, filename)
wb.save(wb_path)
Expand Up @@ -4,7 +4,6 @@
set_uei,
map_simple_columns,
generate_dissemination_test_table,
-    test_pfix,
)
from census_historical_migration.workbooklib.templates import sections_to_template_paths
from census_historical_migration.workbooklib.census_models.census import dynamic_import
@@ -27,7 +26,7 @@ def generate_corrective_action_plan(dbkey, year, outfile):
)
mappings = [
FieldMap("reference_number", "findingrefnums", "finding_ref_number", None, str),
-        FieldMap("planned_action", "text", WorkbookFieldInDissem, None, test_pfix(3)),
+        FieldMap("planned_action", "text", WorkbookFieldInDissem, None, str),
FieldMap(
"contains_chart_or_table", "chartstables", WorkbookFieldInDissem, None, str
),
@@ -11,10 +11,12 @@
from datetime import datetime

from census_historical_migration.workbooklib.workbook_creation import (
-    sections,
workbook_loader,
setup_sac,
)
+from census_historical_migration.workbooklib.workbook_section_handlers import (
+    sections_to_handlers,
+)
from census_historical_migration.workbooklib.sac_creation import _post_upload_pdf
from audit.intake_to_dissemination import IntakeToDissemination

@@ -197,9 +199,9 @@ def generate_workbooks(user, email, dbkey, year):
if sac.general_information["audit_type"] == "alternative-compliance-engagement":
print(f"Skipping ACE audit: {dbkey}")
else:
-        loader = workbook_loader(user, sac, dbkey, year, entity_id)
+        loader = workbook_loader(user, sac, dbkey, year)
json_test_tables = []
-        for section, fun in sections.items():
+        for section, fun in sections_to_handlers.items():
# FIXME: Can we conditionally upload the addl' and secondary workbooks?
(_, json, _) = loader(fun, section)
json_test_tables.append(json)
@@ -17,14 +17,6 @@
WorkbookFieldInDissem = 1000


-def test_pfix(n):
-    def _test(o):
-        # return ' '.join(["TEST" for x in range(n)]) + " " + str(o)
-        return o
-
-    return _test


def set_single_cell_range(wb, range_name, value):
the_range = wb.defined_names[range_name]
# The above returns a generator. Turn it to a list, and grab
Expand Up @@ -4,7 +4,6 @@
set_uei,
map_simple_columns,
generate_dissemination_test_table,
-    test_pfix,
)
from census_historical_migration.workbooklib.templates import sections_to_template_paths
from census_historical_migration.workbooklib.census_models.census import dynamic_import
@@ -18,7 +17,7 @@

mappings = [
FieldMap("reference_number", "findingrefnums", "finding_ref_number", None, str),
-    FieldMap("text_of_finding", "text", "finding_text", None, test_pfix(3)),
+    FieldMap("text_of_finding", "text", "finding_text", None, str),
FieldMap(
"contains_chart_or_table", "chartstables", WorkbookFieldInDissem, None, str
),
Expand Up @@ -4,7 +4,6 @@
set_single_cell_range,
map_simple_columns,
generate_dissemination_test_table,
-    test_pfix,
)
from census_historical_migration.workbooklib.templates import sections_to_template_paths
from census_historical_migration.workbooklib.excel_creation import (
@@ -22,9 +21,8 @@
logger = logging.getLogger(__name__)

mappings = [
-    FieldMap("note_title", "title", "title", None, test_pfix(3)),
-    FieldMap("note_content", "content", "content", None, test_pfix(3)),
-    # FieldMap("seq_number", "seq_number", "note_seq_number", 0, int),
+    FieldMap("note_title", "title", "title", None, str),
+    FieldMap("note_content", "content", "content", None, str),
]


Expand Up @@ -3,7 +3,6 @@
set_uei,
map_simple_columns,
generate_dissemination_test_table,
-    test_pfix,
)
from census_historical_migration.workbooklib.templates import sections_to_template_paths
from census_historical_migration.workbooklib.census_models.census import dynamic_import
@@ -29,14 +28,14 @@
"cpastreet1",
"address_street",
None,
-        test_pfix(3),
+        str,
),
FieldMap(
"secondary_auditor_contact_title",
"cpatitle",
"contact_title",
None,
-        test_pfix(3),
+        str,
),
FieldMap(
"secondary_auditor_address_zipcode",
12 changes: 12 additions & 0 deletions backend/census_historical_migration/workbooklib/utils.py
@@ -0,0 +1,12 @@
from census_historical_migration.workbooklib.templates import sections_to_template_paths


def get_template_name_for_section(section):
"""
Return a workbook template name corresponding to the given section
"""
if section in sections_to_template_paths:
template_name = sections_to_template_paths[section].name
return template_name
else:
raise ValueError(f"Unknown section {section}")
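A minimal usage sketch of the new helper, with a stand-in mapping (the real `sections_to_template_paths` is imported from `workbooklib/templates.py`; the section key and template file name below are hypothetical):

```python
from pathlib import Path

# Stand-in for sections_to_template_paths; in the real module each
# section name maps to a Path pointing at its workbook template.
sections_to_template_paths = {
    "CorrectiveActionPlan": Path("templates/corrective-action-plan-workbook.xlsx"),
}


def get_template_name_for_section(section):
    """
    Return a workbook template name corresponding to the given section
    """
    if section in sections_to_template_paths:
        return sections_to_template_paths[section].name
    raise ValueError(f"Unknown section {section}")


# Path.name yields just the final component, not the full path.
print(get_template_name_for_section("CorrectiveActionPlan"))
# → corrective-action-plan-workbook.xlsx
```

Raising `ValueError` on an unknown section fails the migration loudly instead of silently producing a workbook with no template.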