data-derp/exercise-co2-vs-temperature-infrastructure

CO2 vs. Temperature (Infrastructure)

In this exercise, we assume that you've completed the production-code exercise and pushed its artifacts to an AWS S3 bucket. If not, see Fresh Start. This exercise focuses on using those artifacts as part of an AWS Glue workflow.

NOTE: The following exercises follow the same convention as the production-code exercise: a project-name and module-name are used consistently to name resources.

In all examples here,

  • project-name = awesome-project
  • module-name = awesome-module

Where these are used, you'll want to pick your own unique project-name and module-name.

Prerequisites

  • An AWS Account and IAM User with permissions to create AWS Glue and Athena resources and read S3 buckets
  • AWS CLI access

Quickstart

  1. If you don't already have artifacts for Data Ingestion and Transformation in your S3 bucket, follow the Fresh Start instructions
  2. Create AWS Resources for Data Ingestion
  3. Create AWS Resources for Data Transformation
  4. Create AWS Resources for Data Workflow
  5. Create AWS Resources via Infrastructure as Code

Fresh Start

If you don't have the artifacts in an S3 bucket yet:

  1. Create an S3 bucket
  2. Ensure you have an active AWS CLI session in your terminal
  3. Upload the artifacts:
```shell
# Change these variables to your own unique names
PROJECT_NAME=awesome-project
MODULE_NAME=awesome-module

./go upload-data-source "${PROJECT_NAME}-${MODULE_NAME}"
./go upload-artifacts "${PROJECT_NAME}-${MODULE_NAME}"
```
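After the upload, you can sanity-check that the artifacts landed in your bucket. This is a minimal sketch: the bucket name below is an assumption based on the `${PROJECT_NAME}-${MODULE_NAME}` convention used above, and the `aws s3 ls` call is commented out because it requires an active AWS CLI session.

```shell
# Change these variables to your own unique names
PROJECT_NAME=awesome-project
MODULE_NAME=awesome-module

# Assumed bucket name, following the same naming convention as the ./go scripts
BUCKET="${PROJECT_NAME}-${MODULE_NAME}"
echo "Checking contents of s3://${BUCKET}"

# Requires an active AWS CLI session; lists everything uploaded so far:
# aws s3 ls "s3://${BUCKET}/" --recursive
```

If the listing is empty, re-run the two `./go upload-*` commands above and confirm your CLI session is still valid.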
