All notable changes to this project will be documented in this file.
- This is a beta release.
- provider: Update `golang.org/x/net` dependency #329
- provider: Update `golang.org/x/crypto` dependency #328
- resource/dbtcloud_license_map: Migrate from SDKv2 to Framework #325
- resource/dbtcloud_environment: Make the default version `latest` #324
- data-source/dbtcloud_azure_dev_ops_repository: Migrate from SDKv2 to Framework #323
- data-source/dbtcloud_azure_dev_ops_project: Migrate from SDKv2 to Framework #321
- Add resource `dbtcloud_account_features` to manage account level features like Advanced CI
- Add resource `dbtcloud_ip_restrictions_rule` to manage IP restrictions for customers with access to the feature in dbt Cloud
- Allow setting external OAuth config for global connections in Snowflake
- Add resource `dbtcloud_oauth_configuration` to define external OAuth integrations
- Fix acceptance test for jobs when using the ability to compare changes
- #305 - Add the resource `dbtcloud_lineage_integration` to set up auto-exposures in Tableau
- Add ability to provide a project description in `dbtcloud_project`
- Add ability to enable model query history in `dbtcloud_environment`
- #309 - Fix the datasource `dbtcloud_global_connections` when PL is used in some connections
- Allow defining some `dbtcloud_databricks_credential` when using global connections which don't generate an `adapter_id` (see the docs for the resource for more details)
- Add the ability to compare changes in a `dbtcloud_job` resource
- Add deprecation notice for `target_name` in `dbtcloud_databricks_credential`, as those can't be set in the UI
- Make `versionless` the default version for environments; it can still be changed
- Add better error handling when importing resources, like in #299
- #300 Fix panic when reading a DBX legacy connection without a catalog
- Fix typo in the Getting Started guide
- Make `dbname` required for Redshift and Postgres in `dbtcloud_global_connection`
- Add a `dbtcloud_projects` (with an "s") datasource to return all the projects along with some information about the warehouse connections and repositories connected to those projects. It loops through the API in case there are more than 100 projects.
  - Along with the `check` block, it can be used to verify that there are no duplicate project names, for example.
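The duplicate-name check mentioned above can be sketched with a Terraform `check` block (Terraform >= 1.5). The `projects` and `name` attribute names are assumptions here; see the datasource documentation for the actual schema:

```hcl
data "dbtcloud_projects" "all" {}

# Surfaces a warning during plan/apply if two projects share a name.
# (`check` block failures are non-blocking warnings, not errors.)
check "no_duplicate_project_names" {
  assert {
    condition = length(data.dbtcloud_projects.all.projects) == length(
      distinct([for p in data.dbtcloud_projects.all.projects : p.name])
    )
    error_message = "Two or more dbt Cloud projects share the same name."
  }
}
```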
- Add a datasource for `dbtcloud_global_connection` with the same information as the corresponding resource
- Add a datasource for `dbtcloud_global_connections` (with an "s"), returning all the connections of an account along with details like the number of environments they are used in. This can be used to check that connections don't share the same name, or that all connections are used by projects.
- Add support for setting the `pull_request_url_template` in `dbtcloud_repository`
- Add support for all connection types in `dbtcloud_global_connection` (added PostgreSQL, Redshift, Apache Spark, Starburst, Synapse, Fabric and Athena) and add deprecation warnings for all the other connection resources: `dbtcloud_connection`, `dbtcloud_bigquery_connection` and `dbtcloud_fabric_connection`
- Update "Getting Started" guide to use global connections instead of project-scoped connections
- Accelerate CI testing by:
  - avoiding too many calls to `v2/.../account`
  - installing Terraform manually in the CI pipeline so that each test doesn't download a new version of the CLI
  - moving some tests to run in parallel (could move more in the future)
- Update go libraries
- Add support for `import` for `dbtcloud_global_connection`
- Add support for Databricks in `dbtcloud_global_connection`
- #267 Support for global connections
  - `dbtcloud_environment` now accepts a `connection_id` to link the environment to the connection. This is the new recommended way to link connections to environments, instead of linking the connection to the project with `dbtcloud_project_connection`
  - The `dbtcloud_project_connection` still works today and, when used, doesn't require setting up a `connection_id` in the `dbtcloud_environment` resource (i.e. any current config/module should continue working), but the resource is flagged as deprecated and will be removed in a future version of the provider
  - For now, people can continue using the project-scoped connection resources `dbtcloud_connection`, `dbtcloud_bigquery_connection` and `dbtcloud_fabric_connection` for creating and updating global connections. The parameter `project_id` in those connections still needs to be a valid project ID, but doesn't mean that the connection is restricted to this project ID. The project-scoped connections created from Terraform are automatically converted to global connections
  - A new resource `dbtcloud_global_connection` has been created and currently supports Snowflake and BigQuery connections. In the next weeks, support for all the Data Warehouses will be added to this resource
    - When a data warehouse is supported in `dbtcloud_global_connection`, we recommend using this new resource instead of the legacy project-scoped connection resources. Those resources will be deprecated in a future version of the provider.
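A minimal sketch of the recommended wiring, a global connection linked directly to an environment via `connection_id`. The Snowflake attribute names and the referenced `dbtcloud_project.main` are illustrative; see the resource docs for the real schema:

```hcl
resource "dbtcloud_global_connection" "snowflake" {
  name = "Snowflake connection"
  # Connection details are illustrative placeholders
  snowflake = {
    account   = "my-account"
    database  = "ANALYTICS"
    warehouse = "TRANSFORMING"
  }
}

resource "dbtcloud_environment" "prod" {
  project_id  = dbtcloud_project.main.id
  name        = "Prod"
  type        = "deployment"
  dbt_version = "versionless"
  # New recommended way: link the environment directly to the connection,
  # instead of using dbtcloud_project_connection
  connection_id = dbtcloud_global_connection.snowflake.id
}
```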
- #278 Deprecate `state` attribute in the resources and datasources that use it. It will be removed in the next major version of the provider. This attribute is used for soft-delete and isn't intended to be configured in the scope of the provider.
- #281 Fix the datasource `dbtcloud_environments` where the environment IDs were not being saved
- #277 Add `dbtcloud_users` datasource to get all users
- #274 Add `dbtcloud_jobs` datasource to return all jobs for a given dbt Cloud project or environment
- #273 Add environment level restrictions to the `dbtcloud_service_token` resource
- Fix typo in service token examples
- #271 Force creation of a new connection when the project is changed or deleted
- Fix typo in environment code example
- Added new `on_warning` field to `dbtcloud_notification` and `dbtcloud_partial_notification`.
- #266 Add env level permissions for `dbtcloud_group` and `dbtcloud_group_partial_permissions`. As of June 5 this feature is not yet active for all customers.
- Fix description of fields for some datasources
- Move the `dbtcloud_group` resource and datasource from the SDKv2 to the Framework
- Create new helpers for comparing Go structs
- Update all SDKv2 tests to run on the muxed provider to work when some resources have moved to the Plugin Framework
- #232 add deprecation notice for `dbtcloud_project_artefacts` as the resource is not required now that dbt Explorer is GA.
- #208 add new `dbtcloud_partial_license_map` for defining SSO group mapping to license types from different Terraform projects/resources
- add a `dbtcloud_partial_notification` resource to allow different resources to add/remove job notifications for the same Slack channel/email/user
- #257 - Force new resource when the `project_id` changes for a `dbtcloud_job`
- Creating connections for adapters (e.g. Databricks and Fabric) was failing when using Service Tokens, following changes in the dbt Cloud APIs
- change the User Agent to report what provider version is being used
- add import block example for the resources in addition to the import command
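An import block (Terraform >= 1.5) reads roughly as below. The resource and ID are placeholders; each resource documents its own import ID format:

```hcl
import {
  to = dbtcloud_project.my_project
  id = "12345" # placeholder — see the resource docs for the ID format
}

resource "dbtcloud_project" "my_project" {
  name = "My project"
}
```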
- #255 - Add new datasource `dbtcloud_environments` to return all environments across an account, or all environments for a given project ID
- Move the `dbtcloud_environment` datasource to the Terraform Plugin Framework
- #250 - [Experimental] Create a new resource called `dbtcloud_group_partial_permissions` to manage permissions of a single group from different resources, which can be set across different Terraform projects/workspaces. The dbt Cloud API doesn't provide endpoints for adding/removing single permissions, so the logic in the provider is more complex than for other resources. If the resource works as expected for provider users, we could create similar ones for "partial" notifications and "partial" license mappings.
- Add `on_merge` trigger for jobs. The trigger is optional for now but will be required in the future.
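A sketch of a job enabling the new trigger; the shape of the `triggers` attribute and the surrounding fields are assumptions based on the resource docs:

```hcl
resource "dbtcloud_job" "on_merge_example" {
  project_id     = dbtcloud_project.main.id
  environment_id = dbtcloud_environment.prod.environment_id
  name           = "Run on merge"
  execute_steps  = ["dbt build"]

  triggers = {
    github_webhook       = false
    git_provider_webhook = false
    schedule             = false
    on_merge             = true # run the job when a PR is merged
  }
}
```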
- Remove mention of `dbt_cloud_xxx` resources in the docs
- Implement muxing to allow both SDKv2 and Plugin Framework resources to work at the same time. This changes the internals a bit but shouldn't introduce any regression.
- Move some resources / datasources to the Plugin Framework
- Remove legacy `dbt_cloud_xxx` resources
- Enable OAuth configuration for Databricks connections + update docs accordingly
- #247 Fix segfault when the env var for the token is empty
- [Internal] Issue with `job_ids` required to be set going forward, even if it is empty
- #244 Better error handling when GitLab repositories are created with a User Token
- #245 Fix issues on `dbtcloud_job` when modifying an existing job schedule
- #240 Add notice of deprecation for `triggers.custom_branch_only` for jobs and update logic to make it work whether people have it set to true or false in their config. We might raise an error if the field is still there in the future.
- Update diff calculation for Extended Attributes, allowing strings which are not set with `jsonencode()`
- #241 Force recreation of env vars when values change to work with the recent changes in the dbt Cloud API
- Add list of permission names and permission codes in the docs of `service_token` and `group`
- Add info in `dbtcloud_repository` about the need to also create a `dbtcloud_project_repository`
- Flag `fetch_deploy_key` as deprecated for `dbtcloud_repository`. The key is always fetched for the generic git clone approach
- Add info about `versionless` dbt environment (Private Beta)
- #235 Fix docs on the examples for Fabric credentials
- Add support for job chaining and `job_completion_trigger_condition` (feature is in closed Beta in dbt Cloud as of 5 FEB 2024)
- Improve docs for jobs
- Update permissions allowed for groups and tokens to include `job_runner`
- Add guide on `dbtcloud-terraforming` to import existing resources
- #229 - fix logic for secret environment variables
- #228 - update docs to replace the non-existing `dbtcloud_user` resource with the existing `data.dbtcloud_user` data source
- update third-party module version following a security report
- #224 - add the resources `dbtcloud_fabric_connection` and `dbtcloud_fabric_credential` to allow using dbt Cloud along with Microsoft Fabric
- #222 - allow users to set Slack notifications from Terraform
- Refactor some of the shared code for Adapters and connections
- #99 - add the resource `environment_variable_job_override` to allow environment variable overrides in jobs
- Update the Go version and package versions
- #221 - removing the value for an env var scope was not removing it in dbt Cloud
- add better messages and error handling for jobs
- Update list of permissions for groups and service tokens
- Fix issues with the repositories connected via GitLab native integration
- Add ability to configure repositories using the native ADO integration
- Add data sources for retrieving ADO projects and repositories ID and information
- Show in the main page that provider parameters can be set with env vars
- Update examples and field descriptions for the repositories
- Update connections to force new one when the project changes
- Add support for the datasource `dbtcloud_group_users` to get the list of users assigned to a given project
- Use d2 for showing the different resources
- Update examples in docs
- Update docs and examples for jobs and add the ability to set/unset running CI jobs on Draft PRs
- #197 - Community contribution to handle cases where more than 100 groups are created in dbt Cloud
- #199 - Update logic to allow finding users by their email addresses in a case-insensitive way
- #198 - Update some internal logic to call endpoints by their unique IDs instead of looping through answers to avoid issues like #199 and paginate through results for endpoints where we can't query the ID directly
- #189 - Allow users to retrieve project data sources by providing project names instead of project IDs. This will return an error if more than 1 project has the given name and takes care of the pagination required for handling more than 100 projects
- Add support for extended attributes for environments (docs), allowing people to add connection attributes available in dbt-core but not in the dbt Cloud interface
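Extended attributes are passed as a JSON string, typically built with `jsonencode()`. A sketch, assuming the resource is `dbtcloud_extended_attributes` and the referenced project/environment exist in your config:

```hcl
resource "dbtcloud_extended_attributes" "overrides" {
  project_id     = dbtcloud_project.main.id
  environment_id = dbtcloud_environment.prod.environment_id
  # `priority` is a dbt-core/BigQuery connection attribute not exposed
  # in the dbt Cloud UI — shown here purely as an illustration
  extended_attributes = jsonencode({
    priority = "interactive"
  })
}
```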
- #191 - Allow setting a description for jobs
- #190 - Allow setting deferral for jobs at the environment level rather than at the job level. This is due to changes in CI in dbt Cloud. Add docs about those changes on the dbtcloud_job resource page
- #184 - Fix issue when updating SSO groups for a given RBAC group
- #178 and #179: Add support for dbtcloud_license_map, allowing the assignment of SSO groups to different dbt Cloud license types
- #172: Fix issue when changing the schedule of jobs from a list of hours to an interval in a dbtcloud_job
- #175: Fix issue when modifying the `environment_id` of an existing dbtcloud_job
- #154: Allow the creation of Databricks connections using Service Tokens when it was only possible with User Tokens before
- Use the `v2/users/<id>` endpoint to get the groups of a user
- More updates to the docs
- #171 Add the ability to define which environment is the production one (to be used with cross project references in dbt Cloud)
- Add guide on how to use the Hashicorp HTTP provider
- #174 Add the ability to assign User groups to dbt Cloud users.
- Update CI to avoid Node version warnings
- Fixes to the docs
- #164 Add the ability to define `priority` and `execution_project` for BigQuery connections
- #168 Add the ability to set up email notifications (to internal users and external email addresses) based on job results
- #156 Fix the `dbtcloud_connection` for Databricks when updating the `http_path` or `catalog` + add integration test
- #157 Fix updating an environment with credentials already set + add integration test
- Add guide to get started with the provider
- Add missing import and fix more docs
- Update docs template to allow using Subcategories later
- Resources deleted from dbt Cloud no longer crash the provider; such resources are now considered deleted and removed from the state. This is the expected behavior of a provider.
- Add examples in the docs to resources that didn't have any so far
- The resources and data sources are now available as `dbtcloud_xxx` (following the Terraform convention) in addition to `dbt_cloud_xxx` (legacy). The legacy versions will be removed from v0.3.0 onwards. Instructions on how to use the new resources are available on the main page of the provider.
- The provider is now published under the dbt-labs org: https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest