[Feature] Define separate remote for dbt artifact upload #90
If I do the above with a zip file for the project, it does correctly generate the docs as far as I can tell, but it then attempts to overwrite my zip file on S3. That is not great, as the zip would then get overwritten again as part of my CI/CD process from GitHub.
Either way, as you can see, the ZIP push failed but the job still passed, which is incorrect: it should fail. Ideally, I need to be able to set the upload to a different location or bucket so that the write can succeed.
There is currently no way to override the upload destination: we only support uploading to the same key from where we downloaded the project. You could (in theory, I haven't tried this) push the documentation artifacts to XCOM.

From airflow-dbt-python's perspective, I don't see any reason not to support this: it's a matter of having the time to implement the feature. I would make it generic enough that we can override the upload destination of all dbt artifacts, not just those generated by dbt docs, perhaps with a new argument, or by changing how the existing upload destination is resolved.

If you are up to taking a stab at this (or have already done it), I can review the PR. Otherwise I may have time to do this (but can't promise a timeline).

Thanks for reporting this issue!
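A minimal sketch of that XCOM workaround, under stated assumptions: the `do_xcom_push_artifacts` argument, the per-artifact XCom keys, the bucket names, paths, and the connection id are all assumptions and should be checked against the airflow-dbt-python version you run. A downstream task pulls the generated artifacts and writes them to a separate bucket with `S3Hook`:

```python
import json

import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

from airflow_dbt_python.operators.dbt import DbtDocsGenerateOperator


def upload_docs_to_other_bucket(ti, **_):
    """Pull the docs artifacts from XCom and write them to a separate bucket."""
    hook = S3Hook(aws_conn_id="aws_default")
    for artifact in ("manifest.json", "catalog.json", "index.html"):
        # Assumption: artifacts are pushed to XCom under their file names.
        content = ti.xcom_pull(task_ids="generate_dbt_docs", key=artifact)
        hook.load_string(
            string_data=content if isinstance(content, str) else json.dumps(content),
            key=f"dbt-docs/{artifact}",
            bucket_name="my-docs-bucket",  # hypothetical destination bucket
            replace=True,
        )


with DAG(
    dag_id="dbt_docs_to_separate_bucket",
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Generate the docs and push the artifacts to XCom. The argument name
    # `do_xcom_push_artifacts` is an assumption; check the operator's
    # signature for the version you are running.
    generate_docs = DbtDocsGenerateOperator(
        task_id="generate_dbt_docs",
        project_dir="s3://my-dbt-bucket/project/project.zip",  # hypothetical source
        profiles_dir="s3://my-dbt-bucket/project/",
        do_xcom_push_artifacts=["manifest.json", "catalog.json", "index.html"],
    )

    upload_docs = PythonOperator(
        task_id="upload_dbt_docs",
        python_callable=upload_docs_to_other_bucket,
    )

    generate_docs >> upload_docs
```

Note that this sketch only covers getting the docs into a second bucket; whether the operator also pushes the project back to `project_dir` depends on the operator's own settings.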
Thanks for that, I did take a look but can't really see where to do this correctly. For now I will try via XCOM and hope you are able to find time at some point to update. Thank you for your work :)
Hi,
Thanks for this project, it looks great and I am looking to switch over to using it. One thing on the docs: to keep downloads fast, I will probably zip my project dir. At the moment I do use S3 for my docs, but I would like to use a different bucket from the one used for pulling the Airflow and dbt resources. Is there an override for the command below to change the upload bucket?
Thanks
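For reference, the kind of invocation in question would look roughly like the sketch below; the operator name is real, but the arguments and paths are illustrative assumptions rather than the original snippet:

```python
from airflow_dbt_python.operators.dbt import DbtDocsGenerateOperator

# Illustrative only: a zipped dbt project pulled from S3. Per the discussion
# above, artifacts are uploaded back to the same project location, and there
# is no separate argument to point the upload at a different bucket.
generate_docs = DbtDocsGenerateOperator(
    task_id="generate_dbt_docs",
    project_dir="s3://my-dbt-bucket/project/project.zip",  # hypothetical path
    profiles_dir="s3://my-dbt-bucket/project/",            # hypothetical path
)
```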