@benc-db, you mentioned that dbt asked you not to require Pydantic v2, so you decided to restrict Pydantic to v1: #843
I am a bit confused as to why it was decided to constrain Pydantic to >=1.10.0,<2 instead of just >=1.10.0, which would have met dbt's request. I am also confused about the reason for dbt's request, as dbt itself does not publish any dependency on Pydantic.
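(For illustration, here is what the two options look like if the adapter declares its dependencies through setuptools' install_requires; the exact packaging layout of dbt-databricks is an assumption on my part.)

```python
# Sketch of the dependency declaration, assuming a setuptools-based setup.py.
from setuptools import setup

setup(
    name="dbt-databricks",
    install_requires=[
        "pydantic>=1.10.0,<2",  # current pin: excludes every Pydantic 2.x release
        # "pydantic>=1.10.0",   # looser spec that still does not *require* v2
    ],
)
```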
Based on the date and other related issues/PRs, I am guessing that it might be related to issues encountered with the release of Pydantic 2.10 and maybe its impact on Airflow.
However, these were promptly resolved. For example, Airflow itself requires Pydantic v2 and decided to exclude only version 2.10.0 of Pydantic.
Making version 1.9.0 of dbt-databricks effectively incompatible with any project requiring Pydantic v2 seems fairly extreme. Would it be possible to get some context on the request from dbt?
As a general rule, every project should publish only its own constraints. This package has no incompatibility with Pydantic v2, so it should not publish any such constraint ;) If another project has an issue with Pydantic v2, that project should publish the constraint itself so as not to impact more users than necessary.
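By the same logic, a downstream project that genuinely breaks on Pydantic v2 would carry the pin in its own metadata rather than inherit it from the adapter (the project name below is purely hypothetical):

```python
# Hypothetical downstream project that is the one actually incompatible with Pydantic v2.
from setuptools import setup

setup(
    name="some-pydantic-v1-only-project",  # illustrative name, not a real package
    install_requires=[
        "dbt-databricks>=1.9",  # no Pydantic opinion needs to come from here
        "pydantic>=1.10.0,<2",  # the v2 incompatibility is declared where it exists
    ],
)
```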
This has in turn caused further problems for other packages that were forced to constrain the version of dbt-databricks: astronomer/astronomer-cosmos#1376
The restriction to <2 was actually to silence a bunch of warnings related to my internal usage of Pydantic. Since I couldn't require pydantic >= 2, I had to use v1 syntax, which under v2 produces a bunch of warnings. As for why dbt asked me to allow v1, I believe it has to do with the fact that in dbt Cloud (at least in the testing environment), they install all of the adapters, and one of them requires v1. If anyone knows how to suppress the warnings, I would appreciate it. In the meantime, I can remove the upper bound.
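For what it's worth, two approaches that may help, assuming the warnings in question are the PydanticDeprecatedSince20 deprecation warnings that v1-style code triggers when Pydantic v2 is installed (which symbols the adapter actually imports internally is an assumption; BaseModel/Field below are placeholders):

```python
import warnings

try:
    # Pydantic v2 re-exports the entire v1 API under pydantic.v1, so v1-style
    # code imported from there runs without deprecation warnings on either
    # major version.
    from pydantic.v1 import BaseModel, Field  # Pydantic >= 2 installed
except ImportError:
    from pydantic import BaseModel, Field     # Pydantic 1.x installed

# Alternatively, keep the existing imports and filter the v2 deprecation
# category explicitly; under v1 the class does not exist, so nothing is filtered.
try:
    from pydantic import PydanticDeprecatedSince20
    warnings.filterwarnings("ignore", category=PydanticDeprecatedSince20)
except ImportError:
    pass
```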