I am having trouble running the Python example `delta-sharing/examples/python/quickstart_spark.py`, and I am getting this error:
```
Py4JJavaError: An error occurred while calling o61.load.
: org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find the data source: deltaSharing. Please find packages at https://spark.apache.org/third-party-projects.html
	at org.apache.spark.sql.errors.QueryExecutionErrors$.dataSourceNotFoundError(QueryExecutionErrors.scala:724)
```
I tried to follow the instructions by launching with

```
pyspark --packages io.delta:delta-sharing-spark_2.12:3.1.0
```

and I still get that error. I also tried adding the Hadoop Azure package, as another user suggested in a different issue:

```python
spark = (
    SparkSession.builder
    .config('spark.jars.packages',
            'org.apache.hadoop:hadoop-azure:3.3.1,'
            'io.delta:delta-core_2.12:2.2.0,'
            'io.delta:delta-sharing-spark_2.12:0.6.2')
    .config('spark.sql.extensions', 'io.delta.sql.DeltaSparkSessionExtension')
    .config('spark.sql.catalog.spark_catalog', 'org.apache.spark.sql.delta.catalog.DeltaCatalog')
    .getOrCreate()
)
```
So, it would be great if anyone could help me here. Thanks!
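For reference, this is a minimal sketch of what I understand the intended setup to be. It assumes Spark 3.5.x with Scala 2.12 (the line that `delta-sharing-spark_2.12:3.1.0` targets); the profile path and table coordinates are placeholders, not real values:

```python
# Sketch only: assumes Spark 3.5.x / Scala 2.12 to match
# delta-sharing-spark_2.12:3.1.0. Mixing in delta-core_2.12:2.2.0
# (a Spark 3.3-era artifact) may cause version conflicts.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("delta-sharing-quickstart")
    # Pull in only the Delta Sharing connector; spark.jars.packages must be
    # set before the session is created for the package to be resolved.
    .config("spark.jars.packages", "io.delta:delta-sharing-spark_2.12:3.1.0")
    .getOrCreate()
)

# Read a shared table. The load path is "<profile-file>#<share>.<schema>.<table>",
# where <profile-file> points at a Delta Sharing profile (.share) file.
df = spark.read.format("deltaSharing").load(
    "<profile-file>#<share>.<schema>.<table>"
)
df.show()
```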