option("optimizeWrite", "True") does not work in GCP Dataproc cluster.
I am getting an exception
java.lang.NoSuchMethodError: 'org.apache.spark.util.Clock org.apache.spark.storage.ShuffleBlockFetcherIterator$.$lessinit$greater$default$18()'
at org.apache.spark.sql.delta.perf.OptimizedWriterShuffleReader.read(DeltaOptimizedWriterExec.scala:291)
at org.apache.spark.sql.delta.perf.DeltaOptimizedWriterRDD.compute(DeltaOptimizedWriterExec.scala:268)
Steps to reproduce
Created a GCS cluster using image 2.2-debian12. Created a jupyter notebook from the cluster with the following code:
Bug
Which Delta project/connector is this regarding?
Spark
Describe the problem
option("optimizeWrite", "True") does not work in GCP Dataproc cluster.
I am getting an exception
java.lang.NoSuchMethodError: 'org.apache.spark.util.Clock org.apache.spark.storage.ShuffleBlockFetcherIterator$.$lessinit$greater$default$18()'
at org.apache.spark.sql.delta.perf.OptimizedWriterShuffleReader.read(DeltaOptimizedWriterExec.scala:291)
at org.apache.spark.sql.delta.perf.DeltaOptimizedWriterRDD.compute(DeltaOptimizedWriterExec.scala:268)
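For context: a NoSuchMethodError on a Spark-internal class like ShuffleBlockFetcherIterator usually points to a binary mismatch between the delta-spark jar and the Spark runtime on the cluster (delta-spark 3.1.0 is built against Spark 3.5). A minimal sanity check, assuming the same spark session as in the repro below:

# Minimal sketch: compare the Spark runtime with the Delta jars on the classpath.
# delta-spark 3.1.0 targets Spark 3.5; a different Spark version on the Dataproc
# image can surface exactly as a runtime NoSuchMethodError like the one above.
print(spark.version)                 # Spark version actually running on the cluster
print(spark.conf.get("spark.jars"))  # Delta jars pinned via spark.jars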
Steps to reproduce
Created a Dataproc cluster using image 2.2-debian12, then created a Jupyter notebook on the cluster with the following code:
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("MyApp")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .config("spark.jars", "gs://xxx/lib/delta-spark_2.12-3.1.0.jar,gs://xxx/lib/delta-storage-3.1.0.jar")
    .getOrCreate()
)
df = spark.read.format("delta").load("gs://xxx/delta/data1")
(
    df.write.format("delta")
    .partitionBy("g", "p")
    .option("optimizeWrite", "True")
    .option("delta.enableChangeDataFeed", "true")
    .save("gs://xxx/delta/data2")
)
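Side note: besides the per-write option, Delta 3.1 also exposes optimized writes as a session conf and as a table property. A hedged sketch of both, assuming the same session and the same placeholder gs://xxx paths as above:

# Sketch, assuming Delta 3.1+: enable optimized writes for all Delta writes
# in this session ...
spark.conf.set("spark.databricks.delta.optimizeWrite.enabled", "true")

# ... or persist it as a property on the target table itself.
spark.sql(
    "ALTER TABLE delta.`gs://xxx/delta/data2` "
    "SET TBLPROPERTIES ('delta.autoOptimize.optimizeWrite' = 'true')"
)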
Environment information

- Delta Lake version: 3.1.0 (delta-spark_2.12-3.1.0.jar, delta-storage-3.1.0.jar)
- Spark version: as bundled with Dataproc image 2.2-debian12
- Scala version: 2.12