address comments #290
build_main.yml
on: push

Jobs:
- Run / Check changes (39s)
- Run / Breaking change detection with Buf (branch-3.5) (1m 14s)
- Run / Run TPC-DS queries with SF=1 (1h 28m)
- Run / Run Docker integration tests (1h 0m)
- Run / Run Spark on Kubernetes Integration test (1h 13m)
- Matrix: Run / build
- Matrix: Run / java-other-versions
- Run / Build modules: sparkr (45m 5s)
- Run / Linters, licenses, dependencies and documentation generation (2h 7m)
- Matrix: Run / pyspark
Annotations
18 errors and 2 warnings
Errors

Run / Run Spark on Kubernetes Integration test:
- HashSet() did not contain "decomtest-8279da8b25b3fbef-exec-1".
- HashSet() did not contain "decomtest-48e7468b25b50c22-exec-1".
- sleep interrupted
- sleep interrupted
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007f41c05c2228@3c4bc0d4 rejected from java.util.concurrent.ThreadPoolExecutor@363383a4[Shutting down, pool size = 4, active threads = 2, queued tasks = 0, completed tasks = 310]
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007f41c05c2228@2e926b1e rejected from java.util.concurrent.ThreadPoolExecutor@363383a4[Shutting down, pool size = 3, active threads = 1, queued tasks = 0, completed tasks = 311]
- HashSet() did not contain "decomtest-29a2eb8b25ca9d19-exec-1".
- HashSet() did not contain "decomtest-d89f6c8b25cbb63d-exec-1".
- HashSet() did not contain "decomtest-5e20278b25cfac30-exec-1".
- Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-d74ee5a86b534d01bfeea31b8233c385-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-d74ee5a86b534d01bfeea31b8233c385-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={}).
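The `HashSet() did not contain ...` entries all have the shape of a ScalaTest polling assertion that ran out of time waiting for an executor pod. As a hedged illustration only (class name, helper, and timeouts below are hypothetical, not the suite's actual code), a check like the following renders its failure in exactly this wording when the expected pod never appears:

```scala
import org.scalatest.concurrent.Eventually.eventually
import org.scalatest.concurrent.PatienceConfiguration.{Interval, Timeout}
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.time.{Minutes, Seconds, Span}

class DecommissionAssertionSketch extends AnyFunSuite {
  // Hypothetical stand-in: a real integration test would poll the
  // Kubernetes API server for the names of the running executor pods.
  private def runningExecutorPods(): Set[String] =
    scala.collection.immutable.HashSet.empty

  test("replacement executor comes up after decommissioning") {
    // ScalaTest's assert macro prints a failed `contains` check as
    // `HashSet() did not contain "..."`, the message in the annotations
    // above: the pod never reached the expected state before the timeout.
    eventually(Timeout(Span(3, Minutes)), Interval(Span(10, Seconds))) {
      assert(runningExecutorPods().contains("decomtest-8279da8b25b3fbef-exec-1"))
    }
  }
}
```

The `sleep interrupted` and `SerialExecutor ... rejected` entries read as follow-on noise from tearing down the Kubernetes client after such a timeout rather than independent failures.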
Run / Build modules: sql - other tests:
- Process completed with exit code 18.

Run / Build modules: pyspark-sql, pyspark-resource, pyspark-testing:
- Process completed with exit code 19.
python/pyspark/sql/tests/pandas/test_pandas_map.py.test_self_join (python/pyspark/sql/tests/pandas/test_pandas_map.py#L1):
- [Errno 111] Connection refused

SQLAppStatusListenerWithRocksDBBackendSuite.driver side SQL metrics (SQLAppStatusListenerWithRocksDBBackendSuite#L1):
org.scalatest.exceptions.TestFailedException: Map(1482622 -> "2", 1482620 -> "total (min, med, max (stageId: taskId))
0 ms (0 ms, 0 ms, 0 ms (stage 4.0: task 9))", 1482621 -> "1") did not contain key 1482710
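The `did not contain key` wording is how ScalaTest renders a missing map key. A minimal sketch (map contents hypothetical, not the suite's code) of one way to get exactly this failure shape, here with the `contain key` matcher over the accumulator-id to rendered-value map the assertion is checking:

```scala
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers._

class DriverMetricsSketch extends AnyFunSuite {
  test("driver side SQL metric is published") {
    // Hypothetical stand-in for the accumulator-id -> rendered-value map
    // that the SQL app status store exposes for a finished execution.
    val metrics: Map[Long, String] = Map(1482620L -> "0 ms", 1482621L -> "1")

    // Fails as `Map(...) did not contain key 1482710`: the driver-side
    // metric with that accumulator id never reached the status store.
    metrics should contain key 1482710L
  }
}
```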
KafkaSourceStressSuite.stress test with multiple topics and partitions (KafkaSourceStressSuite#L2752):
org.scalatest.exceptions.TestFailedException: Timed out waiting for stream: The code passed to failAfter did not complete within 30 seconds.
java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:277)
org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7(StreamTest.scala:481)
org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7$adapted(StreamTest.scala:480)
scala.collection.mutable.HashMap$Node.foreach(HashMap.scala:642)
scala.collection.mutable.HashMap.foreach(HashMap.scala:504)
org.apache.spark.sql.streaming.StreamTest.fetchStreamAnswer$1(StreamTest.scala:480)
Caused by: null
java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1764)
org.apache.spark.sql.execution.streaming.StreamExecution.awaitOffset(StreamExecution.scala:481)
org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$8(StreamTest.scala:482)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7(StreamTest.scala:481)
== Progress ==
AssertOnQuery(<condition>, )
AddKafkaData(topics = Set(stress1, stress2, stress4, stress5), data = Range 0 until 1, message = Delete topic stress3)
CheckAnswer: [1]
AddKafkaData(topics = Set(stress1, stress2, stress4, stress5), data = Range 1 until 4, message = )
CheckAnswer: [1],[2],[3],[4]
CheckAnswer: [1],[2],[3],[4]
StopStream
AddKafkaData(topics = Set(stress1, stress2, stress4, stress5), data = Range 4 until 11, message = )
AddKafkaData(topics = Set(stress1, stress2, stress4, stress5), data = Range 11 until 16, message = )
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@6694304c,Map(),null)
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 16 until 18, message = Add topic stress6)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 18 until 27, message = )
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 27 until 36, message = Add partition)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 36 until 44, message = Add partition)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 44 until 47, message = Add partition)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5), data = Range 47 until 55, message = Add partition)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55]
StopStream
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@12b427ce,Map(),null)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55]
StopStream
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@15fe928a,Map(),null)
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5, stress7), data = Range 55 until 57, message = Add topic stress7)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress7), data = Range 57 until 58, message = Delete topic stress5)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress7), data = Range 58 until 60, message = )
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress7), data = empty Range 60 until 60, message = Delete topic stress1)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress7), data = Range 60 until 68, message = Add partition)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68]
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 68 until 69, message = Add topic stress8)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69]
StopStream
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 69 until 74, message = )
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 74 until 80, message = )
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 80 until 86, message = )
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 86 until 89, message = )
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@3f5352df,Map(),null)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89]
StopStream
AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress8, stress1, stress7), data = Range 89 until 90, message = Add partition)
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@13440c31,Map(),null)
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress7), data = Range 90 until 91, message = Add topic stress9)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91]
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress7, stress10), data = Range 91 until 96, message = Add topic stress10)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96]
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress7, stress10), data = Range 96 until 103, message = )
=> CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103]
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103]
StopStream
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress7, stress10), data = empty Range 103 until 103, message = Add partition)
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress7, stress10), data = Range 103 until 106, message = )
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress2, stress8, stress1, stress11, stress7, stress10), data = Range 106 until 112, message = Add topic stress11)
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress8, stress1, stress11, stress7, stress10, stress12, stress2), data = empty Range 112 until 112, message = Add topic stress12)
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress8, stress1, stress11, stress7, stress10, stress12, stress2), data = empty Range 112 until 112, message = )
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress8, stress1, stress11, stress7, stress10, stress12, stress2), data = Range 112 until 113, message = )
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress8, stress1, stress11, stress7, stress10, stress12, stress2), data = Range 113 until 114, message = )
AddKafkaData(topics = HashSet(stress9, stress4, stress6, stress8, stress1, stress11, stress7, stress10, stress12, stress2), data = Range 114 until 122, message = Add partition)
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@4e9e8407,Map(),null)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122]
StopStream
StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@5eac3c67,Map(),null)
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122]
== Stream ==
Output Mode: Append
Stream state: {KafkaV2[SubscribePattern[stress.*]]: {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress5":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":1,"1":2,"2":0,"3":2,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}}
Thread state: alive
Thread stack trace: [email protected]/java.lang.Thread.sleep(Native Method)
app//org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:348)
app//org.apache.spark.sql.execution.streaming.MicroBatchExecution$$Lambda$5637/0x00007f118148d148.apply$mcZ$sp(Unknown Source)
app//org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:67)
app//org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:279)
app//org.apache.spark.sql.execution.streaming.StreamExecution.$anonfun$runStream$1(StreamExecution.scala:311)
app//org.apache.spark.sql.execution.streaming.StreamExecution$$Lambda$5626/0x00007f1181488638.apply$mcV$sp(Unknown Source)
app//scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
app//org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:901)
app//org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:289)
app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.$anonfun$run$1(StreamExecution.scala:211)
app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1$$Lambda$5622/0x00007f11814875d8.apply$mcV$sp(Unknown Source)
app//scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
app//org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:94)
app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:211)
== Sink ==
0:
1:
2: [1]
3: [2] [4] [3]
4: [13] [15] [6] [8] [10] [5] [7] [9] [11] [12] [14] [16]
5:
6: [18] [17]
7: [23] [25] [27] [20] [22] [19] [21] [24] [26]
8:
9: [28] [30] [33] [35] [29] [31] [34] [36] [32]
10:
11: [39] [43] [38] [41] [42] [37] [40] [44]
12:
13: [46] [47] [45]
14:
15: [53] [55] [52] [50] [54] [48] [49] [51]
16:
17: [57] [56]
18:
19: [58]
20: [59] [60]
21:
22: [66] [67] [62] [68] [64] [61] [63] [65]
23:
24: [69]
25: [88] [85] [83] [73] [72] [71] [79] [87] [86] [76] [89] [78] [74] [84] [81] [80] [75] [77] [70] [82]
26: [90]
27:
28: [91]
29:
30: [92] [94] [96] [95] [93]
31: [100] [98] [97] [99] [101]
32: [103] [102]
== Plan ==
== Parsed Logical Plan ==
WriteToMicroBatchDataSource MemorySink, 42c464f9-6e2d-47ca-808c-eafd79b13e81, Append, 32
+- SerializeFromObject [input[0, int, false] AS value#37661]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8297/0x00007f11819d62d0@560f37bc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#37660: int
+- DeserializeToObject newInstance(class scala.Tuple2), obj#37659: scala.Tuple2
+- Project [cast(key#37635 as string) AS key#37649, cast(value#37636 as string) AS value#37650]
+- StreamingDataSourceV2Relation [key#37635, value#37636, topic#37637, partition#37638, offset#37639L, timestamp#37640, timestampType#37641], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@25c01045, KafkaV2[SubscribePattern[stress.*]], {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":0,"1":2,"2":0,"3":1,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}, {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress5":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":1,"1":2,"2":0,"3":2,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}
== Analyzed Logical Plan ==
WriteToMicroBatchDataSource MemorySink, 42c464f9-6e2d-47ca-808c-eafd79b13e81, Append, 32
+- SerializeFromObject [input[0, int, false] AS value#37661]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8297/0x00007f11819d62d0@560f37bc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#37660: int
+- DeserializeToObject newInstance(class scala.Tuple2), obj#37659: scala.Tuple2
+- Project [cast(key#37635 as string) AS key#37649, cast(value#37636 as string) AS value#37650]
+- StreamingDataSourceV2Relation [key#37635, value#37636, topic#37637, partition#37638, offset#37639L, timestamp#37640, timestampType#37641], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@25c01045, KafkaV2[SubscribePattern[stress.*]], {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":0,"1":2,"2":0,"3":1,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}, {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress5":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":1,"1":2,"2":0,"3":2,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}
== Optimized Logical Plan ==
WriteToDataSourceV2 MicroBatchWrite[epoch: 32, writer: org.apache.spark.sql.execution.streaming.sources.MemoryStreamingWrite@55e0644e]
+- SerializeFromObject [input[0, int, false] AS value#37661]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8297/0x00007f11819d62d0@560f37bc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#37660: int
+- DeserializeToObject newInstance(class scala.Tuple2), obj#37659: scala.Tuple2
+- Project [cast(key#37635 as string) AS key#37649, cast(value#37636 as string) AS value#37650]
+- StreamingDataSourceV2Relation [key#37635, value#37636, topic#37637, partition#37638, offset#37639L, timestamp#37640, timestampType#37641], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@25c01045, KafkaV2[SubscribePattern[stress.*]], {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":0,"1":2,"2":0,"3":1,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}, {"stress8":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0,"6":0,"7":1},"stress9":{"0":0,"1":0,"2":0,"3":0,"4":0,"5":0},"stress4":{"0":2,"1":2,"2":3,"3":7,"4":7,"5":1,"6":2,"7":1,"8":2,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":1,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0},"stress10":{"0":0},"stress5":{"0":0},"stress6":{"0":0,"1":0,"2":1,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0,"10":1,"11":0,"12":0,"13":0,"14":0,"15":0,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":0,"23":0,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":0,"31":0,"32":0},"stress7":{"0":1,"1":2,"2":0,"3":2,"4":1,"5":1,"6":1},"stress1":{"0":7,"1":12,"2":5,"3":1,"4":0,"5":1,"6":2,"7":2,"8":0,"9":2,"10":1,"11":1,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":4,"1":2,"2":1,"3":2,"4":1,"5":1,"6":0,"7":1,"8":1,"9":1,"10":0,"11":0,"12":1,"13":3,"14":0,"15":2,"16":1,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0}}
== Physical Plan ==
WriteToDataSourceV2 MicroBatchWrite[epoch: 32, writer: org.apache.spark.sql.execution.streaming.sources.MemoryStreamingWrite@55e0644e], org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy$$Lambda$5744/0x00007f11814bce48@6a3a0d85
+- *(1) SerializeFromObject [input[0, int, false] AS value#37661]
+- *(1) MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8297/0x00007f11819d62d0@560f37bc, obj#37660: int
+- *(1) DeserializeToObject newInstance(class scala.Tuple2), obj#37659: scala.Tuple2
+- *(1) Project [cast(key#37635 as string) AS key#37649, cast(value#37636 as string) AS value#37650]
+- MicroBatchScan[key#37635, value#37636, topic#37637, partition#37638, offset#37639L, timestamp#37640, timestampType#37641] class org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan
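For readers unfamiliar with the `== Progress ==` listing above: each line is one action handed to Spark's internal StreamTest DSL, and the `=>` arrow marks the action (a CheckAnswer) that was still waiting when failAfter expired. The following is a rough sketch of that shape, assuming the KafkaSourceTest scaffolding from the spark-sql-kafka-0-10 test sources (which supplies testStream, the AddKafkaData action, and an embedded broker); it is illustrative, not the stress suite's actual randomized generator loop:

```scala
package org.apache.spark.sql.kafka010

// Sketch only: KafkaSourceTest (Spark's test sources) provides testStream,
// AddKafkaData, and an embedded Kafka broker reachable via testUtils.
class StressShapeSketch extends KafkaSourceTest {
  import testImplicits._

  test("progress-listing shape") {
    val topic = "stress1"
    testUtils.createTopic(topic, partitions = 2)

    val mapped = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", testUtils.brokerAddress)
      .option("subscribePattern", "stress.*") // as in the stream state above
      .load()
      .selectExpr("CAST(value AS STRING)")
      .as[String]
      .map(_.toInt + 1) // matches the log: record n appears in the sink as n + 1

    // Each action corresponds to one `== Progress ==` line; StopStream /
    // StartStream pairs exercise recovery from the same checkpoint.
    testStream(mapped)(
      AddKafkaData(Set(topic), 0),
      CheckAnswer(1),
      StopStream,
      StartStream(),
      AddKafkaData(Set(topic), 1, 2, 3),
      CheckAnswer(1, 2, 3, 4) // order-independent over all data so far
    )
  }
}
```

In the failed run, the marked CheckAnswer never saw rows 97..103 arrive in the memory sink (compare the `== Sink ==` batches, which stop at [103]) before the 30-second limit.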
Warnings

Run / Build modules: pyspark-core, pyspark-streaming:
- No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.

Run / Build modules: pyspark-errors:
- No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Artifacts
Produced during runtime

Name | Status | Size
---|---|---
site | Expired | 59.2 MB
test-results-catalyst, hive-thriftserver--17-hadoop3-hive2.3 | Expired | 2.79 MB
test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--17-hadoop3-hive2.3 | Expired | 132 KB
test-results-docker-integration--17-hadoop3-hive2.3 | Expired | 119 KB
test-results-hive-- other tests-17-hadoop3-hive2.3 | Expired | 911 KB
test-results-hive-- slow tests-17-hadoop3-hive2.3 | Expired | 853 KB
test-results-pyspark-connect--17-hadoop3-hive2.3 | Expired | 408 KB
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 | Expired | 1.26 MB
test-results-pyspark-pandas--17-hadoop3-hive2.3 | Expired | 1.14 MB
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 | Expired | 1.06 MB
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 | Expired | 972 KB
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 | Expired | 637 KB
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 | Expired | 326 KB
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 | Expired | 1.85 MB
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | Expired | 55.7 KB
test-results-sparkr--17-hadoop3-hive2.3 | Expired | 280 KB
test-results-sql-- extended tests-17-hadoop3-hive2.3 | Expired | 2.96 MB
test-results-sql-- other tests-17-hadoop3-hive2.3 | Expired | 4.25 MB
test-results-sql-- slow tests-17-hadoop3-hive2.3 | Expired | 2.76 MB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | Expired | 871 KB
test-results-tpcds--17-hadoop3-hive2.3 | Expired | 21.8 KB
unit-tests-log-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | Expired | 455 MB
unit-tests-log-sql-- other tests-17-hadoop3-hive2.3 | Expired | 289 MB
unit-tests-log-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | Expired | 461 MB