Hello, this is a feature request: for those of us who aren't experts in how Spark works, please add an example of an additional step to run with the `ENABLE_DAEMON_INIT` flag.
In my case, I'm hoping to use this to start the HiveThrift server when the containers come up, so that I can run JDBC tests in CI from outside the container, as well as a simple ETL script that creates some tables and loads a small dataset.
I have found that this command lets me start up the thrift server:
```sh
cd /spark/bin && /spark/sbin/../bin/spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
```
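For what it's worth, Spark distributions also ship a wrapper script, `sbin/start-thriftserver.sh`, that issues essentially the same `SparkSubmit` invocation. Assuming this image keeps the standard Spark layout under `/spark`, the following should be equivalent:

```sh
# Equivalent to the spark-class command above, using the wrapper
# script bundled with Spark (assumes Spark home is /spark)
/spark/sbin/start-thriftserver.sh
```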
So I will probably `docker exec` it before starting the test runner, but it would be great if this could be included in the container's initialization.
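For reference, here is a rough sketch of that workaround in CI; the container name `spark-master` and the HiveServer2 default port 10000 are assumptions on my part, not something this image guarantees:

```sh
# Start the Thrift server inside the already-running container
# (the container name spark-master is an assumption)
docker exec -d spark-master /spark/sbin/start-thriftserver.sh

# Smoke-test JDBC connectivity with beeline, which ships in Spark's bin/
# (assumes the Thrift server listens on the default port 10000)
docker exec spark-master /spark/bin/beeline \
  -u jdbc:hive2://localhost:10000 -e "SHOW DATABASES;"
```

Having the container's own init run this would remove the extra exec step, which is what this request is asking for.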
Thanks