Hi,

I set up the almond kernel on Windows. It works well with Scala and Spark in local mode, but I want to use a remote HDFS and submit jobs to a remote YARN cluster, so I need to override some configs like this.
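For example, in a notebook cell I create the session roughly along these lines (the versions, master URL, and executor settings below are placeholders, not my exact values):

```scala
// Load Spark and the almond-spark integration (versions are placeholders)
import $ivy.`org.apache.spark::spark-sql:2.4.0`
import $ivy.`sh.almond::almond-spark:0.10.9`

import org.apache.spark.sql._

// Build a session against the remote YARN cluster instead of local mode
val spark = NotebookSparkSession.builder()
  .master("yarn")
  .config("spark.executor.instances", "4")
  .config("spark.executor.memory", "2g")
  .getOrCreate()
```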
All of these configs exist in my spark-defaults file and Hadoop config dir. I have set the system environment variables and also set them in the kernel.json like this, but the almond kernel still does not use these configs.
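The env block in my kernel.json looks roughly like this (the paths and launcher arguments here are illustrative, not my exact setup):

```json
{
  "display_name": "Scala (almond)",
  "language": "scala",
  "argv": ["java", "-jar", "launcher.jar", "--connection-file", "{connection_file}"],
  "env": {
    "SPARK_HOME": "C:\\spark",
    "SPARK_CONF_DIR": "C:\\spark\\conf",
    "HADOOP_CONF_DIR": "C:\\hadoop\\etc\\hadoop",
    "YARN_CONF_DIR": "C:\\hadoop\\etc\\hadoop"
  }
}
```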
When I open the Spark UI and look at the Environment tab, it still shows almond's default config instead of my overrides.
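A minimal check from a cell (assuming the `spark` session from the sketch above) would show whether the kernel's JVM even sees these variables:

```scala
// Check whether the kernel process inherited the config-dir env vars;
// if these print None, kernel.json / system env vars never reached the JVM,
// so the spark-defaults file and the Hadoop XML files are never read.
println(sys.env.get("SPARK_CONF_DIR"))
println(sys.env.get("HADOOP_CONF_DIR"))

// Inspect the configuration the session actually ended up with
spark.conf.getAll.foreach(println)
```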
How can I override the configs above so they are picked up from the spark-defaults file and the Hadoop config dir? Thank you.