Currently you declare the arguments to a Hadoop DSL SparkJob using "appParams", where you just list the job arguments directly.
Another way to do this would be to support "namedAppParams", which takes a list of keys for job properties you have already set on the job and looks up their corresponding values to use as the job arguments. This could throw an error if you haven't declared a given key on the job.
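To illustrate the difference, here is a minimal sketch of what the two styles might look like in the Hadoop DSL. The workflow, job, class, jar, and property names are all hypothetical, and the `namedAppParams` syntax shown is only a suggestion for the proposed feature, not existing DSL:

```groovy
hadoop {
  buildPath "azkaban"

  workflow('countFlow') {          // hypothetical workflow name
    sparkJob('countJob') {         // hypothetical job name
      uses 'com.example.CountJob'  // hypothetical Spark app class
      executes 'count.jar'         // hypothetical jar
      set properties: [
        'inputPath' : '/data/input',   // hypothetical job properties
        'outputPath': '/data/output'
      ]

      // Current style: list the argument values literally, in order
      appParams ['/data/input', '/data/output']

      // Proposed style: list property keys; each key is resolved against
      // the properties already set on the job, and an undeclared key
      // would raise an error at build time
      // namedAppParams ['inputPath', 'outputPath']
    }
  }
}
```

One advantage of the named form is that an argument and the property it mirrors can no longer drift apart: changing `inputPath` in one place updates both, and a typo in a key fails fast instead of silently passing a wrong literal to the job.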