...
Timeout for the job monitor to get the Spark job state.
hive.spark.dynamic.partition.pruning
Turn on dynamic partition pruning for the Spark engine.
hive.spark.dynamic.partition.pruning.max.data.size
The maximum data size for a dimension table that generates partition pruning information. If this limit is reached, the optimization is turned off.
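The two pruning properties above can be set together in hive-site.xml. A minimal sketch, assuming a hive-site.xml deployment; the 100 MB threshold is an illustrative value, not a recommendation:

```xml
<!-- Enable dynamic partition pruning for Hive on Spark. -->
<property>
  <name>hive.spark.dynamic.partition.pruning</name>
  <value>true</value>
</property>
<!-- Illustrative cap (100 MB, in bytes): dimension tables larger than this
     do not generate partition pruning information, and the optimization
     is turned off for them. -->
<property>
  <name>hive.spark.dynamic.partition.pruning.max.data.size</name>
  <value>104857600</value>
</property>
```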
Remote Spark Driver
The remote Spark driver is the application launched in the Spark cluster that submits the actual Spark jobs. It was introduced in HIVE-8528. It is a long-lived application, initialized upon the first query of the current user and running until the user's session is closed. The following properties control remote communication between the remote Spark driver and the Hive client that spawns it.
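As a sketch of how such client/driver communication settings might look in hive-site.xml: the property names below (hive.spark.client.connect.timeout and hive.spark.client.server.connect.timeout) are assumed here for illustration, and the timeout values are examples rather than recommendations:

```xml
<!-- Assumed property: how long the remote Spark driver waits when
     connecting back to the Hive client (milliseconds). -->
<property>
  <name>hive.spark.client.connect.timeout</name>
  <value>1000</value>
</property>
<!-- Assumed property: how long the Hive client waits for the remote
     Spark driver to come up and connect (milliseconds). -->
<property>
  <name>hive.spark.client.server.connect.timeout</name>
  <value>90000</value>
</property>
```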
...
Jobs submitted to HCatalog can specify configuration properties that affect storage, error tolerance, and other kinds of behavior during the job. See HCatalog Configuration Properties for details.
WebHCat Configuration Properties
...