
...

Name: execute
Description: String containing an entire, short Pig program to run.
Required: One of either "execute" or "file" is required.
Default: None

Name: file
Description: HDFS file name of a Pig program to run.
Required: One of either "execute" or "file" is required.
Default: None

Name: arg
Description: Set a program argument. If -useHCatalog is included among the arguments, the usehcatalog parameter is interpreted as "true" (Hive 0.13.0 and later).
Required: Optional
Default: None

Name: files
Description: Comma-separated files to be copied to the MapReduce cluster.
Required: Optional
Default: None

Name: statusdir
Description: A directory where WebHCat will write the status of the Pig job. If provided, it is the caller's responsibility to remove this directory when done.
Required: Optional
Default: None

Name: callback
Description: Define a URL to be called upon job completion. You may embed a specific job ID into this URL using $jobId; this tag will be replaced in the callback URL with this job's job ID.
Required: Optional
Default: None

Name: usehcatalog
Description: Specify that the submitted job uses HCatalog and therefore needs to access the metastore, which requires additional steps for WebHCat to perform in a secure cluster (see HIVE-5133). This parameter was introduced in Hive 0.13.0. It can also be set to "true" by including -useHCatalog in the arg parameter.
Also, if webhcat-site.xml defines the parameters templeton.hive.archive, templeton.hive.home, and templeton.hcat.home, then WebHCat will ship the Hive tar to the target node where the job runs (see HIVE-5547). This means that Hive does not need to be installed on every node in the Hadoop cluster; it does not, however, ensure that Pig is installed on the target node. This is independent of security, but it improves manageability.
The webhcat-site.xml parameters are documented in webhcat-default.xml.
Required: Optional in Hive 0.13.0+
Default: false
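
As an illustration of how these parameters fit together, the following is a minimal sketch of a Pig job submission against WebHCat's pig resource using Python's requests library. The host name, user name, Pig script, status directory, and callback URL are placeholder assumptions, not values taken from this page, and user.name is a standard WebHCat request parameter that is not part of the table above.

# Minimal sketch (not from the original page): submit a Pig job through
# WebHCat's REST interface using the parameters described above. Host, port,
# user name, table name, status directory, and callback URL are placeholders.
import requests

# WebHCat's default port is 50111; the Pig resource lives under templeton/v1/pig.
WEBHCAT_URL = "http://webhcat-host.example.com:50111/templeton/v1/pig"

payload = [
    ("user.name", "alice"),  # standard WebHCat parameter, not listed in the table above
    ("execute", "A = LOAD 'default.mytable' USING org.apache.hive.hcatalog.pig.HCatLoader(); DUMP A;"),
    ("arg", "-useHCatalog"),  # makes WebHCat treat usehcatalog as "true" (Hive 0.13.0+)
    ("statusdir", "/tmp/pig.status"),  # caller is responsible for removing this directory when done
    ("callback", "http://example.com/notify/$jobId"),  # $jobId is replaced with the job's ID
]

# Parameters are sent form-encoded; WebHCat answers with JSON that includes the job ID.
response = requests.post(WEBHCAT_URL, data=payload)
response.raise_for_status()
print("Submitted job:", response.json().get("id"))

Because the example script loads a table through HCatLoader, -useHCatalog is passed via arg, which is equivalent to setting usehcatalog=true in Hive 0.13.0 and later.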

...