Pig Job — POST pig
Description
Create and queue a Pig job.
URL
http://www.myserver.com/templeton/v1/pig
Parameters
Name | Description | Required? | Default
---|---|---|---
execute | String containing an entire, short Pig program to run. | One of either "execute" or "file" is required. | None
file | HDFS file name of a Pig program to run. | One of either "execute" or "file" is required. | None
arg | Set a program argument. | Optional | None
files | Comma separated files to be copied to the map reduce cluster. | Optional | None
statusdir | A directory where WebHCat will write the status of the Map Reduce job. If provided, it is the caller's responsibility to remove this directory when done. | Optional | None
callback | Define a URL to be called upon job completion. You may embed a specific job ID into this URL using $jobId; the tag is replaced with this job's ID in the callback URL. | Optional | None
The standard parameters are also supported.
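As a minimal sketch of how these parameters can be combined, the call below submits an inline program via "execute" instead of "file" and writes status to a "statusdir". The host, port, and user name are assumptions carried over from the example later on this page, and curl's --data-urlencode is used so the Pig text is properly URL-encoded:

% curl -s -d user.name=ctdean \
       --data-urlencode "execute=A = load 'passwd' using PigStorage(':'); dump A;" \
       -d statusdir=pig.output \
       'http://localhost:50111/templeton/v1/pig'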
Results
Name | Description
---|---
id | A string containing the job ID, similar to "job_201110132141_0001".
info | A JSON object containing the information returned when the job was queued. See the Hadoop documentation for more information.
Example
Code and Data Setup
% cat id.pig
A = load 'passwd' using PigStorage(':');
B = foreach A generate $0 as id;
dump B;

% cat fake-passwd
ctdean:Chris Dean:secret
pauls:Paul Stolorz:good
carmas:Carlos Armas:evil
dra:Deirdre McClure:marvelous

% hadoop fs -put id.pig .
% hadoop fs -put fake-passwd passwd
Curl Command
% curl -s -d user.name=ctdean \
       -d file=id.pig \
       -d arg=-v \
       'http://localhost:50111/templeton/v1/pig'
JSON Output
{ "id": "job_201111101627_0018", "info": { "stdout": "templeton-job-id:job_201111101627_0018 ", "stderr": "", "exitcode": 0 } }