Class MergeJoinIndexer

java.lang.Object
  org.apache.pig.LoadFunc
      org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MergeJoinIndexer

public class MergeJoinIndexer
extends LoadFunc
Merge Join indexer is used to generate an on-the-fly index for doing a merge join efficiently. It samples the first record from every block of the right-side input and returns a tuple in the following format: (key0, key1, ..., position, splitIndex). These tuples are then sorted before being written out to the index file on HDFS.
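As an illustration of the index records described above (this is a hypothetical stand-in, not part of the Pig API, which emits Pig `Tuple`s), the following sketch models an index entry as a plain Java record and sorts entries by key, mirroring the sort applied before the index is written out:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class IndexSketch {
    // Hypothetical model of one (key, position, splitIndex) index entry;
    // the real indexer produces Pig Tuples with these fields.
    record IndexEntry(String key, long position, int splitIndex) {}

    public static void main(String[] args) {
        List<IndexEntry> entries = new ArrayList<>(List.of(
                new IndexEntry("k3", 0L, 1),
                new IndexEntry("k1", 0L, 0),
                new IndexEntry("k2", 4096L, 0)));

        // Sort by key so a merge join can search the index for the
        // split and offset at which a given join key begins.
        entries.sort(Comparator.comparing(IndexEntry::key));

        for (IndexEntry e : entries) {
            System.out.println(e.key() + "\t" + e.position() + "\t" + e.splitIndex());
        }
    }
}
```

With the index sorted by key, locating the right-side block for a left-side key is a lookup rather than a scan.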
| Constructor Summary |
|---|
| MergeJoinIndexer(String funcSpec, String innerPlan, String serializedPhyPlan, String udfCntxtSignature, String scope, String ignoreNulls) |
| Method Summary | |
|---|---|
| org.apache.hadoop.mapreduce.InputFormat | getInputFormat(): This will be called during planning on the front end. |
| LoadCaster | getLoadCaster(): This will be called on the front end during planning and not on the back end during execution. |
| Tuple | getNext(): Retrieves the next tuple to be processed. |
| void | prepareToRead(org.apache.hadoop.mapreduce.RecordReader reader, PigSplit split): Initializes LoadFunc for reading data. |
| void | setLocation(String location, org.apache.hadoop.mapreduce.Job job): Communicates to the loader the location of the object(s) being loaded. |
| Methods inherited from class org.apache.pig.LoadFunc |
|---|
getAbsolutePath, getPathStrings, join, relativeToAbsolutePath, setUDFContextSignature, warn |
| Methods inherited from class java.lang.Object |
|---|
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait |
| Constructor Detail |
|---|
public MergeJoinIndexer(String funcSpec,
String innerPlan,
String serializedPhyPlan,
String udfCntxtSignature,
String scope,
String ignoreNulls)
throws ExecException
Parameters:
funcSpec - Loader specification.
innerPlan - Serialized version of the LR plan. Only the keys are kept in the index file, not the whole tuple, so the LR plan is needed to extract the keys from each sampled tuple.
serializedPhyPlan - Serialized physical plan on the right side.
Throws:
ExecException

| Method Detail |
|---|
public Tuple getNext()
throws IOException
Description copied from class: LoadFunc
Specified by:
getNext in class LoadFunc
Throws:
IOException - if there is an exception while retrieving the next tuple
public org.apache.hadoop.mapreduce.InputFormat getInputFormat()
throws IOException
Description copied from class: LoadFunc
Specified by:
getInputFormat in class LoadFunc
Throws:
IOException - if there is an exception during InputFormat construction
public LoadCaster getLoadCaster()
throws IOException
Description copied from class: LoadFunc
Overrides:
getLoadCaster in class LoadFunc
Returns:
the LoadCaster associated with this loader. Returning null indicates that casts from byte array are not supported for this loader.
Throws:
IOException - if there is an exception during LoadCaster construction
public void prepareToRead(org.apache.hadoop.mapreduce.RecordReader reader,
PigSplit split)
throws IOException
Description copied from class: LoadFunc
Specified by:
prepareToRead in class LoadFunc
Parameters:
reader - RecordReader to be used by this instance of the LoadFunc
split - The input PigSplit to process
Throws:
IOException - if there is an exception during initialization
public void setLocation(String location,
org.apache.hadoop.mapreduce.Job job)
throws IOException
Description copied from class: LoadFunc
Communicate to the loader the location of the object(s) being loaded. The location string passed here is the return value of LoadFunc.relativeToAbsolutePath(String, Path). Implementations should use this method to communicate the location (and any other information) to the underlying InputFormat through the Job object.

This method will be called in the frontend and backend multiple times. Implementations should bear in mind that this method is called multiple times and should ensure there are no inconsistent side effects due to the multiple calls.

Specified by:
setLocation in class LoadFunc
Parameters:
location - Location as returned by LoadFunc.relativeToAbsolutePath(String, Path)
job - the Job object; use the UDFContext to store, or retrieve earlier stored, information
Throws:
IOException - if the location is not valid
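Because setLocation may be invoked several times on both the frontend and the backend, a safe implementation keeps the call idempotent. A minimal sketch of that pattern, using a plain Map as a hypothetical stand-in for the Hadoop Job configuration (the method and property name here are illustrative, not the Pig implementation):

```java
import java.util.HashMap;
import java.util.Map;

public class SetLocationSketch {
    // Hypothetical stand-in for a Hadoop Job configuration.
    static final Map<String, String> conf = new HashMap<>();

    // Idempotent setLocation: repeated calls with the same location
    // leave the configuration in the same state, so multiple
    // frontend/backend invocations cause no inconsistent side effects.
    static void setLocation(String location) {
        // Overwrite rather than append: calling this twice is harmless.
        conf.put("input.dir", location);
    }

    public static void main(String[] args) {
        setLocation("/data/right_input");
        setLocation("/data/right_input"); // second call, same final state
        System.out.println(conf.get("input.dir"));
    }
}
```

An implementation that instead appended the location on every call would accumulate duplicate entries across the repeated frontend and backend invocations, which is exactly the inconsistent side effect the contract warns against.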