...
/**
 * Implementation of the prediction routines.
 *
 * @param modelPathPrefix Path prefix from which to load the model artifacts.
 *                        These include the symbol, parameters, and synset.txt.
 *                        Example: file://model-dir/resnet-152 (containing
 *                        resnet-152-symbol.json, resnet-152-0000.params, and synset.txt).
 * @param inputDescriptors Descriptors defining the input node names, shape,
 *                         layout, and type parameters.
 *                         Note: If an input descriptor is missing the batch size
 *                         ('N' in the layout), a batch size of 1 is assumed for the model.
 * @param contexts Device contexts on which to run inference; defaults to CPU.
 * @param epoch Model epoch to load; defaults to 0.
 */
Predictor(modelPathPrefix: String, inputDescriptors: IndexedSeq[DataDesc],
          contexts: Array[Context] = Context.cpu(), epoch: Int = 0)
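
Constructing a Predictor might look like the following sketch. It assumes the MXNet Scala package (`org.apache.mxnet`) is on the classpath and that a hypothetical `model-dir` contains the ResNet-152 artifacts named above; the input shape is the usual 224x224 RGB layout for that model.

    // Sketch only: model path and shape are illustrative assumptions.
    import org.apache.mxnet.{Context, DataDesc, DType, Shape}
    import org.apache.mxnet.infer.Predictor

    // One input node named "data", batch size 1, NCHW layout.
    val inputDesc = IndexedSeq(
      new DataDesc("data", Shape(1, 3, 224, 224), DType.Float32, "NCHW"))

    val predictor = new Predictor("file://model-dir/resnet-152",
                                  inputDesc, Array(Context.cpu()))

Passing `Array(Context.gpu())` instead would run inference on the GPU, provided MXNet was built with CUDA support.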
/**
 * Takes input as an IndexedSeq of one-dimensional arrays and creates the
 * NDArray needed for inference. The arrays are reshaped based on the
 * input descriptors.
 *
 * @param input An IndexedSeq of one-dimensional arrays. Multiple arrays are
 *              needed when the model has more than one input.
 * @return An IndexedSeq of output arrays.
 */
def predict(input: IndexedSeq[Array[Float]]): IndexedSeq[Array[Float]]
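
A minimal usage sketch for `predict`, assuming a `predictor` built as in the constructor example and an already-preprocessed image; the flat pixel array and its size are illustrative assumptions, not part of the API.

    // Sketch only: assumes `predictor` was created with a single
    // "data" input of shape (1, 3, 224, 224).
    val img: Array[Float] = new Array[Float](3 * 224 * 224) // preprocessed pixels

    // One array per model input; this model has a single input.
    val output: IndexedSeq[Array[Float]] = predictor.predict(IndexedSeq(img))

    // For a classification model, output(0) holds the class probabilities.
    val topClass = output(0).zipWithIndex.maxBy(_._1)._2

The reshape from the flat array to the descriptor's shape happens inside the call, so the caller only supplies one-dimensional arrays.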
/**
 * Predict using NDArrays as input.
 * This method is useful when the input is a batch of data.
 * Note: The caller is responsible for allocating and deallocating the
 * input and output NDArrays.
 *
 * @param inputBatch An IndexedSeq of NDArrays.
 * @return Output predictions as NDArrays.
 */
def predictWithNDArray(inputBatch: IndexedSeq[NDArray]): IndexedSeq[NDArray]
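
A batched-inference sketch for `predictWithNDArray`, again assuming a `predictor` built as in the constructor example; the batch size of 8 and the zero-filled input are illustrative stand-ins for real data. Note the explicit `dispose()` calls, since the caller manages NDArray memory.

    // Sketch only: assumes `predictor` exists and accepts NCHW input.
    import org.apache.mxnet.{Context, NDArray, Shape}

    // A batch of 8 images; real code would fill this with pixel data.
    val batch = NDArray.zeros(Shape(8, 3, 224, 224), Context.cpu())

    val outputs: IndexedSeq[NDArray] = predictor.predictWithNDArray(IndexedSeq(batch))

    // Caller-managed memory: free both input and output NDArrays when done.
    outputs.foreach(_.dispose())
    batch.dispose()

Because the input is already an NDArray with its batch dimension set, no reshaping occurs; this path avoids the copy from `Array[Float]` that `predict` performs.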
...