…data from file, and to re-sample it to the expected sample rate. A top-level inference API is provided …
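Re-sampling to the model's expected rate can be done with simple linear interpolation. The helper below is a hypothetical sketch, not the example's real API; `Resample` and its signature are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical helper (not part of the Arm NN example): linearly
// interpolate audio samples from srcRate to dstRate.
std::vector<float> Resample(const std::vector<float>& in,
                            double srcRate, double dstRate)
{
    if (in.empty() || srcRate <= 0.0 || dstRate <= 0.0)
    {
        return {};
    }
    const double ratio = srcRate / dstRate;
    const std::size_t outLen = static_cast<std::size_t>(in.size() / ratio);

    std::vector<float> out;
    out.reserve(outLen);
    for (std::size_t i = 0; i < outLen; ++i)
    {
        // Map the output index back to a (fractional) source position.
        const double srcPos = i * ratio;
        const std::size_t i0 = static_cast<std::size_t>(srcPos);
        const std::size_t i1 = std::min(i0 + 1, in.size() - 1);
        const double frac = srcPos - static_cast<double>(i0);
        out.push_back(static_cast<float>(in[i0] * (1.0 - frac) + in[i1] * frac));
    }
    return out;
}
```

For example, down-sampling a 32 kHz ramp to 16 kHz keeps every second sample.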
3. Executing Inference
5. Decoding and Processing Inference Output
Using the `Optimize()` function, we optimize the graph for inference and load the optimized network …
…ng pipeline has three steps: data pre-processing, running inference, and decoding the inference result…
…ositioned window of data, sized appropriately for the given model, to pre-process before inference.
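The sliding-window positioning described above can be sketched as follows. This is an illustrative stand-in, assuming a fixed window size and stride; `ForEachWindow` and its callback signature are hypothetical, not the example's actual helpers.

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Hypothetical sketch: slide a fixed-size window over the audio at a
// given stride and hand each window to a pre-processing callback.
void ForEachWindow(const std::vector<float>& audio,
                   std::size_t windowSize,
                   std::size_t stride,
                   const std::function<void(const float*, std::size_t)>& fn)
{
    if (audio.size() < windowSize || stride == 0)
    {
        return;
    }
    // Only full windows are emitted; a trailing partial window is dropped.
    for (std::size_t start = 0; start + windowSize <= audio.size();
         start += stride)
    {
        fn(audio.data() + start, windowSize);
    }
}
```

With 10 samples, a window of 4, and a stride of 2, the callback fires for windows starting at offsets 0, 2, 4, and 6.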
After all the MFCCs needed for an inference have been extracted from the audio data, they are concat…
#### Executing Inference

kwsPipeline->Inference(preprocessedData, results);
The inference step calls the `ArmnnNetworkExecutor::Run` method, which prepares the input tensors and exe…
A compute device performs inference for the loaded network using the `EnqueueWorkload()` function o…
… for output data once and map it to output tensor objects. After successful inference, we read data
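The overall flow — bind the pre-processed input, dispatch the workload, read back the output — can be illustrated with simplified stand-ins. `MockNetwork` and this `Run` wrapper are deliberately toy types, not the real Arm NN `IRuntime`/`ArmnnNetworkExecutor` API; they only mirror the shape of the prepare/run/read sequence described above.

```cpp
#include <cstddef>
#include <vector>

// Toy stand-in for a loaded network: "inference" here just doubles
// every input value so the flow is observable and testable.
struct MockNetwork
{
    std::vector<float> EnqueueWorkload(const std::vector<float>& input) const
    {
        std::vector<float> output(input.size());
        for (std::size_t i = 0; i < input.size(); ++i)
        {
            output[i] = input[i] * 2.0f;
        }
        return output;
    }
};

// Mirrors the shape of an executor's Run method: take pre-processed
// input, execute the workload, and fill the caller's results buffer.
void Run(const MockNetwork& net,
         const std::vector<float>& preprocessedData,
         std::vector<float>& results)
{
    results = net.EnqueueWorkload(preprocessedData);
}
```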
The output from the inference is decoded to obtain the spotted keyword: the word with the highest proba…
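Decoding amounts to an argmax over the output probabilities. The sketch below is a hedged illustration; `DecodeKeyword`, its label list, and the optional confidence threshold are assumptions, not the example's actual decoder.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical decoder sketch: pick the label with the highest output
// probability; report it only if it clears a confidence threshold.
std::string DecodeKeyword(const std::vector<float>& probs,
                          const std::vector<std::string>& labels,
                          float threshold = 0.0f)
{
    if (probs.empty() || probs.size() != labels.size())
    {
        return "";
    }
    std::size_t best = 0;
    for (std::size_t i = 1; i < probs.size(); ++i)
    {
        if (probs[i] > probs[best])
        {
            best = i;
        }
    }
    // An empty string signals "no keyword spotted with enough confidence".
    return probs[best] >= threshold ? labels[best] : "";
}
```

For instance, with probabilities {0.1, 0.7, 0.2} over labels {"yes", "no", "up"}, the decoder reports "no".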