TensorRT  7.2.1.6
NVIDIA TensorRT
SampleDynamicReshape Class Reference

The SampleDynamicReshape class implements the dynamic reshape sample. More...


Public Member Functions

 SampleDynamicReshape (const samplesCommon::OnnxSampleParams &params)
 
bool build ()
 Builds both engines. More...
 
bool prepare ()
 Prepares the model for inference by creating execution contexts and allocating buffers. More...
 
bool infer ()
 Runs inference using TensorRT on a random image. More...
 

Private Types

template<typename T >
using SampleUniquePtr = std::unique_ptr< T, samplesCommon::InferDeleter >
 

Private Member Functions

bool buildPreprocessorEngine (const SampleUniquePtr< nvinfer1::IBuilder > &builder)
 Builds an engine for preprocessing (mPreprocessorEngine). More...
 
bool buildPredictionEngine (const SampleUniquePtr< nvinfer1::IBuilder > &builder)
 Builds an engine for prediction (mPredictionEngine). More...
 
Dims loadPGMFile (const std::string &fileName)
 Loads a PGM file into mInput and returns the dimensions of the loaded image. More...
 
bool validateOutput (int digit)
 Checks whether the model prediction (in mOutput) is correct. More...
 
template<typename T >
SampleUniquePtr< T > makeUnique (T *t)
 

Private Attributes

samplesCommon::OnnxSampleParams mParams
 The parameters for the sample. More...
 
nvinfer1::Dims mPredictionInputDims
 The dimensions of the input of the MNIST model. More...
 
nvinfer1::Dims mPredictionOutputDims
 The dimensions of the output of the MNIST model. More...
 
SampleUniquePtr< nvinfer1::ICudaEngine > mPreprocessorEngine {nullptr}
 
SampleUniquePtr< nvinfer1::ICudaEngine > mPredictionEngine {nullptr}
 
SampleUniquePtr< nvinfer1::IExecutionContext > mPreprocessorContext {nullptr}
 
SampleUniquePtr< nvinfer1::IExecutionContext > mPredictionContext {nullptr}
 
samplesCommon::ManagedBuffer mInput {}
 Host and device buffers for the input. More...
 
samplesCommon::DeviceBuffer mPredictionInput {}
 Device buffer for the output of the preprocessor, i.e. the input to the prediction model. More...
 
samplesCommon::ManagedBuffer mOutput {}
 Host buffer for the output. More...
 

Detailed Description

The SampleDynamicReshape class implements the dynamic reshape sample.

This class builds one engine that resizes a given input to the correct size, and a second engine based on an ONNX MNIST model that generates a prediction.

Member Typedef Documentation

◆ SampleUniquePtr

template<typename T >
using SampleDynamicReshape::SampleUniquePtr = std::unique_ptr<T, samplesCommon::InferDeleter>
private

Constructor & Destructor Documentation

◆ SampleDynamicReshape()

SampleDynamicReshape::SampleDynamicReshape ( const samplesCommon::OnnxSampleParams &  params)
inline

Member Function Documentation

◆ build()

bool SampleDynamicReshape::build ( )

Builds both engines.

Builds the two engines required for inference.

This function creates one TensorRT engine that resizes inputs to the correct size (mPreprocessorEngine), then parses the ONNX model into a TensorRT network and builds the engine used to run inference (mPredictionEngine).

Returns
False if building either the preprocessor or the prediction engine fails.

◆ prepare()

bool SampleDynamicReshape::prepare ( )

Prepares the model for inference by creating execution contexts and allocating buffers.

Prepares the model for inference by creating execution contexts for both engines and allocating buffers.

This function sets up the sample for inference. This involves allocating buffers for the inputs and outputs, as well as creating TensorRT execution contexts for both engines. This only needs to be called a single time.

Returns
False if creating either the preprocessor or the prediction execution context fails.

◆ infer()

bool SampleDynamicReshape::infer ( )

Runs inference using TensorRT on a random image.

Runs inference for this sample.

This function is the main execution function of the sample. It runs inference using a random image from the MNIST dataset as input.


◆ buildPreprocessorEngine()

bool SampleDynamicReshape::buildPreprocessorEngine ( const SampleUniquePtr< nvinfer1::IBuilder > &  builder)
private

Builds an engine for preprocessing (mPreprocessorEngine).

Returns
False if building the preprocessor engine fails.

◆ buildPredictionEngine()

bool SampleDynamicReshape::buildPredictionEngine ( const SampleUniquePtr< nvinfer1::IBuilder > &  builder)
private

Builds an engine for prediction (mPredictionEngine).

This function builds an engine for the MNIST model, and updates mPredictionInputDims and mPredictionOutputDims according to the dimensions specified by the model. The preprocessor reshapes inputs to mPredictionInputDims.

Returns
False if building the prediction engine fails.

◆ loadPGMFile()

Dims SampleDynamicReshape::loadPGMFile ( const std::string &  fileName)
private

Loads a PGM file into mInput and returns the dimensions of the loaded image.

This function loads the specified PGM file into the input host buffer.


◆ validateOutput()

bool SampleDynamicReshape::validateOutput ( int  digit)
private

Checks whether the model prediction (in mOutput) is correct.


◆ makeUnique()

template<typename T >
SampleUniquePtr<T> SampleDynamicReshape::makeUnique ( T *  t)
inlineprivate

Member Data Documentation

◆ mParams

samplesCommon::OnnxSampleParams SampleDynamicReshape::mParams
private

The parameters for the sample.

◆ mPredictionInputDims

nvinfer1::Dims SampleDynamicReshape::mPredictionInputDims
private

The dimensions of the input of the MNIST model.

◆ mPredictionOutputDims

nvinfer1::Dims SampleDynamicReshape::mPredictionOutputDims
private

The dimensions of the output of the MNIST model.

◆ mPreprocessorEngine

SampleUniquePtr<nvinfer1::ICudaEngine> SampleDynamicReshape::mPreprocessorEngine {nullptr}
private

◆ mPredictionEngine

SampleUniquePtr<nvinfer1::ICudaEngine> SampleDynamicReshape::mPredictionEngine {nullptr}
private

◆ mPreprocessorContext

SampleUniquePtr<nvinfer1::IExecutionContext> SampleDynamicReshape::mPreprocessorContext {nullptr}
private

◆ mPredictionContext

SampleUniquePtr<nvinfer1::IExecutionContext> SampleDynamicReshape::mPredictionContext {nullptr}
private

◆ mInput

samplesCommon::ManagedBuffer SampleDynamicReshape::mInput {}
private

Host and device buffers for the input.

◆ mPredictionInput

samplesCommon::DeviceBuffer SampleDynamicReshape::mPredictionInput {}
private

Device buffer for the output of the preprocessor, i.e. the input to the prediction model.

◆ mOutput

samplesCommon::ManagedBuffer SampleDynamicReshape::mOutput {}
private

Host buffer for the output.


The documentation for this class was generated from the following file: