NVIDIA TensorRT 7.2.1.6
SampleOnnxMnistCoordConvAC Class Reference

The SampleOnnxMnistCoordConvAC class implements the ONNX MNIST sample. More...


Public Member Functions

 SampleOnnxMnistCoordConvAC (const samplesCommon::OnnxSampleParams &params)
 
bool build ()
 Function builds the network engine. More...
 
bool infer ()
 Runs the TensorRT inference engine for this sample. More...
 

Private Types

template<typename T >
using SampleUniquePtr = std::unique_ptr< T, samplesCommon::InferDeleter >
 

Private Member Functions

bool constructNetwork (SampleUniquePtr< nvinfer1::IBuilder > &builder, SampleUniquePtr< nvinfer1::INetworkDefinition > &network, SampleUniquePtr< nvinfer1::IBuilderConfig > &config, SampleUniquePtr< nvonnxparser::IParser > &parser)
 Parses an ONNX model for MNIST and creates a TensorRT network. More...
 
bool processInput (const samplesCommon::BufferManager &buffers)
 Reads the input and stores the result in a managed buffer. More...
 
bool verifyOutput (const samplesCommon::BufferManager &buffers)
 Classifies digits and verifies the result. More...
 

Private Attributes

samplesCommon::OnnxSampleParams mParams
 The parameters for the sample. More...
 
nvinfer1::Dims mInputDims
 The dimensions of the input to the network. More...
 
nvinfer1::Dims mOutputDims
 The dimensions of the output to the network. More...
 
int mNumber {0}
 The number to classify. More...
 
std::shared_ptr< nvinfer1::ICudaEngine > mEngine
 The TensorRT engine used to run the network. More...
 

Detailed Description

The SampleOnnxMnistCoordConvAC class implements the ONNX MNIST sample.

It creates the network using an ONNX model.
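
A minimal usage sketch, assuming the sample is driven the way the other TensorRT ONNX samples are (a main() that fills OnnxSampleParams from the command line and calls build() followed by infer()); the argument-parsing helper and the model file name are assumptions, not taken from this page:

    #include "argsParser.h"
    #include "common.h"

    // Hypothetical driver; only the constructor, build() and infer() are documented members of this class.
    int main(int argc, char** argv)
    {
        samplesCommon::Args args;
        samplesCommon::parseArgs(args, argc, argv);          // assumed helper from the samples' common code

        samplesCommon::OnnxSampleParams params;
        params.onnxFileName = "mnist_with_coordconv.onnx";   // assumed model file name
        params.dataDirs = args.dataDirs;

        SampleOnnxMnistCoordConvAC sample(params);           // constructor documented below

        if (!sample.build())                                 // parse the ONNX model and build mEngine
            return EXIT_FAILURE;
        if (!sample.infer())                                 // run inference and verify the output
            return EXIT_FAILURE;
        return EXIT_SUCCESS;
    }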

Member Typedef Documentation

◆ SampleUniquePtr

template<typename T >
using SampleOnnxMnistCoordConvAC::SampleUniquePtr = std::unique_ptr<T, samplesCommon::InferDeleter>
private
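
InferDeleter is the samples' custom deleter that releases TensorRT objects through their destroy() method instead of operator delete. A small illustrative sketch (the definition shown here follows the pattern used in the samples' common headers and should be treated as an assumption rather than the exact code):

    // Sketch of the deleter the alias is built on; the real definition lives in the samples' common code.
    struct InferDeleter
    {
        template <typename T>
        void operator()(T* obj) const
        {
            if (obj)
            {
                obj->destroy(); // TensorRT 7.x interfaces are released via destroy(), not delete
            }
        }
    };

    // With the alias, builder/network/config/parser handles clean up automatically on scope exit, e.g.:
    // SampleUniquePtr<nvinfer1::IBuilder> builder{nvinfer1::createInferBuilder(logger)};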

Constructor & Destructor Documentation

◆ SampleOnnxMnistCoordConvAC()

SampleOnnxMnistCoordConvAC::SampleOnnxMnistCoordConvAC ( const samplesCommon::OnnxSampleParams &  params)
inline

Member Function Documentation

◆ build()

bool SampleOnnxMnistCoordConvAC::build ( )

Function builds the network engine.

Creates the network, configures the builder and creates the network engine.

This function creates the ONNX MNIST network by parsing the ONNX model and builds the engine that will be used to run MNIST (mEngine).

Returns
Returns true if the engine was created successfully and false otherwise
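A sketch of the usual build flow behind this function, following the pattern the other TensorRT 7.x ONNX samples use; the logger object (sample::gLogger) is an assumption, not documented on this page:

    #include "NvInfer.h"
    #include "NvOnnxParser.h"

    bool SampleOnnxMnistCoordConvAC::build()
    {
        auto builder = SampleUniquePtr<nvinfer1::IBuilder>(
            nvinfer1::createInferBuilder(sample::gLogger.getTRTLogger()));   // logger name assumed
        if (!builder) return false;

        const auto explicitBatch =
            1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
        auto network = SampleUniquePtr<nvinfer1::INetworkDefinition>(builder->createNetworkV2(explicitBatch));
        auto config = SampleUniquePtr<nvinfer1::IBuilderConfig>(builder->createBuilderConfig());
        auto parser = SampleUniquePtr<nvonnxparser::IParser>(
            nvonnxparser::createParser(*network, sample::gLogger.getTRTLogger()));
        if (!network || !config || !parser) return false;

        if (!constructNetwork(builder, network, config, parser))            // parses the ONNX model
            return false;

        // buildEngineWithConfig returns a raw ICudaEngine*; mEngine takes shared ownership of it.
        mEngine = std::shared_ptr<nvinfer1::ICudaEngine>(
            builder->buildEngineWithConfig(*network, *config), samplesCommon::InferDeleter());
        if (!mEngine) return false;

        mInputDims = network->getInput(0)->getDimensions();                 // remembered for processInput()
        mOutputDims = network->getOutput(0)->getDimensions();
        return true;
    }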

◆ infer()

bool SampleOnnxMnistCoordConvAC::infer ( )

Runs the TensorRT inference engine for this sample.

This function is the main execution function of the sample. It allocates the buffers, sets the inputs, and executes the engine.

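A sketch of the typical inference path, assuming the samples' BufferManager is used to pair host and device buffers for every engine binding:

    #include "buffers.h"

    bool SampleOnnxMnistCoordConvAC::infer()
    {
        // BufferManager allocates host/device buffers for all input and output bindings of mEngine.
        samplesCommon::BufferManager buffers(mEngine);

        auto context = SampleUniquePtr<nvinfer1::IExecutionContext>(mEngine->createExecutionContext());
        if (!context) return false;

        if (!processInput(buffers))                          // fill the host-side input buffer
            return false;

        buffers.copyInputToDevice();                         // host -> device
        if (!context->executeV2(buffers.getDeviceBindings().data()))
            return false;
        buffers.copyOutputToHost();                          // device -> host

        return verifyOutput(buffers);                        // check the classification result
    }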

◆ constructNetwork()

bool SampleOnnxMnistCoordConvAC::constructNetwork ( SampleUniquePtr< nvinfer1::IBuilder > &  builder,
SampleUniquePtr< nvinfer1::INetworkDefinition > &  network,
SampleUniquePtr< nvinfer1::IBuilderConfig > &  config,
SampleUniquePtr< nvonnxparser::IParser > &  parser 
)
private

Parses an ONNX model for MNIST and creates a TensorRT network.

Uses an ONNX parser to create the ONNX MNIST network and marks the output layers.

Parameters
network	Pointer to the network that will be populated with the ONNX MNIST network
builder	Pointer to the engine builder
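A sketch of how this step usually looks in the ONNX samples; locateFile, the reportable-severity call, the workspace size and the precision handling are assumptions rather than details documented here:

    bool SampleOnnxMnistCoordConvAC::constructNetwork(SampleUniquePtr<nvinfer1::IBuilder>& builder,
        SampleUniquePtr<nvinfer1::INetworkDefinition>& network,
        SampleUniquePtr<nvinfer1::IBuilderConfig>& config,
        SampleUniquePtr<nvonnxparser::IParser>& parser)
    {
        // Parse the ONNX file straight into the network; the parser marks the graph outputs itself.
        const bool parsed = parser->parseFromFile(
            locateFile(mParams.onnxFileName, mParams.dataDirs).c_str(),     // samples helper + fields assumed
            static_cast<int>(sample::gLogger.getReportableSeverity()));
        if (!parsed) return false;

        config->setMaxWorkspaceSize(16 * (1 << 20));         // 16 MiB; the actual size is an assumption
        if (mParams.fp16)                                    // OnnxSampleParams carries precision switches
            config->setFlag(nvinfer1::BuilderFlag::kFP16);
        return true;
    }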

◆ processInput()

bool SampleOnnxMnistCoordConvAC::processInput ( const samplesCommon::BufferManager &  buffers)
private

Reads the input and stores the result in a managed buffer.

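A sketch of the input handling used by the MNIST samples: pick a random digit, read its PGM image, and write normalized floats into the host staging buffer. The readPGMFile/locateFile helpers, the NCHW dimension layout and the [0,1] inversion are assumptions:

    #include <cstdint>
    #include <random>
    #include <string>
    #include <vector>
    #include "common.h"

    bool SampleOnnxMnistCoordConvAC::processInput(const samplesCommon::BufferManager& buffers)
    {
        const int inputH = mInputDims.d[2];                  // explicit-batch NCHW layout assumed
        const int inputW = mInputDims.d[3];

        // Pick a random digit 0-9 and load the matching PGM file shipped with the samples.
        std::vector<uint8_t> fileData(inputH * inputW);
        std::random_device rd;
        mNumber = std::uniform_int_distribution<int>(0, 9)(rd);
        readPGMFile(locateFile(std::to_string(mNumber) + ".pgm", mParams.dataDirs),
            fileData.data(), inputH, inputW);                // readPGMFile/locateFile are samples helpers

        // Normalize the 8-bit pixels into [0,1] floats in the host buffer bound to the input tensor.
        float* hostDataBuffer = static_cast<float*>(buffers.getHostBuffer(mParams.inputTensorNames[0]));
        for (int i = 0; i < inputH * inputW; ++i)
            hostDataBuffer[i] = 1.0f - float(fileData[i]) / 255.0f;

        return true;
    }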

◆ verifyOutput()

bool SampleOnnxMnistCoordConvAC::verifyOutput ( const samplesCommon::BufferManager &  buffers)
private

Classifies digits and verifies the result.

Returns
whether the classification output matches expectations
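A sketch of the verification step: take the argmax over the ten class scores in the host output buffer and compare it with the digit chosen in processInput(). The output layout (1 x 10) is an assumption:

    bool SampleOnnxMnistCoordConvAC::verifyOutput(const samplesCommon::BufferManager& buffers)
    {
        const int outputSize = mOutputDims.d[1];             // 10 digit classes, NC layout assumed
        const float* output = static_cast<const float*>(
            buffers.getHostBuffer(mParams.outputTensorNames[0]));

        // Find the class with the highest score; the raw scores may be logits or softmax outputs.
        int predicted = 0;
        float best = output[0];
        for (int i = 1; i < outputSize; ++i)
        {
            if (output[i] > best)
            {
                best = output[i];
                predicted = i;
            }
        }
        return predicted == mNumber;                         // matches the digit picked in processInput()
    }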

Member Data Documentation

◆ mParams

samplesCommon::OnnxSampleParams SampleOnnxMnistCoordConvAC::mParams
private

The parameters for the sample.

◆ mInputDims

nvinfer1::Dims SampleOnnxMnistCoordConvAC::mInputDims
private

The dimensions of the input to the network.

◆ mOutputDims

nvinfer1::Dims SampleOnnxMnistCoordConvAC::mOutputDims
private

The dimensions of the output to the network.

◆ mNumber

int SampleOnnxMnistCoordConvAC::mNumber {0}
private

The number to classify.

◆ mEngine

std::shared_ptr<nvinfer1::ICudaEngine> SampleOnnxMnistCoordConvAC::mEngine
private

The TensorRT engine used to run the network.


The documentation for this class was generated from the following file: