TensorRT  7.2.1.6
NVIDIA TensorRT
SamplePlugin Class Reference

The SamplePlugin class implements samplePlugin.


Public Member Functions

 SamplePlugin (const samplesCommon::CaffeSampleParams &params)
 
 ~SamplePlugin ()
 
bool build ()
 Builds the network engine.
 
bool infer ()
 Runs the TensorRT inference engine for this sample.
 
bool teardown ()
 Used to clean up any state created in the sample class.
 

Private Types

template<typename T >
using SampleUniquePtr = std::unique_ptr< T, samplesCommon::InferDeleter >
 

Private Member Functions

void constructNetwork (SampleUniquePtr< nvinfer1::IBuilder > &builder, SampleUniquePtr< nvcaffeparser1::ICaffeParser > &parser, SampleUniquePtr< nvinfer1::INetworkDefinition > &network)
 Uses a Caffe parser to create the MNIST Network and marks the output layers.
 
bool processInput (const samplesCommon::BufferManager &buffers, const std::string &inputTensorName, int inputFileIdx) const
 Reads the input and mean data, preprocesses, and stores the result in a managed buffer.
 
bool verifyOutput (const samplesCommon::BufferManager &buffers, const std::string &outputTensorName, int groundTruthDigit) const
 Verifies that the output is correct and prints it.
 

Private Attributes

std::shared_ptr< nvinfer1::ICudaEngine > mEngine {nullptr}
 The TensorRT engine used to run the network.
 
samplesCommon::CaffeSampleParams mParams
 The parameters for the sample.
 
SampleUniquePtr< nvcaffeparser1::IBinaryProtoBlob > mMeanBlob
 The mean blob, which we need to keep around until build time.
 
nvinfer1::Dims mInputDims
 The dimensions of the input to the network.
 
PluginFactory runtimePluginFactory
 

Detailed Description

The SamplePlugin class implements samplePlugin.

It creates the network using a trained Caffe MNIST classification model, and replaces the final FC layer with a custom plugin layer.
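
The public interface is driven in three steps: build(), infer(), and teardown(). A minimal driver sketch, not the sample's actual main() (the function name runSamplePlugin and the way params is filled in are illustrative):

    #include <cstdlib>

    int runSamplePlugin(const samplesCommon::CaffeSampleParams& params)
    {
        SamplePlugin sample(params);   // only stores the parameters
        if (!sample.build())           // parse the Caffe model and build the engine
            return EXIT_FAILURE;
        if (!sample.infer())           // run inference on one digit and verify it
            return EXIT_FAILURE;
        if (!sample.teardown())        // release protobuf/parser state
            return EXIT_FAILURE;
        return EXIT_SUCCESS;
    }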

Member Typedef Documentation

◆ SampleUniquePtr

template<typename T >
using SamplePlugin::SampleUniquePtr = std::unique_ptr<T, samplesCommon::InferDeleter>
private
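
The alias pairs std::unique_ptr with the samples' InferDeleter, which releases TensorRT objects through their destroy() method rather than operator delete. A rough sketch of what the deleter does (the struct and alias names below are illustrative stand-ins; the real deleter lives in the samples' common.h):

    #include <memory>

    // Illustrative stand-in for samplesCommon::InferDeleter: TensorRT 7 objects
    // are released with obj->destroy(), not operator delete.
    struct InferDeleterSketch
    {
        template <typename T>
        void operator()(T* obj) const
        {
            if (obj)
                obj->destroy();
        }
    };

    template <typename T>
    using SampleUniquePtrSketch = std::unique_ptr<T, InferDeleterSketch>;

    // Usage: SampleUniquePtrSketch<nvinfer1::IBuilder> builder{nvinfer1::createInferBuilder(logger)};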

Constructor & Destructor Documentation

◆ SamplePlugin()

SamplePlugin::SamplePlugin ( const samplesCommon::CaffeSampleParams &  params )
inline

◆ ~SamplePlugin()

SamplePlugin::~SamplePlugin ( )
inline

Member Function Documentation

◆ build()

bool SamplePlugin::build ( )

Builds the network engine.

Creates the network, configures the builder and creates the network engine.

This function creates the MNIST network by parsing the Caffe model and builds the engine with a custom FC plugin layer.

Returns
Returns true if the engine was created successfully and false otherwise.
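
A condensed sketch of how build() typically proceeds with the TensorRT 7 builder API (error handling and the plugin-specific details are omitted; gLogger stands for the samples' ILogger instance):

    bool SamplePlugin::build()
    {
        auto builder = SampleUniquePtr<nvinfer1::IBuilder>(nvinfer1::createInferBuilder(gLogger));
        auto network = SampleUniquePtr<nvinfer1::INetworkDefinition>(builder->createNetworkV2(0U));
        auto parser = SampleUniquePtr<nvcaffeparser1::ICaffeParser>(nvcaffeparser1::createCaffeParser());

        constructNetwork(builder, parser, network);   // parse the Caffe model, mark outputs

        auto config = SampleUniquePtr<nvinfer1::IBuilderConfig>(builder->createBuilderConfig());
        config->setMaxWorkspaceSize(16 << 20);        // 16 MiB of builder scratch space

        mEngine = std::shared_ptr<nvinfer1::ICudaEngine>(
            builder->buildEngineWithConfig(*network, *config), samplesCommon::InferDeleter());

        mInputDims = network->getInput(0)->getDimensions();   // remember the CHW input shape
        return mEngine != nullptr;
    }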

◆ infer()

bool SamplePlugin::infer ( )

Runs the TensorRT inference engine for this sample.

This function is the main execution function of the sample. It allocates the buffer, sets inputs, executes the engine, and verifies the output.

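
A rough sketch of the infer() flow, assuming the samples' BufferManager helper from buffers.h and a synchronous execute() call; the ground-truth digit 3 is just an example input:

    bool SamplePlugin::infer()
    {
        samplesCommon::BufferManager buffers(mEngine, mParams.batchSize);   // host + device buffers per binding
        auto context = SampleUniquePtr<nvinfer1::IExecutionContext>(mEngine->createExecutionContext());

        const int digit = 3;                                                // example ground-truth digit
        if (!processInput(buffers, mParams.inputTensorNames[0], digit))    // fill the input host buffer
            return false;

        buffers.copyInputToDevice();                                        // host -> device
        if (!context->execute(mParams.batchSize, buffers.getDeviceBindings().data()))
            return false;
        buffers.copyOutputToHost();                                         // device -> host

        return verifyOutput(buffers, mParams.outputTensorNames[0], digit);
    }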

◆ teardown()

bool SamplePlugin::teardown ( )

Used to clean up any state created in the sample class.

Cleans up the libprotobuf files now that parsing is complete.

Note
It is not safe to use any other part of the protocol buffers library after ShutdownProtobufLibrary() has been called.
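
In practice this amounts to a single call into the Caffe parser library, roughly:

    bool SamplePlugin::teardown()
    {
        // Releases protobuf state owned by the Caffe parser; no other protobuf
        // functionality may be used after this point.
        nvcaffeparser1::shutdownProtobufLibrary();
        return true;
    }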

◆ constructNetwork()

void SamplePlugin::constructNetwork ( SampleUniquePtr< nvinfer1::IBuilder > &  builder,
SampleUniquePtr< nvcaffeparser1::ICaffeParser > &  parser,
SampleUniquePtr< nvinfer1::INetworkDefinition > &  network 
)
private

Uses a Caffe parser to create the MNIST Network and marks the output layers.


Parameters
network	Pointer to the network that will be populated with the MNIST network
builder	Pointer to the engine builder
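
A sketch of the parsing step with the TensorRT 7 Caffe parser API (the replacement of the final FC layer with the custom plugin, which the real sample performs, is omitted here):

    void SamplePlugin::constructNetwork(SampleUniquePtr<nvinfer1::IBuilder>& builder,
        SampleUniquePtr<nvcaffeparser1::ICaffeParser>& parser,
        SampleUniquePtr<nvinfer1::INetworkDefinition>& network)
    {
        const nvcaffeparser1::IBlobNameToTensor* blobNameToTensor = parser->parse(
            mParams.prototxtFileName.c_str(),    // deploy prototxt
            mParams.weightsFileName.c_str(),     // trained caffemodel
            *network, nvinfer1::DataType::kFLOAT);

        // Mark each requested blob so the builder keeps it as an engine output.
        for (const auto& outputName : mParams.outputTensorNames)
            network->markOutput(*blobNameToTensor->find(outputName.c_str()));

        // Parse the mean image now and keep it in mMeanBlob until build time.
        mMeanBlob = SampleUniquePtr<nvcaffeparser1::IBinaryProtoBlob>(
            parser->parseBinaryProto(mParams.meanFileName.c_str()));

        builder->setMaxBatchSize(mParams.batchSize);   // implicit-batch network
    }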

◆ processInput()

bool SamplePlugin::processInput ( const samplesCommon::BufferManager &  buffers,
const std::string &  inputTensorName,
int  inputFileIdx 
) const
private

Reads the input and mean data, preprocesses, and stores the result in a managed buffer.

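
A sketch of the preprocessing step, assuming the samples' readPGMFile and locateFile helpers from common.h; the input host buffer receives the pixel values with the mean image subtracted:

    bool SamplePlugin::processInput(const samplesCommon::BufferManager& buffers,
        const std::string& inputTensorName, int inputFileIdx) const
    {
        const int inputH = mInputDims.d[1];   // CHW layout: d[1] = height
        const int inputW = mInputDims.d[2];   //             d[2] = width

        std::vector<uint8_t> fileData(inputH * inputW);
        readPGMFile(locateFile(std::to_string(inputFileIdx) + ".pgm", mParams.dataDirs),
            fileData.data(), inputH, inputW);

        const float* meanData = static_cast<const float*>(mMeanBlob->getData());
        float* hostInput = static_cast<float*>(buffers.getHostBuffer(inputTensorName));
        for (int i = 0; i < inputH * inputW; i++)
            hostInput[i] = static_cast<float>(fileData[i]) - meanData[i];   // mean subtraction
        return true;
    }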

◆ verifyOutput()

bool SamplePlugin::verifyOutput ( const samplesCommon::BufferManager &  buffers,
const std::string &  outputTensorName,
int  groundTruthDigit 
) const
private

Verifies that the output is correct and prints it.

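
A sketch of the verification step: take the arg-max over the ten class scores in the output host buffer, print it, and compare it with the expected digit:

    bool SamplePlugin::verifyOutput(const samplesCommon::BufferManager& buffers,
        const std::string& outputTensorName, int groundTruthDigit) const
    {
        const float* prob = static_cast<const float*>(buffers.getHostBuffer(outputTensorName));
        const int kDigits = 10;   // MNIST classes 0-9

        int predicted = 0;
        for (int i = 1; i < kDigits; i++)
            if (prob[i] > prob[predicted])
                predicted = i;    // running arg-max over the class scores

        std::cout << "Detected digit: " << predicted << std::endl;   // requires <iostream>
        return predicted == groundTruthDigit;
    }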

Member Data Documentation

◆ mEngine

std::shared_ptr<nvinfer1::ICudaEngine> SamplePlugin::mEngine {nullptr}
private

The TensorRT engine used to run the network.

◆ mParams

samplesCommon::CaffeSampleParams SamplePlugin::mParams
private

The parameters for the sample.

◆ mMeanBlob

SampleUniquePtr<nvcaffeparser1::IBinaryProtoBlob> SamplePlugin::mMeanBlob
private

The mean blob, which we need to keep around until build time.

◆ mInputDims

nvinfer1::Dims SamplePlugin::mInputDims
private

The dimensions of the input to the network.

◆ runtimePluginFactory

PluginFactory SamplePlugin::runtimePluginFactory
private

The documentation for this class was generated from the following file: