The SampleINT8 class implements the INT8 sample.
Public Member Functions

 SampleINT8 (const SampleINT8Params &params)

bool build (DataType dataType)
    Builds the network engine.

bool isSupported (DataType dataType)
    Checks if the platform supports the data type.

bool infer (std::vector< float > &score, int firstScoreBatch, int nbScoreBatches)
    Runs the TensorRT inference engine for this sample.

bool teardown ()
    Cleans up any state created in the sample class.

Private Types

template<typename T >
using SampleUniquePtr = std::unique_ptr< T, samplesCommon::InferDeleter >

Private Member Functions

bool constructNetwork (SampleUniquePtr< nvinfer1::IBuilder > &builder, SampleUniquePtr< nvinfer1::INetworkDefinition > &network, SampleUniquePtr< nvinfer1::IBuilderConfig > &config, SampleUniquePtr< nvcaffeparser1::ICaffeParser > &parser, DataType dataType)
    Parses a Caffe model and creates a TensorRT network.

bool processInput (const samplesCommon::BufferManager &buffers, const float *data)
    Reads the input and stores it in a managed buffer.

int calculateScore (const samplesCommon::BufferManager &buffers, float *labels, int batchSize, int outputSize, int threshold)
    Scores the model.

Private Attributes

SampleINT8Params mParams
    The parameters for the sample.

nvinfer1::Dims mInputDims
    The dimensions of the input to the network.

std::shared_ptr< nvinfer1::ICudaEngine > mEngine
    The TensorRT engine used to run the network.
The SampleINT8 class implements the INT8 sample.
It creates the network using a Caffe model.
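
A minimal sketch of how the class might be driven through its documented public interface. The driver function, the list of data types, and the firstScoreBatch/nbScoreBatches values are illustrative assumptions, not part of the sample; `params` is assumed to be a populated SampleINT8Params.

    // Hypothetical driver exercising the documented public API of SampleINT8.
    // Assumes the sample's headers are included and `params` is already filled in.
    bool runSample(const SampleINT8Params& params)
    {
        SampleINT8 sample(params);
        const std::vector<DataType> dataTypes{DataType::kFLOAT, DataType::kHALF, DataType::kINT8};
        for (DataType dataType : dataTypes)
        {
            if (!sample.isSupported(dataType))
                continue;                       // skip precisions the platform cannot run
            if (!sample.build(dataType))
                return false;                   // parsing or engine construction failed
            std::vector<float> score;
            if (!sample.infer(score, 16, 100))  // firstScoreBatch/nbScoreBatches are illustrative
                return false;
        }
        return sample.teardown();               // releases parser state
    }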
bool SampleINT8::build ( DataType dataType )
Builds the network engine.
Creates the network by parsing the Caffe model, configures the builder, and builds the engine that will be used to run the model (mEngine).
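
A hedged sketch of the builder configuration such a build() typically performs for INT8. The `logger` and `calibrator` objects are placeholders not defined on this page, and the exact flags and calibrator used by the sample are assumptions; the TensorRT calls shown are the standard pre-TensorRT-8 builder API that matches the signatures above.

    // Sketch of an INT8 build path consistent with the member signatures above.
    auto builder = SampleUniquePtr<nvinfer1::IBuilder>(nvinfer1::createInferBuilder(logger));
    auto network = SampleUniquePtr<nvinfer1::INetworkDefinition>(builder->createNetworkV2(0U));
    auto config  = SampleUniquePtr<nvinfer1::IBuilderConfig>(builder->createBuilderConfig());
    auto parser  = SampleUniquePtr<nvcaffeparser1::ICaffeParser>(nvcaffeparser1::createCaffeParser());

    // constructNetwork() (documented below) populates `network` from the Caffe files.
    if (!constructNetwork(builder, network, config, parser, dataType))
        return false;

    if (dataType == DataType::kINT8)
    {
        config->setFlag(nvinfer1::BuilderFlag::kINT8);
        config->setInt8Calibrator(calibrator); // assumed IInt8Calibrator* supplied by the sample
    }

    mEngine = std::shared_ptr<nvinfer1::ICudaEngine>(
        builder->buildEngineWithConfig(*network, *config), samplesCommon::InferDeleter());
    return mEngine != nullptr;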
bool SampleINT8::isSupported ( DataType dataType )
Checks if the platform supports the data type.
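
The check presumably relies on the builder's platform queries; a minimal sketch, assuming that policy and a placeholder `logger`:

    // Sketch: reject kINT8/kHALF when the platform lacks fast native support.
    bool isSupported(DataType dataType)
    {
        auto builder = SampleUniquePtr<nvinfer1::IBuilder>(nvinfer1::createInferBuilder(logger));
        if (!builder)
            return false;
        if (dataType == DataType::kINT8 && !builder->platformHasFastInt8())
            return false;
        if (dataType == DataType::kHALF && !builder->platformHasFastFp16())
            return false;
        return true;
    }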
bool SampleINT8::infer ( std::vector< float > &score, int firstScoreBatch, int nbScoreBatches )
Runs the TensorRT inference engine for this sample.
This function is the main execution path of the sample. It allocates the buffers, sets the inputs, and executes the engine.
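
A hedged sketch of the execution loop described above, using the samplesCommon::BufferManager that appears in the member signatures. The real infer() iterates over nbScoreBatches batches starting at firstScoreBatch; only a single batch is shown, and `batchData`, `labels`, `outputSize`, and the mParams.batchSize field are assumptions.

    // Simplified single-batch execution path.
    samplesCommon::BufferManager buffers(mEngine, mParams.batchSize);
    auto context = SampleUniquePtr<nvinfer1::IExecutionContext>(mEngine->createExecutionContext());
    if (!context)
        return false;

    if (!processInput(buffers, batchData)) // copy one batch of input into the host buffer
        return false;

    buffers.copyInputToDevice();
    if (!context->execute(mParams.batchSize, buffers.getDeviceBindings().data()))
        return false;
    buffers.copyOutputToHost();

    // Compare the network output against the labels for this batch (top-1 shown).
    int top1 = calculateScore(buffers, labels, mParams.batchSize, outputSize, 1);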
bool SampleINT8::teardown ( )
Cleans up any state created in the sample class.
Cleans up the libprotobuf state, as parsing is complete.
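
The protobuf cleanup mentioned above maps to the Caffe parser's shutdown helper; a sketch, assuming that is all the state to release:

    bool teardown()
    {
        // Release state held by the Caffe parser / protobuf once parsing is finished.
        nvcaffeparser1::shutdownProtobufLibrary();
        return true;
    }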
bool SampleINT8::constructNetwork ( SampleUniquePtr< nvinfer1::IBuilder > &builder, SampleUniquePtr< nvinfer1::INetworkDefinition > &network, SampleUniquePtr< nvinfer1::IBuilderConfig > &config, SampleUniquePtr< nvcaffeparser1::ICaffeParser > &parser, DataType dataType )
Parses a Caffe model and creates a TensorRT network.
Uses the Caffe parser to create the network and marks the output layers.
Parameters
    network    Pointer to the network that will be populated with the parsed model
    builder    Pointer to the engine builder
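
A hedged sketch of the parse-and-mark-outputs step, using the standard ICaffeParser::parse() and INetworkDefinition::markOutput() calls. The field names on mParams (prototxtFileName, weightsFileName, outputTensorNames) are placeholders, and parsing weights as kFLOAT for the INT8 path is an assumption.

    // Parse the Caffe prototxt/caffemodel pair into `network`, then mark the output blobs.
    const nvcaffeparser1::IBlobNameToTensor* blobNameToTensor = parser->parse(
        mParams.prototxtFileName.c_str(),
        mParams.weightsFileName.c_str(),
        *network,
        dataType == DataType::kINT8 ? DataType::kFLOAT : dataType);
    if (!blobNameToTensor)
        return false;

    for (const std::string& outputName : mParams.outputTensorNames)
    {
        network->markOutput(*blobNameToTensor->find(outputName.c_str()));
    }
    return true;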
bool SampleINT8::processInput ( const samplesCommon::BufferManager &buffers, const float *data )
Reads the input and stores it in a managed buffer.
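
A minimal sketch of copying one batch into the managed input buffer. The input tensor name, the mParams fields, and the assumption that `data` already holds a full batch in the network's layout are all placeholders.

    // Copy a prepared batch into the host-side input buffer managed by BufferManager.
    // Requires <cstring>; mParams.inputTensorNames[0] and batchSize are assumptions.
    float* hostInput = static_cast<float*>(buffers.getHostBuffer(mParams.inputTensorNames[0]));
    const size_t inputSize = samplesCommon::volume(mInputDims) * mParams.batchSize;
    std::memcpy(hostInput, data, inputSize * sizeof(float));
    return true;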
int SampleINT8::calculateScore ( const samplesCommon::BufferManager &buffers, float *labels, int batchSize, int outputSize, int threshold )
Scores the model.
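
A hedged sketch of a top-N scoring pass consistent with the signature above: for each image in the batch, count it as correct if its label is among the `threshold` highest-scoring outputs. This reading of `threshold` and the mParams.outputTensorNames[0] lookup are assumptions.

    // Count batch items whose ground-truth label falls within the top-`threshold`
    // network outputs (e.g. threshold = 1 for top-1, 5 for top-5).
    const float* probs = static_cast<const float*>(buffers.getHostBuffer(mParams.outputTensorNames[0]));
    int success = 0;
    for (int i = 0; i < batchSize; i++)
    {
        const float* p = probs + i * outputSize;
        const int correct = static_cast<int>(labels[i]);
        int rank = 0;
        for (int j = 0; j < outputSize; j++)
        {
            if (j != correct && p[j] >= p[correct])
                rank++; // number of classes scoring at least as high as the label
        }
        if (rank < threshold)
            success++;
    }
    return success;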
SampleINT8Params SampleINT8::mParams
The parameters for the sample.
nvinfer1::Dims SampleINT8::mInputDims
The dimensions of the input to the network.
std::shared_ptr< nvinfer1::ICudaEngine > SampleINT8::mEngine
The TensorRT engine used to run the network.