“Hello World” For TensorRT


Description

This sample, sampleMNIST, is a simple hello world example that performs the basic setup and initialization of TensorRT using the Caffe parser.

How does this sample work?

This sample uses a Caffe model that was trained on the MNIST dataset.

Specifically, this sample:

    • Imports a trained Caffe model of the MNIST network using the Caffe parser
    • Builds a TensorRT engine from the parsed network
    • Uses the engine to run inference on an input digit image

To verify whether the engine is operating correctly, this sample picks a 28x28 image of a digit at random and runs inference on it using the engine it created. The output of the network is a probability distribution over the ten digits, showing which digit is most likely to be in the image.
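
Under the hood, this amounts to parsing the Caffe files into a TensorRT network and building an engine from it. The following is a minimal sketch only, assuming the TensorRT 7 C++ API and the nvcaffeparser1 Caffe parser; the `Logger` class, the `buildMnistEngine` helper, and the `"prob"` output blob name are illustrative assumptions, not the sample's exact code.

```
#include "NvInfer.h"
#include "NvCaffeParser.h"
#include <iostream>

using namespace nvinfer1;

// Minimal logger required by the TensorRT builder and runtime.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
} gLogger;

// Illustrative helper: parse the Caffe files and build an engine.
ICudaEngine* buildMnistEngine(const char* prototxt, const char* caffemodel)
{
    IBuilder* builder = createInferBuilder(gLogger);
    // The Caffe parser requires an implicit-batch network (flags = 0).
    INetworkDefinition* network = builder->createNetworkV2(0U);

    // Populate the network from the deploy file and the trained weights.
    auto parser = nvcaffeparser1::createCaffeParser();
    auto blobNameToTensor = parser->parse(prototxt, caffemodel, *network, DataType::kFLOAT);

    // Assumption: "prob" is the softmax output blob name in mnist.prototxt.
    network->markOutput(*blobNameToTensor->find("prob"));

    IBuilderConfig* config = builder->createBuilderConfig();
    config->setMaxWorkspaceSize(16 << 20); // 16 MiB of scratch space
    builder->setMaxBatchSize(1);

    ICudaEngine* engine = builder->buildEngineWithConfig(*network, *config);

    parser->destroy();
    network->destroy();
    config->destroy();
    builder->destroy();
    return engine;
}
```

The engine returned by such a helper is what the sample then uses, through an execution context, to run inference on the input image.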

TensorRT API layers and ops

In this sample, the following layers are used. For more information about these layers, see the TensorRT Developer Guide: Layers documentation.

Activation layer The Activation layer implements element-wise activation functions. Specifically, this sample uses the Activation layer with the type kRELU.

Convolution layer The Convolution layer computes a 2D (channel, height, and width) convolution, with or without bias.

FullyConnected layer The FullyConnected layer implements a matrix-vector product, with or without bias.

Pooling layer The Pooling layer implements pooling within a channel. Supported pooling types are maximum, average and maximum-average blend.

Scale layer The Scale layer implements a per-tensor, per-channel, or per-element affine transformation and/or exponentiation by constant values.

SoftMax layer The SoftMax layer applies the SoftMax function on the input tensor along an input dimension specified by the user.
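
In this sample all of these layers are created automatically by the Caffe parser, but the same layer types can also be added by hand through the `INetworkDefinition` API. The sketch below is illustrative only: the function name, the layer ordering, and the parameter values (output-map counts, kernel sizes, weights) are placeholders rather than the actual MNIST network.

```
#include "NvInfer.h"

using namespace nvinfer1;

// Illustrative only: how the layer types above map onto INetworkDefinition calls.
// The Weights arguments are assumed to be filled in elsewhere.
void addMnistStyleLayers(INetworkDefinition& network, ITensor& input,
                         Weights convW, Weights convB, Weights fcW, Weights fcB,
                         Weights shift, Weights scale, Weights power)
{
    // Scale layer: per-tensor affine transform (for example, mean subtraction).
    IScaleLayer* norm = network.addScale(input, ScaleMode::kUNIFORM, shift, scale, power);

    // Convolution layer: 2D convolution, here 20 output maps with a 5x5 kernel.
    IConvolutionLayer* conv = network.addConvolutionNd(
        *norm->getOutput(0), 20, DimsHW{5, 5}, convW, convB);

    // Pooling layer: 2x2 max pooling within each channel.
    IPoolingLayer* pool = network.addPoolingNd(
        *conv->getOutput(0), PoolingType::kMAX, DimsHW{2, 2});

    // FullyConnected layer: matrix-vector product with bias.
    IFullyConnectedLayer* fc = network.addFullyConnected(
        *pool->getOutput(0), 500, fcW, fcB);

    // Activation layer: element-wise ReLU.
    IActivationLayer* relu = network.addActivation(
        *fc->getOutput(0), ActivationType::kRELU);

    // SoftMax layer: probability distribution over the 10 digit classes.
    ISoftMaxLayer* prob = network.addSoftMax(*relu->getOutput(0));
    network.markOutput(*prob->getOutput(0));
}
```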

Running the sample

  1. Compile this sample by running `make` in the `<TensorRT root directory>/samples/sampleMNIST` directory. The binary named `sample_mnist` will be created in the `<TensorRT root directory>/bin` directory.
    ```
    cd <TensorRT root directory>/samples/sampleMNIST
    make
    ```
    Where `<TensorRT root directory>` is where you installed TensorRT.
  2. Run the sample to perform inference on the digit:
    ```
    ./sample_mnist [-h] [--datadir=/path/to/data/dir/] [--useDLA=N] [--fp16 or --int8]
    ```
    This sample reads three Caffe files to build the network:

    • mnist.prototxt The prototxt file that contains the network design.
    • mnist.caffemodel The model file which contains the trained weights for the network.
    • mnist_mean.binaryproto The binaryproto file which contains the means.

    This sample can also be run in FP16 and INT8 modes; a sketch of how these options map onto builder flags follows this list.

    Note: By default, the sample expects these files to be in either the data/samples/mnist/ or data/mnist/ directories. The list of default directories can be changed by adding one or more paths with --datadir=/new/path/ as a command line argument.

  3. Verify that the sample ran successfully. If the sample runs successfully, you should see output similar to the following; an ASCII rendering of the input image with digit 3:
    ```
    &&&& RUNNING TensorRT.sample_mnist # ./sample_mnist
    [I] Building and running a GPU inference engine for MNIST
    [I] Input:
    <28x28 ASCII-art rendering of the digit 3>

    [I] Output:
    0:
    1:
    2:
    3: **********
    4:
    5:
    6:
    7:
    8:
    9:

    &&&& PASSED TensorRT.sample_mnist # ./sample_mnist
    ```
    This output shows that the sample ran successfully; PASSED.
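
The Output block above is the network's 10-element softmax result, with asterisks marking the most probable digit. Below is a minimal, self-contained sketch of how such a buffer can be interpreted once it has been copied back from the device; the example values are made up:

```
#include <algorithm>
#include <cstdio>

// Pick the most likely digit from the 10 softmax outputs.
int predictedDigit(const float (&output)[10])
{
    return static_cast<int>(std::max_element(output, output + 10) - output);
}

int main()
{
    // Made-up probabilities resembling the sample's output for a digit 3.
    float output[10] = {0.01f, 0.0f, 0.02f, 0.93f, 0.0f, 0.01f, 0.0f, 0.01f, 0.01f, 0.01f};
    std::printf("Predicted digit: %d\n", predictedDigit(output)); // prints 3
    return 0;
}
```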
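
As mentioned in step 2, the sample also accepts --fp16 and --int8. The sketch below shows how such options are typically turned into builder settings with the TensorRT 7 `IBuilderConfig` API; the `configurePrecision` helper and its arguments are assumptions for illustration, and a real INT8 build additionally needs calibration data or explicit dynamic ranges.

```
#include "NvInfer.h"

// Illustrative: enable reduced precision on an existing builder configuration.
// `builder`, `config`, and `calibrator` are assumed to come from the
// engine-building code, as in the sketch earlier on this page.
void configurePrecision(nvinfer1::IBuilder& builder, nvinfer1::IBuilderConfig& config,
                        bool useFp16, bool useInt8, nvinfer1::IInt8Calibrator* calibrator)
{
    if (useFp16 && builder.platformHasFastFp16())
        config.setFlag(nvinfer1::BuilderFlag::kFP16);

    if (useInt8 && builder.platformHasFastInt8())
    {
        config.setFlag(nvinfer1::BuilderFlag::kINT8);
        // INT8 needs a calibrator (or per-tensor dynamic ranges) to choose scales.
        config.setInt8Calibrator(calibrator);
    }
}
```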

Sample --help options

To see the full list of available options and their descriptions, use the -h or --help command line option.

Additional resources

The following resources provide a deeper understanding of sampleMNIST:

• MNIST
• Documentation

License

For terms and conditions for use, reproduction, and distribution, see the TensorRT Software License Agreement documentation.

Changelog

February 2019 This README.md file was recreated, updated and reviewed.

Known issues

There are no known issues in this sample.