This sample, sampleMNIST, is a simple hello world example that performs the basic setup and initialization of TensorRT using the Caffe parser.
This sample uses a Caffe model that was trained on the MNIST dataset.
To verify that the engine is operating correctly, this sample picks a 28x28 image of a digit at random and runs inference on it using the engine it created. The output of the network is a probability distribution over the ten digits, showing which digit is most likely to be in the image.
In this sample, the following layers are used. For more information about these layers, see the TensorRT Developer Guide: Layers documentation.
- **Activation layer**: The Activation layer implements element-wise activation functions. Specifically, this sample uses the Activation layer with type `kRELU`.
- **Convolution layer**: The Convolution layer computes a 2D (channel, height, and width) convolution, with or without bias.
- **FullyConnected layer**: The FullyConnected layer implements a matrix-vector product, with or without bias.
- **Pooling layer**: The Pooling layer implements pooling within a channel. Supported pooling types are `maximum`, `average`, and `maximum-average blend`.
- **Scale layer**: The Scale layer implements a per-tensor, per-channel, or per-element affine transformation and/or exponentiation by constant values.
- **SoftMax layer**: The SoftMax layer applies the SoftMax function on the input tensor along an input dimension specified by the user.
Compile this sample by running `make` in the `<TensorRT root directory>/samples/sampleMNIST` directory. The binary named `sample_mnist` will be created in the `<TensorRT root directory>/bin` directory.

```
cd <TensorRT root directory>/samples/sampleMNIST
make
```

Where `<TensorRT root directory>` is where you installed TensorRT.

Run the sample to perform inference on the digit:

```
./sample_mnist [-h] [--datadir=/path/to/data/dir/] [--useDLA=N] [--fp16 or --int8]
```

This sample reads three Caffe files to build the network:
- `mnist.prototxt`: The prototxt file that contains the network design.
- `mnist.caffemodel`: The model file that contains the trained weights for the network.
- `mnist_mean.binaryproto`: The binaryproto file that contains the means.

This sample can also be run in FP16 and INT8 modes.
**Note:** By default, the sample expects these files to be in either the `data/samples/mnist/` or `data/mnist/` directories. The list of default directories can be changed by adding one or more paths with `--datadir=/new/path/` as a command line argument.
Verify that the sample ran successfully. If the sample runs successfully, you should see output similar to the following: an ASCII rendering of the input image with digit 3, followed by the output distribution:

```
&&&& RUNNING TensorRT.sample_mnist # ./sample_mnist
[I] Building and running a GPU inference engine for MNIST
[I] Input:
(ASCII-art rendering of the 28x28 input image of the digit 3)
[I] Output:
0:
1:
2:
3: **********
4:
5:
6:
7:
8:
9:
&&&& PASSED TensorRT.sample_mnist # ./sample_mnist
```
This output shows that the sample ran successfully; `PASSED`.
To see the full list of available options and their descriptions, use the `-h` or `--help` command line option.
The following resources provide a deeper understanding about sampleMNIST:

- MNIST
- Documentation
For terms and conditions for use, reproduction, and distribution, see the TensorRT Software License Agreement documentation.
February 2019: This `README.md` file was recreated, updated and reviewed.
There are no known issues in this sample.