TensorRT  7.2.1.6
NVIDIA TensorRT
polygraphy.backend.tf.loader.CreateConfig Class Reference

Public Member Functions

def __init__ (self, gpu_memory_fraction=None, allow_growth=None, use_xla=None)
 
def __call__ (self)
 

Public Attributes

 gpu_memory_fraction
 
 allow_growth
 
 use_xla
 

Constructor & Destructor Documentation

◆ __init__()

def polygraphy.backend.tf.loader.CreateConfig.__init__(self, gpu_memory_fraction=None, allow_growth=None, use_xla=None)
Functor that creates a TensorFlow config.

Args:
    gpu_memory_fraction (float):
            The fraction of GPU memory that will be made available to TensorFlow.
            This should be a value between 0.0 and 1.0.
    allow_growth (bool): Whether to allow GPU memory allocated by TensorFlow to grow.
    use_xla (bool): Whether to attempt to enable XLA.

Member Function Documentation

◆ __call__()

def polygraphy.backend.tf.loader.CreateConfig.__call__(self)
Creates a TensorFlow config.

Returns:
    tf.ConfigProto: The TensorFlow config.
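Taken together, __init__ stores the options and __call__ materializes the config on demand (the "functor" pattern). A minimal plain-Python sketch of that pattern, runnable without TensorFlow installed: a dict stands in for tf.ConfigProto, and the class name and dict keys below are illustrative assumptions, not Polygraphy's actual implementation.

```python
class CreateConfigSketch:
    """Illustrative stand-in for polygraphy.backend.tf.loader.CreateConfig.

    Stores TensorFlow session options at construction time and builds
    the config only when called.
    """

    def __init__(self, gpu_memory_fraction=None, allow_growth=None, use_xla=None):
        # Default to the whole GPU; the documented valid range is 0.0-1.0.
        self.gpu_memory_fraction = 1.0 if gpu_memory_fraction is None else gpu_memory_fraction
        if not 0.0 <= self.gpu_memory_fraction <= 1.0:
            raise ValueError("gpu_memory_fraction must be between 0.0 and 1.0")
        self.allow_growth = allow_growth
        self.use_xla = use_xla

    def __call__(self):
        # The real loader returns a tf.ConfigProto; a dict stands in here
        # so the sketch stays self-contained.
        config = {
            "per_process_gpu_memory_fraction": self.gpu_memory_fraction,
            "allow_growth": bool(self.allow_growth),
        }
        if self.use_xla:
            # Hypothetical key: real TensorFlow enables XLA via the
            # graph-level JIT option in the config's optimizer settings.
            config["global_jit_level"] = "ON_1"
        return config


# Usage: construct once with the desired options, call to build the config.
create_config = CreateConfigSketch(gpu_memory_fraction=0.5, allow_growth=True, use_xla=True)
config = create_config()
print(config)
```

Deferring construction to __call__ lets Polygraphy-style loaders be composed and passed around cheaply, with the (potentially stateful) TensorFlow objects created only when actually needed.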

Member Data Documentation

◆ gpu_memory_fraction

polygraphy.backend.tf.loader.CreateConfig.gpu_memory_fraction

◆ allow_growth

polygraphy.backend.tf.loader.CreateConfig.allow_growth

◆ use_xla

polygraphy.backend.tf.loader.CreateConfig.use_xla

The documentation for this class was generated from the following file: