Tensorflow Config

Source: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/protobuf/config.proto
https://stackoverflow.com/questions/44873273/
https://www.tensorflow.org/guide/using_gpu

1. allow_soft_placement=True

Some TensorFlow operations, such as MatMul, have GPU implementations; others only run on the CPU.

// Whether soft placement is allowed. If allow_soft_placement is true,
// an op will be placed on CPU if
//   1. there's no GPU implementation for the OP
// or
//   2. no GPU devices are known or registered
// or
//   3. need to co-locate with reftype input(s) which are from CPU.
bool allow_soft_placement = 7;


So, if you do:

    with tf.device('/gpu:0'):
        # ... an op here that has no GPU implementation ...

this will throw an error unless allow_soft_placement=True is set.
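A minimal sketch of enabling soft placement via the TF1-style `tf.compat.v1` API (assuming TensorFlow is installed; on a machine with no GPU the op is silently re-placed on the CPU instead of raising an error):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Allow TensorFlow to fall back to CPU when an op pinned to GPU
# has no GPU kernel, or when no GPU device is available.
config = tf.compat.v1.ConfigProto(allow_soft_placement=True)

with tf.compat.v1.Session(config=config) as sess:
    with tf.device('/gpu:0'):
        a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
        b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
        c = tf.matmul(a, b)
    # Without allow_soft_placement=True, this run would raise an
    # InvalidArgumentError on a GPU-less machine.
    result = sess.run(c)
```

Dropping `allow_soft_placement=True` from the ConfigProto above is the quickest way to reproduce the error the section describes.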


2. log_device_placement=True

// Whether device placements should be logged.
bool log_device_placement = 8;

With log_device_placement=True, the runtime logs which device each operation is mapped to, e.g. /device:CPU:0 or /device:GPU:0.
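A short sketch of turning on placement logging (same `tf.compat.v1` API as above; the exact log format shown in the comment is an illustrative example, not a guaranteed string):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# With log_device_placement=True the runtime prints one line per op
# when it is placed, along the lines of:
#   add: (AddV2): /job:localhost/replica:0/task:0/device:CPU:0
config = tf.compat.v1.ConfigProto(log_device_placement=True)

with tf.compat.v1.Session(config=config) as sess:
    a = tf.constant([1.0, 2.0])
    b = tf.constant([3.0, 4.0])
    result = sess.run(a + b)
```

The placement lines go to stderr via TensorFlow's logging, so they appear in the console rather than in `result`.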


3. gpu_options.allow_growth = True

// Dynamically grow the memory used on the GPU as needed.
// Without allow_growth, TensorFlow allocates nearly all of the graphics
// card's memory for that one process at startup, even if the process
// needs far less, and other processes can't use it.
// With allow_growth=True, the process starts with a small allocation and
// grows it on demand, leaving the rest of the memory available to other
// processes.
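A sketch of setting allow_growth on the session config. The commented-out `per_process_gpu_memory_fraction` line shows the alternative knob on the same `gpu_options` message, which caps the fraction of total GPU memory the process may claim:

```python
import tensorflow as tf

config = tf.compat.v1.ConfigProto()

# Start small and grow the GPU allocation on demand instead of
# grabbing (nearly) all free GPU memory up front.
config.gpu_options.allow_growth = True

# Alternative: hard-cap this process at 40% of total GPU memory.
# config.gpu_options.per_process_gpu_memory_fraction = 0.4

sess = tf.compat.v1.Session(config=config)
```

In TF2-native code the equivalent setting is `tf.config.experimental.set_memory_growth(gpu, True)`, applied per physical GPU device before any GPUs are initialized.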