gradient_clipping

safe_learning.utilities.gradient_clipping(optimizer, loss, var_list, limits)

Clip the gradients before applying an optimization step.

Parameters:
optimizer : instance of a TensorFlow optimizer

The optimizer used to compute and apply the clipped gradients.

loss : tf.Tensor

The loss that we want to optimize.

var_list : tuple

The variables for which we want to compute gradients.

limits : tuple

A list of (lower, upper) bound tuples, one for each variable in var_list, to which the corresponding gradients are clipped.

Returns:
opt : tf.Tensor

One optimization step with clipped gradients.

Examples

>>> import tensorflow as tf
>>> from safe_learning.utilities import gradient_clipping
>>> var = tf.Variable(1.)
>>> loss = tf.square(var - 1.)
>>> optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
>>> opt_loss = gradient_clipping(optimizer, loss, [var], [(-1, 1)])
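
Assuming the TensorFlow 1.x graph-mode API used above, opt_loss only adds the
clipped optimization step to the graph; it has to be evaluated in a session
before var is actually updated:

>>> with tf.Session() as sess:
...     sess.run(tf.global_variables_initializer())
...     sess.run(opt_loss)

For illustration only, the compute/clip/apply pattern implied by the parameter
descriptions above can be written with standard TensorFlow ops as sketched
below. This is a hypothetical re-implementation under those assumptions, using
the made-up helper name clipped_step; it is not the library's own code:

def clipped_step(optimizer, loss, var_list, limits):
    # Pair each gradient with the variable it belongs to.
    grads_and_vars = optimizer.compute_gradients(loss, var_list=var_list)
    # Clip every gradient to its (lower, upper) bounds before applying it.
    clipped = [(tf.clip_by_value(grad, low, high), var)
               for (grad, var), (low, high) in zip(grads_and_vars, limits)]
    # Apply the clipped gradients as a single optimization step.
    return optimizer.apply_gradients(clipped)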