tf.compat.v1.ConditionalAccumulator

A conditional accumulator for aggregating gradients.

Inherits From: ConditionalAccumulatorBase

Up-to-date gradients (i.e., gradients computed at a time step equal to the accumulator's time step) are added to the accumulator.

Extraction of the average gradient is blocked until the required number of gradients has been accumulated.

Args
dtype: Datatype of the accumulated gradients.
shape: Shape of the accumulated gradients.
shared_name: Optional. If non-empty, this accumulator will be shared under the given name across multiple sessions.
name: Optional name for the accumulator.
reduction_type: Reduction type to use when taking the gradient.

Attributes
accumulator_ref: The underlying accumulator reference.
dtype: The datatype of the gradients accumulated by this accumulator.
name: The name of the underlying accumulator.
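
The following is a minimal usage sketch, not part of the original reference: it accumulates two scalar gradients at time step 0 and extracts their mean. It assumes a TF2 runtime with eager execution disabled, since the accumulator ops run in graph mode.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # accumulator ops are graph-mode

# Scalar float32 accumulator that averages gradients on extraction.
acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=tf.TensorShape([]), reduction_type='MEAN')

# Both gradients carry local_step=0, matching the accumulator's
# initial time step, so both are accepted.
apply_a = acc.apply_grad(tf.constant(2.0), local_step=0)
apply_b = acc.apply_grad(tf.constant(4.0), local_step=0)

# Blocks until at least 2 gradients have been applied.
avg = acc.take_grad(num_required=2)

with tf.compat.v1.Session() as sess:
    sess.run([apply_a, apply_b])
    print(sess.run(avg))  # 3.0
```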

Methods

apply_grad

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.

Args
grad: The gradient tensor to be applied.
local_step: Time step at which the gradient was computed.
name: Optional name for the operation.

Returns
The operation that (conditionally) applies a gradient to the accumulator.

Raises
ValueError: If grad is of the wrong shape.
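
As an illustrative sketch of the staleness rule (not from the original page): after one extraction the accumulator's time step advances to 1, so a gradient still tagged local_step=0 is silently dropped.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=tf.TensorShape([]))

fresh = acc.apply_grad(tf.constant(1.0), local_step=0)
avg = acc.take_grad(num_required=1)
stale = acc.apply_grad(tf.constant(5.0), local_step=0)
count = acc.num_accumulated()

with tf.compat.v1.Session() as sess:
    sess.run(fresh)
    sess.run(avg)           # extraction bumps the time step to 1
    sess.run(stale)         # local_step 0 < global step 1: dropped
    print(sess.run(count))  # 0
```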

num_accumulated

Number of gradients that have currently been aggregated in the accumulator.

Args
name: Optional name for the operation.

Returns
Number of accumulated gradients currently in the accumulator.
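
A small sketch of polling the count, assuming the same graph-mode setup as above:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=tf.TensorShape([]))
apply_op = acc.apply_grad(tf.constant(1.0), local_step=0)
count = acc.num_accumulated()

with tf.compat.v1.Session() as sess:
    print(sess.run(count))  # 0: nothing applied yet
    sess.run(apply_op)
    print(sess.run(count))  # 1
```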

set_global_step

Sets the global time step of the accumulator.

The operation logs a warning if an attempt is made to set the global step to a time step lower than the accumulator's own time step.

Args
new_global_step: Value of the new time step. Can be a variable or a constant.
name: Optional name for the operation.

Returns
Operation that sets the accumulator's time step.
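
One plausible use, sketched below (illustrative, not from the original page): advancing the time step invalidates gradients computed against older values, so a late apply_grad has no effect.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=tf.TensorShape([]))
advance = acc.set_global_step(5)
late = acc.apply_grad(tf.constant(1.0), local_step=3)
count = acc.num_accumulated()

with tf.compat.v1.Session() as sess:
    sess.run(advance)
    sess.run(late)          # 3 < 5, so the gradient is stale
    print(sess.run(count))  # 0: the stale gradient was dropped
```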

take_grad

Attempts to extract the average gradient from the accumulator.

The operation blocks until a sufficient number of gradients has been successfully applied to the accumulator.

Once successful, the following actions are also triggered:

  • The counter of accumulated gradients is reset to 0.
  • The aggregated gradient is reset to a zero tensor.
  • The accumulator's internal time step is incremented by 1.

Args
num_required: Number of gradients that need to have been aggregated.
name: Optional name for the operation.

Returns
A tensor holding the value of the average gradient.

Raises
InvalidArgumentError: If num_required < 1.
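
To illustrate the reset-and-increment behavior (a sketch, not from the original page), the next accumulation round must tag gradients with the incremented time step:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

acc = tf.compat.v1.ConditionalAccumulator(
    dtype=tf.float32, shape=tf.TensorShape([]))
round0 = acc.apply_grad(tf.constant(2.0), local_step=0)
avg = acc.take_grad(num_required=1)
round1 = acc.apply_grad(tf.constant(6.0), local_step=1)

with tf.compat.v1.Session() as sess:
    sess.run(round0)
    print(sess.run(avg))   # 2.0; counter resets, time step becomes 1
    sess.run(round1)       # accepted: local_step matches the new step
    print(sess.run(avg))   # 6.0
```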