tf.lite.experimental.QuantizationDebugger

Debugger for Quantized TensorFlow Lite debug mode models.

This debugger can run TensorFlow Lite models converted with debug ops and collect debug information. It calculates statistics from user-defined post-processing functions as well as the default ones.

Args

quant_debug_model_path: Path to the quantized debug TFLite model file.
quant_debug_model_content: Content of the quantized debug TFLite model.
float_model_path: Path to the float TFLite model file.
float_model_content: Content of the float TFLite model.
debug_dataset: A factory function that returns a dataset generator used to generate input samples (a list of np.ndarray) for the model. The generated elements must have the same types and shapes as the model's inputs.
debug_options: Debug options to debug the given model.
converter: Optional; use the converter instead of a quantized model.

Raises

ValueError: If the debugger was unable to be created.

Attributes

options
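End-to-end, a debugging session can be sketched as follows: build a float model, configure a converter for post-training quantization, hand both the converter and a representative dataset to the debugger, then run it and dump per-layer statistics. The tiny model, the random dataset, and the output path here are illustrative assumptions, not part of the API.

```python
import numpy as np
import tensorflow as tf

# A small float Keras model standing in for the model under test
# (the architecture is an illustrative assumption).
inputs = tf.keras.Input(shape=(4,))
x = tf.keras.layers.Dense(8, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(2)(x)
model = tf.keras.Model(inputs, outputs)

# Factory-style dataset: calling it yields lists of input samples
# matching the model's input types and shapes.
def debug_dataset():
    np.random.seed(0)
    for _ in range(8):
        yield [np.random.uniform(-1, 1, size=(1, 4)).astype(np.float32)]

# Configure post-training quantization on the converter; the debugger
# performs the conversion itself and inserts the debug ops.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = debug_dataset

debugger = tf.lite.experimental.QuantizationDebugger(
    converter=converter, debug_dataset=debug_dataset)
debugger.run()

# Write per-layer quantization statistics to a CSV file.
with open("/tmp/layer_stats.csv", "w") as f:
    debugger.layer_statistics_dump(f)
```

The resulting CSV has one row per quantized layer, which makes it easy to sort layers by error metrics and spot the ones that degrade most under quantization.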

Methods

get_debug_quantized_model

Returns an instrumented quantized model.

Convert the quantized model with the initialized converter and return the model bytes. The model will be instrumented with numeric verification operations and should only be used for debugging.

Returns
Model bytes corresponding to the model.

Raises
ValueError: If the converter is not passed to the debugger.

get_nondebug_quantized_model

Returns a non-instrumented quantized model.

Convert the quantized model with the initialized converter and return the non-debug model bytes. The model will not be instrumented with numeric verification operations.

Returns
Model bytes corresponding to the model.

Raises
ValueError: If the converter is not passed to the debugger.
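When the debugger is constructed with a converter, both variants can be produced from the same instance: the instrumented model for debugging and the clean one for deployment. The model and dataset below are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# Minimal float model (illustrative assumption).
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)

def rep_dataset():
    np.random.seed(0)
    for _ in range(4):
        yield [np.random.uniform(-1, 1, size=(1, 4)).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = rep_dataset

debugger = tf.lite.experimental.QuantizationDebugger(
    converter=converter, debug_dataset=rep_dataset)

# Instrumented model: contains numeric verification ops, debugging only.
debug_model = debugger.get_debug_quantized_model()
# Clean model: same quantization, no verification ops; safe to deploy.
release_model = debugger.get_nondebug_quantized_model()
```

Both calls raise ValueError if the debugger was created from model paths or contents rather than a converter.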

layer_statistics_dump

Dumps layer statistics to a file in CSV format.

Args
file: A file or file-like object to write to.

run

Runs models and gets metrics.