# TFLite Serialization Tool

NOTE: This tool is intended for advanced users only, and should be used with care.

The (C++) serialization library generates and writes a TFLite flatbuffer given an `Interpreter` or `Subgraph`. Example use-cases include authoring models with the `Interpreter` API, or updating models on-device (by modifying `tensor.data` for relevant tensors).

## Serialization

### Writing flatbuffer to file

To write a TFLite model from an `Interpreter` (see `lite/interpreter.h`):

```c++
std::unique_ptr<tflite::Interpreter> interpreter;
// ...build/modify interpreter...
tflite::ModelWriter writer(interpreter.get());
std::string filename = "/tmp/model.tflite";
writer.Write(filename);
```

Note that the above API does not yet support custom I/O tensors or custom ops. However, it does support models with control flow.

To generate/write a flatbuffer for a particular `Subgraph` (see `lite/core/subgraph.h`) you can use `SubgraphWriter`:

```c++
std::unique_ptr<tflite::Interpreter> interpreter;
// ...build/modify interpreter...
// The number of subgraphs can be obtained by:
// const int num_subgraphs = interpreter->subgraphs_size();
// Note that 0 <= subgraph_index < num_subgraphs
tflite::SubgraphWriter writer(&interpreter->subgraph(subgraph_index));
std::string filename = "/tmp/model.tflite";
writer.Write(filename);
```
`SubgraphWriter` supports custom ops and/or custom I/O tensors.
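As a sketch of the custom-I/O path (method name per `writer_lib.h`; the tensor and node indices below are purely illustrative, so verify both against your TFLite version and your model):

```c++
// Sketch: serialize only part of a subgraph with custom I/O tensors.
// SetCustomInputOutput selects which tensor indices act as the model's
// inputs/outputs and which nodes (the execution plan) are serialized.
tflite::SubgraphWriter writer(&interpreter->subgraph(subgraph_index));
// Hypothetical indices for illustration only.
writer.SetCustomInputOutput(/*inputs=*/{0}, /*outputs=*/{3},
                            /*execution_plan=*/{0, 1});
writer.Write("/tmp/partial_model.tflite");
```

This is useful when you want to export a slice of a graph (e.g. everything up to an intermediate tensor) rather than the whole subgraph.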

### Generating flatbuffer in-memory

Both `ModelWriter` and `SubgraphWriter` support a `GetBuffer` method to return the generated flatbuffer in-memory:

```c++
std::unique_ptr<uint8_t[]> output_buffer;
size_t output_buffer_size;
tflite::ModelWriter writer(interpreter.get());
writer.GetBuffer(&output_buffer, &output_buffer_size);
```

## De-serialization

The flatbuffers written as above can be de-serialized just like any other TFLite model, e.g.:

```c++
std::unique_ptr<tflite::FlatBufferModel> model =
    tflite::FlatBufferModel::BuildFromFile(filename.c_str());
tflite::ops::builtin::BuiltinOpResolver resolver;
tflite::InterpreterBuilder builder(*model, resolver);
std::unique_ptr<tflite::Interpreter> new_interpreter;
builder(&new_interpreter);
```
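Once rebuilt, the interpreter is used like any other TFLite interpreter; a minimal sketch (error handling abbreviated, and the float/index choices are illustrative, not prescribed by this library):

```c++
// Allocate tensor memory and run inference on the round-tripped model.
if (new_interpreter->AllocateTensors() != kTfLiteOk) {
  // handle allocation failure
}
// Fill the first input tensor (assuming a float model for illustration).
float* input = new_interpreter->typed_input_tensor<float>(0);
// ...populate input...
if (new_interpreter->Invoke() != kTfLiteOk) {
  // handle inference failure
}
float* output = new_interpreter->typed_output_tensor<float>(0);
// ...consume output...
```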