Generalise and document TF -> TFLite conversion #4

Open
psyhtest opened this issue Mar 1, 2019 · 3 comments
Labels: enhancement (New feature or request)

Comments

psyhtest (Contributor) commented Mar 1, 2019

@bellycat77 has managed to convert the TF ResNet50 v1.5 model used in MLPerf Inference to TFLite with the following script:

import tensorflow as tf

# Frozen TF graph and the names of its input and output tensors.
graph_def_file = "resnet50_v1.pb"
input_arrays = ["input_tensor"]
output_arrays = ["softmax_tensor"]

# Convert the frozen graph to a TFLite flatbuffer and write it to disk.
converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
with open("resnet50_v1.tflite", "wb") as f:
    f.write(tflite_model)

We should generalise and automate this via a CK script. For example, the input file can come from a dependency on a TF model, while the input and output tensor names can be taken from the model's metadata, as sketched below.
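
A minimal sketch of such a generalised script, assuming the frozen graph path, the input/output tensor names and the output path are passed in through environment variables, might look as follows. The variable names (MODEL_PB, MODEL_INPUT_LAYERS, MODEL_OUTPUT_LAYERS, MODEL_TFLITE) are hypothetical placeholders for whatever the CK package would actually expose from the model's metadata:

import os
import tensorflow as tf

# Hypothetical environment variables; in a real CK script these would be
# resolved from the TF model dependency and its metadata.
graph_def_file = os.environ["MODEL_PB"]
input_arrays = os.environ["MODEL_INPUT_LAYERS"].split(",")    # e.g. "input_tensor"
output_arrays = os.environ["MODEL_OUTPUT_LAYERS"].split(",")  # e.g. "softmax_tensor"
tflite_file = os.environ.get("MODEL_TFLITE", "model.tflite")

# tf.contrib.lite.TFLiteConverter was promoted to tf.lite.TFLiteConverter
# in TF 1.13; try the new location first and fall back to the old one.
try:
    converter_cls = tf.lite.TFLiteConverter
except AttributeError:
    converter_cls = tf.contrib.lite.TFLiteConverter

converter = converter_cls.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
with open(tflite_file, "wb") as f:
    f.write(tflite_model)

With something like this in place, only model-specific steps (e.g. for SSD-based models) would remain outside the common script.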

psyhtest added the enhancement label on Mar 1, 2019

psyhtest (Contributor, Author) commented:

If a model requires special steps (e.g. SSD-MobileNet), then it should motivate us even more to generalise.

psyhtest (Contributor, Author) commented:

Conversion steps for SSD-based models are documented here.
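
For reference, the usual workflow for SSD models in the TensorFlow Object Detection API (presumably what those steps describe, though that is an assumption here) is to first export a TFLite-compatible frozen graph with object_detection/export_tflite_ssd_graph.py and then convert it with custom ops enabled, roughly along these lines:

import tensorflow as tf

# tflite_graph.pb is the graph produced by export_tflite_ssd_graph.py;
# the tensor names below follow that script's conventions, and the input
# shape (e.g. 300x300 for SSD-MobileNet) depends on the model.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    "tflite_graph.pb",
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=["TFLite_Detection_PostProcess",
                   "TFLite_Detection_PostProcess:1",
                   "TFLite_Detection_PostProcess:2",
                   "TFLite_Detection_PostProcess:3"],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]})
converter.allow_custom_ops = True  # the detection post-processing op is a TFLite custom op
tflite_model = converter.convert()
with open("detect.tflite", "wb") as f:
    f.write(tflite_model)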

psyhtest (Contributor, Author) commented:

I wonder if the same conversion steps will work for SSD-ResNet.
