This Python script takes a trained supervised FastText binary model and generates an equivalent serialized TensorFlow model that replicates the inference process of the FastText model. We developed this tool because we had trained FastText models that we needed to run in Google's BigQuery, which lets you upload a serialized TensorFlow model and run it for prediction.

Besides the fasttext2tensorflow.py script, we share a Jupyter Notebook that describes the inference process of a FastText supervised model in plain Python. It is useful for understanding the inner workings of FastText without having to go through the original C++ sources (a simplified sketch of that inference process is shown below). The notebook is also available on Google's Colaboratory.
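The snippet below is a rough sketch of the same idea the notebook walks through, written against the official `fasttext` Python bindings. The model path is a placeholder, and word n-gram hashing (for models trained with `wordNgrams` > 1) is left out for brevity; the notebook covers the full process.

```python
import numpy as np
import fasttext

# Placeholder path; point this at your own supervised .bin model.
model = fasttext.load_model("/path/to/fasttext/model.bin")
input_matrix = model.get_input_matrix()    # rows: words + subword/n-gram buckets
output_matrix = model.get_output_matrix()  # rows: one per label
labels = model.get_labels()

def predict(text):
    # Tokenize the line the same way FastText does.
    words, _ = model.get_line(text)
    row_ids = []
    for word in words:
        _, subword_ids = model.get_subwords(word)
        row_ids.extend(subword_ids)
    if not row_ids:
        return None
    # Hidden vector = average of the selected input-matrix rows.
    hidden = input_matrix[row_ids].mean(axis=0)
    # Scores = output matrix times the hidden vector, followed by a softmax.
    logits = output_matrix @ hidden
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    best = int(np.argmax(probs))
    return labels[best], float(probs[best])

print(predict("example text to classify"))
```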
The script requires the following Python packages:

- fasttext
- tensorflow
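Assuming the standard PyPI package names, both can be installed with pip:

```
pip install fasttext tensorflow
```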
To convert a model, run the script with the path to the FastText binary model and the directory where the TensorFlow model should be saved:

```
python fasttext2tensorflow.py /path/to/fasttext/model.bin /path/to/save/tensorflow/model
```
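Before uploading the result to BigQuery, you can sanity-check it by loading it back with TensorFlow. This is only an illustrative check; the path is a placeholder and the exact signature names depend on how the script exports the model.

```python
import tensorflow as tf

# Load the SavedModel produced by fasttext2tensorflow.py and list its
# serving signatures to confirm it exported correctly.
loaded = tf.saved_model.load("/path/to/save/tensorflow/model")
print(list(loaded.signatures.keys()))
```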
Keep in mind that the TensorFlow model must be less than 250 MB in size to be allowed in BigQuery.
Enrique Diaz De León: [email protected]
Alan Salinas: [email protected]