- This tool assists you in training your object detection model and converting it to a TFLite model, which can be easily deployed on your device.
- The training framework utilizes the TensorFlow Object Detection API on both TensorFlow 2 and TensorFlow 1.
- The notebooks simplify the complicated installation steps and provide an easy-to-use approach for data preparation, training, and conversion.
1. If you haven't installed NuEdgeWise, please follow these steps to install the Python virtual environment and choose `NuEdgeWise_env`. Skip this step if you have already done so.
2. Install the Visual C++ Build Tools:
   - Log in to https://my.visualstudio.com/Downloads (you will need a free Microsoft account).
   - Enter 'Build Tools' in the search bar.
   - Select 'Visual Studio 2015 Update 3' on the left side.
   - Click 'DVD' under 'Visual C++ Build Tools...' and start the download.
   - Follow the installer prompts to complete the installation.
3. Install Git for Windows.
   - Git is required during the 5th step.
4. Clone this repository:
   ```
   git clone https://github.com/OpenNuvoton/ML_tf2_object_detection_nu.git
   ```
   - Alternatively, you can download the zip file directly.
5. Open Miniforge or your Python environment prompt with administrator privileges and select `NuEdgeWise_env`. Use `setup_objdet_tf2.ipynb` to install the remaining packages. A quick check of the finished environment is sketched after this list.
   - If you wish to use ssd_mobileNetv3, TensorFlow 1 is required, so you need to create a new Python virtual environment instead of using `NuEdgeWise_env`.
   - First, refer to `create_conda_env_tf1.ipynb`, and then proceed to `setup_objdet_tf1.ipynb`.
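
The following is a minimal sanity check for the TF2 environment, run after `setup_objdet_tf2.ipynb` finishes. It assumes the TensorFlow Object Detection API was installed as its usual `object_detection` package; adjust it if your setup differs:

```python
# Quick sanity check for the TF2 training environment (run inside NuEdgeWise_env).
import tensorflow as tf
print("TensorFlow:", tf.__version__)  # expect a 2.x release for the TF2 workflow

# The TensorFlow Object Detection API installs itself as the `object_detection` package.
import object_detection
print("Object Detection API import OK")
```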
- Open `create_data.ipynb` located in `image_dataset`. Ensure that you execute it in a TensorFlow 2 environment.
- This process handles downloading open-source images or labeling your own images to create a training dataset. A small sketch for checking the generated data follows this list.
- The tutorial for this process is provided within the `create_data.ipynb` notebook.
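
If you want to sanity-check the generated training data outside the notebook, a minimal sketch such as the one below counts the records in a TFRecord file. The file name and location are assumptions, so point it at whatever `create_data.ipynb` actually writes:

```python
import tensorflow as tf

# Path is an assumption; use the TFRecord file produced by create_data.ipynb.
tfrecord_path = "image_dataset/train.tfrecord"

# Count the serialized examples to confirm the dataset was written.
count = sum(1 for _ in tf.data.TFRecordDataset(tfrecord_path))
print(f"{count} examples in {tfrecord_path}")
```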
- Open the `workspace` folder.
- Choose either the TensorFlow 1 or TensorFlow 2 environment, depending on the model you are training.
- This process will handle training, mAP evaluation, and TFLite conversion.
- The tutorial for this process is provided within the notebook.
- Use `train_evl_monitor_tf2.ipynb` for an easy-to-use interface that facilitates training, evaluation, and monitoring.
- Use `convert_tflite_tf2.ipynb` for an easy-to-use interface that aids in converting the trained model to TFLite format. A minimal sketch of TensorFlow's TFLite converter API appears after the alternative-way notes below.
[Alternative way]:
- Open `train_cmd_tf2.ipynb`.
  - Google's supported object detection models: TensorFlow 2 Detection Model Zoo
- Open `train_cmd_tf1.ipynb`. This should be executed in the TF1 environment (check `create_conda_env_tf1.ipynb` & `setup_objdet_tf1.ipynb` to create the TF1 environment).
  - Google's supported object detection models: TensorFlow 1 Detection Model Zoo
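
For reference, this is roughly what a plain TFLite conversion looks like with TensorFlow's standard converter, assuming you have already exported a TF2 SavedModel suited for TFLite (for example with the Object Detection API's `export_tflite_graph_tf2.py`). The paths are placeholders, and the notebooks may apply additional options such as quantization settings:

```python
import tensorflow as tf

# Placeholder paths; point them at your exported SavedModel and desired output file.
saved_model_dir = "exported-models/my_model/saved_model"
tflite_path = "exported-models/my_model/model.tflite"

# Standard TF2 -> TFLite conversion.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional post-training quantization
tflite_model = converter.convert()

with open(tflite_path, "wb") as f:
    f.write(tflite_model)
```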
- Open `test_tflite.ipynb` located in the `workspace` folder. Make sure to execute it in a TensorFlow 2 environment.
- Evaluate the results of the TFLite model using your own dataset. A minimal inference sketch is shown after this list.
- The tutorial for this process is provided within the `test_tflite.ipynb` notebook.
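
For a quick standalone check of the converted model, a minimal inference sketch with TensorFlow's TFLite interpreter is shown below. The model path and the dummy input are placeholders, and the meaning of each output tensor (boxes, classes, scores, count) should be verified against your own model's output details:

```python
import numpy as np
import tensorflow as tf

# Placeholder path; use your converted TFLite model.
interpreter = tf.lite.Interpreter(model_path="exported-models/my_model/model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input just to exercise the model; replace with a real preprocessed image.
_, height, width, _ = input_details[0]["shape"]
dummy = np.random.random_sample((1, height, width, 3)).astype(input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# List the output tensors; check output_details to see which holds boxes/classes/scores.
for detail in output_details:
    print(detail["name"], interpreter.get_tensor(detail["index"]).shape)
```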
- The MPU example code: MA35D1_machine_learning