run-onnx-lib script and docs fixes (#1705)
* update documentation for build-run-onnx-lib.sh
* clarified the Mac issue with running the statically linked version in the directory where the model library was built
* moved the usage string up to the top of RunONNXLib.cpp, so it's visible next to the top-of-file block comment
* allow the llvm-project location to be set with LLVM_PROJECT, otherwise read it from $MLIR_DIR if defined

Signed-off-by: Soren Lassen <[email protected]>
Co-authored-by: Alexandre Eichenberger <[email protected]>
sorenlassen and AlexandreEichenberger authored Sep 28, 2022
1 parent ceb1bce commit 62e08f8
Showing 3 changed files with 101 additions and 88 deletions.
17 changes: 11 additions & 6 deletions docs/Testing.md
@@ -151,15 +151,20 @@
 We first need to compile the tool, which can be done in one of two modes.
 In the first mode, the tool is compiled with a statically linked model.
 This mode requires the `-D LOAD_MODEL_STATICALLY=1` option during compilation in addition to including the `.so` file.
 Best is to use the `build-run-onnx-lib.sh` script in the `onnx-mlir/utils` directory to compile the tool with its model, which is passed as a parameter to the script.
-To avoid library path issues, just run the tool in the home directory of the model.
+To avoid library path issues on Mac, run the compiled tool in the directory where the model was built.
 
 ``` sh
 # Compile tool with model.
 cd onnx-mlir/build
-. ../utils/build-run-onnx-lib.sh test/backend/test_add.so
-# Run tool in the directory of the model.
-(cd test/backend; run-onnx-lib)
+sh ../utils/build-run-onnx-lib.sh test/backend/test_add/test_add.so
+# Run the tool to run the model (substitute `Release` for `Debug` for the release version).
+Debug/bin/run-onnx-lib
+# or, on Mac, run the tool in the directory where the model was built
+(cd test/backend; ../../Debug/bin/run-onnx-lib)
+# if test_add.so was built in `test/backend`:
+cd test/backend; ../../Debug/bin/onnx-mlir --EmitLib test_add/test_add.onnx
 ```
+(You can see the path of the library with `otool -L test_add.so` on Mac.)
 
 In the second mode, the tool is compiled without models, which will be passed at runtime.
 To enable this option, simply compile the tool with the `-D LOAD_MODEL_STATICALLY=0` option.
@@ -169,9 +174,9 @@
 any directories as long as you pass the `.so` model file at runtime to the tool.
 
 ``` sh
 # Compile tool without a model.
 cd onnx-mlir/build
-. ../utils/build-run-onnx-lib.sh
+sh ../utils/build-run-onnx-lib.sh
 # Run the tool with an argument pointing to the model.
-run-onnx-lib test/backend/test_add.so
+Debug/bin/run-onnx-lib test/backend/test_add/test_add.so
 ```

## LLVM FileCheck Tests
103 changes: 51 additions & 52 deletions utils/RunONNXLib.cpp
@@ -13,20 +13,43 @@
 follows. For dynamically loaded models:
   cd onnx-mlir/build
-  . ../utils/build-run-onnx-lib.sh
-  run-onnx-lib test/backend/test_add.so
+  sh ../utils/build-run-onnx-lib.sh
+  Debug/bin/run-onnx-lib test/backend/test_add/test_add.so
-For statically loaded models, best is to run the utility in the directory
-of the model.
+For statically loaded models:
   cd onnx-mlir/build
-  . ../utils/build-run-onnx-lib.sh test/backend/test_add.so
+  sh ../utils/build-run-onnx-lib.sh test/backend/test_add/test_add.so
+On Linux you run with:
+  Debug/bin/run-onnx-lib
+On Mac it is best to run in the directory where the model was built, e.g.,
   cd test/backend
-  run-onnx-lib
+  ../../Debug/bin/run-onnx-lib
-Usage of program is as follows.
+if test_add.so was built in test/backend with:
+  ../../Debug/bin/onnx-mlir --EmitLib test_add/test_add.onnx
+See printUsage() below for usage of the program.
+*/
 
-Usage: run-onnx-lib [options] model.so
+//===----------------------------------------------------------------------===//
+
+// Define while compiling.
+// #define LOAD_MODEL_STATICALLY 1
+
+auto printUsage = [](auto &out, const char *name) {
+#if LOAD_MODEL_STATICALLY
+  out << "Usage: " << name << " [options]";
+#else
+  out << "Usage: " << name << " [options] model.so";
+#endif
+  out << R"""(
+Program will instantiate the model given by "model.so"
+with random inputs, launch the computation, and ignore
@@ -36,21 +59,30 @@
 path to the local directory is also prepended.
 Options:
   -d | -dim json-array
     Provide a json array to provide the value of every
     runtime dimensions in the input signature of the
-    entry function. E.g. -d "[7, 2048]".
+    entry function. E.g. -d "[7, 2048]".)""";
+#if !LOAD_MODEL_STATICALLY
+  out << R"""(
   -e name | --entry-point name
     Name of the ONNX model entry point.
-    Default is "run_main_graph".
+    Default is "run_main_graph".)""";
+#endif
+  out << R"""(
   -h | --help
     Print help message.
   -n NUM | --iterations NUM
-    Number of times to run the tests, default 1
+    Number of times to run the tests, default 1.
   -m NUM | --meas NUM
     Measure the kernel execution time NUM times.
   -r | -reuse true|false
     Reuse input data, default on
   -v | --verbose
-    Print the shape of the inputs and outputs
-  -h | --help
-    help
-*/
+    Print the shape of the inputs and outputs.
-//===----------------------------------------------------------------------===//
-
-// Define while compiling.
-// #define LOAD_MODEL_STATICALLY 1
+)""";
+};
 
 #include <algorithm>
 #include <assert.h>
@@ -119,40 +151,7 @@
 static bool measureExecTime = false;
 static vector<int64_t> dimKnownAtRuntime;
 
 void usage(const char *name) {
-#if LOAD_MODEL_STATICALLY
-  cout << "Usage: " << name << " [options]";
-#else
-  cout << "Usage: " << name << " [options] model.so";
-#endif
-  cout << endl << endl;
-  cout << "  Program will instantiate the model given by \"model.so\"" << endl;
-  cout << "  with random inputs, launch the computation, and ignore" << endl;
-  cout << "  the results. A model is typically generated by lowering" << endl;
-  cout << "  an ONNX model using a \"onnx-mlir --EmitLib model.onnx\"" << endl;
-  cout << "  command. When the input model is not found as is, the" << endl;
-  cout << "  path to the local directory is also prepended." << endl;
-  cout << endl;
-  cout << "  Options:" << endl;
-  cout << "    -d | -dim json-array" << endl;
-  cout << "      Provide a json array to provide the value of every" << endl;
-  cout << "      runtime dimensions in the input signature of the" << endl;
-  cout << "      entry function. E.g. -d \"[7 2048]\"." << endl;
-#if !LOAD_MODEL_STATICALLY
-  cout << "    -e name | --entry-point name" << endl;
-  cout << "      Name of the ONNX model entry point." << endl;
-  cout << "      Default is \"run_main_graph\"." << endl;
-#endif
-  cout << "    -h | --help" << endl;
-  cout << "      Print help message." << endl;
-  cout << "    -n NUM | --iterations NUM" << endl;
-  cout << "      Number of times to run the tests, default 1." << endl;
-  cout << "    -m NUM | --meas NUM" << endl;
-  cout << "      Measure the kernel execution time NUM times." << endl;
-  cout << "    -r | -reuse true|false" << endl;
-  cout << "      Reuse input data, default on" << endl;
-  cout << "    -v | --verbose" << endl;
-  cout << "      Print the shape of the inputs and outputs." << endl;
-  cout << endl;
+  printUsage(cout, name);
   exit(1);
 }
69 changes: 39 additions & 30 deletions utils/build-run-onnx-lib.sh
@@ -8,42 +8,51 @@
 #
 # Assumptions:
 # 1) script run in the onnx-mlir/build subdir.
-# 2) llvm-project is built with all its libraries (needed to run the tool)/
+# 2) llvm-project is built with all its libraries (needed to run the tool)
 
-cd ../..
-TOP_DIR=`pwd`
-cd $TOP_DIR/onnx-mlir/build
-
-LLVM_PROJ_SRC=$TOP_DIR/llvm-project
-LLVM_PROJ_BUILD=$LLVM_PROJ_SRC/build
+# ask git for the onnx-mlir top level dir
+ONNX_MLIR=$(git rev-parse --show-toplevel)
+if [ $(pwd) != $ONNX_MLIR/build ] ; then
+  echo "Error: this script must be run from the build dir $ONNX_MLIR/build"
+  exit 1
+fi
+ONNX_MLIR_BIN=$ONNX_MLIR/build/Debug/bin
 
-ONNX_MLIR_SRC=$TOP_DIR/onnx-mlir
-ONNX_MLIR_UTIL=$ONNX_MLIR_SRC/utils
-ONNX_MLIR_BIN=$ONNX_MLIR_SRC/build/Debug/bin
-DRIVERNAME=RunONNXLib.cpp
-echo $ONNX_MLIR_SRC
+if [ -z $LLVM_PROJECT ] ; then
+  if [ $MLIR_DIR ] ; then
+    # find llvm-project in MLIR_DIR, used to configure cmake,
+    LLVM_PROJECT=${MLIR_DIR%llvm-project/*}llvm-project
+  else
+    # or else assume llvm-project shares parent directory with ONNX-MLIR
+    LLVM_PROJECT=$(dirname $ONNX_MLIR)/llvm-project
+  fi
+fi
 
 if [ "$#" -eq 0 ] ; then
-  echo "Compile run-onnx-lib for models passed at runtime"
-  g++ $ONNX_MLIR_UTIL/$DRIVERNAME -o $ONNX_MLIR_BIN/run-onnx-lib -std=c++17 \
-    -D LOAD_MODEL_STATICALLY=0 -I $LLVM_PROJ_SRC/llvm/include \
-    -I $LLVM_PROJ_BUILD/include -I $ONNX_MLIR_SRC/include \
-    -L $LLVM_PROJ_BUILD/lib -lLLVMSupport -lLLVMDemangle -lcurses -lpthread -ldl &&
-  echo " success, dynamically linked run-onnx-lib built in $ONNX_MLIR_BIN"
+  echo "Compiling run-onnx-lib for dynamically linked models passed at runtime"
 elif [ "$#" -eq 1 ] ; then
   if [ -e $1 ] ; then
-    echo "Compile run-onnx-lib for model $1"
-    g++ $ONNX_MLIR_UTIL/$DRIVERNAME -o $ONNX_MLIR_BIN/run-onnx-lib -std=c++14 \
-      -D LOAD_MODEL_STATICALLY=1 -I $LLVM_PROJ_SRC/llvm/include \
-      -I $LLVM_PROJ_BUILD/include -I $ONNX_MLIR_SRC/include \
-      -L $LLVM_PROJ_BUILD/lib -lLLVMSupport -lLLVMDemangle -lcurses -lpthread -ldl $1 &&
-    echo " success, statically linked run-onnx-lib built in $ONNX_MLIR_BIN"
-    echo ""
-    echo "TO RUN: easiest is to cd into the directory of the model"
+    echo "Compiling run-onnx-lib statically linked to model $1"
   else
     echo "Error: could not find model $1"
+    exit 1
   fi
+else
+  echo "Error: pass either zero/one argument for dynamically/statically linked models"
+  exit 1
 fi
 
+DRIVER_NAME=$ONNX_MLIR/utils/RunONNXLib.cpp
+RUN_BIN=$ONNX_MLIR_BIN/run-onnx-lib
+RUN_BIN_RELATIVE=${RUN_BIN#$(pwd)/}
+g++ $DRIVER_NAME -o $RUN_BIN -std=c++17 -D LOAD_MODEL_STATICALLY=$# \
+  -I $LLVM_PROJECT/llvm/include -I $LLVM_PROJECT/build/include \
+  -I $ONNX_MLIR/include -L $LLVM_PROJECT/build/lib \
+  -lLLVMSupport -lLLVMDemangle -lcurses -lpthread -ldl "$@" &&
+echo "Success, built $RUN_BIN_RELATIVE"
+
+if [ "$#" -eq 1 -a $(uname -s) = Darwin ] ; then
+  echo ""
+  echo "TO RUN: easiest is to cd into the directory where the model was built"
+  echo "(run \"otool -L $RUN_BIN_RELATIVE\" to see $(basename $1) path)"
+fi
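The rewritten script leans on two POSIX parameter expansions: `${var%pattern}` strips the shortest matching suffix (used to recover the llvm-project root from `$MLIR_DIR`), and `${var#pattern}` strips a matching prefix (used to print the binary path relative to the build directory). A standalone illustration with hypothetical paths:

```shell
# Suffix strip: recover the llvm-project root from a cmake MLIR_DIR path.
MLIR_DIR=/home/me/llvm-project/build/lib/cmake/mlir
LLVM_PROJECT=${MLIR_DIR%llvm-project/*}llvm-project
echo "$LLVM_PROJECT"            # /home/me/llvm-project

# Prefix strip: print a binary path relative to a given directory.
RUN_BIN=/home/me/onnx-mlir/build/Debug/bin/run-onnx-lib
BUILD_DIR=/home/me/onnx-mlir/build
echo "${RUN_BIN#$BUILD_DIR/}"   # Debug/bin/run-onnx-lib
```

Note that `%` removes the *shortest* matching suffix; `%%` would remove the longest, which matters if `llvm-project` appears more than once in the path.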
