fix bugs in ipynb2py and app test #385

Closed · wants to merge 38 commits

Changes from all commits (38 commits)
501ca42  add timer in run_app_tests (edieYoung, May 18, 2018)
7038008  add timer to run_app_tests (edieYoung, May 18, 2018)
e44f1fd  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, May 21, 2018)
9d55e9b  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, May 21, 2018)
3a7f565  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, May 22, 2018)
0cb58d1  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, May 22, 2018)
c765bc4  add_timer_to_app_tests (edieYoung, May 22, 2018)
b94c669  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, May 22, 2018)
27fd9b6  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, Jun 11, 2018)
14f91df  add exception exit in apps and example tests (edieYoung, Jun 11, 2018)
914b34d  switch test order (edieYoung, Jun 11, 2018)
103e62c  switch test order to verify (edieYoung, Jun 11, 2018)
6a1682e  test_update (edieYoung, Jun 12, 2018)
a765596  test_update (edieYoung, Jun 12, 2018)
1452dc7  test_update (edieYoung, Jun 13, 2018)
271a90b  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, Jun 13, 2018)
c8156f5  test_update (edieYoung, Jun 13, 2018)
e7b49a4  test_update (edieYoung, Jun 13, 2018)
f4f2358  test_update (edieYoung, Jun 13, 2018)
8869b47  test_update (edieYoung, Jun 13, 2018)
bacade9  test_update (edieYoung, Jun 13, 2018)
abb1b01  test_update (edieYoung, Jun 13, 2018)
f0d348a  test_update (edieYoung, Jun 13, 2018)
1a8e55b  test_update (edieYoung, Jun 13, 2018)
7339905  test_update (edieYoung, Jun 13, 2018)
884ef11  example_update (edieYoung, Jun 13, 2018)
1b32b96  example_update (edieYoung, Jun 13, 2018)
0de1cf4  test_update (edieYoung, Jun 13, 2018)
f467ebf  test_update (edieYoung, Jun 13, 2018)
ced59e6  test_update (edieYoung, Jun 13, 2018)
2bac41f  test_update (edieYoung, Jun 13, 2018)
d7f8be2  test_update (edieYoung, Jun 14, 2018)
140c915  fix_apptest (edieYoung, Jun 14, 2018)
b6242cd  Merge branch 'master' of https://github.com/intel-analytics/analytics… (edieYoung, Jun 14, 2018)
23f4c32  test_update (edieYoung, Jun 14, 2018)
99b1afb  test_update (edieYoung, Jun 14, 2018)
b499f4e  test_update (edieYoung, Jun 14, 2018)
12f3d82  test_update (edieYoung, Jun 14, 2018)
apps/ipynb2py.sh (2 changes: 1 addition & 1 deletion; file mode 100644 → 100755)
@@ -12,7 +12,7 @@ else
sed -i 's/%%/#/' $1.tmp.ipynb
sed -i 's/%pylab/#/' $1.tmp.ipynb

jupyter nbconvert --to script $1.tmp.ipynb
jupyter nbconvert $1.tmp.ipynb --to python

mv $1.tmp.py $1.py
sed -i '1i# -*- coding: utf-8 -*-' $1.py
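The exporter switch from --to script to --to python makes nbconvert always write $1.tmp.py, so the mv $1.tmp.py $1.py that follows finds its input regardless of the nbconvert version or the notebook's kernel metadata. For reference, a minimal self-contained sketch of the conversion flow; the argument check, cp, and cleanup steps are assumptions, since only this hunk of the script is shown in the diff:

#!/bin/bash
# Sketch of the ipynb2py conversion flow (illustrative, not the full script).
# Usage: ./ipynb2py.sh path/to/notebook   (pass the path without the .ipynb extension)
set -e
if [ -z "$1" ]; then
    echo "Usage: $0 <notebook-path-without-extension>" >&2
    exit 1
fi

# Work on a temporary copy so the original notebook stays untouched.
cp "$1.ipynb" "$1.tmp.ipynb"

# Comment out notebook magics that are not valid Python.
sed -i 's/%%/#/' "$1.tmp.ipynb"
sed -i 's/%pylab/#/' "$1.tmp.ipynb"

# --to python always produces $1.tmp.py, so the rename below is reliable.
jupyter nbconvert "$1.tmp.ipynb" --to python

mv "$1.tmp.py" "$1.py"
sed -i '1i# -*- coding: utf-8 -*-' "$1.py"
rm "$1.tmp.ipynb"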
apps/run-app-tests.sh (118 changes: 61 additions & 57 deletions)
@@ -4,24 +4,24 @@ export SPARK_HOME=$SPARK_HOME
export MASTER=local[4]
export FTP_URI=$FTP_URI
export ANALYTICS_ZOO_HOME=$ANALYTICS_ZOO_HOME
export ANALYTICS_ZOO_HOME_DIST=$ANALYTICS_ZOO_HOME/dist
export ANALYTICS_ZOO_JAR=`find ${ANALYTICS_ZOO_HOME_DIST}/lib -type f -name "analytics-zoo*jar-with-dependencies.jar"`
export ANALYTICS_ZOO_PYZIP=`find ${ANALYTICS_ZOO_HOME_DIST}/lib -type f -name "analytics-zoo*python-api.zip"`
export ANALYTICS_ZOO_CONF=${ANALYTICS_ZOO_HOME_DIST}/conf/spark-analytics-zoo.conf
export ANALYTICS_ZOO_JAR=`find ${ANALYTICS_ZOO_HOME}/lib -type f -name "analytics-zoo*jar-with-dependencies.jar"`
export ANALYTICS_ZOO_PYZIP=`find ${ANALYTICS_ZOO_HOME}/lib -type f -name "analytics-zoo*python-api.zip"`
export ANALYTICS_ZOO_CONF=${ANALYTICS_ZOO_HOME}/conf/spark-analytics-zoo.conf
export PYTHONPATH=${ANALYTICS_ZOO_PYZIP}:$PYTHONPATH

chmod +x ./apps/ipynb2py.sh
chmod +x ${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh

set -e

echo "#1 start app test for anomaly-detection-nyc-taxi"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/anomaly-detection/anomaly-detection-nyc-taxi
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/anomaly-detection/anomaly-detection-nyc-taxi

chmod +x $ANALYTICS_ZOO_HOME/scripts/data/NAB/nyc_taxi/get_nyc_taxi.sh

$ANALYTICS_ZOO_HOME/scripts/data/NAB/nyc_taxi/get_nyc_taxi.sh
chmod +x ${ANALYTICS_ZOO_HOME}/bin/data/NAB/nyc_taxi/get_nyc_taxi.sh

${ANALYTICS_ZOO_HOME}/bin/data/NAB/nyc_taxi/get_nyc_taxi.sh
sed "s/nb_epoch=30/nb_epoch=3/g" ${ANALYTICS_ZOO_HOME}/apps/anomaly-detection/anomaly-detection-nyc-taxi.py
${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-cores 2 \
@@ -37,41 +37,42 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/anomaly-detection/anomaly-detection-nyc-taxi.py
now=$(date "+%s")
time=$((now-start))
echo "#1 anomaly-detection-nyc-taxi time used:$time seconds"
time1=$((now-start))


echo "#2 start app test for object-detection"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/object-detection/object-detection

FILENAME="$ANALYTICS_ZOO_HOME/apps/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model"

${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/object-detection/object-detection
FILENAME="${ANALYTICS_ZOO_HOME}/apps/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model"
if [ -f "$FILENAME" ]
then
echo "$FILENAME already exists"
echo "$FILENAME already exists"
else
wget $FTP_URI/analytics-zoo-models-new/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model -P $ANALYTICS_ZOO_HOME/apps/object-detection/
fi
wget $FTP_URI/analytics-zoo-models/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model -P ${ANALYTICS_ZOO_HOME}/apps/object-detection/
fi
if [ -f "$FILENAME" ]
then
echo "$FILENAME already exists"
echo "$FILENAME already exists"
else
wget https://s3-ap-southeast-1.amazonaws.com/analytics-zoo-models/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model -P $ANALYTICS_ZOO_HOME/apps/object-detection/
fi
FILENAME="$ANALYTICS_ZOO_HOME/apps/object-detection/train_dog.mp4"
wget https://s3-ap-southeast-1.amazonaws.com/analytics-zoo-models/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model -P ${ANALYTICS_ZOO_HOME}/apps/object-detection/
fi
FILENAME="${ANALYTICS_ZOO_HOME}/apps/object-detection/train_dog.mp4"
if [ -f "$FILENAME" ]
then
echo "$FILENAME already exists"
echo "$FILENAME already exists"
else
wget $FTP_URI/analytics-zoo-data/apps/object-detection/train_dog.mp4 -P $ANALYTICS_ZOO_HOME/apps/object-detection/
fi
wget $FTP_URI/analytics-zoo-data/apps/object-detection/train_dog.mp4 -P ${ANALYTICS_ZOO_HOME}/apps/object-detection/
fi
if [ -f "$FILENAME" ]
then
echo "$FILENAME already exists"
echo "$FILENAME already exists"
else
wget https://s3.amazonaws.com/analytics-zoo-data/train_dog.mp4 -P $ANALYTICS_ZOO_HOME/apps/object-detection/
fi
wget https://s3.amazonaws.com/analytics-zoo-data/train_dog.mp4 -P ${ANALYTICS_ZOO_HOME}/apps/object-detection/
fi


${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-cores 2 \
@@ -87,13 +88,12 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/object-detection/object-detection.py
now=$(date "+%s")
time=$((now-start))
echo "#2 object-detection time used:$time seconds"
time2=$((now-start))

echo "#3 start app test for ncf-explicit-feedback"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/recommendation/ncf-explicit-feedback
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/recommendation/ncf-explicit-feedback

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
@@ -110,13 +110,13 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/recommendation/ncf-explicit-feedback.py
now=$(date "+%s")
time=$((now-start))
echo "#3 ncf-explicit-feedback time used:$time seconds"
time3=$((now-start))


echo "#4 start app test for wide_n_deep"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/recommendation/wide_n_deep
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/recommendation/wide_n_deep

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
@@ -133,15 +133,13 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/recommendation/wide_n_deep.py
now=$(date "+%s")
time=$((now-start))
echo "#4 wide_n_deep time used:$time seconds"

time4=$((now-start))
echo "#5 start app test for using_variational_autoencoder_and_deep_feature_loss_to_generate_faces"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces

sed -i "s/data_files\[\:100000\]/data_files\[\:5000\]/g" ./apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces.py
sed -i "s/data_files\[\:100000\]/data_files\[\:5000\]/g" ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces.py
FILENAME="${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/bigdl_vgg-16_imagenet_0.4.0.model"
if [ -f "$FILENAME" ]
then
@@ -162,7 +160,7 @@ else
unzip -d ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/ ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/img_align_celeba.zip
echo "Finished"
fi

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-cores 2 \
@@ -171,22 +169,21 @@ ${SPARK_HOME}/bin/spark-submit \
--executor-cores 2 \
--executor-memory 12g \
--conf spark.akka.frameSize=64 \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces.py,${ANALYTICS_ZOO_HOME}/apps/variational_autoencoder/utils.py \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces.py,${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/utils.py \
--properties-file ${ANALYTICS_ZOO_CONF} \
--jars ${ANALYTICS_ZOO_JAR} \
--conf spark.driver.extraClassPath=${ANALYTICS_ZOO_JAR} \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_and_deep_feature_loss_to_generate_faces.py
now=$(date "+%s")
time=$((now-start))
echo "#5 using_variational_autoencoder_and_deep_feature_loss_to_generate_faces time used:$time seconds"
time5=$((now-start))

echo "#6 start app test for using_variational_autoencoder_to_generate_faces"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces

sed -i "s/data_files\[\:100000\]/data_files\[\:5000\]/g" ./apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces.py
sed -i "s/data_files\[\:100000\]/data_files\[\:5000\]/g" ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces.py
FILENAME="${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/img_align_celeba.zip"
if [ -f "$FILENAME" ]
then
@@ -206,20 +203,19 @@ ${SPARK_HOME}/bin/spark-submit \
--executor-cores 2 \
--executor-memory 12g \
--conf spark.akka.frameSize=64 \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces.py,${ANALYTICS_ZOO_HOME}/apps/variational_autoencoder/utils.py \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces.py,${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/utils.py \
--properties-file ${ANALYTICS_ZOO_CONF} \
--jars ${ANALYTICS_ZOO_JAR} \
--conf spark.driver.extraClassPath=${ANALYTICS_ZOO_JAR} \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_faces.py
now=$(date "+%s")
time=$((now-start))
echo "#6 using_variational_autoencoder_to_generate_faces time used:$time seconds"

time6=$((now-start))

echo "#7 start app test for using_variational_autoencoder_to_generate_digital_numbers"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/variational-autoencoder/using_variational_autoencoder_to_generate_digital_numbers
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_digital_numbers

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
@@ -236,13 +232,12 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/variational-autoencoder/using_variational_autoencoder_to_generate_digital_numbers.py
now=$(date "+%s")
time=$((now-start))
echo "#7 using_variational_autoencoder_to_generate_digital_numbers time used:$time seconds"
time7=$((now-start))

echo "#8 start app test for sentiment-analysis"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/sentiment-analysis/sentiment
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/sentiment-analysis/sentiment

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
@@ -259,12 +254,12 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/sentiment-analysis/sentiment.py
now=$(date "+%s")
time=$((now-start))
echo "#8 sentimentAnalysis time used:$time seconds"
time8=$((now-start))

echo "#9 start app test for image-augmentation"
#timer
start=$(date "+%s")
./apps/ipynb2py.sh ./apps/image-augmentation/image-augmentation
${ANALYTICS_ZOO_HOME}/apps/ipynb2py.sh ${ANALYTICS_ZOO_HOME}/apps/image-augmentation/image-augmentation

${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
@@ -278,5 +273,14 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/apps/image-augmentation/image-augmentation.py
now=$(date "+%s")
time=$((now-start))
echo "#9 image-augmentation time used:$time seconds"
time9=$((now-start))

echo "#1 anomaly-detection-nyc-taxi time used:$time1 seconds"
echo "#2 object-detection time used:$time2 seconds"
echo "#3 ncf-explicit-feedback time used:$time3 seconds"
echo "#4 wide_n_deep time used:$time4 seconds"
echo "#5 using_variational_autoencoder_and_deep_feature_loss_to_generate_faces time used:$time5 seconds"
echo "#6 using_variational_autoencoder_to_generate_faces time used:$time6 seconds"
echo "#7 using_variational_autoencoder_to_generate_digital_numbers time used:$time7 seconds"
echo "#8 sentimentAnalysis time used:$time8 seconds"
echo "#9 image-augmentation time used:$time9 seconds"
@@ -39,8 +39,8 @@
"# we use the vgg16 model, it should work on other popular CNN models\n",
"# You can download them here (https://github.com/intel-analytics/analytics-zoo/tree/master/models\n",
"# download the data CelebA, and may repalce with your own data path\n",
"DATA_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\") + \"/apps/variational_autoencoder/img_align_celeba\"\n",
"VGG_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\")+\"/apps/variational_autoencoder/bigdl_vgg-16_imagenet_0.4.0.model\"\n",
"DATA_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\") + \"/apps/variational-autoencoder/img_align_celeba\"\n",
"VGG_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\")+\"/apps/variational-autoencoder/bigdl_vgg-16_imagenet_0.4.0.model\"\n",
"\n",
"init_engine()"
]
@@ -38,7 +38,7 @@
"ENCODER_FILTER_NUM = 32\n",
"\n",
"#download the data CelebA, and may repalce with your own data path\n",
"DATA_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\") + \"/apps/variational_autoencoder/img_align_celeba\"\n",
"DATA_PATH = os.getenv(\"ANALYTICS_ZOO_HOME\") + \"/apps/variational-autoencoder/img_align_celeba\"\n",
"\n",
"init_engine()"
]
pyzoo/zoo/examples/run-example-tests.sh (60 changes: 45 additions & 15 deletions)
@@ -10,7 +10,11 @@ export ANALYTICS_ZOO_PYZIP=`find ${ANALYTICS_ZOO_HOME_DIST}/lib -type f -name "a
export ANALYTICS_ZOO_CONF=${ANALYTICS_ZOO_HOME_DIST}/conf/spark-analytics-zoo.conf
export PYTHONPATH=${ANALYTICS_ZOO_PYZIP}:$PYTHONPATH

set -e

echo "#1 start example test for textclassification"
#timer
start=$(date "+%s")
if [ -f analytics-zoo-data/data/glove.6B.zip ]
then
echo "analytics-zoo-data/data/glove.6B.zip already exists"
@@ -31,19 +35,20 @@ if [ ! -d analytics-zoo-models ]
then
mkdir analytics-zoo-models
fi
if [ -f analytics-zoo-models/analytics-zoo_squeezenet_imagenet_0.1.0 ]

if [ -f analytics-zoo-models/image-classification/analytics-zoo_squeezenet_imagenet_0.1.0.model ]
then
echo "analytics-zoo-models/analytics-zoo_squeezenet_imagenet_0.1.0 already exists"
echo "analytics-zoo-models/image-classification/analytics-zoo_squeezenet_imagenet_0.1.0.model already exists"
else
wget $FTP_URI/analytics-zoo-models/imageclassification/imagenet/analytics-zoo_squeezenet_imagenet_0.1.0 \
wget $FTP_URI/analytics-zoo-models/image-classification/analytics-zoo_squeezenet_imagenet_0.1.0.model\
-P analytics-zoo-models
fi
if [ -f analytics-zoo-models-new/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model ]
if [ -f analytics-zoo-models/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model ]
then
echo "analytics-zoo-models-new/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model already exists"
echo "analytics-zoo-models/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model already exists"
else
wget $FTP_URI/analytics-zoo-models-new/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model \
-P analytics-zoo-models-new
wget $FTP_URI/analytics-zoo-models/object-detection/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model \
-P analytics-zoo-models
fi

${SPARK_HOME}/bin/spark-submit \
@@ -58,7 +63,13 @@ ${SPARK_HOME}/bin/spark-submit \
--nb_epoch 2 \
--data_path analytics-zoo-data/data


now=$(date "+%s")
time1=$((now-start))

echo "#2 start example test for customized loss and layer (Funtional API)"
#timer
start=$(date "+%s")
${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-memory 20g \
@@ -69,27 +80,46 @@ ${SPARK_HOME}/bin/spark-submit \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/autograd/custom.py \
--nb_epoch 2
now=$(date "+%s")
time2=$((now-start))



echo "#4 start example test for object-detection"
#timer
start=$(date "+%s")
${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-memory 20g \
--executor-memory 20g \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/imageclassification/predict.py \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/objectdetection/predict.py \
--jars ${ANALYTICS_ZOO_JAR} \
--conf spark.driver.extraClassPath=${ANALYTICS_ZOO_JAR} \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/imageclassification/predict.py \
-f hdfs://172.168.2.181:9000/kaggle/train_100 \
--model analytics-zoo-models/analytics-zoo_squeezenet_imagenet_0.1.0 \
--topN 5
${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/objectdetection/predict.py \
analytics-zoo-models/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model hdfs://172.168.2.181:9000/kaggle/train_100 /tmp
now=$(date "+%s")
time4=$((now-start))

echo "#3 start example test for image-classification"
#timer
start=$(date "+%s")
${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--driver-memory 20g \
--executor-memory 20g \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/objectdetection/predict.py \
--py-files ${ANALYTICS_ZOO_PYZIP},${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/imageclassification/predict.py \
--jars ${ANALYTICS_ZOO_JAR} \
--conf spark.driver.extraClassPath=${ANALYTICS_ZOO_JAR} \
--conf spark.executor.extraClassPath=${ANALYTICS_ZOO_JAR} \
${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/objectdetection/predict.py \
analytics-zoo-models-new/analytics-zoo_ssd-mobilenet-300x300_PASCAL_0.1.0.model hdfs://172.168.2.181:9000/kaggle/train_100 /tmp
${ANALYTICS_ZOO_HOME}/pyzoo/zoo/examples/imageclassification/predict.py \
-f hdfs://172.168.2.181:9000/kaggle/train_100 \
--model analytics-zoo-models/analytics-zoo_squeezenet_imagenet_0.1.0.model \
--topN 5
now=$(date "+%s")
time3=$((now-start))

echo "#1 textclassification time used:$time1 seconds"
echo "#2 customized loss and layer time used:$time2 seconds"
echo "#3 image-classification time used:$time1 seconds"
echo "#4 object-detection loss and layer time used:$time2 seconds"