
Test did not produce expected results. Output was: spark-perf #114

Open
shujamughal opened this issue Aug 3, 2016 · 4 comments

Comments

@shujamughal

Hi,
I am using it with spark-2.0.0-bin-hadoop2.7, but it is not working and shows the following output:

Setting env var SPARK_SUBMIT_OPTS: -Dspark.storage.memoryFraction=0.66 -Dspark.serializer=org.apache.spark.serializer.JavaSerializer -Dspark.locality.wait=60000000 -Dsparkperf.commitSHA=unknown
Running command: /home/shuja/Desktop/data/spark-2.0.0-bin-hadoop2.7/bin/spark-submit --class spark.perf.TestRunner --master spark://shuja:7077 --driver-memory 1g /home/shuja/Desktop/data/spark-perf-master/spark-tests/target/spark-perf-tests-assembly.jar scheduling-throughput --num-trials=10 --inter-trial-wait=3 --num-tasks=10000 --num-jobs=1 --closure-size=0 --random-seed=5 1>> results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.out 2>> results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.err

Test did not produce expected results. Output was:

Java options: -Dspark.storage.memoryFraction=0.66 -Dspark.serializer=org.apache.spark.serializer.JavaSerializer -Dspark.locality.wait=60000000
Options: scheduling-throughput --num-trials=10 --inter-trial-wait=3 --num-tasks=10000 --num-jobs=1 --closure-size=0 --random-seed=5

@quartox

quartox commented Aug 19, 2016

If you look in the results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.err file, you will see the error output from Spark. That should help you isolate your specific error.
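The path above comes from the log output in the original report; each run writes logs under a timestamped directory, so a small sketch that tails the most recent `.err` log (glob pattern assumed from the run above) can save hunting for the right file:

```shell
# Find the newest .err log under results/ and print its tail.
# The directory pattern is taken from the run shown above; adjust if your
# spark-perf output directory differs.
ERR_LOG=$(ls -t results/spark_perf_output_*_logs/*.err 2>/dev/null | head -n 1)
if [ -n "$ERR_LOG" ]; then
  tail -n 50 "$ERR_LOG"
else
  echo "no .err logs found under results/"
fi
```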

@jianghaitao

I had a similar issue. Looking at the .err log, I found:

Exception in thread "main" java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.render(Lorg/json4s/JsonAST$JValue;)Lorg/json4s/JsonAST$JValue;
        at spark.perf.TestRunner$.main(TestRunner.scala:47)
        at spark.perf.TestRunner.main(TestRunner.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What is missing? I am running the tests on Ubuntu 16.04 with spark-2.0.0-bin-hadoop2.7.
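A `NoSuchMethodError` like this usually means a binary incompatibility: the spark-perf assembly was compiled against one json4s version, while the Spark distribution puts a different json4s on the runtime classpath, whose `JsonMethods.render` has a different signature. A quick way to see which json4s version your Spark bundles (the `SPARK_HOME` path here is the one from the original post; adjust it for your install):

```shell
# List the json4s jars bundled with the Spark distribution.
# SPARK_HOME is taken from the original post -- change it to your own path.
SPARK_HOME=/home/shuja/Desktop/data/spark-2.0.0-bin-hadoop2.7
ls "$SPARK_HOME"/jars/json4s-* 2>/dev/null || echo "no json4s jars found under $SPARK_HOME/jars"
```

Comparing that version against the one in the spark-perf build shows whether the two disagree.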

@JosephArnold

Has this error been resolved? I am facing the same error. Can anyone help?

@ElfoLiNk

ElfoLiNk commented Oct 19, 2016

Use the spark-perf fork at https://github.com/a-roberts/spark-perf for Spark 2.0, or update the json4s dependency.
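For the second option, a sketch of what "update the json4s dependency" could look like in the spark-tests sbt build, assuming it is sbt-based and that the version to pin matches whatever json4s jar your Spark distribution bundles (check its `jars/` directory); this fragment is illustrative, not the exact change from the fork:

```scala
// Hypothetical sbt fragment: align json4s with the version the target Spark
// distribution bundles, marking it "provided" so the assembly does not ship
// its own conflicting copy.
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.2.11" % "provided"

// If a transitive dependency still pulls in another json4s, force the version:
dependencyOverrides += "org.json4s" %% "json4s-jackson" % "3.2.11"
```

After changing the build, rebuild the assembly jar so spark-submit picks up the recompiled classes.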
