Getting a jar from the CAPS project in IntelliJ and running it in a cluster #909
Comments
Just run the allJar Gradle task.
Thanks @Mats-SX. How do I specify a specific Scala class to be run? And how do I run the output jar, is it with spark-submit?
This depends on how you've set up your Spark cluster. Generally, use spark-submit as described at https://spark.apache.org/docs/latest/submitting-applications.html
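For reference, a minimal spark-submit invocation usually looks like the sketch below. The master URL, class name, and jar path are placeholders and depend on how your cluster is set up:

```bash
# Minimal sketch of a spark-submit invocation (placeholder names throughout).
# --class selects the main class to run; --master points at your cluster.
spark-submit \
  --class com.example.MyCypherApp \
  --master spark://your-master-host:7077 \
  --deploy-mode cluster \
  /path/to/your-application-fat.jar \
  arg1 arg2
```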
Thank you. And when building CAPS with multiple main classes, how do I specify which main class to run?
When I run the allJar task I get: Task 'allJar' not found in root project 'okapi'. Some candidates are: 'docJar', 'jar', 'jmhJar'.
That's odd; it works when I run it here. In order to specify the main class, please refer to the standard Spark documentation. We don't do anything special, so the standard approaches should be enough.
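One possible explanation for the error above is that the task is defined in a subproject rather than in the root project. A hedged sketch of how to check; the subproject name ':spark-cypher' is only a guess, not confirmed by the thread:

```bash
# List every task in every subproject to find where allJar is defined.
./gradlew tasks --all | grep -i alljar

# If the task lives in a subproject (the name here is a guess), qualify it:
./gradlew :spark-cypher:allJar
```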
Thanks @Mats-SX. I figured out that the problem was with how I ran the command and with an older version of Morpheus. I'm just confused about one point: if you only have the morpheus-spark-cypher-all.jar with multiple classes inside, how do you specify which one will be submitted or run as a job?
OK @Mats-SX, I have figured this out by putting the -all jar into Spark and running the examples jar by simply giving the name of the class I want to run.
Can you tell me what the command is for running the examples using the -all jar file?
@Mats-SX @MohamedRagabAnas Can you please help me with this? It's very urgent :(
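Based on the approach described a few comments above, a sketch of what such a command might look like. The example class name and jar paths are placeholders, not actual names from the project:

```bash
# Sketch only: the example class and jar paths are placeholders.
# The -all jar is put on the classpath via --jars, while --class names
# the main class of the example inside the examples jar.
spark-submit \
  --class com.example.YourExampleClass \
  --master spark://your-master-host:7077 \
  --jars /path/to/morpheus-spark-cypher-all.jar \
  /path/to/examples.jar
```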
Hello @zhengminlai. The examples are not included in the -all jar.
@Mats-SX the new version of Morpheus doesn't support |
Hi, I want to run a specific example from the CAPS project on another machine. How do I extract the jar file from the CAPS project (should it be a fat jar)?
Then, how do I run it on that machine? Will it be submitted as a Spark job using 'spark-submit', or what is the right way to run it?
Thanks in advance