Can't connect to spark with spark-sas7bdat package #9

Open
matthewgson opened this issue Aug 28, 2021 · 1 comment

Comments

@matthewgson
I was trying to follow the example to connect, but it keeps failing. Here's the code I ran on my machine:

library(sparklyr)
spark_install(version = "2.0.1", hadoop_version = "2.7")
library(spark.sas7bdat)
mysasfile <- system.file("extdata", "iris.sas7bdat", package = "spark.sas7bdat")
sc <- spark_connect(master = "local")

* Using Spark: 2.0.1
Error in spark_connect_gateway(gatewayAddress, gatewayPort, sessionId,  : 
  Gateway in localhost:8880 did not respond.


Try running `options(sparklyr.log.console = TRUE)` followed by `sc <- spark_connect(...)` for more debugging info.

Following the error message's suggestion, here's the debugging output:

options(sparklyr.log.console = TRUE)
sc <- spark_connect(master = "local")


* Using Spark: 2.0.1
Ivy Default Cache set to: /Users/matthewson/.ivy2/cache
The jars for the packages stored in: /Users/matthewson/.ivy2/jars
:: loading settings :: url = jar:file:/Users/matthewson/spark/spark-2.0.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
saurfang#spark-sas7bdat added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
:: resolution report :: resolve 774ms :: artifacts dl 0ms
	:: modules in use:
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
		module not found: saurfang#spark-sas7bdat;2.0.0-s_2.11

	==== local-m2-cache: tried

	  file:/Users/matthewson/.m2/repository/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.pom

	  -- artifact saurfang#spark-sas7bdat;2.0.0-s_2.11!spark-sas7bdat.jar:

	  file:/Users/matthewson/.m2/repository/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.jar

	==== local-ivy-cache: tried

	  /Users/matthewson/.ivy2/local/saurfang/spark-sas7bdat/2.0.0-s_2.11/ivys/ivy.xml

	  -- artifact saurfang#spark-sas7bdat;2.0.0-s_2.11!spark-sas7bdat.jar:

	  /Users/matthewson/.ivy2/local/saurfang/spark-sas7bdat/2.0.0-s_2.11/jars/spark-sas7bdat.jar

	==== central: tried

	  https://repo1.maven.org/maven2/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.pom

	  -- artifact saurfang#spark-sas7bdat;2.0.0-s_2.11!spark-sas7bdat.jar:

	  https://repo1.maven.org/maven2/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.jar

	==== spark-packages: tried

	  http://dl.bintray.com/spark-packages/maven/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.pom

	  -- artifact saurfang#spark-sas7bdat;2.0.0-s_2.11!spark-sas7bdat.jar:

	  http://dl.bintray.com/spark-packages/maven/saurfang/spark-sas7bdat/2.0.0-s_2.11/spark-sas7bdat-2.0.0-s_2.11.jar

		::::::::::::::::::::::::::::::::::::::::::::::

		::          UNRESOLVED DEPENDENCIES         ::

		::::::::::::::::::::::::::::::::::::::::::::::

		:: saurfang#spark-sas7bdat;2.0.0-s_2.11: not found

		::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: saurfang#spark-sas7bdat;2.0.0-s_2.11: not found]
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1076)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
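For what it's worth, the `spark-packages` resolver in the log above still points at `dl.bintray.com`, which was shut down in 2021; that would explain why `saurfang#spark-sas7bdat;2.0.0-s_2.11` can no longer be resolved even though the coordinates are correct. A possible workaround (just a sketch, assuming that `sparklyr.shell.*` config options are forwarded to `spark-submit` as command-line flags, which recent sparklyr versions do) is to point the resolver at the replacement repository:

```r
library(sparklyr)

# Sketch of a workaround: the old spark-packages host (dl.bintray.com)
# was retired, so tell spark-submit to also search the replacement
# repository when resolving the spark-sas7bdat dependency.
config <- spark_config()
config$sparklyr.shell.repositories <- "https://repos.spark-packages.org"

sc <- spark_connect(master = "local", config = config)
```

I haven't verified this against Spark 2.0.1 specifically; the repository URL and the use of `sparklyr.shell.repositories` are assumptions worth checking against your sparklyr version.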

Here's my sessionInfo():

sessionInfo()
R version 4.1.1 (2021-08-10)
Platform: x86_64-apple-darwin17.0 (64-bit)
Running under: macOS Big Sur 11.5.2

Matrix products: default
LAPACK: /Library/Frameworks/R.framework/Versions/4.1/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] spark.sas7bdat_1.4 sparklyr_1.7.1    

loaded via a namespace (and not attached):
 [1] pillar_1.6.2      compiler_4.1.1    dbplyr_2.1.1      r2d3_0.2.5        base64enc_0.1-3   tools_4.1.1       digest_0.6.27    
 [8] jsonlite_1.7.2    lifecycle_1.0.0   tibble_3.1.4      pkgconfig_2.0.3   rlang_0.4.11      DBI_1.1.1         rstudioapi_0.13  
[15] curl_4.3.2        yaml_2.2.1        parallel_4.1.1    fastmap_1.1.0     withr_2.4.2       dplyr_1.0.7       httr_1.4.2       
[22] generics_0.1.0    vctrs_0.3.8       htmlwidgets_1.5.3 askpass_1.1       rappdirs_0.3.3    rprojroot_2.0.2   tidyselect_1.1.1 
[29] glue_1.4.2        forge_0.2.0       R6_2.5.1          fansi_0.5.0       purrr_0.3.4       tidyr_1.1.3       magrittr_2.0.1   
[36] ellipsis_0.3.2    htmltools_0.5.2   assertthat_0.2.1  config_0.3.1      utf8_1.2.2        openssl_1.4.4     crayon_1.4.1   
@henrydehe
