[SPARK-16408][SQL] SparkSQL Added file get Exception: is a directory and recursive is not turned on #14085

Conversation
Can one of the admins verify this patch?
@@ -113,8 +113,9 @@ case class AddFile(path: String) extends RunnableCommand {

   override def run(sqlContext: SQLContext): Seq[Row] = {
     val hiveContext = sqlContext.asInstanceOf[HiveContext]
+    val recursive = sqlContext.sparkContext.getConf.getBoolean("spark.input.dir.recursive", false)
I'm not sure these are semantics that are supported by the SQL dialect in Spark SQL. In any event the name of this property is too generic, and I don't think it is something that is set globally.
I'm pretty sure that it's supported by the SQL dialect in Spark SQL.
As for "the name of this property is too generic, and I don't think it is something that is set globally": do you think we should use another name, and should the default value be true?
And by the way, I have tried:
val recursive = hiveContext.getConf("spark.input.dir.recursive", "false")
but this only works in spark-sql by executing set spark.input.dir.recursive=true before the add file statement; we can't set the value with --conf spark.input.dir.recursive=true. This makes it difficult for us to move some Hive SQL directly to SparkSQL.
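To make the difference concrete, here is a rough sketch of the two lookups (assuming the Spark 1.6-era APIs used in this file, with sqlContext and hiveContext in scope as in the diff above; the val names are just illustrative):

    // Session-scoped SQLConf: per the behaviour described above, this picks up
    // "SET spark.input.dir.recursive=true" issued inside spark-sql, but not a
    // submit-time --conf for this key.
    val fromSqlConf = hiveContext.getConf("spark.input.dir.recursive", "false").toBoolean

    // SparkConf fixed at submit time: this is what --conf spark.input.dir.recursive=true
    // sets, and a later SET inside the SQL session does not change it.
    val fromSparkConf =
      sqlContext.sparkContext.getConf.getBoolean("spark.input.dir.recursive", false)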
Adding this session-scoped configuration is risky. If needed, we can improve the SQL syntax to support it.
I was wondering if we could call:
sparkSession.sparkContext.addFile(path, true)
in the AddFileCommand function, since adding a directory recursively is a common need in ETL.
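A minimal sketch of that suggestion, assuming the Spark 2.x shape of AddFileCommand (class and method signatures here are approximations, not a verified copy of the current source):

    import org.apache.spark.sql.{Row, SparkSession}
    import org.apache.spark.sql.execution.command.RunnableCommand

    case class AddFileCommand(path: String) extends RunnableCommand {
      override def run(sparkSession: SparkSession): Seq[Row] = {
        // Always pass recursive = true so that "ADD FILE <directory>" ships the whole
        // directory tree instead of failing with "is a directory and recursive is not turned on".
        sparkSession.sparkContext.addFile(path, true)
        Seq.empty[Row]
      }
    }

This would avoid introducing a new configuration key at all, at the cost of always enabling recursive copying for ADD FILE.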
@zenglinxi0615 Just one quick question, could we give a default configuration for
@zenglinxi0615 Could you answer the question above if you are still active?
What changes were proposed in this pull request?
This PR adds a parameter (spark.input.dir.recursive) to control the value of recursive in SparkContext#addFile, so we can support the "add file hdfs://dir/path" command in SparkSQL.
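Concretely, with the change from the diff above, the run method of the Hive AddFile command would look roughly like this (the runSqlHive and addFile lines are assumed from the existing command body, which is outside the visible hunk):

    override def run(sqlContext: SQLContext): Seq[Row] = {
      val hiveContext = sqlContext.asInstanceOf[HiveContext]
      // Read the proposed flag from the submit-time SparkConf, defaulting to false.
      val recursive =
        sqlContext.sparkContext.getConf.getBoolean("spark.input.dir.recursive", false)
      hiveContext.runSqlHive(s"ADD FILE $path")
      // Forward the flag so that a directory path is distributed recursively.
      hiveContext.sparkContext.addFile(path, recursive)
      Seq.empty[Row]
    }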
How was this patch tested?
manual tests:
set the conf --conf spark.input.dir.recursive=true, then run spark-sql -e "add file hdfs://dir/path"