
[SPARK-16408][SQL] SparkSQL Added file get Exception: is a directory … #14085

Closed · wants to merge 1 commit
Conversation

zenglinxi0615

What changes were proposed in this pull request?

This PR adds a parameter (spark.input.dir.recursive) to control the value of recursive in SparkContext#addFile, so that the "add file hdfs://dir/path" command is supported in Spark SQL.

How was this patch tested?

manual tests:
Set the conf with --conf spark.input.dir.recursive=true, then run spark-sql -e "add file hdfs://dir/path".

@AmplabJenkins

Can one of the admins verify this patch?

@@ -113,8 +113,9 @@ case class AddFile(path: String) extends RunnableCommand {

override def run(sqlContext: SQLContext): Seq[Row] = {
val hiveContext = sqlContext.asInstanceOf[HiveContext]
val recursive = sqlContext.sparkContext.getConf.getBoolean("spark.input.dir.recursive", false)
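The lookup in the diff follows SparkConf.getBoolean semantics: when the key is absent the default is returned, otherwise the stored string is parsed as a boolean. A minimal standalone sketch of that lookup (plain Scala, no Spark dependency; recursiveFlag is a hypothetical helper written here for illustration, not part of the patch):

```scala
// Hypothetical helper mirroring
// SparkConf.getBoolean("spark.input.dir.recursive", false):
// fall back to the default when the key is absent,
// otherwise parse the stored string as a boolean.
def recursiveFlag(conf: Map[String, String], default: Boolean = false): Boolean =
  conf.get("spark.input.dir.recursive").map(_.toBoolean).getOrElse(default)
```

With --conf spark.input.dir.recursive=true the key is present and parses to true; without it, the lookup falls back to the default (false).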
Member

I'm not sure these are semantics that are supported by the SQL dialect in Spark SQL. In any event the name of this property is too generic, and I don't think it is something that is set globally.

Author

I'm pretty sure that it's supported by the SQL dialect in Spark SQL.
Regarding "the name of this property is too generic, and I don't think it is something that is set globally": do you think we should use another name? And should the default value be true?

Author
@zenglinxi0615 Jul 7, 2016

And by the way, I have tried:
val recursive = hiveContext.getConf("spark.input.dir.recursive", "false")
but this only works in spark-sql by executing set spark.input.dir.recursive=true before add file; we can't set the value with --conf spark.input.dir.recursive=true. This makes it difficult for us to move some Hive SQL directly to Spark SQL.
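The mismatch described here can be modeled as two separate stores: the SQL-conf path (hiveContext.getConf) reads session properties populated by SET statements, while sparkContext.getConf reads the SparkConf frozen from --conf at startup. An illustrative sketch in plain Scala (the two maps stand in for SQLConf and SparkConf; this is a simplified model, not real Spark internals):

```scala
// Two hypothetical stores, standing in for Spark's real config objects:
//   sparkConf - populated from --conf flags when the context is created
//   sqlConf   - session-scoped, mutated by "SET key=value" statements
case class Confs(sparkConf: Map[String, String], sqlConf: Map[String, String]) {
  // Mirrors hiveContext.getConf(key, default): sees only session SQL properties.
  def getSqlConf(key: String, default: String): String =
    sqlConf.getOrElse(key, default)

  // Mirrors sparkContext.getConf.getBoolean(key, default): sees only --conf values.
  def getSparkConf(key: String, default: Boolean): Boolean =
    sparkConf.get(key).map(_.toBoolean).getOrElse(default)
}
```

A flag set through one path is invisible to the other, which is exactly the behavior reported above: the SQL-conf read works after SET but not with --conf, and vice versa.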

Member

Adding this session-scoped configuration is risky. If needed, we can improve the SQL syntax for supporting it.

Author

I was wondering if we could call:
sparkSession.sparkContext.addFile(path, true)
in the AddFileCommand function, since recursive adds are a general demand in ETL.
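The unconditional variant suggested here (always pass recursive = true, no flag at all) can be sketched without a cluster; addFile below is an injected stand-in for sparkSession.sparkContext.addFile, introduced only so the sketch stays self-contained:

```scala
// Hypothetical sketch of the suggestion: AddFileCommand always adds recursively.
// `addFile` stands in for sparkSession.sparkContext.addFile(path, recursive).
case class AddFileCommand(path: String) {
  def run(addFile: (String, Boolean) => Unit): Unit =
    addFile(path, true) // recursive is hard-coded to true
}
```

The trade-off versus the PR's approach: no new config key to name or scope, but directory adds become recursive for every caller, which is the behavior change the reviewers would need to accept.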

@jinxing64

@zenglinxi0615
This PR is about adding all files in a directory recursively, so there is no need to enumerate all the filenames, right? I think this can be pretty useful, especially in a production environment.

Just one quick question: could we give spark.input.dir.recursive a default value and, at the same time, allow setting it via set spark.input.dir.recursive=true?

@HyukjinKwon
Member

@zenglinxi0615 Could you answer the question above if you are still active?
