[SPARK-15763][SQL] Support DELETE FILE command natively #13506
Conversation
get latest code from upstream
adding trim characters support
get latest code for pr12646
merge latest code
merge upstream/master
Can one of the admins verify this patch?
 * use `SparkFiles.get(fileName)` to find its download location.
 *
 */
def deleteFile(path: String): Unit = {
This is fairly confusing -- I'd assume this is actually deleting the given path.
Hi Reynold, thanks very much for reviewing the code.
Yes, it is deleting the path from the addedFiles hashmap; the path is normalized into a key and stored in the map.
addFile uses this logic to generate the key it stores in the hashmap, so in order to look up the same entry, deleteFile has to generate the key the same way.
For example:
for a local file, addFile will prepend a `file:` scheme to the path:
scala> spark.sql("add file /Users/qianyangyu/myfile.txt")
scala> spark.sql("list file").show(false)
+----------------------------------+
|Results |
+----------------------------------+
|file:/Users/qianyangyu/myfile2.txt|
|file:/Users/qianyangyu/myfile.txt |
+----------------------------------+
but for a remote file, addFile keeps the path unchanged:
scala> spark.sql("add file hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt")
res17: org.apache.spark.sql.DataFrame = []
scala> spark.sql("list file").show(false)
+---------------------------------------------+
|Results |
+---------------------------------------------+
|file:/Users/qianyangyu/myfile.txt |
|hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt|
+---------------------------------------------+
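The normalization described above can be sketched as follows. This is a minimal, hypothetical illustration of the behavior shown in the two sessions, not Spark's actual implementation; `FileKey` and `toKey` are names invented for this sketch.

```scala
import java.net.URI

// Hypothetical sketch of the key normalization described above:
// a local path (no URI scheme) gets a "file:" scheme prepended,
// while a path that already carries a scheme (e.g. hdfs://) is kept as-is.
object FileKey {
  def toKey(path: String): String = {
    val uri = new URI(path)
    if (uri.getScheme == null) {
      // Local path: resolve to an absolute path and add the file: scheme.
      "file:" + new java.io.File(path).getAbsolutePath
    } else {
      // Remote path (hdfs://, s3://, ...): use as-is.
      path
    }
  }
}

// FileKey.toKey("/Users/qianyangyu/myfile.txt")
//   yields "file:/Users/qianyangyu/myfile.txt"
// FileKey.toKey("hdfs://bdavm009.svl.ibm.com:8020/tmp/test.txt")
//   is returned unchanged
```

Because both addFile and deleteFile would run the path through the same normalization, the delete can find exactly the entry the add created.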
If the command is issued from a worker node to add a local file, the path is registered in the NettyStreamManager's hashmap, and that environment's path is used as the key in addedFiles.
I have updated the deleteFile comments to make them clearer. Thanks for reviewing.
@kevinyu98 Could you update the PR and fix merge conflicts? Thanks
@vanzin Hello Marcelo: I am sorry that I didn't notice your update. I have fixed the merge conflicts; can you help review it? Thanks.
@kevinyu98 Can you please close it? It seems like there is not a lot of interest in adding this functionality natively in Spark. If anybody wants this feature, we can reopen it later?
sure
We are closing it due to inactivity. Please do reopen it if you want to push it forward. Thanks!
What changes were proposed in this pull request?
Hive supports these CLI commands to manage resources (see the Hive docs):
ADD/DELETE (FILE(S) <filepath ...> | JAR(S) <jarpath ...>)
LIST (FILE(S) [filepath ...] | JAR(S) [jarpath ...])
but Spark currently supports only two of them:
ADD (FILE <filepath> | JAR <jarpath>)
LIST (FILE(S) [filepath ...] | JAR(S) [jarpath ...])
This PR adds the DELETE FILE command to Spark SQL; I will submit another PR for DELETE JAR(S).
DELETE FILE <filepath>
Example:
DELETE FILE
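If merged, the proposed command could be exercised from the spark-shell alongside the existing ADD FILE and LIST FILE commands. The session below is hypothetical: it assumes an active SparkSession named `spark`, and DELETE FILE is the syntax this PR proposes, not a command that shipped in Spark.

```scala
// Hypothetical spark-shell session for the proposed DELETE FILE command.
spark.sql("ADD FILE /Users/qianyangyu/myfile.txt")
spark.sql("LIST FILE").show(false)    // would show file:/Users/qianyangyu/myfile.txt
spark.sql("DELETE FILE /Users/qianyangyu/myfile.txt")
spark.sql("LIST FILE").show(false)    // the file would no longer appear
```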
How was this patch tested?
Added test cases in the Spark SQL, Spark shell, and SparkContext suites.