
SPARK-3794 [CORE] Building spark core fails due to inadvertent dependency on Commons IO #2662

Closed · wants to merge 1 commit

Conversation

srowen
Member

@srowen srowen commented Oct 5, 2014

Remove references to Commons IO FileUtils and replace with pure Java version, which doesn't need to traverse the whole directory tree first.

I think this method could be refined further if it would be alright to rename it and its args and break it down into two methods. I'm starting with a simple recursive rendition.
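A minimal sketch of the approach described above, assuming plain `java.io.File` traversal that short-circuits as soon as one sufficiently new file is found, instead of materializing the whole directory tree up front the way Commons IO `FileUtils.listFiles` does. The method name and signature here are illustrative, not the actual `Utils.scala` code:

```scala
import java.io.File

// Hypothetical sketch: recurse with java.io.File and return early on the
// first file newer than the cutoff, rather than listing every file first.
def containsAnyNewFiles(dir: File, cutoffSeconds: Long): Boolean = {
  if (!dir.isDirectory) {
    throw new IllegalArgumentException(s"$dir is not a directory!")
  }
  val cutoffTimeInMillis = System.currentTimeMillis - (cutoffSeconds * 1000)
  def anyNew(f: File): Boolean = {
    if (f.isDirectory) {
      // listFiles returns null on I/O error; treat that as "no new files"
      val children = f.listFiles()
      children != null && children.exists(anyNew)
    } else {
      f.lastModified > cutoffTimeInMillis
    }
  }
  anyNew(dir)
}
```

Because `exists` stops at the first match, the traversal never visits more of the tree than it has to.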

@SparkQA

SparkQA commented Oct 5, 2014

QA tests have started for PR 2662 at commit 4cd172f.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Oct 5, 2014

QA tests have finished for PR 2662 at commit 4cd172f.

  • This patch passes unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21313/

@ash211
Contributor

ash211 commented Oct 5, 2014

I believe this was introduced in #2609 -- any idea why Jenkins didn't catch the build issue?

cc @mccheah

@marmbrus
Contributor

marmbrus commented Oct 6, 2014

@ash211 I'd guess that depends on the version of Hadoop we're compiling with. It did cause failures on some versions of the master build.

@srowen thanks for fixing this! I'm going to merge to master.

@asfgit asfgit closed this in 8d22dbb Oct 6, 2014
The diff excerpt under review:

```scala
val cutoffTimeInMillis = (currentTimeMillis - (cutoff * 1000))
val newFiles = files.filter { _.lastModified > cutoffTimeInMillis }
newFiles.nonEmpty
throw new IllegalArgumentException("$dir is not a directory!")
```
Contributor

this is not string interpolated (missing s)

Member Author

Ack, sorry, look at what happens when I 'improve' a line of code. Anyone feel free to zap it or I'm about to open a related PR anyway that can fix it.
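For readers following along, the slip being discussed is Scala's string-interpolation prefix: without the `s` before the quote, `$dir` is kept as literal text. A minimal illustration (the value of `dir` here is hypothetical):

```scala
val dir = "/tmp/spark"
// Missing the `s` prefix: "$dir" is not substituted.
val wrong = "$dir is not a directory!"
// With the `s` prefix, $dir is replaced by the variable's value.
val right = s"$dir is not a directory!"
```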

@mccheah
Contributor

mccheah commented Oct 6, 2014

Sorry about that. I think Jenkins should be catching these kinds of build failures, though. Jenkins should attempt to build the project against multiple versions of Hadoop, since contributors may be used to libraries like FileUtils and others that have compatibility issues across versions.

I've opened https://issues.apache.org/jira/browse/SPARK-3819 to consider updating the Jenkins build process. Feel free to discuss there whether such measures are necessary.

@vanzin
Contributor

vanzin commented Oct 6, 2014

@mccheah agreed about Jenkins catching these, but at the same time it's sort of sketchy to rely on transitive dependencies of Hadoop, for exactly that reason.

commons-io is not an explicit dependency of Spark, so it should be avoided.

@mccheah
Contributor

mccheah commented Oct 6, 2014

Fair enough. I guess I didn't actually check Spark's explicit dependencies before choosing the library, so when it just magically appeared in autocomplete in Eclipse I assumed it was okay to use. Certainly it was my fault.

The bottom line is that we could be more explicit about this. Catching it in the build would certainly be explicit. Perhaps also something in the documentation?
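One possible way to make the banned status explicit at build time, as suggested above, is the Maven Enforcer Plugin's `bannedDependencies` rule. This is a hypothetical sketch, not Spark's actual build configuration; the placement and comment are illustrative:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-banned-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <!-- commons-io is only a transitive dependency of some Hadoop
                 versions; fail the build if it is declared directly. -->
            <searchTransitive>false</searchTransitive>
            <excludes>
              <exclude>commons-io:commons-io</exclude>
            </excludes>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With `searchTransitive` set to `false`, only direct declarations are banned, so Hadoop can still pull commons-io in transitively without failing the build.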

@srowen srowen deleted the SPARK-3794 branch October 9, 2014 06:41
8 participants