refine comments
yinxusen committed Apr 3, 2014
1 parent 0af3faf commit 7191be6
Showing 2 changed files with 5 additions and 1 deletion.
2 changes: 2 additions & 0 deletions core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -394,6 +394,8 @@ class SparkContext(
* ...
* (a-hdfs-path/part-nnnnn, its content)
* }}}
*
* @note Small files are preferred; large files are also allowed, but may degrade performance.
*/
def wholeTextFiles(path: String): RDD[(String, String)] = {
newAPIHadoopFile(
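For context (not part of this commit), a minimal Scala sketch of how the `wholeTextFiles` API documented above might be used; the application name, the example directory `hdfs://a-hdfs-path`, and the per-file line count are illustrative:

{{{
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // pair-RDD functions (needed in pre-1.3 Spark)

val conf = new SparkConf().setAppName("WholeTextFilesExample")
val sc = new SparkContext(conf)

// Each element is (file path, entire file content), one pair per file.
val files = sc.wholeTextFiles("hdfs://a-hdfs-path")

// For example, count lines per file without ever splitting a file across records.
val lineCounts = files.mapValues(_.split("\n").length)
lineCounts.collect().foreach { case (path, n) => println(s"$path: $n lines") }
}}}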
4 changes: 3 additions & 1 deletion core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala
@@ -167,7 +167,7 @@ class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWork
* hdfs://a-hdfs-path/part-nnnnn
* }}}
*
* Do `JavaPairRDD<String, String> rdd = context.wholeTextFiles("hdfs://a-hdfs-path")`,
* Do `JavaPairRDD<String, String> rdd = sparkContext.wholeTextFiles("hdfs://a-hdfs-path")`,
*
* <p> then `rdd` contains
* {{{
@@ -176,6 +176,8 @@ class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWork
* ...
* (a-hdfs-path/part-nnnnn, its content)
* }}}
*
* @note Small files are preferred; large files are also allowed, but may degrade performance.
*/
def wholeTextFiles(path: String): JavaPairRDD[String, String] =
new JavaPairRDD(sc.wholeTextFiles(path))
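Likewise, a sketch of calling the Java-friendly wrapper documented above, written here in Scala for consistency with the earlier example; the variable names and the collect-and-print loop are illustrative:

{{{
import org.apache.spark.SparkConf
import org.apache.spark.api.java.{JavaPairRDD, JavaSparkContext}
import scala.collection.JavaConverters._

val conf = new SparkConf().setAppName("JavaWholeTextFilesExample")
val jsc = new JavaSparkContext(conf)

// Same (path, content) pairs as the Scala API, exposed through the Java wrapper.
val rdd: JavaPairRDD[String, String] = jsc.wholeTextFiles("hdfs://a-hdfs-path")

// collect() returns a java.util.List of (path, content) tuples.
for (pair <- rdd.collect().asScala) {
  println(pair._1 + " -> " + pair._2.length + " chars")
}
}}}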
