SPARK-1168, Added foldByKey to pyspark.
Author: Prashant Sharma <[email protected]>

Closes apache#115 from ScrapCodes/SPARK-1168/pyspark-foldByKey and squashes the following commits:

db6f67e [Prashant Sharma] SPARK-1168, Added foldByKey to pyspark.
ScrapCodes authored and mateiz committed Mar 10, 2014
1 parent f551898 commit a59419c
Showing 1 changed file with 14 additions and 0 deletions.
14 changes: 14 additions & 0 deletions python/pyspark/rdd.py
@@ -946,7 +946,21 @@ def _mergeCombiners(iterator):
                     combiners[k] = mergeCombiners(combiners[k], v)
             return combiners.iteritems()
         return shuffled.mapPartitions(_mergeCombiners)
+
+    def foldByKey(self, zeroValue, func, numPartitions=None):
+        """
+        Merge the values for each key using an associative function "func"
+        and a neutral "zeroValue" which may be added to the result an
+        arbitrary number of times, and must not change the result
+        (e.g., 0 for addition, or 1 for multiplication).
+
+        >>> rdd = sc.parallelize([("a", 1), ("b", 1), ("a", 1)])
+        >>> from operator import add
+        >>> rdd.foldByKey(0, add).collect()
+        [('a', 2), ('b', 1)]
+        """
+        return self.combineByKey(lambda v: func(zeroValue, v), func, func, numPartitions)
+
+
     # TODO: support variant with custom partitioner
     def groupByKey(self, numPartitions=None):
         """
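The behavior the commit adds can be illustrated with a minimal pure-Python sketch that runs without a SparkContext. The helper name `fold_by_key` is hypothetical (not part of the pyspark API); it mirrors what `RDD.foldByKey` computes on a local list of key-value pairs.

```python
from operator import add

def fold_by_key(pairs, zero_value, func):
    """Fold each key's values starting from zero_value, like RDD.foldByKey.

    Hypothetical local stand-in for illustration only.
    """
    result = {}
    for k, v in pairs:
        # Mirrors combineByKey(lambda v: func(zeroValue, v), func, func):
        # the first value for a key is folded into zero_value, later
        # values are folded into the running result.
        result[k] = func(result.get(k, zero_value), v)
    return sorted(result.items())

print(fold_by_key([("a", 1), ("b", 1), ("a", 1)], 0, add))
# [('a', 2), ('b', 1)]
```

This matches the doctest in the diff: with `zeroValue=0` and `add`, the two `"a"` values fold to 2 and the single `"b"` value stays 1. Because `zeroValue` may be applied once per partition in the distributed case, it must be neutral for `func` (0 for addition, 1 for multiplication).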
