… driver to executor
What changes were proposed in this pull request?
SPARK-23754 was fixed in apache#21383 by changing the UDF code to wrap the user function, but this required a hack to save its argspec. This PR reverts that change and fixes the `StopIteration` bug in the worker instead.
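To see why the original driver-side wrapping needed that hack, note that a plain wrapper hides the user function's signature from introspection. The sketch below is illustrative only, not Spark's actual code; `udf_fn` is a made-up user function:

```python
import inspect

def fail_on_stopiteration(f):
    # A plain wrapper in the style of the reverted approach: introspection
    # now sees the wrapper's (*args, **kwargs) instead of the user
    # function's real parameters, which is why apache#21383 had to save
    # the original argspec separately.
    def wrapper(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except StopIteration as exc:
            raise RuntimeError("StopIteration in user code", exc)
    return wrapper

def udf_fn(a, b):  # hypothetical user function
    return a + b

print(inspect.getfullargspec(udf_fn).args)                         # ['a', 'b']
print(inspect.getfullargspec(fail_on_stopiteration(udf_fn)).args)  # []
```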
How does this work?
The root of the problem is that when a user-supplied function raises a `StopIteration`, PySpark might silently stop processing data if that function is called inside a for-loop: the iterator protocol treats a `StopIteration` escaping from `__next__` as normal end-of-iteration, so the loop simply ends early. The solution is to catch `StopIteration` exceptions and re-raise them as `RuntimeError`s, so that execution fails and the error is reported to the user. This is done using the `fail_on_stopiteration` wrapper, applied in different ways depending on where the function is used.
How was this patch tested?
Same tests, plus tests for pandas UDFs
Author: edorigatti [email protected]
Closes apache#21467 from e-dorigatti/fix_udf_hack.