[SPARK-12236][SQL] JDBC filter tests all pass if filters are not really pushed down #10221
Conversation
Test build #47418 has finished for PR 10221 at commit
@liancheng Would you like to look through this? It is related to the filter tests.
This looks reasonable to me, although I'm a bit confused by the addition of some similar code in #9687; it seems just having a single shared strip-filter utility function makes sense (is that the eventual plan)?
Oh yes, that is the eventual plan. I will share that function. I opened some PRs before other PRs were closed, so I ended up adding the same function to another PR.
@holdenk Actually, would you merge this PR if it looks good?
Can't merge it, but we could ask @marmbrus or @liancheng to take a look if they have the bandwidth.
Why not just implement
@marmbrus I saw that Jira ticket; I found this problem while testing for that Jira ticket. Adding that looks reasonable. I would like to add it later (maybe right after correcting all the filter tests) as a task with subtasks for the Parquet, ORC and JDBC datasources, if that is acceptable. Otherwise, if that sounds unreasonable, then I will try to add it here.
Actually, we might still need such a function even after adding that.
Fair enough, I guess we can probably commit this as is and do the improvement in another PR.
Merging to master. |
Thanks! |
https://issues.apache.org/jira/browse/SPARK-12236
Currently, JDBC filters are not tested properly: all the tests pass even if the filters are not actually pushed down, because Spark-side filtering produces the same results and masks the missing pushdown.
In this PR:

Firstly, I corrected the tests to properly check the pushed-down filters by removing Spark-side filtering.

Also, `!=` was being tested, which is actually not pushed down, so I removed those tests.

Lastly, I moved the `stripSparkFilter()` function to `SQLTestUtils`, as this function would be shared by all tests for pushed-down filters. This function would also be shared with the ORC datasource, as its filters are also not being tested properly.