[Subtask] [spark connector] support hive table location properties #2621
Comments
@Yangxuhao123 you could refer to #2605
Sorry, I've been dealing with some other things recently. Have you completed this part of the work? @FANNG1
No, I haven't finished it. Do you have time to work on this?
OK, I will do it.
May I take on this issue?
@Yangxuhao123 is working on this
OK, you can do it. I can work on other issues in the future.
@FANNG1 Here are the three testing scenarios I thought we should support:
What are your thoughts?
For the location property, the main target of the test is to check whether data is placed in the specified location, so I think we should test the following scenarios for both managed and external tables.
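The scenario matrix above could be sketched roughly as follows. This is a minimal illustration, not Gravitino or Spark connector code: the helper, the warehouse path, and the fallback rule are all assumptions about how Hive typically resolves a table location (user-specified `LOCATION` wins; a managed table without one falls back under the warehouse directory).

```java
import java.util.Optional;

public class LocationScenarios {
  // Hypothetical helper: the expected table location for a test scenario.
  // All names here are illustrative assumptions, not Gravitino APIs.
  static String expectedLocation(boolean external, Optional<String> userLocation,
                                 String warehouseDir, String tableName) {
    // If the user specifies LOCATION, both managed and external tables use it;
    // otherwise a managed table falls back to the warehouse directory.
    return userLocation.orElse(warehouseDir + "/" + tableName);
  }

  public static void main(String[] args) {
    // Scenario: managed table without LOCATION -> warehouse default
    System.out.println(expectedLocation(false, Optional.empty(), "/warehouse", "t1"));
    // Scenario: managed table with LOCATION -> user-specified path
    System.out.println(expectedLocation(false, Optional.of("/data/t2"), "/warehouse", "t2"));
    // Scenario: external table with LOCATION -> user-specified path
    System.out.println(expectedLocation(true, Optional.of("/data/t3"), "/warehouse", "t3"));
  }
}
```

An IT would create each table variant through Spark SQL and assert the data files actually land under the expected path.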
@FANNG1 The key of the location property in Gravitino is the same as in Spark; do we still need to modify the converter?
I think it's necessary to transform it explicitly; they just happen to be the same right now.
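The explicit transform being discussed might look like the sketch below. The class and constant names are hypothetical (the real logic would live in the connector's `HivePropertiesConverter`); the point is only that routing the key through a named mapping keeps the converter correct even if one side renames it later.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of an explicit Spark -> Gravitino property transform.
// Names are illustrative assumptions, not the actual connector API.
public class PropertySketch {
  static final String SPARK_LOCATION = "location";
  static final String GRAVITINO_LOCATION = "location"; // same key today, mapped explicitly

  static Map<String, String> toGravitino(Map<String, String> sparkProps) {
    Map<String, String> out = new HashMap<>(sparkProps);
    if (out.containsKey(SPARK_LOCATION)) {
      // Explicit transform: even though the keys currently match, the mapping
      // makes the relationship between the two property sets visible and testable.
      out.put(GRAVITINO_LOCATION, out.remove(SPARK_LOCATION));
    }
    return out;
  }
}
```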
…2805)

### What changes were proposed in this pull request?
Supports location properties, transforming from the Spark location to the Gravitino location: `CREATE TABLE xx LOCATION xxx`

### Why are the changes needed?
Fix: #2621

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
IT
Describe the subtask
Supports location properties, transforming from the Spark location to the Gravitino location. Please add a UT to TestHivePropertiesConverter and an IT to SparkHiveCatalogIT.

Parent issue
#1549