[#5005] improvement(docs): Add a document about how to create Hive S3 tables through Gravitino. #5006
Conversation
docs/hive-catalog-with-s3.md
Outdated
The following sections will guide you through the necessary steps to configure the Hive catalog to utilize S3 as a storage backend, including configuration details and examples for creating databases and tables.

## Hive cluster configuration
Hive metastore?
After updating the `hive-site.xml`, you need to ensure that the necessary S3-related JARs are included in the Hive classpath. You can do this by executing the following command:
```shell
add an empty line?
```
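For context, the S3-related settings mentioned above typically go in `hive-site.xml` (or `core-site.xml`). A minimal sketch, assuming the S3A connector from `hadoop-aws`; the credential and endpoint values are placeholders, not values from this PR:

```xml
<!-- Hedged sketch: S3A properties commonly added for Hive-on-S3.
     All values below are placeholders -- adjust to your deployment. -->
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.amazonaws.com</value>
</property>
```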
### Adding Required JARs
There are two blank lines, remove one?
docs/hive-catalog-with-s3.md
Outdated
```
Alternatively, you can download the required JARs from the Maven repository and place them in the Hive classpath. It is crucial to verify that the JARs are compatible with the version of Hadoop you are using to avoid any compatibility issues.
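For the Maven route, the relevant artifact is typically `org.apache.hadoop:hadoop-aws` (plus its matching `aws-java-sdk-bundle`). A sketch that builds the Maven Central download URL; the Hadoop version here is an assumption and must match your cluster:

```shell
# Hedged sketch: construct the Maven Central URL for the hadoop-aws JAR.
# HADOOP_VERSION is an assumption -- replace it with your cluster's Hadoop version.
HADOOP_VERSION=3.3.4
MAVEN_BASE="https://repo1.maven.org/maven2"
JAR_URL="${MAVEN_BASE}/org/apache/hadoop/hadoop-aws/${HADOOP_VERSION}/hadoop-aws-${HADOOP_VERSION}.jar"
echo "${JAR_URL}"
# Then fetch it into the Hive lib directory, e.g.:
#   curl -fL -o "$HIVE_HOME/lib/hadoop-aws-${HADOOP_VERSION}.jar" "$JAR_URL"
```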
### Restart Hive Cluster
add a blank line
docs/hive-catalog-with-s3.md
Outdated
### Restart Hive Cluster
Hive metastore
Assuming you have already set up a Hive catalog with Gravitino, you can proceed to create tables or databases using S3 storage. For more information on catalog operations, refer to [Catalog operations](./manage-fileset-metadata-using-gravitino.md#catalog-operations).
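Concretely, an S3-backed database is created by pointing its location at an `s3a://` URI. A hedged sketch of the DDL (the bucket name and path are placeholders, not values from this PR), printed here so it can be passed to `beeline` or `sparkSession.sql(...)`:

```shell
# Hedged sketch: CREATE DATABASE statement for an S3-backed Hive database.
# The bucket and warehouse path are placeholders -- substitute your own.
SQL="CREATE DATABASE s3_db LOCATION 's3a://my-bucket/warehouse/s3_db.db'"
echo "$SQL"
# Run it against your Hive cluster, e.g.:
#   beeline -u jdbc:hive2://<hive-server>:10000 -e "$SQL"
```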
### Example: Creating a Database with S3 Storage
add blank line here
.enableHiveSupport()
.getOrCreate();

sparkSession.sql("...");
please add blank line
docs/apache-hive-catalog.md
Outdated
## Hive catalog with S3 storage
To create a Hive catalog with S3 storage, you can refer to the [Hive catalog with S3](./hive-catalog-with-s3.md) documentation. |
Suggest telling users there is no need to add extra S3 properties in the Gravitino Hive catalog.
What changes were proposed in this pull request?
Add a document about how to create Hive S3 tables through Gravitino
Why are the changes needed?
To enhance the user experience.
Fix: #5005
Does this PR introduce any user-facing change?
N/A
How was this patch tested?
N/A