Add support for file paths for providing entity rows during batch retrieval #375
Conversation
[APPROVALNOTIFIER] This PR is APPROVED. This pull request has been approved by: voonhous, woop.
This PR mirrors PR #365. The original branch was deleted because I did a delete-and-push to the remote instead of a force-push, which caused PR #365 to be closed.
Description
Users should be able to provide large numbers of entity rows when retrieving batch features, but they are currently blocked by the memory limits of pandas DataFrames.
For batch retrieval, we already support Avro files as the format for sending entity rows; however, this is only available through the Feast Serving API. The Python SDK hides this detail by performing the following conversion:
entity_rows pandas DataFrame → .avro (local) → .avro (GCS) → BigQuery
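For context, a minimal sketch of the staging step the SDK hides, assuming the pandavro and google-cloud-storage libraries; the function name, bucket, and temp path are illustrative placeholders, not the SDK's actual internals:

```python
import pandas as pd
import pandavro
from google.cloud import storage

def stage_entity_rows(entity_rows: pd.DataFrame, bucket_name: str, blob_name: str) -> str:
    """Write entity rows to a local Avro file and upload it to GCS (illustrative only)."""
    local_path = "/tmp/entity_rows.avro"
    # entity_rows is expected to contain a "datetime" column
    pandavro.to_avro(local_path, entity_rows)
    bucket = storage.Client().bucket(bucket_name)
    bucket.blob(blob_name).upload_from_filename(local_path)
    return f"gs://{bucket_name}/{blob_name}"
```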
This pull request adds the ability for users to provide:
A pandas DataFrame with a "datetime" column
A local Avro file with an "event_timestamp" column
A GCS Avro file
A GCS wildcard path
Examples:
entity_rows = [pandas DataFrame]
entity_rows = subfolder/entities.avro
entity_rows = /data/subfolder/entities.avro
entity_rows = gs://food-recsys/folder/customer_entity_rows.avro
entity_rows = gs://food-recsys/folder/customer_entity_rows_*.avro
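Put together, a hedged usage sketch of the four accepted forms; the client URLs, feature reference, and entity columns below are placeholders rather than values from this PR:

```python
import pandas as pd
from feast import Client

client = Client(core_url="localhost:6565", serving_url="localhost:6566")
features = ["customer_transactions:1:total_transactions"]  # hypothetical feature reference

# DataFrame entity rows must carry a "datetime" column
entity_rows_df = pd.DataFrame(
    {"datetime": pd.to_datetime(["2019-12-01"]), "customer_id": [1001]}
)

# Each of the four accepted forms:
job = client.get_batch_features(features, entity_rows_df)
job = client.get_batch_features(features, "subfolder/entities.avro")
job = client.get_batch_features(features, "gs://food-recsys/folder/customer_entity_rows.avro")
job = client.get_batch_features(features, "gs://food-recsys/folder/customer_entity_rows_*.avro")
```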
While datetime and event_timestamp are currently used interchangeably, the SDK needs to standardize on one of them.
As of now:
datetime is enforced in pandas DataFrames.
event_timestamp is enforced in local Avro files.
There is no enforcement for files living in GCS; no validation is done on GCS file paths.
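A hypothetical validation helper reflecting the rules above (not code from this PR), assuming pandas and fastavro:

```python
import pandas as pd
import fastavro

def validate_entity_rows(source):
    if isinstance(source, pd.DataFrame):
        # DataFrames must carry a "datetime" column
        if "datetime" not in source.columns:
            raise ValueError('entity rows DataFrame must have a "datetime" column')
    elif isinstance(source, str) and source.startswith("gs://"):
        # GCS paths (including wildcards) are passed through without validation
        return
    elif isinstance(source, str):
        # Local Avro files must carry an "event_timestamp" field
        with open(source, "rb") as avro_file:
            schema = fastavro.reader(avro_file).writer_schema
        field_names = {field["name"] for field in schema["fields"]}
        if "event_timestamp" not in field_names:
            raise ValueError('local Avro entity rows must have an "event_timestamp" field')
```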
Two additional tests have been added to bq_batch_retrieval.py. The two tests, test_get_batch_features_with_file and test_get_batch_features_with_gs_path, each run an end-to-end test covering the changes mentioned above.
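As a rough illustration of the shape of these two tests (placeholders only, not the actual code in bq_batch_retrieval.py), assuming a `client` fixture connected to a test deployment and an `entity_rows_df` fixture with a "datetime" column:

```python
import pandavro

FEATURES = ["customer_transactions:1:total_transactions"]  # hypothetical feature reference

def test_get_batch_features_with_file(client, entity_rows_df, tmp_path):
    local_file = str(tmp_path / "entities.avro")
    # Local Avro files use "event_timestamp" rather than "datetime"
    pandavro.to_avro(local_file, entity_rows_df.rename(columns={"datetime": "event_timestamp"}))
    job = client.get_batch_features(FEATURES, local_file)
    assert not job.to_dataframe().empty

def test_get_batch_features_with_gs_path(client, staged_gcs_path):
    # staged_gcs_path points at entity rows already uploaded to GCS
    job = client.get_batch_features(FEATURES, staged_gcs_path)
    assert not job.to_dataframe().empty
```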