Commit b181bde

fix: Fixing the return order of elements when calculating the min and max entity-DF event timestamps in the Spark offline store.

Signed-off-by: Lev Pickovsky <[email protected]>
levpickis committed Apr 25, 2022
1 parent 00ed65a commit b181bde
Showing 1 changed file with 1 addition and 1 deletion.
```diff
@@ -324,8 +324,8 @@ def _get_entity_df_event_timestamp_range(
         df = spark_session.sql(entity_df).select(entity_df_event_timestamp_col)
         # TODO(kzhang132): need utc conversion here.
         entity_df_event_timestamp_range = (
-            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
             df.agg({entity_df_event_timestamp_col: "min"}).collect()[0][0],
+            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
         )
     else:
         raise InvalidEntityType(type(entity_df))
```
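For context, callers unpack this tuple as the start and end of the entity dataframe's event-timestamp range, so returning (max, min) inverts the range. A minimal sketch of the corrected ordering, using a plain Python list in place of a Spark DataFrame (the function name and sample timestamps below are illustrative, not from the Feast codebase):

```python
from datetime import datetime


def entity_df_event_timestamp_range(timestamps):
    """Return the (min, max) range of event timestamps.

    Mirrors the fixed ordering in the commit: earliest timestamp
    first, latest second, so callers can unpack it as (start, end).
    """
    return (min(timestamps), max(timestamps))


events = [
    datetime(2022, 4, 1),
    datetime(2022, 4, 25),
    datetime(2022, 4, 10),
]

start, end = entity_df_event_timestamp_range(events)
print(start <= end)  # True: the range is ordered start-before-end
```

With the pre-fix (max, min) ordering, `start <= end` would be False for any non-degenerate range, which is what downstream range-filtering logic would silently mishandle.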
