Bug fix/storm speed #104

Merged · 7 commits · Jul 25, 2024
Changes from 5 commits
76 changes: 54 additions & 22 deletions stormevents/nhc/track.py
@@ -1072,17 +1072,25 @@ def __compute_velocity(data: DataFrame) -> DataFrame:
for advisory in pandas.unique(data["advisory"]):
advisory_data = data.loc[data["advisory"] == advisory]

indices = numpy.array(
[
numpy.where(advisory_data["datetime"] == unique_datetime)[0][0]
for unique_datetime in pandas.unique(advisory_data["datetime"])
]
)
indices = advisory_data.index
shifted_indices = numpy.roll(indices, 1)
shifted_indices[0] = 0

indices = advisory_data.index[indices]
shifted_indices = advisory_data.index[shifted_indices]
shifted_indices[0] = indices[0]

# check for negative time shifts which indicate new forecasts
# and update this with the last previously available time
for counter, ind in enumerate(zip(indices, shifted_indices)):
this_time = advisory_data.loc[ind[0], "datetime"]
shift_time = advisory_data.loc[ind[1], "datetime"]
if shift_time > this_time:
# update shift index
if (advisory_data["datetime"] < this_time).sum() == 0:
shifted_indices[counter] = advisory_data["datetime"][
advisory_data["datetime"] > this_time
].index[0]
Comment on lines +1086 to +1089

Collaborator

@WPringle can you please explain what this does? It says that if there's no time in the dataframe smaller than the current row we're processing, then the shifted time ("previous time") should be the first time that is larger than the current time. Why?! This gives us opposite bearing, right?

Contributor Author

That's a good point about the bearing! This could be a bug that needs fixing.

Collaborator

It seems all the tests are passing; the coverage test runs into a timeout issue. Do you want me to go ahead with this merge and explore the bearing issue later in a separate ticket, or wait for you to test the potential bearing issue?

Contributor Author

Thanks for this! Please wait until I double check everything. I will let you know once happy.

Contributor Author
@WPringle · Jul 25, 2024

OK, for the bearing: when the shifted time is larger than the current time, we can use the forward_azimuth instead of the inverse_azimuth. The change has been added in the latest commit. Note this is overall a rare case.
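For context, a minimal sketch (not the PR code) of how pyproj's Geod.inv exposes both azimuths; the coordinates and the shift_is_later flag are hypothetical, and the WGS84 ellipsoid is assumed to match the module-level geodetic object:

```python
from pyproj import Geod

geodetic = Geod(ellps="WGS84")  # assumption: same ellipsoid as the module's geodetic object

# Geod.inv(lon1, lat1, lon2, lat2) returns (forward_azimuth, back_azimuth, distance).
# The back azimuth ("inverse_azimuths" in the code) is the motion direction when
# point 2 is the earlier fix; when the shifted fix is later in time (the rare case
# discussed above), the forward azimuth is the one that points along the track.
lon_now, lat_now = -75.0, 25.0      # hypothetical current fix
lon_shift, lat_shift = -74.0, 26.0  # hypothetical shifted fix
fwd_az, back_az, dist_m = geodetic.inv(lon_now, lat_now, lon_shift, lat_shift)

shift_is_later = True  # hypothetical: shifted datetime > current datetime
bearing = (fwd_az if shift_is_later else back_az) % 360
print(round(bearing, 1), round(dist_m / 1000, 1))  # heading in degrees, distance in km
```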

else:
shifted_indices[counter] = advisory_data["datetime"][
advisory_data["datetime"] < this_time
].index[-1]

_, inverse_azimuths, distances = geodetic.inv(
advisory_data.loc[indices, "longitude"],
@@ -1091,21 +1099,31 @@ def __compute_velocity(data: DataFrame) -> DataFrame:
advisory_data.loc[shifted_indices, "latitude"],
)

intervals = advisory_data.loc[indices, "datetime"].diff()
speeds = distances / (intervals / pandas.to_timedelta(1, "s"))
bearings = pandas.Series(inverse_azimuths % 360, index=speeds.index)

for index in indices:
cluster_index = (
advisory_data["datetime"] == advisory_data.loc[index, "datetime"]
intervals = (
(
abs(
advisory_data.loc[indices, "datetime"].values
- advisory_data.loc[shifted_indices, "datetime"].values
)
)
advisory_data.loc[cluster_index, "speed"] = speeds[index]
advisory_data.loc[cluster_index, "direction"] = bearings[index]
.astype("timedelta64[s]")
.astype(float)
)
speeds = pandas.Series(distances / intervals, index=indices)
bearings = pandas.Series(inverse_azimuths % 360, index=indices)
bearings[pandas.isna(speeds)] = numpy.nan
# fill in nans carrying forward, because it is same valid time
# and forecast but different isotach.
# then fill nans backwards to handle the first time
speeds.ffill(inplace=True)
bearings.ffill(inplace=True)
Comment on lines +1120 to +1121
Collaborator
@SorooshMani-NOAA · Jul 22, 2024

@WPringle where would we get NaN to carry forward? We are already getting all indices; we shouldn't run into a case of NaN that needs to be filled forward for different isotachs. If possible, can you share an example where this happens?

Collaborator

Does geodetic.inv return NaNs when the locations are the same?

Contributor Author
@WPringle · Jul 22, 2024

For the same forecast time but different isotachs the result will be NaN because the time difference is zero, so we just want to repeat the result from the first line of the same forecast time.
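A tiny illustration of that fill behavior with hypothetical numbers (not the PR data):

```python
import numpy
import pandas

# Hypothetical per-row speeds for one advisory: row 0 is the first fix (no earlier
# time, so the interval is zero and the division yields NaN) and row 2 repeats
# row 1's valid time with a different isotach (zero interval, NaN again).
speeds = pandas.Series([numpy.nan, 4.2, numpy.nan, 5.1])

# Forward-fill copies row 1's speed onto the same-time isotach row,
# then back-fill handles the leading NaN on the very first fix.
filled = speeds.ffill().bfill()
print(filled.tolist())  # [4.2, 4.2, 4.2, 5.1]
```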

speeds.bfill(inplace=True)
bearings.bfill(inplace=True)
advisory_data["speed"] = speeds
advisory_data["direction"] = bearings

data.loc[data["advisory"] == advisory] = advisory_data

data.loc[pandas.isna(data["speed"]), "speed"] = 0

return data

@property
@@ -1352,7 +1370,21 @@ def clamp(n, minn, maxn):
or forecast.loc[valid_index, "forecast_hours"].iloc[0] == 0
):
continue
forecast.loc[valid_index, "radius_of_maximum_winds"] = rmw
# make sure rolling rmw is not larger than the maximum radii of the strongest isotach
# this problem usually comes from the rolling average
max_isotach_radii = isotach_radii.loc[valid_index].iloc[-1].max()
if rmw < max_isotach_radii or numpy.isnan(max_isotach_radii):
forecast.loc[valid_index, "radius_of_maximum_winds"] = rmw
# in case it does not come from rolling average just set to be Vr/Vmax ratio of max_isotach_radii
if (
forecast.loc[valid_index, "radius_of_maximum_winds"].iloc[-1]
> max_isotach_radii
):
forecast.loc[valid_index, "radius_of_maximum_winds"] = (
max_isotach_radii
* forecast.loc[valid_index, "isotach_radius"].iloc[-1]
/ forecast.loc[valid_index, "max_sustained_wind_speed"].iloc[-1]
)

# fill OFCL background pressure with the first entry from 0-hr CARQ background pressure (at sea level)
forecast.loc[radp_missing, "background_pressure"] = carq_ref[
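As a rough scalar sketch of the radius_of_maximum_winds clamp added above (hypothetical numbers; the PR applies these checks column-wise per forecast hour, and "isotach_radius" is assumed here to hold the isotach wind speed Vr in knots):

```python
import numpy

rmw_rolling = 45.0        # hypothetical RMW from the rolling average (nmi)
max_isotach_radii = 30.0  # hypothetical largest radius of the strongest isotach (nmi)
isotach_wind = 64.0       # hypothetical isotach wind speed Vr (kt)
vmax = 100.0              # hypothetical max sustained wind speed Vmax (kt)

# Accept the rolling RMW only while it stays inside the strongest isotach
# (or when there is no isotach radius to compare against).
if rmw_rolling < max_isotach_radii or numpy.isnan(max_isotach_radii):
    rmw = rmw_rolling
else:
    # otherwise clamp to the Vr/Vmax fraction of the largest isotach radius
    rmw = max_isotach_radii * isotach_wind / vmax  # 30 * 64 / 100 = 19.2 nmi
print(rmw)
```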
@@ -1,4 +1,4 @@
AL, 06, 2018083006, , BEST, 0, 128N, 169W, 20, 1008, LO, 0, , 0, 0, 0, 0, 1010, 150, 50, 0, 0, L, 0, , 0, 0, INVEST, 1
AL, 06, 2018083006, , BEST, 0, 128N, 169W, 20, 1008, LO, 0, , 0, 0, 0, 0, 1010, 150, 50, 0, 0, L, 0, , 270, 5, INVEST, 1
AL, 06, 2018083012, , BEST, 0, 128N, 179W, 25, 1007, LO, 0, , 0, 0, 0, 0, 1010, 150, 50, 0, 0, L, 0, , 270, 5, SIX, 2
AL, 06, 2018083018, , BEST, 0, 128N, 190W, 25, 1007, LO, 0, , 0, 0, 0, 0, 1010, 150, 50, 35, 0, L, 0, , 270, 6, SIX, 3
AL, 06, 2018083100, , BEST, 0, 131N, 202W, 30, 1006, LO, 0, , 0, 0, 0, 0, 1010, 150, 40, 40, 0, L, 0, , 284, 6, SIX, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/harvey2017.fort.22
@@ -1,4 +1,4 @@
AL, 09, 2017081606, , BEST, 0, 137N, 458W, 25, 1013, LO, 0, , 0, 0, 0, 0, 1014, 150, 80, 0, 0, ,,, 0, 0,, 1
AL, 09, 2017081606, , BEST, 0, 137N, 458W, 25, 1013, LO, 0, , 0, 0, 0, 0, 1014, 150, 80, 0, 0, ,,, 270, 8,, 1
AL, 09, 2017081612, , BEST, 0, 137N, 474W, 25, 1010, LO, 0, , 0, 0, 0, 0, 1013, 150, 80, 0, 0, L, 0, , 270, 8, INVEST, 2
AL, 09, 2017081618, , BEST, 0, 136N, 490W, 25, 1009, LO, 0, , 0, 0, 0, 0, 1013, 150, 80, 0, 0, L, 0, , 267, 8, INVEST, 3
AL, 09, 2017081700, , BEST, 0, 136N, 506W, 25, 1010, LO, 0, , 0, 0, 0, 0, 1012, 120, 80, 0, 0, L, 0, , 270, 8, INVEST, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/ike2008.fort.22
@@ -1,4 +1,4 @@
AL, 09, 2008090106, , BEST, 0, 172N, 370W, 30, 1006, TD, 0, , 0, 0, 0, 0, 1011, 250, 90, 0, 0, L, 0, , 0, 0, INVEST, 1
AL, 09, 2008090106, , BEST, 0, 172N, 370W, 30, 1006, TD, 0, , 0, 0, 0, 0, 1011, 250, 90, 0, 0, L, 0, , 274, 7, INVEST, 1
AL, 09, 2008090112, , BEST, 0, 173N, 384W, 35, 1005, TS, 34, NEQ, 120, 75, 0, 60, 1011, 250, 90, 40, 0, L, 0, , 274, 7, NINE, 2
AL, 09, 2008090118, , BEST, 0, 175N, 399W, 45, 1003, TS, 34, NEQ, 130, 110, 0, 75, 1011, 250, 90, 55, 0, L, 0, , 278, 7, IKE, 3
AL, 09, 2008090200, , BEST, 0, 178N, 413W, 45, 1002, TS, 34, NEQ, 140, 120, 0, 90, 1011, 250, 20, 55, 0, L, 0, , 283, 7, IKE, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/irene2011.fort.22
@@ -1,4 +1,4 @@
AL, 09, 2011082100, , BEST, 0, 150N, 590W, 45, 1006, TS, 34, NEQ, 105, 0, 0, 45, 1010, 175, 60, 55, 0, L, 0, , 0, 0, IRENE, 1
AL, 09, 2011082100, , BEST, 0, 150N, 590W, 45, 1006, TS, 34, NEQ, 105, 0, 0, 45, 1010, 175, 60, 55, 0, L, 0, , 303, 9, IRENE, 1
AL, 09, 2011082106, , BEST, 0, 160N, 606W, 45, 1006, TS, 34, NEQ, 130, 0, 0, 80, 1010, 175, 50, 55, 0, L, 0, , 303, 9, IRENE, 2
AL, 09, 2011082112, , BEST, 0, 168N, 622W, 45, 1005, TS, 34, NEQ, 130, 0, 0, 70, 1010, 175, 50, 55, 0, L, 0, , 298, 9, IRENE, 3
AL, 09, 2011082118, , BEST, 0, 175N, 637W, 50, 999, TS, 34, NEQ, 130, 20, 0, 70, 1010, 175, 50, 55, 0, L, 0, , 296, 8, IRENE, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/irma2017.fort.22
@@ -1,4 +1,4 @@
AL, 11, 2017083000, , BEST, 0, 161N, 269W, 30, 1008, TD, 0, , 0, 0, 0, 0, 1012, 180, 60, 0, 0, L, 0, , 0, 0, INVEST, 1
AL, 11, 2017083000, , BEST, 0, 161N, 269W, 30, 1008, TD, 0, , 0, 0, 0, 0, 1012, 180, 60, 0, 0, L, 0, , 274, 7, INVEST, 1
AL, 11, 2017083006, , BEST, 0, 162N, 283W, 35, 1007, TS, 34, NEQ, 30, 0, 0, 0, 1012, 180, 60, 0, 0, L, 0, , 274, 7, INVEST, 2
AL, 11, 2017083012, , BEST, 0, 163N, 297W, 45, 1006, TS, 34, NEQ, 30, 0, 0, 30, 1012, 180, 20, 50, 0, L, 0, , 274, 7, IRMA, 3
AL, 11, 2017083018, , BEST, 0, 163N, 308W, 50, 1004, TS, 34, NEQ, 30, 30, 0, 30, 1011, 200, 15, 0, 0, L, 0, , 270, 5, IRMA, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/isabel2003.fort.22
@@ -1,4 +1,4 @@
AL, 13, 2003090600, , BEST, 0, 138N, 314W, 30, 1009, TD, 34, NEQ, 0, 0, 0, 0, 1012, 150, 40, 0, 0, ,,, 0, 0,, 1
AL, 13, 2003090600, , BEST, 0, 138N, 314W, 30, 1009, TD, 34, NEQ, 0, 0, 0, 0, 1012, 150, 40, 0, 0, ,,, 275, 7,, 1
AL, 13, 2003090606, , BEST, 0, 139N, 327W, 35, 1005, TS, 34, NEQ, 0, 0, 0, 0, 1012, 150, 40, 0, 0, ,,, 275, 7,, 2
AL, 13, 2003090612, , BEST, 0, 136N, 339W, 40, 1003, TS, 34, NEQ, 75, 75, 25, 75, 1012, 150, 25, 0, 0, ,,, 256, 6,, 3
AL, 13, 2003090618, , BEST, 0, 134N, 349W, 45, 1000, TS, 34, NEQ, 75, 75, 25, 75, 1012, 150, 25, 0, 0, ,,, 259, 5,, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/maria2017.fort.22
@@ -1,4 +1,4 @@
AL, 15, 2017091612, , BEST, 0, 122N, 497W, 30, 1006, TD, 0, , 0, 0, 0, 0, 1012, 150, 40, 40, 0, L, 0, , 0, 0, INVEST, 1
AL, 15, 2017091612, , BEST, 0, 122N, 497W, 30, 1006, TD, 0, , 0, 0, 0, 0, 1012, 150, 40, 40, 0, L, 0, , 270, 10, INVEST, 1
AL, 15, 2017091618, , BEST, 0, 122N, 517W, 40, 1004, TS, 34, NEQ, 40, 0, 0, 40, 1012, 150, 40, 50, 0, L, 0, , 270, 10, FIFTEEN, 2
AL, 15, 2017091700, , BEST, 0, 124N, 531W, 45, 1002, TS, 34, NEQ, 40, 30, 0, 40, 1012, 150, 30, 55, 0, L, 0, , 278, 7, MARIA, 3
AL, 15, 2017091706, , BEST, 0, 128N, 544W, 55, 994, TS, 34, NEQ, 50, 40, 0, 50, 1010, 150, 20, 65, 0, L, 0, , 288, 7, MARIA, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/michael2018.fort.22
@@ -1,4 +1,4 @@
AL, 14, 2018100618, , BEST, 0, 178N, 866W, 25, 1006, LO, 0, , 0, 0, 0, 0, 1009, 180, 90, 35, 0, L, 0, , 0, 0, INVEST, 1
AL, 14, 2018100618, , BEST, 0, 178N, 866W, 25, 1006, LO, 0, , 0, 0, 0, 0, 1009, 180, 90, 35, 0, L, 0, , 316, 2, INVEST, 1
AL, 14, 2018100700, , BEST, 0, 181N, 869W, 25, 1004, LO, 0, , 0, 0, 0, 0, 1009, 180, 90, 35, 0, L, 0, , 316, 2, FOURTEEN, 2
AL, 14, 2018100706, , BEST, 0, 184N, 868W, 30, 1004, TD, 0, , 0, 0, 0, 0, 1010, 240, 120, 40, 0, L, 0, , 18, 2, FOURTEEN, 3
AL, 14, 2018100712, , BEST, 0, 188N, 864W, 35, 1003, TS, 34, NEQ, 120, 180, 0, 0, 1009, 270, 120, 40, 0, L, 0, , 44, 3, FOURTEEN, 4
2 changes: 1 addition & 1 deletion tests/data/reference/test_vortex_track/sandy2012.fort.22
@@ -1,4 +1,4 @@
AL, 18, 2012102118, , BEST, 0, 143N, 774W, 25, 1006, LO, 0, , 0, 0, 0, 0, 1008, 180, 150, 35, 0, L, 0, , 0, 0, INVEST, 1
AL, 18, 2012102118, , BEST, 0, 143N, 774W, 25, 1006, LO, 0, , 0, 0, 0, 0, 1008, 180, 150, 35, 0, L, 0, , 224, 3, INVEST, 1
AL, 18, 2012102200, , BEST, 0, 139N, 778W, 25, 1005, LO, 0, , 0, 0, 0, 0, 1008, 180, 150, 35, 0, L, 0, , 224, 3, INVEST, 2
AL, 18, 2012102206, , BEST, 0, 135N, 782W, 25, 1003, LO, 0, , 0, 0, 0, 0, 1008, 225, 75, 35, 0, L, 0, , 224, 3, INVEST, 3
AL, 18, 2012102212, , BEST, 0, 131N, 786W, 30, 1002, TD, 0, , 0, 0, 0, 0, 1007, 250, 75, 0, 0, L, 0, , 224, 3, EIGHTEEN, 4