Fix fetching paginated endpoints twice in get_list_generic #261

Merged 3 commits, Jun 11, 2024
24 changes: 7 additions & 17 deletions in mreg_cli/utilities/api.py

@@ -503,25 +503,15 @@ def _check_expect_one_result(
```diff
     if limit and resp.count > abs(limit):
         raise TooManyResults(f"Too many hits ({resp.count}), please refine your search criteria.")

-    # Short circuit if there are no more pages. This means that there are no more results to
-    # be had so we can return the results we already have.
-    if not resp.next:
-        return _check_expect_one_result(resp.results)
-
-    # Iterate over all pages and collect the results
-    ret: list[Json] = []
-    while True:
-        resp = get(path, params=params, ok404=ok404)
-        if resp is None:
-            return _check_expect_one_result(ret)
-        result = validate_paginated_response(resp)
-
-        ret.extend(result.results)
-
-        if result.next:
-            path = result.next
-        else:
+    ret: list[Json] = resp.results
```
Member Author:
> I considered doing a copy here to avoid mutation of the results, but it really has no value. We are returning one large list anyway, so whether we modify the original or a copy doesn't matter to the caller of the function.

Collaborator:
> Makes sense. I think? :)

Member Author:
> Yeah... It seems wrong, but always copying the list when we strictly don't need to is sub-optimal. Then again, computers are fast and network requests are not, so maybe this is a silly concern?
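The aliasing being discussed can be shown in isolation. This is a minimal sketch with a stand-in response class (not mreg_cli's actual types): assigning `resp.results` directly means later `extend` calls mutate the response object's own list, whereas a copy would leave it untouched.

```python
# Minimal illustration of the aliasing discussed above (FakeResponse is a
# hypothetical stand-in for the paginated response type).

class FakeResponse:
    def __init__(self, results):
        self.results = results

first_page = FakeResponse([1, 2])

# Assigning the first page's list directly, as the merged change does...
ret = first_page.results
# ...means extending `ret` with later pages also mutates first_page.results.
ret.extend([3, 4])
assert first_page.results == [1, 2, 3, 4]  # same list object, aliased

# The defensive copy the author considered would avoid the mutation:
ret_copy = list(first_page.results)
ret_copy.extend([5])
assert first_page.results == [1, 2, 3, 4]  # original left unchanged
```

As the thread concludes, the alias is harmless here because the response object is not used again after its results are merged into the returned list.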

```diff
+    while resp.next:
+        response = get(resp.next, params=params, ok404=ok404)
+        if response is None:
+            return _check_expect_one_result(ret)
+        resp = validate_paginated_response(response)
+        ret.extend(resp.results)
     return _check_expect_one_result(ret)


 def get_typed(
```