A nice higher-level feature to implement would be automatic pagination. Currently one can use the `--page` flag to fetch a particular page, so large result sets require running the command repeatedly with `--page 1`, `--page 2`, `--page 3`, etc. until reaching the end of the set.
A `--paginate` flag would automate this looping and return the full result set once all requests are complete (a rough sketch follows below). I think we would want to make it mutually exclusive with `--limit` and hard-code the per-page size, since a small limit would otherwise trigger far too many requests; e.g. `--limit 1 --paginate` could result in thousands of requests with tiny response payloads. To ensure the CLI does not burn through its rate limit too quickly, we could buffer each page request with a predefined (but configurable) delay.
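For illustration, here is a minimal sketch of what the `--paginate` loop could look like, assuming a Go implementation. `fetchPage`, the `Result` type, and the short-page termination check are placeholders for this sketch, not the CLI's actual code:

```go
package main

import (
	"fmt"
	"time"
)

// Result stands in for one item in the API response; the real type
// depends on the endpoint being paginated.
type Result struct {
	ID int
}

// fetchPage is a hypothetical request function standing in for whatever
// HTTP call the CLI already makes for a single --page request.
func fetchPage(page, perPage int) ([]Result, error) {
	// ... issue the request with ?page=<page>&per_page=<perPage> ...
	return nil, nil
}

// paginateAll loops over pages until the result set is exhausted,
// sleeping `delay` between requests so the rate limit is not consumed
// in a burst. perPage is hard-coded by the caller rather than taken
// from --limit, per the mutual-exclusion idea above.
func paginateAll(perPage int, delay time.Duration) ([]Result, error) {
	var all []Result
	for page := 1; ; page++ {
		results, err := fetchPage(page, perPage)
		if err != nil {
			return nil, fmt.Errorf("page %d: %w", page, err)
		}
		all = append(all, results...)
		// A short page signals the end of the result set.
		if len(results) < perPage {
			return all, nil
		}
		time.Sleep(delay) // configurable buffer between page requests
	}
}
```

A real implementation against the GitHub API would more likely follow the `Link: rel="next"` response header instead of counting results, but the short-page check keeps the sketch self-contained.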