TLDR
At some point, datasets become large enough that they cause high latency or even timeouts in the webapp implementation.
We don't need to deliver top-tier performance (very large datasets probably want CLI validation for other reasons anyway), but we should give people an indication of what to expect.
Details
Once the low-hanging-fruit speed improvements are done, run some large datasets and add (somewhere on the website, probably the documentation?) guidance like "Datasets with files larger than X may take Y amount of time" or "Datasets with more than X files may take Z amount of time."
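As a starting point for gathering those X/Y/Z numbers, a minimal benchmark sketch along these lines could time CLI validation across datasets of different sizes and report file counts alongside wall-clock time. The `validator` command name and the dataset paths are placeholders, not this project's actual CLI or test data:

```python
import subprocess
import time
from pathlib import Path

# Placeholder CLI name and dataset paths -- substitute the real
# validator command and local test datasets before running.
VALIDATOR_CMD = "validator"  # assumption: CLI entry point name
DATASETS = [
    Path("datasets/small"),   # e.g. ~100 files
    Path("datasets/medium"),  # e.g. ~10,000 files
    Path("datasets/large"),   # e.g. ~100,000 files
]

def time_validation(dataset: Path) -> float:
    """Run the validator on one dataset and return wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(
        [VALIDATOR_CMD, str(dataset)],
        check=False,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return time.perf_counter() - start

for dataset in DATASETS:
    files = [p for p in dataset.rglob("*") if p.is_file()]
    total_gb = sum(p.stat().st_size for p in files) / 1e9
    elapsed = time_validation(dataset)
    print(f"{dataset}: {len(files)} files, {total_gb:.1f} GB -> {elapsed:.1f}s")
```

Running this over a few dataset sizes would give rough numbers to plug into the documentation, e.g. "datasets with more than 10,000 files may take over a minute" (the thresholds would come from the actual measurements, not from this sketch).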