I have about 900 repos (~1M LoC) indexed. Search performance is quite variable: it feels fairly snappy when searching for longer, specific strings with few matches, but is extremely slow for generic short snippets with lots of hits.
For example, searching for `foo` in my setup returns: 131,549 ms total / 4,739 ms server / 3,477 files.
The actual server-side search is an impressive ~4 seconds, but I see a spinner in the UI until the entire 154 MB of search results is transferred back - well over 2 minutes! And then the browser bogs down rendering all the results at once.
Paging and/or lazy loading would go a long way toward improving user experience and perceived performance.
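To illustrate what I have in mind, here's a minimal sketch of server-side pagination. All names here (`SearchResult`, `Page`, `paginate`, the field names) are hypothetical, not taken from the actual codebase - the point is just that the server would return one page plus enough metadata (`total`, `hasMore`) for the UI to lazy-load the rest:

```typescript
// Hypothetical shape of a single search hit (illustrative only).
interface SearchResult {
  repo: string;
  file: string;
  line: number;
}

// One page of results, plus metadata the UI needs for lazy loading.
interface Page<T> {
  items: T[];
  total: number;    // total matches across all pages
  page: number;     // 1-based page index
  pageSize: number;
  hasMore: boolean; // true if further pages exist
}

// Slice the full result set into a single page.
function paginate<T>(results: T[], page: number, pageSize: number): Page<T> {
  const start = (page - 1) * pageSize;
  const items = results.slice(start, start + pageSize);
  return {
    items,
    total: results.length,
    page,
    pageSize,
    hasMore: start + items.length < results.length,
  };
}
```

With page sizes of, say, 50 files, the 3,477-file `foo` query above would only ship the first ~1.5% of results up front, and the UI could fetch subsequent pages as the user scrolls.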
Hey! I agree, that's quite a lot of data to try to squeeze into the browser. We already truncate results on a repo-by-repo basis, but it sounds like with 900 repos, paginating across the repos themselves might be the right move. I think we would be open to a PR for that.