painfully slow auto completions #1422
Can you give a bit more context? I know that completions are sometimes slow, but they are not always. So when are they slow for you? Also, the database index ticket in #1059 should cache certain results from type inference, which can make completions a lot faster. I'm also thinking about rewriting Jedi in Rust, but that's a lot of work. Speaking of completions generally, it's just non-trivial to make them fast AND correct in Python. I've been trying for years, and it's probably cost me about three years of working "full time" on this (I've been working on this problem for almost 7 years now).
@davidhalter Hi Dave, thanks for the reply.
This happens with pretty much any package I use. For instance, I have a dummy Python file that imports packages like numpy and matplotlib, and in order to get completions for those I have to write
Believe me, I completely understand the frustrations that you might have gone through, and that's why I appreciate the work that folks like yourself are doing. I consider it a service to the whole community!
There's a reason for that. They use a type inference cache (essentially a database). This enables certain things, but it's also not always the greatest solution. There's a reason why VS Code still defaults to Jedi for completions and not their own language server. Jedi is still better if you don't use one of numpy/tensorflow/matplotlib/scipy, etc. Essentially, if you do non-scientific development, Jedi is usually almost instant (after the first few completions).
I'm closing this in favor of the database solution in #1059. I really see no other solution. Feel free to keep asking questions. I'm happy to answer.
Thanks, Dave
I know; I found it recently as well.
Jedi just tries to be as correct as possible. I've been working on speeding up Jedi by a lot in the last 2-3 months, but that's just the beginning. It will probably take 2-3 years until I have something ready that makes sense.
Thanks for writing this - I've been spending a bit of time with autocomplete tonight (Python 3.8.0, IPython 7.13.0, Jedi 0.17.0) - and I've noticed that for an example like this:
trying to autocomplete
in IPython takes ~2 seconds with Jedi, but about 250 ms with Jedi disabled with:
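(The exact snippet was lost from the thread. For reference, the usual way to toggle this in an IPython session is something like the following - an assumption about the commenter's setup, not quoted from the thread:)

```
# Assumed IPython setting for disabling Jedi-based completion
%config IPCompleter.use_jedi = False
```

The same trait can be set persistently via `c.IPCompleter.use_jedi = False` in `ipython_config.py`.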
I'm glad to know it's not just me, and that Jedi performance is an area of ongoing development - definitely looking forward to checking back in as new versions are released. Thanks very much.
@ghshephard Is it also slow after the initial slowness? Because one of the problems is #1059: Jedi doesn't do a lot of caching outside of its process, so initial completions can be quite slow. :/
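To illustrate what "caching outside of its process" means here: the idea behind #1059 is roughly an on-disk cache that survives process restarts, keyed so that stale results are never served. The sketch below is not Jedi's actual implementation - the class name, key scheme, and API are all hypothetical - but it shows the basic mechanism of keying cached inference results by file path and modification time.

```python
# Minimal sketch (hypothetical, not Jedi's real code) of a cross-process
# on-disk cache: results are keyed by source path + mtime, so a freshly
# started process can reuse earlier work instead of re-running inference,
# and an edited file (new mtime) automatically misses the cache.
import json
import os
import tempfile


class DiskCache:
    def __init__(self, cache_dir):
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)

    def _key_path(self, source_path, mtime):
        # Flatten the path into a file name and include the mtime in the key.
        safe = source_path.replace(os.sep, "_")
        return os.path.join(self.cache_dir, f"{safe}.{mtime}.json")

    def get(self, source_path, mtime):
        # Return cached completions, or None on a cache miss.
        try:
            with open(self._key_path(source_path, mtime)) as f:
                return json.load(f)
        except FileNotFoundError:
            return None

    def put(self, source_path, mtime, completions):
        with open(self._key_path(source_path, mtime), "w") as f:
            json.dump(completions, f)


# Usage: one process stores results; a later process can read them back.
cache = DiskCache(os.path.join(tempfile.gettempdir(), "jedi_sketch_cache"))
cache.put("/project/mod.py", 1234, ["ndarray", "nditer"])
print(cache.get("/project/mod.py", 1234))  # ['ndarray', 'nditer']
```

An in-process memory cache disappears on restart, which is why first completions in a new session are slow even for files that were analyzed before.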
Yes, the behavior is consistent on the second and further uses. I'll follow the jedi project and keep re-trying my 3-line test case and report back if anything changes. Thanks for all your hard work! |
What's your hardware like? |
64 GB RAM / i7-8700 @ 3.2 GHz, 6 cores / 12 threads. The system is reasonably beefy and isn't slow on any other operations that I can discern. Geekbench 5 scores: 1157 single-core, 6032 multi-core.
SSD or HDD or Cloud? |
SSD, running on WSL with Ubuntu 18.04 on Windows 10. But now you've got my curiosity piqued, and I'll try it on native Linux.
I would argue that it's more like 500 ms for me. 2 seconds seems very long, especially because your system seems more modern than mine - I'm working on a 4-year-old notebook. I'm wondering if this is related to Linux.
I'm not sure how to optimize much further. There is of course #1059, but I might just do that stuff in Rust to get it really fast. However, that might mean that Jedi itself won't be getting that kind of cache, so I don't really see a good solution at the moment. :/ There is already some trivial caching for pandas which makes it faster, but it's just a lot of identifiers...
@ghshephard I had one more idea to improve speed (without odd side effects). I think this should have helped quite a lot in your case. It still feels strange to me that Jedi is so much faster when I use it, but this should help nonetheless (it improved things for me as well). So please test; I think it's pretty much as simple as
I've tried all of the above - `elpy`, `anaconda-mode`, `company-jedi` - and in all cases I find the completions to be extremely slow. Even when you force completions on dot, you still have to wait a couple of seconds until completions become available. Is this normal?
Maintainers of some of the above packages believe that it is caused by Jedi - discussion here, towards the end.
I see this, but I don't know what you mean or how to utilise it?