Feature request: cache remote python installation in the local machine
In the case of a remote dev server (not a docker container/vagrant box on the local machine, but a real remote server over the Internet), the intellisense features of anaconda-mode can get really slow because of the ping between the local environment and the remote anaconda-mode server.
What I have in mind is something like the PyCharm way: they download site-packages from the remote Python installation to the local machine and run completion from there. python-shell still points to the remote one, so you can execute tests on the dev server and such, but intellisense and go-to-definition work locally.
Besides a couple of rough edges when you have to invalidate the cache and reindex, this approach works pretty well, saves a lot of time, and happens to be one of PyCharm's two killer features (the second is an awesome debugger, but that's outside anaconda-mode's scope).
What do you think about something like that for anaconda-mode?
P.S. :wink:
Just imagine all the jealous vscode/vim zealots switching to Emacs and praising anaconda-mode.
P.P.S. Maybe there is some TRAMP feature that caches it all that I'm not aware of? I've noticed that once I have visited a buffer, its completions/definitions work much faster.
Hi,
To be honest, I want to find the bottleneck in your case first. Some measurements would be helpful.
The most recent pythonic can mount a local directory onto the remote one. This is work in progress to support transparent docker & vagrant environments while running find-file on the local machine. It should be much more convenient.
Basically, you want the reverse. But it should be really easy to implement. An additional interactive download command with a progress bar would be nice.
Feel free to open a PR; I will try to guide you through this task.
Regards, Artem.
P.S. After experiencing hunter, every debugger looks naive.
The bottleneck is the response from the anaconda-mode server over the Internet :) In my particular case, the round-trip ping can be anywhere from 160 to 300 ms. Also, mounting the local Python installation is suboptimal, because command execution becomes painfully slow. If you're talking about the project code, it sits locally and gets synced to the remote host via rsync (and rsync is really fast and great for such things). Hope that makes sense.
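To put the latency in perspective, here is a back-of-envelope calculation using the 160-300 ms pings mentioned above. The number of completion requests per session is a made-up illustrative figure, not a measurement:

```python
def waiting_seconds(requests, rtt_ms):
    """Total time spent waiting on the network if every request pays
    one full round trip of rtt_ms milliseconds."""
    return requests * rtt_ms / 1000

# Hypothetical editing session: 200 completion/definition requests.
for rtt_ms in (160, 300):
    print(f"{rtt_ms} ms RTT -> {waiting_seconds(200, rtt_ms):.0f} s waiting")
# -> 32 s at 160 ms, 60 s at 300 ms
```

A local cache of site-packages removes that per-request round trip entirely, which is exactly the PyCharm-style win described earlier in the thread.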
Got it regarding the download command and progress bar; I'll try to do something about it when I have time :+1:
I don't understand "mounting the local Python installation is suboptimal." Can you explain it?
Think of it as a dockerized dev environment, but the docker container sits on a remote (over-the-Internet) machine. All commands are run remotely (say, runserver or any other Django management command), and in order to be fast, they need local access to site-packages and to the project code. If I just mount a volume with the project code and/or the virtualenv over the Internet, there is painful network overhead. That's why I keep Python (and all my dev dependencies) on the remote machine, plus two copies of my project code: one local, and one rsynced to the remote. That covers execution. Completion is the opposite: I need completion and definition candidates locally, from the remote machine's environment, and every anaconda-mode request carries the same Internet overhead. That's where caching the remote virtualenv comes in handy.
Sorry, that was my not-so-good English.
I will try to explain it in other terms.
For now, you should run a docker container and work with it as a remote host using TRAMP. This includes find-file operations, dired traversal, and running processes for linters, isort, and of course anaconda-mode.
The most recent pythonic library (the heart of anaconda-mode and djangonaut) supports path aliases. This hasn't been announced yet, but if you add pythonic-set-docker-compose-alias to anaconda-mode-hook and open a project file locally, anaconda will detect the compose environment and run only the completion service remotely. Every other operation happens on localhost.
This works by translating file names from local to TRAMP form and back.
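The translation step can be sketched as a pair of prefix rewrites. This is only an illustration of the idea, not pythonic's actual API; the roots and the TRAMP host are made-up examples:

```python
# Hypothetical local <-> TRAMP file name translation.
# A local project root is aliased to its TRAMP equivalent, so a path can be
# rewritten in either direction by swapping the prefix.
LOCAL_ROOT = "/home/me/myapp"
TRAMP_ROOT = "/ssh:me@devserver:/home/me/myapp"

def to_remote(path):
    """Rewrite a local path into its TRAMP form; pass through otherwise."""
    if path.startswith(LOCAL_ROOT):
        return TRAMP_ROOT + path[len(LOCAL_ROOT):]
    return path

def to_local(path):
    """Rewrite a TRAMP path back into its local form; pass through otherwise."""
    if path.startswith(TRAMP_ROOT):
        return LOCAL_ROOT + path[len(TRAMP_ROOT):]
    return path

print(to_remote("/home/me/myapp/models.py"))
# -> /ssh:me@devserver:/home/me/myapp/models.py
```

With an alias like this in place, the editor can open and index files locally while the completion service addresses the same files under their remote names.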
To work with a local environment and remote files, we can do exactly the same; the mechanism is already there.
The only thing we need to implement is an interactive download command that fetches the files and sets the aliases permanently via the configuration.