
Make `show` and other commands work with remote repos

Open aguschin opened this issue 2 years ago • 60 comments

Right now they work on local repos only; that's a limitation of GitPython.

aguschin avatar Feb 21 '22 14:02 aguschin

Asked a question in scmrepo about this - maybe we can use it.

aguschin avatar Mar 09 '22 10:03 aguschin

Hi, I'd like to contribute to this one. Any suggestion on where to start from?

francesco086 avatar Sep 06 '22 14:09 francesco086

Thanks for asking!

There are a few parts to this:

  1. read-only operations (show, describe, check-ref, history, stages). Specifically:
     - 1a. read Git tags (check-ref; history and show without commits)
     - 1b. read commits (describe)
     - 1c. read both (show --all-commits, history, stages)
  2. write operations (assign, register, deprecate, annotate, remove). Specifically:
     - 2a. create/remove Git tag(s) and push (assign, register, deprecate)
     - 2b. create a commit and push (annotate, remove)

AFAIK, the only way to support this in full is to clone the repo locally to a temp dir first. In this case, the ideal way would be to contribute to scmrepo in the issue above, I suppose, considering DVC already has something like this implemented.

There is also an option to support 1a partially. The thing is, you can read Git tags from a remote repo with the git CLI like this: git ls-remote --tags https://github.com/iterative/example-gto. I didn't check if this is supported in GitPython. The justification for this approach is this PR - i.e. all artifacts that don't have Git tags are considered unregistered and are optional to show in gto show.

This partial option is good for the gto show command without the --all-commits and --all-branches flags, but won't enable gto describe, gto history, etc. Because it should be much quicker than fully cloning the repo, eventually we'll need to support both, I suppose.
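
For illustration, a minimal sketch of the partial 1a option without cloning, assuming GitPython's command pass-through is available; the helper name and parsing are mine, not part of gto:

from git import Git  # GitPython

def list_remote_tags(repo_url: str) -> dict:
    """Read tags from a remote repo without cloning it (illustrative helper).

    Runs `git ls-remote --tags <url>` via GitPython's command pass-through
    and returns a mapping of tag name -> commit sha.
    """
    output = Git().ls_remote("--tags", repo_url)
    tags = {}
    for line in output.splitlines():
        sha, ref = line.split("\t")
        # skip peeled entries like refs/tags/v1^{} and strip the refs/tags/ prefix
        if ref.endswith("^{}"):
            continue
        tags[ref[len("refs/tags/"):]] = sha
    return tags

print(list_remote_tags("https://github.com/iterative/example-gto"))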

aguschin avatar Sep 07 '22 07:09 aguschin

Found a similar explanation I've posted in another issue earlier https://github.com/iterative/gto/issues/220#issuecomment-1198236090

aguschin avatar Sep 07 '22 08:09 aguschin

Thanks @aguschin! I would surely start with the read-only operations.

Reading through what you sent me, I found this especially illuminating: https://stackoverflow.com/a/1179728/19782654. Git is distributed, however oftentimes (always?) there is a reference remote repo. One could choose to rely on that and use the GitLab/GitHub APIs. Otherwise there is no way around it: one needs to do a git clone. For tags only, one could also rely on git ls-remote as you suggested.

I think going for a git clone in a tmp folder is probably the most general (it is a necessary step for the write functionalities too) and robust solution. Do you agree? Shall we go for that straight away? I am new to scmrepo - should the git clone functionality be part of that repo, or of GitPython?

francesco086 avatar Sep 07 '22 12:09 francesco086

Related issue on scmrepo: https://github.com/iterative/scmrepo/issues/41

francesco086 avatar Sep 07 '22 13:09 francesco086

I think going for a git clone in a tmp folder is probably the most general (it is a necessary step for write functionalities too) and robust solution. Do you agree? Shall we go for that straight away?

Sure, we can start with this, and keep git ls-remote as a future optimization.

I think @pmrowla summarized it well in this comment. GitPython is a library that calls the git CLI and provides some API over it (the others are dulwich and pygit2). scmrepo in turn wraps GitPython, dulwich and pygit2, providing a single interface to them, IIUC.

Initially I used GitPython in GTO because scmrepo didn't have the needed functionality at that time, IIRC (i.e. iterating over tags and reading their info). Now we need to make a temporary repo clone, and though we could implement this in GTO again, DVC already has this implemented, so we could take it and adapt it. A better approach would be to move that functionality from DVC to scmrepo, and then reuse it in both GTO and DVC.

Maybe @pmrowla could give us more guidance on this (in the scmrepo issue)?

aguschin avatar Sep 07 '22 13:09 aguschin

Two questions:

  1. where should we store the cloned repo? I initially thought of .gto/cache, but .gto is used for the configuration. Would it make sense to move the current gto config to .gto/config? Then we could have a cache folder inside .gto/.

  2. I would start with the gto show command, if that's ok, as I think it is the most useful. But perhaps there are reasons to start with another one?

francesco086 avatar Sep 08 '22 12:09 francesco086

  1. Let's start simple with cloning to a temp dir (see the sketch after this list). For handling the temp dirs (creating and deleting them) you can use tempfile.TemporaryDirectory() (see https://docs.python.org/3/library/tempfile.html#examples). This is a simple approach that will do for now (even DVC uses it, @pmrowla please correct me if I'm wrong). It maybe won't allow us to cache things in between CLI calls (like if you're going to call GTO multiple times on the same repo), but once we implement the simple approach, we may consider ways to optimize this. On the bright side, this doesn't require us to move .gto to .gto/config, making it easier for now in terms of compatibility.
  2. Yeah, please start with whichever you like more. gto show sounds like the most useful to me as well. In any case, once you implement this for a single command, repeating it for others should be a trivial task.
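
A minimal sketch of item 1, assuming GitPython is used for the clone; the helper below is hypothetical, not the eventual GTO code:

import git  # GitPython
from tempfile import TemporaryDirectory

def show_from_remote(repo_url: str):
    """Clone a remote repo into a temp dir, run read-only logic, then clean up."""
    with TemporaryDirectory() as tmpdir:
        repo = git.Repo.clone_from(repo_url, tmpdir)
        # the read-only command (e.g. gto show) would run against `repo`/`tmpdir` here
        return [tag.name for tag in repo.tags]
    # tmpdir (and the clone) is removed automatically when the `with` block exits

print(show_from_remote("https://github.com/iterative/example-gto"))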

aguschin avatar Sep 09 '22 04:09 aguschin

Ok, I thought of tmp files; it’s just that from direct experience I know that each call will be slow and therefore a little cumbersome. But you are right, let’s start like this - optimization comes later 😄

Thanks for your input!

francesco086 avatar Sep 09 '22 04:09 francesco086

Temp directories (without any persistent caching between CLI calls) are what DVC uses: https://github.com/iterative/dvc/blob/main/dvc/external_repo.py

pmrowla avatar Sep 09 '22 05:09 pmrowla

I have been working a bit on this (https://github.com/francesco086/gto/pull/1).

I have one design decision to make: who is responsible for cleaning up the cloned directory? What should the mechanism be? Currently I am using a TemporaryDirectory, and in the tests I am calling TemporaryDirectory.cleanup(). How should the cleanup be called after the repo has been "initialized", as in e.g. https://github.com/iterative/gto/blob/main/gto/registry.py#L62? Any suggestions?

francesco086 avatar Sep 13 '22 16:09 francesco086

One option that may work (looks like a hacky one to me) is to use a with ... yield construction. Something like (raw draft):

def return_repo(path):
    # remote_url() and clone_repo() are placeholder helpers
    if remote_url(path):
        with TemporaryDirectory() as tmpdir:
            clone_repo(path, tmpdir)
            yield tmpdir
    else:
        yield path

class GitRegistry(BaseModel):
    ...
    @classmethod
    def from_repo(cls, repo: Union[str, Repo], config: RegistryConfig = None):
        if isinstance(repo, str):
            try:
                # the hacky part: consume the generator to get a local path
                for repo in return_repo(repo):
                    pass
                repo = git.Repo(repo, search_parent_directories=True)
            except (InvalidGitRepositoryError, NoSuchPathError) as e:
                raise NoRepo(repo) from e
        if config is None:
            config = read_registry_config(
                os.path.join(repo.working_dir, CONFIG_FILE_NAME)
            )

As per the Stack Overflow answer, it should get garbage collected on process exit, which is fine for us, since GitRegistry can be used for as long as the process exists.

I'll search for a more beautiful solution, but this may suffice :)

aguschin avatar Sep 14 '22 10:09 aguschin

Thanks for the suggestion; it indeed looks very hacky... Reading the Stack Overflow discussion you mentioned in more detail, it sounds like quite a bad idea to follow this approach. The directory could be deleted at any time (it's non-deterministic). I don't even want to think about what kind of headaches one could get later on... I strongly advise going another way.

From my side I see 2 options:

  1. go back to the idea of a cache folder (for now we could simply name it ".gto_remote_repos", to avoid clashes with the .gto config file)
  2. refactor quite a lot in gto and enclose all the command execution within a with statement, to ensure that the tmp directory will be deleted at the end

Of course I am open to discuss this :)

francesco086 avatar Sep 14 '22 12:09 francesco086

go back to the idea of a cache folder (for now we could simply name it ".gto_remote_repos", to avoid clashes with the .gto config file)

Sure, we can try it if it looks easier. Two questions:

  1. Could we get it updated each time, and remove everything that doesn't exist in the remote? (I assume yes, but we need to check that if, let's say, someone deleted a few tags/commits in the remote, we do the same on the local copy once you instantiate GitRegistry, etc.)
  2. I assume there should be a user-friendly way to clean up the cache (e.g. $ gto gc, or maybe there is something better).

Re refactoring and the 2nd option, I think we could do something like this (thanks @mike0sv for the discussion and the code snippet)

from contextlib import contextmanager
from functools import wraps


def needs_clone(f):
    """Clone the remote repo lazily, only when a method actually needs it."""
    @wraps(f)
    def inner(self, *args, **kwargs):
        if self._local_path is None:
            self._local_path = self._clone_repo()
        return f(self, *args, **kwargs)
    return inner


class GitRegistry:
    ...

    @classmethod
    @contextmanager
    def from_path(cls, path):
        repo = GitRegistry(path)
        try:
            yield repo
        finally:
            # cleanup also runs if the caller raises inside the `with` block
            if repo.needs_cleanup():
                repo.cleanup()

    @needs_clone
    def ls(self):
        ...

    def needs_cleanup(self):
        return self.is_remote and self._local_path is not None

Then all API calls would have to look like:

with GitRegistry.from_path(path) as repo:
    gto.api.register(repo, ...)

It will require changing a dozen lines in gto/api.py and a few lines in tests.

I'll think about this tomorrow morning, but for now the 1st option looks like it has a big pro (the cache) and is simpler (although it gives you less control over your cache).

Btw, this also reminded me that the @iterative/studio-backend team wanted to have a way to cache self.get_state() calls in the GitRegistry functions to parse Git repos faster, but that's a different story IMO. (We could get that done by having some "lazy" flag that would make GitRegistry not update the state and reuse the existing one unless asked directly.)

UPD: the needs_clone decorator above clones the repo only if it's needed for a certain method to run. In some cases we may just run git ls-remote --tags, get the tags, and not clone the repo at all.

aguschin avatar Sep 14 '22 13:09 aguschin

Option 1. means we are building a stateful solution. Option 2. is stateless.

  1. can lead to higher performance but can get complex to handle (e.g. as you mentioned, how do you clean the cache? But also, do we want to do a git pull if the repo is already in the cache, how and when?)

  2. is simpler from a user perspective, but performance-wise it could get annoying (a git clone can be substantially slower than a git pull). Perhaps with some tricks we could still get good performance, e.g. with shallow clones (see the sketch below).
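
Purely as an illustration of what such a trick could look like with GitPython (keyword arguments are passed through to git clone as flags); whether a shallow clone keeps enough tag/commit history for gto is exactly the kind of thing that would need checking:

import git
from tempfile import TemporaryDirectory

with TemporaryDirectory() as tmpdir:
    # roughly: git clone --depth 1 --no-single-branch <url> <tmpdir>
    repo = git.Repo.clone_from(
        "https://github.com/iterative/example-gto",
        tmpdir,
        depth=1,
        no_single_branch=True,
    )
    # only tags pointing at the fetched tips will be available here
    print([tag.name for tag in repo.tags])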

I leave the decision to you @aguschin, this is pretty much a fundamental design choice for gto :) Then we can see the details

francesco086 avatar Sep 14 '22 15:09 francesco086

I can easily imagine someone trying to run many commands on a remote repo, so I would definitely prefer an approach with a cache. I would assume that even if you want to register/assign, you would first run gto show to recall the exact artifact name. Then you could run gto show modelname to find out the specific version you need to assign a stage to, etc. Cloning the repo each time can make things 2-3x slower in this case.

do we want to do a git pull if the repo is already in the cache, how and when

I would assume we need to do that each time GitRegistry is initialized. Or even each time we re-calculate self.get_state().

The other problem here could be if anyone wants to run some commands in parallel on the same remote repo. E.g. I run two annotate commands - one for the main branch, the other for the dev branch - they may clash because you create a commit in both cases and (I assume) git will need to check out a branch to create it. For now we could do some kind of locking mechanism, so one process waits until the other is completed. Here I think we need to check which commands run in parallel are prone to this (I assume it doesn't affect read operations like show, history, describe; register and assign should also be unaffected). And since we are considering moving artifacts.yaml to dvc.yaml, annotate may not be a problem in the future at all.
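
A minimal sketch of such a locking mechanism, assuming the third-party filelock package (not something the thread settled on) and a hypothetical lock-file location:

import hashlib
from pathlib import Path

from filelock import FileLock  # third-party package: `pip install filelock`

def repo_lock(repo_url: str) -> FileLock:
    """One lock file per remote repo, so parallel write commands wait for each other."""
    digest = hashlib.sha256(repo_url.encode()).hexdigest()[:16]
    lock_dir = Path.home() / ".gto_locks"  # hypothetical location
    lock_dir.mkdir(parents=True, exist_ok=True)
    return FileLock(str(lock_dir / f"{digest}.lock"))

# usage: two `annotate` calls against the same remote would run one after the other
with repo_lock("https://github.com/iterative/example-gto"):
    ...  # clone, commit, push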

aguschin avatar Sep 15 '22 05:09 aguschin

For the sake of simplicity of the first implementation, we could even use an external context manager and a decorator to clone the repo:

# gto/api.py
def clone_repo_if_needed(f):
    # is_remote() and clone_repo() are placeholder helpers here
    def inner(*args, **kwargs):
        repo = args[0]
        if is_remote(repo):
            with TemporaryDirectory() as tmpdir:
                clone_repo(repo, tmpdir)
                return f(tmpdir, *args[1:], **kwargs)
        return f(repo, *args[1:], **kwargs)
    return inner

@clone_repo_if_needed
def method(repo, ...):
    ...

That won't require any changes in GitRegistry and keeps the cloned repo only for the duration of the API call... WDYT? If that works, we can implement it first, and then improve it with a cache on top.

UPD: updated the snippet

aguschin avatar Sep 15 '22 05:09 aguschin

I'd say the stateless approach (option 2) is preferable if we can optimize the performance. Let's see what we actually need from the temporary cloned repo:

  • List of tags
  • artifacts.yaml for every commit?
  • what else?

There is a way to only pull certain files (or maybe even none of them), see this

mike0sv avatar Sep 15 '22 08:09 mike0sv

:) Just to throw more ideas (and confusion) in: in the past, when I had a hard time choosing between stateless and stateful, joblib Memory came to the rescue. It allows you to design the application as stateless, while bringing the performance advantages of stateful applications. However, I am not sure if it can be helpful here.
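
For reference, a tiny sketch of what that could look like; the function name and cache location are purely illustrative:

from joblib import Memory

memory = Memory(".gto_joblib_cache", verbose=0)  # hypothetical on-disk cache dir

@memory.cache
def get_state(repo_url: str):
    # the expensive part: clone the repo, read tags and artifacts.yaml, build the state
    ...

# the first call computes and stores the result on disk; later calls with the
# same repo_url are served from the cache, even across processes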

@aguschin I like the decorator idea very much - a good way to close the with statement without being intrusive in the code :)

francesco086 avatar Sep 15 '22 08:09 francesco086

@mike0sv you are right, one can fetch only what is strictly necessary. However, from my limited knowledge of the code, it seems to me that this could require quite a lot of changes to the pre-existing code.

Shall I try to go this way - stateless, using a tmp directory, as suggested by @aguschin?

francesco086 avatar Sep 15 '22 08:09 francesco086

@mike0sv thanks for the suggestion. Yes, I believe we need to get (1) the list of tags, (2) artifacts.yaml from all the commits. I tried the answer from the Stack Overflow post you linked, and it looks like it partially works (it still cloned README and requirements.txt for me, but at least it didn't clone the models/ and .github/ folders):

git clone \
  --filter=blob:none  \
  --sparse \
  https://github.com/iterative/example-gto \
;
cd example-gto
git sparse-checkout set artifacts.yaml --skip-checks

@francesco086, let's try the decorator idea first then :) We can also keep @mike0sv's suggestion as a future improvement, in case it can't be easily implemented for some reason (like if you can't use GitPython and need to call subprocess for it).
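
If it has to go through GitPython rather than a subprocess call, a hedged sketch of the same partial clone (this relies on clone_from's multi_options argument and the dynamic git command pass-through; I haven't verified it end to end):

import git
from tempfile import TemporaryDirectory

with TemporaryDirectory() as tmpdir:
    # same as: git clone --filter=blob:none --sparse <url> <tmpdir>
    repo = git.Repo.clone_from(
        "https://github.com/iterative/example-gto",
        tmpdir,
        multi_options=["--filter=blob:none", "--sparse"],
    )
    # same as: git sparse-checkout set artifacts.yaml --skip-checks
    repo.git.sparse_checkout("set", "artifacts.yaml", "--skip-checks")
    print([tag.name for tag in repo.tags])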

aguschin avatar Sep 15 '22 08:09 aguschin

For today I managed to implement the decorator suggested by @aguschin, with a few touches: https://github.com/francesco086/gto/blob/03c5622ed5b5039b95b4e626030932a033a3e9ba/gto/git_utils.py#L12

Tomorrow I will try to move it to the show command (first I need to figure out how to write the tests for it).

Perhaps you can help me with a couple of things:

  1. do you know of a more professional way to check if a string is a valid URL for a git repo? I would not bet anything on my solution in https://github.com/francesco086/gto/blob/03c5622ed5b5039b95b4e626030932a033a3e9ba/gto/git_utils.py#L43
  2. For whatever reason the PyCharm debugger does not seem to work with the gto project. Have you experienced the same? Any hints?

francesco086 avatar Sep 15 '22 15:09 francesco086

  1. Please take a look at this: https://stackoverflow.com/a/22312124/4890419. I assume it should work, maybe with slight modifications (like when there is no .git at the end of the URL).
  2. Hmm, I don't use PyCharm or the PyCharm debugger specifically. I know @mike0sv does. @mike0sv, maybe you can give some hints?
  3. Re testing, I think you can wrap some dummy function that returns the path to the dir and check that the dir was indeed created (you can clone example-gto or the gto repo itself, I think) - see the sketch below.
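
For item 3, a hedged pytest-style sketch; it assumes a working version of the clone_repo_if_needed decorator sketched earlier in this thread, and the import path follows the linked git_utils.py (the actual name in the PR may differ):

import os

# hypothetical import: name/location follow the sketch above, not necessarily the PR
from gto.git_utils import clone_repo_if_needed

@clone_repo_if_needed
def dummy(repo: str) -> bool:
    # by the time we get here, a remote URL should have been replaced
    # by a path to a local clone that still exists
    return os.path.isdir(repo) and os.path.isdir(os.path.join(repo, ".git"))

def test_remote_url_is_replaced_by_local_clone():
    # network test: clones the public example repo
    assert dummy("https://github.com/iterative/example-gto")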

aguschin avatar Sep 16 '22 07:09 aguschin

  1. Please take a look at this https://stackoverflow.com/a/22312124/4890419 , I assume it should work with slight modifications maybe (like if there is no .git in the end of the URL).

Thanks! I noticed this, but I suppose it is not a major problem, right?

francesco086 avatar Sep 16 '22 08:09 francesco086

it is not a major problem

I assume yes, for some time at least :)

aguschin avatar Sep 16 '22 09:09 aguschin

PR ready :)

I tried it on my production artifact registry for fun, it works like a charm :)

francesco086 avatar Sep 16 '22 13:09 francesco086

Cool! Checking it out!

aguschin avatar Sep 19 '22 11:09 aguschin

Now I think I can create a PR for check-ref, history and stages :) on it!

francesco086 avatar Sep 21 '22 15:09 francesco086

I have just noticed a problem - I changed the help text for the --repo option (the text you get from gto show -h, for example) to "Repository to use (remote repos accepted)".

I did not notice that the change would propagate to all other commands XD So, as of the code in master, the help text says all gto commands accept remote repos :D

I will change it back with the next PR, ok? Perhaps we can first make the changes on all commands and then update the help message.

francesco086 avatar Sep 21 '22 15:09 francesco086