
Allow recursive directories to be indexed with file://

Open rothgar opened this issue 9 years ago • 6 comments

If I have a folder containing all my repos (local, or on shared network space using GitLab or similar), that directory contains subdirectories for a number of different git repos. To index all of the repos, including newly added ones, I would hope to be able to use a config like the following:

{
  "repos" : {
    "gitlab" : {
      "url" : "file://path/to/repos/"
      "recurse" : "true"
    }
  }
}

This would help with managing the dozens or hundreds of repos that would otherwise have to be listed individually in a config.json file.
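
For illustration, the recursive entry above would just be shorthand for enumerating every repo found under the directory; for two hypothetical repos repo-a and repo-b it would expand to:

{
  "repos" : {
    "repo-a" : { "url" : "file://path/to/repos/repo-a" },
    "repo-b" : { "url" : "file://path/to/repos/repo-b" }
  }
}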

rothgar · Mar 04 '15 17:03

Is your intent to have Hound automatically discover the git repos and keep them up to date with polling as usual?

jklein · Mar 04 '15 17:03

That would be ideal. Auto-discovery would be important, although I've read that polling with file:// already has issues:

Use the file:// protocol. This allows you to index any local folder, so you can clone the repository locally and then reference the files directly. The downside here is that the polling to keep the repo up to date will not work.

An alternative would be better GitLab support to auto-discover/poll all repos for a team, org, etc. I'm looking to index 700+ repos, which would obviously be cumbersome to enter manually. I saw some of the generation scripts, but those would require cron jobs and service restarts whenever a new repo is added.
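
For reference, here is a sketch of what such discovery could look like as an external script, assuming a GitLab instance with API v4 plus curl and jq; GITLAB_HOST, GROUP, and TOKEN are placeholders:

  # Print config.json-style repo entries for every project in a GitLab group.
  GITLAB_HOST="https://gitlab.example.com"
  GROUP="mygroup"
  TOKEN="your-access-token"
  page=1
  while :; do
    resp=$(curl -s --header "PRIVATE-TOKEN: $TOKEN" \
      "$GITLAB_HOST/api/v4/groups/$GROUP/projects?per_page=100&page=$page")
    # Stop once a page comes back empty.
    [ "$(echo "$resp" | jq 'length')" -eq 0 ] && break
    # One "name" : {"url" : "..."} line per project.
    echo "$resp" | jq -r '.[] | "\"\(.path)\" : {\"url\" : \"\(.http_url_to_repo)\"},"'
    page=$((page + 1))
  done

The output still has to be wrapped in the repos object of config.json, with the trailing comma on the last entry removed.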

rothgar · Mar 04 '15 17:03

+1 for automatic discovery of all repositories from local FS recursively!

It might be a CLI argument like houndd --discover file://path/to/dir, which would generate the default config.json in the existing format with a listing of all repos found.

The same approach might work later for discovering all repositories of a particular GitHub org, like houndd --discover https://github.com/YourOrganization/
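
As a rough sketch of what that GitHub case could do under the hood (again assuming curl and jq; YourOrganization is a placeholder, and unauthenticated API requests are heavily rate-limited):

  ORG="YourOrganization"
  page=1
  while :; do
    # Page through the org's repository list via the GitHub API.
    resp=$(curl -s "https://api.github.com/orgs/$ORG/repos?per_page=100&page=$page")
    [ "$(echo "$resp" | jq 'length')" -eq 0 ] && break
    # One "name" : {"url" : "..."} entry per repository.
    echo "$resp" | jq -r '.[] | "\"\(.name)\" : {\"url\" : \"\(.clone_url)\"},"'
    page=$((page + 1))
  done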

bzz · Apr 18 '15 03:04

This is similar to #13 - it would be good to come up with a solution that solves both cases.

jklein · Apr 28 '15 11:04

As a note to other users: locally, I have searched my "code" directory for git remote URLs like this.

find $HOME/code -name .git -type d -prune | xargs -n1 -P4 -I '{}' git --git-dir='{}' config --get 'remote.origin.url' | sort

which outputs something like this:

git@github.com:andxyz/.dotfiles.git
git@github.com:mislav/dotfiles.git

Then with some text manipulation I created the required config.json file.
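
One way to do that text manipulation (a sketch, assuming jq is available; the exact commands may differ) is to build the repos object directly from the URL list:

find $HOME/code -name .git -type d -prune \
  | xargs -n1 -P4 -I '{}' git --git-dir='{}' config --get 'remote.origin.url' \
  | sort \
  | jq -Rn '{repos: [inputs | {(sub(".*[:/]"; "") | sub("\\.git$"; "")): {url: .}}] | add}' \
  > config.json

Top-level keys like dbpath still have to be added to the generated file by hand.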

andxyz · Feb 13 '17 19:02

+1, this would be really useful. I was surprised this option doesn't already exist. Here is my Bash script as a workaround:

  # Emit the fixed header of config.json.
  echo '{' > config.json
  echo '"max-concurrent-indexers" : 2,' >> config.json
  echo '"dbpath" : "data",' >> config.json
  echo '"repos" : {' >> config.json
  first=true
  # Any directory containing a HEAD file (a bare repo, or a .git dir)
  # is treated as a git repo to index.
  find /home/developer -maxdepth 2 -type d | while read -r f; do
    if [ -f "$f/HEAD" ]; then
      bn=$(basename "$f")
      # Print a comma before every entry except the first.
      if [[ "$first" == "true" ]]; then
        first=false
      else
        echo "," >> config.json
      fi
      echo "\"$bn\" : {\"url\" : \"file://$f\"}" >> config.json
    fi
  done
  echo '}}' >> config.json

rgpublic · Feb 22 '18 11:02