Add NN Template
Before submitting
- [x] Was this discussed/approved via a GitHub issue? Yes
- [x] Did you create/update your configuration file? (WIP)
- [ ] Did you set `runtimes` in config for GitHub action integration?
- [ ] Did you add your config to CI in Azure pipeline (only projects with 100+ GitHub stars)?
- [ ] Are all integration tests passing?
What does this PR do? [optional]
Add support for nn-template.
Did you have fun?
Always 🙃
Missing
Cookiecutter integration
The nn-template uses cookiecutter to generate a parametrized project from a template, so cloning/checking out the repository is not enough to obtain a working project.
In our CI we use the following steps to (1) generate the project with cookiecutter and (2) bypass the interactive setup (with the hacky `echo` command):
```yaml
- name: Install Dependencies
  shell: bash -l {0}
  run: |
    pip install cookiecutter

- name: Generate Repo
  shell: bash -l {0}
  run: |
    echo -e 'n\nn\nn\n' | cookiecutter . --no-input project_name=${{ env.COOKIECUTTER_PROJECT_NAME }}
```
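For prompts that come from template variables (rather than from post-generation hooks, which read stdin directly and are what the piped `echo` answers), cookiecutter also accepts extra `key=value` context on the command line. A hedged sketch; `setup_github` is a hypothetical variable name, not an actual nn-template prompt:

```yaml
# Hedged alternative: answer template variables via extra key=value context
# instead of piping input. This only works for variables defined in
# cookiecutter.json, not for prompts raised inside post-generation hooks.
# `setup_github=n` is an illustrative, hypothetical variable.
- name: Generate Repo (extra context)
  shell: bash -l {0}
  run: |
    cookiecutter . --no-input project_name=${{ env.COOKIECUTTER_PROJECT_NAME }} setup_github=n
```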
Dependencies
This project uses a combination of conda and pip; most of the dependencies are specified in the setup.cfg file.
The idea is that the conda dependencies specified in `env.yaml` are part of the "environment" (e.g. enabling the use of the torch docker image), and thus are not dependencies when installing the package with pip.
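For illustration, a minimal `env.yaml` along these lines might look as follows; the concrete name, channels, and pins are assumptions, not the actual nn-template file:

```yaml
# Hypothetical minimal env.yaml: conda-level "environment" dependencies,
# kept separate from the pip-installable package requirements in setup.cfg.
name: my-project   # assumed environment name
channels:
  - pytorch
  - defaults
dependencies:
  - python=3.9     # assumed pin
  - pytorch        # provided by the environment, not by pip
  - pip
```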
There are thus two options to configure an environment (after changing the working directory to the cookiecutter-generated project); see the CI sketch after this list:

- Using conda: `conda env create -f env.yaml`
- Installing the package into an existing environment with `pip install ".[dev]"`
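As a sketch, option 1 could be wired into the workflow above roughly like this; the environment name `my-project` is an assumption carried over from the hypothetical `env.yaml`:

```yaml
# Hedged sketch of option 1 as a CI step; reuses COOKIECUTTER_PROJECT_NAME
# from the generation step above. The activated name must match env.yaml.
- name: Create Environment
  shell: bash -l {0}
  run: |
    cd ${{ env.COOKIECUTTER_PROJECT_NAME }}
    conda env create -f env.yaml
    conda activate my-project   # assumed name from env.yaml
```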
I am not sure which is the best option here; if we want to test the development setup, I think option 1 would be more adequate.
Complete config
Part of the file is still borrowed from the template config, e.g. the `dependencies`, `runtimes` and `before_test` (p.s. the template file indicated in the README as `configs/template.yaml` does not exist anymore).
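Since the referenced template is gone, here is a rough, hypothetical sketch of how the fields mentioned above might fit together; only `dependencies`, `runtimes` and `before_test` come from this discussion, and everything else (field names, values, entry formats) is assumed and may not match the actual ecosystem-CI schema:

```yaml
# Hypothetical ecosystem-CI config sketch; not the actual schema.
dependencies:
  - name: pytest                          # assumed entry format
runtimes:
  - {os: ubuntu-20.04, python: "3.9"}     # assumed entry format
before_test:
  - pip install cookiecutter              # assumed hook format
```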
These are the main challenges I see at the moment. Once the environment is active, running the tests (and maybe the pre-commit hooks) is as simple as:
```bash
pre-commit install
pre-commit run -v --all-files --show-diff-on-failure
pytest -v
```
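For completeness, the same checks expressed as a workflow step in the style of the snippets above (a sketch, assuming the conda environment created earlier is active in this shell):

```yaml
# Hedged sketch: the test commands as a GitHub Actions step; assumes the
# environment from the previous step is available and activated.
- name: Run Tests
  shell: bash -l {0}
  run: |
    pre-commit install
    pre-commit run -v --all-files --show-diff-on-failure
    pytest -v
```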
Please feel free to contribute to this PR if you have any spare time! Otherwise I will try to better understand how to solve these problems in the Lightning ecosystem CI (not in the immediate future though!).
@Borda @rasbt @Flegyas
Codecov Report
Merging #44 (ba5b836) into main (d30ef2f) will not change coverage. The diff coverage is n/a.
Additional details and impacted files
```
@@          Coverage Diff          @@
##            main     #44   +/-  ##
=====================================
  Coverage     80%     80%
=====================================
  Files          2       2
  Lines        252     252
=====================================
  Hits         201     201
  Misses        51      51
```
Hello,
Is there anything we can improve upstream to make this integration smoother? Are there any blockers?
Thanks, Luca
We are just finishing the 2.1 release, and this project is about to get some cleaning and extension for other HW; we will be proceeding next week. Apologies for the delay :crying_cat_face:
Not at all! :smile: Thank you for your efforts and your amazing work :rocket: