Unify Art illumina configs
PR checklist
Unifies the Art illumina configs to follow https://nf-co.re/docs/guidelines/components/modules#configuration-of-extargs-in-tests
- [x] This comment contains a description of changes (with reason).
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the module conventions in the contribution docs
- [ ] If necessary, include test data in your PR.
- [ ] Remove all TODO statements.
- [ ] Emit the `versions.yml` file.
- [ ] Follow the naming conventions.
- [ ] Follow the parameters requirements.
- [ ] Follow the input/output options guidelines.
- [ ] Add a resource `label`.
- [ ] Use BioConda and BioContainers if possible to fulfil software requirements.
- Ensure that the test works with either Docker / Singularity. Conda CI tests can be quite flaky:
- For modules:
  - [ ] `nf-core modules test <MODULE> --profile docker`
  - [ ] `nf-core modules test <MODULE> --profile singularity`
  - [ ] `nf-core modules test <MODULE> --profile conda`
- For subworkflows:
  - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile docker`
  - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile singularity`
  - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile conda`
Please take a look at the requested changes, and use the Ready for review button when you are done, thanks :+1:
I added typing to the bosch-alarm-mode2 lib and then enabled strict typing in the integration, so everything should now be typed.
I will also do another pass over the tests. I think I made all my comments on the main code, so that should cover all the feedback. I don't know how much time you have left, but with a little more I think we can get this into 2025.4 (for which the beta cut is tomorrow).
Okay so I have been COOKING. (I am very happy with how it turned out.) Let me summarize what I did and why:
- I removed the coordinator. The only task it had was managing the availability of the entities connected to it, and I don't think that is a good use case for a coordinator, which should at least be involved in managing the shared state of the integration. Instead, I moved the availability logic into the entities themselves. (A future improvement to the `Observable` would be to return the detach function when calling `attach`; that way we can just pass that function into a lifecycle hook and it reads a bit nicer.) I also added tests for the availability.
- Using `model` in the tests. Wherever you use `model`, or a fixture that uses the `model` fixture, the tests now run for all models. You can limit the models with `parametrize`.
- Refactored the tests to mock the library object. The biggest benefit is that you don't care how the internals of the library work, which leaves more room to change the library without breaking tests (as long as the interface doesn't change). The previous tests tried to replicate what the library would have done; now, instead of checking the entity's state to see whether something was called as expected, we can ask the mock whether it has been called as we expect.
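The suggested `Observable` improvement could look something like this minimal sketch. Note this is a hypothetical illustration of the attach-returns-detach pattern, not the actual bosch-alarm-mode2 implementation; the class and method names are assumptions:

```python
from typing import Callable


class Observable:
    """Sketch of an observer registry whose attach() returns the
    matching detach callback, so the caller can hand it straight
    to a lifecycle hook (e.g. an entity's on-remove callback)."""

    def __init__(self) -> None:
        self._observers: list[Callable[[], None]] = []

    def attach(self, observer: Callable[[], None]) -> Callable[[], None]:
        self._observers.append(observer)

        def detach() -> None:
            # Remove exactly this observer; safe to call once.
            self._observers.remove(observer)

        return detach

    def notify(self) -> None:
        # Iterate over a copy so observers may detach during notify.
        for observer in list(self._observers):
            observer()
```

With this shape, an entity can do `self.async_on_remove(observable.attach(self._handle_update))` in one line instead of storing the observer and detaching it manually later.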
I think it's ready to go, let me know if you have more questions :)
I always forget this after the fact, but why didn't we pick bosch_security as the name, as that is what Bosch calls it?
Oh and please send me a message on Discord
What's your discord username?
joostlek