Feature: add activation to BlockRNN (#2492)
Checklist before merging this PR:
- [x] Mentioned all issues that this PR fixes or addresses.
- [x] Summarized the updates of this PR under Summary.
- [x] Added an entry under Unreleased in the Changelog.
Fixes #2492.
Summary
- Added support for specifying PyTorch activation functions (`ReLU`, `Sigmoid`, `Tanh`, or `None`) in the `BlockRNNModel` (see the usage sketch after this list).
- Ensured that activation functions are applied between fully connected layers, but not after the final layer.
- Implemented a check to raise an error if an activation function is set but the model only contains one linear layer.
- Updated documentation to reflect the new activation parameter and usage examples.
- Added test cases to verify the correct application of activation functions and to handle edge cases.
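To make the new parameter concrete, here is a minimal usage sketch based on the summary above; the chunk lengths and layer sizes are illustrative, and the exact signature and defaults should be checked against the merged `BlockRNNModel` documentation.

```python
from darts.models import BlockRNNModel

# Illustrative values; `hidden_fc_sizes` is an existing parameter,
# `activation` is the one added by this PR.
model = BlockRNNModel(
    input_chunk_length=24,
    output_chunk_length=12,
    model="LSTM",
    hidden_fc_sizes=[64, 32],  # two extra fully connected layers
    activation="ReLU",         # applied between FC layers, not after the last one
)
```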
Other Information
Consider using a logging warning instead of raising an error if an activation function is set but the model only contains one linear layer.
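A sketch of what that alternative could look like; the helper name and arguments are hypothetical, not the actual implementation.

```python
import logging

logger = logging.getLogger(__name__)

def check_activation(activation, hidden_fc_sizes):
    """Hypothetical helper: warn instead of raising when `activation` is set
    but there are no hidden FC layers for it to sit between."""
    if activation is not None and len(hidden_fc_sizes) == 0:
        logger.warning(
            "`activation` was provided but the network has only one linear "
            "layer, so the activation will never be applied."
        )

check_activation("ReLU", [])  # emits the warning
```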
Codecov Report
Attention: Patch coverage is 58.33333% with 5 lines in your changes missing coverage. Please review.
Project coverage is 93.75%. Comparing base (`26c5f39`) to head (`e94b3c7`). Report is 1 commit behind head on master.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| darts/models/forecasting/block_rnn_model.py | 58.33% | 5 Missing :warning: |
Additional details and impacted files
@@ Coverage Diff @@
## master #2504 +/- ##
==========================================
- Coverage 93.79% 93.75% -0.05%
==========================================
Files 139 139
Lines 14741 14738 -3
==========================================
- Hits 13827 13818 -9
- Misses 914 920 +6
Should a warning be raised if `activation` is None and `hidden_fc_sizes` isn't empty? Stacked linear layers with no nonlinearity between them are equivalent to a single linear layer, so that configuration is always suboptimal, and users who don't read the changelog might not be aware that the default activation is None.
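For reference, a hypothetical version of that check (the helper name is illustrative): the extra layers add parameters without adding expressiveness, which is why a warning seems justified.

```python
import logging

logger = logging.getLogger(__name__)

def warn_on_linear_stack(activation, hidden_fc_sizes):
    """Hypothetical check for the case above: consecutive linear layers
    without a nonlinearity collapse into a single affine map."""
    if activation is None and len(hidden_fc_sizes) > 0:
        logger.warning(
            "`hidden_fc_sizes` adds fully connected layers, but `activation` "
            "is None; stacked linear layers are equivalent to a single one."
        )

warn_on_linear_stack(None, [64, 32])  # emits the warning
```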