Create QuantileLoss.java
QuantileLoss for training toward a particular quantile, e.g. estimating close to the P75.
Description
Sometimes you want to train your estimator for a particular quantile, e.g. the 75th percentile (P75). This PR provides a [weighted] QuantileLoss.
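For readers unfamiliar with the idea, here is a minimal sketch of the pinball (quantile) loss on scalars, independent of DJL's NDArray API (the class and method names here are illustrative, not part of this PR): under-prediction is weighted by q and over-prediction by (1 - q), so minimizing the loss pushes the estimator toward the q-th quantile.

```java
// Illustrative sketch of the pinball (quantile) loss on plain doubles.
// For quantile q in (0, 1): under-predicting costs q per unit of error,
// over-predicting costs (1 - q) per unit, so the minimizer is the q-quantile.
class QuantileLossSketch {
    static double quantileLoss(double q, double label, double prediction) {
        double diff = label - prediction;
        return Math.max(q * diff, (q - 1) * diff);
    }

    public static void main(String[] args) {
        // Training toward P75: under-predicting by 2 costs three times
        // as much as over-predicting by 2.
        System.out.println(quantileLoss(0.75, 10.0, 8.0)); // under by 2 -> 1.5
        System.out.println(quantileLoss(0.75, 8.0, 10.0)); // over by 2  -> 0.5
    }
}
```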
Hi @gforman44. How is it going? Let me know if you have any questions or problems
It is a Loss, not just an Evaluator.
I put this reference in the java doc for further explanation. https://bibinmjose.github.io/2021/03/08/errorblog.html
Yes, it deserves a test case. Harder with non-deterministic learning.
Haven’t had time.
George
On Tue, Jun 14, 2022 at 3:56 PM Zach Kimberg @.***> wrote:
Hi @gforman44 https://github.com/gforman44. How is it going? Let me know if you have any questions or problems
— Reply to this email directly, view it on GitHub https://github.com/deepjavalibrary/djl/pull/1652#issuecomment-1155652750, or unsubscribe https://github.com/notifications/unsubscribe-auth/ADRHTYZJQEJLLPPEPIIESUDVPDPW7ANCNFSM5V5FIZQA . You are receiving this because you were mentioned.Message ID: @.***>
I'm not sure what you mean by it being harder with non-deterministic learning. We don't really require a full learning test case. The way we currently test the losses is just a unit test: essentially, it passes the loss a sample input and then verifies the output against a hard-coded expected number. The purpose of this is mostly to make sure that the loss runs and doesn't change accidentally. You can find examples of what I am referring to in the other loss tests.
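A minimal sketch of the hard-coded-expectation test style described above, using a plain-Java mean pinball loss rather than DJL's NDArray-based Loss (the helper name and values are illustrative, not taken from LossTest.java):

```java
// Sketch of a deterministic loss unit test: feed a fixed sample batch
// and compare against a hand-computed expected value.
class QuantileLossTestSketch {
    static double meanQuantileLoss(double q, double[] labels, double[] preds) {
        double sum = 0;
        for (int i = 0; i < labels.length; i++) {
            double diff = labels[i] - preds[i];
            sum += Math.max(q * diff, (q - 1) * diff);
        }
        return sum / labels.length;
    }

    public static void main(String[] args) {
        double[] labels = {1.0, 2.0, 3.0};
        double[] preds = {2.0, 2.0, 2.0};
        double actual = meanQuantileLoss(0.75, labels, preds);
        // Expected value computed by hand:
        // (0.25*1 + 0 + 0.75*1) / 3 = 1/3
        if (Math.abs(actual - 1.0 / 3.0) > 1e-9) {
            throw new AssertionError("expected 1/3, got " + actual);
        }
        System.out.println("ok");
    }
}
```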
Oh. Thank you for that clarification. That’ll be easy.
George
One more thing before the framework gets too ossified:
consider making Loss an interface that is a sub-interface of the Evaluator interface. Then any Loss can be used as an additional Evaluator, and a base class for Evaluator can serve either easily. My base class maps to double[] so that subclasses can keep several accumulators per test situation, rather than your current Loss base class, which maps to Double.
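A minimal sketch of the hierarchy being proposed, under the stated assumptions: Loss as a sub-interface of Evaluator, with accumulators keyed to double[] so one evaluator can track several values per key. All names here (EvaluatorIfc, LossIfc, QuantileLossImpl) are illustrative, not DJL's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Evaluator accumulates per-key statistics as double[] (e.g. {sum, count}).
interface EvaluatorIfc {
    void accumulate(String key, double value);
    double[] totals(String key);
}

// Loss extends Evaluator, so any Loss is also usable wherever an
// Evaluator is expected.
interface LossIfc extends EvaluatorIfc {
    double evaluate(double label, double prediction);
}

class QuantileLossImpl implements LossIfc {
    private final double q;
    private final Map<String, double[]> acc = new HashMap<>(); // key -> {sum, count}

    QuantileLossImpl(double q) { this.q = q; }

    @Override
    public double evaluate(double label, double prediction) {
        double diff = label - prediction;
        return Math.max(q * diff, (q - 1) * diff);
    }

    @Override
    public void accumulate(String key, double value) {
        double[] a = acc.computeIfAbsent(key, k -> new double[2]);
        a[0] += value; // running sum
        a[1] += 1;     // count
    }

    @Override
    public double[] totals(String key) {
        return acc.getOrDefault(key, new double[2]);
    }
}
```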
That seems like a reasonable suggestion (making Loss and Evaluator interfaces for more flexibility). With something like QuantileLoss, I can see how you may want to track multiple QuantileLosses. Or, with something like TopK accuracy you could have multiple values of k.
If you have time, do you want to try contributing those changes yourself?
Codecov Report
Merging #1652 (e2d0978) into master (bb5073f) will decrease coverage by 2.09%. The diff coverage is 67.43%.
```diff
@@             Coverage Diff              @@
##             master    #1652      +/-   ##
============================================
- Coverage     72.08%   69.99%    -2.10%
- Complexity     5126     5813      +687
============================================
  Files           473      573      +100
  Lines         21970    25812     +3842
  Branches       2351     2779      +428
============================================
+ Hits          15838    18068     +2230
- Misses         4925     6378     +1453
- Partials       1207     1366      +159
```
| Impacted Files | Coverage Δ |
|---|---|
| api/src/main/java/ai/djl/modality/cv/Image.java | 69.23% <ø> (-4.11%) :arrow_down: |
| ...rc/main/java/ai/djl/modality/cv/MultiBoxPrior.java | 76.00% <ø> (ø) |
| ...rc/main/java/ai/djl/modality/cv/output/Joints.java | 71.42% <ø> (ø) |
| .../main/java/ai/djl/modality/cv/output/Landmark.java | 100.00% <ø> (ø) |
| ...main/java/ai/djl/modality/cv/output/Rectangle.java | 72.41% <0.00%> (ø) |
| ...i/djl/modality/cv/translator/BigGANTranslator.java | 21.42% <0.00%> (-5.24%) :arrow_down: |
| ...odality/cv/translator/BigGANTranslatorFactory.java | 33.33% <0.00%> (+8.33%) :arrow_up: |
| ...nslator/InstanceSegmentationTranslatorFactory.java | 14.28% <0.00%> (-3.90%) :arrow_down: |
| .../cv/translator/SemanticSegmentationTranslator.java | 0.00% <0.00%> (ø) |
| .../cv/translator/StyleTransferTranslatorFactory.java | 40.00% <ø> (ø) |
| ... and 475 more | |
It is a Loss, not just an Evaluator.**
I felt some of the questions were already addressed in the java doc, such as when you want to train for the P90 of the distribution.
I had put this reference in the java doc for further explanation: https://bibinmjose.github.io/2021/03/08/errorblog.html Would you like more? There is also the AWS Forecast WeightedQuantileLoss API reference: https://docs.aws.amazon.com/forecast/latest/dg/API_WeightedQuantileLoss.html And here's a page comparing different metrics, including quantile loss: https://docs.aws.amazon.com/forecast/latest/dg/metrics.html
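To make the weighted variant concrete: my reading of the AWS Forecast docs linked above is that the weighted quantile loss normalizes the summed pinball losses by the total absolute demand. This is a sketch under that assumption (check the linked pages for the authoritative definition; the class and method names are illustrative):

```java
// Sketch of a weighted quantile loss in the style of AWS Forecast's wQL,
// per my reading of the linked docs:
//   wQL[q] = 2 * sum_i( q*max(y_i - p_i, 0) + (1-q)*max(p_i - y_i, 0) )
//              / sum_i |y_i|
class WeightedQuantileLossSketch {
    static double weightedQuantileLoss(double q, double[] y, double[] p) {
        double num = 0;
        double denom = 0;
        for (int i = 0; i < y.length; i++) {
            num += q * Math.max(y[i] - p[i], 0)
                 + (1 - q) * Math.max(p[i] - y[i], 0);
            denom += Math.abs(y[i]);
        }
        return 2 * num / denom;
    }
}
```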
Yes, you are right, it deserves a test case. Harder with non-deterministic learning.
Haven’t had time.
George
** I think Loss should be a subclass of Evaluator.