
Results: 68 issues (Jan)

Great project, @cgnorthcutt! I am reading through the source code and, if I am not mistaken, you could "outsource" the following lines to `scikit-learn`: https://github.com/cgnorthcutt/cleanlab/blob/ec735a06d93d2ce90ccb60d9b8a44b495169e159/cleanlab/latent_estimation.py#L613-L627 by using `cross_val_predict` ([docs](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.cross_val_predict.html))...

code improvement
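A minimal sketch of the suggested replacement, assuming the hand-rolled loop's goal is out-of-sample predicted probabilities (the dataset and classifier below are illustrative, not cleanlab's actual defaults):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

# Toy stand-ins for the real data and model
X, y = make_classification(n_samples=120, n_features=5, random_state=0)
clf = LogisticRegression()

# method="predict_proba": each sample's probabilities come from the model
# that was NOT trained on that sample's fold, i.e. out-of-sample estimates.
psx = cross_val_predict(clf, X, y, cv=5, method="predict_proba")
print(psx.shape)  # (120, 2)
```

One call replaces the manual K-fold split-fit-predict loop, and `cv` accepts either a fold count or a custom splitter.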

The README links (in two different places) the following URL: https://www.clips.uantwerpen.be/conll2002/ner/bin/conlleval.txt However, it is not accessible:

> Forbidden. You don't have permission to access this resource

It would be nice to have a benchmark that is just some predefined portfolio. One would construct it by passing all the weights.

enhancement
good first issue
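A minimal sketch of what such a benchmark could look like; the class name and the `__call__` interface below are assumptions for illustration, not the library's actual API:

```python
import numpy as np

class FixedPortfolio:
    """Hypothetical benchmark that always allocates the same predefined weights."""

    def __init__(self, weights):
        weights = np.asarray(weights, dtype=float)
        if not np.isclose(weights.sum(), 1.0):
            raise ValueError("Weights need to sum to 1")
        self.weights = weights

    def __call__(self, x):
        # x: (n_samples, ..., n_assets); the features are ignored and
        # the fixed weights are simply repeated for every sample.
        n_samples = x.shape[0]
        return np.tile(self.weights, (n_samples, 1))

bench = FixedPortfolio([0.5, 0.3, 0.2])
x = np.zeros((4, 1, 10, 3))  # dummy input batch
print(bench(x).shape)  # (4, 3)
```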

Unfortunately, installing Zipline is a nightmare; it only supports Python 3.5. So maybe investigate other open-source backtesters.

Numerical allocation layers (e.g. `NumericalMarkowitz`) might generate weights that do not satisfy the constraints. There are the following reasons for it:

* small floating point differences (e.g. `w_i = -1e-8` with **w >= 0**)...

enhancement
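For the floating-point case, one possible post-hoc repair is to zero out the tiny negative entries and renormalize; this is a sketch of one option (an assumption, not the library's actual logic), valid only when the violations really are rounding artifacts:

```python
import numpy as np

def repair_weights(w, tol=1e-6):
    """Naive repair: clip tiny negatives to 0 and renormalize to sum to 1.

    `tol` guards against silently hiding real constraint violations.
    """
    w = np.asarray(w, dtype=float)
    if (w < -tol).any():
        raise ValueError("Violation too large to be a rounding artifact")
    w = np.clip(w, 0.0, None)
    return w / w.sum()

w = repair_weights([0.6, 0.4, -1e-8])
print(np.isclose(w.sum(), 1.0))  # True
```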

In the `SoftmaxAllocator` the nonnegativity constraint `w >= 0` is missing.

bug

Currently one cannot just do `-SomeLoss()`. Of course, it could be hacked by doing `(-1) * SomeLoss()`. We want to implement the first syntax via `__neg__` (unary minus calls `__neg__`, not `__sub__`).

enhancement
good first issue
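A minimal sketch of the idea with toy stand-in classes (the names and the `__call__` signature are illustrative, not the library's actual loss API); `__neg__` can simply delegate to the existing multiplication hack:

```python
class Loss:
    """Toy stand-in for a composable loss base class."""

    def __call__(self, weights):
        raise NotImplementedError

    def __mul__(self, scalar):
        return ScaledLoss(self, scalar)

    __rmul__ = __mul__

    def __neg__(self):
        # `-loss` triggers __neg__, which reuses the (-1) * loss hack.
        return (-1) * self

class ScaledLoss(Loss):
    def __init__(self, base, scalar):
        self.base, self.scalar = base, scalar

    def __call__(self, weights):
        return self.scalar * self.base(weights)

class MeanReturns(Loss):
    def __call__(self, weights):
        return sum(weights) / len(weights)

loss = MeanReturns()
print((-loss)([0.2, 0.8]))  # -0.5
```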

Rather than reinventing the wheel, one could just use `torchvision` transforms: https://pytorch.org/docs/stable/torchvision/transforms.html

- [x] `Compose` (already recreated in `deepdow`)
- [ ] `RandomApply` - apply all with some probability
- ...

enhancement
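For reference, the behavior of `RandomApply` (which torchvision provides) is small enough to sketch in pure Python; this is a simplified re-implementation for illustration, not the torchvision source:

```python
import random

class RandomApply:
    """Apply all transforms together with probability p, else pass through."""

    def __init__(self, transforms, p=0.5):
        self.transforms = transforms
        self.p = p

    def __call__(self, x):
        if random.random() < self.p:
            for t in self.transforms:
                x = t(x)
        return x

double = lambda v: [2 * e for e in v]
always = RandomApply([double], p=1.0)  # random() < 1.0 always holds
never = RandomApply([double], p=0.0)   # random() < 0.0 never holds
print(always([1, 2]), never([1, 2]))  # [2, 4] [1, 2]
```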

Apart from generating iid sequences, one can do a lot of different things. We just need to be careful not to pull in too many external dependencies.

### Statistical models

- [ ] ...

enhancement
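As one example of a non-iid generator in the statistical-models direction, an AR(1) process needs only `numpy`; the function name and defaults below are illustrative:

```python
import numpy as np

def generate_ar1(n_steps, phi=0.8, sigma=0.1, seed=0):
    """Generate an AR(1) sequence: x_t = phi * x_{t-1} + eps_t,
    with eps_t ~ N(0, sigma^2). Consecutive samples are correlated,
    unlike an iid draw."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_steps)
    for t in range(1, n_steps):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

series = generate_ar1(100)
print(series.shape)  # (100,)
```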