
DeepTab is a Python package that simplifies tabular deep learning by providing a suite of models for regression, classification, and distributional regression tasks. It includes models such as Mambula...

32 DeepTab issues

**Describe the bug** I'm new to this field, so please tell me if I'm doing anything wrong: `predict_proba` of ModernNCAClassifier always returns values greater than 0.5, which results in...

bug
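For anyone triaging this: a quick way to check whether the reported behaviour is self-consistent is to verify that the rows of `predict_proba` sum to one. A minimal diagnostic sketch, assuming the sklearn-style interface the package advertises; `clf` and `X` are hypothetical placeholders for a fitted classifier and its test features:

```python
import numpy as np

def check_proba(clf, X):
    # clf: a fitted classifier exposing predict_proba (hypothetical placeholder)
    # X:   the features to score (hypothetical placeholder)
    proba = clf.predict_proba(X)
    # Rows of a probability matrix must sum to 1, so in a binary
    # problem both columns cannot exceed 0.5 on the same row.
    assert np.allclose(proba.sum(axis=1), 1.0), "rows do not sum to 1"
    return proba.min(axis=0), proba.max(axis=0)
```

If the assertion fails, the outputs are logits or otherwise unnormalized scores rather than probabilities, which would explain every value exceeding 0.5.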

When the number of features exceeds 20, MambularRegressor throws the following error. How can this be resolved? KeyError Traceback (most recent call last) in () 18 # print("y_train...

question
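A hedged reproduction sketch for this report; the import path and the `fit(X, y)` call pattern are assumed from the package's sklearn-style API and may differ from the actual code:

```python
import numpy as np
import pandas as pd
from mambular.models import MambularRegressor  # import path assumed

rng = np.random.default_rng(0)
n_features = 25  # above the reported 20-feature threshold
X = pd.DataFrame(rng.normal(size=(200, n_features)),
                 columns=[f"f{i}" for i in range(n_features)])
y = pd.Series(rng.normal(size=200))

model = MambularRegressor()
model.fit(X, y)  # reported to raise KeyError once n_features > 20
```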

Bumps [torch](https://github.com/pytorch/pytorch) from 2.5.1 to 2.6.0. Release notes Sourced from torch's releases. PyTorch 2.6.0 Release Highlights Tracked Regressions Backwards Incompatible Changes Deprecations New Features Improvements Bug fixes Performance Documentation Developers...

dependencies
python

By replacing explicit tensor operations with `torch.einsum()` in the Zero-Order-Hold transformation, performance and readability can be improved. Replacing the original Zero-Order-Hold transformation in line 518 of `mamba_arch.py` ```python deltaA =...

enhancement
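The snippet in the issue is truncated, so here is a self-contained sketch of the einsum-based zero-order-hold discretization commonly used in Mamba implementations; the tensor shapes are assumptions, not taken from `mamba_arch.py`:

```python
import torch

# Assumed shapes: delta (batch, seq_len, d_inner), A (d_inner, d_state)
batch, seq_len, d_inner, d_state = 2, 16, 64, 8
delta = torch.rand(batch, seq_len, d_inner)
A = -torch.rand(d_inner, d_state)

# Explicit broadcast version that einsum would replace:
deltaA_explicit = torch.exp(delta.unsqueeze(-1) * A)

# einsum version: the contraction pattern is stated in one place,
# which is the readability win the issue is after.
deltaA = torch.exp(torch.einsum("bld,dn->bldn", delta, A))

assert torch.allclose(deltaA, deltaA_explicit)
```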

Hi, great project. Is there a way to extract feature importance for a model once you have trained it? Or other interpretability algorithms? Have a nice day, Salvatore

question
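Until the package answers this natively, model-agnostic tools work with any estimator exposing a sklearn-compatible interface, which the package advertises. A sketch using scikit-learn's permutation importance; `model`, `X_test`, and `y_test` are hypothetical placeholders, and the `scoring` choice assumes a regressor:

```python
from sklearn.inspection import permutation_importance

def feature_importance(model, X_test, y_test):
    # model: a fitted estimator with a sklearn-compatible predict
    # (assumed to hold for DeepTab models; verify against the docs)
    result = permutation_importance(model, X_test, y_test,
                                    scoring="r2",  # regressor assumed
                                    n_repeats=10, random_state=0)
    return result.importances_mean  # one importance score per feature
```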

Include Positional Invariance Layer -> independent of model class. Optionally have sub-module invariance layers between sub-layers.

enhancement
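A minimal sketch of what such a layer could look like: a Deep-Sets-style, permutation-equivariant block that mixes each feature token with a pooled summary, making it independent of feature order and of the surrounding model class. Names and shapes are illustrative, not the DeepTab API:

```python
import torch
import torch.nn as nn

class PositionalInvarianceLayer(nn.Module):
    # Permutation-equivariant over the feature axis: shuffling the
    # feature tokens shuffles the output the same way, and the pooled
    # summary each token sees is order-independent.
    def __init__(self, d_model: int):
        super().__init__()
        self.phi = nn.Linear(d_model, d_model)
        self.rho = nn.Linear(d_model, d_model)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_features, d_model)
        pooled = self.phi(tokens).mean(dim=1, keepdim=True)
        return tokens + self.rho(pooled)
```

Keeping the (batch, n_features, d_model) shape is what would let such a block sit between arbitrary sub-layers, as the note suggests.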

Include pre-encoded column name embeddings and use them, e.g., as a Hadamard product between encodings and embeddings.

enhancement
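A sketch of the proposed fusion, with hypothetical names; the column-name embeddings are assumed to be precomputed (e.g. by a frozen text encoder), one vector per column:

```python
import torch

batch, n_features, d_model = 8, 5, 32

# Pre-encoded column-name embeddings: one fixed vector per column,
# assumed to come from an external text encoder.
col_name_emb = torch.randn(n_features, d_model)

# Per-sample feature encodings from the model's embedding layer.
feature_enc = torch.randn(batch, n_features, d_model)

# Hadamard (element-wise) product ties each token to its column identity.
fused = feature_enc * col_name_emb.unsqueeze(0)  # (batch, n_features, d_model)
```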

**Context** There are currently no examples in the documentation about the usage of mambular.preprocessing.Preprocessor. Apparently the Preprocessor is applied in the fit() function. **Describe the task you are trying to achieve.**...

question
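Pending proper docs, a heavily hedged usage sketch: the class path comes from the issue itself, but the sklearn-style `fit_transform` convention is an assumption, so verify against the package source before relying on it:

```python
import pandas as pd
from mambular.preprocessing import Preprocessor  # path taken from the issue

X = pd.DataFrame({"age": [25, 32, 47], "city": ["a", "b", "a"]})
y = pd.Series([0.1, 0.7, 0.3])

pre = Preprocessor()           # constructor arguments unknown; assumption
X_t = pre.fit_transform(X, y)  # sklearn-style convention; assumption
```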

Include Jamba arch -> default to an attention layer at the beginning. @nwuestefeld

enhancement
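A sketch of a Jamba-style hybrid stack consistent with this note: attention at the start, then mostly SSM blocks with attention interleaved periodically. `mamba_block_factory` is a hypothetical hook standing in for the repo's own Mamba block:

```python
import torch
import torch.nn as nn

class SelfAttentionBlock(nn.Module):
    # Thin wrapper giving attention the same (B, L, D) -> (B, L, D)
    # interface as an SSM block.
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        h = self.norm(x)
        out, _ = self.attn(h, h, h, need_weights=False)
        return x + out

class JambaStyleStack(nn.Module):
    # Attention at layer 0 (the "default to attention at the beginning"
    # note), then Mamba blocks with attention every `attn_every` layers.
    def __init__(self, d_model, n_layers, mamba_block_factory, attn_every=4):
        super().__init__()
        self.layers = nn.ModuleList(
            SelfAttentionBlock(d_model) if i % attn_every == 0
            else mamba_block_factory(d_model)
            for i in range(n_layers)
        )

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# Usage with a placeholder block; swap in the real Mamba block.
stack = JambaStyleStack(64, 8, lambda d: nn.Identity())
out = stack(torch.randn(2, 16, 64))
```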

Include our own Mamba/Mamba2 Triton version. @nwuestefeld

enhancement