BaseModels#
This module provides the foundational classes for Mambular models, including a range of neural network architectures tailored to tabular data.
Module | Description
---|---
`BaseModel` | Abstract base class defining the core structure and initialization logic for Mambular models.
`TaskModel` | PyTorch Lightning module for managing model training, validation, and testing workflows.
`Mambular` | Flexible neural network leveraging the Mamba architecture, with configurable normalization techniques for tabular data.
`MLP` | Multi-layer perceptron (MLP) model designed for tabular tasks, initialized with a custom configuration.
`ResNet` | Deep residual network (ResNet) model optimized for structured/tabular datasets.
`FTTransformer` | Feature Tokenizer Transformer (FT-Transformer) model for tabular tasks, incorporating advanced embedding and normalization techniques.
`TabTransformer` | TabTransformer model leveraging attention mechanisms for tabular data processing.
`NODE` | Neural Oblivious Decision Ensembles (NODE) for tabular tasks, combining decision-tree logic with deep learning.
`TabM` | TabM architecture for tabular data, implementing batch-ensembling MLP techniques.
`NDTF` | Neural Decision Tree Forest (NDTF) model for tabular tasks, blending decision-tree concepts with neural networks.
`TabulaRNN` | Recurrent neural network (RNN) model, including LSTM and GRU variants, tailored for sequential or time-series tabular data.
`MambAttention` | Attention-based architecture for tabular tasks, combining feature-importance weighting with advanced normalization techniques.
`SAINT` | SAINT model: a Transformer-based architecture using row and column attention.
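The first entry above describes the pattern the whole module is built on: an abstract base class fixes the initialization contract, and each concrete architecture fills in its own network. The sketch below illustrates that pattern in plain Python; the names `BaseTabularModel` and `TinyMLP` are hypothetical stand-ins for illustration, not Mambular's actual classes, which subclass `torch.nn.Module` and carry real layers.

```python
from abc import ABC, abstractmethod

class BaseTabularModel(ABC):
    """Hypothetical sketch of the abstract-base-class pattern:
    the base class owns the config and the init sequence,
    subclasses supply the architecture and the forward pass."""

    def __init__(self, config: dict):
        self.config = config
        self._build()  # subclass hook: construct the architecture

    @abstractmethod
    def _build(self) -> None: ...

    @abstractmethod
    def forward(self, x: list[float]) -> list[float]: ...

class TinyMLP(BaseTabularModel):
    """Toy concrete model: a single configurable scaling 'layer'."""

    def _build(self) -> None:
        self.weight = self.config.get("weight", 1.0)

    def forward(self, x: list[float]) -> list[float]:
        return [self.weight * v for v in x]

model = TinyMLP({"weight": 2.0})
print(model.forward([1.0, 2.0]))  # [2.0, 4.0]
```

Because `__init__` lives only in the base class, every architecture in the table shares one construction path and differs only in its `_build`-style hook and forward logic.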