⚠️ Warning: TabICL can consume large amounts of memory. It is not recommended for datasets with more than 50k samples or 150 features.
TabICL is a pre-trained Tabular Foundation Model for In-Context Learning, built on a transformer architecture. It delivers strong performance on classification tasks with minimal configuration, as the sketch below illustrates. However, its memory consumption is high and scales with dataset size.
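A minimal usage sketch, assuming the scikit-learn-style `TabICLClassifier` estimator exposed by the `tabicl` package (the dataset and split below are illustrative):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Assumes the scikit-learn-compatible estimator shipped by the `tabicl` package.
from tabicl import TabICLClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Minimal configuration: no hyperparameter tuning. Because the model learns
# in-context, fit() is cheap; the transformer forward pass happens at predict().
clf = TabICLClassifier()
clf.fit(X_train, y_train)

print(f"Accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```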
For optimal performance and resource management, it is recommended to run the model in a containerized environment, which ensures reproducibility, isolation, and efficient allocation of computational resources. A hypothetical container recipe is sketched below.
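The Dockerfile below is one possible setup, not a prescribed configuration: the base image, the installed packages, and the `train.py` entry point are assumptions for the sketch.

```dockerfile
# Hypothetical container recipe; adjust the base image and versions to your setup.
FROM python:3.11-slim

# Installing pinned dependencies inside the image keeps runs reproducible.
RUN pip install --no-cache-dir tabicl scikit-learn

WORKDIR /app
COPY train.py .

# Given the memory warning above, consider capping container memory at run
# time, e.g.:  docker run --memory=16g tabicl-demo
CMD ["python", "train.py"]
```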