Mamba-Chat is the first chat-based language model built on a state-space model architecture rather than a transformer.
The model is based on Albert Gu's and Tri Dao's work *Mamba: Linear-Time Sequence Modeling with Selective State Spaces*, as well as their model implementation. This repository provides training and fine-tuning code for the model, built on modifications of the Hugging Face Trainer.