Institute of Fundamental Technological Research
Polish Academy of Sciences

Partners

P. Miłoś


Conference papers
1.  Pióro M., Ciebiera K., Król K., Ludziejewski J., Krutul M., Krajewski J., Antoniak S., Miłoś P., Cygan M., Jaszczur S., MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts, Next Generation of Sequence Modeling Architectures Workshop at International Conference on Machine Learning 2024, 2024-07-26/07-26, Vienna (AT), pp. 1-4, 2024

Abstract:
State Space Models (SSMs) have become serious contenders in the field of sequential modeling, challenging the dominance of Transformers. At the same time, Mixture of Experts (MoE) has significantly improved Transformer-based Large Language Models, including recent state-of-the-art open models. We propose that to unlock the potential of SSMs for scaling, they should be combined with MoE. We showcase this on Mamba, a recent SSM-based model that achieves remarkable performance. Our model, MoE-Mamba, outperforms Mamba and matches the performance of Transformer-MoE. In particular, MoE-Mamba reaches the same performance as Mamba in 2.35x fewer training steps while preserving the inference performance gains of Mamba over the Transformer.
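To make the idea concrete, below is a minimal PyTorch sketch of the interleaving scheme the abstract describes: a block that alternates a sequence-mixing (Mamba-style) layer with a sparse Mixture-of-Experts feedforward layer. This is not the authors' implementation; MambaStub is a stand-in (a causal depthwise convolution) so the sketch runs without the mamba_ssm package, top-1 "switch" routing is one common MoE choice rather than the paper's confirmed design, and all class names and dimensions are hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MambaStub(nn.Module):
        """Stand-in for a selective state-space (Mamba) layer: a causal
        depthwise convolution plays the role of the sequence mixer here."""
        def __init__(self, d_model: int, kernel: int = 4):
            super().__init__()
            self.conv = nn.Conv1d(d_model, d_model, kernel,
                                  groups=d_model, padding=kernel - 1)

        def forward(self, x):  # x: (batch, seq, d_model)
            y = self.conv(x.transpose(1, 2))[..., : x.size(1)]  # trim -> causal
            return y.transpose(1, 2)

    class MoEFeedForward(nn.Module):
        """Switch-style feedforward: each token is routed to its top-1 expert."""
        def __init__(self, d_model: int, d_ff: int, n_experts: int):
            super().__init__()
            self.router = nn.Linear(d_model, n_experts)
            self.experts = nn.ModuleList([
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                              nn.Linear(d_ff, d_model))
                for _ in range(n_experts)
            ])

        def forward(self, x):  # x: (batch, seq, d_model)
            tokens = x.reshape(-1, x.size(-1))           # (n_tokens, d_model)
            gate = F.softmax(self.router(tokens), dim=-1)
            weight, expert_idx = gate.max(dim=-1)        # top-1 routing
            out = torch.zeros_like(tokens)
            for i, expert in enumerate(self.experts):
                mask = expert_idx == i                   # tokens sent to expert i
                if mask.any():
                    out[mask] = weight[mask].unsqueeze(-1) * expert(tokens[mask])
            return out.reshape_as(x)

    class MoEMambaBlock(nn.Module):
        """One interleaved block: sequence mixing, then a sparse MoE
        feedforward, each wrapped in a pre-norm residual connection."""
        def __init__(self, d_model: int, d_ff: int, n_experts: int):
            super().__init__()
            self.norm1 = nn.LayerNorm(d_model)
            self.mamba = MambaStub(d_model)
            self.norm2 = nn.LayerNorm(d_model)
            self.moe = MoEFeedForward(d_model, d_ff, n_experts)

        def forward(self, x):
            x = x + self.mamba(self.norm1(x))
            x = x + self.moe(self.norm2(x))
            return x

    block = MoEMambaBlock(d_model=64, d_ff=256, n_experts=8)
    y = block(torch.randn(2, 16, 64))  # -> shape (2, 16, 64)

Because only one expert fires per token, the MoE layer adds parameters (and thus capacity) without a proportional increase in per-token compute, which is the scaling argument the abstract makes for combining MoE with SSMs.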

Affiliations:
Pióro M. - IPPT PAN
Ciebiera K. - other affiliation
Król K. - other affiliation
Ludziejewski J. - other affiliation
Krutul M. - other affiliation
Krajewski J. - other affiliation
Antoniak S. - other affiliation
Miłoś P. - other affiliation
Cygan M. - other affiliation
Jaszczur S. - other affiliation

Category A Plus
