
    Mistral: Codestral Mamba

    mistralai/codestral-mamba

    Created Jul 19, 2024 · 256,000 token context

    A 7.3B parameter Mamba-based model designed for code and reasoning tasks.

    • Linear time inference, allowing for theoretically infinite sequence lengths
    • 256k token context window
    • Optimized for quick responses, especially beneficial for code productivity
    • Performs comparably to state-of-the-art transformer models in code and reasoning tasks
    • Available under the Apache 2.0 license for free use, modification, and distribution
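    Because OpenRouter exposes an OpenAI-compatible chat-completions API, the model can be queried with its slug, mistralai/codestral-mamba. The snippet below is a minimal sketch of such a request; the OPENROUTER_API_KEY environment variable name and the example prompt are assumptions, not part of this listing.

        # Minimal sketch: call Codestral Mamba via OpenRouter's chat-completions endpoint.
        # OPENROUTER_API_KEY is an assumed environment variable holding your API key.
        import os
        import requests

        response = requests.post(
            "https://openrouter.ai/api/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
            json={
                "model": "mistralai/codestral-mamba",
                "messages": [
                    {"role": "user", "content": "Write a Python function that reverses a linked list."}
                ],
            },
            timeout=60,
        )
        response.raise_for_status()
        print(response.json()["choices"][0]["message"]["content"])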

    Recent activity on Codestral Mamba

    Total usage per day on OpenRouter: not enough data to display yet.