© 2023 – 2025 OpenRouter, Inc

    Mistral: Mixtral 8x22B Instruct

    mistralai/mixtral-8x22b-instruct

    Created Apr 17, 2024 · 65,536 context
    $2/M input tokens · $6/M output tokens
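A quick sketch of what the listed rates mean in practice. The per-token prices ($2 per million input tokens, $6 per million output tokens) come from the listing above; the request sizes in the example are illustrative.

```python
# Illustrative cost estimate from the listed rates:
# $2 per million input tokens, $6 per million output tokens.
INPUT_RATE = 2.0 / 1_000_000    # dollars per input token
OUTPUT_RATE = 6.0 / 1_000_000   # dollars per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed per-token rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# e.g. a prompt filling the full 65,536-token context, with a 1,000-token reply
print(round(request_cost(65_536, 1_000), 4))  # → 0.1371
```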

    Mistral's official instruct fine-tuned version of Mixtral 8x22B. It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

    • strong math, coding, and reasoning
    • large context length (64k)
    • fluency in English, French, Italian, German, and Spanish

    See benchmarks in the launch announcement. #moe
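A back-of-envelope decomposition of the parameter counts quoted above, assuming Mixtral's standard top-2 routing over 8 experts per token (an assumption of this sketch, not stated in the listing). Solving total = shared + 8·expert and active = shared + 2·expert gives rough per-expert and shared sizes; the figures are approximations, not an official breakdown.

```python
# Rough split of total vs. active parameters for a mixture-of-experts
# model, assuming 2 of 8 experts are active per token (top-2 routing).
TOTAL_B = 141.0    # total parameters, billions (from the listing)
ACTIVE_B = 39.0    # active parameters per token, billions (from the listing)
N_EXPERTS = 8      # assumed expert count
N_ACTIVE = 2       # assumed experts active per token

# total  = shared + N_EXPERTS * expert
# active = shared + N_ACTIVE  * expert
expert_b = (TOTAL_B - ACTIVE_B) / (N_EXPERTS - N_ACTIVE)  # per-expert FFN size
shared_b = ACTIVE_B - N_ACTIVE * expert_b                 # shared (attention, embeddings)

print(f"per-expert ≈ {expert_b:.0f}B, shared ≈ {shared_b:.0f}B")  # → per-expert ≈ 17B, shared ≈ 5B
```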

    Providers for Mixtral 8x22B Instruct

    OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
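The routing described above is transparent to the caller: you send one request with the model slug from this page, and OpenRouter picks the provider. A minimal sketch using OpenRouter's OpenAI-compatible chat completions endpoint; the prompt text and the `OPENROUTER_API_KEY` environment-variable name are illustrative choices, not part of the listing.

```python
# Minimal sketch: one chat completion request to OpenRouter for this model.
# The endpoint and payload shape follow OpenRouter's OpenAI-compatible API;
# the network call only runs if an API key is present in the environment.
import json
import os
import urllib.request

payload = {
    "model": "mistralai/mixtral-8x22b-instruct",
    "messages": [
        {"role": "user", "content": "Summarize Mixtral 8x22B in one sentence."}
    ],
}

api_key = os.environ.get("OPENROUTER_API_KEY")  # illustrative variable name
if api_key:
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, the same payload works with any OpenAI-style client pointed at the OpenRouter base URL; provider selection and fallback happen server-side.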