
Mistral: Mixtral 8x22B (base)

mistralai/mixtral-8x22b

Created Apr 10, 2024 · 65,536-token context

Mixtral 8x22B is a large-scale sparse mixture-of-experts (MoE) language model from Mistral AI. It has 8 experts of 22 billion parameters each, and each token is routed to 2 experts at a time.
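The routing step is the core of the MoE design: a small gating network scores all 8 experts for each token and only the top 2 actually run, which is why the active parameter count is far below the total. Below is a minimal NumPy sketch of top-2 routing; the tiny dimensions and purely linear experts are illustrative assumptions, not Mistral's implementation.

```python
import numpy as np

def top2_moe_layer(x, gate_w, expert_ws):
    """Route one token through a top-2 mixture-of-experts layer (sketch).

    x:         (d,) token hidden state
    gate_w:    (d, n_experts) router weights
    expert_ws: list of n_experts (d, d) expert weight matrices (illustrative;
               real experts are feed-forward sub-networks)
    """
    logits = x @ gate_w                      # one router score per expert
    top2 = np.argsort(logits)[-2:]           # indices of the 2 highest-scoring experts
    probs = np.exp(logits[top2])
    probs /= probs.sum()                     # softmax over the selected pair only
    # Only the 2 chosen experts execute; the other 6 are skipped entirely.
    return sum(p * (expert_ws[i] @ x) for p, i in zip(probs, top2))

rng = np.random.default_rng(0)
d, n_experts = 16, 8
out = top2_moe_layer(
    rng.normal(size=d),
    rng.normal(size=(d, n_experts)),
    [rng.normal(size=(d, d)) for _ in range(n_experts)],
)
print(out.shape)  # (16,)
```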

It was released via X.

#moe
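Since the page lists the model slug mistralai/mixtral-8x22b, here is a minimal sketch of requesting it through OpenRouter's OpenAI-compatible endpoint. The API key is a placeholder, and the prompt and max_tokens are illustrative; note this is a base (non-instruct) model, so plain continuation prompts tend to fit better than chat-style instructions.

```python
# pip install openai
from openai import OpenAI

# OpenRouter exposes an OpenAI-compatible API; models are addressed by slug.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",  # placeholder
)

completion = client.chat.completions.create(
    model="mistralai/mixtral-8x22b",
    messages=[{"role": "user", "content": "The capital of France is"}],
    max_tokens=16,
)
print(completion.choices[0].message.content)
```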
