Open Mixtral 8x22B

by Mistral

Model ID:

open-mixtral-8x22b
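As a sketch of how the model ID above is used, the request below targets Mistral's public chat-completions endpoint with `open-mixtral-8x22b`. The helper names (`build_payload`, `ask`) and the `MISTRAL_API_KEY` environment variable are illustrative assumptions, not part of this page.

```python
import json
import os
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    """Assemble a request body using the model ID from this page."""
    return {
        "model": "open-mixtral-8x22b",
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str) -> str:
    """Send the request; assumes a MISTRAL_API_KEY environment variable."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```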

Frequently Asked Questions

  1. How does Open Mixtral 8x22B differ from the standard Mixtral 8x22B?
    Open Mixtral 8x22B incorporates open-source principles, allowing greater accessibility and flexibility compared to the standard Mixtral model, which may have more restrictive usage policies or licensing.

  2. What are the main strengths of Open Mixtral 8x22B in language tasks?
    Open Mixtral 8x22B excels at instruction-following tasks and performs strongly across a range of language benchmarks; its sparse mixture-of-experts architecture makes it versatile for language generation applications.

  3. Is Open Mixtral 8x22B suitable for commercial use?
    Yes, Open Mixtral 8x22B is suitable for commercial use, and it is particularly appealing to developers who prefer open-source solutions they can customize to their specific needs.


Model Specifications

Release Date:

April 28, 2024

Max. Context Tokens:

32K

Max. Output Tokens:

4K
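As a rough sketch of budgeting against the limits listed above (32K context tokens, 4K output tokens), the check below estimates whether a prompt fits. The 4-characters-per-token ratio is a heuristic assumption, not the model's actual tokenizer.

```python
MAX_CONTEXT_TOKENS = 32_000  # context window listed above
MAX_OUTPUT_TOKENS = 4_000    # max completion size listed above

def estimate_tokens(text: str) -> int:
    """Crude token estimate: ~4 characters per token (assumption)."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_output: int = MAX_OUTPUT_TOKENS) -> bool:
    """True if the estimated prompt tokens plus the reserved completion
    budget stay within the context window."""
    return estimate_tokens(prompt) + reserved_output <= MAX_CONTEXT_TOKENS
```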

Knowledge Cut-Off Date:

October 2021

License:

Open-Source

Technical Report/Model Card:

LMSys Elo Score:

1147

Berkeley Function Calling Ability Score:

43

Pricing

$/Million Input Tokens:

$2

$/Million Output Tokens:

$6

Live updates via Portkey Pricing API. Coming Soon...
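A minimal sketch of a per-request cost estimate, assuming the rates listed above ($2 per million input tokens, $6 per million output tokens) are current; check live pricing before relying on these numbers.

```python
INPUT_RATE = 2.0   # USD per million input tokens (rate listed above)
OUTPUT_RATE = 6.0  # USD per million output tokens (rate listed above)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated request cost in USD for the given token counts."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000
```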

© 2024 Portkey, Inc. All rights reserved
