Overview
Open Mixtral 8x22B is the open-weight release of Mistral AI's Mixtral 8x22B model, letting users run and adapt its capabilities without API-only restrictions. The model is optimized for a broad range of language tasks and delivers robust performance across different applications.
Specializations
- Mixture-of-Experts (MoE) architecture: allows the model to efficiently utilize a large number of parameters, improving performance and efficiency.
- Open-source availability: its open-source nature allows for customization and further development by the community.
- Strong performance across tasks: it performs well on text generation, translation, and code generation.
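The core idea of a sparse Mixture-of-Experts layer is that a small router picks a few experts per token and blends their outputs. The sketch below illustrates top-k routing only; the function names, the scalar "hidden state", and the tiny expert functions are simplifying assumptions for clarity, not Mixtral's actual implementation.

```javascript
// Convert router logits into a probability distribution over experts.
function softmax(xs) {
  const m = Math.max(...xs);
  const exps = xs.map(x => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Keep only the top-k experts by router probability and renormalize
// their weights so they sum to 1.
function routeTopK(routerLogits, k = 2) {
  const probs = softmax(routerLogits);
  const ranked = probs
    .map((p, i) => ({ expert: i, p }))
    .sort((a, b) => b.p - a.p)
    .slice(0, k);
  const total = ranked.reduce((a, e) => a + e.p, 0);
  return ranked.map(e => ({ expert: e.expert, weight: e.p / total }));
}

// A token's hidden state is processed only by the chosen experts, and
// their outputs are combined using the routing weights. This sparsity is
// why a MoE model can hold many parameters but activate only a fraction
// of them per token.
function moeForward(hidden, experts, routerLogits, k = 2) {
  return routeTopK(routerLogits, k).reduce(
    (acc, { expert, weight }) => acc + weight * experts[expert](hidden),
    0
  );
}
```

With k = 2, only two expert functions run per token regardless of how many experts exist, which is the efficiency argument made above.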
Integration Guide (JavaScript)
To use this model through Portkey, follow these steps:
1. Install the Portkey SDK:

```sh
npm install --save portkey-ai
```
2. Set up the client with Portkey:

```javascript
import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
  virtualKey: "VIRTUAL_KEY"  // your Mistral AI virtual key
})
```
3. Make a request:

```javascript
const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'open-mixtral-8x22b', // model name for Open Mixtral 8x22B
});

console.log(chatCompletion.choices[0].message.content);
```
Live API Statistics and Performance Metrics
LMSys Arena Elo Score: 1147
LMSys Leaderboard Rank: 61

The leaderboard evaluates LLMs through a combination of user feedback and automated scoring.
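For context on how an Arena-style score moves, here is a minimal Elo update from a single pairwise "battle". This is an illustrative sketch with a conventional K-factor of 32; the actual LMSys leaderboard computation is more involved than a one-step Elo update.

```javascript
// Expected score of player A against player B under the Elo model:
// a 400-point rating gap corresponds to ~10:1 expected odds.
function expectedScore(ratingA, ratingB) {
  return 1 / (1 + Math.pow(10, (ratingB - ratingA) / 400));
}

// result: 1 if A wins, 0 if B wins, 0.5 for a tie.
// K (assumed 32 here) controls how far ratings move per battle.
function eloUpdate(ratingA, ratingB, result, K = 32) {
  const eA = expectedScore(ratingA, ratingB);
  return [
    ratingA + K * (result - eA),
    ratingB + K * ((1 - result) - (1 - eA)),
  ];
}
```

Two equally rated models at 1147 that split a win would each gain or lose 16 points, so a stable score like the one above reflects many such battles averaging out.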
Model Specifications
Release Date: April 28, 2024
Max. Context Tokens: 32K
Max. Output Tokens: 4K
Model Size: 8x22B sparse MoE (141B total parameters, ~39B active per token)
Knowledge Cut-Off Date: October 2021
License: Open-Source
Technical Report/Model Card:
Berkeley Function Calling Ability Score: Not available
Pricing
Input Tokens: $2 / million
Output Tokens: $6 / million
Live updates via Portkey Pricing API. Coming Soon...
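Given the per-million-token rates above, a request's cost is straightforward to estimate. The rates constant below simply restates the pricing table; the function name is an arbitrary choice for this sketch.

```javascript
// USD per million tokens for open-mixtral-8x22b, per the pricing table above.
const RATES = { input: 2, output: 6 };

// Estimate the dollar cost of one request from its token counts.
function estimateCostUSD(inputTokens, outputTokens, rates = RATES) {
  return (inputTokens * rates.input + outputTokens * rates.output) / 1e6;
}
```

For example, a request with 10,000 input tokens and 2,000 output tokens costs about $0.032.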
© 2024 Portkey, Inc. All rights reserved