Overview
Mixtral 8x7B Instruct v0.1 is a sparse Mixture-of-Experts model built from eight 7B-parameter experts (roughly 47B parameters in total, with about 13B active per token), fine-tuned for instruction-based tasks. It demonstrates exceptional performance in following user prompts and generating relevant responses, making it well suited for interactive applications such as customer support and content creation.
Specializations
Mixture-of-Experts (MoE) architecture: a router activates only a small subset of expert sub-networks for each token, so the model benefits from a large total parameter count while keeping per-token compute low (see the sketch after this list).
Instruction-following capabilities: It is fine-tuned to follow instructions effectively, making it suitable for tasks like summarization, translation, and content generation.
Strong performance across varied tasks: it performs well on text generation, translation, and code generation.
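To make the routing idea behind a Mixture-of-Experts layer concrete, here is a minimal conceptual sketch in JavaScript. It is not Mixtral's actual implementation: the top-2 selection over eight experts mirrors the 8x7B design, but the router, expert functions, and all names here are illustrative assumptions.

// Conceptual sketch of top-2 expert routing in a Mixture-of-Experts layer.
// "experts" is an array of 8 expert functions; "router" scores each expert
// for the current token. All names are hypothetical, not Mixtral's code.
const TOP_K = 2;

function moeLayer(tokenVector, experts, router) {
  // Router assigns a score to every expert for this token.
  const scores = experts.map((_, i) => router(tokenVector, i));

  // Keep only the TOP_K highest-scoring experts.
  const topExperts = scores
    .map((score, index) => ({ score, index }))
    .sort((a, b) => b.score - a.score)
    .slice(0, TOP_K);

  // Softmax over the selected scores gives the mixing weights.
  const expScores = topExperts.map(e => Math.exp(e.score));
  const total = expScores.reduce((sum, v) => sum + v, 0);

  // Output is the weighted sum of the chosen experts' outputs; the other
  // experts are never evaluated, which is where the efficiency comes from.
  return topExperts.reduce((acc, e, k) => {
    const weight = expScores[k] / total;
    const out = experts[e.index](tokenVector); // expert forward pass
    return acc.map((v, d) => v + weight * out[d]);
  }, new Array(tokenVector.length).fill(0));
}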
Integration Guide (JavaScript)
To use this model through Portkey, follow these steps:
1. Install Portkey SDK:
npm install --save portkey-ai
2. Set up client with Portkey:
// Import and initialize Portkey
import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",  // Replace with your Portkey API key
  virtualKey: "VIRTUAL_KEY"   // Your Together AI Virtual Key created in Portkey
})
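If you prefer not to hard-code credentials, the same constructor arguments can be read from environment variables. This is a minimal sketch; the variable names PORTKEY_API_KEY and TOGETHER_VIRTUAL_KEY are illustrative choices, not required names.

// Same initialization, with keys pulled from the environment instead of literals.
import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,          // Portkey API key
  virtualKey: process.env.TOGETHER_VIRTUAL_KEY  // Together AI Virtual Key created in Portkey
})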
3. Make a request:
const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
});

console.log(chatCompletion.choices);
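A slightly fuller request is sketched below. It assumes the OpenAI-compatible generation parameters (max_tokens, temperature) are passed through unchanged and reads the generated text from the first choice; the prompt and parameter values are only examples.

// Request with explicit generation parameters (assumes OpenAI-compatible
// parameters such as max_tokens and temperature are supported end to end).
const completion = await portkey.chat.completions.create({
  messages: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'Summarize the benefits of a Mixture-of-Experts model in two sentences.' },
  ],
  model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
  max_tokens: 256,
  temperature: 0.7,
});

// The generated text lives on the first choice's message.
console.log(completion.choices[0].message.content);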
Live API Statistics and Performance Metrics
LMSys Arena Elo Score: 1114
LMSys Leaderboard Rank: 77
The LMSys Chatbot Arena ranks models using Elo scores derived from crowdsourced, head-to-head user comparisons.
Model Specifications
Release Date: 28/4/2024
Max. Context Tokens: 8K
Max. Output Tokens: 4K
Model Size: 8x7B (≈47B total parameters)
Knowledge Cut-Off Date: November 2022
MMLU: 70.6%
License: Open source (Apache 2.0)
Technical Report/Model Card:
Pricing
Input Tokens: $0.60 per million
Output Tokens: $0.60 per million
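As a quick worked example under the listed rates (illustrative token counts only): a request with 2,000 input tokens and 500 output tokens costs about 2,000/1,000,000 × $0.60 + 500/1,000,000 × $0.60 ≈ $0.0015.

// Rough per-request cost estimate at the listed rates
// (illustrative token counts; rates taken from the table above).
const inputRate = 0.6 / 1_000_000;   // $ per input token
const outputRate = 0.6 / 1_000_000;  // $ per output token

const cost = 2000 * inputRate + 500 * outputRate;
console.log(cost.toFixed(4)); // "0.0015"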
Live updates via Portkey Pricing API. Coming Soon...