# How to Integrate Mistral AI with Jan
Mistral AI provides two ways to use their Large Language Models (LLMs):

- API
- Open-source models on Hugging Face
To integrate Jan with Mistral AI, follow the steps below:

> **Note:** This tutorial demonstrates integrating Mistral AI with Jan using the API.
## Step 1: Configure Mistral API Key

- Obtain a Mistral API key from your Mistral dashboard.
- Insert the Mistral AI API key into `~/jan/engines/openai.json`.

```json title="~/jan/engines/openai.json"
{
  "full_url": "https://api.mistral.ai/v1/chat/completions",
  "api_key": "<your-mistral-ai-api-key>"
}
```
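If you prefer to script this step, a minimal Python sketch along these lines can write the engine config for you. It assumes the default `~/jan` data folder, and the key shown is a placeholder you must replace:

```python
import json
from pathlib import Path

# Jan's OpenAI-compatible engine config; adjust the path if your Jan
# data folder lives somewhere other than the default "~/jan".
config_path = Path.home() / "jan" / "engines" / "openai.json"

config = {
    "full_url": "https://api.mistral.ai/v1/chat/completions",
    "api_key": "<your-mistral-ai-api-key>",  # replace with your real key
}

# Create the engines folder if needed and write the config.
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
```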
## Step 2: Model Configuration

- Navigate to `~/jan/models`.
- Create a folder named `mistral-(modelname)` (e.g., `mistral-tiny`).
- Inside, create a `model.json` file with these settings:
  - Set `id` to the Mistral AI model ID.
  - Set `format` to `api`.
  - Set `engine` to `openai`.
  - Set `state` to `ready`.
```json title="~/jan/models/mistral-tiny/model.json"
{
  "sources": [
    {
      "filename": "mistral-tiny",
      "url": "https://mistral.ai/"
    }
  ],
  "id": "mistral-tiny",
  "object": "model",
  "name": "Mistral-7B-v0.2 (Tiny Endpoint)",
  "version": "1.0",
  "description": "Currently powered by Mistral-7B-v0.2, a better fine-tuning of the initial Mistral-7B released, inspired by the fantastic work of the community.",
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "author": "Mistral AI",
    "tags": ["General", "Big Context Length"]
  },
  "state": "ready",
  "engine": "openai"
}
```
> **Note:**
> - For more details regarding the `model.json` settings and parameters fields, please see here.
> - Mistral AI offers various endpoints. Refer to their endpoint documentation to select the one that fits your requirements. Here, we use the `mistral-tiny` model as an example.
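The folder-and-file layout above can also be scripted for any Mistral endpoint. The helper below is an illustrative sketch, not part of Jan; the function name `make_mistral_model` and its exact field choices are our assumptions based on the example `model.json` above:

```python
import json
from pathlib import Path

def make_mistral_model(model_id: str, name: str, models_dir: Path) -> Path:
    """Create <models_dir>/<model_id>/model.json for a Mistral API endpoint."""
    folder = models_dir / model_id
    folder.mkdir(parents=True, exist_ok=True)
    model = {
        "sources": [{"filename": model_id, "url": "https://mistral.ai/"}],
        "id": model_id,          # must match the Mistral AI model ID
        "object": "model",
        "name": name,
        "version": "1.0",
        "format": "api",         # remote endpoint, not a local GGUF file
        "settings": {},
        "parameters": {},
        "metadata": {"author": "Mistral AI", "tags": ["General"]},
        "state": "ready",
        "engine": "openai",      # Jan's OpenAI-compatible engine
    }
    path = folder / "model.json"
    path.write_text(json.dumps(model, indent=2))
    return path
```

For example, `make_mistral_model("mistral-tiny", "Mistral-7B-v0.2 (Tiny Endpoint)", Path.home() / "jan" / "models")` would recreate the folder described above.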
## Step 3: Start the Model

- Restart Jan and navigate to the Hub.
- Locate your model and click the **Use** button.
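If the model does not respond in Jan, you can sanity-check your key against the same endpoint directly. The sketch below only builds the request; actually sending it (shown commented out) requires a valid key and network access:

```python
import json
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(api_key: str, prompt: str,
                  model: str = "mistral-tiny") -> urllib.request.Request:
    """Build a chat-completions request like the one Jan's openai engine sends."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# To actually send it (needs a real key and network access):
# with urllib.request.urlopen(build_request("<your-mistral-ai-api-key>", "Hello!")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```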