La Plateforme provides three chat endpoints with different performance/price tradeoffs and an embedding endpoint, serving instruction-fine-tuned generative models and an embedding model for a range of applications.
A cost-effective endpoint serving Mistral 7B Instruct v0.2, with a score of 7.6 on MT-Bench, and supporting English only.
A balanced endpoint serving Mixtral 8x7B, with a score of 8.3 on MT-Bench, and supporting English, French, Italian, German, Spanish, and code.
A high-quality endpoint serving a prototype model, with a score of 8.6 on MT-Bench, and supporting English, French, Italian, German, Spanish, and code.
An embedding endpoint serving an embedding model with an embedding dimension of 1024, designed for retrieval use cases, and achieving a retrieval score of 55.26 on MTEB.
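As a rough illustration of how the embedding endpoint might be queried over HTTP, the sketch below posts a small batch of texts and reads back one 1024-dimensional vector per input. The `/v1/embeddings` path, the `mistral-embed` model name, and the `MISTRAL_API_KEY` environment variable are assumptions made for the example, not details stated above.

```python
import os
import requests

API_URL = "https://api.mistral.ai/v1/embeddings"  # assumed REST path
API_KEY = os.environ["MISTRAL_API_KEY"]           # key obtained after registering

def embed(texts):
    """Request one embedding vector per input text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "mistral-embed", "input": texts},  # model name is an assumption
        timeout=30,
    )
    response.raise_for_status()
    # Each entry in "data" carries one embedding, in the same order as the inputs.
    return [item["embedding"] for item in response.json()["data"]]

vectors = embed(["What is La Plateforme?", "An API serving Mistral models."])
print(len(vectors), len(vectors[0]))  # expected: 2 1024
```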
La Plateforme's API follows the popular chat interface specification, with Python and JavaScript client libraries available, and supports system prompts to apply moderation to model outputs.
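To make the chat-style request format concrete, here is a hedged sketch of a completion call that includes a system prompt to moderate the answer. The `/v1/chat/completions` path, the `mistral-small` model name, and the wording of the system prompt are illustrative assumptions rather than details given above.

```python
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed REST path
API_KEY = os.environ["MISTRAL_API_KEY"]

def moderated_chat(user_message, model="mistral-small"):  # model name is an assumption
    """Send a chat request whose system prompt constrains the model's output."""
    payload = {
        "model": model,
        "messages": [
            # The system prompt steers the model toward moderated answers.
            {"role": "system", "content": "Answer helpfully and refuse harmful requests."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

print(moderated_chat("Summarise what La Plateforme offers."))
```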
Developers can use La Plateforme to deploy and customize open generative models for production use.
La Plateforme can be used for text generation, language translation, and other NLP tasks.
The platform can be integrated with various applications, such as chatbots, virtual assistants, and content generation tools; embedding-based retrieval is a common building block for these, sketched further below.
La Plateforme can be used for research and development purposes, such as testing and evaluating new models and techniques.
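As one example of wiring the embedding model into an application such as a retrieval-backed chatbot, the sketch below ranks documents against a query by cosine similarity. It assumes vectors like those returned by the embedding example above, and uses toy 3-dimensional vectors in place of real 1024-dimensional embeddings.

```python
import numpy as np

def rank_by_similarity(query_vec, doc_vecs):
    """Return document indices ordered from most to least similar to the query."""
    query = np.asarray(query_vec, dtype=float)
    docs = np.asarray(doc_vecs, dtype=float)
    scores = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
    return np.argsort(scores)[::-1]

# Toy 3-dimensional vectors stand in for real 1024-dimensional embeddings.
query = [0.1, 0.9, 0.2]
documents = [[0.0, 1.0, 0.1], [0.9, 0.1, 0.0]]
print(rank_by_similarity(query, documents))  # expected: [0 1]
```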
Register for API access on the Mistral AI website.
Choose the desired endpoint and plan according to your needs and budget.
Use the provided Python and JavaScript client libraries to query the endpoints (see the sketch after these steps).
Integrate La Plateforme with your application, following the API specifications and guidelines.
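For the client-library step, a minimal sketch using the official Python client might look like the following. It assumes the early (0.x) `mistralai` package, whose import paths and method names may differ in later releases, and the `mistral-tiny` model name is chosen only as an example.

```python
import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# The client reads the API key created when registering on the website.
client = MistralClient(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat(
    model="mistral-tiny",  # pick the endpoint that matches your needs and budget
    messages=[ChatMessage(role="user", content="Write a one-line greeting.")],
)
print(response.choices[0].message.content)
```

The JavaScript client follows the same general shape: construct a client with an API key and send a list of role/content messages to the chosen endpoint.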