Mistral AI

French artificial intelligence company


Mistral AI SAS is a French artificial intelligence (AI) startup, headquartered in Paris. It specializes in open-weight large language models (LLMs).[2][3]

Mistral AI SAS
Company type: Private
Industry: Artificial intelligence
Founded: 28 April 2023
Founders: Arthur Mensch, Guillaume Lample, Timothée Lacroix
Headquarters: Paris, France
Key people:
  • Arthur Mensch (CEO)
  • Guillaume Lample (Chief Scientist)
  • Timothée Lacroix (CTO)
Products:
  • Mistral 7B
  • Mixtral 8x7B
  • Mistral Medium
  • Mistral Large
  • Mistral Large 2 (123B)
  • Mixtral 8x22B
  • Codestral 22B
  • Codestral Mamba (7B)
  • Mathstral (7B)
  • Mistral NeMo 12B
  • Mistral Embed
Number of employees: 150 (2025)[1]
Website: mistral.ai

Namesake

The company is named after the mistral, a powerful, cold wind in southern France.[4]

History

Mistral AI was established in April 2023 by three French AI researchers, Arthur Mensch, Guillaume Lample and Timothée Lacroix.[5]

Mensch, an expert in advanced AI systems, is a former employee of Google DeepMind; Lample and Lacroix, meanwhile, are large-scale AI models specialists who had worked for Meta Platforms.[6]

The trio originally met during their studies at École Polytechnique.[4]

[Image: example generated with Le Chat from the prompt "Generate an image you feel represents yourself, Mistral AI"]
[Image: screenshot of Le Chat, Mistral AI's chatbot, describing Wikipedia]

Company operation


Philosophy

Mistral AI emphasizes openness and innovation in the AI field and positions itself as an alternative to proprietary models.[7]

The company has gained prominence as an alternative to proprietary AI systems as it aims to "democratize" AI by focusing on open-source innovation.[8]

Funding

In June 2023, the startup raised €105 million ($117 million) in a first funding round, with investors including the American fund Lightspeed Venture Partners, Eric Schmidt, Xavier Niel and JCDecaux. The Financial Times then estimated its valuation at €240 million ($267 million).

On 10 December 2023, Mistral AI announced that it had raised €385 million ($428 million) in its second funding round, involving the Californian fund Andreessen Horowitz, BNP Paribas and the software publisher Salesforce.[9]


By December 2023, it was valued at over $2 billion.[11][12][13]

On 16 April 2024, reporting revealed that Mistral was in talks to raise €500 million, a deal that would more than double its then-current valuation to at least €5 billion.[14]

In June 2024, Mistral AI secured a €600 million ($645 million) funding round, elevating its valuation to €5.8 billion ($6.2 billion).[15]

Led by the venture capital firm General Catalyst,[16] the round also drew additional contributions from existing investors. The funds are intended to support the company's expansion.

Based on valuation, the company at that point ranked fourth in the global AI race and first outside the San Francisco Bay Area, ahead of peers such as Cohere, Hugging Face, Inflection and Perplexity.[17]

Partnership with Microsoft

On 26 February 2024, Microsoft announced a new partnership with the company to expand its presence in the artificial intelligence industry.

Under the agreement, Mistral's language models will be made available on Microsoft's Azure cloud, and its multilingual conversational assistant Le Chat will be launched in the style of ChatGPT.[18]

Services

Le Chat

On November 19, 2024, the company announced updates to Le Chat (pronounced /lə tʃat/ in French).

The update added image generation, in partnership with Black Forest Labs, using the Flux Pro model.

It also introduced web search, allowing the assistant to retrieve reliable, up-to-date information from the internet.

It also launched Canvas, a collaborative interface in which the AI generates code and the user can modify it.

Mobile app

On February 6, 2025, Mistral AI released its AI assistant, Le Chat, on iOS and Android, making its language models accessible on mobile devices.

Le Chat offers features including web search, image generation, and real-time updates.

Mistral AI also introduced a Pro subscription tier, priced at $14.99 per month, which provides access to more advanced models, unlimited messaging, and web browsing.[19]

Models

  • Mistral Small 3 25.01 (January 2025; 24B parameters; Apache 2.0). At its release, Mistral Small 3 was benchmarked as the leader among "small" models below 70B, with capabilities comparable to those of larger models.[20][21]
  • Mistral Large 2 24.11 (November 2024; 123B parameters; Mistral Research License).[21]
  • Pixtral Large 24.11 (November 2024; 124B parameters; Mistral Research License). Introduced on November 19, 2024, as an improvement over Pixtral 12B, integrating a 1-billion-parameter visual encoder with Mistral Large 2. The model was also enhanced for long contexts and function calls.[22][21]
  • Ministral 8B 24.10 (October 2024; 8B parameters; Mistral Research License).[21]
  • Ministral 3B 24.10 (October 2024; 3B parameters; proprietary).[21]
  • Pixtral 24.09 (September 2024; 12B parameters; Apache 2.0).[21]
  • Mistral Large 2 24.07 (July 2024; 123B parameters; Mistral Research License). Announced on July 24, 2024, and released on Hugging Face. It is free under the Mistral Research License, with a separate commercial license for commercial use. Mistral AI claims it is fluent in dozens of languages, including many programming languages. Unlike the previous Mistral Large, this version was released with open weights. It has 123 billion parameters and a context length of 128,000 tokens.[21]
  • Codestral Mamba 7B (July 2024; 7B parameters; Apache 2.0). Based on the Mamba 2 architecture, which lets it generate responses even for long inputs.[23] Unlike Codestral, it was released under the Apache 2.0 license. While previous releases often included both a base model and an instruct version, only the instruct version of Codestral Mamba was released.[24][21]
  • Mathstral 7B (July 2024; 7B parameters; Apache 2.0). Released on July 16, 2024, focusing on STEM subjects.[25] Produced in collaboration with Project Numina[23] and released under the Apache 2.0 license with a context length of 32k tokens.[25][21]
  • Codestral 22B (May 2024; 22B parameters; Mistral Non-Production License). Mistral's first code-focused open-weight model, launched on May 29, 2024. Mistral claims it is fluent in more than 80 programming languages.[26] Codestral's own license forbids its use for commercial purposes.[27][21]
  • Mixtral 8x22B (April 2024; 141B parameters total; Apache 2.0). Like Mistral's previous open models, Mixtral 8x22B was released via a BitTorrent link on Twitter on April 10, 2024,[28] with a Hugging Face release soon after.[29] It uses an architecture similar to that of Mixtral 8x7B, but with each expert having 22 billion parameters instead of 7. In total the model contains 141 billion parameters, as some parameters are shared among the experts.[29][30][21]
  • Mistral Small (February 2024; parameter count undisclosed; proprietary). Like the Large model, Mistral Small was launched on February 26, 2024.[21]
  • Mistral Large 24.02 (February 2024; parameter count undisclosed; proprietary). Launched on February 26, 2024; Mistral claims it is second in the world only to OpenAI's GPT-4. It is fluent in English, French, Spanish, German, and Italian, with Mistral claiming understanding of both grammar and cultural context, and it provides coding capabilities. As of early 2024 it was Mistral's flagship model.[31] It is also available on Microsoft Azure.[32][21]
  • Mistral Medium (December 2023; parameter count undisclosed; proprietary). Trained in various languages including English, French, Italian, German, Spanish and code, scoring 8.6 on MT-Bench.[33] It ranks above Claude and below GPT-4 on the LMSys ELO Arena benchmark.[34] Mistral has not published the model's parameter count or architecture.[21]
  • Mixtral 8x7B (December 2023; 46.7B parameters; Apache 2.0). Like Mistral's first model, Mixtral 8x7B was released via a BitTorrent link posted on Twitter on December 9, 2023,[2] followed by a Hugging Face release and a blog post two days later.[35] Unlike the earlier Mistral 7B, Mixtral 8x7B uses a sparse mixture-of-experts architecture: the model has 8 distinct groups of "experts", giving it 46.7B parameters in total.[36][37] Each token activates only 12.9B parameters, so it runs at roughly the speed and cost of a 12.9B-parameter model.[35] A version fine-tuned to follow instructions, "Mixtral 8x7B Instruct", is also offered.[35][21]
  • Mistral 7B (September 2023; 7.3B parameters; Apache 2.0). A 7.3B-parameter language model using the transformer architecture, officially released on September 27, 2023, via a BitTorrent magnet link[38] and on Hugging Face[39] under the Apache 2.0 license. Mistral 7B employs grouped-query attention (GQA), a variant of the standard attention mechanism that computes attention within specific groups of hidden states rather than across all of them, improving efficiency and scalability.[40] Both a base model and an "instruct" model were released, the latter receiving additional tuning to follow chat-style prompts. The fine-tuned model is intended for demonstration purposes only and has no built-in guardrails or moderation.[41][21]
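The sparse mixture-of-experts design described for Mixtral can be sketched with a toy routing layer: a router scores 8 experts per token and only the top 2 are evaluated, which is why a 46.7B-parameter model activates only about 12.9B parameters per token. This is an illustrative sketch, not Mistral's implementation; the dimensions and random weights are arbitrary:

```python
# Toy sparse mixture-of-experts layer: top-2 routing over 8 experts.
# Illustrative only; dimensions and weights are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral 8x7B has 8 experts per MoE layer
TOP_K = 2         # only 2 experts are active per token

def moe_layer(x, router_w, expert_ws):
    """Route one token vector x through its top-k experts."""
    logits = router_w @ x                 # one routing score per expert
    top = np.argsort(logits)[-TOP_K:]     # indices of the 2 best-scoring experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                  # softmax over the chosen experts only
    # Weighted sum of the selected experts' outputs; the other 6 experts
    # are never evaluated, which is where the compute savings come from.
    return sum(g * (expert_ws[i] @ x) for g, i in zip(gates, top))

dim = 16
router_w = rng.normal(size=(NUM_EXPERTS, dim))
expert_ws = [rng.normal(size=(dim, dim)) for _ in range(NUM_EXPERTS)]
y = moe_layer(rng.normal(size=dim), router_w, expert_ws)
print(y.shape)  # (16,)
```

In the real model each "expert" is a full feed-forward block rather than a single matrix, and attention parameters are shared across experts, which is why the total is 46.7B rather than 8 × 7B.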

Performance


Mistral 7B

Mistral AI claimed in the Mistral 7B release blog post that the model outperforms LLaMA 2 13B on all benchmarks tested, and is on par with LLaMA 34B on many benchmarks tested,[41] despite having only 7 billion parameters, a small size compared to its competitors.
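Part of Mistral 7B's efficiency comes from the grouped-query attention described in the model table above, in which several query heads share a single key/value head, shrinking the key/value cache relative to standard multi-head attention. A toy NumPy illustration, not Mistral's implementation; the head counts and dimensions here are arbitrary:

```python
# Toy grouped-query attention: 8 query heads share 2 key/value heads
# (4 query heads per KV head). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

N_Q_HEADS = 8    # query heads
N_KV_HEADS = 2   # key/value heads; each serves 8 // 2 = 4 query heads
HEAD_DIM = 16
SEQ_LEN = 5

q = rng.normal(size=(N_Q_HEADS, SEQ_LEN, HEAD_DIM))
k = rng.normal(size=(N_KV_HEADS, SEQ_LEN, HEAD_DIM))
v = rng.normal(size=(N_KV_HEADS, SEQ_LEN, HEAD_DIM))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

group = N_Q_HEADS // N_KV_HEADS
outputs = []
for h in range(N_Q_HEADS):
    kv = h // group                               # KV head shared by this query head
    scores = q[h] @ k[kv].T / np.sqrt(HEAD_DIM)   # (SEQ_LEN, SEQ_LEN)
    outputs.append(softmax(scores) @ v[kv])
out = np.stack(outputs)
print(out.shape)  # (8, 5, 16)
```

Because only 2 rather than 8 key/value heads must be stored per token, the memory needed for cached keys and values during generation drops by the same 4x grouping factor.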

Mixtral 8x7B

Mistral AI's testing in 2023 showed the model beating both LLaMA 70B and GPT-3.5 on most benchmarks.[42]

In March 2024, a study by Patronus AI compared the performance of LLMs on a 100-question test prompting the models to generate text from books protected under U.S. copyright law. It found that OpenAI's GPT-4, Mixtral, Meta AI's LLaMA-2, and Anthropic's Claude 2 reproduced copyrighted text verbatim in 44%, 22%, 10%, and 8% of responses respectively.[43][44]

Mistral Large 2

According to Mistral AI, Large 2's performance in benchmarks is competitive with Llama 3.1 405B, particularly in programming-related tasks.[45][46]

Codestral 22B

As of its release date, Codestral 22B surpasses Meta's Llama 3 70B and DeepSeek Coder 33B (78.2% - 91.6%), another code-focused model, on the HumanEval fill-in-the-middle (FIM) benchmark.[47]

Mathstral 7B

Mathstral 7B achieved a score of 56.6% on the MATH benchmark and 63.47% on the MMLU benchmark.[25]

Usage

According to Mistral AI,[48] the company's products have been used by:

  1. BNP Paribas
  2. AXA
  3. Laboratoires Pierre Fabre
  4. CMA CGM
  5. Zalando
  6. Mirakl
  7. France Travail
