
BGE v3

Provided by: BAAI
Framework: Unknown

BGE v3 is an open-source multilingual embedding model developed by BAAI, designed for retrieval-augmented generation (RAG), semantic search, and vector database applications. Achieving a 65.3 score on the MTEB leaderboard, BGE v3 supports over 100 languages and handles an 8K context window for long-document embedding. The model delivers performance comparable to OpenAI’s text-embedding models while being eight times smaller and optimized for 4-bit quantization, making it ideal for on-device and scalable vector search systems. BGE v3 helps developers build advanced semantic search engines, chatbot retrieval layers, and knowledge-grounded AI applications efficiently.
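To illustrate how an embedding model like BGE v3 fits into a semantic search or RAG retrieval layer, the sketch below ranks documents against a query by cosine similarity over 1024-dimensional vectors. The `embed()` function here is a deterministic hash-based placeholder, not the real model; in practice you would produce vectors by loading the model weights (for example via an embedding library or API) and the ranking logic would stay the same.

```python
# Minimal semantic-search sketch: ranking documents by cosine similarity
# over embedding vectors, as a retrieval layer would do with BGE v3 output.
# NOTE: embed() is a hypothetical stand-in that generates deterministic
# pseudo-embeddings from a hash; it does NOT reproduce real model quality.
import hashlib
import math

DIM = 1024  # BGE v3 output dimensionality


def embed(text: str) -> list[float]:
    """Placeholder embedding for illustration only (hash-derived values)."""
    vec: list[float] = []
    seed = text.encode("utf-8")
    block = 0
    while len(vec) < DIM:
        digest = hashlib.sha256(seed + block.to_bytes(4, "big")).digest()
        # Map each byte to a float in [-0.5, 0.5]
        vec.extend(b / 255.0 - 0.5 for b in digest)
        block += 1
    return vec[:DIM]


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def search(query: str, docs: list[str], top_k: int = 3) -> list[tuple[str, float]]:
    """Embed the query, score every document, return the top_k matches."""
    query_vec = embed(query)
    scored = [(doc, cosine(query_vec, embed(doc))) for doc in docs]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]
```

In a real RAG pipeline the document embeddings would be computed once and stored in a vector database, so that only the query is embedded at request time; the scoring step above is what the vector store performs internally.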

Model Performance Statistics


Released: March 22, 2025
Last Checked: Aug 19, 2025
Version: 3.0

Capabilities
  • Semantic search
  • RAG optimization
  • Multilingual embeddings
Performance Benchmarks
MTEB: 65.3
Dimensions: 1024
Technical Specifications
Parameter Count
N/A
Training & Dataset

Dataset Used

Multilingual knowledge corpus

Related AI Models

Discover similar AI models that might interest you

Model · open source

Nomic Embed


Nomic AI

Nomic Embed is an open-source text embedding model built with PyTorch and released under the Apache 2.0 license. With support for up to 8192-token context length, it achieves state-of-the-art performance on tasks such as semantic search and retrieval on benchmarks like MTEB and LoCo. The model weights, training data, and code are fully open source, making it well suited to both production and research use.

Embeddings · embeddings · search
Model · open source

E5-Mistral


Microsoft

E5-Mistral is an open-source embeddings model developed by Microsoft, released under the MIT license. Built with PyTorch, it generates high-quality vector representations useful for semantic search, information retrieval, and clustering tasks. E5-Mistral enables efficient and accurate AI applications requiring text similarity and understanding.

Embeddings · embeddings · search
Model · open source

Stable Code 3B


Stability AI

Stable Code 3B is a compact 3-billion-parameter large language model developed by Stability AI for code generation, completion, and reasoning tasks. Trained on over 1.3 trillion tokens from diverse programming and text datasets, it supports more than 18 programming languages. The model offers strong HumanEval performance and efficient inference, making it suitable for IDE integration, education, and developer tooling. Stability AI also provides an instruction-tuned version (Stable Code Instruct 3B) optimized for conversational code assistance.

Code Generation · ai-models · ai-software-developer
BGE v3 – Multilingual Embedding Model by BAAI – Free API Hub