Model · open source

Gemma 3 27B

Provided by: Google DeepMind

Gemma 3 27B is Google DeepMind’s latest open large language model that supports both text and image understanding. Built with a 27-billion-parameter architecture and a 128K-token context window, it delivers advanced reasoning, multilingual translation across 140+ languages, and vision-language capabilities via a SigLIP encoder. The model is optimized for efficient inference and quantization-aware training, enabling deployment on consumer GPUs and cloud platforms. Available through Hugging Face and Amazon Bedrock, Gemma 3 27B empowers developers to build scalable multimodal AI systems for research, enterprise, and creative applications.
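The card notes that the model is available through Hugging Face. As a minimal sketch, here is how a query might look using the `transformers` text-generation pipeline; the model id `google/gemma-3-27b-it` (the instruction-tuned checkpoint) is an assumption, and actually running the 27B model requires substantial GPU memory:

```python
# Hypothetical sketch: querying Gemma 3 27B through the Hugging Face
# transformers text-generation pipeline. The model id below is assumed;
# loading the full 27B checkpoint needs a large GPU.

def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format the pipeline expects."""
    return [{"role": "user", "content": prompt}]

def main() -> None:
    # Heavyweight import and download kept inside main() so the helper
    # above can be used without pulling in model weights.
    from transformers import pipeline

    generator = pipeline("text-generation", model="google/gemma-3-27b-it")
    messages = build_chat("Translate 'good morning' into French.")
    result = generator(messages, max_new_tokens=64)
    print(result[0]["generated_text"])

if __name__ == "__main__":
    main()
```

The chat-message list format mirrors what instruction-tuned checkpoints generally expect; the pipeline applies the model's chat template internally.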

Model Performance Statistics

  • Released: February 14, 2025
  • Last Checked: Aug 19, 2025
  • Version: 3.1

Capabilities
  • Long-context processing
  • Multilingual QA
  • Instruction following
Performance Benchmarks
  • GSM8K: 84.5
  • XTREME: 89.2
Technical Specifications
  • Parameter Count: 27 billion
Training & Dataset

  • Dataset Used: Multilingual web corpus

Related AI Models

Discover similar AI models that might interest you

Model · open source

Mistral Small 3.1

Mistral AI

Mistral Small 3.1 is a compact, high-performance open-weight large language model developed by Mistral AI. Designed for efficiency, it delivers robust reasoning, summarization, and conversational capabilities while running on consumer-grade GPUs. With 24 billion parameters and long-context understanding, it supports instruction following, function calling, and multilingual text generation. Mistral Small 3.1 is optimized for real-world applications such as chatbots, content creation, and lightweight inference in production environments, offering the perfect balance between accuracy and performance for developers and enterprises.

Natural Language Processing · ai-models · Multimodal AI
Model · open source

Orca 2 13B

Microsoft

Orca 2 13B is a large language model developed by Microsoft Research to enhance reasoning and comprehension in smaller models. Built on top of Meta’s LLaMA 2 architecture, it utilizes synthetic training data to simulate advanced reasoning strategies, including step-by-step deduction and self-reflection. Orca 2 demonstrates strong performance in logic, math, and reading comprehension, closing the gap between smaller open models and much larger proprietary systems. It serves as an open research model for studying how efficient LLMs can reason with minimal computational resources.

Natural Language Processing · ai-models · AI research model
Model · open source

Jais 30B

G42 & Cerebras

Jais 30B is an open-source large language model developed by G42 and Cerebras, designed to advance Arabic and bilingual NLP research. Trained on over 116 billion Arabic and English tokens, it scores 83.4% on the Arabic MMLU benchmark and supports cross-lingual reasoning, translation, and text generation. Jais 30B leverages a specialized tokenizer optimized for Arabic script, ensuring accurate morphological understanding and natural context flow. With its bilingual training and cultural adaptation, Jais 30B stands among the most capable Arabic-English models for developers, researchers, and AI startups focusing on regional NLP solutions.

Natural Language Processing · ai-models · llm
Gemma 3 27B – Google DeepMind Multimodal AI Model – Free API Hub