
Stable Code 3B

Provided by: Stability AI
Framework: Unknown

Stable Code 3B is a compact 3-billion-parameter large language model developed by Stability AI for code generation, completion, and reasoning tasks. Trained on over 1.3 trillion tokens from diverse programming and text datasets, it supports more than 18 programming languages. The model offers strong HumanEval performance and efficient inference, making it suitable for IDE integration, education, and developer tooling. Stability AI also provides an instruction-tuned version (Stable Code Instruct 3B) optimized for conversational code assistance.

Model Performance Statistics


Released: February 5, 2025

Last Checked: Aug 19, 2025

Version: 3B / 3B Instruct

Capabilities
  • Code completion
  • Fill-in-middle
  • Multi-language support
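The fill-in-middle (FIM) capability lets the model complete a span between the code before and after the cursor, which is how IDE completion typically uses it. A minimal sketch of assembling such a prompt, assuming StarCoder-style FIM control tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); verify the exact special tokens in the model's tokenizer config before relying on them:

```python
# Sketch: building a fill-in-middle (FIM) prompt for a code model.
# Assumption: StarCoder-style control tokens; check the actual special
# tokens shipped with the Stable Code 3B tokenizer before use.

FIM_PREFIX = "<fim_prefix>"
FIM_SUFFIX = "<fim_suffix>"
FIM_MIDDLE = "<fim_middle>"

def build_fim_prompt(before_cursor: str, after_cursor: str) -> str:
    """Arrange the code surrounding the cursor so the model's
    continuation after <fim_middle> is the missing span."""
    return f"{FIM_PREFIX}{before_cursor}{FIM_SUFFIX}{after_cursor}{FIM_MIDDLE}"

before = "def mean(xs):\n    total = "
after = "\n    return total / len(xs)\n"
prompt = build_fim_prompt(before, after)
```

Whatever the model generates after `<fim_middle>` is then inserted at the cursor position between `before` and `after`.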
Performance Benchmarks
HumanEval: 68.9%
MultiPL-E: 73.2%
Technical Specifications
Parameter Count
3B
Training & Dataset

Dataset Used

The Stack, GitHub public repos

Related AI Models

Discover similar AI models that might interest you

Model · open source

CodeGen2.5 7B

Salesforce

CodeGen2.5 7B is an open-source, 7-billion-parameter large language model created by Salesforce Research for program synthesis, code generation, and infill tasks. It supports multiple programming languages, including Python, Java, and JavaScript, and is trained on over 1.4 trillion code and text tokens. The model introduces improvements in infill sampling, context understanding, and multilingual code generation efficiency. Compared to larger predecessors, CodeGen2.5 7B delivers comparable performance while being optimized for resource-constrained environments.

Tags: Code Generation, ai-models, code
Model · open source

StarCoder2

BigCode

StarCoder2 is a large-scale open-source AI model developed by BigCode for code generation and comprehension tasks. Built with PyTorch and licensed under Apache 2.0, it supports multiple programming languages and is optimized for both code completion and generation. The model is designed to aid developers by automating code writing, improving productivity, and enabling advanced programming assistance.

Tags: Code Generation, code-generation, developer
Model · open source

DeepSeek-Coder

DeepSeek AI

DeepSeek‑Coder is a series of open-source code language models developed by DeepSeek AI using PyTorch, trained from scratch on 2 trillion tokens (87% code, 13% natural language). The series spans model sizes from 1.3B to 33B parameters and offers a 16K context window. It excels at project‑level code completion and infilling, and supports dozens of programming languages. In open-source comparisons it consistently leads benchmarks such as HumanEval, MultiPL‑E, and MBPP.

Tags: Code Generation, code-generation, developer