GPT-Neo
EleutherAI
• Framework: PyTorch

GPT-Neo is an open-source large language model developed by EleutherAI as an alternative to OpenAI's GPT-3. It uses the Transformer architecture to generate coherent, human-like text from a given prompt. GPT-Neo is trained on the Pile, a diverse, large-scale text corpus, which makes it suitable for many NLP tasks such as text generation, summarization, translation, and question answering. GPT-Neo models come in several sizes, the most popular being the 1.3B and 2.7B parameter versions.
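As a minimal sketch of prompt-based generation with GPT-Neo, the model can be loaded through the Hugging Face transformers library. The checkpoint names (`EleutherAI/gpt-neo-125M`, `EleutherAI/gpt-neo-1.3B`, `EleutherAI/gpt-neo-2.7B`) are real published checkpoints; the prompt and generation settings below are illustrative assumptions, not part of this listing.

```python
# Sketch: text generation with a GPT-Neo checkpoint via transformers.
# The 125M checkpoint is used here only because it is the smallest;
# swap in "EleutherAI/gpt-neo-1.3B" or "-2.7B" for the popular sizes.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

prompt = "Open-source language models are"
# Greedy decoding for reproducibility; sampling gives more varied text.
result = generator(prompt, max_new_tokens=20, do_sample=False)
print(result[0]["generated_text"])
```

By default the pipeline returns the prompt followed by the model's continuation in a single `generated_text` string.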

Model Details
- Tasks: Text Generation, Text Completion
- Parameter Count: N/A
- Dataset Used: The Pile
Related AI Models
OpenBioLLM-7B
Saama AI Labs
OpenBioLLM-7B is a specialized open-source large language model designed for biomedical and life sciences applications. Built with PyTorch and released under the Apache 2.0 license, it provides advanced natural language understanding capabilities tailored to bioinformatics, medical research, and clinical data analysis, enabling improved insights and automation in biomedical workflows.