Poro 34B
LUMI Consortium
- Framework: PyTorch

Poro 34B is a large-scale open-source natural language processing model developed by the LUMI Consortium. Built with PyTorch and distributed under the Apache 2.0 license, the model excels at complex language understanding and generation tasks, supporting applications in research, chatbots, and AI-powered assistants.

- Use cases: multilingual translation, government documents
- Parameter count: 34 billion
- Dataset used: European Parliament proceedings, web crawls
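Since Poro 34B is an open model, it can in principle be loaded with the Hugging Face transformers library. The sketch below assumes the model is published on the Hugging Face Hub under the `LumiOpen/Poro-34B` repo id and that enough GPU memory is available for a 34B-parameter checkpoint; the import is deferred so the function can be inspected without the heavy dependency loaded:

```python
def generate(prompt: str, model_name: str = "LumiOpen/Poro-34B",
             max_new_tokens: int = 50) -> str:
    """Generate a completion with a causal LM from the Hugging Face Hub.

    The transformers import happens inside the function so this sketch
    can be read and tested without downloading the 34B weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # device_map="auto" shards the weights across available devices;
    # torch_dtype="auto" uses the checkpoint's native precision.
    model = AutoModelForCausalLM.from_pretrained(
        model_name, device_map="auto", torch_dtype="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

In practice a model of this size is usually served with sharded or quantized loading rather than a single full-precision copy.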
Related AI Models
GPT-Neo
EleutherAI
GPT-Neo is an open-source large language model developed by EleutherAI, designed as an alternative to OpenAI’s GPT-3. It uses the Transformer architecture to generate coherent, human-like text from a given prompt. GPT-Neo is trained on the Pile, a diverse, large-scale text corpus, making it capable of a wide range of NLP tasks such as text generation, summarization, translation, and question answering. GPT-Neo models come in several sizes; the most popular are the 1.3B and 2.7B parameter versions.
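The 1.3B checkpoint is small enough to run on a single consumer GPU or even a CPU. A minimal sketch using the transformers text-generation pipeline, assuming the standard `EleutherAI/gpt-neo-1.3B` Hub repo id (the import is deferred so the sketch can be inspected without downloading the checkpoint):

```python
def neo_generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Sample a continuation from GPT-Neo 1.3B via the transformers pipeline."""
    from transformers import pipeline

    # The pipeline downloads the checkpoint on first use and caches it locally.
    generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
    result = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    # The pipeline returns a list of dicts, one per generated sequence.
    return result[0]["generated_text"]
```

Swapping the repo id for `EleutherAI/gpt-neo-2.7B` gives the larger variant with the same interface.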