Cohere Unveils Multilingual AI Model ‘Tiny Aya’

Cohere has released Tiny Aya, a 3.35-billion-parameter multilingual open-weight model designed to deliver high-quality translation and language understanding while running on local and consumer-grade hardware.

The Tiny Aya family includes TinyAya-Base, which covers more than 70 languages, and TinyAya-Global, an instruction-tuned model supporting 67 languages with balanced performance across regions. Cohere is also introducing three regionally specialized variants – TinyAya-Earth, TinyAya-Fire and TinyAya-Water – optimized, respectively, for Africa and West Asia; South Asia; and Asia-Pacific and Europe.

The company said Tiny Aya performs competitively with other multilingual models of similar scale across translation, reasoning and open-ended generation tasks. Cohere also reports improved performance for languages that are underrepresented online, where many models typically struggle.

Cohere completed post-training on a cluster of 64 Nvidia H100 GPUs, emphasizing efficiency over brute-force scaling. The tokenizer was redesigned to reduce token fragmentation across scripts, lowering memory and compute requirements.
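The fragmentation problem a multilingual tokenizer redesign targets can be illustrated with plain UTF-8: scripts outside basic Latin encode to several bytes per character, so a naive byte-level tokenizer splits them into far more pieces per word, inflating sequence length, memory and compute. A minimal Python sketch of the effect (illustrative only; this is not Cohere's tokenizer):

```python
# Illustrative sketch: why naive byte-level tokenization fragments
# non-Latin scripts more heavily than Latin text. NOT Cohere's tokenizer.

def byte_tokens(text: str) -> list[int]:
    # A trivial byte-level "tokenizer": one token per UTF-8 byte.
    return list(text.encode("utf-8"))

latin = "hello"        # 5 characters, 1 byte each in UTF-8
devanagari = "नमस्ते"    # 6 characters (Hindi "namaste"), 3 bytes each

print(len(byte_tokens(latin)))       # → 5 tokens for 5 characters
print(len(byte_tokens(devanagari)))  # → 18 tokens for 6 characters
```

A script-aware vocabulary collapses such multi-byte runs into single tokens, which is why reduced fragmentation translates directly into lower memory and compute for underrepresented languages.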

Tiny Aya is available as open weights on Hugging Face and Kaggle, alongside a multilingual fine-tuning dataset, benchmarks and a technical report. Cohere said the release aims to support researchers, developers and communities building AI systems in lower-resourced languages.

Read the Cohere blog post.