Cohere Tiny Aya Model Specs: 3B Params & 70 Languages

Have you ever tried to use a state-of-the-art AI model in a language other than English, only to find the performance lackluster or the latency unbearable? It’s a common frustration, especially if you aren’t running enterprise-grade hardware. But on February 17, 2026, Cohere took a significant swing at fixing this disparity with the launch of ‘Tiny Aya.’

This isn’t just another massive model release designed for a server farm. Cohere For AI, the company’s non-profit research arm, has unveiled a family of 3.35-billion-parameter open-weights models. The headline feature? They support over 70 languages. This release marks a strategic pivot for the lab, moving from the depth-focused ‘Aya 23’ released back in May 2024 to a breadth-focused approach that tries to squeeze high-performance multilingual capabilities into a package small enough to run on consumer hardware.

What makes the Tiny Aya family different from previous releases?

If you’ve been following Cohere’s trajectory, you might remember the ‘Aya Expanse’ launch in late 2024, which featured larger 8B and 32B models. Tiny Aya goes in the opposite direction. By shrinking the parameter count to 3.35 billion, Cohere is explicitly targeting the ‘Small Language Model’ (SLM) market.

The release isn’t a monolith; it’s a family of four distinct variants:

  • Tiny Aya Global
  • Tiny Aya Earth
  • Tiny Aya Fire
  • Tiny Aya Water

While the specific technical nuances between ‘Fire’ and ‘Water’ are part of the deeper documentation, the overarching goal is clear: optimization. These models are designed for edge deployment and low-resource environments. They feature an 8k context window, which is substantial for models of this size, allowing them to handle decent chunks of text without needing a massive cloud tether.
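To see why a 3.35-billion-parameter model is plausible on consumer hardware, a back-of-the-envelope weight-memory estimate helps. The parameter count comes from this article; the precision choices and the helper function below are illustrative assumptions, not anything Cohere has published:

```python
def model_memory_gb(params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just for the weights of a dense model,
    ignoring activations, KV cache, and runtime overhead."""
    return params * bytes_per_param / 1024**3

PARAMS = 3.35e9  # Tiny Aya's parameter count, per the article

# fp16/bf16 stores 2 bytes per weight; 4-bit quantization roughly 0.5
fp16_gb = model_memory_gb(PARAMS, 2.0)
int4_gb = model_memory_gb(PARAMS, 0.5)

print(f"fp16: {fp16_gb:.1f} GB, int4: {int4_gb:.1f} GB")
# fp16: 6.2 GB, int4: 1.6 GB
```

At half precision the weights fit comfortably in 8 GB of VRAM, and a 4-bit quantized build fits on a phone-class device, which is the rough arithmetic behind the edge-deployment pitch. For comparison, the same math puts the earlier 32B Aya Expanse model at roughly 60 GB in fp16, well outside consumer territory.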

This is a direct evolution of the Aya project initiated by Sara Hooker, Cohere For AI’s former lead, now carried forward by Marzieh Fadaee and CEO Aidan Gomez. They are essentially trying to prove that you don’t need a trillion parameters to speak Hindi, Arabic, or Yoruba fluently.
