Today we're proud to introduce trio.ai — a family of open-source AI models designed to run locally, integrate easily, and scale from edge devices to high-performance workstations.
Why trio.ai?
The AI landscape is dominated by cloud APIs — powerful, but expensive, a privacy risk, and dependent on internet connectivity. We built trio.ai for the developers and businesses who want capable AI that runs on their own hardware, on their own terms.
The Model Family
trio.ai comes in six sizes to match your hardware and use case:
- trio-nano (1.0 GB) — runs on any laptop, perfect for edge devices
- trio-small (2.5 GB) — balanced everyday performance
- trio-medium (4.7 GB) — strong reasoning for complex tasks
- trio-high (5.3 GB) — high-performance for demanding workflows
- trio-max (7.0 GB) — maximum quality for professional use
- trio-pro (18.6 GB) — flagship, frontier-level performance
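To make the size tradeoff concrete, here is a small sketch that picks the largest model fitting a given download budget. The helper and its name are illustrative, not part of any trio.ai package; the sizes are the published download sizes from the list above.

```python
# Hypothetical helper (not part of trio.ai): choose the largest model
# whose download size fits within a given budget in GB.
MODEL_SIZES_GB = {
    "trio-nano": 1.0,
    "trio-small": 2.5,
    "trio-medium": 4.7,
    "trio-high": 5.3,
    "trio-max": 7.0,
    "trio-pro": 18.6,
}

def pick_model(budget_gb: float):
    """Return the largest model that fits within budget_gb, or None."""
    fitting = [(size, name) for name, size in MODEL_SIZES_GB.items()
               if size <= budget_gb]
    return max(fitting)[1] if fitting else None

print(pick_model(8.0))   # trio-max
print(pick_model(0.5))   # None: even trio-nano needs 1.0 GB
```

A rule of thumb when running GGUF models locally: leave headroom beyond the file size itself, since the runtime also needs memory for the context window.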
Getting Started
Installation is a single pip command:
pip install triobot

All models are available as GGUF files on HuggingFace for direct download and use with llama.cpp, Ollama, or any compatible runtime.
What's Next
We're actively expanding the triohub skills dataset and improving model capabilities with each release. The community on HuggingFace is already contributing skills and evaluations. We'd love to have you involved.