Show HN: Osaurus – Ollama-Compatible Runtime for Apple Foundation Models
Osaurus is an open-source local inference runtime for macOS, written in Swift and optimized for Apple Silicon.
It lets you run Apple Foundation Models locally, fully accelerated by the Neural Engine, while also exposing OpenAI- and Ollama-compatible endpoints, so you can connect your favorite apps, tools, or clients without any code changes.
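Because the server speaks the standard OpenAI-style chat-completions protocol, pointing an existing client at it is mostly a matter of changing the base URL. Below is a minimal sketch in Swift; the port (1337), the model id, and the exact `/v1/chat/completions` path are assumptions based on OpenAI compatibility, so check the Osaurus docs for the real defaults:

```swift
import Foundation

// Minimal sketch: call a locally running Osaurus server through its
// OpenAI-compatible chat-completions endpoint. The port and model id
// below are assumptions; verify them against the Osaurus README.
let url = URL(string: "http://127.0.0.1:1337/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

let body: [String: Any] = [
    "model": "foundation",  // hypothetical model id
    "messages": [
        ["role": "user", "content": "Say hello from Apple Silicon."]
    ]
]
request.httpBody = try! JSONSerialization.data(withJSONObject: body)

let task = URLSession.shared.dataTask(with: request) { data, _, error in
    if let data, let text = String(data: data, encoding: .utf8) {
        print(text)  // raw OpenAI-style JSON response
    } else if let error {
        print("Request failed: \(error)")
    }
}
task.resume()
RunLoop.main.run(until: Date().addingTimeInterval(30))  // keep the CLI alive
```

Any OpenAI or Ollama client that lets you override the base URL should work the same way, without code changes.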
Key points:
* Supports Apple Foundation Models natively
* Compatible with OpenAI & Ollama APIs
* ~7 MB binary, runs locally (no cloud, no telemetry)
* MIT Licensed, open source
Project: https://osaurus.ai
Source: https://github.com/dinoki-ai/osaurus
We're exploring what a local-first AI ecosystem could look like, where inference, privacy, and creativity all happen on your own hardware. Feedback and testing welcome!
Very cool! Since it's possible to train foundation model adapters, is a library for user fine-tunes possible?