
Hello PH, I'm Pratham, and we're launching the fastest LLM gateway out there!
Hi folks, I'm Pratham: an engineer, former founder, and full-time infra nerd.
I've been building since college, when I founded a startup called Interact that scaled to more than 7,000 active users in just three months. Since then, I've stayed obsessed with backend systems that are fast, stable, and scale effortlessly. Lately, I've been working on Bifrost, a new open-source LLM gateway written in Go.
If you're building LLM apps and running into bottlenecks with tools like LiteLLM, Bifrost might be worth a look. It’s fully self-hosted, adds only ~11µs mean overhead at 5K RPS, and supports providers like OpenAI, Anthropic, Mistral, Groq, Bedrock, and more. It also comes with built-in monitoring, real-time configuration, governance, MCP support, and a clean web UI to manage it all.
We designed it to act like true infrastructure: minimal overhead, built for production, and easy to integrate.
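To give a feel for what integration looks like, here's a minimal sketch in Go. It assumes a locally running Bifrost instance exposing an OpenAI-compatible chat-completions endpoint on port 8080; the endpoint path and the provider-prefixed model name are assumptions for illustration, so check the repo docs for the actual API.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Hypothetical request body in an OpenAI-compatible format;
	// the "provider/model" naming scheme is an assumption.
	body, err := json.Marshal(map[string]any{
		"model": "openai/gpt-4o-mini",
		"messages": []map[string]string{
			{"role": "user", "content": "Hello from Bifrost!"},
		},
	})
	if err != nil {
		panic(err)
	}

	// Assumes a Bifrost gateway running locally on port 8080.
	resp, err := http.Post(
		"http://localhost:8080/v1/chat/completions",
		"application/json",
		bytes.NewReader(body),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response from the gateway.
	var out map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

Because the gateway sits behind a single endpoint, swapping providers should only mean changing the model string, not your application code.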
If you're into infra performance or just enjoy digging into fast Go code, feel free to check it out:
GitHub: https://github.com/maximhq/bifrost
Website: https://www.getmaxim.ai/bifrost
Launching soon on Product Hunt: https://www.producthunt.com/products/maxim-ai