Kolank.com: Maximize AI, Minimize Costs, Path to Smarter AI
Shir Danishyar
Kolank — Any LLM (open and closed source), all in one unified API
Access 150+ open-source and closed-source LLMs through one unified API in the OpenAI format. No more juggling multiple accounts, API keys, or libraries: one API to call them all! Simplify your codebase without compromising on functionality.
Replies
Shir Danishyar
Hey Product Hunters! Access 150+ open-source and closed-source LLMs through one unified API in the OpenAI format. No more juggling multiple accounts, API keys, or libraries.

- Unified API access: one API to call them all. Simplify your codebase without compromising on functionality.
- Model flexibility: easily switch between AI models based on your use case, such as coding, creative writing, summarization, or role-playing.
- Cost efficiency: get the best value for your money by selecting models based on their token cost and performance metrics.

Key features:
- Unified API: connect to any LLM through a single endpoint.
- Model fallbacks: define fallback routes so that service continues uninterrupted even if the primary model fails.
- Transparent pricing: detailed cost breakdowns per request let you manage and predict expenditures easily.

Please head over to https://kolank.ai/ and get started! Your AI solutions will never be the same. Don't forget to upvote and share your feedback. Let's build the future of AI together!
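Since the API follows the OpenAI format, an existing OpenAI client can typically be pointed at Kolank just by swapping the base URL. Here is a minimal sketch; the base URL, API key placeholder, and model identifier are illustrative assumptions rather than values confirmed by Kolank's docs.

```python
from openai import OpenAI

# Assumed base URL and placeholder key; check Kolank's docs for the real endpoint.
client = OpenAI(
    base_url="https://kolank.com/api/v1",
    api_key="YOUR_KOLANK_API_KEY",
)

# Hypothetical model identifier; see the supported-model list for actual names.
response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Summarize the benefits of a unified LLM API."}],
)

print(response.choices[0].message.content)
```

Because only the base URL and model string change, switching between providers (or defining a fallback model) does not require touching the rest of the calling code.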
Greg Priday
This is a very clever idea. Well done. How does the model routing work? I see that in the API you can specify a single model. Is there a different way to specify a list of models? I assume you have another model that does the query routing. Is there any cost for this?
Shir Danishyar
@gregpriday We used to have model routing, but most of our users prefer to choose the model themselves, so we removed it from the list of supported models. It is still mentioned on our landing page, and we will remove that reference. What do you think? Is that something you want? We also plan to create another endpoint where you can access the list of models and send as many messages as you want to different models in parallel. Is that what you are looking for?
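For readers wondering what the multi-model, parallel usage being discussed might look like, here is a hedged client-side sketch that fans one prompt out to several models concurrently through the same assumed OpenAI-format endpoint. The base URL and model names are assumptions for illustration; the dedicated parallel endpoint mentioned above is described as planned, not shipped.

```python
import asyncio
from openai import AsyncOpenAI

# Assumed base URL and placeholder key, as in the earlier sketch.
client = AsyncOpenAI(
    base_url="https://kolank.com/api/v1",
    api_key="YOUR_KOLANK_API_KEY",
)

async def ask(model: str, prompt: str) -> str:
    # Send the same prompt to one model and return a labeled answer.
    resp = await client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return f"{model}: {resp.choices[0].message.content}"

async def main() -> None:
    prompt = "Explain model fallbacks in one sentence."
    # Hypothetical model identifiers; check the supported-model list for real names.
    models = ["openai/gpt-4o-mini", "meta-llama/llama-3-8b-instruct"]
    # Query all models concurrently and print each reply.
    answers = await asyncio.gather(*(ask(m, prompt) for m in models))
    for answer in answers:
        print(answer)

asyncio.run(main())
```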
Greg Priday
@shir_jan It seems incredibly useful, but I imagine it would be expensive to run. I suppose you'd use a lightweight, TinyBERT-level model to perform the routing. The tricky part would be deciding what qualifies a model as good enough to solve a specific query. I assume you've already solved these problems if you had a version of this running. I'd be keen to try it out.
Shir Danishyar
@gregpriday Thanks for your interest, Greg. That feature is no longer in production. However, if you need something faster or cheaper, smaller models are also available. Please check our supported model list here: https://kolank.com/docs#supporte....