👨🏽💻 Backmesh was born from scratching my own itch. I was building multiple apps on top of LLMs, each needing private API keys that could not be exposed in the client, and I wanted to securely call Anthropic, Gemini, or OpenAI without spinning up a backend every time.
🤖 Backmesh lets your web or mobile app (e.g. JavaScript, native mobile, Flutter, React, React Native, etc.) call private-key LLM APIs without a server or cloud function backend.
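A minimal sketch of what this looks like from a JavaScript/TypeScript client, assuming Backmesh exposes an OpenAI-compatible proxy URL and accepts your auth provider's JWT in place of the real API key; the base URL below is an illustrative placeholder, not a confirmed endpoint:

```ts
import OpenAI from "openai";

// userJwt comes from your auth provider (e.g. Firebase or Supabase), not from Backmesh.
async function askFromTheBrowser(userJwt: string, prompt: string): Promise<string | null> {
  const client = new OpenAI({
    // Illustrative proxy URL; the real one comes from your Backmesh project settings.
    baseURL: "https://edge.backmesh.com/v1/<your-project-id>/openai",
    // The user's JWT stands in for the OpenAI key, which never ships to the client.
    apiKey: userJwt,
    dangerouslyAllowBrowser: true, // acceptable here: no real secret is present client-side
  });

  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0].message.content;
}
```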
⚡️ Backmesh is hosted on Cloudflare’s edge, providing lower response times compared to traditional servers and cloud functions.
🔒 Backmesh secures LLM API access by validating your authentication provider's JWTs, so only authorized users can make API calls. It adds further protections such as per-user rate limits (e.g., no more than 5 OpenAI API calls per hour per user). For more details, see the security documentation at https://backmesh.com/docs/security.
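When a user exceeds their quota, a proxy like this would typically respond with HTTP 429; that status code and the Retry-After header are assumptions about the behavior, not documented guarantees. A sketch of handling it on the client:

```ts
// Retry a request a couple of times when the per-user rate limit is hit.
// Assumes the proxy answers HTTP 429, optionally with a Retry-After header.
async function callWithRateLimitHandling(
  makeRequest: () => Promise<Response>,
  maxRetries = 2
): Promise<Response> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await makeRequest();
    if (res.status !== 429) return res;
    if (attempt === maxRetries) break;

    // Back off for the server-suggested interval, or 30 seconds by default.
    const retryAfter = Number(res.headers.get("Retry-After") ?? "30");
    await new Promise((resolve) => setTimeout(resolve, retryAfter * 1000));
  }
  throw new Error("Per-user rate limit still exceeded after retries");
}
```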
📈 Backmesh also offers LLM User Analytics, currently in early beta. Every LLM API call is instrumented so you can identify usage patterns, reduce costs, and improve user satisfaction in your AI applications.