Cameron Archer

Open Source LLM Performance Tracker - An open source Next.js app template to monitor your AI apps

The LLM Performance Tracker is an open source Next.js + Tinybird app template built by the engineers at Tinybird to capture LLM call traces and analyze them in real time.

Cameron Archer
Hey hunters! We're excited to share this open source LLM Performance Tracker with the Product Hunt community! If you're building AI apps or features, you can use this to monitor your LLM calls for cost, usage, and performance issues.

The project includes a Tinybird backend to store and process LLM call events, and a Next.js frontend to visualize critical LLM metrics such as cost by model, time to first token (TTFT), and total requests/tokens used. All of the visualizations can be filtered and drilled down by a number of dimensions.

We designed this to be used as-is: you can deploy it to Vercel/Tinybird for free and instrument your app to start sending LLM call events (you can find examples for LiteLLM and the Vercel AI SDK in the repo). The app also includes a template for adding multi-tenancy via Clerk, which is useful if you'd like to modify the components and integrate them into your own app to give your users visibility into their LLM usage.

We'd love to hear your feedback and how you might use or extend this. Of course, it's open source, and PRs are welcome! If you have any questions, the Tinybird Slack community has answers :)
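For a sense of what "instrumenting your app" looks like, here is a minimal TypeScript sketch of sending a single LLM call event to Tinybird's Events API. The datasource name (`llm_events`), the event fields, and the default `api.tinybird.co` host are assumptions for illustration only; the template repo defines the actual schema and ships ready-made examples for LiteLLM and the Vercel AI SDK.

```typescript
// Minimal sketch of sending one LLM call event to Tinybird's Events API.
// Assumptions (not from the template): a datasource named "llm_events",
// this event shape, and the default api.tinybird.co host (the host varies
// by workspace region). The real schema lives in the template repo.

type LLMEvent = {
  timestamp: string;         // ISO 8601 time of the LLM call
  model: string;             // e.g. "gpt-4o-mini"
  prompt_tokens: number;
  completion_tokens: number;
  duration_ms: number;       // end-to-end latency of the call
  cost_usd: number;          // estimated cost of the call
};

export async function trackLLMCall(event: LLMEvent): Promise<void> {
  const res = await fetch("https://api.tinybird.co/v0/events?name=llm_events", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.TINYBIRD_TOKEN}`,
      "Content-Type": "application/json",
    },
    // The Events API ingests newline-delimited JSON; one JSON object per event is fine.
    body: JSON.stringify(event),
  });
  if (!res.ok) {
    throw new Error(`Tinybird ingest failed: ${res.status} ${await res.text()}`);
  }
}
```

You would typically call `trackLLMCall` right after each model response, from a server-side route or middleware so the Tinybird token never reaches the browser.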
Jeremy Sarchet

Outstanding @alrocar!