Control your dev workflow with your voice. Skip the alt-tabbing, clicking, and typing, and end the mentally expensive context switching that eats up almost half your day. Just speak to deploy, document, or debug. Powered by MCP, Saga keeps you in flow.
Hey Product Hunt Community! 👋
Sharon here, PM at Deepgram. We’ve been building Saga for the past few months to solve a pain that’s personal for most of us in PM and Engineering: too many tools, too many tabs, too much friction. Most AI copilots still rely on you to translate your intent into precise prompts, and other voice assistants aren’t well integrated into the typical dev stack. So we thought: what if you could just speak your commands and have them run your workflows?
That’s Saga. It listens, understands, and actually executes.
We built it for developers who already use Cursor, Windsurf, Jira, Slack, Linear (and more), and we’d love your feedback as we expand across modalities, operating systems, and the breadth of tasks we support.
🙌 We’d love to hear from you on:
What’s one thing you wish you could do just by speaking?
What MCP action is your favorite?
Thanks for checking out Saga! We’ll be around all day in the comments and our Discord channel: https://discord.com/invite/deepgram.
—Sharon (and the Deepgram team)
In your video you say that the AI can take vague instructions and turn them into precise instructions. Models like o1 and o3 do that sort of fine, but the question holds: can it be done in a truly useful way? I’d really appreciate some use cases and other examples to see how it works in your cool app.
@chan_bartz Great question! Yes, models like o1 and o3 can kind of handle vague input, but they’re inconsistent without the right prompt structure. What Saga does is convert your fuzzy, natural speech into a clean, structured instruction that actually works when passed into tools like Cursor, Replit, or Windsurf. It acts like a pre-processor that speaks “LLM,” so you don’t have to.
Example 1:
You say: “Make a helper to format a date string”
Saga rewrites it into:
Create a helper function to format a date string. The function should:
1. Accept a date string as input.
2. Convert the date string into a more readable format (e.g., "YYYY-MM-DD" to "Month Day, Year").
3. Handle invalid date inputs gracefully by returning an error message or null.
Saga then pipes that into Cursor and gets back usable code on the first try.
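To make that concrete, here’s a rough sketch of the kind of helper that structured prompt tends to produce. The function name and exact shape are just our illustration, not Saga’s or Cursor’s literal output:

// Illustrative sketch only: one plausible result of the structured prompt above.
// The function name and export style are hypothetical, not a guaranteed output.
export function formatDateString(input: string): string | null {
  const date = new Date(input);
  // Handle invalid date inputs gracefully by returning null.
  if (Number.isNaN(date.getTime())) {
    return null;
  }
  // "2024-03-05" -> "March 5, 2024" (UTC so ISO dates don't shift across timezones)
  return date.toLocaleDateString("en-US", {
    timeZone: "UTC",
    year: "numeric",
    month: "long",
    day: "numeric",
  });
}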
Example 2:
You say: “Add error handling to this function”
Saga rewrites it into something like:
Add error handling to the specified function. Ensure that you include:
1. Try-catch blocks to manage exceptions.
2. Clear error messages for debugging purposes.
3. Return appropriate responses or throw errors based on the situation.
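And a rough sketch of what that prompt typically yields. Here fetchUserProfile, the UserProfile type, and the URL are made-up placeholders for illustration, not real output:

interface UserProfile {
  id: string;
  name: string;
}

// Illustrative sketch only: fetchUserProfile and the API URL are hypothetical.
async function fetchUserProfile(userId: string): Promise<UserProfile | null> {
  try {
    const response = await fetch(`https://api.example.com/users/${userId}`);
    if (!response.ok) {
      // Clear error message for debugging purposes.
      throw new Error(`Failed to fetch user ${userId}: HTTP ${response.status}`);
    }
    return (await response.json()) as UserProfile;
  } catch (error) {
    // Try-catch manages exceptions; return an appropriate fallback instead of crashing the caller.
    console.error("fetchUserProfile failed:", error);
    return null;
  }
}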
We’re seeing devs use it to avoid prompt tinkering and get more consistent results from AI coding tools.
Would love to see how it works for you when you try it out!
Tried it out and this is awesome 🤩
This looks interesting! So excited to try it out.
Really excited to try Saga! It promises to fill the huge interface gap left by web-based and terminal-driven tools.
Deepgram sounds like a fantastic platform for developers looking to integrate advanced voice AI features into their products!
Congrats!
Love the voice-controlled workflow—such a smart solution for devs!
Quick ideas to make it even better:
Support more languages for global teams
Build plugins for popular IDEs
Suggest smart voice commands
Excited to see where this goes! 🎙️💻
Hmm, looks promising