Hi Product Hunt
This whole thing started because I was trying to find an easy way for designers to prototype new interactions using machine learning models. I was messing around with MQTT and WebSockets (which are always a pain to set up) when I remembered something from my student days - we used to do "Wizard of Oz" demos where someone would secretly control the prototype with keyboard inputs during presentations.
That made me think - why not just map ML model outputs directly to keyboard shortcuts? No complex setup needed. Just upload your CoreML model, map the classifications to keys, and you're good to go.
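For anyone curious what that mapping looks like in practice, here's a rough sketch of the idea on macOS - take whatever label the model spits out and post a synthetic key press. The label names and key codes below are just placeholders I made up for the example, not the app's actual config:

```swift
import Cocoa

// Hypothetical mapping from classification labels to macOS virtual key codes
// (0x7C = right arrow, 0x7B = left arrow on ANSI keyboards).
let keyForLabel: [String: CGKeyCode] = [
    "swipe_right": 0x7C,
    "swipe_left":  0x7B
]

/// Post a key-down/key-up pair for whatever label the model just produced.
func press(label: String) {
    guard let keyCode = keyForLabel[label] else { return }
    let source = CGEventSource(stateID: .hidSystemState)
    CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: true)?
        .post(tap: .cghidEventTap)
    CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: false)?
        .post(tap: .cghidEventTap)
}

// e.g. call press(label:) with the top classification once your CoreML request completes
```

That's basically the whole trick - everything downstream (Keynote, shortcuts, whatever) just sees a keyboard.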
I honestly didn't anticipate where this would lead. I've gone down quite the rabbit hole, and now I'm controlling pretty much everything on my Mac - from flipping through presentations to logging out by waving. It's kind of addictive finding new things to automate.
I'm putting this out there for free because I really want to see what other people come up with. ML doesn't have to be complicated - sometimes the simplest solutions open up the most interesting possibilities.
What new interactions would you prototype with this?