Perfect your plank form, challenge friends, or join a team. Our AI motion coaches give you real-time interactive feedback and track your time every step of the way. This is the beta of the first of many apps powered by the Exer platform and coaches worldwide.
Howzit,
I’m Zaw, one of the co-founders of Exer. This company and app have been a labor of love since early 2018, when we first started digging into the connected fitness/health space. We were inspired by Peloton and the other connected fitness devices that were (and still are) taking the market by storm. Back in 2018 (pre-COVID), we found two main problems that most consumers were having with their routines: 1) "I don’t know if I’m doing it right," and 2) working out without a live class or a leaderboard isn’t as fun or motivating as it could be.
So, starting a year ago, we set off to build a platform focused on AI on the edge (mobile) to democratize fitness and give anyone with a smartphone the ability to solve those two problems without having to buy expensive hardware. We also wanted to make sure there was a free option for anyone who wanted to use our apps. Our belief is that you can augment the experience you get with a coach, personal trainer, or physical therapist → but you can’t replace them.
Enter 2020 and COVID-19: now we don’t know how long people will be forced to stay home, or how long they won’t feel comfortable going into a gym, studio, or clinic. What we’re launching on Product Hunt today is the first of many apps and solutions we’ve built for people who want the feeling of working out with a coach but can’t afford one, can’t make the time to see one, or are stuck at home. I’ll let Sean (co-founder and CTO) and Clint (co-founder and Head Coach) talk more about the how and why, but I’m both super excited and nervous for us to get this one out there. It tests not only how consumers will respond to AI coaches using just their phone, but also how coaches can use digital technologies like Exer to work with their clients even in a fully remote environment. Remember, we’re not trying to replace coaches; we want to give them even more ways to stay connected to the people they train.
Please try out our first app and let us know what you think! It’s only available on iOS for now, but we’ll have an Android version out soon.
App Store: https://apps.apple.com/us/app/pe...
If you have any questions let us know, and we’d love to hear your feedback 🙂
Cheers,
Zaw
CEO and Co-Founder, Exer
@zawthet everywhere
@dnoparavandis we're available in the US, Canada, UK, Australia, New Zealand, and Singapore for now. More to come. Where do you want to see it launched?
Howdy, Product Hunters.
I’m Sean, co-founder and CTO at Exer. Here’s a quick rundown of some of the tech we’ve built on the Exer platform that you’ll see in our first app, Perfect Plank:
We use a set of custom-trained convolutional neural networks (CNNs) based on Google's PersonLab (https://arxiv.org/pdf/1803.08225...) architecture with MobileNet backbones. These models are trained in PyTorch on open-source and private datasets and then translated to CoreML variants. Depending on the exercise, we post-process the models' outputs using on-device classifiers (e.g., KNNs, SVMs) as well as lots of good old-fashioned linear algebra. For math, we use Accelerate/vDSP, as well as an ever-growing collection of college textbooks :)
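To make the "good old-fashioned linear algebra" part concrete, here's a minimal sketch of the kind of keypoint post-processing you could do on-device with Accelerate/vDSP and simd. The keypoint struct, joint names, and the 160° threshold are invented for illustration; this isn't Exer's actual assessment logic.

```swift
import Accelerate
import simd
import Foundation

// Hypothetical 2D keypoint as it might come out of a pose model.
struct Keypoint {
    let position: simd_float2
    let confidence: Float
}

/// Estimate how "flat" a plank is by measuring the angle at the hip,
/// i.e. the angle between the shoulder→hip and hip→ankle vectors.
func hipAngleDegrees(shoulder: Keypoint, hip: Keypoint, ankle: Keypoint) -> Float {
    let v1 = shoulder.position - hip.position
    let v2 = ankle.position - hip.position

    // Dot product via vDSP over the raw components (equivalent to simd_dot here,
    // just to show the Accelerate route).
    let dot = vDSP.dot([v1.x, v1.y], [v2.x, v2.y])
    let cosTheta = dot / (simd_length(v1) * simd_length(v2))

    // Clamp for numerical safety before acos.
    let clamped = max(-1.0, min(1.0, Double(cosTheta)))
    return Float(acos(clamped) * 180 / Double.pi)
}

// Example check: a straight plank keeps the hip angle near 180°.
// The 160° cutoff is a made-up number for the sake of the example.
func isHipSagging(shoulder: Keypoint, hip: Keypoint, ankle: Keypoint) -> Bool {
    hipAngleDegrees(shoulder: shoulder, hip: hip, ankle: ankle) < 160
}
```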
It’s important to note that we do all processing on-device, without any of your pose data being sent to the cloud. In addition to protecting user privacy, this means our core processing and exercise-assessment logic works even while your phone is in airplane mode. Another interesting tidbit: we’re not using the iPhone’s depth sensor, only 2D image data from the camera.
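For anyone curious what "on-device, no cloud" looks like in practice on iOS, here's a rough sketch of running a Core ML pose model on a single camera frame via the Vision framework. The class name and output handling are placeholders; Exer's actual models and pipeline will differ.

```swift
import Vision
import CoreML

// Minimal sketch of on-device pose inference with Vision + Core ML.
final class OnDevicePoseEstimator {
    private let request: VNCoreMLRequest

    init(model: MLModel) throws {
        let vnModel = try VNCoreMLModel(for: model)
        request = VNCoreMLRequest(model: vnModel)
        request.imageCropAndScaleOption = .scaleFill
    }

    /// Runs the model on a single camera frame. Everything happens locally;
    /// no pixels or keypoints ever leave the device.
    func estimate(on pixelBuffer: CVPixelBuffer) throws -> [VNCoreMLFeatureValueObservation] {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try handler.perform([request])
        return request.results as? [VNCoreMLFeatureValueObservation] ?? []
    }
}
```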
Perfect Plank also has a version of our audio engine that replicates what it’s like to train with your favorite coach. In addition to interactions like corrections ("Your feet are too far apart") or encouragements ("Come on, you're almost there!"), we've also implemented the nuances that separate trainers from one another. For instance, some coaches are more immediate and forceful in their feedback, while others are a bit more relaxed. In the end, we've arrived at a complex audio engine that is architected just like those you hear in your favorite video games. It's able to make decisions based on hundreds of pieces of current and historical state, and it's even smart enough to learn from your mistakes and optimize itself (ML FTW). We've made it easy for any coach to come in, add their voice, and essentially "program" their feedback for an exercise based on how they approach their training sessions.
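To give a flavor of the idea (not the real engine), here's a tiny sketch of how a coach-style audio layer might choose its next cue from current form state and pacing, with a per-coach "patience" knob. The cue names, thresholds, and file lookup are all invented for the example.

```swift
import AVFoundation

// Illustrative only: cue selection for a coach-style audio layer.
enum CoachCue: String {
    case feetTooWide = "feet_too_wide"
    case hipsSagging = "hips_sagging"
    case encouragement = "almost_there"
}

struct CoachProfile {
    /// Minimum seconds between spoken cues; a forceful coach uses a small value,
    /// a more relaxed coach a larger one.
    let minSecondsBetweenCues: TimeInterval
}

final class AudioCoach {
    private let profile: CoachProfile
    private var lastCueTime: Date = .distantPast
    private var player: AVAudioPlayer?

    init(profile: CoachProfile) {
        self.profile = profile
    }

    /// Decide whether to speak, preferring form corrections over encouragement.
    func update(feetTooWide: Bool, hipsSagging: Bool, secondsRemaining: Int) {
        guard Date().timeIntervalSince(lastCueTime) >= profile.minSecondsBetweenCues else { return }

        let cue: CoachCue?
        if hipsSagging {
            cue = .hipsSagging
        } else if feetTooWide {
            cue = .feetTooWide
        } else if secondsRemaining <= 10 {
            cue = .encouragement
        } else {
            cue = nil
        }

        // Play the pre-recorded clip for this coach's voice (hypothetical asset names).
        if let cue = cue, let url = Bundle.main.url(forResource: cue.rawValue, withExtension: "m4a") {
            player = try? AVAudioPlayer(contentsOf: url)
            player?.play()
            lastCueTime = Date()
        }
    }
}
```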
Thanks for reading and please feel free to reach out with any questions.
Sean
CTO and Co-founder, Exer Labs
@theseancook
@theseancook Hi Sean, awesome product! I've been kicking around similar ideas myself, given the developments in image/video-processing convolutional neural nets over the last few years. I think something similar for running technique would be awesome. I also prefer PyTorch, but thought it was difficult to get those models running directly on iOS devices. Is there any reason you went with PyTorch over TF?