
Let's discuss your stance on AI companions
My co-founder and I are building MVP.ai: emotionally intelligent AI companions that people can learn from (anything from practical skills to studying), use to regulate and process their emotions (thus increasing emotional IQ), talk with, and overall grow alongside in an immersive bonding experience.
But wait... building AI companions, that seems unethical? Dangerous, right?
Well, yeah, the idea can come off exactly that way, and I wouldn't blame you for thinking so. After all, consider my generation's disconnected, isolated reality: the rise of social media, everything ruled by algorithms and toxic loops.
And the solutions some companies (Replika, Nomi) have come up with? Hyperdependency, AI boyfriend/girlfriend relationships, flirting. These don't really solve the issue of isolation so much as put a temporary bandage on it.
Do you believe there's any way AI companions could be shifted toward good, or designed to be genuinely healthy for the people who use them?
Be real! Again, I'd love to spark this discussion and get honest opinions.
Replies
I believe that sometimes, we give too much credit to AI.
For psychological matters, AI can certainly be of great help, as people might struggle to talk about personal issues with others, especially professionals. So it could be a good starting point when looking for psychological help. However, I don't think it will ever replace real therapists: the human mind is very complex, emotions play a crucial role, and AI doesn't seem capable of understanding them deeply.
As for creating bonding experiences, it could certainly help those who struggle with real-life interactions and offer valuable insights when learning new things. As long as AI doesn't completely replace human interaction, I think it's cool.
@jordansvision I like the idea of using disclaimers!
I heard of someone working on a similar project, mostly focused on psychological help, and they received very harsh criticism from others, so I think clarifying your position and the real goal of your product is very important when emotions and psychology are involved.
As for AI getting better and better, I'm curious to see what will happen in the next few months or years!
@pamela_arienti I love this insight. I'll admit our product is meant to be multifaceted, with emotions and psychology as just one aspect, but I will definitely work my hardest to mitigate that sort of response to our product.
And same, we're all pretty much banking on it growing exponentially. The problem does lie in how companies choose to utilize that growth, whether it's for the betterment of the world or to its detriment, but that's a conversation for another day as well.
I so want this for myself, much like the Primer in The Diamond Age. I want it to know where I left off, to hold all my learnings, where I often struggle, what analogies can be drawn from other times I have struggled; I want it to identify my gaps. I even imagine the chess tutorial in Duolingo could be a million times smarter. Bring it on, Jordan.
@howell4change My team and I are working hard on its capabilities. One day, we hope to reach that level of intelligence and memory!