
Meta is helping build a future where people have more ways to play and connect in the metaverse. Welcome to the next chapter of social connection.
Launched on May 8th, 2025
Meta provides affordable VR headsets. I use my Meta Quest 3 with Immersed VR to do my screen time in a virtual environment. Recent controversy around Zuck, however, including his misogyny and his friendships with fascist tech bros and political leaders, has made it less exciting to be associated with Meta products. =\
I'm interested to see how this will work out. I don't have Instagram and I don't really want to 'get it' either, but I do want to see what this is like and whether it's social like Facebook or more open like Twitter. I'm also not sure if people will really use it all that much. Either way, if I were Elon I'd be kind of worried, especially as this has the added bonus of being propped up by Instagram rather than being largely separate from FB / IG.
Hi everyone!
V-JEPA 2 is Meta's new world model, a serious take on building AI that understands the physical world with the kind of intuition humans have. It's a foundational step towards what they call Advanced Machine Intelligence (AMI).
It learns from over a million hours of video, not just static images, to build a sense of how things move, interact, and follow basic physics. This allows it to understand and predict what might happen next in a scene.
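For intuition, the core JEPA idea is to predict in representation space rather than pixel space: an encoder maps observations to embeddings, and a predictor learns to map the embedding of one moment to the embedding of the next. Here is a minimal toy sketch of that idea, where the "world", the "encoder", and all names are invented stand-ins, not the real V-JEPA 2 code or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "world": states evolve by fixed linear dynamics (a stand-in for physics).
A_true = np.array([[0.9, 0.1], [-0.1, 0.9]])

def step(state):
    return state @ A_true.T

# Toy "encoder": a fixed random projection into a 4-d embedding space.
W_enc = rng.normal(size=(2, 4))

def encode(state):
    return state @ W_enc

# Fit a linear predictor mapping encode(s_t) -> encode(s_{t+1}).
# Prediction happens entirely in embedding space: the JEPA idea in miniature.
states = rng.normal(size=(500, 2))
Z_t = encode(states)
Z_next = encode(step(states))
P, *_ = np.linalg.lstsq(Z_t, Z_next, rcond=None)

# Evaluate on held-out states: predicted vs. true next embedding.
test = rng.normal(size=(100, 2))
err = np.abs(encode(test) @ P - encode(step(test))).max()
print(f"max embedding prediction error: {err:.2e}")
```

Because both the toy dynamics and the toy encoder are linear, the predictor recovers the transition almost exactly; the real model, of course, learns nonlinear encoders and predictors from video rather than from synthetic states.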
And this isn't just theory. It's being used for zero-shot robot planning, letting a robot pick up and move objects it has never seen before. That’s a very impressive demonstration of where this technology is headed.
On top of the model and code, Meta has also released three new benchmarks for physical reasoning, which is a great contribution to help push the entire research community forward.
Huge release from Meta! V-JEPA 2’s ability to learn from video and perform zero-shot planning is a major step toward more intuitive, grounded AI. Exciting to see the model, code, and benchmarks all open — can’t wait to see how the community builds on this.
V-JEPA 2 feels like a major leap toward AI systems that truly understand real-world dynamics; it's especially exciting to see it applied in zero-shot robot planning. How well does it generalise to edge cases or unpredictable environments where the physical rules might not be so clear?