This release contains the base model weights and network architecture of Grok-1, xAI's large language model. Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI.
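To make the "Mixture-of-Experts" part concrete, here is a minimal, illustrative JAX sketch of top-2 expert routing, the general technique MoE models of this kind use (Grok-1's published details mention 8 experts with 2 active per token). All dimensions, weight names, and the dense run-all-experts strategy below are assumptions for a toy example, not the actual Grok-1 implementation.

```python
# Toy top-2 Mixture-of-Experts layer in JAX: a router scores each token
# against every expert, the top 2 experts process it, and their outputs
# are mixed by the normalized router scores.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # 8 experts, as reported for Grok-1
TOP_K = 2         # 2 active experts per token
D_MODEL = 64      # toy hidden size (assumption for the sketch)
D_FF = 256        # toy feed-forward size (assumption for the sketch)

def init_params(key):
    k_gate, k_in, k_out = jax.random.split(key, 3)
    return {
        # Router that scores each token against every expert.
        "w_gate": jax.random.normal(k_gate, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # One feed-forward network per expert, stacked on a leading axis.
        "w_in":  jax.random.normal(k_in,  (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w_out": jax.random.normal(k_out, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    """x: (tokens, D_MODEL) -> (tokens, D_MODEL) via top-2 routing."""
    logits = x @ params["w_gate"]                     # (tokens, NUM_EXPERTS)
    top_vals, top_idx = jax.lax.top_k(logits, TOP_K)  # pick 2 experts/token
    weights = jax.nn.softmax(top_vals, axis=-1)       # normalize their scores

    # Simple but wasteful: run every expert on every token, then keep only
    # the top-2 outputs. Real MoE kernels dispatch tokens to experts instead.
    def expert_fn(w_in, w_out):
        return jax.nn.gelu(x @ w_in) @ w_out          # (tokens, D_MODEL)

    all_out = jax.vmap(expert_fn)(params["w_in"], params["w_out"])  # (E, T, D)
    picked = jnp.take_along_axis(
        jnp.moveaxis(all_out, 0, 1),                  # (T, E, D)
        top_idx[:, :, None], axis=1)                  # (T, K, D)
    return jnp.sum(weights[:, :, None] * picked, axis=1)

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.normal(key, (4, D_MODEL))
print(moe_layer(params, tokens).shape)  # (4, 64)
```

The payoff of this design is that only 2 of the 8 expert networks contribute to each token, so a 314B-parameter model activates far fewer parameters per forward pass than a dense model of the same size.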
Great hunt! I like that they open-sourced it, though they haven't shared the reasoning behind the decision. Beyond that, it's awesome to see a model of this size released to the public. Interested to see what people will do with it!
The open-source release of Grok-1, xAI's LLM, is a boon for anyone interested in advanced prediction and decision-making models. Despite its hardware demands (multiple H200-class GPUs, for example), the release unlocks experimentation and the creation of new applications.
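For a sense of why multiple GPUs are needed, here is a rough back-of-the-envelope estimate of the memory required just to hold the weights (activations, KV cache, and runtime overhead are extra). The 141 GB figure is the published HBM capacity of a single H200; the byte-per-parameter values are standard for bf16 and int8.

```python
# Back-of-the-envelope VRAM estimate for serving Grok-1's 314B weights.
import math

params = 314e9                     # Grok-1 parameter count
bytes_per = {"bf16": 2, "int8": 1} # bytes per parameter by format
h200_gb = 141                      # HBM3e capacity of one H200 GPU

for fmt, b in bytes_per.items():
    gb = params * b / 1e9
    print(f"{fmt}: ~{gb:.0f} GB of weights -> "
          f"at least {math.ceil(gb / h200_gb)}x H200")
```

That works out to roughly 628 GB in bf16 (at least 5 H200s) or roughly 314 GB in int8 (at least 3 H200s) for the weights alone, which is why single-GPU experimentation isn't feasible.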