Grok-1 - Open source release of xAI's LLM

by Chris Messina (Ambassador)

This is the base model weights and network architecture of Grok-1, xAI's large language model. Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI.
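For readers unfamiliar with the Mixture-of-Experts design mentioned above, here is a minimal sketch of top-2 expert routing, the general idea behind MoE layers. Sizes, the router, and the expert shapes are toy illustrations, not Grok-1's actual configuration:

```python
import numpy as np

def top2_moe(x, gate_w, expert_ws):
    """Toy top-2 Mixture-of-Experts layer: route each token to its two
    highest-scoring experts and mix their outputs by gate weight.
    All shapes here are illustrative, not Grok-1's real dimensions."""
    logits = x @ gate_w                          # (tokens, n_experts) routing scores
    top2 = np.argsort(logits, axis=-1)[:, -2:]   # indices of the 2 best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        picked = logits[t, top2[t]]
        w = np.exp(picked - picked.max())
        w /= w.sum()                             # softmax over the 2 chosen experts only
        for weight, e in zip(w, top2[t]):
            out[t] += weight * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 3
x = rng.standard_normal((tokens, d))
gate_w = rng.standard_normal((d, n_experts))       # router weights
expert_ws = rng.standard_normal((n_experts, d, d))  # one weight matrix per expert
y = top2_moe(x, gate_w, expert_ws)
print(y.shape)  # (3, 8)
```

The point of this design is that only a fraction of the total parameters run for each token, which is why a model can have 314B parameters while its per-token compute is much smaller.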


Replies

Chris Messina
📌
To @aaronoleary's question: "Can Elon Musk challenge OpenAI?", apparently the answer is indeed yes.
Aaron O'Leary
@chrismessina appreciate you remembering my question!
Saif Khan
This is huge! If only I had the computing power to give it a try :(
Abidariaz Abida
Assalamu alaikum, can I talk to you?
Daniel
Great hunt! I like that they open-sourced it, though not the reason behind open-sourcing it. Outside of that, it's awesome to see models of this size being thrown into the public. Interested to see what people will do with it!
Abhilash Chowdhary
Going to give this a try, team Grok-1. Looks interesting.
Bob Wilsey
Great find! I appreciate that they've open-sourced it, although the rationale behind doing so remains unclear. Nevertheless, it's fantastic to witness models of this magnitude being made available to the public. I'm curious to see what people will create with it!
Moneshkumar Natarajan
The open-source release of Grok-1, xAI's LLM, is a boon for those interested in advanced prediction and decision-making AI models. Despite its hardware demands (like multiple H200 GPUs), this unlocks access for experimentation and the creation of groundbreaking applications.
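A back-of-the-envelope calculation shows why the hardware demands are steep: at 314B parameters, just holding the weights in memory (before activations or KV cache) requires hundreds of gigabytes. The byte-per-parameter figures below assume standard 16-bit and 8-bit weight formats:

```python
# Rough weight-memory estimate for a 314B-parameter model.
# Assumes 2 bytes/param for bf16 and 1 byte/param for int8 quantization;
# ignores activations, optimizer state, and KV cache.
params = 314e9

gb_bf16 = params * 2 / 1e9  # gigabytes at 16-bit precision
gb_int8 = params * 1 / 1e9  # gigabytes at 8-bit precision

print(f"{gb_bf16:.0f} GB in bf16, {gb_int8:.0f} GB in int8")
```

Even the 8-bit figure exceeds the memory of any single consumer GPU, which is why running the full model realistically requires a multi-GPU node.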
Borey Washington
I'd like to see a smaller version of this LLM that uses less memory; not everyone can afford a huge-memory GPU machine to run it.
levene
Congrats on launching Grok-1
César Daniel Velázquez Mendoza
We'll be early today to get the stock fully but yes I guess you didn't too....
Chuck Chen
This is huge. Why doesn't Grok-1 get enough votes to rank at the top? It's at least near GPT-4 level, if not exceeding it.
loog
Great!!!