Matt Carroll

1M token context length

https://www.anthropic.com/news/1m-context

Nice, looks like the context window has grown quite a bit. Up from 200K, I believe!

I wonder if context > parameters for “basic” coding tasks at this point.
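
For anyone who wants to poke at it, here's a minimal sketch of requesting the bigger window via the Anthropic Python SDK. The model ID and the context-1m-2025-08-07 beta flag are my reading of the linked announcement, so treat them as assumptions:

    import anthropic

    client = anthropic.Anthropic()  # picks up ANTHROPIC_API_KEY from the environment

    # Hypothetical input: a whole repo concatenated into one big string.
    with open("repo_dump.txt") as f:
        long_context = f.read()

    response = client.beta.messages.create(
        model="claude-sonnet-4-20250514",   # Sonnet 4 model ID (assumption)
        betas=["context-1m-2025-08-07"],    # opt-in flag for the 1M window (assumption)
        max_tokens=1024,
        messages=[
            {"role": "user", "content": long_context + "\n\nExplain how the build system works."}
        ],
    )
    print(response.content[0].text)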


Replies

Vincent Chen

For basic coding, more context may well beat raw model size.