
1M token context length
https://www.anthropic.com/news/1m-context
Nice, looks like the context length has increased quite a bit. Up from 200k I believe!
I wonder if context > parameters for “basic” coding tasks at this point.
Replies
Lanceboard
For basic coding tasks, more context may well beat raw model size.