Anthropic has launched Claude 2.1, its latest and most advanced language model. By far the biggest and most noteworthy difference between Claude 2, which launched earlier this year, and Claude 2.1 is the amount of information that can be passed to Claude in a single prompt.
Previously 100,000 tokens, Claude’s context window has now been enlarged to 200,000 tokens.
That’s around 150,000 words, or 500 pages of material – roughly equivalent to one cover-to-cover copy of Charlotte Brontë’s Jane Eyre.
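The token-to-word arithmetic behind that comparison can be sketched quickly. The conversion ratios below (about 0.75 English words per token, about 300 words per printed page) are common rules of thumb, not figures published by Anthropic:

```python
# Back-of-the-envelope conversion from a token budget to words and pages.
# Both constants are rough industry rules of thumb (assumptions), since the
# exact ratio depends on the tokenizer and the text itself.

WORDS_PER_TOKEN = 0.75  # typical average for English prose (assumption)
WORDS_PER_PAGE = 300    # standard manuscript page (assumption)

def tokens_to_words(tokens: int) -> int:
    """Estimate how many English words fit in a given token budget."""
    return round(tokens * WORDS_PER_TOKEN)

def tokens_to_pages(tokens: int) -> int:
    """Estimate the equivalent page count for a given token budget."""
    return round(tokens_to_words(tokens) / WORDS_PER_PAGE)

print(tokens_to_words(200_000))  # 150000
print(tokens_to_pages(200_000))  # 500
```

Under these assumptions, the 200,000-token window works out to the roughly 150,000 words and 500 pages cited above.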
Anthropic says that Claude 2.1 will be able to “summarize, perform Q&A, forecast trends, compare and contrast multiple documents, and much more” and expects the latency to “decrease substantially” as the technology develops.