Anthropic expands Claude's context window from 9K to 100K tokens, or ~75K words it can digest and analyze; OpenAI's GPT-4 has a context window of ~32K tokens (Kyle Wiggers/TechCrunch) - TechnW3

Kyle Wiggers / TechCrunch:
Anthropic expands Claude's context window from 9K to 100K tokens, or ~75K words it can digest and analyze; OpenAI's GPT-4 has a context window of ~32K tokens — Historically and even today, poor memory has been an impediment to the usefulness of text-generating AI.

