DeepSeek V3.1 Officially Announced, Expands Context Window to 128k Tokens

On the evening of August 19, DeepSeek announced in its official assistant group that the online model has been upgraded to version 3.1, with the context length expanded to 128k tokens. The company confirmed that the new version is available for testing through the official website, mobile app, and mini program, while the API calling method remains unchanged.
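Because the calling method is unchanged, existing integrations should keep working as-is. The sketch below shows a typical request against DeepSeek's OpenAI-compatible endpoint; the base URL and model name follow DeepSeek's public documentation, and the API key is a placeholder.

```python
# Minimal sketch of calling the upgraded online model through the unchanged,
# OpenAI-compatible API. Base URL and model name are taken from DeepSeek's
# public docs; adjust them if your account uses different values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",       # placeholder, not a real key
    base_url="https://api.deepseek.com",   # DeepSeek's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",                  # chat model name per DeepSeek docs
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the attached report in five bullet points."},
    ],
)
print(response.choices[0].message.content)
```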

The main focus of this update is the context window. With a longer window, the model can take in and retain far more input in a single session, which should help with analyzing long documents, understanding large codebases, and maintaining consistency across extended conversations.
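As a rough illustration of what a 128k-token window allows, the sketch below estimates whether a long document plus a prompt fits in a single request. The 4-characters-per-token ratio and the output headroom are coarse assumptions for illustration, not DeepSeek's actual tokenizer behavior.

```python
# Rough sketch: checking whether a long document fits inside a 128k-token
# window before sending it in one request. The chars-per-token ratio is a
# coarse heuristic, not DeepSeek's real tokenizer.
CONTEXT_WINDOW = 128_000          # advertised context length, in tokens
RESERVED_FOR_OUTPUT = 4_000       # illustrative headroom for the model's reply

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token for English text)."""
    return len(text) // 4

def fits_in_window(document: str, prompt: str) -> bool:
    """Return True if document + prompt likely fit in one 128k-token request."""
    budget = CONTEXT_WINDOW - RESERVED_FOR_OUTPUT
    return estimate_tokens(document) + estimate_tokens(prompt) <= budget

if __name__ == "__main__":
    doc = "lorem ipsum " * 20_000          # ~240k characters, roughly 60k tokens
    print(fits_in_window(doc, "Summarize the key points of this report."))
```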

Some users have also noted that the update improves front-end coding performance, making the model more effective at HTML, CSS, and JavaScript tasks. Industry watchers see the 128k expansion as a step that brings DeepSeek closer to leading AI models such as OpenAI’s GPT-4 and Anthropic’s Claude 3.5, both known for their large context windows.

DeepSeek’s upgrade to V3.1 marks another move in the AI race, with longer context windows becoming a key feature for professional and business use.
