DeepSeek is preparing Deep Roles and has released a new V3 model

DeepSeek recently announced its new model, DeepSeek V3, a significant improvement over its predecessor. The new model is reportedly three times faster than V2 and shows stronger capabilities overall. Like all previous DeepSeek models, V3 is open source. According to benchmarks, it outperforms existing models, including Claude 3.5 Sonnet and GPT-4o, particularly on math and coding tasks such as HumanEval.

With 671 billion parameters, DeepSeek V3 is the largest open-source language model to date, surpassing the previous record of 405 billion parameters held by Llama 3.1. The model is already available on Hugging Face and will be rolled out gradually to the DeepSeek chat interface to reach a wider audience.
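For readers who want to experiment with the open weights directly, the sketch below shows one common way to load a Hugging Face checkpoint with the transformers library. The repository ID deepseek-ai/DeepSeek-V3 and the generation settings are assumptions for illustration; the full 671-billion-parameter checkpoint needs multi-GPU hardware, so treat this as a reference rather than a tested recipe.

```python
# Illustrative sketch: loading DeepSeek V3 weights from Hugging Face.
# The repo ID, dtype, and prompt are assumptions; the full 671B-parameter
# checkpoint requires a multi-GPU server.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-V3"  # assumed Hugging Face repository name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # shard layers across available GPUs
    torch_dtype="auto",       # use the dtype stored in the checkpoint
    trust_remote_code=True,   # the repo ships custom model code
)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Alternatively, the model can be tried through the DeepSeek chat interface as it rolls out, without downloading any weights.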

In addition to the new model, some hidden features have been discovered in the DeepSeek ecosystem. A notable feature in development, called Deep Roles, lets users browse “roles” created by others in both Chinese and English or design their own. Although still at an early stage, it appears to work much like custom GPTs: users attach personalized prompts to the DeepSeek LLM and can share them publicly. The full scope of Deep Roles remains unclear, however, and further updates are expected as the feature develops.
