DeepSeek V3

Balance of performance, efficiency, and accessibility

DeepSeek-V3 is an open-source large language model released in December 2024 by DeepSeek, a Chinese AI startup. It uses a Mixture-of-Experts (MoE) architecture with 671 billion total parameters, of which 37 billion are activated for each token.
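The activated-parameter figure comes from MoE routing: a small router scores all experts for each token and dispatches the token to only the top-k of them, so most expert parameters sit idle on any given step. A minimal sketch of top-k routing, using illustrative expert counts and dimensions rather than DeepSeek-V3's actual configuration:

```python
import numpy as np

# Minimal sketch of Mixture-of-Experts (MoE) top-k routing. The expert
# count, top_k, and hidden size here are illustrative placeholders, not
# DeepSeek-V3's real configuration.
rng = np.random.default_rng(0)

n_experts = 8    # hypothetical number of experts
top_k = 2        # experts activated per token
d_model = 16     # hypothetical hidden size

# Router: one weight vector per expert scores each incoming token.
router_w = rng.standard_normal((n_experts, d_model))

def route(token):
    """Select the top-k experts for a token and softmax their scores."""
    scores = router_w @ token
    chosen = np.argsort(scores)[-top_k:]   # indices of the top-k experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()               # normalize over chosen experts
    return chosen, weights

token = rng.standard_normal(d_model)
chosen, weights = route(token)
print("activated experts:", sorted(chosen.tolist()))
print(f"fraction of experts active: {top_k / n_experts:.2f}")
```

With top-k routing the compute (and activated parameters) per token scales with `top_k`, not `n_experts`, which is how a 671B-parameter model can run with only 37B parameters active per token.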

Pricing (per 1M tokens)
Input: $1.50
Output: $1.50

Category: Chat