DeepSeek
Models from DeepSeek
DeepSeek-Prover-V2
Open-source LLM specialized in formal theorem proving in Lean 4, built on a recursive theorem-proving pipeline powered by DeepSeek-V3.
DeepSeek-V3.2
Open-source Mixture-of-Experts LLM tuned for high-efficiency reasoning, coding, and general language tasks across long-form prompts.
DeepSeek-V4-Flash
Lightweight MoE model with 284B total / 13B active parameters and a native 1M-token context window, tuned for low-latency, cost-effective serving under high concurrency.
DeepSeek-V4-Pro
Flagship MoE LLM with 1.6T total / 49B active parameters and a native 1M-token context window, built for advanced math, logical inference, and specialized coding.
Janus-Pro-DeepSeek
Autoregressive framework built on the Janus-Pro 7B model that unifies multimodal understanding and image generation in a single architecture.
