DeepSeek Coder V2 Lite

The younger sibling of the GPT-4-beating 236B DeepSeek Coder V2 model, this model also comes out strong, with support for 338 programming languages!

Model info

Model: DeepSeek Coder V2 Lite
Author: DeepSeek
Arch: deepseek2
Parameters: 15.7B
Size on disk: about 8.46 GB
Format: GGUF

Download and run DeepSeek Coder V2 Lite

Open in LM Studio to view download options
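
If you prefer downloading from the terminal, newer builds of the lms CLI (set up in the next section) also include a download command. Treat the exact syntax as version-dependent; the lowercase model identifier below mirrors the one used for loading later on this page:

lms get lmstudio-community/deepseek-coder-v2-lite-instruct-gguf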

Use DeepSeek Coder V2 Lite in your code

💡 LM Studio needs to be installed and run at least once for this to work. Don't have it yet? Get it here.

CLI Bootstrap

npx lmstudio install-cli # (only needed once)

Model Load

lms load lmstudio-community/deepseek-coder-v2-lite-instruct-gguf
Alternatively, load the model in the LM Studio app.
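
The API example below expects LM Studio's local server to be listening on http://localhost:1234. You can start it from the app, or, assuming a recent version of the lms CLI, from the terminal:

lms server start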

Use DeepSeek Coder V2 Lite via an OpenAI-like API

Reuse your existing OpenAI client code and point it to LM Studio instead.

Python example
# Example: reuse your existing OpenAI client code
from openai import OpenAI

# Point to the local server
client = OpenAI(base_url="http://localhost:1234/v1", 
                api_key="lm-studio") # not used

completion = client.chat.completions.create(
  model="lmstudio-community/deepseek-coder-v2-lite-instruct-gguf",
  messages=[
    {"role": "system", "content": "Always answer in rhymes."},
    {"role": "user", "content": "Introduce yourself."}
  ],
  temperature=0.7,
)

print(completion.choices[0].message)
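
The same local endpoint also supports streamed responses through the standard OpenAI client. This is a minimal sketch under the same assumptions as above (model already loaded, server running on the default port 1234):

# Example: stream the reply as it is generated instead of waiting for the full message
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

stream = client.chat.completions.create(
    model="lmstudio-community/deepseek-coder-v2-lite-instruct-gguf",
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    temperature=0.7,
    stream=True,  # ask the server to send the response incrementally
)

for chunk in stream:
    # each chunk carries a small delta of the assistant's reply; it can be None on the final chunk
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()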
