InternLM 2.5 20B

InternLM 2.5 offers strong reasoning across the board as well as tool use for developers, and at 20B parameters it sits at the sweet spot in size for 24GB GPUs.

Model info

Model: InternLM 2.5 20B
Author: InternLM
Arch: internlm2
Parameters: 20B
Size on disk: ≈ 7.55 GB
Format: GGUF

Download and run InternLM 2.5 20B

Open in LM Studio to view download options

Use InternLM 2.5 20B in your code

💡 LM Studio needs to be installed and run at least once for this to work. Don't have it yet? Get it at lmstudio.ai.

CLI Bootstrap

npx lmstudio install-cli # (only needed once)

Model Load

lms load internlm/internlm2_5-20b-chat-gguf
Alternatively, load the model in the LM Studio app.

Use InternLM 2.5 20B via an OpenAI-like API

Reuse your existing OpenAI client code and point it to LM Studio instead.

Python example

# Example: reuse your existing OpenAI client code
from openai import OpenAI

# Point the client to the local LM Studio server
client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # required by the client, but ignored by LM Studio
)

completion = client.chat.completions.create(
    model="internlm/internlm2_5-20b-chat-gguf",
    messages=[
        {"role": "system", "content": "Always answer in rhymes."},
        {"role": "user", "content": "Introduce yourself."},
    ],
    temperature=0.7,
)

print(completion.choices[0].message.content)
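The openai client above is a thin wrapper over a plain HTTP POST to the server's chat-completions endpoint. If you'd rather avoid the dependency, here is a minimal standard-library sketch of the same request; field names follow the OpenAI chat-completions schema, and the host/port are LM Studio's defaults. The request object is only built here — call urllib.request.urlopen(req) to actually send it once the server is running.

```python
import json
import urllib.request

# LM Studio's OpenAI-compatible endpoint (default host and port).
url = "http://localhost:1234/v1/chat/completions"

# The same request body the openai client sends under the hood.
body = {
    "model": "internlm/internlm2_5-20b-chat-gguf",
    "messages": [
        {"role": "system", "content": "Always answer in rhymes."},
        {"role": "user", "content": "Introduce yourself."},
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# To send: response = urllib.request.urlopen(req)
# then: json.loads(response.read())["choices"][0]["message"]["content"]
```

This is handy for environments where installing packages is awkward, or for porting the call to another language: any HTTP client that can POST JSON to the endpoint above will work.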
