
LM Studio 0.3.1

By LM Studio Team

LM Studio 0.3.1 Release Notes

Chat migration improvement from 0.2.31 to 0.3.0+

In this version (0.3.1), we've fixed chat migration to bring in your system prompts from 0.2.31 chats.

If you've already migrated your chats in 0.3.0, you can run the 0.3.1 migrator again. It will create another folder and will not overwrite the previously migrated chats.

How to migrate chats

After updating to 0.3.1, head to Settings and click the "Migrate Chats" button. Your older chats will be copied into a new folder within the Chat tab. They will not be deleted.

What's new in 0.3.1

  • Chat migration improvement: Pre-0.3.0 chat migration now includes system prompts from 0.2.31
  • You can now paste (ctrl / cmd + V) images into the chat input box when a vision-enabled model is loaded.
  • Model load config: added an indication of the maximum context length the model supports, plus a button to set the context length to that maximum
  • Patched the Gemma 2 prompt template so it no longer errors out when you provide a system prompt. Instead, the system prompt is added as-is at the top of the context (see the sketch after this list).
    • You can override this behavior by providing your own prompt template in the My Models screen.
  • More descriptive errors when a model crashes during operation
    • There may still be cases where the error doesn't have much information; please let us know if you run into those.
  • Updated the bundled llama.cpp engine to 3ba780e2a8f0ffe13f571b27f0bbf2ca5a199efc (Aug 23)
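
To make the Gemma 2 change concrete, here is a minimal sketch of how a system prompt can be prepended as-is to a Gemma 2 style context. It assumes the publicly documented <start_of_turn> / <end_of_turn> turn markers and is purely illustrative; it is not LM Studio's internal template code, and render_gemma2_prompt is a hypothetical helper.

```python
# Illustrative sketch only: Gemma 2 has no dedicated system role, so a system
# prompt is placed verbatim at the top of the context instead of raising an
# error. The turn markers below are the publicly documented Gemma 2 chat
# tokens; this is not LM Studio's internal implementation.

def render_gemma2_prompt(system_prompt: str, user_message: str) -> str:
    parts = []
    if system_prompt:
        # System prompt goes in as-is, ahead of the first turn.
        parts.append(system_prompt)
    parts.append(f"<start_of_turn>user\n{user_message}<end_of_turn>")
    parts.append("<start_of_turn>model\n")
    return "\n".join(parts)


if __name__ == "__main__":
    print(render_gemma2_prompt("You are a concise assistant.", "Hello!"))
```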

Bug Fixes

  • Bug fix: Vision-enabled models would crash during operation
  • Bug fix: The search bar on the Discover page did not show
  • Bug fix: "Model Card" button text color was dark in the Classic theme
  • Bug fix: LM Studio deeplinks from Hugging Face and elsewhere did not work

For more, join our Discord community: https://discord.gg/aPQfnNkxGC

If you want to use LM Studio at your organization, get in touch! [email protected]