# lms server start

Start the LM Studio local server with a customizable port and CORS options.

The `lms server start` command launches the LM Studio local server, allowing you to interact with loaded models via HTTP API calls.
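Once running, the server can be queried over HTTP. As a minimal sketch, the snippet below asks the server for its model list with `curl`; the port `1234` and the `/v1/models` path are assumptions based on LM Studio's defaults, so adjust them to match your setup:

```shell
# Query the local server's model list over HTTP.
# Port 1234 is assumed here; adjust if you started the server with --port.
response=$(curl --silent --fail http://localhost:1234/v1/models \
  || echo "server not reachable on port 1234")
echo "$response"
```

If the server is not running, the command prints a short notice instead of failing.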
## Flags

- `--port <number>` (optional): Port to run the server on. If not provided, the server uses the last used port.
- `--cors` (optional): Enable CORS support for web application development. When not set, CORS is disabled.
## Start the server

Start the server with default settings:

```shell
lms server start
```

## Specify a custom port
Run the server on a specific port:
```shell
lms server start --port 3000
```

## Enable CORS support
For usage with web applications or some VS Code extensions, you may need to enable CORS support:
```shell
lms server start --cors
```

Note that enabling CORS may expose your server to security risks, so use it only when necessary.
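To confirm whether CORS is active, you can inspect the response headers for `Access-Control-*` entries. This is a sketch only: the origin value is hypothetical, the port and path assume LM Studio defaults, and the exact headers returned may vary:

```shell
# Inspect response headers for CORS entries (origin value is hypothetical).
# Assumes the server is on the default port 1234.
cors_headers=$(curl --silent --head -H "Origin: http://example.com" \
  http://localhost:1234/v1/models | grep -i "access-control" \
  || echo "no CORS headers found (server down or CORS disabled)")
echo "$cors_headers"
```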
## Check the server status

See `lms server status` for more information on checking the status of the server.
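In scripts, it can be useful to wait for the server to come up before sending requests. One way to sketch this is a short polling loop; the port and endpoint are assumptions based on LM Studio's defaults:

```shell
# Poll the server a few times before giving up (default port 1234 assumed).
up=0
for attempt in 1 2 3; do
  if curl --silent --fail http://localhost:1234/v1/models > /dev/null; then
    up=1
    break
  fi
  sleep 1
done
if [ "$up" -eq 1 ]; then echo "server is up"; else echo "server did not respond"; fi
```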