44 changes: 44 additions & 0 deletions 0_app/0_root/index.md → 0_app/0_root/index.mdx
@@ -5,6 +5,50 @@ description: Learn how to run Llama, DeepSeek, Qwen, Phi, and other LLMs locally
index: 1
---

import { Card, Cards } from "fumadocs-ui/components/card";
import { getDocsSectionIcon } from "@/lib/docsSectionIcon";

## Explore the docs

<Cards>
<Card
title="Developer"
description="Build with LM Studio's local REST API, OpenAI-compatible APIs, and developer tooling."
href="/docs/developer"
icon={getDocsSectionIcon("developer")}
/>
<Card
title="Python SDK"
description="Use lmstudio-python to load models, generate text, embed content, and build agents."
href="/docs/python"
icon={getDocsSectionIcon("python")}
/>
<Card
title="TypeScript SDK"
description="Use lmstudio-js in Node.js or TypeScript apps for local model workflows and plugins."
href="/docs/typescript"
icon={getDocsSectionIcon("typescript")}
/>
<Card
title="CLI"
description="Use lms for chat, model downloads, daemon management, server control, and publishing."
href="/docs/cli"
icon={getDocsSectionIcon("cli")}
/>
<Card
title="Integrations"
description="Connect LM Studio to Codex, Claude Code, OpenClaw, MCP tools, and remote workflows."
href="/docs/integrations"
icon={getDocsSectionIcon("integrations")}
/>
<Card
title="LM Link"
description="Set up LM Link to route local AI workloads across devices and preferred machines."
href="/docs/lmlink"
icon={getDocsSectionIcon("lmlink")}
/>
</Cards>

To get LM Studio, head over to the [Downloads page](/download) and download an installer for your operating system.

LM Studio is available for macOS, Windows, and Linux.
7 changes: 7 additions & 0 deletions 0_app/0_root/meta.json
@@ -0,0 +1,7 @@
{
"title": "Introduction",
"pages": [
"offline",
"system-requirements"
]
}
4 changes: 2 additions & 2 deletions 0_app/0_root/offline.md → 0_app/0_root/offline.mdx
@@ -4,9 +4,9 @@ description: LM Studio can operate entirely offline, just make sure to get some
index: 4
---

```lms_notice
<Callout type="info" title="Note">
In general, LM Studio does not require the internet in order to work. Core functions like chatting with models, chatting with documents, and running a local server all work fully offline.
```
</Callout>

### Operations that do NOT require connectivity

@@ -5,21 +5,21 @@ description: Getting started with connecting applications to LM Studio

LM Studio comes with a few built-in themes for app-wide color palettes.

<hr>
<hr />

### Selecting a Theme

You can choose a theme in the Settings tab.

Choosing the "Auto" option will automatically switch between Light and Dark themes based on your system settings.

```lms_protip
<Callout type="success" title="Pro Tip">
You can jump to Settings from anywhere in the app by pressing `cmd` + `,` on macOS or `ctrl` + `,` on Windows/Linux.
```
</Callout>

###### To get to the Settings page, you need to be on [Power User mode](/docs/modes) or higher.
To get to the Settings page, you need to be on Power User mode or higher.

<hr>
<hr />

### Community

52 changes: 0 additions & 52 deletions 0_app/1_basics/index.md

This file was deleted.

62 changes: 62 additions & 0 deletions 0_app/1_basics/index.mdx
@@ -0,0 +1,62 @@
---
title: Get started with LM Studio
sidebar_title: Overview
description: Download and run Large Language Models like Qwen, Mistral, Gemma, or gpt-oss in LM Studio.
index: 1
---

import { Step, Steps } from "fumadocs-ui/components/steps";

Double check that your computer meets the minimum [system requirements](/docs/system-requirements).

<Callout type="info" title="Info">
You might sometimes see terms such as `open-source models` or `open-weights models`. Different models may be released under different licenses, with varying degrees of 'openness'. To run a model locally, you need access to its "weights", often distributed as one or more files ending in `.gguf`, `.safetensors`, etc.
</Callout>

<hr />

## Getting up and running

First, **install the latest version of LM Studio**. You can get it from [here](/download).

Once you're all set up, you need to **download your first LLM**.

<Steps>
<Step>
<h3>Download an LLM to your computer</h3>

Head over to the Discover tab to download models. Pick one of the curated options or search for models by keyword (e.g. `"Llama"`). See more in-depth information about downloading models [here](/docs/basics/download-models).

<img src="/assets/marketing/docs/discover.png" style={{ width: "500px", marginTop: "30px" }} data-caption="The Discover tab in LM Studio" />
</Step>

<Step>
<h3>Load a model to memory</h3>

Head over to the **Chat** tab, and

1. Open the model loader
2. Select one of the models you downloaded (or [sideloaded](/docs/advanced/sideload)).
3. Optionally, choose load configuration parameters.

<img src="/assets/marketing/docs/loader.png" data-caption="Quickly open the model loader with `cmd` + `L` on macOS or `ctrl` + `L` on Windows/Linux" />

<h4>What does loading a model mean?</h4>

Loading a model typically means allocating memory in your computer's RAM to hold the model's weights and other parameters.
</Step>

<Step>
<h3>Chat!</h3>

Once the model is loaded, you can start a back-and-forth conversation with the model in the Chat tab.

<img src="/assets/marketing/docs/chat.png" data-caption="LM Studio on macOS" />
</Step>
</Steps>

<hr />

### Community

Chat with other LM Studio users, discuss LLMs, hardware, and more on the [LM Studio Discord server](https://discord.gg/aPQfnNkxGC).
12 changes: 12 additions & 0 deletions 0_app/1_basics/meta.json
@@ -0,0 +1,12 @@
{
"title": "Getting Started",
"pages": [
"chat",
"_connect-apps",
"download-model",
"_keychords",
"lmstudio-vs-llmster-vs-lms",
"rag",
"_troubleshooting"
]
}
6 changes: 2 additions & 4 deletions 0_app/2_mcp/deeplink.md → 0_app/2_mcp/deeplink.mdx
@@ -1,5 +1,5 @@
---
title: "`Add to LM Studio` Button"
title: "Add to LM Studio Button"
description: Add MCP servers to LM Studio using a deeplink
index: 2
---
@@ -14,9 +14,7 @@ Starting with version 0.3.17 (10), LM Studio can act as an MCP host. Learn more

Enter your MCP JSON entry to generate a deeplink for the `Add to LM Studio` button.

```lms_mcp_deep_link_generator

```
<DocsMcpDeepLinkGenerator />

## Try an example

28 changes: 16 additions & 12 deletions 0_app/2_mcp/index.md → 0_app/2_mcp/index.mdx
@@ -10,9 +10,9 @@ Starting LM Studio 0.3.17, LM Studio acts as an **Model Context Protocol (MCP) H

Never install MCPs from untrusted sources.

```lms_warning
<Callout type="warning" title="Heads Up">
Some MCP servers can run arbitrary code, access your local files, and use your network connection. Always be cautious when installing and using MCP servers. If you don't trust the source, don't install it.
```
</Callout>

# Use MCP servers in LM Studio

@@ -22,24 +22,28 @@ Starting 0.3.17 (b10), LM Studio supports both local and remote MCP servers. You

Switch to the "Program" tab in the right hand sidebar. Click `Install > Edit mcp.json`.

<img src="/assets/marketing/docs/install-mcp.png" data-caption="" style="width: 80%;" className="" />
<img src="/assets/marketing/docs/install-mcp.png" data-caption="" style={{ width: "80%" }} className="" />

This will open the `mcp.json` file in the in-app editor. You can add MCP servers by editing this file.

<img src="/assets/marketing/docs/mcp-editor.png" data-caption="Edit mcp.json using the in-app editor" style="width: 100%;" className="" />
<img src="/assets/marketing/docs/mcp-editor.png" data-caption="Edit mcp.json using the in-app editor" style={{ width: "100%" }} className="" />
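For orientation, a remote-server entry in `mcp.json` generally takes this shape (a sketch based on the standard MCP config format; the server name, URL, and token here are placeholders, not a real endpoint):

```json title="mcp.json"
{
  "mcpServers": {
    "example-remote-server": {
      "url": "https://example.com/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_TOKEN>"
      }
    }
  }
}
```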

### Example MCP to try: Hugging Face MCP Server

This MCP server provides access to functions like model and dataset search.

<div className="w-fit">
<a style="background: rgb(255,255,255)" href="https://lmstudio.ai/install-mcp?name=hf-mcp-server&config=eyJ1cmwiOiJodHRwczovL2h1Z2dpbmdmYWNlLmNvL21jcCIsImhlYWRlcnMiOnsiQXV0aG9yaXphdGlvbiI6IkJlYXJlciA8WU9VUl9IRl9UT0tFTj4ifX0%3D">
<LightVariant>
<img src="https://files.lmstudio.ai/deeplink/mcp-install-light.svg" alt="Add MCP Server hf-mcp-server to LM Studio" />
</LightVariant>
<DarkVariant>
<img src="https://files.lmstudio.ai/deeplink/mcp-install-dark.svg" alt="Add MCP Server hf-mcp-server to LM Studio" />
</DarkVariant>
<a style={{ background: "rgb(255,255,255)" }} href="https://lmstudio.ai/install-mcp?name=hf-mcp-server&config=eyJ1cmwiOiJodHRwczovL2h1Z2dpbmdmYWNlLmNvL21jcCIsImhlYWRlcnMiOnsiQXV0aG9yaXphdGlvbiI6IkJlYXJlciA8WU9VUl9IRl9UT0tFTj4ifX0%3D">
<img
src="https://files.lmstudio.ai/deeplink/mcp-install-light.svg"
alt="Add MCP Server hf-mcp-server to LM Studio"
className="dark:hidden"
/>
<img
src="https://files.lmstudio.ai/deeplink/mcp-install-dark.svg"
alt="Add MCP Server hf-mcp-server to LM Studio"
className="hidden dark:block"
/>
</a>
</div>

@@ -56,7 +60,7 @@ This MCP server provides access to functions like model and dataset search.
}
```

###### You will need to replace `<YOUR_HF_TOKEN>` with your actual Hugging Face token. Learn more [here](https://huggingface.co/docs/hub/en/security-tokens).
You will need to replace `<YOUR_HF_TOKEN>` with your actual Hugging Face token. Learn more [here](https://huggingface.co/docs/hub/en/security-tokens).

Use the [deeplink button](mcp/deeplink), or copy the JSON snippet above and paste it into your `mcp.json` file.
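As an aside, the `config` query parameter in the deeplink URL above is just the MCP JSON entry, base64-encoded and percent-escaped. A minimal sketch of that encoding (the exact rules LM Studio's generator uses are an assumption):

```python
import base64
import json
from urllib.parse import quote

# The MCP server entry from the snippet above; <YOUR_HF_TOKEN> is a placeholder.
config = {
    "url": "https://huggingface.co/mcp",
    "headers": {"Authorization": "Bearer <YOUR_HF_TOKEN>"},
}

# Compact-encode the JSON, base64 it, then percent-escape it for the query string.
encoded = base64.b64encode(json.dumps(config, separators=(",", ":")).encode()).decode()
deeplink = f"https://lmstudio.ai/install-mcp?name=hf-mcp-server&config={quote(encoded)}"
print(deeplink)
```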

6 changes: 6 additions & 0 deletions 0_app/2_mcp/meta.json
@@ -0,0 +1,6 @@
{
"title": "MCP",
"pages": [
"deeplink"
]
}
4 changes: 2 additions & 2 deletions 0_app/3_modelyaml/index.md
@@ -1,6 +1,6 @@
---
title: "Introduction to `model.yaml`"
description: Describe models with the cross-platform `model.yaml` specification.
title: "Introduction to model.yaml"
description: Describe models with the cross-platform model.yaml specification.
index: 5
socialCard:
url: https://files.lmstudio.ai/modelyaml-card.jpg
6 changes: 6 additions & 0 deletions 0_app/3_modelyaml/meta.json
@@ -0,0 +1,6 @@
{
"title": "model.yaml",
"pages": [
"publish"
]
}
4 changes: 2 additions & 2 deletions 0_app/3_modelyaml/publish.md
@@ -1,5 +1,5 @@
---
title: Publish a `model.yaml`
title: Publish a model.yaml
description: Upload your model definition to the LM Studio Hub.
index: 7
---
@@ -22,7 +22,7 @@ lms clone qwen/qwen3-8b

This will result in a local copy of `model.yaml`, `README`, and other metadata files. Importantly, this does NOT download the model weights.

```lms_terminal
```bash title="Terminal"
$ ls
README.md manifest.json model.yaml thumbnail.png
```
9 changes: 9 additions & 0 deletions 0_app/3_presets/meta.json
@@ -0,0 +1,9 @@
{
"title": "Presets",
"pages": [
"import",
"publish",
"pull",
"push"
]
}
20 changes: 14 additions & 6 deletions 0_app/3_presets/push.md → 0_app/3_presets/push.mdx
@@ -4,6 +4,8 @@ description: Publish new revisions of your Presets to the LM Studio Hub.
index: 5
---

import { Step, Steps } from "fumadocs-ui/components/steps";

`Feature In Preview`

Starting LM Studio 0.3.15, you can publish your Presets to the LM Studio community. This allows you to share your Presets with others and import Presets from other users.
@@ -18,14 +20,20 @@ Presets you share on the LM Studio Hub can be updated.

<img src="/assets/marketing/docs/preset-cloud-indicator.png" data-caption="Your shared Presets are marked with a cloud icon." />

## Step 1: Make Changes and Commit
<Steps>
<Step>
<h3>Make Changes and Commit</h3>

Make any changes to your Preset, either by adjusting parameters already included in the Preset or by adding new parameters.
Make any changes to your Preset, either by adjusting parameters already included in the Preset or by adding new parameters.
</Step>

## Step 2: Click the Push Button
<Step>
<h3>Click the Push Button</h3>

Once changes are committed, you will see a `Push` button. Click it to push your changes to the Hub.
Once changes are committed, you will see a `Push` button. Click it to push your changes to the Hub.

Pushing changes will result in a new revision of your Preset on the Hub.
Pushing changes will result in a new revision of your Preset on the Hub.

<img src="/assets/marketing/docs/preset-push-button.png" data-caption="Click the Push button to push your changes to the Hub." />
<img src="/assets/marketing/docs/preset-push-button.png" data-caption="Click the Push button to push your changes to the Hub." />
</Step>
</Steps>