NotPixel SDK v1.0.1 — Now with caching, hooks, and browser tracking!

LMStudio

Access LMStudio models locally through NotPixel. Perfect for local development and testing without hitting cloud APIs.

Learn more in the LMStudio documentation.

Setup

1. Download LMStudio

  1. Go to LMStudio 
  2. Download and install for your OS
  3. Download the models you want to use
  4. Start the Local Inference Server (defaults to http://localhost:1234)
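Before configuring NotPixel, you can confirm the server is actually listening. A minimal sketch using Node 18+'s built-in fetch; `/v1/models` is LMStudio's OpenAI-compatible model-listing route, and the port is the default from step 4:

```javascript
// Quick reachability check for the local LMStudio server.
// The port is LMStudio's default; adjust if you changed it in the app.
async function lmstudioIsUp(baseURL = 'http://localhost:1234') {
  try {
    // LMStudio's OpenAI-compatible API lists loaded models at /v1/models.
    const res = await fetch(`${baseURL}/v1/models`);
    return res.ok;
  } catch {
    return false; // connection refused: server not started or wrong port
  }
}

lmstudioIsUp().then((up) =>
  console.log(up ? 'LMStudio server is up' : 'Start the Local Inference Server first'),
);
```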

2. Configure Environment

.env

```
# Optional; defaults to mock if not set, but used for local routing
LMSTUDIO_API_KEY=not-needed-for-local
```

Usage

```javascript
import Ads from 'notpixel';

const ads = new Ads({
  publisherId: 'pub_xxx',
  model: 'lmstudio/openai/gpt-oss-20b',
  input: 'Hello from local AI!',
});

const response = await ads.offer();
console.log(response.text);
```

NotPixel talks to LMStudio through its OpenAI-compatible endpoint. If you are not using the default router, set your baseURL to your local LMStudio endpoint (for example, http://localhost:1234/v1).
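Under the hood, a call to LMStudio's OpenAI-compatible endpoint is an ordinary chat-completions request. A minimal sketch of the request shape; the payload follows the OpenAI chat API, and the prefix-stripping is an assumption about how NotPixel maps its `lmstudio/...` model ids to local model names:

```javascript
// Build an OpenAI-style chat-completions request for a local LMStudio server.
// Assumption: NotPixel strips the "lmstudio/" prefix and forwards the rest
// as the model id LMStudio knows the model by.
function buildLmstudioRequest(modelId, input, baseURL = 'http://localhost:1234/v1') {
  const localModel = modelId.replace(/^lmstudio\//, '');
  return {
    url: `${baseURL}/chat/completions`,
    body: {
      model: localModel,
      messages: [{ role: 'user', content: input }],
    },
  };
}

const req = buildLmstudioRequest('lmstudio/openai/gpt-oss-20b', 'Hello from local AI!');
console.log(req.url);        // http://localhost:1234/v1/chat/completions
console.log(req.body.model); // openai/gpt-oss-20b
```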

Available Models (Examples)

| Model | Context |
| --- | --- |
| lmstudio/openai/gpt-oss-20b | 131K |
| lmstudio/qwen/qwen3-30b-a3b-2507 | 262K |
| lmstudio/qwen/qwen3-coder-30b | 262K |
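The context sizes above matter when routing long prompts locally. A hypothetical helper (not part of NotPixel) that picks the first example model whose context window fits the prompt, using the figures from the table:

```javascript
// Example models and their context windows (in thousands of tokens),
// taken from the table above.
const MODELS = [
  { id: 'lmstudio/openai/gpt-oss-20b', contextK: 131 },
  { id: 'lmstudio/qwen/qwen3-30b-a3b-2507', contextK: 262 },
  { id: 'lmstudio/qwen/qwen3-coder-30b', contextK: 262 },
];

// Hypothetical helper: first model whose context window fits the prompt.
function pickModel(promptTokens) {
  const fit = MODELS.find((m) => m.contextK * 1000 >= promptTokens);
  return fit ? fit.id : null; // null: prompt exceeds every context window
}

console.log(pickModel(100_000)); // lmstudio/openai/gpt-oss-20b
console.log(pickModel(200_000)); // lmstudio/qwen/qwen3-30b-a3b-2507
```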

Environment Variable

| Variable | Description |
| --- | --- |
| LMSTUDIO_API_KEY | Your local API key (optional) |
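Since the variable is optional, it is typically read with a fallback. A sketch reflecting the comment in the .env example above; the exact default NotPixel applies is an assumption, so verify it against NotPixel's own docs:

```javascript
// Resolve the LMStudio key; the "mock" fallback mirrors the .env comment
// ("defaults to mock if not set") — an assumption, not confirmed behavior.
function lmstudioApiKey(env = process.env) {
  return env.LMSTUDIO_API_KEY ?? 'mock';
}

console.log(lmstudioApiKey({ LMSTUDIO_API_KEY: 'not-needed-for-local' })); // not-needed-for-local
console.log(lmstudioApiKey({})); // mock
```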