LM Studio now runs as a headless server with an Anthropic-compatible API
LM Studio 0.4.0 extracts the inference engine into a standalone headless daemon with a full CLI and an Anthropic-compatible endpoint, so you can point Claude Code at a local model by setting two environment variables.
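A minimal sketch of that setup, assuming Claude Code's standard `ANTHROPIC_BASE_URL` / `ANTHROPIC_AUTH_TOKEN` overrides and LM Studio's default local port of 1234 (check your own config for the actual port and variable names):

```shell
# Assumed values — adjust the port to match your LM Studio server settings.
export ANTHROPIC_BASE_URL="http://localhost:1234"  # LM Studio's local Anthropic-compatible endpoint
export ANTHROPIC_AUTH_TOKEN="lm-studio"            # placeholder token; a local server typically ignores it

claude  # launch Claude Code, now talking to the local model
```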
- tools
- infrastructure