Quickstart#
See your first LLM trace in Lens Loop in under 5 minutes.
What you'll accomplish
By the end of this guide, you'll capture your first LLM API call as a trace in Lens Loop — complete with prompts, responses, latency, and token usage.
Prerequisites
- Lens Loop installed — Download here or use the link from your beta invite email
- Lens ID — You'll create one when you first launch Loop (takes < 1 minute)
- An LLM API key — OpenAI, Anthropic, or any supported provider
- An application that makes LLM API calls — Any language or framework
Step 1: Install & Sign In#
- Download Lens Loop for your platform from your beta invite email or lenshq.io
- Run the installer and launch Lens Loop
- Sign in with your Lens ID (or create one — it takes less than a minute)
Need detailed instructions?
See Install Lens Loop for platform-specific guidance.
Step 2: Connect Your Application#
Route your LLM API calls through Loop's local proxy. This typically involves two configuration changes:
- Set your API base URL to `http://localhost:31300/openai/v1`
- Add the `X-Loop-Project` header with a project name
The project is created automatically if it doesn't exist.
```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:31300/openai/v1",  # (1)!
    api_key=os.environ.get("OPENAI_API_KEY"),
    default_headers={
        "X-Loop-Project": "hello-loop",  # (2)!
    },
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello!"}],
)

print(response.choices[0].message.content)
```
1. Routes traffic through Lens Loop's local gateway
2. Tags this trace with your project name
```javascript
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "http://localhost:31300/openai/v1", // (1)!
  apiKey: process.env.OPENAI_API_KEY,
  defaultHeaders: {
    "X-Loop-Project": "hello-loop", // (2)!
  },
});

const response = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Say hello!" }],
});

console.log(response.choices[0].message.content);
```
1. Routes traffic through Lens Loop's local gateway
2. Tags this trace with your project name
Using a different provider?
See Capture Traffic for Anthropic, Azure, Gemini, and local models like Ollama.
Step 3: See Your Trace#
Your trace is now visible in Lens Loop:
- Open Lens Loop (if not already open)
- In the Navigator, select your project
- Click Traces to see your captured requests
- Click on a trace to inspect:
- Full prompt and response content
- Token usage and estimated cost
- Latency breakdown
- Model and parameters used
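As a sanity check on the numbers Loop shows, you can reproduce a cost estimate from the token counts yourself. The per-million-token rates below are illustrative placeholders, not Loop's pricing table or any provider's actual rates:

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  input_rate: float, output_rate: float) -> float:
    """Estimate USD cost from token counts, given per-1M-token rates."""
    return (prompt_tokens * input_rate + completion_tokens * output_rate) / 1_000_000


# Example: 1,000 prompt tokens and 500 completion tokens at
# hypothetical rates of $0.15 (input) / $0.60 (output) per million tokens.
print(estimate_cost(1000, 500, 0.15, 0.60))
```

If Loop's estimated cost differs from yours, check which pricing table (and which model name) each side is using.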
You did it!
You've captured your first LLM trace. Every request routed through Loop will now be observable, searchable, and analyzable.
Next Steps#
- Organize with Labels: Add custom labels to group and filter traces by feature, user, or experiment.
- Team Environments: Share traces across your team with a remote Loop Server.
- Explore the UI: Learn about the Navigator, Traces view, and Details panel.
Troubleshooting
Trace not appearing?
- Ensure Lens Loop is running (check your system tray or menu bar)
- Verify the `X-Loop-Project` header matches your project name exactly (case-sensitive)
- Check that port `31300` is available and not blocked by a firewall
Connection refused error?
- Make sure Lens Loop is running before you run your script
- Try restarting Lens Loop
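To tell these two cases apart quickly, a short Python probe can confirm whether anything is accepting TCP connections on the gateway port (31300 is the default used throughout this guide; adjust if you've changed it):

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if port_open("127.0.0.1", 31300):
        print("Something is listening on port 31300 — the gateway looks reachable.")
    else:
        print("Nothing is listening on port 31300 — is Lens Loop running?")
```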
API key issues?
- Your actual API key is still sent to the LLM provider — Loop just proxies the request
- Ensure `OPENAI_API_KEY` (or equivalent) is set in your environment
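A missing environment variable usually surfaces as an opaque authentication error from the provider. One way to fail fast instead is a small guard at the top of your script — a generic sketch, not a Loop API:

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running this script")
    return value
```

For example, call `require_env("OPENAI_API_KEY")` before constructing your client so the error names the missing variable rather than a failed request.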
Need more help?
Visit the Lens Loop Forum