Quickstart
Install the runtime with one Docker command, sign in to the dashboard, add a provider, and send your first message.
Prerequisites
- Docker installed on the host.
- One LLM provider API key (OpenAI, Anthropic, Google, or compatible).
- Port 8080 free on the host.
1. Start the runtime
Pull the image and start the container with persistent volumes. The browser tool requires `--shm-size` and `SYS_ADMIN` for Chromium automation; leave them in unless you know you will not use browsing tools.
```shell
docker pull ghcr.io/alexk-dev/golemcore-bot:latest
docker run -d \
  --name golemcore-bot \
  --shm-size=256m \
  --cap-add=SYS_ADMIN \
  -e STORAGE_PATH=/app/workspace \
  -e TOOLS_WORKSPACE=/app/sandbox \
  -v golemcore-bot-data:/app/workspace \
  -v golemcore-bot-sandbox:/app/sandbox \
  -p 8080:8080 \
  --restart unless-stopped \
  ghcr.io/alexk-dev/golemcore-bot:latest
```

Expected output on a first run:

```
Unable to find image 'ghcr.io/alexk-dev/golemcore-bot:latest' locally
latest: Pulling from alexk-dev/golemcore-bot
...
Status: Downloaded newer image for ghcr.io/alexk-dev/golemcore-bot:latest
8f3e6a1c4d2b...
```

2. Open the dashboard
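Before signing in, it can help to confirm the container is actually up. A small guarded sketch (it degrades to a message on hosts without Docker):

```shell
# Show the container's name, status, and port mapping, if it exists.
if command -v docker >/dev/null 2>&1; then
  docker ps --filter name=golemcore-bot --format '{{.Names}}  {{.Status}}  {{.Ports}}'
else
  echo "docker not available"
fi
```

A healthy container shows an `Up ...` status and the `8080->8080` port mapping; no output means the container is not running.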
The runtime prints a temporary admin password to its logs on first boot. Read it, then sign in.
```shell
docker logs golemcore-bot 2>&1 | grep "Initial admin password"
```

```
Initial admin password: Xf3kP9-qLmN2w7R
```

Open http://localhost:8080 and sign in as admin with that password. You land on the onboarding screen.
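If you prefer to script this step, the same log line can be captured into a shell variable. A sketch, guarded for hosts without Docker; the exact log prefix around the message may differ in your version:

```shell
if command -v docker >/dev/null 2>&1; then
  # Strip everything up to and including the marker, keep the password.
  ADMIN_PW=$(docker logs golemcore-bot 2>&1 \
    | sed -n 's/.*Initial admin password: //p' | head -n1)
  echo "admin password: ${ADMIN_PW:-not printed yet - wait for first boot}"
else
  echo "docker not available"
fi
```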
3. Add a provider
The onboarding screen walks you through adding a provider key and assigning models to tiers. The minimum viable config is one provider with one model mapped to the balanced tier.
```json
{
  "llm": {
    "providers": {
      "openai": {
        "apiKey": "sk-proj-your-key-here",
        "apiType": "openai"
      }
    }
  }
}
```

Once the provider is saved, go to Settings → Model Router and set balancedModel to a concrete model ID such as `openai/gpt-5.1`. Leave the other tiers empty for now; dynamic escalation will handle them.
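For orientation only, the router setting could be pictured as a config fragment like the one below. The `modelRouter` key name is a hypothetical illustration (this guide only names the balancedModel setting and the Settings → Model Router page), so treat it as a sketch of the shape, not the runtime's actual schema, and prefer the dashboard form:

```json
{
  "modelRouter": {
    "balancedModel": "openai/gpt-5.1"
  }
}
```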
4. Send your first message
Go to Chat. Type a message. You should see the agent start a turn, call the selected model, and respond.
```
You: Hello. What tools do you have access to?

Agent: I have access to filesystem operations (read_file, write_file,
list_directory), shell execution in a sandboxed workspace, memory tools
for cross-session recall, and a set of management tools for skills and
goals. No browser or search tools are active yet — you can enable them
from Settings → Plugin Marketplace.
```

The runtime is working. The turn was visible in real time through the Chat stream; you can also inspect it in Sessions and Logs.
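To watch a turn from the host as well, you can tail the container logs while you chat. A guarded sketch; what the logs actually contain depends on your runtime version:

```shell
# Print the last 50 log lines; add -f to follow the stream live.
if command -v docker >/dev/null 2>&1; then
  docker logs --tail 50 golemcore-bot 2>&1 \
    || echo "container golemcore-bot not found - is it running?"
else
  echo "docker not available"
fi
```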
If the agent did not respond
- Check `docker logs golemcore-bot` for provider errors (invalid key, unknown model, rate limit).
- Verify the model ID matches a real model at your provider.
- See Troubleshooting for symptom-to-fix lookup.
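The first check above can be scripted as a quick keyword filter over recent logs. A sketch; the exact error wording varies by provider:

```shell
# Scan the last 200 log lines for common provider failure keywords.
if command -v docker >/dev/null 2>&1; then
  docker logs --tail 200 golemcore-bot 2>&1 \
    | grep -iE 'error|invalid|unknown model|rate limit' \
    || echo "no obvious provider errors in the last 200 lines"
else
  echo "docker not available"
fi
```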
What to do next
- Tour the dashboard: learn what each dashboard page does and how to navigate the Settings catalog.
- Model routing: understand how tier resolution works and pick the right models per tier for your workload.
- Install plugins: add browser, search, voice, or mail capabilities through the Plugin Marketplace.
- Your first skill: shape agent behavior by creating or installing skills. Start with the GitHub MCP recipe.
Related pages
- Dashboard (User Guide): navigate Chat, Settings, Scheduler, Sessions, Logs, Skills, Diagnostics, and the Plugin Marketplace.
- Model Routing (User Guide): how the model router turns abstract tier names into concrete model calls, and the priority order that resolves tier conflicts.
- Deployment (User Guide): deploy GolemCore Bot with Docker, Compose, or JAR; includes a recommended default and a concrete production checklist.