3 min read · Updated 2026-04-10

Quickstart

Install the runtime with one Docker command, sign in to the dashboard, add a provider, and send your first message.

Prerequisites

  • Docker installed on the host.
  • One LLM provider API key (OpenAI, Anthropic, Google, or compatible).
  • Port 8080 free on the host.
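The checklist above can be verified from a shell before you start. This is a sketch, assuming a Linux or macOS host; the port probe only runs where ss is available (common on Linux), so skip that part otherwise.

```shell
# Prerequisite check (sketch): confirms docker is installed and warns if
# something is already listening on port 8080.
missing=""
command -v docker >/dev/null 2>&1 || missing="$missing docker"

# Port probe: only meaningful when ss is available.
if command -v ss >/dev/null 2>&1 && ss -ltn 2>/dev/null | grep -q ':8080 '; then
  echo "warning: port 8080 is already in use"
fi

prereq_result="${missing:-ok}"
if [ "$prereq_result" = "ok" ]; then
  echo "prerequisites look good"
else
  echo "missing:$missing"
fi
```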

1. Start the runtime

Pull the image and start the container with persistent volumes. The browser tool needs --shm-size and the SYS_ADMIN capability for Chromium automation; leave them in unless you are sure you will not use browsing tools.

Run the runtime
bash
docker pull ghcr.io/alexk-dev/golemcore-bot:latest

docker run -d \
  --name golemcore-bot \
  --shm-size=256m \
  --cap-add=SYS_ADMIN \
  -e STORAGE_PATH=/app/workspace \
  -e TOOLS_WORKSPACE=/app/sandbox \
  -v golemcore-bot-data:/app/workspace \
  -v golemcore-bot-sandbox:/app/sandbox \
  -p 8080:8080 \
  --restart unless-stopped \
  ghcr.io/alexk-dev/golemcore-bot:latest
Example output
text
Unable to find image 'ghcr.io/alexk-dev/golemcore-bot:latest' locally
latest: Pulling from alexk-dev/golemcore-bot
...
Status: Downloaded newer image for ghcr.io/alexk-dev/golemcore-bot:latest
8f3e6a1c4d2b...
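Once the container ID prints, you can confirm the container is actually running. A quick status check (sketch, guarded so it degrades gracefully when docker is not on PATH):

```shell
# Show the golemcore-bot container's status, if docker is available.
if command -v docker >/dev/null 2>&1; then
  container_msg="$(docker ps --filter name=golemcore-bot --format '{{.Names}} {{.Status}}')"
  [ -n "$container_msg" ] || container_msg="golemcore-bot is not running (check docker ps -a)"
else
  container_msg="docker not found in PATH"
fi
echo "$container_msg"
```

An "Up" status means the runtime started; if the container exited, docker logs golemcore-bot usually shows why.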

2. Open the dashboard

The runtime prints a temporary admin password to its logs on first boot. Read it, then sign in.

Read the initial admin password
bash
docker logs golemcore-bot 2>&1 | grep "Initial admin password"
Example output
text
Initial admin password: Xf3kP9-qLmN2w7R

Open http://localhost:8080 and sign in as admin with that password. You land on the onboarding screen.
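If the page does not load right away, the runtime may still be starting. A small poll can confirm when it becomes reachable (sketch; assumes the default -p 8080:8080 mapping and curl on the host):

```shell
# Poll the dashboard on port 8080; gives up after roughly six seconds.
up=no
for _ in 1 2 3; do
  if curl -fsS -o /dev/null http://localhost:8080/ 2>/dev/null; then
    up=yes
    break
  fi
  sleep 2
done
echo "dashboard up: $up"
```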

3. Add a provider

The onboarding screen walks you through adding a provider key and assigning models to tiers. The minimum viable config is one provider with one model mapped to the balanced tier.

preferences/llm-providers.json (written by the onboarding flow)
json
{
  "llm": {
    "providers": {
      "openai": {
        "apiKey": "sk-proj-your-key-here",
        "apiType": "openai"
      }
    }
  }
}

Once the provider is saved, go to Settings → Model Router and set balancedModel to a concrete model ID like openai/gpt-5.1. Leave the other tiers empty for now — dynamic escalation will handle them.
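The router settings persist alongside the provider config once you save them in the dashboard. The file shape below is a hypothetical sketch for illustration only (the dashboard writes the real file for you; the nesting under llm mirrors the provider file above, and balancedModel is the key named in the Settings screen):

```json
{
  "llm": {
    "router": {
      "balancedModel": "openai/gpt-5.1"
    }
  }
}
```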

4. Send your first message

Go to Chat. Type a message. You should see the agent start a turn, call the selected model, and respond.

Example session
text
You:      Hello. What tools do you have access to?

Agent:    I have access to filesystem operations (read_file, write_file,
          list_directory), shell execution in a sandboxed workspace,
          memory tools for cross-session recall, and a set of management
          tools for skills and goals. No browser or search tools are
          active yet — you can enable them from Settings → Plugin
          Marketplace.

The runtime is working: the turn streamed in real time in Chat, and you can also inspect it under Sessions and Logs.

If the agent did not respond

  • Check docker logs golemcore-bot for provider errors (invalid key, unknown model, rate limit).
  • Verify the model ID matches a real model at your provider.
  • See Troubleshooting for a symptom-to-fix lookup.
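The first bullet can be scripted. This sketch scans recent log lines for common provider failure keywords; the keyword list is a guess for illustration, not an exhaustive match:

```shell
# Scan the last 200 runtime log lines for likely provider errors.
if command -v docker >/dev/null 2>&1; then
  provider_errors="$(docker logs --tail 200 golemcore-bot 2>&1 \
    | grep -iE 'error|invalid|rate.?limit|unknown model' || true)"
  echo "${provider_errors:-no matching error lines in the last 200 entries}"
else
  echo "docker not found in PATH; run this on the host where the container lives"
fi
scan_done=yes
```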

What to do next