OpenHands Windows Bridge

Run OpenHands AI coding agent on Windows with local LLM models

A complete Windows integration for OpenHands that enables seamless operation with local OpenAI-compatible model servers. One-command setup, automatic Docker configuration, and intelligent container lifecycle management.



Why This Exists

OpenHands is a powerful AI coding agent, but running it on Windows with local models requires:

  • Complex Docker path translations (Windows → Docker POSIX paths)
  • Manual container lifecycle management
  • Custom runtime patches for Windows port ranges
  • Network configuration for Docker Desktop’s host.docker.internal

This project solves all of that with one command.


Features

  • One-Command Setup: setup.bat handles everything automatically
  • Local LLM Integration: Connect to any OpenAI-compatible server (default: localhost:8000)
  • Automatic Path Translation: Windows paths converted to Docker-compatible mounts
  • Container Lifecycle Management: Auto-cleanup prevents RAM buildup
  • Windows Runtime Patch: Custom docker_runtime.py with Windows-optimized port ranges
  • Interactive CLI: Simple Python client for conversational interactions
  • Model Auto-Detection: Automatically finds your local model ID
  • Persistent Workspace: Repository mounted as /workspace inside the container

Quick Start

Prerequisites

  • Windows 10/11
  • Docker Desktop (running)
  • Python 3.9+
  • Local LLM server at http://localhost:8000/v1 (OpenAI-compatible)

Installation

# Clone the repository
git clone https://github.com/Mhrnqaruni/openhands-windows.git
cd openhands-windows

# Run setup (installs Docker if missing, configures everything)
setup.bat

That’s it! Setup will:

  1. Check/install Docker Desktop
  2. Auto-detect your local model
  3. Convert Windows paths to Docker format
  4. Start OpenHands container with correct configuration
  5. Apply Windows runtime patch
  6. Clean up old containers
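Step 2, model auto-detection, can be sketched in Python. This is an illustrative sketch, not the actual setup.bat logic; it assumes the server follows the OpenAI /v1/models response shape, and the function names are hypothetical:

```python
import json
from urllib.request import urlopen

def pick_model_id(models_json: str) -> str:
    """Return the first model ID from an OpenAI-style /v1/models response."""
    data = json.loads(models_json)
    return data["data"][0]["id"]

def detect_local_model(base_url: str = "http://localhost:8000/v1") -> str:
    """Query the local server and return the first model it advertises."""
    with urlopen(f"{base_url}/models", timeout=5) as resp:
        return pick_model_id(resp.read().decode("utf-8"))
```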

Usage

Interactive Mode:

python "open hand\openhands_cli.py"

One-Shot Command:

python "open hand\openhands_cli.py" --once "Create a Python script that prints 'Hello World'"

Keep Container Running (no auto-cleanup):

python "open hand\openhands_cli.py" --no-auto-stop

Direct LLM Testing:

python chat.py
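chat.py's internals are not shown here, but a minimal direct test of an OpenAI-compatible server can be sketched as follows. The endpoint path, payload shape, and default API key are assumptions based on the OpenAI chat completions convention; the function names are illustrative:

```python
import json
from urllib.request import Request, urlopen

def build_chat_payload(model: str, user_message: str) -> dict:
    """Build a minimal OpenAI-style chat completion request body."""
    return {"model": model, "messages": [{"role": "user", "content": user_message}]}

def chat_once(base_url: str, model: str, message: str, api_key: str = "local-llm") -> str:
    """Send one chat turn to an OpenAI-compatible server and return the reply text."""
    req = Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_payload(model, message)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urlopen(req, timeout=60) as resp:
        reply = json.load(resp)
    return reply["choices"][0]["message"]["content"]
```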

How It Works

1. Automatic Setup (setup.bat)

setup.bat performs 7 critical steps:
├─ Check Docker installation (auto-install via winget if missing)
├─ Detect local LLM model from /v1/models endpoint
├─ Convert Windows path to Docker POSIX format
│  Example: C:\Users\You\project → /run/desktop/mnt/host/c/Users/You/project
├─ Start openhands-app container with proper env vars
├─ Patch docker_runtime.py with Windows port ranges
├─ Restart container to apply patch
└─ Clean up old runtime containers
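The path conversion in the third step follows a simple rule: lowercase the drive letter, flip backslashes to forward slashes, and prefix Docker Desktop's host-mount root. A simplified Python sketch of that rule (the real logic lives in setup.bat):

```python
def windows_to_docker_path(win_path: str) -> str:
    """Convert a Windows path to Docker Desktop's host-mount POSIX form,
    e.g. C:\\Users\\You\\project -> /run/desktop/mnt/host/c/Users/You/project."""
    drive, rest = win_path.split(":", 1)
    return "/run/desktop/mnt/host/" + drive.lower() + rest.replace("\\", "/")
```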

2. Custom Docker Runtime Patch

The included docker_runtime.py modifies OpenHands to use Windows-compatible port ranges:

import os
import platform

# Standard OpenHands (causes port conflicts on Windows)
EXECUTION_SERVER_PORT_RANGE = (30000, 39999)

# Our Windows patch
if os.name == 'nt' or platform.release().endswith('microsoft-standard-WSL2'):
    EXECUTION_SERVER_PORT_RANGE = (30000, 34999)
    VSCODE_PORT_RANGE = (35000, 39999)
    APP_PORT_RANGE_1 = (40000, 44999)
    APP_PORT_RANGE_2 = (45000, 49151)

3. CLI Client (openhands_cli.py)

  • Event Polling: Monitors the OpenHands API for agent responses
  • Automatic Cleanup: Removes runtime containers after each session
  • Conversation Management: Handles persistent conversation state
  • Error Recovery: Graceful handling of timeouts and network issues
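The event-polling pattern behind the first bullet can be sketched as below. The event shape (`id` field) and the fetch callback are assumptions for illustration, not the actual OpenHands API contract:

```python
import time

def new_events(events: list, last_seen: int) -> list:
    """Return only the events with an id newer than the last one handled."""
    return [e for e in events if e["id"] > last_seen]

def poll_events(fetch, last_seen: int = -1, interval: float = 2.0, rounds: int = 10):
    """Repeatedly call fetch() (which returns the full event list) and
    yield each event exactly once, in order."""
    for _ in range(rounds):
        for event in new_events(fetch(), last_seen):
            last_seen = event["id"]
            yield event
        time.sleep(interval)
```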

Configuration

Custom Model/Endpoint

# Set before running setup.bat
set LLM_MODEL=your-model-id
set LLM_BASE_URL=http://host.docker.internal:8000/v1
set LLM_API_KEY=your-api-key

setup.bat

Custom Workspace Folder

By default, the workspace mounted into the agent is this repo.
If you want the workspace to be a different folder, set WORKSPACE_DIR before running setup.bat:

set WORKSPACE_DIR=C:\Users\User\Desktop\Applying for Job

setup.bat

Environment Variables

  • LLM_MODEL (default: auto-detected): model ID from /v1/models
  • LLM_BASE_URL (default: http://host.docker.internal:8000/v1): LLM server endpoint
  • LLM_API_KEY (default: local-llm): API key for authentication
  • OPENHANDS_URL (default: http://localhost:3000): OpenHands server URL
  • WORKSPACE_DIR (default: repo root): Windows path to mount at /workspace

Examples

Connect to PostgreSQL from Agent

Your agent runs inside a Linux container, so Windows services are accessible via host.docker.internal:

# From inside the agent
apt-get update && apt-get install -y postgresql-client
psql "postgresql://user:password@host.docker.internal:5432/mydb" -c "SELECT version();"

Access Windows Files

The repository is automatically mounted at /workspace:

# Agent can read/write files in your repo
ls /workspace
cat /workspace/README.md

Technical Architecture

┌─────────────────────────────────────────────────────────────┐
│                     Windows Host                            │
│                                                             │
│  ┌──────────────┐         ┌─────────────────────────────┐   │
│  │  setup.bat   │───────▶│  Docker Desktop             │   │
│  └──────────────┘         │                             │   │
│                           │  ┌───────────────────────┐  │   │
│  ┌──────────────┐         │  │ openhands-app         │  │   │
│  │ CLI Client   │◀──────▶│  │ :3000 (API)           │  │   │
│  │ (Python)     │  HTTP   │  │                       │  │   │
│  └──────────────┘         │  │ + docker_runtime.py   │  │   │
│                           │  │   (patched)           │  │   │
│                           │  └───────┬───────────────┘  │   │
│                           │          │ spawns           │   │
│                           │          ▼                  │   │
│  ┌──────────────┐         │  ┌───────────────────────┐  │   │
│  │ Local LLM    │◀───────┼──│ openhands-runtime-*   │  │   │
│  │ :8000        │  API    │  │ (auto-cleaned)        │  │   │
│  └──────────────┘         │  └───────────────────────┘  │   │
│                           └─────────────────────────────┘   │
│                                                             │
│  /workspace ──────────────▶ mounted into containers        │
└─────────────────────────────────────────────────────────────┘

Troubleshooting

  • Docker not running: Start Docker Desktop, wait for it to be ready, then rerun setup.bat
  • Model not reachable: Ensure your LLM server is running at http://localhost:8000/v1
  • No agent responses: Check Docker logs: docker logs openhands-app
  • Commands failing in agent: Remember the agent uses Linux commands (ls, cat, not dir, type)
  • Port conflicts: Stop other services using ports 3000 and 30000-49151
  • Path not found: If you move the repo, just run setup.bat again to remount

Cleanup

# Recommended: Use cleanup script
cleanup.bat

# Manual cleanup (PowerShell; the ForEach-Object pipeline does not work in cmd.exe)
docker rm -f openhands-app
docker ps -aq --filter "name=openhands-runtime-" | ForEach-Object { docker rm -f $_ }

Project Structure

openhands-windows/
├── setup.bat                 # One-command setup script
├── cleanup.bat               # Container cleanup script
├── chat.py                   # Direct LLM testing utility
├── open hand/
│   ├── openhands_cli.py      # CLI client for OpenHands
│   ├── docker_runtime.py     # Windows-patched runtime
│   └── .venv/                # Python virtual environment
└── README.md

How This Differs from Standard OpenHands

Feature: Standard OpenHands → This Project

  • Windows Support: manual setup required → one-command automation
  • Path Translation: manual configuration → automatic Windows → Docker conversion
  • Local Models: cloud-focused → optimized for local LLM servers
  • Container Cleanup: manual → automatic lifecycle management
  • Port Ranges: Linux-optimized → Windows-compatible ranges
  • Setup Time: 30+ minutes → under 5 minutes

Contributing

Contributions welcome! Open an issue or a pull request if you see an area for improvement.

License

MIT License - See LICENSE for details


Acknowledgments

  • OpenHands - The underlying AI coding agent
  • Built for Windows users who want local LLM control
  • Special thanks to the Docker and Python communities

Author

Mehran Gharuni - GitHub

Built as part of demonstrating advanced Windows/Docker integration skills and local LLM deployment expertise.


Star this repo if it helped you run OpenHands on Windows!

v0.3.0 (beta)