Getting started with running OpenHands on your own.
A system with a modern processor and a minimum of 4GB RAM is recommended to run OpenHands.
MacOS

- Docker Desktop: open Docker Desktop, go to Settings > Advanced, and ensure "Allow the default Docker socket to be used" is enabled.

Linux

Windows

- WSL: run `wsl --version` in PowerShell and confirm `Default Version: 2`.
- Docker Desktop: open Docker Desktop, go to Settings, and confirm the following:
  - "Use the WSL 2 based engine" is enabled.
  - "Enable integration with my default WSL distro" is enabled.

The docker command below to start the app must be run inside the WSL terminal.
Alternative: Windows without WSL
If you prefer to run OpenHands on Windows without WSL or Docker, see our Windows Without WSL Guide.
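With Docker configured, start the app. The command below is a minimal sketch of a Docker-based install; the registry path, image tags, environment variable, and mounts shown here are assumptions, so check the OpenHands installation docs for the exact, current command.

```bash
# Minimal sketch of starting OpenHands with Docker (registry path and tags are assumptions;
# replace $VERSION with the release you want).
# -v docker.sock  : lets OpenHands launch its sandbox runtime container
# -v ~/.openhands : persists settings and conversation history
# --add-host      : lets the container reach services on the host (e.g. a local LLM)
docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:$VERSION \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands:/.openhands \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:$VERSION
```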
Note: If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your conversation history to the new location.
You’ll find OpenHands running at http://localhost:3000!
After launching OpenHands, you must select an LLM Provider and LLM Model, and enter a corresponding API Key. This can be done during the initial settings popup or by selecting the Settings button (gear icon) in the UI.
If the required model does not exist in the list, you can toggle Advanced options in Settings under the LLM tab and manually enter it with the correct prefix in the Custom Model text box. The Advanced options also allow you to specify a Base URL if required.
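For example, the Advanced fields for an OpenAI-compatible endpoint might look like the following. These are illustrative values only; the openai/ prefix follows LiteLLM-style model naming, which is an assumption here, so verify the correct prefix for your provider.

```
Custom Model: openai/my-model-name        # provider prefix + model identifier (hypothetical)
Base URL:     https://llm.example.com/v1  # only needed for self-hosted or proxied endpoints
```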
OpenHands requires an API key to access most language models. Here’s how to get an API key from the recommended providers:
- Anthropic (Claude)
- Google (Gemini)
- Local LLM (e.g. LM Studio, llama.cpp, Ollama): if your local LLM server isn't behind an authentication proxy, you can enter any value as the API key (e.g. `local-key`, `test123`); it won't be used.
Consider setting usage limits to control costs.
Effective use of local models for agent tasks requires capable hardware, along with models specifically tuned for instruction-following and agent-style behavior.
To run OpenHands with a locally hosted language model instead of a cloud provider, see the Local LLMs guide for setup instructions.
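As a concrete sketch, connecting a Docker-based OpenHands install to an Ollama server running on the host might use values like these. The model name, ollama/ prefix, and host.docker.internal address are assumptions; the Local LLMs guide has the exact steps.

```
Custom Model: ollama/llama3                      # hypothetical local model
Base URL:     http://host.docker.internal:11434  # Ollama's default port, reached from inside the container
API Key:      local-key                          # any placeholder value works without an auth proxy
```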
OpenHands can be configured to use a search engine to allow the agent to search the web for information when needed.
To enable search functionality in OpenHands, go to the LLM tab in Settings and enter your Search API Key (Tavily).
For more details, see the Search Engine Setup guide.
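The key goes into that field as-is; Tavily keys typically look like the placeholder below.

```
Search API Key (Tavily): tvly-xxxxxxxxxxxxxxxx   # placeholder; real keys come from Tavily
```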
The docker command above pulls the most recent stable release of OpenHands. You have other options as well:

- To pin to a specific release, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with the version number. For example, `0.9` will automatically point to the latest `0.9.x` release, and `0` will point to the latest `0.x.x` release.
- To use the latest development build, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with `main`. This version is unstable and is recommended for testing or development purposes only.
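For instance, assuming the image path used in the start command above (the registry path is an assumption), the tag choices look like this:

```
docker.all-hands.dev/all-hands-ai/openhands:0.9    # latest 0.9.x stable release
docker.all-hands.dev/all-hands-ai/openhands:0      # latest 0.x.x stable release
docker.all-hands.dev/all-hands-ai/openhands:main   # unstable build from the main branch
```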