OpenAI

OpenHands uses LiteLLM to make calls to OpenAI's chat models. See LiteLLM's documentation on using OpenAI as a provider for details.

Configuration

When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:

  • LLM Provider to OpenAI
  • LLM Model to the model you will be using
  • API Key to your OpenAI API key
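If you run OpenHands outside the UI (e.g. headless or CLI mode), the same values can typically be supplied in a config.toml file instead. A minimal sketch, assuming the standard [llm] keys (check your OpenHands version's config template for the exact names):

```toml
# Sketch of an [llm] section; key names assumed from the
# standard OpenHands config template.
[llm]
model = "gpt-4o"
api_key = "sk-..."
```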

Using OpenAI-Compatible Endpoints

Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoints. See LiteLLM's documentation on OpenAI-compatible endpoints for the full details.
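An "OpenAI-compatible" endpoint is any server that accepts the OpenAI chat completions request shape: a POST to /chat/completions with a JSON body containing model and messages. As a rough illustration (the base URL, model name, and API key below are placeholders, not values from this guide), here is how such a request is shaped, using only the Python standard library:

```python
import json
import urllib.request

# Placeholder endpoint; any server speaking the OpenAI chat
# completions protocol (e.g. a local vLLM server) would work.
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer sk-placeholder",  # your API key
        },
    )


req = build_chat_request("my-model", "Hello!")
# urllib.request.urlopen(req) would send it; skipped here because it
# needs a live endpoint.
```

OpenHands never makes this call directly; LiteLLM constructs the equivalent request for you once the Base URL and API Key are set in the Settings.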

Using an OpenAI Proxy

If you're using an OpenAI proxy, in the OpenHands UI through the Settings:

  1. Enable Advanced Options
  2. Set the following:
    • Custom Model to openai/<model-name> (e.g. openai/gpt-4o or openai/<proxy-prefix>/<model-name>)
    • Base URL to the URL of your OpenAI proxy
    • API Key to your OpenAI API key
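The Custom Model value follows LiteLLM's provider-prefix convention: the text before the first slash selects the provider (here openai, so the request uses the OpenAI wire format), and everything after it is passed through to your proxy as the model name. A minimal sketch of that split (the helper name is ours for illustration, not part of LiteLLM's API):

```python
def split_custom_model(custom_model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model).

    Only the first slash separates the provider, so proxy-prefixed
    names like "openai/<proxy-prefix>/<model-name>" pass the rest
    through unchanged.
    """
    provider, _, model = custom_model.partition("/")
    return provider, model


split_custom_model("openai/gpt-4o")            # ("openai", "gpt-4o")
split_custom_model("openai/my-proxy/gpt-4o")   # ("openai", "my-proxy/gpt-4o")
```

This is why a proxy that exposes models under its own prefix still works: the openai/ part only tells LiteLLM which protocol to speak, while the Base URL decides where the request actually goes.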