OpenHands uses LiteLLM to make calls to Google's chat models. See LiteLLM's documentation on using Google as a provider: Gemini (Google AI Studio) and VertexAI (Google Cloud Platform).
To use Gemini through Google AI Studio, set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `Gemini`
- `LLM Model` to the model you will be using. If the model is not in the list, enable `Advanced` options and enter it in `Custom Model` (e.g. `gemini/<model-name>`, such as `gemini/gemini-2.0-flash`).
- `API Key` to your Gemini API key
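For reference, the model identifier entered in `Custom Model` is the same string OpenHands passes to LiteLLM. A minimal sketch of that underlying call, assuming the `litellm` package is installed and that LiteLLM reads the key from the `GEMINI_API_KEY` environment variable (an assumption; OpenHands normally sets the key for you from the `API Key` field):

```python
import os
import litellm

# Assumption: LiteLLM's Google AI Studio provider reads the API key from
# GEMINI_API_KEY; OpenHands fills this in from the `API Key` setting.
os.environ["GEMINI_API_KEY"] = "<your-gemini-api-key>"

# The model string uses the same gemini/<model-name> format as Custom Model.
response = litellm.completion(
    model="gemini/gemini-2.0-flash",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```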
To use VertexAI through Google Cloud Platform, set the required Google Cloud environment variables using `-e` in the docker run command.
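A minimal sketch of the variables typically involved; the exact names here are an assumption based on LiteLLM's Vertex AI provider, so confirm them against the LiteLLM documentation:

```python
import os

# Assumption: these are the variable names LiteLLM's Vertex AI provider reads;
# pass each one to the OpenHands container with `-e NAME="value"` on docker run.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "<path-to-or-json-dump-of-service-account-key>"
os.environ["VERTEXAI_PROJECT"] = "<your-gcp-project-id>"
os.environ["VERTEXAI_LOCATION"] = "<your-gcp-location>"
```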
Then set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `VertexAI`
- `LLM Model` to the model you will be using. If the model is not in the list, enable `Advanced` options and enter it in `Custom Model` (e.g. `vertex_ai/<model-name>`).
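As with Gemini, the `vertex_ai/<model-name>` string is what OpenHands passes to LiteLLM. A minimal sketch of the underlying call, assuming `litellm` is installed and that the project and location can be set on the `litellm` module (an assumption based on LiteLLM's Vertex AI documentation), with `gemini-2.0-flash` used as an example model:

```python
import litellm

# Assumption: project and location can be supplied directly to LiteLLM
# instead of (or in addition to) the environment variables above.
litellm.vertex_project = "<your-gcp-project-id>"
litellm.vertex_location = "<your-gcp-location>"

# The model string uses the same vertex_ai/<model-name> format as Custom Model.
response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```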