Gemini - Google AI Studio Configs
When running OpenHands, you’ll need to set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `Gemini`
- `LLM Model` to the model you will be using. If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model` (e.g. `gemini/<model-name>` like `gemini/gemini-2.0-flash`)
- `API Key` to your Gemini API key
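Before entering the key in the UI, you can sanity-check it against Google's Generative Language API directly. A minimal sketch, assuming your key is exported as `GEMINI_API_KEY` (this variable name and the example model are illustrative, not taken from this document):

```shell
# Send a one-off prompt to the Gemini API to confirm the key and model name work.
# Assumes GEMINI_API_KEY holds your Google AI Studio API key.
curl -s -X POST \
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${GEMINI_API_KEY}" \
  -H 'Content-Type: application/json' \
  -d '{"contents":[{"parts":[{"text":"Hello"}]}]}'
```

A successful response returns JSON containing the model's reply; an invalid key returns an error object instead.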
VertexAI - Google Cloud Platform Configs
To use Vertex AI through Google Cloud Platform when running OpenHands, you’ll need to pass your GCP credentials as environment variables using `-e` in the `docker run` command, then set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `VertexAI`
- `LLM Model` to the model you will be using. If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model` (e.g. `vertex_ai/<model-name>`)
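The `-e` flags mentioned above can be sketched as follows. The variable names (`GOOGLE_APPLICATION_CREDENTIALS`, `VERTEXAI_PROJECT`, `VERTEXAI_LOCATION`) and the image placeholder are assumptions based on litellm's Vertex AI provider, not taken from this document — check the OpenHands run instructions for the exact command:

```shell
# Pass GCP service-account credentials into the container.
# Variable names assume litellm's Vertex AI provider; replace the placeholders.
docker run -it --rm \
  -e GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>" \
  -e VERTEXAI_PROJECT="<your-gcp-project-id>" \
  -e VERTEXAI_LOCATION="<your-gcp-location>" \
  <openhands-image>
```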