Provider integrations are shared between the Agent SDK and the OpenHands UI. The pages linked below live under the historical OpenHands section, but they apply verbatim to SDK applications because both layers wrap the same `openhands.sdk.llm.LLM` interface.
| Provider / scenario | Documentation |
| --- | --- |
| OpenHands hosted models | /openhands/usage/llms/openhands-llms |
| OpenAI | /openhands/usage/llms/openai-llms |
| Azure OpenAI | /openhands/usage/llms/azure-llms |
| Google Gemini / Vertex | /openhands/usage/llms/google-llms |
| Groq | /openhands/usage/llms/groq |
| OpenRouter | /openhands/usage/llms/openrouter |
| Moonshot | /openhands/usage/llms/moonshot |
| LiteLLM proxy | /openhands/usage/llms/litellm-proxy |
| Local LLMs (Ollama, SGLang, vLLM, LM Studio) | /openhands/usage/llms/local-llms |
| Custom LLM configurations | /openhands/usage/llms/custom-llm-configs |
When you follow any of these guides while building with the SDK, create an `LLM` object using the documented parameters (for example, API keys, base URLs, or custom headers) and pass it into your agent or registry. The OpenHands UI is simply a convenience layer on top of the same configuration model.
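As a rough sketch of that pattern, the snippet below constructs an `LLM` pointed at an OpenAI-compatible endpoint and hands it to an agent. The exact constructor fields (`model`, `base_url`, `api_key`) and the `Agent` import path are assumptions based on the `openhands.sdk.llm.LLM` interface mentioned above; the model name and environment variable are placeholders — check the provider guide you are following for the real values.

```python
import os

from pydantic import SecretStr

# Assumed import paths; adjust to your installed SDK version.
from openhands.sdk import LLM, Agent

# Build the LLM exactly as the provider guide documents it:
# the same parameters the OpenHands UI would collect in its settings form.
llm = LLM(
    model="openai/gpt-4o",                      # placeholder model identifier
    base_url="https://api.openai.com/v1",       # or your proxy / local endpoint
    api_key=SecretStr(os.environ["LLM_API_KEY"]),
)

# Pass the configured LLM into your agent; tools and other
# agent options are omitted here for brevity.
agent = Agent(llm=llm, tools=[])
```

Because every provider guide ultimately describes parameters on this one `LLM` object, switching providers is typically just a matter of changing `model`, `base_url`, and the credential — the agent code stays the same.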