DroidRun CLI Guide: Using OpenAILike, Ollama, and Gemini Providers
DroidRun lets you control Android devices using natural language and LLM agents. This guide explains how to use the CLI with three popular LLM providers: OpenAILike, Ollama, and Gemini.

Prerequisites
1. Install DroidRun and its dependencies, choosing whichever provider you'd like to use.
2. Ensure your Android device is connected and the DroidRun Portal is installed: download and install the DroidRun Portal APK, then check that everything is set up correctly.
3. If you want to use a provider that's not included with our extras, install the required LlamaIndex LLM integrations.
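As a sketch, installing DroidRun together with the LlamaIndex LLM integrations for the three providers covered here might look like the following. The integration package names follow LlamaIndex's usual `llama-index-llms-<provider>` convention; check PyPI for the exact package your provider needs.

```shell
# Install DroidRun itself (assumes it is published on PyPI as "droidrun")
pip install droidrun

# LlamaIndex LLM integrations, one per provider -- install only what you use
pip install llama-index-llms-openai-like   # OpenAILike
pip install llama-index-llms-ollama        # Ollama
pip install llama-index-llms-google-genai  # Gemini via GoogleGenAI
```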
General CLI Usage

The main command to run is `droidrun`, which accepts your task along with the following options:

- `--provider`/`-p`: LLM provider (`OpenAILike`, `Ollama`, `GoogleGenAI`, etc.)
- `--model`/`-m`: Model name (varies by provider)
- `--base_url`/`-u`: Base URL for the API (for Ollama/OpenAILike)
- `--api_base`: API base URL (for OpenAI/OpenAILike)
- `--temperature`: LLM temperature (default: 0.2)
- `--vision`: Enable screenshot-based vision (flag)
- `--reasoning`: Enable planning with reasoning (flag)
- `--reflection`: Enable a reflection step (flag)
- `--tracing`: Enable tracing (flag)
- `--debug`: Verbose logging (flag)
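Putting the options together, a full invocation might look like the sketch below. It assumes the task is passed as a positional argument; the task text and model name are illustrative placeholders, not required values.

```shell
# A hypothetical end-to-end run: vision and reasoning enabled, verbose logs on
droidrun "Open the Settings app and enable dark mode" \
  --provider GoogleGenAI \
  --model gemini-2.0-flash \
  --temperature 0.2 \
  --vision --reasoning --debug
```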
Provider-Specific Examples

Below are provider-specific droidrun CLI usage examples.

Additional Tips
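As a sketch, one example per provider (model names, URLs, and the task text are placeholders, not required values):

```shell
# Ollama: assumes a local Ollama server on its default port
droidrun "Open the Settings app" \
  --provider Ollama --model llama3.1 --base_url http://localhost:11434

# OpenAILike: any OpenAI-compatible endpoint (vLLM, LM Studio, etc.);
# the URL and model name are placeholders for your own deployment
droidrun "Open the Settings app" \
  --provider OpenAILike --model my-local-model --api_base http://localhost:8000/v1

# Gemini: assumes your Google API key is set in the environment
droidrun "Open the Settings app" \
  --provider GoogleGenAI --model gemini-2.0-flash
```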
- Use `droidrun devices` to list connected devices.
- Use `--vision` to enable screenshot-based vision for more complex tasks.
- Use `--debug` for detailed logs when troubleshooting.
- For iOS, add `--ios` and specify the device URL.
Troubleshooting
- No devices found: Ensure your device is connected and authorized for ADB.
- Provider errors: Check that you have installed the correct LlamaIndex integration and set the required API keys.
- Model not found: Double-check the model name for your provider.
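A quick way to work through the first two checks (these commands assume ADB and Python are on your PATH; the module and environment-variable names are examples for Ollama and Gemini, and your provider may use different ones):

```shell
# No devices found: confirm ADB sees the device and it is authorized,
# then confirm DroidRun sees it too
adb devices
droidrun devices

# Provider errors: confirm the LlamaIndex integration is importable and
# the required API key is set in your environment
python -c "import llama_index.llms.ollama"   # example: Ollama integration
echo "$GOOGLE_API_KEY"                       # example: key for GoogleGenAI
```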
Happy automating!