DroidRun CLI Guide: Using OpenAILike, Ollama, and Gemini Providers
DroidRun lets you control Android devices using natural language and LLM agents. This guide explains how to use the CLI with three popular LLM providers: OpenAILike, Ollama, and Gemini.

Prerequisites
- Ensure your Android device is connected and the DroidRun Portal is installed
- Download & install the DroidRun Portal APK
- Check that everything is set up correctly
General CLI Usage
The main command accepts the following options:

- `--provider`/`-p`: LLM provider (`OpenAILike`, `Ollama`, `GoogleGenAI`, etc.)
- `--model`/`-m`: Model name (varies by provider)
- `--base_url`/`-u`: Base URL for the API (for Ollama/OpenAILike)
- `--api_base`: API base URL (for OpenAI/OpenAILike)
- `--temperature`: LLM temperature (default: 0.2)
- `--vision`: Enable screenshot-based vision (flag)
- `--reasoning`: Enable planning with reasoning (flag)
- `--reflection`: Enable a reflection step (flag)
- `--tracing`: Enable tracing (flag)
- `--debug`: Verbose logging (flag)
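As a minimal sketch, a natural-language task is passed to the CLI together with these options. The `run` subcommand, task text, and model name below are assumptions and may differ across DroidRun versions; check `droidrun --help` for your installed version.

```shell
# Run a natural-language task on the connected device.
# Subcommand, task text, and model name are illustrative -- adjust for your setup.
droidrun run "Open Settings and enable dark mode" \
  --provider Ollama \
  --model llama3.1 \
  --temperature 0.2 \
  --vision \
  --debug
```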
Provider specific examples
Below are provider-specific DroidRun CLI usage examples.
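The following is a sketch for each of the three providers. The `run` subcommand, model names, ports, and the Gemini API-key variable are assumptions that may vary by DroidRun version and local setup.

```shell
# OpenAILike: any OpenAI-compatible server (model name and URL are examples)
droidrun run "Open the settings app" \
  --provider OpenAILike \
  --model qwen2.5-7b-instruct \
  --api_base http://localhost:8000/v1

# Ollama: assumes a local Ollama server on its default port (11434)
droidrun run "Open the settings app" \
  --provider Ollama \
  --model llama3.1 \
  --base_url http://localhost:11434

# Gemini: the API-key variable name is an assumption -- check your provider docs
export GEMINI_API_KEY="your-key-here"
droidrun run "Open the settings app" \
  --provider GoogleGenAI \
  --model gemini-2.0-flash
```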
Additional Tips

- Use `droidrun devices` to list connected devices.
- Use `--vision` to enable screenshot-based vision for more complex tasks.
- Use `--debug` for detailed logs when troubleshooting.
- For iOS, add `--ios` and specify the device URL.
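The tips above can be combined in one session. The task text below is a placeholder, and the `run` subcommand is an assumption; adjust to your installed version.

```shell
# First confirm a device is connected, then run a vision-enabled task
# with verbose logging for easier troubleshooting.
droidrun devices
droidrun run "Open the camera and take a photo" --vision --debug
```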
Troubleshooting
- No devices found: Ensure your device is connected and authorized for ADB.
- Provider errors: Check that you have installed the correct LlamaIndex integration and set the required API keys.
- Model not found: Double-check the model name for your provider.
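For the "no devices found" case, ADB itself can confirm the connection before involving DroidRun (these are standard `adb` commands, not DroidRun-specific):

```shell
# The device should appear with state "device", not "unauthorized" or "offline"
adb devices

# If the list is empty, restarting the ADB server often helps
adb kill-server && adb start-server
```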
Happy automating!