DroidRun CLI Guide: Using OpenAILike, Ollama, and Gemini Providers

DroidRun lets you control Android devices using natural language and LLM agents. This guide explains how to use the CLI with three popular LLM providers: OpenAILike, Ollama, and Gemini.


Prerequisites

1. Install DroidRun and its dependencies:

pip install droidrun

2. Ensure your Android device is connected and the DroidRun Portal is installed. Download and install the DroidRun Portal APK, then run the setup command:

droidrun setup

Check that everything is set up correctly:

droidrun ping

3. Install the required LlamaIndex LLM integrations for your provider:

pip install llama-index-llms-openai llama-index-llms-ollama llama-index-llms-gemini

General CLI Usage

The main command to run is:

droidrun run "<your natural language command>" [OPTIONS]

Key options:

  • --provider/-p: LLM provider (OpenAILike, Ollama, GoogleGenAI, etc.)
  • --model/-m: Model name (varies by provider)
  • --base_url/-u: Base URL for API (for Ollama/OpenAILike)
  • --api_base: API base URL (for OpenAI/OpenAILike)
  • --temperature: LLM temperature (default: 0.2)
  • --vision: Enable screenshot-based vision (flag)
  • --reasoning: Enable planning with reasoning (flag)
  • --reflection: Enable reflection step (flag)
  • --tracing: Enable tracing (flag)
  • --debug: Verbose logging (flag)
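As an illustration, several of the options above can be combined in a single invocation. The task text, model name, and base URL below are placeholder values, not values taken from the DroidRun documentation:

```shell
# Run a natural-language task with vision and verbose logging enabled.
# Model name and base URL are example values; substitute your own.
droidrun run "Open Settings and enable dark mode" \
  --provider Ollama \
  --model llama3 \
  --base_url http://localhost:11434 \
  --vision \
  --debug
```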

Provider-Specific Examples

The examples below show how to invoke the DroidRun CLI with each of the three providers covered in this guide.
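A minimal sketch for each provider follows. The model names, endpoint URLs, and API-key environment-variable names are assumptions for illustration; check your provider's documentation for the exact values:

```shell
# Ollama: assumes a local Ollama server on its default port.
droidrun run "Open the calculator app" \
  --provider Ollama --model llama3 \
  --base_url http://localhost:11434

# OpenAILike: points at any OpenAI-compatible endpoint.
# The URL, model name, and key variable are placeholders.
export OPENAI_API_KEY="your-key-here"
droidrun run "Open the calculator app" \
  --provider OpenAILike --model my-local-model \
  --api_base http://localhost:8000/v1

# Gemini via the GoogleGenAI provider: the environment-variable
# name shown here is an assumption.
export GOOGLE_API_KEY="your-key-here"
droidrun run "Open the calculator app" \
  --provider GoogleGenAI --model gemini-2.0-flash
```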

Additional Tips

  • Use droidrun devices to list connected devices.
  • Use --vision to enable screenshot-based vision for more complex tasks.
  • Use --debug for detailed logs if troubleshooting.
  • For iOS, add --ios and specify the device URL.

Troubleshooting

  • No devices found: Ensure your device is connected and authorized for ADB.
  • Provider errors: Check that you have installed the correct LlamaIndex integration and set the required API keys.
  • Model not found: Double-check the model name for your provider.

Happy automating!
