[ English | 中文 ]
TurnOnBluetoothAndWIFI_en_2x.mp4
The user inputs a task description on the Web interface, and Mobile Use automatically operates the mobile phone to complete the task.
In the AndroidWorld dynamic evaluation environment, we evaluated the Mobile Use agent solution with the multimodal large language model Qwen2.5-VL-72B-Instruct and achieved a 38% success rate.
- Auto-operating the phone: Automatically operates the UI to complete tasks based on the user's task description.
- Smart Element Recognition: Automatically parses GUI layouts and identifies operational targets.
- Complex Task Processing: Supports decomposition of complex tasks and multi-step operations.
mobile-use requires ADB to control the phone, so the relevant tools must be installed and the phone must be connected to the computer via USB beforehand.
- Step 1. Download SDK Platform-Tools for Desktop, click here.
- Step 2. Unzip the downloaded file and add the platform-tools path to the environment variables.
  - Windows: You can add the platform-tools path to the Path environment variable through the graphical interface (see here) or through the command line as follows:
    setx PATH "%PATH%;D:\your\download\path\platform-tools"
  - Mac/Linux:
    $ echo 'export PATH=/your/downloads/path/platform-tools:$PATH' >> ~/.bashrc
    $ source ~/.bashrc
- Step 3. Open a command line terminal and run adb devices (Windows: adb.exe devices) to verify that adb is available. If the device serial_no is listed, the connection is successful. The correct output is as follows:
List of devices attached
a22d0110 device
With pip (Python>=3.10):
pip install mobile-use
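Alternatively, if you want to try the latest changes, the package can usually be installed from a source checkout as well; this is a sketch that assumes the repository at https://github.com/MadeAgents/mobile-use ships a standard Python build configuration:

```shell
# Assumption: the repository is pip-installable from a source checkout
git clone https://github.com/MadeAgents/mobile-use.git
cd mobile-use
pip install -e .
```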
Launch the WebUI service:
python -m mobile_use.webui
Once the service starts successfully, open the address http://127.0.0.1:7860 in your browser to access the WebUI page, as shown below:
Click VLM Configuration to set the Base URL and API Key of the multimodal large language model. It is recommended to use a multimodal large language model from the Qwen2.5-VL series.
Enter the task description in the input box at the lower left corner and click Start to execute the task.
Case 1: Search for the latest news about DeepSeek-R2 in the Xiaohongshu APP and forward one of the news items to the Weibo App
search_forward_2x.mp4
Case 2: Order 2 Luckin coffees with Meituan, 1 hot raw coconut latte with standard sweetness, and 1 cold light jasmine
order_coffee_en_2x.mp4
Case 3: Order a cup of coffee with Meituan, iced, standard sugar
demo01_2x.mp4
Case 4: Order 2 cups of Luckin coffee for me with Meituan, raw coconut latte with standard sugar, hot
order_coffee_zh_2x.mp4
Case 5: Find a picture of the OPPO Find N5 in the browser, ask the DeepSeek app for an introduction to this phone, and post the picture and the introduction via Xiaohongshu
demo03_2x.mp4
Case 6: Check the price of the OPPO Find N5 in the OPPO Store, JD.com, and Taobao respectively
oppofindn5_price_zh_2x.mp4
📱 Mobile Settings
The Android ADB Server Host and Android ADB Server Port options let you specify the address and port of the Android ADB server, which can be used for remote device connections or for a local ADB server running on a non-default port. When multiple devices are connected, you need to specify the Device Serial No. The Reset to HOME option indicates whether to return the phone to the home screen before executing the task. If you want to continue the previous task, uncheck this option.
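One common remote scenario is attaching the phone over Wi-Fi instead of USB. The sketch below uses standard adb commands; the IP address is a placeholder for your phone's address, and the phone must first be connected via USB with USB debugging enabled:

```shell
# Switch the USB-connected device to TCP/IP mode (5555 is the conventional port)
adb tcpip 5555
# Connect to the phone over the network (replace with your phone's IP address)
adb connect 192.168.1.100:5555
# The device should now appear in the device list
adb devices
```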
⚙️ Agent Settings
The Max Run Steps parameter specifies the maximum number of iteration steps for the Agent. If the current task exceeds this number, the task is stopped, so it is advisable to set a larger value for complex tasks with many operation steps. The Maximum Latest Screenshot parameter controls how many of the most recent screenshots the Agent can see. Because images consume many tokens, only the latest Maximum Latest Screenshot screenshots are sent to the VLM when a task involves many steps, and the next operation is generated from them. The Maximum Reflection Action parameter controls the maximum number of reflections the Agent may perform. The larger the value, the higher the Agent's fault tolerance, but the longer the task takes to process.
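As an illustration only (this is not the mobile-use internals), a bounded buffer like the one below captures the idea behind Maximum Latest Screenshot: older screenshots are dropped so that only the newest N images are attached to the request sent to the VLM:

```python
from collections import deque

# Illustrative sketch, not library code: keep only the newest N screenshots
MAX_LATEST_SCREENSHOT = 3  # corresponds to the "Maximum Latest Screenshot" setting
screenshots = deque(maxlen=MAX_LATEST_SCREENSHOT)

for step in range(10):
    screenshots.append(f"screenshot_step_{step}.png")  # placeholder for a captured frame

# Only the 3 most recent screenshots remain and would be sent to the VLM
print(list(screenshots))
```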
🔧 VLM Configuration
Click VLM Configuration to specify the Base URL and API Key of the multimodal large language model, as well as the model name and temperature coefficient. It is recommended to use a multimodal large language model from the Qwen2.5-VL series.
import os
from dotenv import load_dotenv
from mobile_use.scheme import AgentState
from mobile_use import Environment, VLMWrapper, Agent
from mobile_use.logger import setup_logger

load_dotenv()
setup_logger(name='mobile_use')

# Create the environment controller (use the serial number reported by `adb devices`)
env = Environment(serial_no='a22d0110')

# Wrap the multimodal large language model
vlm = VLMWrapper(
    model_name="qwen2.5-vl-72b-instruct",
    api_key=os.getenv('VLM_API_KEY'),
    base_url=os.getenv('VLM_BASE_URL'),
    max_tokens=128,
    max_retry=1,
    temperature=0.0
)

# Create the agent
agent = Agent.from_params(dict(type='default', env=env, vlm=vlm, max_steps=3))

# Task description (example task; replace with your own)
goal = 'Turn on Bluetooth and WiFi'

going = True
input_content = goal
while going:
    going = False
    for step_data in agent.iter_run(input_content=input_content):
        print(step_data.action, step_data.thought)
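The snippet above reads the model endpoint and key from environment variables via python-dotenv, so a minimal .env file in the working directory is sufficient. The values below are placeholders; use the Base URL and API Key of your own model provider:

```
VLM_API_KEY=your-api-key
VLM_BASE_URL=https://your-model-endpoint/v1
```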
- Improve agent memory and reflection (summarization, compression)
- Provide a multi-agent implementation
- Provide an evaluation process for the AndroidWorld dynamic environment
- Develop an APP that can be installed directly on the phone
We welcome all forms of contributions! Please read our contribution guide to learn about:
- How to submit an issue to report problems.
- The process of participating in feature development; see the Developer Document.
- Code style and quality standards; see the Developer Document.
- Methods for suggesting documentation improvements.
This project is licensed under the MIT License, which permits free use and modification of the code but requires retaining the original copyright notice.
If you have used this project in your research or work, please cite:
@software{mobile-use,
  title = {Mobile Use: Your AI assistant for mobile - Any app, any task},
  author = {Jiamu Zhou and Xiaoyun Mo and Ning Li and Qiuying Peng},
  year = {2025},
  publisher = {GitHub},
  url = {https://github.com/MadeAgents/mobile-use}
}
This project benefits from the contributions of:
- Inspiration from browser-use
- The multimodal large language model for the agent is based on Qwen2.5-VL
- The Web UI is built on Gradio
Thanks for their wonderful work.