Android 17 could turn Gemini into your personal app butler

New AppFunctions and UI automation previews give a first look at AI that completes tasks for you in the background.


Google just gave us a real glimpse of how Android 17 might change the way you use your phone. New developer tools announced Tuesday let AI agents like Gemini dive directly into your installed apps to find photos, manage calendars, or book a multi-stop rideshare while you do something else.

The idea is simple. Instead of opening apps one by one, you tell an AI what you need. Google calls this the “agentic future,” and it’s landing in pieces starting now on the Galaxy S26 series and select Pixel 10 devices. A long press of the power button on those phones lets you hand off complex tasks to Gemini. The AI works across food delivery, grocery, and rideshare apps in the US and Korea to start.

Two ways Gemini takes control

Google is building this on two tracks. The first is AppFunctions, a framework that lets developers expose specific app features directly to AI. The Samsung Gallery integration on the Galaxy S26 shows how it works. You ask Gemini to “show me pictures of my cat from Samsung Gallery.” The AI finds and displays them. You never open the gallery app. It already works for calendar, notes, and tasks on devices from multiple manufacturers.
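To make that first track concrete, here is a toy sketch of the pattern Google describes, written in plain Java. This is not the real androidx AppFunctions API; the registry, the function name, and the photo library are all invented for illustration. The point is the shape of the idea: an app registers a named function, and the assistant calls it with structured arguments instead of driving the app's screens.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Illustrative sketch only: models the idea behind AppFunctions
// (apps expose named, callable features to an assistant), not the
// actual androidx.appfunctions API surface.
public class AppFunctionSketch {
    // Hypothetical registry mapping function names to handlers.
    private final Map<String, Function<Map<String, String>, List<String>>> registry = new HashMap<>();

    public void register(String name, Function<Map<String, String>, List<String>> handler) {
        registry.put(name, handler);
    }

    // The assistant invokes a function by name with structured arguments;
    // the app's UI is never opened.
    public List<String> invoke(String name, Map<String, String> args) {
        Function<Map<String, String>, List<String>> handler = registry.get(name);
        if (handler == null) {
            throw new IllegalArgumentException("No such app function: " + name);
        }
        return handler.apply(args);
    }

    public static void main(String[] args) {
        AppFunctionSketch functions = new AppFunctionSketch();

        // A hypothetical gallery app exposes photo search as a callable function.
        List<String> library = List.of("cat_on_sofa.jpg", "sunset.jpg", "cat_yawning.jpg");
        functions.register("gallery.findPhotos", params ->
                library.stream()
                        .filter(photo -> photo.contains(params.get("query")))
                        .collect(Collectors.toList()));

        // "Show me pictures of my cat" becomes a direct function call.
        System.out.println(functions.invoke("gallery.findPhotos", Map.of("query", "cat")));
        // prints [cat_on_sofa.jpg, cat_yawning.jpg]
    }
}
```

In this toy version the "query" argument does the work a user's spoken request would; the real framework layers on discovery, permissions, and schemas, but the core contract is the same: a feature the assistant can call without ever rendering the app.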


The second track is broader. For apps without dedicated integrations, Google is testing a UI automation framework. It lets Gemini execute generic tasks. The beta launches on the same devices, supporting a curated set of apps in food delivery, grocery, and rideshare categories. The AI handles the multi-step work using your existing app context.

You stay in the driver’s seat

Letting an AI loose inside your apps sounds like a privacy risk. Google says it designed these features with privacy and security as the foundation. When Gemini runs a task through UI automation, you can watch its progress via notifications or a live view. If something looks wrong, you jump in and take over manually.

Sensitive actions get extra guardrails. Gemini alerts you before completing things like a purchase. The actual work happens on your device, not a remote server. Google frames this as user control baked into the experience. The goal is to make automation feel helpful, not creepy.

Android 17 and what comes next

This is still early. Google is starting with a small set of developers to iron out the experience. The UI automation preview is limited to specific devices and app categories in just two countries. But the roadmap points to Android 17 as the moment these capabilities broaden to more users, developers, and device makers.

For now, if you have a Galaxy S26 or a supported Pixel 10 model, you can try the beta when it launches. For everyone else, the takeaway is simple. Your phone is about to get smarter about handling tedious stuff. The shift from opening apps to telling AI what you need is coming, and Android 17 later this year will likely be when it starts to feel normal.

Paulo Vargas