Artificial intelligence is spreading its ample wings throughout the Android operating system, right down to Google’s decision to rebuild the assistant experience entirely and integrate it inside Android. It means Google Assistant has gone the way of the dinosaur, relegated to the history books as it’s replaced by the next big thing: Google Gemini. What better way to introduce the changes than to let Gemini tell you itself?
“Gemini, Google AI’s latest innovation, is set to redefine the Android user experience. By deeply integrating Gemini into Android’s core, users can now interact with the AI more naturally, getting assistance with tasks and information retrieval directly within apps. Gemini can even generate images and summarize calls or organize screenshots, all while prioritizing user privacy with on-device processing capabilities.
“This enhanced AI experience is available across a wide range of Android devices, with tailored interactions for foldables, making Gemini accessible and helpful in various contexts. With ongoing updates and expansions, Gemini aims to be the most widely available and versatile AI assistant, enhancing the Android experience for billions of users worldwide.”
That’s Gemini’s own summary of Google’s announcement, created through Google Docs, and it mostly gives you a good idea of what’s happening deep inside Android and what to expect from its abilities. However, it doesn’t go into detail about a few new features or what the update means for specific phones. For example, a new headline feature is Gemini Live, which provides a “mobile conversational experience” where you can ask just about anything, including complex questions, and apparently even brainstorm job ideas based on your experience and skills.
Unfortunately, Gemini Live will only be available to Gemini Advanced subscribers, so don’t expect to get it for free. However, Google says Gemini will be just a long-press of the power button away on many Android phones, much like Google Assistant has been. And since it understands context and what’s happening on your phone, it’ll be able to provide a lot more insight. It’ll examine what’s on-screen and provide specific information for you, even if you’re using an app or watching a YouTube video, in what appears to be a meaningful extension of the Circle to Search feature.
As Gemini mentioned in its summary, it can generate images based on what it sees, which can then be added to messages or emails. Google describes these features as “integrated,” so they’ll likely be available on all compatible Android phones as standard. Google also mentions a new feature called Call Notes, which summarizes phone calls, as a feature coming to the Pixel 9 series. Gemini will appear in its own window in split-screen mode on the Samsung Galaxy Z Fold 6 and on the Motorola Razr Plus 2024’s cover screen. Thanks to Gemini Nano, Google’s on-device model, sensitive information shared with the AI doesn’t leave your phone.
Gemini’s advancement into your Android phone is just one of the announcements made by Google at its Pixel 9 launch event, and you can be sure we’ll hear even more about it over the coming months as Google takes on Samsung and Apple’s mobile AI efforts.