Google is bringing a handful of new features to Android phones, including tools to keep users safe during natural disasters, AI-powered accessibility enhancements, and easier music discovery. At the same time, the company has reached a critical milestone with Android 15, pushing it closer to its public release in the coming weeks.
Keeping users safe during earthquakes
Google says its earthquake alert system is now available to users across all U.S. states and territories, with the rollout expected to reach everyone within the next few weeks. The company has been testing the system, which relies on vibration readings from a phone’s accelerometer, since 2020.
Once the onboard sensors detect vibrations consistent with an earthquake, your phone instantly checks crowdsourced data from the Android Earthquake Alerts System to confirm whether an earthquake is happening and, if so, sends an alert.
“Android Earthquake Alert System can provide early warnings seconds before shaking,” says the company. Once it’s clear that an earthquake is happening and its magnitude is measured at 4.5 or higher, the system sends out one of two kinds of alerts based on the severity of the expected shaking.
The first is the “Be Aware” alert, which essentially warns users to brace themselves in case light shaking turns into something more violent. The “Take Action” warning pops up when the shaking is strong enough that users should immediately seek cover.
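Google hasn’t published the exact thresholds behind the two tiers, but the general idea can be illustrated with a short sketch. The magnitude and shaking-intensity cutoffs below are hypothetical stand-ins, not the values the Android Earthquake Alerts System actually uses.

```kotlin
// Hypothetical sketch of two-tier earthquake alerting.
// The cutoffs are illustrative only, not the thresholds Google uses.
enum class QuakeAlert { NONE, BE_AWARE, TAKE_ACTION }

fun classifyAlert(magnitude: Double, expectedShakingIntensity: Int): QuakeAlert = when {
    magnitude < 4.5 -> QuakeAlert.NONE                        // below the alerting threshold
    expectedShakingIntensity >= 6 -> QuakeAlert.TAKE_ACTION   // strong shaking expected: seek cover now
    else -> QuakeAlert.BE_AWARE                               // lighter shaking expected: stay alert
}

fun main() {
    println(classifyAlert(magnitude = 5.2, expectedShakingIntensity = 4)) // BE_AWARE
    println(classifyAlert(magnitude = 6.1, expectedShakingIntensity = 7)) // TAKE_ACTION
}
```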
In addition to the alerts, the system offers a dashboard with further instructions to help users stay safe. Earthquake alerts are enabled by default on Android phones.
Music discovery with an AI boost
One of my favorite Assistant features has been hum to search, which lets users hum or whistle a tune so it can find the track on the web. Of course, it works even better if you sing the song or simply hold your phone close to a sound source like a speaker. The whole system is now getting an AI boost.
Remember “Circle to Search,” a feature that lets you do a web search for any item appearing on your phone’s screen by simply highlighting it? Well, there’s now an audio recognition element to it. Simply pull up the Circle to Search interface by long-pressing on the home button at the bottom (or the navigation bar) and tap the newly added music icon.
Once the AI identifies the track, it will automatically pull up the right song with a YouTube link. The idea here is that you don’t have to hum or use another device or app for music identification. You just summon the AI, activate the audio identifier, and get the job done — all on the same screen.
Accessibility updates, Chrome’s reader mode, and more
Android’s TalkBack system is a fantastic accessibility feature that provides audio descriptions of everything on your phone’s display. Now, Google is using its Gemini AI to make those descriptions more detailed and natural-sounding, whether you’re looking at a webpage, a picture from your local gallery, or a social media post.
On a similar note, the Chrome browser on Android is getting a read-aloud feature. In addition to having the contents of a page read out, users will be able to change the language, pick a narration voice of their choice, and adjust the reading speed.
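Chrome’s own implementation isn’t public, but on Android those controls (language, voice, and reading speed) map directly onto the platform’s TextToSpeech API. The snippet below is a minimal illustration of that idea using the public API; it is not Chrome’s actual code.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import java.util.Locale

// Minimal read-aloud helper built on Android's public TextToSpeech API.
// It only illustrates the kind of controls Chrome exposes (language,
// voice, speed); it is not Chrome's implementation.
class PageReader(context: Context) : TextToSpeech.OnInitListener {

    private val tts = TextToSpeech(context, this)

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US)   // pick the narration language
            tts.setSpeechRate(1.25f)     // adjust the reading speed
            // Optionally pick a specific voice from those installed on the device.
            tts.voices?.firstOrNull { it.locale == Locale.US }?.let { tts.setVoice(it) }
        }
    }

    fun read(pageText: String) {
        tts.speak(pageText, TextToSpeech.QUEUE_FLUSH, null, "page-read-aloud")
    }

    fun stop() {
        tts.stop()
        tts.shutdown()
    }
}
```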
The final feature addition is offline map access on Wear OS smartwatches. Whenever users download a map on their smartphone for offline use, it is also carried over to the connected smartwatch. So, if you leave your phone behind and head out for a hike or a bike ride, you can still access the map on your watch.
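Google hasn’t said how the hand-off works under the hood. As a rough illustration, the Wearable Data Layer API is the standard way an Android app pushes data, such as a downloaded map region, to a paired Wear OS watch; the path and keys in this sketch are made up for the example and are not what Google Maps actually uses.

```kotlin
import android.content.Context
import com.google.android.gms.wearable.Asset
import com.google.android.gms.wearable.PutDataMapRequest
import com.google.android.gms.wearable.Wearable

// Illustrative only: pushes a downloaded offline-map blob to a paired
// Wear OS watch over the Wearable Data Layer. The "/offline-map" path and
// "map_bytes" key are hypothetical.
fun syncOfflineMapToWatch(context: Context, mapBytes: ByteArray) {
    val request = PutDataMapRequest.create("/offline-map").apply {
        dataMap.putAsset("map_bytes", Asset.createFromBytes(mapBytes))
        dataMap.putLong("updated_at", System.currentTimeMillis())
    }.asPutDataRequest().setUrgent()

    Wearable.getDataClient(context).putDataItem(request)
        .addOnSuccessListener { /* the watch-side app can now read the map asset */ }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```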
A couple of new shortcuts are also being added to navigation software for Wear OS smartwatches. With a single tap on the watch face, users can check their surroundings. When needed, they can simply use a voice command to look up a location.