
Google Assistant could soon get even better at understanding your voice

Google Assistant can already distinguish your voice from others' and pick up what you're saying pretty well, but more improvements look to be on the way: references to "personalized speech recognition" have started popping up in the code of the Google app for Android.

This comes courtesy of some keen-eyed observations from the team at 9to5Google, who found that the latest version of the app offers to "store audio recordings on this device to help Google Assistant get better at recognizing what you say".

While we don't have too much to go on here, it looks as though the feature could be similar to something Google already does on some of its smart speakers: processing some common queries locally on a device to speed up recognition and processing.
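To make the idea concrete, personalization of this kind can be thought of as re-ranking a recognizer's candidate transcriptions using a local tally of words the user says often, such as unusual contact names. The sketch below is purely illustrative and is not Google's actual implementation; the function, the boost value, and the example names are all made up for demonstration.

```python
from collections import Counter

def rescore_candidates(candidates, personal_counts, boost=0.1):
    """Re-rank speech-recognition hypotheses, nudging up any that
    contain words this user says frequently (e.g. contact names).

    candidates: list of (transcription, confidence score) pairs
    personal_counts: on-device tally of the user's frequent words
    """
    rescored = []
    for text, score in candidates:
        # Add a small bonus for each word the user is known to say often.
        bonus = sum(boost for word in text.lower().split()
                    if personal_counts[word] > 0)
        rescored.append((text, score + bonus))
    # Highest adjusted score first.
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

# Hypothetical usage: raw hypotheses from a recognizer, plus a local
# word tally kept on the device rather than in the cloud.
personal_counts = Counter({"anja": 12, "call": 30})
candidates = [("call anya", 0.62), ("call anja", 0.58)]
print(rescore_candidates(candidates, personal_counts))
```

Here the generic spelling "anya" starts with the higher raw score, but the personal tally tips the ranking toward "anja", the name this particular user actually says, which is the kind of gain the app's description hints at.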

Knowing your voice

Based on the snippets of information found hidden in the app, if this functionality is turned off by the user then Google Assistant "will be less accurate at recognizing names and other words that you say frequently".

While it's not clear exactly what difference these improvements are going to make, it would seem that local processing on an Android phone and an ability to recognize your own vocal quirks – accent, unique contact names and all – are going to make the Google Assistant experience even more fluid than ever.

At this stage we don't know when (or even if) Google will push this out officially, but more information should be forthcoming if it becomes a fully fledged feature. As we heard at Google I/O 2022, efforts to make Google Assistant conversations more natural are ongoing.


Analysis: Google Assistant still has plenty of room for improvement

Google Assistant is arguably the best digital assistant in the business at the moment, thanks to Google's innovations in machine learning and the way that it reaches into just about every part of our lives, from web search to smart home gadgets. However, that doesn't mean that there isn't still room for improvement.

The ultimate goal is to have chatting with Google Assistant be as simple and as seamless as chatting with a friend or relative – and there's still some way to go until that's the case, despite the regular upgrades that keep getting pushed out.

With the supposed new feature mentioned above focusing on "personalized" conversations, it would seem Google wants to make its Assistant better at understanding those commands and words that are most specific to you.

In other words, it won't be caught out when you mention a name or a phrase that makes perfect sense to you but that an artificial intelligence system would get confused by. It also makes sense to store this data for Google Assistant on your phone – the device that's close at hand for most of the day.



Source: TechRadar
