“The smart assistant from the Mountain View giant beat its counterparts by a very small margin”
It’s no news that Google is moving towards complete AI-based autonomy in its gadgets, and its tool of choice is the Google Assistant. The smart assistant from the search engine giant is widely believed to be the best among its peers, such as Amazon’s Alexa and Apple’s Siri. However, a recent IQ test for smart assistants has revealed that the gap between Google Assistant and its competitors is narrower than expected.
Gene Munster of Loup Ventures recently tested the accuracy of four prominent digital assistants – Amazon Alexa, Apple Siri, Microsoft Cortana and Google Assistant – using an Amazon Echo, a HomePod, a Google Home Mini, and a Harman Kardon Invoke to put them through their paces. The analyst put a battery of 800 questions to each assistant, and unsurprisingly Google Assistant came out on top: it understood 100 percent of the questions asked and answered correctly nearly 88 percent of the time.
Close on the heels of Google Assistant was Apple’s Siri, which understood 99.6 percent of the questions asked and answered 74.6 percent of them correctly. Siri was followed by Alexa, which understood 99 percent of the questions and answered 72.5 percent of them correctly. Cortana, however, lagged far behind with 63.4 percent accuracy, even though it understood 99.4 percent of the questions asked. Notably, Munster asked questions in five categories – Local, Commerce, Navigation, Information, and Command – and Google Assistant led in every category except Command, where Siri took the lead.
Notably, when the same test was conducted last year, Google Assistant answered with just 81 percent accuracy, Amazon Alexa scored 64 percent, and Siri stood at 52 percent. Clearly, the assistants have come a long way since then, especially Siri.