It’s been five months since Loup Ventures last tested Apple’s voice assistant, at that time on the HomePod with music-specific questions. Back in April 2017, however, the company tested a number of voice assistants (Google Assistant, Siri, Cortana, and Amazon’s Alexa) for their “knowledge.”
Siri Got Smarter
At that time, the assistants were tested on a smartphone with a total of 800 questions from different fields of knowledge. Loup Ventures has now repeated this test and compared the results with those from April 2017.
Apple’s AI helper understood 99 percent of the requests in the new test, but was only able to answer 78.5 percent of them correctly. This is nonetheless an improvement over the previous test, when Siri answered only 66.1 percent of the 800 questions correctly.
Results not comparable to HomePod
Loup Ventures emphasizes that a direct comparison with smart speakers makes little sense. Their results would generally be worse, because they are used in completely different scenarios. On the HomePod, Siri can answer some questions very well and others not at all, performing even worse than the voice assistant on the iPhone.
An IQ Test for Voice Assistants?
But back to the current test procedure. Loup Ventures rates each digital assistant on two criteria: whether the AI understands the speaker, and whether it provides a correct answer to the question.
The questions come from five thematic areas: location-based questions such as “Where is the nearest café?”, commerce questions such as “Can I order more paper towels?”, navigation (“How do I get to Uptown by bus?”), information (“Who are the Twins playing tonight?”), and commands such as “Remind me to call Steve today at 2:00 pm.”
The questions were asked on an iPhone running iOS 11.4. The Google Assistant was tested on a Pixel XL, while Alexa and Cortana were each tested via their iOS apps.
Siri shines especially at completing commands, answering 90 percent of those questions correctly. Apple’s voice assistant was better than any other product when it comes to “controlling” the iPhone, smart home products, Apple Music, and more. Siri answered 87 percent of location-based questions correctly and 83 percent of navigation questions. In information, Siri is weaker with 70 percent correct answers, and in commerce only 60 percent were answered correctly.
Google Assistant Leads Almost Everywhere
The Google Assistant leads in almost every category. Only in commands does Siri come out ahead.
Siri has the advantage of being embedded in the phone’s operating system rather than running in a third-party app (as Cortana and Alexa do). Siri responds more helpfully and flexibly to varied phrasing when it comes to using the phone, smart home products, music, and so on.
Overall, the Google Assistant answered 85.5 percent of the 800 questions correctly and misunderstood only 11 of them. Alexa answered only 61.4 percent correctly and misunderstood 13, while Cortana answered only 52.4 percent correctly and misunderstood 19.
A similar test had already been conducted by ROAST in April.