With its Echo speakers and the smart-home integration of its assistant Alexa, Amazon has a huge market advantage. However, new tests show that the product does not give the best answers.
Google Assistant is Best in Class
In a test by Stone Temple Consulting, the Google Assistant scored best. Interestingly, the software performs even better on a smartphone than on smart speakers such as Google Home.
Stone Temple Consulting already presented test results last year, so a trend can be traced. The voice assistants of Amazon, Apple, Google and Microsoft were asked 5,000 different questions. The Google Assistant on the smartphone was able to answer nearly 80 percent of them, and well over 90 percent of its answers were accurate.
Siri lags behind, Alexa is catching up
Apple’s Siri fares much worse. The voice assistant from Cupertino could answer just over 40 percent of the questions asked; at least 80 percent of those answers were correct.
Amazon’s Alexa was able to answer around 55 percent of the questions, with more than 80 percent of its answers correct. Stone Temple Consulting’s results also show that Alexa made the most progress over the previous year. Another surprise: Microsoft’s Cortana comes second in the test.
Assistant deviates from Google search
ROAST also put the Google Assistant to a test, focusing solely on the search engine provider’s voice assistant. 10,000 questions were posed, both to the voice assistant and to Google’s search engine. In this test, the Assistant was able to answer 45 percent of the questions asked.
Interestingly, the Assistant’s results sometimes differed from those of a Google search. The test also showed that the Google Assistant has its strengths in categories such as restaurants, hotels, health, entertainment and news. In the fields of transportation, technology, fashion and music, the Assistant struggles and often offers no answer at all.