The results left much to be desired. In the best of circumstances (i.e. no street noise), Siri correctly interpreted 89% of the questions and correctly answered 68%. (Typical exchange: "When is the next Halley's comet?" Siri's reply: "You have no meetings matching Haley's.")
On Thursday, Munster reported the results of a quiet-room (no street noise) re-test of the “improved” iOS 6 version. This time Siri understood 91% of the questions and correctly answered 77%.
In school-grade terms, she’s gone from a D to a C.
(For reasons Munster doesn’t explain, he’s raised Siri’s June score from 68% to 76%, making the improvement look more modest than it was. Perhaps he’s now grading on a curve.)
Munster also reported on the performance of Google Now, Google’s GOOG answer to Siri. Overall, Siri did slightly better, as the bar chart at right shows. But each program has its strengths and weaknesses.
Below: Munster’s side-by-side comparison.