Minneapolis street test: Google gets a B+, Apple’s Siri gets a D

June 29, 2012, 10:42 AM UTC


Piper Jaffray’s Gene Munster is nothing if not methodical. When customers queue up for a new Apple (AAPL) product, he’s the analyst who goes to the line and counts heads. When programmers gather at an Apple developers conference, he’ll stop three or four dozen to ask whether they are writing apps for Mac, iPhone or Android devices.

So when he wanted to know how Siri stacked up against a Google (GOOG) search, he (or his staff) asked an iPhone 1,600 questions, 800 on the busy streets of Minneapolis, 800 in a quiet room.

Piper Jaffray published the results Thursday in a note to clients:

  • Google understands 100% of the questions (not surprisingly, since they are keyed in)
  • Google replies accurately 86% of the time
  • Siri comprehends 83% of queries in noisy conditions, 89% in a quiet room
  • Siri answers accurately 62% of the time on the street and 68% in a quiet room

“In order to become a viable mobile search alternative,” Munster writes, “Siri must match or surpass Google’s accuracy of B+ and move from a grade D to a B or higher.”

He estimates that Siri is more than two years behind Google in its learning curve, but he’s optimistic: “With the iOS 6 release in the fall, we expect Siri to improve meaningfully while reducing its reliance on Google from 60% to 48%.”

Given the results, however, it’s not clear that reducing Siri’s reliance on Google is necessarily a good thing.

Currently Siri gets 60% of its answers from Google, 20% from Yelp, 14% from WolframAlpha, 4% from Yahoo and 2% from Wikipedia.

“Breaking down Siri’s reliance further,” Munster writes, “Google provides 100% of navigation results, 61% of information results, 48% of commerce results and 42% of local results. Among other result aggregators, Yelp provided the most local results (51%) and commerce results (51%), while WolframAlpha provided 34% of information results.”

Starting in the fall, Apple’s in-house maps will eliminate Siri’s reliance on Google for navigation, and the integration of Yahoo Sports, OpenTable, Rotten Tomatoes and Fandango will provide answers for sports scores and statistics, restaurant reservations, movie showtimes and ticket purchases.

Breaking down the types of errors Siri makes, Munster provides some sample questions that demonstrate how far it has to go:

  • What team does Peyton Manning play for? Responded with the answer to the previous query. This was the most common error.
  • Where is Elvis buried? Responded “I can’t answer that for you.” It thought the person’s name was Elvis Buried.
  • Where am I? Dropped a pin in the wrong place.
  • When did the movie Cinderella come out? Responded with a movie theater search on Yelp.
  • How do I get from Boston to New York? Responded “I can only give directions from your current location. I can’t give you directions to a place you are not in.”
  • What spices are in lasagna? Responded with a Yelp search for restaurants with lasagna on the menu.
  • When is the next Halley’s comet? Responded “You have no meetings matching ‘Haley’s.’”
  • I want to go to Lake Superior. Responded with directions to the company Lake Superior X-Ray.

Those responses would make a terrific TV ad. For Google.