Why Siri sucks (and will continue to suck)

In class we discussed trends in AI, and we had some incredible speakers who dispelled several myths. In this blog, I will take on why Siri… well… sucks.

Some critics think that Apple is too large, and thus unable to innovate new products and services. The notoriously lagging performance of Siri is a perfect example of that. The AI service misunderstands the user's request and often provides responses that are out of context.

So is it because Apple is just too large of a company to be able to execute? Upon a little digging, I discovered it’s a little more complex than that.

There are three primary obstacles holding Siri back from being the glorious product that many have hoped it would become.  They are:

  1. Storing customer data
  2. Apple’s secretive culture
  3. Humans are incredibly difficult to predict

Storing customer data: Given Facebook's 2018 "Datagate" snafu, storing customer data is a hot topic in Silicon Valley. While Google, Apple, and Amazon have all built technology products that could be enhanced by storing customer data, Apple's stance remains firm: they do not, and will not, store customer data. Apple does this to protect the customer, because they believe it builds trust. They want customers to use their products knowing that Apple is not "spying" on them.

However, this creates large technology issues for Siri. Consider this example. When you ask, "Hey Siri, where's the nearest restaurant?", Siri cannot respond with a list of Thai restaurants because the AI platform has no access to your historical restaurant searches. Siri can't predict that you would prefer places with a moderately expensive menu because it doesn't have data about the restaurants you've previously viewed.

To summarize: it's tough to predict and personalize responses when there's no historical data to lean on.
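
To make that concrete, here's a minimal, hypothetical sketch of what history-based personalization looks like. Nothing here is Apple's actual implementation; the data and the rank_restaurants function are my own invention, just to show why the ranking collapses into a generic list the moment the history is empty.

```python
# Hypothetical sketch: re-ranking nearby restaurants using past search history.
# This does not reflect Siri's real implementation; it only illustrates why
# having no stored history forces a generic, unpersonalized answer.

def rank_restaurants(nearby, history):
    """Score each nearby restaurant by how often its cuisine and price tier
    appear in the user's past searches, then sort best-first."""
    cuisine_counts = {}
    price_counts = {}
    for past in history:
        cuisine_counts[past["cuisine"]] = cuisine_counts.get(past["cuisine"], 0) + 1
        price_counts[past["price"]] = price_counts.get(past["price"], 0) + 1

    def score(r):
        return cuisine_counts.get(r["cuisine"], 0) + price_counts.get(r["price"], 0)

    return sorted(nearby, key=score, reverse=True)


nearby = [
    {"name": "Noodle House", "cuisine": "thai", "price": "$$"},
    {"name": "Burger Barn", "cuisine": "american", "price": "$"},
    {"name": "Siam Garden", "cuisine": "thai", "price": "$$$"},
]

# With history, moderately priced Thai places float to the top.
history = [{"cuisine": "thai", "price": "$$"}, {"cuisine": "thai", "price": "$$"}]
print([r["name"] for r in rank_restaurants(nearby, history)])

# Without stored history (Apple's stance), every restaurant scores zero and the
# "personalized" ranking is just whatever order the results arrived in.
print([r["name"] for r in rank_restaurants(nearby, [])])
```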

Apple's secretive culture: Internally, Apple's culture has extreme data barriers, and even physical barriers, between teams. A person on the Watch team does not have authorization to enter the Siri team's office space. As a result, Siri's engineers are limited to working only within their own team, and their impact on the Siri product is severely constrained. Even simple tasks like integrating with the Apple Maps team are cumbersome. This isolates data and functionality, and leaves gaps in the ability to execute on new or cross-team products. Without the ability to communicate with complementary product teams, Siri continues to exist on a non-functional island.

Humans: Do you remember a time you were crossing a busy street, and a car appeared to stop, but then the driver inched forward? "Did they see me?" you wondered. You may have hesitantly continued forward, unsure if the driver was going to remain stopped or if you needed to prepare to run.

Predicting the behavior of other humans is incredibly difficult. I've lived with my boyfriend for three years, and although he knows me very well (or in tech terms, he has the data to statistically predict my actions), sometimes I shock him when I do something like groom my dog to have a mohawk. I've never done that before, I don't particularly like mohawks, and my past actions suggest that I prefer my dog to look cute rather than punky.

AI products, ranging from Cruise's automation to Siri, encounter the same challenge. As humans, we can't predict precisely what action another human will take. For example, although a person moving a leg out and leaning forward signals intent to walk forward, that person may decide not to move if they see something unusual on the other side of the street. Machine learning frameworks like TensorFlow can apply video analytics to human behavior and identify these signals, but until machines can read human minds, they will never know exactly what will happen next. Applied to Siri: when you ask for nearby restaurants, you may actually decide to go to the grocery store and pick up a pre-made meal because you're short on time.
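
To illustrate the uncertainty, here's a toy sketch of my own (not Cruise's or Apple's code, and the weights are made up) showing how simple pedestrian cues might be turned into a crossing probability. Even with strong signals, the model outputs a likelihood, never a certainty, which is exactly the gap described above.

```python
import math

# Hypothetical toy model: estimate the probability that a pedestrian will step
# into the street from a few observable cues. The weights are invented for
# illustration; a real system would learn them from labeled video.

WEIGHTS = {"leg_extended": 1.8, "leaning_forward": 1.4, "looking_at_phone": -1.2}
BIAS = -1.0

def crossing_probability(cues):
    """Logistic regression over binary cues -> probability in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] for name, present in cues.items() if present)
    return 1.0 / (1.0 + math.exp(-z))

# Strong "about to cross" signals still yield only about 90% confidence,
# so the car (or Siri) has to plan for the other 10%.
print(crossing_probability({"leg_extended": True, "leaning_forward": True,
                            "looking_at_phone": False}))
```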

To conclude, Siri sucks, and it will continue to suck until machines can read the minds of humans, until Apple opens up its internal culture, and/or until its policy on storing customer data becomes a little more flexible. I wouldn't bet on the first condition ever happening, and given the current climate around data privacy, the last one likely won't happen either. That leaves it to Apple to figure out how to change its internal culture.


2 comments on “Why Siri sucks (and will continue to suck)”

  1. Hi Sunny,
    I thoroughly enjoyed this post and like how you did your homework before making assumptions as to why Siri is so bad compared to Google Assistant or Alexa. I guess I had never really thought about how Apple's refusal to store customers' data would affect Siri. You would think, however, that Apple still collects more generic data about the iPhone customer population, and it could be more useful to Siri than it currently is. You are right though. Siri is falling behind and Apple needs to change something in order to keep up. Perhaps, for Apple, Siri is simply a placeholder for something bigger and better, so they refuse to enhance Siri further as it would be a waste of resources.

  2. Hi Sunny,
    This was a fantastic and interesting post! I thought you brought up great points on why Siri is so horrible. I agree with the obstacles you identified. Google has definitely been touting the advantages of it having access to basically ALL of your data (GPS location, health data, browsing data, etc.): it can do whatever you want it to do, the way you want it done. This ability to completely mold to users’ desires and needs can only be accomplished by storing and accessing tons of user data. I applaud Apple’s commitment to privacy, but I agree that this commitment stands in the way of a good product. I am personally conflicted on which path Apple should take: user privacy or a better product?

