Apple’s Siri and what it means for the user experience
Siri, the iPhone’s killer app
Like millions of others, I queued for the iPhone 4S last week (I don’t usually queue for new products on the day of release, but this time I was keen as my 2-year-old iPhone 3GS has been regularly crashing on me).
There are many improvements (especially if you’re upgrading from two generations back like me), but the thing everyone’s talking about is Siri, the speech-recognition ‘personal assistant’ that’s built right into the operating system.
Here are some of my thoughts on this new development in the user experience.
It’s better than you think it will be
When it was announced, people were naturally sceptical about how well Siri would work. But what has delighted users is that, once you understand its limitations, it works really well. The range of ways you can phrase a question and still be understood is impressive, and the types of information it provides are genuinely useful.
For some things it makes much more sense to use voice commands
The more you use Siri, the more you realise that this is the best way to do certain tasks. Much like the touch-screen interface that Apple introduced to smartphones and tablets, this feels like the user interface you’ve been waiting for all along. Mundane but essential tasks like setting an alarm, scheduling an appointment or texting a friend already seem an unnecessary hassle using anything but voice activation.
You learn together
As you discover the boundaries of what Siri can and can’t do, it starts to learn more about you. For example, I asked it yesterday to phone my mother. Siri asked me who my mother was. Now that it knows, I can refer to my mother in relevant commands and it understands what I mean.
Likewise it learns to understand your voice patterns and will respond to contextual commands. This mutual learning process creates a bond between the user and the interface that makes it more personal.
It has personality
Apple likes to delight its users, and Siri is packed with personality. The UK version comes across as an English butler with a warm and often witty character.
Okay, it’s the 21st-century version of the pathetic fallacy, but Siri’s programmers have clearly put a lot of effort into making sure you feel something for this technology – from its constant use of your first name to its witty replies to more personal questions (the Tumblr blog Shit That Siri Says collects some of the most amusing answers). Deft touches like this help you fall in love with it.
It could disrupt Google’s dominance of search
For some search-orientated tasks, Siri performs better than Google. Not because you can search with voice commands – you can already do that via the Google app on iPhone, and a lot more besides on Android phones. Rather, it’s because of the results themselves.
A Google search offers you a results page that often requires further action (which link do I click on?), is of variable quality (web spam is increasing) and is surrounded by ads. Compare this with Siri, where some results are delivered straight into your operating system from Wolfram Alpha and Wikipedia.
With Siri we can see the potential for a better search system than Google’s. Naturally Google could emulate this functionality itself, but the point is not that it’s beyond the company’s capabilities (it already has much of the technology) but that it disrupts its business model.
Om Malik makes this point really well in the latest episode of This Week in Tech. As long as Google’s business is based on delivering text-based advertising around web and mobile searches, it’s not in the company’s interests to build Siri-like functionality. It’s these kinds of disruptions that change the technology landscape.