TECHnalysis Research Blog

June 5, 2018
Siri Shortcuts Highlights Evolution of Voice-Based Interfaces

By Bob O'Donnell

To my mind, the most intriguing announcement from this year's Apple Worldwide Developer Conference (WWDC) was the introduction of Siri Shortcuts. Available on iOS devices running iOS 12 and on Apple Watches running watchOS 5, Siri Shortcuts essentially adds a new type of voice-based user interface to Apple devices.

It works by letting you build macro-like shortcuts for basic functions across a wide variety of applications and then execute them simply by saying the name of your custom-labeled function to Siri. Critically, these shortcuts can be used not just with Apple apps and iPhone or iPad settings, but with applications from other vendors as well.
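For developers, the basic mechanics are fairly simple. Here's a rough sketch in Swift of how an app built against the iOS 12 SDK can donate an action to the system so a user can attach a custom phrase to it (the activity type, titles, and phrase below are hypothetical examples, not Apple sample code):

import UIKit
import Intents

// Hypothetical example: donate an "order my usual coffee" action so the user can
// attach a custom Siri phrase to it via Siri Shortcuts.
func donateOrderCoffeeShortcut(for viewController: UIViewController) {
    // The activity type would also need to appear under NSUserActivityTypes in the app's Info.plist.
    let activity = NSUserActivity(activityType: "com.example.coffee.order-usual")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // new in iOS 12; enables Siri suggestions and Shortcuts
    activity.suggestedInvocationPhrase = "Coffee time" // the phrase the user can record for Siri
    viewController.userActivity = activity
    activity.becomeCurrent()                           // donates the activity to the system
}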

Early on, most digital assistant platforms, such as Siri, Amazon's Alexa, and the Google Assistant, focused on big-picture tasks like answering web-based queries, scheduling meetings, and delivering quick data nuggets such as traffic, weather, and sports scores. Most assistant platforms, however, didn't really make your smart devices seem "smarter" or, for that matter, make them any easier to use.

With the introduction of Samsung’s Bixby, we saw the first real effort to make a device easier to use through a voice-based interaction model. Bixby’s adoption (and impact) has been limited, but arguably that’s primarily because of the execution of the concept, not because of any fundamental flaw in the idea. In fact, the idea behind a voice-based interface is a solid one, and that’s exactly what Apple is trying to do with Siri Shortcuts.

At first glance, it may seem that there's little difference between a voice-based UI and a traditional assistant, but there really is. First, at a conceptual level, voice-based interfaces are more basic than an assistant. While assistants need to do much of the work on their own, a voice-based UI simply acts as a trigger to start actions or to allow easier discovery and use of features that often get buried under the increasing complexity of today's software platforms and applications. It's commonly observed that most people use less than 10% of the capabilities of their tech products, largely because they don't know where to find certain features or how to use them. Voice-based interfaces can solve that problem by allowing people to simply say what they want the device to do and have it respond appropriately.

Given the challenges that many people have had with the accuracy of Siri's recognition, this simpler approach is actually a good fit for Apple. Essentially, you'll be able to do a lot of cool "smart" things with a much smaller vocabulary, which improves the likelihood of positive outcomes.

Another potentially interesting development is the possibility of its use with multiple digital assistants for different purposes. While I highly doubt that Apple will walk away from the ongoing digital assistant battle, they might realize that there could be a time and a place for, say, using Cortana to organize work-related activities, using Google Assistant for general data queries, and using Siri for a variety of phone-specific functions, at least in the near term. Of course, a lot of questions would need to be answered and APIs opened up before that could occur, but it's certainly an intriguing possibility. Don't forget, as well, that Apple has already created a connection between IBM's Watson voice assistant and iOS, so the idea isn't as crazy as it may first sound.

Even within the realm of a voice UI, it makes sense to add some AI-type functions. In fact, Apple's approach of doing the machine learning on the device to help maintain data privacy makes perfect sense, since it can tailor suggestions to the specific apps installed on your device and to the contacts and other personalized data stored in your phone. This is where the line between assistant and voice UI admittedly starts to blur, but the Apple offering still makes for a more straightforward interaction model that its millions of users will likely find very useful.

As interesting as the IFTTT (If This Then That)-like macro workflows that Siri Shortcuts can bring to more advanced users are, however, I am a bit concerned that mainstream users could be confused and overwhelmed by the capabilities that Shortcuts offers. Yes, you can achieve a lot, but even from the brief demo onstage, it's clear that you also have to do a lot to make it work well. By the time it's officially released as part of iOS 12 this fall (as a free upgrade, BTW), I'm hoping Apple will create a whole series of predefined Siri Shortcuts that regular users can quickly access or easily customize.
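On that front, some of the building blocks for prebuilt shortcuts already exist: an app can present the system's "Add to Siri" sheet for a predefined action, so a mainstream user only has to record a phrase rather than assemble a workflow. A minimal Swift sketch, again with hypothetical names rather than Apple sample code:

import UIKit
import Intents
import IntentsUI

// Hypothetical example: offer a predefined action through the system "Add to Siri" flow.
func presentAddToSiri(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.order-usual")
    activity.title = "Order my usual coffee"
    let shortcut = INShortcut(userActivity: activity)
    let addShortcutVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    // A real app would also set addShortcutVC.delegate to dismiss the sheet when the user finishes.
    viewController.present(addShortcutVC, animated: true)
}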

The world of voice-based interactions continues to evolve, and I expect to see a number of advancements in full-fledged assistant models, voice-based UIs, and combinations of the two. Long term, I believe Siri Shortcuts has the opportunity to make the biggest impact of anything announced with iOS 12 on how iOS users interact with and leverage their devices, and I'm really looking forward to seeing how it evolves.

Here's a link to the column: https://techpinions.com/siri-shortcuts-highlights-evolution-of-voice-based-interfaces/53016

Bob O'Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.
