Yesterday was keynote day at Apple’s Worldwide Developers Conference (WWDC), the day the tech world awaits the company’s big product announcements. The fierce competition among Siri, Cortana, and Google Now had us eager to hear what Apple had in store for its congenial, all-purpose virtual assistant.
More illuminating than any specific product detail, however, was the pattern underlying the individual announcements. This year, Apple is focused on imbuing its UX with greater context and meaning, and much of that effort is tied to search.
Desktop search gets contextual, with expanding scope:
Spotlight (Apple’s systemwide search) gains natural language capabilities in OS X El Capitan, letting users run searches like “documents I worked on last June.” Spotlight will also surface information such as weather, sports scores, maps, and calendar data, and the same functionality will work in Mail.
Search in iOS 9 will yield results from within apps. Apple is opening up the Spotlight API, giving developers the opportunity to make their in-app content searchable and strengthen mobile search further.
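To make that concrete, here is a minimal sketch of how an app might expose its content to Spotlight using the Core Spotlight framework Apple introduced for iOS 9. The app name, item titles, and identifiers below are hypothetical, invented for illustration:

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical example: a recipe app indexes one of its recipes so it
// shows up in Spotlight search results on the device.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Lemon Tart"
attributes.contentDescription = "A recipe from the (hypothetical) CookBook app."

let item = CSSearchableItem(
    uniqueIdentifier: "recipe-42",        // made-up app-level identifier
    domainIdentifier: "recipes",          // made-up grouping for bulk deletion
    attributeSet: attributes
)

// Hand the item to the on-device index; the completion handler reports errors.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

Once indexed, tapping the result in Spotlight deep-links the user back into the app, which is what lets mobile search reach inside apps rather than stopping at their icons.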
Streaming Apple Music service boasts semantic voice search:
The company debuted Apple Music, a streaming music service that competes with the likes of Pandora. The service uses Siri for search: Apple’s press release suggests trying queries like “Play me the best songs from 1994,” “Play the best FKA twigs song,” or “What was the number one song in February 2011?”
Improvements in Apple’s speech recognition technology:
Apple announced that its speech recognition technology is now more accurate than Google’s, with a 5% error rate (versus Google’s 8%). Accuracy here is obviously key to a business plan that leans increasingly on Siri’s voice technology.