iOS 9 Proactive Search


Trying out the iOS 9 developer beta provides great insight into what the future of iOS looks like. Though still in a very early phase of development, the Proactive Search feature announced at WWDC ’15 looks promising, and is well on its way to competing with Google’s ‘Now’.

Search has changed the way we find, use, and consume the data available to us. More often than not, this data is plated and served to us. The time has come for us to craft and mold how our users use and experience the applications we build.

Possibilities of Proactive Search

For starters, Proactive Search allows you to carry out the following functions:

  • Natural language searching using Siri
  • Search using complex parameters to return results based on your Contacts, Apps, and Maps
  • Location-tagged reminders, based on the date and time of day
  • Deep-linking into Apps to provide complex search functionality

We’re going to focus on the last bit: deep-links that make your app’s content eligible for search and return relevant results when the user searches for a keyword. Along the way, we’ll work out what app content should be indexed in the first place. If we want to understand how Proactive Search works and what makes it effective, we’re gonna have to get down to brass tacks.

As an example, let’s consider a simple Library-app that allows a user to add/remove/edit a list of their favorite books. Accessing information about a book would require a user to open the app, scroll through a long list of books, or drill down from list to list until they find it. This is inconvenient, and does little for the user’s experience.

What if the user could simply swipe down to access the search bar, type in a relevant keyword like a book’s name, or an author, publisher or year, and have the book displayed as a search result? What if, then, the user could tap on a result, and be navigated directly to the detail screen of the app?

Here’s how we’re going to make this happen.

Using CoreSpotlight APIs

Step one is to identify the sort of content the user will look for, and index it iteratively in the app. Start by identifying:

  • the parts of your app that have unique functionality
  • the kind of data shown to the user
  • the flow of data, and
  • the navigational flow between screens containing this data

Looking at the Library-app, here’s the data that needs to be indexed for each book:

  • Title
  • Author
  • Publishing House
  • Genre
  • Book ID
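
For reference, here is a minimal sketch of what the Book model’s interface might look like. The property names match the ones used in the indexing code later on, but the actual class in the example project may carry more than this:

@interface Book : NSObject

@property (nonatomic, copy) NSString *title;
@property (nonatomic, copy) NSString *author;
@property (nonatomic, copy) NSString *publishingHouse;
@property (nonatomic, copy) NSString *genre;
@property (nonatomic, copy) NSString *bookID;   // a unique identifier, e.g. an ISBN

@end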

First, make sure you have Xcode 7 and the iOS 9 SDK installed, and then add CoreSpotlight.framework as a dependency to your project.

#import <CoreSpotlight/CoreSpotlight.h>   // required for the indexing APIs below

#import "LibraryTableViewController.h"
#import "BookDetailViewController.h"
#import "Book.h"
#import "BookTableViewCell.h"

IMPORTANT: Import CoreSpotlight in every class that does the indexing.

Our Library data is in the form of JSON objects. When deserializing these objects, pass them to relevant methods that perform the search indexing.

Here’s an example of how the methods that deserialize the library data tie into indexing.
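
The exact shape of that method depends on how your data is stored; the following is only a rough sketch, assuming a hypothetical parseLibraryJSON: method, a books backing array, and a Book initializer that maps each JSON dictionary onto the model:

- (void)parseLibraryJSON:(NSData *)jsonData {
   NSError *error = nil;
   // The JSON payload is assumed to be an array of book dictionaries
   NSArray *bookDictionaries = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
   if (error != nil) {
      return;
   }

   for (NSDictionary *bookInfo in bookDictionaries) {
      Book *tempBook = [[Book alloc] initWithDictionary:bookInfo];   // hypothetical initializer
      [self.books addObject:tempBook];                               // self.books: the list backing the table view

      // Hand the freshly deserialized book off for indexing
      [self indexBookForSearch:tempBook];
   }
}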

This line of code,

   [self indexBookForSearch:tempBook];

is what sends the book for indexing. Inside indexBookForSearch:, the first step is to build an attribute set describing the book:


   // kContentTypeBook is the app's custom content-type (UTI) string for books
   CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithItemContentType:kContentTypeBook];

   attributeSet.title = book.title;
   attributeSet.authorNames = @[book.author];
   attributeSet.publishers = @[book.publishingHouse];
   attributeSet.genre = book.genre;
   attributeSet.identifier = book.bookID;
   attributeSet.contentDescription = [NSString stringWithFormat:@"Read '%@' using Library now", book.title];

A CSSearchableItemAttributeSet defines a search result and what gets displayed when there is a match. It holds the information relevant to what the user is searching for, with convenient attributes for storing metadata about real-world objects such as documents, events, places, media, music, images, and messages. In our attribute set for an item of “Book” type, that means a title, author names, publishers, a genre, a description, and a unique identifier like an ISBN.

A CSSearchableItem is the actual search item. It contains the attribute set we defined earlier, along with a unique identifier that is handed back to the AppDelegate when the user taps the result.


   CSSearchableItem *item = [[CSSearchableItem alloc] initWithUniqueIdentifier:book.bookID domainIdentifier:@"com.library" attributeSet:attributeSet];

Actually indexing the item is surprisingly straightforward.


   [[CSSearchableIndex defaultSearchableIndex] indexSearchableItems:@[item] completionHandler: ^(NSError * __nullable error) {

      if (!error) {
         NSLog(@"Book indexed");
      }

   }];

And done! Your data is indexed, and ready to be searched.
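
The other half of the promise was deep-linking straight into the detail screen. Taps on Core Spotlight results reach the app through the AppDelegate as a user activity of type CSSearchableItemActionType, with the unique identifier we indexed available under the CSSearchableItemActivityIdentifier key. A sketch of that handler could look like this (navigateToBookWithID: is a hypothetical helper standing in for however your app pushes the detail screen):

// In AppDelegate.m (with CoreSpotlight imported here as well)
- (BOOL)application:(UIApplication *)application continueUserActivity:(NSUserActivity *)userActivity restorationHandler:(void (^)(NSArray *restorableObjects))restorationHandler {

   if ([userActivity.activityType isEqualToString:CSSearchableItemActionType]) {
      // The same unique identifier we passed to CSSearchableItem, i.e. the book ID
      NSString *bookID = userActivity.userInfo[CSSearchableItemActivityIdentifier];
      [self navigateToBookWithID:bookID];   // hypothetical helper that shows the book's detail screen
      return YES;
   }

   return NO;
}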

Using NSUserActivity

This might seem awfully familiar, and you might’ve seen it while integrating Handoff. The approach is the same, but the intent is vastly different.


NSUserActivity *libraryActivity = [[NSUserActivity alloc] initWithActivityType:@"com.library.books"];

libraryActivity.title = @"Activity Title Here";
libraryActivity.keywords = [NSSet setWithArray:@[@"keyword", @"keyword", @"keyword"]];
libraryActivity.userInfo = @{ /* activity user information here */ };
libraryActivity.eligibleForSearch = YES;

While creating an activity, there are several things to consider:

  • The activity type defines what the activity is being created for
  • Setting the activity eligible for search allows results to be returned when the user searches for a matching keyword
  • Setting the activity eligible for public indexing (eligibleForPublicIndexing) lets results that many users engage with be surfaced to other users as well
  • The activity’s userInfo provides the context needed for deep-links

With the metadata available on an NSUserActivity object, it becomes easy to expose exactly the content you want found.
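
One detail that’s easy to miss: the activity is only added to the index once it becomes the current activity, either by assigning it to a view controller’s userActivity property or by calling becomeCurrent. A minimal sketch, assuming the detail screen does this when a book is shown (the bookID key here is just an illustration):

// Keep a strong reference to the activity for as long as it should stay current
libraryActivity.userInfo = @{ @"bookID" : book.bookID };
[libraryActivity becomeCurrent];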

Here’s a link to the GitHub repo containing the example application used in this article.
