Shortcuts

WWDC 2018 Session 211: Introduction to Siri Shortcuts. For more WWDC 18 Shortcuts sessions, please visit the SwiftGG WWDC 18 catalog.

Shortcuts let the user define or identify a spoken phrase that, when spoken to Siri, performs a given piece of logic. Shortcuts are part of SiriKit and extend the original SiriKit API, allowing users to trigger a range of in-app logic with Siri commands. A Shortcut should be concise in design: it meets the user’s purpose, is used frequently, and is easy to remember. A Shortcut is not the abstract Intent of SiriKit. An Intent describes an abstract user intent, such as “book movie tickets”, while a specific user intent is “book three Jurassic Park II tickets at the nearest movie theater”. A Shortcut refers to this specific user intent: it is ultimately a way for the user to use Siri to quickly carry out this kind of repeatable, specific intent.

So let’s start with a brief introduction to the basic usage of SiriKit.

  • To use SiriKit, you need to create a new Intents Extension target in your project; if you need to customize the view displayed inside Siri, you also need an Intents UI Extension target. They are responsible for handling the Intent’s business logic and the Intent’s view, respectively.

  • Create a new Intent Definition File.

  • When you create an Intent Definition File, a class file corresponding to the Intent is generated automatically. In the generated class, each parameter you declare becomes a class property.

  • At the heart of SiriKit are Intents: natural-language descriptions of a user’s intent. The Intent definition and its handler tell Siri how to parse, process, and predict the user’s intent, and to recommend these intents to users at appropriate times based on context. The lifecycle of an Intent is divided into resolve, confirm, and handle. For example, in the system message Intent used by Siri, the Intent describes the recipient and the content of the message; in the IntentHandler you resolve and correct the recipient and the message content (resolve), confirm the operation (confirm), and send the message (handle). Another example is the Clock timer Intent: work out which part of the user’s phrase is the timer value and which part is the unit (resolve), and convert it into the corresponding timer response (handle). Didi’s ride-hailing Intent resolves the destination from speech (resolve), confirms the price (confirm), and finally calls a car for the user (handle). SiriKit has been available since iOS 10; if you’re not familiar with it, I recommend reading WWDC 2016 Session 217 Introducing SiriKit and raywenderlich’s SiriKit quick start.

  • For example, to let the App send a message, the IntentHandler class conforms to the INSendMessageIntentHandling protocol and implements the following four methods, which process Siri’s events asynchronously. The actual result is passed back through the Result (which mainly carries content-correction options) or Response (which mainly carries an NSUserActivity property) argument of each completion block:

        func resolveRecipients(for intent: INSendMessageIntent, with completion: @escaping ([INPersonResolutionResult]) -> Void)  // Resolve the recipients
        func resolveContent(for intent: INSendMessageIntent, with completion: @escaping (INStringResolutionResult) -> Void)  // Resolve the message content
        func confirm(intent: INSendMessageIntent, completion: @escaping (INSendMessageIntentResponse) -> Void) // Confirm the operation
        func handle(intent: INSendMessageIntent, completion: @escaping (INSendMessageIntentResponse) -> Void) // Handle: send the message

Choosing an implementation based on the Shortcut’s behavior

  • If the Shortcut opens a specific page of the App, lets the user continue some in-app operation, reproduces indexed content from Spotlight results, or provides Handoff, it is recommended to implement it with NSUserActivity: simply set the new isEligibleForPrediction property of the userActivity to true, then assign it to the view controller’s userActivity property to complete the donation.
  • If the user does not need to jump into the App, and the demand can be answered through Siri’s voice or a custom Siri interface (handled by the Siri Intents Extension target), Intents are recommended. Intents can also jump into the App.

The three steps of creating Shortcuts

  • Declare Shortcut – declare the Shortcut

  • Donate Shortcut – tell iOS about your Shortcut

  • Handle Shortcut – actually respond to the corresponding Shortcut

Use NSUserActivity for Shortcut

If you’re not familiar with NSUserActivity, you can refer to the documentation here: https://developer.apple.com/documentation/foundation/nsuseractivity.

  • In the project’s info.plist file, declare a new user activity type, for example:

  • <key>NSUserActivityTypes</key>
    <array>
        <string>com.myapp.name.my-activity-type</string>
    </array>
  • Step 1: Declare Shortcut. Declare an NSUserActivity as before; to support Shortcuts, simply set the new isEligibleForPrediction property to true, as follows:

  • import CoreSpotlight
    import MobileCoreServices

    let userActivity = NSUserActivity(activityType: "com.myapp.name.my-activity-type")
    userActivity.isEligibleForSearch = true
    userActivity.isEligibleForPrediction = true // New property; when true, the activity can be exposed to Siri as a Shortcut
    userActivity.title = "Display title of Activity"
    userActivity.userInfo = ["key": "value"] 
    userActivity.suggestedInvocationPhrase = "Some recommended suggestions."
    
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
    let image = UIImage(named: "myImage")!
    attributes.thumbnailData = image.pngData()
    attributes.contentDescription = "Subtitle of Activity"
    userActivity.contentAttributeSet = attributes
    
    viewController.userActivity = userActivity // After the assignment, the UserActivity is donated
  • Step 2: Donate Shortcut. Assign the newly created NSUserActivity object to the view controller’s userActivity property, as shown above, to complete the donation.

  • Step 3: Handle Shortcut. When Siri launches the App, handle the information related to the UserActivity in the AppDelegate, as follows:

  • func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([Any]?) -> Void) -> Bool {
        if userActivity.activityType == "com.myapp.name.my-activity-type" {
            // Matches the activity type we declared
        }
        if let interaction = userActivity.interaction {
            // Process the information Siri passes to you
        }
        // Based on the known information, jump to the page the user needs, or restore the scene
        return true
    }

Use Intents to create Shortcut

Apple’s Siri Shortcuts demo (click to download) will be used here for the explanation.

  • The structure of this demo is that the App and the Intent share part of the business code, namely the soup-ordering logic, packaged as SoupKit.framework. SoupChef is the App’s target; the SoupChefIntents target contains the actual handler for the Siri Intents; SoupChefIntentsUI handles the custom Siri response views; and SoupKit contains the shared soup-ordering logic. Apple wants us to organize our apps this way to reuse more code.

  • Step 1: Declare Shortcut

    • Creating an Intent

    • In the Intent Definition File, choose not to generate the related classes in the main project, to avoid duplicate-symbol collisions.

    • In the Intent Definition File, declare the fields shown in figure 1

    • A $(name)Intent.swift file is generated automatically when the target is compiled. This Intent is associated with SoupKit, so the following file is generated when SoupKit is compiled.

    • The associated Response is then further declared in the Intent Definition

    • The corresponding IntentResponse will be generated automatically at compile time

    • With the above steps, you have completed the declaration of a Siri Intent, which describes how Siri responds when the user says “Order Soup”.
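
    • For reference, the generated types can be sketched roughly as follows. This is only an illustration of the shape of the code Xcode generates; the actual property names and types depend entirely on the fields you declared, and are assumptions here:

```swift
import Intents

// Rough shape of what Xcode generates from the Intent Definition File.
// Property names and types below are assumptions based on the declared fields.
public class OrderSoupIntent: INIntent {
    @NSManaged public var soup: INObject?      // the soup the user chose
    @NSManaged public var quantity: NSNumber?  // how many bowls
    @NSManaged public var options: [INObject]? // toppings and extras
}

// The generated response carries a code describing how handling went,
// exposed through an init(code:userActivity:) initializer.
public class OrderSoupIntentResponse: INIntentResponse {
    // generated response codes and initializers live here
}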

  • Step 2: Donate Shortcut

    • In this scenario, every time the user successfully places an order, the App creates a new Intent object and calls donate() to notify Siri. The code is as follows. The user can then see this Shortcut in Settings – Siri, and it will also appear in Spotlight’s suggestions.
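
    • The donation step can be sketched like this, assuming an OrderSoupIntent instance already built from the user’s actual order (the invocation phrase wording is just an example):

```swift
import Intents

// Donate the completed order to Siri so it can be suggested later.
func donate(order intent: OrderSoupIntent) {
    // Phrase Siri suggests when the user records a custom voice command
    intent.suggestedInvocationPhrase = "Order my usual soup"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Intent donation failed: \(error.localizedDescription)")
        }
    }
}
```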

    • A big new feature of Shortcuts is customized voice commands: users can trigger their order with their favorite phrase, such as “A bowl of clam chowder”. If you want to offer custom-named voice Shortcuts inside the App, present an INUIAddVoiceShortcutViewController instance; the specific method is as follows.
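
    • Presenting the system “Add to Siri” screen can be sketched as follows; OrderViewController is a name assumed here for illustration:

```swift
import Intents
import IntentsUI
import UIKit

class OrderViewController: UIViewController {
    // Present the system screen that lets the user record a custom phrase
    func presentAddVoiceShortcut(for intent: OrderSoupIntent) {
        guard let shortcut = INShortcut(intent: intent) else { return }
        let controller = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        controller.delegate = self
        present(controller, animated: true)
    }
}

extension OrderViewController: INUIAddVoiceShortcutViewControllerDelegate {
    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true) // saving finished (or failed)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true) // user cancelled
    }
}
```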

    • When the Shortcut creation and editing controller is invoked in the App, the following is displayed. Here the user can save the Shortcut, with its customized voice phrase, to iOS.

    • When debugging on iOS 12, you can enable options under Settings – Developer to display a recently donated Shortcut on the lock screen and in Spotlight search, so you don’t have to voice-debug Siri over and over.

    • Once it is set up, you will see the Shortcut on your phone the first time you use Siri, and it will then appear in Spotlight and on the lock screen.

    • The OrderDetail controller also carries NSUserActivity information; just set the new iOS 12 property isEligibleForPrediction to true and you can directly donate an NSUserActivity-based Shortcut, which is the second option in the image above.

    • For Intents on the Apple Watch’s Siri watch face, please see the session Siri Shortcuts on the Siri Watch Face.

  • Step 3: Handle Shortcut

    • An Intent’s callbacks ultimately arrive through an INExtension subclass; they enter your code in the following way.
  • After that, you can do further processing on the Intent, which is actually an OrderSoupIntent and contains properties like soup, quantity, and options. We can build the Siri Intents Extension target (SoupChefIntents) on the phone and choose Siri as the running target to debug the handler logic without running the App.
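The routing described above can be sketched as follows. OrderSoupIntentHandler is a name assumed here for the object conforming to the generated OrderSoupIntentHandling protocol, and the response initializer is simplified:

```swift
import Intents

// Entry point of the Intents Extension: route each intent to its handler.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        guard intent is OrderSoupIntent else {
            fatalError("Unhandled intent type: \(intent)")
        }
        return OrderSoupIntentHandler()
    }
}

// Handles the actual order, using the shared logic in SoupKit.
class OrderSoupIntentHandler: NSObject, OrderSoupIntentHandling {
    func handle(intent: OrderSoupIntent,
                completion: @escaping (OrderSoupIntentResponse) -> Void) {
        // Place the order via SoupKit here, then report the result to Siri
        completion(OrderSoupIntentResponse(code: .success, userActivity: nil))
    }
}
```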

The debugging effect is as follows. For details about the Intent handler, see the code in the demo

That covers Declare, Donate, and Handle for Intents. For further applications of Intents, refer to the demo, the documentation, and WWDC 2018 Sessions 211, 214, 217, 225, and 228.