Building a voice assistant for an iOS Swift app

With the Alan AI SDK for iOS, you can create a voice assistant or a chatbot like Siri and embed it into your iOS Swift app. The Alan AI Platform provides you with all the tools and leverages the industry’s best Automatic Speech Recognition (ASR), Spoken Language Understanding (SLU) and Speech Synthesis technologies to quickly build an AI assistant from scratch.

In this tutorial, we will create a single view iOS app with an Alan AI voice assistant and test drive it on a simulator. App users will be able to tap the voice assistant button and give custom voice commands, and Alan AI will reply to them.

What you will learn

  • How to add a voice interface to an iOS Swift app

  • How to write simple voice commands for an iOS Swift app

What you will need

To go through this tutorial, make sure the following prerequisites are met:

  • You have signed up to Alan AI Studio.

  • You have created a project in Alan AI Studio.

  • You have set up Xcode on your computer.

Note

When you sign up to Alan AI Studio, Alan AI adds free interactions to your balance to let you get started. To add more interactions to your balance, link your Alan AI account with your GitHub account and give stars to Alan AI repositories. For details, see Adding free interactions.

Step 1: Create a single view iOS app

For this tutorial, we will be using a simple iOS app with a single view. To create the app:

  1. Open Xcode and choose to create a new Xcode project.

  2. Select Single View App.

    [Image: xcode-project-type.png]
  3. In the Product Name field, specify the project name.

  4. From the Language list, select Swift.

  5. From the User Interface list, select Storyboard.

    [Image: xcode-project-settings.png]
  6. Select a folder in which the project will reside and click Create.

Step 2: Add the Alan AI SDK for iOS to the project

The next step is to add the Alan AI SDK for iOS to the app project. There are two ways to do it:

  • with CocoaPods (a sample Podfile is sketched below)

  • manually
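
If you go the CocoaPods route, the dependency is declared in the project’s Podfile. The listing below is only a minimal sketch: the pod name and the target name are assumptions, so check the Alan AI SDK for iOS repository for the exact pod name and supported versions.

Podfile
# A minimal Podfile sketch; 'AlanSDK' and 'MyVoiceApp' are assumptions --
# replace them with the actual pod name and your app target.
# The deployment target below is only an example.
platform :ios, '12.0'
use_frameworks!

target 'MyVoiceApp' do
  pod 'AlanSDK'
end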

Let’s add the SDK manually:

  1. On Alan AI GitHub, go to the Releases page for the Alan AI SDK for iOS: https://github.com/alan-ai/alan-sdk-ios/releases.

  2. Download the AlanSDK.xcframework_<x.x.x>.zip file from the latest release.

    [Image: download.png]
  3. On the computer, extract AlanSDK.xcframework from the ZIP archive.

  4. Drag AlanSDK.xcframework and drop it under the root node of the Xcode project.

  5. In the displayed window, select the Copy items if needed check box and click Finish.

    [Image: copy-2.png]

Step 3: Specify the Xcode project settings

Now we need to adjust the project settings to use the Alan AI SDK for iOS.

  1. We need to make sure the Alan AI SDK for iOS is embedded when the app is built. On the General tab, scroll down to the Frameworks, Libraries, and Embedded Content section. To the right of AlanSDK.xcframework, select Embed & Sign.

    [Image: embedded.png]
  2. In iOS, the user must explicitly grant permission for an app to access the microphone. In the Xcode project, we need to add a special key with a description explaining to the user why the app needs access to the microphone.

    1. Go to the Info tab.

    2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

    3. From the list, select Privacy - Microphone Usage Description.

    4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app is launched.

    [Image: mic.png]
  3. We need to enable the background mode for our app. Go to the Signing & Capabilities tab. In the top left corner, click + Capability and in the capabilities list, double-click Background Modes. In the Modes list, select the Audio, AirPlay, and Picture in Picture check box. (This setting and the microphone key are shown as raw Info.plist entries at the end of this step.)

    [Image: background.png]
  4. We also need to make sure the background mode is enabled in our Alan AI project. In Alan AI Studio, at the top of the code editor, click Integrations, go to the iOS tab and enable the Keep active while the app is in the background option.

    [Image: studio-background-mode.png]
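
For reference, here is roughly how the settings from this step appear as raw entries in the app’s Info.plist source. The key names are standard iOS keys; the description string is only an example, so use your own wording:

Info.plist
<!-- Shown to the user when the app asks for microphone access -->
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to let you talk to the voice assistant.</string>

<!-- Added by the Background Modes capability with Audio selected -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>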

Step 4: Integrate Alan AI with Swift

The next step is to update our app to import the Alan AI SDK for iOS and add the Alan AI button to it.

  1. In the app folder, open the ViewController.swift file.

  2. At the top of the file, import the Alan AI SDK for iOS:

    ViewController.swift
    import AlanSDK
    
  3. In the ViewController class, define variables for the Alan AI button and Alan AI text panel:

    ViewController.swift
    class ViewController: UINavigationController {
    
        /// Alan AI button
        fileprivate var button: AlanButton!
    
        /// Alan AI text panel
        fileprivate var text: AlanText!
    }
    
  4. To the ViewController class, add the setupAlan() function. Here we set up the Alan AI button and Alan AI text panel and position them on the view (an anchor-based layout alternative is sketched after this list):

    ViewController.swift
    class ViewController: UINavigationController {
        fileprivate func setupAlan() {
            /// Define the project key
            let config = AlanConfig(key: "")

            /// Init the Alan AI button
            self.button = AlanButton(config: config)

            /// Init the Alan AI text panel
            self.text = AlanText(frame: CGRect.zero)

            /// Add the button and text panel to the view
            self.view.addSubview(self.button)
            self.button.translatesAutoresizingMaskIntoConstraints = false
            self.view.addSubview(self.text)
            self.text.translatesAutoresizingMaskIntoConstraints = false

            /// Align the button and text panel on the view
            let views = ["button" : self.button!, "text" : self.text!]
            let verticalButton = NSLayoutConstraint.constraints(withVisualFormat: "V:|-(>=0@299)-[button(64)]-40-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
            let verticalText = NSLayoutConstraint.constraints(withVisualFormat: "V:|-(>=0@299)-[text(64)]-40-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
            let horizontalButton = NSLayoutConstraint.constraints(withVisualFormat: "H:|-(>=0@299)-[button(64)]-20-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
            let horizontalText = NSLayoutConstraint.constraints(withVisualFormat: "H:|-20-[text]-20-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
            self.view.addConstraints(verticalButton + verticalText + horizontalButton + horizontalText)
        }
    }
    
  5. Now, in let config = AlanConfig(key: ""), define the Alan AI SDK key for the Alan AI Studio project. To get the key, in Alan AI Studio, at the top of the code editor, click Integrations and copy the value from the Alan SDK Key field. Then paste the key into the Xcode project.

  6. In viewDidLoad(), call the setupAlan() function:

    ViewController.swift
    class ViewController: UINavigationController {
        override func viewDidLoad() {
            super.viewDidLoad()
            self.setupAlan()
        }
    }
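
If you prefer anchor-based Auto Layout over the Visual Format Language strings used in setupAlan(), the same layout can be expressed roughly as follows. This is a sketch with the same sizes and offsets, not part of the original tutorial code:

ViewController.swift
/// Inside setupAlan(), after adding the subviews: the same layout as the
/// visual-format constraints above, expressed with layout anchors
NSLayoutConstraint.activate([
    self.button.widthAnchor.constraint(equalToConstant: 64),
    self.button.heightAnchor.constraint(equalToConstant: 64),
    self.button.trailingAnchor.constraint(equalTo: self.view.trailingAnchor, constant: -20),
    self.button.bottomAnchor.constraint(equalTo: self.view.bottomAnchor, constant: -40),

    self.text.heightAnchor.constraint(equalToConstant: 64),
    self.text.leadingAnchor.constraint(equalTo: self.view.leadingAnchor, constant: 20),
    self.text.trailingAnchor.constraint(equalTo: self.view.trailingAnchor, constant: -20),
    self.text.bottomAnchor.constraint(equalTo: self.view.bottomAnchor, constant: -40)
])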
    

Here is what our Xcode project should look like:

[Image: xcode-viewcontroller.png]

Run the app. Here is our app running on the simulator. The app displays an alert to get microphone access with the description we have provided:

[Image: xcode-simulator.png]

Step 5: Add voice commands

Let’s add some voice commands so that we can interact with Alan AI. In Alan AI Studio, open the project and in the code editor, add the following intents:

Dialog script
intent(`What is your name?`, p => {
    p.play(`It's Alan, and yours?`);
});

intent(`How are you doing?`, p => {
    p.play(`Good, thank you. What about you?`);
});

In the app, tap the Alan AI button and ask: What is your name? and How are you doing? The AI assistant will reply with the responses we have provided in the added intents.
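
Optionally, the app can also listen for events coming from the Alan AI SDK, for example to react to commands sent from the dialog script. The SDK reports such events through NotificationCenter; the notification name used below is an assumption based on earlier SDK versions, so verify it against the Alan AI SDK for iOS reference before relying on it:

ViewController.swift
/// Subscribe to Alan AI SDK events, e.g. in viewDidLoad();
/// the notification name "kAlanSDKEventNotification" is an assumption
NotificationCenter.default.addObserver(self,
                                       selector: #selector(self.handleAlanEvent(_:)),
                                       name: NSNotification.Name("kAlanSDKEventNotification"),
                                       object: nil)

/// Print the raw event payload; its exact structure depends on the SDK version
@objc func handleAlanEvent(_ notification: Notification) {
    print("Alan AI event: \(notification.userInfo ?? [:])")
}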

What you finally get

After you complete this tutorial, you will have a single view iOS app integrated with Alan AI. You can get an example of such an app from the Alan AI GitHub to make sure you have set up your app correctly.

What’s next?

You can now proceed to building a voice interface with Alan AI. Here are some helpful resources: