Alan iOS SDK


Alan can be integrated with iOS apps developed in Swift and Objective-C.

Integrating with Alan

To add Alan voice to an iOS app, you need to do the following:

  1. Get the Alan iOS SDK framework

  2. Integrate with the app: Swift or Objective-C. As part of this process, you will:

    1. Add the AlanConfig object to your app

    2. Add the Alan button to your app

Step 1. Get the Alan iOS SDK framework

First, you need to get the Alan iOS SDK framework and set up your Xcode project to be used with Alan. You can do it in two ways:

  • Set up an Xcode project with CocoaPods

  • Set up an Xcode project manually

To set up the project with CocoaPods, do the following:

  1. On your machine, open Terminal and install CocoaPods:

    Terminal
    sudo gem install cocoapods
    
  2. Go to the project folder and create a Podfile for the project:

    Terminal
    pod init
    
  3. Open the Podfile and edit it:

    Podfile
    use_frameworks!
    platform :ios, '11.0'
    target '<Your Target Name>' do
    pod 'AlanSDK-iOS'
    end
    
  4. In the project folder, install the dependencies for the project:

    Terminal
    pod install
    pod update
    
  5. In iOS, the user must explicitly grant permission for an app to access the user’s data and resources. An app with the Alan button requires access to:

    • User’s device microphone for voice interactions

    • User’s device camera for testing Alan projects on mobile

    To comply with this requirement, you must add the NSMicrophoneUsageDescription and NSCameraUsageDescription keys to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone and camera. The message will be displayed only when Alan needs to activate the microphone or camera.

    To add the keys:

    1. In the Xcode project, go to the Info tab.

    2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

    3. From the list, select Privacy - Microphone Usage Description.

    4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app requests access to the microphone.

    5. Repeat the steps above to add the Privacy - Camera Usage Description key.

    ../../../_images/pods-mic.png
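If you manage Info.plist as source code (right-click the file and choose Open As > Source Code), the same two keys can be added directly. A minimal sketch; the description strings below are placeholders you should adapt to your app:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone for voice interactions with Alan.</string>
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to test the Alan project on a mobile device.</string>
```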
  6. To allow the background mode for the iOS app, go to the Signing & Capabilities tab. In the top left corner, click + Capability and in the capabilities list, double-click Background Modes. In the Modes list, select the Audio, AirPlay, and Picture in Picture check box.

    ../../../_images/pods-background.png
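Enabling the Background Modes capability with Audio selected adds the UIBackgroundModes key to Info.plist; if you edit the file as source, the equivalent entry looks like this:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```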
  7. The background mode must also be turned on in the Alan Studio project. In Alan Studio, at the top of the code editor, click Integrations, go to the iOS tab and enable the Keep active while the app is in the background option.

To set up the Xcode project manually, do the following:

  1. Open the Alan iOS SDK release page on Alan GitHub.

  2. Download the AlanSDK.xcframework_<x.x.x>.zip file from the latest release and extract AlanSDK.xcframework from the ZIP archive.

../../../_images/download.png
  3. Drag AlanSDK.xcframework and drop it onto the root node of the Xcode project.

  4. Select the Copy items if needed check box if it is not selected.

    ../../../_images/copy-2.png
  5. In the project tree, select the Xcode project file and go to the General tab. Under the Frameworks, Libraries, and Embedded Content section, find AlanSDK.xcframework and select Embed & Sign from the list.

    ../../../_images/embedded.png
  6. In iOS, the user must explicitly grant permission for an app to access the user’s data and resources. An app with the Alan button requires access to:

    • User’s device microphone for voice interactions

    • User’s device camera for testing Alan projects on mobile

    To comply with this requirement, you must add the NSMicrophoneUsageDescription and NSCameraUsageDescription keys to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone and camera. The message will be displayed only when Alan needs to activate the microphone or camera. To add the keys:

    1. In the Xcode project, go to the Info tab.

    2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

    3. From the list, select Privacy - Microphone Usage Description.

    4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app requests access to the microphone.

    5. Repeat the steps above to add the Privacy - Camera Usage Description key.

    ../../../_images/mic.png
  7. To allow the background mode for the iOS app, go to the Signing & Capabilities tab. In the top left corner, click + Capability and in the capabilities list, double-click Background Modes. In the Modes list, select the Audio, AirPlay, and Picture in Picture check box.

    ../../../_images/background.png
  8. The background mode must also be turned on in the Alan Studio project. In Alan Studio, at the top of the code editor, click Integrations, go to the iOS tab and enable the Keep active while the app is in the background option.

Step 2. Integrate with the app

Note

The instructions below apply to the Storyboard user interface.

You need to integrate Alan with your app written in:

  • Swift

  • Objective-C

In the Xcode project, open the ViewController.swift file and add the following Swift code to your view controller:

  1. At the top of the file, import AlanSDK:

    Client app
    import AlanSDK
    
  2. In the ViewController class, define the AlanButton variable:

    Client app
    fileprivate var button: AlanButton!
    
  3. In viewDidLoad(), set up AlanButton. For more details, see the AlanConfig object and Alan button sections.

    Client app
    import UIKit
    import AlanSDK
    
    class ViewController: UIViewController {
    
      /// Alan button
      fileprivate var button: AlanButton!
      override func viewDidLoad() {
        super.viewDidLoad()
    
        /// Setup the Alan button
        self.setupAlan()
      }
    
      fileprivate func setupAlan() {
    
        /// Define the project key
        let config = AlanConfig(key: "")
    
        ///  Init the Alan button
        self.button = AlanButton(config: config)
    
        /// Add the button to the view
        self.view.addSubview(self.button)
        self.button.translatesAutoresizingMaskIntoConstraints = false
    
        /// Align the button on the view
        let views = ["button" : self.button!]
        let verticalButton = NSLayoutConstraint.constraints(withVisualFormat: "V:|-(>=0@299)-[button(64)]-40-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        let horizontalButton = NSLayoutConstraint.constraints(withVisualFormat: "H:|-(>=0@299)-[button(64)]-20-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        self.view.addConstraints(verticalButton + horizontalButton)
      }
    }
    
  4. In let config = AlanConfig(key: ""), define the Alan SDK key for your Alan Studio project. To get the key, in Alan Studio, at the top of the code editor, click Integrations and copy the value from the Alan SDK Key field.

  5. Run the app and tap the Alan button to speak.

In the Xcode project, open the ViewController.m file and add the following Objective-C code to your view controller:

  1. Import AlanSDK:

    Client app
    @import AlanSDK;
    
  2. Define the AlanButton variable:

    Client app
    @property (nonatomic) AlanButton* button;
    
  3. In viewDidLoad, set up AlanButton. For more details, see the AlanConfig object and Alan button sections.

    Client app
    AlanConfig* config = [[AlanConfig alloc] initWithKey:@"YOUR_KEY_FROM_ALAN_STUDIO_HERE"];
    self.button = [[AlanButton alloc] initWithConfig:config];
    [self.button setTranslatesAutoresizingMaskIntoConstraints:NO];
    [self.view addSubview:self.button];
    NSLayoutConstraint* b = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeBottom relatedBy:NSLayoutRelationEqual toItem:self.view attribute:NSLayoutAttributeBottom multiplier:1 constant:-40.0];
    NSLayoutConstraint* r = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeRight relatedBy:NSLayoutRelationEqual toItem:self.view attribute:NSLayoutAttributeRight multiplier:1 constant:-20];
    NSLayoutConstraint* w = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeWidth relatedBy:NSLayoutRelationEqual toItem:nil attribute:NSLayoutAttributeNotAnAttribute multiplier:1 constant:64.0];
    NSLayoutConstraint* h = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeHeight relatedBy:NSLayoutRelationEqual toItem:nil attribute:NSLayoutAttributeNotAnAttribute multiplier:1 constant:64.0];
    [self.view addConstraints:@[b, r, w, h]];
    
  4. Run the app and tap the Alan button to speak.

AlanConfig object

The AlanConfig object defines the parameters used to initialize AlanButton.

  1. Create a new AlanConfig instance with your Alan project SDK key:

    Client app
    - (instancetype)initWithKey:(NSString *)key;
    

    key (NSString): The Alan SDK key for your project in Alan Studio.

  2. Create a new AlanConfig instance with your Alan project SDK key and custom data object:

    Client app
    - (instancetype)initWithKey:(NSString *)key dataObject:(NSDictionary *)dataObject;
    

    key (NSString): The Alan SDK key for a project in Alan Studio.

    dataObject (NSDictionary): (Optional) A valid JSON string or JSON object with authentication or configuration data to be sent to the voice script. For details, see authData.

    For example:

    Client app
    AlanConfig *config = [[AlanConfig alloc] initWithKey:@"YOUR_KEY_FROM_ALAN_STUDIO_HERE"];
    
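These Objective-C initializers are also available from Swift through bridging. A minimal sketch, assuming they bridge as AlanConfig(key:) and AlanConfig(key:dataObject:); the userToken field is a hypothetical example of data your voice script might read from authData:

```swift
import AlanSDK

/// Config with only the project key
let config = AlanConfig(key: "YOUR_KEY_FROM_ALAN_STUDIO_HERE")

/// Config with a custom data object; "userToken" is a hypothetical
/// field name - use the fields your voice script actually expects
let configWithData = AlanConfig(key: "YOUR_KEY_FROM_ALAN_STUDIO_HERE",
                                dataObject: ["userToken": "abc123"])
```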

Alan button

To add the Alan button to your app, use the AlanButton class. This class provides a view with the voice button and instance methods to communicate with Alan Studio.

Create a new AlanButton instance with the config object:

Client app
- (instancetype)initWithConfig:(AlanConfig *)config;

config (AlanConfig): The AlanConfig object described above.

For example:

Client app
@interface ViewController ()
@property (nonatomic) AlanButton *button;
@end

@implementation ViewController
- (void)viewDidLoad
{
  [super viewDidLoad];
  AlanConfig *config = [[AlanConfig alloc] initWithKey:@"YOUR_KEY_FROM_ALAN_STUDIO_HERE"];
  self.button = [[AlanButton alloc] initWithConfig:config];
  [self.button setTranslatesAutoresizingMaskIntoConstraints:NO];
  [self.view addSubview:self.button];

  NSLayoutConstraint *right = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeRight relatedBy:NSLayoutRelationEqual toItem:self.view attribute:NSLayoutAttributeRight multiplier:1 constant:-20.0];
  NSLayoutConstraint *bottom = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeBottom relatedBy:NSLayoutRelationEqual toItem:self.view attribute:NSLayoutAttributeBottom multiplier:1 constant:-20.0];
  NSLayoutConstraint *width = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeWidth relatedBy:NSLayoutRelationEqual toItem:nil attribute:NSLayoutAttributeNotAnAttribute multiplier:1 constant:64.0];
  NSLayoutConstraint *height = [NSLayoutConstraint constraintWithItem:self.button attribute:NSLayoutAttributeHeight relatedBy:NSLayoutRelationEqual toItem:nil attribute:NSLayoutAttributeNotAnAttribute multiplier:1 constant:64.0];
  [self.view addConstraints:@[right, bottom, width, height]];
}
@end

What’s next?

After integration, your app gets an in-app voice assistant that can be activated with the Alan button displayed on top of the app’s UI.

To build a full-fledged multimodal UX, you can use Alan’s SDK toolkit:

../../../_images/method.svg

Client API methods

Enable communication between the client app and Alan and perform actions in the app.


Learn more

../../../_images/handler.svg

Alan handlers

Handle commands, understand the button state and capture events in the client app.


Learn more

../../../_images/git-purple.svg

Example apps

Find and explore examples of voice-enabled apps on the Alan AI GitHub repository.


View on GitHub