
Alan Studio Walkthrough: Part 3

November 13, 2019


The third and final part in our series describing Alan Studio and how to get started with the Alan Platform.

Embed Code

Once you are happy with your scripts from part one and part two, you can add the Alan Button to your application. First, let’s go to the top of our project page and click “Embed Code”.

On the next screen, you will see options to integrate with any of the SDKs available for Web, iOS, and Android. You will also see the SDK key that we will copy into our application.

Once we decide which platform we want to integrate with, we will see various options to customize the Alan Button, including:

  • Alan Button: From here you can enable or disable the Alan Button.
  • “Hey Alan” Wake Word: Enable or disable activating Alan by saying the phrase “Hey Alan.”
  • Microphone Timeout: Set how long you would like the Alan Button to listen before turning off automatically.
  • Change Button Style: Change the color and behavior of the Alan Button to match your app UI.
  • Keep Active while the app is in the background: Choose whether you want Alan to keep listening for user input while the app is running in the background.
  • “Speech to Text” Panel: Enable or disable the speech window next to the Alan Button that shows the speech recognition as the user dictates to the app.

Below all these options, you will see sample instructions on how to embed the Alan Button for the platform you have chosen.

iOS Integration

Let’s begin by downloading the iOS AlanSDK.framework from our GitHub.

In Xcode, we will copy the framework into our project. Once it is copied, we can add the framework to “Embedded Binaries” and “Linked Frameworks and Libraries” within the General tab for our target. Finally, we need to request microphone access for our application by adding the NSMicrophoneUsageDescription key to Info.plist. We will describe this permission as “Alan Button voice”.
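For reference, the resulting Info.plist entry looks like this (the description string below is just our example wording; use whatever explanation fits your app):

     <key>NSMicrophoneUsageDescription</key>
     <string>Alan Button voice</string>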

From here, we will follow the directions given on the integrations page.

  1. First, we will add the Alan SDK framework:

       import AlanSDK

  2. Define the Alan button variable:

       fileprivate var button: AlanButton!

  3. Set up the Alan button in viewDidLoad():

       // Create the button with the SDK key from the Embed Code page.
       let config = AlanConfig(key: "db7037e977051d95db5fe953814f71702e956eca572e1d8b807a3e2338fdd0dc/stage")
       self.button = AlanButton(config: config)
       self.button.translatesAutoresizingMaskIntoConstraints = false
       self.view.addSubview(self.button)
       self.view.bringSubviewToFront(self.button)

       // Pin the 64x64 button to the bottom-right corner of the view.
       let b = NSLayoutConstraint(item: self.button, attribute: .bottom, relatedBy: .equal,
                                  toItem: self.view, attribute: .bottom, multiplier: 1, constant: -40)
       let r = NSLayoutConstraint(item: self.button, attribute: .right, relatedBy: .equal,
                                  toItem: self.view, attribute: .right, multiplier: 1, constant: -20)
       let w = NSLayoutConstraint(item: self.button, attribute: .width, relatedBy: .equal,
                                  toItem: nil, attribute: .notAnAttribute, multiplier: 1.0, constant: 64)
       let h = NSLayoutConstraint(item: self.button, attribute: .height, relatedBy: .equal,
                                  toItem: nil, attribute: .notAnAttribute, multiplier: 1.0, constant: 64)
       self.view.addConstraints([b, r, w, h])

       // Subscribe to events coming from the Alan SDK, such as commands sent by our script.
       NotificationCenter.default.addObserver(self, selector: #selector(self.handleEvent(_:)),
                                              name: NSNotification.Name(rawValue: "kAlanSDKEventNotification"),
                                              object: nil)

  4. Play some text:

       self.button.playText("Hello")

  5. Send a command (we will show how to wire these calls to a UI control after the list):

       let data = ["voice": "next"]
       self.button.playData(data)

  6. And handle our commands from Alan Studio:

       @objc func handleEvent(_ notification: Notification) {
           // The SDK delivers events as a JSON string inside the notification's userInfo.
           guard let userInfo = notification.userInfo,
                 let jsonString = userInfo["jsonString"] as? String,
                 let data = jsonString.data(using: .utf8),
                 let obj = try? JSONSerialization.jsonObject(with: data, options: []),
                 let json = obj as? [String: Any]
           else { return }
           print(json)
       }
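Printing the JSON is a good way to verify the round trip, but in practice you will want to branch on what the script sent. Below is a minimal sketch of a dispatcher you could call from handleEvent(_:) once the JSON is decoded; the top-level "command" key is our assumption, so match it to whatever your Alan Studio script actually sends with p.play():

       // Hypothetical dispatcher, called from handleEvent(_:) after decoding.
       // The "command" key is an assumed payload shape, not part of the SDK.
       func handleCommand(_ json: [String: Any]) {
           guard let command = json["command"] as? String else { return }
           switch command {
           case "next":
               print("Advancing to the next screen") // replace with your navigation logic
           default:
               print("Unhandled command: \(command)")
           }
       }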
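Similarly, playText and playData are ordinary methods on the button, so you can call them from any UI handler in the same view controller. A minimal sketch, assuming a hypothetical onNextTapped action wired to a UIButton:

       // Hypothetical UIButton action: speak a phrase and send the
       // "voice"/"next" payload from step 5 to the script.
       @objc func onNextTapped(_ sender: UIButton) {
           self.button.playText("Moving on")
           self.button.playData(["voice": "next"])
       }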

Congratulations, that’s everything! You have successfully integrated the Alan Button into your first iOS application! From here, we challenge you to find new and creative scripts to help power the future of Voice, and to integrate them into your iOS, Android, and Web applications! If you have any questions, feel free to check out our documentation, and if you still can’t find the answer there, head over to our community Slack channel, where our team is always available to help you with your next project!

