
2020 Q3 Alan Platform Update

October 15, 2020

We’ve been hard at work these past few months, adding and updating the most requested features to the Alan Platform. These new product updates will provide ease and efficiency to your day as you create, test, and deploy human-like conversational experiences in your applications.


Here are the updates to get you working faster and smarter in Alan Studio.

Dynamic Entities

As business needs evolve, the values used in voice conversations (entities) change over time – for example, the daily lunch specials in a food app, or the favorite clothing items a user saves during an online shopping spree.

To address this frequently requested need, we’ve added scripting functionality to create ‘adaptive’ voice commands. As a result, the conversational voice dialog can adjust to new business needs and provide highly personalized user experiences. Voice conversation values (or entities) capture information from the user’s input, much like slots defined at design time. With this feature, you can now also update entities at runtime, without restarting the dialog session.
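To give a sense of the idea, here is a minimal sketch of an Alan voice script. The intent() and p.play() calls are standard Alan script functions; the lunchSpecials list and the $(ITEM~ lunchSpecials) dynamic-slot notation are illustrative assumptions, so refer to the documentation linked below for the exact syntax:

    // Illustrative list of entity values that can be refreshed at runtime,
    // for example today's lunch specials pushed from your backend.
    let lunchSpecials = ['tomato soup', 'club sandwich', 'caesar salad'];

    // Hypothetical dynamic-slot notation – see the Dynamic Entities docs for the exact pattern syntax.
    intent(`(I want|I'll have) the $(ITEM~ lunchSpecials)`, p => {
        // p.ITEM.value holds the value the user matched against the current list.
        p.play(`Great choice! One ${p.ITEM.value} coming right up.`);
    });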

Learn more about how to add Dynamic Entities here.

Predefined Callbacks

Alan offers a set of predefined callbacks that you can use to perform necessary tasks and handle significant events in your voice script. 

For example, the onCleanupUser callback is invoked when the user disconnects from Alan or the conversation has finished. Here you can perform any cleanup activities or save the user data.
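Here is a minimal sketch of that callback in a voice script, assuming it receives the user’s p object like other Alan callbacks; the logging stands in for whatever storage your project actually uses:

    // Runs when the user disconnects from Alan or the dialog session ends.
    onCleanupUser(p => {
        // p.userData holds data the script accumulated for this user during the session.
        // A real project would persist it somewhere; here we simply log it.
        console.log('Session finished, saving user data:', p.userData);
    });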

See the full list of callbacks that can be used in Alan here.

Voice Script Flowchart

With this new flowchart, there’s no more need to repeatedly review the dialog flow and tirelessly hunt for possible problems in the script code. Instead, you get a visual representation of the voice script and its workflow – the script blocks are displayed as a diagram – so that you can make sure the voice script is robust while writing it.

The flowchart panel is located at the bottom right of the code editor and can be enlarged or minimized as you prefer.

Voice Script Import

If you’ve previously exported voice scripts from other voice assistant platforms or worked with them in an external IDE, you can now easily import them into Alan Studio by selecting the file or doing a simple drag-n-drop.

Sample Alan Apps

We are also providing a set of sample starter apps so that you can test and see how Alan’s voice can be integrated with your apps on different platforms: Web, iOS, Android, Flutter, and Ionic.

Check out our repositories on GitHub and start interacting with Alan right away to see what a voice interface for an app can look like. You can also look inside the app files to see the integration specifics.
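As a taste of what those integration specifics look like on the web, here is a minimal sketch using the Alan web SDK; the SDK key is a placeholder, and the 'navigate' command with its route field is an illustrative assumption about what your voice script might send via p.play():

    // Minimal web integration sketch using the @alan-ai/alan-sdk-web package.
    import alanBtn from '@alan-ai/alan-sdk-web';

    alanBtn({
        key: 'YOUR_ALAN_SDK_KEY', // placeholder – copy the key from your Alan Studio project
        onCommand: (commandData) => {
            // 'navigate' is an illustrative command the voice script could send with p.play().
            if (commandData.command === 'navigate') {
                window.location.href = commandData.route;
            }
        },
    });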

New Scripts

We’ve added two new example project scripts that you can use, edit, and test to better understand the capabilities of the Alan Platform.

  • Hello World – Create your first voice assistant with basic Alan functionality (see the brief script sketch after this list). You will learn how to use:
    • intents and the play function;
    • simple patterns and alternatives;
    • user-defined and predefined slots; and
    • contexts.
  • Alan Integrations – Learn how to integrate Alan with your apps on different platforms.
    • Web: JavaScript, React, Angular, Vue, Ember, Electron
    • iOS: Objective-C, Swift
    • Android: Java, Kotlin
    • Cross-platform: Flutter, Ionic, React Native
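To give a rough idea of what the Hello World script walks through, here is a minimal sketch built from the standard Alan script functions; the exact patterns and slot names in the example project may differ:

    // An intent with alternatives in the pattern, answered with the play function.
    intent('(hello|hi) world', p => {
        p.play('Hello! Nice to meet you.');
    });

    // A predefined slot (NAME) and a user-defined slot (COLOR) with its values listed inline.
    intent('My name is $(NAME) and my favorite color is $(COLOR red|green|blue)', p => {
        p.play(`Nice to meet you, ${p.NAME.value}. ${p.COLOR.value} is a great color!`);
    });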

We’ve also added some great new features that your app users will enjoy!

Deactivating Alan Button with Voice

You can now give your users a completely hands-free voice experience within your app from the moment they enter through its digital doors to the moment they want to leave. 

By simply saying “Thank you Alan” or “Alan Stop”, users can deactivate Alan and put him to sleep until he is needed again. This feature is great for users who are on the go or just far enough away from the phone that those few extra steps can be saved for better things.

Wake Word in the Background

Users can now activate Alan even if the app is running in the background on their phone. So if a user is scrolling through their bank statement and realizes that the nightstand they ordered from your app hasn’t yet arrived, they just need to say “Hey Alan, what’s the status of my order?” and Alan will answer.

You can turn on this feature in Alan Studio on your project’s Integration page by toggling on Wake Word In Background. Currently supported on iOS and Android.


What’s next?

We’ll keep updating the Alan Platform tools and features and make sure that you and your app users have an unforgettable voice experience with Alan. We’re always interested in feedback, so please let us know your thoughts!

In the meantime, see what voice app experiences you can create with the new features mentioned above or check out our Docs for more information and tutorials. For questions or issues, join our Slack channel or email us at support@alan.app. We look forward to seeing what you will create with the Alan Platform!

The Alan AI Team
