
Create a Visual Voice experience for your hybrid apps built with Ionic

May 23, 2019 (updated December 14, 2023)

With this Alan Voice Platform integration, you can add a Visual Voice experience to any application developed with the Ionic framework. As part of our integration, we created a sample Food Delivery application and then added a Visual Voice experience to it. Let’s walk through the four steps of the Ionic integration with Alan:

1. Create your Ionic Application

For the purpose of this tutorial, we’ve created a sample Food Delivery application. It’s built with Ionic and runs on iOS. The integration steps here will work with any Ionic application that uses the Angular framework.

Now that we have an application to integrate with, we’re going to show how to add a Visual Voice experience. We assume that your Ionic app can be run both on a desktop environment and on an iOS device with all necessary dependencies installed. If not, you can do the following to install these necessary tools and plugins.

2. Install and Update Ionic and Cordova plugins

In Terminal, change into your application’s directory. This will look something like:

cd Users/admin/Documents/Demo/ionic-app

Now, enter the following commands into Terminal to install the dependencies for your application and update Ionic and Cordova:

npm i
npm i -g ionic
npm install -g cordova
ionic cordova platform add ios

And to build the application, enter:

ionic cordova build ios

Open the Xcode project and click the play button in the upper left to run it.


This application has a menu, food items, and a checkout process. Now let’s integrate it with Alan.


3. Integrate the application with Alan

We need to install two Alan packages to use voice in our application. Head back to Terminal and run the following commands:

npm i @alan-ai/cordova-plugin-alan-voice --save
npm i @alan-ai/alan-button --save
ionic cordova plugin add @alan-ai/cordova-plugin-alan-voice

Now we need to modify the application to enable the voice button. 

The first step is to enable the use of custom HTML tags with Angular. To do this, open the app.module.ts file and add the following lines there:

import {CUSTOM_ELEMENTS_SCHEMA, NgModule} from '@angular/core';

and

schemas: [CUSTOM_ELEMENTS_SCHEMA],

The import goes alongside the existing imports at the top of app.module.ts, and the schemas entry goes inside the @NgModule decorator.
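For reference, a minimal app.module.ts with both additions might look like the sketch below; everything other than the two added lines is a placeholder for whatever your app already declares.

```typescript
// app.module.ts -- sketch; the modules and components around the two
// added lines stand in for your app's existing declarations.
import { BrowserModule } from '@angular/platform-browser';
import { CUSTOM_ELEMENTS_SCHEMA, NgModule } from '@angular/core'; // added

import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule],
  // added: lets Angular accept custom tags such as <alan-button>
  schemas: [CUSTOM_ELEMENTS_SCHEMA],
  bootstrap: [AppComponent],
})
export class AppModule {}
```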

The next step is to register the Alan button Web Component. To do this, go to the main.ts file and add the following lines:

import { defineCustomElements as alanBtnDefineCustomElements} from '@alan-ai/alan-button/dist/loader';

and

alanBtnDefineCustomElements(window);

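Both lines slot into the standard Ionic/Angular bootstrap file. A sketch of main.ts with the registration in place (the surrounding bootstrap code is the framework default):

```typescript
// main.ts -- sketch; only the two Alan-related lines are additions,
// the rest is the default Ionic/Angular bootstrap.
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';

// added: loader for the <alan-button> web component
import { defineCustomElements as alanBtnDefineCustomElements } from '@alan-ai/alan-button/dist/loader';

import { AppModule } from './app/app.module';

// added: register the <alan-button> custom element before bootstrapping
alanBtnDefineCustomElements(window);

platformBrowserDynamic()
  .bootstrapModule(AppModule)
  .catch(err => console.log(err));
```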

Now we’ll add the Alan button HTML tag to the main app’s template. To do this, go to the app.component.html and add the following tag:

">

Our Alan project key will go in between the empty quotes of the alan-key attribute.

Now we need to create the Alan Project that will contain the logic for the Visual Voice experience for our app.

So if you haven’t already, go to https://studio.alan.app, and create a free account.

Then create a project and add an empty script to it.


Create a version for the set of scripts that will run in your app, and set the project to Production.


Then, under the “Web” tab, copy the Alan SDK Key and paste it into the alan-key attribute in the app.component.html file.



After these steps, your app.component.html contains the Alan button tag with your SDK key in place.

Now go back to Terminal and build the application using:

ionic cordova build ios

Run the application in your Xcode project, and you will now see the Alan button in the lower right — the integration is complete.


Now we’ll add the Visual Voice experience for this application.

4. Create the Visual Voice experience in Alan

In Alan Studio, we’ll

  1. Create a complete Visual Voice experience for our Food Delivery application
  2. Add scripts that cover the full logic of our Food Delivery app
  3. Add commands for asking about what’s on the menu, adding items to the order, and checkout

To add scripts in Alan Studio, switch the project to Development mode and enter the script content into the script area. After that, create a new version, “v2”, for the project as you did in the previous steps, and promote it to the Production environment.


Within Alan Studio, you can test commands like “What’s on the menu?”, “Add a pepperoni pizza”, and “Checkout”.
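To give a feel for the script side, here is a sketch of what such a script could look like. Inside Alan Studio the intent() API is provided by the runtime; the phrases, spoken replies, and command payloads below are assumptions for this sample app, and the stub at the top exists only so the sketch is self-contained outside the Studio.

```typescript
// Stub so this sketch runs outside Alan Studio; inside the Studio script
// area, intent() is provided by the runtime and this stub is omitted.
const intents: Array<[string, (p: any) => void]> = [];
function intent(phrase: string, handler: (p: any) => void): void {
  intents.push([phrase, handler]);
}

// Voice commands for the Food Delivery app. p.play() with a string speaks
// a reply; p.play() with an object sends a command payload down to the
// app's "command" event listener.
intent("What's on the menu?", p => {
  p.play('We have pepperoni, margherita, and veggie pizzas.');
  p.play({ command: 'navigate', screen: 'pizzas' });
});

intent('Add a pepperoni pizza', p => {
  p.play('Adding one pepperoni pizza to your order.');
  p.play({ command: 'addItem', item: 'pepperoni pizza' });
});

intent('Checkout', p => {
  p.play('Taking you to checkout.');
  p.play({ command: 'checkout' });
});
```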

To support the visuals for these voice commands, we need to add some handlers back in the Food Delivery application.

In the Food Delivery project, we need to change app.component.ts. In the AppComponent class, import the alan-button component, and add the reference to the Alan Button element in order to set up listeners for voice commands.  

import '@alan-ai/alan-button';

@ViewChild('alanBtnEl') alanBtnComponent: ElementRef;


In the ngAfterViewInit method, we add a listener for the “command” event. All commands sent from the Food Delivery scripts are passed to this listener. This is where we define how the application reacts to commands from the script and make the necessary updates to the application’s UI.

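One way to structure that listener is to keep the command handling in a pure function so it is easy to test. The state shape and command names below are assumptions for this sample app, and the listener wiring shown in the trailing comment assumes the command payload arrives on the event’s detail property.

```typescript
// Hypothetical app state and command payloads; adjust to your own model.
interface AppState {
  screen: string;
  order: string[];
}

// Pure reducer: maps a command from the Alan script onto the next UI state.
function handleCommand(state: AppState, cmd: any): AppState {
  switch (cmd.command) {
    case 'navigate':
      return { ...state, screen: cmd.screen };
    case 'addItem':
      return { ...state, order: [...state.order, cmd.item] };
    case 'checkout':
      return { ...state, screen: 'checkout' };
    default:
      return state;
  }
}

// In ngAfterViewInit, the listener just delegates to the reducer
// (assumption: the payload is on event.detail):
//
//   this.alanBtnComponent.nativeElement.addEventListener('command',
//     (event: any) => {
//       this.state = handleCommand(this.state, event.detail);
//     });
```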

In the ngOnInit method, we subscribe to changes in the application’s state and call the updateVisualState method, which sends the new visual state to the Alan Studio script. This is how we keep the state of the application and the state of the script synchronized.

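A sketch of that synchronization, assuming the button element exposes a setVisualState method as the Alan web SDK does; the payload shape, the cartService subscription, and the property names are assumptions for this sample app.

```typescript
// Project the app state into the payload sent to the Alan script; keeping
// this a pure function makes the sync logic easy to test.
function toVisualState(screen: string, order: string[]) {
  return { screen, itemCount: order.length };
}

// In the component (assumption: the element exposes setVisualState, as
// the Alan web SDK does):
//
//   ngOnInit() {
//     this.cartService.changes.subscribe(() => this.updateVisualState());
//   }
//
//   updateVisualState() {
//     this.alanBtnComponent.nativeElement.setVisualState(
//       toVisualState(this.screen, this.order));
//   }
```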

With these handlers in place, when the user says something like “What pizzas do you have?”, they will be navigated to the pizza menu and the pizzas will be highlighted as they are read aloud to the user.

Now we can save our changes and build the application using:

ionic cordova build ios

There’s our application with the full Visual Voice experience enabled. We can test out a few of the commands:

  • “What pizzas do you have?”
  • “Add two pepperoni pizzas and two sodas”
  • “Checkout”

That concludes this tutorial on how to integrate voice into any hybrid application built with Ionic. Here’s a full video of the integration.
