With the Alan Voice Platform, you can add a Visual Voice experience to any application built with the Ionic framework. To show how, we created a sample Food Delivery application and then added a Visual Voice experience to it. Let’s walk through the four steps of the Ionic integration with Alan:
1. Create your Ionic Application
For the purpose of this tutorial, we’ve created a sample Food Delivery application. It’s built with Ionic and runs on iOS, but the integration steps below will work with any Ionic application that uses the Angular framework.
Now that we have an application to integrate with, we’re going to show how to add a Visual Voice experience. We assume that your Ionic app can run both in a desktop environment and on an iOS device with all necessary dependencies installed. If not, follow the steps below to install the required tools and plugins.
2. Install and Update Ionic and Cordova plugins
In Terminal, change to your application’s directory. The command will look something like:
cd /Users/admin/Documents/Demo/ionic-app
Now, enter the following commands into Terminal to install the dependencies for your application and update Ionic and Cordova:
npm i
npm i -g ionic
npm install -g cordova
ionic cordova platform add ios
And to build the application, enter:
ionic cordova build ios
Open the Xcode project and click the play button in the upper left to run it.
As you can see, this application has a menu, food items, and a checkout process. Now let’s integrate with Alan.
3. Integrate the application with Alan
We need to install two Alan packages to use voice in our application. Head back to Terminal and run the following commands:
npm i @alan-ai/cordova-plugin-alan-voice --save
npm i @alan-ai/alan-button --save
ionic cordova plugin add @alan-ai/cordova-plugin-alan-voice
Now we need to modify the application to enable the voice button.
The first step is to allow custom HTML tags in Angular templates. To do this, open the app.module.ts file and add the following two lines:
import {CUSTOM_ELEMENTS_SCHEMA, NgModule} from '@angular/core';
and
schemas: [CUSTOM_ELEMENTS_SCHEMA],
View the image below to see the appropriate placement of these two lines within app.module.ts:
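If you’d rather see it in text form, here’s a rough sketch of the result (your module will also list your own declarations, routing, and providers, which are omitted here):

// app.module.ts (abridged sketch -- keep your existing declarations and providers)
import { CUSTOM_ELEMENTS_SCHEMA, NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { IonicModule } from '@ionic/angular';
import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, IonicModule.forRoot()],
  bootstrap: [AppComponent],
  // lets Angular accept the custom <alan-button> tag in templates
  schemas: [CUSTOM_ELEMENTS_SCHEMA],
})
export class AppModule {}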
The next step is to register the Alan button Web Component. To do this, go to the main.ts file and add the following lines:
import { defineCustomElements as alanBtnDefineCustomElements} from '@alan-ai/alan-button/dist/loader';
and
alanBtnDefineCustomElements(window);
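In a typical Ionic Angular project, main.ts then looks roughly like this (a sketch; your file may differ slightly):

// main.ts (sketch)
import { enableProdMode } from '@angular/core';
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { defineCustomElements as alanBtnDefineCustomElements } from '@alan-ai/alan-button/dist/loader';

import { AppModule } from './app/app.module';
import { environment } from './environments/environment';

if (environment.production) {
  enableProdMode();
}

platformBrowserDynamic()
  .bootstrapModule(AppModule)
  .catch(err => console.log(err));

// register the <alan-button> web component
alanBtnDefineCustomElements(window);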
Now we’ll add the Alan button HTML tag to the main app’s template. To do this, go to the app.component.html and add the following tag:
">
And here’s how it looks:
Now we need to create the Alan Project that will contain the logic for the Visual Voice experience for our app.
If you haven’t already, go to https://studio.alan.app and create a free account.
Create a version for the set of scripts that will run in your app and set it to the Production environment.
Then click the “>” button and, under the “Web” tab, copy the Alan SDK key and paste it into the alan-key attribute in the app.component.html file.
After these steps your app.component.html file will look like this:
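Roughly like the sketch below, that is. The surrounding markup shown here is the default Ionic shell and may differ in your app; the placeholder key is where your own Alan SDK key goes:

<!-- app.component.html (sketch) -->
<ion-app>
  <ion-router-outlet></ion-router-outlet>
  <alan-button #alanBtnEl alan-key="YOUR_ALAN_SDK_KEY"></alan-button>
</ion-app>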
Now go back to Terminal and build the application using:
ionic cordova build ios
Run the application in your Xcode project, and you will now see the Alan button in the lower right — the integration is complete.
Now we’ll add the Visual Voice experience for this application.
4. Create the Visual Voice experience in Alan
In Alan Studio, we’ll
- Create a complete Visual Voice experience for our Food Delivery application
- Add scripts that cover the full logic of our Food Delivery app
- Add commands for asking about what’s on the menu, adding items to the order, and checkout
To add scripts in Alan Studio, switch the project to Development mode and enter the script content in the script area. Then create a new version (“v2”) for the project, as in the previous step, and set it to the Production environment.
Within Alan Studio, you can test commands like “What’s on the menu?”, “Add a pepperoni pizza”, and “Checkout”.
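As a starting point, a voice command in an Alan Studio script looks something like the sketch below. The intent and play functions are part of Alan’s scripting API; the command names (showMenu, addToCart, checkout), the menu items, and the responses are only illustrative, so use whatever your own app and handlers expect:

// Alan Studio script (sketch) -- command names and items are illustrative
intent('What is on the menu?', p => {
    p.play('We have pepperoni, margherita and veggie pizzas, plus sodas.');
    p.play({command: 'showMenu'});
});

intent('Add a $(ITEM pepperoni pizza|margherita pizza|soda)', p => {
    p.play(`Adding one ${p.ITEM.value} to your order.`);
    p.play({command: 'addToCart', item: p.ITEM.value});
});

intent('Checkout', p => {
    p.play('Sure, taking you to checkout.');
    p.play({command: 'checkout'});
});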
To support the visuals for these voice commands, we need to add some handlers back in the Food Delivery application.
In the Food Delivery project, we need to change app.component.ts: import the alan-button package and add a reference to the Alan button element in the AppComponent class so we can set up listeners for voice commands.
import '@alan-ai/alan-button';

@ViewChild('alanBtnEl') alanBtnComponent: ElementRef;
In the ngAfterViewInit method, we add a listener for the “command” event. Every command sent from the Food Delivery script is passed to this listener; this is where we define how the application reacts to each command and make the corresponding updates to the application’s UI.
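Here is a minimal sketch of that handler, assuming the same illustrative command names as in the script sketch above and a hypothetical cart service and router:

// app.component.ts (sketch) -- command names, cartService and routing are illustrative
import { AfterViewInit, Component, ElementRef, ViewChild } from '@angular/core';
import '@alan-ai/alan-button';

@Component({ selector: 'app-root', templateUrl: './app.component.html' })
export class AppComponent implements AfterViewInit {
  @ViewChild('alanBtnEl') alanBtnComponent: ElementRef;

  ngAfterViewInit() {
    // every p.play({command: ...}) from the Alan Studio script arrives here
    this.alanBtnComponent.nativeElement.addEventListener('command', (event: any) => {
      const data = event.detail;
      switch (data.command) {
        case 'showMenu':
          // navigate to, or scroll to, the menu
          break;
        case 'addToCart':
          // this.cartService.add(data.item);        // hypothetical cart service
          break;
        case 'checkout':
          // this.router.navigateByUrl('/checkout'); // hypothetical route
          break;
      }
    });
  }
}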
In the ngOnInit method, we subscribe to changes in the application’s state and call the updateVisualState method, which sends the new visual state to the Alan Studio script. This is how we keep the state of the application and the state of the script synchronized.
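A sketch of that sync, assuming the button element exposes setVisualState (as in the Alan web SDK) and a hypothetical observable order stream on the cart service:

// inside AppComponent (sketch) -- cartService and its order$ observable are hypothetical
ngOnInit() {
  this.cartService.order$.subscribe(order => this.updateVisualState({ order }));
}

// sends the current visual state to the Alan Studio script,
// where it can be read from p.visual
private updateVisualState(state: any) {
  this.alanBtnComponent.nativeElement.setVisualState(state);
}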
Now we can save our changes and build the application using:
ionic cordova build ios
There’s our application with the full Visual Voice experience enabled. We can test out a few of the commands:
- “What pizzas do you have?”
- “Add two pepperoni pizzas and two sodas”
- “Checkout”
That concludes this tutorial on how to integrate voice into any hybrid application built with Ionic. Here’s a full video of the integration.