
Add Voice & AI control to Flutter Applications

August 5, 2019

With the Alan Voice Platform integration, you can add voice control to any application developed with the Flutter framework. We took a sample Flutter app built by the Google team and added a Visual Voice experience to it. Let's walk through the four steps of the Flutter integration with Alan:

1. Download sample Flutter app

For this tutorial, we're using the Shrine sample application from Google. Clone this repo to download it: https://github.com/flutter/samples/tree/master/shrine

Make sure you have the Flutter framework installed; if not, follow the installation guide first: https://flutter.dev/docs/get-started/install/

2. Run demo app

Open the Shrine project in your Flutter IDE and run it. Shrine is a sample shopping app where you can browse and purchase apparel and other items.

3. Integrate the application with Alan

To add voice to our Shrine application, we need to install the Alan voice dependency. Open pubspec.yaml, add the following line to the dependencies section, and run flutter pub get:

dependencies:
  alan_voice: 1.0.1

Now we'll add the Alan button to the app's main window. To do this, open app.dart and add the following import:

import 'package:alan_voice/alan_voice.dart';

Then, in the initState method, add the following call:

AlanVoice.initButton(
    "",  // Alan SDK key goes here; we'll copy it from Alan Studio below
    buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);

The first argument is our Alan SDK key; we'll copy it from Alan Studio in a moment.
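For placement, a minimal initState in _ShrineAppState might look like this (a sketch; the empty key string is the placeholder we fill in below):

@override
void initState() {
  super.initState();
  // Initialize the Alan voice button; the first argument is the
  // Alan SDK key copied from Alan Studio (next step).
  AlanVoice.initButton(
      "",
      buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
}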

Next, we need to create an Alan Project containing the logic for the Visual Voice experience in our app.

So if you haven’t already, go to https://studio.alan.app, and create an account.


Then create a project and add an empty script to it.


Create a version for the set of scripts that will run in your app, and set the project to Production.


Then, under the "Android" tab of the project's integration settings, copy the Alan SDK key and paste it as the first argument of AlanVoice.initButton in the app.dart file.


Now build and run the application. The Alan button appears in the lower-left corner, and the basic integration is complete.

In the next step, we will add the Visual Voice experience for this application.

4. Create the Visual Voice experience in Alan

In Alan Studio, we will:

  1. Create a complete Visual Voice experience for our Shrine application
  2. Add scripts that cover the full logic of our Shrine app
  3. Add commands for asking what's on the menu, adding items to the order, and checking out

In Alan Studio, switch to Development mode and add the script content in the script editor. Then create a new version, "v2", for the project as we did in the previous step, and switch it to the Production environment.
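As a minimal sketch of what such a script can contain (the phrases and the 'addToCart' payload below mirror the command handler we add to the app in a moment; the full Shrine script is longer), an Alan Studio intent that sends a command to the app looks roughly like this:

// Illustrative Alan Studio script fragment, not the full Shrine script.
intent('What can I do here?', p => {
    p.play('You can browse items, add them to your cart, and check out.');
});

// $(ITEM ...) defines a slot listing the sample items it can match.
intent('Add $(ITEM stella sunglasses|vagabond sack)', p => {
    // Objects passed to p.play() are delivered to the app as commands.
    p.play({command: 'addToCart', item: p.ITEM.value, quantity: 1});
});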


Within Alan Studio, we can test commands like “What can I do here?”, “Add stella sunglasses”, and “Checkout”.

To support the visuals for these voice commands, we need to add some handlers back in the Shrine application.

In the Shrine project, we need to change app.dart. Add the following method to the _ShrineAppState class; it will handle commands from the script, such as opening a specific page or adding items to the cart.

void _handleCommand(Map command) {
  debugPrint("New command: $command");
  switch (command["command"]) {
    case "clearOrder":
      _handleClearOrder();
      break;
    case "addToCart":
      _addToCart(command["item"], command["quantity"]);
      break;
    case "highlightProducts":
      _highlightProduct(command["value"]);
      break;
    case "navigation":
      _navigateTo(command["route"]);
      break;
    case "highlight":
      _highlightWidget(command["value"]);
      break;
    case "show_products":
      _filterProducts(command["items"]);
      break;
    case "finishOrder":
      _handleFinishOrder();
      break;
    default:
      debugPrint("Unknown command: $command");
  }
}
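The methods this switch dispatches to are app-specific and not shown in full here. As a hypothetical sketch of one of them, assuming the app keeps a GlobalKey<NavigatorState> named _navigatorKey (our assumption, not part of the sample app):

// Sketch: handle {"command": "navigation", "route": "/accessories"}.
void _navigateTo(String route) {
  // Push the named route the script asked for, if the navigator is ready.
  _navigatorKey.currentState?.pushNamed(route);
}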

In the initState method, we also add a listener for the "command" event. Every command sent from the Alan script will be passed to _handleCommand:

AlanVoice.callbacks.add((command) => _handleCommand(command.data["data"]));

In main.dart, we subscribe to changes in the application's state and call the model's setVisuals method, which sends the new visual state to the Alan Studio script. This keeps the state of the application and the script synchronized.

model.addListener(() => model.setVisuals());
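What setVisuals sends is up to the model. A minimal sketch, assuming the script only needs the current category and cart size (both field names here are our assumptions), could serialize them with the plugin's AlanVoice.setVisualState call, which takes a JSON string:

// In the model class; requires: import 'dart:convert';
void setVisuals() {
  // Serialize the parts of the app state the script cares about and
  // push them to Alan as the new visual state.
  final state = {
    "screen": currentCategory,          // assumed model field
    "cartSize": productsInCart.length,  // assumed model field
  };
  AlanVoice.setVisualState(jsonEncode(state));
}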

Now when the user says something like "Show accessories", they'll be taken to the accessories screen, with the accessories highlighted as they are read aloud.

Let’s save our changes and build the application.

We have created our application with a full Visual Voice experience using Alan. Let’s test the following commands:

  • “What home items do you have?”
  • “Add two stella sunglasses and vagabond sack”
  • “Checkout”

This concludes our tutorial on how to integrate voice into any Flutter application with Alan. Here's a full video of the integration.
