With the Alan Voice Platform integration, you can add voice control to any application developed with the Flutter framework. We took a sample Flutter app built by the Google team and added a Visual Voice experience to it. Let's walk through the four steps of the Flutter integration with Alan:
1. Download sample Flutter app
For the purposes of this tutorial, we're using the Shrine sample application from Google. Clone this repo to download it: https://github.com/flutter/samples/tree/master/shrine
Make sure you have the Flutter framework installed; if not, follow the installation guide first: https://flutter.dev/docs/get-started/install/
2. Run demo app
Open the Shrine project in your Flutter IDE and press run. This is a sample app where you can browse and purchase apparel and other items.
3. Integrate the application with Alan
To add voice to our Shrine application, we need to install the alan_voice dependency. Open pubspec.yaml and add the following line to the dependencies section:
alan_voice: 1.0.1
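The dependencies section of pubspec.yaml would then look something like this (the flutter entry is the one already generated for the sample app):

```yaml
dependencies:
  flutter:
    sdk: flutter
  alan_voice: 1.0.1
```

After saving the file, run `flutter pub get` so the package is fetched before you build.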
Now we'll add the Alan button to the main app window. To do this, open app.dart and add the following import:
import 'package:alan_voice/alan_voice.dart';
And in the initState function add the line:
AlanVoice.initButton("", buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
The first argument will be our Alan project key.
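Putting the pieces together, the relevant part of the app's state class might look like this (a sketch; the placeholder key string is where your own project key goes, and the exact class names follow the Shrine sample):

```dart
import 'package:alan_voice/alan_voice.dart';

class _ShrineAppState extends State<ShrineApp> {
  @override
  void initState() {
    super.initState();
    // The first argument is the Alan project key copied from Alan Studio.
    AlanVoice.initButton(
      "YOUR_ALAN_PROJECT_KEY",
      buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT,
    );
  }
  // ... rest of the state class ...
}
```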
Next, we need to create an Alan Project containing the logic for the Visual Voice experience in our app.
So if you haven’t already, go to https://studio.alan.app, and create an account.
Then create a project and add an empty script to it.
Create a version for the set of scripts that will run in your app, and set the project environment to Production.
Then click the ">" button, and under the "Android" tab, copy the Alan SDK key and paste it as the first argument of AlanVoice.initButton in the app.dart file.
Now build and run the application; you will see the Alan button in the lower left, which means the integration is complete.
In the next step, we will add the Visual Voice experience for this application.
4. Create the Visual Voice experience in Alan
In Alan Studio, we will
- Create a complete Visual Voice experience for our Shrine application
- Add scripts that cover the full logic of our Shrine app
- Add commands for asking about what’s on the menu, adding items to the order, and checkout
In Alan Studio, switch to Development mode and enter the script content in the code area. Then create a new version ("v2") for the project, as we did in the previous step, and switch it to the Production environment.
Within Alan Studio, we can test commands like “What can I do here?”, “Add stella sunglasses”, and “Checkout”.
To support the visuals for these voice commands, we need to add some handlers back in the Shrine application.
In the Shrine project, we need to change app.dart. Add the following method to the _ShrineAppState class. It will handle commands sent from the scripts, such as opening a specific page or adding items to the cart.
void _handleCommand(Map command) {
  debugPrint("New command: ${command}");
  switch (command["command"]) {
    case "clearOrder":
      _handleClearOrder();
      break;
    case "addToCart":
      _addToCart(command["item"], command["quantity"]);
      break;
    case "highlightProducts":
      _highlightProduct(command["value"]);
      break;
    case "navigation":
      _navigateTo(command["route"]);
      break;
    case "highlight":
      _highlightWidget(command["value"]);
      break;
    case "show_products":
      _filterProducts(command["items"]);
      break;
    case "finishOrder":
      _handleFinishOrder();
      break;
    default:
      debugPrint("Unknown command: ${command}");
  }
}
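The helper methods the handler dispatches to (_navigateTo, _addToCart, and so on) are application-specific. As an illustration only, a minimal _navigateTo could use Flutter's named routes; the route names here are assumptions, not the Shrine app's actual registrations:

```dart
// Hypothetical sketch: navigate when the Alan script sends a
// {"command": "navigation", "route": "..."} payload.
void _navigateTo(String route) {
  // Assumes the app registers named routes such as "/accessories"
  // in its MaterialApp routes table.
  Navigator.of(context).pushNamed(route);
}
```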
In the initState method, we add a listener for the “command” event. All commands sent from the Shrine scripts will be passed to this method.
AlanVoice.callbacks.add((command) => _handleCommand(command.data["data"]));
In main.dart, we subscribe to changes in the application's state and call a method that sends the new visual state to the Alan Studio script. This keeps the state of the application and the script synchronized.
model.addListener(() => model.setVisuals());
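One way the setVisuals helper on the model could be implemented is by serializing the relevant state and passing it to AlanVoice.setVisualState, which takes a JSON string. This is a sketch: the model fields and JSON shape below are assumptions, not the Shrine app's actual state.

```dart
import 'dart:convert';
import 'package:alan_voice/alan_voice.dart';

// Hypothetical helper on the app model: serialize the parts of the
// state the voice script cares about and push them to Alan Studio.
void setVisuals() {
  var visual = {
    "screen": currentScreen,      // assumed model field
    "cartItems": cart.itemCount,  // assumed model field
  };
  AlanVoice.setVisualState(jsonEncode(visual));
}
```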
Now when the user says something like "Show accessories", they'll be taken to the accessories screen, with accessories highlighted as they are read aloud.
Let’s save our changes and build the application.
We have created our application with a full Visual Voice experience using Alan. Let’s test the following commands:
- “What home items do you have?”
- “Add two stella sunglasses and vagabond sack”
- “Checkout”
This concludes this tutorial on how to integrate voice into any Flutter application with Alan. Here’s a full video of the integration.