Passing the app state to the voice script

When the user interacts with an Android app, it may be necessary to send some data from the app to the voice script. For example, you may need to provide the voice script with information about the current app state.

In Alan, you can send data from the app to the voice script in the following ways:

  • With visual state: use the setVisualState() method to pass a JSON object describing the current app state to the script
  • With the project API: use the callProjectApi() method to send data to a method defined in the voice script

In this tutorial, we will work with an Android app with two activities. We will send information about the current foreground activity to the voice script with the help of visualState. On the script side, we will use the passed state to do the following:

  • Create a voice command that will let Alan reply differently depending on which activity is currently open in the app
  • Create a command that can be matched only if the necessary activity is currently in the foreground

What you will learn

  • How to pass the app state from an Android app to the voice script with visualState
  • How to access the data passed with visualState in the voice script
  • How to filter intents by the app state

What you will need

To go through this tutorial, make sure the following prerequisites are met:

  • In this tutorial, we will continue using the starter Android app created in the previous tutorials. You can also use your own app with several activities. Make sure you have completed all steps from the previous tutorials: Adding voice to an Android app and Navigating between activities in an Android app.
  • You have set up the Android environment and it is functioning properly. For details, see Android developers documentation.
  • The device on which you are planning to test the Android app is connected to the Internet. The Internet connection is required to let the Android app communicate with the voice script running in the Alan Cloud.

Step 1: Send the app state to the voice script

To send data from the app to the voice script, you can use Alan's visualState. This can be particularly helpful if the dialog flow in your app depends on the app state: in this case, you need to know the app state on the script side to adjust the dialog logic. For example, you may want to use the app state to filter out some voice commands, give responses applicable to the current app state and so on.

visualState allows you to send a custom JSON object from the app to the script. You can access the passed data in the script using the p.visual runtime variable.

To send visualState to the voice script, let's add the following code to the onCreate() method of MainActivity:
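
Here is a minimal Kotlin sketch of what this can look like. It assumes the Alan button is declared in the activity layout with the id alan_button and is initialized as in the previous tutorials; the actual ids, layout names and setup code in your app may differ.

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.alan.alansdk.button.AlanButton

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        val alanButton = findViewById<AlanButton>(R.id.alan_button)
        // Alan button configuration from the previous tutorials (project key, etc.) goes here

        // Pass the current app state to the voice script as a JSON string;
        // the script reads it as p.visual.activity
        alanButton.setVisualState("{\"activity\":\"main\"}")
    }
}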


Now, when the activity is created, we call the setVisualState method of the Alan button and pass the JSON object containing information about the current activity.

Let's repeat the same for SecondActivity. Note that for the second activity, we will set the activity value in visualState to second. Our second activity should look like this:
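
Again as a sketch, assuming SecondActivity has its own Alan button with the id alan_button in a layout file named activity_second:

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.alan.alansdk.button.AlanButton

class SecondActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_second)

        val alanButton = findViewById<AlanButton>(R.id.alan_button)
        // Tell the voice script that the second screen is now in the foreground
        alanButton.setVisualState("{\"activity\":\"second\"}")
    }
}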

Step 2: Add a voice command with different answers for different screens

Imagine we want to let the user ask: What screen is this? in the app, and Alan must reply differently depending on the screen that is open. Let's go back to the Alan Studio and add a new intent:

intent(`What screen is this?`, p => {
    let activity = p.visual.activity;
    switch (activity) {
        case "main":
            p.play(`This is the main screen`);
            break;
        case "second":
            p.play(`This is the second screen`);
            break;
        default:
            p.play(`This is an example Android app by Alan`);
    }
});

Here we use the p.visual.activity variable to access data passed with visualState. Depending on the variable value, Alan plays back different responses.

You can test it: run the app and ask: What screen is this? Alan will reply: This is the main screen. Then say Open the second screen and ask: What screen is this? Alan will answer: This is the second screen.

Step 3: Create a screen-specific command

If an app has several screens, it may be necessary to create voice commands that will work for specific screens only. In Alan, you can create screen-specific commands with the help of filters added to intents. The filter in an intent defines the conditions under which the intent can be matched. In our case, we can use information about the open screen as the filter.

Let's modify the intent for getting back to the home screen. This command must work only if the second screen is open; saying Go back on the home screen should not trigger it. In the Alan Studio, update this intent to the following:

const vScreen = visual({"activity": "second"});

intent(vScreen, `Go back`, p => {
    p.play(`Going back`);
    p.play({command: 'goBack'});
});

The filter comes as the first parameter in the intent. Now this intent will be matched only if the second screen is open and visualState is {"activity": "second"}.

You can test it: run the app, tap the Alan button and say: Open the second screen. Then say: Go back. The home screen will be displayed. When on the home screen, try saying this command again. Alan will not be able to match this intent.