Navigating between screens with voice (React Native)

If your React Native app has several screens, you can add voice commands to navigate between them. For example, you can let the user open the details screen and then go back to the main screen with voice.

In this tutorial, we will create an app with two screens and add voice commands to navigate forward and back.

What you will learn

  • How to navigate between screens of a React Native app with voice

  • How to send commands to a React Native app

  • How to handle commands on the React Native app side

What you will need

To go through this tutorial, make sure the following prerequisites are met:

  • You have completed all steps from the following tutorial: Building a voice assistant for a React Native app.

  • You have set up the React Native environment and it is functioning properly. For details, see the React Native documentation.

Step 1. Add two screens to the app

First, let’s update our app to add two screens to it.

  1. In the Terminal, navigate to the app folder and install the required navigation components:

    Terminal
     npm install @react-navigation/native @react-navigation/native-stack
     npm install react-native-screens react-native-safe-area-context
    
  2. For iOS, install pods to complete the installation:

    Terminal
     cd ios
     pod install
     cd ..
    
  3. Update the App.js file to the following:

    App.js
     import * as React from 'react';
     import { Button, View, Text } from 'react-native';
     import { NavigationContainer } from '@react-navigation/native';
     import { createNativeStackNavigator } from '@react-navigation/native-stack';
    
     import { navigationRef } from './components/RootNavigation';
     import * as RootNavigation from './components/RootNavigation';
    
     function HomeScreen({ navigation: { navigate } }) {
       return (
         <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
           <Text>This is the home screen of the app</Text>
           <Button title="Go to Profile" onPress={() => RootNavigation.navigate('Profile')} />
         </View>
       );
     }
    
     function ProfileScreen({ navigation: { navigate } }) {
       return (
         <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
           <Text>Profile details</Text>
           <Button title="Go back" onPress={() => RootNavigation.navigate('Home')} />
         </View>
       );
     }
    
     const Stack = createNativeStackNavigator();
    
     const App = () => {
       return (
         <NavigationContainer ref={navigationRef}>
           <Stack.Navigator initialRouteName="Home">
             <Stack.Screen name="Home" component={HomeScreen} />
             <Stack.Screen name="Profile" component={ProfileScreen} />
           </Stack.Navigator>
         </NavigationContainer>
       );
     }
    
     export default App;
    
  4. In your app, create the components folder and add the RootNavigation.js file to it. This module keeps a reference to the navigation object so that screens can be switched outside of React components, for example, from the Alan command handler we will add later:

    RootNavigation.js
     import { createNavigationContainerRef } from '@react-navigation/native';

     // Navigation ref to be attached to NavigationContainer in App.js
     export const navigationRef = createNavigationContainerRef();

     // Navigate to the given screen once the navigation tree is ready
     export function navigate(name, params) {
       if (navigationRef.isReady()) {
         navigationRef.navigate(name, params);
       }
     }
    
  5. In the App.js file, add the Alan button as described in the Building a voice assistant for a React Native app tutorial (see the sketch after this list):

    • Import AlanView

    • Add the Alan button to NavigationContainer
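
For reference, the snippet below is a minimal sketch of what these two additions may look like. It assumes the Alan AI React Native SDK is already installed in the app and uses YOUR_PROJECT_KEY_FROM_ALAN_STUDIO as a placeholder for your Alan Studio project key:

App.js
 // Import the Alan button component from the Alan AI React Native SDK
 import { AlanView } from '@alan-ai/alan-sdk-react-native';

 const App = () => {
   return (
     <NavigationContainer ref={navigationRef}>
       <Stack.Navigator initialRouteName="Home">
         <Stack.Screen name="Home" component={HomeScreen} />
         <Stack.Screen name="Profile" component={ProfileScreen} />
       </Stack.Navigator>
       {/* The Alan button; replace the placeholder with your project key from Alan Studio */}
       <AlanView projectid={'YOUR_PROJECT_KEY_FROM_ALAN_STUDIO'}/>
     </NavigationContainer>
   );
 }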

You can test it: run the app and try navigating between screens using the buttons.

Step 2. Add navigation commands to the script

Let’s add navigation commands to the voice script. In the code editor in Alan Studio, add the following:

Voice script
 intent('Open profile details', p => {
     p.play('Opening the profile page');
     p.play({command:'goForward'});
 });

 intent('Go back', p => {
     p.play('Going back');
     p.play({command:'goBack'});
 });

Here, in each command, we have two p.play() functions:

  • One to play a response to the user

  • The other one to send the command to the client app. In the second play() function, we have specified a JSON object with the name of the command to be sent.

You can try the commands in the Debugging Chat. Notice that together with the answer, Alan now sends the command we have defined.
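
Note that the command object is not limited to a name. If you later need to pass extra details to the app, you can add more fields to the same JSON object and read them from the data argument of the onCommand handler. The snippet below is a hypothetical variation of the first command (the screen field is not used elsewhere in this tutorial):

Voice script
 intent('Open profile details', p => {
     p.play('Opening the profile page');
     // Hypothetical variation: send extra data together with the command name
     p.play({command:'goForward', screen:'Profile'});
 });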

Step 3. Handle commands in the app

We need to handle these commands on the app side. To do this, we will add Alan's onCommand handler to the app.

  1. In the App.js file, add the import statement for the React Native modules used to listen for Alan events:

    App.js
     ...
     import { NativeEventEmitter, NativeModules } from 'react-native';
     ...
    
  2. Create a new NativeEventEmitter object:

    App.js
     const App = () => {
       ...
       const { AlanEventEmitter } = NativeModules;
       const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);
       ...
     }
    
  3. Add the import statement for the useEffect hook:

    App.js
     ...
     import { useEffect } from 'react';
     ...
    
  4. Then add the useEffect hook to subscribe to the voice script events:

    App.js
     const App = () => {
       ...
       useEffect(() => {
         // Subscribe to commands sent from the voice script
         const subscription = alanEventEmitter.addListener('onCommand', (data) => {
           if (data.command === 'goForward') {
             RootNavigation.navigate('Profile');
           } else if (data.command === 'goBack') {
             RootNavigation.navigate('Home');
           }
         });
         // Remove the listener when the component unmounts
         return () => subscription.remove();
       }, []);
       ...
     }
    

Now, when the app receives a command, it opens the appropriate screen.
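
For reference, here is a minimal sketch of how the pieces added in this step fit together in App.js. The parts omitted with ... are the screens, the navigator and the Alan button set up earlier:

App.js
 import * as React from 'react';
 import { useEffect } from 'react';
 import { NativeEventEmitter, NativeModules } from 'react-native';

 import * as RootNavigation from './components/RootNavigation';

 const App = () => {
   // Event emitter for events coming from the Alan voice script
   const { AlanEventEmitter } = NativeModules;
   const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);

   useEffect(() => {
     // Subscribe to commands sent with p.play({command: ...}) in the voice script
     const subscription = alanEventEmitter.addListener('onCommand', (data) => {
       if (data.command === 'goForward') {
         RootNavigation.navigate('Profile');
       } else if (data.command === 'goBack') {
         RootNavigation.navigate('Home');
       }
     });
     // Remove the listener when the component unmounts
     return () => subscription.remove();
   }, []);

   return (
     ...
   );
 }

 export default App;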

You can try it: in the app, tap the Alan button and say Open profile details and then Go back.
