Alan React Native Framework

Available on: Android, iOS

Integrating with Alan

To add Alan voice to a React Native app, you need to do the following:

  1. Set up the environment

  2. Install the Alan React Native plugin

  3. Add the Alan button to the app

  4. Run the app on iOS or Android, or use debug mode

Step 1. Set up the environment

Before you start integrating a React Native app with Alan, make sure all necessary tools are installed on your computer.

  • Install react-native-cli globally:

    Terminal
    npm install -g react-native-cli
    
  • (For iOS) Install CocoaPods to manage dependencies for Xcode projects:

    Terminal
    sudo gem install cocoapods
    
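If you do not have a React Native app to integrate yet, you can create one now. The project name below is only an example and matches the myapp folder used in the next step; with react-native-cli installed globally, run:

Terminal
react-native init myapp
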

Step 2. Install the Alan React Native plugin

To add the Alan React Native plugin to your app:

  1. Navigate to the app folder:

    Terminal
    cd myapp
    
  2. Install the plugin:

    Terminal
    npm i @alan-ai/alan-sdk-react-native --save
    
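Optionally, you can verify that the plugin has been added to your project dependencies:

Terminal
npm ls @alan-ai/alan-sdk-react-native
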

Step 3. Add the Alan button to the app

Once the plugin is installed, you need to add the Alan button to your React Native app.

  1. Add the Alan button and Alan text panel to the app. In the app folder, open App.js and add the following import statement at the top of the file:

    App.js
    import { AlanView } from '@alan-ai/alan-sdk-react-native';
    
  2. In the App.js file, add a view with the Alan button and Alan text panel:

    App.js
    return (
      <AlanView
        projectid={
          ''
        }
      />
    );
    
  3. In projectid, specify the Alan SDK key for your Alan Studio project. To get the key, in Alan Studio, at the top of the code editor, click Integrations and copy the value from the Alan SDK Key field.

    App.js
    return (
      <AlanView
        projectid={
          'cc2b0aa23e5f90d2974f1bf6b6929c1b2e956eca572e1d8b807a3e2338fdd0dc/prod'
        }
      />
    );
    
  4. Add a listener for events coming from the Alan voice script. To start listening for events, in App.js, add the following import statement:

    App.js
    import { NativeEventEmitter, NativeModules } from 'react-native';
    
  5. In App.js, create a new NativeEventEmitter object:

    App.js
    const { AlanManager, AlanEventEmitter } = NativeModules;
    const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);
    
  6. Subscribe to the voice script events:

    App.js
    const subscription = alanEventEmitter.addListener('command', (data) => {
      console.log(`got command event ${JSON.stringify(data)}`);
    });
    
  7. Do not forget to remove the listener when the component unmounts, for example in componentWillUnmount() of a class component (a function-component version is shown in the complete example after this list):

    App.js
    componentWillUnmount() {
      subscription.remove();
    }
    
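For reference, below is a minimal App.js sketch that puts the steps above together. It assumes a function component, so the listener is removed in a useEffect() cleanup instead of componentWillUnmount(); the project ID is left as a placeholder. The command event carries whatever object your voice script sends, for example with p.play({command: 'go_forward'}) in Alan Studio.

App.js
import React, { useEffect } from 'react';
import { View, NativeEventEmitter, NativeModules } from 'react-native';
import { AlanView } from '@alan-ai/alan-sdk-react-native';

// AlanManager exposes client API methods; it is not used in this minimal sketch
const { AlanManager, AlanEventEmitter } = NativeModules;
const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);

export default function App() {
  useEffect(() => {
    // Subscribe to commands coming from the Alan voice script
    const subscription = alanEventEmitter.addListener('command', (data) => {
      console.log(`got command event ${JSON.stringify(data)}`);
    });
    // Remove the listener when the component unmounts
    return () => subscription.remove();
  }, []);

  return (
    <View style={{ flex: 1 }}>
      {/* Specify the Alan SDK key for your Alan Studio project */}
      <AlanView projectid={''} />
    </View>
  );
}
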

Note

Regularly update the Alan AI package your project depends on. To check if a newer version is available, run npm outdated. To update the package, run npm update @alan-ai/alan-sdk-react-native. For more details, see the npm documentation.
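For example, the two commands from the note above look like this in the Terminal:

Terminal
npm outdated
npm update @alan-ai/alan-sdk-react-native
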

Running the app

After you have integrated your app with Alan, you can build and deploy your project as a native iOS or Android app.

Running on iOS

To run your app integrated with Alan on the iOS platform, you need to update the app settings for iOS.

  1. In the Terminal, navigate to the ios folder in your app:

    Terminal
    cd ios
    
  2. In the ios folder, open the Podfile and change the minimum iOS version to 11:

    Podfile
    platform :ios, '11.0'
    
  3. In the Terminal, run the following command to install dependencies for the project:

    Terminal
    pod install
    
  4. In the ios folder, open the generated Xcode workspace file: <appname>.xcworkspace. You should use this file to open your Xcode project from now on.

  5. In iOS, the user must explicitly grant permission for an app to access the user’s data and resources. An app with the Alan button requires access to:

    • User’s device microphone for voice interactions

    • User’s device camera for testing Alan projects on mobile

    To comply with this requirement, you must add the NSMicrophoneUsageDescription and NSCameraUsageDescription keys to the Info.plist file of your app and provide messages explaining why your app requires access to the microphone and camera. The messages are displayed only when Alan needs to activate the microphone or camera. An example Info.plist fragment is shown at the end of this section.

    To add the keys:

    1. In Xcode, go to the Info tab.

    2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

    3. From the list, select Privacy - Microphone Usage Description.

    4. In the Value field to the right, provide a description for the added key. This description is displayed to the user when the app requests access to the microphone for the first time.

    5. Repeat the steps above to add the Privacy - Camera Usage Description key.

  6. In the Terminal, navigate back to the app folder (one level up):

    Terminal
    cd ..
    
  7. To run your app on the iOS platform, use one of the following commands:

    Terminal
    react-native run-ios
    yarn ios
    

You can also open the <appname>.xcworkspace file in Xcode and test the app on a simulator or device.
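If you prefer to edit the Info.plist file directly instead of using the Xcode UI, the two keys from step 5 look like the fragment below; the usage strings are examples only and should describe your own app:

Info.plist
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone for voice interactions with the Alan assistant.</string>
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to test the Alan project on a mobile device.</string>
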

Running on Android

To run your React Native app integrated with Alan on the Android platform:

  1. Make sure the correct minimum SDK version is set for your app: minSdkVersion 21. To check the version, open the android/app/build.gradle file, under defaultConfig, locate minSdkVersion and update its value if necessary. A reference snippet is shown at the end of this section.

  2. To run your app on the Android platform, use one of the following commands:

    Terminal
    react-native run-android
    yarn android
    

You can also launch the app from Android Studio: open <app>/android in the IDE and run the app in the usual way.
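For reference, the minSdkVersion setting mentioned in step 1 sits under defaultConfig in android/app/build.gradle; a trimmed-down fragment (other values omitted) looks like this:

build.gradle (module level)
android {
  defaultConfig {
    minSdkVersion 21
  }
}
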

Running in debug mode

You can run your app on iOS or Android in debug mode. Debug mode lets you hot reload the app on the device or simulator as soon as you change anything in the app.

To run the app in debug mode, make sure the Metro bundler is started. To start the Metro bundler, in the Terminal, run the following command:

Terminal
react-native start

A Terminal window with the Metro bundler opens. You can then run your app as usual with the following commands:

  • For iOS:

    Terminal
    react-native run-ios
    yarn ios
    
  • For Android:

    Terminal
    react-native run-android
    yarn android
    

Note

After integration, you may get a warning related to the onButton state. You can safely ignore it: the warning disappears as soon as you add the onCommand handler to your app.

Specifying the Alan button parameters

You can specify the following parameters for the Alan button added to your app:

  • projectid (string): The Alan SDK key for a project in Alan Studio.

  • authData (JSON object): The authentication or configuration data to be sent to the voice script. For details, see authData.
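
As an illustration, both parameters can be passed to the AlanView component as shown below; the SDK key and the authData contents are placeholders:

App.js
<AlanView
  projectid={'YOUR_ALAN_SDK_KEY_HERE'}
  authData={{ token: 'user-authentication-token' }}
/>
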

Troubleshooting

  • If you encounter the following error: Execution failed for task ':app:mergeDebugNativeLibs', open the build.gradle file at the Module level and add packaging options:

    build.gradle (module level)
    android {
      packagingOptions {
        pickFirst 'lib/arm64-v8a/libc++_shared.so'
        pickFirst 'lib/x86/libc++_shared.so'
        pickFirst 'lib/x86_64/libc++_shared.so'
        pickFirst 'lib/armeabi-v7a/libc++_shared.so'
      }
    }
    
  • The minimum possible Android SDK version required by the Alan SDK is 21. If the version in your project is lower, you may encounter the following error: AndroidManifest.xml Error: uses-sdk:minSdkVersion 16 cannot be smaller than version 21 declared in library [:alan_voice]. Open the android/app/build.gradle file, under defaultConfig, locate minSdkVersion and change the version to 21.

What’s next?

Upon integration, your app gets an in-app voice assistant that can be activated with the Alan button displayed on top of the app’s UI.

To build a full-fledged multimodal UX, you can use Alan’s SDK toolkit:

  • Client API methods: Enable communication between the client app and Alan and perform actions in the app.

  • Alan handlers: Handle commands, understand the button state and capture events in the client app.

  • Example app: Find and explore an example of a voice-enabled app on the Alan AI GitHub repository.