Alan React Native Framework

Available on: Android iOS

Integrating with Alan

To add Alan voice to a React Native app, you need to do the following:

  1. Set up the environment

  2. Install the Alan React Native plugin

  3. Add the Alan button to the app

  4. Run the app on iOS or Android, or use the debug mode

Step 1. Set up the environment

Before you start integrating a React Native app with Alan, make sure all necessary tools are installed on your computer.

  • Install react-native-cli globally:

    Terminal
    npm install -g react-native-cli
    
  • (For iOS) Install CocoaPods to manage dependencies for Xcode projects:

    Terminal
    sudo gem install cocoapods
    

Step 2. Install the Alan React Native plugin

To add the Alan React Native plugin to your app:

  1. Navigate to the app folder:

    Terminal
    cd myapp
    
  2. Install the plugin:

    Terminal
    npm i @alan-ai/alan-sdk-react-native --save
    

Step 3. Add the Alan button to the app

Once the plugin is installed, you need to add the Alan button to your React Native app.

  1. Add the Alan button and Alan text panel to the app. In the app folder, open App.js and add the following import statement at the top of the file:

    App.js
    import { AlanView } from '@alan-ai/alan-sdk-react-native';
    
  2. In the App.js file, add a view with the Alan button and Alan text panel:

    App.js
    return (
      <AlanView
        projectid={
          ''
        }
      />
    );
    
  3. In the projectid parameter, specify the Alan SDK key for your Alan Studio project. To get the key, in Alan Studio, at the top of the code editor, click Integrations and copy the value from the Alan SDK Key field.

    App.js
    return (
      <AlanView
        projectid={
          'cc2b0aa23e5f90d2974f1bf6b6929c1b2e956eca572e1d8b807a3e2338fdd0dc/prod'
        }
      />
    );
    
  4. Next, add a listener for events coming from the Alan voice script. To start listening for events, in App.js, add the following import statement:

    App.js
    import { NativeEventEmitter, NativeModules } from 'react-native';
    
  5. In App.js, create a new NativeEventEmitter object:

    App.js
    const { AlanManager, AlanEventEmitter } = NativeModules;
    const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);
    
  6. Subscribe to the voice script events:

    App.js
    const subscription = alanEventEmitter.addListener('command', (data) => {
      console.log(`got command event ${JSON.stringify(data)}`);
    });
    
  7. Do not forget to remove the listener when the component unmounts, for example, in componentWillUnmount() of a class component:

    App.js
    componentWillUnmount() {
      subscription.remove();
    }
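
The 'command' events handled above carry whatever JSON object your voice script sends. Below is a minimal sketch of routing such payloads to app actions, assuming your script sends objects shaped like {"command": "..."}; that shape and the handler names are your own conventions, not fixed by the Alan SDK:

```javascript
// Sketch: route 'command' event payloads to app actions.
// The { command: 'name', ... } shape is defined by your own
// voice script, not by the Alan SDK.
function makeCommandRouter(handlers) {
  return (data) => {
    const handler = handlers[data && data.command];
    if (!handler) {
      console.log(`unhandled command: ${JSON.stringify(data)}`);
      return false;
    }
    handler(data);
    return true;
  };
}

// Usage: pass the router as the listener callback, e.g.
// const subscription = alanEventEmitter.addListener('command', makeCommandRouter({
//   openSomePage: () => { /* navigate to the page */ },
// }));
```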
    

Note

Regularly update the Alan AI package your project depends on. To check if a newer version is available, run npm outdated. To update the package, run npm update @alan-ai/alan-sdk-react-native. For more details, see npm documentation.

Running the app

After you have integrated your app with Alan, you can build and deploy your project as a native iOS or Android app.

Running on iOS

To run your app integrated with Alan on the iOS platform, you need to update the app settings for iOS.

  1. In the Terminal, navigate to the ios folder in your app:

    Terminal
    cd ios
    
  2. In the ios folder, open the Podfile and change the minimum iOS version to 11:

    Podfile
    platform :ios, '11.0'
    
  3. In the Terminal, run the following command to install dependencies for the project:

    Terminal
    pod install
    
  4. In the ios folder, open the generated Xcode workspace file: <appname>.xcworkspace. Use this file to open your Xcode project from now on.

  5. In iOS, the user must explicitly grant permission for an app to access the user’s data and resources. An app with the Alan button requires access to:

    • User’s device microphone for voice interactions

    • User’s device camera for testing Alan projects on mobile

    To comply with this requirement, you must add the NSMicrophoneUsageDescription and NSCameraUsageDescription keys to the Info.plist file of your app and provide messages explaining why your app requires access to the microphone and camera. The message is displayed only when Alan needs to activate the microphone or camera.

    To add the keys:

    1. In Xcode, go to the Info tab.

    2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

    3. From the list, select Privacy - Microphone Usage Description.

    4. In the Value field to the right, provide a description for the added key. This description is displayed to the user when the app requests access.

    5. Repeat the steps above to add the Privacy - Camera Usage Description key.

  6. In the Terminal, navigate back to the app root folder, one level up:

    Terminal
    cd ..
    
  7. To run your app on the iOS platform, use one of the following commands:

    Terminal
    react-native run-ios
    yarn ios
    

You can also open the <appname>.xcworkspace file in Xcode and run the app on a simulator or device.

Running on Android

To run your React Native app integrated with Alan on the Android platform:

  1. Make sure the correct minimum SDK version is set for your app: minSdkVersion 21. To check the version, open the android/app/build.gradle file and, under defaultConfig, locate minSdkVersion; update its value if necessary.

  2. To run your app on the Android platform, use one of the following commands:

    Terminal
    react-native run-android
    yarn android
    

You can also launch the app from Android Studio: open <app>/android in the IDE and run the app in the usual way.

Running in the Debug mode

You can run your app on iOS or Android in the debug mode. The debug mode lets you hot reload the app on the device or simulator as soon as you change anything in the app.

To run the app in the debug mode, make sure the Metro bundler is running. To start the Metro bundler, run the following command in the Terminal:

Terminal
react-native start

A Terminal window with the Metro bundler opens. You can then run your app as usual with the following commands:

  • For iOS:

    Terminal
    react-native run-ios
    yarn ios
    
  • For Android:

    Terminal
    react-native run-android
    yarn android
    

Note

After integration, you may get a warning related to the onButtonState handler. You can safely ignore it: the warning goes away once you add the onCommand handler to your app.

Specifying the Alan button parameters

You can specify the following parameters for the Alan button added to your app:

  • projectid (string): The Alan SDK key for a project in Alan Studio.

  • authData (JSON object): The authentication or configuration data to be sent to the voice script. For details, see authData.
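
For example, authData can carry a user identifier and token so the voice script can tailor dialogs to the signed-in user. A minimal sketch of building that object; the uid and token field names are illustrative assumptions, since your voice script reads whatever fields you choose to send:

```javascript
// Sketch: build the authData object passed to the Alan button.
// Field names (uid, token) are illustrative, not part of the Alan API —
// your voice script defines the expected shape.
function buildAuthData(user) {
  return {
    uid: user.id,
    token: user.sessionToken,
  };
}

// Usage (in JSX):
// <AlanView projectid={'...'} authData={buildAuthData(currentUser)} />
```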

Using client API methods

You can use the following client API methods in your app:

setVisualState()

Use the setVisualState() method to inform the voice assistant about the app’s visual context. For details, see setVisualState().

Client app
setVisualState() {
  // Pass any parameters as a JSON object
  AlanManager.setVisualState({"data":"your data"});
}

callProjectApi()

Use the callProjectApi() method to send data from the client app to the voice script and trigger activities without voice commands. For details, see callProjectApi().

Voice script
projectAPI.setClientData = function(p, param, callback) {
  console.log(param);
  // Call the callback to return the result to the client app
  callback();
};
Client app
callProjectApi() {
  // Pass any parameters as a JSON object
  AlanManager.callProjectApi(
    'script::setClientData', {"data":"your data"},
    (error, result) => {
      if (error) {
        console.error(error);
      } else {
        console.log(result);
      }
    },
  );
}

playText()

Use the playText() method to play specific text in the client app. For details, see playText().

Client app
// Play a text message
playText() {
  // Pass the text as a string parameter
  AlanManager.playText("Hi");
}

playCommand()

Use the playCommand() method to execute a specific command in the client app. For details, see playCommand().

Client app
// Execute a command locally
playCommand() {
  // Pass any parameters as a JSON object
  AlanManager.playCommand({"action":"openSomePage"});
}

activate()

Use the activate() method to activate the Alan button programmatically. For details, see activate().

Client app
// Activate the Alan button programmatically
activate() {
  AlanManager.activate();
}

deactivate()

Use the deactivate() method to deactivate the Alan button programmatically. For details, see deactivate().

Client app
// Deactivate the Alan button programmatically
deactivate() {
  AlanManager.deactivate();
}

isActive()

Use the isActive() method to check the Alan button state: active or not. For details, see isActive().

Client app
AlanManager.isActive((error, result) => {
  if (error) {
    console.error(error);
  } else {
    console.log(result);
  }
})

getWakewordEnabled()

Use the getWakewordEnabled() method to check the state of the wake word for the Alan button. For details, see getWakewordEnabled().

Client app
AlanManager.getWakewordEnabled((error, result) => {
  if (error) {
    console.error(error);
  } else {
    console.log(`getWakewordEnabled ${result}`);
  }
});
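
Callback-style methods such as isActive() and getWakewordEnabled() can be wrapped in Promises for async/await use. A minimal sketch; promisify here is a local convenience helper, not part of the Alan SDK:

```javascript
// Sketch: wrap Alan's (error, result) callback methods in a Promise.
// This helper is local to your app, not part of the Alan SDK.
function promisify(method) {
  return (...args) =>
    new Promise((resolve, reject) => {
      method(...args, (error, result) =>
        error ? reject(error) : resolve(result)
      );
    });
}

// Usage:
// const isActive = promisify(AlanManager.isActive);
// const active = await isActive();
```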

setWakewordEnabled()

Use the setWakewordEnabled() method to enable or disable the wake word for the Alan button. For details, see setWakewordEnabled().

Client app
AlanManager.setWakewordEnabled(true);

Using handlers

You can use the following Alan handlers in your app:

onCommand handler

Use the onCommand handler to handle commands sent from the voice script. For details, see onCommand handler.

Client app
import { NativeEventEmitter, NativeModules } from 'react-native';
const { AlanManager, AlanEventEmitter } = NativeModules;
const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);

componentDidMount() {
  // Handle commands from Alan Studio
  alanEventEmitter.addListener('onCommand', (data) => {
    console.log(`onCommand: ${JSON.stringify(data)}`);
  });
}

componentWillUnmount() {
  alanEventEmitter.removeAllListeners('onCommand');
}

onButtonState handler

Use the onButtonState handler to capture and handle the Alan button state changes. For details, see onButtonState handler.

Client app
import { NativeEventEmitter, NativeModules } from 'react-native';
const { AlanManager, AlanEventEmitter } = NativeModules;
const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);

componentDidMount() {
  // Handle Alan button state changes
  alanEventEmitter.addListener('onButtonState', (state) => {
    console.log(`onButtonState: ${JSON.stringify(state)}`);
  });
}

componentWillUnmount() {
  alanEventEmitter.removeAllListeners('onButtonState');
}

onEvent handler

Use the onEvent handler to capture and handle events emitted by Alan: get user’s utterances, voice assistant responses and so on. For details, see onEvent handler.

Client app
import { NativeEventEmitter, NativeModules } from 'react-native';
const { AlanManager, AlanEventEmitter } = NativeModules;
const alanEventEmitter = new NativeEventEmitter(AlanEventEmitter);

componentDidMount() {
  // Handle events emitted by Alan
  alanEventEmitter.addListener('onEvent', (payload) => {
    console.log(`onEvent: ${JSON.stringify(payload)}`);
  });
}

componentWillUnmount() {
  alanEventEmitter.removeAllListeners('onEvent');
}
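
Because onEvent delivers many event types through a single stream, it is common to filter payloads by an event name before reacting. Below is a sketch under the assumption that each payload carries a name field; verify the exact payload shape emitted by your Alan SDK version before relying on it:

```javascript
// Sketch: react only to onEvent payloads with a given name.
// The 'name' field is an assumption about the payload shape —
// check the onEvent reference for the fields your SDK version emits.
function makeEventFilter(eventName, onMatch) {
  return (payload) => {
    if (payload && payload.name === eventName) {
      onMatch(payload);
      return true;
    }
    return false;
  };
}

// Usage:
// alanEventEmitter.addListener('onEvent', makeEventFilter('recognized', (p) => {
//   console.log(`user said: ${JSON.stringify(p)}`);
// }));
```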

Troubleshooting

  • If you encounter the error Execution failed for task ':app:mergeDebugNativeLibs' for files like lib/arm64-v8a/libc++_shared.so, lib/x86/libc++_shared.so, lib/x86_64/libc++_shared.so and lib/armeabi-v7a/libc++_shared.so, open the module-level build.gradle file and add packaging options:

    build.gradle (module level)
    android {
      packagingOptions {
        pickFirst 'lib/arm64-v8a/libc++_shared.so'
        pickFirst 'lib/x86/libc++_shared.so'
        pickFirst 'lib/x86_64/libc++_shared.so'
        pickFirst 'lib/armeabi-v7a/libc++_shared.so'
      }
    }
    
  • The minimum Android SDK version required by the Alan SDK is 21. If the version in your project is lower, you may encounter the following error: AndroidManifest.xml Error: uses-sdk:minSdkVersion 16 cannot be smaller than version 21 declared in library [:alan_voice]. To fix it, open the android/app/build.gradle file and, under defaultConfig, change minSdkVersion to 21.

What’s next?


Example apps

Find and explore examples of voice-enabled apps on the Alan AI GitHub repository.
