Alan AI Flutter Framework

Available on: Android, iOS



The Alan AI Flutter plugin lets you integrate a conversational experience into your Flutter app.

Note

The alan_voice package version 3.0.0 supports null safety. If your app does not use null safety yet, either migrate it to Dart 2.12 or higher or use a previous version of the alan_voice package.

Integrating with Alan AI

To integrate a Flutter app with Alan AI:

  1. In the pubspec.yaml file of your Flutter project, add the Alan AI dependency:

    pubspec.yaml
    dependencies:
      flutter:
        sdk: flutter
      alan_voice: 2.4.0
    
  2. Open the main.dart file and at the top of it, add the alan_voice package dependency:

    main.dart
    import 'package:alan_voice/alan_voice.dart';
    
  3. In the _MyHomePageState class, add the method for initializing the Alan AI button:

    main.dart
    class _MyHomePageState extends State<MyHomePage> {
      _MyHomePageState() {
        /// Initializing the Alan AI button; the project SDK key is added in the next step
        AlanVoice.addButton(
          "",
          buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
      }
    
  4. In Alan AI Studio, go to Integrations, copy the value from the Alan SDK Key field and paste it between the quotes in the addButton() call:

    main.dart
    class _MyHomePageState extends State<MyHomePage> {
      _MyHomePageState() {
        /// Initializing Alan AI with sample project id
        AlanVoice.addButton(
          "314203787ccd9370974f1bf6b6929c1b2e956eca572e1d8b807a3e2338fdd0dc/prod",
          buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
      }
    

That’s it. Now you can run the app, tap the Alan AI button and start interacting with the AI assistant.
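
For reference, below is a minimal sketch of a complete main.dart that puts the pieces above together. It assumes the default Flutter app scaffold; MyApp, MyHomePage and the YOUR_ALAN_AI_SDK_KEY placeholder are illustrative names, not part of the plugin.

main.dart
import 'package:flutter/material.dart';
import 'package:alan_voice/alan_voice.dart';

void main() => runApp(const MyApp());

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MaterialApp(home: const MyHomePage());
  }
}

class MyHomePage extends StatefulWidget {
  const MyHomePage({Key? key}) : super(key: key);

  @override
  State<MyHomePage> createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  _MyHomePageState() {
    /// Initializing Alan AI with the project SDK key from Alan AI Studio
    AlanVoice.addButton(
      "YOUR_ALAN_AI_SDK_KEY",
      buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Alan AI demo')),
      body: const Center(child: Text('Tap the Alan AI button to start')),
    );
  }
}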

Specifying the Alan AI button parameters

You can specify the following parameters for the Alan AI button added to your app:

Name          Type          Description
projectId     string        The Alan AI SDK key for a project in Alan AI Studio.
authJson      JSON object   The authentication or configuration data to be sent to the dialog script. For details, see authData.
buttonAlign   int           The Alan AI button position in the app. Use one of the two constants: BUTTON_ALIGN_LEFT or BUTTON_ALIGN_RIGHT.
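
For example, all three parameters can be passed in a single addButton() call. The snippet below is a sketch: the SDK key and token values are placeholders, and it assumes authJson is passed as a JSON-encoded string built with jsonEncode() from dart:convert; check the addButton() signature of your plugin version.

main.dart
_MyHomePageState() {
  /// Requires: import 'dart:convert';
  /// Adding the Alan AI button with the project SDK key, auth data and the button position
  AlanVoice.addButton(
    "YOUR_ALAN_AI_SDK_KEY",
    authJson: jsonEncode({"token": "user-token"}),
    buttonAlign: AlanVoice.BUTTON_ALIGN_RIGHT);
}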

Using client API methods

You can use the following client API methods in your app:

setVisualState()

Use the setVisualState() method to inform the AI assistant about the app’s visual context. For details, see setVisualState().

Client app
_MyHomePageState() {
  void _setVisualState() {
    /// Providing any params with json
    var visualState = jsonEncode({"data":"your data"});
    AlanVoice.setVisualState(visualState);
  }
}
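
On the dialog script side, the visual state sent this way is typically available through the p.visual object. The snippet below is a sketch: the data field and the intent phrase only mirror the example above.

Dialog script
// Reading the visual state sent with setVisualState() from the client app
intent('What data do you have?', p => {
    p.play(`The visual state contains: ${p.visual.data}`);
});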

callProjectApi()

Use the callProjectApi() method to send data from the client app to the dialog script and trigger activities without using voice or text commands. For details, see callProjectApi().

Dialog script
projectAPI.setClientData = function(p, param, callback) {
  console.log(param);
};
Client app
_MyHomePageState() {
  void _callProjectApi() {
    /// Providing any params with json
    var params = jsonEncode({"data":"your data"});
    AlanVoice.callProjectApi("script::setClientData", params);
  }
}

playText()

Use the playText() method to play specific text in the client app. For details, see playText().

Client app
_MyHomePageState() {
  /// Playing any text message
  void _playText() {
    /// Providing text as string param
    AlanVoice.playText("Hi");
  }
}

sendText()

Use the sendText() method to send a text message to Alan AI as the user’s input. For details, see sendText().

Client app
_MyHomePageState() {
  /// Sending any text message
  void _sendText() {
    /// Providing text as string param
    AlanVoice.sendText("Hello Alan, can you help me?");
  }
}

playCommand()

Use the playCommand() method to execute a specific command in the client app. For details, see playCommand().

Client app
_MyHomePageState() {
  /// Executing a command locally
  void _playCommand() {
    /// Providing any params with json
    var command = jsonEncode({"action":"openSomePage"});
    AlanVoice.playCommand(command);
  }
}

activate()

Use the activate() method to activate the Alan AI button programmatically. For details, see activate().

Client app
_MyHomePageState() {
  /// Activating the Alan AI button programmatically
  void _activate() {
    AlanVoice.activate();
  }
}

deactivate()

Use the deactivate() method to deactivate the Alan AI button programmatically. For details, see deactivate().

Client app
_MyHomePageState() {
  /// Deactivating the Alan AI button programmatically
  void _deactivate() {
    AlanVoice.deactivate();
  }
}

isActive()

Use the isActive() method to check whether the Alan AI button is currently active. For details, see isActive().

Client app
void _checkIsActive() async {
  var isActive = await AlanVoice.isActive();
  if (isActive) {
    _showDialog("Active");
  } else {
    _showDialog("NOT active");
  }
}

getWakewordEnabled()

Use the getWakewordEnabled() method to check the state of the wake word for the Alan AI button. For details, see getWakewordEnabled().

Client app
var enabled = await AlanVoice.getWakewordEnabled();

setWakewordEnabled()

Use the setWakewordEnabled() method to enable or disable the wake word for the Alan AI button. For details, see setWakewordEnabled().

Client app
AlanVoice.setWakewordEnabled(enabled);
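
For example, the two methods can be combined to toggle the wake word state; the _toggleWakeword() helper below is an illustrative name and assumes the method is defined in the _MyHomePageState class.

Client app
/// Toggling the wake word for the Alan AI button
void _toggleWakeword() async {
  var enabled = await AlanVoice.getWakewordEnabled();
  AlanVoice.setWakewordEnabled(!enabled);
}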

Using handlers

You can use the following Alan AI handlers in your app:

onCommand handler

Use the onCommand handler to handle commands sent from the dialog script. For details, see onCommand handler.

Client app
_MyHomePageState() {
  /// Handle commands from Alan AI Studio
  AlanVoice.onCommand.add((command) {
    debugPrint("got new command ${command.toString()}");
  });
}
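
For example, a dialog script can send a command object with p.play(), and the client app can dispatch on it in the handler. The snippet below is a sketch: the openSettings command name is illustrative, and it assumes the command payload is available as a map in command.data; adjust it to the payload your dialog script actually sends.

Dialog script
intent('Open settings', p => {
    p.play({command: 'openSettings'});
    p.play('Opening the settings page');
});
Client app
_MyHomePageState() {
  /// Dispatching on commands received from the dialog script
  AlanVoice.onCommand.add((command) {
    switch (command.data["command"]) {
      case "openSettings":
        debugPrint("openSettings command received");
        /// Navigate to the settings page here
        break;
      default:
        debugPrint("unknown command: ${command.toString()}");
    }
  });
}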

onButtonState handler

Use the onButtonState handler to capture and handle the Alan AI button state changes. For details, see onButtonState handler.

Client app
_MyHomePageState() {
  /// Handle button state
  AlanVoice.onButtonState.add((state) {
    debugPrint("got new button state ${state.name}");
  });
}

onEvent handler

Use the onEvent handler to capture and handle events emitted by Alan AI: get the user’s utterances, the AI assistant’s responses and so on. For details, see onEvent handler.

Client app
_MyHomePageState() {
  /// Handle events
  AlanVoice.onEvent.add((event) {
    debugPrint("got new event ${event.data.toString()}");
  });
}

Switching between logging levels

By default, Alan AI does not log its system events, such as changes of the Alan AI button state, to the IDE console. If you want to see messages from Alan AI, switch to the all logging level:

main.dart
AlanVoice.setLogLevel("all");

To disable logs, either remove the line above or switch to the none logging level:

main.dart
AlanVoice.setLogLevel("none");

Troubleshooting

To troubleshoot problems you may have with your Flutter app, check the solutions below:

  • The minimum Android SDK version required by the Alan AI SDK is 21. If the version in your project is lower, you may encounter the following error: AndroidManifest.xml Error: uses-sdk:minSdkVersion 16 cannot be smaller than version 21 declared in library [:alan_voice]. To fix it, open the ./android/app/build.gradle file, locate minSdkVersion under defaultConfig and change the version to 21.

  • Starting from version 3.0.0, the alan_voice package supports null safety. If you encounter the following error: Cannot run with sound null safety because dependencies don't support null safety, upgrade to the latest package version.

  • (If running the app on an emulator) All virtual microphone options must be enabled. On the emulator settings bar, click More (…) > Microphone and make sure all toggles are set to the On position.

What’s next?


Example apps

Find and explore examples of voice-enabled apps on the Alan AI GitHub repository.
