Building a voice assistant for an Android Java or Kotlin app

With the Alan AI SDK for Android, you can create a voice assistant or chatbot and embed it in your Android app written in Java or Kotlin. The Alan AI Platform provides you with all the tools and leverages the industry’s best Automatic Speech Recognition (ASR), Spoken Language Understanding (SLU) and Speech Synthesis technologies to quickly build an AI assistant from scratch.

In this tutorial, we will build a simple Android app, add a voice assistant to it and test it. The users will be able to tap the voice assistant button in the app and give custom voice commands, and the AI assistant will reply to them.


If you are a visual learner, watch this tutorial on Alan AI YouTube Channel.

What you will learn

  • How to add a voice assistant to an Android app written in Java or Kotlin

  • How to write custom voice commands for an Android app

What you will need

To go through this tutorial, make sure the following prerequisites are met:

  • You have signed up to Alan AI Studio.

  • You have created a project in Alan AI Studio.

  • You have set up the Android environment and it is functioning properly. For details, see Android developers documentation.

  • (If running the app on an emulator) All virtual microphone options must be enabled. On the emulator settings bar, click More (…) > Microphone and make sure all toggles are set to the On position.

  • (If running the app on a device) The device must be connected to the Internet. The Internet connection is required to let the Android app communicate with the dialog script running in the Alan AI Cloud.


When you sign up to Alan AI Studio, Alan AI adds free interactions to your balance to let you get started. To get additional interactions to your balance, link your Alan AI account with your GitHub account and give stars to Alan AI repositories. For details, see Adding free interactions.

Step 1: Create a starter Android app

For this tutorial series, we will be using a simple Android app with a tabbed layout. Let’s create it.

  1. Open Android Studio and start a new Android project.

  2. Select Tabbed Activity as the project template. Then click Next.

  3. Enter a project name, for example, MyApp, and select the language: Java or Kotlin.

  4. The minimum Android SDK version required by the Alan AI SDK is 21. In the Minimum SDK list, select API 21. Then click Finish.
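If you need to verify or change the minimum SDK version later, it is set in the module-level build.gradle file (a minimal sketch in the Groovy DSL; the Kotlin DSL syntax differs slightly):

```groovy
android {
    defaultConfig {
        // The Alan AI SDK requires API level 21 (Android 5.0) or higher
        minSdk 21
    }
}
```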


Step 2: Integrate the app with Alan AI

Now we will add the Alan AI button to the app.

  1. Open the build.gradle file at the module level.

  2. In the dependencies block, add the dependency configuration for the Alan AI SDK for Android. Do not forget to sync the project.

    dependencies {
      /// Adding Alan SDK dependency
      implementation 'app.alan:sdk:4.12.0'
    }
  3. Next, we need to add the XML layout for the Alan AI button to the main app activity. Open the activity_main.xml file, switch to the Code view and add the following layout to it:
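The layout snippet is not shown here; a minimal sketch, assuming the Alan AI SDK's AlanButton view class and an alan_button id (the same id is looked up with findViewById in the activity code below), could look like this:

```xml
<com.alan.alansdk.button.AlanButton
    android:id="@+id/alan_button"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"/>
```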

  4. By default, the Alan AI button is placed in the bottom-right corner of the screen. Set the button_horizontal_align property to left so that the button does not overlap the floating action button already present in the starter app.

    Here is what your activity_main.xml file should look like:

  5. Add the AlanConfig object to the app. This object describes the parameters provided for the Alan AI button. To do this, open the MainActivity.java or MainActivity.kt file and add the following code to the MainActivity class:
    Java:

    public class MainActivity extends AppCompatActivity {
      /// Adding AlanButton variable
      private AlanButton alanButton;

      @Override
      protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        /// Defining the project key
        AlanConfig config = AlanConfig.builder().setProjectId("").build();
        alanButton = findViewById(R.id.alan_button);
        alanButton.initWithConfig(config);

        AlanCallback alanCallback = new AlanCallback() {
          /// Handling commands from Alan AI Studio
          @Override
          public void onCommand(final EventCommand eventCommand) {
            try {
              JSONObject command = eventCommand.getData();
              String commandName = command.getJSONObject("data").getString("command");
              Log.d("AlanButton", "onCommand: commandName: " + commandName);
            } catch (JSONException e) {
              Log.e("AlanButton", e.getMessage());
            }
          }
        };
        /// Registering callbacks
        alanButton.registerCallback(alanCallback);
      }
    }

    Kotlin:

    class MainActivity : AppCompatActivity() {
      /// Adding AlanButton variable
      private var alanButton: AlanButton? = null

      override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        /// Defining the project key
        val config = AlanConfig.builder().setProjectId("").build()
        alanButton = findViewById(R.id.alan_button)
        alanButton?.initWithConfig(config)

        val alanCallback: AlanCallback = object : AlanCallback() {
          /// Handling commands from Alan AI Studio
          override fun onCommand(eventCommand: EventCommand) {
            try {
              val command = eventCommand.data
              val commandName = command.getJSONObject("data").getString("command")
              Log.d("AlanButton", "onCommand: commandName: $commandName")
            } catch (e: JSONException) {
              e.message?.let { Log.e("AlanButton", it) }
            }
          }
        }
        /// Registering callbacks
        alanButton?.registerCallback(alanCallback)
      }
    }
  6. In setProjectId, we need to provide the Alan AI SDK key for our virtual assistant project. To get the key, in Alan AI Studio, at the top of the code editor click Integrations and copy the key value from the Alan SDK Key field.

  7. In the MainActivity.java or MainActivity.kt file, import the necessary classes. Here is what your main activity file should look like:
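For reference, a typical set of imports for the Kotlin version of the activity above (the com.alan.alansdk package names are an assumption based on the SDK's default layout; adjust them to match your SDK version):

```kotlin
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import com.alan.alansdk.AlanCallback
import com.alan.alansdk.AlanConfig
import com.alan.alansdk.button.AlanButton
import com.alan.alansdk.events.EventCommand
import org.json.JSONException
```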

  8. Run the app.

After the app is built, tap the Alan AI button in the app and say: Hello world.


However, if you ask: How are you doing?, the AI assistant will not give an appropriate response. This is because the dialog script in Alan AI Studio does not yet contain the necessary voice commands.

Step 3: Add voice commands

Let’s add some voice commands so that we can interact with Alan AI. In Alan AI Studio, open the project and in the code editor, add the following intents:

Dialog script
intent(`What is your name?`, p => {
    p.play(`It's Alan, and yours?`);
});

intent(`How are you doing?`, p => {
    p.play(`Good, thank you. What about you?`);
});

Now tap the Alan AI button and ask: What is your name? and How are you doing? The AI assistant will give responses we have provided in the added intents.
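The onCommand handler added in Step 2 fires when the dialog script sends a command object to the app. To try it out, you could add an intent like the following to the dialog script (the openSettings command name here is just an example; the app-side handler logs whatever name arrives in the command field):

```
intent(`Open settings`, p => {
    p.play({command: 'openSettings'});
});
```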

What’s next?

You can now go on with building a voice assistant with Alan AI. Here are some helpful resources: