Navigating in an Android app with voice (Kotlin)

While interacting with a voice Agentic Interface built with the Alan AI Platform, users can perform actions in the app with voice. For example, they can give commands to navigate to another screen, select an item in a list, or enable and disable options. To achieve this, you need to send commands from the dialog script to the app and handle them in the app.

In this tutorial, we will add voice commands that let us navigate between tabs in the app.

YouTube

If you are a visual learner, watch this tutorial on the Alan AI YouTube channel.

What you will learn

  • How to complete tasks in the Android app with voice

  • How to navigate in the Android app with voice

  • How to send commands from the dialog script and handle them in the Android app

What you will need

For this tutorial, we will use the app created in the Building a voice Agentic Interface for an Android Java or Kotlin app tutorial.

Step 1: Add voice commands to navigate between tabs

First, we need to add voice commands for navigating between tabs. In Alan AI Studio, open the project and, in the code editor, add the following intents:

Dialog script
intent('Open the second tab', p => {
    p.play({command: "openTab", tab: 1});
    p.play('Opening the second tab');
});

intent('Go back', p => {
    p.play({command: "openTab", tab: 0});
    p.play('Going back to the first tab');
});

When we say one of these phrases, two things will happen:

  • Alan AI will send the command provided in the intent to the Android app. To send a command, we specify a JSON object in the p.play function. Here, the object contains the command name (openTab) and the index of the tab to be opened in the app.

  • The Agentic Interface will play back the action confirmation to us.
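The same command pattern scales to any number of tabs. Here is a minimal sketch of generating such intents from a list (plain JavaScript; the tab names are made up for illustration, and intent() is stubbed so the sketch runs outside Alan AI Studio; in a real dialog script, drop the stub):

```javascript
// Stub of Alan AI Studio's intent() so this sketch runs standalone;
// inside a real dialog script, intent() is already provided.
const registeredIntents = [];
function intent(pattern, handler) {
    registeredIntents.push({ pattern, handler });
}

// Hypothetical tab names; the indices must match the tab order in the app.
['first', 'second', 'third'].forEach((name, index) => {
    intent(`Open the ${name} tab`, p => {
        p.play({ command: 'openTab', tab: index }); // command sent to the app
        p.play(`Opening the ${name} tab`);          // spoken confirmation
    });
});
```

Each generated intent sends the same openTab command, varying only the tab index, so the app-side handler below does not need to change as tabs are added.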

Step 2: Handle commands on the app side

When we say Open the second tab or Go back, Alan AI now sends a command to the Android app. We need to handle this command in the app and perform the appropriate action. To do this, we will use the onCommand handler.

In the IDE, open the MainActivity.kt file and update the command-handling code in the Alan AI agentic interface block as follows:

MainActivity.kt
class MainActivity : AppCompatActivity() {
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    val alanCallback: AlanCallback = object : AlanCallback() {
      /// Handling commands from Alan AI Studio
      override fun onCommand(eventCommand: EventCommand) {
        try {
          /// The object passed to p.play arrives under the "data" field
          val command = eventCommand.data.getJSONObject("data")
          when (command.getString("command")) {
            /// Navigating between tabs when the openTab command is received
            "openTab" -> {
              val tab = command.getInt("tab")
              tabs.getTabAt(tab)?.select()
            }
          }
        } catch (e: JSONException) {
          e.message?.let { Log.e("AlanButton", it) }
        }
      }
    }

    /// Register the callback on the Alan AI button
    alanButton?.registerCallback(alanCallback)
  }
}

Here is how it works: when the Android app receives the openTab command from the dialog script, the tab index is saved to the tab variable and used to open the necessary tab.
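To make the data flow explicit, here is a plain-JavaScript model of what the Kotlin handler does (this is not Alan AI code, and parseTabCommand is a made-up name): the object passed to p.play in the dialog script arrives on the app side wrapped under the event's data field, which is why the Kotlin code reads getJSONObject("data") first.

```javascript
// Model of the app-side handler: unwrap the event, check the command
// name, and extract the tab index.
function parseTabCommand(event) {
    const payload = event.data;            // the object passed to p.play
    if (payload.command !== 'openTab') {
        return null;                       // not a command this handler owns
    }
    return Number(payload.tab);            // index of the tab to select
}
```

For example, the Open the second tab intent produces an event equivalent to {data: {command: 'openTab', tab: 1}}, from which the handler extracts index 1.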

You can test it: run the app, tap the Alan AI agentic interface and say: Open the second tab. The second tab in the app will open. Then say Go back, and you will get back to the first tab.

What’s next?

Have a look at the next tutorial: Passing the app state to the dialog script.

©2025 Alan AI, Inc. All rights reserved.